The tech rally and its perils

Since the beginning of the pandemic, global stock markets have gained +48%. At the same time, the US stock market has increased its capitalisation by 69%, led by the tech sector which more than doubled in value, gaining 134%. Those numbers come against a backdrop of lockdowns, trade wars, broken supply chains, below-trend economic growth, high inflation and fairly restrictive monetary policy. Ex-technology, the US large-caps have gained +49%, almost in line with global stocks.

This tech rally echoes the 2000 ’dot.com’ bubble. Back then, markets were right to think that all major companies would one day be built on the internet. They were just a decade early. Have we run ahead of ourselves once again?

The long and short answer is one I would perilously put in print: “This Time is Different”. In 2000 it was all about the promises of software. This time around, the rally is mostly about tangible hardware. Hardware, in 2024, is the new software.

A key resource

Most of my generation grew up with the fear, if not the actual imagery, of long queues at gas stations every time OPEC threatened to turn off the taps. Oil price spikes have always sent tremors across the globe. Oil was the be-all-and-end-all commodity that would power the future. Up to a point, it still is.

Currently, the Middle East holds 58% of oil reserves and more than a third of natural gas reserves (of which it sells only a third).

Whilst the green transition is well underway, fossil fuels still account for over 80% of global energy usage and it will take a significant period of time to move away from a commodity which has been powering the world’s economy for over 150 years.

As global resources go, oil is valuable. But there is one at least as valuable: computing power. While this resource is not measured in ’proven reserves’ like most commodities, it is much scarcer than we may believe.

In the last three decades, the internet changed our interaction with the world. The globe has never been so small in terms of communication, business and culture as it is today. Young people are increasingly ’citizens of the world’. For all the restrictions and tariffs in the last few years, global trade continues to expand.

Virtually every person in the world is connected to one another through social media. Our food, our literature, and our thinking are now inundated with global influences. It is no exaggeration to say that all modern civilisation depends, one way or another, on expanding computing power.

This was just the third industrial revolution.

The Fourth Industrial Revolution

Then came the fourth: Artificial Intelligence. The human race is primarily a toolmaking race, advancing by evolving its tools. AI has been around as an idea for more than eight decades, ever since Alan Turing laid the theoretical foundations of modern computing.

It was formally studied from the 1956 Dartmouth workshop onwards and backed by government funding by the 1980s. By the 1990s it was beating the world chess champion and becoming the subject of many a film and book. Yet progress towards real human-like intelligence remained slower than originally anticipated. By 2010 we were talking about Deep Learning and Machine Learning, with real-world applications, but never really about AI.

Then, in 2022, we asked a computer a question. For the first time in history, one of our tools answered back in human language. The Turing Test (a series of questions by which a person tries to discern whether they are talking to a human or a computer) was smashed, for the whole world to see. ChatGPT took AI from the realm of sci-fi and testing into the realm of real-world applications.

History books will probably see this moment as the one that kicked off the fourth industrial revolution. Much like when local miners in the Klondike triggered a mass migration of 100,000 people in what became known as the ’Gold Rush’, thousands of small companies that we know nothing about are already experimenting with Artificial Intelligence applications. Law firms and investment banks have ChatGPT writing simple documents that used to take hours of associates’ time. Within a year, ’Large Language Model’ became a household term.

The investment community jumped on the opportunity… and on history. The lesson from the Gold Rush is that the miners made less money than the pick-and-axe stores at the bottom of the mountain, so focusing on the hardware is important. Also, while the next AI-generated behemoth might still be in a garage in Silicon Valley, it will go through several rounds of seed money before it goes public or is taken over by a larger tech competitor. A large part of institutional money, meanwhile, is placed in listed equities.

It’s all about tech

Since the end of 2022, global equities have gained +29%. US equities, which are more tech-heavy, have gained +35%. Meanwhile, the Magnificent 7, the seven largest US tech companies, have made over +94% (see Figure 1). At the time of writing, the top 10 US large-cap stocks account for more than a quarter of US large-cap capitalisation, the largest concentration in recent history. Just last year, these companies generated over 60% of returns in the space.

Figure 1. Since the end of 2022, global equities have gained +29%, US equities +35% and the Magnificent 7 over +90%
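The mechanics of that concentration are worth spelling out: in a capitalisation-weighted index, each stock's contribution to the index return is simply its weight multiplied by its own return. The sketch below uses purely hypothetical weights and returns, chosen only to show how a couple of heavyweight outperformers can dominate the total.

```python
# Hypothetical illustration of return concentration in a cap-weighted index.
# Weights and returns are made up for the example, not actual market data.

weights = {"MegaTech A": 0.07, "MegaTech B": 0.06, "Rest of index": 0.87}
returns = {"MegaTech A": 0.90, "MegaTech B": 0.60, "Rest of index": 0.05}

# Index return is the weighted sum of constituent returns.
index_return = sum(weights[s] * returns[s] for s in weights)

# Each stock's share of the total index return.
contributions = {s: weights[s] * returns[s] / index_return for s in weights}

print(f"Index return: {index_return:.1%}")
for name, share in contributions.items():
    print(f"{name}: {share:.0%} of the index return")
```

In this toy example, two stocks with 13% of the index weight generate roughly 70% of the index return, which is the shape of the dynamic described above.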

But not all tech is created equal. There are a small number of firms that are both hardware-focused and listed, and these are mostly data-centre operators, like Google, Amazon (AWS), Equinix, Microsoft and a few more. But these companies get no more than a third of their earnings from data centres, and a much smaller part from the AI revolution. They are mature businesses that may participate and even lead in some fields, but for the time being, we have no idea what those fields will be.

This narrows the field down to the three microchip companies that power data centres: AMD, Intel and Nvidia. Of these, Nvidia is a company which not long ago made 70% of its earnings from high-end microchips for gaming. Just three short years later, it makes 70% of its profits from networking hardware. As it produces the most high-end microchips, it now powers 75% of data centre computing power.

This makes the company something close to a monopoly in powering the AI rally. Essentially, the ’Magnificent Seven’ has been narrowed down to the ’Magnificent One’. Since the end of 2022, Nvidia has led the pack, increasing its capitalisation by 464%, and becoming one of the most important companies in the world. Semiconductor companies are becoming what James Watt and Co. was for the first industrial revolution, Vanderbilt, Standard Oil and Ford for the second, and Microsoft, IBM and Apple for the third. The engine of growth towards a new era.

Many investors are now focusing their portfolios primarily on technology. Its five-year relentless bull run and the good fundamentals underpinning it only reinforce this attitude.

Supporting the sentiment, valuations are nowhere near where they were in 2000. Presently, US large-cap IT stocks trade at 36x their last 12 months’ earnings. In 1999-2000 the number was closer to 70x, reaching 445x at the peak. That was the very definition of ’irrational exuberance’. Nvidia, which was trading at 218x its historical earnings by the end of last July, saw the number drop to 100x and then to 66x (at the time of writing), after consecutive blowout earnings announcements. So, earnings support valuations which may be above average but are by no means exuberant.
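The arithmetic behind these multiples is simple but worth making explicit: a trailing P/E is just price divided by the last twelve months of earnings per share, so the multiple can fall sharply even as the share price rises, provided earnings grow faster. A minimal sketch, using hypothetical figures rather than any company's actual numbers:

```python
# Why a trailing P/E can compress even as the price rises.
# All figures below are hypothetical, chosen only to show the mechanics.

def trailing_pe(price: float, eps_ttm: float) -> float:
    """Price-to-earnings ratio on trailing-twelve-month earnings per share."""
    return price / eps_ttm

# Earnings grow sixfold while the price "only" doubles:
pe_before = trailing_pe(price=100.0, eps_ttm=0.5)  # 200x
pe_after = trailing_pe(price=200.0, eps_ttm=3.0)   # ~67x

print(f"P/E before: {pe_before:.0f}x, after: {pe_after:.0f}x")
```

This is the pattern described above: consecutive blowout earnings announcements can pull a multiple down from eye-watering to merely above average without the price falling at all.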

But make no mistake. There are significant pitfalls to this ’trade of trades’.

The perils of investing in one theme

There are four key risks one needs to consider:

1. That AI underdelivers
2. That ’chip wars’ mute the supply of high-end microchips
3. That computing power stops increasing
4. That one stock can’t support a global market

Overpromising

Let’s start with the first key risk: that AI under-delivers. At this point, the above-average valuations and the rally rest on one hypothesis: that in the near future, Artificial Intelligence will provide enough real-world applications to change the world and deliver a productivity leap, the same way Google and Apple did in the previous decade.

At the time of writing, no such application has been discovered. If anything, the performance of ChatGPT has been deteriorating as it interacts more with humans and less with simple data. As our quant analyst Tao Yu says in an upcoming article:

“At first glance, it is not obvious why an entity with these abilities shouldn’t be able to fully replace humans already. Coders write code. If AI can also write code, why do we need coders?

It turns out that this logic does not work in practice. In the case of software engineering, LLMs can ace programming interview questions, but perform far worse on real-world problems – which fall outside of their training data. Princeton’s SWEbench software engineering benchmark showed that AI models still fail to solve a majority of real-world programming problems (100% of SWEbench problems are solved by humans).”

In layman’s terms: what if the tech does not live up to expectations, at least not in a speedy manner? It wouldn’t be the first time. Five years ago, analysts were raving about the possibilities of the Metaverse. Since then, the hardware hasn’t lived up to expectations. Facebook, which became ’Meta’, has slowed down its investment after losing $21bn per year. Seven years ago, we were told that the Blockchain and cryptocurrencies would change how we interact. Fifteen years after Bitcoin’s launch, about 250 major companies accept payment in cryptocurrencies, which doesn’t exactly make it a global tender.

Chip wars and capacity

The second major risk is the availability of high-end microchips.

A microchip (officially an ’integrated circuit’) is a very small electronic device, consisting of transistors, resistors and capacitors, etched on a small piece of semiconductor material, usually silicon. The more transistors that can fit on a small silicon wafer, the greater the computing power. In 1965, Gordon Moore, an engineer who would go on to co-found Intel, suggested that computing power would double roughly every year (later revised to two years). To this day, ’Moore’s Law’ more or less holds.

Since the 1970s, transistor sizes have decreased from tens of micrometres (a micrometre is one millionth of a metre) to just 2-3 nanometres (a nanometre is one billionth of a metre). For comparison, a human virus is 150-300 nanometres in size.
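Moore's Law can be put into a back-of-the-envelope formula: transistor counts doubling roughly every two years from some historical baseline. The sketch below uses the widely cited ~2,300 transistors of the 1971 Intel 4004 as the starting point; it is an illustration of the exponential, not a precise model of any chip line.

```python
# Rough Moore's Law projection: transistor count doubling every two years.
# Baseline: ~2,300 transistors on the 1971 Intel 4004 (a standard reference).

def transistors(year: int, base_year: int = 1971,
                base_count: int = 2_300, doubling_years: float = 2.0) -> float:
    """Projected transistor count under a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1991, 2011, 2023):
    print(y, f"{transistors(y):,.0f}")
```

Run over five decades, the same doubling rule takes the count from thousands to the order of a hundred billion, which is roughly where today's largest chips sit; hence Moore's own remark, quoted below, that "no exponential is forever".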

Lower-end electronic machinery, such as a run-of-the-mill laptop or a ’smart’ toaster, uses chips of 28nm or larger. For AI applications and the newest iPhones, the computing power of smaller 2-3nm chips is required.

This is where the problems begin.

The first problem is availability. The chips are constructed with very expensive and specialised lithography machinery, usually produced by the Dutch company ASML. This machinery is then shipped to Taiwan, where specialised factories produce high-end chips using many inputs, such as specialised chemicals mostly made in Japan.

In fact, 90% of high-end chip production takes place in Taiwan, creating an important choke point for the distribution of global high-end technology.

Taiwan’s independence is fiercely contested by China, a mere 90 miles away. Chinese leader Xi Jinping has reportedly told his military to be ready for an invasion by 2027. As geopolitical rifts between China and the West grow, the technology and capacity to create the high-end microchips necessary for the next generation of AI applications are at risk. The more the West continues to block China from high-end technology, the larger the incentive for China to block the West’s access to the high-end technology produced just off its border.

Capacity could, of course, be built elsewhere, but it would come with significant costs and would take time.

The second problem is a simpler one: computing capacity. While Moore’s Law still stands, Gordon Moore himself acknowledged that “no exponential is forever”. But, he told his engineers in 2003, “your job is delaying forever”. Since 2005, processor clock speeds (measured in megahertz, a unit of frequency) have plateaued. However, engineers continue to fit more transistors into a chip at roughly the same rate as they have since the 1970s.

Because of the near-atomic size of the transistors, most chips are printed in two dimensions. Engineers have been working on ways to print chips in three dimensions, increasing capacity. This could buy a few years of computing power increasing at the current pace. But will that be enough to develop high-powered AI applications? Or will the development of quantum computers (still very experimental) be needed before Artificial Intelligence becomes part of our daily lives?

The One Stock

The simplest of problems is concentration. In the last few months, it is not so much a sector that has driven the rally as one stock: Nvidia. While we have laid out the fundamental case behind it, the simple fact remains that one stock can’t drive a whole bull market for equities.

For the time being, valuations ex-tech are near average, and so are those of most of the big tech companies. Google is trading at 26x its historic earnings (the average since 2009 is 27x), Apple also at 26x (the average since 2009 is 19x) and Meta at 25x (the average since 2014 is 40x). Amazon, a company that has historically operated at a loss, which makes valuation very difficult, is trading at 60x, about half its post-2016 average of 133x. Only Microsoft is trading at above-average valuations, at 37x (the average since 2009 is 22x).

This leaves Nvidia pulling the stock market cart. While the rally has become broader in the last few months, the effect of Nvidia on global equity returns continues to be outsized.

In conclusion – what this means for investors

So where does that leave investors who, whether by design or by following an index, have become dependent on the returns of a handful of stocks?

As far as technology itself is concerned, we don’t see evidence of 2000-like ’irrational exuberance’. Nvidia’s dominant market position in the semiconductor sector is unquestionable. Markets are not betting on a company that may do things tomorrow, but rather on one that delivers today, due to strong demand for cloud processing power. But, given current high valuations, a lot of good news is priced in, while risks may not be.

In this market, it really pays to be diversified and strategic. The market offers many opportunities. Valuations ex-Nvidia and Microsoft are average, while earnings have not been disappointing. European and UK stocks are trading below their averages. Japanese equities have returned to levels not seen since the late 1980s. Meanwhile, short and long-term bond yields are back at December levels. Gold has also been rallying on the back of strong demand from central banks and Asian consumers. We have one part of the market rallying on strong fundamentals but high valuations, and the rest of the market either rallying on lower valuations or waiting for positive catalysts. A diversified portfolio can capture more opportunities, rather than take a singular bet.

Managers with a strong Strategic Asset Allocation proposition may weather volatility over the longer term, especially in the bond market, and take tactical advantage of opportunities.

Even if this looks like a one-stock-driven market, the question for investors is not whether or not to buy it, but rather whether investors are positioned to take advantage of the opportunities created by increased volatility.

George Lagarias - Chief Economist