Premature Spending

Moore's Law now goes off vibes instead of transistors

Happy Wednesday. I spent a few hours Sunday writing about 1100 words for this week's post and then decided Monday to write about something else. Rewarding stuff. 

Anyways, if you want to receive these emails in your inbox every week, sign up below:

Also, built this puppy over the weekend. Sharing it here because I know a few friends who will be pissed to see it for the 7th time, but who doesn’t love a good fire? 

As seen on @tdozzi12

Moore’s Law

Ever heard of Moore’s Law? Gordon E. Moore was one of the co-founders of Intel back in 1968. A few years before the founding, Gordon noticed that roughly every two years, the number of transistors in an integrated circuit (IC) doubles. To put it in plain English, the computing power of semiconductor chips doubles every two years. While that is an extremely dumbed-down version, it serves its purpose for the rest of this piece. Here is a great chart to show this law in effect every year since the first microchip.
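If you want to see the doubling math for yourself, it's simple enough to sketch. This assumes a starting point of the Intel 4004 from 1971 (roughly 2,300 transistors, a commonly cited figure) and a clean doubling every two years, which is obviously an idealization of the real chart:

```python
# Moore's Law sketch: transistor count doubling every two years,
# starting from the Intel 4004 (1971, ~2,300 transistors).
START_YEAR = 1971
START_TRANSISTORS = 2_300

def transistors(year: int) -> int:
    """Projected transistor count for a given year, assuming one
    doubling every two years from the 1971 baseline."""
    doublings = (year - START_YEAR) / 2
    return int(START_TRANSISTORS * 2 ** doublings)

for y in (1971, 1981, 1991, 2001, 2011, 2021):
    print(y, f"{transistors(y):,}")
```

Run it and the 2021 projection lands in the tens of billions, which is right in the neighborhood of real flagship chips today. That's the "up and to the right" in number form.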

Yeah, I know it is hard to read the names of the chips, but the main takeaway here is that the chart goes up and to the right. Moore’s Law is an extremely popular concept in the tech world, but it has drifted away from its original meaning. As stated above, the law directly measures the number of transistors in integrated circuits each year. Yet nowadays, the law gets applied more broadly to the feeling technology gives us. There is a consensus that things are just happening faster in technology. People are creating newer, better, more advanced technology faster than ever before.

I believe Moore’s law illustrates this concept, but we can’t strictly look at transistors in chips anymore. For me, I am just going off vibes (mainly because I don’t know what the fuck transistors are). And currently, the vibes are that every two years, technology doubles in what it can do. In other words, it’s getting better and faster. Here’s what that looks like:

Now some PhD in engineering will probably tell me that if I look at the computing power in Nvidia’s recent chips I can come to this same conclusion, but I really don’t feel like spending my Monday night learning about graphics processing units.

Alright, so why am I writing about this? Well as always, it relates to the markets.

Jumping the Gun

Over the past few months, the biggest tech companies in the world, Apple, Meta, Google, Microsoft, and Amazon, all announced significant investments in AI. The focus would mainly be on data centers, chips, renewable energy, and other AI infrastructure. In 2025, these companies said they would invest a total of roughly $325 billion into these efforts.

Then DeepSeek came along and it rocked the foundational reasoning of this spending. 

These companies were all anticipating that AI would require incredible amounts of energy, infrastructure, and hardware. DeepSeek proved what we know from Moore’s Law: things don’t only get faster, they get more efficient. It was inevitable that AI would get more efficient over time; we just weren’t expecting it to get this much more efficient this quickly.

(I’ve actually never watched this movie, I don’t know how I knew about this meme, nor do I get the reference)

So, we are now seeing the effects of this in the form of canceled contracts and walked-back commitments. As laid out in the article above, Microsoft canceled a “couple hundred” megawatts’ worth of data center capacity in recent weeks. I also had no idea what a megawatt means or what a couple hundred of them looks like at scale. So, I looked it up. To put it in perspective, 200 MW can power around 150k-200k homes. That’s enough to power a small region.
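The back-of-the-envelope version of that conversion, assuming the average US home draws roughly 1.2 kW on a continuous basis (about 10,500 kWh per year, a common ballpark rather than an exact figure):

```python
# Rough conversion: canceled data-center megawatts -> households powered.
# Assumes an average US home draws ~1.2 kW continuously
# (~10,500 kWh per year) -- a ballpark assumption, not an exact figure.
CANCELED_MW = 200
AVG_HOME_KW = 1.2

homes = CANCELED_MW * 1_000 / AVG_HOME_KW  # MW -> kW, then divide per home
print(f"~{homes:,.0f} homes")
```

That works out to roughly 167,000 homes, which is how you land in the 150k-200k range.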

This is the first domino that we are seeing, but the ripple effect is being felt throughout the market. There is a general anxiety around what this new efficiency means for AI and AI spending. 

Here’s my take

It’s only going to keep getting more efficient, and faster. Let’s refresh ourselves with the vibes chart from above and focus on the red box.

We are almost going vertical in this red section. That’s the magic of compounding. This means that it is going to be harder to predict the needs for new technology simply because of how much more efficient and better it becomes over such a short period of time. How can you predict your capex spending on new technology for a fiscal year when the technology changes almost monthly? 

The short of it is, these companies are going to have to be as efficient as the technology they are investing in. Will this new realization on spending result in a bear market? Probably not, but there will likely be a correction (likely what we are seeing in the markets right now) that will present a great buying opportunity.

I also will personally never try to predict a bear market. I’m too optimistic. I will always look for a bull case because A) it’s a better way to live and B) the market, over time, always goes up. In this specific scenario, I am still working out how more efficient AI and the speed of technology result in a bull case.

My leading theory is that worker effectiveness and productivity will increase by a substantial amount. That leads to better earnings, lower opex, and, overall, higher earnings per share. The companies that will win in the short term are the ones who don’t blow their load on AI spending early.

Only one company seems to have done this, and you guessed it: Apple. Those fuckers did it again. Even though they haven’t innovated on the iPhone in 9 years, they still seem to find a way to win no matter what. Every analyst was shitting on them last year for not spending hundreds of billions on AI capex.

Who’s laughing now? This guy.

Oh also me, I own a lot of Apple stock. 

Alright that’s all. I’m off to Nashville for the weekend.

Thanks.