The perpetual motion dream is scientifically impossible. The laws of thermodynamics state that energy cannot be created or destroyed (first law) and that some usable energy is always lost as heat in any energy transformation (second law). Yet for centuries inventors and enthusiasts have kept proposing designs, while the scientific consensus has remained that perpetual motion is a fantasy. Are we living yet another fantasy moment?
Earlier this month, NVIDIA passed Apple as the most valuable company in the world, with a market cap approaching $3.5T. Its growth has been fueled by explosive demand for the GPUs used in training Large Language Models. GPUs, or graphics processing units, were originally designed for the vector processing behind image rendering and video games, and that same capability turned out to be vital for training AI models. At the core of current AI technology is the representation of words or objects as n-dimensional vectors, or embeddings: think of attaching many properties to a word and drawing associations along those dimensions. The ability of GPUs to process many data points in parallel is what makes them a perfect fit for these computations. Understandably, given NVIDIA's technology leadership, its products are in high demand among the companies driving today's AI wave. Who is funding this?
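To make the embedding idea concrete, here is a minimal sketch (my own illustration, not drawn from any particular model): each word becomes a vector of numbers, "drawing associations" amounts to comparing those vectors, and comparing a whole vocabulary at once reduces to the kind of bulk matrix arithmetic GPUs run in parallel. The example words, values, and the cosine_similarity helper below are invented purely for illustration.

```python
import numpy as np

# Toy 4-dimensional embeddings. Real models learn vectors with hundreds or
# thousands of dimensions; these numbers are made up for illustration only.
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1, 0.4]),
    "queen": np.array([0.9, 0.2, 0.1, 0.5]),
    "apple": np.array([0.1, 0.1, 0.9, 0.2]),
}

def cosine_similarity(a, b):
    """How closely two embedding vectors point in the same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words with related meanings end up with similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # relatively high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # relatively low

# Comparing every word against every other word collapses into a single matrix
# multiplication - exactly the bulk, data-parallel arithmetic GPUs excel at.
vocab = np.stack(list(embeddings.values()))                    # shape (3, 4)
normed = vocab / np.linalg.norm(vocab, axis=1, keepdims=True)  # unit-length rows
print(normed @ normed.T)                                       # all pairwise similarities
```

Scaled up from three toy words to billions of tokens and thousands of dimensions, that same matrix arithmetic is what training runs spend their GPU hours on.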
Over the last decade, venture capital investment in AI has been growing, and there are no signs that the trend is abating. Recently, Elon Musk's xAI secured $5B in funding at an implied valuation of $50B, more than the billionaire paid for Twitter. In parallel, some of the largest tech companies, notably the largest cloud computing capacity providers - Google, Amazon, and Microsoft - have been pouring billions of dollars into leading AI labs such as OpenAI and Anthropic. A significant part of this money goes toward buying the GPU computing power needed to train the models. The rising stakes have been driving industry concentration, catching the Federal Trade Commission's attention and leading to an inquiry into Generative AI partnerships launched last January.
AI labs need computing capacity. They get funding and investment from cloud capacity providers and VCs, then spend much of that money buying capacity back from the same providers. The large tech investors recover their outlay through the growth of their stakes' valuations, and they also get to report the spending as growth in their own cloud businesses - a somewhat circular logic. In parallel, NVIDIA provides the picks and shovels for this 21st-century gold rush.
While valuations grow, the perpetual motion continues, and we watch it at our own peril. We can bet that productivity gains driven by AI adoption will continue to fuel this growth, but for how long?
The history of pursuing perpetual motion showcases humanity's creativity and its ambition to challenge limits and test the boundaries of science. While perpetual motion proved unattainable, the pursuit fueled scientific advances and highlighted the importance of grounding ambition in evidence and balancing idealism with realism. How far can we go with AI?