AI Leaders Reveal Key Bottlenecks in AI Economy Growth

At this year’s Milken Global Conference in Beverly Hills, five influential figures across the AI supply chain shared insights into the pressing challenges hindering the rapid expansion of the artificial intelligence economy. The panel included Christophe Fouquet, CEO of ASML; Francis deSouza, COO of Google Cloud; Qasar Younis, CEO of Applied Intuition; Dimitry Shevelenko, chief business officer of Perplexity; and Eve Bodnia, founder of Logical Intelligence.

Christophe Fouquet highlighted the severe chip supply shortages as a critical bottleneck. ASML, a Dutch company with a monopoly on extreme ultraviolet lithography machines essential for producing modern semiconductors, confirmed that despite ramped-up manufacturing efforts, the industry remains supply-limited. This limitation means that hyperscale companies like Google, Amazon, Microsoft, and Meta will be unable to procure the quantity of chips they require for the foreseeable future, with supply constraints expected to persist for two to five years.

Francis deSouza stressed the enormous and rapidly growing demand for compute infrastructure at Google Cloud, noting that its revenue recently surpassed $20 billion in a single quarter, growing 63%. The company’s backlog of committed but undelivered contracts jumped from $250 billion to $460 billion in just one quarter, underscoring the immense appetite for AI-related cloud services.

Qasar Younis described a different bottleneck: data acquisition for autonomous vehicle systems. His company, Applied Intuition, works on autonomy across cars, trucks, drones, and defense vehicles, but the critical constraint is not silicon availability. Instead, it is the need to collect real-world data by deploying machines in physical environments, something synthetic simulations cannot fully replace. Younis emphasized that it will be a long time before synthetic data can close that gap, so training models to operate reliably in the real world will continue to depend on physical deployment.

Energy consumption emerged as another formidable challenge. Francis deSouza revealed that Google is actively investigating the feasibility of data centers in orbit, which would address energy constraints by tapping into the more abundant solar power available in space. Despite significant engineering difficulties, such as the vacuum environment allowing only radiative cooling, Google considers this approach a serious option. He also pointed out that vertical integration at Google, where the company designs custom chips and develops AI models in tandem, yields superior energy efficiency compared to off-the-shelf configurations.

Echoing this, Christophe Fouquet reminded the audience that increasing computational power inherently leads to escalating energy costs. The investment in expanding compute capacity is substantial and driven by urgent strategic demands, but the associated energy expenses cannot be overlooked.

Amid discussions of scaling and efficiency, Eve Bodnia presented a contrasting approach with her startup Logical Intelligence. Departing from the dominant large language model (LLM) paradigm, her company builds energy-based models (EBMs) that aim to learn the underlying rules within data, which she argues more closely mirrors how the human brain works. According to Bodnia, their models are significantly smaller, at 200 million parameters compared to the hundreds of billions in current LLMs, and run thousands of times faster, potentially offering a more efficient path toward AI reasoning abilities.

These conversations reflect a complex ecosystem where hardware shortages, data limitations, energy demands, and new AI architectures collectively shape the trajectory of the AI economy. The sector’s leading minds are grappling with these intertwined challenges as they attempt to sustain unprecedented growth and technological advancement.
