[AI]

AI'S SPENDING BOOM MAY BE HITTING A WALL

AI DESK · MON, APR 13, 2026

■ AI-SUMMARIZED FROM 1 SOURCE BELOW

Large language models are approaching fundamental limits in scaling, according to Janusz Marecki, CEO of Fractal Brain. The AI sector's massive infrastructure investments may not yield proportional returns.

Marecki, speaking with Bloomberg's Merryn Somerset Webb, outlined persistent constraints facing the AI industry's current trajectory. The data ceiling—the finite pool of quality training material—presents a hard boundary that throwing more compute power cannot overcome. Diminishing returns from increased computational scaling suggest the era of exponential AI improvement through sheer processing power may be slowing. Meanwhile, fundamental technical problems persist: hallucinations and probabilistic errors remain endemic to large language models despite billions in development spending.

These realities contrast sharply with the venture capital frenzy and massive infrastructure buildouts characterizing the past two years. Marecki's analysis suggests investors and companies betting on continuous AI breakthroughs from traditional scaling approaches may face disappointment. The sector may need different architectural approaches or breakthroughs rather than incremental improvements through larger models and datasets. Observers should watch whether the industry pivots toward new methodologies or consolidates spending around proven applications.

■ SOURCES

Bloomberg Tech

■ SUMMARY WRITTEN BY AI FROM THE LINKS ABOVE