Pratham: The First Company to Solve AI's Cost Problem Will Be a Tech Giant for the Next 20 Years
Pratham stated that the first company to truly solve the AI cost issue will become a tech giant for at least the next 20 years.
He believes the biggest bottleneck in current AI development is the high cost of inference and training, and that whoever breaks through this limitation will dominate the next technology cycle.
In terms of market mechanisms, AI infrastructure funding is accelerating toward companies that can sharply reduce computing costs. Firms focused on chip optimization, model optimization, and efficient inference platforms stand to benefit, while traditional AI labs that rely on energy-intensive training will come under pressure. Capital is leaning heavily toward AI technologies and business models that act as "cost killers."
Source: Public Information
ABAB AI Insight
Pratham has long tracked AI engineering and cost curves, having previously emphasized that "cost is everything." He believes the training and inference costs of today's large models remain extremely high. Just as early cloud computing drove unit prices down through scale, whoever first achieves an order-of-magnitude cost reduction will replicate AWS's dominance of the cloud era.
On the capital side, leading AI companies are pouring resources into model distillation, mixture-of-experts (MoE) architectures, dedicated chips, and inference optimization. Funding is shifting from pure compute procurement toward cost engineering and energy-efficiency gains. The strategic goal is to reach larger-scale deployment and commercialization through lower marginal costs, thereby building an unassailable moat.
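The marginal-cost logic above can be made concrete with a back-of-envelope sketch. The numbers below (GPU hourly cost, tokens-per-second throughput) are illustrative assumptions, not figures from the article; the point is only that serving more tokens per GPU-hour, e.g. via distillation or fewer active MoE parameters per token, directly shrinks cost per token:

```python
# Back-of-envelope inference unit economics.
# All inputs are illustrative assumptions, not measured figures.

def cost_per_million_tokens(gpu_hourly_cost: float,
                            tokens_per_second: float) -> float:
    """Marginal serving cost in dollars per 1M generated tokens."""
    tokens_per_hour = tokens_per_second * 3600
    return gpu_hourly_cost / tokens_per_hour * 1_000_000

# Hypothetical dense model: modest per-GPU throughput.
dense = cost_per_million_tokens(gpu_hourly_cost=2.0, tokens_per_second=50)

# Hypothetical distilled / MoE model: far fewer active parameters
# per token, so the same GPU serves ~10x more tokens per second.
efficient = cost_per_million_tokens(gpu_hourly_cost=2.0, tokens_per_second=500)

print(f"dense:     ${dense:.2f} per 1M tokens")
print(f"efficient: ${efficient:.2f} per 1M tokens")
print(f"cost reduction: {dense / efficient:.0f}x")
```

Under these assumed inputs, a 10x throughput gain yields exactly the order-of-magnitude cost reduction the commentary describes, which is why capital is flowing to cost engineering rather than raw compute procurement.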
Similar cases include AWS continuously lowering cloud computing costs through scale effects and self-developed chips, as well as Tesla's long-term optimization of autonomous driving hardware costs. The current AI industry is at a critical turning point from "burning money for scale" to "competing on cost efficiency."
Essentially, this is a story of capital concentration: AI infrastructure cost has become a core competitive barrier. The mechanism is straightforward. With compute demand for training and inference growing exponentially, the ability to optimize cost determines commercial viability. Pricing power therefore concentrates away from high-cost general-purpose model labs and toward companies that master low-cost inference, architectural innovation, and energy-efficiency breakthroughs, and those companies will determine who dominates the tech landscape for the next 20 years.