OpenAI Co-founder Greg Brockman: Tokens Are Becoming a Universal Input for Problem Solving
OpenAI co-founder Greg Brockman stated that tokens are rapidly evolving into a universal input format for solving a wide range of problems.
This trend allows AI systems to process multiple modalities of data, including text, code, images, and audio, in a unified token format, greatly simplifying problem solving across different fields.
The standardization of token inputs is accelerating AI's shift from specialized tools to general intelligent agents, expanding the boundaries of its applications.
Source: Public Information
ABAB AI Insight
Greg Brockman has championed tokenizer and scaling-law research since the early days of OpenAI, and between 2023 and 2025 he repeatedly emphasized long context and multimodal tokenization. This viewpoint continues his long-standing advocacy of a "unified interface" strategy: he previously helped lead the GPT series' transition from pure text to a multimodal token architecture.
On the capital side, OpenAI is investing core R&D resources into tokenizer efficiency, larger context windows, and natively multimodal models, motivated by the goal of building a closed-loop ecosystem where "token is input, token is output." The aim is to lock in developers and enterprise users through a unified interface, expand platform-level pricing power, and create a data flywheel.
Just as bits became the universal language of the digital age, pixels unified image processing, and the HTTP protocol defined internet interactions, AI is now midway through a transformation from fragmented inputs to a unified token layer.
Essentially, this represents a restructuring of the industry chain: traditional AI relies on separate, specialized input pipelines for each modality, while token unification transforms heterogeneous data into interoperable, standardized units, fundamentally changing how models are trained, run, and integrated. Mechanistically, this lets platforms like OpenAI control upstream interface pricing power by managing tokenizer and context protocols, pushing the entire AI industry chain toward a "token economy."
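The idea of "heterogeneous data as interoperable standardized units" can be illustrated with a toy byte-level scheme. This is a hedged sketch only: real systems use learned tokenizers (e.g. BPE for text, patch or codec encoders for images and audio), and the `tokenize_bytes` helper and modality markers below are hypothetical, not any OpenAI API.

```python
# Illustrative sketch: heterogeneous data reduced to one shared token stream.
# Production tokenizers are learned; this byte-level scheme only demonstrates
# the "interoperable standardized units" idea from the paragraph above.

def tokenize_bytes(data: bytes, modality_id: int, vocab_base: int = 256) -> list[int]:
    """Map raw bytes to integer tokens, prefixed with a modality marker.

    Byte values occupy ids 0-255; ids from vocab_base upward are reserved
    for special marker tokens (one per modality).
    """
    marker = vocab_base + modality_id
    return [marker] + list(data)

TEXT, IMAGE, AUDIO = 0, 1, 2

text_tokens  = tokenize_bytes("hello".encode("utf-8"), TEXT)
image_tokens = tokenize_bytes(bytes([137, 80, 78, 71]), IMAGE)  # PNG magic bytes
audio_tokens = tokenize_bytes(bytes([82, 73, 70, 70]), AUDIO)   # RIFF/WAV header

# All three modalities now share one integer vocabulary and can be
# concatenated into a single sequence for a single model to consume.
sequence = text_tokens + image_tokens + audio_tokens
print(sequence[:8])  # [256, 104, 101, 108, 108, 111, 257, 137]
```

Once every modality lands in one vocabulary, the same transformer stack, context window, and per-token pricing apply uniformly, which is exactly the interface leverage the paragraph describes.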
ABAB News · Cognitive Laws
Unified input is essential for general intelligence.
Tokens are not a mere technical detail; they are the currency of the next generation of computing.
Whoever defines the input format owns the entry point for problem-solving.