OpenAI's $6.5B Acquisition of Jony Ive's IO: The Dawn of Embodied Intelligence

In a groundbreaking move signaling a major shift in artificial intelligence strategy, OpenAI announced on May 21, 2025, that it would acquire IO Products, the hardware startup founded by former Apple design chief Jony Ive, in an all-stock transaction valued at approximately $6.5 billion. The acquisition is not just OpenAI's largest deal to date; it marks a pivotal moment in the evolution of AI from cloud-based models to physical hardware.
The Strategic Pivot from Cloud AI to Physical Presence
Over the past two years, we've witnessed remarkable advancements in large language models, from ChatGPT to Gemini and Claude, with tech giants engaging in an arms race focused on parameter count, inference speed, and generation quality. However, as model capabilities increasingly converge, the next competitive frontier in AI is shifting from "who has the strongest model" to "who truly owns the user relationship."
This paradigm shift redefines the significance of edge AI hardware. If large language models represent the "brain" of next-generation intelligence, hardware devices serve as their "body" and "interface." The entity that controls the user entry point ultimately controls the data, feedback loops, interactions, and ecosystem development.
Sam Altman, OpenAI's CEO, clearly recognizes this reality. He has publicly stated that for AI to reach its full potential, users must be able to "interact with AI in natural, everyday ways." The current approach of accessing ChatGPT through mobile apps or web browsers is not a sustainable long-term strategy.
Unlike Google with its Android operating system and Chrome browser, Meta with its social networks and smart glasses, or Apple with its system-level AI features, OpenAI lacks native distribution channels. By continuing to rely on these platforms, OpenAI effectively surrenders control of ChatGPT's user relationship to third parties. Every interaction, subscription, and piece of usage data must flow through channels defined by others, a costly arrangement that keeps OpenAI from ever closing the loop on its own ecosystem.
The Birth of AI Companion Devices
OpenAI's solution is to build its own distribution system—not by developing another app, but by directly entering the hardware layer to create an "AI-native entry device." This explains the collaboration with IO to develop what industry insiders call an "AI Companion Device."
This device represents a departure from conventional smartphones, tablets, or wearables. Instead, it's an entirely new form of AI interaction terminal: always present, potentially screen-free, voice-interactive, contextually aware, and deeply integrated into users' daily lives—becoming the "first point of contact" between AI and humans.
According to multiple media reports, including The Information, WIRED, and TechCrunch, the OpenAI-IO collaboration draws inspiration from the 2013 film "Her," which depicted a future where an omnipresent AI operating system could understand human emotions, engage in conversation, provide companionship, and even form emotional connections. A decade later, we may be just one AI device away from this reality.
The device under development is reportedly:
- Screen-free or minimally dependent on visual interfaces
- Designed to be constantly present (wearable, pocket-sized, or desktop form factor)
- Capable of environmental awareness, recognizing context, tone, location, and emotions
- Primarily voice-interactive, requiring no manual operation or wake words
- Able to continuously learn user behaviors and preferences, evolving toward "understanding the user"
This represents a fundamental shift in how we interact with AI: from "you go find it" to "it's always with you." AI's presence moves out of cloud data centers and into every scenario and moment of daily life.
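To make that interaction model concrete, here is a minimal sketch of the wake-word-free voice loop such a device implies. It uses OpenAI's publicly documented speech and chat APIs purely as stand-ins, since the device's actual on-board stack is unannounced, and the `record_utterance` helper is hypothetical.

```python
# Speculative sketch of a screen-free, voice-first loop; OpenAI's public
# APIs stand in for whatever stack the actual device would run.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system",
            "content": "You are an always-available, context-aware companion."}]

def record_utterance() -> str:
    """Hypothetical helper: capture one spoken utterance via voice-activity
    detection (no wake word) and return the path to a WAV file."""
    raise NotImplementedError

while True:
    # 1. Listen: transcribe the captured speech.
    with open(record_utterance(), "rb") as audio:
        text = client.audio.transcriptions.create(
            model="whisper-1", file=audio).text

    # 2. Think: feed the running conversation to a chat model.
    history.append({"role": "user", "content": text})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})

    # 3. Speak: synthesize the reply; playback is device-specific.
    speech = client.audio.speech.create(model="tts-1", voice="alloy",
                                        input=answer)
    speech.stream_to_file("reply.mp3")
```

Even this toy loop exposes the real engineering challenge: an always-present companion needs always-on, low-power listening, persistent on-device context, and latency far below what a full cloud round trip typically allows.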
Jony Ive: Redesigning the Human-AI Relationship
The partnership with Jony Ive is central to realizing this vision of "companion intelligence." Technical capability is merely the foundation; the true breakthrough lies in redesigning the human-machine relationship—precisely why OpenAI chose to collaborate with Ive.
Ive isn't simply "designing a device shell" for OpenAI; he's fundamentally reconstructing how AI and humans interact, starting from first principles. According to TechCrunch, Ive and his design firm LoveFrom have taken complete control of OpenAI's design work, leading the user experience and visual language of both hardware and software products.
This isn't an outsourced collaboration but a strategic design co-creation. Ive, the visionary behind the iPhone, iPad, and Apple Watch, now aims to create an AI device "more disruptive than the iPhone." The difference is that he's not building a communication tool but constructing an "intelligent companion with physical form."
This transformation is particularly crucial in design language:
- Shifting from "technological feel" to "lifestyle feel" and "emotional warmth"
- Moving from "functionality" to "presence" and "emotional affinity"
- Evolving from "user operation" to "device proactive understanding and collaboration"
As Ive stated: "I increasingly feel that everything I've learned over the past 30 years has led me to this moment." This marks not just a designer's return but a rewriting of the human-machine interface philosophy.
Embodied Intelligence: The Next Evolution of AI

We can describe this new type of device as an "Embodied Intelligence Agent" with three core characteristics:
- Emotional Interface: Building "warm" communication through voice, tone, rhythm, language content, and semantic understanding
- Perceptual Capability: Understanding the user's environment, emotions, needs, and historical behaviors
- Behavioral Autonomy: Not just passively responding but actively assisting, reminding, and planning—becoming an "intelligent agent" rather than a "toolbox"
This signals the evolution of AI hardware from "passive tool" to "active partner," from "intelligent terminal" to "everyday intelligent agent."
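As a thought experiment, these three characteristics map cleanly onto a classic sense-reason-act agent loop. The sketch below is purely illustrative, not anything OpenAI or IO has described; every class, field, and decision rule in it is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    """Perceptual capability: one snapshot of the user's context."""
    utterance: str | None   # what the user said, if anything
    location: str           # e.g. "kitchen", "commuting"
    time_of_day: str        # e.g. "morning", "evening"
    inferred_mood: str      # in a real system, derived from tone and prosody

@dataclass
class CompanionAgent:
    """Behavioral autonomy: decides when to act, not just how to answer."""
    memory: list[Observation] = field(default_factory=list)

    def perceive(self, obs: Observation) -> None:
        self.memory.append(obs)  # accumulate history to learn preferences

    def decide(self) -> str | None:
        """Return a proactive action, or None when silence serves best."""
        latest = self.memory[-1]
        if latest.inferred_mood == "stressed" and latest.time_of_day == "evening":
            # Emotional interface: the choice of *when and how* to speak
            # matters as much as the content of the reply.
            return "gently suggest winding down; defer non-urgent reminders"
        if latest.utterance:
            return f"respond to: {latest.utterance!r}"
        return None  # an active partner also knows when to stay quiet
```

The point of the toy is its shape, not its rules: perception feeds a persistent memory, and the agent's output is a decision about whether and how to act, which is exactly what separates an "intelligent agent" from a "toolbox."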
In this process, hardware is no longer just a carrier for models but an extension of personality. AI doesn't just answer your questions; it coexists, empathizes, and lives alongside you.
When ChatGPT acquires a "body," AI ceases to be merely "technology" and becomes a "presence" in your daily life.
The Dual Path to Embodied Intelligence: B2B Edge AI and B2C Companion Devices
In the development of AI hardware, today and tomorrow aren't disconnected extremes but two strands of a "double helix": one is the rapidly deploying class of B2B edge AI devices built for specific industries; the other is the emerging AI companion assistant.
One represents current technological reality, while the other points toward a reconstruction of human-machine relationships. Together, they drive large AI models from "cloud brains" toward "intelligent existence in the real world."
Edge industry-specific AI devices represent the most realistic path today. The essence of B2B edge AI is deploying lightweight, edge-optimized models to IoT terminals, giving those terminals a degree of local perception, processing, and decision-making capability. Compared with traditional cloud-dependent AI solutions, edge devices offer three practical advantages: faster response (no network round trip), better energy and bandwidth efficiency, and stronger privacy, since data can be processed without leaving the device.
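As a concrete illustration of the pattern, the sketch below runs a small quantized classifier entirely on-device with ONNX Runtime. The model filename, input shape, and use case (production-line defect detection) are placeholder assumptions, not a reference to any particular product.

```python
# Minimal on-device inference: no network call, raw data never leaves the box.
import numpy as np
import onnxruntime as ort

# A small, quantized model exported for the edge (placeholder filename).
session = ort.InferenceSession("defect_classifier.quant.onnx",
                               providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

def classify(frame: np.ndarray) -> int:
    """Run one camera frame (1x3x224x224 float32) through the local model."""
    logits = session.run(None, {input_name: frame})[0]
    return int(np.argmax(logits))  # index into a locally stored label table

# In production the camera driver supplies frames; here, a dummy input.
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
print("predicted class:", classify(dummy))
```

The three advantages fall out directly: latency is bounded by local compute rather than a network round trip, skipping a raw-video uplink saves energy and bandwidth, and the frames themselves never leave the device.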
Currently, edge industry-specific AI devices have been implemented in multiple scenarios. In smart cities, cameras, traffic signals, and public facilities are achieving real-time recognition, event detection, and automatic scheduling through edge AI. In industrial settings, edge devices are deployed in production lines, robots, and sensors, performing quality inspection, predictive maintenance, and other tasks. In smart homes, voice assistants, smart speakers, robot vacuums, and door lock systems are gradually incorporating local AI models to improve response speed and user experience.
According to forecasts from Transforma Insights, more than 900 million edge AI devices will be connected to networks globally by 2033. For most enterprises, edge devices are the first step in bringing AI into the physical world, allowing data to be processed and acted on locally in real time rather than merely uploaded to the cloud for analysis.
However, while edge B2B AI devices achieve "object intelligence," their intelligence is distributed and fragmented—"point intelligence" in the environment. They focus more on scenario efficiency, device coordination, and system optimization rather than understanding and accompanying "people themselves." This is precisely what AI companion assistants aim to complement.
B2C AI companion assistants represent the long-term vision of AI entering daily life. If edge industry-specific AI devices represent "scenario intelligence," then B2C AI companion assistants embody "user intelligence."
The former embeds AI into the environment, while the latter brings AI closer to human bodies and consciousness. Their differences lie not only in technological form but also in the depth of interactive relationships and the possibility of emotional connections.
The AI companion assistant (Companion Device) is an intelligent terminal driven by large models. It is designed not to control lights, adjust temperature, or monitor data, but to understand you, accompany you, serve you, and gradually build an interactive relationship with you as an "intelligent agent."
It may not have a screen or even a clear form, but it's always by your side, ready to respond, understand context, participate in conversations, provide suggestions, and silently record and learn your behavioral preferences.
The significance of AI companion assistants is that they make AI "embodiment" possible and allow AI to evolve from a tool to a "second self."
Technologically, it relies on the perception, reasoning, and language capabilities of large AI models; in design, it requires highly contextualized and humanized interaction logic; commercially, it means AI is no longer just a software subscription or cloud service but a "resident interactive presence": an "intelligent individual" that can be carried, worn, and trusted.
From this perspective, B2B edge AI hardware solves "how AI enters the real world," while AI companion assistants solve "how AI truly enters human life." One is system-level, the other personal-level; one distributed, the other resident; one emphasizing efficiency, the other connection.
As revealed by the OpenAI-Ive collaboration to create AI companion assistants, future AI hardware is no longer just about "making devices smarter" but about "giving intelligence a body, warmth, and resonance." This is an evolution that crosses the boundaries of technology, design, and philosophy.
Therefore, B2B edge AI hardware and B2C AI companion assistants aren't mutually exclusive paths but the two strands of a double helix, jointly constructing the future form of AI hardware: the former builds the infrastructure, the latter defines the ultimate form. Together, they drive AI from "visible device intelligence" toward "felt intelligent presence."
If edge industry-specific AI hardware is the first step in making AI "visible," then AI companion assistants are the endpoint of making AI "perceptible." Each step of evolution between them represents a closer relationship between humans and artificial intelligence.
The Future of AI Hardware: From Tools to Companions
Today, when we discuss artificial intelligence, the focus remains largely on model strength, parameter count, and inference speed. But what truly determines whether AI can deeply integrate into human life has never been just algorithms and computing power; it is how AI exists in our daily routines.
AI hardware is becoming the physical embodiment of artificial intelligence's "presence."
It's not just a terminal executing instructions but an entry point for understanding, accompanying, and coexisting with humans. The next decisive platform won't be defined by faster chips but by devices that understand you better. Through them, AI will gain its own "body," and humans will gain truly meaningful "intelligent companions."
OpenAI's acquisition of IO represents more than just a business transaction—it signals a fundamental shift in how we'll interact with artificial intelligence in the coming years. As the boundaries between digital and physical continue to blur, we're witnessing the dawn of embodied intelligence that promises to transform our relationship with technology in ways we're only beginning to imagine.