In addition to AI Agents, embodied robots represent another significant vertical application in the AI era. Morgan Stanley once predicted in a report that by 2050, the global market size for humanoid robots is expected to exceed $5 trillion.
As AI develops, robots will gradually evolve from mechanical arms in factories into companions in daily life, gaining perception and understanding through AI and ultimately the ability to make decisions independently. The problem is that today's robots resemble "mute" machines that cannot communicate with one another: each manufacturer uses its own language and logic, software is incompatible, and intelligence cannot be shared. It is as if a Xiaomi and a Tesla shared the same road but could not even pool road-condition data, let alone collaborate on a task.
What OpenMind aims to change is this fragmented, go-it-alone situation. It is not building robots; it wants to create a collaborative system that lets robots speak the same language, follow the same rules, and work together. Just as iOS and Android sparked an explosion of smartphone applications, and Ethereum provided a common foundation for the crypto world, OpenMind seeks to create a unified "operating system" and "collaboration network" for robots worldwide.
In short, OpenMind is building a universal operating system for robots, enabling them not only to perceive and act but also to collaborate safely and at scale in any environment through decentralized cooperation.
Who Is Supporting This Open Foundation
OpenMind has completed $20 million in seed and Series A funding, led by Pantera Capital. More important is the breadth and complementarity of the capital, which brings together nearly all the key pieces of this track. On one end are long-term players from the Western technology and finance ecosystem: Ribbit, Coinbase Ventures, DCG, Lightspeed Faction, Anagram, Pi Network Ventures, Topology, and Primitive Ventures. They understand the paradigm shifts in crypto and AI infrastructure and can supply models, networks, and compliance experience for the "agent economy + machine internet." On the other end is industrial momentum from the East, represented by Sequoia China's supply-chain and manufacturing network, which knows exactly what it takes, in process and cost terms, to turn a prototype into a scalable product. Together, these two forces give OpenMind not just funding but the pathways and resources to go from lab to production line, and from software down to manufacturing.
This pathway is also connecting with traditional capital markets. In June 2025, when KraneShares launched its global humanoid and embodied intelligence index ETF (KOID), it chose Iris, a humanoid robot co-customized by OpenMind and RoboStore, to ring the opening bell at Nasdaq, the first "robot guest" in the exchange's history to perform the ceremony. This synchronizes the technology and financial narratives, and sends a public signal about how machine assets are priced and settled.
As Pantera Capital partner Nihal Maunder stated:
"If we want intelligent machines to operate in open environments, we need an open intelligent network. What OpenMind is doing for robots is akin to what Linux did for software and Ethereum did for blockchain."
The Team: From Lab to Production Line
OpenMind's founder, Jan Liphardt, is an associate professor at Stanford University and a former professor at Berkeley, with a long-standing focus on data and distributed systems and deep experience in both academia and engineering. He advocates open-source reuse, replacing black boxes with auditable, traceable mechanisms, and integrating AI, robotics, and cryptography across disciplines.
OpenMind's core team comes from institutions such as OKX Ventures, Oxford Robotics Institute, Palantir, Databricks, and Perplexity, covering key areas including robot control, perception and navigation, multimodal and LLM orchestration, distributed systems, and on-chain protocols. In addition, an advisory group of experts from academia and industry (including Stanford robotics lead Steve Cousins, Bill Roscoe of the Oxford Blockchain Centre, and Imperial College AI safety professor Alessio Lomuscio) underwrites the robots' safety, compliance, and reliability.
OpenMind's Solution: A Two-Layer Architecture, One Set of Rules
OpenMind has built a reusable infrastructure that allows robots to collaborate and share information across devices, manufacturers, and even national borders:
Device Side: OM1, an AI-native operating system for physical robots, covers the full pipeline from perception to execution, enabling different types of machines to understand their environment and complete tasks;
Network Side: FABRIC, a decentralized collaboration network, provides identity, task allocation, and communication mechanisms so that collaborating robots can recognize each other, divide up tasks, and share state.
This combination of "operating system + network layer" allows robots not only to act independently but also to cooperate, align processes, and complete complex tasks together within a unified collaboration network.
OM1: AI-Native Operating System for the Physical World
Just as smartphones need iOS or Android to run applications, robots also require an operating system to run AI models, process sensor data, make reasoning decisions, and execute actions.
OM1 was born for this purpose; it is an AI-native operating system for real-world robots, enabling them to perceive, understand, plan, and complete tasks in various environments. Unlike traditional, closed robot control systems, OM1 is open-source, modular, and hardware-agnostic, capable of running on various forms such as humanoid, quadruped, wheeled, and robotic arms.
Four Core Steps: From Perception to Execution
OM1 breaks robotic intelligence down into four universal steps: Perception → Memory → Planning → Action. This process is fully modularized in OM1 and connected through a unified data language, making intelligent capabilities composable, replaceable, and verifiable.
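The four-step loop can be sketched in a few lines of Python. This is a minimal illustration of the idea only; the class and method names are hypothetical and are not OM1's real API.

```python
from dataclasses import dataclass, field

@dataclass
class RobotLoop:
    """Illustrative Perception -> Memory -> Planning -> Action loop.
    All names here are hypothetical, not OM1's actual interfaces."""
    memory: list = field(default_factory=list)

    def perceive(self, raw_input: str) -> str:
        # OM1 captions multimodal sensor data into natural language.
        return f"observation: {raw_input}"

    def remember(self, observation: str) -> None:
        # Observations accumulate as context for later planning.
        self.memory.append(observation)

    def plan(self) -> str:
        # In OM1 a planner (an LLM) reads the context and picks an action.
        context = " | ".join(self.memory)
        return f"plan based on [{context}]"

    def act(self, plan: str) -> str:
        # The plan is handed to the execution layer.
        return f"executing: {plan}"

loop = RobotLoop()
obs = loop.perceive("a person waving")
loop.remember(obs)
print(loop.act(loop.plan()))
# -> executing: plan based on [observation: a person waving]
```

Because each stage only exchanges plain text, any one stage can be swapped out without touching the others, which is the composability the paragraph above describes.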
OM1's Architecture
Specifically, the architecture of OM1 consists of the following seven layers:
Sensor Layer (collects information): multimodal inputs from cameras, LIDAR, microphones, battery status, GPS, etc.
AI + World Captioning Layer (translates information): multimodal models convert visual, auditory, and status data into natural-language descriptions (e.g., "You see a person waving").
Natural Language Data Bus (transmits information): all perceptions become timestamped language fragments passed between modules.
Data Fuser (combines information): integrates multi-source inputs into a complete decision-making context (the prompt).
Multi-AI Planning/Decision Layer (generates decisions): multiple LLMs read the context and generate action plans under on-chain rules.
NLDB Downstream Channel (relays decisions): passes decision results to the hardware execution system through the language intermediary layer.
Hardware Abstraction Layer (executes actions): converts language instructions into low-level control commands that drive hardware actions (movement, speech, transactions, etc.).
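The middle of this stack, where timestamped language fragments flow over the data bus and are fused into one prompt, can be sketched as follows. The `Fragment` schema and `fuse` function are invented for illustration and do not reflect OM1's actual wire format.

```python
import time
from dataclasses import dataclass

# Hypothetical sketch of a natural-language data bus: each perception becomes
# a timestamped language fragment, and a fuser merges them into one decision
# context (the prompt). Names are illustrative, not OM1's API.

@dataclass
class Fragment:
    ts: float      # timestamp of the observation
    source: str    # e.g. "vision", "audio", "battery"
    text: str      # natural-language description of the input

def fuse(fragments: list) -> str:
    """Data Fuser: merge multi-source fragments, oldest first, into one prompt."""
    ordered = sorted(fragments, key=lambda f: f.ts)
    lines = [f"[{f.source} @ {f.ts:.1f}] {f.text}" for f in ordered]
    return "Context for planner:\n" + "\n".join(lines)

now = time.time()
bus = [
    Fragment(now + 1.0, "audio", "You hear someone say 'hello'."),
    Fragment(now, "vision", "You see a person waving."),
    Fragment(now + 2.0, "battery", "Battery at 76%."),
]
print(fuse(bus))
```

The key design point is that every module, whatever its sensor modality, emits the same currency (timestamped text), so the planner downstream never needs modality-specific parsing.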
Quick to Get Started, Widely Applicable
To turn an idea into a robot-executable task quickly, OM1 streamlines the development path into a ready-to-use process. Developers define goals and constraints in natural language with the help of large models, generating reusable skill packages in hours rather than months of hard-coding. The multimodal pipeline natively integrates LiDAR, vision, and audio, eliminating complex manual sensor fusion. The model side comes pre-connected to GPT-4o, DeepSeek, and mainstream VLMs, with voice input and output available out of the box. The system layer is fully compatible with ROS2 and Cyclone DDS, and the HAL adaptation layer integrates seamlessly with the Unitree G1, Go2, Turtlebot, and various robotic arms. OM1 is also natively linked to FABRIC's identity, task orchestration, and on-chain settlement interfaces, so robots can execute tasks independently or join a global collaboration network with pay-per-use billing and auditing.
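To make the "skill package" idea concrete, here is a hedged sketch of what such a package might contain: a natural-language goal for the planner plus machine-checkable constraints and a hardware compatibility list. The schema and every field name are invented for illustration; OM1's actual skill format may differ.

```python
import json

# Hypothetical "skill package": a natural-language goal plus constraints,
# serialized so other robots can reuse it. The schema is illustrative only.

def make_skill(name: str, goal: str, constraints: list, hardware: list) -> str:
    skill = {
        "name": name,
        "goal": goal,                 # natural-language objective for the LLM planner
        "constraints": constraints,   # safety/behavior limits checked at runtime
        "compatible_hardware": hardware,
        "version": "0.1",
    }
    return json.dumps(skill, indent=2)

package = make_skill(
    name="patrol_ward",
    goal="Patrol the hospital ward every 30 minutes and report anomalies.",
    constraints=["max_speed_mps: 0.5", "avoid_humans_radius_m: 1.0"],
    hardware=["Unitree Go2", "Unitree G1"],
)
print(package)
```

Because the goal is plain language and the constraints are declarative, the same package can be interpreted by different planners on different hardware, which is what makes hours-not-months reuse plausible.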
OM1 has already been validated in multiple real-world scenarios: the quadruped platform Frenchie (a Unitree Go2) completed complex navigation tasks at the 2024 USS Hornet Defense Technology Showcase; the humanoid platform Iris (a Unitree G1) performed live human-robot interactions at the Coinbase booth during EthDenver 2025; and through RoboStore's education program, OM1 has entered university curricula across the United States, extending the same development paradigm to teaching and research.
FABRIC: Decentralized Human-Robot Collaboration Network
Even with strong individual intelligence, robots that cannot collaborate under trusted conditions still fight alone. Today's fragmentation stems from three fundamental problems: identity and location cannot be standardized and verified, so outside parties cannot trust "who I am, where I am, and what I am doing"; skills and data lack controlled authorization pathways, so they cannot be safely shared and invoked across parties; and the boundaries of control and responsibility are unclear, making it hard to agree in advance on frequency, scope, and return conditions, or to trace them afterward. FABRIC addresses these pain points at the system level: it uses decentralized protocols to give robots and operators verifiable on-chain identities, and around those identities provides integrated infrastructure for task publishing and matching, end-to-end encrypted communication, execution records, and automatic settlement, turning collaboration from ad-hoc connections into institutionalized, evidence-backed agreements.
Operationally, FABRIC can be understood as a network layer combining positioning, connection, and scheduling. Identities and locations are continuously signed and verified, so nodes naturally maintain mutually visible, trustworthy proximity relationships. Point-to-point channels act as on-demand encrypted tunnels, enabling remote control and monitoring without public IPs or complex network setup. The entire flow from task publishing through acceptance, execution, and sign-off is standardized and recorded, so settlement can automatically split revenue and refund deposits, and compliance or insurance scenarios can verify who did what, when, and where. On this foundation, typical applications emerge naturally: enterprises remotely operate equipment across regions; cities turn cleaning, inspection, and delivery into on-demand Robot-as-a-Service; fleets report real-time road conditions and obstacles to build shared maps; and when needed, nearby robots can be dispatched to complete 3D scanning, construction surveying, or insurance evidence collection.
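The publish → accept → execute → settle flow described above can be sketched as a toy escrow model. Everything here, the class names, the deposit-forfeiture rule, the in-memory ledger, is an illustrative assumption; the real protocol runs these steps on-chain.

```python
from dataclasses import dataclass, field

# Toy sketch of a FABRIC-style task lifecycle with escrowed reward and
# deposit. All names and the settlement logic are illustrative assumptions.

@dataclass
class Task:
    task_id: str
    reward: float
    deposit: float
    publisher: str
    worker: str = None
    log: list = field(default_factory=list)   # audit trail: (event, actor)

class Fabric:
    def __init__(self):
        self.tasks = {}
        self.balances = {}

    def publish(self, task: Task):
        # Publisher escrows the reward up front.
        self.balances[task.publisher] = self.balances.get(task.publisher, 0) - task.reward
        task.log.append(("published", task.publisher))
        self.tasks[task.task_id] = task

    def accept(self, task_id: str, worker: str):
        # Worker posts a deposit when taking the job.
        task = self.tasks[task_id]
        self.balances[worker] = self.balances.get(worker, 0) - task.deposit
        task.worker = worker
        task.log.append(("accepted", worker))

    def settle(self, task_id: str, verified: bool):
        task = self.tasks[task_id]
        if verified:   # execution proof checks out: pay reward, refund deposit
            self.balances[task.worker] += task.reward + task.deposit
            task.log.append(("settled", task.worker))
        else:          # failure: publisher recovers reward and keeps deposit
            self.balances[task.publisher] += task.reward + task.deposit
            task.log.append(("forfeited", task.worker))

net = Fabric()
net.publish(Task("t1", reward=10.0, deposit=2.0, publisher="city"))
net.accept("t1", worker="robot-42")
net.settle("t1", verified=True)
print(net.balances)   # the city funded the task; robot-42 earned the reward
```

The append-only `log` is the point: every state change records who acted, so an auditor can replay "who did what, when" after the fact.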
Because identity, tasks, and settlement are hosted on the same network, the boundaries of collaboration are defined in advance, the facts of execution are verified afterward, and skill invocation carries measurable costs and benefits. In the long run, FABRIC can evolve into the application distribution layer for machine intelligence: skills circulate globally under programmable authorization terms, and the data generated by those invocations feeds back into models and strategies, letting the whole collaboration network continuously upgrade itself under trustworthy constraints.
Web3 is Writing "Openness" into the Machine Society
The robotics industry is rapidly consolidating around a few platforms, with hardware, algorithms, and networks locked into closed stacks. The value of decentralization is that robots of any brand, in any region, can collaborate, exchange skills, and settle payments within the same open network, without depending on a single platform. OpenMind encodes this order in on-chain infrastructure. Each robot and operator has a unique on-chain identity (ERC-7777), with verifiable hardware fingerprints and permissions. Tasks are published, bid on, and matched under public rules, and execution generates encrypted proofs with time and location recorded on-chain. On completion, contracts automatically settle revenue sharing, insurance, and deposits, with results verifiable in real time. New skills are released under contracts that define invocation counts and compatible devices, circulating globally while protecting intellectual property. The robot economy is thus born with anti-monopoly, composable, and auditable genes, with openness written into the underlying protocols of the machine society.
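The idea of a verifiable identity binding a hardware fingerprint to time and location can be illustrated with a small signing sketch. This uses an HMAC over stdlib primitives purely as a stand-in for the on-chain signature scheme; ERC-7777's actual mechanics, key management, and payload format are not shown here.

```python
import hashlib, hmac, json

# Illustrative sketch of a verifiable robot identity attestation. HMAC is a
# stand-in for the real on-chain signature scheme; all fields are invented.

def hardware_fingerprint(serial: str, model: str) -> str:
    # Deterministic fingerprint derived from immutable hardware facts.
    return hashlib.sha256(f"{serial}:{model}".encode()).hexdigest()

def sign_attestation(key: bytes, fingerprint: str, location: str) -> dict:
    payload = {"fingerprint": fingerprint, "location": location,
               "ts": 1700000000}  # fixed timestamp for reproducibility
    msg = json.dumps(payload, sort_keys=True).encode()
    return {"payload": payload,
            "sig": hmac.new(key, msg, hashlib.sha256).hexdigest()}

def verify_attestation(key: bytes, att: dict) -> bool:
    msg = json.dumps(att["payload"], sort_keys=True).encode()
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, att["sig"])

key = b"registered-robot-key"
fp = hardware_fingerprint("SN-001", "Unitree G1")
att = sign_attestation(key, fp, "warehouse-3")
print(verify_attestation(key, att))           # True
print(verify_attestation(b"wrong-key", att))  # False
```

Once "who I am, where I am, and when" is bound into one signed payload, any counterparty can check the claim without trusting the robot's manufacturer, which is the property the paragraph above attributes to on-chain identity.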
Bringing Embodied Intelligence Out of Isolation
Robots are moving from exhibition stands into daily life: patrolling hospital wards, learning new skills on campuses, and completing inspections and modeling in cities. The real challenge is not stronger motors but making machines from different sources trustworthy, able to exchange information, and able to collaborate; and to reach scale, distribution and supply matter even more than technology.
Therefore, OpenMind's path to deployment starts from channels rather than from stacking parameters. In collaboration with RoboStore (one of the largest Unitree distributors in the U.S.), OM1 is being turned into standardized teaching materials and lab kits, with integrated hardware-and-software supply promoted across thousands of U.S. universities. The education system is the most stable source of demand, embedding OM1 directly into the coming years' growth in developers and applications.
For broader social distribution, OpenMind leverages its investor ecosystem to turn the software's distribution outlet into a platform. Large crypto ecosystems such as Pi add further reach to this model, gradually forming a positive feedback loop in which some people write, others use, and others pay. Educational channels provide stable supply while platform distribution brings demand at scale, giving OM1 and its upper-layer applications a replicable expansion trajectory.
In the Web2 era, robots were often locked inside single manufacturers' closed stacks, and functions and data could hardly flow across platforms. Once the teaching-material standard and the distribution platform are connected, OpenMind makes openness the default: the same system enters campuses, moves into industry, and keeps spreading through platform networks, turning openness into the default starting point for scalable deployment.