The first mass production factory of humanoid robots in Shanghai plans to start production in October
2024-08-19
Half a year after settling in the Lingang New Area, Zhiyuan Robotics, led by Peng Zhihui (known online as "Zhihuijun"), launched its first humanoid robot, the Yuanzheng ("Expedition") A1. A year later, the company, which benchmarks itself against Tesla's humanoid robots, has officially announced its mass-production and shipment schedule, putting its plan for a "first year of commercial use" into action.
On August 18, a reporter from The Paper (www.thepaper.cn) learned at a press conference held by Zhiyuan Robotics that the company will begin shipping in October this year, with estimated 2024 shipments of about 300 units: roughly 200 bipedal robots and 100 wheeled robots. According to Jiang Qingsong, partner and vice president of marketing services at Zhiyuan Robotics, the company's actual order volume has already exceeded 300 units.
The first phase of Zhiyuan Robotics' factory is located in Lingang Fengxian, making it the first mass-production factory for humanoid robots in Shanghai. "The factory has entered the final preparation stage for mass production; the production lines have been built and are now being commissioned," Jiang Qingsong told The Paper, adding that the mass-production ramp for the fourth quarter of this year is planned as follows: "There will be small-batch production in October, mass production will reach 100 units in November, and more in December."
Notably, Zhiyuan co-founder Peng Zhihui himself unveiled the company's 2024 products in a livestream on the morning of August 18. The launch covered five new commercial humanoid robots across two series, "Yuanzheng" and "Lingxi": the Yuanzheng A2, Yuanzheng A2-W, Yuanzheng A2-Max, Lingxi X1, and Lingxi X1-W. It came exactly one year after Peng Zhihui released the company's first humanoid robot, the Yuanzheng A1.
The five robots share a family design language, span wheeled and legged forms, and cover application scenarios such as interactive services, flexible intelligent manufacturing, special operations, scientific research and education, and data collection.
On the technology side, Peng Zhihui explained that Zhiyuan Robotics divides the robot system into four domains: power, perception, communication, and control, and has laid out all four since its first product. Highlights this time include the following. In the power domain, the PowerFlow joint module has reached mass production and been iteratively upgraded; the dexterous hand now has 19 degrees of freedom, with its active degrees of freedom doubled to 12; MEMS-based tactile sensing and visuotactile sensing have been introduced; and the 7-degree-of-freedom dual arms offer high-precision force control. In the perception domain, sensors such as RGB-D cameras, lidar, and panoramic cameras have been integrated, the occupancy-based perception approach from autonomous driving has been adopted, and environmental understanding has been further improved through SLAM algorithms.
In the communication domain, Zhiyuan Robotics has developed AimRT, its own native, lightweight, high-performance communication framework for intelligent robots. Peng Zhihui said that, compared with third-party middleware such as ROS, AimRT improves the performance, stability, and deployment efficiency and flexibility of the system, while remaining fully compatible with the existing ROS/ROS2 ecosystem.
In the control domain, Zhiyuan Robotics combines model-based and learning-based algorithms to further improve the robots' motion control and adaptability. It has also pre-researched AgentOS, a framework driven by a natural-language instruction set that can be adapted to different robot bodies and that, based on reinforcement learning, achieves precise orchestration and efficient execution of robot skills.
Notably, in an analogy to the levels of autonomous driving, Zhiyuan Robotics has defined a technical evolution route for embodied intelligence running from G1 to G5.
According to the company, over the past year Zhiyuan Robotics has made a breakthrough on the G2 route, realizing a series of zero-shot and few-shot general atomic skills, including the universal pose-estimation model UniPose, the universal grasping model UniGrasp, and the universal force-control plug-in model UniPlug. The G2-stage atomic capability models, aimed at flexible intelligent manufacturing and interactive service scenarios, have already been deployed commercially in several real-world settings.
"No one has proposed a standard for embodied intelligence. Based on our own R&D progress, we found that we had to make an effective definition so as to better guide research and development," Jiang Qingsong said, adding that the G1-to-G5 evolution route has already gained a degree of acceptance in the industry. "I hope it can become a consensus and eventually form a standard."
Also worth noting is the series of open-source plans Zhiyuan Robotics announced: AimRT, its high-performance communication framework for intelligent robots, will be open-sourced at the end of September; the Lingxi X1, incubated by X-Lab, will be fully open-sourced in September; and datasets based on AIDEA, comprising one million real-robot data entries and ten million simulation data entries, will be open-sourced in the fourth quarter of this year.
On real-robot data collection, Jiang Qingsong told The Paper that a training field for collecting real-robot data has been built. "We expect to build a collection field with about 100 machines by the end of September." Compared with internet data and simulation data, he said, real-robot data is what the embodied brain really needs. "It is equivalent to a person teaching a machine to do an action, thereby generating a piece of data. This data is the hardest and most costly to obtain, but it is also the most efficient."
"Currently, there are 1.5 workers for each machine. Our goal is for one worker to produce 1,000 pieces of data a day. The current cost per piece of data is estimated at 0.4 yuan. Ultimately, we want to obtain one million pieces of real-robot data." Reportedly, nearly 6,000 pieces of real-robot data are needed to generalize the training of one action. "The scenarios basically revolve around picking up, placing, and transferring."
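The figures quoted above imply some simple back-of-envelope totals. The sketch below, in Python, just multiplies out the stated numbers; the one simplifying assumption (not in the article) is that at the target rate each of the roughly 100 machines is run by one worker:

```python
# Back-of-envelope arithmetic from the figures quoted in the article.
# Assumption beyond the quoted numbers: at the target rate, one worker
# operates one machine in the ~100-machine collection field.

PIECES_PER_WORKER_PER_DAY = 1_000    # stated target output per worker
COST_PER_PIECE_YUAN = 0.4            # stated current cost estimate
TARGET_PIECES = 1_000_000            # stated ultimate goal
MACHINES = 100                       # planned collection-field size

daily_output = MACHINES * PIECES_PER_WORKER_PER_DAY    # pieces per day
days_to_target = TARGET_PIECES / daily_output          # days at full rate
total_cost_yuan = TARGET_PIECES * COST_PER_PIECE_YUAN  # total data cost

print(f"{daily_output} pieces/day, {days_to_target:.0f} days, "
      f"{total_cost_yuan:.0f} yuan")
```

On these assumptions the field would produce 100,000 pieces a day, reach the million-piece goal in about ten days of full operation, and the data itself would cost around 400,000 yuan, which gives a sense of why the company treats staffing ratio and per-piece cost as the key levers.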
As for open-sourcing most of the Lingxi X1's design data and code, and selling its core components, Peng Zhihui said this marks the arrival of an era in which "everyone can build humanoid robots". The Lingxi X1 was incubated by X-Lab (the Zhihuijun Laboratory), which Zhiyuan Robotics set up specifically for radical innovation and agile exploration.
Zhiyuan Robotics was founded in the Lingang New Area in February 2023, with a founding team that includes many industry veterans, among them Peng Zhihui ("Zhihuijun"). In the previously released "Lingang New Area 2023 Enterprise Financing List", Zhiyuan Robotics took the titles of "fastest financing", "most financing rounds", and "most investors": it secured angel-round financing in March 2023, completed five consecutive rounds within a year, and attracted a total of 21 participating institutions.
Jiang Qingsong also told The Paper that the Lingang New Area gave Zhiyuan Robotics strong support in its early development, "including our factory, because our factory is currently in Lingang. At the same time, we cooperate with Lingang on scenario development; for example, Lingang provides a good testing ground for some interactive service scenarios." As the company's products are commercialized, he said, "I believe the scope of cooperation will become even wider."
Thepaper.cn reporter He Liping
(This article is from The Paper. For more original information, please download the "The Paper" APP)