Xiaomi debuted a bipedal humanoid robot during a launch event for its foldable handsets. The CyberOne can perceive 3D space and recognize people, gestures, and expressions. It can distinguish 45 classifications of human emotion and comfort users in times of sadness. Xiaomi touts a range of real-world applications for the bot, including manufacturing assistance and human companionship.
Long gone are the days when an electronics company could simply release a device and call it an event. At today's big launch event in Beijing, Xiaomi followed up its foldable news by giving the stage over to CyberOne. The bipedal robot joined Lei Jun onstage, exchanging greetings with the CEO and presenting him with a long-stemmed flower.
At first glance, the robot isn't exactly sprightly when it comes to locomotion, but it's still a solid demonstration, and very much not a person in a spandex suit. It's the latest sign of Xiaomi's growing robotics ambitions, which began with vacuums and have since expanded to include last year's Spot-like CyberDog.
We've watched plenty of consumer brands flex their robotics prowess at events like this, including Samsung and LG, so it's hard to tell where CyberOne falls on the spectrum between serious pursuit and stage show.
Lei Jun was quick to highlight the company's investment in the space, noting that CyberOne's AI and mechanical capabilities were developed entirely in-house by Xiaomi Robotics Lab. He said the company has invested heavily in research and development across numerous areas, including software, hardware, and algorithm development.
— leijun (@leijun) August 11, 2022
There's an extremely broad set of claims here, including the ability to understand human emotions. Xiaomi notes that humanoid robots rely on vision to make sense of their surroundings. Equipped with a self-developed Mi-Sense depth vision module combined with an AI interaction algorithm, CyberOne can perceive 3D space and recognize individuals, gestures, and expressions, allowing it not only to see but to process its environment. To communicate with the world, CyberOne is outfitted with a self-developed MiAI environment semantics recognition system and a MiAI vocal emotion identification engine, enabling it to recognize 85 types of environmental sounds and 45 classifications of human emotion. CyberOne can detect happiness and even comfort the user in times of sadness. All of these capabilities are integrated into CyberOne's processing units, which are paired with a curved OLED module to display real-time interactive information.
Equally broad are the promised real-world applications, ranging from manufacturing assistance to human companionship. There will no doubt be plenty of use for both of those capability sets down the road, but that's a long way from this demo. For now, it probably makes the most sense to view CyberOne as something of an analog to, say, Honda's Asimo: a promising experiment that serves as a good brand ambassador for much of the work being done in the field.