Robot uses neuromorphic vision to mimic human movement

SynSense has demonstrated the capabilities of its neuromorphic system-on-chip (SoC) in a robot capable of learning and imitating human body movements.

At the 22nd China Shantou International Toy Fair in Chenghai, the firm’s Speck SoC was shown integrated into a robot capable of visual perception, human body recognition, and mimicry.

Made by QunYu, a Chinese developer of programmable intelligent toys, the robot is able to recognise up to eight different human body postures, with the possibility of more being added through software updates.

Speck combines a low-power vision processor with an event-based sensor to capture real-time information, recognise and detect objects, and conduct smart scene analysis. 
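To make the event-driven approach concrete, the sketch below shows, in simplified form, how output from an event-based sensor might be binned into an activity map and passed to a recognition step only when the scene actually changes. The event format, threshold value, and classifier interface are illustrative assumptions, not the Speck SDK.

```python
# Hypothetical sketch of event-driven processing; not the Speck SDK.
from dataclasses import dataclass
from typing import List, Optional

import numpy as np


@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    timestamp: int  # microseconds
    polarity: int   # +1 brightness increase, -1 decrease


SENSOR_WIDTH, SENSOR_HEIGHT = 128, 128   # assumed sensor resolution
ACTIVITY_THRESHOLD = 200                 # assumed minimum event count


def accumulate(events: List[Event]) -> np.ndarray:
    """Bin a batch of events into a 2-D activity map."""
    frame = np.zeros((SENSOR_HEIGHT, SENSOR_WIDTH), dtype=np.int16)
    for ev in events:
        frame[ev.y, ev.x] += ev.polarity
    return frame


def process(events: List[Event], classifier) -> Optional[str]:
    """Run recognition only when enough events arrive.

    A static scene produces few or no events, so no computation is spent
    on it; this is the source of the low-power, low-latency behaviour
    described above.
    """
    if len(events) < ACTIVITY_THRESHOLD:
        return None
    return classifier.predict(accumulate(events))
```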

“With our chip, the robot has the ability to ‘see’ and ‘learn’, allowing it to ‘mimic’ human movements,” said Dr Yannan Xing, a senior algorithm application engineer at SynSense. “By waving your arms, the robot can learn your movements and wave its arms in response. When you cross your arms in front of your chest, the robot senses this and also crosses its arms.”
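The mimicry described in the quote amounts to mapping each recognised posture to a corresponding robot action. The following minimal sketch illustrates that idea; the posture labels and the motor interface are assumptions made for illustration and do not reflect the actual product code.

```python
# Hypothetical mapping from recognised postures to mirrored robot actions.
MIRROR_ACTIONS = {
    "wave_left_arm":  "wave_right_arm",   # robot mirrors the person facing it
    "wave_right_arm": "wave_left_arm",
    "arms_crossed":   "cross_arms",
    "arms_raised":    "raise_both_arms",
}


def send_action(action: str) -> None:
    """Stand-in for the robot's motor interface (assumed)."""
    print(f"robot performs: {action}")


def mimic(posture: str) -> None:
    """Trigger the robot action matching a recognised posture, if any."""
    action = MIRROR_ACTIONS.get(posture)
    if action is not None:
        send_action(action)


mimic("arms_crossed")  # -> robot performs: cross_arms
```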

The neuromorphic vision solution itself is lightweight, low-power, low-latency, privacy-protecting, and low-cost, which makes it well suited to integration in toy robots.

The showcased toy represents the world’s first neuromorphic programmable robot, according to the two firms.

“Neuromorphic intelligence is a cutting-edge AI technology, and we are very proud to launch the world’s first neuromorphic toy and provide a new form of human-robot interaction,” said Ruifeng Chen, CEO of QunYu.

SynSense and QunYu have since signed a strategic cooperation agreement to jointly launch neuromorphic interactive blocks.
