Robot uses neuromorphic vision to mimic human movement

The robot is able to recognise up to eight different human body postures (Image: SynSense)

SynSense has demonstrated the capabilities of its neuromorphic system on chip (SoC) in a robot capable of learning and imitating human body movements.

At the 22nd China Shantou International Toy Fair in Chenghai, the firm’s Speck SoC was shown integrated into a robot capable of visual perception, human body recognition, and mimicry.

Made by QunYu, a Chinese developer of programmable intelligent toys, the robot is able to recognise up to eight different human body postures, with the possibility of more being added through software updates.

Speck combines a low-power vision processor with an event-based sensor to capture real-time information, recognise and detect objects, and conduct smart scene analysis. 
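Event-based sensors report only the pixels whose brightness changes, rather than full frames. As a rough illustration of how such sparse output is typically handled before recognition (a minimal sketch, not SynSense's actual pipeline; the event format and bin sizes here are assumptions):

```python
def events_to_frame(events, height=128, width=128):
    # Accumulate sparse (x, y, timestamp, polarity) events into a 2D
    # count frame. Because an event camera fires only where brightness
    # changes, a moving limb shows up as a cluster of active pixels,
    # which a downstream classifier can then interpret as a posture.
    frame = [[0] * width for _ in range(height)]
    for x, y, _t, _p in events:
        frame[y][x] += 1
    return frame

# Hypothetical stream: an arm sweeping upward fires events along one column.
events = [(10, y, t, 1) for t, y in enumerate(range(20, 40))]
frame = events_to_frame(events)
active_pixels = sum(1 for row in frame for v in row if v > 0)
print(active_pixels)  # 20 pixels saw at least one event
```

In a neuromorphic system like Speck, the spiking processor consumes events directly rather than binning them into frames, which is what keeps latency and power so low.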

“With our chip, the robot has the ability to ‘see’ and ‘learn’, allowing it to ‘mimic’ human movements,” said Dr Yannan Xing, a senior algorithm application engineer at SynSense. “By waving your arms, the robot can learn your movements and wave its arms in response. When you cross your arms in front of your chest, the robot senses this and also crosses its arms.”

The neuromorphic vision solution itself is lightweight, low-power, low-latency, privacy-protecting, and low-cost, which makes it beneficial for integration with toy robots.

The showcased toy represents the world’s first neuromorphic programmable robot, according to the two firms.

“Neuromorphic intelligence is a cutting-edge AI technology, and we are very proud to launch the world’s first neuromorphic toy and provide a new form of human-robot interaction,” said Ruifeng Chen, CEO of QunYu.

SynSense and QunYu have since signed a strategic cooperation agreement to jointly launch neuromorphic interactive blocks.
