EPSRC project to explore how eye-like vision sensors could guide robots

Collaborators from three London universities are researching how an artificial vision system inspired by the human eye could be used to guide robots, in a £1.3 million, three-year EPSRC project. The state-of-the-art sensors can capture, compress and transmit data using a fraction of the energy required by current systems.

Researchers from Kingston University London, University College London, and King's College London are working with the new dynamic, neuromorphic visual sensors, which are designed to update only the parts of an image where movement occurs, much as the human eye responds to movement. This dramatically reduces the sensors' energy and processing requirements. During the project, the researchers will explore how the high-quality footage the sensors capture can be compressed efficiently and shared between machines, or uploaded to a server in the cloud.

‘Conventional camera technology captures video in a series of separate frames, or images, which can be a waste of resources if there is more motion in some areas than in others,’ explained Professor Maria Martini, leader of the Kingston University research team working on the project. ‘But these sensors … instead sample different parts of the scene at different rates, acquiring information only when there are changes in the light conditions.’
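
As a rough illustration of the principle Martini describes, a minimal Python sketch of event generation might look like the following. The contrast threshold and the log-intensity model are assumptions made for illustration, not parameters from the project or from any particular sensor:

```python
import numpy as np

# Illustrative contrast threshold; the article gives no sensor parameters.
THRESHOLD = 0.15

def events_from_frames(prev_frame, new_frame, threshold=THRESHOLD):
    """Return (x, y, polarity) events for pixels whose log intensity
    changed by more than the threshold since the reference frame."""
    # Event sensors respond roughly to changes in log intensity,
    # not to absolute pixel values.
    delta = np.log1p(new_frame) - np.log1p(prev_frame)
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    return [(int(x), int(y), 1 if delta[y, x] > 0 else -1)
            for x, y in zip(xs, ys)]

# A static scene produces no events; one changed pixel produces one event.
prev = np.zeros((4, 4))
new = prev.copy()
new[1, 2] = 0.8
print(events_from_frames(prev, new))  # -> [(2, 1, 1)]
```

In a fully static scene the sensor stays essentially silent, which is where the energy saving the researchers describe comes from.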

The new sensors will therefore be suitable for filming dynamic scenes such as explosions, where fast-moving sections are often captured inaccurately by standard cameras because of frame-rate and processing-power limitations, and where data is used inefficiently to update static sections.
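
The potential saving can be illustrated with back-of-envelope numbers. The figures below, a VGA-resolution sensor and a scene in which one per cent of pixels change per frame interval, are assumptions chosen for illustration, not measurements from the project:

```python
# Back-of-envelope comparison with assumed numbers (not project figures):
# a VGA frame-based camera versus an event stream for a mostly static scene.
width, height, fps, bytes_per_pixel = 640, 480, 30, 1
frame_bytes_per_s = width * height * fps * bytes_per_pixel

active_fraction = 0.01   # assume 1% of pixels change per frame interval
bytes_per_event = 8      # assumed size of one event record
event_bytes_per_s = int(width * height * fps * active_fraction) * bytes_per_event

print(f"frame-based: {frame_bytes_per_s / 1e6:.2f} MB/s")  # 9.22 MB/s
print(f"event-based: {event_bytes_per_s / 1e6:.2f} MB/s")  # 0.74 MB/s
```

Under these assumptions the event stream carries roughly a tenth of the data of the frame-based stream, while still recording every change in the scene.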

‘This energy saving opens up a world of new possibilities for surveillance and other uses, from robots and drones to the next generation of retinal implants,’ commented Martini. ‘They could be implemented in small devices where people can't go and it's not possible to recharge the battery. 

‘Sometimes sensors are thrown from a plane into a forest and stay for years. The idea is that different devices with these sensors should be able to share high quality data efficiently with each other without the intervention of human beings.’
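
One plausible way for such devices to exchange data compactly is to serialise events into a fixed-width binary stream, in the spirit of the address-event representation (AER) commonly used with neuromorphic sensors. The packet layout below is a hypothetical sketch, not a format specified by the project or its partners:

```python
import struct

# Hypothetical packet layout, loosely modelled on address-event
# representation (AER); the field widths are assumptions for illustration.
EVENT_FORMAT = "<HHIB"  # x: uint16, y: uint16, timestamp (µs): uint32, polarity: uint8
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)  # 9 bytes per event

def pack_events(events):
    """Serialise (x, y, t_us, polarity) tuples into a compact byte stream."""
    return b"".join(struct.pack(EVENT_FORMAT, *e) for e in events)

def unpack_events(data):
    return [struct.unpack(EVENT_FORMAT, data[i:i + EVENT_SIZE])
            for i in range(0, len(data), EVENT_SIZE)]

stream = pack_events([(2, 1, 1000, 1), (3, 1, 1012, 0)])
print(len(stream), unpack_events(stream))  # 18 bytes for two events
```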

The research could also have wide-ranging implications for using the sensors in the field of medicine, according to Martini.

As part of the project, the team will be looking at how these sensors could work together as part of the Internet of Things (IoT) – devices that can be connected over the internet and operated remotely. Project partners include the global technology firms Samsung, Ericsson and Thales, as well as semiconductor company MediaTek and neuromorphic sensor specialist iniLabs.
