EPSRC project to explore how eye-like vision sensors could guide robots


Collaborators from three London universities are researching how an artificial vision system inspired by the human eye could be used to guide robots in a £1.3 million three-year EPSRC project. The new state-of-the-art sensors are able to capture, compress and transmit data using a fraction of the energy that current systems use.

(Credit: JAY LEE STUDIO/REX/Shutterstock)

Researchers from Kingston University London, University College London, and King's College London are working with the new dynamic, neuromorphic visual sensors, which are designed to update only the parts of an image where movement occurs, similar to the way the human eye detects movement. This dramatically reduces the sensors' energy and processing needs, and during the project the researchers will explore how the high-quality footage they capture can be shared efficiently between machines or uploaded to a server in the cloud.

‘Conventional camera technology captures video in a series of separate frames, or images, which can be a waste of resources if there is more motion in some areas than in others,’ explained Professor Maria Martini, leader of the Kingston University research team working on the project. ‘But these sensors … instead sample different parts of the scene at different rates, acquiring information only when there are changes in the light conditions.’
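To illustrate the sampling principle Martini describes, the sketch below simulates a much-simplified event-based pixel model in Python: each pixel emits an 'event' only when its log-intensity has changed by more than a threshold since its last event, rather than being read out in every frame. The threshold value, function names and synthetic input are illustrative assumptions for this sketch, not details of the project's actual sensors.

import numpy as np

def events_from_frames(frames, threshold=0.2):
    """Toy dynamic-vision-sensor model (illustrative only).

    A pixel emits an event when its log-intensity has changed by more than
    `threshold` since that pixel's last event, instead of being read out on
    every frame. Returns a list of (frame_index, y, x, polarity) tuples.
    """
    eps = 1e-6
    reference = np.log(frames[0] + eps)        # per-pixel level at the last event
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        log_frame = np.log(frame + eps)
        diff = log_frame - reference
        ys, xs = np.nonzero(np.abs(diff) > threshold)
        for y, x in zip(ys, xs):
            events.append((t, y, x, 1 if diff[y, x] > 0 else -1))
            reference[y, x] = log_frame[y, x]  # update only where an event fired
    return events

if __name__ == "__main__":
    # Synthetic scene: a static background with a small bright square that moves.
    frames = np.full((10, 64, 64), 0.5)
    for t in range(10):
        frames[t, 20:24, 5 + 5 * t:9 + 5 * t] = 1.0

    evts = events_from_frames(frames)
    full_readouts = frames[1:].size
    print(f"{len(evts)} events, versus {full_readouts} pixel readouts for full frames")

In this toy model only the pixels around the moving square generate events, while the static background produces none, which is where the energy and bandwidth savings described above come from.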

The new sensors should therefore be well suited to filming dynamic scenes such as explosions, where fast-moving sections are often captured inaccurately by standard cameras because of frame-rate and processing limitations, while data is wasted updating the static sections.

‘This energy saving opens up a world of new possibilities for surveillance and other uses, from robots and drones to the next generation of retinal implants,’ commented Martini. ‘They could be implemented in small devices where people can't go and it's not possible to recharge the battery. 

‘Sometimes sensors are thrown from a plane into a forest and stay for years. The idea is that different devices with these sensors should be able to share high quality data efficiently with each other without the intervention of human beings.’

The research could also have wide-ranging implications for using the sensors in the field of medicine, according to Martini.

As part of the project, the team will be looking at how these sensors could work together as part of the Internet of Things (IoT) – devices that can be connected over the internet and then operated remotely. Project partners include global technology firms Samsung, Ericsson and Thales, as well as semiconductor company Mediatek and neuromorphic sensor specialists iniLabs.
