
EPSRC project to explore how eye-like vision sensors could guide robots

Collaborators from three London universities are researching how an artificial vision system inspired by the human eye could be used to guide robots, as part of a £1.3 million, three-year EPSRC project. The new state-of-the-art sensors can capture, compress and transmit data using a fraction of the energy required by current systems.

Researchers from Kingston University London, University College London, and King's College London are working with new dynamic, neuromorphic visual sensors, which are designed to update only the parts of an image where movement occurs, similar to how the human eye detects motion. This dramatically reduces the sensors' energy and processing needs. During the project, the researchers will explore how the high-quality footage the sensors capture can be acquired efficiently and shared between machines, or uploaded to a server in the cloud.

‘Conventional camera technology captures video in a series of separate frames, or images, which can be a waste of resources if there is more motion in some areas than in others,’ explained Professor Maria Martini, leader of the Kingston University research team working on the project. ‘But these sensors … instead sample different parts of the scene at different rates, acquiring information only when there are changes in the light conditions.’
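The principle Martini describes can be illustrated with a short sketch. The following is a simplified, hypothetical model of an event-based sensor, not the project's actual pipeline: the function name, the log-intensity model and the threshold value are all illustrative assumptions. Given two consecutive frames, it emits output only for pixels whose brightness changed noticeably.

```python
import numpy as np

def events_from_frames(prev_frame, curr_frame, threshold=0.2):
    """Emit (row, col, polarity) events where log intensity changed enough.

    A simplified, hypothetical model of a dynamic vision sensor: rather than
    transmitting every pixel of every frame, only pixels whose brightness
    changed beyond a threshold produce any output at all.
    """
    # Work in log intensity, since event sensors respond to relative change.
    log_prev = np.log(prev_frame.astype(np.float64) + 1.0)
    log_curr = np.log(curr_frame.astype(np.float64) + 1.0)
    diff = log_curr - log_prev

    rows, cols = np.nonzero(np.abs(diff) >= threshold)
    polarities = np.sign(diff[rows, cols]).astype(np.int8)  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarities.tolist()))

# A static scene with one small moving patch: only the patch produces events.
prev = np.full((480, 640), 128, dtype=np.uint8)
curr = prev.copy()
curr[100:110, 200:210] = 200  # simulated motion in a 10 x 10 region

events = events_from_frames(prev, curr)
print(f"{len(events)} events instead of {curr.size} pixel values")
```

In this toy example only 100 of 307,200 pixel locations report anything, which is the source of the energy and bandwidth savings the researchers describe.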

The new sensors should therefore be well suited to filming dynamic scenes such as explosions, where standard cameras often capture fast-moving sections inaccurately due to frame-rate and processing limitations, while wasting data on updates to static sections.
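A rough back-of-the-envelope calculation makes the trade-off concrete. All figures below are illustrative assumptions, not measurements from the project: a frame-based camera pays for every pixel regardless of motion, while an event-based sensor pays only for the pixels that change.

```python
# Illustrative comparison of data rates (assumed figures, not project data).
WIDTH, HEIGHT = 640, 480
FPS = 30                  # typical frame-based rate
BYTES_PER_PIXEL = 1       # 8-bit greyscale
BYTES_PER_EVENT = 8       # assumed packed (x, y, timestamp, polarity) record
ACTIVE_FRACTION = 0.02    # assume 2% of pixels change per frame interval

frame_based = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
event_based = WIDTH * HEIGHT * ACTIVE_FRACTION * BYTES_PER_EVENT * FPS

print(f"frame-based : {frame_based / 1e6:.1f} MB/s")   # ~9.2 MB/s
print(f"event-based : {event_based / 1e6:.2f} MB/s")   # ~1.47 MB/s
print(f"reduction   : {frame_based / event_based:.0f}x")
```

Under these assumptions the event-based stream is roughly six times smaller, and the gap widens further the more static the scene is; meanwhile each event carries its own fine-grained timestamp, which is why fast motion need not be lost between frames.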

‘This energy saving opens up a world of new possibilities for surveillance and other uses, from robots and drones to the next generation of retinal implants,’ commented Martini. ‘They could be implemented in small devices where people can't go and it's not possible to recharge the battery. 

‘Sometimes sensors are thrown from a plane into a forest and stay for years. The idea is that different devices with these sensors should be able to share high quality data efficiently with each other without the intervention of human beings.’

The research could also have wide-ranging implications for using the sensors in the field of medicine, according to Martini.

As part of the project, the team will be looking at how these sensors could work together as part of the Internet of Things (IoT) – devices that can be connected over the internet and then operated remotely. Project partners include global technology firms Samsung, Ericsson and Thales, as well as semiconductor company Mediatek and neuromorphic sensor specialists iniLabs.
