EPSRC project to explore how eye-like vision sensors could guide robots


Researchers at three London universities are investigating how an artificial vision system inspired by the human eye could be used to guide robots, as part of a £1.3 million, three-year EPSRC project. The new state-of-the-art sensors can capture, compress and transmit data using a fraction of the energy consumed by current systems.

(Credit: JAY LEE STUDIO/REX/Shutterstock)

Researchers from Kingston University London, University College London and King's College London are working with the new dynamic, neuromorphic visual sensors, which are designed to update only the parts of an image where movement occurs, much as the human eye responds to change in a scene. This dramatically reduces the sensors' energy and processing needs. During the project the researchers will explore how the high-quality footage the sensors capture can be acquired efficiently and shared between machines, or uploaded to a server in the cloud.

‘Conventional camera technology captures video in a series of separate frames, or images, which can be a waste of resources if there is more motion in some areas than in others,’ explained Professor Maria Martini, leader of the Kingston University research team working on the project. ‘But these sensors … instead sample different parts of the scene at different rates, acquiring information only when there are changes in the light conditions.’

The new sensors should therefore be well suited to filming dynamic scenes such as explosions, where limits on frame rate and processing power mean standard cameras often capture fast-moving regions inaccurately while wasting data updating static ones.
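The sampling principle Martini describes can be sketched in a few lines of code. The Python below is a simplified, illustrative model rather than the project's actual sensor design: it emits an event (x, y, timestamp, polarity) only for pixels whose change in log-brightness exceeds a threshold, so static regions generate no data at all. The frame-differencing approach, the function name and the threshold value are assumptions made purely for illustration.

import numpy as np

def events_from_frames(prev_frame, new_frame, timestamp, threshold=0.15):
    """Illustrative event generation: emit (x, y, t, polarity) only for
    pixels whose log-brightness change exceeds a threshold, so unchanged
    regions produce no data (a simplified model, not the sensor circuitry)."""
    # Work in log space, as dynamic vision sensors respond to relative change
    log_prev = np.log1p(prev_frame.astype(np.float64))
    log_new = np.log1p(new_frame.astype(np.float64))
    diff = log_new - log_prev

    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarities = np.sign(diff[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return [(int(x), int(y), timestamp, int(p))
            for x, y, p in zip(xs, ys, polarities)]

# A static scene with one newly bright pixel: only the changed pixel fires
prev = np.zeros((4, 4))
new = prev.copy()
new[1, 2] = 200.0
print(events_from_frames(prev, new, timestamp=0.001))
# [(2, 1, 0.001, 1)] -- one event instead of a full 4x4 frame

In this toy model the bandwidth saving comes directly from the sparsity of the event list: a motionless scene yields an empty list, whereas a conventional camera would still transmit every pixel of every frame.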

‘This energy saving opens up a world of new possibilities for surveillance and other uses, from robots and drones to the next generation of retinal implants,’ commented Martini. ‘They could be implemented in small devices where people can't go and it's not possible to recharge the battery. 

‘Sometimes sensors are thrown from a plane into a forest and stay for years. The idea is that different devices with these sensors should be able to share high quality data efficiently with each other without the intervention of human beings.’

The research could also have wide-ranging implications for using the sensors in the field of medicine, according to Martini.

The team will also examine how these sensors could work together as part of the Internet of Things (IoT) – devices that can be connected over the internet and operated remotely. Project partners include global technology firms Samsung, Ericsson and Thales, as well as semiconductor company MediaTek and neuromorphic sensor specialist iniLabs.
