EPSRC project to explore how eye-like vision sensors could guide robots

Collaborators from three London universities are investigating how an artificial vision system inspired by the human eye could be used to guide robots, as part of a £1.3 million, three-year EPSRC project. The new state-of-the-art sensors can capture, compress and transmit data using a fraction of the energy required by current systems.

(Credit: JAY LEE STUDIO/REX/Shutterstock)

Researchers from Kingston University London, University College London and King's College London are working with the new dynamic, neuromorphic visual sensors, which are designed to update only the parts of an image where movement occurs, similar to the way the human eye detects movement. This dramatically reduces the sensors' energy and processing needs. During the project, the researchers will explore how the high-quality footage the sensors capture can be sourced efficiently and shared between machines, or uploaded to a server in the cloud.

‘Conventional camera technology captures video in a series of separate frames, or images, which can be a waste of resources if there is more motion in some areas than in others,’ explained Professor Maria Martini, leader of the Kingston University research team working on the project. ‘But these sensors … instead sample different parts of the scene at different rates, acquiring information only when there are changes in the light conditions.’
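
To make the sampling principle in the quote concrete, here is a minimal Python sketch of an event-camera-style model, in which each pixel reports a timestamped event only when its log-intensity changes by more than a threshold, so static regions generate no data at all. The function name, threshold value and toy scene are illustrative assumptions and are not taken from the project itself.

```python
import numpy as np

def event_stream(frames, threshold=0.15):
    """Yield (timestamp, x, y, polarity) events for pixels whose log-intensity
    has changed by more than `threshold` since their last event; pixels that
    do not change produce no data, which is where the bandwidth saving lies."""
    frames = iter(frames)
    reference = np.log1p(next(frames).astype(np.float64))  # per-pixel baseline
    for t, frame in enumerate(frames, start=1):
        log_frame = np.log1p(frame.astype(np.float64))
        delta = log_frame - reference
        changed = np.abs(delta) >= threshold
        ys, xs = np.nonzero(changed)
        for y, x in zip(ys, xs):
            yield (t, x, y, 1 if delta[y, x] > 0 else -1)
        reference[changed] = log_frame[changed]  # update only where events fired

# Toy example: a mostly static 64x64 scene with one bright spot moving diagonally
if __name__ == "__main__":
    scene = np.full((64, 64), 100, dtype=np.uint8)
    frames = []
    for i in range(10):
        frame = scene.copy()
        frame[10 + i, 10 + i] = 250
        frames.append(frame)
    events = list(event_stream(frames))
    print(f"{len(events)} events generated for {len(frames)} frames of 4,096 pixels each")
```

A conventional frame-based camera would transmit all 4,096 pixel values for every frame; the model above emits only two events per frame, one for the pixel that brightened and one for the pixel that returned to the background level.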

The new sensors should therefore be well suited to filming dynamic scenes such as explosions, where standard cameras often capture fast-moving sections inaccurately because of frame-rate and processing limitations, while wasting data on updating sections that have not changed.

‘This energy saving opens up a world of new possibilities for surveillance and other uses, from robots and drones to the next generation of retinal implants,’ commented Martini. ‘They could be implemented in small devices where people can't go and it's not possible to recharge the battery. 

‘Sometimes sensors are thrown from a plane into a forest and stay for years. The idea is that different devices with these sensors should be able to share high quality data efficiently with each other without the intervention of human beings.’

The research could also have wide-ranging implications for using the sensors in the field of medicine, according to Martini.

As part of the project, the team will be looking at how these sensors could work together as part of the Internet of Things (IoT) – devices that can be connected over the internet and then operated remotely. Project partners include global technology firms Samsung, Ericsson and Thales, as well as semiconductor company Mediatek and neuromorphic sensor specialists iniLabs.
