
Robot helicopter uses imaging to learn as it flies


Scientists at the University of Sheffield, UK, have engineered a robotic quadcopter that can learn as it flies using imaging and other sensing data. The team, based in Sheffield’s Department of Automatic Control and Systems Engineering (ACSE), has created software that enables the robot to learn about its surroundings using a forward-facing camera mounted on the machine.

The researchers used a mathematical framework called game theory to programme the quadcopters. In this framework, each robot is a player in the game and must complete its given task in order to ‘win’ the game.
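The game-theoretic framing can be illustrated with a minimal sketch in which each robot, as a player, chooses the task that gives it the highest payoff. This is purely illustrative, with assumed robot names, tasks and payoff values; it is not the ACSE team's software.

```python
# Hypothetical sketch of the game-theoretic framing: each robot is a
# player that picks the task with the highest payoff for it, and it
# 'wins' by completing that task. All names and values are assumed.

payoffs = {                      # payoffs[robot][task], assumed values
    "quad_1": {"map_area": 5, "inspect_object": 2},
    "quad_2": {"map_area": 1, "inspect_object": 4},
}

assignment = {
    robot: max(tasks, key=tasks.get)   # each player maximises its own payoff
    for robot, tasks in payoffs.items()
}
print(assignment)  # {'quad_1': 'map_area', 'quad_2': 'inspect_object'}
```

In a richer model, each robot's payoff would also depend on the other players' choices, which is what distinguishes a game from independent optimisation.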

The robot starts with no information about its environment and the objects within it. By overlaying different frames from the camera and selecting key reference points within the scene, it builds up a 3D map of the world around it. Other sensors pick up barometric and ultrasonic data, which give the robot additional clues about its environment. All this information is fed into autopilot software to allow the robot to navigate safely, but also to learn about the objects nearby and navigate to specific items.
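The core geometric idea behind building a 3D map from overlapping camera frames is parallax: as the quadcopter moves, a fixed reference point shifts position in the image, and the size of that shift reveals its depth. A minimal sketch of this triangulation, using the standard pinhole-camera relation Z = f·B/d (not the team's actual software, and with assumed numbers):

```python
# Hypothetical sketch: depth from parallax between two overlapping frames.
# As the camera translates by a baseline B, a scene point shifts by a
# disparity d in the image; with focal length f (in pixels), Z = f * B / d.

def depth_from_parallax(focal_px, baseline_m, disparity_px):
    """Triangulate depth of a reference point (pinhole camera model)."""
    if disparity_px <= 0:
        raise ValueError("point must shift between the two frames")
    return focal_px * baseline_m / disparity_px

# Assumed example: an 800 px focal length camera moves 0.5 m, and a
# tracked feature shifts 40 px between the two frames.
z = depth_from_parallax(800, 0.5, 40)
print(z)  # 10.0 metres away
```

Repeating this for many reference points across many frame pairs, and chaining the estimated camera motions, yields the kind of incremental 3D map the article describes.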

‘We are used to the robots of science fiction films being able to act independently, recognise objects and individuals and make decisions,’ explained Professor Sandor Veres, who is leading the research. ‘In the real world, however, although robots can be extremely intelligent individually, their ability to co-operate and interact with each other and with humans is still very limited.

‘As we develop robots for use in space or to send into nuclear environments – places where humans cannot easily go – the goal will be for them to understand their surroundings and make decisions based on that understanding.’

While the University of Sheffield work is still academic, EOS Innovation, a European startup that specialises in the design and manufacture of mobile surveillance robots, has released a commercial robotic surveillance system called E-Vigilante at the IFSEC International trade fair, which ran from 17 to 19 June in London.

Designed for use in warehouses, the robot allows a security agent, from the safety of a remote location, to combat potential theft with an array of dissuasive tactics when a threat is detected, such as sounding an alarm, flashing lights and issuing voice commands.

Equipped with a high-resolution surveillance camera with 360° pan capability, E-Vigilante saves and transmits audio and video data in real time whenever an incident is detected. To verify that a security threat exists, the agent takes remote control of the robot.

E-Vigilante has been designed to supplement human surveillance, operating in tandem with an off-site security agent, who retains control of the important decision-making process when the robot detects an alert situation or other incident. The robot can patrol randomly or follow a pre-programmed round, set up by the controller.
