
Robotics and vision research discussed at BMVA event


Robotic systems designed for space exploration must be robust and carry out tasks with limited power. At the British Machine Vision Association's (BMVA) Robotics and Vision technical meeting, held in London on 24 March, two presentations were given on the European Space Agency's ExoMars mission, which will send two robotic rovers to explore the surface of Mars, with launch planned for 2018.

Mark Woods of UK technology solutions company SciSys presented on the company's work on intelligent planning and scheduling (IPS) software to speed up the decisions made by the rover. The vehicle will use various sensors to decide, firstly, whether an area is worth investigating; secondly, whether it can physically reach the intended site; and thirdly, how to execute the task – raising the robot arm to take a sample, for example. Woods commented that it can currently take days to position the robot arm correctly and take measurements, and that it is SciSys' task to reduce that timeframe.

Maria North of Roke Manor Research, a contract engineering R&D company owned by Siemens, also spoke about some of the algorithms the company is developing for use on board the rover, including those that determine the 3D position of objects in the environment.

Autonomous systems generally have to be able to localise themselves within their surroundings. SLAM (Simultaneous Localisation and Mapping) is one technique used in robots and autonomous vehicles to do this. Dr Ian Reid of the Robotics Research Group at the University of Oxford is using SLAM in the development of a mobile maritime surveillance system to monitor boats in busy harbours. The system uses radar images in SLAM to build a model of its surroundings and to locate itself within that model. Boats are found and tracked using radar data; cameras on the system are then positioned according to the radar images, and the system switches to optical tracking to obtain a better view of the boat.

Reid commented that the military are interested in using SLAM for localisation of unmanned vehicles as an alternative to GPS. According to Reid, SLAM has some advantages over GPS – the GPS signal can be blocked, and GPS cannot position other nearby features or moving objects, whereas SLAM can. Furthermore, in some cases, SLAM can localise the robot with greater accuracy than GPS.
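The core SLAM idea – jointly estimating the vehicle's pose and a map of observed features – can be illustrated with a toy sketch. This is not the Oxford system (which uses radar and probabilistic filtering); all function names and data below are hypothetical, and landmark refinement is reduced to a simple running average for clarity.

```python
# Minimal SLAM-style sketch (illustrative only, not the system described
# in the article). The robot dead-reckons its pose from odometry,
# converts range/bearing observations into map coordinates, and refines
# repeated landmark sightings by running average.
import math

def predict(pose, forward, turn):
    """Odometry update: move forward along the heading, then rotate."""
    x, y, theta = pose
    return (x + forward * math.cos(theta),
            y + forward * math.sin(theta),
            theta + turn)

def observe_to_map(pose, rng, bearing):
    """Convert a range/bearing observation into world coordinates."""
    x, y, theta = pose
    return (x + rng * math.cos(theta + bearing),
            y + rng * math.sin(theta + bearing))

def slam_step(pose, landmarks, control, observations):
    """One iteration: predict the pose, then update the landmark map."""
    pose = predict(pose, *control)
    for lid, rng, bearing in observations:
        est = observe_to_map(pose, rng, bearing)
        if lid in landmarks:  # refine an existing landmark estimate
            (px, py), n = landmarks[lid]
            landmarks[lid] = (((px * n + est[0]) / (n + 1),
                               (py * n + est[1]) / (n + 1)), n + 1)
        else:                 # initialise a newly seen landmark
            landmarks[lid] = (est, 1)
    return pose, landmarks
```

A real system would carry uncertainty through both steps (e.g. with an extended Kalman or particle filter), which is what lets SLAM outperform GPS in the cases Reid described.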

Nicolas Pugeault at the University of Surrey presented on a research project using images to predict the driving behaviour of humans, potentially for the development of autonomous vehicles. The team used video footage from three hours of driving (158,668 frames) to classify and predict a crude description of the driver's actions, namely steering left/right, accelerating, braking and clutch use. Preliminary conclusions suggest that, within these categories, the driver's actions can be predicted from images with a high degree of accuracy (80 per cent of braking events can be predicted).
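The shape of such a per-frame action classifier can be sketched as follows. This is an assumption-laden illustration, not the Surrey team's method: the feature vectors, labels and nearest-centroid rule here are all hypothetical stand-ins for whatever image features and learning machinery the project actually used.

```python
# Hypothetical sketch: classify each video frame's driver action
# (e.g. "brake", "steer_left") by its nearest class centroid in
# feature space. Features and labels are invented for illustration.
import math

def train_centroids(frames, labels):
    """Average the feature vectors of each action class."""
    sums, counts = {}, {}
    for feat, lab in zip(frames, labels):
        if lab not in sums:
            sums[lab], counts[lab] = list(feat), 1
        else:
            sums[lab] = [a + b for a, b in zip(sums[lab], feat)]
            counts[lab] += 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def predict_action(centroids, feat):
    """Label a frame by its nearest class centroid (Euclidean)."""
    return min(centroids, key=lambda lab: math.dist(feat, centroids[lab]))
```

Evaluating such a classifier on held-out frames is what yields accuracy figures like the 80 per cent braking-prediction rate quoted above.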

Other presentations included those from Ingmar Posner, Oxford University, on using vision for robot navigation, and Toby Breckon, Cranfield University, on integrating vision into complex deployable unmanned robots as part of the 2008 MOD Grand Challenge.

The next BMVA technical day, entitled 'Microscopy Image Analysis for Biomedical Applications', will be held in London on 21 April.