
Vision-enabled underwater Arctic robot selected as finalist for European student design contest


An underwater vision-equipped robot designed for Arctic research has been selected as a finalist for the Northern European Student Design Contest, an annual competition held by National Instruments to reward the most innovative student projects. 

The finalist team, from Aarhus University in Denmark, developed a remotely operated vehicle (ROV), named DeepFreezeROV, to further its research into the Arctic environment and the dynamic processes at work there.

Gaining a detailed understanding of ice algae is essential for developing accurate climate models and setting fishing quotas. Ice algae are the main food source for life beneath the ice-covered Arctic Ocean, and they contribute significantly to global CO2 absorption and to the ocean floor's ability to store CO2.

The SIAP project (Sea Ice Algae Photobiology) is studying the ice algae that live under and within the sea ice. To further these studies, however, the algae need to be analysed within their own environment (i.e. without drilling ice cores), which requires measurements such as light intensity, light spectrum and snow depth, both underneath and on top of the ice.

An ROV was required for this task; however, most commercially available ROVs are either unable to navigate the harsh Arctic environment, or are too expensive and logistically unsuitable for the expedition. The main challenge of the project was therefore to develop a new short-range ROV able to manoeuvre and navigate underneath the fjord ice of Greenland.

The DeepFreezeROV is a lightweight, highly-manoeuvrable, aquatic inspection robot, ready for under-ice deployment in Greenland.

The ROV's positioning system uses two cameras – one looking forward and one looking backward, each towards a reference point – enabling it to navigate locally with respect to those two reference points. The images are acquired and processed with NI LabVIEW.
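The article does not detail how the reference markers are found in each frame (that step is done graphically in NI's tools, described below). As an illustration only, a rough Python analogue of the marker-detection step might threshold a grayscale image for the two bright lights and return their pixel separation and the pair's offset from the image centre; all thresholds and image layout here are assumptions, not the team's implementation:

```python
def find_light_markers(image, threshold=200):
    """Locate a reference marker's two bright lights in a grayscale image
    (given as a 2D list of intensity values, 0-255).

    Returns (pixel_separation, horizontal_offset_from_centre) -- the two
    quantities the positioning maths needs. Threshold value is an assumption.
    """
    # Collect all pixels bright enough to be a marker light.
    bright = [(x, y) for y, row in enumerate(image)
                     for x, v in enumerate(row) if v >= threshold]
    if len(bright) < 2:
        raise ValueError("marker lights not found")

    # Split the bright pixels into left/right clusters about their mean x,
    # then take each cluster's horizontal centroid.
    mean_x = sum(x for x, _ in bright) / len(bright)
    left = [x for x, _ in bright if x < mean_x]
    right = [x for x, _ in bright if x >= mean_x]
    cx_left = sum(left) / len(left)
    cx_right = sum(right) / len(right)

    separation = cx_right - cx_left
    centre = (len(image[0]) - 1) / 2
    offset = (cx_left + cx_right) / 2 - centre
    return separation, offset
```

This is a minimal sketch; a real pipeline would also reject spurious bright pixels and track the lights between frames.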

For image processing, the NI Vision Assistant within LabVIEW is used to manipulate the image and extract salient information (see figure 1). From each image, the Vision Assistant outputs the number of pixels between the two lights on the reference point, as well as the marker's horizontal distance from the centre of the image. This information is then converted to angles and distances using calibrated pixel-to-metre/angle curves, and LabVIEW applies simple trigonometric functions to output the ROV's placement along the line and its offset from the line.
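The trigonometry described above can be sketched in Python under a simple pinhole-camera assumption: the apparent pixel spacing of a marker's two lights gives range, and the marker's offset from the image centre gives bearing. The calibration constants below (`FOCAL_PX`, `LIGHT_SPACING_M`) are hypothetical stand-ins for the team's calibrated pixel-to-metre/angle curves:

```python
import math

# Assumed calibration values -- placeholders for the team's calibrated curves.
FOCAL_PX = 800.0        # effective focal length in pixels
LIGHT_SPACING_M = 0.30  # physical spacing of a marker's two lights, metres

def range_to_marker(pixel_separation):
    """Distance to a marker from the apparent spacing of its two lights
    (pinhole model: real size / apparent size scales with distance)."""
    return FOCAL_PX * LIGHT_SPACING_M / pixel_separation

def bearing_to_marker(offset_px):
    """Horizontal angle to the marker from its offset from the image centre."""
    return math.atan2(offset_px, FOCAL_PX)

def rov_position(pixel_separation, offset_px):
    """Resolve range and bearing into placement along the line between the
    reference points (distance from this marker) and lateral offset from it."""
    d = range_to_marker(pixel_separation)
    a = bearing_to_marker(offset_px)
    return d * math.cos(a), d * math.sin(a)
```

With two markers in view (one per camera), the two independent position estimates can be cross-checked or averaged for robustness; this sketch shows only the single-marker calculation.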

Figure 1: Physical overview of positioning system. Credit: Aarhus University School of Engineering/NI

 

A third camera stream is simply displayed on a LabVIEW user interface for visual inspection and recovery (for example, to help the operator guide the ROV back to the deployment hole in the ice).

A total of six projects were chosen for the Northern European Student Design Contest, ranging from a space rocket that broke the European altitude record to a robot designed to help stroke survivors recover. All of the projects used NI technology.

Voting runs from 20 September to 21 October. The winner of the public vote will be presented with the People's Choice Award at the Engineering Impact Awards, held at the RSA in London on 28 November, alongside an award for the judges' choice.

Further information 

Deep Freeze ROV: An Underwater Robot for Arctic Research 

National Instruments 

