
Vision-enabled underwater Arctic robot selected as finalist for European student design contest


A vision-equipped underwater robot designed for Arctic research has been selected as a finalist in the Northern European Student Design Contest, an annual competition held by National Instruments to reward the most innovative student projects.

The finalist team from Aarhus University, Denmark, developed a remotely operated vehicle (ROV), named DeepFreezeROV, to further their research into the Arctic environment and the dynamic effects at work there.

Gaining a detailed understanding of ice algae is essential for developing accurate climate models and setting fishing quotas. Ice algae is the main food source for life beneath the ice-covered Arctic Ocean, and it contributes significantly to global CO2 absorption and to the storage of CO2 on the ocean floor.

The SIAP project (Sea Ice Algae Photobiology) is studying the ice algae that lives under and within the sea ice. To further these studies, however, the algae needs to be analysed in its own environment (i.e. without drilling ice cores), which requires measurements such as light intensity, light spectrum and snow depth to be taken both beneath and on top of the ice.

An ROV was required for this task; however, most commercially available ROVs are unable to navigate the harsh Arctic environment, are too expensive, or are logistically unsuitable for the expedition. The main challenge of the project was therefore to develop a new short-range ROV able to manoeuvre and navigate underneath the fjord ice of Greenland.

The DeepFreezeROV is a lightweight, highly manoeuvrable aquatic inspection robot, ready for under-ice deployment in Greenland.

The ROV navigates locally with respect to two fixed reference points using a two-camera positioning system – one camera looks forward and the other backwards, each towards its own reference point. The images are acquired and processed with NI LabVIEW.

For image processing, the NI Vision Assistant within LabVIEW is used to manipulate the image and extract the salient information (see figure 1). From each image, the Vision Assistant outputs the number of pixels between the two lights on the reference point, as well as the horizontal distance of the lights from the centre of the image. This information is converted to angles and distances using calibrated pixel-to-metre and pixel-to-angle curves. LabVIEW then applies simple trigonometric functions to output the ROV's position along the line between the reference points and its offset from that line.

Figure 1: Physical overview of positioning system. Credit: Aarhus University School of Engineering/NI
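As an illustration of the geometry involved, the following Python sketch shows how readings of this kind could be turned into a position estimate. It assumes a simple pinhole-camera model in place of the team's measured calibration curves, and all constants and function names are hypothetical rather than taken from the project's LabVIEW code:

import math

# Hypothetical constants standing in for the team's calibration (assumptions)
LIGHT_SEPARATION_M = 0.5    # physical spacing of the two lights on a reference point
FOCAL_LENGTH_PX = 800.0     # effective focal length of the camera, in pixels

def range_from_pixel_gap(pixel_gap):
    # Pinhole approximation of a calibrated pixel-to-metre curve:
    # the further away the reference point, the smaller the gap between its lights.
    return LIGHT_SEPARATION_M * FOCAL_LENGTH_PX / pixel_gap

def bearing_from_offset(offset_px):
    # Pinhole approximation of a calibrated pixel-to-angle curve:
    # horizontal offset from the image centre becomes a bearing angle (radians).
    return math.atan2(offset_px, FOCAL_LENGTH_PX)

def rov_position(front_gap_px, front_offset_px, rear_gap_px, rear_offset_px):
    # Combine front- and rear-camera measurements into a position estimate.
    # Sign convention (assumed): positive offsets mean the ROV sits to the
    # same side of the line for both cameras.
    d_front = range_from_pixel_gap(front_gap_px)
    d_rear = range_from_pixel_gap(rear_gap_px)
    a_front = bearing_from_offset(front_offset_px)
    a_rear = bearing_from_offset(rear_offset_px)
    along_line = d_rear * math.cos(a_rear)   # distance along the line from the rear reference point
    offset = 0.5 * (d_front * math.sin(a_front) + d_rear * math.sin(a_rear))  # lateral offset, averaged
    return along_line, offset

# Example: front lights 40 px apart and 12 px right of centre,
# rear lights 60 px apart and 8 px right of centre
print(rov_position(40.0, 12.0, 60.0, 8.0))

In the real system, the pixel-to-metre and pixel-to-angle mappings come from calibration curves measured for the actual cameras and housings rather than an idealised pinhole model.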

 

A third camera stream is simply displayed on a LabVIEW user interface for visual inspections and recovery – for example, to help the operator guide the ROV back to the deployment hole in the ice.

A total of six projects were chosen for the Northern European Student Design Contest, ranging from a space rocket that beat the European altitude record to a robot designed to help stroke survivors recover. All of the projects used NI technology.

Voting runs from 20 September to 21 October. The winner of the public vote will be presented with the ‘People’s Choice Award’ at the Engineering Impact Awards, held at the RSA in London on 28 November, alongside a separate award decided by the judges.

Further information 

Deep Freeze ROV: An Underwater Robot for Arctic Research 

National Instruments 

 

 
