Vision-enabled underwater Arctic robot selected as finalist for European student design contest


An underwater vision-equipped robot designed for Arctic research has been selected as a finalist for the Northern European Student Design Contest, an annual competition held by National Instruments to reward the most innovative student projects. 

The finalist team from Aarhus University, Denmark, developed a remotely operated vehicle (ROV), named DeepFreezeROV, to further its research into the Arctic environment and the dynamic effects at play there.

Gaining a detailed understanding of ice algae is essential for developing accurate climate models and setting fishing quotas. Ice algae are the main food source for all life beneath the ice-covered Arctic Ocean, and they contribute significantly to global CO2 absorption and to the ocean floor's capacity to store CO2.

The SIAP project (Sea Ice Algae Photobiology) is studying the ice algae that live under and within the sea ice. To advance these studies, however, the algae must be analysed within their own environment (i.e. without drilling ice cores), which requires measurements such as light intensity, light spectrum and snow depth, both underneath and on top of the ice.

An ROV was required for this task; however, most commercially available ROVs are unable to navigate the harsh Arctic environment, too expensive, or logistically unsuitable for the expedition. The main challenge of the project was therefore to develop a new short-range ROV able to manoeuvre and navigate underneath the fjord ice of Greenland.

The DeepFreezeROV is a lightweight, highly manoeuvrable aquatic inspection robot, ready for under-ice deployment in Greenland.

The ROV's positioning system uses two cameras, enabling it to navigate locally with respect to two reference points: one camera looks forward and the other backwards, each towards a reference point. The images are acquired and processed with NI LabVIEW.

For image processing, the NI Vision Assistant within LabVIEW is used to manipulate each image and extract the salient information (see figure 1). From each image, the Vision Assistant outputs the number of pixels between the two lights on the reference point, as well as the horizontal distance from the centre of the image. This information is converted to angles and distances using calibrated pixel-to-metre/angle curves. LabVIEW then applies simple trigonometric functions to output the ROV's placement along the line and its offset from the line.
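As a rough illustration of the geometry involved, the conversion from pixel measurements to a position estimate could be sketched as follows in Python. The pinhole-model constants, function names and averaging scheme here are assumptions for illustration only; the team used calibrated pixel-to-metre/angle curves rather than a fixed focal length.

```python
import math

# Hypothetical calibration constants (assumed pinhole camera model).
FOCAL_PX = 800.0      # effective focal length in pixels
LIGHT_SEP_M = 0.5     # assumed physical spacing of the two lights on a reference point

def range_from_pixels(pixel_sep):
    """Distance to a reference point from the pixel gap between its two lights."""
    return FOCAL_PX * LIGHT_SEP_M / pixel_sep

def bearing_from_offset(pixel_offset):
    """Horizontal angle to a reference point from its offset from image centre."""
    return math.atan2(pixel_offset, FOCAL_PX)

def rov_position(fwd_sep, fwd_offset, aft_sep, aft_offset):
    """Estimate the ROV's placement relative to the line between the two points.

    Returns (distance along the line to the forward reference point,
    lateral offset from the line), averaging the offset estimates
    from the forward- and aft-facing cameras.
    """
    d_fwd = range_from_pixels(fwd_sep)
    d_aft = range_from_pixels(aft_sep)
    th_fwd = bearing_from_offset(fwd_offset)
    th_aft = bearing_from_offset(aft_offset)
    along = d_fwd * math.cos(th_fwd)   # along-line distance to the forward point
    offset = 0.5 * (d_fwd * math.sin(th_fwd) + d_aft * math.sin(th_aft))
    return along, offset
```

With both reference points centred in their images (zero horizontal offset), the lateral offset is zero and the along-line distance follows directly from the pixel separation of the lights.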

Figure 1: Physical overview of positioning system. Credit: Aarhus University School of Engineering/NI

 

A third camera stream is simply displayed on a LabVIEW user interface for visual inspection and recovery (for example, helping the operator guide the ROV back to the deployment hole in the ice).

A total of six projects were chosen for the Northern European Student Design Contest, ranging from a space rocket that beat the European Altitude record, to a robot designed to help stroke survivors recover. All of the projects used NI technology.

Voting runs from 20 September to 21 October. The winner of the public vote will be presented with the People's Choice Award at the Engineering Impact Awards, held at the RSA in London on 28 November, alongside an award for the judges' choice.

Further information 

Deep Freeze ROV: An Underwater Robot for Arctic Research 

National Instruments 

 

 
