BMW inspects car bodies with gestures

Researchers at the Fraunhofer Institute for Optronics, System Technologies and Image Exploitation (IOSB) in Karlsruhe have engineered an intelligent gesture control system on behalf of BMW. The gesture-detection system is based on 3D data and is designed so that BMW staff can document defects on car body parts simply by pointing at them.

The system means BMW employees no longer have to note defects and enter them into a PC; they simply point at the affected area on the part and the system automatically records the defect.
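The article does not say what the system stores when a defect is flagged. As a purely illustrative sketch, the automatically logged record might carry the part being inspected, the 3D position pointed at, a defect type and a timestamp; all field names below are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class DefectRecord:
    """Hypothetical record logged when a pointing gesture is detected."""
    part_id: str                 # e.g. the bumper being inspected
    position_xyz: tuple          # 3D location on the part, in metres
    defect_type: str = "unspecified"
    timestamp: datetime = field(default_factory=datetime.now)


# Example: the system flags a scratch where the inspector pointed
record = DefectRecord(part_id="bumper-4711",
                      position_xyz=(0.42, 0.10, 0.03),
                      defect_type="scratch")
print(record)
```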

The gesture control system works by reconstructing the test environment in 3D using a standard PC and two Microsoft Kinect cameras, which supply both 2D images and 3D depth data. The researchers developed algorithms to fuse multiple 2D and 3D images together and adapted them to BMW’s requirements.
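The article does not describe the fusion algorithms themselves. As a rough, hypothetical sketch, depth frames from two calibrated Kinect-style sensors could be back-projected and merged into a single point cloud along these lines; the intrinsics, extrinsics and random depth frames are placeholder assumptions:

```python
import numpy as np


def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into 3D camera coordinates."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels


def to_world(points, R, t):
    """Transform camera-frame points into a shared world frame."""
    return points @ R.T + t


# Hypothetical intrinsics and extrinsics from an offline calibration step
fx = fy = 575.0
cx, cy = 319.5, 239.5
R_a, t_a = np.eye(3), np.zeros(3)
R_b, t_b = np.eye(3), np.array([1.2, 0.0, 0.0])

# Stand-ins for two simultaneous Kinect depth frames
depth_a = np.random.uniform(0.5, 4.0, (480, 640))
depth_b = np.random.uniform(0.5, 4.0, (480, 640))

# Fuse: back-project each frame and concatenate in the world frame
cloud = np.vstack([
    to_world(depth_to_points(depth_a, fx, fy, cx, cy), R_a, t_a),
    to_world(depth_to_points(depth_b, fx, fy, cx, cy), R_b, t_b),
])
```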

The reconstruction includes tracking the inspector in 3D in real time, as well as the object with which he is working. Alexander Schick, scientist at IOSB, explained: ‘What does the inspector look like? Where is he situated? How does he move? What is he doing? Where is the object? – all of these data are required so that the pointing gesture can properly link to the bumper [or part under inspection].’
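Schick does not spell out how the pointing gesture is linked to a spot on the part. One plausible sketch, assuming the skeleton tracker delivers elbow and hand joints and a 3D point cloud of the part is available, is to cast a ray through the two joints and pick the model point closest to it; the function name, joints and threshold below are illustrative assumptions:

```python
import numpy as np


def pointing_target(elbow, hand, part_points, max_dist=0.05):
    """Cast a ray through the tracked elbow and hand joints and return the
    point on the part's 3D model closest to that ray, or None if nothing
    lies within max_dist metres of it."""
    origin = np.asarray(hand, dtype=float)
    direction = origin - np.asarray(elbow, dtype=float)
    direction /= np.linalg.norm(direction)

    # Perpendicular distance of every model point to the pointing ray
    rel = part_points - origin
    along = rel @ direction                      # projection onto the ray
    perp = rel - np.outer(along, direction)      # component off the ray
    dist = np.linalg.norm(perp, axis=1)

    # Only consider points in front of the hand and close to the ray
    mask = (along > 0) & (dist < max_dist)
    if not mask.any():
        return None
    return part_points[mask][np.argmin(dist[mask])]


# Hypothetical tracked joints and a bumper point cloud
elbow = [0.0, 1.2, 0.0]
hand = [0.2, 1.2, 0.3]
bumper = np.random.uniform([-1, 0.5, 1.5], [1, 1.0, 1.7], (5000, 3))
print(pointing_target(elbow, hand, bumper))
```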

Schick described how the inspection process used to be carried out: ‘Previously, the inspector had to note all defects that were detected, leave his workstation, go to the PC terminal, operate multiple input screens and then label the position of the defect and the defect type. That approach is laborious, time-intensive and prone to error.’

The gesture control system, by contrast, allows the employee to remain at his workstation and interact directly with the test object. ‘If the bumper is fine, then he swipes over it from left to right. In the event of damage, he points to the location of the defect,’ said Schick.
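How the system tells the two gestures apart is not described. A toy classifier over a short window of tracked hand positions might distinguish a broad left-to-right sweep from a held pointing pose roughly as follows; the thresholds and coordinate convention are assumptions:

```python
import numpy as np


def classify_gesture(hand_positions, sweep_threshold=0.4, hold_threshold=0.05):
    """Very rough swipe-vs-point classifier over a short window of tracked
    hand positions (one 3D point per frame, in metres).

    A long left-to-right travel is read as an 'OK' swipe; a hand held
    nearly still is read as a pointing gesture flagging a defect."""
    traj = np.asarray(hand_positions, dtype=float)
    lateral_travel = traj[-1, 0] - traj[0, 0]           # movement along x
    spread = np.linalg.norm(traj - traj.mean(axis=0), axis=1).max()

    if lateral_travel > sweep_threshold:
        return "ok_swipe"
    if spread < hold_threshold:
        return "point_defect"
    return "unknown"


# Hypothetical one-second windows of tracked hand positions
swipe = [[x, 1.2, 0.5] for x in np.linspace(-0.3, 0.3, 30)]
point = [[0.1 + np.random.normal(0, 0.005), 1.2, 0.5] for _ in range(30)]
print(classify_gesture(swipe), classify_gesture(point))
```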

Fraunhofer has built a Smart Control Room, where people can interact with the room and use pointing gestures to operate remote displays. The room recognises what actions are taking place at that moment, and offers the appropriate information and tools. ‘Since gesture detection does not depend on display screens, this means we can implement applications that use no monitors, like the gesture interaction here with real objects,’ said Schick.
