Driver assistance with time-of-flight at IS2013

Greg Blackman reports from the Image Sensors 2013 conference, including the latest from Toyota Central R&D Labs on its time-of-flight sensor for driver assistance

Toyota's Central R&D Labs in Japan has developed a prototype high-definition imaging lidar sensor for its Advanced Driver Assistance Systems (ADAS). The time-of-flight 3D sensor, which has a 100m range, could potentially be used in future Toyota vehicles for tasks such as obstacle avoidance, pedestrian detection and lane detection.

Speaking at the Image Sensors conference (IS2013) in London (19-21 March), Dr Cristiano Niclass of Toyota Central R&D Labs said that the imaging lidar sensor provides more comprehensive coverage of the road for driver assistance than millimetre-wave radar or stereo vision, both of which have drawbacks: millimetre-wave radar offers excellent range but low angular resolution and a narrow field of view, whereas stereo vision has a wider field of view but only operates over shorter distances.

The lidar sensor has a range of 100m and operates under bright sunlight, as well as in rain and other adverse weather conditions, Dr Niclass said. The system uses a polygon mirror to steer the laser beam in six different directions, together with a time-of-flight sensor offering a resolution of 202 x 96 pixels at 10fps. It is eye-safe, with a laser output power of 21mW.

The system was tested under a background illuminance of 70klux and a sky illuminance of greater than 100klux, and recorded a distance error of less than 15cm at 100m while operating at 10fps.
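
For context, direct time-of-flight lidar infers distance from the round-trip travel time of a laser pulse, d = c x t / 2, so centimetre-level accuracy demands sub-nanosecond timing. The short Python sketch below is a minimal illustration of this relationship, not part of Toyota's implementation; only the 15cm-at-100m figure comes from the talk.

```python
# Minimal sketch of the direct time-of-flight principle (illustrative only,
# not Toyota's implementation): one-way distance = speed of light * round-trip time / 2.

C = 299_792_458.0  # speed of light in m/s


def distance_from_round_trip(t_seconds: float) -> float:
    """Convert a measured round-trip pulse time into a one-way distance in metres."""
    return C * t_seconds / 2.0


def round_trip_time(distance_m: float) -> float:
    """Round-trip time of a pulse reflected from a target at the given one-way distance."""
    return 2.0 * distance_m / C


if __name__ == "__main__":
    # A target at 100 m returns the pulse after roughly 667 ns.
    print(f"Round trip for 100 m: {round_trip_time(100.0) * 1e9:.1f} ns")

    # The reported error of under 15 cm corresponds to a round-trip timing
    # uncertainty of about 1 ns (2 * 0.15 m / c), giving a sense of the
    # precision the sensor's timing electronics must resolve.
    print(f"Timing equivalent of 15 cm: {round_trip_time(0.15) * 1e9:.2f} ns")
```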

Dr Niclass said the sensor could recognise most objects from the 3D data. At 10fps there would be some motion artefacts when driving at speed on a motorway, for instance, but he added that the system is designed for driver assistance rather than autonomous driving, which is considerably more challenging.

The IS2013 conference covers all aspects of image sensor technology for consumer, broadcast, industrial and scientific imagers. In his presentation, Jim Lewis, CEO of Swiss 3D imaging company Mesa Imaging, challenged what he considered to be myths surrounding the capabilities of time-of-flight (TOF) imaging. He suggested that TOF technology needs to improve for outdoor operation, where direct sunlight, longer operating ranges, variable scenes and bad weather all make acquiring accurate 3D data difficult. He also said there is more to building an industrial-grade TOF camera than perfecting the pixel: illumination LEDs degrade over time and the camera is subject to temperature variation, both of which complicate engineering a robust system.

Lewis said that to widen adoption of TOF imaging in industry, depth map stability, the ability to track fast-moving objects, and dynamic range all have to improve. Mesa Imaging’s latest TOF sensor is the Onyx, a 25k-pixel analogue sensor.

As well as 3D imaging, the conference included presentations on multi-aperture imaging, broadcast image sensor technology, CMOS RGB Clear technology, digital cameras, and medical applications. The latest market data was presented by Paul Danini, a market analyst at French market research firm Yole Développement. He said the CMOS image sensor market was worth $6.6 billion in 2012, 60 per cent of which came from the mobile phone sector, and that tablets and automotive were two areas expected to grow significantly in the future.
