One-dollar micro-cameras to increase uptake of vision

Camera systems measuring millimetres and costing only one dollar are now possible thanks to electronic and optical developments in wafer-level technology.

Antonio Gomes of semiconductor manufacturer Ams spoke about manufacturing miniature cameras at the EMVA’s debut Embedded Vision Europe conference in Stuttgart from 12 to 13 October.

The first Embedded Vision Europe Conference attracted around 200 participants. (Credit: EMVA)

Gomes remarked that digital camera module heads measuring 0.7 x 0.7 x 1.1mm have already been produced for medical endoscopy - when Ams acquired Cmosis in 2015, it also gained technology from Awaiba, which developed sub-millimetre CMOS modules for endoscopes. Gomes added that, by introducing additional electronics to the system, fully functioning cameras can be produced for less than one dollar.

‘It’s known that vision systems are expensive, bulky, and consume a lot of power, so that’s something we want to change,’ Gomes said. ‘Nowadays cameras really don’t need to be bulky... In the past five years I’ve been helping customers to drive vision into the smallest places that you can imagine.’

In order to compress the vision technology into millimetre volumes, Ams has had to reduce the chip size of the system, optimise power consumption, select a suitable interface, use wafer-level lens technology, and shrink the size of the camera housing.

For the chip, a major contributor to its size is the pixel area, which is fixed by the required resolution. Ams therefore had to focus on reducing the size of the surrounding area – the periphery – and was able to bring this down to 85μm. The chip still includes vital components and functionality, such as the pixel guard ring; edge of die, row and column circuitry; analogue-to-digital converters (ADCs); a serialiser; an LVDS driver; power-on reset; and four contacts for through-silicon vias on the backside.
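
To put the 85μm periphery figure in context, the rough Python sketch below estimates the resulting die footprint, assuming a square 750 x 750μm pixel array (the NanEye figure quoted later in this article) and periphery on all four sides; the real floorplan has not been published, so the geometry is illustrative only.

# Rough die-footprint estimate for a millimetre-scale camera chip.
# Assumptions: a square 750 x 750 um pixel array (the NanEye figure
# quoted later in the article) and an 85 um periphery on every side.
PIXEL_ARRAY_UM = 750   # pixel array edge length, micrometres
PERIPHERY_UM = 85      # periphery width per side, micrometres
die_edge_um = PIXEL_ARRAY_UM + 2 * PERIPHERY_UM
die_area_mm2 = (die_edge_um / 1000) ** 2
print(f"Die edge: {die_edge_um} um")          # 920 um
print(f"Die area: {die_area_mm2:.2f} mm^2")   # ~0.85 mm^2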

To improve power consumption, Ams optimised the ADC and off-chip driver architectures, which measure 0.08mm² and consume 0.36mW. The LVDS driver architecture was also optimised to use only the minimum current necessary.

When considering the most suitable interface, LVDS, MIPI, CIF, I2C and SPI were all candidates, but each was either too power-hungry or took up too much space in the system. Ams hopes to overcome this by implementing MIPI I3C in its next generation of devices, which offers both low power consumption and a small footprint while being able to carry a complete data stream.
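
To illustrate why a low-pin-count bus such as MIPI I3C can still carry a complete image stream, the back-of-the-envelope Python sketch below estimates the frame rate a single lane could sustain at the small resolutions discussed later. The 12.5Mbit/s figure is the nominal I3C single-data-rate maximum and the 10-bit pixel depth is an assumption; neither number comes from Gomes’ talk.

# Back-of-the-envelope check of whether one I3C lane can carry an
# uncompressed small-sensor video stream. The 12.5 Mbit/s SDR line rate
# is the nominal I3C single-data-rate maximum; the 10-bit pixel depth
# and the absence of protocol overhead are simplifying assumptions.
I3C_SDR_BITS_PER_S = 12_500_000
BITS_PER_PIXEL = 10
def max_frame_rate(width: int, height: int) -> float:
    """Upper-bound frame rate for an uncompressed stream on one lane."""
    return I3C_SDR_BITS_PER_S / (width * height * BITS_PER_PIXEL)
print(f"250 x 250: ~{max_frame_rate(250, 250):.0f} fps")   # ~20 fps
print(f"320 x 320: ~{max_frame_rate(320, 320):.0f} fps")   # ~12 fps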

Miniature optics are also essential for such a small camera. Modern wafer-level technology now allows multi-element micro-lenses to be packaged into a stack and bonded to the chip of the camera.

Three main technologies are available for producing wafer-level optics, according to Gomes: plasma etching, which requires long processing times and limits lens height; glass-formed lenses, which are no longer used because of process stability issues; and UV-replicated polymer lenses. Gomes did not disclose which technology Ams uses to produce its cameras.

‘Current technology allows us to make multiple types of lenses at wafer-level, such as twin lenses, dual lenses, hybrid lenses and Fresnel lenses,’ said Gomes. ‘Multiple lenses can be combined together in the same model.’ Ams recently acquired Heptagon, a firm with expertise in replicating optics in wafer-level packaging, which can now be used by its new owner to house the wafer-level optics in the camera. Wafer-level chip-scale packaging offers both size and cost benefits over traditional wire bonded assemblies thanks to its simpler assembly process, according to Gomes. 

Ams’s upcoming NanEye camera module is a full camera system using multi-element wafer-level lenses in a footprint of around 1 x 1mm. The pixel array has a size of 750 x 750μm and enables resolutions of 250 x 250 pixels (using 3µm pixels) or 320 x 320 pixels (2.4µm pixels), with 500 x 500 pixels (1.4µm pixels) also being introduced in 2018, according to Gomes. He said that while the resolutions are not as high as other vision systems, they are adequate for applications such as endoscopy, gesture recognition, and eye tracking. The micro-cameras can also be used in multi-camera solutions to provide features such as stereovision, allowing 3D measurements to be taken in confined spaces.
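
The quoted resolutions follow directly from dividing the fixed pixel array by the pixel pitch; the short Python check below reproduces them, treating the 750 x 750μm array dimension as nominal (which is why the 2.4µm and 1.4µm figures only roughly match the quoted 320 x 320 and 500 x 500).

# Resolution implied by a fixed ~750 x 750 um pixel array at each pixel
# pitch mentioned in the article. The array dimension is nominal, so the
# smaller pitches only approximately match the quoted resolutions.
ARRAY_UM = 750
for pitch_um in (3.0, 2.4, 1.4):
    pixels = ARRAY_UM / pitch_um
    print(f"{pitch_um} um pitch -> ~{pixels:.0f} x {pixels:.0f} pixels")
# 3.0 um -> ~250 x 250 (quoted 250 x 250)
# 2.4 um -> ~313 x 313 (quoted 320 x 320)
# 1.4 um -> ~536 x 536 (quoted 500 x 500)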

Increasing the resolution of wafer-level optics is particularly challenging, Gomes pointed out, as the lenses themselves don’t necessarily have high enough resolving power to be able to take advantage of the higher pixel count.
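
One way to frame that limitation is through the sensor’s Nyquist frequency, which rises as pixels shrink and so demands a lens with correspondingly higher resolving power; the Python figures below are our illustration rather than numbers quoted by Gomes.

# The finest detail a pixel grid can represent (its Nyquist limit) scales
# inversely with pixel pitch, so shrinking pixels only helps if the
# wafer-level lens can resolve correspondingly finer detail.
for pitch_um in (3.0, 2.4, 1.4):
    nyquist_lp_per_mm = 1000 / (2 * pitch_um)  # line pairs per millimetre
    print(f"{pitch_um} um pixels -> Nyquist ~{nyquist_lp_per_mm:.0f} lp/mm")
# 3.0 um -> ~167 lp/mm, 2.4 um -> ~208 lp/mm, 1.4 um -> ~357 lp/mm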

The size of the micro-cameras means they can be fabricated in large volumes and for very little cost. According to Gomes, a 300mm image sensor wafer can be used to produce 66,000 1 x 1mm devices at a cost of 20 cents each, and a 100 x 100mm glass substrate can be used to produce 6,000 1mm-size lenses. When combined to form a full, high-performance camera system, the cost rises to just under one dollar per camera. This low cost makes the cameras perfect for disposable applications, particularly in medicine.
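
A quick geometric sanity check, sketched in Python below, shows those yield figures are plausible: the quoted 66,000 devices correspond to roughly 93 per cent of the ideal packing of a 300mm wafer, before edge loss and scribe lanes are considered.

import math
# Sanity check of the per-wafer and per-substrate yields quoted by Gomes,
# using simple geometry; it ignores edge exclusion and scribe lanes.
WAFER_DIAMETER_MM = 300
wafer_area_mm2 = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
print(f"Wafer area: {wafer_area_mm2:.0f} mm^2")                            # ~70,686 mm^2
print(f"Quoted 66,000 devices = {66_000 / wafer_area_mm2:.0%} of ideal packing")
substrate_area_mm2 = 100 * 100
print(f"Glass substrate: {substrate_area_mm2 / 6_000:.2f} mm^2 per lens site")
# ~1.67 mm^2 per lens site, consistent with ~1 mm lenses plus spacing.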

Ams believes that, thanks to their low cost and the ease of producing them in large numbers, the micro-cameras will enable much wider uptake of vision, as anyone will be able to afford them.

‘We believe the use of cameras for automated vision will become ubiquitous,’ Gomes concluded.
