Machine vision firms target embedded market

Greg Blackman reports from the Embedded Vision Summit in Santa Clara, where Allied Vision launched its new camera platform

Allied Vision has launched a €99 camera with an onboard ASIC processor aimed specifically at the embedded vision market. The camera is intended to bridge the high-performance, costly, low-volume industrial vision market and the higher-volume, lower-cost embedded market.

It was launched at the Embedded Vision Summit, a computer vision conference organised by the Embedded Vision Alliance and held in Santa Clara, California from 1 to 3 May.

Andreas Gerk, Allied Vision’s CTO, said during the show that typical cameras in the embedded market are not as feature-rich as those in the machine vision sector. Allied Vision hopes to offer embedded vision developers some of the functionality found in machine vision through its new product line. Gerk added that the new camera platform is ‘totally different to what we have done before’.

Embedded vision is a hot topic in the machine vision sector at the moment, with the VDMA organising a panel discussion at the Vision show in Stuttgart last year, and Basler introducing its online vision community, Imaginghub, for those building embedded vision solutions. Mark Hebbel, head of new business development at Basler, gave a tutorial at the Embedded Vision Summit on choosing time-of-flight sensors for embedded applications.

Other notable machine vision names that were exhibiting at the event in Santa Clara included Ximea, MVTec, Euresys, and Vision Components.

Jeff Bier, the founder of the Embedded Vision Alliance, said during the conference that embedded vision can mean many things: it can be an industrial camera with a processor inside; an embedded system with an integrated camera or with an external camera; or even a system sending images to the cloud.

Neural networks

Half of the technical insight presentations at the conference focused on deep learning and neural networks. These are algorithms trained on large amounts of labelled data to recognise objects in a scene, as opposed to the traditional approach of hand-writing an algorithm for each specific task. Bier said that 70 per cent of vision developers surveyed by the Alliance were using neural networks, a huge shift compared to the 2014 summit only three years ago, when hardly anyone was using them.
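The difference between the two approaches can be sketched in miniature. The hand-coded rule and the toy perceptron below are invented purely for illustration and have nothing to do with any product or presentation at the summit; the point is only that in one case the developer encodes the decision directly, while in the other the parameters are fitted to labelled examples.

```python
# Toy contrast: a hand-written rule versus a model trained from data.
# All data and thresholds here are invented for illustration only.

def handwritten_rule(patch):
    """Traditional approach: the developer encodes the decision directly."""
    return sum(patch) / len(patch) > 0.5  # 'bright' if mean intensity > 0.5

# Labelled training examples: (pixel patch, is_bright)
training_data = [
    ([0.9, 0.8, 0.7], True),
    ([0.1, 0.2, 0.3], False),
    ([0.6, 0.9, 0.6], True),
    ([0.2, 0.1, 0.4], False),
]

def train_perceptron(data, epochs=20, lr=0.1):
    """Learning approach: weights are fitted to examples, not hand-tuned."""
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in data:
            pred = sum(wi * xi for wi, xi in zip(w, x)) + b > 0
            err = (1 if label else 0) - (1 if pred else 0)
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

w, b = train_perceptron(training_data)
learned = lambda patch: sum(wi * xi for wi, xi in zip(w, patch)) + b > 0

# On this separable toy data, both approaches agree with the labels.
for patch, label in training_data:
    assert handwritten_rule(patch) == label
    assert learned(patch) == label
```

Real deep networks are of course vastly larger, but the shift Bier described is exactly this one: effort moves from designing the rule to collecting and labelling the data.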

Bier gave a presentation at the conference predicting that the cost and power consumption for the computation required for vision will decrease by 1,000 times over the next three years, much of this thanks to neural networks.

Bier clarified that the 1,000 times figure comes from compounding three tenfold gains: a 10 times improvement in the efficiency of the neural networks themselves, which have so far largely been designed for accuracy rather than efficiency; a 10 times efficiency improvement in the processors running them; and a 10 times improvement in the software that mediates between the processors and the algorithms.
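The headline number is simply those three independent tenfold gains compounding multiplicatively; the factors are Bier's, the snippet below is our sketch of the arithmetic:

```python
# Bier's 1,000x estimate: three independent 10x gains compound multiplicatively.
algorithm_gain = 10   # more efficient neural network designs
processor_gain = 10   # more efficient processors for running networks
software_gain = 10    # better software mapping algorithms onto processors

total_gain = algorithm_gain * processor_gain * software_gain
print(total_gain)  # prints 1000
```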

In an article on the Embedded Vision Alliance’s website, Bier noted five computer vision trends that are likely to have a big impact on society in general: huge amounts of image data; deep learning; 3D sensing; simultaneous localisation and mapping (SLAM), used in robotics; and computing on the edge, meaning processing done on the device itself rather than on a server or in the cloud.

The advances in computer vision are opening up all kinds of new ways of using vision technology, from the embedded vision inside Microsoft’s Hololens augmented reality headset – Marc Pollefeys, director of science at Microsoft and a professor at ETH Zurich, gave a keynote presentation about Hololens – to cameras generating analytics for retail. Embedded vision also has the potential to disrupt more traditional markets, such as surveillance – a topic Michael Tusch of ARM addressed in his presentation.

Rudy Burger of Woodside Capital Partners, though, made the point that there haven’t actually been many large-scale embedded vision products – he mentioned Kinect and Mobileye, which Intel acquired earlier this year, as two examples. ‘We’re just at the very beginning,’ he said.

Turning back to machine vision, Arun Chhabra of 3D surface inspection company 8tree gave a presentation about building an embedded 3D vision system for mapping dents on aircraft, a task that has traditionally been extremely rudimentary and labour-intensive. 8tree’s system is a 3D scanner that works by pattern projection and can annotate the area of the aircraft being inspected with measurements of any dents.

The embedded vision sector is not just a new market for machine vision companies; the way vision is deployed in general is changing, which in turn could affect machine vision itself. Allied Vision, Basler and others are starting to provide embedded vision products to cater for these new ways of using vision technology, so that if a customer asks whether a system can run on an ARM chip, for instance, they are able to support it.
