A new embedded era?

Greg Blackman reports from the European Machine Vision Forum in Vienna

The machine vision industry is on the verge of entering a new era, one based on embedded processing boards, according to Basler’s CEO Dr Dietmar Ley.

‘We [Basler] feel that this is an interesting point in time; that it is opening up a lot of new opportunities for the [machine vision] industry; that we can move out of the factory space into other new, appealing spaces,’ he said during a panel discussion at the EMVA-organised European Machine Vision Forum, held in Vienna from 6 to 8 September.

But he also warned that machine vision companies need to be ready for the dramatic drop in the average sales price of vision systems that will accompany embedded processing – 'otherwise we are in trouble', he said.

The forum brought together representatives from industry and academia to foster collaboration and discuss the latest trends in machine vision technology.

Industrial imaging has always been based on the PC, and Ley made the point that for the last 15 years the technology has been stable, not changing a great deal. But the availability of low-cost, powerful embedded computing boards built around ARM processors – chips developed for the consumer space – has the potential to transform the machine vision sector.

Basler purchased Mycable, an embedded computing consultancy firm, in June 2017.

Reaffirming what he said during the European Machine Vision Association's business conference in Prague in June, Jochem Herrmann, chief scientist at Adimec and EMVA president, remarked that, in the end, everyone will benefit from smaller, more affordable and easier-to-use embedded systems. ‘[Embedded computing] is really helping the breakthrough of machine vision technology. It will start with the systems with larger volume. For a lot of other applications I’m sure the transition [to embedded vision] will go pretty fast.’

The advantages of embedded computing over PC-based systems are that the devices become more affordable, smaller and consume less energy, all of which opens up new applications for imaging. These might be in areas outside factory automation such as retail, and volumes could potentially be higher than in the industrial market. However, with any change – especially a change that could potentially define a new era – there are risks, and Ley warned that the industrial imaging community must not miss the boat with embedded vision, saying the technology will emerge not in the machine vision space but in the consumer sector.

‘If it’s [embedded vision] ready then it will come as a wave,’ Ley said. ‘If we wait too long because we don’t hear much about it, because it’s too complex to use it; let’s not misread the situation, these things are being worked on by very large companies for high volume applications and they’re going to come. It’s not a question of if they come; the only question is when will it come? At that time we should be ready, otherwise we are in trouble.’

Ley put the bill of materials for an average PC-based vision system at somewhere in the region of €3,000 to €5,000, and said this could be brought down to a few hundred euros in the future.

The processors are already at a point where quite sophisticated vision applications can be run on them. However, these systems are complex to develop and integrate at the moment, because they’re not standard equipment and integration is not well documented. But this should not deter industrial imaging companies from trying to tap into the market and provide vision solutions for embedded processors, Ley said.
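
To give a sense of what running a vision application on such a board can look like, the sketch below shows a bare-bones inspection loop in Python. It is purely illustrative: the choice of OpenCV, the camera index and the threshold and area values are assumptions made for the example, not anything demonstrated at the forum.

```python
import cv2  # OpenCV 4.x, widely available on ARM-based Linux boards

cap = cv2.VideoCapture(0)  # first camera the OS exposes, e.g. /dev/video0

for _ in range(100):  # inspect 100 frames, then stop
    ok, frame = cap.read()
    if not ok:
        break

    # simple pre-processing: greyscale, then isolate dark regions
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY_INV)

    # crude pass/fail check: count dark regions larger than a placeholder area
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    defects = [c for c in contours if cv2.contourArea(c) > 500]
    print('fail' if defects else 'pass', f'({len(defects)} regions)')

cap.release()
```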

‘Once enough people know how to [develop embedded vision systems] and your competitor comes up with a solution that is a fraction of the price of what the state-of-the-art is, then you are in trouble,’ Ley stated. ‘That’s going to happen. Someone’s going to do it. The Chinese or the Koreans, they’re going to do it.’

Stefan Schönegger, CMO of Bernecker and Rainer (B&R), an industrial automation provider based in Eggelsberg, Austria, commented during the panel discussion that ‘everyone is waiting for embedded vision’ in the industrial automation market. There is a need for low-power PLCs that can control a robot as well as provide vision capabilities.

New business models

With the average sales price per vision system dropping, imaging becomes much more accessible, which opens up new applications for vision suppliers to exploit. However, Ley noted that if the average sales price is cut to a half or a third of what it is today, regional machine vision distributors and integrators will have to change their business models to cut costs.

‘Just reselling components into a regional market is not going to do the job for these distribution companies anymore once this [sales price erosion] happens,’ he stated.

‘The system integrator that only has access to a regional market – say the German market – is not going to sell 10 times more units into the same territory when the price is coming down by the same magnitude. That’s going to be a problem.’

And with fewer distributors, what will that mean for the manufacturer? ‘Either you have enough volume to go direct, or you go online,’ Ley said. ‘How do you meet your customer, how do you consult with your customer without a distribution company? That’s going to happen. With the advent of embedded vision, with the average sales price erosion that’s coming along with that, there will also be the need for new business models.’

Deep learning

Another trend that is developing outside factory automation but could impact machine vision is artificial intelligence, in the form of deep learning algorithms. Convolutional neural networks (CNNs) learn tasks – face recognition is one example – by processing large datasets, of a kind not usually found in traditional machine vision.
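
To make the idea concrete, the sketch below defines a tiny CNN in PyTorch – two convolution layers followed by a classifier head – that maps an image to class scores. The framework, the 64 x 64 greyscale input and the ten output classes are illustrative assumptions; none of the speakers referred to a specific implementation.

```python
import torch
import torch.nn as nn

# A minimal, illustrative CNN: convolution layers learn image features,
# pooling reduces resolution, and a linear layer produces class scores.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # low-level features
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 64x64 -> 32x32
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level features
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 32x32 -> 16x16
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 10),                  # map features to 10 classes
)

scores = model(torch.randn(1, 1, 64, 64))         # one dummy greyscale image
print(scores.shape)                               # torch.Size([1, 10])
```

Training such a network means showing it many labelled images and adjusting its weights to reduce its errors, which is why the large datasets mentioned above matter.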

‘Deep learning is a kind of hype on one side,’ commented Dr Wolfgang Eckstein of MVTec Software. ‘On the other side you can do great things with it if you are knowledgeable.’ He said that work needs to be done to make deep learning usable and to assist customers using CNNs.

While factory automation is not the obvious first choice for deploying deep learning algorithms, there is a lot of investment happening in the field – half of the technical insight presentations at the Embedded Vision Summit, held in Santa Clara in May, focused on deep learning, and 70 per cent of vision developers surveyed by the Embedded Vision Alliance were using neural networks.

Ley commented: ‘Yes it’s [deep learning] a hype, but there is a lot of money behind it. Google is your new competition. It’s not just the system integrator next door, you have a company that teams up with Google because they have an interest in getting knowledge about vegetable imaging; they put a few hundred thousand dollars behind it and suddenly you have a sizeable, dangerous competitor. That’s also something to look at, because that’s new.’

Professor Dr Horst Bischof of TU Graz also raised the question of whether automation and machine vision companies should care about the data their systems generate. The value in data for Facebook is enormous, for example, so should the machine vision sector learn from its data in a similar way? ‘Should this be our business model, because this [data] has value. We can train our future vision system on data from another customer, and we can improve our system. Not just sell hardware and software; we should care about the data. There are some things like face recognition where you cannot compete with Google, and this shows the value of the data.’

Ley advised that vision companies should look at the easy applications first, because these can be big markets – a market outside factory automation might consume six-figure unit volumes a year, he said. ‘These markets exist, and these markets are not requesting the last photon in a very specific situation. It’s not necessarily that complex. If we can develop such programs in easier applications first, learn along the way, and then go to something with higher complexity.’

Make vision easier to use

It’s hard to say how quickly embedded technologies and deep learning will be embraced by the industrial markets, but in many cases the classic challenge of making systems simple to use remains, according to Schönegger. ‘The limit today for increasing the amount of vision systems used is usability,’ he stated. ‘Usability is 90 per cent and 10 per cent is potentially speed.’

In terms of standards, Schönegger noted that OPC UA is what the industrial automation market expects. ‘Try to speak and use the language in terms of standards of that [factory automation] market. OPC UA is a key technology in terms of Industry 4.0 and IoT.’
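
As a rough illustration of what speaking that language can look like, the sketch below reads a single value from an OPC UA server using the open-source python-opcua (FreeOpcUa) client library – an assumed choice for the example. The endpoint address and node identifier are placeholders; a real vision device or PLC would publish its own address space.

```python
from opcua import Client  # open-source python-opcua client library

# Placeholder endpoint for a hypothetical device on the factory network
client = Client("opc.tcp://192.168.0.10:4840")
client.connect()
try:
    # Placeholder node id; a real system would browse or document its nodes
    node = client.get_node("ns=2;i=1234")
    print("inspection result:", node.get_value())
finally:
    client.disconnect()
```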

How embedded vision will impact the machine vision sector is as yet unclear. Herrmann stated: ‘No one of us alone can do this; it’s bigger than us all and bigger than the industry.’
