
Vision the smart way

The early days of machine vision were dominated by the challenge of capturing a good image, processing it and taking measurements from it. In recent years the trend has been to take all that complexity and put it into a smart camera that simply looks and acts on what it sees.

There are many smart camera manufacturers out there: some are traditional camera makers expanding what their devices can do, while others have their roots in frame grabbers and the software libraries developed for vision applications. Some are start-up companies aiming their products at non-expert users who want the device to work out of the box with a few configuration screens. Ease of use is not meant to compete with sophisticated vision systems; the idea is to expand the total market by opening up new applications, such as the replacement of more basic instruments in factory automation.

Nobody expected that the market would need a software company to launch a range of smart cameras, but it happened. National Instruments (NI), most famous for its LabView application development system, has done just that. LabView has long since left the laboratory and is widely used to build factory-scale automated testing and control systems. NI had always dallied with machine vision by launching a few basic frame grabbers and I/O boards – it has always been in the business of supplying hardware bits to connect its software to the instruments and control systems. Consequently, about 150 cameras from 18 different manufacturers can be interfaced with LabView.

The NI range of smart cameras makes it relatively easy for those who have already invested heavily in LabView software to bolt on some basic machine vision without even reading the instruction booklet – the smart camera is just another block in LabView, plugging into the rest of the software like a thermometer or position sensor.

It may not be enough for anyone to throw away their Cognex boxes or the like, but hey, if you are already using LabView – and it is estimated to have a 32 per cent share of the measurement and control software market – it’s hard to find an argument against its smart cameras, at least for the basic vision work. It is estimated that there are 800 system integrators out there writing LabView. Who knows where it could lead? There is 10 years’ worth of development of machine vision routines in the LabView library, which can all run on the smart camera’s embedded processor. How smart can a smart camera get?

NI has come a long way since a few guys worked in a garage to make a device to get data out of lab instruments. And while it remains at its heart a software company, it has plenty of experience making hardware that supports it.

National Instruments was founded in 1976 in Austin, Texas, by three engineers working in the University of Texas Physics Department: James Truchard, Jeff Kodosky and Bill Nowlin. They saw little prospect of advancement until their managers retired, so they formed their own company. Truchard is still CEO of the company, where he is known affectionately as ‘Dr T’ and is famous for having no designated parking space for his pick-up truck, sitting in a cubicle rather than having his own office and often wearing jeans to work.

While they were working at the university lab they found some of the work very tedious, so they set up their own company in a garage. They had noticed that all the instruments had a communications port using Hewlett-Packard’s HP-IB standard, so they could be controlled by HP computers. The new company’s first product was a GPIB (the non-proprietary version of HP-IB) interface for the DEC PDP-11 minicomputer, allowing control of instrumentation and automatic test equipment. The PDP-11 was becoming extremely popular at the time and the interface boards sold well. By 1981 the company’s turnover had reached $1m.

The next big milestone was the release of the IBM PC, which was also being made in Austin. IBM encouraged the company to make a GPIB card for the PC, which opened up a huge new market in measurement automation.

Industry soon started using PCs and the DEC machines for automated measurement and control, but talking to customers the founders realised that the biggest problem was the lack of software. There were ways of creating software using scripting languages and, of course, they could write bespoke code, but what customers really wanted was a specialised tool geared to automated test, measurement and control.

Jeff Kodosky started a research programme at the University of Texas and early on decided that, because engineers like diagrams, a diagrammatic approach would be a good way for a programming language to work. At the same time the Apple Macintosh was launched and opened up a whole new way of designing software. He created a program that allowed data to be defined in blocks and linked to other blocks with simple graphical wires. LabView was released in 1986, and a PC DOS version followed the next year, but the Mac version was proving very popular. When Microsoft Windows caught up with Apple, NI was quick to take advantage and by 1993 turnover had reached $100m.

Jeff Kodosky, co-founder of National Instruments and architect of the LabView programming language.

As the PC became more powerful, an industry grew up around low-cost instruments that plugged into the ISA or PCI bus and relied on the PC processor to do all the work. This led to what became known as ‘virtual instruments’, where the cards provided the basic I/O and the PC running LabView created the instrument in software. This approach took LabView from the laboratory, where it controlled a few instruments, to the factory floor, where a combination of instruments and software can create an automated test and control system for a production line, particularly for electronic products.

This is where NI and LabView started getting involved in machine vision. Ian Bell, UK marketing director of NI, says: ‘Over the years we have moved out of the lab into the testing environment. In the 1990s we started adding different kinds of I/O and machine vision was one of those. This allowed people to interface not only with analytical instruments, but also to interface with motion control systems and cameras and be able to do image processing for lab applications, as well as machine vision for production applications. The centre of this is LabView, because all of this plugs into our software. It allows you to build the routines that test the radio interface with LabView, and you can also build the routines that test the screen by machine vision using LabView. Using the same tool provides a very powerful platform to link the disparate forms of I/O together to make an integrated testing environment.’

There are many clever features in LabView, which is why it has become so popular. One is that it can automatically take advantage of multiprocessor architectures, because it can handle multiple channels of data and their associated processors; another is that the runtime engine that makes it work can be ported onto almost any kind of hardware, making it ideal for embedded systems using exotic real-time operating systems or even reconfigurable hardware such as FPGAs.
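LabView expresses this graphically, but the underlying idea – that two blocks with no wire between them share no data and are therefore free to run on separate cores – can be sketched in ordinary code. The Python below is purely illustrative: the acquisition and measurement functions are invented placeholders, not anything NI ships, and it makes explicit with a thread pool what a dataflow runtime does automatically.

# A rough textual analogue of LabView's dataflow model, sketched in Python.
# In a dataflow diagram, blocks with no wire between them have no dependency,
# so the runtime is free to execute them on separate cores. Here that
# independence is made explicit with a thread pool; every function name below
# is an invented placeholder, not part of any NI API.

from concurrent.futures import ThreadPoolExecutor

def acquire_temperature() -> float:
    # Stand-in for an instrument read, e.g. over GPIB.
    return 21.5

def acquire_image() -> list:
    # Stand-in for a camera grab returning a VGA-sized dummy frame.
    return [0] * (640 * 480)

def check_limits(temperature: float) -> bool:
    # Downstream block wired to the temperature reading.
    return 15.0 <= temperature <= 30.0

def mean_brightness(pixels: list) -> float:
    # Downstream block wired to the image data.
    return sum(pixels) / len(pixels)

if __name__ == '__main__':
    with ThreadPoolExecutor() as pool:
        # The two acquisition blocks share no data, so they can run in parallel,
        # just as two unwired subdiagrams would in LabView.
        temp_future = pool.submit(acquire_temperature)
        image_future = pool.submit(acquire_image)

        # Each downstream block runs only once the data it is 'wired' to arrives.
        temp_ok = check_limits(temp_future.result())
        brightness = mean_brightness(image_future.result())

    print(f'Temperature in range: {temp_ok}; mean brightness: {brightness:.1f}')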

This led to the development of the Rio (reconfigurable I/O) device, which combines data acquisition and control into a unit with its own processor and FPGAs that can effectively run LabView-generated code. The latest generation is called the Compact Rio, which has been widely adopted as an embedded system in everything from medical imaging systems to in-vehicle data logging and control. The machine vision world has adopted these devices because they can take several FireWire inputs, and the FPGA can hold the timing and control software for the vision system, developed on the LabView visual interface. The Rio device does not need any CPU time from the system it is plugged into to control the automation of the input tests.

The desktop CPU can therefore concentrate on its application – for example, image recognition.

Bell says: ‘We took that a step further last year when we launched our own smart cameras with a VGA CCD sensor and an embedded processor that effectively runs LabView software on that sensor. You can either use the wealth of software already developed using LabView, or you can use a tool written in LabView, called Vision Builder for Automated Inspection, which end users can use to configure automation tasks. There is industrial-grade I/O, as well as Ethernet ports, which can be used to expand the I/O options.’

Ian Bell, UK marketing director

Bell says the camera was not designed to produce images for detailed processing; it was designed to be an automation device, making simple pass/fail or other decisions on a production line based on a visual input. The emphasis was therefore on bringing triggering signals in and out for other devices under the control of a LabView application.
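That pass/fail pattern is easy to picture in code. The short Python sketch below is only a conceptual stand-in – the camera and digital-output classes are invented placeholders rather than NI’s actual API, and a real NI Smart Camera would be configured through LabView or Vision Builder rather than scripted like this.

# A minimal sketch of the pass/fail automation pattern described above.
# Every class and function here is a hypothetical placeholder used for
# illustration only; it is not NI's API.

import random
import time

class FakeCamera:
    # Stands in for a triggered image acquisition.
    def grab(self) -> list:
        # Return a dummy VGA-sized grey-level frame of random intensities.
        return [random.randint(0, 255) for _ in range(640 * 480)]

class FakeDigitalOutput:
    # Stands in for an industrial digital output line, e.g. driving a reject gate.
    def write(self, line: str, value: bool) -> None:
        print(f'{line} -> {"HIGH" if value else "LOW"}')

def part_is_good(frame: list, threshold: float = 100.0) -> bool:
    # Trivial stand-in for a real inspection: pass if mean brightness is high enough.
    return sum(frame) / len(frame) >= threshold

if __name__ == '__main__':
    camera = FakeCamera()
    dio = FakeDigitalOutput()

    for part_number in range(5):                 # five parts moving down the line
        frame = camera.grab()                    # acquire on a (simulated) trigger
        passed = part_is_good(frame)             # run the inspection routine
        dio.write('reject_gate', not passed)     # drive the reject mechanism
        print(f'Part {part_number}: {"PASS" if passed else "FAIL"}')
        time.sleep(0.1)                          # stand-in for the line cadence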

Bell says: ‘We are a software-centric company. We are not producing a piece of hardware and then writing software to work with it. We start with the fact that everything plugs into the LabView software, and that it is a huge benefit to our customers to be able to interface with a range of I/O – including motion controls, machine vision and cameras – while doing it all on the same software platform. Our first image processing product was a frame grabber for a desktop PC, with the modules needed to use it with LabView.’

Bell says that NI’s experience has been in pulling data from an I/O card into a PC so that the user can do something with it, and machine vision has long been one of its key applications. But its products in this area have been frame grabbers, Camera Link and FireWire cards that bring the data from other hardware into a PC, where the user can develop software to use it.

The NI Smart Camera extends this approach with the company’s first camera hardware, but in reality the more significant contribution to the machine vision sector is the huge library of software developed on its platform and the tools available for creating more. The camera is designed and built in-house, but the sensors are bought in from Sony, and the lenses and accessories are bought in and resold.

Bell adds: ‘We have a platform approach, which means we are rarely just selling a machine vision system to somebody. Usually machine vision is just one element of an industrial or scientific application. Machine vision is often added to a test application. Machine vision is usually talking to something else and we do the “something else”. The Rio got us into a lot of large industrial applications, and the advantage of having a smart camera that integrates very easily, using the same software, is that the engineer does not have to go through the pain of making them work together. Our customers have told us that when the controllers and the smart cameras come from different suppliers, it is often difficult to integrate them.’

Bell says that for many customers the NI Smart Camera is just a logical extension of their investment in the software platform. But there is also another market out there that is just discovering smart cameras. There are many to choose from and most come with an ‘easy-to-use’ configuration tool. But when that configuration tool runs out of functionality there is nowhere to go. The NI camera user can start with a configuration tool and then grow into a full-blown LabView system to control everything else in the final application – with the certain knowledge that LabView is going to be supported for a long time to come.

Bell believes that LabView will make some significant inroads into high-end machine vision applications, because it can easily take advantage of the multi-core processors that are emerging.

He says: ‘People are trying to bolt things on to C to make it more parallel, but it has the inherent problem that it is a sequential programming language. It turns out that the dataflow-diagram approach to programming lends itself to parallel processing. People have been writing concurrent parallel programs in LabView since 1986 without really thinking about it.

‘The application can no longer be made to go faster by waiting for the processor clock speed to go faster. You have to learn new techniques and find new ways of doing things. The advantage of LabView is that you can develop parallel applications and I think that LabView and graphical methods are going to become the standard way of developing applications for parallel processors.’


