
US vision market to grow 4 per cent in 2013, according to AIA

The North American machine vision market will grow by 4 per cent in 2013, according to market data presented at the AIA business conference by Alex Shikany, AIA’s director of market analysis. Last year the North American market contracted by 4.5 per cent, due to a slowdown in US manufacturing, Shikany said. The conference took place in Orlando, Florida, from 20-22 February.

Shikany noted that electronics and petroleum were both strong manufacturing sectors in 2012, as was the automotive sector, but that the semiconductor sector experienced very little growth.

In terms of the wider US economy, Alan Beaulieu, an economist at the Institute for Trend Research, gave a largely positive message in his presentation, saying that all the leading indicators point upwards and suggest growth, although he did predict a mild recession in 2014.

Isabel Yang, general manager of Luster LightVision Tech, predicted 20 per cent growth in machine vision in China, based on data from the China Machine Vision Union (CMVU). She said that while China’s GDP growth is expected to slow from 8 per cent to 5-6 per cent, machine vision will remain strong. This is due, she said, to increasing automation in manufacturing, the need for tighter quality control, and improved safety inspection in food and pharmaceuticals. Growth would also be driven by policy decisions. The Chinese machine vision market was worth around 1.225 billion RMB in 2012, she said.

Computer vision

Among the presentations were two on computer vision, a branch of science concerned with building machines that can see as humans do.

Humans interpret the visual world largely from the context of a scene and from memory of past events. According to Aude Oliva, principal investigator at the Computer Science and Artificial Intelligence Laboratory at MIT, humans can categorise an unknown object based on information from the scene with a 68.5 per cent success rate; computers manage only about half that, which is still reasonably high.

Oliva and her team have mined a data set of 80 million images from websites such as Facebook and Flickr. This volume of information gives a computer algorithm a strong understanding of a scene by comparing similar images, to the point where it can begin to predict what might happen in a scene. Oliva gave the example of an image of a street, from which the computer, drawing on its analysis of similar images, could predict that a car would drive through the scene.

In his presentation, Dr David Forsyth, a professor of computer vision at the University of Illinois Urbana-Champaign, noted that knowledge of the environment is a powerful tool for a computer to classify an object accurately – an algorithm detecting people in a scene might use the horizon, for instance, to help classify pedestrians, since an object floating above the horizon can’t possibly be a person.
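Forsyth’s own system was not shown in detail at the conference; the following is a minimal Python sketch of the horizon heuristic he described, in which candidate person detections whose “feet” would sit above the horizon line are rejected. The Detection fields, the filter_by_horizon function and all numbers are illustrative assumptions, not anything presented by Forsyth.

from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    x: float        # left edge of bounding box (pixels)
    y_top: float    # top edge of bounding box (pixels, origin at top of image)
    width: float
    height: float
    score: float    # detector confidence

def filter_by_horizon(detections: List[Detection], horizon_y: float) -> List[Detection]:
    # Keep only detections whose bottom edge (the 'feet') lies below the
    # estimated horizon row; a person standing on the ground plane cannot
    # appear floating above the horizon.
    kept = []
    for det in detections:
        feet_y = det.y_top + det.height
        if feet_y > horizon_y:   # feet below the horizon: plausible pedestrian
            kept.append(det)
    return kept

# Example with made-up numbers: a horizon estimated at image row 200
# removes the candidate whose box sits entirely in the sky region.
candidates = [
    Detection(x=120, y_top=220, width=40, height=110, score=0.8),  # on the ground
    Detection(x=300, y_top=50,  width=35, height=90,  score=0.6),  # above the horizon
]
plausible = filter_by_horizon(candidates, horizon_y=200)
print(len(plausible))  # -> 1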

The advances made in computer vision and in mapping a scene effectively have had a direct impact on improvements in industrial robots, said Rodney Brooks, CTO of Rethink Robotics, speaking at the conference. He noted that the decreasing cost of computation and image sensors has also played an important role.

Rethink Robotics has developed an industrial robot, Baxter, which is designed to work alongside humans. Baxter can detect a human presence using a front-mounted camera; it has vision-guided movement and can be trained by an operator literally taking it by the hand and leading it through its processing steps. The robot has pressure sensors in its joints so it can easily be taught how to position its arms to pick up an object. Baxter is about to be released with a research SDK, with the aim of installing it in research laboratories, not just those with a manufacturing slant but anywhere a two-armed robot could be used.

Space exploration

In astronomy, much of the scientific data on the Universe comes from infrared sensors – the oldest observed object in the Universe, at a distance of 13.2 billion light years, was imaged by infrared sensors on board the Hubble Space Telescope. Dr James Beletic, vice president of space and astronomy at Teledyne Imaging Sensors, spoke on the use of infrared detectors in astronomy, including sensors being developed for the James Webb Space Telescope. Unlike Hubble’s 1-megapixel IR detectors, which have to be heated to operate – not ideal for making infrared measurements – JWST’s 63-megapixel IR sensors will operate at 50 K with a noise of only six electrons. Only 0.75g of mercury cadmium telluride (HgCdTe) will make up the $10 million arrays, according to Beletic.

The A3 conference sold out, with 380 attendees from the AIA, the RIA for robotics, and the MCA for motion control. In the AIA track, other presentations were given on industry standards and on the International Traffic in Arms Regulations (ITAR). The next conference will take place in Orlando from 22-24 January 2014.
