
Well-oiled machine

Industrial robotics has changed a great deal since the first model was designed in 1954. Those early industrial robots did little more than transfer objects from one point to another over short distances. They were the vision of George Devol, who went on to set up Unimation to manufacture industrial robots.

Fast forward almost 65 years and the market has evolved. With industrial robots becoming more sophisticated, there is arguably a shift towards more collaborative relationships between human and robot (or ‘cobot’) workers. If you believe the available data, the market for collaborative robots is already a healthy one, and is set to become even more so.

Market research firm Market.biz valued the global market for collaborative robots at $0.28 billion in 2017, and believes it will grow to $4.22 billion by the end of 2023. Likewise, the most recent intelligence from Market Insights found that collaborative robots accounted for nearly three per cent of total robot units sold in 2015, and that 22,000 units were sold in 2016 – a figure expected to reach more than 400 million by the end of 2024.

Embracing the cobot

This technology has been embraced by a number of widely recognised brands, with logistics firm DHL recently investing in four collaborative robots for use in co-packing and production logistics centres in the UK. British online supermarket Ocado also revealed earlier this year that it was trialling a prototype collaborative robot designed to help human technicians maintain and repair equipment in the company’s automated warehouses.

One of the biggest challenges for this market is human safety, and vision systems are integral to ensuring collaborative robots can work safely alongside their human counterparts, as specified by the ISO 10218 safety requirements. A good example is the recent collaboration between Sick and Danish robot arm manufacturer Universal Robots, which produced an entry-level vision-guided cobot system for pick and place, quality inspection and measurement applications. The supplier incorporated 2D vision sensors to ensure the adaptability of the system, while making it fast and simple to program and configure for users at most skill levels.

Neil Sandhu, a Sick vision specialist, said: ‘An entry-level package allows engineers to set up the robot to be guided to pick randomly positioned objects, as well as to inspect or measure the objects prior to picking.’

Sick showcased a bin picking demonstration using robot arms from Universal Robots at the recent UKIVA Machine Vision Conference in Milton Keynes, UK. ‘Bin picking was a lot harder when based on laser triangulation,’ said Sandhu. ‘With stereovision we can work with smaller parts, down to 20mm.’

Getting a grip

Sick’s stereovision camera uses structured light to generate a 3D point cloud. The robot is programmed with the CAD model of the part; it then matches the 3D image data to the CAD model to locate individual parts in a bin and calculate the best orientation to pick up the object. Parts passing on a conveyor belt can be picked with a robot gripper using only 2D image data matched to the CAD model of the part.
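The matching step can be pictured with a minimal sketch using the open-source Open3D library, rather than Sick’s own software; the file names, voxel size and correspondence threshold below are assumptions for illustration only:

```python
# A minimal sketch of CAD-to-scan matching using the open-source Open3D
# library, not Sick's own software. File names, the voxel size and the
# correspondence threshold are assumptions for illustration.
import numpy as np
import open3d as o3d

cad = o3d.io.read_point_cloud('part_cad.ply')   # cloud sampled from the CAD model
scan = o3d.io.read_point_cloud('bin_scan.ply')  # structured-light scan of the bin

# Downsample both clouds for speed, then refine the pose with point-to-point
# ICP; a production system would first compute a coarse initial guess rather
# than starting from the identity.
cad_ds = cad.voxel_down_sample(voxel_size=0.002)
scan_ds = scan.voxel_down_sample(voxel_size=0.002)
result = o3d.pipelines.registration.registration_icp(
    cad_ds, scan_ds,
    max_correspondence_distance=0.01,
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

# result.transformation is the 4x4 pose of the CAD model in the scan frame
print(result.fitness)
print(result.transformation)
```

In a real bin-picking cell, the resulting transform would then be converted into a gripper approach pose and checked for collisions before the pick.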

‘Because it doesn’t need specialist programming expertise, it makes an ideal ‘starter’ vision-guided robotic solution for a production line,’ Sandhu explained, ‘even on a small scale, and is sufficiently adaptable to equip a pilot process facility for production development work.

‘The in-camera software guides you through the set-up and calibration process, based on Sick’s Sopas software, so even if an engineer is new to 2D vision robot guidance, development of an effective solution is simple.’

2D vision sensors such as the PIM60 from Sick can provide system adaptability. Credit: Sick

For standard configurations, such as changing jobs and pick-points, calibration and alignment are undertaken directly from the robot control pendant, while more advanced operations, such as inspection and dimension measurement of objects prior to picking, can be done through the Sopas device configuration tool. For high-speed positioning, inspection and measurement, the PIM60 vision sensor offers another 2D solution from the range.

Safety first

The partnership with Universal Robots has spawned a second application for collaborative robots, using two Sick sensors connected to a robot for identification and placement: a barcode reader informs the robot about the pieces on the conveyor belt via 2D identification. The cobots incorporate 15 safety functions certified by TÜV-Nord, allowing operation to take place safely without a fence. The robots are programmed to halt immediately on accidental contact with a human or object, and operating speed is automatically reduced if human intrusion is detected.
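The speed-reduction behaviour can be pictured as a simple separation-monitoring loop in the spirit of ISO/TS 15066. The sketch below is illustrative only – not Universal Robots’ or Sick’s certified safety logic – and every threshold in it is an assumption:

```python
# Illustrative separation-monitoring sketch only; not Universal Robots' or
# Sick's certified safety logic. All distances and speeds are assumed values.
def commanded_speed(human_distance_m: float,
                    full_speed: float = 1.0,     # assumed nominal speed, m/s
                    stop_distance: float = 0.3,  # assumed protective-stop radius, m
                    slow_distance: float = 1.0   # assumed slow-down radius, m
                    ) -> float:
    """Speed the controller should command for a given human separation."""
    if human_distance_m <= stop_distance:
        return 0.0                # protective stop: halt immediately
    if human_distance_m >= slow_distance:
        return full_speed         # no intrusion detected: run at nominal speed
    # Linear ramp between the protective-stop and slow-down radii
    frac = (human_distance_m - stop_distance) / (slow_distance - stop_distance)
    return full_speed * frac

for d in (1.5, 0.8, 0.5, 0.2):
    print(f'distance {d:.1f} m -> speed {commanded_speed(d):.2f} m/s')
```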

Both of these applications combine the strengths of Sick and Universal Robots, allowing people and robots to work within the same workspace: the robot takes over the repetitive and heavy movements, while its human colleagues go about their work in a safe environment. Sick has also integrated its sensors into robots from other suppliers, such as ABB and Kuka – which recently revealed its new cobot at this year’s Hannover Messe trade show for industrial technology.

The new model, LBR iisy, follows on from the launch four years previously of what the company called the ‘first robot that is approved for human-robot collaboration’, the Kuka LBR iiwa. Demonstrated at the Hannover show, the LBR iisy can be up and running swiftly, thanks to a new operating concept. With a reach of 600mm and a payload of 3kg, the six-axis lightweight robot is suited for flexible use in unstructured working environments, with applications including loading or unloading machines, packaging, precise testing, reliable assembly and handling.

Flexible friend

Flexibility is also the key factor for Oliver Barz, key account manager at Imago Technologies, when it comes to vision systems for contemporary robotics. ‘What we see in the field of robotics,’ he explained, ‘is more robot manufacturers are using machine vision. It is typically for use in a flexible environment – or we wouldn’t need a robot at all. So, we need to have a camera system where you can program image processing according to your needs.’

Another challenge Barz recognises is making collaborative robots an all-inclusive technology, which he sees as key to market growth. ‘We need to be able to do more than just set parameters,’ he said. ‘If a system works, setting only parameters will be a cost-effective solution, but if it does not, we would need to replace the whole system. Again, the image processing needs to be adaptable. Ultimately, the market for cobots will depend on how far we can go to reduce costs for imaging.

Kuka's LBR iisy robot is suited to loading and unloading, packaging, precise testing, reliable assembly, and handling in unstructured working environments. Credit: Kuka

‘We believe the solution is an intelligent, programmable camera for a combination of robotic and computer vision. Customers have special requirements, so it is programmed exactly to their needs. And it might be possible in the future to combine a powerful intelligent camera with a normal camera to process images from two sensors without a separate computer. I’m willing to follow that brilliant – or maybe not brilliant – idea!’

Cost is key

Barz revealed that Imago Technologies is currently in discussions with a major manufacturer about enabling its collaborative robot to see. ‘It has to be a cost-effective solution,’ he reiterated, ‘and it is something that is currently targeting new markets. There is a target market of “seeing robots” at maybe less than €20,000, and there needs to be extra effort in the hardware selection and the programming, but cost is a key feature of those systems.’

Nevertheless, he cautioned: ‘I have seen, in the past, people try to process the images on a computer which primarily has a different task. That might work, but it adds complexity. For our customers, machine builders need to have entirely separate working units – like a vision system – and all units must then be taken together to the HMI.’

Chris Varney, managing director at Laser Components, believes that lidar could offer an effective solution for making cobots safe to use in close proximity to humans, enabling the machines to ascertain the dimensions of their environment, as well as any potential obstacles.

Non-contact measurement

Laser Components supplies both pulsed laser diodes (PLDs) and avalanche photodiodes (APDs); used together, they allow the time of flight (ToF) of a laser pulse to be measured.

‘Cobots can use lidar,’ Varney said. ‘However, the range is very short, unlike with aircraft. This makes the time of flight very short.’ Varney feels that the company’s hybrid 905nm high-power pulsed laser diode – which generates short pulses at high pulse frequencies for higher resolutions in lidar and in scanners for security and aerospace applications – could provide the answer.

‘Laser Components has developed the QuickSwitch PLD with a very short 2.5ns pulse of light, thus being capable of sub-metre measurement. Developers of any short-range lidar will benefit significantly from the PLD and APD arrays, a strip of APDs within a single hermetic package, thus facilitating multiple measurements in an instant, reducing the scanning time otherwise required.’
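To see why a 2.5ns pulse implies sub-metre capability: a pulsed ToF system computes distance as d = c·t/2, so the pulse width bounds how finely two returns can be separated. A back-of-the-envelope check in Python (an illustration, not Laser Components’ firmware):

```python
# Back-of-the-envelope pulsed time-of-flight arithmetic; an illustration,
# not Laser Components' firmware.
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Target distance from a measured round-trip time: d = c * t / 2."""
    return C * round_trip_s / 2.0

def pulse_limited_resolution_m(pulse_width_s: float) -> float:
    """Approximate range resolution set by the laser pulse width."""
    return C * pulse_width_s / 2.0

print(tof_distance_m(20e-9))               # a 20ns round trip -> ~3.0m range
print(pulse_limited_resolution_m(2.5e-9))  # 2.5ns QuickSwitch pulse -> ~0.37m
```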

For Imago customers, machine builders need to have entirely separate working units, like a vision system. Credit: Imago Technologies

Paul Wilson, managing director at Scorpion Vision, offers a slightly more sceptical view of the collaborative robot’s place in industry, particularly following the company’s presence at the UKIVA Machine Vision Conference. He explained: ‘I spoke to people at the conference and they were quite cynical about collaborative robots. The thing is, robots are meant to be fast, but collaborative robots have to be slow in order to be safe, so would not be suitable for volume production. There is some cynicism.’

Deep learning

One big talking point from the event, according to Wilson, was deep learning, which enables robots to learn by watching a human’s actions. ‘There were seven theatres all covering different subjects, including a deep learning theatre,’ he said. ‘This is an area that people are talking about. When it comes to alphanumeric batch code reading, for example, the font, characters, etc need to be the same size. Our code reading alphanumeric system uses a deep learning model to train itself to read the code, enhancing its ability.’
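As an illustration of the kind of building block such a reader rests on – not Scorpion Vision’s actual model – the sketch below defines a small convolutional classifier for fixed-size character crops using PyTorch; the class count and crop size are assumptions:

```python
# A minimal sketch (not Scorpion Vision's actual system): a small CNN that
# classifies fixed-size character crops, the basic building block of a
# deep-learning alphanumeric code reader. Class count and crop size assumed.
import torch
import torch.nn as nn

NUM_CLASSES = 36  # assumed: digits 0-9 plus letters A-Z

class CharClassifier(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of grayscale character crops, shape (N, 1, 32, 32)
        return self.classifier(self.features(x).flatten(1))

model = CharClassifier()
crops = torch.randn(4, 1, 32, 32)   # stand-in for segmented character images
print(model(crops).argmax(dim=1))   # predicted class index per character
```

In practice the model would be trained on labelled crops of the customer’s fonts, which is what lets such a system keep reading as print quality and character appearance drift.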

Wilson believes that it is 3D vision technology that will best benefit this area in the future. ‘We use 3D so we can measure the distance between the camera and the area, and measure the size of the character. We have just delivered the system to a pharmaceutical company. The uptake of 3D is going to keep growing. We have also delivered an entire block picking system. If a block comes out of the factory deburred, it can rock around, so a robot has to know how to pick it up. This is a growth area for which 3D is potentially well suited.’

That’s not to say the company has dismissed the idea of collaborative robots – far from it. Scorpion Vision is working on an application for a customer making low-volume, but high-quality products. Wilson said: ‘The 3D vision from us is used to verify that parts are correct. The robot moves a 3D camera over the part, it is measured and identified, and the robot picks that part and marks it with an inkjet. Once marked, a second camera is used to verify that it is correct. The robot sits on the bench next to the operator, so it is much more suitable. But it is not a volume function.’


