
High-tech harvesting

Broccoli is a difficult crop to harvest. The window during which broccoli heads reach the correct size for picking is only three days: harvest too early and the heads are underdeveloped; too late and they are too large. In addition, the head weight a supermarket asks for can vary from one retailer to another, complicating the harvesting process further.

Manual harvesting currently results in up to 30 per cent of the crop being left in the field because of the speeds at which pickers are required to work. Now, however, a combination of machine vision, deep learning – whereby an algorithm is trained to recognise variation from large amounts of data – and robotics has the potential to replace these manual harvesting methods.

One such robotic system, being built by integrator Capture Automation in partnership with Stemmer Imaging, is designed to harvest broccoli. ‘Ideally, vision can be used to harvest at source,’ Ky Ellen, machine vision consultant at Capture Automation, commented. ‘The system is set up to collect a certain amount of broccoli heads of a particular size and is then sent on its way to harvest them automatically.’

Stemmer Imaging and Capture Automation are working in a long-term partnership to design and deliver vision systems for these automated harvesting machines, which enable picking to take place outside normal working hours, according to Chris Pitt, sales manager at Stemmer Imaging. ‘Picking at night time is also more beneficial, as the product stays cooler and fresher for longer,’ he said.

One of the problems with manual harvesting is that growers have no knowledge of what remains in the field after some of the crop has been harvested. ‘With an imaging system, it’s not only possible to collect data on the crops being picked, but also the data of the plants left in the ground,’ said Ellen. ‘You can send back information on why crops have been left in the field, perhaps because they’re a certain size. This allows you to come back at a later date and harvest the rest of the field.’

Vision systems enable farmers to harvest all their crops at the optimal time. ‘Vision technology could be used to start grazing a field as soon as growers believe their crops are close to being ready,’ Ellen explained. ‘The system would go and only pick the crops that are ready for a particular order, while also sending data back on the crops that are left in the field. This is quite valuable, as the data can then be correlated with the weather forecast to estimate when the rest of the crops will be ready, so then the machine can be sent out again to pick everything at the right time.’

The new automated harvesting systems are still in the trial phase and are being guided by 3D vision, which, according to Pitt, offers numerous advantages over traditional 2D imaging, such as the ability to distinguish weeds from crops.

‘The main reason that 3D imaging is better for this is that with 2D imaging everything within the image is based on the intensity of light reflected back from the product, rather than the product’s physical shape,’ commented Ellen. ‘Therefore, with 2D imaging, the closer the product is to the camera, the brighter the image; if it’s further away, you’ll get a darker image. This adds a lot of variability.

‘Part of the challenge with broccoli is separating the leaf from the head, so if one of the leaves from one plant is covering the head of another, a 2D system will struggle to tell the two apart,’ Ellen continued. ‘With a 3D system, it can be trained to look for a domed shape and a different texture to a leaf, making them easier to separate.’

The 3D vision system provided by Stemmer Imaging for this application is LMI’s Gocator camera, an all-in-one calibrated solution with an IP67 rating, meaning it is both dustproof and waterproof, and therefore suited to the harsh conditions of the field. A laser mounted alongside the camera projects a line onto the crop, enabling 2D cross sections of the target to be obtained by triangulating the geometric coordinates across its surface. A 3D profile of the whole target can then be constructed by combining the cross sections.
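The triangulation step itself is simple geometry. As a rough illustration – not the Gocator’s calibrated model, and with invented camera parameters – the height of each point on the laser line can be recovered from how far the line shifts in the image:

```python
import numpy as np

def line_profile_heights(pixel_shift, working_dist_m=0.5,
                         focal_px=1400.0, laser_angle_deg=30.0):
    """Recover surface height from the sideways shift of a laser line.

    Simplified model: the camera looks straight down and the laser is
    projected at laser_angle_deg from vertical, so a point raised by h
    metres displaces the line by h * tan(angle) on the ground, which
    the lens maps to pixel_shift pixels. All numbers are illustrative.
    """
    shift_m = pixel_shift * working_dist_m / focal_px      # pixels -> metres
    return shift_m / np.tan(np.radians(laser_angle_deg))   # height in metres

# One cross-section: line shifts measured at points along the laser line.
shifts = np.array([0.0, 4.0, 12.0, 18.0, 12.0, 4.0, 0.0])  # pixels
print(line_profile_heights(shifts))  # domed profile rising to ~11 mm
```

Stacking these cross sections as the harvester advances builds up the full 3D surface from which head shape and size can be judged.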

A system from Blue River Technology designed to maximise lettuce yield by removing unwanted plants (Credit: Blue River Technology)

To enable it to identify and classify crops correctly, the vision system is equipped with Stemmer Imaging’s Polimago deep learning software, which is first trained using a set of pre-defined data. ‘As part of the training process we have to feed the system images of broccoli heads,’ explained Ellen. ‘Thousands of images are gone through and identified manually. When the deep learning system is then faced with a new image, it can provide a score out of a hundred that reflects how sure it is that the image contains a broccoli head.’
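Polimago’s internals are proprietary, but the scoring behaviour Ellen describes is common to most binary classifiers. A minimal sketch in PyTorch – with an invented toy network and a random stand-in frame rather than real training data – shows how a new image is reduced to a score out of a hundred:

```python
import torch
import torch.nn as nn

# Tiny stand-in classifier; a production system would be far deeper and
# trained on the thousands of manually labelled broccoli-head images
# the article describes.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=5, stride=2), nn.ReLU(),
    nn.Conv2d(8, 16, kernel_size=5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 1),   # single logit: broccoli head vs not
)

def broccoli_score(image: torch.Tensor) -> float:
    """Return a 0-100 confidence that the image contains a broccoli head."""
    with torch.no_grad():
        logit = model(image.unsqueeze(0))            # add batch dimension
        return torch.sigmoid(logit).item() * 100.0  # probability -> score

frame = torch.rand(3, 128, 128)   # random stand-in for a camera frame
print(f"confidence: {broccoli_score(frame):.1f}/100")
```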

According to Ellen, as 3D imaging is based on the shape of the target rather than the intensity of the light being reflected off it, the technique is much less susceptible to variation in lighting, which makes it easier to train the deep learning software.

Certain deep learning challenges still exist, however, due to the increasing variety of crops grown in agriculture. ‘Producers are using selective breeding to give certain qualities [to crops], such as dark green broccoli heads, or heads that protrude further than the leaves. They’re constantly evolving and constantly changing, so we have to make sure that the vision system can cope with all these variants, which means that in the deep learning training set, we have to include broccoli that’s in the early, middle, and late stages of growth,’ Ellen said.

‘There’s also an issue in that the colour of soil differs depending on the climate,’ added Pitt. ‘While we have dark soil over here in the UK, in Spain the soil is a lighter, sandy colour with stones often mixed in, which might be mistaken for small gem lettuces. The vision system has to cope with everything.’

Using their current system, the two companies are able to achieve a 96 per cent success rate in identifying crops, with the camera taking around 100ms to identify and grade them, and the robotic system taking approximately five seconds to complete the picking cycle.

The vision-guided harvesting machines are currently only operated by Capture Automation and those involved in their production. The next stage of development, according to Ellen, is creating user interfaces for the vision system that can be used even by those who aren’t skilled in imaging: ‘You have complex robotic and vision systems on there, and you have to tie that all together in a simple start/stop interface, so tailoring this to the end users is a challenge. If everything is successful I would expect to see this out in the field in three years.’

Ellen and Pitt both believe that faster, simpler embedded systems will drive vision-guided harvesting in the future. ‘Going forward it’s about embedding all this into a small box to have a nice, simple system running bespoke algorithms that can plug directly into a robot,’ Ellen concluded. ‘This will help the market I’m sure, as vision systems get smaller and more embedded.’

Spray to kill

Weeds in the back garden might be a pain, but farmers have to deal with them on a much larger scale. For this purpose, California-based Blue River Technology has combined a deep learning vision system with a set of herbicide-spraying nozzles on its See and Spray machine, a tractor-pulled device that identifies any weeds it passes over and applies a dose of herbicide to them without harming the surrounding crops.

The company equips the machines with standard 2D machine vision cameras with global shutter sensors, of the kind already widely used in industrial factories. With this technology, the machines are able to discern weeds from crops with a 95 per cent success rate.

‘Currently, all the weeds are in one class,’ explained Ben Chostner, vice president of business development at Blue River Technology. ‘We are starting to do more to distinguish the different classes of weeds. This information will be valuable in the long term, as we can use it to prepare different mixtures of herbicides, which can then be carried on board the machines in order to deliver more specific treatments to each type of weed.’

According to Chostner, since the company first introduced its vision-guided spraying technology in 2013 as part of an automated lettuce-thinning system, deep learning has become more widely used. ‘We have therefore changed the entire architecture of our system, both on the software side and the hardware side, to be able to take an image and process it with deep learning,’ he said. ‘We’ve changed our algorithms and also our processors from traditional desktop Intel processors to Jetson GPUs from Nvidia.’

Since 2013, the company has also moved from Ethernet-based to USB3-based cameras, which has significantly increased the speed at which it can pull images off its sensors, ultimately enabling its machines to travel faster through the fields.

‘Our machines travel between four and eight miles per hour and are anywhere between 20 and 60 feet wide,’ Chostner commented. ‘As the machine moves through the field, we have about one hundred milliseconds to capture an image, get that image off the sensor and into the processor, run our deep learning algorithm, make a decision and spray the plant.’ In total this equates to each machine being able to treat around 40 acres of land per day.
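Those figures make the real-time budget concrete: at the top speed of eight miles per hour, the machine covers roughly 0.36 metres during each 100-millisecond window, so the whole capture-classify-spray chain has to finish before the nozzles pass the plant. A few lines of arithmetic – unit conversions of the quoted figures, nothing proprietary – lay this out:

```python
MPH_TO_MS = 1609.344 / 3600   # miles per hour -> metres per second

speed_ms = 8 * MPH_TO_MS      # top speed quoted by Chostner: 8 mph
budget_s = 0.100              # capture -> decide -> spray, per Chostner

# Ground covered while one frame is captured, classified and acted on;
# the spray decision must be made before the nozzle passes the plant.
print(f"{speed_ms * budget_s:.2f} m per 100 ms frame")   # ~0.36 m
```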

Blue River Technology’s lettuce-thinning machines, which use similar technology to the See and Spray systems, selectively remove unwanted lettuce plants from a field to optimise growing yields. The LettuceBot uses its vision technology to determine which plants are the most uniform in size and have optimal space to grow, and then applies herbicide to the rest to remove them.

‘Our equipment has been running in lettuce production in the US for about four years now,’ remarked Chostner. ‘It is actually used on about 10 per cent of lettuce produced in the US. This year we are expanding the application of this technology to cotton production.’

The company has primarily used standard RGB cameras in its machines up until now, thanks to their low cost and wide availability. However, it is now starting to consider other narrow wavelength bands to obtain information from plants. ‘This could involve adding a near-infrared aspect to the cameras,’ Chostner said. ‘Being able to customise cameras to look at bands of light other than the visible would be extremely valuable in our situation.’

Chostner also explained that having a higher dynamic range in the cameras would aid Blue River Technology’s See and Spray applications. ‘The prevailing lighting conditions vary quite a bit outdoors, and we often run into shadows created by our own equipment. In the worst scenario, there’s both full shadow and full bright sunlight. We have yet to find a camera that, with one exposure, can appropriately expose all of the information in that perspective. This would be a game changer for us.’

Camera manufacturer Ximea is also involved in the selective application of herbicides, but rather than working in the visible bands, the company supplies its hyperspectral imaging technology to a number of partner projects aimed at reducing the volume of herbicide used.

According to the company’s CTO, Jürgen Hillmann, Ximea’s main focus for precision agriculture at the moment lies in hyperspectral imaging. The company is currently developing a demonstration application with the University of Münster to measure agricultural indices such as NDVI – the normalised difference vegetation index, which gives an indication of plant health – in order to optimise fertiliser use.
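Vegetation indices of this kind are computed per pixel from a small number of spectral bands. As a sketch, the widely used NDVI combines the red and near-infrared bands; the band indices below are hypothetical, since they depend on the sensor’s band layout:

```python
import numpy as np

def ndvi(cube: np.ndarray, red_band: int, nir_band: int) -> np.ndarray:
    """Per-pixel NDVI from a hyperspectral cube shaped (bands, H, W).

    NDVI = (NIR - Red) / (NIR + Red); healthy vegetation reflects
    strongly in the near infrared, pushing the index towards +1.
    """
    red = cube[red_band].astype(np.float64)
    nir = cube[nir_band].astype(np.float64)
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids 0/0

cube = np.random.rand(100, 64, 64)   # stand-in cube: 100 bands, 64x64 px
index_map = ndvi(cube, red_band=20, nir_band=70)  # indices illustrative
```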

Ximea offers its xiSpec hyperspectral cameras for use on drones, which can give a bird’s eye view of a field of crops. The camera is small and lightweight, and offers a low power consumption of 1.8W, making it ideal for prolonging drone battery life. The camera is USB3-compliant and can produce 170 hyperspectral imaging data cubes, or up to 1,360 lines, per second.

Ximea's xiSpec hyperspectral imaging camera is suited to flying on a drone thanks to its low weight, size and power consumption

According to Hillmann, line scan cameras are well suited to drone use. ‘The advantage is the high spectral resolution, as in this case you have 100 or 150 bands, so it is better for detecting diseases on crops, or detecting unwanted weeds. The spectral information of healthy plants is compared to those with diseases beforehand, allowing imaging technology to discern between the two.’
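The comparison method isn’t named, but one common way to match a pixel’s spectrum against healthy and diseased references is the spectral angle: the smaller the angle between two spectra, the more alike their shapes. A sketch with stand-in data:

```python
import numpy as np

def spectral_angle(spectrum: np.ndarray, reference: np.ndarray) -> float:
    """Angle in radians between two spectra; 0 means identical shape."""
    cos = np.dot(spectrum, reference) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Reference spectra would be measured beforehand, as Hillmann describes;
# random stand-in values are used here.
healthy = np.random.rand(150)    # 150 bands, as in the quote
diseased = np.random.rand(150)
pixel = np.random.rand(150)

closer_to_healthy = (spectral_angle(pixel, healthy)
                     < spectral_angle(pixel, diseased))
print('healthy' if closer_to_healthy else 'diseased')
```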

Hyperspectral imaging has its limitations, however. Enough light must be present for accurate readings, so the technology can struggle in dark conditions.

‘In every case you have to measure the light conditions,’ commented Hillmann. ‘If you have a foggy day, then the spectral pattern of the sunlight coming down to the earth will be a little bit different, and you have to calibrate the camera for the current light situation.’
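The per-band correction this calibration implies is a simple division of the measured radiance by the current illumination spectrum, which in practice would come from the secondary sensor described below. A sketch with stand-in values:

```python
import numpy as np

def to_reflectance(radiance_cube, illumination, dark=None):
    """Normalise a (bands, H, W) radiance cube by the measured
    downwelling illumination spectrum (one value per band), so that
    captures under fog and full sun become comparable. `dark` is an
    optional per-band dark offset from a closed-shutter capture."""
    cube = radiance_cube.astype(np.float64)
    if dark is not None:
        cube = cube - dark[:, None, None]
    return cube / illumination[:, None, None]

cube = np.random.rand(150, 64, 64)   # stand-in sensor data
sun = np.random.rand(150) + 0.5      # stand-in illumination spectrum
reflectance = to_reflectance(cube, sun)
```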

An additional hyperspectral imaging system or a spectral meter will therefore often be attached to the drone alongside the primary camera, so that online measurements of the lighting conditions and spectral response can be performed. Multi-camera systems such as these have become a further focal area for Ximea in recent years. ‘We have developed technologies to aggregate various pictures from different cameras, for example a regular colour camera and a hyperspectral imaging camera,’ said Hillmann. ‘Using regular colour pictures in addition to the hyperspectral imaging allows you to provide a stereoscopic, high-level view of a field.

‘The idea behind the projects we are involved in is to use cheaper, smaller systems to optimise agricultural processes,’ concluded Hillmann. 


