
Sowing the seeds of digital tech

The United Nations Department of Economic and Social Affairs predicts the global population will reach 8.5 billion by 2030 and 9.7 billion by 2050. This means, according to research from the World Resources Institute, the planet will need to produce 56 per cent more food than now. This would require 593 million additional hectares of agricultural land, and it needs to be achieved in parallel with a reduction of agriculture’s impact on climate, ecosystems and water. 

A widely held belief in the agricultural community is that we are about to enter the fourth agricultural revolution. The first was around 12,000 years ago, when farming began, and the second was from the 17th century with the reorganisation of farmland. The third, which was also known as the green revolution, came in the 1950s and 1960s with the introduction of chemical fertilisers and pesticides, as well as engineered high-yield crop breeds and the use of heavy machinery.

The fourth agricultural revolution involves the use of new technologies, such as AI, to make smarter planning decisions and power autonomous robots. Developments such as these could be used in applications such as growing and picking crops, weeding, milking livestock and distributing agrochemicals via drone – and machine vision is a crucial component in their development. 

Cleaning up

According to Simon Pearson, director at the Lincoln Institute for Agri-Food Technology (LIAT), based at the University of Lincoln, a good deal of the work of the fourth revolution is to undo much of what took place during the third. He explained: ‘Our challenge is almost cleaning up a bit. The third revolution worked almost too well, and while people obviously need to eat food, we also need to reduce the use of pesticides to get towards net zero.’

The LIAT is home to a working farm with research facilities. Its scientists run projects designed to make a difference across the food chain, with machine vision playing a vital role. ‘We have a quite significant group working on agricultural robotics and the vision system associated with those technologies,’ explained Pearson. ‘We’ve got an Engineering and Physical Sciences Research Council centre for doctoral training, which is for 50 PhDs joint with Cambridge and the University of East Anglia (UEA). And that’s engineering at Cambridge and UEA in the machine vision lab, so you get a view that vision is very important.’

Pearson went on to explain that 18 of these PhDs have been recruited so far, with the rest to be assigned over the next eight years. ‘We’re recruiting and we’re getting a lot of really positive connections with industry,’ he said. ‘We’ve also got a £6.3m global centre of excellence in agricultural robotics called Lincoln Agri Robotics, which was funded by Research England. We are very dependent for all of this on machine vision, which makes it a really interesting area of research. What we’re trying to do is machine vision in very, very challenging environments with optically variable objects. If we can crack the vision nut, it unlocks lots of applications for agri-robotics, and the impact is really significant.’ 

The use of agri-robotics can reduce the impact of chemicals used in agriculture in a number of ways. Robotic weeding, for example, uses vision technology to image plants, distinguish the weed from the crop, and kill only the weeds. However, there is another issue for agriculture that has intensified during the pandemic – a lack of people to pick the fruit at the critical time. Pearson elaborated: ‘Labour to pick fruit is a massive issue with the reduction in seasonal workers. For robotic picking to be an alternative for fruits and vegetables, you’ve got to identify an object to be able to move a robot to pick it. Then you’ve got to move a robot around the farm to remove what it picked. So you need machine vision.’ 

It is this issue that one of the LIAT’s latest projects is designed to address. Funded by UK Research and Innovation (UKRI) under the Innovate UK council, the £2.5m Robot Highways project was set up to help tackle labour shortages in soft fruit farming and the need for global food production, while reducing the environmental impact of the farming sector.

The Robot Highways project was set up to help address labour shortages in soft fruit farming. Credit: LIAT and Saga Robotics

LIAT was part of the successful consortium selected to deliver the project. It will lead the academic contribution, covering robot development and co-ordination of the fleet control system. The team believes the work could be key to industry sustainability by reducing the sector’s reliance on seasonal labour, estimating a 40 per cent reduction in the labour required.

The aim is to deliver the project across the UK by 2025, with a fleet of robots able to perform a number of farming functions as one operation, powered by renewable energy. Solutions will also be provided for moving the sector to a carbon zero future. The consortium estimates that it will cut fruit waste by 20 per cent, reduce fungicide use by 90 per cent, lower use of fossil fuel across farm logistic operations, and increase farm productivity by 15 per cent.

Pearson said: ‘I’m delighted that opportunities are being realised for the sector, and agri-food robotics specifically. With fruit and vegetable picking, you have very complex occluded structures, so you need 3D vision.’ An example he gave was imaging strawberries in 3D to measure their size.

Two of the university’s research partners are also consortium members for the Robot Highways project – UK soft fruit marketing co-operative, Berry Gardens Growers, and Saga Robotics. The latter will supply robots and autonomous systems for the project, such as the modular robot Thorvald. It can operate in open fields, tunnels, orchards and greenhouses, performing tasks such as light treatment for disease management, picking fruits and vegetables, phenotyping, in-field transportation, cutting grass for forage production, spraying, and data collection and crop prediction. It uses advanced navigation methods and AI to perform these tasks.

Saga Robotics will supply its Thorvald robot. Credit: LIAT and Saga Robotics

Clock House Farm will be the demonstration farm for the project and, focusing on the packing side, the Manufacturing Technology Centre (MTC) will use its expertise in simulation, automation and process optimisation to ensure that the facility is working as efficiently as possible, and that any automation used is in the right area and delivering value.

MTC is no stranger to research in this area, having recently made a breakthrough with a learning robot that could benefit agriculture, among a number of sectors. The latest development allows single objects to be picked out of a random tray or bin, without the need for high-cost sensors or lengthy programming. 

The new process, which has been given the codename Project Viper, uses a deep neural network implemented using the PyTorch machine learning framework to generalise learned behaviour to new scenes. Algorithms were trained with open-source manually-labelled datasets and simulated datasets. Performance tests, using low-cost depth cameras and collaborative robots, were carried out on a wide range of objects including metal components, cosmetic containers and fruit, and the team demonstrated that 94 per cent of attempted picks were successful.
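The article does not publish MTC’s code, but the shape of such a system can be sketched in PyTorch: a small convolutional network that maps a depth image to a per-pixel grasp-quality map, from which the highest-scoring pixel is chosen as the pick point. The architecture, layer sizes and the `GraspNet` name below are illustrative assumptions, not Project Viper’s actual design:

```python
import torch
import torch.nn as nn

class GraspNet(nn.Module):
    """Toy grasp-quality network: maps a single-channel depth image to a
    per-pixel grasp-quality map (values near 1 suggest good pick points)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1), nn.Sigmoid(),  # quality score in [0, 1]
        )

    def forward(self, depth):
        return self.net(depth)

model = GraspNet()
depth = torch.rand(1, 1, 64, 64)      # stand-in frame from a low-cost depth camera
quality = model(depth)                # per-pixel grasp-quality map
best = quality.flatten().argmax()     # choose the highest-scoring pixel as the pick point
row, col = divmod(best.item(), 64)
```

In practice, as the article describes, such a network would be trained on a mix of manually labelled and simulated scenes before deployment.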

The Manufacturing Technology Centre has built a learning robot that could benefit agriculture

MTC senior research engineer, Mark Robson, said: ‘Building the system around a neural network architecture allows us to update the model as we gather data in operation, enabling the system’s performance to continue to improve over time. Using simulation to automate the creation of training data significantly reduced the cost and time typically required to manually produce the large quantities of data needed to train a neural network.’

The MTC showed that the model could be trained on simulated data, therefore reducing the need for labour-intensive manual data collection and labelling.

Future gazing

While work on automating fruit picking continues at LIAT, Pearson noted that there is still lots of research into robotic weeding. ‘We’re doing a lot of vision work for that,’ he said.

The work revolves around identifying the crop from the weed, along with how to make the vision process stable, from one field to the next, on different days and with different crops. ‘These are really difficult machine vision challenges in real-world environments,’ he added. ‘What we’re having to do is really advance the state-of-the-art in vision technologies, to get to a point where we can apply them.’ 
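As a rough illustration of why field-to-field stability is hard, the classic excess-green (ExG) index is often used as a first segmentation step in crop imaging: vegetation pixels score high on 2G − R − B. This is not LIAT’s method, and the threshold below is an invented value; in practice it shifts with lighting, soil and crop, which is exactly the stability problem described. Distinguishing crop from weed then needs a learned classifier on top:

```python
import numpy as np

# Tiny 2x2 RGB image: two plant-like pixels (green-dominant) and two soil-like pixels.
rgb = np.array([
    [[0.2, 0.6, 0.2], [0.5, 0.5, 0.5]],
    [[0.1, 0.7, 0.1], [0.6, 0.4, 0.3]],
])
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

exg = 2 * g - r - b          # excess-green index per pixel
vegetation = exg > 0.2       # illustrative threshold; tuned per field in practice
```

Everything past this mask – deciding which green pixels are crop and which are weed – is where the hard machine vision research lies.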

As well as identifying weeds, the institute is also working with machine vision to identify crop diseases. It has recently partnered with start-up Fotenix, which uses imaging and machine learning to design systems to monitor crop health. The key technology for the firm is three-dimensional multispectral imaging, which allows laboratory analysis at a more economical cost, making data-driven farming more freely available. ‘We’re trying to identify [crop] diseases at a very early stage,’ explained Pearson, ‘so that they can be brought under control before they become an epidemic.’ 

Fotenix’s technology, which was developed at Manchester University, uses 3D spectral imaging, sensitive from 365 to 1,050nm, to analyse plant health in the field. It can determine detailed plant health information such as nutrients and disease at a cellular scale, using different wavelengths of light to detect unique identifiers of specific plant diseases, before they’re detectable by the human eye.
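Fotenix’s pipeline is proprietary, but the general idea of exploiting wavelength-dependent reflectance can be shown with the standard NDVI vegetation index: healthy leaf tissue reflects strongly in the near-infrared, so the index drops where tissue is stressed or diseased. The band values and the threshold below are made-up illustrations, not Fotenix measurements:

```python
import numpy as np

# Hypothetical per-pixel reflectance from two spectral bands
# (values invented for illustration).
red = np.array([[0.10, 0.12], [0.30, 0.35]])   # ~660nm band
nir = np.array([[0.60, 0.58], [0.32, 0.30]])   # ~850nm band

# NDVI = (NIR - Red) / (NIR + Red): high for healthy vegetation, low for stressed tissue.
ndvi = (nir - red) / (nir + red)
stressed = ndvi < 0.2    # flag pixels that may warrant closer inspection
```

A multispectral system sensitive from 365 to 1,050nm can compute many such band ratios, which is what allows disease signatures to be picked up before they are visible to the eye.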

‘Digital techniques are the next thing in agriculture. They use real intelligence to try and transform agriculture production,’ Pearson said, adding that he expects adoption of digital technologies in agriculture to be relatively fast.

Blossoming tech

Many fresh digital technologies are already making their way into the commercial landscape, via firms such as Fotenix and Cambridge, UK-based agri-tech start-up, Outfield Technologies. It was established by co-founders Jim McDougall and Oli Hilbourne to build systems that help fruit growers be more productive, more sustainable and more environmentally friendly. Having collaborated with Cambridge University’s Machine Intelligence Laboratory, the company now offers a yield-measurement and orchard-management system for high-value fruit crops. 

One of the key issues Outfield helps to address is accurate yield estimation in orchards. McDougall, who is also commercial director, explained: ‘For growers, manually managing the number of apples, for example, is not an easy task.’

Yield management based on manual estimation can be affected by a number of factors; apple trees can be unpredictable. ‘If a grower has an orchard with 10,000 apple trees, that is a lot of variability and estimates can often be out by around 20 per cent,’ McDougall added. ‘This results in lost income and inefficient operations for the growers. It can also produce substantial food waste.’

Outfield’s technology can help by analysing images taken in orchards, counting how many blossoms there are on the trees and estimating fruit yields. ‘We can manage the number of apples more precisely, depending on the amount of apple blossom, for example,’ said McDougall.

Credit: Outfield Technologies

The images are captured by off-the-shelf drones, and Outfield analyses the imagery using machine learning to provide detailed maps of fruit loading and fruit counts, helping growers visualise and track the parameters that matter to their particular orchard or farm.

‘The drones are very cool,’ enthused McDougall. ‘We don’t think anyone else is currently doing it. You can tell them where to fly, and they capture the aerial view images of orchards, and the secret sauce is our online platform that looks at the image to detect each blossom or piece of fruit, to calculate how many there are.’
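Outfield’s detector is the ‘secret sauce’, but the counting step it feeds can be sketched: given a binary mask of pixels classified as blossom (here a hand-made toy mask rather than real detector output), connected-component labelling turns pixel clusters into a blossom count:

```python
import numpy as np
from scipy import ndimage

# Toy stand-in for detector output: 1 marks pixels classified as blossom
# in an aerial image. Real systems produce this with a learned model.
mask = np.zeros((8, 8), dtype=int)
mask[1:3, 1:3] = 1    # blossom cluster 1
mask[5, 5] = 1        # blossom cluster 2
mask[6:8, 0:2] = 1    # blossom cluster 3

# Connected components: each separate cluster counts as one blossom.
labels, count = ndimage.label(mask)
```

Aggregating such counts across thousands of trees is what lets a grower replace a manual estimate that ‘can often be out by around 20 per cent’ with a per-tree measurement.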

Bringing this kind of technology from the research environment to the commercial arena is not without its challenges. McDougall explained: ‘A lot of early agri-tech companies struggle to find the right skillset, because it is so specialist. Then there are the challenges associated with getting the technology onto the farm. Something that works in a laboratory may not work in an agricultural environment.

‘There are some great growers in the UK, but every farm is different in terms of scale, market and crops. 

‘But then, that’s why agri-tech drives technology. If something works on a farm, it will work anywhere, and people are really keen to see things like this work.’ 

McDougall agrees that the fourth agricultural revolution will see robotic and machine vision technology implemented quickly. 

‘There will be a huge acceleration in the coming years,’ he said, ‘which will lead to a confluence of different things that we can do, and growers will be able to take a more strategic view.’

He also concurs that the fourth agricultural revolution has its part to play in helping reduce some of the chemical damage caused during the third iteration. ‘We will certainly see more precision-spraying robots,’ he said. ‘This type of technology can help to reduce any further damage to the planet, and not before time. We’ve got to sort this out now, and technology is the only way.’



Picking up rocks

How do you automate the removal of rocks from agricultural land? American firm TerraClear has built a solution using computer vision that maps the terrain and identifies the size and location of rocks for a more targeted approach to rock removal.

Farmers spend a lot of time removing rocks from their fields after tillage. There are mechanical solutions that churn through the soil, but these are slow and can only be used in certain soil conditions.

TerraClear’s technology is fitted to a tractor or skid steer

TerraClear’s rock clearance solution starts with a drone surveying the field. Images from the drone are analysed using a neural network to map the location and size of rocks. The farmer then drives a tractor with a hydraulic picker mounted to it on a route provided by the drone map. Cameras are then used to identify rocks as the tractor moves through the field, so the picker can pull them into the tractor bucket.

The system uses Triton 2.3-megapixel cameras from Lucid Vision Labs. Images from the cameras are used to train the neural network, to identify rocks from aerial imagery, and for real-time rock identification by the picking robot. According to TerraClear, cameras need to be compact, lightweight and energy-efficient for use on aerial platforms and mobile machinery.
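TerraClear’s route planner is not public, but the step from an aerial rock map to a pick route can be sketched with a greedy nearest-neighbour ordering over detections. The coordinates, sizes, starting point and capacity limit below are all invented for illustration:

```python
import math

# Hypothetical rock map from the aerial survey: (x, y) field position in
# metres and estimated diameter in cm.
rocks = [(40.0, 12.0, 25), (5.0, 3.0, 18), (20.0, 30.0, 60), (8.0, 22.0, 30)]

MAX_DIAMETER = 50  # cm; rocks beyond the picker's capacity are left for other handling
pickable = [r for r in rocks if r[2] <= MAX_DIAMETER]

# Greedy nearest-neighbour ordering from the field entrance at (0, 0):
# a simple stand-in for whatever route planner the real system uses.
route, pos = [], (0.0, 0.0)
remaining = list(pickable)
while remaining:
    nxt = min(remaining, key=lambda r: math.dist(pos, (r[0], r[1])))
    route.append(nxt)
    pos = (nxt[0], nxt[1])
    remaining.remove(nxt)
```

On the tractor, the onboard cameras then take over, confirming each mapped rock in real time so the hydraulic picker can pull it into the bucket.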
