Lettuce imaging is tip of the iceberg


Researchers at Earlham Institute have developed a machine learning platform to categorise lettuce crops using computer vision and aerial images.

The platform, called AirSurf-Lettuce, is capable of scoring iceberg lettuces with an accuracy of greater than 98 per cent, according to the study.

The researchers conducted field trials at G's Growers, the second largest vegetable grower in the UK, based in Ely.

Aerial imagery is used by crop researchers, growers and farmers to monitor crops during the growing season.

To extract meaningful information from large-scale aerial images collected from the field, high-throughput phenotypic analysis solutions are required, which not only produce high-quality measures of key crop traits, but also support farmers to make prompt and reliable crop management decisions.

The software measures the quantity and size of lettuces and pinpoints their location, helping farmers to harvest with precision and get the crop to market in the most efficient way. Importantly, this technology can be applied to other crops, widening the scope for positive impact across the food chain.

Lettuce is big business, especially in East Anglia, with 122,000 tonnes produced in the UK each year. Up to 30 per cent of yield can be lost due to inefficiencies in the growing process as well as harvest strategies, which, if recovered, could provide a significant economic boost.

It's very important that farmers and growers understand precisely when crops will become harvest-ready, so that they can set in motion the planning of logistics, trading and marketing their produce further along the chain.

Traditionally, however, measuring crops in fields has been very time-consuming and labour intensive, as well as prone to error; therefore novel AI solutions based on aerial images can provide a much more robust and effective method.

Another barrier to efficiency is the fact that inclement weather conditions, which have been increasing in recent years, can throw off harvesting times quite significantly, as crops take different lengths of time to mature.

The AirSurf technology – developed by members of the Earlham Institute’s Zhou Group, including first authors of the paper on the project, Alan Bauer and Aaron Bostrom – uses deep learning combined with ultra-wide-scale imaging analysis to measure iceberg lettuce in a high-throughput mode. The system can identify the precise quantity and location of lettuce plants, with the additional advantage of recognising crop quality.
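One common way to apply a deep classifier to ultra-large aerial images is to slide a fixed-size window across the field mosaic and score each patch. The sketch below illustrates that idea only: the trained network is replaced by a toy brightness threshold (an assumption for illustration; the real AirSurf-Lettuce classifier and its parameters are described in the paper, not here).

```python
import numpy as np

def sliding_window_count(image, window=20, stride=10, threshold=0.5):
    """Scan an aerial tile with a sliding window and record candidate
    plant locations. A toy stand-in for a CNN classifier: a window is
    'positive' if its mean intensity exceeds a threshold."""
    detections = []
    h, w = image.shape
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            patch = image[y:y + window, x:x + window]
            if patch.mean() > threshold:
                # record the window centre as a candidate plant location
                detections.append((y + window // 2, x + window // 2))
    return detections

# Synthetic aerial tile: two bright 'plants' on a dark field
field = np.zeros((60, 60))
field[5:25, 5:25] = 1.0
field[35:55, 35:55] = 1.0
hits = sliding_window_count(field)
```

Note that adjacent windows over the same plant each fire, so a real pipeline would merge overlapping detections (e.g. with non-maximum suppression) before reporting a count.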

Combining this system with GPS allows farmers to track the size distribution of lettuce in fields, helping to increase the precision and effectiveness of farming practice, including the timing of harvest.
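Tying detections to GPS amounts to mapping pixel coordinates onto the georeferenced extent of the image, then binning each plant's measured size. The sketch below makes simplifying assumptions (a north-aligned image with known corner coordinates, linear interpolation, and hypothetical size thresholds) rather than reproducing the study's georeferencing.

```python
def pixel_to_gps(px, py, img_w, img_h, top_left, bottom_right):
    """Linearly map a pixel coordinate to (lat, lon), assuming the
    aerial image is north-aligned and spans a rectangle with known
    corner GPS coordinates (a simplification of true georeferencing)."""
    lat0, lon0 = top_left
    lat1, lon1 = bottom_right
    lat = lat0 + (lat1 - lat0) * (py / img_h)
    lon = lon0 + (lon1 - lon0) * (px / img_w)
    return lat, lon

def size_category(diameter_px, small=15, large=25):
    """Bin a measured head diameter (in pixels) into a size class.
    The thresholds here are illustrative, not the study's values."""
    if diameter_px < small:
        return "small"
    if diameter_px <= large:
        return "medium"
    return "large"

# Example: a plant at the centre of a 100x100-pixel tile over a
# hypothetical field near Ely
lat, lon = pixel_to_gps(50, 50, 100, 100, (52.40, 0.26), (52.39, 0.28))
```

A per-plant table of (lat, lon, size class) built this way is what lets a grower target harvest machinery at the parts of a field that are ready.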

First author, Alan Bauer at EI, said: ‘This cross-disciplinary collaboration integrates computer vision and machine learning with the lettuce growing business to demonstrate how we can improve crop yields using machine learning.’

Industry partner at G's Growers, Innovation Manager Jacob Kirwan, added: ‘Farming at a large scale means that precision is essential when ensuring that we are producing crops in an environmentally and economically sustainable way. Using technology like AirSurf means that growers are able to understand the variability in their fields and crops at a much higher level of detail than was previously possible.

‘The decisions that can then be taken from this information, such as varying applications of inputs and irrigation; changing harvest strategies and planning the optimum time to sell crop, will all contribute towards increasing on farm yields and improving farm productivity.’
