Smartphone deep learning app developed for crop disease diagnosis

Farmers could soon diagnose crop diseases using their smartphones, thanks to a deep learning algorithm developed by scientists from the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland and Penn State University in the USA.

The algorithm, which is part of the PlantVillage project, will be used to build a smartphone app that can analyse pictures of crops. The work represents the first successful proof of concept for plant disease diagnosis through smartphone pictures, and was published in Frontiers in Plant Science in September.

The PlantVillage project, founded by Marcel Salathé at EPFL and David Hughes at Penn State, uses machine-learning algorithms to teach computers to diagnose crop diseases. The aim of the project is to put these algorithms in the hands of farmers, agriculturists and everyday gardeners in the form of a smartphone app. ‘People will be able to snap a photograph of their sick plant with the app and get a diagnosis within seconds,’ said Salathé.

The Frontiers in Plant Science paper demonstrates the use of deep learning, a type of machine learning that uses algorithms to find patterns in large datasets. In the paper, the researchers assigned each of 54,306 photographs of diseased and healthy plant leaves to one of 38 classes of crop-disease pairs, such as tomato plant with tomato early blight or apple tree with apple scab.
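
To illustrate what such a labelling scheme might look like in code, the short Python sketch below pairs a crop with a condition to form a class label. The class names are the two examples mentioned above plus a healthy case, and the label format is an assumption for demonstration only, not the dataset's actual convention.

    # Illustrative sketch only: each training image carries a label that pairs
    # a crop with a condition (a disease name, or "healthy"). The label format
    # here is an assumption for demonstration, not the dataset's convention.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CropDiseasePair:
        crop: str
        condition: str  # a disease name, or "healthy"

        def label(self) -> str:
            return f"{self.crop} - {self.condition}"

    examples = [
        CropDiseasePair("Tomato", "Early blight"),
        CropDiseasePair("Apple", "Apple scab"),
        CropDiseasePair("Apple", "healthy"),
    ]
    for pair in examples:
        print(pair.label())  # e.g. "Tomato - Early blight"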

The researchers then trained a deep convolutional neural network to identify plants and diseases (or lack thereof for healthy plants), and measured how accurately it could assign each image to the correct class. In total, working with 14 crop species and 26 different plant diseases, the system could identify diseases on images it had never seen before with an accuracy of 99.35 per cent.
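
The paper itself relies on established deep convolutional architectures; purely as a sketch of what a classifier for this task looks like, the Python (PyTorch) snippet below defines a small convolutional network with 38 output classes and a simple accuracy measure over held-out images. The architecture, the 224x224 input size and the helper names are assumptions for illustration, not the authors' model.

    # Minimal sketch, not the published model: a small convolutional classifier
    # for 38 crop-disease classes, assuming RGB leaf photos resized to 224x224.
    import torch
    import torch.nn as nn

    NUM_CLASSES = 38  # crop-disease pairs reported in the paper

    class LeafClassifier(nn.Module):
        def __init__(self, num_classes: int = NUM_CLASSES):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = self.features(x)
            return self.classifier(x.flatten(1))

    # Accuracy on unseen images: the fraction of predictions matching the true class.
    @torch.no_grad()
    def accuracy(model: nn.Module, images: torch.Tensor, labels: torch.Tensor) -> float:
        preds = model(images).argmax(dim=1)
        return (preds == labels).float().mean().item()

    if __name__ == "__main__":
        model = LeafClassifier()
        batch = torch.randn(4, 3, 224, 224)              # stand-in batch of leaf photos
        labels = torch.randint(0, NUM_CLASSES, (4,))     # stand-in class labels
        print(accuracy(model, batch, labels))            # untrained, so near chance

In practice, the reported 99.35 per cent accuracy comes from training a much larger network on the full image set and then evaluating it on images held out from training, as described above.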

Despite the time and computing power needed to build the algorithm and train the model, the resulting code is small enough to be installed on a smartphone. 

‘We do believe that the [smartphone] approach represents a viable additional method to help prevent yield loss,’ said Hughes. ‘With the ever-improving number and quality of sensors on mobile devices, we consider it likely that highly accurate diagnoses via the smartphone are only a question of time.’

The photographs used to train the neural network were taken under controlled conditions, which do not always reflect the conditions of an open field. To address this, the team is now expanding their database of images to about 150,000 in order to improve the system’s ability to identify diseases. 

‘At this point, we’re relying on a photograph taken by a user in a field under natural conditions,’ explained Salathé. ‘But in the future, we would like to also bring in time, location, epidemiological trends, weather conditions and other signals to bear upon the network, which would vastly improve its abilities.’
