Jessica Rowbury looks at how imaging technology is allowing farmers to reduce their environmental impact while improving profitability
Farmers are under increasing pressure to become more environmentally friendly. Shoppers are increasingly aware of how agricultural chemicals can affect the environment and human health, as evidenced by year-on-year growth in demand for organic food. In addition, Californian farmers have been the subject of recent debate over whether they are to blame for the emergency water restrictions put in place in May.
Luckily, imaging and machine vision solutions are allowing agriculturists to better manage their resources without damaging their profitability.
There has been a growing consumer demand for organic produce across Europe and in the United States. The Soil Association’s 2015 Organic Market Report revealed sales of organic products in the UK increased by four per cent in 2014, whereas in the United States, the organic market grew by 11.3 per cent from the previous year, according to separate market analysis from the Organic Trade Association.
‘Lots of studies are being conducted nowadays to quantify the negative impact of pesticides and herbicides,’ said Jean Inderchit, robotics engineer at Naïo Technologies, a French company specialising in agricultural robots. ‘The demand for healthy and organic products will continue to increase since more and more people realise that some chemical compounds for agriculture can be harmful to humans.’
Furthermore, governments are continuing to take regulatory actions in an attempt to control and reduce the amount of pesticides used in agriculture. ‘There are always more environmental and agribusiness constraints applied by the [United] States,’ Inderchit said. ‘In France for example, the objective of the “Grenelle de l’environnement” (environmental roundtable) is to reduce the use of phytosanitary products by 50 per cent by 2020.’
Farmers are therefore under mounting pressure to reconsider their use of chemicals. ‘It is crucial to think of alternative solutions to using herbicides and pesticides,’ stated Inderchit.
Naïo Technologies has recently introduced a weeding robot to the commercial agriculture market, which can effectively detect and remove weeds at a power cost of less than €1 per hectare. Currently, it is mainly being used by vegetable farmers and horticulturists who have fewer than 10 hectares and an economic model based on a short supply chain.
The Oz robot is able to detect the width and length of paths between crops and switch from one path to the next without human intervention. It can operate for four hours at a time without charging, which represents about 48 x 100m crop rows. The robot also sends status reports to its owner via a mobile phone message.
In order to be able to travel down the paths between crops effectively, the robot requires a reliable positioning system. ‘The biggest problem with a robot in an open environment like [a field of] vegetables is not knowing if you are moving or if your wheels are slowly drifting on muddy soil,’ explained Inderchit. This means that a normal odometer, which can determine the distance travelled based on data provided by the propulsion system, cannot be used.
The Oz robot employs two cameras connected in stereo to ensure precise positioning. The two cameras are calibrated to each other, and also to the real world by a translation vector and a rotation matrix. The system can then compare corresponding points in two successive stereo recordings, and use triangulation to calculate the displacement along the z-axis and so obtain the robot's motion.
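The principle behind this can be sketched in a few lines: with a calibrated stereo pair, the depth of a matched point follows directly from its horizontal disparity, and comparing the depth of the same point across two successive recordings yields the forward motion. The focal length and baseline below are illustrative values, not Oz's actual calibration:

```python
import numpy as np

# Illustrative calibration values (not Oz's actual parameters)
f = 700.0   # focal length, in pixels
b = 0.12    # baseline between the two cameras, in metres

def depth_from_disparity(disparity_px):
    """Triangulate depth along the z-axis from horizontal disparity.

    For a rectified stereo pair, z = f * b / d, where d is the pixel
    offset of the same world point between the left and right images.
    """
    return f * b / np.asarray(disparity_px, dtype=float)

# The same point matched in two successive stereo recordings gives two
# depths; their difference is the robot's forward motion between frames.
z1 = depth_from_disparity(42.0)   # depth at time t      -> 2.0 m
z2 = depth_from_disparity(48.0)   # depth at time t+1    -> 1.75 m
forward_motion = z1 - z2          # robot moved 0.25 m towards the point
```

Because this estimate is derived from what the cameras actually see, it is immune to the wheel slip that defeats a conventional odometer on muddy soil.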
The robot's control also needs to be precise: it must operate close enough to the vegetables to ensure accurate detection and removal of the weeds, but not so close that it damages the produce. To extract the navigable path, a stereo correspondence algorithm is also used, which computes a disparity map of the stereo camera pair. In combination with stereo-matching, the robot can also detect obstacles. ‘That is where computer vision starts to become really handy,’ said Inderchit. ‘We used colours and textures in mono-vision and also stereo correspondence to overcome those problems. It also helps us to detect any kinds of obstacles that are worth reporting to the user, like big rocks, or irrigation pipelines.’
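A disparity map of the kind described can be computed by sliding a small block from the left image along the same row of the right image and keeping the offset with the best match. The naive sum-of-squared-differences version below is a minimal sketch of the idea only; a production system such as Oz's would use an optimised correspondence algorithm:

```python
import numpy as np

def disparity_map(left, right, max_disp=16, block=5):
    """Naive SSD block-matching stereo correspondence.

    left, right: rectified greyscale images as 2D arrays.
    For each pixel, tries every disparity up to max_disp and keeps
    the one whose block in the right image best matches the left.
    Illustrative only; real systems use optimised algorithms.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1].astype(float)
            best_ssd, best_d = np.inf, 0
            for d in range(max_disp):
                cand = right[y-half:y+half+1,
                             x-d-half:x-d+half+1].astype(float)
                ssd = np.sum((patch - cand) ** 2)
                if ssd < best_ssd:
                    best_ssd, best_d = ssd, d
            disp[y, x] = best_d
    return disp
```

Large, near objects produce large disparities, which is how a disparity map can simultaneously delimit the navigable path and flag obstacles such as rocks or irrigation pipelines.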
The two systems operate at different frequencies: the positioning system at 15 frames per second, and the stereo correspondence algorithm at five frames per second. Combining both results, Naïo reconstructs a 3D map which the robot stores in memory.
A Matrix Vision USB 2.0 board-level camera, the mvBlueFOX-MLC100w, is used on the Oz robot. It incorporates an Aptina global-shutter CMOS sensor, which makes it suitable for outdoor use in variable light conditions.
Naïo is now working to give the robot the ability to provide a greater depth of feedback about the crops to the user. ‘We are working on algorithms for meticulously counting the vegetables, for estimating the surface of the leaves, among other useful information,’ said Inderchit.
Using chemicals has long been the most successful and cost-effective method of controlling weeds. However, alternatives that are both effective and economically viable may persuade more agriculturists to adopt chemical-free methods in the future. ‘By talking with a lot of different people working in agriculture, it is fair to say that none of them use pesticides and herbicides with pleasure,’ Inderchit remarked. ‘It is just a necessity to control the weed simply and efficiently. If we offer a reliable alternative solution most of them will change their point of view and the use of chemical products will decrease.’
However, the great challenge is to retain today's profitability while drastically reducing the use of chemical products throughout the agricultural industry, Inderchit added. ‘One solution would be to apply the chemical products as locally as possible on the plant by using precise imaging solutions,’ he pointed out.
Indeed, hyper- and multispectral imaging solutions are being employed by more agriculturists to manage their resources, whether harmful to the environment, expensive, or in limited supply.
California is currently experiencing one of the most severe droughts on record. At the beginning of May, the first mandatory urban water conservation rules in state history were imposed, meaning that California must cut water use in urban areas by 25 per cent in the next year.
Emergency regulations for agriculture in the state are still under discussion, but the situation highlights the growing pressure farmers are facing to be smarter with their use of resources.
Hyperspectral imagers are being used by agriculturists to create plant vitality maps, which allow them to make the best decisions as to how and where to apply limited resources such as water.
The traditional method of assessing plant health is through airborne surveys, but these are often expensive and difficult to schedule. According to David Bannon, CEO of Headwall Photonics, there has been incredible growth in the use of unmanned aerial vehicles (UAVs) in precision agriculture. ‘If you look at areas such as California which are highly dependent on agricultural yields, the introduction and deployment of UAVs, particularly as the US federal aviation agency has opened up the airspace, will become increasingly important for plant monitoring and plant health assessment.’
Deployed on a UAV, hyperspectral sensors fly over a field and capture the spatial and spectral data within the scene. ‘We will collect the spectral and spatial data, and through some algorithms we will create a colour-coded map that will allow the farmer, who is not an optical scientist, to look at his crop land and say “these plants over here are not as healthy as the plants in other regions, perhaps I have to apply more water or fertiliser,”’ explained Bannon.
‘This is important for the environment, because you don’t want to be spraying a lot of herbicides and pesticides across acres and acres of cropland if only one part of your field is under stress,’ Bannon added. ‘So, they are able to provide… the application of resources to those areas of crop land to adjust their yield quality and quantity.’
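A common way to derive such a colour-coded vitality map from spectral data is a vegetation index like NDVI, which contrasts red and near-infrared reflectance. The sketch below assumes this standard index and typical band choices; it is not necessarily the algorithm Headwall uses:

```python
import numpy as np

def ndvi(cube, wavelengths, red_nm=670.0, nir_nm=800.0):
    """Normalised Difference Vegetation Index from a hyperspectral cube.

    cube: array of shape (rows, cols, bands) of reflectance values.
    wavelengths: band-centre wavelengths in nm, one per band.
    The red/NIR band choices are typical values, not Headwall-specific.
    Healthy vegetation absorbs red light and reflects NIR strongly,
    so higher NDVI indicates healthier plants.
    """
    red = cube[:, :, np.argmin(np.abs(wavelengths - red_nm))].astype(float)
    nir = cube[:, :, np.argmin(np.abs(wavelengths - nir_nm))].astype(float)
    return (nir - red) / (nir + red + 1e-9)

def stress_mask(ndvi_map, threshold=0.4):
    """Flag pixels below an (illustrative) stress threshold, marking
    regions that may need more water or fertiliser."""
    return ndvi_map < threshold
```

Thresholding the index map in this way is what turns raw spectral data into the farmer-friendly ‘these plants over here are not as healthy’ picture Bannon describes.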
Weighing 0.68kg with dimensions of 7.6 x 7.6 x 11.9cm, Headwall’s Nano-Hyperspec sensors were specifically designed for use on these UAV platforms, which are becoming smaller and more lightweight. ‘UAVs don’t carry a very large payload, and so when Headwall designs these sensors, particularly with these small UAVs, what we focus on is size, weight and power,’ commented Bannon. The sensors measure in the VNIR (400-1,000nm) spectral range, and also include on-board data collection and storage.
A challenge that can arise from using UAVs is that factors such as wind or user inexperience can cause sudden movements which can affect the quality of the images. ‘You need to be able to pinpoint exact locations in order to best apply your resources, and if you cannot pinpoint exact locations, the imager becomes useless,’ Bannon pointed out.
Software is therefore used to rectify this jitter in the images. ‘We made a significant engineering investment as a company to develop the software capability that provides the farmer with a rectified image. Orthorectification is critically important, particularly with UAV applications to generate spatially accurate maps, so Headwall has designed a software package that allows that volatility to be smoothed out,’ said Bannon.
Projecting crop yield
But besides environmental concerns, helping agricultural companies to maintain or even improve their profitability will continue to drive the application of imaging solutions in the agricultural industry, according to Bannon: ‘I think the economic importance of plant vitality is something that will underscore the adoption of new technology approaches.’
For example, analysing chlorophyll fluorescence emissions can allow farmers to make more reliable predictions of crop yield. ‘A very small amount [of sunlight] is given off as chlorophyll fluorescence, and if you can read that fluorescence, it is a key indicator of crop yield for that planting season or planting field,’ explained Bannon.
Previously, the only method of measuring chlorophyll emission was through satellite-based systems, but Headwall recently worked with crop science companies to develop its high resolution Hyperspec instrument specifically for agricultural crop yield assessment.
‘Farming companies will use this sensor to assess how the harvest will look that year, to decide on what they can/cannot afford, and what they need to do to improve the yield based on the current state of the plant,’ Bannon added.
Compared with the hyperspectral imagers used on UAVs, the fluorescence imager targets much narrower spectral ranges to extract data relating to chlorophyll fluorescence. ‘For example, Headwall’s standard hyperspectral sensors cover 380-1,000nm with 2-3nm spectral resolution. The high resolution Hyperspec covers 730-780nm with 0.1nm resolution,’ explained Bannon.
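The 730-780nm window matters because it contains the O2-A atmospheric absorption band near 760nm, where a standard retrieval technique, the Fraunhofer line depth (FLD) method, can separate the faint fluorescence signal from reflected sunlight. The sketch below shows that general principle; it is not necessarily the algorithm Headwall's customers apply:

```python
def fld_fluorescence(e_out, e_in, l_out, l_in):
    """Standard Fraunhofer line depth (sFLD) fluorescence retrieval.

    e_out, e_in: solar irradiance just outside / inside the absorption
    band (the band is dark, so e_in << e_out).
    l_out, l_in: canopy radiance at the same two wavelengths.
    Assumes reflectance and fluorescence are constant across the two
    wavelengths; inside the dark band, fluorescence makes up a much
    larger share of the measured radiance, which lets it be solved for:
        F = (E_out * L_in - E_in * L_out) / (E_out - E_in)
    """
    return (e_out * l_in - e_in * l_out) / (e_out - e_in)

# Synthetic example: reflectance 0.5, true fluorescence 1.0
e_out, e_in = 100.0, 20.0
l_out = 0.5 * e_out + 1.0   # 51.0
l_in = 0.5 * e_in + 1.0     # 11.0
f = fld_fluorescence(e_out, e_in, l_out, l_in)   # recovers 1.0
```

Because the method hinges on resolving a narrow absorption feature, it illustrates why 0.1nm spectral resolution, rather than the 2-3nm of a general-purpose sensor, is needed for this application.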
The imager contains no transmissive elements; instead, a high-efficiency, all-reflective diffraction grating allows for the 0.1nm resolution. ‘There are no transmissive elements because transmissive elements such as prisms, when used in agriculture, can create significant image aberrations and thermal instability issues,’ Bannon said. ‘So, your accuracy is limited and you can’t repeat your data sets. With any instability, a very high resolution instrument is worthless to you.’
From the remote sensing of plant vitality and the evaluation of crop yield, to the management of resources that are limited, expensive or potentially harmful, hyperspectral and multispectral solutions are becoming an increasingly crucial tool within agriculture.