
Drones gain spectral sight

The prevalence of unmanned aerial vehicles (UAVs) has grown significantly in recent years, attracting the attention of manufacturers of small cameras. Running in parallel are developments in hyperspectral imaging technology, such as those by the Belgian research institute Imec, which offer advanced imaging capabilities to UAV users.

UAVs used to be exclusive to military applications but are now entering the commercial sector. When carrying a camera payload, drones can be used to monitor the health of a field of crops, observe the effects of deforestation, or survey national parks for wildlife populations.

Max Larin, CEO of camera maker Ximea, observed: ‘The advantages of using UAVs are so high and so obvious – it’s so attractive to many companies.’

According to Larin, the number of vendors making drones is now comparable to the number of manufacturers making the camera payloads to be placed on them. In 2013, Imec released a hyperspectral imager based on a CMOS sensor overlaid with hyperspectral filters, which Larin considers to be a ‘game changer’ in this area. Ximea has developed a hyperspectral camera using Imec’s sensors.

Before Imec’s hyperspectral sensor was developed, hyperspectral imaging involved complex optical set-ups, Larin explained, ‘typically with a slit aperture, some dispersion elements like a prism, and a 2D sensor to collect the information.’

This would once have kept hyperspectral imaging on the sidelines of UAV applications, but a number of companies are now making use of Imec’s developments. The advantage of Imec’s sensors is that all of the components can be squeezed into a small camera – Ximea’s camera measures one cubic inch, weighs 27g and consumes around 1W of power.

The commercial UAV market demands light weight, small size and low power consumption, so drones are one of the most obvious beneficiaries of compact hyperspectral technology. Every gram added and every watt of power used reduces the time a UAV can stay airborne, and therefore increases the overall time taken to collect all of the necessary data.

Larin said: ‘For environmental monitoring, forestry, anything where you need to analyse large areas – and need spectral information – will benefit from having compact and lightweight systems and will sooner or later adopt this [CMOS hyperspectral imaging] technology.’

Ximea’s business is orientated towards making fast, small form factor cameras that are robust and light, systems which therefore lend themselves to being mounted on aerial vehicles. But it wasn’t until recently that the company became involved in hyperspectral applications. According to Larin, around two and a half years ago the company extended its portfolio to incorporate hyperspectral imaging.

At the same time Imec developed its CMOS hyperspectral sensor. As the Imec sensor is based on CMOS technology that Ximea had used in the past, the company could integrate the sensor quickly into its product line.

However, Larin stated: ‘There was no initial intent to focus on the UAV or any other particular market; our primary interest was to extend our offering to our existing customers in medical and research applications. But it turned out that the interest from areas such as precision agriculture was so high that it was decided to add more features that specifically target mobile platforms like UAVs.’ Ximea is now providing a new set of APIs which allows its cameras to be integrated onto drones.

Embedded processing

Another area UAV makers are interested in is embedded processing. UAV-mounted units create a demand for more efficient algorithms to reduce the computational power required to handle incoming information. ‘It requires substantial effort to optimise and tune the algorithms before they can run in real time,’ Larin remarked.

Streamlining the processing required for image compression or analysis means less hardware is needed, which saves weight and space on the drone. Also, if the user wants a live feed of information, the image data must be compressed before being transmitted, often by radio, in order to keep data transmission rates reasonable.
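A rough back-of-envelope calculation illustrates the scale of the problem. The figures below (sensor resolution, bit depth, frame rate and radio bandwidth) are illustrative assumptions rather than numbers from the article, but they show why raw image data cannot simply be streamed to a ground station unprocessed.

```python
# Back-of-envelope estimate of raw data rate versus downlink capacity.
# All figures are illustrative assumptions, not vendor specifications.

frame_width  = 2048       # pixels (assumed)
frame_height = 1088       # pixels (assumed)
bit_depth    = 10         # bits per pixel (assumed)
frame_rate   = 30         # frames per second (assumed)
radio_link   = 10e6       # usable downlink in bits per second (assumed)

raw_rate = frame_width * frame_height * bit_depth * frame_rate  # bits per second

print(f"Raw sensor output: {raw_rate / 1e6:.0f} Mbit/s")
print(f"Downlink capacity: {radio_link / 1e6:.0f} Mbit/s")
print(f"Reduction needed:  {raw_rate / radio_link:.0f}x")
```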

Larin said: ‘The camera itself is a source of raw data which is very generic; so is the computational back-end, which is also generic in terms of the algorithms implemented there. The differentiation comes from the next layer: what type of library and data extraction methods are used.’

Another thing to consider is that most end users are unlikely to need the full spectral information of a scene. The results also need to be understandable to users who may not have experience handling raw data. Onboard processors perform real-time calculations and analysis to provide actionable data.

If the UAV is being used in agriculture, for example, the required information is whether or not a certain part of a field needs more water – essentially only a location and a ‘yes’ or ‘no’ answer is necessary. So, as Larin stated: ‘The amount of information that we need to store or send via downlink is extremely small; just a few bytes actually.’

To assist this streamlining, Ximea’s customers will work with the end user to create a library of stored reference results before the system is installed. For instance, if the spectral signature shows a high water content, the area doesn’t need watering. ‘All of this requires a research phase and analysis in order to keep those reference signatures,’ Larin said.

‘The predetermined results can be stored within the camera, allowing for instant analysis and thus avoiding extremely large datasets, which would otherwise need to be recorded at a ground station and processed offline,’ he added.
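A minimal sketch of the kind of on-board matching described above is given below: a pixel’s spectral signature is compared against a small library of stored reference signatures and only a yes/no decision is produced per location. The spectral-angle metric, the reference values and the threshold are illustrative assumptions; the article does not say which matching method Ximea’s customers actually use.

```python
import numpy as np

# Sketch only: match a pixel spectrum against stored reference signatures and
# return a single yes/no answer. Reference values, the spectral-angle metric
# and the threshold are illustrative assumptions.

REFERENCES = {
    "well_watered": np.array([0.12, 0.18, 0.35, 0.55, 0.60]),  # assumed band values
    "dry":          np.array([0.25, 0.30, 0.33, 0.35, 0.34]),  # assumed band values
}

def spectral_angle(a, b):
    """Angle between two spectra; smaller means more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def needs_water(pixel_spectrum, threshold=0.1):
    """True only if the spectrum matches the 'dry' reference most closely."""
    angles = {name: spectral_angle(pixel_spectrum, ref)
              for name, ref in REFERENCES.items()}
    best = min(angles, key=angles.get)
    return best == "dry" and angles["dry"] < threshold

# The downlink payload is then just a grid location plus one bit,
# rather than the full spectrum of every pixel.
```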

While Ximea adapted from research and medical applications to UAVs, the Dutch company 3D-One drew on experience from the space programmes in which its founding company, Cosine Measurement Systems, is involved. Both use Imec’s small hyperspectral sensor.

3D-One integrates optical sensors for daylight imaging, as well as shortwave infrared, thermal infrared and hyperspectral cameras, together with a data processing engine and mission sensors for OEM customers. The company builds customised solutions that are optimised for performance, weight and size.

‘We like to make things small and low power – this comes from the background of our founders at Cosine which are developing equipment for space applications,’ explained Marco van Hout, business development manager at 3D-One. ‘For space applications it is obligatory to make equipment rugged, low weight, and low power.’

3D-One uses two Imec sensors in order to cover a large spectral band. Van Hout said: ‘You have to cover a broad spectrum by default, but the current Imec sensors only cover part of the spectrum, either 460nm to 600nm or 600nm to 1,000nm range. Many applications are interested in covering from 460nm to 1,000nm as they require bands in both the lower and the upper wavelength range to make a solid classification of what they see.’

Broader spectrum sensors could allow lower power systems but, as van Hout warned, ‘there is always a drawback’. The sensors that Imec make have a mosaic pattern which is ideal for imaging moving objects; the snapshot mode allows all the data to be obtained from one image acquisition which avoids motion artefacts. ‘Putting additional filters on one sensor to increase the spectral resolution would result in larger macropixels. This by itself will give you another challenge in your algorithms making them more complex,’ van Hout said.

‘Our solution with the two sensors means you can cover the whole spectral range while having the benefits of small macropixels. By this method we have increased the spectral range and the spatial accuracy,’ he added.

3D-One’s system has an embedded data processing computer on board to host the classification algorithms and carry out de-cubing or reflectance calculations. This makes the systems ideal for deployment on UAVs for precision agriculture, and for remote sensing applications such as environmental monitoring and surveillance.
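A rough sketch of those two steps is shown below. It assumes a 4 x 4 mosaic filter layout (16 bands per macropixel) and the standard dark/white flat-field correction; the actual mosaic size, calibration data and processing order on 3D-One’s hardware are not given in the article.

```python
import numpy as np

# Sketch of de-cubing and reflectance conversion for a mosaic-filter snapshot
# sensor. The 4 x 4 mosaic layout and the flat-field formula are assumptions.

def decube(raw, mosaic=4):
    """Rearrange a 2D mosaic frame into a (rows, cols, bands) hypercube.

    Each mosaic x mosaic block of sensor pixels becomes one spatial macropixel
    with mosaic**2 spectral bands, so spatial resolution drops accordingly.
    """
    h, w = raw.shape
    rows, cols = h // mosaic, w // mosaic
    cube = raw[:rows * mosaic, :cols * mosaic].reshape(rows, mosaic, cols, mosaic)
    return cube.transpose(0, 2, 1, 3).reshape(rows, cols, mosaic * mosaic)

def to_reflectance(cube, dark, white):
    """Standard flat-field correction: (raw - dark) / (white - dark).

    `dark` and `white` are previously recorded dark-frame and white-reference
    cubes with the same shape as `cube` (or broadcastable per-band values).
    """
    return np.clip((cube - dark) / np.maximum(white - dark, 1e-6), 0.0, 1.0)
```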

‘We have some core processes running at FPGA level. It’s an enormous amount of data that you have to process in real time, and that’s better done with an FPGA,’ said van Hout. The FPGA takes some of the load from the onboard embedded computer. The system also uses a mass storage device capable of storing one terabyte of data, which can be downloaded after each flight.

The real-time processing, combined with its terabyte storage capacity, means the system can stay operational for much longer than a UAV can stay in the air on its battery.

Data transfer

Where there is data acquisition, there is a fresh battleground for interfaces to fight it out, and the UAV market is no different. Ximea uses USB3, which Larin said brings simplicity and ease of integration with embedded computers, such as Arm processors and boards that already exist within the UAV infrastructure.

He added: ‘Camera Link is a historical interface that is good for the applications for which it was developed, specifically machine vision, but as soon as it comes to something lightweight and compact, it [Camera Link] becomes too bulky, too heavy, too difficult to integrate.’

However, for certain applications often found within the military sector where bigger UAVs are used and size is less of a concern, Camera Link offers advantages when using traditional imaging, as opposed to hyperspectral.

Andrew Buglass, strategic business manager at Active Silicon, said: ‘Where our technology sits is for acquiring data from the camera and presenting it to an onboard embedded computer. If, for example, the camera uses the Camera Link standard, then one of our embedded, ruggedised frame grabbers would be a good candidate to be integrated into the system.

‘If there is some sort of processing going on, this may well be because there is a lot of data involved that they want to process before they send it down [to a ground station]. If they’re using high bandwidth cameras they are typically using some sort of frame grabber hardware to capture data.’

Because some of the military UAVs are massive, the weight restrictions can be quite minimal. But commercial ones tend to be smaller, which Buglass said was often related to the cost of the technology available, and therefore require tighter limits on the hardware used. He said: ‘What they are doing is using small embedded technology, so our 104 board [part of the Phoenix family] is a form factor that works well in that environment. It’s about 10 x 10cm with a stackable design, and, using this architecture, you can achieve good performance along with relatively low power, which is also important.’

Buglass noted that there are themes developing in what Active Silicon is being asked to do for these UAV applications, which involves onboard processing. ‘In some of the scenarios we have come across, customers might use a lower bit rate downlink while acquiring at high resolution and storing data locally on the machine,’ he said.

The user might want to see what is going on in real time, but with most cameras now offering HD images, transmission of these images would take time due to their size. Buglass explained: ‘Most customers want two paths, a high resolution path that is used for analytical processing and perhaps recording, and a low resolution stream to relay to the ground station for real-time viewing.’
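The snippet below sketches that dual-path idea: the full-resolution frame is stored on board for later analysis, while a downscaled, compressed copy is sent over the low-bandwidth downlink. The storage path, scale factor and JPEG quality are illustrative assumptions, and OpenCV stands in here for whatever acquisition and compression hardware an actual system would use.

```python
import cv2

# Sketch only: one high-resolution path for recording/analysis, one
# low-resolution path for real-time viewing at the ground station.
# File paths and quality settings are assumed values.

def split_paths(frame, frame_index, downlink_scale=0.25, jpeg_quality=60):
    # Path 1: store the full-resolution frame locally for later processing.
    cv2.imwrite(f"/data/raw/frame_{frame_index:06d}.png", frame)

    # Path 2: downscale and compress a preview for the radio downlink.
    preview = cv2.resize(frame, None, fx=downlink_scale, fy=downlink_scale,
                         interpolation=cv2.INTER_AREA)
    ok, payload = cv2.imencode(".jpg", preview,
                               [int(cv2.IMWRITE_JPEG_QUALITY), jpeg_quality])
    return payload.tobytes() if ok else None  # bytes handed to the transmitter
```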

In particular, the demand for compression hardware is something that Buglass has noticed is on the rise. ‘Integrated hardware compression is on our roadmap now in order to capitalise on the growing demand. Back in the analogue days data could be streamed directly from a typical “standard resolution” analogue camera, but now with higher resolutions, frame rates and bit depths, this is no longer possible and customers are asking us to integrate hardware compression into our image acquisition technology.’


