The sensor stories behind the Mars Perseverance images

The first colour image, captured by a CMV20000 sensor, to be sent back by the hazard cameras on the Perseverance rover after its landing on Mars on 18 February. Credit: NASA/JPL-Caltech

Greg Blackman speaks to Guy Meynants, formerly of Cmosis, and Paul Jerram, of Teledyne e2v, about the history of the image sensors onboard the Mars rover

Perseverance made it to Mars, and we all cheered. Years of engineering effort were safely flown and gently deposited onto the surface of the red planet in a tense and complicated sequence of entry, descent and landing procedures, described as 'the seven minutes of terror', that got Perseverance down onto Mars in one piece.

The images returning from the rover tell their own story, about the imaging community that supplied the sensors, and, in a way, about the history of image sensor development in general.

Belgian firm Cmosis built its CMV20000 sensor originally as a custom product for traffic monitoring, releasing it in the summer of 2012, around the time when Perseverance's predecessor, Curiosity, was touching down on Mars.

Perseverance is an upgraded version of Curiosity. Early in 2013 an internal study looked at ways of modernising the engineering cameras that had flown on Curiosity, as the design of those cameras was at that time more than 20 years old. It was felt that the era of these cameras was coming to a close, wrote scientists from Nasa's Jet Propulsion Laboratory (JPL) in a paper published in Space Science Reviews in November 2020.

One of the upgrades was to use the CMV20000 20-megapixel sensor in the engineering cameras, the navigation and hazard detection eyes of the rover. Guy Meynants, one of the founders of Cmosis and now working for image sensor firm Photolitics, told Imaging and Machine Vision Europe that JPL initially requested information about the sensors much like any other customer, but at that time the team at Cmosis didn't know what it would be used for.

Then, later on, 'we started to get very specific questions, which came to me,' Meynants recalled. 'There was some urgency, and I was CTO at Cmosis at the time. It was then that we started to realise it might be used for something like this [the Mars rover].'

This was in 2015, around when Cmosis was taken over by Ams – the CMV20000 sensors are still available from Ams. 'Initially we didn't know what it would be used for, but we [Cmosis] did have a heritage in providing imagers for space,' Meynants added.

Meynants' own history with designing CMOS sensors for space exploration began in the 1990s while working for Imec as a researcher. He was involved in a number of projects for the European Space Agency (ESA), and some of those sensors ended up in space. The Mars Webcam on the Mars Express has been orbiting Mars for the past 17 years, since the end of 2003, taking pictures of the red planet. The imager onboard is a chip Meynants developed during his PhD at Imec, and it's still operational. It was made as a technology demonstrator, Meynants said, to show that CMOS sensors could be used in space – CMOS sensors are more tolerant to exposure to radiation than CCDs, which tend to suffer from charge transfer degradation over time.

Cmosis had also been involved in projects for ESA, for a docking camera to guide spacecraft docking with space stations or satellites, for instance. The company had also worked on projects for JPL and Nasa in the past.

A panorama, taken on 20 February by the navigation cameras onboard the rover, was stitched together from six individual images. Credit: NASA/JPL-Caltech

The cameras on Perseverance have three main improvements over those that flew on Curiosity, say the JPL scientists in the Space Science Reviews paper. Firstly, the Cmosis CMV20000 sensors are colour chips, which gives better contextual imaging capabilities than the monochrome predecessors. The second improvement is that the cameras have a wider field of view – 90° x 70° as opposed to 45° x 45° – which means only five overlapping images are needed to create a 360° panoramic view (Curiosity needed 10 images to achieve the same effect). The third improvement is that the 20-megapixel sensors can resolve greater detail than the older model.
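The panorama arithmetic falls out of the field-of-view figures above. A minimal sketch, assuming an illustrative 20 per cent overlap between neighbouring frames (the paper does not state the actual overlap):

```python
import math

def panorama_image_count(h_fov_deg: float, overlap_frac: float) -> int:
    """Minimum frames to cover a full 360-degree panorama, given each
    frame's horizontal field of view and the fraction of each frame
    that overlaps its neighbour."""
    new_sky_per_frame = h_fov_deg * (1.0 - overlap_frac)
    return math.ceil(360.0 / new_sky_per_frame)

print(panorama_image_count(90, 0.2))  # Perseverance navigation cameras: 5
print(panorama_image_count(45, 0.2))  # Curiosity navigation cameras: 10
```

With that assumed overlap, the 90° cameras need five frames and the 45° cameras ten, matching the figures quoted from the paper.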

Meynants also made the point that CMV20000 is a global shutter sensor, which isn't highlighted in the Space Science Reviews paper. The rover moves around the surface of Mars largely autonomously using six hazard avoidance cameras (controllers on Earth can't make real-time decisions to drive the rover because of the time delay in sending signals to and from Mars).

All the data is acquired synchronously on a global shutter sensor, which 'is a benefit for any kind of autonomous driving algorithm,' Meynants explained.

'You don't need to worry about rolling shutter problems,' he continued. 'You can ensure all six cameras acquire images at the same time.' With a rolling shutter sensor, exposure times can be synchronised but every row of pixels will capture a signal at a slightly different time, which could create problems when processing the data for autonomous driving.
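The scale of that rolling-shutter error is easy to estimate. A minimal sketch, using illustrative values for row count, readout time and rover speed (none of these figures appear in the article):

```python
def rolling_shutter_skew_m(rows: int, readout_time_s: float,
                           speed_m_s: float) -> float:
    """Apparent position error between the first and last pixel row of a
    rolling-shutter frame when the platform moves at constant speed.
    A global shutter exposes every row at once, so its skew is zero."""
    row_delay_s = readout_time_s / rows          # time between adjacent rows
    return (rows - 1) * row_delay_s * speed_m_s  # first row vs last row

# Illustrative numbers: 3,840 rows, 33ms readout, rover at ~0.04m/s
print(f'{rolling_shutter_skew_m(3840, 0.033, 0.04):.4f} m')
```

Even at rover speeds the skew is around a millimetre per frame, which an autonomous driving algorithm fusing six cameras would otherwise have to model and correct.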

Seven minutes of terror

The Cmosis sensors are in the engineering cameras on the rover, but some of the first footage released, which made the descent even more dramatic for those watching, came from six Flir (formerly Point Grey) Chameleon3 cameras. These captured the entry, descent and landing phase, from the parachute being deployed to the final sky crane manoeuvre over the landing site, where the rover was lowered onto the surface from a jetpack.

A key part of the landing system was the Lander Vision System camera (LCAM) mounted on the underside of the rover. As the rover descended on the parachute the LCAM images were correlated to a reference map preloaded on the vehicle to locate itself and land safely. Like the Cmosis technology, the sensor used in LCAM also has Belgian heritage and a tentative connection to Meynants: On Semiconductor's Python 5000 global shutter chip was developed at the company's Mechelen site, the former Fillfactory facility – and Meynants was co-founder of Fillfactory before it was bought by On Semiconductor.
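The map-matching idea behind LCAM can be illustrated with brute-force normalized cross-correlation: slide the descent frame over the reference map and keep the offset with the highest score. A toy sketch on synthetic data (the real system is far more sophisticated, handling scale, rotation and perspective):

```python
import numpy as np

def locate(patch: np.ndarray, ref: np.ndarray) -> tuple:
    """Return the (row, col) offset in `ref` whose window best matches
    `patch` under normalized cross-correlation, by exhaustive search."""
    ph, pw = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-9)  # zero-mean, unit-std
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(ref.shape[0] - ph + 1):
        for c in range(ref.shape[1] - pw + 1):
            w = ref[r:r + ph, c:c + pw]
            score = np.sum(p * (w - w.mean()) / (w.std() + 1e-9))
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

# A patch cut from a synthetic 'terrain map' is found at its true offset
rng = np.random.default_rng(0)
terrain = rng.random((40, 40))
print(locate(terrain[12:20, 5:13], terrain))  # (12, 5)
```

The exhaustive search is hopelessly slow at real map sizes; practical systems use FFT-based correlation or feature matching, but the principle of locating the camera by matching what it sees against a preloaded map is the same.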

In terms of the analytical instruments onboard the rover, a combination of CMOS and CCD detectors is used. Cmosis has a sensor – CV4000, a 2k x 2k device – in the Supercam, which will examine the chemical composition of rocks and soil. Meynants said that the route to include CV4000 in Supercam was a more classical space science project, with a full qualification of the microlenses and scientific work on whether the sensor was suited to the task.

Supercam also contains a CCD from Teledyne e2v, part of Teledyne Imaging, as does the Sherloc instrument – Scanning Habitable Environments with Raman and Luminescence for Organics and Chemicals – which will look for minerals, organic molecules and potential biosignatures. Teledyne Imaging companies also produced key optical components for the Sherloc UV spectrometer, including many of the lenses and mirrors.

The Teledyne CCDs on both Supercam and Sherloc are 2k pixels wide by 512 pixels high. Here, sensitivity is key, not only in the visible spectrum but also out into the UV and infrared, according to Paul Jerram, chief engineer at Teledyne e2v.

Jerram told Imaging and Machine Vision Europe that this particular CCD has a heritage stretching back to the early 1990s. It was originally made for the Royal Greenwich Observatory as a large area detector. After that, what was then e2v made variants of the detector for spectroscopy.

A version of the sensor was used onboard Curiosity, so the work for Perseverance had to some extent already been proven. 'We provide a detector that's really efficient,' Jerram said, with almost 100 per cent detection efficiency. The signal returning to the detector from pulses of laser light is not huge, and instruments like Supercam and Sherloc need a sensor with very low noise that captures all of the signal.

A still image from video taken during the descent stage as Perseverance touched down on 18 February. Credit: NASA/JPL-Caltech

Jerram said e2v began making sensors for space science in the 1980s. 'Effectively that's meant our technology has evolved in a way that makes it suitable for space exploration – things like making sure we have detectors with very high detection efficiency, making sure we have detectors that can go into the near infrared and ultraviolet,' he said. 'If you are in space it is generally quite expensive to collect signal, whether that's collecting astronomy signal or anything else, so you do need to make sure the detector is really efficient.'

Teledyne Imaging also has a lot of expertise in designing sensors that will withstand exposure to radiation and temperature extremes, as well as vibration from launch and landings. The Teledyne e2v, Dalsa and Imaging Sensors business units offer back-thinned, backside-illuminated CMOS sensors and CCDs, as well as infrared HgCdTe detectors, among other technologies for space applications.

One of Teledyne e2v’s largest ongoing space projects is a new design of detector for the Plato space telescope – Planetary transits and oscillations of stars – being built by the European Space Agency. Plato, due to launch in 2026, will look for exoplanets, planets orbiting around other stars. 'It's [Plato] a really huge focal plane, so they [ESA] want a big detector,' Jerram said. The instrument will have the largest combined digital camera ever flown in space, with a field of view covering approximately 2,250 square degrees of sky and a resolution of 2.12 gigapixels.

Each of Teledyne’s CCDs for this telescope measures around 8 x 8cm – 4.5k x 4.5k pixels, with an 18µm pitch. CCD wafers start off at a size of six inches, and for Plato the detector fills almost all of the wafer, Jerram said. 'That's an example of a project that's taken 15 years from start to finish,' he added.
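The wafer claim checks out with simple arithmetic. A minimal sketch, taking '~4.5k' to mean roughly 4,500 pixels per side:

```python
pixels_per_side = 4500      # ~4.5k pixels (assumed exact count)
pitch_um = 18               # pixel pitch from the article
side_mm = pixels_per_side * pitch_um / 1000.0  # sensor edge length

# The CCD's diagonal must fit inside the circular six-inch wafer
wafer_diameter_mm = 150.0
diagonal_mm = side_mm * 2 ** 0.5
print(f'{side_mm:.0f} mm per side, {diagonal_mm:.0f} mm diagonal')
```

An 81mm square with a roughly 115mm diagonal leaves little usable area on a 150mm wafer, consistent with one detector consuming almost the whole wafer.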

Technology for space has a long development cycle. Meynants said that a space project early on in Cmosis's existence, around 2010, pushed the company to work with cutting-edge sensor technology and further its CMOS fabrication capabilities. The sensor was a fully custom CMOS chip for ESA's Solar Orbiter satellite, boasting a very high dynamic range pixel. 'We were one of the first to demonstrate [this pixel] on a chip,' Meynants recalled.

The sensor was to be used for extreme UV imaging, so the chip couldn't be illuminated from the front as the radiation would have been absorbed by the glass cover layers. 'We had to thin the device and illuminate from the backside,' he said. 'At the time, we thought this was a good opportunity to learn to develop that [backside illumination] technology – it was around 2010, 2011 and BSI was really new, and we thought this is a way that we can learn how to make it and also execute the project.'

The flight devices were made and the satellite was launched last year, making for a development time of ten years. 'These are 2k x 2k, 10µm pixels – really big devices, 3 x 3cm – and there are four of those on that satellite,' Meynants said.

'That was state-of-the-art technology [in 2010], BSI high dynamic range pixels,' he added. 'Now you can find that technology in mobile phones.' Space science will continue to push boundaries, even if the technology might be mainstream by the time it is launched into space.

--

Guy Meynants is currently on the board of Photolitics, which provides custom image sensor designs, and is also working part time at KU Leuven, researching image sensors for harsh environments such as those found in space.

Paul Jerram has worked in the imaging division of Teledyne e2v for nearly 25 years. As chief engineer for space imaging he ensures Teledyne e2v’s technology meets customers’ needs, and shares the firm's roadmap with clients.
