Color Camera for Curiosity's Robotic Arm

The flight MAHLI camera head. Credits: NASA/JPL-Caltech/Malin Space Science Systems.

The Mars Hand Lens Imager (MAHLI) is a focusable color camera located on the turret at the end of the MSL robotic arm. The instrument acquires images of up to 1600 by 1200 pixels with a color quality equivalent to that of consumer digital cameras. The table below summarizes the basic characteristics of the instrument.

Parameter                      Value/Description
Focus                          Adjustable; working distances 20.4 mm to ∞
Focus group range of motion    11.44 mm
Bandpass                       380–680 nm
Pixel scale                    Variable, from 13.9 μm/pixel to ≫13.9 μm/pixel

Focus-position dependent parameters:

Parameter                      25 mm working distance     ∞ working distance
Pixel scale                    15 μm/pixel                —
Depth of field                 1.6 mm                     >4800 mm
Field of view                  34.0° diagonal             39.4° diagonal
Focal ratio                    f/9.8                      f/8.5
Effective focal length         18.3 mm                    21.3 mm
Back focal length              19.8 mm                    8.4 mm
MAHLI Example Image
Flight MAHLI image of a zinc ore sample from Franklin, New Jersey, illustrating the color “hand lens” properties of the camera. Crystal shapes and colors are evident, and light glints off crystal faces. Note the 1 mm scale bar. The red mineral is zincite (ZnO).

MAHLI images can be acquired at working distances between ~20.5 mm and infinity, permitting acquisition of closeup views with a pixel scale/spatial resolution as high as 13.9 microns per pixel, as well as selection of context views at greater working distances. Owing to likely uncertainties in robotic arm placement, the very highest resolution images might be challenging to obtain on Mars (we expect to learn more during testing of the arm placement capabilities prior to launch). The figure below shows an example image acquired by the flight MAHLI during pre-delivery testing. The ability to focus at infinity, combined with the instrument's location on the MSL robotic arm, also permits imaging of areas inaccessible to the other cameras on MSL, including in-focus views under the rover. Because the robotic arm can raise MAHLI higher above the ground than the Remote Sensing Mast (RSM), the camera can also image from a vantage point above that of the RSM cameras.
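A quick way to relate the pixel scales quoted above to the area a single frame covers is to multiply scale by detector dimensions. The helper below is illustrative only, not flight software:

```python
# Hypothetical helper (not flight code): estimate the scene area covered by
# a full 1600 x 1200 MAHLI frame from the pixel scale quoted in the text.

def frame_coverage_mm(pixel_scale_um, width_px=1600, height_px=1200):
    """Return (width_mm, height_mm) of the imaged area at a given pixel scale."""
    return (width_px * pixel_scale_um / 1000.0,
            height_px * pixel_scale_um / 1000.0)

# At the finest quoted scale of 13.9 microns/pixel, a full frame spans
# roughly 22.2 mm x 16.7 mm, a true "hand lens" field of view.
w, h = frame_coverage_mm(13.9)
```

At context-imaging working distances the same arithmetic gives proportionally larger coverage at coarser pixel scales.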


Goals and Objectives

The primary objective of the MAHLI investigation is to acquire images, particularly at (but by no means limited to) hand lens scale, that facilitate the interpretation of the petrography and mineralogy of rocks and regolith fines at the MSL investigation site, attributes that are critical for describing the materials and deciphering the processes that have acted on them. Additionally, images from MAHLI will be used to help select materials to be sampled or examined by the other instruments (particularly APXS, CheMin, and SAM) and to document the sampled or examined targets and collected materials.


Color: Color imaging is achieved using a Bayer pattern filter, a common approach found in many consumer digital cameras. The luminance component of a resulting demosaiced image preserves the pixel-to-pixel geometric/spatial relations that one would see in a comparable single-band image.

Image Resolution: The MAHLI design permits imaging over a range of spatial scales, from about 13.9 microns/pixel at the closest working distance to much coarser scales at greater distances. Malin Space Science Systems follows a strict definition of resolution for in-focus images, wherein the optical blur circle is equal to or smaller than one pixel across. Acquiring images at 13.9 microns per pixel under actual operational conditions on Mars will likely be challenged by as-yet unmeasured uncertainties in the ability to place the camera using the robotic arm.

Night Illumination: The MAHLI includes two sets of two white light LEDs to permit nighttime imaging. Each pair can be independently commanded on/off. MAHLI also has two ultraviolet (365 nm) LEDs to look for materials that fluoresce under longwave UV illumination. The UV LEDs are included on an exploratory, "best efforts" basis and are not a calibrated investigative tool. The MSL Project is required to accommodate night operations of MAHLI, but thermal and power constraints might preclude more than just occasional night-time operation of the instrument and the robotic arm.

Focus and Autofocus: The MAHLI optics can be focused. An autofocus capability permits acquisition of in-focus images. Thus, MAHLI images will be in focus regardless of the exact positioning of the camera head by the rover's robotic arm.

Onboard Focus Stacking: Depending on the working distance and a target's surface relief, a close-focus view of a geologic target might not be in focus over the entire image. For those cases, MAHLI can be commanded to acquire a series of images taken at up to 8 focus positions that bracket the location of best focus. MAHLI can then be commanded to use onboard software to merge them into a single best-focus image (focal plane merge or "z-stack"). The MAHLI focus stacking algorithm also produces a range map of the target, providing a measure of the target's microtopography.

Camera Head Placement: The camera is positioned for imaging a target using the MSL robotic arm. A contact sensor with two "pokers" (similar in design to the single-poker contact sensor used for the Mars Exploration Rover Microscopic Imagers) prevents the robotic arm from driving the MAHLI camera head, particularly its front optical element, into the target. The camera head may be placed by the robotic arm in positions that allow nested (context) imaging of increasing spatial resolution (hence decreasing viewing area), as well as positioning for acquiring stereo pairs and mosaics.

Data Storage: Owing to the commonality in designs between the MSL MARDI, Mastcam, and MAHLI, and the requirements for considerable image storage space for the Mastcam and MARDI (which operates and stores data during MSL's descent to the Martian surface without interaction with the spacecraft), MAHLI has an 8 gigabyte non-volatile NAND flash memory storage capability. The 8 gigabytes of storage is in addition to the camera's volatile 128 megabyte synchronous dynamic random access memory (SDRAM) buffer. This large data storage capability will permit acquisition of MAHLI images bracketed for exposure and robotic arm placement uncertainty; "thumbnail" images can be returned to Earth, examined, and then the best image of a given set of bracketed products can be requested for later return to Earth. The large data storage also means it is possible to store the data uncompressed, return the image in compressed form, and, if necessary, retrieve the image a second (or more) time with a different compression scheme.
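The storage figures above can be put in perspective with back-of-the-envelope arithmetic, using 12 bits/pixel for raw frames and the approximate 1.7:1 lossless compression factor quoted elsewhere in this document. This is an illustration, not a statement of the actual onboard accounting:

```python
# Back-of-the-envelope check (not flight code) of how many images fit in the
# 8 GB flash buffer, assuming 1600 x 1200 pixels at 12 bits/pixel raw and
# ~1.7:1 for the losslessly compressed form.

FRAME_PX = 1600 * 1200
RAW_BITS = 12

raw_mbytes = FRAME_PX * RAW_BITS / 8 / 1e6        # ~2.88 MB per raw frame
lossless_mbytes = raw_mbytes / 1.7                # ~1.69 MB per frame

raw_frames_in_8gb = int(8e9 / (raw_mbytes * 1e6)) # ~2777 raw frames
```

Even stored raw, thousands of full frames fit in the buffer, which is what makes the bracket-then-select thumbnail workflow practical.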

Image Sub-Frames: To further provide users with the flexibility to consider image commanding trade-offs for a given downlink condition, images can be commanded at the full 1600 by 1200 pixel size, or as sub-frames with smaller pixel dimensions.

Video: Owing to the design commonality with the MARDI and Mastcams, the MAHLI can acquire 720p, ~7 Hz high definition video.

Science Team: The Mastcam, MAHLI, and MARDI investigations share a single science team, which brings together people with extensive terrestrial geoscience field expertise and people with considerable recent experience developing and operating Mars rovers and Mars cameras.


The MAHLI instrument consists of three major parts: a camera head, a Digital Electronics Assembly (DEA), and a calibration target. The camera head and DEA are connected by a JPL-provided cable. The DEA is housed within the rover Warm Electronics Box (WEB). The camera head is mounted with other tools on the turret at the end of the rover's robotic arm. The calibration target is mounted on the robotic arm azimuth actuator housing.

Camera Head

The MAHLI camera head consists of three functional elements: an optomechanical assembly, a focal plane assembly, and the camera head electronics assembly. The latter two are common to the MSL MAHLI, MARDI, and Mastcam.

Optomechanical Assembly: The optomechanical assembly includes the integrated optics, focus and dust cover mechanisms, and a single drive motor to adjust focus and open/close the dust cover.

Optics. The designed effective focal length of the optics ranges from 18.3 mm at the closest working distance to 21.3 mm for focus at infinity. Over that same range, the focal ratio and field of view range from f/9.8 and 34.0° to f/8.5 and 39.4°. The optical design consists of a group of six fixed elements, a movable group of three elements, and the front element, a fixed sapphire window. Undesired near-infrared radiation is blocked by a coating deposited on the inside surface of the sapphire window. The combination of glass element transmission properties, the infrared cutoff filter, and the RGB microfilters results in a spectral range for MAHLI images of 380–680 nm. The depth of field varies as a function of working distance, with the highest resolution MAHLI views having a depth of field of about 1.6 mm; at the pixel scale of MER MI images (~30 µm/pixel), the depth of field is about 2 mm.
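The quoted ~1.6 mm depth of field can be roughly reproduced with a standard close-up (macro) thin-lens approximation. The circle-of-confusion choice of two pixels below is an assumption for illustration, not a documented MAHLI specification:

```python
# Thin-lens macro approximation (illustrative, not from MAHLI calibration):
#   DOF ~= 2 * N * c * (1 + m) / m**2
# where N is the focal ratio, c the circle of confusion, m the magnification.

def macro_depth_of_field_mm(f_number, coc_mm, magnification):
    """Total depth of field for close-up imaging, thin-lens approximation."""
    return 2.0 * f_number * coc_mm * (1.0 + magnification) / magnification**2

pixel_pitch_um = 7.4                   # KAI-2020CM pixel size quoted below
pixel_scale_um = 13.9                  # finest pixel scale from the table
m = pixel_pitch_um / pixel_scale_um    # magnification ~0.53
coc = 2 * pixel_pitch_um / 1000.0      # assumed ~2-pixel blur circle, in mm

dof = macro_depth_of_field_mm(9.8, coc, m)   # ~1.6 mm, matching the table
```

With these assumed inputs the formula lands near the tabulated 1.6 mm, which suggests the quoted figure corresponds to roughly a two-pixel blur criterion.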

MAHLI Schematic
MAHLI optics diagram with focus group in close-focus position.

Mechanical. The optics and all moving parts are sealed within the assembly to prevent dust contamination. The system is driven by a MER flight heritage Aeroflex 10 mm stepper motor with a 256:1 gearhead. The motor and gearing govern the distance the lens focus group travels. Wet lubrication (grease) of the movable parts in the motor and optomechanical assembly requires operating temperatures above –70 °C (preferably above –50 °C). Rover power constraints permitting, heaters can be applied to raise the camera head temperature into the operating range for night and early morning operations (this may also depend, however, on the power needed to bring the robotic arm up to safe operating temperatures).

MAHLI Passbands
MAHLI spectral passband 380-680 nm. The shaded regions are not visible to MAHLI. The colored curves represent the red, green, and blue microfilters of the Kodak KAI-2020CM detector; the black curve represents the transmission properties of the lens elements and infrared cut-off coating.

Dust Cover. The Aeroflex actuator also controls the commandable opening and closing of a dust cover designed by Alliance to protect the front optical element and LEDs from dust contamination when the instrument is not in use. The cover has a window composed of clear, transparent Lexan®, so that, if the cover fails to open, images can still be acquired (although with the risk that adhering dust will obscure the view) and the LEDs can still illuminate targets.

Focal Plane Assembly: The Focal Plane Assembly (FPA) includes the CCD and associated electronics to amplify and digitize its output. The optics image onto a Kodak KAI-2020CM interline transfer CCD. The detector array has 1600 by 1200 active 7.4 micrometer square pixels. The FPA also includes the circuit elements that are tightly coupled to the CCD (e.g., the initial stage of amplification).

Camera Head Electronics: The camera head electronics include the CCD driver electronics, the motor driver electronics, and the electronics to accept commands from and transmit data to the DEA and to power the LEDs. The camera head outputs uncompressed 12-bit pixel values at rates up to 120 Mbps over a six pair parallel interface, corresponding to a frame rate of 5 Hz. The four MAHLI white light LEDs are Avago Technologies HSMW-10x White Surface Mount LED Indicator SMT PLCC-2 (specification sheet AV02-0490EN). The two UV LEDs are Nichia Model NSHU550B.
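The interface numbers quoted above are mutually consistent, as a quick arithmetic cross-check (illustrative only) shows:

```python
# Cross-check of the camera head interface figures quoted in the text:
# 1600 x 1200 pixels x 12 bits per pixel at a 5 Hz frame rate.

bits_per_frame = 1600 * 1200 * 12      # 23.04 Mbit per uncompressed frame
rate_mbps = bits_per_frame * 5 / 1e6   # 115.2 Mbit/s, within the 120 Mbps link
```

At 5 Hz the uncompressed pixel stream uses about 115 of the available 120 Mbps, leaving a small margin for overhead.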

Digital Electronics Assembly

The MAHLI Digital Electronics Assembly (DEA) is mounted within the rover Warm Electronics Box (WEB). The DEA incorporates all of the circuit elements required for data processing, compression, and buffering. It also includes all power conversion and regulation for both the DEA data processing electronics and the camera head. The DEA accepts images made up of 12-bit pixel values from the camera head, converts them to 8-bit images, applies the commanded image compression, and buffers the results in DEA nonvolatile memory. High speed pixel processing, including Bayer pattern filter interpolation and image compression, is performed in hardware in a field programmable gate array (FPGA). The MAHLI z-stacking (focus merging) is done in software.
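The text does not specify how the 12-to-8 bit conversion is done. A square-root lookup table is a common companding choice for photon-noise-limited CCDs and is sketched here purely as an assumed illustration of the idea, not as the DEA's actual curve:

```python
# Assumed illustration (the actual DEA companding curve is not given in this
# text): a square-root lookup table mapping 12-bit values (0..4095) to 8-bit
# (0..255). The square root allocates more output codes to the low end, where
# photon shot noise is smallest relative to signal.

def sqrt_companding_lut():
    """Return a 4096-entry list mapping 12-bit input to 8-bit output."""
    return [round(255 * (v / 4095) ** 0.5) for v in range(4096)]

lut = sqrt_companding_lut()
# Mid-scale input (1024 of 4095, i.e. 25% full well) already maps to
# half-scale output (128 of 255), illustrating the low-end emphasis.
```

Any monotonic 12-to-8 bit table can be inverted approximately on the ground, which is why such conversions are considered acceptable ahead of lossy JPEG compression.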

Contact Sensor

The MAHLI contact sensor was designed and is being fabricated by JPL. It mounts around the outside of the MAHLI camera head housing. Its design is based on the contact sensor used in flight for the two MER Microscopic Imagers (MI). For the MI, the contact sensor consisted of a single "poker" that, upon contact with the target surface, signaled the robotic arm to stop moving toward the target and back away slightly. The MAHLI contact sensor has two such "poker" devices. This design works best for solid targets, such as rock, but cannot be used to make contact with unconsolidated regolith fines. Work-arounds for obtaining images of fine regolith include imaging areas previously contacted by the APXS contact sensor, or making no contact and using extreme caution when approaching such targets (i.e., stand-off distances with sufficient margin to prevent collision with the material).

Preflight Characterization and Calibration

The flight MAHLI instrument underwent characterization and calibration testing prior to delivery in 2008. Tests included characterization of absolute and relative radiometry (required accuracy: 10% absolute, 5% relative), light transfer and noise (e.g., dark current), geometry (focal length, field of view, distortion), resolution (modulation transfer function, point spread function), scattered and stray light, system spectral throughput, and accuracy and precision of the z-stacking range map. Additional tests will be conducted in late 2010 or early 2011 after the camera is mounted on the rover, to determine the MAHLI boresight, locate any onboard noise sources, and characterize the robotic arm’s MAHLI positioning capabilities and verify contact sensor performance.

Calibration on Mars

MSL carries the MAHLI Flight Calibration Target for color/white balance, resolution and focus checks, and verification of UV LED functionality. The target will be mounted in a vertical position on the rover (i.e., vertical when the rover is on a surface with a slope of 0°) to help prevent dust accumulation.


Exposure Times

Typical daylight exposure times for each MAHLI image are of the order of 5–15 milliseconds. The longer exposure value is based on the blue filter case for a low signal, solar illuminated target with a signal to noise ratio (SNR) of > 50, a target albedo of 0.1, and an incidence angle of 75°. The shorter exposure time is based on a green filter case with a high signal, solar illuminated target with signal below saturation, an albedo of 0.6, and an incidence angle of 0°. For imaging in shadow or using the white light LEDs, the bounding, dark-case exposure time is 80 milliseconds for a SNR of > 50, albedo 0.1, and solar incidence angle of 75° or higher. Imaging of UV LED-illuminated targets occurs at night and requires exposure times of the order of 2 seconds.
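A first-order scaling helps explain why the bounding cases above differ so much. For a sunlit, roughly Lambertian target, signal scales with albedo times the cosine of the incidence angle, so required exposure scales with the inverse. The sketch below deliberately ignores filter throughput and the different SNR criteria of the two quoted cases, which is why its ~23x factor exceeds the ratio of the quoted 5 and 15 ms exposures:

```python
import math

# First-order illustration (not a flight exposure model): relative exposure
# needed for a sunlit Lambertian target, ignoring filter bandpass and SNR
# criteria, which differ between the cases quoted in the text.

def relative_exposure(albedo, incidence_deg):
    """Exposure scales inversely with reflected signal ~ albedo * cos(i)."""
    return 1.0 / (albedo * math.cos(math.radians(incidence_deg)))

# Dim case (albedo 0.1, incidence 75 deg) vs. bright case (albedo 0.6, 0 deg):
ratio = relative_exposure(0.1, 75) / relative_exposure(0.6, 0)   # ~23x
```

The geometry-and-albedo factor alone spans more than an order of magnitude, so millisecond-class daylight exposures and the 80 ms shadowed bound are consistent with the same detector sensitivity.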

Image Data Formats and Onboard Image Compression

The MAHLI is capable of producing images in three formats: raw (no RGB interpolation, no compression), lossless predictive compression (no RGB interpolation, approximately 1.7:1 compression), and JPEG (with interpolated color). The amount of JPEG compression can be varied from essentially lossless to very lossy. Operationally, most images will be returned as JPEGs because of their lower data volume. The compression factor is commanded from the ground and implemented as the image is acquired. The Bayer demosaicing algorithm is based on the method of Malvar et al. (2004). In addition to the above formats, MAHLI video products are Bayer pattern-interpolated, 8-bit companded, JPEG-compressed images concatenated into 16-frame motion-JPEG groups of pictures (GOPs), with a single instrument and spacecraft header for each GOP. The instrument also returns color JPEG "thumbnail" images, typically about 150 × 200 pixels in size. A thumbnail of every MAHLI image acquired is intended to be returned to Earth; for images not required for immediate, tactical operations planning, the thumbnails will be used to judge whether to return the full image and which compression to use.
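To make the Bayer interpolation step concrete, here is a sketch of demosaicing. The flight software uses the gradient-corrected method of Malvar et al. (2004); plain bilinear interpolation via normalized convolution is shown instead as a simpler stand-in, and an RGGB mosaic phase is assumed for illustration (the actual KAI-2020CM phase is not given in this text):

```python
import numpy as np
from scipy.signal import convolve2d

# Simplified stand-in for the Malvar et al. (2004) demosaicing used in
# flight: bilinear interpolation of an assumed RGGB Bayer mosaic.

def bilinear_demosaic(raw):
    """raw: 2-D Bayer mosaic (float). Returns an H x W x 3 RGB array."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True          # red sites
    masks[0::2, 1::2, 1] = True          # green sites on red rows
    masks[1::2, 0::2, 1] = True          # green sites on blue rows
    masks[1::2, 1::2, 2] = True          # blue sites
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.50, 1.0, 0.50],
                       [0.25, 0.5, 0.25]])
    for c in range(3):
        plane = np.where(masks[:, :, c], raw, 0.0)
        weight = masks[:, :, c].astype(float)
        # Normalized convolution: average whatever same-color neighbors exist.
        num = convolve2d(plane, kernel, mode='same')
        den = convolve2d(weight, kernel, mode='same')
        interp = num / np.maximum(den, 1e-9)
        # Keep measured samples; interpolate only the missing ones.
        rgb[:, :, c] = np.where(masks[:, :, c], raw, interp)
    return rgb
```

Malvar's method adds a gradient correction term from the other color planes to this same neighborhood averaging, which sharpens edges at essentially no extra hardware cost, one reason it suits an FPGA implementation.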

Onboard Focus Stacking

MAHLI’s onboard focus stacking (z-stacking) capability is a form of image compression. MAHLI can acquire up to 8 images bracketing best focus and then merge them to form a single best-focus image and a range map. This reduces the number of images returned to Earth from 8 to 2, and the second product, the range map, is a smaller-volume grayscale image.

The data for z-stacking are acquired in raw form. The data are RGB interpolated, and the focus stacking algorithm then uses the interpolated data as input. The software registers these focal plane images using multiresolution Kanade-Lucas-Tomasi (KLT) feature tracking with Harris corner detection to identify feature points and track them between image pairs in the stack. The focus merge and range map are generated via a windowed Sum-Modified-Laplacian focus measure that determines the areas of best focus, with Gaussian interpolation used to compute the depth map. The software also has an option to combine the images using multiresolution spline-based image blending.
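The core of the merge can be sketched in a few lines. This is an illustration of the idea only, not the flight algorithm: it omits the KLT registration, the windowing, and the Gaussian depth interpolation described above, and simply picks the sharpest frame per pixel using a modified-Laplacian focus measure:

```python
import numpy as np

# Illustrative z-stack merge (not the flight algorithm): per-pixel selection
# of the frame with the strongest modified-Laplacian focus response, with the
# winning frame index serving as a coarse range map.

def modified_laplacian(img):
    """Modified-Laplacian focus measure: |2I - I_left - I_right| + vertical."""
    p = np.pad(img, 1, mode='edge')
    return (np.abs(2 * p[1:-1, 1:-1] - p[1:-1, :-2] - p[1:-1, 2:]) +
            np.abs(2 * p[1:-1, 1:-1] - p[:-2, 1:-1] - p[2:, 1:-1]))

def zstack(frames):
    """frames: pre-registered 2-D arrays at different focus positions.
    Returns (best_focus_image, index_map)."""
    stack = np.stack(frames)                        # (n, h, w)
    focus = np.stack([modified_laplacian(f) for f in frames])
    idx = np.argmax(focus, axis=0)                  # per-pixel sharpest frame
    merged = np.take_along_axis(stack, idx[None], axis=0)[0]
    return merged, idx
```

Because each focus position corresponds to a known working distance, the index map is what becomes a range map: interpolating the focus measure across neighboring frames (the Gaussian step mentioned above) refines it to sub-step depth precision.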

Onboard focus stacking will not always be used. It will typically be used only for the highest resolution images, and MAHLI users will routinely determine the circumstances under which it is employed, carefully weighing trades among downlink availability, pixel scale, knowledge of the camera head's vibrational environment (during image acquisition) at the end of the robotic arm under Martian conditions, and the science objectives at a given imaging target. Note that the example image above is not a z-stacked product; it is a single-frame image.

Robotic Arm Placement of the Camera Head

The camera head is mounted to a vibration damping system on the side of the drill on the turret at the end of MSL’s robotic arm. The JPL-provided contact sensor, described above, is used to prevent the camera head from impacting a target rock. Arm placement is a function of two factors: the ability of the robotic arm to place the camera head in the desired location, and the accuracy of the navigation and hazard camera images, visualization tools, and commanding tools that allow users to generate the commands that will place the camera in the desired position. The robotic arm will have better placement capability for targets that are imaged repeatedly (e.g., the MAHLI Flight Calibration Target, or a rock that is visited more than once) or that have first been contact-sensed by the APXS. Placement of the MAHLI camera head at a new target, however, may have an uncertainty of as much as ~20 mm in three dimensions.
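The cost of that ~20 mm uncertainty can be roughed out by interpolating between the two pixel scales quoted in this document (13.9 μm/pixel near 20.4 mm and 15 μm/pixel at 25 mm). The linear fit below is a crude illustration; the true scale-versus-distance relation comes from calibration:

```python
# Rough illustration (simple linear fit through the two pixel scales quoted
# in this document; not a calibrated model): the pixel-scale penalty of
# taking the ~20 mm first-placement uncertainty as extra standoff margin.

def approx_pixel_scale_um(working_distance_mm):
    # Anchors from the text: ~13.9 um/px at 20.4 mm, ~15 um/px at 25 mm.
    d0, s0, d1, s1 = 20.4, 13.9, 25.0, 15.0
    return s0 + (working_distance_mm - d0) * (s1 - s0) / (d1 - d0)

closest = approx_pixel_scale_um(20.4)            # 13.9 um/pixel
with_margin = approx_pixel_scale_um(20.4 + 20.0) # ~18.7 um/pixel
```

Under this crude extrapolation, budgeting the full placement uncertainty as standoff margin coarsens the pixel scale by roughly a third, which is why repeat visits and APXS contact sensing, with their better placement knowledge, matter for the highest resolution imaging.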

Operations Team

The MAHLI will be operated at Malin Space Science Systems (MSSS, San Diego, California) by a team of dedicated spacecraft instrument operations professionals that have been operating cameras on multiple Mars spacecraft since 1997. They are responsible, based on science team input, for commanding the instrument and providing the information needed to command placement of the robotic arm for MAHLI imaging. They are also responsible for receipt and logging of the MAHLI science and engineering data, reporting on and maintaining instrument health, and archiving of the MAHLI data with the NASA PDS.

Data Archiving

Most MAHLI data will be returned to Earth as color JPEGs ready for viewing with standard JPEG-viewing software (including web browsers and many e-mail tools). Pre-validated “raw” MAHLI science data are required by NASA to be made available to the entire world via the internet within 24 hours of receipt on Earth. MAHLI image data will be validated and archived with the NASA PDS following a schedule determined by the MSL Project. PDS data products will be archived in the form received from Mars (i.e., raw) and (resources permitting) in geometrically and radiometrically calibrated forms in a standard PDS file format. Products generated by the science or operations teams for data analysis or tactical planning purposes (e.g., mosaics, stereopair products) might be archived with the PDS or (more likely) made available through scientific publications, depending upon resource availability.

Examples of How MAHLI will be Used

MAHLI has a range of capabilities and presents considerable flexibility for use. Some ways the camera will be used include (but are not limited to):

  • Closeup imaging of rocks and fine regolith targets from a near-normal (i.e., along z-axis of the camera lens) viewing position.
  • Context imaging of targets viewed at highest MAHLI resolutions.
  • Imaging oblique views (e.g., bug’s eye, dog’s eye, or standing-human’s eye views) of rocks, regolith, and terrain.
  • Night imaging.
  • Searching for fluorescent materials using the UV LEDs.
  • Observing seasonal frost; monitoring changes in frost over night.
  • Mosaicing and stereo-pair imaging.
  • Imaging of terrain and dust-raising event monitoring when MAHLI is in a stowed position (when robotic arm/turret are stowed).
  • Imaging of the MAHLI Flight Calibration Target.
  • Sky imaging (requires knowledge of position of Sun in the sky) for flat field calibration.
  • Drill hole imaging (might involve shining LED illumination into the hole).
  • Sample Observation Tray imaging of a split of drill or CHIMRA samples.
  • Periscope Imaging—robotic arm is extended upward to allow MAHLI to look over the top of something that the other cameras cannot reach (the robotic arm can place MAHLI higher above the ground than the top of the Remote Sensing Mast).
  • Acquiring scientific video sequences (e.g., documenting grain movement on the surface).
  • Acquiring public outreach or documentary video sequences (e.g., opening of a sample inlet cover; viewing landscape go by as rover drives and MAHLI is stowed; moving rover a very short distance while arm is deployed such that MAHLI can observe wheels rolling over the surface; movement of Remote Sensing Mast).
  • Rover problem diagnosis (view under the rover, as was done on Spirit, only this time in focus and in color; view wheels from the side; look down inside CheMin or SAM sample inlets).
  • Rover self portraits (for education/public outreach) by holding camera head up above the rover or out at some distance from the rover.

Last updated: 2012