FURI | Spring 2020

Analyzing Sensor Quantization of RAW Images for Visual SLAM


Visual simultaneous localization and mapping (SLAM) is an emerging technology that enables low-power devices with a single camera to perform robotic navigation. Most visual SLAM algorithms are tuned for images produced by an image signal processing (ISP) pipeline optimized for aesthetically pleasing photography. We investigate the feasibility of varying sensor quantization on RAW images taken directly from the sensor to save energy for visual SLAM. Reducing the quantization level to five bits achieves an 88% energy savings. We also introduce a gradient-based quantization scheme that increases energy savings further. This work opens a new direction in energy-efficient image sensing for SLAM.
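As a rough illustration of the kind of bit-depth reduction described above, the sketch below uniformly re-quantizes a RAW sensor frame in NumPy. The function name, bit depths, and frame size are hypothetical, and this is plain uniform re-quantization, not the gradient-based scheme introduced in the project.

```python
import numpy as np

def quantize_raw(raw: np.ndarray, in_bits: int = 10, out_bits: int = 5) -> np.ndarray:
    """Uniformly re-quantize a RAW frame from in_bits to out_bits per pixel.

    Dropping the (in_bits - out_bits) least-significant bits mimics reading
    fewer bits per pixel from the sensor ADC, which is where the energy
    savings in this line of work come from.
    """
    shift = in_bits - out_bits
    return (raw >> shift).astype(np.uint8)  # values now span [0, 2**out_bits - 1]

# Example: a synthetic 10-bit RAW frame reduced to 5 bits per pixel.
rng = np.random.default_rng(0)
raw_frame = rng.integers(0, 2**10, size=(480, 640), dtype=np.uint16)
low_bit_frame = quantize_raw(raw_frame, in_bits=10, out_bits=5)
```

The low-bit frame could then be fed to a visual SLAM front end (e.g., feature extraction) to assess how reduced quantization affects tracking, which is the trade-off this project examines.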

Student researcher

Olivia Christie

Electrical engineering

Hometown: Mesa, Arizona, United States

Graduation date: Spring 2021