FURI | Fall 2025

Event-guided Vehicle Tracking in Low Light: A/B Testing Event-RGB Fusion for Robust Traffic Monitoring

Traffic monitoring under low illumination remains challenging for RGB-only systems due to motion blur, sensor noise, and glare. We propose an event-guided fusion pipeline for nighttime vehicle detection, tracking, and counting. Events are aggregated over short windows (Δt) and rendered into motion frames; thresholded maps act as masks that suppress static background and emphasize moving objects. Foreground points are clustered via OPTICS/DBSCAN into candidate blobs and intersected with YOLO detection boxes to produce object-aware tracks. A lightweight tracker with centroid/velocity gating assigns persistent IDs.
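To make the pipeline concrete, here is a minimal sketch of the fusion step, assuming events arrive as an (N, 3) array of (x, y, t) and boxes use (x1, y1, x2, y2) pixel coordinates. DBSCAN stands in for the OPTICS/DBSCAN choice, and every threshold and parameter value is an illustrative placeholder rather than a tuned setting from the project.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def motion_mask(events, shape, t0, delta_t, count_thresh=2):
    """Aggregate events in [t0, t0 + delta_t) into a count frame, then threshold.
    `events` is an (N, 3) array of (x, y, t); polarity is ignored in this sketch."""
    frame = np.zeros(shape, dtype=np.int32)
    win = events[(events[:, 2] >= t0) & (events[:, 2] < t0 + delta_t)]
    np.add.at(frame, (win[:, 1].astype(int), win[:, 0].astype(int)), 1)
    return frame >= count_thresh  # boolean mask of motion-salient pixels

def cluster_blobs(mask, eps=4.0, min_samples=20):
    """Cluster foreground pixels into candidate blobs; return their bounding boxes."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return []
    pts = np.stack([xs, ys], axis=1)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)
    boxes = []
    for k in set(labels) - {-1}:  # label -1 is DBSCAN noise
        p = pts[labels == k]
        boxes.append((p[:, 0].min(), p[:, 1].min(), p[:, 0].max(), p[:, 1].max()))
    return boxes

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / (union + 1e-9)

def fuse(yolo_boxes, blob_boxes, iou_min=0.3):
    """Keep YOLO detections that overlap a motion blob: the event-guided gate."""
    return [d for d in yolo_boxes if any(iou(d, b) >= iou_min for b in blob_boxes)]
```

In practice the same mask could also confine YOLO inference to crops around the blobs rather than filtering full-frame detections afterward, which is where the hypothesized compute savings would come from.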

We will evaluate performance using a controlled A/B test:

Arm A (Control): RGB-only YOLO plus a tracker.

Arm B (Treatment): Event-guided fusion (event mask + OPTICS/DBSCAN + YOLO + tracker).
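Both arms share the lightweight tracker, so ID assignment is held constant across the comparison. A minimal sketch of centroid/velocity gating follows; the gate radius, miss tolerance, and constant-velocity prediction are illustrative assumptions, not the project's final design.

```python
import numpy as np

class CentroidTracker:
    """Assign persistent IDs by matching each track's velocity-predicted
    centroid to the nearest detection within a gate radius."""

    def __init__(self, gate_radius=40.0, max_missed=5):
        self.gate_radius = gate_radius   # pixels; detections outside are rejected
        self.max_missed = max_missed     # frames a track may go unmatched
        self.next_id = 0
        self.tracks = {}                 # id -> {"pos", "vel", "missed"}

    def update(self, boxes):
        """`boxes` is a list of (x1, y1, x2, y2); returns {track_id: box}."""
        cents = [np.array([(x1 + x2) / 2.0, (y1 + y2) / 2.0])
                 for x1, y1, x2, y2 in boxes]
        assigned, used = {}, set()
        for tid, tr in self.tracks.items():
            pred = tr["pos"] + tr["vel"]          # constant-velocity prediction
            best, best_d = None, self.gate_radius
            for j, c in enumerate(cents):
                d = float(np.linalg.norm(pred - c))
                if j not in used and d < best_d:
                    best, best_d = j, d
            if best is None:
                tr["missed"] += 1
            else:
                tr["vel"] = cents[best] - tr["pos"]
                tr["pos"] = cents[best]
                tr["missed"] = 0
                used.add(best)
                assigned[tid] = boxes[best]
        # drop stale tracks; start new ones for unmatched detections
        self.tracks = {t: v for t, v in self.tracks.items()
                       if v["missed"] <= self.max_missed}
        for j, c in enumerate(cents):
            if j not in used:
                self.tracks[self.next_id] = {"pos": c, "vel": np.zeros(2), "missed": 0}
                assigned[self.next_id] = boxes[j]
                self.next_id += 1
        return assigned
```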

Clips will be paired (same scenes, times, and durations) and stratified by illumination (street-lit versus very low light), traffic density, and motion-blur severity. Primary endpoints are detection precision/recall and IDF1; secondary endpoints include MOTA, ID switches, counting MAE, and runtime per frame. The preregistered analysis includes ablations over the Δt aggregation window, mask thresholds, clustering parameters, and association radii. We hypothesize that Arm B yields (i) higher precision in low light, (ii) fewer ID switches through glare and occlusions, and (iii) improved counting accuracy at comparable or lower compute by confining inference to motion-salient regions. Results can enable robust, cost-effective nighttime monitoring and improve downstream traffic video question-answering about motion (e.g., lane changes, stops).
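For the endpoints that are simple to state, a sketch of per-clip scoring is below, assuming (x1, y1, x2, y2) boxes and per-clip integer counts. The 0.5 IoU threshold is a common convention, not a preregistered choice, and IDF1/MOTA would come from a standard MOT evaluation toolkit rather than being re-implemented.

```python
import numpy as np

def box_iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / (union + 1e-9)

def detection_pr(pred_boxes, gt_boxes, iou_min=0.5):
    """Greedy one-to-one matching at an IoU threshold; returns (precision, recall)."""
    matched, tp = set(), 0
    for p in pred_boxes:
        best, best_iou = None, iou_min
        for gi, g in enumerate(gt_boxes):
            v = box_iou(p, g)
            if gi not in matched and v >= best_iou:
                best, best_iou = gi, v
        if best is not None:
            matched.add(best)
            tp += 1
    return tp / max(len(pred_boxes), 1), tp / max(len(gt_boxes), 1)

def counting_mae(pred_counts, true_counts):
    """Mean absolute error between per-clip predicted and ground-truth counts."""
    p, t = np.asarray(pred_counts, float), np.asarray(true_counts, float)
    return float(np.abs(p - t).mean())
```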

Student researcher

Katha Naik

Computer science

Hometown: Mumbai, Maharashtra, India

Graduation date: Spring 2026