FURI | Spring 2026

Accelerating Convergence to Effective Capacity in Lightweight Vision Models via Self-Competitive Distillation


This project investigates Self-Competitive Distillation (SCD), a parameter-neutral training framework designed to improve the effective capacity of lightweight computer vision models under resource constraints. By training two identical models that dynamically exchange asymmetric teacher–student roles, SCD improves cross-domain generalization compared to Deep Mutual Learning (DML). These results suggest that training dynamics play a critical role in practical model performance, enabling more accurate and efficient AI on mobile and edge devices. Future work will extend this approach to additional architectures and real-world deployment settings.
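The abstract does not specify the training loop, so the following is a minimal PyTorch-style sketch of one plausible reading of SCD: two identical models are trained together, and on each batch the model with the lower supervised loss temporarily acts as teacher while the other receives a distillation term. The function name scd_step, the lower-loss role rule, and the temperature and mixing values are illustrative assumptions, not the project's confirmed method.

```python
import torch
import torch.nn.functional as F

def scd_step(model_a, model_b, opt_a, opt_b, x, y, T=4.0, alpha=0.5):
    """One training step of a Self-Competitive Distillation sketch.

    Assumption: the model with the lower cross-entropy on the current
    batch is the teacher; only the student gets a distillation term.
    T (temperature) and alpha (loss mix) are illustrative values.
    """
    logits_a, logits_b = model_a(x), model_b(x)
    ce_a = F.cross_entropy(logits_a, y)
    ce_b = F.cross_entropy(logits_b, y)

    # Asymmetric roles: the stronger model on this batch teaches the weaker one.
    if ce_a <= ce_b:
        teacher_logits, student_logits = logits_a.detach(), logits_b
        teacher_loss, student_ce = ce_a, ce_b
        teacher_opt, student_opt = opt_a, opt_b
    else:
        teacher_logits, student_logits = logits_b.detach(), logits_a
        teacher_loss, student_ce = ce_b, ce_a
        teacher_opt, student_opt = opt_b, opt_a

    # Student objective: supervised loss + KL toward the teacher's
    # softened outputs (standard Hinton-style distillation term).
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    student_loss = alpha * student_ce + (1 - alpha) * kd

    # Update the student on its combined loss; the teacher's logits are
    # detached, so no gradient leaks into the teacher here.
    student_opt.zero_grad()
    student_loss.backward()
    student_opt.step()

    # The teacher trains on its plain supervised loss; roles may swap
    # on the next batch, which is the "self-competitive" dynamic.
    teacher_opt.zero_grad()
    teacher_loss.backward()
    teacher_opt.step()
```

Because the two networks share an architecture (e.g., two instances of the same lightweight backbone), this loop adds no parameters at inference time, consistent with the abstract's "parameter-neutral" framing; only one of the two models needs to be deployed.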

Student researcher

Ahmet Arda Dalyanci

Computer science

Hometown: Istanbul, Istanbul, Türkiye

Graduation date: Spring 2028