SensiGesture

AI-driven driver safety and gesture interaction with 22 gestures and 20 activities.

Programme: Internal R&D - Automotive HMI
Domain: Automotive In-Cabin AI
Type: Vision AI

About This Project

SensiGesture is an automotive-grade AI system for reducing driver distraction through intelligent gesture and behaviour recognition. The project created a proprietary in-cabin dataset with more than 350,000 annotated RGB frames across day, night, glare, and overcast conditions, enabling robust recognition of infotainment gestures as well as risky driver behaviours such as drowsiness, phone use, smoking, eating, and seatbelt violations.

This project reflects Sensifai's applied innovation approach, combining research-grade AI capabilities with practical product and deployment requirements.

Key Highlights

Proprietary SensiGesture dataset with more than 350,000 annotated in-cabin RGB frames.
Recognition of 22 hand gestures and 20 driver activity classes.
Temporal Shift Module pipelines deployed on MobileNet backbones for efficient inference.
Validation showed MobileNetV3 improving precision, top-5 accuracy, and inference speed over MobileNetV2.
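The Temporal Shift Module mentioned above adds temporal reasoning to a 2D backbone by shifting a fraction of feature channels between neighbouring frames, at near-zero compute cost. The sketch below is illustrative only, not the project's actual implementation: it shows the zero-padded shift operation on a batch of per-frame feature maps, with `n_segments` (frames per clip) and `shift_div` (fraction of channels shifted) as assumed hyperparameter names.

```python
import numpy as np

def temporal_shift(x, n_segments, shift_div=8):
    """Zero-padded temporal shift over per-frame feature maps.

    x: array of shape (N*T, C, H, W), where T == n_segments frames per clip.
    1/shift_div of the channels shift backward in time, another 1/shift_div
    shift forward, and the rest pass through unchanged.
    """
    nt, c, h, w = x.shape
    n = nt // n_segments
    x = x.reshape(n, n_segments, c, h, w)
    fold = c // shift_div

    out = np.zeros_like(x)
    # First fold: frame t receives channels from frame t+1 (last frame padded with zeros).
    out[:, :-1, :fold] = x[:, 1:, :fold]
    # Second fold: frame t receives channels from frame t-1 (first frame padded with zeros).
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]
    # Remaining channels are left untouched.
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]
    return out.reshape(nt, c, h, w)
```

Because the shift is a pure memory operation, it can be inserted into each residual block of a MobileNet-style backbone without adding FLOPs, which is what makes this family of pipelines attractive for in-cabin edge inference.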

Use Cases

This project supports several practical scenarios across research, commercialisation, and deployment contexts:

  • Computer vision inference in production scenarios
  • On-device or edge AI for visual understanding
  • Automotive In-Cabin AI workflow automation
  • Internal R&D - Automotive HMI innovation validation
  • Cross-functional product and research collaboration
