In-Vehicle Warning System
This project investigates how drivers perceive, interpret, and respond to safety-critical warnings in modern vehicles, where increasing automation and reduced engine noise shift reliance toward intentionally designed auditory, visual, and multimodal alerts.
Problem Definition & Research Approach
1. User Pain Point
Drivers must interpret multiple simultaneous alerts of varying urgency within visually dense interfaces; without a unifying structure, this often leads to cognitive overload, ambiguity, and incorrect responses.
2. Root Cause
Coherent multimodal communication in ADAS demands a unified framework spanning perception, cognition, regulation, and system behavior. See the full UX research for the root-cause analysis.
3. Data Structuring
Organized 122 use cases into structured fields, including urgency, functional unit, dashboard message, audio file, visual asset, and spatial channel.
4. System Modeling
Developed a priority model that resolves simultaneous alerts by sorting warning cases by urgency, then functional unit, then intra-unit ranking.
5. UX Design
Translated the priority model into driver-facing interfaces, defining visual hierarchy, spatial layout, and audio-visual synchronization across alert states.
Core Framework
The research outcome is an information architecture–driven UX model that organizes in-vehicle warning signals across three interconnected layers:
- Six-level urgency taxonomy: WarningHigh, WarningMiddle, WarningLow, Caution, Notification, and Feedback.
- Functional unit classification: ADAS Awareness, ADAS Primary Control, Brake Systems, High Emergency, Low Emergency, Road & Traffic Sign Cases, Low Safety, and Media.
- Priority-ordering algorithm: a lexicographic model that resolves simultaneous warnings by urgency first, then functional unit, then intra-unit rank.
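The lexicographic model above can be sketched in a few lines. This is an illustrative sketch, not the production algorithm: the urgency and functional-unit orderings come from the taxonomy in this section, while the example cases, field names, and numeric ranks are assumptions.

```python
from dataclasses import dataclass

# Orderings taken from the taxonomy above; index 0 is highest priority.
URGENCY_ORDER = ["WarningHigh", "WarningMiddle", "WarningLow",
                 "Caution", "Notification", "Feedback"]
UNIT_ORDER = ["ADAS Awareness", "ADAS Primary Control", "Brake Systems",
              "High Emergency", "Low Emergency", "Road & Traffic Sign Cases",
              "Low Safety", "Media"]

@dataclass(frozen=True)
class WarningCase:
    name: str
    urgency: str          # one of URGENCY_ORDER
    unit: str             # one of UNIT_ORDER
    intra_unit_rank: int  # lower = higher priority within the unit

def priority_key(case: WarningCase) -> tuple:
    """Lexicographic key: urgency first, then functional unit, then intra-unit rank."""
    return (URGENCY_ORDER.index(case.urgency),
            UNIT_ORDER.index(case.unit),
            case.intra_unit_rank)

def resolve(active_cases: list[WarningCase]) -> WarningCase:
    """Pick the single alert that should take priority right now."""
    return min(active_cases, key=priority_key)
```

Because the key is a tuple, Python's built-in comparison gives the lexicographic behavior directly: urgency always dominates, and functional unit only breaks ties within the same urgency level.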
System Design
The system design translates research findings into a structured data architecture. Each warning case is represented through identification fields, functional categorization, algorithm parameters, and implementation specifications. This allows sound and visual UX design teams to connect abstract research decisions to concrete vehicle outputs: sounds, telltales, dashboard messages, and spatial audio channels.
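One warning-case record might be sketched as follows. The field groups mirror the four categories named above (identification, functional categorization, algorithm parameters, implementation specifications); all concrete values and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class WarningRecord:
    # Identification fields
    case_id: str
    name: str
    # Functional categorization
    urgency: str            # e.g. "WarningHigh" .. "Feedback"
    functional_unit: str    # e.g. "Brake Systems"
    # Algorithm parameters
    intra_unit_rank: int    # priority within the functional unit
    # Implementation specifications (concrete vehicle outputs)
    dashboard_message: str
    audio_file: str
    visual_asset: str
    spatial_channel: str    # "left" | "right" | "rear" | "center"

# Hypothetical example record
example = WarningRecord(
    case_id="BRK-003",
    name="Brake system malfunction",
    urgency="WarningHigh",
    functional_unit="Brake Systems",
    intra_unit_rank=1,
    dashboard_message="Brake system fault. Stop safely.",
    audio_file="warn_high_brake.wav",
    visual_asset="telltale_brake_red.svg",
    spatial_channel="center",
)
```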
Importantly, the model supports conflict resolution when multiple alerts occur at the same time. Rather than allowing warnings to compete randomly, the system determines which alert should take priority based on threat severity and vehicle function.
UX Design Translation
The UX design phase translates the system into clear, user-facing signals: urgency is shown through color (e.g., red for critical, green for low-urgency feedback); repeated tones accelerate as urgency increases; distinct sound types help differentiate functions; and spatial audio indicates the direction of the hazard.
- Visual design: color-coded urgency states from critical red to feedback green.
- Auditory design: metrical acceleration, where more urgent warnings repeat faster.
- Semantic sound design: timbre differentiates functional units such as awareness, braking, and emergency states.
- Spatial design: left, right, rear, and center channels communicate hazard direction and system focus.
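The metrical-acceleration idea can be made concrete with a simple urgency-to-tempo mapping. The interval values below are illustrative assumptions, not measurements from the actual sound design; the urgency names follow the taxonomy above.

```python
# Assumed repetition intervals (seconds) per urgency level.
# More urgent levels repeat faster; the lowest levels play once.
REPEAT_INTERVAL_S = {
    "WarningHigh":   0.25,  # rapid repetition for critical threats
    "WarningMiddle": 0.50,
    "WarningLow":    1.00,
    "Caution":       2.00,
    "Notification":  None,  # single chime, no repetition
    "Feedback":      None,
}

def onset_times(urgency: str, duration_s: float) -> list[float]:
    """Onset times (seconds) of the repeated tone over a playback window."""
    interval = REPEAT_INTERVAL_S[urgency]
    if interval is None:
        return [0.0]  # one-shot sound
    times, t = [], 0.0
    while t < duration_s:
        times.append(round(t, 2))
        t += interval
    return times
```

For example, over a one-second window a WarningHigh tone would fire four times while a WarningLow tone fires once, which is the perceptual cue the auditory design relies on.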
Impact
This research provides a scalable framework for designing consistent in-vehicle warning systems across many ADAS features. It helps reduce ambiguity, supports more predictable driver interpretation, and gives product, design, and engineering teams a shared language for discussing warning priority, perceptual urgency, and multimodal coherence.
Prototype & Documentation
Interactive High-Fidelity Prototype
Explore the real-time ADAS warning system prototype, demonstrating full UX behavior, prioritization logic, dashboard interaction, and multimodal interface dynamics.
Full Research Documentation
This case study is based on a full research article presenting the complete UX framework, data architecture, priority algorithm, visual hierarchy, and auditory design strategy.