The application captured audio data but provided no meaningful way to utilize or represent it beyond indicating that a recording existed.
My task was to uncover opportunities to surface audio metadata and transform it into useful, user-friendly features: visualizing, contextualizing, and applying the data to improve usability and deliver actionable insights.
Johnson Controls
(Patented Invention)
4 UX Designers, 2 Developers,
and 1 Design Manager
Lead UX Designer
February – June 2024
A review of the application and its competitors quickly revealed opportunities to differentiate the product through audio functionality. I identified gaps in the market, including limited user interfaces for browsing, searching, and interpreting audio data. Additionally, most systems lacked robust tools for metadata extraction—such as sound classification—making raw audio difficult to analyze or act on.
No Visual Indication of Audio Presence
No Representation of Audio Metadata
Lack of Audio Classification
No Correlation Between Audio Sources and Events
To address the lack of audio awareness in video footage, I conducted user interviews and usability tests that informed three design iterations, delivered in a phased rollout. I introduced a unified audio-visual timeline, audio presence indicators for live and historical events, and smart classification of critical audio cues. These changes significantly improved situational awareness and led to a patent filing that supports future innovation.
Additionally, this exploration uncovered further opportunities to enhance the user experience. As a result, we redesigned the Push-to-Talk (PTT) feature to support more intuitive two-way communication and streamlined the process of associating multiple devices with multiple audio sources, improving overall system usability.
The following outlines several solutions I developed to address the identified challenges and drive product improvement.
This project highlighted the critical yet often overlooked role of audio in situational awareness. We learned that users need contextual, interpretable cues, not just raw data, to engage with audio effectively. Unifying audio and video into a single timeline strengthened users' mental models and event correlation, while flexible filtering and alert customization empowered users in high-stakes environments.
Small, focused UI changes unlocked complex functionality, and the work demonstrated how UX design can drive innovation and contribute directly to patentable product strategy.
Reduction in time to identify relevant audio events during incident review
Fewer false positives due to customizable alert thresholds and audio classification
Improvement in successful device associations between audio and video sources
Increase in user interaction with audio devices via the new audio-focused view