Movement inspires ideas—and sound can inspire perception.
The Kia Soundscapes project reimagines how we experience the world by transforming landscapes into immersive musical journeys. Designed with passengers with vision loss in mind, the initiative repurposes Kia's Advanced Driver Assistance System (ADAS), originally intended for obstacle detection, to identify natural elements such as trees, rocks, and mountains. A purpose-built sound system then translates each detected feature into its own musical voice: flutes and woodwinds evoke the presence of trees, for example, while low drones represent distant peaks.
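To make the feature-to-voice idea concrete, here is a minimal sketch of that kind of mapping. It is purely illustrative and not the production Kia Soundscapes system: the class names, the voice assigned to rocks, and the register and gain values are assumptions; only the tree and mountain pairings come from the description above.

```python
from dataclasses import dataclass

# Hypothetical sketch of the feature-to-voice mapping described in the text.
# Names and values are illustrative assumptions, not the production system.

@dataclass
class Voice:
    instrument: str   # e.g. "flute_and_woodwinds", "low_drone"
    register: str     # rough pitch range used when layering
    gain_db: float    # relative loudness in the mix

# Each landscape element detected by the ADAS-style pipeline is paired
# with its own musical voice. The "rock" entry is a placeholder, since
# the text does not specify its instrument.
VOICE_MAP = {
    "tree":     Voice(instrument="flute_and_woodwinds", register="mid-high", gain_db=-6.0),
    "rock":     Voice(instrument="plucked_strings",     register="mid",      gain_db=-9.0),
    "mountain": Voice(instrument="low_drone",           register="low",      gain_db=-3.0),
}

def voices_for_scene(detected_elements: list[str]) -> list[Voice]:
    """Return the musical layers to blend for the elements currently in view."""
    return [VOICE_MAP[e] for e in detected_elements if e in VOICE_MAP]

# Example: a scene with trees in the foreground and a distant peak.
if __name__ == "__main__":
    for voice in voices_for_scene(["tree", "mountain"]):
        print(f"{voice.instrument:>22}  register={voice.register}  gain={voice.gain_db} dB")
```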
Although the project is presented as AI-generated, the musical language that brings the system to life was carefully designed by human creators. As the music supervisor for Kia Soundscapes, I worked in close collaboration with DaHouse Audio to guide the development of the sonic identity for each landscape element. Our team crafted the sound assets and compositional rules the AI draws from, ensuring that the emotional and aesthetic quality of the experience remained central to its function.
By aligning data analysis with nuanced musical storytelling, we created a system that goes beyond utility to offer a profound sensory experience. Kia Soundscapes is a testament to the power of combining technology with human artistic direction, allowing those without sight to feel the world’s beauty through movement and sound.