Milian Ingco, a recent CS alumnus (December 2025), presented findings from his year-long mentored research in a presentation titled “Sight By Sound: Real-Time Sonification of Stereo Depth Maps using Hilbert Curves for Assistive Navigation supported by a Virtual Training Environment” at the IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR). This year’s conference was held in Osaka, Japan, January 26-28, 2026, and Milian’s travel to Osaka was made possible by generous support from the School of Science.
Beginning in Spring 2025, under the supervision of his faculty mentor, Dr. Sejong Yoon, Milian developed an inexpensive device that converts a depth image, generated from two cameras, into an audio signal. This signal can help visually impaired people navigate their environment with reduced risk from objects at or above head height. In his work, Milian used a Hilbert curve, a type of space-filling curve, to convert the 2D depth image into a 1D signal. He continued his work over the summer with support from National Science Foundation Grant 1955365.
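The details of Milian's pipeline are not described in this post, but the core idea of the Hilbert-curve step can be sketched: walking the pixels of a square depth map in Hilbert order yields a 1D sequence in which nearby pixels tend to stay nearby, so a single obstacle occupies a contiguous stretch of the signal. The function names below (`hilbert_d2xy`, `depth_to_signal`) are illustrative, not taken from his implementation, and the sketch stops short of the actual audio synthesis.

```python
# Illustrative sketch only: flatten a 2D depth map into a 1D signal
# along a Hilbert curve. Not the project's actual implementation.

def hilbert_d2xy(order, d):
    """Map index d (0 .. 4**order - 1) to (x, y) on a Hilbert curve
    covering a (2**order x 2**order) grid."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:              # rotate/reflect this quadrant
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def depth_to_signal(depth, order):
    """Read a (2**order x 2**order) depth map in Hilbert order.

    Because the curve preserves locality, an object that is compact
    in the image maps to a contiguous run of the 1D output, which is
    what makes the result suitable for sonification.
    """
    n = 1 << order
    assert len(depth) == n and all(len(row) == n for row in depth)
    return [depth[y][x]
            for x, y in (hilbert_d2xy(order, d) for d in range(n * n))]
```

A plain row-major scan would also flatten the image, but it tears vertically adjacent pixels far apart in the signal; the Hilbert traversal is the standard remedy when 2D locality must survive the flattening.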
In the Fall 2025 semester, Milian conducted a TCNJ IRB-approved user test using a virtual training environment; results from the study showed that the majority of participants saw the method's potential for its intended purpose. Milian also implemented a version of his system on the Meta Quest 3 for use in future related research, and he received positive feedback after demonstrating it at the AIxVR conference. He has shared his implementation on GitHub, an open-source platform, so that others can build upon it.
Congratulations to Milian and Dr. Yoon!
All photos included in this post were taken by Dr. Sejong Yoon.



