Reimagined Battleship as a hybrid physical-digital game to teach historical naval battles. Used
computer vision and projection mapping to merge digital feedback with physical gameplay.
Built with OpenCV (Python), PyGame, and image processing techniques. I learned how playful design can make education tangible, and how much UX testing matters when designing for real-world interaction.
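To give a feel for the board-reading step, here is a minimal Python sketch, not the project's actual code: it assumes an overhead camera, a fixed 10x10 grid region, and colored pegs that can be isolated by HSV thresholding. The color range, board coordinates, and area threshold are placeholders.

```python
# Minimal sketch: detect colored pegs on a physical Battleship board seen by an
# overhead camera, then map them to grid cells. Color range, grid size, and
# board region are illustrative placeholders, not the project's real values.
import cv2
import numpy as np

GRID = 10                               # 10x10 Battleship grid
BOARD = (50, 50, 500, 500)              # x, y, width, height of the board in the frame (placeholder)
PEG_HSV_LO = np.array([0, 120, 120])    # placeholder HSV range for red pegs
PEG_HSV_HI = np.array([10, 255, 255])

def read_board_state(frame):
    """Return a GRID x GRID boolean array marking cells that contain a peg."""
    x, y, w, h = BOARD
    board = frame[y:y + h, x:x + w]
    hsv = cv2.cvtColor(board, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, PEG_HSV_LO, PEG_HSV_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    state = np.zeros((GRID, GRID), dtype=bool)
    cell_w, cell_h = w / GRID, h / GRID
    for c in contours:
        if cv2.contourArea(c) < 50:      # ignore small noise blobs
            continue
        m = cv2.moments(c)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        col, row = int(cx // cell_w), int(cy // cell_h)
        state[row, col] = True
    return state

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print(read_board_state(frame).astype(int))
    cap.release()
```

The detected grid state can then drive the PyGame layer that renders hits and misses back onto the board via the projector.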
Created a physical audio trigger box to enhance atmosphere in tabletop RPGs, mapping player
actions to ambient sounds.
Developed with Arduino (C/C++) and custom sound logic. I learned to bridge hardware with experience design — and that audio has emotional power in gameplay.
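The trigger-to-sound mapping idea can be illustrated with a small host-side Python script; the project itself ran its sound logic on the Arduino in C/C++, so this is only a conceptual stand-in. It assumes the box sends one-word event names over serial, and the port, event names, and file paths are placeholders.

```python
# Illustration only: a host-side script that maps trigger events sent over
# serial (e.g., from an Arduino button box) to ambient sound files. The real
# project implemented this mapping on the Arduino side in C/C++; the port,
# event names, and file paths here are placeholders.
import serial    # pyserial
import pygame

SOUND_MAP = {
    "door": "sounds/door_creak.wav",
    "battle": "sounds/battle_drums.wav",
    "tavern": "sounds/tavern_ambience.wav",
}

def main():
    pygame.mixer.init()
    sounds = {name: pygame.mixer.Sound(path) for name, path in SOUND_MAP.items()}
    port = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)   # placeholder port

    while True:
        event = port.readline().decode("utf-8", errors="ignore").strip()
        if event in sounds:
            sounds[event].play()   # layer the ambient cue over whatever is playing

if __name__ == "__main__":
    main()
```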
Developed a VR experience that lets users simulate prehistoric rock carving through hand-tracked
gestures, blending culture and interactivity.
Built in Unity using C#. I learned how spatial tech can turn history into a tactile experience, and how intuitive interaction makes abstract ideas come alive.
Trained neural models alongside a generative autoencoder to learn and generate audio effects on
music tracks based on artists' distinct voices, analyzing the resulting sound
transformations through user testing.
Built in Python using TensorFlow. I gained ML and signal processing skills, and learned to handle imperfect data with creative resilience.
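A rough sense of the approach: the sketch below is a bare-bones spectrogram autoencoder in TensorFlow, not the project's actual architecture. The frame size, layer widths, and random stand-in training data are placeholders, and the per-artist voice conditioning used in the project is omitted.

```python
# Minimal sketch of a spectrogram autoencoder of the kind used to learn audio
# transformations. Frame size, layer widths, and training data are placeholders;
# the real project additionally conditioned on per-artist voice features.
import numpy as np
import tensorflow as tf

N_BINS = 513    # magnitude-spectrogram bins per frame (e.g., from a 1024-point FFT)
LATENT = 64

def build_autoencoder():
    inputs = tf.keras.Input(shape=(N_BINS,))
    x = tf.keras.layers.Dense(256, activation="relu")(inputs)
    z = tf.keras.layers.Dense(LATENT, activation="relu", name="latent")(x)
    x = tf.keras.layers.Dense(256, activation="relu")(z)
    outputs = tf.keras.layers.Dense(N_BINS, activation="relu")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

if __name__ == "__main__":
    # Stand-in training data: random frames in place of real spectrograms.
    frames = np.abs(np.random.randn(1024, N_BINS)).astype("float32")
    ae = build_autoencoder()
    ae.fit(frames, frames, epochs=2, batch_size=64)
    print(ae.predict(frames[:4]).shape)   # (4, 513)
```

In practice the reconstructed frames would be inverted back to audio so listeners could compare the transformed tracks in user tests.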
Designed a VR walkthrough to help users explore sustainable architecture, gathering design
feedback through interactive tours.
Built in Unreal Engine using C++. I learned how to blend clarity and exploration in immersive design, and how environment layout affects understanding.
Replaced button-based pointing with thrust-based gestures to make VR interaction more physical and
engaging.
Built in Unreal Engine using C++. I learned to map physical motion to user intent — and started thinking more critically about embodiment in design.
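The core detection idea can be shown in plain Python, even though the project implemented it in Unreal Engine C++: a thrust registers when the controller moves fast enough along its own forward axis. The speed and alignment thresholds below are placeholders.

```python
# Illustration of the thrust-detection logic in plain Python; the project itself
# implemented this in Unreal Engine C++. A "thrust" is registered when the hand
# moves fast enough along the controller's forward axis. Thresholds are placeholders.
import numpy as np

THRUST_SPEED = 1.5       # metres per second (placeholder)
FORWARD_ALIGNMENT = 0.8  # how closely the motion must follow the forward axis

def is_thrust(prev_pos, curr_pos, forward, dt):
    """Return True if the hand motion between two frames counts as a thrust."""
    velocity = (np.asarray(curr_pos) - np.asarray(prev_pos)) / dt
    speed = np.linalg.norm(velocity)
    if speed < THRUST_SPEED:
        return False
    direction = velocity / speed
    forward = np.asarray(forward) / np.linalg.norm(forward)
    return float(np.dot(direction, forward)) >= FORWARD_ALIGNMENT

# Example: a 10 cm forward move within one 33 ms frame registers as a thrust.
print(is_thrust([0, 0, 0], [0, 0, 0.10], forward=[0, 0, 1], dt=0.033))
```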