


These visuals explore different ways of expressing rainfall through sound-driven generative design.
Using TouchDesigner, I tested various techniques—from particle systems and video textures to noise-based simulations and feedback distortion. Each scene responds to the rainstick’s sound input, creating a unique visual atmosphere that ranges from calm ripples to abstract storms.
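As a rough sketch of the audio-reactive mapping described above (in practice this lives in a TouchDesigner CHOP network; the function names, parameter names, and value ranges here are all illustrative assumptions, not the project's actual settings):

```python
# Hedged sketch: smooth an incoming microphone level and map it to
# visual parameters, the way an audio-analysis chain might feed a
# particle system or feedback loop. All names and ranges are made up.

def smooth(prev: float, target: float, alpha: float = 0.2) -> float:
    """Exponential smoothing, so visuals ease toward the new level."""
    return prev + alpha * (target - prev)

def audio_to_visuals(level: float) -> dict:
    """Map a 0..1 audio level to hypothetical visual parameters."""
    level = max(0.0, min(1.0, level))  # clamp out-of-range input
    return {
        "particle_rate": 50 + 950 * level,      # louder rain, more particles
        "ripple_strength": 0.1 + 0.9 * level,   # stronger surface ripples
        "feedback_amount": 0.95 - 0.3 * level,  # calm scenes hold trails longer
    }
```

Run each frame, `smooth` keeps quiet moments from flickering while `audio_to_visuals` turns one scalar into several scene parameters at once.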








I recorded rainstick sounds at different tilt angles and defined four rain states: ambient, light, moderate, and heavy. I trained a classifier on these samples in Teachable Machine and connected the model to TouchDesigner, so real-time microphone input is classified into a rain state that drives matching visual changes.
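A minimal sketch of how the classifier's output could select a visual state (the label names match the four rain states above; the preset values and helper functions are assumptions for illustration, not the project's actual configuration):

```python
# Hedged sketch: pick a visual preset from the class probabilities a
# Teachable Machine audio model emits. Labels match the four rain
# states; preset values are illustrative placeholders.

LABELS = ["ambient", "light", "moderate", "heavy"]

PRESETS = {
    "ambient":  {"particle_rate": 0,    "noise_amp": 0.05},
    "light":    {"particle_rate": 200,  "noise_amp": 0.2},
    "moderate": {"particle_rate": 600,  "noise_amp": 0.5},
    "heavy":    {"particle_rate": 1500, "noise_amp": 0.9},
}

def classify(probs: list[float]) -> str:
    """Return the label with the highest probability (argmax)."""
    return LABELS[max(range(len(probs)), key=probs.__getitem__)]

def preset_for(probs: list[float]) -> dict:
    """Look up the visual preset for the winning rain state."""
    return PRESETS[classify(probs)]
```

In TouchDesigner this lookup would run whenever the model posts new probabilities, writing the chosen preset's values onto the relevant operators.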


This interactive project turns the tactile sound of a rainstick into generative, audio-reactive environments.
Using TouchDesigner and Teachable Machine, I explored multiple visualizations of rain: particles, ripples, surface deformation, and atmospheric flows. Rather than settling on a single outcome, the system remains iterative: each change in intensity becomes a new way to test how sound can generate presence.