
The falling rain particles are monitored in real time: each collision with the ground triggers a ripple at the point of contact, creating a generative texture that simulates rain disturbing a reflective water surface.
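A minimal sketch of how such a collision trigger can be scripted inside TouchDesigner, assuming the particle positions are exposed as a CHOP and the ripple renderer reads its centres from a table; the operator names (particles_pos, ripple_centers) and the ground height are placeholders, not the actual network:

```python
# Execute DAT callback, run once per frame inside TouchDesigner.
# 'particles_pos' is assumed to be a CHOP with tx/ty/tz channels (one sample
# per particle); 'ripple_centers' is a Table DAT the ripple renderer reads.

GROUND_Y = 0.0      # height of the reflective water plane
was_above = {}      # particle index -> was it above the plane last frame?

def onFrameStart(frame):
    pos = op('particles_pos')
    ripples = op('ripple_centers')

    for i in range(pos.numSamples):
        above = pos['ty'][i] > GROUND_Y

        # A particle that was above the plane and is now at or below it has
        # just landed: record a ripple at its (x, z) contact point.
        if was_above.get(i, True) and not above:
            ripples.appendRow([pos['tx'][i], pos['tz'][i], absTime.seconds])

        was_above[i] = above

    # Keep the table from growing without bound (assumes no header row).
    while ripples.numRows > 200:
        ripples.deleteRow(0)
    return
```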


These visuals explore different ways of expressing rainfall through sound-driven generative design.
Using TouchDesigner, I tested various techniques—from particle systems and video textures to noise-based simulations and feedback distortion. Each scene responds to the rainstick’s sound input, creating a unique visual atmosphere that ranges from calm ripples to abstract storms.
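As a rough illustration of the wiring, an analysed input level (for example, RMS from an Analyze CHOP fed by the microphone) can be pushed into a set of control channels that each scene reads from; the operator name scene_controls and the scaling factors below are illustrative assumptions rather than the exact values used in the project.

```python
# CHOP Execute DAT callback, assumed to be attached to an analysis CHOP
# (e.g. Audio Device In -> Analyze set to RMS power).
# 'scene_controls' is a placeholder Constant CHOP whose channels the
# individual scenes export from.

def onValueChange(channel, sampleIndex, val, prev):
    level = max(0.0, min(1.0, val * 4.0))    # rough normalisation of the RMS value

    ctrl = op('scene_controls')
    ctrl.par.value0 = level                  # e.g. particle birth rate
    ctrl.par.value1 = 0.2 + 0.8 * level      # e.g. noise amplitude / turbulence
    ctrl.par.value2 = 0.3 + 0.6 * level      # e.g. feedback mix for the distortion scene
    return
```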


By recording the varying sounds and frequencies produced through different tilt angles of the rainstick, I captured four distinct states of rain: ambient background, light rain, moderate rain, and heavy rain.
These audio samples were used to train a classification model in Teachable Machine, and the resulting model was connected to TouchDesigner. Through real-time microphone input, the system detects the current rain type and drives corresponding changes in the visual elements.
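Teachable Machine's audio models typically run in the browser (TensorFlow.js), so one common way to connect them to TouchDesigner is to forward the winning class label over OSC and receive it with an OSC In DAT. The sketch below shows that receiving side; the OSC address, class labels, and the scene_switch operator are assumptions for illustration, not the project's actual setup.

```python
# OSC In DAT callback in TouchDesigner. A small browser/Node bridge running
# the exported Teachable Machine audio model is assumed to send its top
# prediction as an OSC message such as: /rain/class "light_rain".

SCENES = {
    'ambient': 0,
    'light_rain': 1,
    'moderate_rain': 2,
    'heavy_rain': 3,
}

def onReceiveOSC(dat, rowIndex, message, bytes, timeStamp, address, args, peer):
    if address == '/rain/class' and args:
        index = SCENES.get(str(args[0]))
        if index is not None:
            # Pick the matching scene; any cross-fade happens downstream.
            op('scene_switch').par.index = index
    return
```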


This is an interactive visual exploration that transforms the tactile sound of a rainstick into generative, audio-reactive environments. Inspired by ambient rain simulators and ritual instruments, this project investigates how bodily sound input can guide and shape visual forms.
Working with TouchDesigner and Teachable Machine, I experimented with multiple ways to visualize rain—from particles and ripples to surface deformations and abstract atmospheric flows. Rather than arriving at a single result, the system became a space for continuous iteration—where each rainfall intensity offered a new opportunity to test, observe, and feel how sound could generate presence.