Dream Sequence is a visual reconstruction of two recurring childhood dreams, rebuilt in Blender and transformed through AI-driven hallucination.
The project explores how dream logic—fragmented space, symbolic objects, and unstable perspective—can be translated into a hybrid workflow combining 3D modeling, volumetric lighting, and AI-generated alterations.
Dream A

Dream B

AI Previsualization
I began by translating each dream into descriptive prompts in Midjourney, generating multiple visual directions to explore atmosphere, lighting, scale, and emotional tone.


Dream A Prompt
First-person POV climbing a soaked concrete staircase on the side of a massive dam, looking upward into the storm. Two colossal dam walls tower on both sides, vanishing into lightning-lit clouds. Torrential rain slants down, streaming along the steps and dripping from the metal railings, which gleam dimly. Wet rough concrete textures glisten in brief flashes of light. Wind-driven mist swirls in the air, cinematic blue-gray tones, high contrast, ultra-realistic atmosphere. Created Using: RED Komodo 6K, storm cinematography rig, wet surface PBR shaders, volumetric mist simulation, lightning HDR pass, immersive low-angle lens, hyperreal rain particle effects, industrial texture pack

Dream B Prompt
A surreal first-person space perspective, looking upward from beneath a massive desaturated gray planetary object. The sphere looms overhead, filling almost the entire screen, its surface covered in glitching mosaic and digital decay textures. Only faint light outlines its curved underside. Scene is silent, blurred, as if trembling. Hyperreal dreamcore mood, Canon 35mm analog tone, pixel noise overlay, soft shadows, overwhelming visual pressure
3D Reconstruction
Using the selected AI references, I rebuilt the core environment in Blender.
I blocked out major shapes, set the spatial proportions, established lighting, and added volumetric fog to recreate the dream’s emptiness and tension.
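Below is a minimal Blender Python (bpy) sketch of how a block-out like this could be scripted: two dam walls, a staircase slab, a cold key light, and world-level volumetric fog. All object proportions, light settings, and density values here are illustrative assumptions, not the exact settings used in the project.

```python
import bpy

# Clear the default scene
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# Block out the two colossal dam walls as scaled cubes (proportions are illustrative)
for x in (-12, 12):
    bpy.ops.mesh.primitive_cube_add(location=(x, 0, 25))
    wall = bpy.context.active_object
    wall.scale = (2, 30, 25)

# Staircase slab rising between the walls
bpy.ops.mesh.primitive_cube_add(location=(0, 0, 10))
stairs = bpy.context.active_object
stairs.scale = (4, 30, 0.5)
stairs.rotation_euler = (0.35, 0, 0)  # rough incline

# Cold, high-contrast key light for the storm tone
bpy.ops.object.light_add(type='SUN', location=(0, -20, 60))
sun = bpy.context.active_object
sun.data.energy = 3.0
sun.data.color = (0.75, 0.82, 1.0)  # blue-gray

# World volumetrics: a Principled Volume node as global fog
world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
fog = nodes.new('ShaderNodeVolumePrincipled')
fog.inputs['Density'].default_value = 0.02  # thin, pervasive mist
world.node_tree.links.new(fog.outputs['Volume'],
                          nodes['World Output'].inputs['Volume'])
```

Running this in Blender's scripting workspace produces a rough stand-in scene whose proportions and fog density can then be adjusted by eye against the AI references.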


AI Transformation — ComfyUI & Runway
The Blender render sequence was then processed through ComfyUI and Runway to introduce dream-like texture shifts and lighting variations.
Using depth-guided models, stylization modules, and controlled noise levels, I altered edges, surface materials, and ambient light in a way that enhanced instability without distorting the underlying scene.
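As a hedged illustration of the depth-guided step, the sketch below reproduces the same idea with the diffusers library rather than the actual ComfyUI graph or Runway settings: a depth ControlNet keeps the Blender geometry locked while a low denoising strength shifts surface texture and ambient light. The model IDs, file names, and parameter values are assumptions for demonstration only.

```python
import torch
from PIL import Image
from diffusers import StableDiffusionControlNetImg2ImgPipeline, ControlNetModel

# Depth-guided ControlNet constrains the stylization to the rendered geometry
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/control_v11f1p_sd15_depth", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

frame = Image.open("blender_render_0001.png")  # Blender beauty pass (hypothetical file)
depth = Image.open("blender_depth_0001.png")   # matching depth pass (hypothetical file)

result = pipe(
    prompt="storm-lit concrete dam, wet surfaces, volumetric mist, dreamlike",
    image=frame,                        # source render to be re-textured
    control_image=depth,                # depth pass preserves the underlying scene
    strength=0.35,                      # low denoise: alter materials, keep structure
    controlnet_conditioning_scale=1.0,
    guidance_scale=6.5,
).images[0]

result.save("dream_frame_0001.png")
```

Keeping the denoising strength low while weighting the depth condition fully is what lets the pass introduce instability at edges and in lighting without breaking the spatial logic established in Blender.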