Portfolio
Perlin Noise 4D
Experiment in Four-Dimensional Perception
What would a four-dimensional object look like to us? This is not a new question, but most attempts to answer it approach it from the outside — unfolding 4D geometry into a form our eyes can parse, the way you might unfold a cube into a cross to understand its structure. The tesseract is the canonical example: a familiar, elegant solution that lets us reason about four dimensions without ever asking us to perceive them.
This work asks something different. Not how does 4D geometry look when projected for our convenience, but how would a four-dimensional object actually appear to a three-dimensional observer — from the inside, without accommodation, without unfolding.
The Setup
Consider two objects sharing the same position in XYZ space. They occupy the same room, the same point. To every spatial sense available to us, they are identical. But one of them sits further along the fourth axis, W. Their actual distance — the true Euclidean distance including W — is different:
length(x, y, z, w₁) ≠ length(x, y, z, w₂)
The spatial distance gives us nothing. What else might?
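The point can be made concrete with a minimal sketch in plain Python (not the shader code itself): two objects identical in every spatial coordinate still differ in their true 4D distance.

```python
import math

def length4(p):
    # Euclidean norm in 4D: sqrt(x^2 + y^2 + z^2 + w^2)
    return math.sqrt(sum(c * c for c in p))

a = (1.0, 2.0, 3.0, 0.0)   # w = 0
b = (1.0, 2.0, 3.0, 2.0)   # same xyz, w = 2

# Identical in every spatial coordinate...
assert a[:3] == b[:3]
# ...yet their true 4D distances from the origin differ.
print(length4(a))  # sqrt(14), about 3.742
print(length4(b))  # sqrt(18), about 4.243
```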
The Architectural Decision
The key technical and philosophical decision in this work concerns the order of operations in the rendering pipeline.
In conventional 4D visualization — the tesseract approach — stereographic projection is applied before the view matrix. This maps four-dimensional structure into three-dimensional space first, then positions it in the scene. The result is legible 4D geometry: you can see the shape, trace the edges, understand the form.
Here, the order is reversed. The view matrix is applied first, placing objects in camera space, and only then is stereographic projection applied. The consequence is precise: objects that share XYZ coordinates but differ in W are collapsed onto the same screen position. There is no spatial separation. You cannot see where one ends and the other begins.
This is not a limitation. It is the premise. The work does not try to show you where W is. It asks what you would feel in its presence.
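The collapse can be verified numerically. The sketch below assumes one common form of stereographic projection, xyz scaled by 1/(1 - w), and a pinhole camera at the origin of camera space; the work's exact projection constants may differ, but the geometric argument is the same: after the view transform, the projection scales each point along its own ray from the camera, leaving the screen position unchanged.

```python
def stereographic_3d(p4):
    # Stereographic projection R^4 -> R^3 (one common form):
    # scale xyz by 1 / (1 - w). Direction from the origin is preserved.
    x, y, z, w = p4
    s = 1.0 / (1.0 - w)
    return (x * s, y * s, z * s)

def screen(p3):
    # Pinhole projection with the camera at the origin of camera space.
    x, y, z = p3
    return (x / z, y / z)

# Two camera-space points: same xyz, different w.
a = (0.5, 0.25, 2.0, 0.0)
b = (0.5, 0.25, 2.0, 0.6)

# Projecting AFTER the view transform scales each point along its own
# ray from the camera, so both land on the same screen position.
print(screen(stereographic_3d(a)))  # (0.25, 0.125)
print(screen(stereographic_3d(b)))  # (0.25, 0.125)
```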
The Line
A four-dimensional field rendered as hard geometric lines would risk feeling like a diagram — precise, readable, analytical. The intention here is the opposite.
In most rendering pipelines, organic line geometry — with rounded caps, variable width, and soft edges — is handled by a geometry shader, which can expand a simple line primitive into a triangle strip on the fly. WebGL does not support geometry shaders.
The solution is to construct the line entirely within the vertex shader, using gl_VertexID to address a hand-designed vertex template — sixteen vertices per instance, encoding the body of the line and both caps as triangle strip geometry. Each vertex knows its role: edge, cap tip, or degenerate break. The rounding of the caps and the soft falloff along the line body are handled in the fragment shader through distance fields.
The result is a line that behaves more like light than geometry — it bleeds at the edges, tapers at the ends, and accumulates where lines overlap.
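The vertex-template idea can be illustrated with a toy decoder in Python. The 16-vertex layout below is invented for illustration (eight rings of two vertices, with the first and last rings acting as caps); the work's actual template, including its degenerate-break vertices, is not reproduced here.

```python
def decode_vertex(vertex_id):
    # Hypothetical triangle-strip layout: 8 "rings" of 2 vertices each,
    # walking from the start-cap tip through the body to the end-cap tip.
    ring = vertex_id // 2                        # 0..7 along the line
    side = -1.0 if vertex_id % 2 == 0 else 1.0   # left/right of the axis
    t = ring / 7.0                               # normalized position along the line
    is_cap = ring in (0, 7)                      # cap rings are rounded in the fragment shader
    return t, side, is_cap

# One instance's template, as a vertex shader would see it via gl_VertexID.
template = [decode_vertex(v) for v in range(16)]
print(template[0])   # (0.0, -1.0, True)  start cap, left side
print(template[15])  # (1.0, 1.0, True)   end cap, right side
```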
The Perceptual Channels
If position cannot reveal W, something else must. Three perceptual channels are used, each driven by the true 4D distance:
Focus. A camera lens does not focus on XYZ coordinates — it focuses on distance. When actual distance includes W, objects at the same spatial position may fall at entirely different depths of field. One is sharp. Another is soft. The blur is not decorative; it is the lens honestly reporting what it measures.
Transparency. Alpha falls off with 4D distance. An object further along W is not spatially further away, but it is perceptually dimmer — present but attenuated, as if seen through something that is not quite there.
Color. A palette mapped to the W coordinate gives each object a chromatic signature of its position in the fourth dimension. Two objects overlapping in space but separated in W will carry different colors, bleeding into each other at the overlap — an interference pattern of dimensional difference.
The color shift is also a viewer's guide. Objects sharing a similar hue are close in W — they inhabit the same region of the fourth dimension. Diverging colors signal W separation. Where colors bleed into each other at the same spatial point, the viewer is seeing two four-dimensionally distant objects occupying the same three-dimensional location — the very condition the work sets out to make visible.
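The three channels can be sketched as simple functions of the true 4D distance and the W coordinate. The specific curves below (linear blur growth, exponential alpha falloff, linear hue mapping) and all constants are illustrative assumptions, not the work's actual tuning.

```python
import math

def channels(d4, w, focus_distance=3.0, w_span=2.0):
    blur = abs(d4 - focus_distance)   # focus misses by the TRUE 4D distance
    alpha = math.exp(-0.5 * d4)       # transparency falls off with 4D distance
    hue = (w / w_span) % 1.0          # color keyed to the W coordinate
    return blur, alpha, hue

# Two objects at the same spatial point, separated only in W.
near = channels(d4=3.0, w=0.0)   # at the focal plane: sharp, brighter
far  = channels(d4=3.6, w=1.2)   # further in 4D: softer, dimmer, shifted hue
print(near)
print(far)
```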
The Field
The objects themselves are not placed manually. They emerge from a four-dimensional Perlin noise field, sampled across a grid of 250 × 250 instances. Each instance is a line segment connecting two points in 4D space, animated continuously through time. The field breathes — expanding, contracting, shifting — as the camera's own W coordinate oscillates, moving the observer gently through the fourth dimension.
The result is not a diagram. It is an environment: a space in which four-dimensional presence is felt rather than read, perceived through softness and color and the quiet insistence of things that are here but not quite here.
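A toy sketch of the field: a grid of instances, each a short segment in 4D shaped by a noise function, with the camera's W oscillating over time. The hash-style noise4 below is a cheap stand-in for true 4D Perlin noise, and every constant is an illustrative assumption.

```python
import math

def noise4(x, y, z, w):
    # Cheap hash-style stand-in for 4D Perlin noise (illustration only).
    return math.sin(x * 12.9898 + y * 78.233 + z * 37.719 + w * 4.581)

N = 250  # grid resolution per side, as in the work

def segment(i, j, time, camera_w):
    # One instance: a line segment between two points in 4D space.
    x, y = i / N, j / N
    a = (x, y, noise4(x, y, 0.0, time), camera_w + noise4(x, y, 1.0, time))
    b = (x, y, noise4(x, y, 2.0, time), camera_w + noise4(x, y, 3.0, time))
    return a, b

# The camera's own W oscillates, moving the observer through the field.
camera_w = 0.3 * math.sin(0.0)
field = [segment(i, j, 0.0, camera_w) for i in range(N) for j in range(N)]
print(len(field))  # 62500 instances
```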
Rendered in WebGL using Three.js. Geometry generated procedurally in the vertex shader via stereographic projection from 4D noise space.
Click or tap to transform the noise field!
Voronoise 4D
Motion Through Hyperspace
What happens when a four-dimensional field is allowed to evolve freely, and we observe only its projection?
Rather than placing objects manually, structure here emerges from Voronoise, a four-dimensional noise function originally developed by Inigo Quilez, sampled across a grid of instances. Each instance is a line segment in 4D space. As the field evolves, these segments shift, rotate, and stretch, forming patterns that briefly resemble tesseract fragments before dissolving again.
The simulation runs entirely on the GPU using a ping-pong render target scheme. The transform-feedback-style approach allows smooth, continuous motion across thousands of instances without any CPU involvement.
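The ping-pong idea can be sketched in plain Python, with lists standing in for the two render targets: each frame reads from one buffer, writes to the other, then swaps the roles. The update rule here is a placeholder.

```python
def step(read_buf, write_buf):
    # One simulation step: read last frame's state, write this frame's.
    for i, v in enumerate(read_buf):
        write_buf[i] = v + 0.1  # placeholder update rule
    return write_buf

# Two buffers standing in for two render targets.
buf_a = [0.0] * 4
buf_b = [0.0] * 4
src, dst = buf_a, buf_b

for frame in range(3):
    step(src, dst)
    src, dst = dst, src  # swap: this frame's output feeds the next frame

print(src[0])  # about 0.3 after three steps
```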
Each line is drawn using instanced rendering: a single vertex template is stamped across the entire field in one draw call, with the vertex shader resolving each instance's position, orientation, and length from the stored textures. Motion blur is not a post-processing effect; the geometry itself stretches between past and present positions, so faster regions streak and slower ones condense into stable clusters.
What appears on screen is not a single object but a continuously shifting slice of a higher-dimensional field — the shadows of a structure that cannot fully exist in three-dimensional space.
Rendered in WebGL using Three.js. Simulation and geometry executed entirely on the GPU. Voronoise function originally developed by Inigo Quilez.
Velarium
Velarium explores the interplay between harpsichord resonance and contemporary digital imagery in a reciprocal dialogue where sound shapes image and image shapes sound. The work blends structured and improvised elements, allowing the performance to shift in response to the evolving sound-image interaction. Microscopic acoustic traces are captured and transformed through live electronics, creating a sonic environment as if the instrument's physical structure had shifted.
Real-time visuals are generated through a custom particle system of roughly 500,000 spheres, forming geometries that shimmer, blur, and drift. Through live performance and responsive electronics, Velarium constructs an imaginary acoustic space where sound becomes visible and image becomes audible.
A key aspect of the project is the three-part performance ecosystem linking acoustic sound, processed audio, and real-time visuals. Each component listens and reacts to the others: audio drives the visuals, visuals trigger or modulate electronic processes, and performers adapt their playing in response to the resulting images and electronic sound. This creates a multi-layered interaction that makes each performance unique.
Created and performed using Max/MSP/Jitter, Max for Live, Ableton Live, and a Schertler contact microphone
Harpsichord: Liubov Titarenko
Live electronics and visuals: Yu Oda
Premiered November 22, 2025
Het Orgelpark, Amsterdam
Prix Annelie de Man Festival
Vorex
Fragments of Renaissance composer John Dowland's "Burst forth my tears" form a quiet backbone to this work, appearing and dissolving among other elements as the film unfolds. The music follows the sequence of scenes yet continually departs from it, a tension that mirrors stages of change and growth.
The work is part of "Sound of View", a collaborative project between filmmaker Barbara Meter, four composers and Postland. In this project, the same film is presented four times with four newly composed live soundtracks, each offering a distinct musical interpretation of the same images.
Film: Barbara Meter - "Voorbij de Wegen"
Music: Yu Oda - "Vorex"
Performed by Postland
Premiered at Gaudeamus Festival
September 14, 2025
TivoliVredenburg, Utrecht
May Remain
Electroacoustic work for Positive Organ, vibraphone, and electronics.
The detailed acoustic sounds of the organ and the vibraphone are captured with various types of microphones (contact, condenser, and pickup system) and processed with analog electronics (effect pedals and modular synth).
Co-commissioned by Gaudeamus Muziekweek and Het Orgelpark and premiered in 2021. Recorded during the concert at Het Orgelpark on November 5th, 2022.
Performed by Plastiklova:
Vibraphone: Laurent Warnier (percussionist)
Organ & Electronics: Yu Oda (composer)
Route to Mute
A composition showcasing the art of deconstruction, arrangement, and re-creation, part of the "Dowland Series" that draws on existing material from the works of John Dowland. The piece is a nod to the Renaissance song "If my complaints could passions move." Through this series, the composer aims to reinvigorate and reinterpret these classic works by reworking them in unique ways; "Route to Mute" offers a glimpse into the creative process behind the series and the composer's own spin on the original material.
Performed by New European Ensemble
Premiered on November 18th, 2022
Nieuwe Kerk, Den Haag
NoiseSample
Audio-reactive particle system built around the concepts of "noise" and "sample" in both its audio and visual parts. Presented as a half-improvised live performance, the work is about experiencing the process of finding a fine balance between those concepts.
Premiered at the ISCM World New Music Days 2023
Johannesburg, South Africa.
Made with Max Jitter and Ableton Live
Orthorealm – Violin Excerpts
The performance lasts between 20 and 25 minutes and is presented in a concert setting designed to immerse the audience. Seated at the center, the audience is surrounded by the four musicians and four loudspeakers positioned at each corner of the space. Two projection screens face each other on opposite sides.
The visuals are designed in a four-dimensional geometric space and react dynamically to the live audio input from the musicians. Though projected onto flat screens, the images are crafted to evoke a sense of four-dimensional space through the audience's perception.
Violin - Paul Pankert / Ensemble 88
Premiered September 2024
Eupen, Belgium
Grain Threshold
Audiovisual work combining the vibraphone's electroacoustic sounds with morphing audio-reactive particles rendered using ray marching.
Music performed by Plastiklova:
Vibraphone: Laurent Warnier (percussionist)
Organ & Electronics: Yu Oda (composer)
Made with Ableton Live and Max Jitter.
The Scheme of the Sea Organ
Composition for three Paetzold recorders, bass clarinet, and percussion (two performers), inspired by the Sea Organ (Morske orgulje) in Zadar. The work explores the translation of architectural space and natural forces into musical structure.
Composed and premiered in 2010. Nominated for the Gaudeamus Prize 2011 and Toonzetters Prize 2012; awarded an honorable mention at the 32nd Irino Prize.
Base to Base
Written for David Kweksilber Big Band.
Composed and premiered in 2015
Het Bimhuis, Amsterdam
Directions of Time
Composed for Plastiklova using various vibraphone sounds and electronics, the work was created remotely during the quarantine period in 2020.
The piece was commissioned by Crazy Quarantine Sessions, an initiative supporting music-making during the COVID-19 pandemic. The music video was premiered online on May 1, 2020.
Vibraphone: Laurent Warnier
Electronics: Yu Oda
Higher-Dimensional Grid in VR
An experiment exploring the concept of a fourth spatial dimension, represented as an additional axis beyond height, width, and depth.
Rendered in 8K and best experienced with a VR headset at the highest resolution.
Created with Max Jitter and custom shaders
4D Grid and Perlin Noise
3D visualization of a higher-dimensional grid and 4D Perlin noise.
Created with Max Jitter and custom shaders.
Music: "rificia" by Plastiklova
Riakage
One of the tracks written for Plastiklova's 2022 album release.
Vibraphone: Laurent Warnier (percussionist)
Electronics: Yu Oda (composer)
Shade in Sustention
Composed in 2020 as part of the "Dowland Series," a collection inspired by the music of Renaissance composer John Dowland. This series explores the essence and meaning of composition by delving into the artistic elements and creativity that arise from deconstructing, arranging, and reimagining Dowland's original works. Based on his song "In darkness let me dwell," the piece "Shade in Sustention" seeks to capture and convey the atmosphere and emotion of the original piece through a solo cello performance.
Performed by Aki Kitajima
December 8th, 2023
Tokyo Opera City Recital Hall
Behind the Scene
A piece written for piano and toy piano in 2012 as the first work of the "Dowland Series". It is an arrangement, deconstruction, and re-creation of "Flow My Tears" by John Dowland. Premiered at the Toy Piano Summit during the Rainy Days Festival 2012 in Luxembourg.
Toy Piano: Rachel Xi Zhang
Piano: Pascal Meyer
December 16th, 2015
Intro in Situ, Maastricht
Like a Beautiful Woman with Dirty Clothes
This recording comes from the 2016 CD release of Looptail, the Amsterdam-based new music ensemble I co-founded and for which I served as artistic director.
The piece was originally composed in 2009 for eleven musicians and adapted in 2011 for Looptail's sextet instrumentation: flute, clarinet, piano, percussion, violin, and cello.
Synesthetic Fountain RGB - an immersive experience of frequencies
An immersive audiovisual installation exploring frequency as a shared structure between sound, light, and water. Created in collaboration with multidisciplinary artist Federico Murgia, the installation was built in 2020 and transforms a fountain into a synesthetic system driven by modular synthesis, water acoustics, and RGB strobe light. Through precisely controlled frequencies, all elements operate in synchronization, generating interdependent visual and sonic patterns.
Forest of One
Written for voice and cello, Forest of One features a reduced texture in which the cello is played entirely pizzicato, highlighting the instrument's warm resonance in dialogue with the human voice.
The work was composed for the 2018 theatre production FAUST™, a collaboration between Noord Nederlands Toneel and the Noord Nederlands Orkest. The text is inspired by Fernando Pessoa's poem Não sei quantas almas tenho, which reflects on the shifting and multiple nature of the self.
The interplay between voice and cello suggests an inner dialogue, echoing themes of identity, multiplicity, and solitude.
Performed by Isabel Vaz