Live Event: The Room

The Room

The Room is a mystical place. Inspired by artist Yayoi Kusama’s Infinity Mirror Rooms, as well as the feeling you get when you enter a carnival mirror funhouse, it is meant to transport you to a different world where you can get lost forever – or at least for the length of the song you choose.

Looking at it from the outside, you can see yourself. All four sides of the cube, plus its top, are mirrors, seemingly creating a warp in time, a warp in space. When you walk into the room, you will see nothing other than a faint light coming from the media glass in front of you, and perhaps the projector light hitting one of the four walls. You might even catch your reflection in the mirrored walls and ceiling surrounding you. You will then think of a song you want to hear and say it out loud. A few moments will pass, and suddenly you will be in an infinite space: your song of choice playing, the visuals in front of you reflecting endlessly around you. Magic.

My classmate Harry Wilde-Greer helped me bring my project to life momentarily. The visual might be a little absurd, but it also reflects how I hope the room is able to make one feel.


Of course, looking at it from a technical standpoint, it won’t be magic – there will be very logical reasoning behind how everything (visuals-wise) happens.

Now, due to the coronavirus situation happening in the world at the moment, this project could not be realized in real life. I missed out on the opportunity to create Vectorworks renderings of the room’s dimensions, as well as projector/sound/microphone system diagrams and whatnot – there was at once not enough time and too much time in the world. I also missed out on the opportunity of being told, “No, this is a bit too ambitious, come back down to the real world.” This situation has, at least, let my imagination run wild and create a space I hope to realize one day, even if I have not yet figured out how I’m going to build it. I’ll keep coming back and revising this project until the day I can.

Now it’s time to talk TouchDesigner!

TouchDesigner File

Although I have been posting different iterations of this file along the way, showing how I have been creating my visuals, it is time to go more in-depth!

Full Network

My full TouchDesigner Network lives in just one container. Inside this container, the network can be broken up into three distinct “areas”.

  • Cyan – Audio
    • This area contains nodes pertaining to audio manipulation/reactivity. It sets up the way the shapes react to music.
  • Peach – Shapes
    • This area contains nodes pertaining to Geometry setup. It is tied to the way the visuals look.
  • Blue – Visual Aesthetic
    • This area contains nodes that create the overall look of the visuals.


Taking a closer look at the Audio section, this can be further broken down into three parts.

  • Cyan
    • These two nodes are how the audio is input into TouchDesigner. As mentioned in a previous post, with the node Audio Device In, I am using a program called BlackHole that is feeding any audio I play on my laptop into TouchDesigner. The Audio Device Out node is how I hear what is playing.
  • Yellow
    • These two nodes are Audio Filters. They let me add a lowpass and a highpass filter in order to isolate the lows and highs of the incoming sound for later use.
  • Red
    • These nodes just help clean up the audio a bit more so the sound isn’t as erratic and I can get a smoother audio reaction.
  • Peach
    • These nodes are how the visuals fade to black when nothing is playing and appear when a song starts to play! More about that below.
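Outside of TouchDesigner, the lowpass/highpass split in the yellow section can be sketched as a simple one-pole filter. This is just a conceptual stand-in for what the Audio Filter CHOPs do – the smoothing amount (`alpha`) and the sample values are illustrative, not taken from my network:

```python
def one_pole_lowpass(samples, alpha=0.1):
    """Smooth the signal, keeping the slow-moving lows and dulling the highs."""
    out = []
    prev = 0.0
    for s in samples:
        prev = prev + alpha * (s - prev)  # drift a little toward each new sample
        out.append(prev)
    return out

def one_pole_highpass(samples, alpha=0.1):
    """The highs are simply what is left over after removing the lows."""
    lows = one_pole_lowpass(samples, alpha)
    return [s - l for s, l in zip(samples, lows)]
```

Feeding a rapidly alternating signal through these shows the split: the lowpass output stays near zero while the highpass output passes the signal almost untouched, which is exactly the kind of isolation I use to make the lows and highs drive different parts of the visuals.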

The code in the panel on the left looks at the values the Analyze node is outputting. For reasons I haven’t quite figured out yet, when a song is not playing, the values being output are in the 0–0.99 range; when a song is playing, the values are above 1. My code looks for those two ranges. On the right, I have a Constant CHOP attached to a Math, attached to a Filter, whose values then affect the Alpha value of a Constant TOP. When the Alpha value of that Constant TOP is 0, my visuals are visible; when the Alpha value is 1, a black ‘constant’ appears and gives the fade-out effect. My code tells my Constant CHOP whether to be 1 or 0 depending on the values of the Analyze node, which in turn allows the Constant TOP to be visible or not.
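Stripped of the TouchDesigner plumbing, the thresholding my script performs boils down to something like this (the 1.0 cutoff comes from the value ranges I observed; the function name is my own for illustration, not a TouchDesigner API):

```python
def black_layer_alpha(analyze_value):
    """Return the Alpha for the black Constant TOP based on the Analyze output.

    Below 1.0 nothing is playing, so the black layer goes fully opaque (1)
    and the visuals fade out; at 1.0 or above a song is playing, so the
    black layer goes transparent (0) and the visuals show through.
    """
    return 0.0 if analyze_value >= 1.0 else 1.0
```

In the actual network, this 0/1 value doesn’t jump instantly to the Alpha; it passes through the Math and Filter CHOPs first, and the Filter’s smoothing is what turns the hard switch into a gentle fade.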

I want to say that I wish I had had the opportunity to show off more of the coding skills that I have been working on all semester but I am happy I was able to code this nonetheless!


The following nodes set up the base of my visuals. They are what the audio affects.

  • Cyan
    • These nodes are how my background “audio color spectrum” is created. I have a Sphere, which is twisted into a tube and then made bigger by a Transform. It is then brought into the first Geometry where, in the Instancing tab, its X/Y position is manipulated by the audio.
  • Red
    • These two nodes create the big shape the other shapes are instanced into. They consist of three shapes (circle, rectangle, box) fed into a Switch, which in turn is driven by a Delay in order to ‘switch’ between the different shapes. I am then taking the XYZ attributes of the shape using a SOP to CHOP in order to manipulate the XY position of the second Geometry in its Instancing tab.
  • Fuchsia
    • These nodes are the shapes being instanced inside the other shape. Like the others, they are connected to a Switch and, in turn, instanced in the Geo controlled by the shape attributes and the audio.
  • Yellow
    • These are the nodes that control the change of the shapes in their respective Switch.
  • Peach
    • These nodes set up the visibility of my shapes. Without them, they do not exist in space. The top Geometry node contains my background visuals. The Camera node allows the shapes to be viewed. The bottom Geometry node contains my foreground visuals. In the Render, I changed it so that instead of rendering my Geos as one thing, one is layered on top of the other, acting separately.
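The Delay-driven Switch in the yellow section can be approximated outside TouchDesigner as cycling an index on a clock. The hold time here is a placeholder for illustration; only the three shape names come from my network:

```python
SHAPES = ["circle", "rectangle", "box"]  # the three inputs feeding the Switch

def switch_index(elapsed_seconds, hold_time=5.0, count=len(SHAPES)):
    """Return which Switch input is active after `elapsed_seconds`.

    Each shape is held for `hold_time` seconds before moving on to the
    next one, wrapping back around to the first shape at the end.
    """
    return int(elapsed_seconds // hold_time) % count
```

So with a five-second hold, the circle shows first, the rectangle takes over at five seconds, the box at ten, and the cycle wraps back to the circle at fifteen – the same round-robin feel the Delay gives the Switch in the network.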


The ‘Visuals’ section can be split up into two distinct sections: The background, and the foreground.

  • Cyan
    • These nodes make up the Background visual. The background visual is the colorful audio-reactive spectrum.
  • Fuchsia
    • These nodes make up the Foreground visual. The foreground visual is the shapes that appear and change with the audio.

Looking closer at the setup of the Background visuals, it can be broken down further.

  • Blue
    • This feedback loop distorts the way the twisted Sphere from before looks. As it stretches and changes with the music, it creates a ghosting feedback effect that is captured in the look.
  • Red
    • These nodes give us the actual color. The first Noise gives color to what was present in the feedback loop. The second Noise works as a background for the first, giving it more dimension. The first Noise moves constantly along the X-axis while the second moves constantly along the Y-axis.
  • Green
    • These nodes duplicate the visual and mirror the copies opposite each other.
  • Red
    • These nodes rotate the visual slowly so it is not static all the time.
  • Blue
    • These nodes create another Feedback loop using a Blur.
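Both feedback loops boil down to the same per-frame recipe: the new output is the fresh render plus a decayed copy of the previous output. A toy one-number-per-frame version of that recipe (the decay value is made up for illustration, and a real Feedback TOP works on whole images, not single numbers):

```python
def run_feedback(frames, decay=0.8):
    """Accumulate a feedback trail: each output frame keeps a faded
    ghost of everything that came before it."""
    out = []
    prev = 0.0
    for f in frames:
        prev = f + decay * prev  # fresh input plus a faded copy of the last output
        out.append(prev)
    return out
```

Feed in a single bright flash followed by darkness and you can watch the trail die away frame by frame – that lingering tail is what reads as ghosting in the visuals, and running the result through a Blur softens the tail even more.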

The Future

When it comes to art, it is never completely finished. For now, I will call this The Room Vol. 1. Here are a couple of things I want to change in the next iterations:

  • When it comes to the magic of the person speaking their song choice out loud, the current plan is that I search for the song on YouTube or Spotify and play it myself. I thought about having a desktop where they physically type in the song and watch it play, but I like the idea of it happening behind the scenes. Maybe once I get better at coding, I can mess around with the Spotify/YouTube APIs and create an interface where people really do just say their song out loud and, like Shazam, TouchDesigner is able to play it.
  • I want to have more control over the colors in the background. It took me a while to create the noise in a way I wanted it to look but I definitely want even more control over it.
  • I want to put the space into a game engine like Unreal or Unity where you can walk around it using a VR headset. Unfortunately, my friends who know how to use these programs were busy with their own gaming finals and sadly did not have time to help.
  • I am not sure if creating a room like this is actually possible, but I would like to try to bring it to life eventually. I want to create all the paperwork that goes along with it. I greatly look forward to it.


Now, my ultimate goal for this project was to demonstrate my understanding of how the creation of real-time interactive visualizations plays an impactful role in audience engagement when experiencing live entertainment. The way I was going to show this was by gauging people’s reactions once they stepped out of the live event/the room. But of course, I was unable to do this.

I did show these visuals to friends and told them what I was hoping to achieve. They told me that it sounded cool and that the visuals were engaging and looked awesome, but I think there is a definite difference between viewing the visuals on a computer and being in a physical space where they surround you. The way the visuals are presented plays a huge role in the way the audience perceives them. Although I am happy that my friends liked what I created, you can’t easily replicate the immersive nature this space was intended to have. There is something very mesmerizing and appealing about audio and visuals in sync; it is what draws me specifically to this work, and I believe it is a huge aspect of what creates an effective immersive space. It is why, when you come out of a concert or even a movie, real life seems jarring: two of your senses were entrapped in a different space, one that made you forget where you were for a short while.

I have read many articles about the future of the live entertainment industry after COVID-19, and while many of them seem kind of bleak, it is going to be very fascinating to see how these spaces are reimagined. I think now more than ever there will be a greater push for real-time interactive visualizations, as audiences are going to want to be engaged without touching anything or being in a crowded area. After all this, we are all going to want to be transported into spaces where we can forget what is happening in the world for a while.

Download the TouchDesigner File here.