Week 3: Audio Reactivity and Particles

February 10th

Notch Academy:

007 Particles

I had not planned on exploring particles this week, but I wanted to give myself a little sneak peek and ended up watching and following along with half of the particle video.

From there I was able to explore more aspects of it and found out how to make some cool plexus-style graphics! This part excited me greatly, as I have a huge interest in plexus systems. Last year, when I was working on a plexus graphic for a project, I tried to build it in TouchDesigner. Unfortunately, it was far outside my expertise, and I ended up using a tox file someone else had made and modifying it to suit my needs. I am greatly impressed with how easy the same look was to accomplish in Notch.
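To keep the idea in my notes, here is a rough sketch of the plexus technique outside of Notch: scatter some particles and draw a line between every pair closer than a distance threshold. This is just the general concept in Python (assuming numpy and matplotlib are installed), not how Notch implements it with its particle nodes.

```python
# Conceptual sketch of the "plexus" look: scatter points, then connect every
# pair of points that sits closer together than a threshold distance.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
points = rng.random((80, 2))          # 80 random 2D "particles" in [0, 1)
threshold = 0.15                      # max distance for a connecting line

# Pairwise distances between all particles
diffs = points[:, None, :] - points[None, :, :]
dists = np.linalg.norm(diffs, axis=-1)

fig, ax = plt.subplots(figsize=(6, 6))
ax.set_facecolor("black")
ax.scatter(points[:, 0], points[:, 1], s=10, color="white")

# Draw each close pair once (i < j avoids duplicates and self-pairs)
for i in range(len(points)):
    for j in range(i + 1, len(points)):
        if dists[i, j] < threshold:
            ax.plot(points[[i, j], 0], points[[i, j], 1],
                    color="white", linewidth=0.5, alpha=0.6)

ax.set_xticks([])
ax.set_yticks([])
plt.show()
```

In Notch this all happens inside the particle system itself, which is exactly why it felt so much easier than wiring it up by hand.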

While talking with a fellow Notch user, we discussed some things we did not find especially user-friendly in Notch. One thing that came up was how slightly annoying it is that using the Text node requires importing a font first; I wonder why Notch decided to do it that way. The second concerned how the cursor behaves when you try to resize the resources tab or the properties tab. For some reason, on some laptops it is a challenge to grab the drag handles and rearrange your workspace. We don't know whether it is a bug or just how Notch behaves on our computers, but it has been tedious.

February 11th

Audio Reactivity

Notch Tutorials – Audio Reactivity by Kev Zhu

This has been by far my favorite tutorial I have watched so far. Audio reactivity is what I have wanted to focus on when it comes to creating visuals in Notch. Kev was very easy to follow along with and took the time to explain everything he was doing. The most important thing I took from this was how to use an Audio Capture device to bring external audio into Notch. I had been wondering how to do that, especially since I want to create a project where people can come up to a computer, search for a song on YouTube, play it, and have visuals in Notch react to it. By routing audio from the program Voicemeeter into Notch, I now see how simple that is to do. Another aspect that intrigued me is how you can move through the audio as if it were a landscape, combined with the ability to add post-FX to it as if it were a video. I think this expands just how dynamic audio reactivity can be. This is a video I will keep coming back to and taking close notes on, as it has a lot of useful information.
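To make sure I remember the core idea, here is a small sketch of audio reactivity outside of Notch: grab blocks of audio from an input device (for example a Voicemeeter virtual output selected as the system input), split the spectrum into bands, and map a band's energy onto a visual parameter such as scale. This assumes the Python sounddevice package and is only an illustration of the concept, not of Notch's Audio Capture node.

```python
# Minimal audio-reactivity sketch: capture audio blocks, compute coarse
# frequency-band energies, and map the lowest band onto a "scale" value.
from typing import Optional

import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100
BLOCK_SIZE = 2048          # samples per analysis block
NUM_BANDS = 8              # coarse frequency bands, like a simple spectrum

def band_energies(block: np.ndarray) -> np.ndarray:
    """Return NUM_BANDS averaged FFT magnitudes for a mono audio block."""
    spectrum = np.abs(np.fft.rfft(block))
    bands = np.array_split(spectrum, NUM_BANDS)
    return np.array([b.mean() for b in bands])

def callback(indata, frames, time, status):
    mono = indata[:, 0]                      # take the first channel
    energies = band_energies(mono)
    bass = energies[0]
    # Map bass energy onto a 1.0-2.0 scale value, like driving a node property
    scale = 1.0 + min(bass / 50.0, 1.0)
    print(f"bass energy {bass:8.2f} -> scale {scale:.2f}")

# Uses the default input device; on my setup that would be the Voicemeeter
# virtual output carrying whatever is playing from the browser.
with sd.InputStream(channels=1, samplerate=SAMPLE_RATE,
                    blocksize=BLOCK_SIZE, callback=callback):
    sd.sleep(5000)                           # listen for five seconds
```

In Notch the equivalent mapping would presumably happen through nodes and modifiers rather than a print statement, but the flow is the same idea: audio in, band analysis, parameter out.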

February 13th

Notch Manual

Keyboard Shortcuts + Nodes

I took a bit of time to explore the manual further, and I know I said this last time, but I am seriously impressed with how in-depth it is. I have found that a lot of the answers to my questions about nodes live in the manual. I have slowly started making my way through the nodes section, exploring what each one is capable of. Honestly, I did not realize just how many nodes Notch actually has. Something that stuck with me when Sarah was talking with the class a few weeks ago was how Notch doesn't require code because the developers are trying to create all the nodes you would ever need. I think that is an interesting way to go about things, but I also wonder: will there be a point where there are simply too many nodes to handle easily?

After I watched Kev's video, I was hoping to find more videos like his. Sadly, not many exist, but I did stumble upon this video, which expanded the way I view audio reactivity. I have always had a static way of thinking that visuals can be affected by the audio, but I had never had the thought, "what if the audio was created by the visuals?" I was thoroughly impressed by the project that Brett Bolton and his team created.

I am now very intrigued to explore the Hot Zone node he is using. This node detects whether the center point of an object moves into a defined region. I believe it will serve me well once I start exploring live body tracking in Notch. In the meantime, I think I am going to spend a lot more time living in and exploring the Interactive Nodes section, as I think that is where the bulk of what I want to use Notch for lives.
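To cement what the node is doing conceptually, here is a tiny sketch of the idea in Python: check whether an object's center point sits inside a rectangular region and report when it enters or leaves. The class and names are hypothetical, just my mental model of the behavior, not Notch's actual implementation.

```python
# Hypothetical "hot zone" check: track whether a center point is inside a
# rectangle and emit an event when that state changes.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HotZone:
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    inside: bool = False   # last known state, so we can detect transitions

    def update(self, cx: float, cy: float) -> Optional[str]:
        """Return 'enter' or 'exit' when the center point crosses the zone edge."""
        now_inside = self.x_min <= cx <= self.x_max and self.y_min <= cy <= self.y_max
        event = None
        if now_inside and not self.inside:
            event = "enter"
        elif not now_inside and self.inside:
            event = "exit"
        self.inside = now_inside
        return event

zone = HotZone(0.25, 0.25, 0.75, 0.75)
for cx, cy in [(0.1, 0.1), (0.5, 0.5), (0.6, 0.4), (0.9, 0.9)]:
    event = zone.update(cx, cy)
    if event:
        print(f"object {event}s the zone at ({cx}, {cy})")
```

The interesting part in Notch, of course, is that the events can then drive anything else in the node graph, which is exactly what makes it promising for body tracking later on.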

Also, in more exciting news, I finally ordered my Notch dongle! I am excited to have the ability to save, as I have been wanting to go deeper into Notch files but have been slightly discouraged by not being able to save what I was doing. I cannot wait to start playing with it even more next week!

To see my progress in Python for this week, click here.