Goals
– Using Cycle 1's setup and extending it into Isadora for manipulation
– Testing and understanding the connection between Isadora and the OBS Virtual Camera
– Testing prerecorded video of paintings and live-streamed Tilt Brush paintings in Isadora
– Moving to a larger space for position-sensitive tracking through Isadora's OpenNI Tracker
– Projection mapping
Challenges and Solutions
– macOS Catalina doesn't work with Syphon, so I had to use the OBS Virtual Camera in Isadora instead.
– Not having a live body to test motion tracking and pinpointing specific locations required going back and forth. I wouldn't be able to do this in a very large space, but in my smaller space I put my Isadora patch on the projection, showing half the output and half the patch, so I could see what was firing and what the projection looked like at the same time.
– Understanding the difference between the blob and skeleton trackers, and which one I actually needed, took a while. I spent a lot of time on the blob tracker before realizing the skeleton tracker was what I needed in the end (see the sketch after this list).
– I realized the headset will need more light to track if I'm going to use it live.
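To make the blob-versus-skeleton distinction concrete, here is a minimal sketch in plain Python of the kind of data each style of tracker reports. These classes and names are made up for illustration, not Isadora's actual actors or outputs; the point is that a blob is anonymous, while a skeleton gives labeled joints I can route to triggers.

```python
# Conceptual sketch only: hypothetical data shapes, not Isadora's real API.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Blob:
    # A blob tracker only reports "something is moving here": an anonymous
    # centroid and size, with no idea which body part it belongs to.
    center_x: float  # normalized 0-1 across the camera frame
    center_y: float
    area: float

@dataclass
class Joint:
    # A skeleton tracker labels body parts, so a patch can ask specifically
    # where the torso (or a hand, or the head) is.
    name: str        # e.g. "torso", "left_hand"
    x: float         # normalized 0-1
    y: float

def performer_x(joints: List[Joint]) -> Optional[float]:
    """Return the torso's horizontal position, the value I'd route to triggers."""
    for joint in joints:
        if joint.name == "torso":
            return joint.x
    return None
```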
Looking Ahead
The final product of this goal wasn't finished for my presentation, but I completed it this week, which brought up some important choices I need to make. In my small space, if I'm standing in front of the projection it is very hard to see whether I'm affecting it because of my shadow, so either the projection needs to be large enough to see over my head or my costume needs to be able to show the projection.
I am also considering a reveal, where the feed is mixed up (pre-recorded, live, or a mix – I haven't decided yet) and, as I traverse from left to right, the paintings begin to show up in the right order (possibly right to left, the reverse of what I'm doing). Instead of audience participation, I'm thinking of having this performer-triggered: my own tracked position triggering the shift in content perhaps 3–4 times before it settles into the live feed. Once I get to the other side, it is a full reveal of the live feed coming from my headset. This will be tricky, as the headset needs more light to track than the projection provides, which is one reason I switched to using movies in my testing: I didn't have proper lighting that would let the headset track me while still keeping the projection visible. I am also considering triggering the height of the mapped projection panel (like Kenny's animation from class) and revealing what is behind it that way, although I do want to keep the fade in and out.
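To pin down the trigger logic before building it as an Isadora patch, here is a rough Python sketch under assumptions I've made up for illustration: the tracker gives a normalized left-to-right position (0 at stage left, 1 at stage right), the trigger points and source names are placeholders, and the final zone locks the output onto the live headset feed.

```python
# Illustrative only: placeholder trigger positions and source names, not the
# real patch. Position x is assumed normalized 0-1, left to right.
TRIGGER_POINTS = [0.2, 0.4, 0.6, 0.8]  # 3-4 trigger zones across the stage
SOURCES = ["scramble_1", "scramble_2", "scramble_3", "scramble_4", "live_feed"]

class Reveal:
    def __init__(self) -> None:
        self.stage = 0       # how many triggers have fired so far
        self.locked = False  # once True, stay on the live feed for good

    def update(self, x: float) -> str:
        """Feed in the performer's x-position; returns which source to show."""
        if not self.locked:
            # Advance past every trigger point the performer has crossed.
            while self.stage < len(TRIGGER_POINTS) and x >= TRIGGER_POINTS[self.stage]:
                self.stage += 1
            if self.stage == len(TRIGGER_POINTS):
                self.locked = True  # full reveal: live feed from the headset
        return SOURCES[min(self.stage, len(SOURCES) - 1)]
```

Inside Isadora this would presumably become comparison and trigger logic feeding whatever switches the video sources, with the fade in and out handled at the mix; writing it out first mostly helped me count how many trigger zones and content states I actually need.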