Cycle 3: Xylophone Hero at Fort Izzy

This Cycle I decided to focus on the physical presentation of my project. My first task was to connect my xylophone to Isadora using the Makey Makey. I wrapped foil around each key of the xylophone, then made “wires” out of foil that connected to each foiled key. Next, I remapped the back inputs of the Makey Makey to the number keys 1-8 and attached breadboard wires to the Makey Makey. I attached the Makey Makey to the inside of a shoebox and fed the breadboard wires through to the outside of the box. From there, I connected the alligator clips to the breadboard wires. To ground the mallet, I tied one end of some conductive string to the Earth input on the Makey Makey and tied the other end around the handle of the mallet, taping the extra string to the mallet. I then covered the top of the mallet in foil.
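Because the remapped Makey Makey just behaves like a USB keyboard, the wiring can be sanity-checked with a small script before opening Isadora. The sketch below is a hypothetical test (not part of the project itself) that uses the pynput library to print which remapped number each foiled key sends:

```python
# Minimal wiring test (hypothetical, not part of the Isadora patch): confirms that
# tapping each foiled xylophone key with the grounded mallet registers as its
# remapped number key (1-8). Assumes the `pynput` package is installed.
from pynput import keyboard

KEY_NAMES = {str(n): f"xylophone key {n}" for n in range(1, 9)}

def on_press(key):
    try:
        label = KEY_NAMES.get(key.char)
    except AttributeError:
        label = None  # special keys (shift, ctrl, etc.) have no .char attribute
    if label:
        print(f"Makey Makey sent '{key.char}' -> {label}")

with keyboard.Listener(on_press=on_press) as listener:
    print("Tap each key with the foiled mallet; press Ctrl+C to quit.")
    listener.join()
```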

Before my Cycle 3 presentation, I projection mapped the output of my Isadora patch onto the xylophone and set up a “blanket fort” around my projection in the Motion Lab. I also went through my patch to make sure everything was working as expected, and I added a start screen, an end screen, and a way to restart the game. I staged the fort with stuffed animals, playing cards, and other items that might be found in a children’s bedroom.

I am extremely happy with how this project turned out and I am so proud of what I produced. If I were to do more cycles, I would paint the shoebox to look more like a toy box, secure the bottom sheet to the floor, and figure out a way to store a high score or display a scoreboard. I could also figure out a way to introduce difficulty levels.


Cycle 2: Keeping Score

In between Cycle 1 and Cycle 2, I decided that my input device would be a children’s xylophone, which I would make work using the Makey Makey.

During Cycle 2, I spent most of my time working in Isadora. I wanted to make sure I had my Isadora patch working the way I wanted before I started any of the physical presentation elements. To start, I turned my four arrow shapes into rectangles, then added four more to simulate a xylophone.

Next, I added a way for the computer to tell the player which key to hit. To do this, I started by duplicating the user actors I had made for the previous rectangles. Each user actor is almost identical to the blinking arrows in Cycle 1, but has input from a Get Global Values actor inside the user actor and an Inside Range actor outside of it. The outside mechanism uses an Enter Scene Trigger, a Timed Trigger, a Trigger Value, a Pulse Generator, Activate Scene and Deactivate Scene actors, a Random actor, and a Float to Integer actor, as well as an Inside Range actor connected to each of the duplicated user actors. Both patch snippets are pictured below.
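For anyone who doesn’t read Isadora patches, here is a rough Python sketch of the same cue logic; the actor names appear as comments, and the pulse rate is a placeholder rather than the patch’s actual value:

```python
# Rough sketch of the cue mechanism (the project itself is built entirely from
# Isadora actors; the actor names in comments come from the paragraph above).
import random
import time

NUM_KEYS = 8          # eight rectangles simulating the xylophone
PULSE_INTERVAL = 1.0  # seconds between cues; the real Pulse Generator rate is not stated

def pick_cue():
    # Random actor + Float to Integer: choose which key should blink next
    return random.randint(1, NUM_KEYS)

def blink_if_in_range(key_index, cue):
    # Each duplicated user actor has an Inside Range check: it only blinks
    # when the global cue value lands on its own key number.
    if key_index == cue:
        print(f"Key {key_index}: BLINK (hit me!)")

# Enter Scene Trigger starts the loop; Ctrl+C to stop this sketch.
while True:
    cue = pick_cue()
    for key in range(1, NUM_KEYS + 1):
        blink_if_in_range(key, cue)
    time.sleep(PULSE_INTERVAL)   # Pulse Generator timing
```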

I added a goal for the game: to score as many points as possible in one minute. First, I added a way for Isadora to track the player’s score using the Counter actor. Then I added a game timer using the TimerTool actor downloaded from the Add-Ons page of the TroikaTronix website.
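The scoring and timing logic boils down to something like the sketch below (an illustration only; the real patch uses the Counter and TimerTool actors, and the player input here is simulated rather than read from the Makey Makey):

```python
# Illustrative game loop: count correct hits for one minute.
# The cue and the player's "hit" are both simulated for brevity.
import random
import time

GAME_LENGTH = 60.0   # seconds, matching the one-minute goal
score = 0
start = time.time()

while time.time() - start < GAME_LENGTH:     # TimerTool: stop after one minute
    cue = random.randint(1, 8)               # which key is currently blinking
    player_hit = random.randint(1, 8)        # stand-in for the Makey Makey input
    if player_hit == cue:
        score += 1                           # Counter actor: +1 per correct hit
    time.sleep(1.0)

print(f"Time's up! Final score: {score}")
```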

I also tried to add in a scoreboard feature, but it turned out to be much more difficult than I had anticipated, so a scoreboard is likely a Cycle 4 type of addition.

In Cycle 3 I am going to spend more time focusing on the physical presentation of my project. My goals are to get the xylophone connected to my Isadora patch, figure out how I want to set up in the Motion Lab, and projection map my Stage onto the xylophone.


Lawson: Cycle 3 “Wash Me Away and Birth Me Again”

Changes to the Physical Set Up

For cycle 3, knowing that I wanted to encourage people to physically engage with my installation, I replaced the bunched-up canvas drop cloths with a 6 ft x 10 ft inflatable pool. I built up the bottom of the pool with two folded wrestling mats, which made the pool more stable and reduced the volume of silk rose petals needed to fill it. Additionally, I wrapped the pool with a layer of blue drop cloths. This reduced the kitschy or flimsy look of the pool, increased the contrast of the rose petals, and allowed the blue of the projection to “feather” at the edges so the water projection appeared more realistic. To further encourage the audience to physically engage with the pool, I placed an extra strip of drop cloth on one side of the pool and set my own shoes on the mat as a visual indicator of how people should engage: take your shoes off and get in. This strip also served as a place to brush the rose petals off of your clothes if they stuck to you.

In addition to the pool, I made slight adjustments to the lighting of the installation. I tilted and shutter-cut three mid incandescent lights. One light bounced off of the petals; because the petals were asymmetrically mounded, this light gave them a wave-like appearance as the animation moved over top of them. The other two, used as shins, were shutter-cut just above the pool to light the participant’s body from stage left and stage right.

Changes to the Isadora Patch

During cycle 2, it was suggested that I add auditory elements to my project to support participant engagement with the installation. For this cycle, I added three elements: a recording of running water, a recording of the poem that I read live during cycle 2, and a recording of an invitation to the audience.

The words of the poem can be found in my cycle 2 post.

The invitation:

“Welcome in. Take a rest. What can you release? What can the water carry away?”

I set the water recording to play upon opening the patch and to continue running as long as the patch was open. I set the recordings of the poem and the invitation to alternate continuously, with a 30-second pause between each loop.
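Pulled out of Isadora and written as a Python sketch for clarity, the timing logic looks roughly like this (file names are placeholders and pygame is assumed; the installation itself does all of this inside the patch):

```python
# Sketch of the audio scheduling: water loops continuously, while the poem and
# the invitation alternate with a 30-second pause after each. File names are
# placeholders, not the actual recordings.
import time
import pygame

pygame.mixer.init()

water = pygame.mixer.Sound("water_loop.wav")
poem = pygame.mixer.Sound("poem.wav")
invitation = pygame.mixer.Sound("invitation.wav")

water.play(loops=-1)  # runs for as long as the patch is open

recordings = [poem, invitation]
while True:
    for clip in recordings:            # poem, then invitation, then repeat
        clip.play()
        time.sleep(clip.get_length())  # wait for the recording to finish
        time.sleep(30)                 # 30-second pause between each
```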

Additionally, I made changes to the reflection scene of the patch. First, I redesigned the reflection. Rather than using the rotation feature of the projection to rotate the projected image from the webcam, I used the Spinner actor and then zoomed in the projection map so it would fit into the pool. Rather than trying to make the image hyper-realistic, I decided to amplify the distortion of the reflection by desaturating it and then using a Colorizer actor to give the edges of the moving image a purple hue. I also made minor adjustments to the motion blur to play up the ghostliness of the emanation.

Second, I shortened the trigger delay to 3 seconds and the deactivate scene trigger to 2 seconds. I made this change as a result of feedback from a peer who assisted me with my adjustments to the projection mapping. She said that because the reflection scene took so long to fade up and down, and the reflection itself was so subtle, it was difficult to tell how her presence in the pool was triggering any change. I found the ghostliness of the final reflection to be incredibly satisfying.

Impact of Motion Lab Set Up

On the day of our class showing, I found that the presence of my installation in the context of other tactile and movement-driven exhibits in the Motion Lab helped the handful of context-less visitors figure out how to engage with my space. When people entered the Motion Lab, they first encountered Natasha’s “Xylophone Hero,” followed by Amy’s “seance” of voices and lightbulbs. Moving through these exhibits established an expectation that people could touch and manipulate my project and encouraged them to engage more fully with it.

I also observed that the pool itself and the mat in front of it encouraged full-body engagement with the project. I watched people “swim” and dance in the petals and describe a desire to lie down or make snow angels in them. Presenting the petals in a physical object that visitors recognized appeared to frame and suggest the possibilities for interaction, making it clear that this was something they could enter and that it would support their weight and movement. Hearing the water sounds in conjunction with my poem also suggested how participants could interact with the work. Natasha observed that the descriptions of my movement in my poem helped her create her own dance in the pool, sprinkling the rose petals and spinning around with them as she would in a pool.

The main hiccup that I observed was that viewers often would not stay very long in the pool once they realized that the petals were clinging to their clothes because of static electricity. This is something that I think I can overcome through the use of static guard or another measure to prevent static electricity from building up on the surface of the petals.


A note about sound…

My intention for this project is for it to serve as a space of quiet meditation through a pleasant sensory experience. However, as a person on the autism spectrum who is easily overwhelmed by a lot of light and noise, I found that I was overwhelmed by my auditory components in conjunction with the auditory components of the three other projects. For the purpose of a group showing, I wish that I had only added the water sound to my project and let viewers take in the sounds from Amy’s and CG’s works from my exhibit. I ended up severely overstimulated as the day went on, and I wonder whether it had the same impact on other people with similar sensory disorders. This is something that I am taking into consideration as I think about my installation in January.

What would a cycle 4 look like?

I feel incredibly fortunate that this project will get a “cycle 4” as part of my MFA graduation project.

Two of my main considerations for the analog setup at Urban Arts Space are disguising and securing the web camera and creating lighting that will support the project using the gallery’s track system. My plan for hiding the web camera is to tape it to the side of the pool and then wrap it in the drop cloth. This will not make the camera completely invisible to the audience, but it will minimize its presence and make it less likely that the webcam could be knocked off or into the pool. As for the lighting, I intend to make the back room dim and possibly use amber gels to create a warmer environment that at least approximates the warmth of theatrical lighting. I may need to obtain floor lamps to get more side light without over-brightening the space.

Arcvuken asked me how I will communicate how to interact with the exhibit to visitors while I am not present in the gallery. For this, I am going to turn to my experience as a neurodivergent person and as an educator of neurodivergent students. On placards on the walls, I am going to explicitly state that visitors can touch and get into the pool, and I will provide some suggested meditation practices that they can do while in the pool. Common sense isn’t common; sometimes it is better for everyone if you just say what you mean and want. I will place placards like this throughout the entire gallery to ensure that visitors, who are generally socialized not to touch anything in a gallery, know that they are indeed permitted to physically interact with the space.

To address the overstimulation that I experienced in Motion Lab, I am also going to reduce the auditory components of my installation. I will definitely keep the water sound and play it through a sound shower, as I found that to be soothing. However, I think that I will provide a QR code link to recordings of the poems so that people can choose whether or not they want to listen and have more agency over their sensory experience.


Cycle 3

After studying how to connect Isadora to Arduino and experimenting with its potential in Cycles 1 and 2, I focused in Cycle 3 on what type of experience I could create with that technology.
Feedback from everyone in Cycle 2, that even the tiny motions of the small servo motors and their fuzzy structure could have a quality of their own, really inspired me.



I decided not to make a “well designed” machine, but instead collected found objects and combined them with mechanics.
It was like installing an electric life in dead objects: static objects gaining another life as an interactive/kinetic machine.


Reactions from viewers (not only in our class, but also the Art grad students and faculty I showed it to) exceeded my expectations; the interactive motion, which responds to viewers but still keeps a certain unpredictability, created a sense of life (this object is living its own life) in each viewer.
While its rule of interaction is very, very simple (just chasing and responding to a viewer in front of a webcam), viewers grow the sense and the meaning of the object in their own way… that was my discovery through Cycle 3.
That could be a powerful potential of such an interactive project: the creator doesn’t need to fill in all of the concept and meaning; viewers find and grow it through their relationship with the object.



Body as Reservoir of Secrets, DEMS Cycle 3

In cycle 3, the three things I focused on were:

  1. I refined the composition of the generated graphics and included more Gaussian blur effects to reduce the number of hard edges present in the work. The reduction of hard edges was intentional, to reduce the “digital” nature of the graphics.
  2. I continued to develop the costume. I drew, painted and stitched in the costume, to create a more shamanistic leaning for the costume, hoping to steer away from an immediate aviation or space travel reference.
  3. I managed to program a Hot Folder on my computer that syncs with the Stage to Image actor in Isadora, so whatever Isadora exports to the Capture folder is automatically printed (see the sketch after this list).
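In rough terms, the Hot Folder behaves like the Python sketch below. This is a simplified stand-in rather than the actual setup; the folder path, file type, and polling interval are placeholders, and it assumes Windows (where the default printer can be invoked with os.startfile):

```python
# Hypothetical "hot folder" watcher: polls the Capture folder that Isadora
# exports to and sends each new image to the default Windows printer.
# Path, glob pattern, and polling interval are placeholders.
import os
import time
from pathlib import Path

CAPTURE_FOLDER = Path(r"C:\Users\me\Documents\Isadora\Capture")  # placeholder path
POLL_SECONDS = 2

already_printed = set(CAPTURE_FOLDER.glob("*.png"))  # skip anything already there

while True:
    for image in CAPTURE_FOLDER.glob("*.png"):
        if image not in already_printed:
            os.startfile(str(image), "print")  # hand the file to the default printer
            already_printed.add(image)
    time.sleep(POLL_SECONDS)
```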

Decisions made:

  1. I decided to include an additional webcam to do a live capture of the pile of printed matter that was generated. The pile would then appear in some of the printouts.
  2. I moved the pulse sensing contact mic to my wrist due to the headache I experienced after the cycle 2 presentation.
  3. The score of the piece was originally supposed to be live painting/collaging, but there was not enough space/time for that to happen at ACCAD for the Dec 8th critique. So I devised a score about folding each sheet of paper eight times, and being unable to fold it a ninth time, as a parallel to the irretrievable nature of ancestry.

Challenges/Feedback:

  1. My patch got very complicated, and toward the end I was having some difficulty figuring out how everything was wired, even though I did make an effort to stay organized.
  2. Isadora crashed from time to time. I will look into buying a higher-capacity Windows laptop next year.
  3. I needed some troubleshooting with the Hot Folder, and it took a while to figure out why I was generating so many prints. It turned out that I had an earlier version of the Isadora file running in the background, which was feeding the Hot Folder a bunch of images.
  4. The prints generated had a general purple hue to them. It seems I need to vary the colors more in the generated graphics so that this purple hue is counteracted/balanced.

Observations/ Ongoing curiosities:

  1. The printer’s operation added an unexpected mechanical soundscape to the piece. The rhythmic, hasty machine sound creates a sense of urgency in the work; I found myself feeling anxious and felt pressure to work at that rhythm/pace. I felt “chased” by the printer.
  2. A classmate posed a question to me about generative art and how generative art tends not to have any material cost. She asked how I felt about the paper and ink expended for this performance. I actually welcome the creation of these materials: as an artist working in collage, I hardly throw anything away because I save everything in flat files. These materials will eventually find their way into another piece of work in the future.
  3. I intend to stage a cycle 4 in the Sherman Studio Arts Center after the semester quiets down. Cycle 4 will have the live-painting aspect, and I will explore using what I have created in this class as a means to create a video work. I am also curious about how feature extraction can influence sound and be used for composing.

Body as Reservoir of Secrets, DEMS Cycle 2

For cycle 2, the idea started to take shape in a more concrete manner. I started to develop the costume using coveralls bought on Amazon, which I tailored myself for a better fit.

The following feature-extraction functionalities were built into the costume:

  1. On the coveralls I stitched on some patches with dark pieces of digitally printed fabric. This increases the overall contrast of the figure, which helps the eyes++ actor and webcam track object movements better.
  2. A contact microphone was housed in the chest pocket running up to my neck. An additional elastic band was devised to apply pressure and hold the contact microphone in place.
  3. Two iPhone gyroscopes were housed within blue/purple pockets with elastic openings on the arm and calf of the costume.
  4. A microphone was blown into whenever a scene change was desired, as a means of activating breath.

Challenges/Feedback

  • The composition looked too busy, given that more than 10 variables were being fed into Isadora.
  • My computer’s graphics card seems to be struggling, and Isadora crashes easily.
  • A classmate mentioned that the costume appeared like an astronaut, which is not what I intended.
  • I had a headache after wearing the neck piece with the contact mic for 20 minutes because it was restricting blood flow.

Observations/On-going curiosities

  • The class seemed very interested in mapping/figuring out which visual elements were coupled with the various sensors.
  • The class seemed interested in viewing the live-generated graphics as a video piece, when I had intended the graphics primarily to be viewed as prints. I am open to the possibility of the video being screened alongside the painting performance.

Cycle 3: The Sound Station

Hello again. My work culminates in Cycle 3 as The Sound Station:

The MaxMSP granular synthesis patch runs on my laptop, while the Isadora video response runs on the ACCAD desktop; the MaxMSP patch sends OSC to Isadora via Alex’s router (it took some finagling to get around the ACCAD desktop’s firewall, with some help from the IT folks).
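For anyone recreating a similar setup, the laptop-to-desktop OSC link can be smoke-tested with a few lines of Python before involving Max at all. The IP address, port, and OSC address below are placeholders; Isadora listens on whatever port is set in its OSC preferences:

```python
# Quick OSC smoke test (placeholder address/port/IP): confirms that packets from
# the laptop reach the desktop across the router before wiring up the MaxMSP patch.
# Requires the python-osc package.
from pythonosc.udp_client import SimpleUDPClient

ISADORA_IP = "192.168.1.20"   # placeholder: the receiving machine's address on the router
ISADORA_PORT = 1234           # placeholder: whatever port Isadora is set to listen on

client = SimpleUDPClient(ISADORA_IP, ISADORA_PORT)
client.send_message("/soundstation/test", 0.5)  # placeholder OSC address and value
print("Sent test message; check Isadora's stream setup / monitor for the incoming value.")
```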

I used the Mira app on my iPad to create an interface for interacting with the MaxMSP patch. This gave me the chance to make the digital aspect of my work seem more inviting and to encourage more experimentation. I faced a bit of a challenge, though, because some important MaxMSP objects do not actually appear in the Mira app on the iPad. I spent a lot of time rearranging and rewording parts of the Mira interface to avoid confusing the user. Additionally, I wrote out a little guide page to set on the table, in case people needed additional information to understand the interface and what they were “allowed” to do with it.

Video 1:

The Isadora video is responsive to both the microphone input and the granular synthesis output. The microphone input alters the colors of the stylized webcam feed to parallel the loudness of the sound, going from red to green to blue with especially loud sounds. This helps the audience mentally connect the video feed to the sounds they are making. The granular synthesis output appears as the floating line in the middle of the screen: it elongates into a circle/oval with the loudness of the granular synthesis output, creating a dancing inversion of the webcam colors. I also threw a little slider into the iPad interface to change the color of the non-mic-responsive half of the video, to direct the audience’s focus toward the computer screen so that they recognize the relationship between the screen and the sounds they are making.
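As a rough sketch of that loudness-to-color idea, written out in Python (the thresholds and the linear blends are guesses, not the values used in the Isadora patch):

```python
# Rough sketch of the loudness-to-color mapping described above: quiet input stays
# red, mid-level input shifts toward green, especially loud input shifts toward blue.
# Thresholds and the linear blend are assumptions, not the patch's actual curves.
def loudness_to_rgb(level: float) -> tuple[float, float, float]:
    """Map a 0.0-1.0 loudness level to an (r, g, b) color."""
    level = max(0.0, min(1.0, level))
    if level < 0.5:
        t = level / 0.5            # blend red -> green over the quieter half
        return (1.0 - t, t, 0.0)
    t = (level - 0.5) / 0.5        # blend green -> blue over the louder half
    return (0.0, 1.0 - t, t)

if __name__ == "__main__":
    for level in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(level, loudness_to_rgb(level))
```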

The video aspect of this project does personally feel a little arbitrary; I would definitely focus more on it for a potential cycle 4. I would need to make the video feed larger (on a bigger screen) and more responsive for it to actually have any impact on the audience. I feel like the audience focuses so much on the instruments, microphone, and iPad interface that the video feed is hardly necessary, but I wanted to keep it as an aspect of my project to illustrate the capacity MaxMSP and Isadora have to work together on separate devices.

Video 2:

Overall I wanted my project to incite playfulness and experimentation in its audience. I brought my flat guitar (“skinned” guitar), a kazoo, a can full of bottlecaps, and a deck of cards, and I miraculously found a rubber chicken in the classroom to contribute to the array of instruments I offered at The Sound Station. The curiosity and novelty of the objects serve the playfulness of the space.

Before our group critique we had one visitor go around for essentially one-on-one project presentations. I took a hands-off approach with this individual, partially because I didn’t want to be watching over their shoulder and telling them how to use my project correctly. While they found some entertainment engaging with my work, I felt like they were missing essential context that would have enabled more interaction with the granular synthesis and the instruments. In stark contrast, I tried to be very active in presenting my project to the larger group. I led them to The Sound Station, showed them how to use the flat guitar, and joined in making sounds and moving the iPad controls with the whole group. This was a fascinating exploration of how group dynamics and human presence within a media system can enable greater activity. I served as an example for the audience to mirror; my actions and presence served as permission for everyone else to become more involved with the project. This definitely made me think more about what direction I would take this project in future cycles, depending on whether it were for group use or personal use (since I plan on using the MaxMSP patch for a solo musical performance). I wonder how I would have started this project differently if I had not thought of it as a personal tool and instead as something directly intended for group/cooperative play. I probably would have taken much more time to work on the user interface and removed the video feed entirely!