Interactive

MaxMSP Work

Electrica: Creating Convergence is a gallery show (Feb 24 – March 26, 2011) at the Instituto Cervantes in New York City that Milena worked on as interaction designer. Alongside artists Ryan V Brennan and Mar Gomez, she designed the interaction scheme for a teeter-totter that acts as a video controller based on its position in space. The video displays two sides of a conversation, and in order to hear and see both sides, the two riders have to be in balance. To execute the interaction, she used an accelerometer, an Arduino microcontroller, and a custom MaxMSP patch, routed out through a video splitter to two large flat screens. Here are some photos of the piece. Video documentation coming soon.
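The piece itself ran through an Arduino and MaxMSP, but the core balance-to-audio idea can be sketched in plain Java. This is a hypothetical illustration, not the actual patch: a tilt reading from the accelerometer (normalized to -1..1) is mapped to a gain for each side of the conversation, so both sides play at full volume only when the seesaw is level.

```java
// Hypothetical sketch of the balance logic; the real piece used an
// accelerometer, an Arduino, and a MaxMSP patch. Ranges are assumptions.
class SeesawBalance {
    // tilt: -1.0 (side A fully down) .. 1.0 (side B fully down).
    // Returns {gainA, gainB}; each side fades out as the seesaw
    // tips away from it, so balance is required to hear both.
    static double[] gains(double tilt) {
        double t = Math.max(-1.0, Math.min(1.0, tilt)); // clamp sensor noise
        double gainA = Math.min(1.0, 1.0 - t); // full when t <= 0
        double gainB = Math.min(1.0, 1.0 + t); // full when t >= 0
        return new double[] { gainA, gainB };
    }
}
```

With this mapping, a level seesaw yields both gains at 1.0, while tipping fully to one side silences the opposite channel.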


Projected Affection is a storefront interactive window display by Emily Ryan for AC Gears on 8th Street in New York City. Milena worked with Emily on the coding and video mapping for the display. Using computer vision and face detection as a trigger, a passerby on the street can look at the display and suddenly see an image of their own face projected onto the mannequin head. You can see the outcome of the project in the video below.
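A practical wrinkle in this kind of trigger is that face detection drops out for a frame or two even when someone is still standing there. Here is a hypothetical sketch (not the project's actual code) of a hold-over on the per-frame detection result, so the projection doesn't flicker during brief dropouts; the frame count is an assumption.

```java
// Hypothetical debounce for a face-detection trigger: the real display
// used computer vision; here a stream of per-frame booleans
// ("face seen this frame?") drives the projection on/off state.
class FaceTrigger {
    private int missed = 0;
    private boolean projecting = false;
    private final int holdFrames; // frames to keep projecting after a dropout

    FaceTrigger(int holdFrames) { this.holdFrames = holdFrames; }

    // Call once per camera frame; returns whether to project the face.
    boolean update(boolean faceDetected) {
        if (faceDetected) {
            missed = 0;
            projecting = true;
        } else if (++missed > holdFrames) {
            projecting = false;
        }
        return projecting;
    }
}
```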

Projected Affection for AC Gears from Emily Ryan on Vimeo.


Jilted is a virtual rock band, controlled by the lead performer in real time. Learn more about it here.

Take a look at the foot controller, which uses custom hardware, an Arduino, and a custom MaxMSP program to trigger changes in video, audio, color, speed, and song.
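One way such a controller can work is for each stomp to advance through a list of scenes, where a scene bundles the song plus its video, color, and speed settings that the MaxMSP patch switches between. This is a hypothetical sketch of that pattern only; the song names are placeholders, not the band's actual set.

```java
// Hypothetical foot-controller logic: each press cycles to the next
// scene. In the real piece an Arduino reads the switch and MaxMSP
// performs the actual video/audio changes; names here are invented.
class FootController {
    static final String[] SONGS = { "songA", "songB", "songC" };
    private int scene = 0;

    // Advance to the next scene on each stomp, wrapping around.
    String press() {
        scene = (scene + 1) % SONGS.length;
        return SONGS[scene];
    }

    String current() { return SONGS[scene]; }
}
```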

The Human Slot Machine is an interactive music video project that deals with prejudice, genetics, and facial identity. It lets the user select a face from many pairs of eyes, noses, and mouths, each of which aligns in sync with a clip of a song. In essence, the user constructs a new kind of face while creating a new kind of music video, one that unfolds in the order of their choosing, eliminating song structure.

The “Cryptic” Human Slot Machine from Milena Selkirk on Vimeo.

Processing/Java

Mouse-fuelled Morphing Animation. Created with Processing. The animation speeds up or slows down based on the location of the mouse.
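The original is a Processing sketch; the speed mapping at its heart can be shown in plain Java. This is a hypothetical illustration: the cursor's x position is mapped linearly onto a frame-advance rate, much like Processing's map() function, so frames go by faster as the mouse moves right. The rate range is an assumption.

```java
// Hypothetical sketch of the mouse-to-speed mapping; the range values
// are placeholders, not the original sketch's numbers.
class MorphSpeed {
    // Linear map: mouseX in [0, width] -> frames advanced per draw()
    // in [minRate, maxRate], clamped so off-screen values stay sane.
    static double rate(double mouseX, double width,
                       double minRate, double maxRate) {
        double t = Math.max(0.0, Math.min(1.0, mouseX / width));
        return minRate + t * (maxRate - minRate);
    }
}
```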


The Human Petting Zoo: An Interactive Morphing Photobooth

The Human Petting Zoo was the next step for me after making the Morphing Animation (above). I wanted to create a live image capture that would combine a human face with that of an animal. My first step was to play around with how I would capture the human face. After I got Processing to take live images, I started experimenting with how to fuse them with animal images. As I proceeded, I noticed problems in stabilizing each person’s face in terms of movement and distance from the camera. I realized that I needed to build a device that would line up each person’s face (more specifically, their eyes, nose, and mouth) in the same place every time. I made this head hole out of card stock, metal, cloth, and fur. If the user sticks his/her head through the hole at a set distance from the computer screen, I get the right proportions.

Then I worked on code to make Processing select an animal image pulled from an array. After the image is selected, the participant places their face in the hole and clicks the mouse to capture the morphed image. Once a picture is taken, the program advances to another random animal image. I also wanted the program to save each image to a folder, so that participants can keep their animal pictures.
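The flow just described can be sketched in plain Java, minus the Processing drawing and webcam parts. This is a hypothetical illustration, not the sketch's actual code: an array stands in for the animal images, a capture "saves" under a numbered filename, and the program then advances to a new random animal. The image names and filename format are made up.

```java
import java.util.Random;

// Hypothetical sketch of the photobooth flow: pick a random animal,
// capture on click, save under a numbered name, then advance. The real
// sketch blends the live webcam frame with the image in Processing.
class PettingZoo {
    static final String[] ANIMALS = { "goat.jpg", "llama.jpg", "rabbit.jpg" };
    private final Random rng;
    private int current;
    private int shots = 0;

    PettingZoo(long seed) {
        rng = new Random(seed);
        current = rng.nextInt(ANIMALS.length);
    }

    String currentAnimal() { return ANIMALS[current]; }

    // Called on mouse click: build the filename the morphed frame would
    // be saved under, then pick a new random animal for the next person.
    String capture() {
        String saved = String.format("morph-%03d-%s", shots++, ANIMALS[current]);
        current = rng.nextInt(ANIMALS.length);
        return saved;
    }
}
```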
