2006, Group show at a warehouse gallery space in Chicago’s Pilsen neighborhood.
I have done this installation four times. I felt the first show was the most successful; the piece was inspired by and created specifically for that show. I had an abandoned darkroom to work with – I later found out that having a dark room/background was key to the installation working.

Description
The installation consists of one projection of a live camera feed using nightshot/infrared mode, converted to b&w. This feed is fed through a long delay (est. 1–1.5 min) and real-time effects. Activity in the room / on the camera dynamically controls the length of the delay, the amount of visual echo, and the volume of the sound. More motion turns up the effects and sound. Less motion turns down the effects and sound.
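Roughly, the control logic works like this (a minimal sketch of my own, not the actual patch; the parameter names, ranges, and smoothing constant are assumptions):

```python
# One normalized motion value in [0, 1] drives all three parameters:
# more activity -> more delay, more echo, more sound.
MIN_DELAY_S, MAX_DELAY_S = 60.0, 90.0   # the piece sat around a 1-1.5 min delay
smoothed = 0.0                          # smoothed motion level

def update(motion, alpha=0.05):
    """motion: instantaneous amount of activity in the frame, 0..1."""
    global smoothed
    smoothed += alpha * (motion - smoothed)              # smooth out jitter
    delay_s  = MIN_DELAY_S + smoothed * (MAX_DELAY_S - MIN_DELAY_S)
    echo_mix = smoothed                                  # amount of visual echo
    volume   = smoothed                                  # sound level
    return delay_s, echo_mix, volume
```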

Concept & Functionality:
I originally conceived the piece to be about ghosts, but it explores ideas of vanity, how we view art, and whether art exists void of an audience. I am as guilty as most of quickly skimming a piece of artwork.
The piece functions best as an exploration of how we view art. If the installation room has no activity, there will be no image or sound — lifeless. For viewers to see themselves in the projection they need to be patient and wait for a couple of minutes. If the viewer does wait to see themselves, what does that say about our desire to see ourselves?
As the viewer walks into the room with the installation they see that it is a simple live camera feeding the projector. The video camera’s LCD screen is faced out so you can see that it is in fact shooting live video. They recognize the projector and stand behind them. They wonder why they are not in the image and realize the image is of the previous viewer. The current viewer will show up in the image after the long video delay, but they will only see themselves if they wait long enough. The long delay keeps the piece from becoming a novelty, where people can immediately see a 1-to-1 reaction to their movement. The delay is long enough that you forget that your motions are being recorded and you act more normally.

2008, ECHO, presented by M5 and The 86 Collective, Chicago IL

The 2nd installation tested the idea that I could do the installation in a variety of spaces – not just a dedicated room. The reception was great and I won 2nd prize at the show (and didn’t know there were going to be prizes). I also missed receiving my prize again due to miscommunication. I learned that I really need a flat black background and a semi-isolated space for the installation to work. I also found out that for a short while no one was manufacturing red light bulbs (odd). Having the dedicated room at the first install made a huge difference in keeping the conceptual ideas functioning intact and making it something more than a fun-house mirror. I also made a point of getting documentation of the video projection with the audience to see the interaction — vs. a recording of the projection alone. Ideally, it would be best to have both to show side by side.

2010, Bushwick Open Studios, Brooklyn NY – Installation V2

I participated as part of BOS 2010 with a revised and reprogrammed version of my “I’m sorry, I’m not here right now” interactive video installation.
I also showed photography, commercial motion graphics work, and (new to anyone in NYC, but quickly starting to feel very old looking at the date) live a/v recordings and VJ material.

This is an evolution of a previous installation I did using 2 laptops and 2 copies of Grid Pro – using its video analysis to control the delay of the live feed. The initial idea being that there is a projection presented as a live video feed, but the more you move, the less real-time the image is.

I built this using Quartz Composer so it could be turned into an FX plug for VDMX – or more likely, making an FX plug from just the image-analysis part of it. There’s some math interpolation I would like to try, to see if I could get the response to be exponential rather than linear, but otherwise it feels done. If I were to have it installed longer than a day, it would be nice to make it more presentable as well.
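For reference, the two response curves I mean look roughly like this (a quick sketch; the function names and the k constant are placeholders of mine, not anything from the patch):

```python
import math

def linear_response(x):
    """x is the normalized motion value, 0..1 -- what the patch does now."""
    return x

def exponential_response(x, k=4.0):
    """Exponential curve anchored at 0 and 1: small movements barely change
    the delay, big movements ramp it up quickly. Larger k = steeper ramp."""
    return math.expm1(k * x) / math.expm1(k)
```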

The processing takes a live video feed. I used vade’s optical flow plugin to detect motion (it looks like the key to my use is the differencing of the new/old image). I do an overall average of the optical flow patch and feed it to an Image Pixel patch to get a luminance value. This value controls a video delay and crossfades between the live image and the delayed/frozen image. It also controls the volume, tempo, and pitch of a synthesizer instrument in GarageBand via MIDI.
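To make the signal flow concrete, here is a rough Python approximation of that patch (OpenCV + mido), written as a sketch under assumptions rather than the original composition: vade’s optical flow plugin is replaced by plain new/old frame differencing, the delay buffer is much shorter than the installation’s, and only the volume mapping (MIDI CC 7) is shown; tempo and pitch would map from the same value.

```python
import collections
import cv2
import mido

MAX_DELAY_FRAMES = 30 * 10            # ~10 s at 30 fps for the sketch; the
                                      # installation used a much longer delay
buffer = collections.deque(maxlen=MAX_DELAY_FRAMES)

cap = cv2.VideoCapture(0)             # live camera feed
midi = mido.open_output()             # default MIDI port (assumes one exists)

prev_gray = None
smoothed = 0.0                        # smoothed motion level, 0..1

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Motion via new/old frame differencing (standing in for the optical flow
    # plugin), averaged down to a single 0..1 value like the luminance step.
    motion = 0.0
    if prev_gray is not None:
        motion = float(cv2.absdiff(gray, prev_gray).mean()) / 255.0
    prev_gray = gray
    smoothed += 0.05 * (motion - smoothed)

    # Video delay: the more motion, the older the frame we pull.
    buffer.append(frame)
    delay = int(smoothed * (len(buffer) - 1))
    delayed = buffer[-1 - delay]

    # Crossfade live vs. delayed/frozen image.
    out = cv2.addWeighted(frame, 1.0 - smoothed, delayed, smoothed, 0)
    cv2.imshow("im not here right now (sketch)", out)

    # Drive the synth: volume via CC 7; tempo/pitch would map similarly.
    midi.send(mido.Message("control_change", control=7,
                           value=int(smoothed * 127)))

    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```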
