Audio Luminosity VJ Content Pack

GET AUDIO LUMINOSITY HERE

I wanted to talk in a bit more detail about the content in this package and my process in general. Since 2003 I have been automating, sequencing and synthesizing video to music. It used to be precise and tightly controlled by MIDI playback tracks, and it eventually bloomed into what could be considered VJing: live video performance.

Audio Luminosity consists of HD videos shot with my JVC Everio HD GZ. One of my favorite clips in this collection is Aquarium_Jellyfish_Blackout2, which was shot at the Monterey Aquarium in Monterey, California. The shot takes advantage of the Everio's shutter and aperture adjustments to make a fade-out with no additional editing needed. Unlike many VJs today, I feel the need to bring tangible creatures and objects to the screen in synthesis with my animations and generative graphics. It is like unifying the right and left brain, emotion and data.

This brings me to another point about this collection: it is not cut short like most VJ content packs. Many of the videos, such as LaserGlass and the Still_Life_Cellular_Automota series, are long and meant to be edited to the performer's preference. These long clips have many moments that might speak to different individuals through only a texture, a moment or a frame; when brought into context by the performer, they write a completely new story. This is what VJing is about: synthesis, technological amalgamation and learning. I want to come back to the Still Life series and explain, for those who are not aware, how images like this can be made.

Essentially, the still life is brought into animation by a feedback loop between a camera and a projector. To make it interesting, a dimmable studio light and a strobe light were incorporated. When the lighting and camera variables are static, the scene is a looking glass, an infinite expression of itself. As the camera's zoom, aperture and shutter are modulated in different patterns along with the lighting intensity and the strobe, you start to get some interesting imagery, similar in geometry to the workings of slime mold and other self-organizing systems.
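For those who like to prototype, the feedback process can be sketched in software. This is a minimal NumPy simulation of my own devising, not the actual camera/projector rig: each iteration re-captures a zoomed, gain-adjusted copy of the previous frame, with the gain standing in for the dimmable light and a periodic flash standing in for the strobe. All function names and modulation constants here are illustrative assumptions.

```python
import numpy as np

def zoom(frame, factor):
    """Nearest-neighbour zoom toward the centre (stands in for the camera's optical zoom)."""
    h, w = frame.shape
    ys = ((np.arange(h) - h / 2) / factor + h / 2).astype(int).clip(0, h - 1)
    xs = ((np.arange(w) - w / 2) / factor + w / 2).astype(int).clip(0, w - 1)
    return frame[np.ix_(ys, xs)]

def feedback(steps=120, size=128, seed=0):
    rng = np.random.default_rng(seed)
    frame = rng.random((size, size))            # the still life as first seen by the camera
    for t in range(steps):
        gain = 0.7 + 0.25 * np.sin(t / 9.0)     # dimmable studio light, slowly modulated
        z = 1.02 + 0.03 * np.sin(t / 5.0)       # camera zoom sweeping in and out
        frame = np.clip(zoom(frame, z) * gain, 0.0, 1.0)
        if t % 17 == 0:                         # strobe: brief full-intensity flash
            frame = np.clip(frame + 0.6, 0.0, 1.0)
    return frame
```

Because each frame is a transformed copy of the last, small modulations compound over iterations, which is where the self-organizing look comes from.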

I must mention the inclusion of two clips from a past piece, Masked Beauty. This was a special project that used photos from the body painting work of Michael Rosner of Eye Level Studio. Both clips happen to use the same model, but my absolute favorite is Eye level studio vs Salvador Dali (titled Eyelevelstudio.mp4 in the content pack). The use of that Gala Galactica as a tunnel texture traveling through the face paint of Rosner was one of the best syntheses I have constructed to date!

Many loops in this pack were made using the application Motion. I'm new to it, having used it for about a year, and have made some decent progress. Headress3 is a good example of how still images can be turned into animation using the program. I still have quite a ways to go with it, but I think you will find these clips complementary.

Two other groups of videos included here are datamosh clips and animations made with the application Groboto. Datamoshing is a rad technique that has been brought into the mainstream now, but I use it in a specific way in all of my clips. I like to use still images and the Ken Burns effect (photos zooming in and moving across the screen) to create a series of images jumping out of each other. Cool stuff begins to happen using this method. An amalgamation that I found interesting is the Ernst Haeckel VS MC Escher datamosh. As for the Groboto animations, I purchased the inexpensive program instead of buying a VJ content pack from someone else. The program is not the most precise to control, but the rendering and lighting engine they have put together is fucking amazing. You can see the power of the program at work in clips such as Mobius4, Fan_Test and HelloWorld.
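To illustrate the Ken Burns half of that recipe: here is a rough sketch, assuming NumPy and a hypothetical `ken_burns` helper of my own naming, that generates a sequence of crops panning and zooming across a still image. In a real pipeline the crops would then be resized to a fixed output resolution and the resulting motion would give the mosh something to smear.

```python
import numpy as np

def ken_burns(image, frames=30, start=(0.0, 0.0, 1.0), end=(0.25, 0.25, 0.5)):
    """Yield crops of `image` that pan and zoom between two windows.

    start/end are (top, left, scale) as fractions of the image size;
    a shrinking scale reads as zooming in.
    """
    h, w = image.shape[:2]
    out = []
    for i in range(frames):
        t = i / max(frames - 1, 1)                  # 0.0 -> 1.0 across the clip
        top  = (1 - t) * start[0] + t * end[0]      # linear pan
        left = (1 - t) * start[1] + t * end[1]
        s    = (1 - t) * start[2] + t * end[2]      # linear zoom
        ch, cw = max(int(h * s), 1), max(int(w * s), 1)
        y0 = min(int(h * top),  h - ch)             # clamp window inside the image
        x0 = min(int(w * left), w - cw)
        out.append(image[y0:y0 + ch, x0:x0 + cw])
    return out
```

Linear interpolation keeps the sketch short; an easing curve would give the drifting, cinematic feel the effect is named for.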

As of late I have been trying to figure out imagery on a programming level. I began on this path back in 2008, when Jason Seal and I created the Æ-table for multi-touch interaction. Since then I have been using Pure Data and Quartz Composer to create generative graphics and networking solutions for displaying data. This has led me to a better understanding of computation and is allowing me to start thinking about video mapping projects involving complex geometry and constructed space rather than found space.

I hope you have enjoyed reading this. If you made it to the bottom of the post please comment to let me know that I’m getting through.

Cheers,
JZ