Hacked Raven Input (Nina Wenhart, jonCates + jake elliot) remixes and transcodes raw data from various Media Art Archives, converting files into realtime audio video noise, building a nest for digital punk computer witches and crashing systems across a network connecting Linz/Chicago.

Thursday, June 19, 2008

offline browsers as performance tools

we should consider offline browsers as performance tools for future performances...

Sunday, January 27, 2008

hacked raven input @ qujochoe website


Dan Sandin's 2001 Ars Electronica article


EVL: Alive on the Grid

Dan Sandin

Networking, sound and interaction are all key elements of Dan Sandin’s EVL: Alive on the Grid (Electronic Visualisation Laboratory), a collection of virtual art worlds where local and distant participants alike can interact in shared virtual spaces.

Enabled by the Grid (collections of networks, computers and virtual reality displays that span the globe) users of this unique medium can "virtually" interact with one another, and with the models contained within each piece. Displayed in a CAVE, a four-wall, theater-style virtual reality environment, users don lightweight head and hand trackers and access the worlds through a virtual atrium, where they encounter visitors from around Europe and the U.S. Through their virtual representative, or avatar, users can navigate and interact with others in real time. Each visitor's avatar, identifiable by a photorealistic 3D face, is able to create and alter the virtual worlds they visit, leaving "ghosts" of themselves for others to view. All of the worlds are persistent, which means they continue to grow and collect information from remote participants on the Grid even after festival users leave the environment.

All of the CAVE pieces were created using Ygdrasil (YG), a system for authoring networked virtual environments developed by Electronic Visualization Laboratory (EVL) Ph.D. graduate Dave Pape. Ygdrasil provides a shared scene graph to automatically connect distributed virtual elements and users, and uses a high-level scripting layer and plug-ins to easily assemble environments from existing components. It is an extensible system, based on EVL's CAVERNsoft and the commercial software OpenGL Performer, and is available online.

Dan Sandin, co-director of the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago, co-invented the CAVE in 1991 and brought it to the Ars Electronica Center when it opened in 1996. Since then, EVL has concentrated on the development and deployment of networked virtual reality, that is, VR worlds that distantly located people can view and change in real time. Sandin curated this collection of networked virtual environments created by students of the EVL and the following participating sites:
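Ygdrasil's actual scripting interface isn't reproduced here, but the core idea of a shared scene graph — node state changes replicating automatically to every connected site — can be sketched in a few lines of Python. All the names below are hypothetical stand-ins, not Ygdrasil's API:

```python
# Toy illustration of a shared scene graph (NOT Ygdrasil's real API):
# each node keeps a dict of fields, and setting a field notifies every
# connected "site" so their local mirrors stay in sync.

class SharedNode:
    def __init__(self, name):
        self.name = name
        self.fields = {}      # local state, e.g. position, color
        self.children = []
        self.listeners = []   # remote sites subscribed to updates

    def set(self, key, value):
        self.fields[key] = value
        # replicate the change to every connected site
        for remote in self.listeners:
            remote.receive(self.name, key, value)

class RemoteSite:
    """Stand-in for a networked participant mirroring the scene graph."""
    def __init__(self):
        self.mirror = {}      # node name -> replicated fields

    def receive(self, node_name, key, value):
        self.mirror.setdefault(node_name, {})[key] = value

# assemble a tiny scene and connect two sites
avatar = SharedNode("avatar")
linz, chicago = RemoteSite(), RemoteSite()
avatar.listeners += [linz, chicago]
avatar.set("position", (0.0, 1.5, -2.0))
```

In a real system the `receive` call would travel over the network (CAVERNsoft handled this layer for Ygdrasil); the point of the pattern is that an artist scripts against local nodes and the distribution happens automatically.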

Electronic Visualization Laboratory, University of Illinois at Chicago, www.evl.uic.edu

Interactive Institute, Tools for Creativity Studio, Umeå, Sweden

State University of New York (SUNY), Buffalo
Center for Computational Research and the Department of Media Study

C3 Center for Culture & Communication Foundation, Budapest, Hungary

Indiana University / H.R. Hope School of Fine Arts / University Information Technology Services / Advanced Visualization Laboratory

V2 Lab/V2 Organization Institute for the Unstable Media, Rotterdam,
collaborating with the Technische Universiteit Eindhoven (TU /e) and Stichting
Academisch Rekencentrum Amsterdam (SARA), The Netherlands
www.v2.nl/ ; www.tue.nl/ ; www2.sara.nl/

Tim Portlock

Super Spectacular is an immersive virtual-reality environment that presents different spectator arenas in which users can interact: a sports stadium, an art gallery, a corporate outreach event and others. Each person who journeys through the environment creates a unique trail of footprints that may persist across several runs of the application.
Dan Sandin
In this networked virtual environment, the user's perspective begins in outer space, surrounded by sun and Earth images based on real-time satellite information, then falls to Earth and lands on an archipelago in northern Lake Michigan. While exploring these islands, from Death's Door to the Garden Peninsula, the participants share a watery experience. The satellite images are an animation of actual Earth and sun weather data collected over months from the web, updated daily to form a 3D visual history. The 3D models of the archipelago are based on video images taken while kayaking; the 3D information has been extracted from the moving video to allow participants to move about and explore these worlds. In the end, the world dissolves into moving water.
Josephine Anstey, Dave Pape, Dan Neveu
PAAPAB is a networked virtual dance floor with a steady dance beat. The user joins the dance by selecting one or more life-sized puppets whose movements mirror the user's own through a simple motion-capture system. As more users join in, the floor gets more crowded. The CAVE is equipped with tracking sensors for the user's head and one or two arms; PAAPAB uses this data to create the puppets' realistic dance motions. Each puppet's shape varies, and the trackers are mapped to its body parts in different ways, so users may move their arms while the puppet moves its legs, tail or wings. Over time, a diverse group of animated puppets inhabits the virtual dance floor. At each networked site, a local heartbeat keeps the dancers and the dance floors in sync with one another.
Todd Margolis
Infinite Studio is a new paradigm of "art-making" that allows the user to create interactive virtual-reality artwork in real time from inside a virtual environment. Using a color palette and several drawing and modeling tools, the user can create and modify virtual objects for any desired effect. Individuals can also collaborate across the network to create group constructions. Scenes can be easily scaled, rotated or moved from one part of the world to another, thereby changing focus or eliminating elements. Every line drawn has a lifespan, causing the entire art piece to constantly evolve.
Marientina Gotsis
La Boîte is a networked virtual environment where visitors are invited to see short dance performances on video screens. In the virtual dance studio, visitors can view brief performances, then try to imitate them or leave their own traces of a performance. The piece places some emphasis on classical ballet training and the importance of the geography of the classical dance studio. This project was inspired by a decade-long friendship, and our mutual love of dance, performance and technology.
Beth Cerny, Alex Hill, Ya Ju Lin, Brenda López, Todd Margolis
The Dreambox project is a collaboration between digital media artists and scientists to create a dynamic virtual environment that reflects the subconscious meaning of symbolic objects arranged in space. It has long been held that humans consistently recognize and internalize the meaning of certain primal symbols, the so-called archetypes. In an effort to gain insight into the minds of their patients, psychiatrists developed a technique in which physical artifacts are used to create scenes in a box of sand. This technique, called sandtray analysis, seeks to gain insight into the workings of the subconscious mind by interpreting the conscious mind's arrangement of these archetypal symbols. Each of the artists in the Dreambox project has developed symbolic objects for placement in a virtual environment. The Dreambox user begins by arranging a scene with the available objects in a virtual sandbox. Unlike the static physical objects of a sandtray, Dreambox objects respond to their arrangement dynamically. Based on their relative proximity, the Dreambox objects take on an array of different manifestations and behaviors. A key next to a boundary may create an opening, yet a house next to a boundary may form a prison, while a house next to a key may evoke a church. Through the use of tele-immersive collaboration, each user entering the world is invited to make a personal contribution to the Dreambox experience. Each participant then becomes a Dreambox object in the world, taking on attributes and influences in line with their changes to the arrangement of the scene. And, as users enter and make their contributions to the evolving state of the Dreambox world, a deeper shared meaning develops for all to see.
Geoffrey Allen Baum, Keith Miller
syn.aesthetic is a virtual-reality environment where a 3D score is created by the sonic input of the participants. This “objectified” sound manifests as a visible, virtual object that maintains the location, dynamism and organic qualities of the audible sound, but persists beyond the lifetime of the sound. Participants are able to create their own sound sculpture by adding and deleting sounds. Each user leaves their own trace in the space, contributing to a larger sculpture, or score, representative of the collaborative effort of all participants over time. The disconnect between sonic creation and visual composition is intended to lead to a new kind of creation possible only in this kind of environment—a living sculpture, where sound is the physical material and the material is ethereal.
Josephine Lipuma
This networked virtual-reality CAVE application explores a wonderland inspired by Lewis Carroll's popular story. The VR artist lures the visitor into making the environment his or her own fantasy realm by rearranging it. Rows and rows of blue mushrooms appear everywhere for the visitor to pick. Forty-foot rabbits appear and disappear, and self-portraits of the artist as a blue-faced spirit or a crazed, yellow-haired Alice character with a Cheshire Cat-like grin randomly pop up. An original vocal and string arrangement provides an eerie accompaniment to this mesmerizing environment.
Petra Gemeinböck, Joseph Tremonti
Excavation is a networked virtual environment that explores the surface of historiography. Excavation immerses its visitors in a metaphoric archaeology of encounter, in which their presence engenders temporally unique and continually adaptive sets of relationships between the networked participants, the fluctuating anatomy of the metaphoric landscape, and its multitudinous inhabitants. These inhabitants constitute a subjective aggregate of modern history as captured and conditioned by the technological apparatus of post-industrial society. Alive on the Grid, visitors are recorded and become virtual inhabitants extending and recontextualizing the history of Excavation.
Drew Browning, Annette Barbier
Home is an interactive, navigable virtual-reality environment that explores ideas about dwelling and its relation to the psyche. In it, a house abandoned by its owners is up for sale. Half-empty spaces still reverberate with memories and the most private moments of its previous inhabitants. Deceptively normal looking from the outside, spaces stretch and break apart and walls fall away once you are inside, leaving only a small plateau hanging in emptiness, resounding with the lives and voices of those who still haunt it. The rooms and hallways include a soundscape composed of spoken narrative fragments and a musical ambient score. Immersants may encounter the ghostly presence of other voyagers in the space, and may leave a recording of their voice for future visitors.

Drew Browning and Annette Barbier, major artist/designers
Geoffrey Allen Baum, programming and VRML-CAVE conversion
John Loesel, composer
Dave Tolchinsky, screenwriter
Reid Perkins-Buzo, technical assistant
Sam Ball, theatrical consultant
Voices: Linda Gates, Delle Chatman, David Downs, Rives Collins, Molly Sullivan, Dan Brintz
Contributors: Melinda Levin, Karla Berry, Barbara Bird, Art Nomura, Michelle Citron, Paul Hertz, Ondrea Delio,
Laura Kissel, Harlan Wallach, Deb Diehl, Shawn Decker, Diane Hagamann

Saturday, January 26, 2008

proposal for chicago-performance

Sandin - CAVE as another connection between Chicago and Linz? and use the Sandin image processor. what do you think?

Monday, January 21, 2008

thoughts for next steps

for a future project (if you'd like to continue) i'd love to have compression techniques as a topic for the visuals. as we already found out during the first project, different compression algorithms produce different effects when used in Pd: there were no glitches when we used QuickTime (Sorenson 3), but a lot of glitches that responded to the changes in the music with AVI (DivX 5.0.2).

"Different compression algorithms have different artifacts; some of them are apparent in the first generation, and some of them take many generations to appear. Some artifacts affect the detail within the image, and some affect the apparent motion of the image."
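the generational side of that observation can be simulated with a toy model in plain Python (no real codec involved, all names are made up for illustration): each "generation" blurs the signal slightly, a stand-in for decode-side resampling, then quantizes it to a coarse grid, a stand-in for the lossy encode. artifacts that are barely visible after one pass can keep accumulating across re-encodes:

```python
import math

# Toy model of generation loss (not a real codec): each generation
# applies a 3-tap blur (stand-in for decode-side resampling) and then
# quantizes every sample to a coarse grid (stand-in for lossy encoding).

def one_generation(signal, step=4):
    # 3-tap blur, endpoints passed through unchanged
    blurred = [signal[0]] + [
        (signal[i - 1] + 2 * signal[i] + signal[i + 1]) / 4
        for i in range(1, len(signal) - 1)
    ] + [signal[-1]]
    # quantize every sample to a multiple of `step`
    return [round(x / step) * step for x in blurred]

def rms_error(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# a smooth test signal, then ten successive "re-encodes"
original = [10 * math.sin(i / 2) for i in range(64)]
signal = original
errors = []
for gen in range(10):
    signal = one_generation(signal)
    errors.append(rms_error(original, signal))
```

printing `errors` lets you watch how far each generation has drifted from the original; in this toy model the blur keeps eroding detail while quantization locks the result onto a coarse grid, which is roughly the codec behaviour the quote describes.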