I visited the Allosphere at UC Santa Barbara today with the other video designers. I was trying to keep my expectations low and be ready for disappointment, but I was totally blown away. The Allosphere is a spherical room with a bridge through the center. 26 projectors fill the walls (if you can call them that) with seamless 3D content. They use it for data visualizations, simulations, and artistic endeavors. We saw demonstrations of fMRI scans, equations of knots passing through 4D space, a biological simulator, and raycast material data.

It had a simpler mode that allowed any desktop content to be presented on a broad swath of the screen, but for the full spherical immersion experience, the software had to be custom-coded. Apparently, a lot of work went into “state synchronization”: keeping 13 computers displaying different sections of the same environment sixty times per second. You’d think it would be pretty simple: just divvy up the data and send it over OSC or something, or have each computer solve the same simulation and just give it camera directions. Not so. When you’re dealing with millions of points per frame, the bandwidth challenges are significant, and the simulations have too many variables for each computer to solve independently and stay frame-accurate. Despite this, they’ve found a way. And it is amazing.
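To make the trade-off concrete, here’s a toy sketch (my own guess at the general shape of the idea, not the Allosphere’s actual code) of why you broadcast a tiny frame descriptor instead of millions of points: a master sends only a frame number, camera pose, and RNG seed, and each render node deterministically regenerates just its own slice of the scene.

```javascript
// Small deterministic PRNG (mulberry32) so every node agrees bit-for-bit.
function mulberry32(seed) {
  return function () {
    seed |= 0; seed = (seed + 0x6D2B79F5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Master: the entire per-frame network payload is a few dozen bytes.
function packFrame(frame, camera, seed) {
  return JSON.stringify({ frame, camera, seed });
}

// Render node: regenerate the shared random stream locally,
// but keep only this node's wedge of the point cloud.
function renderSlice(packet, nodeIndex, nodeCount, pointsPerFrame) {
  const { frame, camera, seed } = JSON.parse(packet);
  const rand = mulberry32(seed + frame); // identical stream on every node
  const points = [];
  for (let i = 0; i < pointsPerFrame; i++) {
    const p = { x: rand(), y: rand(), z: rand() };
    if (i % nodeCount === nodeIndex) points.push(p); // this node's share
  }
  return { frame, camera, points };
}
```

Because every node draws from the same seeded stream, the thirteen images tile seamlessly without any point data crossing the network; the catch, as they noted, is that real simulations have far too many variables for this naive determinism to hold.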

While there, we also saw a student who was developing a new platform for creative coding called Gibber. It lets you live-code music and visuals in the browser in JavaScript and GLSL. It seems elegant and intuitive in itself, but it also offers a number of options for collaboration. Not only does it have a shared code base to which any user can contribute instantly, it allows live collaboration between computers, letting users work on the same project across the room or across the world. Right now, it seems a little constrained by its limited access to the local filesystem and, I imagine, by its lack of output options (no way to separate the display from the code, no Syphon support), but I bet these will come in time.
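The simplest way I can imagine that kind of live collaboration working (this is a toy of my own, not Gibber’s actual mechanism) is an operation log: every edit becomes an op, a relay assigns a single global order, and every client replays the same ordered log, so all buffers converge.

```javascript
// Toy centralized op-log for collaborative text editing.
class Relay {
  constructor() { this.log = []; this.clients = []; }
  connect(client) {
    this.clients.push(client);
    this.log.forEach(op => client.apply(op)); // late joiners replay history
  }
  submit(op) {
    this.log.push(op);                        // single global ordering
    this.clients.forEach(c => c.apply(op));   // fan out to every editor
  }
}

class Client {
  constructor(relay) { this.buffer = ''; this.relay = relay; relay.connect(this); }
  type(pos, text) { this.relay.submit({ pos, text }); } // local edit -> op
  apply(op) { // everyone applies ops in the same order
    this.buffer = this.buffer.slice(0, op.pos) + op.text + this.buffer.slice(op.pos);
  }
}
```

This naive scheme ignores conflicting concurrent edits; real collaborative editors layer operational transformation or CRDTs on top, but the convergence idea is the same.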

Finally, we met the team behind Open Drone Control. They showed us their motion-capture lab, where they’ve programmed a Parrot AR to follow a user’s hand around. You have to wear a reflective glove, but once you don it, you become a drone-tamer. I’ve never felt so close to having magic abilities. From the moment you lift your hand off the floor, there’s a very tangible sense that you’re dealing with a living being, one that responds to an invisible force cast from your hand. It was a great team doing great work.
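The core idea behind hand-following can be sketched in a few lines (my own hedged guess, with assumed gain and offset values; the actual Open Drone Control stack is surely richer): each mocap frame gives the hand and drone positions, and a proportional controller turns the position error into a velocity command, clamped to a safe maximum.

```javascript
const KP = 0.8;        // proportional gain (assumed)
const MAX_SPEED = 1.5; // m/s safety clamp (assumed)

function followHand(dronePos, handPos, offset = { x: 0, y: 0, z: 0.5 }) {
  // Target a point hovering slightly above the hand.
  const target = {
    x: handPos.x + offset.x,
    y: handPos.y + offset.y,
    z: handPos.z + offset.z,
  };
  const cmd = {};
  for (const axis of ['x', 'y', 'z']) {
    const v = KP * (target[axis] - dronePos[axis]);            // P-term on error
    cmd[axis] = Math.max(-MAX_SPEED, Math.min(MAX_SPEED, v));  // clamp
  }
  return cmd; // velocity command in m/s, issued once per mocap frame
}
```

Run at mocap rates, even a controller this crude makes the drone feel tethered to your hand, which is presumably why the illusion of an invisible force is so strong.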

At the end of the day, the trip was both inspiring and frustrating. It was amazing that my program was able to give me exposure to these projects, but it reaffirmed my desire to do cutting-edge, technology-driven work. Seeing what other media arts programs are doing made tangible what mine lacks for my interests. My plan is to push my mentor and the administration for more opportunities like this, and to reach out to teachers and artists who can be brought in to support this sort of work.