relief projection

Augmented Sculpture by Pablo Valbuena, originally uploaded by hc gilje.

I found the title of this post in one of Michael Naimark's essays. It could also have been called augmented reality, projection of a virtual object onto a physical object, projecting a virtual layer on top of a physical geometry, masking of projections, etc.

I have been researching different ways of projecting onto things other than flat surfaces: projections that project on objects, follow the shape of the room, and projections of virtual 3D shapes onto physical 3D shapes.

In my own work I have used projections as advanced light sources, using masking as a way to fit flat projections onto objects and surfaces, but also to create the illusion of multiple screens from a single source. Some examples here. (Update May 2008: some more recent examples)

My goal has been to create tools that make it easy to start working with a physical space immediately, and to make changes in realtime. I have mainly done this by using multiple OpenGL videoplane layers in Max/MSP Jitter, with one of the layers in a drawing mode so you can trace the shape of a particular object after placing an OpenGL layer over it. I made a crude three-layer tool for the workshop I did at KHIO this summer so the participants could immediately start relating to the physical space.
A prime example of multiple OpenGL videoplanes is Piotr Pajchel's work with Verdensteatret.
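The masking idea — drawing the outline of an object so the video only lights up inside it — can be sketched outside of Jitter as well. Here is a minimal numpy illustration (my own, not code from any of the tools mentioned): a hand-drawn convex quad, given as four corners in counter-clockwise order, is rasterized into a mask that is then applied to a frame.

```python
import numpy as np

def quad_mask(h, w, quad):
    """Boolean mask of the pixels inside a convex quad given as four (x, y)
    corners in counter-clockwise order -- the traced outline of the object."""
    ys, xs = np.mgrid[0:h, 0:w]
    mask = np.ones((h, w), dtype=bool)
    pts = list(quad) + [quad[0]]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        # a pixel is inside if it lies on the same side of every edge
        cross = (x1 - x0) * (ys - y0) - (y1 - y0) * (xs - x0)
        mask &= cross >= 0
    return mask

# mask a flat "video frame" so it only shows inside the drawn shape
frame = np.full((240, 320), 255, dtype=np.uint8)
mask = quad_mask(240, 320, [(60, 40), (260, 60), (240, 200), (80, 180)])
out = np.where(mask, frame, 0)
```

In a live setup the corners would come from clicking in the projector's output while looking at the physical object, which is essentially what the drawing layer in the Jitter patch does.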

I have done some experiments with projecting a 3D shape onto physical objects, but still have a long way to go in terms of having a simple setup for this.
Obviously I have been looking at what other people have been doing, but none of the systems I have found seems to be available to the public, and few of them seem to have been used beyond their development period — which might be a sign that they are not as flexible as hoped, and perhaps also quite time-consuming to prepare.

Most systems use some method of tracking the shape/space they want to project onto, in combination with custom-made software, to map the projected image correctly onto the physical object. The mapping depends on the lens specifications of the projector, the placement of the projector in relation to the objects to be projected on, etc.
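For a flat surface, the core of such a mapping is a homography between projector pixels and points on the surface, and four point correspondences are enough to solve for it. A minimal sketch of the standard DLT (direct linear transform) solution — my own illustration, not any particular system's code:

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography H mapping src -> dst (dst ~ H @ src)
    from four (x, y) point pairs, via the standard DLT linear system."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # H (up to scale) is the null vector: the smallest right singular vector
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def apply_h(H, pt):
    """Map a 2D point through H using homogeneous coordinates."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# e.g. fit the projector's unit square onto a skewed quad on the wall
corners_proj = [(0, 0), (1, 0), (1, 1), (0, 1)]
corners_wall = [(10, 20), (200, 30), (220, 210), (15, 190)]
H = homography(corners_proj, corners_wall)
```

Calibrating by clicking four corners, as several of the tools below do, amounts to collecting exactly these correspondences.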

The LightTwist system, developed at the University of Montreal (not much seems to have happened after 2004), uses “a panoramic (catadioptric) camera to get correspondances between each projector pixel with the camera pixel. This camera represents the viewpoint of our futur observers. Then, from what the observer should see, we can build the projector images from their respective mapping.”
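The idea can be sketched abstractly: once you know, for each camera pixel, which projector pixel it sees, you can fill in the projector frame so that the camera — standing in for the future observer — sees the target image. A toy version, where the lookup-table format is my own assumption rather than LightTwist's actual data structure:

```python
import numpy as np

def build_projector_image(target, proj_shape, corr):
    """corr[cy, cx] = (py, px): which projector pixel each camera pixel sees.
    Fill a projector frame so the camera would observe `target`."""
    proj = np.zeros(proj_shape, dtype=target.dtype)
    for cy in range(target.shape[0]):
        for cx in range(target.shape[1]):
            py, px = corr[cy, cx]
            proj[py, px] = target[cy, cx]
    return proj

# toy correspondence: the surface shows the projector image mirrored
corr = np.empty((4, 4, 2), dtype=int)
for cy in range(4):
    for cx in range(4):
        corr[cy, cx] = (cy, 3 - cx)

target = np.arange(16, dtype=np.uint8).reshape(4, 4)
proj = build_projector_image(target, (4, 4), corr)
```

In practice the correspondence table would be measured, e.g. by projecting structured-light patterns and observing them with the camera.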

Videobjects from Whitevoid design in Germany is software for realtime distortion of video to fit physical objects, using predistorted video; you calibrate it either with a help grid or by importing a model of the real-world setup. So you would first need to create the 3D shapes to project onto, then decide how the video maps onto the 3D objects, and finally do the calibration to match the virtual objects with the physical ones.

I think the most spectacular calibration solution so far is the “automatic projector calibration with embedded light sensors” (pdf), a collaboration between people from Carnegie Mellon, Mitsubishi Electric Research Lab and Stanford. They use fiber optics and light sensors built into the objects/surfaces to be projected on, and by projecting a series of Gray-coded binary patterns, custom-made software is able to adjust the image in less than a second to perfectly fit the projection surface, with a much higher resolution than a camera-based solution. Take a look at the impressive video:
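The Gray-coded patterns themselves are simple to generate and decode: each projected frame carries one bit of every projector column's Gray code, so a sensor that records the on/off sequence over the frames can recover exactly which column hits it. A small sketch of the principle (the pattern layout and bit count are my own assumptions, not the paper's exact scheme):

```python
def gray_code(n):
    """Binary-reflected Gray code of n: consecutive codes differ in one bit,
    which makes the decoding robust near pattern edges."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Invert the Gray code to recover the original column index."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def column_patterns(width, bits=10):
    """One list per projected frame: is projector column x lit in frame b?
    A sensor embedded in the surface records the on/off sequence over the
    frames; decoding that sequence identifies the column hitting it."""
    return [[(gray_code(x) >> b) & 1 for x in range(width)]
            for b in reversed(range(bits))]

# a sensor sitting under projector column 345 of a 1024-pixel-wide projector
patterns = column_patterns(1024)
readings = [frame[345] for frame in patterns]
column = gray_decode(int("".join(map(str, readings)), 2))
```

A second pass with row-wise patterns pins down the full (column, row) projector pixel at each sensor; with a handful of sensors located this way, the image can be warped to fit the surface.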

The pdf and video seem to be from 2004, but I found some more information at Johnny Chung Lee's website. They are hoping to make the system fast enough for live tracking of moving objects, and also to make the calibration pattern invisible using infrared light.
Update: there is now more information on Lee's website.

If you have a big budget you could always invite Circus of Now to do the video for you (”We build skyscrapers of light”).

At Ars Electronica this year I had the pleasure of seeing Pablo Valbuena's Augmented Sculpture (image at top of this post), which consists of a physical structure in the corner of the room, with the exact same virtual shape projected onto it using one projector. By animating the color and lighting of this virtual shape, some very interesting light/shadow plays emerge. Valbuena collaborates with some game developers in Spain who constructed the virtual model and animation in standard 3D software.
This work shows the potential of augmented reality using video projection, and I hope to see more of his work soon. (He has a big outdoor installation in Madrid at the moment; hopefully there will be some documentation soon.)

Update Feb 5th 2008: Valbuena has updated his website with documentation of several projects: different versions of the augmented sculpture and the public square installation in Madrid.

10 Responses to “relief projection”

  1. Network Research » Augmented reality projections Says:

    […] an interesting post a few days ago on HC Gilje’s weblog Conversations with Spaces concerning relief projections. The focus in the post was on the work Augmented Sculpture v1.0 by Pablo Valbuena, which is […]

  2. susan main Says:

    I am entranced by the “automatic calibration with embedded light sensors”. Are there more tech specs for experimenting with this?

  3. hcgilje Says:

    not more than what is mentioned in the pdf files and video. Mind you, most of this material is from 2004, so there doesn't seem to have been much development of it since then (somebody correct me if this is wrong)

  4. Juanjo Says:

    is there a workshop planned on this in Barcelona?
    I am very interested in it for my next project.

  5. hcgilje Says:

    if you find an institution which can host a workshop I would definitely consider that

  6. Case Williams Says:

    There is more work on the automatic projection calibration; in fact they have it to a point where they can project on foldable objects as well as moving objects. Just check out Johnny's site to see more.

  7. hcgilje Says:

    thanks, case!

  8. karla’s: Schnickschnack Says:

    […] on projection into spaces or onto physical objects check out the relief projection 2008, relief projection and masking projections posts. New posts for inspiration: snow lab and shift v2 installation. I […]

  9. TommeeT Says:

    Well now it’s 2012 and I wonder if anyone knows whether there have been any developments with the automatic projection calibration and its accessibility as a workable tool?
    Am trying to find an effective solution to track moving objects with a projector.
    I have seen the projections onto foldable & moving objects using embedded light sensors on Johnny Lee’s site – very impressive.
    Have tried to contact Mr. Lee a number of times – no response yet…
    I suppose using a Kinect sensor is one possible approach.

  10. hcgilje Says:

    I think Mr Lee went to work for a Google research lab actually.

    Here are some projects that I find interesting:
    Daito Manabe http://www.youtube.com/watch?v=1lUJ0lalMtU&feature=plcp
    Kyle McDonald http://createdigitalmotion.com/2012/03/projector-and-camera-a-little-closer-new-magical-mapping-tools-3d-scanning-and-more/
    Elliot Woods http://www.kimchiandchips.com/blog/?p=800

