puls

light installation in the Fantoft–Paradis tunnel in Bergen, commissioned for the new metro line Bybanen, which opened at the end of June 2010.

The installation consists of two long wave patterns of light, one side red-white, the other blue-white. The total length is about 400 m. The two wave patterns overlap, so for about 100 m you will see both the red and the blue pulse. The material used is neon rope light.

Together with Marius Watz I was invited to make either a light or video work for one of the tunnels of the new Bybanen metro line in Bergen. You can see some images from Marius' installation here.

I wanted to work with the fact that the train moves at about 70 km/h (about 20 m/s) through the tunnel, using the movement of the train as a way of animating the two wave patterns mounted on the tunnel walls.
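
To get a feel for the numbers: a static wave pattern with spatial wavelength λ, passed at speed v, is perceived as a pulse of frequency f = v/λ. A quick sketch (the 10 m wavelength is an assumed example value, not the actual dimensions of the installation):

```python
# Perceived pulse frequency of a static wave pattern seen from a moving train.
# The 10 m wavelength is an assumed example, not the actual installation spec.

train_speed_kmh = 70.0
train_speed_ms = train_speed_kmh / 3.6            # ~19.4 m/s
wavelength_m = 10.0                               # assumed spatial period of one wave

pulse_hz = train_speed_ms / wavelength_m          # temporal frequency as perceived
pattern_length_m = 400.0
ride_seconds = pattern_length_m / train_speed_ms  # time spent passing the pattern

print(f"perceived pulse: {pulse_hz:.2f} Hz")
print(f"time to pass the 400 m pattern: {ride_seconds:.1f} s")
```

So at full speed a rider would see roughly two pulses per second, for about twenty seconds.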

There were some interesting challenges to this project, as the installation was supposed to be ready for the opening of Bybanen at the end of June. This meant I had to send in the proposal for the installation before the trains started to run. I ended up making an animation of some simple waveforms in Processing, which gave a pretty good impression of how this would be experienced when sitting in the train.

There are a lot of factors influencing how the work is experienced. If you sit by the window closest to the tunnel wall, the wave breaks up into rapidly changing curves and lines. If you sit closer to the middle of the tunnel, you will see longer stretches of the wave. You will also see reflections of the waves in the windows of the train.

One thing I don't have control over is the light inside the train, which is really bright. This diminishes the effect of the neon rope light washing the tunnel in different blends of blue, red and white (and potentially casting coloured shadows of the passengers onto the tunnel wall). Thus the installation exists in two different modes: the tunnel as a light modulator, which can only be experienced inside the tunnel when no trains are running, or as animated wave pulses when riding the train through the tunnel.

Esemplasticism: The Truth is a Compromise

blink v2, originally uploaded by hc gilje.

Blink is part of an exhibition curated by Hicham Khalidi and produced by the Den Haag art space TAG for Club Transmediale, which opens in Berlin today.

From the exhibition description:
Our brains are esemplastic. They are perfectly evolved for pattern recognition, designed to shape disconnected elements, like the incomplete or ambiguous information we get from our senses, into the seamless whole of our experience. What we see, hear, touch and feel is folded into an amalgam of data, emotions and cultural baggage. And in the contemporary world, this esemplastic power is pushed to the limit in the sea of information that we are floating in: data-visualizations, scientific studies and computer analyses become increasingly abstract and disconnected from our normal experiences. Are we losing our sense of meaning as we fail to join the billions of dots? What compromises are we making when we try to settle on a particular interpretation?

The works in Esemplasticism – the truth is a compromise are mostly low-tech, using everyday objects and media. Employing sound, objects and synchronicity; relatively ‘old’ technologies like field recordings, music, video, and projection, each piece lifts the curtain on the perceptual tactics that our esemplastic/apophenic/pattern-recognising brains employ to negotiate the world; with wit and irony, they have much to say about verisimilitude as each exposes a different fracture between our expectations, our perceptions and our compromises about the objective ‘truth’ that exists ‘out there’.

Participating artists
Edwin Deen, Daniël Dennis de Wit, Lucinda Dayhew, Anke Eckardt, HC Gilje, Terrence Haggerty, Yolande Harris, Alexis O’hara, Pascal Petzinger, Mike Rijnierse, Willem Marijs, Bram Vreven, Katarina Zdjelar, Valentin Heun, Sagarika Sundaram, Gijs Burgmeijer.

I will post links to the catalogue when that becomes available.

The exhibition will be on until the end of February.

For me this was an opportunity to improve the installation both aesthetically and technically. I constructed a platform for the equipment using a laser cutter, which turned out quite nice. This greatly simplified the installation of the work. As mentioned in previous posts, the installation uses my dimsun lighting system, and the design for this will be made available shortly.

Due to other obligations I needed to set up my installation before everyone else. It was a strange experience to work alone in the 900 m² empty building in Spandauer Strasse, close to Alexanderplatz. My only companion was the stepladder, which also became the model for my documentation.

dimsun

For the last half year I have been working quite a lot with lights and shadows, and this summer I decided to build my own lighting system, which I called dimsun. The first use of it was for my blink installation, consisting of 24 LEDs placed in a circle.

It consists of dimmers based on the TLC5940 LED driver chip. Each dimmer can control 16 channels at up to 5 W each, and is intended for power LEDs (very bright LEDs). The dimmers are controlled by an Arduino microcontroller, and up to 6 dimmers (96 channels) can be controlled from one Arduino.
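
To give a feel for how such a daisy-chained driver setup handles its data, here is a rough sketch of the TLC5940's 12-bit grayscale format in Python. The shift order is my reading of the chip's shift-register scheme; this is an illustration only, not the actual dimsun firmware.

```python
# Sketch of the TLC5940 grayscale data format: 16 channels per chip,
# 12 bits per channel, chips daisy-chained (96 channels = 6 chips).
# Illustration only, not the actual dimsun firmware.

CHANNELS_PER_CHIP = 16
BITS_PER_CHANNEL = 12

def pack_grayscale(values):
    """Pack 12-bit channel values into the bit stream shifted to the chain.

    Data is shifted in MSB first, with the last channel of the last chip
    first, so the value list is reversed before packing.
    """
    assert len(values) % CHANNELS_PER_CHIP == 0
    bits = 0
    for v in reversed(values):            # highest channel is shifted out first
        assert 0 <= v < 4096              # 12-bit range
        bits = (bits << BITS_PER_CHANNEL) | v
    nbytes = len(values) * BITS_PER_CHANNEL // 8
    return bits.to_bytes(nbytes, "big")

# 96 channels (6 chained chips), a simple ramp of brightness values:
frame = [(i * 43) % 4096 for i in range(96)]
stream = pack_grayscale(frame)
print(len(stream), "bytes per frame")     # 96 * 12 / 8 = 144 bytes
```

In practice the Arduino Tlc5940 library hides this packing behind set/update calls, but the 144 bytes per frame gives an idea of how little data a full 96-channel refresh actually is.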

more images of the dimsun system.

The schematics will be made available soon.

The lamps are based on star-shaped LEDs combined with a lens, mounted on an aluminum profile.

relief projection 2008

This Easter I got the opportunity to work in a large theater space (BIT Teatergarasjen) to continue my work on relief projection (or masked projection).
I made 9 plywood boxes to use as my projection objects, and worked with two projectors, having a total of 16 projection surfaces.
One of the aims for this session was to also work with sound (each object would double as a speaker), and to create a depth in the placement of the objects.
I implemented my nodio system into the projection patch, which made it possible for me to create sequences of movement.
There are plenty of images from the session here.

relief projection

augmentedsculpture by pablo valbuena, originally uploaded by hc gilje.

I found the title of this post in one of Michael Naimark's essays. I guess it could also have been called augmented reality, projection of a virtual object onto a physical object, projecting a virtual layer on top of a physical geometry, masking of projections, etc.

I have been researching different ways of projecting on other things than flat surfaces: projections onto objects, projections that follow the shape of the room, and projections of virtual 3D shapes onto physical 3D shapes.

In my own work I have used projections as advanced light sources, masking as a way to fit flat projections on objects and surfaces, but also to create the illusion of multiple screens from a single source. Some examples here. (Update may 2008: some more recent examples)

My goal has been to create tools which make it easy to start working with a physical space immediately, being able to make changes in realtime. I have mainly done this by using multiple OpenGL videoplane layers in Max/MSP Jitter, with one of the layers having a drawing mode, so you are able to draw the shape of a particular object after you have placed an OpenGL layer over it. I made a crude 3-layer tool for the workshop I did at KHIO this summer to enable the participants to immediately start relating to the physical space.
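
The idea of drawing a mask over a placed layer can be sketched in a few lines: a polygon traced over the object, and a point-in-polygon test deciding which pixels of the layer survive. A pure-Python illustration of the principle (the real-time version runs on the GPU; the quadrilateral below is an arbitrary example shape):

```python
# A 2D projection mask as a traced polygon: pixels inside the polygon keep
# the video layer, pixels outside go black. Pure-Python sketch of the idea;
# the example quadrilateral is arbitrary.

def inside(poly, x, y):
    """Ray-casting point-in-polygon test."""
    hit = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):          # edge crosses the scanline at this y
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                hit = not hit
    return hit

def build_mask(poly, width, height):
    """1 where the traced shape keeps the projection, 0 where it is masked."""
    return [[1 if inside(poly, x + 0.5, y + 0.5) else 0
             for x in range(width)] for y in range(height)]

# A traced quadrilateral roughly covering part of a small frame:
shape = [(4.0, 1.0), (9.0, 2.0), (8.0, 7.0), (5.0, 6.0)]
mask = build_mask(shape, 10, 8)
for row in mask:
    print("".join("#" if m else "." for m in row))
```

With several such masks from one projector you get the illusion of multiple independent screens from a single source.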
A prime example of multiple OpenGL videoplanes is Piotr Pajchel's work with Verdensteatret.

I have done some experiments with projecting a 3D shape onto physical objects, but still have a long way to go in terms of having a simple setup for this.
Obviously I have been looking at what other people have been doing, but none of the systems I have found seem to be available to the public, and few of them seem to have been used beyond the development period of the system, which might be a sign that they are not as flexible as hoped, and maybe also quite time-consuming to prepare.

Most systems use some method of tracking the shape/space they want to project onto, in combination with custom-made software, to map the projected image correctly onto the physical object. This mapping depends on the lens specifications of the projector, the placement of the projector in relation to the objects to be projected on, etc.
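
At the core of fitting a flat image onto a surface sits a projective (perspective) mapping. For the common case of mapping a rectangular frame onto a traced quadrilateral there is even a closed form, Heckbert's square-to-quad mapping. A sketch, with corner coordinates that are purely illustrative:

```python
# Square-to-quad projective mapping (after Heckbert): maps the unit square
# (the flat video frame) onto an arbitrary quadrilateral (the traced surface).
# The example corner coordinates are illustrative, not from a real setup.

def square_to_quad(corners):
    """Return f(u, v) -> (x, y) mapping the unit square onto the quad.

    corners: [(x0,y0), (x1,y1), (x2,y2), (x3,y3)] for the images of
    (0,0), (1,0), (1,1), (0,1) respectively.
    """
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    sx = x0 - x1 + x2 - x3
    sy = y0 - y1 + y2 - y3
    dx1, dy1 = x1 - x2, y1 - y2
    dx2, dy2 = x3 - x2, y3 - y2
    det = dx1 * dy2 - dx2 * dy1
    g = (sx * dy2 - dx2 * sy) / det
    h = (dx1 * sy - sx * dy1) / det
    a, b, c = x1 - x0 + g * x1, x3 - x0 + h * x3, x0
    d, e, f = y1 - y0 + g * y1, y3 - y0 + h * y3, y0

    def apply(u, v):
        w = g * u + h * v + 1.0           # projective divisor
        return (a * u + b * v + c) / w, (d * u + e * v + f) / w

    return apply

# Map the video frame onto an illustrative keystoned quad:
warp = square_to_quad([(0, 0), (10, 1), (9, 8), (1, 7)])
print(warp(0, 0), warp(1, 1), warp(0.5, 0.5))
```

Calibrating then reduces to finding the four corners, by tracking, by dragging a help grid, or by measuring the physical setup.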

The LightTwist system developed at the University of Montreal (not much seems to have happened after 2004) uses “a panoramic (catadioptric) camera to get correspondances between each projector pixel with the camera pixel. This camera represents the viewpoint of our futur observers. Then, from what the observer should see, we can build the projector images from their respective mapping.”

Videobjects from the German design studio Whitevoid is software for realtime distortion of video to fit physical objects, working from pre-distorted video; you calibrate it either with a help grid or by importing a model of the real-world setup. So you would first need to create the 3D shapes to project onto, then decide how the video will map onto the 3D objects, and finally do the calibration to match up the virtual objects with the physical ones.

I think the most spectacular calibration solution so far is the “automatic projector calibration with embedded light sensors” (pdf), a collaboration between people from Carnegie Mellon, Mitsubishi Electric Research Lab and Stanford. They use fiber optics and light sensors built into the objects/surfaces to be projected on, and by projecting a series of Gray-coded binary patterns, custom-made software is able to adjust the image in less than a second to fit the projection surface perfectly, with much higher resolution than a camera-based solution. Take a look at the impressive video:

The pdf and video seem to be from 2004, but I found some more information on Johnny Chung Lee's website. They are hoping to make the system fast enough for live tracking of moving objects, and also to make the calibration pattern invisible by using infrared light.
update: there is now more information on Lee´s website.
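
The Gray-coded patterns at the heart of that calibration are easy to sketch: pattern i lights every projector column whose Gray code has bit i set, and a sensor that records its on/off readings across the sequence can decode its own pixel coordinate. A one-axis Python sketch of the principle (illustrative, not the CMU implementation):

```python
# Gray-coded binary patterns, as used in structured-light projector
# calibration: pattern i lights every projector column whose Gray code has
# bit i set. A light sensor recording on/off across all patterns can decode
# its own column coordinate. One axis only, for illustration.

def gray(n):
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

def gray_to_binary(g):
    """Invert the Gray code (XOR of all right shifts)."""
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

def patterns(width, n_bits):
    """Pattern i as a list of 0/1 per projector column."""
    return [[(gray(x) >> i) & 1 for x in range(width)] for i in range(n_bits)]

def decode(bits):
    """Recover the column from the sensor's per-pattern readings (LSB first)."""
    g = sum(b << i for i, b in enumerate(bits))
    return gray_to_binary(g)

# A sensor sitting at column 42 of a 64-column projector sees these bits:
pats = patterns(64, 6)
seen = [p[42] for p in pats]
print(seen, "->", decode(seen))   # decodes back to column 42
```

Gray codes are used instead of plain binary because adjacent columns differ in only one bit, so a sensor sitting on a pattern edge mislocates itself by at most one pixel.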

If you have a big budget you could always invite Circus of Now to do the video for you (”We build skyscrapers of light”).

At Ars Electronica this year I had the pleasure of seeing Pablo Valbuena's Augmented Sculpture (image at the top of this post), which consists of a physical structure in the corner of the room with the exact same virtual shape projected onto it using one projector. By then animating the colour and lighting of this virtual shape, some very interesting light/shadow plays happen. Valbuena collaborates with some game developers in Spain who constructed the virtual model and animation in a standard 3D software package.
This work shows the potential of augmented reality using video projection, and I hope to see more of his work soon. (He has a big outdoor installation in Madrid at the moment; hopefully there will be some documentation soon.)

update feb 5th 2008: Valbuena has updated his website with documentation of several projects: different versions of the augmented sculpture and the public square installation in Madrid.

The queen is the supreme power

The queen is the supreme power 2, originally uploaded by hc gilje.

(Slideshow from the rehearsals and performances here)

My latest collaboration with Yannis Kyriakides is a piece based on old telegraph code books. I use scanned pages combined with microscope textures from these books and project onto the orchestra from two sides, using the musicians as screens in combination with a wide screen behind them, trying to create a dynamic space using text fragments and letters as projected light.

It is a coproduction between ZKM in Karlsruhe and Musikfabrik in Cologne, and will be performed May 17th in Cologne and on the 18th at ZKM.

lab jan 2007: masking projections

lab jan 2007, originally uploaded by hc gilje.

(Slideshow of more images from the projection sketches here)

The second focus of this lab session was to work with video projections: masking them to create several projection surfaces from one projector, and relating to physical shapes in the room, like my projection on a sphere in Iball, or the masking of video to fit the gallery spaces in one of the sleepers installations:

The simplest way to do it is to work with a two-dimensional mask. I also wanted to try to work with 3D masks: projecting a virtual 3D version of an object back onto itself. In this way I managed to cover four sides of a cube with two projectors. Another thing I did was to link the individual projection surfaces (from one projector) so focus could move from one area to another, much like how the nodes in nodio operate.

Finally, I did some simple tests using the projector as a light beam.

 
