The intimacy of strangers


I just finished my new film of microscopic lichen landscapes, found on a rock in a fence at Trondenes in Harstad, Norway. The film was commissioned by The Arctic Moving Image & Film Festival, with its premiere on October 13th, 2022.
The film was created using a custom-made, computer-controlled mechanical stage and a digital microscope. Almost 50,000 microscope images were stacked and stitched together into miniature lichen landscapes, with a virtual camera flying over them.
For the soundtrack I used free-impro drumming (performed by Justin Bennett), inspired by the free-impro symbiosis of the lichen, which transforms into a sonic landscape of sci-fi space exploration and encounters with mysterious creatures.

My hope when making this film was to create a bit of awareness about what lichens are.
Beyond the extreme variations in appearance, texture and colour, lichens have become poster organisms for a new biology that challenges the idea of the individual and supplements the theory of evolution.
The title The intimacy of strangers is taken from a chapter about lichen in Merlin Sheldrake's excellent 2020 book on fungi, Entangled Life. It refers to the fact that a lichen is not one organism, but a symbiosis of several organisms that are not related at all; they come from different kingdoms: mainly a fungus that partners up with a photosynthesising organism (an alga or a cyanobacterium). So instead of acquiring traits over long stretches of time through evolution (so-called vertical gene transfer), the lichen combines traits from fungus and alga/bacterium in the spirit of horizontal gene transfer: once they have partnered up, they have acquired these new traits, no need to wait millions of years.
This partnership also means it doesn't make sense to talk about an individual, but rather about an ecosystem of players with different roles.

Me and microscopes


Digital microscopes have been a part of my practice since 2007 when my live collaboration mikro with Justin Bennett was first performed at Paradiso in Amsterdam.
Like telescopes, microscopes extend our perceptive range, letting us see details of objects and organisms which, because of scale, remain hidden to us in everyday life. For me it is always a joy to explore the microscopic universes of textures, materiality and colours.

In 2016 I was commissioned to make a film for Vertical Cinema, a project initiated by Sonic Acts where CinemaScope 35 mm film is shown in a vertical format. I decided to make a microscope film based on plastic wrappings from consumables.
I wanted to work with vertical motion for the vertical screen, so I made a computer-controllable mechanical stage for the microscope in order to create animation along one axis: for every image captured, the stage would move by a very small increment, making it possible to then create an animation from the still images. This resulted in rift, which premiered at the Sonic Acts festival in 2017, with a soundtrack by Bennett.


While working on this project I imagined creating a mechanical stage that could move in the x (sideways), y (back and forth) and z (up and down) directions, and also rotate around the axis of the microscope, in order to create curves that followed the terrain of the microscopic landscape.

In the late summer of 2020 I spent a lot of time alone working underground in a former water reservoir, making the two large-scale installations Shadowgrounds for Factory Light Festival. One of my studio neighbours suggested that I read Underland by Robert Macfarlane, which I did, or rather listened to, while driving to or working in the water reservoir. One of the chapters in that book is about Merlin Sheldrake, which made me listen to his book as well, about the mind-blowing world of fungi.
I don't quite remember how, in the same setting, I ended up listening to a book by Jon Larsen, of Hot Club de Norvège fame, about his obsessive search for micrometeorites. To document the micrometeorites he eventually found, he collaborated with Jon Kihle, who had a powerful microscope camera. This is when I first heard about the concept of focus stacking:
Microscopes have a very shallow depth of field, which means that when looking at something that isn't flat, most of what you see is out of focus. Focus stacking works around this by combining images taken at different focus distances, thus bringing out the three-dimensionality of whatever is under the microscope.
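The principle can be sketched in a few lines of code: for every pixel, keep the value from the slice where a local sharpness measure is highest. This is a minimal sketch using NumPy with a crude Laplacian as the sharpness measure; real stacking software also aligns the slices and blends across scales.

```python
import numpy as np

def focus_stack(slices):
    """Naive focus stack: per pixel, pick the slice with the
    highest local sharpness (absolute Laplacian response)."""
    stack = np.stack(slices).astype(float)          # shape (n, h, w)
    # crude Laplacian: centre pixel against its four neighbours
    lap = np.abs(
        -4 * stack
        + np.roll(stack, 1, axis=1) + np.roll(stack, -1, axis=1)
        + np.roll(stack, 1, axis=2) + np.roll(stack, -1, axis=2)
    )
    best = np.argmax(lap, axis=0)                   # (h, w) index of sharpest slice
    h, w = best.shape
    return stack[best, np.arange(h)[:, None], np.arange(w)[None, :]]
```

This already shows why the technique works: each output pixel comes from whichever focus distance rendered that spot sharpest, so the assembled image is in focus everywhere.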

After visiting Harstad for location scouting this spring, I decided that this was an opportunity to realise my next microscope project.
I had found the perfect location around Trondenes Kirke in Harstad, a beautiful area with a lot of history, both from Viking times and more recently the Second World War (with a Soviet labour camp, and later a camp for the people of Northern Norway who were forced to leave their homes at the end of the war).
There are rock fences built around the church and the graveyard, covered with a carpet of lichen, the orange colours being the most noticeable. I thought it could be interesting to do a very site-specific project: to just focus on life on this one rock in the fence (as a parallel to life on the bigger rock, Earth).


I have always been interested in how things and organisms have completely different time cycles and durations, and seeing the lichen growing on the rocks of the fences around the church made me think of everything that has passed by them during their existence (the oldest lichen that has been dated is 9,000 years old and lives in Sweden). Although lichens and humans inhabit the same planet, we live in parallel worlds with different time cycles.
Sometimes these worlds interact with each other: lichens mine minerals from the rocks, releasing minerals trapped in the rock into the greater ecosystem (where they might end up in your body at some point). Lichens are often the first settlers on new territory: they make the first soil on new rock formations (islands, lava fields, mountains), which becomes a starting point for other life. Lichens notice the presence of the human world mainly through pollution, which has caused several species to go extinct.

Many of my projects start with first developing hardware and software that I then use to make a piece. This involves a lot of research and getting into things I know little about.
I often quote Ursula K. Le Guin: “I don’t know how to build and power a refrigerator, or program a computer, but I don’t know how to make a fishhook or a pair of shoes, either. I could learn. We all can learn. That’s the neat thing about technologies. They’re what we can learn to do.”


Making, learning and sharing tools is an important part of my practice, so the next section is an attempt at sharing some of the process (nerd alert) of making this film.
The challenge on the hardware side of this project was to create a moving platform able to repeat the same motion over and over again in very small steps.
I am not very experienced with mechanical motion, so there was a lot of trial and error before I got the motion of the mechanical stage stable enough. I am lucky to have access to the maker spaces bitraf and Fellesverkstedet in Oslo, where I can make prototypes and finished work with laser cutters and CNC routers. In fact, the mechanics and motion control of laser cutters and CNC machines are very similar to what I needed for the microscope stage: horizontal motion (x and y) and vertical motion (z, moving the microscope closer to or further away from the sample). On top of that I wanted to add mechanical rotation, for four axes of motion in total.


Luckily, in the midst of my mechanical struggles, I discovered the technique of stitching individual microscope images together into a bigger mosaic image, something often used in medical and biological microscopy. This compensates for another inherent issue with microscope photography: the larger the magnification, the less you see of the sample (a small field of view).
This changed my working process dramatically: instead of making individual paths for the mechanical stage to follow, I could do an automated grid scan of a sample and then make as many paths as I liked in software later, making the whole process much more flexible.
So for each position in the grid, a number of images is captured at different focus depths. As an example, a 5×5 grid would cover an area of about 8×8 mm, and with maybe 50 images per tile this becomes 1250 images for one sample of lichen.
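The grid bookkeeping is easy to express in code. Here is a small sketch of generating the stage positions for one scan; the 1.6 mm field of view per tile is my own assumption, inferred from the 5×5 ≈ 8×8 mm example above, and the serpentine ordering is just one reasonable choice, not necessarily the actual scan order.

```python
def grid_positions(tiles_x, tiles_y, fov_mm, overlap=0.0):
    """XY stage positions (in mm) for a serpentine grid scan.
    fov_mm is the width of one tile's field of view; overlap is the
    fraction of a tile shared with its neighbour to help the stitcher."""
    step = fov_mm * (1.0 - overlap)
    positions = []
    for j in range(tiles_y):
        cols = range(tiles_x)
        if j % 2:                      # serpentine: reverse every other row
            cols = reversed(list(cols))
        for i in cols:
            positions.append((round(i * step, 4), round(j * step, 4)))
    return positions
```

With `grid_positions(5, 5, 1.6)` you get 25 tiles spanning about 8 × 8 mm, and at roughly 50 focus slices per tile that is 25 × 50 = 1250 captures, matching the figure above.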


So how did this setup work?
The microscope camera is connected to a computer running microscope imaging software (ToupView in my case).
The mechanical stage consists of three motors controlled by a standard Arduino-based 3-axis CNC controller of the kind used in DIY 3D printers and CNC machines. The software controlling it is Universal Gcode Sender, which basically sends the controller information about the position of the XYZ axes, among other things.
For each XY position in a predefined grid (positions calculated in Rhino/Grasshopper, but that's another story), the Z axis moves closer to or further away from the lichen sample to change the microscope focus. The actual low and high points differ for each sample, so I do a manual check to get an idea of what works best for that particular lichen formation. At each Z position, a trigger is sent from the controller, via another microcontroller, to trigger image capture on the computer. This is then repeated for every XY position in the grid.
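Conceptually, the command stream sent to the controller looks something like the sketch below. The exact G-code dialect depends on the controller firmware, and using the coolant commands M8/M9 as a capture trigger is purely illustrative; it is not necessarily how the actual trigger signal is wired.

```python
def scan_gcode(xy_positions, z_low, z_high, z_steps, feed=100):
    """G-code for a focus-sweep grid scan: visit each XY position,
    step Z through the focus range, and pulse an output pin (M8/M9
    here, as an illustrative trigger) at every stop."""
    dz = (z_high - z_low) / (z_steps - 1)
    lines = ["G21", "G90"]             # millimetres, absolute positioning
    for x, y in xy_positions:
        lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed}")
        for k in range(z_steps):
            lines.append(f"G1 Z{z_low + k * dz:.3f} F{feed}")
            lines.append("M8")         # trigger line high -> capture image
            lines.append("G4 P0.5")    # dwell while the camera exposes
            lines.append("M9")         # trigger line low
    return lines
```

A sender like Universal Gcode Sender would then stream these lines to the controller one by one.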

After the individual images of each tile have been captured, the process of stacking and stitching starts.
There are several options for focus stacking software; the two most popular are Zerene Stacker and Helicon Focus.
For stitching there are free alternatives like ImageJ/Fiji with various plugins, but I ended up using a commercial application, PTGui, which is very intuitive to use and gives great results.
After all the images have been stacked, and each tile has been stitched together with the other tiles, I have a lichen landscape I can explore.
For this I create camera paths in the free 3D software Blender, from which I make animated sequences of the camera moving across the landscape. I first got into Blender and camera paths when working with 3D scans for the performance Nye Krigere, and later in the series of point-cloud works Vardø Kystopprøret, so I could use that experience to create camera (and light) paths with interesting movements.
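Stripped of Blender specifics, a camera path is essentially a resampling problem: distribute one camera position per frame evenly along a polyline of waypoints. A minimal sketch of that idea; in Blender itself this would be a curve object with a Follow Path constraint.

```python
import math

def camera_path(waypoints, n_frames):
    """One camera position per frame, linearly interpolated along a
    polyline of (x, y, z) waypoints."""
    # length of each polyline segment
    seg = [math.dist(a, b) for a, b in zip(waypoints, waypoints[1:])]
    total = sum(seg)
    frames = []
    for f in range(n_frames):
        d = total * f / (n_frames - 1)  # distance travelled by this frame
        i = 0
        while i < len(seg) - 1 and d > seg[i]:
            d -= seg[i]                 # walk forward to the right segment
            i += 1
        t = d / seg[i] if seg[i] else 0.0
        a, b = waypoints[i], waypoints[i + 1]
        frames.append(tuple(a[k] + t * (b[k] - a[k]) for k in range(3)))
    return frames
```

A real path would use smooth curves and also interpolate the camera's orientation, but the flexibility is the same: once the landscape is scanned, any number of flyovers can be generated in software.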

Finally, for the soundtrack I used Reaper, a very reasonably priced DAW (digital audio workstation); like Blender, it has a big community of enthusiastic users sharing knowledge, making it easy to find what you are looking for.

The stars of the film

VPT 8 released


I just released VPT 8, over ten years after the first release of Video Projection Tools (VPT). VPT has proven to be a popular free alternative for Windows and Mac users interested in projection mapping and realtime video experiments: the previous version, VPT 7, was downloaded over 100,000 times, two thirds by Windows users and one third by Mac users.

VPT 8 has a lot of under-the-hood changes. It is now 64-bit only. It will also be the last version of VPT.

I have created an online manual which should make it easier to get started doing interesting things with your computer and a projector.

You can download and read more about new features on the VPT home page.

VPT 7 released


After a long development period, VPT 7 is finally out.

The previous version, VPT 6, was downloaded over 20,000 times.

VPT 7 is completely rebuilt, making it faster, more stable, and easier and more flexible to use than previous versions of VPT.

and yes, it is still FREE!

Some of the new features include:

dynamic layers: add the number of layers you need for a particular project.

flexible source setup: create your own mix of quicktime, hap, still and mix modules according to the needs of your project (on top of the live, solid and syphon sources).

In the Mac version, VPT continues to support the Syphon framework, and has also added support for the HAP codec, an open-source codec that greatly improves playback of high-resolution video.

there is a completely new mask editor which lets you add as many mask points as you need.

the cornerpin adjustments of layers have improved, now providing a more correct distortion

the mesh editor is also new, and you can now apply a mesh distortion together with cornerpin distortion (not possible in previous versions).

expanded possibilities with controllers and the cuelist

an optional alternative to the traditional VPT presets using source presets and source playlists.

The download includes a 70-page manual, and there is also a getting-started video tutorial available, all from
https://hcgilje.wordpress.com/vpt

Please continue to use the VPT forum for VPT-related questions:
https://groups.google.com/forum/?hl=en#!forum/vpt-forum

romlab workshop

I just finished teaching a two-week workshop exploring space using VPT and Arduino.

I kept a project blog, romlab2012, which might have a few useful tips for the integration of VPT and arduino. It will be updated with a bit more documentation from the different projects in the near future.

 

VPT 6 video tutorials

VPT 6 available: A projection powerpack

With almost 5,000 downloads of VPT 5.1, I am happy to announce that the next version of Video Projection Tools, VPT 6, is now available.

VPT is a powerful and flexible video projection tool that does much more than just mapping.

What´s new in VPT 6.0
A completely new graphical interface making it even simpler to position, scale and distort the layers.
32 layers (previously 16)
mesh distortion with variable-sized control grid.
a completely rewritten cuelist making it even simpler to create transitions and to build complex sequences. You can now also send osc formatted commands directly from the cuelist.
crop-scaling of sources.
increased syphon support with two syphon (crop-scalable) inputs as well as syphon output support (mac only)
a DMX module has been added, with support for the Enttec USB DMX Pro interface, enabling both control of VPT from DMX light consoles and control of DMX devices from VPT.

VPT still supports Mac and Windows, and it is still free. However, I have added the option of contributing through PayPal.

VPT 6 is made possible through the support of IMAL in Brussels and Atelier Nord in Oslo.

IMAL offered me a one-month residency in April to give a VPT workshop and to do an exhibition.

Atelier Nord has supported me twice this year: first they hosted a workshop as part of Oslo Lux in January, which ended up in the VPT 5.1 release, and in June they gave me a week-long residency to write documentation for VPT 5.5, which instead turned into intense development of VPT 6.

So hosting workshops and inviting me to residencies is another way of supporting my work.

Read more about VPT 6 and try it out

VPT 5.5 preview

(update: VPT 5.5 turned into VPT 6 before it got released, and can be expected very soon)

I am busy finishing the upcoming release of VPT 5.5, and wanted to share a few of the new features.

One major addition is the possibility to apply mesh distortion to each layer, giving you even more flexibility over the projected output.

The control interface gives you control over each point in the mesh.

Syphon is becoming an important tool for sharing image data between applications on Mac OS X, supporting more and more applications (there is now a more or less working Processing implementation as well).

VPT already implemented Syphon-in capabilities with version 5, but version 5.5 adds more:

You can record the output from VPT using the Syphon Recorder App.

You can use different parts of the incoming Syphon image data in different layers: using your favourite Syphon-enabled application, you can tile your output so that VPT splits the single Syphon source into multiple sources.

For instance, if your Syphon app sends out a single frame tiled into four regions, VPT can either use the whole image or split it up into four individual layers.
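The splitting step itself is just cropping: the sending app packs four frames into one texture, and each quadrant is cut back out as its own source. A sketch of that operation, with NumPy arrays standing in for the actual GPU textures:

```python
import numpy as np

def split_quadrants(frame):
    """Crop one packed frame (height, width, channels) into its four
    quadrants: top-left, top-right, bottom-left, bottom-right."""
    h, w = frame.shape[0] // 2, frame.shape[1] // 2
    return [frame[:h, :w], frame[:h, w:], frame[h:, :w], frame[h:, w:]]
```

Each returned quadrant can then be treated as an independent source, which is exactly what lets one Syphon stream feed four layers.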

There are plenty of other improvements, which will be covered when it is released.

For VPT 5.5 I finally get the possibility to make proper video tutorials, as Atelier Nord has generously offered me a one-week residency to do this.

Expected release date: mid-June.

As always, free (with the possibility to donate).

By the way, about 3,500 people have downloaded VPT 5.1.

Results from openLAB exhibition at IMAL

In the beginning of April, IMAL hosted an openLAB workshop called “Projections on forms and spaces”.
The workshop was led by me and was mainly based on using VPT. We selected seven projects to be realised during one week of development at IMAL, and the documentation is now available from IMAL.

It was an intense week, and interesting to see VPT explored for new uses in installations and performances. A new development for me was introducing a simple video and audio trigger setup (available in the xtension section), which allowed for some audience interaction, as well as using the serial module to control lights inside sculptures.

VPT 5.1 available

After nearly 2,500 downloads of the first version of Video Projection Tools VPT 5, released in December 2010, I am happy to announce the new version 5.1, partly based on feedback from workshops and the VPT forum.

The release focuses on all the possibilities to expand and customize VPT to be the right tool for your particular project.

This includes a vastly improved controller section, making it possible to control almost every feature of VPT from another application (like Max/MSP, Processing or PD) or computer, from sensors through the Arduino microcontroller, or from a MIDI controller, as well as built-in LFO controllers for automating control, and software sliders and buttons. This comes on top of the already existing built-in preset and cuelist system for playback. Through OSC, VPT parameters can relatively easily be controlled by gestures, using camera or Kinect tracking.
You can easily set up VPT to control simple projector shutters by driving servo motors with the Arduino module.
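Under the hood, an OSC control message is just a NUL-padded address string, a type-tag string, and big-endian arguments. A minimal sketch of encoding one by hand; the address /fade is made up for illustration and does not correspond to VPT's actual address space (see the manual for that).

```python
import struct

def osc_message(address, value):
    """Encode a single-float OSC message: NUL-padded address string,
    NUL-padded ",f" type-tag string, then a big-endian float32."""
    def pad(raw):
        # OSC strings are NUL-terminated and padded to a multiple of 4 bytes
        return raw + b"\x00" * (4 - len(raw) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# e.g. osc_message("/fade", 0.5) -- the bytes would be sent as one
# UDP datagram to the receiver's OSC port
```

In practice you would use an OSC library in Max/MSP, Processing or PD rather than encoding by hand, but this is all that travels over the wire.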

Templates for exploring OSC with Max/MSP and Processing are provided with the download; PD will follow later. Arduino templates for input/output, as well as a servo example, are also included.

VPT can be used with multiple projectors using a Matrox TripleHead2Go, and the latest version also includes support for multichannel sound: the movie sources can be individually routed to up to eight channels, depending on your sound card. Using the provided Max/MSP template it is possible to process the sound further.

VPT 5.1 includes a new possibility of processing the sources using the FX bus. The FX bus consists of eight FX chains, modules which contain shader-based effects. New FX chain modules can easily be made using the provided Max/MSP/Jitter template, and can be dragged and dropped onto an existing FX chain to replace it, making it easy to customize the effects to your particular needs.

Three new sources have been added to VPT: a solid (if you just want a colour in a layer), a noise texture generator, and a text module which can use text from a file, from OSC or typed directly. A Max/MSP/Jitter template is also provided if you want to create your own source.

Read more about VPT 5.1 and download it here

Read the online manual

Visit the VPTforum

VPT templates and examples for maxmsp/jitter, PD, Processing and Arduino

VPT is a free multipurpose realtime projection software tool for Mac and Windows developed by HC Gilje. Among other things it can be used for projecting video on complex forms, adapting a projection to a particular space or surface, combining recorded and live footage, or multiscreen HD playback.

Light on White at Oslo Lux

Last week the symposium Oslo Lux took place at the School of Architecture and Design in Oslo, organized by Anthony Rowe and Ståle Stenslie.

It was a one-day event with speakers from art, architecture and design discussing uses of light in different projects. I spoke at the seminar, created a snow projection and gave a two-day VPT workshop, so it was a very intense but interesting week.

The keynote speakers were AntiVJ and UVA. The list of speakers also included Timo Arnall, who gave a very interesting talk on light painting, presenting the work of people I felt I should have known about already (Gjon Mili, Michael Wesely, Eric Staller). His second topic, visualizing the range of RFID and wifi networks, was also interesting. In fact, the mix of speakers and topics made the day go by really fast!

I went to a separate event at Atelier Nord earlier in the week where Joanie and Simon from AntiVJ gave a 2 hour enthusiastic talk about their work. They were very excited about real-time software, and it reminded me a bit of the same energy which I experienced ten years ago with realtime video software like Imagine, nato and jitter.

Light on White video:

I was part of the exhibition and decided to do an outdoor snow projection. Unlike my previous attempts at working with snow, I decided not to try to build anything with the snow, but instead to cut out a piece of snow and work with its curvy top surfaces and edges. It seems I am unable to make anything but slow meditative pieces nowadays, but the result wasn't that bad, really.

More images from the projection at flickr.

The two-day VPT workshop at Atelier Nord was fun, with a very focused and eager-to-learn group, and several of the participants have already started doing their own projects with the software, which is exciting.