romlab workshop

I just finished teaching a two-week workshop on exploring space using VPT and Arduino.

I kept a project blog, romlab2012, which might have a few useful tips for integrating VPT and Arduino. It will be updated with more documentation from the different projects in the near future.


Results from openLAB exhibition at IMAL

At the beginning of April, IMAL hosted an openLAB workshop called “Projections on forms and spaces”.
The workshop was led by me and was mainly based on using VPT. We selected 7 projects to be realised during one week of development at IMAL, and the documentation is now available from IMAL.

It was an intense week, and it was interesting to see VPT explored for new uses in installations and performances. A new development for me was to introduce a simple video and audio trigger setup (available in the xtension section), which allowed for some audience interaction, as well as the use of the serial module for controlling lights inside sculptures.

VPT 5.1 available

After nearly 2500 downloads of the first version of Video Projection Tools VPT 5, released in December 2010, I am happy to announce the new version 5.1, partly based on feedback from workshops and the VPT forum.

The release focuses on all the possibilities to expand and customize VPT to be the right tool for your particular project.

This includes a vastly improved controller section, making it possible to control almost every feature of VPT from another application (like maxmsp, Processing or PD), from another computer, from sensors through the Arduino microcontroller, or from a MIDI controller, as well as with built-in LFO controllers for automating the software sliders and buttons. This comes on top of the already existing built-in preset and cuelist system for playback. Through OSC, VPT parameters can relatively easily be controlled by gestures, using camera or Kinect tracking.
You can easily set up VPT to control simple projector shutters by driving servo motors through the Arduino module.

Templates for exploring OSC in maxmsp and Processing are provided with the download; PD will follow later. Arduino templates for input/output, as well as a servo example, are also included.

VPT can be used with multiple projectors using Matrox Triplehead2go, and the latest version also includes support for multichannel sound: The movie sources can be individually routed to up to eight channels, depending on your sound card. Using the provided maxmsp template it is possible to process the sound further.

VPT 5.1 includes a new possibility of processing the sources using the FX bus. The FX bus consists of eight FX chains, modules which contain shader-based effects. New FX chain modules can easily be made using the provided maxmsp/jitter template, and can be dragged and dropped onto an existing FX chain to replace it, making it easy to customize the effects to your particular needs.

Three new sources have been added to VPT: a solid (if you just want a colour in a layer), a noise texture generator, and a text module which can use text from a file, from OSC, or typed text. A maxmsp/jitter template is also provided if you want to create your own source.

Read more about VPT 5.1 and download it here

Read the online manual

Visit the VPTforum

VPT templates and examples for maxmsp/jitter, PD, Processing and Arduino

VPT is a free multipurpose realtime projection software tool for Mac and Windows developed by HC Gilje. Among other things it can be used for projecting video onto complex forms, adapting a projection to a particular space/surface, combining recorded and live footage, or for multiscreen HD playback.

makerbot #508

One of the things I was planning to work on during my New York residency was constructing a MakerBot, a 3D printer that comes as a kit, based on the RepRap, which I have written about before. The Cupcake CNC, which is its official name, is made by a Brooklyn-based company, MakerBot Industries, that came out of NYCResistor. The first model came out in late spring last year, and they continually improve the mechanical, electrical and software parts for each batch they send out. For about $1000 you get all you need to make your own desktop 3D printer, which sounded like a good way of getting started with digital fabrication.

For me, one of the entry points into fabbing has been reading some of Mitchell Whitelaw's postings, for instance Transduction, Transmateriality and Expanded Computing. Chris Anderson wrote a very optimistic article in Wired in January: “In the Next Industrial Revolution, Atoms Are the New Bits”, which is nicely balanced by the maybe more realistic Gizmodo article “Atoms Are Not Bits; Wired Is Not A Business Magazine”. A few days ago Thingiverse posted their take on this: “Atoms are Not the New Bits. Yet.”

I ordered my kit in December, and at the beginning of February it arrived at my doorstep in Brooklyn. The construction took about a week. The instructions for assembling the MakerBot are pretty good (although I had my IKEA moments), but actually getting it to print was a lot more complicated and frustrating than I had expected. After trying different things for almost a month, with plastic spaghetti as my only visible result, I got a replacement board for the extruder controller (the part which controls the flow of plastic), so last night I was finally able to print something, which was very exciting!

The Cupcake CNC is made from laser-cut plywood and acrylic. It has three stepper motors: the x and y axes move the build platform, and the z axis moves the extruder vertically in relation to the build platform. The printed object is built layer by layer: plastic filament is fed into the extruder, which heats it up and squeezes it out, similar to a hot-glue gun. The controller is based on the Seeeduino microcontroller, an Arduino clone.

To make an object you obviously need a file. The final file is basically a description of operations for the MakerBot, controlling the position on the x, y and z axes, as well as the temperature and flow of the extruder. You start off with a 3D object, made for instance with Blender, SketchUp or OpenSCAD. This must be exported/converted to the STL format, which can be read by the application that slices the 3D model and produces the control code understood by the MakerBot. This is done in the Python-based Skeinforge, but some more intuitive tools have started to emerge. The MakerBot comes with its own controller software, ReplicatorG, which is very similar to the Arduino interface. You can choose to either save the object code onto an SD card, or print the object directly from ReplicatorG.

A good place to start is Thingiverse, where people upload their 3D objects for other people to make. I find it fascinating that I can make my own version of an object somebody else has modeled. I also find the MakerBot blog quite useful. Bre Pettis, one of the MakerBot team, gave a nice presentation (mp4) at the Chaos Computer Conference in Berlin in January.

More images of #508

iphone serial communication

Apple has not made it easy to let the iphone communicate with external devices. Basically, you need a jailbroken phone to do anything. This post covers the current state of serial communication on the iphone.

A successful example of an application that uses the serial port on the iPhone is xGPS, which lets you connect to an external GPS.

There are two sites that kept showing up when I looked at solutions for accessing the serial port.
One solution, described on ohscope, uses the mobile terminal app available from Cydia, in combination with a serial communication program called minicom. Basically it lets you open a serial connection and send and receive data using the terminal window.
For simple monitoring of input this might be all you will ever need, and it is relatively simple to get working.

The other solution, on devdot, dating back to 2007, is a serial communication application written in C, which is still the example that shows up in different forums. There have been a few changes since 2007, when the iPhone was on OS 1.x.

My main interest in getting serial communication working is that it would be very helpful for some of my mobile installations, like the Wind-up birds, a network of mechanical woodpeckers, where a lot of the work is setting up a network between several XBee radio modems. It also involves climbing trees, and it would be very convenient to have a small controller/monitor app on my iPhone instead of using my laptop.
I also think it would work well with my dimsun dimmers, being able to control the lights remotely.

Other options
When searching for solutions, some projects show up which claim that “the iphone controls my RC car/my garage door” etc, but actually the iPhone is communicating with a PC which in turn communicates with an Arduino. This is ok for prototyping, but I want to get rid of the PC part. (Actually, there was a project in 2008 which used an iPhone and XBee to control an RC car. The instructions are in Japanese, and it was made using Python, I believe.)

There is some promising work being done on making the built-in bluetooth work as a bluetooth modem (something that Apple has blocked). Currently the BTstack is only able to read serial data. There is a demo app showing how a Wii controller controls a model of the Wii on the iPhone. There is also the HumanAPI project, which has made an interface for a pulse-rate sensor, based on the BTstack and a programming environment called PhoneGap.

Maybe sometime soon it will be possible to access the serialport on the iphone without jailbreaking it. In the openframeworks forum a new serial adapter for the iphone was mentioned.

The physical connection
To get access to the serialport you need some sort of adapter from the iphone connector. I got mine here, and sparkfun stocks a few different models. You can find an overview of the pinout at

The physical connection is the easiest part. You only need three connections from the iPhone: RX, TX and GND. Optionally you could also use the +3.3 V connector, making it possible to power your device from the iPhone. I do this with my XBee connector, but you need to be careful, as you might break the iPhone if your device draws too much current. I haven't found a reliable source for how much current you can draw, but have seen 80-100 mA mentioned in a few places. This works fine with the standard XBee, as it draws around 40 mA. It would also work fine with a 3.3 V based Arduino, and probably also with the FIO (I'm not sure how much current the combination of the XBee and the ATmega chip draws).

Using the XBee, which runs on 3.3 V, I connect tx_iphone to rx_xbee, rx_iphone to tx_xbee, and gnd_iphone to gnd_xbee.

If you connect the iPhone directly to an Arduino running on 5 V, you need a 1 kΩ resistor between tx_arduino and rx_iphone.

approach 1: Serial communication using mobile terminal and minicom

Before getting started I recommend installing a very useful addition to the iphone software called sbsettings, available on Cydia, which basically is like a control panel for “hidden” features on the iphone. You can also install addons like syslog, which is very useful when debugging. The mobile terminal app is also very useful.

The instructions from ohscope are pretty clear.
You need to know how to SSH into the iPhone from your computer. I use a combination of Fugu and Terminal on OS X; PuTTY should do the job on Windows. There is a good explanation of how to do this on a Mac (and many other useful things) at hackthatphone.

It is much easier to configure minicom from a terminal on a computer than from the iPhone terminal program. So I open Terminal on my Mac and enter ssh -l root <ip of my iphone>. The default password is alpine.

After I had updated my iPhone to 3.1.2 I got an error message when trying to open minicom: “minicom cannot create lockfile. Sorry”. This turned out to be a permission problem, so I navigated to /var and changed the permissions so anybody could write to lock: chmod a+w lock.

Some useful links related to mobile terminal and minicom:

mobile terminal wiki

I found some of the gestures not to be working, so I ended up mainly using the keyboard for navigation.
basic minicom commands

approach 2: make an app that can read and write to the serial port

Preparing for development on the iphone

Last Saturday NYCresistor hosted an introductory openframeworks workshop with Zach Lieberman, one of the creators.
Basically, openframeworks makes it much easier to do interesting audiovisual programming than writing all the low-level code in C++ yourself, by wrapping that code in easier-to-use methods and functions.
I was new to Xcode, openframeworks and C alike, so there were a lot of new things to dive into, but it definitely helps having a background in programming with Processing.

The neat thing about openframeworks is that it is cross-platform: it runs on OS X, Linux, Windows and the iPhone. This was a perfect opportunity for me to move to step two: making my own serial communication program.

The drawback with writing code for the iPhone is that you are supposed to pay $99 to be a developer. Since I am mainly interested in developing programs for using the serial port on my own phone, I wasn't keen on paying for it. Fortunately there are ways around this, but it is rather complicated.
One way is to use the free toolchain.

I felt more comfortable using Xcode combined with openframeworks, so I searched for ways of making Xcode compile, install and debug to the iphone.
First of all, you need to register as a developer at Apple and download the iPhone SDK.
Then you have to do some modifications to get around the signing process. There are several ways of doing this, some of them extremely complicated. I think I found a good solution that is not too hard, and which works on iPhone OS 3.1.2 and Xcode 3.2.1.
The basic instructions are found at the networkpx project blog. First you need to create a code signing certificate and update your Xcode target accordingly. If you only want to compile in Xcode, it is very simple: you only need to change a few lines in a file (I found three places that should be changed; the instructions only mention two).
To be able to install and debug it gets more complicated.
I used a combination of Appsync 3.1, available from a repository on Cydia, and some of the instructions from the networkpx project blog. Basically, Appsync makes steps 5-10 in the install-debug section unnecessary.
It is important to set the -gta flag in the options for your xcode project (found in the code signing section, other code signing flags).

Now I am able to use the iPhone with Xcode like a normal developer: I can run and debug in the iPhone simulator, or install, run and debug on the iPhone, which is very convenient when developing. Sometimes debugging from the iPhone doesn't work; then I activate syslog, which creates a log on the iPhone that I can read on my computer later.

Setting up the serial port

The port name is the first thing to know: /dev/tty.iap

The serial code itself is quite standard, so the example from devdot would probably still work, basically doing
fd = open("/dev/tty.iap", O_RDWR | O_NOCTTY | O_NDELAY);
to open the port.

Using openframeworks, I modified the ofSerial example found in the Mac distribution (this example is not in the iPhone distribution).

The only thing I need to open the serial port using openframeworks is to specify the port name and baud rate: serial.setup("/dev/tty.iap", 9600);

In v0.60 of openframeworks for iPhone, the iPhone is not included in the serial library, so you need to manually add "|| defined( TARGET_OF_IPHONE )" to each #if section in ofSerial.cpp which refers to OS X and Linux, so it looks like this:
#if defined( TARGET_OSX ) || defined( TARGET_LINUX ) || defined( TARGET_OF_IPHONE )
This will most likely be included in newer versions of openframeworks.

So, by modifying an existing serial code example I was able to compile and run the program, but got an error saying that it was unable to open the serial port.
After a hint on the GPSserial forum, I understood that my problem was related to the permissions of the application: it needs to run as root.
I spent about two days learning about permissions and trying different permission settings, without any luck.
Why was I able to open the serial port via minicom and terminal, but not from my application?
After some more research I discovered that in the early iPhone OS days, applications were installed as root; this was later changed, as it could be a security risk. Maybe this is why the example from 2007 worked back then, because it was automatically installed as root?
App Store apps are installed in a different location (/private/var/mobile/Applications) than Cydia apps like Terminal (/private/var/stash/Applications), and these places have different permissions. Xcode automatically installs programs in the App Store app section, inside a folder with a long number as its name, such as 03A7764B-DDE0-4A60-A56B-CF2ADBB0213F. So what would happen if I moved my app to the other applications folder?
I was able to connect to the serialport!

There might be an easier way to move the application, but this is how I did it: the easiest way to find the folder installed by Xcode is to find the last created folder in /var/mobile/Applications. Open this folder and copy the app to the desktop. Then delete the folder using ssh from a terminal program; if you are in the Applications directory the command would be: rm -r 03A7764B-DDE0-4A60-A56B-CF2ADBB0213F (replace with your own folder name). Copy the app into /var/stash/Applications, respring the springboard (using sbsettings) on your iPhone and you should be set. If you need to install a new version of the app from Xcode, you need to delete the app first and respring the springboard, or else you will receive an error when trying to install it.
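The steps above can be sketched as shell commands. This is a sandboxed simulation using local directories (and a made-up app name) rather than the real /private/var paths on the phone, but the copy/delete sequence is the same:

```shell
# Stand-ins for the two locations on the phone:
#   mobile/Applications  ~  /private/var/mobile/Applications (Xcode installs here)
#   stash/Applications   ~  /private/var/stash/Applications  (Cydia apps live here)
mkdir -p mobile/Applications/03A7764B-DDE0-4A60-A56B-CF2ADBB0213F
touch mobile/Applications/03A7764B-DDE0-4A60-A56B-CF2ADBB0213F/MyApp
mkdir -p stash/Applications

# Copy the app folder to the Cydia-style location, then remove the
# original so the springboard doesn't see two copies.
cp -r mobile/Applications/03A7764B-DDE0-4A60-A56B-CF2ADBB0213F stash/Applications/
rm -r mobile/Applications/03A7764B-DDE0-4A60-A56B-CF2ADBB0213F
```

On the phone you would finish by respringing the springboard via sbsettings.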

Two demos
To show that it actually works I have made two simple demo videos showing serial input and output.

The serial output example was my first attempt at integrating Interface Builder with openframeworks. There is an example of how to combine openframeworks and Interface Builder from ITP.

The ITP example doesn't include any interaction between the GUI elements and the code. I found the tutorial from switchonthecode quite helpful for this part.
I struggled with passing data from the interface part of the code to the serial communication part, but solved this the dirty way by declaring some variables as extern, as described here. The supposedly more correct way is described here.

I was not able to find a good way of updating the interface elements based on incoming serial data, so I decided to do that in openframeworks, by drawing a simple line that moves vertically based on data read from a photocell attached to the Arduino.

If I could have written this in Processing it would have taken me a few hours. Instead it took two days, mainly because the programming language and IDE were new to me, but also because openframeworks/Objective-C are a lot stricter about how you write the code.
It would be great if iProcessing, a JavaScript version of Processing that runs on the iPhone, worked with the serial port. But since it is a web application, I assume I wouldn't be able to run it as root.

Ideally, it would be nice to have a way of customizing the interface according to the needs of the application. mrmr (no longer being developed, I think), one of the first OSC applications for the iPhone, has a nice way of letting the user create an interface template and then upload it to the iPhone. It could be interesting to integrate this with serial communication, especially when working with a microcontroller like the Arduino. You could then have different templates depending on the functionality of the current setup, without having to create a new program each time.

It could also be interesting to incorporate firmata, which is already supported in the desktop versions of openframeworks. This has now been done by another openframeworks user, check the openframeworks forum thread.

Here is my openframeworks code.

Esemplasticism: The Truth is a Compromise

blink v2, originally uploaded by hc gilje.

Blink is part of an exhibition curated by Hicham Khalidi, produced by the Den Haag art space TAG and made for Club Transmediale, which opens in Berlin today.

From the exhibition description:
Our brains are esemplastic. They are perfectly evolved for pattern recognition, designed to shape disconnected elements, like the incomplete or ambiguous information we get from our senses, into the seamless whole of our experience. What we see, hear, touch and feel is folded into an amalgam of data, emotions and cultural baggage. And in the contemporary world, this esemplastic power is pushed to the limit in the sea of information that we are floating in: data-visualizations, scientific studies and computer analyses become increasingly abstract and disconnected from our normal experiences. Are we losing our sense of meaning as we fail to join the billions of dots? What compromises are we making when we try to settle on a particular interpretation?

The works in Esemplasticism – the truth is a compromise are mostly low-tech, using everyday objects and media. Employing sound, objects and synchronicity; relatively ‘old’ technologies like field recordings, music, video, and projection, each piece lifts the curtain on the perceptual tactics that our esemplastic/apophenic/pattern recognising brains employ to negotiate the world; with wit and irony, they have much to say about verisimilitude as each exposes a different fracture between our expectations, our perceptions and our compromises about the objective ‘truth’ that exists ‘out there’.

Participating artists
Artists: Edwin Deen, Daniël Dennis de Wit, Lucinda Dayhew, Anke Eckardt, HC Gilje, Terrence Haggerty, Yolande Harris, Alexis O’hara, Pascal Petzinger, Mike Rijnierse, Willem Marijs, Bram Vreven, Katarina Zdjelar, Valentin Heun, Sagarika Sundaram, Gijs Burgmeijer.

I will post links to the catalogue when that becomes available.

The exhibition will be on until the end of February.

For me this was an opportunity to improve the installation both esthetically and technically. I constructed a platform for the equipment using a laser cutter, which turned out quite nice. This greatly simplified the installation of the work. As mentioned in previous posts, the installation uses my dimsun lighting system, and the design for this will be made available shortly.

Due to other obligations I needed to set up my installation before everyone else. It was a strange experience to work alone in the 900m2 empty building in Spandauer Strasse, close to Alexanderplatz. My only companion was the stepladder which also became the model for my documentation.


For the last half year I have been working quite a lot with lights and shadows, and this summer I decided to build my own lighting system, which I called dimsun. The first use of it was for my blink installation, consisting of 24 LEDs placed in a circle.

It consists of dimmers based on the TLC5940 LED controller. Each dimmer can control 16 channels of up to 5 W each, and is intended for power LEDs (very bright LEDs of up to 5 W). The dimmers are controlled by an Arduino microcontroller, and up to 6 dimmers (96 channels) can be controlled from one Arduino.

more images of the dimsun system.

The schematics will be made available soon.

The lamps are based on star-shaped LEDs combined with a lens, mounted on an aluminum profile.

Controlling an xbee network using an arduino

As promised some days ago, here is the follow-up to the minimal Arduino post. I share the Arduino code used to control a ZNet 2.5/XBee Series 2 network, as well as the schematics for the controller itself.

The wind-up birds continued

la forêt de Nouzhat Ibn Sima, originally uploaded by hc gilje.

The wind-up birds didn't settle in the forest of Lillehammer.
Some of them went to the airport in Oslo, some of them to a park in Rabat, Morocco.
Two very different contexts in many ways:

Oslo Airport Gardermoen is celebrating its 10-year anniversary, and I was invited, as one of two projects from the UT21 exhibition, to be part of it.
The work was to be placed outside, in a passage between the parking area and the terminal building, a very busy pathway.

It was freezing, windy and wet the weekend it was installed. Of course, everything at an airport involves heavy security, so I had a special permission card, which I wore to avoid frightening passengers (a bearded man climbing trees with electronics, wires and batteries). Actually, I got a lot of strange questions from people wondering what these devices were for or against: was it to chase off the woodpeckers? When I explained that they were mechanical woodpeckers, I got a lot of blank stares.
More images from the airport.

A week later I was off to Morocco: 25 degrees and sunny in Rabat, the capital. There is a small art space there called l'appartement 22, run by Abdellah Karroum, which had been invited to present work at the first Brussels Biennale. Abdellah decided to invite Anne Szefer Karlsen from HKS in Norway to curate some projects in Morocco, so she in turn invited Pedro Gomez Enza and myself to do projects within the frame of the Brussels Biennale, but in Morocco. It gets weirder.
Unlike the very organised airport project, things were a bit looser here. First we needed to find a location, and after some scouting I fell in love with a beautiful and strange park on the outskirts of the center, la forêt de Nouzhat Ibn Sima (also known as le parc sportif), with lots of eucalyptus trees, cute fuzzy pine trees, mint tea houses and people exercising in the strangest ways.
Public art in Morocco isn't common, and there had been quite a lot of discussion before my arrival about what to do regarding permissions etc.
We ended up doing it without permissions, and therefore without a ladder, so as not to draw attention to ourselves, and it turned into a strange undercover operation: trying to set up woodpeckers in trees while pretending to do other things. We even drove around in a car while I was programming in the back seat.

I built a special version of the wind-up birds for this actionist installation. Basically I replaced the radio modems with a parasite brain (a timer and a light sensor), which made the birds more independent and maybe slightly more intelligent.
This actually corresponds quite well with natural woodpeckers: some enjoy the company of their fellow creatures, while others insist on being alone.

more images from the wind-up birds in morocco, and some other images from morocco as well.

There are several types of natural woodpeckers in Morocco, but unfortunately I didn't get a chance to see one.

wind-up bird(s)

Introducing a new species, the wind-up birds.
The wind-up birds are a flock of mechanical woodpeckers, which have found their first home in a forest in Lillehammer, Norway, as part of the UT-21 project.

How will nature treat them: with hostility or acceptance? How will the wind-up birds adapt to hot/cold and wet/dry conditions? Will small insects creep inside the circuitry, creating possible short circuits? Will beetles eat the wood, squirrels use the wood slit as nut storage (or the roof as a slide), birds use it as a shelter? Will they be treated as foreign objects or accepted into the local ecosystem?
How do real woodpeckers react? Are they threatened, attracted, or not bothered? Will they use the roof as a pecking drum?
Initial tests indicate an attraction: it took 15 minutes for a real woodpecker to join a wind-up bird on the same tree.

Adding a layer to the perceived reality:
The sound of the wind-up birds easily fools humans. The initial reaction is surprise, and then bewilderment, as there seems to be a whole flock of birds communicating. Then comes the curiosity of trying to track them down, to localize the sound, becoming more aware of the surroundings, sharpening the senses.
This was the initial motivation for me: the movement of sound in a space, and the effort involved in trying to localize the source of the sounds, which leads to a stimulation of our perceptive apparatus.
By introducing an element or layer which somehow relates to the environment but is still a bit off (it is very unlikely to hear a whole flock of woodpeckers drumming at the same time, and drumming is usually restricted to the mating season in the spring), you perceive reality differently. This could be called an animalistic alertness, one of the three listening modes described by Barthes (Listening).
This project is related to my soundpockets project, and as with that work, I feel it is somehow more interesting when people happen upon it by chance, instead of looking for a piece of art in the forest.

The development of the wind-up birds has gone through a lot of phases.
It was important to me that the sound produced was not playback of a recording but mechanically produced, so I looked at many different ways of creating resonance boxes and ended up with a construction resembling a wood block: a piece of wood with a slit. For the mechanical part I ended up using a simple push-magnet solenoid.
The first prototype was an Arduino board, the solenoid and the woodblock, used to find the right pecking frequency for the solenoid and to test different woodblock designs.
I decided to add a roof, to protect the wood and circuitry from heavy rain.

Since the wind-up birds communicate, they needed to be in a wireless network. I decided early on to use the XBee radios, which are programmable, low-energy, high-speed radio modems that can work in a mesh network.
A lot of effort was put into creating and deciphering XBee data packets to be used in the Arduino/Processing environment.
Energy consumption was an important factor in the project, since the wind-up birds would be in a forest with no access to electricity and should be active for a month. One strategy was to use low-power components. It's amazing what a difference the choice between two voltage regulators can make when they have to be on for a month (the difference in consumption was the size of the battery I ended up using for the whole project).
I also decided to use a low-power version of the Arduino, basically just the microcontroller chip running at half speed (which meant using an AVR programmer to program the chips).
The other important factor in reducing energy consumption was to make use of the XBee's and the Arduino's capability to go to sleep when inactive. I decided the wind-up birds would peck about every 5 minutes and sleep in between; at night they would also sleep.
After having decided upon the components to be used, I designed a prototype circuit, which was later made into a proper circuit board, making it easier to mass-produce the birds.
It took a lot of trial and error to get the wind-up birds alive and pecking in the lab, but I had a pretty reliable setup by the time I placed them in the forest. The challenge in the forest was to find interesting locations within the range of the network, and to find interesting pecking patterns. I ended up making a system where the pecking pattern is different every time, so it wouldn't become a simple playback of movement, but a dynamic system.

More images of the wind-up birds

Thanks to Tom Igoe, Jeff Mann, Kristian Skjold and Roar Sletteland for helping me realize this project.

Here is a link to the first technical post related to the project, which covers how to program and hook up an ATmega168 as a minimal standalone Arduino, using the internal oscillator running at 8 MHz and 3.3 volts.

