The intimacy of strangers


I just finished my new film of microscope lichen landscapes, found on a rock in a fence at Trondenes in Harstad, Norway. The film was commissioned by The Arctic Moving Image & Film Festival, with its premiere on October 13th, 2022.
The film was created using a custom-made computer-controlled mechanical stage and a digital microscope. Almost 50,000 microscope images were stacked and stitched together into miniature lichen landscapes, with a virtual camera flying over these landscapes.
For the soundtrack I used free-impro drumming (performed by Justin Bennett), inspired by the free-impro symbiosis of the lichen, that transforms into a sonic landscape of sci-fi space exploration and encounters with mysterious creatures.

My hope when making this film was to create a bit of awareness about what lichens are.
Apart from their extreme variations in appearance, textures and color, lichens have become the poster organisms for a new biology which challenges the idea of the individual and supplements the theory of evolution.
The title The intimacy of strangers is taken from a chapter about lichen in Merlin Sheldrake's excellent book on fungi, Entangled Life (2020). It refers to the fact that a lichen is not one organism, but a symbiosis of several organisms, and these organisms are not related at all; they are from different kingdoms: mainly a fungus that partners up with a photosynthesising organism (either an alga or a bacterium). So instead of acquiring traits over long stretches of time through evolution (so-called vertical gene transfer), the lichen combines traits from fungus and algae/bacteria through horizontal gene transfer: once they have partnered up they have acquired these new traits, no need to wait millions of years.
This partnership also means it doesn't make sense to talk about an individual, but rather about an ecosystem of players with different roles.

Me and microscopes


Digital microscopes have been a part of my practice since 2007 when my live collaboration mikro with Justin Bennett was first performed at Paradiso in Amsterdam.
Like telescopes, microscopes extend our perceptive range, letting us see details of objects and organisms which, because of scale, remain hidden to us in everyday life. For me it is always a joy to explore the microscopic universes of textures, materiality and colors.

In 2016 I was commissioned to make a film for Vertical Cinema, a project initiated by Sonic Acts where cinemascope 35 mm film is shown in a vertical format. I decided to make a microscope film based on plastic wrappings from consumables.
I wanted to work with vertical motion for the vertical screen, so I made a computer-controllable mechanical stage for the microscope to be able to create animation along one axis: for every image captured, the stage would move by a very small increment, making it possible to assemble the still images into an animation. This resulted in rift, which premiered at the Sonic Acts festival in 2017 with a soundtrack by Bennett.


While working on this project I imagined creating a mechanical stage that could move in the x (sideways), y (back and forward) and z (up and down) directions, and also rotate around the axis of the microscope, to be able to create curves that followed the terrain of the microscopic landscape.

In the late summer of 2020 I spent a lot of time alone working underground in a former water reservoir to make the two large-scale installations Shadowgrounds for Factory Light Festival. One of my studio neighbours suggested that I should read Underland by Robert Macfarlane, which I did, or rather listened to while driving to or working in the water reservoir. One of the chapters in that book is about Merlin Sheldrake, which made me listen to his book about the mind-blowing world of fungi as well.
I don't quite remember how, in the same setting, I ended up listening to a book by Jon Larsen, of Hot Club de Norvège fame, about his obsessive search for micrometeorites. To document the micrometeorites (he eventually found many), he collaborated with Jon Kihle, who had a powerful microscope camera. This is when I first heard about the concept of focus stacking:
Microscopes have a very shallow depth of field, which means that if you are looking at something that isn't flat, most of what you are looking at is out of focus. Focus stacking works around this by combining images taken at different focus distances, thus bringing out the three-dimensionality of whatever is under the microscope.
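Just to illustrate the principle (the dedicated stacking applications I mention later do this far more cleverly), a naive focus stack can be sketched in a few lines of C++ with OpenCV: measure the local sharpness of every image with a Laplacian filter, and for each pixel keep the colour from the image that is sharpest at that point. The file names and filter sizes below are placeholders, not from my actual setup.

// Naive focus stacking sketch (C++ / OpenCV), assuming all images are the
// same size. For each pixel, pick the input image with the highest local
// sharpness, estimated as the blurred absolute Laplacian of the grayscale image.
#include <opencv2/opencv.hpp>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> files = {"z00.jpg", "z01.jpg", "z02.jpg"}; // one tile, different focus depths

    std::vector<cv::Mat> images, sharpness;
    for (const auto& f : files) {
        cv::Mat img = cv::imread(f, cv::IMREAD_COLOR);
        if (img.empty()) return 1;

        cv::Mat gray, lap;
        cv::cvtColor(img, gray, cv::COLOR_BGR2GRAY);
        cv::Laplacian(gray, lap, CV_32F, 3);           // high response = in focus
        lap = cv::abs(lap);
        cv::GaussianBlur(lap, lap, cv::Size(9, 9), 0); // smooth the sharpness map

        images.push_back(img);
        sharpness.push_back(lap);
    }

    // Assemble the output by choosing, per pixel, the sharpest source image.
    cv::Mat result = images[0].clone();
    for (int y = 0; y < result.rows; ++y) {
        for (int x = 0; x < result.cols; ++x) {
            int best = 0;
            for (size_t i = 1; i < images.size(); ++i)
                if (sharpness[i].at<float>(y, x) > sharpness[best].at<float>(y, x))
                    best = static_cast<int>(i);
            result.at<cv::Vec3b>(y, x) = images[best].at<cv::Vec3b>(y, x);
        }
    }
    cv::imwrite("stacked.jpg", result);
    return 0;
}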

After having visited Harstad for location scouting this spring I decided that this was an opportunity to realise my next microscope project.
I had found the perfect location around Trondenes Kirke in Harstad, a beautiful area with a lot of history, both from Viking times and more recently the Second World War (with a Soviet labour camp and later a camp for the people of Northern Norway who were forced to leave their homes at the end of the war).
There are rock fences built around the church and the graveyard, covered with a carpet of lichen, the orange colours being the most noticeable. I thought it could be interesting to do a very site-specific project, to just focus on life on this one rock in the fence (as a parallel to life on the bigger rock, Earth).


I have always been interested in how things and organisms have completely different time cycles and durations, and seeing the lichen growing on the rocks of the fences around the church made me think of everything that has passed by them during their existence (the oldest lichen that has been dated is 9000 years old and lives in Sweden). Although lichens and humans inhabit the same planet, we live in parallel worlds with different time cycles.
Sometimes these worlds interact with each other: lichens mine minerals from the rocks, releasing minerals trapped in the rock into the greater ecosystem (where they might end up in your body at some point). Lichens are often the first settlers on new territory: they make the first soil on new rock formations (islands, lava, mountains), which becomes a starting point for other life. Lichens notice the presence of the human world mainly through pollution, which has caused several species to go extinct.

Many of my projects start with first developing hardware and software that I then use to make a piece. This involves a lot of research and getting into things I know little about.
I often quote Ursula K Le Guin: “I don’t know how to build and power a refrigerator, or program a computer, but I don’t know how to make a fishhook or a pair of shoes, either. I could learn. We all can learn. That’s the neat thing about technologies. They’re what we can learn to do. ”


Making, learning and sharing tools is an important part of my practice, so the next section is an attempt at sharing some of the process (nerd alert) in making this film.
The challenge on the hardware side of this project was to create a moving platform that could repeat the same motion over and over again in very small steps.
I am not so experienced with mechanical motion, so there was a lot of trial and error before I was able to get the motion of the mechanical stage stable enough. I am lucky to have access to the maker spaces bitraf and Fellesverkstedet in Oslo, where I can make prototypes and finished work with laser cutters and CNC routers. In fact the mechanics and motion control of laser cutters and CNC machines are very similar to what I needed for the microscope stage: horizontal motion (x and y) and vertical motion (to move the microscope closer to or further away from the sample). On top of that I wanted to add mechanical rotation, thus having four axes of motion.


Luckily in the midst of my mechanical struggles I discovered the technique of stitching individual microscope images together to get a bigger mosaic image, something often used in medical and biological microscopy. This is to compensate for another inherent issue with microscope photography: The larger the magnification, the less you see of the sample (small field of view).
This changed my working process dramatically: instead of making individual paths for the mechanical stage to follow, I could do an automated grid scan of a sample and then make as many paths as I would like in software later, making the whole process much more flexible.
So for each position in the grid there would be a number of images taken at different focus depths. As an example, a grid of 5×5 tiles would cover an area of about 8×8 mm, and with maybe 50 images per tile this becomes 1250 images for one sample of lichen.


So how did this setup work?
The microscope camera is connected to a computer running microscope imaging software (ToupView in my case).
The mechanical stage consists of three motors controlled via a standard arduino-based three-axis CNC controller of the kind used in DIY 3D printers and CNC machines. The software to control this was Universal Gcode Sender, which basically sends information to the controller about the positions of the XYZ axes, among other things.
So for each XY position in a predefined grid (positions calculated in rhino/grasshopper, but that's another story) the Z axis will move closer to or further away from the lichen sample to get different focus for the microscope. The actual low and high points are different for each sample, so I do a manual check to get an idea of what works best for that particular lichen formation. For each Z position a trigger is sent from the controller via another microcontroller to trigger image capture on the computer. This is then repeated for every XY position in the grid setup.
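To give an idea of what such a scan amounts to in practice, here is a small C++ sketch that prints a G-code move list of the kind Universal Gcode Sender can stream to the controller: visit each XY tile, step through a range of Z heights, and dwell briefly at each height (which is where the capture trigger fires in my setup). The grid spacing, Z range and dwell time are made-up numbers, not the values I actually used.

// Sketch: generate a G-code scan pattern for an XY grid with Z focus steps.
// All numbers (spacing, Z range, dwell) are placeholders, not the real project values.
#include <cstdio>

int main() {
    const int   cols = 5, rows = 5;      // 5x5 grid of tiles
    const float dx = 1.6f, dy = 1.6f;    // mm between tiles (placeholder)
    const float zLow = 0.0f, zHigh = 2.5f;
    const int   zSteps = 50;             // images per tile

    std::printf("G21\nG90\n");           // millimetres, absolute positioning
    for (int r = 0; r < rows; ++r) {
        for (int c = 0; c < cols; ++c) {
            std::printf("G0 X%.3f Y%.3f\n", c * dx, r * dy);
            for (int s = 0; s < zSteps; ++s) {
                float z = zLow + (zHigh - zLow) * s / (zSteps - 1);
                std::printf("G0 Z%.3f\n", z);
                std::printf("G4 P0.5\n"); // dwell; the camera trigger fires here
            }
        }
    }
    return 0;
}

With a 5×5 grid and 50 Z steps per tile this produces the 1250 exposures mentioned above.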

After the individual images of each tile have been captured the process of stacking and stitching starts.
There are several options for focus stacking software; the two most popular are Zerene Stacker and Helicon Focus.
For stitching there are some free alternatives like ImageJ/Fiji with various plugins, but I ended up using a commercial application, PTGui, which is very intuitive to use and gives great results.
After all the images have been stacked, and each tile has been stitched together with the other tiles, I have a lichen landscape I can explore.
For this I create camera paths in the free 3D software Blender, from which I make animated sequences of the camera moving across the landscape. I started getting into Blender and camera paths when working with 3D scans in the performance Nye Krigere, and later in the series of point cloud works Vardø Kystopprøret, so I could use that experience to create camera (and light) paths to get interesting camera movements.

Finally, for the soundtrack I ended up using Reaper, which is a very reasonably priced DAW (digital audio workstation), and like Blender it has a big community of enthusiastic users sharing knowledge, making it easy to find what you are looking for.

The stars of the film

romlab workshop

I just finished teaching a two-week workshop exploring space using VPT and arduino.

I kept a project blog, romlab2012, which might have a few useful tips for the integration of VPT and arduino. It will be updated with a bit more documentation from the different projects in the near future.

 

Results from openLAB exhibition at IMAL

In the beginning of April IMAL hosted an openLAB workshop called “Projections on forms and spaces”.
The workshop was led by me and was mainly based on using VPT. We selected 7 projects to be realised during one week of development at IMAL, and the documentation is now available from IMAL.

It was an intense week, and interesting to see VPT explored for new uses in installations and performances. A new development for me was to introduce a simple video and audio trigger setup (available in the xtension section), which allowed for some audience interaction, as well as the use of the serial module for controlling lights inside sculptures.

VPT 5.1 available

After nearly 2500 downloads of the first version of Video Projection Tools VPT 5, released in December 2010, I am happy to announce the new version 5.1, partly based on feedback from workshops and the VPT forum.

The release focuses on all the possibilities to expand and customize VPT to be the right tool for your particular project.

This includes a vastly improved controller section, making it possible to control almost every feature of VPT from another application (like maxmsp, Processing or PD), from another computer, from sensors through the arduino microcontroller or from a midicontroller, as well as from built-in LFO controllers for automating control, and from software sliders and buttons. This comes on top of the already existing built-in preset and cuelist system for playback. Through OSC, VPT parameters can relatively easily be controlled by gestures, using camera or kinect tracking.
You can easily set up VPT to control simple projector shutters by controlling servomotors using the arduino module.
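A minimal arduino sketch for such a shutter could look like the one below. It assumes that VPT's arduino module (or any other program) sends a single byte over serial with the wanted shutter angle; the pin number, baud rate and angles are placeholders, not a prescribed setup.

// Minimal servo shutter sketch: move a servo to the angle (0-180)
// received as a single byte over serial. Pin and baud rate are placeholders.
#include <Servo.h>

Servo shutter;

void setup() {
  Serial.begin(9600);
  shutter.attach(9);   // servo signal on pin 9
  shutter.write(0);    // start with the shutter open
}

void loop() {
  if (Serial.available() > 0) {
    int angle = Serial.read();        // one byte, 0-180 expected
    angle = constrain(angle, 0, 180);
    shutter.write(angle);             // 0 = open, 180 = closed (depends on mounting)
  }
}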

Templates for exploring OSC are provided with the download for maxmsp and Processing; PD will follow later. Arduino templates for input/output, as well as a servo example, are also included with the download.

VPT can be used with multiple projectors using Matrox Triplehead2go, and the latest version also includes support for multichannel sound: The movie sources can be individually routed to up to eight channels, depending on your sound card. Using the provided maxmsp template it is possible to process the sound further.

VPT 5.1 includes a new possibility of processing the sources using the FX bus. The FX bus consists of eight FX chains, modules which contain shader-based effects. New FX chain modules can easily be made using the provided maxmsp/jitter template, and can be dragged and dropped onto an existing FX chain to replace it, making it easy to customize the effects to your particular needs.

Three new sources have been added to VPT: a solid (if you just want a colour in a layer), a noise texture generator and a text module which can use text from a file, OSC or typed text. A maxmsp/jitter template is also provided if you want to create your own source.

Read more about VPT 5.1 and download it here

Read the online manual

Visit the VPTforum

VPT templates and examples for maxmsp/jitter, PD, Processing and Arduino

VPT is a free multipurpose realtime projection software tool for Mac and Windows developed by HC Gilje. Among other things it can be used for projecting video on complex forms, adapting a projection to a particular space/surface, combining recorded and live footage, or for multiscreen HD playback.

makerbot #508

One of the things I was planning to work on during my New York residency was to construct a makerbot, a 3D printer that comes as a kit, based on the reprap which I have written about before. The Cupcake CNC, which is the official name, is made by Makerbot Industries, a Brooklyn-based company that came out of NYCresistor. The first model came out in late spring last year, and they continually improve the mechanical, electrical and software parts for each batch they send out. For about $1000 you get all you need to make your own desktop 3D printer, which sounded like a good way of getting started with digital fabrication.

For me, one of the entry points into fabbing has been through reading some of Mitchell Whitelaw's postings, for instance Transduction, Transmateriality and Expanded Computing. Chris Anderson wrote a very optimistic article in Wired in January: “In the Next Industrial Revolution, Atoms Are the New Bits“, which is nicely balanced by the maybe more realistic gizmodo article “Atoms Are Not Bits; Wired Is Not A Business Magazine“. A few days ago Thingiverse posted their take on this: “Atoms are Not the New Bits. Yet.”

I ordered my kit in December, and in the beginning of February it arrived at my doorstep in Brooklyn. The construction took about a week. The instructions for assembling the makerbot are pretty good (although I had my IKEA moments), but actually getting the makerbot printing was a lot more complicated and frustrating than I had expected. After trying different things for almost a month, with plastic spaghetti as my only visible result, I got a replacement card for the extruder controller (the part which controls the flow of plastic), so last night I was finally able to print something, which was very exciting!

The Cupcake CNC is made from lasercut plywood and acrylic. It has three stepper motors: the x and y axes move the build platform, while the z axis moves the extruder vertically in relation to the build platform. The printed object is built layer by layer: plastic filament is fed into the extruder, which heats it up and squeezes it out, similar to a hot-glue gun. The controller is based on the seeduino microcontroller, which is an arduino clone.

To make an object you obviously need a file. The final file is basically a description of operations for the makerbot, controlling the position on the x, y and z axes, as well as the temperature and flow of the extruder. You start off with a 3D object, made for instance with blender, sketchup or openscad. This must be exported/converted to the STL format, which can be read by the application that converts the 3D model into slices and produces the control code understood by the makerbot. This is done in the python-based skeinforge, but some more intuitive tools have started to emerge. The makerbot comes with its own controller software, ReplicatorG, which is very similar to the arduino interface. You can choose to either save the object code onto an SD card, or print the object directly from ReplicatorG.
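The STL format itself is surprisingly simple: just a list of triangles, each with a normal and three corner points. As a hedged illustration (not part of the actual makerbot toolchain), here is a small C++ program that writes a tiny ASCII STL file of a tetrahedron, the kind of file skeinforge would then slice; the dimensions and file name are made up.

// Sketch: write a tiny ASCII STL file (a tetrahedron) to show what the
// STL format looks like. Dimensions and file name are placeholders.
#include <cmath>
#include <cstdio>

struct V { double x, y, z; };

static V normal(const V& a, const V& b, const V& c) {
    V u{b.x - a.x, b.y - a.y, b.z - a.z};
    V v{c.x - a.x, c.y - a.y, c.z - a.z};
    V n{u.y * v.z - u.z * v.y, u.z * v.x - u.x * v.z, u.x * v.y - u.y * v.x};
    double len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return {n.x / len, n.y / len, n.z / len};
}

static void facet(std::FILE* f, const V& a, const V& b, const V& c) {
    V n = normal(a, b, c);
    std::fprintf(f, "  facet normal %g %g %g\n    outer loop\n", n.x, n.y, n.z);
    std::fprintf(f, "      vertex %g %g %g\n", a.x, a.y, a.z);
    std::fprintf(f, "      vertex %g %g %g\n", b.x, b.y, b.z);
    std::fprintf(f, "      vertex %g %g %g\n", c.x, c.y, c.z);
    std::fprintf(f, "    endloop\n  endfacet\n");
}

int main() {
    std::FILE* f = std::fopen("tetra.stl", "w");
    if (!f) return 1;
    V p0{0, 0, 0}, p1{10, 0, 0}, p2{5, 10, 0}, p3{5, 5, 10}; // mm
    std::fprintf(f, "solid tetra\n");
    facet(f, p0, p2, p1);   // bottom face, wound so the normal points outwards
    facet(f, p0, p1, p3);
    facet(f, p1, p2, p3);
    facet(f, p2, p0, p3);
    std::fprintf(f, "endsolid tetra\n");
    std::fclose(f);
    return 0;
}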

A good place to start is Thingiverse, where people upload their 3D objects for other people to make. I find it fascinating that I can make my own version of an object somebody else has modeled. I also find the makerbot blog quite useful. Bre Pettis, one of the makerbot team, gave a nice presentation (mp4) at the Chaos Communication Congress in Berlin in January.

More images of #508

iphone serial communication

Apple has not made it easy to let the iphone communicate with external devices. Basically, you need a jailbroken phone to do anything. This post covers the current state of serial communication on the iphone.

A successful example of making an application that uses the serial port on the iphone is xGPS, which lets you connect to an external GPS.

There are two sites that kept showing up when I looked at solutions for accessing the serial port.
One solution, described on ohscope, uses the mobile terminal app available from Cydia, in combination with a serial communication program called minicom. Basically it lets you open a serial connection, and you can send and receive data using the terminal window.
For simple monitoring of input this might be all you will ever need, and it is relatively simple to get working.

The other solution, on devdot, dating back to 2007, is a serial communication application written in C, which is still the example that shows up in different forums. There have been a few changes since 2007, when the iphone was on OS 1.x.

My main interest in getting serial communication working is that it would be very helpful for some of my mobile installations like the Wind-up birds, a network of mechanical woodpeckers, where a lot of the work is setting up a network between several xbee radio modems. It also involves climbing in trees, and it would be very convenient to have a small controller/monitor app on my iphone instead of using my laptop.
I also think it would work well with my dimsun dimmers, being able to control the lights remotely.

Other options
When searching for solutions, some projects show up which claim that “the iphone controls my RC car/my garage door” etc, but actually the iphone is communicating with a pc which in turn communicates with an arduino. This is ok for prototyping, but I want to get rid of the PC part (actually there was a project in 2008 which used an iphone-xbee link to control an RC car. The instructions are in Japanese, and it was made using python I believe).

There is some promising work being done on making the built-in bluetooth work as a bluetooth modem (something that Apple has blocked). Currently the BTstack is only able to read serial data. There is a demo app showing how a wii controller controls a model of the wii on the iphone. There is also the HumanAPI project, which has made an interface for a pulse rate sensor, based on the BTstack and a programming environment called PhoneGap.

Maybe sometime soon it will be possible to access the serialport on the iphone without jailbreaking it. In the openframeworks forum a new serial adapter for the iphone was mentioned.

The physical connection
To get access to the serialport you need some sort of adapter from the iphone connector. I got mine here, and sparkfun stocks a few different models. You can find an overview of the pinout at pinouts.ru.

The physical connection is the easiest part. You only need three connections from the iphone: rx, tx and gnd. Optionally you could also use the +3.3v connector, making it possible to power your device from the iphone. I do this with my xbee connector, but you need to be careful with this, as you might break the iphone if your device draws too much current. I haven't found a reliable source for how much current you can draw, but have seen 80-100mA mentioned in a few places. This works fine with the standard xbee, as it draws around 40mA. It would also work fine with a 3.3v based arduino, and probably with the FIO as well (I am not sure how much current the combination of the xbee and atmega chip draws).

Using the xbee, which runs on 3.3v, I connect tx_iphone to rx_xbee, rx_iphone to tx_xbee, and gnd_iphone to gnd_xbee.

If you connect the iphone directly to an arduino running on 5v, you need to have a 1 k resistor between tx_arduino and rx_iphone.

approach 1: Serial communication using mobile terminal and minicom

Before getting started I recommend installing a very useful addition to the iphone software called sbsettings, available on Cydia, which basically is like a control panel for “hidden” features on the iphone. You can also install addons like syslog, which is very useful when debugging. The mobile terminal app is also very useful.

The instructions from ohscope are pretty clear.
You need to know how to SSH to the iphone from your computer. I use a combination of fugu and terminal on OSX; Putty should do the job on Windows. There is a good explanation of how to do this on a mac (and many other useful things) at hackthatphone.

It is much easier to configure minicom from a terminal on a computer than from the iphone terminal program. So I open terminal on my mac,
and enter ssh -l root 10.0.1.9 (ip of my iphone). The default password is alpine.

After I had updated my iphone to 3.1.2 I got an error message when trying to open minicom: “minicom cannot create lockfile. Sorry”. This turned out to be a permission problem, so I navigated to /var and changed the permissions so anybody could write to lock: chmod a+w lock.

Some useful links related to mobile terminal and minicom.

mobile terminal wiki

I found some of the gestures not to be working, so I ended up mainly using the keyboard for navigation.
basic minicom commands

approach 2: make an app that can read and write to the serial port

Preparing for development on the iphone

Last Saturday NYCresistor hosted an introductory openframeworks workshop with Zach Lieberman, one of the creators.
Basically openframeworks makes it much easier to do interesting audiovisual programming than if you had to write all the low-level C++ code yourself, by wrapping it in easier-to-use methods and functions.
I was new to xcode, openframeworks and C, so there were a lot of new things to dive into, but it definitely helps having a background in programming with Processing.

The neat thing with openframeworks is that it is cross-platform: it runs on OSX, linux, windows and iphone. This was a perfect opportunity for me to move to step two: make my own serial communication program.

The drawback with writing code for the iphone is that you are supposed to pay $99 to be a developer. Since I am mainly interested in developing programs for using the serial port on my own phone, I wasn't so interested in paying for it. Fortunately there are ways around this, but it is rather complicated.
One way is to use the free toolchain.

I felt more comfortable using Xcode combined with openframeworks, so I searched for ways of making Xcode compile, install and debug to the iphone.
First of all, you need to register as a developer at apple and download the Iphone SDK.
Then you have to do some modifications to get around the signing process. There are several ways of doing this, some of them extremely complicated. I think I found a good solution that is not too hard, and which works on iphone OS 3.1.2 and xcode 3.2.1.
The basic instructions are found at the networkpx project blog. First you need to create a code signing certificate, and update your xcode target accordingly. If you only want to compile in xcode, it is very simple: you only need to change a few lines in a file (I found three places that should be changed; the instructions only mention two).
To be able to install and debug it gets more complicated.
I used a combination of Appsync 3.1, available from the hackulo.us repository on Cydia, and some of the instructions from the networkpx project blog. Basically Appsync makes steps 5-10 in the install-debug section unnecessary.
It is important to set the -gta flag in the options for your xcode project (found in the code signing section, other code signing flags).

Now I am able to use the iphone with xcode like a normal developer: I can run and debug on the iphone simulator, or install, run and debug on the iphone, which is very convenient when developing. Sometimes debugging from the iphone doesn't work; then I activate syslog, which creates a log on the iphone that I can read on my computer later.

Setting up the serial port

The port name is the first thing to know: /dev/tty.iap

The serial code itself is quite standard, so the example from devdot would probably still work, basically doing
fd = open("/dev/tty.iap", O_RDWR | O_NOCTTY | O_NDELAY);
to open the port.

Using openframeworks, I modified the oFserial example found in the mac distribution (this example is not in the iphone distribution).

The only thing I need to open the serial port using openframeworks is to specify the port name and baud rate: serial.setup("/dev/tty.iap", 9600);

In v0.60 of openframeworks for iphone, the iphone is not included in the serial library, so you need to manually add "|| defined(TARGET_OF_IPHONE)" to each #if section in ofSerial.cpp which refers to osx and linux, so it would look like this:
#if defined( TARGET_OSX ) || defined( TARGET_LINUX ) || defined(TARGET_OF_IPHONE)
This will most likely be included in newer versions of openframeworks.
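To show the structure, here is a stripped-down sketch of the serial part of such an app (openframeworks style). It is not the actual program: the surrounding iphone app class and all drawing code are left out, and the baud rate and echo behaviour are only examples.

// Simplified sketch of the serial part (openframeworks style), not the actual app.
#include "ofMain.h"
#include <cstdio>

class SerialDemo {
public:
    ofSerial serial;

    void setup() {
        // open the dock connector serial port
        if (!serial.setup("/dev/tty.iap", 9600)) {
            std::printf("could not open /dev/tty.iap\n");
        }
    }

    void update() {
        // read whatever has arrived and echo it straight back
        while (serial.available() > 0) {
            int b = serial.readByte();
            if (b < 0) break;              // OF_SERIAL_NO_DATA or OF_SERIAL_ERROR
            serial.writeByte((unsigned char)b);
        }
    }
};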

So, by modifying an existing serial code example I was able to compile and run the program, but got an error saying that it was unable to open the serial port.
After a hint on the GPSserial forum, I understood that my problem was related to the permissions of the application: it needs to be run as root.
I spent about two days learning about permissions and trying different permission settings, without any luck.
Why was I able to open the serial port via minicom and terminal and not from my application?
After some more research I discovered that in the early iphone OS days, applications were installed as root; this was later changed as it could be a security risk. Maybe this is why the example from 2007 worked back then, because it was automatically installed as root?
Appstore apps are installed in a different location (/private/var/mobile/Applications) than Cydia apps like Terminal (/private/var/stash/Applications), and these places have different permissions. Xcode automatically installs programs in the appstore app location, inside a folder with a long number as its name, such as 03A7764B-DDE0-4A60-A56B-CF2ADBB0213F. So what would happen if I moved my app to the other applications folder?
I was able to connect to the serialport!

There might be an easier way to move the application, but this is how I did it: the easiest way to find the folder installed by xcode is to find the last created folder in the /var/mobile/Applications folder. Open this folder, and just copy yourapp.app to the desktop. Then delete the folder using ssh from a terminal program. If you are in the Applications directory the command would be: rm -r 03A7764B-DDE0-4A60-A56B-CF2ADBB0213F (replace with your own folder name). Copy yourapp.app into /var/stash/Applications, respring the springboard (using sbsettings) on your iphone and you should be set. If you need to install a new version of the app from xcode you need to first delete yourapp.app and respring the springboard, or else you will receive an error when trying to install the app.

Two demos
To show that it actually works I have made two simple demo videos showing serial input and output.

Esemplasticism: The Truth is a Compromise

blink v2, originally uploaded by hc gilje.

Blink is part of an exhibition curated by Hicham Khalidi, produced by the Den Haag art space TAG and made for Club Transmediale, which opens in Berlin today.

From the exhibition description:
Our brains are esemplastic. They are perfectly evolved for pattern recognition, designed to shape disconnected elements, like the incomplete or ambiguous information we get from our senses, into the seamless whole of our experience. What we see, hear, touch and feel is folded into an amalgam of data, emotions and cultural baggage. And in the contemporary world, this esemplastic power is pushed to the limit in the sea of information that we are floating in: data-visualizations, scientific studies and computer analyses become increasingly abstract and disconnected from our normal experiences. Are we losing our sense of meaning as we fail to join the billions of dots? What compromises are we making when we try to settle on a particular interpretation?

The works in Esemplasticism – the truth is a compromise are mostly low-tech, using everyday objects and media. Employing sound, objects and synchronicity; relatively ‘old’ technologies like field recordings, music, video, and projection, each piece lifts the curtain on the perceptual tactics that our esemplastic/apophonic/pattern recognising brains employ to negotiate the world; with wit and irony, they have much to say about verisimilitude as each exposes a different fracture between our expectations, our perceptions and our compromises about the objective ‘truth’ that exists ‘out there’.

Participating artists
Artists: Edwin Deen, Daniël Dennis de Wit, Lucinda Dayhew, Anke Eckardt, HC Gilje, Terrence Haggerty, Yolande Harris, Alexis O’hara, Pascal Petzinger, Mike Rijnierse, Willem Marijs, Bram Vreven, Katarina Zdjelar, Valentin Heun, Sagarika Sundaram, Gijs Burgmeijer.

I will post links to the catalogue when that becomes available.

The exhibition will be on until the end of February.

For me this was an opportunity to improve the installation both esthetically and technically. I constructed a platform for the equipment using a laser cutter, which turned out quite nice. This greatly simplified the installation of the work. As mentioned in previous posts, the installation uses my dimsun lighting system, and the design for this will be made available shortly.

Due to other obligations I needed to set up my installation before everyone else. It was a strange experience to work alone in the 900 m2 empty building in Spandauer Strasse, close to Alexanderplatz. My only companion was the stepladder, which also became the model for my documentation.

dimsun

For the last half year I have been working quite a lot with lights and shadows, and this summer I decided to build my own lighting system, which I called dimsun. The first use of it was for my blink installation, consisting of 24 LEDs placed in a circle.

It consists of dimmers based on the TLC5940 LED controller. Each dimmer can control 16 channels of up to 5W each, and is intended for power LEDs (very bright LEDs of up to 5W). The dimmer is controlled by an arduino microcontroller, and up to 6 dimmers (96 channels) can be controlled from one arduino.
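As an illustration of how simple the control side is, here is a hedged arduino sketch of what a dimmer test could look like, using the commonly available Tlc5940 arduino library (not the actual dimsun firmware): it just fades all 16 channels of one dimmer up and down.

// Hedged example, not the actual dimsun firmware: fade 16 channels up and
// down using the common Arduino Tlc5940 library (values are 12-bit, 0-4095).
#include "Tlc5940.h"

void setup() {
  Tlc.init();          // initialise the TLC5940, all channels off
}

void loop() {
  // fade up
  for (int level = 0; level < 4096; level += 16) {
    for (int ch = 0; ch < 16; ch++) Tlc.set(ch, level);
    Tlc.update();      // latch the new values into the chip
    delay(2);
  }
  // fade down
  for (int level = 4095; level >= 0; level -= 16) {
    for (int ch = 0; ch < 16; ch++) Tlc.set(ch, level);
    Tlc.update();
    delay(2);
  }
}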

more images of the dimsun system.

The schematics will be made available soon.

The lamps are based on star-shaped LEDs combined with a lens, mounted on an aluminum profile.

Controlling an xbee network using an arduino


As promised some days ago, here is the follow-up to the minimal arduino post. I share the arduino code used in controlling a znet 2.5/xbee series 2 network, as well as the schematics for the controller itself.
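As a taste of what talking to a series 2 radio in API mode involves, here is a hedged sketch (not the code I am sharing) that builds and sends a ZigBee transmit request frame from an arduino; the destination address and payload are placeholders.

// Hedged sketch: send a ZigBee Transmit Request (API frame type 0x10) to a
// series 2 xbee on the hardware serial port. Address and payload are placeholders.

void sendToXbee(const byte dest64[8], const byte* data, byte len) {
  if (len > 72) len = 72;                // keep the payload inside the buffer
  byte frameData[14 + 72];               // frame type .. payload
  byte i = 0;
  frameData[i++] = 0x10;                 // frame type: transmit request
  frameData[i++] = 0x01;                 // frame ID (non-zero = request an ack)
  for (byte j = 0; j < 8; j++) frameData[i++] = dest64[j];
  frameData[i++] = 0xFF; frameData[i++] = 0xFE; // 16-bit address unknown
  frameData[i++] = 0x00;                 // broadcast radius (default)
  frameData[i++] = 0x00;                 // options
  for (byte j = 0; j < len; j++) frameData[i++] = data[j];

  byte sum = 0;
  for (byte j = 0; j < i; j++) sum += frameData[j];

  Serial.write(0x7E);                    // start delimiter
  Serial.write((byte)0x00);              // length MSB
  Serial.write(i);                       // length LSB
  Serial.write(frameData, i);
  Serial.write((byte)(0xFF - sum));      // checksum
}

void setup() {
  Serial.begin(9600);                    // xbee on the hardware serial port
  byte coordinator[8] = {0, 0, 0, 0, 0, 0, 0, 0}; // all zeros = the coordinator
  byte msg[] = {'p', 'e', 'c', 'k'};
  sendToXbee(coordinator, msg, sizeof(msg));
}

void loop() {}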

The wind-up birds continued

la forêt de Nouzhat Ibn Sima, originally uploaded by hc gilje.

The wind-up birds didn't settle in the forest of Lillehammer.
Some of them went to the airport in Oslo, some of them to a park in Rabat, Morocco.
Two very different contexts in many ways:

Oslo Airport Gardermoen is celebrating its 10-year anniversary, and I was invited, as one of two projects from the UT21 exhibition, to be part of this anniversary.
The work was to be placed outside, in a passage between the parking area and the terminal building, a very busy pathway.

It was freezing, windy and wet the weekend it was installed. Of course everything at an airport involves heavy security, so I had a special permission card which I wore to avoid frightening passengers (a bearded man climbing trees with electronics, wires and batteries). Actually I got a lot of strange questions, people wondering what these devices were against or for: was it to chase off the woodpeckers? When I explained that they were mechanical woodpeckers I got a lot of blank stares.
More images from the airport.


A week later I was off to Morocco, 25 degrees and sunny in Rabat, the capital. There is a small art space there called l'appartement 22, run by Abdellah Karroum, which had been invited to present work at the first Brussels biennale. Abdellah decided to invite Anne Szefer Karlsen from HKS in Norway to curate some projects in Morocco, so she in turn invited Pedro Gomez Enza and myself to do projects in the frame of the Brussels Biennale, but in Morocco. It gets weirder.
Unlike the very organised airport project, things were a bit looser here. First we needed to find a location, and after some scouting I fell in love with a beautiful and strange park on the outskirts of the center, la forêt de Nouzhat Ibn Sima (also known as le parc sportif), with lots of eucalyptus trees, cute fuzzy pine trees, mint tea houses and people exercising in the strangest ways.
Public art in Morocco isn't common, and there had been quite a lot of discussion before my arrival about what to do regarding permissions etc.
We ended up doing it without permissions, and therefore without a ladder so as not to draw attention to ourselves, and it turned into a strange undercover operation, trying to set up woodpeckers in trees while pretending to do other things. We even drove around in a car while I was programming in the back seat.

I built a special version of the wind-up birds for this actionist installation. Basically I replaced the radio modems with a parasite brain (a timer and a light sensor), which made the birds more independent and maybe slightly more intelligent.
This actually corresponds quite well with natural woodpeckers: some enjoy the company of their fellow creatures, while others insist on being alone.
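For the curious, such a parasite brain boils down to very little code. The arduino sketch below is a hedged reconstruction, not the actual firmware: the pins, light threshold and timing are made up. It pecks only in daylight, and at most once per interval.

// Hedged sketch of a "parasite brain" (not the actual firmware): peck only in
// daylight, and at most once per interval. Pins, threshold and timing are made up.
const int lightPin = A0;                   // light sensor (LDR voltage divider)
const int peckPin  = 9;                    // drives the pecking mechanism
const int lightThreshold = 400;            // 0-1023, "daylight" above this
const unsigned long interval = 120000UL;   // at least two minutes between bursts

unsigned long lastPeck = 0;

void setup() {
  pinMode(peckPin, OUTPUT);
}

void loop() {
  bool bright = analogRead(lightPin) > lightThreshold;
  if (bright && millis() - lastPeck > interval) {
    for (int i = 0; i < 8; i++) {          // a short burst of pecks
      digitalWrite(peckPin, HIGH);
      delay(30);
      digitalWrite(peckPin, LOW);
      delay(70);
    }
    lastPeck = millis();
  }
}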

more images from the wind-up birds in morocco, and some other images from morocco as well.

There are several types of natural woodpeckers in Morocco, but I unfortunately didn't get a chance to see one.