romlab workshop

I just finished teaching a two-week workshop exploring space using VPT and arduino.

I kept a project blog, romlab2012, which might have a few useful tips on integrating VPT and arduino. It will be updated with a bit more documentation from the different projects in the near future.

 

VPT 5.1 available

After nearly 2500 downloads of the first version of Video Projection Tools VPT 5, released in December 2010, I am happy to announce the new version 5.1, partly based on feedback from workshops and the VPT forum.

The release focuses on the possibilities for expanding and customizing VPT to be the right tool for your particular project.

This includes a vastly improved controller section, making it possible to control almost every feature of VPT from another application (like maxmsp, Processing or PD), from another computer, from sensors through the arduino microcontroller, or from a midi controller, in addition to built-in LFO controllers for automating control and software sliders and buttons. This comes on top of the existing built-in preset and cuelist system for playback. Through OSC, VPT parameters can relatively easily be controlled by gestures, using camera or kinect tracking.
You can easily set up VPT to control simple projector shutters by driving servomotors through the arduino module.
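To give a rough idea of what sending OSC to VPT from your own code looks like (outside the bundled maxmsp/Processing templates), here is a minimal C++ sketch using the oscpack library. The address pattern and port below are placeholders, not VPT's actual ones; the real parameter addresses and receive port are documented in the manual and in the templates.

// Minimal OSC sender (oscpack library). Address pattern and port below are
// placeholders -- check the VPT manual/templates for the real ones.
#include "osc/OscOutboundPacketStream.h"
#include "ip/UdpSocket.h"

int main()
{
    const char* host = "127.0.0.1";  // machine running VPT
    const int   port = 6666;         // placeholder receive port
    char buffer[1024];

    UdpTransmitSocket socket(IpEndpointName(host, port));

    osc::OutboundPacketStream p(buffer, sizeof(buffer));
    p << osc::BeginMessage("/hypothetical/layer1/fade")  // placeholder address
      << 0.5f                                            // parameter value
      << osc::EndMessage;

    socket.Send(p.Data(), p.Size());
    return 0;
}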

Templates are provided with the download for exploring OSC from maxmsp and Processing; PD will follow later. Arduino templates for input/output as well as a servo example are also included with the download.
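The bundled servo example is the authoritative reference; as a rough illustration of the general idea, an arduino-driven shutter could look something like the minimal sketch below. The pin number, baudrate and open/closed angles are assumptions for illustration, not values from the VPT template.

// Minimal sketch of a serial-controlled servo shutter -- not the bundled
// VPT template. Pin, baudrate and angles are assumptions.
#include <Servo.h>

Servo shutter;
const int SERVO_PIN = 9;       // assumed servo signal pin
const int OPEN_ANGLE = 10;     // assumed "shutter open" position
const int CLOSED_ANGLE = 100;  // assumed "shutter closed" position

void setup() {
  Serial.begin(9600);
  shutter.attach(SERVO_PIN);
  shutter.write(OPEN_ANGLE);
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == '1') shutter.write(CLOSED_ANGLE);  // close the shutter
    if (c == '0') shutter.write(OPEN_ANGLE);    // open the shutter
  }
}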

VPT can be used with multiple projectors using Matrox Triplehead2go, and the latest version also includes support for multichannel sound: The movie sources can be individually routed to up to eight channels, depending on your sound card. Using the provided maxmsp template it is possible to process the sound further.

VPT 5.1 includes a new possibility of processing the sources using the FX bus. The FX bus consists of eight FX chains, modules which contain shader-based effects. New FX chain modules can easily be made using the provided maxmsp/jitter template, and can be drag-and-dropped onto an existing FX chain to replace it, making it easy to customize the effects to your particular needs.

Three new sources have been added to VPT: a solid (if you just want a colour in a layer), a noise texture generator and a text module which can use text from a file, from OSC or typed text. A maxmsp/jitter template is also provided if you want to create your own source.

Read more about VPT 5.1 and download it here

Read the online manual

Visit the VPTforum

VPT templates and examples for maxmsp/jitter, PD, Processing and Arduino

VPT is a free multipurpose realtime projection software tool for Mac and Windows developed by HC Gilje. Among other things it can be used for projecting video on complex forms, adapting a projection to a particular space/surface, combining recorded and live footage, or for multiscreen HD playback.

They're back

For the occasion of Festspillene i Bergen 2010, a flock of Wind-up Birds has taken over the mountainside of Fløien. More specifically, they are placed along the windy, steep path through the spring-green forest called Fløiensvingene. They will probably stay a few weeks. This has created a lot of buzz among the people using the path, so I will try to spend as much time as possible there to listen to audience reactions.

They might sound similar to previous generations, but there have been a few changes under the hood: the wood blocks have this time been milled instead of glued, making them more robust. Thanks to Ivar Bergseth and his CNC mill!

I use a new set of xbee modules, the xbee 2.4 pro digimesh modules. This gives a more stable network and is much easier to work with than the previous firmware.

I made new, more practical circuit boards, which I got manufactured at BatchPCB, making the job of assembling the woody boards much easier. This was my first attempt at working with the Eagle PCB circuit design software, but I leaned heavily on Roar Sletteland's PCB layout for the first generation.

iphone serial communication

Apple has not made it easy to let the iphone communicate with external devices. Basically, you need a jailbroken phone to do anything. This post covers the current state of serial communication on the iphone.

A successful example of an application that uses the serial port on the iphone is xGPS, which lets you connect to an external GPS.

There are two sites that kept showing up when I looked at solutions for accessing the serial port.
One solution, described on ohscope, uses the mobile terminal app available from Cydia in combination with a serial communication program called minicom. Basically it lets you open a serial connection and send and receive data using the terminal window.
For simple monitoring of input this might be all you will ever need, and it is relatively simple to get working.

The other solution, on devdot, dates back to 2007 and is a serial communication application written in C; it is still the example that shows up in different forums. There have been a few changes since 2007, when the iphone was on OS 1.x.

My main interest in getting serial communication working is that it would be very helpful for some of my mobile installations like the Wind-up birds, a network of mechanical woodpeckers, where a lot of the work is setting up a network between several xbee radio modems. It also involves climbing in trees, and it would be very convenient to have a small controller/monitor app on my iphone instead of using my laptop.
I also think it would work well with my dimsun dimmers, being able to control the lights remotely.

Other options
When searching for solutions, some projects show up which claim that “the iphone controls my RC car/my garage door” etc., but actually the iphone is communicating with a PC which in turn communicates with an arduino. This is ok for prototyping, but I want to get rid of the PC part. (Actually, there was a project in 2008 which used iphone-xbee to control an RC car. The instructions are in Japanese, and I believe it was made using python.)

There is some promising work being done on making the built-in bluetooth work as a bluetooth modem (something that Apple has blocked). Currently BTstack is only able to read serial data. There is a demo app showing a wii controller controlling a model of the wii on the iphone. There is also the HumanAPI project, which has made an interface for a pulse-rate sensor based on BTstack and a programming environment called PhoneGap.

Maybe sometime soon it will be possible to access the serial port on the iphone without jailbreaking it. In the openframeworks forum a new serial adapter for the iphone was mentioned.

The physical connection
To get access to the serial port you need some sort of adapter for the iphone connector. I got mine here, and sparkfun stocks a few different models. You can find an overview of the pinout at pinouts.ru.

The physical connection is the easiest part. You only need three connections from the iphone: rx, tx and gnd. Optionally you could also use the +3.3v connector, making it possible to power your device from the iphone. I do this with my xbee connector, but you need to be careful, as you might break the iphone if your device draws too much current. I haven't found a reliable source for how much current you can draw, but have seen 80-100mA mentioned in a few places. This works fine with the standard xbee, as it draws around 40mA. It would also work fine with a 3.3v-based arduino, and probably with the FIO as well (I am not sure how much current the combination of the xbee and the atmega chip draws).

Using the xbee, which runs on 3.3v, I connect tx_iphone to rx_xbee, rx_iphone to tx_xbee, and gnd_iphone to gnd_xbee.

If you connect the iphone directly to an arduino running on 5v, you need to have a 1 k resistor between tx_arduino and rx_iphone.
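A quick way to check the wiring before involving anything fancier is to run a simple echo sketch on the arduino and watch the characters come back on the iphone side (for instance through minicom, as described below). A minimal version, with the baudrate as an assumption:

// Simple echo sketch for testing the iphone <-> arduino serial link.
// Remember the 1 k resistor between tx_arduino and rx_iphone on a 5v board.
void setup() {
  Serial.begin(9600);  // baudrate assumed; must match the iphone side
}

void loop() {
  if (Serial.available() > 0) {
    int b = Serial.read();
    Serial.write(b);   // send every received byte straight back
  }
}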

approach 1: Serial communication using mobile terminal and minicom

Before getting started I recommend installing a very useful addition to the iphone software called sbsettings, available on Cydia, which basically is a control panel for “hidden” features on the iphone. You can also install addons like syslog, which helps a lot when debugging. The mobile terminal app is also very useful.

The instructions from ohscope are pretty clear.
You need to know how to SSH to the iphone from your computer. I use a combination of fugu and terminal on OS X; PuTTY should do the job on Windows. There is a good explanation of how to do this on a mac (and many other useful things) at hackthatphone.

It is much easier to configure minicom from a terminal on a computer than from the iphone terminal program. So I open terminal on my mac and enter ssh -l root 10.0.1.9 (the ip of my iphone). The default password is alpine.

After I had updated my iphone to 3.1.2 I got an error message when trying to open minicom: “minicom cannot create lockfile. Sorry”. This turned out to be a permission problem, so I navigated to /var and changed the permissions so anybody could write to lock: chmod a+w lock.

Some useful links related to mobile terminal and minicom.

mobile terminal wiki

I found that some of the gestures didn't work, so I ended up mainly using the keyboard for navigation.
basic minicom commands

approach 2: make an app that can read and write to the serial port

Preparing for development on the iphone

Last Saturday NYCresistor hosted an introductory openframeworks workshop with Zach Lieberman, one of its creators.
Basically, openframeworks makes it much easier to do interesting audiovisual programming than writing all the low-level C++ code yourself, by wrapping it in easier-to-use methods and functions.
I was new to xcode, openframeworks and C, so there were a lot of new things to dive into, but it definitely helps to have a background in programming with Processing.

The neat thing with openframeworks is that it is cross-platform: it runs on OS X, linux, windows and the iphone. This was a perfect opportunity for me to move on to step two: making my own serial communication program.

The drawback with writing code for the iphone is that you are supposed to pay $99 to be a developer. Since I am mainly interested in developing programs that use the serial port on my own phone, I wasn't keen on paying for it. Fortunately there are ways around this, but it is rather complicated.
One way is to use the free toolchain.

I felt more comfortable using Xcode combined with openframeworks, so I searched for ways of making Xcode compile, install and debug to the iphone.
First of all, you need to register as a developer at apple and download the Iphone SDK.
Then you have to do some modifications to get around the signing process. There are several ways of doing this, some of them extremely complicated. I think I found a good solution that is not too hard, and which works on iphone OS 3.1.2 and xcode 3.2.1.
The basic instructions are found at the networkpx project blog. First you need to create a code signing certificate and update your xcode target accordingly. If you only want to compile in xcode, it is very simple: you only need to change a few lines in a file (I found three places that should be changed; the instructions only mention two).
To be able to install and debug it gets more complicated.
I used a combination of Appsync 3.1, available from the hackulo.us repository on Cydia, and some of the instructions from the networkpx project blog. Basically, Appsync makes steps 5-10 in the install-debug section unnecessary.
It is important to set the -gta flag in the options for your xcode project (found in the code signing section, under other code signing flags).

Now I am able to use the iphone with xcode like a normal developer: I can run and debug on the iphone simulator, or install, run and debug on the iphone, which is very convenient when developing. Sometimes debugging from the iphone doesn't work; then I activate syslog, which creates a log on the iphone that I can read on my computer later.

Setting up the serial port

The port name is the first thing to know: /dev/tty.iap

The serial code itself is quite standard, so the example from devdot would probably still work, basically doing
fd = open("/dev/tty.iap", O_RDWR | O_NOCTTY | O_NDELAY);
to open the port.
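For reference, a slightly fuller sketch of that idea, opening the port and setting the baudrate with termios, might look roughly like this. It is written in the spirit of the devdot example, not copied from it, and the baudrate and error handling are just illustrative.

/* Rough sketch: open and configure /dev/tty.iap. Baudrate is an example. */
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <stdio.h>

int main(void)
{
    int fd = open("/dev/tty.iap", O_RDWR | O_NOCTTY | O_NDELAY);
    if (fd < 0) {
        perror("open /dev/tty.iap");  /* fails unless the app has the right permissions, see below */
        return 1;
    }

    struct termios options;
    tcgetattr(fd, &options);
    cfmakeraw(&options);                  /* raw 8-bit mode */
    cfsetispeed(&options, B9600);         /* example baudrate */
    cfsetospeed(&options, B9600);
    options.c_cflag |= (CLOCAL | CREAD);  /* local line, enable receiver */
    tcsetattr(fd, TCSANOW, &options);

    write(fd, "hello\n", 6);              /* send a few bytes */

    char buf[64];
    ssize_t n = read(fd, buf, sizeof(buf));  /* non-blocking because of O_NDELAY */
    if (n > 0)
        printf("got %zd bytes\n", n);

    close(fd);
    return 0;
}

The "see below" comment refers to the permission issue described further down: the open() call only succeeds when the app runs with the right permissions.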

Using openframeworks, I modified the oFserial example found in the mac distribution (this example is not in the iphone distribution).

The only thing I need to do to open the serial port using openframeworks is to specify the port name and baudrate: serial.setup("/dev/tty.iap", 9600);

In v0.60 of openframeworks for iphone, the iphone is not included in the serial library, so you need to manually add "|| defined(TARGET_OF_IPHONE)" to each #if section in ofSerial.cpp which refers to osx and linux, so it would look like this:
#if defined( TARGET_OSX ) || defined( TARGET_LINUX ) || defined(TARGET_OF_IPHONE)
This will most likely be included in newer versions of openframeworks.
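With that change in place, the serial part of the modified example boils down to something like this. It is a sketch of the idea using the ofSerial calls from that era of openframeworks; the surrounding iphone app class is omitted.

// Sketch of the serial part of the app; the rest of the openframeworks
// app class is assumed/omitted.
#include "ofMain.h"

ofSerial serial;

void setupSerial() {
    // same call as in the desktop oFserial example, just with the iphone port
    serial.setup("/dev/tty.iap", 9600);
}

void updateSerial() {
    // read whatever has arrived and echo it straight back
    while (serial.available() > 0) {
        int b = serial.readByte();
        if (b != OF_SERIAL_NO_DATA && b != OF_SERIAL_ERROR) {
            serial.writeByte((unsigned char)b);
        }
    }
}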

So, by modifying an existing serial code example I was able to compile and run the program, but got an error saying that it was unable to open the serial port.
After a hint on the GPSserial forum, I understood that my problem was related to the application's permissions: the application needs to run as root.
I spent about two days learning about permissions and trying different permission settings, without any luck.
Why was I able to open the serial port via minicom and terminal and not from my application?
After some more research I discovered that in the early iphone OS days applications were installed as root; this was later changed, as it could be a security risk. Maybe this is why the example from 2007 worked back then: it was automatically installed as root.
Appstore apps are installed in a different location (/private/var/mobile/Applications) than Cydia apps like Terminal (/private/var/stash/Applications), and these places have different permissions. Xcode automatically installs programs in the appstore app section, inside a folder with a long number as its name, such as 03A7764B-DDE0-4A60-A56B-CF2ADBB0213F. So what would happen if I moved my app to the other Applications folder?
I was able to connect to the serialport!

There might be an easier way to move the application, but this is how I did it: the easiest way to find the folder installed by xcode is to find the last created folder in /var/mobile/Applications. Open this folder and copy yourapp.app to the desktop. Then delete the folder using ssh from a terminal program; if you are in the Applications directory the command would be rm -r 03A7764B-DDE0-4A60-A56B-CF2ADBB0213F (replace with your own folder name). Copy yourapp.app into /var/stash/Applications, respring the springboard (using sbsettings) on your iphone and you should be set. If you need to install a new version of the app from xcode, you first need to delete yourapp.app and respring the springboard, or else you will receive an error when trying to install the app.

Two demos
To show that it actually works I have made two simple demo videos showing serial input and output.

dimsun

For the last half year I have been working quite a lot with light and shadow, and this summer I decided to build my own lighting system, which I called dimsun. The first use of it was for my blink installation, consisting of 24 LEDs placed in a circle.

It consists of dimmers based on the TLC5940 LED controller. Each dimmer can control 16 channels, up to 5W each, and is intended for power LEDs (very bright LEDs). The dimmers are controlled by an arduino microcontroller, and up to 6 dimmers (96 channels) can be controlled from one arduino.
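The dimsun firmware itself isn't posted here, but as a rough sketch of how an arduino talks to a TLC5940-based dimmer, here is an example using the commonly available Tlc5940 arduino library (the fade values are arbitrary, and the actual dimsun code may work differently):

// Rough sketch using the common Tlc5940 arduino library -- not the actual
// dimsun firmware. Fades all 16 channels of one dimmer up and down.
#include "Tlc5940.h"

void setup() {
  Tlc.init();            // the library uses a fixed set of pins for the TLC5940
}

void fadeTo(int level) {
  for (int ch = 0; ch < 16; ch++) {
    Tlc.set(ch, level);  // 0-4095, the TLC5940's 12-bit range
  }
  Tlc.update();          // latch the new values into the chip
  delay(10);
}

void loop() {
  for (int level = 0; level < 4096; level += 64) fadeTo(level);   // ramp up
  for (int level = 4095; level >= 0; level -= 64) fadeTo(level);  // ramp down
}

The library handles daisy-chained chips through a NUM_TLCS setting in its configuration, which maps well to running several dimmers from one arduino, though again this is the library's approach rather than necessarily the dimsun one.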

More images of the dimsun system.

The schematics will be made available soon.

The lamps are based on star-shaped LEDs combined with a lens, mounted on an aluminum profile.

Controlling an xbee network using an arduino


As promised some days ago, here is the follow-up to the minimal arduino post. I share the arduino code used to control a znet 2.5/xbee series 2 network, as well as the schematics for the controller itself.
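The shared code and schematics are the real reference; to give a flavour of what talking to a series 2 radio in API mode involves at the byte level, here is a minimal sketch that builds a znet 2.5 transmit request by hand and sends it out of the arduino's hardware serial port. The destination address and payload are placeholders.

// Minimal sketch: build and send a znet 2.5 / series 2 API transmit request.
// Destination address and payload are placeholders.
void sendXbeePacket(const uint8_t dest64[8], const uint8_t* payload, uint8_t payloadLen) {
  uint8_t frameLen = 14 + payloadLen;       // frame data bytes between length and checksum
  uint8_t checksum = 0;

  Serial.write((uint8_t)0x7E);              // start delimiter
  Serial.write((uint8_t)0x00);              // length MSB
  Serial.write(frameLen);                   // length LSB

  uint8_t header[14];
  header[0] = 0x10;                         // frame type: ZigBee transmit request
  header[1] = 0x01;                         // frame ID (non-zero to get a tx status back)
  for (int i = 0; i < 8; i++) header[2 + i] = dest64[i];  // 64-bit destination address
  header[10] = 0xFF;                        // 16-bit destination: 0xFFFE = unknown
  header[11] = 0xFE;
  header[12] = 0x00;                        // broadcast radius (0 = maximum hops)
  header[13] = 0x00;                        // options

  for (int i = 0; i < 14; i++)         { Serial.write(header[i]);  checksum += header[i]; }
  for (int i = 0; i < payloadLen; i++) { Serial.write(payload[i]); checksum += payload[i]; }
  Serial.write((uint8_t)(0xFF - checksum)); // checksum over the frame data
}

void setup() {
  Serial.begin(9600);                       // the xbee sits on the hardware serial port
}

void loop() {
  uint8_t coordinator[8] = {0, 0, 0, 0, 0, 0, 0, 0};  // all zeros addresses the coordinator
  uint8_t msg[] = {'p', 'e', 'c', 'k'};               // placeholder payload
  sendXbeePacket(coordinator, msg, sizeof(msg));
  delay(5000);
}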

wind-up bird(s)

Introducing a new species, the wind-up birds.
The wind-up birds are a flock of mechanical woodpeckers which have found their first home in a forest in Lillehammer, Norway, as part of the UT-21 project.

How will nature treat them, with hostility or acceptance? How will the wind-up birds adapt to heat/cold and wet/dry conditions? Will small insects creep inside the circuitry creating possible short circuits, beetles eat the wood, squirrels use the wood slit as nut storage (or the roof as a slide?), birds use it as a shelter, etc.? Will they be treated as foreign objects or accepted into the local eco-system?
How do real woodpeckers react? Are they threatened, attracted, or not bothered? Will they use the roof as a pecking drum?
Initial tests indicate an attraction: it took 15 minutes for a real woodpecker to join a wind-up bird on the same tree.

Adding a layer to the perceived reality:
The sound of the wind-up birds easily fools humans. The initial reaction is surprise, and then bewilderment, as there seems to be a whole flock of birds communicating. Then comes the curiosity of trying to track them down, to localize the sound, becoming more aware of the surroundings, sharpening the senses.
This was the initial motivation for me: the movement of sound in a space, and the effort involved in trying to localize the source of the sounds, which leads to a stimulation of our perceptive apparatus.
By introducing an element or layer which somehow relates to the environment but still is a bit off (it is very unlikely to hear a whole flock of woodpeckers drumming at the same time, and drumming is usually restricted to the mating season in spring), you perceive reality differently. This could be called an animalistic alertness, one of the three listening modes described by Barthes (Listening).
This project is related to my soundpockets project, and as with that work I feel it is somehow more interesting when people happen upon it by chance, instead of looking for a piece of art in the forest.

The development of the wind-up birds has gone through a lot of phases:
It was important for me that the sound produced was not playback of a recorded sound, but mechanically produced, so I looked at many different ways of creating resonance boxes and settled on a construction resembling a wood block: a piece of wood with a slit. I ended up using a simple push-magnet solenoid for the mechanical part.
The first prototype was just an arduino board, the solenoid and the wood block, used to find the right pecking frequency for the solenoid and to test different wood block designs (see the sketch below).
I decided to add a roof, to protect the wood and circuitry from heavy rain.
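A first pecking test along those lines can be very small: an arduino pulsing the solenoid through a transistor, with the on-time and the gaps deciding the character of the peck. The pin number and timings below are illustrative, not the values I ended up with.

// Illustrative pecking test: pulse a push-magnet solenoid through a
// transistor on one pin. Pin number and timings are not the final values.
const int SOLENOID_PIN = 8;          // drives the transistor switching the solenoid

void peck(int onMs, int offMs) {
  digitalWrite(SOLENOID_PIN, HIGH);  // energize: push the striker against the block
  delay(onMs);
  digitalWrite(SOLENOID_PIN, LOW);   // release
  delay(offMs);
}

void setup() {
  pinMode(SOLENOID_PIN, OUTPUT);
}

void loop() {
  int pecks = random(4, 12);         // a short burst of pecks, roughly woodpecker-like
  for (int i = 0; i < pecks; i++) {
    peck(20, random(40, 90));
  }
  delay(random(2000, 6000));         // pause before the next burst
}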

Since the wind-up birds are communicating, they needed to be in a wireless network. I decided early on to use the xbee radios, which are programmable, low-energy, high-speed radio modems that can work in a mesh network.
A lot of effort was put into creating and deciphering xbee data packets to be used in the arduino/processing environment.
Energy consumption was an important factor in the project, since the wind-up birds would be in a forest with no access to electricity and should be active for a month. One strategy was to use low-power components. It's amazing what a difference the choice of voltage regulator makes, for instance, when it has to be on for a month (the difference in consumption was the size of the battery I ended up using for the whole project).
I also decided to use a low-power version of the arduino, basically just the microcontroller chip running at half speed (which meant using an AVR programmer to program the chips).
The other important factor in reducing energy consumption was to make use of the xbee's and arduino's ability to go to sleep when inactive (see the sketch below). I decided the wind-up birds would peck about every 5 minutes and sleep in between. They would also sleep at night.
After having decided upon the components to be used, I designed a prototype circuit, which was later made into a proper circuit board, making it easier to mass-produce the birds.
It took a lot of trial and error to get the wind-up birds alive and pecking in the lab, but I had a pretty reliable setup when I placed them in the forest. The challenge in the forest was to find interesting locations within the range of the network, and to find interesting pecking patterns. I ended up making a system where the pecking pattern is different every time, so it wouldn't become a simple playback of movement, but a dynamic system.
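To give an idea of the sleep strategy (this is not the actual bird firmware): on an ATmega168-based arduino the usual approach is to put the chip into power-down sleep and let an external interrupt pull it back up. Here the wake signal is assumed to come in on external interrupt 0 (pin 2), for instance from the xbee.

// Sketch of the power-down strategy, not the actual bird firmware.
// Assumes a wake signal (e.g. from the xbee) on external interrupt 0 (pin 2).
#include <avr/sleep.h>

const int WAKE_PIN = 2;          // INT0 on an ATmega168-based board
volatile bool awake = false;

void wakeUp() {                  // interrupt handler: just note that we were woken
  awake = true;
}

void goToSleep() {
  set_sleep_mode(SLEEP_MODE_PWR_DOWN);  // deepest sleep mode
  sleep_enable();
  attachInterrupt(0, wakeUp, LOW);      // a LOW level can wake the chip from power-down
  sleep_mode();                         // sleeps here until the interrupt fires
  sleep_disable();                      // execution continues after waking
  detachInterrupt(0);
}

void setup() {
  pinMode(WAKE_PIN, INPUT);
}

void loop() {
  awake = false;
  goToSleep();                   // draw as little current as possible
  if (awake) {
    // woken up: this is where a pecking pattern would be triggered
  }
}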

More images of the wind-up birds

Thanks to Tom Igoe, Jeff Mann, Kristian Skjold and Roar Sletteland for helping me realize this project.

Here is a link to the first technical post related to the project, which covers how to program and hook up an ATmega168 as a minimal standalone arduino, using the internal oscillator running at 8 MHz and 3.3 volts.

David Cuartielles's talk at Piksel

This year's piksel festival is over, with a focus on circuit bending and open hardware solutions. I could only stay for the first day, but have been following parts of the festival through the quite impressive stream archive from the festival.

I got to see the wonderful loud objects live, which I have written about before.

Last night I watched David Cuartielles from the Arduino team give an interesting talk about the beginning of the arduino project, the current status of arduino development and the challenges ahead. They have sold over 20,000 boards, and it has become the standard tool for electronics prototyping. As a result of this position there are now clones being made all over the world, for instance a dual-core arduino (!!) made by some people in South Korea.

He talked about how everything with arduino is open; they only reserve the use of the name arduino. There is quite a big difference between open software and open hardware, as hardware needs to be manufactured, so you need some initial investment. They have an interesting strategy for funding new development by letting corporations and institutions pay them to work on the arduino project: for instance, Samsung agreed to let Cuartielles make a lot of board designs which will be available for free soon, without Samsung having any rights to the designs. Some of these boards looked very interesting, like a 32-channel output board (which can be daisy-chained) with transistors to control lights, motors etc., a 64-channel input board, relay boards etc.

The arduino project is being restructured into a foundation, and each year some money will be used to support people and projects that have problems with access to hardware.

Here is a direct link to his talk (ogg stream).

Connect the dots

[image: mouselab2.jpg]

I have been very busy preparing and giving a two-week physical computing workshop at The Academy of Fine Arts in Bergen, “Connect the dots”. It has its own blog, with lots of useful info related to arduino, mice etc. (look for the resources category). There are also images available from the workshop.
The aim was to introduce a mixed group of students to the basic concepts of physical computing, and to how to create relations between objects, spaces, actions and people. So it was both a hands-on workshop with arduino (analog in/out, digital in/out, serial communication with a computer), different sensors, and transistors and relays controlling 12 and 220 volt appliances, as well as discussion and presentation of other artists' work, and the production of a one-day exhibition including a listening post, a mouse radio, a paper-burning machine, a weather machine and a call station (where the arduino picked up the phone when you called a specific number ++). Read all about it on the connect the dots blog!

arduino workshop

I had the opportunity to follow an Arduino workshop at the interaction design department at AHO (The Architecture School in Oslo). The workshop was led by Tom Igoe (co-author of Physical Computing), and covered the Arduino microcontroller, Xbee wireless modules, bluesmirf bluetooth modules, small RFID readers, communication between a pc and the arduino using processing or max, and much more.