Project Research- Software


I will need software to process the data collected by the microcontroller or the webcam, and then to control the animation.

I looked into a few software packages to check whether they were suitable for my needs.



Isadora is a graphic programming environment that provides interactive control over digital media, with special emphasis on the real-time manipulation of digital video. In my project it could quite easily trigger and control a video (animation).

However, it would be better suited to a sound installation with graphic manipulation.


Quartz Composer

This free visual tool makes some sophisticated video and 3D work possible without coding… Wow… It is free and comes on the Mac OS disc, in the developer tools section.

This is The Tool for VJing and for creating reactive/interactive animation with a webcam. As I am not too keen on using a webcam, I am afraid I will not have a go at it this year… Sniff.




Free and easy to use.

I could easily create a sketch with motion tracking triggering and controlling an animation. I really recommend this tool for projects with a webcam; unfortunately it is PC only.



Project Research- Microcontroller

Now that I know what I want to do, I need to concentrate on what I need to create this interactive artefact.

If I decide to use a sensor of any kind, I will need to plug it into a device that will control it and collect data from it: a microcontroller.



Which one should I go for?


One of my teachers is a big fan of the Phidget interface. I know that if I ask him for advice, the answer is going to be Phidget. So I went on the UK website to check whether it would be suitable.


Phidgets are one of the most user-friendly systems available for controlling and sensing the environment from a computer. The Phidget Interface Kit seems very versatile.

Analog inputs can be used to measure continuous quantities, such as temperature, humidity, position, pressure, etc. There are many sensors in the Phidgets product line that require no assembly: you just buy them and plug them in, and you are likely to find some code on the .com website. This little device seems very easy to handle, a lot easier than the Arduino. Most of the sensors come with cables designed to plug into the interface very securely, which is a big advantage over the Arduino.
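To give an idea of what reading an analog input involves, here is a minimal sketch of mapping a raw sensor reading onto a physical quantity. The 0–1000 raw range and the linear 0–100 calibration are assumptions for illustration; a real sensor would come with its own conversion formula.

```python
def sensor_to_units(raw, raw_max=1000, val_min=0.0, val_max=100.0):
    """Linearly map a raw analog reading (assumed 0..raw_max, as on a
    Phidget-style interface) onto a physical range val_min..val_max.
    The endpoints are hypothetical calibration values."""
    return val_min + raw * (val_max - val_min) / raw_max

# Example: a hypothetical humidity sensor calibrated 0 -> 0 %, 1000 -> 100 %
print(sensor_to_units(250))  # 25.0
```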

However, there are no ultrasound sensors or foot switches in the Phidget product line, so I would still have to choose independent ones.

The other downside is a lack of tutorials or examples connecting this kind of independent sensor to the Phidget interface. It is very sad indeed, because the device seems more reliable than the Arduino.

Definitely an item to keep in mind for a future project.



I believe this microcontroller is one of the cheapest on the market. You can plug nearly anything into it. However, I did find it fiddly and not always reliable. The major attraction, I think, is the community: whatever your project is about, you will find some code for the Arduino somewhere.

Arduino is an open platform that can communicate with a large number of other software packages to manipulate data in different ways according to your needs. Some examples are Max/MSP, Pure Data, Isadora, VVVV, EyesWeb, Processing, Flash…

Because of this huge online community, I believe I am going to stick with it and try again… hopefully for the best.


Project Research- Sensors

First I need to decide what kind of sensor will be ideal for my project.


When the spectators get close enough to the painting, the animation starts. Consequently we need something that can find out where the spectator is in relation to the screen. This interaction will give the computer the ability to sense the position of the spectator in space.


Detecting Presence


Web cam


My first thought was to use a webcam to detect motion; unfortunately the information would not be accurate enough, and I am not sure how the camera would handle more than one visitor at a time. This would create more problems, especially regarding the environment. If I chose to use a webcam with software such as Isadora, EyesWeb or even Quartz Composer, I would need to constrain the space to make the most effective use of it. Only one person at a time would be allowed, which would take away the surprise effect: the unexpected would, in a way, be expected.


Foot Switch

This device is very straightforward, and will trigger an action within a small area. One of the most common types is made of long strips of metal separated by foam tape. When someone steps on the sensor, the foam tape is compressed and the metal strips touch each other.

This seems a cool website to get that kind of switch, but to be honest I do not dare ask for the price.

The mat switch would be ideal underneath a carpet, but I believe my budget will not allow it.


Determining Position


Infrared (IR)

IR sensors are very good for short-distance sensing. These kinds of sensors send out an infrared beam and read the reflection of the beam off a target, in our case the spectators. Sharp IR sensors are recommended in books such as "Physical Computing" by Dan O'Sullivan and Tom Igoe.

Some ranging formulas come from an excellent article on Acroname's site.
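As a rough illustration of what such a formula looks like: a Sharp-style IR sensor outputs a voltage that falls off roughly as the inverse of the distance to the target. The constant below is a hypothetical placeholder, not Acroname's calibrated value; a real sensor would need fitting against measured data.

```python
def ir_distance_cm(voltage, k=27.0):
    """Rough Sharp-style IR ranging: distance is roughly inversely
    proportional to the output voltage. k is a hypothetical
    calibration constant, to be fitted for a real sensor."""
    if voltage <= 0:
        raise ValueError("voltage must be positive")
    return k / voltage

# A higher reflected voltage means a closer target
print(ir_distance_cm(2.0))  # 13.5
```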


Unfortunately the range covered by this kind of sensor is too small, up to 1.4 m, and I need to sense presence at about 2 to 3 metres. Consequently the only alternative is the ultrasonic sensor.


What I will need, I believe, is a range sensor.


Most of these sensors send out some form of energy in order to read the distance to a target. This could be light, magnetism or even sound. The sensor converts the distance into an electrical voltage or digital signal, which can be read by a microcontroller.

It works on a similar principle to an aeroplane radar: the system sends out a radio signal and measures the time it takes to bounce back from a target.



Short for "Radio Detection and Ranging", radar, like sonar, uses a pulse of radio energy to map distance based on the length of time it takes the pulse to return to the source. Radar works by sending high-frequency radio waves (microwaves) from an antenna, and then detecting that energy after it bounces off a remote target. The wavelength of the microwave and its strength can be measured when it returns.


The radar would allow me to calculate the exact positions of visitors, and consequently trigger the animation at the right time.


How much does it cost, and where could I find it?

I have been told it would cost about £200 to £300; as much as I like the module, the student loan will not pay for that.


Will it work with the Arduino or the Phidget interface?

I could not find much information, but the price put me off quite a bit.




They work like sonar devices, sending out a ping of ultrasound and then timing how long it takes to bounce back. This kind of little device can read from 6 inches to 6 feet. They use an initiation pin and an echo pin: in the code, we set the initiation pin high, then use an rctime command on the echo pin to measure how long it takes the ultrasound to return.

The Devantech SRF family seems to be very popular in the (amateur) robotics world. One of the cheapest is the SRF04. We need to set the init pin high, and then wait for the ping to come back. To determine the distance, the microcontroller's software multiplies the time taken by the speed of sound and divides by two, since the ping travels out and back.
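That conversion can be sketched as follows. The 343 m/s figure assumes air at roughly 20 °C, and the pin handling is left out since it depends on the microcontroller:

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s in air at about 20 degrees C

def ultrasound_distance_cm(echo_us):
    """Convert an SRF04-style echo time (microseconds, round trip)
    into a one-way distance in centimetres: multiply by the speed
    of sound, then halve for the out-and-back travel."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2.0

# A 1000-microsecond echo puts the target about 17 cm away
print(ultrasound_distance_cm(1000))
```

Roughly 58 µs of echo time per centimetre of range, which is the rule of thumb often quoted for this family of sensors.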

This sensor is going to be perfect, I believe, if I decide to use the Arduino microcontroller. This little tool should provide very accurate ranging information.


I found a very good guide in "Physical Computing" by Dan O'Sullivan and Tom Igoe, which might help us if we decide to use this kind of sensor.

Usman Haque

This artist is in fact an architect whose research focuses on interaction, connectivity and contextual awareness. These kinds of technologies modify our understanding of space and change the way we relate to each other. He does not think of architecture as a static and irreversible entity; instead he sees it as dynamic, responsive and conversant.

Two years ago I had the chance to see one of the projects he worked on with Robert Davis:


 Evolving Sonic Environment

The installation is based on the collective behaviour of the devices, which is affected by the way the room is occupied (by people or other mobile objects); consequently the room develops an "awareness" of its occupancy.

The devices work like neurons, cascading during high activity and altering their thresholds during periods of low activity. Inputs and outputs consist of high-frequency sound near the threshold of human hearing. When they have received sufficient input energy they "implode", with a continuous sound of varying frequency.

When visitors enter this space, they can feel slightly dizzy because of the high frequency. I personally did not feel anything, but some schoolmates felt slightly strange…

Evolving Sonic Environment

I found this paper on The Hague website, which I think could be very interesting for those of us who aim to create an interactive installation.

This paper describes the results of a collaborative research project to develop a suite of low-tech sensors that might be useful for artists working with interactive environments.


Music for Bodies – Sonic Bed

Last year we took a trip to see an exhibition with the Sonic Bed. Well, well… the bed was not there anymore. :(

Music for Bodies is a research project involving the sonic mapping of human bodies to architecture, through studies of bio-resonance and interface building. Its aim is to discover new methods of experimental music making, and to make a new kind of music more accessible to a wider community. The research is mainly based on feeling the music rather than just listening to it.


Music for Bodies




During the workshop we were introduced to some sound environment installations.

Within the realm of public art, a sound installation can be the musical equivalent of a sculpture, or can be part of a sculpture. Some sound installations have gained from applying a variety of electroacoustic techniques and technologies in their creation and spatial arrangement.


In this installation the visitors lie down and relax, watching the firmament above them. By pointing a finger upwards, a visitor can insert new stars into orbit, each with distinctive visual and musical characteristics.

The sound is determined by the position chosen on the orbit; the bigger you let the star grow, the louder it plays.


 “The orbiter is an interactive sound environment by Vera-Maria Glahn and Marcus Wendt. It invites you to reach for the stars and play their music!”

The installation is based on custom-built software performing real-time analysis of a camera image of the player, as well as generating 6-channel audio and video signals. The video analysis is coded in C++, which instructs SuperCollider for the audio generation and Processing for the graphics.

The synthesized sound used in this installation has a very heavy 70s influence. Quite psychedelic, in a way…




Group Work

We only have one week left. The teacher did not see anything happening, so he basically gave us a big wake-up shake, which was needed but not really welcome. Not welcome because, unfortunately, the only group members present were the ones trying to do something, and that felt slightly unfair, even if it was for the better. However, we managed to create two stages during the workshop, although I confess I did want to leave the class. Most of the classmates arrived late and kept asking me what we were doing. I do believe that was not my day.

(See Portfolio for development)