1.3 Mirror Particles

Project by Jordy van Miltenburg & Anne Nelissen

Technologies expand our space. Not only do some technologies expand our physical space (with different means of transport you can travel almost anywhere, and - with some money - even go into space), they also extend it by introducing virtual spaces. With this installation, virtual space can be manipulated through physical movements. The relation between these two spaces is central to this project.

In our project we explore this relation between virtual and real space. In our contemporary technological environment, our movements carry meaning beyond their consequences in the physical world. By pressing a few buttons, we do not merely push the buttons down; we can also produce whole sentences in any writing style.


Marshall McLuhan categorized technologies as extensions of man. Human legs were once extended by the horse; nowadays they are extended by incredibly fast wheels.

“Guns do not kill people. People kill people.”

But do they without guns?

Is killing someone with a gun easier than killing someone in a physically and emotionally exhausting fight? Do technologies only extend our initial actions, or do they also invite other movements and actions? Does John kill people, or does only John with a gun kill people?


In our project we approach these questions in a more positive and creative way. Does the link to the virtual world invite the user to new movements and new interactions with the object?


Our project

For our project we wanted to create a fun, physical experience that would not require any explanation beforehand. We chose a round shape for the controller because we did not want the object to suggest an up, down, left or right side. People will try to work out which movements cause what to happen on the screen, but the round shape makes this difficult, so they are less likely to lose interest quickly. Besides that, because the object resembles a ball, it invites the user to pick it up and play with it. The input for the program consists of the movements made by the ball.

We put a phone inside the ball that sends information about its position and acceleration to the laptop. The phone app used in this installation is called "Sensor2OSC", which sends the data over an OSC connection. On the laptop we used a tweaked version of the ofxOscReceiver example to capture the incoming data, which drives the particle system. For the visuals on the screen we used example code from the book Mastering openFrameworks by Denis Perevalov. We mapped the parameters of the particle motion to different movements of the phone and tested whether this worked. The figure below shows which movements were connected to which parameters; a rough sketch of the receiving side follows.
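As a minimal sketch of what the receiving side looks like, the code below reads OSC messages from the phone in an openFrameworks app and derives a single value that could drive the particle system. The OSC address "/accelerometer", the port, and the commented-out particle parameter are assumptions for illustration; the actual addresses sent by Sensor2OSC and the parameter names in the Perevalov particle example differ per configuration.

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// Port the phone sends to; this value is an assumption.
#define PORT 8000

class ofApp : public ofBaseApp {
public:
    ofxOscReceiver receiver;
    ofVec3f accel; // latest accelerometer reading from the phone

    void setup() {
        receiver.setup(PORT);
    }

    void update() {
        // Drain all waiting OSC messages each frame.
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            // "/accelerometer" is a hypothetical address; check what
            // Sensor2OSC actually sends for your setup.
            if (m.getAddress() == "/accelerometer") {
                accel.x = m.getArgAsFloat(0);
                accel.y = m.getArgAsFloat(1);
                accel.z = m.getArgAsFloat(2);
            }
        }
        // Map the sensor reading to a particle parameter, e.g. stronger
        // shaking -> more turbulent particles. "params.force" stands in
        // for whichever parameter the particle system exposes.
        float shake = accel.length();
        // params.force = ofMap(shake, 0, 20, 0, 1, true);
    }
};

int main() {
    ofSetupOpenGL(1024, 768, OF_WINDOW);
    ofRunApp(new ofApp());
}
```

Reading the messages in update() rather than in a separate thread keeps the sensor data in sync with the particle simulation, which is how the stock ofxOscReceiver example is structured as well.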

Links:

Screen capture of the installation

https://www.youtube.com/watch?v=FFqyJaEzgvo