flow no. 1 | kinect projector dance

flow #1

This choreography is about the duet of dance and interactive media.
My inspiration is to investigate different possibilities to melt organic hip-hop dance with projected light – searching for new shapes, transitions, identities and meanings. It is a portrait of urban artists giving a computer access to their very private natural flow.

Dancers follow their own flow. Beautiful enough – till another dancer infiltrates this execution of ego. Then we have a duet; and if done right, it can flow to everything and beyond. A duet is a partnership based upon exchange and communication, leading to conflict or harmony. In the 20th century mankind asked the digital machine for a dance and didn't wait for the answer. Both are now damned to a duet forever, in complete – conflony.

The flow series contains a number of choreographic and interactive media productions, searching for different digital illustrations of dancing bodies. Being a dancer, I have always been interested in the nature of continuous harmonic movements. As an engineer, I am fascinated by algorithmic solutions to illustrate this natural flow.
However, the combination of dance methods and digital media cannot breathe if we simply add one to the other. Instead, I am interested in choreographing a hybrid with its own identity – an identity which makes it impossible to extract the dancer from the machine and vice versa. Forms created spontaneously by both, morphing into each other, going somewhere else – becoming one organic shape moving in time.


The Installation

This lightweight setup consists of a Kinect camera capturing the dancer's movements, a notebook to evaluate the captured information, and a projector to display the interactive graphics. The idea was to create a concept which works in and relates to spontaneous urban scenarios. Art and science are beautiful engines for both society and economy. Hence they should fill our environment and invite everyone, everywhere.



Projection and dance have melted into stunning experiences before. Frieder Weiss and Klaus Obermaier, for example, created beautiful work within the fields of ballet and contemporary dance. They inspired people all over the world and are pioneers of hybrid art between the classic and digital worlds.

Within the upcoming flow series I am interested in investigating the benefits and limitations of projected graphics, and how they can be applied to the flow of urban dances like hip-hop. The field of urban dance reaches from the fine arts, across commercials, to train stations. It is a beautiful field in my opinion, with unlimited possibilities concerning performance, exchange and communication. It has its own culture, way of thinking, way of creation and aesthetics.



The flow of a dancer is a very individual phenomenon. It seems to be in harmony with a construct of anatomy and intention. The flow of a bboy, for example, seems to be related to the human desire to move economically. In order to manage elegant acrobatic movements, a bboy needs to use each movement as the impulse for the next. Stopping this motion is possible, but can be economically critical to his stamina and overall performance. This method of creating movement aligns beautifully with physically computed media: both relate to the idea of causality and physics.

The flow of a robotic dancer (or popper), on the other hand, is far more bizarre. It can be understood as a sequence of binary coded movements and signals. He moves – isolates – stops – isolates – stops – One|Zero. It is a dance method consisting of everything but human behaviour. It is the intention of this dance not to be organic and economic. It consists of discrete signals without humanistic flow. Curiously enough, it seems to be very hard for a machine to read a human who tries to be like a machine.
It seems to be quite a big challenge to portray this dance with flowing graphics. Extracting those discrete signals by means of computer vision will be the challenge of the upcoming studies. Popping doesn't allow latency or low resolution cameras / projectors. Let's see…



The realtime software is entirely built in openFrameworks. It enables me to adjust to conditions in the environment, camera tracking and visual output. It depends on the following libraries and contributions.

  • ofxOpenNI – created by gameoverhack – an OpenNI wrapper to read captured data from the Kinect camera in realtime.
  • ofxCv – created by Kyle McDonald – a fast OpenCV wrapper.
  • ofxFluidSolver – created by Memo Akten. After years, it is still one of my favourite calculation models to illustrate the continuous flow of a dancer and graphics.
  • ofxUI – created by rezaali – having worked a lot in Processing, I completely fell in love with this GUI library, as it speeds up my tweaking processes. It is easy to use and fast to bind to variables.


Technical Insight

In the following section I will display a couple of techniques I used and methods I tried. I am going to enrich this section with more explicit code samples and tools as the flow series continues.

Some of the displayed graphical computations are based upon the MSAFluid algorithm by Memo Akten. For years it has been my favourite model to illustrate dance, for two reasons: it forgives and it enriches. It is a duet. It forgives any lack of camera tracking, as it smoothly illustrates an oily flow; and it enriches with generative patterns which are not explicitly created by the dancer. It is the perfect model to create harmony with a human body in motion – probably not the perfect model to create disharmony.

I used background subtraction and optical flow in order to extract the necessary data from the performing dancer. The movement vectors calculated by the optical flow algorithm are used to manipulate the interactive graphics. Within the upcoming projects I am going to portray further methods to play with computer vision and dance.

VVVV calibration in external environments 

I wanted to use openFrameworks as my comfort zone to create the interactive graphics, but used the VVVV patch by Elliot Woods to calibrate. The following section is about some experiences I made while creating this system.

  • Check this tutorial by Elliot Woods. The VVVV calibration process is explained perfectly there, in my opinion.
  • Add a node to the VVVV patch to store the calibration result in an XML or txt file for external usage. The camera matrix should be all you need.
  • Simply import this camera matrix into your application within the IDE of your choice. Make sure columns and rows are correct and not switched.
  • The libraries that enable the Kinect to work with a computer differ from environment to environment. Make sure that the three dimensional coordinate system of the Kinect's world space in your library is identical to the coordinate system in the VVVV calibration patch.
  • Now, reading your 3D coordinates for each pixel, you want to map them to the 2D space of your projector. Please read the introduction to camera calibration here.
  • All kinds of background information can also be found here.
  • Don't spin on your head. It hurts, increases the possibility of going bald, limits your virtual dancing partner's possibility to hug you, and is not an appropriate movement for a 30 fps camera device.
