by Sofia Paraskeva



Database Art

7 Train

The Mashup Train Ride Interactive Projection is a software application developed by Fluid New Media Lab over the course of eight weeks, in a workshop led by teaching artists Sofia Paraskeva and Gabriel Roldos. Participants Ellen Pearlman, Esmeralda Kosmatopoulus, Carlos Martinez, Miroshlava Palavacini, Elizabeth Schwabe, Thelmo Cordones, Adriana Velasco and Paulina Ramirez created video clips that relate to the experience of people from different cultures blending in or around trains. We used Pure Data, a real-time graphical dataflow programming environment for audio, video, and graphical processing, to create an interactive software application that mixes the videos when ambient sound levels are high (e.g., a train passing). The art piece combines the video clips and the computer software in a projection that evokes the mashup of events one can experience on a train ride. This project was made possible, in part, with funds from the “Decentralization Program”, a re-grant program of the New York State Council on the Arts, administered by the Queens Council on the Arts, with additional support from Local Project Art Space.

The 7 Train Video Mashup Project was presented at Local Project on July 17 and at NARS Foundation on July 26, 2012. Below are some images from the Pure Data patch mixing the students' videos according to the surrounding noise level.
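The patch's behaviour, crossfading toward a different clip as the ambient noise rises, can be sketched in plain code as well. The sketch below is a hypothetical reconstruction in Python, not the actual Pure Data patch: it computes an RMS level from an audio buffer and maps it, between an assumed threshold and ceiling, to a 0–1 crossfade amount between two video clips.

```python
import math

def rms(samples):
    """Root-mean-square level of an audio buffer (values in -1.0..1.0)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def crossfade_amount(level, threshold=0.2, ceiling=0.8):
    """Map an RMS level to a 0..1 crossfade: 0 below the threshold
    (show clip A), 1 at or above the ceiling (show clip B),
    linear in between.  Threshold and ceiling are illustrative."""
    if level <= threshold:
        return 0.0
    if level >= ceiling:
        return 1.0
    return (level - threshold) / (ceiling - threshold)

# A quiet room stays on clip A; a loud event (a train passing)
# pushes the mix fully toward clip B.
quiet = [0.01] * 512
loud = [0.9, -0.9] * 256
```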

Rainbow Resonance using the Kinect

Photography by Kate Milford, Extreme Kids, Brooklyn, February 23rd, 2012

Rainbow Resonance now uses the Microsoft Kinect 3D motion controller to detect motion in space, while MAX/MSP interprets body movements into a mapping of color and sound. The transition took a couple of months to implement, and was finally tested on February 23rd at Extreme Kids, Brooklyn, a playground haven for autistic children.

Using the Kinect has simplified installation setup and reduced the demands on lighting conditions. It has also eliminated the need for the background-subtraction methods used with RGB cameras. Most importantly, it allows further user interaction to be implemented using depth data, by capturing and interpreting movement along the z-axis, that is, how close to or far from the camera the user is.
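As a rough illustration of the z-axis idea, a depth reading can be normalized into a single control value for sound or color. The near/far bounds below are assumptions for the sketch, not the installation's actual settings.

```python
def depth_to_control(z_mm, near_mm=800, far_mm=3500):
    """Normalize a Kinect-style depth reading (millimetres) to a 0..1
    control value: 1.0 when the user is closest to the camera, 0.0 at
    the far edge of the tracked range.  Readings outside the range are
    clamped.  The 800-3500 mm bounds are illustrative assumptions."""
    z = min(max(z_mm, near_mm), far_mm)
    return (far_mm - z) / (far_mm - near_mm)
```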

A number of sound design options are now ready to be implemented in the software, taking into account movement along all three axes. Alternative frequencies are still used as the basis for sound, contributing to a meditative mood during the installation. The colorful mirror images and the sound accompaniment constitute an immersive user experience that facilitates play and social interaction. The user is empowered as simple movements of the body generate harmonic intervals and sounds.

Rainbow Resonance is likely to be exhibited at the same location on a more permanent basis to give us the opportunity to modify the software to the specific needs of autistic children.  The goal is to provide a safe and engaging environment where children can interact to reinforce and build social skills through dance, play and music.

Photography by Kate Milford, Extreme Kids, Brooklyn, February 23rd, 2012

Rainbow Resonance – X-Dream Arts Festival

Rainbow Resonance was exhibited at the ARTos Cultural and Research Foundation during the 2nd X-Dream Arts Festival in Nicosia, Cyprus. Visitors, including children, had the opportunity to dance, play and enjoy this interactive experience.

Opening night of Rainbow Resonance at ARTos Foundation, Nicosia, Nov 1st 2010.
Click to watch video

Children, and especially pre-schoolers, are the most devoted fans of Rainbow Resonance. They seem to enjoy interacting with the piece endlessly.

Children, including pre-schoolers, playing with Rainbow Resonance, ARTos Foundation, Nicosia, Nov 3rd 2010.
Click to watch video

It is always a pleasure to observe how dancers experience Rainbow Resonance. Rhythm and co-ordination in their movement always produce a unique and more consonant sound. Next to children, dancers seem to enjoy this experience immensely.

Alexis Vassiliou dancing at Rainbow Resonance, ARTos Foundation, Nicosia, Nov 3rd 2010.
Click to watch video

More on Rainbow Resonance


CYBC (Cyprus Broadcasting Corporation) – TV interview / CYBC interview – English transcription

“Entexnos” weekly national TV program on the arts in Cyprus – January 16, 2011

Kipros TV - web interview – January 24, 2011
Online website supporting the Arts & Culture of Cyprus

Espresso magazine Athens - January 16, 2011 / Espresso – English transcription

Maison Figaro - October 2010  / Maison Figaro - English transcription

Cyprus Mail - July 4, 2010

Cyprus Weekly - July 9-15, 2010

Parikiaki - July 8, 2010

Eleftheria - July 10, 2010

Ta Nea - October 27 , 2010

Politis Newspaper - October 18, 2010 - August 2009.

Special Children interact with Rainbow Resonance

Zoe plays with Rainbow Resonance, Ms Viewpoint Ltd, Nicosia, July 13th 2010.

Click to watch video - Zoe and Alex play

On July 13th Alexandros and Zoe, two children with special needs were invited with their families and friends to experience the installation.  This was the first time Rainbow Resonance was presented to the disabled community.  It was a thrill to discover that the children responded vividly and enjoyed the piece.

Zoe plays with Rainbow Resonance, Ms Viewpoint Ltd, Nicosia, July 13th 2010

More on Rainbow Resonance

Rainbow Resonance – MS Viewpoint

Rainbow Resonance at Ms Viewpoint Ltd, Nicosia, July 8th 2010.
Click to watch video

Rainbow Resonance was presented for the first time in Cyprus at MS Viewpoint Ltd, Nicosia, on July 14th. Friends and visitors had the chance to experience the piece after a brief talk about the project by Sofia Paraskeva.

Rainbow Resonance presented to Special Children

Rainbow Resonance presented to special children, Ms Viewpoint Ltd, Nicosia, July 13th 2010.
Click to watch video


More on Rainbow Resonance

Computer vision installation setup

Aural Aura – Wireless Communication

Wireless Communication with the BUG

Wireless communication with BUGbees

The BUG serves as the central hub for processing data at the hardware level, using sensor data and BUGbee readings. Three BUGbees are attached to the bodysuit for location identification using the method of triangulation. My programming partner Dr. Constantinos Hadjiloizis and I have successfully tested wireless communication between the BUGbees and the BUG motion sensor using the Dragonfly interface. To test the BUGbees I physically attached them to my body, one on each arm and one below the stomach. We will use the Dragonfly interface to open a port of communication between MAX/MSP and the BUG. The BUGbee readings are analyzed and compared with the sensor data readings transmitted via the XBees and packaged in MAX/MSP, to ensure a more accurate datastream for determining movement and location. We are also conducting tests using the Von Hippel interface, which allows us to use the Arduino open-source platform as an alternative for wireless communication, for comparison and testing purposes.
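The triangulation step can be illustrated with a minimal 2D sketch, assuming three receivers at known positions and a distance estimate to each. This is the textbook trilateration calculation (subtracting the circle equations pairwise to get a linear system), not the project's actual code.

```python
def trilaterate_2d(p1, p2, p3, d1, d2, d3):
    """Estimate an (x, y) position from three fixed receiver positions
    p1..p3 and the measured distance d1..d3 to each.  Subtracting the
    three circle equations pairwise yields a 2x2 linear system, solved
    here by Cramer's rule.  Illustrative sketch only."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d, e = 2 * (x3 - x1), 2 * (y3 - y1)
    f = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a * e - b * d  # zero if the receivers are collinear
    return ((c * e - b * f) / det, (a * f - c * d) / det)
```

In practice noisy radio-based distance estimates would make a least-squares fit over more than three readings preferable, which is one reason to cross-check against the other sensor datastream.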

Using the Dragonfly interface to compile BUGbee data readings from the BUG.

Aural Aura

Aural Aura in Development


Harvestworks July 09

Current developments: I am in the process of replacing the flex sensors with handmade sensors of neoprene and Velostat. I am also examining the possibility of eliminating body sensors in favor of a more sophisticated camera-based motion tracking system, which allows the precise identification of specific moves. At the same time I am implementing RGB LEDs mapped to the chakra points on the bodysuit, which shine through the astral body dress at the precise moment each chakra is triggered.

Aural Aura


Aural Aura is a computer vision and sound installation/performance that visualizes color frequencies and sonifies resonances of key energy points of the body, generated through movement and triggered using sensors. The project aspires to enhance energy management through a mapping of color, sound and movement based on the Shaolin 18 Lohan Hands, the fundamental Chi Kung exercises used for meditation in China since 527 CE. Active and/or passive bodily movements trigger specific sound frequencies and colors mapped to the chakras. The goal of Aural Aura is to unite sound, color and dance in a fluidly attuned communion that transports the performer, and ideally the audience, into a hyper-reality approaching a spiritual experience.

The Structure

The nature of Aural Aura is fivefold, involving computer vision, a wearable interface, an astral body projection, musical compositions and sensors.


The Bodysuit

The bodysuit is the controller and generator of sounds and visual effects according to the readings of the attached sensors. A choreography of eight moves, measured by the sensors, determines the triggering of sounds and images. The sensor bodysuit acts as an instrument capable of playing an eight-tone scale resembling the Lydian mode, that is, the white keys of the piano. Choosing a wearable interface as an event-based generator for mapping gestures was necessary in order to identify and associate each chakra with a sound and a color.
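A minimal sketch of the instrument idea, assuming the eight moves are mapped to the white keys C4–C5 (the register is an illustrative choice; the piece's actual tuning and octave are not documented here):

```python
# White-key MIDI note numbers from C4 up to C5, one per choreographed
# move.  The register is an assumption for the sketch.
WHITE_KEYS = [60, 62, 64, 65, 67, 69, 71, 72]

def move_to_pitch(move_index):
    """Return (midi_note, frequency_hz) for a move index 0..7,
    computing the frequency in equal temperament with A4 = 440 Hz."""
    note = WHITE_KEYS[move_index]
    freq = 440.0 * 2 ** ((note - 69) / 12)
    return note, freq
```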

A very special thanks to Andrea Lauer, whose contribution made possible the execution of the bodysuit design as a costume.

The Gloves

The pair of gloves uses two flex sensors, one soft potentiometer, one FSR and soft switches to provide continuous controllers and switches affecting rhythm, pitch, reverb and delay. The gloves went through multiple design iterations. In the final design I used soft switches made of conductive fabric and tightly sewed the remaining wire wrap onto the surface of the glove. For an update please check the link Musical Gloves.
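The continuous-controller idea can be sketched by scaling a raw sensor reading to a MIDI continuous-controller value. The 10-bit ADC range below is an assumption for the sketch, not a documented spec of the gloves.

```python
def sensor_to_cc(raw, raw_min=0, raw_max=1023):
    """Scale a raw sensor reading (e.g. a 10-bit ADC value from a flex
    sensor or FSR) to a MIDI continuous-controller value 0..127.
    Out-of-range readings are clamped.  The 0..1023 range is an
    assumption, not the gloves' documented behaviour."""
    raw = min(max(raw, raw_min), raw_max)
    return round((raw - raw_min) * 127 / (raw_max - raw_min))
```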

The Astral Body

An exoskeleton wearable layer, the astral body dress, physically extends outwards from the central bodysuit, representing the aura and serving as a canvas for projecting images. The astral body dress is used for projection to visualize the aura in three dimensional space rather than on a screen.  Visual effects projected on screen merge with the astral body dress color effects to simulate the aura.


Computer Vision

As computer vision tracks movement, the screen/body projection reflects the chakra points being triggered by glowing in the equivalent colors and/or symbols, while the corresponding sounds are generated in real time.

The Sound

The musical compositions attempt to capture the ethereal aspect prevalent in visual representations of the aura. The esoteric aspects of music inform the sound design as different energy points are mapped to distinct instrumental sounds and musical intervals based on the Pythagorean tuning system. Sounds blend harmoniously into each other as gestural movements trigger different energy points (chakras). Pure oscillators are synthesized into alpha and delta vibrations, intervals and resonances that accompany the choreography of movements. The low vibrational frequencies help induce meditative states. Sound does not accompany the dance but is instead created by the dancer.
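Pythagorean tuning derives its intervals by stacking pure 3:2 fifths and folding the result back into a single octave. A small sketch of that arithmetic:

```python
from fractions import Fraction

def pythagorean_ratio(fifths):
    """Frequency ratio reached by stacking `fifths` perfect fifths
    (3:2) above a reference note, folded back into one octave [1, 2).
    Negative values stack fifths downward (e.g. -1 gives the 4:3
    perfect fourth)."""
    r = Fraction(3, 2) ** fifths
    while r >= 2:
        r /= 2
    while r < 1:
        r *= 2
    return r
```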

The Mapping

Aural Aura relies on a mapping between energy point (chakra), color, sound frequency and movement to produce a synesthetic experience that physically manifests the invisible and inaudible nature of the aura. The mapping of color to sound is based on “Rainbow Resonance”, a project I developed in Spring 2008, which assigns sound frequencies to colors in an intelligible manner by transposing the frequencies of visible light 40 octaves down so that they fall within the audible range. Aural Aura adds the element of movement to achieve an immersive synaesthetic experience.
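The transposition itself is simple arithmetic: convert the wavelength to a frequency, then halve it once per octave. A sketch, using the 40 octaves stated above:

```python
C = 299_792_458.0  # speed of light, m/s

def light_to_sound_hz(wavelength_nm, octaves=40):
    """Transpose a visible-light wavelength into the audible range by
    dropping its frequency the given number of octaves (each octave
    halves the frequency)."""
    f_light = C / (wavelength_nm * 1e-9)   # light frequency in Hz
    return f_light / 2 ** octaves

# Red light (~700 nm) lands near 390 Hz; violet (~400 nm) near 680 Hz.
```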

The Moves

When the eight Shaolin moves are performed, the chakra centers are activated, allowing for a connectedness between body, mind and spirit that enables a “balancing” of the energies to occur.

Why a Performance?

Performance art is an unprecedented expressive experience. Inherent in its nature is the capacity to involve the audience so that they in turn may become a part of the performance. Interactive art is a performative art that invites the audience not only to participate but, in essence, to create the art. In this performance the dancer is not merely interpreting an aural event, but “invents” the soundscape through the dance. Through a tight mapping between color, sound and movement that allows a connectedness between body, mind and spirit, Aural Aura seeks to transport the performer, and ultimately the audience, into a “healing” experience.


The healing element of Aural Aura is implicit.  The performance seeks a balancing of the energies of the body, through an alignment of the chakras using sound vibrations in conjunction with the corresponding colors, as chi kung movements activate the chakra centers.

The Future

Aural Aura can be expanded to integrate movement in three-dimensional space through a mapping of the physical location in which the performance takes place. Sensors that measure parameters such as distance, vibration and force can be coupled with multiple motion-tracking cameras to enrich the performance. Furthermore, a performance could include a group of dancers who, in an ideal user scenario, compose symphonies through their dance, using a combination of wearable sensors and movement tracked in 3D space through computer vision and sensors placed on location.


Aural Aura is a natural development of my previous project Rainbow Resonance which acted as a computer vision musical instrument.  In this project the idea of producing sound through dance is further explored as the performer generates sound intervals through a mapping of her movements.   The content of this project sprang from the desire to visualize the aura that extends from the physical human body, and to sonify the associated colors, using a mapping based on mathematics.

Symbolisms in Aural Aura

  • Star tetrahedron

The metaphysical nature of numbers as they interact to construct the shapes of the five pointed and six pointed star is an underlying theme conveyed through all aspects of this project. Attuned to the immaterial reality of sound and color, Aural Aura attempts to physically manifest the immaterial dimensions of being.

Wearable Design

In designing the form of the astral body dress I was inspired by Issey Miyake’s designs extending from the body in unique flower forms, and also by the Philips Design Probe Skin Dresses. My intention was to illuminate the astral body dress “from within” so as to achieve a “glowing effect” in the areas of the chakras. Miyake’s illuminated dresses and Hussein Chalayan’s video dresses served as an inspiration. Laurie Anderson, Pamela Z, Laetitia Sonami and Liubo Borissov inspired me with their use of wearable interfaces and sensors to produce sound and images. Arleen Schloss’s fibre optic dress by Maurice Daniel, designed in the mid-1980s, was also one of my influences.

Arleen Schloss performing in a fibre optic dress by Maurice Daniel

Probe Skin Dress by Philips Design

Issey Miyake

Issey Miyake

Issey Miyake


Liubo Borissov – Autopoiesis

Laurie Anderson – Zero and One

Pamela Z: Metalvoice

Toni Dove

Troika Ranch

Laetitia Sonami

The OpenEnded Group -  Paul Kaiser


The Taiwanese dance group Cloud Gate, who use tai chi in their performance Moonwater, was one of the most important inspirations that drove me to implement Chi Kung as a dance in Aural Aura. A beautiful solo performance by Yang LiPing entitled Moon, which uses Kung Fu moves in the dance, was another of my inspirations in designing Aural Aura.

Cloud Gate – Moonwater

Moon – Solo Dance by Yang LiPing

Thanks To

Dana Karwas

Andrea Lauer


Brian Gruber

Lower East Side Performing Arts