
Live Electronic Performance: Variable animal sounds


The demo of variable animal sounds was made in Pure Data, based on a combination of three to four different patches, as a trial of sound variation for the group electronic-performance project with Clare. The recordings cover roughly the half of the tests I personally found most interesting, and I would like to share them.

The patches mainly focus on developing loop and delay units that change the frequency of the sound, so that the animal sounds cannot be heard clearly but can still be identified during the performance. The experiment explores the possibility of noise relating to a cultural context in a narrative way.

Three patches are used, as below (a rough Processing analogue of the loop/pitch idea is sketched after the list):
2 raw variable-animal patches + 1 telegram patch.
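
Since a PD patch can't be pasted as text, here is a minimal Processing/Minim sketch of the same idea under my own assumptions (a hypothetical file name animal.wav, with Minim's TickRate standing in for the PD loop/delay units): playback speed is varied so the sound blurs but stays recognisable.

import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
FilePlayer sample;
TickRate rate;
AudioOutput out;

void setup() {
  size(200, 200);
  minim = new Minim(this);
  // "animal.wav" is a placeholder name for one of the animal recordings
  sample = new FilePlayer(minim.loadFileStream("animal.wav", 1024, true));
  sample.loop();
  // TickRate changes playback speed (and therefore pitch)
  rate = new TickRate(1.0f);
  out = minim.getLineOut();
  sample.patch(rate).patch(out);
}

void draw() {
  background(0);
  // sweep the rate so the sound is blurred but still identifiable
  rate.value.setLastValue(map(mouseX, 0, width, 0.25, 2.0));
}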


Digital Media Project: Code Experiments


Experiment: Processing & Arduino
The biggest problem I met during this period involved the values for 'translate' and 'random colour'. The image rendered unclearly when I tested the code, and it rotated around the centre of the screen instead of around its own centre. The rotation problem was solved by changing the translate X and Y values after several experiments. Besides, I used float variables 'x' and 'y' instead of exact numbers in the code for each image, to make further experiments more efficient.

Regarding the problem of random colour, I used the method '.disableStyle()', which was taught by the technical tutor allocated to this project. It removes the original colour styling from the SVG image so that a new colour value can be set in Processing.

Code of Images
import ddf.minim.*;

// Declarations inferred from the rest of the sketch; the full setup() (loading
// the SVG shapes and reading the sensor over serial) is not shown in the post.
PShape SVG01, SVG03, SVG04, SVG05, SVG06, SVG08, SVG09;
float x, y;
int sensorValue;
Minim minim;
AudioPlayer sou;

void draw() {

background(255);
fill(0);
pushMatrix(); // save the current coordinate system to the stack
// translate to the center of screen
translate(width/2, height/2);
// rotate everything when the frameCount adds up
rotate(frameCount*0.01);

// small flower
if (sensorValue>300 && sensorValue<=400) {
fill(0, 180);
SVG09.disableStyle();
shape(SVG09, x+10, y+10, sensorValue+15, sensorValue+15);
SVG01.disableStyle();
shape(SVG01, x+10, y+10, sensorValue+15, sensorValue+15);
}

// bigger flower
else if (sensorValue>400 && sensorValue<=500) {
fill(0, 180);
SVG03.disableStyle();
shape(SVG03, x-20, y-20, sensorValue+20, sensorValue+20);
SVG04.disableStyle();
shape(SVG04, x-30, y-30, sensorValue+25, sensorValue+25);
}

// random colour
else if (sensorValue>500 && sensorValue<=700) {

// Audio triggered: load and start the file only once, otherwise
// draw() would reload and restart it on every frame
if (minim == null) {
  minim = new Minim(this);
  sou = minim.loadFile("ESA_installation.wav");
  sou.play();
}
fill(random(0, 100), random(0, 100), random(0, 100), 80);
SVG05.disableStyle();
shape(SVG05, x-60, y-60, sensorValue+60, sensorValue+60);
fill(random(100, 200), random(100, 200), random(100, 200), 150);
SVG06.disableStyle();
shape(SVG06, x-50, y-50, sensorValue+35, sensorValue+35);
fill(random(200, 255), random(200, 255), random(200, 255), 50);
SVG08.disableStyle();
shape(SVG08, x-60, y-60, sensorValue+40, sensorValue+40);
}

// else if (sensorValue>700 && sensorValue<=800) {
// fill(random(0, 100), random(0, 100), random(0, 100), 100);
// }

else {
fill(255, 150);
SVG08.disableStyle();
shape(SVG08, x+30, y+30, sensorValue+300, sensorValue+300);
}

popMatrix(); // restores the prior coordinate system
}

The images below show the testing without colour variation after connecting to the Arduino. The shape changes with the variation of sensorValue (the light intensity captured by a photoresistor). This series of screenshots is from before the rotation problem was solved, and it also shows another problem: a delay in the rotation response, caused by loading too many pictures in Processing.

Arduino Prototype
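
The post doesn't include the link between the Arduino and the sketch, but a minimal version of how sensorValue could arrive in Processing might look like this (assuming the Arduino prints one photoresistor reading per line at 9600 baud):

import processing.serial.*;

Serial port;
int sensorValue = 0;

void setup() {
  size(200, 200);
  // open the first available serial port; adjust the index for your setup
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n'); // fire serialEvent() once per line
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line != null) {
    sensorValue = int(trim(line)); // one photoresistor reading per line
  }
}

void draw() {
  background(255);
  fill(0);
  text(sensorValue, 20, 20); // quick check that readings arrive
}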

Experiment: Pure Data – Gem Patch
After almost finishing the modifications to the image code, I was still not satisfied with the effect, in terms of either the dull outcome or the incomplete expression of the concept. The problem of how to improve the project technically annoyed me for nearly a week, until I came across the book Art of the Digital Age; an installation artwork in it inspired me to keep building the project up, which will be introduced in the idea & inspiration post. In doing so, I referenced an embodied-interaction tutorial on YouTube and made some slight changes to the values, including the shapes. The effect is shown below: it can capture the colour of an object or of the surroundings through the webcam.
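
The Gem patch itself can't be shown as text; as a rough Processing analogue (my sketch, not the patch), the "capture the colour of an object via webcam" behaviour could look like this with the video library:

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  image(cam, 0, 0);
  // sample the colour at the centre of the frame and show it as a swatch
  color c = cam.get(width/2, height/2);
  fill(c);
  rect(20, 20, 80, 80);
}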



Reference
morefun4art (2012) Embodied Interaction – Puredata lesson016a. Available at: YouTube [Accessed 20 April 2014].
Wands, B. (2006) Art of the Digital Age. New York: Thames & Hudson.

Live Electronic Performance: Noise sounds like telegram


Pure Data is pretty fun in that it lets you quickly try any possibility of linking objects and changing values, which gives me less pressure and more motivation to experiment with noise. This is a test of how to make noise with a low/medium pitch; the outcome sounds like a combination of telegraph signals and whispering data, which raises further ideas about improving the theme of the duo performance. I still need to discuss it with my performance partner next week.
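
As a rough Processing/Minim sketch of that telegraph-like keying (my own analogue of the sound, not the PD patch): a low/medium-pitch square wave is randomly gated on and off.

import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
AudioOutput out;
Oscil tone;

void setup() {
  size(200, 200);
  minim = new Minim(this);
  out = minim.getLineOut();
  // low/medium-pitch square wave, started silent
  tone = new Oscil(220, 0.0f, Waves.SQUARE);
  tone.patch(out);
}

void draw() {
  // every few frames, randomly key the tone on (dot/dash) or off (silence)
  if (frameCount % 6 == 0) {
    tone.setAmplitude(random(1) < 0.5 ? 0.4f : 0.0f);
  }
}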

Experiment: Quartz Composer with Kinect



It is rather exhausting to record that, after trying Processing and Pure Data, I found some limits to interactive visual expression in terms of particle forms, even though both have advantages in different aspects, such as audio or the flexibility of editing code.

I came across Quartz Composer when I was searching for particle systems with the Kinect, and then, I have to say, I tried yet another piece of software… Actually, I reckon this experiment is a good start; working with the data stream really saves me time in getting a quick try going and having fun playing with these things.

Testing 1:
(1) add an image
(2) change the X/Y position and gravity.

Testing 2: Hand tracking
(1) connect QC with the Kinect (a connection tutorial can be found on YouTube)
(2) change the parameters

Testing 3: Hand tracking without body shape
A try at adding another library patch into it. (A rough Processing stand-in for the particle behaviour in these tests is sketched below.)
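
QC compositions are node-based and can't be pasted as text, so as a stand-in, here is a rough Processing sketch of the basic behaviour in these tests (particles with gravity pulled toward a tracked point), with the mouse in place of the Kinect hand position. This is my own sketch, not the QC patch:

ArrayList<PVector> pos = new ArrayList<PVector>();
ArrayList<PVector> vel = new ArrayList<PVector>();

void setup() {
  size(640, 480);
  noStroke();
}

void draw() {
  background(0);
  // emit a few new particles at the top of the screen each frame
  for (int i = 0; i < 3; i++) {
    pos.add(new PVector(random(width), 0));
    vel.add(new PVector(0, 0));
  }
  PVector hand = new PVector(mouseX, mouseY); // mouse stands in for the tracked hand
  for (int i = pos.size() - 1; i >= 0; i--) {
    PVector p = pos.get(i);
    PVector v = vel.get(i);
    v.y += 0.05;                         // gravity
    PVector pull = PVector.sub(hand, p); // attraction toward the tracked point
    pull.setMag(0.08);
    v.add(pull);
    p.add(v);
    fill(255, 180);
    ellipse(p.x, p.y, 4, 4);
    if (p.y > height + 10) {             // cull particles that fall off-screen
      pos.remove(i);
      vel.remove(i);
    }
  }
}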

I will keep learning QC and PD after doing this series of experiments with different software; the data-stream approach is really suitable for people with no computer-language background, helping build the confidence to keep going to the next step.

Updated (24th January 2014)

Testing 4: Blur function with mouse
A try based on a template: a black-and-white function has been added, and the image can be moved with the mouse.

Testing 5: Blur function with video camera
Make the video camera feed show on the surfaces of a cube.

Updated (26th January 2014)

Testing 6: Silhouette via Kinect in QC
I am still not satisfied with the final effect after this series of experiments, since the silhouette effect remains a problem in this case.

Updated (27th January 2014)

Testing 7: Hand tracking via Kinect in QC
This might be the last test of the Kinect project for Semester 1, and it shows the concept of the elapse of time. I changed the data flow of the particles and the parameters of Quartz Composer's 'kinect test' composition, such as gravity and attraction, so that circles radiate outward when the user moves both hands. In the end I feel a bit disappointed about the unsuccessful experiments; on the other hand, sincere thanks to those who supported us so much with Processing and Kinect while we were doing this series of experiments for the group project. By the way, it will be a good start to Semester 2 that the sessions involve learning Processing and doing homework with it.

Bibliography
Challinor, R. (2011) Synapse for Kinect Quartz Composer Tutorial. Available at: YouTube [Accessed 21 January 2014].
Robinson, G. and Buchwald, S. (2012) Learning Quartz Composer: A Hands-On Guide to Creating Motion Graphics with Quartz Composer. Addison-Wesley.

Experiment: Motion (Kinect) via Processing


Making Things See (Borenstein 2013), recommended by Ping, is a nice guidebook that helps me understand the code step by step, but some of its code no longer works in the latest Processing, which sometimes leaves me really confused. Still, it always brings a sense of victory when an experiment finally works.

I am trying to work out how to add the particle effect to the silhouette and how to set an image background, but neither works at the moment. However, a motion effect appeared by coincidence after I added some code from the book to a SimpleOpenNI example (another attempt to get things working from an existing sketch!), though it fills the screen crazily within a very short time… fair enough… I will try again tomorrow. Now it's time to have a rest…
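
For reference, the SimpleOpenNI examples I was modifying build on a skeleton roughly like this minimal depth-image sketch (assuming the SimpleOpenNI library and a connected Kinect):

import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth(); // turn on the Kinect depth stream
}

void draw() {
  context.update();                  // grab the latest frames from the Kinect
  image(context.depthImage(), 0, 0); // draw the depth map
}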

Experiment: Particle effect testing (Pure Data)


Sharing a fairly basic practice for testing particles, made after working through the PD tutorials last week. The final effect for the group project will mainly be handled by our group member Wengchang, who is familiar with After Effects. Although we focus on combining virtual elements and photography in this project, the three of us are still learning interactive techniques for further development, whether in the project work or in individual work.

Here I have a try at whether an interactive effect between the audience and the particles can be achieved, though I still get stuck at the interactive step for lack of sufficient skills. (The camera effect was combined by coincidence; I don't quite understand some of the code in it, but it seems a good combination!)

An idea for the individual project comes from this testing: could I use PD's GEM to achieve a synchronised two-layer variation with sounds?

Testing:

Reference
morefun4art (2012) VA3460 Embodied Interaction. Available at: YouTube [Accessed 26 December 2013].
Hotchkiss, J. (2010) Motion detection to MIDI with Pure Data. Available at: http://hotchk155.blogspot.co.uk/2010/05/webcam-motion-detection-to-midi-with.html/ [Accessed 26 December 2013].

Experiment: Testing (Processing)


Here is a basic practice: the shape can be changed by the mouse (a minimal sketch in the same spirit follows below). Some parameters were amended after understanding the principles of Processing from online tutorials during this afternoon's break…
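
The original code isn't included in the post, so this is my own minimal reconstruction of the idea, with the mouse driving the size and the number of sides of a polygon:

void setup() {
  size(400, 400);
  noFill();
}

void draw() {
  background(255);
  translate(width/2, height/2);
  // the mouse position drives the number of sides and the radius
  int sides = int(map(mouseX, 0, width, 3, 12));
  float radius = map(mouseY, 0, height, 40, 180);
  beginShape();
  for (int i = 0; i < sides; i++) {
    float a = TWO_PI * i / sides;
    vertex(cos(a) * radius, sin(a) * radius);
  }
  endShape(CLOSE);
}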

Speaking of developing the idea, I am trying to combine the concept of landscape (landscape/geography/mapping) with geometrical shapes, and to learn more about varying a geometrical image through the audience's movement. Anyway, I am still highly interested in geometrical things; a geometrical image will be used instead of the cubes of the previous concept, but the two-layer concept is kept. The project is still about exploring the combination of people and their surroundings.

I have some ideas about these two layers: the outer one will be a dynamic web like a geometrical mountain (the geometrical image), and the inner one will be the audience's face, varying via the PC camera (if it can be achieved… I am planning to ask my tutor for advice…).

The effects: