Yinzhen Bao :)

May 24, 2014

Live Electronic Performance: Variable animal sounds

Filed under: DMS8012,Sharing,Testing — yinzhenbhao @ 7:10 PM

The demo of variable animal sounds was made in Pure Data, based on a combination of three to four different patches, as a trial of sound variation for the group electronic performance project with Clare. The recordings were made partly out of personal interest during testing, and I would like to share them.

The patches mainly focus on developing loop and delay units that change the frequency of the sounds, so that the animal sounds cannot be heard clearly but can still be identified during the performance. The experiment explores the possibility of noise related to a cultural context in a narrative way.

Three patches are used, as below:
two raw variable-animal patches + one telegram patch.
[Image: B08new]
[Image: G09new]
[Image: QQ图片20140528205028]
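
Pd patches are graphs rather than text, so they do not paste well here. As a rough Processing/Minim analogue of the loop-and-delay idea (the filename, delay times and feedback amount are my own placeholders, not values from the actual patches):

import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
FilePlayer sample; // plays the animal recording on a loop
Delay echo;        // delay line; sweeping its time smears the pitch

void setup() {
  size(200, 200);
  minim = new Minim(this);
  // "animal.wav" is a stand-in name for one of the animal recordings
  sample = new FilePlayer(minim.loadFileStream("animal.wav"));
  sample.loop();
  // up to 0.6 s of delay, half-amplitude feedback, pass the dry signal through
  echo = new Delay(0.6, 0.5, true, true);
  sample.patch(echo).patch(minim.getLineOut());
}

void draw() {
  background(0);
  // sweep the delay time so the repeats detune and blur the source
  float t = map(sin(frameCount * 0.02), -1, 1, 0.05, 0.5);
  echo.setDelTime(t);
}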

April 21, 2014

Digital Media Project: Experiment of code

Filed under: DigitalMediaProject,DMS8013,Testing — yinzhenbhao @ 1:35 AM

Experiment: Processing & Arduino
The biggest problem I met during this period was the values for ‘translate’ and ‘random colour’. The image rendered unclearly when I tested the code, and it would not rotate around its own centre but instead turned around the centre of the screen. The rotation problem was solved by changing the translate X and Y values after several experiments. Besides, I used the floats ‘x’ and ‘y’ instead of exact numbers in the code for each image, to make further experiments more efficient.
[Image: QQ图片20140421015231]
Regarding the problem of random colour, I used ‘disableStyle()’, which the technical tutor allocated to this project taught me. This call removes the colour stored in the SVG image so that a new colour value can be set in Processing.

Code of Images
import ddf.minim.*; // audio library used below

Minim minim;
AudioPlayer sou;
// SVG shapes, loaded with loadShape() in setup() (setup is not shown in this post)
PShape SVG01, SVG03, SVG04, SVG05, SVG06, SVG08, SVG09;
float x, y;      // position offsets for each image
int sensorValue; // photoresistor reading sent from the Arduino

void draw() {

  background(255);
  fill(0);
  pushMatrix(); // save the current coordinate system to the stack
  // translate to the centre of the screen
  translate(width/2, height/2);
  // rotate everything as frameCount increases
  rotate(frameCount*0.01);

  // small flower
  if (sensorValue>300 && sensorValue<=400) {
    fill(0, 180);
    SVG09.disableStyle();
    shape(SVG09, x+10, y+10, sensorValue+15, sensorValue+15);
    SVG01.disableStyle();
    shape(SVG01, x+10, y+10, sensorValue+15, sensorValue+15);
  }

  // bigger flower
  else if (sensorValue>400 && sensorValue<=500) {
    fill(0, 180);
    SVG03.disableStyle();
    shape(SVG03, x-20, y-20, sensorValue+20, sensorValue+20);
    SVG04.disableStyle();
    shape(SVG04, x-30, y-30, sensorValue+25, sensorValue+25);
  }

  // random colour
  else if (sensorValue>500 && sensorValue<=700) {

    // Audio triggered: load the file once, then restart it when it finishes
    // (loading it on every frame would stack new players endlessly)
    if (minim == null) {
      minim = new Minim(this);
      sou = minim.loadFile("ESA_installation.wav");
    }
    if (!sou.isPlaying()) {
      sou.rewind();
      sou.play();
    }
    fill(random(0, 100), random(0, 100), random(0, 100), 80);
    SVG05.disableStyle();
    shape(SVG05, x-60, y-60, sensorValue+60, sensorValue+60);
    fill(random(100, 200), random(100, 200), random(100, 200), 150);
    SVG06.disableStyle();
    shape(SVG06, x-50, y-50, sensorValue+35, sensorValue+35);
    fill(random(200, 255), random(200, 255), random(200, 255), 50);
    SVG08.disableStyle();
    shape(SVG08, x-60, y-60, sensorValue+40, sensorValue+40);
  }

  // else if (sensorValue>700 && sensorValue<=800) {
  //   fill(random(0, 100), random(0, 100), random(0, 100), 100);
  // }

  else {
    fill(255, 150);
    SVG08.disableStyle();
    shape(SVG08, x+30, y+30, sensorValue+300, sensorValue+300);
  }

  popMatrix(); // restore the prior coordinate system
}

The images below show the test without colour variation after connecting to the Arduino. The shape changes with the variation of sensorValue (the light intensity captured by the photoresistor). This series of screenshots is from the version before the problem of rotating around the image's own centre was solved, and it also showed another problem: a delay in the rotation response caused by loading too many pictures in Processing.

[Image: QQ图片20140421015529]
[Image: QQ图片20140421015430]
[Image: QQ图片20140421015458]

Arduino Prototype
[Image: IMG_20140324_154931]
[Image: IMG_20140403_200158]
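
For reference, the Processing side receives sensorValue from the Arduino over serial. The post does not include that part, so this is only a guess at how it might look (the port index, 9600 baud and one-reading-per-line format are all assumptions):

import processing.serial.*;

Serial port;
int sensorValue = 0;

void setup() {
  size(600, 600);
  // Serial.list()[0] is a guess; pick whichever port the Arduino is on
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n'); // fire serialEvent() once per line
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line != null) {
    sensorValue = int(trim(line)); // e.g. the Arduino prints analogRead(A0)
  }
}

void draw() {
  background(255);
  fill(0);
  text(sensorValue, 20, 20); // just show the incoming reading
}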

Experiment: Pure Data – Gem Patch
After almost finishing the image code, I was still not satisfied with the effect, in terms of either the dull outcome or the incomplete expression of the concept. The problem of how to improve the project technically annoyed me for nearly a week, until I came across the book Art of the Digital Age, and an installation artwork (Transgenic Net Installation, Kac, 1999) inspired me to keep building it up; I will introduce it in the related reference post. In doing so, I referenced an embodied interaction tutorial on YouTube and made some slight changes to the values, including the shapes. The effect, shown below, is that it captures the colour of an object or the surroundings via the webcam.
[Image: pdGem]
[Image: QQ图片20140421003822]
[Image: QQ图片20140421004659]
[Image: QQ图片201404210112562]
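
The Gem patch itself is visual, but the core idea (sampling colour from the webcam feed and reusing it) can be sketched with Processing's video library. This is an analogue of the idea only, not the patch:

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  image(cam, 0, 0);
  // sample the colour under the mouse and paint a swatch with it
  color c = cam.get(mouseX, mouseY);
  fill(c);
  noStroke();
  rect(width - 60, 10, 50, 50);
}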

Reference
morefun4art (2012) Embodied Interaction – Puredata lesson016a. Available at: YouTube [Accessed 20 April 2014].
Wands, B. (2006) Art of the Digital Age. New York: Thames & Hudson.

March 22, 2014

DMS8012: Noise sounds like telegram

Filed under: DMS8012,GroupProject,Testing — yinzhenbhao @ 8:59 PM

Pure Data is pretty fun in that it lets you quickly try any possibility of linking objects and changing values, which gives me less pressure and more motivation to experiment with noise. This is a test of how to make noise with a low/medium pitch. The outcome sounds like a combination of telegraphy and whispering data, which gives me further ideas about improving the theme of the duo performance; I still need to discuss it with my partner next week.

[Audio: telegram]
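
The patch is visual, so here is only a very rough Processing/Minim analogue of the sound (a tone keyed on and off like telegraph dits over a quiet low hum; all frequencies and timings are made up):

import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
AudioOutput out;
Oscil tone; // the "telegraph" beeps
Oscil hum;  // a quiet low-pitch layer underneath

void setup() {
  size(200, 200);
  minim = new Minim(this);
  out = minim.getLineOut();
  tone = new Oscil(660, 0, Waves.SINE);  // starts silent
  hum  = new Oscil(80, 0.1, Waves.SINE); // constant low whisper
  tone.patch(out);
  hum.patch(out);
}

void draw() {
  background(0);
  // every few frames, randomly key the tone on or off, like Morse dits
  if (frameCount % 6 == 0) {
    tone.setAmplitude(random(1) < 0.4 ? 0.3 : 0);
  }
}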

Note:
1. The idea of the electric telegraph can be traced back to the middle of the eighteenth century but had not been adequately developed. (Corby 2006, p12)

2. […] The invention of the telegraph is not just a major technological development; it is also Samuel Morse’s most notable contribution to the history of art. His paintings have been to a large extent forgotten. (Corby 2006, p12)

3. […] More directly, his invention and others that were to come later, such as the telephone, film, television, video and the computer, opened up new possibilities for the production of art and ideas concerning its role. This can be seen early on in the work of the Futurists, and Moholy Nagy’s Telephone paintings. Later developments such as John Cage’s use of radios and Ray Johnson, amongst others; and ultimately in the first substantial use of mass media technologies in video art, computer and cybernetic art, and of course, the recent development of net art and other practices intended for the Web. (Corby 2006, p12, p13)

Corby, T. (ed.) (2006) Network Art: Practices and Positions. New York: Routledge.

January 21, 2014

Experiment: Quartz Composer with Kinect

Filed under: GroupProject,Testing — yinzhenbhao @ 9:10 PM

[Image: Screen Shot 2014-01-27 at 11.58.15]
It is honestly tiring to record that, after trying Processing and Pure Data, I found some limits to interactive visual expression in terms of particle forms, even though both have advantages in different aspects, such as audio or the flexibility of editing code.

I came across Quartz Composer while searching for particle systems for the Kinect, and I have to say I ended up trying yet another piece of software… Actually, I reckon this experiment is a good start; building with data streams really saves time when trying things out and playing with this stuff.

Testing 1:
(1) add image
(2) change the X/Y position and gravity.
[Image: Screen Shot 2014-01-21 at 15.23.29]

Testing 2: Hand tracking
(1) connect QC with the Kinect (a tutorial on connecting can be found on YouTube)
(2) change the parameters.
[Image: Screen Shot 2014-01-21 at 16.10.20]

Testing 3: Hand tracking without body shape
A try at adding another library patch into it.
[Image: Screen Shot 2014-01-21 at 16.43.15]

I will keep learning QC and PD after this series of experiments with different software. Data-stream programming really suits those with no background in computer languages, helping to build the confidence to keep going to the next step.

Updated (24th January 2014)

Testing 4: Blur function with mouse
A try based on a template – a black-and-white function has been added, and the image can be moved with the mouse.
[Image: Screen Shot 2014-01-22 at 16.43.57]

Testing 5: Blur function with video camera
Making the video camera feed show on the surfaces of a cube.
[Image: Screen Shot 2014-01-24 at 16.02.54]

Updated (26th January 2014)

Testing 6: Silhouette via Kinect in QC
I am still not satisfied with the final effect after this series of experiments, since the silhouette effect remains a problem in this case.
[Image: Screen Shot 2014-01-26 at 15.34.25]

[Image: Screen Shot 2014-01-26 at 15.33.41]

Updated (27th January 2014)

Testing 7: Hand tracking via Kinect in QC
This might be the last test of the Kinect project for Semester 1, and it shows the concept of the elapse of time. I changed the particles' data flow and the parameters of Quartz Composer's ‘kinect test’ example, such as gravity and attraction, so that circles spread radially when the user moves both hands. In the end I feel a bit disappointed by the unsuccessful experiments; on the other hand, sincere thanks to those who gave us so much support with Processing and Kinect while we were doing this series of group-project experiments. By the way, Semester 2 will be a good start, since the sessions involve learning Processing and doing homework with it.

[Image: Screen Shot 2014-01-27 at 11.57.09]
[Image: Screen Shot 2014-01-27 at 11.58.15]
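
The QC particle system is configured visually, but the gravity-and-attraction behaviour can be sketched in Processing, with the mouse standing in for the tracked hand. A rough sketch of the idea, not the actual patch:

int n = 200;
PVector[] pos = new PVector[n];
PVector[] vel = new PVector[n];

void setup() {
  size(640, 480);
  for (int i = 0; i < n; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = new PVector();
  }
  noStroke();
}

void draw() {
  background(0);
  PVector hand = new PVector(mouseX, mouseY); // stand-in for the tracked hand
  fill(255, 120);
  for (int i = 0; i < n; i++) {
    // attraction toward the hand, plus a little downward gravity
    PVector pull = PVector.sub(hand, pos[i]);
    pull.setMag(0.3);
    vel[i].add(pull);
    vel[i].y += 0.02; // gravity
    vel[i].mult(0.95); // drag so the swarm settles
    pos[i].add(vel[i]);
    ellipse(pos[i].x, pos[i].y, 6, 6);
  }
}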

Bibliography
Challinor, R. (2011) Synapse for Kinect Quartz Composer Tutorial. Available at: YouTube (Accessed: 21 January 2014).
Robinson, G. and Buchwald, S. (2012) Learning Quartz Composer: A Hands-On Guide to Creating Motion Graphics with Quartz Composer. Addison-Wesley.

January 15, 2014

Experiment: Motion (Kinect) via Processing

Filed under: Testing — yinzhenbhao @ 11:04 PM

‘Making Things See’ (Borenstein 2013), recommended by Ping, is a nice guidebook that helps me understand the code step by step, but some of its code no longer works in the latest Processing, which sometimes leaves me really confused. Still, it always brings a sense of victory when an experiment finally works.

I am trying to work out how to add the particle effect to the silhouette and how to set an image background, but neither works at the moment. However, a motion effect appeared by coincidence after I added some code from the book into a SimpleOpenNI example (another attempt to build on something that already works!), though it covers the screen crazily within a very short time… fair enough… I will try again tomorrow. Now it’s time to have a rest…

[Image: Screen Shot 2014-01-15 at 21.59.03]

[Image: Screen Shot 2014-01-15 at 22.00.35]
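
For context, a minimal SimpleOpenNI sketch in the spirit of the examples I was modifying (it only draws the Kinect depth image; the particle and background experiments would sit on top of something like this):

import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth(); // depth stream from the Kinect
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0); // greyscale depth map
}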

December 26, 2013

Experiment: Particle effect testing (Pure Data)

Filed under: GroupProject,Testing — yinzhenbhao @ 9:41 PM

Sharing a fairly basic practice for testing particles, made after working through the PD tutorials last week. The final effect for the group project will mainly be handled by group member Wengchang, who is familiar with After Effects. Although we focus on combining virtual elements and photography in this project, the three of us are still learning interactive techniques for further development, whether in the project work or individually.

Here I have a try at making an interactive effect between the audience and the particles possible, though I still get stuck at the interactive step for lack of sufficient skills. (The camera effect came in by coincidence; I don't quite understand some of the code in it, but it seems a good combination!)

An idea for the individual project comes from this testing: could I use PD Gem to achieve a synchronised two-layer variation together with sound?

Testing:

[Video: particles_pd_testing]
[Video: particle_testing_1]
[Video: particle_tesing2]
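
The camera part follows the usual frame-differencing idea: compare each webcam frame with the previous one and treat large pixel changes as motion. A rough Processing sketch of that idea (the sampling step and threshold are arbitrary), not the Pd patch itself:

import processing.video.*;

Capture cam;
PImage prev;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  prev = createImage(width, height, RGB);
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  cam.loadPixels();
  prev.loadPixels();
  float moved = 0;
  for (int i = 0; i < cam.pixels.length; i += 10) { // sample every 10th pixel
    float diff = abs(brightness(cam.pixels[i]) - brightness(prev.pixels[i]));
    if (diff > 40) moved++; // threshold of 40 is arbitrary
  }
  prev.copy(cam, 0, 0, width, height, 0, 0, width, height);
  image(cam, 0, 0);
  fill(255, 0, 0, 150);
  float d = min(moved * 0.05, 300); // circle grows with the amount of motion
  ellipse(width/2, height/2, d, d);
}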

Reference
morefun4art (2012) VA3460 Embodied Interaction. Available at: YouTube [Accessed 26 December 2013].

Hotchkiss, J. (2010) Motion detection to MIDI with Pure Data. Available at: http://hotchk155.blogspot.co.uk/2010/05/webcam-motion-detection-to-midi-with.html/ [Accessed 26 December 2013].

December 10, 2013

Experiment: Testing (Processing)

Filed under: Testing — yinzhenbhao @ 2:46 PM

Here is a basic practice – the shape can be changed with the mouse. Some parameters were amended after learning the principles of Processing from online tutorials during this afternoon's break…
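
The sketch itself is not shown here, but something in this spirit is all it takes (the sizes and colours are placeholders of mine):

void setup() {
  size(400, 400);
  noStroke();
}

void draw() {
  background(20);
  fill(200, 120);
  // the shape's position and size follow the mouse
  ellipse(mouseX, mouseY, mouseX * 0.3 + 10, mouseY * 0.3 + 10);
}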

Speaking of developing the idea, I am trying to combine the concept of landscape (landscape/geography/mapping) with geometrical shapes, and to learn more about varying a geometrical image through the audience's movement. Anyway, I am still highly interested in geometrical things; a geometrical image will be made instead of cubes (the previous concept), but the two-layer concept is kept. The project is still about exploring the combination of people and their surroundings.

I have some ideas about these two layers: the outer one will be a dynamic web like a geometrical mountain (a geometrical image), and the inner one will be the audience's face, varied via the PC camera (if it can be achieved… I am planning to ask my tutor for advice…).

The effects:

[Images: effects 1, 2 and 3]
