Yinzhen Bao :)

September 21, 2014

Final Project: Setting up

Filed under: PersonalProject — yinzhenbhao @ 12:39 PM

Unfortunately, the black-and-white televisions could not be connected to the computers, so Plan B comes in.

I would still like to use the B/W televisions as a creative medium in this piece; they will carry the literal, embodied signal noise. In addition, three 24″ screens driven by the computers will output images reflecting the variable noise in the space.

Many thanks to the technical team in Cultural Lab.

IMG_20140919_150852
IMG_20140919_150944

August 20, 2014

Final Project: Three visions

Filed under: PersonalProject — yinzhenbhao @ 5:20 PM

Step by step, after figuring out what I am making, can I call this project ‘The illusion of time and you’?

To some extent, it looks like an advanced version of our group project on the theme of ‘time’, approaching time, space and individuals through interaction…

Vision 1 – B/W time-delayed image:

1
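
As a minimal sketch of the time-delay idea behind Vision 1 (not the actual patch), the Processing fragment below keeps recent camera frames in a ring buffer and always draws the oldest one in black and white. It assumes the Processing video library is installed; the camera size and the ~3-second buffer length are placeholder values, not the final settings.

import processing.video.*; // Processing video library for camera capture

Capture cam;
PImage[] buffer = new PImage[90]; // roughly 3 seconds of frames at 30 fps (placeholder length)
int writeIndex = 0;

void setup() {
size(640, 480);
cam = new Capture(this, 640, 480);
cam.start();
}

void draw() {
background(0);
if (cam.available()) {
cam.read();
buffer[writeIndex] = cam.get(); // store a copy of the current frame
writeIndex = (writeIndex + 1) % buffer.length;
}
// the slot about to be overwritten holds the oldest frame in the buffer
PImage delayed = buffer[writeIndex];
if (delayed != null) {
delayed.filter(GRAY); // B/W look
image(delayed, 0, 0, width, height);
}
}

The buffer length sets how far behind real time the image appears on screen.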

Vision 2 – random particles with time-delayed image (responding to sound feedback):

QQ图片20140816140345

QQ图片20140816140557

QQ图片20140816140445

QQ图片20140816140641

QQ图片20140816140225

2-2
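
The screenshots above come from the Pd/GEM tests. Purely as a sketch of the sound-feedback part (not the actual patch), the Processing/Minim fragment below scatters more random particles when the incoming sound is louder; the microphone input and the mapping range are assumptions for the example, not the values used in the piece.

import ddf.minim.*; // Minim audio library, already used elsewhere in the project

Minim minim;
AudioInput in;

void setup() {
size(800, 600);
minim = new Minim(this);
in = minim.getLineIn(Minim.MONO, 512); // microphone input (placeholder source)
background(0);
}

void draw() {
noStroke();
fill(0, 20);
rect(0, 0, width, height); // fade previous frames for a trailing effect
float level = in.mix.level(); // loudness of the current buffer
int count = int(map(level, 0, 0.3, 5, 200)); // louder input -> more particles
for (int i = 0; i < count; i++) {
fill(255, random(50, 200));
ellipse(random(width), random(height), 3, 3);
}
}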

Vision 3 – variegated image (responding to sound feedback):

QQ图片20140816141501

QQ图片20140816141049

QQ图片20140816141411

Reference

Chung, C. (2013) Multimedia Programming with Pure Data. Packt Publishing.

 

August 2, 2014

Final Project: Inspiration of screens

Filed under: PersonalProject — yinzhenbhao @ 10:43 AM

The sketch of the prototype

IMG_20140802_111346

After talking with my tutor, who suggested I think about issues that might arise when building the technical part, I changed my mind about using projections and cloth, and decided instead to use old televisions, with both the interactive experience and the retrospective development of the technology in mind.

The idea is built on the triptych (the triptych form arises from early Christian art – Wikipedia), which historically carried religious content. This art form inspired me to use three different forms of ‘one’ to create a multi-layered visual experience through interaction.

beffi_lg

Master of the Beffi Triptych, The Madonna and Child with Scenes from the Life of Christ and the Virgin (The Beffi Triptych), early fifteenth century

HaywainTriptych

The Haywain, Escorial

old_television-photoshop

Example of old television

June 19, 2014

Share: Heart Chamber Orchestra – Pixelache

Filed under: Sharing — yinzhenbhao @ 10:57 AM

More info: The Heart Chamber Orchestra

May 24, 2014

Live Electronic Performance: Variable animal sounds

Filed under: DMS8012,Sharing,Testing — yinzhenbhao @ 7:10 PM

The demo of variable animal sounds was made in Pure Data, based on a combination of three to four different patches, as a trial of sound variation for the group electronic performance project with Clare. The recordings were made partly out of personal interest during testing, and I would like to share them.

The patches mainly focus on developing loop and delay units that change the frequency of the sound, so that the animal sounds cannot be heard clearly but can still be identified during the performance. The experiment explores how noise can relate to a cultural context in a narrative way (a rough sketch of the loop-and-delay idea follows the patch screenshots below).

Using 3 patches as below:
2 raw variable animal patches + 1 telegram patch.
B08new

G09new
QQ图片20140528205028
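
Since the patches are only shown as screenshots, here is a minimal Processing/Minim analogue of the loop-and-delay idea (not the actual Pd patches): a looped animal recording runs through a playback-rate control and a feedback delay, so the sound is blurred but still recognisable. The file name "animal.wav", the delay settings and the rate range are placeholders.

import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
FilePlayer animal;
TickRate rate; // changes playback speed, and with it the perceived pitch
Delay echo;    // feedback delay smears the sound

void setup() {
size(200, 200);
minim = new Minim(this);
animal = new FilePlayer(minim.loadFileStream("animal.wav")); // placeholder file name
rate = new TickRate(1.0);
echo = new Delay(0.6, 0.5, true, true); // max delay 0.6 s, 50% feedback, pass the dry signal
animal.patch(rate).patch(echo).patch(minim.getLineOut());
animal.loop(); // loop the recording
}

void draw() {
// slowly wander the playback rate between 0.5x and 1.5x so the animal sound
// drifts in and out of recognisability
float r = map(noise(frameCount * 0.005), 0, 1, 0.5, 1.5);
rate.value.setLastValue(r);
echo.setDelTime(map(mouseX, 0, width, 0.05, 0.6)); // mouse varies the delay time
}

The performance itself uses Pd loop and delay units rather than Minim; this only approximates the listening effect described above.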

May 15, 2014

Digital Media Project: A Moonish Installation

Filed under: DigitalMediaProject,DMS8013,PersonalProject — yinzhenbhao @ 1:29 AM

Documentation of A Moonish Installation

Flashy Version (the version with ringing sound)


Creative Arts Practice for Digital Media Project 2014.

This interactive installation is controlled by an Arduino and a USB camera, which monitor the light level. If too much light is blocked above the installation, it ‘requests’ light by ringing and randomly changing the colour of the rotatable flower to attract the participant’s attention. It shows the opposite colour of the current light condition (white when it demands darkness, black when it needs a light source), so that the installation can be captured in either state, and it impels the participant to respond as part of the creation.

The code was written in Processing and Pure Data, and the output takes two different visual forms through interaction. The installation was designed to invite participants to ponder perception and interaction, immersion and embodiment, the partial and the whole, through digital art and throughout the interactive process.

—————————————————————————————————————-
Project Code:

Processing Part:

import ddf.minim.spi.*;
import ddf.minim.signals.*;
import ddf.minim.*;
import ddf.minim.analysis.*;
import ddf.minim.ugens.*;
import ddf.minim.effects.*;

import processing.serial.*; // import the Serial library
Serial myPort;

Minim minim;
AudioPlayer sou; // variable name

float sensorValue;
float angle = 0;
int x = -142;
int y = -112;
int smallPoint, largePoint;

PImage img;

PShape SVG01;
PShape SVG02;
PShape SVG03;
PShape SVG04;
PShape SVG05;
PShape SVG06;
//PShape SVG07;
PShape SVG08;
PShape SVG09;
PShape SVG51;
PShape SVG61;
PShape SVG81;
PFont font; // showing the sensorValue

void setup() {

// font = loadFont("Serif-24.vlw");
size(1300, 800);

// println(Serial.list()); // print a list of available serial ports
/* better way to print this
for (int i=0;i<Serial.list().length;i++) {
println("["+i+"]"+Serial.list()[i]);
}
*/

myPort = new Serial(this, "COM7", 9600);
myPort.clear(); // Empty the buffer, removes all the data stored there.
myPort.bufferUntil('\n'); // Throw out the first reading, in case we started reading
// in the middle of a string from the sender. (start buffering until "\n" is read)

minim = new Minim(this);
sou = minim.loadFile("ESA_installation.wav"); // load the ringing sound once; it is triggered in draw()

smooth();
SVG01= loadShape("img1.svg");
SVG02= loadShape("img2.svg");
SVG03= loadShape("img3.svg");
SVG04= loadShape("img4.svg");
SVG05= loadShape("img5.svg");
SVG06= loadShape("img6.svg");
// SVG07= loadShape("img7.svg");
SVG08= loadShape("img8.svg");
SVG09= loadShape("img0.svg");
SVG51= loadShape("img51.svg");
SVG61= loadShape("img61.svg");
SVG81= loadShape("img81.svg");
}
void draw() {
background(255);
fill(0);
//textFont(font, 24);
// text("sensorValue= ", width*0.3, height/2);
// text(sensorValue, width*0.55, height/2);
pushMatrix(); // save the current coordinate system to the stack
// translate to the center of screen
translate(width/2, height/2);
// rotate everything when the frameCount adds up
rotate(frameCount*0.01);

// small flower
if (sensorValue>300 && sensorValue<=400) {
fill(0, 180);
SVG09.disableStyle();
shape(SVG09, x+10, y+10, sensorValue+15, sensorValue+15);
SVG01.disableStyle();
shape(SVG01, x+10, y+10, sensorValue+15, sensorValue+15);
}
// bigger flower
else if (sensorValue>400 && sensorValue<=500) {
fill(0, 180);
SVG03.disableStyle();
shape(SVG03, x-20, y-20, sensorValue+20, sensorValue+20);
SVG04.disableStyle();
shape(SVG04, x-30, y-30, sensorValue+25, sensorValue+25);
}
// random colour
else if (sensorValue>500 && sensorValue<=700) {
// Audio is triggered: restart the ringing sound only if it is not already playing
if (!sou.isPlaying()) {
sou.rewind();
sou.play();
}
fill(random(0, 100), random(0, 100), random(0, 100), 80);
SVG51.disableStyle(); // disable the style of the shape that is actually drawn
shape(SVG51, x-60, y-60, sensorValue+60, sensorValue+60);
fill(random(100, 200), random(100, 200), random(100, 200), 150);
SVG61.disableStyle();
shape(SVG61, x-50, y-50, sensorValue+35, sensorValue+35);
fill(random(200, 255), random(200, 255), random(200, 255), 50);
SVG81.disableStyle();
shape(SVG81, x-60, y-60, sensorValue+40, sensorValue+40);
}

// else if (sensorValue>700 && sensorValue<=800) {
// fill(random(0, 100), random(0, 100), random(0, 100), 100);
// }
else {
fill(255, 150);
SVG08.disableStyle();
shape(SVG08, x+30, y+30, sensorValue+300, sensorValue+300);
}

popMatrix(); // restores the prior coordinate system
}

void serialEvent (Serial myPort) { // SerialEvent is called when data is available.
// get the ASCII string:
String inString = myPort.readStringUntil('\n');
if (inString != null) { // only does the following when there is something
// convert to a float
sensorValue = float(inString);
}
}

Arduino Part:
void setup(){
Serial.begin(9600); // match the 9600 baud rate used on the Processing side
}

void loop(){
//Serial.println(sens);
Serial.println(analogRead(0)); // send the raw reading (0-1023) from analog pin 0, followed by '\n'
delay(500); // one reading every half second
}

Pure Data Part:
pdGem

May 14, 2014

Digital Media Project: Original Inspiration

Filed under: DigitalMediaProject,DMS8013,Idea — yinzhenbhao @ 12:51 AM

The inspiration for this project stems from the concept of human-environment interaction. It is an experiment that converts the human perception of light into an embodied visual experience in two different ways, inviting participants to experience the mutual relationship between humans and the environment. Meanwhile, the project is also conceived as a way of exploring human perception through the mirror image, rethinking immersion and embodiment, observation and interaction, the partial and the whole, in the form of digital art.

Human Environmental Interactions can be defined as interactions between the human social system and (the “rest” of) the ecosystem. Human social systems and ecosystems are complex adaptive systems (Marten, 2001). 

Although digital media usually explores human experience through technology, it can also be a way to establish new perception through the interaction between humans and the environment, using the possibilities that digital technology offers. In this field there is room to explore environmental needs rather than only users’ needs, such as energy conservation (e.g. lighting, temperature and electrical energy) in a space.

Reference
Lill, A. and Gräber, S. (2006) Human Environmental Interactions. [Accessed 14 May 2014]
