
Media in Public Assign 8

🙂

Sky message board

When we look at the sky over this city through our mobile phone’s camera, what would we like to see? I asked some people: “If you could write a short sentence in the sky and everyone living in Newcastle could see it, what would you write? What colour would it be?” (Thanks to Jamie and Michael for their advice. I’ve tried Michael’s suggestion, like the Red Building; that part is really interesting, but I’m still not sure I could do it, because I don’t know whether the phone is fast enough to handle it, and the control part, like how to build it, also puzzles me :-), but thank you very much!)

Here is what I got:
1. Hello world! (Several answers like this came from a programmer group…)
2. I wish XXX love me forever! (Pink)
3. I could defeat everything! (Golden)
4. I wish everyone who sees these words could find their happiness. (Red, big smile)
5. Be humorous and relax, you could win~ (Red)
6. I love Newcastle!
7. Keep your head in the clouds. (Purple)
8. Keep calm and carry on. (Yellow)
9. Have a lovely day ! (Yellow)
10. I will be a star! (Orange)
11. Stay positive (Pink)
12. Peace! (White)
13. Olive! (White)
14. Love (Red)
15. Happy Mother’s Day~ (Pink)
16. Love and Peace (Purple)
17. Love and Peace (Pink)
18. I wish it was sunny and warm today. (Yellow)
19. We are all under the same sky. (White)
20. The sky is beautiful every time. It doesn’t matter whether it’s sunshine or rain; it’s something I like. The sky always makes me happy. I don’t think about my troubles or problems. When I am looking at the sky, I feel free and fully relaxed.
21. I’m leaving you (Depends on the light)

So I think the sky message board could have two functions; the second is based on the first.

1. Send messages to special people.
Scene:
One day, Mary received a message from her boyfriend: “Look at the sky over the city centre!” (it could also be the location of the sender). She ran the app “Magic City”, pointed her phone’s camera at the sky, and found a big “I love you” made of clouds floating in the sky, like this:

(The same as the original idea)

There are also some pictures of skywriting; the effect would be similar.

(http://fiveprime.org/hivemind/Tags/skywriting/Interesting)

2. For everyone (Optional)
Scene:
One day, Pengfei is taking a photo of himself, alone, like this:

Isn’t it a little silly?
Now, we could do this:
1. The phone judges whether you are angry or smiling (or you could just input keywords, like “smile”, “angry”, or maybe “food”).
2. Based on the sky message board’s online data, we could get a photo like this~

Isn’t it great?
This idea is similar to danmaku videos, like this:
and posters like this:

(You can make one here: http://www.wordle.net/)

I also found an interesting point: when you are taking photos outside a restaurant, the word cloud might read “Delicious!”, “Swallow my tongue!” or “Peaceful”. From a commercial point of view, it could also be developed.

Everything needed is a mobile phone and net access. I’m still working out how the software system could work, for example:

1. Import the Google Earth map (use the 2D model, or a 3D model like the one Street View uses?)

2. Calibration (align the map with the camera view)

3. Send and receive messages (based on Twitter, or something else?)

4. (Optional) Use WordNet to adapt a simple classification method so that every message is sorted into “happy” or “unhappy”, based on the facial expression recognised by the phone (if possible) or on a keyword such as “smile” typed into the phone. (A rough sketch of this and of point 5 follows after the list.)

5. (Optional) How to deal with the colour of the sentences: should it change over time based on the sunlight, stay still, be a simple animation, or be transparent?
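Here is a very rough Processing sketch of what I mean by points 4 and 5. It sorts a few placeholder messages into “happy” or “unhappy” using a tiny keyword list (just a stand-in for the WordNet classification), and draws them over a plain blue “sky” in a colour whose opacity slowly changes over time (a stand-in for reacting to the sunlight). The messages, keywords and colours are all placeholders, not the real system.

// Rough sketch for points 4 and 5: classify placeholder messages as
// "happy" or "unhappy" with a tiny keyword list (a stand-in for WordNet),
// then draw them in a colour whose opacity slowly changes over time
// (a stand-in for reacting to the sunlight). All data here is placeholder.

String[] messages = { "Have a lovely day!", "I'm leaving you", "Love and Peace" };
String[] happyWords = { "love", "lovely", "happy", "peace", "smile" };

void setup() {
  size(640, 480);
  textSize(24);
}

void draw() {
  background(135, 206, 235);   // plain blue "sky" instead of the camera view
  // fade the text in and out over time: a very simple animation
  float alpha = map(sin(millis() * 0.001), -1, 1, 60, 255);
  for (int i = 0; i < messages.length; i++) {
    if (isHappy(messages[i])) {
      fill(255, 200, 0, alpha);   // warm colour for "happy"
    } else {
      fill(80, 80, 120, alpha);   // dull colour for "unhappy"
    }
    text(messages[i], 40, 80 + i * 60);
  }
}

// true if the message contains any of the "happy" keywords
boolean isHappy(String msg) {
  String lower = msg.toLowerCase();
  for (int i = 0; i < happyWords.length; i++) {
    if (lower.indexOf(happyWords[i]) >= 0) return true;
  }
  return false;
}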

Mobile Media Development

I’ve outlined the technicalities of my Mobile Media project, which I’ve decided to call “Key to the City”. Excuse the crude format: I’m a fairly sketchy person and I’m in Scotland this week without my graphics tablet, so I wanted to do it by hand rather than crudely with a mouse. This is very much a first draft though, so expect a neater, more resolved version soon.

As you can see, it’s pretty simple. It’s not very complicated in terms of technical details, which will hopefully make it easy to implement. I also drafted a user scenario – again, still to be neatened up!

In terms of equipment and materials, here is my current spec (to make one working prototype):
1 x key
1 x RF receiver
6-8 x RF transmitters
1 x on/off switch
1 x vibration component
1 x battery

On the prototype front, I’m aiming to have one key working, picking up signals from approximately 6-8 transmitters. I’ll need to do a bit more work first, but if I can’t achieve the aesthetics I’m aiming for, I suspect I’ll make a functioning prototype and an experience/aesthetic prototype.

MIP – solar LED

for the latest media in public assignment, we were asked to consider energy use within public space projects. i have found exploring sustainable energy sources an inspiring and interesting research area (i shall continue to wander down this path)

so for the assignment we were asked to sketch a public space art/design project that had no energy constraints… my initial proposal to the council involves several huge spotlights around the city, plus the connectivity to whatever controller is appropriate… this is a pretty power hungry project!

the other option was to create “a solar panel and hacked mp3 player to build a public-space messaging box.” i wasn’t really sure what this meant (and i didn’t have an old mp3 player) so i bought a solar powered garden light from poundland, and started thinking about small energy friendly projects. having removed the solar panel and connected LED (see photo above), the light only turns on when it is dark (or when you cover the solar panel with your hand)… so i started to think about spaces in the city that are dark, or are regularly made dark, during the day.

  • bike seats – light turns on when you sit down on the seat (not necessary during the day)
  • beer mats – when drink is placed on top beer mat is dark (what would an appropriate message or effect be?)
  • the road at traffic lights – particularly underneath buses (messages could appear on roadside LCDs – pro cycling messages?)
  • inside toilets – bit grim a location really…
  • benches – when people sit down, solar panel goes dark (again what would be the result of this?)

i am continuing to think of locations, and what the stored energy could be used for… not sure of the voltage, but it’s not going to be high (1.5v…3v max?). i would like to develop this sustainable energy supply, and hopefully use energy harvesting to power my prototype for the city council project.

(that might be a little bit too ambitious/expensive!)

 

Three ways to use Kinect in Processing

I have found three ways to use the Kinect: one is based on libfreenect, and two are based on OpenNI.

 

Libfreenect

Shiffman’s library:

http://www.shiffman.net/2010/11/14/kinect-and-processing/
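A minimal depth-image sketch with this library looks roughly like the one below. I have only checked it against one release of the library, so the exact class and method names may differ in other versions; treat it as a sketch rather than a definitive example, and check the examples bundled with the library.

// Minimal depth-image test with Shiffman's Kinect library.
// Method names may differ between library versions -- check the
// bundled examples if this does not compile as-is.
import org.openkinect.*;
import org.openkinect.processing.*;

Kinect kinect;

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.start();
  kinect.enableDepth(true);
  kinect.processDepthImage(true);   // ask the library to build a depth image
}

void draw() {
  background(0);
  image(kinect.getDepthImage(), 0, 0);   // show the depth map
}

void stop() {
  kinect.quit();   // shut the Kinect down cleanly
  super.stop();
}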

OpenNI

1. Use the Processing wrapper

http://code.google.com/p/simple-openni/
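A minimal depth-image sketch with simple-openni looks roughly like this (a sketch only; check the examples bundled with the wrapper for your version):

// Minimal depth-image test with the simple-openni wrapper.
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();          // switch the depth generator on
}

void draw() {
  context.update();               // grab a new frame from the Kinect
  image(context.depthImage(), 0, 0);
}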

2. Use OSC in Processing (or Max/MSP) to read the data produced by OpenNI

http://tohmjudson.com/?p=30

This method lets OpenNI data be used on many platforms, because OSC (Open Sound Control) is widely supported now.
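For example, on the Processing side the oscP5 library can listen for whatever the OpenNI-to-OSC bridge sends. The port number below is a placeholder and must match the sender’s settings, and the address patterns printed will depend on the bridge used.

// Minimal OSC listener in Processing; the port is a placeholder and
// must match whatever the OpenNI-to-OSC bridge is configured to send to.
import oscP5.*;
import netP5.*;

OscP5 oscP5;

void setup() {
  size(200, 200);
  oscP5 = new OscP5(this, 7110);   // listening port: placeholder value
}

void draw() {
  background(0);
}

// called by oscP5 every time a message arrives
void oscEvent(OscMessage msg) {
  println("addrpattern: " + msg.addrPattern() + "  typetag: " + msg.typetag());
}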

Installing OpenNI, Kinect drivers and NITE

Recently, some friends on the forum have been asking how to set up the environment for the Kinect, so I want to summarise a few points here.

Windows

For Windows, although there are lots of articles describing the process, there are still three points that are easily overlooked and cause a lot of problems.

1. So far, OpenNI only works well on 32-bit systems.

2. The installation order matters: first OpenNI, then the KinectSensor driver, and finally NITE. The versions must match each other.

3. Don’t edit the XML files with WordPad; it can cause problems related to Unicode and UTF-8 encoding.

Most remaining questions can be answered by reading the guide in the “Documentation” folder.

Mac

http://kinecthesis.bakedmac.com/2011/01/11/installing-openni-kinect-drivers-and-nite-on-mac-os-x-10-6/

This article guides you through the process step by step, and the KinectSensor readme file also explains a lot.

A problem may appear when installing MacPorts: after completing the installation, we should also run selfupdate first, or it may warn you:

Warning: No index(es) found! Have you synced your source indexes?

Also, the command “sudo port install libtool” would not run. This problem is caused by the firewall (probably the school’s), which blocks the normal update, so we have to sync another way. We can tackle it like this:

https://trac.macports.org/wiki/howto/SyncingWithSVN

In the last step, we need to edit sources.conf. I’m new to the Mac, so I’m not sure whether any tool other than “vi” in the Terminal can do it. Don’t worry: “vi” is complex (and great), but for our task it’s quite easy. First, enter insert mode (vi starts in command mode; press “i” to enter insert mode); second, edit the file; last, save and quit (press Esc to leave insert mode and return to command mode, then type “:wq” to save and quit, or “:q!” to quit without saving).

Finally, in my case the samples would not run just by clicking their icons; I had to type “./filename” in the Terminal to run them.

Good Luck!

PS: In the Terminal, when you type a password nothing is displayed, not even “*”, but don’t stop; just type it and press Enter.

Be careful when you rename a file or folder: if there is a space after the name and you later type the name without that trailing space, the machine will never find it.

Processing Note 1: Random and Color

From today, I will select some examples that I think are important and comment on them.

1. Random and Color

I think that for most freshmen like me, Processing means using a lot of objects to make the output as magnificent as possible, which means we need functions to build them, manage them and make them different from each other. So let’s begin with “random” and “color”.

P206 Processing – Shiffman Example 13-3: Probabilities

void setup() {
  size(200, 200);
  background(255);
  smooth();
  noStroke();
}

void draw() {
  // Probabilities for 3 different cases. These need to add up to 100%!
  float red_prob   = 0.10; // 10% chance of red
  float green_prob = 0.60; // 60% chance of green
  float blue_prob  = 0.30; // 30% chance of blue

  // random(1) gives a number between 0 and 1.
  float num = random(1);

  // If the random number is less than .1
  if (num < red_prob) {
    // Once the colour is decided, we can still vary it a little:
    // 255*random(0.6,1) shifts it from "slightly" to "very" red.
    // fill(value1, value2, value3, alpha): alpha is the opacity
    // of the fill, from 0 to 255.
    fill(255*random(0.6,1), 53, 2, 150);
  }
  // If the random number is between .1 and .7
  else if (num < green_prob + red_prob) {
    fill(156, 255*random(0.6,1), 28, 150);
  }
  // All other cases (i.e. between .7 and 1.0)
  else {
    fill(10, 52, 255*random(0.6,1), 150);
  }

  ellipse(random(width), random(height), 64, 64);
}

[youtube]http://www.youtube.com/watch?v=ZCxfBMuG_hk[/youtube]


For freshmen in Processing like me

http://bluethen.com/wordpress/index.php/category/processing-app/page/2/
This blog gives several very different examples showing how to use knowledge of maths and physics to create something really gorgeous. I plan to learn the way of thinking in Processing: what do I want? How can I get it? How do I break a big project into small steps, and how do I structure it?

LED drop

following on from my hidden pavement post i made some LED droppies to slip through the grating and illuminate the underground walkway. i went down to the quayside under the cover of darkness with james davoll‘s spirit in tow and dropped my pre-made LEDs down into the shadowy depths…

…the result was not amazing.

…the general public were drunkenly present…

…they did not notice the “blinding light” from below…

…neither did my camera really…

so, lessons have been learned… use brighter LEDs, possibly fashion some weighting mechanism so they point up, take a better camera… more LEDs?

also… i felt like i was littering (i was)… i didn’t want any police or bomb squad attention… and once the LEDs had been dropped (with minimal/no success), i realised i had pretty much dropped money down the drain (albeit not very much, but still…)

worth a shot though eh….

Signal Modulation and the Human Circuit

Looking back over the design of this piece, I have noticed certain recurring themes in my work. The more I work with the visualisation of audio signals, the more I become interested in the notion of signal modulation and aesthetics. My work with Tip31c transistors is one example of aesthetic signal modulation, as the audio data fed into the circuit via the transistor effectively modulates the signal in the current, affecting the brightness of the LED. My Processing-based audio visualisers work in a similar manner: a shape is drawn, using the audio variable to control size, translation and colour. This variable is effectively modulated by variations in the signal being fed to the FFT. The FFT (fast Fourier transform) is also of interest, as it too fits with this notion of signal aesthetics: using the FFT, a signal is broken down into its component oscillations. What I find most interesting about this application of audio signals to visual media is the sense of synchronicity. As both the audio and the visual are run off the same signal, they have the same source data. The data is just plotted in a different manner.
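As a rough illustration of this kind of FFT-driven visualiser (a minimal sketch rather than my actual one, assuming the Minim library and a placeholder audio file called “song.mp3” in the sketch’s data folder):

// A stripped-down version of the FFT-driven visualiser idea:
// one band of the spectrum drives the size and colour of a circle.
// Assumes Minim and a placeholder file "song.mp3" in the data folder.
import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
AudioPlayer player;
FFT fft;

void setup() {
  size(400, 400);
  minim = new Minim(this);
  player = minim.loadFile("song.mp3");
  player.loop();
  fft = new FFT(player.bufferSize(), player.sampleRate());
}

void draw() {
  background(0);
  fft.forward(player.mix);          // analyse the current audio buffer
  float band = fft.getBand(2);      // pick one low-frequency band
  float size = map(band, 0, 50, 20, width);
  noStroke();
  fill(map(band, 0, 50, 0, 255), 100, 200);
  ellipse(width/2, height/2, size, size);
}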

This whole concept of brainwave entrainment and its potential for therapeutic use is somewhat reminiscent of the Penfield Mood Organ from Philip K. Dick’s Do Androids Dream of Electric Sheep? The device, which sadly does not appear in Ridley Scott’s adaptation, is a home appliance for the regulation of emotional states. The appliance is seemingly based around the Penfield Wave Transmitter, a similar tool that can also be used for the projection of emotional states. This concept is really interesting, given the nature of my research. Wahbeh, Calabrese and Zwickey’s 2007 study on the psychological and physiological effects of binaural beats suggests that exposure to delta (0-4Hz) binaural beat frequencies can have positive psychological and physiological effects. If this is a result of brainwave entrainment, then it fits with the concept of the Penfield Wave Transmitter: through the use of specific frequency signals, one can affect neurological activity, and effectively emotional states.

I find this reference to Penfield of further interest due to some of the reading I have been doing over the last year. About a year ago, I read Oliver Sacks’ The Man Who Mistook His Wife for a Hat. One chapter, entitled Reminiscence, refers to a number of cases where patients would hear music when there appeared to be none. In the chapter, Sacks refers to Wilder Penfield, the Canadian neurologist who invented the Montreal Procedure. This procedure, in which the patient remains conscious under local anaesthetic, allowed Penfield to map the function of the various areas of the brain through stimulation and observation. A similar procedure was performed on banjo player Eddie Adcock during an operation to treat a tremor in his hand. Adcock played his banjo whilst surgeons continued with the deep brain stimulation, allowing them to test the effect of the neurological implant on Adcock’s motor skills and the effectiveness of the treatment.

During Penfield’s experiments, he was able to identify the source of the elaborate mental states felt by many epileptics during the onset of a seizure. Looking through Penfield’s notes, there are multiple cases where he was able to elicit this hallucinatory state through the precise stimulation of areas of the temporal lobe. This stimulus, as Sacks puts it, “would instantly call forth intensely vivid hallucinations of tunes, people, scenes, which would be experienced, lived, as compellingly real, in spite of the prosaic atmosphere of the operating room, and could be described to those present in fascinating detail.” These hallucinations, on further inspection, would appear rooted in the experiences and memories of the patients. Many recalled hearing songs they associated with their youth, often not recognising the tune but being able to sing along. Furthermore, when the stimulus was less precise, Penfield observed an apparent blending of memories. In one case, a young boy would, during stimulation, observe a bank robbery in his hallucination, without ever having witnessed such an event. Additionally, there was no sound in the hallucination: it was purely visual. When the stimulus was made less precise, with the signal spreading to other areas of the lobe, the patient appeared to have a hallucination in which further elements of his own memories were merged with the bank robbery hallucination. It transpires that the boy was a comic book reader, who had read a comic involving a bank robbery. This, it would appear, explains the absence of audio from his initial hallucination, as the boy would have had no contextual understanding of the sounds associated with a bank robbery. It also seems to suggest that memory and perception are a series of sensory snapshots, which are stored in the brain and can be accessed via stimulation of certain areas. This is somewhat fitting with the analogy of the human computer, suggesting that memories are stored much like files on a hard drive.

Penfield’s work is of further interest when exploring this analogy of the human machine. The Montreal Procedure, from what I am able to interpret with my very limited experience of neurosurgery, involves the probing of various neurological tissues and the observation of the patient’s response. Penfield appears to use this technique to identify areas of scarred and damaged tissue, removing those areas whose removal would not have a negative effect on the patient. In the analogy, these areas of damaged tissue seem to correspond to damaged connectors in a circuit. Just as a faulty connection might result in a short circuit or electrical discharge in the circuit, these damaged tissues can cause an electrical discharge within the brain, that is, a seizure.

I’m interested in exploring these ideas further. I have been experimenting with photo editing, coming up with a few prints that relate to this idea. I am currently referring to this one as the electronic homunculus.

I’ve also been experimenting with the Processing Image Adjust library, using audio signals to manipulate the contrast and gamma of an image. For the images, I have been using medical photographs, such as x-rays, MRI scans and retinal photography. These sketches seem to work really well with glitchy, downtempo audio with interesting dynamics, such as Squarepusher’s Conc 2 Symmetric from Do You Know Squarepusher? The ambience of the music also seems to fit the atmosphere of the images. I’ll post a link when I record a demo of the sketch.
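In the meantime, here is a stripped-down sketch of the same idea. It is only an illustration, not my actual sketch: it uses plain pixel arithmetic for the gamma adjustment instead of the Image Adjust library, assumes the Minim library for the audio, and “track.mp3” and “xray.jpg” are placeholder file names.

// A stripped-down version of the audio-driven gamma idea.
// Plain pixel arithmetic stands in for the Image Adjust library;
// "xray.jpg" and "track.mp3" are placeholder file names.
import ddf.minim.*;

Minim minim;
AudioPlayer player;
PImage source;

void setup() {
  size(640, 480);
  minim = new Minim(this);
  player = minim.loadFile("track.mp3");
  player.loop();
  source = loadImage("xray.jpg");
  source.resize(width, height);   // match the window so the pixel arrays line up
}

void draw() {
  // the louder the current buffer, the stronger the gamma shift
  float level = player.mix.level();            // roughly 0.0 to 1.0
  float gamma = map(level, 0, 0.5, 1.0, 3.0);
  loadPixels();
  source.loadPixels();
  for (int i = 0; i < pixels.length; i++) {
    float b = brightness(source.pixels[i]) / 255.0;
    float adjusted = pow(b, gamma) * 255.0;    // simple gamma curve
    pixels[i] = color(adjusted);
  }
  updatePixels();
}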

Clojure as a music programming environment

Rather nice looking by the looks of things!

http://mad.emotionull.com/