little stories

Eve and drone

looking at stories – being read to – the comfort of it – the sound of a voice when it is telling a story or reading – thinking about groups of people telling stories – thinking about a collection of stories being told in a cacophony – individual sounds from headphones

meditation on a future horizon


Brandon LaBelle

Revolutionary statements make a claim onto history to charge a given time and place with radical energy: to galvanize the masses, to overturn social behavior, to disrupt and ultimately transform reality. Such statements act as momentary bursts of outrage and political conscientiousness, giving definition to the here and now as a time in need of rupture.

Exploring revolutionary desire as a temporal moment, the project examines various historical texts and statements calling for social transformation. From Situationist graffiti to Black Caribbean rights, the statements are translated into random melodies using a music box mechanism: written out across musical staff paper, each statement becomes a lyrical homage to revolutions gone by, while also suggesting links between art production, as a project of reworking time, and the revolutionary moment as a recurring intensity throughout history.

The project is presented as a series of video works capturing the gesture of producing the melodies. These are shown on a set of laptops within an installation setting acting as a working space containing related documents, materials, books, CDs and artefacts.

Exhibited at Mario Mazzoli galerie, Berlin
January 20 – March 22, 2011

cornell and archiving/collecting and object poems

been looking at Joseph Cornell’s boxes and how the objects make connections with each other. It has made me think about archiving work, representing it as an archive or collection of disparate objects which are connected by seeing or hearing. thinking of digital ways to re-present and make connections between experiences.

 

stereoscopic .gif attempt


this image is created by taking two still photographs of the same subject, from two slightly differing viewpoints… technically it works best if the distance is around the same as the distance between your eyes… but we’ll see about that.

i have been wanting to make a stereoscopic animated .gif for a while now, i like the lo-fi 3-dness, no glasses, no x/y/z, rendering etc… just 2 photos slammed together on a loop. the source photos were taken on an iphone 4, with no precision regarding angles and composition.

i am pretty happy with the result, and i think the KIRAKIRA subject matter is appropriate for the jerky/glitchy imagery.

MIP: System Development

For this project, I wanted to build on the idea I’d had for an audio filter for mobile use. However, given the project’s brief about engaging with spaces throughout the city, I thought it would be better to develop it as a technology installation rather than as a mobile app. Since I am familiar with MaxMSP rather than Pd, I decided to design it as a Max-based app which could later be ported to Pd, enabling me to embed it in a mobile device.

The current system runs the audio signal from a microphone through a number of variable filters. Using the Max biquad~ object and its help patch, I was able to set up a pair of filters which can be set to high-pass, low-pass or band-pass. There are also additional gain controls.
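The Max patch is graphical, so there is no code to quote, but the maths a biquad filter performs can be sketched in plain Python. This is a stand-in for illustration, using the standard RBJ “Audio EQ Cookbook” coefficient formulas rather than anything taken from the actual patch:

```python
import math

def biquad_coeffs(ftype, fc, fs, q=0.707):
    """Cookbook-style biquad coefficients for one of three filter modes."""
    w0 = 2.0 * math.pi * fc / fs
    alpha = math.sin(w0) / (2.0 * q)
    cosw = math.cos(w0)
    if ftype == "lowpass":
        b = [(1 - cosw) / 2, 1 - cosw, (1 - cosw) / 2]
    elif ftype == "highpass":
        b = [(1 + cosw) / 2, -(1 + cosw), (1 + cosw) / 2]
    elif ftype == "bandpass":
        b = [alpha, 0.0, -alpha]
    else:
        raise ValueError(ftype)
    a0 = 1 + alpha
    # normalise so the feedback coefficient a0 becomes 1
    return [c / a0 for c in b], [1.0, -2 * cosw / a0, (1 - alpha) / a0]

def process(samples, b, a, gain=1.0):
    """Direct Form I biquad with an output gain stage."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y * gain)
    return out
```

A low-pass set like this passes a steady (DC) signal unchanged, while the high-pass removes it entirely, which matches how the two modes behave in the patch.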

So far this was quite good, but a bit simple. I wanted to develop the system further, making it interactive and appealing to users willing to engage with the technology. I remembered the user feedback from a previous project, The Echo Chamber, which used real-time delay feedback looping of a microphone signal in a multi-speaker surround-sound setup. Many users found the system really engaging, making full use of the distributed props (crisp packets, pens, cans) to make their own rhythms. I thought this interface could be integrated into the design, allowing the user to manipulate the audio further and use it as a sort of drum machine.
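The core behaviour of that delay feedback loop, a signal feeding back into itself after a fixed delay, reduces to one line of DSP. A minimal sketch (my own illustration, not the actual Max patch):

```python
def feedback_delay(samples, delay, feedback):
    """y[n] = x[n] + feedback * y[n - delay]: each sound repeats every
    `delay` samples, shrinking by `feedback` on every pass."""
    out = []
    for n, x in enumerate(samples):
        y = x + (feedback * out[n - delay] if n >= delay else 0.0)
        out.append(y)
    return out
```

Feed it a single impulse and you get a train of echoes decaying by the feedback amount each time round, which is exactly the rhythmic looping users played with.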

In terms of the interface, I intend to use some form of MIDI controller with pads and dials rather than a traditional keyboard interface. This is due to concerns raised about accessibility: when I spoke to potential users, many pointed out that they were not musicians, or at least not classically trained, and as a result felt they probably wouldn’t interact with a system that used keys. Instead, I looked at the Akai LPD8, a laptop controller with eight pads and rotary dials. A number of features drew me to this particular interface:

1. It is small and simple, taking up little space whilst limiting the complexity of the interaction. As the piece is designed for a public installation, I felt that a degree of discretion would be good, so as not to attract undesired attention from vandals, whilst the simplicity of the interface would promote interaction by not being intimidating.

2. The interface has a cosmetically appealing feature: the pads light up when pressed. I thought this would also promote interaction, as there is both an audio and a visual response to the user.

3. The low cost of the unit. LPD8s are widely available off the shelf at about £50, making them cheap and easy to replace if damaged.

So far I have integrated the controller into the sketch, using the pads to control the delay feedback effects and the dials to control the filters and the gain. Three rotary dials remain unassigned, and could be used to control other effects such as reverb, distortion or a flanger. Such effects are common on modern electronic music equipment, such as the Numark Axis 9 CD player, one of my first DJing equipment purchases many years ago.
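As a sketch of what that routing might look like outside Max, here is some hypothetical dispatch logic. The pad note numbers, CC numbers and parameter ranges are my assumptions for illustration, not the LPD8’s actual factory defaults or the patch’s real mapping:

```python
def handle_midi(status, data1, data2, state):
    """Route LPD8-style MIDI messages to parameter slots: pad note-ons
    toggle delay taps, CC dials set filter cutoff and gain."""
    NOTE_ON, CONTROL_CHANGE = 0x90, 0xB0
    PAD_NOTES = range(36, 44)  # eight pads (assumed note numbers)
    if status & 0xF0 == NOTE_ON and data1 in PAD_NOTES and data2 > 0:
        pad = data1 - 36
        state["delay_taps"][pad] = not state["delay_taps"][pad]
    elif status & 0xF0 == CONTROL_CHANGE:
        if data1 == 1:   # assumed cutoff dial
            state["cutoff"] = 20.0 + (data2 / 127.0) * 19980.0  # 20 Hz-20 kHz
        elif data1 == 2:  # assumed gain dial
            state["gain"] = data2 / 127.0
    return state
```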

In terms of installation locations, I wanted to pick a number of areas in close proximity to each other, with varied soundscapes and a degree of local cultural significance. Having lived near the Ouseburn valley for many years, I was quite aware of its varied architecture and topography. My interest grew further when I explored the more wooded part of the valley, which is currently undergoing development. These wilder areas stand in stark contrast with the geometric shapes of the bridges, and the local sounds are a varied mixture of wildlife activity and the traffic (road, rail and pedestrian) overhead. A number of footpaths and byways cut through, under and over the various bridges. I think this would be good for user interaction, as the units can be ‘discovered’ by potential users as they walk through, which in turn encourages them to engage with the locations as they stop and look around.

I’m quite keen for users to use their own headphones with this interface, for a number of reasons. Firstly, it reduces the cost and maintenance of the unit, as headphones installed in outdoor public locations would likely be damaged by the elements, and could become mouldy or infested with insects. Secondly, it serves as an additional means of engagement. Lone commuters often carry headphones with them, and a pair of plastic earbuds is a common sight; many people, it would appear, would rather listen to their music than to the area around them. The system encourages them to disengage from their personal media and actively engage with the sound of their surroundings: by offering a degree of control over the ambient audio, the user is encouraged to reinterpret the sounds of the area. Given the widespread use of headphones, we can assume there are plenty of potential users. Whether someone would be willing to plug their headphones into an open, public audio port is another matter.

MIP: Reflections on Project Development

Following the submission of the project document and the installation of the demo at OnSite, I have had a number of thoughts about the current technology setup. There are certain elements, particularly during the installation, that have proved bothersome, and I would likely redesign the unit to account for this.

One potential alternative to the current setup would be to use contact mics rather than the stereo condenser mic. If contact microphones were installed on the structures, they would pick up the vibrations caused by traffic, be it road, rail or pedestrian. I quite like this idea; however, there are a couple of reasons why I thought it would be better to record all ambient sound rather than using contact mics:

1. With the current setup, the unit picks up audio from a much wider variety of sources; contact mics would only pick up vibrations within the structures. A broad variety of sounds comes from the Ouseburn area, from the transport network that runs overhead and the local wildlife and farm to the flow of the Ouseburn river below. By having multiple locations situated around the valley, the technology draws attention to the varied soundscape of the area.

2. Multiple artists have installed contact microphones in structures in order to create audio reinterpretations of them. Although I find this idea appealing and am very interested in the subject, I feel that such an installation would be almost too derivative of artists such as Mark Bain and Jodi Rose.

I am still very interested in using contact mics in a piece, possibly as the audio source in a similar setup. Given the high flow of traffic over the Tyne Bridge, as well as the tower structures at each end, I feel that this location would be a really interesting site for such an installation. Will Schrimshaw made really interesting use of this location at the AV Festival in 2010, installing multiple loudspeakers in the tower structures. I think a similar setup would be really interesting; however, rather than using a looped signal to determine the audio, I would use a series of contact microphones to trigger sub-bass frequencies. These frequencies would then be played via the loudspeakers, resonating within the structure and creating a feedback loop, much like the use of audio looping in the current setup.
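A minimal sketch of that trigger idea: watch the contact-mic level and, when it crosses a threshold, mix in a short sub-bass sine burst. The threshold, frequency and burst length here are hypothetical placeholder values, not a worked-out design:

```python
import math

def trigger_sub_bass(mic, threshold=0.2, freq=40.0, fs=44100, burst=0.25):
    """Mix a sub-bass sine burst into the output whenever the contact-mic
    signal exceeds the threshold. All parameter values are assumptions."""
    out = [0.0] * len(mic)
    n = 0
    while n < len(mic):
        if abs(mic[n]) > threshold:
            length = int(burst * fs)
            for i in range(length):
                if n + i < len(out):
                    out[n + i] += math.sin(2 * math.pi * freq * i / fs)
            n += length  # hold off retriggering while the burst sounds
        else:
            n += 1
    return out
```

In the installation itself the burst would of course come back through the structure and the mic, closing the feedback loop; this sketch only shows the trigger half.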

Another issue I have encountered when working on this project is microphone-to-line-level signal conversion. In the project book, I refer to the need for the computer to have a microphone-level input, or a soundcard with such capabilities. For the exhibition installation I used a Mac Mini, which unfortunately does not have such an input, so I had to run the microphone signal via a mixer to bring it up to line level. For a permanent installation I would likely not use a Mac Mini, given the cost of the unit and the lack of a microphone-level input. Instead, I would use a cheaper unit fitted with a USB or internal sound card, or a reconditioned second-hand laptop as the basis for the unit. However, given the client’s requirements and the timescale of the current project, such a design and (re)construction was unfeasible.

Another issue is signal loss and distortion during installation. For the original setup, I planned to use a 10-metre audio extension cable so that I could install the microphone high in the side of the railway viaduct. However, due to the buzz that such a long cable created, as well as the proximity of some particularly large-looking cables, I decided it would be best to install the microphone much lower and closer to the unit, minimising the need for extension cables and reducing signal loss. Were I to install the piece again, I would probably use wireless mics, allowing the unit to be installed at a significant distance from the microphone without long, potentially vulnerable cables that could cause distortion or loss of signal.

I would also like to make some further modifications to the programming of the unit. Firstly, I would like to make the whole unit more tamper-proof. This follows from watching people use the interface without understanding it, sometimes changing the interface program in the process. One option would be to make each program on the MIDI interface identical, so that such changes would have no effect. Alternatively, the program buttons could be covered with a piece of plastic or deactivated so that this does not happen.

I am also considering adding a number of features to the unit, possibly integrating drum loops controlled via the top, unassigned dials. This would emphasise the rhythmic nature of the sound, whilst giving the unit additional potential as a musical instrument. Alternatively, these dials could be used to control additional effects, such as reverb, distortion and a flanger.

Finally, I think the unit could include clearer instructions. I have tried to include instructions for use, to some beneficial effect: people seemed intrigued by the unit at the launch and many interacted with it. Part of the difficulty is deciding on the level of detail, wanting to provide sufficient information without patronising or confusing the audience. I decided to include simple instructions in regular type, with more detailed instructions in italics: simple, accessible information for general users, and more detail for those who want it. I also included a hint on how to use the pads to create more interesting effects.

MIP: Designing an App

One of the things that has come up when discussing sound walks and headphones with a few friends and colleagues is the additional sounds people claim to pick up on when they listen to the city via a microphone and headphones, as opposed to without such technology. I thought this was quite interesting, given my interest in the brain and its processing of stimuli. It reminded me of some reading I’d done on the ‘cocktail party effect’, i.e. the brain’s ability to interpret various audio signals, filtering out those deemed extraneous to the situation. This can be observed in people’s ability to focus on one conversation among many in a crowded, noisy room, hence the name. Somehow, the sense of dissociation that accompanies listening to the world via a microphone and headphones seems to negate this ability.

When looking at urban and suburban audio, the effect is similar. Without this cognitive ability, the urban soundscape would likely be unbearable due to the chaotic noise. In A New Sense of City Through Hearing and Sound, Eva Kekou and Matteo Marangoni discuss the chaos of urban sound:

There is a paradox between the fact that cities are highly structured spaces in which almost everything one senses has been processed through a human brain to be orderly, and the fact that interactions therein are far too complex to be controlled.

I wanted to design a technology that worked as an artificial, controllable form of this inherent audio filtering ability. Using the mobile application template, I thought about running the phone’s microphone input through a number of variable audio filters, with the audio then fed out of the phone’s 3.5mm output to headphones. This would not be particularly difficult to design and implement, given Pure Data’s compatibility with a number of mobile devices.
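In Pd terms this would be little more than adc~ → lop~ → dac~ with a user-controllable cutoff. Here is a sketch of the underlying one-pole lowpass, my own simplified stand-in rather than Pd’s actual implementation:

```python
import math

class VariableLowpass:
    """One-pole lowpass whose cutoff can be changed between audio blocks,
    the simplest form of the 'controllable hearing filter' idea."""

    def __init__(self, fs=44100):
        self.fs = fs
        self.y = 0.0
        self.set_cutoff(1000.0)

    def set_cutoff(self, fc):
        # smoothing coefficient for the given cutoff frequency
        self.a = 1.0 - math.exp(-2.0 * math.pi * fc / self.fs)

    def process_block(self, block):
        out = []
        for x in block:
            self.y += self.a * (x - self.y)
            out.append(self.y)
        return out
```

With the cutoff pulled right down, rapid fluctuations (hiss, chatter) are heavily attenuated while slow-moving rumble passes through, which is the kind of selective listening the app would offer.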

This has a couple of applications. The first, more socially beneficial, is that the technology could be adapted to deal with various audio/noise issues. In a discussion with a colleague who had suffered a stroke, I was told that he now finds it incredibly difficult to focus on sound: conversation can be difficult when there are several sound sources nearby, as he finds it impossible to concentrate. This application could be quite helpful, enabling him to filter out undesired noises. The second use, a little more creative, is as a filter within an instrument, a sort of ambient subtractive synthesiser. I feel this could make for an interesting piece of technology, offering a fresh perspective on the city soundscape.

MIP: Work By Another Artist

I was really interested in the Kittler reading, particularly the section referring to transport networks and infrastructure as systems for the flow of information. Given Newcastle’s metropolitan design and busy traffic, the city can almost be seen as the centre of a massive communication network. This can be seen in the extreme busyness of the Tyne Bridge during the rush-hour commute; at times it has taken me nearly an hour to get across that bridge in a car.

I wanted to bring the work of Jodi Rose and her Singing Bridges to the class. The project outline introduces the concept:

“‘Singing bridges’ is a sonic sculpture, playing the cables of stay-cabled and suspension bridges as musical instruments. To create this work I will amplify and record the sound of bridge cables around the world. Listening in to the secret voice of bridges as the inaudible vibrations in the cables are translated into sound.”

I thought this idea was really interesting, as it was a new way of using the noise of some of the most architecturally impressive structures humanity has ever designed. Bridges serve as the linking points in Kittler’s urban information network, the connectors in the circuit, and a byproduct of this is extreme levels of noise. This applies to bridges as well as many other points within the transport network. During my time in Newcastle I have had the fortune to live in a ground-floor flat next to a Number 1 route bus stop, and some of the noises when a bus is idling at the stop are incredible. The walls of the flat filter out much of the higher-end engine ‘chug’; however, there is a powerful, strangely warm rumble that can be heard, and felt to some degree, in the walls.

I thought Jodi Rose’s project was really interesting as a reinterpretation of that noise byproduct of metropolitan living. I also thought the idea could be used in Newcastle, given the city’s 2007 award as the country’s noisiest city.

Final Project Update

Still beavering away at my final project. Here is an A3 research poster prepared for last week’s project session, but expect some better stuff soon. Cultural probes have started to come back; I’ve started to analyse them and will post my findings soon.

DOING: Processing

CONCEPT

“Abertrack” is a Processing program and visualisation which tracks and plots the voyages of an international oil tanker called MV Aberdeen. Ship location data is provided by a service called “Ship AIS”, which provides regular updates including the latitude and longitude of the vessel. As my partner works at sea, I regularly check Ship AIS but often find the information confusing. There is an option to view the vessel’s position on a map, but it only shows the previous 6 hours of movement.

I wanted to create a more aesthetically pleasing visualisation of the MV Aberdeen’s movements over a longer period of time.

IMPLEMENTATION

When I embarked upon this project, I wanted to pull the vessel’s location data from the Ship AIS system via an XML feed; however, this proved insurmountable. Most sources did not have an XML feed that Processing could read, and the only one I could find required a fee of several hundred pounds to access. I expect that with enough time, and perhaps some outside advice, I could eventually overcome this problem by reading the values straight from the PHP page on the website.

After several weeks of battling with the above, I opted to store the data manually in a text file. I then read this into Processing as an array, with each line (latitude, longitude) parsed as a string. This was an easy workaround and allowed me to achieve what I had set out to do. For each string, I drew an ellipse to mark the co-ordinates of the ship at that time. Once I had my co-ordinates plotting correctly, I joined them with a line.
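The parsing step looks something like this, sketched in Python rather than Processing’s Java syntax; the “latitude,longitude” line format is the one described above:

```python
def parse_positions(lines):
    """Parse 'latitude,longitude' text lines into float pairs,
    skipping any blank lines in the file."""
    positions = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        lat, lon = line.split(",")
        positions.append((float(lat), float(lon)))
    return positions
```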

The biggest challenge I faced in this project was plotting the GPS co-ordinates onto a 2D map. It took a lot of research into Mercator projections, a distressing amount of maths and rather a lot of trial and error to finally come up with a solution that let me plot GPS co-ordinates directly onto my map.
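For reference, the Mercator maths reduces to something like the following. This assumes a full-world map image and is my own sketch, not the exact code from the project:

```python
import math

def mercator(lat, lon, map_width, map_height):
    """Project (lat, lon) in degrees onto pixel co-ordinates of a
    full-world Mercator map image of the given size."""
    # longitude maps linearly across the image width
    x = (lon + 180.0) / 360.0 * map_width
    # latitude is stretched by the Mercator formula
    lat_rad = math.radians(lat)
    merc_n = math.log(math.tan(math.pi / 4 + lat_rad / 2))
    y = map_height / 2 - map_width * merc_n / (2 * math.pi)
    return x, y
```

The non-linear latitude term is what makes naive linear plotting drift further off as you move away from the equator, which is likely the source of most of the trial and error.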

SCOPE & POTENTIAL

There is potential for this concept to be scaled up to create larger and more engaging visualisations. This example shows a mock-up of three different vessels in three different seas, in different colours.

There is also potential to make the map interactive, for example allowing the user to zoom in on specific areas or choose which ships they view. A live feed would be nice.

MYTHS

As inevitably predicted, I’ve gone for a very loose connection to the theme of “myths”!

When people talk about myths, it is often sea-related things which spring to mind: mermaids, the Loch Ness Monster, the Greek god Poseidon, etc. One of the most famous is based around the Chinese ocean goddess T’ien Hou, a mythological being who watched over sailors and kept them safe on their voyages. Arguably, tracking vessels helps ensure their safety and security – so my project relates to the theme in this way.

You can view the A3 poster I made to accompany the project by clicking here.