Opera set design project with Irene Brown & Shaney Barton.
Handel’s Acis & Galatea (11/05/2018).
(Photos in the above gallery by Ryan C. P. Boyle.)
My part in this project consisted of creating the original footage of the mountain (centre stage), as well as offering some editorial input. Beyond this, I helped assemble the frames, set up the projectors, coordinate the projection mapping, and operate the program which ran the projections.
Irene Brown provided the footage of the folly (stage right) and designed/built the frames, as well as having editorial input.
Shaney Barton provided the footage of the waterfall & helped assemble the frames, as well as having editorial input.
In this post, I’ll be discussing my role in The Ex Nihilo Project. Below is an extract from my Live Electronic Performance essay, which was about an instrument built for submission to LEP but designed to be used in The Ex Nihilo Project (scroll back a few pages to see the paper).
“The vision I have in mind is of an audiovisual-instrument/installation-piece designed to approximate the appearance of a “machine” built to observe the surfaces of alien worlds orbiting stars which output alternative colour spectrums, e.g. red dwarf stars, blue giants & those yet to be encountered whose colour spectrums differ from our own.
I’ve chosen to go for a “vintage sci-fi” aesthetic; for this reason, I’ve designed the piece around an old CRT computer monitor with a plasma coil attached to the top, which triggers in response to sound output by the machine, thus giving the impression of the coil being a receiver/aerial for the “machine”.
For Ex Nihilo, the “machine” will exist aesthetically within the context of an approximated laboratory environment. The “machine” will sit upon the end of a table covered by a black cloth, partnered with a row of multicoloured jars of glowing liquids. Stood directly behind the table will be a scientist-type character (myself), operating a large board covered with buttons/gadgets and glowing lights.
The buttons/lights are in actuality an amalgamation of various guitar FX pedals, a miniature amp, a KAOSS pad & a MIDI keyboard, assembled in one unit & acting as a control panel for the device (after all, who doesn’t like glowing lights?). The CRT will display a Processing sketch accompanied by audio from PureData, run preferably on a Raspberry Pi 3B+ but possibly on a concealed MacBook or laptop (if necessary).
I’ve written a PureData patch (Modular Dominator) that’s essentially a modulating synthesiser triggered by MIDI, which also outputs MIDI notes to the IAC driver; these are then picked up by the Processing sketch, which shares the name of the project (In the Light of Other Stars).
It begins thus: the performer simply plays any note on the MIDI keyboard, which sends that note value through PureData, triggering the sound synthesis. The Processing sketch then detects the MIDI values output by PureData & triggers corresponding videos to display on the CRT monitor, with a different video (of an alien landscape) assigned to every white key of the MIDI keyboard, whilst the black keys apply image filters.
Following this, the sound is output from the computer into a small USB mixing desk, where the stereo channels are split into two mono outputs. This allows the performer to retain the original audio from the “Modular Dominator” whilst also being able to run the sound not only through the KAOSS pad but through a vast array of sound-altering FX pedals, providing the performer with far greater scope for AV manipulation.”
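The white-key/black-key mapping described in the extract can be sketched in a few lines. This is purely illustrative Python, not the actual Processing sketch (which Processing implements in Java with a MIDI library); the function name and return format are my own assumptions:

```python
# Illustrative sketch of the note-to-visual mapping: white keys each
# select a different alien-landscape video, black keys select image
# filters. The real project does this inside the Processing sketch
# reading MIDI from the IAC driver.

# Pitch classes (note number mod 12) of the white keys: the C major scale.
WHITE_KEYS = [0, 2, 4, 5, 7, 9, 11]
BLACK_KEYS = [1, 3, 6, 8, 10]

def action_for_note(midi_note):
    """Map a MIDI note number to ('video', index) or ('filter', index)."""
    pitch_class = midi_note % 12
    if pitch_class in WHITE_KEYS:
        # White keys: trigger the video assigned to this key.
        return ("video", WHITE_KEYS.index(pitch_class))
    # Black keys: apply the image filter assigned to this key.
    return ("filter", BLACK_KEYS.index(pitch_class))
```

For example, middle C (note 60) would select the first video, while the C# above it (note 61) would apply the first filter; the mapping repeats in every octave.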
(I’m aware the piece itself is not assessed in this module, but it’s important to reference the work as it was the core component of my role in the Ex Nihilo live performance.)
Beyond the live-streamed 4-hour endurance performance using the aforementioned setup at ACA, I was involved in various other aspects of the overall project, such as the creation & editing of the visuals that graced the outside of the inflatable planetarium.
(Archived stream video)
We created these visuals initially via practical FX, using a fish tank, some LED lighting, and coloured Indian inks dropped into the water. Pete, Kiran, Ning, Katie & I all took part in this.
(Making practical FX)
As for making the raw visuals into usable videos, this was solely my responsibility: I took on the role of editing and manipulating all pre-fabricated video content involved in the project, including the visuals seen through the telescope. So while Pete wrote the code & Kiran fabricated the mechanism for the rotary encoders, I provided the video/audio content.
(Edited dome visuals)
(Archived stream video)
The above (very informal) video is of myself and Pete Haughie testing both the live stream on YouTube and the telescope whilst working in the studio together. Towards the end of the video, you can see some of the aforementioned video content inside the scope. I also aided in setting up the ballroom space by helping to inflate the planetarium, assisting in positioning the projector & experimenting with lighting inside the dome.
When all of the above was done & ready for the show, it was just a case of travelling to ACA the next day and setting everything up at that end. Prof John Bowers & I travelled to Hexham to meet up with Katie at around 09:40, and we all travelled in the car together to ACA, where Liam Slevin arrived at around 10:00. We subsequently spent the rest of the day setting up the room at ACA, & by the time 7 pm came about we were ready to go.
(Archived stream video)
Since this was the first project of this nature that I’ve taken part in, and having no previous experience of durational endurance performances, I’d say the whole thing went rather well. Liam, Katie & I managed to communicate musically (& successfully, I might add) for 4 straight hours, as shown in the video above, which is no mean feat considering we’d only been able to fit in one rehearsal prior to the event.
It was a long journey to arrive at that point, and not without its bumpy patches. The compartmentalisation of roles hindered communication somewhat within the group & perhaps with the venue, something that shouldn’t have been an issue but was. This was actually rectified in the last few days with the appointment of a producer (Katie), who did a smashing job of pulling everybody together.
Had we appointed a producer at the start, this wouldn’t have been an issue. Beyond communication issues, everyone performed their roles very well and the project came together nicely. Since I was in Allenheads live streaming at the time, I was unable to witness the installation at Culture Lab myself; however, I’m told we had a turnout of between 190 and 200 people & received some great feedback.
So, communication issues aside, I feel this project has greatly enhanced my experience in many areas. I’ve grown much closer with the classmates involved and would love to work with them again, namely Pete, Katie, Liam, Kiran, and Ning.
It would seem ACA are very happy with the outcome, which is fantastic news; given the chance to work with the venue again, I’d do so. The offer to return for future work was in fact made by Allen, who asked us to show both the telescope & my “machine” in their gallery.
So, in summary, I am very happy with the project’s outcome & have a newfound appreciation for how challenging it can be to play semi-improvisational music for extended periods of time.
There is an argument that big business is utilising hackathons/maker spaces to extract free labour and new ideas from dedicated budding makers/artists, those who are unable to achieve their aims through a lack of funding for making & sustaining technology-based art pieces.
Maker spaces often democratise art for a wider community of artists, such as the maker space in Newcastle under Commercial Union House. (Look up Dominic Smith on the subject of hackathons.) Maker spaces and hackathons are predominantly a source of good for those in the maker community & others who benefit from their feats.
However, big business can see this as an opportunity for exploitation. Recognising creatives’ lack of funding & lax attitude towards copyright & patent law, big businesses such as Apple or Google establish these opportunities for budding makers, who are altogether far too tempted to say no, which may come at a heavy price.
Anthropomorphisation is defined as the attribution of human qualities, such as personality or physical appearance, to non-human entities or even objects & technologies. So does a “public” strictly have to be solely human, or can it be otherwise?
Humans seem to exist within an interesting & contradictory state of being able both to anthropomorphise “non-humans” and to do the opposite through the political dehumanisation of other races and/or faiths; take, for example, Nazi propaganda that referred to Jewish people as “rats” or “vermin”.
Or more recently when Trump referred to immigrants as “Animals”.
I’ve written a song about this sad state of affairs:
This is often done by far-right figureheads as a means to subvert public opinion to allow for tolerance of great evils, usually in the guise of normality. And yet we seem capable of imparting great sentimentality & affection on specific subsections of animals or inanimate objects/technological artefacts.
With the recent upsurge in AI, the space between ourselves and technology would seem to be narrowing. Were you to perform a song (such as the one in the link above) to your Google Voice Assistant or Siri, would they then constitute a public? Some would argue yes, though they may be an audience of the heckling kind, especially when they try to find the song lyrics online.
• Visibility = making the process of building technology-based pieces visible to the public.
• Accessibility = making the process of building technology-based pieces accessible to the public.
• Accountability = Showing accountability for the work through the openness of the process.
The plan can be informed by the equipment. No predefined outcome apart from a list of possibilities. This can be an exercise in improvisation, with the pre-established notion that “success is a subjective concept.”
Rather than a predefined concept, the focus is on exploratory creativity & the air of welcoming the unknown, whilst being prepared for it; through an in-depth understanding of your chosen tools.
A telescope is placed in the Ballroom at Culture Lab housed within a planetarium. This telescope is pointed towards unknown worlds. Through this telescope, it will be possible to view performance(s) happening at the Allenheads Contemporary Arts space in the Old School House.
The accompanying background audio will be built upon semi-generative audio based on sounds captured via VLF (Very Low Frequency) radio transmissions, allowing us to perceive that which we ordinarily couldn’t.
In conjunction with this audible electromagnetic background, there will be four improvised endurance-length AV performances. Inside the dome, as the viewer rotates the telescope horizontally, they will be presented with different viewpoints from the ACA performance; if the telescope is rotated vertically, visual effects will be applied to the view.
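One plausible way to realise this interaction is to divide the horizontal rotation into equal sectors, one per camera feed, and scale an effect by the vertical tilt. This is a hypothetical Python sketch of that logic, not the actual telescope code (which Pete wrote, driven by Kiran’s rotary encoders); the number of viewpoints and the function names are my own assumptions:

```python
# Hypothetical sketch: mapping rotary-encoder readings (in degrees)
# onto the telescope's behaviour described above.

NUM_VIEWPOINTS = 4  # assumed number of camera feeds from ACA

def viewpoint_for_heading(heading_degrees):
    """Horizontal rotation: divide 360° into equal sectors, one per feed."""
    sector_size = 360 / NUM_VIEWPOINTS
    return int(heading_degrees % 360 // sector_size)

def effect_amount(pitch_degrees, max_pitch=90):
    """Vertical tilt: map 0..max_pitch degrees to an effect level 0.0..1.0."""
    clamped = max(0, min(pitch_degrees, max_pitch))
    return clamped / max_pitch
```

Sweeping the telescope through a full horizontal circle would then step through each ACA viewpoint in turn, while tilting it upward would fade the visual effect in proportionally.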
Outside the dome, there will be science-fiction-style special effects projected onto the big screen in Culture Lab, and the impression of a planet projected onto the surface of the dome (AKA the inflatable planetarium).
// elevator pitch
A telescope within a planetarium pointed at the surface of an unknown exoplanet; its viewfinder a visual pathway to other worlds, places of strange sounds, myths… and the means of traversing into the unknown.
Come and join us in this exciting project where you will be able to witness both pre-recorded and live audiovisual performances streamed from Allenheads direct to Culture Lab, Newcastle.
Produced in collaboration with Allenheads Contemporary Arts as part of their Beyond project.
Pete Haughie // Ryan Boyle // Katie Oswell // Liam Slevin // Nicholas Cooke // Kiran Pearce // Ning An.
As stated in my last post, the concept I held in my initial performance proposal for the ACA project did not survive the crucible of group discourse, nor did anybody’s initial proposals for that matter. The group (Triskel) comprises myself, Pete Haughie, Katie Oswell, Kiran Pearce, Liam Slevin, Ning An and Nicholas Cooke (who wasn’t able to make it).
After all, meeting as a group at ACA, we were always destined to spend the best part of 16 hours consuming our body weight in tea, whilst chomping through a smorgasbord of unrefined ideas.
However, after the agony of working our way through abandoned epiphany after abandoned epiphany, and after so much gruelling debate then drunken philosophy (not to mention the odd bad joke, thrown in for good measure) comes the clarity of a new morning. We reconvened after a short sleep.
Once all were feeling (a stone’s throw from) bright-eyed and bushy-tailed, we started with a blank slate. We began once more to utter whispers of what we could offer to the project. We explored the proposed space, taking advantage of having access to the telescopes & the inflatable planetarium.
The guardians of ACA (Allen and Hellen) talked through their own vision of the event, then we all collectively ironed out the details with yet more precision, finally landing on a concept we were all pleased with, which I’ll discuss in my next blog post.
The above video comprises photographs, blended in Processing, taken whilst the group explored the space and equipment available at ACA. The accompanying sound was sampled as part of a Live Electronic Performance module.
Below is an initial performance proposal (mostly scrapped) that was put to the collaborative group. We’re currently co-developing an installation as part of the Late Shows with Allenheads Contemporary Arts, or ACA as they’re otherwise known, as part of ACA’s “Beyond” project.
Beyond performance proposal.
A Broken Translation: a series of alien landscapes/concept images in response to stimulus scraped from the internet on the subject of alien/terrestrial mythologies or interplanetary imaginings.
Taking the form of a sketch accompanied by sampled noise, acting as the backdrop for a spoken-word performance which personifies the entity that is “The Broken Translator”, an unhinged alien AI construct with the sole purpose of making contact with technologically advanced civilisations.
By utilising television/radio broadcast signal analysis, the AI construct alters its appearance and language based on assumed traits to better facilitate its role as an interplanetary ambassador, but the AI often gets things comedically or tragically wrong.
The AI functions through deliberate cultural appropriation on a global scale, using any information scraped from these broadcasts for the sake of interspecies communication, with a regular muddling of dialect.
The backdrop images will portray a surreal sense of familiarity whilst seeming totally foreign to the observer. These images may include portrayals of deities or all-powerful beings of alien origin, sourced from the stimulus of the aforementioned alien and terrestrial mythologies.
These images will be representations of other worlds the AI construct is simultaneously visiting/viewing. They will represent the functioning of the signal-analysis process, scouring the vastness of space for potential targets, whilst also reminding observers that there is so much unknown in the great beyond, deepening the sense of bi-locational communication.
The Broken Translator will hijack localised public screens to interact with a chosen ambassador for any population (via two-way live streaming): interplanetary FaceTime.
Throughout the performance, “The Broken Translator” will attempt to maintain a conversation but will be erratically emotional and unstable in its use of language, through an inability to grasp the contradictory nature of human behaviour.
Image created by Ryan C. P. Boyle, using photo manipulation.