
Max Fragmentation Reflection

I began this project by looking into the concept of fragmentation. From this I thought I would look into how to redraw fine art “masters” images, still lifes in particular. This led me to look at counter, pack, and getcell/setcell objects… after unpacking the numbers I thought I could play about with the order of pixels, rearranging them or producing music (noise). After a bit of playing around my idea evolved. The sketch below shows how I was unpacking pixel data from an image.

[Sketch: unpacking pixel data from an image in Max]
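
The actual unpacking was done as a Max/Jitter patch, but as a rough illustration of the same idea, here is a minimal Python sketch: it unpacks the RGB values of an image into a flat list of cells (roughly what the counter/getcell arrangement was stepping through) and then reorders the pixels by brightness to produce a fragmented version of the picture. The file name still_life.jpg and the use of the Pillow and numpy packages are assumptions for illustration, not part of the patch.

```python
import numpy as np
from PIL import Image

# Load the image and flatten it into an "unpacked" list of RGB cells,
# roughly analogous to stepping through a jit.matrix with getcell.
img = np.array(Image.open("still_life.jpg").convert("RGB"))  # hypothetical file
h, w, _ = img.shape
pixels = img.reshape(-1, 3)

# Reorder the pixels by overall brightness and reassemble them,
# giving a fragmented/rearranged version of the still life.
order = np.argsort(pixels.sum(axis=1))
fragmented = pixels[order].reshape(h, w, 3)

Image.fromarray(fragmented).save("still_life_fragmented.jpg")
```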

With the evolution of my idea came the desire to capture some sort of image or movie that would allow me to fragment reality in some way, playing with the concepts of time and place. This coincided with a visit to Skye, an island, or ‘fragment’ if you will, off the coast of Scotland. Looking out of the window there, I thought about how a day passes unnoticed unless one rests in the same location; if one stays still for long enough one can notice the subtle changes and evolution of a day. A nice way to capture this rotation is a timelapse: a video made out of many still images (or fragments) that are placed together to produce a representation of that day. This condenses the day into fragments that are strung together, playing back a sped-up version of the day with a sense of accelerated animation.
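
The timelapse itself was shot with a camera rather than built in code, but as a hedged sketch of the “stills strung together” idea, the Python/OpenCV snippet below assembles a folder of stills into a video. The folder name skye_frames, the output name timelapse.mov, the 25 fps playback rate and the shooting interval mentioned in the comment are all assumptions for illustration.

```python
import glob
import cv2

# Gather the still frames (hypothetical folder) in the order they were shot.
stills = sorted(glob.glob("skye_frames/*.jpg"))
h, w = cv2.imread(stills[0]).shape[:2]

# Write one frame per still at 25 fps: if a still were taken every 10 seconds,
# the day would play back roughly 250x faster than real time.
writer = cv2.VideoWriter("timelapse.mov",
                         cv2.VideoWriter_fourcc(*"mp4v"), 25, (w, h))
for path in stills:
    writer.write(cv2.imread(path))
writer.release()
```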

After careful consideration, the window that I was staring out of also came to play a key part, as it shields my vision of the entire vista, only showing me a framed version of the world, like a racehorse wearing blinkers to keep it from viewing its surroundings. This window frame only allows a small representation of the entire outside world. A glimpse into the external.

This got me thinking about how I could use my new-found knowledge of Max to unpack a timelapse movie and produce further fragmentation within it, rearranging it via RGB values or luma, or even allowing it to play via the ambient noise that the microphone picks up. If I could manage to implement this I would hit my goal of not only playing about with fragmenting time but also allowing the patch/user to take that fragmentation further.

The patch above shows how the amplitude of the mic controls the playback of the video. I’ve also included a drop box so that other files can be added, giving a choice of footage. This was one of my starting points for how to further fragment the representation of a timelapse.
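
The mapping itself lives in the Max patch, but a hedged Python sketch of the same amplitude-to-playback idea might look like the code below: the RMS level of short blocks of microphone input is mapped to a frame position in the timelapse, so louder ambient sound scrubs further through the day. The file name timelapse.mov, the scaling factor and the sounddevice/opencv-python packages are assumptions for illustration only.

```python
import cv2
import numpy as np
import sounddevice as sd

cap = cv2.VideoCapture("timelapse.mov")  # hypothetical file name
total_frames = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))

def mic_level(duration=0.05, samplerate=44100):
    """Record a short block from the default microphone and return its RMS level."""
    block = sd.rec(int(duration * samplerate), samplerate=samplerate,
                   channels=1, blocking=True)
    return float(np.sqrt(np.mean(block ** 2)))

while True:
    level = min(mic_level() * 20.0, 1.0)           # crude scaling into 0..1
    frame_index = int(level * (total_frames - 1))  # louder ambience -> later frame
    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_index)
    ok, frame = cap.read()
    if ok:
        cv2.imshow("fragmented timelapse", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):          # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```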

After playing about with this idea I realised that I should include the possibility of allowing the user to see fragments within the frame, details of the scene unfolding before them. I then produced a timelapse of the tide coming in on the Isle of Skye. I shot this deliberately through a window, both to show a viewpoint onto the world and to hark back to the theory that technology should be transparent and that design should be a window, an interface that you do not see. I then captured timelapse details of this scene to be viewed as fragments of the whole, broken down into details, and developed several more drop boxes that allow detailed footage of rocks and waves to be screened alongside the main timelapse footage.

It was then that I also pulled in my previous patches working with RGB and luma to control the playback of the movie, giving an interface to control the playback of this fragmented day and allowing the fragmentation to increase or decrease depending on the user’s wishes. This gives the user an abstract sort of control, where their touch and sounds drive one or all of the screens. I’ve included several toggles to let the user turn functions on and off, allowing them to find an aesthetic they like or simply fragment individual screens further.
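
As one possible reading of the luma side of this (again a sketch, not the Max implementation), the snippet below ranks the frames of a short timelapse by their mean luma and plays them back in that order, so the day is re-fragmented by brightness rather than by time. The file name timelapse.mov and the opencv-python/numpy packages are assumptions for illustration.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("timelapse.mov")  # hypothetical file name
frames, lumas = [], []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    b, g, r = frame[:, :, 0], frame[:, :, 1], frame[:, :, 2]
    lumas.append(float((0.299 * r + 0.587 * g + 0.114 * b).mean()))  # Rec. 601 luma
    frames.append(frame)
cap.release()

# Play the frames ordered by brightness instead of by time
# (all frames are kept in memory, so this suits a short clip).
for i in np.argsort(lumas):
    cv2.imshow("fragmented by luma", frames[i])
    if cv2.waitKey(40) & 0xFF == ord("q"):
        break
cv2.destroyAllWindows()
```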


I would say that I’m pretty happy with the results of my sketch, with the user allowed to play with the options and/or their ambient input. I think the overall aesthetic of the piece could be evolved to communicate my idea better, but for the main part I am happy with achieving my goal of fragmenting time and view. I think playing about with other windows or pop-ups would be beneficial, maybe allowing face position or mouse position, as well as ambient mic noise, to bring up details of the scene or drive further screens. It would be worthwhile chatting to Adrian about this, as he successfully implemented facial recognition to change the viewpoint when the viewer moved. My idea would be to produce more windows/frames if the viewer moved or increased their contribution. A possibility for evolution.

Max Fragmentation 2

After thinking through this title and playing about with Max, it seems that the idea that all things are fragmented when displayed (working with the quote from William Vaughan in his essay ‘History of Art in the Digital Age: Problems and Possibilities’) means that my idea of fragmenting a scene or artwork would really complement this theory, further fragmenting the artwork and/or image.

My thoughts have moved on to finding a manner of fragmenting reality that plays with the concepts of time and place. This coincides nicely with my theoretical research on windows and mirrors and the idea of user contribution/experience.