Category: Unit 1: Final Works / Practice-based Research

Signals – Computer-based Environment

These stills are from the most recent and most successful version of “Signals”. It is a browser-based version with mouse interaction, written in JavaScript using Three.js. I will incorporate the interactive version into the website I hope to release before the end of the year.
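A minimal sketch of the kind of mouse hookup this involves, assuming the standard Three.js pattern of converting the mouse position to normalized device coordinates before raycasting; the function name and canvas size are illustrative, not taken from the actual piece:

```javascript
// Hypothetical sketch: map a mouse event's pixel position to
// Three.js normalized device coordinates (NDC) in [-1, 1],
// with +y pointing up as Three.js expects. This is the usual
// first step before THREE.Raycaster picks objects under the cursor.
function toNDC(clientX, clientY, width, height) {
  return {
    x: (clientX / width) * 2 - 1,
    y: -(clientY / height) * 2 + 1,
  };
}

// In the real page this would feed a raycaster, roughly:
//   const ndc = toNDC(event.clientX, event.clientY,
//                     window.innerWidth, window.innerHeight);
//   raycaster.setFromCamera(ndc, camera);
//   const hits = raycaster.intersectObjects(scene.children);
```

For example, the top-left corner of an 800×600 canvas maps to (-1, 1) and the bottom-right corner to (1, -1).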

[Screenshots of the browser-based “Signals” environment, 09/11/2015]

15102015 Task Systems Showcase

Last Thursday, I showed some new work at an event for Task Systems and Pantone. The theme was Colour in Design, which gave me a starting point outside my comfort zone.

Here are some images:

[Installation photos from the Task Systems showcase]

These are some pretty terrible photos of the display; I’m waiting for the photographer to get back to me with the high-resolution ones.

This gives an idea of the layout.

The projection below was a reactive installation built with JavaScript and a Leap Motion controller, moving on from my work “Mixed Signals”. The Leap allows the audience to move through the space. Again, it aims to visualise the “behind the screen” of internet traffic, though of course this is still only an abstract visualisation and doesn’t yet represent any actual live data.
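The Leap hookup can be sketched roughly as below, assuming the leapjs library’s per-frame callback; the scaling constants and function name are illustrative, not the values used in the installation:

```javascript
// Hypothetical sketch: map a Leap Motion palm position
// (millimetres above the device, as leapjs reports it) to a
// camera offset in scene units, so a hand moving over the
// sensor moves the viewer through the space.
function palmToCameraOffset(palm, scale = 0.01) {
  const [x, y, z] = palm;
  // Centre the vertical axis around ~200 mm, a comfortable
  // hover height (an assumed constant, not measured).
  return { x: x * scale, y: (y - 200) * scale, z: z * scale };
}

// In the installation this would run once per tracking frame:
//   Leap.loop(frame => {
//     if (frame.hands.length > 0) {
//       const o = palmToCameraOffset(frame.hands[0].palmPosition);
//       camera.position.set(o.x, o.y, o.z + 5);
//     }
//   });
```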




Mixed Signals 29092015

[Stills from the interactive JavaScript model of “Mixed Signals”, 29/09/2015]

This is clearly very similar to some of the work I’ve been doing this year, visualising online connections. It also directly contradicts what I wrote at the beginning of the term, but I couldn’t help myself. I’m looking into using Twitter’s public APIs to integrate live feed information based on conflict. It’s an ongoing struggle, especially as I come to terms with learning to code…
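One possible shape for that live-data step, sketched under assumptions: tweet texts arriving from a stream (e.g. a library such as twit wrapping Twitter’s `statuses/filter` endpoint) get filtered against a keyword list before feeding the visualisation. The keyword list and function name here are purely illustrative:

```javascript
// Hypothetical sketch: tag incoming tweet text with any
// conflict-related terms it contains, so the visualisation can
// weight each incoming signal. The term list is a placeholder.
const CONFLICT_TERMS = ['conflict', 'protest', 'ceasefire'];

// Return the matching terms (possibly empty) for one tweet.
function matchConflictTerms(text) {
  const lower = text.toLowerCase();
  return CONFLICT_TERMS.filter(term => lower.includes(term));
}
```

A stream handler would then drop tweets where the returned list is empty and pass the rest into the render loop.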

However, these are stills from an interactive JavaScript model of the work. The next step will be to use the Leap Motion to interact with the program. I’m looking to exhibit some prints in an exhibition this month, and I’m thinking I may use the format from the first two stills, as there is almost a sense of conflict between the two sides. Ultimately, though, I’d like to project the piece and give the viewer some sort of interaction, whether via a mouse, a Leap Motion, or something else. I still haven’t successfully integrated my figurative and architectural work with these systems, but I thought I’d display this version anyway. Again, this isn’t what I hoped for; I want to move towards more meaningful displays of information, and this is proving harder than I thought. Nonetheless, I still feel a sense of progress from previous versions of the work.