Sean Clark's Blog
Interesting things are happening with Virtual Reality again. While I'm still not sure that it will be more than a niche technology in the long run (I think 'augmented' is more interesting than 'virtual'), it is quite fun to see something I was experimenting with over 20 years ago back in the spotlight.
Two approaches are popular at the moment. One is the classic VR 'goggles' system, like the Oculus Rift, in which a special stereo display and head tracker are worn by the user and driven by a computer. The other is to take advantage of the high-quality display, position tracker and computer many of us already have in our pockets - in the form of a smartphone - and simply provide a housing and lenses to enable it to be used as a 3D display. The best-known version of the latter approach is the Google Cardboard - literally a cardboard holder with plastic lenses and a control switch.
You might think that dedicated VR goggles would have significant advantages over the smartphone approach. This is true to an extent, but the power and quality of modern smartphones is such that the difference is actually much smaller than you might expect.
I have been experimenting with various Google Cardboard designs and have managed to source lenses for as little as £2.50 a pair and whole kits on eBay for £6 (or a bit more from here). On the whole they all work pretty well, but can be a little flimsy.
To find something more robust I have been looking at plastic and 3D-printed alternatives. My favourite so far is the OpenDive by Durovis, a 3D-printable version of the commercial Durovis Dive product - a smartphone VR housing that actually predates the Google Cardboard.
While it doesn't have the magnetic switch of the Google Cardboard (although one can easily be added), it does the same basic job - that is, hold your smartphone in front of some lenses. It also 3D prints pretty well - albeit taking almost 5 hours to produce on my Replicator. What's more, even with the lenses and an elastic head strap the total cost of the project is less than £5.
Of course, hardware is only one part of the system. Durovis also provides a free Unity plug-in that does head tracking using your phone's position tracker and renders your 3D model as two side-by-side views for stereo vision. This allows you to use the Unity game engine (a free version is available) to create interactive 3D worlds and compile them for Android and iOS.
This results in a complete 'home-brew' VR system that you can experiment with for the cost of a large latte or two at Starbucks! What's more, the Unity environment is also compatible with the Oculus Rift (although you have to upgrade to the Pro version) so the new skills you develop will be transferable to this VR environment.
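Out of curiosity I worked out roughly what the side-by-side rendering has to do. The sketch below is my own illustration of the idea, not Durovis code: given a yaw-only head orientation, each eye's camera is offset half the interpupillary distance (IPD) along the head's 'right' vector, and each then renders to one half of the screen.

```python
import math

def eye_positions(head_pos, yaw, ipd=0.064):
    """Offset each eye by half the interpupillary distance (IPD)
    along the head's 'right' vector, derived here from yaw alone.
    head_pos is an (x, y, z) tuple; yaw is in radians; ipd in metres."""
    # With forward = (sin(yaw), 0, cos(yaw)), the right vector is:
    right = (math.cos(yaw), 0.0, -math.sin(yaw))
    half = ipd / 2.0
    left_eye = tuple(p - half * r for p, r in zip(head_pos, right))
    right_eye = tuple(p + half * r for p, r in zip(head_pos, right))
    return left_eye, right_eye
```

In Unity terms this amounts to two cameras parented to a head-tracked object, each with its viewport set to one half of the screen - which is essentially what the plug-in sets up for you.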
Field Broadcast is an innovative arts platform that connects artists, audiences and obscure locations through live video broadcasts. It's a really successful project that has used a Windows and Mac app for the last few years to alert people when a broadcast is starting and then deliver the video stream to their desktop.
For the latest series of broadcasts Cuttlefish was asked to create a mobile app for Android and iOS that would allow mobile users to receive alerts and broadcasts on their smartphones. The Android app went live a couple of weeks ago, and the iOS app was finally approved by Apple this weekend.
I've been using the app myself to tune in to the recent broadcasts and have to say I'm hooked. When the broadcast alert arrives you never quite know what you are going to see. It could, literally, be a transmission from a field, or - as is the case with some of the current broadcasts - from a riverside, or even from a rowing boat.
I've taken a few screen grabs of the recent broadcasts that you can find on my Flickr here. To see the next live broadcast yourself you can download the desktop or mobile apps from www.fieldbroadcast.org. All downloads are free.
Last night was the private view of the Automatic Art exhibition at the GV Art Gallery in Marylebone, London. The show was curated by Ernest Edmonds and presents 50 years of British art that is generated from strict procedures.
The artwork on display ranged from constructivist sculptural forms, through systems-based paintings and drawings, to computer-based and interactive artworks. It was put together in a very coherent way, with background materials and supporting information, and made full use of the multi-level and multi-room gallery space.
The private view was very well attended, with many of the artists involved in the show in attendance. This provided an opportunity for me to catch up with quite a few friends and colleagues from over the years. These included people from LUTCHI (the research centre at Loughborough University where I began my graduate career in 1989), William Latham (who designed cover art for The Shamen in the 1990s, and whose first website I built), friends associated with the Computer Arts Society and present-day colleagues from the IOCT at De Montfort University. It ended up being a very enjoyable night - just a pity I had to get a train back to Loughborough at 10pm!
The exhibition is open to the public from today (Friday 4 July) and ends on Saturday 26 July 2014. Entrance is free. My pictures from the set-up and opening can be found here on Flickr.
The full list of artists featured in the exhibition is Stephen Bell, boredomresearch, Dominic Boreham, Paul Brown, John Carter, Harold Cohen, Nathan Cohen, Trevor Clarke, Ernest Edmonds, Julie Freeman, Anthony Hill, Malcolm Hughes, Michael Kidner, William Latham (picture attached), Peter Lowe, Kenneth Martin, Terry Pope, Stephen Scrivener, Steve Sproates, Jeffrey Steele, Susan Tebby and myself.
We've been doing a fair amount of work with 3D printing at Interact Labs over the past few months. More recently we have started to get into 3D scanning to help us create models for printing. One of the easiest 3D scanning technologies to work with is 123D Catch from Autodesk. This software enables you to "scan" an object by simply taking photographs of it from multiple angles. Once you've taken your pictures you upload them to the Autodesk server, where they are processed and a 3D model is produced. The output is normally fine for 3D printing (and the service is free for non-commercial use).
We wondered how these models might look if, rather than being printed, they were converted to a format suitable for viewing in our Oculus Rift Virtual Reality headset. Using a free 3D modelling program called OpenSpace3D we've been doing just that and the results have been impressive.
We've been scanning a combination of building exteriors and small and large objects and have been combining them to produce little "worlds" that can be viewed on the Oculus Rift. While they are not quite as "real" as the real things, they look good and 123D Catch manages to capture the visual detail, as well as getting the shapes of the objects pretty much spot on.
Check out some pictures of this work-in-progress here on Flickr. The stars of this particular set of images are "Sockman", a public artwork in the centre of Loughborough, and a swan statue from Loughborough park.
I've been doing some work at Interact Labs recently with pioneering UK digital artist Paul Brown. Paul has been creating computer-based artworks since the late 1960s and was looking for some support in recreating a couple of early artworks for an exhibition.
The first of these was an electronic piece that involved sequencing lights and a tone generator that Paul first showed in the 1970s. While the electronic components needed to remake the piece are still available, modern components don't have the "aesthetic" appeal of the originals. Local electronics expert Tony Abbey was brought in to help and was able to source some lovely old transistors, resistors and capacitors that enabled him to recreate a great-looking new prototype of the artwork. The only real difference from the original is the use of LEDs rather than the old filament bulbs of the time. Paul will now be constructing a further ten versions himself.
The second was a computer-based artwork that originally ran on an early 'framestore' computer. Paul had re-coded the work in the Processing language. Processing allows you to save your program as a Linux-compatible version and Paul wondered if it might be possible to run the artwork on a Raspberry Pi. As well as having the ability to run the program, the Raspberry Pi has a composite video output that Paul hoped would look good on some early video monitors he had acquired. With a bit of configuring this worked like a dream. An artwork that once ran on a computer the size of a car is now running on one the size of a pack of cards!
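For anyone wanting to try something similar: on the Pi the composite output is configured in /boot/config.txt. The values below are illustrative settings for a PAL monitor, not necessarily the exact ones we used - check them against your own Pi model and display.

```ini
# /boot/config.txt - composite video output (illustrative values)
sdtv_mode=2      # 0 = NTSC, 2 = PAL
sdtv_aspect=1    # 1 = 4:3, to suit an old video monitor
```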
Addressing the problem of how to preserve early digital artworks is going to be of increasing importance in the coming years. Technology is constantly changing and older technology becomes 'redundant' surprisingly quickly. Even I am finding that the only way to keep my artworks from the late 1990s running is to keep a stock of old computers running 1990s operating systems.
I'm just back from Cardiff setting up an artwork as part of Genetic Moo's latest Microworld event at Arcadecardiff in the Queens Arcade. The piece forms part of a collection of artworks that interact with each other as well as the visitors to the gallery. My piece is a triptych of self-organising grids that swap colours with each other as well as incorporating new colours that it 'sees' in the gallery. The work develops over time, subtly reflecting the history of interactions in the space.
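As a toy illustration of the idea (this is not the code of the piece, and the function names are my own), two grids that exchange colours with each other and absorb a colour 'seen' in the gallery could look like this:

```python
import random

def exchange(grid_a, grid_b):
    """One exchange step: a randomly chosen cell in one grid
    swaps its colour with a randomly chosen cell in the other."""
    i = random.randrange(len(grid_a))
    j = random.randrange(len(grid_b))
    grid_a[i], grid_b[j] = grid_b[j], grid_a[i]

def see(grid, colour):
    """Absorb a colour 'seen' in the gallery by overwriting a
    random cell, so the grid slowly reflects its visitors."""
    grid[random.randrange(len(grid))] = colour
```

Run exchange() repeatedly between the panels, call see() whenever a new colour is picked up, and each grid's palette gradually becomes a record of what has happened around it.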
Other pieces on display include a collection of Genetic Moo's responsive, highly organic, images and systems; a piece by roboticist Sean Olsen; and works by interactive/digital artists Myles Leadbeatter, Banfield & Rees, Matthew Britten and others. The show is open to the public from 26th May until 1st June, 12pm until 6pm.
I have a number of exhibitions coming up and will be showing some new works that form part of my new major project. One of the important outcomes of my research over the past few years has been to define what I have come to call "Interconnected Digital Art Systems", or "Digital Art Ecologies". This involves a configuration of digital artworks that are designed to interact with each other as well as their audience. You will have seen this idea developing through my Interact Gallery exhibitions, my Symbiotic exhibition with Genetic Moo towards the end of 2012 and my work on ColourNet with Ernest Edmonds in 2013.
My new work takes this idea further by making all of my artworks part of an interconnected system. I expect almost all artworks I produce over the next few years (at least) to be connected to each other - be it via exchange of light and sound through a physical space, or the exchange of data over the Internet. I have quite a few ideas how to realise this and will be starting with a new piece at Genetic Moo's Microworld exhibition in Cardiff next week, followed by a related piece at the Automatic Art exhibition at the GV Art Gallery in London in July.
For these two works I will be continuing with the grid of colours theme used in ColourNet, but plan to introduce new structural forms for a planned exhibition at the Kinetica Art Fair in London in October.
Remember, everything is connected!
In 2012 the Interact Gallery hosted a series of rehearsals and a performance for Ximena Alarcon's "Network Migrations" project. The project involved a joint vocal performance between two groups of people linked via the Internet, one group in Leicester and the other group in Mexico City - a distance of around 5,500 miles. Ximena's research paper about the project has now been published in "Liminalities: a Journal of Performance Studies". You can read the paper and find videos and audio from the project here. My documentation of the event can be found here on the archived Interact Gallery site.
The Automatic Art exhibition at the GV Art Gallery in London presents 50 years of British art that is generated from strict procedures. The artists featured make their work by following rules or by writing computer programs. The works range from systems-based paintings and drawings to evolving computer-generated images. I'm rather pleased to be having a piece of work in the show - especially given the company I am going to be in.
The exhibition runs from Friday 4 July and ends Saturday 26 July 2014. The Private View is on Thursday 3 July 2014, 6-9pm. See http://www.gvart.co.uk for more information.
I'm just back from a five-day course at Schumacher College in Devon with writer Fritjof Capra. The subject of the course was Fritjof's new book, co-authored with Pier Luigi Luisi, The Systems View of Life: A Unifying Vision.
The book builds on Fritjof Capra's earlier work on the importance of 'systemic thinking' and the urgent need for a global shift in perception from a mechanistic world view to a more holistic one. Written as a textbook for students, it places systems thinking in a full historic and contemporary context. There is a particular emphasis on Maturana and Varela's theory of the self-generating nature of living things, often called the theory of "autopoiesis".

Like Maturana and Varela's book The Tree of Knowledge, Capra and Luisi show how autopoietic theory can be applied to all levels of living systems (from single-celled organisms, to multicellular life, consciousness and social systems). Unlike Maturana and Varela's book (which is still excellent), it presents the theory in a way that is easy to grasp in a single read. This is an achievement in its own right!

Guest essays - including one by Animate Earth author and Schumacher College faculty member Stephan Harding - are used to provide even more context. A large section on "Sustaining the Web of Life" shows how we could use the concepts discussed to address the major environmental and social issues currently facing us. There is an extensive bibliography for those wanting to read more about any of the topics discussed.
Fritjof Capra's first book was The Tao of Physics, and if you have only read this then this new book really should go next on your reading list - it easily matches The Tao of Physics in importance and will bring you up to date with almost 40 years of Capra's thinking. If you have followed Fritjof Capra's work since The Tao of Physics, then this book covers familiar territory, but its level of coherence and clarity will give you a fresh take on the systems world-view - and the practical solutions he gives will offer you (at least some) confidence that our planet could have a future with us on it.
I've put a few pictures from the course on my Flickr page.