Sean Clark's Blog
I went to the opening of the "A-Eye" exhibition at Goldsmiths College in London on Monday. It was organised as part of the 50th Artificial Intelligence and Simulation of Behaviour (AISB) convention that is running all this week at Goldsmiths.
The theme of the exhibition was "art and nature inspired computation" and it featured 35 artworks from a broad selection of national and international artists. The work was a mixture of prints, screen-based work, multimedia and some interactive pieces.
It was opened by California-based British artist Harold Cohen, a true pioneer who has been active in computation-based arts since the 1960s. Harold is particularly known for his work with Aaron - a computer system that he has been developing for many years. He uses Aaron to generate drawings, many of which form part of his paintings. A large such piece, entitled Another Spring (for A.C.), was on display in the exhibition.
Although I had seen Harold speak before, we had not met. This time, however, I had been introduced to him by email prior to the event and we managed to have some time together talking about his work. This provided me with some great insight into his process. I was particularly taken by his shift from seeing Aaron as an autonomous "artist" to more of a "collaborator" who helps him access creative ideas that might otherwise be unavailable to him.
I completely agree with this. I have seen quite a bit of artwork "created" by computer, and even met people who claim that their computer systems are making some sort of creative decisions in the production of such work. While I agree that the computer's role in the creation or realisation of an artwork can certainly go beyond that of simply being a tool (like, maybe, a camera or a brush), at present (and maybe forever) art made for humans needs to involve humans. I don't think computers can make art for humans that has any meaning beyond simple aesthetics. Basically, I don't think computers can be artists. However, the computer as collaborator is a much nicer aspiration. It's also one that Harold - with 40 years of experience with Aaron - is pretty well-placed to argue he has achieved.
The A-Eye exhibition at Goldsmiths runs until the end of Thursday, so if you are in the area I'd recommend a visit. There is plenty of other interesting work to see and I may write a bit more about it here. I have uploaded pictures of some more of the pieces to my Flickr page.
The Oculus Rift Head Mounted Display is making the news a lot these days, and rightfully so - we have one in the office and it's an impressive bit of kit. The device allows you to enter a computer generated 'world' by putting on a headset and viewing it in 3D.
However, 'Virtual Reality' of this quality (and it's still not perfect) has been a long time coming. It first entered the public consciousness in the early 1990s and I - like many other people - dreamt of having my own Virtual Reality system to play with. A professional system cost many tens of thousands of pounds to buy, so I went down the 'homebrew' VR route. This typically involved using off-the-shelf PCs and home-made input devices and displays, or adapted video game hardware.
The system I built featured the classic combination of a PC running the REND386 3D software, a Nintendo Power Glove as an input device and a pair of Sega 3D 'shutter' glasses as a way of viewing the computer screen in 3D.
In a recent conversation someone expressed interest in how this system worked and it got me thinking that maybe I should dig out the old hardware and try to rebuild the system. As something of a hoarder, I mean archiver, I still had much of the old kit filed away and after finding a suitable PC of the era on eBay I'm now in the position to start putting the system back together.
I'm probably a few months away from having something to show, but when it's complete I'll install it at Interact Labs at Phoenix for people to have a play with. Watch this space for details, and get ready to enter the blocky world that was 1990s homebrew Virtual Reality.
The new exhibition in the Cube Gallery at Phoenix is an interesting one. It contains work by a number of artists and deals with the subjects of environment, place and time.
The first piece you notice when you enter the gallery is the Subterranean soundscape produced by Semiconductor. This takes seismic data from earthquakes, volcanoes and glaciers and makes it audible. It instantly places you in a primal world of grinding rocks and cracking ice.
Next, Benedikt Gross and Bertrand Clerc present Metrography - an apparently distorted map of London based on the underground map. It reminds us that all maps are actually distortions of some sort (Gregory Bateson commented extensively on the relationship between the map and the territory, as did Alfred Korzybski).
Perhaps my favourite piece in the exhibition is the long-running Mesocosm animation by Marina Zurkow of a seated figure (in the style of one of Lucian Freud's paintings of Leigh Bowery) in a Northumberland landscape. Each day in the landscape is represented by 24 minutes in the gallery. The entire animation lasts for 146 hours, but has a generative element so that no two cycles are the same.
Locally-based artist Eric Rosoman's piece GPS Ducks is an interesting response to the story of the accidental release of nearly 29,000 rubber ducks from a container ship in 1992. This became a really important event in the study of ocean currents since the ducks have now turned up all over the world. Eric has released a more modest number of ducks into the local river system, but this time equipped with solar powered GPS trackers so that we can watch their journeys.
Finally, Charles Danby and Rob Smith's work The Quarry explores the site of the photographer/landscape artist Robert Smithson's artwork Chalk Mirror Displacement. It presents material from the quarry used to create the work as well as a collection of triangulated photographs. I actually need to go back and have another look at this piece since I didn't realise that there were QR codes with the photographs! Scanning these apparently plays video works.
The exhibition is one that deserves time spent with it and is very rewarding if you allow yourself to take it all in. It runs at Phoenix until 28th February 2014.
See my pictures from the exhibition here.
DMU Masters student Alice Tuppen has just finished a short run of her "Point. Forty" exhibition in the Cube Gallery at Phoenix in Leicester. The installation featured four tables on which were placed objects that when picked up would trigger the playback of videos. Each table contained objects from one of the 40-year-old female participants and the videos explored the personal thoughts and recollections of each participant.
The first thing you noticed about the piece was how completely it transformed the Phoenix Cube Gallery. I have seen many artworks in the space, and exhibited in there myself, and can honestly say that I have never experienced such a change in the feel of the gallery. This was in part down to the props and objects used, but also the lighting and the engaging nature of the video material.
One measure of the success of a piece of work like this is how long people engage for. Again, the work excelled at this, with some people remaining in the space for up to an hour - going from table to table in order to explore each participant's objects and videos.
While the piece only had a short run, I'm sure that we will see more such work from Alice in the future. You can see my pictures from the show here on Flickr. You can find Alice's web site at http://www.artact.co.uk.
Leicester Hackspace, a venue for makers of digital, electronic, mechanical and creative projects, will open in Makers Yard on 1st March 2014. Like other Hackspaces across the country, we hope to build a community of practical and creative people and provide them with a place to pursue their projects, share techniques and concepts and learn new skills.
We have a space in Makers Yard with access to bike repair equipment, computers, a laser cutter, 3D printers, power tools and many other resources. This space will be open to members 24 hours a day, but we will also be running a number of special courses and events, in programming, arduino, electronics, soldering, laser cutting, and more, which will be open to members of the public.
If you have an existing interest in such projects, or wish to learn more, we hope you'll join! Find our membership form and more details about costs and joining, at www.leicesterhackspace.org.uk, or join the Leicester Hackspace Google Group to learn more!
We launched the St Georges App just over a month ago and are now up to around 400 downloads across iPhone and Android. Word of mouth feedback has been great with people commenting on both the ease of use of the app and the excellent content, in particular the Changing Industries in Leicester's Cultural Quarter guide.
However, we see this as just the starting point. The app has been designed to allow new guides to be easily added via the Empedia website. The web-based tools let a non-technical person upload and maintain multimedia content that can be fed to the app at the push of a button.
The next batch of new content is due to be delivered via the DigiCROP project with Leicester University. This should see soundscapes and spoken word guides being created that further explore the history of the Cultural Quarter. Some of this work will be quite innovative, with new features being added to the app that will allow multiple layers of sound to be played as you explore the area.
We're also looking at ways of using different types of computer-readable 'tags' within the app. It already supports the use of QR Codes to trigger the playback of content, but work to integrate wireless iBeacons and RFID tags is underway. This will allow multimedia materials to play automatically when you, for example, enter buildings or pass indoor points of interest.
As we make use of more of these technologies, the St Georges App will be able to deliver increasingly personalised content to you as you walk around the Cultural Quarter. We hope that it will provide a fascinating example of how new technologies can enhance a person's understanding of the space they are in - both in terms of its past and its present.
We are also looking to work with people who have ideas of their own and would like to use the platform to realise them. If you haven't done so already then download the app and have a play. If you have ideas for new content you would like to see then please get in touch.
Everybody loves robots don't they? Especially low-cost robots controlled by low-cost computers? The last of my winter holiday projects is a simple way of controlling the low-cost robotic arm from Maplin (£29.99 when on offer) with the similarly low-priced Raspberry Pi computer. As per usual, most of the work getting this project together has been done by someone else, but I have made some additions to the control program that are worth sharing.
First you need to get and build the robotic arm itself. The easiest place to get it from is Maplin, but you can sometimes find it cheaper on Amazon or eBay. The yellow and black look of the arm makes it easy to identify unbranded versions of it. It's a moderately complex kit to build, but take your time and I'm sure you'll be fine. No soldering is required. I spread the work for the whole project over an afternoon and an evening.
The kit comes with a USB connection and software for a PC. However, it is just as easy to connect it to a Raspberry Pi. Full instructions on how to do this - which involves downloading the PyUSB software and running a simple Python script - are given on this site. It's very straightforward.
I decided to upgrade the provided Python script so that you can send the robotic arm simple commands via the command line. For example, typing sudo python arm.py sd wd go tells the shoulder motor to go down, then the wrist motor to go down and then the gripper to open. Not particularly fancy, but a decent basis for a more complex project I think.
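To give a flavour of how a command-line wrapper like this can be put together, here's a minimal Python sketch. The mnemonics, USB vendor/product IDs and control-packet byte values shown are assumptions based on common write-ups of this arm rather than my exact script, so check them against your own setup before use.

```python
import time

# Each mnemonic maps to a 3-byte control packet:
# [arm motor bits, base motor bits, LED bit].
# The byte values below are assumptions - verify them for your arm.
COMMANDS = {
    "su": [0x40, 0x00, 0x00],  # shoulder up
    "sd": [0x80, 0x00, 0x00],  # shoulder down
    "eu": [0x10, 0x00, 0x00],  # elbow up
    "ed": [0x20, 0x00, 0x00],  # elbow down
    "wu": [0x04, 0x00, 0x00],  # wrist up
    "wd": [0x08, 0x00, 0x00],  # wrist down
    "gc": [0x01, 0x00, 0x00],  # gripper close
    "go": [0x02, 0x00, 0x00],  # gripper open
    "bl": [0x00, 0x01, 0x00],  # base rotate one way
    "br": [0x00, 0x02, 0x00],  # base rotate the other way
}
STOP = [0x00, 0x00, 0x00]  # all motors off

def packet_for(cmd):
    """Return the 3-byte control packet for a command mnemonic."""
    return COMMANDS[cmd]

def run(commands, duration=1.0):
    """Send each command to the arm for `duration` seconds, then stop."""
    import usb.core  # PyUSB, as used by the original control script
    arm = usb.core.find(idVendor=0x1267, idProduct=0x0000)  # assumed IDs
    if arm is None:
        raise IOError("Robot arm not found")
    for cmd in commands:
        arm.ctrl_transfer(0x40, 6, 0x100, 0, packet_for(cmd))
        time.sleep(duration)
        arm.ctrl_transfer(0x40, 6, 0x100, 0, STOP)
```

Called from a small main script with run(sys.argv[1:]), this gives the sudo python arm.py sd wd go style of control described above: each motor runs for a second and then stops before the next command is sent.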
For the next step I fancy placing my Raspberry Pi camera on top of the arm and writing an image processing script that will get the robotic arm to follow people as they walk around a room!
I've made a few start of year updates to my website. Firstly, I've published a page of pictures of various Crass-related events and exhibitions I've been to over the past 5 years or so. I've got more material that I may upload in the future. Next, I've uploaded quite a big collection of early websites I produced in the 1990s. There's a mixture of arts and music sites (which I think are the most interesting) and examples of some of the commercial projects I worked on at the time. After 20 years the sites look quite primitive, but some of the content is good. Check out the work we did as Resonance in the mid-1990s. Geoff Broadway's The Axe is a particular highlight. Finally, I've done some housekeeping, like removing broken links and updating my profile. I'm going to try to get back into weekly blog updates too, so remember to stay tuned for the usual mix of art and technology posts.
I've been using some of my time off this holiday season to put together a few projects that I've had ready and waiting in the studio. One of these was - to give it its full name - the Evil Mad Scientist Laboratories and Super Awesome Sylvia WaterColorBot.
This Kickstarter-funded project from www.evilmadscientist.com is basically a 2D plotter that holds a watercolour brush instead of a pen. It comes in kit form and took me a couple of hours to put together. It takes standard watercolour paper and the popular (and cheap) Crayola watercolour paint set. With a bit of clever programming (and a few water reservoirs) the machine is able to produce actual watercolour paintings based on live user control or standard PNG graphic files.
Now that I've had a bit of time to play with it I have to say I'm really impressed. The idea for the machine apparently came from Super Awesome Sylvia - a young girl in the US who is into making/hacking - and it's definitely something that young people will want to play with. It also looks great.
I particularly like the way you can run off multiple copies of a painting and that, despite following the same set of commands, every painting is slightly different. This is due to differences in how the paint sits on the brush and the small amounts of slack in the mechanism. It makes a nice contrast to the carbon-copy 'perfection' expected in most modern printing devices.
The machine is also very hackable, with expectation that owners will try different painting and drawing implements with it and write their own software (everything is open source and examples of code are available). I've already seen examples of people using marker pens with the WaterColorBot to good effect.
I plan to move my WaterColorBot to Interact Labs at Phoenix in a few weeks' time. Come along to the next Digital Makers Group meeting on the 18th January 2014 if you want to have a go at making your own paintings with it. See a short video of the 'bot in action here and some pictures of it here.
I can't believe it's been over a month since I've posted to my blog! It must be the longest break I've had for years. Still, while I may have been quiet here I've been quite busy over at the new Interact Labs site. This new project officially launched on the 26th October and is starting to take off nicely.
Here's a summary of what has happened at Interact Labs over the past month. The Creative Manifesto group are now meeting at the space fortnightly and used it as a base for the construction of their lights for their Light the Night project. Leicester Hackspace is also meeting there fortnightly and is making moves towards setting up its own space in the new year. We've also had Steve Mills making use of the 3D printer as part of his residency at Two Queens and Esther Rolinson visited to work on her installation for next year. The Digital Makers Group has had an event, Martin Rieser gave a great Computer Arts Society talk and various other meetings and visits have taken place.
The next activity will be the launch of the new St Georges iPhone and Android app on Friday 29th November. This is the result of a collaboration between Cuttlefish, Phoenix and Leicester University and will give users a new way of exploring Leicester's Cultural Quarter. We also have a talk by Antipodes artist Layla Curtis on the 10th December.