The London Data Table

The London Data Table was one of my personal favourites from the exhibition accompanying the CASA "Smart Cities" conference, which took place at the University of London last Friday. The concept was thought up by Steven Gray: a wooden table, cut by programmable lathe into the outline of London. A special "short throw" projector with a fish-eye lens was purchased and mounted vertically on a converted basketball hoop stand, pointing downwards and outwards, allowing the content to be approached and examined without the projector getting in the way. Steven has blogged about the construction process.

I created a generic dark grey background map (from Ordnance Survey OpenData) with a blue River Thames as the main identifying feature. Several authors, myself included, used this to create either Processing "sketches" written in Java, or pre-recorded videos, which were displayed on the table during the exhibition. A simple JavaScript script running on Node.js automatically cycled through the animations.
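To give a flavour of how the sketches fitted together, a minimal Processing skeleton for a table animation might look something like this (the basemap filename and resolution are illustrative values, not necessarily those used on the table):

```java
// Minimal skeleton for a table animation: draw the shared basemap each
// frame, then layer animated content on top of it.
PImage basemap;

void setup() {
  size(1024, 768);       // should match the projector's native resolution exactly
  noSmooth();            // keep pixel boundaries crisp on the table surface
  basemap = loadImage("london_basemap.png");  // dark grey OS OpenData map, blue Thames
}

void draw() {
  image(basemap, 0, 0);  // redraw the shared background each frame
  // ...animation layers (points, trails, labels) are drawn on top here...
}
```

Matching size() to the projector's native resolution is what makes the pixel-perfect rendering described below possible.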

By ensuring that the background map and accompanying sketches/videos were "pixel perfect", we were able to take advantage of having control of every individual pixel, producing the quite pleasing pixellated effect seen in the closeup below of one of the sketches (a photo of part of the table). It shows a bike share station animation that I created, based on the same data that powers the equivalent website.
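In simplified form (this is an illustration rather than the actual exhibition code, and the colour ramp and block size are my own choices), a single station might be drawn like this:

```java
void setup() {
  size(1024, 768);
  noSmooth();
}

void draw() {
  background(40);               // stand-in for the dark grey basemap
  drawStation(300, 400, 0.75);  // a hypothetical station, 75% full
}

// One bike share station rendered as a small block of pixels whose colour
// reflects how full it is.
void drawStation(int x, int y, float occupancy) {
  // occupancy in [0, 1]: 0 = empty (blue) through to 1 = full (red)
  color c = lerpColor(color(0, 80, 255), color(255, 40, 0), occupancy);
  noStroke();
  fill(c);
  rect(x, y, 2, 2);  // a 2x2 block of projector pixels per station
}
```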

The photo above shows the table running another Processing sketch, this time displaying point information from CityDashboard. It is similar to the map view on the website, except that points are randomly and automatically selected for display as people stand beside the table and watch.
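The selection behaviour might be sketched like this – the CityPoint class, the count of six points and the five-second rotation interval are all illustrative assumptions:

```java
// Hold the full CityDashboard point set, and every few seconds swap in a
// new random handful to display.
class CityPoint {
  float x, y;
  String label;
  CityPoint(float x, float y, String label) { this.x = x; this.y = y; this.label = label; }
}

ArrayList<CityPoint> points = new ArrayList<CityPoint>();   // loaded from CityDashboard
ArrayList<CityPoint> visible = new ArrayList<CityPoint>();  // currently on the table
int lastPick = 0;

void pickRandomPoints() {
  if (points.isEmpty()) return;
  if (millis() - lastPick > 5000) {   // rotate the selection every 5 seconds
    visible.clear();
    for (int i = 0; i < 6; i++) {
      visible.add(points.get((int) random(points.size())));
    }
    lastPick = millis();
  }
}
```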

The most interesting sketch presented on the table (and shown on the right – photo by Helen) was built by Steven Gray and connected to an aircraft sensor box, which picked up near-real-time broadcasts of the location, speed and aircraft ID of planes flying over London. The sketch stored recently received information, and so was able to project little images of planes, correctly orientated and with trails showing their recent paths. Attached to each plane image was a readout of height and speed, and, most innovatively of all, a QR code was programmatically generated and rendered behind each plane, allowing smartphone users to scan it. QR codes normally encode URLs, and these pointed to a flight information website with the aircraft's details preloaded, showing a photo, and the origin and destination, at a glance.
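A rough reconstruction of the plane-drawing step (not Steven's actual code) might look like this in Processing, deriving each plane's heading from its two most recent stored positions:

```java
// Draw the stored track as a faint trail, then rotate a small plane sprite
// to the heading implied by the two most recent positions.
void drawPlane(PImage planeImg, ArrayList<PVector> track) {
  if (track.size() < 2) return;
  PVector now  = track.get(track.size() - 1);
  PVector prev = track.get(track.size() - 2);

  stroke(255, 120);   // semi-transparent trail of recent positions
  noFill();
  beginShape();
  for (PVector p : track) vertex(p.x, p.y);
  endShape();

  float heading = atan2(now.y - prev.y, now.x - prev.x);  // direction of travel
  pushMatrix();
  translate(now.x, now.y);
  rotate(heading);
  imageMode(CENTER);
  image(planeImg, 0, 0);
  popMatrix();
}
```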

The QR codes could be made very small by using a single projector pixel per QR code module and minimal error correction. Various smoothing and blurring effects were switched off, and a digital connection between computer and projector was used, to allow the sharpest possible representation. As a result, my iPhone was able to tell me more about the planes I was seeing fly, in near real time, around the table.
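I don't know which library Steven used, but the open-source ZXing Java library can generate such minimal symbols; with error correction at its lowest level and the quiet-zone margin removed (both assumptions on my part), each module of the returned matrix can be painted as exactly one projector pixel:

```java
import com.google.zxing.BarcodeFormat;
import com.google.zxing.EncodeHintType;
import com.google.zxing.common.BitMatrix;
import com.google.zxing.qrcode.QRCodeWriter;
import com.google.zxing.qrcode.decoder.ErrorCorrectionLevel;
import java.util.HashMap;

BitMatrix qrFor(String url) throws Exception {
  HashMap<EncodeHintType, Object> hints = new HashMap<EncodeHintType, Object>();
  hints.put(EncodeHintType.ERROR_CORRECTION, ErrorCorrectionLevel.L);  // least redundancy
  hints.put(EncodeHintType.MARGIN, 0);                                 // no quiet-zone padding
  // Asking for a 1x1 output makes ZXing return the smallest matrix that fits the data.
  return new QRCodeWriter().encode(url, BarcodeFormat.QR_CODE, 1, 1, hints);
}

void drawQR(BitMatrix qr, int x, int y) {
  for (int j = 0; j < qr.getHeight(); j++) {
    for (int i = 0; i < qr.getWidth(); i++) {
      if (qr.get(i, j)) set(x + i, y + j, color(0));  // one QR module = one pixel
    }
  }
}
```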