London 3D Augmented Reality Map

CASA hosted a very successful Smart Cities event last Friday, including presentations from Carlo Ratti, Mike Batty and Andy Hudson-Smith. The event premiered an interactive exhibition we have been working on, based on the theme of mixing physical and digital worlds. Some fantastic and fun exhibits have been developed by colleagues including George MacKerron, Steven Gray, Ollie O’Brien, Fabian Neuhaus, James Cheshire, Richard Milton, Martin de Jode, Ralph Barthel, Jon Reades, Hannah Fry, Toby Davies, Pete Ferguson and Martin Austwick, who no doubt will be blogging about them all soon. Thanks to everyone who attended and contributed to a great day.

For my own exhibit I had a try at developing an augmented reality app to explore 3D urban data. The idea was to use iPads as windows into a 3D urban map of London, allowing the user to navigate around the virtual model to see different perspectives and focus on interesting parts of the data. Do we respond differently to data with a seemingly physical presence? Well, this is one way to find out…

The app was developed in Unity using the Vuforia AR extension, and I was impressed with how accessible augmented reality technology has become using such tools. First, GIS data on London’s urban form and air pollution was exported from ArcMap into Unity, and an interface to the data was developed. The core app without the AR capabilities can be viewed here (Unity web player required).
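Once the GIS layers are imported as geometry, the in-Unity interface comes down to small C# scripts attached to the scene. As a minimal sketch (the names here are illustrative, not those from the actual project), a script like this could toggle the air pollution layer on and off:

```csharp
using UnityEngine;

// Minimal sketch: toggles an imported GIS layer (e.g. an air pollution
// surface exported from ArcMap) on and off. "pollutionLayer" is an
// illustrative name, assigned in the Unity Inspector to the GameObject
// holding the imported geometry.
public class LayerToggle : MonoBehaviour
{
    public GameObject pollutionLayer;

    void OnGUI()
    {
        // Draw a single button in the corner of the screen.
        if (GUI.Button(new Rect(10, 10, 150, 40), "Toggle pollution"))
        {
            pollutionLayer.SetActive(!pollutionLayer.activeSelf);
        }
    }
}
```

The same pattern extends naturally to switching between datasets or adjusting layer transparency.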

Next I followed the Vuforia iOS tutorials to add AR functionality. This approach uses a tracking image to position and scale the 3D model to the user’s viewpoint. Nice features of Vuforia include the ability to select your own tracking image, and its tolerance of partial occlusion when the user moves in close to a particular part of the model. Some part of the tracking image must remain in the camera’s view at all times, however, otherwise the model disappears from the user’s view. A large A0 poster was used as the tracking image, giving users greater flexibility in navigating the data.
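The disappearing behaviour comes from Vuforia’s tracking found/lost events. A sketch of handling them, based on the DefaultTrackableEventHandler pattern from the Vuforia Unity samples (exact API names vary between SDK versions), hides the city model whenever the tracking image leaves the camera’s view:

```csharp
using UnityEngine;
using Vuforia;  // Vuforia AR extension for Unity

// Hides the 3D city model whenever the tracking image is lost, following
// the DefaultTrackableEventHandler pattern from the Vuforia samples.
// API details vary between SDK versions; this is a sketch, not the
// app's actual code.
public class CityModelTrackingHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour trackableBehaviour;

    void Start()
    {
        trackableBehaviour = GetComponent<TrackableBehaviour>();
        if (trackableBehaviour != null)
            trackableBehaviour.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED;

        // Toggle every renderer under the model so it appears and
        // disappears with the tracking image.
        foreach (Renderer r in GetComponentsInChildren<Renderer>())
            r.enabled = visible;
    }
}
```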

The resulting app is very intuitive and delivered the desired ‘wow’ factor for many of the attendees at the conference. The AR aspect certainly encouraged users to explore the data and identify patterns at different scales.

Adding more interactivity and animation, and sorting out some issues with the target image (multiple smaller images would have worked better than one very large image), would all be nice for version 2. I’ll write up a more detailed tutorial on the workflow later on if this is of interest.