Hi there! I'm Steve and I'm a Spatial Software Researcher and Developer at the Centre for Advanced Spatial Analysis (CASA), University College London. My research interests include social media, mobile systems, human-computer interaction and big data. In my spare time I'm a Musician/Composer, originally from Glasgow, Scotland but now living in London.

#bellogate – A breakdown of the spam.

This morning (9th October 2014) was not like any other morning. Usually I wake up and check my overnight email while having my breakfast. This morning, however, I awoke to just under 3,000 unread emails waiting in my work account. During the night someone had worked out how to send an email to the all-students mailing list, apparently from the Provost’s email account, saying the single word “bello”. What’s unclear is how: no one knows whether the account itself was compromised (which would signal a breach) or whether some student on campus simply knew how to spoof the email headers.

No one knows exactly what happened, and this is only speculation, but what I think happened is that the general mailing list for all students was set up incorrectly, allowing anyone with a UCL email address to send any message to the entire student body. Until an official statement is released we won’t know for certain.

Naturally my first reaction was to start reading all of these emails and see what was being said between the students, to get an understanding of how they were using the service. We had emails from students saying “hello” – or “bello” in some cases – and many students responded to the mailing list saying “Please remove my name from the list”. My favourites of all these emails were the mailing lists that students signed the alias up to (the One Direction Fan Club, for one), along with a poem about the event:

As of 9:30am the mailing list had been closed down and an investigation is underway, according to the @uclnews Twitter account. @uclisd have done a great job keeping everyone notified, even apologising to all students via a text message to mitigate any concerns.

So what happens when you are researching ways to deal with unstructured textual data, have a toolkit that collects data from various services, and have access to all the emails that were sent? Obviously you analyse the data!
I quickly wrote some software to pull the data into the Big Data Toolkit and process it. I stripped out all identifying details such as email addresses, and analysed only the date, time, subject heading and message body for information on what was being discussed. Below is a short breakdown of the data processed by my Big Data Toolkit.
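For the curious, the redaction step is essentially a regular-expression pass over each message before anything else happens. Here is a minimal sketch of the idea – purely illustrative, not the toolkit’s actual code, and messageBody stands in for each email’s raw text:

// Illustrative only: blank out anything that looks like an email address
// before the message is stored for analysis.
NSError *regexError = nil;
NSRegularExpression *emailRegex =
    [NSRegularExpression regularExpressionWithPattern: @"[A-Z0-9._%+-]+@[A-Z0-9.-]+\\.[A-Z]{2,}"
                                              options: NSRegularExpressionCaseInsensitive
                                                error: &regexError];
NSString *redactedBody =
    [emailRegex stringByReplacingMatchesInString: messageBody
                                         options: 0
                                           range: NSMakeRange(0, [messageBody length])
                                    withTemplate: @"[redacted]"];

Everything below was computed over these redacted copies only.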

The Data

2,968 emails were sent out during the spam attack. Assuming there are 26,000 students at UCL (from 2012 stats), the total load on the email servers was 2,968 × 26,000 = 77,168,000 messages delivered over a period of almost 11 hours.

First Email Sent: Wed Oct 08 2014 22:48:25 GMT+0100 (BST)
Last Email Sent: Thu Oct 09 2014 09:45:41 GMT+0100 (BST)
Total Period: 10 hours 57 minutes
Total Size of all 2,968 emails: 85.61 MB
Total Data Storage Across All Students: 2.226 TB
Emails which were Subscriptions (Mailing Lists): 1,254

Distribution of sent messages (every minute)

Textal of Subject Headers (view on textal.com)

Continue reading »

Making Interactive Rabbits Talk

A few years back, while I was a researcher at the Department of Computing Science, Glasgow University, we purchased 2 small Nabaztag rabbits to augment our prototype multimodal navigation system. The rabbit announced instructions for users to search the map and find different locations around the world – a sort of digital treasure hunt. Fast forward 7 years and I’m doing it again.

The Karotz, the new name for the rabbit, is a special interactive device. It has ears that you can position, an LED in its belly that you can set to various colours, a microphone so you can give the rabbit commands, a speaker to play music either remotely or from a USB stick (which can also give the rabbit a voice), a nose to smell out those pesky RFID tags and, a new feature that’s different from the older rabbits, a webcam to see.

We’ve bought another 2 rabbits for our research at CASA and we’ve been having a think about how we can use them to brighten up the office. For the first few months we had some issues with our corporate WiFi network – think blocked ports and firewalls – so actually getting the rabbit to talk to the outside network was a challenge. By setting up a 3G router in the office we’ve gained more control over our Internet of Things devices, which means we can make them respond to some of our collection software.

Once we got the rabbit connected, I decided the first thing we had to do was make the Karotz API friendlier for developers. I set up a small web server, written in Node.JS, on an internal machine: we send commands to it, and it proxies these authenticated commands on to the Karotz API, which in turn sends them to the rabbit.

For example, if you want to set the ears down then you would call the following web service:

http://localhost:9000/ears/9/9 

To set the LED to red you would call:

http://localhost:9000/led/ff0000

And to make the rabbit talk you would call:

http://localhost:9000/speak/Hello%20World

Oliver O’Brien had the idea of attaching real-world London Underground Tube alerts to the rabbit, so I set up a command on the server to make the rabbit announce the tube alerts (which you can see in the video below):

http://localhost:9000/status/ff0000/The%20Central%20Line%20Is%20down
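Since the proxy speaks plain HTTP, anything on our network can drive the rabbit. As a rough sketch (the endpoints above are our own internal ones, and error handling is kept minimal), one of our iOS tools could fire the same alert like this:

// Sketch of a client call to our internal proxy – illustrative only.
NSString *alertText = [@"The Central Line Is down"
    stringByAddingPercentEscapesUsingEncoding: NSUTF8StringEncoding];
NSURL *statusURL = [NSURL URLWithString:
    [NSString stringWithFormat: @"http://localhost:9000/status/ff0000/%@", alertText]];

// Fire-and-forget GET; the proxy adds the authentication before talking to Karotz
[NSURLConnection sendAsynchronousRequest: [NSURLRequest requestWithURL: statusURL]
                                   queue: [NSOperationQueue mainQueue]
                       completionHandler: ^(NSURLResponse *response, NSData *data, NSError *connectionError) {
                           if (connectionError) NSLog(@"Rabbit didn't respond: %@", connectionError);
                       }];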

These types of ubiquitous technologies allow developers to integrate real-time data into our lives without users having to log onto computers or get their mobile phones out to actively check on services. We are just starting to explore the possibilities of this technology, so stay tuned for more cool little side projects.

Continue reading »

Animating Ground Overlays in Google Maps iOS SDK

Many maps use overlays to display different types of features. Many examples show old hand-drawn maps that have been re-projected to fit our modern-day, online ‘slippy’ maps, but very few show these overlays over time. In this tutorial we are going to explore animations using the Google Maps iOS SDK to show the current live weather conditions in the UK.

The Met Office is responsible for reporting the current weather conditions, issuing warnings, and creating forecasts across the UK. They also provide data through their API, called DataPoint, so that
developers can take advantage of the live weather feeds in their apps. I’ve used the ground overlays from DataPoint to create a small iOS application, called Synoptic, to loop around the real-time overlays
and display them on top of a Google Map – very handy if you’re worried about when it’s going to rain.

Finished App and Source Code

I always find it interesting when these tutorials show you what we’re going to create before digging deep into the code, so here is a small animation on the right of the page of what the end product should look like.

What you’re looking at is the real-time precipitation, or rain, observations for the UK on Sunday 19th January 2014. It’s been quite sunny in London today, so much of the rain is back home in the North. You can grab a copy of the code for this tutorial from GitHub.

Before we get started

There are a few things we need before we can start putting the data onto the map. Firstly, you’ll need an API key for the Google Maps iOS SDK, which you can get by turning on the SDK in the Cloud Console (https://cloud.google.com/console/). If you haven’t created a project yet then click Create Project and give your project a name and a unique id. I have a project called TestSDK, which I use when I’m playing around with various APIs from Google. Follow the instructions at https://developers.google.com/maps/documentation/ios/start#the_google_maps_api_key to get your key. If you’ve downloaded the code from GitHub then the bundle identifier is set up as com.stevenjamesgray.Synoptic.

You’ll also need to sign up for a DataPoint key. Once you’ve registered for the API they’ll email you a key and you’re good to go.

Step 1 – Setting up the keys and running the project

When you have your keys and the source code, open the project in Xcode and copy and paste your keys into Constants.m (it’s in the Object folder). If the keys are valid then you’ll be able to build and run the project; the overlay images will start to download and animate. I’ve already set up the mapView in the project, but if you haven’t used the Google Maps iOS SDK before then you should check out Mano Marks’ excellent HelloMap example on Google Developers Live. This will get you up to speed with creating a MapView and linking it into your project. The only extra part I’ve added is a custom base layer, which will be covered in another blog post.

Downloading the weather data

DataPoint has 2 types of data that can be visualised – observations and forecasts – which, naturally, are contained in 2 separate API endpoints. If you look at Constants.m you’ll see the URLs for both endpoints:

NSString* const api_forecast_layers_cap = @"http://datapoint.metoffice.gov.uk/public/data/layer/wxfcs/all/json/capabilities?key=%@";
NSString* const api_obs_layers_cap = @"http://datapoint.metoffice.gov.uk/public/data/layer/wxobs/all/json/capabilities?key=%@";

If you paste this URL into your browser along with your key, you’ll find a JSON file that lists all the layers available from the Met Office and where you can fetch the images from the server. For observations we need the LayerName, the ImageFormat (for us on iOS it will be png), the Time of the image (the timestamp given by the time array – watch out for the Z on the end) and, most importantly, our API key.
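Fetching and parsing that capabilities JSON isn’t shown in the tutorial listing below, but a minimal sketch might look like the following. Treat the JSON key paths as placeholders – they are my assumptions about the shape of the DataPoint response rather than verified field names – and AFJSONRequestOperation is assumed to be available from the same AFNetworking version the project already uses:

// Sketch only: fetch the capabilities document and pull out the first
// observation layer. Adjust the key paths to match the real response.
NSURL *capsURL = [NSURL URLWithString: [NSString stringWithFormat: api_obs_layers_cap, MET_OFFICE_API_KEY]];
NSURLRequest *capsRequest = [NSURLRequest requestWithURL: capsURL];

AFJSONRequestOperation *capsOperation = [AFJSONRequestOperation JSONRequestOperationWithRequest: capsRequest
    success:^(NSURLRequest *request, NSHTTPURLResponse *response, id JSON) {
        NSDictionary *layer = [[JSON valueForKeyPath: @"Layers.Layer"] firstObject];
        NSString *layerName = [layer valueForKeyPath: @"Service.LayerName"];
        NSArray *timesteps = [layer valueForKeyPath: @"Service.Times.Time"];
        [self selectLayer: layerName withTimeSteps: timesteps];
    }
    failure:^(NSURLRequest *request, NSHTTPURLResponse *response, NSError *error, id JSON) {
        NSLog(@"Couldn't fetch the DataPoint capabilities: %@", error);
    }];
[capsOperation start];

With the layer name and its timesteps in hand, the selectLayer:withTimeSteps: method below constructs the image URL for each timestep and fetches the image.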

-(void) selectLayer: (NSString*)layerID withTimeSteps: (NSArray *)timestep_set{
    for(NSString *timestep in timestep_set){

        NSURL *hourlyCall = [NSURL URLWithString: [NSString stringWithFormat: @"http://datapoint.metoffice.gov.uk/public/data/layer/wxobs/%@/png?TIME=%@Z&key=%@", layerID, timestep, MET_OFFICE_API_KEY]];

        NSLog(@"Calling URL: %@", [hourlyCall absoluteString]);
        NSURLRequest *request = [NSURLRequest requestWithURL: hourlyCall];
        AFImageRequestOperation *operation = [[AFImageRequestOperation alloc] initWithRequest:request];
        [operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {

            //Check that we actually got a UIImage before adding it to the array
            if([responseObject isKindOfClass: [UIImage class]]){

                // Setup our image object and write it to our array
                SGMetOfficeForecastImage *serverImage = [[SGMetOfficeForecastImage alloc] init];
                serverImage.image = [UIImage imageWithData: operation.responseData];
                serverImage.timestamp = timestep;
                serverImage.timeStep = nil;
                serverImage.layerName = layerID;

                [overlayArray addObject: serverImage];
            }

            // Count the response (image or not) so that we know when to start the animation
            imagesExpected = @([imagesExpected intValue] + 1);
        } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
            //We didn't get the image but that won't stop us!
            imagesExpected = @([imagesExpected intValue] + 1);
            NSLog(@"Couldn't download image.");
        }];

        [operation start];
    }

    // Start the timer (once, outside the loop) to check that all the images we
    // requested have been downloaded and stored in the layer array
    checkDownloads = [NSTimer scheduledTimerWithTimeInterval: 1 target:self selector:@selector(checkAllImagesHaveDownloaded:) userInfo: [NSNumber numberWithInt: (int)[timestep_set count]] repeats: YES];
}

This code fetches the images asynchronously and adds them to an array for us to use later. We start a timer to check that we have downloaded all of our images before starting to loop around them on the map. The observant reader will notice that if we looped around this array as-is the frames would be out of order, since we don’t know in what sequence the downloads finished, so we need to sort them before we show them on the map. This happens inside our checkAllImagesHaveDownloaded method, which is called every second and checks that all the images have arrived. Once we have them all, we clear the timer, sort the array and kick off the animation on the map. We sort the array using a comparator which compares the timestamps of the objects and orders them in ascending order.

if([imagesExpected isEqualToNumber: imageFiles]){
    [checkDownloads invalidate];

    // Order the downloaded images chronologically by their timestamps
    NSArray *sortedArray = [overlayArray sortedArrayUsingComparator:^NSComparisonResult(SGMetOfficeForecastImage *a, SGMetOfficeForecastImage *b) {
        return [a.timestamp compare: b.timestamp];
    }];

    overlayArray = [NSMutableArray arrayWithArray: sortedArray];

    // With every frame downloaded and sorted, kick off the animation –
    // updateLayer: will be called every second from here on
    [NSTimer scheduledTimerWithTimeInterval: 1 target:self selector:@selector(updateLayer:) userInfo:nil repeats:YES];
}

Animating the Layers on the Map

When we are ready to animate we create yet another timer, which calls the updateLayer method every second. This is where the magic happens! The images we have downloaded from the Met Office have been created to fit the following bounding box: 48° to 61° North and 12° West to 5° East. This is really easy to convert into decimal degrees using the following rule: anything west of the meridian is a negative number and anything east is, of course, positive. Now that we know the bounding box, we create references to the bottom-left and top-right corners of the image using the following code:

CLLocationCoordinate2D UKSouthWest = CLLocationCoordinate2DMake(48.00, -12.00);
CLLocationCoordinate2D UKNorthEast = CLLocationCoordinate2DMake(61.00, 5.00);

We grab our current image from the array using a counter that was set in the loadView method and then put it on to the map. We set the bearing of the image to 0 (we don’t need to rotate the image as it’s already in North/South orientation), set the z-index of the image and then add it to the map by setting the ground overlay’s map property to the mapView we created in loadView (our only map object):

GMSCoordinateBounds *uk_overlayBounds = [[GMSCoordinateBounds alloc] initWithCoordinate:UKSouthWest
                                                                             coordinate:UKNorthEast];

GMSGroundOverlay *layerOverlay = [GMSGroundOverlay groundOverlayWithBounds: uk_overlayBounds icon: layerObject.image];
layerOverlay.bearing = 0;
layerOverlay.zIndex = 5  * ([currentLayerIndex intValue] + 1);
layerOverlay.map = mapView;

We then increment the counter and check whether we have reached the end of the array; if we have, we reset the counter and wait a second until updateLayer is called again.

// Check if we're at the end of the layerArray and then loop
if([currentLayerIndex intValue] < [overlayArray count] - 1){
    currentLayerIndex = @([currentLayerIndex intValue] + 1);
}else{
    currentLayerIndex = @0;
}

If we run the code like this, then after we’ve looped around all the images we get something that looks like this.

Wrong Animation

Unfortunately that’s not quite what we’re looking for! What’s happened is that we have created a new GroundOverlay object and put it on the map above the previous layer without removing the older layer first. To fix this we keep an array of GroundOverlays, so that on the next loop we can take every old layer off the map (by setting its map property to nil) and then empty the array, like this:

//Clear the Layers in the MapView
for(GMSGroundOverlay *gO in overlayObjectArray){
    gO.map = nil;
}
// Mutating the array inside fast enumeration would throw an exception,
// so empty it once the loop has finished
[overlayObjectArray removeAllObjects];

The app is now adding and removing the layers correctly and giving us the illusion of animation on the map.

The complete method to add the current image and remove the old image looks like this:

-(void) updateLayer: (id)selector{
    //Setup the bounds of our layer to place on the map
    CLLocationCoordinate2D UKSouthWest = CLLocationCoordinate2DMake(48.00, -12.00);
    CLLocationCoordinate2D UKNorthEast = CLLocationCoordinate2DMake(61.00, 5.00);

    //Get next layer and place it on the map
    SGMetOfficeForecastImage *layerObject = [overlayArray objectAtIndex: [currentLayerIndex intValue]];

    //Clear the Layers in the MapView
    for(GMSGroundOverlay *gO in overlayObjectArray){
        gO.map = nil;
    }
    [overlayObjectArray removeAllObjects];

    GMSCoordinateBounds *uk_overlayBounds = [[GMSCoordinateBounds alloc] initWithCoordinate:UKSouthWest
                                                                                 coordinate:UKNorthEast];

    GMSGroundOverlay *layerOverlay = [GMSGroundOverlay groundOverlayWithBounds: uk_overlayBounds icon: layerObject.image];
    layerOverlay.bearing = 0;
    layerOverlay.zIndex = 5  * ([currentLayerIndex intValue] + 1);
    layerOverlay.map = mapView;

    [overlayObjectArray addObject: layerOverlay];

    // Check if we're at the end of the layerArray and then loop
    if([currentLayerIndex intValue] < [overlayArray count] - 1){
        currentLayerIndex = @([currentLayerIndex intValue] + 1);
    }else{
        currentLayerIndex = @0;
    }
}

And there you go: animating ground overlays on Google Maps. I hope you’ve found this tutorial useful, and if you have then why not share it with your developer friends or follow me on Twitter (I’m @frogo) or Google+ (+StevenGray).
If you have any questions about this tutorial then drop me a line via the social links above and I’ll try my very best to answer them. You can also follow the conversation over on Hacker News.

Continue reading »

The iPad Video Wall

I am happy to report that the iPad Video Wall has grown up from a prototype to a fully fledged finished project. If you have been following the blog then you will have seen the prototype video of the wall’s proof of concept and watched a single movie playing over all 8 iPads. Well I’ve […]

Continue reading »

Telefónica (O2) plans to explore Big Data

Telecommunication companies are sitting on a gold mine. With the prevalence of mobile devices in our everyday lives – for example, reports from Google I/O are that 400 million Android devices have been activated, at a rate of 1,000,000 activations per day – the data that we generate as a collective group is phenomenal. Phone companies […]

Continue reading »

Big Data Problems have been around longer than you think

The Strata Conference is in town and one presentation that caught my eye was titled The Great Railway Caper: Big Data in 1955. John Graham-Cumming from CloudFlare gives a great overview of why some Big Data problems have been around since the early days of computing, when computers filled entire rooms. Back in 1955 the […]

Continue reading »

iPad Video Wall

It seems like my favourite device of the moment is the iPad.  First I built the QRator app which has been quite popular and well received by the UCL Grant Museum.  We even won an award for the system. After a discussion with a few of my colleagues about new exhibition pieces for upcoming events […]

Continue reading »

Wolfram Alpha’s Personal Analytics

Wolfram Alpha has just launched its new take on social media analysis, building personalised reports for Facebook users. The computational engine builds various metrics and visualisations based on usage over a period of time, number of friends, geographical distribution of friends and even a network graph showing connections between friends. If you head to the […]

Continue reading »

Olympic Twitter Collectors

As the athletes have been training for the London 2012 Olympic Games, so have our Twitter Collectors. You may have seen the maps we created from data collected by the very first iteration of the Big Data Toolkit’s Twitter collector, which produced some great visualisations. Over the past few weeks I re-wrote some of […]

Continue reading »

Windows Phone 7 – App in 30 days

Today I attended the excellent Windows Phone Bootcamp in London, where we learnt all about the Windows Phone 7 framework (a great write-up of the day here from Gary Ewan Park – @grep13).  Previously, during my MSc in Computing Science at the University of Strathclyde, I encountered the old Windows Mobile framework, which was quite challenging as none of […]

Continue reading »

Harvesting the Crowd: Experiments with Twitter

Harvesting the crowd is a large and complex process, but we can spot interesting events happening just from looking at geo-located tweets. This seminar concentrates on what sense we can make from the collective tweets of a city. Slides from a seminar given to the Oxford DTC e-Research Centre, Oxford University, June 2011.

Continue reading »

iCloud: My Thoughts

Apple’s Worldwide Developers Conference is underway and with it comes Steve Jobs’ famous keynote address and Apple’s latest tech.  This year saw Jobs announce iOS 5, Mac OS X Lion and, more importantly, Apple’s MobileMe replacement, iCloud. iCloud is Apple’s entry into the cloud computing infrastructure that Google and Amazon have dominated over the past few years.  […]

Continue reading »

London Twitter Map

We have been working hard here at CASA lately, building tools to collect, analyse and visualise different data sets from all over the web. One piece of software that has proven quite popular over the past few months is our internal Twitter Collector. The collector mines Twitter for tweets inside a geographical radius either with […]

Continue reading »