Testing Map-Based UIs for Self-Driving Cars: HERE’s Knight Rider

I was kindly invited, earlier this week, to take part in “insideHERE”, a small event run at the HERE HQ in Berlin. HERE, born out of the ashes of Navteq and Nokia Maps, is now owned by a consortium of German car companies. For the event, HERE’s developers and engineers opened up their research labs and revealed their state-of-the-art mapping and location services work. HERE Auto is making a real play to be the “Sat Nav of the future”, competing directly with Google and Apple to create, manage and augment data between your smartphone and your car. Tomorrow I’ll outline the general visualisation work I saw that demonstrates their high-precision spatial datasets, but first, today, I mention one particular research project which shows how maps will continue to be a crucial part of driving, even when cars drive themselves.

“Knight Rider” is a test rig, built to simulate a car, in which the engineers and UI/UX designers can try out different configurations and placements of controls and maps on a dashboard. The key aspect being tested is how much trust the user can place in the car, based on what they can see and the information that is displayed. Testers can sit in the “car” and drive it, to experience map/control designs and, crucially, how it feels to give up the steering wheel while keeping the confidence that the journey will proceed as planned! Large exterior screens, fans and a windshield provide some depth of realism. The intention is not to create a realistic driving simulator, complete with fully photorealistic buildings and roads, but instead to get the tester comfortable enough to evaluate the designs effectively, before they are put in a real test car on the road.

When we saw the rig, it was configured with maps in three places – a short but wide one that wraps across the dashboard, a circular map that sits just to the right of the dashboard, beside the steering wheel, and finally a heads-up display (HUD) that reflects in the windshield, achieved by a carefully angled screen pointing upwards.

The dashboard shows a single map, behind the regular digital numbers/dials you would expect on a normal dashboard. The map here switches between a general 3D overview of the journey ahead, when “cruising”, and a more detailed, but still “helicopter”, 3D view when carrying out manoeuvres such as approaching a destination or a complex junction:

[Image: dashboard map – cruising overview]

[Image: dashboard map – detailed manoeuvre view]

The panel alongside typically shows an overhead map in a circle, with your location at the centre; it rotates as you move:

[Image: circular map panel]

It is also the main drive control panel when you are not steering – for example, if you want to tell the car to overtake the vehicle in front, the AI having decided not to do so already. You are not steering the car here, but “influencing” the AI to indicate that you would like it to do this, if safe:

[Image: circular map panel – overtake control]

Finally, the HUD necessarily does not show much information at all, apart from a basic indication of nearby traffic (so that you are reassured the computer can see it!) and any warnings of hazards ahead. You mainly want to be looking through the window for the traffic yourself, of course:

[Image: heads-up display]

The key interaction being tested is changing from human to computer-controlled driving, and back. The first is achieved by listening for the computer voice prompt, then letting go of the steering wheel once asked to. If you don’t retake control of the car when you need to – for instance, as you are changing onto a class of road for which autonomous driving is not available – and you have ignored the voice prompts, then the car will park up as soon as it’s safe to do so.

It’s an impressive simulator and crucial to shaping the UI of the autonomous cars that are now starting to appear on the horizon.

Photos and video courtesy of HERE Maps.

