Simulation environment complete?

Bird's Eye View

I’ve just uploaded the latest version of my world map. It finally has all of the areas we need for the simulation – the town, country road, and city are all in place. I will further improve the environment as we move forward. We may need to tweak some of the locations, and if our mobile device can handle it I will add some “eye candy”.

Here are some screenshots of each individual area:

Town · Country · City

I got some iOS/Android Bluetooth controllers (for free)

ARM Bluetooth Controllers

Today I attended ARM Game Developer Day in Shoreditch. ARM processors are found in over 95% of mobile devices (all iOS devices, most Android devices), so it should come as no surprise that a major focus of the event was the future of mobile gaming – namely VR.

Seeing as we’re in the middle of developing a mobile VR simulation, the timing couldn’t have been more perfect. I had the opportunity to hear many of ARM’s engineers and affiliates speak about mobile graphics optimization, VR considerations for mobile devices, and the pros and cons of various VR headsets.

One obstacle brought up during the talks was the lack of user input while wearing a VR headset that requires your smartphone (Gear VR, Google Cardboard). Luckily, to help us developers remedy the problem, the community managers were handing out these tiny, adorable Bluetooth gamepads. They’re not the most advanced or durable little devices – in fact, a quick Google search reveals that they’re made in China – but they’re exactly what we need for our project!

I’ll be testing the gamepads with John tomorrow afternoon, but in the meantime, I’ve also included a QR code (which leads here) linking to some super-relevant information for us.

Technical design meeting – Nov. 4

An extremely flattering image. Yup.
From the left: Luke, John, David, and Francesco. If this selfie were a competition, I’d say John won.

Yesterday, a few of us got together to discuss the more technical aspects of our simulation design. With our first deadline, November 17th, less than two weeks away, we felt it imperative that we reach a concrete understanding of where to begin our work.

Approaching our design from a technical standpoint has allowed us to focus the scope of our simulation, and break it down to the essentials – an initial structure from which we can grow the project into something more robust.

Game/Simulation concept:

Our simulation will compare and contrast the experience of being a driver in a traditional car, and the experience of being a “driver” in a driverless car. The traditional car will be controlled by the player. The driverless car will not be controlled by the player, but will display information about the surrounding environment – information the driverless car uses to navigate, which would not be present in a traditional car.

Our simulation will be broken up into two phases – a customization phase, during which we or the player set the parameters of a short journey, and a simulation phase, during which the player experiences that journey from inside a vehicle, using a VR headset.

The “customization” phase:

Though we haven’t yet decided on a visual design for this phase, we have a good idea of how it will function. This phase will allow us, or the player, to change the parameters used to generate the simulated world. We are aiming to include the following parameters (a rough configuration sketch follows the list):

Driving Hazards – Randomized events to which either the driver or the car will need to react.

  • Pedestrians
  • Other, erratic drivers
  • [Bonus] Cyclists or motorcycle riders, if we have time to implement them

Environmental Hazards – Driving conditions which could affect the performance of a car or driver

  • Darkness
  • Fog
  • [Bonus] Rain, if we have time to implement it

Type of car – Whether the player will drive a traditional car or be a passenger in a driverless car, and how each type of car is controlled

  • The traditional car will have acceleration and braking controls
  • The driverless car will have a panic button, but no direct controls. It will, however, visualize information for its passengers

Amount and type of information displayed in the driverless car – Information about the car and its surroundings, visible to the driver/passenger in real time. This could include:

  • Surrounding obstacles, or potential hazards in the vicinity of the vehicle
  • Visualization of information being passed between vehicles
  • Statistical information such as current speed and the time it would take to come to a complete stop
  • A GPS-style map of the current journey
  • As our UX team members do more research, we may identify other information which could benefit our drivers
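
To make the parameter list above concrete, here is a rough sketch of how the customization phase could hand a journey configuration to the simulation phase. It’s written as plain Python rather than engine code, and every name in it (JourneyConfig, CarType, InfoLayer) is a placeholder I’ve made up for illustration, not something we’ve committed to.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class CarType(Enum):
    TRADITIONAL = auto()   # player accelerates and brakes
    DRIVERLESS = auto()    # panic button only, plus an information display


class InfoLayer(Enum):
    NEARBY_HAZARDS = auto()      # obstacles and hazards around the vehicle
    VEHICLE_TO_VEHICLE = auto()  # information passed between cars
    STATISTICS = auto()          # current speed, time to stop, etc.
    GPS_MAP = auto()             # map of the current journey


@dataclass
class JourneyConfig:
    """Parameters chosen during the customization phase."""
    car_type: CarType = CarType.DRIVERLESS
    driving_hazards: list = field(default_factory=lambda: ["pedestrians", "erratic_drivers"])
    environmental_hazards: list = field(default_factory=lambda: ["darkness"])
    info_layers: set = field(default_factory=lambda: {InfoLayer.NEARBY_HAZARDS, InfoLayer.STATISTICS})


# Example: a foggy night drive in a driverless car with every info layer enabled.
config = JourneyConfig(
    car_type=CarType.DRIVERLESS,
    environmental_hazards=["darkness", "fog"],
    info_layers=set(InfoLayer),
)
print(config)
```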

Though we had discussed having this customization phase run in tandem with the VR phase, the developmental challenge this kind of gameplay poses falls outside the scope of our project. Separating these two phases will afford us time to focus on improving our driverless car simulation, instead of exploring more VR-specific game dynamics.

The “simulation” phase:

During the simulation phase, the player will experience a brief drive through a dynamic game world. Over the course of their session, the player will encounter various hazards and obstacles.

The player, wearing a VR headset, will be placed into the driver’s seat of a traditional or a driverless car. The driverless car will relay real-time information to the player, helping them understand how driverless cars make decisions. By contrast, the traditional car will not relay information, but will be under the player’s control.

There is a pedestrian behind a building. Our driver cannot see him. But the car in front can see him, and relays this information to our vehicle.
(Ignore the bits in the lower-left)

A driverless car has access to a wealth of information – information from its sensors, information provided by other cars around it, and even information downloaded from satellites and the internet. Presenting that information represents a fascinating UX and design challenge.
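
To illustrate the pedestrian-behind-a-building scenario above, here’s a back-of-the-envelope sketch of how that information could flow: a vehicle that can see a hazard broadcasts it, and a vehicle that cannot see it merges the report into its own picture of the world. The classes and fields are my own illustrative assumptions, not our actual design.

```python
from dataclasses import dataclass, field


@dataclass
class Hazard:
    kind: str        # e.g. "pedestrian"
    position: tuple  # (x, y) in world coordinates


@dataclass
class Vehicle:
    name: str
    visible_hazards: list = field(default_factory=list)  # what this car's own sensors see
    known_hazards: list = field(default_factory=list)    # sensors plus relayed reports

    def broadcast(self):
        """Share everything this vehicle can currently see with nearby vehicles."""
        return list(self.visible_hazards)

    def receive(self, reports):
        """Merge relayed reports our own sensors cannot see."""
        for hazard in reports:
            if hazard not in self.visible_hazards and hazard not in self.known_hazards:
                self.known_hazards.append(hazard)


pedestrian = Hazard(kind="pedestrian", position=(40.0, 12.0))

lead_car = Vehicle(name="car ahead", visible_hazards=[pedestrian])
our_car = Vehicle(name="our driverless car")  # a building blocks the pedestrian from view

our_car.receive(lead_car.broadcast())
print(our_car.known_hazards)  # the pedestrian, even though our own sensors never saw him
```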

The purpose of our simulation is to examine how drivers react in a traditional driving experience, and how they react in a driverless car experience. Furthermore, how does the driverless car experience change when players are presented with more, or less, information?

Let’s get technical:

John, David, Francesco, and I discussed a few possible ways in which we could construct such a simulation. We need to build a world in which the player can be surprised by unexpected events. For this reason, we don’t want players to be able to memorize the driving environment. We need to procedurally generate the driving journey.

Some rough sketches of world tiles through which the player will drive

We are in the process of designing a series of “world tiles” – chunks of road, city, and town which we can stitch together to create varied, exciting routes through the simulation. These tiles can vary in complexity, and provide different navigational challenges. As development progresses, our tile design will be informed by the technical scope of the simulation, and by our UX/design team members. As our understanding of driverless car technology improves, the types of scenarios we wish to include in our game world will be defined more clearly.
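
To give a rough feel for the tile-stitching idea, here’s a minimal sketch of procedurally assembling a route from a pool of world tiles, where each tile declares which tiles may legally follow it. The tile names and connection rules are invented for the example; our real tiles will be 3D chunks of road, town, and city.

```python
import random

# Each tile lists which tile types can follow it, so the stitched route stays
# coherent (no city block dropped into the middle of a country road).
TILE_RULES = {
    "country_straight": ["country_straight", "country_bend", "town_entrance"],
    "country_bend":     ["country_straight", "town_entrance"],
    "town_entrance":    ["town_street"],
    "town_street":      ["town_street", "town_junction", "city_entrance"],
    "town_junction":    ["town_street"],
    "city_entrance":    ["city_block"],
    "city_block":       ["city_block", "city_junction"],
    "city_junction":    ["city_block"],
}


def generate_route(length, start="country_straight", seed=None):
    """Stitch together a randomized but coherent sequence of world tiles."""
    rng = random.Random(seed)
    route = [start]
    while len(route) < length:
        route.append(rng.choice(TILE_RULES[route[-1]]))
    return route


# Different seeds give different journeys, so players can't memorize the road.
print(generate_route(10, seed=1))
print(generate_route(10, seed=2))
```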

First steps towards creating our simulation:

To create our initial prototype, we need to break the simulation down to a bare minimum. By determining the most critical elements, we will have a strong idea of how to begin development.

To begin with, we need to create our driverless car. Francesco is beginning to model the car interior, which presents an interesting design challenge. What does a driverless car look like? What’s needed on the inside? These questions will be answered by our research, and by UX analysis.

Google’s car looks like this. Will we follow their model?

Using the cars Francesco is creating, we need to begin simulating our game world. John and I will be working closely to create the game, but there’s a lot of design work remaining – both in terms of the technical design, and the functional design. We are a ways off from having a simulation up and running, but we have laid out concrete goals for our end product.

The major question remains: how, precisely, do we expect this simulation to be used? We initially discussed using the simulation as a “game” which promotes public acceptance of driverless cars. We hoped to demonstrate that displaying information inside a driverless car could put passengers at ease, and create trust. Perhaps the way we display information inside the car could even be used as a prototype for an actual driverless car interface.

What we realized during yesterday’s meeting is that our original goal – promoting public acceptance – requires a great deal of UX research. In order to shape people’s perceptions of driverless cars, we first need to know how people react to different driving/driverless experiences. And what better way to gauge reactions to different driving experiences than …running people through our simulation? Our simulation, by design, will provide a framework for creating numerous, varied driving scenarios. Perhaps instead of setting “making a game” as our immediate goal, we can concentrate on how our simulation may be used as a UX research tool?
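
If we do lean towards the research-tool angle, the raw material would be a log of what each player did under each configuration. Below is a very rough sketch of that kind of session recorder; the event names and fields are purely illustrative, not a spec.

```python
import json
import time


class SessionRecorder:
    """Collects timestamped player events for later UX analysis."""

    def __init__(self, config_description):
        self.config = config_description  # which journey parameters were active
        self.events = []
        self.start = time.time()

    def log(self, event, **details):
        self.events.append({
            "t": round(time.time() - self.start, 3),  # seconds since session start
            "event": event,
            **details,
        })

    def save(self, path):
        with open(path, "w") as f:
            json.dump({"config": self.config, "events": self.events}, f, indent=2)


# Example: a short driverless-car session with fog enabled.
recorder = SessionRecorder({"car": "driverless", "hazards": ["fog"]})
recorder.log("hazard_shown", kind="pedestrian", distance_m=35)
recorder.log("panic_button_pressed")
recorder.save("session_001.json")
```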

That question remains to be answered. Hopefully we’ll have reached a conclusion over the coming week. What’s certain for now is that we’ve made a great deal of progress, and are all excited about the shape our project is taking.

More to come soon.

A silly (and rather tense) depiction of a world in which driverless cars are the norm

I thought I’d break up all the seriousness with this delightful video. Very clever use of object cloning to create the effect.

This does raise the question, though: in a world where driverless cars are standard, would pedestrians learn not to fear moving vehicles, knowing that the vehicles would stop to avoid them?

The Trolley Problem

One of the major problems driverless cars face in terms of public approval is the Trolley Problem. How would driverless cars be programmed to handle it?

… you are the driver of a runaway trolley barreling down the railway tracks toward five workmen who cannot get out of the way. To prevent their deaths, your only option is to divert the trolley onto a side track, but diverting would kill one worker on the side track.

Wikipedia page here: https://en.wikipedia.org/wiki/Trolley_problem

An interesting piece on the matter: http://www.forbes.com/sites/modeledbehavior/2015/09/13/ethics-wont-be-a-big-problem-for-driverless-cars/