I don’t know if this is a real thing or just a stupid idea, but I was watching some folks talk about giant robot stories (in the context of giant robot games) while also working on a customer risk assessment and I suddenly wondered if we could use one in the context of another?

Currently I work on making giant robots safe and secure, so I already know that risk analysis works in the context of robots. But what about risk analysis as a tool for game design? There are lots of methodologies for assessing the risk to a system (and determining how to mitigate it), but one I really like, because of its collaborative and practical nature, is the French government’s EBIOS 2010 methodology. We won’t dig into it in detail, nor discuss my professional variations on it, but rather look at it from a very high altitude and see if it makes a game. More correctly, see if it identifies the parts of a simulation that are fun to model in a game. Maybe we get some new giant robot direction!
So the first step is to identify the assets of the system. Now, this is often naïvely interpreted as meaning the physical objects of value in the system, but that’s not quite right. The assets of the system are the elements that are critical to its correct and safe operation. They might be things but they might also be functions.
assets
So what kind of assets do giant robots have?
- integrity of their armour — if the armour is busted, that’s bad
- safety of the pilot
- ability to destroy an opponent
- ability to navigate difficult terrain
- security from extreme environmental threats (radiation, engineered disease, poison)
- ability to function in a wide range of temperatures
- ability to function in extremes of shock and vibration
- ability to detect threats (enemies in this context)
I’m sure there are more, but this is a pretty good list to start with. So the next step is to determine just how bad your day gets if these assets are compromised. Since this is subjective we don’t want really fine granularity — let’s just say it’s zero if nothing bad happens, 1 if it’s a pain in the ass, 2 if the system becomes useless, and 3 if the pilot dies.
So integrity of the armour. Let’s call that a 1 because we have pilot safety and basic functions somewhere else. We don’t really care much if the armour is damaged if nothing else happens.
Pilot safety, that’s a 3 obviously. Note that in a real assessment here is where we would argue about the dollar value of a life — is it really more important to keep the pilot alive than anything else? And we might change the severity definitions based on this discussion. Anyway, and so on. Let’s summarize:
- 1 — integrity of their armour
- 3 — safety of the pilot
- 2 — ability to destroy an opponent
- 1 — ability to navigate difficult terrain
- 2 — security from extreme environmental threats (radiation, engineered disease, poison)
- 2 — ability to function in a wide range of temperatures
- 2 — ability to function in extremes of shock and vibration
- 2 — ability to detect threats (enemies in this context)
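To make the bookkeeping concrete, here’s a minimal sketch in Python of that asset table. The short names and the data structure are mine, purely for illustration; this isn’t anything EBIOS prescribes.

```python
# Severity scale from above: 0 = nothing bad happens, 1 = pain in the ass,
# 2 = the system becomes useless, 3 = the pilot dies.
ASSET_SEVERITY = {
    "armour integrity": 1,
    "pilot safety": 3,
    "destroy an opponent": 2,
    "navigate difficult terrain": 1,
    "environmental security": 2,
    "temperature tolerance": 2,
    "shock and vibration tolerance": 2,
    "threat detection": 2,
}
```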
Next we need to talk about what threatens these assets. What are the threats?
threats
So normally we’d brainstorm these, get lots of ideas, and then winnow them down to essential and unique threats. But let’s short-circuit that, since you can’t respond very quickly to this, and I’ll just list a few.
- enemy weapons damage our weapons
- enemy weapons damage our mobility subsystems
- enemy weapons damage our pilot cockpit
- environmental temperature is very high or very low
- weapons use creates too much heat
- weapons malfunction
- mobility system generates too much heat
- subsystem breaks down from lack of maintenance
- enemy weapons damage sensors
I think we can already see a game system coming together, though I’m not blind to the fact that I am thinking about game systems as I generate this list. It’s a bit of a cheat, so I’m not sure it proves much. Maybe if I started with a topic I don’t know well?
Anyway, the next step is to decide how likely each threat is. Let’s say 0 is amazingly unlikely, 1 is unlikely, 2 is common, and 3 will happen pretty much every time you get into a fight. Let’s quickly go through that:
- 2 — enemy weapons damage our weapons
- 2 — enemy weapons damage our mobility subsystems
- 1 — enemy weapons damage our pilot cockpit (because it’s small compared to everything else!)
- 1 — environmental temperature is very high or very low
- 3 — weapons use creates too much heat
- 1 — weapons malfunction
- 2 — mobility system generates too much heat
- 2 — subsystem breaks down from lack of maintenance
- 2 — enemy weapons damage sensors
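Continuing the sketch, each threat carries its likelihood plus a note of which assets it touches. The asset mapping here is my own rough guess, just so there’s something to compute with, and only a few of the threats are shown.

```python
# Likelihood scale: 0 = amazingly unlikely, 1 = unlikely,
# 2 = common, 3 = pretty much every fight.
# "impacts" is my rough guess at which assets each threat compromises.
THREATS = {
    "enemy weapons damage our weapons": {
        "likelihood": 2,
        "impacts": ["destroy an opponent"],
    },
    "enemy weapons damage our mobility subsystems": {
        "likelihood": 2,
        "impacts": ["navigate difficult terrain"],
    },
    "enemy weapons damage our pilot cockpit": {
        "likelihood": 1,
        "impacts": ["pilot safety", "navigate difficult terrain", "threat detection"],
    },
    "weapons use creates too much heat": {
        "likelihood": 3,
        "impacts": ["destroy an opponent", "navigate difficult terrain", "pilot safety"],
    },
}
```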
risk matrix
Now we just multiply these to find out how much we care about each scenario. If a threat doesn’t impact any asset we don’t care. So for example, let’s look at “enemy weapons damage our weapons”. That seems to affect only our ability to damage opponents, which has an asset value of 2. So the risk for this threat is 2 x 2 = 4. We’d normally make a risk appetite grid to say just how bad a 4 is. Something like:
| Likelihood ↓ \ Severity → | 0 | 1 | 2 | 3 |
| --- | --- | --- | --- | --- |
| 0 | who cares | who cares | who cares | maybe bad |
| 1 | who cares | maybe bad | worrying | bad |
| 2 | who cares | worrying | bad | very upsetting |
| 3 | maybe bad | bad | very upsetting | unacceptable |
So a 2 x 2 is BAD.
Let’s look at something with multiple asset impact. Enemy weapons damage our pilot cockpit. Now clearly this affects our pilot safety, our mobility, frankly almost all of our assets. So we pick the most severe one: pilot safety. So that’s a 1 x 3 — BAD.
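In code (still the same illustrative sketch, built on the dictionaries above), the rule is: take the worst severity across the assets a threat touches, cross it with the threat’s likelihood, and read the label off the grid.

```python
# Risk appetite grid, indexed as GRID[likelihood][severity].
GRID = [
    ["who cares", "who cares", "who cares", "maybe bad"],
    ["who cares", "maybe bad", "worrying", "bad"],
    ["who cares", "worrying", "bad", "very upsetting"],
    ["maybe bad", "bad", "very upsetting", "unacceptable"],
]

def rate(threat):
    # Worst-case severity across the assets this threat compromises.
    severity = max(ASSET_SEVERITY[a] for a in threat["impacts"])
    return GRID[threat["likelihood"]][severity]

print(rate(THREATS["enemy weapons damage our weapons"]))        # bad (2 x 2)
print(rate(THREATS["enemy weapons damage our pilot cockpit"]))  # bad (1 x 3)
```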
As we go through this we start thinking about mitigations. For each scenario that’s, let’s say, worrying or worse, are there mitigations we can put in place that reduce either the severity or the likelihood of the event? So, for example, we could add armour to the cockpit and maybe reduce the severity by one step. That’d be nice. But we also need to consider the ramifications (costs) of the mitigations.
Because I want to talk about it in the next step, let’s also look at “weapons use creates too much heat” (likelihood 3). We now have to invent the impact of heat on the robot, and here we’re also designing a game: we’re imagining features of this robot and its world context. So let’s say we think that a hot robot is an unhappy robot. That most subsystems degrade. Certainly the weapon, but also mobility and maybe, ultimately, pilot safety. So that happens with a likelihood of 3, and pilot safety is the most severe of all the impacts. 3 x 3 is unacceptable.
mitigations
So a mitigation is a recommended change to the system that reduces the risk level of a given threat scenario. And this, I think, is where we start getting a game: when assessing a mitigation we have to consider its cost, and that’s where we start to get at least robot construction rules.
We have an unacceptable scenario up there — weapons overheating can kill the pilot. That would be bad. It can also do lots of other things, so even if we solve the pilot problem we still could wind up with a 3 x 2 that’s very upsetting. So we’d really like to bring down the likelihood of a weapon overheating. We could:
- prefer weapons that do not generate much heat (like rockets, say)
- add heat dissipation equipment to weapons (sinks, heat pipes)
- add heat dissipation equipment to the whole system
- … and so on
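In the sketch, a mitigation is nothing more than a step knocked off the likelihood or the severity before we re-read the grid. The function and the heat-sink example below are mine, just to show the shape of it.

```python
def mitigated_rating(threat, likelihood_drop=0, severity_drop=0):
    # Re-read the grid after a mitigation lowers likelihood and/or severity.
    likelihood = max(threat["likelihood"] - likelihood_drop, 0)
    severity = max(ASSET_SEVERITY[a] for a in threat["impacts"]) - severity_drop
    return GRID[likelihood][max(severity, 0)]

# Heat sinks on the weapons: knock the overheating likelihood from 3 to 2.
# The scenario drops from "unacceptable" to "very upsetting", which is
# better but not solved.
overheat = THREATS["weapons use creates too much heat"]
print(mitigated_rating(overheat, likelihood_drop=1))
```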
Now from a game design perspective what’s interesting here is not how we make a giant war robot safer, but the detail that we are adding to the system. Now we know we want to track heat, maybe by component. We know that some weapons generate more or less heat. We have a new subsystem (heat sinks) that could also be damaged and create cascading trouble.
discussion
What this seems to do is give us a big pool of credible detail: elements of a fictional universe that have some justification for existing. Ultimately a good (or, more often, bad) risk analysis is what drives pretty much everything in the real world: nothing is perfect, so we need to decide how much imperfection we can tolerate. A lot of complexity, if not all of it, comes out of this thought process, and trade-offs like that are also a Good Trick in game design: they create diversity in approaches to playing the game well.