DETROIT: Become Human
@Game Builder NPC
By: QUANTIC DREAM
What: Adventure game. In the near future, androids develop feelings and fight for their rights.
My job: Game Builder NPC
The first part of the process is, of course, designing what our NPCs do. This design is based on the ambience of the scene, the message we want to send to the player, and the kind of implementation we will use (see the next parts).
Implementation: Pedestrian Manager
One of the most important ways to implement AI behaviour in Detroit is the “Pedestrian Manager”. This in-house tech allows us to create paths and action spots that simulate a “street life” behaviour. It is coupled with a “Traffic Manager”, a script-based tech that simulates traffic, intersections and crosswalks.
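The actual Pedestrian Manager is proprietary tech, but the path/action-spot idea can be sketched in a few lines. The following is a minimal illustration, assuming a simple model where pedestrians loop along waypoints and may pause at designated spots; every class, field and value here is invented for the example.

```python
import random

class ActionSpot:
    """A point on a path where a pedestrian may pause to play an animation."""
    def __init__(self, name, duration):
        self.name = name          # e.g. "look_at_shop_window" (hypothetical)
        self.duration = duration  # seconds spent at the spot

class Path:
    """An ordered loop of waypoints; some waypoint indices carry action spots."""
    def __init__(self, waypoints, spots):
        self.waypoints = waypoints  # list of (x, y) positions
        self.spots = spots          # dict: waypoint index -> ActionSpot

class Pedestrian:
    """Walks a path each tick; at an action spot it may stop for a while."""
    def __init__(self, path, rng=None):
        self.path = path
        self.rng = rng or random.Random()
        self.index = 0
        self.wait = 0.0

    def update(self, dt):
        if self.wait > 0.0:          # currently playing a spot action
            self.wait -= dt
            return
        spot = self.path.spots.get(self.index)
        if spot and self.rng.random() < 0.5:   # 50% chance to use the spot
            self.wait = spot.duration
            return                   # pause at the spot before moving on
        self.index = (self.index + 1) % len(self.path.waypoints)

    @property
    def position(self):
        return self.path.waypoints[self.index]
```

Layering many pedestrians, each on its own path with randomized spot usage, is what produces the impression of an organic street without scripting every individual.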
Implementation: Custom AI Systems
Sometimes a scene requires AI behaviours so specific that they can't be automated by the previous tech. In that case, we use visual scripting (similar to Unreal Engine's Blueprints) to build custom behaviours. For instance, in the E3 2017 scene (known as “Capitol Park”), I built a complex system to have NPCs follow the player and interact with the environment depending on the player's actions.
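The Capitol Park system itself was built in a proprietary visual-scripting tool, so it can't be reproduced here. As a rough sketch of the follow-the-player part only, here is the kind of logic such a node graph encodes, written as Python; the distance and speed values and the event names are placeholders, not values from the game.

```python
import math

FOLLOW_DISTANCE = 2.0   # hypothetical tuning value, not from the game
SPEED = 1.5             # hypothetical walk speed

class FollowerNPC:
    """Moves toward the player until within FOLLOW_DISTANCE, then idles.
    A simple event hook stands in for reacting to what the player does."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.state = "idle"

    def update(self, player_x, player_y, dt):
        dx, dy = player_x - self.x, player_y - self.y
        dist = math.hypot(dx, dy)
        if dist > FOLLOW_DISTANCE:
            self.state = "follow"
            # step toward the player, but never overshoot the follow distance
            step = min(SPEED * dt, dist - FOLLOW_DISTANCE)
            self.x += dx / dist * step
            self.y += dy / dist * step
        else:
            self.state = "idle"

    def on_player_event(self, event):
        # placeholder reaction hook, e.g. triggered by the scene script
        if event == "player_ran":
            self.state = "alert"
```

In a visual-scripting tool, each branch above would typically be a node (distance check, move-to, state switch), which is what makes such one-off behaviours fast to author per scene.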
Implementation: Staging Actions and static actions
These implementation methods are the most common for NPCs in Detroit. The concept is simple: we script looping cycles (static actions) and reactions to the player's proximity (staging actions) so that there is always something interesting to watch in the background.
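The cycle/proximity split can be sketched as two tiny classes. This is only an illustration of the concept under my own naming, assuming a static action is a looping animation and a staging action is a one-shot reaction fired when the player enters a radius; none of these names come from the engine.

```python
class StaticAction:
    """A looping animation cycle (e.g. sweeping, typing) that runs
    regardless of what the player does."""
    def __init__(self, cycle):
        self.cycle = cycle    # list of animation names forming the loop
        self.frame = 0

    def update(self):
        self.frame = (self.frame + 1) % len(self.cycle)
        return self.cycle[self.frame]

class StagingAction:
    """A one-shot reaction fired when the player comes within `radius`."""
    def __init__(self, position, radius, reaction):
        self.position = position
        self.radius = radius
        self.reaction = reaction    # e.g. "turn_head_toward_player"
        self.triggered = False

    def update(self, player_pos):
        px, py = player_pos
        x, y = self.position
        # squared-distance check avoids a square root per frame
        if not self.triggered and (px - x) ** 2 + (py - y) ** 2 <= self.radius ** 2:
            self.triggered = True
            return self.reaction
        return None
```

Static actions keep the scene alive from a distance; staging actions reward the player for coming close, which is why the two are usually combined on the same NPC.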
Implementation: Sequence Arrangement
This part of the implementation concerns the filmed parts of the game. Since the game is highly cinematic, many sequences have to be staged with NPCs, either in the background or as “main characters”.