Hi! My name is Cristi, and I am a Tools Developer at ThroughLine Games. In this blog post, I will describe the process we went through to develop our internal tools for creating conversations in Forgotton Anne. I will cover the vision we had at the beginning, the tool development process, some of the challenges we encountered along the way, and the final result. To keep the post accessible to a broader audience, I will not go into technical details but will instead focus on the overall design of the tools we developed.
Early in the production of the game, we knew that story and dialogue would be among the pillars of our game’s experience. At the same time, given the scope of the game and the size of the story content, we realized that we needed to build tools to make our development workflow as fast and effective as possible. We also knew it would be crucial to be able to iterate quickly on content if we wanted to obtain results that would satisfy our creative ambitions.
Several forms of assets and content need to be authored and put together to create a complete conversation scenario between two or more characters. The three most prominent forms of content required are text, sound, and graphics or animation. To give an overview of the tools we developed for setting up conversations, I will go through how we handled each type of content in turn.
The dialogue in our game can be anything from a single line to multiple characters talking at the same time and presenting the player with choices which lead to branching conversations. Finding a good data structure to store and represent complex conversations was not easy, but by starting small and adding new features as they were required, we reached a solution we were happy with. Since all of this complexity needed to be easily accessible, we created a tool for building these data structures, which we refer to as dialogue trees.
This tool serves multiple purposes, but the most important one is that it is a central place for creating and editing all the dialogue trees in the game in a visual and intuitive way. The center view, which takes up most of the space, is where the actual content of the dialogue trees is edited. The dialogue trees are composed of nodes in which we can write the dialogue lines and define how long they should be displayed. These nodes can then be connected to define the flow of the conversation. The resulting trees are saved into the project as Unity prefabs, which can simply be dragged into our levels and work out of the box. These features have proven essential, as they have made creating dialogue trees a straightforward and pleasant task.
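To make the idea of a dialogue tree concrete, here is a minimal sketch of how such a node-based, branching structure might look. This is purely my illustration in Python for readability: the names (`DialogueNode`, `is_branch`), the fields, and the example lines are assumptions, not the data structures we actually shipped (our tools are built in C# inside the Unity Editor).

```python
# Hypothetical sketch of a branching dialogue tree node structure.
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    speaker: str
    line: str
    duration: float  # seconds the line stays on screen
    # Outgoing connections; more than one means the player gets a choice.
    choices: list["DialogueNode"] = field(default_factory=list)

    def is_branch(self) -> bool:
        return len(self.choices) > 1

# Build a tiny branching conversation: one line, two possible replies.
root = DialogueNode("Anne", "Who goes there?", 2.0)
root.choices = [
    DialogueNode("Forgotling", "A friend!", 1.5),
    DialogueNode("Forgotling", "No one...", 1.5),
]
```

A single connected chain of such nodes plays as a linear exchange, while any node with multiple outgoing connections becomes a player choice that branches the conversation.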
In terms of audio, every line of dialogue has a corresponding voice recording, which means we needed a way of matching dialogue lines with audio files. The most obvious solution was to assign a unique identifier to each line so it could be easily identified, and because each line of dialogue was created through our Dialogue Editor, this was a simple task. From these identifiers we also generated a unique filename for every recorded line, so all we typically had to do was export a list of those filenames and assign them to the voice-over audio files.
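The scheme above can be sketched as follows. This is a hypothetical illustration: the ID format, the `VO_` prefix, and the helper names are my own assumptions to show the principle of deriving stable, predictable filenames from per-line identifiers, not the exact convention used in production.

```python
# Hypothetical sketch: give each dialogue line a stable unique ID and
# derive a predictable voice-over filename from it.
import itertools

class LineRegistry:
    def __init__(self):
        self._counter = itertools.count(1)
        self.lines = {}  # line_id -> dialogue text

    def register(self, tree_name: str, text: str) -> str:
        # e.g. "anne_intro_0001" -- unique and human-readable.
        line_id = f"{tree_name}_{next(self._counter):04d}"
        self.lines[line_id] = text
        return line_id

def vo_filename(line_id: str) -> str:
    # The exported list of these names tells the audio team exactly
    # which file belongs to which line.
    return f"VO_{line_id}.wav"

reg = LineRegistry()
lid = reg.register("anne_intro", "Who goes there?")
print(vo_filename(lid))  # -> VO_anne_intro_0001.wav
```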
One final aspect of creating a conversation is the visual one. What we wanted to achieve with our conversations in terms of animation is a flow of natural expressions that our characters play out. For that purpose, we built the Talk System, which serves as a way of defining a graph of interconnected animation states. These graphs can then be used to trigger a dynamic transition between the current animation state and another state in the animation graph by playing all the animations on the path leading to the new animation state. Below is a preview of the animation graph for our main character.
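One way to implement the "play all the animations on the path to the new state" behavior is a breadth-first search over the animation graph. The sketch below is my own minimal interpretation of that idea in Python; the graph contents, state names, and function signature are illustrative assumptions, not the Talk System's actual code.

```python
# Hypothetical sketch of a Talk System-style animation graph: a BFS finds
# the chain of intermediate animation states to play when transitioning
# from the current state to a target state.
from collections import deque

def transition_path(graph: dict[str, list[str]],
                    current: str, target: str) -> list[str]:
    """Return the states to play, from current to target (inclusive)."""
    queue = deque([[current]])
    visited = {current}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return []  # no route found; the caller could snap directly instead

# A toy expression graph for one character.
graph = {
    "Idle": ["Smile", "Frown"],
    "Smile": ["Laugh", "Idle"],
    "Frown": ["Idle"],
    "Laugh": ["Smile"],
}
print(transition_path(graph, "Idle", "Laugh"))  # -> ['Idle', 'Smile', 'Laugh']
```

Because every transition walks existing edges of the graph, the character never pops abruptly between unrelated expressions; it always animates through the intermediate states the animators authored.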
Finally, all of these different types of content come together and help create an engaging and lively conversation between two or more characters. However, we felt we were missing a way of fine tuning these conversations. So in order to tweak the timing of the dialogue lines, the animation transitions, and other parameters we created a tool that can give us an overview of a specific section of a conversation and let us control precisely how that section will play out. We call this tool the Cutscene Director.
In the Cutscene Director, we can preview dialogue sections and see if they need any adjustment. At the very top of the window lies a timeline where we can adjust the timing of lines according to the waveform of the corresponding audio file. Right below the waveform, we can set up the aforementioned animation state transitions by adding keyframes at the desired times in the timeline. During a conversation, we also want to trigger different actions like moving the camera or a character, and this can also be set up in this tool for accurate timing. On the left side of the window, we have a way to inspect and configure what the different actions in the timeline do. The closer we got to the end of development, the more it felt like this tool was exactly what we needed to bring all of the different forms of content together and to fine-tune the experiences we create with the conversations in our game.
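Under the hood, a timeline like this boils down to keyframed actions sorted by time and fired as playback advances. The following Python sketch is a simplified illustration under my own assumptions (class names, API, and the sample actions are invented); the real tool is a Unity Editor window with far more going on.

```python
# Hypothetical sketch of a Cutscene Director-style timeline: keyframes
# hold a time and an action, and playback fires each action once its
# time has been reached.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Keyframe:
    time: float                 # seconds into the cutscene
    action: Callable[[], None]  # e.g. trigger an animation transition

class Timeline:
    def __init__(self):
        self.keyframes: list[Keyframe] = []
        self._next = 0  # index of the next keyframe to fire

    def add(self, time: float, action: Callable[[], None]) -> None:
        self.keyframes.append(Keyframe(time, action))
        self.keyframes.sort(key=lambda k: k.time)

    def advance_to(self, t: float) -> None:
        # Fire every keyframe whose time has passed, in order.
        while (self._next < len(self.keyframes)
               and self.keyframes[self._next].time <= t):
            self.keyframes[self._next].action()
            self._next += 1

fired = []
tl = Timeline()
tl.add(1.0, lambda: fired.append("camera_pan"))
tl.add(0.5, lambda: fired.append("say_line"))
tl.advance_to(0.75)  # fires only "say_line"
tl.advance_to(2.0)   # then "camera_pan"
print(fired)  # -> ['say_line', 'camera_pan']
```

Adjusting a keyframe's time in the editor then amounts to changing a single number, which is what makes fine-tuning against the audio waveform so fast.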
In closing, these tools have proven invaluable throughout the production of Forgotton Anne. They have given us the speed of implementation and iteration that is essential for creating meaningful conversations, and none of that would have been possible without the ease of use and extensibility of the Unity Editor. With that said, it is safe to conclude that investing time and resources in tools and systems that improve your workflow is worth it, and ultimately allows you to create a better game.