Technical Issues in Tools Development Roundtable - Day 2: Tools
About the moderator
- Geoff Evans
- Tools Director at Infinity Ward
- GDC Advisory Board Member
- Created The Toolsmiths Community around Technical Issues in Tools Development Roundtables
- Geoff’s twitter account
- Geoff’s blog
Topics for the day include: programming languages, design patterns, usability, UI frameworks
Table of Contents
- Node Graph UI
- Python or Script API access to toolchain
- Tools or Workflows for live services
- Collaborative editing
- Tools configuration for power users
- Asset cost and visualization
- Porting and redesigning tools for new APIs or Frameworks?
- Techniques for designing tool workflows
- Ease of use and what is going on with the data
- Providing feedback to the user while they are using the data
- Organizing assets
Node Graph UI
Context: WPF
- in-house lib (supports 100k visual elements)
- uses SharpDX to make it performant
- have a slight chance of this lib becoming open source
- stopped using xaml
- stopped using wpf
- Secrets to scaling WPF - stop using WPF
- Creating a Tools Pipeline for Horizon Zero Dawn by Dan Sumaili & Sander van der Steen
- Qt Node based editor
UI framework poll
- C#/WPF - 90%
- Windows Forms - 5%
- native MFC - 20%
- native C++/Qt - 30%
- python (PyQt/PySide) - 80%
- imgui - 3%
- web-based
- maya/MB native UI
- rolling something custom
Python or Script API access to toolchain
Context: Pros/Cons of having/not having script access to the toolchain.
automation
- helps interface with TA tools
- have C++ and C# APIs exposed
- IronPython (running on .Net) helps interop with C#
- lets you script directly against the editor (bypassing the compile step)
- TAs writing in IronPython
- engineers don’t think too much about it
- working out great for the past 2 years
- get TAs to have access to engine level features and data
- this has made TAs more productive
- getting the libs in memory so we are not constantly crossing language boundaries
- TAs use C# to have easier access to the editor’s APIs
- CLR from python
- using ClojureCLR
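The script-access idea above can be sketched in plain Python: the editor exposes a thin command facade that TA scripts drive directly, with no rebuild of the editor. The class and command names here are hypothetical, not a real editor API.

```python
# Hypothetical sketch: exposing editor commands to a scripting layer so TAs
# can automate the toolchain without a compile step. Names are illustrative.

class EditorAPI:
    """Thin script-facing facade over editor internals."""
    def __init__(self):
        self._commands = {}

    def register(self, name, fn):
        # engineers register editor operations once, on the engine side
        self._commands[name] = fn

    def call(self, name, *args, **kwargs):
        # TA scripts invoke them by name at runtime
        if name not in self._commands:
            raise KeyError(f"unknown editor command: {name}")
        return self._commands[name](*args, **kwargs)

editor = EditorAPI()
editor.register("reimport_asset", lambda path: f"reimported {path}")

# A TA script can now drive the editor directly, no rebuild needed:
result = editor.call("reimport_asset", "textures/rock_diffuse.tga")
```

In an IronPython setup the same pattern applies, except the registered functions live in C# and the script calls cross into the CLR in-process.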
Tools or Workflows for live services
Context: anything with back end development.
- for configuring and running services independently recommend looking into Docker
- easy to set up and run (locally for testing)
- the Docker instances run locally on users’ machines to run services
- independent of the OS version and SW installed
- using docker namespaces for dev, QA, mainline
- helping users spin up the server stack for a game locally
- mobile games (on top of docker)
- easy to configure/setup environments for QA in a test branch
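One way to read the "namespaces for dev, QA, mainline" point: derive container names and ports from the environment so the stacks can coexist on one machine. A minimal sketch, assuming made-up service names, ports, and image tags:

```python
# Hedged sketch: composing per-environment `docker run` invocations so dev,
# QA, and mainline stacks run side by side. Ports/images are invented.

ENVIRONMENTS = {"dev": 8000, "qa": 8100, "mainline": 8200}

def docker_run_command(env, service, image):
    """Build a `docker run` command namespaced by environment."""
    base_port = ENVIRONMENTS[env]
    return (
        f"docker run -d "
        f"--name {env}_{service} "   # e.g. qa_matchmaking
        f"-p {base_port}:80 "        # each environment gets its own port
        f"{image}"
    )

cmd = docker_run_command("qa", "matchmaking", "game/matchmaking:latest")
```

In practice this generation would live in a small launcher tool (or a compose file per environment) so QA can spin up a test-branch stack with one command.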
Collaborative editing
Context: p4, git, shotgun (communicate, notes to assets, annotations, review tools)
Editing levels/assets at the same time (see what the other Content Creator is doing)
- shotgun integration cost (might be an issue)
- feedback loop - annotation, notes very useful
- look into what the review process of assets looks like
- Working Together: Solutions for Collaborative Asset Creation by Niklas Gray (aka Niklas Frykholm)
- the data model that they use to merge the data in real time
- Our Machinery blog The Story behind The Truth: Designing a Data Model by Niklas Gray
- focusing on the data models (keeping it simple)
- maintenance issues
- locking aspects
- having different editors running and communicating with the server to perform merge checks
- are there any merge conflicts
- streaming changes between machines is not a simple problem to solve
- how are you going to share the data and will you be able to sleep at night with the current implementation?
- look into transactional file formats (look into journaling)
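The transactional/journaling idea above boils down to: record every edit as an append-only entry and recover the current state by replay. A toy sketch (in-memory here; a real tool would append to a file so a crash loses at most the last entry):

```python
# Illustrative sketch of a journaled edit log: every change is recorded as an
# append-only entry; current state is recovered by replaying the journal.
import json

class Journal:
    def __init__(self):
        self.entries = []  # in a real tool, this is an append-only file

    def append(self, op, key, value=None):
        # each edit is one self-contained, serialized transaction
        self.entries.append(json.dumps({"op": op, "key": key, "value": value}))

    def replay(self):
        # rebuild document state by applying entries in order
        state = {}
        for raw in self.entries:
            e = json.loads(raw)
            if e["op"] == "set":
                state[e["key"]] = e["value"]
            elif e["op"] == "delete":
                state.pop(e["key"], None)
        return state

j = Journal()
j.append("set", "player_speed", 5.0)
j.append("set", "player_speed", 6.5)   # later edit wins on replay
j.append("delete", "debug_flag")
state = j.replay()
```

The same log doubles as an undo history and as a stream of changes to send to other machines, which is why it comes up in the collaborative-editing context.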
Is live editing actually useful?
no one had an answer
Tools configuration for power users
Context: providing a configuration file that will help users to configure the tools - best practices, what to watch out for
customization
without providing another build to a team, they can customize a tool in a declarative fashion
- Watch out for configuration files that are based on other configs that are based on other configs…
- a common failure mode is a config referencing other configs (make the chain clear)
- hard to reason about what is in the configuration
- Don’t give too much power to the descriptions
- plug-in system
- ability to tailor to specific skill level
- a plug-in system that is based on Managed Extensibility Framework ( for C#)
- Don’t do inheritance of config files
- have config validation (a bad setting might look like a bug, but is a settings issue)
- having config causing issues on startup
- most of the customizations are very technical
- few tools allow users to actually customize the visual real estate of the tool
- customizing the property editor to hide the things that you don’t care about
- dockable tabs
- have own custom windowing framework
- visual studio like
- have photoshop like for artists
- custom context menu (window level, custom views)
- filtering of data
- filtering in the property editor (add the ability to save the filters as presets)
- customization for maximizing productivity
- use imgui
- easy to hide UI (switch to debug mode when an engineer stops by)
- open source docking framework AvalonDock
question: should the tool be the same for everyone or let everyone customize it like crazy?
- getting the balance of the way you can actually customize things and making it useful for a lot of people (not just experts)
- Don’t fall into the trap of marking features as “this feature is for the experts only”
- most people will think that they are not experts
- people won’t be able to find them
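The two config rules above (validate at startup, no config inheritance) can be sketched together: a flat declarative config checked against a schema, with any `extends`-style chaining rejected outright. The schema keys here are hypothetical.

```python
# Hedged sketch: validating a flat, declarative tool config at startup and
# rejecting config-inheritance chains. Schema/keys are illustrative.

SCHEMA = {"theme": str, "autosave_minutes": int, "show_advanced": bool}

def validate_config(cfg):
    """Return a list of human-readable errors; empty list means valid."""
    errors = []
    if "extends" in cfg:
        # the "configs based on other configs" failure mode
        errors.append("config inheritance ('extends') is not allowed")
    for key, value in cfg.items():
        if key == "extends":
            continue
        expected = SCHEMA.get(key)
        if expected is None:
            errors.append(f"unknown setting: {key}")
        elif not isinstance(value, expected):
            errors.append(f"{key}: expected {expected.__name__}")
    return errors

errors = validate_config({"theme": "dark", "autosave_minutes": "five"})
```

Surfacing these errors in the tool's UI at startup, rather than failing silently, is what keeps a settings issue from looking like a bug.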
Asset cost and visualization
Context: How do you show the users that something is hitting a budget
metrics
- a temperature gauge that shows users how close they are to the budget limit
- Thermometer - in LittleBigPlanet’s Level editor
- using the Thermometer video
- UGC (user-generated content) editor with:
- memory analyzer
- performance profilers
- automated testing pipeline (that went around the game world and would take screenshots of problematic areas)
- Disney Infinity (game)
- on a CPU spike, a tooltip would pop up and show what exactly happened and how you can fix it
- how to use the editor video
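The thermometer idea is simple arithmetic: sum per-category asset costs against a budget and warn as the gauge approaches full. A minimal sketch with invented numbers and a hypothetical 90% warning threshold:

```python
# Sketch of a LittleBigPlanet-style "thermometer": sum per-category asset
# costs against a budget and report how full the gauge is. Numbers invented.

def thermometer(costs, budget):
    """Return (fraction_used, warning) for a set of asset costs in MB."""
    used = sum(costs.values())
    fraction = used / budget
    warning = fraction >= 0.9  # start warning the user near the limit
    return fraction, warning

fraction, warning = thermometer(
    {"meshes": 48.0, "textures": 36.0, "audio": 10.0}, budget=100.0
)
# gauge is 94% full -> warn the user before they hit the hard limit
```

The per-category breakdown matters for the UI: showing *which* category is filling the gauge tells the user what to cut, not just that something is over.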
Porting and redesigning tools for new APIs or Frameworks?
Context: If you are porting from MFC to Qt
- No automation, just do it
- had success porting a window at a time to Qt
- Qt porting on a per-window basis
- used qtwinmigrate
- not very good for big editors
- 5+ years of development of custom MFC controls
- had no access to original authors of the tool
- race/struggle for getting resource invested into the porting while delivering features
- issues with focus events (MFC and Qt were fighting for focus)
- moving from MFC - still going
- 2 months to ditch WinForms (C#) and port to native wxWidgets
- ran all of the C# code through a $35 code porter - turned .cs files into .cpp and .h files
- had dedicated manpower
- the people doing the porting had the mindshare of how the whole tool worked
- might be a contributing factor to the success
Techniques for designing tool workflows
Context: thinking of creating a 2.0 version of a tool
- talk to users and ask what is working for them
- do a user study
- Tools Tutorial Day: The Science of Customer-Focused Tools Development by Caroline Colon
- a good time to look into which features are not needed in the next version of the tool
Ease of use and what is going on with the data
Context: transitioning from the UI matching 1-to-1 with the runtime data (Object-Oriented) to changes going through UI components running through a data conversion pipeline (Data-Oriented)
Disconnect of how the data is represented in the UI and the data in the runtime
More technical users feel the disconnect
- transitioned from having a tool that has everything exposed to the user (they can play around)
- transitioning to a select few controls to tweak helped the content creators produce content at a higher rate (without the extra cognitive load)
- it is always good to have a translation layer/abstraction layer that helps you make changes to the internals of a tool without breaking things in the frontend
- abstract the complexity from the users
- the content creators don’t really need to know how the memory model works
- For more technical users it is better to give them the ability to see the runtime data
- just exposing the data that you have
- GDC2019 Tools Tutorial Day: The System of Tools: Reducing Frustration in a Daily Workflow by Laura Teeples
- GDC 2018 “A Tale of Three Data Schemas”. Tools Tutorial Day by Ludovic Chabant
- always have the ability to see what the data is translating into
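The translation-layer point above can be made concrete: the UI edits a friendly, stable schema, and an adapter converts it to the runtime layout, so engine internals can change without breaking the frontend. The field names and conversions below are made up for illustration.

```python
# Illustrative translation layer between UI-facing data and runtime data.
# Field names/units are invented; the point is the one-way conversion step.

import math

def ui_to_runtime(ui_data):
    """Convert UI-facing fields to the packed runtime layout."""
    return {
        # runtime stores color as 0-255 ints; UI edits floats 0.0-1.0
        "color_rgba8": tuple(int(c * 255) for c in ui_data["tint"]),
        # runtime wants radians; UI shows degrees
        "yaw_radians": math.radians(ui_data["yaw_degrees"]),
    }

runtime = ui_to_runtime({"tint": (1.0, 0.5, 0.0, 1.0), "yaw_degrees": 90.0})
```

For the more technical users mentioned above, a debug view that shows both sides of this conversion ("what the data is translating into") covers both audiences with one mechanism.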
Providing feedback to the user while they are using the data
Context: complex tool feedback (how to communicate with users that don’t have time to read the docs)
Communicating why something is happening in the tool
- Visual Studio analyzer (while you are typing it gives you feedback)
- see Roslyn analyzers (that will give you a good starting point)
- predicting what a user’s operation is going to do
- “I’m about to make this connection”
- “I’m about to drag this node” - giving the user some preview of what their actions/changes are going to achieve
- Adding control flow preview while the user is in the midst of making an edit to something
- a simple example of getting in the mind of the users
- showing what a user wants to select and highlighting whatever they are hovering over in a viewport
- this was a debug feature that became an important part of a designer workflow
- showing what the cost and ramifications are for that edit
- When you are using a functional approach it is a lot easier to achieve: logging, rewind, undo playback
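The preview-then-commit and undo ideas above fit naturally into a command pattern: an operation can describe what it *would* do before executing, and can rewind afterwards. A toy node-graph sketch (the graph model and names are stand-ins):

```python
# Sketch of a command-style edit with preview and undo. The node-graph
# representation here is a minimal stand-in, not a real editor model.

class ConnectCommand:
    def __init__(self, graph, src, dst):
        self.graph, self.src, self.dst = graph, src, dst

    def preview(self):
        # shown to the user *before* they commit the edit
        return f"About to connect {self.src} -> {self.dst}"

    def execute(self):
        self.graph.setdefault(self.src, []).append(self.dst)

    def undo(self):
        self.graph[self.src].remove(self.dst)

graph = {}
cmd = ConnectCommand(graph, "NoiseNode", "BlendNode")
message = cmd.preview()   # feedback while the edit is still in flight
cmd.execute()
cmd.undo()                # the same object knows how to rewind itself
```

Keeping a history of executed command objects is also what makes the logging and undo playback mentioned above cheap to add later.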
Organizing assets
Context: pitfalls for organizing assets; rules of thumb
assets that are used on different platforms and in packages
- map pipeline
- when someone exports a map, the pipeline separates the metadata from the positions of game objects, and separates the meshes from the textures
- ask the engine team to support loading not only the fully baked data but the loose files as well
- how much you want to have offline vs online data
- build workflow as a graph approach
- take the time to look at the data to understand the scope
- what is the cost of splitting the data
- fewer files on disk might get you better performance
- HandmadeCon 2016 - Asset Systems and Scalability by Chris Butcher
- Tools Tutorial Day: Bungie’s Asset Pipeline: ‘Destiny 2’ and Beyond by Brandon Moro
- have the engine be able to load baked data and loose data
- striking a balance between assets knowing about other assets, and assets requiring other assets to be built and available
- assets knowing about other assets via indexes
- way more complicated when you have to process an asset before you can process the current asset
- make the pipeline parallelized so that you can have multiple things being processed
- deep coupling (multiple files vs one file)
- don’t forget to think about the cognitive load of multiple files vs one file
- multiple files may be great for power users but might cause issues for regular users
- remember to understand how the data maps to VCS
- authoritative data source (especially in a distributed environment)
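The "process an asset before you can process the current asset" and parallel-pipeline points above amount to scheduling from the dependency index. A hedged sketch using Kahn's-algorithm-style waves, where each wave can be processed concurrently (the assets and dependencies are invented):

```python
# Hedged sketch: grouping assets into parallel build waves from their
# dependency index. Assets in the same wave have no unbuilt dependencies
# and can be processed concurrently. Asset names are invented.

def build_waves(deps):
    """deps: asset -> set of assets it requires. Returns list of waves."""
    remaining = {a: set(d) for a, d in deps.items()}
    waves = []
    while remaining:
        # everything whose dependencies are already built is ready now
        ready = sorted(a for a, d in remaining.items() if not d)
        if not ready:
            raise ValueError("dependency cycle detected")
        waves.append(ready)
        for a in ready:
            del remaining[a]
        for d in remaining.values():
            d.difference_update(ready)
    return waves

waves = build_waves({
    "level.map": {"rock.mesh", "rock.tex"},
    "rock.mesh": {"rock.tex"},
    "rock.tex": set(),
})
# waves == [["rock.tex"], ["rock.mesh"], ["level.map"]]
```

The cycle check doubles as the validation step for the "assets requiring other assets" balance discussed above: a cycle in the index means the pipeline can never schedule those assets.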
To continue the conversation about these topics and more, join our