
Epic Games Design Workflow


From concept to completion


This document gives some insight into the internal workflow for level creation at Epic Games.

Starting from the art concept


Once the concept for an environment has been approved by the art director and the lead designer for the project, it is given to the Level Designers (LDs) and the environment artists.

Breaking down the concept

The LDs begin by identifying the simple structures in the maps, such as where buildings will be, and the rough shapes that can be used to prototype the level.

Identifying key structures

The artists work with their leads to identify major set pieces (hero pieces) that will be landmarks in a given area of the game. These are typically set aside until later in the project, as the majority of the mesh work consists of the modular sets, which are used in roughly 90% of the world.

Deriving modular mesh sets

The LDs work with the environment artists to define the set of modular pieces that will be built by the artists. These include meshes such as trim pieces, doors, windows, walls, corners, and variations of these.

The gameplay concept


We draw a clear distinction between the art concept and the gameplay concept, and each plays an important role. The gameplay concept involves a lot of LD-programmer interaction in the early stages. The art concept involves a lot of LD-artist interaction, and usually won't even start until gameplay prototyping is complete.

Designer/Programmer interaction

The LDs work with the programmers to prototype gameplay ideas using a combination of Unreal Kismet and existing behavior in the engine. This is typically done in Proof of Concept levels, which contain only the geometry and assets needed to accurately test the relevant systems. During the course of the project, the LDs will require behavior that does not currently exist, which gets prioritized based on time available and the needs of the game project.

Gameplay prototyping in Kismet

This behavior is either added to Kismet, or is provided through game code. In many cases, the LD will create the needed behavior in Kismet, which the programmer can then evaluate and potentially streamline.

Mesh creation


Our artists use a variety of tools to create the characters and environments.

Tools used for high-poly meshes

Early mesh creation is done in 3D Studio Max, with much of the work then transferred into ZBrush for modeling and detailing.

Low-poly meshes

Low-poly mesh modeling is done almost exclusively in 3D Studio Max. The artists create a medium-resolution mesh that is taken into ZBrush for detailing, and is also refined into the low-poly mesh.

Unwrapping UV Maps

After the low-poly mesh has been completed, the mesh's UV map is created, primarily using 3D Studio Max's UVMap and UnwrapUVW modifiers. 3D Studio Max 8 has a more advanced feature set for this than 3D Studio Max 7.

Creating normal maps from meshes

The artists have several methods of creating normal maps from meshes. An older approach was to use SHTools to process the high-poly and low-poly meshes together. In this method, the meshes are often disassembled to avoid tracing errors between parts that are separate but very close together, such as fingers, shoulderpads, and mouth parts such as the teeth and tongue. Newer versions of 3D Studio Max (8 and beyond) and Maya can also process the meshes, and most artists have switched to using them.

The normal maps are very often combined with a secondary bump map, such as a fabric pattern. SHTools supports this, as do newer versions of 3D Studio Max. Small details such as rivets, bolts, scratches, and dents are also often composited into the normal map from a library of pre-processed normal maps the artists have built up. This speeds up the modeling process, as fewer of the very small details need to be modeled in polygons.
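
As an illustration of that compositing step, here is a minimal sketch (our own code, not Epic's tooling) of folding a detail map into a base normal map by summing the tangent-space X/Y components and renormalizing:

    # Minimal sketch of detail-normal compositing, assuming 8-bit RGB maps
    # where each channel encodes a vector component remapped from [-1, 1].
    import numpy as np

    def decode(rgb):
        return rgb.astype(np.float32) / 127.5 - 1.0    # [0, 255] -> [-1, 1]

    def encode(n):
        return np.clip((n + 1.0) * 127.5, 0, 255).astype(np.uint8)

    def combine_normal_maps(base_rgb, detail_rgb):
        base, detail = decode(base_rgb), decode(detail_rgb)
        out = base.copy()
        out[..., 0:2] += detail[..., 0:2]              # sum the X/Y tilts
        length = np.maximum(np.linalg.norm(out, axis=-1, keepdims=True), 1e-6)
        return encode(out / length)                    # renormalize to unit length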

Material creation


Diffuse and specular creation

Once the UV map has been created and the normal map processed, the texture artists begin working on the texture maps for the object or character, using the layout from the UV map. In many cases, the specular map will be a full-color texture, allowing different portions of the object or character to appear to be made of different materials, such as flesh, copper, or steel.
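
To make the benefit of a full-color specular map concrete, here is a toy Blinn-Phong-style sketch (illustrative only; the constants and names are ours, not UE3's) showing how a per-texel specular color tints the highlight so copper reads warm while steel stays neutral:

    # Per-texel specular tint: the same highlight lobe, colored differently
    # depending on which material the texel represents.
    def specular_term(spec_rgb, n_dot_h, power=32):
        lobe = max(n_dot_h, 0.0) ** power          # scalar highlight intensity
        return tuple(c * lobe for c in spec_rgb)   # tinted per channel

    copper = (0.95, 0.64, 0.54)   # warm, orange-ish highlight
    steel  = (0.60, 0.60, 0.60)   # neutral gray highlight
    print(specular_term(copper, 0.98), specular_term(steel, 0.98))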

Creating the material in UnrealEd

Once the main textures for the object or character are near completion, the textures are imported into the engine and a material is created for the object. Using the DeferCompression flag during import can be an important timesaver, as it delays compression of the texture to DXT1 or DXT5 until the package is saved, allowing faster imports while experimenting with and tweaking the final textures. A new material is created, and the new textures are brought into it as texture samples and hooked up to the appropriate shader nodes (Diffuse, Specular, Normal).

Level Prototyping


BSP rough from Concept

The level designers use the concept art for an environment to determine the basic shapes and layout the level will need for testing the environment and prototyping gameplay. At this stage the level will have very few or no materials applied to the geometry, and very few lights. Almost all of the construction will be BSP primitives, which are replaced later in the construction pipeline by static meshes or terrain. Very few BSP primitives remain in the final level; the ones that do survive are primarily used for floors, ceilings, or simple walls.

Pathing the BSP rough

The initial pathing is very simple, mainly used to ensure that all players and AI characters are able to navigate the environment. After the initial pathing is laid out, playtesting will begin.

Playtesting the BSP rough

At this point, the BSP level is ready for playtesting. This initially starts out with just the LD and the lead LD running through it, and progresses from there to the rest of the design team testing the level at various states of completion. If the level is intended to be a multiplayer map, there are scheduled gameplay sessions in our testing lab, with comments given after the play session.

Gameplay roughing

Gameplay roughing determines whether certain gameplay elements are appropriate for the level and, once scripting is in place, allows difficulty and the placement of encounters and pickups to be adjusted. At this point the LD goes through the level placing gameplay objects such as ammo pickups, further tweaking path placement, and beginning basic scripting for the level.

Scripting


Introduction to Kismet

Unreal Kismet is what Epic uses for almost all in-game behavior in the levels and for cinematics. The LDs are responsible for ensuring all scripts are in place and for reporting issues with script elements. At this point, the LDs are also requesting custom Kismet actions to perform level- or game-specific tasks.

Using Matinee to script events over time

Matinee is used in our games to perform two main tasks: scripting game events over time, and driving cinematics. Scripted events can be as simple as a door that opens, or as complex as a series of explosions over time that change materials on objects and cause multiple other Kismet events to fire. Most of the Matinee use in Gears is for the simpler actions, such as doors.
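
Stripped of the editor UI, the core idea is keyframed interpolation plus time-triggered events. A hedged sketch (our own toy structure, not Matinee's actual data layout):

    # Toy Matinee-like track: interpolate a float over keyframes and fire
    # events whose trigger times fall inside the evaluated interval.
    def evaluate(keys, t):
        """keys: sorted list of (time, value); linear interpolation."""
        if t <= keys[0][0]:
            return keys[0][1]
        for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
            if t0 <= t <= t1:
                alpha = (t - t0) / (t1 - t0)
                return v0 + alpha * (v1 - v0)
        return keys[-1][1]

    door_track = [(0.0, 0.0), (2.0, 1.0)]          # closed -> open over 2 s
    events = [(1.5, "play_creak_sound")]           # fired as time passes 1.5 s

    last_t, t = 1.0, 2.0                           # one tick of playback
    print(evaluate(door_track, t))                 # 1.0 (fully open)
    print([name for (et, name) in events if last_t < et <= t])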

Adding effects to the level

The LDs do most of the initial work of placing ambient particle effects such as smoke and fire, as well as placing effects that will be fired later using Kismet, such as dust that sifts down. Most of these are driven by Kismet so they can be turned off for performance when they're no longer needed.

Meshing the level


Placeholders

Designers use placeholders very often in the prototyping stages of a level. A designer will often use an object of roughly the size and shape of the final mesh, provided by the artists. This object is named the same as the final mesh and lives in the same location in the package. This allows the artists to import the final mesh and material and have them automatically propagate to wherever the placeholder was used.
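
A toy illustration of why this works (hypothetical package path; not actual engine code): placed actors reference assets by package path, so re-importing the final asset under the placeholder's name updates every placement at once.

    # Actors store a reference (the package path), not a copy of the mesh.
    package = {"COG_City_Doors.Door01": "placeholder box"}   # hypothetical path
    placed_actors = ["COG_City_Doors.Door01"] * 12           # 12 placed copies

    package["COG_City_Doors.Door01"] = "final door mesh"     # artist re-imports
    assert all(package[ref] == "final door mesh" for ref in placed_actors)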

Re-using existing meshes

Unreal Engine 3 allows non-uniform scaling of placed meshes in the levels. We use this extensively when assembling a level, using one mesh in multiple ways to reduce the memory overhead for both textures and mesh data. An example would be using a doorframe to construct a window frame on a building, or using a rubble pile as a mountain range in the distance.
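
A toy sketch of the memory argument (the numbers are made up): each placement stores only a transform plus a reference to shared mesh data, so reusing one mesh under different non-uniform scales pays for the mesh data once.

    # One shared mesh, many differently-scaled placements.
    doorframe_kb = 350                       # assumed mesh-data cost
    placements = [(1.0, 1.0, 1.0), (1.4, 0.2, 1.0), (3.0, 3.0, 0.5)]

    shared_cost = doorframe_kb               # mesh data paid once
    per_actor_kb = 0.1                       # transforms are tiny (assumed)
    print(shared_cost + len(placements) * per_actor_kb)   # vs. 3 * 350 KB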

Optimizing use of meshes

One of the easier mesh and memory optimizations is to use the Primitive Stats section of the Generic Browser to identify meshes that are used infrequently and can be replaced with other meshes already in use. An example is replacing a column that has only minor variations from other columns but is used only a few times in an environment. While this does not save much memory or performance on a per-object basis, the savings from that and other removed meshes add up.
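
A hedged sketch of that triage (made-up numbers and thresholds; the real data comes from the Primitive Stats view):

    # Flag meshes that are used only a few times but still cost real memory.
    rows = [("Column_A", 40, 512), ("Column_B", 3, 480)]  # (mesh, uses, KB)

    for mesh, uses, kb in rows:
        if uses < 5 and kb > 256:
            print(f"{mesh}: {uses} uses, {kb} KB - consider a common substitute")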

Lighting the level


Static lights

Static lights are used to light the scene independently of dynamic scene elements such as characters, physics objects, or moving objects, and are used in abundance to light the environments. Holding L and left-clicking on a surface will add one. The shadows generated from these lights are static, and are baked either into lightmaps or into vertex lighting on the mesh. This is the default style for lights.

Dynamic lights

There are two types of lights that affect dynamic objects. One is the dynamic light, which is set by right-clicking on a light and choosing 'affecting only dynamic objects' from the list under "Set what this light affects". These dynamic lights only influence objects with the Dynamic lighting channel set, which by default includes characters, movers (interpolated actors like doors), and physics objects. These lights cast expensive dynamic shadows on the scene by default and should be used sparingly, as every dynamic light that influences an object costs another render pass for that object. Setting the light to 'affecting both dynamic and static objects' will cause the light to affect everything in the world. This is our primary shadowcasting light for our scenes, and is used extremely carefully, as it is the most expensive light type. The only light set like this by default is the Directional light type, which we use to simulate sunlight.
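
A toy illustration of the per-pass cost (the bitmask names are ours, not the engine's): a light contributes a render pass to a primitive only when their channel bits overlap, which is why each additional dynamic light touching a character adds another pass.

    # Channel matching as a bitmask test.
    STATIC, DYNAMIC = 0b01, 0b10

    def lighting_passes(primitive_channels, lights):
        return [name for name, ch in lights if ch & primitive_channels]

    character = DYNAMIC
    lights = [("Sun", STATIC | DYNAMIC), ("BakedFill", STATIC), ("Muzzle", DYNAMIC)]
    print(lighting_passes(character, lights))   # ['Sun', 'Muzzle'] -> two passes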

Skylights

Skylights simulate diffused lighting from a hemisphere over the world, providing a base level of ambient lighting. They are very cheap to use.

Bounce lighting

Shift-L and left-click will add a light whose color is taken from the pixel that was clicked on, with lower intensity and radius, to simulate a bounce light. This is used extensively to produce more realistic lighting results.
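
A sketch of the derivation (the falloff factor and exact scaling here are our assumption, not the editor's documented behavior):

    # Derive a bounce light from the sampled surface color.
    def make_bounce_light(sampled_rgb, src_brightness, src_radius, falloff=0.35):
        return {"color": sampled_rgb,                    # inherit the surface color
                "brightness": src_brightness * falloff,  # dimmer than the source
                "radius": src_radius * falloff}          # tighter than the source

    print(make_bounce_light((0.8, 0.3, 0.2), src_brightness=2.0, src_radius=1024))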

Modulated shadows

By default, all shadowcasting lights evaluate whether they are being occluded to generate accurate shadows in the scene. Setting a light to use modulated shadows replaces this in favor of modulating the scene using a projected texture of the shadow caster's silhouette. This is a much higher performance method of shadowing the world, though it is not as technically accurate as using the depth-buffered shadow approach. This also gives us control over the color and intensity of the shadows.
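
In formula form, a modulated shadow simply multiplies the already-lit scene by a tinted silhouette mask, which is why its color and intensity are free parameters, and also why it can double-darken areas that are already shadowed (the accuracy trade-off mentioned above). A minimal sketch; the names and default tint are ours:

    # mask = 1.0 inside the silhouette, 0.0 outside.
    def apply_modulated_shadow(scene_rgb, mask, shadow_rgb=(0.4, 0.4, 0.5)):
        return tuple(c * ((1.0 - mask) + mask * s)
                     for c, s in zip(scene_rgb, shadow_rgb))

    print(apply_modulated_shadow((1.0, 0.9, 0.8), mask=1.0))  # fully shadowed texel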

Streaming


To Be Done.

Cutting levels into streaming sections

To Be Done.

Using volumes to stream

To Be Done.

Memory statistics

To Be Done.

Performance Optimization


Lighting optimizations

Lights are the single most expensive thing in Unreal Engine 3, so careful use of them is extremely important. There are several tools for optimizing lights, most of which are detailed on UDN; the LevelOptimization page has several topics dedicated to lighting optimizations.

Memory optimizations

Within UnrealEd's Generic Browser is a tab for Primitive Stats. This shows how much memory is being used by every mesh in use in the current level. We watch for meshes that are similar or identical to other meshes and could be replaced by them to save memory.

Here is a list of in-game commands for analyzing the current scene:

  • Stat memory - displays how much memory is being used by each type of asset.
  • Stat d3dscene - displays statistics for Direct3D, showing how many triangles are being rendered, as well as how long in milliseconds various portions of the frame (including lights) take to render.

Appendices


Appendix 1: Importing assets into the engine

The artist responsible for creating the content is, in most cases, also responsible for importing those assets into the engine. All static meshes are brought in using 3D Studio Max's ASCII Scene Export (ASE) format. They are imported into a content package using a naming scheme that allows us to locate them later. Textures are imported in either .TGA or .BMP format. Targa files can hold 24- or 32-bit textures, while only 24-bit (no alpha) textures are supported from .BMP files. We also store all of the source assets in our Perforce repository under an 'artsource' depot.

Appendix 2: Assigning materials to assets

On import of an .ase, the importer checks to see if the material applied to the object matches the name of a material that is currently loaded in the editor. If it finds a match, it will auto-assign that material.

In 3D Studio Max, this is simply set by naming the material.
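
A toy sketch of that matching rule (not the actual importer code; the material name is hypothetical):

    # Name-based auto-assignment on ASE import.
    loaded_materials = {"M_CityDoor": "<material object>"}   # hypothetical name

    def auto_assign(max_material_name):
        # Falls back to None (i.e., the default material) when no name matches.
        return loaded_materials.get(max_material_name)

    print(auto_assign("M_CityDoor"))   # matched -> assigned automatically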

If you wish to re-assign a material for all instances of an object (say, changing the material on a door), double-click on the static mesh in the Generic Browser, open the LODInfo field, open [0], then open the materials list. Changing the material is simple: select the material entry you want to change, find the new material in the Generic Browser, select it, and click the green arrow in the materials list.

Changing the material on a placed mesh will NOT carry over to other copies of that mesh, so this can be used to create variation without loading additional meshes into memory.

Select the placed instance of the mesh you wish to change. For example, say there are two identical doors placed side by side, but one needs a different material. Select one of the doors and hit F4 to open the StaticMeshActor properties window. Open the StaticMeshActor entry, then StaticMeshComponent. Scroll down to Rendering and click on the Materials entry. It is probably empty, so click on the green + icon. This adds an entry that corresponds to the first material applied to that mesh. If you had two materials applied to the door, for example one for the base door and one for the window, and only wanted to change the second one, you would add two entries to the Materials array and leave the first one alone.

Now, to change the material applied to the mesh, use the same procedure as before: find the new material in the Browser, select it, then select the material entry in the Property window and click the green arrow to enter it. One door should now have a different material than the other, even though both use the same mesh.
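
A toy sketch of the override rule those steps rely on (our own representation, not the actual property layout): entries in the per-instance Materials array shadow the mesh's default slots one for one, and an empty entry falls through to the default.

    # Per-instance overrides shadow the mesh's default material slots.
    mesh_defaults = ["M_DoorBase", "M_DoorWindow"]       # hypothetical names
    instance_overrides = [None, "M_DoorWindow_Broken"]   # leave slot 0 alone

    resolved = [o if o is not None else d
                for o, d in zip(instance_overrides, mesh_defaults)]
    print(resolved)   # ['M_DoorBase', 'M_DoorWindow_Broken']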

Appendix 3: Physical assets

An informative page on importing static meshes with simplified collision is here: Collision Reference. Any static mesh that has simplified collision can be added to the world as a Rigid Body (physical object).

Skeletal meshes can have physics setups created for them using PhAT, as described here: PhysicsAssetTool. These can be placed as Physics Assets in the world, and will behave physically when interacted with.

Appendix 4: Physical materials

Physical Materials are how the PhysX system determines the behavior of two objects when they collide. A reference page on what Physical Materials are and how they work is here: PhysicalMaterialSystem.

Appendix 5: Cooking for play on consoles

To Be Done.

Appendix 6: Source Control Integration

We've integrated source control (SCC) into the Generic Browser. This allows us to check content packages (but not levels) in and out from within the editor. An important note: a package cannot be checked out if the local copy is not the most recent revision of that package. In that case, the editor must be closed and the content packages synched to the most recent revision before checkout will work.

Levels are checked in and out outside of the editor.

Epic uses Perforce as its internal source control system.

Appendix 7: Content Management

Again, Epic uses Perforce to manage game content. An artist will usually check out the source package, add whatever new content they have to the package, and save it. All artists have access and sync on pretty much a daily basis, so they always have the latest content. If a package is checked out, they can request that it be checked in.

Our packages are usually capped at about 150 MB; after that, we move on and make a new one for any given environment set.

For example, we'll set up an environment package structure that looks similar to this...

For COG_City:

  • COG_City_Doors
  • COG_City_Doors02
  • COG_City_Doors03
  • COG_City_Floors
  • COG_City_Floors02
  • COG_City_Floors03
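
A hedged sketch of that rollover convention (the size cap and numbering follow the description above; the helper itself is ours):

    # Pick the next numbered package once the current one hits the size cap.
    def next_package_name(existing, base="COG_City_Doors"):
        nums = [int(name[len(base):] or "1")
                for name in existing if name.startswith(base)]
        n = max(nums, default=0) + 1
        return base if n == 1 else f"{base}{n:02d}"

    print(next_package_name(["COG_City_Doors", "COG_City_Doors02"]))
    # -> COG_City_Doors03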

See the Asset Pipeline page for naming conventions and package organization.

Appendix 8: Builds

Currently, every day a build script retrieves the latest versions of content and code, compiles the engine and all game projects, labels the result uniquely, checks all the content and compiled binaries in, then sends an internal email informing the project teams that a new version is available. All of the artists and designers have batch files on their systems that let them sync to the most recent full build. This helps avoid confusion about which build people are working from.
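
A hedged sketch of that nightly loop (the p4 commands are standard Perforce CLI; the depot path, label format, build command, and check-in/mail steps are placeholders, not Epic's actual script):

    # Nightly: sync, build, label, check in, notify.
    import datetime, subprocess

    label = "Build_" + datetime.date.today().strftime("%Y%m%d")

    subprocess.run(["p4", "sync", "//depot/..."], check=True)   # latest content + code
    subprocess.run(["make", "all"], check=True)                 # placeholder compile step
    subprocess.run(["p4", "tag", "-l", label, "//depot/..."], check=True)  # unique label
    # ...check in the new binaries and content, then email the teams that a
    # new build is available to sync to.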