The Interactive Tools Framework in UE5

During the last few major versions of UE4, a stack of libraries was built up in the service of the new and expanding Modeling Editor Mode. This included a low-level mesh/geometry processing plugin (cleverly named GeometryProcessing), the InteractiveToolsFramework, and the MeshModelingToolset plugin. In UE5 these libraries have been significantly expanded, but have also undergone some major reorganization, and some portions have been taken out of Experimental status.

In previous Tutorials, I have covered using these libraries in-Editor to build custom Modeling Tools (https://www.gradientspace.com/tutorials/2020/1/2/libigl-in-unreal-engine), doing command-line Geometry Processing (https://www.gradientspace.com/tutorials/2020/9/21/command-line-geometry-processing-with-unreal-engine), doing Runtime Procedural Mesh Generation (https://www.gradientspace.com/tutorials/2020/10/23/runtime-mesh-generation-in-ue426 and https://www.gradientspace.com/tutorials/2020/11/11/procedural-mesh-blueprints-in-ue426), and most recently, using the Interactive Tools Framework (ITF) to build a small Runtime 3D Modeling App (https://www.gradientspace.com/tutorials/2021/01/19/the-interactive-tools-framework-in-ue426). With the changes in 5.0, all these posts and sample projects need to be updated (not a small effort!).

In this article I will describe the high-level changes to the ITF / GeometryProcessing / MeshModelingToolset stack. This should serve as a rough “porting guide” for projects that used these libraries in UE4. I have also updated the Runtime Tools Framework Demo to work with UE5; the updated code project is available on Github (https://github.com/gradientspace/UE5RuntimeToolsFrameworkDemo), and I will discuss some details later in the post.

GeometryProcessing

Several major structural changes were made to the GeometryProcessing Plugin. The first and foremost is that portions of it were moved into the Engine core, to a module named GeometryCore. This was necessary to allow the core Engine and Editor to use the various Geometry algorithms, as they cannot easily have plugin dependencies. Specifically the contents of the GeometricObjects module of GeometryProcessing were moved, and that module no longer exists. So, to update your Build.cs files, generally you can just replace the “GeometricObjects” references with “GeometryCore”. Over time more GeometryProcessing functionality may migrate to GeometryCore as it becomes needed for core Engine features.

The core FDynamicMesh3 class and various associated types were also moved from the DynamicMesh module (although that module still exists and contains many processing algorithms). The paths to these files have changed, so for example where you could previously #include “DynamicMesh3.h”, you will now have to #include “DynamicMesh/DynamicMesh3.h”. A few other frequently-used utility headers like MeshNormals.h, MeshTangents.h, and MeshTransforms.h were also moved to this new DynamicMesh subfolder of GeometryCore.
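For example, a typical set of include-path updates looks like this (using the headers mentioned above):

// UE4
#include "DynamicMesh3.h"
#include "MeshNormals.h"
// UE5 - these headers now live under the DynamicMesh subfolder of GeometryCore
#include "DynamicMesh/DynamicMesh3.h"
#include "DynamicMesh/MeshNormals.h"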

Another major change is that nearly all code in GeometryCore and GeometryProcessing was moved into a namespace, UE::Geometry. Moving code into this namespace resolved many naming conflicts with Engine types, and reduces the need for highly verbose naming to avoid conflicts in the global namespace. It does, however, mean that most code written against GeometryProcessing will need some updates. Most code in the Engine simply does a using namespace UE::Geometry; in any affected .cpp files; however, this should never be done in a header. Generally the Engine uses explicit full names for UE::Geometry types in headers, eg in class definitions that are not themselves in the UE::Geometry namespace. In some cases you will also find class-scoped using declarations like using FDynamicMesh3 = UE::Geometry::FDynamicMesh3;
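As a rough sketch of these conventions (FMyMeshUtils is a made-up class for illustration, not an Engine type):

// MyMeshUtils.h -- in a header, use explicit full names, or a class-scoped using declaration
#include "DynamicMesh/DynamicMesh3.h"
class FMyMeshUtils
{
public:
	using FDynamicMesh3 = UE::Geometry::FDynamicMesh3;      // class-scoped alias
	static double ComputeTotalArea(const UE::Geometry::FDynamicMesh3& Mesh);
};

// MyMeshUtils.cpp -- a file-scope using declaration is fine here, but never in a header
#include "MyMeshUtils.h"
using namespace UE::Geometry;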

The GeometryProcessing plugin in UE4 was a largely self-contained set of libraries. The core GeometricObjects library even defined its own templated math types for Vectors/etc. This was because the core FVector type in UE4 used 32-bit float precision, and GeometryProcessing primarily uses doubles. In UE5, the rest of the Engine has caught up with GeometryProcessing, and the core FVector is now double-precision, a specialization of the UE::Math::TVector<T> template.

A major complication of this conversion was that the “short names” for the new explicit float and double core types were chosen to be FVector3f and FVector3d, exactly the global-scoped type names that had been used in GeometryProcessing for its templated Vector type. So, to resolve this conflict and simplify usage of GeometryProcessing across the Engine, the GeometryProcessing FVector3<T> template was fully replaced by the new UE::Math::TVector<T>. Similar changes were made for TVector2 and a few other types. This may sound straightforward, but GeometryProcessing had used some vector-math naming and idioms common to external libraries like Eigen, which were in conflict with some of the “Unrealisms” of FVector. So, some former member functions of GeometryProcessing’s FVector2/3/f/d were moved to standalone functions, to avoid duplication in the core Vector type. For example FVector3d.Normalized() no longer exists as a member function, and a free function UE::Geometry::Normalized() must now be used. These changes required extensive (but very rote) refactoring of the entire GeometryProcessing library. As a result, most UE4 GeometryProcessing-based vector-math code is likely to not compile in UE5 without similar modifications.
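For example, code along the following lines needs this kind of update (a minimal sketch; the free vector functions live in GeometryCore, I believe in VectorTypes.h):

FVector3d PointA(0, 0, 0), PointB(1, 2, 3);
FVector3d Direction = PointB - PointA;                      // FVector3d is now the double-precision core type
// UE4 GeometryProcessing:   FVector3d UnitDir = Direction.Normalized();
FVector3d UnitDir = UE::Geometry::Normalized(Direction);    // UE5: free function in UE::Geometry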

Some other type conflicts were resolved by renaming the GeometryProcessing type, rather than by switching to the Engine types. In particular, the UE::Geometry variant of FTransform remains, and was renamed to FTransformSRT3f/d to resolve the name conflict and more clearly indicate its functionality (“SRT” indicates the Scale/Rotate/Translate transform order applied to points). In the future there may be variants that could (for example) support composition of transforms with non-uniform scaling, which is not possible with the core Engine FTransform3f/d. In general, where UE::Geometry has “its own” version of a type, the goal is to provide a variant that is more compatible with standard math libraries and “textbook” equations, which in turn simplifies integration of those libraries by licensees, porting algorithms, etc. A prime example is UE::Geometry::TMatrix3, which uses textbook post-multiplication ordering for matrix/matrix multiplies, vs the Engine TMatrix which uses a somewhat unusual pre-multiplication-of-transpose that can trip up attempts to (eg) implement a formula one might find online or in a research paper.

Finally, GeometryCore and GeometryProcessing were taken out of Experimental status. What this means is that in the future, breaking changes will generally not be made without going through standard UE deprecation processes, ie APIs that need to be modified will be deprecated for an Engine release before being removed.

GeometryFramework

A central character in several of my previous tutorials was the USimpleDynamicMeshComponent class, which provided a renderable Component based on FDynamicMesh3. In UE4 this was primarily used by Modeling Mode, to support fast live previews of mesh editing, and could also be created and used at Runtime. In UE5, this Component has become a fully-functional type, and was renamed to UDynamicMeshComponent. It was also moved from the ModelingComponents module of the MeshModelingToolset plugin to a core Engine module named GeometryFramework, which now also includes an associated Actor type, ADynamicMeshActor, as well as UDynamicMesh, a UObject wrapper for a FDynamicMesh3.

UDynamicMeshComponent was significantly “cleaned up”, and some areas that were previously a bit ad-hoc, like support for Tangents, are now much cleaner. Support for both Simple and Complex collision was added, and a full UBodySetup API is included, as well as various helper functions. A key thing to note about Physics support, though, is that Async physics build is not supported, ie changes to collision geometry require a relatively slow game-thread recomputation. Async physics build has been added in the UE5 Main development stream and will land in 5.1.

FDynamicMesh3 is now serializable, and the UDynamicMesh wrapper can be added as a UProperty of any UObject. UDynamicMeshComponent now uses a UDynamicMesh to store its mesh, rather than directly storing a FDynamicMesh3. This means UDynamicMeshComponent is serializable, ie you can add an instance to any Actor, initialize/edit the UDynamicMesh, and it will save/load as expected, and be included in the cooked game build.

Note, however, that UDynamicMesh is not currently an “asset type”, ie you cannot make one in the Content Browser like a UStaticMesh. Technically nothing prevents you from writing your own code to do that, as an Asset is simply a serialized UObject. However by default the UDynamicMeshComponent will create its own UDynamicMesh instance, and it will be serialized “with the Component”, which means it is stored “in the level”. I will cover this in depth in future Tutorials.

To avoid breaking code, the direct FDynamicMesh3 access functions of UDynamicMeshComponent still exist, such as FDynamicMesh3* GetMesh(). However, it is strongly recommended that the ProcessMesh() and EditMesh() functions be used instead. These give UDynamicMesh/Component some control over when the mesh update actually occurs, which (in future) will allow for safe access from multiple threads, ie mesh updates will not need to all be done on the game thread. These two functions also exist on UDynamicMesh, as well as other “mesh containers” like UPreviewMesh that are used heavily in MeshModelingToolset.
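A minimal sketch of the recommended pattern, assuming MeshComponent is a valid UDynamicMeshComponent pointer and a using namespace UE::Geometry; is in effect in the .cpp:

// EditMesh() lets the Component/UDynamicMesh decide when to apply the edit and update rendering
MeshComponent->EditMesh([](FDynamicMesh3& Mesh)
{
	MeshTransforms::Translate(Mesh, FVector3d(0, 0, 100.0));    // any mesh modification here
});
// ProcessMesh() provides read-only access to the current mesh
MeshComponent->ProcessMesh([](const FDynamicMesh3& Mesh)
{
	UE_LOG(LogTemp, Log, TEXT("Mesh has %d triangles"), Mesh.TriangleCount());
});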

ADynamicMeshActor is a standard Actor type for a UDynamicMeshComponent, similar to AStaticMeshActor and UStaticMeshComponent. It is Blueprintable, and the new Geometry Script plugin can be used to do mesh generation and editing for UDynamicMeshActor/Component in Blueprints. I won’t discuss that further here, but I do have an extensive series of Youtube videos on the topic. Similarly, DynamicMeshActors can now be emitted by the mesh creation Tools in Modeling Mode, and the mesh editing Tools generally all work on DynamicMeshComponents.

InteractiveToolsFramework

The Interactive Tools Framework (ITF) has become more deeply integrated into the UE Editor, while still remaining fully functional for Runtime use. Usage is no longer limited to the Modeling and Paint modes; many UE Editor Modes now use the ITF to some extent. However, some major refactoring has occurred to support this broader usage. In particular some aspects of the ITF were quite specific to Modeling, and an attempt has been made to remove these aspects, or at least make them optional.

UToolTargets

Perhaps the most significant change is that the previous way Modeling Tools interacted with “Mesh Objects” like a StaticMesh or Volume, via FPrimitiveComponentTarget, has been deprecated and replaced. FPrimitiveComponentTarget was a relatively simple wrapper around something that could provide a FMeshDescription, which was used to bootstrap the Modeling Mode; however, it had major problems. In particular, it relied on a global registry, which meant that if an Engine module registered a FComponentTargetFactory, a plugin could not easily override that Factory (even at Runtime). Similarly, since the Engine does not support RTTI, it was quite cumbersome for a plugin to extend the core FPrimitiveComponentTarget API with additional functionality without making Engine changes, and then build Tools that used that functionality.

The replacement is the UToolTarget system, where a base UToolTarget class defines no functionality itself, and UInterfaces are used to add sets of API functions. The UObject system supports run-time checked type querying/casting, which allows the Tool system to then determine if a given UToolTarget supports a particular UInterface. For example the IPrimitiveComponentBackedTarget interface provides functions for accessing the Actor, Component, Transform, etc of a PrimitiveComponent, and the IMeshDescriptionProvider interface provides APIs for accessing a MeshDescription for a given ToolTarget.
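For example, a Tool that has been handed a UToolTarget can query these interfaces with standard UObject casting. The accessor names below are from memory and may differ slightly, so treat this as a sketch:

if (IPrimitiveComponentBackedTarget* ComponentTarget = Cast<IPrimitiveComponentBackedTarget>(ToolTarget))
{
	AActor* TargetActor = ComponentTarget->GetOwnerActor();
	FTransform TargetTransform = ComponentTarget->GetWorldTransform();
}
if (IMeshDescriptionProvider* MeshProvider = Cast<IMeshDescriptionProvider>(ToolTarget))
{
	const FMeshDescription* SourceMesh = MeshProvider->GetMeshDescription();
	// ... initialize the Tool's working mesh from SourceMesh ...
}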

To avoid the global-registry problem, UToolTargetFactory implementations for particular Component/Object types are registered with the UToolTargetManager, which lives in the UInteractiveToolsContext, adjacent to the ToolManager, GizmoManager, and so on. A given UToolTarget implementation like UStaticMeshComponentToolTarget will implement various of the APIs above, and an Editor Mode will register its Factory with the ToolTargetManager on setup (see UModelingToolsEditorMode::Enter() as an example).
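At Runtime the registration looks essentially the same. A hedged sketch, assuming ToolsContext is your active UInteractiveToolsContext (the Factory class names here follow the Engine pattern, eg UStaticMeshComponentToolTargetFactory):

UToolTargetManager* TargetManager = ToolsContext->TargetManager;
TargetManager->AddTargetFactory(NewObject<UStaticMeshComponentToolTargetFactory>(TargetManager));
TargetManager->AddTargetFactory(NewObject<UDynamicMeshComponentToolTargetFactory>(TargetManager));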

To support “capability queries”, ie to ask the ToolTargetManager whether it can build a ToolTarget for a given target Object (Actor, Component, Asset, etc) and set of required ToolTarget APIs, there is a FToolTargetTypeRequirements type. The common usage is that a ToolBuilder will have a static FToolTargetTypeRequirements enumerating the APIs it requires, which will be passed to the ToolTargetManager. An example of such a function is shown below for UBaseMeshProcessingTool/Builder.

const FToolTargetTypeRequirements& UBaseMeshProcessingToolBuilder::GetTargetRequirements() const
{
	static FToolTargetTypeRequirements TypeRequirements({
		UMaterialProvider::StaticClass(),
		UMeshDescriptionCommitter::StaticClass(),
		UMeshDescriptionProvider::StaticClass(),
		UPrimitiveComponentBackedTarget::StaticClass()
		});
	return TypeRequirements;
}

The UToolTarget system is highly flexible, as it does not explicitly define any base interfaces or require specific object types. A ToolTarget Interface can be defined for any UObject type, which then allows Tools to manipulate instances of that UObject - or even specific UObjects - via the published UInterfaces.

UContextObjectStore

In addition to the UToolTargetManager, a very generic mechanism for passing objects down into Tools from higher levels has been added to the InteractiveToolsContext: the UContextObjectStore. This is, to be blunt, basically just a list of UObject pointers that can be searched by class type. The basic usage is to add an object to the Store, eg ToolsContext->ContextObjectStore->AddContextObject(NewObject<SomeUType>()), and then later that UObject instance can be found by querying ToolsContext->ContextObjectStore->FindContext<SomeUType>(). Other Manager types like the ToolManager have helper functions to access the ContextObjectStore.

The purpose of the ContextObjectStore was to replace the proliferation of ToolsContext APIs that ITF implementors were required to provide. For example, the previous strategy to expose some UE Editor functionality would have been to abstract it in an interface like the IToolsContextQueriesAPI. However expanded usage of the ITF in the Editor means that more information needs to be passed from the Editor to Tools, and abstracting all those channels via APIs at the ITF level would be very complex. So, the ContextObjectStore was intended to be used to pass Editor-level API abstractions (or simply Editor-level objects and data structures directly) in a generic way, customizable for specific “Editor”/”Client” situations.

A mechanism like the ContextObjectStore can be easily abused, however. It is nothing more than a shared list of UObject pointers, effectively a global list from the PoV of Tools living inside a given ToolsContext. So, for example, any UObject instance can be added to the store, and the store will keep that object from being garbage collected for as long as it remains there. Similarly, multiple objects of the same type can be added, and only the first will ever be found. Or, by the same token, nothing prevents “someone else” from removing a context object you added.

If you go spelunking, you will find some places in the UE codebase where the ContextObjectStore has been used for purposes other than “Editor-level code providing abstract/generic APIs to Tool-level code”, and is instead used as a convenient way to pass data members around. I strongly encourage you to not treat those usages as a pattern that should be followed. Ask yourself: “would I consider passing this data by temporarily sticking it in a global void-pointer array?” If the answer is a clear no, then using the ContextObjectStore is probably not the right approach.

UModelingObjectsCreationAPI

The IToolsContextAssetAPI in the UE4 ITF is a prime example of the type of API that is better done via the ContextObjectStore. That API was used by Tools to emit new Assets, which required some abstraction to permit Runtime usage of the ITF. However, the ToolsContext was required to provide an IToolsContextAssetAPI implementation even if it did not have the concept of Assets (ie, at Runtime!). And then, because that API existed, many Modeling Tools emitted “Assets” even though they were really just trying to emit “Meshes”, which limited how easily they could be adapted to different use cases.

To resolve this situation, IToolsContextAssetAPI has been removed from the ITF in UE5, and the core ITF has no concept of “emitting Assets”. Instead, a UModelingObjectsCreationAPI type has been defined in the ModelingComponents module of MeshModelingToolset. This type contains a function CreateMeshObject() which Tools can use to create new ‘Mesh Objects’, which could be a StaticMesh Asset, but also could be a Volume, DynamicMeshActor, or any other Mesh Type (eg as we will do in our Runtime demo below). UEditorModelingObjectsCreationAPI is the implementation used in Modeling Mode in the UE Editor.

The ContextObjectStore is used to provide an implementation of UModelingObjectsCreationAPI to the Modeling Tools. Primarily the Tools use a static utility function UE::Modeling::CreateMeshObject(), which finds the UModelingObjectsCreationAPI implementation in the ContextObjectStore, and uses it to create the Mesh Object. An extensive FCreateMeshObjectParams struct is used to provide mesh creation information, ie names, materials, the source MeshDescription or DynamicMesh, and so on. Similarly the function returns a FCreateMeshObjectResult that provides pointers to the new Actor, Component, and Asset, where applicable.
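Roughly, the flow inside a Tool looks like the sketch below. The field and helper names are close to, but possibly not exactly, what is in ModelingObjectsCreationAPI.h, and GetTargetWorld(), PreviewTransform, UseMaterial, and ResultMesh are placeholders:

FCreateMeshObjectParams NewMeshObjectParams;
NewMeshObjectParams.TargetWorld = GetTargetWorld();
NewMeshObjectParams.BaseName = TEXT("GeneratedMesh");
NewMeshObjectParams.Transform = (FTransform)PreviewTransform;
NewMeshObjectParams.Materials.Add(UseMaterial);
NewMeshObjectParams.SetMesh(MoveTemp(ResultMesh));       // source FDynamicMesh3 (a MeshDescription also works)
FCreateMeshObjectResult Result = UE::Modeling::CreateMeshObject(GetToolManager(), MoveTemp(NewMeshObjectParams));
if (Result.IsOK())
{
	AActor* NewActor = Result.NewActor;       // Component and Asset pointers are also provided where applicable
}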

A similar set of functions and types is available for Texture objects, and more are likely to be added in the future. Note, however, that support for this API is completely optional - a Runtime ITF implementation would only need to provide a UModelingObjectsCreationAPI implementation if it was to use MeshModelingToolset Tools that emit new Mesh Objects.

UCombinedTransformGizmo

The UTransformGizmo developed for Modeling Mode in the UE Editor is designed to also work at Runtime; however, this means its behavior in the UE Editor is not ideal (particularly for rendering), and it has a significantly different UX. To make way for a future Gizmo implementation, UTransformGizmo was renamed to UCombinedTransformGizmo. In addition, the concept of “default Gizmos” is in the process of being removed from the GizmoManager, and so usage of UCombinedTransformGizmo should now be done via a set of utility functions in /BaseGizmos/TransformGizmoUtil.h.

To create new 3D Gizmos from Modeling Tools and Runtime ITF code, a helper object can be automatically registered in the ContextObjectStore using the function UE::TransformGizmoUtil::RegisterTransformGizmoContextObject(), and similarly unregistered using DeregisterTransformGizmoContextObject(). Once registered, the utility functions UE::TransformGizmoUtil::Create3AxisTransformGizmo() and ::CreateCustomTransformGizmo() are available and should replace previous calls to UInteractiveGizmoManager::CreateTransformGizmo(). However Gizmos can be discarded the same way as before, using the various UInteractiveGizmoManager::DestroyGizmo() variants.
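A hedged sketch of the new flow, assuming ToolsContext is your UInteractiveToolsContext and TargetComponent is a USceneComponent you want to transform:

// one-time setup, eg when initializing the ToolsContext
UE::TransformGizmoUtil::RegisterTransformGizmoContextObject(ToolsContext);
// creating a standard translate/rotate/scale gizmo and attaching it to a target
UCombinedTransformGizmo* Gizmo = UE::TransformGizmoUtil::Create3AxisTransformGizmo(ToolsContext->GizmoManager);
UTransformProxy* TransformProxy = NewObject<UTransformProxy>(ToolsContext->GizmoManager);
TransformProxy->AddComponent(TargetComponent);
Gizmo->SetActiveTarget(TransformProxy);
// teardown works the same as before
ToolsContext->GizmoManager->DestroyGizmo(Gizmo);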

Finally, if you previously tried to use UTransformGizmo in your own projects, you will likely have run across the need to call GizmoRenderingUtil::SetGlobalFocusedSceneViewTrackingEnabled(). This was, to be blunt, a very gross hack needed to work around limitations of communicating between the game thread and render thread, because the sub-Components used in the Gizmo figure out some aspects of their rendering on the render thread. This caused no end of problems in the Editor, and so it was removed and replaced with a more structured system based on an object called the UGizmoViewContext. This object is created and added to the ContextStore (again…) by the TransformGizmoUtil registration function above. It is then necessary to update this GizmoViewContext with the active FSceneView every frame. This is generally straightforward and you can see how it is used in the sample project below, in the function URuntimeToolsFrameworkSubsystem::Tick(). But just note that UCombinedTransformGizmo instances will not function without this SceneView being set correctly.
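In code, the per-frame update is quite small. A sketch, assuming you have computed a valid FSceneView pointer for the active viewport (the sample project shows how to do this at Runtime):

UGizmoViewContext* GizmoViewContext = ToolsContext->ContextObjectStore->FindContext<UGizmoViewContext>();
if (GizmoViewContext != nullptr && SceneView != nullptr)
{
	GizmoViewContext->ResetFromSceneView(*SceneView);
}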

UInteractiveToolsContext Customization

The core ITF classes - UInputRouter, UInteractiveToolManager, UInteractiveGizmoManager, UToolTargetManager, and UContextObjectStore - are fully functional on their own; however, for many users of the ITF it may be desirable to customize or extend the behavior of the base classes. This was previously somewhat difficult unless you also subclassed UInteractiveToolsContext and replaced the Initialize() function, which from a maintenance perspective is not ideal, as the base implementation may be extended in the future (as it was in UE5 to add the ToolTargetManager and ContextObjectStore). To simplify customization of these base Managers, the default UInteractiveToolsContext now allows you to provide custom functions to create and destroy each of the sub-objects.

This capability is used in the Editor, to support a “hierarchy” of InteractiveToolsContexts. This is useful information if you are building in-Editor Tooling using the ITF - in addition to each Editor Mode (UEdMode) having a local InteractiveToolsContext (ITC), there is also a ToolsContext that lives at the “ModeManager” level. This ITC is considered a “parent” ITC, and (for example) the InputRouter is shared between the Parent and any active Mode ITCs, to enforce “one capture at a time” behavior. You may find having a hierarchy of ToolsContexts helpful if you are building a complex at-Runtime DCC Tool. If you want to browse the Editor code for this, see the base class UEditorInteractiveToolsContext and subclasses UModeManagerInteractiveToolsContext and UEdModeInteractiveToolsContext.

UAssetEditors now also support UEdModes, and hence have a UInteractiveToolsContext in each Mode by default. This is not heavily used in existing Asset Editors, however the new UV Editor (a companion to Modeling Mode) is an example of an Asset Editor built primarily using the ITF. A variant of Modeling Mode has also been integrated into the Static Mesh Editor, although this is not enabled by default, and only exposes a few Tools there. The plugin is called Static Mesh Editor Modeling Mode, and can be found in the source tree in \Plugins\Experimental\StaticMeshEditorModeling\. This plugin is fully self-contained, ie no Engine changes are needed for a plugin to “add itself” to the StaticMeshEditor.

MeshModelingToolset / Exp

The MeshModelingToolset plugin has been moved out of Experimental status; however, the portions of the plugin and its modules that needed to remain Experimental were moved to a new “MeshModelingToolsetExp” plugin. The only real effect of this in terms of porting projects is that your .uplugin and Build.cs files may need to be updated to add the “MeshModelingToolsExp” and/or “MeshModelingToolsEditorOnlyExp” modules.

Runtime Tools Framework Sample Project

A port of the UE4 Sample Project to UE5 is available on github here: https://github.com/gradientspace/UE5RuntimeToolsFrameworkDemo. I made this project by forking the UE4 project, and then submitted the port in a single commit. So, if you are interested in what specifically had to be updated, or you need to perform a similar upgrade on your own project based on my sample, you can browse the diff of that commit on Github. I will give a high-level overview of the changes below.

RuntimeGeometryUtils Plugin

This plugin was used in several of my other sample projects; it’s not really critical to this sample, but it provides the OBJ import and the ADynamicSDMCActor. Most of the changes are simply updating paths, dealing with the new UE::Geometry namespace, and some minor API and function changes. The URuntimeDynamicMeshComponent class was removed as it is no longer needed - it just added collision support to USimpleDynamicMeshComponent, and the replacement UDynamicMeshComponent in UE5 now includes full simple and complex collision support. To minimize the changes involved in the port I left UGeneratedMesh intact; however, the new Engine UDynamicMesh class is a superior replacement, and I hope to make that change in the future (and similarly replace ADynamicSDMCActor with the Engine’s ADynamicMeshActor).

RuntimeToolsSystem Module

The RuntimeToolsSystem Module is the main code of the sample, and implements the “Tools Framework Back-End” for Runtime. This is where most of the changes have taken place.

As discussed above, the FPrimitiveComponentTarget system was removed in favor of the new UToolTarget system. So, the old FPrimitiveComponentTarget-based implementation has been deleted, and a new URuntimeDynamicMeshComponentToolTarget and URuntimeDynamicMeshComponentToolTargetFactory have been added in its place. This new ToolTarget factory is registered in URuntimeToolsFrameworkSubsystem::InitializeToolsContext().

To support creation of new meshes by the various Tools, a new URuntimeModelingObjectsCreationAPI implementation of UModelingObjectsCreationAPI was added. The ::CreateMeshObject() function spawns a new URuntimeMeshSceneObject (which is a wrapper for a mesh Actor/Component) via the URuntimeMeshSceneSubsystem. An instance of this API implementation is similarly registered in URuntimeToolsFrameworkSubsystem::InitializeToolsContext(). As it was no longer needed, the FRuntimeToolsContextAssetImpl implementation of IToolsContextAssetAPI was also removed (and that API no longer exists).

The above changes allow some of the Modeling Tool subclasses in the /Tools/ subfolder to be simplified. In particular, several Tools had to override base-class functions to handle creating new objects, because the previous AssetAPI-based versions could not be hacked to function correctly. With the new UModelingObjectsCreationAPI, the Tool-creates-new-Mesh-object flow is much cleaner and no longer requires any customization at the Tool level.

Finally, several small changes were needed to update support for the 3D Transform (TRS) Gizmo. First, the function UE::TransformGizmoUtil::RegisterTransformGizmoContextObject() must be called to register an instance of UCombinedTransformGizmoContextObject in the ContextObjectStore; this is done in URuntimeToolsFrameworkSubsystem::InitializeToolsContext(). This object registers the various sub-gizmos with the GizmoManager, and provides a wrapper that can spawn instances of the new UCombinedTransformGizmo. In the future this will not be directly possible via the GizmoManager (it still is in 5.0, for legacy reasons, but this is due to change). So, the next step is to update USceneObjectTransformInteraction to call UE::TransformGizmoUtil::CreateCustomTransformGizmo() instead of talking to the GizmoManager directly.

Lastly, the AToolsContextActor previously had calls to GizmoRenderingUtil::SetGlobalFocusedSceneViewTrackingEnabled(). This was, frankly, a gross hack that set global pointers which allowed the Gizmo to communicate information based on the FSceneView between the Game and Render threads. In 5.0 this is no longer necessary. Instead, in URuntimeToolsFrameworkSubsystem::Tick(), an instance of UGizmoViewContext is fetched from the ContextObjectStore, and passed the current FSceneView. This is all that is necessary to provide the Gizmo with correct camera information. The UGizmoViewContext is automatically created and configured by the TransformGizmoUtil registration function that was called above.

And that’s it! Those are the major changes that were necessary.

If you run into problems, or have questions, please don’t hesitate to find me on twitter ( https://twitter.com/rms80 ) or on the Epic Dev Community ( https://forums.unrealengine.com/u/rmseg ).