Writing an OpenUSD Plugin
The Plug system is powerful but can be difficult to get started with. Let's look at an example and call out common pitfalls.
OpenUSD has a plugin feature that allows you to add new prim types, file types, asset resolvers, shader implementations, and honestly any kind of code you'd like to load at run time. All of USD aside from some of the very lowest level code is implemented using this plugin system. The Usd, Sdf, and Tf libraries are all plugins, just to give some examples.
Plugins are managed by the Plug library. Before we get started on an example, let's take a look at this library from a high level. Plug has two major responsibilities: plugin discovery and dynamic plugin loading. It discovers available plugins using plugInfo.json files, and when requested it loads the dynamic libraries those plugin files point to.
During its initialization Plug will check a few locations for plugInfo.json files, including paths from an environment variable named PXR_PLUGINPATH_NAME. These plugin paths form a PATH-like string, so on Mac or Linux you can provide a :-separated list of paths, and on Windows you can use ; as the separator.
Plugins can be written in Python or C++, but some features require a C++ plugin. For an example of a Python plugin see here. I'm going to make a C++ example, which is a bit more involved, but often necessary for things like asset resolvers.
plugInfo.json
The plugInfo.json file is used to find the Python code or dynamic library that needs to be loaded, and also to provide metadata about what is in the plugin. Here are some examples of the kinds of things that can go into the metadata:
- New prim types that can be loaded at runtime as needed (usually generated by usdGenSchema) link
- Codeless prim types, defined without a dynamic library to load link
- New asset resolver types link
- New file types (Alembic, in this case) link
- New entries in the SdfMetadata stored in USD layers link
- New menu items in usdview (the python example from above) link
For building plugins in general, see this excellent repo from Weta Digital.
Most plugInfos have a general layout something like this.

```json
{
    "Plugins": [
        {
            "Name": "mylibrary",
            "LibraryPath": "./myLibrary.so",
            "ResourcePath": "resources",
            "Root": "..",
            "Type": "library",
            "Info": {
                "Object Describing the Contents": {
                    "contents"
                }
            }
        }
    ]
}
```
plugInfo.json example
This would be for a C++ library with a folder structure like

```
./resources/plugInfo.json
./myLibrary.so
```

A plugInfo file can also include other plugInfo files with an "Includes" entry. If I had a directory containing a bunch of plugins structured like the above, each in their own subdirectory, I could make a main plugInfo.json file like this
```json
{
    "Includes": [
        "*/resources/"
    ]
}
```

Now if Plug discovers that plugInfo.json it will recurse down one directory and check each "resources" directory it finds for more plugInfo.json files.
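To make the discovery rules concrete, here is a rough sketch in plain Python of what a scanner has to do: strip the # comments these files may contain, parse the JSON, and chase "Includes" globs. This is an illustration only, not USD's implementation, and the comment stripping is naive (it assumes # never appears inside a string value):

```python
import glob
import json
import os

def load_pluginfo(path):
    """Parse a plugInfo.json, tolerating python-style # comments.
    Naive: assumes '#' never appears inside a JSON string."""
    with open(path) as f:
        text = "\n".join(line.split("#", 1)[0] for line in f)
    return json.loads(text)

def find_plugins(pluginfo_path):
    """Yield every "Plugins" entry reachable from one plugInfo.json,
    following "Includes" globs relative to the file's directory."""
    info = load_pluginfo(pluginfo_path)
    yield from info.get("Plugins", [])
    base = os.path.dirname(pluginfo_path)
    for entry in (m for pattern in info.get("Includes", [])
                  for m in glob.glob(os.path.join(base, pattern))):
        included = os.path.join(entry, "plugInfo.json")
        if os.path.isfile(included):
            yield from find_plugins(included)
```

Pointed at the "Includes" example above, this descends into each subdirectory's resources folder and collects its plugin entries.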
plugInfo.json files can contain comments using the # character, like in Python. This is not standard JSON, so you can't use normal JSON libraries to parse these files unless you strip the comments first.

Discovery
The most common way to get USD to load your plugins is the PXR_PLUGINPATH_NAME environment variable. This can be a single path or a list of paths to search at startup, separated by : or ; depending on your platform. Each path is checked for a plugInfo.json file in that folder; the search is not recursive. Changing this env var after startup has no effect; it is only read once, early in initialization.
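Because the separator differs by platform, anything that assembles this variable from a script should use os.pathsep instead of hard-coding : or ;. A small sketch (the plugin paths here are made up for illustration):

```python
import os

def build_plugin_path(paths):
    """Join plugin search paths with the platform's PATH separator:
    ':' on Mac/Linux, ';' on Windows."""
    return os.pathsep.join(paths)

# Hypothetical plugin locations, for illustration only.
search_paths = ["/opt/plugins/lib/resources", "/opt/otherPlugins/resources"]
os.environ["PXR_PLUGINPATH_NAME"] = build_plugin_path(search_paths)
```

Remember this only helps if it happens before the USD process starts (for example in a wrapper script that launches it); as noted above, the variable is read once during initialization.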
You can also programmatically register your plugins at runtime. If you call the function Plug.Registry().RegisterPlugins("path/to/check") the Plug library will load any plugInfos it finds at that path, including through "Includes". This can happen after startup, and clients can listen to the notice PlugNotice::DidRegisterPlugins if they need to update when new plugins become available at runtime.
When Plug finds your plugin it is not loaded immediately; loading is deferred until it's needed. Usually this is determined by consulting the information provided in the "Info" key. If USD is loading a usd file that has a prim of type DoritosCoolRanchTriangle, the Plug library will be asked whether any plugin implements that type. If one does, that plugin is loaded. You can check out what plugins are discovered and loaded with a python one liner like:
```python
print('\n'.join([str((x.name, x.isLoaded)) for x in Plug.Registry().GetAllPlugins()]))
```

Or you could do it with a less gross one liner ;) it's up to you really.
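The deferred loading described above can be pictured with a toy model (purely illustrative Python, not how Plug is actually implemented):

```python
class LazyPlugin:
    """Toy model of Plug's deferred loading: metadata is available at
    discovery time, the library is only 'loaded' on first demand."""
    def __init__(self, name, provided_types):
        self.name = name
        self.provided_types = provided_types  # stand-in for "Info" metadata
        self.is_loaded = False

    def load(self):
        # The real Plug library would dlopen the dynamic library here.
        self.is_loaded = True

class ToyRegistry:
    def __init__(self, plugins):
        self.plugins = plugins  # discovered at startup, none loaded yet

    def plugin_for_type(self, type_name):
        """Find the plugin implementing a type, loading it on demand."""
        for plugin in self.plugins:
            if type_name in plugin.provided_types:
                plugin.load()
                return plugin
        return None

registry = ToyRegistry([LazyPlugin("snacks", ["DoritosCoolRanchTriangle"])])
plugin = registry.plugin_for_type("DoritosCoolRanchTriangle")
print(plugin.name, plugin.is_loaded)  # snacks True
```

This is the behavior the interactive session later in the article demonstrates with plugin.isLoaded flipping from False to True.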
Building a C++ Plugin
The biggest challenge with C++ plugins is that the plugin interface is C++, not C. Because of that we don't have the kind of clean, ABI-compatible interface you can get by using C with integral types.
Consider the case of the Python C API. When building an extension that talks to Python through this interface you don't have to worry about what compiler was used for that Python build, or what dependencies Python links to; you can build your object code with whatever C compiler you like, and Python can communicate with it over that simple C interface.
Because OpenUSD has C++ types in the interface we don't have as much freedom. We need to make sure we build the plugin with a compiler that produces object code compatible with the code that was generated when our USD libs were built. In practice this usually means using the same version of the same compiler. We also need to link to the same dependencies that were used to build this copy of USD, the same onetbb library, the same boost libraries, the same python version, the same opensubdiv, etc.
So, with that out of the way, I'm going to make a minimal example. For this example we'll build an asset resolver that derives from the ArDefaultResolver but prints out paths as it resolves them. This is a useless stub, but should demonstrate the moving parts to get a C++ library that we can load as a plugin. The source code is here.
I'm going to build it in a docker container based on some USD builds I use a lot in my projects. I think this will demonstrate how to get a plugin working against a USD version you built yourself, and the Dockerfile will clearly document the dependencies.
If you want to build a plugin that works with a USD version used by your 3D editor, or for the usd-core version from PyPI, that's a whole extra level of complexity. Hopefully armed with this knowledge you'll know where to start. Some 3D editors like Houdini provide instructions on how to get this working.
In my rstelzleni/usd-alpine docker container I maintain a version of USD built against a minimal Alpine Linux OS. This keeps container sizes small, at the cost of not being compatible with the VFX Reference Platform. The latest USD version I have at the time of writing is 26.03, so we'll use that as the base version. The container has all the build tools and intermediate build files stripped out; only the installed USD package and its dependencies are in there, at the path /opt/USD.
The first thing we'll need is a Dockerfile that adds in the build dependencies. I built this USD lib with clang, so we'll get that compiler along with some other stuff we need. The versions of these dependencies are not pinned here, but we are pinned to alpine v3.23.3 and that should mean we get a compatible version of clang. Same for the other dependencies.
```dockerfile
FROM rstelzleni/usd-alpine:usd-26.03

RUN apk update && apk upgrade && apk add --no-cache \
    alpine-sdk \
    boost-dev \
    ccache \
    clang \
    clang-dev \
    cmake \
    linux-headers \
    onetbb-dev \
    python3-dev \
    opensubdiv-dev

COPY . /src/demoResolver
RUN mkdir /src/build
WORKDIR /src/build

RUN cmake \
    -DCMAKE_INSTALL_PREFIX=/opt/plugins \
    -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_PREFIX_PATH=/opt/USD \
    /src/demoResolver

RUN cmake --build . --config Release --target install -j8
```
Dockerfile
There are a few interesting things to call out here.
- Notice that we're adding dev packages for boost, onetbb, opensubdiv and python even though we won't be using them directly. CMake's `find_package` won't let us discover and use the pxr package without these available. If your USD library was built with more or other dependencies you may need those as well.
- When running cmake to configure the build we need to add `/opt/USD` to the prefix path so that cmake can discover the `/opt/USD/cmake/pxrTargets.cmake` file that sets up the dependency on the USD libs. This cmake directory should be available in standard USD installs.
Now let's look at the CMakeLists.txt
```cmake
cmake_minimum_required(VERSION 3.26)
project(demoResolver)

find_package(pxr CONFIG REQUIRED)

add_library(demoResolver SHARED
    resolver.cpp
)

target_link_libraries(demoResolver PRIVATE
    ar
)

# Install the finished product
configure_file(plugInfo.json
    ${CMAKE_CURRENT_BINARY_DIR}/demoResolver/resources/plugInfo.json
    COPYONLY
)
install(TARGETS demoResolver DESTINATION lib)
install(FILES plugInfo.json DESTINATION lib/resources)
```
CMakeLists.txt
Things to call out:

- The `find_package(pxr ...)` call gets the `pxrTargets.cmake` file, which sets up the include paths, libraries to link, etc. We don't need to do anything but link to the `ar` library for the `ArResolver` class and the rest is done for us.
- We do need to explicitly install the plugInfo.json file into an expected place. I put it into a `resources` directory, just because that's the common practice. I know that a foolish consistency is the hobgoblin of little minds, but I like my little hobgoblin. He's so cute.
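For concreteness, with CMAKE_INSTALL_PREFIX set to /opt/plugins as in the Dockerfile above, those install rules should produce a layout like this on Linux:

```
/opt/plugins/lib/libdemoResolver.so
/opt/plugins/lib/resources/plugInfo.json
```

The plugInfo's "Root" of ".." then points from the resources directory back up to lib, where "LibraryPath" of "./libdemoResolver.so" finds the library.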
You can check out the resolver.h and cpp files in the repo. The plugInfo looks like this.
```json
{
    "Plugins": [
        {
            "Info": {
                "Types": {
                    "ConfusingAcronymResolver": {
                        "bases": [
                            "ArResolver"
                        ]
                    }
                }
            },
            "LibraryPath": "./libdemoResolver.so",
            "Name": "Confusing Acronym Resolver",
            "ResourcePath": "resources",
            "Root": "..",
            "Type": "library"
        }
    ]
}
```

One interesting thing here is that I put ArResolver into the "bases" list but I didn't mention ArDefaultResolver, which is the class I actually derived our resolver from. You don't have to list every base class here. With what we've set we can discover our type and our plugin with code like the below, even if our plugin is not loaded.
```python
resolverType = Tf.Type.FindByName("ArResolver")
allResolverTypes = Plug.Registry().GetAllDerivedTypes(resolverType)
```

I didn't add ArDefaultResolver to that list because it is not an interface class, and I think it's unlikely that any code would ever need to search for child classes of the default resolver.
With all this we can run a docker build, which will compile and install the plugin in the container image.
```shell
docker build -t demoResolver:latest .
```

Once installed we need a few more steps to actually use our asset resolver.
Building and Testing
First off let's add a few things to this Dockerfile. At this point we could copy out the plugin artifact and use it anywhere we want to use the USD version it is compatible with. Or, we could make this a multi-stage Dockerfile and make a new version that removes all the build tools we installed (they take up a lot of space). For ease of demonstration though, let's stick inside this one Dockerfile.
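As an aside, a multi-stage variant could look roughly like this (an untested sketch, assuming the same base image and build steps as above):

```dockerfile
# Stage 1: build the plugin with all the dev packages installed
FROM rstelzleni/usd-alpine:usd-26.03 AS build
RUN apk update && apk add --no-cache alpine-sdk boost-dev clang clang-dev \
    cmake linux-headers onetbb-dev python3-dev opensubdiv-dev
COPY . /src/demoResolver
WORKDIR /src/build
RUN cmake -DCMAKE_INSTALL_PREFIX=/opt/plugins \
          -DCMAKE_BUILD_TYPE=Release \
          -DCMAKE_PREFIX_PATH=/opt/USD \
          /src/demoResolver \
 && cmake --build . --config Release --target install -j8

# Stage 2: start from the clean base and copy in only the installed plugin
FROM rstelzleni/usd-alpine:usd-26.03
COPY --from=build /opt/plugins /opt/plugins
ENV PXR_PLUGINPATH_NAME=/opt/plugins/lib/resources
```

The final image keeps the plugin but none of the compilers or dev packages.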
To the end of that file we'll add this.
```dockerfile
# ... the previous listing is above here ...

ENV PXR_PLUGINPATH_NAME /opt/plugins/lib/resources
ENTRYPOINT ["python"]
```
Dockerfile, continued
Now when we build and run the Dockerfile we'll get dropped into a python repl where we can experiment, and the plugin we've built should be discovered and available in USD. When we run we'll use the -v flag to map the variants example from the same git repo into the container, so we have a stage with references we can open.
```shell
> docker build -t demoResolver:latest .
-- snip lots of output --
> docker run --rm -it -v ../variants:/data/variants demoResolver:latest
>>> from pxr import Plug
>>> plugin = Plug.Registry().GetPluginWithName("Confusing Acronym Resolver")
>>> print(f"{plugin.name} - {plugin.isLoaded}\n")
Confusing Acronym Resolver - False

>>> from pxr import Ar, Usd
>>> print(f"{plugin.name} - {plugin.isLoaded}\n")
Confusing Acronym Resolver - False

>>> Ar.GetResolver()
<pxr.Ar.Resolver object at 0xffff98dfa5e0>
>>> print(f"{plugin.name} - {plugin.isLoaded}\n")
Confusing Acronym Resolver - True

>>> stage = Usd.Stage.Open('/data/variants/showcase.usda')
_Resolve(@/data/variants/showcase.usda@) -> @/data/variants/showcase.usda@
_Resolve(@/data/variants/multiple-variants.usda@) -> @/data/variants/multiple-variants.usda@
_Resolve(@/data/variants/multiple-variants.usda@) -> @/data/variants/multiple-variants.usda@
-- snip additional resolve calls --
```

- Notice that we didn't tell USD to use our asset resolver. The Ar library will check for plugins that provide an ArResolver derived class and will use it if it finds one. If not, it falls back to the ArDefaultResolver.
- Our plugin got loaded on the first call to Ar.GetResolver(), which was the first time Ar went to look for our plugin.
- We set the PXR_PLUGINPATH_NAME env var to point to our single plugin directly. If we had multiple plugins we could either provide them as a list in the env var, or we could make a main plugInfo.json file that "Includes" all our plugins, and point the env var to that main plugInfo. A common practice is to have a plugins directory with a plugInfo.json in it containing "Includes": [ "*/resources" ] so you can get plugins from subdirectories.
- When opening the stage the actual _Resolve output will be garbled and interleaved because stage opens are multithreaded and I didn't protect the print statements with a mutex. I cleaned up the output above for readability.
The asset resolver output isn't all that interesting; we're using the default asset resolver on the filesystem, so it just finds everything using the natural file paths that were passed in.
We've got a working example! This is great, but it's about as simple as it can be. If you need to provide python bindings for C++ types that are compatible with the boost::python bindings used in pxr, things will be more complex. The Weta examples I linked earlier support this and can provide a good starting point. If there's interest I could write a followup article covering that too.
This is a Linux example but let's not forget our friends in the wonderful world of Windows. If you're building for Windows you'll need to make sure your public symbols in C++ are exported using something like __declspec(dllexport), update the plugInfo to reference a .dll instead of a .so, and of course, make sure you build using the same version of MSVC that was used for the USD library you're linking to.
That's All Folks!
This gives an example of a C++ asset resolver plugin, but hopefully the information about building with C++, writing plugInfos, discovering plugins, etc. will be helpful in getting other kinds of plugins up and running as well. I think just knowing what functions are available on Plug and having some basic idea about the plugin discovery system is helpful no matter what you're trying to set up.
If you run into trouble with a new plugin, here are some ideas:
1) Does your plugin appear in Plug.Registry().GetAllPlugins()? If not, you probably need to update PXR_PLUGINPATH_NAME and restart, or move your library to a different location.

2) Can you load your library directly? If you get the plugin from the registry and load it, does it succeed, or are there errors?

```python
Plug.Registry().GetPluginWithName("MyPlugin").Load()
```

3) If both of the above are working, it's likely that nothing is deciding to load your plugin at runtime. Check the "Info" metadata in your plugInfo; it should probably include some information about what base classes you derive from, or it might need settings to advertise the features it offers.