
Figure: City from CityEngine (is.gd/cityengine_2022)
Figure: The Inspector assigns the IDs.
ArcGIS CityEngine from Esri is, to put it simply, a procedural city planning system. One of its strengths is the integration of many building blocks, such as the import of architectural and geographic data (including OpenStreetMap), the easy application of scripts of any complexity, and an extremely user-friendly way of working – anyone can build with it as they wish. A trial version is available here: is.gd/cityengine_trial. CityEngine was also one of the first tools with a connector to the Omniverse – which is exactly what we asked about. If you want to get started right away, a 30-day Omniverse trial version is available here: is.gd/omniverse_test. And if you want everything installed straight away, you can get in touch with our friends at DVE Advanced Systems (is.gd/dveas_omniverse) and PNY (is.gd/pny_omniverse) – of course with enterprise support and all kinds of extras, including support professionals who understand the whole stack in depth.


Simon Haegler has been working as a software developer on Esri’s CityEngine team in Zurich since 2011. After an MSc in Electrical Engineering and Information Technology at ETH Zurich, he was a founding member of the CityEngine start-up Procedural Inc. Besides CityEngine and its APIs and plugins, his focus is on combining GIS with procedural technologies in AEC (Architecture, Engineering and Construction) and in the entertainment industry (digital set building and pipeline tools). Email: shaegler (at) esri.com
DP: Hello Simon! One would think that a system like ArcGIS CityEngine, which talks to everything else in the pipeline, would want to move into Omniverse/USD – are you already doing anything there?
Simon Haegler: USD is a quantum leap for CityEngine in the VFX space, and other industries like AEC (Architecture, Engineering and Construction) are now jumping on the bandwagon. When we integrated USD into CityEngine 2020.0, our customers were able for the first time to export complex city models with out-of-the-box PBR materials to Houdini & Co. with virtually no compromises. Previously, with FBX for example, this was always fiddly because of the sheer number of objects and materials.
With Omniverse, Nvidia is now going one step further and making the handling of USD scenes considerably easier for the user. Omniverse provides direct access to a high-quality and very fast path tracer (RTX) that supports all USD features. But even simple tools such as the Outliner are very practical, as USD-specific concepts such as composition, layering and referencing can be edited there.


In addition, there are hundreds of functions for simulations, animations and so on. It was therefore obvious for us to offer the CityEngine “Connector for Omniverse”. This is particularly interesting in combination with other connectors, e.g. with Autodesk Revit.
One use case from AEC is to combine a building model from Revit with a city model from CityEngine in Omniverse, whereby the respective scenes are kept “live” in both programmes, making iterations easier. Currently, our connector is a separate, freely available plugin for CityEngine. The CityEngine Connector can be installed in the Omniverse Launcher under Exchange -> Connectors, see here: is.gd/omniverse_launcher. This allows us to react quickly to changes in Omniverse, as it is still a relatively new product.
DP: If you look at the CityEngine data sets – entire cities – doesn’t the USD workflow become quite slow at some point?
Simon Haegler: USD is designed for performance right from the start, and even large city models can be easily exchanged in “downstream” DCCs. That’s why we initially designed the CityEngine “Connector for Omniverse” as an extended USD exporter. Essentially, you specify the Omniverse “Nucleus” server and the connector sends the USD data there. The rest works as with the normal USD exporter, with one exception: if a scene with the same name already exists on the Omniverse server, the connector generates an incremental update. Existing buildings with the same ID (CityEngine Object ID) are “muted” and updated with the new buildings. This is useful for updating individual buildings in large scenes without having to repeat a long export.
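The incremental-update idea can be sketched in plain Python. This is only an illustration of the merge-by-ID logic, not the connector’s actual code; the function and ID names are made up:

```python
def incremental_update(scene, re_export):
    """Merge a re-exported set of buildings into an existing scene.

    Both dicts map CityEngine Object IDs to building payloads. IDs that
    already exist are overridden (the old version is "muted" and replaced),
    new IDs are added, and untouched buildings stay exactly as they are.
    """
    merged = dict(scene)       # keep all existing buildings
    merged.update(re_export)   # override/add the re-exported ones
    return merged

existing = {"bldg_001": "tower v1", "bldg_002": "hall v1"}
update = {"bldg_002": "hall v2", "bldg_003": "garage v1"}

print(incremental_update(existing, update))
# {'bldg_001': 'tower v1', 'bldg_002': 'hall v2', 'bldg_003': 'garage v1'}
```

Only the buildings present in the re-export are touched, which is what makes updating a single building in a huge scene cheap.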
DP: How much does a normal CityEngine user have to “rethink” if they want to work in USD?
Simon Haegler: In contrast to traditional formats such as FBX, USD emphasises the concept of referencing. In practice, this means that a CityEngine USD export consists of several geometry files that reference each other.
This is intentional: We write CityEngine layers and inserted assets that have not been modified by CGA back into separate USD files. This enables the user to mute individual CityEngine layers and buildings, for example, or to exchange individual assets (e.g. vegetation) without having to perform a new export. The whole thing is held together by a lightweight “root” file, which then loads the entire model during import.
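As a rough illustration of that layout, such a lightweight root file could look like this in USD’s text format (the file and prim names here are made up, not the exporter’s actual naming):

```usda
#usda 1.0
(
    defaultPrim = "City"
)

def Xform "City"
{
    # Each CityEngine scene layer is exported to its own file and only
    # referenced here; muting a reference hides that layer, and swapping
    # the referenced file exchanges assets without a full re-export.
    def "Buildings" (
        prepend references = @./City_Buildings.usd@
    )
    {
    }

    def "Vegetation" (
        prepend references = @./City_Vegetation.usd@
    )
    {
    }
}
```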

DP: Let’s go the other way round: What can I already display from the Omniverse Library in CityEngine?
Simon Haegler: Using an Omniverse server as an asset source is currently still in the internal experimentation phase. Since the Omniverse Asset Library is basically “only” a collection of USD files, you can copy them into a CityEngine project and integrate them like normal assets (but you must observe the Nvidia terms of use here). The integration works as usual via the CGA “insert” command or as a single “static model”. One limitation is that CityEngine currently only understands the “USD Preview Surface” PBR material and cannot yet display MDL materials.
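Assuming such a USD file has been copied into the project, a minimal CGA sketch of the insertion could look like this (the asset path and rule name are made up for illustration):

```cga
# Hypothetical example: insert a copied USD asset
# as the geometry of a lot.
attr assetPath = "assets/vegetation/tree.usd"

Lot -->
    i(assetPath)
```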
DP: Does the “Live” work in both directions, or do I still have to export manually?
Simon Haegler: The export direction is semi-live: you start the export with just a few clicks, and the connected Omniverse apps are then updated automatically if their “live” mode is activated. The import is currently still manual. To be able to offer a fully “bidirectional” connector, we still have some homework ahead of us, especially to enable a good UX when dealing with external, “live” USD assets in the context of CGA and Rule Packages (RPK).

DP: CityEngine is already very flexible when it comes to file formats – is Omniverse the bridge to the other 3D and video workflows?
Simon Haegler: Nvidia’s messaging around Omniverse has shifted slightly over the last few years. Omniverse is not easy to grasp from a user perspective because it makes this universal functional claim and consists of so many components at very different stages of maturity. From CityEngine’s point of view, Omniverse is above all a convenient way of rendering complex city scenes efficiently and with high quality, and of combining them with other content. In the medium term, the simulation tools (Physics) are of course also interesting. We have already experimented with “micro-services” within Omniverse, e.g. to realistically demolish individual CityEngine buildings, but both we and Nvidia still have some work to do there.
DP: As a programmer, how good is the SDK from Nvidia, and how much effort did it take for the CE team to implement the connector?
Simon Haegler: My answer here is somewhat coloured, as I started on the connector when Omniverse was still in the preview stage and there was no official SDK, let alone developer documentation – i.e. I reverse-engineered access with the support of Nvidia. However, the actual Connector API (C++) was solidly designed from the start and straightforward to use.
In the meantime, other APIs have been added (Python), for example to install your own micro-services on the Omniverse server or to build entire desktop apps based on the Omniverse UI Toolkit. Where things are still a bit rough is the publication process for custom connectors in the Omniverse Launcher: Nvidia is still “finding its way” there, and it takes some communication effort until the installer and documentation files end up in the right place. The CityEngine team is definitely a “very early adopter” of Omniverse here.

DP: And if we look to tomorrow – CityEngine 2030, or Omniverse 2030 – what features do you think are on the horizon, and what topic will we no longer be interested in by then?
Simon Haegler: Seven years is an extremely long time in this context and it’s difficult to make a prediction, especially with the current focus on AI/ML. I’m thinking of image generators (DALL-E and the like) as well as NeRF (Neural Radiance Fields) and inverse rendering research. It may be that by 2030 we will no longer need 3D formats at all, because everything will live in the ML models and scenes will be rendered directly from text or sketches (concept art).
But it is also possible that in 2030 we will still be descending into the depths of the production pipelines and be annoyed that a render has failed because there is a non-supported Unicode character in the texture name.