This blog post was originally published by SmartCow AI Technologies. It is reprinted here with the permission of SmartCow AI Technologies.
Over the past few weeks, the simulation team here at SmartCow AI has been working on something that will help us gain a deeper understanding of how to create better, smarter, and more efficient spaces in our world.
All of this can be done through a cutting-edge emerging technology called a Digital Twin: a large-scale simulation system that acts as an accurate counterpart to a real-life area or object.
The first notion of a Digital Twin dates back to 1991 and David Gelernter’s Mirror Worlds, where he writes:
“A Mirror World is an ocean of information, fed by many data streams. Some streams represent hand-entry of data at computer terminals; they flow slowly. Others are fed by automatic data-gathering and monitoring equipment, like the machinery in a hospital’s intensive care unit, or weather-monitoring equipment, or traffic-volume sensors installed in roadways.”
At the turn of the century, Dr. Michael Grieves applied the idea of digital twins to software and real-life use, and later, in 2010, the term “Digital Twin” was introduced by John Vickers at NASA. Nowadays, Digital Twin systems are being built to monitor all kinds of real-world systems: they acquire data from their physical counterparts and are fed this same data to learn how to move forward with a specific task.
Why Digital Twins?
One of the greatest advantages of Digital Twins we discovered is just how versatile they are. If we can replicate any space in 3D, we also have the flexibility to move objects around or change anything we want, without disrupting the physical condition of the space in the real world.
This, of course, came with several perceived benefits, one of which is a greater scope for analytics to observe process flows: patterns that are not easily seen in raw data can emerge in a Digital Twin. Another great benefit is that this is an extremely cost-effective solution; if something goes wrong or not quite as planned, we can always reset a Digital Twin.
We’ve been working on our own Digital Twin creation for use in Smart City and Smart Space applications. We wanted to create a system where data can be visualized and studied up to the point where a person can gather said data and transfer it into the real world. Moreover, we want to be able to use this data to simulate scenarios that may not be present within the data itself, but can occur in the real world.
DIY — Digitize It Yourself!
So, how did we do it? For starters, a Digital Twin is a 3D system based on a real physical space, so that’s how we approached it. Our demo build features a road intersection found in Antwerp, Belgium, boasting multiple buildings, foliage, streetlights, and traffic systems.
All of the Digital Twin’s scene assets were created in 3D Modeling software. A rendering engine was then used to visualize the results. To facilitate our collaborative development, we used NVIDIA’s Omniverse™ platform as our main rendering engine, which was also great for real-time GPU-based rendering. Here is how we set up our pipeline for our Digital Twin which supports real-time weather and day-night cycles:
SmartCow’s approach to creating our weather-compatible Digital Twin!
To begin this long journey, we started with the 3D modelling phase, as it was easily the most involved and time-consuming process. Each and every relevant item in the scene needed to be modelled with good geometry, have texture maps and/or materials applied, and be consistent with respect to its surrounding objects. The bulk of the assets were created by our dedicated 3D Artist.
When making these assets, the modelling workflow was quite standard: we created the asset geometry from quadrilateral faces, which keeps the geometry efficient for our scene and makes it easy to subdivide into triangles that preserve the overall shape of the mesh at render time. We then defined the UV maps and applied a texture to each. Because the scene needed to be a 1:1 digital counterpart, extra attention had to be given to the spatial relationship between objects; incorrect scaling can ruin the immersion and visual fidelity of the scene.
The level of detail in the geometry was also a significant consideration; higher-fidelity geometry looks better and is more immersive, but will run slower and take up significant storage space. It’s all about balance!
Example of a hyper-realistic 3D table asset for our Digital Twin.
The next step was to import all of our 3D assets into a simulation/rendering program. The hand-painted textures were then configured as materials and applied to the assets that compose the final scene. Objects were placed to match their physical counterparts in the Digital Twin space. Once assembled, the scene served as a blank canvas for data streaming and simulation capability.
When all the assets are put together, the Digital Twin starts to look the part!
In terms of functionality, our build features a real-time weather system consisting of several effects, including but not limited to:
- Sunny weather ☀️
- Rainy weather 🌧️
- Snowy weather ❄️
- Foggy weather 🌫️
- Cloudy weather ☁️
To achieve this, each weather condition was defined as a configuration profile. Each profile combines particle-emitter effects that mimic weather such as snow or rain with corresponding lighting changes: a darker, more muted sky during rainy conditions, and a brighter, more saturated sky during sunny conditions, to further improve the depth of realism.
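Conceptually, each such profile can be sketched as a small configuration object. The field names, emitter names, and lighting values below are illustrative assumptions, not our actual scene parameters:

```python
from dataclasses import dataclass, field

@dataclass
class WeatherProfile:
    """One weather condition: particle effects plus matching lighting."""
    name: str
    particle_emitters: list = field(default_factory=list)  # emitters to enable
    sky_intensity: float = 1.0    # overall brightness of the sky light
    sky_saturation: float = 1.0   # colour saturation of the sky

# Illustrative profiles; the numbers are placeholders for tuned values.
PROFILES = {
    "sunny":  WeatherProfile("sunny",  sky_intensity=1.0, sky_saturation=1.1),
    "rainy":  WeatherProfile("rainy",  ["rain_fx"], sky_intensity=0.5, sky_saturation=0.7),
    "snowy":  WeatherProfile("snowy",  ["snow_fx"], sky_intensity=0.7, sky_saturation=0.6),
    "foggy":  WeatherProfile("foggy",  ["fog_fx"],  sky_intensity=0.6, sky_saturation=0.5),
    "cloudy": WeatherProfile("cloudy", sky_intensity=0.7, sky_saturation=0.8),
}
```

Switching weather then amounts to deactivating the current profile's emitters and applying the new profile's emitters and lighting values to the scene.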
To take this a step further, we obtained real-time weather data from our recently released CityStation product. The system was installed in our desired location and through a custom connector written by our team, the data was ingested and parsed into our Digital Twin. By taking this data and activating the correct weather profile in response to these conditions, we were able to simulate real-time weather which perfectly reflects the exact conditions in the area where CityStation is installed!
Make it rain! (ﾉ◕ヮ◕)ﾉ*:･ﾟ✧
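A minimal version of that ingest-and-dispatch step might look like the sketch below. The payload fields and condition codes are assumptions for illustration, not CityStation's real API:

```python
import json

# Hypothetical mapping from a reported condition code to a weather profile name.
CONDITION_TO_PROFILE = {
    "clear": "sunny",
    "rain": "rainy",
    "snow": "snowy",
    "fog": "foggy",
    "overcast": "cloudy",
}

def select_profile(payload: str) -> str:
    """Parse one weather reading and pick the matching profile name."""
    reading = json.loads(payload)
    condition = reading.get("condition", "clear")
    # Fall back to a sensible default if the code is unrecognized.
    return CONDITION_TO_PROFILE.get(condition, "sunny")

# Example reading, as a connector might receive it:
profile = select_profile('{"condition": "rain", "temperature_c": 12.4}')  # "rainy"
```

In practice the connector would poll (or subscribe to) the device, call something like `select_profile` on each reading, and activate the resulting profile in the scene whenever it changes.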
Since we already had this real-time data, it made sense to go the extra mile and add some more visual fidelity. A day-night cycle was just the solution to enhance our scene. We achieved this effect by defining sun and moon profiles with a setup similar to our weather effects. The sun’s (or moon’s) position is then emulated by adjusting its angle based on the current time of day, using standard solar elevation formulae. When the sun sets, the moon profile kicks in, and vice versa. Sunset visuals included, of course!
No scene is complete without a beautiful sunset!
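For reference, the standard solar elevation formula combines latitude, solar declination, and hour angle. A minimal sketch, using Antwerp's approximate latitude for illustration:

```python
import math

def solar_elevation(lat_deg: float, day_of_year: int, hour: float) -> float:
    """Approximate solar elevation angle in degrees.

    lat_deg: latitude in degrees; hour: local solar time (0-24).
    Uses the common approximation for declination and hour angle.
    """
    # Solar declination: about +23.44 deg at the June solstice, -23.44 in December.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (hour - 12.0)  # degrees; zero at solar noon
    lat, decl_r, ha = map(math.radians, (lat_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(decl_r)
                + math.cos(lat) * math.cos(decl_r) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

# Antwerp sits at roughly 51.2° N. Around noon on the June solstice the sun
# is high in the sky; at midnight its elevation is negative (below the horizon),
# which is when the moon profile would take over.
noon = solar_elevation(51.2, 172, 12.0)      # roughly 62°
midnight = solar_elevation(51.2, 172, 0.0)   # below 0°
```

Driving the sun light's angle from this elevation (and a matching azimuth) each frame gives a smooth, physically plausible day-night cycle.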
Further functionalities can be added in, such as vehicle traffic systems, pedestrian traffic systems, pedestrian count systems, and so on.
By now, our Digital Twin has come together quite nicely. After following this entire process from start to finish, this is what we had as a result:
Our result — A Digital Twin you can live in!
So, now you know how we approached the murky waters of Digital Twin development, and hopefully you understand just how powerful this tool can be. We will continue to add even more features to harness the incredible capability of this cutting-edge technology. We really think that with a Digital Twin, the sky’s the limit!
Gelernter, David. Mirror Worlds: or the Day Software Puts the Universe in a Shoebox. How It Will Happen and What It Will Mean. Oxford University Press, 14 November 1991.
Machine Learning and Simulations Software Engineer, SmartCow AI Technologies
3D Artist, SmartCow AI Technologies