At present, the dream of autonomous driving is little more than that: the shared vision of an industry struggling with a raw-materials shortage (see our article on the industry chip shortage) that not only needs more chips than are available, but must also brace for further supply bottlenecks once the expected palladium shortage sets in (source: https://www.t-online.de/auto/technik/id_91749886/palladium-experte-warnt-autobauer-vor-naechstem-engpass.html ).
Nevertheless, development is progressing, also and especially in the area of autonomous driving. A cynic might struggle for words here, since the confidently presented performance promises from prominent parts of the industry, such as Tesla, still lack fully functional proof. So it is time to ask how this development can actually be brought to the road and tested. We found the answer in an interview with Stefan Wenz from Epic Games, the company behind the Unreal Engine.
- Testing autonomous driving functions: Where there is a will despite a lack of pathways
- The Unreal Engine interview with Stefan Wenz, Epic Games
- Unreal Engine - How Games Became the Bridge for Virtual Test Environments
- Fortnite - Successful test of the Unreal Engine
- Unreal Engine: How much reliance can be placed on virtually collected data
Approx. 21 min
Testing autonomous driving functions: Where there is a will despite a lack of pathways
To everyone’s delight and despite the prophecies of doom: progress is being made, and Level 3 driver assistance systems will be on the road this year. In other words, systems that allow drivers to hand over responsibility to the autonomous driving functions and take their eyes off the road. The vehicle accelerates, brakes and performs evasive maneuvers on its own, helps form an emergency corridor when required – and cannot be used in the rain. Further development is therefore needed to eliminate these teething troubles and to reach Level 4 or even Level 5, fully autonomous driving. Above all, this requires tests: how does the vehicle react in situation X, with parameter Y and special case Z? How can these tests be performed en masse, with variants, automated? How do I obtain the necessary data sets, how do I verify them, and how do I get them into an environment where every conceivable test scenario can be run?
One solution comes from game engines, which originated in the gaming industry and have been perfecting highly realistic representations of the environment for years. The racing game genre in particular has always strived for maximum realism and often approaches the fidelity of a true simulation. We therefore took the opportunity to talk to Stefan Wenz, Business Development Manager Unreal Engine Enterprises at Epic Games, about the Unreal Engine. The result was an informative, interesting and very likeable interview – we hope you enjoy it!
The Unreal Engine interview with Stefan Wenz, Epic Games
Mobility Rockstars: Hello Stefan! Thank you for taking the time to answer our questions. Let’s get specific right off the bat: What exactly is the Unreal Game Engine?
Stefan: Perhaps we should first clarify what a game engine is in the first place. Ultimately, it is an amalgamation of the various functions needed to run a game. For a game like “Fortnite” from Epic Games, for example, you need a visual component for the display itself, the so-called render engine. You need a speech component for the voice interface and a sound component – in our case spatial audio, which enables 3D hearing so you can tell where enemies are coming from in a game like Fortnite – plus various smaller services to run the game. Last but not least, the physics engine is very important. And these building blocks that you need to run a game are also what you need within a simulation for autonomous driving functions. That is exactly what the Unreal Engine is. Of course, there are also other providers, such as the Unity Engine (editor’s note: we have also written an article about it) – however, the Unreal Engine is one of the few whose complete source code is openly available.
Unreal is also extremely modular – you can add modules via C++, for example, to bring your own functions into the simulation. Through Blueprints (the visual scripting component in UE), we give non-programmers the opportunity to develop games.
Mobility Rockstars: Does your licensing model work the way we know it from other engines, then? In other words, small developers use the engine free of charge, while game manufacturers or OEMs purchase licenses?
Stefan: We actually do things a bit differently than comparable providers. There are three licensing options. Two of them are available on the web: one is the Publisher license, the other is the Creator license, which is designed for all B2B use cases. So if, for example, an OEM wants to create an application, say a driving simulation, it can go to the web and download the engine for free. In addition, there is the option of getting support from us via a closed ticket system, where access costs a modest but real fee.
With the Publisher license, we are in the B2C area: game publishers can use the engine, also free of charge, until the first million is earned with the Unreal-based product, at which point royalties become due.
The third option is simply to contact us. We set up an Enterprise Agreement if you want to work with us, and here you can additionally arrange for certain B2C applications – like an interactive web configurator for clicking together your car – which would actually fall under the Publisher license anyway. For that, you just have to talk to us. In short, in the enterprise environment the engine is always free of charge. The engine also includes various services free of charge, such as Quixel, a library of high-resolution assets of various objects, or MetaHuman, with which human avatars can be created – these can then be used in vehicle interior simulations, for example, to test fatigue detection or similar functions.
Unreal Engine – How Games Became the Bridge for Virtual Test Environments
Mobility Rockstars: So is this the bridge from gaming to industry? Building blocks and models that were originally developed for games and approach the fidelity of a simulation, and can now be used for testing in a virtual environment?
Stefan: Exactly. The level of detail of these assets is extremely high, thanks in part to technological advances. Simulations used to be a bit bumpy, simply because realism was lacking; graphics engines could not yet deliver the required performance. We have since come a long way, especially with driving simulations. One of our customers ran a test with professional drivers: he first had them drive the Nordschleife at the Nürburgring and then had them drive the same course in a driving simulator. The results were extremely similar, with deviations only in the range of seconds.
Mobility Rockstars: This brings back memories of my youth, when I played the game “Test Drive” on my first computer. Vehicles and road situations were already simulated there – of course still extremely bumpy, with empty roads and clunky cars, but it already felt different whether you were driving a BMW or a Dodge Viper. Were these the beginnings of today’s testing capabilities via Unreal Engine?
Stefan: Yes, that was certainly an important step, even though the hardware at that time was of course not capable of supporting a realistic simulation environment. When I look at what is possible today, by contrast, it is enormous. For example, we do “virtual test drives,” which are real simulations that a real driver drives through, but we also already do automated validation of autonomous driving software with Unreal. In the end, it makes no difference to the software whether the supplied information comes from a camera or is generated by the engine. The important thing is that it happens in real time – which is what an engine was built for – and that the realism is high enough that the footage is equivalent to footage from a camera.
That is also why more and more OEMs are turning to the Unreal Engine for testing autonomous driving functions: you can run the necessary tests at practically unlimited speed and scale. Validations that used to require thousands of test miles can now be done overnight. If an error occurs during the night, you can jump straight to it, analyze it, fix it and immediately retest whether it was really fixed. In addition, testing can now take into account all the conceivable variants that are impossible to stage in the real world: passers-by unexpectedly running onto the road, rain, different road surfaces – all of this can be simulated in the game engine with identical parameters and tested accordingly.
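The variant testing Stefan describes can be pictured as a sweep over scenario parameters. The following is a minimal sketch, not Unreal's actual API: the parameter names and the `run_scenario` stub are invented for illustration.

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Scenario:
    """One simulated test case; all field names here are illustrative."""
    weather: str            # e.g. "clear", "rain"
    road_surface: str       # e.g. "asphalt", "gravel"
    pedestrian_crossing: bool

def run_scenario(s: Scenario) -> bool:
    """Stand-in for handing the scenario to the simulator.

    A real harness would launch the engine, inject the parameters,
    and evaluate the vehicle's behavior; here we simply report success."""
    return True

# Enumerate every combination of the varied parameters -- the
# "all conceivable variants" that are impractical on a real road.
weathers = ["clear", "rain", "fog"]
surfaces = ["asphalt", "gravel"]
scenarios = [Scenario(w, r, p)
             for w, r, p in product(weathers, surfaces, [False, True])]

failures = [s for s in scenarios if not run_scenario(s)]
print(f"ran {len(scenarios)} scenarios, {len(failures)} failures")
```

Because each scenario is just a data record, the whole batch can run unattended overnight, and any failing combination can be replayed in isolation the next morning.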
Taken a step further, it is not always just about validating the software once for autonomous driving – you also want to make improvements without degrading functions that already worked. These regression tests, too, can be mapped optimally in a game engine.
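The regression idea, improving one function without breaking the rest, can be sketched as re-running every previously passing scenario against a stored baseline. The scenario names and the `run_all` stub below are hypothetical, chosen only to make the pattern concrete.

```python
# Hypothetical regression check: every scenario that passed with the
# old software build must still pass with the new one.

baseline = {"overtake_dry": "pass", "brake_wet": "pass", "merge_dense": "fail"}

def run_all(software_version: str) -> dict:
    """Stand-in for re-running every scenario with the new build."""
    results = dict(baseline)
    if software_version == "v2":
        results["merge_dense"] = "pass"   # the intended improvement
    return results

new = run_all("v2")
regressions = [name for name, old in baseline.items()
               if old == "pass" and new[name] != "pass"]
assert not regressions, f"regressions detected: {regressions}"
print("no regressions; merge_dense:", new["merge_dense"])
```

The point of running this in a simulator is that the baseline suite is cheap enough to repeat after every change, which is exactly what makes the overnight workflow described above practical.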
Mobility Rockstars: I also see an overlap there, for example with Robotic Process Automation. Automating these tests so as not to have to assign highly qualified personnel to them certainly also makes sense.
Stefan: Definitely. Because Unreal is based on C++, you have access to all the functions in the Unreal Engine and can program everything you need. However, it also makes sense to work with plugins and develop those – that is a better way to avoid errors in what is ultimately a complex environment. In addition, we are constantly learning ourselves. Take our game “Fortnite” as an example – it was ultimately created precisely to test and improve our own technologies. Where are the errors, what can be optimized, where is the pain?
Fortnite – Successful test of the Unreal Engine
Mobility Rockstars: Okay, wait – so that means Fortnite, this really very successful game, is basically just a single Unreal Engine test that happened to become very successful?
Stefan: Exactly! And what is interesting for customers from the automotive industry – or actually for all customers – is that over 350 million accounts play Fortnite, and thus test the software. There is no other software, no matter how good, that has been tested at a similar level.
Mobility Rockstars: Of course, this also makes it ideal for reliable applications in the automotive industry. But what I and our readers are also interested in is: So now we have this very reliable engine that is widely tested and offers numerous possibilities. But how do I ultimately create my virtual test environment? How do I get the necessary data into the simulation?
Stefan: There are different ways to create the test environment. The easiest solution, of course, is to find a CG artist to create the environment using the Quixel assets, MetaHuman and so on. This costs something, naturally – a kilometer of road is not exactly cheap – but this way you can build exactly what you need.
Mobility Rockstars: So I could just recreate the street I want in the middle of Munich via Unreal Engine, right?
Stefan: Right – but actually you can go further and ask: why recreate a street from Munich at all? I would rather have a road assembled where everything I want to test is present and happening. I can also, of course, use RealityCapture, the scanning technology used to capture the Quixel assets. You can have a drone fly over a landscape and capture a point cloud, which can be transferred to the Unreal Engine and rebuilt there. Even easier is to look at other software companies that work with Unreal. There is Cesium, for example – they have captured practically the entire world and make it available to map providers, among others, for the maps in navigation systems. The company that built Microsoft Flight Simulator also uses Unreal. So all of this data already exists and can be accessed, which allows the environment model to be extremely realistic. For vehicles and their interiors it is often even easier: the OEM can simply provide the CAD model, which can be imported. Of course, some rework is still needed afterwards – integrating sensors and their functions, and so on.
Mobility Rockstars: It really is a very interesting topic, and after defining terms, the environment and the vehicles, we are slowly approaching the topic of testing itself. How far can I go in a simulation using the Unreal Engine? Can I automate tests and even vary them with bots? Can my pedestrian walk in the middle of the road in one test and an inch further to the left in the next? In other words, can I use an AI to help me with test coverage?
Stefan: Here again, you could say we learned a lot from the gaming industry. When I play a round of Fortnite, I have a lobby of 100 players who want to play. But are those always 100 humans? The alternative is computer-controlled players, so-called bots. These bots, of course, have to make decisions that feel right; they have to make mistakes and miss now and then, and so on. These are exactly the basics that are needed later in testing. In our “The Matrix Awakens” demo (editor’s note: a technology demo based on Unreal Engine 5), at about the 13-second mark you can see a traffic situation which, like all traffic in the demo, is driven by the “Mass AI Crowd and Traffic System” – an AI that simulates traffic. These vehicles also interact with each other: they overtake, turn, merge in front of other vehicles or have to accelerate. So the AIs interact with one another as well.
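The "an inch to the left" question from above can be sketched as sweeping a single scenario parameter in small steps. Everything below is invented for illustration: the braking oracle and its threshold are stand-ins, not engine behavior.

```python
# Illustrative sweep of one scenario parameter -- a pedestrian's
# lateral offset from the lane center -- in 10 cm increments.

def vehicle_brakes_in_time(pedestrian_offset_m: float) -> bool:
    """Hypothetical pass/fail oracle: the closer the pedestrian steps
    toward the lane center (offset 0), the harder the case. The 0.25 m
    threshold is made up for this sketch."""
    return abs(pedestrian_offset_m) > 0.25

# Vary the offset from -1.0 m to +1.0 m around the lane center.
offsets = [i / 10 for i in range(-10, 11)]
failing = [o for o in offsets if not vehicle_brakes_in_time(o)]
print(f"{len(failing)} of {len(offsets)} variants failed")
```

A coverage-driven harness would then concentrate further variants around the offsets where the outcome flips, which is where the interesting edge cases live.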
Unreal Engine: How much reliance can be placed on virtually collected data
Mobility Rockstars: How much reliance can be placed on simulated data that is generated cheaply and in bulk? After all, lives may depend on it in road traffic.
Stefan: The data is not 100 percent deterministic, but the test will spit out the same result every time if it is performed the same way. Of course, a game engine is not comparable to a pre-rendered graphic: when it rains and drops fall on the windshield of the simulated vehicle, they don’t always fall in exactly the same way – sometimes to the left, sometimes to the right, just like real rain. But the goal of the game engine is to output the same data over and over again, as realistically as possible. The PhysX engine is included for this purpose; it ensures that physical objects behave as they do in reality – a ball rolls as far as a real ball would roll.
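The distinction Stefan draws, cosmetic randomness with reproducible outcomes, corresponds to seeding the random number generator. A minimal sketch, with an invented `simulate_rain_frame` stand-in for the engine:

```python
import random

# Cosmetic detail (where each raindrop lands) is randomized, but seeding
# the generator makes the whole run repeat exactly, so a failing test
# can be replayed drop for drop.

def simulate_rain_frame(rng: random.Random, n_drops: int = 5) -> list:
    """Hypothetical frame: random x-positions of drops on the windshield."""
    return [round(rng.uniform(0.0, 1.0), 3) for _ in range(n_drops)]

run_a = simulate_rain_frame(random.Random(42))
run_b = simulate_rain_frame(random.Random(42))   # same seed -> same frame
run_c = simulate_rain_frame(random.Random(7))    # different seed -> differs

assert run_a == run_b      # deterministic replay of "random" rain
assert run_a != run_c
print("replay identical:", run_a == run_b)
```

Recording the seed alongside each test run is what turns "it rained differently that time" from an excuse into a reproducible test case.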
Mobility Rockstars: This also raises the question for me of how far testing has already come in such a highly complex simulation: Can it replace physical, real testing on real roads if I can cover so many test situations virtually?
Stefan: Testing via Unreal cannot completely replace real tests on the road, but tests in the virtual environment do make much of it unnecessary. Everything I used to do to test the software, I can do in Unreal. For a test drive, I can define the expected result in advance: I can say “drive route XY in sunshine”, and the result must match the simulation. So I still need the test drive to set a validation point, so that simulation and real test drive don’t drift apart. Besides, such a test drive is also fun! But it is precisely the wasted trips that can be avoided: you go out, hit a fault, have to go back, search, check, readjust, go out again. In simulation I can shortcut that: find the error, fix it and retest it while other tests run in parallel. In addition, I can import the footage recorded during the test drive directly back into the Unreal Engine and test it again in the simulation to see whether it produces the same results as the real drive. So the validation works in both directions. From that point of view, I would not say real test drives can be completely replaced, but testing in virtual environments eliminates many unnecessary drives.
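The "validation point" Stefan mentions can be pictured as comparing a recorded real drive against the simulated run of the same route, requiring the two to stay within a tolerance so they don't drift apart. All numbers and the tolerance below are invented for illustration.

```python
# Sketch of a validation point: the real drive's recorded speed trace
# is compared sample by sample against the simulated run of the same
# route. If the maximum deviation stays within tolerance, the
# simulation is considered anchored to reality.

real_speeds = [0.0, 12.4, 27.9, 49.8, 50.1]   # km/h, from the test drive
sim_speeds  = [0.0, 12.1, 28.3, 50.0, 50.0]   # km/h, from the engine

TOLERANCE_KMH = 1.0
drift = max(abs(r - s) for r, s in zip(real_speeds, sim_speeds))
validated = drift <= TOLERANCE_KMH
print(f"max deviation {drift:.1f} km/h -> validated: {validated}")
```

Because the comparison is symmetric, the same check supports the reverse direction Stefan describes: footage recorded on the road can be fed back into the engine and must reproduce the real drive's results.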
Mobility Rockstars: Stefan, thank you very much for the very informative and exciting interview and we are already looking forward to the next conversation!
Stefan: Thank you, likewise!