Testing Autonomous Systems with AV elevate
In the final instalment of our three-part blog series on rFpro’s AV elevate™ platform, we turn our attention to the last critical phase of autonomous vehicle development: testing. We spoke to Josh Wreford, Product Manager for AV elevate, to explore how simulation enables scalable, safe and cost-effective validation of ADAS and automated driving systems.
Q: What do we mean by “testing” in the context of autonomous vehicles, and how does it differ from tuning and training?
Josh: Testing is where everything comes together. Tuning is about configuring your sensors to capture the best data, and training is about teaching your perception system to understand that data. Testing is the phase where you evaluate whether the complete system – perception, decision-making and control – works as intended across a huge variety of scenarios.
It’s also by far the broadest and most complex part of the process. You can test individual components in isolation or run full-stack simulations. The goal is always the same: develop and mature the system as much as possible virtually before any physical prototypes are ready for the public road – and then use the simulations to explore the edge cases that you may never come across in your own real-world driving.

Users can import hundreds of test scenarios straight into AV elevate and start testing immediately.
Q: Where do developers typically start?
Josh: At the component level. A simple example might be testing pedestrian detection. You’re focusing purely on the perception layer and evaluating the object detection algorithm. Once that works, you might add the decision-making and control layers – asking, “What should the vehicle do once it sees the pedestrian?” That could mean braking, swerving, or stopping entirely.
As you add more components into the loop, such as the full vehicle model, the simulation becomes more complex, but also more representative. The beauty of simulation is that you can build this up gradually and run the tests repeatedly in controlled conditions.
Q: What types of testing are supported in AV elevate?
Josh: We support every major testing architecture, whether it is software-in-the-loop (SiL), hardware-in-the-loop (HiL) or driver-in-the-loop (DiL). You can test individual perception stacks in the cloud or inject real sensor data into physical ECUs. For driver-in-the-loop scenarios, like lane keeping or evasive manoeuvres, you can even feed that same data into a motion platform to observe how the human driver, or passenger, reacts.
The flexibility is what sets AV elevate apart. Whether you’re validating an ECU, running neural networks in the cloud, or simulating complete Euro NCAP scenarios, the platform is capable of it all.

Simulation helps OEMs validate systems like AEB or lane keep systems in advance of track tests.
Q: Can you give an example of a typical testing workflow using AV elevate?
Josh: Sure, let’s say you’re testing an automated emergency braking system. First, you define your ego vehicle (chassis and control system) and configure the sensors (camera, radar, LiDAR, etc.). Next, that sensor data is fed into your perception system – or pure ground-truth feeds can be passed directly to the planning and control system. Using our scenario editor, you can build and edit a test scenario, such as a pedestrian walking into the path of the vehicle. You define the route, traffic, time of day, weather, and so on.
Once that’s done, the test can be massively scaled across multiple variations. For example, different lighting conditions, pedestrian types, crossing angles, or just different locations. The tests are executed either locally or in the cloud, and AV elevate collects all the data in the background.
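To make that scaling step concrete, here is a minimal sketch of how a base scenario might be expanded combinatorially across variations. The parameter names and values are invented for illustration; they are not the AV elevate API:

```python
from itertools import product

# Hypothetical variation axes for an AEB pedestrian scenario; real
# parameter names depend on the scenario editor's export format.
lighting = ["dawn", "noon", "dusk", "night"]
pedestrian = ["adult", "child", "cyclist"]
crossing_angle_deg = [45, 90, 135]

def expand_variations(lighting, pedestrian, angles):
    """Expand one base scenario into a test case per combination."""
    return [
        {"lighting": l, "pedestrian": p, "crossing_angle_deg": a}
        for l, p, a in product(lighting, pedestrian, angles)
    ]

cases = expand_variations(lighting, pedestrian, crossing_angle_deg)
print(len(cases))  # 4 * 3 * 3 = 36 test cases from one base scenario
```

Each resulting dictionary describes one concrete test run, which is what makes it straightforward to farm the sweep out locally or to the cloud.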
Testing is so varied and use-case-specific that we have enabled users to create their own metric calculators, so the data can be evaluated automatically against pass/fail criteria.
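A user-defined metric calculator of the kind Josh describes might look like the following sketch. The log format, field names and thresholds are all assumptions for illustration, not part of AV elevate:

```python
def aeb_metric(log, min_gap_m=0.5, max_decel_ms2=9.0):
    """Pass if the ego vehicle never gets closer than min_gap_m to the
    pedestrian and braking stays within a plausible deceleration limit.
    `log` is an assumed per-timestep record of gap and acceleration."""
    min_gap = min(sample["gap_m"] for sample in log)
    peak_decel = max(-sample["accel_ms2"] for sample in log)
    passed = min_gap >= min_gap_m and peak_decel <= max_decel_ms2
    return {"min_gap_m": min_gap, "peak_decel_ms2": peak_decel, "pass": passed}

# Toy log from a single simulated AEB run.
log = [
    {"gap_m": 12.0, "accel_ms2": 0.0},
    {"gap_m": 5.0, "accel_ms2": -6.5},
    {"gap_m": 1.8, "accel_ms2": -8.0},
]
print(aeb_metric(log))
```

The point of the pattern is that the same calculator can be applied automatically to every run in a large sweep, turning raw logs into a pass/fail verdict per test case.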
Q: What if developers don’t have time to build large test libraries from scratch?
Josh: That’s a common challenge, and we’ve addressed it by supporting scenario import from industry standards such as OpenSCENARIO, as well as integration with third-party scenario databases. This means users can import hundreds of test scenarios straight into AV elevate and start testing immediately. The open API architecture also allows the AV elevate simulation platform to be connected to any scenario controller.
Once a database is built, it can be reused across future projects. That’s a massive time saver, especially for repeat validation workflows or regression testing after a software update.
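ASAM OpenSCENARIO files are plain XML with a FileHeader element, so a scenario library can be catalogued before it is handed to any simulator. This sketch uses only the Python standard library; the folder layout and catalogue shape are illustrative assumptions, not an AV elevate workflow:

```python
import tempfile
import xml.etree.ElementTree as ET
from pathlib import Path

def index_scenarios(directory):
    """Scan a folder of OpenSCENARIO (.xosc) files and read each
    FileHeader's description attribute to catalogue the library."""
    catalogue = {}
    for path in sorted(Path(directory).glob("*.xosc")):
        header = ET.parse(path).getroot().find("FileHeader")
        catalogue[path.name] = header.get("description", "") if header is not None else ""
    return catalogue

# Demonstrate against a throwaway directory with one minimal file.
with tempfile.TemporaryDirectory() as d:
    Path(d, "ped_crossing.xosc").write_text(
        '<OpenSCENARIO><FileHeader description="Pedestrian crossing"/></OpenSCENARIO>'
    )
    catalogue = index_scenarios(d)
print(catalogue)
```

Because the format is an open standard, an index like this is reusable across projects, which is what enables the regression-testing workflow Josh mentions.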

AV elevate can automatically vary the base test scenario to create more rigorous testing.
Q: Is this type of simulation testing relevant only to AVs, or is it used for ADAS too?
Josh: It’s absolutely relevant for ADAS. In fact, one of the key drivers right now is compliance with Euro NCAP’s ADAS protocols and other similar schemes. Simulation helps OEMs validate systems like AEB or lane keeping in advance of track tests. Ultimately, this will improve safety scores, reduce development costs and increase OEMs’ confidence that they will pass the test.
To evaluate the robustness of ADAS systems, Euro NCAP is beginning to vary the configurations of its tests. This is great news for consumers, as it should result in systems that work better in the real world, but it is a headache for OEMs, who will need to test them more thoroughly. AV elevate can automatically vary the base tests within certain parameters to do this type of testing.
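One simple way to automate that kind of variation is to sample each test parameter within an allowed band. The parameters and bounds below are illustrative only – they are not Euro NCAP’s actual protocol ranges, nor AV elevate’s implementation:

```python
import random

# Illustrative parameter bands for a randomised AEB robustness sweep.
BOUNDS = {
    "ego_speed_kph": (10.0, 60.0),
    "overlap_pct": (-50.0, 50.0),
    "pedestrian_speed_kph": (3.0, 8.0),
}

def sample_variants(bounds, n, seed=0):
    """Draw n randomised variants of the base test, each parameter
    sampled uniformly within its allowed band. A fixed seed keeps the
    sweep reproducible between runs."""
    rng = random.Random(seed)
    return [
        {name: round(rng.uniform(lo, hi), 1) for name, (lo, hi) in bounds.items()}
        for _ in range(n)
    ]

variants = sample_variants(BOUNDS, 100)
print(len(variants))  # 100 randomised variants of the base test
```

Pairing a sampler like this with an automatic pass/fail metric is what turns one protocol test into a robustness sweep.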
Q: How does AV elevate help customers handle the complexity of these tests?
Josh: One of the things customers value most is that they can do everything in one platform. Real-time and ray-traced rendering, SiL, HiL and DiL, perception system evaluation – it’s all possible within the same simulation ecosystem.
You don’t need separate tools for each use case. That means the digital assets created can be reused, for example vehicle models, locations and test scenarios, across departments and across the ‘Tune’, ‘Train’ and ‘Test’ phases.
Q: How can you optimise simulation for testing?
Josh: Ultimately, it depends on what is being tested. If you’re validating a perception system in poor lighting or night-time conditions, for example, the best quality data would be needed, so you would make use of our ray tracing rendering technology. This takes more time but more accurately simulates how light behaves. But for many control-based tests or HiL perception tests, our real-time engine is much faster.
A common approach is to use the real-time engine for early testing and switch to ray tracing once you’ve identified edge cases or failures that need more accurate sensor input.
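That two-stage approach can be sketched in a few lines. The run functions here are toy stand-ins for the real-time and ray-traced execution paths, not AV elevate calls:

```python
def two_stage_run(cases, run_realtime, run_raytraced):
    """Stage 1: screen every case with the fast real-time renderer.
    Stage 2: re-run only the failures with ray tracing, where the
    higher-fidelity sensor input justifies the extra compute."""
    stage1 = {case: run_realtime(case) for case in cases}
    failures = [c for c, passed in stage1.items() if not passed]
    stage2 = {c: run_raytraced(c) for c in failures}
    return stage1, stage2

# Toy stand-ins: the real-time screen fails the night-time case.
cases = ["noon", "dusk", "night"]
run_realtime = lambda c: c != "night"
run_raytraced = lambda c: True
s1, s2 = two_stage_run(cases, run_realtime, run_raytraced)
print(sorted(s2))  # only "night" needs the slow ray-traced re-run
```

The design choice is simply to spend expensive rendering time only where the cheap screen has already flagged a problem.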
Q: Are any customers using AV elevate to support their testing programmes today?
Josh: Yes – AMD is a great example. They’re using AV elevate to validate their automated parking systems. They’ve set up HiL configurations, with perception, decision and control all running in real time, and sensor inputs injected into real ECUs.
To keep the initial process efficient, they started with AV elevate’s internal vehicle model rather than a custom chassis model. Thanks to simulation, they were able to quickly develop the parking system to steer and stop in the right places, and it has become central to their development process. As they move on to more complex dynamic manoeuvres, the ability to swap in custom chassis models becomes highly valuable, and that is now in place.
Q: Finally, why is testing such an important part of the AV development process?
Josh: Testing is the proof point. It’s where you validate all the hard work you did in tuning and training. And with the scale of testing AV systems require – the equivalent of millions of miles of real-world driving – simulation is the only practical option to shorten timeframes and explore test cases that would be difficult or unsafe to find through real-world driving alone.
AV elevate enables all forms of testing at any stage of the development process. Whether you’re validating a single sensor or the full system, we give you the tools to do it accurately, efficiently and repeatedly. That’s the power of a fully integrated simulation platform.



