
Toyota First to Adopt Nvidia's Self-Driving Simulator

This new tech allows car manufacturers to simulate millions of miles of driving scenarios for self-driving cars before they ever hit the road. | Nvidia


Toyota has already announced a collaboration with JAXA to build a lunar rover, so this next partnership comes as little surprise: the company has positioned itself as a leader in technology adoption.

Toyota also already uses Nvidia's Drive AGX Xavier computer to power its real-world self-driving vehicle testing, making Drive Constellation a natural complement for simulation and validation. But many readers may not be familiar with Nvidia's Drive Constellation program.

The project’s full name is the Virtual Reality Autonomous Vehicle Simulator, but Drive Constellation is shorter and captures the essence of the program well enough. It applies lessons learned from video games to the testing of self-driving cars.

How Drive Constellation Works

Known primarily for its performance-grade graphics cards, Nvidia brings that expertise to the virtual realm for self-driving car testing. Drive Constellation is a cloud-based platform, first announced last year by CEO Jensen Huang.

Developers can use it to test autonomous vehicles in a variety of situations, including dangerous scenarios that would be risky to stage on real roads.

Nvidia described the platform as providing simulations “with greater efficiency, cost-effectiveness, and safety than what is possible to achieve in the real world.” This involves a two-pronged approach combining a software platform with high-performance hardware.

First, the Nvidia Drive Sim software emulates an autonomous car’s sensors.

Then, the Constellation Vehicle server runs the driverless car’s software stack on Nvidia’s Drive AGX Pegasus. The hardware includes two Xavier processors and graphics processors rated at 320 trillion operations per second.

Constellation processes the simulated data as if it were real-world data. Driving commands are then fed back to the simulator, closing a digital feedback loop that runs 30 times per second.
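
For readers who want a mental model of that loop, here is a minimal, conceptual sketch of a hardware-in-the-loop cycle in the spirit of the one described above. Every class and function name is hypothetical and invented for this article; none of this is Nvidia's actual API.

```python
# Conceptual sketch of the two-server feedback loop: one side renders simulated
# sensor data, the other runs the driving stack and sends commands back.
# All names are hypothetical illustrations, not Nvidia Drive Constellation APIs.

from dataclasses import dataclass


@dataclass
class SensorFrame:
    camera: bytes   # simulated camera image
    lidar: bytes    # simulated lidar point cloud
    radar: bytes    # simulated radar returns


@dataclass
class DrivingCommand:
    steering: float  # steering angle
    throttle: float  # 0.0 to 1.0
    brake: float     # 0.0 to 1.0


class SimulatorServer:
    """Stands in for the Drive Sim side: it renders the virtual world."""

    def render_sensors(self) -> SensorFrame:
        return SensorFrame(camera=b"", lidar=b"", radar=b"")

    def apply(self, command: DrivingCommand) -> None:
        # Advance the virtual vehicle using the stack's latest command.
        pass


class DrivingStackServer:
    """Stands in for the Constellation Vehicle side: it runs the AV software."""

    def process(self, frame: SensorFrame) -> DrivingCommand:
        # The stack treats simulated sensor data exactly like real data.
        return DrivingCommand(steering=0.0, throttle=0.1, brake=0.0)


def run_loop(sim: SimulatorServer, stack: DrivingStackServer,
             seconds: float, hz: int = 30) -> None:
    """Close the simulator/driving-stack loop `hz` times per simulated second."""
    for _ in range(int(seconds * hz)):
        frame = sim.render_sensors()    # simulator -> driving stack
        command = stack.process(frame)  # perception, planning, control
        sim.apply(command)              # driving stack -> simulator


run_loop(SimulatorServer(), DrivingStackServer(), seconds=10)
```

The point of the sketch is simply that the driving software never "knows" whether its sensor input came from a real car or a rendered world, which is what makes the simulated miles meaningful.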

Why Toyota Won’t Be the Only Adopter

This process creates photorealistic data streams across a large variety of testing environments.

Developers can capture data in stormy or snowy weather, or across different road terrains and surfaces. The platform can also mimic glare effects, daytime and nighttime driving conditions, and more.
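
To make that variety concrete, here is a small, purely illustrative sketch of how such test conditions might be described in code. The structure, field names, and values are assumptions made for this example; they do not reflect Drive Constellation's actual scenario format.

```python
# Hypothetical scenario descriptions covering the kinds of conditions mentioned
# above (weather, road surface, glare, time of day). Invented for illustration.

test_scenarios = [
    {"name": "night_rain_highway", "weather": "rain", "time_of_day": "night",
     "road_surface": "wet_asphalt", "glare": False},
    {"name": "noon_glare_urban", "weather": "clear", "time_of_day": "noon",
     "road_surface": "dry_asphalt", "glare": True},
    {"name": "dusk_snow_rural", "weather": "snow", "time_of_day": "dusk",
     "road_surface": "packed_snow", "glare": False},
]

for scenario in test_scenarios:
    print(f"Simulating {scenario['name']} in {scenario['weather']} conditions")
```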

Because it is an open platform, developers can create their own scenarios and upload them to integrate their own sensor and vehicle models. Nvidia commented on the scalability and efficiency of the testing, saying:

“This large-scale validation capability is comparable to operating an entire fleet of test vehicles, however, it is able to accomplish years of testing in a fraction of the time.”

Given the platform’s open nature, other manufacturers will likely flock to it as well.

Read More: Nvidia Confirms Mellanox Acquisition for $6.9 Billion

