
Nvidia unveils Drive PX 2 platform for self-driving cars

Jan 7, 2016 — by Eric Brown

Nvidia unveiled a “Drive PX 2” platform for self-driving cars, an update to its earlier Tegra-based Drive PX automotive mainboard design.

Nvidia and Qualcomm showed off new automotive platforms at CES that apply their advanced GPUs to sophisticated computer vision tasks. Qualcomm’s new Linux- and Android-ready Snapdragon 820A is an automotive spin on its quad-core 820 SoC that targets in-vehicle infotainment (IVI) and advanced driver assistance systems (ADAS). Here, we look at Nvidia’s Drive PX 2 platform for self-driving cars, an update to its Tegra-based Drive PX automotive board built around 16nm Tegra SoCs that have yet to be formally announced.



Nvidia CEO Jen-Hsun Huang shows off Drive PX 2 board at CES.

Last year at CES, Nvidia showed off the first autopilot prototype for its Tegra X1- or K1-based Drive PX automotive board. Aimed at ADAS applications, Drive PX is now being used by more than 50 companies in the automotive world, says Nvidia. This year, Nvidia unveiled the Drive PX 2, which supports fully autonomous cars.

Drive PX 2 will provide 8 teraflops of processing power and support 24 trillion deep learning operations a second, or 10 times the performance of the first-generation Drive PX, claims the chip designer. The new board will incorporate two “next-gen Tegra processors and discrete GPUs based on our Pascal architecture,” says Nvidia.



Render of Drive PX 2 in its chassis with cooling system

The 16nm Tegras will be paired with discrete GPUs that are more powerful than six GeForce Titan X cards, says the company. By comparison, the current Tegra X1, announced a year ago, is a 64-bit, 20nm SoC with four Cortex-A57 cores, four Cortex-A53 cores, and a 256-core Maxwell GPU.


Illustrations of Drive PX 2 in use (left) and typical Drive PX 2-based ADAS screens

The Drive PX 2 platform supports sensor integration from up to four lidar detectors, four fish-eye cameras, and two narrow-field cameras, plus GPS, radar, and ultrasonic sensors, to provide a 360-degree view around the car. This roughly describes the Audi A6 test car that is currently being used to test Drive PX 2 in California. The highly integrated board enables much more compact self-driving car computers than those used in current trials, says Nvidia.
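Nvidia has not detailed the software interface for this sensor stack, but as a rough, hypothetical illustration, the Python sketch below shows one way readings from lidar, cameras, radar, ultrasonic sensors, and GPS could be grouped into time-aligned 360-degree snapshots. Every class and function name here is invented for illustration and is not part of any Nvidia SDK.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sensor reading: a source name, a timestamp, and the
# azimuth sector (degrees) it covers around the vehicle.
@dataclass
class SensorReading:
    source: str          # e.g. "lidar_front", "fisheye_left", "radar"
    timestamp: float     # seconds
    azimuth_deg: float   # center of the covered sector
    data: object = None  # raw payload (point cloud, image, range, ...)

@dataclass
class SurroundSnapshot:
    """All readings that fall within one fusion window, keyed by source."""
    window_start: float
    readings: Dict[str, List[SensorReading]] = field(default_factory=dict)

def fuse(readings: List[SensorReading], window_s: float = 0.05) -> List[SurroundSnapshot]:
    """Group readings into fixed time windows -- a stand-in for the
    synchronization step a real automotive middleware would perform."""
    snapshots: Dict[int, SurroundSnapshot] = {}
    for r in sorted(readings, key=lambda r: r.timestamp):
        bucket = int(r.timestamp // window_s)
        snap = snapshots.setdefault(bucket, SurroundSnapshot(bucket * window_s))
        snap.readings.setdefault(r.source, []).append(r)
    return list(snapshots.values())

if __name__ == "__main__":
    demo = [
        SensorReading("lidar_front", 0.010, 0.0),
        SensorReading("fisheye_left", 0.012, 90.0),
        SensorReading("radar", 0.060, 180.0),
    ]
    for snap in fuse(demo):
        print(snap.window_start, sorted(snap.readings))
```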



Drive PX 2 developers will have access to Nvidia’s Deep Learning GPU Training System (DIGITS) SDK. This deep neural network technology can handle advanced ADAS tasks such as surround view, collision avoidance, pedestrian detection, cross-traffic monitoring, and driver-state monitoring. Developers can also tap Nvidia’s DriveWorks middleware for autonomous driving, which enables sensor calibration, acquisition and synchronization of surround data, recording, and processing of sensor data streams. No OS support was listed for Drive PX 2, but Drive PX supported Linux, Android, and QNX.
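Nvidia did not publish code alongside the announcement, and the DriveWorks API is not documented here, so the snippet below is only a sketch of what one such ADAS task, pedestrian detection on a single camera frame, might look like. It uses a generic pretrained detector from torchvision (assuming a recent torchvision release with the weights API) as a stand-in for a network trained with DIGITS; the model choice, input path, and score threshold are illustrative assumptions, not Nvidia’s tooling.

```python
# Pedestrian detection on one camera frame, using an off-the-shelf
# torchvision detector in place of a DIGITS-trained, DriveWorks-deployed network.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

PERSON_CLASS_ID = 1      # "person" in the COCO label set used by these models
SCORE_THRESHOLD = 0.6    # assumed confidence cutoff for this sketch

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_pedestrians(image_path: str):
    """Return bounding boxes of likely pedestrians in a single frame."""
    frame = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        prediction = model([frame])[0]
    boxes = []
    for box, label, score in zip(prediction["boxes"],
                                 prediction["labels"],
                                 prediction["scores"]):
        if label.item() == PERSON_CLASS_ID and score.item() >= SCORE_THRESHOLD:
            boxes.append(box.tolist())
    return boxes

if __name__ == "__main__":
    for box in detect_pedestrians("camera_frame.jpg"):  # hypothetical input file
        print("pedestrian at", [round(v, 1) for v in box])
```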

Despite the Audi test car, it is Volvo that will be the first to publicly commit to the system, in what Nvidia claims is the world’s first public trial of autonomous driving. In 2017, the Swedish automaker will lease 100 XC90 luxury SUVs outfitted with Drive PX 2 to customers. The cars will be licensed to drive autonomously in Volvo’s hometown of Gothenburg, and semi-autonomously elsewhere.




Drive PX 2 and other Nvidia CES news

 
