Startups build accurate AV perception on NVIDIA DRIVE


For self-driving cars, what's on the surface matters.

Humans are taught to avoid snap judgments, but self-driving cars must see, detect, and act quickly and accurately to operate safely. That capability requires a powerful perception software stack that can fully recognize and track the vehicle's surroundings.

Startups around the world are developing these perception stacks, leveraging the high-performance, energy-efficient compute of NVIDIA DRIVE AGX to deliver highly accurate object detection to autonomous vehicle manufacturers.

The NVIDIA DRIVE AGX platform processes data from a suite of sensors (camera, radar, lidar, and ultrasonic) so a self-driving car can perceive its surroundings, localize itself on a map, and then plan and execute a safe path forward. The AI supercomputer supports autonomous driving, in-cabin functions such as driver monitoring, and other safety features, all in a compact package.
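The perceive, localize, plan loop described above can be sketched in a few lines of Python. Everything here (the function names, data shapes, and deliberately trivial logic) is illustrative only, not the DRIVE AGX software stack:

```python
from dataclasses import dataclass

# Hypothetical sketch of a perceive -> localize -> plan loop.
# All names and structures are illustrative assumptions.

@dataclass
class SensorFrame:
    camera: list  # objects detected by camera DNNs
    radar: list   # radar returns
    lidar: list   # lidar clusters

def perceive(frame: SensorFrame) -> list:
    # Fuse detections from all modalities into one object list.
    return frame.camera + frame.radar + frame.lidar

def localize(hd_map: dict) -> str:
    # Placeholder: look up the ego vehicle's lane on a map.
    return hd_map.get("current_lane", "unknown")

def plan(objects: list, lane: str) -> str:
    # Choose a safe action given perceived objects and ego position.
    return "brake" if objects else f"follow {lane}"

frame = SensorFrame(camera=["car"], radar=[], lidar=[])
objects = perceive(frame)
action = plan(objects, localize({"current_lane": "lane_2"}))
# action == "brake", since an obstacle was perceived
```

In a real stack each stage is a pipeline of its own (DNN inference, map matching, trajectory optimization); the point here is only the data flow between the three stages.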

Hundreds of automakers, suppliers, and startups are building self-driving vehicles on NVIDIA DRIVE AGX, which is why perception stack developers around the world choose to build their solutions on the AI platform.

Development from day one

NVIDIA DRIVE is the starting point for companies looking to get perception solutions up and running.

aiMotive, an autonomous driving startup based in Hungary, has built a modular software stack called aiDrive that provides comprehensive perception capabilities for autonomous driving solutions.

The company began building its solutions on NVIDIA DRIVE-based compute platforms in 2016. With high-performance, energy-efficient computing, aiDrive performs perception with mono, stereo, and fisheye cameras, and fuses in radar, lidar, and other sensors for a flexible, scalable solution.

aiDrive integrates data from vehicle sensors to achieve flexible and scalable perception.

“We have been using NVIDIA DRIVE since day one,” said Péter Kovács, senior vice president of aiDrive at aiMotive. “The platforms work turnkey and support easy cross-target development, and it's a technology ecosystem developers are already familiar with.”

StradVision, a South Korean startup founded by perception experts in 2014, aims to build advanced driver assistance systems at scale. By developing on NVIDIA DRIVE AGX, the company has deployed a powerful perception deep neural network for AI-assisted driving platforms.

StradVision’s robust perception solution can operate in severe weather conditions, such as snow.

The DNN, known as SVNet, is one of the few networks that meets the accuracy and compute requirements of mass-production vehicles.

Performance at each level

Even at lower levels of autonomy, such as ADAS or AI-assisted driving, a strong perception stack is essential for safety.

PhantomVision uses cameras around the vehicle for complete 360-degree coverage.

Silicon Valley startup Phantom AI, drawing on years of experience in the automotive and technology industries, has developed an intelligent perception stack that can predict the movement of objects. Its computer vision solution, PhantomVision, combines front-, side-, and rear-view cameras to cover a 360-degree field of view around the vehicle.

Real-time detection and object tracking in a bird's-eye view provide accurate motion estimates for road objects. The high-performance processing of DRIVE AGX enables the software to run these perception functions in real time.
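As a rough illustration of how bird's-eye-view tracking yields motion estimates, the sketch below differentiates an object's tracked (x, y) position between frames and extrapolates it forward. The constant-velocity model and all names here are assumptions for illustration, not Phantom AI's actual method:

```python
# Illustrative constant-velocity motion estimation in a bird's-eye-view
# (top-down) frame. Positions are (x, y) in metres; dt is the frame
# interval in seconds. All of this is a simplifying assumption.

def estimate_motion(prev_xy, curr_xy, dt):
    # Finite-difference velocity between two tracked BEV positions.
    vx = (curr_xy[0] - prev_xy[0]) / dt
    vy = (curr_xy[1] - prev_xy[1]) / dt
    return vx, vy

def predict(curr_xy, velocity, horizon):
    # Extrapolate the object's BEV position `horizon` seconds ahead.
    vx, vy = velocity
    return curr_xy[0] + vx * horizon, curr_xy[1] + vy * horizon

# An object 10 m to the side moves 0.5 m forward in one 100 ms frame:
v = estimate_motion((0.0, 10.0), (0.5, 10.0), dt=0.1)  # ~5 m/s along x
ahead = predict((0.5, 10.0), v, horizon=1.0)           # ~(5.5, 10.0)
```

Production trackers replace the finite difference with a filtered estimate (e.g. a Kalman filter) to smooth detection noise, but the underlying idea of differentiating tracked BEV positions is the same.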

CalmCar’s perception solution emphasizes safety, with automotive-grade computing at its core.

With a mission of creating safer roads for all users, Chinese startup CalmCar has built a multi-camera active surround-view perception system. CalmCar’s solution is based on the automotive-grade NVIDIA DRIVE Xavier and supports Level 2+ driving, valet parking, and mapping.

By developing comprehensive solutions on NVIDIA DRIVE, these startups are providing accurate and robust perception for AI-assisted and self-driving cars around the world.
