Chipmaker Nvidia is acquiring DeepMap, the high-definition mapping startup, the companies announced. Nvidia said the startup's mapping IP will help bolster its autonomous vehicle technology unit, Nvidia Drive. “The acquisition is an endorsement of DeepMap’s unique vision, technology and people,” said Ali Kani, vice president and general manager of Automotive at Nvidia, in a statement. “DeepMap is expected […]
Volvo plans to bake NVIDIA’s DRIVE chipsets for autonomous and smart cars into its next-generation platform, with the first fruit of that partnership to be revealed in 2022. The next-generation Volvo XC90 luxury SUV will use NVIDIA DRIVE Orin for its self-driving system, the chip-maker confirmed today at its GTC 2021 conference. NVIDIA and Volvo are no strangers, of course, …
NVIDIA has revealed its newest chipset for smart cars and autonomous vehicles, with DRIVE Atlan promising to be “an AI data center on wheels” with an exponential uptick in processing power. Targeting Level 4 and Level 5 self-driving vehicles – along with the next generation of highly-assisted cars – NVIDIA DRIVE Atlan is expected to begin sampling to automakers and AV …
Nvidia is positioning itself to be a major player in autonomous vehicle development. With a wide range of hardware and software offerings for automakers looking to expand their R&D efforts, the chipmaker has something for just about everyone. Today, it's announcing yet another partnership that will leverage its strengths. Nvidia announced on Tuesday that it will lend its Nvidia Drive platform to Volvo Trucks for use in the development of self-driving trucks. While other partnerships might focus on passenger vehicles, this one is all about enabling autonomy for other parts of the automotive sector, including logistics, construction, public transportation and even garbage collection. To better work together, Volvo Trucks and Nvidia will co-locate engineering teams in both Silicon Valley and Volvo's home in Gothenburg, Sweden.
Testing of autonomous autos is critical, since there will be no human behind the wheel. These vehicles have to recognize all manner of obstacles and conditions to operate safely. Testing of autonomous vehicles is done on public roads in some states, but much of the testing is done in the virtual world. NVIDIA has announced that its DRIVE Constellation autonomous vehicle simulation platform is now available. This is a virtual platform for testing fleets of virtual autonomous cars. The virtual world can simulate everything from routine driving to rare and dangerous situations.
At its 2019 GPU Technology Conference in San Jose, graphics processor manufacturer Nvidia announced Monday that it will be releasing version 9.0 of its Nvidia Drive platform for autonomous vehicles to developers and, perhaps more interestingly, that it is also working with German standards body TÜV SÜD to develop a sort of driver's license for self-driving cars. The Nvidia Drive AP2X release 9.0 -- as the new hardware and software platform is more specifically known -- includes a few new features for autonomous vehicle developers to play with. The software will be able to make decisions at complex intersections, work through congested traffic and automatically change into a faster-moving lane without endangering the vehicle's occupants or, presumably, oncoming traffic. In areas where there is no map data for the autonomous software to use for guidance -- perhaps a private driveway or a newly paved road -- Drive 9.0 can even generate its own map data, piecing together information from cameras, lidar and radar sensors to create what Nvidia calls a Local HD map that can be used for self-driving navigation after a few human-piloted trips. Most noteworthy is Safety Force Field, a new, open-source collision avoidance algorithm for autonomous vehicles that uses computational predictions to estimate where nearby cars, pedestrians and other obstructions will be in the immediate future, then acts to shield the autonomous vehicle from collision with a combination of emergency braking and intelligent steering around obstructions. Nvidia is making the Safety Force Field algorithm open-source for other developers to use when building their own autonomous platforms.
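To give a flavor of the predict-then-act idea behind a collision shield like Safety Force Field, here is a heavily simplified sketch in Python. It is not Nvidia's actual algorithm or API -- all names, physics and numbers are illustrative assumptions: actors are predicted forward under constant velocity, and the ego vehicle brakes if any prediction falls inside its stopping envelope.

```python
import math

# Illustrative sketch of a predict-then-act collision check.
# Not Nvidia's Safety Force Field implementation; names and
# constants here are invented for the example.

def predict_position(pos, vel, t):
    """Constant-velocity prediction of an actor's position at time t."""
    return (pos[0] + vel[0] * t, pos[1] + vel[1] * t)

def stopping_distance(speed, decel=8.0):
    """Distance covered while braking at a fixed deceleration (m/s^2)."""
    return speed * speed / (2.0 * decel)

def must_brake(ego_pos, ego_vel, actors, horizon=3.0, step=0.1, margin=2.0):
    """True if any actor's predicted path enters the ego vehicle's
    safety envelope (stopping distance plus a margin) within the
    prediction horizon."""
    ego_speed = math.hypot(*ego_vel)
    envelope = stopping_distance(ego_speed) + margin
    t = 0.0
    while t <= horizon:
        ego_future = predict_position(ego_pos, ego_vel, t)
        for pos, vel in actors:
            if math.dist(ego_future, predict_position(pos, vel, t)) < envelope:
                return True
        t += step
    return False

# A pedestrian crossing 20 m ahead while the ego car drives at 15 m/s:
print(must_brake((0, 0), (15, 0), [((20, -5), (0, 2))]))  # → True
```

The real system reasons over full braking and steering trajectories for every actor rather than point positions, but the core loop -- predict everyone forward, intervene before envelopes overlap -- is the same shape.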
Autonomous vehicle development is a time- and resource-intensive business, requiring dozens of test vehicles, thousands of hours of data collection and millions of miles of driving to hone the artificial brains of the cars of tomorrow. That's the problem Nvidia hopes to solve with the release of its Nvidia Drive Constellation testing platform for self-driving cars. The announcement came during the keynote address at Nvidia's 2019 GPU Technology Conference in San Jose on Monday. Drive Constellation is, basically, a simulation and validation platform that allows automakers and developers to test their autonomous vehicles and technologies in a virtual environment that lives on a specially designed cloud server. Constellation works by first generating an extremely realistic environmental simulation using Nvidia's RTX-powered servers, then feeding that simulation into the autonomous vehicle software or platform requiring evaluation. It's like plugging your self-driving car's brain into The Matrix, where it can be tested, trained and developed over and over again.
It comes months after Nvidia seeded Constellation to simulation partners (in September), and weeks after Uber open-sourced Autonomous Visualization System, the web-based platform for vehicle data used by its Advanced Technologies Group, or ATG (the division charged with developing its autonomous car platform). Constellation taps two different kinds of servers: the first (Constellation Simulator) powers Nvidia Drive Sim — a software platform that simulates a driverless car’s sensors — and the second (Constellation Vehicle) contains an Nvidia Drive AGX Pegasus computer (a pair of Xavier processors and GPUs rated at 320 trillion operations per second) and runs a complete autonomous vehicle software stack. Constellation processes simulated data from the Constellation Simulator as if it were recorded from the sensors of real-world cars driving on the road; commands from Drive Pegasus are fed back to the simulator, completing the digital feedback loop roughly 30 times each second. Constellation can generate photo-realistic data streams to create a vast range of testing environments, simulating a variety of weather conditions, such as rainstorms and snowstorms, along with different road surfaces and terrain. Moreover, it can mimic the effects of blinding glare at different times of the day and limited vision at night. And because it’s decentralized, developers can upload traffic scenarios, integrate their own vehicle and sensor models, and drive entire fleets of test vehicles “billions” of simulated miles.
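The two-server arrangement described above is a classic hardware-in-the-loop feedback cycle: one node synthesizes sensor data, the other runs the driving stack and sends actuation commands back. The Python sketch below mimics that loop in miniature; every class and method name is an illustrative assumption, not Nvidia's Constellation API, and the "sensors" are a single distance reading rather than photorealistic camera/lidar/radar frames.

```python
# Toy hardware-in-the-loop cycle: simulator node <-> driving-stack node.
# All names are hypothetical stand-ins for the two Constellation servers.

class SimulatorNode:
    """Stands in for the server that renders the virtual world."""
    def __init__(self):
        self.vehicle_x = 0.0  # ego position along a straight road (m)

    def render_sensors(self):
        # Real system: photorealistic camera/lidar/radar streams.
        return {"position": self.vehicle_x, "obstacle_at": 100.0}

    def apply_commands(self, cmd, dt):
        # Advance the simulated vehicle by the commanded speed.
        self.vehicle_x += cmd["speed"] * dt


class DrivingStackNode:
    """Stands in for the server running the AV software stack."""
    def decide(self, sensors):
        gap = sensors["obstacle_at"] - sensors["position"]
        # Slow down as the obstacle nears; stop about 5 m short of it.
        return {"speed": min(20.0, max(0.0, gap - 5.0))}


sim, stack = SimulatorNode(), DrivingStackNode()
for _ in range(300):                 # a 30 Hz loop, 10 simulated seconds
    frame = sim.render_sensors()     # simulator -> driving stack
    commands = stack.decide(frame)   # driving stack -> simulator
    sim.apply_commands(commands, dt=1.0 / 30.0)

print(round(sim.vehicle_x, 1))       # the car converges short of the obstacle
```

The value of closing the loop this way is that the driving stack under test cannot tell simulated frames from recorded ones, so rare scenarios (glare, snow, a stalled truck) can be replayed deterministically at scale.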
It is now easier for developers and manufacturers to view the raw data from sensors in real time. Santa Clara-based NVIDIA has unveiled NVIDIA DRIVE AutoPilot, a tool it claims is the world’s first commercially available Level 2+ automated driving system. Rob Csongor, VP of Autonomous Machines at NVIDIA, commented: “A full-featured, Level 2+ system requires significantly more computational horsepower and sophisticated software than what is on the road today. NVIDIA DRIVE AutoPilot provides these, making it possible for carmakers to quickly deploy advanced autonomous solutions by 2020 and to scale this solution to higher levels of autonomy faster.” Using the computational power of deep neural networks to analyse data from 360-degree cameras and other external sensors on the vehicle, NVIDIA DRIVE enables a vehicle to evade obstacles in its environment, navigate high-speed lane changes and initiate automatic emergency braking. It has all the components of an autonomous vehicle, but the driver is still very much required.
Nvidia’s automotive ambitions once seemed targeted solely on creating a platform to enable fully autonomous vehicles, notably the robotaxis that so many companies hope to deploy in the coming decade. It turns out that Nvidia has also been working on a more near-term product that opens it up to a different segment of the automotive industry. The company announced Monday at CES 2019 that it has launched Nvidia Drive AutoPilot, a reference platform that automakers can use to bring more sophisticated automated driving features into their production vehicles. The Drive AutoPilot system is meant to make the advanced driver assistance systems in today’s cars even better. It enables highway merging, lane changes, lane splits, pedestrian and cyclist detection, parking assist, and personal mapping, as well as in-cabin features like driver monitoring, AI copilot capabilities, and advanced in-cabin visualization of the vehicle’s computer vision system. It also allows for over-the-air software updates, a capability that automakers, with the exception of Tesla, have been slow to adopt.
That means the car can automatically handle steering, acceleration, and deceleration in its driving environment, as well as features like cruise control and lane centering. Nvidia introduced the system at CES 2019, the big tech trade show in Las Vegas this week. German automotive supplier Continental will develop Level 2+ systems on Nvidia Drive, with production starting in 2020. Others will be making announcements during the show as well. “It’s all based on AI, with many deep neural networks used for perception. It could be six, it could be four, it could be more, depending on the car maker.”
Amazon today announced an open source release of the Alexa Automotive Core (AAC) SDK, or Auto SDK, to help automakers integrate Alexa voice control into cars and their infotainment systems, screens often used for navigation, media, or climate control.The software development kit is free for download on GitHub and is optimized for bringing Alexa to in-car dashboards to accomplish tasks common with hands-free voice control — like playing music, providing turn-by-turn directions from native navigation systems, helping people find local businesses, and making phone calls.The Auto SDK will also be able to do things Alexa can do in a smart speaker, such as control smart home devices, check the weather, and launch Alexa skills.In recent years, carmakers like Ford and Toyota have brought Alexa to some of their popular vehicles, and Alexa skills have been introduced by Mercedes-Benz, Hyundai, General Motors, and others to let you do things like unlock your car door with your smart speaker, but the launch of the Alexa Auto SDK represents the first time Amazon has made a development kit especially for vehicles.The Auto SDK comes from the Alexa Auto team, which was first formed last year and aims to help Alexa compete with Siri in Apple’s CarPlay, Google Assistant in Android Auto, and SoundHound, whose Houndify platform is being deployed for voice control in Hyundai cars and the Nvidia Drive autonomous vehicle platform.The team has worked with carmakers like Ford, as well as companies like Anker, whose Roav Viva gadget plugs into a cigarette lighter and brings Alexa into cars without an infotainment console for $50.
Bosch and Daimler have been working together to roll out future autonomous cars and have announced that they will source a new AI platform in that effort. Both companies will source the Drive Pegasus platform’s AI processors and software from NVIDIA. The tech will be used to help autonomous and automated vehicles navigate the complex environment of congested city streets. The Drive Pegasus platform will be provided under a contract that includes processors and software for running the vehicle-driving algorithms Bosch and Daimler generate using machine learning methods. The ECU network will reach a computing capacity of hundreds of trillions of operations per second. The firms note that this is akin to the performance delivered by at least six synchronized, highly advanced desktop workstations.
Einride's T-pod all-electric, self-driving transport vehicle uses the NVIDIA DRIVE platform for its autonomous smarts.
One of the most exciting moments from the hit film Black Panther came to life today, as Nvidia’s Holodeck software enabled a driver using virtual reality to drive a car in the real world. Nvidia CEO Jensen Huang offered the demonstration at the 2018 GPU Technology Conference (GTC) as an aside while discussing autonomous car technologies, noting that he didn’t even have a name for what was being shown. “VR remote driving” would make sense. You can see our video of the amazing feat for yourself here. As a live video shows a Ford Fusion appearing to drive itself, a remote driver wearing an HTC Vive steers the car from the GTC stage. The car is represented virtually as a Lexus similar to the one in Black Panther.
There are too many situations the artificial intelligence system has to train for, and not enough testing time on the highways. Announced at the GTC event in San Jose, California, the technology is a fusion of two very different technologies enabled by Nvidia’s graphics processing units (GPUs): self-driving cars and VR. The cloud-based system for testing autonomous vehicles using photorealistic simulation can create a safer, more scalable method for bringing self-driving cars to the roads. Speaking at the opening keynote of GTC 2018, Nvidia CEO Jensen Huang announced that Constellation runs on two different kinds of servers. “We at Nvidia are dedicating ourselves to solve this problem,” Huang said on stage. The first server runs Nvidia Drive Sim software to simulate a self-driving vehicle’s sensors, such as cameras, lidar, and radar.
Self-driving cars are going to be processing a lot of data from sensors aimed at detecting pedestrians and other hazards. And to make sure that data gets routed to the central processing unit (CPU) on time, Aquantia is bringing its high-speed Ethernet networking into autonomous vehicles. San Jose, California-based Aquantia has teamed up with Nvidia to provide its 10Gbps Ethernet connectivity for the Nvidia Drive Xavier and Drive Pegasus supercomputers for autonomous vehicles. The Nvidia platforms with Aquantia Multi-Gig networking will be available to car partners in the first quarter of 2018. Chips for autonomous vehicles are expected to be a $30 billion market by 2030, according to analyst firm Raymond James. Aquantia introduced three new chips to bring high-speed networking into cars.
The chipmaker also released an SDK for building AI-enabled applications such as facial recognition. NVIDIA is proving its technology is indispensable in the AI self-driving car sector as the Consumer Electronics Show (CES) 2018 kicks off this week. In a new tie-up, Volkswagen and Uber will leverage the chipmaker’s graphics processors to step up a gear in driverless vehicle visual capabilities. Announcing the partnerships in Las Vegas on Sunday, Uber and Volkswagen said NVIDIA’s artificial intelligence-powered chips will create a cockpit-style experience and increase passenger safety. The NVIDIA DRIVE IX platform will power the I.D. NVIDIA’s AI chip will run so-called “co-pilot” applications, making smart judgement calls to improve convenience for human passengers.
Nvidia will power artificial intelligence technology built into Volkswagen’s future vehicles, including the new I.D. Buzz, the automaker’s all-electric, retro-inspired camper van concept. The partnership between the two companies will initially focus on so-called “Intelligent Co-Pilot” features, including using sensor data to make driving easier, safer and more convenient. The AI features will be based on the Nvidia Drive IX platform, and can enable things like face-recognition-based door unlocking, gesture input for cockpit controls, natural language speech recognition and even monitoring of a driver’s attentive or distracted state so safety alerts can bring them back to focus. Nvidia has been working with automakers on putting artificial intelligence features into vehicles for some time now, and revealed a number of partners on this front last year at CES, including Audi and Mercedes-Benz; Volkswagen is yet another automaker added to Nvidia’s growing list of partners. AI in vehicles is often thought of as a means to achieving full self-driving capability, but this is an example of what it might do in production cars in the nearer term, long before full Level 4 and 5 autonomy is available in the average consumer vehicle sold directly to individuals.
Nvidia announced at its CES 2018 keynote that it’s partnering with Baidu and automotive supplier ZF on bringing autonomous driving to Chinese roads. Nvidia CEO Jensen Huang announced that Nvidia’s Drive Xavier autonomous compute platform would be used for Baidu’s Apollo project, which aims to offer an open platform for self-driving cars in partnership with a wide variety of automakers, suppliers and tech companies. The partnership will give Nvidia key access to supplying the Chinese market with AV tech, letting it build a platform that can truly span the world and operate in what’s become the most important, largest and fastest-growing auto market in the world. ZF is a huge, crucial supplier in safety and ADAS tech, too, and it’s likely to help Nvidia grow the aspects of its business that bridge today’s safety features and eventual full self-driving later. Baidu has a large presence at CES this year, a first for the Chinese internet tech giant. It’s going to be showing off its latest AV prototypes at the show, which will include tech from Nvidia on board.