NVIDIA

www.nvidia.com

NVIDIA invented the highly parallel graphics processing unit—the GPU—in 1999. Since then, NVIDIA has set new standards in visual computing with interactive graphics on products ranging from smartphones and tablets to supercomputers and automobiles. Computer vision is extremely computationally intensive yet inherently parallel, making it an excellent fit for GPU acceleration, and NVIDIA GPUs have led the way in accelerating computer vision applications. In embedded vision, NVIDIA focuses on delivering solutions for consumer electronics, driver assistance systems, and national defense programs. NVIDIA’s expertise in visual computing, combined with the power of its parallel processors, delivers exceptional results.

Recent Content by Company

CUDA Refresher: The GPU Computing Ecosystem

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. This is the third post in the CUDA Refresher series, which has the goal of refreshing key concepts in CUDA, tools, and optimization for beginning or intermediate developers. Ease of programming and a giant leap in performance …

Mercedes-Benz and NVIDIA to Build Software-Defined Computing Architecture for Automated Driving Across Future Fleet

Auto and Computer Industry Leaders Intend to Join Forces and Enable Next-Generation Fleet with Software Upgradeability, AI and Autonomous Capabilities Tuesday, June 23, 2020 – Mercedes-Benz, one of the largest manufacturers of premium passenger cars, and NVIDIA, the global leader in accelerated computing, plan to enter into a cooperation to create a revolutionary in-vehicle computing …

World’s Top System Makers Unveil NVIDIA A100-Powered Servers to Accelerate AI, Data Science and Scientific Computing

Cisco, Dell Technologies, HPE, Inspur, Lenovo, Supermicro Announce Systems Coming This Summer Monday, June 22, 2020—ISC Digital—NVIDIA and the world’s leading server manufacturers today announced NVIDIA A100-powered systems in a variety of designs and configurations to tackle the most complex challenges in AI, data science and scientific computing. More than 50 A100-powered servers from leading …

Best AI Processor: NVIDIA Jetson Nano Wins 2020 Vision Product of the Year Award

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Compact yet powerful computer for AI at the edge recognized by the Edge AI and Vision Alliance. The small but mighty NVIDIA Jetson Nano has added yet another accolade to the company’s armory of awards. The Edge …

CUDA Refresher: Getting Started with CUDA

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. This is the second post in the CUDA Refresher series, which has the goal of refreshing key concepts in CUDA, tools, and optimization for beginning or intermediate developers. Advancements in science and business drive an insatiable demand …

CUDA Refresher: Reviewing the Origins of GPU Computing

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. This is the first post in the CUDA Refresher series, which has the goal of refreshing key concepts in CUDA, tools, and optimization for beginning or intermediate developers. Scientific discovery and business analytics drive an insatiable demand …

2020 Vision Product of the Year Award Winner Showcase: NVIDIA (AI Processors)

NVIDIA’s Jetson Nano is the 2020 Vision Product of the Year Award Winner in the AI Processors category. NVIDIA’s Jetson Nano delivers the power of modern AI in the smallest supercomputer for embedded and IoT. Jetson Nano is a small form factor, power-efficient, low-cost and production-ready System on Module (SOM) and Developer Kit that opens …

NVIDIA’s New Ampere Data Center GPU in Full Production

New NVIDIA A100 GPU Boosts AI Training and Inference up to 20x; NVIDIA’s First Elastic, Multi-Instance GPU Unifies Data Analytics, Training and Inference; Adopted by World’s Top Cloud Providers and Server Makers SANTA CLARA, Calif., May 14, 2020 (GLOBE NEWSWIRE) — NVIDIA today announced that the first GPU based on the NVIDIA® Ampere architecture, the …

NVIDIA EGX Edge AI Platform Brings Real-Time AI to Manufacturing, Retail, Telco, Healthcare and Other Industries

Ecosystem Expands with EGX A100 and EGX Jetson Xavier NX Supported by AI-Optimized, Cloud-Native, Secure Software to Power New Wave of 5G and Robotics Applications SANTA CLARA, Calif., May 14, 2020 (GLOBE NEWSWIRE) — NVIDIA today announced two powerful products for its EGX Edge AI platform — the EGX A100 for larger commercial off-the-shelf servers …

NVIDIA Releases Jetson Xavier NX Developer Kit with Cloud-Native Support

Cloud-Native Support Comes to Entire Jetson Platform Lineup, Making It Easier to Build, Deploy and Manage AI at the Edge Thursday, May 14, 2020—GTC 2020—NVIDIA today announced availability of the NVIDIA® Jetson Xavier™ NX developer kit with cloud-native support — and the extension of this support to the entire NVIDIA Jetson™ edge computing lineup for …

Training with Custom Pretrained Models Using the NVIDIA Transfer Learning Toolkit

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Supervised training of deep neural networks is now a common method of creating AI applications. To achieve accurate AI for your application, you generally need a very large dataset, especially if you create …

Speeding Up Deep Learning Inference Using TensorRT

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. This is an updated version of How to Speed Up Deep Learning Inference Using TensorRT. This version starts from a PyTorch model instead of the ONNX model, upgrades the sample application to use …

NVIDIA VRSS, a Zero-Effort Way to Improve Your VR Image Quality

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. The Virtual Reality (VR) industry is in the midst of a new hardware cycle – higher resolution headsets and better optics being the key focus points for the device manufacturers. Similarly on the software front, there has been …

Accelerating WinML and NVIDIA Tensor Cores

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Every year, clever researchers introduce ever more complex and interesting deep learning models to the world. There is of course a big difference between a model that works as a nice demo in …

Speeding Up Deep Learning Inference Using TensorFlow, ONNX, and TensorRT

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Starting with TensorRT 7.0, the Universal Framework Format (UFF) is being deprecated. In this post, you learn how to deploy TensorFlow trained deep learning models using the new TensorFlow-ONNX-TensorRT workflow. Figure 1 shows …

Learning to Rank with XGBoost and GPU

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. XGBoost is a widely used machine learning library, which uses gradient boosting techniques to incrementally build a better model during the training phase by combining multiple weak models. Weak models are generated by …
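Ranking quality in learning-to-rank setups like the one the post describes is commonly scored with NDCG (normalized discounted cumulative gain). As a rough, framework-free illustration — not XGBoost's internal implementation — it can be computed like this:

```python
import math

def dcg(relevances):
    """Discounted cumulative gain: relevant items ranked higher count more."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg(relevances):
    """Normalize DCG by the best achievable ordering, giving a score in [0, 1]."""
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# A ranker that places the most relevant document (rel=3) first scores 1.0;
# swapping it down the list lowers the score.
perfect = ndcg([3, 2, 1, 0])
swapped = ndcg([1, 2, 3, 0])
```

A pairwise ranking objective effectively trains the model to push scores toward orderings that maximize metrics like this.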

Laser Focused: How Multi-View LidarNet Presents Rich Perspective for Self-Driving Cars

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Deep neural network takes a two-stage approach to address lidar processing challenges. Editor’s note: This is the latest post in our NVIDIA DRIVE Labs series, which takes an engineering-focused look at individual autonomous vehicle challenges and how …

Building a Real-time Redaction App Using NVIDIA DeepStream, Part 2: Deployment

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. This post is the second in a series (Part 1) that addresses the challenges of training an accurate deep learning model using a large public dataset and deploying the model on the edge …

Building a Real-time Redaction App Using NVIDIA DeepStream, Part 1: Training

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Some of the biggest challenges in deploying an AI-based application are the accuracy of the model and being able to extract insights in real time. There’s a trade-off between accuracy and inference throughput. …

NVIDIA Introduces DRIVE AGX Orin — Advanced, Software-Defined Platform for Autonomous Machines

Tuesday, December 17, 2019 — NVIDIA today introduced NVIDIA DRIVE AGX Orin™, a highly advanced software-defined platform for autonomous vehicles and robots. The platform is powered by a new system-on-a-chip (SoC) called Orin, which consists of 17 billion transistors and is the result of four years of R&D investment. The Orin SoC integrates NVIDIA’s next-generation GPU …

Didi Chuxing Teams with NVIDIA for Autonomous Driving and Cloud Computing

Tuesday, December 17, 2019 — NVIDIA and Didi Chuxing (DiDi), the world’s leading mobile transportation platform, today announced that DiDi will leverage NVIDIA GPUs and AI technology to develop autonomous driving and cloud computing solutions. DiDi will use NVIDIA® GPUs in the data center for training machine learning algorithms and NVIDIA DRIVE™ for inference on …

NVIDIA Provides Transportation Industry Access to Its Deep Neural Networks for Autonomous Vehicles

Developers Also Gain Access to NVIDIA Advanced Learning Tools to Leverage DNNs Across Multiple Datasets While Preserving Data Privacy Tuesday, December 17, 2019 — NVIDIA today announced that it will provide the transportation industry with access to its NVIDIA DRIVE™ deep neural networks (DNNs) for autonomous vehicle development …

NVIDIA Provides U.S. Postal Service AI Technology to Improve Delivery Service

Advanced AI System to Process Package Data 10x Faster with Higher Accuracy Tuesday, November 5, 2019 — GTC DC — NVIDIA today announced that the United States Postal Service – the world’s largest postal service, with 485 million mail pieces processed and delivered daily – is adopting end-to-end AI technology from NVIDIA to improve its package …

Int4 Precision for AI Inference

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. If there’s one constant in AI and deep learning, it’s never-ending optimization to wring every possible bit of performance out of a given platform. Many inference applications benefit from reduced precision, whether it’s mixed precision for recurrent …
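The core idea behind INT4 inference is symmetric quantization: map floating-point values onto the 16 representable 4-bit integer levels [-8, 7] via a per-tensor scale. A minimal sketch of the concept — purely illustrative, not NVIDIA's kernel-level implementation, which fuses calibration and bit-packing — looks like this:

```python
def quantize_int4(values):
    """Symmetric per-tensor quantization to the INT4 range [-8, 7].

    Illustrative only: the scale maps the largest magnitude onto +/-7.
    """
    scale = max(abs(v) for v in values) / 7.0
    q = [max(-8, min(7, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floating-point values from the 4-bit codes."""
    return [x * scale for x in q]

# Quantize a small weight vector and inspect the round-trip error:
# values near the scale step (here -0.02) absorb the precision loss.
weights = [0.10, -0.40, 0.70, -0.02]
q, scale = quantize_int4(weights)
restored = dequantize(q, scale)
error = max(abs(w - r) for w, r in zip(weights, restored))
```

The appeal for inference is that each weight now fits in 4 bits, so memory traffic and multiply-accumulate cost drop sharply, at the price of the rounding error shown above.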

Allied Vision Joins NVIDIA Jetson Ecosystem

New Vimba Open Source Driver to Support Jetson TX2 Platform Stadtroda, Germany, November 21, 2019 – Camera manufacturer and image processing expert Allied Vision announced it has become an official member of the NVIDIA Jetson ecosystem and is giving developers on these powerful AI embedded computing systems access to Allied Vision camera modules for industrial …

NVIDIA Announces Jetson Xavier NX, World’s Smallest Supercomputer for AI at the Edge

Latest Addition to Jetson Product Family Brings Xavier Performance to Nano Form Factor for $399 Wednesday, November 6, 2019 – NVIDIA today introduced Jetson Xavier™ NX, the world’s smallest, most powerful AI supercomputer for robotic and embedded computing devices at the edge. With a compact form factor smaller than the size of a credit card, …

Automatic Defect Inspection Using the NVIDIA End-to-End Deep Learning Platform

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Quality requirements for manufacturers are increasing to meet customer demands. Manual inspection is usually required to guarantee product quality, but this requires significant cost and can result in production bottlenecks, lowered productivity, and reduced efficiency.  Defect inspection …

Is NVIDIA Moving Toward AI Chipset World Domination?

This market research report was originally published at Tractica's website. It is reprinted here with the permission of Tractica. NVIDIA is, by all accounts, the de facto standard in AI chipsets today. In addition to being a chipset manufacturer, NVIDIA offers extra value to its customers by creating derivative products and solutions based on its …

Rapid Prototyping on NVIDIA Jetson Platforms with MATLAB

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. This article discusses how an application developer can prototype and deploy deep learning algorithms on hardware like the NVIDIA Jetson Nano Developer Kit with MATLAB. In previous posts, we explored how you can design and train deep learning …

Case Study: ResNet50 with DALI

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Let’s imagine a situation. You buy a brand-new, cutting-edge, Volta-powered DGX-2 server. You’ve done your math right, expecting a 2x performance increase in ResNet50 training over the DGX-1 you had before. You plug it into your rack …

What Is Simultaneous Localization and Mapping?

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. SLAM is a commonly used method to help robots map areas and find their way. To get around, robots need a little help from maps, just like the rest of us. Just like humans, bots can’t always …
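The mapping half of SLAM is often posed as a Bayesian update over an occupancy grid: each cell's belief that it is occupied is refined as sensor readings arrive. A toy log-odds update — the generic textbook formulation, not any particular robot stack — can be sketched as:

```python
import math

def logodds(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def to_prob(l):
    """Convert log-odds back to a probability."""
    return 1 - 1 / (1 + math.exp(l))

def update_cell(grid, i, hit, p_hit=0.7, p_miss=0.4):
    """Fold one sensor observation into cell i.

    Log-odds turns the Bayesian product of evidence into a simple sum.
    p_hit/p_miss are hypothetical sensor-model parameters.
    """
    grid[i] += logodds(p_hit) if hit else logodds(p_miss)

# Each cell starts at 0 log-odds, i.e. probability 0.5 ("unknown").
grid = [0.0] * 5

# Three consecutive 'hit' readings on cell 2 drive its occupancy belief up.
for _ in range(3):
    update_cell(grid, 2, hit=True)
occupied = to_prob(grid[2])
unknown = to_prob(grid[0])
```

The localization half then estimates the robot's pose against this evolving map, which is what makes the two problems "simultaneous."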

NVIDIA Breaks Eight AI Performance Records

From 8 hours to 80 seconds: NVIDIA smashes AI training time. Only company to submit across all six categories. You can’t be first if you’re not fast. Inside the world’s top companies, teams of researchers and data scientists are creating ever more complex AI models, which need to be trained, fast. That’s why leadership in …

Top AI Chipset Companies Announced, Including NVIDIA, Intel, NXP, Apple, and Google, Based on CompassIntel.com Research

NVIDIA, Intel, NXP, Apple, and Google top the A-List in AI Chipset Index based on recently released research by Compass Intelligence. The 2019 A-List in AI (artificial intelligence) Chipset Index includes companies providing software and hardware components of AI chipsets. AI chipset products include central processing units, graphic processing units, neural network processors, application specific …

With Great Power Comes Great Gaming: NVIDIA Launches GeForce RTX SUPER Series

Best-In-Class Performance, Power Efficiency, Plus Real-Time Ray Tracing for Growing Wave of Blockbuster Games SANTA CLARA, Calif., July 02, 2019 – NVIDIA today supercharged its lineup of gaming GPUs by introducing GeForce® RTX 2060 SUPER™, GeForce RTX 2070 SUPER and GeForce RTX 2080 SUPER — delivering best-in-class gaming performance and real time ray tracing for …

NVIDIA Brings CUDA to Arm, Enabling New Path to Exascale Supercomputing

Global HPC Leaders Join to Support New Platform June 17, 2019 — International Supercomputing Conference — NVIDIA today announced its support for Arm CPUs, providing the high performance computing industry a new path to build extremely energy-efficient, AI-enabled exascale supercomputers. NVIDIA is making available to the Arm® ecosystem its full stack of AI and HPC software — …

NVIDIA Launches Edge Computing Platform to Bring Real-Time AI to Global Industries

Leading Computer Makers Adopt NVIDIA EGX Platform, Offering GPU Edge Servers for Instant AI on Real-Time Streaming Data in Telecom, Healthcare, Manufacturing, Retail, Transportation Monday, May 27, 2019 – Computex — NVIDIA today announced NVIDIA EGX, an accelerated computing platform that enables companies to perform low-latency AI at the edge — to perceive, understand and act in …

DRIVE Labs: How We’re Building Path Perception for Autonomous Vehicles

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Detailing the building blocks of autonomous driving, new NVIDIA DRIVE Labs video series provides an inside look at DRIVE software. Editor’s note: No one developer or company has yet succeeded in creating a fully autonomous vehicle. But …

Tesla Raises the Bar for Self-Driving Carmakers

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. In unveiling the specs of his new self-driving car computer at this week’s Tesla Autonomy Day investor event, Elon Musk made several things very clear to the world. First, Tesla is raising the bar for all other …

Machine Learning Acceleration in Vulkan with Cooperative Matrices

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Machine learning harnesses computing power to solve a variety of ‘hard’ problems that seemed impossible to program using traditional languages and techniques. Machine learning avoids the need for a programmer to explicitly program the steps in solving …

Pruning Models with NVIDIA Transfer Learning Toolkit

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. When using a deep learning model in production, it's important for the model to make accurate predictions. How efficiently those predictions are made also matters. Examples of efficiency measurements include electrical engineers measuring energy consumption to pick the best …
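A common technique behind model pruning is magnitude-based pruning: remove the weights that contribute least to the output. The following framework-free sketch illustrates the idea only — the threshold choice is hypothetical and this is not the Transfer Learning Toolkit's actual algorithm:

```python
def prune_by_magnitude(weights, fraction):
    """Zero out the smallest-magnitude weights.

    fraction: share of weights to remove (e.g. 0.5 prunes half).
    Returns the pruned weights and a binary keep-mask.
    """
    k = int(len(weights) * fraction)  # number of weights to drop
    # Rank weight indices by absolute value; the k smallest get pruned.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    dropped = set(order[:k])
    mask = [0 if i in dropped else 1 for i in range(len(weights))]
    pruned = [w * m for w, m in zip(weights, mask)]
    return pruned, mask

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.02]
pruned, mask = prune_by_magnitude(weights, 0.5)
```

In practice the pruned model is then fine-tuned for a few epochs to recover the small accuracy loss, which is the retrain step toolkits like this automate.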

How Does a Self-Driving Car See?

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. To drive better than humans, autonomous vehicles must first see better than humans. Building reliable vision capabilities for self-driving cars has been a major development hurdle. By combining a variety of sensors, however, developers have been able …

How AI Is Transforming Healthcare

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Healthcare is a multitrillion-dollar global industry, growing each year as average life expectancy rises — and with nearly unlimited facets and sub-specialties. For medical professionals, new technologies can change the way they work, enable more accurate diagnoses …

NVIDIA and American College of Radiology AI-LAB Team to Accelerate Adoption of AI in Diagnostic Radiology Across Thousands of Hospitals

NVIDIA Clara AI Toolkit Enables New ACR AI-LAB, Giving 38,000+ ACR Members and Other Radiology Professionals Ability to Develop and Deploy AI at Their Own Institutions BOSTON, April 08, 2019, World Medical Innovation Forum — NVIDIA and the American College of Radiology today announced a collaboration to enable thousands of radiologists nationwide to create and use …

Tensor Core Programming Using CUDA Fortran

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. The CUDA Fortran compiler from PGI now supports programming Tensor Cores with NVIDIA’s Volta V100 and Turing GPUs. This enables scientific programmers using Fortran to take advantage of FP16 matrix operations accelerated by Tensor Cores. Let’s take …

NVIDIA Expands its AI Edge Capabilities

This market research report was originally published at Tractica's website. It is reprinted here with the permission of Tractica. NVIDIA is not necessarily known for powering low power AI applications. Most of its artificial intelligence (AI) focus has been on powering high power cloud or data center applications, where its Tesla graphics processing unit (GPU) …

Speeding Up Semantic Segmentation Using MATLAB Container from NVIDIA NGC

This article was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Gone are the days of using a single GPU to train a deep learning model.  With computationally intensive algorithms such as semantic segmentation, a single GPU can take days to optimize a model. But multi-GPU hardware is expensive, …

ON Semiconductor Collaborates with NVIDIA on Cloud-based Autonomous Vehicle Simulation

ON Semiconductor’s Image Sensor model provides real-time inputs to NVIDIA DRIVE Constellation simulation platform SAN JOSE, CA – GTC – March 19, 2019 – ON Semiconductor (Nasdaq: ON), driving energy-efficient innovations, today announced that it is leveraging its sophisticated image sensor modeling technology to provide real-time data to the NVIDIA DRIVE Constellation™ simulation platform. The …

Stroke of Genius: GauGAN Turns Doodles into Stunning, Photorealistic Landscapes

March 18, 2019 – A novice painter might set brush to canvas aiming to create a stunning sunset landscape —  craggy, snow-covered peaks reflected in a glassy lake — only to end up with something that looks more like a multi-colored inkblot. But a deep learning model developed by NVIDIA Research can do just the …

NVIDIA Teams with Amazon Web Services to Bring AI to Millions of Connected Devices

AWS IoT Greengrass Enables NVIDIA Jetson to Seamlessly Deploy AI for Edge Devices SAN JOSE, Calif., March 18, 2019 GPU Technology Conference—NVIDIA today announced a collaboration with Amazon Web Services (AWS) IoT on NVIDIA® Jetson™ to enable customers to deploy AI and deep learning to millions of connected devices. This joint solution enables models to …

NVIDIA and Toyota Research Institute-Advanced Development Partner to Create Safer Autonomous Transportation

Collaboration to Accelerate Use of Autonomous Vehicles and AI Technologies Expands to New Testing and Validation System SAN JOSE, Calif., March 18, 2019 GPU Technology Conference — Toyota Research Institute-Advanced Development (TRI-AD) and NVIDIA today announced a new collaboration to develop, train and validate self-driving vehicles. The partnership builds on an ongoing relationship with Toyota to …

NVIDIA DRIVE Constellation Now Available — Virtual Proving Ground for Validating Autonomous Vehicles

Open, Scalable Simulation Platform Enables Large Virtual Fleets of Self-Driving Cars SAN JOSE, Calif., March 18, 2019 GPU Technology Conference — NVIDIA today announced the NVIDIA DRIVE Constellation™ autonomous vehicle simulation platform is now available. The cloud-based platform enables millions of miles to be driven in virtual worlds across a broad range of scenarios — from routine …

NVIDIA Introduces DRIVE AV Safety Force Field: Computational Defensive Driving Policy to Shield Autonomous Vehicles from Collisions

Mathematically Rigorous and Validated in Simulation, Safety Driving Decision Algorithms Protect Against Unpredictability of Real-World Traffic SAN JOSE, Calif., March 18, 2019 GPU Technology Conference—NVIDIA today bolstered its NVIDIA DRIVE™ AV autonomous vehicle software suite with a planning and control layer designed to enable a safe and comfortable driving experience. A primary component of this …

NVIDIA Announces DRIVE AP2X – World’s Most Complete Level 2+ Autonomous Vehicle Platform

March 18, 2019 – Today at the GPU Technology Conference, NVIDIA founder and CEO Jensen Huang announced NVIDIA DRIVE AP2X — a complete Level 2+ automated driving solution encompassing DRIVE AutoPilot software, DRIVE AGX and DRIVE validation tools. DRIVE AP2X incorporates DRIVE AV autonomous driving software and DRIVE IX intelligent cockpit experience. Each runs on …

NVIDIA and Microsoft Create Edge-to-Cloud Real-Time Streaming Video Analytics Solution

Make cities smarter by connecting NVIDIA DeepStream Edge AI and Microsoft Azure IoT. March 18, 2019 – It’s a huge challenge extracting actionable insights from the sea of data created by the world’s billions of cameras and sensors. Bringing all this raw data to the cloud is inefficient because of bandwidth, cost and latency limitations. …

Revving Robotics: NVIDIA Isaac SDK Brings Modern AI to Autonomous Machines

Robotics developer toolbox — Isaac apps, GEMs, Robot Engine and Sim — is moving into general availability March 18, 2019 – Robotics developers are off to the races creating autonomous machines of the future. We’re fueling their efforts with the NVIDIA Isaac SDK, which will be publicly available soon as a free robotics developer toolbox …

NVIDIA Announces Jetson Nano: $99 Tiny, Yet Mighty NVIDIA CUDA-X AI Computer That Runs All AI Models

SAN JOSE, Calif., March 18, 2019 GPU Technology Conference—NVIDIA today announced the Jetson Nano™, an AI computer that makes it possible to create millions of intelligent systems. The small but powerful CUDA-X™ AI computer delivers 472 GFLOPS of compute performance for running modern AI workloads and is highly power-efficient, consuming as little as 5 watts. …

NVIDIA Accelerates Robotic Development from Cloud to Edge with AWS RoboMaker

With support for AWS RoboMaker, NVIDIA platform allows robots to be developed and deployed faster. March 18, 2019 – The NVIDIA Jetson AI computer platform now supports Amazon Web Services AWS RoboMaker. Robotic simulation and development can now be easily done in the cloud and deployed across millions of robots and other autonomous machines powered …

Clara AI Lets Every Radiologist Teach Their Own AI

A system for radiologists to deliver AI-assisted annotation, adapt AI for their patients, and deploy it in the hospital. March 18, 2019 – AI is ready to unlock massive potential in hospital systems, especially in one of the areas where deep learning holds the most promise: medical imaging. That’s why NVIDIA introduced Clara AI, a …

Slide into the Latest Deep Learning Research in the NVIDIA AI Playground

March 18, 2019 – If you’ve ever wanted to dig into the latest in deep learning research, now’s your chance. NVIDIA has launched AI Playground, an online space where anyone can experience our research demos firsthand. “Research papers have new ideas in them and are really cool, but they’re directed at specialized audiences — we’re …

NVIDIA Expands NGC Software Hub with Tools for Data Scientists to Build Optimized Solutions Faster

NGC adds models, model scripts and industry solutions to a growing set of AI containers; New wave of NGC-Ready enterprise servers and support services fuel GPU-accelerated AI. March 18, 2019 – Whether advancing science, building self-driving cars or gathering business insight from mountains of data, data scientists, researchers and developers need powerful GPU compute. They …

NVIDIA Announces CUDA-X AI SDK for GPU-Accelerated Data Science

March 18, 2019 – Data scientists working in data analytics, machine learning and deep learning will get a massive speed boost with NVIDIA’s new CUDA-X AI libraries. Unlocking the flexibility of Tensor Core GPUs, CUDA-X accelerates: … data science from ingest of data, to ETL, to model training, to deployment. … machine learning algorithms for …
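
The ingest → ETL → train → deploy pipeline the teaser describes can be sketched end to end on the CPU. The NumPy snippet below is a minimal, illustrative sketch of those four stages with hypothetical data and column names; CUDA-X/RAPIDS provides GPU-accelerated equivalents of each step (e.g. cuDF for ingest/ETL, cuML for training), which is where the speedups come from.

```python
import io
import numpy as np

# 1. Ingest: load raw records (here an in-memory CSV; cuDF would read this on the GPU)
raw = io.StringIO("feature,target\n1.0,2.1\n2.0,3.9\n3.0,6.2\n4.0,7.8\n")
data = np.genfromtxt(raw, delimiter=",", skip_header=1)

# 2. ETL: split columns and standardize the feature
x, y = data[:, 0], data[:, 1]
x_std = (x - x.mean()) / x.std()

# 3. Train: ordinary least squares fit (cuML provides the GPU analogue)
A = np.stack([x_std, np.ones_like(x_std)], axis=1)
coef, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

# 4. Deploy: score a new, unseen sample with the fitted model
def predict(feature):
    return coef * (feature - x.mean()) / x.std() + intercept
```

The point of the GPU libraries is that each stage is data-parallel, so the same pipeline shape scales to datasets far larger than this toy example.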

NVIDIA CUDA-X AI Acceleration Libraries Speed Up Machine Learning in the Cloud by 20x; Available Now on Microsoft Azure

March 18, 2019 – Data scientists can now accelerate their machine learning projects by up to 20x using NVIDIA CUDA-X AI, NVIDIA’s data science acceleration libraries, on Microsoft Azure. With just a few clicks, businesses of all sizes can accelerate their data science, turning enormous amounts of data into their competitive advantage faster than ever …

PathPartner Showcases AI-Based Algorithms for Autonomous Machines on NVIDIA’s Jetson AGX Xavier Platform

PathPartner to Demonstrate its Smart-Vision Algorithms Suite at GTC 2019 in San Jose from March 17-21. Fremont, California – Mar 2, 2019 – PathPartner Technology, a product R&D and engineering specialist, announced today that it will demonstrate its AI-based smart-vision algorithms suite for autonomous machines at the GPU Technology Conference (GTC) 2019 in San Jose …

An Introduction to the NVIDIA Optical Flow SDK

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. NVIDIA’s Turing GPUs introduced new hardware functionality for computing optical flow between images with very high performance. The Optical Flow SDK 1.0 enables developers to tap into the new optical flow functionality. You can download the Optical Flow …

Make Sense of the Universe with Rapids.ai

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Classification of astronomical sources in the night sky is important for understanding the universe. It helps us understand the properties of what makes up celestial systems, from our solar system to the most distant galaxy and everything …

AI My Ride: Startup Revs Up Vehicle Videos to Spot Potholes

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Christoph Mertz wants to fix your bumpy ride. His solution — using AI to detect potholes — came as a lightbulb moment inspired by working on autonomous vehicle projects at Carnegie Mellon University. Mertz is now the …

Pop Star: At NVIDIA, Popcorn Delivery Robot Bears Kernel Innovation

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. At NVIDIA, the robots are being taught to weave their way through the workplace. It helps that, as Ankhit, an NVIDIA Linux systems administrator, discovered, they come bearing popcorn. Moments after placing an order, via Slack, a …

Using Nsight Compute or Nvprof to Show Mixed Precision Use in Deep Learning Models

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Mixed precision combines different numerical precisions in a computational method. The Volta and Turing generation of GPUs introduced Tensor Cores, which provide significant throughput speedups over single precision math pipelines. Deep learning networks can be trained with …
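
As a numerical illustration of why mixed precision pairs half-precision storage with single-precision accumulation (the pattern Tensor Cores implement in hardware, and whose use the profilers above can confirm), here is a small NumPy sketch; it is illustrative only, not NVIDIA code:

```python
import numpy as np

ones = np.ones(4096, dtype=np.float16)

# Accumulating purely in fp16 stalls: once the running sum reaches 2048,
# the fp16 spacing there is 2.0, so adding 1.0 rounds back to 2048 and
# every later addition is silently lost.
s16 = np.float16(0.0)
for v in ones:
    s16 = np.float16(s16 + v)

# Mixed precision: fp16 inputs, fp32 accumulator -- the sum stays exact.
s32 = np.float32(0.0)
for v in ones:
    s32 += np.float32(v)

print(int(s16), int(s32))  # prints "2048 4096"
```

Tensor Core matrix instructions apply the same recipe per tile (fp16 multiplies feeding an fp32 accumulator), which is why mixed-precision training can keep model accuracy while gaining throughput.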

Nvidia Is Moving Faster Than the Competition in the AI Chipset Industry

This market research report was originally published at Tractica's website. It is reprinted here with the permission of Tractica. Since 2015, more than 70 companies have entered the AI chipset market and more than 100 chip starts have been announced. All of them are trying to tackle the AI algorithm acceleration problem using different techniques. …

Seeing the Wood for the Trees: Using AI for Smarter Forest Management

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Nearly 19 million acres of forests are destroyed annually, equal to 27 football pitches a minute. Forests serve as homes for thousands of animals, and for many people they’re a source of food, water, clothing, medicine and …

A Man, a GAN, and a 1080 Ti: How Jason Antic Created ‘De-Oldify’

This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. You don’t need to be an academic or to work for a big company to get into AI. You can just be a guy with an NVIDIA GeForce 1080 Ti and a generative adversarial network. Jason …

NVIDIA Teams Up with Leading HD Mapping Companies to Deliver End-to-End Autopilot Systems for the World’s Major Markets

DRIVE Localization, an open, scalable platform, enables autonomous vehicles to localize themselves within centimeters on HD maps worldwide. NVIDIA is showcasing this week at CES its DRIVE Localization — a system that removes the barriers to a global, mass-market solution for cars to find themselves on maps. Leveraging the compute power of GPUs and the …

Mercedes-Benz, NVIDIA to Create New AI Architecture for Mercedes Vehicles

NVIDIA’s Jensen Huang, Mercedes’ Sajjad Khan unveil vision for software-defined AI cars integrating self-driving, intelligent cockpits. Mercedes-Benz announced today it has selected NVIDIA to help realize its vision for next-generation vehicles. Speaking to a packed crowd at the Mercedes-Benz booth on the first day of CES 2019, Mercedes-Benz Executive Vice President Sajjad Khan and NVIDIA …

NVIDIA Introduces DRIVE AutoPilot, World’s First Commercially Available Level 2+ Automated Driving System

Continental, ZF Announce L2+ Solutions Based on NVIDIA DRIVE for Production in 2020. Monday, January 7, 2019 — CES — NVIDIA today announced the world’s first commercially available Level 2+ automated driving system, NVIDIA DRIVE™ AutoPilot, which integrates multiple breakthrough AI technologies that will enable supervised self-driving vehicles to go into production by next year. At CES …

NVIDIA GeForce RTX 2060 Is Here: Next-Gen Gaming Takes Off

Priced at $349, RTX 2060 Delivers Amazing Graphics with Ray Tracing and AI to Tens of Millions of PC Gamers; Global Availability Jan. 15. Sunday, January 6, 2019 — CES — NVIDIA today announced the NVIDIA® GeForce® RTX 2060, putting revolutionary Turing architecture GPUs within the reach of tens of millions of PC gamers worldwide. The RTX 2060 …

How to Get Started with Deep Learning Frameworks

A guide to the top deep learning frameworks for AI research. This blog post was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. Think of a deep learning framework as a grocery store. Rather than laboring in their own backyard farms, most people shop at markets when they want …

Using Calibration to Translate Video Data to the Real World

This article was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. DeepStream SDK 3.0 is about seeing beyond pixels. DeepStream exists to make it easier for you to go from raw video data to metadata that can be analyzed for actionable insights. Calibration is a key step in this …

Using MATLAB and TensorRT on NVIDIA GPUs

This article was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA. As we design deep learning networks, how can we quickly prototype the complete algorithm—including pre- and postprocessing logic around deep neural networks (DNNs) —to get a sense of timing and performance on standalone GPUs? This question comes up …

Vision Processing Opportunities in Drones

UAVs (unmanned aerial vehicles), commonly known as drones, are a rapidly growing market and increasingly leverage embedded vision technology for digital video stabilization, autonomous navigation, and terrain analysis, among other functions. This article reviews drone market sizes and trends, and then discusses embedded vision technology applications in drones, such as image quality optimization, autonomous navigation, …

“NVIDIA VisionWorks, a Toolkit for Computer Vision,” a Presentation from NVIDIA

Elif Albuz, Technical Lead for the VisionWorks Toolkit at NVIDIA, presents the "NVIDIA VisionWorks, a Toolkit for Computer Vision" tutorial at the May 2016 Embedded Vision Summit. In this talk, Albuz introduces the NVIDIA VisionWorks toolkit, a software development package for computer vision and image processing. VisionWorks implements and extends the Khronos OpenVX standard, and …

“Computational Photography: Understanding and Expanding the Capabilities of Standard Cameras,” a Presentation from NVIDIA

Orazio Gallo, Senior Research Scientist at NVIDIA, presents the "Computational Photography: Understanding and Expanding the Capabilities of Standard Cameras" tutorial at the May 2016 Embedded Vision Summit. Today's digital cameras, even at the entry-level, produce pictures with quality comparable to that of high-end cameras of a decade ago. Image processing and computational photography algorithms play …

OpenCL Eases Development of Computer Vision Software for Heterogeneous Processors

OpenCL™, a maturing set of programming languages and APIs from the Khronos Group, enables software developers to efficiently harness the profusion of diverse processing resources in modern SoCs, in an abundance of applications including embedded vision. Computer scientists describe computer vision, the use of digital processing and intelligent algorithms to interpret meaning from still and …

“An Update on OpenVX and Other Vision-Related Standards,” A Presentation from Khronos

Elif Albuz, Manager of Vision Software at NVIDIA, delivers the presentation "Update on OpenVX and Other Khronos Standards" at the December 2014 Embedded Vision Alliance Member Meeting. Elif provides an update on the newly released OpenVX standard, and other vision-related standards in progress.

Accelerate Machine Learning with the cuDNN Deep Neural Network Library

This article was originally published at NVIDIA's developer blog. It is reprinted here with the permission of NVIDIA. By Larry Brown, Solution Architect, NVIDIA. Machine Learning (ML) has its origins in the field of Artificial Intelligence, which started out decades ago with the lofty goals of creating a computer that could do any work a …

It’s Tegra K1 Everywhere at Google I/O

This article was originally published at NVIDIA's blog. It is reprinted here with the permission of NVIDIA. You couldn’t get very far at Google I/O’s dazzling kickoff today without bumping into our new Tegra K1 mobile processor. The keynote showed off Google’s new Android L operating system’s gaming capabilities on a Tegra K1 reference device. …

Tegra K1-Powered Project Tango DevKit Opens Door to New Worlds Enabled by Computer Vision

This article was originally published at NVIDIA's blog. It is reprinted here with the permission of NVIDIA. Google’s new Project Tango Tablet Developers’ Kit puts powerful new capabilities in the hands of those ready to harness the promise of computer vision. Fast-forwarding Google’s Project Tango from experimental device to developer kit, the tablet incorporates cameras …

Embedded Vision: Enabling Smarter Mobile Apps and Devices

For decades, computer vision technology was found mainly in university laboratories and a few niche applications. Today, virtually every tablet and smartphone is capable of sophisticated vision functions such as hand gesture recognition, face recognition, gaze tracking, and object recognition. These capabilities are being used to enable new types of applications, user interfaces, and use …

Why the Future of Self-Driving Cars Depends on Visual Computing

This article was originally published at NVIDIA's blog. It is reprinted here with the permission of NVIDIA. Everybody hates driving through cross-town traffic. This week, Google said they’re doing something about it, announcing that they’ve shifted the focus of their Self-Driving Car Project from cruising down freeways to mastering city streets. The blog post, by …

Anything But Pedestrian: How GPU-Powered Brains Can Help Cars Keep People Safe

This article was originally published at NVIDIA's blog. It is reprinted here with the permission of NVIDIA. Today’s crowded urban centers are, more than ever, a minefield for drivers. It’s not just that there are more pedestrians on the streets; many of them are staring at or talking on their mobile devices as they …

What’s Machine Learning? Thanks to GPU Accelerators, You’re Already Soaking In It

This article was originally published at NVIDIA's blog. It is reprinted here with the permission of NVIDIA. Adobe, Baidu, Netflix, Yandex. Some of the biggest names in social media and cloud computing use NVIDIA CUDA-based GPU accelerators to provide seemingly magical search, intelligent image analysis and personalized movie recommendations, based on a technology called advanced …

How GPUs Help Computers Understand What They’re Seeing

This article was originally published at NVIDIA's blog. It is reprinted here with the permission of NVIDIA. Researchers have been able to advance computerized object recognition to once unfathomable levels, thanks to GPUs. Building on the work of neural network pioneers Kunihiko Fukushima and Yann LeCun – and more recent efforts by teams at the …

Introduction to OpenCV for Tegra

Do you want to write your own blazing fast, interactive mobile apps using computer vision technology? Apps that can make your camera smarter, find people's faces, understand their gestures, interpret scenes and augment them with graphics? The Tegra super chip and the OpenCV for Tegra library can help you to do just that! OpenCV for …

Real-Time Traffic Sign Recognition on Mobile Processors

There is a growing need for fast and power-efficient computer vision on embedded devices. This session will focus on computer vision capabilities on embedded platforms available to ADAS developers, covering the OpenCV CUDA implementation and the new computer vision standard, OpenVX. In addition, Itseez traffic sign detection will be showcased. The algorithm is capable of …

Getting Started With GPU-Accelerated Computer Vision Using OpenCV and CUDA

OpenCV is a library, free for both research and commercial use, that includes hundreds of optimized computer vision and image processing algorithms. NVIDIA and Itseez have optimized many OpenCV functions using CUDA on desktop machines equipped with NVIDIA GPUs. These functions are 5 to 100 times faster in wall-clock time compared to their CPU counterparts. Anatoly …
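
The kinds of functions that see those speedups are typically data-parallel: every output pixel depends only on a small neighborhood of inputs and can be computed independently, which maps directly onto thousands of GPU threads. As a rough illustration of the access pattern (plain NumPy, not OpenCV code), here is a 3x3 box blur:

```python
import numpy as np

def box_filter3(img):
    # 3x3 box blur built from nine shifted copies of the padded image.
    # Each output pixel is an independent average of its neighborhood --
    # exactly the per-pixel parallelism a CUDA kernel would exploit.
    padded = np.pad(img.astype(np.float32), 1, mode="edge")
    acc = np.zeros_like(img, dtype=np.float32)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            acc += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return acc / 9.0

img = np.arange(16, dtype=np.float32).reshape(4, 4)
out = box_filter3(img)
print(out[1, 1])  # mean of img[0:3, 0:3] -> 5.0
```

In OpenCV the equivalent GPU path additionally keeps the image resident in device memory between operations, which is where much of the practical speedup over repeated CPU calls comes from.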

Create ‘Machines That See’ Using Industry Resources

This article is an expanded and updated version of one originally published at Design News. It is reprinted here with the permission of Design News. This article explores the opportunity for incorporating visual intelligence in electronic products, introduces an industry alliance created to help engineers implement such capabilities, and describes an upcoming technical education forum …

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

1646 North California Blvd.,
Suite 360
Walnut Creek, CA 94596 USA

Phone
+1 (925) 954-1411