GPU

AI Hardware

CUDA Simply Explained – GPU vs CPU Parallel Computing for Beginners

In this tutorial, Mariya explains CUDA and how it helps us accelerate our programs. We also cover the difference between processors (CPUs) and graphics cards (GPUs), and why both can be used to run code.
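To make the CPU-vs-GPU idea concrete, here is a minimal conceptual sketch in plain Python (not real CUDA, and not from the tutorial): a CPU core walks the data one element at a time, while a GPU launches one lightweight thread per element, each identified by a thread index. The `kernel` function and thread-id loop below are illustrative stand-ins for a CUDA kernel launch.

```python
def cpu_add(a, b):
    # CPU style: a single core processes elements one after another.
    out = [0] * len(a)
    for i in range(len(a)):
        out[i] = a[i] + b[i]
    return out

def gpu_add(a, b):
    # GPU style (modeled): launch len(a) threads; thread `tid`
    # handles exactly one element, independently of the others.
    out = [0] * len(a)

    def kernel(tid):  # hypothetical per-thread kernel body
        out[tid] = a[tid] + b[tid]

    for tid in range(len(a)):  # in real CUDA these run in parallel
        kernel(tid)
    return out
```

Because each "thread" touches only its own output slot, the work is order-free, which is exactly the property that lets a GPU run thousands of such threads at once.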

Read More
Raspberry Pi

What Happens When You Plug the AMD Radeon RX 6700 XT into a Raspberry Pi

This guy bought an AMD Radeon RX 6700 XT and plugged it into a Raspberry Pi to see what would happen. Contents:
00:00 – How much did it cost?
00:50 – Unboxing
04:05 – Pi Problems
05:40 – Plugging it in
09:59 – No explosions, yay!
12:18 – Needed: a Driver
12:40 – Compiling the […]

Read More
Hardware

NVIDIA ISC21 Special Address

NVIDIA provides an in-depth overview of the latest news, innovations, and technologies in AI supercomputing from Marc Hamilton, VP of Solutions Architecture and Engineering.

Read More
AI Hardware

NVIDIA CEO Jensen Huang on the Industrial HPC Revolution

Industrial high-performance computing (HPC) is at a tipping point. The combination of simulation, accelerated computing, and #AI is bringing the performance and scale required to tackle real-world industrial problems. This presentation blew my mind.

Read More
Deep Learning

Convolutions in Deep Learning – Interactive Demo App

deeplizard explains the importance of convolutions in deep learning. In deep learning, convolution operations are the key components used in convolutional neural networks. A convolution operation maps an input to an output using a filter and a sliding window. Use the interactive demonstration below to gain a better understanding of this process. 
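As a companion to the demo, here is a small self-contained Python sketch of the sliding-window operation the blurb describes. Note that what deep learning frameworks call "convolution" is usually cross-correlation (the filter is not flipped), and that is what this illustrative `convolve2d_valid` helper computes; the function name and example values are ours, not from the video.

```python
def convolve2d_valid(image, kernel):
    """'Valid' 2D convolution as used in deep learning (cross-correlation,
    no kernel flip): slide the filter over the image, and at each position
    sum the element-wise products of the filter and the window it covers."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for i in range(ih - kh + 1):          # vertical slide
        row = []
        for j in range(iw - kw + 1):      # horizontal slide
            acc = 0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out
```

With a 3×3 input and a 2×2 filter, the window fits in 2×2 positions, so the output shrinks to 2×2, which is the "valid" padding behavior the interactive demo lets you explore.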

Read More
AI Hardware

GTC 2021 Keynote with NVIDIA CEO Jensen Huang

NVIDIA CEO Jensen Huang delivers the #GTC21 keynote, introducing a number of amazing breakthroughs.

Read More
AI Hardware

The AI Hardware Problem

The abundance of automation and tooling has made it relatively manageable to scale designs in complexity and performance as demand grew. However, the power consumed by AI and machine learning workloads cannot feasibly keep growing on existing processor architectures. Where do we go from here? New Mind explores.

Read More