What’s actually happening to a neural network as it learns and what does back-propagation have to do with it?
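Before diving into the videos, the core idea can be sketched in a few lines. The following is a minimal illustration (not taken from any of the talks below): a single sigmoid neuron learns the logical AND function by repeatedly running a forward pass, measuring its error, and nudging each weight against the error gradient computed via the chain rule. That nudging is backpropagation in miniature.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Training data: inputs and targets for logical AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w1, w2, b = 0.1, -0.1, 0.0  # arbitrary starting weights
lr = 0.5                    # learning rate

for epoch in range(5000):
    for (x1, x2), t in data:
        y = sigmoid(w1 * x1 + w2 * x2 + b)  # forward pass
        # Backward pass: gradient of the squared error (y - t)^2 / 2
        # with respect to each parameter, via the chain rule.
        dy = (y - t) * y * (1 - y)
        w1 -= lr * dy * x1
        w2 -= lr * dy * x2
        b  -= lr * dy

print(sigmoid(w1 * 1 + w2 * 1 + b))  # should end up close to 1
print(sigmoid(w1 * 0 + w2 * 1 + b))  # should end up close to 0
```

Real deep networks stack many such units and many layers, but the mechanism is the same: backpropagation applies the chain rule layer by layer to attribute the output error to every weight in the network.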
Deep learning and AI are fundamentally changing the way data is used in computation. They enable computing capabilities that will transform almost every industry and scientific domain, along with how the public uses data and compute.
The recent success of deep learning algorithms can be seen as the culmination of decades of progress in three areas: research in DL algorithms, broad availability of big data infrastructure, and the massive growth of computation power produced by Moore’s law and the advent of parallel compute architectures.
Deep learning has been employed successfully in such diverse areas as healthcare, transportation, industrial IoT, finance, entertainment, and retail, in addition to high-performance computing.
Examples shown in this video illustrate how the approach works and how it complements high-performance data analytics and traditional business intelligence.
Pieter Abbeel, a professor at UC Berkeley, discusses the use of deep learning in training robots to perform tasks.
Tom Dietterich delivers a talk at a two-day workshop organized by the National Academies of Sciences, Engineering, and Medicine on the capabilities and applications of artificial intelligence and machine learning for the intelligence community.
Computerphile has a great video explaining Generative Adversarial Networks, aka GANs.
Here’s an interesting talk on CNTK, the Microsoft Cognitive Toolkit, and Deep Learning on Azure.
Slides and code are on GitHub.
Siraj Raval covers the latest research on neural networks, including a potential replacement for Convolutional Neural Networks.
First play with this online demo of pix2pix, then watch the video below explaining the academic paper behind it.