Here’s an interesting talk from the Microsoft Research YouTube channel by Yujia Li about Gated Graph Sequence Neural Networks. Details about the presentation and a link to the paper are below the video.

Link to paper

From the description:

Graph-structured data appears frequently in domains including chemistry, natural language semantics, social networks, and knowledge bases. In this work, we study feature learning techniques for graph-structured inputs. Our starting point is previous work on Graph Neural Networks (Scarselli et al., 2009), which we modify to use gated recurrent units and modern optimization techniques and then extend to output sequences. The result is a flexible and broadly useful class of neural network models that has favorable inductive biases relative to purely sequence-based models (e.g., LSTMs) when the problem is graph-structured. We demonstrate the capabilities on some simple AI (bAbI) and graph algorithm learning tasks. We then show it achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
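
For intuition, here is a minimal sketch, not the authors' implementation, of the gated propagation step at the heart of the model: each node aggregates messages from its neighbors through the adjacency structure, then updates its state with GRU-style gates. The shapes, parameter names, and the toy random graph are illustrative assumptions.

```python
# Sketch of one gated graph propagation round (GRU-style node update).
# Parameter names and shapes are assumptions for illustration only.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(h, A, W_msg, W_z, U_z, W_r, U_r, W_h, U_h):
    """One propagation round.

    h: (num_nodes, dim) node states
    A: (num_nodes, num_nodes) adjacency matrix
    W_*, U_*: (dim, dim) parameter matrices (random here, learned in practice)
    """
    m = A @ h @ W_msg                            # aggregate neighbor messages
    z = sigmoid(m @ W_z + h @ U_z)               # update gate
    r = sigmoid(m @ W_r + h @ U_r)               # reset gate
    h_tilde = np.tanh(m @ W_h + (r * h) @ U_h)   # candidate state
    return (1 - z) * h + z * h_tilde             # gated blend of old and new

rng = np.random.default_rng(0)
n, d = 5, 8
h = rng.standard_normal((n, d))
A = (rng.random((n, n)) < 0.3).astype(float)     # toy random graph
params = [rng.standard_normal((d, d)) * 0.1 for _ in range(7)]
for _ in range(4):                               # a few propagation rounds
    h = ggnn_step(h, A, *params)
print(h.shape)  # (5, 8)
```

After a fixed number of rounds, the node states can be read out per node (e.g., for node selection) or pooled into a graph-level representation; the sequence extension in the paper repeats this readout step to emit outputs one at a time.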

This episode of Blocktalk provides a review of consensus algorithms used primarily for consortium-based deployments. These include the popular Proof of Authority, Proof of Work, and a variant of BFT.
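
For contrast with the authority-based schemes, here is a toy sketch of the proof-of-work idea: miners search for a nonce whose hash of the block data falls below a difficulty target. This is a minimal illustration only, not Ethereum's Ethash or anything from the episode's sample code.

```python
# Toy proof-of-work: find a nonce so that sha256(data + nonce) has
# `difficulty_bits` leading zero bits. Illustrative only.
import hashlib

def mine(block_data: bytes, difficulty_bits: int = 20) -> int:
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # proof found: hash is below the target
        nonce += 1

print("found nonce:", mine(b"example block header"))
```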

The core concepts of each algorithm are introduced, followed by a demonstration of using the popular Geth client to provision a PoA-based network and of how the consensus algorithm can be chosen at blockchain creation time, illustrating pluggable consensus. Additional details and sample code are available on GitHub: https://aka.ms/bt-consensus
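
As a sketch of what "chosen at blockchain creation time" means in practice with Geth, the snippet below generates a minimal genesis.json whose `clique` section selects the PoA engine before the chain exists. The chain ID, signer address, and values here are placeholders, not the episode's sample configuration.

```python
# Generate a minimal Clique (PoA) genesis file for Geth.
# Chain ID and signer address are hypothetical placeholders.
import json

SIGNER = "0x" + "11" * 20  # placeholder authorized signer address

genesis = {
    "config": {
        "chainId": 12345,      # placeholder private-network ID
        "clique": {            # presence of this block selects PoA consensus
            "period": 5,       # seconds between blocks
            "epoch": 30000,    # blocks per voting epoch
        },
    },
    "difficulty": "0x1",
    "gasLimit": "0x8000000",
    # Clique encodes the initial signer list in extraData:
    # 32 vanity bytes + concatenated signer addresses + 65 seal bytes.
    "extraData": "0x" + "00" * 32 + SIGNER[2:] + "00" * 65,
    "alloc": {},
}

with open("genesis.json", "w") as f:
    json.dump(genesis, f, indent=2)
# Initialize a node from it with: geth init --datadir ./node1 genesis.json
```

A PoW chain would instead omit the `clique` section (selecting Ethash by default), which is the pluggable-consensus point the episode demonstrates.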