GPT-2

AI Generative AI Natural Language Processing

OpenAI ChatGPT: The Future Is Here!

Two Minute Papers examines ChatGPT more closely.

Read More
AI Research

Is OpenAI’s AI As Smart As A University Student? 🤖

Two Minute Papers examines the following papers:
- Grade school math: https://openai.com/blog/grade-school-math/
- University-level math: https://arxiv.org/abs/2112.15594
- Olympiad: https://openai.com/blog/formal-math/

Read More
AI Natural Language Processing

AlphaCode Explained: AI Code Generation

AlphaCode is DeepMind’s new massive language model for generating code. It is similar to OpenAI Codex, except that the paper provides a bit more analysis. The field of NLP within AI and ML has exploded, with more papers appearing all the time. Hopefully this video can help you understand how AlphaCode works […]

Read More
AI Generative AI Natural Language Processing

OpenAI GPT-3 Is Good At Almost Everything!

OpenAI’s GPT-3 is quite the feat of AI engineering and now we have Two Minute Papers’ take on it.

Read More
AI Natural Language Processing

GPT-3: Language Models are Few-Shot Learners

How far can you go with ONLY language modeling? Can a large enough language model perform NLP tasks out of the box? OpenAI takes on these and other questions by training a transformer that is an order of magnitude larger than anything that has ever been built before, and the results are astounding. Yannic Kilcher […]

Read More
AI Natural Language Processing

Will Robot Writers Change the Internet?

Computers just got a lot better at mimicking human language. Researchers created computer programs that can write long passages of coherent, original text. Language models like GPT-2, Grover, and CTRL create text passages that seem to have been written by someone fluent in the language, but not in the truth. That AI field, Natural Language Processing (NLP), didn’t […]

Read More
Natural Language Processing

Natural Language Processing, GPT-2 and BERT

Christoph Henkelmann (DIVISIO) explains what sets Google’s natural language processing model BERT apart from other language models, how a custom version can be implemented, and what the so-called “ImageNet moment” is.

Read More
Generative AI Natural Language Processing

OpenAI’s GPT-2 Text Generator

Two Minute Papers explores OpenAI’s GPT-2. Check out this GPT-2 implementation too (thanks to Robert Miles for the link!) – write something, then tab, enter, tab, enter and so on: https://transformer.huggingface.co/doc/gpt2-large
OpenAI’s post: https://openai.com/blog/gpt-2-6-month-follow-up/
Tweet source: https://twitter.com/gdm3000/status/1151469462614368256

Read More
AI Generative AI

GPT-2: Why Didn’t They Release It?

Since hearing about OpenAI’s decision not to release GPT-2 due to it “being too dangerous,” I have been puzzled by their decision to release the research that went into creating it. Furthermore, the idea of an organization called “OpenAI” hiding its best work to date seemed off. To me, it smelled like a […]

Read More
AI Generative AI

A Closer Look at GPT-2

Computerphile’s Rob Miles takes a closer look at GPT-2, the AI deemed “too dangerous” by its creators to release. If you’ve not heard about it, it’s an AI that, given a bit of text to prime it, continues writing in a believable and coherent way.

Read More