The basic idea here starts with a question: How well can GPT-3, trained on the Feynman Lectures, answer physics questions? GPT-3 (Generative Pre-trained Transformer 3) is OpenAI’s new autoregressive language model for natural language processing. It has shown significant improvements on basic language tasks such as summarization, question answering, and reading comprehension. What makes GPT-3 so impressive is that it’s a generalized neural network with billions of parameters, and until now that kind of success had only been achieved by narrowly focused neural networks. Here are some applications from beta testers.
- Website layout - generates website layouts (actual code!) from a written description.
- Plots - generates plots from a prompt.
- Typesetting math - generates typeset math equations from a prompt.
- Population data - interprets data in a spreadsheet to fill in missing values.
- Legalese - converts legal documents into plain language.
- Nutrition labels - extracts ingredients from a picture of a nutrition label and determines whether they are healthy.
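To make the opening question concrete, here is a minimal sketch of how a physics question-answering prompt might be assembled before sending it to a completion endpoint. The prompt format, the `build_physics_prompt` helper, and the example Q/A pair are all my own hypothetical illustration, not part of any OpenAI library or the actual experiment.

```python
def build_physics_prompt(examples, question):
    """Assemble a few-shot prompt: worked Q/A pairs followed by the new question.

    `examples` is a list of (question, answer) tuples; the trailing bare "A:"
    invites the model to complete the answer to `question`.
    """
    parts = [f"Q: {q}\nA: {a}" for q, a in examples]
    parts.append(f"Q: {question}\nA:")
    return "\n\n".join(parts)

# Hypothetical worked example in the style of an introductory physics text.
examples = [
    ("What does Newton's second law state?",
     "The net force on a body equals its mass times its acceleration, F = ma."),
]

prompt = build_physics_prompt(examples, "Why is the sky blue?")
print(prompt)
```

The resulting string would then be passed as the `prompt` to a GPT-3 completion request; the model's continuation after the final "A:" is taken as its answer.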