GPT-3: 2020 text-generating language model
Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only transformer deep neural network, which replaces recurrence- and convolution-based architectures with a technique known as "attention". This attention mechanism allows the model to selectively focus on the segments of input text it predicts to be most relevant. GPT-3 uses a 2048-token context window, float16 (16-bit) precision, and a then-unprecedented 175 billion parameters, requiring 350 GB of storage since each parameter occupies 2 bytes, and it has demonstrated strong "zero-shot" and "few-shot" learning abilities on many tasks.
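The figures above can be checked, and the attention mechanism sketched, in a few lines of Python. This is a minimal illustration with NumPy, not OpenAI's implementation: the tiny token count and embedding size are invented for the example, while the 175-billion-parameter and 2-byte-per-parameter numbers come from the text.

```python
import numpy as np

# Storage estimate from the text: 175 billion parameters stored in
# float16 (2 bytes each) works out to 350 GB.
params = 175_000_000_000
bytes_total = params * 2              # 2 bytes per float16 parameter
print(bytes_total / 1e9, "GB")        # → 350.0 GB

def attention(Q, K, V):
    """Scaled dot-product attention: each query position computes a
    softmax-weighted mixture over all key/value positions, which is how
    the model 'focuses' on the most relevant input tokens."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over keys
    return weights @ V                                    # weighted sum of values

# Toy self-attention over 4 tokens with embedding dimension 8
# (GPT-3's real context is 2048 tokens with far larger dimensions).
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
out = attention(X, X, X)   # self-attention: Q, K, V from the same tokens
print(out.shape)           # → (4, 8)
```

The softmax weights for each token sum to 1, so the output for every position is a convex combination of the value vectors, with the largest weights on the positions the model judges most relevant.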

Opponents Highlight the Environmental Impact of Artificial Intelligence [Video]

VentureBeat reports that the CEO of OpenAI has asked for $7 trillion to develop a project aimed at dramatically..

Credit: Wibbitz Top Stories · Duration: 01:31 · Published
