
OpenAI’s Language Generator GPT-3 is Getting A Lot of Attention

by Nashwa Ahmed

AI Journalism

The artificial intelligence (AI) research company OpenAI, which is backed by names like Peter Thiel, Elon Musk, Reid Hoffman, Marc Benioff, and Sam Altman, has released GPT-3, the company’s third-generation language prediction model. The release of GPT-3 has been met with extreme hype from some early users, according to the newspaper El Confidencial.

GPT-3

GPT-3 is widely described as the most powerful language model created to date.

An evolution of the previous GPT-2 model, GPT-3 was released last year.

GPT-2 was itself extremely impressive: given an opening sentence, it could produce competent strings of text, the report said.

The report notes that GPT-3 has 175 billion parameters, up from GPT-2’s 1.5 billion, and the model has been shown generating short stories, songs, press releases, and technical manuals.

Not only can the technology create stories, it can do so in language that mimics specific writers, requiring only a title, an author’s name, and an initial word. GPT-3 is also capable of generating other kinds of text, such as guitar tabs and computer code.

Web developer Sharif Shameem was able to use GPT-3 to create web-page layouts, while famed programmer John Carmack, the CTO at Oculus VR and a major influence in computer graphics, said of the new technology: “The recent, almost accidental, discovery that GPT-3 can sort of write code does generate a slight shiver.”

Concerns About AI

Even OpenAI’s Sam Altman has tempered expectations, as quoted in the report: “The GPT-3 hype is way too much. It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot to still figure out.”
