Artificial intelligence research lab OpenAI last week released a writing tool that writes like a human.
OpenAI is the machine learning nonprofit co-founded by Elon Musk in 2015. It recently released its first commercial product, GPT-3, a tool built on OpenAI's third-generation natural language generation algorithm. First published in May, GPT-3 can study datasets of human-written text and compose entire sentences and paragraphs that follow those writing patterns, in formal or conversational English.
Language generation models are already widely used: Google's Smart Compose autocompletes sentences in Gmail, and the Associated Press uses similar tools to write sports stories. However, the compositions these tools create can be unnatural and awkward due to limitations in their language models.
Carnegie Mellon University's Language Technologies Institute Professor Carolyn Rose told Business Insider that while natural language generation systems have historically "lacked some nuance," GPT-3 seems different.
GPT-3 is an autoregressive language model trained with 175 billion parameters, more than 10 times as many as past language models. GPT-2 used only 1.5 billion parameters, while Microsoft's largest model uses 17 billion.
Microsoft claimed that its AI model can understand the nuances of language, including analyzing pages of text, moderating chat content, generating code, and recognizing a statement's intent. GPT-3, meanwhile, can be applied to all of these tasks without any gradient updates or fine-tuning.
In few-shot demonstrations conducted through text interaction with the model, GPT-3 achieves strong performance on many datasets, including question answering, translation, and cloze tasks. It also handles tasks requiring on-the-fly adaptation, such as using a novel word in a sentence, performing three-digit arithmetic, or unscrambling words.
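The few-shot setup described above works purely through text: the model is shown a handful of solved examples followed by a new query, and it completes the pattern with no gradient updates or fine-tuning. A minimal sketch of how such a prompt might be assembled, using the word-unscrambling task as an illustration (the exact prompt format here is an assumption, not OpenAI's):

```python
def build_few_shot_prompt(examples, query, task="Unscramble the word."):
    """Concatenate a task description, solved examples, and a new query
    into the plain-text prompt a few-shot interaction would send."""
    lines = [task]
    for scrambled, answer in examples:
        lines.append(f"scrambled: {scrambled} -> {answer}")
    lines.append(f"scrambled: {query} ->")  # the model completes from here
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    examples=[("lpael", "apple"), ("rgnoae", "orange")],
    query="nnaaab",
)
print(prompt)
```

The point is that the "training" for the task lives entirely in the prompt text; the model's weights never change.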
"We find that GPT-3 can generate samples of news articles which human evaluators have difficulty distinguishing from articles written by humans," OpenAI wrote in their research report.
Developers stunned by GPT-3's human-like compositions
OpenAI gave some private individuals access to the model last week through an application programming interface (API). Developers have used GPT-3 to design websites, write creative fiction, and even draft business memos, with results nearly indistinguishable from human work.
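Since access goes through an API, a developer's interaction boils down to sending a text prompt plus a few generation settings. A rough sketch of the kind of JSON payload such a request might carry; the parameter names and values here are assumptions based on public descriptions of the beta API, not details confirmed in this article:

```python
import json

def build_completion_request(prompt, max_tokens=64):
    """Assemble an illustrative JSON payload for a text-completion request.
    Field names are hypothetical, chosen to show the shape of the interaction."""
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,   # cap on the length of the generated text
        "temperature": 0.7,         # higher values yield more varied output
    }

payload = build_completion_request("Write a short business memo about quarterly goals.")
print(json.dumps(payload))
```

Everything task-specific again lives in the prompt string, which is why one endpoint can serve website copy, fiction, and memos alike.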
Sample works from GPT-3 have begun circulating on social media in the past few days, and people are amazed at how natural their tone sounds.
Allen Institute CEO Oren Etzioni said people see GPT-3 as a forerunner of major changes in natural language processing, one that builds on its predecessors across 30 years of AI research. "Whether it's significant or not is still an open question, but it's certainly impressive," the CEO of the Seattle-based nonprofit research lab added.
Meanwhile, developer Kevin Lacker said that GPT-3 still seems imperfect: the tool is "quite impressive in some areas" but "subhuman in others." Lacker administered a Turing test on the tool's sample outputs to check whether someone could mistake the AI's work for a human's.
OpenAI CEO Sam Altman noted on Twitter that GPT-3 may still need refinement, and he pushed back on the attention it is currently getting: "AI is going to change the world, but GPT-3 is just a very early glimpse." Still, it could eventually be used to create more advanced text-to-speech tools or voice assistants that are more pleasing to hear.