Since 2015, the Cambridge Dictionary has announced a Word of the Year, chosen by its team on the basis of data about words that were in popular usage during that year. For 2023, the Cambridge Dictionary picked the term "hallucinate". The word was not crowned with the title for no reason: with AI technology emerging so rapidly, "hallucination" earned its place as Word of the Year 2023 for a very particular reason. Why was the word chosen as the Cambridge Word of the Year 2023? And has a new definition been added to it?
We are ready to shed some light on the matter!
First things first, let us understand what the word "hallucinate" means as per the Cambridge Dictionary.
Well, before 2023, the Cambridge Dictionary defined the word "hallucinate" as: “To seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug.”
This is the general meaning of "hallucinate" that we use in everyday English: certain drugs or health conditions can make us seem to see, feel, smell, or hear things that do not exist. The word is a verb.
Until 2023, the word and its definition seemed ordinary, but emerging AI technology has led the Cambridge Dictionary to add a further meaning to the word.
Now, two basic questions come up.
Question 1: What is the new definition of the word "hallucinate" added to the Cambridge Dictionary?
Question 2: Why is the word "hallucinate" chosen as the Cambridge Dictionary Word of the Year 2023?
Here are both questions answered!
Let us begin with the first question.
The word "Hallucinate" and its new definition!
The Cambridge Dictionary has added an additional meaning to the word "hallucinate", one that relates to artificial intelligence systems capable of generating text that mimics human writing but may contain false information.
Generative AI tools such as Bard and ChatGPT rely on large language models (LLMs), which can sometimes produce false information. Accordingly, in 2023 the Cambridge Dictionary added the following to the definition: “When an artificial intelligence hallucinates, it produces false information.”
Why is the word "hallucinate" chosen as the Cambridge Word of the Year 2023?
The word "hallucinate" was selected as the Cambridge Word of the Year as Generative AI is a robust tool, but it is still far from perfect. It is a tool that humankind has not adapted to completely, and there is still a lot to learn. In such a situation, the tools hold great potential strength, but also hazardous weaknesses as well. The need of the hour is to learn how to interact with AI tools effectively and safely in order to avoid the creation and spread of misinformation.
According to a post on the Cambridge Dictionary website, the word was selected because its new meaning "gets to the heart of why people are talking about AI".
The post added that AI hallucinations are a reminder that humans still need to bring their critical thinking skills to these tools. "Large language models are only as reliable as the information their algorithms learn from," it noted, observing that human expertise is more important than ever for creating the up-to-date information on which LLMs can be trained.
Why does artificial intelligence "hallucinate"?
Generative artificial intelligence draws on past data to do its work. Given a prompt, such as an image, a short text, or a snippet of code, the AI produces a response based on the information it was trained on. Tools like Bard and ChatGPT use LLMs trained on enormous amounts of data; they are effective at imitating human writing because they have learned from millions of sources.
Mistakes can happen because an AI may learn from sources that are factually inaccurate, and the tools can also introduce inaccuracies of their own while processing data. In either case, there can be real-world consequences. In one incident, a US law firm used AI tools for legal research, which led to fictitious cases being cited in court; a judge fined the firm $5,000 for the error.
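To see why this happens, consider a minimal sketch in Python. It uses a toy bigram model (counting which word follows which in a tiny made-up corpus), not a real LLM, but it illustrates the same core mechanism: each next word is chosen from statistical patterns in the training data, and nothing in the process checks whether the resulting sentence is true.

```python
import random
from collections import defaultdict

# Toy training corpus; a real LLM learns from billions of sentences.
corpus = (
    "the court cited the case . "
    "the case was decided in 2001 . "
    "the court was decided in error . "
).split()

# Count which words follow which (a bigram model).
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length=8):
    """Pick each next word at random from the observed continuations."""
    word, output = start, [start]
    for _ in range(length):
        if word not in follows:
            break
        word = random.choice(follows[word])  # statistics only; no truth check
        output.append(word)
    return " ".join(output)

print(generate("the"))
# Can print "the case was decided in error ." -- a fluent sentence that
# appears nowhere in the corpus; the model stitched fragments together.
```

A real LLM is vastly more sophisticated than this sketch, but the underlying point stands: it predicts plausible continuations rather than verified facts, which is exactly how a confident-sounding "hallucination" can emerge.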
Some other AI-related entries in the Cambridge Dictionary in 2023
The year 2023 has seen many new entries in the Cambridge Dictionary, and a lot of them are related to AI. They include generative AI (GenAI), Generative Pre-trained Transformer (GPT), and Large Language Model (LLM).
What do the critics have to say?
Henry Shevlin, an AI ethicist at the University of Cambridge, remarked that it is "striking" that, instead of selecting a computer-specific word such as "bugs" or "glitches" to describe the errors made by LLMs, the dictionary team chose a "vivid psychological verb". He suggested this may be because it is convenient to "anthropomorphize these systems, treating them as if they had minds of their own”.
He further stated that this year is probably the "high watermark of worries" about AI hallucinations, as AI companies work to reduce the frequency of errors through human feedback, and as users learn where LLMs can and cannot be trusted.
The dictionary also offers two usage examples of "hallucinate" in its new AI-related sense.