Wikimedia has partnered with Meta, Facebook's parent company, to create an Artificial Intelligence (AI) model that will improve fact-checking on Wikipedia, according to TechCrunch. The project aims to ease the load on volunteer editors, who cannot realistically verify Wikipedia's enormous number of footnotes one by one.
As the website grows, with over 17,000 new articles added every month, many citations remain incomplete, missing, or inaccurate.
To resolve this issue, Meta developed an AI model that automatically scans these citations at scale to verify their accuracy. The AI can also suggest alternative citations for a poorly sourced passage.
The AI uses a Natural Language Understanding (NLU) transformer model that tries to capture the relationships between words and phrases within a sentence. The model's knowledge index is the Sphere database, which contains over 134 million web pages, and the model is designed to find a single source that verifies each claim.
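To make the idea concrete, here is a minimal sketch of how embedding-based claim verification can work, using the open sentence-transformers library. The model name, example passages, and scoring below are illustrative assumptions, not Meta's actual pipeline or the Sphere index:

```python
# A minimal sketch of ranking candidate sources against a claim by
# semantic similarity. The model, passages, and scoring here are
# illustrative assumptions, not Meta's system or the Sphere index.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

claim = ("Joe Hipp was the first Native American to compete for the "
         "WBA World Heavyweight title.")
candidate_passages = [
    "Joe Hipp, a member of the Blackfeet Nation, challenged for the "
    "WBA heavyweight championship in 1995.",
    "The Blackfoot Confederacy comprises four nations in Montana and Alberta.",
]

# Embed the claim and each candidate passage into the same vector space.
claim_emb = model.encode(claim, convert_to_tensor=True)
passage_embs = model.encode(candidate_passages, convert_to_tensor=True)

# Rank candidates by cosine similarity; the top hit is the best
# candidate citation for the claim.
scores = util.cos_sim(claim_emb, passage_embs)[0]
best = int(scores.argmax())
print(f"Best source (score {scores[best].item():.2f}): {candidate_passages[best]}")
```

A production system would, of course, search a web-scale index rather than a handful of passages, but the ranking principle is the same.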
To demonstrate the AI's capabilities, Meta shared an incomplete citation the model found on Wikipedia. Under the "Notable Blackfoot people" section, there was a mention of Joe Hipp, the first Native American to compete for the WBA World Heavyweight title. However, the linked website made no mention of Hipp or of boxing.
The AI searched the Sphere database and found a better citation in a 2015 article from the Great Falls Tribune.
A Scandal
In 2020, Wikimedia faced a scandal when it was revealed that a US teen had written 27,999 entries in a language they didn't speak, according to Engadget. The episode reminded many people that the online encyclopedia isn't always a reliable source of information, and it is part of what motivated Wikimedia to partner with Meta to prevent it from happening again.
AI for the Future
Meta has started to develop the building blocks of next-generation citation tools. Just last year, the company released an AI model that integrated information retrieval and verification, and it is training neural networks to learn more nuanced representations of language so they can pinpoint relevant source material in an internet-sized pool of data.
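As a hedged illustration of what pairing retrieval with verification can look like, the snippet below applies an off-the-shelf natural language inference (NLI) model to judge whether a retrieved passage actually supports a claim. The model choice and the flagging rule are assumptions made for the sketch, not the system Meta released:

```python
# A sketch of an entailment-style verification step, assuming the
# Hugging Face transformers library and a public NLI model; this is
# illustrative, not the model Meta released.
from transformers import pipeline

verifier = pipeline("text-classification", model="facebook/bart-large-mnli")

source_passage = ("Joe Hipp, a member of the Blackfeet Nation, challenged "
                  "for the WBA heavyweight championship in 1995.")
claim = "Joe Hipp competed for the WBA World Heavyweight title."

# The NLI model treats the source as the premise and the claim as the
# hypothesis, labeling the pair entailment, neutral, or contradiction.
result = verifier({"text": source_passage, "text_pair": claim})
top = result[0] if isinstance(result, list) else result
print(top)  # e.g. {'label': 'entailment', 'score': 0.97}

# A citation could be flagged for editor review when the source does
# not entail the claim.
if top["label"].lower() != "entailment":
    print("Flag citation for editor review.")
```

In a full pipeline, a retrieval step like the one sketched earlier would supply the candidate passages, and a verification step like this one would decide whether each citation holds up.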
Once they are ready, these AI models should improve the quality of knowledge on Wikipedia, preserving the accuracy of a resource that people all over the world rely on.
Meta also thinks the project might help the research community solve difficult problems in AI. Because the company's model is trained on realistic data at an unprecedented scale, it could lead to better results across many related tasks.
Ideally, the AI model being developed will eventually be able to process several media types, teaching technology to understand the world better.
This article is owned by TechTimes.
Written by April Fowell