<div class="article-block article-text" data-behavior="newsletter_promo dfp_article_rendering" data-dfp-adword="Advertisement" data-newsletterpromo_article-text="
Sign up for Scientific American’s free newsletters.
” data-newsletterpromo_article-image=”https://static.scientificamerican.com/sciam/cache/file/4641809D-B8F1-41A3-9E5A87C21ADB2FD8_source.png” data-newsletterpromo_article-button-text=”Sign Up” data-newsletterpromo_article-button-link=”https://www.scientificamerican.com/page/newsletter-sign-up/?origincode=2018_sciam_ArticlePromo_NewsletterSignUp” name=”articleBody” itemprop=”articleBody”>
Wikipedia lives and dies by its references, the links to sources that back up information in the online encyclopaedia. But sometimes, those references are flawed — pointing to broken websites, erroneous information or non-reputable sources.
A study published on 19 October in Nature Machine Intelligence suggests that artificial intelligence (AI) can help to clean up inaccurate or incomplete reference lists in Wikipedia entries, improving their quality and reliability.
Fabio Petroni at London-based company Samaya AI and his colleagues developed a neural-network-powered system called SIDE, which analyses whether Wikipedia references support the claims they’re associated with, and suggests better alternatives for those that don’t.
“It might seem ironic to use AI to help with citations, given how ChatGPT notoriously botches and hallucinates citations. But it’s important to remember that there’s a lot more to AI language models than chatbots,” says Noah Giansiracusa, who studies AI at Bentley University in Waltham, Massachusetts.
AI filter
SIDE is trained to recognize good references using existing featured Wikipedia articles, which are promoted on the site and receive a lot of attention from editors and moderators.
Its verification system then identifies claims within pages whose existing references are of poor quality. SIDE can also scan the Internet for reputable sources and rank candidate replacements for bad citations.
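The verify-then-rank step above can be sketched in miniature. This is an illustrative toy, not SIDE's actual method: the real system uses a trained neural verifier, whereas here simple word overlap between a claim and a candidate source stands in for the learned support score, and all function names and URLs are hypothetical.

```python
# Toy sketch of a verify-then-rank citation pipeline in the spirit of SIDE.
# NOTE: word overlap is a crude stand-in for SIDE's neural verification score;
# names and example URLs below are illustrative assumptions, not SIDE's API.

def support_score(claim: str, source_text: str) -> float:
    """Fraction of the claim's words that also appear in the source text."""
    claim_words = set(claim.lower().split())
    source_words = set(source_text.lower().split())
    if not claim_words:
        return 0.0
    return len(claim_words & source_words) / len(claim_words)

def rank_candidates(claim: str, candidates: dict[str, str]) -> list[tuple[str, float]]:
    """Rank candidate sources (URL -> text) by how well they support the claim."""
    scored = [(url, support_score(claim, text)) for url, text in candidates.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

claim = "The Eiffel Tower was completed in 1889"
candidates = {
    "https://example.org/tower-history": "The Eiffel Tower opened in 1889 after two years of construction",
    "https://example.org/paris-guide": "Paris is the capital of France",
}
ranking = rank_candidates(claim, candidates)
print(ranking[0][0])  # the better-supported source ranks first
```

A low top score would flag the claim as poorly supported, prompting a search for replacement references; this is the sense in which SIDE both checks existing citations and suggests alternatives.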
To put the system to the test, Petroni and his colleagues used SIDE to suggest references for featured Wikipedia articles that it had not seen before. In nearly 50% of cases, SIDE’s top choice for a reference was already cited in the article. For the others, it found alternative references.
When SIDE’s results were shown to a group of Wikipedia users, 21% preferred the citations found by the AI, 10% preferred the existing citations and 39% did not have a preference.
The tool could save time for editors and moderators checking the accuracy of Wikipedia entries, but only if it is deployed correctly, says Aleksandra Urman, a computational communication scientist at the University of Zurich, Switzerland. “The system could be useful in flagging those potentially-not-fitting citations,” she says. “But then again, the question really is what the Wikipedia community would find the most useful.”
Urman points out that the Wikipedia users who tested the SIDE system were twice as likely to prefer neither of the references as they were to prefer the AI-suggested ones. “This would mean that in these cases, they would still go and search for the relevant citation online,” she says.
This article is reproduced with permission and was first published on October 19, 2023.