The explosion of AI writing tools has hit Wikipedia with a wave of poor-quality content containing false information and fabricated quotes.
To protect its reputation, the volunteer community of the online encyclopedia has adopted a series of stricter measures, which Wikimedia Foundation Product Director Marshall Miller likened to the platform's "immune system".
One important change is a speedy-deletion rule for articles that are clearly AI-generated and have not been reviewed. Telltale signs include text addressed to the reader ("This is your Wikipedia article..."), incorrect or nonsensical citations, and references to sources that do not exist. The rule helps reduce the burden on editors.
In parallel, the WikiProject AI Cleanup initiative has compiled a list of traits common in AI-generated content, such as overuse of dashes and connective words like "moreover", promotional language like "special", and non-standard formatting such as curly quotation marks.
However, Wikipedia stresses that these traits are only supporting evidence, not the sole basis for deleting an article.
Beyond AI content, the speedy-deletion policy also applies to pages containing harassment, vandalism, or nonsense.
The Wikimedia Foundation, the organization that operates Wikipedia, does not set these rules directly, and its own initiatives sometimes clash with the community.
Last June, it paused an experiment that placed AI-generated summaries at the top of articles after facing community opposition.
Despite this caution, the Wikimedia Foundation does not dismiss AI's potential when used properly. AI is already used to detect damaging edits and to support translation.
The organization's new strategy is to help editors automate repetitive tasks.
In addition, Wikimedia is developing Edit Check, a tool that helps contributors follow article-writing guidelines, reminds them to add citations, and checks that the tone stays neutral.
In the near future, a Paste Check feature will be able to query the origin of long passages pasted into an article, helping to curb unreviewed AI content.
"AI is a double-edged sword. It can generate a large amount of low-quality content, but it can also become a useful supporting tool if we apply it properly," said Marshall Miller, Product Director of the Wikimedia Foundation.