The accuracy of information on Wikipedia is one of the most fundamental factors in its success. Previous research has shown that Wikipedia uses Google to check users' entries for accuracy. After some further research I also found that Wikipedia checks whether entries are written encyclopaedically. Articles that are not written encyclopaedically are 'flagged' and removed after two weeks if no one edits them. This is part of Wikipedia's conventions.
In my opinion, Wikipedia relies too heavily on the requirement that entries have leads on the Internet. So what happens if we do give them those leads: if we link inaccurate information on Wikipedia to information that is most definitely accurate? What other means are used in such situations to check accuracy, and on the basis of what arguments do entries eventually get deleted? Or does reliability (according to Wikipedia) increase simply because links are created between different (screened) pages and more information can be found on the Internet?
The first step in this research was to create two posts on Wikipedia containing accurate information, partially found on the Internet. Wikipedia immediately approved my first entry, but the second has to be improved within two weeks because it wasn't written encyclopaedically. So it was flagged not for inaccuracy, but for not following Wikipedia's writing conventions.
The second step will be to create crosslinks between the pages I've created, a page containing fictional information, existing Wikipedia pages, and an external webpage, to see what happens when a fictional article is linked to reliable information. Does its reliability increase?
I'm planning to continue this research, so I'll keep you up to date with my findings.
Wikipedia conventions: http://nl.wikipedia.org/wiki/Wikipedia:Conventies