The collaborative encyclopedia has created a uniform community standard for the whole world. The material will serve as a guide for volunteers.
The Wikimedia Foundation, which operates the collaborative encyclopedia Wikipedia, will launch its first global code of conduct on Tuesday (the 2nd), seeking to respond to criticism that it has failed to combat harassment and suffers from a lack of diversity.
“We need to be much more inclusive,” María Sefidari, chair of the foundation’s board of trustees, told Reuters. “We are losing a lot of voices, losing women, losing marginalized groups.”
Unlike large social networks that pay moderators, the online encyclopedia relies on voluntary collaboration to address user behavior problems.
Wikimedia says that more than 1,500 volunteers from five continents, working in 30 different languages, participated in creating the new rules after the foundation’s board decided last May that binding rules would be drawn up.
The new code of conduct prohibits harassment on and off the site, barring behavior such as hate speech, the use of slurs, stereotypes or attacks based on personal characteristics, as well as threats of physical violence and “persecution,” that is, following someone across different articles to criticize their work.
The new rules also prohibit the deliberate introduction of false or biased information into the content.
“There has been a process of change in all communities,” said Katherine Maher, executive director of the Wikimedia Foundation, in an interview with Reuters.
“It took time to build the support that was needed to carry out the consultations, so that people understood why this is a priority,” she said.
Maher said concerns among some users that the new rules would make the site more centralized are unfounded.
Wikipedia has 230,000 volunteer editors working on collaborative articles and more than 3,500 “administrators” who can take actions such as blocking accounts or restricting edits to certain pages.
In some cases, complaints are decided by user panels elected by the communities.
Wikimedia said the next phase of the project will be to work on enforcing the rules.
“A code of conduct without enforcement will not be useful,” said Sefidari. “We will figure that out with the communities.”
Maher said there would be training for the communities and task forces made up of interested users.
Wikimedia has no immediate plans to expand its small “trust and safety” team, a group of about a dozen employees who currently handle urgent matters such as death threats or the sharing of people’s private information, she said.