Here at NPR, we want to encourage discussion around all of our coverage - from the latest news on U.S. troops in Afghanistan to a review of the new Tom Hanks film. We know that discussion takes place organically on many different platforms (like Facebook, Twitter and Reddit). However, we also believe that it is important to host that discussion on our site as well.
So when we opened up NPR.org to comments in 2008, we set a high bar for ourselves and for you: "We hope the conversations will be smart and generous of spirit. We hope the adventure is exciting, fun, helpful and informative."
Since then we've kept our eyes on that bar, always looking for opportunities to make your experience better, and today, we're taking the next step in that process. We're excited to roll out the latest changes to our community that will make things faster, clearer, more relevant and more consistent for you. We're enlisting a digital system to review and categorize comments on NPR.org that will increase our quality control over each of our moderators and give us a more comprehensive view of the conversation on NPR.org.
Fostering a "Smart and Generous" Community Conversation
In the community we're helping to build on our site, we want to encourage great conversation - a place for high-quality, relevant discussion among the NPR community, our staff and even our critics on NPR.org.
Fulfilling our responsibility to create that healthy place for discussion includes agreeing on discussion rules with everyone who leaves a comment and also finding the best ways to moderate the conversation fairly. Over time, we've highlighted great comments floating amid hostile conversations and acknowledged the problem of trolls on our site (and asked people not to feed them). We've partnered with outside community managers to help moderate comments, widened our moderation to more individuals on the site, surveyed you for feedback and moved commenting systems.
We know that this moderation is important to keeping the conversation on track and in focus, and we know that with good moderation we can have great conversation about even a hot-button issue.
At the same time, we've acknowledged (ironically, in our comments section) that human moderators are human, and therefore may make different judgments than other humans, even when faced with the same objective rules, and may make mistakes too. We also know that quality control among our moderation team is a really high priority.
What's New: Computer-Assisted Moderation
So we are happy to announce that we have invited in our new robot overlords. We are making the change to what we are calling "computer-assisted moderation."
NPR has partnered with a company called Keepcon, which beginning today will review every comment submitted on the site. Keepcon's robots classify the language in our comments into categories (defined by our discussion rules) and flag comments that may violate NPR's established guidelines. Those categories (and some other technology on the back end) should help us increase the quality control of each of our moderators and see more about each of the comments our moderators delete.
With this system, we'll be able to publish more comments immediately, without the 15-minute delay of human moderation. The algorithmic review should take under 60 seconds, so moderation delays should shrink considerably. We also think this will lead to more consistency in moderation, and hopefully an even better conversation on NPR.org.
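For the technically curious, the workflow described above - publish right away, classify against rule-based categories, flag likely violations for a human moderator - can be sketched roughly like this. Everything here is hypothetical: the category names, trigger words and function names are invented for illustration, and the actual Keepcon system is far more sophisticated than keyword matching.

```python
# Hypothetical sketch of a "computer assisted moderation" flow.
# The categories and trigger words below are made-up stand-ins for
# discussion rules; they are not Keepcon's or NPR's actual logic.

# Invented rule categories mapped to toy trigger phrases.
TRIGGERS = {
    "personal_attack": ["idiot", "moron"],
    "spam": ["buy now", "click here"],
}

def classify(comment: str) -> set:
    """Return the set of rule categories a comment may violate."""
    text = comment.lower()
    return {
        category
        for category, words in TRIGGERS.items()
        if any(word in text for word in words)
    }

def moderate(comment: str) -> dict:
    """Publish immediately; flag any rule violations for human review."""
    flags = classify(comment)
    return {
        "published": True,                    # no 15-minute pre-moderation delay
        "flagged_for_review": bool(flags),    # a human decides on flagged items
        "categories": sorted(flags),
    }
```

The key design point the post describes is that classification happens *after* publication, so readers see comments right away while moderators review only the flagged subset.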
Let us know what you think!