If you haven't heard yet, Reddit CEO and co-founder Steve Huffman recently apologized for secretly editing contentious user comments on the site and announced new controls for the popular content platform, colloquially known as the "front page of the internet."
Huffman admitted to the tampering last week: in hostile comments on the r/The_Donald subreddit, he replaced mentions of his own Reddit handle with the handles of the subreddit's moderators. This week, the 33-year-old engineer issued an apology and laid out the new anti-abuse mechanisms Reddit will be deploying. In the wake of Huffman's revelation, the social news forum's trustworthiness has taken a hit with many of its users.
Reddit's New Rules
First, and likely foremost, Reddit employees will no longer be able to interfere with content beyond the standard powers that site admins possess. Admins have never been able to alter user content beyond their normal filtering powers; Huffman made his edits through his access as a site engineer, a pathway the update will disable. This change will be essential to rebuilding community trust.
In the wake of previous CEO Ellen Pao's resignation and the harassment controversy surrounding it, the company instituted new indecency policies and tightened its rules against abuse and harassment, but community moderators were still charged with enforcing them. Now, the company will enforce its rules from the top down more often.
Additionally, users can now filter subreddits of their choosing out of r/all, which effectively serves as Reddit's homepage. This is an unprecedented move that lets Redditors curate their own experience, so the shifts in community activity will be worth watching.
Restoring Their Street Cred
Despite these changes, Reddit's standing with its community has fallen low. The company's CEO engaged in the very class of behavior that forced his predecessor to step down. In his remarks, Huffman referenced a culture of disrespect he viewed as prevalent throughout the community, but the observation comes across as disingenuous given his own actions.
Tackling this perceived cultural problem, real or not, while also restoring community relations presents an uncommonly large burden. After last summer's implosion and the reunion of the co-founders, Reddit was expected to bounce back, but those hopes have been dashed.
Reddit's struggles paint a dismal picture of its future, but they make an excellent case study in why platforms need rules and standards early on. A community mechanism as open and freeform as Reddit's should have instituted strict rules much sooner and broadcast them loudly during account registration.
Another perennial example of poor rules and standards is Twitter. The microblogging platform has long been home to trolling and abuse severe enough to drive celebrities such as actress and comedian Leslie Jones off the site altogether.
While Reddit and Twitter offer relative anonymity to trolls and abusers, Facebook provides no such protections, yet instances of harassment are easy to find on the site. One need only look at a political post to witness the personal attacks.
Bringing the Community Up To Code
While these platforms should have been more responsive, hindsight is 20/20, and walking the line between protecting free speech and weeding out abuse is a difficult task. Today's social networks ought to learn from their elders' growing pains and assume the worst: that abuse will occur.
As a community grows and matures, carefully easing restrictions back may increase engagement without fostering the trolling seen on other networks. If a culture tolerant of harassment neither takes root nor is perceived to, the network should stay strong and continue generating value for its users unabated.
One newer network, the anonymous video platform Dusk, appears to have taken this lesson to heart: its users have an array of anti-abuse tools at their disposal. Trolls can be muted and blocked without their knowledge, spammers and abusers can be easily reported, and keywords can be preemptively blocked from video titles. A team of moderators also patrols the community for signs of trouble and manually removes any video content that strays too far into harmful or offensive territory.
What makes Dusk stand out is not the anonymity it grants users, but its immediate rollout of rules and standards for the community. While it's not perfectly fair to compare Dusk to Twitter or Reddit since it has their examples to learn from, the fledgling platform is taking the problem of abuse seriously before it even approaches a fever pitch.
One challenge of Dusk's approach is monetization, but the platform's effectiveness and value are still unproven either way, so a revenue model may yet emerge organically.
"We didn't really build this app thinking about monetizing, it was more of a social change experiment," said Kori Handy, the CEO of Design First Apps, which built and deployed Dusk.
Throughout all the struggles social networks and content platforms have faced, two lessons are painfully clear: platform businesses absolutely need rules and standards, and finding the right balance is difficult, but it must be attempted.