
How Reddit Converted Its Millions of Users Into a Content Moderation Army


Choosing what should and should not appear on the feeds of Reddit, the self-proclaimed front page of the internet, is one of the most difficult dilemmas the site faces.

When it comes to content moderation, an increasingly high-profile issue in recent years, Reddit takes a different strategy than other prominent social media platforms, according to the company.

Compared with other social media platforms such as Facebook, which outsources much of the labor to moderation farms, Reddit relies heavily on its communities (or subreddits) to self-police. Volunteer moderators are guided by a set of values established by Reddit itself, in addition to the rules set by each particular subreddit, and these are enforced by the community at large.

The company has come under fire for this approach, which some have perceived as too laissez-faire and lacking in accountability. But Chris Slowe, Reddit’s chief technology officer, says this is a complete mischaracterization.

“It may seem absurd to say this about the internet in this day and age, but humans, on the whole, are actually pretty good,” he told TechRadar Pro. “When you look at Reddit at scale, people are creative, people are humorous, people are collaborative, people are derpy – all of the characteristics that make civilization function.

“In essence, we want communities to have their own cultures, policies, and philosophical systems. For this model to succeed, we have to provide the tools and capabilities needed to deal with the [antisocial] minority.”

This Is a Separate Beast

Slowe was the very first Reddit employee, hired as an engineer in 2005 after renting out two spare rooms in his California home to co-founders Steve Huffman and Alexis Ohanian. The three met while taking part in the first round of Y Combinator, which left Slowe with happy memories but also a failing venture of his own and a lot of time on his hands.

Despite taking a hiatus from Reddit between 2010 and 2015, Slowe’s long history with the company gives him a unique perspective on its growth and on how the issues it faces have evolved.

During the early years, he explains, the challenge was all about scaling up infrastructure to keep pace with ever-increasing traffic. During his second tenure, from 2016 to the present, the emphasis has shifted to trust, security, and user safety.

“We give users ways to report content that violates site-wide standards or guidelines established by moderators, but not all problematic content gets reported. And, in certain circumstances, a report means it is already too late to intervene,” he added.

“When I returned to Reddit in 2016, one of my primary responsibilities was to figure out exactly how Reddit communities operate and define what constitutes a healthy community on the platform. Once we had identified the signs and symptoms of illness, we could proceed from there.”

Self-Policing

Content moderation on Reddit is layered more elaborately than on other social media platforms, and it is designed to hew as closely as possible to the company’s “community-first” ethos.

The most basic form of content vetting is carried out by users themselves, who can upvote material they like and downvote material they don’t. But while this mechanism boosts the visibility of popular posts and suppresses unpopular ones, popularity is not always a reliable indicator of propriety.
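That gap between popularity and propriety is easy to see in a toy sketch. This is not Reddit’s actual ranking code; the `Post` class and the example posts are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Net score: the simplest popularity signal.
        return self.upvotes - self.downvotes

def rank(posts: list) -> list:
    # Most popular first. Note that this ordering says nothing about
    # whether a post is appropriate -- only about whether it is liked.
    return sorted(posts, key=lambda p: p.score, reverse=True)

posts = [
    Post("cat photo", upvotes=120, downvotes=3),
    Post("rule-breaking rant", upvotes=90, downvotes=40),  # popular, but against the rules
    Post("helpful guide", upvotes=75, downvotes=2),
]
print([p.title for p in rank(posts)])
# → ['cat photo', 'helpful guide', 'rule-breaking rant']
```

The rule-breaking post still ranks and still spreads; votes alone cannot remove it, which is why the moderator and admin layers below exist.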

Community moderators serve as the second line of defense, with the authority to remove posts and ban users who violate either the subreddit’s guidelines or Reddit’s content policy. According to Slowe, the most prevalent subreddit rule is effectively “don’t be a jerk.”

According to the company’s annual Transparency Report, which details all of the content deleted from Reddit each year, moderators are responsible for almost two-thirds of all post removals on the site.

The Reddit admins, who are directly employed by the company, are on hand to catch any potentially harmful content that slips past the moderators. These employees not only conduct manual spot checks, but are also equipped with tools that help them identify problem users and police one-on-one encounters that take place in private.

As Slowe explained, “there are a lot of signals we utilize to surface concerns and determine whether individual users are trustworthy and have been acting in good faith. The tough part is that you’ll never be able to catch everything. That’s partly because it will always be a little grey and context-dependent.”
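Reddit does not disclose which signals it actually uses, but the general idea of blending several signals into a good-faith estimate can be sketched. Everything here — the input signals, the weights, and the function name — is a hypothetical illustration, not Reddit’s method:

```python
def good_faith_score(account_age_days: int, posts: int, posts_removed: int,
                     reports_filed: int, reports_upheld: int) -> float:
    """Combine a few hypothetical signals into a rough trust score in [0, 1]."""
    # Older accounts earn some benefit of the doubt (capped at one year).
    age = min(account_age_days / 365, 1.0)
    # Fraction of the user's posts that survived moderation.
    clean = 1.0 - posts_removed / posts if posts else 0.5
    # How often the user's own reports were upheld; accurate reporters are
    # acting in better faith than users who weaponize the report button.
    accurate = reports_upheld / reports_filed if reports_filed else 0.5
    return round(0.3 * age + 0.5 * clean + 0.2 * accurate, 3)

print(good_faith_score(730, posts=100, posts_removed=5,
                       reports_filed=10, reports_upheld=8))  # → 0.935
```

The hard part Slowe describes is exactly what a linear heuristic like this misses: context. A removed post may have been borderline, and a “clean” account may simply have never been reported.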

Asked how this problem could be addressed, Slowe noted that he is in a tough position, torn between a desire to honor the company’s community-first ethos and the awareness that new technologies on the horizon could help catch a larger share of abuse.

Reddit is already experimenting with advanced natural language processing (NLP) techniques to more accurately gauge the sentiment of exchanges between users, for example. Slowe also hinted at the prospect of using artificial intelligence to examine images posted to the site, and acknowledged that, as time goes on, a greater share of moderation operations will be performed without human intervention.
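As a rough illustration of the sentiment idea — not Reddit’s actual model, which is not public — a crude lexicon-based check over an exchange might look like this; the word lists and threshold are invented:

```python
# Tiny illustrative lexicons; a production system would use a trained model.
HOSTILE = {"idiot", "stupid", "hate", "pathetic"}
FRIENDLY = {"thanks", "great", "helpful", "appreciate"}

def exchange_sentiment(messages: list) -> int:
    # Each friendly word adds a point, each hostile word subtracts one.
    score = 0
    for msg in messages:
        words = set(msg.lower().split())
        score += len(words & FRIENDLY) - len(words & HOSTILE)
    return score

def flag_for_review(messages: list, threshold: int = -2) -> bool:
    # Surface clearly hostile exchanges to a human admin rather than
    # acting on them automatically.
    return exchange_sentiment(messages) <= threshold

print(flag_for_review(["you absolute idiot", "that was a stupid pathetic take"]))  # → True
print(flag_for_review(["thanks that was helpful", "great appreciate it"]))         # → False
```

Even this toy shows why Slowe hedges: sarcasm, quoting, and in-jokes would all fool a word counter, which is why the output here only flags an exchange for human review instead of removing it.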

Slowe cautioned, however, about the fallibility of these new technologies, which are prone to bias and almost certainly capable of error, as well as the problems they could pose for the Reddit model in the future.

“It’s a little frightening, to be honest,” he said. “If we’re talking about this as an enforcement approach, it’s the same as placing cameras literally everywhere and trusting in the great overmind of the computer to alert us when there’s been an incident.”

A technological panopticon could help to reduce the quantity of sleazy stuff that appears on Reddit, but doing so would ultimately require the platform to abandon its primary premise of putting community before content.

When the Going Gets Rough, Remember To Stay Positive

None of the major social media platforms can claim to have solved content moderation, as the controversies surrounding Donald Trump’s accounts and the removal of Parler from app stores made clear. Reddit was drawn into those same debates, and ultimately decided to remove the r/DonaldTrump subreddit from the site altogether.

However effective the community-first model may be, there is a real tension within Reddit’s approach. Although the company strives to give its communities near-total liberty, it is ultimately obliged to make editorial decisions about where the line should be drawn.

“I don’t want to be the arbiter, arbitrarily and capriciously deciding what content is OK and what is not,” Slowe told us. “At the same time, though, we need the ability to enforce a set of [rules]. The border between right and wrong is razor-thin.”

Reddit strives to keep its content policy as short and to the point as possible, to avoid loopholes and make enforcement easier, although updates are common. Revenge pornography, for example, was banned from the platform in 2015 under former CEO Ellen Pao, and a clause prohibiting the glorification of violence was added to the policy last year.

As Slowe noted, “Being true to our beliefs also entails iterating on our values and reassessing them as we face new methods to cheat the system and push the boundaries of what is acceptable.”

“When we make a change that entails shifting communities from one side of the line to the other, it is the culmination of a lengthy process that began with identifying gaps in our content policy and proceeding backward from there.”

While the vast majority of people will agree that the absence of revenge porn is an unqualified good, and that incitement to violence occurred on r/The_Donald, both examples demonstrate that Reddit must engage in moderation at the same level as Facebook, Twitter, or any other social media platform.

