Should social platforms take more action in policing the content users post?

Social media platforms are now part of many people’s lives, whether that means simply catching up on a Facebook newsfeed a couple of times a day or creating content on platforms such as TikTok. There is no doubt that people are more heavily involved than ever before.

With so many people now using their platforms, there is an ever-growing variety of content being posted – but should social media platforms be in charge of policing the content that is posted, or does freedom of speech mean that anything goes?

Policing Content on Social Media

There is no doubt that policing content on any social media platform is going to be a big job, but is it something that social media platforms should be more involved in? Across any of the platforms, there are easy ways for users to report accounts and/or content that they think is questionable or inappropriate.

However, most people will tell you that this rarely results in the content being removed, and even more rarely in users being banned from the platform – but is that the right way to go?

What Content Should Be Allowed?

Video streaming website Twitch has recently hit the headlines for the amount of adult content hosted and shared on the platform. Much of the debate centres on what counts as adult content and what is simply a bit provocative.

Currently, you have to be 13 to hold a Twitch account (pretty standard across most social media platforms), but the date of birth someone enters isn’t verified, so there will certainly be underage children using the site. With such a varied audience, is it right that girls in skimpy swimming costumes jump around in hot tubs, or post videos of themselves breathing heavily under the ASMR tag? Is any of that different from what a child might see on the beach or watch on daytime TV?

When it comes to what content should and shouldn’t be allowed, there will always be mixed opinions on what is and isn’t okay to show online. For example, there are strict rules on what gambling adverts can be shown on TV and published in national newspapers – should the same rules apply to the likes of TikTok, or does social media allow for things to be a little more lenient?

Where Do You Draw The Line?

The biggest question really is where you draw the line between what is and isn’t allowed. Advertising platforms have wrestled with this problem for many years, and social media is just the latest avenue to be scrutinised in this way.

One of the biggest issues people have with questionable content is how it can be checked for legitimacy, and the fact that anyone can post without being verified. Gambling streams are not a new phenomenon, but there has been recent criticism of the sponsorships and ties certain operators have with streamers. Many have pointed out the lack of scrutiny of sponsored sites, with some operating illegally or without a licence.

It’s crucial to check that your chosen operator is reputable and licensed, like those found on high-rated casino sites in the UK. So, perhaps if the likes of TikTok or Twitch agreed that this type of content would be more closely monitored to ensure it is a genuine offering, people would be much happier to see it on their newsfeeds and For You Pages.

Monitoring This Content

There must be hundreds of thousands of pieces of content posted every single hour across all social media platforms, so how exactly do you monitor it all? The truth is that it would take massive moderation teams at each platform – a cost the platforms won’t be keen to take on with no return from it.

One option many people are keen to see happen is requiring photo ID before someone joins the likes of Twitch or TikTok, so their account can be verified. Although this is unlikely to stop all questionable content, it does mean people are much more likely to be held accountable for what they post, so they’re likely to think twice about what they share.

This has been put forward as a strong contender for stopping trolling on social media and encouraging people to be much kinder online than they are now. However, it is recognised that not everyone has access to, or can afford, photo ID, and as such some people would be pushed out of using social media platforms, which would be largely unfair.

What Would Happen If They Allowed All Content?

There is an argument for allowing any content that isn’t illegal to be posted on social media platforms, with users taking responsibility for what they see on their newsfeeds – simply blocking the accounts they don’t wish to see. If this approach were brought in, it is likely that only people over the age of 18 would be allowed a social media account – that way there is much less risk of someone underage seeing content that isn’t appropriate for them.

There is a strong argument for allowing gambling content on social media, as those under 18 can’t create betting accounts online anyway. As long as only genuine casino sites were allowed to post, their strict sign-up criteria would mean children couldn’t register.

The truth is that this content can be viewed online anyway – so by removing it from everyday viewing, are we just making it a taboo and something children will be more interested in? Gambling can be a fun pastime when approached responsibly – so perhaps social media platforms should be showing that gambling, casinos and even adult content can be created and used in a way that is safe and secure, rather than hiding it away and pretending it doesn’t exist?
