Facebook should be regulated, according to UK MPs

A committee investigating disinformation and fake news has concluded that tech giants such as Facebook should be held accountable for harmful content posted by their users.

The Digital, Culture, Media and Sport (DCMS) Committee, one of the Commons Select Committees in the UK, has spent over a year looking at how fake news and disinformation are spread and the effects they can have. According to its report, there is now an urgent need for the UK government to respond to the committee's recommendations.

The report also noted that "Mark Zuckerberg has shown contempt" for the process by choosing not to appear before the committee, and Facebook was accused of deliberately frustrating its investigation.

What is 'fake news'?

The term 'fake news' seems to have taken on a life of its own, and no longer necessarily follows a literal definition. For example, the term is often used to describe news that the reader simply does not agree with. Instead of talking about 'fake news', the report uses the terms 'misinformation' and 'disinformation'. 'Disinformation' is defined as "the deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain", while 'misinformation' is defined as "the inadvertent sharing of false information".

Disinformation has been used as a tool to manipulate users' opinions on a number of issues, and at its worst it has been shown to undermine democracy as a whole.

What action has been recommended?

The committee concluded that a new category of tech company should be established with clear legal responsibilities, and that these companies should be independently regulated. In addition, the committee recommended a compulsory code of ethics setting out what is and is not acceptable on social media, to be enforced by the regulatory body. Tech companies would also be required to have systems in place to identify and remove harmful content, with large fines issued for non-compliance.

The report concludes, "Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites. We repeat the recommendation from our Interim Report that a new category of tech company is formulated, which tightens tech companies’ liabilities, and which is not necessarily either a ‘platform’ or a ‘publisher’. This approach would see the tech companies assume legal liability for content identified as harmful after it has been posted by users."

"We look forward to the Government’s Online Harms White Paper, issued by both the Department for Digital, Culture, Media and Sport and the Home Office, which we understand will be published in early 2019, and will tackle the issues of online harms, including disinformation".