The Christchurch massacre is providing further impetus to a move that has been underway for a while to dramatically increase social media companies’ obligations for the content they host. In Australia, the federal government has proposed a taskforce to look at how quickly and effectively social media companies deal with violent images.
To be clear, this push goes far beyond just preventing terrorists from having a platform to spread their ideas. ‘Regulate the internet’ is pushed as the solution to the radicalisation of disaffected youth, the dissemination of hate speech online, sexist and derogatory abuse towards women on social media, even the spread of ‘fake news’.
However, making Facebook, Google and Twitter morally (or even legally) culpable on the basis that they failed to prevent the bad behaviour of a minority of users is wrong.
For a start, it lets the actual perpetrators of these heinous acts — and those who cheer on their efforts — off the hook.
It’s not Facebook’s fault that the Christchurch killer chose to livestream his appalling massacre. It’s not Twitter’s fault that people tweet death threats to any woman who has the temerity to appear on a public affairs television show.
Social media allows people to interact with one another. As in all places where people have congregated throughout history, some of those interactions are positive, some are negative, and some are criminal.
The harmful acts are solely the responsibility of the perpetrators. There can be no moral equivalence between creating a space that can be misused to commit violence, or not policing this space quickly enough, and committing violence.
We can expect social media platforms, like phone carriers, to work with police to bring criminals to justice. At a minimum, they owe it to their users to enforce their terms of service and ban people who are misusing their platform.
However, if we want the digital public square to be a place of freedom and global connectivity — and there is no question that we should want this — we cannot expect platforms to pre-vet all content before publication.
We cannot expect them to prevent users from sharing objectionable content at all. To do so would undermine their primary purpose, probably fatally.
This is an edited extract of an opinion piece published in the Australian Financial Review.