After the UK Prime Minister Theresa May secured a joint statement from the G7 on Friday, backing a call for social media firms to do more to combat online extremism, a Conservative minister has suggested the party is open to bringing in financial penalties or otherwise changing the law in order to encourage more action on problem content from tech companies if it's returned to government at the UK general election on June 8.
The Guardian reports the comments by security minister Ben Wallace, speaking to BBC Radio 4 on Sunday. Wallace's words follow an exposé by the newspaper of Facebook's moderation guidelines, which the minister dubbed "totally unacceptable", citing an example of Facebook's moderator guidance saying it's OK to publish abuse of under-seven-year-old children from bullying as long as it doesn't have captions alongside. Facebook's rules have also been criticized by child safety charities.
The company declined to comment for this story. But Facebook has previously said it intends to make it simpler for users to report content problems, and will speed up the process for its reviewers to determine which posts violate its standards (although it has not specified how it will do this). It has also said it will make it easier for moderators to contact law enforcement if someone needs help.
Beyond bullying and child safety issues, concern about social media platforms being used to spread hate speech and extremist propaganda has also been rising up the agenda in Europe. Earlier this year the German cabinet backed proposals to fine social media platforms up to €50 million if they fail to promptly remove illegal hate speech: within 24 hours after a complaint has been made for obviously criminal content, and within seven days for other illegal content. It appears a Conservative-majority UK government would also be looking seriously at applying financial penalties to try to enforce content moderation standards on social media.
Wallace's comments also follow a UK parliamentary committee report, published earlier this month, which criticized social media giants Facebook, YouTube and Twitter for taking a laissez-faire approach to moderating hate speech content. The committee also suggested the government should consider imposing fines for content moderation failures, and called for a review of existing legislation to ensure clarity about how it applies.
After chairing a counterterrorism session at the G7 on Friday, which included discussion about the role of social media in spreading extremist content, the UK's PM May said: "We agreed a range of steps the G7 could take to strengthen its work with tech companies on this vital agenda. We want companies to develop tools to identify and remove harmful materials automatically."
It's unclear exactly what those steps will be, but the possibility of fines to enforce more control over platform giants is at least now on the table for some G7 nations.
For their part, tech firms have said they are already using and developing tools to try to automate flagging up problem content, including seeking to leverage AI. Although given the scale and complexity of the content challenge here, there will clearly not be a quick tech fix for post-publication moderation in any near-term timeframe.
Earlier this month Facebook also said it was adding a further 3,000 staff to its content reviewer team, bringing the total number of moderators it employs globally to review content being posted by its almost two billion users to 7,500.