Feds eye new powers to block platforms that fail to combat harmful content online
|CTV News, 29 Jul 2021 at 14:37|
OTTAWA -- The federal government is proposing the creation of new powers to block online platforms that repeatedly refuse to take down harmful content, and is looking at involving CSIS when it comes to combating online threats to national security and child exploitation content.
In launching a new proposal for how to tackle harmful online content, Canada's justice, public safety, and heritage ministers announced Thursday that they want to bring in new laws and regulations to force social media companies to be more accountable for five kinds of harmful content on their platforms: hate speech, child exploitation, the sharing of non-consensual images, incitements to violence, and terrorism.
And they're looking at what role federal security and intelligence agencies could play in enforcing these new rules, as well as the potential to completely block access within Canada to platforms that fail to act on content on their services that is deemed harmful.
Departmental officials outlined the proposal and issued a technical discussion paper detailing their legislative aims on Thursday.
Specifically, the government is proposing to create new rules and a compliance regime for online communication services that would force these companies to address harmful content posted on their platforms, including a requirement to review and, if necessary, remove problematic content within 24 hours of the post being flagged.
The sites may also be obliged, in certain circumstances, to preserve content and identifying information for potential future legal action. They could also have new options to alert authorities to potentially illegal content and content of national security concern if an imminent risk of harm is suspected.
The proposal would also compel platforms to provide data on their algorithms and other systems that scour for and flag potentially harmful content, to provide a rationale when action is taken on flagged posts, and to establish a new system for Canadians to appeal platforms' decisions around content moderation.
The new regime comes with a series of severe proposed sanctions for companies deemed to be repeatedly non-compliant. These consequences would include fines of up to five per cent of the company's annual global revenue or $25 million, whichever is higher.
And, as a last resort, if an online platform repeatedly failed to remove child sexual exploitation material or terrorist content, the government would seek the legal ability to block Canadians from accessing that service at all, through a court injunction forcing telecommunications service providers to restrict access to that site or service.