Meta removes child sexual exploitation content on Facebook after being notified by IMDA
SINGAPORE: The Infocomm Media Development Authority (IMDA) has notified Meta to review and remove content on Facebook containing child sexual exploitation material (CSEM), the agency said on Friday (Jun 9).
Meta, the parent company of Facebook, Instagram and WhatsApp, took down the offending page and group within 24 hours, according to IMDA.
This is the first time IMDA has notified a social media service of such content, following amendments to the Broadcasting Act last November that introduced enhanced online safety requirements.
The police had alerted the agency to a Facebook page that was part of an online network facilitating the sharing of CSEM. IMDA subsequently uncovered a Facebook group carrying similar posts.
“The posts contained hyperlinks that led viewers to a website with CSEM,” said IMDA, adding that it also directed internet service providers in Singapore to block a linked website that enabled access to and distribution of similar content.
Parliament passed the Online Safety Act last year, requiring social media sites to block access to harmful content within hours. The law empowers IMDA to deal with harmful online content accessible to Singapore users, regardless of where the content is hosted or initiated.
Under the amended law, IMDA can direct social media companies to block or remove such content, and the agency said it would not hesitate to do so if swift action is not taken.
It can also direct an internet service provider to block access by users in Singapore, in the event an online communication service refuses to take down harmful online content.
The Act took effect on Feb 1 this year.