Meta Intensifies Content Moderation In India: Over 19.8 Mn Facebook, 6.2 Mn Instagram Posts Removed
In December 2023, Meta, the parent company of Facebook and Instagram, reported significant content moderation activity in India: the company removed over 19.8 million pieces of content on Facebook and 6.2 million on Instagram for violating the respective platforms' policies.
What Happened? During the month, Facebook received 44,332 reports through India's grievance mechanism, as required under the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. In 33,072 of those cases, Meta provided resolution tools, including channels for reporting specific content violations, options to download data, and support for hacked accounts. Of the remaining 11,260 reports requiring specialized review, Meta actioned 6,578, leaving 4,682 reviewed but not actioned.
Instagram showed a similar pattern, receiving 19,750 reports through the grievance mechanism. Meta provided resolution tools in 9,555 of those cases. The other 10,195 reports underwent specialized review, resulting in action on 6,028 cases, with 4,167 reviewed but not actioned.
The IT Rules, 2021 mandate monthly compliance reporting by large digital and social media platforms, and Meta's disclosures reflect its adherence to that requirement. Meta notes that action against violating content may include removing it or covering disturbing images or videos with a warning.
The December figures mark a rise from November, when Meta removed over 18.3 million pieces of content on Facebook and 4.7 million on Instagram under the same policies.