Under the IT Rules that came into force earlier this year, large digital platforms with more than 50 lakh (5 million) users are required to publish monthly compliance reports. These reports detail the complaints received by the platforms and the action taken on them, along with details of content removed or disabled proactively using automated tools.
In October, Facebook took ‘action’ on more than 18.8 million pieces of content across 13 categories. During the same period, Instagram took action against more than 3 million pieces of content across 12 categories.
In the latest report, Meta states that between November 1 and November 30, Facebook received 519 user reports through its Indian grievance mechanism. Of these, 461 cases were resolved by providing users with tools to address their issues themselves.
According to the report, of the more than 16.2 million pieces of content Facebook actioned in November, 10 million related to spam, 2 million to violent content, and 1.5 million to adult nudity and sexual activity. Action was also taken on more than 100,000 pieces of hate speech. Several other categories saw action as well, including 102,700 pieces of bullying and harassment content and 370,500 pieces of suicide and self-injury content. Action was also taken on content involving threats to children.
On Instagram, of the more than 3.2 million pieces of content actioned across 12 categories, the largest number related to suicide and self-injury, with action taken on 815,800 such pieces. A further 333,400 pieces of violent and graphic content were removed.