In Just 3 Months Facebook Has Removed 8.7m Child Images
Mike Sanders / 2 years ago
Facebook has been fighting many battles over the last year. Battles which, I think it’s fair to say, have seen mixed results. Some they’ve won, some they’ve lost, others they’re still fighting. It seems, however, that one of their biggest issues at present is attempting to moderate the content being posted on their website.
Although I have never encountered it myself, it is widely accepted that Facebook has been used for the distribution of illegal images. This brings us to the often awkward and distasteful subject of child pornography. We may absolutely abhor it, but sadly it exists, and ignoring it or pretending otherwise won’t make it go away.
In a report via The Star, though, it seems that Facebook has been very active in attempting to quell this on its website.
Lifting the lid on its (until now) secret detection software, Facebook has revealed that it has successfully detected and removed 8.7m child images flagged as ‘sexualised’. We should, however, clarify that figure a little. It does not mean that Facebook has a major child pornography problem. It does, however, show the company’s intent to fight against it.
The social media site has openly admitted its detection algorithm isn’t perfect. It has led to many entirely legitimate images being removed, and the site has apologised for any inconvenience caused. It does, however, think that on this subject it’s best to err on the side of caution. A more notable example of this in recent months was the removal of a Pulitzer Prize-winning photo of a Vietnamese girl following a napalm attack.
The social media site, however, says that it is working hard to improve this somewhat secret algorithm. As far as I’m concerned, while it is sad that legitimate pictures get pulled, there is a greater good here.
What do you think? Is Facebook right to take such a tough stance? What could be done to help improve it? – Let us know in the comments!