12 September 2016
Facebook has faced renewed criticism over the past week regarding the level of censorship it imposes on content appearing on the platform.
It all started when Norwegian writer Tom Egeland posted an update featuring ‘The Terror of War’, the iconic photograph showing several children, including a naked nine-year-old girl, Kim Phúc, fleeing a napalm attack during the Vietnam War. The photo was swiftly removed and his account suspended, with Facebook citing its community guidelines on nudity.
In a flurry of support, major newspapers and the Norwegian Prime Minister then re-posted the same image on their own pages, only to have it repeatedly taken down by Facebook. Thousands of people complained, and the controversy generated a great deal of ill feeling towards the platform, forcing it to back down and reinstate Egeland’s account.
Facebook has not apologised, but it has released a statement:
“An image of a naked child would normally be presumed to violate our Community Standards, and in some countries might even qualify as child pornography. In this case, we recognise the history and global importance of this image in documenting a particular moment in time. Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed.”
The social giant said it will also adjust its review mechanisms to permit sharing of the image going forward, which could take a few days. The whole furore, however, raises some interesting questions about how Facebook operates and chooses to censor. The company recently removed its human team of editors, leaving an algorithm to curate trending content – and there have been some major hiccups lately, including the promotion of a fake story about Megyn Kelly and a 9/11 conspiracy theory.
Perhaps Facebook needs to accept that human interactions are far too complex for a computer program to understand, and that it should employ well-trained, media-savvy editors with a background in journalism. The stories it features not only need to stand up to robust fact-checking, they must also be unbiased and balanced – something that, at the moment, doesn’t seem to be the case.