
Facebook’s New Way to Combat Child Pornography

As sharing photos online has exploded, so, unfortunately, has the distribution of child pornography. But while the rise of the Internet and digital cameras has revived a scourge that had been almost completely eliminated by the late 1980s, the new technology can also help turn it back again.

Microsoft says it has perfected a technology called PhotoDNA that allows it to identify the worst of these disturbing images - even if they have been cropped or otherwise altered - and to wade through enormous amounts of data quickly and accurately enough to police the world's largest online services. On Thursday, it announced that Facebook will be the first service to sign on to use the free technology, which Microsoft donated to the National Center for Missing and Exploited Children in December 2009.

Facebook, the largest photo-sharing site on the Internet, said it has begun using PhotoDNA to hunt for several thousand known illegal images among the 200 million pictures uploaded by users every day. Facebook will host an online event at 3:00 p.m. (ET) on Friday to explain the initiative, which follows its January move to join the center's Amber Alert network.

"Our hope and belief that Facebook is going to be just the first of many" companies to use what has proven highly effective technology, "said Ernie Allen, executive director of the National Center for Missing and Exploited Children. "Online services will become a hostile place for pedophiles and child pornographers."

PhotoDNA is being used to find and delete only known images of the sexual exploitation of prepubescent children, in order to avoid trampling on the privacy and free-speech rights of consumers of adult pornography, he said. The courts have ruled that pornographic images of child abuse are not legally protected free expression.

By focusing on images of children under 12, the initiative is fighting the "worst of the worst" images, which are often shared over and over again, he said. Child pornography is growing increasingly violent and increasingly depicts very young children, including babies and toddlers.

"These are pictures of the crime scene," not porn, "said Allen. "This tool is essential to protect these victims and to prevent, as much as possible the redistribution of their sexual abuse."

PhotoDNA can currently find about 10,000 images collected by the National Center for Missing and Exploited Children, which has accumulated 48 million images and videos depicting child exploitation since 2002, including 13 million in 2010 alone. The center has a congressional mandate to act as a clearinghouse for this material, helping to identify and assist victims and aiding law enforcement in investigating the perpetrators.

Tests on Microsoft's SkyDrive, Windows Live and Bing services over the last year indicate a chillingly large trade in these images. A service that compares 10 million images a day against the center's inventory of 10,000 illegal photos can expect about 125 hits a day, according to Hany Farid, a Dartmouth professor of computer science and digital-imaging expert who worked with Microsoft to refine the technology. At least 50,000 images of child pornography are transmitted online every day, he said.

"This is a small dark little world," he said. "The problem is phenomenal."

PhotoDNA works by creating a hash, or digital code, to represent a given image and find instances of it in large data sets, much as antivirus software does with malicious programs. PhotoDNA's "robust hashes," however, are able to find images even if they have been significantly altered. Tests on Microsoft's properties showed that the technology accurately identifies the images 99.7 percent of the time and raises a false alarm only once in every 2 million images, and most of those point to nearly identical images, Dr. Farid said.
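PhotoDNA's actual signature format and matching rules are proprietary and have not been published. Purely to illustrate the general idea of a "robust" rather than exact match, the following Python sketch compares a candidate image signature against a list of known signatures using a distance threshold instead of strict equality; the signature representation, distance function, threshold value and function names here are hypothetical stand-ins, not Microsoft's algorithm.

    # Hypothetical sketch of fuzzy signature matching; PhotoDNA's real
    # signature format, distance measure and thresholds are not public.

    def distance(sig_a, sig_b):
        """Sum of absolute differences between two equal-length numeric signatures."""
        return sum(abs(a - b) for a, b in zip(sig_a, sig_b))

    def is_known_image(candidate, known_signatures, threshold=60):
        """Return True if the candidate signature is close to any known signature.

        Unlike an exact (cryptographic) hash lookup, a distance threshold lets
        a match survive modest alterations such as resizing or recompression.
        """
        return any(distance(candidate, known) <= threshold
                   for known in known_signatures)

The key design point is the threshold: an exact hash changes completely when even one pixel changes, while a distance-based comparison still flags images that are only slightly altered copies of a known picture.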

To create a hash, the program converts the image to black and white and scales it to a standard size. It then carves the image into blocks and subjects each block to a series of measurements. The resulting "signatures" can be supplied to online service providers, which can then use them to find these specific illegal images on their systems without possessing the images themselves or viewing the contents of customers' private files.
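Those steps - grayscale conversion, resizing to a fixed size, dividing the image into blocks, and measuring each block - can be sketched in Python as below. This is only a toy perceptual signature following the article's description, not PhotoDNA itself; the image size, block count, choice of average intensity as the per-block measurement, and use of the Pillow imaging library are all assumptions made for illustration.

    # Toy perceptual signature following the steps described above: grayscale,
    # resize to a standard size, carve into blocks, measure each block.
    # Image size, block count and the Pillow library are illustrative assumptions.
    from PIL import Image

    SIZE = 120   # assumed standard size the image is normalized to
    GRID = 6     # assumed number of blocks per side (6 x 6 = 36 measurements)

    def toy_signature(path):
        """Return a list of per-block average intensities for the image at `path`."""
        img = Image.open(path).convert("L").resize((SIZE, SIZE))
        pixels = img.load()
        block = SIZE // GRID
        signature = []
        for by in range(GRID):
            for bx in range(GRID):
                total = 0
                for y in range(by * block, (by + 1) * block):
                    for x in range(bx * block, (bx + 1) * block):
                        total += pixels[x, y]
                signature.append(total // (block * block))
        return signature

Because mild edits such as recompression or small crops shift each block's average only slightly, two near-duplicate images produce nearly identical signatures, which is why a threshold comparison like the one in the earlier sketch can still match them.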