Rik Ferguson, Global VP Security Research at Trend Micro, is one of the leading experts in information security. He is an advisor to the EU Safer Internet Forum, The Information Security Alliance EURIM, a project leader with Europol at the International Cyber Security Protection Alliance (ICSPA), a director of Get Safe Online, Vice Chair of the Centre for Strategic Cyberspace & Security Science and advisor to various UK government technology forums.
What are your thoughts on the issue of Facebook being used to store and share images of child sex abuse?
Storing and sharing images of child sex abuse or exploitation anywhere is wrong. It is a challenge for a platform with such a huge number of users, like Facebook, to proactively police content of any description. That has shown up in the past with all kinds of content that some groups, or everybody, has found objectionable, and sometimes they have made the right decision and sometimes the wrong one. Obviously, there is only one decision to be made when it comes to child exploitation. I think the challenge is more of a technical one: how do you recognise those images among the billions of other images that have to be sorted through and published on a daily basis, while still maintaining the immediacy of the platform, which is what social networking is all about – being able to share and exchange quickly.
Are you aware there are profiles on Facebook that are being used for human trafficking?
I haven’t come across them directly but it doesn’t surprise me. Facebook has a huge user base. Not exclusively Facebook, but social media in general is used as a tool of communication by all manner of groups; we see it being used by criminals involved in financial crime, as you’ve mentioned, human trafficking as well, by populations who are protesting or rebelling against incumbent authorities in their country. It’s a tool of communication and like any tool, it can be abused and misused as well as used for good.
What is the legal situation – aren’t companies held accountable by law if they are found to have images of child sex abuse on their systems?
They certainly have a legal obligation not to host content of that nature; that content is illegal in almost every jurisdiction in the world. What it really comes down to is how that legal obligation is phrased: whether, like many others, it operates on a best-efforts basis.
The way it works in the financial world, as I was told by the FSA when it still existed, is that the more data you hold, the more money you are expected to spend on protecting that data. For an organisation the size of Facebook, the biggest social network on the planet, the expectation should certainly be there that they make every possible investment in people, process and technology.
If Facebook fails to take action, should they face charges for storing images of child sex abuse?
Going by previous cases, people who store those kinds of images on any kind of media have been prosecuted for "making" child sex exploitation images, because by creating the file on the disk they have made the image. So that is a charge which applies not only to the person who took the photo, but also to people who download the photo and people who store it.
Theoretically, under law, any organisation whose hardware is storing that kind of data would be liable for prosecution. Whether the authorities would rather go after the user who put it there as opposed to the organisation storing it is a question for law enforcement. But the possibility to prosecute at either end certainly exists.
What do you think of the way in which Facebook is dealing with this issue?
About a week ago Facebook, Twitter, Yahoo and Microsoft, among others, announced that they are joining a technical task force with Thorn: Digital Defenders of Children, which will create tools to find child sex abuse content automatically. The project is very recent, so there are no outcomes to judge them on yet, but I am hopeful and pleased that Facebook is doing something, though only time will tell how effective it will be.
What other steps, apart from involvement with Thorn, do you think Facebook needs to take?
So with Thorn, they have said in public they are building tools to find that content and eliminate it, which is great, but what they need to be doing in parallel with that is cooperating with security companies. For example, at Trend Micro we have twenty-five years of intelligence around not only malware but much more; we have huge databases about criminal actors and criminal activities, which can be mined and correlated. If projects like Thorn and the companies that are involved also cooperate with the security industry then what they will be able to do in parallel with eliminating the content is actually to assist law enforcement in going after the people responsible for creating and sharing it.
Is there anything else that needs to happen?
As well as working with law enforcement, Facebook also works closely with organisations like the Child Exploitation and Online Protection Centre (CEOP). They do take this very seriously, but I think they have a technical challenge. And one of the quickest and best ways to overcome that kind of challenge is for the user population to be able to take a more active role. Something Facebook could really improve is the speed of response to individual Facebook user complaints. Involving the user population could be one of the most important tools in combating this.
What would you advise the public do if they see an image of child sex abuse or a profile that appears to be trafficking people on Facebook?
For initial reporting and as a first point of contact, I would recommend CEOP for people in the UK. They have a great online presence and reporting mechanism. I would encourage people to go there, so they know it is going through the right law enforcement channels and will be dealt with by specialists in that area. If you go to your local police station, you risk finding people who are not as up to speed as they need to be on digital crimes.
In the US, I would recommend the National Center for Missing and Exploited Children (NCMEC) that also works closely with CEOP. I know this because Trend Micro works closely with them as well – hidden in the background, we take intelligence feeds from those two organisations and anyone using any of our products will be automatically and non-configurably barred from being able to access any content which we know contains child sex abuse images.
- Survivor, advocate and activist Michelle Carmela's interview about the issue of child sex abuse images on Facebook can be read here.
- Survivor and advocate David Zimmerman’s interview about the issue of child sex abuse images on Facebook can be read here.
- Author, feminist and co-founder of Rapebook Trista Hendren’s interview about the issue of child sex abuse images on Facebook can be read here.
To report images of child sex abuse or suspected child trafficking, Rik Ferguson, Global VP of Security Research at Trend Micro, recommends that people in the UK contact the Child Exploitation and Online Protection Centre (CEOP), and that those in the US contact the National Center for Missing and Exploited Children (NCMEC).