Privacy and information leakage have become one of my favorite topics on the Security blog. An enormous amount of information is being willingly plastered all over the Internet, from which significant value can be extracted, especially when combined with other public (or, more likely, private) datasets. The results are mind-boggling, and the implications are not yet fully understood. Yet another example came to light recently on security professional Roger Thompson’s blog.
As we described in the Cyber Risk Report for the week of December 14, Thompson had a credit card suspended because of fraud concerns. When he called his bank’s fraud division to reactivate the card and prove his identity, he was asked questions about his daughter-in-law, information that traditional security questions would never have tied to him. His assumption is that the information was gleaned from a public source, such as a social networking site.
In a follow-on article, Thompson imagines that the information might have come from a Facebook application that he had installed. Thompson believes that the sheer number of application developers ensures that at least some of them have questionable, if not malicious, motives. He cites an example of an application developer that might not be operating above board:
Not long ago, we found some Facebook apps that had been hacked, and were reaching to attack sites in Russia, and while investigating that, we found a site that looked very similar but wasn’t actually attacking. We’re not mentioning the name of this company, because we can’t yet figure out whether they’re good or bad, but they look really suspicious. Their webpage shows no “Contact us” details… just a crudely-drawn graphic. When we did a whois to see who they were, we found that the ownership was hidden behind Privacy Protector.
While they may not be doing bad things, they are at least not being very transparent about their motives, while at the same time having a very large user base.
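The whois check Thompson describes is easy to reproduce. Under the hood, WHOIS (RFC 3912) is nothing more than a TCP connection to port 43 of a registry server, a query string terminated by CRLF, and a raw text reply. Here is a minimal sketch in Python; the function names are our own illustration, and whois.verisign-grs.com is the public registry server for .com/.net (other TLDs use different servers, and a real client would follow the referral the registry returns):

```python
import socket

WHOIS_PORT = 43  # RFC 3912: WHOIS runs over plain TCP on port 43


def build_query(domain: str) -> bytes:
    """A WHOIS request is just the query string followed by CRLF."""
    return domain.encode("ascii") + b"\r\n"


def whois_query(domain: str, server: str = "whois.verisign-grs.com") -> str:
    """Send a WHOIS query and return the raw text response.

    whois.verisign-grs.com answers for .com/.net only; registrar-level
    detail often requires a follow-up query to the server named in the
    registry's answer, which this sketch does not do.
    """
    with socket.create_connection((server, WHOIS_PORT), timeout=10) as sock:
        sock.sendall(build_query(domain))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")


# Example (requires network access):
# print(whois_query("example.com"))
```

In the response text, a privacy-proxy registration like the one Thompson found shows up in the registrant lines, where the proxy service’s name appears in place of the actual owner’s.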
But let’s set aside the troubling legal and ethical questions raised by collecting personal information that customers do not explicitly provide. Set aside, too, the question of how difficult it is to know whether information about you or your employees is leaking online; we’ve discussed that problem before. Instead, consider that all of this is occurring in a gray area, where technology has outpaced not only legal concepts of what is available but also, for most people, the psychological ability to weigh the risks and exposures that exist. Consider that in this environment, some enterprising person in your organization is using public sources like Facebook, Bing, or even more capable OSINT tools to feed a repository somewhere in your corporation.
This kind of information could be wrapped up in a neat little program on someone’s desktop, filed away in a spreadsheet, or stored in a database somewhere in your enterprise. It is not only legally problematic, especially for those doing business under the EU’s Data Protection Directive; it is almost digital toxic waste. Cory Doctorow and Bruce Schneier have likened it to “weapons-grade plutonium” and “pollution,” respectively.
There are risks even in storing the information that drives business. And these risks will only increase as information content in our enterprises continues to increase at phenomenal rates. Lots of interesting — and interestingly dangerous — information is being collected, and some of it is being collected for questionable reasons.
If you are responsible for information assurance, you might ask yourself some questions like these:
Does your organization have a policy about collecting information?
Disposing of it?
Can you identify which areas of the business are working on neat and progressive projects that involve data mining?
Have you talked with them about what they’re doing?
Would you know if someone decided to start such a project?
Would they think to come to you proactively and ask your advice for how to proceed?
I’m interested to see whether more comes of Thompson’s experience, and whether it was a fluke or a preview of what will become accepted practice. In the meantime, I think there are important steps to take to prepare for risks that could be lurking right under our noses.