Two recent disclosures show that the weaknesses in cryptography often lie not in the algorithms themselves, but in how those algorithms are implemented as working computer code. Mathematics is beautiful. Or at least mathematics triggers the same parts of our brain that respond to beauty in art and music [1]. Cryptography is a particularly beautiful application of mathematics: a way of encoding information so that it can be read only by the genuine intended recipient. Cryptographically signed certificates let you be certain of the identity of the person or organisation with which you are communicating, and cryptographic algorithms ensure that any information you transfer cannot be read by a third party. Although the science of cryptography is solid, in the real world nothing is so simple.

Ransomware is an insidious form of malware that encrypts files on a hard disk so that the legitimate user no longer has access, then demands a fee for the key needed to decrypt them. Users are faced with the choice of reinstalling software and recovering documents from back-ups, or paying the ransom to retrieve their files. The effects can be devastating for businesses that discover their back-up solution was not as effective as they had hoped.

The business model of ransomware depends on the price asked for the key being less than the cost of the resources needed to crack the encryption. Hence, the cybercriminals have a strong incentive to use proven encryption algorithms with long keys. If the encryption can be broken easily, nobody will pay for the key.

One particular piece of ransomware reported by Virus Bulletin used the RSA encryption algorithm with a 1024-bit key to encrypt victims’ files [2]. Ordinarily this would be far beyond the capability of a commercial computer to crack, had the criminals not made a mistake in generating their key. Researchers discovered that the key contained only numeric characters, rather than a mix of lower-case, upper-case and punctuation characters or binary data. The criminals had interpreted the requirement for the key to contain random numbers literally, and had included only digits instead of completely random data. This reduced the effective bit length of the key to something that could be broken within a day on a desktop computer. The cryptography and the mathematics were sound, but the implementation of the cryptography was flawed.
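To illustrate the principle, the sketch below is a hypothetical key generator, not the malware’s actual code: it fills a nominally 1024-bit key with random decimal digit characters. Each byte can then take only ten values instead of 256, so the key carries far less entropy than its length suggests; the exact reduction in the reported case depended on the details of the malware’s key generation.

    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define KEY_BYTES 128   /* nominally a 1024-bit key */

    /* Hypothetical flawed generator in the spirit of the mistake described
     * above: every key byte is a random digit character, not a random byte. */
    static void generate_key_digits_only(unsigned char *key, size_t len)
    {
        for (size_t i = 0; i < len; i++)
            key[i] = (unsigned char)('0' + (rand() % 10));
    }

    int main(void)
    {
        unsigned char key[KEY_BYTES];
        generate_key_digits_only(key, sizeof key);

        /* A fully random byte carries 8 bits of entropy; a byte restricted
         * to the ten digit characters carries only log2(10), about 3.3 bits. */
        printf("Key prefix        : %.16s...\n", (const char *)key);
        printf("Nominal key size  : %d bits\n", KEY_BYTES * 8);
        printf("Effective entropy : %.0f bits\n", KEY_BYTES * log2(10.0));
        return 0;
    }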

The recent vulnerability reported in Apple’s implementation of SSL certificate handling shows how difficult it can be to get cryptography right [3]. The bug means that when a secure connection is established with another computer, the cryptographic certificate that verifies the identity of the remote party goes unchecked. As a result, an attacker can masquerade as your webmail provider or bank by presenting a forged certificate without the forgery being discovered. The cause of the vulnerability was a duplicated line of code that caused the checking of the certificate to be skipped in certain circumstances.
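The sketch below is a simplified illustration of that duplicated-line pattern, not Apple’s actual code; the check functions are hypothetical stubs. The second, unconditional “goto fail;” is always taken, so the signature check after it never runs, yet the error code still reports success.

    #include <stdio.h>

    /* Hypothetical stand-ins for the individual certificate checks; here the
     * signature check is the one that would reject a forged certificate. */
    static int check_hostname(void)        { return 0; }
    static int check_validity_period(void) { return 0; }
    static int check_signature(void)       { return 1; }  /* forged: should fail */

    int verify_connection(void)
    {
        int err = 0;

        if ((err = check_hostname()) != 0)
            goto fail;
        if ((err = check_validity_period()) != 0)
            goto fail;
            goto fail;                       /* duplicated line: always executed */
        if ((err = check_signature()) != 0)  /* never reached */
            goto fail;

    fail:
        return err;
    }

    int main(void)
    {
        int err = verify_connection();
        printf("verify_connection() returned %d: connection %s\n",
               err, err == 0 ? "accepted" : "rejected");  /* prints "accepted" */
        return 0;
    }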

Again, a flawless mathematical algorithm was defeated by a less-than-perfect implementation. In both cases simple mistakes reduced the effectiveness of the cryptography to the point that it could be easily broken. Anyone who has ever written computer software knows how easy it is to make simple mistakes, and how difficult it can be to identify them, especially if the code appears to work correctly. Test-driven development, in which each conditional branch is exercised by unit tests as part of the development process, should have caught this bug. Writing such tests and ensuring that they cover all possibilities can be done, but it requires a lot of diligence on the part of the development and test teams.
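As a rough, self-contained sketch of that idea, the hypothetical test below reproduces the flawed control flow in a parametrised form and exercises each branch. The forged-signature case requires verification to fail; because the duplicated line skips the signature check, that assertion fires and the bug is exposed before release.

    #include <assert.h>

    /* Hypothetical verifier reproducing the duplicated-goto flaw so that
     * every conditional branch can be driven from a unit test. */
    static int verify(int hostname_ok, int dates_ok, int signature_ok)
    {
        int err = 0;

        if ((err = hostname_ok ? 0 : 1) != 0)
            goto fail;
        if ((err = dates_ok ? 0 : 1) != 0)
            goto fail;
            goto fail;                          /* the duplicated line */
        if ((err = signature_ok ? 0 : 1) != 0)  /* never reached */
            goto fail;

    fail:
        return err;
    }

    int main(void)
    {
        /* One test per conditional branch. */
        assert(verify(0, 1, 1) != 0);  /* bad hostname   -> must fail (passes) */
        assert(verify(1, 0, 1) != 0);  /* expired cert   -> must fail (passes) */
        assert(verify(1, 1, 1) == 0);  /* all checks ok  -> must pass (passes) */
        assert(verify(1, 1, 0) != 0);  /* forged signature -> must fail: this
                                          assertion fires, exposing the bug   */
        return 0;
    }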

Implementing cryptography correctly is hard. Although the theory is irrefutable, if the theory isn’t implemented correctly then you can’t rely on what the cryptography tells you. In an imperfect world of imperfect crypto code written by imperfect humans, how can we detect when communications are being forged?

Identity verification should not mean blind trust when initiating or accepting a connection. The nature of the connecting computer and the content transferred should still be checked. Server reputation can be very useful in identifying servers or networks that have previously been used by cybercriminals. A cryptographic certificate that asserts you are connecting to a reputable and trusted organisation should cause alarm when the server you are connecting to has previously been used for cyber attacks.

Even the most reputable organisations can be hacked, and their trusted connections used to distribute malware or serve as a conduit for hacking attacks. Checking the content transferred over these connections is vital to ensure that attacks cannot be propagated over trusted networks.

The Russian proverb “trust, but verify” is often applied to cryptography. Applying non-cryptographic techniques, such as reputation checking and content scanning, to verify that trusted connections are what they seem to be is necessary to secure data in an imperfect world. Mathematics may be beautiful, but the real-world implementation of mathematics as computer code can be ugly.

References
1. “Mathematics: Why the brain sees maths as beauty”, BBC News, 13 Feb 2014. http://www.bbc.co.uk/news/science-environment-26151062
2. “Researchers crack ransomware encryption”, Virus Bulletin blog, 22 Feb 2014. http://www.virusbtn.com/blog/2014/02_21.xml
3. “Apple’s SSL/TLS bug”, Adam Langley, ImperialViolet blog, 22 Feb 2014. https://www.imperialviolet.org/2014/02/22/applebug.html



Authors

Martin Lee

EMEA Lead, Strategic Planning & Communications

Cisco Talos