6.033: Computer System Engineering - Spring 2001

------------

Principals as People

Andrew Lamb

Most cryptographic protocols are judged by the security they offer between principals -- the participants in the protocol. In production cryptographic systems, however, the link between a principal and his or her computer is not itself secured. A cryptographic system that overlooks the need to secure communication between users (not just between computers) is only as strong as the security between each user and his or her machine.

In an ideal world, each human computer user would have complete control over his or her system; in practice, this is rarely the case. Moreover, if an attacker gains physical access to a machine, the secrets the cryptographic system depends on (such as private keys) are protected only by the security measures of that machine. The overall security of any cryptographic system is the security of its weakest link, and the weakest link is typically the user's machine.

Many cryptographic systems rely on the principal knowing a private key that must be kept secret. No matter how strong the cryptography itself is, an attacker with physical access to the machine can most likely recover that key. For example, an attacker could simply remove the hard disk on which the private key is stored and read the key directly. Once the attacker has the secret key, the entire cryptographic system is broken.
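
To make this concrete, here is a minimal sketch in Python (the file path is purely illustrative, not from the original text) of why a key guarded only by file permissions offers no protection once the disk itself is in an attacker's hands:

    # Hypothetical sketch -- the path below is invented for illustration.
    # File permissions are enforced by the running operating system; once
    # the disk is removed and mounted on the attacker's own machine, they
    # mean nothing.
    with open("/mnt/stolen_disk/home/alice/private_key.pem") as f:
        private_key = f.read()  # the whole cryptosystem falls with this file
    print(private_key)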

If a computer is shared between users, the situation is even worse: keeping one user's private key away from another user is now entirely the responsibility of the operating system. As the war stories in the 6.033 class notes show, there are numerous known ways to defeat an operating system's security, and probably many more that have not yet been discovered.

But the problem goes deeper than that. When trust is placed in the user's computer, trust is implicitly placed in whoever wrote the operating system and performed the initial setup of the machine. As mentioned in today's reading, an almost completely unnoticeable attack -- replacing a few key self-replicating lines in a compiler -- leaves the user's computer open to exploitation. Using the operating system's security to keep secret information secret means trusting the operating system itself.
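
The attack in the reading (presumably Ken Thompson's "Reflections on Trusting Trust") is easier to see in code. Below is a minimal sketch in Python of the first stage only: a toy "compiler" -- really just a source-to-source pass, with every name invented for illustration -- that recognizes the login program and silently inserts a backdoor. Thompson's full attack adds a second pattern that recognizes the compiler's own source and re-inserts both tricks, so the trojan survives even after the compiler's source code is audited and found clean.

    # Toy model of the first stage of the compiler trust attack.
    # All names (check_password, BACKDOOR) are invented for illustration.

    BACKDOOR = '    if password == "magic":\n        return True  # injected\n'

    def compile_source(source: str) -> str:
        """Pretend compiler: passes source through unchanged, except that
        it silently trojans anything that looks like the login program."""
        if "def check_password" not in source:
            return source                      # honest for everything else
        out = []
        for line in source.splitlines(keepends=True):
            out.append(line)
            if line.startswith("def check_password"):
                out.append(BACKDOOR)           # backdoor goes in first
        return "".join(out)

    login_source = (
        "def check_password(user, password):\n"
        "    return stored_hash(user) == hash(password)\n"
    )
    print(compile_source(login_source))
    # Inspecting login_source reveals nothing suspicious; the trojan
    # exists only in the compiler's output.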

A possible solution to the security holes introduced by trusting the end computer is to move the secrets physically closer to the actual user. For example, smart cards holding people's private keys, along with the circuitry to encrypt and sign data, could be worn around the neck. The private key could then be used only by someone who physically holds the smart card. Smart cards are not foolproof -- they too can be stolen. In addition, the smart card manufacturer must be trusted to make a secure smart card that cannot be spoofed.
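
The security such a card provides comes from its interface: the private key is generated on the card and never crosses the card's boundary, so the host machine can only submit data and receive signatures back. A minimal sketch in Python (simplified in that an HMAC tag stands in for a real public-key signature, and the class and method names are invented for illustration):

    import hashlib
    import hmac
    import os

    class SmartCard:
        """Toy model of a smart card. In real hardware the key sits in
        tamper-resistant memory; the point here is the interface, which
        never exposes the key itself."""

        def __init__(self):
            self._key = os.urandom(32)   # generated on-card, never exported

        def sign(self, message: bytes) -> bytes:
            # Simplification: an HMAC tag stands in for a real signature.
            return hmac.new(self._key, message, hashlib.sha256).digest()

    card = SmartCard()
    tag = card.sign(b"transfer $100")
    # The host machine sees only `tag`; copying the host's disk reveals
    # nothing. Stealing the card itself, of course, still works.
    print(tag.hex())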

Many cryptographic protocols assume that an end user and his or her machine form an inseparable, atomic unit. Since a user often does not even have full control over his or her machine, this assumption is blatantly false, and it leads to cryptographic systems that appear secure at first glance but are not.