One of the things I like best about Cisco's focus on security is the internal SecCon conference we put on each year. It focuses on security threats, defenses, and innovation. Although I participate as a trainer, organizer, and reviewer, my favorite role this year was as an attendee. The conference theme, "The State of the Hack," encompassed many elements, but the key one for me was trust and the human element.

The two external keynotes set the tone for talking about trust. Bruce Schneier started by pointing out that trust is an inherent element of living in a society of humans. It allows people to work together, and enables banking, transport, commerce, government, and all the elements necessary for a society to function. Without it, we'd have to raise our own food, and live independently of electricity, money, and even neighbors. Bruce described the four mechanisms that enforce trust: morals, reputation, institutions (rules), and security systems. As security practitioners, we tend to focus on the last, but should remember the first three as well. Reputation is the currency of trust, and is what allows us to trust financial institutions, police, friends, and our food supply. Reputation takes a long time to build up over many interactions. Banks and stores need to be in business for years to build trust. You trust your friends and neighbors gradually with money, keys, and babysitting. But trust can be destroyed in just one action, as many transgressing politicians and security-breached vendors can attest.

Next, Bruce described the actors in a system as either cooperators or defectors, depending on whether they follow the rules of the system. A system needs to consist primarily of cooperators, who act to keep the system honest for the general benefit. But there will always be opportunities for defectors to pursue personal gain, whether in money, freedom, or ideals. As long as the proportion of defectors is low, the system can function. He pointed out that defectors can't be reduced to zero, because doing so becomes too expensive. Plus, their existence catalyzes systems to change; a society or system with no defectors stagnates.

But with too many defectors a system fails, and that is where security comes in. Security is the mechanism for suppressing defection. It can take the form of antivirus products, banking procedures, or police forces stopping protesters. In the world of Cisco's focus on security, the system is the networks and data systems of our customers. The defectors are a variety of criminal enterprises, competitors, and even national governments who wish to gain access to these networks for their own purposes. Cisco's security focus is on both technological controls and robustness. The technical controls are mostly products and features, such as threat defense, firewalls, intrusion prevention, and the like. The robustness effort focuses on ensuring there are no vulnerabilities in any of the products that would allow access into an otherwise securely constructed network.

But as I mentioned earlier, the focus of this conference was on the human element. In the second keynote, Apollo Robbins, the "Gentleman Thief," illustrated some deficiencies in the human system of trust and attention. Among his dazzling displays of pickpocketing and deft-handed trickery was an important message: human attention is a limited resource, and distraction can be employed to get around normal defenses. In one of his tricks, although the volunteer on stage knew his watch was going to be stolen, he still didn't notice it happening. Even if you think you can watch what Robbins is doing, once you are given another task, such as answering a question, your brain filters out the other signals.

From a network security perspective, the insight here concerns the humans who operate the network infrastructure. Distributed denial of service (DDoS) attacks these days are often used to hide the true attack, which may be a subtle exploitation of a vulnerability. If you give network operators too much to do, they may not notice activity that would be obvious on its own. Interfaces that expose too much complexity can keep operators from picking out the important items among the distractions. Another example is having too many security systems to monitor, with no unifying system to bring them all together, correlate issues, and make the important ones easy to notice.
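To make the correlation point concrete, here is a toy sketch (not any Cisco product, and the systems, IPs, and alert names are invented for illustration) of why aggregating alerts across independent monitoring systems helps: amid a flood of DDoS noise from one system, the host flagged by two different systems stands out.

```python
from collections import defaultdict

# Toy event stream from three hypothetical monitoring systems, as
# (system, source_ip, alert) tuples. During a noisy DDoS, the lone
# exploit attempts from 10.0.0.9 are easy for an operator to miss.
events = [
    ("ddos_monitor", "198.51.100.7", "syn_flood"),
    ("ddos_monitor", "198.51.100.8", "syn_flood"),
    ("firewall", "10.0.0.9", "blocked_port_scan"),
    ("ids", "10.0.0.9", "sql_injection_attempt"),
    ("ddos_monitor", "198.51.100.9", "syn_flood"),
]

# Correlate: a source reported by more than one independent system is
# more interesting than any volume of alerts from a single source.
systems_by_ip = defaultdict(set)
for system, ip, alert in events:
    systems_by_ip[ip].add(system)

correlated = {ip for ip, systems in systems_by_ip.items() if len(systems) > 1}
print(correlated)  # {'10.0.0.9'}
```

Real correlation engines weigh time windows, severity, and asset value, but even this crude cross-system grouping shows how a unifying view surfaces the signal that distraction was meant to bury.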

I also want to mention a talk on social engineering called "Hacking the User Brain." In this talk, Casey Hardy pointed out that social engineering is an end-around attack that avoids technical security controls. Ultimately, all security is operated by humans, and as we have seen, human minds have inherent weaknesses. Despite all the technical controls, technical systems still have to have human administrators, who hold the passwords or authentication keys to operate them. It may be much easier for an attacker to trick a human into clicking on a phishing email than to actually break the security system. The email could install malware that gives the attacker a foothold in the network, or that disables the security system itself.

Because humans have to operate on a certain amount of trust, an attacker just needs to get inside that circle of trust to be effective. The pathway for an attacker into a system of trust is a process of gradual leverage. The more information an attacker has (the employee's name, their boss's name, their spouse's name, the name of their bank, etc.), the better chance they have of convincing an employee to trust them. Employees of a company have to use email, and have to click on links in those emails, to get their jobs done. Without effective software to block phishing emails, or training to reliably recognize them, it is difficult to distinguish the legitimate entities you can trust from attackers posing as those entities. Ultimately, humans and the system of trust are the weak point of the system.

I hope to incorporate these insights into my work, and expect that the rest of the Cisco security and development community took their own new insights home with them. I enjoyed the observation that attention and security are limited resources, and need to be focused where they are most needed. All in all, this year's SecCon was a refreshing addition to the usually tech-heavy content of security conferences, and gave me plenty of food for thought for the limited resource that is my human awareness.


Andy Balinsky

Security Research Engineer

Security Research and Operations