Red Teaming is one of those terms popping up all over the place lately, and it seems to mean different things to different people. Is it just an intense pentest with looser rules of engagement? Is it people on-site breaking in at night? Is it when you actually steal the asset, instead of taking a picture to prove you got to it?
Micah Zenko, Senior Fellow at the Council on Foreign Relations, wrote Red Team: How to Succeed by Thinking Like the Enemy, which gives a wide-ranging treatment of the question. As he describes it, “Red Teaming” isn’t a specific set of techniques or rules, but a way of looking at things differently to help avoid unpleasant surprises.
The book starts off in history with the “Devil’s Advocate,” an actual position in the Catholic Church whose responsibility was to argue against a proposed elevation to sainthood. The title came from the idea that it was in the Devil’s interest to limit the number of saints.
This position was in place until 1983. In the 20 years after it was removed, the Church approved more sainthoods and beatifications than in the prior 2000 years.
The “Devil’s Advocate” was maybe the first red team position, and it had a clear effect: it restrained the organization and made sure that its decisions were supported by the available evidence. Whether it went too far is not the point. The point is that it was a respected, well-staffed, influential process that forced some critical thinking.
The book describes a red team approach as one that uses “simulations, vulnerability probes, and alternate analyses” to “reveal and test unstated assumptions, identify blind spots, and potentially improve” outcomes and performance. The problem of how to determine when your practices are producing suboptimal outcomes leads to the central theme: you cannot grade your own homework. All of that is from the introduction.
A red team approach calls on knowledgeable experts with the necessary information to evaluate … anything. You can red team an organization’s security controls. You can red team the decision to go to market with a new product. You can red team an intelligence assessment: based on the information we can gather, is this site a civil nuclear power station, the start of a weapons program, a ruse to force another country’s hand? Something else entirely?
Red teaming is most successful when those knowledgeable experts are not part of the core team. The team that came up with the plan-so-far cannot help but be blind to some of the embedded assumptions. The people on your red team should have the necessary domain knowledge, but not already be invested in the project’s direction. They should be the kinds of people who can identify correlations or interactions that others overlook. They should be able to put themselves in the mindset of a motivated adversary, and should be familiar with what actual adversaries actually do. They should be able to step back and look with fresh, skilled eyes, and not be afraid to describe what they see.
Red teaming is very close to penetration testing, but maybe encompasses a larger set of options and targets. The next time you’re making a decision – at home or at work – think about a red team analysis.
I’m considering an update to my kitchen. I’m shocked at the initial estimates, and more than a little hesitant to pull the trigger on such an outlay – not to mention the disruption to my home while it happens. My sister recently had her kitchen re-done. She has some fresh knowledge and experience that I don’t, and she knows me well enough to know what’s important to me. So, I’m asking my sister to red team my kitchen plans. Before I read this book, that would have sounded to me like a silly thing to say. But now, there’s no way I’m going forward without it.
Book: Red Team – How to Succeed by Thinking Like the Enemy
Author: Micah Zenko