What is Red Team Thinking, and how can it help and elevate your organization? In this episode, Bryce G. Hoffman explains his model of Red Team Thinking, inspired by the Red Team concept from the military. He joins host Ben Baker to break down the concept using real-world examples to help leaders navigate the decision-making process with their team. This is not just a tool. This is a mindset that should be practiced by each member so that you can come up with the best strategies and solutions. Learn more about it and how it can be applied to your team and your business by tuning in!
Red Team Thinking With Bryce Hoffman
I have a question for you. What if all the information, data, and insights you had were wrong? What if the conceptions, thoughts, and ideas that came out of them were wrong? That’s what we’re going to talk about. We’ve got Bryce Hoffman joining us. We’re going to talk about Red Team Thinking. I’m going to let him explain what this is because this is something that every organization needs to be thinking about. Bryce, welcome to the show.
Ben, it is a pleasure to be here.
I am honored. I’m trying to think a few months ago, and I heard you talking about this Red Team Thinking, and I went, “This is brilliant. This is something intuitive because it’s counter-intuitive. It makes so much sense. How do I get in touch with Bryce? How do I get him on my show?” This is a concept that needs a great light shone on it. Give some people an idea of who you are, what you do, and let’s get into the Red Team Thinking.
I’m a bestselling author, a top-rated leadership speaker, and the Founder and President of Red Team Thinking. Red Team Thinking is a cognitive capability that engages critical thinking, enables distributed decision-making, and encourages diversity of thought. It’s both a mindset and a set of tools. Those tools allow you to stress test your strategies, challenge your assumptions, identify unseen threats and missed opportunities, make better decisions faster, and better navigate this crazy complex world we live in.
It’s critical because we all walk into scenarios, whether board meetings, the baseball team, or whatever it is, with preconceived notions of what’s right and wrong. This is what the data tells us. The question is, is that really what the data tell us?
Are we even looking at the data? For centuries, people thought that we humans made fundamentally rational decisions. This was first codified and articulated by Adam Smith. He called it The Rational Choice Theory. This theory dominated economics, psychology, and any arena that was interested in figuring out how we make decisions.
Adam Smith said that we make the best decisions possible based on the information or data that we have unless we’re swayed by strong emotions, like love, anger, an unhealthy obsession with tulips, NFTs, or something like that. The problem is that Adam Smith was completely wrong. Over the past few decades, a small army of cognitive psychologists, other scientists, and thousands and thousands of experiments have proven that that’s not how we make decisions. That’s how we wish we made decisions.
The way we make decisions is by relying on one of two modes of thought. They call them System 1 Thinking and System 2 Thinking. For people who are interested in diving deeper into this, the best person to read about is Dr. Daniel Kahneman, who won the Nobel Prize for his work in this arena. He wrote the book, Thinking, Fast and Slow, and others.
To give the synopsis of what it is, System 1 Thinking is automatic and intuitive. It just happens. If I come at you with a flame thrower, you don’t spend a lot of time thinking, “I wonder what’s going to happen if that flame touches me. Is it going to be a good thing? Is it going to be a bad thing? Do I like fire? Do I not like fire?” No. You intuitively know to run away from this crazy guy with a flame thrower, so that’s System 1.
That’s instinct, in the base way of looking at it.
System 2 Thinking is what Kahneman calls deliberate and effortful. If I write a simple algebraic equation on a chalkboard, unless you’re a math savant, you’ve got to engage System 2 Thinking to solve it. You’ve got to go, “What is X? How do I solve for X? I remember from 10th grade how to solve for X. I’m going to do this, and then I’m going to solve for X. There it is. I know what X is now.” That’s System 2 Thinking.
Here’s the problem that Kahneman and others have uncovered. Because our brains are fundamentally lazy, most of the time when we think we’re using System 2 Thinking, we actually default to System 1. We’re relying on past experience or prejudices. We’re relying on cognitive biases that all of us, no matter how smart, well-educated, and successful we are, fall victim to all the time because it’s the way our brains are wired. We’re not aware of it happening. We then get less-than-optimal results from our decision-making because we think we’re thinking things through while these biases are bouncing around in our heads and skewing our decisions off the course we think we’re on.
That’s fascinating, and it’s very frightening. I have this distinct memory of taking the statistics course at university. I had a stats professor who swore stats don’t lie. The data is the data. You’re going to get one set of statistics out of this, and stats don’t lie. I wrote a paper and I proved a theorem based on that set of statistics all the way up to the middle of the paper. I said, “However,” and I disproved that same theorem using the exact same set of statistics.
I had to go up to the dean of the department to get my paper regraded because this guy was furious at me because his thought is that data is data, all rational decisions need to be made of data, and everybody is going to come to the same conclusions given the same information in front of them. What I’m hearing from you is that’s the furthest thing from the truth.
Not exactly. Data is data. The question is, is it the right data? That’s the thing. If you’re measuring the wrong thing, your measurements can be perfectly accurate and objectively true, but you will still draw the wrong conclusions because you’re not measuring the right things. I don’t know how many of your readers who live in the Southern United States have ever experienced the joy of kudzu.
For those who haven’t experienced it, kudzu is an invasive weed native to the Korean peninsula that grows in the US at an insane rate. If a house in a kudzu area is left unattended for a year, when you come back, the house will be buried under kudzu. If you park a car in an area like that and don’t move it for six months, kudzu will be growing all over your car. Where did this monster weed come from?
It came from assessing the data. Back in the 1930s, when the Dust Bowl was happening, and there was also a problem with erosion because of flooding in that part of the country, the US Department of Agriculture ran a study to find a fast-growing plant with a deep root system that could anchor the soil and keep it from washing away in floods and blowing away in dust storms. They did a lot of research and said, “We have found the perfect plant. It’s called kudzu. It comes from Korea. The solution is simple. We will plant it along riverbanks and fields all over the Southern United States.”
The data wasn’t wrong. Kudzu is a fast-growing plant with a deep root system that fixes the soil in the way they were hoping, but they didn’t look at it three-dimensionally. They didn’t look at what the unintended consequences of planting all this kudzu would be. Now, it’s something that costs hundreds of millions of dollars a year in eradication efforts, ruined farmland, damage to buildings, and so on, all because they didn’t think about the other consequences of planting it. The world is filled with stories like that. The old saying, “The road to hell is paved with good intentions,” is not far off.
Let’s get back to the Red Team Thinking in terms of its origin. You were talking about how it relates to 9/11. How did Red Team Thinking come out of 9/11? How have you taken that from its original thing to where it is now? That would be a good conversation.
This concept of formal decision-support red teaming was born out of 9/11 and out of two things that happened as a result of it. Most people are probably aware that in the wake of 9/11, there was a congressional committee and panel to figure out how this happened, because there were very quickly reports that, “The FBI was investigating these guys. There were all these tips to the CIA. Why did this happen when we had all this intelligence?”
The conclusion of those hearings was that the US Intelligence establishment had suffered what they called a colossal failure of imagination. They had access to all this data. The data was correct, but they didn’t piece it together in the right way. They didn’t connect the dots until the terrorists connected them for them. Two things happened. One happened before the report was even done. This happened after midnight on September 12th, 2001, while they were still pulling people from the ruins of the World Trade Center and the Pentagon.
Director of the CIA, George Tenet, stood up a group within the CIA called the Red Cell and said, “We screwed up. We should have seen this coming. We had people in this building who were trying to ring the alarm bell, and we didn’t listen to them. We’re going to create this team, and the job of this team is going to be to take whatever the prevailing wisdom of the CIA is to whatever the conclusions the CIA draws and try to argue that they’re wrong, challenge those conclusions, poke holes in them, and help us see what we’re missing.”
The Red Cell got to work. You can go to the CIA’s website. They say this openly. The Red Cell has been credited with preventing at least half a dozen major terrorist attacks against the United States since 9/11 that would have been equal to or significantly larger than the 9/11 attacks. That’s the Red Cell, and it continues to operate now. To facilitate that, the CIA developed an array of applied critical thinking and groupthink mitigation tools and techniques, many of which we’ve borrowed, modified, and evolved and formed the basis of the Red Team Thinking toolkit.
The second thing that happened as a result of 9/11 is we invaded Iraq and Afghanistan. Initially, things seemed to be going well from the point of view of the US Military, and then suddenly, they weren’t. By 2004, the wheels had started to come off in Iraq, and what had seemed to be a fairly easily won victory was beginning to look like something quite different: an endless quagmire of insurgency.
The US Army brought a great general, Peter Schoomaker, out of retirement to lead the Army and said, “You’ve got two jobs. You’ve got to figure out how to deal with the crisis that we’re now in, and your second job is to figure out how we don’t end up in this mess in the future and don’t make the same mistakes again.”
One of the things that Peter Schoomaker did was to impanel a lessons-learned team at the Pentagon to bring in people to ask, “What did we do wrong in the planning and the lead-up to the invasion of Iraq, and what can we do to make sure that doesn’t happen in the future?” One of the key recommendations that came out of this was, “Here’s what we didn’t do. Number one, we didn’t challenge our assumptions. We made a lot of very poor assumptions. Number two, we didn’t think about things from the perspective of other key stakeholders, like the Iraqi people. We didn’t include them in our discussions to figure out how the things we were planning to do would look to them. We made a lot of bad decisions based on biases from previous conflicts that proved to be untrue.”
They decided the way to combat that was to create a methodology, which they called red teaming, to deliberately challenge the plans, strategies, and thinking of the army in the future. They set up a school at the Command and General Staff College at Fort Leavenworth to teach a cadre of military officers how to use these tools and techniques that the CIA and others had developed to stress-test strategies, challenge the organization’s thinking, play the devil’s advocate, and bring a contrarian approach to the military’s thinking. In 2015, I became the first and only civilian from outside government to talk my way into that school, go through the US Army’s Red Team Leader Course, and graduate as a Certified Army Red Team Thinker.
That must have been an amazing story in itself, figuring out how to get into a military college program as a civilian, let alone that one.
I called the Pentagon and said, “I understand you have this amazing course. Business could benefit from it tremendously. I’d like to audit the course.” They said, “Who the heck do you think you are?” I said, “I’m a bestselling business author and a nice guy.” They said again, “Who the heck do you think you are?” I am incredibly persistent. I kept knocking on doors for several months until one of the school’s leaders said they finally found the right lawyer to allow me to attend the course. I moved for a few months to beautiful Fort Leavenworth, Kansas, a garden spot in the Lower Midwest.
When you take a look at that and the thought process that goes with it, the one thing that I was thinking about is, in order to have a red team, you need to have the best and the brightest. You need to have people who are unafraid to voice their own opinions, challenge convention, question assumptions, and sit there and say, “What if I’m wrong? What if you’re wrong?” That takes a very special type of person.
It does. This has spread worldwide very quickly to the British, Canadians, Australians, NATO, and everyone. This formal red team process that they created requires a cadre of people like you described and a dedicated team. When I wrote my book, Red Teaming, in 2015, that’s what I intended to do: port that model to business and say, “Here’s a model that is proven, battle-tested, and works. You should adopt it too.”
Companies started calling me and asking, “Can you help us implement this and bring red teaming to our companies?” What I found as I did that is that as powerful as this approach is, it also creates its own challenges. If you need to have the best and the brightest, you can’t take your best-performing executives and say, “You’re no longer in charge. You’re going to be on the red team now. You’re going to sit in an ivory tower and wait for us to develop plans and strategies, then you’re going to analyze them with your fellow red teamers, and we’re going to wait to hear what you have to say before we make a decision.”
That doesn’t work in the real world because it creates logistical and organizational problems in terms of getting that personnel, and there are opportunity costs from them not doing their regular jobs. It can slow down the decision-making process. It can even become a political football in terms of who owns the red team and to whom it reports.
We evolved this formal process of red teaming into what we call red team thinking. It relies on the same tools and techniques and the same basic approach, but we’ve modified it considerably. Instead of having to have this team, you can train your decision-makers on how to use these tools and techniques themselves so that it becomes part of their decision-making practice. You don’t need a separate team. You don’t need the separate step of red teaming the plan.
If you have a major, high-stakes plan, you can still bring some of those people together as an ad hoc team and do a more formal analysis. Having these tools and techniques will improve the quality of the decisions people are making every single day. The more that happens, the other thing you get out of it is that it changes the culture of the organization.
You start to develop a learning-agile culture, an adaptive, learning, evolving culture that is continually challenging itself and improving on what it does. It’s never satisfied with where it’s at and is always figuring out ways to do better. When you get a culture like that, you are going to stay ahead of your competition because no one’s going to come and blow past you while you’re fat, dumb, and happy with where you’re at, which too many organizations are.
One thing that I’m thinking about when you say that is it gets rid of groupthink. It gets people out of the bind that says, “We’ve always thought this way. We’ve always acted this way. We’ve always hired this type of person. We’ve always employed this type of vendor.” It is extremely dangerous, whether it’s an organization of 5 or 50,000, to sit there and say, “What worked in 2019 is going to work in 2022.”
Look at how many organizations are struggling now with this return to work. Too many of them thought it would be simple. They said, “This is easy peasy. We’re going to send an email, everyone gets back to the office, and things are like they were back in 2019.” Suddenly they find, “Where is everybody? No one’s here.” Goldman Sachs had a rebellion the first day back to work because they sent out an email and told everyone, “Be at your desk at 8:00 or 9:00.” The senior leaders started walking around the headquarters in New York saying, “Where the heck is everybody?” People didn’t show up. The world has changed.
More thoughtful, contrarian, and deeper-thinking companies, like Apple, have been very scientific about their return to work. They’ve been slow-walking it and doing test groups where they are like, “Let’s bring this team back for two days a week and see how that goes. That worked pretty well for this type of job. Let’s try that with the marketing team. In marketing, that didn’t work out so well.”
One of the core principles that we teach is when you’re dealing with complex problems, which this is a perfect example of, it’s impossible to simply come up with the answer in a boardroom. You’re always going to be wrong. What you need to do is have an attitude of probing, testing, and learning, “Let’s try this. Let’s see how that works. That worked, so we can continue to do that. That didn’t work, so we need to adjust course.”
That is increasingly the way that you have to find your way forward in the world. It doesn’t take a lot of time. It’s not about not acting, you’re acting constantly, but you’re acting knowing that you may have to modify your actions. That’s why we think that decision-making should be viewed not as a process but as a practice. It’s something that you’re constantly doing. Every day you’re engaged in refining your decisions and learning and modifying them because that’s the only way you can navigate this complex world.
A lot of it comes down to our original conversation about good versus bad data. Did those Goldman Sachs people have the thought in their heads, “No way in God’s green Earth am I ever coming back to the office. My team has been working remotely for a few years. We’ve made a boatload of money working remotely. I don’t have to wear a suit every day. I don’t have to fight Manhattan traffic every day. I don’t have to pay $400 or $600 a month to park my car”? You take a look at it and say, “Are we asking the right questions?”
That’s it right there. You hit something so important, “Are we asking the right questions?” Let me give you a powerful example of that. When I was at Fort Leavenworth at the red teaming school, we did a lot of field trips. One was to a team of police officers in Kansas City whom I work with to this day. They’re globally recognized experts on community policing because they came up with a novel model of engaging with the community in a non-confrontational way in what had been a bad part of Kansas City and turned it into a model of the community and police working together in lockstep.
This was back in 2015, and the riots in Ferguson, Missouri, had happened not long before. These guys were brought in by the Ferguson Police Department to help them figure out what had gone wrong and how they could avoid another riot like the one they’d had. What they found was that a few years before the riots, a new police chief had come in, and like a lot of small-town police departments, the problem he realized was that a lot of his cops weren’t out on the beat. They were in the coffee shop, drinking coffee and eating donuts.
He had a brilliant idea, but he asked the wrong question. He said, “How do I deal with this? I have an idea. I’m going to make a rule that every officer has to have a minimum of six interactions per shift with the community.” That makes sense. What’s the problem? The cops aren’t out actively policing. They’re drinking coffee and eating donuts.
The solution then is to make them interact, but how does that play out in practice? Imagine you’re a couple of officers in Ferguson, Missouri. It’s Saturday afternoon, and you’ve got an hour left on your shift. You’ve pulled over two people for speeding. You’ve pulled over another guy for running a red light, but there haven’t been any crimes. You’ve got 40 minutes left on your shift. What are you going to do? “Look, it’s a group of African-American kids playing basketball at the basketball court.” Pull over, throw three of them up against the fence, and frisk them for drugs or whatever. You’ve got three more interactions now. You can go back and file your reports. You can say, “I had six interactions.”
You compound those years of that behavior and create a community that’s at war, where the police are viewed as an occupying power that’s at war with the community. Suddenly something happens. Finally, someone drops a match in a pool of gasoline that’s been building up for several years, and the whole thing explodes. That all happened because somebody asked the wrong question.
It’s teaching people how to ask the right questions. The problem is when we’re dealing with leadership in the vast majority of things, the Peter Principle still exists. We still have a bunch of leaders that are leaders because they’ve been in their jobs enough years that they’ve been promoted and never trained along the way. The question is, how do we bring about red team thinking if we have a group of leaders that truly are nothing more than managers managing process instead of leaders understanding how to inspire, coach, and mentor people?
The answer is that we help those people reclaim the role that they were born with, of being a thinker. When you were a little kid, what did you do? You walked around and asked why all the time. “Why is the sky blue? Why is the grass green? Why is the dog growling? Why is the dish hot?” We’re born thinking, challenging, and not accepting things at face value, asking tough questions to the point that we irritate our parents.
Something happens along the way when we get into organizations. We find that sometimes these organizations don’t welcome our questions and don’t reward critical thinking. They reward instead people who shut up and go along to get along. Because that is being rewarded, those people get promoted. They get promoted to the point where they are now the ones who are enforcing that mode of unquestioning allegiance. That’s not our natural role. If you give people permission, it’s amazing how quickly they can change.
There’s a fascinating series of experiments that I talked about in my book. These are the experiments that led to the idea of groupthink. They were conducted by a scientist named Solomon Asch. This was back in the 1950s. These are both amazing and terrifying experiments. What Asch did was get a group of eleven drama students at his university together. He said, “You’re all going to play students in a scientific experiment on perception.” He then got one person who was the actual subject.
The way the experiments worked was like this. Asch created a series of placards that each had three lines on them. One line was short, one line was long, and one was about halfway in between. They all varied a little bit, but the point is it wasn’t even close which was the short line, the middle line, and the long line. He would then show a single line and say, “Which of the three lines matches this line?” You would have to be visually or mentally impaired not to be able to give the correct answer 100% of the time. It was that dramatic a difference.
Everyone except the subject was an actor. They went through ten rounds, and even though a different person was called on first each time, the subject was never called on first. In the first set of experiments that Asch did, the actors gave the wrong answer every time: if the line was the medium-length line, they would say, “It’s the short line.” If it was the long line, they’d say it was the short line. If it was the short line, they’d say it was the long line. They went through this, and a terrifying percentage of subjects, over 50%, gave the wrong answer at least once over the ten questions.
They didn’t want to be seen as being different from the group.
That would be bad enough on its own, but then Asch interviewed people afterward and asked, “Why did you give the wrong answer?” Over 70% of those people didn’t know they’d given the wrong answer, because the force of groupthink is so powerful that people start questioning their own reality. They’d started thinking, “Maybe it’s me.” Once they start thinking that, it starts to influence their perception. That’s terrifying.
Then he did a second set of experiments that should give us hope. He changed things up. He did the exact same thing, but he had one actor in every group give a different answer, not even the right answer. As soon as that happened, the percentage of subjects giving the wrong answer went back to zero, because what he called the lone dissenting voice gave the subjects permission to think for themselves again. When they thought for themselves, they could clearly see the truth.
Think about how that plays out in an organization. If you’re in a large organization, in a group with ten managers discussing a problem, and nine of them are insisting that something completely idiotic is the case, you’re either going to shut up because you don’t want to get called out, or even worse, you may agree with them because you’re worn down by the mental pressure of it. If one person in that group challenges it, the odds are you’re going to be much more willing to challenge it too.
We’ve seen that play out time and time again with companies we’ve worked with, where people come up to us and say, “The tools you taught were great. That’s all well and good, but the best thing you did was you gave us permission to say what we’ve been thinking. Now that we understand that senior leadership wants to hear what we’re thinking, it’s like the flood gates open up.”
That’s important because one of the things I firmly believe is that the answers companies need are in those companies themselves. They don’t need to hire expensive consultants to come and tell them what to do. I have never been in an organization, company, government agency, or military unit where the answers they need aren’t there. There is always somebody, usually more than one person, who says, “I can tell you what we need to do. Here’s the problem right here. Nobody listened to me, but here’s the problem.” Instead of doing that, most organizations pay McKinsey, KPMG, or PwC millions and millions of dollars to come and tell them what to do, and it’s often wrong.
It’s justifying what somebody already thinks and saying, “McKinsey said the same thing that we did, so it has to be right.”
One of the questions I asked a client early on in this process was, “You have 320 people in your org chart who have the word strategy in their job titles.” The client said, “We do.” I said, “Why do you pay McKinsey millions of dollars a year to develop a strategy for you?” She looked at me and couldn’t answer.
I said, “There’s one of two answers: either you don’t trust your own people, which is sad, or you haven’t given your people the tools they need to do the job they were hired to do. The solution isn’t hiring McKinsey. If you’ve got 320 people on your payroll who have the word strategy in their titles, they should be developing your strategy.”
You don’t trust your own people.
That’s why we say, “Don’t outsource thinking.” That’s our motto. I got it on my coffee cup.
Bryce, we’ll have to have you back for round two of this because there’s so much in this whole thought process that it deserves another conversation. I’m going to ask you two questions. One, what is the best way for people to get in touch with you? That’s a vital question to ask.
Here’s the last question that I ask everybody. As you leave a meeting, get in your car and drive away, what’s the one thing you want people to think about you when you’re not in the room?
“He made me think.”
I have been thinking ever since we started this conversation.
Mission accomplished. I can rest easy, Ben.
I want to thank you for opening up my brain to different ways of thinking and giving me solace in knowing that the way I’ve been dealing and communicating with my clients is the way I should be. You have justified how I’ve been dealing with my clients for years. I appreciate that. I didn’t know the words, but now I do.
Thank you for letting me share Red Team Thinking with your readers. I appreciate it.
Thanks a lot.
- Red Team Thinking
- Thinking, Fast and Slow
- Red Teaming
- The Thinking Leader Podcast
- Red-Team.tv – YouTube
- LinkedIn – Bryce Hoffman
- Twitter – Bryce Hoffman
- American Icon: Alan Mulally and the Fight to Save Ford Motor Company
About Bryce Hoffman
I am a bestselling author, speaker, and unconsultant who believes that individuals have the power to transform companies and cultures through great leadership and applied critical thinking.
My books include:
📘 American Icon: Alan Mulally and the Fight to Save Ford Motor Company
📕 Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything
I am one of the world’s foremost experts on decision-support red teaming, a revolutionary methodology developed by the U.S. military and intelligence agencies to help leaders make better decisions in today’s complex and rapidly changing world.
In 2017, I founded Red Team Thinking to teach individuals and organizations around the world how to use this same approach to:
📌 Engage critical thinking
📌 Enable distributed decision-making
📌 Encourage diversity of thought
Why? Because who thinks wins! 🏆
I also coach senior executives using the management principles I learned from my mentor, legendary CEO Alan Mulally.
📣 I also like to talk – about leadership. In fact, Inc. Magazine named me one of its “Top 100 Leadership Speakers.” 🥇
In 2021, I launched “The Thinking Leader Podcast” so that I could keep talking during the pandemic. 🎙
Oh, and I used to be a 📰 business journalist 📰, but I got over it – though I still write a column about leadership, strategy, and of course red teaming for Forbes.com.