Why “going with your gut” is sometimes the best way to go
On why intuition matters as much as the numbers – even from my perch as a former math major who does pricing strategy in tech for a living
I think through monetization strategy for tech products and services for the lion’s share of my day. Unsurprisingly, the conscious, rational and logical “left brain” is highly celebrated in my world. It can be easy to extend that celebration into a belief that subconscious, intuitive, and emotional decision making (the “right brain”) is often fallible and even whimsical.
I was recently reading the thought-provoking “anti-Moneyball” The Eye Test by Chris Jones. That got me thinking about whether “going with your gut” truly deserves the bad rap it often gets. Ironically enough, I dug into some data, and it suggests that our emotions and feelings are not only central to our intuition but may actually be essential to making good decisions.
1. “Numbers don’t lie” – except sometimes they do.
“Show me the data, the numbers don’t lie” is a mantra of objective, evidence-backed decision making. But equally true is the “garbage in, garbage out” tenet of analytical rigor that one of my Finance professors tried to drill into our brains in the early weeks of business school.
Algorithms are made by humans, and like humans they can be highly subjective and riddled with bias. Several missteps in the US criminal justice system shed light on this unfortunate truth. In 2020, Robert Julian-Borchak Williams became the first person known to be wrongfully arrested because of an algorithm. Williams, an African-American man in Detroit, was charged with stealing five watches after being wrongly flagged by a facial recognition system, even though it quickly became amply evident to everyone (including the detectives involved) that he bore little resemblance to the actual thief. Facial recognition systems have repeatedly demonstrated bias against people of color: Facebook’s AI labeled a video of Black men as “primates” (which the company told the BBC “was clearly an unacceptable error”), and one widely cited study found that facial analysis systems misclassified darker-skinned women as men as much as 35 percent of the time. Machines and models cannot self-correct - only we can.
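To make the “garbage in, garbage out” point concrete, here is a deliberately toy sketch in Python (synthetic data and scikit-learn, not any real facial recognition system): when one group’s training data is scarcer and noisier than another’s, a single model trained on both ends up markedly less accurate for that group.

```python
# Toy illustration of "garbage in, garbage out": the accuracy gap below
# comes entirely from differences in data quantity and quality, not intent.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, noise):
    """Synthetic examples: 2 features, a simple true label, then measurement noise."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)        # true label from clean features
    X = X + rng.normal(scale=noise, size=X.shape)  # noisier data = worse quality
    return X, y

# Group A: plentiful, clean data. Group B: scarce, noisy data.
Xa, ya = make_group(n=5000, noise=0.1)
Xb, yb = make_group(n=200,  noise=1.0)

model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

Xa_test, ya_test = make_group(2000, 0.1)
Xb_test, yb_test = make_group(2000, 1.0)
print("accuracy, group A:", model.score(Xa_test, ya_test))
print("accuracy, group B:", model.score(Xb_test, yb_test))
# The same model is reliably right for group A and much shakier for group B -
# the kind of disparity the facial recognition studies describe.
```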
Another problem with using purely analytical thinking as the gold standard for decision making is that it relies on past information. While our past may give us clues about future behavior, it is most certainly not a guarantee of it. No data mining in the world could have predicted the arrival of COVID or known how to respond to it, and our weather models still struggle to forecast exactly where a hurricane will strike or when a wildfire will ignite. It remains a quintessentially human capability to come to terms with uncertainty - which is complex, multi-dimensional, constantly evolving, and not amenable to predictive models.
2. There is no “one right answer” to anything.
Chris Jones, the author of The Eye Test, shares how his older son, Charley, has autism. He elaborates how Charley is a voracious reader but the spelling of even basic words defies him. To Chris, it seemed inexplicable that Charley could read without being able to spell. He has since learned that Charley does not read by sounding out words (like most of us do), but by memorizing the shapes of words (somewhat analogous to how people learn Mandarin or hieroglyphics). Charley reads as we recognize people - we can usually instantly recall the faces of people we know but probably can’t draw them. Chris uses this powerful personal example to drive home the point that there isn’t even one right way to read!
Mulling over his reflections in a more professional realm, I was reminded that most data scientists follow a generic three-step process to solve “big data” problems:
Identify the problem
Develop a solution
Communicate the solution
Within this process, companies hiring data scientists usually arrive with firm ideas about what they want done (i.e. step 1), and data scientists end up spending most of their time on step 2.
But Einstein is famously credited with saying, “If I had an hour to solve a problem I'd spend 55 minutes thinking about the problem and 5 minutes thinking about solutions.”
Eric Weber, a data science thought leader who has held roles at StitchFix, Yelp, LinkedIn and CoreLogic after an academic career as an assistant professor, tends to agree. In the #DataTalk podcast he says, “They (i.e. companies) know they want them (i.e. data scientists) to be able to organize and pull data. In some cases be a glorified analyst; in other cases build more advanced machine learning models to do particular tasks. But the important part there is the company is so familiar with their data … Well, in some cases they are … that they already have the set of tasks they want to accomplish. In a lot of ways they don’t know what they don’t know. They’re missing out on what’s possible.”
He goes on to say that the true value a data scientist might bring to the table is coming up with different ways to even frame the problem before attempting to solve it: “Of course from a business side, if you’re in a C-level or something else, you don’t dig into the data enough day to day. You don’t dig into the predictions enough day to day to understand how it could be different. But the data scientists can. That’s where they add a lot of value — not just in building models and pulling data, but in trying to actually produce innovation for the business and innovative ideas.”
3. As problems get more complex and ambiguous, sound instincts and experience often matter more than simply crunching the numbers - making those who possess those skills “non-fungible people”.
Venture capitalist Fred Wilson recently penned a post about NFPs - i.e. non-fungible people. He discusses how most team members are “fungible”, but there are always a few “non-fungible people” and retaining these uniquely skilled NFPs is incredibly important to the long-term success of a business.
My experience is that, more often than not, the NFPs are the team members with an instinctive knack for detecting patterns - often subconsciously - that other people either overlook or mistake for random noise.
Take, for example, the decision to divest a business or to launch a new product. Needless to say, the “foot soldiers” in an organization are deployed to build sophisticated analyses so that the decision makers can pore over models, research, and predictions. But even after all of that, there are often still huge uncertainties and plenty of known unknowns. Knowing that there will never be enough information to make a purely data-driven decision, the leader making the call often relies on “gut feel” after weighing the analysis - and that is how they earn their pay.
I think the ability to make these gut calls comes down to two things, which together put you on an increasingly steady path to becoming an NFP:
Superior pattern recognition skills based on rich experiences: The more we experience, the more we hone our intuition. Scientists describe the human brain as a “predictive processing framework”: it constantly compares incoming information and experiences against stored knowledge and memories of previous experiences, and predicts what will come next. Intuition, or gut feel, is a product of this same processing. Intuitions occur when your brain registers a significant match or mismatch between those prior models (built from past experiences) and the current experience, automatically and at a subconscious level.
This is what helps seasoned venture capitalists determine whether a start-up will succeed, school admissions officers predict which students will thrive, and doctors make diagnoses in complex and ambiguous situations.
Cross-indexing from different domains: What takes these pattern recognition skills (and subsequently our intuitive ability) from good to great is the ability to see these patterns across different fields.
I saw this up close and personal in my first job, when I worked with a Member of Parliament who, in former lives, had been the CEO of Unilever in India and a scientist by training. He would, for example, intuitively carry lessons from Unilever’s Shakti program - in which rural women ran parts of the company’s supply chain, turning greater profits while lifting themselves out of poverty - over to new service concepts targeting rural poultry farmers in the development sector. He also firmly believed that in general management, people with diverse backgrounds are, all other things being equal, likely to be more valuable and to learn faster, because they are more effective at recognizing (and cross-indexing) patterns from their varied life experiences.
***
However, despite this ode to the power of intuitive thinking, it is also important to be keenly aware of the cognitive biases that this type of thinking can fall prey to. Seeing patterns where none exist, remembering only the times we didn’t trust our gut and should have (and not the reverse), or taking unnecessary risks to recover a prior loss are just a few of the ways our intuition can very easily lead us down a slippery slope.
I strongly believe in the power of intuitive decision making coupled with self-checking and continual feedback. I definitely do not squirm anymore when I sometimes cite intuitive thinking as part of a rigorous decision making framework. But I do find myself pausing to reflect on the underlying reasons for those intuitions - before factoring them in.
(Hustle Fuel represents my own personal views. I am speaking for myself and not on behalf of my employer, Microsoft Corporation.)