"Seeing" Meta Competency

Our ability to “see” – our ability to make meaning – is one leadership meta competency. It resides in the domain of cognition, including logic, knowledge, expertise, information, data, discernment and judgement. Externally and collectively, at the organizational level, a strong seeing meta competency is demonstrated in a shared, compelling vision and/or a collective commitment to a winning strategy – thus a clear sense of purpose, direction, and how to get there. At the team level, it shows up as a clear team mandate, such that the team is clear about its purpose and objectives. At the individual level, it manifests as vision, purpose, and goals. All of these are a function of what we notice and the meaning we make of it.

What we see is fundamental to how we lead. Leaders are often compelled by their particular vision or insight – they see something differently, or they may notice more. 17 Desirable leadership qualities such as the ability to simplify complexity, cull the essential from the non-essential, perceive risk and opportunity, exercise good judgement, and make good decisions all stem from the depth and breadth of what a leader is able to see. The seeing meta competency includes content expertise (e.g., industry knowledge, process knowledge, etc.) and the pattern recognition, born of experience, that fuels deep understanding and sparks novel connections. Discovery, invention, and innovation all stem from the ability to see a different possibility.

How It Operates – How We “See” or Make Meaning

Consider our internal mechanisms for meaning making. At any given moment, we have access to millions of pieces of data. Not only can we perceive the physical data available through our five senses, but we can also perceive somatic, emotional, relational, linguistic, narrative, cognitive, and metaphysical information.

While 11 million pieces of data are available to our minds at any given moment, we are typically only consciously aware of 40. 18 Our analytical mind is designed and trained to select only the data that seem important and ignore the rest. We give those data meaning, and we use those data and meanings to make assumptions, draw conclusions, take action, and form beliefs. Chris Argyris depicted this way our minds work with his famous Ladder of Inference framework. Through this framework, he illustrates how our minds “climb the Ladder of Inference” millions of times a day, sometimes so fast that we are not even aware it is happening, often treating assumptions like fact and unaware of the steps we took (and skipped) to move from data to conclusion and action. The framework also shows how our pre-existing beliefs influence which data seem important to us – and thus which we notice and which we ignore. Because of this confirmation bias, 19 we are more likely to notice data that support our pre-existing beliefs and to dismiss, or never even notice, data that disconfirm them.

Throughout our lives, as we continuously “climb the Ladder of Inference,” we develop a set of beliefs about how the world works and what the “rules of the game” are. This set of beliefs, our mental models, is the lens through which we see and interpret the world. Our mental models are the sum total of everything we have ever learned and experienced. Yet, like any model, a mental model represents a slice of reality; it cannot capture the whole. MIT scientist Jay Forrester says, “The image of the world around us, which we carry in our head, is just a model. Nobody in his head imagines all the world, government or country. He has only selected concepts, and relationships between them, and uses those to represent the real system.” Peter Senge popularized this idea of mental models in his 1990 breakout management book The Fifth Discipline: The Art and Practice of the Learning Organization. Yet the idea had been around for centuries. Senge writes, “What we carry in our heads are images, assumptions and stories. Philosophers have discussed mental models for centuries, going back at least to Plato’s parable of the cave.” 20

Our mental models are influenced by our culture, family of origin, education, gender, race, ethnicity, and the historical context in which we have lived. 21 22 Yet the beliefs and assumptions that constitute our mental models are often implicit – we can take them for granted as Truth. We can assume others see the world similarly, and we can be completely unaware that we are looking through this filtering lens at all – a lens that influences how we see, do, connect, and be.

The Ladder of Inference can raise our awareness of some traps of our analytic mind. These traps fall into two categories: errors of logic (we make faulty assumptions, linkages, or leaps) and errors of bias (we notice some things and ignore others, based on our pre-existing narratives, beliefs, and mental models). Starting from the bottom rung and moving up the ladder, consider some common ways we can get into trouble:

• We assume that we have access to all the data and so we do not inquire about what we might be missing or what others know that we do not. Recall the fable of the blind men touching an elephant. Each reports very differently “what an elephant is like” (i.e., a rope, wall, hose, or fan – depending on whether he is touching the tail, side, trunk, or ear). None is wrong, but none reports on the totality of the elephant, either.

• Because of how we are thinking/feeling (e.g., our mental models, our frames, our assumptions, etc.), we unwittingly only notice some data and not others. We can be unaware of data that is right in front of us, because we do not deem it to be important or significant. The “Invisible Gorilla” video by Christopher Chabris and Daniel Simons is an excellent demonstration of this dynamic. 23

• We may see the same data, but interpret it wildly differently. Rubin’s vase, the image that can be interpreted (correctly!) as either a vase or two faces, demonstrates how the exact same data can be interpreted very differently. 24

• We make assumptions without even being aware that we have made them – and then treat these assumptions like fact.

• We only notice data that supports our pre-existing beliefs. This predisposition makes it important that we proactively seek disconfirming data. For example, research shows that we are less likely to notice the mistakes of employees we deem “high performers” and more likely to notice mistakes of those we deem “low performers,” thus showing the power of early impressions.

• We may use labels (sometimes called “fat words”) that can easily mean very different things to different people, but assume others know and share our intended meaning (e.g., are “client management skills” more like “client relationship management skills,” or “business development skills,” or both, or something else altogether?).

• We are unaware of how our own mental models limit or color our perspective and assume that others see the world the same way we do. We assume our mental model reflects “Truth” with a capital “T” and that our way of seeing the world is universally accepted (i.e., “That’s the way the world works”).

Daniel Kahneman and Amos Tversky introduced the idea of cognitive biases and their impact on decision making in 1974, and in doing so fathered the field of behavioral economics. 25 (Their research and ideas were recognized when Kahneman was awarded the Nobel Prize in economics in 2002.) Since then, the field of behavioral economics has become more mainstream. Many leaders today realize how biases (e.g., Confirmation Bias, Anchoring, Loss Aversion, Overconfidence, Excessive Optimism, etc.) can negatively affect business decision-making. 26 27 28 29

Kahneman depicts our mind as the uneasy relationship between two systems: the automatic, fast “System 1” and the slow, effortful “System 2.” System 1 operates quickly and effortlessly, and is prone to the traps illuminated by the Ladder of Inference, and more. (Kahneman identified 48 heuristics that System 1 falls prey to when making judgements based on “gut reaction.”) System 2 is slower and more deliberate and, with purposeful effort, is able to see and side-step these traps. Kahneman writes, “The way to block errors that originate in System 1 is to recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2.”

These meaning-making mechanisms operate not only at the individual level but also at the collective level. Groups of people carry shared images, assumptions and stories that act as a screening mechanism for what they see. This dynamic is observable in all kinds of groups – from organizations and industries to religious and ethnic groups. Take the example of the scientific community. Albert Einstein once wrote, “Our theories determine what we measure.” Thomas Kuhn writes in The Structure of Scientific Revolutions,

…when paradigms change, the world itself changes with them. Led by a new paradigm, scientists adopt new instruments and look in new places. Even more important, during revolutions scientists see new and different things when looking with familiar instruments in places they have looked before. 30

Peter Senge speaks to this collective meaning-making at the organizational level as a critical leverage point for organizational learning. He writes,

The problem with mental models lies not in whether they are right or wrong – by definition, all models are simplifications. The problems with mental models arise when the models are tacit – when they exist below the level of awareness. The Detroit automakers didn’t say, ‘We have a mental model that all people care about is styling.’ They said, ‘All people care about is styling.’ Because they remained unaware of their mental models, the models remained unexamined. Because they were unexamined, the models remained unchanged. As the world changed, a gap widened between Detroit’s mental models and reality, leading to increasingly counterproductive actions. 31

How to Develop This Meta Competency (“See More”)

In order to “see more,” we first must become aware that we have a particular lens through which we see the world and then hone our awareness of that lens. Doug Silsbee beautifully describes this capacity to “see more”:

We can increase our inherent capacity to notice, to witness, to observe our inner life, our relationships, and our context with an increasingly refined set of distinctions. These distinctions live in us as part of our accumulated cultural and experiential history. As we develop throughout our lifetime, the body of distinctions that is available to us richens and deepens. We are able to witness more and more.

Curiosity – about what we may not be noticing, about plausible alternative interpretations, about how we may be inadvertently contributing to undesirable situations – is the fuel for “seeing more.” Curiosity helps us expand what we notice and open our minds to a multitude of interpretations.

Curiosity causes leaders to ask questions. Action Design 32 has long taught leaders to “balance high quality advocacy and inquiry” as part of its Productive Interactions curriculum, based on the work of Chris Argyris. Asking high quality questions is the textbook approach to escaping the traps illustrated by the Ladder of Inference. Jennifer Garvey Berger encourages leaders to “ask different questions” (not just more questions or better questions) as a way to invite illuminating (and often surprising) data, insights or paradigm shifts. 33 James Ryan, Dean of the Harvard Graduate School of Education, wrote, “Questions are like keys. The right question, asked at the right time, will open a door to something you don’t yet know, something you haven’t yet realized, or something you haven’t even considered – about yourself and others.” 34 Economist Noreena Hertz wrote in the New York Times,

It is crucial to ask probing questions not only of experts, but of ourselves. All of us show bias when it comes to what information we take in. We typically focus on anything that agrees with the outcome we want. We need to acknowledge our tendency to incorrectly process challenging news and actively push ourselves to hear the bad as well as the good. When we find data that supports our hopes, we appear to get a dopamine rush similar to the one we get if we eat chocolate, have sex or fall in love. But it’s often information that challenges our existing opinions or wishful desires that yields the greatest insights. 35

Being curious and asking questions does not imply a lack of confidence in our own ideas. It is important for leadership to have strong conviction; at the same time, it is valuable to remain open to other ideas or data that leadership may not have considered or had access to. Operating from a stance of “strong ideas, loosely held,” 36 whereby leadership expresses strong ideas while remaining open to being influenced by others, is a powerful way to balance curiosity and conviction.

Awareness of how our mind works and its limits is necessary, but not sufficient. Awareness of one’s own propensity for bias, while an important first step, does not solve the problem. Biases sit in our “blind spot” and thus, by their very nature, are resistant to change. (For example, almost all of us – including drivers laid up in the hospital for traffic accidents they themselves caused – rate ourselves in the top 20% of drivers.) 37 38 At the same time, reducing bias matters. A McKinsey study of more than 1,000 major business investments showed that when organizations worked at reducing the effect of bias in their decision-making processes, they achieved returns up to seven percentage points higher. 39 40 41 42

What else can leadership do to confront biases and limit their impact? Creating broad awareness of cognitive biases enables colleagues to spot them in themselves and in others, and to build safeguards into some formal decision-making processes. Most importantly, inviting healthy, rigorous debate and productive conflict (with a stance of “strong ideas, loosely held”) in diverse groups sets a stage where colleagues feel encouraged to raise concerns and share differing perspectives.

Furthermore, it helps to remember that we have both more and less control over our thoughts than we think. While our subconscious has access to vastly more data than our conscious mind (11,000,000 to 40) and our cognitive processes are subject to many errors, we can also train our minds to direct our attention. Using mindfulness practices, we can learn to “exert executive control of attention,” 43 and thereby purposefully focus on some things and ignore others. We can increase our active awareness of dynamics that have been automatic, and thus increase our awareness of a broader range of options and our ability to purposefully choose among them. This control is particularly valuable when we are emotionally hijacked and our automatic reactions are more likely to kick in. Additionally, the more we build and reinforce the neural pathways of desirable reactions through practice, the more likely we are to be able to enact those behaviors in high-stakes situations when our subconscious is leading the charge.

Summary of Key Points: The “Seeing” Meta Competency

Leadership is propelled (for better and worse) by how it perceives reality, and thus the seeing meta competency is paramount to leadership. The seeing meta competency:

• Manifests in direction, vision, strategy, agenda, goals and purpose – establishing “where we are going,” “why,” and “how.”

• Is the domain of cognition and the home of logic, knowledge, expertise, information, data, discernment and judgement. It is rooted in how humans make meaning – what we notice, how we interpret what we notice, and how that drives our conclusions, beliefs, mental models, and actions – which in turn shape what we notice in the future.

• Is vulnerable to the weaknesses inherent to how the human analytic mind works, including errors of logic and errors due to bias.

• Operates at the individual and collective levels.

Approaches to “seeing more” can be both cultivated at the individual level and institutionalized at the team or organizational levels. Leadership with a well-developed seeing meta competency:

• Has a wide, deep, and textured lens through which it senses and interprets the world, as well as an awareness of that lens and its limits.

• Is curious -- about what we may not be noticing, about plausible alternative interpretations, about how we may be inadvertently contributing to undesirable situations.

• Operates with “strong ideas, loosely held,” simultaneously operating with conviction and an openness to be influenced by others.

• Maintains alertness and openness to feedback.

• Understands their own and others’ propensity for bias.

• Encourages respectful, rigorous debate in which diverse perspectives are invited and included.

• Practices executive control of attention to learn to slow down automatic thought processes, intervene to disrupt habitual reactions, and choose from a broader set of more effective strategies.

© 2021 Carolyn Volpe Cunningham