
Middle Ground Essay: Thinking About People


Our thoughts and feelings guide our social interactions

Much of the thinking we do is focused on the people we meet and interact with. Scientists studying social cognition—how we think about people—have identified various ways our minds work to process social information and make sense of our interactions with others. These ways of thinking often involve making judgments about people and social events and can shape how we feel and act toward others. As long as our judgments are accurate, this isn’t a problem, but certain mental habits can cause us to get people wrong or make unwise decisions, to the detriment of those around us—and ourselves.

How the mind works

Two systems of thinking

We live in a complex world; our senses are barraged with sounds, sights, and other stimuli. The internet only adds to this torrent of information. While our brains do a good job processing all this, we have limited attention, and thinking takes effort. To help us process information efficiently, the human mind has two systems: a fast, automatic mode that operates outside of awareness and can save time or thinking effort (especially when we’re in a hurry or have to make decisions quickly), and a slower, more deliberate system that helps us think more deeply about people and complex situations.

In many social situations, events unfold rapidly, one on top of the other, and we don’t have time to ponder why people do what they do, or to learn much about the people we meet. In such situations the automatic thinking system takes over and can help us make good enough—but often inexact—judgments without spending a lot of mental effort. Scientists have described a variety of mental strategies as part of the automatic thinking system. Below you can learn more about some of these thought habits and why they play an important role in social life.

Mental shortcuts

Categorizing and stereotypes

People automatically put things in categories. We sort almost everything into groups: cars, trees, animals, rocks—and people. Putting things in boxes helps us make quick judgments, but when we do this with people, it can lead to stereotyping. To stereotype someone is to automatically categorize them as a member of a particular group. That’s not necessarily a problem. There is often at least a kernel of truth to stereotypes, and when we’re right about the people we’re categorizing, stereotypes can help us process information quickly and respond in useful ways. Imagine you have lunch plans with your elderly relative. If you think that older people are forgetful, it might be good to call ahead and remind your relative of the date.

When stereotypes are inaccurate, however, we deny other people’s individuality. This can be offensive or hurtful—for example, if your elderly relative actually has an excellent memory and never forgets a date, your calling ahead might make them feel disrespected. Stereotypes can be even more problematic, forming the basis for prejudice (valuing groups differently from one another) and discrimination (treating groups differently based on prejudice).

One way to minimize stereotyping is to take a step back and ask what we really know about a person before we put them into a box. We can challenge our existing stereotypes by getting to know group members as individuals. It also helps to think about our own feelings when other people stereotype us, and to try viewing the world from the perspective of those we might stereotype.

Judging behavior (attribution errors)

As we try to make sense of other people’s behavior and save on mental effort, we’re apt to make the easiest assumption: a person acts that way because of who they are. It’s relatively quick and easy to assign someone’s behavior to their character. (“That young person taking a handicapped space looks fine—they must be a jerk!”) We tend to disregard the role of other factors—those unrelated to someone’s character or personality—because it takes more mental effort and time to imagine what else might explain the behavior. (Maybe they have a disability that is not visible, or they just didn’t see the sign.)

Psychologists call this the fundamental attribution error: the tendency to overestimate the influence of people’s character on their behavior and underestimate the role of context. It causes us to overlook the effect that situations have on people, and to believe people have more control over the world than they actually might. Being aware of such mental errors is important, because they can affect how we interact with others—for example, determining whether we scorn someone or empathize with them.

This error in judging individuals can also affect how we think about people’s ability to succeed or fail economically. It can lead to a faith in upward social mobility that is contradicted by actual economic conditions. A person who firmly believes that anyone can become successful by “pulling themselves up by the bootstraps” might fail to consider that some people start life with strikes against them. With such a point of view, they may discount the effort it takes some people just to catch up, financially and socially, with others who've had an easier start.  

Verifying what you believe (confirmation bias)

When we are trying to evaluate a claim or statement, it’s easiest to examine only evidence that is consistent with what we already believe or think—selective searching cuts way down on mental effort. If we do this, we fall prey to confirmation bias: seeking out information that supports ideas we already hold and ignoring contradictory information. This common and often automatic bias also operates in the social realm—for example, if you’ve been told that someone has a bad temper, you’ll be on the lookout for signs of anger from them. It affects many other parts of our lives, too, even something as innocuous as, say, deciding whether to eat more blueberries. If you start out believing blueberries are good for you, you’ll be less likely to look for news about their health risks.

The bias really kicks in when we care about the outcome—when we want something to be true (or false) because it reinforces our identity or values. In that case, we may be highly motivated to search for evidence that supports what we believe. What’s more, we may deliberately discount or reject information, or avoid sources that contradict our cherished beliefs. But when we do this, the opinion we form isn’t truly independent. To avoid confirmation bias, it’s important to search not only for information that supports our beliefs, but also for high-quality information that might contradict them.

Modern technology can inadvertently strengthen confirmation bias: apps and online sites that remember our searches and search preferences often provide us with information that confirms what we believe, before we even get a chance to do a balanced search. Receiving only such targeted information can keep us in our comfort zone.


Estimating events (heuristics)

People use rules of thumb to make fast estimates and decisions about uncertain events. Called heuristics, these mental shortcuts, like others, can be useful but can also lead us astray.


How does a mental shortcut color your judgment of risk?

Try this: rank the following causes of death according to the likelihood that a U.S. citizen will die from them—accidents, cancer, chronic lower respiratory diseases, diabetes, heart disease, homicide, and suicide.

The actual order, from most to least likely:

  • 1. Heart Disease
  • 2. Cancer
  • 3. Chronic Lower Respiratory Diseases
  • 4. Accidents
  • 5. Diabetes
  • 6. Suicide
  • 7. Homicide
When judging the likelihood of something happening, we tend to estimate it based on how quickly we can think of examples. If instances come easily to mind, we’re likely to think the event is more common than it really is. Scientists call this the availability heuristic. The tendency can give us some idea how likely an event might be, but it can also create a biased assessment of reality. Say, for example, you want to see a show in a nearby neighborhood, but the location has been in the news recently with stories about crime and drug-related incidents. These stories come quickly to your mind, affecting your decision to visit. Judging just from the news reports, you might overestimate the risk and unnecessarily fear the place and its people. This mental shortcut can also cause us to overlook different or unexpected information that might counter our biases and help us form more balanced judgments. 

When searching for information, dig beyond what comes easily.

Slower thinking

What can we do to combat biases, judge others more accurately, or form more balanced opinions? Try some of these science-backed tips, most of which require us to override our automatic thinking mode and shift into a more deliberate, slow-thinking mode.

  • Burst your bubble and seek out information that contradicts your beliefs to combat confirmation bias. You can do this by deliberately seeking out high-quality sources that you might normally avoid, or with the help of apps or websites that cover issues from across the political spectrum.
  • Practice playing devil’s advocate and come up with rival points of view or imagine outcomes that are counter to what you believe. Doing this can help you consider arguments from multiple sides of an issue, combat confirmation bias, and correct flawed judgments about others.
  • Educate yourself. Awareness of specific biases can decrease your tendency to fall prey to them, at least to some extent.


Quick vs. slow thinking

When you see others behaving badly, your immediate reaction might be to judge the person’s character (“Really? So rude!”). You think they’re strange, or lazy, or rude.

But maybe there’s a reason for their actions. If you take a moment, you might imagine a situation where that person’s behavior makes sense: “Wow, that person’s in a rush. Maybe there’s an emergency.”

Further Reading

Kellogg School of Management at Northwestern University (2018). “How Closely Do Our Beliefs About Social Mobility Match Reality?: The Answer Differs Between Americans and Europeans, and Between Liberals and Conservatives.” (article)

Kellogg School of Management at Northwestern University (2017). “The Psychology Behind Fake News: Cognitive Biases Help Explain Our Polarized Media Climate.” (article)

Daniel Kahneman (2013). Thinking, Fast and Slow. (book)

C. G. Lord, M. R. Lepper, and E. Preston (1984). “Considering the Opposite: A Corrective Strategy for Social Judgment.” Journal of Personality and Social Psychology 47(6): 1231–1243. (abstract)

Claude M. Steele (2010). Whistling Vivaldi and Other Clues to How Stereotypes Affect Us. (book)



National Science Foundation
This material is based upon work supported by the National Science Foundation under Grant No. 1713638. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.