Make It Easier to Cut Your Losses

December 19, 2010 by Heidi Grant

(From Fast Company)

Sometimes, we don’t know when to throw in the towel.   As a project unfolds, it becomes clear that things aren’t working out as planned, that it will cost too much or take too long, or that a rival company will beat you to the punch.  But instead of moving on to new opportunities, we all too often simply stay the course.

Company leaders continue to allocate manpower and money to projects long after it has become clear that they are failing, digging a deeper hole rather than trying to climb their way out of it. (Remember how long it took to get rid of New Coke?)

The costs to the company, in terms of both resources and lost opportunities, can be enormous.  For the leader who refuses to see reason, it can be career-ending.  We recognize this foolishness immediately in others, but that doesn’t stop us from making the same mistake ourselves.  Why?

There are several powerful and largely unconscious psychological forces at work here.   We may throw good money after bad because we haven’t come up with an alternative, or because we don’t want to admit to our colleagues, or ourselves, that we were wrong.   But the most likely culprit is our overwhelming aversion to sunk costs.

Sunk costs are the resources that you’ve put into an endeavor that you can’t get back out.   Once you’ve realized that you won’t succeed, it shouldn’t matter how much time and effort you’ve already spent on something.  A bad idea is a bad idea, no matter how much money you’ve already thrown at it.

The problem is that it doesn’t feel that way.  Putting in a lot only to end up with nothing to show for it is just too awful for most of us to seriously consider.  We worry far too much about what we’ll lose if we just move on, and not nearly enough about the costs of not moving on  – more wasted resources, and more missed opportunities.

Companies have developed ways of dealing with this problem, but they usually involve extensive external monitoring of decision-making that is both costly and labor-intensive.  Thanks to recent research by Northwestern University psychologists Daniel Molden and Chin Ming Hui, however, there is a far simpler and less expensive way to make sure you are making the best decisions when a project goes awry: focus on what you have to gain, rather than what you have to lose.

As I’ve written about before, psychologists call this adopting a promotion focus. When we think about our goals in terms of potential gains, we automatically (often without realizing it) become more comfortable with making mistakes and accepting the losses we may have to incur along the way.  When we adopt a prevention focus, on the other hand, and think about our goals in terms of what we could lose if we don’t succeed, we become much more sensitive to sunk costs.

For example, in one of their studies, Molden and Hui put participants into either a promotion or prevention mindset by having them spend five minutes writing about their “personal hopes and aspirations” (promotion) or “duties and obligations” (prevention).  They also included a control group with no manipulation of mindset.

Next, each participant was told to imagine that he or she was president of an aviation company that had committed $10 million to developing a “radar-blank” plane.  With the project near completion and $9 million already spent, a rival company announces the availability of their own radar-blank plane, which is both superior in performance and lower in cost.  The question put to participants was simple – do you invest the remaining $1 million and finish your company’s (inferior and more expensive) plane, or cut your losses and move on?

Molden and Hui found that participants who had been put in a prevention mindset  (focused on avoiding loss) stayed the course and invested the remaining $1 million roughly 80% of the time.  The control group, included to provide a sense of how people would respond without any changes to their mindset, was virtually identical to the prevention group.  This suggests that when a project is failing and sunk costs are high, most of us naturally become prevention-minded, and more likely to try to keep waging a losing battle.

The odds of making that mistake were significantly reduced by adopting a promotion mindset (focused on potential gain) – those participants invested the remaining $1 million less than 60% of the time.*

When we see our goals in terms of going for a win, rather than avoiding a failure, we are more likely to see a doomed project for what it is, and try to make the most of a bad situation.

It’s not difficult to achieve greater clarity if you make a deliberate effort to refocus yourself prior to making your decision.  Stop and reflect on what you have to gain by cutting your losses now – the opportunities for progress and innovation.  If you do, you’ll find it much easier to make the right choice.

*Why not a bigger drop? Good question.  Remember that promotion focus was manipulated very indirectly through a totally unrelated writing task.  If you adopt a promotion focus directly with respect to the decision itself, considering what you could gain by moving on from your failure, the effects should be even stronger.

Why Creative People Get Kept Out of the Driver’s Seat

December 12, 2010 by Heidi Grant

From my Fast Company blog:

Two candidates are being interviewed for a leadership position in your company.  Both have strong resumes, but while one seems to be bursting with new and daring ideas, the other comes across as decidedly less creative (though clearly still a smart cookie).  Who gets the job?  And who should?

The answer to the question of who gets the leadership job is usually the less creative candidate.  This fact may or may not surprise you – you may have seen it happen many times before.  You may have even been the creative candidate who got the shaft.  But what you’re probably wondering is, why?

After all, it’s quite clear who should be getting the job.   Creativity – the ability to generate new and innovative solutions to problems – is obviously an important attribute for any successful business leader.  Research shows that leaders who are more creative are in fact better able to effect positive change in their organizations, and are better at inspiring others to follow their lead.

And yet, according to recent research, there is good reason to believe that the people with the most creativity aren’t making it to the top of business organizations, because of a process that occurs (on a completely unconscious level) in the mind of everyone who has ever evaluated an applicant for a leadership position.

The problem, put simply, is this: our idea of what a prototypical “creative person” is like is completely at odds with our idea of a prototypical  “effective leader.”

Creativity is associated with nonconformity, unorthodoxy, and unconventionality.  It conjures visions of the artist, the musician, the misunderstood poet.   In other words, not the sort of people you usually put in charge of large organizations. Effective leaders, it would seem, should provide order, rather than tossing it out the window.

Unconsciously, we assume that someone who is creative can’t be a good leader, and as a result, any evidence of creativity can diminish a candidate’s perceived leadership potential.

In one study conducted by organizational psychologists Jennifer Mueller, Jack Goncalo, and Dishan Kamdar, 55 employees rated the responses of nearly 300 of their (unidentified) coworkers to a problem-solving task for both creativity (the extent to which their ideas were novel and useful) and as evidence of leadership potential.  They found that creativity and leadership potential were strongly negatively correlated – the more creative the response, the less effective a leader the responder appeared.

In a second study, participants were told to generate an answer to the question “What could airlines do to obtain more revenue from passengers?” and give a 10-minute pitch to an evaluator.

Half the participants were asked to give creative answers (both novel and useful, e.g. “offer in-flight gambling with other passengers”), while the other half were told to give useful but non-novel answers (e.g., “charge for in-flight meals.”) The evaluators, unaware of the different instructions, rated participants who gave creative answers as having significantly less leadership ability.

Even though it is a quality that is much-admired, there is a very clear unconscious bias against creativity when it comes to deciding who gets to be in the driver’s seat.  Organizations may inadvertently place people in leadership positions who lack creativity and will only preserve the status quo, believing they are picking people with clear leadership potential.

The good news is, the bias can be wiped out – in fact, reversed – if evaluators have a charismatic leader (i.e., someone known for their uniqueness and individualism, like a Steve Jobs, Richard Branson, or Carly Fiorina) rather than an effective but non-charismatic leader in mind.  In the airline-revenue study, when evaluators were asked to list five qualities of a “charismatic leader” prior to the idea pitch, the participants with creative solutions were instead perceived as having the most leadership potential.

Taking the time to remind yourself (or, if you are the applicant, your interviewer) that creativity is essential to effective leadership, rather than at odds with it, is the key to making sure your company has the very best people behind the wheel.

When You Benefit From Being Underestimated, and When You Pay For It

November 1, 2010 by Heidi Grant

From my Fast Company Blog:

There have been times in my life when I felt that, because I’m female, I have been treated unfairly in the workplace – times when I was passed over for leadership positions, or less trusted with responsibilities that are traditionally given to men.   Then again, I’ve also felt at times that I’ve benefitted from low expectations – particularly when handling something women aren’t supposed to do well. (Like the time when diagnosing and repairing a simple computer glitch suddenly rendered me a “computer whiz” around the office.  Come on, people.)

If you are a member of a group that is stereotyped as less competent, then you are no doubt well aware that stereotypes do in fact influence how your coworkers and supervisors see you.  What you may not have realized is that their influence can work for or against you, depending on the type of evaluation you are receiving.

Psychologists who study the way human beings make judgments distinguish between using minimum standards (enough to make you suspect something is true) and confirmatory standards (enough to make you certain that something is true).

Imagine you are trying to figure out whether or not Steve is a dishonest guy.  Minimum standards of dishonesty would probably be met the first time you catch Steve in a lie – you would start to suspect that Steve can’t be trusted, but you wouldn’t be sure. After all, everybody lies from time to time.  To meet confirmatory standards, however, you’d probably have to catch Steve in a number of lies – enough to conclude that he is more than usually deceptive.

Stereotypes affect both our minimum and confirmatory standards for a given trait, but in opposite directions.  For example, part of the stereotype for women, particularly in the business world, is that they are less competent than men.  Studies show that because of this stereotype, minimum standards of competence for women are lower than they are for men. In other words, you are quicker to suspect that a woman is smart than you are to suspect that a man is.  That’s because when a woman does something “smart” it stands out more, since it is (unfortunately) more surprising.  When it comes to minimum standards of competence, women seem to benefit from being underestimated.

Unfortunately, the reverse is true for confirmatory standards, which are higher for women where competence is concerned.  So in order for someone to be certain that a woman is smart, she needs to provide more evidence of competence than a man would.  In other words, a woman has to be consistently, demonstrably smart just to prove she isn’t actually stupid.

These differing standards have real world consequences.  In one study, female candidates for a job were more likely to be placed on a short list than males (evidence for the lower minimal standard of competence), but less likely than male candidates to actually be hired (evidence for the higher confirmatory standard of competence). In another study, White law school applicants with weak credentials were judged more positively than Black applicants with identical credentials (further evidence of the higher confirmatory standard for a stereotyped group).

So stereotyped people (women, minorities) will have an easier time than their White male counterparts when minimum standards are used to judge them, and a harder time when confirmatory standards are used.  But what determines which standards are used?

In a recent set of studies, researchers found that the set of standards that gets used is often determined by the formality of the evaluation.  A formal record or log (like an end-of-the-year review) invokes the confirmatory standard, while informal evaluation and personal note-taking (like the kind of feedback your boss gives you at a weekly meeting) invokes the minimum standard.

The researchers asked each participant in the studies to review information about a company trainee with a spotty performance record (i.e., he or she had lost a file on a client, missed an important deadline, and forgotten a scheduled appointment with a client, among other things).  The participants were asked to either “take informal notes” that would be for purely personal use, or to keep a “formal employment log” that would become a part of the employee’s permanent record.

They found that participants were more likely to record negative behaviors in their personal notes for White males than for women or Black males, but less likely to do so for White males in their formal notes.  In other words, judges noticed and recorded fewer negative behaviors for the groups stereotyped as incompetent (women and Blacks) when using minimal standards in the informal evaluation, but noticed and recorded more of the same behaviors when using the confirmatory standards of the formal evaluation.

At the end of both evaluations, participants were asked if the trainee should be kept on at the company or terminated.  Not surprisingly, White males were more likely to be recommended for termination when evaluated informally, and less likely to be fired when evaluated formally.

The participants in these studies weren’t overt racists or sexists – in fact, they weren’t even aware that they were evaluating employees differently because of their race or gender.  Like much of today’s workplace bias, its influence occurred at an unconscious level, perpetrated by otherwise decent and fair-minded people.  But even if its workings are intangible, the results of bias are anything but.  When different standards are unknowingly used, people end up being more likely to be hired or fired because of their gender or race, and that is unacceptable.

The good news is, unconscious bias loses much of its power once we recognize that it exists.  Once we become aware that we are apt to use different standards to evaluate people doing the same job, and once we understand when we are likely to be a little too lenient, or a little too critical, we can adjust accordingly.  Probe your own thinking for bias – ask yourself, would I come to the same conclusion about this employee’s behavior if she were a he, and if he were White?  Chances are you can make fair decisions, once you realize how and why you might make unfair ones.

What Makes You (and Me) Act Like a Jerk

October 24, 2010 by Heidi Grant

Lessons from Good Boss, Bad Boss

I recently finished fellow PT blogger Robert Sutton’s excellent new book, Good Boss, Bad Boss.  In it, he describes not only what the best (and worst) bosses do, but why they do it, identifying the essential beliefs that form the foundation of effective (and ineffective) management.

It struck me again and again as I was reading that so much of the advice Sutton offers on how to be a good boss can also be applied to the universal challenges of being a good and happy person.   I think one of my favorite chapters, “Squelch Your Inner Bosshole,” is a perfect illustration of what I mean.

In it, Sutton points to some of the forces that turn otherwise decent human beings into rotten bosses.  We would be wise to remember that these forces are often present in the lives of non-bosses as well – who among us hasn’t been a real jerk on occasion?  The good news is, if you can identify the triggers of your unpleasant behavior, and become aware of their influence on you, you too can effectively squelch your inner a**hole.

Here are some of the triggers of bad boss behavior Sutton highlights:

1.     “Power Poisoning.”

Sure, power sometimes corrupts.  But more often, it just turns us into jerks.  Studies show that when people are given power, they become less tuned in to other people’s feelings and needs, paying less attention to what others say and do.  With power, our language and behavior become more insulting and inappropriate, and we become more self-absorbed, focusing more on our own personal gain than on what is best for the group.

It’s not just bosses who experience the nasty side effects of power.  Have you ever been in a relationship with someone who was just a bit too needy and insecure?  Were you surprised to find how cold, selfish, or downright cruel you became in response?  When friends or romantic partners give us all the power, when we find ourselves with too much “hand,” it can lead to pretty callous behavior.

2.     “Extreme Performance Pressure.”

Being under time pressure, or knowing that a lot is riding on what we’re doing, makes all of us less sensitive to the needs and feelings of others.  We’re so busy thinking about what could go wrong, and worrying about our own performance, that it creates a kind of tunnel vision.  Feeling anxious makes you irritable – this is why you come home from work after a hard day and yell at your spouse, your kids, or your dog.

3.     “Sleep Deprivation, Heat, and Other Bodily Sources of Bad Moods.”

Sutton points out that a lack of sleep, or uncomfortable temperatures, can disrupt our ability to make good, rational decisions, because tiredness and heat make us irritable and impatient.  Poor nutrition and illness can also leave you feeling unusually jerky.

(Interestingly, do you know what doesn’t predict mood?  Day of the week – people aren’t actually reliably happier on Friday and more depressed on Monday.  So if you’re acting like a jerk on a Monday, find something else to blame.)

4.     “Nasty Role-Models” and “A**hole Infected Workplaces”

Throughout Good Boss, Bad Boss, Sutton emphasizes the enormous power of social influence.  We emulate the people around us, often unconsciously.  And as he writes, “emotions are remarkably contagious.”  Anxiety, cynicism, selfishness, and negativity rub off.  So if you are surrounded by cranky jerks, you just might begin to behave that way yourself without realizing it.

Sutton’s solution to the trigger problem is a good one  – make sure you have people in your life you can trust to tell you when you are acting like a jerk.  Give them explicit permission to do so, and make sure you really listen and react without defensiveness.

Then take a good hard look at how you’re acting and ask yourself if that’s really the person you want to be.    If not, start looking around for the trigger.  Is power going to your head?  Are you under too much pressure?  Are you hanging around too many jerks?

If you’re not happy with your own behavior, renew your commitment to noticing and respecting the needs and feelings of the other people in your life.  And if you need one, take a nap.

Is Your Willpower Running Low? Only If You Believe It Is.

October 11, 2010 by Heidi Grant

A great deal of recent research (some of which I’ve written about in this blog) suggests that our capacity for self-control is much like a muscle.  Its strength varies from person to person, and also from moment to moment, depending on how recently and how hard it’s had to work.  (Think about how your legs can feel like jelly after a long run, and you get the idea.)

Just as our muscle strength is inherently limited, so too are our reserves of willpower.  Thus, self-control is often at its weakest immediately after we’ve had to use it – an effect demonstrated in dozens of published studies, and obvious to anyone who has ever succumbed to the urge to drink, smoke, or eat a whole pint of ice cream at the end of a very stressful day.

But what if you happened to be someone who believed that engaging in difficult tasks was energizing, rather than depleting?  What if you were convinced that using your willpower activates resources, rather than drains them?  What would happen?

You’d be right!  Thanks to a new set of studies by Veronika Job, Carol Dweck, and Gregory Walton, it’s become clear that people’s beliefs about the nature of self-control determine whether or not it is depleted by use.

The researchers distinguished between people who believed that willpower is a limited resource and those who believed it is non-limited, and found that only those who held the limited-resource theory had less self-control (i.e., made lots of mistakes) after working on something very difficult.

How can this be? Both groups were equally exhausted by the difficult task, so you might think they would be equally mistake-prone.  But it turns out that our theories about self-control determine how exhaustion affects us.

When people who hold the limited-resource view experience something as exhausting, they have less self-control and are more prone to errors because they see exhaustion as a sign to reduce effort, in order to rest and eventually replenish their self-control reserves.  In contrast, those with the non-limited resource view continue to put in effort despite their exhaustion, and make fewer errors because of it.

These beliefs, not surprisingly, predict how people handle the more stressful and demanding periods in their lives.  For instance, the researchers found that during the more stressful, exam-filled weeks of the academic semester, belief in the limited-resource theory of self-control predicted greater consumption of unhealthy junk food, more procrastination, and less effective study habits among college students.  Those who believed in limitless willpower, on the other hand, held up under stress just fine.

So, is self-control limited, or isn’t it?  The answer has become a lot less clear, and frankly, I’m no longer sure it matters.  What does matter is whether or not you believe that it’s limited.    And since you have some choice when it comes to your beliefs, I recommend going with the limitless willpower view.  Maybe in the end, all it takes to put down that pint of ice cream at the end of the day is believing that you actually can.

The Cure for Loneliness

October 1, 2010 by Heidi Grant

The world grows ever smaller, more connected, more crowded, and ironically, increasingly lonely for many of us.  This is a problem with a whole host of unhappy consequences, not just for the individuals who experience it, but for society as a whole.

It’s important to point out before I go any further that loneliness is not the same thing as being a private person, or a “loner,” because some of us actually both need and enjoy a lot of time to ourselves.  Loneliness, instead, refers to the difference between the amount of social contact and intimacy you have and the amount you want.  It’s about feeling isolated, like an outcast.

(That said, the opposite of loneliness isn’t popularity either – you can have dozens of “friends” and still feel lonely.  True intimacy and feelings of relatedness are much more about the quality of your relationships than the quantity.)

Persistent loneliness is not only emotionally painful, but can be more damaging to our physical and mental health than many psychiatric illnesses.  For instance, lonely people sleep poorly, experience severe depression and anxiety, have reduced immune and cardiovascular functioning, and exhibit signs of early cognitive decline that grow more severe over time.

Not surprisingly, psychologists have created dozens of interventions designed to try to tackle this epidemic of loneliness.  The approaches taken are varied, but can be broken up, roughly speaking, into four different categories.

There are interventions aimed at:

Improving social skills. Some researchers argue that loneliness is primarily the result of lacking the interpersonal skills required to create and maintain relationships.  Typically, these interventions involve teaching people how to be less socially awkward – to engage in conversation, speak on the phone, give and take compliments, grow comfortable with periods of silence, and communicate in positive ways non-verbally.

Enhancing social support.  Many lonely people are victims of changing circumstances. These approaches offer professional help and counseling for the bereaved, elderly people who have been relocated, and children of divorce.

Increasing opportunities for social interaction. With this approach, the logic is simple:  If people are lonely, give them opportunities to meet other people.  This type of intervention, therefore, focuses on creating such opportunities through organized group activities.

Changing maladaptive thinking.  This approach might seem surprising, and its rationale less obvious than the other approaches.  But recent research reveals that over time, chronic loneliness makes us increasingly sensitive to, and on the lookout for, rejection and hostility.  In ambiguous social situations, lonely people immediately think the worst.  For instance, if coworker Bob seems more quiet and distant than usual lately, a lonely person is likely to assume that he’s done something to offend Bob, or that Bob is intentionally giving him the cold shoulder.

Lonely people pay more attention to negative social information (like disagreement or criticism). They remember more of the negative things that happened during an encounter with another person, and fewer positive things.

All this leads, as you might imagine, to more negative expectations about future interactions with others – lonely people don’t expect things to go well for them, and consequently, they often don’t.

Interventions aimed at changing this self-fulfilling pattern of thinking begin by teaching people to identify negative thoughts when they occur.  Whenever they feel anxious about a social encounter, find themselves focusing on everything that went wrong, or wondering if they’ve made a bad impression, a red flag is raised.

Next, they learn to treat these negative thoughts as testable hypotheses rather than fact.  They consider other possibilities – maybe everything will go smoothly, maybe it wasn’t all bad, perhaps everyone liked me after all.  They practice trying to see things from the perspective of others, and interpret their actions more benignly.

Take the case of Bob the Distant Coworker.  With thought retraining, lonely people learn to ask themselves questions like “Am I sure Bob doesn’t like me?  Could there be other, more likely reasons for his quiet, reserved behavior at work?  Could he simply be preoccupied with some problem?  I know sometimes I get quiet and distracted when something is bothering me.  Maybe Bob’s behavior has nothing to do with me!”

Once the negative thoughts are banished, lonely people can approach new relationships with a positive, optimistic outlook, see the best in others, and learn to feel more confident about themselves.

With four approaches to curing loneliness, the obvious question is:  What works?  Thanks to a recent meta-analysis of 50 different loneliness interventions, the answer is clear.  Interventions aimed at changing maladaptive thinking patterns were, on average, four times more effective than other interventions in reducing loneliness.  (In fact, the other three approaches weren’t particularly effective at all.)

It turns out that fundamentally, long-term loneliness isn’t about being awkward, or the victim of circumstance, or lacking opportunities to meet people.  Each can be the reason for relatively short-term loneliness – anyone who has ever moved to a new town or a new school and had to start building a network of friends from scratch certainly knows what it’s like to be lonely.   But this kind of loneliness needn’t last long, and new relationships usually are formed… unless you’ve fallen into a way of thinking that keeps relationships from forming.

More than anything else, the cure for persistent loneliness lies in breaking the negative cycle of thinking that created it in the first place.

Follow me on Twitter @hghalvorson
