Double Standards, (Mis)Aligned Incentives and What To Do About It
This excerpt from Think Twice got me thinking about incentives and the double standards between our personal and professional, or internal and external views.
“Dr. Katrina Firlik, a neurosurgeon, shared an example: at a conference dealing with spine surgery, a surgeon presented the case of a female patient with a herniated disc in her neck and pain that was caused by a pinched nerve. She had already failed typical conservative treatments such as physical therapy, medication, and waiting it out.
The surgeon asked the audience to vote on a couple of choices for surgery. The first was the newer anterior approach, where the surgeon removes the entire disc, replaces it with a bone plug, and fuses the discs. The vast majority of the hands shot up. The second choice was the older posterior approach, where the surgeon removes only the portion of the disc that is compressing the nerve. No fusion is required because the procedure leaves most of the disc intact. Only a few audience members raised their hands.
The speaker then asked the audience, which was almost entirely male, ‘What if this patient is your wife?’ The show of hands was reversed for the same two choices. The main reason is that the amount surgeons are paid for the newer and more complicated procedure is typically several times what they’d receive for the older procedure.”
This story tells us a lot about human behavior. It’s a textbook example of a double standard, which occurs when “a rule or principle is unfairly applied in different ways to different people or groups.” It is also an example of misaligned incentives — doctors earning more money doing a riskier treatment (see the principal-agent problem). These two issues tend to go hand in hand.
Double standard — doctors perform a risky procedure on patients that they would be unwilling to perform on their own family.
Misaligned incentives — doctors earn more doing expensive (and often riskier) procedures like spinal fusion. The patient suffers.
This raises the question: are doctors evil?
Not so fast. The author is quick to assert that doctors are willing to perform spinal fusion because they can earn more. I would argue that there is more at play here than meets the eye.
I believe doctors care about their patients. They also care about professional success. While not mutually exclusive, it can lead to conflicts of interest when incentives are not aligned.
One possibility is that doctors prefer to do a more expensive and invasive procedure because the system incentivizes them to do so (more on that below). This doesn’t absolve them of bad behavior.
A second (and perhaps more representative) possibility is that doctors are unaware of their contradictory behavior until they are faced with a personal choice (i.e. choosing a treatment for their wives).
I would argue that we often don’t realize we are applying a double standard until it becomes personal. In this case, the doctors were forced to reconsider the more invasive procedure (i.e. spinal fusion) because it involved a family member.
It’s not only doctors that change their minds. Consider the following example.
When Statistics Don’t Matter
Disease X affects a small percentage of the population and the survival rate is low. The only available treatment does little to improve the chances of survival but it got FDA approval because it was shown to help in the early stages of the disease. It is extremely expensive — over $250k per year — and has serious side effects.
In principle, you believe that patients with disease X should not receive treatment. The healthcare system would be better served allocating those resources towards R&D or finding cures for diseases with higher probabilities of success.
Then a loved one is diagnosed with disease X. The doctor asks for your consent to approve the treatment. You agree that, without a doubt, your loved one should receive treatment.
So what happened here?
It’s Personal
Our macro view on healthcare policy quickly disintegrates in the face of personal misfortune. We’ve made an exception to our own healthcare policy. A double standard: if your loved one receives treatment, then so should everyone else.
In both examples, double standards are at play. It would be unfair and unethical to perform a spinal procedure on patients that you would hesitate to perform on your loved ones. Similarly, it would be unfair to deny other patients a treatment that you would demand for a family member.
It’s when things get personal that our principles are put to the test.
Principle #1 — Provide for my family.
Principle #2 — Provide the best care to my patients.
Principle #1 and #2 are not mutually exclusive. In fact, well-designed systems are those that understand the tension between the different stakeholders’ incentives and create a system of checks and balances. In other words, they incentivize doctors to optimize both principles #1 and #2 (i.e. tie reimbursement to patient outcomes). Poorly designed systems like the US healthcare system create a misalignment of incentives.
The System is Broken
The US spends nearly twice as much on healthcare as other high-income countries, yet has poorer population health outcomes (see here). It is a broken triangle between patients, providers, and payers.
Patients care about getting treated. They typically don’t know the price of the procedure, and they don’t bother asking as long as their insurance covers it.
Providers (i.e. doctors, hospitals) deliver care to patients. Most are also profit-motivated and compete with other providers to attract patients by offering the best treatments. There’s nothing wrong with that per se. The US is the world’s leader in drug discovery and biotech innovation. But when given the choice between two procedures, providers will likely choose the more expensive one as long as they get reimbursed (principle #1 above).
Payers (i.e. insurance companies) reimburse providers. They earn the difference between what patients pay them in insurance policies and the amount they pay out to providers.
You would think payers would put pressure on providers to offer cheaper but effective alternatives, but it’s hard for them to do so when there’s a lack of standardized procedures that would allow for a proper comparison of price and quality. Without proper oversight and transparent pricing, the system starts to break down. For a more detailed account of the healthcare system, see The Healing of America by T.R. Reid and Reinventing American Health Care by Ezekiel Emanuel.
As patients, we rarely ask for a second opinion or whether there is a less invasive alternative. We rarely get an itemized bill or an explanation of costs before a procedure. We don’t care. We want the best care, and we want it now. We deserve it. That’s why we pay for insurance. The rest of the healthcare system can wait.
Awareness Issue
Most of the time we act in line with our principles and values. A double standard means we deviate from them and discriminate based on the situation. But the key issue illustrated in both examples above is that most of the time we are not aware this is happening until we are confronted with a personal dilemma. Whether the participants are acting in their own best interest or outright unethically is hard to know for sure.
However, the broader issue at hand is systemic. It’s an issue of misaligned incentives. The healthcare system has many flaws, and perhaps the most critical one is the way reimbursement works. It is tied to the number of procedures, not to the outcome of patient care.
Not Just Healthcare
This same dynamic occurs in finance. Brokers want to sell as many mortgages as possible because they are compensated on volume. There’s a constant tension between the bank’s underwriting risk committee (which cares about not losing money) and its brokers (who want to earn as much commission as possible).
Survival of the Fittest
It’s not just doctors and bankers who apply double standards. In situations of stress or conflict, we may be forced to prioritize one set of principles over another, which can give rise to a double standard. Consciously or unconsciously, we say one thing and do another. In the face of adversity, our survival instincts kick into high gear and we tend to act in our own best interest. We prioritize our wellbeing over others’. It’s instinctual. Quite simply, we do as airlines instruct us: put on our own oxygen masks before assisting other passengers.
However, this can lead to the justification of bad behavior (see the Milgram experiment and self-justification). At the heart of the issue is this: not enough skin in the game.
More Skin in the Game
The first step is to recognize our vulnerabilities and tendencies to behave inconsistently. The second step is to constantly remind ourselves of our principles and hold ourselves accountable to them. No exceptions. The third step is to design and implement systems that prevent this sort of behavior from happening by aligning incentives between stakeholders. More skin in the game. (For more, see Skin in the Game by Nassim Taleb.)
Healthcare — reimburse doctors based on patient outcomes, not on the number of procedures.
Banking — compensate bankers based on the default rate, not on the number of loans they sell.
Silicon Valley does this well. Startup employees earn a big piece of their compensation in equity, so they have a vested interest in the long-term success of the company.
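To make the banking example concrete, here is a minimal sketch of how the compensation scheme flips a broker's incentives. All the numbers (commission, penalty, default rates) are hypothetical, chosen only for illustration:

```python
# Hypothetical sketch: how a compensation scheme changes a broker's incentives.
# All figures are made up for illustration.

def volume_based_pay(loans_sold, commission_per_loan=1_000):
    """Pay per loan sold, regardless of how the loans later perform."""
    return loans_sold * commission_per_loan

def outcome_based_pay(loans_sold, default_rate,
                      commission_per_loan=1_000, penalty_per_default=5_000):
    """Same commission, but clawed back when loans default."""
    defaults = loans_sold * default_rate
    return loans_sold * commission_per_loan - defaults * penalty_per_default

# A broker choosing between 100 risky loans (20% default)
# and 60 carefully underwritten loans (2% default):
print(volume_based_pay(100), volume_based_pay(60))                # 100000 60000
print(outcome_based_pay(100, 0.20), outcome_based_pay(60, 0.02))  # 0.0 54000.0
```

Under volume-based pay, the risky book always wins; once defaults claw back commission, the safer book pays more, so the broker's interest lines up with the bank's.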
Finally, how do we spot double standards and (mis)aligned incentives in our own lives?
To that end, here is a practical set of questions to help uncover true motives in our daily interactions. Use them next time you’re receiving any sort of advice not just from doctors but also from lawyers, consultants, bankers, and even friends:
If you were in my shoes, would you do as you say?
Do you get any sort of benefit or compensation if I follow your advice?
Are there alternatives? If yes, why choose this over the others?
If you were to explain to your [parents/kids/significant others] why you gave me this advice, what would you say?
Do you think this advice is an overreaction to the situation?
What kind of new information would make you change your mind?
Do we have more time to delay making a decision?
I am not immune to these cognitive biases. But being aware that I am vulnerable helps me spot situations where they are likely to arise and take measures to prevent them. A helpful exercise is to turn the issue on its head by making it personal and seeing if you change your mind.
There’s nothing quite like putting our principles to the test. When the rubber meets the road…only then will we know.