Do you remember me telling you the story of my earliest signs of patient empowerment? That story was about the day I decided, at the feisty age of five, to fight back against our family doctor during his house call to our little bungalow on Pleasant Avenue. My totally out-of-character childhood revolt was launched when I overheard Dr. Zaritsky tell my mother that he’d have to give me a needle to fix what was ailing me – and that he’d have to pull down my pajama bottoms to aim the needle just so into my bare bum. But I was having none of it, as I described here:
“I wept. I screamed. I struggled. I tried to run away from him. I think I may have even punched Dr. Zaritsky right in the stomach – until I finally ended up exhausted, sobbing and humiliated, face-down on the chesterfield, essentially calf-roped into submission by two exasperated adults.
“In hindsight, I’m indeed amazed that I actually somehow found it within my (very sick) little 5-year-old spunky self to try to fight off a great big doctor who, in our home, was a man second only to Pope Pius XII in terms of authority and reverence.”
To speak or act so disrespectfully to the wonderful Dr. Zaritsky – or to any physician – would have been inexcusably bad behaviour in my family. But meanwhile, in a foreign country far, far away from Pleasant Avenue (i.e. in the United States of America), the stirrings of an even bigger patient revolution were simmering.
As researchers Drs. Robert Blendon and John Benson wrote much later in their landmark history of U.S. health care from 1945-2000, society was very different back then.(1)
While I was busy trying to run away from Dr. Zaritsky, there was not yet any Medicare, Medicaid, or War on Poverty health care program for our neighbours to the south. Launched a decade later, these were introduced into a far different America than the one you know today:
“There was less widespread citizen distrust of government then, a greater public willingness to have government regulate the private sector, and less public concern with the level of federal taxation.”
It didn’t take long, however, for that “citizen distrust” to grow.
In her interesting work on the history of drug advertising(2), the University of Pittsburgh’s Dr. Julie Donohue writes that the American public began losing trust in their health care system starting in the late 1960s, mirroring the general weakening of trust in authority figures and institutions. This of course was also the volatile era in the United States that included the civil rights movement, Vietnam War protests, environmentalism, and the growing voices of feminism and gay rights, among many other anti-establishment cultural phenomena.
In 1966, for example, three-quarters of the American public surveyed had a great deal of confidence in leaders in medicine, but by 1990, less than one-quarter did.
(Ironically, Blendon and Benson found that this decline in public confidence in medical leaders had NOT affected Americans’ high level of respect for specific practicing physicians, who have consistently been among the highest-ranked professionals. Unfortunately, they add, there appeared to be no comparably positive long-term trend for health care institutions.)
Few issues foreshadowed this change in public confidence as clearly as the debate over self-medication:
Federal regulation of drugs began in 1906, when only a few effective drugs were even on the market. People usually chose their medications themselves. There were two classes of drugs available:
- “ethical” drugs (prescribed only by physicians; just a few of these are still considered today to be effective: digitalis, morphine, quinine, aspirin, ether, for example)
- patent or proprietary drugs (widely available; made of unknown ingredients under trademarked names like Lydia Pinkham’s Vegetable Compound, Hamlin’s Wizard Oil, Kick-a-poo Indian Sagwa, Warner’s Safe Cure for Diabetes; the main ingredient of these salves and tonics was water, plus in some products addictive substances such as alcohol or opium; ads for patent drugs – complete with wildly exaggerated claims – accounted for roughly half of the newspaper industry’s advertising income at the turn of the last century)
But the American Medical Association regarded self-medication with the latter products as a threat to the medical profession.
As Donohue explains:
“By categorizing drugs advertised solely to physicians as ‘ethical’, the AMA created an incentive for pharmaceutical companies to focus their promotional efforts on physicians.
“At the heart of these efforts were the goals of reducing self-treatment and encouraging deference to professional medical judgment.
“The fact that pharmaceutical companies did not advertise prescription drugs to the public was consistent with the prevailing perception that patients were unable to make medical decisions on their own.”
After World War II, laypersons deferred to professional judgment to treat medical conditions that they previously would have treated by themselves.
Social scientists writing about health care in the 1950s and 1960s supported this assumption that patients had little need for involvement in their health care decisions. For example:
- sociologists described patients as “dependent: needing and expecting to be taken care of by stronger, more ‘adequate’ persons”. (Parsons and Fox 1952).
- economists portrayed the consumer of health care as “uninformed and reliant on physicians to act as agents on his or her behalf” because of the asymmetry of information between physicians and their patients. During this period, “it was a common and accepted practice for physicians to withhold basic information from patients about their diagnosis and treatment.” (Arrow 1963)
In 1951, the U.S. Congress passed regulatory amendments which created a definition of prescription drugs to include “those that were not safe for use except under the supervision of a practitioner licensed by law to administer such drug[s].”
Sales of prescription drugs exploded.
By 1959, overall drug sales to U.S. consumers had risen from $300 million to $2.3 billion, with prescription drugs accounting for all but approximately $4 million of that increase. According to Donohue:
“By 1969, prescription drugs made up 83 percent of all consumer spending on pharmaceuticals.
“Self-medication, which in the early twentieth century was widespread and viewed as a ‘sacred right,’ now took a back seat to the pharmacological treatments guided by physicians after World War II.
“Once drugs were made available only through a physician’s prescription, the pharmaceutical companies stopped advertising directly to patients and instead channeled all their promotions to health professionals.
“By the 1960s, more than 90 percent of the pharmaceutical companies’ spending on marketing was aimed at doctors (with the rest targeting pharmacists and hospitals) – a complete reversal of the pattern thirty years earlier.”
Meanwhile, the 1960s introduced massive changes to the practice of medicine up here north of the U.S. border as well.
In Canada, our prairie province of Saskatchewan was heading for a provincial election in 1960, and there was only one major campaign issue: the introduction of universal government medical insurance (the country’s first province to attempt this).
Physicians were split in their support of the concept. Doctors who favoured this new system were isolated and ostracized by their angry colleagues, who mounted what Canadian Dimension described as a “ferocious propaganda campaign” against universal health care.(3) The campaign was fronted by the local College of Physicians and Surgeons, with the support of both the Canadian Medical Association and the American Medical Association. For example:
“The local medical hierarchy in 1960 took much of their advice from outsiders and adopted tactics which had proved successful in many similar campaigns in the United States.
“They amassed $100,000 for propaganda purposes, a tremendous sum in 1960 and far more than any party would spend in a Saskatchewan provincial election. Every household received printed anti-universal health care propaganda, and advertisements flooded the radio and newspapers. Public meetings were held throughout the province and were addressed by prominent doctors and supporters.
“The crudeness of the propaganda appears to have been based on the assumption that the Saskatchewan electorate was as unsophisticated as their American counterparts.”
Throughout this campaign, there were denunciations of socialism, communism, and (worst insult of all) “socialized medicine”. Some of the propaganda warned that many doctors would leave the province in protest if this campaign succeeded, and they’d be replaced by inferior foreign practitioners, described like this:
“They’ll have to fill up the profession with the garbage of Europe. Some of the European doctors who come out here are so bad, we wonder if they have ever practiced medicine.”
According to Canadian Dimension, the effectiveness of this anti-Medicare campaign can be judged by the results of the June 8th election in 1960. Tommy Douglas (forever after revered throughout Canada as the “father of Medicare”) and his CCF party won 37 of the 54 seats in the provincial legislature. (Many Saskatchewan doctors walked out in protest, fearing their incomes would be greatly reduced, but the strike collapsed after three weeks in the face of an overwhelmingly unsympathetic public response.)
Nationally, the writing was on the wall. The National Medical Care Insurance Act was passed in the Canadian Parliament on December 8, 1966 by an overwhelming vote of 177 to 2. See also: Why You Should Have Your Heart Attack in Canada
Few people in either Canada or the U.S. could have guessed then that, within a few years, pharmaceutical companies would once again start advertising directly to patients.
Drug giant Pfizer, for example, launched a public relations campaign in the early 1980s called Partners in Health Care to increase awareness of medical conditions such as diabetes, angina, arthritis, and hypertension. But as Dr. Donohue tells the story:
“Although these ads didn’t mention any drugs by name, they prominently displayed Pfizer’s name in the hope that consumers who visited their doctors might ask for one of the manufacturer’s products for those conditions.
“By the early 1990s, drug manufacturers were allowed by regulators to run reminder ads and help-seeking ads for the general public that mentioned either a disease or the name of a drug, but not both at the same time.”
She adds that early concerns were expressed by senior FDA staff about the public’s ability to evaluate “the risk/benefit ratio of a drug”, and that physicians overwhelmingly objected, fearing that such advertising had the potential to undermine physician authority. Even pharmaceutical executives argued at first that “DTCA would hurt the doctor-patient relationship and confuse an unsophisticated public.”
But the final shift came about largely due to the issue discussed near the beginning of this post: the general weakening of trust in authority figures and institutions.
Coupled with a new consumer interest in partnering with physicians in medical decision-making, this era introduced significant changes in patient rights – ranging from the introduction of compulsory informed consent prior to medical procedures to the growing number of disease-specific patient advocacy groups.
Donohue reminds us that, as early as 1983, Dr. Jerome Kassirer, who would later become editor-in-chief of the New England Journal of Medicine, warned his colleagues:(4)
“Physicians must set aside their image of themselves as making life and death decisions alone, and undertake instead the less glamorous and more time-consuming process of exploring optimal outcomes with the patient.”
During the 1990s, several pharmaceutical manufacturers started to use paid advertising in its most visible and direct form to bypass physicians entirely and promote prescription drugs directly to consumers. Some ads were related to the introduction of “lifestyle” drugs for which no market even existed yet (like Upjohn’s Rogaine drug for hair loss).
These were the forerunners of what’s now known as Direct To Consumer advertising (DTCA – those “Ask Your Doctor” ads we love to hate – illegal worldwide, by the way, except in two countries: New Zealand and the United States of America).
Donohue summarizes the history of DTCA by quoting Dr. Janet Woodcock of the FDA’s Center for Drug Evaluation and Research in a presentation from September 2003 (5):
“It was not until the time of HIV and cancer activism in the late 1980s that the concept of patient empowerment really took hold.
“Several other forces were also active at that time. First, outcomes researchers had shown that patient values and preferences could drive the choice of appropriate treatment. Second, the rise of managed care in many forms led patients to believe that the health care system could not always be completely relied upon to act in their best interest at all times.
“These forces have resulted in a shift in the general societal perception of who needs what information – and of the dynamics of medical decision-making in general.”
(1) Robert J. Blendon, John M. Benson. Americans’ Views On Health Policy: A Fifty-Year Historical Perspective. Health Affairs. March 2001; 20(2): 33–46.
(2) Julie Donohue. A History of Drug Advertising: The Evolving Roles of Consumers and Consumer Protection. Milbank Quarterly. 2006 Dec; 84(4): 659–699.
(3) Lorne Brown, Doug Taylor. The Limits of Medicine in a Sick Society. Canadian Dimension. July/August 2012.
(4) Kassirer JP. Adding Insult to Injury: Usurping Patients’ Prerogatives. New England Journal of Medicine. 1983;308:898–901.
(5) Woodcock J. Director, Center for Drug Evaluation and Research, Food and Drug Administration. Speech at the FDA Public Meeting on Direct-to-Consumer Promotion; September 22–23, 2003; Washington, D.C.
Q: Are the changes we’ve seen in healthcare relationships just part of a natural evolution of authority?
Image: Thanks to Prawny