Monoculture: How One Story Is Changing Everything
Author: F.S. Michaels
Public libraries embodied something called library faith: the belief that books change lives. Library faith represented a foundational belief in “the virtue of the printed word, the reading of which is good in itself, and upon which many basic values in our civilization rest. When culture is in question,” said political scientist Oliver Garceau, “the knowledge of books, the amount of reading, and the possession of a library — all become measures of value, not only of the individual but also of the community.”
Public libraries also once shaped people’s reading tastes, “improving” people through books. In the 1940s and 50s, librarians argued about whether “light” fiction should be allowed in the library because fiction was considered entertainment, something with little or no educational value, and libraries were supposed to be edifying.
Libraries took it upon themselves to help people become informed and thoughtful citizens by keeping the knowledge and values needed for democratic society in circulation.
In 1852, the trustees of the Boston Public Library — the first library in America to be supported by public taxes — stated, “…it is of paramount importance that the means of general information should be so diffused that the largest possible number of persons should be induced to read and understand questions going down to the very foundations of social order, which are constantly presenting themselves, and which we, as people, are constantly required to decide, and do decide, whether ignorantly or wisely.”
The United Nations called the public library “a living force for education, culture and information” and “an essential agent for the fostering of peace and spiritual welfare through the minds of men and women.”
In short, the public library didn’t just contribute to the public good; it was the public good. We invested in it with our tax dollars because we believed our society was better off when our citizens were literate and educated. The library was the people’s university, the great equalizer in society — the place where you could access books and learn for free regardless of your income.
As a public good, libraries existed outside the boundaries of the market.
Libraries preserved the human record within the limits of their resources, protecting and transmitting that record for future generations.
They embodied intellectual freedom, the idea that you should be able to think and believe what you want. Because of that belief in intellectual freedom, diverse views — even those that were “unorthodox, unpopular, or considered dangerous by the majority” — were deemed to be in the public interest, and the library became a place where you could find alternative and competing points of view on a given issue.
In practice, intellectual freedom meant important but controversial books were put out on the shelf instead of banned or burned. Information about who was reading what was kept confidential, even from law enforcement, and everybody who used the library had equal access to the information they needed regardless of religion, ethnicity, gender, age, or economic status.
The library created information resources that the market wouldn’t, because the private sector had no reason to invest in knowledge that didn’t make money, and knowledge that is unorthodox, unpopular, or considered dangerous often isn’t profitable.
When the economic story spreads through your community and into the public library, library services become understood as a market, and what goes on in markets starts happening at the library. Information is transformed from a social good that helps to develop informed citizens into something to buy and sell and profit from. The library becomes an information business in the information services industry and starts to focus on what businesses focus on: customer service, cutting costs, efficiency, and productivity.
Librarians become information specialists who just happen to work in libraries. Chief Librarians become CEOs. Library patrons become customers, and libraries start gathering information about customer needs and wants through market research. Libraries become worth supporting not because they are a public good, but because they respond to customer needs.
By 1980, public libraries were focused not on prescribing what patrons ought to read, but on being responsive to customers by giving them what they wanted.
What some customers wanted, of course, was to ban books, creating a conflict of interest between customer responsiveness and the library’s historical dedication to intellectual freedom.
As the economic story spreads into libraries, economic language spreads too. As one prominent librarian said, “Every time a dollar changes hands [at the library] there has been a business transaction. We establish a mission based on our values, we plan strategically and allocate resources accordingly, we engage competent and capable staff to make our products and services available, we monitor and adjust depending on our market’s needs and desires. These are all business activities.”
The profession started to ask itself, “What If You Ran Your Library Like a Bookstore?” and branch libraries in East London were renamed “Idea Stores.”
To be sure, many libraries embraced these shifts because of cuts in government funding. In 2010, American libraries from coast to coast — including the venerable Boston Public Library — again found themselves struggling to cope with city budget deficits in an attempt to avoid branch closures, staff layoffs, and reduced hours and services. In earlier struggles for shrinking government dollars, many libraries had already adopted business strategies to address the shortfall. Libraries slowly became a place to make money, and a place for corporations to promote themselves and sell their products and services.
When the library comes to think of itself as a business, it starts being discussed in terms of return on investment. The economic story says libraries should make money by developing their own revenue streams and opening bookshops, gift shops, and coffee shops. Libraries should also introduce user fees and charge for library cards. In the economic story, you are an individual, and as an individual, if you benefit from something like the public library personally, you should pay for that benefit personally. Though fees for library cards were controversial when they were first introduced because they flew in the face of the library principle of equal access to information, user fees now typically represent 10 to 15 percent of the average library budget.
In the oil-rich Canadian province of Alberta, library user fees were introduced after government cutbacks in the 1980s. Even when the province became solvent and debt-free, posting multi-billion dollar surpluses and enjoying a reputation as the wealthiest province in Canada, the fees stayed. In the capital city of Edmonton, after user fees were introduced, library enrolment and circulation dropped significantly and had not recovered ten years later.
The smaller center of Banff, Alberta, chose to axe its library user fee; library membership soared 40 percent that year.
Although many libraries allow people to ask for the fee to be waived if it’s unaffordable, as one librarian said, “[As] someone who grew up in a poor family, I feel that asking people for proof of their poverty humiliates them. (Surely being poor is humiliation enough without having to identify yourself as such to get ‘special treatment’ in what I feel is our most democratic institution — the public library.)”
In the economic story, libraries are encouraged to raise funds by selling named space to individual or corporate donors. This was already happening in libraries to some extent; library buildings were being named after donors. The Carnegie libraries were named for steel magnate Andrew Carnegie, who financed more than 2,500 libraries around the world. What is different now, though, is that library parts are for sale.
Naming opportunities include the circulation desk, individual meeting rooms, study rooms, window reading nooks, reading benches, and the picture book collection. That kind of private sponsorship, though it brings in revenue, also creates a vicious cycle; companies get a tax write-off for their donations, which means less corporate tax ends up in city coffers. With less money available in public funds, libraries typically find themselves on the chopping block again, making them even more dependent on private sector funding.
In the economic story, the neutral public space that the public library once represented doesn’t stay neutral. Prior to the Vancouver 2010 Olympics, public libraries in Vancouver were asked to make sure the brands of sponsoring corporations were given exclusive play at library functions. A leaked internal memo read, “Do not have Pepsi or Dairy Queen sponsor your event…Coke and McDonald’s are the Olympic sponsors. If you are planning a kids’ event and approaching sponsors, approach McDonald’s and not another well-known fast-food outlet.” Libraries were also advised to try to meet official sponsors’ brand requirements. If only Sony equipment were available in the library, for example, instead of equipment made by official sponsor Panasonic? “I would get some tape and put it over the ‘Sony,’” the Vancouver Public Library’s manager of marketing and communications was quoted as saying, “Just a little piece of tape.”
The economic story interprets the “public” in public library in a new way. It says the management of public libraries ought to be outsourced to the private sector, which is more efficient and effective. Book-buying for the library is outsourced to corporations. Critics worry that outsourcing the development of the library collection is akin to corporations deciding which books are available to you in the library at all, and question whether books that challenge the status quo or that criticize business itself will find their way onto the shelves.
Even so, in 1997, the city of Riverside, California, became the first documented library system to outsource the operation of its 25 library branches to a private company called Library Systems & Services, LLC (LSSI). Critics say the company runs libraries for less than cities can by hiring fewer trained librarians, and by paying lower salaries and offering fewer benefits to employees.
Yet by 2010, LSSI was America’s fifth-largest library system, having “taken over public libraries in ailing cities in California, Oregon, Tennessee and Texas.” In late 2010, LSSI won its first contract to run libraries in the financially healthy city of Santa Clarita, California; the $4 million deal was described as “a chance for the company to demonstrate that a dose of private management can be good for communities, whatever their financial situation.”
After all, in the economic story, the public sector and the private sector are no longer distinct areas of activity that ought to be managed differently. In the economic story, the public sector and the private sector are the same sector: private.
To the extent that economic thinking is based on the market, it takes the sacredness out of life, because there can be nothing sacred in something that has a price. Not surprisingly, therefore, if economic thinking pervades the whole of society, even simple non-economic values like beauty, health, or cleanliness can survive only if they prove to be ‘economic.’
—E.F. SCHUMACHER
You in the West have the spiritually poorest of the poor much more than you have the physically poor. Often among the rich are very spiritually poor people. I find it is easy to give a plate of rice to a hungry person, to furnish a bed to a person who has no bed, but to console or to remove the bitterness, anger, and loneliness that comes from being spiritually deprived, that takes a long time.
—MOTHER TERESA
FOR HUNDREDS OF YEARS, the human value at the center of medicine was health, says biomedical ethicist Daniel Callahan, “the integrated well-being of mind and body” — the healing of the sick, the compassionate relief of suffering. Doctors were expected to act in the best interests of their patients. Plato wrote, “The physician, as such, studies only the patient’s interest, not his own…The business of the physician, in the strict sense, is not to make money for himself, but to exercise his power over the patient’s body…All that he says and does will be said and done with a view to what is good and proper for the subject for whom he practices his art.”
Even so, doctors weren’t always respected. In Roman times, doctors were decidedly low in status: they were slaves, freedmen, or foreigners. Until as late as 1745, surgeons were considered craftspeople who belonged to the same guild as barbers since both worked with their hands. A medical journal of the time remarked that when a promising young man chose to become a doctor, “the feeling among the majority of his cultivated friends is that he has thrown himself away.”
In the 1800s in England, doctors hovered around the edges of the gentry, trying to look and act like the upper class since professional success was about having the right aristocratic patrons and displaying the right social graces. In America, the aristocracy didn’t exist, so medical schools and societies were launched, often by doctors themselves, to bolster the status of the profession. At the same time, legislation was enacted that controlled who could and couldn’t open a medical practice.
As a result, in the 1800s, being a doctor was a hard way to make a living. Americans were wary of medical authority. Doctors didn’t have stores of medical knowledge or techniques to pull from, and most families, isolated in rural areas with low incomes, could only afford to call a doctor if the situation was desperate. Doctors charged for mileage on top of the fee for a medical visit, and five to ten miles of travel meant the travel fee could be four or five times as high as the visitation fee. They ended up working long hours and traveling long distances to see patients. The image we still have today of the dedicated, selfless doctor comes from that era of medicine.
During the Industrial Revolution, work that used to be done at home started moving into the factories, making it harder for family members to care for the sick at home. As steamboats and railways were built, cities began to develop. Better mobility meant that family members were more spread out than they had been, so they weren’t always available to care for the sick. As cities grew, property values also started to rise, and many families could only afford to live in apartments, which left less space at home to care for the sick. More people were also living alone in cities, which meant the need for hospitals was growing along with the demand for doctors. At one time, few people used hospitals voluntarily because of the risk of infection; hospitals were more about charity than medical expertise and most were run by religious orders where nuns, doctors and nurses volunteered their time to care for the sick. You went to the hospital to die, or when you didn’t have family or friends to care for you. If you were sick, you were simply safer at home.
At the same time, doctors were also becoming more mobile. The invention of the telephone meant patients could call the doctor instead of sending for him, and the invention of the car meant doctors could reach patients faster; doctors were among the first car buyers. As doctors began to travel farther and faster, they saw more patients, increasing from an average of five to seven patients a day in the mid-1800s, to 18 to 22 patients a day by the early 1940s. As travel costs went down, medical care became more affordable. Doctors became more accessible, and people became more dependent on their services.
Still, in 1900, medical practice was unsophisticated. New ideas were slow to be adopted. Most surgeons still used their bare hands when operating, and few pharmaceutical drugs existed. A medical education meant you’d sat through two years of mostly lectures at one of over 150 schools, many of which were for-profit and had low entrance standards.
Then medical knowledge started to grow. From the early 1900s to the early 1940s, x-rays, ECGs, and the four major blood groups were discovered, along with insulin, sulfa, penicillin and anaesthetics. Doctors became a symbol of healing. The growing demand for medical care meant that doctors could afford to give up lower-paying services and focus on higher-paying, more complex services that involved things like diagnostic labs, radiology, and surgical suites. Those complex services were often offered in hospitals now that medicine had advanced to the point where a doctor’s expertise no longer fit into a black bag, and where the services offered were too expensive to be maintained in every doctor’s office.
As urbanization shifted care of the sick from families and neighbors to doctors and hospitals, health care became a commodity, something that was bought and sold. At the same time, though, medicine wasn’t thought of as just another thing for sale. It was regulated because it dealt with serious issues like the relief of human suffering. Bad health care could have drastic consequences like disability or death, and most people who needed medical help weren’t in a position to evaluate the kind of help they were getting.
The buying and selling of health care was also softened by the ideals that dominated the culture of the medical profession.
In 1934, the ethics code of the American Medical Association (AMA) said non-doctors (outside investors) profiting from medical work was “beneath the dignity of professional practice, is unfair competition within the profession at large, is harmful alike to the profession of medicine and the welfare of the people, and is against sound public policy.”
Before World War II, then, medicine was a cottage industry financed mostly by wealthy patients and philanthropists. Not enough medical technology existed to support a health manufacturing industry, and the government was uninvolved in health care other than via licensing and tax laws. In 1946, most American citizens were uninsured and paid for medical services out of their own pocket, or sometimes paid in kind. But in 1946, medicine was also viewed as a profession, not a business. A patient’s medical needs, by and large, were put ahead of a doctor’s financial gain.
After the Second World War, funding that had gone to the atom bomb was redirected to medical research, and in the 1950s and ’60s, major advances were made in surgery, radiation, chemotherapy, organ transplants, and tranquilizers. Medical knowledge had now grown too large for a single doctor to learn during training, and doctors increasingly began to specialize. In 1923, 11 percent of American doctors were specialists; in 1989, over 70 percent were. Specialists were paid more than generalists and enjoyed more prestige, but specialization also meant that a doctor’s once-holistic view of you as a patient became fragmented, and personalized medical care started to fade.
With the rise of new medical technology, along with specialization, insurance coverage, and unregulated payments for doctors’ fees, medicine started looking attractive to outside investors. In the late 1960s and early 1970s, Wall Street started investing in for-profit health care facilities like investor-owned hospitals, nursing homes, home care, labs, and imaging services.
After an advertising ban in medicine was lifted, doctors and hospitals started advertising their services. Where open and public competition between doctors and hospitals had once been considered unethical and unprofessional, advertising now made that competition public, which strained collegiality.
As investors started showing interest in health care, medical costs started to spiral due to inflation, growing research expenses, rising doctors’ fees, higher hospital costs, more health benefits for employees, and an aging population (medical advances had lengthened our lives but now we faced the complications of chronic disease which we just hadn’t survived to experience before). Malpractice suits were also rare until the twentieth century, when a growing number of lawsuits created “defensive medicine”: doctors did everything they possibly could in a medical situation to avoid being sued for negligence.
Technological advances in medicine were also proving expensive. Though new technology usually pays for itself because machines replace workers, in medicine that didn’t happen. Instead, medical advances involving complicated equipment and procedures required additional experts to be trained in the technology and increased costs instead of decreasing them.
The market was presented as a solution to all of these problems. The economic story says that a health care market will bail the government out of health care support it can no longer afford. Medicine started taking on the management practices of large businesses, and industrialization techniques were applied in the field. Private capital became a major player in the system, and much of the money was tied up in insurance companies and manufacturers of health technology. For-profit health services appeared in home care, kidney dialysis centers, care centers, and hospitals. Multinational health care companies grew and were said to be “to the old ‘doctor’s hospitals’ what agribusiness is to the family farm.”
An $8 share in Humana, a multinational health care company, purchased in 1968 was worth $336 by 1980; investments in hospital systems during those years returned almost 40 percent more in earnings than the average for other industries.
In the early decades of the twenty-first century, health care is a multi-billion dollar industry. Medical schools now offer joint MD-MBA degrees and business school graduates hold top positions in medical organizations, even though as recently as 1978, doctors weren’t expected to understand health care financing and organization.
Managers of both not-for-profit and for-profit hospitals, who earn salaries as hefty as those in the private sector, are rewarded based on the net income of the hospital, and hospital CEOs or presidents “are clearly accountable to their boards as business experts.”
Health care policies are laid out by business school professors and economists.
Arnold Relman, former editor-in-chief of the New England Journal of Medicine and the man who coined the term medical-industrial complex, says the most important socioeconomic change in a hundred years of American health care is the movement from “a professional service for the sick and injured into one of the country’s largest industries” — a transformation of health care from the compassionate relief of suffering to a profit-oriented business. Relman admits, “I am not saying that business considerations were never a part of the medical profession…or that physicians were in the past unconcerned about their income…But the commitment to serve patients’ medical needs (as well as the needs of public health) and the special nature of the relation between doctor and patient placed a particularly heavy obligation on physicians that was expected to supersede considerations of personal gain — and usually did.”
Biomedical ethicist Callahan agrees. “[T]here is an enormous difference,” he says, “between a discipline and a profession whose practitioners do not resist the personal good life when it comes their way, and one which has that life as its purpose.”
Paul Starr, a Pulitzer Prize-winning sociologist adds, “The contradiction between professionalism and the rule of the market is long-standing and unavoidable. Medicine and other professions have historically distinguished themselves from business and trade by claiming to be above the market and pure commercialism. In justifying the public’s trust, professionals have set higher standards of conduct for themselves than the minimal rules governing the marketplace.”
Back in the 1960s, the norm as a doctor, according to the AMA, was to limit your professional income to “reasonable levels” because the “charging of an excessive fee is unethical…[the] fee should be commensurate with…the patient’s ability to pay…”
Health care was considered to be about need, not someone’s ability to pay, since health care dealt in quality of life as well as life itself.
The economic story, on the other hand, says health care services are products, hospitals and doctors are sellers, and you as a patient, your government, and your insurance company are buyers. Your doctor is an entrepreneur competing with other doctors for your business. As a business, the health care industry promotes ever-changing products, “medicalizes” problems by advertising all kinds of conditions, stimulates interest in cures, builds consumer demand, and tries to get you out to the doctor more. Doctors can now be hired by insurers, which creates a conflict of interest between you and your doctor; an insurer typically wants to pay out as little as possible, so your doctor is caught in the middle, wanting to do what’s medically necessary for you as a patient while being aware that his or her employer is eyeing the cost.
In short, health care as a business is profoundly different from health care as a profession. Health care as a profession was founded on the relationship between you and your doctor; you trusted that your doctor was acting in your best interests.
But by the 1990s, that trust was starting to erode. In the United States, hospitals were commonly paid a lump sum per patient by insurance programs, and so could grow their profits by keeping their costs down: doctors could order fewer tests, hospital workers could be paid less, and critically ill patients could even be shipped to other hospitals. That created a financial incentive for hospitals to treat the least sick and discharge them as fast as possible, keeping turnover high. Simply put, it was more profitable to keep people out of the hospital than in. By 1997, over one-third of hospital revenues in America came from outpatient services.