IN RETROSPECT, a single day often comes to demarcate the transition between eras. Never mind that the Continental Congress voted to declare the colonies’ independence on July second and that the document probably wasn’t signed until August. The Fourth of July, the day the text of the Declaration of Independence was adopted, will forever be the symbolic first day of the new nation. In the twentieth century, December 7, 1941, became the symbolic end of an America that held the world at arm’s length and the beginning of America the superpower. November 22, 1963, became the symbolic first day of what would be known as the Sixties and of the cultural transformation that wound its course through the subsequent decades. The symbolic last day of the culture that preceded it was November 21, 1963.
IT WAS A THURSDAY. New York City saw a trace of rain that day, with a high of fifty-six, ending several days’ run of late-autumn sunshine. As dusk fell at CBS headquarters at 485 Madison Avenue, Walter Cronkite was in the anchor chair for The CBS Evening News. Just a year and a half into his job, Cronkite was not yet the nation’s Uncle Walter. He wasn’t even the nation’s leading anchorman. His ratings had lagged behind those of Huntley and Brinkley on NBC from the beginning, and the shift in September from a fifteen-minute program to a half-hour program had done nothing to close the gap.
There wasn’t much news to spice up the broadcast this evening. The day had produced one good human-interest story: Robert Stroud, the Birdman of Alcatraz, had died in his sleep at the federal prison in Springfield, Missouri, that morning. But otherwise, the news was humdrum. The Senate Armed Services Committee had approved President Kennedy’s nomination of Paul Nitze to be secretary of the navy. House minority leader Charles Halleck held a press conference in which he said that he did not see how the president’s civil rights bill could get to the floor of the House before the Christmas recess—no surprise, given the many ways in which the all-powerful Rules Committee, dominated by southern Democrats, could delay the process. On Wall Street, the Dow industrials had dropped more than 9 points, more than 1 percent of the Dow’s opening 742. Nobody was especially worried, however. The October figures for housing starts and durable goods had just come out, providing more evidence that the economy was on the upswing.
CBS might have been number two in evening news, but it was number one in prime-time programming. The Nielsen ratings that week placed eight CBS programs in the top ten, led by The Beverly Hillbillies with a rating of 34.9, meaning that 34.9 percent of all American homes with a television set were watching it. Since 93 percent of American homes had a television set by 1963, the upshot was that the same program was being watched in almost a third of all the homes in the United States. Staggering numbers like these went deep into the lineup. All of the top thirty-one shows had ratings of at least 20. By way of comparison, the number one show in the 2009–10 season, American Idol, considered to be a gigantic hit, had a rating of 9.1.[1]
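For readers who want the arithmetic behind “almost a third” spelled out, here is the calculation implied by the figures just cited (a Nielsen rating is the share of television households, so it must be multiplied by television penetration to get the share of all households):

$$0.349 \times 0.93 \approx 0.32$$

That is, roughly 32 percent of all American homes, television or not, were tuned to The Beverly Hillbillies.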
The explanation for the ratings of 1963 is simple: There wasn’t much choice. Most major cities had at most four channels (CBS, NBC, ABC, and a nonprofit station of some sort). People in some markets had access to just one channel—the monopoly in Austin, Texas, where the lone station was owned by Lady Bird Johnson, was the most notorious example.
The limited choices in television viewing were just one example of something that would come as a surprise to a child of the twenty-first century transported back to 1963: the lack of all sorts of variety, and a simplicity that now seems almost quaint.
Popular music consisted of a single Top 40 list, with rock, country, folk, and a fair number of Fifties-style ballads lumped together. No separate stations specializing in different genres, except for country music stations in a few parts of the nation. Except in university towns and the very largest cities, bookstores were small and scarce, usually carrying only a few hundred titles. No Amazon. If you didn’t see a movie during the week or two it was showing in your town, you would probably never see it. No DVDs. With television, you either saw a show the night it played or waited until it was repeated once during the summer. No TiVo.
People drove cars made in the United States. Foreign cars from Europe were expensive and rare. Cars from Japan had just been introduced in 1963, but had not been greeted with enthusiasm—“made in Japan” was synonymous with products that were cheap and shoddy. You might see an occasional sports car on the road—Ford’s Thunderbird or Chevrolet’s Corvette—but the vast majority of customers chose among sedans, convertibles, and station wagons made by General Motors, Ford, or Chrysler.
The typical American city of 1963 had appallingly little choice in things to eat. In a large city, you would be able to find a few restaurants serving Americanized Chinese food, a few Italian restaurants serving spaghetti and pizza, and a few restaurants with a French name, which probably meant that they had French onion soup on the menu. But if you were looking for a nice little Szechuan dish or linguine with pesto or sautéed foie gras, forget it. A Thai curry? The first Thai restaurant in the entire nation wouldn’t open for another eight years. Sushi? Raw fish? Are you kidding?
ON THIS THURSDAY, November 21, television’s prime-time lineup included The Flintstones, The Donna Reed Show, My Three Sons, Perry Mason, and The Perry Como Show, but it was the fourteenth-rated show, Dr. Kildare, that made Time magazine’s recommended viewing. The story that week involved a pregnant unmarried teen who had gotten an abortion. She was so psychologically shattered by the experience that even Dr. Kildare couldn’t help. He had to refer her to a psychiatrist in another NBC program, The Eleventh Hour, for an episode that would air a week later.
She shouldn’t have gotten pregnant in the first place, of course. Getting pregnant without being married was wrong, and if a girl did get pregnant then she and the boyfriend who had gotten her in that fix were supposed to get married. If she didn’t get married, she should put the baby up for adoption. These were conventional views shared across the political spectrum. As of 1963, Americans continued to obey those norms with remarkable consistency. The percentage of births to single women, known as “the illegitimacy ratio,” had been rising worrisomely among Negroes (the only respectful word for referring to African Americans in 1963). But among whites, the illegitimacy ratio was only 3 percent, about where it had been throughout the century.
Marriage was nearly universal and divorce was rare across all races. In the 1963 Current Population Survey, a divorced person headed just 3.5 percent of American households, with another 1.6 percent headed by a separated person. Nor did it make much difference how much education a person had—the marriage percentages for college grads and high school dropouts were about the same.
Not only were Americans almost always married; mothers also normally stayed at home to raise their children. More than 80 percent of married women with young children were not working outside the home in 1963.[2] When Americans watched The Adventures of Ozzie and Harriet (it was still going strong in 1963, at twenty-sixth place in the ratings), they were looking at a family structure that the vast majority of them recognized from their own experience, whether they were white or black and whether they were working class, middle class, or rich.
An irony of Ozzie and Harriet was that the real Harriet Nelson was herself a working mother (she was a show-business veteran who played herself on the show). Another irony: It wasn’t clear that Ozzie did work—or at least the show never disclosed what Ozzie did for a living. But he had to be doing something. Rich or poor, it was not socially respectable to be adult, male, and idle. And so it was that 98 percent of civilian men in their thirties and forties reported to government interviewers that they were in the labor force, either working or seeking work. The numbers had looked like that ever since the government had begun asking the question.
Whether portraying loving traditional families or pointing with alarm to the perils of breaking the code, television was a team player. It was taken for granted that television programs were supposed to validate the standards that were commonly accepted as part of “the American way of life”—a phrase that was still in common use in 1963.
The film industry chafed under that obligation more than the television networks did, but it mostly went along. Few relics of a half century ago seem more antiquated than the constraints under which filmmakers operated. If filmmakers in 1963 wanted the approval of the Production Code of the Motion Picture Association of America, which almost all of them still did, the dialogue could not include any profanity stronger than hell or damn, and there had better be good dramatic justification even for them. Characters couldn’t take the name of the Lord in vain, or ridicule religion, or use any form of obscenity—meaning just about anything related to the sex act. Actors couldn’t be seen naked or even near naked, nor could they dance in a way that bore any resemblance to a sexual action. The plot couldn’t present sex outside marriage as attractive or justified. Homosexuality was to be presented as a perversion. Abortion? “The subject of abortion shall be discouraged, shall never be more than suggested, and when referred to shall be condemned,” said the code.[3]
There had been pushes against the Production Code before November 1963. Movies like Elmer Gantry and Lolita had managed to get code approval despite forbidden themes, and a few pictures had been released without approval, notably The Man with the Golden Arm, Anatomy of a Murder, and Some Like It Hot. A British production that made every sort of licentiousness look like fun, Tom Jones, had opened in October. But the top-grossing American-made movies of 1963—How the West Was Won, Cleopatra, Bye Bye Birdie, The Great Escape, Charade—still fit squarely within the moral world prescribed by the Production Code.
Freedom of expression in literature was still a live issue. A federal court decision in 1959 had enjoined the Post Office from confiscating copies of Lady Chatterley’s Lover, Tropic of Cancer, and Fanny Hill sent through the mails, but many state laws were still on the books. Just a week earlier, a court in Manhattan had heard a case testing a New York State law that prohibited selling any book that “exploits, is devoted to, or is made up of descriptions of illicit sex or sexual immorality.” Did Fanny Hill fall into that category? Without a doubt, said the three-judge panel. It was well written, the court acknowledged, but “filth, even if wrapped in the finest packaging, is still filth.”[4]
Part of the reason for these widely shared values lay in the religiosity of America in 1963. A Gallup poll taken in October asked, as two of its background questions, for the interviewee’s religious preference and whether he or she had attended church in the last seven days (note the wording in 1963—“church,” not “church or synagogue” or “worship service”). Only 1 percent of respondents said they did not have a religious preference, and half said they had attended a worship service in the last seven days. These answers showed almost no variation across classes. Poor or rich, high school dropout or college graduate, the percentages of Americans who said they were religious believers and had recently attended a worship service were close to identical.[5]
Hollywood had especially elaborate restrictions on the way that criminal activity could be portrayed, amounting to a stipulation that movies must always show that crime doesn’t pay. But to most Americans, that didn’t seem odd. By 1963, crime had been low for many years. In large swaths of America, doors were routinely left unlocked, children were allowed to move around the neighborhood unsupervised, and, except in the toughest neighborhoods of the largest cities, it seldom occurred to someone walking alone at night to worry about muggers.
The nation’s prisons held only a fraction of the inmates they would hold by 2010, but clearance rates for crimes and the probability of prison time if convicted of a felony were both high. And so we have a paradox by the standards of later years: Crime was low and few people had ever been in prison, even in low-income neighborhoods, but most of the people in those neighborhoods who regularly committed crimes ended up in jail. People weren’t being naive to believe that crime didn’t pay. By and large, it really didn’t.
As for illegal drugs, we cannot put hard numbers to the prevalence of use—surveys on drug use wouldn’t begin until the late 1970s—but there certainly wasn’t much happening that attracted the attention of the police. In 1963, there were just 18 arrests for drug abuse violations per 100,000 Americans, compared to 1,284 per 100,000 for drunkenness.[6] As of 1963, people drank like fish and smoked like chimneys, but illegal drugs were rare and exotic.
America still had plenty of problems on November 21, 1963. The greatest of all, the one that had been eating at the vitals of the American body politic ever since the founders couldn’t bring themselves to condemn slavery in the Declaration of Independence, was the status of African Americans. In 1963, the South was still such a thoroughly segregated society that whether the segregation was de jure or de facto didn’t make much practical difference. In the North, the laws supporting segregation were gone, but neighborhoods and schools in urban areas were segregated in practice. The racial differences in income, education, and occupations were all huge. The civil rights movement was the biggest domestic issue of the early 1960s, and it was underwritten by a moral outrage that had begun among blacks but was rapidly raising the consciousness of white America as well.
The status of American women in 1963 had not yet led to a movement, but there was much to be outraged about. Almost as many girls as boys had enrolled in college in the spring of 1963, but thereafter the discrepancies grew. That same year, there were 1.4 male college graduates for every female graduate, two master’s degrees awarded to men for every one awarded to a woman, and eight PhDs awarded to men for every one awarded to a woman. Worse than that were the expectations. Teaching and nursing were still among the few occupations in which women received equal treatment and opportunity, and the women who did enter male-dominated professions could expect to put up with a level of sexual harassment that would prompt large damage awards in the 2000s. The vast majority of men took it for granted that women were expected to get married, cook the meals, keep the house clean, raise the children, and cater to the husband. Women who didn’t were oddballs.
Pollution was a dreadful problem in many urban areas. The smog in Los Angeles was often a visible miasma hanging over the city, and less visible pollution was just as dangerous a presence in the nation’s lakes and rivers.
And there was the problem that within a year would become a focal point of national domestic policy: poverty. The official poverty line didn’t exist yet—it was in the process of being invented by the economist Mollie Orshansky and her colleagues at the Social Security Administration—but when that definition of poverty was retrospectively calculated for 1963, it would be determined that almost 20 percent of the American people were below the poverty line. And yet poverty was still on the periphery of the policy agenda. The reason was more complicated than obtuseness or indifference, and it goes to the strategic optimism that still prevailed in 1963: Poverty had been dropping so rapidly for so many years that Americans thought things were going well. Economists have since reconstructed earlier poverty rates using decennial census data, and determined that 41 percent of Americans were still below the poverty line in 1949.[7] A drop from 41 percent to under 20 percent in just fourteen years was a phenomenal achievement. No one knew those numbers yet, but the reality of the progress they represent helps explain why the average American wasn’t exercised about poverty in 1963. Things had been getting better economically in ways that were evident in everyday life.
That kind of progress also helps explain why, if you took polling data at face value, America didn’t have a lower class or an upper class in 1963. In the responses to a Gallup poll taken that fall, 95 percent of the respondents said they were working class (50 percent) or middle class (45 percent). A great many poor people were refusing to identify themselves as lower class, and a great many affluent people were refusing to identify themselves as upper class. Those refusals reflected a national conceit that had prevailed from the beginning of the nation: America didn’t have classes, or, to the extent that it did, Americans should act as if it didn’t.
AS WALTER CRONKITE ended the broadcast on November 21 with his newly coined sign-off, “That’s the way it is,” he had no way of knowing that he was within hours of a career-changing event. The grainy videotape of the special bulletins, in which an ashen-faced Cronkite, fiddling with his glasses and trying to hide that he was blinking away tears, said in a carefully dispassionate voice that the news was official, the president was dead, would become the iconic image of how the nation got the news.
Nor could he, or anyone in his audience, have known how much America was about to change, in everything—its politics, economy, technology, high culture, popular culture, and civic culture.
The assassination was to some degree a cause of that change. On November 21, 1963, Kennedy was not an unusually popular president. The image of JFK’s presidency as Camelot came later, through Theodore White’s interview of Jackie Kennedy a few weeks after the assassination. In the weeks just before the assassination, Gallup put his job approval rating at 58 percent—not bad, but hardly spectacular in that unpolarized era—and the New York Times’ number one nonfiction best seller was Victor Lasky’s highly critical J.F.K.: The Man and the Myth. Apart from his merely average political clout at the time of his death, Kennedy was disinclined by temperament and beliefs to push for radical change. Then an accident of history brought a master legislator to the White House at a time when national grief and self-recrimination hobbled his political opposition. It is surely impossible that anything resembling the legislative juggernaut that Lyndon Johnson commanded would have happened if Kennedy had still been in the Oval Office. No one knows how Vietnam would have played out if Kennedy had lived, but it could hardly have been worse than the trauma that Johnson’s policies produced.
In other ways, the assassination provides a marker coinciding with changes that were going to happen anyway. Many of the landmark reforms of the 1960s were produced by Supreme Court decisions, not the president or Congress, and the activist supermajority on that court was already established. Seven of the justices sitting on the court when Kennedy died were there throughout the next six years of historic decisions.
A sexual revolution of some sort was inevitable by November 21, 1963. The first oral contraceptive pill had gone on the market in 1960 and its use was spreading rapidly. Of course sexual mores would be profoundly changed when, for the first time in human history, women had a convenient and reliable way to ensure that they could have sex without getting pregnant, even on the spur of the moment and with no cooperation from the man.
A revolution of some sort in the fortunes of African Americans was inevitable. The civil rights movement had been intensifying for a decade and had reached its moral apogee with the March on Washington on August 28, 1963, which filled the Mall with a quarter of a million people and concluded with Martin Luther King Jr.’s “I Have a Dream” speech. The precise shape of the legislation and the regulatory regime that implemented the revolution was probably different under Johnson than it would have been under Kennedy, but by 1963 the momentum for major change was already too great to stop.
Something resembling the War on Poverty would probably have been proposed in 1964, no matter what. Michael Harrington’s The Other America had appeared in the spring of 1962 proclaiming that 40 to 50 million Americans were living in poverty, and that their poverty was structural—it would not be cured by economic growth. Kennedy had read the book, or at least some laudatory reviews of it, and ordered the staff work that would later be used by Johnson in formulating his War on Poverty. How many programs Kennedy could have actually passed is another question, but Harrington’s thesis was already being taken up by the liberal wing of the Democratic Party and would have become part of the policy debate even without the assassination.
Other movements that would have sweeping impact on American society were already nascent in 1963. Early in the year, Betty Friedan had published The Feminine Mystique, seen now as the opening salvo of the feminist movement. Rachel Carson’s Silent Spring had appeared in 1962 and become a New York Times best seller, setting off public interest that would lead to the environmental movement. Ralph Nader had written his first attack on the auto industry in the Nation, and two years later would found the consumer advocacy movement with Unsafe at Any Speed.
The cultural landscape of the Sixties was already taking shape in 1963. Bob Dylan’s “Blowin’ in the Wind,” “A Hard Rain’s a-Gonna Fall,” and “Don’t Think Twice, It’s All Right”—all theme songs for what we think of as the Sixties—had been released six months before Kennedy died. In November 1963, the Beatles had played for the queen, were the hottest group in England, and were planning their first U.S. tour.
And the demographic pig was already in the python. The leading cohorts of the baby boomers were in their teens by November 21, 1963, and, for better or worse, they were going to be who they were going to be. No one understood at the time what a big difference it makes when one age group of a population is abnormally large. Everyone was about to find out.
THIS BOOK IS about an evolution in American society that has taken place since November 21, 1963, leading to the formation of classes that are different in kind and in their degree of separation from anything that the nation has ever known. I will argue that the divergence into these separate classes, if it continues, will end what has made America America.
To forestall misinterpretation, let me spell out what this book does not argue.
First, I do not argue that America was ever a classless society. From the beginning, rich and poor have usually lived in different parts of town, gone to different churches, and had somewhat different manners and mores. It is not the existence of classes that is new, but the emergence of classes that diverge on core behaviors and values—classes that barely recognize their underlying American kinship.
Second, I do not make a case for America’s decline as a world power. The economic dynamics that have produced the class society I deplore have, paradoxically, fostered the blossoming of America’s human capital. Those dynamics will increase, not diminish, our competitiveness on the world stage in the years ahead. Nor do I forecast a decline in America’s military and diplomatic supremacy.
But the American project was never about maximizing national wealth or achieving international dominance. The American project—a phrase you will see again in the chapters to come—consists of the continuing effort, begun with the founding, to demonstrate that human beings can be left free as individuals and families to live their lives as they see fit, coming together voluntarily to solve their joint problems. The polity based on that idea led to a civic culture that was seen as exceptional by all the world. That culture was so widely shared among Americans that it amounted to a civil religion. To be an American was to be different from other nationalities, in ways that Americans treasured. That culture is unraveling.
I focus on what happened, not why. I discuss some of the whys, but most of them involve forces that cannot be changed. My primary goal is to induce recognition of the ways in which America is coming apart at the seams—not seams of race or ethnicity, but of class.
That brings me to the subtitle of this book and its curious specification of white America. For decades now, trends in American life have been presented in terms of race and ethnicity, with non-Latino whites (hereafter, just whites) serving as the reference point—the black poverty rate compared to the white poverty rate, the percentage of Latinos who go to college compared to the percentage of whites who go to college, and so on. There’s nothing wrong with that. I have written books filled with such comparisons. But this strategy has distracted our attention from the way that the reference point itself is changing.
And so this book uses evidence based overwhelmingly on whites in the description of the new upper class in part 1 and based exclusively on whites in the description of the new lower class in part 2. My message: Don’t kid yourselves that we are looking at stresses that can be remedied by attacking the legacy of racism or by restricting immigration. The trends I describe exist independently of ethnic heritage. In the penultimate chapter, I broaden the picture to include everyone.
As with all books on policy, this one will eventually discuss how we might change course. But discussing solutions is secondary in this book, just as understanding causes is. The important thing is to look unblinkingly at the nature of the problem.