1. Cornucopia and Its Discontents
Groping for a sense of the Zeitgeist has been an intellectual stock-in-trade since the ancient Greek thinkers, who discovered they were living in history. If they could name the immediate past, they could locate themselves in relation to it; they could perhaps comprehend and certainly criticize it. Kings, naturally, have always wanted to know where they stood in the winds of their time, and so, of course, have the opponents of kings, as well as those who simply wanted to make do in the crevices of power. Today, the habit of naming the Zeitgeist has grown widespread, even frantic. As a convenience sport, it is most frequently practiced by journalists and publicists with deadlines to meet and headlines to write; there is profit in getting the right handle on the moment and making it marketable.
Zeitgeist-mongering is the stuff of cocktail party chat for an age in which capsule stereotypes masquerading as ideas help us master the flood of incoming information. But the Zeitgeist is an elusive wind, and the worst temptation is to oversimplify. There are many cross-breezes, eddies, local variations, rippling shifts of direction; even Sturm und Drang blows in fits and starts. The Zeitgeist mutters, like the oracle of Delphi, and like the oracle it requires interpreters. What the Zeitgeist mutters depends in good part on what questions it is asked.
"The Fifties" were multiple, of course, according to whether you lived on Manhattan Island or in Manhattan, Kansas, in Southern California or North Carolina; different too depending on whether you were eight or eighteen or fifty-eight, female or male, black or white, Irish Catholic or Protestant or Jewish, an electrical worker or a salesman of appliances or a housewife with an all-electric kitchen or the president of General Electric; and this is not yet to speak of differences in family style and personality. But one thing we know is that the presumably placid, complacent Fifties were succeeded by the unsettling Sixties. The Fifties were, in a sense, rewritten by the Sixties, as the Sixties have been rewritten by the Eighties.
I am going to look at the Fifties, then, as a seedbed as well as a cemetery. The surprises of the Sixties were planted there. I want to look closely at the culture and institutions of the Fifties, look at how the Fifties presented themselves to the young in general, and in particular to that minority who were about to claim the right, if not the capacity, to remake history: those of us who were born during or just after World War II; who were roughly eight to fourteen years old in 1956, when Dwight Eisenhower defeated Adlai Stevenson for the second time and "Heartbreak Hotel" was a smash hit; who were thus twelve to eighteen in 1960, when the sit-ins began and John Kennedy was elected President over Richard Nixon; who were then seventeen to twenty-three in 1965, when Lyndon Johnson began the systematic bombing of North Vietnam. I offer something of a composite of those who were middle- or upper-middle class in origin and poised to go to college in the late Fifties and early Sixties; and in particular those who spawned the civil rights and antiwar movements and the New Left within them, as well as the hippies and other cultural movers and shakers of the Sixties. How did we understand the world and ourselves in it? How did the spirit and structures of that decade shape our sensibilities? How, from closure, did openings come?
A first approximation: this generation was formed in the jaws of an extreme and wrenching tension between the assumption of affluence and its opposite, a terror of loss, destruction, and failure.
"Affluence": so much a word of the Fifties, with its cognate connotations of flow, flux, fullness. The word had already achieved currency by the time John Kenneth Galbraith published the bestselling The Affluent Society in 1958; it was far more American than "rich," harnessed as that brutal syllable is to its natural counterpart, "poor," thus bringing inequality to mind. "Affluence" sounds general, and in the Fifties it was assumed to be a national condition, not just a personal standing. Indeed, affluence was an irresistible economic and psychological fact in a society that had long since made material production and acquisition its central activities. The boom of 1945 to 1973, occasionally interrupted by recessions only to roll on seemingly undiminished, was the longest in American history.
Starting late in war-blasted Western Europe and Japan, the boom rolled, however unevenly, through the rest of the industrialized world. But America was the richest, richer than any other country or bloc had ever been.
By 1945, the United States found itself an economic lord set far above the destroyed powers, its once and future competitors among both Allies and Axis powers. Inflation was negligible, so the increase in available dollars was actually buying more goods. Natural resources seemed plentiful, their supplies stable; only small think-tanks and obscure writers worried about whether they might ever prove exhaustible. And if, as some critics charged, the distribution of income had not materially changed since the Thirties, the fact remained that all segments of the population were improving their positions—not necessarily in relation to one another, but in relation to their pasts and those of their families. And it was the relation to the past that struck most people as the salient comparison. The Depression was over. And so were the deprivations of World War II, which also brought relative blessings: While European and Japanese factories were being pulverized, new American factories were being built and old ones were back at work, shrinking unemployment to relatively negligible proportions. Once the war was over, consumer demand was a dynamo.
Science was mobilized by industry, and capital was channeled by government as never before. The boom was on, and the cornucopia seemed all the more impressive because the miseries of Depression and war were near enough to suffuse the present with a sense of relief.
The flush of prosperity and the thrill of victory also translated into a baby boom. The number of births jumped by 19 percent from 1945 to 1946, then another 12 percent the next year, and after settling down for three years boomed again and continued to boom into the early Sixties. More babies were born in 1948-53 than in the previous thirty years. The first boom could be understood as a makeup for wartime deprivations, but then why did it resume and, astonishingly, go on? As Landon Y. Jones has pointed out, the sustained boom took place only in the United States, Canada, Australia, and New Zealand, countries that were left unscathed by the war, blessed with land, robust with confidence, feverish with what Lord Keynes once called "relentless consumption." Couples were marrying earlier, starting their children earlier, and having more of them. The baby boom was widely touted as a tribute to the national glory. Whatever the exact explanation, babies were the measure and the extension of the economic boom; they were good for its markets; they were its pride; in some ways they were its point.
So affluence was not just an economic fact but a demographic one, and the demographic bulge matched the affluent state of mind. The idea of America had long been shaped by the promise of opportunity in a land of plenty, but at long last the dream seemed to be coming true. The world seemed newly spacious, full of possibilities. Americans were acquiring consumer goods at an unprecedented pace; indeed, with the housing boom, and the great treks from the country to the city, from the city to the suburbs, from the South to the North, millions of Americans were acquiring whole new spaces to live in. The cities were being "renewed," "redeveloped," their faces lifted, while the upbeat language of "renewal" concealed the injuries done to millions who were unwillingly shunted away from the valuable parcels of real estate they had called home; but there was presumably nothing to worry about, for wasn't progress (as General Electric advertised) our most important product, and didn't the language of affluence imply that there was room for all in the great gushing mainstream?
Most of the newly affluent were happy to forget, and the media had little interest in reminding them, that even with easy credit and higher incomes and the growing number of white-collar jobs, not everyone could afford a new house, a new car, a TV set, high fidelity sound, or the rest of the appurtenances of the American good life. The evident fact remained: in the course of the Fifties, television, high fidelity, jet travel, and multiple cars became middle-class staples. Galbraith charged that private affluence was crowding out public goods, causing and obscuring the impoverishment of the public sector. If you looked at American schools, if you contrasted the condition of trains and subways with the condition of suburban houses and cars, you could see that public services were being starved, that public funds were going to fuel the boom in private spaces and private goods.
For after 1945, the government had been enlisted in behalf of private comfort and convenience for the vast reaches of the middle classes. But who looked?
The Puritan Utopia of a "city upon a hill" found its strange completion in the flatlands of the American suburb. For growing numbers, daily life was delivered from the cramp of the city, lifted out to the half-wide, half-open spaces, where the long-sought and long-feared American wilderness could be trimmed back and made habitable. The prairie became the lawn; the ranch, the ranch house; the saloon, the Formica bar. The postwar American families wanted space for stretching out, space for their children (and from them), space from their parents and in-laws; and they wanted their private domains loaded with the latest appliances: partly for the convenience, partly to confirm that they were making a fresh start, freed from Depression cramp. In 1945, a mere 19 percent of the people polled by The Saturday Evening Post said they were willing to live in an apartment or a "used house."
Fueled by federal financing, by low interest rates and mortgage guarantees for veterans, builders constructed vast suburban developments. Between 1946 and 1958, outside the farms, 85 percent of all new housing was built beyond the central cities. And when the vets and newlyweds beheld the grass and trees and the panoply of their private properties, they must have felt at least for a moment that the American dream had come true, that in America even the butchery of the war could have a happy ending. With the kitchen spilling directly into the dining room, the glass doors opening from the living room into the outdoor barbecue and play area, the picture window bringing the lawn right up to the wall-to-wall carpet, the ideal suburban home was an intertwining of nature and civilization; it was as if the suburban family had realized Karl Marx's vision of a blending of countryside and city.
Magazines advertised these houses, television featured them, relatives admired and envied them, and the suburbanites reveled in the space—the "spaciousness"—of their new quarters, jumping at the chance to stuff them with washers and dryers, electric kitchens and garage openers and do-it-yourself workshops. Spread out laterally, like the lords of tiny manors, they enlarged their domains and cushioned their days with television, a kind of electronic upholstery. Apparently the whole world was at the fingertips of the American family.
And the family was the raison d'être of affluence, its point and its locale. The ostensible beneficiaries of all the plenitude were the most dependent members of the family unit: Mom, who would spend the bulk of her life supervising her conveniences, and the kids, who would grow up knowing how good the things of life could be. Dad's wage underwrote the whole family's division of labor and pleasures; after the jarring wartime years, when vast numbers of women were mobilized into jobs, women were now expected—and expected themselves—to secure the home front. This delicate bargain was secured by an unwritten contract, a division of labor, that was trumpeted through all the linkage networks of the modern mass media. Against the centrifugal pressures inherent in Mom and Dad's division of labor, the nuclear family was bound together through the cementing idea of "togetherness."
The suburb was of course inconceivable without quick, reliable transportation to work, and the instrument of choice, the incarnation of power, comfort, and freedom all at once, was the automobile. This was the time of the automatic transmission, of power steering, power brakes, and more powerful engines. The long stabbing fins, easily mocked, were only the extreme and outward signs that the car, like a yacht, was meant for cruising. The conspicuous adornment of chrome was a sign that America had metal to waste. And what could be more deluxe than to bring the car under one's own roof, in the two-car garage?
Shopping and leisure were retailored for an age of easy access. The shopping center represented the possibility of consumption without limits, the logical extension of the department store. The drive-in theater, a bonus of auto-convenience, created a social space perfectly adapted for the newly mobile.
Improved roads also heightened the sense of freedom—even as the breadwinner followed the same route day after day. Even city dwellers could slip away to the countryside for weekends and summer vacations. The expressways were especially efficient conduits. And what the local expressway made possible every day, the interstate highway made possible on a national scale. The Federal-Aid Highway Act of 1956 authorized forty-one thousand miles of interstate roads, great sleek limited-access superhighways where nature was trimmed back for smooth passage, and Americans could begin to feel that the whole of their vast country was coming within reach.
The open road had long been a symbol of American freedom from overcivilization; it meant adventure and sex and joyrides before it meant commuting. For now, few worried aloud about the congestion, the carnage and pollution which the private automobile brought with it, or about its consequences for the cities, or about the future of America's dependence on petroleum. The car was still the incarnation of personal power, freedom, leisure, sex, access, efficiency, ease, comfort, and convenience all wrapped in a single machine; both a symbol and a symptom of the American search for ways to liberate the self from social restraints. It was personal power in a private compartment tooling its way toward the horizon.
Middle-class Americans were becoming cultural omnivores, traveling abroad in growing numbers, visiting national parks and historical sites, going to theaters, museums, and concerts as never before. Cultural ambitions ratcheted upward; New York became the world center of the arts. Growing numbers of middle-class consumers felt it their responsibility to be au courant. They were accumulating coffee-table books, subscribing to Saturday Review and the Book-of-the-Month Club, buying records, briefing themselves about art. The wealthier were buying paintings, propelling Abstract Expressionists to stardom and unanticipated wealth; the less wealthy bought prints. Amateurs tried their hands at acting and choral singing, or tinkered with crafts at home. Painting by the numbers was one fad that all by itself contained the contradictory aspirations of the middle-class Fifties: creativity and security at the same time. Movie attendance shrank, largely because of competition from television, but campuses and museums spawned film (not just movie) societies, and by the late Fifties, amid the overall decline, Americans were for the first time getting to see a good number of foreign films: the British comedies starring Alec Guinness and Peter Sellers; Brigitte Bardot; then, in the cosmopolitan centers and university towns, Bergman, Fellini, the French New Wave, even the Russians.
For the multitudes who could afford the ticket, then, the payoff for hard work and a willingness to accept authority promised to be a generous share in the national plenitude.
Even when the goods were not at hand, the ads cataloged a beckoning future. For decades, advertising had barraged Americans with images of a world without horizons, but now, in television, it had the most powerful and—in Madison Avenue's language—most "penetrating" conduit ever devised. In the early Fifties, when the tube was a new toy, people lined up in the streets to stare at the new models in store windows. Television rewarded, tantalized, cozened, flattered; it congratulated Americans for being so sensible or fortunate as to live in a land where television was available. For most viewers, television's world, however sanitized and upbeat, hovered close enough to the reality of their lives and their immediate aspirations to render the image of abundance plausible. No longer did you have to be a criminal poseur to believe, with Jay Gatsby, "in the green light, the orgiastic future that year by year recedes before us." Tomorrow we could all "run faster, stretch out our arms farther...." And so, when the majority of Americans called themselves middle class, they meant at the least that they were on their way.
By way of a summary of the economic underpinnings, then: Where the parental generation was scourged by memories of the Depression, the children of this middle class in the late Forties and Fifties were raised to take affluence for granted. The breadwinners were acutely aware of how hard they had worked to afford the picture window, the lawn, the car, the Lionel trains; and since they could, most of them, remember a time when the sweat of their brow availed them little, they were flooded with relief and gratitude, and expected their children to feel the same. Many were the parents who policed their rambunctious children with when-I-was-your-age tales of the Depression. Here was generational cleavage in the making.
And yet children also live out potentials that lie dormant in their parents; the discontinuities can be overdrawn. For all their comforts, the middle-class parents were afflicted by "insecurity," to use another of the decade's code words. One was not supposed to feel "insecure." It was a mark of "maladjustment." Yet no matter how much consumer debt they piled up to feed their hunger for consumption, no matter how eagerly they accumulated space and goods to convince themselves that their self-sacrificial struggles had been worthwhile (and to placate the Puritan's nameless guilts), they were not always convinced that their well-upholstered consumer paradise was here to stay. Nor was it always self-evident that the price was worth paying. Many are the signs that Americans were ill at ease in Eden, and although they lay scattered throughout the culture, susceptible to rival interpretations, their cumulative weight is impressive. Strikingly, for example, Americans spent a growing portion of their incomes on life insurance.
While disposable family income rose by a considerable 49 percent between 1950 and 1960, sales of individual life insurance policies rose by more than 200 percent in dollar value; and this did not even include the increase in employee-benefit plans. It is also worthy of note that the number of psychiatrists multiplied almost sixfold between 1940 and 1964; and presumably, although statistics are hard to come by, the number of patients who thought they needed their heads examined mushroomed accordingly. The temptation grew to define "maladjustment" as a medical problem susceptible to personal "cure."
The middle class's choice of everyday reading matter also tells us something of its preoccupations. Bestsellers, of course, do not directly transcribe popular moods, but their readers have to find the shifts palatable, recognize new styles of heroism as plausible.
Between 1945 and the early 1950s, the typical bestseller hero was a go-getting individual who goes after what he wants, straightforwardly, and gets it. But starting with Sloan Wilson's The Man in the Gray Flannel Suit and Herman Wouk's Marjorie Morningstar (both 1955), among other bestsellers, success costs. A hard-driving man discovers conflicts between work and family commitments. Heroes no longer conquer, but try to adapt and balance. Success is no longer a good that justifies itself; now it has to be justified as an instrument of self-fulfillment.
Likewise, popular social criticism tells us a good deal about widespread middle-class apprehension. True, there was a curious rift dividing the writers of social analysis. Some prominent intellectuals, many of them ex-radicals, were busily settling their accounts with the postwar order. These celebrants of affluence, however uneasy, presumed that America was melting down to a single sea of national satisfaction. Their intellectual style was to celebrate American unity, "the American way of life"—singular, not plural. The dangers came from resentful arrivistes, authoritarian workers, brutish anti-intellectuals—ingrates, in short. The melting pot was invoked sentimentally, as an ideal, without irony: differences in America were meant to be melted down. America was "exceptional," exempt from European passions and dangers, as it had been spared not only fascism but the temptations of socialism and communism; there was only one "American way of life." Daniel Bell and Seymour Martin Lipset, socialists turned sociologists, wrote that we had attained that blessed state in which ideology was defunct, exhausted; social problems were now discrete, isolated, manageable by clear-headed professionals. And as important organs of intellectual opinion closed ranks, officialdom also closed doors. "Those who do not believe in the ideology of the United States," declared the attorney general of the United States, Tom Clark, in 1948, "shall not be allowed to stay in the United States."
But when McCarthyism overreached, going after not just defenseless Communists and helpless innocents but the U.S. Army itself, it was beaten back, replaced by a more popular, plausible, and stable consensus that these intellectuals helped formulate: that America was the very model of the best possible society; that economic growth would make opportunity universal; that domestic differences could be bargained out; that Communism could be contained by a combination of military might and free enterprise.
The consensus intellectuals had their influence; they were much cited in popular journals, much honored in their professions. At least one of their journals was financed, as it turned out, by the Central Intelligence Agency. But later analysts, impressed by the chasm between the Fifties and the Sixties, may have set too high an estimate on their impact; they may have left more of a mark on their disciplines than on the public at large. At least they were not unopposed. In the early Sixties, the New Left also built up its oppositional identity, its hard-and-fast generational definition, by decrying this "dominant ideology." But in the process we overlooked our debts to the dissonant voices of the Fifties. What has to be remembered is that Bell and Lipset were not the authors of the bestselling Fifties polemics; and some of the popular social critics told a different tale indeed.
For all their overemphasis on social equilibrium, the bestselling social critics agreed that the heroic individual was paying a steep price—in autonomy and meaning—for the security and comfort he was reaping from the managed, bureaucratically organized society. David Riesman's The Lonely Crowd (1950; paperback edition, 1953) delivered an elegy for the "inner-directed" Protestant soul and deplored the degradation of work, arguing that the new "other-directed" America had forfeited the liberating potentials of leisure time for shallow conformity, and that even "peer groups," which buffered individuals against the citadels of power, could prove suffocating. C. Wright Mills's White Collar (1951) lamented the spread of the sales mentality and the ebbing of the independent middle class. William H. Whyte's The Organization Man (1956) deplored the displacement of the entrepreneurial ethos by smooth, manipulative adjustment. More radically, Mills's The Power Elite (1956) made the argument that history was in the hands of irresponsible corporate, political, and military circles. But even the less radical—usually ex-radical—critics agreed that authentic community and tradition were being flattened by a "mass society."
Later in the Fifties, muckrakers scraped at the surface of the consumer society: Vance Packard in The Hidden Persuaders (1957), John Keats in The Crack in the Picture Window (1957) and The Insolent Chariots (1958), while John Kenneth Galbraith, of course, struck at the giddiness of The Affluent Society.
"Conformity" became something else to feel anxious about, whether in books like Robert Lindner's Must You Conform? (1956) or New Yorker cartoons. The point is that some critical mass of readers wanted to be warned. And these books were lying on the coffee tables of many a curious adolescent.
In the years to come, many words would be spilled about the "generation gap," many of them in hysteria and bravado on each side. In retrospect, all the claims seem overblown—and yet, what about the fierce sense of difference? The young insisted that their life situation was unprecedented (and therefore they had no one to follow); the older, that they did understand, so well, and with so many years' advantage, that they knew better (and therefore should be followed). As many studies revealed, student radicals of the New Left shared many more sentiments and values with their parents than with the rest of American society. Children of the relatively democratic families of the educated middle class, they wanted to live out the commitments to justice, peace, equality, and personal freedom which their parents professed. But about the meaning of affluence there was a divide of experience which could never be erased. Parents could never quite convey how they were haunted by the Depression and relieved by the arrival of affluence; the young could never quite convey how tired they were of being reminded how bad things had once been, and therefore how graced and grateful they should feel to live normally in a normal America.
The opportunities were real, however, and the revolts of the following decade would have been unimaginable without them. For the middle-class children who came of age in those years, there was an approved track for running faster and stretching farther: college and university training. Credentials were tickets—indeed, the only sure tickets—to the affluent society. The service sector of the economy was growing, the manufacturing sector shrinking. More employees than ever before were handling people and paper, not soil, ore, lumber, and steel. And if most of the white-collar workers—even most of the professionals—were performing repetitive labors in large organizations at less than spectacular wages, it still wasn't hard for them to feel, to know, that they were doing better than their parents did. They had reason to think that, with higher education, their children could move up higher still, perhaps to become secure, self-employed professionals like doctors and lawyers, even though the self-employed middle class was shrinking while the bureaucratized sector boomed. In this respect, the secretaries and clerks and low-level bureaucrats who made up the bulk of the white-collar sector shared the aspirations of the professionals and managers who made up the cream of it.
Even before the closing of the frontier, the American middle class had believed that education marked the route upward to membership in the republic of plenty. By the late Fifties, the demand from below for higher education was more than matched by a demand from above. The economic explosion detonated an educational one. During World War II, big science at the service of big government had begun to demonstrate what it could do for warfare: the Manhattan Project's atomic bomb was the supreme product of this partnership.
And the Cold War extended the partnership into peacetime, in the form of what radicals called "the permanent war economy." Big industry systematically enlisted science both to organize itself and to develop and market the peacetime cornucopia of consumer goods. The centers of power wanted better-trained personnel and government-subsidized knowledge.
To harness knowledge to power, no institution was more important than the university. In the permanent ideological as well as military mobilization which the Cold War and high-consumption economy promised, managerial styles would have to be taught; specific techniques for the manipulation of the physical world would have to be instilled; the American celebration would have to be refined and rendered plausible. But military arguments did the most to promote the cause of higher education. Especially after the Russians shattered American pride by getting into the heavens first with their Sputnik in 1957, public funds poured into the universities. "Intellect has ... become an instrument of national purpose, a component part of the 'military-industrial complex,'" wrote Clark Kerr. Total spending on public institutions of higher education rose from $742.1 million in 1945 to $6.9 billion in 1965.
The universities boomed even faster than the college-age population. The result was that by 1960 the United States was the first society in the history of the world with more college students than farmers. (By 1969 the number of students had nearly doubled, to three times the number of farmers.) The number of degrees granted, undergraduate and graduate combined, doubled between 1956 and 1967.
The proportion enrolled in public institutions rose especially fast. The elite universities still trained gentlemen, but increasingly the gentlemen were being trained as managers and professors, not bankers, diplomats, and coupon clippers with a taste for higher things. In the postwar meritocratic mood, there was more room—though still not as much as sheer academic merit would have commanded—for high school graduates like me whose background was not particularly gentlemanly. Science was our faith: Golly gee, Mr. Wizard. Knowledge solved problems; it worked. Even the pandemic fear of polio had a happy ending when Dr. Jonas Salk developed his vaccine in 1954; what miracles could not be wrought by scientific knowledge? (Nor was it lost on my family and friends that Dr. Salk, as well as Einstein and many atomic scientists, were Jews like us.) So it was fully within the spirit of the moment when Alexander Taffel, the principal of the Bronx High School of Science, wrote in my class yearbook:
About a century ago, the great editor, Horace Greeley, pointed the way of opportunity to the youth of his day in the words, "Go west, young man!" Today, there are no more undeveloped western territories but there is a new and limitless "west" of opportunity. Its trails lead through the schools, colleges and university to the peaks of higher learning. Never in history has there been so promising an opportunity for the young men and women who can make the ascent.
As you of the class of 1959 go on to higher education, you are in full accord with the times. The road you are taking is not an easy one but you will find it interesting and rewarding. For those who pursue it with devotion and sincerity, the signposts everywhere read, "Opportunity Unlimited!"
Yet the affluent Fifties were, as I. F. Stone wrote, haunted. Conformity was supposed to buy contentment, cornucopia promised both private and public Utopia, but satisfaction kept slipping out of reach. Opportunity meant competition; even the middle class had to wonder whether the great meritocratic race was really wide open. Plenitude beckoned, but there was no finish line, no place to rest and assure oneself, once and for all, "I've made it." And there were fears that could barely be kept at bay. The affluent society was awash with fear of the uncontrollable. The personal jitters matched the country's obsession with "national security."
Republicans and Democrats disputed whether the primary agent of insecurity was internal or external Communism, but virtually the whole society agreed that the Soviet state posed a serious threat to peace and the American way of life. The daily newspaper, the TV news, Time and Life and Reader's Digest, and at school the Weekly Reader, were all full of thick red arrows and black tides swooping and oozing across the West. The Bomb, which felt like a shield in 1945, turned into a menace again in 1949, when the Russians exploded their own. The supporters of Senator Joseph McCarthy feared the Communist Party of the United States of America. Liberal and left-wing enclaves feared McCarthyism. Conservatives feared social dissolution, immorality, rock 'n' roll, even fluoridation. Intellectuals feared their own past, and the mass mind.
The middle class furnished its islands of affluence, but around it the waters kept rising.
Popular culture and politics ran rife with foreboding. While the actual rate of juvenile delinquency probably declined in comparison with that of a half-century earlier, adults panicked. Juvenile delinquents haunted the imaginations if not the streets of the middle class; even if the barbarians could be kept away from the nation's gates, they might sneak into the house through the kids' bedrooms. Movies and comic books bent the prevailing insecurity into concrete fears of alien invaders who, descending from outer space or rising from the black lagoon, threatened the land, the lives, even the souls of harried America.
Blobs, things, creatures, body-snatchers, and all manner of other monsters crept into the sacrosanct household, infiltrated the bodies and minds of loved ones, stole their personalities, left them as standardized, emotionless hulks who could be read as Communist or conformist or just plain alien, depending on the terms of one's ideological paranoia.
There may not have been a single master fear, but to many in my generation, especially the incipient New Left, the grimmest and least acknowledged underside of affluence was the Bomb. Everything might be possible? So might annihilation. Whatever the national pride in the blasts that pulverized Bikini and Eniwetok atolls, whatever the Atomic Energy Commission's bland assurances, the Bomb actually disrupted our daily lives. We grew up taking cover in school drills—the first American generation compelled from infancy to fear not only war but the end of days. Every so often, out of the blue, a teacher would pause in the middle of class and call out, "Take cover!" We knew, then, to scramble under our miniature desks and to stay there, cramped, heads folded under our arms, until the teacher called out, "All clear!" Sometimes the whole school was taken out into the halls, away from the windows, and instructed to crouch down, heads to the walls, our eyes scrunched closed, until further notice. Sometimes air raid sirens went off out in the wider world, and whole cities were told to stay indoors. Who knew what to believe? Under the desks and crouched in the hallways, terrors were ignited, existentialists were made. Whether or not we believed that hiding under a school desk or in a hallway was really going to protect us from the furies of an atomic blast, we could never quite take for granted that the world we had been born into was destined to endure.
The Bomb also drew a knife-edge line between the generations. Our parents remembered World War II as The War, "The Good War," which, whatever its horrors, had drawn the country together and launched America upon its unprecedented prosperity. And if the memory of horrors lingered into peacetime, they associated the Bomb not so much with war as with the end of The War, deliverance for American boys spared the need to storm the beaches of Japan; and then, by the standard Cold War arguments, with the keeping of the postwar peace. The memory of Hiroshima and Nagasaki was either repressed or transfigured, forged into a shield against the hypothetically world-conquering Soviet aggressors. In government propaganda, the Bomb was either too terrible to be used or not so terrible that it couldn't be weathered. General-turned-President Eisenhower, the first professional military man to hold the office in three-quarters of a century, spoke soothingly of "Atoms for Peace," a slogan cheerfully used as an official postmark. If the Cold War was nerve-racking, the Bomb could tranquilize.
What was to become the New Left generation (at first only a small minority of the whole generation, of course) had a different angle of vision. For us, the future was necessarily more salient than the past. The Bomb threatened that future, and therefore undermined the ground on which affluence was built. Rather than feel grateful for the Bomb, we felt menaced. The Bomb was the shadow hanging over all human endeavor. It threatened all the prizes. It might, if one thought about it radically, undermine the rationale of the nation-state. It might also throw the traditional religious and ethical justifications of existence into disarray, if not disrepute. The Bomb that exploded in Hiroshima gave the lie to official proclamations that the ultimate weapon was too terrible to be used. It had been used. And worse was being prepared. We did not even know that genial Ike thought of the Bomb as a weapon like any other, one that might actually be used, one that, indeed, he threatened to drop in Korea and offered to the French in Indochina.
But this is one of those moments when I do not know exactly how many of "us" I am speaking about. There are no scientific-sounding numbers to wield. Much of the nuclear terror probably hovered just beneath the threshold of awareness. Several observers have reported what my own impressions and interviews confirm: children who grew up in the Fifties often dreamed, vividly, terrifyingly, about nuclear war.
This cannot have been simply because of the presence of the Bomb: there were far more missiles in the Seventies, when college students were not dreaming the same dreams. To some extent it must have been the stress of amply reported East-West confrontations. As the air raid drills confirmed, the Bomb was not just a shadow falling on some distant horizon. Bombs were actually going off. H-bomb tests obliterated atolls in the South Pacific; A-bombs regularly scorched the Nevada desert. President Eisenhower was benignly reassuring, except that East-West relations failed to improve—culminating in the collapse of the summit conference of 1960, when the Russians brought down Francis Gary Powers's U-2 spy plane and Ike was caught in a lie. Such reassurances did not altogether reassure.
Popular culture, that ever-quivering barometer, also registered some of the anxieties that Washington sought to dissolve with official elixirs. In many science fiction films of the Fifties, the Bomb was conspicuously the off-screen nemesis. Aliens sometimes recognized the atomic peril before the stupid humans did; they came to help us, and if we didn't get the point (nations of the world, unite), so much the worse for us. The Day the Earth Stood Still (1951) portrayed an otherworldly agent sent to warn earthlings that they had better not loose their military destructiveness into the heavens; paranoid American soldiers panicked and shot him. In Them! (1954), as in low-budget Japanese releases, it was atomic testing that created the bug-eyed monsters in the first place. On the Beach, about the aftermath of thermonuclear war, was a bestseller in 1957; the star-studded movie of 1959, the first to show a bomb-blasted planet more or less "realistically," suggested (in a speech by Fred Astaire) that the prewar world had been to blame for not taking the danger seriously enough.
The same Bronx High School of Science yearbook which contained the principal's paean to opportunity included these words, not from ban-the-bomb activists (none of those were visible in the class of '59) but from the student editors:
In today's atomic age ... the flames of war would write finis not only to our civilization, but to our very existence. Mankind may find itself unable to rise again should it be consumed in a nuclear pyre of its own making. In years to come, members of this class will bear an ever-increasing responsibility for the preservation of the heritage given us. Those of us who will become scientists must make certain that the Vanguards and Sputniks of the future herald the coming of an era of light and not an epoch of never-ending darkness.
The Bomb was not the only offstage presence to shake what C. Wright Mills called the American Celebration. For Jewish adolescents in particular, the Nazis were not so long defeated, and Hitler was the most compelling of all bogeymen. "Camp" did not mean only a place to go for the summer. Protective parents were reluctant to remind us, but rumors and images and random facts did seep into our consciousness. Photos of camp survivors, not yet stereotyped, floated through popular culture like stray bones, and lodged, once in a while, in our collective throat. One of my grandmother's brothers had stayed behind in Lithuania when she, three sisters, and another brother came to America, for example, and I was vaguely aware that all but one member of his family had been murdered; I remember my excitement when we learned, in the early Fifties, that one of her nephews had turned up, having apparently run off to join the Red Army near Vilna as the Nazi troops approached.
The Holocaust had not yet acquired that name, at least in my hearing; the catastrophe was simply a mangled piece of history, incomprehensibly real, unique to the twentieth century: our century. Meredith Tax, who grew up in the Milwaukee of the Fifties, has written: "Every night I looked under the bed for men from Mars, witches, and Nazis. My little brother slept with a German Luger, war booty of my father's, unloaded but with magic potency." The heavily German-American Milwaukee had an active Nazi Bund during the war, as she points out, and so the main downtown street was full of "war memorabilia" stores displaying swastikas. But even in New York my father once or twice referred darkly to Yorkville, the German section of Manhattan, as if once, in prehistory, something terrible—I was not to know what—had happened there.
We were Survivors, in short, or our friends were, without having suffered in the flesh, thanks to our (or our friends') grandparents for having journeyed halfway around the world to Ellis Island. But our luck was tainted, confused. For some parents, the relief they felt was another form and measure of America's bounty, the gift of affluence. But questions nagged: Why should we have been so lucky? How close was the close call? Again a spiritual gulf opened between the generations, a divide which led us in later years to our different ways of reliving World War II.
Our parents had lived through these horrors. Later, childishly thinking them omnipotent, we wanted to know: How could they have let this happen? How could they not have known? Some felt tremors of guilt, perhaps just beneath the threshold, that they had let the slaughter take place without quite knowing, without making a point of knowing, without doing much, or anything, or in any case enough—but what would have been enough?—to help their European cousins, to press the sainted FDR to bomb the tracks to Auschwitz or open the immigration gates. One might even surmise that some of their guilt was later fought out over Vietnam, that the Jewish Cold Warriors of the Fifties and early Sixties were dead set on stopping Communism precisely because they had failed to stop the Nazis—whereas to me and people I knew, it was American bombs which were the closest thing to an immoral equivalent of Auschwitz in our lifetimes.
When the time came, we jumped at the chance to purge ourselves of the nearest thing to the original trauma. And then atrocities committed by innocent America rang the old alarms—even if the parallels were drawn too easily, overdrawn, with crucial differences obscured. (Killing peasants because they were supposed to be Vietcong, even destroying villages "in order to save them," as an American officer once famously said, was not the same as killing Jews systematically because they were Jews.) We were going to be active where our parents' generation had been passive, potent where (having once looked omnipotent) they had finally proved impotent. Then we could tell our parents: We learned when we were children that massacres really happen and the private life is not enough; and if not now, when?
So the generational divide was not just an economic but a spiritual fact. And if Jews were transfixed by their unforgettable knowledge, it was not only Jews who were haunted. Many gentiles (as well as Jews) converted the Holocaust into yet another reason to love America, but some brooded about what it implied for the human heart and the human project, even for redemptive dreams of affluence. The massacre of the Jews was a huge fact lying overturned, square in the middle of the through route to progress. There were some, or many, for whom the Holocaust meant that nothing—neither private satisfactions nor the nation's greater glory—could ever supplant the need for a public morality. There were Christians as well as Jews who concluded that they would never end up "good Germans" if they could help it.
The fact of affluence and the terror of destruction: the tension was especially sharp among a minority, the largely urban and suburban, disproportionately Jewish children of the more-or-less affluent but discomfited middle class. And this minority was located within huge institutions, the elite but mass universities, which collected these forces, as a magnifying glass collects the rays of the sun, and brought them to a smolder. For neither economic tendencies nor even political issues by themselves could generate a student movement.
First there had to be an igniting minority.
This early New Left of the early Sixties, which I will sometimes call the old New Left, the pre-Vietnam New Left, aspired to become the voice, conscience, and goad of its generation.
It was never quite typical: it was morally more serious, intellectually and culturally more ambitious than the rest of its generation. It shared its generation's obsessions, and then some, but focused them in an original way. Itself ignited by the civil rights movement, it was the small motor that later turned the larger motor of the mass student movement of the late Sixties. Within a few years this minority created a tradition—a culture, a style, an approach to society, a set of tactics—that played itself out in the movement's subsequent history. It was on the achievements as well as the paradoxes and tensions of the old New Left that the later movement foundered.
The old New Left was acutely, even sentimentally, conscious that they were of a particular age. "We are people of this generation," the 1962 Port Huron Statement of Students for a Democratic Society opens, "bred in at least modest comfort, housed now in universities, looking uncomfortably to the world we inherit." But the authors of this document were aware that they were not altogether typical of their affluent peers. "Our work is guided by the sense that we may be the last generation in the experiment with living," they wrote.
"But we are a minority—the vast majority of our people regard the temporary equilibriums of our society and world as eternally-functional parts."
This minority turned out to be, as Jack Newfield later wrote, "prophetic," but at the time they could not be sure. (Anyway, many people feel like prophets and turn out to be wrong.) They were not only willing to be marginal, they felt there was a kind of nobility in being devoted to the public good in an unconventional way. In a nation devoted to private pursuits, they believed in public action. In a culture devoted to the celebration of middle-class security, they labeled it smugness and expressed solidarity with people who were systematically excluded from a fair share in prosperity. The revelation that there were people blocked from affluence not only offended them, it discredited the dream—a dream they already felt ambivalent about, even estranged from. They felt cramped by the normal middle-class pursuits of career, family, and success, and they brandished their alienation as a badge. They were not satisfied to take up public participation as a sideline, whether in political parties, PTAs, or professional associations. Their peers wanted to make families; this tiny group wanted to make history.
The New Left, when it erupted, insisted that above all it was new, tailored to a new time, exempt from the vices that had afflicted the various factions of the Old Left. There was truth in the insistence. The Old Left had been shattered by McCarthyism, the Cold War, the postwar consensus, and its own moral obtuseness vis-à-vis the Soviet Union; partly for this reason, partly because of the prevailing fear of getting out of line, and partly because of the rewards of gray-flannelled conformity, there was (with few exceptions) a "missing generation" on the Left. Few were the radicals twenty to thirty years old in the Fifties who might have served as exemplars for the next generation, a link between experience and innocence. The self-flattering idea of a virgin birth enabled the early New Left to think its way past defeat, to break from both pro-Soviet and Cold War rigidities. From this reality came much of the famous New Left spunk, the impulse to go it alone. But the heady truth in this image of self-creation also concealed continuities. The movers and shakers of the Sixties did not invent a new political culture from scratch.
Even in the ranch-housed, well-laundered Fifties, while the bulk of the middle class busied itself with PTA meetings, piano lessons, and The Saturday Evening Post, there were, dotted around the country, enclaves where groups of adults carried on in opposition to prevailing values. Moreover, within the very mass youth culture which affluence made possible, the self-satisfied Fifties were crisscrossed by underground channels where the conventional wisdoms of the time were resisted, undermined, weakened. It was in these enclaves of elders and subterranean channels, rivulets, deep-running springs—or backwaters and swamps, depending on your point of view—that unconventional wisdoms, moods, and mystiques were nurtured.
With left-wing politics in a state of collapse, most of these oppositional spaces were cultural—ways of living, thinking, and fighting oneself free of the affluent consensus. Most were indifferent or hostile to politics, which they saw as yet another squandering of energy.
But even the anti-political enclaves opened a space for later and larger oppositions, both the New Left and the counterculture, oppositions compounded—however contradictorily—of politics and culture. The beats were the main channel; hostile to the postwar bargain of workaday routine in exchange for material acquisition, they devoted themselves to principled poverty, indulged their taste for sexual libertinism, and looked eastward for enlightenment. Overlapping, there were other tiny bohemias of avant-garde culture and political dissonance, notably the radical pacifists of Liberation, New York's Living Theatre, San Francisco's anarchist and East-minded poets, jazz connoisseurs, readers of The Village Voice and Evergreen Review. Battered remnants of the Old Left carried their torches for some kind of socialism, rejected the orthodoxies of the Cold War to one degree or another, and felt the national security state to be a menace rather than a guarantor of true-blue liberties; they maintained a "folk culture" in the absence of an actual folk. These were, to use the shorthand, subcultures where exotic practices attracted a hard core of rebels, a fringe of hangers-on, and a larger penumbra of the part-time, the tempted, and the vicarious participants. More narrowly political were the invisible communities clustered around the social-democratic Dissent and I. F. Stone's anti-Cold War Weekly, trying in different ways to think in the name of a Left that did not exist. In their studies, and among their students, obscure critical intellectuals like Paul Goodman, Herbert Marcuse, Norman O. Brown, William Appleman Williams, and Betty Friedan were writing the books, many of them not even published until well into the next decade, which set a tone for rebellion when rebels came up from the underground streams, looked around, and decided to make history. There was anger collecting in these nodes, but they also were governed by a happy sense of their distance from the normal. It was as if they were living in color while the rest of America was living in black and white. They radiated jarring signals to the next generation.
At the same time, usually less angry, certainly less focused, and far more extensive, popular music and the movies and other forms of mass-distributed culture began speaking in their own ways directly to the young, challenging the affluent society's claims that its social arrangements were sufficient nourishment for the human spirit. Some of the initiative came from the entrepreneurs of popular culture, who, to keep the mainstream entertained, scouted the margins, absorbing outsiders and outsideness, packaging them in marketable form, relaying the idea that authorities were questionable and that to be young was to be weird, angry, marginal, dispossessed. So hoods acquired a shadow life as folk heroes. But more important than the hoods themselves, their culture of delinquency turned out to be the outer edge of a more vast and amorphous teenage culture. To put it another way, what happened in the mid-Fifties is that the normal teenage culture borrowed the mystique of the subterraneans in order to express its own uneasy and ambivalent relation to the society of parents. The adolescent society depended on affluence—on time and money of its own to spend—but it also flirted with the harmless part of the culture of delinquency: the spirit of fun and adventure, the disdain for studies, the drinking, smoking, making out, swearing, staying out late. Never before had so many of the leisured young had a chance to spend so much so relentlessly to indulge their tastes. The marketplace sold adolescent society its banners. To call the resulting spectacle an "adversary culture" would be to lend it too much coherence and to miss its ambiguities. But this cultural display was certainly far from an uncritical embrace of the social order. Where the narrower enclaves and channels of the beats, the bohemians, and the remnant Left opened spaces for the New Left in the early Sixties, and for the pure counterculture later on, the shallower channels of the Fifties' teenage culture marked the territory for the far larger youth upheaval of the late Sixties.
Rock and roll and its dances were the opening wedge, hollowing out the cultural ground beneath the tranquilized center. Marlon Brando and James Dean embodied styles and gestures of disaffection. On the fringes, satirists of all kinds—Mad, Lenny Bruce, Tom Lehrer, Mort Sahl, Chicago's Compass and Second City cabarets—ridiculed a host of pieties.
TV's Steve Allen and Sid Caesar and their offshoots and imitators carried some of the rambunctious spirit into the mainstream. Late in the decade, domestic avant-garde films as well as foreign dramas of dislocation helped a new college generation feel that angst was normal. As America exported Hollywood movies, it imported parables of estrangement.
Literary culture was also piled high with maps of a devastated social landscape; struggles with the absurd resounded in the heart of every half-alienated student. Lost souls and embattled antiheroes paraded their losses of meaning. J. D. Salinger's Holden Caulfield was revolted by "phoniness," and his other dislocated adolescents dabbled in Zen. In the legitimate theater, Arthur Miller's Willy Loman matched the spiritually uprooted souls of Riesman's and Mills's sociology. To Beckett's and Genet's and Ionesco's characters, the postwar cornucopia looked absolutely beside the point. Off Broadway, "communication" was problematic, "togetherness" a bad joke, happy endings the real absurdity; and Grove Press's Evergreen Review carried the news outside New York. In Lady Chatterley's Lover and Henry Miller's Tropic novels, finally available over the counter, raw sex was posed as the oasis in an arid society. Existentialism started from the premise of meaninglessness, and then executed a brilliant judo move: it declared that precisely because humanity is deserted by God and values are not inscribed in the natural order of things, human beings are responsible for making their own meanings. (It followed, then, that authority would always have to prove itself, minute to minute. If Norman Mailer could bend existentialism to support John F. Kennedy in 1960, he could just as easily turn it against Kennedy's successor and the Vietnam war in 1965.) Book marketing itself pried open a new cultural space: Starting in 1952, first Doubleday and then other publishers began to publish serious nonfiction in paperback, so that avant-garde currents and European repertory—existentialism, the absurd, all manner of philosophy, history, and sociology—could circulate to the idea-hungry and college-bound.