
The Middle Out

The Rise of Progressive Economics and a Return to Shared Prosperity

Political journalist Michael Tomasky tracks an exciting change among progressive economists, who are overturning decades of conservative dogma and offering an alternative version of capitalism that can deliver broadly shared prosperity to all.

In the first half of the twentieth century the Keynesian brand of economics, which saw government spending as a necessary spur to economic growth, prevailed. Then in the 1970s, conservatives fought back. Once they got people to believe a few simple ideas instead—that only the free market could produce growth, that taxes and regulation stifle growth—the battle was won. The era of conservative dogma, often called neoliberal economics, had begun. It ushered in increasing inequality, a shrinking middle class, and declining public investment. For fifty years, liberals have not been able to make a dent in it. Until now. 
 
In The Middle Out, journalist Michael Tomasky narrates this history and reports on the work of today's progressive economists, who are using mountains of historical evidence to contradict neoliberal claims. Their research reveals conservative dogma to be unfounded and shows how concentrated wealth has been built on the exploitation of women, minorities, and the politically powerless. Middle-out economics, in contrast, is the belief that prosperity comes from a thriving middle class, and therefore government plays a role in supporting families and communities. This version of capitalism—more just, more equal, and in which prosperity is shared—could be the American future. 

“Tomasky has written an engrossing history of ideas. It’s an incisive look at neoliberalism’s trajectory and the rise of a new intellectual model for truly shared prosperity. The Middle Out is critical to understanding our current political economy.” —Heather McGhee, author of The Sum of Us: What Racism Costs Everyone and How We Can Prosper Together

“One of America’s great liberal journalists tackles the great liberal problem of our age — the economic defenestration of democracy. Essential reading for those seeking to understand how we got into this mess, and how to get out.” —Zachary D. Carter, author of The Price of Peace: Money, Democracy, and the Life of John Maynard Keynes

“It would not be enough to say that the articulate Michael Tomasky targets almost all the economic challenges of our time in his extraordinary new book, The Middle Out. He writes highly accessibly, makes economics a fascinating story, and has a powerful understanding of how economies work.  The American economy has not been run on the basis of well-founded ideas in the last forty years, he makes clear, but on the basis of narrow and self-centered interests. He provides pathways to return to an age of the kinds of values that once made it possible for a compassionately and intelligently run economy to fulfill the lives of all Americans.” —Jeffrey Madrick, author of Invisible Americans: The Tragic Cost of Child Poverty
Chapter One
The Golden Age


Emerging from the War

This book will end by looking forward, but it will open by looking back. Specifically, we will look back to the era after World War II and examine how economic policy making changed over that time, so that we have a fuller understanding of how we got to this point, when (most of) the Democratic Party has embraced a much more economically populist agenda. What made that happen? This book offers four different answers to this question, all to be discussed in coming chapters, each answer zooming the historical lens out a bit further than the last. In the short term, the main answer is the pandemic—a crisis that created the conditions that made a greater level of government intervention in the economy possible. In the slightly longer short term, say the last fifteen or so years, it’s changes in economic thinking and in political activism and in the often-overlooked but important foundation world that opened up the space for ideas that challenged the reigning conventional wisdom. In the longer term still, I examine the economic ideas and principles that were incubated in the 1940s and that came to dominate policy making in the 1970s, because all that new thinking and activism of the last fifteen years represented a reaction to them. And finally, my fourth answer to the above question lies in the fact that those ideas, the ones that came to prominence in the 1970s, were themselves a reaction to the reigning ideas of the earlier era, which takes us back to the end of World War II. We could do this forever, I suppose, and there’s a good argument for going back to the New Deal, but I choose the late 1940s because that was the time when our modern world and economy were, in many important senses, born. Before 1945, as the labor historian Joshua B. Freeman has written, the United States “was much more a conglomeration of regions with distinctive forms of economic activity, politics, and culture” than it would become.

So this returns us to what is often called the golden age of capitalism—that postwar period when the middle class exploded, wages grew steadily, American homes added comforts that would have been inconceivable in the Depression era, and everything just seemed to work. As we’ll see, everything didn’t work for everyone. Black people, other people of color (who didn’t exist in the United States in large numbers yet), and women were largely cut out of the action, a historical error that any new golden age must take pains to correct. And even beyond that, there were problems aplenty, as there always are: intense labor strife, deep ideological divides over communism and anticommunism fueled by demagogues like Joe McCarthy, violent battles for civil rights, bitter division over the war in Vietnam.

But economically speaking, it was a comparative golden age. And this is the first important point that needs to be made today about this period. It was an age of shared prosperity compared with today, and it was such because our economic values were better. Coming out of depression and war—two experiences shared by the whole population that demanded sacrifice and made people think about not just their own well-being but that of their neighbors—the United States of the 1940s and 1950s had morally superior economic values to the United States of, say, the 1980s.

But let’s put morality to the side for now and establish that the period was materially better. Robert Skidelsky, the economic historian and biographer of John Maynard Keynes, wrote in his short post–Great Meltdown volume, Keynes: The Return of the Master, of the two broad postwar periods that he refers to as the Bretton Woods era and the period of the so-called Washington Consensus. Bretton Woods refers to the Bretton Woods Conference of July 1944, when representatives of the Western democracies met at a resort in New Hampshire to set up a system of global economic cooperation to take hold once the war was won. Bretton Woods established the International Monetary Fund and provided for a system of fixed exchange rates; in addition, the Bretton Woods system encouraged government intervention in the economy to promote full employment as a chief goal. Thus, the Bretton Woods system was, Skidelsky writes, broadly Keynesian. The “Washington Consensus” was a phrase coined by the British economist John Williamson in 1989, referring originally to a set of policy ideas that Washington sought to impose on developing economies. These ideas included lower tax rates, deregulation, privatization, and kindred notions; given that, the term came to refer to neoliberal market fundamentalism more generally. The Bretton Woods system lasted, according to Skidelsky, from 1951 to 1973, when the first OPEC oil crisis caused a number of economic shocks. After a transition period, the Washington Consensus era ran from 1980 to the time of Skidelsky’s writing (the book was published in 2009).

Skidelsky compares global and Western economic performance during the two periods using a variety of metrics. He begins with real global GDP growth, which ran at 4.8 percent during the Bretton Woods period and 3.2 percent during the Washington Consensus period (and remember, the neoliberals say that it’s all about growth!). He also notes the existence of five global recessions since 1980, and none during the Bretton Woods era (plus one large one in that mid-1970s interregnum). Likewise, global unemployment grew in the Washington Consensus period; Skidelsky tracks the unemployment rate in the five major industrial democracies, and all rose during the latter period (most dramatically in the U.K. and Germany; in the United States, the increase was from 4.8 percent to 6.1 percent, which isn’t as sharp as the U.K.’s 1.6 percent to 7.4 percent but still means a few million more people out of work). Inequality was, not surprisingly, “stable in the Bretton Woods age, but it rose sharply in the Washington Consensus years from 1982 and all the way into the new millennium.” World growth volatility—changes in the growth rate of real GDP over time—was characterized by large spikes in the mid-1970s and early 1980s but overall was about the same in the two periods. Only on inflation did the Washington Consensus period produce better results, and even there the numbers weren’t as different as many might expect—3.9 percent in the Bretton Woods period against 3.2 percent in the Washington Consensus period. Likewise, four economists taking a longer historical view found that the average annual growth rate for the advanced democracies in the 1950 to 1973 period was 4.9 percent, compared with 2.2 percent for 1820 to 1870, 2.5 percent for 1870 to 1913, 1.9 percent for 1913 to 1950, and 2.5 percent for 1973 to 1979 (when their study ended). They note also that the growth was broadly shared—wages grew steadily on pace with productivity, a key measure indicating that gains weren’t just concentrated at the top as they have been in more recent decades. So the economy really was better for middle-class families than before or since.

It should be said that none of this growth and prosperity was inevitable. Early during World War II, as it became clear, in the United States and the U.K. in particular, that the Allies were likely to win the war eventually, economic and political minds turned toward the postwar economy. They did so with considerable trepidation. To appreciate this, we have to imagine the wartime economy, barely comprehensible to us today in our age of unquenchable consumerism. This was a time when millions of people were doing without. Meat, sugar, rubber, gasoline, and many other goods were rationed. Governments issued pamphlets to housewives instructing them on how to squeeze everything they could out of the groceries they were allowed to buy. There was a Home Front Pledge that even Eleanor Roosevelt took: “I will pay no more than top legal prices. I will accept no rationed goods without giving up ration points.” As the end of the war approached, some critics observed that the upper classes were growing a bit weary of all this. The historian Isser Woloch wrote that in the summer of 1944 George Orwell noted, in his regular newspaper column, that certain cars on the British rail system were once again being designated first-class, a distinction that had disappeared after the war’s onset, and that some gated private parks in London, “which had been opened to the public when the government requisitioned their iron railings for scrap metal in 1940,” were once again being closed off.

There was deep fear among policy makers in both the United States and the U.K. that the economy would tank when the war ended. The wartime economy put everyone to work, including women, who had previously mostly stayed at home; the transition to a peacetime economy would surely reduce employment and lead to some serious dislocations and a period of labor strife. Inflation fears were rampant, too, because wartime laws on price stabilization across a host of sectors were due to be lifted.

Wartime planners like the New Dealer Chester Bowles, who as the war drew to a close headed the Office of Price Administration (where a young Richard Nixon worked for a time), saw all this coming and wanted to avoid these shocks. But they were also driven by larger, historic motivations in both the United States and Britain: that these nations, the world’s two most durable democracies, must not simply revert to the way things were before; that they were going to emerge from the war, first, in a new position of economic and political supremacy vis-à-vis the rest of the world and, second, with something of a duty to deliver on the mighty ideals they had invoked in fighting Hitler and would continue to invoke against Stalin during the Cold War.

This meant, in short, that new degrees of social provision would be necessary and that government would have to drive this activity, because the private sector on its own would not. In the U.K., this thinking was most famously made manifest in the Beveridge Report. On June 10, 1941, the War Cabinet commissioned Sir William Beveridge, an economist who had served in a range of government posts, to conduct “a survey of the existing national schemes of social insurance and allied services, including workmen’s compensation and to make recommendations.” In November 1942, he issued his report, which he packaged as an assault on the “five giants” that plagued the lives of the greater proportion of the British population: want, disease, ignorance, squalor, and idleness. The two that you couldn’t get away with today, ignorance and idleness, were seen in the report as reflecting not moral failings of the people but rather the failure of society to properly educate people and provide them with sufficient motivational opportunities. The report’s recommendations were sweeping, especially so in a country that had not embraced government expansion in the 1930s in quite the way Roosevelt’s America had. In 1945, the Labour Party released its manifesto for that year’s campaign, called Let Us Face the Future. While a bit vague on the particulars of matters like housing and health care, the manifesto nevertheless committed the party to expansive measures (“the best health services should be available free for all”).

Something similar was happening in the United States. It was happening without quite the same level of commitment, because the American Democratic Party was more ambivalent about such matters than the British Labour Party, and because the American president was ideologically elusive; yes, despite all that Franklin Roosevelt accomplished and all the change he oversaw, he was in some ways a cautious politician, unwilling to confront his party’s segregationists, backpedaling toward deficit reduction after his 1936 reelection. But he was being pushed on this matter of the postwar economy by his advisers, and none more so than Chester Bowles. Around Thanksgiving 1943, as Roosevelt was preparing to leave for the Tehran Conference with Churchill and Stalin, at which the Western powers committed to opening up a second front against Nazi Germany, Bowles sent Roosevelt a memo. He urged the president to return from Tehran with the message that the soldiers were asking him about what kind of country they would be returning home to—one still stunted by unemployment and poverty, or one committed to a new dynamism. It had to be the latter. And so Bowles advised FDR to return from Iran and give a speech that might go as follows:

Therefore, I propose a second Bill of Rights in the field of economics: the right to a home of your own, the right of a farmer to a piece of land, the right of the businessman to be free from monopoly competition, the right to a doctor when you’re sick, the right to a peaceful old age with adequate social security, a right to a decent education.

Roosevelt did not exactly respond with enthusiasm. He shared the memo with another aide and asked, “What the hell do I do about this?” But he did come around. The next January, in his State of the Union address, Roosevelt unveiled his proposal for an “Economic Bill of Rights”: the rights to work, food, clothing, and leisure; freedom from monopolies and unfair competition; additional rights to housing, education, medical care, and Social Security. Bowles and other New Dealers were hopeful that Roosevelt would make this the theme of his reelection campaign. He didn’t; he ran largely as a wartime president, and he mentioned the Economic Bill of Rights only one other time, in a speech in Chicago in late October that was broadcast nationally on radio. But it was enough. The Democratic Party was committed now to postwar Keynesianism in a way it had not been before Bowles’s intervention and Roosevelt’s speech.

What Made the Golden Age Golden?

The golden age, it must be said, got off to a rather leaden start. It didn’t help the national morale that as Hitler was locked in his bunker, two and a half weeks shy of putting a bullet in his mouth, Roosevelt died, handing leadership of the nation to a vice president the people (outside Missouri) barely knew. Harry Truman, who today regularly ranks as one of our great presidents, inspired little confidence in April 1945. And as far as the economy was concerned, many of the postwar fears of the New Deal men were quickly realized. Inflation surged. The price of steel rose almost immediately after the war. Likewise, the price of meat—so strictly rationed during the war—spiked in 1946 as price controls expired over President Truman’s veto. The controls were reimposed, but the supply of meat disappeared “as the big meatpacking companies held back supplies and cattlemen kept their steers in feed lots awaiting higher prices before they would send them to slaughter.”

The immediate postwar period also saw an enormous amount of labor strife. Organized labor, then very powerful and quite left wing, reined in its militancy during the war, with unions in many sectors accepting wage freezes (that’s how health insurance became attached to employment in the United States; it was a nonwage benefit management could offer workers) and agreeing not to strike (although there were literally thousands of wildcat strikes, sometimes pitting workers against not only bosses but their own leaders). Then, after the war, labor unrest exploded, and 1946 saw more strikes than any year in American history. A mere three days after V-J Day, the United Auto Workers (UAW) requested a 30 percent pay increase; this was at a time when wages in most sectors were going down, in part because of the oversupply of labor, what with several million men returning home from the war. Electrical workers followed, then oil workers and coal miners and lumber workers and longshoremen and more. All told, in 1946, about 4.6 million workers were involved in strikes. This was in a workforce of around 60 million people.

Over the next couple of years, inflation abated and labor unrest quieted (not entirely voluntarily—the Republicans took over Congress in January 1947 and passed just one big bill, the anti-union Taft-Hartley Act, which made it much harder for unions to organize and to strike; Truman vetoed it, but Congress, with more than half of all Democrats joining the Republicans, overrode his veto comfortably). A recession hit in November 1948—well-timed from the perspective of Truman, who’d just pulled off his upset victory over Thomas Dewey. Pent-up wartime consumer demand had been over-satisfied; department store sales crashed by 22 percent. Unemployment began to rise because, with all those returning veterans, there were far more people than jobs. The downturn wasn’t severe, but it lasted nearly a full year. Everything was sorted out by 1950, though, and it was off to the races. From 1950 to 1973, growth in gross domestic product averaged 4.06 percent a year. For a little perspective on that, we’ve had just two years above 4 percent in the entire twenty-first century—its first year, 2000, when growth was 4.1 percent, and 2021, when it was 5.7 percent.

What made the golden age golden? Economists point to such factors as “an unprecedented growth rate of labor productivity along with a similarly high rate of capital accumulation.” Conservatives would put it all down to a newly dynamic private sector—innovators like Abraham Levitt and his sons William and Alfred, who built the American suburbs, and the engineers at GE, who started bringing more good things to life in postwar America than dazzled consumers could believe. The explosion in international trade was important; the World Trade Center that ceased to exist on September 11, 2001, opened in the early 1970s but was being planned by the Port of New York Authority (as it was then known) as far back as 1946. Increased military spending to fight the Cold War and the Korean War—and to sustain this new national-security state on a permanent basis—was a huge factor. Indeed, “military Keynesianism,” a concept that first came into vogue in the United States during World War II, was firmly established as national policy by the time of Truman’s National Security Council directive NSC-68, which advocated a massive expansion of the U.S. military budget. By 1953, writes the historian Jonathan Levy, “total federal government expenditures accounted for nearly a quarter of GDP—nearly two-thirds of which was military spending.”

Levy posits that the golden age was harmonized by a “fiscal triangle” that consisted of the federal government, the private for-profit sector, and the nonprofit sector. The private sector, of course, drove the economy and was still dominated at that point by the production of actual goods (as opposed to services; the United States transitioned to being a service economy from the late 1960s through the 1980s). The nonprofit sector at the time was growing by leaps and bounds. Nonprofit philanthropy had received favorable tax treatment for decades, but the big moment came when Congress passed the law establishing tax-exempt, so-called 501(c)(3) organizations in 1954. The government was also growing and becoming more professionalized. Truman signed a law creating the Council of Economic Advisers to give presidents objective and professional economic advice (although they often lose arguments to the political people). The second CEA chair was Leon Keyserling, a liberal Keynesian who had studied at Columbia under FDR’s adviser Rexford Tugwell. No one called the government the enemy in those days. The government, the private sector, and the nonprofit sector worked together, wrote Levy, to become “the dominant political-economic coordinating mechanism of Cold War liberalism. Its task was the production, distribution, and redistribution of industrial income.”

That last noun is crucial. Redistribution on a vast scale was accepted more or less across the political spectrum, except by a few cranks—some Texas oilmen, a dean at Notre Dame named Clarence Manion, assorted capitalists like Walter Knott of Knott’s Berry Farm fame—who had no political power at the time. The top marginal tax rate was 94 percent during the war; it was lowered to 91 percent for 1946. “Marginal rate” means that rich people paid these rates only on dollars earned above $200,000, an amount of money that almost no one not named Rockefeller made. (A small sign of how our values have changed: in 2021, that $200,000 would equate to about $2.8 million, an amount earned by roughly 300 Major League Baseball players, as well as by something approaching 5.5 million other people.) However, taxation at lower levels of income was still quite high. For example, on dollars earned above $90,000 ($1.2 million today), filers paid 87 percent. On dollars earned above $50,000 ($697,000), they paid 75 percent. On dollars earned above a relatively modest $16,000 ($223,000), 50 percent of each of those dollars went to Uncle Sam. Even the lowest rate then, 20 percent, was twice today’s 10 percent. And compared with today’s seven tax brackets, there were a staggering twenty-four in 1950. At all levels, if you made more, you paid more.
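To make the bracket mechanics concrete, here is a minimal sketch of how a marginal schedule taxes income slice by slice. It is illustrative only: it assumes a simplified five-bracket schedule built from the handful of rates quoted above, not the actual twenty-four-bracket 1950 table, so the dollar figures it prints are not historical tax bills.

```python
# Minimal sketch of marginal-rate taxation. The brackets are a simplified,
# illustrative subset of the 1950-era rates quoted above, not the real
# twenty-four-bracket schedule.

BRACKETS = [
    # (bracket floor in dollars, rate on dollars above that floor)
    (0, 0.20),        # the lowest rate then: 20 percent
    (16_000, 0.50),   # dollars above $16,000 taxed at 50 percent
    (50_000, 0.75),   # dollars above $50,000 taxed at 75 percent
    (90_000, 0.87),   # dollars above $90,000 taxed at 87 percent
    (200_000, 0.91),  # dollars above $200,000 taxed at 91 percent
]

def tax_owed(income: float) -> float:
    """Tax each slice of income at the rate for its own bracket."""
    owed = 0.0
    for i, (floor, rate) in enumerate(BRACKETS):
        ceiling = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > floor:
            owed += (min(income, ceiling) - floor) * rate
    return owed

# A $250,000 earner pays 91 percent only on the $50,000 above $200,000.
income = 250_000
print(f"tax owed: ${tax_owed(income):,.0f}")               # $191,400
print(f"effective rate: {tax_owed(income) / income:.1%}")  # 76.6%, not 91%
```

The sketch makes the same point as the text: the posted top rate applied only to the top slice of income, so even a very high earner faced an effective rate well below 91 percent.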

Of course, then as now, the tax code was festooned with deductions and ways to avoid paying the posted rate. Still, those are rates that would be impossible to replicate today; antitax forces would literally start a war. Rates that high would also be bad policy today, in a world where capital is so mobile. (If they made me emperor, I’d probably impose a top marginal rate of about 55 or 60 percent on some quite-high dollar figure, like $5 million.) But those tax rates did one salutary thing: they brought in a ton of revenue. Despite what the supply-siders say—and we’ll dig into this in more detail in a later chapter—the history on this point is as straightforward as it can be: higher tax rates produce more revenue. There was money for the government to spend, and it spent it. Yes, on building up the military, but also on things like rebuilding Europe and Japan: the Marshall Plan cost around $15 billion, or $180 billion in today’s dollars. The Federal-Aid Highway Act of 1956, which built the Interstate Highway System (on the basis of a military-emergency rationale), cost more than $100 billion over ten years. That would equate to about $1 trillion in today’s dollars.

Corporate tax rates were higher, too. The statutory corporate tax rate in the 1950s was 50 percent. The effective rate jumped around but was typically in the 30 to 40 percent range. The maximization of profit that became holy writ in corporate America later in the century wasn’t part of the equation then. Shareholders in publicly held companies had a right to expect a reasonable return on their investment, but only that—a reasonable return. Such was the ethos of the time. There was a business group, the Committee for Economic Development (CED), that was started during the war by executives from the Studebaker auto company and Eastman Kodak to plan reconversion to a peacetime economy. The CED got corporate America behind the Marshall Plan, and it enlisted private-sector CEOs and university presidents to describe the social role of business. An economist named Howard Bowen published a book in 1953 called Social Responsibilities of the Businessman. Interestingly, it was one of a series of books on Christian ethics produced by the Federal Council of the Churches of Christ in America. This paragraph sums up the book’s, and to some extent the era’s, ethos:

The unrivaled freedom of economic decision-making for millions of private businessmen, which characterizes our free enterprise system, can be justified not if it is good merely for the owners and managers of enterprises, but only if it is good for our entire society. We can support freedom and private control of enterprise only if it is conducive to the general welfare by advancing progress, promoting a high standard of living, contributing to economic justice, etc. We judge its success or failure in terms of the public interest. When we consider proposals for its modification, we do so with the public interest in mind. Business, like government, is basically “of the people, by the people, and for the people.”

And so General Motors consummated, with the UAW’s leader, Walter Reuther, the Treaty of Detroit in 1950, in which management for the first time accepted the idea that the union was its legitimate bargaining partner. Health care and pensions became a permanent part of labor contracts. In that same year, Truman signed into law amendments to Social Security that dramatically expanded that program. Yes, the amendments raised the level of taxation, to 6.1 percent of payroll (it’s 6.2 percent today, for both employer and employee), but they increased average benefits from $26 a month to $46 a month, added ten million new workers to the rolls, and greatly increased benefits for widows and orphans. And of course Medicare and Medicaid were created in 1965 (these were technically also amendments to Social Security).

This little section barely scratches the surface of a remarkable time. There was still a lot wrong with that America, as the next section will discuss. And even as the country committed to these liberal public investments, a new conservative movement was taking shape that would change the country entirely by 1980. But in sum, the golden age of American capitalism was golden because that America had better economic values. We invested in ourselves. We cared about inequality. We did not venerate extreme wealth. We taxed the well off to pay to improve the lives of working people. In other words, I’d argue, the America of that era practiced middle-out economics. And working people saw their lives improve in dramatic ways.

It Wasn’t Golden for Everyone

It is important to note, though, that this golden age had deep defects, defects we’ve only just started to reckon with. Our political culture—and yes, this was true of mid-century liberalism, not just conservatism—was built around the understood presumption that the chief beneficiaries of all this public generosity would be white men.

In fact, if the postwar era was a time of rare consensus on behalf of public expenditure, that was probably because of this presumption. This is the argument of some recent and important works of history, notably Jefferson Cowie’s Great Exception. Cowie’s main thesis was that those of us who came of age during the Bretton Woods era thought of it as “normal” and the Reagan counterrevolution as aberrational. Cowie argues that that has it backward—that the New Deal and the postwar period of vast public investment were the actual outliers in American history. But another point he makes is about race: that the broad social cohesion that characterized the United States during the Bretton Woods period happened because most people were white. The United States in 1950 was roughly 88 percent white, 10 percent Black, and 2 percent other. To Cowie, “a sort of ‘Caucasian’ unity took place among a historically divided working class.” As we know well, many, many white voters resent government doing things perceived as benefiting people of color. It’s hard to avoid the grim conclusion that America was okay with redistribution provided the goods were going to white folks.

That Blacks were legally excluded from most of the postwar bounty is well known. The GI Bill, officially the Servicemen’s Readjustment Act of 1944, may be the most often cited example. But what many people don’t understand is that the bill’s discriminatory effects were smuggled in through a side door, very slyly constructed. The bill provided numerous benefits to returning soldiers—the cost of college or vocational tuition, money to start a small business, unemployment benefits as they looked for work, and, most of all, home loans at very favorable rates, financed and administered by the Federal Housing Administration (FHA) and the newly created Veterans Administration. Racial exclusion was not written into the law, so officially the bill’s benefits were available to Black veterans. Instead, according to Richard Rothstein, author of the searing book The Color of Law, which documents the ways in which federal policy underwrote segregation, FHA regulations promulgated after the legislation was passed stated that “incompatible racial groups should not be permitted to live in the same communities.” In effect, Blacks couldn’t get home loans under the bill. They couldn’t get home loans generally to move to the new suburbs, which were virtually all built with racial covenants. This, it has to be said, was chiefly the work of liberalism, not conservatism. The FHA subsidized the explosion of the American suburbs after the war—literally millions of homes that African Americans were simply not allowed to buy.

Blacks were excluded from many jobs as well. The story here is sometimes a little more complicated than just a simple tale of intransigent racism. But only a little. Here, we see leading American corporate CEOs who in some cases held fairly progressive personal beliefs on race but did next to nothing to integrate their workforces. Robert Woodruff, the head of Coca-Cola in the postwar period, donated money toward educational opportunities for Blacks and raised millions for the United Negro College Fund. Coke advertised to Black consumers in Black newspapers, using figures like Jackie Robinson, Jesse Owens, and the Harlem Globetrotters. But it hired few African Americans, either at the Atlanta headquarters or within its far-flung network of bottlers and distributors. General Electric had promulgated an antidiscrimination policy as early as 1935, but by the 1960s “less than 1 percent of salaried employees, and less than 2.5 percent of GE’s overall workforce, were people of color.” Roughly the same was true of General Motors—where, by the way, the UAW was just as racist as management. The most infamous postwar story of all involved Eastman Kodak in Rochester, New York. For decades, Eastman Kodak was Rochester, and it had a national reputation as a benevolent employer that paid good wages, promoted from within, and took care of its people—as long as they were white. When Black ministers and others pushed the corporation to hire people of color after the passage of the Civil Rights Act (which, officially anyway, barred discrimination in the workplace), Kodak dug in its heels. The famous radical organizer Saul Alinsky eventually showed up in Rochester, quipping that “the only contribution the Eastman Kodak Company has ever made to race relations was the invention of color film.” The future senator Daniel Patrick Moynihan later mediated a settlement between the company and the community that did result in more Black hires.

Most of us tend to think about racism before sexism, because racial discrimination was this country’s original sin, and it often (not always, but often) assumed uglier public forms than sexism, like lynchings and murders. But however sexism was expressed, women were denied untold economic opportunity during the golden age as well. Women, especially white, middle-class mothers, were supposed to stay home and raise the kids. Most women accepted this state of affairs. A Fortune magazine survey from 1946 found that just 22 percent of men—and 29 percent of women—thought women deserved “an equal chance with men” for jobs.

But not all women went along. At the Detroit-area auto plants during the war, women constituted a quarter of the workforce or more. When the war ended, the percentages went back down to nearly prewar levels. Again, as with race, this wasn’t just on management. The UAW went along with this for the most part, too. For example, one hundred women worked at Ford’s Highland Park plant before the war; during the war, that number spiked to fifty-eight hundred; by November 1945, when the plant had reconverted to the manufacture of tractors, the number of women workers was back down to three hundred. Women workers organized and fought back. At the UAW’s March 1946 convention, they forced a floor vote on a resolution calling on the union not to collude with management in upholding sexual discrimination. It actually passed, but it didn’t change anything in practice.

On the issue of wages in particular, “equal pay for equal work” had been a rallying cry of Susan B. Anthony and Elizabeth Cady Stanton in the 1860s. Very little progress was made, though, for decades. In 1944, while all those women were working in factories, the upstate New York congresswoman Winifred Stanley introduced an act calling for equal pay. It never even made it out of committee. It took until June 1963—four months after the publication of Betty Friedan’s Feminine Mystique—for Congress to pass an equal pay act, as part of President Kennedy’s New Frontier program. By then, support was overwhelming: the bill passed the House by 362–9. The nine were southern Democrats, five from Texas and four from Mississippi (yes, the Alabamans all voted for it!); there were in addition 62 members who voted “Present.” This was landmark legislation, although of course the problem persisted and continues to persist: in 2019, women made 82 cents to men’s $1.

New research in recent years has emphasized that with respect to both race and gender, the economic costs of institutional discrimination were borne not just by people of color and women but by the whole society. Just a couple of quick examples: Heather McGhee’s 2021 book, The Sum of Us, provides a brilliant analysis of the economic costs to society of institutional racism, expressed in a small but telling way in her metaphor of swimming pools, revenue-generating public goods, being filled in with cement and shuttered rather than integrated. Similarly, an OECD study from 2016 emphasized the link between gender discrimination and economic growth: if the G20 fulfilled its target of reducing the gender gap in labor force participation by 25 percent by 2025, the study estimated, 100 million women would be brought into labor markets in those countries.

If the United States is to embark on a new golden age, these issues must be squarely addressed. We need to be able to look back at the golden age to see what about it was good, and there was a lot that was good: in terms of opposing economic inequality, accepting labor unions, enforcing progressive taxation, and making broad public investments, our economic values were far better than the values that took hold in the mid-1970s. But a second golden age would have to proceed by undoing the discriminatory legacies of the old age. This is something that a lot of powerful people in this country are against.
MICHAEL TOMASKY was appointed top editor of The New Republic in March 2021. He is also editor of Democracy: A Journal of Ideas, a contributing opinion writer for The New York Times, and a regular contributor to The New York Review of Books. He is the author of four books: Left for Dead (1996), Hillary’s Turn (2001), Bill Clinton (2017), and If We Can Keep It (2019).

About

Political journalist Michael Tomasky tracks an exciting change among  progressive economists who are overturning decades of conservative dogma and offering an alternative version of capitalism that can serve broadly shared prosperity to all.

In the first half of the twentieth century the Keynesian brand of economics, which saw government spending as a necessary spur to economic growth, prevailed. Then in the 1970s, conservatives fought back. Once they got people to believe a few simple ideas instead—that only the free market could produce growth, that taxes and regulation stifle growth—the battle was won. The era of conservative dogma, often called neoliberal economics, had begun. It ushered in increasing inequality, a shrinking middle class, and declining public investment. For fifty years, liberals have not been able to make a dent in it. Until now. 
 
In The Middle Out, journalist Michael Tomasky narrates this history and reports on the work of today's progressive economists, who are using mountains of historical evidence to contradict neoliberal claims. Their research reveals conservative dogma to be unfounded and shows how concentrated wealth has been built on the exploitation of women, minorities, and the politically powerless. Middle-out economics, in contrast, is the belief that prosperity comes from a thriving middle class, and therefore government plays a role in supporting families and communities. This version of capitalism—more just, more equal, and in which prosperity is shared—could be the American future. 

“Tomasky has written an engrossing history of ideas. It’s an incisive look at neoliberalism’s trajectory and the rise of a new intellectual model for truly shared prosperity. The Middle Out is critical to understanding our current political economy.” —Heather McGhee, author of The Sum of Us: What Racism Costs Everyone and How We Can Prosper Together

“One of America’s great liberal journalists tackles the great liberal problem of our age — the economic defenestration of democracy. Essential reading for those seeking to understand how we got into this mess, and how to get out.” —Zachary D. Carter, author of The Price of Peace: Money, Democracy, and the Life of John Maynard Keynes

“It would not be enough to say that the articulate Michael Tomasky targets almost all the economic challenges of our time in his extraordinary new book, The Middle Out. He writes highly accessibly, makes economics a fascinating story, and has a powerful understanding of how economies work.  The American economy has not been run on the basis of well-founded ideas in the last forty years, he makes clear, but on the basis of narrow and self-centered interests. He provides pathways to return to an age of the kinds of values that once made it possible for a compassionately and intelligently run economy to fulfill the lives of all Americans.” —Jeffrey Madrick, author of Invisible Americans: The Tragic Cost of Child Poverty

Excerpt

Chapter One
The Golden Age


Emerging from the War

This book will end by looking forward, but it will open by looking back. Specifically, we will look back to the era after World War II and examine how economic policy making changed over that time so that we have a fuller understanding of how we got to this point, when (most of) the Democratic Party has embraced a much more economically populist agenda. What made that happen? This book offers four different answers to this question, all to be discussed in coming chapters, and each answer zooming the historical lens out a bit more broadly than before. In the short term, the main answer is the pandemic—­a crisis that created the conditions that made a greater level of government intervention in the economy possible. In the slightly longer short term, say the last fifteen or so years, it’s changes in economic thinking and in political activism and in the often-­overlooked but important foundation world that opened up the space for ideas that challenged the reigning conventional wisdom. In the longer term still, I examine the economic ideas and principles that were incubated in the 1940s and that came to dominate policy making in the 1970s, because all that new thinking and activism of the last fifteen years represented a reaction to them. And finally, my fourth answer to the above question lies in the fact that those ideas, the ones that came to prominence in the 1970s, were themselves a reaction to the reigning ideas of the earlier era, which takes us back to the end of World War II. We could do this forever, I suppose, but and there’s a good argument for going back to the New Deal, but I choose the late 1940s because that was the time when our modern world and economy were, in many important senses, born. Before 1945, as the labor historian Joshua B. Freeman has written, the United States “was much more a conglomeration of regions with distinctive forms of economic activity, politics, and culture” than it would become.

So this returns us to what is often called the golden age of capitalism—­that postwar period when the middle class exploded, wages grew steadily, American homes added comforts that would have been inconceivable in the Depression era, and everything just seemed to work. As we’ll see, everything didn’t work for everyone. Black people, other people of color (who didn’t exist in the United States in large numbers yet), and women were largely cut out of the action, a historical error that any new golden age must take pains to correct. And even beyond that, there were problems aplenty, as there always are: intense labor strife, deep ideological divides over communism and anticommunism fueled by demagogues like Joe McCarthy, violent battles for civil rights, bitter division over the war in Vietnam.

But economically speaking, it was a comparative golden age. And this is the first important point that needs to be made today about this period. It was an age of shared prosperity compared with today, and it was such because our economic values were better. Coming out of depression and war—­two experiences shared by the whole population that demanded sacrifice and made people think about not just their own well-­being but that of their neighbors—­the United States of the 1940s and 1950s had morally superior economic values to the United States of, say, the 1980s.

But let’s put morality to the side for now and establish that the period was materially better. Robert Skidelsky, the economic historian and biographer of John Maynard Keynes, wrote in his short post–­Great Meltdown volume, Keynes: The Return of the Master, of the two broad postwar periods that he refers to as the Bretton Woods era and the period of the so-­called Washington Consensus. Bretton Woods refers to the Bretton Woods Conference of July 1944, when representatives of the Western democracies met at a resort in New Hampshire to set up a system of global economic cooperation to take hold once the war was won. Bretton Woods established the International Monetary Fund and provided for a system of fixed exchange rates; in addition, the Bretton Woods system encouraged government intervention in the economy to promote full employment as a chief goal. Thus, the Bretton Woods system was, Skidelsky writes, broadly Keynesian. The “Washington Consensus” was a phrase coined by the British economist John Williamson in 1989, referring originally to a set of policy ideas that Washington sought to impose on developing economies. These ideas included lower tax rates, deregulation, privatization, and kindred notions; given that, the term came to refer to neoliberal market fundamentalism more generally. The Bretton Woods system lasted, according to Skidelsky, from 1951 to 1973, when the first OPEC oil crisis caused a number of economic shocks. After a transition period, the Washington Consensus era ran from 1980 to the time of Skidelsky’s writing (the book was published in 2009).

Skidelsky compared global and Western economic performance during the two periods using a variety of metrics. He begins with real global GDP growth, which ran at 4.8 percent during the Bretton Woods period, and 3.2 percent during the Washington Consensus period (and remember, the neoliberals say that it’s all about growth!). He also notes the existence of five global recessions since 1980, and none during the Bretton Woods era (and one large one in that mid-­1970s interregnum). Likewise, global unemployment grew in the Washington Consensus period; Skidelsky tracks the unemployment rate in the five major industrial democracies, and all rose during the latter period (most dramatically in the U.K. and Germany; in the United States, the increase was from 4.8 percent to 6.1 percent, which isn’t as sharp as the U.K.’s 1.6 percent to 7.4 percent but still means a few million more people out of work). Inequality was, not surprisingly, “stable in the Bretton Woods age, but it rose sharply in the Washington Consensus years from 1982 and all the way into the new millennium.” World growth volatility—­changes in the growth rate of real GDP over time—­was characterized by large spikes in the mid-­1970s and early 1980s, but overall was about the same in the two periods. Only on inflation did the Washington Consensus period produce better results, but the numbers weren’t as different as many might expect—­3.9 percent in the Bretton Woods period to 3.2 percent in the Washington Consensus period. Likewise, four economists taking a longer historical view found that the average annual growth rate for the advanced democracies in the 1950 to 1973 period was 4.9 percent, compared with 2.2 percent for 1820 to 1870, 2.5 for 1870 to 1913, 1.9 percent for 1913 to 1950, and 2.5 percent for 1973 to 1979 (when their study ended). They note also that the growth was broadly shared—­that wages grew steadily on pace with productivity, a key measure indicating that gains weren’t just concentrated at the top as they have been in more recent decades. So the economy really was better for middle-­class families than before or since.

It should be said that none of this growth and prosperity was inevitable. Early during World War II, as it became clear, in the United States and the U.K. in particular, that the Allies were likely to win the war eventually, economic and political minds turned toward the postwar economy. They did so with considerable trepidation. To appreciate this, we have to imagine the wartime economy, barely comprehensible to us today in our age of unquenchable consumerism. This was a time when millions of people were doing without. Meat, sugar, rubber, gasoline, and many other goods were rationed. Governments issued pamphlets to housewives instructing them on how to squeeze everything they could out of the groceries they were allowed to buy. There was a Home Front Pledge that even Eleanor Roosevelt took: “I will pay no more than top legal prices. I will accept no rationed goods without giving up ration points.” As the end of the war approached, some critics observed that the upper classes were growing a bit weary of all this. The historian Isser Woloch wrote that in the summer of 1944, in his regular newspaper column, George Orwell noted that certain cars on the British rail system were once again being designated first-­class, a distinction that had disappeared after the war’s onset, and that some gated private parks in London, “which had been opened to the public when the government requisitioned their iron railings for scrap metal in 1940,” were once again being closed off.

There was deep fear among policy makers in both the United States and the U.K. that the economy would tank when the war ended. The wartime economy put everyone to work, including women, who had previously mostly stayed at home; the transition to a peacetime economy would surely reduce employment and lead to some serious dislocations and a period of labor strife. Inflation fears were rampant, too, because wartime laws on price stabilization across a host of sectors were due to be lifted.

Wartime planners like the New Dealer Chester Bowles, who as the war drew to a close headed the Office of Price Administration (where a young Richard Nixon worked for a time), saw all this coming and wanted to avoid these shocks. But they were also driven by larger, historic motivations in both the United States and Britain: that these nations, the world’s two most durable democracies, must not simply revert to the way things were before; that they were going to emerge from the war, first, in a new position of economic and political supremacy vis-­à-­vis the rest of the world and, second, with something of a duty to deliver on the mighty ideals they had invoked in fighting Hitler and would continue to invoke against Stalin during the Cold War.

This meant, in short, that new degrees of social provision would be necessary and that government would have to drive this activity because the private sector on its own would not. In the U.K., this thinking was most famously made manifest in the Beveridge Report. On June 10, 1941, the War Cabinet commissioned Sir William Beveridge, an economist who had served in a range of government posts, to conduct “a survey of the existing national schemes of social insurance and allied services, including workmen’s compensation and to make recommendations.” In November 1942, he issued his report, which he packaged as an assault on the “five giants” that plagued the lives of the greater proportion of the British population: want, disease, ignorance, squalor, and idleness. The two that you couldn’t get away with today, ignorance and idleness, were seen in the report as reflecting not moral failings of the people but rather the failure of society to properly educate people and provide them with sufficient motivational opportunities. The report’s recommendations were sweeping, especially so in a country that had not embraced government expansion in the 1930s in quite the way Roosevelt’s America had. By 1945, the Labour Party released its manifesto for that year’s campaign, called Let Us Face the Future. While a bit vague on the particulars on matters like housing and health care, the manifesto nevertheless committed the party to expansive measures (“the best health services should be available free for all”).

Something similar was happening in the United States. It was happening without quite the same level of commitment, because the American Democratic Party was more ambivalent about such matters than the British Labour Party, and because the American president was ideologically elusive; yes, despite all that Franklin Roosevelt accomplished and all the change he oversaw, he was in some ways a cautious politician, unwilling to confront his party’s segregationists, backpedaling toward deficit reduction after his 1936 reelection. But he was being pushed on this matter of the postwar economy by his advisers, and none more so than Chester Bowles. Around Thanksgiving 1943, as Roosevelt was preparing to leave for the Tehran Conference with Churchill and Stalin, at which the Western powers committed to opening up a second front against Nazi Germany, Bowles sent Roosevelt a memo. He urged the president to return from Tehran with the message that the soldiers were asking him about what kind of country they would be returning home to—­one still stunted by unemployment and poverty, or one committed to a new dynamism. It had to be the latter. And so Bowles advised FDR to return from Iran and give a speech that might go as follows:

Therefore, I propose a second Bill of Rights in the field of economics: the right to a home of your own, the right of a farmer to a piece of land, the right of the businessman to be free from monopoly competition, the right to a doctor when you’re sick, the right to a peaceful old age with adequate social security, a right to a decent education.

Roosevelt did not exactly respond with enthusiasm. He shared the memo with another aide and asked, “What the hell do I do about this?” But he did come around. The next January, in his State of the Union address, Roosevelt unveiled his proposal for an “Economic Bill of Rights”: the rights to work, food, clothing, and leisure; freedom from monopolies and unfair competition; additional rights to housing, education, medical care, and Social Security. Bowles and other New Dealers were hopeful that Roosevelt would make this the theme of his reelection campaign. He didn’t; he ran largely as a wartime president, and he mentioned the Economic Bill of Rights only one other time, in a speech in Chicago in late October that was broadcast nationally on radio. But it was enough. The Democratic Party was committed now to postwar Keynesianism in a way it had not been before Bowles’s intervention and Roosevelt’s speech.

What Made the Golden Age Golden?

The golden age, it must be said, got off to a rather leaden start. It didn’t help the national morale that as Hitler was locked in his bunker, two and a half weeks shy of putting a bullet in his mouth, Roosevelt died, handing leadership of the nation to a vice president the people (outside Missouri) barely knew. Harry Truman, who today regularly ranks as one of our great presidents, inspired little confidence in April 1945. And as far as the economy was concerned, many of the postwar fears of the New Deal men were quickly realized. Inflation surged. The price of steel rose almost immediately after the war. Likewise, the price of meat—­so strictly rationed during the war—­spiked in 1946 as price controls expired over President Truman’s veto. The controls were reimposed, but the supply of meat disappeared “as the big meatpacking companies held back supplies and cattlemen kept their steers in feed lots awaiting higher prices before they would send them to slaughter.”

The immediate postwar period also saw an enormous amount of labor strife. Organized labor, then very powerful and quite left wing, reined in its militancy during the war, with unions in many sectors accepting wage freezes (that’s how health insurance became attached to employment in the United States; it was a nonwage benefit management could offer workers) and agreeing not to strike (although there were literally thousands of wildcat strikes, sometimes pitting workers against not only bosses but their own leaders). Then, after the war, labor unrest exploded, and 1946 saw more strikes than any year in American history. A mere three days after V-­J Day, the United Auto Workers (UAW) requested a 30 percent pay increase; this was at a time when wages in most sectors were going down, in part because of the oversupply of labor, what with several million men returning home from the war. Electrical workers followed, then oil workers and coal miners and lumber workers and longshoremen and more. All told, in 1946, about 4.6 million workers were involved in strikes. This was in a workforce of around 60 million people.

Over the next couple of years, inflation abated and labor unrest quieted (not entirely voluntarily—the Republicans took over Congress in January 1947, and their signature piece of legislation was the anti-union Taft-Hartley Act, which made it much harder for unions to organize and to strike; Truman vetoed it, but Congress, with more than half of all Democrats joining the Republicans, overrode his veto comfortably). A recession hit in November 1948—well-timed from the perspective of Truman, who'd just pulled off his upset victory over Thomas Dewey. Pent-up wartime consumer demand had finally been sated; department store sales crashed by 22 percent. Unemployment began to rise because, with all those returning veterans, there were far more job seekers than jobs. The downturn wasn't severe, but it lasted nearly a full year. Everything was sorted out by 1950, though, and it was off to the races. From 1950 to 1973, growth in gross domestic product averaged 4.06 percent a year. For a little perspective on that, we've had just two years of growth above 4 percent in the entire twenty-first century—its first year, 2000, when GDP grew 4.1 percent, and 2021, when it grew 5.7 percent.

What made the golden age golden? Economists point to such factors as "an unprecedented growth rate of labor productivity along with a similarly high rate of capital accumulation." Conservatives would put it all down to a newly dynamic private sector—innovators like Abraham Levitt and his sons William and Alfred, who built the American suburbs, and the engineers at GE, who brought more good things to life in postwar America than dazzled consumers could have imagined. The explosion in international trade was important; the World Trade Center that ceased to exist on September 11, 2001, opened in the early 1970s, but it was being planned by the Port of New York Authority (as it was then known) as far back as 1946. Increased military spending to fight the Cold War and the Korean War—and to sustain this new national-security state on a permanent basis—was a huge factor. Indeed, "military Keynesianism," a concept that first came into vogue in the United States during World War II, was firmly established as national policy by the time of Truman's National Security Council directive NSC-68, which advocated a massive expansion of the U.S. military budget. By 1953, writes the historian Jonathan Levy, "total federal government expenditures accounted for nearly a quarter of GDP—nearly two-thirds of which was military spending."

Levy posits that the golden age was harmonized by a "fiscal triangle" consisting of the federal government, the private for-profit sector, and the nonprofit sector. The private sector, of course, drove the economy and was still dominated at that point by the production of actual goods (as opposed to services; the United States transitioned to being a service economy from the late 1960s through the 1980s). The nonprofit sector was growing by leaps and bounds. Nonprofit philanthropy had received favorable tax treatment for decades, but the big moment came in 1954, when Congress passed the law establishing tax-exempt 501(c)(3) organizations. The government was also growing and becoming more professionalized. Truman signed the Employment Act of 1946, creating the Council of Economic Advisers to give presidents objective and professional economic advice (although the economists often lose arguments to the political people). The second CEA chair was Leon Keyserling, a liberal Keynesian who had studied at Columbia under FDR's adviser Rexford Tugwell. No one called the government the enemy in those days. The government, the private sector, and the nonprofit sector worked together, wrote Levy, to become "the dominant political-economic coordinating mechanism of Cold War liberalism. Its task was the production, distribution, and redistribution of industrial income."

That last item, redistribution, is crucial. Redistribution on a vast scale was accepted more or less across the political spectrum, except by a few cranks—some Texas oilmen, a dean at Notre Dame named Clarence Manion, assorted capitalists like Walter Knott of Knott's Berry Farm fame—who had no political power at the time. The top marginal tax rate was 94 percent during the war; in the postwar years it settled at 91 percent. "Marginal rate" means that rich people paid these rates only on dollars earned above $200,000, an amount of money that almost no one not named Rockefeller made. (A small sign of how our values have changed: in 2021, that $200,000 would equate to about $2.8 million, which is made by roughly 300 Major League Baseball players, as well as something approaching 5.5 million other people.) Taxation at lower levels of income was still quite high, however. On dollars earned above $90,000 ($1.2 million today), filers paid 87 percent. On dollars earned above $50,000 ($697,000), they paid 75 percent. On dollars earned above a relatively modest $16,000 ($223,000), 50 percent of each of those dollars went to Uncle Sam. Even the lowest rate then, 20 percent, was twice today's 10 percent. And compared with today's seven tax brackets, there were a staggering twenty-four in 1950. At all levels, if you made more, you paid more.

Of course, then as now, the tax code was festooned with deductions and ways to avoid paying the posted rate. Still, those are rates that would be impossible to replicate today; antitax forces would all but declare war. Rates that high would also be bad policy today, in a world where capital is so mobile. (If they made me emperor, I'd probably impose a top marginal rate of about 55 or 60 percent on some quite-high dollar figure, like $5 million.) But those tax rates did one salutary thing: they brought in a ton of revenue. Despite what the supply-siders say—and we'll dig into this in more detail in a later chapter—the history on this point is as straightforward as it can be: higher tax rates produce more revenue. There was money for the government to spend, and it spent it. Yes, on building up the military, but also on things like rebuilding Europe and Japan: the Marshall Plan cost around $15 billion, or $180 billion in today's dollars. The Federal-Aid Highway Act of 1956, which built the Interstate Highway System (justified at the time on a military-emergency rationale), cost more than $100 billion over ten years, the equivalent of about $1 trillion in today's dollars.

Corporate tax rates were higher, too. The statutory corporate rate in the 1950s was around 50 percent; the effective rate jumped around but was typically in the 30 to 40 percent range. The maximization of profit that became holy writ in corporate America later in the century wasn't part of the equation then. Shareholders in publicly held companies had a right to expect a reasonable return on their investment, but only that—a reasonable return. Such was the ethos of the time. A business group, the Committee for Economic Development (CED), had been started during the war by executives from the Studebaker auto company and Eastman Kodak to plan reconversion to a peacetime economy. The CED got corporate America behind the Marshall Plan, and it enlisted private-sector CEOs and university presidents to articulate the social role of business. An economist named Howard Bowen published a book in 1953 called Social Responsibilities of the Businessman. Interestingly, it was one of a series of books on Christian ethics produced by the Federal Council of the Churches of Christ in America. This paragraph sums up the book's, and to some extent the era's, ethos:

The unrivaled freedom of economic decision-making for millions of private businessmen, which characterizes our free enterprise system, can be justified not if it is good merely for the owners and managers of enterprises, but only if it is good for our entire society. We can support freedom and private control of enterprise only if it is conducive to the general welfare by advancing progress, promoting a high standard of living, contributing to economic justice, etc. We judge its success or failure in terms of the public interest. When we consider proposals for its modification, we do so with the public interest in mind. Business, like government, is basically "of the people, by the people, and for the people."

And so in 1950 General Motors and the UAW's leader, Walter Reuther, concluded the Treaty of Detroit, in which management for the first time accepted the union as its legitimate bargaining partner. Health care and pensions became a permanent part of labor contracts. In that same year, Truman signed into law amendments to Social Security that dramatically expanded the program. Yes, the amendments raised the level of taxation, to 6.1 percent of payroll (it's 6.2 percent today, for both employer and employee), but they increased average benefits from $26 a month to $46 a month, added ten million new workers to the rolls, and greatly increased benefits for widows and orphans. And of course Medicare and Medicaid were created in 1965 (technically, these too were amendments to Social Security).

This little section barely scratches the surface of a remarkable time. There was still a lot wrong with that America, as the next section will discuss. And even as the country committed to these liberal public investments, a new conservative movement was taking shape that would change the country entirely by 1980. But in sum, the golden age of American capitalism was golden because that America had better economic values. We invested in ourselves. We cared about inequality. We did not venerate extreme wealth. We taxed the well-off to improve the lives of working people. In other words, I'd argue, the America of that era practiced middle-out economics. And working people saw their lives improve in dramatic ways.

It Wasn’t Golden for Everyone

It is important to note, though, that this golden age had deep defects, defects we've only just begun to reckon with. Our political culture—and yes, this was true of midcentury liberalism, not just conservatism—was built around the presumption, widely understood, that the chief beneficiaries of all this public generosity would be white men.

In fact, if the postwar era was a time of rare consensus on behalf of public expenditure, that was probably because of this presumption. This is the argument of some recent and important works of history, notably Jefferson Cowie's The Great Exception. Cowie's main thesis is that those of us who came of age during the Bretton Woods era thought of it as "normal" and the Reagan counterrevolution as aberrational; he argues that this has it backward—that the New Deal and the postwar period of vast public investment were the actual outliers in American history. But another point he makes is about race: the broad social cohesion that characterized the United States during the Bretton Woods period happened because most people were white. The United States in 1950 was roughly 88 percent white, 10 percent Black, and 2 percent other. To Cowie, "a sort of 'Caucasian' unity took place among a historically divided working class." As we know well, many, many white voters resent government doing things perceived as benefiting people of color. It's hard to avoid the grim conclusion that America was okay with redistribution provided the goods were going to white folks.

That Blacks were effectively excluded from most of the postwar bounty is well known. The GI Bill, officially the Servicemen's Readjustment Act of 1944, may be the most often cited example. But what many people don't understand is that the bill's discriminatory effects were smuggled in through a side door, very slyly constructed. The bill provided numerous benefits to returning soldiers—the cost of college or vocational tuition, money to start a small business, unemployment benefits as they looked for work, and, most of all, home loans at very favorable rates, financed and administered by the Federal Housing Administration (FHA) and the newly created Veterans Administration. Racial exclusion was not written into the law, so officially the bill's benefits were available to Black veterans. But according to Richard Rothstein, author of the searing book The Color of Law, which documents the ways in which federal policy underwrote segregation, FHA regulations promulgated after the legislation was passed stated that "incompatible racial groups should not be permitted to live in the same communities." In effect, Blacks couldn't get home loans under the bill. Indeed, they couldn't get home loans generally to move to the new suburbs, which were virtually all built with racial covenants. This, it has to be said, was chiefly the work of liberalism, not conservatism. The FHA subsidized the explosion of the American suburbs after the war—literally millions of homes that African Americans were simply not allowed to buy.

Blacks were excluded from many jobs as well. The story here is sometimes a little more complicated than a simple tale of intransigent racism. But only a little. Here, we see leading American corporate CEOs who in some cases held fairly progressive personal beliefs on race but did next to nothing to integrate their workforces. Robert Woodruff, the head of Coca-Cola in the postwar period, donated money toward educational opportunities for Blacks and raised millions for the United Negro College Fund. Coke advertised to Black consumers in Black newspapers, using figures like Jackie Robinson, Jesse Owens, and the Harlem Globetrotters. But it hired few African Americans, either at the Atlanta headquarters or within its far-flung network of bottlers and distributors. General Electric had promulgated an antidiscrimination policy as early as 1935, but by the 1960s "less than 1 percent of salaried employees, and less than 2.5 percent of GE's overall workforce, were people of color." Roughly the same was true of General Motors—where, by the way, the UAW was just as racist as management. The most infamous postwar story of all involved Eastman Kodak in Rochester, New York. For decades, Eastman Kodak was Rochester, and it had a national reputation as a benevolent employer that paid good wages, promoted from within, and took care of its people—as long as they were white. When Black ministers and others pushed the corporation to hire people of color after the passage of the Civil Rights Act of 1964 (which, officially anyway, barred discrimination in the workplace), Kodak dug in its heels. The famous radical organizer Saul Alinsky eventually showed up in Rochester, quipping that "the only contribution the Eastman Kodak Company has ever made to race relations was the invention of color film." The future senator Daniel Patrick Moynihan later mediated a settlement between the company and the community that did result in more Black hires.

Most of us tend to think about racism before sexism, because racial discrimination was this country's original sin, and it often (not always, but often) assumed uglier public forms than sexism, like lynchings and murders. But however sexism was expressed, women were denied untold economic opportunity during the golden age as well. Women, especially white, middle-class mothers, were supposed to stay home and raise the kids. Most women accepted this state of affairs. A Fortune magazine survey from 1946 found that just 22 percent of men—and 29 percent of women—thought women deserved "an equal chance with men" for jobs.

But not all women went along. At the Detroit-area auto plants during the war, women constituted a quarter of the workforce or more. When the war ended, the percentages fell back to nearly prewar levels. Again, as with race, this wasn't just on management; the UAW went along with it for the most part, too. For example, one hundred women worked at Ford's Highland Park plant before the war; during the war, that number spiked to fifty-eight hundred; by November 1945, when the plant had reconverted to the manufacture of tractors, the number of women workers was back down to three hundred. Women workers organized and fought back. At the UAW's March 1946 convention, they forced a floor vote on a resolution calling on the union not to collude with management in upholding sex discrimination. It actually passed, but it didn't change anything in practice.

On the issue of wages in particular, "equal pay for equal work" had been a rallying cry of Susan B. Anthony and Elizabeth Cady Stanton as far back as the 1860s, but for decades very little progress was made. In 1944, while all those women were working in factories, the upstate New York congresswoman Winifred Stanley introduced a bill calling for equal pay. It never even made it out of committee. It took until June 1963—four months after the publication of Betty Friedan's Feminine Mystique—for Congress to pass an equal pay act, as part of President Kennedy's New Frontier program. By then the politics had shifted: it passed the House overwhelmingly, 362 to 9. The nine were southern Democrats, five from Texas and four from Mississippi (yes, the Alabamans all voted for it!); another 62 members voted "Present." This was landmark legislation, although of course the problem persisted and persists still: in 2019, women made 82 cents to men's dollar.

New research in recent years has emphasized that, with respect to both race and gender, the economic costs of institutional discrimination were borne not just by people of color and women but by the whole society. A couple of quick examples: Heather McGhee's 2021 book, The Sum of Us, provides a brilliant analysis of the economic costs of institutional racism, captured in a small but telling way by her story of public swimming pools (revenue-generating public goods) filled in with cement and shuttered rather than integrated. Similarly, a 2016 OECD study emphasized the link between gender discrimination and economic growth: if the G20 countries fulfilled their target of reducing the gender gap in labor force participation by 25 percent by 2025, 100 million women would be brought into their labor markets.

If the United States is to embark on a new golden age, these issues must be squarely addressed. We need to be able to look back at the golden age and see what about it was good, and there was a lot that was good: in opposing economic inequality, accepting labor unions, enforcing progressive taxation, and making broad public investments, our economic values were far better than those that took hold in the mid-1970s. But a second golden age would have to proceed by undoing the discriminatory legacies of the first. This is something that a lot of powerful people in this country are against.

Author

MICHAEL TOMASKY was appointed top editor of The New Republic in March 2021. He is also editor of Democracy: A Journal of Ideas, a contributing opinion writer for The New York Times, and a regular contributor to The New York Review of Books. He is the author of four books: Left for Dead (1996), Hillary's Turn (2001), Bill Clinton (2017), and If We Can Keep It (2019).