In “The Economic Weapon,” Nicholas Mulder illuminates the genesis of diplomatic sanctions from the interwar period of the 1920s and 1930s through the Second World War, as well as the tortuous construction of a liberal world order that gave those sanctions significance. Mulder, an assistant professor of modern European history at Cornell University, argues that the modern conception of sanctions – which seeks to remediate the internal ambitions or conditions of a country (e.g., “to address human rights violations, convince dictatorships to give way to democracy, smother nuclear programs, punish criminals, press for the release of political prisoners, or obtain other concessions”) – has evolved from its original external purpose of halting interstate war and preserving territorial order. Mulder posits that sanctions have become a creeping form of statecraft by other means, enacted by a thicket of detached technocrats and experts, often without due consideration of their unintended consequences. Many of these consequences, such as the concomitant rise of nationalism and autarky in the post-Versailles era as nations reacted with hostility to their exclusion from global commerce, have contributed to what the author calls sanctions’ “history of disappointment.” Sanctions, now an overused instrument of diplomatic and economic coercion, have largely been rendered inefficacious by their limited capacity to exert pressure in the absence of total war.
Mulder traces the history of sanctions, first in their prototypical form as commercial blockades (the “permanent potentiality” that could be used even in peacetime) and then increasingly as a financial instrument for excluding wayward nations from the global trading system. This “new form of liberalism,” he writes, expanded the scope of conflict without necessarily requiring the large-scale deployment of troops or manufacture of armaments. Rather, it could now be mobilized by “a technical and administrative apparatus of lawyers, diplomats, military experts, and economists” from their own desks. For the first time in history, civilian bureaucrats could control matters of international statecraft. This triumph of the Foreign Office over the Admiralty, writes Mulder, became a new, and potentially destabilizing, method of exclusionary warfare. In doing so, sanctionists broached a new form of politics that blurred the distinctions between soldiers and citizens, state and private property, and hostile and neutral nations under the banner of preserving territorial order and spurring democratization.
While economic provocations were used successfully against Yugoslavia in 1921 and Greece in 1925, Mulder exposes their limited reach, as well as how they may have backfired in the run-up to World War II. Sanctions against Mussolini’s Italy in the wake of its invasion of Ethiopia and against Hitler’s Germany during its accelerating territorial conquests on the European continent, as well as resource embargoes against Japan, may have only fueled aggressive territorial expansion and autarky rather than curtailed them. Sanctions were unable, for example, to prevent Italy from following through on its war in Africa, largely because of Italy’s successful national resistance campaign of stockpiling, national saving, and metal collection drives, motivated by Mussolini’s call to bind the country into a “fortress of resistance” against the overbearing Allied powers. Mussolini was able to sublimate this campaign of resistance into a policy of autarchia – an anti-sanctions endeavor to achieve resource independence, which led to adventures to secure iron ore from Spain and oil from Albania. Likewise, Germany’s “defensive autarky” aimed to develop blockade resistance (Blockadefestigkeit) through increased domestic resource production, the solidification of relations with Central and East European countries, and expansion into resource-rich territory, including support for Franco’s Spanish Nationalists in 1936 and the annexations of Austria and Czechoslovakia in 1938 and 1939, respectively. U.S. oil and metal embargoes on Japan likewise only sharpened its ambition for an “autarkic East Asian economic zone based on a yen bloc covering Japan, Manchukuo, Korea, north China, and Taiwan,” an ambition thwarted by sustained conflict with Chiang’s Nationalist forces on the Asian continent.
While describing the history of sanctions as largely one of disappointment, Mulder also calls attention to the “stabilizing power of provision” that was sorely missed under the post-Versailles League of Nations. Such coordination was more successful at building alliances and planting the seeds of a postwar international order that generated peace and productivity. Mulder points to the U.S. Lend-Lease program, which provided weapons and materials to nations withstanding Axis invasion, as “the most significant economic scheme against aggression ever created.” Such positive economic mobilization, he argues, was the salve to the chronic ill that plagued the ineffectual and ill-fated League. Mulder’s assessment of this history is clear: positive aid provision, rather than destabilizing resource deprivation, is most integral to the health of liberal internationalism and the containment of hostile actors.
While Mulder criticizes sanctions applied in the interwar context, his book does not examine the efficacy of more recent examples of sanctioning. Many scholars agree that leveraging sanctions and embargoes against adversaries has been unfruitful (e.g., in the cases of North Korean and Iranian nuclearization, Cuba’s communist rule, Russia’s territorial adventurism and domestic repression, and Venezuela’s totalitarianism), but there are a limited number of cases where sanctions have worked – namely, where the target is dependent on positive relations with the sanctioning power. For example, the threat of sanctions convinced the Dutch to agree to Indonesian independence in 1949, Britain to deescalate the Suez Crisis in 1956, and South Korea and Taiwan to abandon their nuclear programs in 1975 and 1976, respectively. In each of these contexts, sanctions threatened to exclude target countries from vital channels of support and induced them to change course. Sanctions may have limited range, and thus work only in specific circumstances, but their efficacy appears to be largely contingent on the relationship between the applier and the target.
Despite these caveats, Mulder does a superb job illuminating how sanctions can fray the fabric of international stability rather than actively work to preserve it.
The Manassas Journal Messenger had more than a century’s worth of a storied reporting record. Established at the end of the Civil War, the Messenger was one of a rich field of competing local papers covering Prince William County on the outskirts of the Washington, D.C. metropolitan area. It had survived Reconstruction, economic downturns, and multiple world wars. But in late 2012, the 10,000-circulation daily suddenly stopped its presses and shuttered its doors after 143 years of operation. The Messenger had been struggling to hold a stable readership and the advertising revenue that comes with it, especially in a market saturated by the Washington Post, which drew a much larger audience. Berkshire Hathaway terminated the paper just six months after purchasing the Messenger’s parent company, Media General, in 2012, citing its inability to generate profits. The closing took with it 33 jobs at the paper and an additional 72 corporate positions at the parent group.
While unfortunate, the Messenger’s fate is the story of thousands of other local news publications across the country. The traditional business model of print media is rapidly succumbing to the emergence of alternative news platforms that splinter pre-existing bases of readership while draining the ad revenue that comes with them. This has forced publishers to make hard choices such as ending print editions, laying off workers, or asking for donations—all to stay afloat in an increasingly decaying print business. As local newspapers die off, so too does their vital reporting on issues that affect local communities, including school board and city council meetings, political and community events, government malfeasance, and public health crises. For many, the local newspaper was the vanguard of community reporting, the embodiment of a “good neighbor” watchdog that kept others honest and accountable for their actions.
This shifting news landscape has left as many as 65 million Americans with either one local paper in their county or none at all. Once-dedicated local readers have fretted over the erosion of the stories that filled their pages. “After years without a strong local voice, our community does not know itself and has no idea of how important local issues are or how the area is changing and challenged by the growth and impact of climate change,” said Mount Dora, Florida, resident David Cohea after the weekly Mount Dora Topic closed in 2006. “We are a nameless and faceless town defined only by neighborhoods.” Those from within the industry are lamenting the decline as well. Stephen Kaye, former editor and publisher of the New York-based Millbrook Independent, noted the disappearance of local news coverage. “We were a check on governments, on endless environmental and zoning hearings, on budgets that we often published in detail, on misdoings and good doings,” he said. “There is now a void. No one took up the slack.”
For others at still-extant papers, their choice words reflect the almost inevitable reality that their outlets will be next. With dramatically curtailed staffs, the thoroughness and originality of their reporting suffers, providing only minimal oversight of the figures on whom they report. “[T]here is little oversight of local government and local business,” said Stan Freeman of the Massachusetts-based Daily Hampshire Gazette and The Republican. “The checks and balances afforded by this don’t exist, and it is only a matter of time before the potentially corrupt realize they will be able to get away with corruption more easily.” Says Idaho Statesman reader Penny Beach: “Our local newspaper, the Idaho Statesman, hasn’t closed, but it might as well have. There is nothing of value in it, unless you want to read about Boise State football. I worked as a local newspaper reporter for six years, so I know the stories that are missing: government meetings, politics, court stories, cultural events, stories about new businesses and restaurants.”
Others recount the shift in strategy that occurs when big players buy out local papers and divert their focus from local events to national ones. “National and state news has improved, but local coverage has suffered,” said Bruce Higgins after his local paper, the San Diego Union-Tribune, was bought out by the owner of the Los Angeles Times. “We regularly have a section of the paper called ‘California’ that is filled with stories from Los Angeles. Most of us here don’t live in Los Angeles for a reason, and don’t care about what is happening there.” After Berkshire Hathaway swooped in and bought North Carolina’s Greensboro News & Record, reader Sandi Campbell saw the new paper as “a mere shadow of its former self,” barely covering topics in surrounding counties. “It is harder to even find a copy in public places like stores.”
Each of these stories could be retold by the countless communities witnessing the decline of local print media, outlets that often served as the primary source of reporting on their communities. Now, these once-treasured guardians of local news have been left to wither under a shifting media landscape, with its ample outlets for specialized news consumption, its dog-eat-dog competition for advertising dollars, and the alluring access, timeliness, and efficiency of the digital age.
The Irreplaceable Ideals of Local Journalism
For decades, local newspapers have been the front lines of original reporting on community affairs. Their place in the American diet of news consumption has been critical, informing readers about events at city halls, in school board meetings, and on the campaign trail in local political races. Their meticulous local coverage, often prompted by concerns within the community and pursued with vigilant heed to holding others accountable, has been the “good neighbor” watchdog many have relied on to expose injustices and enforce communal norms of good behavior.
According to Rasmus Kleis Nielsen, Director of Research at the Reuters Institute for the Study of Journalism, University of Oxford and author of Local Journalism: The Decline of Newspapers and the Rise of Digital Media, local journalists provide at least three unique services to their communities through their work: (1) they hold local officials accountable by publicizing their behavior to the wider public, (2) they inspire civic and political participation, and (3) they bind disparate groups into the social fabric of the local community. The content local journalists produce is thus vital both to the civic-mindedness of individuals and to the connection they have to their communities.
Many of the stories local newspapers produce are original investigative pieces of the kind Nielsen first describes. According to researchers at Harvard’s Nieman Lab, a program that charts journalism’s course in the digital age, local newspapers pull more than their own weight in original reporting. They found that while local papers made up only one quarter of local media outlets in their sample, they contributed about one half of all original news stories. “Essentially, local newspapers produced more of the local reporting in the communities we studied than television, radio, and online-only outlets combined,” they wrote. The Nieman researchers also found that local papers produced more than one third (38 percent) of stories that addressed a “critical information need,” a share that rose to 60 percent among original, local stories. All of these statistics suggest that local papers provide disproportionate value to their communities by originating critical, timely, and original journalism.
A robust ecosystem of local newspapers also has other effects on civic engagement, as Nielsen notes. Research indicates that the loss of local journalism makes people on both sides of the political aisle less likely to vote, since they receive a lower volume of news coverage and are thus less able to evaluate political candidates effectively. The loss of local papers also depresses voter turnout over time, while strengthening the position of incumbents, since fewer candidates decide to run for municipal office. Overall, civic participation falls when citizens become both less informed about circumstances in their local communities and less motivated to seek political office to ameliorate those circumstances.
Another reason a vibrant local news community is increasingly important to the body politic is its dampening effect on political polarization. As local journalism cedes ground to national outlets, viewers and listeners are more likely to become polarized. This “nationalization” of news, as researchers have called it, leads to political polarization and more zero-sum conflict between parties, making government work less efficiently at both the local and national levels. This has been documented in the literature: voters were about two percent more likely to vote for the same political party for president and senator after their local newspaper closed compared to voters with an extant local paper.
Pundit Ezra Klein considers higher attentiveness to local media a potential antidote to the nation’s growing political divide in his new book Why We’re Polarized. He identifies several aspects of local news coverage that counteract incentives for political controversy. Local communities, Klein says, are by nature more homogeneous and less likely to engender polarized politics. The questions local communities have to grapple with are also more tangible and less symbolic, so the discussion tends to be more productive and less hostile. Klein also notes that local politics are more malleable and likely to have a concrete and discernible impact on local conditions, thus empowering people to make a difference. Local news not only informs; it inspires individuals to engage with their communities and identify with their neighbors, rather than polarizing them into reflexive distrust of the other side.
The services local journalism provides are needed now more than ever. “A vibrant, responsive democracy requires enlightened citizens, and without forceful local reporting they are kept in the dark,” says a report by PEN America. “At a time when political polarization is increasing and fraudulent news is spreading, a shared fact-based discourse on the issues that most directly affect us is more essential and more elusive than ever.” The number of local papers is dwindling, but their value in providing important facts at the most vital times has not diminished. This, as I will later explore, is especially important in times of unprecedented crisis such as the COVID-19 pandemic, when misinformation is rampant and fear and distress are palpable.
The Fraying Model of the Hometown’s ‘Printed Diary’
In times of declining trust in media generally—with just 32 percent of Americans trusting the mass media and an equal percentage considering the news media to be the “enemy of the people”—local news is viewed far more favorably. Seventy-three percent of Americans report trusting their local newspapers—well above their national counterparts. Sixty-three percent of adults consider local journalists to be in touch with their communities. Yet while Americans say they value their local news outlets, far fewer understand the grave conditions under which those outlets are operating. According to Pew Research, 71 percent of Americans think their local papers are doing well financially, but only 14 percent say they have paid for or donated money to a local outlet in the past year. “They don’t realize that their local news outlet is under threat,” said co-author of the report Viktorya Vilk, manager of special projects for PEN America.
When Pew asked respondents to explain why they had not contributed to local news sources in the past year, the most common response (49 percent of those answering in the negative) was the widespread availability of free content elsewhere. These preferences reflect the decline of print media over the last few decades. The estimated total U.S. weekday newspaper circulation in 2018 (28.6 million) was half of its total from just two decades earlier, and those numbers continue to fall at a brisk pace. Weekday print circulation decreased 12 percent and Sunday print circulation decreased 13 percent between 2017 and 2018 alone.
Declining circulation is also attributable to the widespread decay of local news outlets. Since 2004, more than one in four papers—more than 2,000 in total—have disappeared through mergers or shutdowns. This shuttering has created at least 200 so-called “news desert” counties with no local paper at all. Many more papers have been deprived of most of their once-solid revenue streams, reducing them to mere “ghost papers” that do minimal original reporting. Collectively, as many as 65 million Americans live in counties with either one local paper or none.
A dwindling readership also reduces advertising demand, leading papers to buckle without a stable and predictable flow of revenue. Over the last 15 years, newspapers collectively lost more than $35 billion in ad revenue, a trend which has only accelerated as of late. Between 2008 and 2018, local paper revenue fell 68 percent, leading to mass layoffs as papers were subjected to the cost-cutting measures of their owners or shut down altogether. Almost half (47 percent) of local newspaper staff have been cut over the last 15 years, and total newsroom employment is now where it was in the 1970s.
The dynamics of a splintered, digitally inclined readership with an increasing amount of free content at its disposal, the competitive pressures of a ballooning media ecosystem fighting over limited ad revenue, and the increasing allure of salient national political issues have all contributed to the steady decline of a once-thriving local news industry. While the newspaper business still generates more revenue collectively than other outlets, that revenue is spread across roughly 1,400 daily papers, meaning the average paper receives $28 million in annual revenue. Yet the growing consolidation of the news industry means that the bigger players with larger readerships are capturing more of this revenue pot, and the few remaining readers of local papers are not enough to attract the ad revenue necessary to sustain pre-existing print operations. As a result, local outlets die off and are supplanted by large national outlets with strong digital operations and a predominantly national news focus.
Outlets such as the New York Times, Washington Post, and USA Today, on the other hand, are able to court a high-margin digital subscriber base by offering original content such as video analysis and newsletters. As Pew notes in its annual “State of the News Media” report, the new media landscape is notably national in its orientation and digital in its presence, with “new forms of storytelling—from video to crowdsourcing to new documentary styles—and new ways to connect with audiences, often younger ones.” Content creators have responded by experimenting with new storytelling devices and data visualizations. This has been a harder lift for local outlets struggling to adapt to the digital age; fewer than half offer either video content or newsletters, and one in ten have no Internet presence at all.
Even new ventures in local news have not fared well. “Legacy” journalists emerging from the rubble of the local newspaper business have attempted to fill the gap by launching local and “hyperlocal” news sites around the country, but many of these ventures, launched by their creators out of reverence for a bygone era of local reporting, have foundered on their finances. “I don’t think it’s realistic to expect a ton of profit on this,” said Jan Schaffer, executive director of American University’s now-shuttered J-Lab, a research and funding organization for media entrepreneurs. “It’s always going to be a low-margin business.” Even the big outlets have been unable to profit from these local ventures; newspapers such as the Washington Post, Politico, and the New York Times have abandoned their hyperlocal pilot programs entirely. Only six of the 44 for-profit independent hyperlocal news websites surveyed by community news expert Michele McLellan in 2012 managed to generate more than $250,000 each.
The crosscurrents of the modern media age are forcing local newspapers into obsolescence. The local newspaper, once described by the legendary American newspaper editor Horace Greeley as “the printed diary of the home town,” is vanishing under the shadow of larger national outlets hyper-focused on revamping and reinforcing their digital presence. The local coverage that was once the lifeblood of many communities is rapidly disappearing, taking with it all of the attendant political and civic benefits.
“A Storm of a Disease”: How COVID-19 Underscores Local Journalism’s Value
One of the unique functions of local journalism is its ability to act as “an amplifier and alarm for critical information,” especially in times of local crisis. Take, for example, the Flint, Michigan water crisis, when lead from aging water pipes leached into the water supply and exposed over 100,000 residents to toxic contaminants. Local news outlets like the Flint Journal reported on the story early and continued raising the issue for months. Weeks after the city changed its water source in April 2014, the Journal reported on resident complaints about the color, odor, and taste of their tap water. These reports were also amplified by local TV and radio stations.
By August, the Journal’s reporters were breaking stories about water samples testing positive for contaminants, leading the city to issue a public health notice advising residents to boil their water. The next month, the Journal’s Ron Fonger reported that General Motors would no longer use the Flint River water supply at its engine plant out of fear of corroding engine parts. Over the next several months, Flint residents continued voicing concerns about the crisis through letters to the editor of the Journal.
It was not until January 2015 that other statewide news outlets, and national outlets soon thereafter, began to cover the water crisis. It took until October 2015—a full seventeen months after the Flint Journal started reporting on the issue—for Governor Rick Snyder to sign a bill funding Flint’s switch back to the Lake Huron water supply. PEN America notes that this incident underscores the importance of local news outlets in breaking urgent local stories, voicing resident concerns, amplifying public health warnings, and covering the public response to the crisis. “Without the work of Flint’s local news media to amplify the concerns of community activists and public health officials, residents may have missed out on necessary information about water contamination and may not have seen their concerns validated,” the group wrote in its report on local journalism. “National media may not have subsequently picked up coverage and added public pressure. And the government may not have been pressured into taking action.” The local media’s role was especially important in building the case against the government officials responsible for the crisis: the Journal discovered that public officials had been aware of risks from the Flint River for months without telling the public.
It is not hard to see the importance of local news and reporting amid the unprecedented COVID-19 pandemic, the largest public health crisis in decades, with over 50,000 casualties at the time of writing. Similar to the role played by the Flint Journal during the city’s water crisis, local papers can play the unique role of broadcasting public health warnings to local residents, reporting and tracing local outbreaks, holding officials accountable, following up on community needs such as protective equipment and testing supplies, and communicating local and state policies to suppress the outbreak. Many local outlets have played this role, breaking stories about viral outbreaks at community nursing homes, for example. And for papers with an online presence, their value is only magnified. Traffic patterns have shown an uptick in visits to local newspaper websites, some by as much as 150 percent, for updates on the virus. Where demand for reliable and timely reporting is high in times of crisis, local outlets can supply informed coverage and fulfill their role as representatives of the public interest.
But, as this paper indicates, the state of local news is in peril. For many outlets, the pandemic is one of the biggest stories they will ever cover, but the challenge of keeping their finances afloat complicates those efforts. Advertisers, reluctant to place their ads next to stories about rising death tolls and hospitalizations, are further draining the depleted pool of ad money from which local papers can draw. Sanford Nowlin, the chief editor of the San Antonio Current, said the virus “was one of those things that hit us out of the blue,” despite the paper’s strong advertising and events businesses. Larger publications such as the New Orleans and Baton Rouge-based Advocate and Times-Picayune have been forced to furlough a fraction of their workforce. A local Charleston, South Carolina paper compared the outbreak to a “storm of a disease,” and a Durham, North Carolina paper described it as an earthquake that could leave the paper “weeks or months deep in the red.”
So, what can be done to redress the decline in local news? Many have called for new commitments to high-quality local journalism supported by philanthropic efforts, new investments in alternative revenue streams, more congressional oversight of equity and access to local news, and more frequent communication about the value of local journalism. The COVID-19 pandemic only underscores the urgency of the action needed to sustain local reporting. More than a dozen senators have called for future stimulus money to include economic relief for local journalism, citing the virus as a threat to the survival of these important news sources. “Local news is in a state of crisis that has only been exacerbated by the COVID-19 pandemic,” the senators wrote in a letter sent to Senate leadership in early April. Some local papers, such as the Tampa Bay Times, have received Small Business Administration loans from the recent fund created for businesses affected by the crisis.
Bailing out local outlets is only a short-term solution. More long-term investments are desperately needed to secure a sustainable place for local journalism, and only a coordinated action plan from all stakeholders will suffice. Protecting this tradition must be a priority: a matter of ensuring that a treasured public good—one of the country’s oldest institutions upholding civic participation and political accountability—does not fade into oblivion.
BOOK REVIEW: THE CASE AGAINST EDUCATION: WHY THE EDUCATION SYSTEM IS A WASTE OF TIME AND MONEY
BY BRYAN CAPLAN
PRINCETON UNIVERSITY PRESS, 416 PP.
In 1973, Harvard economist Michael Spence published his seminal “Job Market Signaling” paper—a publication which later earned him the Nobel Prize—describing the “signals” potential employees send to employers about their qualifications and competence. In the paper, Spence describes how, in the absence of complete information about the quality of a job applicant, employers discern subtle cues about how that potential worker might perform on the job. One measure employers use to gauge applicants is the educational credentials an applicant possesses. Possession of a certified degree, writes Spence, is one way to clearly signal that one is able to handle the responsibilities of the job—even without the applicant providing any other information about his or her fitness for the position.
Spence’s paper was elaborated on by other prominent economists over the subsequent decades. It has also resurfaced in a book that has earned much attention recently. Bryan Caplan, professor of economics at George Mason University, writes in The Case Against Education: Why the Education System Is a Waste of Time and Money that the educational institutions through which millions of students are funneled each year are largely wasteful, counterproductive, and premised on misguided ideals. Caplan envisions a reality in which not everyone aspires to go to college. Instead, he argues that both students and society would be collectively better off if some chose not to pursue higher education. His thesis is radical and perhaps impractical, but it is emblematic of the fresh thinking that is direly needed to reform our hobbled education system.
First, Caplan’s writing style often comes across as abrasive and gratuitously hostile to his intellectual critics. At many points, the book reads like a screed against anyone who dares question his assumptions or empirical findings. His argument, too, has been subjected to criticism, with many labeling his conclusions as elitist or too trusting of the efficiency of the free market. Caplan’s libertarian credentials stick out like a sore thumb in his writing. His elucidation of the signaling theory of education is blended with personal recommendations bound to be unpopular: abolishing all forms of public funding for education, removing child labor laws so as to increase workplace training at a young age, and making education generally more cost-prohibitive in order to curb demand.
Caplan’s brilliance, however, radiates in his application of the signaling model to describe how education is a waste for many students. He writes that students suffer not from a lack of access to education, but from a surfeit of it. There is too much to go around, producing less of the distinction traditionally associated with obtaining a degree and more of an increasingly meaningless certification of one’s own skill and ability. Caplan likens this fallacy of composition, whereby obtaining higher education is best for the individual but not for everyone collectively, to a spectator at a concert trying to get a better view. Standing up works if one can’t see, but no one is better off if everyone does it.
The theoretical antithesis of signaling theory is human capital theory, which states that the purpose of school is to endow students with a selection of skills and abilities that can be applied generally in the labor market. Caplan argues that this is mostly absurd. Schools, he says, barely give students any transferable knowledge or intellectual capital. He pegs the share of the gains from education attributable to human capital at just 20%, and the rest to signaling. Ask anyone a year or more removed from high school to recall the specific details of their chemistry or algebra classes. In a world where very few American adults retain any skills beyond basic literacy and numeracy, the notion that school imbues students with any form of residual human capital is nonsense. Most knowledge picked up in school is transitory and “inert,” rigid in its range of applications and limited in its potential to be recalled over time.
Instead, Caplan argues that education’s true value lies in its capacity to signal to employers that a graduate has accomplished a selection of tasks meriting the conferral of a diploma. Students aren’t at all interested in acquiring “human capital,” or whatever that may be; if they were, lecture halls at top Ivy League schools would be packed with non-students listening in to sharpen their skills. Yet this has not happened. Students strive primarily to acquire the diploma, and the prestige and recognition that come with it. The conferral of a degree is the ultimate certificate of one’s performance. This is known as the “sheepskin effect,” wherein the completion of the final year of college, and the bestowal of the degree which soon follows, is the most significant threshold to cross in school. Caplan finds that a majority of the education premium (60%) can be attributed merely to obtaining the physical diploma, not the steps completed along the way.
In essence, Caplan argues that education is a signal from students to potential employers that one has officially demonstrated the ability to complete a range of tasks with a certain degree of distinction. It is a de facto obstacle course, challenging students to jump over what Caplan argues are often meaningless hurdles. If students are unlikely to remember what they just learned or to apply it more generally in the real world, then education does very little in the way of imparting useful skills. Rather, it certifies one’s work ethic, conscientiousness, and conformity to societal expectations. Education, Caplan writes, is mostly hollow, yet it provides a platform for students to showcase their abilities and, at best, polish them. It is akin to an automobile assembly line where the frames, bodies, and mechanics of the cars arrive already manufactured, and all that is needed is a final coat of paint.
Because school is an extremely costly signal, in both time and money, Caplan argues that many students would be better off not attending in the first place. Those with a lower aptitude for the rigors of academic life, he argues, would be better off jumping off the train early to pursue more vocational types of learning instead. Those considered to have a higher capacity to withstand the demands of school should be the primary recipients of higher education. This theory only works under the signaling model, which assumes one’s innate capability is already mostly established and that additional schooling will do little to bolster one’s skills in any meaningful fashion.
Caplan’s conclusions are likely to ruffle some feathers. It is easy to see how one might label his arguments snobbish or elitist, given that Caplan has himself spent decades in, and profited from, the very system he derides. He goes even further, saying there should be active measures to limit the number of people who can go to school. Whether by reducing public subsidies, increasing interest rates on student loans, or moving the school system to one based on private charity, education, he says, can be made more restrictive, so that it serves only those who stand to benefit the most. By making students have more “skin in the game”—which essentially amounts to making access to education more prohibitive—Caplan believes the education system will be more efficient, less costly, and reserved for those who truly want and need it the most.
Caplan does not reckon with the modern impracticality of moving the public education system to one reliant on private charity for its funding. He does not see the latent potential of high-ability students who stand to truly benefit from education, if only they had the funds to obtain it. He does not address the explosion of online course material and learning resources like Khan Academy that many people rely on to acquire skills. And he paints with too broad a brush in dismissing the human capital theory as applied to education. In reality, different schools and different programs impart human capital at different rates and in different ways. Would one expect a student to gain the same level of transferable skills from taking strictly computer science classes at a technical school as from attending a liberal arts college with distribution requirements? Caplan’s 80/20 split—80% of education’s value explained by signaling and 20% by human capital—is too much of a blanket claim to classify as general fact.
The Case Against Education is a hard-nosed look at the value of getting an education in the United States. Caplan’s use of the signaling model is brilliant and hard to refute. His evidence showing that the conferral of a degree is the most important threshold to cross in education is too compelling to overlook. His premise that education imparts little to no human capital, but rather certifies one’s ability to work hard and adhere to social expectations, is spot on. His conclusions, however, are expressly libertarian, entirely antithetical to our rich tradition of federal involvement in the education system. Those aside, Caplan’s new book should raise some eyebrows. Readers should be good students and take note.
For peculiar and grossly misinformed reasons, the twentieth-century economist Friedrich Hayek has received newfound attention in contemporary political discourse for his exposition of free markets and personal liberty. Hayek, a fountainhead of the Austrian school of economics, has gained currency among conservative critics of the Obama presidency for his piercing rebuke of socialism and the creeping administrative state. Conservative firebrands like Rush Limbaugh and Glenn Beck have appropriated Hayek’s ideas to admonish what they saw as the beginnings of oppressive government control of the free market and an all-encompassing welfare safety net (Schuessler 2010). As these pundits stoked the fears of the masses, sales of Hayek’s magnum opus The Road to Serfdom exploded, quickly reaching the top of the best-seller charts. Today, Hayek is lauded by proponents of small government and personal autonomy, and his theories have found a home in the hearts and minds of conservatives and libertarians alike.
To understand Hayek’s thoughts on social welfare, we must first examine his place in one of the most heated economic debates of the twentieth century. Events in the first half of the century seriously contested the fitness and practicality of capitalism. The Great Depression and the prolonged misery it inflicted were symbols of capitalism’s apparent demise (Caldwell 2010, p. 4). Alternative social experiments like communism and fascism were equally unsound, undermined by the political turmoil and botched economic experimentation in the Soviet Union and Europe. Socialism, ostensibly the middle path of moderation, grew in its allure. Rational planning in the form of “socialism, planning, and science” received much attention in the political circles of many Western countries (ibid., p. 5). The gradual ascendency of the Fabian socialists in Britain, as well as the British Labour Party’s endorsement of nationalized industry, signified that socialism was on the rise (Millward 1997).
Socialism as a viable, workable economic system had its share of supporters and a vigorous ensemble of critics. The crossfire between these two camps was robust and impassioned; the series of academic exchanges that ensued was later marked as one of the most defining moments in the history of economic thought (Colander 2002). It is here that Hayek carved a niche for himself against those who saw promise in central planning. In this so-called economic calculation debate, Hayek launched a barrage of criticisms against state intervention in the economy, its unsettling outgrowths, and the inevitable slide into totalitarianism that lay in its future. One of those outgrowths was the “Welfare State” and its underlying notions of distributive justice, which Hayek strongly believed were politically infeasible and ethically dubious. Importantly, those ideas must be considered in light of his contributions to the broader capitalism-socialism debate, namely his view that government planning was fundamentally impractical and impeded by the diffusive nature of knowledge, as well as his elevation of choice and personal liberty as the hallmarks of a dynamic, organic, and free society.
The influence of Ludwig von Mises
Hayek’s philosophical contributions to the economic calculation debate were preceded by those of fellow Austrian economist Ludwig von Mises. In a 1920 article entitled “Economic Calculation in the Socialist Commonwealth,” Mises argued that socialism rests on virtually “impossible” assumptions (Mises [1920] 2012, p. 2). Rational allocation of resources by a socialist economy is infeasible, he wrote, because it overlooks a key principle of how economies work: individual knowledge and initiative. A planner attempting to rationally calculate all of these atomized preferences would be, in his words, “groping in the dark” (ibid., p. 14). “[T]he mind of one man alone,” he wrote, “is too weak to grasp the importance of any single one among the countlessly many goods of a higher order. No single man can ever master all the possibilities of production” to come away with definite answers (ibid., p. 15).
Continuing this line of reasoning, Mises wrote in Omnipotent Government that the futile pursuit of perfect economic allocation by a solitary planner can have unpalatable political consequences (Mises [1944] 2010). He pointed to the rise of socialism as evidence of the growing trend toward state omnipotence and its methods of social compulsion and coercion. People who buy into the sophistry that government interference increases social progress—people who become “entangled in the tenets of state idolatry,” to use his more colorful language—are paving the way for totalitarianism (ibid., p. 8). Mises’ faith in the efficiency of markets, which teem with the different value sets and preferences of each of their participants, and his denunciation of central planning as computationally impossible created a formidable foundation for Hayek’s commentary in the subsequent decades (ibid., p. 113).
Hayek’s Road to Serfdom, the specter of ‘hot socialism’, and the perils of the welfare state
Mises’ criticism of social planning and its totalitarian ends was significantly influential on Hayek. Fusing Mises with Adam Smith and Alexis de Tocqueville (from the latter of whom he borrowed the title of his capstone publication, The Road to Serfdom), Hayek articulated a scathing rebuke of unbridled economic management while impassionedly defending free market principles (Hayek [1944] 2006). He labeled the notion of complete knowledge a fallacy that effectively amounts to grasping at straws. A government attempting to legislate any ethical system of values—a natural outgrowth of such rule, says Hayek—makes an unjust and misguided imposition that is doomed to fail. Likewise, he borrows from Smith’s robust articulation of unfettered market activity to argue that free markets—spontaneous in origin and dynamic in character—are the only means of economic organization that should be supported. Drawing from a rich history of economic thought on capitalism, Hayek propelled the debate forward and significantly affected economic discourse for decades.
Hayek’s most well-known work, The Road to Serfdom, was written in 1944, during a historical period marked by the resurgence of socialism across Europe. The specter of “hot socialism,” which Hayek defined in the book as “a central direction of all economic activity according to a single plan, laying down how the resources of society should be ‘consciously directed’ to serve particular ends in a definite way,” threatened the traditional order of classical liberalism (ibid., p. 78). Hayek warned that a government monopoly over distributive choices in the economy would lead to a gradual descent into tyrannical rule. Attempts to redistribute, born of the desire for an ethically just social arrangement, would destroy basic democratic norms and a free society.
Hayek argued in Road to Serfdom that state attempts at economic leveling would cause more chaos than the order they sought to engender. The codification of a certain system of ethics to ground an economic plan, he wrote, is misguided, because no consensus on such a plan does or can exist. State planning following such a code would provoke conflict and disagreement because there is no agreed-upon plan about what should be done (ibid., p. 65). Because it is impossible for any solitary mind to possess knowledge of every individual’s calculus of needs and wants, individual knowledge cannot be scaled. This is what he called the fundamental principle of individualism—each person’s cognition of only his own interests and the impossibility of divining the interests of those outside his limited field of understanding (ibid., p. 62).
Hayek then proceeded to delineate how state-directed redistribution would lead to the decay of individual initiative and the rule of law. He argued that, because redistribution favors certain people over others, it is in direct conflict with the equal application of the law (ibid., p. 82). A government aiming at “material or substantive equality of different people” for the purpose of some “substantive ideal of distributive justice,” he writes, leads to favoritism (ibid.). Wrote Hayek: “To give people the same objective opportunities is not to give them the same subjective chance” (ibid.). Naturally, this favoritism can also be present in the labor market. Payment according to such subjective assessments, as opposed to the objective results of one’s actions, has the intended effect of insulating people from hardship. The true usefulness of one’s employment to society becomes obscured by the promotion of subjective merits, which Hayek said interferes with the natural mechanics of the market (ibid., p. 126).
According to Hayek, once the government undertakes the grand leveling experiment for the sake of greater distributive equality, it expands its scope over human affairs and moves directly toward totalitarianism. When this happens, the government, he warned, becomes culpable for everyone’s fate and position (ibid., p. 111). And because the state can will certain preferred outcomes, everyone’s situation in life is artificially determined. This negatively impacts individual initiative, said Hayek, as “all of our efforts directed towards improving our position will have to aim, not at foreseeing and preparing as well as we can for the circumstances over which we have no control, but at influencing in our favor the authority which has all the power” (ibid.). Hayek’s conception of the welfare state is thus discriminatory, oppressive, and inimical to the basic tenets of a free society.
Soft despotism of bureaucracy and experts
Hayek later acknowledged that his forewarnings of what would happen under the scourge of “hot socialism” were somewhat overblown (Hayek [1956] 1994). By the mid-1950s, Hayek acknowledged that aggressive state planning was nearly moribund in most of the Western world—due, as one scholar suggests, to the progression of the Cold War and the “horrors of life under Stalin [becoming] apparent”—and that Road to Serfdom’s implications might be outdated (Caldwell 2010). He warned, however, that while the illusions of “systematic socialism” had run their course, their remnants were still very much extant (Hayek [1956] 1994). He cautioned against complacency: “If few people in the Western world now want to remake society from the bottom according to some ideal blueprint, a great many still believe in measures which… in their aggregate effect may unintentionally produce this result” (ibid.). Specifically, Hayek was referring to the British Labour government’s proposed resumption of nationalizing industry and planning.
While socialism had spent its force, Hayek argued that a new form of social organization would inevitably rear its head in the form of the administrative state he described in Road to Serfdom. This, he said, would take the form of “[t]hat hodgepodge of ill-assembled and often inconsistent ideals which under the name of the Welfare State has largely replaced socialism as the goal of reformers…” (ibid.). Such a state, he argued, was not necessarily categorically bad. Its peril lay in its gravitation toward coercive and discriminatory methods, which would in effect arrive at the same outcome as socialism. Rather than the hot socialism defined by brutal despots and jackboot rule, Hayek saw the gradualism of the welfare state as akin to Tocqueville’s “soft despotism” of creeping bureaucracy that slowly and unobtrusively suffocates through “a network of small complicated rules” (Tocqueville). Vigilance against such designs, he advised, was needed: “Without such a revised conception of our social aims, we are likely to continue to drift in the same direction in which outright socialism would have carried us a little faster” (Hayek [1956] 1994).
By the time he wrote his 1960 publication The Constitution of Liberty, Hayek argued that socialism had been thoroughly discredited. “It has not merely lost its intellectual appeal,” he writes, “it has also been abandoned by the masses so unmistakably that socialist parties everywhere are searching for a new program that will insure the active support of their followers” (Hayek [1960] 2011, p. 370). Interest in seemingly more innocuous welfare programs—old-age pensions, health insurance, and housing assistance—quickly supplanted talk of nationalized industry, five-year plans, and complex computations to optimize allocation. The problem with this, as he laid out in Road to Serfdom, is that political disagreement over the appropriate policy will cause popular dissatisfaction with the democratic process (Hayek [1944] 2006, p. 75). As a result, matters will be taken out of the political arena and into the hands of independent bodies, bureaucrats, and so-called experts. This, Hayek said, leads institutions to drift in their scope and purpose, increasing the size of government and central control over the market in nearly the same manner as would occur under a socialist system of planning (ibid., p. 373).
The drift of bureaucratic institutions into the market, he wrote, is the most grievous infraction of all. Channeling Tocqueville’s disdain for the “immense and tutelary power” of absolute state rule, Hayek argued that such control was politically infeasible (ibid., p. 367). New welfare programs rely on coercion and confiscate certain rights from individuals; as a result, he says, they are inherently antithetical to a free society. “If government wants not merely to facilitate the attainment of certain standards by the individuals but to make certain that everybody attains them,” he wrote, “it can do so only by depriving individuals of any choice in the matter. Thus, the welfare state becomes a household state in which a paternalistic power controls most of the income of the community and allocates it to individuals in the forms and quantities which it thinks they need or deserve” (ibid., p. 377). The multiplying of “efficient expert administrators,” working in tandem toward the ambiguous and amorphous goal of the common good, leads to an encroachment on personal liberty and leaves people with fewer choices and less freedom over some of the most important matters in their lives (ibid., p. 378).
Hayek proceeded to describe the unmistakably totalitarian fate of redistributive programs in his 1973 work Law, Legislation and Liberty. Governments attempting to legislate an improvement in the general welfare of the public are not only placing stock in an inherently unnatural human arrangement; they are also deceiving themselves that such an arrangement is possible (Hayek [1973-1979] 1998, Vol. 2, p. 2). Hayek says there is no sense of justice in the receipt of unearned income. “Nobody has a right to a particular state of affairs unless it is the duty of someone else to secure it,” he writes. “Justice does not impose on our fellows a general duty to provide for us” (ibid., p. 102). It is only within the power of a free society—spontaneous in its origins and capricious in its course—to determine what anyone’s position should be. This is akin to Smith’s articulation of the “invisible hand” of the market, whereby order is collectively assembled only out of the random, independent actions of each market participant interacting with others in that market. According to Hayek, creating a “just” society that aims for a more equitable distribution misses the point of what a spontaneous society means.
Therefore, Hayek argued that the welfare system is unnatural and ethically unfounded. Creating what “we think to be desirable by simply decreeing that it ought to exist,” he wrote, “and indulging in the self-deception that we can benefit from the spontaneous order of society and at the same time mold it to our own will is more than merely tragic” (ibid., p. 106). The application of certain rights or privileges to a free society is misguided, he said, because such actions rest on the assumption that society is an organization by which everyone is employed. Because free society is spontaneous and not collectively organized, instituting certain rights and modes of conduct would require making society into an organization—which, he said, would make it “totalitarian in the fullest sense of the word” (ibid., p. 104). Bureaucratic attempts at creating an expansive welfare system, he concluded, would grease the skids for totalitarian rule and the abolition of a free society.
Hayek’s acolytes and critics
More recent economists, even some within the Austrian school, have disputed Hayek’s argument that a welfare state will inexorably dissolve into totalitarianism. Even Mises, one of Hayek’s greatest influences, disputed his fellow Austrian’s theory of socialism, welfare, and political destabilization. In a 1960 article published in Christian Economics, Mises argued that Hayek got the causality backwards (Mises 1960). The welfare state was not a perverse outgrowth of socialism; rather, it was a method for gradually submerging the economy in the toxic brew of Marxist dogma and fascism. Later economists also disputed the conclusiveness of Hayek’s hypothesis. Lionel Robbins, a twentieth-century British economist and follower of the Austrian school, argued that Hayek’s “absolute skepticism” of all mixed economies belied some of the social improvements made by the British welfare system (Robbins 1961, p. 80). Writing in his 1961 article “Hayek on Liberty,” Robbins contended that “Hayek is somewhat too apt to extrapolate his apprehensions of evil and to assume that deviations from his norm lead cumulatively to disaster” (ibid., p. 80). Specifically, Robbins wrote that Hayek anticipated the growth of totalitarianism out of the welfare state with near-fatalistic certainty. Such absolute claims, he added, should be avoided.
Hayek was also criticized by economists outside of the Austrian school. Paul Samuelson, the prominent American economist, argued in a paper entitled “Personal Freedoms and Economic Freedoms in the Mixed Economy” (later criticized by Hayek for misstating his assumptions) that the Austrian economist believed the enactment of the welfare state would almost certainly knock society into totalitarian rule (Samuelson [1964] 1991). In the paper, Samuelson distilled Hayek’s causality into diagrammatic form. Plotting economic and political freedom on two axes, Samuelson drew an arrow between Great Britain in 1850 (positioned far up and to the right in the space) and serfdom, drawn at the origin (Farrant & McPhail 2010, pp. 105-6). Using Hayek’s assumptions, Samuelson concluded that “[s]ocial reform (moving ‘west’) inevitably plunges society (‘southward’) into serfdom” (qtd. in ibid., p. 106). Hayek castigated Samuelson for this overly simplistic portrayal of the conclusions of Road to Serfdom.
Samuelson’s diagrams
Another critic outside the Austrian school was the American economist George Stigler. Mirroring earlier criticism of Hayek’s absolute conclusions, Stigler posited in “Reflections on the Loss of Liberty” that plenty of evidence contradicted Hayek’s absolute causality between the welfare state and state control (Stigler [1964] 1981). Stigler interpreted Hayek as stating that discrete instances of government regulation would amalgamate into a single, monolithic state apparatus more powerful than its individual parts (Farrant & McPhail 2010, p. 108); the danger lay in following that road to its draconian ending. Stigler wrote that Hayek was effectively “telling gentlemen drinkers, and especially some Englishmen who were becoming heavy drinkers, not to become alcoholics” (qtd. in ibid., p. 96). Like Robbins, Stigler criticized Hayek for viewing mixed economies with the same absolute skepticism. Many advances in Western states with mixed economies, Stigler said, seemed to contradict Hayek’s hypothesis.
Despite his critics, Hayek was widely considered to have significantly improved the standing of Austrian thought in the 1970s and 1980s (Kirzner 1988). Many even argued that his conclusions—specifically the menace of an administrative state and the loss of personal autonomy that comes with it—extended well beyond the welfare debate. Israel Kirzner, a British-born American economist of the Austrian school, argued that Hayek’s articulation of central planning and knowledge improved the standing of the Austrian position, particularly in the understanding of the “welfare” aspects of the market and the nature of economic systems (ibid.). Milton Friedman, the great American monetary economist, believed Hayek’s conclusions were even more applicable to the fall of communism in the 1990s than to the world he wrote about in the 1940s (YouTube 1994). As a symbol of his gratitude to the Austrian economist, Friedman wrote the introduction to the fiftieth-anniversary edition of Road to Serfdom in 1994 (Friedman [1994] 2007). However, many believe that Hayek’s work has been misread and poorly understood by later economists in the socialist calculation debate, specifically on the question of the “knowledge problem” (Kirzner 1984; Lavoie 1985).
The political turbulence of the Obama years has allowed Hayek’s conclusions to achieve a kind of resurgence. Those conclusions, however, are not without their critics. The cold, unpitying thud of the jackboot that Hayek warned (and many conservative pundits today echo) would certainly and immediately follow any form of government redistribution reads as too extreme and too conclusive. The benefits of social service programs instituted by many countries since his time of writing seem to belie Hayek’s apocalyptic forecast of complete state rule and the elimination of personal liberty. But one thing is certain: Hayek’s contributions to the Austrian school and the economic calculation debate, in which the political implications of the welfare state were central to his broader conclusions, cannot be ignored. His influence, for better or for worse, has outlived him.
References:
Caldwell, B. 2010. “Hayek on Socialism and on the Welfare State: A Comment on Farrant and McPhail’s ‘Does F.A. Hayek’s Road to Serfdom Deserve to Make a Comeback?’” Duke University, HOPE Center Working Paper No. 2010-02.
Colander, D. 2002. History of Economic Thought, Fourth Edition. Houghton Mifflin Company, Boston, MA. pp. 356-79.
Farrant, A. and E. McPhail. 2010. “Does F.A. Hayek’s Road to Serfdom Deserve to Make a Comeback?”, Challenge, 53(4), 96-120.
Friedman, M. [1994] 2007. Introduction to the 1994 edition of The Road to Serfdom. In The Road to Serfdom, ed. B. Caldwell, 249-50. Chicago: University of Chicago Press.
Hayek, F.A. [1944] 2006. The Road to Serfdom. Routledge Classics, New York, NY.
Robbins, L. 1961. “Hayek on Liberty.” Economica, New Series, Vol. 28, No. 109. pp. 66-81.
Samuelson, P.A. [1964] 1991. “Personal Freedoms and Economic Freedoms in the Mixed Economy.” In The Collected Scientific Papers of Paul A. Samuelson, Vol. 3. MIT Press, Cambridge, MA.
BOOK REVIEW: FIRE AND FURY: INSIDE THE TRUMP WHITE HOUSE
BY MICHAEL WOLFF
HENRY HOLT AND COMPANY, PP. 336
Rarely has a book provoked as much turmoil and consternation in the country’s highest political quarters as Michael Wolff’s Fire and Fury, the dishy new tell-all of Trump’s first year in office. The book’s multiple revelations, garnered by the author’s characteristic use of cajolery and smooth-talking to penetrate the corridors of the White House (“I did whatever it took to get the story,” Wolff said), have predictably provoked presidential outrage and the tragic demise of one of his most important advisors. Following the release of the book, Steve Bannon, considered to be Trump’s Svengali, was ousted from his perch at Breitbart and publicly condemned by Trump for his apparent betrayal. Loose lips, Bannon seems to have forgotten, sink ships.
Indeed, the book reads as if it were written by Bannon himself. Wolff fills most of its pages with Bannon’s musings on the internal power dynamics of the White House. In the book, the erstwhile chief strategist is seen chafing at the snobbery of the mainstream Republican political class, the subtle influences of other White House players over Trump and his administration, and the president’s wavering grasp on reality. Wolff reveals a White House increasingly dependent on Bannon for its philosophical and ideological center of gravity; his influence is frequently painted as eclipsing that of other prominent members of Trump’s senior staff (chief of staff Reince Priebus) and even his own family (Trump’s daughter Ivanka and his son-in-law Jared Kushner). No one in the Trump administration, writes Wolff, could formulate a cogent, coherent narrative. Bannon, “the voluble, aphoristic, shambolic, witty, off-the-cuff figure who was both ever present on the premises and who had, in an unlikely attribute, read a book or two,” quickly moved to fill that void.
Given just enough rope, Bannon quickly established a war room of political operations in the West Wing. Removing furniture from a nondescript office space and requisitioning white boards, he intently set about marking out the grand vision of the Trump agenda. His single-minded focus and ambition, which included a trial balloon for nativism in the form of an immigration ban, earned him scorn and ostracism from his colleagues. Bannon’s blitzkrieg of agenda-building took the name he gave it: the “deconstruction of the administrative state.” Acting swiftly, Bannon moved to capture the soul of the Trump White House. One of the most conspicuous developments in Wolff’s book is Bannon’s hasty ascent through the ranks of Trump’s orbit by means of cunning calculation and scorched-earth warfare.
Outside of Bannon’s plotting, Wolff portrays a White House in perpetual disarray, with the president as its source. Wolff describes a man with no political or ideological lodestar and no ambition greater than public acclaim. In the book, Trump is portrayed as peevish, capricious, and “childlike,” with little ability or patience to hold his attention for any prolonged period or to absorb outside information. His staff describes the many attempts to scrape together a coherent governing agenda without any direction from Trump himself. This was the most striking concern of the Trump presidency, writes Wolff: “he didn’t process information in any conventional sense—or, in a way, he didn’t process it at all.” He didn’t read, he didn’t listen, and—most importantly—he could not act or speak with inhibition.
Fire and Fury reads as a harrowing account of the utter disarray and desultoriness of the Trump White House. Wolff’s Trump believes himself to be fully above the conventional norms and ceremonial decorum of the presidency: “The notion of the presidency as an institutional and political concept, with an emphasis on ritual and propriety and semiotic messaging—statesmanship—was quite beyond him.” Wolff has received (rightly) his fair share of criticism over the validity of some of the exchanges recounted in the book. Whether those exchanges are completely true is beside the point; Wolff’s accomplishment is floridly weaving the narrative of a Trump White House that is unparalleled in American political history. The book discharges plenty of shock and awe, but not enough to differentiate itself from the practically normalized state of frenzy that spews out of the White House on a nearly hourly basis.
BOOK REVIEW: DECIDING WHAT’S TRUE: THE RISE OF FACT-CHECKING IN AMERICAN JOURNALISM
BY LUCAS GRAVES
COLUMBIA UNIVERSITY PRESS, PP. 336
In today’s blustery political climate, it has become common to hear the terms “fake news” and “post-truth world” thrown around casually and frequently. The arrival of these troubling descriptions coincides with the political ascendancy of Donald Trump and his fractious assimilation into the White House. Trump’s complicated relationship with the truth, evidenced by the growing multitude of fact-checking enterprises and the disproportionate share of his statements those sites have judged fallacious or misleading, has strained journalism’s traditional identity by pushing it to be more self-conscious, interpretive, and, by consequence, controversial.
It is this shifting media landscape that Lucas Graves’ Deciding What’s True: The Rise of Fact-Checking in American Journalism smartly brings to light. Graves, a professor of journalism and communication at the University of Wisconsin, argues that the rising stock of fact-checkers like PolitiFact, Factcheck.org, and the Washington Post’s Fact Checker can be attributed to the social, political, and economic pressures that have beset the norms of conventional reporting. Fact-checking, he says, reflects a reform movement within the school of journalism, one that involves “the diminished differentiation or more complex interpretation of the institutional fields of journalism and politics in the face of technological and economic change.” The proliferation of a diverse array of media sources has challenged journalism’s traditional role as a “gatekeeper” of the news. As a result, conventional reportage carries less weight in political discourse, leaving such discourse vulnerable to subtle manipulations of the truth.
The rise of fact-checking in the press is a natural response to troubles in the field of journalism more broadly. Seeking to indict these shortcomings from within, fact-checking has prodded the field to become more self-critical. “What sets [fact-checking] apart is a self-conscious orientation to the rest of the field: fact-checkers seek fairly openly to fix political journalism by introducing new practices, revising prevailing norms, and building institutional resources,” says Graves. Part of that trend is wading into the muddier waters of interpretation, explanation, and conflict resolution rather than strict stenography. By nature, then, fact-checking is more aggressive at smoothing out kinks of misinformation and cutting through the fog of a dizzying media environment.
While Graves’ book precedes Trump’s political stardom, it is prescient in forecasting fact-checkers’ cachet in recent years. Lies seem increasingly prevalent, and the unadulterated truth has become a marginalized commodity. The 2016 presidential race, he wagered, seemed to be “a new low for reasonable, fact-based political discourse, worse even than recent contests dominated by wild claims about death panels and birth certificates.” The visibility of these misstatements can be seen as a victory for fact-checkers, says Graves, and for the increased vigilance the press has maintained in sniffing them out. It has not, however, removed them from the political discourse altogether. Trump’s daily propagation of erroneous information—whether by speech or by Twitter—has continued unabated throughout his first year as president. Trump’s response to the surge of fact-checks on him has been to denounce the media altogether with the charge of “fake news.”
Deciding What’s True is a sharp and illuminating examination of the proliferation of fact-checkers in the media world. Graves argues that this rise has corresponded to the deteriorating influence of traditional journalism within the public discourse. Fact-checking sites and organizations have challenged the norms of conventional reporting from within, impelling it to capture the world with a more self-conscious lens. This has produced much controversy, as fact-checkers have pushed beyond the simple relay of information to the murkier work of interpreting and assessing the truthfulness or fallaciousness of political information—sometimes on a standardized scale.
At the end of the day, however, Graves says fact-checkers’ clout stops short at the water’s edge. Along with offering “hard facts” and “decisive rulings,” they must also submit themselves to popular judgement. By consequence, they “need to be able to disclaim their own work, offering it up to the judgement of readers who are free to disagree with those conclusions.” All of this amounts to an emerging field that is more interpretive in its analysis, more decisive in its conclusions, and more controversial in its implications.
A 4/20 rally in Colorado in 2015. (Reuters/Rick Wilking)
On November 6, 2012, voters in the state of Colorado voted decisively to legalize the possession of marijuana for recreational purposes, enshrining in the state’s constitution an amendment permitting adults to possess up to one ounce of cannabis. Multiple other states followed suit in the ensuing years by putting the issue on the ballot, and voters in all but two of the states that offered the initiative approved the recreational use of marijuana.
The push by cannabis supporters to pass state laws allowing possession is the culmination of decades of state efforts to loosen restrictions on the drug, which is still prohibited by the federal government. This has created significant tension between Washington and the states, pushing the boundaries of a federalist form of government that allows states some measure of flexibility but still makes federal policy supreme over that of the states.
This comes at a moment when public support for marijuana is high. Reported marijuana usage—both by those who currently use it and those who have used it in the past—is the highest of any scheduled substance, in terms of actual numbers and the percentage of those surveyed. Those figures have been steadily rising among all age groups over the past decade.
According to a recent survey, more than half of American adults have used marijuana at some point in their lives, and about one in four of them currently use it. More than half also say that smoking is “socially acceptable,” with about the same percentage saying that they would support the legalization of marijuana for recreational purposes.
It is generally hard to determine the causality of the issue—whether state liberalization of marijuana laws is driving laxer perceptions of the substance, or whether its popularity is rising independent of state changes—but one thing remains clear: social mores surrounding the personal use of marijuana are shifting. Using it has become much less a symbol of degeneracy and cultural rebellion than an innocuous or even medically beneficial activity.
Indeed, support for marijuana legalization has risen steeply since the 1970s. In 1969, only 12 percent of the surveyed population supported marijuana legalization; by 1995, support had more than doubled in percentage terms, and by 2016, 60 percent supported legal marijuana.
Recent opinion surveys also find broad opposition to federal enforcement of marijuana laws in states where marijuana has been made legal. About six in ten respondents (59 percent) said Washington should not intervene in states that allow marijuana use. That opposition was consistent across the political spectrum: a majority of independents, Democrats, and Republicans believed the government should let states experiment with their marijuana laws. The same survey found that, among those who think marijuana should be legal, 78 percent said there should be no federal intervention to reverse state laws that have legalized it. Public support for marijuana, paired with its continued illegality in federal statute, makes legalization a divisive and legally thorny issue.
To understand how this landscape came to be, it is necessary to study the origins of the federal proscription against marijuana. As early as the late 1960s, President Richard Nixon fingered the widening use of marijuana, especially among college students, as a major contributor to the “growing menace” of drug abuse in the country. In a special message to Congress in 1969, Nixon warned that drug abuse had escalated from “essentially a local police problem into a serious national threat to the personal health and safety of millions of Americans.”
The following year, lawmakers wrote the Controlled Substances Act (CSA) to codify several schedules of controlled substances, among them marijuana. Noted for its “high potential for abuse,” a lack of “currently accepted medical use,” and a “lack of accepted safety for use,” marijuana was classified as a Schedule I substance, alongside other drugs like heroin and LSD.
Along with listing banned substances, the CSA also served to beef up existing law enforcement authority over drug abuse. The Bureau of Narcotics and Dangerous Drugs was granted an additional 300 agents, and those agents were granted extensive powers to execute and serve search warrants, make arrests without warrant, and seize property with probable cause.
Following the CSA, Nixon established the Office for Drug Abuse Law Enforcement to combat drug trafficking and to compile and pool information on drug markets for use by Federal, State, and local law enforcement agencies. This “balanced, comprehensive program,” as he called it, became the basis for decades of the federal mobilization against marijuana.
The implementation of the CSA, backed by a burgeoning federal law enforcement apparatus, did not deter states from changing their own laws on marijuana possession. Many states throughout the 1970s and 1980s decriminalized the possession of marijuana. And as early as 1978, states began to establish limited medical marijuana trials for cancer and glaucoma patients.
In 1996, California became the first state to formally legalize medical cannabis. Voters passed Proposition 215, also known as the Compassionate Use Act, to allow seriously ill Californians to obtain marijuana from a physician for the treatment of a terminal illness.
The federal government, however, was not going to sit by and allow flagrant state violations of federal drug policy. In 1998, the U.S. government sued a not-for-profit marijuana dispensary for the illicit distribution and manufacture of marijuana. The case made its way to the Supreme Court, where Justice Clarence Thomas, writing the majority opinion in United States v. Oakland Cannabis Buyers’ Cooperative, rejected the Cooperative’s contention that its distribution of marijuana was justified as “medically necessary.” Thomas wrote that because marijuana was listed as a Schedule I substance under the CSA, it had no medical benefits that would qualify it for an exemption.
Moreover, because the Cooperative could not be classified as a government-approved research project, he wrote that it could not receive a statutory exemption. Because the case only concerned an injunction—not the constitutionality of the law itself—the Compassionate Use Act still remained on the books. The case became the first of many legal challenges to state maneuvers on drug policy that threatened the supremacy of federal statute.
Prop 215 became the bellwether of additional state changes to marijuana laws. In the five years following the initiative, a handful of states, including Oregon, Alaska, Washington, Hawaii, and Nevada, exempted from prosecution patients meeting certain statutory requirements. Other states, like Maine and Colorado, legalized medical cannabis through the ballot box.
While decriminalization initiatives nominally do not violate the CSA (possession of marijuana remains a civil infraction), medical marijuana posed new concerns that placed states approving its use at odds with the statute. Washington responded by launching a barrage of legal challenges against the states to enforce the federal ban on marijuana under the CSA.
In 2005, the Supreme Court decided a landmark case upholding Congress’ power to prohibit the cultivation and use of marijuana. In Gonzales v. Raich, the Court ruled in favor of Alberto Gonzales, Attorney General of the United States, writing that the federal government could prosecute individuals who use marijuana because even intrastate cultivation and use can be considered part of interstate commerce. Citing Wickard v. Filburn, the Court wrote that, given the enforcement challenges of tracking marijuana from cultivator to user, Congress “had a rational basis for believing that failure to regulate the intrastate manufacture and possession of marijuana would leave a gaping hole in the CSA.” By allowing Congress to regulate marijuana, the Court essentially closed that legal chasm.
The federal legislative branch also responded by reaffirming existing federal drug enforcement law. In the wake of the nationwide campaign to liberalize marijuana laws, Congress incorporated language in the FY 1999 omnibus appropriations act supporting the existing federal drug approval process and barring any circumvention of that process by legalizing marijuana.
In the same act, Congress prevented the District of Columbia from tabulating the ballots of a 1998 voter-backed initiative that would have legalized marijuana for medical purposes. Only in 2010 did Congress officially drop its challenge to the implementation of the D.C. initiative. In the intervening years, direct enforcement of the CSA intensified under the Bush Administration: federal raids on nearly 200 medical marijuana dispensaries commenced, and federal prosecutors started training their fire on medical marijuana caregivers.
Because of its limited enforcement and prosecutorial resources and the growing social normalization of marijuana, the federal government was unable to preempt the implementation of state marijuana laws. Instead, it settled for issuing guidance that outlined limited enforcement priorities which did not directly target marijuana possession.
Congress also allowed states more room to experiment with marijuana policies. In response to the Drug Enforcement Administration’s (DEA) raids on medical cannabis users and dispensaries in states where such use was permitted under certain statutory qualifications, the first session of the 108th Congress considered a bipartisan amendment from Representatives Maurice Hinchey and Dana Rohrabacher to prevent the Department of Justice from using appropriations funds to hamper state cannabis laws. The amendment was defeated, but it was taken up again in four subsequent sessions of Congress. Though it did not pass until 2014, it led Congress to consider new bills that would loosen federal enforcement in states where medical marijuana was permitted.
In 2009, the incoming Obama Administration curtailed the Bush Administration’s raids on medical cannabis distributors. Announcing a shift in strategy, Attorney General Eric Holder said enforcement would be narrowed to drug traffickers, rather than dispensaries that complied with state marijuana laws.
Later that year, Deputy Attorney General David W. Ogden released additional guidance to prosecutors in states that had legalized medical marijuana. The memo listed the prosecution of “significant traffickers of illegal drugs, including marijuana, and the disruption of illegal drug manufacturing and trafficking networks” as a “core priority.” Individuals in “clear and unambiguous compliance” with state laws on medical cannabis were not to be the subject of federal law enforcement resources.
In 2013, Deputy Attorney General James M. Cole issued fresh guidance for U.S. Attorneys on the Department of Justice’s enforcement of the CSA in the wake of state ballot initiatives to legalize recreational marijuana. In an attempt to tackle “the most significant threats in the most effective, consistent, and rational way,” Cole said the DOJ would selectively focus its resources on a slate of enforcement priorities that did not include individual users in states where possession had been made legal. The new objectives included preventing the distribution of marijuana to minors, preventing marijuana revenue from flowing to criminal enterprises, and preventing violence and drugged driving.
Deputy AG James M. Cole testifying on Capitol Hill (J. Scott Applewhite/AP)
In practice, the DOJ had little prosecutorial capacity to unilaterally target individual marijuana users to begin with. Instead, federal law enforcement was heavily dependent on states and localities to enforce its drug laws. In 2010, for example, only 0.8 percent of arrests for marijuana offenses were disposed of in federal court. In a review of state marijuana legalization trends, the Government Accountability Office (GAO) concluded that DOJ “has not historically devoted resources to prosecuting individuals whose conduct is limited to possession of small amounts of marijuana for personal use on private property.” Such activity has traditionally been relegated to state and local law enforcement authorities. In the words of one legal scholar, this lack of capacity left state medical marijuana laws largely unchecked and the tension between federal and state legal priorities unresolved.
In keeping with its new enforcement objectives (and its limited scope and resources), the DOJ directed more of its resources to criminal networks and drug traffickers rather than individual users.
For example, the Organized Crime Drug Enforcement Task Force (OCDETF) Program—a key player in the federal government’s effort to dismantle major drug trafficking and money laundering operations—was responsible for one-fifth of all drug cases in FY 2013. Of the remaining cases, 99 percent consisted of allegations of drug dealing rather than drug possession. Furthermore, the U.S. Sentencing Commission found that, of all the observable drug cases that year, 93 percent were drug trafficking cases. These trends reflected the DOJ’s newfound prioritization of drug trafficking and organized crime cases over simple possession.
By the middle of the Obama presidency, states were moving into new territory. In 2012, voters in Colorado and Washington approved ballot initiatives to legalize small amounts of marijuana for recreational purposes. These were the first two popular endorsements of legal marijuana for personal, non-medical purposes in the country. They reflected voters’ shifting attitudes about the substance, while posing new questions for federal law enforcement about how to handle state-level deviations from the CSA.
By 2014, legalization initiatives had also passed in Oregon, Alaska, and the District of Columbia, and voters in Massachusetts, California, Maine, and Nevada approved similar initiatives by 2016. As of early 2017, nearly nine in ten states allowed the medical use of marijuana in some form. In addition to the eight states that have legalized recreational marijuana, experts expect at least nine more states will vote to legalize marijuana entirely by 2018.
As expected, in states that legalized marijuana for recreational use, the number of marijuana-related arrests plummeted. In the year after voters in Colorado approved limited possession of recreational marijuana, marijuana arrests fell by nearly half (46 percent) from 2012.
Marijuana sales arrests, meanwhile, did not change noticeably before and after the initiative. The number of marijuana-related court filings, including felony filings, misdemeanors, and petty offenses, fell; charges for marijuana possession in the state dropped 88 percent from 2012 to 2015. In Washington, the number of incidents involving marijuana was more than halved in the year after the initiative, and filings for low-level possession offenses fell nearly 99 percent between 2009 and 2013.
The federal government, its supremacy challenged once again by this wave of new state measures, signaled new interest in reopening its case against the states. Marking a reversal of Obama-era policies, the Trump Administration vowed “greater enforcement” of federal law against marijuana use even in states that had legalized it.
In February of 2017, Attorney General Jeff Sessions commissioned a task force of prosecutors and federal law enforcement officials to recommend changes in federal marijuana policy, among other areas.
In a letter to officials in Washington state, where recreational marijuana had been legalized, Sessions appeared skeptical of the state’s “regulatory structures” for the medical and recreational marijuana market. Citing a lack of oversight and regulation, Sessions urged the state to address the ramifications of the “dangerous drug,” including drugged driving, possession by minors, and an increase in the number of marijuana-related calls to the State Poison Center in recent years. Many experts cite this as evidence that the Trump Administration will continue to enforce the CSA within the bounds of the Cole Memo, which allows U.S. prosecutors to launch investigations and prosecutions in states where marijuana is legal.
The wave of state measures to liberalize marijuana laws has presented a formidable challenge to federal law enforcement, which views those changes as overt violations of the CSA. Faced with these challenges, as well as the DOJ’s limited prosecutorial resources, federal policymakers have been forced to adapt. Congress has allowed states to continue experimenting with new marijuana laws, and the DOJ has altered its enforcement priorities by targeting some of the more harmful, secondary effects of marijuana possession and large-scale crimes like trafficking.
Despite the rhetoric of the current Administration, the federal government is becoming increasingly constrained on this issue, its authority and jurisdiction curtailed by shifting popular opinion and by states obliging the newfound enthusiasm for marijuana.
A dominant fixture of President Donald Trump’s economic agenda has been comprehensive tax reform, with the goal of lowering rates and spurring economic growth. The White House and Republican lawmakers began working on a framework for such reform as early as this spring.
After conceding defeat on repealing the Affordable Care Act (ACA), Republicans, led by Trump and a covey of political leaders in both the Administration and Congress, recently released a tax proposal. This proposal, called “the unified framework for fixing our broken tax code,” acknowledges the Byzantine complications of the current system writ large: its distortions, intricacies, and inequities. It pledges a complete overhaul of both individual and corporate rates, and the elimination of nearly all deductions and exemptions. Most importantly, it draws inspiration from the most recent modification of the tax code: Ronald Reagan’s Tax Reform Act of 1986.
The 1986 reform has become part and parcel of conventional conservative lore. The story goes something like this: The Reagan Administration, as part of its unbridled assault on “Big Government,” pared back layers of federal intrusion in the free market economy. Bolstered by President Reagan’s charm and personal magnetism, tax reform sailed through the halls of Congress as one of the most impressive legislative achievements by Republicans in modern political history.
Supporters today echo Reagan’s buoyant description of the law as “the most sweeping overhaul of our tax code in our nation’s history.” Its legacy, they say, is monumental, an incontrovertible testament to the wisdom of conservative economics and Reagan’s expertise as a politician who knew how to put that wisdom into practice.
Except the push to pass tax reform was a bit more nuanced, and Reagan deserves a smaller share of the credit. Reform was an effort initially started by Democrats, later co-opted by the Reagan Administration, and only half-heartedly supported by Republicans at the start. Reagan himself was not deeply invested in the effort; policymaking did not come easily to him, given his lack of interest in its minutiae.
It was also the product of a sinuous process that began four years before its passage, an effort that encountered so many failures it was called the “phoenix project,” a reference to the mythical bird reborn from its own ashes. Shepherded through Congress by an unlikely bipartisan alliance of political heavyweights with disparate personalities, it was the ultimate long-shot effort. It underscored the fractious nature of political management, the painful necessity of consensus, and the vital importance of fiscal restraint in an area where costs matter.
These dynamics are crisply captured in Jeffrey H. Birnbaum and Alan S. Murray’s Showdown at Gucci Gulch, a seminal work that rings ever more relevant considering recent efforts at tax reform. Trenchant and insightful, Birnbaum and Murray—both reporters for the Wall Street Journal at the time of writing—pry off the façades of the legislative process to reveal the many dimensions of power and persuasion.
The authors describe an ensemble of colorful players, each of whom left a fingerprint on the bill as it coursed through Congress. In this story, power is wielded by the men who lead powerful committees, who use cajolery and confrontation to bring committee members to consensus, even, rare as it is in modern-day Washington, among members of both parties. Showdown is also a lesson in how policymaking is a painstakingly slow process with the potential to be besieged by special interests.
At its heart, the battle for tax reform was a contest of persuasion, both by those in the chambers of Congress and the lobbyists who sought to extract as many concessions from them as possible.
Showdown at Gucci Gulch: Lawmakers, Lobbyists, and the Unlikely Triumph of Tax Reform (Vintage, 1987)
Tax reform, the authors note, was hardly inevitable. It was not until 1984 that Reagan decided to commission a “study of reform” on the subject, led by Democratic Senator Bill Bradley. Taxes were not even high on the agenda at the commencement of the 99th Congress in 1985. Talk of reforming the tax code was supplanted by other issues, including the ballooning budget deficit and high rates of inflation. The men in Congress with the most power to lead the effort—members of the Senate Finance Committee and the House Ways and Means Committee—were beholden to special interests. They were elected not to reform the tax code, but instead to protect the perks it preserved.
Working tirelessly for any chance to catch a politician’s ear, lobbyists were described by the authors as a “virtual fourth branch of government,” prodding and plying for favorable schemes and loopholes to minimize their clients’ tax exposure. The showdown that gives the book its title was the standoff in the halls of Congress between lawmakers and the lobbyists, the latter of whom swarmed the halls like, in the authors’ words, “so many expectant fathers crowded into the waiting room of the maternity ward.”
Their posh outfits gave their status away, and their pleas indicated their trade. The hallway where they congregated became known as “Gucci Gulch,” a nod to their expensive suits and polished Italian shoes. Big money was at stake, and they clamored for a seat at the table.
Thanks to the efforts of these lobbyists and special interests, the tax code prior to reform was loathed by many for its distortions and inversions; as a result, it engendered little public trust and confidence. The code had come to resemble Swiss cheese, riddled with avoidance schemes, including tax shelters, expedited write-offs, and deductions for luxurious expenses, all disproportionately used by the well-heeled at the expense of the ordinary taxpayer.
In 1987, the Joint Committee on Taxation (JCT) calculated that the federal government lost nearly half a trillion dollars in the form of tax expenditures—the various deductions, exclusions, and carve-outs that are only informally counted as spending in the budget. Corporations also grossly benefited, with one study finding that half of all profitable large companies paid no federal income taxes whatsoever. Public distaste for the unfairness of the tax code grew, but hopes of reform were tempered by the solemn reality of a Congress that moved at a snail’s pace.
It was this distaste, coupled with his own personal animus towards the tax code, that piqued the interest of Democratic Senator Bill Bradley of New Jersey.
Tax reform was Bradley’s brainchild. He saw it as an opportunity to fulfill the Democrats’ ambition of making the economy work for everyone, especially the middle class. It was Bradley who asked for and was given permission by Reagan to study tax reform in 1984. It was also Bradley who co-sponsored a bill that served as a model for the 1986 legislation. Contrary to popular myth, tax reform was a Democratic idea that was only later shared by Republican voices. And while Bradley’s attention to the issue was initially met with little seriousness, it soon began to gather momentum.
Treasury officials also signaled their interest in shaking up the code. Under Treasury Secretary Donald T. Regan, the Treasury produced a plan that amounted to an unabashed assault on the existing system. The plan reflected Regan’s zest for the dramatic; it was a bold maneuver that sought to clean the political house of as many special interests as possible.
The Regan plan, later dubbed “Treasury I,” was ambitious. Treasury officials, granted free rein by the Administration to experiment, cycled through many audacious ideas. They toyed with scrapping the income-tax system in favor of a consumption-based tax, flirted with a value-added tax, proposed large increases in the standard deduction, taxed employee fringe benefits, and even considered jettisoning very popular tax breaks like the home-mortgage-interest deduction. Regan even considered raising corporate tax rates—an idea later quashed by President Reagan, who marked higher corporate rates as off limits.
Birnbaum and Murray acknowledge that while the Treasury plan was far-fetched and politically naïve, it added much-needed momentum to the tax reform campaign and served as a springboard for the legislation to come.
While taxes were still not high on the Republican agenda by early 1985, Reagan maintained his interest in seeing reform through. Meanwhile, Democrats salivated at the opportunity to push through changes to the tax code. They saw in Reagan’s commitment an opportunity to claim the mantle of broad-based growth and economic opportunity.
Democratic Representative Dan Rostenkowski, chair of the House Ways and Means Committee, embraced Reagan’s commitment to reform. In his response to Reagan’s State of the Union address, Rostenkowski excoriated the tax system for its biases in favor of privileged groups, and called for reform that was fair, simple, and targeted at the middle-class. He applauded Reagan for his genuine interest in reform, and pledged “a great deal of Democratic support” to carry it across the finish line.
Rostenkowski’s sympathy for the effort gave more credence to the once long-shot project. From its inception, the enterprise was supported by members of both parties in a testament to bipartisan comity. It was not shoved through Congress by the majority party over the protests of the minority, nor was it passed on a short timetable using legislative legerdemain to circumvent its critics. It was a consensus effort that earned the confidence and support of the public, not a partisan gambit that used chicanery and dishonesty to secure its success. As a result, the 1986 law garnered a diverse coalition of supporters. As the authors note:
It linked Ronald Reagan, the most conservative president in modern history, with George McGovern, the Democrats’ most liberal candidate in decades. It paired [Jack] Kemp, the conservative darling of the supply-side movement and a driving force behind the 1981 tax bill, with Bradley, a liberal Democrat and a fervent opponent of the 1981 bill. It even united General Motors’ chairman, Roger Smith, with his company’s long-standing nemesis, consumer activist Ralph Nader.
Another figure worth mentioning is Republican Senator Bob Packwood. Packwood’s deep connections to special interests, coupled with his perch atop the Finance Committee, a bastion of pro-business sentiment, made him an unlikely champion of tax reform.
Packwood, reading the progress coming out of the House as an omen of things to come, saw tax reform as inevitable. After the House had passed a bill, he took up the effort with newfound enthusiasm. But Packwood did not want a half-baked effort that made incremental tweaks rather than radical revisions to the tax code. Rather, he wanted a chance at what he called “real reform” and saw the House tax plan as an opening. He also wanted to shed his reputation as being “Mr. Special Interest.” In an election year, Packwood saw tax reform as his chance to make his mark on potentially monumental legislation.
According to the authors, what made both Rostenkowski and Packwood so successful at getting tax reform to the finish line was their use of strong-arm tactics and political cunning to marshal support in committee. Pulling all the right levers of power, Rostenkowski and Packwood knew how to wield authority. They persuaded when possible and wrangled when necessary. They forged consensus by framing reform as a shared commitment, one that required everyone to give up a little of what they wanted. They pulled their committees forward by pushing back strongly against amendments that would undermine the bill and gum up its momentum. They underscored the urgency of reform and its promise of widespread prosperity. Eventually, the authors say, support for tax reform gained enough momentum that even the effort’s most vicious critics became born-again reformers.
The effort was not without its challenges. There were slip-ups along the way, insurrections, and painful concessions. Many popular deductions were put on the chopping-block for the sake of making the numbers add up. This was not without its discontents, but politicians in both chambers agreed that the bill could not add anything to the deficit.
At their request, the number-crunchers worked tirelessly to evaluate the cost of each provision; the bill could not be scored as revenue negative. If cost estimators found that the plan added to the deficit, lawmakers would have to return to the drawing board. Unlike this year’s tax campaign, the 1986 bill did not propose to add trillions to the deficit, nor did it promise to recoup some of its cost through exceptionally rosy expectations of reform-induced growth.
Most importantly, the showdown between those who make and enforce laws and those who seek exceptions and concessions from those laws was ultimately won by the former. The lobbyists’ failure, according to the authors, reflected not just their limited influence but, most importantly, the ultimate triumph of “populist” legislation. The success of tax reform demonstrated that real reform was possible and that special interests were not indomitable. While the authors acknowledge that the tax bill may have been an imperfect effort, it succeeded at achieving fundamental reform.
“For all its faults,” they write, “the Tax Reform Act of 1986 was the rough-hewn triumph of the American democratic system.” It was representative democracy at its finest: policy that aimed to serve not just those who could manipulate the system but everyone across the income spectrum.
As Republicans attempt to cram their tax proposal through Congress on a party-line vote, it’s worth heeding the message Birnbaum and Murray impart in Showdown. Not only was the 1986 project a bipartisan effort, involving multiple years of input from both Republicans and Democrats, but it also took time. It was not debated, drafted, passed, and signed within a few months.
In a time when the tax code has twice as many words as it did in 1986, this year’s tax reform will be far more contentious and complicated—not to mention far more unlikely, given the precipitous erosion of common ground between the parties in recent history.
Reform in 1986 was achieved because it represented a convergence of interests important to both parties. Democratic hopes of closing loopholes were fused with Republicans’ preference for lower rates. A bipartisan bill meant a wider base of support, increasing the odds of a smooth passage to the president’s desk. Lawmakers today should take note.
The 1986 bill also underscored the importance of cost containment. The bill’s drafters took great pains to make reform revenue neutral. This stipulation necessitated many concessions from legislators, many of whom were beholden to special interests. Political horse-trading became common practice in committee, especially as new amendments exempting interest groups from tax hikes were introduced. These amendments threatened to violate the conditions of revenue neutrality and to stop the bill in its tracks. Treasury officials repeatedly crunched the numbers and sent them to Packwood and Rostenkowski, who then moved to make more stringent cutbacks so the bill would not increase the deficit.
This attitude sharply contrasts with that of this year’s tax effort. The current Senate Finance Committee chairman, for example, has said he would be fine if the plan added to the budget deficit (which it probably will).
This season of tax reform also calls for wariness of erroneous assumptions that promise to make the numbers add up. Republican tax plans are already taking advantage of an approach called “dynamic scoring,” which assumes that the positive effects of reform on growth will be large enough to pay for the direct costs of the plan. Economists have derided Trump Administration officials for assuming their economic reforms would generate economic growth above the historical norm.
Some also expect that Republicans might use other gimmickry as a substitute for fundamental reform, including sleights of hand to mask the true effect on the deficit. If tax reform is to succeed, history suggests that a more transparent approach would improve its chances of actually fixing the code and helping middle-class families foremost.
Showdown at Gucci Gulch serves as a cautionary tale for lawmakers trying to improve the tax code this year. Birnbaum and Murray depict tax reform as a gruelingly slow process, one sustained by the mutual interests of Democrats and Republicans, that could not be rushed. Quick success is often a mirage; hopes are usually dashed by a failed vote, a cohort of wayward party members, or another event that threatens to steal reform’s thunder.
But as the authors show, those most invested in its success could always snatch reform from the jaws of defeat. A political consensus is a mandatory precondition for reform to reach the escape velocity it needs. “It breathes its own air,” remarked one surprised senator while observing how fast the 1986 project moved through Congress.
Showdown shows that a bipartisan bill can lend reform a greater aura of credibility and support. Politicians fiddling with the code today should revisit the forgotten history of the 1986 tax bill, not to write a plan that is deceptive, partisan, or hasty, but to write one built on a smarter and sounder foundation. Right now, though, they do not seem to be off to a good start.
BOOK REVIEW: THE SMEAR: HOW SHADY POLITICAL OPERATIVES AND FAKE NEWS CONTROL WHAT YOU SEE, WHAT YOU THINK, AND HOW YOU VOTE
BY SHARYL ATTKISSON
HARPERCOLLINS PUBLISHERS, PP. 304
Pulling no punches, Sharyl Attkisson’s The Smear is a forceful exposé of the race by political and corporate figures, often anonymous, to influence public opinion. Attkisson, a former anchor and reporter for CBS News, focuses most of her criticism on the media world: its sloppily contrived narratives, its transactional relations with political figures, and its elevation of what she calls the “smear.” Caught in the crosshairs is nonpartisan, shoe-leather journalism—old-fashioned reporting without smoke screens or hidden agendas from spinmeisters and corporate colluders. The news we receive is shaped and selected by various actors, says Attkisson, who pull the strings behind the curtain to generate the headlines they want to hear.
“We’re living amid an artificial reality, persuaded to believe it’s real by astroturf engineered to look like grassroots,” she writes. “Success of the paid forces hinges on their ability to remain virtually invisible. To disguise what they do and make it seem as if their work is neither calculated nor scripted. It must appear to be precisely what it is not.”
Attkisson’s fusillade of criticism is directed especially at journalists, many of whom she accuses of being too cozy with the figures in their stories. The media ecosystem, she writes, has increasingly become a front for thinly veiled political operations. Articles and headlines are often staged and premeditated, carefully crafted with the help of political insiders to do the least amount of damage. Politicians, for example, can feed information to, or agree to be interviewed by, friendly and sympathetic news outlets. Political operatives may manipulate the press by selectively distributing unfounded rumors or self-serving stories.
To this point, Attkisson laments the degradation of journalistic integrity and the concomitant rise of reporting that is far less organic, nonpartisan, and spontaneous. Today, she writes, reporters have immersed themselves in the charade of competing for insider information, only to be the first to receive fabricated narratives. True shoe-leather investigative reporting is becoming less common in an era when leaks are the predominant source of journalistic insight. Attkisson says this has enabled the proliferation of astroturf: alternative stories, often extremely polemical in nature, that elevate smear tactics and character assassination to the mainstream. And the public is left to consume the news that emerges from this house of mirrors.
Attkisson documents the rise of the “smear industrial complex,” from the character assassination tactics employed by the Clinton family and their inner circle in the 1990s to the modern unfolding of a vast network of super PACs and LLCs that have elevated the smear to an essential campaign strategy. In addition, approximately one-third of The Smear examines what the author considers to be the disproportionately negative media coverage of wildcard candidate Donald Trump, and the punditry’s significant discounting of Trump’s chances of victory in the 2016 presidential race. Attkisson even goes so far as to apply Trump’s terminology of “fake news” to what she sees as a biased industry that deliberately seeks to inflict harm on Trump.
The Smear is a hard-nosed and quite unsettling unmasking of the inside tactics used by people with power and influence to sway popular opinion. Yet it falls short of addressing with equal criticality the media biases and instances of shoddy reporting that fall on both sides of the ideological divide. Attkisson blows off too much steam taking digs at the Clintons and the more recent proliferation of media groups critical of conservative ideology and the pundits who endorse it. She does not examine with equal attention the rapid expansion of misleading news on the right or Trump’s reflexive impulse to spread mistruths on a near-daily basis. The book is oriented more toward tearing down the shady operatives pushing biased news on the left than toward a neutral critique encompassing the similar shady tactics employed by the right. Pizzagate, Benghazi, swirling rumors about Hillary Clinton’s health, and other unfounded stories pushed by the right are treated as mere footnotes, or ignored altogether. More accurately, then, The Smear is a denunciation of the double standard that arises from what the author views as a biased press, balanced with the author’s concern about the impressions it leaves on a pliable public audience.
BOOK REVIEW: THE POLITICS OF RESENTMENT: RURAL CONSCIOUSNESS IN WISCONSIN AND THE RISE OF SCOTT WALKER
BY KATHERINE J. CRAMER
UNIVERSITY OF CHICAGO PRESS, PP. 256
Katherine J. Cramer’s The Politics of Resentment is the indispensable book for explaining Donald Trump’s victory in the 2016 election. Cramer, a political ethnographer at the University of Wisconsin-Madison, performed a statewide survey of Wisconsin in 2007 and 2008, prior to the fractious recall attempt on Governor Scott Walker. Cramer’s fieldwork suggests that the prominent divide, at least in Wisconsin, is between rural and urban residents. Rural folk, she gathers, view the world through the lens of place and class. Motivated by distinct values and lifestyles, as well as economic hardship, they have a fundamentally distrustful view of government and the public sector.
Illuminating and eerily prophetic, Cramer’s findings provide greater insight into Trump’s success. Trump exceeded expectations among blue-collar working-class voters, and his performance in rural areas surpassed that of previous Republican nominees. Cramer, who interprets her findings within a strictly state setting, does not extrapolate her conclusions to divine a national mood. Rather, she finds that, at least in Wisconsin, rural voters exhibited resentment towards public officials, whom they viewed as wasteful and unproductive, and an aversion to elitism. This “rural consciousness,” Cramer writes, is also shaped by concerns of economic injustice. Rural folk were angered by the possibility that their tax dollars, funneled through a burgeoning bureaucratic apparatus, were not making their way back to rural areas.
Cramer identifies three overarching elements of rural consciousness: the belief, well-founded or not, that rural areas were being ignored by decision makers in major metropolitan centers; that rural communities were not receiving their fair share of public spending; and that urban lawmakers and public officials had no insight into the values and lifestyles of rural people. This rural-urban dichotomy features prominently in Cramer’s findings; in fact, she concludes that it has a more binding effect on voting preferences than political partisanship. Class and location matter, she argues, because they factor into the identities of rural voters.
Cramer’s methodology, going to local coffee spots and gathering places in randomly chosen towns across Wisconsin, is novel. In fact, she argues, rightly, that this type of survey approach should be more common in analyses of voter opinion: “[P]oll-based analyses of opinion ought to be accompanied not just by focus groups or in-depth interviews but also by listening methods that expose us to the conversations and contexts of everyday life.” Perhaps this would have been a useful corrective for the presidential polls, which grossly underestimated Trump’s performance. “We would do well to acknowledge that sometimes there is no substitute for sitting down with people and listening to their perspectives in order to measure what those perspectives are,” she advises.
The Politics of Resentment is an indispensable study of how Trump, champion of the rural and working class, captured the presidency. Cramer’s field survey approach, which provides a closer and more personal window into voter sentiment than traditional polling, is a breath of fresh air amid the statistical noise of political polling. Her conclusions should be especially worrying to governmental fixtures and establishmentarians: rural voters, animated by class and geographical setting, resent the dismissive attitudes of political elites toward their economic and cultural conditions. While Cramer might have done well to elaborate on the national implications of her findings, her insights help those baffled by the meteoric success of President-elect Donald Trump.