Agence Global

Liz Theoharis, “Joe Manchin’s America”

February 28, 2022 - TomDispatch

As if killing the Child Tax Credit, blocking voting rights, gutting key climate legislation, and refusing living wages weren’t enough, West Virginia Democratic Senator Joe Manchin is now promoting legislation that further punishes the poor and marginalized. Along with Florida Republican Senator Marco Rubio, he’s introduced the PIPES Act, which undercuts key harm-reduction funding from the Department of Health and Human Services. It arrives with a media campaign launched by Fox News and other conservative outlets pushing bogus claims that the Biden administration is using government funds to buy “crack pipes,” tapping into a decades-long campaign to scapegoat vulnerable populations rather than address the root causes of the unconscionable conditions under which they live.

Paired with Manchin’s moralizing and obstruction when it comes to President Biden’s Build Back Better Bill because he “cannot accept our economy, or basically our society, moving towards an entitlement mentality,” his new legislation is more evidence that he privileges rich donors over actual constituents in West Virginia and is truly willing to punish the poor. He’s claimed that families in his state would use money from the Child Tax Credit to buy drugs, that work requirements rather than more resources will lift poor kids out of poverty, and that, as the Huffington Post reported, “Americans would fraudulently use the proposed paid sick leave policy, specifically saying people would feign being sick and go on hunting trips.”

All of this represents a painful return to the “culture of poverty” debates of the 1960s. Indeed, despite being discredited by scholars and poverty experts over and over since its invention, such anti-poor propaganda seems to rear its head whenever popular opinion and public action might actually lead to improvements in the lives of poor and low-income people.

The culture of poverty

American anthropologist Oscar Lewis first suggested there was a culture of poverty in the mid-1960s, an idea quickly championed by the political right. Republican administrations from President Ronald Reagan on, buttressed by right-wing groups like the Moral Majority, claimed that the true origins of poverty lay in immoral personal choices and ways of life that led to broken families and terrible life decisions.

Such ideas were particularly appealing to politicians and the wealthy since they identified the causes of poverty not as a problem of society at large, but of the poor themselves. It was, as they saw it, one that lay deep in an “autonomous subculture [that] exists among the poor, one that is self-perpetuating and self-defeating.” To encourage such thinking, they invented and endlessly publicized hyper-racialized caricatures of the poor like the “welfare queen,” while pushing the idea that poor people were lazy, crazy, and stupid. Then they criminalized poverty, while cutting government programs like welfare and public housing — legislative acts guaranteed to harm millions of Americans across multiple generations.

This culture-of-poverty debate and the legislative action to uphold it have become deeply ingrained in this country and not just among conservatives. In 1996, after all, it was the administration of Democratic President Bill Clinton that ended “welfare as we know it,” its officials having armed themselves with tales about the backwardness of the poor and their need to finally take “personal responsibility” for their lives.

A neoliberal approach to governing has had a hold on significant parts of both parties ever since, while structural poverty and inequality have only deepened. The poor have been pathologized so effectively that culture-of-poverty distortions have even made their way into more progressive media and scholarly accounts of their lives. There, too, poor people are often depicted as incapable of analyzing their own situations or understanding the dilemmas they face, let alone engaging in the sort of strategic thinking that might begin to overcome inequality.

Even, for example, those heralded scholars of poor people’s movements, Frances Fox Piven and Richard Cloward, argued that the history of organizing the poor did not originate among the poor themselves. Instead, they suggested that, in the twentieth century, such organizing efforts were “largely stimulated by the federal government through its Great Society programs” and through anti-poverty agencies, civil-rights activists, and student groups. What such a perspective cut out were the struggles of poor and low-income organizers like Johnnie Tillmon of Arkansas and Annie Smart of Louisiana, both poor mothers and important initiators of the welfare rights movement, as well as other twentieth-century campaigns led by the poor to lift the load of poverty.

Leaders like Tillmon and Smart, in fact, helped build organizations that, in the twentieth century, mobilized tens of thousands of the very people those in power and the media all too often blamed for society’s deepest problems. Slogans used decades later by the Kensington Welfare Rights Union, the National Welfare Rights Union, and the National Union of the Homeless to mobilize the poor, like “no housing, no peace,” “you only get what you’re organized to take,” and “each one, teach one, so we can reach one more,” highlighted the latent but all-too-real power found within poor communities, as well as the idea that the poor are themselves capable of being agents of positive social change.

This last point is especially important because, historically speaking, poor people have struggled over and over again to create a better country not just for themselves, but for everyone. Far from being imprisoned in a culture of poverty and so helpless to act to change their own conditions, the poor throughout U.S. history have shown an ability, often under the worst imaginable conditions, to transform society for the better by fighting for everyone’s right to healthcare, housing, clean water, an adequate education, and so much more.

Nonetheless, victim-blaming narratives that divert attention from the social structures and interests that have created ever more poverty in this century continue to serve a political purpose for the defenders of the status quo. In today’s America, consider Joe Manchin, one among many necromancers who have reanimated the corpse of the long-discredited culture of poverty. In the process, they’ve given a veneer of sophistication to hateful rhetoric and acts against the poor, including the nearly half of West Virginians who are today in poverty or one emergency from economic ruin.

The death-dealing culture of the rich

In America, instead of recognizing the political agency and moral vision of poor people, it’s generally believed that the rich, entrepreneurial, and powerful have the solutions to our social ills. Indeed, as I’ve written previously at TomDispatch, this society has long suffered from a kind of Stockholm syndrome: we look to the rich for answers to the very problems they’re all-too-often responsible for creating and from which, of course, they benefit immeasurably.

Even as Americans begin to question the ever more enormous divide between the rich and the rest of us, the media narrative lionizes the wealthy. For example, contrast the many pieces celebrating the Gates Foundation’s work around global health with its decision early in the pandemic to pressure Oxford University and AstraZeneca to keep exclusive property rights to their Covid-19 vaccine rather than making it widely available for manufacture around the world. That decision and so many more like it by other wealthy individuals, private corporations, and countries played a significant part in creating the vaccine apartheid that continues to divide the Global North and South (and so prepared the groundwork for new variants of the virus among the unvaccinated).

Moreover, our society continues to treat the grievances of the rich as public crises requiring government action, but wounds to the rest of us as the unfortunate result of bad luck or personal failures. This dynamic has been seen during both the Trump and Biden presidencies. In the early weeks of lockdown in 2020, the Federal Reserve, under President Trump, funneled billions of dollars into the coffers of the wealthy. Meanwhile, significant parts of the CARES Act, like the Paycheck Protection Program, directed significant sums to high-income households, while millions were left in the lurch.

A year and a half later, bad-faith arguments about inflation and scarcity have been used by Manchin and other “moderate” Democrats to sink the Build Back Better agenda and allow major antipoverty programs like the Child Tax Credit to expire, to the detriment of nearly 75% of its recipients. I say bad faith because you need only look at the $2.1 trillion that America’s billionaires have made during these two pandemic years or the $770 billion Congress had no hesitation allocating for the 2022 Pentagon budget and related expenses to see that such scarcity arguments simply don’t hold water. In reality, the resources are at hand to solve our nation’s most burning crises, if only we had the political will.

Those in power maintain their hold on our collective imagination in part due to the pervasive ideological belief that an economy which benefits the rich will, in a trickle-down fashion, benefit the rest of us. This belief is at the core of the curriculum of most university economics departments across the country. In fact, in America today, it’s largely accepted that a rising economic tide will lift all boats rather than just the yachts of the well-to-do, as many of the rest of us sink all around them.

When, however, nearly half of the U.S. population is already poor or lives one lost paycheck, storm, or medical emergency away from poverty, a national reckoning with the core values and political priorities of our society is an increasing necessity. How sad, then, that the poor continue to be pitted against each other, blamed for their poverty (and many of the country’s other problems), and fed the lie of scarcity in a time of unprecedented abundance.

Poverty amid plenty

According to the Supplemental Poverty Measure of the U.S. Census, there are 140 million people of different races, genders, and ages from all over this country who are poor or low-income. It’s not that those 140 million Americans all lack the cultural attributes for success or that most refuse to work or don’t understand how to spend or save money. And they certainly aren’t poor because they haven’t prayed hard enough or God simply ordained it. Rather, it’s time to hear some of the real reasons why people are poor or low-income rather than fall prey to the misrepresentations and falsifications of the culture of poverty.

One reason is that the cost of living in this country has, for decades, outpaced household earnings. As the Poor People’s Campaign (which I co-chair with Reverend William J. Barber II) pointed out in a 2018 report, American workers have seen little or no real growth in their weekly wages for the past 40 years, even as economic productivity has shot up exponentially. Tens of millions of Americans work for less than $15 an hour and so can’t afford basic necessities, including housing, childcare, health care, education, food, and gas — the prices of all of which have outpaced wage growth. Believe it or not, there is no state, metropolitan area, or county in the country today where a full-time, minimum-wage job can support a two-bedroom rental apartment.

One hundred and forty million people are impoverished, in part because racialized voter suppression and gerrymandering have created unfair elections that keep the poor — especially Black, Latinx, and Native American people — largely out of the democratic process. Between 2010 and 2020, more than 27 states passed racist voter suppression laws and, in 2021, 19 states passed an additional 33 of them. As a consequence, elections are rigged from the start and many extremist politicians, essentially smuggled into office, then govern by suppressing wages while cutting health care and critical social services for the poor of all walks of life. (This is something that leaders, especially Reverend William Barber and the Forward Together Moral Mondays Movement, have been pointing out for years.)

Such widespread deep poverty exists in this country because of ongoing and intensifying attacks on social programs, on full display in recent months. At a moment when ever more people need a strong social safety net, there have been dramatic cuts in federal housing assistance, the public-housing stock, food stamps, and other critical social programs. Today, the Temporary Assistance for Needy Families Program supports less than one in four poor families with children, while federal assistance to local water systems has decreased by 74% over the past 40 years, leading to a crisis of water quality and affordability impacting at least 14 million mostly poor people. Add to this, in the midst of a pandemic, the reality that states are sending back monies set aside to help needy people, even as 3.7 million kids were pushed below the poverty line in January alone.

In addition, more and more Americans are struggling because we have become a debtor nation. With wages stagnating and the cost of living rising, there has been an explosion of debt across the country. And given the already described circumstances, you undoubtedly won’t be surprised to discover that the bottom 90% of Americans hold more than 70% of it, including $1.34 trillion in student debt. In 2016, 24 million American families were living “underwater” (meaning they owed more on their houses than those structures were even worth).

The reality of poverty amid plenty has also grown more widespread and evident because our national priorities have increasingly shifted ever more toward a militarized and toxic war economy. Today, out of every federal discretionary dollar, 53 cents go to our military, while only 15 cents go to anti-poverty programs. This sort of spending has been mirrored in our communities, too, where there has been a tenfold increase in spending on prisons and deportations over the past 40 years. In other words, the criminalization of the poor that began in earnest half a century ago is now in full bloom. To cite one indicative figure that sums this up: since 2000, 95% of the rise in the incarcerated population has been made up of people who can’t afford bail.

In 1967, the year before he was assassinated, while organizing the poor across the country for the Poor People’s Campaign, Reverend Martin Luther King, Jr., offered a powerful insight that couldn’t be more relevant today. “We are called upon,” he said,

to help the discouraged beggars in life’s marketplace. But one day we must come to see that an edifice which produces beggars needs restructuring. It means that questions must be raised. And you see, my friends, when you deal with this you begin to ask the question, ‘Who owns the oil?’ You begin to ask the question, ‘Who owns the iron ore?’ You begin to ask the question, ‘Why is it that people have to pay water bills in a world that’s two-thirds water?’ These are the words that must be said.

Indeed, when it comes to who are the poor and why they are poor, it’s time to reject the hateful theories of old and instead answer questions like those.


Liz Theoharis writes regularly for TomDispatch (where this article originated). She is a theologian, ordained minister, and anti-poverty activist. Co-chair of the Poor People’s Campaign: A National Call for Moral Revival and director of the Kairos Center for Religions, Rights and Social Justice at Union Theological Seminary in New York City, she is the author of Always With Us? What Jesus Really Said About the Poor and We Cry Justice: Reading the Bible with the Poor People’s Campaign. Follow her on Twitter at @liztheo.

Copyright ©2022 Liz Theoharis — distributed by Agence Global

—————-
Released: 28 February 2022
Word Count: 2,471
—————-

Alfred McCoy, “China is digging its own grave (and ours as well)”

February 24, 2022 - TomDispatch

Consider us at the edge of the sort of epochal change not seen for centuries, even millennia. By the middle of this century, we will be living under such radically altered circumstances that the present decade, the 2020s, will undoubtedly seem like another era entirely, akin perhaps to the Middle Ages. And I’m not talking about the future development of flying cars, cryogenics, or even as-yet-unimaginable versions of space travel.

After leading the world for the past 75 years, the United States is ever so fitfully losing its grip on global hegemony. As Washington’s power begins to fade, the liberal international system it created by founding the United Nations in 1945 is facing potentially fatal challenges.

After more than 180 years of Western global dominion, leadership is beginning to move from West to East, where Beijing is likely to become the epicenter of a new world order that could indeed rupture longstanding Western traditions of law and human rights.

More crucially, however, after two centuries of propelling the world economy to unprecedented prosperity, the use of fossil fuels — especially coal and oil — will undoubtedly fade away within the next couple of decades. Meanwhile, for the first time since the last Ice Age ended 11,000 years ago, thanks to the greenhouse gases those fossil fuels are emitting into the atmosphere, the world’s climate is changing in ways that will, by the middle of this century, start to render significant parts of the planet uninhabitable for a quarter, even possibly half, of humanity.

For the first time in 800,000 years, the level of carbon dioxide (CO2) in the atmosphere has blown past earlier highs of 280 parts per million to reach 410 parts. That, in turn, is unleashing climate feedback loops that, by century’s end, if not well before, will aridify the globe’s middle latitudes, partly melt the polar ice caps, and raise sea levels drastically. (Don’t even think about a future Miami or Shanghai!)

In trying to imagine how such changes will affect an evolving world order, is it possible to chart the future with something better than mere guesswork?  My own field, history, generally performs poorly when trying to track the past into the future, while social sciences like economics and political science are loath to project much beyond medium-term trends (say, the next recession or election). Uniquely among the disciplines, however, environmental science has developed diverse analytical tools for predicting the effects of climate change all the way to this century’s end.

Those predictions have become so sophisticated that world leaders in finance, politics, and science are now beginning to think about how to reorganize whole societies and their economies to accommodate the projected disastrous upheavals to come. Yet surprisingly few of us have started to think about the likely impact of climate change upon global power. By combining political projections with already carefully plotted trajectories for climate change, it may, however, be possible to see something of the likely course of governance for the next half century or so.

To begin with the most immediate changes, social-science analysis has long predicted the end of U.S. global power. Using economic projections, the U.S. National Intelligence Council, for instance, stated that, by 2030, “Asia will have surpassed North America and Europe combined in terms of global power,” while “China alone will probably have the largest economy, surpassing that of the United States a few years before 2030.” Using similar methods, the accounting firm PwC calculated that China’s economy would become 60% larger than that of the United States by 2030.

If climate science proves accurate, however, the hegemony Beijing could achieve by perhaps 2030 will last, at best, only a couple of decades or less before unchecked global warming ensures that the very concept of world dominance, as we’ve known it historically since the sixteenth century, may be relegated to a past age like so much else in our world.

Considering that likelihood as we peer dimly into the decades between 2030 and 2050 and beyond, the international community will surely have good reason to forge a new kind of world order — one made for a planet truly in danger and unlike any that has come before.

The rise of Chinese global hegemony

China’s rise to world power could be considered not just the result of its own initiative but also of American inattention. While Washington was mired in endless wars in the Greater Middle East in the decade following the September 2001 terrorist attacks, Beijing began using a trillion dollars of its swelling dollar reserves to build a tricontinental economic infrastructure it called the Belt and Road Initiative (BRI) that would shake the foundations of Washington’s world order. Not only has this scheme already gone a long way toward incorporating much of Africa and Asia into Beijing’s version of the world economy, but it has simultaneously lifted many millions out of poverty.

During the early years of the Cold War, Washington funded the reconstruction of a ravaged Europe and the development of 100 new nations emerging from colonial rule. But as the Cold War ended in 1991, more than a third of humanity was still living in extreme poverty, abandoned by Washington’s then-reigning neo-liberal ideology that consigned social change to the whims of the free market. By 2018, nearly half the world’s population, or about 3.4 billion people, were simply struggling to survive on the equivalent of five dollars a day, creating a vast global constituency for Beijing’s economic leadership.

For China, social change began at home. Starting in the 1980s, the Communist Party presided over the transformation of an impoverished agricultural society into an urban industrial powerhouse. Propelled by the greatest mass migration in history, as millions moved from country to city, its economy grew nearly 10% annually for 40 years and lifted 800 million people out of poverty — the fastest sustained rate ever recorded by any country. Meanwhile, between 2006 and 2016 alone, its industrial output increased from $1.2 trillion to $3.2 trillion, leaving the U.S. in the dust at $2.2 trillion and making China the workshop of the world.

By the time Washington awoke to China’s challenge and tried to respond with what President Barack Obama called a “strategic pivot” to Asia, it was too late. With foreign reserves already at $4 trillion in 2014, Beijing launched its Belt and Road Initiative, while establishing an Asian Infrastructure Investment Bank, with 56 member nations and an impressive $100 billion in capital. When a Belt and Road Forum of 29 world leaders convened in Beijing in May 2017, President Xi Jinping hailed the initiative as the “project of the century,” aimed both at promoting growth and improving “people’s well-being” through “poverty alleviation.” Indeed, two years later a World Bank study found that BRI transportation projects had already increased the gross domestic product in 55 recipient nations by a solid 3.4%.

Amid this flurry of flying dirt and flowing concrete, Beijing seems to have an underlying design for transcending the vast distances that have historically separated Asia from Europe. Its goal: to forge a unitary market that will soon cover the vast Eurasian land mass. This scheme will consolidate China’s control over a continent that is home to 70% of the world’s population and productivity. In the end, it could also break the U.S. geopolitical grip over a region that has long been the core of, and key to, its global power. The foundation for such an ambitious transnational scheme is a monumental construction effort that, in just two decades, has already covered China and much of Central Asia with a massive triad of energy pipelines, high-speed rail lines, and highways.

To break that down, start with this: Beijing is building a transcontinental network of natural gas and oil pipelines that will, in alliance with Russia, extend for 6,000 miles from the North Atlantic Ocean to the South China Sea.

For the second arm in that triad, Beijing has built the world’s largest high-speed rail system, with more than 15,000 miles already operational in 2018 and plans for a network of nearly 24,000 miles by 2025. All this, in turn, is just a partial step toward what’s expected to be a full-scale transcontinental rail system that started with the “Eurasian Land Bridge” track running from China through Kazakhstan to Europe. In addition to its transcontinental trunk lines, Beijing plans branch-lines heading due south toward Singapore, southwest through Pakistan, and then from Pakistan through Iran to Turkey.

To complete its transport triad, China has also constructed an impressive set of highways, representing (like those pipelines) a problematic continuation of Washington’s current petrol-powered world order. In 1990, that country lacked a single expressway. By 2017, it had built 87,000 miles of highways, nearly double the size of the U.S. interstate system. Even that breathtaking number can’t begin to capture the extraordinary engineering feats necessary — the tunneling through steep mountains, the spanning of wide rivers, the crossing of deep gorges on towering pillars, and the spinning of concrete webs around massive cities.

Simultaneously, China was also becoming the world’s largest auto manufacturer as the number of vehicles on its roads soared to 340 million in 2019, exceeding America’s 276 million. However, all of this impressive news is depressing news as well. After all, by clinging to coal production on a major scale, while reaching for a bigger slice of the world’s oil imports for its transportation triad, China’s greenhouse-gas emissions doubled from just 14% of the world’s total in 2000 to 30% in 2019, far surpassing that of the United States, previously the planet’s leading emitter. With only 150 vehicles per thousand people, compared to 850 in America, its auto industry still has ample growth potential — good news for its economy, but terrible news for the global climate (even if China remains in the forefront of the development and use of electric cars).

To power such headlong development, China has, in fact, raised its domestic coal production more than a hundredfold, from just 32 million metric tons in 1949 to a mind-boggling record of 4.1 billion tons by 2021. Even if you take into account those massive natural-gas pipelines it is building, its enormous hydropower dams, and its world leadership in wind power, as of 2020 China still depended on coal for a startling 57% of its total energy use, even as its share of total global coal-fired power climbed relentlessly to a record 53%. In other words, nothing, it seems, can break that country’s insatiable hunger for the dirtiest of all fossil fuels.

On the global stage, Beijing has been similarly obsessed with economic growth above all else. Despite its promises to curb greenhouse-gas emissions at recent U.N. climate conferences, China is still promoting coal-fired power at home and abroad. In 2020, the Institute of International Finance reported that 85% of all projects under Beijing’s BRI entailed high greenhouse-gas emissions, particularly the 63 coal-fired electrical plants the project was financing worldwide.

When the 2019 U.N. climate conference opened, China itself was actively constructing new coal-fueled electrical plants with a combined capacity of 121 gigawatts — substantially more than the 105 gigawatts being built by the rest of the world combined. By 2019, China was the largest single source of pollution on the planet, accounting for nearly one-third of the world’s total greenhouse gas emissions. Meanwhile, U.N. Secretary General António Guterres was warning that such emissions were “putting billions of people at immediate risk.” With an impassioned urgency, he demanded “a death knell for coal and fossil fuels before they destroy our planet” by banning all new coal-fired power plants and phasing them out of developed nations by 2030.

Together, the planet’s two great imperial powers, China and the United States, accounted for 44% of total CO2 emissions in 2019 and so far both have made painfully slow progress toward renewable energy. In a joint declaration at the November 2021 Glasgow climate conference, the U.S. agreed “to reach 100% carbon-pollution-free electricity by 2035,” while China promised to “phase down” (but note, not “phase out”) coal starting with its “15th Five-Year Plan.”

The U.S. commitment soon died a quiet death in Congress, where President Biden’s own party killed his green-energy initiative. Amid all the applause at Glasgow, nobody paid much attention to the fact that China’s next five-year plan doesn’t even start until 2026, just as President Xi Jinping’s promise of carbon neutrality by 2060 is a perfect formula for not averting the climate disaster that awaits us all.

In its hell-bent drive for development, in other words, China is digging its own grave (and ours as well).

Climate catastrophe circa 2050

Even if China were to become the preeminent world power around 2030, the accelerating pace of climate change will likely curtail its hegemony within decades. As global warming batters the country by mid-century, Beijing will be forced to retreat from its projection of global power to address urgent domestic concerns.

In 2017, scientists at the nonprofit group Climate Central calculated, for instance, that rising seas and storm surges could, by 2060 or 2070, flood areas inhabited by 275 million people worldwide, with Shanghai deemed “the most vulnerable major city in the world to serious flooding.” In that sprawling metropolis, 17.5 million people are likely to be displaced as most of the city “could eventually be submerged in water, including much of the downtown area.”

Advancing the date of this disaster by at least a decade, a 2019 report on rising sea levels in Nature Communications found that 150 million people worldwide are now living on land that will be submerged by 2050 and Shanghai was, once again, found to be facing serious risk. There, rising waters “threaten to consume the heart” of the metropolis and its surrounding cities, crippling one of China’s main economic engines. Dredged from sea and swamp since the fifteenth century, much of that city is likely to return to the waters from whence it came in the next three decades.

Simultaneously, soaring temperatures are expected to devastate the North China Plain between Beijing and Shanghai, one of that country’s prime agricultural regions currently inhabited by 400 million people, nearly a third of its population. It could, in fact, potentially become one of the most lethal places on the planet.

“This spot is going to be the hottest spot for deadly heat waves in the future,” said Professor Elfatih Eltahir, a climate specialist at MIT who published his findings in the journal Nature Communications. Between 2070 and 2100, he estimates, the region could face hundreds of periods of “extreme danger” and perhaps five lethal periods of 35°C wet-bulb temperature (where a combination of heat and high humidity prevents the evaporation of the sweat that cools the human body). After just six hours under such conditions, a healthy person at rest will die.

Rather than sudden and catastrophic, the impact of climate change in North China is likely to be incremental and cumulative, escalating relentlessly with each passing decade. If the “Chinese century” does indeed start around 2030, it’s unlikely to last long once its main financial center at Shanghai is flooded out and its agricultural heartland is baking in insufferable heat.

A democratic world order

After 2050, the international community will face a growing contradiction, even a head-on collision, between the two foundational principles of the current world order: national sovereignty and human rights. As long as nations have the sovereign right to seal their borders, the world will have no way of protecting the human rights of the 200 million to 1.2 billion climate-change refugees expected to be created by 2050, both within their own borders and beyond. Faced with such extreme disorder, it is just possible that the nations of this planet might agree to cede some small portion of their sovereignty to a global government set up to cope with the climate crisis.

To meet the extraordinary mid-century challenges to come, a supranational body like the U.N. would need sovereign authority over at least three significant priorities — emission controls, refugee resettlement, and environmental reconstruction. First, a reformed U.N. would need the power to compel nations to end their emissions if the transition to renewable energy is still not complete by, at the latest, 2050. Second, an empowered U.N. high commissioner for refugees would have to be authorized to supersede national sovereignty by requiring temperate northern countries to deal with the tidal flows of humanity from the tropical and subtropical regions most impacted and made least inhabitable by climate change. Finally, the voluntary transfer of funds like the $100 billion promised poor nations at the 2015 Paris Climate Conference would have to become mandatory to keep afflicted communities, and especially the world’s poor, relatively safe.

In the crisis to come, such initiatives would by their very nature change the idea of what constitutes a world order from the amorphous imperial ethos of the past five centuries to a new form of global governance. To exercise effective sovereignty over the global commons, the U.N. would have to enact some long overdue reforms, notably by creating an elective Security Council without either permanent members or the present great-power prerogative of unilaterally vetoing measures. Instead of superpower strength serving as the ultimate guarantor for U.N. decisions, a democratized Security Council could reach climate decisions by majority vote and enforce them through the moral authority, as well as the self-interest, of a more representative international body.

If a U.N. of this sort were indeed in existence by at least 2050, such a framework of democratic world governance could well be complemented by a globally decentralized system of energy. For five centuries now, energy and imperial hegemony have been deeply intertwined. In the transition to alternative energy, however, households will, sooner or later, be able to control their own solar power everywhere the sun shines, while communities will be able to supplement that variable source with a mix of wind turbines, biomass, hydro, and mini-reactors.

Just as the demands of petroleum production shaped the steep hierarchy of Washington’s world order, so decentralized access to energy could foster a more inclusive global governance. After five centuries of Iberian, British, American, and Chinese hegemony, it’s at least possible that humanity, even under the increasingly stressful conditions of climate change, could finally experience a more democratic world order.

The question, of course, is: How do we get from here to there? As in ages past, civil society will be critical to such changes. For the past five centuries, social reformers have struggled against powerful empires to advance the principle of human rights. In the sixteenth century, Dominican friars, then the embodiment of civil society, pressed the Spanish empire to recognize the humanity of Amerindians and end their enslavement. Similarly, in the mid-twentieth century activists lobbied diplomats drafting the U.N. charter to change it from a closed imperial club into the far more open organization we have today.

Just as reformers moderated the harshness of Spanish, British, and U.S. imperial hegemony, so, on a climate-pressured planet of an almost unimaginable sort, civil society will certainly play an essential role in finally putting in place the sort of limitations on national sovereignty (and imperial ambitions) that the U.N. will need to cope with our endangered world. Perhaps the key force in this change will be a growing environmental movement that, in the future, will expand its agenda from capping and radically reducing emissions to pressuring powers, including an increasingly devastated China, to reform the very structure of world governance.

A planet ever more battered by climate change, one in which neither an American nor a Chinese “century” will have any meaning, will certainly need a newly empowered world order that can supersede national sovereignty to protect the most fundamental and transcendent of all human rights: survival. The environmental changes in the offing are so profound that anything less than a new form of democratic global governance will mean not just incessant conflicts but, in all likelihood, disaster of an almost-unimaginable kind. And no surprise there, since we’ll be dealing with a planet all too literally on the brink.

Alfred W. McCoy writes regularly for TomDispatch (where this article originated). He is the Harrington professor of history at the University of Wisconsin-Madison. He is the author of In the Shadows of the American Century: The Rise and Decline of U.S. Global Power (Dispatch Books). His new book, just published, is To Govern the Globe: World Orders and Catastrophic Change.

Copyright ©2022 Alfred W. McCoy — distributed by Agence Global

—————-
Released: 24 February 2022
Word Count: 3,346
—————-

Kelly Denton-Borhaug, “The sacralization of war, American-style”

February 22, 2022 - TomDispatch

Lately, random verses from the Bible have been popping into my mind unbidden, like St. Paul’s famous line from Galatians, “A person reaps what they sow.” The words sprang into my consciousness when I learned of the death of the 95-year-old Vietnamese Buddhist monk and peace activist Thich Nhat Hanh, who helped encourage Martin Luther King to declare his opposition to the Vietnam War so long ago.

For decades, I’ve been moved by Hanh’s witness and his writings, which shined such a light on the destructive consequences of our country’s militarism. As he said, “To prepare for war, to give millions of men and women the opportunity to practice killing day and night in their hearts, is to plant millions of seeds of violence, anger, frustration, and fear that will be passed on for generations to come.”

We reap what we sow. It seems so obvious, but in these endless years of U.S. war-making across the globe, this simple truth seems to have escaped most Americans.

Why? It’s not as if no one’s noticed that the U.S. has, in so many ways, become a more violent society. Many public intellectuals (progressives and conservatives, too) are wringing their hands regarding the dangerous uptick in social violence of all sorts in this country, including voluminous gun purchases, distrust and anger, racism, xenophobia, misogyny, rising deaths from avoidable causes like refusing to be vaccinated — and the list only goes on.

But a thinker like Thich Nhat Hanh stands out from the rest. His insights differed from the norm because he saw so clearly how the seeds of violence in war-culture sprout into a kind of invasive kudzu vine capable of spreading across every aspect of life, while crushing, asphyxiating, and killing so much along the way.

War-culture as an invasive, destructive vine

I wonder why the media haven’t more thoroughly investigated the psychology that enables our congressional representatives almost unanimously to approve outlandish, ever larger military budgets, no matter how poorly the U.S. military may be doing in the world. The violent infrastructure of this nation is like a noxious vine with destructive results for us all, but few connect this to other rising forms of violence in the U.S. For instance, our leaders couldn’t find it in their hearts to approve an extension of the child tax credit, even though it played a role in lifting 4.6 million children out of poverty. One study even showed how such cash stipends and tax credits, when provided to poor mothers with babies in the first year of life, resulted in changed brain activity in their children and improved cognitive development.

But West Virginia Democratic Senator Joe Manchin (along with all the Senate Republicans) refused to support continuing that program, while, like almost every one of those Republicans and most of his Democratic colleagues, he had no problem whatsoever approving an astronomical defense budget, even in the wake of the Afghan withdrawal. Parents, he insisted, should have to work to receive any assistance for their children, but the military doesn’t have to work for that $738 billion to be approved. There’s no requirement for a financial accounting or any demand for evidence that the U.S. military solves “national security” problems of any sort.

And it’s not only Manchin. That budget passed in the Senate by a staggering vote of 88 to 10. (The dissenting lawmakers were Senators Cory Booker, Michael Braun, Kirsten Gillibrand, Mike Lee, Ed Markey, Jeff Merkley, Alex Padilla, Rand Paul, Bernie Sanders, and Elizabeth Warren.)

While at least $6 trillion was spent on this country’s post-9/11 wars, crucial issues like climate change and medical care for the elderly and the rest of us are treated with a bake-sale mentality by our lawmakers, with precious little questioning of that reality. Are our leaders afraid of the weapons-making titans of the military-industrial complex (of which they are increasingly a part)? Do they really believe that this is the way to build a more secure world? The 3.7 million children whose families just fell back into poverty as a result of the heartless erasure of the Child Tax Credit are only less safe as they fall asleep tonight. What about our nation’s responsibility to them?

And here’s another all-too-relevant question: Why don’t the rest of us step up to make it stop? Where have the anti-war movement and a movement against that military-industrial-congressional complex been all these years? So many of us are easily distracted, pay too little attention, and focus on our private business, while passing on the seeds of violence, anger, frustration, and fear to each new generation.

Worse yet, in our culture, the military budget is widely viewed as a social, even global good, though both Thich Nhat Hanh and Martin Luther King would have considered this a lie of the first order. The hum of the continuing violence embedded in and eternally reinforced by this country’s war-making structure is so constant that most of us don’t even notice or question it. The structural violence of a nation that puts more money into its military than the next 11 military spenders combined — yes, that’s right, combined — is intolerable, especially because it’s guaranteed to undermine both democracy and public health here and in the wider world. It shouldn’t surprise us that people outside the United States now see us as one of the “main threats to world peace.”

Malignant normality: serving the “Pentagod”

What makes such widespread obliviousness to, apathy about, and denial of our addiction to violence so invisible to so many of us? Here, I have to point to one of the moral touchstones in my own life: Jon Sobrino, a priest, writer, and activist who survived the massacre of eight others, Jesuit priests and women domestic workers, at the José Simeón Cañas Central American University on the outskirts of El Salvador’s capital in 1989. His housemates and colleagues were murdered in cold blood by the Salvadoran Army (backed at the time by Washington) because the priests were calling for social justice, ministering to people caught in war zones, and encouraging those who were too afraid to speak up. Sobrino himself escaped death only because he happened to be out of the country, lecturing, when the slaughter took place.

His spiritual starting point is one I try to adopt in every project I undertake. The first step, he insists, is always to demonstrate “honesty toward reality.” Now, Sobrino may be a theologian, but his approach applies to us all. We simply can’t assume honesty in this dishonest world. We must work for it. And Sobrino takes this further, because his own life experience taught him that being truly honest about our world is difficult indeed, given that violence and injustice are so often “concealed.”

This is where I find his insights so compelling. Being honest about our all-American reality is challenging indeed since the destructive seeds of violence slip so easily and comfortably under the surface of things. This not only makes it difficult to see them clearly, but also much harder to hold accountable those who mischaracterize such incipient, well-funded violence as good, not evil.

Social psychologist Robert Jay Lifton described this as “malignant normality,” the imposition of destructive or violent behavior on Americans as a built-in part of everyday life. Lifton studied the practices of Communist Chinese “thought reform” (once known here as “brainwashing”) and the work of doctors in the Nazi regime to try to understand how people turn away from reality and get caught up in worlds of dishonesty that sow the seeds of harm and destruction.

In this context, I continue to listen to the voices of military servicemembers and veterans who have opened themselves to the uncomfortable truths about how this country is now reaping what its war-culture has sown globally. They have experienced its lethal growth, destruction, and death all too personally. They know in a way the rest of us often don’t what it means to be acculturated to “malignant normality.” Take, for example, retired Air Force Lieutenant Colonel William Astore, who recently wrote a piece for TomDispatch about “the Pentagod” he so faithfully served for 20 years. Stationed in “a cathedral of military power,” a more or less literal “temple of doom” under tons of granite in Cheyenne Mountain, Colorado, he ministered, he wrote, to the “jealous and wrathful god” of the nuclear-industrial complex.

Eventually, however, he lost his faith in the American god of war, who “always wanted more.” The bottomless craving of today’s Pentagod is behind more than just the soaring military budget. Remember that, among the latest insanities of that complex, are plans to “modernize” this country’s vast nuclear arsenal at a cost, over the next three decades, of nearly $2 trillion. That includes Northrop Grumman’s $264 billion “potential lifecycle” price tag on a new set of land-based nuclear missiles that will be siloed in heartland states like Wyoming and North Dakota. And we call this “good”?

Last December, I was privileged to hear veterans from the Moral Injury Program at Philadelphia’s Corporal Michael J. Crescenz VA Medical Center testify publicly at a “healing ceremony” about their own encounters with the god of war, the malignant normality of this country’s war-culture, and the seeds of violence it sowed so deeply and painfully in their own lives. One of them was Matthew Abbadusky, who shared a public letter he wrote explaining why he resigned his commission as an Army National Guard chaplain. Its telling first sentence was: “Honesty is the beginning of spiritual life.”

Like Astore, he was no longer willing to serve the U.S. god of war. “I cannot, in good conscience, lend religious and ethical support to a military institution that primarily benefits an economy of corporate, expansionist greed and inconspicuous lust for destruction,” he wrote. His experiences as an infantryman in the 10th Mountain Division, including a 15-month deployment to Iraq and later his work as a military chaplain stateside, “enabled me to arrive at this waypoint on my journey.”

He spoke with passion about “the lifelong visible and invisible wounds” borne by so many of his compatriots in the armed forces:

The morally confounding circumstances a soldier faces on the battlefield are a manifestation of political and corporate moral bankruptcy. The plight they face often places their lives into extreme danger and requires them to make unfathomable decisions, wreaking destruction without, and confusion and chaos within.

Digging out

To dig ourselves out of the dishonesty, complacency, apathy, and lies of American war-culture, we’re going to need greater honesty about the way Christianity has been weaponized and manipulated to support our society’s malignant normality. It’s time, for instance, to call out the dishonesty of using certain verses from the New Testament to sacralize war.

For example, not just chaplains and religious leaders but military commanders, military families, and everyday citizens regularly valorize what soldiers do by referring to the Gospel of John: “Greater love has no one than this, that someone lay down his life for his friends.”

It is indeed a beautiful, evocative verse that holds so much meaning for so many people. But there’s a long history of dishonesty surrounding its use in the context of war-culture. Especially on occasions like Veterans Day or Memorial Day, you’ll hear this verse in political speeches, commercials, public-school programs, and ceremonies of all sorts. Exploiting citizens’ honest desire to care for veterans, the militarized use of such words hides the truth about how our soldiers have labored at the forefront of this murderous society.

In this way (and there are so many similar examples, religious and otherwise), war is covered with a sacred sheen, while its seeds of violence are normalized and slip ever further from our consciousness. But being honest requires that we face reality and the truth about the consequences of war. As scholar and activist Khury Petersen-Smith of the Institute for Policy Studies put it, “Military violence always requires dehumanization and the denial of rights — and this inevitably corrupts any notions of democracy.”

Despite the regular hijacking of that verse from John to soften and conceal the ugly violence of American-style war, those words are part of Jesus’s teaching about nonviolent service to others. In fact, biblical scholars agree that the historical Jesus rejected militarized violence. And don’t forget that, in the end, he was executed by the Roman imperial power structure.

It’s worth asking: Who exactly benefits from making the violence of war into something sacred? Do veterans? Countless times I’ve heard them testify that such super-valorization and sacralization of war silences any honesty about the reality they experienced. And that’s true not only of people who participated in the violence of the battlefield, but also those like Astore and Abbadusky who struggle to reckon with the roles they played in the structural violence of war-culture, sowing the seeds of destruction and bearing witness to the consequences.

And what do they need from the rest of us? At the very least, we, too, can strive for deeper honesty regarding this country of ours, which is visibly in trouble and still focused on future wars as the best way to address our fears about the threats that face us. We seem to be unable to think any differently, despite evidence that more war will only make matters worse for the world, as well as for the United States.

Maybe, if we stopped making war and militarism into a sacred enterprise, we’d be more successful in demanding that our political leaders cease their thoughtless approval, year after year, of destructive, ever more gigantic Pentagon budgets.

Maybe, if we began listening more deeply to veterans, our understanding of the true costs of the war-culture that’s engulfed us so disastrously through the first two decades of this century would deepen. And maybe our ability to resist complicity with the way it’s been endlessly sowing the seeds of violence, anger, frustration, and fear, generation after generation, would begin to grow.

Kelly Denton-Borhaug writes regularly for TomDispatch (where this article originated). She has long been investigating how religion and violence collide in American war-culture. She teaches in the global religions department at Moravian University. She is the author of two books, U.S. War-Culture, Sacrifice and Salvation and, more recently, And Then Your Soul is Gone: Moral Injury and U.S. War-Culture.

Copyright ©2022 Kelly Denton-Borhaug — distributed by Agence Global

—————-
Released: 22 February 2022
Word Count: 2,323
—————-

Tom Engelhardt, “My life with Maus”

February 17, 2022 - TomDispatch

Sometimes life has a way of making you realize things about yourself. Recently, I discovered that an urge of mine, almost four decades old, had been the very opposite of that of a rural Tennessee school board this January. In another life, I played a role in what could be thought of as the unbanning of the graphic novel Maus.

For months, I’ve been reading about the growing Trumpist-Republican movement to ban whatever books its members consider politically unpalatable, lest the lives of America’s children be sullied by, say, a novel of Toni Morrison’s like The Bluest Eye or Margaret Atwood’s The Handmaid’s Tale or a history book like They Called Themselves the K.K.K. It’s an urge that just rubs me the wrong way. After all, as a boy growing up in New York City in the 1950s, when children’s post-school lives were much less organized than they are today, I would often wander into the local branch of the public library, hoping the librarian would allow me into the adult section. There — having little idea what I was doing — I would pull interesting-looking grown-up books off the shelves and head for home.

Years later, exchanging childhood memories with a friend and publishing colleague, Sara Bershtel, I discovered that, on arriving in this country, she, too, had found a sympathetic librarian and headed for those adult shelves. At perhaps 12 or 13, just about the age of those Tennessee schoolkids, we had both — miracle of miracles! — not faintly knowing what we were doing, pulled Annemarie Selinko’s bestselling novel Désirée off the shelves. It was about Napoleon Bonaparte and his youthful fiancée and we each remember being riveted by it. Maybe my own fascination with history, and hers with French literature, began there. Neither of us, I suspect, was harmed by reading the sort of racy bestseller that Republicans would today undoubtedly loathe.

Oh, and if you’ll excuse a little stream of consciousness here, my friend Sara was born in a German displaced-persons camp to Jewish parents who had, miraculously enough, survived the Nazi death camps at Auschwitz and Buchenwald, which brings me back to the jumping-off spot for this piece. Unless you’ve been in Ukraine these last weeks, it’s something you undoubtedly already know about, given the attention it’s received: that, by a 10-0 vote, a school board in McMinn County, Tennessee, banned from its eighth-grade classroom curriculum Art Spiegelman’s Pulitzer Prize-winning graphic novel Maus, about his parents’ Holocaust years in Auschwitz and beyond (and his own experience growing up with them afterward). When I first heard about that act I felt, however briefly and indirectly, pulled off the shelves myself and banned. And damn! — yes, I want to make sure that this piece gets banned as well! — I felt proud of it!

Just to back up for a moment: that Tennessee school board banned Spiegelman’s book on the grounds, at least nominally, that it contained naked cartoon mice — Jewish victims in a concentration camp and Spiegelman’s mother, who committed suicide, in a bathtub — and profanity as well (like that word “damn!”). In a world where, given a chance, so many of us would head for the modern equivalent of those adult library shelves — these days, of course, any kid with an iPhone or a computer can get a dose of almost any strange thing on this planet — that school board might as well have been a marketing firm working for Maus. After all, more than three decades after it first hit the bestseller lists, their action sent it soaring to number one at Amazon, while donated copies began to pour into rural Tennessee.

As former Secretary of Labor Robert Reich recently pointed out, if you truly want a teenager to read any book with gusto, the first thing you need to do is, of course, ban it. So, I suppose that, in its own upside-down way, the McMinn board did our world a strange kind of favor. In the long run, however, the growing rage for banning books from schools and libraries (or even, in the case of the Harry Potter books, burning them, Nazi-style) doesn’t offer a particularly hopeful vision of where this country’s headed right now.

“What’s so funny?” Still, as I’m sure you’ve guessed, I’m only going on like this because that incident in Tennessee and the media response to it brought back an ancient moment in my own life. So, think of the rest of this piece as a personal footnote to the McMinn story and to the growing wave of book bannings in courses and school libraries across too much of this country. And that’s not even to mention the plethora of “gag order” bills passed by or still being considered in Republican-dominated state legislatures to prevent the teaching of certain subjects. It’s further evidence, if you need it, of an urge to wipe from consciousness so much that such Republicans find uncomfortable in our national past. It’s also undoubtedly part of a larger urge to take over America’s public-school system, or even replace it, much as Donald Trump and crew would like to all-too-autocratically take over this country and transform it into an unrecognizable polity, a subject TomDispatch has covered for years.

However, my moment in the sun began at a time when The Donald was about to open the first of his Atlantic City casinos that would eventually turn him into a notorious bankruptee. And it took place inside the world of publishing, which then seemed all too ready to essentially ban Maus from this planet. Back in the early 1980s, putting out a Holocaust “comic book” — though the term “graphic novel” existed, just about no one in publishing knew it — in which Jews were cartoon mice and the Nazis cats seemed like a suicidal act for a book publisher.

And in that context, here’s my personal story about the cartoon mice that might never have made it to McMinn County, Tennessee. In the 1980s, I was an editor at Pantheon Books, a publishing house run by André Schiffrin who, in a fashion hardly commonplace then or later, gave his editors a chance to sign up books that might seem too unfashionable or politically dangerous.

One day, our wonderful art director, Louise Fili, came to my office. (She worked on another floor of the Random House building in New York City, the larger publishing house of which we were then a part.) In her hands, she had an oversized magazine called RAW that I had never seen before, put out by a friend of hers named Art Spiegelman. It was filled with experimental cartoon art. And in the seams of new issues, he had been stapling tiny chapters of a memoir he was beginning to create about the experiences of his father and mother in the Holocaust. Jews from Poland, they had ended up in Auschwitz and managed, unlike so many millions of Jews murdered in such death camps, to survive the experience. Louise also had with her a proposal from Spiegelman for what would become his bestselling graphic novel Maus.

I still remember her telling me that it had already been rejected by every publisher imaginable. In those days, that was, I suspect, something like a selling point for me. Anyway, I took the couple of teeny chapters and the proposal home — and all these years later, I still recall the moment when I decided I had to put Spiegelman’s book out, no matter what. I remember it because I thought of myself as a rather rational editor and the feeling that I simply had to do Maus was one of the two least rational decisions I ever made in publishing (the other being to do Chalmers Johnson’s book Blowback, also a future bestseller).

At that moment, I doubt I had ever read what came to be known as a graphic novel, but there was something in my background that, I suspect, left me particularly open to it. My mother, Irma Selz, had been a theatrical and later political caricaturist for New York’s leading newspapers and magazines (and, in the 1950s, the New Yorker as well). She was, in fact, known as “New York’s girl caricaturist” in the gossip columns of her time, since she was the only one in an otherwise largely male world of cartoonists.

Because she lived in that world, after a fashion I did, too. I can, for instance, remember Irwin Hasen, the creator of the now largely forgotten cartoon Dondi, sitting by my bedside when I was perhaps seven or eight drawing his character for me on sheets of tracing paper before I went to sleep. (Somewhere in the top of my closet, I suspect I still have those sketches of his!) So I think I was, in some unexpected way, the perfect editor for Spiegelman’s proposal. I was also a Jew and, though my grandfather had come to America in the 1890s from Lemberg (now Ukraine’s Lviv) and later brought significant parts of his family here, I remember my grandmother telling me of family members who had been swallowed up by the Holocaust.

Anyway, here’s the moment I still recall. I was lying down reading what Louise had given me when my wife, Nancy, walked past me. At that moment, I burst out laughing. “What’s so funny?” she asked. Her question took me completely aback. I paused for a genuinely painful moment and then said, haltingly and in an only faintly coherent fashion, something like: “Uh… it’s a proposed comic book about a guy whose parents lived through Auschwitz and later, in his adolescence, his mother committed suicide…”

I felt abashed and yet I had been laughing and that stopped me dead in my tracks. At that very moment, I realized, however irrationally, that whatever this strange, engrossing, disturbing comic book about a world from hell turned out to be, I just had to do it. From that moment on, whether it ever sold a copy or not wasn’t even an issue for me.

A Holocaust comic book? And then, you might say, the problems began. I went to André, told him about the project, and he reacted predictably. Who in the world, he wondered, would buy a Holocaust comic book? I certainly didn’t know, nor did I even care then. In some gut way, I simply knew that a world without this book would be a lesser place. It was that simple.

Thank heavens, as a boss, André deeply believed in his editors, just as we editors believed in each other. He also hated to say “no.” So, instead, a kind of siege ensued as the proposed book passed from hand to hand and others looked and reacted. But I remained determined and knew that, in the end, if I stayed that way, he would let me do it, as indeed he did.

I was considered something of a fierce editor in those days and yet I doubt I touched a word of Spiegelman’s manuscript. What it is today, it is thanks purely to him, not me. I took him out to lunch to tell him about our publication decision and prepare the way for our future collaboration. While there, I assured him that I knew nothing about producing such a book — he, for instance, wanted the kind of flaps that were found on French but not American paperbacks — and would simply do what he wanted. The one thing I wanted him to know, though, was that he shouldn’t get his hopes up. Given the subject matter, it was unlikely to sell many copies. (A Pulitzer Prize? It never crossed my mind.)

Fortunately, as far as I could tell, he all too sagely paid no attention whatsoever to me on the subject. And as it happened, some months later (as best I remember), the New York Times Book Review devoted a full page to him and, in part, to the future Maus. It was like a miracle. We were stunned and, from that moment on, knew that we had something big on our hands.

And in that fashion, in another century, you could say that I unbanned Maus, preparing the way for McMinn County to ban it in our own Trumpist moment. I couldn’t be prouder today to have had a hand in producing the book that caricaturist David Levine would all too aptly compare to the work of Franz Kafka.

In its continuing eventful existence, as a unique record of the truly terrible things we humans are capable of doing to one another, it is indeed a masterpiece. It raises issues that all of us, parents and children, should have to grapple with on our endangered planet, a place where we have so much work ahead of us if, in some terrible fashion, we don’t want to ban ourselves.

Tom Engelhardt created and runs the website TomDispatch.com (where this article originated). He is also a co-founder of the American Empire Project and the author of a highly praised history of American triumphalism in the Cold War, The End of Victory Culture. A fellow of the Type Media Center, his sixth and latest book is A Nation Unmade by War.

Copyright ©2022 Tom Engelhardt — distributed by Agence Global

—————-
Released: 17 February 2022
Word Count: 2,135
—————-

William Astore, “America’s disastrous 60-Year War”

February 15, 2022 - TomDispatch

In my lifetime of nearly 60 years, America has waged five major wars, winning one decisively, then throwing that victory away, while losing the other four disastrously. Vietnam, Afghanistan, and Iraq, as well as the Global War on Terror, were the losses, of course; the Cold War being the solitary win that must now be counted as a loss because its promise was so quickly discarded.

America’s war in Vietnam was waged during the Cold War in the context of what was then known as the domino theory and the idea of “containing” communism. Iraq and Afghanistan were part of the Global War on Terror, a post-Cold War event in which “radical Islamic terrorism” became the substitute for communism. Even so, those wars should be treated as a single strand of history, a 60-year war, if you will, for one reason alone: the explanatory power of such a concept.

For me, because of President Dwight D. Eisenhower’s farewell address to the nation in January 1961, that year is the obvious starting point for what retired Army colonel and historian Andrew Bacevich recently termed America’s Very Long War (VLW). In that televised speech, Ike warned of the emergence of a military-industrial complex of immense strength that could someday threaten American democracy itself. I’ve chosen 2021 as the VLW’s terminus point because of the disastrous end of this country’s Afghan War, which even in its last years cost $45 billion annually to prosecute, and because of one curious reality that goes with it. In the wake of the crashing and burning of that 20-year war effort, the Pentagon budget leaped even higher with the support of almost every congressional representative of both parties as Washington’s armed attention turned to China and Russia.

At the end of two decades of globally disastrous war-making, that funding increase should tell us just how right Eisenhower was about the perils of the military-industrial complex. By failing to heed him all these years, democracy may indeed be in the process of meeting its demise.

The prosperity of losing wars Several things define America’s disastrous 60-year war. These would include profligacy and ferocity in the use of weaponry against peoples who could not respond in kind; enormous profiteering by the military-industrial complex; incessant lying by the U.S. government (the evidence in the Pentagon Papers for Vietnam, the missing WMD for the invasion of Iraq, and the recent Afghan War papers); accountability-free defeats, with prominent government or military officials essentially never held responsible; and the consistent practice of a militarized Keynesianism that provided jobs and wealth to a relative few at the expense of a great many. In sum, America’s 60-year war has featured conspicuous destruction globally, even as wartime production in the U.S. failed to better the lives of the working and middle classes as a whole.

Let’s take a closer look. Militarily speaking, throwing almost everything the U.S. military had (nuclear arms excepted) at opponents who had next to nothing should be considered the defining feature of the VLW. During those six decades of war-making, the U.S. military raged with white hot anger against enemies who refused to submit to its ever more powerful, technologically advanced, and destructive toys.

I’ve studied and written about the Vietnam War and yet I continue to be astounded by the sheer range of weaponry dropped on the peoples of Southeast Asia in those years — from conventional bombs and napalm to defoliants like Agent Orange that still cause deaths almost half a century after our troops finally bugged out of there. Along with all that ordnance left behind, Vietnam was a testing ground for technologies of every sort, including the infamous electronic barrier that Secretary of Defense Robert McNamara sought to establish to interdict the Ho Chi Minh trail.

When it came to my old service, the Air Force, Vietnam became a proving ground for the notion that airpower, using megatons of bombs, could win a war. Just about every aircraft in the inventory then was thrown at America’s alleged enemies, including bombers built for strategic nuclear attacks like the B-52 Stratofortress. The result, of course, was staggeringly widespread devastation and loss of life at considerable cost to economic fairness and social equity in this country (not to mention our humanity). Still, the companies producing all the bombs, napalm, defoliants, sensors, airplanes, and other killer products did well indeed in those years.

In terms of sheer bomb tonnage and the like, America’s wars in Afghanistan and Iraq were more restrained, mainly thanks to the post-Vietnam development of so-called smart weapons. Nonetheless, the sort of destruction that rained down on Southeast Asia was largely repeated in the war on terror, similarly targeting lightly armed guerrilla groups and helpless civilian populations. And once again, expensive strategic bombers like the B-1, developed at a staggering cost to penetrate sophisticated Soviet air defenses in a nuclear war, were dispatched against bands of guerrillas operating in Afghanistan, Iraq, and Syria. Depleted uranium shells, white phosphorus, and cluster munitions, as well as other toxic weaponry, were used repeatedly. Again, short of nuclear weapons, just about every weapon that could be thrown at Iraqi soldiers, al-Qaeda or ISIS insurgents, or Taliban fighters in Afghanistan, would be used, including those venerable B-52s and, in one case, what was known as the MOAB, or mother of all bombs. And again, despite all the death and destruction, the U.S. military would lose both wars (one functionally in Iraq and the other all too publicly in Afghanistan), even as so many in and out of that military would profit and prosper from the effort.

What kind of prosperity are we talking about? The Vietnam War cycled through an estimated $1 trillion in American wealth, the Afghan and Iraq Wars possibly more than $8 trillion (when all the bills come due from the War on Terror). Yet, despite such costly defeats, or perhaps because of them, Pentagon spending is expected to exceed $7.3 trillion over the next decade. Never in the field of human conflict has so much money been gobbled up by so few at the expense of so many.

Throughout those 60 years of the VLW, the military-industrial complex has conspicuously consumed trillions of taxpayer dollars, while the U.S. military has rained destruction around the globe. Worse yet, those wars were generally waged with strong bipartisan support in Congress and at least not actively resisted by a significant “silent majority” of Americans. In the process, they have given rise to new forms of authoritarianism and militarism, the very opposite of representative democracy.

Paradoxically, even as “the world’s greatest military” lost those wars, its influence continued to grow in this country, except for a brief dip in the aftermath of Vietnam. It’s as if a gambler had gone on a 60-year losing binge, only to find himself applauded as a winner.

Constant war-making and a militarized Keynesianism created certain kinds of high-paying jobs (though not faintly as many as peaceful economic endeavors would have). Wars and constant preparations for the same also drove deficit spending since few in Congress wanted to pay for them via tax hikes. As a result, in all those years, as bombs and missiles rained down, wealth continued to flow up to ever more gigantic corporations like Boeing, Raytheon, and Lockheed Martin, places all too ready to hire retired generals to fill their boards.

And here’s another reality: very little of that wealth ever actually trickled down to workers unless they happened to be employed by those weapons makers, which — to steal the names of two of this country’s Hellfire missile-armed drones — have become this society’s predators and reapers. If a pithy slogan were needed here, you might call these the Build Back Better by Bombing years, which, of course, moves us squarely into Orwellian territory.

Learning from Orwell and Ike Speaking of George Orwell, America’s 60-Year War, a losing proposition for the many, proved a distinctly winning one for the few and that wasn’t an accident either. In his book within a book in Nineteen Eighty-Four, Orwell wrote all-too-accurately of permanent war as a calculated way of consuming the products of modern capitalism without generating a higher standard of living for its workers. That, of course, is the definition of a win-win situation for the owners. In his words:

The essential act of war is destruction, not necessarily of human lives, but of the products of human labor. War is a way of shattering to pieces, or pouring into the stratosphere, or sinking in the depths of the sea, materials which might otherwise be used to make the masses too comfortable, and hence, in the long run, too intelligent. Even when weapons of war are not actually destroyed, their manufacture is still a convenient way of expending labor power without producing anything that can be consumed [by the workers].

War, as Orwell saw it, was a way of making huge sums of money for a few at the expense of the many, who would be left in a state where they simply couldn’t fight back or take power. Ever. Think of such war production and war-making as a legalized form of theft, as Ike recognized in 1953 in his “cross of iron” speech against militarism. The production of weaponry, he declared eight years before he named “the military-industrial complex,” constituted theft from those seeking a better education, affordable health care, safer roads, or indeed any of the fruits of a healthy democracy attuned to the needs of its workers. The problem, as Orwell recognized, was that smarter, healthier workers with greater freedom of choice would be less likely to endure such oppression and exploitation.

And war, as he knew, was also a way to stimulate the economy without stimulating hopes and dreams, a way to create wealth for the few while destroying it for the many. Domestically, the Vietnam War crippled Lyndon Johnson’s plans for the Great Society. The high cost of the failed war on terror and of Pentagon budgets that continue to rise today regardless of results are now cited as arguments against Joe Biden’s “Build Back Better” plan. President Franklin D. Roosevelt’s New Deal arguably would have never been funded if today’s vast military-industrial complex, or even the one in Ike’s day, had existed in the 1930s.

As political theorist Crane Brinton noted in The Anatomy of Revolution, a healthy and growing middle class, equal parts optimistic and opportunistic, is likely to be open to progressive, even revolutionary ideas. But a stagnant, shrinking, or slipping middle class is likely to prove politically reactionary as pessimism replaces optimism and protectionism replaces opportunity. In this sense, the arrival of Donald Trump in the White House was anything but a mystery and the possibility of an autocratic future no less so.

All those trillions of dollars consumed in wasteful wars have helped foster a creeping pessimism in Americans. A sign of it is the near-total absence of the very idea of peace as a shared possibility for our country. Most Americans simply take it for granted that war or threats of war, having defined our immediate past, will define our future as well. As a result, soaring military budgets are seen not as aberrations, nor even as burdensome, but as unavoidable, even desirable — a sign of national seriousness and global martial superiority.

You’re going to have it tough at the end It should be mind-blowing that, despite the wealth being created (and often destroyed) by the United States and impressive gains in worker productivity, the standard of living for workers hasn’t increased significantly since the early 1970s. One thing is certain: it hasn’t happened by accident.

For those who profit most from it, America’s 60-Year War has indeed been a resounding success, even if also a colossal failure when it comes to worker prosperity or democracy. This really shouldn’t surprise us. As James Madison warned Americans so long ago, no nation can protect its freedoms amid constant warfare. Democracies don’t die in darkness; they die in and from war. In case you hadn’t noticed (and I know you have), evidence of the approaching death of American democracy is all around us. It’s why so many of us are profoundly uneasy. We are, after all, living in a strange new world, worse than that of our parents and grandparents, one whose horizons continue to contract while hope contracts with them.

I’m amazed when I realize that, before his death in 2003, my father predicted this. He was born in 1917, survived the Great Depression by joining Franklin Roosevelt’s Civilian Conservation Corps, and worked in factories at night for low pay before being drafted into the Army in World War II. After the war, he would live a modest middle-class life as a firefighter, a union job with decent pay and benefits. Here was the way my dad put it to me: he’d had it tough at the beginning of his life, but easy at the end, while I’d had it easy at the beginning, but I’d have it tough at the end.

He sensed, I think, that the American dream was being betrayed, not by workers like himself, but by corporate elites increasingly consumed by an ever more destructive form of greed. Events have proven him all too on target, as America has come to be defined by a greed-war for which no armistice, let alone an end, is promised. In twenty-first-century America, war and the endless preparations for it simply go on and on. Consider it beyond irony that, as this country’s corporate, political, and military champions claim they wage war to spread democracy, it withers at home.

And here’s what worries me most of all: America’s very long war of destruction against relatively weak countries and peoples may be over, or at least reduced to the odd moment of hostilities, but America’s leaders, no matter the party, now seem to favor a new cold war against China and now Russia. Incredibly, the old Cold War produced a win that was so sweet, yet so fleeting, that it seems to require a massive do-over.

Promoting war may have worked well for the military-industrial complex when the enemy was thousands of miles away with no capacity for hitting “the homeland,” but China and Russia do have that capacity. If a war with China or Russia (or both) comes to pass, it won’t be a long one. And count on one thing: America’s leaders, corporate, military, and political, won’t be able to shrug off the losses by looking at positive balance sheets and profit margins at weapons factories.

William Astore, a retired lieutenant colonel (USAF) and professor of history, writes regularly for TomDispatch (where this article originated). He is a senior fellow at the Eisenhower Media Network (EMN), an organization of critical veteran military and national security professionals. His personal blog is Bracing Views.

Copyright ©2022 William Astore — distributed by Agence Global

—————-
Released: 15 February 2022
Word Count: 2,429
—————-

Belle Chesler, “Crisis in the schools”

February 14, 2022 - TomDispatch

It feels odd to admit this, but I miss the stillness of the first few disorienting and terrifying weeks of the pandemic, when the noise and hustle of my world quieted down. In March and April of 2020, spring somehow seemed more riotously colorful and gratuitously lush. Choruses of birds replaced the sounds of cars in my neighborhood of Portland, Oregon. Gone was a traffic-filled commute and the energetically grueling weekday rituals of my past 17 years teaching at a large public high school. My house and my family became the locus and focal point of my day. Our tiny universe contracted, as we navigated the first year of the pandemic together, an island of three.

On returning to in-person school for what many hoped might be a “normal school year” in September 2021, I realized that a not-so-subtle shift had occurred in me. I was relieved to be back in the building with my colleagues and overjoyed to see my students in person instead of on Zoom, but I felt crushed by the sensory overwhelm of it all.

Being at school was both eerily familiar and strangely scary. The building itself seemed to roar and echo as voices bounced off every surface. Everywhere, bodies pushed too close. The required social distancing of that moment simply didn’t exist. We careened into and away from each other in the hallways, everyone oddly awkward and unstable, wary of the potential threat of the virus and of one another. The sheer volume of shared togetherness felt terrifying. I left school each day hollowed out from speaking so many words and interacting so closely with so many students and colleagues.

The visceral challenges of being back among 1,800 other humans during a raging pandemic would, however, prove just a precursor to an avalanche of seemingly insurmountable obstacles. The effects of two years of pandemic schooling, both virtual and in-person, have taken their toll on all of us: students, parents, and teachers alike.

Recently, the chaos inflicted by the Omicron variant (including growing staffing shortages that range from missing substitutes, special-education aides, and school nurses to nutrition workers and bus drivers), along with widespread mental illness and political strife, has left our already struggling public schools in tatters and the people running them (myself included) exhausted. While public discourse has centered on who should be blamed for school-building closures, on harassing librarians and teachers in an effort to ban books from our libraries and classrooms, and on arguing about critical race theory that’s supposedly being taught in our high schools but isn’t, educators like me have been focused on simply trying to make sure our students are safe and supported in a time of unprecedented hardship and uncertainty.

So it comes as no surprise to me that, according to a study recently done by the Oregon Education Association, 37% of educators in Beaverton, the district where I teach, are considering leaving the profession at the end of this school year. In neighboring Portland that number rises to an alarming 49%. Those numbers represent the cumulative exhaustion of a workforce drained of its energy and resources and of a system no longer able to maintain the people it relies on to keep the very school doors open.

A return to a normal that no longer exists Zoom-learning was soul-crushingly devoid of the laughter and energy of a traditional classroom and could never serve as a replacement for hands-on learning. However, it did, at least, offer a glimpse of the possibility of running schools in a different way, one that might include a learning experience more responsive to the educational, social, and emotional needs of all students.

It was a deeply flawed model, put instantly and chaotically in place when not every student had access to a sufficient WiFi connection or even a computer. Forged in reaction to circumstances novel and dire, it favored the privileged and the most disciplined, while putting so many students and families under incredible stress. Still, it did offer the potential for the sort of change that might include much-needed reforms in a system rooted in antiquated, inequitable, and unsustainable ways of operating.

Our online schedule was more flexible, with longer breaks built into the day. On Wednesdays, we had a full asynchronous day to meet individually with students, connect with parents, and collaborate with colleagues. And because we built our new curriculum from scratch with far fewer requirements from the district and state, we were able to focus on creating more meaningful content. And here’s a sad irony: poor online class attendance allowed us a glimpse of the potential benefits of smaller class sizes and more one-on-one time with students.

Having proved capable of building an entirely new system in just a few weeks, many of us hoped that, when we returned to our schools, we might be able to make necessary and positive changes there, too. Instead, fears of learning loss from the previous year coupled with calls for a “return to normal” forced all of us back into well-worn and established patterns of how schools do school — a complete denial of our experiences of the previous year. The first bell still rang at 7:45 am with hour-and-a-half-long classes stacked up one after another. Back were the same old too-large class sizes, the frenetic and unrelenting pace, the usual standardized curriculum and testing, traditional modes of assessment, outmoded graduation requirements, and the general drudgery of the secondary-school routine.

The only real changes were our Covid protocols: universal masking, trying to keep three feet of distance in cramped spaces filled with 30 to 40 students, and inflexible seating charts meant to help with future makeshift contact tracing. Even our usual active-shooter lockdown drills had to be canceled because you can’t safely cram 40 students under tables in the corner of a classroom in the middle of a pandemic.

While I was incredibly grateful for the added layers of safety, they only intensified the carceral aspects of school. Security guards wandered the halls, and doors were locked to outsiders, whether volunteers or parents. Strict rules were put in place around how we could gather, who could leave the classroom and when, who could eat where and when, and how we could come and go. All of this sapped joy from the experience of finally being back together.

And then we started to fall apart After the adrenaline rush and novelty of being back in the building together wore off, students started to fall apart. Fights broke out daily. The numbers of students wandering the halls, cutting classes, or simply not showing up increased. For those who continued to attend classes, behaviors once kept under control by an engaging curriculum, positive relationships, and effective classroom management only seemed to intensify. Unable to regulate their emotions, some students would yell or burst into tears; others were unabashedly defiant. For the depressed and the anxious, behaviors ranged from agitation to complete shutdown. For those in need of an escape, numbing behaviors became far more pronounced. And if given a break in the middle of class, the students almost universally retreated into the world of their phones, leaving the room silent as each scrolled furiously, their masked faces illuminated by blue screens.

While connected by the experience of a worldwide pandemic, so many of them are processing the fear, uncertainty, social isolation, and political and cultural chaos individually. Some of my students and colleagues (just like some of your friends) are doing okay. Maybe it’s luck or privilege, thickness of skin, unwavering resilience, or simply denial. But many of them are really struggling, and, just for the record, exhausted, traumatized, and demoralized adolescents and grown-ups don’t make for the most thoughtful, engaged, or high-achieving students and teachers.

In our school, administrators, trying valiantly to support those students with the most serious mental-health challenges, created a “Wellness Room” where, when feeling overwhelmed, students could sit for 30 minutes and try to regroup. However, our school psychologist and social worker — we only have one of each — couldn’t possibly assist all the students in need of immediate mental-health support. Such services just aren’t available in public schools, even in the best of times. It’s no wonder, then, that mental healthcare providers in Oregon are raising the alarm that behavioral healthcare systems are imploding.

Struggling to do more with less To make matters worse, we’re operating with fewer staff than ever before. A nationwide staffing shortage has left schools with too few substitute teachers, bus drivers, nurses, and food-service workers. Since the beginning of the year, teachers and administrators have been forced to cover other classes, losing essential prep time and leaving work previously done during school hours for later. The Omicron wave only intensified the situation.

My question: How do you run a school without enough staff? Some school districts in Kansas are, for example, responding to teacher shortages by lowering the education and age requirements for substitutes, offering the job to anyone with a high-school diploma. In essence, the pandemic continues to reduce our presence to that of a warm adult body, an incredibly demoralizing thought for a dedicated teacher who is also a highly trained professional.

The weight of all these issues falls squarely on our shoulders and we’re already exhausted by the struggles of living through the ongoing pandemic, while being stretched to the limits of our professional abilities. Despite my desire to help my students any way I can, I’m not a trained mental healthcare professional or a social worker. And honestly, I find it strange that our schools — and, more specifically, we teachers — should be left responsible for providing services a more humane society would prioritize and make widely available to its children. I want to use the skills I’ve been honing over the last 20 years to do what I do best, which is educate.

Why I teach People sometimes recoil when I tell them that I teach high school. All of us have stories from those years that carry profound emotional resonance. Often, the scars of adolescence are deep and school can play an outsized role in creating them. However, the high-school classroom can also be a place where, thanks to the right teacher, the right group of classmates, or a particular subject matter, we discover something special about ourselves. That’s why I love working with adolescents.

For the most part, teenagers have not become numb to the magic of our world and are remarkably open to learning and changing. Often, they love and feel deeply. On days when my own sense of despair creeps up on me, their earnestness can act like a balm. In many ways, that reciprocal relationship of ours was part of the job I simply took for granted before the pandemic. I then benefited simply by being in proximity to their hopefulness, passion, and openness, just as they were benefiting from the way I shared my curiosity and love of learning.

So I find myself grief-stricken by what’s been lost in the classroom during these last two years. I miss seeing their faces. I miss watching them flirt and build new friendships. I miss catching their sudden expressions of unmitigated joy. I feel for my students who spend seven to eight hours a day, five days a week, in such an anxiety-provoking environment. I continue to observe the toll the chronic stresses of climate change — yes, the heat and flooding in the northwest have been fierce this year! — political strife, and a seemingly unending pandemic, have taken on student optimism. More than ever, it feels as if we’re urging students to make themselves more vulnerable by investing in a future that’s increasingly hard for them (and us) to imagine. And this year, many of our students have proven unwilling or unable to do just that.

We all deserve more I have real empathy for parents who feel let down by schools. Wanting your child to be cared for, feel safe, and receive a high-quality education isn’t too much to ask. Unfortunately, when we rely on public schools (beset by problems long before the arrival of the pandemic) to be panaceas, how could they come out looking like anything but abject failures? What single institution could possibly solve the complex web of issues that afflict our society?

As for teachers working within that system, no matter how well intentioned, hardworking, or compassionate any of us may be, we’re going to have a hard time personally combating, much less solving, the problems we face on a societal level. Every gesture of kindness, care, or even real engagement stands in danger of getting lost in a larger story of failure. Honestly, though, what system isn’t failing us right now, perhaps our political system above all?

For 18 years, I thought I could go it alone, closing my classroom door and trying to create a little utopia where students would feel safe enough to be creative and take risks. For the most part, I felt that I could make it work. Then the pandemic hit and the scale of the issues became so large and complex that I had to admit there was no way I could address them by myself.

None of us are equipped as individuals to fix what’s now broken. We have neither the energy, nor the resources to do that. So, listen to me when I say that teachers are shouting their SOS to you right now. Please send help. We can’t go it alone.

Belle Chesler is a visual arts teacher in Beaverton, Oregon, and she writes regularly for TomDispatch (where this article originated).

Copyright ©2022 Belle Chesler — distributed by Agence Global

—————-
Released: 14 February 2022
Word Count: 2,249
—————-

John Feffer, “The coming MAGA cultural revolution?”

February 10, 2022 - TomDispatch

I’ve just wrapped up my shift at BurgerBoy and I don’t have much time before the weekly self-criticism session at town hall. This hour with my diary is precious, especially when I have to make a big decision. Writing used to be my job, but it’s so much more difficult after eight straight hours on my feet. It’s been more than a year since the disastrous 2024 election and I can’t overstate how much I miss my old life.

But I shouldn’t complain. Some of my former colleagues from the newspaper have it so much worse. My editor, for instance, is picking tomatoes not far from here under the hot Florida sun, which isn’t easy for a 45-year-old with bad knees. One of our former White House pool reporters is at a nearby chicken-processing plant. The few times we’ve met for a cup of coffee, I can’t bear to look at her hands.

If I had a choice, I wouldn’t be slinging burgers and dumping shoestring potatoes into a fryer 55 hours a week, breathing in that oil-clogged air and barely keeping up with the lunchtime rush. But it’s not as physically demanding as working in the fields or chopping up chickens on a frigid factory floor.

We’ve been at these jobs for six months, which is how long the new Civilian Conservation Corps — a name borrowed from President Franklin Roosevelt’s New Deal but with none of the social-democratic content — has been up and running. At the newspaper, we all thought the new president was joking when he promised to revive the old Biden administration idea of a youth climate corps. Of course, he did so with a grim focus all his own and a new slogan that “everyone has to pitch in to make America great again!”

Left unsaid was the administration’s plan to deport millions of undocumented workers and plunge the country into a desperate labor crisis. What’s more, the president blocked all new immigrants from what he called “shithole countries” and somehow expected incoming Scandinavians to fill the vacuum, though Swedes and Norwegians were clearly uninterested in moving to America en masse to cut lawns and build skyscrapers at non-Scandinavian wages.

So, that left us, the former “expert class,” newly unemployed, to do the work.

“We’re going to send those reporters and other freeloaders down to the countryside to get a real education,” the president insisted when he signed the Civilian Conservation Corps into law. “This is the first step in really draining the swamp.”

After a lifetime dedicated to exposing the corruption, legislative double-dealing, and bureaucratic insanities of Washington, my journalistic colleagues and I never thought of ourselves as actual inhabitants of the swamp. We were the zoologists. We developed the taxonomies and performed the autopsies. So, we dutifully reported on the president’s speech, never thinking it applied to us.

It’s not as if we missed the early warning signs of this war on expertise: the reporters attacked during campaign rallies, the death threats against public health officials, the storming of school-board meetings. It’s just that we didn’t expect those rabid but scattered incidents to morph into an official presidential initiative after the 2024 elections.

On his first day in office, the president signaled his new policy by authorizing a memorial on the Capitol grounds to the “patriots” of January 6th and commissioning a statue of the QAnon shaman for the Rotunda. He then appointed people to his cabinet who not only lacked the expertise to manage their departments but were singularly devoted to destroying the bureaucracies beneath them, not to speak of the country itself. He put militia leaders in key Defense Department roles and similarly filled the courts with extremists more suited to playing reality-show judges than real-life ones. In all of this, the president has been aided by a new crop of his very own legislators, men and women who know nothing about Congress and actively flout its rules and traditions even as they make the MAGA caucus the dominant voting bloc.

We laughed bitterly as we reported on each of these acts of political surrealism. Soon enough, however, those laughs died in our throats.

The joke, we learned, was on us.

Bashing China, emulating China The president’s supporters started bringing up China during the protests following George Floyd’s murder in 2020 when activists began pulling down monuments to slaveholders and Confederate generals.

This was an American-style “Cultural Revolution,” right-wing pundits insisted, referring to the tumultuous period of Chinese history from 1966 to 1976 when young revolutionaries, encouraged by leader Mao Zedong, tortured and killed “reactionary” elements, destroyed cultural treasures, and fought for control of institutions like universities and factories. At the behest of the Communist Party, those Red Guards also supervised the expulsion of intellectuals and civil servants to the countryside for “reeducation.”

America’s racial-justice activists bore no resemblance to those Red Guards. Unlike the young Chinese radicals of the 1960s, America’s activists didn’t kill anyone or subject even the worst racists to beatings and public humiliations. They pressed their demands for the removal of statues through democratic channels.

With that false Cultural Revolution analogy, the president’s supporters were able to portray Democrats as communists, while they engaged in the kind of China-bashing they’d perfected on pandemic, trade, and security issues. Meanwhile, having learned just enough about the Cultural Revolution to advance their far-fetched comparisons, the president’s team also clearly gathered tips on what to do with intellectuals and other “running dog lackeys” of the “globalists.”

Early on in the current administration, “expertise” became the new Communism, with doctorates as suspect as Party membership cards. Scientists who insisted on “promoting the false religion” of climate change found themselves without funding and then without jobs. Witch-hunting committees were established to pin all the failures of the last several decades — the pandemic, the trade deficit, the immigration “crisis” — on the expert class, their attacks becoming the cornerstone for a new, all-American version of class warfare.

The discrediting of experts and their ouster from positions of authority allowed the president’s supporters to move into the recently vacated positions. Credentials now being unnecessary, they became university professors, top officials in federal agencies, and newspaper pundits. The newly created Civilian Conservation Corps became the “solution” to the “problem” of the newly dispossessed expert class. The president introduced it as a “big, beautiful job retraining program.” In reality, the CCC was little different from the vagrancy laws of an earlier era that press-ganged the poor into prison labor.

We initially thought this revamped Civilian Conservation Corps would be voluntary and somehow, despite our reportorial skills, failed to grasp how the machinery of coercion was being constructed behind the scenes. Soon enough, though, with the assistance of its deep-pocketed financial supporters, the government began orchestrating hostile takeovers of media outlets critical of government policy — just as the right-wing government of Viktor Orbán had done in Hungary. School boards, newly dominated by Three Percenters, Family Firsters, and other presidential allies, changed the rules of employment to oust superintendents, principals, and teachers. A coordinated attack on the “Deep State” purged “radicals” from civil service jobs. Laws were passed to make union organizing essentially illegal.

There were some scattered demonstrations against the CCC registrations. The Corps, however, was initially popular at the polls — a reflection of how much anger had built up against the “experts,” the “functionaries,” and all the teachers who were supposedly pushing “critical race theory” in their classrooms. The same people who vociferously attacked vaccine mandates as government overreach had no problem with a new federal agency registering 5% of the population for “job retraining” and “employment relocation.”

Even with pandemic travel restrictions still in place throughout much of the world, the wealthy and famous critics of the president managed to leave the country. A few eccentrics disappeared into the internal exile of mountain shacks and survivalist shelters. The rest of us, with our wishful thinking and slender means, were caught up in the dragnet.

Good, honest work, the president promised as part of the CCC. Well, I’m not in a concentration camp and I can just about live within my stipend as long as I eat most of my meals at BurgerBoy. (The grilled chicken sandwich with avocado isn’t bad.) It’s the first time in my life that I’m grateful to be unmarried and childless. Many of my former colleagues have to pull double or even triple shifts to feed their families on the meager CCC pay.

I can get by. But I’m really worried about what comes next.

Self-Criticism Once a week, as a requirement of the CCC system, we “volunteers” gather in a town hall committee room to report on our “progress” at work and confess any “crimes” of “action, intent, or thought.” Over time, the 25 of us in this mid-sized town in central Florida have found a way to get through the proceedings in a relatively speedy three hours.

Actually, there’s even some truth to the self-criticisms I make. With important exceptions, we in the media did largely ignore the plight of working America, the people we now labor alongside in fast-food restaurants, on road-construction crews, and at hotel-cleaning services. We never truly grasped the difficulty of making a living at such unlivable wages. Nor did we understand the challenges of the jobs themselves until our new colleagues had to teach us repeatedly how to avoid burning ourselves at a fryer or pick tomatoes fast enough to make a decent piece rate.

Perhaps most importantly, we failed to understand the justifiable anger of the working class at how, in recent decades, this country’s economy had skewed so wildly in favor of the wealthy, a tipping of the playing field abetted by the political mainstream. I suppose there’s some poetic justice in sending us on these “assignments” to see how the other 95% live.

But last week, just as I was getting used to those self-criticism sessions, the rules changed. That’s when Karen, the local CCC director (who’d previously headed up the president’s reelection campaign in this town), informed us of a new directive from the administration.

“Saboteurs have infiltrated the CCC,” Karen told us solemnly. “We need to weed out and punish them.”

She painted a picture of wrecked factories and uprooted seedlings on farms. The “resistance” was apparently attempting to undermine “our president’s super-great plans and this has to stop.”

Therefore, Karen needed us to rat on our fellow “volunteers.”

There are four of us — an astrophysicist, a classics professor, a nutritionist, and me — embedded with the local BurgerBoy staff. We four like each other well enough, though I find it difficult to put up with how slowly the astrophysicist assembles the burger orders and I bristle at the little jokes the classicist cracks under her breath in Greek and Latin. After some initial suspiciousness, we now get along with the local staff, too. When Aishah, the longest-serving employee, lobbied for higher wages, we supported her campaign even though the salary increases don’t apply to us.

There isn’t much we can sabotage here at BurgerBoy, unless you consider over-salted fries and poorly mixed shakes acts of resistance. Even if we managed to shut down the whole place, the town would hardly grind to a halt. Customers would just migrate to the fried chicken joint where three other CCC “volunteers” work.

In fact, we’ve all pulled together. Thanks to the nutritionist, we’ve made a few fixes to improve the taste and quality of the food and I’ve rewritten the descriptions of the meals to make them sound more appealing. Aishah suggested changes to better meet local needs around hours of operation and family discounts. Our restaurant is now making money instead of consistently losing it.

Even as we’ve turned our BurgerBoy around, however, the rest of the country is failing, big time. The economy is a mess, despite all the conscript labor working to keep supply chains functioning. The shelves are only half-full at the local supermarket. Prices are skyrocketing. Yet another wave of Covid-19 has filled the local hospital’s ICU to the brim in a now officially maskless, vaccine-mandate-less country, which only aggravates the labor shortage.

The new administration needs scapegoats.

Of course, the fears of sabotage are not completely unfounded. There’s resistance all right. It’s just not coming from us.

Resistance It started with a customer who overheard the former classics professor calling the chocolate shake theobroma — “food of the Gods” in Greek. The rest of us groaned, as usual.

“Hey,” the customer whispered to the professor over the counter. “That’s Greek, isn’t it?”

The professor, decked out in her BurgerBoy apron and cap, was all smiles. “That’s right.”

“I’m looking for a tutor for my girl,” the woman said. “Are you available?”

That’s when we first found out about how upset the locals had become over the changes in the schools. Almost everyone in town has been grumbling about the incompetence of the new teachers and the principal’s refusal to meet with any but the wealthiest of the parents. According to local gossip, the students aren’t learning a thing. As word of mouth spread and more customers began asking about our hidden specialties, my CCC colleagues started moonlighting.

And that was just the beginning. We soon found out from our customers that the healthcare system was falling apart because of a lack of competent administrators and dedicated public health officials. Social Security checks and Medicare benefits have been delayed because the federal bureaucracy has shrunk to near invisibility. Even with the addition of CCCers, there still aren’t enough pickers for the crops or enough experienced kill-room operators for the slaughterhouses.

Who needs saboteurs when the system set up by the new government is sabotaging itself? The leaders implemented their new laws on behalf of the People. But the actual people are beginning to have second thoughts.

I know this nightmare won’t end overnight. China’s Cultural Revolution stretched on for nearly a decade and resulted in as many as two million dead. Our now-captive media doesn’t report on the growing violence in this country, but we’ve heard rumors about mobs attacking a courageous podcaster in Georgia and vigilantes targeting a lone abortion provider in Texas. Things might get a lot worse before they get better.

Still, this former reporter needs to decide what part he’s going to play in dealing with autocratic rule in our town and the country at large.

Until now, I haven’t gotten any moonlighting gigs. It speaks volumes about my employability when a professor of dead languages gets more requests for tutoring than I do. But today, one of our customers, a secretary in Town Hall, passed me an envelope. She’d heard I was a journalist, so she took the risk of giving me this information.

According to the documents she slipped me, Karen has been siphoning off money meant for public infrastructure like roads and bridges into meeting her own private infrastructure needs like a remodeled kitchen, a new sports car, and a luxury sailboat. The envelope contains bank records, store receipts, and full-color photos that nail it all down.

So, Karen wants us to rat on saboteurs? I’ve got just the answer: if I have enough courage to confront her or somehow get this information written up and into the world. After all, she has the power to get me reassigned to a coal mine in West Virginia or a prison in South Carolina, if she wants.

I don’t know much about China’s Cultural Revolution, but I do know this: when Communist Party official Deng Xiaoping returned from cleaning out pig pens in the countryside, he didn’t just work to reverse the Cultural Revolution. Once he rose to become China’s paramount leader, he began a thorough transformation of the Chinese system.

I’m not a fan of a lot that has happened in China since, but I do know that we, too, need a thorough transformation here in America. If I ever survive the wrath of Karen and make it out of this BurgerBoy, that’s going to be my life’s mission. To exit this current mess, America needs its experts, but it also needs its pickers and cleaners and burger-flippers making livable wages and participating in rebuilding our country.

Drawing on our different skills, we turned around our little BurgerBoy. One day maybe we can bring our all-in-it-together revolution to the rest of this polarized, violent, desperately unequal, and ultimately failing country.

John Feffer writes regularly for TomDispatch (where this article originated). He is the author of the dystopian novel Splinterlands and the director of Foreign Policy In Focus at the Institute for Policy Studies. Frostlands, a Dispatch Books original, is volume two of his Splinterlands series and the final novel in the trilogy, Songlands, has just been published. He has also written The Pandemic Pivot.

Copyright ©2022 John Feffer — distributed by Agence Global

—————-
Released: 10 February 2022
Word Count: 2,732
—————-

Rajan Menon, “The strategic blunder of the 1990s that set the stage for today’s Ukrainian crisis”

February 8, 2022 - TomDispatch

Understandably enough, commentaries on the crisis between Russia and the West tend to dwell on Ukraine. After all, more than 100,000 Russian soldiers and a fearsome array of weaponry have now been emplaced around the Ukrainian border. Still, such a narrow perspective deflects attention from an American strategic blunder that dates to the 1990s and is still reverberating.

During that decade, Russia was on its knees. Its economy had shrunk by nearly 40%, while unemployment was surging and inflation skyrocketing. (It reached a monumental 86% in 1999.) The Russian military was a mess. Instead of seizing the opportunity to create a new European order that included Russia, President Bill Clinton and his foreign-policy team squandered it by deciding to expand NATO threateningly toward that country’s borders. Such a misbegotten policy guaranteed that Europe would once again be divided, even as Washington created a new order that excluded and progressively alienated post-Soviet Russia.

The Russians were perplexed — as well they should have been.

At the time, Clinton and company were hailing Russian President Boris Yeltsin as a democrat. (Never mind that he had lobbed tank shells at his own recalcitrant parliament in 1993 and, in 1996, prevailed in a crooked election, abetted weirdly enough by Washington.) They praised him for launching a “transition” to a market economy, which, as Nobel Laureate Svetlana Alexievich so poignantly laid out in her book Secondhand Time, would plunge millions of Russians into penury by “decontrolling” prices and slashing state-provided social services.

Why, Russians wondered, would Washington obsessively push a Cold War NATO alliance ever closer to their borders, knowing that a reeling Russia was in no position to endanger any European country?

An alliance saved from oblivion

Unfortunately, those who ran or influenced American foreign policy found no time to ponder such an obvious question. After all, there was a world out there for the planet’s sole superpower to lead and, if the U.S. wasted time on introspection, “the jungle,” as the influential neoconservative thinker Robert Kagan put it, would grow back and the world would be “imperiled.” So, the Clintonites and their successors in the White House found new causes to promote using American power, a fixation that would lead to serial campaigns of intervention and social engineering.

The expansion of NATO was an early manifestation of this millenarian mindset, something theologian Reinhold Niebuhr had warned about in his classic book, The Irony of American History. But who in Washington was paying attention, when the world’s fate and the future were being designed by us, and only us, in what Washington Post neoconservative columnist Charles Krauthammer celebrated in 1990 as the ultimate “unipolar moment” — one in which, for the first time ever, the United States would possess peerless power?

Still, why use that opportunity to expand NATO, which had been created in 1949 to deter the Soviet-led Warsaw Pact from rolling into Western Europe, given that both the Soviet Union and its alliance were now gone? Wasn’t it akin to breathing life into a mummy?

To that question, the architects of NATO expansion had stock answers, which their latter-day disciples still recite. The newly born post-Soviet democracies of Eastern and Central Europe, as well as other parts of the continent, could be “consolidated” by the stability that only NATO would provide once it inducted them into its ranks. Precisely how a military alliance was supposed to promote democracy was, of course, never explained, especially given a record of American global alliances that had included the likes of Philippine strongman Ferdinand Marcos, Greece under the colonels, and military-ruled Turkey.

And, of course, if the denizens of the former Soviet bloc now wanted to join the club, how could they rightly be denied? It hardly mattered that Clinton and his foreign policy team hadn’t devised the idea in response to a raging demand for it in that part of the world. Quite the opposite. Consider it the strategic analog to Say’s Law in economics: they designed a product and the demand followed.

Domestic politics also influenced the decision to push NATO eastward. President Clinton had a chip on his shoulder about his lack of combat credentials. Unlike the 31 American presidents who had served in the military, he never had, while his opponent in the 1996 election, Senator Bob Dole, had been badly injured fighting in World War II. Worse yet, his evasion of the Vietnam-era draft had been seized upon by his critics, so he felt compelled to show Washington’s power brokers that he had the stomach and temperament to safeguard American global leadership and military preponderance.

In reality, because most voters weren’t interested in foreign policy, neither was Clinton and that actually gave an edge to those in his administration deeply committed to NATO expansion. From 1993, when discussions about it began in earnest, there was no one of significance to oppose them. Worse yet, the president, a savvy politician, sensed that the project might even help him attract voters in the 1996 presidential election, especially in the Midwest, home to millions of Americans with eastern and central European roots.

Furthermore, given the support NATO had acquired over the course of a generation in Washington’s national security and defense industry ecosystem, the idea of mothballing it was unthinkable, since it was seen as essential for continued American global leadership. Serving as a protector par excellence provided the United States with enormous influence in the world’s premier centers of economic power of that moment. And officials, think-tankers, academics, and journalists — all of whom exercised far more influence over foreign policy and cared much more about it than the rest of the population — found it flattering to be received in such places as a representative of the world’s leading power.

Under the circumstances, Yeltsin’s objections to NATO pushing east (despite verbal promises made to the last head of the Soviet Union, Mikhail Gorbachev, not to do so) could easily be ignored. After all, Russia was too weak to matter. And in those final Cold War moments, no one even imagined such NATO expansion. So, betrayal? Perish the thought! No matter that Gorbachev steadfastly denounced such moves and did so again this past December.

You reap what you sow

Russian President Vladimir Putin is now pushing back, hard. Having transformed the Russian army into a formidable force, he has the muscle Yeltsin lacked. But the consensus inside the Washington Beltway remains that his complaints about NATO’s expansion are nothing but a ruse meant to hide his real concern: a democratic Ukraine. It’s an interpretation that conveniently absolves the U.S. of any responsibility for ongoing events.

Today, in Washington, it doesn’t matter that Moscow’s objections long preceded Putin’s election as president in 2000 or that, once upon a time, it wasn’t just Russian leaders who didn’t like the idea. In the 1990s, several prominent Americans opposed it and they were anything but leftists. Among them were members of the establishment with impeccable Cold War credentials: George Kennan, the father of the containment doctrine; Paul Nitze, a hawk who served in the Reagan administration; the Harvard historian of Russia Richard Pipes, another hardliner; Senator Sam Nunn, one of the most influential voices on national security in Congress; Senator Daniel Patrick Moynihan, a one-time U.S. ambassador to the United Nations; and Robert McNamara, Lyndon Johnson’s Secretary of Defense. Their warnings were all remarkably similar: NATO’s expansion would poison relations with Russia, while helping to foster within it authoritarian and nationalist forces.

The Clinton administration was fully aware of Russia’s opposition. In October 1993, for example, James Collins, the chargé d’affaires at the U.S. embassy in Russia, sent a cable to Secretary of State Warren Christopher, just as he was about to travel to Moscow to meet Yeltsin, warning him that NATO’s enlargement was “neuralgic to Russians” because, in their eyes, it would divide Europe and shut them out. He warned that the alliance’s extension into Central and Eastern Europe would be “universally interpreted in Moscow as directed at Russia and Russia alone” and so regarded as “neo-containment.”

That same year, Yeltsin would send a letter to Clinton (and the leaders of the United Kingdom, France, and Germany) fiercely opposing NATO expansion if it meant admitting former Soviet-bloc states while excluding Russia. That would, he predicted, actually “undermine Europe’s security.” The following year, he clashed publicly with Clinton, warning that such expansion would “sow the seeds of mistrust” and “plunge post-Cold War Europe into a cold peace.” The American president dismissed his objections: the decision to offer former members of the Soviet bloc membership in the alliance’s first wave of expansion in 1999 had already been taken.

The alliance’s defenders now claim that Russia accepted it by signing the 1997 NATO-Russia Founding Act. But Moscow really had no choice, being dependent then on billions of dollars in International Monetary Fund loans (possible only with the approval of the United States, that organization’s most influential member). So, it made a virtue of necessity. That document, it’s true, does highlight democracy and respect for the territorial integrity of European countries, principles Putin has done anything but uphold. Still, it also refers to “inclusive” security across “the Euro-Atlantic area” and “joint decision-making,” words that hardly describe NATO’s decision to expand from 16 countries at the height of the Cold War to 30 today.

By the time NATO held a summit in Romania’s capital, Bucharest, in 2008, the Baltic states had become members and the revamped alliance had indeed reached Russia’s border. Yet the post-summit statement praised Ukraine’s and Georgia’s “aspirations for membership,” adding “we agreed today that these countries will become members of NATO.” President George W. Bush’s administration couldn’t possibly have believed Moscow would take Ukraine’s entry into the alliance lying down. The American ambassador to Russia, William Burns — now the head of the CIA — had warned in a cable two months earlier that Russia’s leaders regarded that possibility as a grave threat to their security. That cable, now publicly available, all but foresaw a train wreck like the one we’re now witnessing.

But it was the 2008 Russia-Georgia war — with rare exceptions mistakenly presented as an unprovoked, Moscow-initiated attack — that provided the first signal that Vladimir Putin was past the point of issuing protests. His annexation of Crimea from Ukraine in 2014, following an illegal referendum, and the creation of two “republics” in the Donbas, itself part of Ukraine, were far more dramatic moves that effectively initiated a second Cold War.

Averting disaster

And now, here we are. A divided Europe, increasing instability amid military threats by nuclear-armed powers, and the looming possibility of war, as Putin’s Russia, its troops and armaments massed around Ukraine, demands that NATO expansion cease, that Ukraine be barred from the alliance, and that the United States and its allies finally take Russia’s objections to the post-Cold War security order seriously.

Of the many obstacles to averting war, one is particularly worth noting: the widespread claim that Putin’s concerns about NATO are a smokescreen obscuring his true fear: democracy, particularly in Ukraine. Russia, however, repeatedly objected to NATO’s eastward march even when it was still being hailed as a democracy in the West and long before Putin became president in 2000. Besides, Ukraine has been a democracy (however tumultuous) since it became independent in 1991.

So why the Russian buildup now?

Vladimir Putin is anything but a democrat. Still, this crisis is unimaginable without the continual talk about someday ushering Ukraine into NATO and Kyiv’s intensifying military cooperation with the West, especially the United States. Moscow views both as signs that Ukraine will eventually join the alliance, which — not democracy — is Putin’s greatest fear.

Now for the encouraging news: the looming disaster has finally energized diplomacy. We know that the hawks in Washington will deplore any political settlement that involves compromise with Russia as appeasement. They’ll liken President Biden to Neville Chamberlain, the British Prime Minister who, in 1938, gave way to Hitler in Munich. Some of them advocate a “massive weapons airlift” to Ukraine, à la Berlin as the Cold War began. Others go further, urging Biden to muster an “international coalition of the willing, readying military forces to deter Putin and, if necessary, prepare for war.”

Sanity, however, can still prevail through a compromise. Russia could settle for a moratorium on Ukrainian membership in NATO for, say, two decades, something the alliance should be able to accept because it has no plans to fast-track Kyiv’s membership anyway. To gain Ukraine’s assent, it would be guaranteed the freedom to secure arms for self-defense and, to satisfy Moscow, Kyiv would agree never to allow NATO bases or aircraft and missiles capable of striking Russia on its territory.

The deal would have to extend beyond Ukraine if it is to ward off crises and war in Europe. The United States and Russia would need to summon the will to discuss arms control there, including perhaps an improved version of the 1987 Intermediate-Range Nuclear Forces Treaty that President Trump ditched in 2019. They would also need to explore confidence-building measures like excluding troops and armaments from designated areas along the NATO-Russian borderlands and steps to prevent the (now-frequent) close encounters between American and Russian warplanes and warships that could careen out of control.

Over to the diplomats. Here’s wishing them well.

Rajan Menon writes regularly for TomDispatch (where this article originated). He is the Anne and Bernard Spitzer Professor of International Relations emeritus at the Powell School, City College of New York, director of the Grand Strategy Program at Defense Priorities, and Senior Research Scholar at the Saltzman Institute of War and Peace Studies at Columbia University. He is the author, most recently, of The Conceit of Humanitarian Intervention.

Copyright ©2022 Rajan Menon — distributed by Agence Global

—————-
Released: 08 February 2022
Word Count: 2,209
—————-

Nan Levinson, “The antiwar movement that wasn’t enough”

February 7, 2022 - TomDispatch

When I urge my writing students to juice up their stories, I tell them about “disruptive technologies,” inventions and concepts that end up irrevocably changing industries. Think: iPhones, personal computers, or to reach deep into history, steamships. It’s the tech version of what we used to call a paradigm shift. (President Biden likes to refer to it as an inflection point.)

Certain events function that way, too. After they occur, it’s impossible to go back to how things were: World War II for one generation, the Vietnam War for another, and 9/11 for a third. Tell me it isn’t hard now to remember what it was like to catch a flight without schlepping down roped-off chutes like cattle to the slaughter, even if for most of the history of air travel, no one worried about underwear bombers or explosive baby formula. Of course, once upon a time, we weren’t incessantly at war either.

However, for my students, the clumsily named Gen Z, the transformative event in their lives hasn’t been a war at all — no matter that their country has been enmeshed in one or more of them for all of their conscious lives. It’s probably George Floyd’s murder or the Covid pandemic or the double whammy of both, mixed in with a deadly brew of Trumpism. That alone strikes me as a paradigm shift.

It’s not that they are uncaring. Those I know are ardent about fixing myriad wrongs in the world and prepared to work at it, too. And like many Americans, for a few weeks as August 2021 ended, they were alarmed by the heartbreaking consequences of their country’s failed mission in Afghanistan and its betrayal of the people there. How could you not be heartbroken about people desperate to save their lives and livelihoods? And the girls… ah, the girls, the 37% of teenage girls who learned to read in those years, went to school with boys, saw their lives change, and probably will be denied all of that in the years to come.

In my more cynical moments, though, I note that it was the girls and women who were regularly trotted out by our government officials and generals insisting that U.S. troops must remain in Afghanistan until — until what? Until, as it turned out, disaster struck. After all, what good American heart doesn’t warm to educating the young and freeing girls from forced marriages (as opposed, of course, to killing civilians and causing chaos)?

Militarism is among the all-American problems the young activists I meet do sometimes bring up. It’s just not very high on their list of issues to be faced. The reasons boil down to this: the wars in Iraq and Afghanistan, interminable as they seemed, had little or no direct effect on most of my students or the lives they imagined having, and that was reflected in their relative lack of attention to those wars, which tells us all too much about this country in the twenty-first century.

Spare change

So here we are, 20 years after U.S. troops invaded Afghanistan and months since they hotfooted it out. That two-decade-long boots-on-the-ground (and planes in the air) episode has now officially been declared over and done with, if not exactly paid for. But was that an inflection point, as this country turned its military attention to China and Russia? Not so fast. I’m impatient with the conventional wisdom about our twenty-first-century wars and the reaction to them at home. Still, I do think it’s important to try to figure out what has (or hasn’t) been learned from them and what may have changed because of them.

In the changed column, alas, the answer seems to be: not enough. Once again, in the pandemic moment, our military is filling roles that would be left to civil society if it were adequately funded — helping in hospitals and nursing homes, administering Covid-19 vaccinations and tests, teaching school and driving school buses — because, as Willie Sutton answered when asked why he robbed banks, that’s where the money is.

Apparently, it’s so much money that even the Defense Department doesn’t quite know how to spend it. Between 2008 and 2019, the Pentagon returned almost $128 billion in unspent funds from its staggeringly vast and still expanding budget. Admittedly, that’s a smaller percentage of that budget than other departments turned back, but it started with so much more and, as a result, that Pentagon spare change accounted for nearly half of all “cancelled” government funds during that time.

Yet too little of those vast sums goes to active-duty troops. A recent survey found that 29% of the families of junior-level, active-duty soldiers experienced food insecurity (that is, hunger) in the past year, a strong indicator of the economic precariousness of everyday military life, even here at home.

It didn’t help that the U.S. military’s wars only sporadically drew extended public attention. Of course, before 1979, when the Soviet Union invaded Afghanistan, that country’s name was shorthand for a place too obscure for most Americans even to find on a world map. And maybe that was still true in 2020, when, nearly two decades after the U.S. invaded that nation, the American presence there got all of five minutes of coverage on the national evening newscasts of CBS, NBC, and ABC.

Years earlier, when the focus was more on Iraq than Afghanistan, I attended a meeting of the Smedley Butler Brigade of Veterans For Peace. I was writing a story for the Boston Globe, which made me an easy target for the veterans’ anger. As a result, they badgered me to make our city’s newspaper of record print a daily report of deaths in the war. I explained that, as a freelancer, I had even less influence than they did and, unsurprisingly, such an accounting never came to pass.

Years later, as the U.S. endeavor in Afghanistan wound down and the Globe and other mainstream outlets did actually publish calculations of the costs, I found myself wondering if all those credible, influential media sources would ever publish a reckoning of how many times in the past 20 years, when it might have made a difference, they had run cost analyses of the blinding arrogance that defined U.S. foreign and military policy in those decades. The impact of such accountings might have been vanishingly small anyway.

It’s true, by the way, that Brown University’s Costs of War Project did a formidable job of tackling that issue in those endless war years, but their accounts were, of course, anything but mainstream. Even today, in that mainstream, accurate counts are still hard to come by. The New York Times, which recently published a groundbreaking report on civilian deaths in the Middle East caused by U.S. airstrikes, was stymied by the Pentagon for years when trying to get the necessary documents for just such an accounting, while provincial authorities in Afghanistan often denied that civilian casualties had even occurred.

Presence and power

In 2004, when Iraq Veterans Against the War (IVAW) was just getting started, I was introduced to a small group of disillusioned but determined young vets, wonderfully full of themselves and intent on doing things their way. While they appreciated the earlier soldier-led antiwar efforts of the Vietnam War era, they wanted to do it all in a new fashion. “We’re sort of reinventing the wheel,” Eli Wright, a young medic who had served in Iraq, told me. “But we’re making it a much nicer wheel, I think.” I was smitten.

At first, those newly minted anti-warriors thought the very novelty of their existence in war-on-terror America would be enough. So, they told and retold their stories to anyone who would listen: stories of misguided raids and policing actions for which they were ill-equipped and ill-trained; of soul-destroying cruelty they found themselves implicated in; and of their dawning awareness, even while they were in Iraq, that they could no longer be a party to any of it. Believe me, those veterans told powerful and moving stories, but it wasn’t nearly enough.

In a piece about the power and pitfalls of storytelling, Jonathan Gottschall notes that, in the tales we tell, we tend to divide people into a tidy triad of heroes, victims, and villains. My longtime trope was that we — by which I mean we Americans — allowed those fighting our endless wars to be only heroes or victims — the former to valorize, the latter to pity — but nothing else. (Admittedly, sometimes civilian peace workers did see them as villains, but despite an inevitable jockeying for position, civilian and military antiwar groups generally recognized each other as comrades-against-arms.) IVAW insisted on adding activist to that dichotomy, as they attempted to change minds and history.

When you’re trying to do that, or at least influence policy, your odds of success are greater if you have a clear, specific goal you can advocate and agitate for and build coalitions around. Then, when you achieve it, you can, of course, claim victory. IVAW’s overriding aim was to bring the troops home immediately. That goal was finally (more or less) achieved, though at great cost and so much later than they had been demanding, making it anything but a resounding victory; nor did it, in the end, have much to do with those young veterans.

Their significance may lie elsewhere. Last August, in the midst of the chaotic U.S. pull-out from Afghanistan, I tuned in to a podcast about political and social activism just as Rashad Robinson, president of the racial justice organization, Color of Change, was making a distinction between presence (“retweets, shout-outs from the stage”) and power (“the ability to change the rules”).

It would be hard to come up with a better illustration of that difference than Camp Casey, the August 2005 encampment of antiwar military families, veterans, and their sympathizers. It was sprawled across a ditch in Crawford, Texas, a few miles down the road from the ranch of a vacationing President George W. Bush. Their protest made significant news for those five weeks, as media around the world featured heart-rending stories of mothers in mourning and veterans in tears, photos of an iconic white tent, and interviews with Cindy Sheehan, whose son, Casey, had been killed in Iraq the year before. The media anointed her the Grieving-Mother-in-Chief and news reports sometimes even got the protesters’ end-the-war, bring-the-troops-home message right.

Whizzing past in a motorcade on his way to a fundraiser, President Bush ignored them, and the war in Iraq continued for another five years with the deaths of about 2,700 more sons and daughters of grieving American mothers. But the next month, when somewhere between 100,000 and 300,000 Camp Casey participants, veterans, and their supporters gathered for an antiwar march through downtown Washington, D.C., the government was forced to acknowledge, perhaps for the first time, the existence of opposition to the war in Iraq. For context, the National Park Service estimated then that, of the approximately 3,000 permits it issued for demonstrations on the National Mall yearly, only about a dozen attracted more than 5,000 people.

Presence matters and in the few years following Camp Casey, when the antiwar veterans were at their most effective, they learned how to make themselves harder to ignore. They’ve since renamed their group About Face and reconceived its purpose and goals, but the perennial challenge to political activists is how to turn presence into power.

Why didn’t the antiwar movement catch on?

In February 2003, as many as 10 million people took to the streets in 60 countries to protest the impending U.S. invasion of Iraq. But once that invasion happened, it was primarily the military-related groups, sometimes joined by other peace organizations, that kept the opposition alive. Why, though, couldn’t they turn presence into power? Why didn’t more Americans take up the campaign to end two such pointless wars? Why didn’t we learn?

I make no claim to answering those questions in a definitive way. Nonetheless, here’s my stab at it.

Let’s start with the obvious: the repercussions of an all-volunteer military. Only a small proportion of Americans, self-selected and concentrated in certain parts of the country, have been directly involved in and affected by our twenty-first-century wars. Deployed over and over, they didn’t circulate in civil society in the way the previous draft military had and, as warfare became increasingly mechanized and automated (or drone-ified), there have been ever fewer American casualties to remind everyone else in this country that we were indeed at war in Afghanistan and Iraq. For the troops, that distancing from battle also undoubtedly lessened an innate human resistance to killing, as well as objections to those wars within the military itself.

Next, stylish as it might be in this country to honor veterans of our wars (thank you for your service!), as Kelly Dougherty, IVAW’s first executive director, complained, “We come home and everyone shakes our hands and calls us heroes, but no one wants to listen to us.” Stories of bravery, horrific wounds, and even post-traumatic stress disorder were acceptable. Analysis, insight, or testimony about what was actually going on in the war zones? Not so much.

Folk singer, labor organizer, and vet “Utah” Phillips observed that having a long memory is the most radical idea in America. With items in the news cycle lasting for ever-shorter periods of time before being replaced, administrations becoming ever harder to embarrass, and a voting public getting accustomed to being lied to, even a short memory became a challenge.

The hollowing out of local news in these years only exacerbated the problem. Less local reporting meant fewer stories about people we might actually know or examples of how world events affect our daily lives. Pro-war PR, better funded and connected than any antiwar group could hope to be, filled the gap. Think soldiers striding onto ballfields at sports events to the teary surprise of families and self-congratulatory cheers from the stands. Between 2012 and 2015, the Pentagon paid pro sports teams some $6.8 million to regularly and repeatedly honor the military. Meanwhile, the mainstream media has made it ever harder for peace groups to gain traction by applying a double standard to protest or outsider politics, a reality sociologist Sarah Sobieraj has explored strikingly in her book Soundbitten.

The nature of political protest changed, too. As information was disseminated and shared more and more through social media — activism by way of hashtag, tweet, and Instagram — organizing turned ever more virtual and ever less communal. Finally, despite protestations about the United States being a peace-loving country, the military in these years has proven a rare bipartisan darling, while, historically speaking, violence has been bred into America’s bones.

Maybe, however, the lack of active opposition to the endless wars wasn’t a new normal, but something like the old normal. Sadly enough, conflicts don’t simply end because people march against them. Even the far larger Vietnam antiwar movement was only one pressure point in winding down that conflict. War policy is directed by what happens on the ground and, to a lesser degree, at the ballot box. What an antiwar movement can do is help direct the public response, which may, fingers crossed, save the country from going to war someplace else and save another generation of soldiers from having to repeat the mistakes of the past 20 years.

Nan Levinson’s most recent book is War Is Not a Game: The New Antiwar Soldiers and the Movement They Built. She writes regularly for TomDispatch (where this article originated) and teaches journalism and fiction writing at Tufts University.

Copyright ©2022 Nan Levinson — distributed by Agence Global

—————-
Released: 07 February 2022
Word Count: 2,555
—————-

William Hartung, “What a waste!”

February 3, 2022 - TomDispatch

2021 was another banner year for the military-industrial complex, as Congress signed off on a near-record $778 billion in spending for the Pentagon and related work on nuclear warheads at the Department of Energy. That was $25 billion more than the Pentagon had even asked for.

It can’t be emphasized enough just how many taxpayer dollars are now being showered on the Pentagon. That department’s astronomical budget adds up, for instance, to more than four times the cost of the most recent version of President Biden’s Build Back Better plan, which sparked such horrified opposition from Senator Joe Manchin (D-WV) and other alleged fiscal conservatives. Naturally, they didn’t blink when it came to lavishing ever more taxpayer dollars on the military-industrial complex.

Opposing Build Back Better while throwing so much more money at the Pentagon marks the ultimate in budgetary and national-security hypocrisy. The Congressional Budget Office has determined that, if current trends continue, the Pentagon could receive a monumental $7.3 trillion-plus over the next decade, more than was spent during the peak decade of the Afghan and Iraq wars, when there were up to 190,000 American troops in those two countries alone. Sadly, but all too predictably, President Biden’s decision to withdraw U.S. troops and contractors from Afghanistan hasn’t generated even the slightest peace dividend. Instead, any savings from that war are already being plowed into programs to counter China, official Washington’s budget-justifying threat of choice (even if outshone for the moment by the possibility of a Russian invasion of Ukraine). And all of this despite the fact that the United States already spends three times as much as China on its military.

The Pentagon budget is not only gargantuan, but replete with waste — from vast overcharges for spare parts to weapons that don’t work at unaffordable prices to forever wars with immense human and economic consequences. Simply put, the current level of Pentagon spending is both unnecessary and irrational.

Price gouging on spare parts

Overcharging the Pentagon for spare parts has a long and inglorious history, reaching its previous peak of public visibility during the presidency of Ronald Reagan in the 1980s. Then, blanket media coverage of $640 toilet seats and $7,600 coffee makers sparked public outrage and a series of hearings on Capitol Hill, strengthening the backbone of members of Congress. In those years, they did indeed curb at least the worst excesses of the Reagan military buildup.

Such pricing horror stories didn’t emerge from thin air. They came from the work of people like legendary Pentagon whistleblower Ernest Fitzgerald. He initially made his mark by exposing the Air Force’s efforts to hide billions in cost overruns on Lockheed’s massive C-5A transport plane. At the time, he was described by former Air Force Secretary Verne Orr as “the most hated man in the Air Force.” Fitzgerald and other Pentagon insiders became sources for Dina Rasor, a young journalist who began drawing the attention of the media and congressional representatives to spare-parts overcharges and other military horrors. In the end, she formed an organization, the Project on Military Procurement, to investigate and expose waste, fraud, and abuse. It would later evolve into the Project on Government Oversight (POGO), the most effective current watchdog when it comes to Pentagon spending.

A recent POGO analysis, for instance, documented the malfeasance of TransDigm, a military parts supplier that the Department of Defense’s Inspector General caught overcharging the Pentagon by as much as 3,800% — yes, you read that figure right! — on routine items. The company was able to do so only because, bizarrely enough, Pentagon buying rules prevent contract officers from getting accurate information on what any given item should cost or might cost the supplying company to produce it.

In other words, thanks to Pentagon regulations, those oversight officials are quite literally flying blind when it comes to cost control. The companies supplying the military take full advantage of that. The Pentagon Inspector General’s office has, in fact, uncovered more than 100 overcharges by TransDigm alone, to the tune of $20.8 million. A comprehensive audit of all spare-parts suppliers would undoubtedly find billions of wasted dollars. And this, of course, spills over into ever more staggering costs for finished weapons systems. As Ernest Fitzgerald once said, a military aircraft is just a collection of “overpriced spare parts flying in formation.”

Weapons this country doesn’t need at prices we can’t afford

The next level of Pentagon waste involves weapons we don’t need at prices we can’t afford, systems that, for staggering sums, fail to deliver on promises to enhance our safety and security. The poster child for such costly, dysfunctional systems is the F-35 combat aircraft, a plane tasked with multiple missions, none of which it does well. The Pentagon is slated to buy more than 2,400 F-35s for the Air Force, Marines, and Navy. The estimated lifetime cost for procuring and operating those planes, a mere $1.7 trillion, would make it the Pentagon’s most expensive weapons project ever.

Once upon a time (as in some fairy tale), the idea behind the creation of the F-35 was to build a plane that, in several variations, would be able to carry out many different tasks relatively cheaply, with potential savings generated by economies of scale. Theoretically, that meant the bulk of the parts for the thousands of planes to be built would be the same for all of them. This approach has proven a dismal failure so far, so much so that the researchers at POGO are convinced the F-35 may never be fully ready for combat.

Its failures are too numerous to recount here, but a few examples should suffice to suggest why the program minimally needs to be scaled back in a major way, if not canceled completely. For a start, though meant to provide air support for troops on the ground, it’s proved anything but well-designed to do so. In fact, that job is already handled far better and more cheaply by the existing A-10 “Warthog” attack aircraft. A 2021 Pentagon assessment of the F-35 — and keep in mind that this is the Department of Defense, not some outside expert — found 800 unresolved defects in the plane. Typical of its never-ending problems: a wildly expensive and not particularly functional high-tech helmet, which, at a cost of $400,000 each, is meant to give its pilot special awareness of what’s happening around and below the plane as well as to the horizon. And don’t forget that the F-35 will be staggeringly expensive to maintain and already costs an impressive $38,000 an hour to fly.

In December 2020, House Armed Services Committee Chair Adam Smith finally declared that he was “tired of pouring money down the F-35 rathole.” Even Air Force Chief of Staff General Charles Brown acknowledged that it couldn’t meet its original goal — to be a low-cost fighter — and would have to be supplemented with a less costly plane. He compared it to a Ferrari, adding, “You don’t drive your Ferrari to work every day, you only drive it on Sundays.” It was a stunning admission, given the original claims that the F-35 would be the Air Force’s affordable, lightweight fighter and the ultimate workhorse for future air operations.

It’s no longer clear what the rationale even is for building more F-35s at a time when the Pentagon has grown obsessed with preparing for a potential war with China. After all, if that country is the concern (an exaggerated one, to be sure), it’s hard to imagine a scenario in which fighter planes would go into combat against Chinese aircraft, or be engaged in protecting American troops on the ground — not at a moment when the Pentagon is increasingly focused on long-range missiles, hypersonic weapons, and unpiloted vehicles as its China-focused weapons of choice.

When all else fails, the Pentagon’s fallback argument for the F-35 is the number of jobs it will create in states or districts of key members of Congress. As it happens, virtually any other investment of public funds would build back better with more jobs than F-35s would. Treating weapons systems as jobs programs, however, has long helped pump up Pentagon spending way beyond what’s needed to provide an adequate defense of the United States and its allies.

And that plane is hardly alone in the ongoing history of Pentagon overspending. There are many other systems that similarly deserve to be thrown on the scrap heap of history, chief among them the Littoral Combat Ship (LCS), essentially an F-35 of the sea. Similarly designed for multiple roles, it, too, has fallen far short in every imaginable respect. The Navy is now trying to gin up a new mission for the LCS, with little success.

This comes on top of buying outmoded aircraft carriers for up to $13 billion a pop and planning to spend more than a quarter of a trillion dollars on a new nuclear-armed missile, known as the Ground-Based Strategic Deterrent, or GBSD. Such land-based missiles are, according to former Secretary of Defense William Perry, “among the most dangerous weapons in the world,” because a president would have only minutes to decide whether to launch them on being warned of an enemy nuclear attack. In other words, a false alarm (of which there have been numerous examples during the nuclear age) could lead to a planetary nuclear conflagration.

The organization Global Zero has demonstrated convincingly that eliminating land-based missiles altogether, rather than building new ones, would make the United States and the rest of the world safer, with a small force of nuclear-armed submarines and bombers left to dissuade any nation from launching a nuclear war. Eliminating ICBMs would be a salutary and cost-saving first step towards nuclear sanity, as former Pentagon analyst Daniel Ellsberg and other experts have made all too clear.

America’s cover-the-globe defense strategy

And yet, unbelievably enough, I haven’t even mentioned the greatest waste of all: this country’s “cover the globe” military strategy, including a planet-wide “footprint” of more than 750 military bases, more than 200,000 troops stationed overseas, huge and costly aircraft-carrier task forces eternally floating the seven seas, and a massive nuclear arsenal that could destroy life as we know it (with thousands of warheads to spare).

You only need to look at the human and economic costs of America’s post-9/11 wars to grasp the utter folly of such a strategy. According to Brown University’s Costs of War Project, the conflicts waged by the United States in this century have cost $8 trillion and counting, with hundreds of thousands of civilian casualties, thousands of U.S. troops killed, and hundreds of thousands more suffering from traumatic brain injuries and post-traumatic stress disorder. And for what? In Iraq, the U.S. cleared the way for a sectarian regime that then helped create the conditions for ISIS to sweep in and conquer significant parts of the country, only to be repelled (but not thoroughly defeated) at great cost in lives and treasure. Meanwhile, in Afghanistan, after a conflict doomed as soon as it morphed into an exercise in nation-building and large-scale counterinsurgency, the Taliban is now in power. It’s hard to imagine a more ringing indictment of the policy of endless war.

Despite the U.S. withdrawal from Afghanistan, for which the Biden administration deserves considerable credit, spending on global counterterror operations remains at high levels, thanks to ongoing missions by Special Operations forces, repeated air strikes, ongoing military aid and training, and other kinds of involvement short of full-scale war. Given the opportunity to rethink strategy as part of a “global force posture” review released late last year, the Biden administration opted for a remarkably status quo approach, insisting on maintaining substantial bases in the Middle East, while modestly boosting the U.S. troop presence in East Asia.

As anyone who’s followed the news knows, despite the immediate headlines about sending troops and planes to Eastern Europe and weapons to Ukraine in response to Russia’s massing of its forces on that country’s borders, the dominant narrative for keeping the Pentagon budget at its current size remains China, China, China. It matters little that the greatest challenges posed by Beijing are political and economic, not military. “Threat inflation” with respect to that country continues to be the Pentagon’s surest route to acquiring yet more resources and has been endlessly hyped in recent years by, among others, analysts and organizations with close ties to the arms industry and the Department of Defense.

For example, the National Defense Strategy Commission, a congressionally mandated body charged with critiquing the Pentagon’s official strategy document, drew more than half its members from individuals on the boards of arms-making corporations, working as consultants for the arms industry, or from think tanks heavily funded by just such contractors. Not surprisingly, the commission called for a 3% to 5% annual increase in the Pentagon budget into the foreseeable future. Follow that blueprint and you’re talking $1 trillion annually by the middle of this decade, according to an analysis by Taxpayers for Common Sense. Such an increase, in other words, would prove unsustainable in a country where so much else is needed, but that won’t stop Pentagon budget hawks from using it as their North Star.

In March of this year, the Pentagon is expected to release both its new national defense strategy and its budget for 2023. There are a few small glimmers of hope, like reports that the administration may abandon certain dangerous (and unnecessary) nuclear-weapons programs instituted by the Trump administration.

However, the true challenge, crafting a budget that addresses genuine security problems like public health and the climate crisis, would require fresh thinking and persistent public pressure to slash the Pentagon budget, while reducing the size of the military-industrial complex. Without a significant change of course, 2022 will once again be a banner year for Lockheed Martin and other top weapons makers at the expense of investing in programs necessary to combat urgent challenges from pandemics to climate change to global inequality.

William D. Hartung writes regularly for TomDispatch (where this article originated). He is a senior research fellow at the Quincy Institute for Responsible Statecraft and the author of Profits of War: Corporate Beneficiaries of the Post-9/11 Surge in Pentagon Spending (Brown University’s Costs of War Project and the Center for International Policy, September 2021).

Copyright ©2022 William D. Hartung — distributed by Agence Global

—————-
Released: 03 February 2022
Word Count: 2,322
—————-
