Agence Global

Kelly Denton-Borhaug, “Moral injury and the forever wars”

August 3, 2021 - TomDispatch

This summer, it seemed as if we Americans couldn’t wait to return to our traditional July 4th festivities. Haven’t we all been looking for something to celebrate? The church chimes in my community rang out battle hymns for about a week. The utility poles in my neighborhood were covered with “Hometown Hero” banners hanging proudly, sporting the smiling faces of uniformed local veterans from our wars. Fireworks went off for days, sparklers and cherry bombs and full-scale light shows filling the night sky.

But all the flag-waving, the homespun parades, the picnics and military bands, the flowery speeches and self-congratulatory messages can’t dispel a reality, a truth that’s right under our noses: all is not well with our military brothers and sisters. The starkest indicator of that is the rising number of them who are taking their own lives. A new report by Brown University’s Costs of War Project calculates that, in the post-9/11 era so far, four times as many veterans and active-duty military have committed suicide as died in war operations.

While July 4th remembrances across the country focused on the symbols and institutions of war and militarization, most of the celebrants seemed far less interested in hearing from current and former military personnel. After all, less than 1% of Americans have been burdened with waging Washington’s wars in these years, even as we taxpayers have funded an ever-more enormous military infrastructure.

As for me, though, I’ve been seeking out as many of those voices as I could for a long, long time. And here’s what I’ve learned: the truths so many of them tell sharply conflict with the remarkably light-hearted and unthinking celebrations of war we experienced this July and so many Julys before it. I keep wondering why so few of us are focusing on one urgent question: Why are so many of our military brothers and sisters taking their own lives?

The moral injuries of war

The term moral injury is now used in military and healthcare settings to identify a deep existential pain destroying the lives of too many active-duty personnel and vets. In these years of forever wars, when the moral consciences of such individuals collided with the brutally harsh realities of militarization and killing, the result has been a sharp, sometimes death-dealing dissonance. Think of moral injury as an invisible wound of war. It represents at least part of the explanation for that high suicide rate. And it’s implicated in more than just those damning suicides: an additional 500,000 troops in the post-9/11 era have been diagnosed with debilitating, not fully understood symptoms that make their lives remarkably unlivable.

I first heard the term moral injury about 10 years ago at a conference at Riverside Church in New York City, where Jonathan Shay, the renowned military psychologist, spoke about it. For decades he had provided psychological care for veterans of the Vietnam War who were struggling with unremitting resentment, guilt, and shame in their post-deployment lives. They just couldn’t get on with those very lives after their military experiences. They had, it seemed, lost trust in themselves and anyone else.

Still, Shay found that none of the typical mental-health diagnoses seemed to fit their symptoms. This wasn’t post-traumatic stress disorder — a hyper-vigilance, anxiety, and set of fears arising from traumatic experience. No, what came to be known as moral injury seemed to result from a sense that the very center of one’s being had been assaulted. If war’s intent is to inflict physical injury and destruction, and if the trauma of war afflicts people’s emotional and psychic well-being, moral injury describes an invisible wound that burns away at a person’s very soul. The Iraq War veteran and writer Kevin Powers describes it as “acid seeping down into your soul, and then your soul is gone.”

A central feature of moral injury is a sense of having betrayed one’s own deepest moral commitments, as well as of being betrayed morally by others. People who are suffering from moral injury feel there’s nothing left in their world to trust, including themselves. For them, any notion of “a shared moral covenant” has been shattered. But how does anyone live in a world without moral guideposts, even flawed ones? The world of modern war, it seems, not only destroys the foundations of life for its targets and victims, but also for its perpetrators.

Difficult truths from those on the front lines of our wars

For civilians like me, there’s no way to understand moral injury without listening to those afflicted with it. I’ve been doing so to try to make sense of our culture of war for years now. As a religious studies scholar, I’ve been especially concerned about the ways in which so many of us give American-style war a sacred quality. Think, for instance, about the meme that circulates during national holidays like the recent July 4th, or Veterans Day, or Memorial Day: “Remember that only two forces ever agreed to die for you: Jesus Christ and the American soldier. One died for your freedom, the other for your soul; pass it on!”

How, I wonder, do such messages further shame and silence those already struggling with moral injury whose experiences have led them to see war as anything but sacred?

It’s been years since I first heard Andy, a veteran of the Iraq War, testify in the most personal way about moral injury at a Philadelphia church. He’s part of a family with a long military history. His father and grandfather both served in this country’s wars before he enlisted in the Army in 1999, at 17. He came to work in military intelligence and would eventually be deployed to Iraq.

But all was most definitely not well with Andy when, after 11 years in the Army, he returned to civilian life. He found himself struggling in his relationships, unable to function, a mess, and eventually suicidal. He bounced from one mental healthcare provider to the next for eight years without experiencing the slightest sense of relief. On the verge of ending his life, he was referred to a new “Moral Injury Group” led by chaplain Chris Antal and psychologist Peter Yeomans at the Crescenz VA Hospital in Philadelphia. Andy decided this would be his last effort before calling it quits. Frankly, given what I now know, I’m amazed that he was willing to take that one last chance after so many years of suffering, struggle, and pain to so little purpose.

The professionals who lead that particular group are remarkably blunt about what they call “the work avoidance” of most citizens — the way that the majority of us fail to take any responsibility for the consequences of the endless wars we’ve been fighting in this century. People, they’ve told me, regularly deflect responsibility by adopting any of three approaches to veterans: by valorizing them (think of the simplistic “thank you for your service/sacrifice” or the implicit message of those “hometown heroes” banners); by pathologizing them (seeing vets as mentally ill and irreparably broken); or by demonizing them (think of the Vietnam-era “baby-killers” moniker). Each of these approaches essentially represses what those veterans could actually tell us about their experiences and our wars.

So, the leaders of the Crescenz VA Moral Injury Group developed an unorthodox approach. They assured Andy that he had an important story to tell, one the nation needed to hear so that civilians could finally “bear the brunt of the burden” of sending him to war. Eight years after leaving the military and a few weeks into that program, he finally revealed to those caregivers and vets the event at the root of his own loss of soul. While deployed in Iraq, he had participated in calling in an airstrike that ended up killing 36 Iraqi men, women, and children.

I’ll never forget watching Andy testify about that very moment in the Philadelphia church on Veterans Day before an audience that had expressly indicated its willingness to listen. With palpable anguish, he told how, after the airstrike, his orders were to enter the bombed structure. He was supposed to sift through the bodies to find the supposed target of the strike. Instead, he came upon the lifeless bodies of, as he called them, “proud Iraqis,” including a little girl with a singed Minnie Mouse doll. Those sights and the smell of death were, he told us, “etched on the back of his eyelids forever.” This was the “shame” he carried with him always, an “unholy perpetration,” as he described it.

The day of that attack, he said, he felt his soul leave his body. Over years of listening to veterans’ stories, I’ve come to realize that I’ve heard similar descriptions again and again. It may seem extreme to speak about one’s very soul being eviscerated, but it shouldn’t be treated as an exaggeration. After all, how can we even imagine what the deaths of so many men, women, and children may have meant for the Iraqi families and communities whose loved ones perished that day?

Andy’s story clarifies a reality Americans badly need to grasp: the destruction of war goes far beyond its intended targets. In the end, its violence is impossible to control. It doesn’t stay in those distant lands where this country has been fighting for so many fruitless years. Andy is the proof of that. His “loss of soul” almost had the direst of consequences, as his own suicidal impulses began to take over. Of that moment and his seemingly eternal imprisonment in the hell of war, he said: “I relive this alone, the steel cylinder heavy with the .38, knowing that to drive one into my own face will free me from this prison, these sights and smells.”

Taking moral injury seriously goes against the grain of American war culture

Valorizing, pathologizing, and demonizing vets are all ways of refusing to listen to the actual experiences of those who carry out our wars. And for them, returning home often just adds to their difficulties, since so much of what they might say goes against the grain of national culture.

We’re generally brought up to see ourselves as a nation whose military gets the job done, despite nearly 20 years of “forever wars.” Through national rituals, holidays, and institutions, hot embers of intense pride are regularly stoked, highlighting our military as the fiercest and strongest in the world. Many of us identify what it means to be a citizen with belonging to the most feared and powerful armed forces on the planet. As a result, people easily believe that, when the U.S. goes to war, what we’re doing is, almost by definition, moral.

But those who dare to pay attention to the morally injured will find them offering inconvenient and uncomfortable truths that sharply conflict with exactly those assumptions. Recently, I listened to another group of military veterans and combat correspondents who gathered their courage to tell their stories publicly in a unique fashion for The Moral Injuries of War project. Here are just three small examples:

• “The military just teaches you don’t ask questions, and if you figure it out, it really isn’t your business anyway. That part, that probably is the biggest thing, having to do things you wonder about, but you can’t ask a question.”

• “The cynical part of me wants the public to understand that it’s your fault; we are all complicit in all of this horror. I don’t need other people to experience my pain, I need other people to understand that they are complicit in my pain.”

• “People want to say thank you for your service, wave a flag, but you’re left with these experiences that leave you feeling deeply shameful.  I burned through any relationship in my life with anybody who loved me. I have this feeling in my gut that something really bad is going to happen. God’s shoe was going to fall on me, I can’t breathe.”

I remember how struck I was at the Veterans Day gathering in that Philadelphia church where I first heard Andy speak, because it was so unlike most such celebrations and commemorations. Instead of laying wreaths or planting crosses in the ground; instead of speeches extolling vets as “the spine of the nation” and praising them for their “ultimate sacrifice,” we did something different. We listened to them tell us about the soul-destroying nature of what actually happened to them during their military service (and what’s happened to them ever since). And in addition to civilians like me, other vets were in those church pews listening, too.

After the testimonies, the VA chaplain leading the ceremony asked us all to come to the front of the church. There, he directed the vets to form a circle facing outwards. Then, he asked the civilians to form a circle around them and face them directly. What happened next challenged and moved me. The chaplain suggested we simply stand in silence for a minute, looking into each other’s eyes. You can’t imagine how slowly that minute passed. More than a few of us had tears running down our cheeks. It was as if we were all holding a painful, sharp, unforgiving reality — but doing it together.

Moral injury is a flashpoint that reveals important truths about our wars and the war-culture that goes with them. If focused on, instead of ignored, it raises uncomfortable questions. In the United States, military service is often described as the epitome of citizenship. Leaders and everyday folks alike claim to value veterans as our most highly esteemed citizens.

I wonder, though, if this isn’t just another way of avoiding a real acknowledgment of the disaster of this country’s twenty-first-century wars. Closing our ears to the veterans who have been on their front lines means ignoring the truths of those wars as well.

If this nation truly esteemed them, wouldn’t we do more to avoid placing them in just the kind of circumstances Andy faced? Wouldn’t our leaders work harder to find other ways of dealing with whatever dangers we confront? Wouldn’t everyday citizens raise more questions about the pervasive “innocent” celebrations of violence on national holidays that only sacralize war-culture as a crucial aspect of what it means to be an American citizen?

For Andy, that Moral Injury Group at the Crescenz VA was the place where his “screaming soul” could be heard. Instead of being “imprisoned by guilt,” he described how he began to feel “empowered” by it to tell the truth about our wars to the rest of us. He hopes that the nation will somehow learn to “bear its brunt of the burden” of those wars and the all-American war-culture that accompanies them in a way that truly matters — a new version of reality that would start with finally listening.

Kelly Denton-Borhaug has long been investigating how religion and violence collide in American war-culture. She teaches in the global religions department at Moravian University. She is the author of two books, U.S. War-Culture, Sacrifice and Salvation and, more recently, And Then Your Soul is Gone: Moral Injury and U.S. War-Culture. This article originated at TomDispatch.

Copyright ©2021 Kelly Denton-Borhaug — distributed by Agence Global

—————-

Released: 03 August 2021

Word Count: 2,477

—————-

William Astore, “Pivoting to America”

July 29, 2021 - TomDispatch

As a ROTC cadet and an Air Force officer, I was a tiny part of America’s vast Department of Defense (DoD) for 24 years until I retired and returned to civilian life as a history professor.  My time in the military ran from the election of Ronald Reagan to the reign of George W. Bush and Dick Cheney. It was defined by the Cold War, the collapse of the Soviet Union, America’s brief unipolar moment of dominance and the beginning of its end, as Washington embroiled itself in needless, disastrous wars in Afghanistan and Iraq after the 9/11 attacks.  Throughout those years of service, I rarely thought about a question that seems ever more critical to me today: What would a real system of American national defense look like?

During the Cold War, I took it for granted that this country needed a sprawling network of military bases, hundreds of them globally.  Back then, of course, the stated U.S. mission was to “contain” the communist pathogen.  To accomplish that mission, it seemed all too logical to me then for our military to emphasize its worldwide presence.  Yes, I knew that the Soviet threat was much exaggerated. Threat inflation has always been a feature of the DoD and at the time I’d read books like Andrew Cockburn’s The Threat: Inside the Soviet Military Machine. Still, the challenge was there and, as the leader of the “free world,” it seemed obvious to me that the U.S. had to meet it.

And then the Soviet Union collapsed — and nothing changed in the U.S. military’s global posture.

Or, put differently, everything changed.  For with the implosion of the USSR, what turned out to remain truly uncontained was our military, along with the dreams of neoconservatives who sought to remake the world in America’s image.  But which image?  That of a republic empowering its citizens in a participatory democracy or of an expansionist capitalist empire, driven by the ambition and greed of a set of oligarchs?

A few people spoke then of a “peace dividend.” They were, however, quickly drowned out by the military-industrial complex that President Dwight D. Eisenhower had warned this country about.  That complex, to which today we might add not only Congress (as Ike had done in an earlier draft of his address) but America’s sprawling intelligence apparatus of 18 agencies, eagerly moved into the void created by the Soviet collapse and that of the Warsaw Pact. It quickly came to dominate the world’s trade in arms, for instance, even as Washington sought to expand NATO, an alliance created to contain a Soviet threat that no longer existed.  Such an expansion made no sense, defensively speaking, but it did serve to facilitate further arms sales and bring U.S. imperial hegemony to the very borders of Russia.

And there was the rub — for me at least.  As an Air Force officer, I’d always thought of myself, however naively, as supporting and defending the Constitution against all enemies, foreign and domestic (the words of my oath of office).  After 1991, however, the main foreign enemy had disappeared and, though I didn’t grasp it then, our new enemy would prove to be domestic, not foreign.  It consisted of those who embraced as a positive good what I’ve come to think of as greed-war, while making no apologies for American leadership, no matter how violent, destructive, or self-centered it might prove to be.

In short, the arsenal of democracy of World War II fame had, by the 1960s, become the very complex of imperialism, militarism, and industrialism that Eisenhower warned Americans about first in his 1953 “Cross of Iron” speech and then in his more famous farewell address of 1961.  Despite the efforts of a few brave Americans, that arsenal of democracy was largely allowed to morph into an arsenal of empire, a radical change that came shrouded in the myth of “national security.”  The complex would then only serve to facilitate the war crimes of Vietnam and of subsequent disasters like Afghanistan, Iraq, and Libya, among so many others.

Yet those same misdeeds were so often dismissed by the mainstream media as the unavoidable costs of “national defense” or even supported as the unavoidable price of spreading freedom and democracy around the world. It was as if, in some twisted Orwellian fashion, war had come to be seen as conducive to liberty and righteousness.  But as George Orwell had indeed warned us, war is not peace, nor should constant warfare at a global level be the product of any democratic government worthy of its name.  War is what empires do and it’s what America has become: a machine for war.

Creating a people’s military

So, I ask again: What would real national defense for this country look like? Rarely do any of us pose this question, no less examine what it might truly mean. Rarely do we think about all the changes we’d have to make as a nation and a people if we were to put defense first, second, and last, while leaving behind both our imperial wars and domestic militarism.

I know what it wouldn’t look like.  It wouldn’t look like today’s grossly inflated military.  A true Department of Defense wouldn’t need 800 foreign military bases, nor would the national security state need a budget that routinely exceeds a trillion dollars annually.  We wouldn’t need a huge, mechanized army, a navy built around aircraft carriers, or an air force that boasts of its global reach and global power, all of it created not for defense but for offense — for destruction, anytime, anywhere.

As a country, we would need to imagine a new “people’s” military as a force that could truly defend the American republic. That would obviously mean one focused above all on supporting the Constitution and the rights we (at least theoretically) hold sacred like freedom of speech, the press, and assembly, the right to privacy and due process, and of course the right to justice for all, not just for the highest bidder or those with the deepest pockets.

What might such a new military look like?  First, it would be much smaller.  America’s current military, including troops on active duty, reservists, and members of the National Guard, consists of roughly 2.4 million men and women.  Those numbers should gradually be cut at least in half.  Second, its budget should similarly be dramatically cut, the end goal being to have it 50% lower than next year’s proposed budget of $715 billion.  Third, it wouldn’t be based and deployed around the world. As a republican force (note the lower-case “r”), it would instead serve democratic ends rather than imperial ones.  It would certainly need far fewer generals and admirals.  Its mission wouldn’t involve “global reach,” but would be defensive, focused on our borders and this hemisphere.

A friend of mine, a Navy veteran of the Vietnam War, speaks of a military that would consist of a Coast Guard, “militias” (that is, the National Guard) for each of the fifty states, and little else.  Yes, in this America, that sounds beyond extreme, but he has a point.  Consider our unique advantages in terms of geography.  Our continent is protected by two vast oceans.  We share a long and peaceful border with Canada.  While the border with Mexico is certainly troubled, we’re talking about unarmed, desperate migrants, not a military invasion flooding into Texas to retake the Alamo.

Here, then, are just 10 ways America’s military could change under a vision that would put the defense of America first and free up some genuine funds for domestic needs as well:

1. No more new nuclear weapons.  It’s time to stop “modernizing” that arsenal to the tune of possibly $1.7 trillion over the next three decades.  Land-based intercontinental ballistic missiles like the Ground Based Strategic Deterrent, expected to cost more than $264 billion during its lifetime, and “strategic” (nuclear) bombers like the Air Force’s proposed B-21 Raider should be eliminated.  The Trident submarine force should also be made smaller, with limited modernization to improve its survivability.

2. All Army divisions should be reduced to cadres (smaller units capable of expansion in times of war), except the 82nd and 101st Airborne Divisions and the 10th Mountain Division.

3. The Navy should largely be redeployed to our hemisphere, while aircraft carriers and related major surface ships are significantly reduced in number.

4. The Air Force should be redesigned around the defense of America’s air space, rather than attacking others across the planet at any time. Meanwhile, costly offensive fighter-bombers like the F-35, itself a potential $1.7 trillion boondoggle, should simply be eliminated and the habit of committing drone assassinations across the planet ended. Similarly, the separate Space Force created by President Trump should be folded back into a much-reduced Air Force.

5. The training of foreign militaries and police forces in places like Iraq and Afghanistan should be stopped.  The utter collapse of the U.S.-trained forces in Iraq in the face of the Islamic State in 2014 and the ongoing collapse of the U.S.-trained Afghan military today have made a mockery of this whole process.

6. Military missions launched by intelligence agencies like the CIA, including those drone assassination programs overseas, should be halted and the urge to intervene secretly in the political and military lives of so many other countries finally brought under some kind of control.

7. The “industrial” part of the military-industrial complex should also be brought under control, so that taxpayer dollars don’t go to fabulously expensive, largely useless weaponry. At the same time, the U.S. government should stop promoting the products of our major weapons makers around the planet.

8. Above all, in a democracy like ours, a future defensive military should only fight in a war when Congress, as the Constitution demands, formally declares one.

9. The military draft should be restored.  With a far smaller force, such a draft should have a limited impact, but it would ensure that the working classes of America, which have historically shouldered a heavy burden in military service, will no longer do so alone. In the future America of my military dreams, a draft would take the eligible sons and daughters of our politicians first, followed by all eligible students enrolled in elite prep schools and private colleges and universities, beginning with the Ivy League.  After all, America’s best and brightest will surely want to serve in a military devoted to defending their way of life.

10. Finally, there should be only one four-star general or admiral in each of the three services. Currently, believe it or not, there are an astonishing 44 four-star generals and admirals in America’s imperial forces. There are also hundreds of one-star, two-star, and three-star officers.  This top-heavy structure inhibits reform even as the highest-ranking officers never take responsibility for America’s lost wars.

Pivoting to America

Perhaps you’ve heard of the “pivot to Asia” under the Obama administration — the idea of redeploying U.S. military forces from the Greater Middle East and elsewhere in response to perceived threats from China. As it happened, it took the new Biden administration to begin to pull off that particular pivot, but America’s imperial military regularly seems to be pivoting somewhere or other. It’s time to pivot to this country instead.

Echoing the words of George McGovern, a highly decorated World War II bomber pilot who unsuccessfully ran for president against Richard Nixon in 1972, “Come home, America.” Close all those foreign military bases.  Redirect resources from wars and weapons to peace and prosperity.  Focus on restoring the republic.  That’s how Americans working together could truly defend ourselves, not only from our “enemies” overseas, almost always much exaggerated, but from ourselves, the military-industrial-congressional complex, and all our fears.

Because let’s be frank: how could striking at allegedly Iranian-backed militias operating in Iraq and Syria possibly be a form of self-defense, as the Biden administration claimed back in June?  How is keeping U.S. troops in either of those two countries, or almost any other foreign country, truly a “defensive” act?  America’s “new” department of genuine defense, the one I imagine anyway, will know better.

In my nearly six decades, I’ve come to witness an America that increasingly equates “might” with “right,” and praises its presidents whenever they decide to bomb anyone (usually people in the Middle East or Central Asia, but occasionally in Africa now, too), as long as it’s framed in defensive or “preemptive” terms.  Whether you call this aggression, imperialism, militarism, or something even more unflattering (atrocity?), the one thing it shouldn’t be called is national defense.

Collectively, we need to imagine a world in which we as Americans are no longer the foremost merchants of death, in which we don’t imagine ourselves as the eternal global police force, in which we don’t spend as much on our military as the next 10 countries combined. We need to dream of a world that’s not totally sliced and diced into U.S. military commands like Africa Command (AFRICOM); the Indo-Pacific Command or INDOPACOM; and Central Command (CENTCOM), which covers the Middle East, among others. How would Americans feel if China had an “AMERICOM” and patrolled the Gulf of Mexico with nuclear-armed aircraft carriers very much “Made in China”? Chances are we wouldn’t accept Beijing’s high-minded claims about the “defensive” nature of those patrols.

This country’s rebirth will only begin when we truly put our Constitution first and seek to defend it in wiser, which means so much more restrained, ways.

William Astore, a retired lieutenant colonel (USAF) and professor of history, writes regularly for TomDispatch (where this article originated).  He is a senior fellow at the Eisenhower Media Network (EMN), an organization of critical veteran military and national security professionals. His personal blog is Bracing Views.

Copyright ©2021 William Astore — distributed by Agence Global

—————-

Released: 29 July 2021

Word Count: 2,244

—————-

Tom Engelhardt, “The forbidden word”

July 27, 2021 - TomDispatch

It was all so long ago, in a world seemingly without challengers. Do you even remember when we Americans lived on a planet with a recumbent Russia, a barely rising China, and no obvious foes except what later came to be known as an “axis of evil,” three countries then incapable of endangering this one? Oh, and, as it turned out, a rich young Saudi former ally, Osama bin Laden, and 19 hijackers, most of them also Saudis, from a tiny group called al-Qaeda that briefly possessed an “air force” of four commercial jets. No wonder this country was then touted as the greatest force, the superest superpower ever, sporting a military that left all others in the dust.

And then, of course, came the launching of the Global War on Terror, which soon would be normalized as the plain-old, uncapitalized “war on terror.” Yes, that very war — even if nobody’s called it that for years — began on September 11, 2001. At a Pentagon partially in ruins, Secretary of Defense Donald Rumsfeld, already aware that the destruction around him was probably Osama bin Laden’s responsibility, ordered his aides to begin planning for a retaliatory strike against… Saddam Hussein’s Iraq. Rumsfeld’s exact words (an aide wrote them down) were: “Go massive. Sweep it all up. Things related and not.”

Things related and not. Sit with that phrase for a moment. In their own strange way, those four words, uttered in the initial hours after the destruction of New York’s World Trade Center and part of the Pentagon, still seem to capture the twenty-first-century American experience.

Within days of 9/11, Rumsfeld, who served four presidents before recently stepping off this world at 88, and the president he then worked for, George W. Bush, would officially launch that Global War on Terror. They would ambitiously target supposed terror networks in no less than 60 countries. (Yep, that was Rumsfeld’s number!) They would invade Afghanistan and, less than a year and a half later, do the same on a far grander scale in Iraq to take down its autocratic ruler, Saddam Hussein, who had once been a hand-shaking buddy of the secretary of defense.

Despite rumors passed around at the time by supporters of such an invasion, Saddam had nothing to do with 9/11; nor, despite Bush administration claims, was his regime then developing or in possession of weapons of mass destruction; nor, if we didn’t act, would an Iraqi mushroom cloud have one day risen over New York or some other American city. And mind you, both of those invasions and so much more would be done in the name of “liberating” peoples and spreading American-style democracy across the Greater Middle East. Or, put another way, in response to that devastating attack by those 19 hijackers armed with knives, the U.S. was preparing to invade and dominate the oil-rich Middle East until the end of time. In 2021, almost two decades later, doesn’t that seem like another lifetime to you?

By the way, you’ll note that there’s one word missing in action in all of the above. Believe me, if what I just described had related to Soviet plans during the Cold War, you can bet your bottom dollar that word would have been all over Washington. I’m thinking, of course, of “empire” or, in its adjectival form, “imperial.” Had the Soviet Union planned similar acts to “liberate” peoples by “spreading communism,” it would have been seen in Washington as the most imperial project ever. In the early years of this century, however, with the Soviet Union long gone and America’s leaders imagining that they might reign supreme globally until the end of time, those two words were banished to history.

It was obvious that, despite the unprecedented 800 or so military bases this country possessed around the world, imperial powers were distinctly a thing of the past.

“Empires have gone there and not done it”

Now, keep that thought in abeyance for a moment, while I take you on a quick tour of the long-forgotten Global War on Terror. Almost two decades later, it does seem to be drawing to some kind of lingering close. Yes, there are still those 650 American troops guarding our embassy in the Afghan capital, Kabul, and there is still that “over-the-horizon capacity” the president cites for U.S. aircraft to strike Taliban forces, even if American troops only recently abandoned their last air base in Afghanistan; and yes, there are still about 2,500 American troops stationed in Iraq (and hundreds more at bases across the border in Syria), regularly being attacked by Iraqi militia groups.

Similarly, despite the withdrawal of U.S. forces from Somalia as the Trump years ended, over-the-horizon airstrikes against the terror group al-Shabaab, halted when Joe Biden entered the Oval Office, have just been started again, presumably from bases in Kenya or Djibouti; and yes, the horrendous war in Yemen continues with the U.S. still supporting the Saudis, even if by offering “defensive,” not “offensive” aid; and yes, American special operators are also stationed in staggering numbers of countries around the globe; and yes, prisoners are still being held in Guantanamo, that offshore Bermuda Triangle of injustice created by the Bush administration so long ago. Admittedly, officials in the new Biden Justice Department are at least debating, however indecisively, whether those detainees might have any due process rights under the Constitution (yes, that’s the U.S. Constitution!) and their numbers are now at 39, the lowest since 2002.

Still, let’s face it, this isn’t the set of conflicts that, once upon a time, involved invasions, massive air strikes, occupations, the killing of staggering numbers of civilians, widespread drone attacks, the disruption of whole countries, the uprooting and displacement of more than 37 million people, the deployment at one point of 100,000 U.S. troops in Afghanistan alone, and the spending of untold trillions of American taxpayer dollars, all in the name of fighting terror and spreading democracy. And think of it as mission (un)accomplished in the truest sense imaginable.

In fact, that idea of spreading democracy didn’t really outlast the Bush years. Ever since, there’s been remarkably little discussion in official Washington about what this country was really doing as it warred across significant parts of the planet. Yes, those two decades of conflict, those “forever wars,” as they came to be called first by critics and then by anyone in sight, are at least winding, or perhaps spiraling, down — and yet, here’s the strange thing: Wouldn’t you think that, as they ended in visible failure, the Pentagon’s stock might also be falling? Oddly enough, though, in the wake of all those years of losing wars, it’s still rising. The Pentagon budget only heads ever more for the stratosphere as foreign policy “pivots” from the Greater Middle East to Asia (and Russia and the Arctic and, well, anywhere but those places where terror groups still roam).

In other words, when it comes to the U.S. military as it tries to leave its forever wars in someone else’s ditch, failure is the new success story. Perhaps not so surprisingly, then, the losing generals who fought those wars, while eternally promising that “corners” were being turned and “progress” made, have almost all either continued to rise in the ranks or gotten golden parachutes into other parts of the military-industrial complex. That should shock Americans, but really never seems to. Yes, striking percentages of us support leaving Afghanistan and the Afghans in a ditch somewhere and moving on, but it’s still generally a big “thank you for your service” to our military commanders and the Pentagon.

Looking back, however, isn’t the real question — not that anyone’s asking — this: What was America’s mission during all those years? In reality, I don’t think it’s possible to answer that or explain any of it without using the forbidden noun and adjective I mentioned earlier. And, to my surprise, after all these years when it never crossed the lips of an American president, Joe Biden, the guy who’s been insisting that “America is back” on this failing planet of ours, actually used that very word!

In a recent news conference, irritated to find himself endlessly discussing his decision to pull U.S. forces out of Afghanistan, he fielded this question from a reporter: “Given the amount of money that has been spent and the number of lives that have been lost, in your view, with making this decision, were the last 20 years worth it?”

His response: “I argued, from the beginning [in the Obama years], as you may recall — it came to light after the administration was over… No nation has ever unified Afghanistan, no nation. Empires have gone there and not done it.”

So, there! Yes, it was vague and could simply have been a reference to the fate in Afghanistan, that famed “graveyard of empires,” of the British empire in the nineteenth century and the Soviet one in the twentieth century. But I can’t help thinking that a president, however minimally, however indirectly, however much without even meaning to, finally acknowledged that this country, too, was on an imperial mission there and globally as well, a mission not of spreading democracy or of liberation but of domination. Otherwise, how the hell do you explain those 800 military bases on every continent but Antarctica? Is that really spreading democracy? Is that really liberating humanity? It’s not a subject discussed in this country, but believe me, if it were any other place, the words “empire” and “imperial” would be on all too many lips in Washington and the urge to dominate in such a fashion would have been roundly denounced in our nation’s capital.

A failing empire with a flailing military?

Here’s a question for you: If the U.S. is “back,” as our president has been claiming, what exactly is it back as? What could it be, now that it’s proven itself incapable of dominating the planet in the fashion its political leaders once dreamed of? Could this country, which in these years dumped trillions of taxpayer dollars into its forever wars, now perhaps be reclassified as a failing empire with a flailing military?

Of course, such a possibility isn’t generally acknowledged here. If, for instance, Kabul falls to the Taliban months from now and U.S. diplomats need to be rescued from the roof of our embassy there, as happened in Saigon in 1975 — something the president has vehemently denied is even possible — count on one thing: a bunch of Republicans and right-wing pundits will instantly be down his throat for leaving “too fast.” (Of course, some of them already are, including, as it happens, the very president who launched the 2001 invasion, only to almost instantly refocus his attention on invading Iraq.)

Even domestically, when you think about where our money truly goes, inequality of every sort is only growing more profound, with America’s billionaires ever wealthier and more numerous, while the Pentagon and those weapons-making corporations float ever higher on taxpayer dollars, and the bills elsewhere go unpaid. In that sense, perhaps it’s time to start thinking about the United States as a failing imperial system at home as well as abroad. Sadly, whether globally or domestically, all of this seems hard for Americans to take in or truly describe (hence, perhaps, the madness of Donald Trump’s America). After all, if you can’t even use the words “imperial” and “empire,” then how are you going to understand what’s happening to you?

Still, forget any fantasies about us spreading democracy abroad. We’re now in a country that’s visibly threatening to lose democracy at home. Forget Afghanistan. From the January 6th assault on the Capitol to the latest (anti-)voting laws in Texas and elsewhere, there’s a flailing, failing system right here in the U.S. of A. And unlike Afghanistan, it’s not one that a president can withdraw from.

Yes, globally, the Biden administration has seemed remarkably eager to enter a new Cold War with China and “pivot” to Asia, as the Pentagon continues to build up its forces, from naval to nuclear, as if this country were indeed still the reigning imperial power on the planet. But it’s not.

The real question may be this: Three decades after the Soviet empire headed for the exit, is it possible that the far more powerful American one is ever so chaotically heading in the same direction? And if so, what does that mean for the rest of us?

Tom Engelhardt created and runs the website TomDispatch.com — where this article originated. He is also a co-founder of the American Empire Project and the author of a highly praised history of American triumphalism in the Cold War, The End of Victory Culture.  A fellow of the Type Media Center, his sixth and latest book is A Nation Unmade by War.

Copyright ©2021 Tom Engelhardt — distributed by Agence Global

—————-

Released: 27 July 2021

Word Count: 2,075

—————-

John Feffer, “Artificial intelligence wants you (and your job)”

July 22, 2021 - TomDispatch

My wife and I were recently driving in Virginia, amazed yet again that the GPS technology on our phones could guide us through a thicket of highways, around road accidents, and toward our precise destination. The artificial intelligence (AI) behind the soothing voice telling us where to turn has replaced passenger-seat navigators, maps, even traffic updates on the radio. How on earth did we survive before this technology arrived in our lives? We survived, of course, but were quite literally lost some of the time.

My reverie was interrupted by a toll booth. It was empty, as were all the other booths at this particular toll plaza. Most cars zipped through with E-Z passes, as one automated device seamlessly communicated with another. Unfortunately, our rental car didn’t have one.

So I prepared to pay by credit card, but the booth lacked a credit-card reader.

Okay, I thought, as I pulled out my wallet, I’ll use cash to cover the $3.25.

As it happened, that booth took only coins and who drives around with 13 quarters in his or her pocket?

I would have liked to ask someone that very question, but I was, of course, surrounded by mute machines. So, I simply drove through the electronic stile, preparing myself for the bill that would arrive in the mail once that plaza’s automated system photographed and traced our license plate.

In a thoroughly mundane fashion, I’d just experienced the age-old conflict between the limiting and liberating sides of technology. The arrowhead that can get you food for dinner might ultimately end up lodged in your own skull. The car that transports you to a beachside holiday contributes to the rising tides — by way of carbon emissions and elevated temperatures — that may someday wash away that very coastal gem of a place. The laptop computer that plugs you into the cyberworld also serves as the conduit through which hackers can steal your identity and zero out your bank account.

In the previous century, technology reached a true watershed moment when humans, harnessing the power of the atom, also acquired the capacity to destroy the entire planet. Now, thanks to AI, technology is hurtling us toward a new inflection point.

Science-fiction writers and technologists have long worried about a future in which robots, achieving sentience, take over the planet. The creation of a machine with human-like intelligence that could someday fool us into believing it’s one of us has often been described, with no small measure of trepidation, as the “singularity.” Respectable scientists like Stephen Hawking have argued that such a singularity will, in fact, mark the “end of the human race.”

This will not be some impossibly remote event like the sun blowing up in a supernova several billion years from now. According to one poll, AI researchers reckon that there’s at least a 50-50 chance that the singularity will occur by 2050. In other words, if pessimists like Hawking are right, it’s odds on that robots will dispatch humanity before the climate crisis does.

Neither the artificial intelligence that powers GPS nor the kind that controlled that frustrating toll plaza has yet attained anything like human-level intelligence — not even close. But in many ways, such dumb robots are already taking over the world. Automation is currently displacing millions of workers, including those former tollbooth operators. “Smart” machines like unmanned aerial vehicles have become an indispensable part of waging war. AI systems are increasingly being deployed to monitor our every move on the Internet, through our phones, and whenever we venture into public space. Algorithms are replacing teaching assistants in the classroom and influencing sentencing in courtrooms. Some of the loneliest among us have already become dependent on robot pets.

As AI capabilities continue to improve, the inescapable political question will become: to what extent can such technologies be curbed and regulated? Yes, the nuclear genie is out of the bottle as are other technologies — biological and chemical — capable of causing mass destruction of a kind previously unimaginable on this planet. With AI, however, that day of singularity is still in the future, even if a rapidly approaching one. It should still be possible, at least theoretically, to control such an outcome before there’s nothing to do but play the whack-a-mole game of non-proliferation after the fact.

As long as humans continue to behave badly on a global scale — war, genocide, planet-threatening carbon emissions — it’s difficult to imagine that anything we create, however intelligent, will act differently. And yet we continue to dream that some deus in machina, a god in the machine, could appear as if by magic to save us from ourselves.

Taming AI?

In the early 1940s, science fiction writer Isaac Asimov formulated his famed three laws of robotics: that robots were not to harm humans, directly or indirectly; that they must obey our commands (unless doing so violates the first law); and that they must safeguard their own existence (unless self-preservation contravenes the first two laws).

Any number of writers have attempted to update Asimov. The latest is legal scholar Frank Pasquale, who has devised four laws to replace Asimov’s three. Since he’s a lawyer not a futurist, Pasquale is more concerned with controlling the robots of today than hypothesizing about the machines of tomorrow. He argues that robots and AI should help professionals, not replace them; that they should not counterfeit humans; that they should never become part of any kind of arms race; and that their creators, controllers, and owners should always be transparent.

Pasquale’s “laws,” however, run counter to the artificial-intelligence trends of our moment. The prevailing AI ethos mirrors what could be considered the prime directive of Silicon Valley: move fast and break things. This philosophy of disruption demands, above all, that technology continuously drive down labor costs and regularly render itself obsolescent.

In the global economy, AI indeed helps certain professionals — like Facebook’s Mark Zuckerberg and Amazon’s Jeff Bezos, who just happen to be among the richest people on the planet — but it’s also replacing millions of us. In the military sphere, automation is driving boots off the ground and eyes into the sky in a coming robotic world of war. And whether it’s Siri, the bots that guide increasingly frustrated callers through automated phone trees, or the AI that checks out Facebook posts, the aim has been to counterfeit human beings — “machines like me,” as Ian McEwan called them in his 2019 novel of that title — while concealing the strings that connect the creation to its creator.

Pasquale wants to apply the brakes on a train that has not only left the station but is no longer under the control of the engine driver. It’s not difficult to imagine where such a runaway phenomenon could end up, and techno-pessimists have taken a perverse delight in describing the resulting cataclysm. In his book Superintelligence, for instance, Nick Bostrom writes about a sandstorm of self-replicating nanorobots that chokes every living thing on the planet — the so-called grey goo problem — and an AI that seizes power by “hijacking political processes.”

Since they would be interested only in self-preservation and replication, not protecting humanity or following its orders, such sentient machines would clearly tear up Asimov’s rulebook. Futurists have leapt into the breach. For instance, Ray Kurzweil, who predicted in his 2005 book The Singularity Is Near that a robot would attain sentience by about 2045, has proposed a “ban on self-replicating physical entities that contain their own codes for self-replication.” Elon Musk, another billionaire industrialist who’s no enemy of innovation, has called AI humanity’s “biggest existential threat” and has come out in favor of a ban on future killer robots.

To prevent the various worst-case scenarios, the European Union has proposed to control AI according to degree of risk. Some products that fall in the EU’s “high risk” category would have to get a kind of Good Housekeeping seal of approval (the Conformité Européenne). AI systems “considered a clear threat to the safety, livelihoods, and rights of people,” on the other hand, would be subject to an outright ban. Such clear-and-present dangers would include, for instance, biometric identification that captures personal data by such means as facial recognition, as well as versions of China’s social credit system where AI helps track individuals and evaluate their overall trustworthiness.

Techno-optimists have predictably lambasted what they consider European overreach. Such controls on AI, they believe, will put a damper on R&D and, if the United States follows suit, allow China to secure an insuperable technological edge in the field. “If the member states of the EU — and their allies across the Atlantic — are serious about competing with China and retaining their power status (as well as the quality of life they provide to their citizens),” writes entrepreneur Sid Mohasseb in Newsweek, “they need to call for a redraft of these regulations, with growth and competition being seen as at least as important as regulation and safety.”

Mohasseb’s concerns are, however, misleading. The regulators he fears so much are, in fact, now playing a game of catch-up. In the economy and on the battlefield, to take just two spheres of human activity, AI has already become indispensable.

The automation of globalization

The ongoing Covid-19 pandemic has exposed the fragility of global supply chains. The world economy nearly ground to a halt in 2020 for one major reason: the health of human workers. The spread of infection, the risk of contagion, and the efforts to contain the pandemic all removed workers from the labor force, sometimes temporarily, sometimes permanently. Factories shut down, gaps widened in transportation networks, and shops lost business to online sellers.

A desire to cut labor costs, a major contributor to a product’s price tag, has driven corporations to look for cheaper workers overseas. For such cost-cutters, eliminating workers altogether is an even more beguiling prospect. Well before the pandemic hit, corporations had begun to turn to automation. By 2030, up to 45 million U.S. workers will be displaced by robots. The World Bank estimates that they will eventually replace an astounding 85% of the jobs in Ethiopia, 77% in China, and 72% in Thailand.

The pandemic not only accelerated this trend, but increased economic inequality as well because, at least for now, robots tend to replace the least skilled workers. In a survey conducted by the World Economic Forum, 43% of businesses indicated that they would reduce their workforces through the increased use of technology. “Since the pandemic hit,” reports NBC News,

“food manufacturers ramped up their automation, allowing facilities to maintain output while social distancing. Factories digitized controls on their machines so they could be remotely operated by workers working from home or another location. New sensors were installed that can flag, or predict, failures, allowing teams of inspectors operating on a schedule to be reduced to an as-needed maintenance crew.”

In an ideal world, robots and AI would increasingly take on all the dirty, dangerous, and demeaning jobs globally, freeing humans to do more interesting work. In the real world, however, automation is often making jobs dirtier and more dangerous by, for instance, speeding up the work done by the remaining human labor force. Meanwhile, robots are beginning to encroach on what’s usually thought of as the more interesting kinds of work done by, for example, architects and product designers.

In some cases, AI has even replaced managers. A contract driver for Amazon, Stephen Normandin, discovered that the AI system that monitored his efficiency as a deliveryman also used an automated email to fire him when it decided he wasn’t up to snuff. Jeff Bezos may be stepping down as chief executive of Amazon, but robots are quickly climbing its corporate ladder and could prove at least as ruthless as he’s been, if not more so.

Mobilizing against such a robot replacement army could prove particularly difficult as corporate executives aren’t the only ones putting out the welcome mat. Since fully automated manufacturing in “dark factories” doesn’t require lighting, heating, or a workforce that commutes to the site by car, that kind of production can reduce a country’s carbon footprint — a potentially enticing factor for “green growth” advocates and politicians desperate to meet their Paris climate targets.

It’s possible that sentient robots won’t need to devise ingenious stratagems for taking over the world. Humans may prove all too willing to give semi-intelligent machines the keys to the kingdom.

The new fog of war

The 2020 war between Armenia and Azerbaijan proved to be unlike any previous military conflict. The two countries had been fighting since the 1980s over a disputed mountain enclave, Nagorno-Karabakh. Following the collapse of the Soviet Union, Armenia proved the clear victor in the conflict that followed in the early 1990s, occupying not only the disputed territory but parts of Azerbaijan as well.

In September 2020, as tensions mounted between the two countries, Armenia was prepared to defend those occupied territories with a well-equipped army of tanks and artillery. Thanks to its fossil-fuel exports, Azerbaijan, however, had been spending considerably more than Armenia on the most modern version of military preparedness. Still, Armenian leaders often touted their army as the best in the region. Indeed, according to the 2020 Global Militarization Index, that country was second only to Israel in terms of its level of militarization.

Yet Azerbaijan was the decisive winner in the 2020 conflict, retaking possession of Nagorno-Karabakh. The reason: automation.

“Azerbaijan used its drone fleet — purchased from Israel and Turkey — to stalk and destroy Armenia’s weapons systems in Nagorno-Karabakh, shattering its defenses and enabling a swift advance,” reported the Washington Post‘s Robyn Dixon. “Armenia found that air defense systems in Nagorno-Karabakh, many of them older Soviet systems, were impossible to defend against drone attacks, and losses quickly piled up.”

Armenian soldiers, notorious for their fierceness, were spooked by the semi-autonomous weapons regularly hovering above them. “The soldiers on the ground knew they could be hit by a drone circling overhead at any time,” noted Mark Sullivan in the business magazine Fast Company. “The drones are so quiet they wouldn’t hear the whir of the propellers until it was too late. And even if the Armenians did manage to shoot down one of the drones, what had they really accomplished? They’d merely destroyed a piece of machinery that would be replaced.”

The United States pioneered the use of drones against various non-state adversaries in its war on terror in Afghanistan, Iraq, Pakistan, Somalia, and elsewhere across the Greater Middle East and Africa. But in its 2020 campaign, Azerbaijan was using the technology to defeat a modern army. Now, every military will feel compelled not only to integrate ever more powerful AI into its offensive capabilities, but also to defend against the new technology.

To stay ahead of the field, the United States is predictably pouring money into the latest technologies. The new Pentagon budget includes the “largest ever” request for R&D, including a down payment of nearly a billion dollars for AI. As TomDispatch regular Michael Klare has written, the Pentagon has even taken a cue from the business world by beginning to replace its war managers — generals — with a huge, interlinked network of automated systems known as the Joint All-Domain Command-and-Control (JADC2).

The result of any such handover of greater responsibility to machines will be the creation of what mathematician Cathy O’Neil calls “weapons of math destruction.” In the global economy, AI is already replacing humans up and down the chain of production. In the world of war, AI could in the end annihilate people altogether, whether thanks to human design or computer error.

After all, during the Cold War, only last-minute interventions by individuals on both sides ensured that nuclear “missile attacks” detected by Soviet and American computers — which turned out to be birds, unusual weather, or computer glitches — didn’t precipitate an all-out nuclear war. Take the human being out of the chain of command and machines could carry out such a genocide all by themselves.

And the fault, dear reader, would lie not in our robots but in ourselves.

Robots of last resort In my new novel Songlands, humanity faces a terrible set of choices in 2052. Having failed to control carbon emissions for several decades, the world is at the point of no return, too late for conventional policy fixes. The only thing left is a scientific Hail Mary pass, an experiment in geoengineering that could fail or, worse, have terrible unintended consequences. The AI responsible for ensuring the success of the experiment may or may not be trustworthy. My dystopia, like so many others, is really about a narrowing of options and a whittling away of hope, which is our current trajectory.

And yet, we still have choices. We could radically shift toward clean energy and marshal resources for the whole world, not just its wealthier portions, to make the leap together. We could impose sensible regulations on artificial intelligence. We could debate the details of such programs in democratic societies and in participatory multilateral venues.

Or, throwing up our hands because of our unbridgeable political differences, we could wait for a post-Trumpian savior to bail us out. Techno-optimists hold out hope that automation will set us free and save the planet. Laissez-faire enthusiasts continue to believe that the invisible hand of the market will mysteriously direct capital toward planet-saving innovations instead of SUVs and plastic trinkets.

These are illusions. As I write in Songlands, we have always hoped for someone or something to save us: “God, a dictator, technology. For better or worse, the only answer to our cries for help is an echo.”

In the end, robots won’t save us. That’s one piece of work that can’t be outsourced or automated. It’s a job that only we ourselves can do.

John Feffer writes regularly for TomDispatch (where this article originated). He is the author of the dystopian novel Splinterlands and the director of Foreign Policy In Focus at the Institute for Policy Studies. Frostlands, a Dispatch Books original, is volume two of his Splinterlands series, and the final novel in the trilogy, Songlands, has just been published. He has also written The Pandemic Pivot.

Copyright ©2021 John Feffer — distributed by Agence Global

—————-

Released: 22 July 2021

Word Count: 2,956

—————-

Aviva Chomsky, “Migration is not the crisis: what Washington could do in Central America”

July 19, 2021 - TomDispatch

Earlier this month, a Honduran court found David Castillo, a U.S.-trained former Army intelligence officer and the head of an internationally financed hydroelectric company, guilty of the 2016 murder of celebrated Indigenous activist Berta Cáceres. His company was building a dam that threatened the traditional lands and water sources of the Indigenous Lenca people. For years, Cáceres and her organization, the Council of Popular and Indigenous Organizations of Honduras, or COPINH, had led the struggle to halt that project. It turned out, however, that Cáceres’s international recognition — she won the prestigious Goldman Environmental Prize in 2015 — couldn’t protect her from becoming one of the dozens of Latin American Indigenous and environmental activists killed annually.

Yet when President Joe Biden came into office with an ambitious “Plan for Security and Prosperity in Central America,” he wasn’t talking about changing policies that promoted big development projects against the will of local inhabitants. Rather, he was focused on a very different goal: stopping migration. His plan, he claimed, would address its “root causes.” Vice President Kamala Harris was even blunter when she visited Guatemala, instructing potential migrants: “Do not come.”

As it happens, more military and private development aid of the sort Biden’s plan calls for (and Harris boasted about) will neither stop migration nor help Central America. It’s destined, however, to spark yet more crimes like Cáceres’s murder. There are other things the United States could do that would aid Central America. The first might simply be to stop talking about trying to end migration.

How can the United States help Central America? Biden and Harris are only recycling policy prescriptions that have been around for decades: promote foreign investment in Central America’s export economy, while building up militarized “security” in the region. In truth, it’s the very economic model the United States has imposed there since the nineteenth century, which has brought neither security nor prosperity to the region (though it’s brought both to U.S. investors there). It’s also the model that has displaced millions of Central Americans from their homes and so is the fundamental cause of what, in this country, is so often referred to as the “crisis” of immigration.

In the nineteenth and early twentieth centuries, the U.S. began imposing that very model to overcome what officials regularly described as Central American “savagery” and “banditry.” The pattern continued as Washington found a new enemy, communism, to battle there in the second half of the last century. Now, Biden promises that the very same policies — foreign investment and eternal support for the export economy — will end migration by attacking its “root causes”: poverty, violence, and corruption. (Or call them “savagery” and “banditry,” if you will.) It’s true that Central America is plagued by poverty, violence, and corruption, but if Biden were willing to look at the root causes of his root causes, he might notice that his prescriptions aren’t solutions to such problems but their source.

Stopping migration from Central America is no more a legitimate policy goal than was stopping savagery, banditry, or communism in the twentieth century. In fact, what Washington policymakers called savagery (Indigenous people living autonomously on their lands), banditry (the poor trying to recover what the rich had stolen from them), and communism (land reform and support for the rights of oppressed workers and peasants) were actually potential solutions to the very poverty, violence, and corruption imposed by the U.S.-backed ruling elites in the region. And maybe migration is likewise part of Central Americans’ struggle to solve these problems. After all, migrants working in this country send back more money in remittances to their families in Central America than the United States has ever given in foreign aid.

What, then, would a constructive U.S. policy towards Central America look like?

Perhaps the most fundamental baseline of foreign policy should be that classic summary of the Hippocratic Oath: do no harm. As for doing some good, before the subject can even be discussed, there needs to be an acknowledgement that so much of what we’ve done to Central America over the past 200 years has been nothing but harm.

The United States could begin by assuming historical responsibility for the disasters it’s created there. After the counterinsurgency wars of the 1980s, the United Nations sponsored truth commissions in El Salvador and Guatemala to uncover the crimes committed against civilian populations there. Unfortunately, those commissions didn’t investigate Washington’s role in funding and promoting war crimes in the region.

Maybe what’s now needed is a new truth commission to investigate historic U.S. crimes in Central America. In reality, the United States owes those small, poor, violent, and corrupt countries reparations for the damages it’s caused over all these years. Such an investigation might begin with Washington’s long history of sponsoring coups, military “aid,” armed interventions, massacres, assassinations, and genocide.

The U.S. would have to focus as well on the impacts of ongoing economic aid since the 1980s, aimed at helping U.S. corporations at the expense of the Central American poor. It could similarly examine the role of debt and the U.S.-Central America Free Trade Agreement in fostering corporate and elite interests. And don’t forget the way the outsized U.S. contribution to greenhouse gas emissions — this country is, of course, the largest such emitter in history — and the climate change it drives have contributed to the destruction of livelihoods in Central America. Finally, it could investigate how our border and immigration policies directly contribute to keeping Central America poor, violent, and corrupt, in the name of stopping migration.

Constructive options for U.S. policy in Central America

Providing vaccines: Even as Washington rethinks the fundamentals of this country’s policies there, it could take immediate steps on one front, the Covid-19 pandemic, which has been devastating the region. Central America is in desperate need of vaccines, syringes, testing materials, and personal protective equipment. A history of underfunding, debt, and privatization, often due directly or indirectly to U.S. policy, has left Central America’s healthcare systems in shambles. While Latin America as a whole has been struggling to acquire the vaccines it needs, Honduras, Guatemala, and Nicaragua rank at the very bottom of doses administered. If the United States actually wanted to help Central America, the emergency provision of what those countries need to get vaccines into arms would be an obvious place to start.

Reversing economic exploitation: Addressing the structural and institutional bases of economic exploitation could also have a powerful impact. First, we could undo the harmful provisions of the 2005 Central America Free Trade Agreement (CAFTA). Yes, Central American governments beholden to Washington did sign on to it, but that doesn’t mean that the agreement benefited the majority of the inhabitants in the region. In reality, what CAFTA did was throw open Central American markets to U.S. agricultural exports, in the process undermining the livelihoods of small farmers there.

CAFTA also gave a boost to the maquiladora or export-processing businesses, lending an all-too-generous hand to textile, garment, pharmaceutical, electronics, and other industries that regularly scour the globe for the cheapest places to manufacture their goods. In the process, it created mainly the kind of low-quality jobs that corporations can easily move anytime in an ongoing global race to the bottom.

Central American social movements have also vehemently protested CAFTA provisions that undermine local regulations and social protections, while privileging foreign corporations. At this point, local governments in that region can’t even enforce the most basic laws they’ve passed to regulate such deeply exploitative foreign investors.

Another severe restriction that prevents Central American governments from pursuing economic policies in the interest of their populations is government debt. Private banks lavished loans on dictatorial governments in the 1970s, then pumped up interest rates in the 1980s, causing those debts to balloon. The International Monetary Fund stepped in to bail out the banks, imposing debt restructuring programs on already-impoverished countries — in other words, making the poor pay for the profligacy of the wealthy.

For real economic development, governments need the resources to fund health, education, and welfare. Unsustainable and unpayable debt (compounded by ever-growing interest) makes it impossible for such governments to dedicate resources where they’re truly needed. A debt jubilee would be a crucial step towards restructuring the global economy and shifting the stream of global resources that currently flows so strongly from the poorest to the richest countries.

Now, add another disastrous factor to this equation: the U.S. “drug wars” that have proven to be a key factor in the spread of violence, displacement, and corruption in Central America. The focus of the drug war on Mexico in the early 2000s spurred an orgy of gang violence there, while pushing the trade south into Central America. The results have been disastrous. As drug traffickers moved in, they brought violence, land grabs, and capital for new cattle and palm-oil industries, drawing in corrupt politicians and investors. Pouring arms and aid into the drug wars that have exploded in Central America has only made trafficking even more corrupt, violent, and profitable.

Reversing climate change: In recent years, ever more extreme weather in Central America’s “dry corridor,” running from Guatemala through El Salvador, Honduras, and Nicaragua, has destroyed homes, farms, and livelihoods, and this climate-change-induced trend is only worsening by the year. While the news tends to present ongoing drought, punctuated by ever more frequent and violent hurricanes and tropical storms, as well as increasingly disastrous flooding, as so many individual occurrences, their heightened frequency is certainly a result of climate change. And about a third of Central America’s migrants directly cite extreme weather as the reason they were forced to leave their homes. Climate change is, in fact, just what the U.S. Department of Defense all-too-correctly termed a “threat multiplier” that contributes to food and water scarcity, land conflicts, unemployment, violence, and other causes of migration.

The United States has, of course, played and continues to play an outsized role in contributing to climate change. And, in fact, we continue to emit far more CO2 per person than any other large country. We also produce and export large amounts of fossil fuels — the U.S., in fact, is one of the world’s largest exporters as well as one of the largest consumers. And we continue to fund and promote fossil-fuel-dependent development at home and abroad. One of the best ways the United States could help Central America would be to focus time, energy, and money on stopping the burning of fossil fuels.

Migration as a problem solver Isn’t it finally time that the officials and citizens of the United States recognized the role migration plays in Central American economies? Where U.S. economic development recipes have failed so disastrously, migration has been the response and, for many Central Americans, the only available way to survive.

One in four Guatemalan families relies on remittances from relatives working in the United States and such monies account for about half of their income. President Biden may have promised Central America $4 billion in aid over four years, but Guatemala alone receives $9 billion a year in such remittances. And unlike government aid, much of which ends up in the pockets of U.S. corporations, local entrepreneurs, and bureaucrats of various sorts, remittances go directly to meet the needs of ordinary households.

At present, migration is a concrete way that Central Americans are trying to solve their all-too-desperate problems. Since the nineteenth century, Indigenous and peasant communities have repeatedly sought self-sufficiency and autonomy, only to be displaced by U.S. plantations in the name of progress. They’ve tried organizing peasant and labor movements to fight for land reform and workers’ rights, only to be crushed by U.S.-trained and sponsored militaries in the name of anti-communism. With other alternatives foreclosed, migration has proven to be a twenty-first-century form of resistance and survival.

If migration can be a path to overcome economic crises, then instead of framing Washington’s Central American policy as a way to stop it, the United States could reverse course and look for ways to enhance migration’s ability to solve problems.

Jason DeParle aptly titled his recent book on migrant workers from the Philippines A Good Provider is One Who Leaves. “Good providers should not have to leave,” responded the World Bank’s Dilip Ratha, “but they should have the option.” As Ratha explains,

“Migrants benefit their destination countries. They provide essential skills that may be missing and fill jobs that native-born people may not want to perform. Migrants pay taxes and are statistically less prone to commit crimes than native-born people… Migration benefits the migrant and their extended family and offers the potential to break the cycle of poverty. For women, migration elevates their standing in the family and the society. For children, it provides access to healthcare, education, and a higher standard of living. And for many countries of origin, remittances provide a lifeline in terms of external, counter-cyclical financing.”

Migration can also have terrible costs. Families are separated, while many migrants face perilous conditions, including violence, detention, and potentially death on their journeys, not to speak of inadequate legal protection, housing, and working conditions once they reach their destination. This country could do a lot to mitigate such costs, many of which are under its direct control. The United States could open its borders to migrant workers and their families, grant them full legal rights and protections, and raise the minimum wage.

Would such policies lead to a large upsurge in migration from Central America? In the short run, they might, given the current state of that region under conditions created and exacerbated by Washington’s policies over the past 40 years. In the longer run, however, easing the costs of migration actually could end up easing the structural conditions that cause it in the first place.

Improving the safety, rights, and working conditions of migrants would help Central America far more than any of the policies Biden and Harris are proposing. More security and higher wages would enable migrants to provide greater support for families back home. As a result, some would return home sooner. Smuggling and human trafficking rings, which take advantage of illegal migration, would wither from disuse. The enormous resources currently aimed at policing the border could be shifted to immigrant services. If migrants could come and go freely, many would go back to some version of the circular migration pattern that prevailed among Mexicans before the militarization of the border began to undercut that option in the 1990s. Long-term family separation would be reduced. Greater access to jobs, education, and opportunity has been shown to be one of the most effective anti-gang strategies.

In other words, there’s plenty the United States could do to develop more constructive policies towards Central America and its inhabitants. That, however, would require thinking far more deeply about the “root causes” of the present catastrophe than Biden, Harris, and crew seem willing to do. In truth, the policies of this country bear an overwhelming responsibility for creating the very structural conditions that cause the stream of migrants that both Democrats and Republicans have decried, turning the act of simple survival into an eternal “crisis” for those very migrants and their families. A change in course is long overdue.


Aviva Chomsky writes regularly for TomDispatch (where this article originated). She is professor of history and coordinator of Latin American studies at Salem State University in Massachusetts. Her new book, Central America’s Forgotten History: Revolution, Violence, and the Roots of Migration, was published in April.

Copyright ©2021 Aviva Chomsky — distributed by Agence Global

—————-

Released: 19 July 2021

Word Count: 2,532

—————-

Andrea Mazzarino, “Who authorized America’s wars?”

July 15, 2021 - TomDispatch

Sometimes, as I consider America’s never-ending wars of this century, I can’t help thinking of those lyrics from the Edwin Starr song, “(War, huh) Yeah! (What is it good for?) Absolutely nothing!” I mean, remind me, what good have those disastrous, failed, still largely ongoing conflicts done for this country?  Or for you?  Or for me?

For years and years, what came to be known as America’s “war on terror” (and later just its “forever wars”) enjoyed remarkable bipartisan support in Congress, not to mention the country at large. Over nearly two decades, four presidents from both parties haven’t hesitated to exercise their power to involve our military in all sorts of ways in at least 85 countries around the world in the name of defeating “terrorism” or “violent extremism.” Such interventions have included air strikes against armed groups in seven countries, direct combat against such groups in 12 countries, military exercises in 41 countries, and training or assistance to local military, police, or border patrol units in 79 countries. And that’s not even to mention the staggering number of U.S. military bases around the world where counterterrorism operations can be conducted, the massive arms sales to foreign governments, or all the additional deployments of this country’s Special Operations forces.

Providing the thinnest of legal foundations for all of this have been two ancient acts of Congress. The first was the authorization for the use of military force (AUMF) that allowed the president to act against “those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons.” Passed in the week after those attacks on New York City and Washington, D.C., it led, of course, to the disastrous war in Afghanistan. That bill’s lone opponent in the House, Representative Barbara Lee (D–CA), faced death threats from the public for her vote, though she stood by it, fearing all too correctly that such a law would sanction endless wars abroad (as, of course, it did).

The second AUMF passed on October 15, 2002, by a 77-23 vote in the Senate. Under the false rationale that Saddam Hussein’s Iraq harbored weapons of mass destruction (it didn’t), that AUMF gave President George W. Bush and his crew a green light to invade Iraq and topple its regime. Last month, the House finally voted 268-161 (including 49 Republican yes votes) to repeal the second of those authorizations.

Thinking back to when America’s “forever wars” first began, it’s hard to imagine how we could still be fighting in Iraq and Syria under the same loose justification of a war on terror almost two decades later or that the 2001 AUMF, untouched by Congress, still stands, providing the fourth president since the war on terror began with an excuse for actions of all sorts.

I remember watching in March 2003 from my home in northern California as news stations broadcast bombs going off over Baghdad. I’d previously attended protests around San Francisco, shouting my lungs out about the potentially disastrous consequences of invading a country based on what, even then, seemed like an obvious lie. Meanwhile, little did I know that the Afghan War authorization I had indeed supported, as a way to liberate the women of that country and create a democracy from an abusive state, would still be disastrously ongoing nearly 20 years later.

Nor did I imagine that, in 2011, having grasped my mistake when it came to the Afghan War, I would co-found Brown University’s Costs of War Project; nor that, about a decade into that war, I would be treating war-traumatized veterans and their families as a psychotherapist, even as I became the spouse of a Navy submariner. I would spend the second decade of the war on terror shepherding my husband and our two young children through four military moves and countless deployments, our lives breathless and harried by the outlandish pace of the disastrous forever (and increasingly wherever) wars that had come to define America’s global presence in the twenty-first century.

Amid all the talk about Joe Biden’s Afghan withdrawal decision, which came “from the gut,” according to an official close to the president, it’s easy to forget that this country continues to fight some of those very same wars.

What keeps us safe? Take, for example, late last month when President Biden ordered “defensive” airstrikes in Iraq and Syria against reportedly Iran-backed Iraqi militia groups. Those groups were thought to be responsible for a series of at least five drone attacks on weapons storage and operational bases used by U.S. troops in Iraq and Syria. The June American air strikes supposedly killed four militia members, though there have been reports that one hit a housing complex, killing a child and wounding three other civilians (something that has yet to be verified). An unnamed “senior administration official” explained: “We have a responsibility to demonstrate that attacking Americans carries consequences, and that is true whether or not those attacks inflict casualties.” He did not, however, explain what those American troops were doing in the first place at bases in Iraq and Syria.

Note that such an act was taken on presidential authority alone, with Congress thoroughly sidelined as it has been since it passed those AUMFs so long ago. To be sure, some Americans still argue that such preemptive attacks — and really, any military buildups whatsoever — are precisely what keep Americans safe.

My husband, a Navy officer, has served on three nuclear and ballistic submarines and one battleship. He’s also built a nearly 20-year career on the philosophy that the best instrument of peace, should either of the other two great powers on this planet step out of line, is the concept of mutually-assured destruction — the possibility, that is, that a president would order not air strikes in Syria, but nuclear strikes somewhere.

He and I argue about this regularly. How, I ask him, can any weapons, no less nuclear ones, ever be seen as instruments of safety? (Though living in the country with the most armed citizens on the planet, I know that this isn’t exactly a winning argument domestically.) I mean, consider the four years we’ve just lived through! Consider the hands our nuclear arsenal was in from 2017 to 2020!

My husband always simply looks at me as if he knows so much more than I do about this. Yet the mere hint of a plan for “peace” based on a world-ending possibility doesn’t exactly put me at ease, nor does a world in which an American president can order air strikes more or less anywhere on the planet without the backing of anyone else, Congress included.

Every time my husband leaves home to go to some bunker or office where he would be among the first to be sheltered from a nuclear attack, my gut clenches. I feel the hopelessness of what would happen if we ever reached that point of no return where the only option might be to strike back because we ourselves were about to die. It would be a “solution” in which just those in power might remain safe. Meanwhile, our more modest preemptive attacks against other militaries and armed groups in distant lands exact a seldom-recognized toll in blood and treasure.

Every time I hear about preemptive strikes like those President Biden ordered last month in countries we’re not even officially at war with, attacks that were then sanctioned across most of the political spectrum in Washington from Democratic House Speaker Nancy Pelosi to Oklahoma Republican Senator Jim Inhofe, I wonder: How many people died in those attacks? Whose lives in those target areas were destroyed by uncertainty, fear, and the prospect of long-term anxiety?

In addition, given my work as a therapist with vets, I always wonder how the people who carried out such strikes are feeling right now. I know from experience that just following such life-ending orders can create a sense of internal distress that changes you in ways almost as consequential as losing a limb or taking a bullet.

How our wars kill at home For years now, my colleagues and I at the Costs of War Project have struggled to describe and quantify the human costs of America’s never-ending twenty-first-century wars. All told, we’ve estimated that more than 801,000 people died in fighting among U.S., allied, and opposing troops and police forces. And that doesn’t include indirect deaths due to wrecked healthcare systems, malnutrition, the uprooting of populations, and the violence that continues to plague traumatized families in those war zones (and here at home as well).

According to a stunning new report by Boston University’s Ben Suitt, the big killer of Americans engaged in the war on terror has not, in fact, been combat, but suicide, which has so far claimed the lives of 30,177 veterans and active servicemembers. Suicide rates among post-9/11 war veterans are higher than for any cohort of veterans since before World War II. Among those aged 18 to 35 (the oldest of whom weren’t even of voting age when we first started those never-ending wars and the youngest of whom weren’t yet born), the rate has increased by a whopping 76% since 2005.

And if you think that those most injured from their service are the ones coming home after Iraq and Afghanistan, consider this: over the past two decades, suicide rates have increased most sharply among those who have never even been deployed to a combat zone or have been deployed just once.

It’s hard to say why even those who don’t fight are killing themselves so far from America’s distant battlefields. As a psychotherapist who has seen my share of veterans who attempted to kill themselves or — later — succeeded in doing so, I can say that two key predictors of that final, desperate act are hopelessness and a sense that you have no legitimate contribution to make to others.

As Suitt points out, about 42% of Americans are now either unaware of the fact that their country is still fighting wars in the Greater Middle East and Africa or think that the war on terror is over. Consider that for a moment. What does it mean to be fighting wars for a country in which a near majority of the population is unaware that you’re even doing so?

As a military spouse whose partner has not been deployed to a combat zone, my family and I still share the burdens of America’s forever wars in concrete ways: more frequent and longer deployments with shorter breaks, more abusive and all-encompassing command structures, and very little clear sense of what it is this country could possibly be fighting for anymore or what the end game might be.

If strikes like the ones President Biden authorized last month reflect anything, it’s that there are few ways — certainly not Congress — of reining in our commander in chief from sending Americans to harm and be harmed.

“Are soldiers killers?” I recall lying awake in 1991, at age 12, my stomach in knots, thinking about the first display of pyrotechnics I can remember, when President George H.W. Bush authorized strikes against Saddam Hussein’s Iraq in what became known as the First Gulf War. I told my father then, “I can’t sleep because I think that something bad is going to happen!” I didn’t know what, but those balls of fire falling on Baghdad on my New Jersey TV screen seemed consequential indeed.

Where were they landing? On whom? What was going to happen to our country? My father, who used a minor college football injury to dodge the Vietnam draft and has supported every war since then, shrugged, patted me on the back, and said he didn’t know, but that I shouldn’t worry too much about it.

As a parent myself now, I can still remember what it was like to first consider that people might kill others. As a result, I try to keep a conversation going with my own children as they start to grapple with the existence of evil.

Recently, our six-year-old son, excited to practice his newfound reading skills, came across a World War II military history book in my husband’s office and found photos of both Nazi soldiers and Jewish concentration camp prisoners. He stared at the gaunt bodies and haunted eyes of those prisoners. After a first-grade-level conversation about war and hatred, he suddenly pointed at Nazi soldiers in one photo and asked, “Are soldiers killers?” My husband and I flinched. And then he asked: “Why do people kill?”

Over and over, as such questions arise, I tell my son that people die in wars because so many of us turn our backs on what’s going on in the world we live in. I’m all too aware that we stop paying attention to what elected officials do because we’ve decided we like them (or hate them but can’t be bothered by them). I tell him that we’re going to keep reading the news and talking about it, because my little family, whatever our arguments, agrees that Americans don’t care enough about what war does to the bodies and minds of those who live through it.

Here’s the truth of it: we shouldn’t be spending this much time, money, and blood on conflicts whose end games are left to the discretion of whoever our increasingly shaky electoral system places in this country’s highest office. Until we pressure lawmakers to repeal that 2001 AUMF and end the forever conflicts that have gone with it, America’s wars will ensure that our democracy’s promises of peace, self-defense, and justice under the rule of law ring hollow.

Don’t doubt it for a second. War is a cancer on our democracy.

Andrea Mazzarino writes regularly for TomDispatch (where this article originated). She co-founded Brown University’s Costs of War Project. She has held various clinical, research, and advocacy positions, including at a Veterans Affairs PTSD Outpatient Clinic, with Human Rights Watch, and at a community mental health agency. She is the co-editor of War and Health: The Medical Consequences of the Wars in Iraq and Afghanistan.

Copyright ©2021 Andrea Mazzarino — distributed by Agence Global

—————-

Released: 15 July 2021

Word Count: 2,301

—————-

Michael Klare, “On the brink in 2026: U.S.-China near-war status report”

July 13, 2021 - TomDispatch

It’s the summer of 2026, five years after the Biden administration identified the People’s Republic of China as the principal threat to U.S. security and Congress passed a raft of laws mandating a society-wide mobilization to ensure permanent U.S. domination of the Asia-Pacific region. Although major armed conflict between the United States and China has not yet broken out, numerous crises have erupted in the western Pacific and the two countries are constantly poised for war. International diplomacy has largely broken down, with talks over climate change, pandemic relief, and nuclear nonproliferation at a standstill. For most security analysts, it’s not a matter of if a U.S.-China war will erupt, but when.

Does this sound fanciful? Not if you read the statements coming out of the Department of Defense (DoD) and the upper ranks of Congress these days.

“China poses the greatest long-term challenge to the United States and strengthening deterrence against China will require DoD to work in concert with other instruments of national power,” the Pentagon’s 2022 Defense Budget Overview asserts. “A combat-credible Joint Force will underpin a whole-of-nation approach to competition and ensure the Nation leads from a position of strength.”  

On this basis, the Pentagon requested $715 billion in military expenditures for 2022, with a significant chunk of those funds to be spent on the procurement of advanced ships, planes, and missiles intended for a potential all-out, “high-intensity” war with China. An extra $38 billion was sought for the design and production of nuclear weapons, another key aspect of the drive to overpower China.

Democrats and Republicans in Congress, contending that even such sums were insufficient to ensure continued U.S. superiority vis-à-vis that country, are pressing for further increases in the 2022 Pentagon budget. Many have also endorsed the EAGLE Act, short for Ensuring American Global Leadership and Engagement — a measure intended to provide hundreds of billions of dollars for increased military aid to America’s Asian allies and for research on advanced technologies deemed essential for any future high-tech arms race with China.

Imagine, then, that such trends only gain momentum over the next five years. What will this country be like in 2026? What can we expect from an intensifying new Cold War with China that, by then, could be on the verge of turning hot?

Taiwan 2026: perpetually on the brink Crises over Taiwan have erupted on a periodic basis since the start of the decade, but now, in 2026, they seem to be occurring every other week. With Chinese bombers and warships constantly probing Taiwan’s outer defenses and U.S. naval vessels regularly maneuvering close to their Chinese counterparts in waters near the island, the two sides never seem far from a shooting incident that would have instantaneous escalatory implications. So far, no lives have been lost, but planes and ships from both sides have narrowly missed colliding again and again. On each occasion, forces on both sides have been placed on high alert, causing jitters around the world.

The tensions over that island have largely stemmed from incremental efforts by Taiwanese leaders, mostly officials of the Democratic Progressive Party (DPP), to move their country from autonomous status as part of China to full independence. Such a move is bound to provoke a harsh, possibly military response from Beijing, which considers the island a renegade province.

The island’s status has plagued U.S.-China relations for decades. When, on January 1, 1979, Washington first recognized the People’s Republic of China, it agreed to withdraw diplomatic recognition from the Taiwanese government and cease formal relations with its officials. Under the Taiwan Relations Act of 1979, however, U.S. officials were obligated to conduct informal relations with Taipei. The act stipulated as well that any move by Beijing to alter Taiwan’s status by force would be considered “a threat to the peace and security of the Western Pacific area and of grave concern to the United States” — a stance known as “strategic ambiguity,” as it neither guaranteed American intervention nor ruled it out.

In the ensuing decades, the U.S. sought to avoid conflict in the region by persuading Taipei not to make any overt moves toward independence and by minimizing its ties to the island, thereby discouraging aggressive moves by China. By 2021, however, the situation had been remarkably transformed. Once under the exclusive control of the Nationalist Party that had been defeated by communist forces on the Chinese mainland in 1949, Taiwan became a multiparty democracy in 1987. It has since witnessed the steady rise of pro-independence forces, led by the DPP. At first, the mainland regime sought to woo the Taiwanese with abundant trade and tourism opportunities, but the excessive authoritarianism of its Communist Party alienated many island residents — especially younger ones — only adding momentum to the drive for independence. This, in turn, has prompted Beijing to switch tactics from courtship to coercion by constantly sending its combat planes and ships into Taiwanese air and sea space.

Trump administration officials, less concerned about alienating Beijing than their predecessors, sought to bolster ties with the Taiwanese government in a series of gestures that Beijing found threatening and that were only expanded in the early months of the Biden administration. At that time, growing hostility to China led many in Washington to call for an end to “strategic ambiguity” and the adoption of an unequivocal pledge to defend Taiwan if it were to come under attack from the mainland.

“I think the time has come to be clear,” Senator Tom Cotton of Arkansas declared in February 2021. “Replace strategic ambiguity with strategic clarity that the United States will come to the aid of Taiwan if China was to forcefully invade Taiwan.”

The Biden administration was initially reluctant to adopt such an inflammatory stance, since it meant that any conflict between China and Taiwan would automatically become a U.S.-China war with nuclear ramifications. In April 2022, however, under intense congressional pressure, the Biden administration formally abandoned “strategic ambiguity” and vowed that a Chinese invasion of Taiwan would prompt an immediate American military response. “We will never allow Taiwan to be subjugated by military force,” President Biden declared at that time, a striking change in a longstanding American strategic position.

The DoD would soon announce the deployment of a permanent naval squadron to the waters surrounding Taiwan, including an aircraft carrier and a supporting flotilla of cruisers, destroyers, and submarines. Ely Ratner, President Biden’s top envoy for the Asia-Pacific region, first outlined plans for such a force in June 2021 during testimony before the Senate Armed Services Committee. A permanent U.S. presence, he suggested, would serve to “deter, and, if necessary, deny a fait accompli scenario” in which Chinese forces quickly attempted to overwhelm Taiwan. Although described as tentative then, it would, in fact, become formal policy following President Biden’s April 2022 declaration on Taiwan and a brief exchange of warning shots between a Chinese destroyer and a U.S. cruiser just south of the Taiwan Strait.

Today, in 2026, with a U.S. naval squadron constantly sailing in waters near Taiwan and Chinese ships and planes constantly menacing the island’s outer defenses, a potential Sino-American military clash never seems far off. Should that occur, what would happen is impossible to predict, but most analysts now assume that both sides would immediately fire their advanced missiles — many of them hypersonic (that is, exceeding five times the speed of sound) — at their opponent’s key bases and facilities. This, in turn, would provoke further rounds of air and missile strikes, probably involving attacks on Chinese and Taiwanese cities as well as U.S. bases in Japan, Okinawa, South Korea, and Guam. Whether such a conflict could be contained at the non-nuclear level remains anyone’s guess.

The incremental draft In the meantime, planning for a U.S.-China war-to-come has dramatically reshaped American society and institutions.  The “Forever Wars” of the first two decades of the twenty-first century had been fought entirely by an All-Volunteer Force (AVF) that typically endured multiple tours of duty, in particular in Iraq and Afghanistan. The U.S. was able to sustain such combat operations (while continuing to maintain a substantial troop presence in Europe, Japan, and South Korea) with 1.4 million servicemembers because American forces enjoyed uncontested control of the airspace over its war zones, while China and Russia remained wary of engaging U.S. forces in their own neighborhoods.

Today, in 2026, however, the picture looks radically different: China, with an active combat force of two million soldiers, and Russia, with another million — both militaries equipped with advanced weaponry not widely available to them in the early years of the century — pose a far more formidable threat to U.S. forces. An AVF no longer looks particularly viable, so plans for its replacement with various forms of conscription are already being put into place.

Bear in mind, however, that in a future war with China and/or Russia, the Pentagon doesn’t envision large-scale ground battles reminiscent of World War II or the Iraq invasion of 2003. Instead, it expects a series of high-tech battles involving large numbers of ships, planes, and missiles. This, in turn, limits the need for vast conglomerations of ground troops, or “grunts,” as they were once labeled, but increases the need for sailors, pilots, missile launchers, and the kinds of technicians who can keep so many high-tech systems at top operational capacity.

As early as October 2020, during the final months of the Trump administration, Secretary of Defense Mark Esper was already calling for a doubling of the size of the U.S. naval fleet, from approximately 250 to 500 combat vessels, to meet the rising threat from China. Clearly, however, there would be no way for a force geared to a 250-ship navy to sustain one double that size. Even if some of the additional ships were “uncrewed,” or robotic, the Navy would still have to recruit several hundred thousand more sailors and technicians to supplement the 330,000 then in the force. Much the same could be said of the U.S. Air Force.

No surprise, then, that an incremental restoration of the draft, abandoned in 1973 as the Vietnam War was drawing to a close, has taken place in these years. In 2022, Congress passed the National Service Reconstitution Act (NSRA), which requires all men and women aged 18 to 25 to register with newly reconstituted National Service Centers and to provide them with information on their residence, employment status, and educational background — information they are required to update on an annual basis. In 2023, the NSRA was amended to require registrants to complete an additional questionnaire on their technical, computer, and language skills. Since 2024, all men and women enrolled in computer science and related programs at federally aided colleges and universities have been required to enroll in the National Digital Reserve Corps (NDRC) and spend their summers working on defense-related programs at selected military installations and headquarters. Members of that Digital Corps must also be available on short notice for deployment to such facilities, should a conflict of any sort threaten to break out.

The establishment of just such a corps, it should be noted, had been a recommendation of the National Security Commission on Artificial Intelligence, a federal agency established in 2019 to advise Congress and the White House on how to prepare the nation for a high-tech arms race with China. “We must win the AI competition that is intensifying strategic competition with China,” the commission avowed in March 2021, given that “the human talent deficit is the government’s most conspicuous AI deficit.” To overcome it, the commission suggested then, “We should establish a… civilian National Reserve to grow tech talent with the same seriousness of purpose that we grow military officers. The digital age demands a digital corps.”

Indeed, only five years later, with the prospect of a U.S.-China conflict so obviously on the agenda, Congress is considering a host of bills aimed at supplementing the Digital Corps with other mandatory service requirements for men and women with technical skills, or simply for the reinstatement of conscription altogether and the full-scale mobilization of the nation. Needless to say, protests against such measures have been erupting at many colleges and universities, but with the mood of the country becoming increasingly bellicose, there has been little support for them among the general public. Clearly, the “volunteer” military is about to become an artifact of a previous epoch.

A new cold war culture of repression With the White House, Congress, and the Pentagon obsessively focused on preparations for what’s increasingly seen as an inevitable war with China, it’s hardly surprising that civil society in 2026 has similarly been swept up in an increasingly militaristic anti-China spirit. Popular culture is now saturated with nationalistic and jingoistic memes, regularly portraying China and the Chinese leadership in derogatory, often racist terms. Domestic manufacturers hype “Made in America” labels (even if they’re often inaccurate) and firms that once traded extensively with China loudly proclaim their withdrawal from that market, while the streaming superhero movie of the moment, The Beijing Conspiracy, on a foiled Chinese plot to disable the entire U.S. electrical grid, is the leading candidate for the best film Oscar.  

Domestically, by far the most conspicuous and pernicious result of all this has been a sharp rise in hate crimes against Asian Americans, especially those assumed to be Chinese, whatever their origin. This disturbing phenomenon began at the outset of the Covid crisis, when President Trump, in a transparent effort to deflect blame for his mishandling of the pandemic, started using terms like “Chinese Virus” and “Kung Flu” to describe the disease. Attacks on Asian Americans rose precipitously then and continued to climb after Joe Biden took office and began vilifying Beijing for its human rights abuses in Xinjiang and Hong Kong. According to the watchdog group Stop AAPI Hate, some 6,600 anti-Asian incidents were reported in the U.S. between March 2020 and March 2021, with almost 40% of those events occurring in February and March 2021.

For observers of such incidents back then, the connection between anti-China policymaking at the national level and anti-Asian violence at the neighborhood level was incontrovertible. “When America China-bashes, then Chinese get bashed, and so do those who ‘look Chinese,’” said Russell Jeung, a professor of Asian American Studies at San Francisco State University at that time. “American foreign policy in Asia is American domestic policy for Asians.”

By 2026, most Chinatowns in America have been boarded up and those that remain open are heavily guarded by armed police. Most stores owned by Asian Americans (of whatever background) were long ago closed due to boycotts and vandalism, and Asian Americans think twice before leaving their homes.

The hostility and distrust exhibited toward Asian Americans at the neighborhood level has been replicated at the workplace and on university campuses, where Chinese Americans and Chinese-born citizens are now prohibited from working at laboratories in any technical field with military applications. Meanwhile, scholars of any background working on China-related topics are subject to close scrutiny by their employers and government officials. Anyone expressing positive comments about China or its government is routinely subjected to harassment, at best, or at worst, dismissal and FBI investigation.

As with the incremental draft, such increasingly restrictive measures were first adopted in a series of laws in 2022. But the foundation for much of this was the United States Innovation and Competition Act of 2021, passed by the Senate in June of that year. Among other provisions, it barred federal funding to any college or university that hosted a Confucius Institute, a Chinese government program to promote that country’s language and culture in foreign countries. It also empowered federal agencies to coordinate with university officials to “promote protection of controlled information as appropriate and strengthen defense against foreign intelligence services,” especially Chinese ones.

Diverging from the path of war Yes, in reality, we’re still in 2021, even if the Biden administration regularly cites China as our greatest threat. Naval incidents with that country’s vessels in the South China Sea and the Taiwan Strait are indeed on the rise, as are anti-Asian-American sentiments domestically. Meanwhile, as the planet’s two greatest greenhouse-gas emitters squabble, our world is growing hotter by the year.

Without question, something like the developments described above (and possibly far worse) will lie in our future unless action is taken to alter the path we’re now on. All of those “2026” developments, after all, are rooted in trends and actions already under way that only appear to be gathering momentum at this moment. Bills like the Innovation and Competition Act enjoy near-unanimous support among Democrats and Republicans, while strong majorities in both parties favor increased Pentagon spending on China-oriented weaponry. With few exceptions — Senator Bernie Sanders among them — no one in the upper ranks of government is saying: Slow down. Don’t launch another Cold War that could easily go hot.

“It is distressing and dangerous,” as Sanders wrote recently in Foreign Affairs, “that a fast-growing consensus is emerging in Washington that views the U.S.-Chinese relationship as a zero-sum economic and military struggle.” At a time when this planet faces ever more severe challenges from climate change, pandemics, and economic inequality, he added that “the prevalence of this view will create a political environment in which the cooperation that the world desperately needs will be increasingly difficult to achieve.”

In other words, we Americans face an existential choice: Do we stand aside and allow the “fast-growing consensus” Sanders speaks of to shape national policy, while abandoning any hope of genuine progress on climate change or those other perils? Alternatively, do we begin trying to exert pressure on Washington to adopt a more balanced relationship with China, one that would place at least as much emphasis on cooperation as on confrontation? If we fail at this, be prepared in 2026 or soon thereafter for the onset of a catastrophic (possibly even nuclear) U.S.-China war.

Michael T. Klare writes regularly for TomDispatch (where this article originated). He is the five-college professor emeritus of peace and world security studies at Hampshire College and a senior visiting fellow at the Arms Control Association. He is the author of 15 books, the latest of which is All Hell Breaking Loose: The Pentagon’s Perspective on Climate Change. He is a founder of the Committee for a Sane U.S.-China Policy.

Copyright ©2021 Michael T. Klare — distributed by Agence Global

—————-

Released: 13 July 2021

Word Count: 2,983

—————-

Rebecca Gordon, “The fires this time: a climate change view from California”

July 12, 2021 - TomDispatch

In San Francisco, we’re finally starting to put away our masks. With 74% of the city’s residents over 12 fully vaccinated, for the first time in more than a year we’re enjoying walking, shopping, and eating out, our faces naked. So I was startled when my partner reminded me that we need to buy masks again very soon — N95 masks, that is. The California wildfire season has already begun, earlier than ever, and we’ll need to protect our lungs during the months to come from the fine particulates carried in the wildfire smoke that’s been engulfing this city in recent years.

I was in Reno last September, so I missed the morning when San Franciscans awoke to apocalyptic orange skies, the air freighted with smoke from burning forests elsewhere in the state. The air then was bad enough even in the high mountain valley of Reno. At that point, we’d already experienced “very unhealthy” purple-zone air quality for days. Still, it was nothing like the photos that could have been from Mars then emerging from the Bay Area. I have a bad feeling that I may get my chance to experience the same phenomenon in 2021 — and, as the fires across California have started so much earlier, probably sooner than September.

The situation is pretty dire: this state — along with our neighbors to the north and southeast — is now living through an epic drought. After a dry winter and spring, the fuel-moisture content in our forests (the amount of water in vegetation, living and dead) is way below average. This April, the month when it is usually at its highest, San Jose State University scientists recorded levels a staggering 40% below average in the Santa Cruz Mountains, well below the lowest level ever before observed. In other words, we have never been this dry.

Under the heat dome When it’s hot in most of California, it’s often cold and foggy in San Francisco. Today is no exception. Despite the raging news about heat records, it’s not likely to reach 65 degrees here. So it’s a little surreal to consider what friends and family are going through in the Pacific Northwest under the once-in-thousands-of-years heat dome that’s settled over the region. A heat dome is an area of high pressure surrounded by upper-atmosphere winds that essentially pin it in place. If you remember your high-school physics, you’ll recall that when a gas (for example, the air over the Pacific Northwest) is contained, the ratio of pressure to temperature remains constant. If the temperature goes up, the pressure goes up.

The converse is also true; as the pressure rises, so does the temperature. And that’s what’s been happening over Oregon, Washington, and British Columbia in normally chilly Canada. Mix in the fact that climate change has driven average temperatures in those areas up by three to four degrees since the industrial revolution, and you have a recipe for the disaster that struck the region recently.
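For readers who want that high-school relationship spelled out, here is a minimal sketch of the proportionality the passage invokes, treating the trapped air, very roughly, as a fixed volume of ideal gas and using illustrative numbers rather than actual measurements from the heat dome:

\[
\frac{P_1}{T_1} = \frac{P_2}{T_2}
\qquad\Longrightarrow\qquad
T_2 = T_1 \times \frac{P_2}{P_1}
\]

Starting from air at about 293 K (roughly 68 degrees), a 5% rise in pressure at constant volume gives 293 × 1.05 ≈ 308 K, or about 94 degrees. The real atmosphere is messier than this textbook case (sinking, compressing air under the dome does much of the actual heating), but the direction of the effect is the same.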

And it has indeed been a disaster. The temperature in the tiny town of Lytton, British Columbia, for instance, hit 121 degrees on June 29th, breaking the Canadian heat record for the third time in as many days. (The previous record had stood since 1937.) That was Tuesday. On Wednesday night, the whole town was engulfed in the flames of multiple fires. The fires, in turn, generated huge pyrocumulus clouds that penetrated as high as the stratosphere (a rare event in itself), producing lightning strikes that ignited new fires in a vicious cycle that, in the end, simply destroyed the kilometer-long town.

Heat records have been broken all over the Pacific Northwest. Portland topped records for three days running, culminating with a 116-degree day on June 28th; Seattle hit a high of 108, which the Washington Post reported “was 34 degrees above the normal high of 74 and higher than the all-time heat record in Washington, D.C., among many other cities much farther to its south.”

With the heat comes a rise in “sudden and unexpected” deaths. Hundreds have died in Oregon and Washington and, according to the British Columbia coroner, at least 300 in that province — almost double the average number for that time period.

Class, race, and hot air It’s hardly a new observation that the people who have benefited least from the causes of climate change — the residents of less industrialized countries and poor people of all nations — are already suffering most from its results. Island nations like the Republic of Palau in the western Pacific are a prime example. Palau faces a number of climate-change challenges, according to the United Nations Development Program, including rising sea levels that threaten to inundate some of its lowest-lying islands, which are just 10 meters above sea level. In addition, encroaching seawater is salinizing some of its agricultural land, creating seaside strips that can now grow only salt-tolerant root crops. Meanwhile, despite substantial annual rainfall, saltwater inundation threatens the drinking water supply. And worse yet, Palau is vulnerable to ocean storms that, on our heating planet, are growing ever more frequent and severe.

There are also subtle ways the rising temperatures that go with climate change have differential effects, even on people living in the same city. Take air conditioning. One of the reasons people in the Pacific Northwest suffered so horrendously under the heat dome is that few homes in that region are air conditioned. Until recently, people there had been able to weather the minimal number of very hot days each year without installing expensive cooling machinery.

Obviously, people with more discretionary income will have an easier time investing in air conditioning now that temperatures are rising. What’s less obvious, perhaps, is that its widespread use makes a city hotter — a burden that falls disproportionately on people who can’t afford to install it in the first place. Air conditioning works on a simple principle: it shifts heat from the air inside an enclosed space to the outside world, which, in turn, makes that outside air hotter.

A 2014 study of this effect in Phoenix, Arizona, showed that air conditioning raised ambient temperatures by one to two degrees at night — an important finding, because one of the most dangerous aspects of the present heat waves is their lack of night-time cooling. As a result, each day’s heat builds on a higher base, while presenting a greater direct-health threat, since the bodies of those not in air conditioning can’t recover from the exhaustion of the day’s heat at night. In effect, air conditioning not only heats the atmosphere further but shifts the burden of unhealthy heat from those who can afford it to those who can’t.

Just as the coronavirus has disproportionately ravaged black and brown communities (as well as poor nations around the world), climate-change-driven heat waves, according to a recent University of North Carolina study reported by the BBC, mean that “black people living in most U.S. cities are subject to double the level of heat stress as their white counterparts.” This is the result not just of poverty, but of residential segregation, which leaves urban BIPOC (black, indigenous, and other people of color) communities in a city’s worst “heat islands” — the areas containing the most concrete, the most asphalt, and the least vegetation, and therefore the areas that attract and retain the most heat.

“Using satellite temperature data combined with demographic information from the U.S. Census,” the researchers “found that the average person of color lives in an area with far higher summer daytime temperatures than non-Hispanic white people.” They also discovered that, in all but six of the 175 urban areas they studied in the continental U.S., “people of color endure much greater heat impacts in summer.” Furthermore, “for black people this was particularly stark. The researchers say they are exposed to an extra 3.12C  [5.6F] of heating, on average, in urban neighborhoods, compared to an extra 1.47C [2.6F] for white people.”

That’s a big difference.

Food, drink, and fires — the view from California Now, let me return to my own home state, California, where conditions remain all too dry and, apart from the coast right now, all too hot. Northern California gets most of its drinking water from the snowpack that builds each year in the Sierra Nevada mountains. In spring, those snows gradually melt, filling the rivers that fill our reservoirs. In May 2021, however, the Sierra snowpack was a devastating six percent of normal!

Stop a moment and take that in, while you try to imagine the future of much of the state — and the crucial crops it grows.

For my own hometown, San Francisco, things aren’t quite that dire. Water levels in Hetch Hetchy, our main reservoir, located in Yosemite National Park, are down from previous years, but not disastrously so. With voluntary water-use reduction, we’re likely to have enough to drink this year at least. Things are a lot less promising, however, in rural California where towns tend to rely on groundwater for domestic use.

Shrinking water supplies don’t just affect individual consumers here in this state; they affect everyone in the United States who eats, because 13.5% of all our agricultural products, including meat and dairy as well as fruits and vegetables, come from California. Growing food requires prodigious amounts of water. In fact, farmland irrigation accounts for roughly 80% of all the water put to human use in the state, far outstripping what its businesses and homes consume.

So how are California’s agricultural water supplies doing this year? The answer, sadly, is not very well. State regulators have already cut distribution to about a quarter of California’s irrigated acreage (about two million acres) by a drastic 95%. That’s right. A full quarter of the state’s farmlands have access to just 5% of what they would ordinarily receive from rivers and aqueducts. As a result, some farmers are turning to groundwater, a more easily exhausted source, which also replenishes itself far more slowly than rivers and streams. Some are even choosing to sell their water to other farmers, rather than use it to grow crops at all, because that makes more economic sense for them. As smaller farms are likely to be the first to fold, the water crisis will only enhance the dominance of major corporations in food production.

Meanwhile, we’ll probably be breaking out our N95 masks soon. Wildfire season has already begun — earlier than ever. On July 1st, the then-still-uncontained Salt Fire briefly closed a section of Interstate 5 near Redding in northern California. (I-5 is the main north-south interstate along the West Coast.) And that’s only one of the more than 4,500 fire incidents already recorded in the state this year.

Last year, almost 10,000 fires burned more than four million acres here, and everything points to a similar or worse season in 2021. Unlike Donald Trump, who famously blamed California’s fires on a failure to properly rake our forests, President Biden is taking the threat seriously. On June 30th, he convened western state leaders to discuss the problem, acknowledging that “we have to act and act fast. We’re late in the game here.” The president promised a number of measures: guaranteeing sufficient, and sufficiently trained, firefighters; raising their minimum pay to $15 per hour; and making grants to California counties under the Federal Emergency Management Agency’s BRIC (Building Resilient Infrastructure and Communities) program.

Such measures will help a little in the short term, but none of them will make a damn bit of difference in the longer run if the Biden administration and a politically divided Congress don’t begin to truly treat climate change as the immediate and desperately long-term emergency it is.

Justice and generations In his famous A Theory of Justice, the great liberal philosopher of the twentieth century John Rawls proposed a procedural method for designing reasonable and fair principles and policies in a given society. His idea: that the people determining such basic policies should act as if they had stepped behind a “veil of ignorance” and had lost specific knowledge of their own place in society. They’d be ignorant of their own class status, ethnicity, or even how lucky they’d been when nature was handing out gifts like intelligence, health, and physical strength. 

Once behind such a veil of personal ignorance, Rawls argued, people might make rules that would be as fair as possible, because they wouldn’t know whether they themselves were rich or poor, black or white, old or young — or even which generation they belonged to. This last category was almost an afterthought, included, he wrote, “in part because questions of social justice arise between generations as well as within them.”

His point about justice between generations not only still seems valid to me but, in light of present-day circumstances, radically understated. I don’t think Rawls ever envisioned a trans-generational injustice as great as the climate-change one we’re allowing to happen, if not actively inducing, at this very moment.

Human beings have a hard time recognizing looming but invisible dangers. In 1990, I spent a few months in South Africa providing some technical assistance to an anti-apartheid newspaper. When local health workers found out that I had worked (as a bookkeeper) for an agency in the U.S. trying to prevent the transmission of AIDS, they desperately wanted to talk to me. How, they hoped to learn, could they get people living in their townships to act now to prevent a highly transmissible illness that would only produce symptoms years after infection? How, in the face of the all-too-present emergencies of everyday apartheid life, could they get people to focus on a vague but potentially horrendous danger barreling down from the future? I had few good answers and, more than 30 years later, South Africa has the largest HIV-positive population in the world.

Of course, there are human beings who’ve known about the climate crisis for decades — and not just the scientists who wrote about it as early as the 1950s or the ones who gave an American president an all-too-accurate report on it in 1965. The fossil-fuel companies have, of course, known all along — and have focused their scientific efforts not on finding alternative energy sources, but on creating doubt about the reality of human-caused climate change (just as, once upon a time, tobacco companies sowed doubt about the relationship between smoking and cancer). As early as 1979, the Guardian reports, an internal Exxon study concluded that the use of fossil fuels would certainly “cause dramatic environmental effects” in the decades ahead. “The potential problem is great and urgent,” the study concluded.

A problem that was “great and urgent” in 1979 is now a full-blown existential crisis for human survival.

Some friends and I were recently talking about how ominous the future must look to the younger people we know. “They are really the first generation to confront an end to humanity in their own, or perhaps their children’s lifetimes,” I said.

“But we had The Bomb,” a friend reminded me. “We grew up in the shadow of nuclear war.” And she was right of course. We children of the 1950s and 1960s grew up knowing that someone could “press the button” at any time, but there was a difference. Horrifying as the present retooling of our nuclear arsenal (going on right now, under President Biden) may be, nuclear war nonetheless remains a question of “if.” Climate change is a matter of “when,” and that when, as anyone living in the northwestern United States and Canada should know after these last weeks, is all too obviously now.

It’s impossible to overstate the urgency of the moment. And yet, as a species, we’re acting like the children of indulgent parents who provide multiple “last chances” to behave. Now, nature has run out of patience and we’re running out of chances. So much must be done globally, especially to control the giant fossil-fuel companies. We can only hope that real action will emerge from November’s international climate conference. And here in the U.S., unless congressional Democrats succeed in ramming through major action to stop climate change before the 2022 midterms, we’ll have lost one more last, best chance for survival.

Rebecca Gordon writes regularly for TomDispatch (where this article originated). She teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Copyright ©2021 Rebecca Gordon — distributed by Agence Global

—————-

Released: 12 July 2021

Word Count: 2,683

—————-

Karen J. Greenberg, “America’s accountability problem”

July 8, 2021 - TomDispatch

America has an accountability problem. In fact, if the Covid-19 disaster, the January 6th Capitol attack, and the Trump years are any indication, the American lexicon has essentially dispensed with the term “accountability.”

This should come as no surprise. After all, there’s nothing particularly new about this. In the Bush years, those who created a system of indefinite offshore detention at Guantánamo Bay, Cuba, those who implemented a CIA global torture program and the National Security Agency’s warrantless surveillance policy, not to mention those who purposely took us to war based on lies about nonexistent Iraqi weapons of mass destruction, were neither dismissed, sanctioned, nor punished in any way for obvious violations of the law. Nor has Congress passed significant legislation of any kind to ensure that all-encompassing abuses like these will not happen again.

Now, early in the Biden era, any determination to hold American officials responsible for such past wrongdoing, even the president who helped launch an assault on the Capitol, seems little more than a fantasy. It may be something to discuss, rail against, or even make promises about, but not actually reckon with — not if you’re either a deeply divided Congress or a Department of Justice that has compromised itself repeatedly in recent years. Under other circumstances, of course, those would be the two primary institutions with the power to pursue genuine accountability in any meaningful way for extreme and potentially illegal government acts.

Today, if thought about at all, accountability — whether in the form of punishment for misdeeds or meaningful reform — has been reduced to a talking point. With that in mind, let’s take a moment to consider the Biden administration’s approach to accountability so far.

How we got here Even before Donald Trump entered the Oval Office, the country was already genuinely averse to accountability. When President Obama took office in January 2009, he faced the legacy of the George W. Bush administration’s egregious disregard for laws and norms in its extralegal post-9/11 war on terror. From day one of his presidency, Obama made clear that he found his predecessor’s policies unacceptable by both acknowledging and denouncing those crimes. But he insisted that they belonged to the past.

Fearing that the pursuit of punishment would involve potentially ugly encounters with former officials and would seem like political retribution in a country increasingly divided and on edge, he clearly decided that it wouldn’t be worth the effort. Ultimately, as he said about “interrogations, detentions, and so forth,” it was best for the nation to “look forward, as opposed to looking backward.”

True to the president’s word, the Obama administration refused to hold former officials responsible for violations of fundamental constitutional and legal principles. Among those who escaped retrospective accountability were Vice President Dick Cheney, who orchestrated the invasion of Saddam Hussein’s Iraq based on lies; the lawyer in the Justice Department’s Office of Legal Counsel, John Yoo, who, in his infamous “Torture Memos,” justified the “enhanced interrogation” of war-on-terror prisoners; and Secretary of Defense Donald Rumsfeld, who created a Bermuda Triangle of injustice at Guantánamo Bay, Cuba. In terms of reform, Obama did ensure a degree of meaningful change, including decreeing an official end to the CIA torture of prisoners of war. But too much of what had happened remained unaddressed and lay in wait for abuse at the hands of some irresponsible future president.

As a result, many of the sins that were at the heart of the never-ending response to the 9/11 attacks have become largely forgotten history, leaving many potential crimes unaddressed. And even more sadly, the legacy of accountability’s demise only continues. Biden and his team entered office facing a brand-new list of irregularities and abuses by high-ranking officials, including President Trump.

In this case, the main events demanding accountability had occurred on the domestic front. The January 6th insurrection, the egregious mishandling of the pandemic, the interference in the 2020 presidential election, and the use of the Department of Justice for political ends all awaited investigation after inauguration day. At the outset, the new government dutifully promised that some form of accountability would indeed be forthcoming. On January 15th, House Speaker Nancy Pelosi announced that she planned to convene an independent commission to thoroughly investigate the Capitol riots, later pledging to look into the “facts and causes” of that assault on Congress.

Attorney General nominee Merrick Garland similarly promised, “If confirmed, I will supervise the prosecution of white supremacists and others who stormed the Capitol on January 6th.” Meanwhile, signaling some appetite for holding his predecessor accountable, Joe Biden had already ruled out, during the presidential campaign, the possibility of extending a pardon to Donald Trump. In that way, he ensured that, were he elected, numerous court cases against the president and his Trump Organization would remain open to prosecution — even the revival, as Noah Bookbinder, the executive director of Citizens for Responsibility and Ethics in Washington, recently suggested, of the obstruction-of-justice charges that had been central to the Mueller investigation of the 2016 presidential election.

Reluctance in the halls of accountability Six months after Joe Biden took office, there has been no firm movement toward accountability by his administration. On the question of making Donald Trump and his allies answer for their misdeeds, the appetite of this administration so far seems wanting, notably when it comes to the role the president may have played in instigating the Capitol attack. Sadly, Pelosi’s call for an independent commission to investigate that insurrectionary moment passed the House, but fell victim last month to the threat of a filibuster and was blocked in the Senate. (Last week, largely along party lines, the House voted to create a select committee to investigate the insurrection.)

Trump’s disastrous mishandling of the pandemic, potentially responsible for staggering numbers of American deaths, similarly seems to have fallen into the territory of unaccountability. The partisan divisions of Congress continue to stall a Covid-19 investigation. National security expert and journalist Peter Bergen, for instance, called for a commission to address the irresponsible way the highest levels of government dealt with the pandemic, but the idea failed to gain traction. Instead, the focus has turned to the question of whether or not there was malfeasance at a Chinese government lab in Wuhan.

It matters not at all that numerous journalists, including Lawrence Wright, Michael Lewis, and Nicholson Baker, have impressively documented the mishandling of the pandemic here. Such disastrous acts included early denials of the lethality of the disease, the disavowal of pandemic preparedness plans, the dismantling of the very government office meant to respond to pandemics, the presidential promotion of quack cures, a disregard for wearing masks early on, and so much else, all of which contributed to a generally chaotic governmental response, which ultimately cost tens of thousands of lives.

In truth, a congressional investigation into either the Capitol riots or the Trump administration’s mishandling of the pandemic might never have led to actual punitive accountability. After all, the 9/11 Commission, touted as the gold standard for such investigations, did nothing of the sort. While offering a reputable history of the terrorist threat that resulted in the attacks of September 11, 2001, and a full-scale summary of government missteps and lapses that led up to that moment, the 9/11 report did not take on the mission of pointing fingers and demanding accountability.

In a recent interview with former New York Times reporter Philip Shenon, whose 2008 book The Commission punctured that group’s otherwise stellar reputation, Just Security editor Ryan Goodman offered this observation: “[An] important lesson from your book is the conscious tradeoff that the 9/11 Commission members made in prioritizing having a unanimous final report which sacrificed their ability to promote the interests of accountability (such as identifying and naming senior government officials whose acts or omissions were responsible for lapses in U.S. national security before the attack).”

Shenon added that the tradeoff between accountability and unanimity was acknowledged by commission staff members frustrated by the absence of what they thought should have been the report’s “most important and controversial” conclusions. In other words, when it came to accountability, the 9/11 Report proved an inadequate model at best. Still, even its version of truth-telling proved too much for congressional Republicans facing a similar commission on the events of January 6th.

Note, however, that the 9/11 Commission did lead to movement along another path of accountability: reform. In its wake came certain structural changes, including a bolstering of the interagency process for sharing information and the creation of the Office of the Director of National Intelligence.

No such luck today. And signs of the difficulty of facing any kind of accountability are now evident inside the Department of Justice (DOJ), too. Despite initial rhetoric to the contrary from Attorney General Merrick Garland, the department has shown little appetite for redress when it comes to those formerly in the highest posts. And that reality should bring to mind the similar reluctance of Barack Obama, the president who originally nominated Garland unsuccessfully to the Supreme Court.

For anyone keeping a scorecard of DOJ actions regarding Trump-era excesses, the record is slim indeed. The department did, at least, abandon any possible prosecution of former National Security Advisor John Bolton for supposedly disclosing classified information in his memoir on his time in the Trump administration. But Garland also announced that he would not pursue several matters that could have brought to light information about President Trump’s abuse of power.

In May, for instance, the department appealed a court order calling for the release of the full version of a previously heavily redacted DOJ memo advising then-Attorney General Bill Barr that the evidence in the Mueller Report was “not sufficient to support a conclusion beyond a reasonable doubt that the President violated the obstruction-of-justice statutes.” In fact, the Mueller Report did not exonerate Trump, as Mueller himself would later testify in Congress and as hundreds of federal prosecutors would argue in a letter written in the wake of the report’s publication, saying, “Each of us believes that the conduct of President Trump described in Special Counsel Robert Mueller’s report would… result in multiple felony charges for obstruction of justice.”

Adding fuel to the fire of disappointment, Garland pulled back from directly assessing fault lines inside the Department of Justice when it came to its independence from partisan politics. Instead, he turned over to the DOJ inspector general any further investigation into Trump’s politicization of the department.

The path forward — or not? These are all discouraging signs, yet there’s still time to strengthen our faltering democracy by reinstating the idea that abuses of power and violations of the law — from inside the White House, no less — are not to be tolerated. Even without an independent commission looking into January 6th or the DOJ prosecuting anyone, some accountability should still be possible. (After all, it was a New York State court that recently suspended Rudy Giuliani’s license to practice law.)

On June 24th, Nancy Pelosi announced at a news conference that a select Congressional committee, even if not an independent 9/11-style commission, would look into the Capitol attack. That committee, she added, will “establish the truth of that day and ensure that an attack of that kind cannot happen and that we root out the causes of it all.” True, she didn’t specify whether accountability and reform would be part of that committee’s responsibilities, but neither goal is off the table.

And Pelosi’s fallback plan to convene a House select committee could still have an impact. After all, remember the Watergate committee in the Nixon era. It, too, was a select committee and it launched an investigation into abuses of power in the Watergate affair that helped bring about President Nixon’s resignation from office and helped spark or support court cases against many of his partners in crime. Similarly, the 1975 Church Committee investigation into the abuses of the intelligence community, among them the FBI’s notorious counterintelligence program, COINTELPRO, was also a select committee project. It led to significant barriers against future abuses — including a ban on assassinations and a host of “good government” bills.

Pelosi rightly insists that she’s intent on pursuing an investigation into the Capitol attack. Adam Schiff and Jerry Nadler are similarly determined to investigate the government seizure of Internet communications. Local court cases against Trump, Giuliani, and others will, it appears, continue apace.

Through such efforts, perhaps the potentially shocking facts could see the light of day. Continuing such quests may lead to anything but perfect accountability, particularly in a country growing ever more partisan. Still, above and beyond the immediate importance of giving the public — and history — a reliable narrative of recent events, it’s important to let Americans know that accountability is still a crucial part of our democracy, as are the laws and norms it aims to protect. Otherwise, this country will have to face a new reality: that we are now living in the age of impunity.

Karen J. Greenberg writes regularly for TomDispatch (where this article originated). She is the director of the Center on National Security at Fordham Law and author of the forthcoming Subtle Tools: The Dismantling of Democracy from the War on Terror to Donald Trump  (Princeton University Press, August). Julia Tedesco helped with research for this piece.

Copyright ©2021 Karen J. Greenberg — distributed by Agence Global

—————-

Released: 08 July 2021

Word Count: 2,158

—————-

Alfred McCoy, “America’s drug wars: fifty years of reinforcing racism”

July 6, 2021 - TomDispatch

Fifty years ago, on June 17, 1971, President Richard Nixon stood before the White House press corps, staffers at his side, to announce “a new, all-out offensive” against drug abuse, which he denounced as “America’s public enemy number one.” He called on Congress to contribute $350 million for a worldwide attack on “the sources of supply.” The first battle in this new drug war would be fought in South Vietnam where, Nixon said, “a number of young Americans have become addicts as they serve abroad.”

While the president was declaring his war on drugs, I was stepping off a trans-Pacific flight into the searing tropical heat of Saigon, the South Vietnamese capital, to report on the sources of supply for the drug abuse that was indeed sweeping through the ranks of American soldiers fighting this country’s war in Vietnam.

As I would soon discover, the situation was far worse than anything Nixon could have conveyed in his sparse words. Heroin vials littered the floors of Army barracks. Units legendary for their heroism in World War II like the 82nd Airborne were now known as the “jumping junkies.” A later survey found that more than a third of all GIs fighting the Vietnam War “commonly used” heroin. Desperate to defeat this invisible enemy, the White House was now about to throw millions of dollars at this overseas drug war, funding mass urinalysis screening for every homeward-bound GI and mandatory treatment for any who tested positive for drugs.

Even that formidable effort, however, couldn’t defeat the murky politics of heroin, marked by a nexus of crime and official collusion that made mass drug abuse among GIs possible. After all, in the rugged mountains of nearby Laos, Air America, a company run by the CIA, was transporting opium harvested by tribal farmers who were also serving as soldiers in its secret army. The commander of the Royal Lao Army, a close ally, then operated the world’s largest illicit lab, turning raw opium into refined heroin for the growing numbers of GI users in neighboring Vietnam. Senior South Vietnamese commanders colluded in the smuggling and distribution of such drugs to GIs in bars, in barracks, and at firebases. In both Laos and South Vietnam, American embassies ignored the corruption of their local allies that was helping to fuel the traffic.

Nixon’s drug war As sordid as Saigon’s heroin politics were, they would pale when compared to the cynical deals agreed to in Washington over the next 30 years that would turn the drug war of the Vietnam era into a political doomsday machine. Standing alongside the president on that day when America’s drug war officially began was John Ehrlichman, White House counsel and Nixon confidant.

As he would later bluntly tell a reporter, 

“The Nixon White House had two enemies: the antiwar left and black people… We knew we couldn’t make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news.”

And just in case anyone missed his point, Ehrlichman added, “Did we know we were lying about the drugs? Of course, we did.”

To grasp the full meaning of this admission, you need to begin with the basics: the drug war’s absolute, unqualified, irredeemable failure. Just three pairs of statistics can convey the depth of that failure and the scope of the damage the war has done to American society over the past half-century:

• Despite the drug war’s efforts to cut supplies, worldwide illicit opium production rose 10-fold — from 1,200 tons in 1971 to a record 10,300 tons in 2017.

• Reflecting its emphasis on punishment over treatment, the number of people jailed for drug offenses would also grow 10-fold from 40,900 in 1980 to 430,900 in 2019.

• Finally, instead of reducing domestic use, the drug war actually helped stimulate a 10-fold surge in the number of American heroin users from just 68,000 in 1970 to 745,000 in 2019.

In addition, the drug war has had a profound impact on American society by perpetuating, even institutionalizing, racial disparities through the raw power of the police and prisons. Remember that the Republican Party saw the Voting Rights Act of 1965, which ended decades of Jim Crow disenfranchisement for Blacks in the deep South, as a rare political opportunity. In response, Nixon and his men began developing a two-part strategy for winning over white voters in the South and blunting the Democratic advantage with Black voters nationwide.

First, in the 1970 midterm elections, the Republicans began pursuing a “Southern strategy” of courting disgruntled white-supremacist voters in the South in a successful attempt to capture that entire region politically. Three years later, they launched a relentless expansion of the drug war, policing, and prisons. In the process, they paved the way for the mass incarceration of African Americans, denying them the vote not just as convicts but, in 15 states, for life as ex-convicts. Pioneering this cunning strategy was New York’s Republican governor Nelson Rockefeller. The harsh mandatory penalties he got the state legislature to pass, 15 years to life for petty drug possession, raised the number of people imprisoned on drug charges from 470 in 1970 to 8,500 in 1999, 90% of them African-American or Latinx.

Such mass incarceration moved voters from urban Democratic bailiwicks to rural prisons where they were counted in the census, but otherwise disenfranchised, giving a bit of additional help to the white Republican vote in upstate New York — a winning strategy Republicans elsewhere would soon follow. Not only did the drug war let conservatives shave opposition vote tallies in close elections, but it also dehumanized African Americans, justifying repressive policing and mass incarceration.

None of this was pre-ordained; it was the result of a succession of political deals made during three presidencies — that of Nixon, who started it; of Ronald Reagan, whose administration enacted draconian punishments for drug possession; and of the Democrat Bill Clinton, who expanded the police and prisons to enforce those very drug laws. After remaining remarkably constant at about 100 prisoners per 100,000 population for more than 50 years, the U.S. incarceration rate started climbing relentlessly, reaching 293 per 100,000 by 1990, just after Reagan left office, and 464 by the end of Clinton’s presidency in 2000. It reached a peak of 760 by 2008 — with a racial bias that resulted in nothing less than the “mass incarceration” of African Americans.

Reagan domesticates the drug war While Nixon fought his war largely on foreign battlefields trying, and failing, to stop narcotics at their source, the next Republican president, Ronald Reagan, fully domesticated the drug war through ever harsher penalties for personal use and a publicity campaign that made abstinence a moral virtue and indulgence a fiercely punishable vice. Meanwhile, he also signaled clearly that he was determined to pursue Nixon’s Southern strategy by staging a major 1980 election campaign rally in Neshoba County, Mississippi, where three civil rights workers had previously been murdered.

Taking office in 1981, Reagan found, to his surprise, that reviving the drug war at home had little public support, largely because the outgoing Democratic administration had focused successfully on drug treatment rather than punishment. So, First Lady Nancy Reagan began crisscrossing the country, while making TV appearances with choruses of cute kids wearing “Just Say No” T-shirts. Even after four years of the First Lady’s campaign and the simultaneous spread of crack cocaine and cocaine powder in cities and suburbs nationwide, only about 2% of the electorate felt that drug abuse was the nation’s “number one problem.”

Then personal tragedy provided Reagan with the perfect political opportunity. In June 1986, just a day after signing a multimillion-dollar contract with the NBA’s Boston Celtics, college basketball sensation Len Bias collapsed in his dorm at the University of Maryland from a fatal cocaine overdose. Five months later, President Reagan would sign the Anti-Drug Abuse Act, aka the “Len Bias Law.” It would lead to a quantum expansion of the domestic drug war, including a mandatory minimum sentence of five years just for the possession of five grams of crack cocaine and a revived federal death penalty for traffickers.

It also put into law a racial bias in imprisonment that would prove staggering: a 100:1 sentencing disparity between those convicted of possessing crack-cocaine (used mainly by inner-city Blacks) and those using cocaine powder (favored by suburban whites) — even though there was no medical difference between the two drugs. To enforce such tough penalties, the law also expanded the federal anti-drug budget to a massive $6.5 billion.

In signing that law, Reagan would pay special tribute to the first lady, calling her “the co-captain in our crusade for a drug-free America” and the fight against “the purveyors of this evil.” And the two of them had much to take credit for. After all, by 1989, an overwhelming 64% of Americans had come to feel that drugs were the nation’s “number one problem.” Meanwhile, thanks largely to the Anti-Drug Abuse Act, Americans jailed for nonviolent drug offenses soared from 50,000 in 1980 to 400,000 in 1997. Driven by drug arrests, in 1995 nearly one-third of all African-American males between 20 and 29 would either be in prison or on parole.

Clinton’s all-too-bipartisan drug war If those two Republican presidents were adept at portraying partisan anti-drug policies as moral imperatives, their Democratic successor, Bill Clinton, proved adept at getting himself reelected by picking up their seductive rhetoric. Under his administration, a racialized drug policy, with its disenfranchisement and denigration of African Americans, would become fully bipartisan.

In 1994, two years after being elected president, Clinton lost control of Congress to Republican conservatives led by soon-to-be House Speaker Newt Gingrich. Desperate for something he could call a legislative accomplishment, he tacked hard right to support the Violent Crime Control Act of 1994. It would prove the largest law-enforcement initiative in American history: nearly $19 billion for 100,000 new cops to sweep the streets for drug offenders and a massive prison-expansion program to house those who would now be sentenced to life after three criminal convictions (“three strikes”).

A year later, when the non-partisan U.S. Sentencing Commission recommended that the 100:1 disparity in penalties for crack-cocaine and cocaine powder be abolished, along with its blatant racial bias, Clinton flatly rejected the advice, signing instead Republican-sponsored legislation that maintained those penalties. “I am not,” he insisted, “going to let anyone who peddles drugs get the idea that the cost of doing business is going down.”

The country’s Black political leaders were eloquent in their condemnation of this political betrayal. The Reverend Jesse Jackson, a former Democratic presidential candidate, claimed Clinton knew perfectly well that “crack is code for black” and labelled the president’s decision “a moral disgrace” by a man “willing to sacrifice young black youth for white fear.” The Congressional Black Caucus would similarly denounce the sentencing disparity as “a mockery of justice.”

As they predicted all too accurately, the relentless rise of Black incarceration only accelerated. In the five years following passage of Clinton’s omnibus crime bill, the country added 204 prisons and its inmate population shot up by a mind-boggling 28% to 1,305,300. Of those, nearly half (587,300) were Black, though African Americans made up only 13% of the country’s population.

Facing a tough reelection campaign in 1996, Clinton again worked with hard-right congressional Republicans to pass the Personal Responsibility and Work Opportunity Act, which, as he put it, brought an “end to welfare as we know it.” With that law’s work requirement for welfare, even as unemployment among Black residents of cities like Chicago (left behind by industry) hit 20% to 25%, youth in inner cities across America found that street-level drug dealing was fast becoming their only opportunity. In effect, the Clintons gained short-term political advantage by doing long-term social and economic damage to a core Democratic constituency, the African American community.

Reviving Jim Crow’s racial stereotypes Nonetheless, during his 1996 reelection campaign, Clinton trumpeted such dubious legislative achievements. Speaking at a campaign rally in New Hampshire, for instance, Hillary Clinton celebrated her husband’s Violent Crime Control Act for taking back the streets from murderous minority teenagers. “They are often the kinds of kids that are called ‘super-predators,’” Clinton said. “No conscience, no empathy. We can talk about why they ended up that way, but first we have to bring them to heel.”

The term “super-predator” had, in fact, originated with a Princeton University political scientist, John Dilulio, who described his theory to the first couple during a 1995 White House working dinner on juvenile crime. In an article for a neo-conservative magazine that November, the academic trumpeted his apocalyptic analysis. Based solely on the spottiest of anecdotal evidence, he claimed that “black inner-city neighborhoods” would soon fall prey to such “super predators” — a new kind of juvenile criminal marked by “impulsive violence, the vacant stares, and the remorseless eyes.” Within five years, he predicted, there would be 30,000 “more murderers, rapists, and muggers on the streets” who would “place zero value on the lives of their victims, whom they reflexively dehumanize as just so much worthless ‘white trash.’” This rising demographic tide, he warned, would soon “spill over into upscale central-city districts, inner-ring suburbs, and even the rural heartland.”

By the way, the truly significant part of Hillary Clinton’s statement based on Dilulio’s “analysis” was that phrase about bringing super-predators to heel. A quick quiz. Who or what does one “bring to heel”: (a.) a woman, (b.) a man, or (c.) a child? Answer: (d.) None of the above.

That term is used colloquially for controlling a leashed dog. By implicitly referring to young Black males as predators and animals, Clinton was tapping into one of America’s most venerable and virulent ethnic stereotypes: the Black “buck” or “brute.” The Jim Crow Museum of Racist Memorabilia at Ferris State University in Michigan reports that “the brute caricature portrays black men as innately savage, animalistic, destructive, and criminal — deserving punishment, maybe death… Black brutes are depicted as hideous, terrifying predators.”

Indeed, Southern fiction of the Jim Crow era featured the “Black brute” as an animal predator whose natural prey was white women. In words strikingly similar to those Dilulio and Clinton would later use for their super-predator, Thomas Dixon’s influential 1905 novel The Clansman: A Historical Romance of the Ku Klux Klan described the Black brute as “half child, half animal… a being who, left to his will, roams at night and sleeps in the day, whose speech knows no word of love, whose passions, once aroused, are as the fury of the tiger.” When turned into a movie in 1915 as The Birth of a Nation (the first film ever screened in the White House), it depicted a Black man’s animalistic rape of a virtuous white woman and reveled in the Klan’s retribution by lynching.

In effect, the rhetoric about “super-predators” revived the most virulent stereotype from the Jim Crow lexicon. By the end of President Clinton’s term in 2000, nearly every state in the nation had stiffened its laws on juveniles, setting aside family courts and sending young, mainly minority, offenders directly to adult prisons for long sentences.

Of course, the predicted wave of 30,000 young super-predators never happened. Instead, violent juvenile crime was already declining when Hillary Clinton gave that speech. By the time President Clinton’s term ended in 2001, the juvenile homicide rate had fallen well below its level in 1985.

Amazingly, it would be another 20 years before Hillary Clinton was compelled to confront the meaning of those freighted words of hers. While she was speaking to a donors’ meeting in South Carolina during her 2016 presidential campaign, Ashley Williams, a young Black activist, stood up in the front row and unfurled a small banner that read: “We have to bring them to heel.” Speaking calmly, she asked: “Will you apologize to black people for mass incarceration?” And then she added, “I am not a super-predator, Hillary Clinton.”

When Clinton tried to talk over her, she insisted: “I know that you called black people super-predators in 1994.” As the Secret Service hurried that young woman out of the room amid taunts from the largely white audience, Clinton announced, with a palpable sense of relief, “Okay, back to the issues.”

In its report on the incident, the Washington Post asked Clinton for a comment. In response, she offered the most unapologetic of apologies, explaining that, back in 1994, she had been talking about “violent crime and vicious drug cartels and the particular danger they pose to children and families.”

“As an advocate, as first lady, as senator, I was a champion for children,” she added, though admitting as well that, “looking back, I shouldn’t have used those words.”

That was it. No mention of mass incarceration. No apology for using the power of the White House pulpit to propagate the most virulent of racial stereotypes. No promises to undo all the damage she and her husband had caused. Not surprisingly, in November 2016, the African-American turnout in 33 states — particularly in the critical swing states of Florida, Michigan, Pennsylvania, and Wisconsin — was markedly down, costing her the election.

The burden of this past As much as both Republicans and Democrats might wish us to forget the costs of their deals, this tragic past is very much part of our present. In the 20 years since the drug war took final form under Clinton, politicians have made some relatively inconsequential reforms. In 2010, Congress made a modest cut in the sentencing disparity between the two kinds of cocaine that reduced the prison population by an estimated 1,550 inmates; Barack Obama pardoned 1,700 drug offenders; and Donald Trump signed the First Step Act that released 3,000 prisoners. Add up all those “reforms” (roughly 6,250 people in all) and they cover only about 1.5% of those now in prison for drug offenses — just the tiniest drop of mercy in a vast ocean of misery.

So, even 50 years later, this country is still fighting a war on drugs and on non-violent drug users. Thanks to its laws, petty drug possession is still a felony with heavy penalties. As of 2019, this country’s prisons remained overcrowded with 430,900 people convicted of drug crimes, while drug offenders represented 46% of all those in federal penitentiaries. In addition, the U.S. still has the world’s highest incarceration rate at 639 prisoners per 100,000 population (nearly double Russia’s), with 1,380,400 people imprisoned, of whom 33% are Black.

So many decades later, the drug war’s mass incarceration still denies millions of African Americans the right to vote. As of 2020, 48 states refused their convicts the vote, while 34 states imposed a range of restrictions on ex-convicts, effectively denying suffrage to about 2.2 million Blacks, or 6.3% of all African-American adults.

Recent challenges have made more visible the drug war’s once largely invisible mechanisms for denying African Americans their rightful political power as a community. In a 2018 plebiscite, Florida voters restored electoral rights to that state’s 1.4 million ex-convicts, including 400,000 African Americans. Almost immediately, however, Republican governor Ron DeSantis required that 800,000 of those felons pay whatever court costs and fines they still owed before voting — a decision he successfully defended in federal court just before the 2020 presidential election. The effect of such determined Republican efforts meant that fewer than 8% of Florida’s ex-convicts were able to vote.

But above all, Black male drug users are still stigmatized as dangerous predators, as we all saw in the recent trial of Minneapolis police officer Derek Chauvin, whose defense tried to justify his kneeling on George Floyd’s neck for nine minutes by pointing to the opioids an autopsy found in the victim’s blood. And in March 2020, a paramilitary squad of Louisville police broke down an apartment door with a battering ram on a no-knock drug raid aimed at a suspected Black drug dealer and wound up killing his sleeping ex-girlfriend, medical worker Breonna Taylor.

Maybe now, half a century later, it’s finally time to end the war on drug users — repeal the heavy penalties for possession; pardon the millions of nonviolent offenders; replace mass incarceration with mandatory drug treatment; restore voting rights to convicts and ex-convicts alike; and, above all, purge those persistent stereotypes of the dangerous Black male from our public discourse and private thoughts.

If only…


Alfred W. McCoy writes regularly for TomDispatch. He is the Harrington professor of history at the University of Wisconsin-Madison. He is the author most recently of In the Shadows of the American Century: The Rise and Decline of U.S. Global Power (Dispatch Books). His latest book (to be published in October by Dispatch Books) is To Govern the Globe: World Orders and Catastrophic Change.

Copyright ©2021 Alfred W. McCoy — distributed by Agence Global

—————-

Released: 06 July 2021

Word Count: 3,412

—————-
