Agence Global

Steven Pressman, “Will Household Debt Derail the US Economy?”

November 7, 2022 - The Washington Spectator

US household debt hit a record $16.15 trillion in the second quarter of 2022. Mortgage debt accounts for 75% of the total; college debt another 10%. The rest is mainly motor vehicle and credit card debt.

Rising household debt over the past two years is worrisome and will become a greater problem as interest rates continue to rise. More worrisome still are its consequences. Excessive debt can force people to reduce their spending, which will slow economic growth and lead to a recession.

$16 trillion is a large number. But what really matters is debt relative to the income people have available to repay it. College debt exceeding $100,000 is not a big deal for doctors and lawyers making several times that amount each year. The situation is very different for teachers making half what they owe.

From this perspective, things have gotten better lately. Household debt relative to disposable income fell from 150% in 2009 to 130% in 2016, where it has since remained, and household finances improved during the coronavirus pandemic. Consumer debt payments (excluding mortgages) relative to disposable income fell from 13% in 2007 to 8.4% in early 2021. Delinquencies (debt payments 90-plus days overdue) fell from 3% before covid to under 2% in 2020 and 2021. Covid (which kept people at home and reduced spending), low interest rates, and generous government benefits during the pandemic (stimulus checks, the child tax credit, and a moratorium on repaying college debt) helped bring this about.

Furthermore, debt isn’t always bad. Mortgages let households purchase a home and gain equity when they pay it off. Borrowed money lets people attend college and buy cars. Debt also enables people to survive hard times — a layoff, gig workers getting fewer gigs and non-gig workers getting fewer hours, or any catastrophe precluding employment for some time. However, high debt levels make life stressful and difficult. People worry about eviction, utilities being shut off, putting food on the table, as well as saving for retirement.

Household debt also stimulates the economy and creates jobs. But this, too, is a double-edged sword. Even a small spending cutback due to high debt levels will have negative macroeconomic consequences. Inventories will pile up, and firms will cut production and lay off workers. Service-sector workers will receive less income and have their gigs or hours reduced.

The main force reducing consumer spending since the 1980s has been the greater share of total income going to the rich, who save large fractions of their income. This leaves less for households struggling to maintain their standard of living. Since low- and middle-income households typically spend nearly all their income, these households have gone into debt or deeper into debt.

But households can handle only so much debt. While the actual breaking point is uncertain, the economic results can be catastrophic once this limit is reached.

In 2008, the Great Recession began when homeowners couldn’t pay their mortgages. Lehman Brothers collapsed, and many other financial institutions stood on the brink of bankruptcy. The government bailed out the financial institutions that created the problem, but then did little to help households saddled with mortgages they couldn’t possibly repay. This is one reason the economic recovery was so weak and why it took nearly a decade for household income (adjusted for inflation) to return to its pre-2008 level.

A similar problem led to the October 1929 stock market crash and Great Depression. During the Roaring 20s people bought stock on borrowed money. Once stock prices fell a bit, margin calls went out. Lacking sufficient savings to repay loans from stockbrokers, people had to sell stocks to get cash and repay their loans. This pushed down stock prices further, generating more margin calls, and eventually a market crash. What happened on Wall Street soon affected Main Street, as everyone became reluctant to spend money.

While household debt levels have not yet approached a tipping point, four forces will sharply increase debt-to-income ratios in the months ahead.

First, the Federal Reserve has been raising interest rates since early this year and plans to continue doing so at least through the end of 2022. This will increase rates on credit cards, college debt, and car loans. We have already seen one consequence of this — the ratio of consumer debt payments to disposable income rose to 9.5% in the second quarter of this year (from its record low of 8.4% last year).

Second, the majority of household debt comes from housing. Home prices grew 4.5% annually from 1992 to 2019. Since 2020, they have soared more than 10% a year, resulting in a nearly 40% increase in home prices between 2019 and today. Homes are less affordable now than at any time since June 1989. As the Fed continues raising interest rates, housing prices will begin to fall, putting some homeowners underwater. As in the Great Recession, many won’t be able to repay their mortgages and will lose their homes.

Third, President Biden recently announced that the college debt moratorium would end in 2023. This moratorium is a leading reason household debt was less problematic during the covid pandemic. Money not needed for college loan payments went toward repaying other debts and kept people from falling further behind. When college loan repayments resume in January, many households will struggle to pay their bills and also repay their debt.

Finally, government benefits helped US households during the coronavirus pandemic. These benefits have ended. Families struggling to make ends meet must now rely on high-interest borrowing (credit cards, payday loans and auto title loans) to survive.

The good news is that financially strapped households can be helped. For starters, the Fed can stop raising interest rates before they push the economy over the edge.

A more difficult fix is meaningful bankruptcy reform, including allowing people to discharge their college debt, rather than being squeezed during their working lives and then having whatever they still owe taken from their Social Security checks.

Before the 2005 Bankruptcy Reform Act it was easy and cheap for people to reduce their debt through bankruptcy. This is no longer the case. Now people must take two credit counseling courses before having their debt reduced. Many studies have found these classes to be worthless. They don’t change behavior, but they are costly for people already drained of their financial resources. Moreover, delaying bankruptcy protection leads to abuse by creditors and possible loss of one’s home and car.

For many people, bankruptcy is the only option to escape from the crushing debt that comes from job loss, enormous medical bills, divorce, and other unanticipated events. Changing the bankruptcy code is needed. Towards this end, Senator Elizabeth Warren (D-MA) introduced a new bankruptcy bill in the Senate in 2020. It remains stuck in committee, lacking the votes to end a Republican filibuster.

A more liberal bankruptcy law would help, but it doesn’t solve the underlying problem. People accumulating great debt, and then eliminating it in bankruptcy court every half-dozen years or so, epitomizes Einstein’s quip about insanity — it is “doing the same thing over and over and expecting different results.” The root cause of the household debt problem — greater inequality — needs addressing. As noted above, when more income goes to the top 1%, everyone else must pick up the slack by spending more or going into greater debt. If this doesn’t happen, economic growth slows and household debt levels become more problematic — not due to more debt but due to having less income to repay that debt. This is why higher taxes on corporations and the rich, and more generous spending programs (for example, reviving the refundable child tax credit and increasing Social Security and Medicare benefits), are needed. It is for the good of the economy and the nation.

Unless some action is taken, household debt will continue to rise. And it will threaten to rise to the point where it breaks the backs of American households and the US economy.

Steven Pressman is part-time professor of economics at the New School for Social Research, professor emeritus of economics and finance at Monmouth University and author of Fifty Major Economists, 3rd edition (Routledge, 2013).

Copyright ©2022 The Washington Spectator — distributed by Agence Global

—————-
Released: 07 November 2022
Word Count: 1,321
—————-

Frida Berrigan, “How to survive us”

November 7, 2022 - TomDispatch

When I was growing up, there was a parody of an old-fashioned public announcement tacked to the wall of our kitchen that I vividly remember. It had step-by-step instructions for what to do “in case of a nuclear bomb attack.” Step 6 was “bend over and place your head firmly between your legs”; step 7, “kiss your ass goodbye.”

That shouldn’t be surprising, since my parents, Philip Berrigan and Elizabeth McAlister, once-upon-a-time priest and nun, were well-known antinuclear activists. I was too young to be a part of the “duck-and-cover generation” who, at school, practiced hiding from a nuclear attack beneath their desks or heading for local bomb shelters in the basements of churches and town halls.

Born in 1974, I think of myself as a member of The Day After generation, who were instructed to watch that remarkably popular made-for-TV movie in 1983 and report on our observations and feelings. Dramatizing the life of people in a small town in Kansas after a full-scale nuclear war between the Soviet Union and the United States, it made a strong (if perhaps unintentional) case that dying in the initial blast would have been better than surviving and facing the nuclear winter and over-armed chaos that followed.

In this Ukraine War era, maybe we could label today’s kids as the Generation Fed Up With Grown Ups (Gen Fed Up). The members of Gen Z are “digital natives,” born with smartphones in their hands and instantly able to spot all the messy seams in, and agendas behind, poorly produced, uninformative Public Service Announcements like the New York City Emergency Management department’s much-pilloried recent PSA about what to do in case of — yep, you guessed it! — a nuclear attack: get inside, stay inside, and stay tuned. (Sounds pretty close to the poster on my wall growing up, doesn’t it?)

Young people need real information and analysis, survival skills and resources. Generation Z and the younger Generation Alpha (I have some of both in my family) are growing up in a world torn apart by the selfishness and shortsightedness of earlier generations, including the impact of the never-ending production and “modernization” of nuclear weapons, not to speak of the climate upheaval gripping this planet and all the horrors that go with it, including sea level rise, megadrought, flooding, mass migration, starvation, and on and on and on…

Jornada del Muerto

The nuclear age began during World War II with the July 16, 1945, test of a six-kilogram plutonium weapon code-named Trinity in the Jornada del Muerto Valley in New Mexico. No one bothered to tell the estimated 38,000 people who lived within 60 miles of that atomic test that it was about to take place or that there might be dangerous nuclear fallout following the blast. No one was evacuated. The area, whose Spanish name in translation means, appropriately enough, Journey of Death, was rich in indigenous culture and life, home to 19 American Indian pueblos, two Apache tribes, and some chapters of the Navajo Nation. Though hardly remembered today, those residents were the first nuclear casualties of our age.

That initial test was quickly evaluated as successful and, less than a month later, American war planners considered themselves ready for the ultimate “tests” — the atomic bombing of two Japanese cities, Hiroshima on August 6th and Nagasaki three days later. The initial blasts from those back-to-back bombs killed hundreds of thousands of people on the spot and immediately thereafter, and countless more from radiation sickness and cancer.

Fat Man and Little Boy, as those bombs were bizarrely code-named, should have signaled the end of nuclear war, even of all war. The incineration of so many civilians and the leveling of two major cities should have been motivation enough to put the cork in the deadly power of the atom and consign nuclear weapons to some museum of horrors alongside the guillotine, the rack, and other past devices of obscene torture.

But it would prove to be just the beginning of an arms race and a cheapening of life that goes on to this day. After all, this country continues to “modernize” its nuclear arsenal to the tune of trillions of dollars, while Vladimir Putin has threatened to use one or more of his vast store of “tactical” nukes, and the Chinese are rushing to catch up. I keep thinking about how 77 years of nuclear brinkmanship and impending doom has taken its global toll, even while making life more precarious and helping render this beautiful and complex planet a garbage can for forever radioactive waste. (Okay, okay, hyperbole alert… it’s not forever, just literally a million years.)

Some among the duck-and-cover generation feared that they wouldn’t live to see adulthood, that there would be no tomorrow. Not surprisingly, too many of them, when they grew up, came to treat the planet as if there indeed were no tomorrow. And you can see evidence of just that attitude any time you consider the “prosperity” of the second industrial revolution with its toxic sludge of fossil fuels, PCBs, asbestos, lead in paint and gas, and so many plastics. This polluting of our ground, water, and air was all, I suspect, spurred on by a nihilistic nuclearism.

It seems impossible to work so hard to shift from burning carbon to capturing solar or wind power if there’s a chance that it could all go up in a mushroom cloud tomorrow. But there have been some notable efforts from which to draw hope and inspiration as we keep living out those very tomorrows. As environmentalist and futurist Bill McKibben writes in his memoir The Flag, The Cross and The Station Wagon: A Graying American Looks Back on His Suburban Boyhood and Wonders What The Hell Happened, President Jimmy Carter tried to guide this country to a less carbon-dependent future — and it cost him the presidency. The Carter White House sought to mitigate the damage of the 1979 oil crisis with significant investments in solar power and other green technologies and cutting-edge conservation. Had such policies been allowed to take hold, as McKibben points out, “climate changes would have turned from an existential crisis to a manageable problem on a list of other problems.”

Can you imagine? We love Carter now for his folksy accessibility, moral stamina, and promotion of affordable housing through Habitat for Humanity, but as we doom-scroll the latest news about present and future climate catastrophes, we have to reach back through time to even imagine a healthier tomorrow. Sadly enough, with Carter, we might have been near a turning point, we might have had a chance… and then actor (and huckster) Ronald Reagan rode his 10-gallon cowboy hat into the White House, removed the rooftop solar panels the Carters had installed, instituted tax cuts for the very wealthy, and loosened regulations on every type of polluter. Reagan removed those solar panels in 1986, only a year or so after the last month of our era in which the planet was cooler than average.

Tomorrow

1986 seems like just yesterday! Now what? How about tomorrow?

After all, here we are in 2022 about to hit eight billion strong on this planet of ours. And there is, of course, a tomorrow. Hotter and drier but dawning all the same. Wetter and windier but coming anyway.

I have three kids, ages 8, 10 and 15, and they anchor me in a troubling and strange, if still ultimately beautiful, reality. This world, however finite with its increasingly overwhelming problems, is still precious to me and worth a good fight. I can’t turn away from tomorrow. It’s not an abstraction. The headlines now seem to endlessly scream: we are at a potential tipping point in terms of the climate. Did I say a potential tipping point? I meant to make that plural. In fact, an article in the September 8th issue of the Guardian lists 16 of them in all. Sixteen! Imagine that!

Three of the biggest ones that climate scientists agree we’re close to tipping over are:

1. The collapse of Greenland’s ice cap, which will produce a huge rise in global sea levels.

2. The collapse of a key current in the north Atlantic Ocean, which will further disrupt rainfall and weather patterns throughout the world, severely curtailing global food production.

3. The melting of the Arctic’s carbon-rich permafrost, releasing staggering amounts of greenhouse gas emissions into the atmosphere and so further broiling this planet. (Will it freeze again if we do the right thing? Not likely, as that tipping point seems to have tipped already.)

In the face of all of this, in the age of Donald Trump, Vladimir Putin, Elon Musk, and the rest of the crew, how do you change political or corporate behavior to slow, if not reverse, global warming? More than three-quarters of a century of uncertain tomorrows has made the human race — particularly, of course, those in the developed/industrialized world — awful stewards of the future.

“So when we need collective action at the global level, probably more than ever since the second world war, to keep the planet stable, we have an all-time low in terms of our ability to collectively act together. Time is really running out very, very fast.” So said Johan Rockström, director of the Potsdam Institute for Climate Impact Research in Germany. As he added tellingly, speaking of the global temperature ceiling set at the Paris climate accords in 2015 (and already considered out of date in the latest devastating United Nations report), “I must say, in my professional life as a climate scientist, this is a low point. The window for 1.5C is shutting as I speak, so it’s really tough.”

Dire predictions, reams of science, sober calls to act from climatologists and activists, not to speak of island and coastal communities already being displaced by a fast-warming world. Only recently, two young people from the climate movement Last Generation threw mashed potatoes at the glass covering a classic Claude Monet painting in a museum near Berlin in a bid to get attention, while activists from Just Stop Oil used tomato soup on the glass of Vincent Van Gogh’s Sunflowers in London in October. In neither case were the paintings themselves harmed; in both cases, they have my attention, for what that’s worth.

For striking numbers of climate refugees globally, the point has already tipped and, given their situations, they might like to have some tomato soup and mashed potatoes — to eat rather than to be flung as protest props. In the longer term, for their children and grandchildren, they need masses of people in the biggest greenhouse gas polluters — China and the United States top the list — to radically alter their lifestyles to help protect what’s left of this distinctly finite planet of ours.

Yesterday

Thomas Berrigan, my grandfather, was born in 1879. My grandmother Frida was born in 1886. While they missed the pre-industrial era by more than 100 years, their early lives in the United States were almost carbon-free. They hauled water, chopped wood, and largely ate from a meager garden. As poor people, their carbon footprint remained remarkably small, even as the pace and pollution of life in the United States and the industrialized West picked up.

My father, Philip Berrigan, born in 1923, was the youngest of six brothers. There could have been two more generations of Berrigans between his birth and mine in 1974, but there weren’t. I could have been a grandmother when I gave birth to my last child in 2014, but I wasn’t. So, in our own way, whether we meant to or not, we slowed down the march of generations and I’m grateful for the long perspective that gives me.

In her later years, my grandmother marveled at the ways in which a car could bring her back and forth to the city “all in one day.” More recently, her great-grandchildren have found that they could still go to school (after a fashion) thanks to computers during the Covid pandemic, communicating in real-time with teachers and classmates scattered elsewhere in our world.

It’s not likely that I’ll live until 2079, my grandfather’s 200th birthday, but his great-granddaughter, my daughter Madeline, will just be turning 65 then. If she has my mother’s longevity, she’ll be 86 when we hit the year 2100. That is the grim milestone (tombstone?) when climate scientists expect that we could reach a disastrous global average temperature of 2.1 to 2.9 degrees Celsius above pre-industrial levels. Unless. Unless something is done, many somethings are done, to reverse greenhouse gas emissions. Otherwise, that spells disaster beyond measure for my children’s children.

When I look at old photos, I see my own face in my mother’s hollowed-out, age-spotted cheeks. And when I look at my daughter’s still chubby cheeks and the way her eyebrows arch, I see my own younger face (and my mother’s, too).

As far as I’m concerned, the year 2100 is my future, even though I won’t be here to struggle through it with my children and their children. In the meantime, we keep putting one foot in front of the other (walking is better for the environment anyway) and struggling somehow to deal with this beautiful, broken world of ours. One generation cedes to the next, doing its best to impart wisdom and offer lessons without really knowing what tools those who follow us will need to carve a better tomorrow out of a worsening today.

To go back to the beginning, while such a thing is still possible, if nuclear weapons, the doctrine of mutually assured destruction, fossil fuels, and apocalyptic fear helped get us to this breaking point, we need something truly different now. We need not war, but peace; not new nukes, but next-generation-level diplomacy; not fossil fuels, but the greenest of powers imaginable. We need a world that Donald Trump, Vladimir Putin, Elon Musk, and their ilk can’t even imagine, a world where their kind of power is neither needed, nor celebrated.

We need gratitude, humility, and awe at the deep web of interconnection that undergirds the whole of nature. We need curiosity, joy in discovery, and celebration. And our kids (that Gen Fed Up) can help us access those powers, because they’re inherent in all children. So, no more ducking and covering, no more Day After, no more staying inside. Let us learn from Generation Z and Generation Alpha and change — and maybe survive.

Frida Berrigan is the author of It Runs In The Family: On Being Raised by Radicals and Growing into Rebellious Motherhood. She writes regularly for TomDispatch (where this article originated) and writes the Little Insurrections column for WagingNonviolence.Org. She has three children and lives in New London, Connecticut, where she is a gardener and community organizer.

Copyright ©2022 Frida Berrigan — distributed by Agence Global

—————-
Released: 07 November 2022
Word Count: 2,408
—————-

Andy Kroll, “Your factoids against mine”

November 3, 2022 - TomDispatch

For about a week in the summer of 2018, I caught an early-morning train from Washington, D.C., to the Albert V. Bryan federal courthouse in the suburb of Alexandria. Located a short drive from George Washington’s estate at Mount Vernon, that courthouse serves the Eastern District of Virginia. It has played host to a wide variety of closely watched cases, from terrorism trials and inscrutable cybersecurity matters to the government’s prosecution of whistleblowers Daniel Hale and Chelsea Manning.

The defendant whose trial I was covering was Paul Manafort, who had been the chairman of Donald Trump’s first presidential campaign. The special counsel investigation led by former FBI director Robert Mueller into Russian interference in the 2016 election had resulted in Manafort’s indictment on multiple charges of conspiracy, money laundering, and other financial crimes. He denied the allegations and decided to take his chances at trial, putting his future in the hands of 12 northern Virginia jurors.

The Eastern District — EDVA, as it’s better known — is notorious for its old-school rules. Unlike most legal venues, reporters and members of the public aren’t allowed to bring electronics of any kind into that courthouse. There are no lockers or storage units on-site. Each morning, I waited in line (along with half of the D.C. press corps) inside a small café across from the courthouse to pay $10 to store my phone and laptop underneath the cash register. Bereft of my devices, I was left to cover the Manafort case the way a reporter would have in the 1960s — with pen and paper, scrawling notes on a pad on my knee and later spending as much time deciphering those jottings as I did writing up the day’s events.

I’ll never forget the experience of covering that trial. Joining me in the courtroom gallery most days were a dozen or so self-described “trial tourists,” people who had taken a day off from work to sit in on the case. A few silver-haired retirees had traveled from other states to hear expert witnesses testify about Manafort’s money-laundering operation or his taste in lavish ostrich-skin coats and luxury real estate. But what stays with me most is the way that all the usual noise, chatter, tweets, and din of this bizarre American moment seemed to stop at the courthouse doors. Stepping into Room 900, I felt like some celestial being had pressed the “Mute” button on the outside world.

The jury would ultimately convict Manafort on eight counts of financial fraud. Afterward, one juror, a Donald Trump supporter, told Fox News that she had wanted to find Manafort innocent, “but he wasn’t. That’s the part of a juror,” she explained, “you have to have due diligence and deliberate and look at the evidence and come up with an informed and intelligent decision, which I did.”

I remember her comments because they seemed to confirm what I had observed covering the case — in that courtroom, it didn’t matter whose tweet got the most “likes” or whose video tallied the most views. It felt, strangely enough, like a refuge from the modern mania of social media and Trumpism, an old-fashioned bastion of facts, rationality, and truth.

My mind flashed back to Paul Manafort as I watched the two recent trials of Alex Jones, the prominent conspiracy theorist and founder of the website Infowars. He faced lawsuits in Texas and Connecticut filed by parents whose children had died in the 2012 Sandy Hook school shooting. Jones had spent years spreading cruel lies about that mass killing, calling it a “hoax” and a “false flag” operation, while also accusing those parents of being “crisis actors” whose children were never actually killed.

In both cases, a judge had already ruled against Jones; the question before the two juries was how much he should pay to those Sandy Hook families. In the end, they would together award the families more than $1 billion in damages — money that Jones promptly claimed he didn’t have and couldn’t pay. The Jones trials also marked one of the few times that he faced any sort of accountability for his years of conspiracy theories. Unlike on his show or on social media, in court he couldn’t say whatever he wanted regardless of whether it was true. “You believe everything you say is true, but it isn’t,” Judge Maya Guerra Gamble admonished him. “That is what we’re doing here…Things must actually be true when you say them.”

The loudest voice in the room

We live in an era when the truth can feel like whatever the loudest voices claim it is, whether the most extreme version of events or the one that feels right (even if it isn’t). I’ve covered scores, if not hundreds, of campaign rallies and stump speeches in my 15 years as a journalist. I tend to find my conversations with people in those crowds far more revealing than anything uttered by the candidate onstage, including, of course, that ultimate on-stager Donald Trump.

Lately, I’ve noticed a familiar refrain in those interviews. Once upon a time, rival politicians or competing media pundits normally agreed on at least a modest set of shared basic facts — humans are warming the planet to dangerous levels, say, or democracy works best when everyone participates — and then competed for votes based on how they interpreted and acted upon those facts.

Nowadays, though, rallygoers tell me that it’s ever harder to know what’s true and what’s false, to sift out right from wrong. Today’s politicians and pundits — particularly, though not exclusively, on the Trumpian right — seem not only to have their own opinions but their own “facts” to go with them. In their eyes, it’s increasingly difficult to know who’s being honest anymore. And the response, all too often, is a rhetorical and sometimes literal throwing up of the hands, an acceptance that no one can be trusted, that the facts are simply unknowable.

Surveys measuring the American public’s trust in its institutions capture this phenomenon strikingly. Trust in Congress, the presidency, the news media, and — once inconceivable — even the military is steadily eroding, as fear, suspicion, and resentment become the currency of American politics in this century. But if there was one institution that, until recent years, seemed to withstand this trend, it was the third branch of government, the judicial system.

Of all the institutions vital to American democracy, the courts have held remarkably steady, even during the turbulent years of Donald Trump’s presidency. This was, after all, a man who believed himself above the law, viewed the justice system as a tool to pardon his friends and punish his enemies, and lashed out whenever a judge constrained his executive actions. From one of Trump’s earliest moves as president — a ban on citizens of seven mostly Muslim countries entering the U.S. — to the 62 lawsuits that he and his supporters filed attempting to overturn the 2020 election results, the courts proved resilient in the face of unrelenting attacks.

An independent judiciary is more essential than ever when facts are under assault. As they did in the Manafort case I covered and the more recent Alex Jones trials, the courts can act as a firewall for the truth, a last resort for sifting real from fake, nonsense from reality.

There is, of course, a long and sordid history of courts dealing setbacks to the cause of progress. Look no further than the Supreme Court’s infamous decisions in Dred Scott v. Sandford, Plessy v. Ferguson, or far more recently Shelby County v. Holder, which gutted a key provision of the landmark Voting Rights Act of 1965. But in a truth-challenged era, the courts long remained one of the last holdouts where people could trust that they would at least get a reasonably fair hearing based on the facts, whatever their views or politics.

Or at least that’s how it looked until recently.

According to Gallup, at any given moment over nearly the last five decades, somewhere between two-thirds and three-quarters of Americans claimed to have a “great deal” or at least a “fair amount” of trust in the judicial branch. As recently as 2019, 69% of those surveyed expressed confidence in the nation’s courts, including the Supreme Court. And yet in the three years since then — as Donald Trump (with a big helping hand from Mitch McConnell) stacked the Supreme Court — support has plummeted to a dismal 47% this year. At the same time, a record number of Americans (58%) said they disapproved of the Supreme Court’s performance, while just 40% approved.

That steep drop in trust has no doubt been shaped by recent controversies. At the top of that list is the decision by the Supreme Court’s conservative majority to overturn Roe v. Wade, a decades-old precedent to which many of the justices who struck it down had previously paid lip service as settled law.

But the dwindling faith in the courts isn’t purely a reflection of the decision to strike down Roe. It’s now all too common to see federal judges described in news stories and on TV as “Obama judges” or “Trump judges,” “Bush judges” or “Clinton judges,” as if that somehow will help the audience make sense of the decision in question. Not only does that moniker too often prove misleading, but it fuels the notion that judges are nothing more than “politicians in robes,” as the saying goes.

It’s one thing to critique the current crop of Supreme Court justices for decisions that fly in the face of longstanding precedents, especially when those same judges vowed to respect precedent during their confirmation hearings. But the trend toward describing all judges in political terms undoubtedly leaves the impression that the judicial system is little more than a dressed-up political body, just another place where the ever fiercer partisan battle lines and tribal loyalties come into play.

Admittedly, there have indeed been recent non-Supreme Court decisions, too, that seem to suggest former President Trump succeeded in creating a more political judicial system when he pushed through over 200 judicial confirmations — some of them deemed by the American Bar Association unqualified for the bench, nearly all of them deemed loyal to the conservative doctrine of originalism — in the hope that they would rule favorably for him. (“If it’s my judges, you know how they’re gonna decide,” was Trump’s classic comment during the 2016 presidential campaign.) In Florida, for instance, Trump-appointed Judge Aileen Cannon has handed down one mystifying ruling after another in the ongoing litigation over the ex-president’s refusal to hand over all the classified and non-classified documents he took with him to his Mar-a-Lago estate. But there are far more Trump-appointed judges who have reviewed and dismissed legal challenges to the 2020 election or presided fairly over the criminal prosecution of various January 6th rioters. “There was nothing patriotic about what happened that day — far from it,” Judge Timothy Kelly, a Trump appointee, said in August. “It was a national disgrace.”

The speed of truth

Thinking back to that courtroom in Alexandria in 2018, I learned a lesson: The truth moves slowly. Far more slowly than the velocity of a viral tweet or an infuriating Facebook post. The first story you encounter online about a major world event or a breaking-news story may not be the most accurate version of what happened, if it’s accurate at all. Truth takes time to reveal itself. That time can feel longer than ever in a world where we’ve become conditioned to believe that we can have all the facts at our fingertips in an instant. Make us wait and we lose interest.

The five years I spent reporting for my just-published book, A Death on W Street: The Murder of Seth Rich and the Age of Conspiracy, put this lesson about truth into greater relief. The book chronicles one of the most searing truth crises of the last five years — the story of a young man, Seth Rich, whose death became a global conspiracy theory, a partisan talking point, and a Fox News rallying cry. The false and fantastical theories about Rich, a 27-year-old staffer for the Democratic Party who was gunned down on a Washington street in 2016, began spreading mere hours after his murder had been publicly announced. The amplification of those lies happened almost instantaneously, faster than anyone could keep track of them, let alone stop them.

When Rich’s family exhausted their options to correct the record through media interviews and other public statements, they decided their only remaining choice was to seek accountability in a court of law. The Riches sued Fox News and people in Fox’s orbit, and ultimately reached settlements that helped protect the truth and restore Seth’s reputation and memory.

But it took three years of litigation to achieve those outcomes in court. Put another way, it took three long years for the facts and realities of Rich’s life and death to catch up with the fantasies, memes, and conspiracy theories spread about him. Still, at least there remained a venue for Rich’s family to receive a fair hearing, a protected space for an honest accounting of what was true and what wasn’t.

And yet today, that space seems increasingly under threat.

At stake in this year’s midterm elections is control of the House of Representatives and the Senate. Much has been written about what a Republican majority might do with its newfound subpoena power should the GOP retake control of the House. But when it comes to the courts, the Senate is crucial, since it controls the judicial confirmation process, approving or blocking nominees to fill dozens of openings across the federal court system. If Mitch McConnell returns to his position as Senate majority leader, it’s a good bet that he’ll thwart President Biden’s attempts to fill those vacancies before the 2024 election.

And if that next presidential contest were to usher in a Republican president (especially you know who), McConnell and his fellow Republicans will again have the power to usher onto the federal bench the next generation of Samuel Alitos and Clarence Thomases. And then, watch out!

The Supreme Court excepted, the judicial system has largely stood firm in the face of a half-decade of Trumpian attacks and a surge in conspiratorial politics. Our judicial branch still remains a refuge for the facts. The question is: How much longer can it hold on?

Andy Kroll is an investigative journalist with ProPublica based in Washington, D.C. His just-published book is A Death on W Street: The Murder of Seth Rich and the Age of Conspiracy. Follow him on Twitter at @AndyKroll and on Facebook. This article first appeared at TomDispatch.

Copyright ©2022 Andy Kroll — distributed by Agence Global

—————-
Released: 03 November 2022
Word Count: 2,390
—————-

Tom Engelhardt, “An obituary for our world”

November 1, 2022 - TomDispatch

Oddly enough, I’ve read obituaries with fascination from the time I was quite young. And yet, in all these years, I’ve never really reflected on that fact. I don’t know whether it was out of some indirect fascination with death and the end of it all or curiosity about the wholeness (or half-ness or brokenness) of an individual life in full. But here’s the odd thing: in all that time — put it down to the charm of youth or, later, perhaps a lingering sense of youthfulness or, at least, agelessness — I never really thought about my own obituary. Like so many of us when younger, I simply couldn’t imagine my own death. Against all reason, it seemed strangely inconceivable.

Now, at 78, I find that obituaries are again on my mind — and not just because people I knew are being featured in them all too often these days or for that other all-too-obvious reason, which I hardly need to spell out here. As a matter of fact, if you put my last name or yours into a search engine, you may be surprised at how many obituaries come up. It turns out, in fact, that Engelhardts have been dying for centuries now.

After all, the one obituary you can’t really have is your own; at least, not unless you decide to write it yourself or you’re so well known that a newspaper obit writer interviews you as one of the “pre-dead” while you’re still kicking. Of course, for the best known among us, such pieces, as at the New York Times, are prepared and written well in advance because the one thing we do know, whether we think about it or not, accept it or not, is that we all will indeed die.

Nuclear winter or a climate-change-induced nuclear summer?

Let’s not be shy. If there’s one word that comes to mind (mine anyway) at the moment, it’s madness. And no, believe it or not, I’m not even thinking about Donald Trump or the crazed crew of election deniers, QAnon conspiracy believers, and white nationalists who have become the essence of the Republican Party and may sweep to victory, at least in the House of Representatives, only days from now. And no, neither am I thinking about the Trumpist-leaning Supreme Court that might single-handedly (or perhaps hand in hand with all too many voters on November 8th) send us even further down the road to autocracy or at least to an eternally Republican-controlled mania-ocracy.

From the time we left our Neanderthal cousins in the dust, the story of humanity is tens of thousands of years old; and our history — you know, since we first began herding other creatures, raising crops, and arming ourselves to the teeth — is thousands of years old. In all those eons, we discovered so many things, both uplifting and down-thrusting. But perhaps, looking back (if, given our present circumstances, anyone’s even bothering), the most remarkable thing may be that we discovered — once quite purposely and once without at first even noticing that we’d done so — two different ways to do ourselves in. And, believe me, I’m using that word advisedly, given the Elizabethan moment that passed only recently, leaving so many of us watching a “news” spectacle that was her obituary and nothing else but that for what seemed like ever and a day. Now, of course, the former British queen is gone not just from our world but from that news cycle, too. Not a trace of her remains. Nothing, it seems, lasts long these days, Donald Trump aside. And if things continue to go ever wronger on this planet of ours — and I wouldn’t Truss (joke, joke) that they won’t — it’s possible that she could indeed prove to be the last queen.

As I’m sure you already know, those two discoveries I’m thinking about are nuclear weapons and climate change. Each of them should be on all our minds right now for reasons almost too obvious to enumerate. Our own president recently chatted privately with Democratic Party donors about the possibility that we might indeed face “Armageddon” (his word, not mine) for the first time since the Cuban missile crisis of 1962. That would be thanks to Vladimir Putin’s invasion of Ukraine and the Russian president’s threat (“this is not a bluff“) to use nuclear weapons for, as he himself pointed out, the first time since the United States ended World War II by obliterating the cities of Hiroshima and Nagasaki.

In a sense, however, whether Putin ever uses those “tactical” nuclear weapons or not, he has, in his own uniquely deplorable fashion, already nuked this planet. His decision to invade Ukraine and, after an eight-month disaster (including the especially dangerous occupation of a Ukrainian nuclear power plant), only increase the level of destruction, while evidently looking for no off-ramp whatsoever, has sent energy politics in the worst possible direction. Some desperate European countries have already turned back to coal power; militaries are burning ever more fossil fuels; gas prices have been soaring globally; and what modest attention was focused on the broiling of this planet and the very idea of the major powers cooperating to do anything about it now seems like a fantasy from some past universe.

It evidently doesn’t matter that a combination of fearsome monsoons and growing glacial melt flooded one-third of Pakistan in an unparalleled fashion; that record heat and drought were last summer’s reality across much of the northern hemisphere; that Hurricane Ian only recently leveled parts of Florida in what should have been, but given where we’re heading, won’t be a once-in-500-year fashion; that a mainstream website like Politico can now refer to our country as “the United States of Megadrought”; or that rivers from the Yangtze to the Mississippi are drying up in a historic manner. Worse yet, that’s just to start down a far longer list of climate horrors. And I almost forgot to mention that the giant fossil-fuel companies continue to live on another planet from the rest of us. Call it profit heaven.

Returning to the subject of obituaries, you could, of course, have written a group one for the approximately one billion sea creatures that died last summer, thanks to a record heat wave on Canada’s Pacific coast, or another based on the recent report that, since 1970, the population of fresh-water species on this planet has fallen by a startling 83%. In fact, if you’re in an obituary-writing mood and thinking of the pre-dead, don’t forget the emperor penguin. According to the U.S. Fish and Wildlife Service, that classic creature is threatened with extinction by the end of this century thanks to the increasing loss of the sea ice it needs to exist on a fast-warming planet.

So, give the Vlad full credit. His invasion of Ukraine refocused the attention of the world on that other way we’ve come up with to do ourselves in, those nuclear weapons. In short, he’s helped take our minds off climate change at the worst possible moment (so far), even as his war only increases the level of greenhouse gases heading into the atmosphere. Well done, Mr. President!

I’m sure you won’t be surprised to learn then that, according to a recent United Nations report, of the 193 nations which, in 2021, agreed to step up their efforts to fight climate change, only 26 have followed through so far (and even some of those in an anything but impressive fashion). In other words, our future — should we ever get there — will be blistering. The Earth is now on track to warm not by the 1.5 degrees Celsius the 2015 Paris climate accord set as its ultimate ceiling, but by a potentially broiling 2.1 to 2.9 degrees Celsius by century’s end.

Even before the Ukraine war began, the powers that be were paying all too little attention to how we could do ourselves (and so many other species) in by overheating the planet. Worse yet, the major powers of the old Cold War were already “modernizing” their nuclear arsenals — in the case of the United States, to the tune of more than a trillion dollars over the coming decades. That will include a mere $100 billion to create a “next generation” intercontinental ballistic missile dubbed the LGM-35A Sentinel, undoubtedly because it’s meant to stand guard over hell on earth. Meanwhile, the rising power on the planet, China, is rushing to catch up. And now, with a war underway in Europe, “dirty bombs” and far worse are seemingly back on the playing fields of history.

Here, I suspect, is the strangest thing of all. We now know that we’re quite capable of doing something humanity once left to the gods — creating a genuinely apocalyptic future on this planet. With our weaponry, we already have the ability to induce a “nuclear winter” (in which up to five billion of us could starve to death) or, with greenhouse gases, to fry this planet in a long term way via, to coin a new phrase, a climate-change-induced nuclear summer.

And that — don’t you think? — should already have been game-changing information.

And yet, despite the Greta Thunbergs of this world when it comes to climate change, these days there are no significant equivalents to her or, say, 350.org or the Sunrise Movement when it comes to nukes. Worse yet, despite the growing green movement, the fact that we’re already in the process of making Earth an increasingly unlivable place seems not to have fazed so many of those in a position to run things, whether nationally or corporately. And that should stun us all.

An ultimate obit?

Give humanity credit. When it comes to our urge to destroy, we seem to see no limits, not even those of our own existence. I mean, if you really had the desire to write a communal obituary for us, one logical place to start might indeed be with the invasion of Ukraine at a time when the planet was already beginning to broil. Honestly, doesn’t it make you want to start writing obituaries not just for our individual selves, but for all of the pre-dead on a planet where the very idea of mass killings could, in our future, gain a new meaning?

And in that context, if you want to measure the madness of the moment, just imagine this: It’s quite possible that a political party largely taken over by that supreme narcissist, Donald Trump, the Me-Man of history, could win one or both houses of Congress in this country’s coming midterm elections and even the presidency again in 2024. Given that the U.S. is one of the planet’s two leading greenhouse gas emitters, that would, of course, help ensure a fossil-fuelized future hell. The Donald — like his authoritarian cohorts elsewhere — could be the ultimate god when it comes to our future destruction, not to speak of the future of so many other beings on this planet. Think of him and his crew as potentially the all-too-literal ultimate in (un)civilization.

After all these thousands of years — a long, long time for us but not for planet Earth — the question is: Should we aging types begin thinking not just about our own obituaries (“He was born on July 20, 1944, in New York City, on a planet engulfed in war….”) but humanity’s? (“Born in a cave with their Neanderthal and Denisovan cousins…”)

Everything, of course, ends, but it doesn’t have to end this way. Yes, my obituary is a given, but humanity’s should be so much less so. Whether that proves true or not is up to us. When it comes to all of this, the question is: Who will have the last word?

Tom Engelhardt created and runs the website TomDispatch.com (where this article originated). He is also a co-founder of the American Empire Project and the author of a highly praised history of American triumphalism in the Cold War, The End of Victory Culture. A fellow of the Type Media Center, his sixth and latest book is A Nation Unmade by War.

Copyright ©2022 Tom Engelhardt — distributed by Agence Global

—————-
Released: 01 November 2022
Word Count: 1,948
—————-

Liz Theoharis, “The quality (or inequality) of life”

October 27, 2022 - TomDispatch

Ours is an ever more unequal world, even if that subject is ever less attended to in this country. In his final book, Where Do We Go From Here?, Reverend Martin Luther King wrote tellingly, “The prescription for the cure rests with the accurate diagnosis of the disease. A people who began a national life inspired by a vision of a society of brotherhood can redeem itself. But redemption can come only through a humble acknowledgment of guilt and an honest knowledge of self.”

Neither exists in this country. Rather than an honest sense of self-awareness when it comes to poverty in the United States, policymakers in Washington and so many states continue to legislate as if inequality weren’t an emergency for tens, if not hundreds, of millions of us. When it comes to accurately diagnosing what ails America, let alone prescribing a cure, those with the power and resources to lift the load of poverty have fallen desperately short of the mark.

With the midterm elections almost upon us, issues like raising the minimum wage, expanding healthcare, and extending the Child Tax Credit (CTC) and Earned Income Tax Credit should be front and center. Instead, as the U.S. faces continued inflation, the likelihood of a global economic recession, and the possibility that Trumpists could seize control of one or both houses of Congress (and the legislatures of a number of states), few candidates bother to talk about poverty, food insecurity, or low wages. If anything, “poor” has become a four-letter word in today’s politics, following decades of trickle-down economics, neoliberalism, stagnant wages, tax cuts for the rich, and rising household debt.

The irony of this “attentional violence” towards the poor is that it happens despite the fact that one-third of the American electorate is poor or low-income. (In certain key places and races, that figure rises to 40% or more.) After all, in 2020, there were over 85 million poor and low-income people eligible to vote. More than 50 million potential voters in this low-income electorate cast a ballot in the last presidential election, nearly a third of the votes cast. And they accounted for even higher percentages in key battleground states like Arizona, Florida, Michigan, North Carolina, Texas, and Wisconsin, where they turned out in significant numbers to cast ballots for living wages, debt relief, and an economic stimulus.

To address the problems of our surprisingly impoverished democracy, policymakers would have to take seriously the realities of those tens of millions of poor and low-income people, while protecting and expanding voting rights. After all, before the pandemic hit, there were 140 million of them: 65% of Latinx people (37.4 million), 60% of Black people (25.9 million), 41% of Asians (7.6 million), and 39.9% of White people (67 million) in the United States. Forty-five percent of our women and girls (73.5 million) experience poverty, as do 52% of our children (39 million) and 42% of our elders (20.8 million). In other words, poverty hurts people of all races, ages, genders, religions, and political parties.

Poverty on the decline?

Given the breadth and depth of deprivation, it should be surprising how little attention is being paid to the priorities of poor and low-income voters in these final weeks of election season 2022. Instead, some politicians are blaming inflation and the increasingly precarious economic position of so many on the modestly increasing paychecks of low-wage workers and pandemic economic stimulus/emergency programs. That narrative, of course, is wrong and obscures the dramatic effects in these years of Covid supply-chain disruptions, the war in Ukraine, and the price gouging of huge corporations extracting record profits from the poor. The few times poverty has hit the news this midterm election season, the headlines have suggested that it’s on the decline, not a significant concern to be urgently addressed by policy initiatives that will be on some ballots this November.

Case in point, in September, the Census Bureau released a report concluding that poverty nationwide had significantly decreased in 2021. Such lower numbers were attributed to an increase in government assistance during the pandemic, especially the enhanced Child Tax Credit implemented in the spring of 2021. No matter that there’s now proof positive such programs help lift the load of poverty, too few political candidates are campaigning to extend them this election season.

Similarly, in September, the Biden administration convened the White House Conference on Hunger, Nutrition, and Health, hailed as the first of its kind in more than half a century. But while that gathering may have been an historic step forward, the policy solutions it backed were largely cut from the usual mold — with calls for increases in the funding of food programs, nutritional education, and further research. Missing was an analysis of why poverty and widening inequality exist in the first place and how those realities shape our food system and so much else. Instead, the issue of hunger remained siloed off from a wider investigation of our economy and the ways it’s currently producing massive economic despair, including hunger.

To be sure, we should celebrate the fact that, because of proactive public intervention, millions of people over the last year were lifted above income brackets that would, according to the Census Bureau, qualify them as poor. But in the spirit of Reverend King’s message about diagnosing social problems and prescribing solutions, if we were to look at the formulas for the most commonly accepted measurements of poverty, it quickly becomes apparent that they’re based on a startling underassessment of what people actually need to survive, no less lead decent lives. Indeed, a sea of people are living paycheck to paycheck and crisis to crisis, bobbing above and below the poverty line as we conventionally know it. By underestimating poverty from the start, we risk reading the 2021 Census report as a confirmation that it’s no longer a pressing issue and that the actions already taken by government are enough, rather than a baseline from which to build.

Last month, for example, although a report from the Department of Agriculture found that 90% of households were food secure in 2021, at least 53 million Americans still relied on food banks or community programs to keep themselves half-decently fed, a shocking number in a country as wealthy as ours. More than 20% of adults reported experiencing some form of food insecurity in the last 30 days. In other words, we’re talking about a deep structural problem that demands a commitment from policymakers to the priorities of the poor.

An accurate diagnosis

If the political history of poverty had been recorded on the Richter scale, one decision in 1969 would have registered with earthshaking magnitude. That August 29th, the Bureau of the Budget delivered a dry, unfussy memo to every federal government agency instructing them to use a new formula for measuring poverty. This resulted in the creation of the first, and only, official poverty measure, or OPM, which has remained in place to this day with only a little tinkering here and there.

The seeds of that 1969 memo had been planted six years earlier when Mollie Orshansky, a statistician at the Social Security Administration, published a study on possible ways to measure poverty. Her math was fairly simple. To start with, she reached back to a 1955 Department of Agriculture (USDA) survey that found families generally spent about one-third of their income on food. Then, using a “low-cost” food plan from the Department of Agriculture, she estimated how much a low-income family of four would have to spend to meet its basic food needs and multiplied that number by three to arrive at $3,165 as a possible threshold income for those considered “poor.” It’s a formula that, with a few small changes, has been officially in use ever since.

Fast forward five decades, factor in the rate of inflation, and the official poverty threshold in 2021 was $12,880 per year for one person and $26,500 for a family of four — meaning that about 42 million Americans were considered below the official poverty line. From the beginning, though, the OPM was grounded in a somewhat arbitrary and superficial understanding of human need. Orshansky’s formula may have appeared elegant in its simplicity, but by focusing primarily on access to food, it didn’t fully take into account other critical expenses like healthcare, housing, childcare, and education. As even Orshansky later admitted, it was also based on an austere assessment of how much was enough to meet a person’s needs.
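To make the arithmetic concrete, here is a minimal sketch, in Python, of an Orshansky-style calculation. It simply restates the logic described above: the $1,055 food-plan figure is just the $3,165 threshold divided by three, and the price-index numbers are rough stand-in values for illustration, not the Census Bureau’s official series or methodology.

# Illustrative sketch only -- not official Census Bureau code.
def orshansky_threshold(annual_food_cost, food_share=1/3):
    # Threshold = cost of a basic food plan divided by the share of income spent on food.
    return annual_food_cost / food_share

def update_for_inflation(base_threshold, cpi_base, cpi_now):
    # Carry a base-year threshold forward by the ratio of consumer price indexes.
    return base_threshold * (cpi_now / cpi_base)

# A low-cost food plan of roughly $1,055 a year, times three, yields the ~$3,165 threshold cited above.
print(round(orshansky_threshold(1055)))  # 3165

# With rough stand-in CPI values (assumed here, not quoted from the article), the same figure
# carried forward lands in the neighborhood of the official 2021 four-person threshold of $26,500.
print(round(update_for_inflation(3165, cpi_base=30.6, cpi_now=271.0)))  # roughly 28,000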

As a result, the OPM fails to accurately capture how many of us will move into and out of official poverty in our lifetimes. By studying OPM trends over the years, however, you can gain a wider view of just how chronically precarious so many of our lives are. And yet, look behind those numbers and big questions remain about how we define poverty, questions that say much about who and what we value as a society. For the tools we use to measure quality of life are never truly objective or apolitical. In the end, they always turn out to be as much moral as statistical.

What level of human deprivation is acceptable to us? What resources does a person need to be well? These are questions that any society should ask itself.

Since 1969, much has changed, even if the OPM has remained untouched. The food prices it’s based on have skyrocketed beyond the rate of inflation, along with a whole host of other expenses like housing, prescription medicine, college tuition, gas, utilities, childcare, and more modern but increasingly essential costs, including Internet access and cell phones. Meanwhile, wages have essentially stagnated over the last four decades, even as productivity has continued to grow, meaning that today’s workers are making comparatively less than their parents’ generation even as they produce more for the economy.

Billionaires, on the other hand… well, don’t get me started!

The result of all of this? The official poverty measure fails to show us the ways in which a staggeringly large group of Americans are moving in and out of crisis during their lifetimes. After all, right above the 40 million Americans who officially live in poverty, there are at least 95-100 million who live in a state of chronic economic precarity, just one pay cut, health crisis, extreme storm, or eviction notice from falling below that poverty line.

The Census Bureau has, in fact, recognized the limitations of the OPM and, since 2011, has also been using a second yardstick, the Supplemental Poverty Measure (SPM). As my colleague and poverty-policy expert Shailly Gupta-Barnes writes, while factoring in updated out-of-pocket expenses, the “SPM accounts for family income after taxes and transfers, and as such, it shows the antipoverty effects of some of the largest federal support programs.”

This is the measure that the Census Bureau and others have recently used to show that poverty is dropping, and there’s no doubt that it’s an improvement over the OPM. But even the SPM is worryingly low given today’s economy — $31,000 for a family of four in 2021. Indeed, research by the Poor People’s Campaign (which I co-chair with Bishop William Barber II) and the Institute for Policy Studies has shown that only when we increase the SPM by 200% do we begin to see a more accurate picture of what a stable life truly beyond the grueling reach of poverty might look like.

Volcker Shock 2.0?

Taking to heart Reverend King’s admonition about accurately assessing and acknowledging our problems, it’s important to highlight how the math behind the relatively good news on poverty in the 2021 census data relied on a temporary boost from the enhanced Child Tax Credit. Now that Congress has allowed the CTC and its life-saving payments to expire, expect the official 2022 poverty figures to rise. In fact, that decision is likely to prove especially dire, since the real value of the federal minimum wage is now at its lowest point in 66 years and the threat of recession is growing by the day.

Indeed, instead of building on the successes of pandemic-era antipoverty policies and so helping millions (a position that undoubtedly would still prove popular in the midterm elections), policymakers have acted in ways guaranteed to hit millions of people directly in their pocketbooks. In response to inflation, the Federal Reserve, for instance, has been pursuing aggressive interest rate hikes, whose main effect is to lower wages and therefore the purchasing power of lower- and middle-income people. That decision should bring grimly to mind the austerity policies promoted by Federal Reserve Chair Paul Volcker in 1980 and the Volcker Shock that went with them.

It’s a cruel and dangerous path to take. A recent United Nations report suggests as much, warning that such rate hikes in the U.S. and other rich countries represent an “imprudent gamble” that threatens “worse damage than the financial crisis of 2008 and the Covid-19 shock in 2020.”

If the U.S. is to redeem itself with a vision of justice, it’s time for a deep and humble acknowledgment of the breadth and depth of poverty in the richest country in human history. Indeed, the only shock we need is one that would awaken our imaginations to the possibility of a world in which poverty no longer exists.

Liz Theoharis writes regularly for TomDispatch (where this article originated). She is a theologian, ordained minister, and anti-poverty activist. Co-chair of the Poor People’s Campaign: A National Call for Moral Revival and director of the Kairos Center for Religions, Rights and Social Justice at Union Theological Seminary in New York City, she is the author of Always With Us? What Jesus Really Said About the Poor and We Cry Justice: Reading the Bible with the Poor People’s Campaign. Follow her on Twitter at @liztheo.

Copyright ©2022 Liz Theoharis — distributed by Agence Global

—————-
Released: 27 October 2022
Word Count: 2,214
—————-

Chris Hedges, “Writing on war and living in a world from Hell”

October 25, 2022 - TomDispatch

As this century began, I was writing War Is a Force That Gives Us Meaning, my reflections on two decades as a war correspondent, 15 of them with the New York Times, in Central America, the Middle East, Africa, Bosnia, and Kosovo. I worked in a small, sparsely furnished studio apartment on First Avenue in New York City. The room had a desk, chair, futon, and a couple of bookshelves — not enough to accommodate my extensive library, leaving piles of books stacked against the wall. The single window overlooked a back alley.

The super, who lived in the first-floor apartment, smoked prodigious amounts of weed, leaving the grimy lobby stinking of pot. When he found out I was writing a book, he suggested I chronicle his moment of glory during the six days of clashes known as the Stonewall Riots, triggered by a 1969 police raid on the Stonewall Inn, a gay club in Greenwich Village. He claimed he had thrown a trash can through the front window of a police cruiser.

It was a solitary life, broken by periodic visits to a small antique bookstore in the neighborhood that had a copy of the 1910-1911 Encyclopedia Britannica, the last edition published for scholars. I couldn’t afford it, but the owner generously let me read entries from those 29 volumes written by the likes of Algernon Charles Swinburne, John Muir, T.H. Huxley, and Bertrand Russell. The entry for Catullus, several of whose poems I could recite from memory in Latin, read: “The greatest lyric poet of Rome.” I loved the certainty of that judgment — one that scholars today would not, I suspect, make, much less print.

There were days when I could not write. I would sit in despair, overcome by emotion, unable to cope with a sense of loss, of hurt, and the hundreds of violent images I carry within me. Writing about war was not cathartic. It was painful. I was forced to unwrap memories carefully swaddled in the cotton wool of forgetfulness. The advance on the book was modest: $25,000. Neither the publisher nor I expected many people to read it, especially with such an ungainly title. I wrote out of a sense of obligation, a belief that, given my deep familiarity with the culture of war, I should set it down. But I vowed, once done, never to willfully dredge up those memories again.

To the publisher’s surprise, the book exploded. Hundreds of thousands of copies were eventually sold. Big publishers, dollar signs in their eyes, dangled significant offers for another book on war. But I refused. I didn’t want to dilute what I had written or go through that experience again. I did not want to be ghettoized into writing about war for the rest of my life. I was done. To this day, I’m still unable to reread it.

The open wound of war

Yet it’s not true that I fled war. I fled my wars but would continue to write about other people’s wars. I know the wounds and scars. I know what’s often hidden. I know the anguish and guilt. It’s strangely comforting to be with others maimed by war. We don’t need words to communicate. Silence is enough.

I wanted to reach teenagers, the fodder of wars and the target of recruiters. I doubted many would read War Is a Force That Gives Us Meaning. So I embarked on a text that would pose, and then answer, the most basic questions about war — all drawn from military, medical, tactical, and psychological studies of combat. I operated on the assumption that the simplest and most obvious questions, like “What happens to my body if I’m killed?”, rarely get answered.

I hired a team of researchers, mostly graduate students at Columbia University’s School of Journalism, and, in 2003, we produced an inexpensive paperback — I fought the price down to $11 by giving away any future royalties — called What Every Person Should Know About War.

I worked closely on the book with Jack Wheeler, who had graduated from West Point in 1966 and then served in Vietnam, where 30 members of his class were killed. (Rick Atkinson’s The Long Gray Line: The American Journey of West Point’s Class of 1966 is the story of Jack’s class.) Jack went on to Yale Law School after he left the military and became a presidential aide to Ronald Reagan, George H.W. Bush, and George W. Bush, while chairing the drive to build the Vietnam Veterans Memorial in Washington.

He struggled with what he called “the open wound of Vietnam” and severe depression. He was last seen on December 30, 2010, disoriented and wandering the streets of Wilmington, Delaware. The next day, his body was discovered as it was dumped from a garbage truck into the Cherry Island Landfill. The Delaware state medical examiner’s office said the cause of death was assault and “blunt force trauma.” Police ruled his death a homicide, a murder that would never be solved. He was buried in Arlington National Cemetery with full military honors.

The idea for the book came from the work of Harold Roland Shapiro, a New York lawyer who, while representing a veteran disabled in World War I, investigated that conflict, discovering a huge disparity between its reality and the public perception of it. His book was, however, difficult to find. I had to get a copy from the Library of Congress. The medical descriptions of wounds, Shapiro wrote, rendered “all that I had read and heard previously as being either fiction, isolated reminiscence, vague generalization or deliberate propaganda.” He published his book, What Every Young Man Should Know About War, in 1937. Fearing it might inhibit recruitment, he agreed to remove it from circulation at the start of World War II. It never went back into print.

The military is remarkably good at studying itself (although such studies aren’t easy to obtain). It knows how to use operant conditioning — the same techniques used to train a dog — to turn young men and women into efficient killers. It skillfully employs the tools of science, technology, and psychology to increase the lethal force of combat units. It also knows how to sell war as adventure, as well as the true route to manhood, comradeship, and maturity.

The callous indifference to life, including the lives of our soldiers, sailors, airmen, and marines, leapt off the pages of the official documents. For example, the question “What will happen if I am exposed to nuclear radiation but do not die immediately?” was answered with a passage from the Office of the Surgeon General’s Textbook of Military Medicine that read, in part:

Fatally irradiated soldiers should receive every possible palliative treatment, including narcotics, to prolong their utility and alleviate their physical and psychological distress. Depending on the amount of fatal radiation, such soldiers may have several weeks to live and to devote to the cause. Commanders and medical personnel should be familiar with estimating survival time based on onset of vomiting. Physicians should be prepared to give medications to alleviate diarrhea, and to prevent infection and other sequelae of radiation sickness in order to allow the soldier to serve as long as possible. The soldier must be allowed to make the full contribution to the war effort. He will already have made the ultimate sacrifice. He deserves a chance to strike back, and to do so while experiencing as little discomfort as possible.

Our book, as I hoped, turned up on Quaker anti-recruitment tables in high schools.

“I am sullied”

I was disgusted by the simplistic, often mendacious coverage of our post-9/11 war in Iraq, a country I had covered as the Middle East bureau chief for the New York Times. In 2007, I went to work with reporter Laila Al-Arian on a long investigative article in the Nation, “The Other War: Iraq Veterans Bear Witness,” that ended up in an expanded version as another book on war, Collateral Damage: America’s War Against Iraqi Civilians.

We spent hundreds of hours interviewing 50 American combat veterans of Iraq about atrocities they had witnessed or participated in. It was a damning indictment of the U.S. occupation, with accounts of terrorizing and abusive house raids, withering suppressive fire routinely laid down in civilian areas to protect American convoys, indiscriminate shooting from patrols, the large kill radius of detonations and air strikes in populated areas, and the slaughter of whole families who approached military checkpoints too closely or too quickly. The reporting made headlines in newspapers across Europe but was largely ignored in the U.S., where the press was generally unwilling to confront the feel-good narrative about “liberating” the people of Iraq.

For the book’s epigraph, we used a June 4, 2005, suicide note left by Colonel Theodore “Ted” Westhusing for his commanders in Iraq. Westhusing (whom I was later told had read and recommended War Is a Force That Gives Us Meaning) was the honor captain of his 1983 West Point class. He shot himself in the head with his 9mm Beretta service pistol. His suicide note — think of it as an epitaph for the global war on terror — read in part:

Thanks for telling me it was a good day until I briefed you. [Redacted name] — You are only interested in your career and provide no support to your staff — no msn [mission] support and you don’t care. I cannot support a msn that leads to corruption, human right abuses and liars. I am sullied — no more. I didn’t volunteer to support corrupt, money-grubbing contractors, nor work for commanders only interested in themselves. I came to serve honorably and feel dishonored.

The war in Ukraine raised the familiar bile, the revulsion at those who don’t go to war and yet revel in the mad destructive power of violence. Once again, by embracing a childish binary universe of good and evil from a distance, war was turned into a morality play, gripping the popular imagination. Following our humiliating defeat in Afghanistan and the debacles of Iraq, Libya, Somalia, Syria, and Yemen, here was a conflict that could be sold to the public as restoring American virtue. Russian President Vladimir Putin, like Iraqi autocrat Saddam Hussein, instantly became the new Hitler. Ukraine, which most Americans undoubtedly couldn’t have found on a map, was suddenly the front line in the eternal fight for democracy and liberty.

The orgiastic celebration of violence took off.

The ghosts of war

It’s impossible, under international law, to defend Russia’s war in Ukraine, as it is impossible to defend our invasion of Iraq. Preemptive war is a war crime, a criminal war of aggression. Still, putting the invasion of Ukraine in context was out of the question. Explaining — as Soviet specialists (including famed Cold War diplomat George F. Kennan) had — that expanding NATO into Central and Eastern Europe was a provocation to Russia was forbidden. Kennan had called it “the most fateful error of American policy in the entire post-Cold War era” that would “send Russian foreign policy in directions decidedly not to our liking.”

In 1989, I had covered the revolutions in East Germany, Czechoslovakia, and Romania that signaled the coming collapse of the Soviet Union. I was acutely aware of the “cascade of assurances” given to Moscow that NATO, founded in 1949 to prevent Soviet expansion in Eastern and Central Europe, would not spread beyond the borders of a unified Germany. In fact, with the end of the Cold War, NATO should have been rendered obsolete.

I naively thought we would see the promised “peace dividend,” especially with the last Soviet leader, Mikhail Gorbachev, reaching out to form security and economic alliances with the West. In the early years of Vladimir Putin’s rule, even he lent the U.S. military a hand in its war on terror, seeing in it a parallel to Russia’s own struggle to contain the Islamic extremists spawned by its wars in Chechnya. He provided logistical support and resupply routes for American forces fighting in Afghanistan. But the pimps of war were having none of it. Washington would turn Russia into the enemy, with or without Moscow’s cooperation.

The newest holy crusade between angels and demons was launched.

War unleashes the poison of nationalism, with its twin evils of self-exaltation and bigotry. It creates an illusory sense of unity and purpose. The shameless cheerleaders who sold us the war in Iraq are once again on the airwaves beating the drums of war for Ukraine. As Edward Said once wrote about these courtiers to power:

Every single empire in its official discourse has said that it is not like all the others, that its circumstances are special, that it has a mission to enlighten, civilize, bring order and democracy, and that it uses force only as a last resort. And, sadder still, there always is a chorus of willing intellectuals to say calming words about benign or altruistic empires, as if one shouldn’t trust the evidence of one’s own eyes watching the destruction and the misery and death brought by the latest mission civilisatrice.

I was pulled back into the morass. I found myself writing columns for ScheerPost and my Substack site condemning the bloodlust the war in Ukraine has unleashed. The provision of more than $50 billion in weapons and aid to Ukraine not only means that the Ukrainian government has no incentive to negotiate but also condemns hundreds of thousands of innocents to suffering and death. For perhaps the first time in my life, I found myself agreeing with Henry Kissinger, who at least understands realpolitik, including the danger of pushing Russia and China into an alliance against the U.S. while provoking a major nuclear power.

Greg Ruggiero, who runs City Lights Publishers, urged me to write a book on this new conflict. At first, I refused, not wanting to resurrect the ghosts of war. But looking back at my columns, articles, and talks since the publication of War is a Force That Gives Us Meaning in 2002, I was surprised at how often I had circled back to war.

I rarely wrote about myself or my experiences. I sought out those discarded as the human detritus of war, the physically and psychologically maimed like Tomas Young, a quadriplegic wounded in Iraq, whom I visited recently in Kansas City after he declared that he was ready to disconnect his feeding tube and die.

It made sense to put those pieces together to denounce the newest intoxication with industrial slaughter. I stripped the chapters down to war’s essence with titles like “The Act of Killing,” “Corpses” or “When the Bodies Come Home.”

The Greatest Evil Is War has just been published by Seven Stories Press.

This, I pray, will be my final foray into the subject.

Chris Hedges was a war correspondent for two decades in Central America, the Middle East, Africa, and the Balkans, 15 of them with the New York Times, where he was awarded the Pulitzer Prize. He is the author of 14 books, including War Is a Force That Gives Us Meaning, What Every Person Should Know About War, and the just-published The Greatest Evil Is War (Seven Stories Press). He writes a column for ScheerPost and has a show, the Chris Hedges Report, on the Real News Network. He has taught at Columbia University, New York University, Princeton University, and the University of Toronto, as well as students earning their college degrees from Rutgers University in the New Jersey prison system. You can find him at chrishedges.substack.com. This article originated at TomDispatch.

Copyright ©2022 Chris Hedges — distributed by Agence Global

—————-
Released: 25 October 2022
Word Count: 2,450
—————-

Nick Turse, “Getting to yes”

October 24, 2022 - TomDispatch

What’s the U.S. military doing in Africa? It’s an enigma, wrapped in a riddle, straitjacketed in secrecy, and hogtied by red tape. Or at least it would be if it were up to the Pentagon.

Ten years ago, I embarked on a quest to answer that question at TomDispatch, chronicling a growing American military presence on that continent, a build-up of both logistical capabilities and outposts, and the possibility that far more was occurring out of sight. “Keep your eye on Africa,” I concluded. “The U.S. military is going to make news there for years to come.”

I knew I had a story when U.S. Africa Command (AFRICOM) failed to answer basic questions honestly. And the command’s reaction to the article told me that I also had a new beat.

Not long after publication, AFRICOM wrote a letter of complaint to my editor, Tom Engelhardt, attempting to discredit my investigation. (I responded point by point in a follow-up piece.) The command claimed the U.S. was doing little on that continent, had one measly base there, and was transparent about its operations. “I would encourage you and those who have interest in what we do to review our Website, www.AFRICOM.mil, and a new Defense Department Special Web Report on U.S. Africa Command at this link http://www.defense.gov/home/features/2012/0712_AFRICOM/,” wrote its director of public affairs Colonel Tom Davis.

A decade later, the link is dead; Davis is a functionary at Pima Community College in Tucson, Arizona; and I’m still keeping an eye on AFRICOM.

A few months ago, in fact, I revealed the existence of a previously unknown AFRICOM investigation of an airstrike in Nigeria that killed more than 160 civilians. A formerly secret 2017 Africa Command document I obtained called for an inquiry into that “U.S.-Nigerian” operation that was never disclosed to Congress, much less the public.

Since then, AFRICOM has steadfastly refused to offer a substantive comment on the strike or the investigation that followed and won’t even say if it will release relevant documents to members of Congress. Last month, citing my reporting, a group of lawmakers from the newly formed Protection of Civilians in Conflict Caucus called on Secretary of Defense Lloyd Austin to turn over the files on, and answer key questions about, the attack. The Pentagon has so far kept mum.

Has AFRICOM then, as Davis contended so long ago, been transparent? Is its website the go-to spot for information about U.S. military missions on that continent? Did its operations there remain few and innocuous? Or was I onto something?

A kinder, gentler combatant command

From its inception, according to its first commander, General William Ward, AFRICOM was intended “to be a different kind of command”: less hardcore, more Peace Corps. “AFRICOM’s focus is on war prevention,” Deputy Assistant Secretary of Defense for African Affairs Theresa Whelan said in 2007, “rather than warfighting.”

In 2012, Ward’s successor, General Carter Ham, told the House Armed Services Committee that “small teams” of American personnel were conducting “a wide range of engagements in support of U.S. security interests.” Years later, retired Army Brigadier General Don Bolduc, who served at AFRICOM from 2013 to 2015 and headed Special Operations Command Africa until 2017, would offer some clarity about those “engagements.” Between 2013 and 2017, he explained, American commandos saw combat in at least 13 African countries: Burkina Faso, Cameroon, Central African Republic, Chad, the Democratic Republic of Congo, Kenya, Libya, Mali, Mauritania, Niger, Somalia, South Sudan, and Tunisia. U.S. troops, he added, were killed or wounded in action in at least six of them.

Between 2015 and 2017, there were at least 10 unreported attacks on American troops in West Africa alone. A month after that January 2017 Nigerian air strike, in fact, U.S. Marines fought al-Qaeda militants in a battle that AFRICOM still won’t admit took place in Tunisia. That April, a U.S. commando reportedly killed a member of warlord Joseph Kony’s Lord’s Resistance Army in the Central African Republic. The next month, during an advise, assist, and accompany mission, 38-year-old Navy SEAL Kyle Milliken was killed and two other Americans were wounded in a raid on a militant camp in Somalia. That same year, a Navy SEAL reportedly shot and killed a man outside a compound flying an Islamic State (ISIS) flag in Cameroon. And that October, AFRICOM was finally forced to abandon the fiction that U.S. troops weren’t at war on the continent after ISIS militants ambushed American troops in Niger, killing four and wounding two more. “We don’t know exactly where we’re at in the world, militarily, and what we’re doing,” said Republican Senator Lindsey Graham, then a member of the Senate Armed Services Committee, after meeting with Pentagon officials about the attack.

In the 2010s, I would, in fact, help reveal that the U.S. had conducted at least 36 named operations and activities in Africa — more than anywhere else on earth, including the Middle East. Among them were eight 127e programs, named for the budgetary authority that allows Special Operations forces to use foreign military units as surrogates in counterterrorism missions. More recently, I would report on 11 of those proxy programs employed in Africa, including one in Tunisia, code-named Obsidian Tower and never acknowledged by the Pentagon, and another with a notoriously abusive Cameroonian military unit connected to mass atrocities.

Five of those 127e programs were conducted in Somalia by U.S. commandos training, equipping, and directing troops from Ethiopia, Kenya, Somalia, and Uganda as part of the fight against the Islamist militant group al-Shabaab. In 2018, 26-year-old Alex Conrad of the Army’s Special Forces was killed in an attack on a small U.S. military outpost in Somalia.

Such outposts have long been a point of contention between AFRICOM and me. “The U.S. maintains a surprising number of bases in Africa,” I wrote in that initial TomDispatch article in July 2012. Colonel Davis denied it. “Other than our base at Camp Lemonnier in Djibouti,” he claimed, “we do not have military bases in Africa.” I had, he insisted, filed that article before AFRICOM could get me further outpost material. “If he had waited, we would have provided the information requested, which could have better informed his story.”

I had begun requesting information that May, called in additional questions in June and July, and then (as requested) put them in writing. I followed up on the 9th, mentioning my looming deadline and was told that AFRICOM headquarters might have some answers for me on the 10th. That day came and went, as did the 11th. TomDispatch finally published the piece on July 12th. “I respectfully submit that a vigorous free press cannot be held hostage, waiting for information that might never arrive,” I wrote Davis.

When I later followed up, Davis turned out to be on leave, but AFRICOM spokesperson Eric Elliott emailed in August to say: “Let me see what I can give you in response to your request for a complete list of facilities.”

Then, for weeks, AFRICOM went dark. A follow-up email in late October went unanswered. Another in early November elicited a response from spokesperson Dave Hecht, who said that he was handling the request and would provide an update by week’s end. I’m sure you won’t be shocked to learn that he didn’t. So, I followed up yet again. On November 16th, he finally responded: “All questions now have answers. I just need the boss to review before I can release. I hope to have them to you by mid next week.” Did I get them? What do you think?

In December, Hecht finally replied: “All questions have been answered but are still being reviewed for release. Hopefully this week I can send everything your way.” Did he? Hah!

In January 2013, I received answers to some questions of mine, but nothing about those bases. By then, Hecht, too, had disappeared and I was left dealing with AFRICOM’s Chief of Media Engagement, Benjamin Benson. When asked about my questions, he replied that public affairs couldn’t provide answers and I should instead file a Freedom of Information Act (FOIA) request.

To recap, six months later, Benson recommended I start again. And in good faith, I did. In 2016, three and a half years later, I finally received a partial response to that FOIA request: one page of partially redacted — not to mention useless — information about (yep!) Camp Lemonnier and nothing else.

I would spend years investigating the bases Davis claimed didn’t exist. Using leaked secret documents, I shed light on a network of African drone bases integral to U.S. assassination programs on the continent as well as the existence of a secret network of National Security Agency eavesdropping outposts in Ethiopia. Using formerly secret documents, I revealed an even larger network of U.S. bases across Africa, again and again. I used little-noticed open-source information to highlight activities at those facilities, while helping expose murder and torture by local forces at a drone base in Cameroon built up and frequented by Americans. I also spotlighted the construction of a $100 million drone base in Niger; a previously unreported outpost in Mali apparently overrun by militants after a 2012 coup there by a U.S.-trained officer; the expansion of a shadowy drone base in the Horn of Africa and its role in lethal strikes against the Islamic State in Iraq and Syria; hundreds of drone strikes from Libya to Somalia and the resulting civilian casualties; and the flailing, failing U.S. war on terror all across Africa.

Not surprisingly, AFRICOM’s website never had much to say about such reporting, nor could you go there to find articles like:

“The AFRICOM Files: Pentagon Undercounts and Ignores Military Sexual Assault in Africa”

“Pentagon Document Shows U.S. Knew of ‘Credible’ Reports of Civilian Casualties After Its Attacks in Somalia”

“New Data Shows the U.S. Military Is Severely Undercounting Civilian Casualties in Somalia”

“Pentagon Stands by Cameroon — Despite Forensic Analysis Showing Its Soldiers Executed Women and Children”

“U.S. Troops in Africa Might be in Danger. Why Is the Military Trying to Hide It?”

You know you’re on target when you’re getting a lot of flak(s)

In the years since, a parade of AFRICOM press officials came and went, replying in a by-then-familiar fashion. “Nick, we’re not going to respond to any of your questions,” Lieutenant Commander Anthony Falvo, head of its public affairs branch, told me in October 2017. Did he, I asked, believe AFRICOM needn’t address questions from the press in general or only from me? “No, just you,” he replied. “We don’t consider you a legitimate journalist, really.” Then he hung up.

That same month, I was inadvertently ushered behind the closed doors of the AFRICOM public affairs office. While attempting to hang up on me, a member of the staff accidentally put me on speakerphone and suddenly I found myself listening in on the goings-on, from banal banter to shrieking outbursts. And, believe me, it wasn’t pretty. While the command regularly claimed its personnel had the utmost respect for their local counterparts, I discovered, for example, that at least certain press officers appeared to have a remarkably low opinion of some of their African partners. At one point, Falvo asked if there was any “new intelligence” regarding military operations in Niger after the 2017 ambush that killed those four American soldiers. “You can’t put Nigeriens and intelligence in the same sentence,” replied someone in the office. Laughter followed and I published the sordid details. That very month, Anthony Falvo shipped off (literally ending up in the public affairs office of the USS Gerald Ford).

Today, a new coterie of AFRICOM public affairs personnel field questions, but Falvo’s successor, Deputy Director of Public Affairs John Manley, a genuine professional, seems to be on call whenever my questions are especially problematic. He swears this isn’t true, but I’m sure you won’t be shocked to learn that he fielded my queries for this article.

After Col. Tom Davis — who left AFRICOM to join Special Operations Command (where, in a private email, he called me a “turkey”) — failed to respond to my interview requests, I asked AFRICOM if his defer-and-deny system was the best way to inform the American public. “We are not going to comment on processes and procedures in place a decade ago or provide opinions on personnel who worked in the office at that time,” said Manley.

“Our responsibility is to provide timely, accurate, and transparent responses to queries received from all members of the media,” Manley told me. Yes, me, the reporter who’s been waiting since 2012 for answers about those U.S. bases. And by AFRICOM standards, maybe that’s not really so long, given its endless failures in quelling terrorism and promoting stability in places like Burkina Faso, Libya, and Somalia.

Still, I give Manley a lot of credit. He isn’t thin-skinned or afraid to talk and he does offer answers, although sometimes they seem so far-fetched that I can’t believe he uttered them with a straight face. Though he agreed to discuss his replies further, I doubted that badgering him would get either of us anywhere, so I’ll just let his last one stand as a digital monument to my 10-year relationship with AFRICOM. When I asked if the public affairs office had always been as forthcoming, forthright, and helpful with my queries as possible, he unleashed the perfect capstone to my decade-long dance with U.S. Africa Command by offering up just one lone word: “Yes.”

Nick Turse is the managing editor of TomDispatch (where this article originated) and a fellow at the Type Media Center. He is the author most recently of Next Time They’ll Come to Count the Dead: War and Survival in South Sudan and of the bestselling Kill Anything That Moves.

Copyright ©2022 Nick Turse — distributed by Agence Global

—————-
Released: 24 October 2022
Word Count: 2,238
—————-

Karen J. Greenberg, “The Ukraine moment”

October 20, 2022 - TomDispatch

Ukraine is obviously a powder keg. With each passing day, in fact, the war there poses new threats to the world order. Only recently, Vladimir Putin’s Russia intensified its attacks on civilian targets in that beleaguered land, while threatening to use tactical nuclear weapons and adding Ukraine’s neighbor Belarus to its side on the battlefield. And don’t forget the Russian president’s decision to draft hundreds of thousands of additional civilians into his military, not to speak of the sham referendums he conducted to annex parts of Ukraine and the suspected cyberattack by a pro-Russian group that disrupted airline websites at hubs across the United States.

President Biden has repeatedly pledged not to enter the war. As he wrote in an op-ed in the New York Times last May (and has continued to signal): “So long as the United States or our allies are not attacked, we will not be directly engaged in this conflict, either by sending American troops to fight in Ukraine or by attacking Russian forces.” Washington has instead carved out a cautious but decidedly engaged response to the war there.

So far, that conflict has not posed a threat to this country and the Biden administration has held fast to the president’s commitment not to engage directly in that fight. But the war does continue to escalate, as do the taunts of an increasingly desperate Vladimir Putin. To date, the U.S. has pledged $15.2 billion in military assistance to Ukraine and its neighbors, an investment that has included arms, munitions, equipment, and training. The Biden administration has also imposed sanctions on more than 800 Russians as of June, with additional ones announced in late September, while blocking oil and gas imports from that country.

At such a moment of ever-increasing international tension, however, it seems worthwhile to recall what lessons the United States learned (or at least should have learned) from its own wars of this century that fell under the rubric of the Global War on Terror, or GWOT.

Lessons learned?

We certainly should have learned a great deal about ourselves over the course of the war on terror, the global conflicts that followed al-Qaeda’s devastating attacks of September 11, 2001.

We should have learned, for instance, that once a war starts, as the war on terror did when the administration of George W. Bush decided to invade Afghanistan, it can spread in a remarkable fashion — often without, at least initially, even being noticed — to areas far beyond the original battlefield. In the end, the war on terror would, in its own fashion, spread across the Middle East, South Asia, and Africa, with domestic versions of it lodging in both European countries and the United States in the form of aggressive terrorism prosecutions, anti-Muslim policing efforts, and, during the Trump administration, a “Muslim ban” against those trying to enter the U.S. from many largely Muslim countries.

In the process, we learned, or at least should have learned, that our government was willing to trade rights, liberties, and the law for a grim version of safety and security. The trade-off would, in the end, involve the indefinite detention of individuals (some to this very day) at that offshore prison of injustice, Guantánamo; torturing captives at CIA black sites around the world; launching “signature drone strikes” which regularly made no distinction between civilians and combatants; not to mention the warrantless surveillance that targeted the calls of staggering numbers of Americans. And all of this was done in the name of keeping ourselves safe, even if, in the end, it would help create an America in which ever less, including democracy, seems safe anymore.

Finally, we should have learned that once a major conflict begins, its end can be — to put the matter politely — elusive. In this way, it was no mistake that the war on terror, with us to this day in numerous ways, informally became known as our “forever war,” given the fact that, even today we’re not quite done with it. (U.S. troops are, for instance, still in Iraq and Syria.) According to the Costs of War Project at Brown University, that conflict has cost this country at least $8 trillion — with an additional estimated $2.2-$2.5 trillion needed to care for the veterans of the war between now and 2050.

Given all of this, there are, at least, three lessons to be taken from the war on terror, each sending a strong signal about how to reckon with Russia’s aggression against Ukraine.

Beware mission creep

The war on terror was in large part defined by mission creep. What started as an incursion into Afghanistan to rout al-Qaeda and the perpetrators of 9/11 grew exponentially into a global set of conflicts, including a full-scale invasion of Iraq and the use (largely) of air power in Pakistan, Somalia, Yemen, and other countries across Africa and the Middle East. This was all deemed possible thanks to a single joint resolution passed by Congress a week after the attacks of September 11th, the Authorization for the Use of Military Force (AUMF), which included neither geographical areas nor specific adversaries other than those who conspired to bring about (or supported in some fashion) the 9/11 attacks. It was, in other words, so vague as to allow administration after administration to choose its enemies without again consulting Congress. (A separate 2002 authorization would launch the invasion of Iraq.)

The war in Ukraine similarly continues to widen. The 30 nations in NATO are largely lined up alongside that country against Russia. On October 11th, the Group of Seven, or G7, including Canada, France, Germany, Italy, Japan, the United Kingdom, and the United States, pledged “financial, humanitarian, military, diplomatic, and legal support… for as long as it takes.” On that same day, the U.N. met to consider responses to Russia’s escalating missile and drone attacks on Ukrainian cities as well as its claim to have won a referendum supposedly greenlighting its annexation of four Ukrainian regions.

Meanwhile, the U.S. commitment to support Ukraine has grown ever more geographically extensive. As Secretary of State Antony Blinken explained during a visit to Kyiv in September, the American mission encompasses an effort “to bolster the security of Ukraine and 17 of its neighbors; including many of our NATO Allies, as well as other regional security partners potentially at risk of future Russian aggression.” Moreover, the United States has acted on an ever more global scale in its efforts to levy sanctions against Russia’s oligarchs, while warning of retribution (of an undefined sort) against any nation that provides a haven for them, as did China when it allowed a superyacht owned by a Russian oligarch to dock in Hong Kong’s harbor.

When it comes to Ukraine, the imperative of defining and limiting the scope of American involvement — whether in the areas of funding, weapons supplied, training, or even the deployment of U.S. troops near Ukraine or secret operatives in that country — couldn’t (in the light of GWOT) be more important. So far, Biden has at least kept his promise not to send U.S. troops to Ukraine. (In fact, just before the Russian invasion, he actually removed national guardsmen who had been stationed there in the late fall of 2021.)

It is perhaps a sign of restraint that the Biden administration has so publicly specified just what weaponry it’s providing to that country and which other countries it’s offering assistance to in the name of security concerns over the war. And in making decisions about which munitions and armaments to offer, the administration has insisted on deliberation and process rather than quick, ad-hoc acts. Still, as the GWOT taught us, mission creep is a danger and, as Putin’s Russia continues to expand its war in Ukraine, it’s important to keep a watchful eye on our expanding involvement, too.

Honor the law

Notably, the war has been defined by Russia’s escalating abuses of international law and human rights. To begin with, that country violated international law with its unprovoked invasion, an act of straightforward aggression. Since then, reports of atrocities have mounted. An Independent International Commission of Inquiry on Ukraine issued a report last month to the U.N.’s Commissioner for Human Rights citing the use of explosives in civilian areas; evidence of torture, rape, and brutal executions; and the intentionally cruel treatment of those in custody. The massacre of civilians in the Ukrainian towns of Bucha and Izyum signaled Russia’s intent to continue its gruesome violations of the laws of war despite Ukrainian President Volodymyr Zelensky’s appeal to the U.N. for accountability.

That this is the road to lasting problems and an escalating threat environment is a lesson this country should have learned from its own war on terror in this century. The atrocities carried out by terrorist groups, including 9/11, led top officials in the Bush administration to calculate that, given the threat facing the country, it would be legitimate, even imperative, to ignore both domestic and international legal restraints. The greatest but hardly the only example of this was the willingness of the Central Intelligence Agency to use torture, which it relabeled “enhanced interrogation techniques,” including waterboarding, exposure to extreme cold, sleep deprivation, and painful, prolonged forms of shackling at CIA black sites scattered around the world. That brutal program was finally laid out in 2014 in a nearly 600-page executive summary of a Senate investigation. Other illegal actions taken during the war on terror included setting up Guantánamo offshore of American justice and the Bush administration’s decision to invade Iraq based on a lie: that autocrat Saddam Hussein possessed weapons of mass destruction.

When it comes to Ukraine, the war-on-terror experience should remind us of the importance of restraint and lawfulness, no matter the nature of the Russian threat or the cruel acts Putin has countenanced. “Russian forces were likely responsible for most casualties, but so too Ukrainian troops — albeit to a far lesser extent,” the U.N. commissioner for human rights said in a video message last spring. In August, Amnesty International issued a report which held that “Ukrainian forces have put civilians in harm’s way by establishing bases and operating weapons systems in populated residential areas, including in schools and hospitals.”

Plan for an ending

Despite Vladimir Putin’s predictions that the war would end quickly with a Russian triumph and despite his continuing escalation of it, there has been no dearth of scenarios for such an ending. Early on, observers saw the possibility of a negotiated peace in which Ukraine would agree not to seek future membership in NATO, while Russia withdrew its troops and dropped its claims to Ukrainian territory (Crimea excepted). Soon thereafter, another scenario forecast “a new iron curtain” after Russian gains in eastern and southern Ukraine left “two antagonistic blocs staring each other down over a lengthy militarized border.” Others have predicted endless further escalation, including a possible Russian tactical nuclear strike in that country causing the West to retreat — or counter with its own nuclear gesture.

Only recently, almost eight months into the war, 66 nations at the U.N. General Assembly called for its end, while even retired American Admiral Mike Mullen, the former chairman of the Joint Chiefs of Staff, told ABC’s George Stephanopoulos, “I think we need to back off [the war] a little bit and do everything we possibly can to try to get to the table to resolve this thing.” Others agree that the conflict should be ended sooner rather than later.

And for good reason! This country’s war on terror should be an apt reminder that planning for an ending is imperative, sooner rather than later. From the beginning, you might say, the forever war had no sense of an ending, since Congress’s authorization for the use of force lacked not only geographical but temporal limits of any sort. There was, in fact, no sense of what an end to hostilities might involve. Not even the killing of Osama bin Laden, the leader of al-Qaeda, in 2011 was seen as ending anything, nor was the death of autocrat Saddam Hussein imagined as a conclusion of that American war. To this day, that 2001 authorization for war remains in place and one of the main symbols of the excesses of the war — Guantánamo Bay — remains open.

Right now, despite any calls by former warriors like Mullen or diplomats for an end to the war in Ukraine, it’s proving a distinctly elusive proposition not just for Vladimir Putin but for the U.S. and its NATO allies as well. As a senior administration official told the Washington Post recently, speaking of Putin’s threat to use nuclear weapons and his draft of new Russian conscripts, “It’s definitely a sign that he’s doubling down, that we’re not close to the end, and not close to negotiations.”

In a speech delivered at the U.N. in late September, Secretary of State Antony Blinken caught the forever-war mood of the moment on all sides by expressing doubts about diplomacy as a cure-all for such a war. “As President Zelensky has said repeatedly,” Blinken told the Security Council, “diplomacy is the only way to end this war. But diplomacy cannot and must not be used as a cudgel to impose on Ukraine a settlement that cuts against the U.N. Charter, or rewards Russia for violating it.”

Given the lessons of the war on terror, casting doubt on the viability of future negotiations risks setting the stage for never-ending warfare of a distinctly unpredictable sort.

The stakes

Though the war in Ukraine is taking place in a different context than the war on terror, with a different set of interests at stake and without the non-state actors of that American conflict, the reality is that it should have yielded instructive lessons for both sides. After all, America’s forever war harmed the fabric of our political life in ways almost too numerous to name, many of them related to the ever-expansive, extralegal, never-ending nature of that conflict. So imagine what this war could do to Russia, to Ukraine, and to our world.

The war in Ukraine offers Washington an opportunity to push the international community to choose a new scenario rather than one that will expand into a frighteningly unknown future. It gives the Biden administration a chance to choose law over lawlessness and emphasize a diplomatic resolution to that still-escalating crisis.

This time around, the need to exercise restraint, caution, and a deep respect for the law, while envisioning how the hostilities might actually end, could not be more important. The world of our children lies in the balance.

Karen J. Greenberg writes regularly for TomDispatch (where this article originated). She is the director of the Center on National Security at Fordham Law and author most recently of Subtle Tools: The Dismantling of Democracy from the War on Terror to Donald Trump (Princeton University Press). Julia Tedesco conducted research for this article.

Copyright ©2022 Karen J. Greenberg — distributed by Agence Global

—————-
Released: 20 October 2022
Word Count: 2,423
—————-

Robert Lipsyte, “Home runs first”

October 18, 2022 - TomDispatch

The time has come to ban the bomb.

Of course, all those nuclear ones in the arsenals of the “great” powers, but — since I’m a sportswriter by trade — let’s start with the home run. Call it a four-bagger, a dinger, a moon shot, or (in my childhood) a Ballantine blast for the beer that sponsored so much baseball. One thing is certain, though: the dream of the game-changing home run has shaped our approach to so much, from sports to geopolitics. Most significantly, it’s damaged our ability to solve problems through reason and diplomacy.

So, consider banning both The Bomb and the home run as the first crucial steps toward a safer, more peaceful world.

For 102 years now, since Babe Ruth first joined the Yankees, we’ve been heading for this moment when a frustrated American lunatic might try to take this country hostage by threatening violent civil war, while a frustrated Russian lunatic tries to take the world hostage by threatening to annihilate it.

Saving both the country and the world by disarming the lunatics can only be accomplished via the careful little steps that no longer seem to be a priority either in the playbooks of baseball or in the arsenals of liberal democracy. Over the past decades, they’ve largely been discarded in favor of the idea of the big bang, be it for deterrence, intimidation, or, in two horrendous moments in 1945, actual big bangs that created the politics of mutually assured destruction as a forever possibility.

How did that happen? In sports, blame it on baseball, which gave up much of its original artistry for the triumphal explosion that now overrides all else, potentially wiping out both past mistakes and future hopes. To set a proper example, the home run should be cancelled if the world is to be saved.

It’s easy enough. Just change the rulebooks so that a ball hit out of the park doesn’t count. It’s not even a ball or a strike, just a nothing, another missing baseball. Get over it.

Bombs away!

Getting rid of the home run will be a particularly hard sell in the glow of the round-tripper renaissance born of the extraordinary season of the New York Yankees’ Aaron Judge. It unfolded, handily enough, as the specters of both Donald Trump and Vladimir Putin haunted the non-sports networks. By hitting 62 home runs in a single season, an American League record, Judge brought back the shock-and-awe thrill of it all in a creamy cloud of nostalgia that has briefly obscured the terror of the real bombs.

Judge’s record season also managed to obscure for the moment just how tawdry the very idea of a home-run record had become. After all, the major league home-run record is now 73, set in 2001 by Barry Bonds of the San Francisco Giants. Until Judge came along, that record, like Bonds himself, had been mired in a Trumpian or Putinesque sports version of disgrace and disgust, though ascribing sane motives to Bonds is far easier than to Trump or Putin, because Bonds is no lunatic.

In fact, he was a truly great player, apparently so maddened by the ascendance of rival hitters seemingly on performance-enhancing drugs that he, too, may have reached for chemical help. The runners-up to him for the single-season record, Mark McGwire (70 dingers in 1998) and Sammy Sosa (66 in 1998), were also linked to steroid use.

Ironically, it was in 1998, a year stained by the Bill Clinton-Monica Lewinsky scandal, that the McGwire-Sosa home run rivalry was credited with diverting the nation from the shame of the White House — and it could only do so because home-run records held such powerful magic.

The record for ultimate power without drugs demands respect. In that sense, the most impressive previous one was set at 61 in 1961 by Roger Maris. He was a Yankees outfielder and a thoughtful, decent player without much flair. Despite all those homers, he was no Bombardier, especially because he was playing alongside a charismatic superstar, Mickey Mantle, who fans had long hoped would break the until-then ultimate record of that ur-superstar Babe Ruth. Maris was never quite accepted as such after he broke Ruth’s 1927 mark of 60 homers.

Enter the Babe

In his time (and for decades thereafter), the Babe was Mr. Baseball and, in some ways, Mr. America, too, the very symbol of this country’s emerging power after World War I. His style of play — Bam! — was the one our leaders began to see themselves bringing to global dynamics. He was the face of the Roaring Twenties (unless you’d prefer that flying fascist Charles Lindbergh or that gangster-in-chief Al Capone).

Ruth had been a sensation, a metaphor for appetite, celebrity, food, sex, and victory. In 1920, his first year with the New York Yankees, the 25-year-old Ruth hit what was then a nearly inconceivable number of home runs: 54. Until that moment, 15 or so homers were usually enough to win the home-run title. An exception was 1919 when Ruth, then still a Boston Red Sox pitcher, hit 29.

The Babe appeared at a propitious moment for baseball. His achievements counteracted the negative effects of what came to be known as the Black Sox scandal in which members of the Chicago White Sox were accused of throwing the 1919 World Series to the Cincinnati Reds in a gambling scheme. There was gloom and soul searching. The national pastime fixed? The nation corrupted?

At least in the mythology of baseball, the emergence of Babe Ruth and the Yankees was credited with helping save the game itself and perhaps the pride of the nation as well. Through sheer power! Bam!

The Yankees would, in fact, get into the World Series in six of the next eight seasons as they developed into baseball’s powerhouse franchise. With all those homers in mind, they would come to be known as the Bronx Bombers. The United States went on to swing its own big bats in World War II, Korea, Vietnam, the Persian Gulf, Afghanistan in 2001 and the Persian Gulf again in 2003, all en route to becoming, at least in the minds of its leaders and the Washington foreign-policy crew, the world’s leading superpower.

Time out. Are you finding this hyperbolic or, given the nature of baseball, not serious enough to put on the same page with those endless wars or the once all-American weaponry that has now become Vladimir Putin’s threat to the world? Babe Ruth, Roger Maris, Barry Bonds a key to our future? Not likely, huh? Well, just hang on to my theory that we’re in thrall to The Bomb (or do I mean enthralled by it?) and that, to survive, we’d better begin disarming — and keep reading.

Enter Aaron Judge

Enter Aaron Judge, a large, friendly, humble 30-year-old, an accomplished all-round player who’s considered “clean” or steroid-free.

On October 4th, in Toronto, in the first inning of the next-to-last game of the 2022 regular season, Judge, to his great relief and that of so many fans, hammered number 62. Mission accomplished! (Sadly, an apt enough phrase, given the way President George W. Bush featured it in reference to his 2003 invasion of Iraq — only to later regret it for obvious reasons and have it used again in 2018 by Donald Trump in reference to Syria, where U.S. troops remain to this day.)

At that moment, there was a new Bomber-in-Chief and might makes right was reaffirmed. No other sport, in fact, ever reinvented itself so thoroughly by focusing on one act — although football came close in 1906 when it legalized the forward pass. That, however, was an attempt to open up the game to prevent so many injuries from the brutal mass collisions of what was then essentially a rushing game. The year before, 19 young men had died and 159 had been seriously injured. President Theodore Roosevelt, a famous proponent of that supposedly manly game, demanded reforms to save it. The casualty rate soon dropped, but how well that all came out remains open to doubt, since the issue of traumatic brain injuries continues to plague football.

While football and baseball both became more dramatically exciting with their big bangs, in the process baseball lost its brainy, chess-like quality. Instead of eking out runs using cunning tactics like the sacrifice bunt, the hit-and-run play, or the delayed steal — all now categorized, whether nostalgically or derisively, as “small ball” — managers came to depend on their sluggers to muscle their way to victory, often at the last minute. As football, with its dramatic brutality also often heightened at the last minute, became the dominant sport, the home run only gained more value as one of the best ways to lure in younger fans.

Power is sexy

The home run was once justified by the Nike slogan “Chicks dig the long ball,” a variant perhaps of former Secretary of State Henry Kissinger’s “Power is the ultimate aphrodisiac.”

Trump and Putin, like most long-ball hitters (although not Aaron Judge), tend to strike out all too often and be forgiven for it because their fans believe that they’ll soon turn it all around with a home run. No wonder the term home run has become synonymous with having done the best job possible, nailing the deal, case, or diagnosis. In truth, the home run should have become the symbol of the quick fix that may not hold, the brass ring that diverts us from the pleasure of the process, the big club created to intimidate opponents into submission that so often turns them into resentful insurgents.

Which is where we are now. The Russians are in the deep muddy exactly because the Ukrainians knew how to play small ball. They found that they could take a hit-and-run approach with those Russian tanks on the outskirts of Kyiv as effectively as the Vietcong ever did with American ones (and you may remember who won that war).

At the same time, Trump’s Republicans and Putin’s Russians have depended on the long ball. The January 6th insurgency was an attempted walk-off blast to drive home the Big Lie that Trump had really won the election. Had it succeeded, he would have been an autocrat by coup. It was, however, thwarted by small ball: the incredible courage and discipline of the police and the defense of the nation by Democrats through the democratic process.

The invasion of Ukraine and the attempted seizure of its capital, Kyiv, ostensibly to save it from Western aggression as well as “militarism and Nazification,” was Putin’s shot at a home-run-style putsch. He had envisioned his invasion as a triumphant blitzkrieg ending in a quick Russian victory. The duration, relentlessness, and success of the Ukrainian response surprised the world, especially America, which (in its version of finesse) then used sanctions and military aid to support that beleaguered country.

The struggle continues as the Trump team threatens bloodshed in the streets and civil war if their criminal goals are legally blocked. Meanwhile, Vladimir Putin continues to threaten an all-too-literal big-bang response to Ukrainian battlefield successes via his country’s vast arsenal of tactical nuclear weapons stationed just offstage — with the fear of escalation into full-scale mushroom clouds and, as our president put it, “Armageddon” lurking in our future.

Talk about a potential Big Bang!

What can we do?

Getting out the vote, especially in this time of voter suppression, requires small ball in its most passionate and precise form. Small ball was always about hard work, discipline, and dedication. Think of non-violent demonstrations during the Civil Rights era. So, practice your political version of the sacrifice bunt, while making sure that everyone is on the team, knows the play, and turns out.

Far be it from me to advise the Ukrainians, especially in the arts of the hit-and-run and the sacrifice. Material aid and back-channel diplomacy are, however, also examples of small ball in their way, but the terror of the Big Bang still looms over everything.

Admittedly, metaphor seems shallow and easy when so many lives are at stake, but at least when it comes to baseball, if not this planet, it would indeed be possible to ban the bomb and return to a sports version of a small-ball world. Unfortunately, sports rules don’t work globally, so banning the real bomb seems all too unlikely.

If we could do that, though, we could let the home run stay, and who would care?

But, alas, what’s happening on this planet isn’t a game after all.

Robert Lipsyte writes regularly for TomDispatch (where this article originated) and is a former sports and city columnist for the New York Times. He is the author, among other works, of SportsWorld: An American Dreamland.

Copyright ©2022 Robert Lipsyte — distributed by Agence Global

—————-
Released: 18 October 2022
Word Count: 2,090
—————-

Rami G. Khouri, “The OPEC+ oil cut and the lessons of imperial overreach”

October 17, 2022 - Rami G. Khouri

What should we make of the spat between the United States and Saudi Arabia, following last week’s announcement of a sharp cut in oil production by the Russian- and Saudi-headed cartel OPEC+? Shocked analysts and officials in the United States and Europe called the Saudi move a betrayal and a hostile act against the Western allies mired in the Ukraine war. Many see this as a personal humiliation for President Joe Biden, with Riyadh siding with Russia in its war on Ukraine — even after Biden fist-bumped with Saudi Crown Prince Mohammed bin Salman at a meeting in Jeddah, in a complete reversal of his campaign promise to make the Saudis “the pariah that they are.” American officials are now considering a series of retaliatory measures, including stopping arms sales and even withdrawing all 3,000 U.S. troops from Saudi Arabia (and the 2,000 U.S. soldiers in the neighboring United Arab Emirates, another OPEC+ member; Sheikh Mohammed bin Zayed Al Nahyan, the president of the UAE, had a friendly meeting with Vladimir Putin in Moscow this week).

As reports emerge of Saudi officials apparently ignoring U.S. warnings not to go ahead with the oil production cut, I can’t help but think of the lessons of history. A much longer time frame and wider context may be necessary to fully analyze this situation and accurately capture what it is all about. I’ve chronicled the modern Middle East and its links with the United States for the past 54 years, including two decades during which I also wrote books on archaeology and the Roman Empire in the region. With that much history in mind, the immediate issues here are no doubt important, evolving according to many factors beyond oil prices: Ukraine, the upcoming U.S. elections in November, Arab worries about Iran, and the roles of Russia and China in the Middle East. But they may not be the best frame in which to appreciate these furies.

A deeper tale here has lived in our region for millennia and is rearing its head now in the most dramatic fashion: imperial overreach. Going back to the Roman East, imperial powers have had expectations of their local clients that proved to be mistaken. Throughout the long history of the Middle East, the volatility of such relationships has been enough to tip the scales between total imperial conquest and retreat.

There are echoes of this distant past in Biden’s rift with Mohammed bin Salman. It is worth recalling a long-ago episode between a foreign imperial power and the dominant state in the Arabian Peninsula, as documented by the Greek geographer Strabo writing in the first century B.C.

In 25 B.C., a Roman military expedition set out from Egypt to control all of Arabia, especially the lucrative spice-trading states of Arabia Felix, in what is now Yemen. An expeditionary force of some 10,000 Roman and Egyptian troops set off from the Suez area to cross the Red Sea, accompanied by 1,500 Nabataean and Jewish troops. The Nabataeans, from their capital at Petra in modern-day Jordan, agreed to assist the Romans and sent a local administrator named Syllaeus to help guide them south through the Arabian desert.

But things did not go as planned, and after six months of mostly fruitless wandering through the harsh terrain, losing many men and supplies, the Romans retreated and never again tried to conquer Arabia. They had been led in circles by a wily Nabataean whom they had taken for a trusted ally and who ultimately thwarted Rome’s imperial aims.

Why did the Romans fail in Arabia? They were ignorant of Arabian realities, misjudging local personalities and assuming that Rome’s imperial wishes would be implemented obediently, even if they were not in the best interest of local rulers and kingdoms, like the Nabataeans, who did not want to lose their control of the spice trade (Rome was dependent on Nabataea for valuable spices like frankincense). As important as the failure of this one foray is in the ancient history of the Middle East, historians also see it as an early sign of the Roman Empire’s gradual retreat, in the following centuries, from distant lands and territorial ambitions.

The parallels with the United States in the Middle East today are intriguing. Intense American anger at the OPEC+ decision, and the Saudis’ equally firm insistence on pursuing policies they feel best serve their national interests, are signs of a dysfunctional relationship that has been exposed as such by both Riyadh and Washington. The various material and ideological drivers that have cemented U.S.-Saudi relations since the 1940s — oil, trade and investment, anti-Communism, militarism, maintaining autocracy in the insatiable but elusive quest for so-called “stability” — no longer coexist easily.

To be fair to the Saudi leadership — which in my view has generally been a negative force in the region for decades — it has signaled since King Abdullah’s reign, from 2005 to 2015, that Riyadh and its increasingly assertive neighbor and ally, the UAE, would start taking initiatives on their own to protect their vital interests, without always asking foreign powers (that is, the United States) to agree or help. This trend accelerated rapidly after the Arab uprisings, when Gulf rulers feared popular revolts at home, and even included attempts to derail American foreign policy in the region, in their fierce opposition to the nuclear deal that President Barack Obama sealed with Iran and other world powers. But it has since reached reckless, even criminal, dimensions under the direction of Crown Prince Mohammed bin Salman, who recently made himself Saudi Arabia’s prime minister.

While serving notice that, in tandem with the UAE, it would not become a vassal state of the United States or anyone else, Saudi Arabia has taken aggressive steps that either mostly failed or generated international backlash: the blockade of Qatar, the war in Yemen, embracing surveillance authoritarianism, courting right-wing Western leaders like Donald Trump, detaining hundreds of prominent Saudis who were accused of corruption, killing Jamal Khashoggi, and detaining former Lebanese Prime Minister Saad Hariri, to mention only the most severe. Washington and other foreign powers with close ties and influence usually tacitly accepted, or occasionally actively assisted, such transgressions.

This policy shift seems to reflect several driving forces for Saudi leaders. They could not rely on the United States to protect them when they were really threatened, such as during the 2019 drone attacks on key ARAMCO oil facilities in eastern Saudi Arabia, widely attributed to Iran. They could pursue their core energy, trade, technology and other interests with powers like Russia and China, while maintaining their close U.S. ties. And they faced dangers if they relied too heavily on U.S. policies in the region that were both strategically incoherent and served American (and often Israeli) priorities above all else.

The furor over the OPEC+ cuts captures all these irreconcilable forces. Riyadh seems finally to have reacted to Washington’s attempts to dictate Saudi oil output. The kingdom — and, most of all, Mohammed bin Salman — also seems unfazed that its decision will hurt American and European economies, the war effort in Ukraine against Russia, and the Democrats’ election fortunes next month, while also helping Russia and perhaps even Iran.

Biden has pledged that “there will be consequences,” but how will the United States actually respond? Will there be any signs of American officials acknowledging the traps of history in the Arabian Peninsula and the apparently eternal lessons of failed imperial overreach?

Rami G. Khouri is director of global engagement at the American University of Beirut, a nonresident senior fellow at the Harvard Kennedy School Middle East Initiative, and an internationally syndicated columnist. He tweets @ramikhouri.
This article originated at DAWN.

Copyright ©2022 Rami G. Khouri — distributed by Agence Global

—————-
Released: 17 October 2022
Word Count: 1,239
—————-
