Agence Global

Patterson Deppen, “The All-American base world: 750 U.S. military bases remain around the planet”

August 19, 2021 - TomDispatch

It was the spring of 2003 during the American-led invasion of Iraq. I was in second grade, living on a U.S. military base in Germany, attending one of the Pentagon’s many schools for families of servicemen stationed abroad. One Friday morning, my class was on the verge of an uproar. Gathered around our homeroom lunch menu, we were horrified to find that the golden, perfectly crisped French fries we adored had been replaced with something called “freedom fries.”

“What are freedom fries?” we demanded to know.

Our teacher quickly reassured us by saying something like: “Freedom fries are the exact same thing as French fries, just better.” Since France, she explained, was not supporting “our” war in Iraq, “we just changed the name, because who needs France anyway?” Hungry for lunch, we saw little reason to disagree. After all, our most coveted side dish would still be there, even if relabeled.

While nearly two decades have passed since then, that otherwise obscure childhood memory came back to me last month when, in the midst of the U.S. withdrawal from Afghanistan, President Biden announced an end to American “combat” operations in Iraq. To many Americans, it may have appeared that he was just keeping his promise to end the two forever wars that came to define the post-9/11 “global war on terror.” However, much as those “freedom fries” didn’t actually become something else, this country’s “forever wars” may not really be coming to an end either. Rather, they are being relabeled and seem to be continuing via other means.

Having closed down hundreds of military bases and combat outposts in Afghanistan and Iraq, the Pentagon will now shift to an “advise-and-assist” role in Iraq. Meanwhile, its top leadership is now busy “pivoting” to Asia in pursuit of new geostrategic objectives primarily centered around “containing” China. As a result, in the Greater Middle East and significant parts of Africa, the U.S. will be trying to keep a far lower profile, while remaining militarily engaged through training programs and private contractors.

As for me, two decades after I finished those freedom fries in Germany, I’ve just finished compiling a list of American military bases around the world, the most comprehensive possible at this moment from publicly available information. It should help make greater sense of what could prove to be a significant period of transition for the U.S. military.

Despite a modest overall decline in such bases, rest assured that the hundreds that remain will play a vital role in the continuation of some version of Washington’s forever wars and could also help facilitate a new Cold War with China. According to my current count, our country still has more than 750 significant military bases implanted around the globe. And here’s the simple reality: unless they are, in the end, dismantled, America’s imperial role on this planet won’t end either, spelling disaster for this country in the years to come.

Tallying up the “bases of empire”

I was tasked with compiling what we’ve (hopefully) called the “2021 U.S. Overseas Base Closure List” after reaching out to Leah Bolger, president of World BEYOND War. As part of a group known as the Overseas Base Realignment and Closure Coalition (OBRACC) committed to shutting down such bases, Bolger put me in contact with its co-founder David Vine, the author of the classic book on the subject, Base Nation: How U.S. Military Bases Abroad Harm America and the World.

Bolger, Vine, and I then decided to put together just such a new list as a tool for focusing on future U.S. base closures around the world. In addition to providing the most comprehensive accounting of such overseas bases, our research also further confirms that the presence of even one in a country can contribute significantly to anti-American protests, environmental destruction, and ever greater costs for the American taxpayer.

In fact, our new count does show that their total number globally has declined in a modest fashion (and even, in a few cases, fallen dramatically) over the past decade. From 2011 on, nearly a thousand combat outposts and a modest number of major bases have been closed in Afghanistan and Iraq, as well as in Somalia. Just a little over five years ago, David Vine estimated that there were around 800 major U.S. bases in more than 70 countries, colonies, or territories outside the continental United States. In 2021, our count suggests that the figure has fallen to approximately 750. Yet, lest you think that all is finally heading in the right direction, the number of places with such bases has actually increased in those same years.

Since the Pentagon has generally sought to conceal the presence of at least some of them, putting together such a list can be complicated indeed, starting with how one even defines such a “base.” We decided that the simplest way was to use the Pentagon’s own definition of a “base site,” even if its public counts of them are notoriously inaccurate. (I’m sure you won’t be surprised to learn that its figures are invariably too low, never too high.)

So, our list defined such a major base as any “specific geographic location that has individual land parcels or facilities assigned to it… that is, or was owned by, leased to, or otherwise under the jurisdiction of a Department of Defense Component on behalf of the United States.”

Using this definition helps to simplify what counts and what doesn’t, but it also leaves much out of the picture. Not included are significant numbers of small ports, repair complexes, warehouses, fueling stations, and surveillance facilities controlled by this country, not to speak of the nearly 50 bases the American government directly funds for the militaries of other countries. Most appear to be in Central America (and other parts of Latin America), places familiar indeed with the presence of the U.S. military, which has been involved in 175 years of military interventions in the region.

Still, according to our list, American military bases overseas are now scattered across 81 countries, colonies, or territories on every continent except Antarctica. And while their total numbers may be down, their reach has only continued to expand. Between 1989 and today, in fact, the military has more than doubled the number of places in which it has bases from 40 to 81.

This global presence remains unprecedented. No other imperial power, including the British, French, and Spanish empires, has ever had its equivalent. Together, these bases form what Chalmers Johnson, a former CIA consultant turned critic of U.S. militarism, once referred to as an “empire of bases” or a “globe-girdling Base World.”

As long as this count of 750 military bases in 81 places remains a reality, so, too, will U.S. wars. As succinctly put by David Vine in his latest book, The United States of War, “Bases frequently beget wars, which can beget more bases, which can beget more wars, and so on.”

Over the horizon wars?

In Afghanistan, where Kabul fell to the Taliban earlier this week, our military had only recently ordered a rushed, late-in-the-night withdrawal from its last major stronghold, Bagram Airfield, and no U.S. bases remain there. The numbers have similarly fallen in Iraq where that military now controls only six bases, while earlier in this century the number would have been closer to 505, ranging from large ones to small military outposts.

Dismantling and shutting down such bases in those lands, in Somalia, and in other countries as well, along with the full-scale departure of American military forces from two of those three countries, were historically significant, no matter how long they took, given the domineering “boots on the ground” approach they once facilitated. And why did such changes occur when they did? The answer has much to do with the staggering human, political, and economic costs of these endless failed wars. According to Brown University’s Costs of War Project, the toll of just those remarkably unsuccessful conflicts in Washington’s war on terror was tremendous: minimally 801,000 deaths (with more on the way) since 9/11 in Afghanistan, Iraq, Pakistan, Syria, and Yemen.

The weight of such suffering was, of course, disproportionately carried by the people of the countries that have faced Washington’s invasions, occupations, air strikes, and interference over almost two decades. More than 300,000 civilians across those and other countries have been killed and nearly 37 million more are estimated to have been displaced. Around 15,000 U.S. personnel, including soldiers and private contractors, have also died. Untold numbers of devastating injuries have been inflicted as well on millions of civilians, opposition fighters, and American troops. In total, it’s estimated that, by 2020, these post-9/11 wars had cost American taxpayers $6.4 trillion.

While the overall number of U.S. military bases abroad may be in decline as the failure of the war on terror sinks in, the forever wars are likely to continue more covertly through Special Operations forces, private military contractors, and ongoing air strikes, whether in Iraq, Somalia, or elsewhere.

In Afghanistan, even when there were only 650 U.S. troops left, guarding the U.S. embassy in Kabul, the U.S. was still intensifying its airstrikes in the country. It launched a dozen in July alone, recently killing 18 civilians in Helmand province in southern Afghanistan. According to Secretary of Defense Lloyd Austin, attacks like these were being carried out from a base or bases in the Middle East equipped with “over the horizon capabilities,” supposedly located in the United Arab Emirates, or UAE, and Qatar. In this period, Washington has also been seeking (as yet without success) to establish new bases in countries that neighbor Afghanistan for continued surveillance, reconnaissance, and potentially air strikes, including possibly leasing Russian military bases in Tajikistan.

And mind you, when it comes to the Middle East, the UAE and Qatar are just the beginning. There are U.S. military bases in every Persian Gulf country except Iran and Yemen: seven in Oman, three in the UAE, 11 in Saudi Arabia, seven in Qatar, 12 in Bahrain, 10 in Kuwait, and those six still in Iraq. Any of these could potentially contribute to the sorts of “over the horizon” wars the U.S. now seems committed to in countries like Iraq, just as its bases in Kenya and Djibouti are enabling it to launch airstrikes in Somalia.

New bases, new wars

Meanwhile, halfway around the world, thanks in part to a growing push for a Cold War-style “containment” of China, new bases are being constructed in the Pacific.

There are, at best, minimal barriers in this country to building military bases overseas. If Pentagon officials determine that a new $990 million base is needed in Guam to “enhance warfighting capabilities” in Washington’s pivot to Asia, there are few ways to prevent them from doing so.

Camp Blaz, the first Marine Corps base to be built on the Pacific island of Guam since 1952, has been under construction since 2020 without the slightest pushback or debate from policymakers and officials in Washington or the American public over whether it was needed at all. Even more new bases are being proposed for the nearby Pacific islands of Palau, Tinian, and Yap. On the other hand, a locally much-protested new base in Henoko on the Japanese island of Okinawa, the Futenma Replacement Facility, is “unlikely” ever to be completed.

Little of any of this is even known in this country, which is why a public list of the full extent of such bases, old and new, around the world matters, however difficult it may be to produce from the patchy Pentagon record available. Not only can it show the far-reaching extent and changing nature of this country’s imperial efforts globally, it could also act as a tool for promoting future base closures in places like Guam and Japan, where there are at present 52 and 119 bases, respectively — were the American public one day to seriously question where their tax dollars were really going and why.

Just as there’s very little standing in the way of the Pentagon constructing new bases overseas, there is essentially nothing preventing President Biden from closing them. As OBRACC points out, while there is a process involving congressional authorization for closing any domestic U.S. military base, no such authorization is needed abroad. Unfortunately, in this country there is as yet no significant movement for ending that Baseworld of ours. Elsewhere, however, demands and protests aimed at shutting down such bases from Belgium to Guam, Japan to the United Kingdom — in nearly 40 countries all told — have taken place within the past few years.

In December 2020, however, even the highest-ranking U.S. military official, the chairman of the Joint Chiefs of Staff, General Mark Milley, asked: “Is every one of those [bases] absolutely positively necessary for the defense of the United States?”

In short, no. Anything but. Still, as of today, despite the modest decline in their numbers, the 750 or so that remain are likely to play a vital role in any continuation of Washington’s “forever wars,” while supporting the expansion of a new Cold War with China. As Chalmers Johnson warned in 2009, “Few empires of the past voluntarily gave up their dominions in order to remain independent, self-governing polities… If we do not learn from their examples, our decline and fall is foreordained.”

In the end, new bases only mean new wars and, as the last nearly 20 years have shown, that’s hardly a formula for success for American citizens or others around the world.

Patterson Deppen serves on the editorial board at E-International Relations where he is co-editor for student essays. A member of the Overseas Base Realignment and Closure Coalition, he recently completed research on the 750 U.S. military bases overseas in conjunction with World BEYOND War. The full listing of bases will appear in the future. This article originated at TomDispatch.

Copyright ©2021 Patterson Deppen — distributed by Agence Global

—————-

Released: 19 August 2021

Word Count: 2,245

—————-

Rebecca Gordon, “Debt and disillusionment”

August 17, 2021 - TomDispatch

For the last decade and a half, I’ve been teaching ethics to undergraduates. Now — admittedly, a little late to the party — I’ve started seriously questioning my own ethics. I’ve begun to wonder just what it means to be a participant, however minor, in the pyramid scheme that higher education has become in the years since I went to college.

Airplane Games

Sometime in the late 1980s, the Airplane Game roared through the San Francisco Bay Area lesbian community. It was a classic pyramid scheme, even if cleverly dressed up in language about women’s natural ability to generate abundance, just as we gestate children in our miraculous wombs. If the connection between feminism and airplanes was a little murky — well, we could always think of ourselves as modern-day Amelia Earharts. (As long as we didn’t think too hard about how she ended up.)

A few women made a lot of money from it — enough, in the case of one friend of mine, for a down payment on a house. Inevitably, a lot more of us lost money, even as some like me stood on the sidelines sadly shaking our heads.

There were four tiers on that “airplane”: a captain, two co-pilots, four crew, and eight passengers — 15 in all to start. You paid $3,000 to get on at the back of the plane as a passenger, so the first captain (the original scammer) got out with $24,000 — $3,000 from each passenger. The co-pilots and crew, who were in on the fix, paid nothing to join. When the first captain “parachuted out,” the game split in two, and each co-pilot became the captain of a new plane. They then pressured their four remaining passengers to recruit enough new women to fill each plane, so they could get their payday and the two new co-pilots could each captain planes of their own.

Unless new people continued to get on at the back of each plane, there would be no payday for the earlier passengers, so the pressure to recruit ever more women into the game only grew. The original scammers ran through the game a couple of times, but inevitably the supply of gullible women willing to invest their savings ran out. By the time the game collapsed, hundreds of women had lost significant amounts of money.
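The arithmetic behind that collapse is worth spelling out. Here is a minimal sketch (mine, not the article’s), assuming only what the paragraphs above describe: each captain’s $24,000 payout requires eight new $3,000 buy-ins, and each payout splits one plane into two.

```python
# A minimal sketch (not from the article) of why the Airplane Game had to
# collapse. Assumes only the structure described above: each captain's
# $24,000 payout requires eight new $3,000 passengers, and each payout
# splits one plane into two.

BUY_IN = 3_000       # what each passenger paid to board
SEATS_IN_BACK = 8    # paying passengers needed before a captain cashes out
ROUNDS = 8           # rounds of payouts to simulate

planes = 1
total_recruits = 0
for round_number in range(1, ROUNDS + 1):
    new_recruits = planes * SEATS_IN_BACK          # fresh buy-ins needed this round
    total_recruits += new_recruits
    print(f"Round {round_number}: {planes:3d} plane(s), "
          f"{new_recruits:4d} new passengers, "
          f"${new_recruits * BUY_IN:,} paid in")
    planes *= 2                                    # every payout splits the plane

print(f"After {ROUNDS} rounds, {total_recruits} women must have bought in "
      f"(${total_recruits * BUY_IN:,}) just to keep the game going.")
```

Eight rounds in, the game needs more than a thousand new recruits in a single round, which is why the supply of willing players always runs out.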

No one seemed to know the women who’d brought the game and all those “planes” to the Bay Area, but they had spun a winning story about endless abundance and the glories of women’s energy. After the game collapsed, they took off for another women’s community with their “earnings,” leaving behind a lot of sadder, poorer, and perhaps wiser San Francisco lesbians. 

Feasting at the tenure trough or starving in the ivory tower?

So, you may be wondering, what could that long-ago scam have to do with my ethical qualms about working as a college instructor? More than you might think.

Let’s start with PhD programs. In 2019, the most recent year for which statistics are available, U.S. colleges and universities churned out about 55,700 doctorates — and such numbers continue to increase by about 1% a year. The average number of doctorates earned over the last decade is almost 53,000 annually. In other words, we’re talking about nearly 530,000 PhDs produced by American higher education in those 10 years alone. Many of them have ended up competing for a far smaller number of jobs in the academic world.

It’s true that most PhDs in science or engineering end up with post-doctoral positions (earning roughly $40,000 a year) or with tenure-track or tenured jobs in colleges and universities (averaging $60,000 annually to start). Better yet, most of them leave their graduate programs with little or no debt.

The situation is far different if your degree wasn’t in STEM (science, technology, engineering, or mathematics) but, for example, in education or the humanities. As a start, far more of those degree-holders graduate owing money, often significant sums, and ever fewer end up teaching in tenure-track positions — in jobs, that is, with security, decent pay, and benefits.

Many of the non-STEM PhDs who stay in academia end up joining an exploited, contingent workforce of part-time, or “adjunct,” professors. That reserve army of the underemployed is higher education’s dirty little secret. After all, we — and yes, I’m one of them — actually teach the majority of the classes in many schools, while earning as little as $1,500 a semester for each of them.

I hate to bring up transportation again, but there’s a reason teachers like us are called “freeway flyers.” A 2014 Congressional report revealed that 89% of us work at more than one institution and 27% at three different schools, just to cobble together the most meager of livings.

Many of us, in fact, rely on public antipoverty programs to keep going. Inside Higher Ed, reflecting on a 2020 report from the American Federation of Teachers, describes our situation this way:

“Nearly 25% of adjunct faculty members rely on public assistance, and 40% struggle to cover basic household expenses, according to a new report from the American Federation of Teachers. Nearly a third of the 3,000 adjuncts surveyed for the report earn less than $25,000 a year. That puts them below the federal poverty guideline for a family of four.”

I’m luckier than most adjuncts. I have a union, and over the years we’ve fought for better pay, healthcare, a pension plan, and a pathway (however limited) to advancement. Now, however, my school’s administration is using the pandemic as an excuse to try to claw back the tiny cost-of-living adjustments we won in 2019.

The Oxford Dictionary of English defines an adjunct as “a thing added to something else as a supplementary rather than an essential part.” Once upon a time, in the middle of the previous century, that’s just what adjunct faculty were — occasional additions to the full-time faculty. Often, they were retired professionals who supplemented a department’s offerings by teaching a single course in their area of expertise, while their salaries were more honoraria than true payments for work performed. Later, as more women entered academia, it became common for a male professor’s wife to teach a course or two, often as part of his employment arrangement with the university. Since her salary was a mere adjunct to his, she was paid accordingly.

Now, the situation has changed radically. In many colleges and universities, adjunct faculty are no longer supplements, but the most “essential part” of the teaching staff. Classes simply couldn’t go on without us; nor, if you believe college administrations, could their budgets be balanced without us. After all, why pay a full-time professor $10,000 to teach a class (since he or she will be earning, on average, $60,000 a year and covering three classes a semester) when you can give a part-timer like me $1,500 for the very same work?

And adjuncts have little choice. The competition for full-time positions is fierce, since every year another 53,000 or more new PhDs climb into the back row of the academic airplane, hoping to make it to the pilot’s seat and secure a tenure-track position.

And here’s another problem with that. These days the people in the pilots’ seats often aren’t parachuting out. They’re staying right where they are. That, in turn, means new PhDs find themselves competing for an ever-shrinking prize, as Laura McKenna has written in the Atlantic, “not only with their own cohort but also with the unemployed PhDs who graduated in previous years.” Many of those now clinging to pilots’ seats are members of my own boomer generation, who still benefit from a 1986 law (signed by then-75-year-old President Ronald Reagan) that outlawed mandatory retirements.

Grade inflation v. degree inflation?

People in the world of education often bemoan the problem of “grade inflation” — the tendency of average grades to creep up over time. Ironically, this problem is exacerbated by the adjunctification of teaching, since adjuncts tend to award higher grades than professors with secure positions. The reason is simple enough: colleges use student evaluations as a major metric for rehiring adjuncts and higher grades translate directly into better evaluations. Grade inflation at the college level is, in my view, a non-issue, at least for students. Employers don’t look at your transcript when they’re hiring you and even graduate schools care more about recommendations and GRE scores.

The real problem faced by today’s young people isn’t grade inflation. It’s degree inflation.

Once upon a time in another America, a high-school diploma was enough to snag you a good job, with a chance to move up as time went on (especially if you were white and male, as the majority of workers were in those days). And you paid no tuition whatsoever for that diploma. In fact, public education through 12th grade is still free, though its quality varies profoundly depending on who you are and where you live.

But all that changed as increasing numbers of employers began requiring a college degree for jobs that don’t by any stretch of the imagination require a college education to perform. The Washington Post reports:

“Among the positions never requiring a college degree in the past that are quickly adding that to the list of desired requirements: dental hygienists, photographers, claims adjusters, freight agents, and chemical equipment operators.”

In 2017, Manjari Raman of the Harvard Business School wrote that

“the degree gap — the discrepancy between the demand for a college degree in job postings and the employees who are currently in that job who have a college degree — is significant. For example, in 2015, 67% of production supervisor job postings asked for a college degree, while only 16% of employed production supervisors had one.”

In other words, even though most people already doing such jobs don’t have a bachelor’s degree, companies are only hiring new people who do. Part of the reason: that requirement automatically eliminates a lot of applicants, reducing the time and effort involved in making hiring decisions. Rather than sifting through résumés for specific skills (like the ability to use certain computer programs or write fluently), employers let a college degree serve as a proxy. The result is not only that they’ll hire people who don’t have the skills they actually need, but that they’re eliminating people who do have the skills but not the degree. You won’t be surprised to learn that those rejected applicants are more likely to be people of color, who are underrepresented among the holders of college degrees.

Similarly, some fields that used to accept a BA now require a graduate degree to perform the same work. For example, the Bureau of Labor Statistics reports that “in 2015–16, about 39% of all occupational therapists ages 25 and older had a bachelor’s degree as their highest level of educational attainment.” Now, however, employers are commonly insisting that new applicants hold at least a master’s degree — and so up the pyramid we continually go (at ever greater cost to those students).

The biggest pyramid of all

In a sense, you could say that the whole capitalist economy is the biggest pyramid of them all. For every one of the fascinating, fulfilling, autonomous, and well-paying jobs out there, there are thousands of boring, mind- and body-crushing ones like pulling items for shipment in an Amazon warehouse or folding clothes at Forever 21.

We know, in other words, that there are only a relatively small number of spaces in the cockpit of today’s economic plane. Nonetheless, we tell our young people that the guaranteed way to get one of those rare gigs at the top of the pyramid is a college education.

Now, just stop for a second and consider what it costs to join the 2021 all-American Airplane Game of education. In 1970, when I went to Reed, a small, private, liberal arts college, tuition was $3,000 a year. I was lucky. I had a scholarship (known in modern university jargon as a “tuition discount”) that covered most of my costs. This year, annual tuition at that same school is a mind-boggling $62,420, more than 20 times as high. If college costs had simply risen with inflation, the price would be about $21,000 a year, meaning today’s actual tuition is just under triple even that inflation-adjusted figure.
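For readers who want to check that comparison, here is a quick sketch; the inflation multiplier of roughly 7 is my assumption for the change in consumer prices between 1970 and 2021, not a figure from the article.

```python
# A rough check of the tuition comparison above. The CPI multiplier is an
# assumption (consumer prices rose roughly sevenfold from 1970 to 2021);
# the two tuition figures are the ones cited in the text.

TUITION_1970 = 3_000      # Reed College tuition in 1970
TUITION_2021 = 62_420     # Reed College tuition in 2021
CPI_MULTIPLIER = 7.0      # assumed cumulative inflation, 1970 -> 2021

inflation_only = TUITION_1970 * CPI_MULTIPLIER
print(f"Nominal increase:          {TUITION_2021 / TUITION_1970:.1f}x")      # ~20.8x
print(f"Inflation-only price:      ${inflation_only:,.0f}")                  # ~$21,000
print(f"Increase beyond inflation: {TUITION_2021 / inflation_only:.1f}x")    # ~3.0x
```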

If I’d attended Federal City College (now the University of D.C.), my equivalent of a state school then, tuition would have been free. Now, even state schools cost too much for many students. Annually, tuition at the University of California at Berkeley, the flagship school of that state’s system, is $14,253 for in-state students, and $44,007 for out-of-staters.

I left school owing $800, or about $4,400 in today’s dollars. These days, most financial “aid” resembles foreign “aid” to developing countries — that is, it generally takes the form of loans whose interest piles up so fast that it’s hard to keep up with it, let alone begin to pay off the principal in your post-college life. Some numbers to contemplate: 62% of those graduating with a BA in 2019 did so owing money — owing, in fact, an average of almost $29,000. The average debt of those earning a graduate degree was an even more staggering $71,000. That, of course, is on top of whatever the former students had already shelled out while in school. And that, in turn, is before the “miracle” of compound interest takes hold and that debt starts to grow like a rogue zucchini.
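To see how that “rogue zucchini” grows, here is a minimal sketch; the 6% interest rate and the $100 monthly payment are hypothetical assumptions for illustration, while the $29,000 starting balance is the average BA debt cited above.

```python
# A minimal sketch of compound interest on student debt. The starting
# balance is the average BA debt cited above; the interest rate and the
# monthly payment are hypothetical assumptions, not figures from the article.

principal = 29_000        # average 2019 BA graduate debt
annual_rate = 0.06        # assumed interest rate
monthly_payment = 100     # assumed payment, less than the monthly interest

balance = principal
for year in range(1, 11):
    for _ in range(12):
        interest = balance * annual_rate / 12   # interest accrues monthly
        balance += interest - monthly_payment
    print(f"Year {year:2d}: balance ${balance:,.0f}")

# On a $29,000 balance at 6%, interest alone is about $145 a month, so a
# $100 payment never touches the principal and the debt keeps growing.
```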

It’s enough to make me wonder whether a seat in the Great American College and University Airplane Game is worth the price, and whether it’s ethical for me to continue serving as an adjunct flight attendant along the way. Whatever we tell students about education being the path to a good job, the truth is that there are remarkably few seats at the front of the plane.

Of course, on the positive side, I do still believe that time spent at college offers students something beyond any price — the opportunity to learn to think deeply and critically, while encountering people very different from themselves. The luckiest students graduate with a lifelong curiosity about the world and some tools to help them satisfy it. That is truly a ticket to a good life — and no one should have to buy a seat in an Airplane Game to get one.

 

Rebecca Gordon writes regularly for TomDispatch (where this article originated). She teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Copyright ©2021 Rebecca Gordon — distributed by Agence Global

—————-

Released: 17 August 2021

Word Count: 2,383

—————-

Tom Engelhardt, “My extreme world”

August 12, 2021 - TomDispatch

Admittedly, I hadn’t been there for 46 years, but old friends of mine still live (or at least lived) in the town of Greenville, California, and now… well, it’s more or less gone, though they survived. The Dixie Fire, one of those devastating West Coast blazes, had already “blackened” 504 square miles of Northern California in what was still essentially the (old) pre-fire season. It would soon become the second-largest wildfire in the state’s history. When it swept through Greenville, much of downtown, along with more than 100 homes, was left in ashes as the 1,000 residents of that Gold Rush-era town fled.

I remember Greenville as a wonderful little place that, all these years later, still brings back fond memories. I’m now on the other coast, but much of that small, historic community is no longer there. This season, California’s wildfires have already devastated three times the territory burned in the same period in 2020’s record fire season. And that makes a point that couldn’t be more salient to our moment and our future. A heating planet is a danger, not in some distant time, but right now — yesterday, today, and tomorrow. Don’t just ask the inhabitants of Greenville, ask those in the village of Monte Lake, British Columbia, the second town in that Canadian province to be gutted by flames in recent months in a region that normally — or perhaps I should just say once upon a time — was used to neither extreme heat and drought, nor the fires that accompany them.

In case you hadn’t noticed, we’re no longer just reading about the climate crisis; we’re living it in a startling fashion. At least for this old guy, that’s now a fact — not just of life but of all our lives — that simply couldn’t be more extreme and I don’t even need the latest harrowing report of the UN’s Intergovernmental Panel on Climate Change (IPCC) to tell me so. Whether you’ve been sweating and swearing under the latest heat dome; fleeing fires somewhere in the West; broiling in a Siberia that’s releasing startling amounts of heat-producing methane into the atmosphere; being swept away by flood waters in Germany; sweltering in an unprecedented heat-and-fire season in Greece (where even the suburbs of Athens were being evacuated); baking in Turkey or on the island of Sardinia in a “disaster without precedent”; neck-deep in water in a Chinese subway car; or, after “extreme rains,” wading through the subway systems of New York City or London, you — all of us — are in a new world and we better damn well get used to it.

Floods, megadrought, the fiercest of forest fires, unprecedented storms — you name it and it seems to be happening not in 2100 or even 2031, but now. A recent study suggests that, in 2020 (not 2040 or 2080), more than a quarter of Americans had suffered in some fashion from the effects of extreme heat, already the greatest weather-based killer of Americans and, given this blazing summer, 2021 is only likely to be worse.

By the way, don’t imagine that it’s just us humans who are suffering. Consider, for instance, the billion or more — yes, one billion! — mussels, barnacles, and other small sea creatures estimated to have died off the coast of Vancouver, Canada, during the unprecedented heat wave there earlier in the summer.

A few weeks ago, watching the setting sun, an eerie blaze of orange-red in a hazy sky here on the East Coast, became an unsettling experience once I realized what I was actually seeing: a haze of smoke from the megadrought-stricken West’s disastrous early fire season. It had blown thousands of miles east for the second year in a row, managing to turn the air of New York and Philadelphia into danger zones.

In a way, right now it hardly matters where you look on this planet of ours. Take Greenland, where a “massive melting event,” occurring after the temperature there hit double the normal this summer, made enough ice vanish “in a single day last week to cover the whole of Florida in two inches of water.” But there was also that record brush fire torching more than 62 square miles of Hawaii’s Big Island. And while you’re at it, you can skip prime houseboat-vacation season at Lake Powell on the Arizona-Utah border, since that huge reservoir is now three-quarters empty (and, among Western reservoirs, anything but alone!).

It almost doesn’t matter which recent report you cite. When it comes to what the scientists are finding, it’s invariably worse than you (or often even they) had previously imagined. It’s true, for instance, of the Amazon rain forest, one of the great carbon sinks on the planet. Parts of it are now starting to release carbon into the atmosphere, as a study in the journal Nature reported recently, partially thanks to climate change and partially to more direct forms of human intervention.

It’s no less true of the Siberian permafrost in a region where, for the first time above the Arctic Circle, the temperature in one town reached more than 100 degrees Fahrenheit on a summer day in 2020. And yes, when Siberia heats up in such a fashion, methane (a far more powerful heat-trapping gas than CO2) is released into the atmosphere from that region’s melting permafrost wetlands, which had previously sealed it in. And recently, that’s not even the real news. What about the possibility, according to a new study published in the Proceedings of the National Academy of Sciences, that what’s being released now is actually a potential “methane bomb” not from that permafrost itself but from thawing rock formations within it?

In fact, when it comes to the climate crisis, as a recent study in the journal Bioscience found, “some 16 out of 31 tracked planetary vital signs, including greenhouse gas concentrations, ocean heat content, and ice mass, set worrying new records.” Similarly, carbon dioxide, methane, and nitrous oxide “have all set new year-to-date records for atmospheric concentrations in both 2020 and 2021.”

Mind you, just in case you hadn’t noticed, the last seven years have been the warmest in recorded history. And speaking of climate-change-style records in this era, last year, 22 natural disasters hit this country, including hurricanes, fires, and floods, each causing more than $1 billion in damage, another instant record with — the safest prediction around — many more to come.

“It looked like an atomic bomb”

Lest you think that all of this represents an anomaly of some sort, simply a bad year or two on a planet that historically has gone from heat to ice and back again, think twice. A recent report published in Nature Climate Change, for instance, suggests that heat waves that could put the recent ones in the U.S. West and British Columbia to shame are a certainty and especially likely for “highly populated regions in North America, Europe, and China.” (Keep in mind that, a few years ago, there was already a study suggesting that the North China plain with its 400 million inhabitants could essentially become uninhabitable by the end of this century due to heat waves too powerful for human beings to survive!) Or as another recent study suggested, reports the Guardian, “heatwaves that smash previous records… would become two to seven times more likely in the next three decades and three to 21 times more likely from 2051-2080, unless carbon emissions are immediately slashed.”

It turns out that, even to describe the new world we already live in, we may need a new vocabulary. I mean, honestly, until the West Coast broiled and burned from Los Angeles to British Columbia this summer, had you ever heard of, no less used, the phrase “heat dome” before? I hadn’t, I can tell you that.

And by the way, there’s no question that climate change in its ever more evident forms has finally made the mainstream news in a major way. It’s no longer left to 350.org or Greta Thunberg and the Sunrise Movement to highlight what’s happening to us on this planet. It’s taken years, but in 2021 it’s finally become genuine news, even if not always with the truly fierce emphasis it deserves. The New York Times, to give you an example, typically had a recent piece of reportage (not an op-ed) by Shawn Hubler headlined “Is This the End of Summer as We’ve Known It?” (“The season Americans thought we understood — of playtime and ease, of a sun we could trust, air we could breathe and a natural world that was, at worst, indifferent — has become something else, something ominous and immense. This is the summer we saw climate change merge from the abstract to the now, the summer we realized that every summer from now on will be more like this than any quaint memory of past summers.”) And the new IPCC report on how fast things are indeed proceeding was front-page and front-screen news everywhere, as well it should have been, given the research it was summing up.

My point here couldn’t be simpler: in heat and weather terms, our world is not just going to become extreme in 20 years or 50 years or as this century ends. It’s officially extreme right now. And here’s the sad thing: I have no doubt that, no matter what I write in this piece, no matter how up to date I am at this moment, by the time it appears it will already be missing key climate stories and revelations. Within months, it could look like ancient history.

Welcome, then, to our very own not-so-slow-motion apocalypse. A friend of mine recently commented to me that, for most of the first 30 years of his life, he always expected the world to go nuclear. That was, of course, at the height of the Cold War between the U.S. and the Soviet Union. And then, like so many others, he stopped ducking and covering. How could he have known that, in those very years, the world was indeed beginning to get nuked, or rather carbon-dioxided, methaned, greenhouse-gassed, even if in a slow-motion fashion? As it happens, this time there’s going to be no pretense for any of us of truly ducking and covering.

It’s true, of course, that ducking and covering was a fantasy of the Cold War era. After all, no matter where you might have ducked and covered then — even the Air Force’s command center dug into the heart of Cheyenne Mountain in Colorado — you probably wouldn’t have been safe from a full-scale nuclear conflict between the two superpowers of that moment, or at least not from the world it would have left behind, a disaster barely avoided in the Cuban Missile Crisis of 1962. (Today, we know that, thanks to the possibility of “nuclear winter,” even a regional nuclear conflict — say, between India and Pakistan — could kill billions of us, by starvation if nothing else.)

In that context, I wasn’t surprised when a homeowner, facing the sight of his house, his possessions, and his car all burned to a crisp in Oregon’s devastating Bootleg Fire, described the carnage this way: “It looked like an atomic bomb.”

And, of course, so much worse is yet to come. It doesn’t matter whether you’re talking about a planet on which the Amazon rain forest has already turned into a carbon emitter or one in which the Gulf Stream collapses in a way that’s likely to deprive various parts of the planet of key rainfall necessary to grow crops for billions of people, while raising sea levels disastrously on the East Coast of this country. And that just begins to enumerate the dangers involved, including the bizarre possibility that much of Europe might be plunged into a — hold your hats (and earmuffs) for this one — new ice age!

World War III

If this were indeed the beginning of a world war (instead of a world warm), you know perfectly well that the United States, like so many other nations, would, in the style of World War II, instantly mobilize resources to fight it (or as a group of leading climate scientists put it recently, we would “go big on climate” now). And yet in this country (as in too many others), so little has indeed been mobilized. Worse yet, here one of the two major parties, only recently in control of the White House, supported the further exploitation of fossil fuels (and so the mass creation of greenhouse gases) big time, as well as further exploration for yet more of them. Many congressional Republicans are still in the equivalent of a state of staggering (not to say, stark raving mad) denial of what’s underway. They are ready to pay nothing and raise no money to shut down the production of greenhouse gases, no less create the genuinely green planet run on alternative energy sources that would actually rein in what’s happening.

And criminal as that may have been, Donald Trump, Mitch McConnell, and crew were just aiding and abetting those that, years ago, I called “the biggest criminal enterprise in history.” I was speaking of the executives of major fossil-fuel companies who, as I said then, were and remain the true “terrarists” (and no, that’s not a misspelling) of history. After all, their goal in hijacking all our lives isn’t simply to destroy buildings like the World Trade Center, but to take down the Earth (Terra) as we’ve known it. And don’t leave out the leaders of countries like China still so disastrously intent on, for instance, producing yet more coal-fired power. Those CEOs and their enablers have been remarkably intent on quite literally committing terracide and, sadly enough, in that — as has been made oh-so-clear in this disastrous summer — they’ve already been remarkably successful.

Companies like ExxonMobil knew long before most of the rest of us the sort of damage and chaos their products would someday cause and couldn’t have given less of a damn as long as the mega-profits continued to flow in. (They would, in fact, invest some of those profits in funding organizations that were promoting climate-change denial.) Worse yet, as revealing comments by a senior Exxon lobbyist recently made clear, they’re still at it, working hard to undermine President Biden’s relatively modest green-energy plans in any way they can.

Thought about a certain way, even those of us who didn’t live in Greenville, California, are already in World War III. Many of us just don’t seem to know it yet. So welcome to my (and your) extreme world, not next month or next year or next decade or next century but right now. It’s a world of disaster worth mobilizing over if, that is, you care about the lives of all of us and particularly of the generations to come.

Tom Engelhardt created and runs the website TomDispatch.com (where this article originated). He is also a co-founder of the American Empire Project and the author of a highly praised history of American triumphalism in the Cold War, The End of Victory Culture. A fellow of the Type Media Center, his sixth and latest book is A Nation Unmade by War.

Copyright ©2021 Tom Engelhardt — distributed by Agence Global

—————-

Released: 12 August 2021

Word Count: 2,464

—————-

Liz Theoharis, “Generations of struggle: lessons on defending democracy”

August 10, 2021 - TomDispatch

My father, Athan G. Theoharis, passed away on July 3rd. A leading expert on the FBI, he was responsible for exposing the bureau’s widespread abuses of power. He was a loyal husband, dedicated father, scholar, civil libertarian, and voting-rights advocate with an indefatigable commitment to defending democracy. He schooled his children (and anyone who would listen, including scholars, journalists, and activists from a striking variety of political perspectives) to understand one thing above all: how hard the powers-that-be will work to maintain that power and how willing they are to subvert democracy in the process. His life is a reminder that much of American politics in 2021 is, in so many ways, nothing new.

He grew up poor in Milwaukee, the son of an undocumented Greek immigrant who ran a diner out of the first floor of his home. He returned to his hometown in 1969 as a professor of American history at Marquette University. There, he would take part in political campaigns and local democratic efforts and, of course, raise my siblings and me. After he retired as a professor — committed as he was to opening up space for new scholars and researchers — he remained involved with the Wisconsin ACLU and its campaigns to protect democracy and civil liberties. He became the chair of the board and (how appropriate given this moment of voter-suppression laws) worked to oppose the 2011 Wisconsin voter ID law, while aiding the recall campaign against then-Governor Scott Walker.

Although it seems long ago, in many ways that battle over democracy in America’s Dairyland set the scene for the Trump years and the national crisis unfolding around us now. In 2010, Wisconsin Republicans, fueled in part by a rising Tea Party Movement and having gained control of the state legislature and governorship, immediately passed a host of anti-democratic laws, while instituting regressive economic policies. This in a state that had once been a beacon of American democratic experimentation.

As anyone who visited our family would have learned on a driving tour my parents loved to offer, Milwaukee had a first-class park system because of its (rare) history of socialist mayors. Although Wisconsin was also home to Senator Joseph McCarthy, that notorious anti-communist of the 1950s, as well as to the John Birch Society, it had striking progressive roots. However, in 2011, at a hearing on the state Senate’s version of that voter ID law, one political-science expert testified that “this version of the bill is more restrictive than any bill we’ve had in the past… Indeed, if this bill passes, it would be the most restrictive in the United States.”

That same year, a major campaign to recall Governor Walker began, partially in response to an “austerity budget” aimed at poor Wisconsinites. It would slash pensions and health benefits for public-sector workers and impose new statewide restrictions on union collective bargaining. When that budget was first introduced, Democratic legislators — and this should sound familiar, given recent events in Texas — fled the state to stave off a vote in its senate, while thousands of protestors besieged the capitol building in Madison. For a moment, Wisconsin commanded the attention of the nation.

That recall campaign unfolded over 18 long, bitter months, with Walker eventually holding onto his governorship. Mitt Romney, then on the presidential campaign trail, lauded him for his “sound fiscal policies” and swore that his victory over the recall would “echo beyond the borders of Wisconsin.” And he was right.

More than just a win for a beleaguered politician, the Wisconsin experience signaled a growing anti-democratic strain within the Republican Party and American politics coupled with an extreme economic ideology that benefited the rich and powerful. Even then — in the years when Donald Trump was no more than a businessman and TV show host — that ideology was already masquerading as populism. And in doing so, it echoed the development of so-called welfare reform more than a decade earlier, when former Governor Tommy Thompson’s “Wisconsin model” laid the basis for ending welfare as Americans knew it.

My father watched the fallout from these events with grave concern. For more than 50 years, he had researched and exposed how the FBI’s surveillance programs threatened civil liberties and weakened democratic expression. He knew what was possible when the levers of government power were in the wrong hands and recognized the emergence of the attack on democracy earlier than most. He taught us that wherever you were was ground zero when it came to voting rights and, sadly enough, the truth of this has only become clearer since his passing. Indeed, right now, amid a wave of voter suppression laws unseen since Reconstruction and the continued obstructionism in Congress, the fight for democracy is everywhere and, whether we like it or not, we’re all on the frontlines now.

A multi-racial democracy from below

American history is punctuated by eras of dramatic democratic expansion but also of backlash, especially in response to any encouragement of a multiracial electorate coming together to lift society from the bottom up. In the wake of the Civil War, Reconstruction was a first great elaboration of American democracy. To this day, it remains the most radical experiment in popular government since the founding of the republic. After 250 years of slavery, the share of Black men eligible to vote across the South jumped from 0.5% in 1866 to 80.5% just two years later. In many of the former Confederate states, this, in turn, at least briefly inaugurated a sea change in political representation. In 1868, for instance, 33 Black state legislators were elected in Georgia.

Alongside those newly emancipated and enfranchised voters were many poor white sharecroppers and tenant farmers who, in the rubble of the slavocracy, were ready to exercise real political power for the first time. In a number of state legislatures, fusion coalitions of Blacks and poor whites advanced visionary new policies from the expansion of labor and healthcare rights to education reform. The development of public education was particularly significant for the four million Blacks just then emerging from slavery, as well as for poor whites who had been all but barred from school by the former white ruling elite.

If Reconstruction could be called a second American revolution, the Southern aristocracy and the Democratic Party of that era would soon enough set off a vicious counterrevolution, bloody in both word and deed. A violent divide-and-conquer campaign led by informally state-sanctioned paramilitary groups, especially the newly created Ku Klux Klan (headed by a former Confederate general), terrorized Blacks and whites. Meanwhile, those fusion state governments were broken up and, even though the 15th Amendment couldn’t be repealed, new voter suppression laws were implemented, including poll taxes, lengthened residency requirements, and literacy tests.

What’s often left out of this story is that many of those tactics had first been perfected in the North in response to waves of immigrants from Europe and beyond. Between the Civil War and World War I, 25 million people immigrated to this country. In many Northern states, this rising population of foreign-born, urban poor seemed to threaten the political status quo. As a result, nativist and anti-poor voter suppression laws, including new registration requirements, property stipulations, and voter-roll purges, spread widely across the North. For years, white Southern reactionaries studied and borrowed from such anti-democratic trailblazing in states like Massachusetts, New Jersey, and Rhode Island.

Reflecting on this record, historian Gregory Downs has written that “when Americans treat voter disfranchisement as a regional, racial exception, they sustain their faith that the true national story is one of progressive expansion of voter rights. But turn-of-the-20th-century disfranchisement was not a regional or a racial story; it was a national one.” Then as now, it was about protecting the power of a class of wealthy, white Americans in the face of an urge from below for a multiracial democracy.

Echoes from that era could be heard half a century later in the reaction of Republicans and Southern Democrats to the Civil Rights Movement. In the South, since Jim Crow voter suppression had disenfranchised entire generations of Blacks, disproportionately living in poverty, civil-rights reforms threatened what some saw as a “natural social order.” Elsewhere across the country, fears arose that legislation like the Voting Rights Act of 1965 would empower poor people across the board. Two Republican congressmen from Michigan and Indiana, for instance, introduced a sham alternative to it that would have allowed states to use literacy tests in election season, a time-honored proxy for restricting the votes of the poor.

Such extremist politicians typically — and it should still sound all too familiar today — couched their opposition to the Voting Rights Act in terms of ensuring “voter integrity” and preventing “voter fraud.” Beneath such rhetoric, of course, lay an underlying fear of what broad democratic participation could mean for their political and economic interests. During his governorship of California in the late 1960s, Ronald Reagan first began connecting mass enfranchisement and welfare with the specter of poor people destroying American democracy. His future staffer Pat Buchanan highlighted a growing consensus in the Republican Party when he said, “The saving grace of the GOP in national elections has been the political apathy, the lethargy, of the welfare class. It simply does not bother to register to vote.”

President Reagan’s hyper-racialized caricature of the “welfare queen” has endured all these decades later, cementing the lie that the poor don’t care about democracy and stoking fears of a changing multiracial electorate. And while it may be true that a sizable portion of eligible poor and low-income voters don’t vote, it’s not because of indifference. Indeed, a recent report from the Poor People’s Campaign, which I co-chair, shows that typical reasons for lower voter participation among the poor are illness, disability, time and transportation issues, and a basic belief that too few politicians speak to their needs, ensuring that their votes simply don’t matter. This last point is especially important because, as the voter suppression tactics of the previous century have evolved into present-day full-scale attacks on voting rights, their concerns have proven anything but unfounded.

The chaos we have sown

In 2013, in Shelby County v. Holder, the Supreme Court struck down the Section 5 preclearance requirement of the Voting Rights Act. That section had placed certain districts with histories of racist voter suppression under federal jurisdiction, requiring them to submit to the Department of Justice any planned changes in their voting laws. Since then, there’s been a deluge of voter-suppression laws across the country.

After a multi-racial coalition of voters elected America’s first Black president, 2011 stood as the modern watershed for voter suppression with 19 restrictive laws passed in 14 states. (Barack Obama would nevertheless be reelected the next year.) Today, we’re at a new low point. Six months into 2021, a total of nearly 400 laws meant to obstruct the right to vote have been introduced across the country. So far, 18 states, ranging from Alabama and Arkansas to Texas, Utah, and Wyoming, have passed 30 of them, including an omnibus bill signed into law in Georgia in March. According to the Brennan Center for Justice, it “targets Black voters with uncanny accuracy.”

At this very moment, one major front in the battle over voting rights is still unfolding in Texas. There, the state Senate recently passed a massive “voter integrity” bill that would, among other things, ban 24-hour and drive-through voting, add new ID requirements, and criminalize election workers who don’t follow the onerous new rules. The bill would also grant new powers to partisan poll watchers, raising the possibility of far-right militia groups legally monitoring polling stations. Texas House Democrats fled the state before a vote could be introduced and now remain in Washington, D.C., in exile, awaiting the end of the special session called by Republican Governor Greg Abbott and possible federal action.

Those state legislators arrived in D.C. the same day President Biden gave a national address in Philadelphia on voting rights. His rhetoric was certainly impassioned, and he has since affirmed his support for both the For the People Act and for restoring the full power of the Voting Rights Act, which would indeed expand access to the ballot, while placing more political power in the hands of people of color and the poor. And yet he has offered little when it comes to developing an actual strategy for getting that done. Instead, he continues to insist that he is not in favor of ending the filibuster in the Senate, even though it’s the chief impediment to federal action on the subject. He argues instead that such a move would only throw Congress “into chaos.”

Reverend William Barber, my co-chair in the Poor People’s Campaign, recently laid out the hypocrisy of the president’s “support” for voting rights, even as he justifies inaction on the filibuster:

 

“President Biden, I have no doubt you care and desire to do right, but, as a clergy person, let me say pastorally, when you say ending the filibuster will create chaos that obscures the fact that the filibuster is facilitating chaos. The filibuster caused chaos with anti-slavery legislation, labor rights, women’s rights, civil rights, voting rights, and it once again is causing policy chaos by allowing a minority to obstruct justice. The filibuster has already been used to stop your goal of $15/hr. living wage. We believe the filibuster should end. But, at the very least, no one should ever say the filibuster is preventing chaos.”

As Barber notes, the filibuster is also obstructing urgent policy struggles around better wages and healthcare, immigration reform, and the large-scale infrastructure plan that the Biden administration has worked so hard to create. Action on these issues would dramatically improve the lives of millions of poor and low-income Americans and is precisely what a majority of voters support and extremists are so eager to block through voter suppression. That’s why there’s been a recent upsurge of grassroots actions meant to connect the fight for democracy, including voting rights, with economic justice and the abolition of the filibuster. These include a season of non-violent moral direct action, such as a March for Democracy and a Rally in Texas organized by the Poor People’s Campaign, whose members understand that what’s really underway in this country is a struggle between democracy and potential autocracy or, as Martin Luther King once put it, between community and chaos.

Our own choice is the sort of community where everyone has an equal voice in our democracy and, honestly, in making that choice I believe I am simply following in my father’s footsteps.

Liz Theoharis writes regularly for TomDispatch (where this article originated). She is a theologian, ordained minister, and anti-poverty activist. Co-chair of the Poor People’s Campaign: A National Call for Moral Revival and director of the Kairos Center for Religions, Rights and Social Justice at Union Theological Seminary in New York City, she is the author of Always With Us? What Jesus Really Said About the Poor. Follow her on Twitter at @liztheo.

Copyright ©2021 Liz Theoharis — distributed by Agence Global

—————-

Released: 10 August 2021

Word Count: 2,427

—————-

Iman Zayat, “In a nutshell, Tunisians want Islamists out”

August 10, 2021 - The Arab Weekly

In January 2011, a few days after the head of the Islamist Ennahda Movement, Rached Ghannouchi, returned to Tunisia from his self-exile in London, hundreds of demonstrators gathered on Habib Bourguiba Avenue in downtown Tunis. They raised their voices in favour of a secular state and expressed their concern at the return of Islamists from abroad, arguing that the people’s budding revolution should be led by younger faces, by those who had lived in the country through its various phases and who were better positioned to understand the grievances of Tunisians in general and of the youth in particular.

Most of the demonstrators at that time were young Tunisians, in their twenties and thirties. They had believed then in better days and had high hopes for a modern republic that would pull the country out of its many crises and uphold social rights and freedoms.

Unfortunately, the high hopes were not to be realised.

More than ten years after that small gathering on Habib Bourguiba Avenue, the country is reeling under its worst political and economic crises since its independence from France in 1956. In the last couple of years, these crises have been exacerbated by the outbreak of the coronavirus pandemic and a record surge in infections.

The gradual descent of the republic into the abyss of anarchy and injustice is just so bitter and hard to swallow for Tunisians, who have always looked back with pride at their history and achievements when it comes to development and modernisation. In fact, Tunisia, which was once known as one of the most progressive countries in the Middle East, is now beyond recognition, its brighter face fading into political disputes, endless polemics about constitutional prerogatives and useless debates about the role of religion in politics.

The late leader Habib Bourguiba, who was president of Tunisia from 1957 to 1987 after leading the country to independence from France, wanted this republic to rise as a model for citizenship, progress, scientific development, economic growth, modern education and women’s liberation. He certainly failed to establish political freedom and pluralism at a time when very few viewed these issues as a priority and a prerequisite for the creation of a strong nation. Bourguiba did, however, set this republic on the right track for modernisation and anchored a belief in Tunisians’ ability to create “miracles.”

Alas, 21 years after Bourguiba, a nemesis of Islamists, passed away, these hopes were dashed. The Tunisian leader himself warned Tunisians on many occasions about the detrimental role of political Islam and the destructive project of Islamists. Some listened to him. Others were either driven by an undiscerning openness to all political currents, including Islamists, or blinded by the duplicitous discourse of the Muslim Brotherhood, particularly its Tunisian offshoot, represented by Ghannouchi and his disciples.

Since Tunisia’s 2011 uprising, Ennahda has played the cards of revolution and religion to lure in voters, particularly those from poor and working-class neighbourhoods and marginalised regions, promising them a better future in which the dreams of youth could be realised and justice achieved.

But after the party’s rise to power, senior Islamist leaders, particularly Ghannouchi, looked at power as a trophy, dividing the spoils of conquest among themselves while working to maintain control over state institutions through alliances, coalitions and political manoeuvres against their opponents and friends alike.

After numerous shocking political events, including the assassinations of leftist leaders Chokri Belaid and Mohamed Brahmi in 2013 and, later, the gradual isolation of the late President Beji Caid Essebsi in 2016, Tunisia began to see Ennahda’s true face, viewing the party as too unstable and fragile to help in any way counter the challenges the country faced.

False promises of employment and modernisation, coupled with growing corruption and calls to compensate all Ennahda members for so-called years of oppression, opened Tunisians’ eyes to the Islamist movement’s real agenda: a raw monopoly on power and the reaping of personal benefits.

Now, as Ghannouchi’s castle of sand steadily crumbles away and the truth behind Ennahda’s opportunist pursuits emerges, the Islamist movement is facing one of its toughest challenges since its creation in the 1980s.

On Sunday, Tunisian President Kais Saied ousted the Islamist-backed government of Prime Minister Hichem Mechichi and suspended the Islamist-controlled parliament with help from the army.

The president’s action followed months of deadlock and disputes pitting him against Mechichi and a dysfunctional parliament, as Tunisia descended into an economic crisis exacerbated by one of Africa’s worst COVID-19 outbreaks.

Of course, the Ennahda Movement and its leader Ghannouchi, now in their weakest moment since their return from exile, are crying foul and warning that Saied’s action endangers the so-called democratic system of 2011, which they introduced and shaped in a way that guarantees their political dominance.

So, what happened in Tunisia?

For sure, what happened was not a coup, as Islamists and their allies are claiming, nor a suspension of the democratic process. It is rather a rectification of the course of the democratic transition so as to pull the country out of the abyss of Ennahda’s malicious and short-sighted control over state institutions.

So far, no one can accurately spell out the president’s intentions in the absence of a clear roadmap detailing the next steps he will take. However, one thing is certain: Tunisians want Islamists out. They want the Ennahda project to fade from the political and social landscape. They also want justice, as they hope to see the Islamists held accountable for ten years of political mismanagement, dismantling of the state, corruption and wealth accumulation.

Tunisians also want justice to be served against those who were involved in the assassinations of Belaid and Brahmi, especially after a group of lawyers pursuing evidence in the 2013 killings said they had uncovered information incriminating the Ennahda Movement.

Tunisians want more than all of that. They want the electoral law to be amended in a way that would prevent Ennahda from again controlling the political scene. They want competent people, not political charlatans, to be brought aboard. They also hope for the 2014 constitution to be amended and for the political system to be reviewed.

These very demands were raised by the president himself earlier in June, when, in an effort to ease the ongoing political crisis, he called for a dialogue with political parties on creating a new political system and amending the 2014 constitution, which he described as one “with locks everywhere”.

The president’s comments, however, fell on deaf ears at the time, with Ennahda, increasingly emboldened by its growing control over the government, rejecting any talks about the issue.

Saied, who decided to act on Sunday, probably lost patience with Ennahda’s intransigence. But, unlike Saied, Tunisians have lost patience with everything, including the disastrous role played by Ennahda in the country and the performance of most other political parties, regardless of their ideologies or affiliations.

Soon after Saied announced Sunday’s decisions, large crowds poured into the streets to express support for his moves, reflecting their anger at Ennahda. In every city, people spontaneously celebrated what they perhaps considered an end to the Islamist nightmare.

In my city in north-eastern Tunisia, I must confess I celebrated Saied’s decisions too, years after I had lost hope of seeing this country free from the Islamists’ control. Like all my compatriots, I celebrated without thinking, without asking what might happen next and without fearing the possible violence that Ennahda and its allies might provoke to regain control and protect their interests.

We all celebrated, but we did not celebrate a return to one-man rule. We were not hailing the president as a hero as much as we were rejoicing that an era, marked and smothered by political Islam, was over.

Now, a few days after Saied’s decisions, we feel more cool-headed and we are all waiting to see if the president will listen to the people and guarantee this republic a future that is free of opportunism, dogma, corruption and the use of religion for political purposes. Time, maybe, for the long-delayed dreams to be fulfilled.

Iman Zayat is the Managing Editor of The Arab Weekly.

Copyright ©2021 The Arab Weekly — distributed by Agence Global

—————-

Released: 10 August 2021

Word Count: 1,356

—————-

Andrew Bacevich, “Answering the armies of the cheated”

August 5, 2021 - TomDispatch

“The thirty-year interregnum of U.S. global hegemony,” writes David Bromwich in the journal Raritan, “has been exposed as a fraud, a decoy, a cheat, [and] a sell.” Today, he continues, “the armies of the cheated are struggling to find the word for something that happened and happened wrong.”

In fact, the armies of the cheated know exactly what happened, even if they haven’t yet settled on precisely the right term to describe the disaster that has befallen this nation.

What happened was this: shortly after the end of the Cold War, virtually the entire American foreign-policy establishment succumbed to a monumentally self-destructive ideological fever.

Call it INS, shorthand for Indispensable Nation Syndrome. Like Covid-19, INS exacts a painful toll on its victims. Unlike with Covid, however, we still await a vaccine that can prevent its spread. We know that preexisting medical conditions can increase a person’s susceptibility to the coronavirus. The preexisting condition that increases someone’s vulnerability to INS is the worship of power.

Back in 1998, Secretary of State Madeleine Albright not only identified INS, but also captured its essence. Appearing on national TV, she famously declared, “If we have to use force, it is because we are America. We are the indispensable nation. We stand tall. We see further into the future.”

Now, allow me to be blunt: this is simply not true. It’s malarkey, hogwash, bunkum, and baloney. Bullshit, in short.

The United States does not see further into the future than Ireland, Indonesia, or any other country, regardless of how ancient or freshly minted it may be. Albright’s assertion was then and is now no more worthy of being taken seriously than Donald Trump’s claim that the “deep state” engineered the coronavirus pandemic. Also bullshit.

Some of us (but by no means all Americans) have long since concluded that Trump was and remains a congenital liar. To charge Albright with lying, however, somehow rates as bad form, impolite, even rude. She is, after all, a distinguished former official and the recipient of many honors.

Trump’s lies have made him persona non grata in polite society. Albright has not suffered a similar fate. And to be fair, Albright herself is not solely or even mainly responsible for the havoc that INS has caused. While the former secretary of state promoted the syndrome in notably expansive language, the substance of her remark was anything but novel. She was merely reiterating what, in Washington, still passes for a self-evident truism: America must lead. No conceivable alternative exists. Leadership implies responsibilities and, by extension, confers prerogatives. Put crudely — more crudely than Albright would have expressed it to a television audience — we make the rules.

More specifically, Albright was alluding to a particular prerogative that a succession of post-Cold War presidents, including Donald Trump and now Joe Biden, have exercised. Our political leaders routinely authorize the elimination, with extreme prejudice, of persons unwilling to acknowledge our indispensability.

Should Irish or Indonesian leaders assert such a prerogative, American officials would roundly condemn them. Indeed, when Russia’s president and the crown prince of Saudi Arabia each had the temerity to bump off an opponent, U.S. officials (in the former case) and the American media (in the latter case) professed profound shock. How could such things be permitted to occur in a civilized world? When an American president does such things, however, it’s simply part of the job description.

Three strikes and you’re out!

Now, allow me to acknowledge the allure of exercising privileges. I once flew on a private jet — very cool, indeed.

Today, however, David Bromwich’s armies of the cheated have good reason for their disappointment. The bullshit has lost its mojo. Since the promulgation of the Albright Doctrine, U.S. forces have bombed, invaded, and occupied various countries across the Greater Middle East and Africa with élan. They’ve killed lots of people, unsettling millions more. And our divided, dysfunctional country is the poorer for it, as the cheated themselves have belatedly discovered.

Blame Donald Trump for that division and dysfunction? Not me. I hold the militant purveyors of INS principally responsible. However contemptible, Trump was little more than an accessory after the fact.

To understand how we got here, recall the narrative that ostensibly validates our indispensability. It consists of sequential binaries, pitting freedom and democracy against all manner of evils. In World War I, we fought militarism; in World War II, we destroyed fascism; during the Cold War, we resisted and “contained” communism. And after 9/11, of course, came the Global War on Terrorism, now approaching its 20th anniversary.

Good versus evil, us against them, over and over again. That recurring theme of American statecraft has endowed INS with its historical context.

Today, in Washington, a foreign-policy establishment afflicted with rigor mortis reflexively reverts to the logic of 1917, 1941, 1947, and 2001, even though those past binaries are about as instructive today as the religious conflicts touched off by the Protestant Reformation of the 1500s.

Confronting evil is no longer the name of the game. Understanding the game’s actual nature, however, would require jettisoning a past that purportedly illuminates but actually imprisons Americans in an ongoing disaster.

Today, race dominates the national conversation. And few Americans would deny that we have a race problem. But the United States also has a war problem. And just about no one is keen to talk about that problem.

More specifically, we actually have three problems with war.

Our first is that we have too many of them. Our second is that our wars drag on way too long and cost way too much. Our third is that they lack purpose: when our wars do eventually more or less end, America’s declared political objectives all too often remain unmet. U.S. forces don’t necessarily suffer defeat. They merely fail. For proof, look no further than the conduct and outcomes of the Iraq and Afghanistan wars.

Two trips to the plate. Two whiffs. How could that have happened? In Washington, the question not only goes unanswered but totally unasked, which, of course, leaves open the possibility of yet another similar failure in the future.

As a long-ago soldier of no particular distinction, I’m mystified at the apparent absence of curiosity regarding the inability of the world’s most generously supported military to accomplish its assigned missions. If the January 6th assault on the Capitol deserves a thorough investigation — as surely it does — then how can this nation pass over a succession of failed wars as if they were mere annoyances? Shouldn’t our collective commitment to “supporting the troops” include a modicum of curiosity about why they have been so badly misused, even if the resulting inquiry should prove embarrassing to senior civilian and military officials?

Liberal media outlets characterize Trump’s claim to have won the 2020 election as the Big Lie, as indeed it is. But it’s hardly the only one. Indispensable Nation Syndrome, along with the militarism that it’s spawned in this century, should certainly qualify as — at the very least — the Other Big Lie. Curbing Washington’s susceptibility to INS requires acknowledging that the proximate challenges facing this country are in no way amenable to even the most creative military solutions. Giving yet more taxpayer dollars to the Pentagon helps sustain the military-industrial complex, but otherwise solves nothing.

Think about it. The defining reality of our moment is the ever-worsening climate chaos that so many of us are now experiencing personally. That threat, after all, has potentially existential implications. Yet in Washington’s hierarchy of national security concerns, climate takes a back seat to gearing up for a new round of “great power competition.” In effect, a foreign-policy establishment devoid of imagination has tagged Xi Jinping’s China to fill the role once assigned to Kaiser Wilhelm’s Germany, Adolf Hitler’s Germany, Joseph Stalin’s Soviet Union, and Saddam Hussein’s Iraq.

That China and the United States must make common cause in addressing the climate crisis seems to count for little. Nor does the fact that the People’s Republic ranks as America’s biggest trading partner and holds more than a trillion dollars in U.S. debt. Sustaining the good-vs-evil binary as a basis for policy requires a major enemy. It hardly matters that the most basic assumptions about the continuity between past and present are not only illusory but distinctly counterproductive.

So, here’s the deal: history didn’t end when the Cold War did. At most, it paused briefly to catch its breath. Now, it’s resumed and is darting off in directions we’ve barely begun to identify. The past that we’ve been conditioned to cherish, that’s supposed to make sense of everything, makes sense of more or less nothing at all. As a result, it won’t work as either map or compass. Indispensable Nation? Spare me.

Don’t get me wrong. I’m not expecting Madeleine Albright to offer an apology, but it would be helpful if she at least issued a retraction. She might think of it as her parting gift to the nation.

Andrew Bacevich writes regularly for TomDispatch (where this article originated). He is president of the Quincy Institute for Responsible Statecraft. His new book, After the Apocalypse: America’s Role in a World Transformed, has just been published.

Copyright ©2021 Andrew Bacevich — distributed by Agence Global

—————-

Released: 05 August 2021

Word Count: 1,493

—————-

Kelly Denton-Borhaug, “Moral injury and the forever wars”

August 3, 2021 - TomDispatch

This summer, it seemed as if we Americans couldn’t wait to return to our traditional July 4th festivities. Haven’t we all been looking for something to celebrate? The church chimes in my community rang out battle hymns for about a week. The utility poles in my neighborhood were covered with “Hometown Hero” banners hanging proudly, sporting the smiling faces of uniformed local veterans from our wars. Fireworks went off for days, sparklers and cherry bombs and full-scale light shows filling the night sky.

But all the flag-waving, the homespun parades, the picnics and military bands, the flowery speeches and self-congratulatory messages can’t dispel a reality, a truth that’s right under our noses: all is not well with our military brothers and sisters. The starkest indicator of that is the rising number of them who are taking their own lives. A new report by Brown University’s Costs of War Project calculates that, in the post-9/11 era so far, four times as many veterans and active-duty military have committed suicide as died in war operations.

While July 4th remembrances across the country focused on the symbols and institutions of war and militarization, most of the celebrants seemed far less interested in hearing from current and former military personnel. After all, less than 1% of Americans have been burdened with waging Washington’s wars in these years, even as we taxpayers have funded an ever-more enormous military infrastructure.

As for me, though, I’ve been seeking out as many of those voices as I could for a long, long time. And here’s what I’ve learned: the truths so many of them tell sharply conflict with the remarkably light-hearted and unthinking celebrations of war we experienced this July and so many Julys before it. I keep wondering why so few of us are focusing on one urgent question: Why are so many of our military brothers and sisters taking their own lives?

The moral injuries of war

The term moral injury is now used in military and healthcare settings to identify a deep existential pain destroying the lives of too many active-duty personnel and vets. In these years of forever wars, when the moral consciences of such individuals collided with the brutally harsh realities of militarization and killing, the result has been a sharp, sometimes death-dealing dissonance. Think of moral injury as an invisible wound of war. It represents at least part of the explanation for that high suicide rate. And it’s implicated in more than just those damning suicides: an additional 500,000 troops in the post-9/11 era have been diagnosed with debilitating, not fully understood symptoms that make their lives remarkably unlivable.

I first heard the term moral injury about 10 years ago at a conference at Riverside Church in New York City, where Jonathan Shay, the renowned military psychologist, spoke about it. For decades he had provided psychological care for veterans of the Vietnam War who were struggling with unremitting resentment, guilt, and shame in their post-deployment lives. They just couldn’t get on with those very lives after their military experiences. They had, it seemed, lost trust in themselves and anyone else.

Still, Shay found that none of the typical mental-health diagnoses seemed to fit their symptoms. This wasn’t post-traumatic stress disorder — the hyper-vigilance, anxiety, and fears arising from traumatic experience. No, what came to be known as moral injury seemed to result from a sense that the very center of one’s being had been assaulted. If war’s intent is to inflict physical injury and destruction, and if the trauma of war afflicts people’s emotional and psychic well-being, moral injury describes an invisible wound that burns away at a person’s very soul. The Iraq War veteran and writer Kevin Powers describes it as “acid seeping down into your soul, and then your soul is gone.”

A central feature of moral injury is a sense of having betrayed one’s own deepest moral commitments, as well as of being betrayed morally by others. People who are suffering from moral injury feel there’s nothing left in their world to trust, including themselves. For them, any notion of “a shared moral covenant” has been shattered. But how does anyone live in a world without moral guideposts, even flawed ones? The world of modern war, it seems, not only destroys the foundations of life for its targets and victims, but also for its perpetrators.

Difficult truths from those on the front lines of our wars

For civilians like me, there’s no way to understand moral injury without listening to those afflicted with it. I’ve been doing so to try to make sense of our culture of war for years now. As a religious studies scholar, I’ve been especially concerned about the ways in which so many of us give American-style war a sacred quality. Think, for instance, about the meme that circulates during national holidays like the recent July 4th, or Veterans Day, or Memorial Day: “Remember that only two forces ever agreed to die for you: Jesus Christ and the American soldier. One died for your freedom, the other for your soul; pass it on!”

How, I wonder, do such messages further shame and silence those already struggling with moral injury whose experiences have led them to see war as anything but sacred?

It’s been years since I first heard Andy, a veteran of the Iraq War, testify in the most personal way about moral injury at a Philadelphia church. He’s part of a family with a long military history. His father and grandfather both served in this country’s wars before, at 17, he enlisted in the Army in 1999. He came to work in military intelligence and would eventually be deployed to Iraq.

But all was most definitely not well with Andy when, after 11 years in the Army, he returned to civilian life. He found himself struggling in his relationships, unable to function, a mess, and eventually suicidal. He bounced from one mental healthcare provider to the next for eight years without experiencing the slightest sense of relief. On the verge of ending his life, he was referred to a new “Moral Injury Group” led by chaplain Chris Antal and psychologist Peter Yeomans at the Crescenz VA Hospital in Philadelphia. At that moment, Andy decided this would be his last effort before calling it quits. Frankly, given what I now know, I’m amazed that he was willing to take that one last chance after so many years of suffering, struggle, and pain to so little purpose.

The professionals who lead that particular group are remarkably blunt about what they call “the work avoidance” of most citizens — the way that the majority of us fail to take any responsibility for the consequences of the endless wars we’ve been fighting in this century. People, they’ve told me, regularly deflect responsibility by adopting any of three approaches to veterans: by valorizing them (think of the simplistic “thank you for your service/sacrifice” or the implicit message of those “hometown heroes” banners); by pathologizing them (seeing vets as mentally ill and irreparably broken); or by demonizing them (think of the Vietnam-era “baby-killers” moniker). Each of these approaches essentially represses what those veterans could actually tell us about their experiences and our wars.

So, the leaders of the Crescenz VA Moral Injury Group developed an unorthodox approach. They assured Andy that he had an important story to tell, one the nation needed to hear so that civilians could finally “bear the brunt of the burden” of sending him to war. Eight years after leaving the military and a few weeks into that program, he revealed for the first time, to those caregivers and vets, the event at the root of his own loss of soul. While deployed in Iraq, he had participated in calling in an airstrike that ended up killing 36 Iraqi men, women, and children.

I’ll never forget watching Andy testify about that very moment in the Philadelphia church on Veterans Day before an audience that had expressly indicated its willingness to listen. With palpable anguish, he told how, after the airstrike, his orders were to enter the bombed structure. He was supposed to sift through the bodies to find the supposed target of the strike. Instead, he came upon the lifeless bodies of, as he called them, “proud Iraqis,” including a little girl with a singed Minnie Mouse doll. Those sights and the smell of death were, he told us, “etched on the back of his eyelids forever.” This was the “shame” he carried with him always, an “unholy perpetration,” as he described it.

The day of that attack, he said, he felt his soul leave his body. Over years of listening to veterans’ stories, I realize that I’ve heard similar descriptions again and again. It may seem extreme to speak about one’s very soul being eviscerated, but it shouldn’t be treated as an exaggeration. After all, how can we even imagine what the deaths of so many men, women, and children may have meant for the Iraqi families and communities whose loved ones perished that day?

Andy’s story clarifies a reality Americans badly need to grasp: the destruction of war goes far beyond its intended targets. In the end, its violence is impossible to control. It doesn’t stay in those distant lands where this country has been fighting for so many fruitless years. Andy is the proof of that. His “loss of soul” almost had the direst of consequences, as his own suicidal impulses began to take over. Of that moment and his seemingly eternal imprisonment in the hell of war, he said: “I relive this alone, the steel cylinder heavy with the .38, knowing that to drive one into my own face will free me from this prison, these sights and smells.”

Taking moral injury seriously goes against the grain of American war culture

Valorizing, pathologizing, and demonizing vets are all ways of refusing to listen to the actual experiences of those who carry out our wars. And for them, returning home often just adds to their difficulties, since so much of what they might say goes against the grain of national culture.

We’re generally brought up to see ourselves as a nation whose military gets the job done, despite the “forever wars” of the last nearly 20 years. Through national rituals, holidays, and institutions, hot embers of intense pride are regularly stoked, highlighting our military as the fiercest and strongest in the world. Many of us identify what it means to be a citizen with belonging to the most feared and powerful armed forces on the planet. As a result, people easily believe that, when the U.S. goes to war, what we’re doing is, almost by definition, moral.

But those who dare to pay attention to the morally injured will find them offering inconvenient and uncomfortable truths that sharply conflict with exactly those assumptions. Recently, I listened to another group of military veterans and combat correspondents who gathered their courage to tell their stories publicly in a unique fashion for The Moral Injuries of War project. Here are just three small examples:

• “The military just teaches you don’t ask questions, and if you figure it out, it really isn’t your business anyway. That part, that probably is the biggest thing, having to do things you wonder about, but you can’t ask a question.”

• “The cynical part of me wants the public to understand that it’s your fault; we are all complicit in all of this horror. I don’t need other people to experience my pain, I need other people to understand that they are complicit in my pain.”

• “People want to say thank you for your service, wave a flag, but you’re left with these experiences that leave you feeling deeply shameful.  I burned through any relationship in my life with anybody who loved me. I have this feeling in my gut that something really bad is going to happen. God’s shoe was going to fall on me, I can’t breathe.”

I remember how struck I was at the Veterans Day gathering in that Philadelphia church where I first heard Andy speak, because it was so unlike most such celebrations and commemorations. Instead of laying wreaths or planting crosses in the ground; instead of speeches extolling vets as “the spine of the nation” and praising them for their “ultimate sacrifice,” we did something different. We listened to them tell us about the soul-destroying nature of what actually happened to them during their military service (and what’s happened to them ever since). And in addition to civilians like me, other vets were in those church pews listening, too.

After the testimonies, the VA chaplain leading the ceremony asked us all to come to the front of the church. There, he directed the vets to form a circle facing outwards. Then, he asked the civilians to form a circle around them and face them directly. What happened next challenged and moved me. The chaplain suggested we simply stand in silence for a minute, looking into each other’s eyes. You can’t imagine how slowly that minute passed. More than a few of us had tears running down our cheeks. It was as if we were all holding a painful, sharp, unforgiving reality — but doing it together.

Moral injury is a flashpoint that reveals important truths about our wars and the war-culture that goes with them. If focused on, instead of ignored, it raises uncomfortable questions. In the United States, military service is often described as the epitome of citizenship. Leaders and everyday folks alike claim to value veterans as our most highly esteemed citizens.

I wonder, though, if this isn’t just another way of avoiding a real acknowledgment of the disaster of this country’s twenty-first-century wars. Closing our ears to the veterans who have been on their front lines means ignoring the truths of those wars as well.

If this nation truly esteemed them, wouldn’t we do more to avoid placing them in just the kind of circumstances Andy faced? Wouldn’t our leaders work harder to find other ways of dealing with whatever dangers we confront? Wouldn’t everyday citizens raise more questions about the pervasive “innocent” celebrations of violence on national holidays that only sacralize war-culture as a crucial aspect of what it means to be an American citizen?

For Andy, that Moral Injury Group at the Crescenz VA was the place where his “screaming soul” could be heard. Instead of being “imprisoned by guilt,” he described how he began to feel “empowered” by it to tell the truth about our wars to the rest of us. He hopes that the nation will somehow learn to “bear its brunt of the burden” of those wars and the all-American war-culture that accompanies them in a way that truly matters — a new version of reality that would start with finally listening.

Kelly Denton-Borhaug has long been investigating how religion and violence collide in American war-culture. She teaches in the global religions department at Moravian University. She is the author of two books, U.S. War-Culture, Sacrifice and Salvation and, more recently, And Then Your Soul is Gone: Moral Injury and U.S. War-Culture. This article originated at TomDispatch.

Copyright ©2021 Kelly Denton-Borhaug — distributed by Agence Global

—————-

Released: 03 August 2021

Word Count: 2,477

—————-

Rami G. Khouri, “With its collapse, Lebanon joins a bleak club of Arab countries”

August 3, 2021 - Rami G. Khouri

It was once the exceptional Arab state that, despite civil war and constant political turmoil, still safeguarded pluralism and personal freedoms. But Lebanon now looks like a dozen other countries in the Middle East, in slow, seemingly inexorable decline into deprivation and autocracy. While too many Lebanese increasingly face poverty, lower living standards and diminished personal rights, entrenched ruling elites have embraced ever more militaristic and authoritarian ways to remain in power, continually rejecting reforms and condemning the country to further suffering.

Lebanon’s sad transformation over the past three years is significant for at least two reasons. The first is that Lebanon’s pluralistic system, which allowed vibrant educational, media, business and cultural sectors to flourish in the country in the years before and after its 15-year civil war, also contributed repeatedly to the economic and social development of many other Arab countries in the same period. Whether in health care, education or private business, these other countries benefited from an influx of Lebanese enterprise and skills. But this signature Lebanese export is now declining and may ultimately disappear as the country’s economy collapses. The second reason is that Lebanon’s slide into pauperization and securitized governance seals the almost total retreat of Arab political rule into the club of autocrats, generals and dangerous young royals.

Lebanon has quickly turned into just another impoverished, troubled Arab country: Its citizens suffer more and more socioeconomic stress, while their political rights are seized by the heavy hand of a worried government that cannot seem to maintain social stability, except with battalions of police, soldiers and plainclothes regime thugs wielding batons and throwing tear gas. One of the Lebanese government’s new favored tactics against its critics is to call in citizens for questioning by the security services — or, in some cases, to detain individuals and refer them to courts for allegedly “harming the state” through their activism or social media activity. This is new for Lebanon, but it has been common practice during the last decade in Bahrain, Egypt, Saudi Arabia, the United Arab Emirates, Morocco, Algeria, Sudan, Iraq and Jordan.

Lebanon’s decline has been mirrored in Palestine, where the inept and increasingly autocratic Palestinian Authority — answering more to Israel than to its own people — arrests or beats up protesters calling on President Mahmoud Abbas, whose original four-year term ended way back in 2009, to resign. The recent case of a Palestinian journalist, Nizar Banat, who died in the custody of Palestinian police, sparked major street protests throughout the West Bank, which Abbas’ government met with heavily armed police and more plainclothes security thugs.

These episodes, all too routine for countries like Egypt, rarely happened before in Lebanon or Palestine. Now that such repression is a reality for them, too, Lebanese and Palestinians are doubly angered by their inability to do anything about draconian state security forces. It also adds to the long list of cases, documented for too long by international and local human rights groups, of civilian protesters across the Arab world who are harassed, arrested, detained or even killed for simply demanding a better system of governance that protects their social, economic and political rights — especially the right to free expression.

Lebanon was for years a holdout in a region where too many states are defined by their crumbling economies, poverty-stricken citizens and militant governments. Now that Lebanon has joined that miserable club, it highlights a striking feature of the modern Arab political system that came into being a century ago, at the hands of European colonial powers and their favorite local elites: The entire region often seems to move to the beat of a common drum. This probably reflects the fact that most citizens share the same feelings of hope or frustration, because they seem to be governed by similar political systems that have never embraced genuine pluralistic democracy or political accountability.

The record on this is clear. The common struggle for independence from colonial rule in the early 20th century rippled through most Arab-majority societies. Then a shared focus on national development and state-building defined all Arab countries in the period roughly between 1930 and 1960. Starting in the early 1970s, the oil boom era funded a common rush toward brisk spending on both useful infrastructure and profligate, corruption-induced showcase ventures. This was followed by a two-decade period, from about 1980 to 2000, when most Arab countries saw erratic economic development that reflected fluctuating oil and gas incomes and ever-expanding corruption, in countries that never seriously built productive and balanced economies. At the same time, some states like Iraq, Syria, Tunisia, Sudan and Libya felt the pain of being governed by one-man military dictatorships that proved to be incompetent at both military action and national development.

Of course, in late 2010 and early 2011, spontaneous citizen rebellions erupted across half of the region. Some of them succeeded in toppling despots — most of all, in Tunisia and Egypt — while others morphed into civil wars — in Syria, Libya and later Yemen — that quickly attracted regional and international involvement. Despite various setbacks, this wave of uprisings and revolutions continues today — outside of the small, oil-rich sheikhdoms in the Gulf — because a majority of citizens despair of living a decent life or passing on any future promise to their children. Sudan, Algeria and Iraq — and Lebanon, too — have all experienced two years or more of nonstop popular protest, but in most cases with little or no sign of the ruling elites giving up the power they have long monopolized.

Unlike other Arab states that have seen these surges of protest, Lebanon has not had a tradition of a strong central government that monopolizes power and dominates all aspects of national life, from politics and the economy to security, the media and even popular culture and the arts. Lebanon has instead been broken by the persistence of the current sectarian power-sharing system that has ruled the country since the end of its civil war in 1990. Rather than power-sharing, it is really a power-hogging arrangement among various sectarian warlords, supported by the single strongest actor in the country: Hezbollah. In the past few years, Lebanon’s sectarian leaders have collectively copied many other Arab states, whose deep-seated elites allow no serious political participation, zealously guarding their own interests.

The results are the multiple banking, foreign exchange and fiscal crises that have left Lebanon’s once vibrant culture and economy as a skeleton of its former self. The elites in Beirut that refuse to budge in the face of sustained street protests and public outcries — just like elites in Baghdad or Algiers — have robbed Lebanon blind and shattered its infrastructure. The evidence on the streets of Beirut is in the piles of uncollected garbage, the rolling power outages and the ruins of the city’s port. Nearly a year after last August’s devastating port explosion, investigations by Lebanese judges and prosecutors, which identified officials to be tried in court, have been repeatedly blocked, hindered or postponed by actors in the security sector, the presidency, the judiciary and parliament. Prices of almost everything in the country have tripled in the past two years, as the value of the Lebanese pound continues to decline every week.

About 60 percent of Lebanese now live in poverty, bringing the country closer to the average of nearly 70 percent of citizens in the Arab world who are either poor or vulnerable to poverty, according to U.N. data. Mirroring regional trends, the pauperized Lebanese continue to rise up in sustained and bitter public protest — to the point where demonstrators now carry nooses as a symbol of their desire to hang all leaders. The wealthy and holders of foreign passports steadily leave for other lands, but the vast majority cannot. They suffer and seethe with anger, along with fear, humiliation, helplessness and, ultimately, a sense of dehumanization at the hands of their own leaders.

Protesters and ordinary citizens alike continue to search for a way out of this national trauma, having been unable through sustained protests or foreign pressure to force any concessions from those in power. Opposition and reformist victories in recent elections for professional associations and syndicates have prompted many Lebanese to organize for the 2022 parliamentary and presidential elections as a way to drive out the current rulers. Yet they are also all too aware that those same rulers can simply postpone the elections, as they have before.

Meanwhile, a strong spirit of communal solidarity and self-help has kicked in, with Lebanese all over the country aiding each other however they can — sharing food, water, medicine, electricity, gasoline and shelter. Many see it as setting the example for how a real and decent government should operate. But others fear that by meeting some urgent needs now, this communal solidarity will just let the government off the hook, so it can postpone any reforms.

Lebanon, like so many other Arab societies today, is now in an unfamiliar new zone where life for most of its citizens is a daily struggle for things as basic as food; no breakthroughs are on the horizon. The rest of the world, to most Lebanese, seems not to care, or in some cases even supports some of the sectarian leaders in the ruling oligarchy responsible for Lebanon’s collapse. Like most other Arab societies, the Lebanese now curse the political class that has made them suffer like this, and they cope as best they can. They keep searching for the magic key that will one day unlock the door to a better future. They insist they can and will become an Arab citizenry that defines its own values, rights and national policies — for the first time, perhaps, since the modern Arab state system was born a century ago.

Rami G. Khouri is the director of global engagement at the American University of Beirut, a nonresident senior fellow at the Harvard Kennedy School Middle East Initiative, and an internationally syndicated columnist. (This article originated at DAWN.)

Copyright ©2021 Rami G. Khouri — distributed by Agence Global

—————-

Released: 03 August 2021

Word Count: 1,627

—————-

William Astore, “Pivoting to America”

July 29, 2021 - TomDispatch

As a ROTC cadet and an Air Force officer, I was a tiny part of America’s vast Department of Defense (DoD) for 24 years until I retired and returned to civilian life as a history professor.  My time in the military ran from the election of Ronald Reagan to the reign of George W. Bush and Dick Cheney. It was defined by the Cold War, the collapse of the Soviet Union, America’s brief unipolar moment of dominance and the beginning of its end, as Washington embroiled itself in needless, disastrous wars in Afghanistan and Iraq after the 9/11 attacks.  Throughout those years of service, I rarely thought about a question that seems ever more critical to me today: What would a real system of American national defense look like?

During the Cold War, I took it for granted that this country needed a sprawling network of military bases, hundreds of them globally.  Back then, of course, the stated U.S. mission was to “contain” the communist pathogen.  To accomplish that mission, it seemed all too logical to me then for our military to emphasize its worldwide presence.  Yes, I knew that the Soviet threat was much exaggerated. Threat inflation has always been a feature of the DoD and at the time I’d read books like Andrew Cockburn’s The Threat: Inside the Soviet Military Machine. Still, the challenge was there and, as the leader of the “free world,” it seemed obvious to me that the U.S. had to meet it.

And then the Soviet Union collapsed — and nothing changed in the U.S. military’s global posture.

Or, put differently, everything changed.  For with the implosion of the USSR, what turned out to remain truly uncontained was our military, along with the dreams of neoconservatives who sought to remake the world in America’s image.  But which image?  That of a republic empowering its citizens in a participatory democracy or of an expansionist capitalist empire, driven by the ambition and greed of a set of oligarchs?

A few people spoke then of a “peace dividend.” They were, however, quickly drowned out by the military-industrial complex that President Dwight D. Eisenhower had warned this country about.  That complex, to which today we might add not only Congress (as Ike had done in an earlier draft of his address) but America’s sprawling intelligence apparatus of 18 agencies, eagerly moved into the void created by the Soviet collapse and that of the Warsaw Pact. It quickly came to dominate the world’s trade in arms, for instance, even as Washington sought to expand NATO, an alliance created to contain a Soviet threat that no longer existed.  Such an expansion made no sense, defensively speaking, but it did serve to facilitate further arms sales and bring U.S. imperial hegemony to the very borders of Russia.

And there was the rub — for me at least.  As an Air Force officer, I’d always thought of myself, however naively, as supporting and defending the Constitution against all enemies, foreign and domestic (the words of my oath of office).  After 1991, however, the main foreign enemy had disappeared and, though I didn’t grasp it then, our new enemy would prove to be domestic, not foreign.  It consisted of those who embraced as a positive good what I’ve come to think of as greed-war, while making no apologies for American leadership, no matter how violent, destructive, or self-centered it might prove to be.

In short, the arsenal of democracy of World War II fame had, by the 1960s, become the very complex of imperialism, militarism, and industrialism that Eisenhower warned Americans about first in his 1953 “Cross of Iron” speech and then in his more famous farewell address of 1961.  Despite the efforts of a few brave Americans, that arsenal of democracy was largely allowed to morph into an arsenal of empire, a radical change that came shrouded in the myth of “national security.”  The complex would then only serve to facilitate the war crimes of Vietnam and of subsequent disasters like Afghanistan, Iraq, and Libya, among so many others.

Yet those same misdeeds were so often dismissed by the mainstream media as the unavoidable costs of “national defense” or even supported as the unavoidable price of spreading freedom and democracy around the world. It was as if, in some twisted Orwellian fashion, war had come to be seen as conducive to liberty and righteousness.  But as George Orwell had indeed warned us, war is not peace, nor should constant warfare at a global level be the product of any democratic government worthy of its name.  War is what empires do and it’s what America has become: a machine for war.

Creating a people’s military

So, I ask again: What would real national defense for this country look like?  Rarely do any of us pose this question, no less examine what it might truly mean.  Rarely do we think about all the changes we’d have to make as a nation and a people if we were to put defense first, second, and last, while leaving behind both our imperial wars and domestic militarism.

I know what it wouldn’t look like.  It wouldn’t look like today’s grossly inflated military.  A true Department of Defense wouldn’t need 800 foreign military bases, nor would the national security state need a budget that routinely exceeds a trillion dollars annually.  We wouldn’t need a huge, mechanized army, a navy built around aircraft carriers, or an air force that boasts of its global reach and global power, all of it created not for defense but for offense — for destruction, anytime, anywhere.

As a country, we would need to imagine a new “people’s” military as a force that could truly defend the American republic. That would obviously mean one focused above all on supporting the Constitution and the rights we (at least theoretically) hold sacred like freedom of speech, the press, and assembly, the right to privacy and due process, and of course the right to justice for all, not just for the highest bidder or those with the deepest pockets.

What might such a new military look like?  First, it would be much smaller.  America’s current military, including troops on active duty, reservists, and members of the National Guard, consists of roughly 2.4 million men and women.  Those numbers should gradually be cut at least in half.  Second, its budget should similarly be dramatically cut, the end goal being to have it 50% lower than next year’s proposed budget of $715 billion.  Third, it wouldn’t be based and deployed around the world. As a republican force (note the lower-case “r”), it would instead serve democratic ends rather than imperial ones.  It would certainly need far fewer generals and admirals.  Its mission wouldn’t involve “global reach,” but would be defensive, focused on our borders and this hemisphere.

A friend of mine, a Navy veteran of the Vietnam War, speaks of a military that would consist of a Coast Guard, “militias” (that is, the National Guard) for each of the fifty states, and little else.  Yes, in this America, that sounds beyond extreme, but he has a point.  Consider our unique advantages in terms of geography.  Our continent is protected by two vast oceans.  We share a long and peaceful border with Canada.  While the border with Mexico is certainly troubled, we’re talking about unarmed, desperate migrants, not a military invasion flooding into Texas to retake the Alamo.

Here, then, are just 10 ways America’s military could change under a vision that would put the defense of America first and free up some genuine funds for domestic needs as well:

1. No more new nuclear weapons.  It’s time to stop “modernizing” that arsenal to the tune of possibly $1.7 trillion over the next three decades.  Land-based intercontinental ballistic missiles like the Ground Based Strategic Deterrent, expected to cost more than $264 billion during its lifetime, and “strategic” (nuclear) bombers like the Air Force’s proposed B-21 Raider should be eliminated.  The Trident submarine force should also be made smaller, with limited modernization to improve its survivability.

2. All Army divisions should be reduced to cadres (smaller units capable of expansion in times of war), except the 82nd and 101st Airborne Divisions and the 10th Mountain Division.

3. The Navy should largely be redeployed to our hemisphere, while aircraft carriers and related major surface ships are significantly reduced in number.

4. The Air Force should be redesigned around the defense of America’s air space, rather than attacking others across the planet at any time.  Meanwhile, costly offensive fighter-bombers like the F-35, itself a potential $1.7 trillion boondoggle, should simply be eliminated and the habit of committing drone assassinations across the planet ended. Similarly, the separate space force created by President Trump should be folded back into a much-reduced Air Force.

5. The training of foreign militaries and police forces in places like Iraq and Afghanistan should be stopped.  The utter collapse of the U.S.-trained forces in Iraq in the face of the Islamic State in 2014 and the ongoing collapse of the U.S.-trained Afghan military today have made a mockery of this whole process.

6. Military missions launched by intelligence agencies like the CIA, including those drone assassination programs overseas, should be halted and the urge to intervene secretly in the political and military lives of so many other countries finally brought under some kind of control.

7. The “industrial” part of the military-industrial complex should also be brought under control, so that taxpayer dollars don’t go to fabulously expensive, largely useless weaponry. At the same time, the U.S. government should stop promoting the products of our major weapons makers around the planet.

8. Above all, in a democracy like ours, a future defensive military should only fight in a war when Congress, as the Constitution demands, formally declares one.

9. The military draft should be restored.  With a far smaller force, such a draft should have a limited impact, but it would ensure that the working classes of America, which have historically shouldered a heavy burden in military service, would no longer do so alone. In the future America of my military dreams, a draft would take the eligible sons and daughters of our politicians first, followed by all eligible students enrolled in elite prep schools and private colleges and universities, beginning with the Ivy League.  After all, America’s best and brightest will surely want to serve in a military devoted to defending their way of life.

10. Finally, there should be only one four-star general or admiral in each of the three services. Currently, believe it or not, there are an astonishing 44 four-star generals and admirals in America’s imperial forces. There are also hundreds of one-star, two-star, and three-star officers.  This top-heavy structure inhibits reform even as the highest-ranking officers never take responsibility for America’s lost wars.

Pivoting to America

Perhaps you’ve heard of the “pivot to Asia” under the Obama administration — the idea of redeploying U.S. military forces from the Greater Middle East and elsewhere in response to perceived threats from China.  As it happened, it took the new Biden administration to begin to pull off that particular pivot, but America’s imperial military regularly seems to be pivoting somewhere or other.  It’s time to pivot to this country instead.

Echoing the words of George McGovern, a highly decorated World War II bomber pilot who unsuccessfully ran for president against Richard Nixon in 1972, “Come home, America.” Close all those foreign military bases.  Redirect resources from wars and weapons to peace and prosperity.  Focus on restoring the republic.  That’s how Americans working together could truly defend ourselves, not only from our “enemies” overseas, almost always much exaggerated, but from ourselves, the military-industrial-congressional complex, and all our fears.

Because let’s be frank: how could striking at allegedly Iranian-backed militias operating in Iraq and Syria possibly be a form of self-defense, as the Biden administration claimed back in June?  How is keeping U.S. troops in either of those two countries, or almost any other foreign country, truly a “defensive” act?  America’s “new” department of genuine defense, the one I imagine anyway, will know better.

In my nearly six decades, I’ve come to witness an America that increasingly equates “might” with “right,” and praises its presidents whenever they decide to bomb anyone (usually people in the Middle East or Central Asia, but occasionally in Africa now, too), as long as it’s framed in defensive or “preemptive” terms.  Whether you call this aggression, imperialism, militarism, or something even more unflattering (atrocity?), the one thing it shouldn’t be called is national defense.

Collectively, we need to imagine a world in which we as Americans are no longer the foremost merchants of death, in which we don’t imagine ourselves as the eternal global police force, in which we don’t spend as much on our military as the next 10 countries combined.  We need to dream of a world that’s not totally sliced and diced into U.S. military commands like Africa Command (AFRICOM), Indo-Pacific Command (INDOPACOM), and Central Command (CENTCOM), which covers the Middle East, among others.  How would Americans feel if China had an “AMERICOM” and patrolled the Gulf of Mexico with nuclear-armed aircraft carriers very much “Made in China”?  Chances are we wouldn’t accept Beijing’s high-minded claims about the “defensive” nature of those patrols.

This country’s rebirth will only begin when we truly put our Constitution first and seek to defend it in wiser, which means so much more restrained, ways.

William Astore, a retired lieutenant colonel (USAF) and professor of history, writes regularly for TomDispatch (where this article originated).  He is a senior fellow at the Eisenhower Media Network (EMN), an organization of critical veteran military and national security professionals. His personal blog is Bracing Views.

Copyright ©2021 William Astore — distributed by Agence Global

—————-

Released: 29 July 2021

Word Count: 2,244

—————-

Tom Engelhardt, “The forbidden word”

July 27, 2021 - TomDispatch

It was all so long ago, in a world seemingly without challengers. Do you even remember when we Americans lived on a planet with a recumbent Russia, a barely rising China, and no obvious foes except what later came to be known as an “axis of evil,” three countries then incapable of endangering this one? Oh, and, as it turned out, a rich young Saudi former ally, Osama bin Laden, and 19 hijackers, most of them also Saudis, from a tiny group called al-Qaeda that briefly possessed an “air force” of four commercial jets. No wonder this country was then touted as the greatest force, the superest superpower ever, sporting a military that left all others in the dust.

And then, of course, came the launching of the Global War on Terror, which soon would be normalized as the plain-old, uncapitalized “war on terror.” Yes, that very war — even if nobody’s called it that for years — began on September 11, 2001. At a Pentagon partially in ruins, Secretary of Defense Donald Rumsfeld, already aware that the destruction around him was probably Osama bin Laden’s responsibility, ordered his aides to begin planning for a retaliatory strike against… Saddam Hussein’s Iraq. Rumsfeld’s exact words (an aide wrote them down) were: “Go massive. Sweep it all up. Things related and not.”

Things related and not. Sit with that phrase for a moment. In their own strange way, those four words, uttered in the initial hours after the destruction of New York’s World Trade Center and part of the Pentagon, still seem to capture the twenty-first-century American experience.

Within days of 9/11, Rumsfeld, who served four presidents before recently stepping off this world at 88, and the president he then worked for, George W. Bush, would officially launch that Global War on Terror. They would ambitiously target supposed terror networks in no less than 60 countries. (Yep, that was Rumsfeld’s number!) They would invade Afghanistan and, less than a year and a half later, do the same on a far grander scale in Iraq to take down its autocratic ruler, Saddam Hussein, who had once been a hand-shaking buddy of the secretary of defense.

Despite rumors passed around at the time by supporters of such an invasion, Saddam had nothing to do with 9/11; nor, despite Bush administration claims, was his regime then developing or in possession of weapons of mass destruction; nor, if we didn’t act, would an Iraqi mushroom cloud have one day risen over New York or some other American city. And mind you, both of those invasions and so much more would be done in the name of “liberating” peoples and spreading American-style democracy across the Greater Middle East. Or, put another way, in response to that devastating attack by those 19 hijackers armed with knives, the U.S. was preparing to invade and dominate the oil-rich Middle East until the end of time. In 2021, almost two decades later, doesn’t that seem like another lifetime to you?

By the way, you’ll note that there’s one word missing in action in all of the above. Believe me, if what I just described had related to Soviet plans during the Cold War, you can bet your bottom dollar that word would have been all over Washington. I’m thinking, of course, of “empire” or, in its adjectival form, “imperial.” Had the Soviet Union planned similar acts to “liberate” peoples by “spreading communism,” it would have been seen in Washington as the most imperial project ever. In the early years of this century, however, with the Soviet Union long gone and America’s leaders imagining that they might reign supreme globally until the end of time, those two words were banished to history.

It was obvious that, despite the unprecedented 800 or so military bases this country possessed around the world, imperial powers were distinctly a thing of the past.

“Empires have gone there and not done it”

Now, keep that thought in abeyance for a moment, while I take you on a quick tour of the long-forgotten Global War on Terror. Almost two decades later, it does seem to be drawing to some kind of lingering close. Yes, there are still those 650 American troops guarding our embassy in the Afghan capital, Kabul, and there is still that “over-the-horizon capacity” the president cites for U.S. aircraft to strike Taliban forces, even if American troops only recently abandoned their last air base in Afghanistan; and yes, there are still about 2,500 American troops stationed in Iraq (and hundreds more at bases across the border in Syria), regularly being attacked by Iraqi militia groups.

Similarly, despite the withdrawal of U.S. forces from Somalia as the Trump years ended, over-the-horizon airstrikes against the terror group al-Shabaab, halted when Joe Biden entered the Oval Office, have just been started again, presumably from bases in Kenya or Djibouti; and yes, the horrendous war in Yemen continues with the U.S. still supporting the Saudis, even if by offering “defensive,” not “offensive” aid; and yes, American special operators are also stationed in a staggering number of countries around the globe; and yes, prisoners are still being held in Guantanamo, that offshore Bermuda Triangle of injustice created by the Bush administration so long ago. Admittedly, officials in the new Biden Justice Department are at least debating, however indecisively, whether those detainees might have any due process rights under the Constitution (yes, that’s the U.S. Constitution!), and their number now stands at 39, the lowest since 2002.

Still, let’s face it, this isn’t the set of conflicts that, once upon a time, involved invasions, massive air strikes, occupations, the killing of staggering numbers of civilians, widespread drone attacks, the disruption of whole countries, the uprooting and displacement of more than 37 million people, the deployment at one point of 100,000 U.S. troops in Afghanistan alone, and the spending of untold trillions of American taxpayer dollars, all in the name of fighting terror and spreading democracy. And think of it as mission (un)accomplished in the truest sense imaginable.

In fact, that idea of spreading democracy didn’t really outlast the Bush years. Ever since, there’s been remarkably little discussion in official Washington about what this country was really doing as it warred across significant parts of the planet. Yes, those two decades of conflict, those “forever wars,” as they came to be called first by critics and then by anyone in sight, are at least winding, or perhaps spiraling, down — and yet, here’s the strange thing: Wouldn’t you think that, as they ended in visible failure, the Pentagon’s stock might also be falling? Oddly enough, though, in the wake of all those years of losing wars, it’s still rising. The Pentagon budget only heads ever more for the stratosphere as foreign policy “pivots” from the Greater Middle East to Asia (and Russia and the Arctic and, well, anywhere but those places where terror groups still roam).

In other words, when it comes to the U.S. military as it tries to leave its forever wars in someone else’s ditch, failure is the new success story. Perhaps not so surprisingly, then, the losing generals who fought those wars, while eternally promising that “corners” were being turned and “progress” made, have almost all either continued to rise in the ranks or gotten golden parachutes into other parts of the military-industrial complex. That should shock Americans, but really never seems to. Yes, striking percentages of us support leaving Afghanistan and the Afghans in a ditch somewhere and moving on, but it’s still generally a big “thank you for your service” to our military commanders and the Pentagon.

Looking back, however, isn’t the real question — not that anyone’s asking — this: What was America’s mission during all those years? In reality, I don’t think it’s possible to answer that or explain any of it without using the forbidden noun and adjective I mentioned earlier. And, to my surprise, after all these years when it never crossed the lips of an American president, Joe Biden, the guy who’s been insisting that “America is back” on this failing planet of ours, actually used that very word!

In a recent news conference, irritated to find himself endlessly discussing his decision to pull U.S. forces out of Afghanistan, he fielded this question from a reporter: “Given the amount of money that has been spent and the number of lives that have been lost, in your view, with making this decision, were the last 20 years worth it?”

His response: “I argued, from the beginning [in the Obama years], as you may recall — it came to light after the administration was over… No nation has ever unified Afghanistan, no nation. Empires have gone there and not done it.”

So, there! Yes, it was vague and could simply have been a reference to the fate in Afghanistan, that famed “graveyard of empires,” of the British empire in the nineteenth century and the Soviet one in the twentieth century. But I can’t help thinking that a president, however minimally, however indirectly, however much without even meaning to, finally acknowledged that this country, too, was on an imperial mission there and globally as well, a mission not of spreading democracy or of liberation but of domination. Otherwise, how the hell do you explain those 800 military bases on every continent but Antarctica? Is that really spreading democracy? Is that really liberating humanity? It’s not a subject discussed in this country, but believe me, if it were any other place, the words “empire” and “imperial” would be on all too many lips in Washington and the urge to dominate in such a fashion would have been roundly denounced in our nation’s capital.

A failing empire with a flailing military?

Here’s a question for you: If the U.S. is “back,” as our president has been claiming, what exactly is it back as? What could it be, now that it’s proven itself incapable of dominating the planet in the fashion its political leaders once dreamed of? Could this country, which in these years dumped trillions of taxpayer dollars into its forever wars, now perhaps be reclassified as a failing empire with a flailing military?

Of course, such a possibility isn’t generally acknowledged here. If, for instance, Kabul falls to the Taliban months from now and U.S. diplomats need to be rescued from the roof of our embassy there, as happened in Saigon in 1975 — something the president has vehemently denied is even possible — count on one thing: a bunch of Republicans and right-wing pundits will instantly be down his throat for leaving “too fast.” (Of course, some of them already are, including, as it happens, the very president who launched the 2001 invasion, only to almost instantly refocus his attention on invading Iraq.)

Even domestically, when you think about where our money truly goes, inequality of every sort is only growing more profound, with America’s billionaires ever wealthier and more numerous, while the Pentagon and those weapons-making corporations float ever higher on taxpayer dollars, and the bills elsewhere go unpaid. In that sense, perhaps it’s time to start thinking about the United States as a failing imperial system at home as well as abroad. Sadly, whether globally or domestically, all of this seems hard for Americans to take in or truly describe (hence, perhaps, the madness of Donald Trump’s America). After all, if you can’t even use the words “imperial” and “empire,” then how are you going to understand what’s happening to you?

Still, forget any fantasies about us spreading democracy abroad. We’re now in a country that’s visibly threatening to lose democracy at home. Forget Afghanistan. From the January 6th assault on the Capitol to the latest (anti-)voting laws in Texas and elsewhere, there’s a flailing, failing system right here in the U.S. of A. And unlike Afghanistan, it’s not one that a president can withdraw from.

Yes, globally, the Biden administration has seemed remarkably eager to enter a new Cold War with China and “pivot” to Asia, as the Pentagon continues to build up its forces, from naval to nuclear, as if this country were indeed still the reigning imperial power on the planet. But it’s not.

The real question may be this: Three decades after the Soviet empire headed for the exit, is it possible that the far more powerful American one is ever so chaotically heading in the same direction? And if so, what does that mean for the rest of us?

Tom Engelhardt created and runs the website TomDispatch.com — where this article originated. He is also a co-founder of the American Empire Project and the author of a highly praised history of American triumphalism in the Cold War, The End of Victory Culture.  A fellow of the Type Media Center, his sixth and latest book is A Nation Unmade by War.

Copyright ©2021 Tom Engelhardt — distributed by Agence Global

—————-

Released: 27 July 2021

Word Count: 2,075

—————-
