Agence Global


Rami G. Khouri, “Lebanon’s political crisis is a crisis of wider Arab statehood”

July 22, 2021 - Rami G. Khouri

Those who plunged the latest dagger into Lebanon’s reeling body are supposed to be the people leading it out of collapse and reviving its people’s vibrant role in the Arab region.

President Michel Aoun did not agree to Prime Minister-designate Saad Hariri’s proposed cabinet, and Hariri resigned within minutes, ending a nine-month-long drama. This immediately triggered a drop both in the value of the faltering national currency and in people’s hopes for an end to their daily misery in all walks of life.

The Lebanese people are collectively holding their breath yet again, as they anticipate another drawn-out political crisis among the half a dozen leaders of the main political parties whose collective absolute rule has devastated the country in recent years. But these leaders appear determined to continue their selfish game of holding on to political power at all costs.

This cycle of political discord among self-serving sectarian leaders has intensified since the current crisis began two years ago. But political stalemates like the Hariri-Aoun butting-of-heads that bring governance to a halt have occurred regularly in recent decades.

The slow collapse of governance, the economy, and modern life as we know it across Lebanon — especially in big cities where most people live — signals that what we witness today is not just a political crisis between two ideological actors.

Rather, it reflects a deeper crisis of statehood that is not only tragic for Lebanon but also plagues other Arab countries in similar ways. It is time to acknowledge the structural faults in the system of Lebanese statehood and others in the region that have brought us to this low point.

The costs of the crisis have become clear to every Lebanese household — other than the clients, business partners, guards, and employees of the ruling oligarchic elite. Alongside Sunni leader Hariri and Christian Maronite leader Aoun, this elite includes House Speaker Nabih Berri, Hezbollah leader Hassan Nasrallah, Druze leader Walid Jumblatt, and a few less powerful men who nevertheless play the deadly Lebanese political game with the same determination and catastrophic results.

They are all men, many of them are ageing, most of them inherited their positions from family or comrades, and all of them have given the Arab region its most spectacular example of how to run a once decent state into the ground and plunge its five million inhabitants into instant poverty and despair.

Daily reports from Lebanon depict how families suffer at every turn — electric power has almost vanished, meaning air conditioning, internet, refrigerators and elevators work only sporadically; gasoline is difficult to find and more expensive every week; food prices rise steadily while the value of the lira declines in tandem; essential medicines for infants or the elderly are almost impossible to find; clean water is supplied erratically; the banks with people’s life savings are forbidden territory.

Even when a cash withdrawal is possible, the exchange rate set by the Central Bank means a depositor actually gets about 20 percent of the value of their original deposit. The education system is mostly in freefall, and decent new jobs do not exist.

More and more essential businesses will only accept cash dollars, which are beyond the reach of most ordinary Lebanese. Many increasingly survive by resorting to communal kitchens, charity handouts, borrowing, growing their own food in their ancestral mountain villages, or engaging in barter economy activities.

Those who can emigrate do so as fast as possible, but most cannot. The result is millions of angry, frustrated, fearful, and helpless Lebanese and foreign refugee families that feel so vulnerable and degraded that they find it hard to articulate their pain in words. Many have been stunned into a state of dehumanization, feeling that their own political and national leaders have treated them like animals.

What makes this extreme situation most dramatic is that it is not the consequence of war, but rather the result of the ruling elite’s sustained mismanagement, corruption, and disdain for the wellbeing and rights of fellow citizens.

The current crisis, as last week’s Hariri-Aoun show reconfirmed, reflects the convergence of several separate crises (political, economic, fiscal, banking, energy, environmental), all of which are due to poor or absent decision-making by the ruling elite that has controlled the state since the end of the civil war in 1990.

The truth, though, is that this elite has controlled the state for much longer than that, in fact for most of the past century of statehood. The current collapse does not only reflect the ruling elite’s selfish incompetence; it also reveals the unsustainable structures of Lebanese sectarian statehood itself.

The timeline of an entire century since 1920 is important to keep in mind, for it reveals several threads that contribute to the weakness and slow implosion of the Lebanese state and economy.

Many contributing factors can be traced back to four dynamics that have all played out over the past century: 1) the delayed consequences of the European colonial decisions around 1920 that manufactured many Arab states; 2) the consequences of the Arab-Israeli conflict (also a century old); 3) the lack of genuine citizen participation in political decision-making or accountability in Arab states; and 4) the non-stop interference in Arab countries by neighbouring or foreign powers, which makes Arab state sovereignty a common fiction.

Over the last 100 years, these four dynamics have brought us to the point where Lebanon, Syria, Iraq, Palestine, Yemen, and Libya, to mention only the most obvious, experience severe national distress that ultimately brings a state to its knees and citizens to despair or emigration.

Across the Arab region, a common picture emerges that now also plagues Lebanon: a majority of citizens are poor, vulnerable, and politically helpless, and their governments and state institutions increasingly hold citizen anger and rebellion in check by using military and security measures above all else.

Lebanon was born in the regional tumult of Arab independent statehood after 1920; it is imploding today within the continuing pressures of its own and other nearby Arab lands’ dysfunctional statehood, usually due to the same quartet of causes that date back a full century.

Lebanon reminds us that stable, democratic, productive, and genuinely sovereign Arab states remain an elusive goal, as we enter the second century of statehood and citizenship.

Rami G. Khouri is Director of Global Engagement and senior public policy fellow at the American University of Beirut, and a non-resident senior fellow at the Harvard Kennedy School’s Middle East Initiative. Follow him on Twitter: @ramikhouri. (This article originated at The New Arab.)

Copyright ©2021 Rami G. Khouri — distributed by Agence Global

—————-

Released: 22 July 2021

Word Count: 1,035

—————-

John Feffer, “Artificial intelligence wants you (and your job)”

July 22, 2021 - TomDispatch

My wife and I were recently driving in Virginia, amazed yet again that the GPS technology on our phones could guide us through a thicket of highways, around road accidents, and toward our precise destination. The artificial intelligence (AI) behind the soothing voice telling us where to turn has replaced passenger-seat navigators, maps, even traffic updates on the radio. How on earth did we survive before this technology arrived in our lives? We survived, of course, but were quite literally lost some of the time.

My reverie was interrupted by a toll booth. It was empty, as were all the other booths at this particular toll plaza. Most cars zipped through with E-Z passes, as one automated device seamlessly communicated with another. Unfortunately, our rental car didn’t have one.

So I prepared to pay by credit card, but the booth lacked a credit-card reader.

Okay, I thought, as I pulled out my wallet, I’ll use cash to cover the $3.25.

As it happened, that booth took only coins and who drives around with 13 quarters in his or her pocket?

I would have liked to ask someone that very question, but I was, of course, surrounded by mute machines. So, I simply drove through the electronic stile, preparing myself for the bill that would arrive in the mail once that plaza’s automated system photographed and traced our license plate.

In a thoroughly mundane fashion, I’d just experienced the age-old conflict between the limiting and liberating sides of technology. The arrowhead that can get you food for dinner might ultimately end up lodged in your own skull. The car that transports you to a beachside holiday contributes to the rising tides — by way of carbon emissions and elevated temperatures — that may someday wash away that very coastal gem of a place. The laptop computer that plugs you into the cyberworld also serves as the conduit through which hackers can steal your identity and zero out your bank account.

In the previous century, technology reached a true watershed moment when humans, harnessing the power of the atom, also acquired the capacity to destroy the entire planet. Now, thanks to AI, technology is hurtling us toward a new inflection point.

Science-fiction writers and technologists have long worried about a future in which robots, achieving sentience, take over the planet. The creation of a machine with human-like intelligence that could someday fool us into believing it’s one of us has often been described, with no small measure of trepidation, as the “singularity.” Respectable scientists like Stephen Hawking have argued that such a singularity will, in fact, mark the “end of the human race.”

This will not be some impossibly remote event like the sun blowing up in a supernova several billion years from now. According to one poll, AI researchers reckon that there’s at least a 50-50 chance that the singularity will occur by 2050. In other words, if pessimists like Hawking are right, it’s odds on that robots will dispatch humanity before the climate crisis does.

Neither the artificial intelligence that powers GPS nor the kind that controlled that frustrating toll plaza has yet attained anything like human-level intelligence — not even close. But in many ways, such dumb robots are already taking over the world. Automation is currently displacing millions of workers, including those former tollbooth operators. “Smart” machines like unmanned aerial vehicles have become an indispensable part of waging war. AI systems are increasingly being deployed to monitor our every move on the Internet, through our phones, and whenever we venture into public space. Algorithms are replacing teaching assistants in the classroom and influencing sentencing in courtrooms. Some of the loneliest among us have already become dependent on robot pets.

As AI capabilities continue to improve, the inescapable political question will become: to what extent can such technologies be curbed and regulated? Yes, the nuclear genie is out of the bottle as are other technologies — biological and chemical — capable of causing mass destruction of a kind previously unimaginable on this planet. With AI, however, that day of singularity is still in the future, even if a rapidly approaching one. It should still be possible, at least theoretically, to control such an outcome before there’s nothing to do but play the whack-a-mole game of non-proliferation after the fact.

As long as humans continue to behave badly on a global scale — war, genocide, planet-threatening carbon emissions — it’s difficult to imagine that anything we create, however intelligent, will act differently. And yet we continue to dream that some deus in machina, a god in the machine, could appear as if by magic to save us from ourselves.

Taming AI?

In the early 1940s, science fiction writer Isaac Asimov formulated his famed three laws of robotics: that robots were not to harm humans, directly or indirectly; that they must obey our commands (unless doing so violates the first law); and that they must safeguard their own existence (unless self-preservation contravenes the first two laws).

Any number of writers have attempted to update Asimov. The latest is legal scholar Frank Pasquale, who has devised four laws to replace Asimov’s three. Since he’s a lawyer, not a futurist, Pasquale is more concerned with controlling the robots of today than hypothesizing about the machines of tomorrow. He argues that robots and AI should help professionals, not replace them; that they should not counterfeit humans; that they should never become part of any kind of arms race; and that their creators, controllers, and owners should always be transparent.

Pasquale’s “laws,” however, run counter to the artificial-intelligence trends of our moment. The prevailing AI ethos mirrors what could be considered the prime directive of Silicon Valley: move fast and break things. This philosophy of disruption demands, above all, that technology continuously drive down labor costs and regularly render itself obsolescent.

In the global economy, AI indeed helps certain professionals — like Facebook’s Mark Zuckerberg and Amazon’s Jeff Bezos, who just happen to be among the richest people on the planet — but it’s also replacing millions of us. In the military sphere, automation is driving boots off the ground and eyes into the sky in a coming robotic world of war. And whether it’s Siri, the bots that guide increasingly frustrated callers through automated phone trees, or the AI that checks out Facebook posts, the aim has been to counterfeit human beings — “machines like me,” as Ian McEwan called them in his 2019 novel of that title — while concealing the strings that connect the creation to its creator.

Pasquale wants to apply the brakes on a train that has not only left the station but is no longer under the control of the engine driver. It’s not difficult to imagine where such a runaway phenomenon could end up, and techno-pessimists have taken a perverse delight in describing the resulting cataclysm. In his book Superintelligence, for instance, Nick Bostrom writes about a sandstorm of self-replicating nanorobots that chokes every living thing on the planet — the so-called grey goo problem — and an AI that seizes power by “hijacking political processes.”

Since they would be interested only in self-preservation and replication, not protecting humanity or following its orders, such sentient machines would clearly tear up Asimov’s rulebook. Futurists have leapt into the breach. For instance, Ray Kurzweil, who predicted in his 2005 book The Singularity Is Near that a robot would attain sentience by about 2045, has proposed a “ban on self-replicating physical entities that contain their own codes for self-replication.” Elon Musk, another billionaire industrialist who’s no enemy of innovation, has called AI humanity’s “biggest existential threat” and has come out in favor of a ban on future killer robots.

To prevent the various worst-case scenarios, the European Union has proposed to control AI according to degree of risk. Some products that fall in the EU’s “high risk” category would have to get a kind of Good Housekeeping seal of approval (the Conformité Européenne). AI systems “considered a clear threat to the safety, livelihoods, and rights of people,” on the other hand, would be subject to an outright ban. Such clear-and-present dangers would include, for instance, biometric identification that captures personal data by such means as facial recognition, as well as versions of China’s social credit system where AI helps track individuals and evaluate their overall trustworthiness.

Techno-optimists have predictably lambasted what they consider European overreach. Such controls on AI, they believe, will put a damper on R&D and, if the United States follows suit, allow China to secure an insuperable technological edge in the field. “If the member states of the EU — and their allies across the Atlantic — are serious about competing with China and retaining their power status (as well as the quality of life they provide to their citizens),” writes entrepreneur Sid Mohasseb in Newsweek, “they need to call for a redraft of these regulations, with growth and competition being seen as at least as important as regulation and safety.”

Mohasseb’s concerns are, however, misleading. The regulators he fears so much are, in fact, now playing a game of catch-up. In the economy and on the battlefield, to take just two spheres of human activity, AI has already become indispensable.

The automation of globalization

The ongoing Covid-19 pandemic has exposed the fragility of global supply chains. The world economy nearly ground to a halt in 2020 for one major reason: the health of human workers. The spread of infection, the risk of contagion, and the efforts to contain the pandemic all removed workers from the labor force, sometimes temporarily, sometimes permanently. Factories shut down, gaps widened in transportation networks, and shops lost business to online sellers.

A desire to cut labor costs, a major contributor to a product’s price tag, has driven corporations to look for cheaper workers overseas. For such cost-cutters, eliminating workers altogether is an even more beguiling prospect. Well before the pandemic hit, corporations had begun to turn to automation. By 2030, up to 45 million U.S. workers will be displaced by robots. The World Bank estimates that they will eventually replace an astounding 85% of the jobs in Ethiopia, 77% in China, and 72% in Thailand.

The pandemic not only accelerated this trend, but increased economic inequality as well because, at least for now, robots tend to replace the least skilled workers. In a survey conducted by the World Economic Forum, 43% of businesses indicated that they would reduce their workforces through the increased use of technology. “Since the pandemic hit,” reports NBC News,

“food manufacturers ramped up their automation, allowing facilities to maintain output while social distancing. Factories digitized controls on their machines so they could be remotely operated by workers working from home or another location. New sensors were installed that can flag, or predict, failures, allowing teams of inspectors operating on a schedule to be reduced to an as-needed maintenance crew.”

In an ideal world, robots and AI would increasingly take on all the dirty, dangerous, and demeaning jobs globally, freeing humans to do more interesting work. In the real world, however, automation is often making jobs dirtier and more dangerous by, for instance, speeding up the work done by the remaining human labor force. Meanwhile, robots are beginning to encroach on what’s usually thought of as the more interesting kinds of work done by, for example, architects and product designers.

In some cases, AI has even replaced managers. A contract driver for Amazon, Stephen Normandin, discovered that the AI system that monitored his efficiency as a deliveryman also used an automated email to fire him when it decided he wasn’t up to snuff. Jeff Bezos may be stepping down as chief executive of Amazon, but robots are quickly climbing its corporate ladder and could prove at least as ruthless as he’s been, if not more so.

Mobilizing against such a robot replacement army could prove particularly difficult as corporate executives aren’t the only ones putting out the welcome mat. Since fully automated manufacturing in “dark factories” doesn’t require lighting, heating, or a workforce that commutes to the site by car, that kind of production can reduce a country’s carbon footprint — a potentially enticing factor for “green growth” advocates and politicians desperate to meet their Paris climate targets.

It’s possible that sentient robots won’t need to devise ingenious stratagems for taking over the world. Humans may prove all too willing to give semi-intelligent machines the keys to the kingdom.

The new fog of war

The 2020 war between Armenia and Azerbaijan proved to be unlike any previous military conflict. The two countries had been fighting since the 1980s over a disputed mountain enclave, Nagorno-Karabakh. Following the collapse of the Soviet Union, Armenia proved the clear victor in the conflict that followed in the early 1990s, occupying not only the disputed territory but parts of Azerbaijan as well.

In September 2020, as tensions mounted between the two countries, Armenia was prepared to defend those occupied territories with a well-equipped army of tanks and artillery. Thanks to its fossil-fuel exports, Azerbaijan, however, had been spending considerably more than Armenia on the most modern version of military preparedness. Still, Armenian leaders often touted their army as the best in the region. Indeed, according to the 2020 Global Militarization Index, that country was second only to Israel in terms of its level of militarization.

Yet Azerbaijan was the decisive winner in the 2020 conflict, retaking possession of Nagorno-Karabakh. The reason: automation.

“Azerbaijan used its drone fleet — purchased from Israel and Turkey — to stalk and destroy Armenia’s weapons systems in Nagorno-Karabakh, shattering its defenses and enabling a swift advance,” reported the Washington Post’s Robyn Dixon. “Armenia found that air defense systems in Nagorno-Karabakh, many of them older Soviet systems, were impossible to defend against drone attacks, and losses quickly piled up.”

Armenian soldiers, notorious for their fierceness, were spooked by the semi-autonomous weapons regularly hovering above them. “The soldiers on the ground knew they could be hit by a drone circling overhead at any time,” noted Mark Sullivan in the business magazine Fast Company. “The drones are so quiet they wouldn’t hear the whir of the propellers until it was too late. And even if the Armenians did manage to shoot down one of the drones, what had they really accomplished? They’d merely destroyed a piece of machinery that would be replaced.”

The United States pioneered the use of drones against various non-state adversaries in its war on terror in Afghanistan, Iraq, Pakistan, Somalia, and elsewhere across the Greater Middle East and Africa. But in its 2020 campaign, Azerbaijan was using the technology to defeat a modern army. Now, every military will feel compelled not only to integrate increasingly more powerful AI into its offensive capabilities, but also to defend against the new technology.

To stay ahead of the field, the United States is predictably pouring money into the latest technologies. The new Pentagon budget includes the “largest ever” request for R&D, with a down payment of nearly a billion dollars for AI. As TomDispatch regular Michael Klare has written, the Pentagon has even taken a cue from the business world by beginning to replace its war managers — generals — with a huge, interlinked network of automated systems known as the Joint All-Domain Command-and-Control (JADC2).

The result of any such handover of greater responsibility to machines will be the creation of what mathematician Cathy O’Neil calls “weapons of math destruction.” In the global economy, AI is already replacing humans up and down the chain of production. In the world of war, AI could in the end annihilate people altogether, whether thanks to human design or computer error.

After all, during the Cold War, only last-minute interventions by individuals on both sides ensured that nuclear “missile attacks” detected by Soviet and American computers — which turned out to be birds, unusual weather, or computer glitches — didn’t precipitate an all-out nuclear war. Take the human being out of the chain of command and machines could carry out such a genocide all by themselves.

And the fault, dear reader, would lie not in our robots but in ourselves.

Robots of last resort

In my new novel Songlands, humanity faces a terrible set of choices in 2052. Having failed to control carbon emissions for several decades, the world is at the point of no return, too late for conventional policy fixes. The only thing left is a scientific Hail Mary pass, an experiment in geoengineering that could fail or, worse, have terrible unintended consequences. The AI responsible for ensuring the success of the experiment may or may not be trustworthy. My dystopia, like so many others, is really about a narrowing of options and a whittling away of hope, which is our current trajectory.

And yet, we still have choices. We could radically shift toward clean energy and marshal resources for the whole world, not just its wealthier portions, to make the leap together. We could impose sensible regulations on artificial intelligence. We could debate the details of such programs in democratic societies and in participatory multilateral venues.

Or, throwing up our hands because of our unbridgeable political differences, we could wait for a post-Trumpian savior to bail us out. Techno-optimists hold out hope that automation will set us free and save the planet. Laissez-faire enthusiasts continue to believe that the invisible hand of the market will mysteriously direct capital toward planet-saving innovations instead of SUVs and plastic trinkets.

These are illusions. As I write in Songlands, we have always hoped for someone or something to save us: “God, a dictator, technology. For better or worse, the only answer to our cries for help is an echo.”

In the end, robots won’t save us. That’s one piece of work that can’t be outsourced or automated. It’s a job that only we ourselves can do.

John Feffer writes regularly for TomDispatch (where this article originated). He is the author of the dystopian novel Splinterlands and the director of Foreign Policy In Focus at the Institute for Policy Studies. Frostlands, a Dispatch Books original, is volume two of his Splinterlands series and the final novel in the trilogy, Songlands, has just been published. He has also written The Pandemic Pivot.

Copyright ©2021 John Feffer — distributed by Agence Global

—————-

Released: 22 July 2021

Word Count: 2,956

—————-

Robert Rudney, “Making nuclear weapons obsolete”

July 21, 2021 - The-Washington-Spectator

It’s high time to declare nuclear weapons obsolete.

The U.N. Treaty on the Prohibition of Nuclear Weapons, which entered into force on January 22, 2021, underscores the most perilous environmental threat to humankind, a threat that cannot be ignored by the Biden administration.

Eighty-six nations, not including the United States and other nuclear weapons states, have signed the treaty, which provides a diplomatic groundwork for banning the possession and use of these weapons. Yet militarily, the United States is moving rapidly in the opposite direction.

Biden inherits the Trump administration’s 2018 Nuclear Posture Review, which showcases a costly, across-the-board nuclear weapons modernization highlighted by a perilous war-fighting capability. This destabilizing strategy, explicitly laid out in the NPR, once more raises the life-or-death issues of the utility of nuclear weapons and the horrific dangers that the United States risks in concentrating its defense around the nuclear deterrence option.

The modernization package, with a price tag the Congressional Budget Office estimates at no less than $1.2 trillion over 30 years, seeks extensive upgrades to the lethal triad of ground-based, submarine-based, and bomber-based systems. This entire conceptual house of cards rests on the specious assumption that if deterrence fails, the United States will be able to achieve its political objectives (read “fight a nuclear war”).

In addition, the NPR proposes deployment of new tactical (dubbed “nonstrategic”) nuclear weapons whose deterrent rationale is shaky and whose war-fighting capabilities make them inherently destabilizing.

The proposed nuclear buildup offers a critical opportunity for the Biden administration to rethink national defense policy and cure this nuclear addiction. The extension of the New START Treaty simply maintains a ceiling on U.S. and Russian arsenals.

This paradigm shift cannot be achieved overnight. Both President Reagan at Reykjavik in 1986 and President Obama at Prague in 2009 emphasized that the goal of nuclear arms negotiations should be the elimination of all nuclear weapons, but both admitted that this was a distant goal.

However, the reality today is that existing U.S. strategic nuclear weapons systems provide sufficient deterrence well past the year 2040. The last OHIO-class ballistic missile submarine is scheduled to retire in 2042. The Minuteman III intercontinental ballistic missile can be extended past 2030, while the B-52H bomber, armed with cruise missiles, can be deployed into the 2040s. While the safety and security of these systems should be maintained, the modernization program is mindless and destabilizing.

In its place, the United States now has the option of adopting a declarative policy of nuclear weapons obsolescence. As weapons reach operational obsolescence, they can be taken out of the inventory, dismantled, and destroyed. Concurrently, over the next 20 years, the United States can invite Russia, China, and other nuclear weapons states to negotiate on a modernization freeze and build-down that can be effectively verified and enforced.

The flip side of the coin is that, if this process does not achieve comparable reductions in nuclear forces by these other states over a specified period of years, the United States will reluctantly take steps once more to assure a sufficient nuclear force.

Such a wholesale transformation of strategic thinking will attract critics. Much like their Cold War predecessors who inflated Soviet aggressive intentions, NPR apologists emphasize emerging Russian and Chinese menaces and their own nuclear modernization programs, but it is unimaginable that Vladimir Putin or Xi Jinping would risk annihilation of their homeland and destruction of their regime by engaging in nuclear saber-rattling. Even Kim Jong Un has an existential appreciation of the present U.S. nuclear force.

Yet nuclear weapons did not deter the 9/11 attacks or the anthrax attacks on the U.S. Capitol. Any contention that the U.S. nuclear weapons arsenal prevents proliferation and terrorism is a fallacious argument with no empirical grounding. The theory of extended nuclear deterrence only operated in a bipolar, Cold War environment, and ever since the near-disaster of the Cuban missile crisis, the value of nuclear coercion by the United States has been more than offset by the inherent risks.

The United States can move away from the NPR’s “other-directed” nuclear planning fixation, where we strive to match our potential adversaries system by system. At this point, Americans should cease obsessing over the Cold War riddle of “How much is enough?” and affirm, “Enough is enough.” Mutual assured destruction no longer has rhyme or reason.

The NPR states that “if deterrence fails, the United States will strive to end any conflict at the lowest level of damage possible.” This commitment to postnuclear damage limitation is absurd. Keeping a nuclear conflict limited to the lowest possible level flies in the face of military history. To paraphrase Talleyrand, “You can do anything you like with nuclear weapons except fight a war with them.”

Dr. Robert Rudney is a retired senior adviser in the Department of the Air Force. He was also chief consultant to the ABA Task Force on the Nonproliferation of Weapons of Mass Destruction and a fellow in Senator Bernie Sanders’s office, working on defense issues. As a strategic analyst at the National Institute for Public Policy (1988–1999), he authored studies on the deterrence value of the multi-warhead MX Peacekeeper intercontinental ballistic missile and other nuclear weapons systems.

Copyright ©2021 The Washington Spectator — distributed by Agence Global

—————-

Released: 21 July 2021

Word Count: 780

—————-

Aviva Chomsky, “Migration is not the crisis: what Washington could do in Central America”

July 19, 2021 - TomDispatch

Earlier this month, a Honduran court found David Castillo, a U.S.-trained former Army intelligence officer and the head of an internationally financed hydroelectric company, guilty of the 2016 murder of celebrated Indigenous activist Berta Cáceres. His company was building a dam that threatened the traditional lands and water sources of the Indigenous Lenca people. For years, Cáceres and her organization, the Council of Popular and Indigenous Organizations of Honduras, or COPINH, had led the struggle to halt that project. It turned out, however, that Cáceres’s international recognition — she won the prestigious Goldman Environmental Prize in 2015 — couldn’t protect her from becoming one of the dozens of Latin American Indigenous and environmental activists killed annually.

Yet when President Joe Biden came into office with an ambitious “Plan for Security and Prosperity in Central America,” he wasn’t talking about changing policies that promoted big development projects against the will of local inhabitants. Rather, he was focused on a very different goal: stopping migration. His plan, he claimed, would address its “root causes.” Vice President Kamala Harris was even blunter when she visited Guatemala, instructing potential migrants: “Do not come.”

As it happens, more military and private development aid of the sort Biden’s plan calls for (and Harris boasted about) will neither stop migration nor help Central America. It’s destined, however, to spark yet more crimes like Cáceres’s murder. There are other things the United States could do that would aid Central America. The first might simply be to stop talking about trying to end migration.

How can the United States help Central America?

Biden and Harris are only recycling policy prescriptions that have been around for decades: promote foreign investment in Central America’s export economy, while building up militarized “security” in the region. In truth, it’s the very economic model the United States has imposed there since the nineteenth century, which has brought neither security nor prosperity to the region (though it’s brought both to U.S. investors there). It’s also the model that has displaced millions of Central Americans from their homes and so is the fundamental cause of what, in this country, is so often referred to as the “crisis” of immigration.

In the nineteenth and early twentieth centuries, the U.S. began imposing that very model to overcome what officials regularly described as Central American “savagery” and “banditry.” The pattern continued as Washington found a new enemy, communism, to battle there in the second half of the last century. Now, Biden promises that the very same policies — foreign investment and eternal support for the export economy — will end migration by attacking its “root causes”: poverty, violence, and corruption. (Or call them “savagery” and “banditry,” if you will.) It’s true that Central America is indeed plagued by poverty, violence, and corruption, but if Biden were willing to look at the root causes of his root causes, he might notice that his aren’t the solutions to such problems, but their source.

Stopping migration from Central America is no more a legitimate policy goal than was stopping savagery, banditry, or communism in the twentieth century. In fact, what Washington policymakers called savagery (Indigenous people living autonomously on their lands), banditry (the poor trying to recover what the rich had stolen from them), and communism (land reform and support for the rights of oppressed workers and peasants) were actually potential solutions to the very poverty, violence, and corruption imposed by the US-backed ruling elites in the region. And maybe migration is likewise part of Central Americans’ struggle to solve these problems. After all, migrants working in this country send back more money in remittances to their families in Central America than the United States has ever given in foreign aid.

What, then, would a constructive U.S. policy towards Central America look like?

Perhaps the most fundamental baseline of foreign policy should be that classic summary of the Hippocratic Oath: do no harm. As for doing some good, before the subject can even be discussed, there needs to be an acknowledgement that so much of what we’ve done to Central America over the past 200 years has been nothing but harm.

The United States could begin by assuming historical responsibility for the disasters it’s created there. After the counterinsurgency wars of the 1980s, the United Nations sponsored truth commissions in El Salvador and Guatemala to uncover the crimes committed against civilian populations there. Unfortunately, those commissions didn’t investigate Washington’s role in funding and promoting war crimes in the region.

Maybe what’s now needed is a new truth commission to investigate historic U.S. crimes in Central America. In reality, the United States owes those small, poor, violent, and corrupt countries reparations for the damages it’s caused over all these years. Such an investigation might begin with Washington’s long history of sponsoring coups, military “aid,” armed interventions, massacres, assassinations, and genocide.

The U.S. would have to focus as well on the impacts of ongoing economic aid since the 1980s, aimed at helping U.S. corporations at the expense of the Central American poor. It could similarly examine the role of debt and the U.S.-Central America Free Trade Agreement in fostering corporate and elite interests. Nor should it forget the way the outsized U.S. contribution to greenhouse gas emissions — this country is, of course, the largest such emitter in history — and the resulting climate change have contributed to the destruction of livelihoods in Central America. Finally, it could investigate how our border and immigration policies directly contribute to keeping Central America poor, violent, and corrupt, in the name of stopping migration.

Constructive options for U.S. policy in Central America

Providing vaccines: Even as Washington rethinks the fundamentals of this country’s policies there, it could take immediate steps on one front, the Covid-19 pandemic, which has been devastating the region. Central America is in desperate need of vaccines, syringes, testing materials, and personal protective equipment. A history of underfunding, debt, and privatization, often due directly or indirectly to U.S. policy, has left Central America’s healthcare systems in shambles. While Latin America as a whole has been struggling to acquire the vaccines it needs, Honduras, Guatemala, and Nicaragua rank at the very bottom of doses administered. If the United States actually wanted to help Central America, the emergency provision of what those countries need to get vaccines into arms would be an obvious place to start.

Reversing economic exploitation: Addressing the structural and institutional bases of economic exploitation could also have a powerful impact. First, we could undo the harmful provisions of the 2005 Central America Free Trade Agreement (CAFTA). Yes, Central American governments beholden to Washington did sign on to it, but that doesn’t mean that the agreement benefited the majority of the inhabitants in the region. In reality, what CAFTA did was throw open Central American markets to U.S. agricultural exports, in the process undermining the livelihoods of small farmers there.

CAFTA also gave a boost to the maquiladora or export-processing businesses, lending an all-too-generous hand to textile, garment, pharmaceutical, electronics, and other industries that regularly scour the globe for the cheapest places to manufacture their goods. In the process, it created mainly the kind of low-quality jobs that corporations can easily move anytime in an ongoing global race to the bottom.

Central American social movements have also vehemently protested CAFTA provisions that undermine local regulations and social protections, while privileging foreign corporations. At this point, local governments in that region can’t even enforce the most basic laws they’ve passed to regulate such deeply exploitative foreign investors.

Another severe restriction that prevents Central American governments from pursuing economic policies in the interest of their populations is government debt. Private banks lavished loans on dictatorial governments in the 1970s, then pumped up interest rates in the 1980s, causing those debts to balloon. The International Monetary Fund stepped in to bail out the banks, imposing debt restructuring programs on already-impoverished countries — in other words, making the poor pay for the profligacy of the wealthy.

For real economic development, governments need the resources to fund health, education, and welfare. Unsustainable and unpayable debt (compounded by ever-growing interest) makes it impossible for such governments to dedicate resources where they’re truly needed. A debt jubilee would be a crucial step towards restructuring the global economy and shifting the stream of global resources that currently flows so strongly from the poorest to the richest countries.

Now, add another disastrous factor to this equation: the U.S. “drug wars” that have proven to be a key factor in the spread of violence, displacement, and corruption in Central America. The focus of the drug war on Mexico in the early 2000s spurred an orgy of gang violence there, while pushing the trade south into Central America. The results have been disastrous. As drug traffickers moved in, they brought violence, land grabs, and capital for new cattle and palm-oil industries, drawing in corrupt politicians and investors. Pouring arms and aid into the drug wars that have exploded in Central America has only made trafficking even more corrupt, violent, and profitable.

Reversing climate change: In recent years, ever more extreme weather in Central America’s “dry corridor,” running from Guatemala through El Salvador, Honduras, and Nicaragua, has destroyed homes, farms, and livelihoods, and this climate-change-induced trend is only worsening by the year. While the news largely tends to present ongoing drought, punctuated by ever more frequent and violent hurricanes and tropical storms, as well as increasingly disastrous flooding, as so many individual occurrences, their heightened frequency is certainly a result of climate change. And about a third of Central America’s migrants directly cite extreme weather as the reason they were forced to leave their homes. Climate change is, in fact, just what the U.S. Department of Defense all-too-correctly termed a “threat multiplier” that contributes to food and water scarcity, land conflicts, unemployment, violence, and other causes of migration.

The United States has, of course, played and continues to play an outsized role in contributing to climate change. And, in fact, we continue to emit far more CO2 per person than any other large country. We also produce and export large amounts of fossil fuels — the U.S., in fact, is one of the world’s largest exporters as well as one of the largest consumers. And we continue to fund and promote fossil-fuel-dependent development at home and abroad. One of the best ways the United States could help Central America would be to focus time, energy, and money on stopping the burning of fossil fuels.

Migration as a problem solver

Isn’t it finally time that the officials and citizens of the United States recognized the role migration plays in Central American economies? Where U.S. economic development recipes have failed so disastrously, migration has been the response to these failures and, for many Central Americans, the only available way to survive.

One in four Guatemalan families relies on remittances from relatives working in the United States and such monies account for about half of their income. President Biden may have promised Central America $4 billion in aid over four years, but Guatemala alone receives $9 billion a year in such remittances. And unlike government aid, much of which ends up in the pockets of U.S. corporations, local entrepreneurs, and bureaucrats of various sorts, remittances go directly to meet the needs of ordinary households.

At present, migration is a concrete way that Central Americans are trying to solve their all-too-desperate problems. Since the nineteenth century, Indigenous and peasant communities have repeatedly sought self-sufficiency and autonomy, only to be displaced by U.S. plantations in the name of progress. They’ve tried organizing peasant and labor movements to fight for land reform and workers’ rights, only to be crushed by U.S.-trained and sponsored militaries in the name of anti-communism. With other alternatives foreclosed, migration has proven to be a twenty-first-century form of resistance and survival.

If migration can be a path to overcome economic crises, then instead of framing Washington’s Central American policy as a way to stop it, the United States could reverse course and look for ways to enhance migration’s ability to solve problems.

Jason DeParle aptly titled his recent book on migrant workers from the Philippines A Good Provider Is One Who Leaves. “Good providers should not have to leave,” responded the World Bank’s Dilip Ratha, “but they should have the option.” As Ratha explains,

“Migrants benefit their destination countries. They provide essential skills that may be missing and fill jobs that native-born people may not want to perform. Migrants pay taxes and are statistically less prone to commit crimes than native-born people… Migration benefits the migrant and their extended family and offers the potential to break the cycle of poverty. For women, migration elevates their standing in the family and the society. For children, it provides access to healthcare, education, and a higher standard of living. And for many countries of origin, remittances provide a lifeline in terms of external, counter-cyclical financing.”

Migration can also have terrible costs. Families are separated, while many migrants face perilous conditions, including violence, detention, and potentially death on their journeys, not to speak of inadequate legal protection, housing, and working conditions once they reach their destination. This country could do a lot to mitigate such costs, many of which are under its direct control. The United States could open its borders to migrant workers and their families, grant them full legal rights and protections, and raise the minimum wage.

Would such policies lead to a large upsurge in migration from Central America? In the short run, they might, given the current state of that region under conditions created and exacerbated by Washington’s policies over the past 40 years. In the longer run, however, easing the costs of migration actually could end up easing the structural conditions that cause it in the first place.

Improving the safety, rights, and working conditions of migrants would help Central America far more than any of the policies Biden and Harris are proposing. More security and higher wages would enable migrants to provide greater support for families back home. As a result, some would return home sooner. Smuggling and human trafficking rings, which take advantage of illegal migration, would wither from disuse. The enormous resources currently aimed at policing the border could be shifted to immigrant services. If migrants could come and go freely, many would go back to some version of the circular migration pattern that prevailed among Mexicans before the militarization of the border began to undercut that option in the 1990s. Long-term family separation would be reduced. Greater access to jobs, education, and opportunity has been shown to be one of the most effective anti-gang strategies.

In other words, there’s plenty the United States could do to develop more constructive policies towards Central America and its inhabitants. That, however, would require thinking far more deeply about the “root causes” of the present catastrophe than Biden, Harris, and crew seem willing to do. In truth, the policies of this country bear an overwhelming responsibility for creating the very structural conditions that cause the stream of migrants that both Democrats and Republicans have decried, turning the act of simple survival into an eternal “crisis” for those very migrants and their families. A change in course is long overdue.

 

Aviva Chomsky writes regularly for TomDispatch (where this article originated). She is professor of history and coordinator of Latin American studies at Salem State University in Massachusetts. Her new book, Central America’s Forgotten History: Revolution, Violence, and the Roots of Migration, was published in April.

Copyright ©2021 Aviva Chomsky — distributed by Agence Global

—————-

Released: 19 July 2021

Word Count: 2,532

—————-

Steven Pressman, “A simple and inexpensive way to help families with children”

July 19, 2021 - The-Washington-Spectator

Raising children is expensive. A typical middle-class, two-child family spends $13,000 annually on each child, or nearly a quarter-million dollars per child in total through age 17. This tally includes neither college costs nor the cost to a family of putting money aside to help the children attend college, which itself can easily run another quarter of a million dollars or more for each child. The challenge is even greater for poor families that have little income and need to spend a large fraction of it supporting their children. Most families with children in the United States today are struggling financially.

Fortunately, there is a remedy—child allowances. These are regular payments from the government to families for the sake of their children. Think of it as universal basic income, but for children only. The aim is to help families support their children and keep families from being penalized because they have children and more mouths to feed. This pro-child policy is necessary because firms won’t pay workers more money if they have children. Any firm that did this would find itself at a competitive disadvantage. For this reason, virtually every country in the world has a child allowance program.

Nations benefit from helping low-income families with children. Children growing up in poverty get less education, earn less income, and pay less in taxes. They are a bigger burden on society throughout their lives. They experience more health problems, raising insurance costs for everyone, and receive more social insurance benefits. Finally, there is considerable evidence that poverty has noneconomic costs—it creates anxiety and behavioral problems in children and leads to higher rates of depression than among non-poor children.

Support for child allowances comes from across the political spectrum. Conservatives like that they encourage parents to stay at home and care for their kids and that they don’t require the government to make decisions about people’s lives. As the Niskanen Center, a conservative think tank, put it, child allowances “leave paternalism to the parents.” Liberals like that child allowances don’t stigmatize the poor the way means-tested benefits (such as SNAP, or food stamps) do, and that they are an effective way to reduce child poverty.

Nobel laureate economist Robert Solow estimated the annual cost to the nation due to child poverty at 3 percent of U.S. gross domestic product, or approximately $630 billion today. The Center for American Progress estimated the annual cost at 4 percent of GDP, or $840 billion. The cost comes from reduced employment and taxes plus higher crime rates and health care expenditures.

Providing child allowances is a highly effective way to reduce child poverty and decrease these societal costs. According to the U.S. Census Bureau, 14.4 percent of U.S. children (one in seven) were poor in 2019. The Luxembourg Income Study, a cross-national database with comparable information on household income, estimates that in the mid-2010s the child poverty rate in the United States for two-parent families was 13 percent. By comparison, the child poverty rate for two-parent families was 7.9 percent in Germany, 7.7 percent in the U.K., and 1.6 percent in Finland. My research traces these international differences directly to government policies aiding families with children, particularly child allowances.

Given the benefits of reducing child poverty, why has the United States failed to adopt a child allowance program? Partly, the reason is that the United States has used other policies to help families with children.

The main source of support for families with children has been a tax exemption for each child. In 2017, the last year that this tax benefit was available, each dependent child provided an exemption that reduced taxable income by around $4,000. The tax saving for a family then would depend on its tax bracket. Those in the top tax bracket (40 percent) got back $1,600 per child; a family in the 10 percent bracket gained only $400 from a $4,000 child exemption. Those in the 0 percent bracket, owing no taxes, got no help. This was an upside-down subsidy. It helped the affluent raise their children, but it did nothing to help poor families and little to help middle-class families.  
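To make the upside-down arithmetic concrete, here is a minimal sketch in Python (purely illustrative; the helper name exemption_benefit and the rounded bracket rates are mine, taken only from the figures cited above, not from the tax code itself).

    # Illustrative sketch: value of one dependent exemption at different marginal tax rates.
    # The $4,000 figure and the rounded bracket rates come from the paragraph above.

    EXEMPTION = 4_000  # approximate 2017 dependent exemption, in dollars

    def exemption_benefit(marginal_rate: float) -> float:
        """Tax saving from one dependent exemption at a given marginal rate."""
        return EXEMPTION * marginal_rate

    for label, rate in [("top bracket", 0.40), ("10 percent bracket", 0.10), ("0 percent bracket", 0.00)]:
        print(f"{label}: ${exemption_benefit(rate):,.0f} per child")

    # Expected output:
    # top bracket: $1,600 per child
    # 10 percent bracket: $400 per child
    # 0 percent bracket: $0 per child

The higher a family’s marginal rate, the larger the per-child subsidy, which is exactly the upside-down pattern described above.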

A push for change began in the 1990s. The National Commission on Children recommended a universal $1,000 child tax credit (the equivalent of $2,000 today) in a 1991 report. With a tax credit, every household receives the same monetary benefit. It is very nearly a child allowance. The United States first instituted a $400 child tax credit in the 1997 Taxpayer Relief Act. But the credit was not refundable. Families owing no taxes got nothing, and families owing less than the full amount of the credit could use it only to offset the taxes they owed to the federal government. Low-income families, needing the most help to raise their children, were helped the least.
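The gap between a non-refundable and a fully refundable credit can be sketched the same way; this is a hedged illustration only, with made-up tax liabilities, and the $2,000 credit level is simply the 2020 figure discussed below.

    # Illustrative comparison of a non-refundable vs. a fully refundable child tax credit.
    # Dollar amounts are examples, not a model of any family's actual return.

    CREDIT = 2_000  # per-child credit, the 2020 level discussed in the text

    def non_refundable_benefit(credit: float, tax_owed: float) -> float:
        """A non-refundable credit can offset only the taxes a family actually owes."""
        return min(credit, tax_owed)

    def refundable_benefit(credit: float, tax_owed: float) -> float:
        """A fully refundable credit pays out in full regardless of tax liability."""
        return credit

    for tax_owed in (0, 800, 3_000):  # hypothetical federal tax liabilities
        print(
            f"tax owed ${tax_owed:,}: "
            f"non-refundable ${non_refundable_benefit(CREDIT, tax_owed):,.0f}, "
            f"fully refundable ${refundable_benefit(CREDIT, tax_owed):,.0f}"
        )

    # Expected output:
    # tax owed $0: non-refundable $0, fully refundable $2,000
    # tax owed $800: non-refundable $800, fully refundable $2,000
    # tax owed $3,000: non-refundable $2,000, fully refundable $2,000

The families with the least tax liability, and therefore the greatest need, are exactly the ones a non-refundable credit leaves out.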

When George W. Bush increased the tax credit to $1,000, as part of his 2001 tax cut, Democrats pushed to have it be partially refundable. The credit has been increased several times since then. In 2020, the credit was $2,000; $1,400 was refundable to those with at least $2,500 of earned income. Still, families with children in the greatest need received no aid or very little. The Brookings Institution has calculated that 40 percent of the $118 billion spent on this program in 2020 went to households with incomes above $100,000. In contrast, most children living in households in the bottom 10 percent of the income distribution got nothing.  

Child poverty experts in Congress, including Democratic Senators Michael Bennet of Colorado, Sherrod Brown of Ohio, and Representative Rosa DeLauro of Connecticut, have pushed long and hard to make the tax credit fully refundable. They succeeded when President Biden signed the American Rescue Plan on March 11. This $1.9 trillion Covid-19 relief bill increased the child tax credit from $2,000 to $3,000, with an extra $600 for children under age 6, and made the credit fully refundable.

Finally, the American Rescue Plan stipulated that payments be made monthly to families with children, rather than annually through a tax refund. Beginning July 2021 and continuing through June 2022, most families will get monthly payments from the IRS of $300 for each young child and $250 for children over the age of 5. Providing money sooner helps the many families with variable income. Monthly payments will also reduce child hunger and homelessness, as well as the high-interest debt that families incur (e.g., payday loans) to put food on the table and pay utility bills. As a full child allowance policy, these payments will make a huge difference in the lives of children whose families live paycheck to paycheck.

The Center on Budget and Policy Priorities estimates that the changes in the child tax credit will lift 4.1 million children above the poverty line. The Center on Poverty and Social Policy at Columbia University estimates that nearly five million children will escape poverty due to these changes, reducing the U.S. child poverty rate to 7.9 percent. It will cost $109 billion, according to the Congressional Joint Committee on Taxation.

Given the costs of child poverty, as estimated by Solow and the Center for American Progress, the American Rescue Plan should pay for itself quickly by cutting the U.S. child poverty rate nearly in half. Virtually no private investment has such a large rate of return. Furthermore, the government can borrow money for this investment at the rate of around 1.5 percent (the interest rate on 10-year government bonds).

Like Cinderella at midnight, the fully refundable child tax credit will revert after June 2022 to what it was during 2020. A number of Democrats hope to make the refundable tax credit permanent. If they succeed, that would be a major step forward in reducing child poverty in the United States.

A permanent child tax credit, or child allowance program, would be of significant help to low- and middle-income families with children. The resulting reduction in child poverty would benefit the entire nation in demonstrable ways. It is time to follow the rest of the world and make this a permanent feature of the U.S. tax code.

Steven Pressman is professor of economics at Colorado State University, author of Fifty Major Economists, 3rd edition (Routledge, 2013), and president of the Association for Social Economics.

Copyright ©2021 The Washington Spectator — distributed by Agence Global

—————-

Released: 19 July 2021

Word Count: 1,353

—————-

Andrea Mazzarino, “Who authorized America’s wars?”

July 15, 2021 - TomDispatch

Sometimes, as I consider America’s never-ending wars of this century, I can’t help thinking of those lyrics from the Edwin Starr song, “(War, huh) Yeah! (What is it good for?) Absolutely nothing!” I mean, remind me, what good have those disastrous, failed, still largely ongoing conflicts done for this country?  Or for you?  Or for me?

For years and years, what came to be known as America’s “war on terror” (and later just its “forever wars”) enjoyed remarkable bipartisan support in Congress, not to say the country at large. Over nearly two decades, four presidents from both parties haven’t hesitated to exercise their power to involve our military in all sorts of ways in at least 85 countries around the world in the name of defeating “terrorism” or “violent extremism.”  Such interventions have included air strikes against armed groups in seven countries, direct combat against such groups in 12 countries, military exercises in 41 countries, and training or assistance to local military, police, or border patrol units in 79 countries. And that’s not even to mention the staggering number of U.S. military bases around the world where counterterrorism operations can be conducted, the massive arms sales to foreign governments, or all the additional deployments of this country’s Special Operations forces.

Providing the thinnest of legal foundations for all of this have been two ancient acts of Congress. The first was the authorization for the use of military force (AUMF) that allowed the president to act against "those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons." Passed in the week after those attacks on New York City and Washington, D.C., it led, of course, to the disastrous war in Afghanistan. That bill's lone opponent in the House, Representative Barbara Lee (D–CA), faced death threats from the public for her vote, though she stood by it, fearing all too correctly that such a law would sanction endless wars abroad (as, of course, it did).

The second AUMF passed in October 2002 by a 77-23 vote in the Senate. Under the false rationale that Saddam Hussein's Iraq harbored weapons of mass destruction (it didn't), that AUMF gave President George W. Bush and his crew a green light to invade Iraq and topple its regime. Last month, the House finally voted 268-161 (including 49 Republican yes votes) to repeal the second of those authorizations.

Thinking back to when America’s “forever wars” first began, it’s hard to imagine how we could still be fighting in Iraq and Syria under the same loose justification of a war on terror almost two decades later or that the 2001 AUMF, untouched by Congress, still stands, providing the fourth president since the war on terror began with an excuse for actions of all sorts.

I remember watching in March 2003 from my home in northern California as news stations broadcast bombs going off over Baghdad. I'd previously attended protests around San Francisco, shouting my lungs out about the potentially disastrous consequences of invading a country based on what, even then, seemed like an obvious lie. Meanwhile, little did I know that the Afghan War, whose authorization I had indeed supported as a way to liberate the women of that country and create a democracy from an abusive state, would still be disastrously ongoing nearly 20 years later.

Nor did I imagine that, in 2011, having grasped my mistake when it came to the Afghan War, I would co-found Brown University’s Costs of War Project; nor that, about a decade into that war, I would be treating war-traumatized veterans and their families as a psychotherapist, even as I became the spouse of a Navy submariner. I would spend the second decade of the war on terror shepherding my husband and our two young children through four military moves and countless deployments, our lives breathless and harried by the outlandish pace of the disastrous forever (and increasingly wherever) wars that had come to define America’s global presence in the twenty-first century.

Amid all the talk about Joe Biden's Afghan withdrawal decision, which came "from the gut," according to an official close to the president, it's easy to forget that this country continues to fight some of those very same wars.

What keeps us safe?

Take, for example, late last month when President Biden ordered "defensive" airstrikes in Iraq and Syria against reportedly Iran-backed Iraqi militia groups. Those groups were thought to be responsible for a series of at least five drone attacks on weapons storage and operational bases used by U.S. troops in Iraq and Syria. The June American air strikes supposedly killed four militia members, though there have been reports that one hit a housing complex, killing a child and wounding three other civilians (something that has yet to be verified). An unnamed "senior administration official" explained: "We have a responsibility to demonstrate that attacking Americans carries consequences, and that is true whether or not those attacks inflict casualties." He did not, however, explain what those American troops were doing in the first place at bases in Iraq and Syria.

Note that such an act was taken on presidential authority alone, with Congress thoroughly sidelined as it has been since it passed those AUMFs so long ago. To be sure, some Americans still argue that such preemptive attacks — and really, any military buildups whatsoever — are precisely what keep Americans safe.

My husband, a Navy officer, has served on three nuclear and ballistic submarines and one battleship. He’s also built a nearly 20-year career on the philosophy that the best instrument of peace, should either of the other two great powers on this planet step out of line, is the concept of mutually-assured destruction — the possibility, that is, that a president would order not air strikes in Syria, but nuclear strikes somewhere.

He and I argue about this regularly. How, I ask him, can any weapons, no less nuclear ones, ever be seen as instruments of safety? (Though living in the country with the most armed citizens on the planet, I know that this isn’t exactly a winning argument domestically.) I mean, consider the four years we’ve just lived through! Consider the hands our nuclear arsenal was in from 2017 to 2020!

My husband always simply looks at me as if he knows so much more than I do about this. Yet the mere hint of a plan for “peace” based on a world-ending possibility doesn’t exactly put me at ease, nor does a world in which an American president can order air strikes more or less anywhere on the planet without the backing of anyone else, Congress included.

Every time my husband leaves home to go to some bunker or office where he would be among the first to be sheltered from a nuclear attack, my gut clenches. I feel the hopelessness of what would happen if we ever reached that point of no return where the only option might be to strike back because we ourselves were about to die. It would be a “solution” in which just those in power might remain safe. Meanwhile, our more modest preemptive attacks against other militaries and armed groups in distant lands exact a seldom-recognized toll in blood and treasure.

Every time I hear about preemptive strikes like those President Biden ordered last month in countries we’re not even officially at war with, attacks that were then sanctioned across most of the political spectrum in Washington from Democratic House Speaker Nancy Pelosi to Oklahoma Republican Senator Jim Inhofe, I wonder: How many people died in those attacks? Whose lives in those target areas were destroyed by uncertainty, fear, and the prospect of long-term anxiety?

In addition, given my work as a therapist with vets, I always wonder how the people who carried out such strikes are feeling right now. I know from experience that just following such life-ending orders can create a sense of internal distress that changes you in ways almost as consequential as losing a limb or taking a bullet.

How our wars kill at home

For years now, my colleagues and I at the Costs of War Project have struggled to describe and quantify the human costs of America's never-ending twenty-first-century wars. All told, we've estimated that more than 801,000 people died in fighting among U.S., allied, and opposing troops and police forces. And that doesn't include indirect deaths due to wrecked healthcare systems, malnutrition, the uprooting of populations, and the violence that continues to plague traumatized families in those war zones (and here at home as well).

According to a stunning new report by Boston University’s Ben Suitt, the big killer of Americans engaged in the war on terror has not, in fact, been combat, but suicide, which has so far claimed the lives of 30,177 veterans and active servicemembers. Suicide rates among post-9/11 war veterans are higher than for any cohort of veterans since before World War II. Among those aged 18 to 35 (the oldest of whom weren’t even of voting age when we first started those never-ending wars and the youngest of whom weren’t yet born), the rate has increased by a whopping 76% since 2005.

And if you think that those most injured from their service are the ones coming home after Iraq and Afghanistan, consider this: over the past two decades, suicide rates have increased most sharply among those who have never even been deployed to a combat zone or have been deployed just once.

It's hard to say why even those who don't fight are killing themselves so far from America's distant battlefields. As a psychotherapist who has seen my share of veterans who attempted to kill themselves or, later, succeeded in doing so, I can say that two key predictors of that final, desperate act are hopelessness and a sense that you have no legitimate contribution to make to others.

As Suitt points out, about 42% of Americans are now either unaware of the fact that their country is still fighting wars in the Greater Middle East and Africa or think that the war on terror is over. Consider that for a moment. What does it mean to be fighting wars for a country in which more than two-fifths of the population is unaware that you're even doing so?

As a military spouse whose partner has not been deployed to a combat zone, I still share the burdens of America's forever wars in concrete ways: more frequent and longer deployments with shorter breaks, more abusive and all-encompassing command structures, and very little clear sense of what it is this country could possibly be fighting for anymore or what the end game might be.

If strikes like the ones President Biden authorized last month reflect anything, it’s that there are few ways — certainly not Congress — of reining in our commander in chief from sending Americans to harm and be harmed.

“Are soldiers killers?” I recall lying awake in 1991, at age 12, my stomach in knots, thinking about the first display of pyrotechnics I can remember, when President George H.W. Bush authorized strikes against Saddam Hussein’s Iraq in what became known as the First Gulf War. I told my father then, “I can’t sleep because I think that something bad is going to happen!” I didn’t know what, but those balls of fire falling on Baghdad on my New Jersey TV screen seemed consequential indeed.

Where were they landing? On whom? What was going to happen to our country? My father, who used a minor college football injury to dodge the Vietnam draft and has supported every war since then, shrugged, patted me on the back, and said he didn’t know, but that I shouldn’t worry too much about it.

As a parent myself now, I can still remember what it was like to first consider that people might kill others. As a result, I try to keep a conversation going with my own children as they start to grapple with the existence of evil.

Recently, our six-year-old son, excited to practice his newfound reading skills, came across a World War II military history book in my husband’s office and found photos of both Nazi soldiers and Jewish concentration camp prisoners. He stared at the gaunt bodies and haunted eyes of those prisoners. After a first-grade-level conversation about war and hatred, he suddenly pointed at Nazi soldiers in one photo and asked, “Are soldiers killers?” My husband and I flinched. And then he asked: “Why do people kill?”

Over and over, as such questions arise, I tell my son that people die in wars because so many of us turn our backs on what’s going on in the world we live in. I’m all too aware that we stop paying attention to what elected officials do because we’ve decided we like them (or hate them but can’t be bothered by them). I tell him that we’re going to keep reading the news and talking about it, because my little family, whatever our arguments, agrees that Americans don’t care enough about what war does to the bodies and minds of those who live through it.

Here's the truth of it: we shouldn't be spending this much time, money, and blood on conflicts whose end games are left to the discretion of whoever our increasingly shaky electoral system places in this country's highest office. Until we pressure lawmakers to repeal that 2001 AUMF and end the forever conflicts that have gone with it, America's wars will make any promises of peace, self-defense, and justice from our democracy and the rule of law as we know them ring hollow.

Don’t doubt it for a second. War is a cancer on our democracy.

Andrea Mazzarino writes regularly for TomDispatch (where this article originated). She co-founded Brown University’s Costs of War Project. She has held various clinical, research, and advocacy positions, including at a Veterans Affairs PTSD Outpatient Clinic, with Human Rights Watch, and at a community mental health agency. She is the co-editor of War and Health: The Medical Consequences of the Wars in Iraq and Afghanistan.

Copyright ©2021 Andrea Mazzarino — distributed by Agence Global

—————-

Released: 15 July 2021

Word Count: 2,301

—————-

Michael Klare, “On the brink in 2026: U.S.-China near-war status report”

July 13, 2021 - TomDispatch

It’s the summer of 2026, five years after the Biden administration identified the People’s Republic of China as the principal threat to U.S. security and Congress passed a raft of laws mandating a society-wide mobilization to ensure permanent U.S. domination of the Asia-Pacific region. Although major armed conflict between the United States and China has not yet broken out, numerous crises have erupted in the western Pacific and the two countries are constantly poised for war. International diplomacy has largely broken down, with talks over climate change, pandemic relief, and nuclear nonproliferation at a standstill. For most security analysts, it’s not a matter of if a U.S.-China war will erupt, but when.

Does this sound fanciful? Not if you read the statements coming out of the Department of Defense (DoD) and the upper ranks of Congress these days.

“China poses the greatest long-term challenge to the United States and strengthening deterrence against China will require DoD to work in concert with other instruments of national power,” the Pentagon’s 2022 Defense Budget Overview asserts. “A combat-credible Joint Force will underpin a whole-of-nation approach to competition and ensure the Nation leads from a position of strength.”  

On this basis, the Pentagon requested $715 billion in military expenditures for 2022, with a significant chunk of those funds to be spent on the procurement of advanced ships, planes, and missiles intended for a potential all-out, “high-intensity” war with China. An extra $38 billion was sought for the design and production of nuclear weapons, another key aspect of the drive to overpower China.

Democrats and Republicans in Congress, contending that even such sums were insufficient to ensure continued U.S. superiority vis-à-vis that country, are pressing for further increases in the 2022 Pentagon budget. Many have also endorsed the EAGLE Act, short for Ensuring American Global Leadership and Engagement — a measure intended to provide hundreds of billions of dollars for increased military aid to America’s Asian allies and for research on advanced technologies deemed essential for any future high-tech arms race with China.

Imagine, then, that such trends only gain momentum over the next five years. What will this country be like in 2026? What can we expect from an intensifying new Cold War with China that, by then, could be on the verge of turning hot?

Taiwan 2026: perpetually on the brink

Crises over Taiwan have erupted on a periodic basis since the start of the decade, but now, in 2026, they seem to be occurring every other week. With Chinese bombers and warships constantly probing Taiwan's outer defenses and U.S. naval vessels regularly maneuvering close to their Chinese counterparts in waters near the island, the two sides never seem far from a shooting incident that would have instantaneous escalatory implications. So far, no lives have been lost, but planes and ships from both sides have narrowly missed colliding again and again. On each occasion, forces on both sides have been placed on high alert, causing jitters around the world.

The tensions over that island have largely stemmed from incremental efforts by Taiwanese leaders, mostly officials of the Democratic Progressive Party (DPP), to move their country from autonomous status as part of China to full independence. Such a move is bound to provoke a harsh, possibly military response from Beijing, which considers the island a renegade province.

The island’s status has plagued U.S.-China relations for decades. When, on January 1, 1979, Washington first recognized the People’s Republic of China, it agreed to withdraw diplomatic recognition from the Taiwanese government and cease formal relations with its officials. Under the Taiwan Relations Act of 1979, however, U.S. officials were obligated to conduct informal relations with Taipei. The act stipulated as well that any move by Beijing to alter Taiwan’s status by force would be considered “a threat to the peace and security of the Western Pacific area and of grave concern to the United States” — a stance known as “strategic ambiguity,” as it neither guaranteed American intervention, nor ruled it out.

In the ensuing decades, the U.S. sought to avoid conflict in the region by persuading Taipei not to make any overt moves toward independence and by minimizing its ties to the island, thereby discouraging aggressive moves by China. By 2021, however, the situation had been remarkably transformed. Once under the exclusive control of the Nationalist Party that had been defeated by communist forces on the Chinese mainland in 1949, Taiwan became a multiparty democracy in 1987. It has since witnessed the steady rise of pro-independence forces, led by the DPP. At first, the mainland regime sought to woo the Taiwanese with abundant trade and tourism opportunities, but the excessive authoritarianism of its Communist Party alienated many island residents — especially younger ones — only adding momentum to the drive for independence. This, in turn, has prompted Beijing to switch tactics from courtship to coercion by constantly sending its combat planes and ships into Taiwanese air and sea space.

Trump administration officials, less concerned about alienating Beijing than their predecessors, sought to bolster ties with the Taiwanese government in a series of gestures that Beijing found threatening and that were only expanded in the early months of the Biden administration. At that time, growing hostility to China led many in Washington to call for an end to “strategic ambiguity” and the adoption of an unequivocal pledge to defend Taiwan if it were to come under attack from the mainland.

“I think the time has come to be clear,” Senator Tom Cotton of Arkansas declared in February 2021. “Replace strategic ambiguity with strategic clarity that the United States will come to the aid of Taiwan if China was to forcefully invade Taiwan.”

The Biden administration was initially reluctant to adopt such an inflammatory stance, since it meant that any conflict between China and Taiwan would automatically become a U.S.-China war with nuclear ramifications. In April 2022, however, under intense congressional pressure, the Biden administration formally abandoned “strategic ambiguity” and vowed that a Chinese invasion of Taiwan would prompt an immediate American military response. “We will never allow Taiwan to be subjugated by military force,” President Biden declared at that time, a striking change in a longstanding American strategic position.

The DoD would soon announce the deployment of a permanent naval squadron to the waters surrounding Taiwan, including an aircraft carrier and a supporting flotilla of cruisers, destroyers, and submarines. Ely Ratner, President Biden’s top envoy for the Asia-Pacific region, first outlined plans for such a force in June 2021 during testimony before the Senate Armed Services Committee. A permanent U.S. presence, he suggested, would serve to “deter, and, if necessary, deny a fait accompli scenario” in which Chinese forces quickly attempted to overwhelm Taiwan. Although described as tentative then, it would, in fact, become formal policy following President Biden’s April 2022 declaration on Taiwan and a brief exchange of warning shots between a Chinese destroyer and a U.S. cruiser just south of the Taiwan Strait.

Today, in 2026, with a U.S. naval squadron constantly sailing in waters near Taiwan and Chinese ships and planes constantly menacing the island’s outer defenses, a potential Sino-American military clash never seems far off. Should that occur, what would happen is impossible to predict, but most analysts now assume that both sides would immediately fire their advanced missiles — many of them hypersonic (that is, exceeding five times the speed of sound) — at their opponent’s key bases and facilities. This, in turn, would provoke further rounds of air and missile strikes, probably involving attacks on Chinese and Taiwanese cities as well as U.S. bases in Japan, Okinawa, South Korea, and Guam. Whether such a conflict could be contained at the non-nuclear level remains anyone’s guess.

The incremental draft

In the meantime, planning for a U.S.-China war-to-come has dramatically reshaped American society and institutions. The "Forever Wars" of the first two decades of the twenty-first century had been fought entirely by an All-Volunteer Force (AVF) that typically endured multiple tours of duty, in particular in Iraq and Afghanistan. The U.S. was able to sustain such combat operations (while continuing to maintain a substantial troop presence in Europe, Japan, and South Korea) with 1.4 million servicemembers because American forces enjoyed uncontested control of the airspace over their war zones, while China and Russia remained wary of engaging U.S. forces in their own neighborhoods.

Today, in 2026, however, the picture looks radically different: China, with an active combat force of two million soldiers, and Russia, with another million — both militaries equipped with advanced weaponry not widely available to them in the early years of the century — pose a far more formidable threat to U.S. forces. An AVF no longer looks particularly viable, so plans for its replacement with various forms of conscription are already being put into place.

Bear in mind, however, that in a future war with China and/or Russia, the Pentagon doesn’t envision large-scale ground battles reminiscent of World War II or the Iraq invasion of 2003. Instead, it expects a series of high-tech battles involving large numbers of ships, planes, and missiles. This, in turn, limits the need for vast conglomerations of ground troops, or “grunts,” as they were once labeled, but increases the need for sailors, pilots, missile launchers, and the kinds of technicians who can keep so many high-tech systems at top operational capacity.

As early as October 2020, during the final months of the Trump administration, Secretary of Defense Mark Esper was already calling for a doubling of the size of the U.S. naval fleet, from approximately 250 to 500 combat vessels, to meet the rising threat from China. Clearly, however, there would be no way for a force geared to a 250-ship navy to sustain one double that size. Even if some of the additional ships were “uncrewed,” or robotic, the Navy would still have to recruit several hundred thousand more sailors and technicians to supplement the 330,000 then in the force. Much the same could be said of the U.S. Air Force.

No surprise, then, that an incremental restoration of the draft, abandoned in 1973 as the Vietnam War was drawing to a close, has taken place in these years. In 2022, Congress passed the National Service Reconstitution Act (NSRA), which requires all men and women aged 18 to 25 to register with newly reconstituted National Service Centers and to provide them with information on their residence, employment status, and educational background — information they are required to update on an annual basis. In 2023, the NSRA was amended to require registrants to complete an additional questionnaire on their technical, computer, and language skills. Since 2024, all men and women enrolled in computer science and related programs at federally aided colleges and universities have been required to enroll in the National Digital Reserve Corps (NDRC) and spend their summers working on defense-related programs at selected military installations and headquarters. Members of that Digital Corps must also be available on short notice for deployment to such facilities, should a conflict of any sort threaten to break out.

The establishment of just such a corps, it should be noted, had been a recommendation of the National Security Commission on Artificial Intelligence, a federal agency established in 2019 to advise Congress and the White House on how to prepare the nation for a high-tech arms race with China. “We must win the AI competition that is intensifying strategic competition with China,” the commission avowed in March 2021, given that “the human talent deficit is the government’s most conspicuous AI deficit.” To overcome it, the commission suggested then, “We should establish a… civilian National Reserve to grow tech talent with the same seriousness of purpose that we grow military officers. The digital age demands a digital corps.”

Indeed, only five years later, with the prospect of a U.S.-China conflict so obviously on the agenda, Congress is considering a host of bills aimed at supplementing the Digital Corps with other mandatory service requirements for men and women with technical skills, or simply for the reinstatement of conscription altogether and the full-scale mobilization of the nation. Needless to say, protests against such measures have been erupting at many colleges and universities, but with the mood of the country becoming increasingly bellicose, there has been little support for them among the general public. Clearly, the “volunteer” military is about to become an artifact of a previous epoch.

A new cold war culture of repression

With the White House, Congress, and the Pentagon obsessively focused on preparations for what's increasingly seen as an inevitable war with China, it's hardly surprising that civil society in 2026 has similarly been swept up in an increasingly militaristic anti-China spirit. Popular culture is now saturated with nationalistic and jingoistic memes, regularly portraying China and the Chinese leadership in derogatory, often racist terms. Domestic manufacturers hype "Made in America" labels (even if they're often inaccurate) and firms that once traded extensively with China loudly proclaim their withdrawal from that market, while the streaming superhero movie of the moment, The Beijing Conspiracy, on a foiled Chinese plot to disable the entire U.S. electrical grid, is the leading candidate for the best film Oscar.

Domestically, by far the most conspicuous and pernicious result of all this has been a sharp rise in hate crimes against Asian Americans, especially those assumed to be Chinese, whatever their origin. This disturbing phenomenon began at the outset of the Covid crisis, when President Trump, in a transparent effort to deflect blame for his mishandling of the pandemic, started using terms like "Chinese Virus" and "Kung Flu" to describe the disease. Attacks on Asian Americans rose precipitously then and continued to climb after Joe Biden took office and began vilifying Beijing for its human rights abuses in Xinjiang and Hong Kong. According to the watchdog group Stop AAPI Hate, some 6,600 anti-Asian incidents were reported in the U.S. between March 2020 and March 2021, with almost 40% of those events occurring in February and March 2021.

For observers of such incidents back then, the connection between anti-China policymaking at the national level and anti-Asian violence at the neighborhood level was incontrovertible. “When America China-bashes, then Chinese get bashed, and so do those who ‘look Chinese,’” said Russell Jeung, a professor of Asian American Studies at San Francisco State University at that time. “American foreign policy in Asia is American domestic policy for Asians.”

By 2026, most Chinatowns in America have been boarded up and those that remain open are heavily guarded by armed police. Most stores owned by Asian Americans (of whatever background) were long ago closed due to boycotts and vandalism, and Asian Americans think twice before leaving their homes.

The hostility and distrust exhibited toward Asian Americans at the neighborhood level has been replicated at the workplace and on university campuses, where Chinese Americans and Chinese-born citizens are now prohibited from working at laboratories in any technical field with military applications. Meanwhile, scholars of any background working on China-related topics are subject to close scrutiny by their employers and government officials. Anyone expressing positive comments about China or its government is routinely subjected to harassment, at best, or at worst, dismissal and FBI investigation.

As with the incremental draft, such increasingly restrictive measures were first adopted in a series of laws in 2022. But the foundation for much of this was the United States Innovation and Competition Act of 2021, passed by the Senate in June of that year. Among other provisions, it barred federal funding to any college or university that hosted a Confucius Institute, a Chinese government program to promote that country’s language and culture in foreign countries. It also empowered federal agencies to coordinate with university officials to “promote protection of controlled information as appropriate and strengthen defense against foreign intelligence services,” especially Chinese ones.

Diverging from the path of war

Yes, in reality, we're still in 2021, even if the Biden administration regularly cites China as our greatest threat. Naval incidents with that country's vessels in the South China Sea and the Taiwan Strait are indeed on the rise, as are anti-Asian-American sentiments domestically. Meanwhile, as the planet's two greatest greenhouse-gas emitters squabble, our world is growing hotter by the year.

Without question, something like the developments described above (and possibly far worse) will lie in our future unless action is taken to alter the path we're now on. All of those "2026" developments, after all, are rooted in trends and actions already under way that only appear to be gathering momentum at this moment. Bills like the Innovation and Competition Act enjoy near unanimous support among Democrats and Republicans, while strong majorities in both parties favor increased Pentagon spending on China-oriented weaponry. With few exceptions — Senator Bernie Sanders among them — no one in the upper ranks of government is saying: Slow down. Don't launch another Cold War that could easily go hot.

“It is distressing and dangerous,” as Sanders wrote recently in Foreign Affairs, “that a fast-growing consensus is emerging in Washington that views the U.S.-Chinese relationship as a zero-sum economic and military struggle.” At a time when this planet faces ever more severe challenges from climate change, pandemics, and economic inequality, he added that “the prevalence of this view will create a political environment in which the cooperation that the world desperately needs will be increasingly difficult to achieve.”

In other words, we Americans face an existential choice: Do we stand aside and allow the "fast-growing consensus" Sanders speaks of to shape national policy, while abandoning any hope of genuine progress on climate change or those other perils? Alternatively, do we begin trying to exert pressure on Washington to adopt a more balanced relationship with China, one that would place at least as much emphasis on cooperation as on confrontation? If we fail at this, be prepared in 2026 or soon thereafter for the imminent onset of a catastrophic (possibly even nuclear) U.S.-China war.

Michael T. Klare writes regularly for TomDispatch (where this article originated). He is the five-college professor emeritus of peace and world security studies at Hampshire College and a senior visiting fellow at the Arms Control Association. He is the author of 15 books, the latest of which is All Hell Breaking Loose: The Pentagon’s Perspective on Climate Change. He is a founder of the Committee for a Sane U.S.-China Policy.

Copyright ©2021 Michael T. Klare — distributed by Agence Global

—————-

Released: 13 July 2021

Word Count: 2,983

—————-

Rebecca Gordon, “The fires this time: a climate change view from California”

July 12, 2021 - TomDispatch

In San Francisco, we’re finally starting to put away our masks. With 74% of the city’s residents over 12 fully vaccinated, for the first time in more than a year we’re enjoying walking, shopping, and eating out, our faces naked. So I was startled when my partner reminded me that we need to buy masks again very soon — N95 masks, that is. The California wildfire season has already begun, earlier than ever, and we’ll need to protect our lungs during the months to come from the fine particulates carried in the wildfire smoke that’s been engulfing this city in recent years.

I was in Reno last September, so I missed the morning when San Franciscans awoke to apocalyptic orange skies, the air freighted with smoke from burning forests elsewhere in the state. The air then was bad enough even in the high mountain valley of Reno. At that point, we'd already experienced "very unhealthy" purple-zone air quality for days. Still, it was nothing like the photos then emerging from the Bay Area, which could have been from Mars. I have a bad feeling that I may get my chance to experience the same phenomenon in 2021 — and, as the fires across California have started so much earlier, probably sooner than September.

The situation is pretty dire: this state — along with our neighbors to the north and southeast — is now living through an epic drought. After a dry winter and spring, the fuel-moisture content in our forests (the amount of water in vegetation, living and dead) is way below average. This April, the month when it is usually at its highest, San Jose State University scientists recorded levels a staggering 40% below average in the Santa Cruz Mountains, well below the lowest level ever before observed. In other words, we have never been this dry.

Under the heat dome

When it's hot in most of California, it's often cold and foggy in San Francisco. Today is no exception. Despite the raging news about heat records, it's not likely to reach 65 degrees here. So it's a little surreal to consider what friends and family are going through in the Pacific Northwest under the once-in-thousands-of-years heat dome that's settled over the region. A heat dome is an area of high pressure surrounded by upper-atmosphere winds that essentially pin it in place. If you remember your high-school physics, you'll recall that when a gas (for example, the air over the Pacific Northwest) is contained, the ratio between pressure and temperature remains constant. If the temperature goes up, the pressure goes up.

The converse is also true; as the pressure rises, so does the temperature. And that’s what’s been happening over Oregon, Washington, and British Columbia in normally chilly Canada. Mix in the fact that climate change has driven average temperatures in those areas up by three to four degrees since the industrial revolution, and you have a recipe for the disaster that struck the region recently.
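
For readers who want that high-school relation spelled out, it is Gay-Lussac's law for a fixed amount of gas held at constant volume; this is a simplification of the point being made here, not a full model of heat-dome meteorology.

    \[
      \frac{P_1}{T_1} = \frac{P_2}{T_2}
      \qquad\Longrightarrow\qquad
      P_2 = P_1\,\frac{T_2}{T_1}
      \qquad \text{(fixed volume and amount of gas, with } T \text{ in kelvin)}
    \]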

And it has indeed been a disaster. The temperature in the tiny town of Lytton, British Columbia, for instance, hit 121 degrees on June 29th, breaking the Canadian heat record for the third time in as many days. (The previous record had stood since 1937.) That was Tuesday. On Wednesday night, the whole town was engulfed in the flames of multiple fires. The fires, in turn, generated huge pyrocumulus clouds that penetrated as high as the stratosphere (a rare event in itself), producing lightning strikes that ignited new fires in a vicious cycle that, in the end, simply destroyed the kilometer-long town.

Heat records have been broken all over the Pacific Northwest. Portland topped records for three days running, culminating with a 116-degree day on June 28th; Seattle hit a high of 108, which the Washington Post reported “was 34 degrees above the normal high of 74 and higher than the all-time heat record in Washington, D.C., among many other cities much farther to its south.”

With the heat comes a rise in "sudden and unexpected" deaths. Hundreds have died in Oregon and Washington and, according to the British Columbia coroner, at least 300 in that province — almost double the average number for that time period.

Class, race, and hot air

It's hardly a new observation that the people who have benefited least from the causes of climate change — the residents of less industrialized countries and poor people of all nations — are already suffering most from its results. Island nations like the Republic of Palau in the western Pacific are a prime example. Palau faces a number of climate-change challenges, according to the United Nations Development Program, including rising sea levels that threaten to inundate some of its lowest-lying islands, which are just 10 meters above sea level. In addition, encroaching seawater is salinating some of its agricultural land, creating seaside strips that can now grow only salt-tolerant root crops. Meanwhile, despite substantial annual rainfall, saltwater inundation threatens the drinking water supply. And worse yet, Palau is vulnerable to ocean storms that, on our heating planet, are growing ever more frequent and severe.

There are also subtle ways the rising temperatures that go with climate change have differential effects, even on people living in the same city. Take air conditioning. One of the reasons people in the Pacific Northwest suffered so horrendously under the heat dome is that few homes in that region are air conditioned. Until recently, people there had been able to weather the minimal number of very hot days each year without installing expensive cooling machinery.

Obviously, people with more discretionary income will have an easier time investing in air conditioning now that temperatures are rising. What’s less obvious, perhaps, is that its widespread use makes a city hotter — a burden that falls disproportionately on people who can’t afford to install it in the first place. Air conditioning works on a simple principle; it shifts heat from air inside an enclosed space to the outside world, which, in turn, makes that outside air hotter.

A 2014 study of this effect in Phoenix, Arizona, showed that air conditioning raised ambient temperatures by one to two degrees at night — an important finding, because one of the most dangerous aspects of the present heat waves is their lack of night-time cooling. As a result, each day’s heat builds on a higher base, while presenting a greater direct-health threat, since the bodies of those not in air conditioning can’t recover from the exhaustion of the day’s heat at night. In effect, air conditioning not only heats the atmosphere further but shifts the burden of unhealthy heat from those who can afford it to those who can’t.

Just as the coronavirus has disproportionately ravaged black and brown communities (as well as poor nations around the world), climate-change-driven heat waves, according to a recent University of North Carolina study reported by the BBC, mean that "black people living in most U.S. cities are subject to double the level of heat stress as their white counterparts." This is the result not just of poverty, but of residential segregation, which leaves urban BIPOC (black, indigenous, and other people of color) communities in a city's worst "heat islands" — the areas containing the most concrete, the most asphalt, and the least vegetation, which therefore attract and retain the most heat.

“Using satellite temperature data combined with demographic information from the U.S. Census,” the researchers “found that the average person of color lives in an area with far higher summer daytime temperatures than non-Hispanic white people.” They also discovered that, in all but six of the 175 urban areas they studied in the continental U.S., “people of color endure much greater heat impacts in summer.” Furthermore, “for black people this was particularly stark. The researchers say they are exposed to an extra 3.12C  [5.6F] of heating, on average, in urban neighborhoods, compared to an extra 1.47C [2.6F] for white people.”

That’s a big difference.

Food, drink, and fires — the view from California

Now, let me return to my own home state, California, where conditions remain all too dry and, apart from the coast right now, all too hot. Northern California gets most of its drinking water from the snowpack that builds each year in the Sierra Nevada mountains. In spring, those snows gradually melt, filling the rivers that fill our reservoirs. In May 2021, however, the Sierra snowpack was a devastating six percent of normal!

Stop a moment and take that in, while you try to imagine the future of much of the state — and the crucial crops it grows.

For my own hometown, San Francisco, things aren’t quite that dire. Water levels in Hetch Hetchy, our main reservoir, located in Yosemite National Park, are down from previous years, but not disastrously so. With voluntary water-use reduction, we’re likely to have enough to drink this year at least. Things are a lot less promising, however, in rural California where towns tend to rely on groundwater for domestic use.

Shrinking water supplies don't just affect individual consumers here in this state; they affect everyone in the United States who eats, because 13.5% of all our agricultural products, including meat and dairy, as well as fruits and vegetables, come from California. Growing food requires prodigious amounts of water. In fact, farmland irrigation accounts for roughly 80% of all the water used by California's farms, businesses, and homes combined.

So how are California’s agricultural water supplies doing this year? The answer, sadly, is not very well. State regulators have already cut distribution to about a quarter of California’s irrigated acreage (about two million acres) by a drastic 95%. That’s right. A full quarter of the state’s farmlands have access to just 5% of what they would ordinarily receive from rivers and aqueducts. As a result, some farmers are turning to groundwater, a more easily exhausted source, which also replenishes itself far more slowly than rivers and streams. Some are even choosing to sell their water to other farmers, rather than use it to grow crops at all, because that makes more economic sense for them. As smaller farms are likely to be the first to fold, the water crisis will only enhance the dominance of major corporations in food production.

Meanwhile, we’ll probably be breaking out our N95 masks soon. Wildfire season has already begun — earlier than ever. On July 1st, the then-still-uncontained Salt fire briefly closed a section of Interstate 5 near Redding in northern California. (I-5 is the main north-south interstate along the West coast.) And that’s only one of the more than 4,500 fire incidents already recorded in the state this year.

Last year, almost 10,000 fires burned more than four million acres here, and everything points to a similar or worse season in 2021. Unlike Donald Trump, who famously blamed California’s fires on a failure to properly rake our forests, President Biden is taking the threat seriously. On June 30th, he convened western state leaders to discuss the problem, acknowledging that “we have to act and act fast. We’re late in the game here.” The president promised a number of measures: guaranteeing sufficient, and sufficiently trained, firefighters; raising their minimum pay to $15 per hour; and making grants to California counties under the Federal Emergency Management Agency’s BRIC (Building Resilient Infrastructure and Communities) program.

Such measures will help a little in the short term, but none of them will make a damn bit of difference in the longer run if the Biden administration and a politically divided Congress don't begin to truly treat climate change as the immediate and desperately long-term emergency it is.

Justice and generations

In his famous A Theory of Justice, the great liberal philosopher of the twentieth century John Rawls proposed a procedural method for designing reasonable and fair principles and policies in a given society. His idea: that the people determining such basic policies should act as if they had stepped behind a "veil of ignorance" and had lost specific knowledge of their own place in society. They'd be ignorant of their own class status, ethnicity, or even how lucky they'd been when nature was handing out gifts like intelligence, health, and physical strength.

Once behind such a veil of personal ignorance, Rawls argued, people might make rules that would be as fair as possible, because they wouldn’t know whether they themselves were rich or poor, black or white, old or young — or even which generation they belonged to. This last category was almost an afterthought, included, he wrote, “in part because questions of social justice arise between generations as well as within them.”

His point about justice between generations not only still seems valid to me, but in light of present-day circumstances radically understated. I don’t think Rawls ever envisioned a trans-generational injustice as great as the climate-change one we’re allowing to happen, not to say actively inducing, at this very moment.

Human beings have a hard time recognizing looming but invisible dangers. In 1990, I spent a few months in South Africa providing some technical assistance to an anti-apartheid newspaper. When local health workers found out that I had worked (as a bookkeeper) for an agency in the U.S. trying to prevent the transmission of AIDS, they desperately wanted to talk to me. How, they hoped to learn, could they get people living in their townships to act now to prevent a highly transmissible illness that would only produce symptoms years after infection? How, in the face of the all-too-present emergencies of everyday apartheid life, could they get people to focus on a vague but potentially horrendous danger barreling down from the future? I had few good answers and, more than 30 years later, South Africa has the largest HIV-positive population in the world.

Of course, there are human beings who’ve known about the climate crisis for decades — and not just the scientists who wrote about it as early as the 1950s or the ones who gave an American president an all-too-accurate report on it in 1965. The fossil-fuel companies have, of course, known all along — and have focused their scientific efforts not on finding alternative energy sources, but on creating doubt about the reality of human-caused climate change (just as, once upon a time, tobacco companies sowed doubt about the relationship between smoking and cancer). As early as 1979, the Guardian reports, an internal Exxon study concluded that the use of fossil fuels would certainly “cause dramatic environmental effects” in the decades ahead. “The potential problem is great and urgent,” the study concluded.

A problem that was “great and urgent” in 1979 is now a full-blown existential crisis for human survival.

Some friends and I were recently talking about how ominous the future must look to the younger people we know. “They are really the first generation to confront an end to humanity in their own, or perhaps their children’s lifetimes,” I said.

“But we had The Bomb,” a friend reminded me. “We grew up in the shadow of nuclear war.” And she was right of course. We children of the 1950s and 1960s grew up knowing that someone could “press the button” at any time, but there was a difference. Horrifying as is the present retooling of our nuclear arsenal (going on right now, under President Biden), nuclear war nonetheless remains a question of “if.” Climate change is a matter of “when” and that when, as anyone living in the Northwest of the United States and Canada should know after these last weeks, is all too obviously now.

It’s impossible to overstate the urgency of the moment. And yet, as a species, we’re acting like the children of indulgent parents who provide multiple “last chances” to behave. Now, nature has run out of patience and we’re running out of chances. So much must be done globally, especially to control the giant fossil-fuel companies. We can only hope that real action will emerge from November’s international climate conference. And here in the U.S., unless congressional Democrats succeed in ramming through major action to stop climate change before the 2022 midterms, we’ll have lost one more last, best chance for survival.

Rebecca Gordon writes regularly for TomDispatch (where this article originated). She teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Copyright ©2021 Rebecca Gordon — distributed by Agence Global

—————-

Released: 12 July 2021

Word Count: 2,683

—————-

Karen J. Greenberg, “America’s accountability problem”

July 8, 2021 - TomDispatch

America has an accountability problem. In fact, if the Covid-19 disaster, the January 6th Capitol attack, and the Trump years are any indication, the American lexicon has essentially dispensed with the term “accountability.”

This should come as no surprise. After all, there’s nothing particularly new about this. In the Bush years, those who created a system of indefinite offshore detention at Guantánamo Bay, Cuba, those who implemented a CIA global torture program and the National Security Agency’s warrantless surveillance policy, not to mention those who purposely took us to war based on lies about nonexistent Iraqi weapons of mass destruction, were neither dismissed, sanctioned, nor punished in any way for obvious violations of the law. Nor has Congress passed significant legislation of any kind to ensure that all-encompassing abuses like these will not happen again.

Now, early in the Biden era, any determination to hold American officials responsible for such past wrongdoing, even the president who helped launch an assault on the Capitol, seems little more than a fantasy. It may be something to discuss, rail against, or even make promises about, but not actually reckon with — not if you’re either a deeply divided Congress or a Department of Justice that has compromised itself repeatedly in recent years. Under other circumstances, of course, those would be the two primary institutions with the power to pursue genuine accountability in any meaningful way for extreme and potentially illegal government acts.

Today, if thought about at all, accountability — whether in the form of punishment for misdeeds or meaningful reform — has been reduced to a talking point. With that in mind, let’s take a moment to consider the Biden administration’s approach to accountability so far.

How we got here

Even before Donald Trump entered the Oval Office, the country was already genuinely averse to accountability. When President Obama took office in January 2009, he faced the legacy of the George W. Bush administration's egregious disregard for laws and norms in its extralegal post-9/11 war on terror. From day one of his presidency, Obama made clear that he found his predecessor's policies unacceptable by both acknowledging and denouncing those crimes. But he insisted that they belonged to the past.

Fearing that the pursuit of punishment would involve potentially ugly encounters with former officials and would seem like political retribution in a country increasingly divided and on edge, he clearly decided that it wouldn’t be worth the effort. Ultimately, as he said about “interrogations, detentions, and so forth,” it was best for the nation to “look forward, as opposed to looking backward.”

True to the president’s word, the Obama administration refused to hold former officials responsible for violations of fundamental constitutional and legal issues. Among those who escaped retrospective accountability were Vice President Dick Cheney, who orchestrated the invasion of Saddam Hussein’s Iraq based on lies; the lawyer in the Justice Department’s Office of Legal Counsel, John Yoo, who, in his infamous “Torture Memos,” justified the “enhanced interrogation” of war-on-terror prisoners; and Secretary of Defense Donald Rumsfeld, who created a Bermuda triangle of injustice at Guantánamo Bay, Cuba. In terms of reform, Obama did ensure a degree of meaningful change, including decreeing an official end to the CIA torture of prisoners of war. But too much of what had happened remained unaddressed and lay in wait for abuse at the hands of some irresponsible future president.

As a result, many of the sins that were at the heart of the never-ending response to the 9/11 attacks have become largely forgotten history, leaving many potential crimes unaddressed. And even more sadly, the legacy of accountability’s demise only continues. Biden and his team entered office facing a brand-new list of irregularities and abuses by high-ranking officials, including President Trump.

In this case, the main events demanding accountability had occurred on the domestic front. The January 6th insurrection, the egregious mishandling of the pandemic, the interference in the 2020 presidential election, and the use of the Department of Justice for political ends all awaited investigation after inauguration day. At the outset, the new government dutifully promised that some form of accountability would indeed be forthcoming. On January 15th, House Speaker Nancy Pelosi announced that she planned to convene an independent commission to thoroughly investigate the Capitol riots, later pledging to look into the “facts and causes” of that assault on Congress.

Attorney General nominee Merrick Garland similarly promised, “If confirmed, I will supervise the prosecution of white supremacists and others who stormed the Capitol on January 6th.” Meanwhile, signaling some appetite for holding his predecessor accountable, during the presidential campaign, Joe Biden had already ruled out the possibility of extending a pardon to Donald Trump. In that way, he ensured that, were he elected, numerous court cases against the president and his Trump Organization would be open to prosecution — even as Noah Bookbinder, the executive director of Citizens for Responsibility and Ethics in Washington, recently suggested, reviving of the obstruction of justice charges that had been central to the Mueller investigation of the 2016 presidential election.

Reluctance in the halls of accountability

Six months after Joe Biden took office, there has been no firm movement toward accountability by his administration. On the question of making Donald Trump and his allies answer for their misdeeds, the appetite of this administration so far seems wanting, notably, for example, when it comes to the role the president may have played in instigating the Capitol attack. Sadly, legislation to create the independent commission Pelosi had called for to investigate that insurrectionary moment passed the House, but fell victim last month to the threat of a filibuster and was blocked in the Senate. (Last week, largely along party lines, the House voted to create a select committee to investigate the insurrection.)

Trump’s disastrous mishandling of the pandemic, potentially responsible for staggering numbers of American deaths, similarly seems to have fallen into the territory of unaccountability. The partisan divisions of Congress continue to stall a Covid-19 investigation. National security expert and journalist Peter Bergen, for instance, called for a commission to address the irresponsible way the highest levels of government dealt with the pandemic, but the idea failed to gain traction. Instead, the focus has turned to the question of whether or not there was malfeasance at a Chinese government lab in Wuhan.

It matters not at all that numerous journalists, including Lawrence Wright, Michael Lewis, and Nicholson Baker, have impressively documented the mishandling of the pandemic here. Such disastrous acts included early denials of the lethality of the disease, the disavowal of pandemic preparedness plans, the dismantling of the very government office meant to respond to pandemics, the presidential promotion of quack cures, a disregard for wearing masks early on, and so much else, all of which contributed to a generally chaotic governmental response, which ultimately cost tens of thousands of lives.

In truth, a congressional investigation into either the Capitol riots or the Trump administration’s mishandling of the pandemic might never have led to actual punitive accountability. After all, the 9/11 Commission, touted as the gold standard for such investigations, did nothing of the sort. While offering a reputable history of the terrorist threat that resulted in the attacks of September 11, 2001, and a full-scale summary of government missteps and lapses that led up to that moment, the 9/11 report did not take on the mission of pointing fingers and demanding accountability.

In a recent interview with former New York Times reporter Philip Shenon, whose 2008 book The Commission punctured that group’s otherwise stellar reputation, Just Security editor Ryan Goodman offered this observation: “[An] important lesson from your book is the conscious tradeoff that the 9/11 Commission members made in prioritizing having a unanimous final report which sacrificed their ability to promote the interests of accountability (such as identifying and naming senior government officials whose acts or omissions were responsible for lapses in U.S. national security before the attack).”

Shenon added that the tradeoff between accountability and unanimity was acknowledged by commission staff members frustrated by the absence of what they thought should have been the report’s “most important and controversial” conclusions. In other words, when it came to accountability, the 9/11 Report proved an inadequate model at best. Still, even its version of truth-telling proved too much for congressional Republicans facing a similar commission on the events of January 6th.

Note, however, that the 9/11 Commission did lead to movement along another path of accountability: reform. In its wake came certain structural changes, including a bolstering of the interagency process for sharing information and the creation of the Office of the Director of National Intelligence.

No such luck today. And signs of the difficulty of facing any kind of accountability are now evident inside the Department of Justice (DOJ), too. Despite initial rhetoric to the contrary from Attorney General Merrick Garland, the department has shown little appetite for redress when it comes to those formerly in the highest posts. And that reality should bring to mind the similar reluctance of Barack Obama, the president who had earlier nominated Garland, unsuccessfully, to the Supreme Court.

For anyone keeping a scorecard of DOJ actions regarding Trump-era excesses, the record is slim indeed. The department did, at least, abandon any possible prosecution of former National Security Advisor John Bolton for supposedly disclosing classified information in his memoir of his time in the Trump administration. But Garland also announced that he would not pursue several matters that could have brought to light information about President Trump’s abuse of power.

In May, for instance, the department appealed a court order requiring the release of the full version of a heavily redacted DOJ memo that had advised then-Attorney General Bill Barr that the evidence in the Mueller Report was “not sufficient to support a conclusion beyond a reasonable doubt that the President violated the obstruction-of-justice statutes.” In fact, the Mueller Report did not exonerate Trump, as Mueller himself would later testify in Congress and as hundreds of federal prosecutors would argue in a letter written in the wake of the report’s publication, saying, “Each of us believes that the conduct of President Trump described in Special Counsel Robert Mueller’s report would… result in multiple felony charges for obstruction of justice.”

Adding fuel to the fire of disappointment, Garland pulled back from directly assessing fault lines inside the Department of Justice when it came to its independence from partisan politics. Instead, he turned over to the DOJ inspector general any further investigation into Trump’s politicization of the department.

The path forward — or not?

These are all discouraging signs, yet there’s still time to strengthen our faltering democracy by reinstating the idea that abuses of power and violations of the law — from inside the White House, no less — are not to be tolerated. Even without an independent commission looking into January 6th or the DOJ prosecuting anyone, some accountability should still be possible. (After all, it was a New York State court that recently suspended Rudy Giuliani’s license to practice law.)

On June 24th, Nancy Pelosi announced at a news conference that a select Congressional committee, even if not an independent 9/11-style commission, would look into the Capitol attack. That committee, she added, will “establish the truth of that day and ensure that an attack of that kind cannot happen and that we root out the causes of it all.” True, she didn’t specify whether accountability and reform would be part of that committee’s responsibilities, but neither goal is off the table.

And Pelosi’s fallback plan to convene a House select committee could still have an impact. After all, remember the Watergate committee of the Nixon era. It, too, was a select committee, and its investigation into abuses of power in the Watergate affair helped bring about President Nixon’s resignation from office and helped spark or support court cases against many of his partners in crime. Similarly, the 1975 Church Committee investigation into the abuses of the intelligence community, among them the FBI’s notorious counterintelligence program, COINTELPRO, was also a select-committee project. It led to significant barriers against future abuses — including a ban on assassinations and a host of “good government” bills.

Pelosi rightly insists that she’s intent on pursuing an investigation into the Capitol attack. Adam Schiff and Jerry Nadler are similarly determined to investigate the government seizure of Internet communications. Local court cases against Trump, Giuliani, and others will, it appears, continue apace.

Through such efforts, perhaps the potentially shocking facts could still see the light of day. Continuing such quests may yield only imperfect accountability, particularly in a country growing ever more partisan. But above and beyond the immediate importance of giving the public — and history — a reliable narrative of recent events, it’s important to let Americans know that accountability is still a crucial part of our democracy, as are the laws and norms it aims to protect. Otherwise, this country will have to face a new reality: that we are now living in the age of impunity.

Karen J. Greenberg writes regularly for TomDispatch (where this article originated). She is the director of the Center on National Security at Fordham Law and author of the forthcoming Subtle Tools: The Dismantling of Democracy from the War on Terror to Donald Trump (Princeton University Press, August). Julia Tedesco helped with research for this piece.

Copyright ©2021 Karen J. Greenberg — distributed by Agence Global

—————-

Released: 08 July 2021

Word Count: 2,158

—————-

Alfred McCoy, “America’s drug wars: fifty years of reinforcing racism”

July 6, 2021 - TomDispatch

Fifty years ago, on June 17, 1971, President Richard Nixon stood before the White House press corps, staffers at his side, to announce “a new, all-out offensive” against drug abuse, which he denounced as “America’s public enemy number one.” He called on Congress to contribute $350 million for a worldwide attack on “the sources of supply.” The first battle in this new drug war would be fought in South Vietnam where, Nixon said, “a number of young Americans have become addicts as they serve abroad.”

While the president was declaring his war on drugs, I was stepping off a trans-Pacific flight into the searing tropical heat of Saigon, the South Vietnamese capital, to report on the sources of supply for the drug abuse that was indeed sweeping through the ranks of American soldiers fighting this country’s war in Vietnam.

As I would soon discover, the situation was far worse than anything Nixon could have conveyed in his sparse words. Heroin vials littered the floors of Army barracks. Units legendary for their heroism in World War II, like the 82nd Airborne, were now known as the “jumping junkies.” A later survey found that more than a third of all GIs fighting the Vietnam War “commonly used” heroin. Desperate to defeat this invisible enemy, the White House was now about to throw millions of dollars at this overseas drug war, funding mass urinalysis screening for every homeward-bound GI and mandatory treatment for any who tested positive for drugs.

Even that formidable effort, however, couldn’t defeat the murky politics of heroin, marked by a nexus of crime and official collusion that made mass drug abuse among GIs possible. After all, in the rugged mountains of nearby Laos, Air America, a company run by the CIA, was transporting opium harvested by tribal farmers who were also serving as soldiers in its secret army. The commander of the Royal Lao Army, a close ally, then operated the world’s largest illicit lab, turning raw opium into refined heroin for the growing numbers of GI users in neighboring Vietnam. Senior South Vietnamese commanders colluded in the smuggling and distribution of such drugs to GIs in bars, in barracks, and at firebases. In both Laos and South Vietnam, American embassies ignored the corruption of their local allies that was helping to fuel the traffic.

Nixon’s drug war

As sordid as Saigon’s heroin politics were, they would pale when compared to the cynical deals agreed to in Washington over the next 30 years that would turn the drug war of the Vietnam era into a political doomsday machine. Standing alongside the president on that day when America’s drug war officially began was John Ehrlichman, White House counsel and Nixon confidante.

As he would later bluntly tell a reporter, 

“The Nixon White House had two enemies: the antiwar left and black people… We knew we couldn’t make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news.”

And just in case anyone missed his point, Ehrlichman added, “Did we know we were lying about the drugs? Of course, we did.”

To grasp the full meaning of this admission, you need to begin with the basics: the drug war’s absolute, unqualified, irredeemable failure. Just three pairs of statistics can convey the depth of that failure and the scope of the damage the war has done to American society over the past half-century:

• Despite the drug war’s efforts to cut supplies, worldwide illicit opium production rose 10-fold — from 1,200 tons in 1971 to a record 10,300 tons in 2017.

• Reflecting its emphasis on punishment over treatment, the number of people jailed for drug offenses would also grow 10-fold from 40,900 in 1980 to 430,900 in 2019.

• Finally, instead of reducing domestic use, the drug war actually helped stimulate a 10-fold surge in the number of American heroin users from just 68,000 in 1970 to 745,000 in 2019.

In addition, the drug war has had a profound impact on American society by perpetuating, even institutionalizing, racial disparities through the raw power of the police and prisons. Remember that the Republican Party saw the Voting Rights Act of 1965, which ended decades of Jim Crow disenfranchisement for Blacks in the deep South, as a rare political opportunity. In response, Nixon and his men began developing a two-part strategy for winning over white voters in the South and blunting the Democratic advantage with Black voters nationwide.

First, in the 1970 midterm elections, the Republicans began pursuing a “Southern strategy” of courting disgruntled white-supremacist voters in the South in a successful attempt to capture that entire region politically. Three years later, they launched a relentless expansion of the drug war, policing, and prisons. In the process, they paved the way for the mass incarceration of African Americans, denying them the vote not just as convicts but, in 15 states, for life as ex-convicts. Pioneering this cunning strategy was New York’s Republican governor Nelson Rockefeller, who got the state legislature to pass harsh mandatory penalties of 15 years to life for petty drug possession. Those penalties raised the number of New Yorkers imprisoned on drug charges from 470 in 1970 to 8,500 in 1999, 90% of them African-American or Latinx.

Such mass incarceration moved voters from urban Democratic bailiwicks to rural prisons where they were counted in the census, but otherwise disenfranchised, giving a bit of additional help to the white Republican vote in upstate New York — a winning strategy Republicans elsewhere would soon follow. Not only did the drug war let conservatives shave opposition vote tallies in close elections, but it also dehumanized African Americans, justifying repressive policing and mass incarceration.

None of this was preordained. Rather, it was the result of a succession of political deals made during three presidencies — that of Nixon, who started it; of Ronald Reagan, whose administration enacted draconian punishments for drug possession; and of the Democrat Bill Clinton, who expanded the police and prisons to enforce those very drug laws. After remaining remarkably constant at about 100 prisoners per 100,000 population for more than 50 years, the U.S. incarceration rate started climbing relentlessly, reaching 293 in 1990, just after the end of Reagan’s term, and 464 in 2000, at the end of Clinton’s. It peaked at 760 in 2008 — with a racial bias that resulted in nothing less than the “mass incarceration” of African Americans.

Reagan domesticates the drug war

While Nixon fought his war largely on foreign battlefields trying, and failing, to stop narcotics at their source, the next Republican president, Ronald Reagan, fully domesticated the drug war through ever harsher penalties for personal use and a publicity campaign that made abstinence a moral virtue and indulgence a fiercely punishable vice. Meanwhile, he also signaled clearly that he was determined to pursue Nixon’s Southern strategy by staging a major 1980 election campaign rally in Neshoba County, Mississippi, where three civil rights workers had previously been murdered.

Taking office in 1981, Reagan found, to his surprise, that reviving the drug war at home had little public support, largely because the outgoing Democratic administration had focused successfully on drug treatment rather than punishment. So, First Lady Nancy Reagan began crisscrossing the country, while making TV appearances with choruses of cute kids wearing “Just Say No” T-shirts. Even after four years of the First Lady’s campaign and the simultaneous spread of crack cocaine and cocaine powder in cities and suburbs nationwide, only about 2% of the electorate felt that drug abuse was the nation’s “number one problem.”

Then personal tragedy provided Reagan with the perfect political opportunity. In June 1986, just two days after being drafted by the NBA’s Boston Celtics, college basketball sensation Len Bias collapsed in his dorm at the University of Maryland from a fatal cocaine overdose. Five months later, President Reagan would sign the Anti-Drug Abuse Act, aka the “Len Bias Law.” It would lead to a quantum expansion of the domestic drug war, including a mandatory minimum sentence of five years just for the possession of five grams of crack cocaine and a revived federal death penalty for traffickers.

It also put into law a racial bias in imprisonment that would prove staggering: a 100:1 sentencing disparity between those convicted of possessing crack-cocaine (used mainly by inner-city Blacks) and those using cocaine powder (favored by suburban whites) — even though there was no medical difference between the two drugs. To enforce such tough penalties, the law also expanded the federal anti-drug budget to a massive $6.5 billion.

In signing that law, Reagan would pay special tribute to the first lady, calling her “the co-captain in our crusade for a drug-free America” and in the fight against “the purveyors of this evil.” And the two of them had much to take credit for. After all, by 1989, an overwhelming 64% of Americans had come to feel that drugs were the nation’s “number one problem.” Meanwhile, thanks largely to the Anti-Drug Abuse Act, the number of Americans jailed for nonviolent drug offenses soared from 50,000 in 1980 to 400,000 in 1997. Driven by drug arrests, by 1995 nearly one-third of all African-American males between 20 and 29 were either in prison or on parole.

Clinton’s all-too-bipartisan drug war

If those two Republican presidents were adept at portraying partisan anti-drug policies as moral imperatives, their Democratic successor, Bill Clinton, proved adept at getting himself reelected by picking up their seductive rhetoric. Under his administration, a racialized drug policy, with its disenfranchisement and denigration of African Americans, would become fully bipartisan.

In 1994, two years after being elected president and facing a Republican resurgence led by Newt Gingrich that would cost him control of Congress that November, Clinton, desperate for something he could call a legislative accomplishment, tacked hard right to support the Violent Crime Control Act of 1994. It would prove the largest law-enforcement initiative in American history: nearly $19 billion for 100,000 new cops to sweep the streets for drug offenders and a massive prison-expansion program to house those who would now be sentenced to life after three criminal convictions (“three strikes”).

A year later, when the non-partisan U.S. Sentencing Commission recommended that the 100:1 disparity in penalties for crack-cocaine and cocaine powder be abolished, along with its blatant racial bias, Clinton flatly rejected the advice, signing instead Republican-sponsored legislation that maintained those penalties. “I am not,” he insisted, “going to let anyone who peddles drugs get the idea that the cost of doing business is going down.”

The country’s Black political leaders were eloquent in their condemnation of this political betrayal. The Reverend Jesse Jackson, a former Democratic presidential candidate, claimed Clinton knew perfectly well that “crack is code for black” and labelled the president’s decision “a moral disgrace” by a man “willing to sacrifice young black youth for white fear.” The Congressional Black Caucus would similarly denounce the sentencing disparity as “a mockery of justice.”

As they predicted all too accurately, the relentless rise of Black incarceration only accelerated. In the five years following passage of Clinton’s omnibus crime bill, the country added 204 prisons and its inmate population shot up by a mind-boggling 28% to 1,305,300. Of those, nearly half (587,300) were Black, though African Americans made up only 13% of the country’s population.

Facing a tough reelection campaign in 1996, Clinton again worked with hard-right congressional Republicans to pass the Personal Responsibility and Work Opportunity Act, which, as he put it, brought an “end to welfare as we know it.” With that law’s work requirement for welfare, even as unemployment among Black residents of cities like Chicago (left behind by industry) hit 20% to 25%, youth in inner cities across America found that street-level drug dealing was fast becoming their only opportunity. In effect, the Clintons gained short-term political advantage by doing long-term social and economic damage to a core Democratic constituency, the African-American community.

Reviving Jim Crow’s racial stereotypes

Nonetheless, during his 1996 reelection campaign, Clinton trumpeted such dubious legislative achievements. Speaking at a campaign rally in New Hampshire, for instance, Hillary Clinton celebrated her husband’s Violent Crime Control Act for taking back the streets from murderous minority teenagers. “They are often the kinds of kids that are called ‘super-predators,’” Clinton said. “No conscience, no empathy. We can talk about why they ended up that way, but first we have to bring them to heel.”

The term “super-predator” had, in fact, originated with a Princeton University political scientist, John DiIulio, who described his theory to the first couple during a 1995 White House working dinner on juvenile crime. In an article for a neo-conservative magazine that November, the academic trumpeted his apocalyptic analysis. Based solely on the spottiest of anecdotal evidence, he claimed that “black inner-city neighborhoods” would soon fall prey to such “super predators” — a new kind of juvenile criminal marked by “impulsive violence, the vacant stares, and the remorseless eyes.” Within five years, he predicted, there would be 30,000 “more murderers, rapists, and muggers on the streets” who would “place zero value on the lives of their victims, whom they reflexively dehumanize as just so much worthless ‘white trash.’” This rising demographic tide, he warned, would soon “spill over into upscale central-city districts, inner-ring suburbs, and even the rural heartland.”

By the way, the truly significant part of Hillary Clinton’s statement based on DiIulio’s “analysis” was that phrase about bringing super-predators to heel. A quick quiz. Who or what does one “bring to heel”: (a.) a woman, (b.) a man, or (c.) a child? Answer: (d.) None of the above.

That term is used colloquially for controlling a leashed dog. By implicitly referring to young Black males as predators and animals, Clinton was tapping into one of America’s most venerable and virulent ethnic stereotypes: the Black “buck” or “brute.” The Jim Crow Museum of Racist Memorabilia at Ferris State University in Michigan reports that “the brute caricature portrays black men as innately savage, animalistic, destructive, and criminal — deserving punishment, maybe death… Black brutes are depicted as hideous, terrifying predators.”

Indeed, Southern fiction of the Jim Crow era featured the “Black brute” as an animal predator whose natural prey was white women. In words strikingly similar to those DiIulio and Clinton would later use for their super-predator, Thomas Dixon’s influential 1905 novel The Clansman: A Historical Romance of the Ku Klux Klan described the Black brute as “half child, half animal… a being who, left to his will, roams at night and sleeps in the day, whose speech knows no word of love, whose passions, once aroused, are as the fury of the tiger.” When turned into a movie in 1915 as The Birth of a Nation (the first film ever screened in the White House), it depicted a Black man’s animalistic rape of a virtuous white woman and reveled in the Klan’s retribution by lynching.

In effect, the rhetoric about “super-predators” revived the most virulent stereotype from the Jim Crow lexicon. By the end of President Clinton’s term in 2000, nearly every state in the nation had stiffened its laws on juveniles, setting aside family courts and sending young, mainly minority, offenders directly to adult prisons for long sentences.

Of course, the predicted wave of 30,000 young super-predators never happened. Instead, violent juvenile crime was already declining when Hillary Clinton gave that speech. By the time President Clinton’s term ended in 2001, the juvenile homicide rate had fallen well below its level in 1985.

Amazingly, it would be another 20 years before Hillary Clinton was compelled to confront the meaning of those freighted words of hers. While she was speaking to a donors’ meeting in South Carolina during her 2016 presidential campaign, Ashley Williams, a young Black activist, stood up in the front row and unfurled a small banner that read: “We have to bring them to heel.” Speaking calmly, she asked: “Will you apologize to black people for mass incarceration?” And then she added, “I am not a super-predator, Hillary Clinton.”

When Clinton tried to talk over her, she insisted: “I know that you called black people super-predators in 1994.” As the Secret Service hurried that young woman out of the room amid taunts from the largely white audience, Clinton announced, with a palpable sense of relief, “Okay, back to the issues.”

In its report on the incident, the Washington Post asked Clinton for a comment. In response, she offered the most unapologetic of apologies, explaining that, back in 1994, she had been talking about “violent crime and vicious drug cartels and the particular danger they pose to children and families.”

“As an advocate, as first lady, as senator, I was a champion for children,” she added, though admitting as well that, “looking back, I shouldn’t have used those words.”

That was it. No mention of mass incarceration. No apology for using the power of the White House pulpit to propagate the most virulent of racial stereotypes. No promises to undo all the damage she and her husband had caused. Not surprisingly, in November 2016, the African-American turnout in 33 states — particularly in the critical swing states of Florida, Michigan, Pennsylvania, and Wisconsin — was markedly down, costing her the election.

The burden of this past

As much as both Republicans and Democrats might wish us to forget the costs of their deals, this tragic past is very much part of our present. In the 20 years since the drug war took final form under Clinton, politicians have made some relatively inconsequential reforms. In 2010, Congress made a modest cut in the sentencing disparity between the two kinds of cocaine that reduced the prison population by an estimated 1,550 inmates; Barack Obama pardoned 1,700 drug offenders; and Donald Trump signed the First Step Act, which released 3,000 prisoners. Add up all those “reforms” and they cover only about 1.5% of those now in prison for drug offenses — just the tiniest drop of mercy in a vast ocean of misery.

So, even 50 years later, this country is still fighting a war on drugs and on non-violent drug users. Thanks to its laws, petty drug possession is still a felony with heavy penalties. As of 2019, this country’s prisons remained overcrowded with 430,900 people convicted of drug crimes, while drug offenders represented 46% of all those in federal penitentiaries. In addition, the U.S. still has the world’s highest incarceration rate at 639 prisoners per 100,000 population (nearly double Russia’s), with 1,380,400 people imprisoned, of whom 33% are Black.

So many decades later, the drug war’s mass incarceration still denies millions of African Americans the right to vote. As of 2020, 48 states refused their convicts the vote, while 34 states imposed a range of restrictions on ex-convicts, effectively denying suffrage to about 2.2 million Blacks, or 6.3% of all African-American adults.

Recent challenges have made more visible the drug war’s once largely invisible mechanisms for denying African Americans their rightful political power as a community. In a 2018 plebiscite, Florida voters restored electoral rights to that state’s 1.4 million ex-convicts, including 400,000 African Americans. Almost immediately, however, Republican governor Ron DeSantis required that 800,000 of those felons pay whatever court costs and fines they still owed before voting — a decision he successfully defended in federal court just before the 2020 presidential election. The effect of such determined Republican efforts meant that fewer than 8% of Florida’s ex-convicts were able to vote.

But above all, Black male drug users are still stigmatized as dangerous predators, as we all saw in the recent trial of former Minneapolis police officer Derek Chauvin, whose defense tried to justify his kneeling on George Floyd’s neck for nine minutes by pointing out that an autopsy had found opioids in the victim’s blood. And in March 2020, a paramilitary squad of Louisville police broke down an apartment door with a battering ram on a no-knock drug raid aimed at a suspected Black drug dealer and wound up killing his sleeping ex-girlfriend, medical worker Breonna Taylor.

Maybe now, half a century later, it’s finally time to end the war on drug users — repeal the heavy penalties for possession; pardon the millions of nonviolent offenders; replace mass incarceration with mandatory drug treatment; restore voting rights to convicts and ex-convicts alike; and, above all, purge those persistent stereotypes of the dangerous Black male from our public discourse and private thoughts.

If only…

 

Alfred W. McCoy writes regularly for TomDispatch. He is the Harrington professor of history at the University of Wisconsin-Madison. He is the author most recently of In the Shadows of the American Century: The Rise and Decline of U.S. Global Power (Dispatch Books). His latest book (to be published in October by Dispatch Books) is To Govern the Globe: World Orders and Catastrophic Change.

Copyright ©2021 Alfred W. McCoy — distributed by Agence Global

—————-

Released: 06 July 2021

Word Count: 3,412

—————-
