Agence Global

Karen J. Greenberg, “Trump’s document grab”

September 20, 2022 - TomDispatch

Thanks to Donald Trump, secrecy is big news these days. However, as political pundits and legal experts race to expose the layers of document-related misdeeds previously buried at his Mar-a-Lago estate, one overlooked reality looms large: despite all the coverage of the thousands of documents Trump took with him when he left the White House, there’s been next to no acknowledgment that such a refusal to share information has been part and parcel of the Washington scene for far longer than the current moment.

The hiding of information by the former president, repeatedly described as “unprecedented” behavior, is actually part of a continuum of withholding that’s been growing at a striking pace for decades. By the time Donald Trump entered the Oval Office, the stage had long been set for removing information from the public record in an alarmingly broad fashion, a pattern that he would take to new levels.

The “secrecy president”

As recent history’s exhibit number one, this country’s global war on terror, launched soon after the 9/11 attacks, was largely defined and enabled by the withholding of information — including secret memos, hidden authorizations, and the use of covert methods. During President George W. Bush’s first term in office, government lawyers and officials regularly withheld information about their actions and documents related to them from public view, both at home and abroad.

Those officials, for instance, legalized the brutal interrogations of war-on-terror prisoners, while conveniently replacing the word “torture” with the phrase “enhanced interrogation techniques” and so surreptitiously evading a longstanding legal ban on the practice. The CIA then secretly utilized those medieval techniques at “black sites” around the world where its agents held suspected terrorists. It later destroyed the tapes made of those interrogations, erasing the evidence of what its agents had done. On the home front, in a similarly secretive fashion, unknown to members of Congress as well as the general public, President Bush authorized the National Security Agency to set up an elaborate and far-reaching program of warrantless surveillance on Americans and others inside the United States.

Consider that the launch of an era of enhanced secrecy techniques. No wonder Bush earned the moniker of the “secrecy president.” Only weeks after the 9/11 attacks, for instance, he put in place strict guidelines about who could brief Congress on classified matters, while instituting new, lower standards for transparency. He even issued a signing statement rebuking Congress for requiring reports “in written form” on “significant anticipated intelligence activities or significant intelligence failure.” To emphasize his sense of righteousness in defying calls for information, he insisted on the “president’s constitutional authority to… withhold information” in cases of foreign relations and national security. In a parallel fashion, his administration put new regulations in place limiting the release of information under the Freedom of Information Act (FOIA).

President Obama also withheld information when it came to war-on-terror efforts. Notably, his administration shrouded in secrecy the use of armed drones to target and kill suspected terrorists (and civilians) in Libya, Pakistan, Somalia, and Yemen. Official reports omitted reliable data about who was killed, where the killings had taken place, or the number of civilian casualties. As the American Civil Liberties Union concluded, administration reporting on civilian harm fell “far short of the standards for transparency and accountability needed to ensure that the government’s targeted killing program is lawful under domestic and international law.”

And well beyond the war-on-terror context, the claim to secrecy has become a government default mechanism. Tellingly, the number of classified documents soared to unimaginable heights in those years. As the National Archives reports, in 2012, documents with classified markings — including “top secret,” “secret,” and “confidential” — reached a staggering 95 million. And while the overall numbers had declined by 2017, the extent of government classification then and now remains alarming.

Erasing the record before it’s created

President Trump’s document theft should be understood, then, as just another piece of the secrecy matrix.

Despite his claim — outrageous, but perhaps no more than so many other claims he made — to being the “most transparent” president ever, he turned out to be a stickler for withholding information on numerous fronts. Taking the war-on-terror behavioral patterns of his predecessors to heart, he expanded the information vacuum well beyond the sphere of war and national security to the purely political and personal realms. As a start, he refused to testify in the Mueller investigation into the 2016 presidential election. On a more personal note, he also filed suit to keep his tax records secret from Congress.

In fact, during his time in office, Trump virtually transformed the very exercise of withholding information. In place of secrecy in the form of classification, he developed a strategy of preventing documents and records from even being created in the first place.

Three months into his presidency, Trump announced that the White House would cease to disclose its visitor logs, citing the supposed risk to both national security and presidential privacy. In addition to hiding the names of those with whom he met, specific high-level meetings took place in an unrecorded fashion so that even the members of his cabinet, let alone the public, would never know about them.

As former National Security Advisor John Bolton and others have attested, when it came to meetings with Russian President Vladimir Putin, Trump even prohibited note-taking. In at least five such meetings over the course of his first two years in office, he consistently excluded White House officials and members of the State Department. On at least one occasion, he even confiscated notes his interpreter took to ensure that there would be no record.

Congress, too, was forbidden access to information under Trump. Lawyers in the Department of Justice (DOJ) drafted memos hardening policies against complying with congressional requests for information in what former DOJ lawyer Annie Owens has described as “a policy that approached outright refusal” to share information. In addition, the Trump administration was lax or even dismissive when it came to compliance with the production of required reports on national security matters. Note as well the rollback of transparency policies, as in the decision to reverse an Obama-era policy of making public the number of nuclear weapons the U.S. possessed.

But don’t just blame Donald Trump. Among the most recent examples of erasing evidence, it’s become clear that the Secret Service deleted the text messages of its agents around the president from the day before and the day of the January 6th insurrection. So, too, the phone records of several top Immigration and Customs Enforcement officials were wiped clean when they left office in accordance with directives established early in the Trump presidency. Similarly, the phone records of top Department of Defense and Department of Homeland Security officials were scrapped. In other words, recent reports on the way Trump regularly shredded documents, flushed them down the White House toilet, and generally withheld presidential papers — even classified documents, as revealed during the Mar-a-Lago search — were of a piece with a larger disdain on the part of both the president and a number of his top officials for sharing information.

Erasing the record in one fashion or another became the Trump administration’s default setting, variations on a theme hammered out by his predecessors and taken to new levels on his watch.

A perpetual right to secrecy?

Admittedly, before Trump arrived on the scene, there were some efforts to reverse this pattern, but in the long run they proved anemic. Barack Obama arrived at the White House in January 2009 acknowledging the harm caused by excessive government secrecy. Emphasizing transparency’s importance for accountability, informed public debate, and establishing trust in government, the new president issued an executive order on his first full day in office emphasizing the importance of “transparency and open government” and pledging to create “an unprecedented level of openness in government.”

Nearly a year later, he followed up with another executive order setting out a series of reforms aimed at widening the parameters for information-sharing. That order tightened guidelines around classification and broadened the possibilities for declassifying information. “Our democratic principles require that the American people be informed of the activities of their government,” it read. Six years later, Obama’s Director of National Intelligence James Clapper produced a report on the “principles of Intelligence transparency for the intelligence community” and a “transparency implementation plan” that again aimed at clarifying the limits, as well as the purposes, of secrecy.

And Obama’s efforts did indeed make some headway. As Steven Aftergood, former director of the Federation of American Scientists, concluded, “The Obama administration broke down longstanding barriers to public access and opened up previously inaccessible records of enormous importance and value.” Among other things, Aftergood reported, Obama “declassified the current size of the U.S. nuclear arms arsenal for the first time ever,” as well as thousands of the president’s daily briefs, and established a National Declassification Center.

Still, in the end, the progress proved disappointing. As Washington Post columnist Margaret Sullivan put it, the Obama administration’s record on transparency was among “the most secretive” in our history. She also castigated the president’s team for “setting new records for stonewalling or rejecting Freedom of Information Requests.” As an Associated Press analysis of federal data verified, the Obama administration did indeed set records in some years when it came to not granting those FOIA requests.

Executive distaste for sharing information is certainly nothing new and has often been linked, as during the war on terror, to misrepresentations, misdeeds, and outright deceit. After all, half a century ago, the administration of President Richard Nixon (of Watergate fame) defended the right to withhold information from the public as an effective way of covering up the American role in Vietnam. Those withheld materials, eventually published by the New York Times, showed that, over the course of four administrations, the national security state had misled the public about what the U.S. was doing in Vietnam, including hiding the secret bombing of neighboring Cambodia and Laos.

Still, let’s recognize what Donald Trump has, in fact, done. Though no longer president, he’s now taken the withholding of government information well beyond the borders of the government itself and deep into his private realm. In doing so, he’s set a dangerous precedent, one that brought the FBI to his doorstep (after months of attempts to access the documents in less intrusive ways). The challenge now is to address not just Trump’s clumsy efforts to unilaterally privatize a government practice, but the systemic overreach officials have relied on for decades to withhold staggering amounts of information from the public.

The Biden administration is alert to this issue. Notably, President Biden reversed several of Trump’s classification decisions, including his policy of not reporting the number of American nuclear weapons. More systematically, the National Security Council recently launched an effort aimed at revising the nation’s unwieldy classification system, while Director of National Intelligence Avril Haines has stated her intention to review the excessive classification of government documents.

In a 2022 letter to Congress, Haines pointed to the downside of a government that refuses to share information. “It is my view,” she wrote, “that deficiencies in the current classification system undermine our national security, as well as critical democratic objectives, by impeding our ability to share information in a timely manner, be that sharing with our intelligence partners, our oversight bodies, or, when appropriate, with the general public.”

True to her word, in the three months following that statement of allegiance to transparency, Haines has released a steady flow of material on controversial topics, including unclassified reports on everything from the origins of Covid to climate change to an assessment of the “Saudi government’s role in the killing of Jamal Khashoggi.”

Still, despite such efforts, the powers that be are arguably being hoist by their own petard. After all, Donald Trump followed in the wake of his predecessors in sanctioning expansive secrecy, then made it a be-all and end-all of his presidency, and now claims that it’s part of his rights as a former president and private citizen. As the head of a political movement, now out of office, he’s done the once unthinkable by claiming that the veil of secrecy, the right to decide what should be known and who should know it, is his in perpetuity.

The horror of his claim to untethered secret authority — no wonder some of his MAGA followers refer to him as their “god-emperor” — violates the very idea that a democracy is a pact between individual citizens and elected officials. The valid response to the holding of documents at Mar-a-Lago shouldn’t just be reclaiming them for the public record or even the clear demarcation of the law as it applies to a private citizen as opposed to a president (though both are essential). What’s needed is a full-throated demand that policies of secrecy, allowed to expand exponentially in this century without accountability or transparency, are destructive of democracy and should be ended.

Karen J. Greenberg writes regularly for TomDispatch (where this article originated). She is the director of the Center on National Security at Fordham Law and author most recently of Subtle Tools: The Dismantling of Democracy from the War on Terror to Donald Trump (Princeton University Press). Julia Tedesco conducted research for this article.

Copyright ©2022 Karen J. Greenberg — distributed by Agence Global

—————-
Released: 20 September 2022
Word Count: 2,160
—————-

Julia Gledhill and William D. Hartung, “Spending unlimited”

September 12, 2022 - TomDispatch

Congress has spoken when it comes to next year’s Pentagon budget and the results, if they weren’t so in line with past practices, should astonish us all. The House of Representatives voted to add $37 billion and the Senate $45 billion to the administration’s already humongous request for “national defense,” a staggering figure that includes both the Pentagon budget and work on nuclear weapons at the Department of Energy. If enacted, the Senate’s sum would push spending on the military to at least $850 billion annually, far more — adjusted for inflation — than at the height of the Korean or Vietnam wars or the peak years of the Cold War.

U.S. military spending is, of course, astronomically high — more than that of the next nine countries combined. Here’s the kicker, though: the Pentagon (an institution that has never passed a comprehensive financial audit) doesn’t even ask for all those yearly spending increases in its budget requests to Congress. Instead, the House and Senate continue to give it extra tens of billions of dollars annually. No matter that Secretary of Defense Lloyd Austin has publicly stated the Pentagon has all it needs to “get the capabilities… to support our operational concepts” without such sums.

It would be one thing if such added funding were at least crafted in line with a carefully considered defense strategy. More often than not, though, much of it goes to multibillion-dollar weapons projects being built in the districts or states of key lawmakers or for items on Pentagon wish lists (formally known as “unfunded priorities lists”). It’s unclear how such items can be “priorities” when they haven’t even made it into the Pentagon’s already enormous official budget request.

In addition, throwing yet more money at a department incapable of managing its current budget only further strains its ability to meet program goals and delivery dates. In other words, it actually impairs military readiness. Whatever limited fiscal discipline the Pentagon has dissipates further when lawmakers arbitrarily increase its budget, despite rampant mismanagement leading to persistent cost overruns and delivery delays on the military’s most expensive (and sometimes least well-conceived) weapons programs.

In short, parochial concerns and special-interest politics regularly trump anything that might pass as in the national interest, while doing no favors to the safety and security of the United States. In the end, most of those extra funds simply pad the bottom lines of major weapons contractors like Lockheed Martin and Raytheon Technologies. They certainly don’t help our servicemembers, as congressional supporters of higher Pentagon budgets routinely claim.

A captured Congress

The leading advocates of more Pentagon spending, Democrats and Republicans alike, generally act to support major contractors in their jurisdictions. Representative Jared Golden (D-ME), a co-sponsor of the House Armed Services Committee proposal to add $37 billion to the Pentagon budget, typically made sure it included funds for a $2 billion guided-missile destroyer to be built at General Dynamics’ shipyard in Bath, Maine.

Similarly, his co-sponsor, Representative Elaine Luria (D-VA), whose district abuts Huntington Ingalls Industries’ Newport News Shipyard, successfully advocated for the inclusion of ample funding to produce aircraft carriers and attack submarines at that complex. Or consider Representative Mike Rogers (R-AL), the ranking Republican on the House Armed Services Committee and a dogged advocate of annually increasing the Pentagon budget by at least 3% to 5% above inflation. He serves a district south of Huntsville, Alabama, dubbed “rocket city” because it’s the home to so many firms that work on missile defense and related projects.

There are even special congressional caucuses devoted solely to increasing Pentagon spending while fending off challenges to specific weapons systems. These range from the House shipbuilding and F-35 caucuses to the Senate ICBM Coalition. That coalition has been especially effective at keeping spending on a future land-based intercontinental ballistic missile dubbed the Sentinel on track, while defeating efforts to significantly reduce the number of ICBMs in the U.S. arsenal. Such “success” has come thanks to the stalwart support of senators from Montana, North Dakota, Utah, and Wyoming, all states with ICBM bases or involved in major ICBM development and maintenance.

The jobs card is the strongest tool of influence available to the arms industry in its efforts to keep Congress eternally boosting Pentagon spending, but far from the only one. After all, the industrial part of the military-industrial-congressional complex gave more than $35 million in campaign contributions to members of Congress in 2020, the bulk of it going to those on the armed services and defense appropriations committees who have the most sway over the Pentagon budget and what it will be spent on.

So far, in the 2022 election cycle, weapons firms have already donated $3.4 million to members of the House Armed Services Committee, according to an analysis by OpenSecrets.org, an organization that tracks campaign spending and political influence. Weapons-making corporations also currently employ nearly 700 lobbyists, more than one for every member of Congress, while spending additional millions to support industry-friendly think tanks that regularly push higher Pentagon spending and a more hawkish foreign policy.

The arms industry has another lever to pull as well when it comes to the personal finances of lawmakers. There are few, if any, restrictions against members of Congress owning or trading defense company stocks, even those who sit on influential national-security-related committees. In other words, it’s completely legal for them to marry their personal financial interests to those of defense contractors.

The cost of coddling contractors

Legislators arbitrarily inflate Pentagon spending despite clear evidence of corporate greed and repeated failures when it comes to the development of new weapons systems. Under the circumstances, it should be no surprise that weapons acquisitions are on the Government Accountability Office’s “High Risk List,” given their enduring vulnerability to waste and mismanagement. In fact, overfunding an already struggling department only contributes to the development of shoddy products. It allows the Pentagon to fund programs before they’ve been thoroughly tested and evaluated.

Far from strengthening national defense, such lawmakers only reinforce the unbridled greed of weapons contractors. In the process, they ensure future acquisition disasters. In fact, much of the funding Congress adds to the Pentagon budget will be wasted on price gouging, cost overruns, and outright fraud. The most notorious recent case is that of the TransDigm Group, which overcharged the government up to 3,850% for a spare part for one weapons system and 10 to 100 times too much for others.

The total lost: at least $20.8 million. And those figures were based on just a sampling of two-and-a-half years of that company’s sales to the government. Nor was it the first time TransDigm had been caught price gouging the Pentagon. Such practices are, in fact, believed to be typical of many defense contractors. A full accounting of such overcharges would undoubtedly amount to billions of dollars annually.

Then there are weapons systems like Lockheed Martin’s F-35 fighter aircraft and that same company’s Littoral Combat Ship (LCS). Both are costly programs that have proven incapable of carrying out their assigned missions. The F-35 is slated to cost the American taxpayer a staggering $1.7 trillion over its life cycle, making it the most expensive single weapons program ever. Despite problems with its engine performance, maintenance, and basic combat capabilities, both the House and the Senate added even more of the planes to their latest budget plans than the Pentagon requested. House Armed Services Committee Chair Adam Smith (D-WA) famously remarked that he was tired of “throwing money down that particular rat hole,” but then argued that the F-35 program was too far along to cancel. Its endurance has, in fact, forced the Pentagon to restart older jet fighter production lines like the F-15, developed in the 1970s, to pick up the slack. If the U.S. is going to be forced to buy older fighters anyway, cutting the F-35 could instantly save $200 billion in procurement funding.

Meanwhile, the LCS, a ship without a mission that can’t even defend itself in combat, nonetheless continues to be protected by advocates like Representative Joe Courtney (D-CT), co-chair of the House shipbuilding caucus. The final House and Senate authorization bills prevented the Navy from retiring five of the nine LCSs that the service had hoped to decommission on the grounds that they would be useless in a potential military faceoff with China (a conflict that should be avoided in any case, given the potentially devastating consequences of a war between two nuclear-armed powers).

No surprise, then, that a substantial part of the tens of billions of dollars Congress is adding to the latest Pentagon budget will directly benefit major weapons contractors at the expense of military personnel. In the House version of the military spending bill, $25 billion — more than two-thirds of its additional funding — is earmarked for weapons procurement and research that will primarily benefit arms contractors.

Only $1 billion of the added funds will be devoted to helping military personnel and their families, even as many of them struggle to find affordable housing or maintain an adequate standard of living. In fact, one in six military families is now food insecure, a devastating reflection of the Pentagon’s true priorities.

In all, the top five weapons contractors — Lockheed Martin, Raytheon, Boeing, General Dynamics, and Northrop Grumman — split more than $200 billion in “defense” revenue in the last fiscal year, mostly from the Pentagon but also from lucrative foreign arms sales. The new budget proposals will only boost those already astounding figures.

Pushing back on contractor greed

Congress has shown little intent to decouple itself in any way from what’s still known as “the defense industry.” There is, however, a clear path to do so, if the people’s representatives were to band together and start pushing back against the greed of weapons contractors.

Some lawmakers have begun making moves to prevent price gouging while improving weapons-buying practices. The Senate Armed Services Committee, for instance, included in its version of the defense budget a provision to establish a program that would improve contractor performance through financial incentives. Its goal is to make the Pentagon a smarter buyer by addressing two main issues: delivery delays and cost overruns, especially by companies that charge it above-market prices to pad their bottom lines. It would also curb the ability of contractors to overcharge on replacement parts and materials.

The program to prevent further price gouging has a couple of possible paths to President Biden’s desk. Senator Elizabeth Warren (D-MA) and Representative John Garamendi (D-CA) also included it in the bicameral Stop Price Gouging the Military Act, an ambitious proposal to protect the Pentagon from outrageous contractor overcharges. The bill would close loopholes in existing law that allow companies to eternally rip off the Defense Department.

There are obviously all too many obstacles in the path of eliminating moneyed interests from defense policy, but creating an incentive structure to improve contractor performance and transparency would, at least, be a necessary first step. It might also spur greater public input into such policy-making.

Secrecy, Inc.

Here’s the sad reality of the national security state: we taxpayers will fork over nearly a trillion and a half dollars this year in national security spending and yet the policy-making process behind such outlays will essentially remain out of our control. The Senate Armed Services Committee typically debates and discusses its version of the National Defense Authorization Act (NDAA) behind closed doors. The subcommittee hearings open to the public rarely last — and yes, this is not a mistake! — more than 15 minutes. Naturally, the House and Senate will reconcile any differences between their versions in secret, too. In other words, there’s little transparency when it comes to the seemingly blank check our representatives write for our defense every year.

Sadly, such a system allows lawmakers, too many of whom maintain financial stakes in the defense industry, to deliberate over Pentagon spending and other national security matters without real public input. At the Pentagon, in fact, crucial information isn’t just kept private; it’s actively suppressed and the situation has only gotten worse over the years.

Here’s just one example of that process: in January 2022, the Pentagon’s Office of the Director, Operational Test and Evaluation issued an annual report on weapons costs and performance. For the first time in more than 30 years, however, it excluded nearly all the basic information needed to assess the Pentagon’s weapons-buying process. Redacting information about 22 major acquisition programs, the director treated data once routinely shared as if it were classified. Given the Pentagon’s rocky track record when it comes to overfunding and under-testing weapons, it’s easy enough to imagine why its officials would work so hard to keep unclassified information private.

Scamming the taxpayer has become a way of life for the national security state. We deserve a more transparent, democratic policy-making process. Our elected officials owe us their allegiance, not the defense-industry giants that make such hefty campaign contributions while beefing up lawmakers’ stock portfolios.

Isn’t it time to end the national-security version of spending unlimited in Washington?

Julia Gledhill writes regularly for TomDispatch (where this article originated). She is an analyst at the Center for Defense Information at the Project On Government Oversight.
William D. Hartung is also a TomDispatch regular, and a Senior Research Fellow at the Quincy Institute for Responsible Statecraft. He is the author most recently of Pathways to Pentagon Spending Reductions: Removing the Obstacles.

Copyright ©2022 Julia Gledhill and William D. Hartung — distributed by Agence Global

—————-
Released: 12 September 2022
Word Count: 2,169
—————-

Steve Fraser, “The Trump Supreme Court is nothing new”

September 6, 2022 - TomDispatch

Has the Trump Supreme Court gone rogue? The evidence mounts. Certainly, its recent judicial blitzkrieg has run roughshod over a century’s worth of settled law.

A woman’s right to get an abortion? Gone (at least as a constitutionally protected civil right). Meanwhile, voting rights are barely hanging on, along with the 1965 Voting Rights Act that gave them life. State legislatures, so the court ruled, may no longer rein in the wanton availability of firearms and so the bloodshed will inevitably follow. Climate catastrophe will only get closer as the Supremes have moved to disarm the Environmental Protection Agency’s efforts to reduce carbon emissions. Religion, excluded from the public arena since the nation’s founding, can now invade the classroom, thanks to the court’s latest pronouncement.

This renegade court is anything but finished doing its mischief. Affirmative action may be next on the chopping block. Gerrymandering, long an ignoble tradition in American political life, could become unconstrained if the Supremes decide to exempt such practices from state court judicial review. And who knows what they are likely to rule when every election not won by the Republican Party may be subject to a lawsuit.

Donald Trump’s three appointments to the court — Neil Gorsuch, Brett Kavanaugh, and Amy Coney Barrett — cemented in place a rightward shift in its center of gravity that had begun decades earlier. Ever since President Ronald Reagan appointed William Rehnquist, a staunch conservative, as chief justice in 1986, the court has only become ever more averse to regulating business, even as it has worked to reduce the power of the federal government.

Don’t forget that it essentially appointed George W. Bush president in 2000 by ruling that Florida couldn’t conduct a recount of the vote, though it seemed likely that Al Gore would prevail and enter the Oval Office. And even after Rehnquist passed away, the court’s 2010 Citizens United decision granted corporations the same free speech rights as people, further eroding democracy by removing limitations on their campaign contributions.

This march to the right was in stark contrast to the earlier deliberations of the court led by Chief Justice Earl Warren. The Warren court was, of course, best known for its landmark 1954 Brown v. Board of Education decision striking down public school segregation. It would also become the judicial centerpiece of a post-World War II liberal order that favored labor unions, civil rights, government oversight of business, and the welfare state.

Historically speaking, however, it was the Warren Court that was the exception, not the one cobbled together by Donald Trump and effectively, if not officially, presided over by Justice Clarence Thomas. The Supremes were born to be bad.

Enshrined in the Constitution

From the beginning, the Supreme Court was conceived as a bulwark against excessive democracy, as indeed was the Constitution itself.

During the years leading up to the 1787 constitutional convention in Philadelphia, the country was in a chronic state of upheaval. Local insurrections against heavy taxation, land and currency speculators, and merchant-bankers had called into question the security and sanctity of private property. Local legislatures proved vulnerable to takeover by the hoi polloi, who felt free to cancel debts, print paper money, stop evictions, and oust elites from their accustomed positions of power.

Various impediments to this kind of “mobocracy” were baked into the Constitution, including the electoral college for presidential votes and the indirect election of senators by state legislatures (until the 17th amendment was ratified in 1913). The Supreme Court was just another such obstacle.

Founding Father James Madison typically saw that court as protection against “factious majorities” at the state and local level that might threaten the rights of property-holders. Fearing “passionate majorities,” he went so far as to propose a joint executive-judicial council with veto power over all legislation.

That idea went nowhere. Still, the principle of “judicial review” — the power of the court to have the last say on the constitutionality of legislation — although not made explicit in the Constitution, was implicit in the way the founding fathers sought to rein in democratic impulses. French author Alexis de Tocqueville, in his nineteenth-century classic Democracy in America, typically recognized the special status accorded to judicial elites, describing them as America’s “high political class.”

At first, the Supreme Court’s services weren’t needed as a guardian of vested interests and its presence was muted indeed. It met in the basement of the Capitol and, between 1803 and 1857, struck down only two federal statutes. (Compare that to the 22 it struck down between 1992 and 2002 alone.)

The court would, however, establish an enduring reputation for conservatism thanks to its infamous 1857 Dred Scott decision. By a 7-2 majority, the justices declared all Black people — free or enslaved — to be non-citizens. They also ruled that, even if a slave made his or her way to a free state, he or she would remain the property of the slave owner and declared that no territory under U.S. jurisdiction could prohibit slavery.

Dred Scott is generally considered to be the most egregious decision in the court’s long history. That ruling was, however, in keeping with its basic orientation: to side with propertied interests, not the unpropertied; slave-owners, not slaves; and industrialists and financiers rather than with those who worked for and depended on them.

Gatling-gun injunctions and yellow dog contracts

After the Civil War, the court became ever more aggressive in defending the interests of the powerful. There was a need for that as, once again, the powerless threatened the status quo.

Reconstruction — the period immediately after the Civil War when the Federal government imposed martial law on the former Confederate states — empowered ex-slaves to militantly exercise their rights to full civil and political equality under the 14th and 15th amendments. Desperate farmers in the Midwest, on the Great Plains, and in the South were then mobilizing to protect themselves from predatory banks, railroads, and commodity speculators. Industrial workers were engaged in pitched battles with their employers, confrontations that elicited widespread sympathy in cities and towns across the country.

“Passionate majorities” needed chastening and the court met the challenge. It launched an era, much like our own, of “judge-made law” that would last from the late 1880s into the 1920s.

Early on, the Supremes declared the Civil Rights Act of 1875 unconstitutional. Later, in Plessy v. Ferguson, they made segregation constitutionally legitimate via the doctrine of “separate but equal” and so helped restore elite white rule in the South. By ensconcing segregation, they also ended the hopes aroused by the Populist movement for an alliance of black and white rural poor against predatory banks and landlords.

The populist fervor of that era led some state legislatures to adopt laws regulating railroad rates and the fees charged by grain-elevator operators, while challenging corporate monopoly power over the vital necessities of life. Initially, the court trod carefully. Soon enough, however, the justices shed that reticence, using the power of judicial review to wipe such laws off the books. With a distinct touch of irony, they concluded that, in the eyes of the law, corporations were indeed persons and so entitled to the very civil rights guaranteed to ex-slaves by the 14th amendment (“rights” presumably denied them under state regulatory statutes).

Regulating business, the justices suggested, was tantamount to confiscating it. As one railroad lawyer had argued before the court, such regulation was “communism pure and simple.” From that same perspective, the court found a federal law establishing an income tax unconstitutional. (It took the 16th amendment, passed in 1913, to make the income tax national law.)

Industrial capitalism accumulated its wealth by subjecting the lives of millions of workers to abject misery: poverty, overwork, danger, disease, and profound indignity. It would prove a bloody affair, igniting confrontations between workers and their bosses more violent than anywhere else in the western world. As those workers began organizing collectively, their middle-class allies occasionally succeeded in passing relevant laws for minimum wages, outlawing child labor, putting a ceiling on the work hours an employer could enforce, and making the workplace safer or, at least, compensating those injured on the job.

The justices of the Supreme Court, some of whom had once been lawyers for the railroad, iron, and steel industries, knew just what to do in response to such democratic challenges to the prerogatives of capital. While the right to strike might be honored in theory, the court issued injunctions to stop such strikes from happening so often that the era became known (after the early machine gun of that time) for its “gatling-gun injunctions.” That term was used in part as well because such rulings could be enforced by the Army or its state militia equivalents, not to mention the imprisonment and heavy fines often involved. During one such bloody encounter, William Howard Taft, then an Ohio judge, later president, and finally chief justice of the Supreme Court, complained that federal troops had “killed only six of the mob as yet. This is hardly enough to make an impression.”

To rub yet more salt in the wound, such injunctions were often justified under the Sherman Anti-Trust Act of 1890. Originally designed to break up monopolies, it would be used far more frequently to bust strikes (and sympathy boycotts) on the grounds that they were “conspiracies in restraint of trade.” The court repeatedly enjoined “secondary boycotts”; that is, supportive actions by other unions or groups sympathetic to striking workers. It also struck down a Kansas statute that banned “yellow dog contracts” — agreements, which many workers were forced to sign on being hired, promising that they would never join a union.

Laws that attempted to ameliorate the harshness of working-class life were treated with similar disdain. New York state, for example, passed one banning cigar making in tenement workshops as a danger to workers’ health. The court saw otherwise, treating such tenement dwellers as independent contractors who had freely chosen their way of life.

New York also tried to limit the hours bakers could work to 10 a day and 60 a week. At the time, they were normally compelled to work 75 to 100 hours weekly in ill-ventilated cellars of tenement bakeries where breathing in the flour was a danger to their lungs. The justices begged to differ. In Lochner v. New York — named after the bakery owner who sued the state — they refused to recognize any threat to the well-being of bakers who, in the eyes of the court, had freely contracted to work on those terms. They were after all as free as their employers to strike a bargain or choose not to work.

The freedom of contract was then the reigning judicial orthodoxy, inherited, ironically enough, from the long struggle against slave labor. Unlike slaves, free laborers allegedly enjoyed an equality of standing in any contractual relationship with an employer. Laws or unions that interfered with that “freedom” were rendered nugatory by the court, no matter how obvious it was that the imputed equality between the owners of capital and the men and women compelled to work for them was illusory.

The only laws of that sort which passed muster were those protecting women and child laborers. The justices considered such workers inferior and dependent, and so, unlike men, unable to freely enter into relations of contractual equality. In the case of women, there was the added danger of jeopardizing their maternal role. Still, consider it an indication of just how reliant businesses had then become on child labor that even a federal law that controlled the ages and hours children could work was, in the end, struck down by the Supreme Court.

The court v. the people

By the turn of the twentieth century, the outcry against “judge-made law,” the willful manipulation of the Constitution to shore up endangered bastions of wealth and power, had grown ever stronger. Some more recent scholars have found the court’s rulings then not as one-sided as its reputation suggests, but contemporaries certainly didn’t share those doubts.

When the Supreme Court overturned an income tax law, a dissenting justice vividly described its decision as a “surrender to the moneyed classes.”

Similarly, in 1905, Supreme Court Justice Oliver Wendell Holmes broke with his colleagues when they ruled in the Lochner case, noting that “the 14th amendment does not enact Mr. Herbert Spencer’s Social Statics.” (Spencer was then the world’s foremost proponent of social Darwinism and a staunch defender of free-market economics.) A few years later, future Supreme Court Justice Louis Brandeis cuttingly noted that “to destroy a business is illegal. It is not illegal to lower the standard of the working man’s living or to destroy the union which aims to raise or maintain such a standard. A business is property… A man’s standard of living is not property.”

Other voices were also being raised in alarm over the coming of a “judicial oligarchy.” Politicians from former president Theodore Roosevelt to perennial Socialist Party presidential candidate Eugene Debs began denouncing “the rogue court.” When he ran again for president in 1912 as the candidate of the Bull Moose, or Progressive Party, Roosevelt declared that the people are “the ultimate makers of their own Constitution” and swore that Americans would not surrender that prerogative to “any set of men, no matter what their positions or their character.” His rival for the party’s nomination, Wisconsin senator Robert LaFollette, typically offered this observation: “Evidence abounds that… the courts pervert justice almost as often as they administer it.” There existed, he concluded, “one law for the rich and another for the poor.”

Calls for reform back then should sound eerily familiar today. Populist presidential candidate James Weaver urged that Supreme Court justices be elected and lifetime terms abolished. A bill introduced in Congress proposed that a majority of both houses should have the power to recall and remove a judge from office. Another demanded a super-majority of justices — seven out of nine — be required to invalidate a law. Roosevelt argued that there should be popular referenda on the court’s decisions. The Socialist Party demanded that the Supreme Court’s power to review the constitutionality of federal laws be done away with and all judges elected for short terms.

Still, the court prevailed until the Great Depression of the 1930s. President Franklin Roosevelt, however, pushed through new laws regulating business and finance, as well as a national minimum-wage and maximum-work-hours statute, while legalizing the right to join a union. Together with yet another uprising of beleaguered industrial workers in those years, this would shift the balance of power. Even then, the Supreme Court justices at first succeeded in nullifying key pieces of Roosevelt’s economic recovery legislation, while Democrats at the time (as today) talked about adding new justices to the court.

In the end, however, the national trauma of a capitalism seemingly on the verge of collapse, the weight of changing public opinion, and the aging out of some of the justices ended the dominion of the Lochner court.

“The race question”

During the long years of opposition to that court, little of the criticism touched on “the race question.” How to account for that? From the Gilded Age of the late nineteenth century to Roosevelt’s New Deal, Americans were preoccupied with “the labor question” (as it was then called) — that is, how to deal with the great social divide between capital and labor opened up by industrialization.

The silence when it came to the no less striking racial bias of the Supreme Court speaks to a ubiquitous national blindness on matters of racial justice then. Of course, segregation was settled law at the time. In the words of a justice deciding the Plessy case, white supremacy was “in the nature of things.” (Sound familiar?) So, too, the relative weakness of mass movements addressing the racial dilemma during the Lochner court years was striking, making the issue easier to ignore.

The Supreme Court’s original responsibility was, as James Madison once put it, to guard against the “tyranny of the majority.” African-Americans were, of course, a long-tyrannized minority.

However, on that subject the Lochner court went AWOL, even by its own standards. If the “minority” in question happened to be a corporation, it, of course, needed the court’s protection. Not so fortunate were millions of ex-slaves and their descendants.

Eventually, a different Supreme Court, the one overseen by Chief Justice Earl Warren, faced the “race question.” Indeed, it expanded civil rights and civil liberties generally by making racial segregation illegal in public schools, increasing the constitutional rights of defendants, outlawing state-sponsored school prayer, and creating the groundwork to legalize abortion.

Times had changed. Civil rights for African-Americans (about which Roosevelt’s New Deal did little) became an increasing concern during and after World War II. Growing civil rights organizations and a then-powerful labor movement began to press the issue ever harder. By the time the Warren Court made its celebrated 1954 Brown v. Board of Education decision, race had become a “question,” just as the “labor question” had in the New Deal era.

Before then, pressure alone, however muscular, had not produced a shift in the high court’s approach as the Lochner court so amply demonstrated. Segregation had, after all, become entrenched as a way of life endorsed by local white legislatures. Southern commercial interests in particular — plantation owners, textile manufacturers, and raw material producers — depended on it.

Beyond those circles, however, segregation had become increasingly repellent in a culture ever more infused with the multi-ethnic sympathies and cosmopolitanism of the New Deal era. In beginning the dismantlement of legal segregation, the Warren court would not, in fact, threaten the country’s central institutions of power and wealth which, if anything, had by then come to find American-style apartheid inimical to their interests.

Justice is supposed to be nonpolitical, but that has never been the case. What was once termed the “counter-majoritarian” mission of the court — to discipline “passionate majorities” — produced great wrongs in the era of the gatling-gun injunction, as had also been true earlier. The Warren court, however, was the exception. It achieved the very opposite results, even as it relied on the same constitutional logic (the civil rights enshrined in the 14th amendment) that the Lochner court had used in thwarting mass movements for justice and equality.

Today’s Supreme Court is more than Donald Trump’s creation. It’s the result of a long counter-revolution against the political, economic, and cultural reforms of the New Deal, as well as of the labor, civil rights, women’s, and gay liberation movements of the last century.

Sadly, those are the “passionate majorities” the court now seems all too determined to squelch and in that it stands in a long American tradition, though one most of us had forgotten in the Warren years. One thing should be obvious by now: if the country is ever to live up to its democratic and egalitarian promise, the tyranny of the Supreme Court must be ended.

Steve Fraser writes regularly for TomDispatch (where this article originated). He is the author of Mongrel Firebugs and Men of Property: Capitalism and Class Conflict in American History. His previous books include Class Matters, The Age of Acquiescence, and The Limousine Liberal. He is a co-founder and co-editor of the American Empire Project.

Copyright ©2022 Steve Fraser — distributed by Agence Global

—————-
Released: 06 September 2022
Word Count: 3,145
—————-

Steven Pressman, “The Fed’s battle with inflation: A Pyrrhic victory? Or will the federal government join the fight?”

September 5, 2022 - The Washington Spectator

The U.S. central bank, the Federal Reserve (or Fed for short), has been hiking interest rates this year to try to bring down inflation. Before addressing the economic consequences of this, a few words about central banks seem in order.

Regular retail and consumer banks do their own banking at central banks — making deposits and borrowing money. Central banks in turn regulate regular banks to ensure they remain solvent. But their most important function is changing interest rates to control inflation and unemployment.

Central banks set the rate at which they lend money to banks and control the rate that banks charge each other for loans. When inflation is high, central banks raise these rates. Banks then charge their customers more for loans. When unemployment is high, central banks cut the rates they charge banks. Banks then lower the rates they charge to businesses and consumers. Higher interest rates reduce spending and are intended to dampen inflationary pressures; lower rates increase spending, economic growth, and employment.

Currently U.S. unemployment is below 4 percent and near a 50-year low. Inflation, under 2 percent in the B.C. (before Covid) years, rose above 9 percent for the year ending in June 2022, its highest rate in four decades. Last year, the Biden administration provided many benefits to U.S. families (such as stimulus checks and a refundable tax credit) through the American Rescue Plan, helping them keep up with rising prices. These programs have now expired. As prices rise much faster than incomes, Americans struggle to pay their rent or mortgage, fill up their car with gasoline, and put food on the table.

The Fed has responded by increasing interest rates. From nearly 0 percent in January, it raised interest rates a total of 2.25 percentage points between March and July. And it expects to hike rates another percentage point before the year is out.

In hindsight, the Fed should have started raising rates last fall, when the U.S. unemployment rate was under 5 percent and declining and government spending programs were stimulating the economy. Erring on the side of keeping unemployment down, the Fed fell behind the inflation curve. Now it seeks to make up for lost time and deflect blame from the fact that it kept rates too low for too long.

Consumers are already feeling the consequences of this — higher mortgage rates, higher interest rates on auto and college loans, and higher rates on credit card balances. This is one downside of reducing inflation.

Many economists fear the Fed will push the U.S. into a recession, leading to the dreaded stagflation (high inflation and unemployment at the same time) that plagued the economy in the late 1970s and early 1980s. Frequently the Fed has gone too far when it starts raising rates. Fed Chair Paul Volcker overdid it in the late 1970s and early 1980s, leading to 10 percent unemployment. In the early 1990s, Fed Chair Alan Greenspan raised rates, creating a recession that helped end more than a decade of Republican rule in Washington. He raised rates again in 2004, which generated very slow economic growth and quickly had to be reversed.

While not a foregone conclusion, a recession is highly likely. We may be there already. The U.S. economy shrank 0.4 percent (1.6 percent at an annual rate) in the first quarter of this year and 0.2 percent in the second quarter. Furthermore, three recession indicators are flashing brightly — the stock market has fallen sharply this year; commodity prices (oil, cotton, copper, and even corn and wheat) are falling; and the yield curve has inverted (interest rates on short-term government bonds exceed interest rates on longer-term bonds).

Another problem is that while higher interest rates can control inflation caused by too much spending, the Fed can’t counteract the supply problems we currently face. Interest rate hikes won’t reduce high gas prices stemming from an embargo of Russian oil. They won’t replace the loss of Ukrainian grain on the world food market. They can’t reduce high auto and appliance prices that stem from a computer chip shortage due to a drought in Taiwan. And they can’t undo labor shortages resulting from Covid.

Even worse, higher interest rates can increase inflation. This is clearest in the case of housing, the largest spending category for most households. Interest rates on a 30-year fixed mortgage have risen from 2.8 percent last August to 5.5 percent in mid-July. For a $450,000 mortgage, this increases monthly housing costs by nearly $750. People priced out of homeownership will remain renters, adding to the demand for apartments and pushing up rents.
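The jump in monthly housing costs the author cites follows from the standard fixed-rate amortization formula. A minimal Python sketch (the $450,000 principal and the 2.8 and 5.5 percent rates are the article's figures; the function name is my own) reproduces the arithmetic:

```python
def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Monthly payment on a fully amortizing fixed-rate mortgage:
    M = P * r / (1 - (1 + r)**-n), with monthly rate r and n payments."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# The article's example: a $450,000, 30-year mortgage at last
# August's 2.8% versus mid-July's 5.5%.
low = monthly_payment(450_000, 0.028)   # ≈ $1,849/month
high = monthly_payment(450_000, 0.055)  # ≈ $2,555/month
print(f"2.8%: ${low:,.0f}  5.5%: ${high:,.0f}  gap: ${high - low:,.0f}")
```

On these assumptions the gap works out to roughly $700 a month, in the same ballpark as the article's "nearly $750" (the exact figure depends on rounding and on whether taxes and insurance are folded in).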

Although the Fed can’t solve our current inflation problem, it also didn’t create it. It didn’t invade Ukraine. It didn’t provide tax cuts to the rich during the Bush and Trump administrations or give Covid benefits to many people who didn’t really need them during the Trump and Biden administrations. And it didn’t raise tariffs sharply on imported goods, making them more expensive. President Trump did this. The Fed merely waited too long to start raising rates.

Still, the Fed is responsible for cleaning up the inflation mess. But its tools are weak and ineffective when it comes to supply-side inflation, and raising rates much further will increase unemployment sharply. The main anti-inflation alternatives are cuts in government spending and raising taxes. But politicians are loath to enact such policies because it hurts their constituents, the people they count on to get reelected. So, by default, the job of controlling inflation falls to central bankers removed from such political pressure.

Right now, our great danger is that the Fed will wait too long to stop raising rates, just as it waited too long to start raising rates. As economists are fond of saying, there are long and variable lags between changes in interest rates and when these changes impact the economy. It is about time for the Fed to hit the pause button. But the inflation problem squeezing so many lower-income and middle-class households still needs to be addressed.

The good news is that another solution to the inflation problem exists. As the Fed steps back to examine the impact of what it has done already, fiscal policy needs to take the lead in battling inflation. Unlike the Fed, President Biden and Congress have the tools to battle supply-side inflation without creating a recession. They need to use them! Besides reducing inflation, these policy actions will also take pressure off the Fed to continually raise interest rates in an attempt to tame inflation.

Here are just a few things the president and Congress can do.

The president can temporarily reduce import taxes and other trade restrictions, and temporarily suspend requirements that ships carrying goods between two U.S. ports be built in the United States and operated by Americans. These actions would lower the cost of all imported goods. Congress and the president can increase legal immigration and the number of seasonal work visas to ameliorate labor shortages. They can also raise income taxes on the wealthy to reduce demand-side inflationary pressures stemming from Covid relief bills that provided benefits to households that did not need the money and are now spending their windfall. This last policy is far better than having the Fed raise interest rates again, which would hurt indebted low-income and middle-class households and increase mortgage rates and housing costs, a large part of monthly expenditures for those who are not wealthy.

Finally, the president and Congress can provide tax breaks to companies that allow their employees to work from home, and subsidies to state and local governments that reduce rail and bus fares for consumers. The latter policy will encourage people to use mass transit when traveling. Both policies, by reducing time behind the wheel, will help drive down the cost of gasoline.

Steven Pressman is adjunct professor of economics at the New School for Social Research, professor emeritus of economics and finance at Monmouth University, and author of Fifty Major Economists, 3rd edition (Routledge, 2013).

Copyright ©2022 The Washington Spectator — distributed by Agence Global

—————-
Released: 05 September 2022
Word Count: 1,302
—————-

Peter Galbraith, “The Supreme Court and the crisis of legitimacy”

September 1, 2022 - The Washington Spectator

Alexander Hamilton once described the judiciary as the least dangerous branch of government. But today it is no exaggeration to say that the Supreme Court poses a greater threat to individual freedoms, to the future of the planet, and to democracy itself than any other government branch.

The Supreme Court is now a political entity masquerading as a judicial body. This has been obvious since the 2000 decision in Bush v. Gore. On December 9, 2000, Justice Antonin Scalia, speaking for four Republican court members, ordered Florida to stay its recount of votes in the presidential race. Scalia then waited until December 12 before ruling that a recount was not possible because December 12 was Florida’s “safe harbor” deadline for recording presidential votes. If not for Scalia’s stay, the recount could have been completed by December 12, something the uberpartisan Scalia did not want to happen.

Faced with identical law and facts, a real court will reach the same result regardless of the political party of the plaintiff and the defendant. We can be quite sure that the five Republican Supreme Court members who stopped the Florida recount would not have done so had Gore been slightly ahead of Bush. As we move into the 2024 election cycle, government officials must be prepared to ignore partisan decisions from an entity that is a court in name only. The future of American democracy is at stake.

Other court decisions are threatening to constitutional government. In overturning Roe v. Wade (in Dobbs v. Jackson Women’s Health Organization), the Supreme Court for the first time took away a previously recognized constitutional right, in this instance declaring that the previous right to an abortion no longer exists. Despite Justice Samuel Alito’s assertion that the decision should not be read as affecting other rights, there can be no doubt that the right to same-sex marriage is in the court’s crosshairs. And since overturning Roe still leaves abortion legal in large parts of the country (it is popular even in archconservative states like Kansas), the court’s next steps will likely be to uphold prohibitions on travel for abortions and, perhaps, ultimately to outlaw abortion altogether by declaring a fetus to be a person.

Despite the passage of President Biden’s climate bill, the justices have already reached environmental decisions aimed at making it impossible for the United States to address the existential threat of climate change.

Next year, the court will rule on a case that involves the so-called independent state legislature doctrine. Based on the text and supposed original intent of two articles of the Constitution related to congressional and presidential elections, this once fringe doctrine asserts that legislatures are uniquely empowered to decide congressional redistricting and the method for selecting presidential electors. Under this doctrine, state legislatures can ignore state courts and constitutions on congressional redistricting and, more ominously, ignore the popular vote in their states when it comes to choosing presidential electors.

Four Republican Supreme Court members have agreed to take this case, and three (Thomas, Alito, and Gorsuch) have already signaled their support for the doctrine. Their motives are obviously partisan. Republicans control both houses of 30 state legislatures, while the Democrats control just 17, and will clearly benefit from unchecked state legislature gerrymandering. Republicans control the legislatures in the five Biden states — Arizona, Georgia, Michigan, Pennsylvania, and Wisconsin — where Republican members of Congress voted on January 6, 2021, to throw out the results of the popular vote. One would have to be very naive to believe that the six Republicans on the Supreme Court would have much interest in the independent state legislature doctrine if Democrats were the principal beneficiaries of unchecked gerrymandering or if Democrats controlled legislatures in states the Republican presidential candidate might carry. Had the independent state legislature doctrine been law in 2020, Donald Trump likely would be president today.

So how should the Biden administration, congressional Democrats, state governors, state judges, and lower federal courts respond to what may be a dire threat to American democracy? Alexander Hamilton described the judiciary as the least dangerous branch of government because it commands no army and no police. It has no ability to enforce its decisions except to the extent that the other branches accept the decisions as legitimate. The decisions of a court that is nakedly partisan, that is intent on taking away previously protected constitutional rights, and that is on the verge of ending free elections cannot be considered legitimate.

Even if the court had the requisite constitutional authority, one branch of government cannot be allowed to destroy the democratic basis of the other two branches, nor can it be allowed to destroy the federal system by imposing illegitimate decisions on state governments. But even using the logic of the court’s Republican majority, the court does not in fact have the authority to do much of what it reaches to do.

In his decision in Dobbs, Samuel Alito asserts that there is no right to abortion in the Constitution and therefore the 50-year precedent of Roe v. Wade must be overturned. But judicial review of laws is also not in the Constitution. The Constitution says nothing that gives the Supreme Court the power to rule on the constitutionality of laws passed by Congress, on the constitutionality of executive branch actions, or on the constitutionality of state government actions. Nor is there any evidence that the Philadelphia delegates to the Constitutional Convention intended for the Supreme Court to be a constitutional court. In fact, it is not clear that they thought much about judicial power at all in that hot summer in Philadelphia, and the jurisdiction of the Supreme Court in Article 3 of the Constitution is very limited. (Later, as part of the case for ratification, Alexander Hamilton argued in Federalist 78 that, in the event of a conflict of law between the Constitution and an ordinary law, courts would have to choose the Constitution; but this is not the same as conferring on the Supreme Court the power to overturn a law.)

The Supreme Court arrogated to itself the power of constitutional review in the 1803 decision in Marbury v. Madison. In that decision, Chief Justice John Marshall ruled that American courts have the power to strike down laws that they find violate the Constitution. While Marbury v. Madison is a longer-lasting precedent than Roe v. Wade, it has no greater constitutional authority. Neither abortion nor judicial review is in the Constitution.

The best solution to the problem of an illegitimate Supreme Court is to reform it. Ideally, there would be a nonpolitical way to choose justices (as in Europe) so that they are judges and not politicians. Short of that, there could be a system of staggered term limits so that presidents of both parties would be able to choose justices. However, these reforms can only be made by amending the Constitution, and securing the votes to do that — two-thirds of both houses of Congress and three-fourths of the state legislatures — is impossible to achieve.

There are two possible legislative solutions — increasing the court membership and stripping the court of some of its jurisdiction. Expanding the court — which might be best described as “court unpacking” to undo the Republican right’s court packing — is the simplest reform, but it requires larger Democratic majorities than currently exist in either house of Congress. Jurisdiction stripping — for example taking away the court’s power to review certain federal or state laws — is more complicated and solves only part of the problem. As with court unpacking, the congressional votes aren’t there.

Executive branch officials, state governors, and state courts can simply refuse to enforce judgments that follow from the decisions of an illegitimate court. For example, in abortion cases, they can decline to recognize civil awards pursuant to laws like the one in Texas that permits suits against anyone assisting an abortion. Where abortion is criminalized, federal and state officials can simply not extradite those criminally charged. Prosecutors in states where abortion is illegal can refuse to prosecute.

Lower federal courts and state courts can choose to ignore Supreme Court rulings that they see as partisan and illegitimate. (The history of our century might have been quite different if Florida had done so in December 2000.) Without the cooperation of state and federal officials and of state and lower federal courts, there is very little the Supreme Court can do to enforce its decisions. This tactic is understandably concerning to some, as it is reminiscent of tactics white Southerners used as part of their “massive resistance” to Supreme Court desegregation decisions in the 1950s. The problem then was not the tactics but the goals for which they were used. There is a huge difference between resisting court decisions that promote equal rights and freedom and not implementing those that take away rights and end democracy.

American legal scholars and lawyers are trained to regard Supreme Court decisions the way Catholics are meant to view papal bulls. The court can be wrong, but it is better to accept a bad decision than undermine the rule of law. To the extent that it keeps social peace, this approach has merit. However, the Constitution cannot be whatever five Republican extremists say it is.

I have spent much of my career as a diplomat working on countries that fall apart — Yugoslavia, Indonesia (East Timor), Iraq, and Afghanistan. The single most important factor in the breakup of countries — and the start of civil wars — is the justified belief by a significant part of the population that the instruments of state are being used to treat them unfairly. Can anyone seriously believe that Americans would meekly accept a Supreme Court decision to overturn a future presidential election by allowing Republican legislatures to ignore the popular vote in their state?

This is a formula for massive unrest if not outright civil war. We can hope that the 2024 elections are sufficiently decisive that such a scenario does not occur. But Joe Biden won in 2020 by more than seven million votes nationally, with a three-state margin in the Electoral College, and this did not stop Donald Trump and a majority of elected Republican members of Congress from trying to overturn the election. As we approach 2024, many of the pro-democracy Republican members of Congress have retired or been purged. Next year, a partisan Supreme Court could give the anti-democracy Republican majority the tools to succeed.

Under the current circumstances, the best response to illegitimate and partisan court decisions is to ignore them. Refusing to acknowledge the most extreme decisions of this Supreme Court will no doubt cause confusion in the U.S. court system. However, delegitimizing a partisan Supreme Court may be necessary to help prepare for the all too possible situation where the court is integral to undoing the next election. And it may even lead the court to reconsider the consequences of its actions.

The Constitution is not whatever the Supreme Court says it is. The court’s partisan majority cannot be allowed to use the Constitution as a vehicle to destroy American democracy. At some point, we have to take a stand for the Constitution and for democracy.

Peter W. Galbraith, a former U.S. Ambassador to Croatia and Assistant Secretary General of the United Nations in Afghanistan, is the author of The End of Iraq: How American Incompetence Created a War Without End, first published in 2006.

Copyright ©2022 The Washington Spectator — distributed by Agence Global

—————-
Released: 01 September 2022
Word Count: 1,856
—————-

Tom Engelhardt, “Living in a sci-fi world”

August 29, 2022 - TomDispatch

Honestly, if you had described this America to me more than half a century ago, I would have laughed in your face.

Donald Trump becoming president? You must be kidding!

If you want a bizarre image, just imagine him in the company of Abraham Lincoln. I mean, really, what’s happened to us?

Not, of course, that we haven’t had bizarre politicians in Washington before.  I still remember watching the mad, red-baiting Senator Joseph McCarthy on our new black-and-white television set in April 1953.  He was a brute and looked it (though, to my nine-year-old mind, he also seemed like every belligerent dad I knew). Still, whatever he was, he wasn’t president of the United States. At the time, that was former World War II military commander Dwight D. Eisenhower.

And whatever McCarthy might have been, he wasn’t a sign of American (or planetary) decline. The Donald? Well, he’s something else again. In some ways, he could be considered the strangest marker of decline in our history. After all, when he entered the Oval Office, he took over a country whose leaders had long considered it the greatest, most powerful, most influential nation ever.

Think of him, if you will, as the weirdest seer of our times. To put him in the context of the science fiction I was reading in the previous century, he might be considered a genuine Philip K. Dick(head).

As I wrote in April 2016 in the midst of Trump’s initial run for the presidency, he was exceptional among our political class and not for any of the obvious reasons either. No, what caught my attention was that slogan of his, the one he had trademarked in the wake of Mitt Romney’s loss to Barack Obama in 2012: Make America Great Again, or MAGA. The key word in it, I realized then, was that again. As I noted at the time, he was unique in a presidential race not just as a bizarre former TV personality or even a successful multiple bankruptee, but as “the first person to run openly and without apology on a platform of American decline.” In his own way, he had his eye — and what an eye it was! — on a reality no other politician in Washington even dared consider, not when it came to the “sole superpower” of planet Earth. He was, after all, insisting then that this country was no longer great.

Trump proved to be a one-of-a-kind candidate (not that he wouldn’t have been without that MAGA slogan). And as we now know, his message, which rang so few bells among the political class in Washington, rang all too many in the (white) American heartland. In other words, Donald Trump became the prexy of decline and what a decline it would be! According to one recent survey, half of all Americans, in this increasingly over-armed country of ours, have come to believe that an actual civil war is on the way in the near future.

Think of the miracle — if you don’t mind my using such a word in this context — of Donald Trump’s presidency this way: in some sense, he managed to turn not just Republicans but all of us into his apprentices. And those years of our apprenticeship occurred not just in an increasingly crazed and violent America, but on an ever stranger, more disturbed planet.

Yes, once upon a time I read sci-fi novels in a way I no longer do and felt then that I was glimpsing possible futures, however weird. But believe me, what’s happening today wouldn’t have passed as halfway believable fiction in the late 1960s or early 1970s.

So, let me say it again: honestly, Donald Trump?

Our Liz

Having lived through the antiwar movement of the 1960s and 1970s (often enough in the streets) and the madness of the American war of destruction in Vietnam, it’s strange to spend my waning years in a country where the main protest movement, the Trumpist one, represents a nightmare of potential destruction right here at home. And by “right,” of course, I mean wrong beyond belief. It’s led, after all, by a superduper narcissist who wouldn’t qualify as a fascist only because he prefers fans to followers, apprentices to jackbooted thugs. As the events of January 6, 2021, showed, however, he wouldn’t reject them either. In an earlier moment, in fact, he urged such thugs to “stand back and stand by.”

You know that you’re in a world from hell when the heroine of this moment is the politically faithful daughter of a former vice president who, along with President George W. Bush, used the 9/11 attacks to usher us into wars of aggression in Afghanistan and Iraq, as well as into the expansive Global War on Terror — who, that is, remains an unindicted war criminal first class. Keep in mind as well that, before she became our Liz, she voted against impeaching President Trump in 2019 and voted for his programs (if you can faintly call them that) a mere 93% of the time.

And mind you, all of this is just scratching the surface of our world from hell.

Not even in my worst nightmares of half a century ago was this the American world I imagined. Not for a day, not for an hour, not for a second did I, for instance, dream of American school guards armed with assault rifles. I mean, what could possibly go wrong?

I was born, of course, into an America on the rise in which you could still imagine — it seems ridiculous to use the word today — progress toward a genuinely better world of some sort. That world is evidently now something for the history books.

Think of Donald Trump as an all-too-literal sign of the times at a moment when about 70% of Republicans consider the last election to have been stolen and Joe Biden an illegitimate president. Perhaps 40% of them also believe that violence against the government can sometimes be justified. This in a country that had long fancied itself as the greatest of all time.

And if you really want a little sci-fi madness that would, in the 1960s, have blown my mind (as we liked to say then), consider climate change. As we argue like mad about the last election, while Trumpists pursue local secretary-of-state positions (not to speak of governorships) that could give them control over future election counts, as Americans arm themselves to the teeth and democracy seems up for grabs, let’s not forget about the true nightmare of this moment: the desperate warming of this planet.

Yes, “our” Earth is burning in an all-too-literal way — and flooding, too, with “superstorms” in our future. And don’t forget that it’s melting as well at a rate far more extreme than anyone imagined once upon a time. Recent research on the Arctic suggests that instead of warming, as previously believed, at a rate two to three times faster than the rest of the planet, it’s now heating four times as fast. In some areas, in fact, make that seven times as fast! So, in the future, see ya Miami, New York, Ho Chi Minh City, Shanghai, and other coastal metropolises as sea levels rise ever faster.

Kissing the planet goodbye?

Honestly, you’d hardly know it in parts of this country and among Republicans (even if that party’s key figures were, once upon a time, environmentalists), but this planet is literally going down — or maybe, in temperature terms, I mean up — in flames.

Greenhouse gases continue to pour into the atmosphere and certain heads of state, like Donald Trump in his White House days, remain remarkably dedicated to emitting yet more of them. The Mexican president is one example, the Russian president another. And you no longer have to turn to science fiction to imagine the results. An unnerving sci-fi-style future is becoming the grim present right before our eyes. This summer, for instance, Europe has seen unparalleled heat and drought, with both Germany’s Rhine River and Italy’s Po River drying up in disastrous fashion. And just to add to the mix, parts of that continent have also seen storms of a startling magnitude and staggering flooding.

Meanwhile, China has been experiencing a devastating more-than-two-month-long set of heat waves with record temperatures and significant drought, all of which has proved disastrous for its crops, economy, and people. And oh yes, like the Rhine and Po, the Yangtze, the world’s third-largest river, is drying up fast, while the heat wave there shows little sign of ending before mid-September. Meanwhile, the American southwest and west continue to experience a megadrought the likes of which hasn’t been seen on this continent in at least 1,200 years. Like the Rhine, Po, and Yangtze, the Colorado River is losing water in a potentially disastrous fashion, while the season for heat waves in the United States is now 45 days longer than in the 1960s. And that’s only to begin recording planetary weather catastrophes. After all, I haven’t even mentioned the ever-fiercer wildfires, or megafires, whether in Alaska, New Mexico, France, or elsewhere; nor have I focused on the increasingly powerful hurricanes and typhoons that have become part of everyday life (and destruction and death).

So, isn’t it a strange form of science fiction that, in response to such a world, such a crisis, one that could someday signal the end of civilization, the focus in this country is on Donald Trump and company? Don’t you find it odd that the two greatest greenhouse gas emitters and powers on the planet, the United States and China, have responded in ways that should appall us all?

Joe Biden and his top national-security officials have continually played up the dangers of the rise of China, put significant energy into developing military alliances against that country in the Indo-Pacific region, and functionally launched a new Cold War, more than 30 years after the old one went to its grave. In addition, Nancy Pelosi, a number of other congressional representatives, and even state ones have pointedly visited the island of Taiwan, purposely infuriating the Chinese leadership. Meanwhile, the U.S. Navy has ever more regularly sent its vessels through the Taiwan Strait and aircraft carrier task forces into the South China Sea.

For its part, China’s officialdom, while continuing to push the building of coal-fired power plants, has recently launched military demonstrations of an escalating sort against Taiwan, while preparing to join Vladimir Putin’s Russia for the second year in a row in “military exercises,” even as the war in Ukraine, a first-class carbon disaster, goes on and on and on. At the same time, furious about those Taiwan visits from Washington, China’s leaders have essentially cut off all relations with the U.S., including any further discussions about how to cooperate in dealing with climate change.

So, a second cold war amid a growing climate disaster? If you had put that into your sci-fi novel in 1969, it would undoubtedly have seemed too absurd a future to be publishable. You would have been laughed out of the room.

Admittedly, the history of humanity has largely been a tale of the triumph of the unreasonable. Still, you might think that, as a species, we would, at a minimum, not actively opt for the destruction of the very planet we live on. And yet, think again.

At least the Biden administration did recently get a bill through Congress (despite the opposition of every Trumpublican) that dealt with global warming in a reasonably significant way and the president may still invoke his executive powers to do more. It’s true as well that the Chinese have been working hard to create ever less expensive alternative energy sources. Still, none of this takes us far enough. Not on this planet. Not now.

And keep in mind that, were a desperate and disparate America to elect Donald Trump again in 2024 (by hook or crook), the country that historically has put more greenhouse gases into the atmosphere than any other land might be kissing the planet we’ve known goodbye.

Believe me, it’s strange to find myself remembering a long-gone world in which the major destruction was happening thousands of miles away in Vietnam, Laos, and Cambodia. I mean, that was bad enough to get me into the streets then. Now, however, the destruction we’re significantly responsible for is happening right here, right where you and I both live, no matter where that might be.

What we’re watching is a tragedy of an unparalleled sort in the making for our children and grandchildren, which leaves me sad beyond words. As far as I’m concerned, that’s a sci-fi novel I’d rather not read and a sci-fi life I’d rather not be living.

Tom Engelhardt created and runs the website TomDispatch.com. He is also a co-founder of the American Empire Project and the author of a highly praised history of American triumphalism in the Cold War, The End of Victory Culture.  A fellow of the Type Media Center, his sixth and latest book is A Nation Unmade by War.

Copyright ©2022 Tom Engelhardt — distributed by Agence Global

—————-
Released: 29 August 2022
Word Count: 2,122
—————-

Robert Lipsyte, “Being anything but a good sport in Saudi Arabia”

August 18, 2022 - TomDispatch

Here’s the big question in Jock Culture these days: Is the Kingdom of Golf being used to sportswash the Kingdom of Saudi Arabia? Or is it the other way around? After all, what other major sport could use a sandstorm of Middle Eastern murder and human-rights abuses to obscure its own history of bigotry and greed? In fact, not since the 1936 Berlin Olympics was used to cosmeticize Nazi Germany’s atrocities and promote Aryan superiority have sports and an otherwise despised government collaborated so blatantly to enhance their joint international standings.

Will it work this time?

The jury has been out since the new Saudi-funded LIV Tour made an early August stop at the Trump National Golf Course in Bedminster, New Jersey. (That LIV comes from the Roman numerals for 54, the number of holes in one of its tourneys.) And I’m sure you won’t be surprised to learn that it was hosted by a former president so well known for flouting golf’s rules that he earned the title Commander-in-Cheat for what, in the grand scheme of things, may be the least of his sins.

That tournament featured 10 of the top 50 players in the world. They were poached by the Saudis from the reigning century-old Professional Golfers Association (PGA), reportedly for hundreds of millions of dollars in signing bonuses and prize money. It was a shocking display for a pastime that has traded on its image of honesty and sportsmanship, not to mention an honor system that demands players turn themselves in for any infractions of the rules, rare in other athletic events where gamesmanship is less admired.

No wonder our former president hailed the tour as “a great thing for Saudi Arabia, for the image of Saudi Arabia. I think it’s going to be an incredible investment from that standpoint, and that’s more valuable than lots of other things because you can’t buy that — even with billions of dollars.”

The tournament was held soon after Joe Biden gave that already infamous fist bump to crown prince and de facto Saudi ruler Mohammed bin Salman. The two events radically raised bin Salman’s prestige at a moment when, thanks to the war in Ukraine, oil money was just pouring into that kingdom, and helped sportswash the involvement of his countrymen in the 9/11 attacks, as well as the brutal murder and dismemberment of Saudi dissident and Washington Post columnist Jamal Khashoggi.

Deals they couldn’t refuse

The buy-off money came from the reported $347 billion held by the Public Investment Fund, Saudi Arabia’s sovereign wealth fund. Top golfers were lured into the LIV tour with sums that they couldn’t refuse. A former number-one player on the PGA tour, Dustin Johnson, asked about the reported $125 million that brought him onto the Saudi tour, typically responded by citing “what’s best for me and my family.”

Phil Mickelson, the most famous of the LIV recruits and a long-time runner-up rival of Tiger Woods, justified his reported $200 million in a somewhat more nuanced fashion. In a February interview at the website The Fire Pit Collective, he admitted that Saudi government officials are “scary motherfuckers,” have a “horrible record on human rights,” and “execute people… for being gay.” Yet he also insisted that the LIV was a “once-in-a-lifetime opportunity to reshape how the PGA Tour operates.”

Family needs and the supposed inequities of the PGA’s previously hegemonic universe were the explanations a number of golfers used to justify biting the hand that had fed them for so long. Meanwhile, Tiger Woods, the greatest recipient of PGA largesse and probably the greatest golfer of our time, if not any time, reportedly turned down an almost billion-dollar offer with sharp words for those who had gone for the quick cash.

The PGA obviously agreed and barred any golfer who took up the Saudi offers from its tournaments. In response, some of them promptly sued the PGA.

The Kingdom of Golf

On the face of it, creating a Kingdom of Golf might not seem like a crucial thing for a morally challenged monarchy to do. After all, golf isn’t exactly a charity or a social justice campaign that’s likely to signal your virtue. It’s just a game whose players use sticks to swat little balls into holes in the ground while strolling around. It’s not even good exercise and far less so if you’re driving the course in a motorized cart or hire a caddie to carry your sticks. And it gets worse. After all, the irrigation water and poisonous chemicals necessary to keep the playing fields luxuriantly green at all times are abetting ecological disaster.

Golf symbolized reactionary greed even before the Saudis entered the picture. For starters, its competitors are among the only professional athletes ranked purely by the cash prizes they’ve won. And the leading golfers invariably earn far more from endorsements and speaking engagements. The sport’s almost comic upper-class snootiness sometimes seems like an orchestrated distraction from the profound racism, sexism, and anti-Semitism lodged in its history and, even today, the discrimination against women that still exists at so many of the leading country clubs that sustain the game.

Golf has long been retrograde, exclusionary, and money-obsessed. To put that in perspective, the estimated revenue of the Professional Golfers Association in 2019 was $1.5 billion — and it boasts a non-profit status that’s sometimes been questioned. Lucrative as it is, it also proved distinctly vulnerable to an attack by an oil-soaked autocracy that, in warming up to invade golf, had already invested in Formula One racing, e-sports, wrestling, and its most recent controversial purchase, a British Premier League soccer team (which provoked protests from fans and Amnesty International).

Still, the Saudis’ move on golf was even bolder, more ambitious, and somehow almost ordained to happen.

Unlike football and baseball, which are convenient amalgams of socialism for the owners (in their collusive cooperation) and dog-eat-dog capitalism for the players and other personnel, golf is more of a monarchy along the lines of, um, Saudi Arabia. Until the LIV Tour came along, the main PGA tour, that sport’s equivalent of the major leagues, had been all-powerful in its control over both golfers and venues.

Over the years, golfers have indeed complained about that, but except for Greg Norman, a 67-year-old Australian former champion, not too loudly. Now a highly successful clothing and golf-course-design entrepreneur, Norman is called the Great White Shark for his looks and aggressive style. No wonder he’s now the CEO of LIV Golf and the ringleader of the campaign to recruit the top pros to play in the breakaway tour.

Norman denies that he answers to the crown prince, but his attempts to distance himself from that ruthless Saudi ruler are not taken seriously by most observers of golf, including the Washington Post’s Sally Jenkins, who wrote:

Let’s be frank. LIV Golf is nothing more than a vanity project for Norman and his insatiable materialism — and an exhibition-money scam for early-retiree divas who are terrified of having to fly commercial again someday. By the way, the supposed hundreds of millions in guaranteed contracts for a handful of stars — has anyone seen the actual written terms, the details of what Phil Mickelson and Dustin Johnson will have to do to collect that blood-spattered coin, or is everyone just taking the word of Norman and a few agents trying to whip up commissions that it’s all free ice cream?

One of the best sports columnists, Jenkins may seem excessive in her attack on Norman, but the passions that golf and Saudi Arabia have raised separately only increase in tandem. On the one hand, there’s the outrage when it comes to Saudi Arabia’s murderous human-rights abuses and Washington’s continuing complicity with the regime, thanks in particular to its ongoing massive arms sales to that country. (The latest of those deals, largely Patriot missiles sold to that country for $3 billion, feels distinctly like a kind of bribery.)

On the other hand, there’s the long-standing resentment of golf as a symbol of rich, white, male supremacy. In fact, it’s still seen as a private meeting place to create and maintain relationships that will lead to significant political and business decisions, the sports equivalent of, um, Saudi missile deals.

The pro golfers profiting from the current bonanza may not engender much sympathy, but the derision for their materialism should, at least, be put in context. Until the LIV came along, they had next to no options in their sport and few of them made Mickelson- or Johnson-style money. Worse yet, their lonely gunslinger lifestyles made unionization at best the remotest of possibilities, especially for figures deeply wired into the corporate community through their sponsorship deals.

The Saudi golf coup (because that’s indeed what it is) has taken place at an interesting juncture for the sport and its two most compelling figures, Trump and Tiger, who have indeed played together, both seeming to enjoy the trash talk that went with the experience.

Tiger in twilight

Tiger, who is now in steep decline, has long been the face of the sport at its most accomplished, captivating, and richest despite, or perhaps because of, his paradoxical nature.

His first auto accident in 2009 revealed a tortured soul involved in a maelstrom of sexual infidelities and occasioned a re-evaluation of his mythic rise. No surprise then that he’s struggled ever since, briefly regaining his form before more accidents and surgeries diminished his dominance.

As long as he continued to show up and hit a ball, popular interest in the game was sustained and the PGA’s grip held firm. As he diminished, however, so did public fascination with golf.

In a way, he had been Tiger-washing the sport. It was hard to sustain a critique of golf’s retrograde and exclusionary nature, however justified, while it hid behind his Black face. Of course, that vision of golf was already wearing thin when Tiger refused to define himself as African-American, preferring “Cablinasian” — meant to reflect his racial mix of Caucasian, Black, (American) Indian, and Asian.

With Tiger, at 46, fading as an active force, PGA golf had already become vulnerable to a coup long before the Saudis and The Donald appeared on the scene. And who could have been a handier guy for those Middle Eastern royals than one with such experience in coups, even if his first try, with all those armed deplorables, failed on January 6, 2021?

This time around, though, Trump had millionaires with golf clubs, Middle Eastern oil royalty, and the equivalent of bottomless sacks of PAC money.

And, of course, with Trump involved, anything could happen. The first time he was infamously linked to sports, in the early 1980s as the owner of the New Jersey Generals of the upstart United States Football League (USFL), he managed to destroy his own organization in what would emerge as his signature style of reckless, narcissistic malfeasance. An early Trump lie (in an interview with me, no less) was that the USFL would continue its summer schedule so as not to interfere with the National Football League’s winter one. Within days of that statement, he led a lawsuit aimed at forcing a merger of his league and the National Football League. It ended badly for Trump and the USFL.

This time around, Trump has said that the LIV Tour would avoid scheduling tournaments in conflict with major PGA events. That will probably turn out to be anything but the case, too. So how will his latest foray into Jock Culture play out? Will the PGA beat back the Saudi coup (maybe by raising its prize money) or will the Saudis burnish their global image through a sport undeservedly renowned for integrity and class?

And what about the Commander-in-Cheat? If only this Saudi enterprise would leave him too busy on the links (not to speak of fighting off jail in connection with those purloined secret documents of his) to run for the presidency again in 2024.

Ultimately, whether Saudi Arabia or golf gets sportswashed, it’s Trump we need to rinse out of our lives.

Robert Lipsyte writes regularly for TomDispatch (where this article originated) and is a former sports and city columnist for the New York Times. He is the author, among other works, of SportsWorld: An American Dreamland.

Copyright ©2022 Robert Lipsyte — distributed by Agence Global

—————-
Released: 18 August 2022
Word Count: 2,010
—————-

Kelly Denton-Borhaug, “Is moral clarity possible in Donald Trump’s America?”

August 16, 2022 - TomDispatch

Recent episodes of purposeful and accidental truth-telling brought to my mind the latest verbal lapse by George W. Bush, the president who hustled this country into war in Afghanistan and Iraq after the 9/11 attacks. He clearly hadn’t planned to make a public confession about his own warmongering in Iraq when he gave a speech in Texas this spring. Still, asked to decry Russian president Vladimir Putin’s unjustified invasion of Ukraine, Bush inadvertently and all too truthfully placed his own presidential war-making in exactly the same boat. The words spilled out of his mouth as he described “the decision of one man to launch a wholly unjustified invasion of Iraq — I mean of Ukraine.”

Initially, he seemed shocked that he had blurted that out and tried to back off his slip by shrugging and muttering, “Iraq, too,” as if it were a joke. Some in his audience even laughed. But his initial attempt to sideline his comment only deepened the hole he was in. Then he tried another ploy. He suggested that his slip could be forgiven or excused because of his age, 75, and that his invasion and the destruction of Iraq could now be forgiven because of his cognitive decline. All in all, it was a first-class mess.

An earlier pathetic attempt at comedy

I remember another of Bush’s attempted jokes that got an immediate laugh from his audience, but soon fell seriously flat. It was in 2004. The Iraq War was underway and the president was at the yearly dinner of the Radio and Television Correspondents Association, a black-tie event attended by both journalists and politicians.

After various comedy sketches, then-President Bush rose to present a short meant-to-be humorous slideshow featuring himself supposedly looking for the nonexistent weapons of mass destruction (WMD) in Saddam Hussein’s Iraq. Remember that, in the lead-up to war there, Americans were hammered with fearful and deceptive political messaging, emphasizing that only an invasion could stop that country’s ruler from having WMD. (None were ever found, of course.) At that dinner, Bush showed photos of himself supposedly searching for those devastating weapons in the Oval Office beneath a cushion on the couch and under the desk. “No weapons under there! Maybe they’re here!” said the smiling president repeatedly in a sing-song voice, as if engaged in a child’s game. Horrifyingly enough, many in that audience of journalists did indeed laugh.

I was offended then, just as I was by Bush’s recent slip and his sorry attempts to minimize and excuse his responsibility for the blood on his hands, the massive death toll from his invasion, and so much additional destruction and suffering. According to The Costs of War project, more than 207,000 Iraqi civilians were killed in that nightmare, while the number who died from the indirect violence of that war was far higher, given the damage done to the Iraqi health care system and the rest of that devastated country’s infrastructure. Nearly 20 years later, people are still dying needlessly. And I also mourn the more than 7,000 U.S. servicemembers who died in the post-9/11 war zones Bush created, as well as the many more who were wounded.

I can’t help but wonder if George Bush doesn’t feel at least a little of this himself. Otherwise, why would he have made such a slip? Or maybe it wasn’t a slip at all, but an inadvertent confession.

That his telling gaffe about Iraq and Ukraine received so little attention certainly reveals something about our media’s ongoing uneasiness with Bush’s wars and perhaps the conflicted feelings of our citizenry as well when it comes to what they did (and didn’t do) during the Iraq War. How many who were initially enthusiastic about the Afghan and Iraq wars would now, like their former president, admit we were wrong? How many people who supported those conflicts have taken what happened to heart and are thinking more deeply about an American propensity for war and the war culture that goes with it? Like George W. Bush, too few, I’m afraid.

Worshipping lies

This past July 24th, the New York Times featured “I was wrong” op-ed pieces by a number of its columnists. The editors defined “being wrong” as “incorrect predictions and bad advice,” as well as “being off the mark.” Of course, one of the definitions of the Greek word for “sin” (amartia) in the New Testament is “missing the mark.” Fascinating.

I would have taken the editors’ definitions further, though. Saying “I was wrong” means more than “rethinking our positions on all kinds of issues,” as the Times suggested. Often, the problem isn’t simply that people lack the best, most up-to-date information or data. Only by digging into ethics and social psychology will we better understand why people deceive not just others but even themselves with lies, slippery rationalizations, or comedic attempts at distraction to cover up deeper dynamics that have to do with privilege and power, or what religious traditions sometimes call “worshipping false idols.”

Moral psychologist Albert Bandura has explored some of the diverse mechanisms people rely on to morally disengage and excuse inhumane conduct. They shift their rhetoric and thinking to redefine and even rename what they are doing, “sanitizing” language (and their acts) in the process. In this way, they often shift responsibility onto someone else, minimize any damaging consequences for themselves, and dehumanize the victims of the violence they’ve let loose.

But there are other examples of moral disengagement that are even harder to understand. In such cases, people make decisions and act in ways that even undercut their own self-interest and values. For me, one of the saddest recent examples is Stephen Ayres, a witness at the House select committee’s January 6th hearings this summer. He had been part of the Trumpist mob that stormed the Capitol. A family man who, until then, owned a house and had a job with a cabinet company, Ayres came across in those hearings as a lost soul who couldn’t fully comprehend how he had willingly injured himself and his family by idolizing Donald Trump and his election lies.

His arrest for participating in the insurrection resulted in the loss of almost everything he had. With his wife sitting behind him, he testified about having to sell his house, losing his job, and struggling to come to terms with his actions. “I wish I had done my own research,” he said, trying to explain how he could have been so easily deceived by Trumpist lies regarding the 2020 presidential election.

Clearly, the social media bubble he slipped into that captivated and compelled him to head for Washington had given his life new meaning and an otherwise missing sense of excitement. He hadn’t planned to enter the Capitol building that day but was swept away by the moment. “Basically, we were just following what [Trump] said,” Ayres testified. In handing over his critical thinking to right-wing social media and a president intent on hanging onto power at any cost, he unwittingly also handed over his capacity for moral deliberation and, in the end, his very life.

Liz Cheney’s struggle for moral clarity

In recent weeks, Liz Cheney, vice-chairperson of the January 6th committee, was questioned about a past moral choice of hers by Lesley Stahl in a 60 Minutes interview — specifically, how years ago she threw her lesbian sister and family under the bus for political purposes. It was a time when Cheney was struggling to get elected in conservative Wyoming. That meant coming out as anti-LGBTQ. Now, she says, “I was wrong” to have condemned her sister then.

Listening to her, I wanted to hear more about such moral grappling and how, in these years, her convictions had or hadn’t changed when it came to people, religion, family, political life, power, and the role her father played as George W. Bush’s vice president in those godforsaken wars in Afghanistan and Iraq. Unfortunately, Stahl didn’t push her further.

I disagree with Liz Cheney on almost every policy position she’s taken in these years. Nonetheless, I find myself grateful for her rejection of Donald Trump’s mad election claims and her determined, even steely, leadership of the January 6th committee hearings. Cheney eventually discovered her moral bearings on her sister’s sexual orientation and family life. Now, I wonder if that past moral struggle influenced her decision to throw political expediency to the wind regarding her own House seat in a Wyoming primary that she might lose on August 16th. After all, by resisting the Trumpian tide, she’s become one of the few Republicans willing to do some serious truth-telling.

Today, Cheney finds herself in another league from most of her party’s leaders and power players. In the state where I live, Pennsylvania, Republicans are coalescing behind the candidacy of Doug Mastriano for governor. Candidate Mastriano not only wants to arm school employees, but according to my local newspaper, he even organized buses for January 6th, now “rubs shoulders with QAnon conspiracy theorists,” and until recently had an active social media account at Gab, a site well-known for its white supremacist and antisemitic rhetoric.

Mastriano continues to spread Trump’s lies about the 2020 election, is a Christian nationalist, and believes in an abortion ban without exceptions, and the list goes on and on. Nonetheless, Republicans like Andy Reilly, a member of the state GOP national committee, rationalize their support for Mastriano by saying things like, “When you play team sports, you learn what being part of a team means… Our team voted for him in the primary.”

Lying to others and oneself

What enables such self-deception? According to journalist Mark Leibovich, author of Thank You for Your Servitude: Donald Trump’s Washington and the Price of Submission, what “made Trump possible” even after the January 6th insurrection was “rationalization followed by capitulation and then full surrender.” Reviewing Leibovich’s book, Geoffrey Kabaservice added this: “The routine was always numbingly the same, and so was the sad truth at the heart of it. They all knew better.” In other words, “knowing better” doesn’t assure anyone of doing the right thing. Instead, too many Americans were swayed by “greed, ambition, opportunism, fear, and fascination of Trump as a pure and feral rascal.”

Tim Miller, author of Why We Did It: A Travelogue from the Republican Road to Hell, adds “hubris, ambition, idiocy, desperation, and self-deception” to the mix of reasons why so many politicians do what they do. “How do people justify going along?” he asks. But he, too, played that game once upon a time. A Republican gay man with a husband, he rationalized helping the GOP pass anti-LGBTQ legislation by “compartmentalizing” his personal life from his professional one. As he now says, “Being around power, being addicted to power,” along with the insatiable compulsion to “be in the room where it happens,” is a recipe that leads people to act self-deceptively, while deceiving others.

It’s like placing scales over your own eyes and those of others, to blind as many people as possible, yourself included, to the immorality of your acts. And some lie even more to themselves, claiming that they can resist the worst tendencies of destructive power-mongering. They say, “We need to have good people in the room” to stop the worst from happening, even as they capitulate to power players and justify what should never be justified.

Many of us are waiting to hear an “I was wrong” from so many politicians (though I can’t imagine Donald Trump ever succumbing to honesty), including most of the Republican leadership. Just for starters, I’d like to hear “I was wrong” regarding Muslim bans, the demonization of immigrants, the refusal to seriously address gun violence, the denial of women’s human rights, the gerrymandering and weakening of voting rights, religious nativism, and sidling up to white supremacy, not to speak of the supposed “steal” of the 2020 election. But given the likelihood that people in power will lie to themselves and others, I’m not holding my breath.

Telling the truth about U.S. military spending

What I’m also waiting for is an “I was wrong” from both Democratic and Republican politicians in Washington who, year after year, support ever more outlandish military budgets, despite so many other existential crises in our country and on the planet, despite the death-dealing costs of war to the servicemembers Americans claim to highly esteem, and despite the fact that our violence abroad simply hasn’t worked.

Remember that the United States spends more than half of its entire discretionary federal budget on militarization and war, a tally greater than the military budgets of the next nine highest-spending countries combined. Tragically, it doesn’t appear that this will change any time soon.

According to an analysis by the anti-corruption group Public Citizen, in 2022, the congressional armed services committees only added to the already gigantic military budget the Biden administration requested for 2023. The House added another $37.5 billion, while the Senate added $45 billion. Our leaders refuse to learn from the last decades of unremitting war. Instead, power and privilege continue to hold sway.

As the same report explained, after military-industrial-complex corporations donated $10 million to congressional armed services committee members, “the Department of Defense received a potential $45 billion spending increase.” This was in addition to the president’s $813 billion recommendation. The report concluded, “The defense contractors will have clinched a return on its $10 million investment of nearly 450,000%.”
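The report’s eye-popping percentage follows from simple arithmetic. A quick sketch of the calculation (my own check, not from Public Citizen’s report):

```python
# Checking the cited return-on-investment figure:
# $10 million in donations set against a potential $45 billion spending increase.
donation = 10_000_000             # contractor donations to committee members ($)
budget_increase = 45_000_000_000  # potential Senate spending increase ($)

# Return expressed as a percentage of the original "investment."
roi_percent = budget_increase / donation * 100

print(f"{roi_percent:,.0f}%")  # → 450,000%
```

That is, the $45 billion increase is 4,500 times the $10 million donated, or a 450,000% return, matching the report’s figure.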

It’s discouraging to see how deception and rationalization so regularly undermine truth and moral courage. It’s also sobering to witness individuals who willingly lie to themselves and, in doing so, subvert their own and others’ wellbeing. But I’m also encouraged by times when, as with Liz Cheney on that committee, some of us demonstrate what it means to dig deeply for moral clarity against the prevailing headwinds of moral disengagement, disinformation, power, and privilege.

The fact is that truth-telling and confession, while difficult, are good for the soul. I wish for more and hope it will be enough. God knows, all of us and this beleaguered planet truly need it.

Kelly Denton-Borhaug writes regularly for TomDispatch (where this article originated). She has long been investigating how religion and violence collide in American war-culture and teaches in the global religions department at Moravian University. She is the author of two books, U.S. War-Culture, Sacrifice and Salvation and, more recently, And Then Your Soul is Gone: Moral Injury and U.S. War-Culture.

Copyright ©2022 Kelly Denton-Borhaug — distributed by Agence Global

—————-
Released: 16 August 2022
Word Count: 2,346
—————-

Andrea Mazzarino, “A military rich in dollars, poor in people”

August 15, 2022 - TomDispatch

The American military is now having trouble recruiting enough soldiers. According to the New York Times, its ranks are short thousands of entry-level troops and it’s on track to face the worst recruitment crisis since the Vietnam War ended, not long after the draft was eliminated.

Mind you, it’s not that the military doesn’t have the resources for recruitment drives. Nearly every political figure in Washington, including House Speaker Nancy Pelosi and Senate Minority Leader Mitch McConnell, invariably agrees on endlessly adding to the Pentagon’s already staggering budget. In fact, it’s nearly the only thing they seem capable of agreeing on. After all, Congress has already taken nearly a year to pass a social-spending package roughly half the size of this year’s defense budget, even though that bill would mitigate the costs of health care for so many Americans and invest in clean energy for years to come. (Forget about more money for early childhood education.)

Nor is the Pentagon shy about spending from its bloated wallet to woo new recruits. It’s even cold-calling possible candidates and offering enlistment bonuses of up to $50,000.

As it happens, though, its recruiters keep running into some common problems that either prevent young people from enlisting or from even wanting to do so, including the poor physical or mental health of all too many of them, their mistrust of the government (and its wars), and the recent pandemic-related school closures that made it so much harder for recruiters to build relationships with high-school kids. Many of these recruitment issues are also all-American ones, related to the deteriorating quality of life in this country. From a basic standard of living to shared values or even places where we might spend much time together, we seem to have ever less connecting us to each other. In a nation where friendships across socioeconomic classes are vital to young peoples’ access to new opportunities, this ought to trouble us.

Playing alone

When I arrived to pick my kids up from camp recently, an elementary school classmate playing basketball with them was yelling “This is for Ukraine!” as he hurled the ball towards the hoop. It promptly bounced off the backboard, landing on a child’s head just as he was distracted by a passing bird. Another mother and I exchanged playful winces. Then we waited a few more minutes while our kids loped back and forth between the hoops, not really communicating, before taking our charges home.

By the time I had gotten my young kids signed up for a camp so that my spouse, an active-duty military officer, and I could continue our work lives this summer, basketball was all that was left. The sun often baked the courts so that less time was spent outside playing and more time talking, while trying to recover from the heat. Though our children were new to group activities, having largely engaged in distance learning during the height of the coronavirus pandemic, they did find a couple of things to talk about with the other kids that reflected our difficult world. “Mommy,” said my seven year old when we got home one day, “a kid said Russia could nuclear bomb us. Could they?” On another occasion, he asked, “Is Ukraine losing?”

They know about such subjects because they sometimes listen in on nighttime discussions my spouse and I have. We might typically consider Russian President Vladimir Putin’s elusive nuclear redline and how close the U.S. will dare creep to it in arming Ukrainian forces. As a therapist who works with active-duty military families, I’m all too aware that kids like ours often worry about violence. Similarly, it’s my experience that military kids tend to wonder whether some kind of repeat of the January 6th attack on our Capitol by Trump’s armed mob could, in the future, involve our military in conflicts at home in which our troops might either kill or be killed by their fellow citizens.

Such violence at home and abroad has become routine in daily life in this country and been absorbed by troubled young minds in a way that leaves them attracted to video games involving violence. Those can, under the circumstances, seem like a strangely familiar comfort. It’s a way for them to turn the tables and put themselves in control. I recently had a perceptive neighbor’s kid tell me that playing the military game Call of Duty was a way of making war fun instead of worrying about when World War III might break out.

My family is fortunate because we can afford to be home in our spacious yard long enough to let our kids play outside with one another, delighting in nature. I also watch them play “war” with sticks that they reimagine as guns, but that’s about where their militarism ends.

I know that military spouses are expected to encourage their children to join the armed forces. In fact — don’t be shocked — some 30% of young adults who do join these days have a parent in one of the services. But I guess I’m a bit of an odd duck. Yes, I married into the military out of love for the man, but I’ve led a career distinct from his. I even co-founded the Costs of War Project at Brown University, which played a vital role in critiquing this country’s wars in this century. I also became a therapist with a professional, as well as personal, view of the healthcare deficits, internal violence, and exposure to tough work conditions that military life often brings with it.

To take one example, my spouse and I have been waiting for months to get care for a life-threatening condition that those with comparable insurance coverage in the civilian population would often have access to in weeks or less. A host of related health conditions are no less poorly treated in our all-too-well-funded military these days.

As we plan to wind down our family’s stint in the military, it’s hard to ignore how little of our fat military budget with its ever fancier weaponry goes to help Americans in those very services. A line from the new film Top Gun: Maverick comes to mind, as the title character’s commanding officer warns him: “The future is coming. And you’re not in it.”

Capitalism’s military marriage

Thanks in part to growing wealth inequalities in this country and what often seems to be a perpetual stalemate in Congress regarding social spending, the next generation of would-be fighters turn out to be in surprisingly rough shape. It’s no secret that the U.S. military targets low-income communities in its recruitment drives. It has a long record, for instance, of focusing on high schools that have higher proportions of poor students. Recruiters are also reportedly showing up at strip malls, fast-food joints, and even big box stores — the places, that is, where many poor and working-class Americans labor, eat, or shop.

So, too, have the military and the rest of the national security state piggybacked on an American love of screens. The alliance between Hollywood and military recruiters goes all the way back to World War I. After the attacks of September 11, 2001, however, the government stepped up its efforts to sell this country’s latest wars to the public, presenting them as a ticket to greater opportunities for those who enlisted and, of course, a patriotic fight against terrorism. The smoke had barely cleared from the site of the Twin Towers when Pentagon officials began meeting with Hollywood directors to imagine future war scenarios in which the U.S. might be involved. Present at those meetings were the directors of movies like Delta Force One, Missing in Action, and Fight Club.

It appears that those efforts had an effect. A 2014 social-science study found, for instance, that when it came not to the military directly but to the U.S. intelligence community, 25% of the viewers of either Argo or Zero Dark Thirty changed their opinions about its actions in the war on terror. Who knew that, with the help of stars like Jessica Chastain, waterboarding and sleep deprivation could be made to look so sexy?

Some kids were more likely than others to pick up such messages. On average, low-income children have more screen time daily than higher income ones do. And many teens increased their screen time by hours during the pandemic, particularly in poor families, which grew only poorer compared with wealthy ones in those years. As a result, in a country where basic services like school and healthcare have been harder to access due to Covid-19, the few spaces for social interaction available to many vulnerable Americans have remained saturated with violence.

A frayed social safety net and the military

In such communities, it turns out that the military might no longer be able to promise opportunity to that many young people. After all, our government has done an increasingly poor job of providing a basic safety net of food security, a decent education, and reasonable healthcare to our poorest citizens and so seems to have delivered many of them to adulthood profoundly unwell and in no condition to join the military.

Year after year, the proportion of young people who are mentally and physically healthy has been shrinking. As a result, roughly three-quarters of those between the ages of 17 and 24 are automatically disqualified from serving in the military because of obesity, a criminal record, drug use, or other similar reasons.

To take one example, obesity among kids has skyrocketed in recent years. During the pandemic, in fact, it began rising a stunning five times faster than in previous years. While obesity may not always disqualify young people from serving in the military, it usually does, as do obesity-related diseases like diabetes and high blood pressure. While its underlying causes are complicated, two things are clear: it’s far more prevalent among the lower- and middle-income segments of the population and per capita it’s strongly linked to wealth inequality.

Legislation like the Healthy Food Access for All Americans bill, which has the potential to expand access to less fattening foods through tax credits and grants for grocers and food banks, was introduced in the Senate more than a year ago. You undoubtedly won’t be surprised to learn that it has yet to pass.

The casualties of not caring for our own in this way are high. According to the National Institutes of Health, an estimated 300,000 deaths each year are due to this country’s obesity epidemic. Unfortunately, deadly as such a phenomenon might prove to be, it doesn’t make for the sort of gripping plots that popular movies need.

Similarly, the military’s recruitment efforts suffer because of poor mental-health levels among young people. One in five young women and one in ten young men experience an episode of major depression before turning 25. Meanwhile, the suicide rate in this country is the highest among wealthy nations and now — thanks, in part, to all the weapons flooding this society — it’s also the second-leading cause of death among 10-to-24-year-olds. Worse yet, poor kids are significantly more likely to die by suicide. Globally, wealth- and race-based inequalities are key determinants of mental health, in part because people who sense that the world they live in is deeply unfair are more likely to develop clinical mental-health disorders.

A 2019 United Nations report suggested that, in order to improve mental health, governments ought to focus on investing in social programs to support people who have experienced trauma, abuse, and neglect at home or in their neighborhoods. It seems unlikely, though, that our elected representatives are ready for such things.

This is for democracy

The human frailties that hinder enlistment are symptoms of something more sinister than a military lacking bodies. The threat guaranteed to further undermine any American readiness to face life as it should be faced in this discordant twenty-first century, with its ever more feverish summers, is the dismantling of our democratic system.

A recent survey ranked the U.S. only 26th globally when it comes to the quality of its democracy. And that’s sad because functional democratic systems are better at creating the conditions in which people can help each other and be involved in public service of all sorts, yes, including in the military.

Democracies are also better at educating people and generally have more efficient health-care systems in part due to the lesser likelihood of corruption. Ask anyone who has sought care in an autocracy like Russia and they’ll tell you that even being rich doesn’t guarantee you quality care when bribery and political retaliation infuse social life.

Democracies have less criminal violence and less likelihood of civil war. In a true democracy where the peaceful transition of power is a given, the kinds of emergencies that necessitate a strong military and law enforcement response are much less likely, which is why the January 6th insurrection at the Capitol was so ominous. Worse yet, investing in weapons rather than human livelihood is guaranteed to have costs that are not only far-reaching, but hard to predict. One thing is certain, though: war and ever greater preparations for more of it do not lay the groundwork for a good democracy.

All this is to say that our government ought to stop using movie screens and strip malls to sell its bloody practices overseas. It ought to stop investing in the national (in)security state and the corporations that support it in a way that has become unimaginable for the rest of society. It ought to develop a truly functional social-support system at home that would include the Americans now not quite filling the Pentagon’s tired ranks.

Andrea Mazzarino writes regularly for TomDispatch (where this article originated). She co-founded Brown University’s Costs of War Project. She has held various clinical, research, and advocacy positions, including at a Veterans Affairs PTSD Outpatient Clinic, with Human Rights Watch, and at a community mental health agency. She is the co-editor of War and Health: The Medical Consequences of the Wars in Iraq and Afghanistan.

Copyright ©2022 Andrea Mazzarino — distributed by Agence Global

—————-
Released: 15 August 2022
Word Count: 2,280
—————-

Tom Engelhardt, “The decline and fall of everything (including me)”

August 11, 2022 - TomDispatch

I find nothing strange in Joe Biden, at 79 (going on 80), being the oldest president in our history and possibly planning to run again in 2024. After all, who wouldn’t want to end up in the record books? Were he to be nominated and then beat the also-aging Donald Trump, or Florida Governor Ron DeSantis, or even Fox News’s eternally popular Tucker Carlson, he would occupy the White House until he was 86.

Honestly, wouldn’t that be perfect in its own way? I mean, what could better fit an America in decline than a president in decline, the more radically so the better?

Okay, maybe, despite the Republican National Committee’s clip on the subject, when Joe Biden had to be guided to that red carpet in Israel, it wasn’t because he was an increasingly doddering old guy. Still…

I mean, I get it. I really do. After all, I just turned 78 myself, which leaves me only a year and four months behind Joe Biden in the aging sweepstakes. And believe me, when you reach anything close to our age, whatever White House spokespeople might say, decline becomes second nature to you. In fact, I’m right with Joe on that carpet whenever someone brings up a movie I saw or book I read years ago (or was it last month?) and I can’t remember a damn thing about it. I say to any of you of a certain age, Joe included: Welcome to the club!

It’s strange, if not eerie, to be living through the decline of my country — the once “sole superpower” on Planet Earth — in the very years of my own decline (even if Fox News isn’t picking on me). Given the things I’m now forgetting, there’s something spookily familiar about the decline-and-fall script in the history I do recall. As Joe and his top officials do their best to live life to the fullest by working to recreate a three-decades-gone Cold War, even as this country begins to come apart at the seams, all I can say is: welcome to an ever lousier version of the past (just in case you’re too young to have lived it).

Since the disappearance of the Neanderthals and the arrival of us, tell me that decline hasn’t been among the most basic stories in history. After all, every child knows that what goes up, must… I don’t even have to complete that sentence, do I, whatever your age? Thought of a certain way, decline and fall is the second oldest story around, after the rise and… whatever you want to call it.

Just ask the last emperors of China’s Han dynasty, or the once-upon-a-time rulers of Sparta, or Romulus Augustulus, the last head of the Roman Empire (thanks a lot, Nero!). But here, in the third decade of the twenty-first century, that ancient tale has a brand-new twist. After all that time when humanity, in its own bloody, brutal fashion, flourished, whether you want to talk about the loss of species, the destruction of the environment, or ever more horrific weather disasters arriving ever more quickly, it’s not just the United States (or me) going down… it’s everything. And don’t think that doesn’t include China, the supposedly rising power on Planet Earth. It also happens to be releasing far more greenhouse gases into the atmosphere than any other country right now and suffering accordingly (even if the falling power of this moment, the United States, remains safely in first place as the worst carbon emitter of all time).

So, unless we humans can alter our behavior fast, it looks like only half our story may soon be left for the telling.

The rise and fall of Tom Engelhardt (and so much else)

To speak personally, I find myself experiencing three versions of that ultimate story: that of my own fall; that of my country; and that of an increasingly overheating planet as a habitable place for us all. With that in mind, let me take you on a brief trip through those three strangely intertwined tales, starting with me.

I was born in July 1944 into an America that had been roused from a grotesque depression, the “Great” one as it was known, and was then being transformed into a first-rate military and economic powerhouse by World War II. (My father was in that war as, in her own fashion, was my mother.) That global conflict, which mobilized the nation in every way, wouldn’t end until, more than a year later, two American B-29s dropped newly invented weapons of disastrous destructive power, atomic bombs, on the Japanese cities of Hiroshima and Nagasaki, more or less obliterating them. In those acts, for the first time in history, lay the promise of an ultimate end to the human story of a sort once left to the gods. In other words, V-J (or Victory over Japan) Day instantly had an underside that couldn’t have been more ominous.

I was born, then, into a newly minted imperial power already exhibiting an unparalleled global punch. Soon, it would face off in a planet-wide struggle, initially focused on the Eurasian continent, against another superpower-in-the-making, the Soviet Union (and its newly communized Chinese ally). That would, of course, be the not-quite-world war (thanks to the threat of those nuclear weapons, multiplied and improved many times over) that we came to call the Cold War. In it, what was then known as the "free world" — although significant parts of it were anything but "free" and the U.S. often worked its wiles to make other parts ever less so — was set against the communized "slave" version of the same.

In the United States, despite fears of a nuclear conflict that left children like me “ducking and covering” under our school desks, Americans experienced the hottest economy imaginable. In the process, an ever wealthier society was transformed from a good one into — as President Lyndon Johnson dubbed it in 1964 — the Great Society. Despite “red scares” and the like, it was one that would indeed prove better for many Americans, including Blacks in the wake of a Civil Rights Movement that finally ended the Jim Crow system of segregation that had succeeded slavery.

In the process, the U.S. developed a global system around what was then called the “Iron Curtain,” the lands the Soviet Union controlled. It would be anchored by military bases on every continent but Antarctica and alliances of every sort from NATO in Europe to SEATO in Southeast Asia, as well as secretive CIA operations across much of the globe.

As for me, I, too, was still rising (though sometimes, as in the Vietnam years, in full-scale protest against what my country was doing in the world), first as a journalist, then as an editor in publishing. I even wrote a version of the history of my times in a book I called The End of Victory Culture: Cold War America and the Disillusioning of a Generation. Little did I know then quite how disillusioning the world we were creating would turn out to be. Meanwhile, in the 1980s and ’90s, during the presidencies of Ronald Reagan and Bill Clinton, during what came to be known as the neoliberal moment, another kind of rise became more evident domestically. It was of a kind of corporate wealth and power, as well as a growing inequality, previously unknown in my lifetime.

In 1991, when I was 47 years old, the Cold War suddenly ended. In 1989, the Red Army had limped home from a decade-long disastrous war in Afghanistan (from which, of course, Washington would turn out to learn absolutely nothing) and the Soviet Union soon imploded. Miracle of miracles, after nearly half a century, the United States was left alone and seemingly victorious, “the sole superpower” on Planet Earth.

The former bipolar world order was no more and, in the phrase of conservative Washington Post columnist Charles Krauthammer, we were now in “the unipolar moment.” Uni because there was only one power that mattered left on this planet. Admittedly, Krauthammer didn’t expect that uni-ness to last long, but too many politicians in Washington felt differently. As it turned out, the top officials in the administrations of Bush the elder and then Bush the younger had every intention of turning that moment of unparalleled global triumph into a forever reality. What followed were wars, invasions, and conflicts of every sort meant to cement the global order, starting with President George H.W. Bush’s Operation Desert Storm against Saddam Hussein’s Iraq in 1991, which (sadly enough) came to be known as the first Gulf War.

Hence, too, the missing “peace dividend” that had been promised domestically as the Cold War ended. Hence, too, after “peace” arrived came the never-ending urge to pour yet more taxpayer dollars into the Pentagon, into a “defense” budget beyond compare, and into the weapons-making corporations of the military-industrial complex, no matter what the U.S. military was actually capable of accomplishing.

All of this was to be the global legacy of that sole superpower, as its leaders worked to ensure that this country would remain so until the end of time. A decade into that process, horrified by the response of Bush the younger and his top officials to the 9/11 attacks, I created TomDispatch, the website that would see me through my own years of decline.

Bankruptcy, Inc.

Keep in mind that, in those years of supposed triumph, the third decline-and-fall story was just beginning to gain momentum. We now know that climate change was first brought to the attention of an American president, Lyndon Johnson, by a science advisory committee in 1965. In 1977, Jimmy Carter, who two years later would put solar panels on the White House (only to have them removed in 1986 by Ronald Reagan), was warned by his chief science adviser of the possibility of "catastrophic climate change." And yet, in all the years that followed, remarkably little was done by the sole superpower, though President Barack Obama did play a key role in negotiating the Paris climate agreement (from which Donald Trump would dramatically withdraw this country).

In its own fashion, Trump’s victory in 2016 summed up the fate of the unipolar moment. His triumph represented a cry of pain and protest over a society that had gone from “great” to something far grimmer in the lifetime of so many Americans, one that would leave them as apprentices on what increasingly looked like a trip to hell.

That narcissistic billionaire, ultimate grifter, and dysfunctional human being somehow lived through bankruptcy after bankruptcy only to emerge at the top of the heap. He couldn’t have been a more appropriate symptom and symbol of troubled times, of decline — and anger over it. It wasn’t a coincidence, after all, that the candidate with the slogan Make America Great Again won that election. Unlike other politicians of that moment, he was willing to admit that, for so many Americans, this country had become anything but great.

Donald Trump would, of course, preside over both greater domestic inequality and further global decline. Worse yet, he would preside over a global power (no longer “sole” with the rise of China) that wasn’t declining on its own. By then, the planet was in descent as well. The American military would also continue to demonstrate that it was incapable of winning, that there would never again be the equivalent of V-J Day.

Meanwhile, the political elite was shattering in striking ways. One party, the Republicans, would be in almost total denial about the very nature of the world we now find ourselves in — a fate that, in ordinary times, might have proven bad news for them. In our moment, however, it only strengthened the possibility of a catastrophe for the rest of us, especially the youngest among us.

And yes, recently West Virginia coal magnate Joe Manchin finally came around (in return for a barrel full of favors for his major donors in the oil and gas industry), but the country that created the Manhattan Project, which once produced those atomic bombs, is now strangely unrecognizable, even to itself. During World War II, the government had poured massive sums of money into that effort, while mobilizing large numbers of top scientists to create the nuclear weapons that would destroy Hiroshima and Nagasaki. To this day, in fact, it still puts staggering sums and effort into "modernizing" the American nuclear arsenal.

When it comes to saving the world rather than destroying it, however, few in Washington could today even imagine creating a modern version of the Manhattan Project to figure out effective new ways of dealing with climate change. Better to launch a dreadful version of the now-ancient Cold War than deal with the true decline-and-fall situation this country, no less this civilization, faces.

Admittedly, though I recently stumbled across something I wrote in the 1990s that mentioned global warming, I only became strongly aware of the phenomenon in this century as my own decline began (almost unnoticed by me). Even when, at TomDispatch, I started writing fervently about climate change, I must admit that I didn’t initially imagine myself living through it in this fashion — as so many of us have in this globally overheated summer of 2022. Nor did I imagine that such devastating fires, floods, droughts, and storms would become “normal” in my own lifetime. Nor, I must admit, did I think then that the phenomenon might lead to a future all-too-literal end point for humanity, what some scientists are starting to term a “climate endgame” — in other words, a possible extinction event.

And yet here we are, in a democratic system under unbelievable stress, in a country with a gigantic military (backed by a corporate weapons-making complex of almost unimaginable size and power) that's proven incapable of winning anything of significance, even if funded in a fashion that once might have been hard to imagine in actual wartime. In a sense, its only "success" might lie in its remarkable ability to further fossil-fuelize the world. In other words, we now live in an America coming apart at the seams at a moment when the oldest story in human history might be changing, as we face the potential decline and fall of everything.

One thing is certain: as with all of us, when it comes to my personal story, there’s no turning around my own decline and fall. When it comes to our country and the world, however, the end of the story has yet to be written. The question is: Will we find some way to write it that won’t end in the fall not just of this imperial power but of humanity itself?

Tom Engelhardt created and runs the website TomDispatch.com. He is also a co-founder of the American Empire Project and the author of a highly praised history of American triumphalism in the Cold War, The End of Victory Culture. A fellow of the Type Media Center, his sixth and latest book is A Nation Unmade by War.

Copyright ©2022 Tom Engelhardt — distributed by Agence Global

—————-
Released: 11 August 2022
Word Count: 2,449
—————-

