Agence Global

Erin L. Thompson, “Remembering a feminist movement that hasn’t ended”

August 24, 2020 - TomDispatch

On August 26, 2020, Alice in Wonderland will get some company. She will be joined in New York City’s Central Park by Susan B. Anthony, Elizabeth Cady Stanton, and Sojourner Truth, the first statues there of women who, unlike Alice, actually existed. The monument is a gift to the park from Monumental Women, a non-profit organization formed in 2014. The group has raised the $1.5 million necessary to commission, install, and maintain the new “Women’s Rights Pioneers Monument” and so achieve its goal of “breaking the bronze ceiling” in Central Park.

Preparations for its unveiling on the centennial anniversary of the ratification of the 19th Amendment, which granted suffrage (that is, the right to vote) to women, are in full swing. Celebratory articles have been written. The ceremony will be live-streamed. Viola Davis, Meryl Streep, Zoe Saldana, Rita Moreno, and America Ferrera have recorded monologues in English and Spanish as Stanton, Anthony, and Truth. The Pioneers Monument, breaking what had been a moratorium, is the first new statue placed in Central Park in decades.

As statues topple across the country, the Pioneers Monument is a test case for the future of public art in America. On the surface, it’s exactly what protesters have been demanding: a more diverse set of honorees who better reflect our country’s history and experience. But critics fear that the monument actually reinforces the dominant narrative of white feminism and, in the process, obscures both historical pain and continuing injustice.

Ain’t I a woman?

In 2017, Monumental Women asked artists to propose a monument with statues of white suffragists Anthony and Stanton while “honoring the memory” of other voting-rights activists. In 2018, they announced their selection of Meredith Bergmann’s design, in which Anthony stood beside Stanton, who was seated at a writing desk from which unfurled a scroll listing the names of other voting-rights activists.

Famed feminist Gloria Steinem soon suggested that the design made it look as if Anthony and Stanton were actually “standing on the names of these other women.” Similar critical responses followed and, in early 2019, the group reacted by redesigning the monument. The scroll was gone, but Anthony and Stanton remained.

The response: increasing outrage from critics over what the New York Times’ Brent Staples called the monument’s “lily-white version of history.” The proposed monument, wrote another critic in a similar vein, “manages to recapitulate the marginalization Black women experienced during the suffrage movement,” as when white organizers forced Black activists to walk at the back of a 1913 women’s march on Washington. In a Washington Post op-ed, historian Martha Jones criticized the way the planned monument promoted the “myth” that the fight for women’s rights was led by Anthony’s and Stanton’s “narrow, often racist vision,” and called for adding escaped slave, abolitionist, and women’s rights promoter Sojourner Truth.

Although the New York City Public Design Commission had approved the design with just Anthony and Stanton, Monumental Women did indeed rework the monument, adding a portrait of Truth in June 2019. The sculptor would later make additional smaller changes in response to further criticism about her depiction of Truth, including changing the positioning of her hands and body to make her a more active participant in the scene. (In an earlier version, she was seated farther from Stanton’s table, her hands resting quietly as if she were merely listening to the white suffragists.)

Their changes didn’t satisfy everyone. More than 20 leading scholars of race and women’s suffrage, for instance, sent a letter to Monumental Women, asking it to do a better job showing the racial tensions between the activists. Their letter acknowledged that Truth had indeed been a guest in Stanton’s home during a May 1867 Equal Rights Association meeting. They noted, however, that this was before white suffragists fully grasped the conflict between the fight for the right of women to vote and the one for the political participation of African Americans, newly freed by the Civil War, in the American democratic system. Stanton and Anthony came to believe that, of the two struggles, (white) women’s votes should take precedence, though they ultimately lost when the 15th Amendment, extending the vote to Black men, was ratified in 1870.

The tensions between race and women’s rights arose again when, in 1919, Congress finally passed the 19th Amendment, intending to give women the right to vote. Its ratification, however, was delayed largely because Southern states feared the very idea of granting the vote to Black women. During the summer of 1920, realizing that they still needed to convince one more Southern state to ratify the amendment, white suffragists began a campaign to remind white southerners that the Jim Crow laws already on their books to keep Black men from voting would do the same for Black women. Tennessee then voted to ratify.

The white suffragists would prove all too accurate. When southern Black women tried to exercise their new right to vote, they would be foiled by discriminatory literacy tests, poll taxes, or just plain violence. In 1926, for instance, Indiana Little, a teacher in Birmingham, Alabama, led a march of hundreds of African Americans on the city’s voter registration office. They were not, however, permitted to register and Little was both beaten and sexually assaulted by a police officer. (Meanwhile, Native American women remained without American citizenship, much less the right to vote, until 1924.)

For Black women, according to Martha Jones, author of the forthcoming book Vanguard: How Black Women Broke Barriers, Won the Vote, and Insisted on Equality for All, the 1965 Voting Rights Act would prove to be the “15th and 19th Amendments rolled into one.” It would give teeth to what had been merely a promise when it came to granting them the vote. And they would prove a crucial part of the fight to make it a reality. Amelia Boynton Robinson, the first Black woman in Alabama to run for Congress (her campaign motto: “A voteless people is a hopeless people”), even turned her husband’s memorial service into Selma’s first mass meeting for voting rights. She then became a key organizer of the 1965 march from Selma to the state capital, Montgomery, during which an Alabama state trooper beat her brutally as she tried to cross the Edmund Pettus Bridge. A widely published photograph of her lying on the ground, bloody and unconscious, would form part of the campaign that led to the passage of the Voting Rights Act a few months later.

Glamour shots in bronze

With its gentle portraits of Stanton, Anthony, and Truth, the Women’s Rights Pioneers Monument is far from that image of a bloodied protester. In following the model of the very kind of traditional monument it means to replace, it leaves out the pain and the struggle of the women’s movement.

It didn’t have to be that way. In 2015, one of Monumental Women’s leaders told the New York Times that they wanted a memorial that wouldn’t be “old-fashioned.” Nonetheless, the design they ultimately selected, with its realistic, larger-than-life portrait statues on a pedestal, would prove to be in precisely that traditional style.

The group has claimed that just such a stylistic compromise was necessary because the New York Parks Department refused to allow an “overtly modern” monument in Central Park. (That department disagrees that it should be blamed for the monument’s style. Its press officer told me that they “encourage innovative contemporary art” and pointed to a number of examples of modern, abstract monuments that “grace our parks” in Queens, Manhattan, Staten Island, and Brooklyn.) The Pioneers Monument sits on a leafy promenade nicknamed “Literary Walk” because of its statues of authors like William Shakespeare and Robert Burns. It fits in perfectly there, and would go hardly less well with the future “National Garden of American Heroes” President Trump demanded in response to Black Lives Matter protests. In his executive order to make it so, he specified that the statues in his garden must be realistic, “not abstract or modernist.”

Monumental Women’s style choice conveys important messages. For one, monuments traditionally show the people they honor in the most flattering form imaginable and this one is no exception. Bergmann has sculpted the women as attractively as possible (while being more or less faithful to the historical record). If the monument represents the moment in 1867 when the three women were together, Truth would have been 70 years old and Anthony, the youngest, in her late 40s. Yet all three are shown with unwrinkled faces, smooth hands, and firm necks. Stanton’s hair falls in perfect curls. While they may not look exactly young, neither are they aging. Think of the monument as the equivalent of Glamour Shots in bronze.

As historian Lyra Monteiro, known for her critique of the way playwright Lin-Manuel Miranda erased the slave past in his Broadway hit “Hamilton” — even as he filled the roles of the founding fathers with actors of color — pointed out to me, the monument makes the three women into feminists of a type acceptable even to conservative viewers. Besides portraying them as conventionally attractive, the sculpture uses symbols that emphasize the more traditional feminine aspects of their lives: Truth’s lap full of knitting; Stanton’s delicate, spindly furniture; and Anthony’s handbag. Who could doubt that their armpit hair is also under control?

The women’s faces are, by the way, remarkably emotionless, which is unsurprising for a monument in the traditional style. Since Greco-Roman antiquity, heroic statuary has famously sported faces of almost preternatural calm. Such expressions, however, only contribute to what Monteiro called the concealment of “the struggle” that marked feminism from its first moments.

Sojourner Truth, for instance, was known for speeches like “Ain’t I a Woman?” in which she drew deep and emotional reactions from listeners by describing the sufferings she experienced before escaping from slavery. The triumphalist calm of the Pioneers Monument avoids those emotions and so belongs to a long tradition in American statuary that celebrates revolutionary deeds as, in Monteiro’s words, “very old and very, very done.” Such monuments ask viewers to offer thanks for victory instead of spurring them on to continue the fight.

Monteiro also points out that the choice of commemorating universal suffrage is telling in itself. No matter how many fierce debates it once inspired, the idea that women should have the right to vote is today uncontroversial. But other women’s rights issues remain hotly debated. Imagine statuary celebrating the fight for the right to abortion or to use the bathroom of your choice.

As an example of monuments that energize viewers in an ongoing fight instead of tranquilizing them into thinking victory has been won, Monteiro pointed to Mexico City’s antimonumentos (anti-monuments), large if unofficial displays aimed at calling out government negligence. A typical one, a metal rendering of the international symbol for women with a raised fist at its center, was installed during a 2019 protest march in one of that city’s main squares; its inscription declares that protesters will not shut up about the gender-based violence that continues unchecked in their country. City officials have let such antimonumentos remain in place, undoubtedly fearing negative publicity from their removal. So they continue to act as reminders that the government’s actions are both questionable and being scrutinized.

The triumphalism of the Pioneers Monument suggests that the problem of women’s rights is oh-so-settled. But of course, in the age of Donald Trump in particular, the kinds of oppressions that Truth, Stanton, and Anthony fought couldn’t be more current. Many feminists of color feel that white feminists still tend to ignore racial issues and seldom have the urge to share leadership in activism.

And today, despite Democratic presidential candidate Joe Biden’s recent choice of Kamala Harris as his running mate, the voting rights of women of color remain imperiled. Since a 2013 Supreme Court decision struck down one of the Voting Rights Act’s key protections, minority voters have found it ever more difficult to exercise their theoretical right to vote amid growing efforts by Republican officials to suppress minority (and so Democratic) votes more generally. The fight for women’s votes is hardly over, no matter what the Pioneers Monument might have to say about it.

Todd Fine, a preservation activist, told me that he wishes Monumental Women had focused their discussions on what a truly diverse community might have wanted for such a commemoration rather than responding to bursts of criticism with modest tweaks of their proposed statue.

One explanation for the group’s resistance to change is that it is led by exactly the type of well-off, educated, white women whose right to vote hasn’t been in question since 1920. Monumental Women’s tax filings, I found, reveal that in the very period when the group was responding to criticism of the proposed monument’s exclusion of women of color, it added three women of color to its board of directors. Diversification of leadership is certainly a positive step, but the organization’s president and other officers remain the same. And at least two of the new directors had already raised funds for the planned Stanton and Anthony monument, writing and speaking positively about the organization and its goals, and so could be expected to be at best modest critics of its path.

Historic lies and scented candles

One reaction to the debate around the Pioneers Monument is to think that Monumental Women simply didn’t make the best decision about whom to honor or how to do it. But historian Sally Roesch Wagner has no doubt that searching for the right honoree is itself not the right way to go. She told me that, when it comes to the feminist movement, monuments to individuals are “a standing historic lie” because women’s rights have been won “by a steady history of millions of women and men… working together at the best of times, separately at the worst.” Wagner believes that to honor individuals for such achievements today is to disempower the movement itself.

Early feminists horrified the public. The Pioneers Monument is designed to soothe. It invites you to light a scented candle rather than to burn your bra. Bronze is long-lasting, but perhaps it’s no longer the best material for monuments. In a moment when a previously almost unimaginable American president is defending traditional Confederate monuments in a big way, perhaps something else is needed.

The playwright Ming Peiffer will premiere “Finish the Fight,” an online theatrical work, as August ends. She aims to let us listen to some of the Black, Asian, Latinx, and Native American activists whose roles in the fight for the vote have been forgotten. Perhaps in 2020, the best monuments to the fight for women’s rights — for all our rights — may look nothing like what most of us would imagine.

Erin L. Thompson writes regularly for TomDispatch (where this article originated). She is a professor of art crime at John Jay College (CUNY). An expert on the deliberate destruction of art, she is the author of the forthcoming Smashing Statues: The Rise and Fall of America’s Public Monuments (Norton, 2021). Follow her on Twitter @artcrimeprof.

Copyright ©2020 Erin L. Thompson — distributed by Agence Global

—————-

Released: 24 August 2020

Word Count: 2,468

—————-

Belle Chesler, “The ‘great’ reopening”

August 20, 2020 - TomDispatch

Seventeen years ago, against the advice of my parents, I decided to become a public school teacher. Once I did, both my mother and father, educators themselves, warned me that choosing to teach was to invite attacks from those who viewed the profession with derision and contempt. They advised me to stay strong and push through when budgets were cut, my intellect questioned, or my dedication to my students exploited. Nobody, however, warned me that someday I might have to defend myself against those who asked me to step back into my classroom and risk my own life, the lives of my students and their families, of my friends, my husband, and my child in the middle of a global pandemic. And nobody told me that I’d be worrying about whether or not our nation’s public schools, already under siege, would survive the chaos of Covid-19.

Pushing students back into school buildings right now simply telegraphs an even larger desire in this society to return to business as usual. We want our schools to open because we want a sense of normalcy in a time of the deepest uncertainty. We want to pretend that schools (like bars) will deliver us from the stresses created by a massive public health crisis. We want to believe that if we simply put our children back in their classrooms, the economy will recover and life as we used to know it will resume.

In reality, the coronavirus is — or at least should be — teaching us that there can be no going back to that past. As the first students and teachers start to return to school buildings, images of crowded hallways, unmasked kids, and reports of school-induced Covid-19 outbreaks have already revealed the depths to which we seem willing to plunge when it comes to the safety and well-being of our children.

So let’s just call the situation what it is: a misguided attempt to prop up an economy failing at near Great Depression levels because federal, state, and local governments have been remarkably unwilling to make public policy grounded in evidence-based science. In other words, we’re living in a nation struggling to come to terms with the deadly repercussions of a social safety net gutted even before the virus reached our shores and decisions guided by the most self-interested kind of politics rather than the public good.

A return to school?

For teachers like me, with the privilege of not having to work a second or third job, summer can be a time to reflect on the previous school year and prepare for the next. I take classes, read, develop new curriculum, and spend time with family and friends. Summer has been a time to catch up with all the pieces of my life I’ve neglected during the school year and recharge my physical and emotional batteries. Like many other public school teachers I know, I step away in order to step back in.

Not this summer, though. In these months, there’s been no reprieve. In Portland, Oregon, where I live, the confluence of the historic Black Lives Matter uprising, a subsequent invasion by the president’s federal agents, the hovering menace and tragic devastation of the coronavirus, and rising rates of homelessness and joblessness has contributed to a seismic disruption of the routines and structures of our community. A feeling of uncertainty and anxiety now permeates every facet of daily life. Like so many, I’ve been parenting full time without relief since March, acutely aware of the absence of the usual indispensable web of teachers, caregivers, coaches, camp counselors, family, and friends who have helped me raise my child so that I can help raise the children of others.

The dislocation from my community and the isolation caused by the breakdown of normal social ties, as well as my daughter’s and my lack of access to school, have had a profound effect on our lives. And yet, knowing all that, feeling it all so deeply, I would still never advocate sending our children back to school in person as Covid-19 still rages out of control.

Without a concerted effort to stop the spread of the virus — as cases in this country soar past five million and deaths top 170,000 — including masking mandates, widespread testing, effective contact tracing, enough funding to change the physical layout of classrooms and school buildings, a radical reduction in class sizes, and proper personal protective equipment for all school employees, returning to school becomes folly on a grand scale. Of course, an effort like that would require a kind of social cohesion, innovation, and focused allocation of resources that, by definition, is nonexistent in the age of Trump.

Sacrificing the vulnerable

In late July, when it was announced that school districts across the state of Oregon would open fully online again this fall, I felt two things: enormous relief and profound grief. The experience of virtual schooling in the spring had resulted in many families suffering due to a lack of access to the social, emotional, and educational resources of school. No one understands that reality better than the teachers who have dedicated our waking hours to supporting those students and the parents who have watched them suffer.

As refreshing as it should be to hear politicians across the political spectrum communicating their worries about a widening achievement gap and the ways in which the most vulnerable American children will fall behind if they don’t experience in-person schooling, their concerns ring hollow. Our most vulnerable children are historically the least served by our schools and the most likely to get sick if they go back. Having never prioritized the needs of those very students, their families, and the communities they live in, those politicians have the audacity to demand that schools open now.

Truly caring for the health and well-being of such students during the pandemic would mean extending unemployment benefits, providing rental assistance, and enacting universal health care. The answer is hardly sending vulnerable kids into a building where they could possibly become infected and carry the virus back to communities that have already been disproportionately affected by Covid-19.

Take the example of my school, which has an air ventilation system that’s been on the fritz for more than a decade, too little soap and too few places to wash your hands, and windows that don’t open. In other words, perfect conditions for spreading a virus. Even if I were given a face shield and ample hand sanitizer, I’d still be stuck in classrooms with far too many students and inadequate air flow. And those are just the physical concerns.

What very few people seem to be considering, no less discussing, is the long-term psychological trauma associated with the spread of the virus. What if I unknowingly infected my students or their family members? What if I brought the virus home to my family and friends? What if I contracted the virus from a student and died? No educator I know believes that online teaching will better serve our students, but stepping back into in-person learning while the virus is still out of control in America will clearly only contribute to its further spread.

Schools are unable to shoulder the burden of this crisis. Politicizing the return to school and pitting parents against teachers — as if many teachers weren’t themselves parents — is a devious way of once again scapegoating those very schools for perennial failures of funding, leadership, and policy. Forty years of the neoliberal version of austerity and divestment from public schools by both Democratic and Republican administrations have ensured that, unlike in many of the wealthiest nations on this planet, public schools in the U.S. don’t have the necessary institutional support, infrastructure, or resources to envision and carry out a safe and effective return to school.

To put all this in perspective, in its budget proposal for 2021, the Trump administration requested $66.6 billion for the Department of Education, $6.1 billion less than in 2020. In contrast, Congress just passed the National Defense Authorization Act authorizing $740 billion in spending for the Defense Department. Even with the proposed allocation of an additional $70 billion for schools in the Republican-backed HEALS Act, the now-stalled second attempt to respond to the spreading pandemic, two-thirds of those funds would only be available to school districts that hold in-person classes. And because a majority of school funding is tied to local and state tax revenues, badly hit by an economy hobbled by the virus, schools will actually be operating on even smaller budgets this year.

Grassroots privatization

It’s as if they want us to fail. Perhaps the most powerful foe of public education in the Trump administration, Secretary of Education Betsy DeVos, even threatened to withhold federal funding if local school districts decided to resume school totally online this fall. After she was reminded that she didn’t have the authority to do so, she pivoted instead to asking parents to consider other options for their children. That request amounted to encouraging them to pull their children from public schools (depriving them of essential funding) and instead seek out vouchers for private or charter schools.

DeVos didn’t just stop there. In an attempt to redirect funds allocated to low-income students by the CARES Act, Congress’s initial response to the pandemic, she ruled that school districts deciding to use that money for programs that might benefit all students (instead of just low-income students) must also pay for “equitable services” for all private schools in the district. This would potentially siphon up to $1.5 billion of CARES Act money from public to private schools. Such schools have already benefited from Paycheck Protection Program loans that were distributed as part of the CARES Act. I’m sure you won’t be surprised to know that they stand to receive yet more money if anything like the present version of the Senate’s HEALS Act ever passes. It’s easy to see who wins and who loses in such an equation.

The fear and anxiety prompted by uncertainty about how public schools will function in the chaos of this moment is giving way to grassroots decision-making that will adversely affect such basic institutions for the foreseeable future and may even contribute to more segregated schools. People like me — white, highly educated, and accustomed to having options — are scrambling to figure out individual solutions to problems that would best be solved by community organizing.

Some families are indeed choosing to pull their children out of public schools, enrolling them in online academies, private schools, or simply homeschooling their kids. Others are forming small instructional pods, or micro schools, and hiring private teachers or tutors to educate their kids.

The twisted irony of these developments is that many white people who support the Black Lives Matter movement are making decisions for their own children that will adversely affect Black students for years to come. Declining enrollment and white divestment in public schools will bring about funding shortages and educational disparities sure to undermine whatever gains those protests achieve.

The inevitable result will be more segregated schools, while the gap between the haves and the have-nots only widens. Ultimately, privatization on the smallest scale plays into the desire of those like DeVos who seek to undermine and, in the end, even potentially dismantle public education in favor of private schools and charter schools, which, unsurprisingly enough, were first formed to perpetuate school segregation.

The survival of public schools

Public schools are deeply imperfect institutions. Historically, they’ve perpetuated racial inequities and solidified economic and social disparities. In many ways, they’ve failed all our children on almost every conceivable level. Their funding models are little short of criminal and the lack of resources across the system should have been (but generally wasn’t) considered unconscionable long before the coronavirus struck.

Yet institutions are made up of people and many of them, myself included, believe that a free public education accessible to all is a foundation for hope in the future. In the end, schools may still prove to be our last best chance for salvaging what’s left of our fractured nation and the promise of democracy. Abandon them now, when they’re under threat at the federal, state, and grassroots level, and you imperil the fate of the nation.

Needed today are creative solutions that put the focus on the most vulnerable of our children. Perhaps enlisting our nation’s retirees, many of whom are currently isolated at home, to help small groups of students, or launching a civilian corps of the currently unemployed, paid to step in to rebuild critical public school infrastructure or provide supplementary support and tutoring for kids who might otherwise be left behind, would help. I know there are creative solutions out there that don’t just benefit the most privileged among us, that could, in fact, focus on the most marginalized students. Now is the time to be creative, not to withdraw from the system. Now is the time to pool resources, while amplifying the voices of students, parents, and families historically not invited into such conversations.

Long-term divestment in public education has brought America’s schools to a dangerous crossroads, where mistrust of science and expert advice is threatening the very fabric of this nation. The only way out of this mess is to reverse the tide. Do we really want to be governed by fear and self-imposed scarcity? Do we really want the gears of institutional racism to grind on, whether virtually or in person? It’s time to act more collectively, to truly put the “public” back in public schools. It’s time to set partisanship aside to protect all our children as we navigate the unknown and unknowable.

As I prepare for an academic year unlike any other, I expect to watch with terror as many of our nation’s schools, woefully unprepared, open in the midst of a pandemic. Exhausted and heartbroken, I will worry nonstop about the students and teachers walking through those doors.

Belle Chesler writes regularly for TomDispatch (where this article originated). She is a visual arts teacher at a public school in Beaverton, Oregon.

Copyright ©2020 Belle Chesler — distributed by Agence Global

—————-

Released: 20 August 2020

Word Count: 2,325

—————-

Dilip Hiro, “Whose century is it?”

August 18, 2020 - TomDispatch

For the Trump administration’s senior officials, it’s been open season on bashing China. If you need an example, think of the president’s blame game about “the invisible Chinese virus” as it spreads wildly across the U.S.

When it comes to China, in fact, the ever more virulent criticism never seems to stop.

Between the end of June and the end of July, four members of his cabinet vied with each other in spewing anti-Chinese rhetoric. That particular spate of China bashing started when FBI Director Christopher Wray described Chinese President Xi Jinping as the successor to Soviet dictator Joseph Stalin. It was capped by Secretary of State Mike Pompeo’s clarion call to U.S. allies to note the “bankrupt” Marxist-Leninist ideology of China’s leader and the urge to “global hegemony” that goes with it, insisting that they would have to choose “between freedom and tyranny.” (Forget which country on this planet actually claims global hegemony as its right.)

At the same time, the Pentagon deployed its aircraft carriers and other weaponry ever more threateningly in the South China Sea and elsewhere in the Pacific. The question is: What lies behind this upsurge in Trump administration China baiting? A likely answer can be found in the president’s blunt statement in a July interview with Chris Wallace of Fox News that “I’m not a good loser. I don’t like to lose.”

The reality is that, under Donald Trump, the United States is indeed losing to China in two important spheres. As the FBI’s Wray put it, “In economic and technical terms [China] is already a peer competitor of the United States… in a very different kind of [globalized] world.” In other words, China is rising and the U.S. is falling. Don’t just blame Trump and his cronies for that, however, as this moment has been a long time coming.

Facts speak for themselves. Nearly unscathed by the 2008-2009 global recession, China displaced Japan as the world’s second largest economy in August 2010. In 2012, with $3.87 trillion worth of imports and exports, it overtook the U.S. total of $3.82 trillion, elbowing it out of a position it had held for 60 years as the number one cross-border trading nation worldwide. By the end of 2014, China’s gross domestic product, as measured by purchasing power parity, was $17.6 trillion, slightly exceeding the $17.4 trillion of the United States, which had been the globe’s largest economy since 1872.

In May 2015, the Chinese government released a Made in China 2025 plan aimed at rapidly developing 10 high-tech industries, including electric cars, next-generation information technology, telecommunications, advanced robotics, and artificial intelligence. Other major sectors covered in the plan included agricultural technology, aerospace engineering, the development of new synthetic materials, the emerging field of biomedicine, and high-speed rail infrastructure. The goal: 70% self-sufficiency in high-tech industries and a dominant position in such global markets by 2049, a century after the founding of the People’s Republic of China.

Semiconductors are crucial to all electronic products and, in 2014, the government’s national integrated circuit industry development guidelines set a target: China was to become a global leader in semiconductors by 2030. In 2018, the local chip industry moved up from basic silicon packaging and testing to higher value chip design and manufacturing. The following year, the U.S. Semiconductor Industry Association noted that, while America led the world with nearly half of global market share, China was the main threat to its position because of huge state investments in commercial manufacturing and scientific research.

By then, the U.S. had already fallen behind China in just such scientific and technological research. A study by Nanjing University’s Qingnan Xie and Harvard University’s Richard Freeman noted that between 2000 and 2016, China’s share of global publications in the physical sciences, engineering, and math quadrupled, exceeding that of the U.S.

In 2019, for the first time since figures for patents were compiled in 1978, the U.S. failed to file for the largest number of them. According to the World Intellectual Property Organization, China filed applications for 58,990 patents and the United States 57,840. In addition, for the third year in a row, the Chinese high-tech corporation Huawei Technologies Company, with 4,144 patents, was well ahead of U.S.-based Qualcomm (2,127). Among educational institutions, the University of California maintained its top rank with 470 published applications, but Tsinghua University ranked second with 265. Of the top five universities in the world, three were Chinese.

The neck-and-neck race in consumer electronics

By 2019, the leaders in consumer technology in America included Google, Apple, Amazon, and Microsoft; in China, the leaders were Alibaba (founded by Jack Ma), Tencent (Tengxun in Chinese), Xiaomi, and Baidu. All had been launched by private citizens. Among the U.S. companies, Microsoft was established in 1975, Apple in 1976, Amazon in 1994, and Google in September 1998. The earliest Chinese tech giant, Tencent, was established two months after Google, followed by Alibaba in 1999, Baidu in 2000, and Xiaomi, a hardware producer, in 2010. When China first entered cyberspace in 1994, its government left intact its policy of controlling information through censorship by the Ministry of Public Security.

In 1996, the country established a high-tech industrial development zone in Shenzhen, just across the Pearl River from Hong Kong, the first of what would be a number of special economic zones. From 2002 on, they would begin attracting Western multinational corporations keen to take advantage of their tax-free provisions and low-wage skilled workers. By 2008, such foreign companies accounted for 85% of China’s high-tech exports.

Shaken by an official 2005 report that found serious flaws in the country’s innovation system, the government issued a policy paper the following year listing 20 mega-projects in nanotechnology, high-end generic microchips, aircraft, biotechnology, and new drugs. It then focused on a bottom-up approach to innovation, involving small start-ups, venture capital, and cooperation between industry and universities, a strategy that would take a few years to yield positive results.

In January 2000, less than 2% of Chinese used the Internet. To cater to that market, Robin Li and Eric Xu set up Baidu in Beijing as a Chinese search engine. By 2009, in its competition with Google China, a subsidiary of Google operating under government censorship, Baidu garnered twice the market share of its American rival as Internet penetration leapt to 29%.

In the aftermath of the 2008-2009 global financial meltdown, significant numbers of Chinese engineers and entrepreneurs returned from Silicon Valley to play an important role in the mushrooming of high-tech firms in a vast Chinese market increasingly walled off from U.S. and other Western corporations because of their unwillingness to operate under government censorship.

Soon after Xi Jinping became president in March 2013, his government launched a campaign to promote “mass entrepreneurship and mass innovation” using state-backed venture capital. That was when Tencent came up with its super app WeChat, a multi-purpose platform for socializing, playing games, paying bills, booking train tickets, and so on.

Jack Ma’s e-commerce behemoth Alibaba went public on the New York Stock Exchange in September 2014, raising a record $25 billion with its initial public offering. By the end of the decade, Baidu had diversified into the field of artificial intelligence, while expanding its multiple Internet-related services and products. As the search engine of choice for 90% of Chinese Internet users, more than 700 million people, the company became the fifth most visited website in cyberspace, its mobile users exceeding 1.1 billion.

Xiaomi Corporation would release its first smartphone in August 2011. By 2014, it had forged ahead of its Chinese rivals in the domestic market and developed its own mobile phone chip capabilities. In 2019, it sold 125 million mobile phones, ranking fourth globally. By the middle of 2019, China had 206 privately held start-ups valued at more than $1 billion, besting the U.S. with 203.

Among the country’s many successful entrepreneurs, the one who particularly stood out was Jack Ma, born Ma Yun in 1964. Though he failed to get a job at a newly opened Kentucky Fried Chicken outlet in his home city of Hangzhou, he did finally gain entry to a local college after his third attempt, buying his first computer at the age of 31. In 1999, he founded Alibaba with a group of friends. It would become one of the most valuable tech companies in the world. On his 55th birthday, he was the second richest man in China with a net worth of $42.1 billion.

Born in the same year as Ma, his American counterpart, Jeff Bezos, gained a degree in electrical engineering and computer science from Princeton University. He would found Amazon.com in 1994 to sell books online, before entering e-commerce and other fields. Amazon Web Services, a cloud computing company, would become the globe’s largest. In 2007, Amazon released a handheld reading device called the Kindle. Three years later, it ventured into making its own television shows and movies. In 2014, it launched Amazon Echo, a smart speaker with a voice assistant named Alexa that let its owner instantly play music, control a smart home, get information, news, weather, and more. With a net worth of $145.4 billion in 2019, Bezos became the richest person on the planet.

Deploying an artificial intelligence inference chip to power features on its e-commerce sites, Alibaba categorized a billion product images uploaded by vendors to its e-commerce platform daily and prepared them for search and personalized recommendations to its customer base of 500 million. By allowing outside vendors to use its platform for a fee, Amazon increased its items for sale to 350 million — with 197 million people accessing Amazon.com each month.

China also led the world in mobile payments with America in sixth place. In 2019, such transactions in China amounted to $80.5 trillion. Because of the Covid-19 pandemic, the authorities encouraged customers to use mobile payment, online payment, and barcode payment to avoid the risk of infection. The projected total for mobile payments: $111.1 trillion. The corresponding figure for the United States, $130 billion, looks puny by comparison.

In August 2012, the founder of the Beijing-based ByteDance, 29-year-old Zhang Yiming, broke new ground in aggregating news for its users. His product, Toutiao (Today’s Headlines), tracked users’ behavior across thousands of sites to form an opinion of what would interest them most, and then recommended stories. By 2016, it had already acquired 78 million users, 90% of them under 30.

In September 2016, ByteDance launched a short-video app in China called Douyin that gained 100 million users within a year. It would soon enter a few Asian markets as TikTok. In November 2017, for $1 billion, ByteDance would purchase Musical.ly, a Shanghai-based Chinese social network app for video creation, messaging, and live broadcasting, and set up an office in California.

Zhang merged it into TikTok in August 2018 to give his company a larger footprint in the U.S. and then spent nearly $1 billion to promote TikTok as the platform for sharing short-dance, lip-sync, comedy, and talent videos. It has been downloaded by 165 million Americans and driven the Trump administration to distraction. A Generation Z craze, it surpassed two billion downloads globally in April 2020, eclipsing U.S. tech giants. That led President Trump (no loser he!) and his top officials to attack it, and he would sign executive orders attempting to ban both TikTok and WeChat from operating in the U.S. or being used by Americans (unless sold to a U.S. tech giant). Stay tuned.

Huawei’s octane-powered rise

But the biggest Chinese winner in consumer electronics and telecommunications has been Shenzhen-based Huawei Technologies Company, the country’s first global multinational. It has become a pivot point in the geopolitical battle between Beijing and Washington.

Huawei (in Chinese, it means “splendid achievement”) makes phones and the routers that facilitate communications around the world. Established in 1987, its current workforce of 194,000 operates in 170 countries. In 2019, its annual turnover was $122.5 billion. In 2012, it outstripped its nearest rival, the 136-year-old Ericsson Telephone Corporation of Sweden, to become the world’s largest supplier of telecommunications equipment with 28% of market share globally. In 2019, it forged ahead of Apple to become the second largest phone maker after Samsung.

Several factors have contributed to Huawei’s stratospheric rise: its business model, the personality and decision-making mode of its founder Ren Zhengfei, state policies on high-tech industry, and the firm’s exclusive ownership by its employees.

Born in 1944 in Guizhou Province, Ren Zhengfei went to Chongqing University and then joined a military research institute during Mao Zedong’s chaotic Cultural Revolution (1966-1976). He was demobilized in 1983 when China cut back on its engineering corps. But the army’s slogan, “fight and survive,” stayed with him. He moved to the city of Shenzhen and worked in the country’s infant electronics sector for four years, saving enough to co-found what would become the tech giant Huawei. He focused on research and development, adapting technologies from Western firms, while his new company received small orders from the military and later substantial R&D (research and development) grants from the state to develop GSM (Global System for Mobile Communication) phones and other products. Over the years, the company produced telecommunications infrastructure and commercial products for third generation (3G) and fourth generation (4G) smartphones.

As China’s high-tech industry surged, Huawei’s fortunes rose. In 2010, it hired IBM and Accenture PLC to design the means of managing networks for telecom providers. In 2011, the company hired the Boston Consulting Group to advise it on foreign acquisitions and investments.

Like many successful American entrepreneurs, Ren has given top priority to the customer and, in the absence of the usual near-term pressure to raise income and profits, his management team has invested $15 billion to $20 billion annually in research and development work. That helps explain how Huawei became one of the globe’s five companies in the fifth generation (5G) smartphone business, topping the list by shipping out 6.9 million phones in 2019 and capturing 36.9% of the market. On the eve of the release of 5G phones, Ren revealed that Huawei had a staggering 2,570 5G patents.

So it was unsurprising that in the global race for 5G, Huawei was the first to roll out commercial products in February 2019. One hundred times faster than its 4G predecessors, 5G tops out at 10 gigabits per second and future 5G networks are expected to link a huge array of devices from cars to washing machines to door bells.

Huawei’s exponential success has increasingly alarmed a Trump administration edging ever closer to conflict with China. Last month, Secretary of State Pompeo described Huawei as “an arm of the Chinese Communist Party’s surveillance state that censors political dissidents and enables mass internment camps in Xinjiang.”

In May 2019, the U.S. Commerce Department banned American firms from supplying components and software to Huawei on national security grounds. A year later, it imposed a ban on Huawei buying microchips from American companies or using U.S.-designed software. The White House also launched a global campaign against the installation of the company’s 5G systems in allied nations, with mixed success.

Ren continued to deny such charges and to oppose Washington’s moves, which have so far failed to slow his company’s commercial advance. Its revenue for the first half of 2020, $65 billion, was up by 13.1% over the previous year.

From tariffs on Chinese products and that recent TikTok ban to slurs about the “kung flu” as the Covid-19 pandemic swept America, President Trump and his team have been expressing their mounting frustration over China and ramping up attacks on an inexorably rising power on the global stage. Whether they know it or not, the American century is over, which doesn’t mean that nothing can be done to improve the U.S. position in the years to come.

Setting aside Washington’s belief in the inherent superiority of America, a future administration could stop hurling insults or trying to ban enviably successful Chinese tech firms and instead emulate the Chinese example by formulating and implementing a well-planned, long-term high-tech strategy. But as the Covid-19 pandemic has made abundantly clear, the very idea of planning is not a concept available to the “very stable genius” presently in the White House.

Dilip Hiro writes regularly for TomDispatch (where this article originated). He is the author, among many other works, of After Empire: The Birth of a Multipolar World. He is currently researching a sequel to that book, which would cover several interlinked subjects, including the Covid-19 pandemic.

Copyright ©2020 Dilip Hiro — distributed by Agence Global

—————-

Released: 18 August 2020

Word Count: 2,704

—————-

Andrew Bacevich, “Biden wins, then what?”

August 13, 2020 - TomDispatch

Assume Joe Biden wins the presidency. Assume as well that he genuinely intends to repair the damage our country has sustained since we declared ourselves history’s “Indispensable Nation,” compounded by the traumatic events of 2020 that demolished whatever remnants of that claim survived. Assume, that is, that this aging career politician and creature of the Washington establishment really intends to salvage something of value from all that has been lost.

If he seriously intends to be more than a relic of pre-Trump liberal centrism, how exactly should President Biden go about making his mark?

Here, free of charge, Joe, is an action plan that will get you from Election Night through your first two weeks in office. Follow this plan and by your 100th day in the White House observers will be comparing you to at least one President Roosevelt, if not both.

On Election Night (or whatever date you are declared the winner): Close down your Twitter account. Part of your job, Joe, is to restore some semblance of dignity to the office of the presidency. Twitter and similar social media platforms are a principal source of the coarseness and malice that today permeate American politics. Remove yourself from that ugliness. Your predecessor transformed a presidency that had acquired imperial pretensions into an office best described as a cesspool of grotesque demagoguery. One of your central tasks will be to model a genuine alternative: a presidency appropriate for a constitutional republic, where reason, candor, and a commitment to the common good really do prevail over partisan name-calling. That’s a lot to ask for, but returning to a more traditional conception of the Bully Pulpit would certainly be a place to start.

During the transition: Direct your press secretary to announce that on January 20th there will be no ritzy Inaugural balls. Take your cues from Franklin Delano Roosevelt’s inauguration for his fourth term in office, a distinctly low-key event. After all, in January 1945, the nation was still at war; victory had not yet arrived; celebration could wait. Our present-day multifaceted crisis bears at least some comparison to that World War II moment. So, as you plan your own inauguration, ditch the glitz. A secondary benefit: you won’t have to hit up wealthy donors for the dough to pay for the party. And with no party, you won’t have to worry about inaugural festivities triggering another spike in Covid-19 infections.

In addition to selecting a cabinet and ignoring your predecessor’s bleating, the main focus of your transition period has to be policy planning. When you take office, the coronavirus pandemic will still be with us: that’s a given. Even if optimistic predictions of an effective vaccine becoming available by early 2021 were to pan out, we won’t be out of the woods. Not faintly. So your number-one priority during the transition must be to do what Trump never came close to doing: devise a concrete national strategy for limiting the spread of the virus along with a blueprint for prompt and comprehensive vaccine distribution when one is ready.

That said, it would also be prudent to engage in quiet contingency planning to lay out possible courses of action should your predecessor refuse to acknowledge his defeat (“rigged election!”) or leave the White House.

On January 20th, the big day arrives.

Noon, Eastern Standard Time: With the chief justice of the Supreme Court presiding, take the oath of office in the East Room of the White House in the presence of Vice President Kamala Harris and your immediate family. No inaugural address, no parade, no festivities whatsoever. Make like you’re George Washington: he wasn’t into making a fuss. When the ceremony ends, have lunch and get down to work.

That afternoon: Issue an executive order directing the formation of a National Commission on Reconciliation and Reparations, or NCRR. Recruit Harvard professor Henry Louis Gates or another scholar of comparable stature to head the effort. While likely to be a lengthy and contentious endeavor, the NCRR will provide a point of departure for addressing the persistence of American racism by taking on this overarching question: What does justice require?

That evening: Speak to the nation from the Oval Office. Make it brief. Your address will set the tone for your administration. The nation has its hands full with concurrent crises. The moment calls for humility and hard work, not triumphalism. Don’t overpromise. Consider Abraham Lincoln’s second inaugural address as a model. Curb your inclination to blather. Abe only needed 701 words. See if you can better that.

Day 2: In a letter to House and Senate leaders, unveil the details of your coronavirus strategy, which must include: 1) a national plan to curb the existing Covid-19 outbreak and prevent future ones; 2) a nationwide approach to vaccine distribution; 3) a strategy for averting and, if needed, curbing the outbreak of comparable diseases; 4) adequate funding of key government pandemic relief and prevention facilities and activities. In the process, identify near-term and longer-term funding requirements that will require congressional action.

Day 3: Issue an executive order reversing the announced withdrawal of the United States from the Paris Climate Accords. Describe this as just an initial down payment on the $2 trillion Green New Deal you promised Americans during the election campaign. Joe, if you can make meaningful progress toward curbing climate change, future generations will put you on Mount Rushmore in place of one of those slaveholders.

Day 4: Send a personal message to the German chancellor, the British prime minister, and the presidents of China, France, and Russia, declaring your intention to recommit the United States to the Iran nuclear deal that Donald Trump ditched in 2018. Quietly initiate the process of opening a back channel to the Iranian leadership. (I’ve got colleagues who might be able to lend a hand in laying the groundwork. Let me know if the Quincy Institute can be of help.) That same day, on your first visit as president to the White House press room, casually mention that the United States will henceforth adhere to a policy of no-first-use regarding its nuclear weapons. Simultaneously, tell the Pentagon to stop work on “modernizing” the U.S. nuclear arsenal. That’s $2 trillion that can be better spent elsewhere. No first use will flush “fire and fury like the world has never seen” down the toilet. Generals, weapons contractors, and aging Cold Warriors will tell you that you’re taking a great risk. Ignore them and you will substantially reduce the possibility of nuclear war.

Day 5: Issue an executive order suspending any further work on your predecessor’s border “wall.” At the same time, announce your intention to form a non-partisan task force to recommend policies related to border security and immigration, whether legal or otherwise. Ask former Secretary of Housing and Urban Development Julián Castro to chair that task force, with a report due prior to the 100th day of your presidency.

Day 6: Accompanied by Secretary of State Elizabeth Warren, visit the State Department for an all-hands-on-deck meeting. Let it be known that your administration will reserve all senior diplomatic appointments for seasoned Foreign Service officers. No more selling of ambassadorships to campaign contributors or old friends hoping to acquire an honorific title. Make clear your intention to revitalize American diplomacy, recognizing that the principal threats to our wellbeing are transnational and not susceptible to military solutions. The Pentagon can’t do much to alleviate pandemics, environmental degradation, and climate change. Those true national security crises will require collaborative action. Also use this occasion to announce the formation of a non-partisan task force that will recommend ways to reform and re-professionalize the Foreign Service. Top-flight diplomacy requires top-drawer diplomats. Ask former Ambassadors Chas Freeman and Thomas Pickering, both savvy global thinkers and seasoned diplomats, to co-chair that effort, with instructions to report back by July 11th, the birthday of John Quincy Adams, our greatest secretary of state.

Day 7: Begin your morning by inviting General Mark Milley to the Oval Office for a one-on-one meeting. Ask him to tender his immediate resignation as chairman of the Joint Chiefs of Staff. Milley’s participation in the infamous Lafayette Square stunt, even if unwitting, renders him unfit for further employment. Later that same day, visit the remaining chiefs in the Pentagon. Explain your intention to commence a wholesale reevaluation of the U.S. military’s global posture — command structure, bases, budgets, priorities, and above all emerging threats. Ask for their forthright assistance in this endeavor, making it clear that anyone obstructing the process will be gone.

Day 8: Call on Ruth Bader Ginsburg in her chambers at the Supreme Court. Invite her to retire now that the Senate is in Democratic hands. Offer private assurances that her successor will be a) liberal; b) a woman; c) a person of color; and d) a distinguished jurist.

Day 9: Do what your predecessor vowed to do, but didn’t: end America’s endless wars. At your first full-fledged cabinet meeting, charge your new Defense Secretary James Webb with providing a detailed schedule for a deliberate, but comprehensive withdrawal (no ifs, ands, or buts) of U.S. forces from Afghanistan and the Persian Gulf, with a completion date by the end of your first year in office.

Day 10: Visit Mexico City. Engage in a trilateral discussion with President Andrés Manuel López Obrador and Canadian Prime Minister Justin Trudeau. At day’s end, sign the Declaration of Tenochtitlan affirming a common commitment to democracy, the rule of law, human rights, economic growth, and continental security. Your predecessors have taken Mexico and Canada for granted. You will correct that oversight. In fact, no two countries on the planet are of greater importance to the wellbeing of the American people.

Day 11: Invite China's President Xi Jinping for an informal meeting at Camp David at a date of his choosing. As you know, Joe, the United States and China are hurtling toward a new Cold War. Reversing the momentum of events will prove difficult indeed. This will require considerable personal diplomacy on your part. Given the need for the planet's two major economic powers to cooperate on lowering greenhouse gases globally, nothing is more important than this. Start now.

Day 12: Announce plans to visit NATO headquarters in the near future. Begin quiet consultations with European members of the alliance to nudge them toward taking responsibility for their own security. Let them know that before the year is out you intend to make public a 10-year timetable for withdrawing all U.S. forces from Europe. That will concentrate minds in London, Paris, Berlin, and elsewhere in the alliance.

Day 13: Convene a meeting of the best minds in tech (which, by the way, does not necessarily mean the wealthiest tech tycoons). Pick their brains on the issue of privacy. This challenge will extend beyond your presidency. You can at least highlight the problem.

Day 14: You’re 78, the oldest man ever to walk into the Oval Office as president. Be smart. Take a day totally off to recharge your batteries. You have a long way to go.

Joe, you’re a bit long in the tooth for the duties you’re about to assume. Keep in mind the adage that applies to all us old folks: time is fleeting. We never know how much we have left, so seize the moment. No offense, but your days (like mine) are numbered.

Good luck. I’ll be pulling for you.

Andrew Bacevich writes regularly for TomDispatch (where this article originated). He is president of the Quincy Institute for Responsible Statecraft. His new book is The Age of Illusions: How America Squandered Its Cold War Victory.

Copyright ©2020 Andrew Bacevich — distributed by Agence Global

—————-

Released: 13 August 2020

Word Count: 1,901

—————-

Bob Dreyfuss, “October Surprise — Will war with Iran be Trump’s election eve shocker?”

August 11, 2020 - TomDispatch

Was Donald Trump's January 3rd drone assassination of Major General Qasem Soleimani the first step in turning the simmering Cold War between the United States and Iran into a hot war in the weeks before an American presidential election? Of course, there's no way to know, but behind by double digits in most national polls and flanked by ultra-hawkish Secretary of State Mike Pompeo, Trump is a notoriously impetuous and erratic figure. In recent weeks, for instance, he didn't hesitate to dispatch federal paramilitary forces to American cities run by Democratic mayors, and his administration also seems to have launched a series of covert actions against Tehran that look increasingly overt, leaving Iran watchers concerned that an October surprise could be in the cards.

Much of that concern arises from the fact that, across Iran, things have been blowing up or catching fire in ways that have seemed both mysterious and threatening. Early last month, for instance, a suspicious explosion at an Iranian nuclear research facility at Natanz, which is also the site of its centrifuge production, briefly grabbed the headlines. Whether the site was severely damaged by a bomb smuggled into the building or some kind of airstrike remains unknown. “A Middle Eastern intelligence official said Israel planted a bomb in a building where advanced centrifuges were being developed,” reported the New York Times. Similar fiery events have been plaguing the country for weeks. On June 26th, for instance, there was “a huge explosion in the area of a major Iranian military and weapons development base east of Tehran.” On July 15th, seven ships caught fire at an Iranian shipyard. Other mysterious fires and explosions have hit industrial facilities, a power plant, a missile production factory, a medical complex, a petrochemical plant, and other sites as well.

“Some officials say that a joint American-Israeli strategy is evolving — some might argue regressing — to a series of short-of-war clandestine strikes,” concluded another report in the Times.

Some of this sabotage has been conducted against the backdrop of a two-year-old “very aggressive” CIA action plan to engage in offensive cyber attacks against that country. As a Yahoo! News investigative report put it: “The Central Intelligence Agency has conducted a series of covert cyber operations against Iran and other targets since winning a secret victory in 2018 when President Trump signed what amounts to a sweeping authorization for such activities, according to former U.S. officials with direct knowledge of the matter… The finding has made it easier for the CIA to damage adversaries’ critical infrastructure, such as petrochemical plants.”

Meanwhile, on July 23rd, two U.S. fighter jets buzzed an Iranian civilian airliner in Syrian airspace, causing its pilot to swerve and drop altitude suddenly, injuring a number of the plane’s passengers.

For many in Iran, the drone assassination of Soleimani — and the campaign of sabotage that followed — has amounted to a virtual declaration of war. The equivalent to the Iranian major general’s presidentially ordered murder, according to some analysts, would have been Iran assassinating Secretary of State Pompeo or Chairman of the Joint Chiefs of Staff Mark Milley, although such analogies actually understate Soleimani’s stature in the Iranian firmament.

In its aftermath, Iran largely held its fire, its only response being a limited, telegraphed strike at a pair of American military bases in Iraq. If Soleimani’s murder was intended to draw Iran into a tit-for-tat military escalation in an election year, it failed. So perhaps the U.S. and Israel designed the drumbeat of attacks against critical Iranian targets this summer as escalating provocations meant to goad Iran into retaliating in ways that might provide an excuse for a far larger U.S. response.

Such a conflict-to-come would be unlikely to involve U.S. ground forces against a nation several times larger and more powerful than Iraq. Instead, it would perhaps involve a sustained campaign of airstrikes against dozens of Iranian air defense installations and other military targets, along with the widespread network of facilities that the United States has identified as being part of that country’s nuclear research program.

The "art" of the deal in 2020

In addition to military pressure and fierce sanctions against the Iranian economy, Washington has been cynically trying to take advantage of the fact that Iran, already in a weakened state, has been especially hard hit by the Covid-19 pandemic. Those American sanctions have, for instance, made it far harder for that country to get the economic support and medical and humanitarian supplies it so desperately needs, given its soaring death count.

According to a report by the European Leadership Network,

“Rather than easing the pressure during the crisis, the U.S. has applied four more rounds of sanctions since February and contributed to the derailing of Iran’s application for an IMF [International Monetary Fund] loan. The three special financial instruments designed to facilitate the transfer of humanitarian aid to Iran in the face of secondary sanctions on international banking transactions… have proven so far to have been one-shot channels, stymied by U.S. regulatory red tape.”

To no avail did Human Rights Watch call on the United States in April to ease its sanctions in order to facilitate Iran’s ability to grapple with the deadly pandemic, which has officially killed nearly 17,000 people since February (or possibly, if a leaked account of the government’s actual death figures is accurate, nearly 42,000).

Iran has every reason to feel aggrieved. At great political risk, President Hassan Rouhani and Supreme Leader Ali Khamenei agreed in 2015 to a deal with the United States and five other world powers over Iran’s nuclear research program. That accord, the Joint Comprehensive Plan of Action (JCPOA), accomplished exactly what it was supposed to do: it led Iran to make significant concessions, cutting back both on its nuclear research and its uranium enrichment program in exchange for an easing of economic sanctions by the United States and other trade partners.

Though the JCPOA worked well, in 2018 President Trump unilaterally withdrew from it, reimposed far tougher sanctions on Iran, began what the administration called a campaign of “maximum pressure” against Tehran, and since assassinating Soleimani has apparently launched military actions just short of actual war. Inside Iran, Trump’s confrontational stance has helped tilt politics to the right, undermining Rouhani, a relative moderate, and eviscerating the reformist movement there. In elections for parliament in February, ultraconservatives and hardliners swept to a major victory.

But the Iranian leadership can read a calendar, too. Like voters in the United States, they know that the Trump administration is probably going to be voted out of office in three months. And they know that, in the event of war, it’s more likely than not that many Americans — including, sadly, some hawkish Democrats in Congress, and influential analysts at middle-of-the-road Washington think tanks — will rally to the White House. So unless the campaign of covert warfare against targets in Iran were to intensify dramatically, the Iranian leadership isn’t likely to give Trump, Pompeo, and crew the excuse they’re looking for.

As evidence that Iran's leadership is paying close attention to the president's electoral difficulties, Khamenei only recently rejected in the most explicit terms possible what most observers believe is yet another cynical ploy by the American president, when he suddenly asked Iran to reengage in direct leader-to-leader talks. In a July 31st speech, the Iranian leader replied that Iran is well aware Trump is seeking only sham talks to help him in November. (In June, Trump tweeted at Iran: "Don't wait until after the U.S. Election to make the Big deal! I'm going to win!") Indeed, proving that Washington has no intention of negotiating with Iran in good faith, after wrecking the JCPOA and ratcheting up sanctions, the Trump administration announced an onerous list of 12 conditions that would have to precede the start of such talks. In sum, they amounted to a demand for a wholesale, humiliating Iranian surrender. So much for the art of the deal in 2020.

October surprises, then and now

Meanwhile, the United States isn't getting much support from the rest of the world for its thinly disguised effort to create chaos, a possible uprising, and the conditions to force regime change on Iran before November 3rd. At the United Nations, when Secretary of State Pompeo called on the Security Council to extend an onerous arms embargo on Iran, not only did Russia and China promise to veto any such resolution but America's European allies opposed it, too. They were particularly offended by Pompeo's threat to impose "snapback" economic sanctions on Iran as laid out in the JCPOA if the arms embargo wasn't endorsed by the council. Not lost on the participants was the fact that, in justifying his demand for such new U.N. sanctions, the American secretary of state was invoking the very agreement that Washington had unilaterally abandoned. "Having quit the JCPOA, the U.S. is no longer a participant and has no right to trigger a snapback at the U.N.," was the way China's U.N. ambassador put it.

That other emerging great power has, in fact, become a major spoiler and Iranian ally against the Trump administration’s regime-change strategy, even as its own relations with Washington grow grimmer by the week. Last month, the New York Times reported that Iran and China had inked “a sweeping economic and security partnership that would clear the way for billions of dollars of Chinese investments in energy and other sectors, undercutting the Trump administration’s efforts to isolate the Iranian government.” The 18-page document reportedly calls for closer military cooperation and a $400 billion Chinese investment and trade accord that, among other things, takes direct aim at the Trump-Pompeo effort to cripple Iran’s economy and its oil exports.

According to Shireen Hunter, a veteran Middle Eastern analyst at Georgetown University, that accord should be considered a world-changing one, as it potentially gives China “a permanent foothold in Iran” and undermines “U.S. strategic supremacy in the [Persian] Gulf.” It is, she noted with some alarm, a direct result of Trump’s anti-Iranian obsession and Europe’s reluctance to confront Washington’s harsh sanctions policy.

On June 20th, in a scathing editorial, the Washington Post agreed, ridiculing the administration's "maximum pressure" strategy against Iran. Not only had the president failed to bring down Iran's government or compel it to change its behavior in conflicts in places like Syria and Yemen, but now, in a powerful blow to U.S. interests, "an Iranian partnership with China… could rescue Iran's economy while giving Beijing a powerful new place in the region."

If, however, the traditional Washington foreign policy establishment believes that Trump’s policy toward Iran is backfiring and so working against U.S. hegemony in the Persian Gulf, his administration seems not to care. As evidence mounts that its approach to Iran isn’t having the intended effect, the White House continues apace: squeezing that country economically, undermining its effort to fight Covid-19, threatening it militarily, appointing an extra-hardliner as its “special envoy” for Iran, and apparently (along with Israel) carrying out a covert campaign of terrorism inside the country.

Over the past four decades, "October surprise" has evolved into a catch-all phrase meaning any unexpected action by a presidential campaign just before an election designed to give one of the candidates a surprise advantage. Ironically, its origins lay in Iran. In 1980, during the contest between President Jimmy Carter and former California Governor Ronald Reagan, rumors surfaced that Carter might stage a raid to rescue scores of American diplomats then held captive in Tehran. (He didn't.) According to other reports, the Reagan campaign had made clandestine contact with Tehran aimed at persuading that country not to release its American hostages until after the election. (Two books, October Surprise by Gary Sick, a senior national security adviser to Carter, and Trick or Treason by investigative journalist Robert Parry delved into the possibility that candidate Reagan, his campaign manager and future CIA Director Bill Casey, and others had engaged in a conspiracy with Iran to win that election.)

Consider it beyond irony if, this October, the latest election “surprise” were to take us back to the very origins of the term in the form of some kind of armed conflict that could only end terribly for everyone involved. It’s a formula for disaster and like so many other things, when it comes to Donald J. Trump, it can’t be ruled out.

Bob Dreyfuss is an investigative journalist and writes regularly for TomDispatch (where this article originated). He is a contributing editor at the Nation and has written for Rolling Stone, Mother Jones, the American Prospect, the New Republic, and many other magazines. He is the author of Devil’s Game: How the United States Helped Unleash Fundamentalist Islam.

Copyright ©2020 Robert Dreyfuss — distributed by Agence Global

—————-

Released: 11 August 2020

Word Count: 2,055

—————-

Patrick Cockburn, “War and pandemic journalism”

August 6, 2020 - TomDispatch

The struggle against Covid-19 has often been compared to fighting a war. Much of this rhetoric is bombast, but the similarities between the struggle against the virus and against human enemies are real enough. War reporting and pandemic reporting likewise have much in common because, in both cases, journalists are dealing with and describing matters of life and death. Public interest is fueled by deep fears, often more intense during an epidemic because the whole population is at risk. In a war, aside from military occupation and area bombing, terror is at its height among those closest to the battlefield.

The nature of the dangers stemming from military violence and the outbreak of a deadly disease may appear very different. But looked at from the point of view of a government, they both pose an existential threat because failure in either crisis may provoke some version of regime change. People seldom forgive governments that get them involved in losing wars or that fail to cope adequately with a natural disaster like the coronavirus. The powers-that-be know that they must fight for their political lives, perhaps even their physical existence, claiming any success as their own and doing their best to escape blame for what has gone wrong.

My first pandemic

I first experienced a pandemic in the summer of 1956 when, at the age of six, I caught polio in Cork, Ireland. The epidemic there began soon after virologist Jonas Salk developed a vaccine for it in the United States, but before it was available in Europe. Polio epidemics were at their height in the first half of the twentieth century and, in a number of respects, closely resembled the Covid-19 experience: many people caught the disease but only a minority were permanently disabled by or died of it. In contrast with Covid-19, however, it was young children, not the old, who were most at risk. The terror caused by poliomyelitis, to use its full name, was even higher than during the present epidemic exactly because it targeted the very young and its victims did not generally disappear into the cemetery but were highly visible on crutches and in wheelchairs, or prone in iron lungs.

Parents were mystified by the source of the illness because it was spread by great numbers of asymptomatic carriers who did not know they had it. The worst outbreaks were in the better-off parts of modern cities like Boston, Chicago, Copenhagen, Melbourne, New York, and Stockholm. People living there enjoyed a good supply of clean water and had effective sewage disposal, but did not realize that all of this robbed them of their natural immunity to the polio virus. The pattern in Cork was the same: most of the sick came from the more affluent parts of the city, while people living in the slums were largely unaffected. Everywhere, there was a frantic search to identify those, like foreign immigrants, who might be responsible for spreading the disease. In the New York epidemic of 1916, even animals were suspected of doing so and 72,000 cats and 8,000 dogs were hunted down and killed.

The illness weakened my legs permanently and I have a severe limp so, even reporting in dangerous circumstances in the Middle East, I could only walk, not run. I was very conscious of my disabilities from the first, but did not think much about how I had acquired them or the epidemic itself until perhaps four decades later. It was the 1990s and I was then visiting ill-supplied hospitals in Iraq as that country’s health system was collapsing under the weight of U.N. sanctions. As a child, I had once been a patient in an almost equally grim hospital in Ireland and it occurred to me then, as I saw children in those desperate circumstances in Iraq, that I ought to know more about what had happened to me. At that time, my ignorance was remarkably complete. I did not even know the year when the polio epidemic had happened in Ireland, nor could I say if it was caused by a virus or a bacterium.

So I read up on the outbreak in newspapers of the time and Irish Health Ministry files, while interviewing surviving doctors, nurses, and patients. Kathleen O’Callaghan, a doctor at St. Finbarr’s hospital, where I had been brought from my home when first diagnosed, said that people in the city were so frightened “they would cross the road rather than walk past the walls of the fever hospital.” My father recalled that the police had to deliver food to infected homes because no one else would go near them. A Red Cross nurse, Maureen O’Sullivan, who drove an ambulance at the time, told me that, even after the epidemic was over, people would quail at the sight of her ambulance, claiming “the polio is back again” and dragging their children into their houses or they might even fall to their knees to pray.

The local authorities in a poor little city like Cork where I grew up understood better than national governments today that fear is a main feature of epidemics. They tried then to steer public opinion between panic and complacency by keeping control of the news of the outbreak. When British newspapers like the Times reported that polio was rampant in Cork, they called this typical British slander and exaggeration. But their efforts to suppress the news never worked as well as they hoped. Instead, they dented their own credibility by trying to play down what was happening. In that pre-television era, the main source of information in my hometown was the Cork Examiner, which, after the first polio infections were announced at the beginning of July 1956, accurately reported on the number of cases, but systematically underrated their seriousness.

Headlines about polio like “Panic Reaction Without Justification” and “Outbreak Not Yet Dangerous” regularly ran below the fold on its front page. Above it were the screaming ones about the Suez Crisis and the Hungarian uprising of that year. In the end, this treatment only served to spread alarm in Cork where many people were convinced that the death toll was much higher than the officially announced one and that bodies were being secretly carried out of the hospitals at night.

My father said that, in the end, a delegation of local businessmen, the owners of the biggest shops, approached the owners of the Cork Examiner, threatening to withdraw their advertising unless it stopped reporting the epidemic. I was dubious about this story, but when I checked the newspaper files many years later, I found that he was correct and the paper had almost entirely stopped reporting on the epidemic just as sick children were pouring into St. Finbarr’s hospital.

The misreporting of wars and epidemics

By the time I started to research a book about the Cork polio epidemic that would be titled Broken Boy, I had been reporting wars for 25 years, starting with the Northern Irish Troubles in the 1970s, then the Lebanese civil war, the Iraqi invasion of Kuwait, the war that followed Washington's post-9/11 takeover of Afghanistan, and the U.S.-led 2003 invasion of Iraq. After publication of the book, I went on covering these endless conflicts for the British paper the Independent as well as new conflicts sparked in 2011 by the Arab Spring in Libya, Syria, and Yemen.

As the coronavirus pandemic began this January, I was finishing a book (just published), War in the Age of Trump: The Defeat of Isis, the Fall of the Kurds, the Confrontation with Iran. Almost immediately, I noticed strong parallels between the Covid-19 pandemic and the polio epidemic 64 years earlier. Pervasive fear was perhaps the common factor, though little grasped by governments of this moment. Boris Johnson's government in Great Britain, where I was living, was typical in believing that people had to be frightened into lockdown, when, in fact, so many were already terrified and needed to be reassured.

I also noticed ominous similarities between the ways in which epidemics and wars are misreported. Those in positions of responsibility — Donald Trump represents an extreme version of this — invariably claim victories and successes even as they fail and suffer defeats. The words of the Confederate general “Stonewall” Jackson came to mind. On surveying ground that had only recently been a battlefield, he asked an aide: “Did you ever think, sir, what an opportunity a battlefield affords liars?”

This has certainly been true of wars, but no less so, it seemed to me, of epidemics, as President Trump was indeed soon to demonstrate (over and over and over again). At least in retrospect, disinformation campaigns in wars tend to get bad press and be the subject of much finger wagging. But think about it a moment: it stands to reason that people trying to kill each other will not hesitate to lie about each other as well. While the glib saying that “truth is the first casualty of war” has often proven a dangerous escape hatch for poor reporting or unthinking acceptance of a self-serving version of battlefield realities (spoon-fed by the powers-that-be to a credulous media), it could equally be said that truth is the first casualty of pandemics. The inevitable chaos that follows in the wake of the swift spread of a deadly disease and the desperation of those in power to avoid being held responsible for the soaring loss of life lead in the same direction.

There is, of course, nothing inevitable about the suppression of truth when it comes to wars, epidemics, or anything else for that matter. Journalists, individually and collectively, will always be engaged in a struggle with propagandists and PR men, one in which victory for either side is never inevitable.

Unfortunately, wars and epidemics are melodramatic events and melodrama militates against real understanding. “If it bleeds, it leads” is true of news priorities when it comes to an intensive care unit in Texas or a missile strike in Afghanistan. Such scenes are shocking but do not necessarily tell us much about what is actually going on.

The recent history of war reporting is not encouraging. Journalists will always have to fight propagandists working for the powers-that-be. Sadly, I have had the depressing feeling since Washington’s first Gulf War against Saddam Hussein’s Iraq in 1991 that the propagandists are increasingly winning the news battle and that accurate journalism, actual eyewitness reporting, is in retreat.

Disappearing news

By its nature, reporting wars is always going to be difficult and dangerous work, but it has become more so in these years. Coverage of Washington's Afghan and Iraqi wars was often inadequate, but not as bad as the more recent reporting from war-torn Libya and Syria or its near total absence from the disaster that is Yemen. This lack fostered misconceptions even when it came to fundamental questions like who is actually fighting whom, for what reasons, and just who are the real prospective winners and losers.

Of course, there is little new about propaganda, controlling the news, or spreading “false facts.” Ancient Egyptian pharaohs inscribed self-glorifying and mendacious accounts of their battles on monuments, now thousands of years old, in which their defeats are lauded as heroic victories. What is new about war reporting in recent decades is the far greater sophistication and resources that governments can deploy in shaping the news. With opponents like longtime Iraqi ruler Saddam Hussein, demonization was never too difficult a task because he was a genuinely demonic autocrat.

Yet the most influential news story about the Iraqi invasion of neighboring Kuwait in 1990 and the U.S.-led counter-invasion proved to be a fake. This was a report that, in August 1990, invading Iraqi soldiers had tipped babies out of incubators in a Kuwaiti hospital and left them to die on the floor. A Kuwaiti girl reported to have been working as a volunteer in the hospital swore before a U.S. congressional committee that she had witnessed that very atrocity. Her story was hugely influential in mobilizing international support for the war effort of the administration of President George H.W. Bush and the U.S. allies he teamed up with.

In reality it proved purely fictional. The supposed hospital volunteer turned out to be the daughter of the Kuwaiti ambassador in Washington. Several journalists and human rights specialists expressed skepticism at the time, but their voices were drowned out by the outrage the tale provoked. It was a classic example of a successful propaganda coup: instantly newsworthy, not easy to disprove, and when it was — long after the war — it had already had the necessary impact, creating support for the U.S.-led coalition going to war with Iraq.

In a similar fashion, I reported on the American war in Afghanistan in 2001-2002 at a time when coverage in the international media had left the impression that the Taliban had been decisively defeated by the U.S. military and its Afghan allies. Television showed dramatic shots of bombs and missiles exploding on the Taliban front lines and Northern Alliance opposition forces advancing unopposed to “liberate” the Afghan capital, Kabul.

When, however, I followed the Taliban retreating south to Kandahar Province, it became clear to me that they were not by any normal definition a beaten force, that their units were simply under orders to disperse and go home. Their leaders had clearly grasped that they were over-matched and that it would be better to wait until conditions changed in their favor, something that had distinctly happened by 2006, when they went back to war in a big way. They then continued to fight in a determined fashion to the present day. By 2009, it was already dangerous to drive beyond the southernmost police station in Kabul due to the risk that Taliban patrols might create pop-up checkpoints anywhere along the road.

None of the wars I covered then have ever really ended. What has happened, however, is that they have largely ended up receding, if not disappearing, from the news agenda. I suspect that, if a successful vaccine for Covid-19 isn’t found and used globally, something of the same sort could happen with the coronavirus pandemic as well. Given the way news about it now dominates, even overwhelms, the present news agenda, this may seem unlikely, but there are precedents. In 1918, with World War I in progress, governments dealt with what came to be called the Spanish Flu by simply suppressing information about it. Spain, as a non-combatant in that war, did not censor the news of the outbreak in the same fashion and so the disease was most unfairly named “the Spanish Flu,” though it probably began in the United States.

The polio epidemic in Cork supposedly ended abruptly in mid-September 1956 when the local press stopped reporting on it, but that was at least two weeks before many children like me caught it. In a similar fashion, right now, wars in the Middle East and north Africa like the ongoing disasters in Libya and Syria that once got significant coverage now barely get a mention much of the time.

In the years to come, the same thing could happen to the coronavirus.

Patrick Cockburn is a Middle East correspondent for the Independent of London and the author of six books on the Middle East, the latest of which is War in the Age of Trump: The Defeat of Isis, the Fall of the Kurds, the Confrontation with Iran (Verso). This article originated at TomDispatch.com

Copyright ©2020 Patrick Cockburn — distributed by Agence Global

—————-

Released: 06 August 2020

Word Count: 2,527

—————-

Andrea Mazzarino, “The military is sick”

August 4, 2020 - TomDispatch

American military personnel are getting sick in significant numbers in the midst of the ongoing pandemic. As The New York Times reported in a piece buried in the back pages of its July 21st edition, “The infection rate in the services has tripled over the past six weeks as the United States military has emerged as a potential source of transmission both domestically and abroad.”

Indeed, the military is sick and I think of it as both a personal and an imperial disaster.

As the wife of a naval officer, I bear witness to the unexpected ways that disasters of all sorts play out among military families and lately I’ve been bracing for the Covid-19 version of just such a disaster. Normally, for my husband and me, the stressors are relatively mild. After all, between us we have well-paid jobs, two healthy children, and supportive family and friends, all of which allow us to weather the difficulties of military life fairly smoothly. In our 10 years together, however, over two submarine assignments and five moves, we’ve dealt with unpredictable months-long deployments, uncertainty about when I will next be left to care for our children alone, and periods of 16-hour workdays for my spouse that strained us both, not to speak of his surviving a major submarine accident.

You would think that, as my husband enters his third year of "shore duty" as a Pentagon staffer, the immediate dangers of military service would finally be negligible. No such luck. Around mid-June, as President Trump searched for scapegoats like the World Health Organization for his own Covid-19 ineptitude and fretted over what rising infection rates could mean for his approval ratings, he decided that it was time to push this country to "reopen."

As it turned out, that wouldn’t just be a disaster for states from Florida to California, but also meant that the Pentagon resumed operations at about 80% capacity. So, after a brief reprieve, my spouse is now required to report to his office four days a week for eight-hour workdays in a poorly ventilated, crowded hive of cubicles where people neither consistently mask nor social distance.

All of this for what often adds up to an hour or two of substantive daily work. Restaurants, dry cleaners, and other services where Pentagon staffers circulate only add to the possibility of his being exposed to Covid-19.

My husband, in other words, is now unnecessarily risking his own and his family’s exposure to a virus that has to date claimed more than 150,000 American lives — already more than eight times higher than the number of Americans who died in both the 9/11 terrorist attacks and the endless wars in Iraq and Afghanistan that followed.

In mid-August, he will transfer to an office job in Maryland, a state where cases and deaths are again on the rise. One evening, I asked him why it seemed to be business as usual at the Pentagon when numbers were spiking in a majority of states. His reply: “Don’t ask questions about facts and logic.”

After all, unless Secretary of Defense Mark Esper decides to speak out against the way President Trump has worked to reopen the country to further disaster, the movement of troops and personnel like my husband within and among duty stations will simply continue, even as Covid-19 numbers soar in the military.

America's archipelago of bases

Global freedom of movement has been a hallmark of America's vast empire of bases, at least 800 of them scattered across much of the planet. Now, it may prove part of the downfall of that very imperial structure. After all, Donald Trump's America is at the heart of the present pandemic. So it's hardly surprising that, according to the Times, U.S. troops seem to be carrying Covid-19 infections with them from hard-hit states like Arizona, California, Florida, and Texas, a number of which have had lax and inconsistently enforced safety guidelines, to other countries where they are stationed.

For example, at just one U.S. base on the Japanese island of Okinawa, the Marine Corps reported nearly 100 cases in July, angering local officials because American soldiers had socialized off-base and gone to local bars in a place where the coronavirus had initially been suppressed. No longer. In Nigeria, where official case counts are low but healthcare workers in large cities are reporting a spike in deaths among residents with symptoms, the U.S. military arms, supplies, and trains the national security forces. So a spike in cases among U.S. troops now places local populations (as well as those soldiers) at additional risk in a country where testing and contact tracing are severely lacking. And this is a problem now for just about any U.S. ally from Europe to South Korea.

What this virus’s spread among troops means, of course, is that the U.S. empire of bases that spans some 80 countries — about 40% of the nations on this planet — is now part of the growing American Covid-19 disaster. There is increasing reason to believe that new outbreaks of what the president likes to call the “Chinese virus” in some of these countries may actually prove to be American imports. Like many American civilians, our military personnel are traveling, going to work, socializing, buying things, often unmasked and ungloved, and anything but social distanced.

Public health experts have been clear that the criteria for safely reopening the economy without sparking yet more outbreaks are numerous. They include weeks of declining case counts, daily new case rates at or beneath four per 100,000 people, adequate testing capacity, strict enforcement of social-distancing guidelines, and the availability of at least 40% of hospital ICU beds to treat any possible future surge.

To date, only three states have met these criteria: Connecticut, New Jersey, and New York. The White House’s Opening Up America plan, on the other hand, includes guidelines of just the weakest and vaguest sort like noting a downward trajectory in cases over 14-day periods and “robust testing capacity” for healthcare workers (without any definition of what this might actually mean).

Following White House guidance, the Department of Defense is deferring to local and state governments to determine what, if any, safety measures to take. In March, when a military-wide lockdown began at the White House's suggestion, troops had to quarantine for 14 days before moving to their next duty station. At the close of June, the Pentagon broadly removed travel restrictions, allowing both inter-state recreational and military travel by troops and their families. Now, in a country that lacks any disciplined and unified response to the global pandemic, our ever-mobile military has become a significant conduit of its spread, both domestically and abroad.

To be sure, none of us knew how to tackle the dangers posed by this virus. The last global pandemic of this sort, the “Spanish Flu” of 1918-1919 in which 50 million or more people died worldwide, suggested just how dire the consequences of such an outbreak could be when uncontained. But facts and lived experience are two different things. If you’re young, physically fit, have survived numerous viruses of a more known and treatable sort, and most of the people around you are out and about, you probably dismiss it as just another illness, even if you’re subject to some of the Covid-19 death risk factors that are indeed endemic among U.S. military personnel.

Perhaps what the spread of this pandemic among our troops shows is that the military-civilian divide isn’t as great as we often think.

Protecting life in the Covid-19 era

Full disclosure: I write this at a time when I'm frustrated and tired. For the past month, I've provided full-time child care for our two pre-school age kids, even while working up to 50 hours a week, largely on evenings and weekends, as a psychotherapist for local adults and children themselves acutely experiencing the fears, health dangers, and economic effects of the coronavirus. Like many other moms across the country, I cram work, chores, pre-K Zoom sessions, pediatrician and dentist appointments, and grocery shopping into endless days, while taking as many safety precautions as I can. My husband reminds me of the need to abide by quarantines, as (despite his working conditions) he needs to be protected from exposing top Pentagon officials to the disease.

Yet the military has done little or nothing to deal with the ways the families of service members, asked to work and “rotate,” might be exposed to infection. In the dizziness of fatigue, I have little patience for any institution that carries on with business as usual under such circumstances.

What’s more, it’s hard to imagine how any efforts to quarantine will bear fruit in a country where even those Americans who do follow scientific news about Covid-19 have often dropped precautions against its spread. I’ve noted that, these days, some of my most progressive friends have started to socialize, eat indoors at restaurants, and even travel out of state to more deeply affected places by plane. They are engaging in what we therapists sometimes call “emotion-based reasoning,” or “I’m tired of safety precautions, so they must no longer be necessary.”

And that's not even taking into account the no-maskers among us who flout the safety guidelines offered by the Centers for Disease Control and Prevention to indicate their supposed love of individual liberties. A relative, an officer with the Department of Homeland Security, recently posted a picture on Facebook of his three young children and those of a workmate watching fireworks arm in arm at an unmasked July 4th gathering. The picture was clearly staged to provoke those like me who support social-distancing and masking guidelines. When I talk with him, he quickly changes the subject to how he could, at any moment, be deployed to "control the rioters in D.C. and other local cities." In other words, in his mind, like those of so many others the president relies on as his "base," the real threat isn't the pandemic, it's the people in the streets protesting police violence.

I wonder how the optics of American families celebrating together could have superseded safety based on an understanding of how diseases spread, as well as a healthy respect for the unknowns that go with them.

Sometimes, our misplaced priorities take my breath away, quite literally so recently. Craving takeout from my favorite Peruvian chicken restaurant and wanting to support a struggling local business, I ordered such a meal and drove with my kids to pick it up. Stopping at the restaurant, I noted multiple unmasked people packed inside despite a sign on the door mandating masks and social distancing. Making a quick risk-benefit assessment, I opened the car windows, blasted the air conditioning, and ran into the restaurant without my kids, making faces at them through the window while I stood in line.

A voice suddenly cut through the hum of the rotisseries: “Shameful! Shameful!” A woman, unmasked, literally spat these words, pointing right at me. “Leaving your kids in the car! Someone could take them! Shameful!” I caught my breath. Riddled with guilt and fearful of what she might do, I returned to my car without my food. She followed me, yelling, “Shameful!”

Aside from the spittle flying from this woman’s mouth, notable was what she wasn’t ashamed of: entering such a place, unmasked and ready to spit, with other people’s children also in there running about. (Not to mention that in Maryland reported abductions of children by strangers are nil.)

What has this country come to when we are more likely to blame the usual culprits — negligent mothers, brown and Black people, illegal immigrants (you know the list) — than accept responsibility for what’s actually going on and make the necessary sacrifices to deal with it (perhaps including, I should admit, going without takeout food)?

Typically in these years, top Pentagon officials and the high command are prioritizing the maintenance of empire at the expense of protecting the very bodies that make up the armed services (not to speak of those inhabitants of other countries living near our hundreds of global garrisons). After all, what’s the problem, when nothing could be more important than keeping this country in the (increasingly embattled) position of global overseer? More bodies can always be produced. (Thank you, military spouses!)

The spread of this virus around the globe, now aided in part by the U.S. military, reminds me of one of those paint-with-water children’s books where the shading appears gradually as the brush moves over the page, including in places you didn’t expect. Everywhere that infected Americans socialize, shop, arm, and fight, this virus is popping up, eroding both our literal ability to be present and the institutions (however corrupt) we’re still trying to prop up. If we are truly in a “war” against Covid-19 — President Trump has, of course, referred to himself as a “wartime president” — then it’s time for all of us to make the sacrifices of a wartime nation by prioritizing public health over pleasure. Otherwise, I fear that what’s good about life in this country will also be at risk, as will the futures of my own children.

Andrea Mazzarino writes regularly for TomDispatch (where this article originated). She co-founded Brown University’s Costs of War Project. She has held various clinical, research, and advocacy positions, including at a Veterans Affairs PTSD Outpatient Clinic, with Human Rights Watch, and at a community mental health agency. She is the co-editor of War and Health: The Medical Consequences of the Wars in Iraq and Afghanistan.

Copyright ©2020 Andrea Mazzarino — distributed by Agence Global

—————-

Released: 04 August 2020

Word Count: 2,194

—————-

Karen J. Greenberg, “Accountability is gone in America”

August 3, 2020 - TomDispatch

Whether you consider the appalling death toll or the equally unacceptable rising numbers of Covid-19 cases, the United States has one of the worst records worldwide when it comes to the pandemic. Nevertheless, the president has continued to behave just as he promised he would in March when there had been only 40 deaths from the virus here and he said, “I don’t take responsibility at all.”

In April, when 50,000 Americans had died, he praised himself and his administration, insisting, “I think we’ve done a great job.” In May, as deaths continued to mount nationwide, he insisted, “We have met the moment and we have prevailed.” In June, he swore the virus was “dying out,” contradicting the views and data of his just-swept-into-the-closet coronavirus task force. In July, he cast the blame for the ongoing disaster on state governors, who, he told the nation, had handled the virus “poorly,” adding, “I supplied everybody.” It was the governors, he assured the public, who had failed to acquire and distribute key supplies, including protective gear and testing supplies.

All told, he’s been a perfect model in deflecting all responsibility, even as the death toll soared over 150,000 with more than four million cases reported nationwide and no end in sight, even as he assured the coronavirus of a splendid future in the U.S. by insisting that all schools reopen this fall (and that the Centers for Disease Control and Prevention back him on that).

In other words, Donald Trump and his team have given lack of accountability a new meaning in America. Their refusal to accept the slightest responsibility for Covid-19's rampage through this country may seem startling (or simply like our new reality) in a land that has traditionally defined itself as dedicated to democratic governance and the rule of law. It has long seen itself as committed to transparency and justice, through investigations, reports, and checks and balances, notably via the courts and Congress, designed to ensure that its politicians and officials be held responsible for their actions. The essence of democracy — the election — was also the essence of accountability, and Donald Trump has recently tried to throw even that into doubt when it comes to the contest this November.

Still, the loss of accountability isn’t simply a phenomenon of the Trump years. Its erosion has been coming for a long time at what, in retrospect, should seem an alarmingly inexorable pace.

In August 2020, it should be obvious that America, a still titanic (if fading) power, has largely thrown accountability overboard. With that in mind, here’s a little history of how it happened.

The war on terror

As contemporary historians and political analysts tell it, the decision to go to war in Iraq in the spring of 2003, which cost more than 8,000 American lives and led to more than 200,000 Iraqi deaths, military and civilian, was more than avoidable. It was the result of lies and doctored information engineered to get the U.S. involved in a crucial part of what would soon enough become its "forever wars" across the Greater Middle East and Africa.

As Robert Draper recently reminded us, those in the administration of President George W. Bush who contested information about the presence of weapons of mass destruction in Saddam Hussein’s Iraq were ignored or silenced. Worse yet, torture was used to extract a false confession from senior al-Qaeda member Ibn Sheikh al-Libi regarding the terror organization’s supposed attempts to acquire such weaponry there. Al-Libi’s testimony, later recanted, was used as yet another pretext to launch an invasion that top American officials had long been determined to set in motion.

And it wasn’t just a deceitful decision. It was a thoroughly disastrous one as well. There is today something like a consensus among policy analysts that it was possibly the “biggest mistake in American military history” or, as former Senate Majority Leader Harry Reid (D-NV) put it four years after the invasion, “the worst foreign policy mistake in U.S. history,” supplanting the Vietnam War in the minds of many.

And that raises an obvious question: Who was held accountable for that still unending disaster? Who was charged with the crime of willfully and intentionally taking the nation to war — and a failed war at that — based on manufactured facts? In numerous books, the grim realities of that moment have been laid out clearly. When it comes to any kind of public censure, or trial, or even an official statement of wrongdoing, none was ever forthcoming.

Nor was there any accountability for the policy and practice of torture, “legally” sanctioned then, that took the country back to practices more common in the Middle Ages. (It’s worth noting as well that John Yoo, who wrote the memos authorizing such torture then, is now helping the Trump administration find ways to continue evading checks on the presidency.)

More than a decade ago at TomDispatch, I wrote about how the Bush administration supported such acts at the highest levels. As a result, in the early years of the war on terror, in 20 CIA “black sites,” located in eight countries, the U.S. government used torture, as a Senate Select Intelligence Committee Report of December 2014 would detail, to elicit information and misinformation from dozens of “high-value detainees.”

It should go without saying that torture violates just about every precept of the modern rule of law: the renunciation of adjudication in favor of brutality, the use of dungeon-like chambers and medieval equipment rather than the expertise of intelligence professionals gathering information, and of course the rejection of any conviction that civility and rights are valuable.

Among his first acts on entering the Oval Office, Barack Obama pledged that the United States under his leadership would “not torture.” Nonetheless, the lawyers who wrote the memos legally approving those policies were never held accountable, nor were the Bush administration officials who signed off on them (and had such techniques demonstrated to them in the White House); nor, of course, were the actual torturers and the doctors who advised them in any way censured or criminally charged in American courts.

Indeed, many of their careers only advanced as they took jobs like a federal judge, a professor at a prestigious law school, or a well-remunerated author. When suggestions for leveling criminal charges or holding congressional hearings and investigations were raised, the Obama administration decided not to proceed. Attorney General Eric Holder claimed that “the admissible evidence would not be sufficient to obtain and sustain a conviction beyond a reasonable doubt,” while President Obama insisted that the administration should “look forward as opposed to looking backwards.” Accountability was once again abandoned.

And looming over the war on terror, the invasion of Iraq, and those torture policies was a refusal to hold any agency, administration, or anyone at all responsible for failing to stop 9/11 from happening in the first place. The 9/11 Commission Report might have been an initial step in that process, but as journalist Philip Shenon put it in his book The Commission: The Uncensored History of the 9/11 Investigation, the report “skirt[ed] judgements about people who almost certainly had some blame for failing to prevent September 11.”

Evasion elsewhere

It wasn't only in relation to the war on terror that accountability vanished. The government responded to the 2007-2008 banking crisis with a similar determination to avoid it. At that time, the men who ran the nation's largest banks had played upon the greed of investors to leverage mortgage investments to the point where, without government bailouts, their companies would have gone under. In response, both the Bush and Obama administrations bandaged the losses with federal funds. Yet when it came to a classic dive into irresponsible and even illegal financial behavior, they offered stern warnings and nothing else.

Accountability had been similarly elusive for corporate crimes for decades. Take, for instance, the 1989 Exxon Valdez oil spill that covered 1,300 miles of Alaskan coastline with oil, while killing thousands of birds, otters, seals, and whales. Lawsuits brought by that state did result in payments of more than $1 billion after the federal government indicted ExxonMobil for violating the Clean Water Act. However, only the captain of the ship, who many experts felt had been scapegoated, was convicted of a criminal offense.

A separate lawsuit filed on behalf of local fishermen, native Alaskans, and landowners fared less well. In our post-9/11 era of unaccountability, the penalties that had been leveled against the oil company were reconsidered. In 2008, the Supreme Court reduced a $5 billion punitive damages award by 89% to $507.5 million. And in 2017, in the early months of the Trump administration, 26 years of litigation came to an abrupt end when a federal court in Alaska decided not to pursue a final ExxonMobil payment of $100 million for damages from the spill.

As it turns out, (lack of) accountability is increasingly not just a matter of the law but of politics, as the Mueller investigation of Russian interference in the presidential election of 2016 highlighted. No matter how much information Mueller and his team collected demonstrating violations of both law and policy in future president Donald Trump’s dealings with Russia, or how much information a series of career diplomats and national security officials provided on his quid pro quo approach to Ukrainian officials, escaping blame, not to mention impeachment, has proven all too easy for the president.

As Attorney General Barr told the nation, misrepresenting the essence of the Mueller report, the investigation “did not find that the Trump campaign or anyone associated with it conspired or coordinated with Russia in its efforts to influence the 2016 U.S. presidential election.” More accurately, the report concluded that the evidence “does not exonerate” the president.

Subsequently, nine individuals, seven of them members of the Trump team, were found guilty and 13 Russian nationals and three Russian companies were indicted (though charges against two of the companies have been dropped by Barr's Department of Justice). And while five of those convicted went to jail, Donald Trump commuted the sentence of his close associate Roger Stone. Meanwhile, the prosecution of his first National Security Advisor Michael Flynn is still in turmoil after the Department of Justice moved to drop the case and a federal appeals court ordered it dismissed. As the Flynn episode demonstrates, even when individuals were held accountable, the president and his administration have, in essence, refused to accept the judgments of the courts.

In other words, the mechanisms for shining a light on government wrongdoing are being systematically undermined and abolished. In that spirit, in April and May at the behest of the president, numerous inspectors general, tasked by law with investigating and reporting on wrongdoing in their agencies, were fired, including those for the State Department and the Intelligence Community, as well as the acting inspectors general for the departments of Defense, Health and Human Services, and Transportation.

In the age of Trump, we’re reaching the end of the line when it comes to accountability in the halls of government. Increasingly, it seems, accountability is no longer an American concept.

Once upon a time It hasn’t always been this way. In the past, when government policy or the officials making it went rogue, broke the law, and conspired against the basic tenets of American democracy, they sometimes paid the price. Nearly a century ago, for instance, President Warren Harding’s Secretary of the Interior Albert Fall went to prison for accepting bribes from oil companies in the Teapot Dome Scandal. In fact, the list of former government officials who have been convicted and served time in jail is long.

Fifty years later, in the Watergate scandal of Richard Nixon’s presidency, 69 individuals, including several top government officials, were indicted and 48 of them found guilty of crimes that included the burglary and wiretapping of Democratic Party headquarters. The trail of illegality and cover-up went right up to the office of the president, ending in impeachment proceedings that led President Nixon to resign.

During the years of Ronald Reagan’s presidency, misuse of power was punished as well. Fourteen people in or close to his administration were charged for their participation in the Iran-Contra scandal in which the government secretly sold weapons to Iran, an act proscribed by law, with plans to use the funds from those sales to support American-backed Contra rebels in Nicaragua (also in violation of U.S. law). True, of the 14 charged and 11 convicted, only one actually served his sentence in prison. Nonetheless, the convictions stood as a public acknowledgment of governmental wrongdoing.

Perhaps the saddest part of all is that the Trump administration has not just refused to take responsibility for anything whatsoever, but has blamed others, even those on the front lines of pandemic defense, for things that it did. Since Covid-19 struck American shores and the president and his officials failed to respond, resulting in a catastrophically high — and climbing — death toll, accountability has been harnessed to political whims in a new way. The president has, for instance, blamed President Obama, whose pandemic office was dismantled by Trump’s own national security advisor, John Bolton.

Until recently, President Trump refused to wear a mask in public. Despite overwhelming evidence that indoor transmission is the predominant way Covid-19 spreads, he insisted on holding a maskless, undistanced indoor rally in Tulsa, and only belatedly canceled the Republican National Convention planned for Jacksonville, Florida, a city still rife with the pandemic. In doing so, he also encouraged irresponsible behavior at the local level, while supporting governors ready to reopen their state economies far too quickly and so condemn Americans there to an explosion of new cases.

It’s possible that this abdication of leadership, leading to a disastrously rising death rate, will, in the end, help Americans turn the corner from unaccountability to accountability — and not just for the disastrous Covid-19 response. Recent street protests from Portland to Manhattan, Chicago to Kansas City, are a sign that accountability is long overdue, not just for the current era, but for this century of American life.

In March, journalist Peter Bergen was the first person to call for a 9/11-style commission to investigate the government’s response to the coronavirus, “if only to make sure the nation is prepared for the next pandemic.” Recently, Democratic Senator Dianne Feinstein and Democratic Congressman Adam Schiff, both from California, also proposed a Covid-19 Commission “not as a political exercise to cast blame, but to learn from our mistakes so we can prevent the problems we now face from being tragically repeated… An honest analysis is the only way to adequately prepare for the next novel virus or another disaster.”

Of course, no such thing is imaginable until Donald Trump is out of office and the Senate in Democratic hands, which does look possible. In the meantime, in its own deadly fashion, the pandemic crisis may actually help turn the tide and bring accountability back to American shores. If more than 150,000 deaths, untold numbers of them preventable, don’t offer a compelling reason to hold our public officials responsible, then what would?

Whatever the punishments, however symbolic or cosmetic, crimes of this sort need to be exposed for what they are and those who carried them out officially identified and held to account. This has nothing to do with retribution. It is not about exacting punishment. It’s about shining a beam of light on deeds that have been harmful beyond imagination and must never be repeated. We as a nation need to remind ourselves of what morality, justice, and the responsible use of power can mean. The country has to be given a chance to restore its long-faded commitment to accountable government. And perhaps we should acknowledge one more crucial thing: that this may prove to be our last chance.

Karen J. Greenberg writes regularly for TomDispatch (where this article originated). She is the director of the Center on National Security at Fordham Law, the host of the Vital Interests Podcast, an International Security Fellow at New America, and the author of Rogue Justice: The Making of the Security State and editor of Reimagining the National Security State: Liberalism on the Brink. Julia Tedesco helped with research for this article.

Copyright ©2020 Karen J. Greenberg — distributed by Agence Global

—————-

Released: 03 August 2020

Word Count: 2,632

—————-

John Feffer, “How Covid-19 could upend geopolitics”

July 28, 2020 - TomDispatch

I don’t trust you.

Don’t take it personally. It doesn’t matter whether you’re a friend or a stranger. I don’t care about your identity or your politics, where you work or if you work, whether you wear a mask or carry a gun.

I don’t trust you because you are, for the time being, a potential carrier of a deadly virus. You don’t have any symptoms? Maybe you’re an asymptomatic superspreader. Show me your negative test results and I’ll still have my doubts. I have no idea what you’ve been up to between taking the test and receiving the results. And can we really trust that the test is accurate?

Frankly, you shouldn’t trust me for the same reasons. I’m not even sure that I can trust myself. Didn’t I just touch my face at the supermarket after palpating the avocados?

I’m learning to live with this mistrust. I’m keeping my distance from other people. I’m wearing my mask. I’m washing my hands. I’m staying far away from bars.

I’m not sure, however, that society can live with this level of mistrust. Let’s face it: trust makes the world go around. Protests break out when our faith in people or institutions is violated: when we can’t trust the police (#BlackLivesMatter), can’t trust male colleagues (#MeToo), can’t trust the economic system to operate with a modicum of fairness (#OccupyWallStreet), or can’t trust our government to do, well, anything properly (#notmypresident).

Now, throw a silent, hidden killer into this combustible mix of mistrust, anger, and dismay. It’s enough to tear a country apart, to set neighbor against neighbor and governor against governor, to precipitate a civil war between the masked and the unmasked.

Such problems only multiply at the global level where mistrust already permeates the system — military conflicts, trade wars, tussles over migration and corruption. Of course, there’s also been enough trust to keep the global economy going, diplomats negotiating, international organizations functioning, and the planet from spinning out of control. But the pandemic may just tip this known world off its axis.

I’m well aware of the ongoing debate between the “not much will change” and “everything will change” factions. Once a vaccine knocks it out of our system, the coronavirus might not have much lasting effect on our world. Even without a vaccine, people can’t wait to get back to normal life by jumping into pools, heading to the movie theater, attending parties — even in the United States where cases continue to rise dramatically. The flu epidemic of 1918-1919, which is believed to have killed at least 50 million people, didn’t fundamentally change everyday life, aside from giving a boost to both alternative and socialized medicine. That flu passed out of mind and into history and so, of course, might Covid-19.

Or, just as the Black Death in the fourteenth century separated the medieval world from all that followed, this pandemic might draw a thick before-and-after line through our history. Let’s imagine that this novel virus keeps circulating and recirculating, that no one acquires permanent immunity, that it becomes a nasty new addition to the cold season except that it just happens to kill a couple of people out of every hundred who get it. This new normal would certainly be better than if Ebola, with a 50% case fatality rate if untreated, became a perennial risk everywhere. But even with a fatality rate in the low single digits, Covid-19 would necessarily change everything.

The media is full of speculation about what a periodic pandemic future will look like. The end of theater and spectator sports. The institutionalization of distance learning. The death of offices and brick-and-mortar retail.

But let’s take a look beyond that — at the even bigger picture. Let’s consider for a moment the impact of this new, industrial-strength mistrust on international relations.

The future of the nation-state Let’s say you live in a country where the government responded quickly and competently to Covid-19. Let’s say that your government established a reliable testing, contact tracing, and quarantine system. It either closed down the economy for a painful but short period or its system of testing was so good that it didn’t even need to shut everything down. Right now, your life is returning to some semblance of normal.

Lucky you.

The rest of us live in the United States. Or Brazil. Or Russia. Or India. In these countries, the governments have proven incapable of fulfilling the most important function of the state: protecting the lives of their citizens. While most of Europe and much of East Asia have suppressed the pandemic sufficiently to restart their economies, Covid-19 continues to rage out of control in those parts of the world that, not coincidentally, are also headed by democratically elected right-wing autocrats.

In these incompetently run countries, citizens have very good reason to mistrust their governments. In the United States, for instance, the Trump administration botched testing, failed to coordinate lockdowns, removed oversight from the bailouts, and pushed to reopen the economy over the objections of public-health experts. In the latest sign of early-onset dementia for the Trump administration, White House Press Secretary Kayleigh McEnany declared this month that “science should not stand in the way” of reopening schools in the fall.

Voters, of course, could boot Trump out in November and, assuming he actually leaves the White House, restore some measure of sanity to public affairs. But the pandemic is contributing to an already overwhelming erosion of confidence in national institutions. Even before the virus struck, in its 2018 Trust Barometer the public relations firm Edelman registered an unprecedented drop in public trust connected to… what else?… the election of Trump. “The collapse of trust in the U.S. is driven by a staggering lack of faith in government, which fell 14 points to 33% among the general population,” the report noted. “The remaining institutions of business, media, and NGOs also experienced declines of 10 to 20 points.”

And you won’t be surprised to learn that the situation hadn’t shown signs of improvement by 2020, with American citizens even more mistrustful of their country’s institutions than their counterparts in Brazil, Italy, and India.

That institutional loss of faith reflects a longer-term trend. According to Gallup’s latest survey, only 11% of Americans now trust Congress, 23% big business and newspapers, 24% the criminal justice system, 29% the public school system, 36% the medical system, and 38% the presidency. The only institution a significant majority of Americans trust — and consider this an irony, given America’s endless twenty-first-century wars — is the military (73%). The truly scary part is that those numbers have held steady, with minor variations, for the last decade across two very different administrations.

How low does a country’s trust index have to go before it ceases being a country? Commentators have already spent a decade discussing the polarization of the American electorate. Much ink has been spilled over the impact of social media in creating political echo chambers. It’s been 25 years since political scientist Robert Putnam observed that Americans were “bowling alone” (that is, no longer participating in group activities or community affairs in the way previous generations did).

The coronavirus has generally proven a major force multiplier of such trends by making spontaneous meetings of unlike-minded people ever less likely. I suspect I’m typical. I’m giving a wide berth to pedestrians, bicyclists, and other joggers when I go out for my runs. I’m not visiting cafes. I’m not talking to people in line at the supermarket. Sure, I’m on Zoom a lot, but it’s almost always with people I already know and agree with.

Under these circumstances, how will we overcome the enormous gaps of perception now evident in this country to achieve anything like the deeper basic understandings that a nation-state requires? Or will Americans lose faith entirely in elections, newspaper stories, hospitals, and public transportation, and so cease being a citizenry altogether?

Trust is the fuel that makes such institutions run. And it looks as though we passed Peak Trust long ago and may be on a Covid-19 sled heading downhill fast.

Globalization unravels The global economy also runs on trust: in financial transactions, the safety of workplace conditions, the long-distance transport of goods, and the consumer’s expectation that the purchased product will work as advertised.

To cause a breakdown in the global assembly line, Covid-19 didn’t have to introduce doubt into every step in this supply chain (though it would, in the end, do something like that). It only had to sever one link: the workplace. When the Chinese government shut down factories in early 2020 to contain the pandemic — leading to a 17% decline in exports in January and February compared to the previous year — companies around the world suddenly faced critical shortages of auto parts, smartphone components, and other key goods.

The workplace proved a weak link in the global supply chain for another reason: cost. Labor has traditionally been the chief expense in manufacturing, which, from the 1990s on, led corporations to outsource work to cheaper locations like Mexico, China, and Vietnam. Since then, however, the global assembly line has changed and, as the McKinsey consulting firm explains, “over 80% of today’s global goods trade is [no longer] from a low-wage country to a high-wage country.”

Labor’s centrality to the location of manufacturing had been further eroded by the growth of automation, which, according to economists, tends to surge during downturns. As it happens, both artificial intelligence and robotization were already on the rise even before the pandemic hit. By 2030, up to 20 million jobs worldwide will be filled by robots. The World Bank estimates that they will eventually replace an astounding 85% of the jobs in Ethiopia, 77% in China, and 72% in Thailand.

Then there are the environmental costs of that same global assembly line. Moving freight contributes 7% to 8% of global greenhouse gas emissions, with air transport being the most carbon intensive way to go. (Add to that, of course, the carbon footprint of the factories themselves.)

If all that doesn’t change the minds of CEOs about the benefits of globalization, then national security considerations might. The pandemic exposed how vulnerable countries are in terms of key commodities. China produces more respirators, surgical masks, and protective garments than the rest of the world combined, so when Covid-19 first hit, countries began to panic: they no longer had sufficient national capacity to produce the basic tools needed to address the spreading pandemic themselves. The same applied to essential drugs. The United States stopped producing penicillin, for instance, in 2004.

The threat of infection, the spread of automation, the environmental impact, the risk of foreign control: the global assembly line just doesn’t seem to make much sense any more. Why not relocate manufacturing back home to a “dark factory” that’s fully automated, doesn’t need lights, heating, or air conditioning, and is practically pandemic-proof?

The current pandemic won’t spell the end of globalization, of course. Corporations, as the McKinsey report points out, will still find compelling reasons to relocate manufacturing and services overseas, including “access to skilled labor or natural resources, proximity to consumers, and the quality of infrastructure.” Consumers will still want pineapples in winter and cheap smart phones. But capitalists eyeing the bottom line, in combination with Trump-style nationalists insisting that capital return home, will increasingly disassemble what we all took for granted as globalization.

The world economy won’t simply disappear. After all, agriculture has persisted in the modern era. It just employs an ever-diminishing segment of the workforce. The same will likely happen to global trade in a pandemic age. In the early part of the last century, surplus labor no longer needed on the farms migrated to the cities to work in factories. The question now is: What will happen to all those workers no longer needed in the global assembly line?

Neither the international community nor the free market has a ready answer, but authoritarian populists do: stop all those displaced workers from migrating.

Wall world From the moment he descended that Trump Tower escalator into the presidential race, Donald Trump has made sealing off the U.S. border with Mexico his signature policy position. That “big, fat, beautiful wall” of his may be simplistic, anti-immigrant, xenophobic, and mistrustful of the world — and may never really be completed — but unfortunately, he’s been anything but alone in his obsession with walls.

Israel pioneered modern wall building in the mid-1990s by sealing off Palestinians in the Gaza Strip, followed by a 440-mile-long barrier to wall off the West Bank. In 2015, responding to a wave of migrants escaping wars and poverty in North Africa and the Middle East, Hungary built new bulwarks along its southern borders to keep out the desperate. Bulgaria, Greece, Slovenia, and Croatia have done the same. India has fenced off the Kashmir region from Pakistan. Saudi Arabia has constructed a 600-mile barrier along its border with Iraq.

In 1989, there were about a dozen major walls separating countries, including the soon-to-fall Berlin Wall. Today, that number has grown to 70.

In this context, the novel coronavirus proved a godsend to nationalists the world over who believe that if good fences make good neighbors, a great wall is best of all. More than 135 countries added new restrictions at their borders after the outbreak. Europe reestablished its internal Schengen area borders for the first time in 25 years and closed its external ones as well. Some countries — Japan and New Zealand, in particular — practically walled themselves off.

Even as the pandemic fades in certain parts of the world, many of those new border restrictions remain in place. If you want to travel to Europe this summer, you can only do so if you’re from one of a dozen countries on a European Union-approved list (and that doesn’t include Americans). New Zealand has had only a handful of cases over the last few months (with a high of four new cases on June 27th), but its borders remain closed to virtually everyone. Even a “travel bubble” with nearby Australia is off the table for now. Japan has banned entry to people from 129 countries, including the United States, but there’s an exemption for U.S. soldiers traveling to American military bases. A recent outbreak of coronavirus at such garrisons on the island of Okinawa may well prompt Tokyo to tighten its already strict rules further.

And such border restrictions are potentially just the beginning. So far, the pandemic has unleashed an everyone-for-themselves spirit — from export restrictions on essential goods to a feverish competition to develop a vaccine first. The United Nations has made various pleas for greater international cooperation, its secretary general even urging a “global ceasefire” among warring parties. The World Health Organization (WHO) attempted to organize a global response to the virus at its annual meeting. However, the Trump administration promptly announced that it would be pulling out of the WHO, very few combatants observed a Covid-19 ceasefire, and there is no coordinated international response to the pandemic outside of the community of scientists sharing research.

So, is this to be the future: each country transformed into a gated community? How long can a sense of internationalism survive in Wall World?

Rebuilding trust Conservatives used to make fun of the left for its penchant for relativism, for arguing that everything depends on context. “If you ask me what the biggest problem in America is, I’m not going to tell you debt, deficits, statistics, economics,” former Republican House Speaker Paul Ryan said in 2011, “I’ll tell you it’s moral relativism.” Once upon a time, the right wing railed against deconstructionists who emphasized interpretation over facts.

What, then, to make of the Republican Party today? So many of its leaders, including the president, don’t believe in the science behind either climate change or Covid-19. Many of them embrace the most lunatic conspiracy theories and some current congressional candidates even believe, by way of the far-right conspiracy theory QAnon, that a cabal of satanic child molesters in Hollywood, the Democratic Party, and various international organizations controls the world. In July, Donald Trump achieved the dubious milestone of telling more than 20,000 lies during his tenure as president. In other words, speaking of relativism, the Republican Party has put its trust in a man untethered from reality.

And then along came that pandemic like lighter fluid to a brushfire. The resulting conflagration of mistrust threatens to spread out of control until nothing is left, not the nation-state, not the global economy, not the international community.

In this pandemic era, a fire somewhere is a fire everywhere, for the virus cares nothing about borders. But restoring trust must begin where the trust deficit has grown largest, and that is certainly the United States. Not only have Americans lost faith in their own institutions, so, it seems, has everyone else. Since 2016, there has been a 50% drop in the world’s trust in the United States, the largest decline ever recorded in U.S. News and World Report’s Best Countries survey.

And the reason the United States has the worst record dealing with the coronavirus is quite simple: Donald Trump. He is the leader of an ever-diminishing proportion of the public that continues to believe the coronavirus is a hoax or refuses to comply with basic precautions to prevent its spread. A scofflaw president who refuses to mandate the use of facemasks (even after officially donning one for his Twitter feed) inspires a scofflaw minority that puts the majority at risk.

Restoring trust in this country’s public health system and governance must begin with a competent system of testing, contact tracing, and quarantine. Yet the Trump administration still refuses to take this necessary step. Senate Republicans have pushed for $25 billion to help establish testing and tracing systems at the state level, but the president actually wants to eliminate even this modest amount from the budget (along with additional funds for government agencies tasked with addressing the pandemic).

Americans increasingly mistrust their institutions because growing numbers of us believe that we derive ever fewer benefits from them. The Trump administration has typically done its best to make matters disastrously worse, only recently, amid the pandemic and with millions unemployed, demanding that the Supreme Court gut the health insurance provided by the Obama administration’s Affordable Care Act. The bulk of the stimulus funds passed by Congress went to wealthy individuals and corporations — and the president’s men didn’t even exercise due diligence to prevent nearly $1.4 billion in stimulus checks from being mailed to dead people.

The next administration (assuming there is one) will have a massive clean-up job restoring faith of any sort in such an unequal, broken system. After addressing the acute crisis of the pandemic, it will have to demonstrate that the rule of law is again functioning. The most dramatic proof would, of course, be to throw the book at Donald Trump and his closest enablers. They have violated so many laws that trust in the legal system will be further weakened unless they’re tried and punished for their crimes, including their willingness to sacrifice American lives in staggering numbers in pursuit of The Donald’s reelection.

In 1996, Bill Clinton spoke of building a bridge to the twenty-first century. Two decades into this century, Donald Trump has effectively torn down that bridge and replaced it with a (still largely unbuilt) wall reminiscent of the fortifications of the Middle Ages. Covid-19 has only reinforced the insular paranoia of this president and his followers. The path back to trust, at both a domestic and international level, will be difficult. There will be monsters to battle along the way. But in the end, it’s possible for us to take this country back, create a just and sustainable global economy, and rebuild the international community.

You and I can do this. Together.

Trust me.

John Feffer writes regularly for TomDispatch (where this article originated). He is the author of the dystopian novel Splinterlands and the director of Foreign Policy In Focus at the Institute for Policy Studies. His latest novel is Frostlands, a Dispatch Books original and book two of his Splinterlands series.

Copyright ©2020 John Feffer — distributed by Agence Global

—————-

Released: 28 July 2020

Word Count: 3,312

—————-

Rebecca Gordon, “Why does essential work pay so little… and cost so much?”

July 23, 2020 - TomDispatch

In two weeks, my partner and I were supposed to leave San Francisco for Reno, Nevada, where we’d be spending the next three months focused on the 2020 presidential election. As we did in 2018, we’d be working with UNITE-HERE, the hospitality industry union, only this time on the campaign to drive Donald Trump from office.

Now, however, we’re not so sure we ought to go. According to information prepared for the White House Coronavirus Task Force, Nevada is among the states in the “red zone” when it comes to both confirmed cases of, and positive tests for, Covid-19. I’m 68. My partner’s five years older, with a history of pneumonia. We’re both active and fit (when I’m not tripping over curbs), but our ages make us more likely, if we catch the coronavirus, to get seriously ill or even die. That gives a person pause.

Then there’s the fact that Joe Biden seems to have a double-digit lead over Trump nationally and at least an eight-point lead in Nevada, according to the latest polls. If things looked closer, I would cheerfully take some serious risks to dislodge that man in the White House. But does it make sense to do so if Biden is already likely to win there? Or, to put it in coronavirus-speak, would our work be essential to dumping Trump?

Essential work? This minor personal conundrum got me thinking about how the pandemic has exposed certain deep and unexamined assumptions about the nature and value of work in the United States.

In the ethics classes I teach undergraduates at a college here in San Francisco, we often talk about work. Ethics is, after all, about how we ought to live our lives — and work, paid or unpaid, constitutes a big part of most of those lives. Inevitably, the conversation comes around to compensation: How much do people deserve for different kinds of work? Students tend to measure fair compensation on two scales. How many years of training and/or dollars of tuition did a worker have to invest to become “qualified” for the job? And how important is that worker’s labor to the rest of society?

Even before the coronavirus hit, students would often settle on medical doctors as belonging at the top of either scale. Physicians’ work is the most important, they’d argue, because they keep us alive. “Hmm…” I’d say. “How many of you went to the doctor today?” Usually not a hand would be raised. “How many of you ate something today?” All hands would go up, as students looked around the room at one another. “Maybe,” I’d suggest, “a functioning society depends more on the farmworkers who plant and harvest food than on the doctors you normally might see for a checkup once a year. Not to mention the people who process and pack what we eat.”

I’d also point out that the workers who pick or process our food are not really unskilled. Their work, like a surgeon’s, depends on deft, quick hand movements, honed through years of practice.

Sometimes, in these discussions, I’d propose a different metric for compensation: maybe we should reserve the highest pay for people whose jobs are both essential and dangerous. Before the pandemic, that category would not have included many healthcare workers and certainly not most doctors. Even then, however, it would have encompassed farmworkers and people laboring in meat processing plants. As we’ve seen, in these months it is precisely such people — often immigrants, documented or otherwise — who have also borne some of the worst risks of virus exposure at work.

By the end of April, when it was already clear that meatpacking plants were major sites of Covid-19 infection, the president invoked the Defense Production Act to keep them open anyway. This not only meant that workers afraid to enter them could not file for unemployment payments, but that even if the owners of such dangerous workplaces wanted to shut them down, they were forbidden to do so. By mid-June, more than 24,000 meatpackers had tested positive for the virus. And just how much do these essential and deeply endangered workers earn? According to the U.S. Bureau of Labor Statistics, about $28,450 a year — better than minimum wage, that is, but hardly living high on the hog (even when that’s what they’re handling).

You might think that farmworkers would be more protected from the virus than meatpackers, perhaps because they work outdoors. But as the New York Times has reported: “Fruit and vegetable pickers toil close to each other in fields, ride buses shoulder-to-shoulder, and sleep in cramped apartments or trailers with other laborers or several generations of their families.”

Not surprisingly, then, the coronavirus has, as the Times report puts it, “ravaged” migrant farm worker communities in Florida and is starting to do the same across the country all the way to eastern Oregon. Those workers, who risk their lives through exposure not only to a pandemic but to more ordinary dangers like herbicides and pesticides so we can eat, make even less than meatpackers: on average, under $26,000 a year.

When the president uses the Defense Production Act to ensure that food workers remain in their jobs, it reveals just how important their labor truly is to the rest of us. Similarly, as shutdown orders have kept home those who can afford to stay in, or who have no choice because they no longer have jobs to go to, the pandemic has revealed the crucial nature of the labor of a large group of workers already at home (or in other people’s homes or eldercare facilities): those who care for children and those who look after older people and people with disabilities who need the assistance of health aides.

This work, historically done by women, has generally been unpaid when the worker is a family member and poorly paid when done by a professional. Childcare workers, for example, earn less than $24,000 a year on average; home healthcare aides, just over that amount.

Women’s work Speaking of women’s work, I suspect that the coronavirus and the attendant economic crisis are likely to affect women’s lives in ways that will last at least a generation, if not beyond.

Middle-class feminists of the 1970s came of age in a United States where it was expected that they would marry and spend their days caring for a house, a husband, and their children. Men were the makers. Women were the “homemakers.” Their work was considered — even by Marxist economists — “non-productive,” because it did not seem to contribute to the real economy, the place where myriad widgets are produced, transported, and sold. It was seldom recognized how essential this unpaid labor in the realm of social reproduction was to a functioning economy. Without it, paid workers would not have been fed, cared for, and emotionally repaired so that they could return to another day of widget-making. Future workers would not be socialized for a life of production or reproduction, as their gender dictated.

Today, with so many women in the paid workforce, much of this work of social reproduction has been outsourced by those who can afford it to nannies, day-care workers, healthcare aides, house cleaners, or the workers who measure and pack the ingredients for meal kits to be prepared by other working women when they get home.

We didn’t know it at the time, but the post-World War II period, when boomers like me grew up, was unique in U.S. history. For a brief quarter-century, even working-class families could aspire to an arrangement in which men went to work and women kept house. A combination of strong unions, a post-war economic boom, and a so-called breadwinner minimum wage kept salaries high enough to support families with only one adult in the paid labor force. Returning soldiers went to college and bought houses through the 1944 Servicemen’s Readjustment Act, also known as the G.I. Bill. New Deal programs like social security and unemployment insurance helped pad out home economies.

By the mid-1970s, however, this golden age for men, if not women, was fading. (Of course, for many African Americans and other marginalized groups, it had always only been an age of fool’s gold.) Real wages stagnated and began their long, steady decline. Today’s federal minimum wage, at $7.25 per hour, has remained unchanged since 2009 (something that can hardly be said about the wealth of the 1%). Far from supporting a family of four, in most parts of the country, it won’t even keep a single person afloat.

Elected president in 1980, Ronald Reagan announced in his first inaugural address, “Government is not the solution to our problem, government is the problem.” He then set about dismantling President Lyndon Johnson’s War on Poverty programs, attacking the unions that had been the underpinning for white working-class prosperity, and generally starving the beast of government. We’re still living with the legacies of that credo in, for example, the housing crisis he first touched off by deregulating savings and loan institutions and disempowering the Department of Housing and Urban Development.

It’s no accident that, just as real wages were falling, presidential administrations of both parties began touting the virtues of paid work for women — at least if those women had children and no husband. Aid to Families with Dependent Children (“welfare”) was another New Deal program, originally designed to provide cash assistance to widowed women raising kids on their own at a time when little paid employment was available to white women.

In the 1960s, groups like the National Welfare Rights Organization began advocating that similar benefits be extended to Black women raising children. (As a welfare rights advocate once asked me, “Why is it fine for a woman to look to a man to help her children, but not to The Man?”) Not surprisingly, it wasn’t until Black and Latina women began receiving the same entitlements as their white sisters that welfare became a “problem” in need of “reform.”

By the mid-1990s, the fact that some Black women were receiving money from the government while not doing paid labor for an employer had been successfully reframed as a national crisis. Under Democratic President Bill Clinton, Congress passed the Personal Responsibility and Work Opportunity Reconciliation Act of 1996, a bill then known as “welfare reform.” After that, if women wanted help from The Man, they had to work for it — not by taking care of their own children, but by taking care of their children and holding down minimum-wage jobs.

Are the kids all right? It’s more than a little ironic, then, that the granddaughters of feminists who argued that women should have a choice about whether or not to pursue a career came to confront an economy in which women, at least ones not from wealthy families, had little choice about working for pay.

The pandemic may change that, however — and not in a good way. One of the unfulfilled demands of liberal 1970s feminism was universal free childcare. An impossible dream, right? How could any country afford such a thing?

Wait a minute, though. What about Sweden? They have universal free childcare. That’s why a Swedish friend of mine, a human rights lawyer, and her American husband, who had a rare tenure-track university job in San Francisco, chose to take their two children back to Sweden. Raising children is so much easier there. In the early days of second-wave feminism, some big employers even built daycare centers for their employees with children. Those days, sadly, are long gone.

Now, in the Covid-19 moment, employers are beginning to recognize the non-pandemic benefits of having employees work at home. (Why not make workers provide their own office furniture? It’s a lot easier to justify if they’re working at home. And why pay rent on all that real estate when so many fewer people are in the office?) While companies will profit from reduced infrastructure costs and in some cases possibly even reduced pay for employees who relocate to cheaper areas, workers with children are going to face a dilemma. With no childcare available in the foreseeable future and school re-openings dicey propositions (no matter what the president threatens), someone is going to have to watch the kids. Someone — probably in the case of heterosexual couples, the person who is already earning less — is going to be under pressure to reduce or give up paid labor to do the age-old unpaid (but essential) work of raising the next generation. I wonder who that someone is going to be and, without those paychecks, I also wonder how much families are going to suffer economically in increasingly tough times.

Grateful to have a job? Recently, in yet another Zoom meeting, a fellow university instructor (who’d just been interrupted to help a child find a crucial toy) was discussing the administration’s efforts to squeeze concessions out of faculty and staff. I was startled to hear her add, “Of course, I’m grateful they gave me the job.” This got me thinking about jobs and gratitude — and which direction thankfulness ought to flow. It seems to me that the pandemic and the epidemic of unemployment following in its wake have reinforced a common but false belief shared by many workers: the idea that we should be grateful to our employers for giving us jobs.

We’re so often told that corporations and the great men behind them are Job Creators. From the fountain of their beneficence flows the dignity of work and all the benefits a job confers. Indeed, as this fairy tale goes, businesses don’t primarily produce widgets or apps or even returns for shareholders. Their real product is jobs. Like many of capitalism’s lies, the idea that workers should thank their employers reverses the real story: without workers, there would be no apps, no widgets, no shareholder returns. It’s our effort, our skill, our diligence that gives work its dignity. It may be an old saying, but no less true for that: labor creates all wealth. Wealth does not create anything — neither widgets, nor jobs.

I’m grateful to the universe that I have work that allows me to talk with young people about their deepest values at a moment in their lives when they’re just figuring out what they value, but I am not grateful to my university employer for my underpaid, undervalued job. The gratitude should run in the other direction. Without faculty, staff, and students there would be no university. It’s our labor that creates wealth, in this case a (minor) wealth of knowledge.

As of July 16th, in the midst of the Covid-19 crisis, 32 million Americans are receiving some kind of unemployment benefit. That number doesn’t even reflect the people involuntarily working reduced hours, or those who haven’t been able to apply for benefits. One thing is easy enough to predict: employers will take advantage of people’s desperate need for money to demand ever more labor for ever less pay. Until an effective vaccine for the coronavirus becomes available, expect to see the emergence of a three-tier system of worker immiseration: low-paid essential workers who must leave home to do their jobs, putting themselves in significant danger in the process, while we all depend on them for sustenance; better paid people who toil at home, but whose employers will expect their hours of availability to expand to fill the waking day; and low-paid or unpaid domestic laborers, most of them women, who keep everyone else fed, clothed, and comforted.

Even when the pandemic finally ends, there’s a danger that some modified version of this new system of labor exploitation might prove too profitable for employers to abandon. On the other hand, hitting the national pause button, however painfully, could give the rest of us a chance to rethink a lot of things, including the place of work, paid and unpaid, in our lives.

So, will my partner and I head for Reno in a couple of weeks? Certainly, the job of ousting Donald Trump is essential. I’m just not sure that a couple of old white ladies are essential workers in the time of Covid-19.

Rebecca Gordon writes regularly for TomDispatch (where this article originated). She teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new book on the history of torture in the United States.

Copyright ©2020 Rebecca Gordon — distributed by Agence Global

—————-

Released: 23 July 2020

Word Count: 2,687

—————-
