Agence Global

Todd Miller, “Bikes not walls: demilitarizing the border”

May 17, 2021 - TomDispatch

From the mountaintops of southern Arizona, you can see a world without borders. I realized this just before I met Juan Carlos. I was about 20 miles from the border but well within the militarized zone that abuts it. I was, in fact, atop the Baboquivari mountain range, a place sacred to the Tohono O’odham, the Native American people who have inhabited this land for thousands of years. At that moment, however, I couldn’t see a single Border Patrol agent or any sign of what, in these years, I’ve come to call the border-industrial complex. On the horizon were just sky and clouds — and mountain ranges like so many distant waves. I couldn’t tell where the United States ended or Mexico began, and it didn’t matter.

I was reminded of astronaut Edgar Mitchell’s reaction when he gazed back at the Earth from the moon: “It was [a] beautiful, harmonious, peaceful-looking planet, blue with white clouds, and one that gave you a deep sense… of home, of being, of identity. It is what I prefer to call instant global consciousness.”

A couple of hours after my own peaceful moment of global consciousness, Juan Carlos appeared at the side of a dirt road. I was by then driving in a desolate stretch of desert and he was waving his arms in distress. I halted the car and lowered the window. “Do you want some water?” I asked in Spanish, holding out a bottle, which he promptly chugged down.

“Is there anything else I can do for you?” I asked.

“Can you give me a ride to the next town?”

At that moment, my vision of a borderless world evaporated. Even though I couldn’t see them, I could feel the proximity of armed border agents in their green-striped trucks. Perhaps one of the high-tech surveillance towers in the area already had us in its scope. Maybe I had tripped a motion sensor and a Predator B drone was flying over the car. Unfortunately, I knew far too much about one of the most surveilled borders on this planet and how it’s designed to create a potentially deadly crisis for people like Juan Carlos who cross it.

Although this particular incident happened a couple years ago, the U.S. border strategy still regularly forces such migrants into the deep and dangerous desert, as has been true for the last quarter-century.

The reason I so palpably felt the surveillance system all around me was because I knew that I was risking a prison sentence if I gave a ride to Juan Carlos, who told me he was from Guatemala. So, I hesitated. The natural impulse to help a fellow human being was almost instantly overridden by a law making it a felony to transport him and in any way further his presence in this country.

My hesitation both infuriated me and reminded me of how borders can be internalized. I had to think about what the Border Patrol would notice if they pulled me over, particularly that Juan Carlos only spoke Spanish and that he had brown skin. They would assume he was undocumented. Such racial profiling is encoded in the border-security paradigm.

In the end, I wrote a whole book, Build Bridges, Not Walls: A Journey to a World Without Borders, as a kind of meditation on that moment of hesitation and how it acted like a prism through which I could reflect on my two decades of border reporting. But the book is also a reckoning with the border itself, based on conversations I had with refugees, migrants like Juan Carlos, Border Patrol agents seeking out those like him, border-industrial complex officials making money off such voyagers, journalists and scholars covering the never-ending “crisis” there, indigenous people watching their lands being walled off, and those among them who have visions of how all of this can work differently.

One of the most important conversations of all came with someone who will inherit this wall-plagued world of ours, my five-year-old son, William. One day, on a beach south of San Diego, a Border Patrol agent yelled at him as he ran toward the tall, steel-barred wall there at the border to greet people waving from the other side, in Tijuana, Mexico. I remember him sitting in the sand, trying to grasp why that agent wouldn’t let him go to the fence and be friendly. Later, when we talked over the incident, he asked me: “Why can’t we turn the wall into bikes?”

A good question and, with Donald Trump and his talk of a “big, fat, beautiful wall” gone, there’s been lots of news coverage about Biden-era immigration reform, about “fixing a broken system.” While my expectations are low, there also couldn’t be a better moment to begin to demilitarize our border and turn it into something else. As my son suggested, another world, a world of bikes, not walls — both more humane and more sustainable — is not only possible, but essential to pursue.

“An element of absurdity”

It was a hot day in 2008 when the Border Patrol dispatcher radioed agent Brendan Lenihan, telling him that a motion sensor had been tripped in a rugged mountain range about 30 miles from where I met Juan Carlos. Thousands of such sensors have been implanted along the U.S.-Mexico border, even miles inland, where Lenihan slowly drove to an empty mine shaft at the top of a mountain. There, as he got out of his truck, a man appeared, waving his arms in distress and talking rapidly in Spanish.

As you consider Brendan’s story, which he told me long after, keep in mind that our closed but porous borders are also an enormously elaborate system of death-by-design. Yet even mentioning the concept of “open borders” usually brings, at best, polite rejection and often instant ridicule. From the more courteous side, a common argument is that open borders would be a threat to this country’s stability. Yet Brendan’s story not only illustrates the border’s violence and — his word — “absurdity,” but also the way in which borders actually maintain instability in a world of immense inequality.

That day, when Lenihan pulled his assault rifle over his shoulder and followed Rogelio — a name he would learn later — he had no idea what he was heading into. He was a new agent, taken on during a post-9/11 hiring surge when the border suddenly became a “counterterrorism” priority mission and the fiscal faucets opened wide for U.S. Customs and Border Protection under the newly minted Department of Homeland Security. Never had there been more Border Patrol agents.

Descending the ravine, he came across a scene that would only become more common in an age of increasing border “enforcement.” An older man, Miguel, was gently rocking a younger one, his cousin and Rogelio’s brother, Roberto, like a baby. Roberto’s eyes, when open, were rolled back and white. The situation was clearly dire. Brendan radioed for help, but a helicopter couldn’t land in the ravine.

By clasping their arms, Rogelio and he formed a human stretcher. Roberto started to vomit. Black bile oozed from one corner of his mouth. As they climbed up that ravine under a burning sun, the strain and sweat made their hands slip and Brendan became ever more aware that Rogelio’s callused hands were locked in his. It was, he would later tell me, “strangely intimate,” holding hands with someone he would normally arrest. Then he simply forgot who he was. The border disappeared. With it went his uniform, his badge, and his gun. Looking down at Roberto, he saw only a young man in his arms and, for a long moment, felt as if he were carrying his own brother.

Suddenly, his radio crackled and he came back to his senses. He was still a Border Patrol agent. The border itself had never gone away. At that very moment, it was, in fact, killing Roberto. Now, however, Brendan found himself moving with a new sense of empathy. To experience this was little short of miraculous, given his intense Border Patrol training, given that the border, by its very nature, is anti-empathetic.

As it happened, the Border Patrol EMT unit was unable to revive Roberto. At a bar later that night, seeing that he was upset, Brendan’s fellow agents assured him that such a tragedy was just part of the “border game.” And callous as it may have sounded, it was true. The border, by its very nature, by its strategy, by the way the border-industrial complex had developed it, was indeed death by design and most of them had already experienced that all too vividly.

The next day, Brendan’s supervisor called him. Don’t worry, he said, they were nothing but “drug mules.” When Brendan relayed this to me, he looked exasperated and added, “What did I care?”

“It didn’t matter to you that they were allegedly smuggling drugs?” I asked.

“To me,” he said, “it doesn’t make a difference. They just seemed like regular guys. And who knows what kind of job I would have had if I grew up with them in their situation in life. It could have been me. I could have been one of them.”

Shortly after that call from his supervisor, he noticed the scent of marijuana coming through his apartment window. “And now,” he added, “my neighbor is smoking the very thing I’m trying to stop. There’s an element of absurdity to it all.”

Yes, indeed, when it comes to the border and its many “crises,” the absurdity runs deep. Take those claims about immigrants and drugs. In reality, more than 80% of all illicit drugs making it into the United States arrive through regular ports of entry, not the vastness of the desert. Along the same lines, the usual claims that immigrants are likely to be criminals or prone to crime are simply untrue, as study after study after study has shown.

And by the way, other studies clearly indicate that, far from depressing the economy, higher rates of immigration bolster it. An analysis from the investigative news site ProPublica, for example, indicates that, for every 1% increase in immigration, there was a simultaneous 1.15% increase in the gross domestic product. In other words, if President Trump actually wanted to achieve the 4% economic growth he swore, in 2016, that his presidency would bring, the one surefire way to do so, as ProPublica’s Lena Groeger suggested, would have been to stop building that wall of his and let eight million immigrants into the country.

No less important, as Brendan Lenihan’s experience implicitly suggested, this country’s ever more fortified borders have little to do with global stability. In fact, they play a key role in maintaining the instability of a world in which 2,153 billionaires (many of them American) have more wealth than the poorest 4.6 billion people on this planet. We’re talking, of course, about a place where forecasts of climate displacement suggest that, by 2050, as many as one billion people could be desperately on the move.

Borders, at least as presently imagined, are an impediment to a sustainable world based on empathy and equality.

Shifting shapes

Soon enough, my son’s mind would turn from bikes to other possibilities. Why, he wondered, couldn’t the wall be made into houses or rails for trains, anything more useful for us human beings and the health of the planet (one of his growing concerns)? When, like him, I begin to imagine shifting the shapes of things in our world, I often think of budgets. From 2003 to 2021, the federal government spent $332.7 billion on U.S. Customs and Border Protection and U.S. Immigration and Customs Enforcement — on, that is, our designated border and immigration control agencies. Those sums would translate into nearly 700 miles of walls — built not just by our last president, but over multiple administrations — as well as more than 20,000 armed agents, billions in high-tech border surveillance technology, and at least 200 detention centers.

When it came to actual human security and wellbeing, however, that money was distinctly ill-spent. As Flint, Michigan, has shown, for instance, contaminated water is a tangible and major threat to human health. Imagine if some of that border-fortification money had been directed not to ludicrous walls on our southern border, but to producing cleaner, safer water or better health care. Wouldn’t that have brought stability in a way another mile of border wall or the latest surveillance tower never does? Imagine, for instance, a world in which such money was used not to purchase medium-sized drones with facial recognition capabilities, but to help alleviate the crisis in (un)affordable housing.

And mind you, 1,000 border walls won’t stop climate change, the “biggest threat that modern humans have ever faced,” as British naturalist David Attenborough told the U.N. Security Council. Imagine the carbon that might sooner or later be gone from this world if our 21,000 Border Patrol agents planted one tree every day for years to come. Turning such agents into gardeners and foresters might sound silly, but it might prove crucial for future generations. Maybe demilitarizing the border and turning it into a lush garden would bolster human security more than any wall, guard, or gun.

Facing the displacement crisis

I never had a chance to ask Juan Carlos how or why he had found himself lost and desperate by that desert road. Still, I did know that he wasn’t part of a “border crisis” but, as Harsha Walia puts it, a “displacement crisis.” As she writes, “Migrants and refugees do not just appear at our borders. They are produced by systemic forces.”

Looking back, I have no doubt his request at that moment was also part of that very displacement crisis and U.S. policy had played a significant role in producing it. I mean, how else can you think of his country, Guatemala, where the CIA instigated a coup in the name of the United Fruit Company in 1954 and our government trained homicidal generals responsible for atrocity after atrocity in the 1980s? There’s a whole forgotten history of what this country helped create in Central America, as historian Aviva Chomsky has made all too clear, one that’s intrinsically tied to today’s ongoing immigration disaster.

Any future policy guaranteeing freedom of movement across borders would have to stand as a twin pillar alongside another fundamental right: the right to stay home and live a dignified life. A fortified border falls, in other words, with the creation of a more humane world.

Perhaps Juan Carlos had been a farmer whose harvest never came in thanks to the increasing Central American droughts associated with a warming globe. I know my country was far more responsible than his for the greenhouse gas emissions now in the biosphere creating that overheated world. Or he could have been displaced by the transnational influx of extractive industries in his country intent on taking its natural wealth, part of a long legacy of dispossession by foreign companies in what still passes for a free-market economy. Or maybe his trip north was thanks to persecution from military and police units (many U.S.-trained) or organized crime and gangs, or both at the same time. I had no way of knowing.

What I did know was that there were no border patrols trying to stop the mining companies, the military-security assistance crews, the economic dispossessors, or the greenhouse gas emitters. The border patrols were reserved for the displaced, not those responsible for their displacement — those, that is, who really live in a world of open borders.

And so, as I sat there, infuriated by my own fear, my hesitation about giving Juan Carlos a ride, I realized — as had Brendan many years before — that I was the one who actually needed help. I was the one who needed Juan Carlos to orient me when it came to what a more humane world might be like. I was the one whose spirit was thirsty and needed a drink. I was the one who needed to imagine a world in which such human-made, fortified, militarized borders melted away amid a new global consciousness and solidarity.

So, I looked at Juan Carlos, who needed that lift to the nearest town, and knew that, to get to such a world of solidarity and global consciousness, it would be necessary to break the law. And though I never saw him again after that morning, somehow he remains with me to this day.

Todd Miller writes regularly for TomDispatch (where this article originated). He has written on border and immigration issues for the New York Times, Al Jazeera America, and the NACLA Report on the Americas. His latest book is Build Bridges, Not Walls: A Journey to a World Without Borders. You can follow him on Twitter @memomiller and view more of his work at toddmillerwriter.com.

Copyright ©2021 Todd Miller — distributed by Agence Global

—————-

Released: 17 May 2021

Word Count: 2,736

—————-

Liz Theoharis, “Mother’s Day tears”

May 13, 2021 - TomDispatch

One hundred and fifty years ago, in the bloody wake of the Civil War, the abolitionist Julia Ward Howe issued a “Mother’s Day Proclamation.” The world, she wrote, could no longer bear such terrible violence and death. She called on women across the country to “rise up through the ashes and devastation” and come together in the cause of peace. Some forty years later, Anna Jarvis, carrying on the activism of her own mother, created Mother’s Day.

In the midst of another national trauma, with the latest Mother’s Day just past, perhaps it’s an auspicious moment to celebrate not just mothers, but women more generally. I think about countless women like my mom (who died nearly a year ago) enduring tremendous adversity to make ends meet and care for those they love. During the pandemic, after all, women have found themselves on the front lines in so many ways. They make up more than 75% of healthcare workers, almost 80% of frontline social workers, and more than 70% of government and community-based service workers. Add in one more thing: women have been hit first and worst by the economic crisis that Covid-19 set off, as female-dominated industries like retail, leisure, and hospitality were decimated.

The situation continues to be so dire for women that economists have even begun to talk about a “shecession.” A recent poll found that a quarter of women claimed they were financially worse off a year into the pandemic. In March, the percentage of women out of, or looking for, work was the highest it’s been since December 1988. For the first time in American history, job and income losses in an economic crisis have been worse for women than for men. And it’s been poorer women and women of color who have been hit hardest of all.

But the true depth of this crisis can’t be measured by job numbers and frontline risks alone. In an intensified yet eerily familiar way, this past year-plus has laid bare the pressures, burdens, and violence that women, especially poor women and women of color, face every day. It’s highlighted the disproportionate, unpaid labor they shoulder at home; the role they take in raising and educating children while caring for the sick and elderly; and the paternalistic, often punitive, presence of welfare and law enforcement agencies like Child Protective Services, the police, and U.S. Immigration and Customs Enforcement (ICE) agents in their lives.

In such a moment, we should all think about the opening words of Howe’s 150-year-old proclamation: “Arise, then, women of this day! Arise, all women who have hearts, whether our baptism be that of water or of tears!”

Of water and tears

Before slavery was outlawed in America, formerly enslaved abolitionist leader Frederick Douglass insisted that those who feel the first pains of injustice must be the first to strike out against it. That was the very kind of “baptism” Howe invoked in her proclamation — an invitation to initiate women into struggles born from those already so much a part of their lives. Today, her invocation of “water and tears” should resonate for millions. Among them, it may have no greater relevance than for the women of the Michigan cities of Flint and Detroit.

April 25th marked the seventh anniversary of the ongoing water crisis in Flint. Many will remember the breaking news coverage about the lead poisoning of that city’s water system at the end of 2015. Others will recall President Barack Obama’s “mission accomplished” moment when he visited Flint and drank a cup of newly filtered tap water. But for the many women, poor and largely of color, who have become Flint’s “water protectors,” the crisis isn’t over. Even now, new water lines are still needed in some neighborhoods. A $641 million class-action settlement fund from lawsuits against the state of Michigan has indeed recently been set up so that Flint residents, particularly impacted children, can receive help. Community leaders are continuing to organize, however, because the settlement requires documentation that the poorest and most vulnerable families will struggle to obtain, leaving out many of the very families and children who need the resources most.

It’s important to note that the struggle of these warriors for clean water did not begin when the first cameras arrived in Flint to record the disaster. It began when, in 2011, Michigan Governor Rick Snyder appointed an unelected emergency manager to rule the city with near-dictatorial powers.

A similar emergency manager had already imposed mass water shutoffs in Detroit after that city went bankrupt, while the one in Flint switched from piping in well-treated water from Detroit to pumping water directly out of the Flint River, which had been an unofficial waste-disposal site for local industry for decades. It was seen as a cost-saving measure for that financially strapped city until a new water-piping system could be built. Warnings and safety precautions were ignored when it came to lead and other pollutants ending up in local drinking water, a decision that would, in the end, condemn Flint’s inhabitants to years of mass lead poisoning. Because of that same tainted water, more than 100 people would also die of Legionnaires’ disease.

We’re talking about a place that had once been a beacon of industry and prosperity, a city now struggling under the weight of deindustrialization and growing poverty. Claire McClinton, a long-time Flint community organizer and leader, summed up the crisis this way: “They could not have taken our water away without taking our democracy first.”

Her words are informed as much by history as by contemporary events. McClinton and many of the other women fighting for clean water had already spent decades organizing for a broad range of welfare, labor, and economic rights. She and many of those other Michigan water warriors are my political mothers and mentors. No wonder I once again celebrated them (as well as my own mom) this Mother’s Day.

Even earlier, in 1996, during the heyday of neoliberal austerity politics, welfare-rights and labor activists like those in Flint and Detroit witnessed Democratic President Bill Clinton eliminate the entitlement of millions to welfare and better living standards by signing into law the Personal Responsibility and Work Opportunity Reconciliation Act. Among its other “reforms,” it replaced Aid to Families with Dependent Children, a program which provided desperately needy children with welfare payments, with the far more restrictive Temporary Assistance for Needy Families. They watched as government agencies kicked staggering numbers of people off life-saving federal assistance programs and continued to forcibly rip kids away from parents who, in terrible economic circumstances, could no longer afford to feed and house their own families adequately.

As the situation in Flint made clear, the historic fight for welfare was integrally connected to the ongoing fight for clean and affordable water, as well as, in our present moment in thousands of communities, the fight for living wages and voting rights. And don’t forget the need for a revival of an increasingly impoverished, not to say (in the wake of Donald Trump) ravaged, democracy.

Indeed, all these issues raise questions about the role the government should play in caring for people and addressing fundamental fractures in society like poverty, hunger, and sickness, which always disproportionately hurt women. All of these are, then — or at least should be — non-negotiable issues for women today.

Lifting from the bottom up

The first 100 days of the administration of Joe Biden and Kamala Harris have clearly represented a surprising pivot from neoliberalism’s halcyon days under Clinton. For an anti-poverty organizer like myself, schooled in the politics of the 1990s and early 2000s, it was startling, even moving, to watch Biden address a joint session of Congress and announce that “trickle-down economics has never worked. It’s time to grow the economy from the bottom up and middle-out… We have a real chance to root out systemic racism that plagues American life… A chance to deliver real equity. Good jobs and good schools. Affordable housing. Clean air and clean water.”

Without a doubt, one of the administration’s biggest achievements so far is the American Rescue Plan Act (ARPA), a $1.9-trillion relief package that has already begun to inject desperately needed resources into a needy America. Included in it was the Child Tax Credit (CTC), a potentially breakthrough anti-poverty program.

The CTC could be transformative for millions of poor families, especially if it were to be expanded and made permanent. For some observers, it may seem like a good idea conceived by policy experts for a critical but passing moment of national need. Dig a little deeper, though, and what you’ll find is that the CTC is an inheritance from the efforts of poor women over these last decades, especially those of the National Welfare Rights Organization (NWRO) in the 1960s and 1970s, some of whom are still organizing in Flint and Detroit.

The NWRO was a national organization of poor women on welfare, Black and white alike, at a time when more than eight million single women and their children received regular but meager benefits through the Aid to Families with Dependent Children program. NWRO leaders, however, saw such welfare not as a form of charity, but as a right. They insisted on the dignity of all work, whether in traditional jobs or at home, and the need to compensate all women for their labor. They championed a welfare system that wouldn’t separate the “deserving” from “undeserving” poor but instead put agency and power in the hands of welfare recipients rather than bureaucrats and social workers.

As it grew, their organizing coalesced around a demand for a guaranteed adequate annual income — and, in the late 1960s, they would prove a force to be reckoned with, recruiting leaders like Reverend Martin Luther King, Jr., to their cause. Their political imaginations were decades ahead of their time and their moral clarity on the position of poor women in this society prophetically advanced.

In 1972, Johnnie Tillmon, the first chairwoman of the NWRO, published a paradigm-shifting essay entitled “Welfare Is a Women’s Issue,” in which she wrote:

“I’m a woman. I’m a black woman. I’m a poor woman. I’m a fat woman. I’m a middle-aged woman. And I’m on welfare. In this country, if you’re any of those things you count less as a human being. If you’re all of those things, you don’t count at all.”

Nearly half a century later, as we pause to honor mothers, isn’t it time to recognize the ways women like Johnnie Tillmon have for all too long been discarded by this society? Isn’t it time to be honest about those men — and women — who have risen to great heights, only to wield power in ways that hurt women? As for me, I can’t forget the moment when, during the fight over ARPA, Arizona Senator Kyrsten Sinema, a Democrat, made a show of walking past the Senate clerk’s desk, giving an exaggerated thumbs down to an amendment to the bill that would have raised the minimum wage to $15 an hour.

When a reporter from the Huffington Post inquired about her vote, the senator’s spokesperson claimed that it was sexist to comment on a female politician’s “body language” or “physical demeanor.” Far more harmful to women, though, is the disproportionate impact of poverty and low wages on them and their families.

Sinema represents a state in which nearly three million people are poor or one emergency away from economic ruin, a majority of them women. It’s troubling, then, that a woman who has reportedly experienced poverty herself (although questions have been raised about whether she has exaggerated how poor she was) would deny living wages to poor and low-income women in her state and across the country. Among the Democrats in the Senate joining Sinema in dissent were New Hampshire’s Maggie Hassan and Jeanne Shaheen, as well as five male senators. (All seven of them are millionaires.) Their actions are a stark reminder that women need genuine representation in Congress, as well as policies that lift us all.

Sadly, that “Nay” vote against including a minimum wage raise in the Covid-19 relief package hurt women, people of color, and the poor. Altogether, 59% of low-wage workers are women and nearly 40% of all Black workers labor for less than $15 an hour. Yes, during the pandemic, we’ve begun calling many low-wage workers “essential.” It turns out, though, that they aren’t essential enough to be guaranteed wages that might help them afford the essentials of life.

Now, Sinema is also at the center of another legislative battle — about the future of the filibuster, a racist relic of the slavery and then Jim Crow eras that still has democracy in chains. It continues to prove a powerful cudgel for extremists in the Republican Party who are increasingly unable to win a governing majority fairly. It’s especially useful for those determined to stonewall on a host of policies that disproportionately impact women, from wage increases to welfare programs and reproductive rights. Sadly, despite claiming to care about sexism and the fortunes of women, Sinema continues to help hold the Senate hostage to score political points.

Arise, women of this day

As a white woman — a mother, a pastor, a feminist, an activist, a teacher, and the co-chair of the Poor People’s Campaign: A National Call for Moral Revival — I feel obliged to challenge Senator Sinema: for her performance on the Senate floor, for her stance against living wages and for the filibuster, and for the long-term impact her actions will have on the 140 million poor and low-income people in this country, especially the 74 million poor and low-income women.

I also feel honored and obliged to uphold the work of women like Claire McClinton, Johnnie Tillmon, and Julia Ward Howe who have allowed us glimpses of what a government and economy that served and empowered all women could look like and who have highlighted the prophetic leadership of women impacted by the social injustices of their day. Now is the time to raise wages, ensure vaccine equity, and so much more. Now is the time to lift from the bottom so that all of society can rise, as poor people have been saying for years and President Biden has recently reaffirmed.

Tillmon couldn’t have made it clearer for us. “Women’s liberation,” she said so many years ago, “is simple. No woman in this country can feel dignified, no woman can be liberated, until all women get off their knees.”

Today, let’s hear her and arise together! In truth, every day should be Mother’s Day.

Liz Theoharis writes regularly for TomDispatch (where this article originated). She is a theologian, ordained minister, and anti-poverty activist. Co-chair of the Poor People’s Campaign: A National Call for Moral Revival and director of the Kairos Center for Religions, Rights and Social Justice at Union Theological Seminary in New York City, she is the author of Always With Us? What Jesus Really Said About the Poor. Follow her on Twitter at @liztheo.

Copyright ©2021 Liz Theoharis — distributed by Agence Global

—————-

Released: 13 May 2021

Word Count: 2,427

—————-

Mandy Smithberger, “Why the Pentagon budget never goes down”

May 11, 2021 - TomDispatch

The first 100 days of President Joe Biden’s administration have come and gone. While somewhat exaggerated, that milestone is normally considered the honeymoon period for any new president. Buoyed by a recent election triumph and inauguration, he’s expected to be at the peak of his power when it comes to advancing the biggest, boldest items on his agenda.

And indeed, when it comes to, say, infrastructure or pandemic vaccination goals, Biden has delivered in a major way. Blindly funding the Pentagon and its priorities in the stratospheric fashion that’s become the essence of Washington has, however, proven another matter entirely. One hundred days later, it’s remarkable how little has changed when it comes to pouring money into this country’s vast military infrastructure and the wars, ongoing or imagined, that accompany it.

For the past decade, debate about the Pentagon budget was governed, in part, by the Budget Control Act, which placed at least nominal caps on spending levels for both defense and non-defense agencies. In reality, though, unlike so many other government agencies, the Pentagon was never restrained by such a cap. Congress continued to raise its limits as military budgets only grew and, no less important, defense spending had a release valve that allowed staggering sums of money to flow without serious accounting into an off-budget fund meant especially for its wars and labelled “the overseas contingency operations account.” The Congressional Research Service has estimated that such supplemental spending from September 11, 2001, to fiscal year 2019 totaled an astonishing $2 trillion above and beyond the congressionally agreed upon Pentagon budget.

Now, however, the Budget Control Act has expired, leaving this administration with a striking opportunity to reorient the country away from trillion-dollar-plus national security budgets and endless wars, though there’s little sign that such a path will be taken.

If there’s one thing Americans should have learned in the last year-plus, it’s that endless Pentagon spending doesn’t actually make us safer. The pandemic, the insurrection at the Capitol, and the persistent threat of white nationalist extremism should have made it all too clear that defending this country against the most significant risks to domestic public health and safety doesn’t fall within the Pentagon’s purview. In addition, the Department of Defense is perhaps the country’s greatest source of wasteful spending and mismanagement.

Sadly enough, however, it’s likely to be business as usual as long as the money continues to flow in the usual fashion. How striking and inexcusable then that, when it comes to the Pentagon, the Biden administration has visibly wasted its pivotal first 100 days in office on yet more of the same. What we already know, for instance, is that, despite a planned withdrawal of American troops from Afghanistan and claims about winding down America’s “forever wars,” the first proposed Biden Pentagon budget of $715 billion actually represents a modest increase over the staggering sums the Pentagon received in the last year of the Trump administration.

Admittedly, there is at least a little good news about the Pentagon’s finances in the Biden era (though it was already included in the last Trump administration Pentagon budget). The overseas contingency operations slush fund is finally being eliminated. While some saw this as a natural consequence of the end of the Budget Control Act, it was definitely a victory over weapons-industry-funded think tanks like the Center for Strategic and International Studies that were trying to persuade lawmakers and the public to “reform” the fund instead.

In addition, the Biden administration’s decision to bring the last troops home from Afghanistan could be an important initial step in drawing down this country’s endlessly expensive wars. It’s estimated that the United States will have spent upwards of $2.5 trillion on the war in Afghanistan alone (including approximately $12.5 billion annually for the next 40 years on the care of its veterans), a conflict in which, according to Brown University’s Costs of War Project, more than a quarter of a million people were killed.

But Biden must do more if he wants to fulfill his promise to end the forever wars. That includes encouraging Congress to repeal long outdated war authorizations and committing not to let any future conflicts start without actual congressional declarations of war. Meanwhile, withdrawing troops from Afghanistan and other war fronts should result in significant Pentagon budget reductions, as has happened historically after wars — but don’t count on it.

The Pentagon behemoth of waste

If you want a bellwether for measuring the Pentagon’s influence in America, consider this: even the most disastrous weapons programs regularly get a pass and it’s unlikely the Biden era will end that reality.

Right now, any number of wasteful and troubled Pentagon programs, most notoriously Lockheed Martin’s F-35 Joint Strike Fighter, are officially being reviewed. The cost of the creation and maintenance of that jet alone has already ensured that it will be the most expensive weapons program in history: an expected $1.7 trillion over its lifetime. Even department officials and members of Congress have — and this is rare indeed — balked at just how expensive and unreliable that fighter aircraft has proven to be. Trump’s outgoing acting Secretary of Defense Christopher Miller called the F-35 a “piece of…,” tellingly leaving the last word hanging, but later referring to the plane as “a monster.” Meanwhile, Representative Adam Smith (D-WA), the chair of the House Armed Services Committee, has made it clear that he’d like to stop throwing taxpayer dollars down that particular “rathole.”

Once upon a time, Americans were assured that, as the country’s future jet fighter, the F-35 would be “more Chevrolet than Porsche”; that is, on the low (and cheap) end of any new mix of future air power. A lot has changed since then. Total program costs have doubled, while the future price of maintaining the planes soared — unlike the planes themselves. Often, in fact, they aren’t in good enough shape to fly, raising serious concerns about whether enough F-35s will be available for future combat. The chief of staff of the Air Force now claims that it’s not the Chevrolet, but the “Ferrari,” of jet fighters and so should, in the future, be used sparingly. The predictable evolution of that plane was described by the legendary late Colonel Everest Riccioni as a modern Pentagon version of “unilateral disarmament.”

At the very least, no more F-35s should be purchased until testing is successfully completed, but such common sense has not, in recent memory, been a notable Pentagon trait — not in the world of the “revolving door” of the military-industrial complex. In this sense, the F-35 program has been typical of our times. In 2017, when delays and exploding costs led the Department of Defense to consider reducing the program’s size, then-commandant of the Marine Corps General Joe Dunford weighed in on the subject. Largely ignoring F-35 testing data, he promptly declared that the program had indeed reached initial operational capability (which it likely hadn’t). Unsurprisingly, soon after his retirement in 2019, he joined the board of Lockheed.

The future of the Pentagon will largely be shaped by the personnel selected to lead it. In too many instances, they’ve come directly from a defense industry that’s profited handsomely from its soaring budget. In the Trump administration, for instance, figures were selected for the position of secretary of defense who had worked for top defense firms. Retired general Jim Mattis had been on the board of General Dynamics (and returned to it shortly after his stint at the Pentagon ended); Patrick Shanahan came from Boeing; and Mark Esper came from Raytheon.

Although Joe Biden issued a strong ethics executive order to be applied to his political appointees across the board, so far his administration doesn’t look that different from past ones when it comes to the Pentagon. After all, his secretary of defense, retired General Lloyd Austin III, arrived directly from the board of Raytheon; while Frank Kendall, nominated to be Air Force secretary, comes from the board of Leidos, another top Pentagon contractor, though one that provides services rather than building weaponry. (While often overlooked, service contracts make up nearly half of all the department’s contract spending.)

Spreading defense contracts across congressional districts, a practice known in Washington as “political engineering,” also needs to end. Lockheed, for instance, claims that the F-35 program has created jobs in 45 states. According to conventional wisdom, it’s this reality that makes the Pentagon too big to fail. Though seldom noted, similar money put into non-military funding like infrastructure or clean energy almost invariably proves to be a greater job creator than the military version of the same.

Here, then, is a question that might be worth considering in the early months of the Biden administration: Is there a more striking indictment of this country’s approach to military budgeting than continuing to buy a weapon because our political system is too corrupt to change course?

Militarism at home

In our recent history, Washington has distinctly been a Pentagon-first sort of place. Often forgotten is how such an approach has negatively impacted communities not just in Afghanistan, Iraq, Somalia, or Yemen, but also here at home. To take one example, the Pentagon has played a key role in militarizing this country’s police forces, only contributing to the destructive cycle that was first widely noticed after police used military-grade weapons against those protesting the killing of an unarmed Black teenager, Michael Brown, in Ferguson, Missouri, in 2014. Continued police violence targeting the Black community finally gained major attention in the wake of the murder of George Floyd and the police response to the Black Lives Matter movement last summer. As colleagues of mine at the Project On Government Oversight have written, the militarization of our police makes the public “both less safe and less free.”

The Pentagon has negatively impacted the policing of America through its 1033 program, which in recent years has transferred staggering amounts of excess military equipment, sometimes directly off the battlefields of this country’s “forever wars,” to police departments across the country. Tools of war now transferred to local police forces include tanks, mine-resistant ambush-protected vehicles, assault rifles, and bayonets, among many other military items. The group Open The Books, dedicated to government transparency, found that since 1993 the program has transferred 581,000 items of military gear worth $1.8 billion to the police. Unsurprisingly, a 2017 study found police departments that received such equipment were more likely to kill the very civilians they are supposed to protect and serve.

At the beginning of the Biden administration, it appeared that the 1033 program would be curtailed. In January, Reuters reported that the president was preparing to sign an executive order that very month, which would at least put significant limits on the program. More than three months later, however, the White House has taken no such action, though in March, Representative Hank Johnson (D-GA) did introduce legislation to curtail the program. According to the Security Policy Reform Institute, the National Association of Police Organizations claimed credit for delaying the president’s action.

So today, the military continues to make this country’s police look ever more like they’re occupying some foreign land.

The China chickenhawks

And if the China hawks who have gained significant power among the Biden foreign-policy team have anything to say about it, funding the Pentagon will continue to be the order of the day.

Not surprisingly, the Biden administration faces increasing pressure over China and the dangers of war, a narrative that seems like a response to a growing public consensus that we can’t continue to put the Pentagon’s needs first. The military services are already beginning to turn on each other as they fight for their share of the future budget pie. Concerned that the money train may finally be preparing to run off the tracks, there’s been a persistent drumbeat of exaggeration about the military threat posed by China.

In that context, the key document Pentagon boosters continue to cite, though it was published in 2018, is a report from the National Defense Strategy Commission. It recommended cutting the entitlement programs that make up this country’s social safety net to pay for a 3% to 5% annual increase in Pentagon spending. Most of the panelists on that commission were defense industry consultants, board members of the giant weapons makers, or lobbyists for the same. Needless to say, they had a financial stake in raising concerns that China would overtake the United States militarily in the reasonably near future.

Indeed, it’s a fact of life that competition with China is now a challenge, but it’s important to maintain a sense of realism about the nature of that threat. As John Isaacs of the Council for a Livable World recently showed, in capacity and strength, the U.S. military dominates China’s many times over. “It seems that China has become the new Soviet Union strawman,” Isaacs wrote. “But there’s one big difference: while the Soviet military and nuclear arsenal were a fair match for the United States’, China’s simply aren’t.” The new cold war with China that the Biden administration is already promoting only threatens to weaken this country as resources are diverted away from combating the most serious threats of our time like pandemics, climate change, and white supremacy.

Unfortunately, in February, the Biden administration, having largely bought into this rhetoric, announced the establishment of a new Pentagon China Task Force. The most likely outcome, as my colleague Dan Grazier points out, is that the president and his foreign policy team will provide ample “cover for elected officials to back unpopular policy recommendations that will end up fulfilling the wish list of the defense industry.”

As longtime Atlantic correspondent and defense-reform expert James Fallows has noted, America’s draft-less twenty-first-century wars have essentially ensured that the U.S. has become a “chickenhawk nation.” For those unfamiliar with the term, chickenhawk refers to “those eager to go to war, as long as someone else is going” in their place. The net result is that the American public has, in this century, proven remarkably complacent about how Washington has used force, “blithely assuming we would win.” It was bad enough with Afghanistan, Iraq, and the other forever-war countries, but when it comes to China, it’s hard to imagine anything but the most negative outcomes from those encouraging military conflict.

Meanwhile, as with so much related to the Pentagon, the consequences at home of the China scare are already apparent. As has been increasingly obvious of late, overheated rhetoric about the dangers of China has led to an increase in hate-crime attacks against Asian Americans nationwide. While former President Trump’s anti-China rhetoric (“Kung-flu,” “China Virus”) seems to have contributed significantly to this increase in hate crimes, so has the rise in fear-mongering about the China threat and the bolstering of what’s still called “defense” policy that’s gone with it.

This country would undoubtedly benefit from more competition (as well as cooperation) with China that would strengthen the economy and create more prosperity here. On the other hand, a new cold war atmosphere will allow the Pentagon to hoard resources that would otherwise go to our greater public health and safety needs.

Unfortunately, 100-plus days later, the Biden administration has already wasted its first opportunity to change course.

Mandy Smithberger writes regularly for TomDispatch (where this article originated). She is the director of the Center for Defense Information at the Project On Government Oversight (POGO).

Copyright ©2021 Mandy Smithberger — distributed by Agence Global

—————-

Released: 11 May 2021

Word Count: 2,541

—————-

William Astore, “Endless war is a feature of our national programming”

May 10, 2021 - TomDispatch

Why don’t America’s wars ever end?

I know, I know: President Joe Biden has announced that our combat troops will be withdrawn from Afghanistan by 9/11 of this year, marking the 20th anniversary of the colossal failure of George W. Bush and Dick Cheney to defend America.

Of course, that other 9/11 in 2001 shocked us all. I was teaching history at the U.S. Air Force Academy and I still recall hushed discussions of whether the day’s body count would exceed that of the Battle of Antietam, the single bloodiest day of the Civil War. (Fortunately, bad as it was, it didn’t.)

Hijacked commercial airliners, turned into guided missiles by shadowy figures our panicky politicians didn’t understand, would have a profound impact on our collective psyche. Someone had to pay and among the first victims were Afghans in the opening salvo of the misbegotten Global War on Terror, which we in the military quickly began referring to as the GWOT. Little did I know then that such a war would still be going on 15 years after I retired from the Air Force in 2005 and 80 articles after I wrote my first for TomDispatch in 2007 arguing for an end to militarism and forever wars like the one still underway in Afghanistan.

Over those years, I’ve come to learn that, in my country, war always seems to find a way, even when it goes badly — very badly, in fact, as it did in Vietnam and, in these years, in Afghanistan and Iraq, indeed across much of the Greater Middle East and significant parts of Africa. Not coincidentally, those disastrous conflicts haven’t actually been waged in our name. No longer does Congress even bother with formal declarations of war. The last one came in 1941 after Pearl Harbor. During World War II, Americans united to fight for something like national security and a just cause. Today, however, perpetual American-style war simply is. Congress postures, but does nothing decisive to stop it. In computer-speak, endless war is a feature of our national programming, not a bug.

Two pro-war parties, Republicans and Democrats, have cooperated in these decades to ensure that such wars persist… and persist and persist. Still, they’re not the chief reason why America’s wars are so difficult to end. Let me list some of those reasons for you. First, such wars are beyond profitable, notably to weapons makers and related military contractors. Second, such wars are the Pentagon’s reason for being. Let’s not forget that, once upon a time, the present ill-named Department of Defense was so much more accurately and honestly called the Department of War. Third, if profit and power aren’t incentive enough, wars provide purpose and meaning even as they strengthen authoritarian structures in society and erode democratic ones. Sum it all up and war is what America now does, even if the reasons may be indefensible and the results so regularly abysmal.

Support our troops! (who are they, again?)

The last truly American war was World War II. And when it ended in 1945, the citizen-soldiers within the U.S. military demanded rapid demobilization — and they got it. But then came the Iron Curtain, the Cold War, the Korean War, fears of nuclear Armageddon (that nearly came to fruition during the Cuban Missile Crisis in 1962), and finally, of course, Vietnam. Those wars were generally not supported — not with any fervor anyway — by the American people, hence the absence of congressional declarations. Instead, they mainly served the interests of the national security state, or, if you prefer, the military-industrial-congressional complex.

That’s precisely why President Dwight D. Eisenhower issued his grave warning about that Complex in his farewell address in 1961. No peacenik, Ike had overseen more than his share of military coups and interventions abroad while president, so much so that he came to see the faults of the system he was both upholding and seeking to restrain. That was also why President John F. Kennedy called for a more humble and pacific approach to the Cold War in 1963, even as he himself failed to halt the march toward a full-scale war in Southeast Asia. This is precisely why Martin Luther King, Jr., truly a prophet who favored the fierce urgency of peace, warned Americans about the evils of war and militarism (as well as racism and materialism) in 1967. In the context of the enormity of destruction America was then visiting on the peoples of Southeast Asia, not for nothing did he denounce this country as the world’s greatest purveyor of violence.

Collectively, Americans chose to ignore such warnings, our attention being directed instead toward spouting patriotic platitudes in support of “our” troops. Yet, if you think about it for a moment, you’ll realize those troops aren’t really ours. If they were, we wouldn’t need so many bumper stickers reminding us to support them.

With the military draft gone for the last half-century, most Americans have voted with their feet by not volunteering to become “boots on the ground” in the Pentagon’s various foreign escapades. Meanwhile, America’s commanders-in-chief have issued inspiring calls for their version of national service, as when, in the wake of 9/11, President George W. Bush urged Americans to go shopping and visit Disney World. In the end, Americans, lacking familiarity with combat boots, are generally apathetic, sensing that “our” wars have neither specific meaning to, nor any essential purpose in their lives.

As a former Air Force officer, even if now retired, I must admit that it took me too long to realize this country’s wars had remarkably little to do with me — or you, for that matter — because we simply have no say in them. That doesn’t mean our leaders don’t seek to wage them in our name. Even as they do so, however, they simultaneously absolve us of any need to serve or sacrifice. We’re essentially told to cheer “our” troops on, but otherwise look away and leave war to the professionals (even if, as it turns out, those professionals seem utterly incapable of winning a single one of them).

You know that yellow “crime scene” tape the police use to keep curious bystanders at bay? Our government essentially uses “war scene” tape to keep the curious among us from fathoming what the military is doing across so much of the world. That “tape” most often involves the use of classification, with everything that might matter to us designated “secret” or “top secret” and not fit for our eyes to see. This cult of secrecy enables ignorance and reinforces indifference.

Anyone like a Chelsea Manning or a John Kiriakou who seeks to cut that tape and so let ordinary citizens examine any of our war crime scenes in all their ugliness is punished. You, John Q. Public, are not supposed to know of war crimes in Iraq. You, Jane Q. Public, are not supposed to know of CIA torture programs. And when you don’t know, and even when you do (if only a little), you have no ability to question this country’s warlords in any rigorous fashion. You have no ability to resist wars vigorously and you know it, so most likely you won’t act — as so many once did in the Vietnam era — to stop them.

For a self-styled democracy that should abjure such conflicts, war has instead become omnipresent, omni-absent (if you’ll let me invent a word for our strange situation), and oddly mercenary in these disunited states of ours. Borrowing a line from The Godfather, war isn’t personal in America, it’s strictly business. Basically, this country has its very own powerful warlords, even if they don’t have personal names, just collective ones — like Boeing, Lockheed Martin, and Raytheon. In those wars of “ours” lies undeniable evidence that corporations are indeed citizens, as the Supreme Court declared in 2010 by judicial fiat in the eerily named “Citizens United” case. As a result, America’s corporate warlords are now a new kind of ultra-powerful citizen. Think of them as warped versions of Marvel superheroes, collectively profiting from incessant conflict.

Did I say America no longer has citizen-soldiers? Of course, America has them. In place of old-style heroes like Alvin York (from World War I) or Audie Murphy (from World War II), we now have “heroes” like Citizen Raytheon and Citizen Boeing. Remember, as Mitt Romney reminded us, “corporations are people, my friend.”

Your views on war don’t matter — or do they?

As I think about war, American-style, certain phrases pop into my head from the Catholic doxology: is now and ever shall be, world without end, Amen. Apply that to America’s global conflicts and you’ve captured the grim reality of this forever-war moment, even if President Biden is now trying to get U.S. combat troops out of one of them (and others are looking fervently for ways to continue fighting it). Worse yet, behind the scenes, that “world without end” invariably threatens to become a world with an end as the Pentagon persists in building yet more nuclear weapons — the phrase of the moment is “modernizing the nuclear arsenal” — while pursuing an antagonistic new cold war with China and Russia.

Referring to Catholic doxology in this fashion may seem heretical to some, but thought about another way, it’s all too appropriate, as war in some sense is a widely shared cult, if not a religion, in America. Too many people believe in it, even worship it. Signs of this include the transformation of anyone who wears a military uniform into an automatic hero. People sacrifice their children to that cult. And even if you or your children choose not to serve (as so many Americans do), or if you’re among those rare citizens who vociferously protest against our wars, your tax dollars nevertheless feed a war machine that’s always cranking away, well-lubricated by our endless cash contributions.

While our coins still say “In God We Trust,” the god our nation’s leaders profess to trust is most assuredly a warrior, not the prince of peace. Under the circumstances and against a backdrop of perpetual war, no one should be surprised that this country is increasingly wracked by conflict and rent by violent impulses.

Common sense informed by history tells us that war is terror, atrocity, and murder. More than a few of America’s sons and daughters have indeed been transformed by war into murderers overseas — and that’s before “our” troops come home, haunted by deadly experiences and their physical and moral wounds. Yet despite their pain, despite those wounds, America’s war machine rumbles on, sowing the dragon’s teeth of future conflicts through vast weapons sales abroad and further military deployments that so often are justified, bizarrely enough, as helping to prevent war.

Of course, we’d like to think of our country as a shining city on a hill, but to others we must seem more like a citadel bristling with weaponry, a colossus of war. And sadly enough, too many of our fellow Americans in that citadel would rather be militarily strong and wrong than pacifically meek and right.

That grim reality was summed up for me by an offhand comment from that self-styled lord of war, then-Vice President Dick Cheney. Early in 2008, his administration’s invasion and occupation of Iraq having cratered and with casualties mounting, he was reminded that public opinion in this country had turned against that war and people wanted it to end. “So?” Cheney replied.

Who cares if the people are against war? For that matter, who cares about right and wrong? What matters is what the national security state wants and what it wants is war till the end of time.

What is to be done?

I see two possible paths for this country. One is to work to find ways to end all our wars and the massive global military presence that goes with them. In the process, we would begin to dismantle our imperial war machine and so hobble the military-industrial complex and its warlords. The other is the path this country remains on (despite Joe Biden’s inclination to end the Afghan War). If followed, it will continue to allow the petty Caesars among us to rage until this imperial power finally collapses under the weight of its military excesses and failures. One path would lead to a possible restoration of democracy and citizen empowerment as America’s founders intended; the other will undoubtedly terminate in the chaos of slow-motion collapse in a world threatened by nuclear annihilation.

There is no fate but what we make, said Sarah Connor in the Terminator movies. What’ll it be, America? Do we have the collective courage to make a better fate for ourselves by pulling the plug on the war machine?

William Astore, a retired lieutenant colonel (USAF) and professor of history, writes regularly for TomDispatch (where this article originated). He is a senior fellow at the Eisenhower Media Network (EMN), an organization of critical veteran military and national security professionals. His personal blog is Bracing Views.

Copyright ©2021 William Astore — distributed by Agence Global

—————-

Released: 10 May 2021

Word Count: 2,117

—————-

Alfred McCoy, “The true meaning of the Afghan ‘withdrawal'”

May 6, 2021 - TomDispatch

Many of us have had a recurring nightmare. You know the one. In a fog between sleeping and waking, you’re trying desperately to escape from something awful, some looming threat, but you feel paralyzed. Then, with great relief, you suddenly wake up, covered in sweat. The next night, or the next week, though, that same dream returns.

For politicians of Joe Biden’s generation that recurring nightmare was Saigon, 1975. Communist tanks ripping through the streets as friendly forces flee. Thousands of terrified Vietnamese allies pounding at the U.S. Embassy’s gates. Helicopters plucking Americans and Vietnamese from rooftops and disgorging them on Navy ships. Sailors on those ships, now filled with refugees, shoving those million-dollar helicopters into the sea. The greatest power on Earth sent into the most dismal of defeats.

Back then, everyone in official Washington tried to avoid that nightmare. The White House had already negotiated a peace treaty with the North Vietnamese in 1973 to provide a “decent interval” between Washington’s withdrawal and the fall of the South Vietnamese capital. As defeat loomed in April 1975, Congress refused to fund any more fighting. A first-term senator then, Biden himself said, “The United States has no obligation to evacuate one, or 100,001, South Vietnamese.” Yet it happened anyway. Within weeks, Saigon fell and some 135,000 Vietnamese fled, producing scenes of desperation seared into the conscience of a generation.

Now, as president, by ordering a five-month withdrawal of all U.S. troops from Afghanistan by this September 11th, Biden seems eager to avoid the return of an Afghan version of that very nightmare. Yet that “decent interval” between America’s retreat and the Taliban’s future triumph could well prove indecently short.

The Taliban’s fighters have already captured much of the countryside, reducing control of the American-backed Afghan government in Kabul, the capital, to less than a third of all rural districts. Since February, those guerrillas have threatened the country’s major provincial capitals — Kandahar, Kunduz, Helmand, and Baghlan — drawing the noose ever tighter around those key government bastions. In many provinces, as the New York Times reported recently, the police presence has already collapsed and the Afghan army seems close behind.

If such trends continue, the Taliban will soon be primed for an attack on Kabul, where U.S. airpower would prove nearly useless in street-to-street fighting. Unless the Afghan government were to surrender or somehow persuade the Taliban to share power, the fight for Kabul, whenever it finally occurs, could prove to be far bloodier than the fall of Saigon — a twenty-first-century nightmare of mass flight, devastating destruction, and horrific casualties.

With America’s nearly 20-year pacification effort there poised at the brink of defeat, isn’t it time to ask the question that everyone in official Washington seeks to avoid: How and why did Washington lose its longest war?

First, we need to get rid of the simplistic answer, left over from the Vietnam War, that the U.S. somehow didn’t try hard enough. In South Vietnam, a 10-year war, 58,000 American dead, 254,000 South Vietnamese combat deaths, millions of Vietnamese, Laotian, and Cambodian civilian deaths, and a trillion dollars in expenditures seem sufficient in the “we tried” category. Similarly, in Afghanistan, almost 20 years of fighting, 2,442 American war dead, 69,000 Afghan troop losses, and costs of more than $2.2 trillion should spare Washington from any charges of cutting and running.

The answer to that critical question lies instead at the juncture of global strategy and gritty local realities on the ground in the opium fields of Afghanistan. During the first two decades of what would actually be a 40-year involvement with that country, a precise alignment of the global and the local gave the U.S. two great victories — first, over the Soviet Union in 1989; then, over the Taliban, which governed much of the country in 2001.

During the nearly 20 years of U.S. occupation that followed, however, Washington mismanaged global, regional, and local politics in ways that doomed its pacification effort to certain defeat. As the countryside slipped out of its control and Taliban guerrillas multiplied after 2004, Washington tried everything — a trillion-dollar aid program, a 100,000 troop “surge,” a multi-billion-dollar drug war — but none of it worked. Even now, in the midst of a retreat in defeat, official Washington has no clear idea why it ultimately lost this 40-year conflict.

Secret war (drug war)

Just four years after the North Vietnamese army rolled into Saigon driving Soviet-made tanks and trucks, Washington decided to even the score by giving Moscow its own Vietnam in Afghanistan. When the Red Army occupied Kabul in December 1979, President Jimmy Carter’s national security advisor, Zbigniew Brzezinski, crafted a grand strategy for a CIA covert war that would inflict a humiliating defeat on the Soviet Union.

Building upon an old U.S. alliance with Pakistan, the CIA worked through that country’s Inter-Services Intelligence agency (ISI) to deliver millions, then billions of dollars in arms to Afghanistan’s anti-Soviet guerrillas, known as the mujahideen, whose Islamic faith made them formidable fighters. As a master of geopolitics, Brzezinski forged a near-perfect strategic alignment among the U.S., Pakistan, and China for a surrogate conflict against the Soviets. Locked into a bitter rivalry with its neighbor India that erupted in periodic border wars, Pakistan was desperate to please Washington, particularly since, ominously enough, India had only recently tested its first nuclear bomb.

Throughout the long years of the Cold War, Washington was Pakistan’s main ally, providing ample military aid and tilting its diplomacy to favor that country over India. To shelter beneath the U.S. nuclear umbrella, the Pakistanis were, in turn, willing to risk Moscow’s ire by serving as the springboard for the CIA’s secret war on the Red Army in Afghanistan.

Beneath that grand strategy, there was a grittier reality taking shape on the ground in that country. While the mujahideen commanders welcomed the CIA’s arms shipments, they also needed funds to sustain their fighters and soon turned to poppy growing and opium trafficking for that. As Washington’s secret war entered its sixth year, a New York Times correspondent traveling through southern Afghanistan discovered a proliferation of poppy fields that was transforming that arid terrain into the world’s main source of illicit narcotics. “We must grow and sell opium to fight our holy war against the Russian nonbelievers,” one rebel leader told the reporter.

In fact, caravans carrying CIA arms into Afghanistan often returned to Pakistan loaded with opium — sometimes, reported the New York Times, “with the assent of Pakistani or American intelligence officers who supported the resistance.” During the decade of the CIA’s secret war there, Afghanistan’s annual opium harvest soared from a modest 100 tons to a massive 2,000 tons. To process the raw opium into heroin, illicit laboratories opened in the Afghan-Pakistani borderlands that, by 1984, supplied a staggering 60% of the U.S. market and 80% of the European one. Inside Pakistan, the number of heroin addicts surged from almost none at all in 1979 to nearly 1.5 million by 1985.

By 1988, there were an estimated 100 to 200 heroin refineries in the area around the Khyber Pass inside Pakistan operating under the purview of the ISI. Further south, an Islamist warlord named Gulbuddin Hekmatyar, the CIA’s favored Afghan “asset,” controlled several heroin refineries that processed much of the opium harvest from the country’s southern provinces. In May 1990, as that secret war was ending, the Washington Post reported that American officials had failed to investigate drug dealing by Hekmatyar and his protectors in Pakistan’s ISI largely “because U.S. narcotics policy in Afghanistan has been subordinated to the war against Soviet influence there.”

Charles Cogan, director of the CIA’s Afghan operation, later spoke frankly about the Agency’s priorities. “We didn’t really have the resources or the time to devote to an investigation of the drug trade,” he told an interviewer. “I don’t think that we need to apologize for this… There was fallout in terms of drugs, yes. But the main objective was accomplished. The Soviets left Afghanistan.”

There was also another kind of real fallout from that secret war, though Cogan didn’t mention it. While it was hosting the CIA’s covert operation, Pakistan played upon Washington’s dependence and its absorption in its Cold War battle against the Soviets to develop ample fissionable material by 1987 for its own nuclear bomb and, a decade later, to carry out a successful nuclear test that stunned India and sent strategic shockwaves across South Asia.

Simultaneously, Pakistan was also turning Afghanistan into a virtual client state. For three years following the Soviet retreat in 1989, the CIA and Pakistan’s ISI continued to collaborate in backing a bid by Hekmatyar to capture Kabul, providing him with enough firepower to shell the capital and slaughter some 50,000 of its residents. When that failed, from the millions of Afghan refugees inside their borders, the Pakistanis alone formed a new force that came to be called the Taliban — sound familiar? — and armed them to seize Kabul successfully in 1996.

The invasion of Afghanistan

In the aftermath of the September 2001 terrorist attacks, when Washington decided to invade Afghanistan, the same alignment of global strategy and gritty local realities assured it another stunning victory, this time over the Taliban who then ruled most of the country. Although its nuclear arms now lessened its dependence on Washington, Pakistan was still willing to serve as a springboard for the CIA’s mobilization of Afghan regional warlords who, in combination with massive U.S. bombing, soon swept the Taliban out of power.

Although American air power readily smashed its armed forces — seemingly, then, beyond repair — that theocratic regime’s real weakness lay in its gross mismanagement of the country’s opium harvest. After taking power in 1996, the Taliban had first doubled the country’s opium crop to an unprecedented 4,600 tons, sustaining the economy while providing 75% of the world’s heroin. Four years later, however, the regime’s ruling mullahs used their formidable coercive powers to make a bid for international recognition at the U.N. by slashing the country’s opium harvest to a mere 185 tons. That decision would plunge millions of farmers into misery and, in the process, reduce the regime to a hollow shell that shattered with the first American bombs.

While the U.S. bombing campaign raged through October 2001, the CIA shipped $70 million in bundled bills into Afghanistan to mobilize its old coalition of tribal warlords for the fight against the Taliban. President George W. Bush would later celebrate that expenditure as one of history’s biggest “bargains.”

Almost from the start of what became a 20-year American occupation, however, the once-perfect alignment of global and local factors started to break apart for Washington. Even as the Taliban retreated in chaos and consternation, those bargain-basement warlords captured the countryside and promptly presided over a revived opium harvest that climbed to 3,600 tons by 2003, or an extraordinary 62% of the country’s gross domestic product (GDP). Four years later, the drug harvest would reach a staggering 8,200 tons — generating 53% of the country’s GDP, 93% of the world’s illicit heroin, and, above all, ample funds for a revival of… yes, you guessed it, the Taliban’s guerrilla army.

Stunned by the realization that its client regime in Kabul was losing control of the countryside to the once-again opium-funded Taliban, the Bush White House launched a $7-billion drug war that soon sank into a cesspool of corruption and complex tribal politics. By 2009, the Taliban guerrillas were expanding so rapidly that the new Obama administration opted for a “surge” of 100,000 U.S. troops there.

By attacking the guerrillas but failing to eradicate the opium harvest that funded their deployment every spring, Obama’s surge soon suffered a defeat foretold. Amid a rapid drawdown of those troops to meet the surge’s use-by date of December 2014 (as Obama had promised), the Taliban launched the first of its annual fighting-season offensives that slowly wrested control of significant parts of the countryside from the Afghan military and police.

By 2017, the opium harvest had climbed to a new record of 9,000 tons, providing about 60% of the funding for the Taliban’s relentless advance. Recognizing the centrality of the drug trade in sustaining the insurgency, the U.S. command dispatched F-22 fighters and B-52 bombers to attack the Taliban’s labs in the country’s heroin heartland. In effect, it was deploying billion-dollar aircraft to destroy what turned out to be 10 mud huts, depriving the Taliban of just $2,800 in tax revenues. To anyone paying attention, the absurd asymmetry of that operation revealed that the U.S. military was being decisively outmaneuvered and defeated by the grittiest of local Afghan realities.

At the same time, the geopolitical side of the Afghan equation was turning decisively against the American war effort. With Pakistan moving ever closer to China as a counterweight to its rival India and U.S.-China relations becoming hostile, Washington grew increasingly irritated with Islamabad. At a summit meeting in late 2017, President Trump and India’s Prime Minister Modi joined with their Australian and Japanese counterparts to form “the Quad” (known more formally as the Quadrilateral Security Dialogue), an incipient alliance aimed at checking China’s expansion that soon gained substance through joint naval maneuvers in the Indian Ocean.

Within weeks of that meeting, Trump would trash Washington’s 60-year alliance with Pakistan with a single New Year’s Day tweet claiming that country had repaid years of generous U.S. aid with “nothing but lies & deceit.” Almost immediately, Washington announced suspension of its military aid to Pakistan until Islamabad took “decisive action” against the Taliban and its militant allies.

With Washington’s delicate alignment of global and local forces now fatally misaligned, both Trump’s capitulation at peace talks with the Taliban in 2020 and Biden’s coming retreat in defeat were preordained. Without access to landlocked Afghanistan from Pakistan, U.S. surveillance drones and fighter-bombers now potentially face a 2,400-mile flight from the nearest bases in the Persian Gulf — too far for effective use of airpower to shape events on the ground (though America’s commanders are already searching desperately for air bases in countries far nearer to Afghanistan to use).

Lessons of defeat

Unlike a simple victory, this defeat offers layers of meaning for those with the patience to plumb its lessons. During a government investigation of what went wrong back in 2015, Douglas Lute, an Army general who directed Afghan war policy for the Bush and Obama administrations, observed: “We were devoid of a fundamental understanding of Afghanistan — we didn’t know what we were doing.” With American troops now shaking the dust of Afghanistan’s arid soil off their boots, future U.S. military operations in that part of the globe are likely to shift offshore as the Navy joins the rest of the Quad’s flotilla in a bid to check China’s advance in the Indian Ocean.

Beyond the closed circles of official Washington, this dismal outcome has more disturbing lessons. The many Afghans who believed in America’s democratic promises will join a growing line of abandoned allies, stretching back to the Vietnam era and including, more recently, Kurds, Iraqis, and Somalis, among others. Once the full costs of Washington’s withdrawal from Afghanistan become apparent, the debacle may, not surprisingly, discourage potential future allies from trusting Washington’s word or judgment.

Much as the fall of Saigon made the American people wary of such interventions for more than a decade, so a possible catastrophe in Kabul will likely (one might even say, hopefully) produce a long-term aversion in this country to such future interventions. Just as Saigon, 1975, became the nightmare Americans wished to avoid for at least a decade, so Kabul, 2022, could become an unsettling recurrence that only deepens an American crisis of confidence at home.

When the Red Army’s last tanks finally crossed the Friendship Bridge and left Afghanistan in February 1989, that defeat helped precipitate the complete collapse of the Soviet Union and the loss of its empire within a mere three years. The impact of the coming U.S. retreat in Afghanistan will undoubtedly be far less dramatic. Still, it will be deeply significant. Such a retreat after so many years, with the enemy if not at the gates, then closing in on them, is a clear sign that imperial Washington has reached the very limits of what even the most powerful military on earth can do.

Or put another way, there should be no mistake after those nearly 20 years in Afghanistan. Victory is no longer in the American bloodstream (a lesson that Vietnam somehow did not bring home), though drugs are. The loss of the ultimate drug war was a special kind of imperial disaster, giving withdrawal more than one meaning in 2021. So, it won’t be surprising if the departure from that country under such conditions is a signal to allies and enemies alike that Washington hasn’t a hope of ordering the world as it wishes anymore and that its once-formidable global hegemony is truly waning.

Alfred W. McCoy writes regularly for TomDispatch (where this article originated). He is the Harrington professor of history at the University of Wisconsin-Madison. He is the author most recently of In the Shadows of the American Century: The Rise and Decline of U.S. Global Power (Dispatch Books). His latest book (to be published in October by Dispatch Books) is To Govern the Globe: World Orders and Catastrophic Change.

Copyright ©2021 Alfred W. McCoy — distributed by Agence Global

—————-

Released: 06 May 2021

Word Count: 2,815

—————-

Karen J. Greenberg, “Can Guantánamo ever be shut down?”

May 4, 2021 - TomDispatch

The Guantánamo conundrum never seems to end.

Twelve years ago, I had other expectations. I envisioned a writing project that I had no doubt would be part of my future: an account of Guantánamo’s last 100 days. I expected to narrate, in reverse, the episodes in a book I had just published, The Least Worst Place: Guantánamo’s First 100 Days, about — well, the title makes it all too obvious — the initial days at that grim offshore prison. They began on January 11, 2002, as the first hooded prisoners of the American war on terror were ushered off a plane at that American military base on the island of Cuba.

Needless to say, I never did write that book. Sadly enough, in the intervening years, there were few signs on the horizon of an imminent closing of that U.S. military prison. Weeks before my book was published in February 2009, President Barack Obama did, in fact, promise to close Guantánamo by the end of his first year in the White House. That hope began to unravel with remarkable speed. By the end of his presidency, his administration had, in fact, managed to release 197 of the prisoners held there without charges — many, including Mohamedou Ould Slahi, the subject of the film The Mauritanian, had also been tortured — but 41 remained, including the five men accused but not yet tried for plotting the 9/11 attacks. Forty remain there to this very day.

Nearly 20 years after it began, the war in Afghanistan that launched this country’s Global War on Terror and the indefinite detention of prisoners in that facility offshore of American justice is now actually slated to end. President Biden recently insisted that it is indeed “time to end America’s longest war” and announced that all American troops would be withdrawn from that country by September 11th, the 20th anniversary of al-Qaeda’s attack on the United States.

It makes sense, of course, that the conclusion of those hostilities would indeed be tied to the closure of the now-notorious Guantánamo Bay detention facility. Unfortunately, for reasons that go back to the very origins of the war on terror, ending the Afghan part of this country’s “forever wars” may not presage the release of those “forever prisoners,” as New York Times reporter Carol Rosenberg so aptly labeled them years ago.

Biden and Guantánamo

Just as President Biden has a history, dating back to his years as Obama’s vice-president, of wanting to curtail the American presence in Afghanistan, so he called years ago for the closure of Guantánamo. As early as June 2005, then-Senator Biden expressed his desire to shut that facility, seeing it as a stain on this country’s reputation abroad.

At the time, he proposed that an independent commission take a look at Guantánamo Bay and make recommendations as to its future. “But,” he said then, “I think we should end up shutting it down, moving those prisoners. Those that we have reason to keep, keep. And those we don’t, let go.” Sixteen years later, he has indeed put in motion an interagency review to look into that detention facility’s closing. Hopefully, once he receives its report, his administration can indeed begin to shut the notorious island prison down. (And this time, it could even work.)

It’s true that, in 2021, the idea of shutting the gates on Guantánamo has garnered some unprecedented mainstream support. As part of his confirmation process, Secretary of Defense Lloyd Austin, for instance, signaled his support for its closure. And Congress, long unwilling to lend a hand, has offered some support as well. On April 16th, 24 Democratic senators signed a letter to the president calling that facility a “symbol of lawlessness and human rights abuses” that “continues to harm U.S. national security” and demanding that it be shut.

As those senators wrote,

“For nearly two decades, the offshore prison has damaged America’s reputation, fueled anti-Muslim bigotry, and weakened the United States’ ability to counter terrorism and fight for human rights and the rule of law around the world. In addition to the $540 million in wasted taxpayer dollars each year to maintain and operate the facility, the prison also comes at the price of justice for the victims of 9/11 and their families, who are still waiting for trials to begin.”

Admittedly, the number of signatories on that letter raises many questions, including why there aren’t more (and why there isn’t a single Republican among them). Is it just a matter of refusing to give up old habits or does it reflect a lack of desire to address an issue long out of the headlines? Where, for example, was Senate Majority Leader Chuck Schumer’s name, not to mention those other 25 missing Democratic senatorial signatures?

And there’s another disappointment lurking in its text. While those senators correctly demanded a reversal of the Trump administration’s “erroneous and troubling legal positions” regarding the application of international and domestic law to Guantánamo, they failed to expand upon the larger context of that forever nightmare of imprisonment, lawlessness, and cruelty that affected the war-on-terror prisoners at Guantánamo as well as at the CIA’s “black sites” around the world.

Still, that stance by those two-dozen senators is significant, since Congress has, in the past, taken such weak positions on closing the prison. As such, it provides some hope for the future.

For the rest of Congress and the rest of us, when thinking about finally putting Guantánamo in the history books, it’s important to remember just what a vast deviation it proved to be from the law, justice, and the norms of this society. It’s also worth thinking about the American “detainees” there in the context of what normally happens when wars end.

Prisoners of war

Defying custom and law, the American war in Afghanistan broke through norms like a battering ram through a gossamer wall. Guantánamo was created in just that context, a one-of-a-kind institution for this country. Now, so many years later, it’s poised to break through yet another norm.

Usually, at the end of hostilities, battlefield detainees are let go. As Geneva Convention III, the law governing the detention and treatment of prisoners of war, asserts: “Prisoners of war shall be released and repatriated without delay after the cessation of active hostilities.”

That custom of releasing prisoners has, in practice, pertained not only to those held on or near the battlefield but even to those detained far from the conflict. Before the Geneva Conventions were created, the custom of releasing such prisoners was already in place in the United States. Notably, during World War II, the U.S. held 425,000 mostly German prisoners in more than 500 camps in this country. When the war ended, however, they were released and the vast majority of them were returned to their home countries.

When it comes to the closure of Guantánamo, however, we can’t count on such an ending. Two war-on-terror realities stand in the way of linking the coming end of hostilities in Afghanistan to the shutting down of that prison. First, the Authorization for the Use of Military Force that Congress passed right after the 9/11 attacks was not geographically defined or limited to the war in Afghanistan. It focused on but was not confined to two groups, the Taliban and al-Qaeda, as well as anyone else who had contributed to the attacks of 9/11. As such, it was used as well to authorize military engagements — and the capture of prisoners — outside Afghanistan. Since 2001, in fact, it has been cited to authorize the use of force in Pakistan, Yemen, Somalia, and elsewhere. Of the 780 prisoners held at Guantánamo Bay at one time or another, more than a third came from Afghanistan; the remaining two-thirds were from 48 other countries.

A second potential loophole exists when it comes to the release of prisoners as that war ends. The administration of George W. Bush rejected the very notion that those held at Guantánamo were prisoners of war, no matter how or where they had been captured. As non-state actors, according to that administration, they were exempted from prisoner of war status, which is why they were deliberately labeled “detainees.”

Little wonder then that, despite Secretary of Defense Austin’s position on Guantánamo, as the New York Times recently reported, Pentagon spokesman John Kirby “argued that there was no direct link between its future and the coming end to what he called the ‘mission’ in Afghanistan.”

In fact, even if that congressional authorization for war and the opening of Guantánamo on which it was based never were solely linked to the conflict in Afghanistan, it’s time, almost two decades later, to put an end to that quagmire of a prison camp and the staggering exceptions that it’s woven into this country’s laws and norms since 2002.

A “forever prison”?

The closing of Guantánamo would finally signal an end to the otherwise endless proliferation of exceptions to the laws of war as well as to U.S. domestic and military legal codes. As early as June 2004, Supreme Court Justice Sandra Day O’Connor flagged the possibility that a system of indefinite detention at Guantánamo could create a permanent state of endless legal exceptionalism.

She wrote an opinion that month in a habeas corpus case for the release of a Guantánamo detainee, the dual U.S.-Saudi citizen Yaser Hamdi, warning that the prospect of turning that military prison into a never-ending exception to wartime detention and its laws posed dangers all its own. As she put it, “We understand Congress’ grant of authority for the use of ‘necessary and appropriate force’ to include the authority to detain for the duration of the relevant conflict, and our understanding is based on longstanding law-of-war principles.” She also acknowledged that, “If the practical circumstances of a given conflict are entirely unlike those of the conflicts that informed the development of the law of war, that [the] understanding [of release upon the end of hostilities] may unravel. But,” she concluded, “that is not the situation we face as of this date.”

Sadly enough, 17 years later, it turns out that the detention authority may be poised to outlive the use of force. Guantánamo has become an American institution at the cost of $13 million per prisoner annually. The system of offshore injustice has, by now, become part and parcel of the American system of justice — our very own “forever prison.”

The difficulty of closing Guantánamo has shown that once you move outside the laws and norms of this country in a significant way, the return to normalcy becomes ever more problematic — and the longer the exception, the harder such a restoration will be. Remember that, before his presidency was over, George W. Bush went on record acknowledging his preference for closing Guantánamo. Obama made it a goal of his presidency from the outset. Biden, with less fanfare and the lessons of their failures in mind, faces the challenge of finally closing America’s forever prison.

With all that in mind, let me offer you a positive twist on this seemingly never-ending situation. I won’t be surprised if, in fact, President Biden actually does manage to close Guantánamo. He may not do so as a result of the withdrawal of all American forces from Afghanistan, but because he seems to have a genuine urge to shut the books on the war on terror, or at least the chapter of it initiated on 9/11.

And if he were also to shut down that prison, in the spirit of that letter from the Democratic senators, it would be because of Guantánamo’s gross violations of American laws and norms. While the letter did not go so far as to name the larger war-on-terror sins of the past, it did at least draw attention directly to the wrongfulness of indefinite detention as a system created expressly to evade the law — and one that brought ill-repute to the United States globally.

That closure should certainly happen under President Biden. After all, any other course is not only legally unacceptable, but risks perpetuating the idea that this country continues to distrust the principles of law, human rights, and due process — indeed, the very fundamentals of a democratic system.

Karen J. Greenberg writes regularly for TomDispatch (where this article originated). She is the director of the Center on National Security at Fordham Law and author of the forthcoming Subtle Tools: The Dismantling of Democracy from the War on Terror to Donald Trump (Princeton University Press, August). Julia Tedesco helped with research for this piece.

Copyright ©2021 Karen J. Greenberg — distributed by Agence Global

—————-

Released: 04 May 2021

Word Count: 2,031

—————-

Tom Engelhardt, “American-style war ’til the end of time?”

April 29, 2021 - TomDispatch

Here’s the strange thing in an ever-stranger world: I was born in July 1944 in the midst of a devastating world war. That war ended in August 1945 with the atomic obliteration of two Japanese cities, Hiroshima and Nagasaki, by the most devastating bombs in history up to that moment, given the sweet code names “Little Boy” and “Fat Man.”

I was the littlest of boys at the time. More than three-quarters of a century has passed since, on September 2, 1945, Japanese Foreign Minister Mamoru Shigemitsu and General Yoshijiro Umezu signed the Instrument of Surrender on the battleship U.S.S. Missouri in Tokyo Bay, officially ending World War II. That was V-J (for Victory over Japan) Day, but in a sense for me, my whole generation, and this country, war never really ended.

The United States has been at war, or at least in armed conflicts of various sorts, often in distant lands, for more or less my entire life. Yes, for some of those years, that war was “cold” (which often meant that such carnage, regularly sponsored by the CIA, happened largely off-screen and out of sight), but war as a way of life never really ended, not to this very moment.

In fact, as the decades went by, it would become the “infrastructure” in which Americans increasingly invested their tax dollars via aircraft carriers, trillion-dollar jet fighters, drones armed with Hellfire missiles, and the creation and maintenance of hundreds of military garrisons around the globe, rather than roads, bridges, or rail lines (no less the high-speed version of the same) here at home. During those same years, the Pentagon budget would grab an ever-larger percentage of federal discretionary spending and the full-scale annual investment in what has come to be known as the national security state would rise to a staggering $1.2 trillion or more.

In a sense, future V-J Days became inconceivable. There were no longer moments, even as wars ended, when some version of peace might descend and America’s vast military contingents could, as at the end of World War II, be significantly demobilized. The closest equivalent was undoubtedly the moment when the Soviet Union imploded in 1991, the Cold War officially ended, and the Washington establishment declared itself globally triumphant. But of course, the promised “peace dividend” would never be paid out as the first Gulf War with Iraq occurred that very year and the serious downsizing of the U.S. military (and the CIA) never happened.

Never-ending war

Consider it typical that, when President Biden recently announced the official ending of the nearly 20-year-old American conflict in Afghanistan with the withdrawal of the last U.S. troops from that country by 9/11/21, it would functionally be paired with the news that the Pentagon budget was about to rise yet again from its record heights in the Trump years. “Only in America,” as retired Air Force lieutenant colonel and historian William Astore wrote recently, “do wars end and war budgets go up.”

Of course, even the ending of that never-ending Afghan War may prove exaggerated. In fact, let’s consider Afghanistan apart from the rest of this country’s war-making history for a moment. After all, if I had told you in 1978 that, of the 42 years to follow, the U.S. would be involved in war in a single country for 30 of them and asked you to identify it, I can guarantee that Afghanistan wouldn’t have been your pick. And yet so it’s been. From 1979 to 1989, there was the CIA-backed Islamist extremist war against the Soviet army there (to the tune of billions and billions of dollars). And yet the obvious lesson the Russians learned from that adventure, as their military limped home in defeat and the Soviet Union imploded not long after — that Afghanistan is indeed the “graveyard of empires” — clearly had no impact in Washington.

Or how do you explain the 19-plus years of warfare there that followed the 9/11 attacks, themselves committed by a small Islamist outfit, al-Qaeda, born as an American ally in that first Afghan War? Only recently, the invaluable Costs of War Project estimated that America’s second Afghan War has cost this country almost $2.3 trillion (not including the price of lifetime care for its vets) and has left at least 241,000 people dead, including 2,442 American service members. In 1978, after the disaster of the Vietnam War, had I assured you that such a never-ending failure of a conflict was in our future, you would undoubtedly have laughed in my face.

And yet, three decades later, the U.S. military high command still seems not faintly to have grasped the lesson that we “taught” the Russians and then experienced ourselves. As a result, according to recent reports, they have uniformly opposed President Biden’s decision to withdraw all American troops from that country by the 20th anniversary of 9/11. In fact, it’s not even clear that, by September 11, 2021, if the president’s proposal goes according to plan, that war will have truly ended. After all, the same military commanders and intelligence chiefs seem intent on organizing long-distance versions of that conflict or, as the New York Times put it, are determined to “fight from afar” there. They are evidently even considering establishing new bases in neighboring lands to do so.

America’s “forever wars” — once known as the Global War on Terror and, when the administration of George W. Bush launched it, proudly aimed at 60 countries — do seem to be slowly winding down. Unfortunately, other kinds of potential wars, especially new cold wars with China and Russia (involving new kinds of high-tech weaponry) only seem to be gearing up.

War in our time

In these years, one key to so much of this is the fact that, as the Vietnam War began winding down in 1973, the draft was ended and war itself became a “voluntary” activity for Americans. In other words, it became ever easier not only to avoid protesting American war-making, but to pay no attention to it or to the changing military that went with it. And that military was indeed changing and growing in remarkable ways.

In the years that followed, for instance, the elite Green Berets of the Vietnam era would be incorporated into an ever more expansive set of Special Operations forces, up to 70,000 of them (larger, that is, than the armed forces of many countries). Those special operators would functionally become a second, more secretive American military embedded inside the larger force and largely freed from citizen oversight of any sort. In 2020, as Nick Turse reported, they would be stationed in a staggering 154 countries around the planet, often involved in semi-secret conflicts “in the shadows” that Americans would pay remarkably little attention to.

Since the Vietnam War, which roiled the politics of this nation and was protested in the streets of this country by an antiwar movement that came to include significant numbers of active-duty soldiers and veterans, war has played a remarkably recessive role in American life. Yes, there have been the endless thank-yous offered by citizens and corporations to “the troops.” But that’s where the attentiveness stops, while both political parties, year after endless year, remain remarkably supportive of a growing Pentagon budget and the industrial (that is, weapons-making) part of the military-industrial complex. War, American-style, may be forever, but — despite, for instance, the militarization of this country’s police and the way in which those wars came home to the Capitol last January 6th — it remains a remarkably distant reality for most Americans.

One explanation: though the U.S. has, as I’ve said, been functionally at war since 1941, there were just two times when this country felt war directly — on December 7, 1941, when the Japanese attacked Pearl Harbor, and on September 11, 2001, when 19 mostly Saudi hijackers in commercial jets struck New York’s World Trade Center and the Pentagon.

And yet, in another sense, war has been and remains us. Let’s just consider some of that war-making for a moment. If you’re of a certain age, you can certainly call to mind the big wars: Korea (1950-1953), Vietnam (1954-1975) — and don’t forget the brutal bloodlettings in neighboring Laos and Cambodia as well — that first Gulf War of 1991, and the disastrous second one, the invasion of Iraq in 2003. Then, of course, there was that Global War on Terror that began soon after September 11, 2001, with the invasion of Afghanistan, only to spread to much of the rest of the Greater Middle East, and to significant parts of Africa. In March, for instance, the first 12 American special-ops trainers arrived in embattled Mozambique, just one more small extension of an already widespread American anti-Islamist terror role (now failing) across much of that continent.

And then, of course, there were the smaller conflicts (though not necessarily so to the people in the countries involved) that we’ve now generally forgotten about, the ones that I had to search my fading brain to recall. I mean, who today thinks much about President John F. Kennedy’s April 1961 CIA disaster at the Bay of Pigs in Cuba; or President Lyndon Johnson’s sending of 22,000 U.S. troops to the Dominican Republic in 1965 to “restore order”; or President Ronald Reagan’s version of “aggressive self-defense” by U.S. Marines sent to Lebanon, 241 of whom were killed in October 1983 when a suicide bomber attacked their barracks; or the anti-Cuban invasion of the tiny Caribbean island of Grenada that same month in which 19 Americans were killed and 116 wounded?

And then, define and categorize them as you will, there were the CIA’s endless militarized attempts (sometimes with the help of the U.S. military) to intervene in the affairs of other countries, ranging from taking the nationalist side against Mao Zedong’s communist forces in China from 1945 to 1949 to stoking a small ongoing conflict in Tibet in the 1950s and early 1960s, and overthrowing the governments of Guatemala and Iran, among other places. There were an estimated 72 such interventions from 1947 to 1989, many warlike in nature. There were, for instance, the proxy conflicts in Central America, first in Nicaragua against the Sandinistas and then in El Salvador, bloody events even if few U.S. soldiers or CIA agents died in them. No, these were hardly “wars,” as traditionally defined, not all of them, though they did sometimes involve military coups and the like, but they were generally carnage-producing in the countries they were in. And that only begins to suggest the range of this country’s militarized interventions in the post-1945 era, as journalist William Blum’s “A Brief History of Interventions” makes all too clear.

Whenever you look for the equivalent of a warless American moment, some reality trips you up. For instance, perhaps you had in mind the brief period between when the Red Army limped home in defeat from Afghanistan in 1989 and the implosion of the Soviet Union in 1991, that moment when Washington politicians, initially shocked that the Cold War had ended so unexpectedly, declared themselves triumphant on Planet Earth. That brief period might almost have passed for “peace,” American-style, if the U.S. military under President George H. W. Bush hadn’t, in fact, invaded Panama (“Operation Just Cause”) as 1989 ended to get rid of its autocratic leader Manuel Noriega (a former CIA asset, by the way). Up to 3,000 Panamanians (including many civilians) died along with 23 American troops in that episode.

And then, of course, in January 1991 the First Gulf War began. It would result in perhaps 8,000 to 10,000 Iraqi deaths and “only” a few hundred deaths among the U.S.-led coalition of forces. Air strikes against Iraq would follow in the years to come. And let’s not forget that even Europe wasn’t exempt since, in 1999, during the presidency of Bill Clinton, the U.S. Air Force launched a destructive 10-week bombing campaign against the Serbs in the former Yugoslavia.

And all of this remains a distinctly incomplete list, especially in this century when something like 200,000 U.S. troops have regularly been stationed abroad, U.S. Special Operations forces have deployed to staggering numbers of countries, American drones have attacked “terrorists” in nation after nation, and American presidents have quite literally become assassins-in-chief. To this day, what scholar and former CIA consultant Chalmers Johnson called an American “empire of bases” — a historically unprecedented 800 or more of them — across much of the planet remains untouched and, at any moment, there could be more to come from the country whose military budget at least equals those of the next 10 (yes, that’s 10!) countries combined, including China and Russia.

A timeline of carnage

The last three-quarters of this somewhat truncated post-World War II American Century have, in effect, been a timeline of carnage, though few in this country would notice or acknowledge that. After all, since 1945, Americans have only once been “at war” at home, when almost 3,000 civilians died in an attack meant to provoke — well, something like the war on terror that also became a war of terror and a spreader of terror movements in our world.

As journalist William Arkin recently argued, the U.S. has created a permanent war state meant to facilitate “endless war.” As he writes, at this very moment, our nation “is killing or bombing in perhaps 10 different countries,” possibly more, and there’s nothing remarkably out of the ordinary about that in our recent past.

The question that Americans seldom even think to ask is this: What if the U.S. were to begin to dismantle its empire of bases, repurpose so many of those militarized taxpayer dollars to our domestic needs, abandon this country’s focus on permanent war, and forsake the Pentagon as our holy church? What if, even briefly, the wars, conflicts, plots, killings, drone assassinations, all of it stopped?

What would our world actually be like if we simply declared peace and came home?

Tom Engelhardt created and runs the website TomDispatch.com (where this article originated). He is also a co-founder of the American Empire Project and the author of a highly praised history of American triumphalism in the Cold War, The End of Victory Culture. A fellow of the Type Media Center, his sixth and latest book is A Nation Unmade by War.

Copyright ©2021 Tom Engelhardt — distributed by Agence Global

—————-

Released: 29 April 2021

Word Count: 2,322

—————-

John Feffer, “Waiting for the cyber-apocalypse”

April 27, 2021 - TomDispatch

America has a serious infrastructure problem.

Maybe when I say that what comes to mind are all the potholes on your street. Or the dismal state of public transportation in your city. Or crumbling bridges all over the country. But that’s so twentieth century of you.

America’s most urgent infrastructure vulnerability is largely invisible and unlikely to be fixed by the Biden administration’s $2 trillion American Jobs Plan.

I’m thinking about vulnerabilities that lurk in your garage (your car), your house (your computer), and even your pocket (your phone). Like those devices of yours, all connected to the Internet and so hackable, American businesses, hospitals, and public utilities can also be hijacked from a distance thanks to the software that helps run their systems. And don’t think that the U.S. military and even cybersecurity agencies and firms aren’t seriously at risk, too.

Such vulnerabilities stem from bugs in the programs — and sometimes even the hardware — that run our increasingly wired society. Beware “zero-day” exploits — so named because the software’s makers have had zero days to fix the flaws before they’re exploited — that can attract top-dollar investments from corporations, governments, and even black-market operators. Zero days allow backdoor access to iPhones, personal email programs, corporate personnel files, even the computers that run dams, voting systems, and nuclear power plants.

It’s as if all of America were now protected by nothing but a few old padlocks, the keys to which have been made available to anyone with enough money to buy them (or enough ingenuity to make a set for themselves). And as if that weren’t bad enough, it was America that inadvertently made these keys available to allies, adversaries, and potential blackmailers alike.

The recent SolarWinds hack of federal agencies, as well as companies like Microsoft, for which the Biden administration recently sanctioned Russia and expelled several of its embassy staff, is only the latest example of how other countries have been able to hack basic U.S. infrastructure. Such intrusions, which actually date back to the early 2000s, are often still little more than tests, ways of getting a sense of how easy it might be to break into that infrastructure in more serious ways later. Occasionally, however, the intruders do damage by vacuuming up data or wiping out systems, especially if the targets fail to pay cyber-ransoms. More insidiously, hackers can also plant “timebombs” capable of going off at some future moment.

Russia, China, North Korea, and Iran have all hacked into this country’s infrastructure to steal corporate secrets, pilfer personal information, embarrass federal agencies, make money, or influence elections. For its part, the American government is anything but an innocent victim of such acts. In fact, it was an early pioneer in the field and continues to lead the way in cyberoperations overseas.

This country has a long history of making weapons that have later been used against it. When allies suddenly turn into adversaries like the Iranian government after the Shah was ousted in the 1979 revolution or the mujahideen in Afghanistan after their war against the Red Army ended in 1989, the weapons switch sides, too. In other cases, like the atomic bomb or unmanned aerial vehicles, the know-how behind the latest technological advances inevitably leaks out, triggering an arms race.

In all these years, however, none of those weapons has been used with such devastating effect against the U.S. homeland as the technology of cyberwarfare.

The worm that turned

In 2009, the centrifuges capable of refining Iranian uranium to weapons-grade level began to malfunction. At first, the engineers there didn’t pay much attention to the problem. Notoriously finicky, such high-speed centrifuges were subject to frequent breakdowns. The Iranians regularly had to replace as many as one of every 10 of them. This time, however, the number of malfunctions began to multiply and then multiply again, while the computers that controlled the centrifuges started to behave strangely, too.

It was deep into 2010, however, before computer security specialists from Belarus examined the Iranian computers and discovered the explanation for all the malfunctioning. The culprit was a worm that had managed to burrow deep into the innards of those computers through an astonishing series of zero-day exploits.

That worm, nicknamed Stuxnet, was the first of its kind. Admittedly, computer viruses had been creating havoc almost since the dawn of the information age, but this was something different. Stuxnet could damage not only computers but the machines that they controlled, in this case destroying about 1,000 centrifuges. Developed by U.S. intelligence agencies in cooperation with their Israeli counterparts, Stuxnet would prove to be but the first salvo in a cyberwar that continues to this day.

It didn’t take long before other countries developed their own versions of Stuxnet to exploit the same kind of zero-day vulnerabilities. In her book This Is How They Tell Me the World Ends, New York Times reporter Nicole Perlroth describes in horrifying detail how the new cyber arms race has escalated. It took Iran only three years to retaliate for Stuxnet by introducing malware into Aramco, the Saudi oil company, destroying 30,000 of its computers. In 2014, North Korea executed a similar attack against Sony Pictures in response to a film that imagined the assassination of that country’s leader, Kim Jong-un. Meanwhile, Perlroth reports, Chinese hackers have targeted U.S. firms to harvest intellectual property, ranging from laser technology and high-efficiency gas turbines to the plans for “the next F-35 fighter” and “the formulas for Coca-Cola and Benjamin Moore paint.”

Over the years, Russia has become especially adept at the new technology. Kremlin-directed hackers interfered in Ukraine’s presidential election in 2014 in an effort to advance a far-right fringe candidate. The next year, they shut down Ukraine’s power grid for six hours. In the freezing cold of December 2016, they turned off the heat and power in Kyiv, that country’s capital. And it wasn’t just Ukraine either. Russian hackers paralyzed Estonia, interfered in Britain’s Brexit referendum, and nearly shut down the safety controls of a Saudi oil company.

Then Russia started to apply everything it learned from these efforts to the task of penetrating U.S. networks. In the lead-up to the 2016 elections, Russian hackers weaponized information stolen from Democratic Party operative John Podesta and wormed their way into state-level electoral systems. Later, they launched ransomware attacks against U.S. towns and cities, hacked into American hospitals, and even got inside the Wolf Creek nuclear power plant in Kansas. “The Russians,” Perlroth writes, “were mapping out the plant’s networks for a future attack.”

The United States did not sit idly by watching such incursions. The National Security Agency (NSA) broke into Chinese companies like Huawei, as well as their customers in countries like Cuba and Syria. With a plan nicknamed Nitro Zeus, the U.S. was prepared to take down key elements of Iran’s infrastructure if the negotiations around a nuclear deal failed. In response to the Sony hack, Washington orchestrated a 10-hour Internet outage in North Korea.

As the leaks from whistleblower Edward Snowden revealed in 2013, the NSA had set up full-spectrum surveillance through various communications networks, even hacking into the private phones of leaders around the world like Germany’s Angela Merkel. By 2019, having boosted its annual budget to nearly $10 billion and created 133 Cyber Mission teams with a staff of 6,000, the Pentagon’s Cyber Command was planting malware in Russia’s energy grid and plotting other mischief.

Unbeknownst to Snowden or anyone else at the time, the NSA was also stockpiling a treasure trove of zero-day exploits for potential use against a range of targets. At first glance, this might seem like the cyber-equivalent of setting up a network of silos filled with ICBMs to maintain a rough system of deterrence. The best defense, according to the hawk’s catechism, is always an arsenal of offensive weapons.

But then the NSA got hacked.

In 2017, an outfit called the Shadow Brokers leaked 20 of the agency’s most powerful zero-day exploits. That May, WannaCry ransomware attacks suddenly began to strike targets as varied as British hospitals, Indian airlines, Chinese gas stations, and electrical utilities around the United States. The perpetrators were likely North Korean, but the code, as it happened, originated with the NSA, and the bill for the damages came to $4 billion.

Not to be outdone, Russian hackers turned two of the NSA zero-day exploits into a virus called NotPetya, which caused even more damage. Initially intended to devastate Ukraine, that malware spread quickly around the world, causing at least $10 billion in damages by briefly shutting down companies like Merck, Maersk, and FedEx and, in an example of second-order blowback, the Russian oil giant Rosneft as well.

Sadly enough, as Kim Zetter has written in Countdown to Zero Day, “[C]yberweapons can be easily obtained on underground markets or, depending on the complexity of the system being targeted, custom-built from scratch by a skilled teenage coder.” Such weapons then ricochet around the world before, more often than not, they return to sender.

Sooner or later, cyber-chickens always come home to roost.

Trump makes things worse

Donald Trump notoriously dismissed Russian interference in the 2016 elections. His aides didn’t even bother bringing up additional examples of Russian cyber-meddling because the president just wasn’t interested. In 2018, he even eliminated the position of national cybersecurity coordinator, a move that helped National Security Advisor John Bolton consolidate his own power within the administration. Later, Trump would fire Christopher Krebs, who was in charge of protecting elections from cyberattacks, for validating the integrity of the 2020 presidential elections.

The SolarWinds attack at the end of last year highlighted the continued weakness of this country’s cybersecurity policy and Trump’s own denialism. Confronted with evidence from his intelligence agencies of Russian involvement, the president continued to insist that the perpetrators were Chinese.

The far right, for partisan reasons, abetted his denialism. Strangely enough, commentators on the left similarly attempted to debunk the idea that Russians were involved in the Podesta hack, 2016 election interference, and other intrusions, despite overwhelming evidence presented in the Mueller report, the Senate Intelligence Committee findings, and even from Russian sources.

But this denialism of the right and the left obscures a more important Trump administration failure. It made no attempt to work with Russia and China to orchestrate a truce in escalating global cyber-tensions.

Chastened by the original Stuxnet attack on Iran, the Putin government had actually proposed on several occasions that the international community should draw up a treaty to ban computer warfare and that Moscow and Washington should also sort out something similar bilaterally. The Obama administration ignored such overtures, not wanting to constrain the national security state’s ability to launch offensive cyber-operations, which the Pentagon euphemistically likes to label a “defend forward” strategy.

In the Trump years, even as he was pulling the U.S. out of one arms control deal after another with the Russians, The Donald was emphasizing his superb rapport with Putin. Instead of repeatedly covering for the Russian president — whatever his mix of personal, financial, and political reasons for doing so — Trump could have deployed his over-hyped art-of-the-deal skills to revive Putin’s own proposals for a cyber-truce.

With China, the Trump administration committed a more serious error.

Stung by a series of Chinese cyber-thefts, not just of intellectual property but of millions of the security-clearance files of federal employees, the Obama administration reached an agreement with Beijing in 2015 to stop mutual espionage in cyberspace. “We have agreed that neither the U.S. [n]or the Chinese government will conduct or knowingly support cyber-enabled theft of intellectual property, including trade secrets or other confidential business information for commercial advantage,” Obama said then. “We’ll work together and with other nations to promote other rules of the road.”

In the wake of that agreement, Chinese intrusions in U.S. infrastructure dropped by an astonishing 90%. Then Trump took office and began to impose tariffs on Chinese goods. That trade war with Beijing would devastate American farmers and manufacturers, while padding the bills of American consumers, even as the president made it ever more difficult for Chinese firms to buy American products and technology. Not surprisingly, China once again turned to its hackers to acquire the know-how it could no longer get legitimately. In 2017, those hackers also siphoned off the personal information of nearly half of all Americans through a breach in the Equifax credit reporting agency.

As part of his determination to destroy everything that Obama achieved, of course, Trump completely ignored that administration’s 2015 agreement with Beijing.

Head for the bunkers?

Larry Hall once worked for the Defense Department. Now, he’s selling luxury apartments in a former nuclear missile silo in the middle of Kansas. It burrows 15 stories into the ground and he calls it Survival Condo. The smallest units go for $1.5 million and the complex features a gym, swimming pool, and shooting range in its deep underground communal space.

When asked why he’d built Survival Condo, Hall replied, “You don’t want to know.”

Perhaps he was worried about a future nuclear exchange, another even more devastating pandemic, or the steady ratcheting up of the climate crisis. Those, however, are well-known doomsday scenarios and he was evidently alluding to a threat to which most Americans remain oblivious. What the Survival Condo website emphasizes is living through five years “completely off-grid,” suggesting a fear that the whole U.S. infrastructure could be taken down via a massive hack.

And it’s true that modern life as most of us know it has become increasingly tied up with the so-called Internet of Things, or IoT. By 2023, it’s estimated that every person on Earth will have, on average, 3.6 networked devices. Short of moving to a big hole in the ground in Kansas and living completely off the grid, it will be difficult indeed to extricate yourself from the consequences of a truly coordinated attack on such an IoT.

A mixture of short-sighted government action — as well as inaction — and a laissez-faire approach to markets has led to the present impasse. The U.S. government has refused to put anything but the most minimal controls on the development of spyware, has done little to engage the rest of the world in regulating hostile activities in cyberspace, and continues to believe that its “defend forward” strategy will be capable of protecting U.S. assets. (Dream on, national security state!)

Plugging the holes in the IoT dike is guaranteed to be an inadequate solution. Building a better dike might be a marginally better approach, but a truly more sensible option would be to address the underlying problem of the surging threat. Like the current efforts to control the spread of nuclear material, a non-proliferation approach to cyberweapons requires international cooperation across ideological lines.

It’s not too late. But to prevent a rush to the bunkers will take a concerted effort by the major players — the United States, Russia, and China — to recognize that cyberwar would, at best, produce the most pyrrhic of victories. If they don’t work together to protect the cyber-commons, the digital highway will, at the very least, continue to be plagued by potholes, broken guardrails, and improvised explosive devices whose detonations threaten to disrupt all our lives.

John Feffer writes regularly for TomDispatch (where this article originated). He is the author of the dystopian novel Splinterlands and the director of Foreign Policy In Focus at the Institute for Policy Studies. Frostlands, a Dispatch Books original, is volume two of his Splinterlands series and the final novel, Songlands, will be published in June. He has also written The Pandemic Pivot.

Copyright ©2021 John Feffer — distributed by Agence Global

—————-

Released: 27 April 2021

Word Count: 2,531

—————-

Nina Burleigh, “The great forgetting”

April 22, 2021 - TomDispatch

The second Moderna shot made me sick — as predicted. A 24-hour touch of what an alarmed immune system feels like left me all the more grateful for my good fortune in avoiding the real thing and for being alive at a time when science had devised a 95% effective vaccine in record time.

To distract myself from the fever as I tried to sleep, I visualized strands of synthetic messenger RNA floating into my cells to produce the alien spike protein that attracted my warrior T-cells. I drifted off envisioning an epic micro-battle underway in my blood and had a series of weird nightmares. At about two a.m., I woke up sweating, disoriented, and fixated on a grim image from one of the studies I had consulted while writing my own upcoming book, Virus: Vaccinations, the CDC, and the Hijacking of America’s Response to the Pandemic, on the Covid-19 chaos of our moment. In his Vaccine: The Controversial Story of Medicine’s Greatest Lifesaver, Arthur Allen described how, in the days of ignorance — not so very long ago — doctors prescribed “hot air baths” for the feverish victims of deadly epidemics of smallpox or yellow fever, clamping them under woolen covers in closed rooms with the windows shut.

Mildly claustrophobic in the best of times, I felt my mind scrabble to other forms of medical persecution I’d recently learned about. In the American colonies of the early eighteenth century, for example, whether or not to undergo smallpox inoculation (a crude precursor of Edward Jenner’s later cowpox vaccine) was a matter of religious concern. Puritans were taught that they would interfere with God’s will if they altered disease outcomes. To expiate that sin, or more likely out of sheer ignorance, medical doctors of the day decreed that the inoculation would only work after weeks of purging, including ingesting mercury, which besides making people drool and have diarrhea, also loosened their teeth. “Inoculation meant three weeks of daily vomiting, purges, sweats, fevers,” Allen wrote.

To clear my thoughts, to forget, I opened my window, let in the winter air, and breathed deep. I then leaned out into the clean black sky of the pandemic months, the starlight brighter since the jets stopped flying and we ceased driving, as well as burning so much coal.

Silence. An inkling of what the world might be like without us.

Chilled, I lay back down and wondered: What will the future think of us in this time? Will people recoil in horror as I had just done in recalling, in feverish technicolor, the medically ignorant generations that came before us?

The glorious dead

When America reached the half-million-dead mark from Covid-19 at the end of February, reports compared the number to our war dead. The pandemic had by then killed more Americans than had died in World War I, World War II, and the Vietnam War combined — and it wasn’t done with us yet. But the Covid dead had not marched into battle. They had gone off to their jobs as bus drivers and nurses and store clerks, or hugged a grandchild, or been too close to a health-care worker who arrived at a nursing home via the subway.

Every November 11th, on Veterans Day, our world still remembers and celebrates the moment World War I officially ended. But the last great pandemic, the influenza epidemic of 1918-1920 that became known as “the Spanish flu” (though it wasn’t faintly Spain’s fault, since it probably began in the United States), fell into a collective memory hole, even though it infected half a billion people on a far less populated planet and killed an estimated 50 million to 100 million of them, including more soldiers than were slaughtered in that monumental war.

When it was over, our grandparents and great-grandparents turned away and didn’t look back. They simply dropped it from memory. Donald Trump’s grandfather’s death from the Spanish flu in 1918 changed the fortunes of his family forever, yet Trump never spoke of it — even while confronting a similar natural disaster. Such a forgetting wasn’t just Trumpian aberrance; it was a cultural phenomenon.

That virus, unlike Covid-19, mainly killed young healthy people. But there are eerie, even uncanny, similarities between the American experience of that pandemic and this one. In the summer of 1919, just after the third deadly wave, American cities erupted in race riots. As in the summer of 2020, the 1919 riots were sparked by an incident in the Midwest: a Chicago mob stoned a black teenager who dared to swim off a Lake Michigan beach whites had unofficially declared whites-only. The boy drowned and, in the ensuing week of rioting, 23 blacks and 15 whites died. The riots spread across the country to Washington, D.C., and cities in Nebraska, Tennessee, Arkansas, and Texas, as Black veterans who had served in World War I returned home to second-class treatment and a surge in Ku Klux Klan lynchings.

As today, there were similar controversies then over the wearing of masks and not gathering in significant numbers to celebrate Thanksgiving. As in 2020-2021, so in 1918-1919, frontline medics were traumatized. The virus killed within hours or a few days in a particularly lurid way. People bled from their noses, mouths, and ears, then drowned in the fluid that so copiously built up in their lungs. The mattresses on which they perished were soaked in blood and other bodily fluids.

Doctors and nurses could do nothing but bear witness to the suffering, much like the front-liners in Wuhan and then New York City in the coronavirus pandemic’s early days. Unlike today, perhaps because it was wartime and any display of weakness was considered bad, the newspapers of the time also barely covered the suffering of individuals, according to Alex Navarro, editor-in-chief of the University of Michigan’s Influenza Encyclopedia about the 1918 pandemic. Strangely enough, even medical books in the following years barely covered the virus.

Medical anthropologist Martha Louise Lincoln believes the tendency to look forward — and away from disaster — is also an American trait. “Collectively, we obviously wrongly shared a feeling that Americans would be fine,” Lincoln said of the early days of the Covid-19 pandemic. “I think that’s in part because of the way we’re conditioned to remember history… Even though American history is full of painful losses, we don’t take them in.”

Guardian columnist Jonathan Freedland argues that pandemic forgetting is a human response to seemingly pointless loss, as opposed to a soldier’s death. “A mass illness does not invite that kind of remembering,” he wrote. “The bereaved cannot console themselves that the dead made a sacrifice for some higher cause, or even that they were victims of an epic moral event, because they did not and were not.”

Instead, to die of Covid-19 is just rotten luck, something for all of us to forget.

Who will ask rich men to sacrifice?

Given the absence of dead heroes and a certain all-American resistance to pointless tragedy, there are other reasons we, as Americans, might not look back to 2020 and this year as well. For one thing, pandemic profiteering was so gross and widespread that to consider it closely, even in retrospect, might lead to demands for wholesale change that no one in authority in this (or possibly any other recent) U.S. government would be prepared or motivated to undertake.

In just the pandemic year 2020, this country’s billionaires managed to add at least a trillion dollars to their already sizeable wealth in a land of ever more grotesque inequality. Amazon’s Jeff Bezos alone packed in another $70 billion that year, while so many other Americans were locked down and draining savings or unemployment funds. The CEOs of the companies that produced the medical milestone mRNA vaccines reaped hundreds of millions of dollars in profits by timing stock moves to press releases about vaccine efficacy.

No one today dares ask such rich men to sacrifice for the rest of us or for the rest of the world.

The pandemic might, of course, have offered an opportunity for the government and corporate leaders to reconsider the shareholder model of for-profit medicine. Instead, taxpayer money continued to flow in staggering quantities to a small group of capitalists with almost no strings attached and little transparency.

A nation brought to its knees may not have the resources, let alone the will, to accurately remember how it all happened. Congress is now investigating some of the Trump administration’s pandemic deals. The House Select Committee on the Coronavirus Crisis has uncovered clear evidence of its attempts to cook and politicize data. And Senator Elizabeth Warren led somewhat fruitful efforts to expose deals between the Trump administration and a small number of health-care companies. But sorting through the chaos of capitalist mischief as the pandemic hit, all those no-bid contracts cut without agency oversight, with nothing more than a White House stamp of approval affixed to them, will undoubtedly prove an Augean stables of a task.

In addition, looking too closely at the tsunami of money poured into Big Pharma that ultimately did produce effective vaccines could well seem churlish in retrospect. The very success of the vaccines may blunt the memory of that other overwhelming effect of the pandemic, which was to blow a hole in America’s already faded reputation as a health-care leader and as a society in which equality (financial or otherwise) meant anything at all.

Forgetting might prove all too comfortable, even if remembering could prompt a rebalancing of priorities from, for instance, the military-industrial complex, which has received somewhere between 40% and 70% of the U.S. discretionary budget over the last half century, to public health, which got 3% to 6% of that budget in those same years.

The most medically protected generation

For most Americans, the history of the 1918 flu shares space in that ever-larger tomb of oblivion with the history of other diseases of our great-grandparents’ time that vaccines have now eradicated.

Until the twentieth century, very few people survived childhood without either witnessing or actually suffering from the agonies inflicted by infectious diseases. Parents routinely lost children to disease; people regularly died at home. Survivors — our great-grandparents — were intimately acquainted with the sights, smells, and sounds associated with the stages of death.

Viewed from above, vaccines are a massive success story. They’ve been helping us live longer and in states of safety that would have been unimaginable little more than a century ago. In 1900, U.S. life expectancy was 46 years for men and 48 for women. Someone born in 2019 can expect to live to between 75 and 80 years old, although due to health inequities, lifespans vary depending on race, ethnicity, and gender.

The scale of change has been dramatic, but it can be hard to see. We belong to the most medically protected generation in human history and that protection has made us both complacent and risk averse.

The history of twentieth-century vaccine developments has long seesawed between remarkable advances in medical science and conspiracy theories and distrust engendered by its accidents or failures. Almost every new vaccine has been accompanied by reports of risks, side effects, and sometimes terrible accidents, at least one involving tens of thousands of sickened people.

Children, however, are now successfully jabbed with serums that create antibodies to hepatitis B, measles, mumps, rubella, diphtheria, tetanus, pertussis — all diseases that well into the twentieth century spread through communities, killing babies or permanently damaging health. A number of those are diseases that today’s parents can barely pronounce, let alone remember.

Remembering is the way forward

The catastrophe of the Spanish flu globally and in this country (where an estimated 675,000 Americans died from it) had, until Covid-19 came along, been dropped in a remarkable manner from American memory and history. It lacked memorial plaques or a day of remembrance, though it did leave a modest mark on literature. Pale Horse, Pale Rider, Katherine Anne Porter’s elegiac short story, for instance, focused on how the flu extinguished a brief wartime love affair between two young people.

We are very likely to overcome the virus at some point in the not-too-distant future. As hard as it might be to imagine right now, the menace that shut down the world will, in the coming years, undoubtedly be brought to heel by vaccines on a planetary scale.

And in this, we’ve been very, very lucky. Covid-19 is relatively benign compared with an emergent virus with the death rates of a MERS or Ebola or even, it seems, that 1918 flu. As a species, we will survive this one. It’s been bad — it still is, with cases and hospitalizations remaining on the rise in parts of this country — but it could have been so much worse. Sociologist and writer Zeynep Tufekci has termed it “a starter pandemic.” There’s probably worse ahead in a planet that’s under incredible stress in so many different ways.

Under the circumstances, it’s important that we not drop this pandemic from memory as we did the 1918 one. We should remember this moment and what it feels like because the number of pathogens waiting to jump from mammals to us is believed to be alarmingly large. Worse yet, modern human activity has made us potentially more, not less, vulnerable to another pandemic. A University of Liverpool study published in February 2021 identified at least 40 times more mammal species than previously known in which coronavirus strains could circulate. In any of those hosts, strains could recombine into a new virus and then be passed on to humanity, a prospect the researchers deemed an immediate public health threat.

In reality, we may be entering a new “era of pandemics.” So suggests a study produced during an “urgent virtual workshop” convened in October 2020 by the United Nations Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) to investigate the links between the risk of pandemics and the degradation of nature. Due to climate change, intense agriculture, unsustainable trade, the misuse of land, and nature-disrupting production and consumption habits, more than five new infectious diseases emerge in people every year, any one of which could potentially spark a pandemic.

That IPBES study predicted that “future pandemics will emerge more often, spread more rapidly, do more damage to the world economy, and kill more people than Covid-19, unless there is a transformative change in the global approach to dealing with infectious diseases.”

Is our species capable of such a change? My inner misanthrope says no, but certainly the odds improve if we don’t delete this pandemic from history like the last one. This, after all, is the first pandemic in which the Internet enabled us to bear witness not only to the panic, illness, and deaths around us, but to the suffering of our entire species in every part of the globe in real time. Because of that alone, it will be difficult to evade the memory of this collective experience and, with it, the reminder that we are all made of the same vulnerable stuff.

Nina Burleigh is a journalist of American politics and the author of six previous books. Her seventh, Virus: Vaccinations, the CDC, and the Hijacking of America’s Response to the Pandemic (Seven Stories Press, to be published May 18th) is a real-life thriller that delves into the official malfeasance behind America’s pandemic chaos and the triumph of science in an era of conspiracy theories and contempt for experts. This article originated at TomDispatch.com

Copyright ©2021 Nina Burleigh — distributed by Agence Global

—————-

Released: 22 April 2021

Word Count: 2,501

—————-

Dilip Hiro, “Biden’s Anti-China ambitions”

April 20, 2021 - TomDispatch

Like his immediate predecessor, Joe Biden is committed to a distinctly anti-China global strategy and has sworn that China will not “become the leading country in the world, the wealthiest country in the world, and the most powerful country in the world… on my watch.” In the topsy-turvy universe created by the Covid-19 pandemic, it was, however, Jamie Dimon, the CEO and chairman of JP Morgan Chase, a banking giant with assets of $3.4 trillion, who spoke truth to Biden on the subject.

While predicting an immediate boom in the U.S. economy “that could easily run into 2023,” Dimon had grimmer news on the future as well. “China’s leaders believe that America is in decline,” he wrote in his annual letter to the company’s shareholders. While the U.S. had faced tough times in the past, he added, today “the Chinese see an America that is losing ground in technology, infrastructure, and education — a nation torn and crippled by politics, as well as racial and income inequality — and a country unable to coordinate government policies (fiscal, monetary, industrial, regulatory) in any coherent way to accomplish national goals.” He was forthright enough to say, “Unfortunately, recently, there is a lot of truth to this.”

As for China, Dimon could also have added, its government possesses at least two powerful levers in areas where the United States is likely to prove vulnerable: dominant control of container ports worldwide and the supplies of rare earth metals critical not just to the information-technology sector but also to the production of electric and hybrid cars, jet fighters, and missile guidance systems. And that’s only a partial list of the areas where China is poised to become dominant in the foreseeable future. Here’s a likely scenario.

The digital yuan versus the (missing) digital dollar

Behind the broad headline of being the globe’s “second-largest economy,” China has already either surpassed the United States or is running neck-and-neck with it in certain specific sectors.

With a global smartphone market share of 20% in the second quarter of 2020, China’s Huawei Technologies topped the charts, marginally exceeding South Korea’s Samsung, and well ahead of Apple, according to the International Data Corporation. This happened despite a concerted drive by President Donald Trump’s administration to damage Huawei that culminated, in May 2020, with Washington barring companies worldwide from using U.S.-made machinery or software to design or produce chips for that company or its entities from that September on. Nonetheless, with a 47% share of China’s booming 5G smartphone market, Huawei topped the list there while it kept up its investment in future-oriented, cutting-edge technologies and basic research to the tune of a striking $3 billion to $5 billion annually.

Broadly speaking, China continues to make impressive strides when it comes to developing its information and communications technology sector. Its Fintech (Financial Technology) report, published in October 2020, showed that an estimated 87% of Chinese consumers used fintech services. With a vast mobile-payment system that hit $29 trillion (200 trillion yuan) worth of payments in 2019, China is shaping up to become the globe’s first “cashless society” and its largest financial-technology ecosystem by the end of this decade.

Less than 10% of Americans use mobile payments, which means a similar scenario for the United States is nowhere on the horizon. With mobile transactions in China already accounting for at least four out of every five payments and more than half the value of all non-cash retail payments, that country is poised to leave the U.S., a comparative laggard in fintech, shackled to a cash-dominated system.

In their relentless drive for innovation, the Chinese authorities started pushing the development of a digital currency in certain regions in August 2020. Their specific goals were to make daily life easier for citizens and digital payments more secure. While non-bank payment platforms like Alipay and WeChat Pay required users to link to bank accounts, a digital wallet with an e-currency deposit could be opened with a unique personal identification — a driver’s license or a mobile phone number — enabling the un-banked population of China to embrace the digital world.

As a result, the People’s Bank of China became the first major central bank to issue a virtual currency. A broader roll-out is expected for the Winter Olympics in Beijing in February 2022, which will give the digital yuan international exposure.

This has alarmed the Biden administration. Officials at the Treasury Department, the State Department, the Pentagon, and the National Security Council are frantically trying to comprehend the potential implications of a virtual yuan system. They are particularly eager to understand how it would be distributed, and whether it could be used to bypass Washington’s international sanctions as applied to Iran. What distresses some American officials and experts is the notion that someday China’s virtual yuan could replace the U.S. dollar as the world’s dominant reserve currency.

At the Federal Reserve, Chairman Jerome Powell insisted that the central bank was involved in a large-scale research and development project on a possible future digital dollar, though pointing out that such a project could only be launched via a law that would have to be passed by a deeply divided Congress. In short, irrespective of the future of China’s virtual currency, a digital dollar is not likely, not in the near future anyway.

Building infrastructure (or not)

As for recent economic history, even a cursory look at the performances of the United States and China in combating the 2008 financial meltdown tells a striking tale.

China made an indelible mark in meeting that financial challenge. Its government sharply increased its infrastructure spending, resulting in higher imports that helped counter flagging global demand. While this move increased Beijing’s debt, it also helped build a foundation to further transform the country’s economy into a productivity-led growth model. A decade after that great recession, according to the World Economic Forum’s Global Competitiveness Report, China’s infrastructure ranking jumped from 66th place to 36th place out of 152 countries.

Although infrastructure building on a large scale requires significant upfront investment, it’s guaranteed to yield productivity gains in the long run. Time and cost savings for commuters, improved market access, healthier competition, increased exchange of ideas, and enlarged innovation capacity, all aided by modern infrastructure, are a springboard for economic development.

During the decade following the 2008 crisis, the number of Chinese cities with metro services jumped from 10 to 34, and 1.1 million kilometers of highways were built, raising the total to 4.8 million kilometers. The country’s railway network grew by 52,000 kilometers to 132,000 kilometers, including its high-speed rail system. Introduced on the eve of the 2008 Olympics in Beijing, Chinese high-speed rail is now by far the world’s most extensive, accounting for two-thirds of the globe’s high-speed track. Its advances in information and computer technology were equally impressive. On average, mobile-phone subscriptions came to exceed one per person — about the same as in the United States.

High-speed rail (of which the United States has none) reduces journey times, while linking dense urban areas with less crowded cities. In doing so, it allows for a more balanced distribution of labor and business development without sacrificing the benefits of an increasingly urbanized economy. Economies of scale in turn mean that productivity rises as rail usage increases.

Little wonder, then, that President Barack Obama and his team promoted the $787 billion American Recovery and Reinvestment Act of 2009 as an infrastructure-building program in response to the 2008 economic crisis. In reality, however, only $80 billion, a tenth of the money Congress sanctioned, would be devoted to actual infrastructure. Of that, about a third was spent on roads and bridges, improving about 67,600 kilometers of roads and 2,700 bridges. The program also included investment in modern infrastructure like smart grids and broadband development.

In 2010, Obama announced what was to be the “largest investment in infrastructure since the Interstate Highway System,” the creation of a high-speed rail network that would rival China’s. More than a decade later, the only visible progress is a much-delayed and still incomplete 275-kilometer Central Valley California line from Bakersfield to Merced. And in the Trump years, when essentially no government money went into such projects, “infrastructure week” became a standing joke. President Biden seems determined to rectify this, but how successful he’ll be with his $2 trillion infrastructure proposal in the face of a rigidly divided Congress remains to be seen.

For its part, the Chinese government combined its program of rapid infrastructure development with upgrading of the labor force. It did so by implementing an educational system that stressed science, technology, engineering, and math, known as STEM. By achieving higher productivity in this way, the government planned to compensate for a projected shrinkage in its work force.

To promote STEM, the government issued guidelines in 2016 to create a national development strategy aimed at advancing China to the forefront of innovative countries by 2030. In February 2017, the Ministry of Education officially added STEM education to the primary-school curriculum. Since then, encouraged by official policies, schools in both the public and private sectors have implemented such programs.

In 2019, the Chinese government directed 100% of its research funding for top universities to STEM disciplines; South Korea, by comparison, allocated 62% of such funding that way. U.S. universities ranked in the top 100, by contrast, maintained a greater balance in funding among STEM fields, the humanities, and the social sciences.

In October 2019, three of China’s biggest mobile-phone carriers launched advanced 5G services, giving it the world’s largest 5G mobile network. A year later, the Wall Street Journal reported that China had more 5G subscribers than the U.S., not just in total but per capita.

Given the ubiquity of smartphones, the news that America seemed to be losing the tech race to China was widely noted. Mostly ignored, however, was the extent to which the U.S. had become vulnerable to Chinese pressure in international trade.

America’s vulnerabilities

In testimony before Congress in October 2019, Carolyn Bartholomew, chairwoman of the U.S.-China Economic and Security Review Commission, revealed that at least two-thirds of the world’s top 50 maritime container ports were directly owned and managed by the Chinese or supported by that country’s investments (up from roughly 20% a decade earlier). These included terminals at major American container ports in Los Angeles and Seattle. China also led the world in container ports themselves, with seven of the 10 largest.

A year earlier, officials at the state-owned China Ocean Shipping Company, one of the globe’s largest container shipping lines, acknowledged that the company had connected its routes along what was officially called the Maritime Silk Road, linking regional markets in West Africa, Northern Europe, the Caribbean, and the U.S. to form a more comprehensive and balanced globalized trading network. “By owning and/or operating a network of logistical nodes across Asia, Europe, and Africa, China can control a significant portion of its inbound supply chain for essential commodities and outbound trade routes for its exports,” Bartholomew explained. “In the event of conflict, China could use its control over these and other ports to hinder trade access to other countries.”

In the manufacturing sector, China finds itself in a privileged position by virtue of its deposits of special minerals called rare earth elements. This group of 17 rare earth metals, including lanthanum, cerium, yttrium, europium, and gadolinium, often called “industrial gold,” consists of critical components of such high-technology and clean-energy products as wind turbines, solar panels, and electric cars, because of their magnetism, luminescence, and strength. They are also used in a wide variety of weapons from jet fighters to nuclear submarines.

Unsurprisingly, in recent years, there has been a rapid rise in the demand for these minerals in advanced economies. They are dispersed in low concentrations and are costly to extract from ore, an industry in which China has invested a great deal since the 1970s.

According to the U.S. Geological Survey, in 2020, China accounted for 58% of rare earth minerals production, down from around 90% four years earlier, as the United States and Australia boosted their own mining of them. Still, as of 2018, the United States imported 80.5% of its rare earth metals from China. In May of that year, the Trump administration added these to a list of minerals deemed critical to American economic and national security. And in July 2019, it declared them “essential to the national defense,” which freed up resources for the Department of Defense to take action to secure a domestic rare earth production capability.

Even if the mining of these ores increased in the U.S., refining them requires specialist technology and trained personnel as well as high upfront investment. Due to the lack of these in the U.S. so far, China continues to enjoy a near monopoly in processing the ore, with the raw material containing the prized metal mined outside China shipped to the Chinese sites. The refining process also generates large amounts of radioactive waste and pollutes the environment. As a result, developed countries usually opt for getting the refining done in emerging economies.

All in all, when you view the globe in the throes of a once-in-a-century pandemic, you find an authoritarian state, wedded to centralized planning, initiating programs with long-term benefits for its citizens and seeing them through. You also see a politically riven democratic republic operating primarily on an ad hoc basis.

The stark truth is that an American president cannot even bet on his policies, however laudable or otherwise, surviving his four-year term. Trump’s succession after the Obama era illustrated this dramatically, as has that of Trump’s successor, Biden. When judged purely on the basis of final results, centralized planning clearly beats short-term programming, even if it is viewed with a mixture of derision and condemnation by the Western governments that Biden is attempting to co-opt to challenge China. The reality of our moment: that country is now rising on a distinctly wounded planet.

Dilip Hiro writes regularly for TomDispatch (where this article originated). He is the author of 37 books, including most recently After Empire: The Birth of a Multipolar World.

Copyright ©2021 Dilip Hiro — distributed by Agence Global

—————-

Released: 20 April 2021

Word Count: 2,317

—————-

