Tuesday, December 22, 2020

CHRISTMAS TREES AND CROOKED TIMBERS

My daughter, her boyfriend and I went shopping for a Christmas tree this year.  

At our first stop, the pickings were slim.  The "farm" to which we ventured had pretty much run out.  Some threadbare remainders were scattered across the lot, orphans in a strange year that the season itself seemed unable to make right.  Up the road, a small field of uncut pines at another farm seemed promising.  But those were not being cut or sold.  Unlike their orphaned companions, they were not desperate.  Just determined.  Refusing to celebrate a year in which so much could not be celebrated, they were more or less . . .

Un-Christmas trees.

They would not be unearthed.

It was hard to blame them.  

Optimism and pessimism are generally considered dispositions.  One of my best friends is the proverbial boy in the room full of you-know-what, furiously shoveling because he knows there's a pony in there somewhere.  We tease him mercilessly but envy his ability to stare down adversity.  Some of history's greatest leaders had a similar bent.  FDR's nothing-to-fear-but-fear-itself refrain kept the nation afloat in the Depression, as did Churchill's steely unwillingness to bend in any way to Hitler as Great Britain marched alone through its "finest hour" in 1940.

Both were indispensable. 

The Depression emasculated America as it threw a whole generation of middle-aged men out of work, and the German Blitz killed 43,000 Londoners in eight months while leaving another 1 in 6 Londoners homeless as it destroyed 1.1 million British flats.

Neither, however, was sufficient.  

It took more than Roosevelt's or Churchill's rhetoric to end the Depression or survive the Blitz and ultimately win the war.

And both men knew this.  

The New Deal was a program, not a personality.  In the eight years before World War II, it entailed enormous amounts of domestic spending in the form of direct relief, price supports and unemployment and old age insurance to support and increase economic demand; a relaxed gold standard that increased the money supply (which also enhanced demand); labor laws that allowed workers to compete more equally with their employers to bid up wages; and structural reforms to the banking and securities industries to end the speculative abuses that had facilitated the economic crisis in the first place.  

As for Britain and the war, Churchillian rhetoric became actual success for two reasons, neither of which depended on the great man's words alone.  The first was the Royal Air Force (RAF), which won the Battle of Britain and thus torpedoed Germany's invasion plans.  When Churchill told the British in the summer of 1940 that "Never in the field of human conflict was so much owed by so many to so few," he was paying tribute to the Spitfire pilots who had almost single-handedly kept Germany at bay.  And when, on December 7, 1941, he knew Britain had been delivered and would ultimately prevail, it was because America was entering the war.  He had hoped for it, argued for it, and begged for it.  In contemplating the possibility that the British Isles might be overrun, he had even predicted it.  But in the end, it was the Japanese and Hitler who made it happen.

Though today's challenges are eerily similar, the cumulative response has been anything but.  

As Covid marches on, America's death toll has reached 320,000.  Over 2,500 Americans are dying on average every day. Though vaccines are becoming available, they will not be administered to the vast majority of us until sometime next summer.  For herd immunity to occur, at least 70% of Americans will have to get vaccinated.  Though a September poll found that only about 60% said they would do so, that level of support had reached 70% by December.  In the meantime, social distancing, masks and intermittent quarantines -- all of which have unfortunately been turned into political statements -- remain the only effective antidotes.

The absence of leadership is and has been striking.

On Covid these days, Donald Trump is AWOL.  Other than claiming credit for the vaccines, he has literally said nothing about the crisis since the November election.  His White House arranged a host of holiday parties, disregarding scientific advice to the contrary and generating additional super-spreader possibilities.  At those events, masks were more in evidence than in the past, but distancing -- a casualty of indoor events -- was not.

Trump himself has avoided many of these holiday events, instead hunkering down in his residential bunker where he avoids public exposure, and on the rare occasions when he does show up, he simply touts his phony claims of election fraud.  Those claims, repeatedly rejected by the courts and for which literally no credible evidence has emerged, never die in his White House, even as they become more outlandish.  Over the past few days, a coterie of crazies -- Giuliani, sometime Trump lawyer Sidney Powell and the pardoned former National Security Adviser (and ex-General) Michael Flynn -- even discussed the possibility of Trump imposing martial law to stop Biden from being sworn in on January 20.

For his part, Flynn reportedly endorsed and advocated the idea. 

Whether Trump himself endorsed the idea is not clear, and this kind of opacity is typical of both him and his administration.  A bully and pathological liar, Trump has a signature rhetorical move that generally takes the form of "people are saying."  It allows him to traffic in lies while denying responsibility for them.  Flynn's suggestion creates a slight variation on this move.  This time, there actually is a person -- an ex-General, no less -- saying something.  So, Trump gets to hide behind Flynn's ostensible credentials while further stoking the (false) fires of illegitimacy designed to destroy Biden's coming Presidency.

And we all get to marvel at another new low.

At the end of the day, there will be no martial law.  Indeed, were this morally bankrupt President to give any such order, the military would not follow it and the courts would promptly kill it.  It is unclear how many in the cowards' caucus of Congressional Republicans would join in this repudiation; sadly, the number would fall far short of all of them.  Since the election, one of the principal cowards -- Sen. Lindsey Graham of South Carolina -- has taken to painting the political future as a contest between better manners and poor policies.  According to him, "Our problem is tone, their problem is policy."  Graham likes the GOP's "chances better because we can act better [but] it's harder for them to legislate differently."

Seriously?

The four years of lies, successful Russian cyber-attacks, 320,000 Covid deaths, assaults on Obamacare with no credible alternative, weakened alliances, continued climate crisis, ongoing racial unrest and newly active militias and "some fine people" white supremacists are all a consequence of . . .

Tone?

What Graham refuses to recognize is the rot that lies at the foundation of his political party.  He won re-election and his party survived the past four years not in spite of Trump but because of him.  Anywhere from a quarter to a third of the GOP vote is now composed of inveterate Trumpists, those who actually thought the Supreme Court should have discarded the ballots of over 20 million Americans and now believe martial law would be a good thing.

Pundits think Trump drove up turnout among Democrats, which is true, and that once he leaves, that will abate, which may be true as well.  But Trump himself also drove up turnout among Republicans and succeeded in so dividing the country that the GOP cannot win these days without all of his voters, even the crazy ones.  If those voters stay home, as may be the case in the coming Senate run-off races in Georgia, Republicans will lose.  But if they show up, they will drive "tone" into the same ditch Trump has driven it for the past four years. For them, like Trump, "act[ing] better" is for losers and being better is not really possible.

Today, there are two political parties in America.   

The Democratic Party.  

And the Anti-Democratic Party.  

The first counts votes.  The second suppresses them.  The first respects elections. The second tries to overturn them.   The first follows the law.  The second breaks it. The first has won the popular vote in seven of the last eight presidential elections. The second has won it once.  The first wins in spite of a federal structure that tilts the electoral playing field against it. The second wins because of that structure.  The first needs to maintain its coalition and policy sanity to succeed going forward.  The second needs to get rid of its rot if it has the same goal.

If I were betting, it would be on the Democrats.  There are lots of Joe Bidens -- decent, more or less centrist pols -- in the Democratic Party.  But there aren't a lot of Mitt Romneys in the Republican Party.  And there are way too many Donald Trumps.

Graham and his friends will discover how many once Trump leaves. 

In search of a Christmas tree, our final stop was a nursery three towns away.  When we arrived, the only other customer was on his way out.  Christmas tree shopping in the world of Covid is like all shopping in the world of Covid.  Socially distant at best, solitary and lonely at worst.  A couple of employees, no doubt seasonal and brought in for the (un)expected rush, lingered over a chain saw as we inspected the selection.

This nursery had plenty of trees.  Erect, they bent ever so slightly because they were real.  Though full, the branches were alive. They darted in and out.  And up and down. They avoided artificial symmetry and were thus appropriate reminders, perhaps more so this year than any other, of us -- Kant's veritable "crooked timber of humanity" from which "no straight thing was ever made".

We bought one, took it home and put it up. 

It stands in front of a mountain of books, antidotes to ignorance for any crooked timbers to pursue.

It is the perfect Christmas tree.  

In the perfect place.

For this imperfect year.






Sunday, November 22, 2020

THANKSGIVING 2020 -- FLOWERS IN THE FALL

I remember the day we buried my best friend's baby.  

His wife was lucky to be alive.  Well into her ninth month and with the birth mere weeks away, her uterus ruptured, killing their son and almost killing her as well.  My friend was by then a prominent New York attorney, well-acquainted with late nights and long trips.  That night, however, he was home.  Had he been away, he would have come home to two tragedies.

One was enough.

At the funeral, as he began his sermon, the priest looked out over a small group of familiar faces.  

He knew us all.  He knew we were ambitious, hard working, take-charge types, unwilling and unable to let mere fate take its course, always prepared to intervene, to move life in the direction we wanted, to shape the future rather than simply await it. He knew this because, decades before, he had been one of the Jesuits who taught us the skills we'd use to do so.

Now, in the still silence of a small chapel in Manhattan, he was teaching us something else.

We highly educated, ambitious, hard-working citizens of early 21st century America control things.

"But," he said, "not everything."

Welcome to 2020. 

The year of lost control.

Covid, a quintessential manifestation of lost control at so many levels, has killed over 250,000 in America and more than 1.3 million worldwide.  Bookended by a dysfunctional politics that has made things even worse, we approach Thanksgiving in circumstances very similar to those that existed in 1863 when Lincoln first proclaimed the last Thursday in November a national day of thanks.  Then, we were on our way to killing 763,000 in a war with ourselves over whether one man had the right to own another.  Now, we are marching toward a projected death toll of anywhere from 350,000 to over half a million, largely dependent upon whether all of us are willing to wear masks, stand apart, and occasionally quarantine until an effective vaccine is widely distributed.

In 1863, Lincoln asked us to thank God for the fact that war had not "arrested the plough, the shuttle or the ship." The economy, it seemed, was doing rather well wherever armies were not creating the first killing fields of modern warfare. At the same time, acutely aware of the war's toll in "widows, orphans, mourners [and] sufferers", Lincoln demanded that the day's prayer be one of "humble penitence for our national perverseness and disobedience."  He knew that the Civil War was at its core the product of both.

In 2020, our prayers should be similarly schizophrenic. The genius of American science and American business is on full display as three companies complete testing and are on the verge of issuing an effective vaccine. The US military is poised to deliver it to the initial wave of recipients within 48 hours of final approval.  The speed with which this likely remedy was produced is unique, breathtaking really, and conjures a reality that buttresses all those (often exaggerated) claims of American exceptionalism.

Be thankful for that.

Covid did not "arrest", as Lincoln would put it, our ingenuity.

After thirty years of over-the-top right-wing rhetoric, it also did not arrest our ability to embrace the necessity of effective, even big, government.

One of the principal reasons Pfizer and Moderna and AstraZeneca were able to generate their vaccines at the bureaucratic equivalent of the speed of light is that the federal government -- the one so many of us now habitually hate -- guaranteed their market.  It's a lot easier to get on with the business of discovery, testing and production when you have already been guaranteed payment.  In economic terms, it pretty much eliminates risk, or what is worse in the capitalism of contemporary America, shareholder derivative suits.  For the time being, we also can ignore the fact that, in truth, this whole guaranteed governmental purchase thing is a form of socialism for shareholders, a truth that uncovers a beehive of hypocrisy in our current zeitgeist.  As is always the case when socialism adds to corporate profit margins, we must put that aside.

And just . . . 

Thank Uncle Sam.

Not unexpectedly, our current but soon to be ex-President is demanding we thank him as well.  When not inventing election fraud claims his own lawyers -- lest they lose their licenses -- refuse to repeat in actual courts before actual judges, he now asserts that his FDA never would have fast-tracked a vaccine without his insistence.  Hard to know if this is the case, but even a "perhaps" deserves a nod of approval.  The problem, however, is that Trump insists upon so much, and so much that is false, that it is hard to tell whether anything he says generates progress rather than chaos.  The other problem is that, even assuming he deserves credit for lighting the bureaucratic fires that got this (almost) done, he is now impeding the incoming Biden Administration's ability to distribute the vaccine by refusing to share the data and plans the new President will have to implement.

I have no idea what, specifically, Trump said to the FDA.  Or even if the agency listened.  But I am certain that, if something goes wrong between now and when the first vaccine is delivered, Trump's claim of insistence will vanish.  I am also certain that Trump will never concede his loss to Biden, and I have tired of waiting for Republican office-holders and appointees to do their jobs regardless.  The head of the GSA is still pretending she cannot "ascertain" that Biden won the election and thus free up transition funds, and every day she waits is one lost to a seamless vaccine delivery after January 20.

So, no thanks, Mr. President.

My original intention in this year's Thanksgiving missive was to thank my Mom.  She is 91, and Covid has been difficult for her and my sister, both of whom live together about two hours from me in New Jersey.  She misses her grandchildren and me and her daughter-in-law.  On Mother's Day, we drove down to surprise them and assembled (socially distanced) at their co-op's picnic area.  With the new spikes, the area has been closed.  She is (fortunately) healthy and we all intend to keep it that way.  She is also a nurse and understands better than most what is at stake.

Most adults have a hard time with change.  My mother, however, is different.   I am not certain she always accepted change.  But she certainly navigated it.  In 1947, she graduated from an elite NYC high school but did not have the chance to go to college. In 1950, she became a nurse, a profession she practiced into her 80s and one for which she was perfectly suited.  Principally because she had a natural empathy for others, a natural aversion to judgment, and a respect for intellect and science.  Illness, especially mental illness, was a fact, not a fault.  And she nursed everyone, including herself, through it.

In 1955 she married my father, whom she loved.  He, however, loved martinis a bit more.  Which, generally speaking, ends badly.  At 50, she was newly divorced and wondered whether she'd grow old poor.  At 56, she married my step-father, who loved her (and not martinis).  They took care of each other.  At 72, however, she buried him.  Like her sister, she had had sixteen years of a very good marriage.

The only problem was . . .

It ended too soon.

Instead of resenting reality, however, she surfed it.  She always found the next wave.  And when she washed out, she always got back on the board.  She passed her dreams of higher education on to her children.  Both of us.  Gender did not matter.  She refused to assume that a bad marriage could not be followed by a good one.  Or that the death of loved ones made it impossible to be sustained by the memory of their love.  In "Landslide," Stevie Nicks asks "Oh, mirror in the sky what is love?/Can the child within my heart rise above?/Can I sail through the changin' ocean tides?/Can I handle the seasons of my life?"

She needs to meet my mother.  

Whose life is a resounding "Yes!"

Someday I'll say all of this in a church.  I hope that day is a long way away.  In the meantime, I'm reminded of another priest who wrote my mother a letter once praising her work and explaining that bouquets were not just for funerals.  

So, Mom, this Thanksgiving . . .

This bouquet's for you.








Wednesday, November 11, 2020

FLAT EARTHERS, BIG-BANGERS, AND THE RISKS GOING FORWARD

A week and a day ago, America voted and Joe Biden and Kamala Harris won.   They are now the President-Elect and Vice President-Elect of the United States.  While some votes remain to be counted and their vote totals will therefore rise, we know at this moment that 76,997,481 voted for them and that, on January 20, 2021, at high noon in Washington, DC, Donald Trump will no longer be President.

That is the good news.

Here is the bad.

71,926,283 voted for Trump.

In the wake of his defeat, Trump has for the past week tweeted out an unending string of lies claiming he won the election.  He has made unproven and unevidenced assertions that illegal votes were counted and that the election itself was marred by fraud in critical states (though never in the ones he won or, apparently, in any districts where the GOP won House or Senate seats).  He has also initiated a host of lawsuits making the same unspecified claims, all of which are now before the courts, many of which have already been dismissed, and none of which have either a basis in fact or the ability to overturn the result.

In a national opinion survey conducted from last Saturday to yesterday by Reuters/Ipsos, 79% said they believed Biden won the election, 13% said the election had not yet been decided, 3% said they thought Trump won and 5% said they did not know.  If so, 97% of the country is telling us that Donald Trump is . . .

Nuts.

It is more or less a given in this country that 97% of us can never agree on anything.  

In 2018, a YouGov.com poll found that only 84% of those surveyed "have always believed the earth is round".  Though only 2% reported they "have always believed the earth is flat", 7% reported that they had "recently" become "skeptical" one way or the other and 7% said they "weren't sure" or didn't know.  When the internet lit up with a partial report, based on that poll, that a third of millennials thought the earth flat, outrage was immediate -- only 4% of them thought that; the other 29% were skeptical or not sure.

But c'mon.

How can anyone be "skeptical" . . . 

Or "not sure".

On the flat earth question?

There are, of course, many questions on which disagreement is both predictable and to be expected.  

Take, for example, the debate between creationism and evolution.

A July 2019 Gallup poll reported that 40% of Americans believe in a so-called “pure” form of creationism in which God alone created the universe and us roughly 10,000 years ago.  In that same poll, the rest reported that humans evolved over millions of years.  23% said that happened without God's assistance.  33% said it occurred with divine assistance.

I'm in that last group.

I believe in the big bang theory.  I think the universe started gazillions of years ago in that instantaneous explosion and that ever since it has been expanding and evolving.  I believe we humans represent a mere nano-second on that time-line and that our forebears were apes.  I believe -- with Einstein -- that if we could run at the speed of light, we would . . .

Out-run time.

And I believe that when we die . . .

We finally do.

I believe all that because I also believe in God, the Divine Pre-Banger as it were, the Aristotelian first cause that set it all in motion.  I believe that all those scientific laws we keep discovering, the ones Stephen Hawking and his fellow physicists spent careers elucidating and writing books about, could not exist without Him or Her or It.  And I believe that the bright light reported by those who have approached death but returned to tell us about it is . . .

God winking.  

Inviting us to that other side of the space-time continuum.

When pressed to disaffirm any of these beliefs, I can't.  

Not because they are false.  

Or true. 

In fact, they are neither.

They are just unknowable.

The results of the recent election, however, are neither unknown nor unknowable.  Biden and Harris won.  Trump and Pence lost.  That the vast majority -- in fact, at this point all but three -- of the GOP's Congressional caucus refuse to say so does not make it any less true.  Most of them privately admit the jig is up for Trump.  They know the lawsuits are just the most recent examples of the President's narcissism and petulance, that none of them can overturn the outcome and that collectively, therefore, they are simply a waste of time.

They also know that what Trump is doing is dangerous.

Perhaps fatally so.

There is nothing inevitable about democracy or the American republic.  Unlike the Big Bang, it was an act of men, not God.  Like the Big Bang, however, it contains its own set of unknowables that make it work.  These are the unwritten traditions and norms that, over time, have stopped the experiment from imploding.  In 1800, after he lost the Presidency to Thomas Jefferson, John Adams thought he had been robbed by an ignorant mob of the unschooled and uninformed.  He could have told his Federalist supporters to hit the streets and undo the perceived wrong.

Instead, he went home to Massachusetts and cooled off.

In so doing, he birthed the tradition that power is to be transferred peacefully when the opposition beats you.  And with one exception, that tradition has been followed ever since.  By Adams's son in 1828, Van Buren in 1840, Cleveland in 1888, Hoover in 1932, Ford in 1976, Carter in 1980 and Bush in 1992.  The exception was 1860, when the South refused to tolerate Lincoln.  As a consequence, 720,000 Americans died in the Civil War.

Trump does not advocate violence but he flirts with it.  Over the past week, the violent, white supremacist Proud Boys claimed that Trump's infamous "stand-back and stand-by" pronunciamento from the first presidential debate -- a statement widely derided at the time as a presidential permission slip in waiting -- had been "rescinded", and though on-line threats have not actually materialized since then, neither have they evaporated.  

This, moreover, is the Trump way.  

Every planet of reality -- counting and certifying the vote, conceding the election, funding and undertaking the transition a mere 70 days away, the normal day to day functioning of the federal departments, even proscriptions against violence -- must orbit around the sun of his own narcissistic fictions and lies.  

If you dissent or even demur, you are removed.  

Two days ago, the Defense Secretary was fired, throwing the country's national security apparatus into disarray.  The head of the FBI, Christopher Wray, is reportedly on the chopping block.  The Attorney General remains in office, probably because he authorized the DOJ to investigate any ostensibly "substantial" irregularities in last week's vote count.  Though this was not all of what Trump wanted, it for now apparently will do.  In response, Richard Pilger, the director of the elections crimes branch in DOJ's Public Integrity Section, resigned.  Said Pilger: the "new policy abrogate[s] the forty-year-old Non-Interference Policy for ballot fraud investigations in the period prior to elections becoming certified and uncontested."

The 71.9 million who voted for Trump for the most part do not believe he won but also do not care that he has refused to concede or has settled on frivolous legal maneuvers for ends unknown to all but his embittered psyche. In this, they are behaving true to form.  They did not care about the 20,000 plus lies, or bribing the Ukrainian President to start a phony investigation of Biden, or separating the children from their parents at the border, or the Access Hollywood comments, or the two dozen sexual harassment claims.  They did not even care about his rank incompetence on Covid.

How is that possible?

Are they all flat-earthers denying reality or at least skeptical enough about it that the Trumps of the world now get a chance they would otherwise be denied and can hold all of us hostage in the meantime?

Or are they big-bangers, secure in their faith that controlling the Senate and the Supreme Court outweighs any risks that shattering long-practiced traditions entails?

I do not know.

The part of me that celebrates is grateful that 76.9 million of my neighbors got rid of Trump, a cancer on democracy.  The part of me that worries cannot forget that 71.9 million of them didn't.  I have Facebooked with them, tweeted with them, talked with them, and am even related to and love some of them.

I just do not understand them anymore.



Saturday, October 31, 2020

BIDEN AND BEYOND

This is an act of faith and an act of hope.

Faith in that fragile thing Alexis de Tocqueville called "Democracy in America".

And hope that a system structured more than two centuries ago for a different time and different place can rise to the occasion and self-correct in this time and this place.

On Tuesday, November 3, 2020, I will vote for Joe Biden for President.  I have voted in every election since 1976.  I mean all of them.  Off years, mid-terms, local, state and national.  

This, however, will be the most important vote I will ever cast.  

The Covid pandemic, which might have been avoided had the President read his intelligence briefings last December, and which certainly would have been significantly mitigated had an all-hands-on-deck policy embracing masks, testing, quarantines and tracing been adopted early on, has killed over 230,000 Americans.  It has destroyed lives, families and whole communities.  Its economic consequences defy modeling.  The best picture of what is happening is not a V, U or W.  It's a boomerang.  22.3 million jobs were lost by May.  9.3 million had been recovered by the end of September.  But a new Covid wave threatens to send us back to the future as we head toward winter.

Instead of combating the disease, President Trump politicized it.  Never before -- not with the 2009 swine flu pandemic or the 1918 flu pandemic or even AIDS -- has public health been turned into an overtly partisan issue.

But Trump did it.

He couldn't help himself.  

He never can.

And it is killing more than those tragically falling to Covid.

It is killing American democracy.

It did not have to be this way.  After Trump surprised the world -- and himself -- by winning the Presidency, everyone was willing to give him a chance.  In spite of the lying, the "pussy grabbing", the obnoxiousness, the sheer lunacy of a mouth -- as often as not -- apparently disconnected from any functioning brain.

In spite of it all.

In fact, in the aftermath of that surprise, Hillary Clinton gave voice to that willingness for all of us who had chosen her and rejected him.  The morning after the election she spoke to her own supporters. "We have seen that our nation is more deeply divided than we thought", she said, "But I still believe in America and always will.  And if you do, then we must accept this result and then look to the future.  Donald Trump is going to be our President.  We owe him an open mind and a chance to lead."   

Even then, the result was incongruous. Slivers of votes in the rural precincts of three rust belt states had sent the world and the country into a collective bout of "How the hell did that happen" and "What now?" The truth, however, was that Trump had hit a nerve.  The quarter-century, neo-liberal consensus that made economies global and elites rich contained one big flaw.  

Not everyone was being invited to the party.

And, like most who stand forever on the outside looking in, wondering why the cool kids do not want to hang with them, the outsiders were pissed.  They hadn't gotten a raise in thirty years.  Their jobs were being shipped overseas.  And they had nowhere to go.  For some, sorrow was bathed in addiction. For others, it was bathed in anger . . . 

At the PhDs who studied but never helped them.

The politicians who courted but never delivered for them.

And the elites who secretly despised them.

So they embraced Trump.

In truth, of course, Trump was one of the elites.  Indeed, one of the most elite of the elites -- private schools, inherited wealth, gold-plated penthouses and trophy wives.

But he didn't sound like an elite.  

He was ill-tempered and foul-mouthed.  He did not respect his political opponents.  He made fun of them.  He ranked them out.  He thought they were beneath contempt.  In truth, he thought they were all full of shit.

Which was pretty much what a lot of those white guys in Pennsylvania, Michigan and Wisconsin thought as well.

So they gave him a chance.

And Hillary told the rest of us to give him a chance as well.

So we did too.

And he has blown it.

Every day and in every way.

He has blown it for those of us who do not count neo-Nazis or white supremacists as among the "fine people";  who do not think our government should separate asylum seeking parents from their children;  who do not want their President bribing a foreign leader to orchestrate a phony investigation into a political opponent; who understand alliances and common cause are the only way to preserve the peace and meet global challenges like climate change; who know the Civil War ended 155 years ago and civil rights are long overdue; and who realize that  truth is not optional and lying is not a measure of authenticity.  

But he has also blown it for the very people who put him over the top.  

Despite his promises, he did not rebuild America's infrastructure.  His tax breaks for the rich did not trickle down and transform Appalachia, which still does not have enough broadband, and his tariffs have not resurrected American manufacturing or ended trade deficits.  Though his always-promised but never-delivered health care plan does not exist, he is still in the Supreme Court trying to get Obamacare declared illegal.  And he may succeed.  Despite the opposition of more than half the country, he and Mitch McConnell have packed the Supreme Court with conservatives.  The pandemic has created 9.1 million Covid survivors, and in Trump's world, they are walking pre-existing conditions that insurers will not have to cover.

I am a life-long Democrat.  I have run for Congress as a Democrat and even served for a time as a member of the New York State Democratic committee.  Trump's supporters like their man because he gets in the face of people like me.  

But instead of liking him because he kicks my ass, maybe they should stop letting him kick their own.

Joe Biden is not perfect.

But he may be perfect for this time. 

He knows that Covid does not have an address or a political party.  He knows that progress on economic inequality is a matter of policies that increase wages, cover costs and generate growth.  Instead of outrage or the Trump-like empty promise of a "big" plan "next week" or "just after the election", he offers small-bore incrementalism -- a $15 minimum wage, a public option added to Obamacare that increases the number of insured, an end to the ban that stops Medicare from negotiating prescription drug prices with pharmaceutical companies.  It may not be fancy.  But it will work.

Biden has always been ambitious. 

But he has never been an elitist.  

Or a phony.

He does not have to tell us he feels our pain.

Because too often in his own life he has experienced it. 

America is at a crossroads.  

If Trump is re-elected, it will almost certainly be without a popular vote majority and possibly because he and the GOP have suppressed the vote and succeeded in making sure that some of it is not even counted.  They have enlisted an army of lawyers who stand ready to march into close states in an effort to stop any post-election day counts of mailed-in ballots.  Different states count different votes at different times.  Some are counting the early vote now.  Some aren't.  Some count the election day vote first.  And because Republicans have cast fewer early or mail-in votes than Democrats, at least one of Trump's long-term operatives, Steve Bannon, has said the President plans to declare victory on election night when he may be leading.

If this comes to pass, Americans should hit the streets.

Every vote must be counted.

And no winner should be declared until every vote is counted.

If anything other than that happens, American democracy dies.

Don't let it.









Friday, October 16, 2020

ALL OF THAT -- SUPERPRECEDENTS, SUPERHEROES, AND THE SUPREMES

The Senate Judiciary Committee is about to approve, along party lines, the nomination of Judge Amy Coney Barrett to the Supreme Court.  Once sent to the full Senate -- probably before the presidential election that is mere weeks away and for which voting in many places has already begun -- her nomination will again be approved along party lines.  She will then be immediately sworn in and become the 115th person to sit on the Supreme Court.

Barrett, currently a judge on the US Court of Appeals for the Seventh Circuit, is what is generally referred to as an originalist and strict constructionist.  This means she claims to interpret the Constitution and its amendments as written and construed by the framers and believes it is illegitimate for unelected justices to impose their own policy or jurisprudential preferences on that document.   

It is widely expected -- and with more than reasonable basis -- that Barrett will vote to overturn Roe v. Wade, the Court's 1973 decision legalizing abortion.  She has written that Roe is not a "super-precedent", which is how originalists protect decisions -- like Brown v. Board of Education -- which originalism would otherwise preclude but which they do not want to disturb (because they know that, in doing so, they'd kill their movement).  She also clerked for Justice Antonin Scalia -- the dean of originalists, who opposed Roe -- and is one of his more aggressive admirers.  

In 2006, she actually signed on to a two-page ad sponsored by a group calling itself the Right to Life of St. Joseph County. The ad was run on the anniversary date of the Roe decision.  It opposed "abortion on demand" and supported "the right to life from fertilization to the end of natural life."  And since becoming a judge on the Seventh Circuit in November 2017, she has joined dissents upholding laws banning abortion for minors absent parental notification, banning abortions if they are pursued for particular reasons, and requiring the burial and cremation of fetal remains.

If, as is likely, Barrett votes to overturn Roe, it cannot be saved.  Justices Breyer, Sotomayor and Kagan will preserve the precedent but everyone else now on the Court is on record -- one way or the other -- as opposing it. Chief Justice Roberts, an institutionalist worried about a politicized Court, may attempt some triage in the form of declining to hear cases where the issue might arise or deferring to lower courts where possible, but even these tactics will have limited utility. Because, with Barrett on the Court, it will have six dyed-in-the-wool conservatives and originalists.  

Under the Supreme Court's rules, only four are required to accept a case and five can stay a lower court order.  

So, even if Roberts wanted to hit the brakes, he will not be driving the train anymore.

What will be lost when Roe is killed?

In the overheated and tribal world of today's American politics, what is at stake with the destruction of Roe is generally missed.   This is because, though everyone knows the headline result of the case, almost no one outside the academy has read the decision and even fewer understand the basis for the right to privacy on which it rests.

That's too bad.

Right wingers would have us believe that Roe was conjured out of thin air.  For decades, states had made abortion illegal, and though these acts of legislative fiat never killed the practice (and actually wound up killing many women), neither had the practice killed the laws.  Then, in a grand act of illegitimate judicial hubris, the Supreme Court upended it all.

This is the story they tell and have told for forty-seven years.

And it is a lie.

Here are some facts.  

In the late 1960s and early 1970s, when Roe was litigated and making its way to the Supreme Court, a number of states were repealing their anti-abortion laws.  In 1970, New York made abortion legal through the twenty-fourth week of pregnancy. By 1971, Alaska, Washington and Hawaii had decriminalized early term abortions.  By the same time, thirteen other states permitted abortion if the mental or physical health of the mother was at risk.  Even Ronald Reagan, a conservative stalwart, signed into law California's bill doing so in 1967.  The trend was decidedly in the direction of abandoning criminalization as a policy.

Critics, including the late Justice Ruth Bader Ginsburg, think Roe short-circuited this trend by removing the issue from the political branches across the country and resolving it immediately and for everyone.  And they may be right. Roe single-handedly created the Right to Life movement and turbocharged strict constructionism on the bench and in the academy.  The opinion itself was also long, and in surveying the history of abortion the world over, summarizing the opposing medical and ethical views, and then fashioning its three-trimester rule where the scope of the right changed over time, it struck many as more akin to a statute than a Supreme Court decision. 

But, apart from the politics and the presentation, was Roe wrong as a matter of Constitutional law?

The answer is no.

The most important passage in Justice Blackmun's opinion in Roe occurred seventy-six pages into it.  There, Blackmun noted that, though "[t]he Constitution does not explicitly mention any right of privacy", the document makes no sense without it.  

He didn't say it that way.

But I have.  

Because what he did say -- in noting the "zones of privacy" the Court had by then found to exist as  a consequence of the First Amendment (guaranteeing free speech or none at all), the Fourth and Fifth Amendments (protecting against unreasonable searches and seizures, self-incrimination, and deprivations of liberty without due process), the Ninth Amendment (reserving unenumerated rights to the People), and the Fourteenth Amendment (which, for decades, had been held to protect fundamental rights with respect to marriage, education and child-rearing) -- made it clear that the document as a whole collapses without it.  

Put differently, there are certain rights inherent in being an individual, rights that exist regardless of the state and therefore only carefully and narrowly subject to regulation by it.  Jefferson eloquently said these were the self-evident rights of life, liberty and the pursuit of happiness.  To that, the less eloquent among us have added the equally important right . . .

To be left alone. 

Especially north of the knees and south of the navel.

And this, unfortunately, is what the strict constructionists and Judge (soon to be Justice) Amy Coney Barrett do not get.  

They think they can jettison Roe and the world will go confidently on its way. But it won't. Because once you have decided that the Constitution does not contain the right to privacy set out in Justice Blackmun's decision, a host of other questions arise, none of which have acceptable answers.  For example, if there is no right to privacy, is Griswold v. Connecticut -- the case holding that states could not make the purchase of birth control illegal because doing so violated the right to "marital privacy" -- still good law?  How about Loving v. Virginia, the case that declared anti-miscegenation statutes unconstitutional?  Or Frontiero v. Richardson, the case that overturned gender-based housing allowances in the military as violative of both the equal protection and due process clauses? Or Lawrence v. Texas, the 2003 decision holding anti-sodomy statutes unconstitutional?  Or Obergefell, the case that only recently held statutes forbidding gay or lesbian marriage unconstitutional? 

As with Roe, the Constitution did not mention any of the rights upheld in those cases either. Nor did any of the framers -- either in 1787 when the Constitution and Bill of Rights were written or in 1868 when the Fourteenth Amendment was passed -- have any of these subjects in mind when they put pen to paper and created the foundational documents.

And it gets even worse.

Originalists like Judge Barrett take refuge in the notion of so-called "super precedents".  These are decisions from the Supreme Court ostensibly so accepted that they will never be overturned regardless of their unsteady moorings in the originalist version of what the Constitution requires or protects.  According to Judge (soon to be Justice) Barrett, the usual suspects for this protective shield are two opinions from the Marshall Court in the early 1800s -- Marbury v. Madison, which announced the principle of judicial review, and Martin v. Hunter's Lessee, allowing federal judicial review of state court judgments; Helvering v. Davis, upholding the Social Security Act; the Legal Tender Cases, which made government-issued paper money constitutional; Mapp v. Ohio, which made the Fourth Amendment's ban on unreasonable searches and seizures applicable to the states; Brown v. Board of Education, which held segregated schools unconstitutional; and the Civil Rights Cases, which held the Fourteenth Amendment applicable only to state action.  

None of this, however, is true.

There is no statute, legal principle or Supreme Court authority that in any way supports the view that there are any cases known as "superprecedents."  The term was invented by a couple of law professors in the 1970s and then promptly went into hibernation until 2005, when Sen. Specter tossed it to John Roberts as a sort of lifeline in the latter's confirmation hearings. Like other conservatives, Roberts fashions himself an originalist.  For him, Supreme Court justices are umpires calling balls and strikes, not players in the game.  They get to apply the Constitution as written and as understood when it was written.  Unfortunately for him, when the Fourteenth Amendment's equal protection and due process clauses were added to the Constitution, no one thought that either of them forbade segregation in public schools or in any other accommodations.  

Which created a problem.

For originalists.

Who knew their interpretive scheme would die if it killed Brown.

So conservative legal scholars invented the deus ex machina  of "superprecedents", a sort of jurisprudential  Batman and Robin that protects originalists from . . . 

Themselves.

Since then, others have jumped on the superprecedents train, Judge (soon to be Justice) Barrett among them.  

And her view exposed the whole notion for the canard it always was.  

Super precedents, she noted in a 2013 law review article, were "cases that no justice would overrule, even if she disagrees with the interpretive premise from which the precedent proceeds."  She then explained how an opinion became one: "The force of so-called superprecedents  . . . does not derive from any decision by the Court about the degree of deference they warrant . . . The force of these cases derives from the people, who have taken their validity off the Court's agenda. Litigants do not challenge them.  If they did, no inferior or state court would take them seriously, at least in the absence of indicia that the broad consensus supporting a precedent was crumbling.   When the status of a superprecedent is secure -- e.g., the constitutionality of paper money -- a lawsuit implicating its validity is unlikely to survive a motion to dismiss.  And without disagreement below, the issue is unlikely to make it onto the Court's agenda." 

This, however, creates more problems than it solves.  

If superprecedent status is simply a matter of public acceptance, what is the status of a Court decision before it reaches that point?  Brown was widely disparaged throughout the South when it was initially decided, and school desegregation was not fully enforced there until well into the Nixon administration more than fifteen years later.  Could the decision have been overturned during that period but not thereafter?  Is the reverse true as well?  According to Barrett, once a decision is no longer widely accepted, it becomes fair game.  

None of this justifies originalism.  Indeed, to the contrary, the whole notion makes a mockery of originalism's  interpretive commitment because it ties the legitimacy of the Court's opinions not to the Constitution's text or the meaning assigned to that text by its drafters but rather to the whims of public opinion.  

For years, conservatives complained that justices should not be acting like legislators. 

Now they are telling them to act like pollsters.

At her confirmation hearings this week, Judge Barrett was careful in claiming that Roe is not a superprecedent.  Neither, apparently, are Griswold, Loving, Frontiero, Lawrence or Obergefell. But why not?  If you surveyed Americans today, you'd get super-majorities opposing any effort by legislators to ban condoms.  You'd probably get close to that if you suggested bans on interracial marriage, and even though gay rights and gay marriage as constitutional rights are relatively recent phenomena, significant majorities support them as well.  Ditto for opposition to gender-based discrimination in statutory benefits. Even with Roe, a solid majority of Americans do not want to ban abortion other than in cases where the mother's life is endangered.  But once (not if) Roe is overturned, snap-back and trigger laws throughout the country will make that the law of the land in almost half the states. 

At the hearings this week, Barrett appeared to depart slightly from the position taken in her 2013 article. In response to questions, she testified that "Roe is not a superprecedent because calls for its overruling have never ceased." Presumably, this saves Griswold and Loving. But not Roe or Lawrence  or  Obergefell.  Even though all of them are premised in whole or part on the right of privacy.

In truth, however, it saves nothing.

Privacy is not a whim.

It's a right.

There is nothing super about Constitutional protections that live or die on the basis of public acceptance.  The whole purpose of the Bill of Rights is to protect minorities from majorities and the whole point of judicial review is to enforce those rights even when majorities do not want to and even after "calls for . . . overruling" them have not ceased.  And in times, like these, of division and populist outrage, those protections become even more important. 

Public opinion is malleable. 

It can be ephemeral.  

And as has been proven in the wake of the election in 2016, it can also be dangerous.

We live in an era of populist outrage.  White grievance has metastasized, public health has been politicized, and even the lives of public officials have been jeopardized.  Originalism created none of that.  But if, as is likely, it takes hold of the Court, it could cripple an institution that might otherwise invoke -- to echo Lincoln -- the better angels of our nature. Now is not the time to embrace the fiction of superprecedents because we are unwilling to reject the farce of originalism.  

The Constitution was never set in stone.  It was written by lawyers steeped in common law.  They used words like "due process" and "equal protection", the meanings of which were not fixed then, had not been fixed before, and were not expected to be fixed in the future.  They expected judges to be jurists, to mine the inheritance of common law in an effort to divine principles that could guide an ever-changing world.  They did not expect them to be legislators, engaged in the rough and tumble of everyday politics, with the trade-offs and compromises that entails and the squared circles it often creates.  But they also did not expect them to be pollsters or the tools of aggressive litigants.

They thought judges should be better than . . .

All of that.

So, Judge (soon to be Justice) Barrett, if you can . . .

Be better.

If you can't . . .

Be quiet.

 





Sunday, September 20, 2020

YOU WERE THE BEST OF US

Two wrongs don't make a right.

Right?

Not quite.

Ruth Bader Ginsburg died this past Friday. In her twenty-seven years on the Supreme Court, forty years as a judge, and sixty years as a lawyer, this waif of a woman -- five feet tall and a mere hundred pounds -- was to the legal battle against discrimination based on sex or gender what Thurgood Marshall had been to the legal battle against discrimination based on race.

Namely, its giant.  

In the 1970s, initially as a law professor and then as the head of the ACLU's Women's Rights Project, she orchestrated a step-by-step approach that had the Supreme Court first hold (in 1971, in Reed v. Reed) that the equal protection clause barred discrimination based on gender and then begin to systematically apply that holding to reverse the stereotypical realities embedded in the world in which she had grown up and lived.  One by one, and at her careful urging, they fell -- sex-based discrimination in laws regulating military housing allowances, Social Security survivor benefits, state-regulated drinking ages, and rules on who could opt out of jury service or administer an estate.  

Her approach was as strategic as it was enlightened.  

A number of her clients, the "victims" of gender-based discrimination, were men.  

As were, at that time, all the Justices on the Supreme Court.

In 1980, President Carter appointed her to the federal Circuit Court of Appeals in Washington, DC. In her thirteen years there, she developed a reputation as a measured and cautious jurist. She sought consensus. Colleagues with whom she disagreed were not thought of as  opponents or enemies. To the contrary, she befriended two of the most conservative among them, Robert Bork and Antonin Scalia, and became a lifelong friend of the latter.  

Their families regularly celebrated New Year's Eve together. 

And Scalia for his part was smitten.

At a lunch with his own law clerks in the early '90s, he was kiddingly presented with a conservative's dilemma -- "If you had to spend the rest of your life on a desert island with Harvard Law Professor Laurence Tribe or New York Governor Mario Cuomo," the clerks asked, "which would you choose?"

His reply: "Ruth Bader Ginsburg".  

Professor Steven Calabresi, who clerked for Bork on the DC Circuit Court while she was there, called her "a common law constitutionalist. She thinks the Court should not go too far in any given case."

He was right.  

In a much-reported speech at NYU while an appellate judge, she criticized the Supreme Court's decision in Roe v. Wade, arguing that it was "overly broad" and had "prolonged divisiveness" by "halt[ing] a political process that was moving in a reform direction".  In her view, the Court should have limited itself to holding the Texas statute (which banned abortions in all cases except to save the life of the mother) unconstitutional.  States would then have had to determine whether other regulations were appropriate, and those that passed would have been challenged in court. She thought incremental evaluation and progress was preferable to Justice Blackmun's all-at-once approach, that it would have allowed the rest of the country to catch up and stopped the Court from getting too far ahead of the public. 

She also thought it would have short-circuited the right to life movement and the rise of judicial fundamentalism.

Was she right?

We'll never know.

The Roe-inspired rise of the evangelical right suggests she was.  The fact, however, that state-based reforms were then being vigorously attacked in all the legislatures where they were moving forward (by, among others, the Catholic hierarchy) suggests she wasn't.   

In 1993, President Clinton nominated her to the Supreme Court.  Because of her reputation as a moderate and her expressed skepticism on Roe, liberals were nervous and conservatives silent.  The Senate confirmed her by a vote of 96-3.  No nomination since has beaten that number.  In her twenty-seven years on the Court, she successfully fought back efforts to curtail abortion rights to the point of non-existence, though the effort was far more successful in the earlier years (when Sandra Day O'Connor, John Paul Stevens, David Souter and Anthony Kennedy were on the Court with her) and today hangs by the merest of threads.  

Meanwhile, she continued to erase the plague of sex discrimination, open the doors to equal justice and equal rights, and preserve Congress's ability to solve national problems. In her most celebrated early decision, United States v. Virginia, she wrote the 7-1 majority opinion striking down the Virginia Military Institute's males-only admissions policy. The only vote against it was Scalia's; he said the ruling would kill VMI.  It didn't.  In Obergefell v. Hodges, she was part of the Court majority that declared gay marriage a constitutional right.  Fundamentalists predicted that this would kill traditional marriage.  

But it didn't either.  

In Ledbetter v. Goodyear Tire & Rubber Co., her most celebrated later dissent, she upbraided the brethren for running out the clock on a claim of wage discrimination where the victim, Lilly Ledbetter, was not and could not even have been aware that she had a claim. Her dissent was so persuasive that Congress later amended the statute to cure the Court's ridiculous determination.  And on the Affordable Care Act and voting rights, she was stalwart in rejecting the right-wing notion that Congress lacked the authority to legislate (health care) or that the problem had been solved (voting rights). 

After its 1992 decision in Planned Parenthood v. Casey, Supreme Court  jurisprudence on abortion shifted away from Roe's focus on privacy and toward Casey's concern that any regulations not impose an "undue burden" on women.  

The doctrinal  shift was tailor made for Justice Ginsburg.  

As a litigator, she had fought against gender discrimination on the grounds that stereotypical distinctions based on sex violated the equal protection clause. In her later years on the Court, often in dissent, she embraced a version of that same analysis in refusing to approve increasingly burdensome regulations on the legal right to an abortion. In Gonzales v. Carhart, which upheld the federal ban on partial-birth abortion even though it included no health exception, she wrote in dissent that "legal challenges to undue restrictions on abortion procedures do not seek to vindicate some generalized notion of privacy; rather, they center on a woman's autonomy to determine her life's course and thus enjoy equal citizenship stature."  She also explained that "the absence of a health exception burdens all women for whom it is relevant" and that "the reasoned medical judgments of highly trained doctors" ought not be rejected as "'preferences' motivated by 'mere convenience'".

For Ruth Bader Ginsburg, the issue in the final analysis was who decides.

And with her passing, that issue will assume center stage once again and be debated at multiple levels.  

Here are at least two.

First, and on abortion, the fundamental issue will be, as it has always been, who gets to decide when life begins. Or, put differently, who gets to decide when a person is . . .

A person.

The Roman Catholic hierarchy thinks it gets to decide, as do Biblical fundamentalists within various other sects.  According to them, life begins -- a person exists -- at conception.  

In other words, embryos are people.  

This, however, is not a position notable Church fathers like St. Augustine or St. Thomas Aquinas advocated or accepted.  To the contrary, they thought ensoulment or personhood was organic and developed over time.  Conception was not some sort of magic moment.  But today, this view is among the Catholic Church's best kept secrets. The hierarchy assiduously ignores it and instead claims to have opposed abortion for two millennia.  The problem, however, is that the Church's reason for that opposition has changed dramatically over time.  Early on, it opposed abortion because it viewed the act as a form of birth control, not because it thought the act tantamount to murder or infanticide.  Only later, and only after losing civil political power in Italy and claiming it had to be the determinative moral arbiter for the world's 1.2 billion Catholics, did it turn -- contra Sts. Augustine and Thomas -- embryos into people.

For their part, American conservatives think state legislators get to decide.  In their minds, if an unelected Supreme Court refuses to respect the wishes of any state legislature on this question, that Court usurps the will of the majority reflected in the votes of its duly elected state representatives. The Court then becomes illegitimate, a counter-majoritarian pariah. The problem with this claim, however, is that the Court in Roe and Casey did not counter or usurp any national majority. In fact, to the contrary, those decisions pretty much reflect the national consensus on abortion. What they do not reflect is the opposition to abortion in states whose local majorities constitute a distinct national minority.

Second, in view of the fact that we are now a mere forty-four days from a presidential election, there is the overriding question of who gets to decide Justice Ginsburg's replacement.

In a normal America, the answer would be self-evident. Under the Constitution, the President gets to nominate a new Justice to fill any vacancy and the Senate gets to advise and consent.   This, however, is not how things work in Mitch McConnell's world.  In his world, Presidents do not get to fill vacancies on the Supreme Court in election years.  Instead, only the next President gets to do so.  No one actually thought this was the rule until February 2016, when Justice Scalia died and McConnell made it clear that his party was not even going to hold a hearing -- let alone allow a vote -- on President Obama's then nominee to replace him, Judge Merrick Garland.

But here we are.

My own view is that McConnell was wrong in 2016, that denying Trump the right to nominate Justice Ginsburg's successor might arguably be wrong in 2020, and that generally speaking two wrongs do not make a right.  

Except when they do.

As they do . . .  

Right now.

To begin, McConnell's rule was always a fiction.  When Scalia died, Obama's term was a little short of a year from being over and the election was ten months away.  It was more than possible to do the customary investigations and hearings that now accompany appointments to the Supreme Court.  It was also not accurate to say (though McConnell and his seconds did) that Justices had not been nominated or confirmed in election years in the so-called modern era.  Supreme Court vacancies were filled in 1916, 1932, 1956 and 1986, all election years.  

So, McConnell was wrong.

More or less categorically (though one could argue that nominations and confirmations in the weeks before an election were more or less non-existent; in fact, in the country's entire history, those have occurred only twice).

Unfortunately, his wrong was not without consequences, both short and long term.

In the short term, McConnell's perversion of history has allowed Donald Trump to fill two vacancies to date and has shifted the balance of power on the Supreme Court.  The addition of a third Trump-appointed conservative will change that institution for decades.  

Trump never should have been put in the position where that was possible. And had McConnell not invented his no-Supreme-Court-appointments-in-an-election-year canard, he would not have been.  Garland would have been on the Court when Trump assumed the Presidency, and to date Trump would have been able to fill only the single seat vacated when Justice Kennedy later retired.  

But he has picked two, not one, and now wants to fill the Ginsburg seat.

If it can be, that has to be stopped.  

If it can't be, and Trump loses in November, the Democrats have to pass structural reforms that re-set the balance on the Supreme Court.  To date, many such reforms have been proposed and can later be considered.  What the Democrats cannot accept is business as usual.  

McConnell never has.

Nor has Trump.

I'm not worried about my own hypocrisy, or squaring any ethical circles, or engaging in silly debates over whether I actually understand the so-called McConnell rule.  Though Mitch is now (conveniently) saying his rule is not applicable today because it only applies when different parties control the Senate and the Presidency, that is not what he said in 2016.  

Nor am I interested in debating his claim that, since the 1880s, no Supreme Court nominee has ever been confirmed in an election year in which the Senate and the Presidency were controlled by different parties. True or not, the number of times this was even possible is not a remotely large enough sample from which anyone -- let alone a power hungry partisan like McConnell --  should be permitted to deduce or infer a rule. 

In the longer term, McConnell's pas de deux with invented rules and hypocritical reversals is killing the Senate, an institution already on life support.  Savvy critics of anti-democratic or counter-majoritarian threats understand that the Senate is one par excellence. The 53 Republicans who today control that body do not come from states even close to representing majority opinion in the nation as a whole, and the positions they advocate on health care, voting rights, gay rights, abortion, taxes, and judicial appointments are not shared by any national majority.

Sooner or later --  and my bet is sooner rather than later -- America will get sick and tired of minority rule, whether it comes clothed in Presidents who continually lose the popular vote  or in Congresses hamstrung by Senates that over-represent small states.  Trump and McConnell could have healed the wounds these structural possibilities cause by governing from the center, and Trump himself could have done so by checking his insults, ad hominems  and anger  at the door.

Neither has.

In many respects, they represent the worst of us.

So, RIP Ruth Bader Ginsburg.

In a life where brilliance was your alter ego but discrimination your companion, where a law school dean wondered why you were there and judges and law firms would not give you a job, you found and married one of the few guys who was different . . . 

And then made your country different as well.

You were the best of us.

Sunday, September 6, 2020

ONE BOY THAT SUMMER

Dear God,

This is a thank you note.

Long delayed in one case but pretty current in another.

As You know, Tom Seaver died last week.

For twenty years, from 1967 to 1986, Seaver mesmerized baseball fans, winning 311 games, striking out 3,640 batters, compiling a career earned run average of 2.86 and pitching a now-unheard-of 231 complete games.  He won the Cy Young award three times (1969, 1973 and 1975), struck out 200 or more batters in a single season nine times, threw more than 200 innings (also unheard of today) in sixteen seasons, and was a near-unanimous choice for the Hall of Fame in his first year of eligibility, winning a then-record 98.84% of the votes.

In 1970, in a game in which he tied the major league record of 19 strikeouts, he fanned the last ten batters in a row.

No starting pitcher could do that today either, mostly because they never last beyond the seventh inning.

You probably don't need me to tell You all this, You being God and all.  But on the off chance that baseball is not Your most important concern -- it is, after all, a game -- I'll continue.

In 1969, Seaver led the New York Mets to their first world championship.  There has been one since, in 1986, and three other losing appearances (in 1973, 2000 and 2015).  Two of those five trips to the World Series were improbable.  In 1973, the team was in last place on August 30 with a losing record but snuck into the playoffs a month later two games over .500, with four teams barely behind them, none of which had particularly acted like it wanted to win.  And in 2000, they made it to the playoffs as a wild card team, which is our way of turning second place into total victory (though, in truth, the wild card was probably created to generate more advertising revenue for the teams).

In any case, then there was 1969  . . .

Which was impossible.

Some even called it a miracle.  (Please advise.)

The team, true to form, lost on opening day that year.

To an expansion team, the Montreal Expos.

God, just think about that for a minute.  The baseball gods -- these are the folks who don't have as much power as You but sometimes think they do and act that way -- had decided to spread pixie dust on the hapless Mets, literal bottom dwellers in the first seven seasons of their brief existence, by scheduling them on opening day against a team that was even worse, a team that would go 52-110 that season and finish 48 games out of first place.

But the Mets lost anyway.

The bullpen gave up four runs in the eighth inning.

And things did not get much better thereafter.

They were 3-7 ten days into the season.  On June 1 they were in third place in their six-team division but still had a losing record.  On July 1 they were in second place, six games above .500, an extraordinary improvement given their sorry history but still eight games behind the first-place Cubs.  And by August 14 they had slipped to ten games behind.

Then they did nothing but win.

Which is more or less when people think You may have gotten involved.

They went 38-9 the rest of the season to finish first in their division, swept Hank Aaron's Atlanta Braves in a best-of-five playoff for the NL pennant, and then beat the Baltimore Orioles four games to one to take the World Series.  Hank Aaron is baseball's all-time home run king.  (Or was until Barry Bonds beat his record.  But Bonds was cheating, so lots of us think Hammerin' Hank is still the guy on home runs.)  And the Baltimore Orioles, who went 109-53 that year, were so good that, at the start of the World Series, no one down here gave the Mets any chance at all.

Seaver was stellar. 

He went 25-9 during the regular season, won the first playoff game, and pitched a masterful 10 innings to win the fourth and pivotal game of the World Series.  The next day, with Jerry Koosman pitching, the Mets won the whole shebang.

When the last out came, Seaver was the first player from the dugout to jump on Koosman.  Tons of other folks ran onto the field to celebrate.  In a sort of non-Biblical rapture, children ran happily screaming through schools.  Sadly, some people in Baltimore were heard taking Your name in vain.  You, however, might want to forgive them.  This was the second time in a year that a supposedly superior team from Baltimore had been beaten by a weaker one from New York, given that the Colts had lost to Joe Namath's football Jets in the Super Bowl that January, and people from Maryland probably thought You were just being unfair to let that happen again.

(Sorry for the sidebar. I don’t need to tell You who to forgive.)

Anyway, and as You also know, I was born in 1956 and grew up in Brooklyn, NY.  I was named after my maternal grandfather, whom I called "Poppa" and loved beyond words (he's with You now so please say "Hello" for me).  Poppa, also born and raised in Brooklyn, had been a lifelong Brooklyn Dodger fan.  But when the Dodgers left Brooklyn for Los Angeles in 1958, he -- like most Dodger fans in Brooklyn -- became a baseball widower.  He hated the Yankees and Walter O'Malley, the owner who took the Dodgers west.  It was thus not remotely possible for him to root for the Los Angeles Dodgers or switch his allegiance to the baseball team that remained in New York.

So, truly, Poppa was in mourning.

To say that a part of Brooklyn died when the Dodgers left is not an overstatement.  The team was the borough's identity.  The summer soundtrack in Brooklyn in the 1950s was of kids playing and Dodger games on the radio.  You could literally walk down the streets following the play-by-play from open windows, or so I was told.  For the most part, the players themselves lived in the borough's neighborhoods, certainly during the season.  And with Jackie Robinson, the team had literally changed America.

All men may not have been created equal in the segregated south (or even the red-lined north).

But they sure as hell were at Ebbets Field.

(Apologies for the "hell" in that last sentence, but there really is no other way to make the point.)

And then,  in 1958,  they left.  

And  a giant hole was left to fill.

Until 1962, when the Mets filled it.

Poppa's love of baseball was resurrected that year.  I think it was, as we say down here, an Act of Yours.  At the very least it was mystical and somewhat inexplicable.  New York had a new team.  And the new team even had two old Dodgers, first baseman Gil Hodges (who would retire in 1963 and begin managing the Washington Senators) and third baseman Don Zimmer.  The new team was in the National League and it wasn't the Yankees.

So Poppa rooted for them. 

Even though they stunk.

And I loved him.

So I rooted for them too.

Even though they stunk.

It wasn't easy.  In 1964 or so, years before Seaver arrived and Gil Hodges returned, and long before the "miracle" of '69, my parents decided to give me a baseball uniform as a birthday present.  What did I want, they asked, Yankees or Mets?  Mets, of course, I said, never wanting to be caught dead in a Yankees outfit in front of my grandfather.  Then I went outside in it . . .

And was laughed at by the best baseball player on the block.

A Yankee lover as it turned out.

In Brooklyn, no less.

Who woulda thunk that?

By September of 1969, however, no one was laughing at the Mets.  Armed with a pitching staff -- Seaver and Koosman chief among them -- that stifled opponents, their manager, Hodges (who had returned in 1968), adroitly platooned lefties against righties (and vice versa) at first base, second, third and in right field, suffered neither fools nor the lazy gladly, and made them winners.  They were in truth a schizophrenic team: the same guys were rarely in the line-up three days in a row.  I don't know if You see this sort of teamwork among humans in general (probably not these days, at least here in the US), but on the '69 Mets it was magical.  Ron Swoboda and Art Shamsky, for example, who were platooned in right field, each had about 300 at bats that year and each drove in about fifty runs.  Roughly the same was true of Donn Clendenon and Ed Kranepool, who were platooned at first base.  And of Ed Charles and Wayne Garrett, platooned at third.

Gil Hodges was what a famous author down here, Roger Kahn, called one of the "Boys of Summer".  They were all those beloved Dodger players of the late 1940s and 1950s -- Robinson, Pee Wee Reese, Duke Snider, Carl Erskine, Roy Campanella, Clem Labine, Don Newcombe, Billy Cox.  Hodges, however, was especially beloved.  He was known as the "one who stayed".  That was because, as You know, he married a girl from Brooklyn, bought a home and raised his family there.  He was also a regular at Our Lady Help of Christians, the Catholic parish in which his family lived. 

All of his kids went to the parish school.

Me too.

One of them, Cynthia, was in my class and once gave me a signed picture of her Dad. 

So I always thought she was pretty cool.

As Tom Seaver will be the first to tell You, Gil Hodges made the Mets.  He was a total professional.  And a no-nonsense guy.  When the team's star left fielder, Cleon Jones, didn't hustle on a ball hit to him during a game, Hodges walked all the way out to left field and pulled him from the game.  In baseball, just so You know, not hustling is a cardinal sin; it can cause all sorts of problems if it becomes a habit.  Needless to say, Cleon never lacked for hustle again.  Hodges was also a great teacher.  He knew what he wanted players to do but had a way of getting them to do it on their own.

(Part of that may have been owing to the fact that players were a little scared of him; at least that's what one of my friends was told years later by Art Shamsky.  You should ask. Also, and as a relevant aside, really a plea, Hodges should be in the Hall of Fame and the people down here who decide those things obviously need some help on that score. Twenty-five guys are in there who over time received fewer votes than he. In the 1950s, he led all major league first basemen in hits, home runs, RBIs, total bases and extra-base hits. He was an All-Star eight times (also more than any other first baseman).  And then, following seven seasons in which they finished either last or second to last,  he turned the Mets into World Champs, the only pre-free agency expansion team ever to do so.  Could You help here, please?)

Returning to 1969 . . .

It’s hard to put into words how special that summer was for kids like me.    But here's a little vignette that may help.

The World Series in those days was played during the day.  And the deciding fifth game started in the afternoon of a school day.  At Our Lady Help of Christians that morning, Sister Louise Claudia told our class that "Cynthia was out sick" and then chuckled.  Everyone knew Cynthia was at Shea Stadium with the rest of her family watching her Dad help make history.  Meanwhile, the nuns decided to make a little history of their own.  As the game started, they stopped class, rolled the educational TVs into the classrooms, and turned them on.

As You know better than anyone, Catholic nuns in the mid-20th century were serious about school.  They ran the places with iron fists, some of which my classmates experienced from time to time.  In my eight years at Our Lady Help of Christians, the only other time I recall classes stopping was on June 5, 1968, when we all were marched over to church to pray for Bobby Kennedy.  This was different.  This time, the nuns were stopping school for . . .

A baseball game.

But maybe they knew something we didn't . . .  

Or You do.

One of my friends is John Sexton.  He is a former president of New York University, a former dean of that university's law school, and a former high school teacher whom I met in 1973 at a high school summer debate institute at Georgetown University.  For years, he has taught a seminar at NYU called Baseball as a Road to God, and in 2013 he turned that into a book with the same title.  In it, he makes the point that baseball, like You, is often "ineffable".  The ineffable, he writes, is a window on the "sacred", on You, a "mystery, both fearful and fascinating", and is "experienced, not defined, revealing itself in moments of intense feeling."  The setting can be "a house of worship or a mountaintop or a ballpark."

For him, the ineffable was "eff-ed" on October 4, 1955, when Gil Hodges caught the last out and the Brooklyn Dodgers won their only World Series (against the Yankees), overcoming a decade of ultimate loss with their one and only ultimate win.

For me, it was "eff-ed" on October 16, 1969, when the Mets overcame seven years of loss and I, along with my classmates, slid down the bannisters at Cynthia's school, yelling ecstatically alongside our teachers, the equally ecstatic nuns.

From the school, I ran to Poppa's house.  

In the years that followed, the Mets muddled along, occasionally great, often frustrating. They won again but I don't think they were ever as miraculous or ineffable again. In 1989, another former university president and baseball lover, Yale's Bart Giamatti, wrote a book called Take Time For Paradise.  In it, he said "I believe we have played games, and watched games, to imitate the gods, to become godlike in our worship of each other and, through those moments of transmutation, to know for an instant what the gods know."

So thank you.

For Tom Seaver.

For Gil Hodges.

And for the '69 Miracle Mets.

Sincerely,

One Boy That Summer.