Blog

1,000 Days of Psychic Torture

Of no other U.S. presidency have journalists, academics, public servants, ethicists, moralists, psychologists, bloggers, activists, artists, minorities, teachers and students, old and young, earnest and passive (indeed, all who are invested in the nation’s welfare and their own) counted the days — the creeping petty pace — as Americans have counted those of the Trump presidency. We’ve kept the tally like a drumbeat, desperate for the terrible progression to cease.

For nearly three years Brian Williams, host of The 11th Hour with Brian Williams on MSNBC, has begun his nightly newscast with a grim reminder of l’enfant terrible’s tenure: “Good evening on this, Day 866 of the Trump presidency...”

For nearly three years these words have tortured the national psyche like the relentless drip of Chinese water torture.

Today that relentless drip is joined by rage’s blue fire and frustration’s plea for salvation. For today is Day 1,000 of the Trump presidency.
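For anyone keeping a tally of their own, the count is simple calendar arithmetic. Here is a minimal sketch in Python, assuming (as these on-air counts conventionally do) that Inauguration Day, January 20, 2017, counts as Day 1:

```python
from datetime import date

INAUGURATION = date(2017, 1, 20)  # the swearing-in, counted here as Day 1

def presidency_day(today: date) -> int:
    """Day number of the presidency, with Inauguration Day as Day 1."""
    return (today - INAUGURATION).days + 1

print(presidency_day(date(2019, 10, 16)))  # -> 1000
```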

Let reflection on 1,000 days of Trump remind us to reflect more, much more, on 2,000 years ago — when someone of great mind, great empathy, and great common sense summed up the mindless, the oppressive, and the ignorant like this: “Forgive them, for they do not know what they are doing.”

Guilty! (Well, Probably)

Alleged is a beautiful word and an ugly word. When it is ugly, alleged seems to play unfair, cheating the just of justice. (Alleged is a judge’s word, one that at times seems to show no fervor for justice.)

Alleged amplifies the frustration of the guy who “knows” it was the problem neighbor who cut down the tree, yet has no case without a witness or the wrongdoer’s calling card: evidence.

Thus is alleged an ugly word in its every application to a president Congress “knows” is guilty of obstruction of justice, guilty of public corruption, and guilty of other abuses of power — yet dithers over impeaching.

The difference between Trump and that problem neighbor is that there’s no alleged about Trump’s guilt: It manifests in open view.

So what’s up with the spineless Democrats who refuse to get off the pot and impeach the criminal in the White House?

Ugliest of all is that these Democratic holdouts seem as worried as their Republican nemeses about offending the bum.

Bimbo

A TV news anchor (for the record, a woman), having just introduced a segment about a zoo gorilla, giddily turns to her co-anchor (for the record, a black man) and declares: “He kind of looks like you!”

Across the U.S. an explosion of sentiment condemns the sensibility-starved newswoman as every kind of jerk, from bigot to ignoramus. Somewhere in the Midwest, this assessment ensues:

Person 1:
I wonder who hired this bimbo.

Person 2:
Bimbo? Clearly you don’t realize that the word is a misogynistic putdown.

Person 1:
It can be. However, the universal definition of bimbo includes the meaning, “a foolish, stupid, or inept person.” Bimbo is not gender-specific; it may be used, correctly, to describe women and men.

Person 2:
I beg to differ. The definition of bimbo, based on common usage, is “a foolish, stupid, or inept woman.”

Person 1:
Your assertion of common usage — that is, of a cultural definition — is relevant: the rationale being that repetitive, widespread use of a word over time, in a given application, leads to common understanding and acceptance (e.g., the word bad, into which, in the ’80s, Michael Jackson infused an urban meaning: “hip, awesome, cool”).

However, cultural definitions pass the test of meaning only so far: a word’s cultural definition cannot usurp its universal definition.

Given the TV anchor’s cavalier comment about her co-anchor’s looks, it might have been more appropriate for me — but only for the sake of enhanced clarity — to describe her rather as an oaf or boor. However, I wholly stand by my choice of bimbo.

The reasons:

  • My use is acceptable; i.e., it cannot be disputed as incorrect.
  • To use the language wonderfully is to use it for full effect. My use of bimbo achieves this goal, because it aptly describes the foolish TV anchor, and does so full-frontally, boldly.

The English language is at once maiden and shrew: accommodating and graceful, yet a head-scratcher for its mood swings. Ultimately its beauty is in its fine lines: its nuances.

I’ve enjoyed our debate about bimbo, and hope that my clarification of the word’s elasticity has helped bolster understanding.

In closing, allow me to use bimbo correctly in a sentence: “President Trump is a bimbo.”

Migrant Migraine

Our current illegal-immigrant problem — our refugee problem — is complex and harrowing and difficult to square, because it exists in a modern world of interminable complexities.

There was a time, not long ago on the human timeline, when mapped borders were invisible and Trumpian walls weren’t yet imagined: when human beings could come and go as they pleased, without restraint. But that was before immigration laws — and before our bloated numbers necessitated these laws.

We’re squeezed, folks; we’re running out of space. (“The U.S. is full of space; just look at the National Parks,” my detractors say. To which I respond, “Our rapacious hunger for development paves over hundreds of thousands of acres of open land every year, and our National Parks are so overcrowded now that they look more like human zoos than wildlife sanctuaries of natural beauty.”)

Frustration over the immigrant flux has less to do today than it did just 20 years ago with fear of aliens taking our jobs. Today’s frustration stems more from deep anxiety about how to manage, and live comfortably with, untoward numbers — millions — of migrating people, with no tapering-off in sight.

Besides suffocating us, the probably endless migrant flow across our southern border, combined with indifference to managing our sovereign numbers, threatens to render null and void our once manageable immigration laws — and our ability to care.

What's Wrong With 2019 America

Many excuse President Trump and his followers on the notion that “we’re all entitled to our opinions, and so are they.” Dead wrong. Most of Trump’s, and his administration’s, “opinions” (ethical, moral, and policy viewpoints) are — plain and simple — wrong. Wrongheaded. And wrong for America.

In assessing White House [soon-to-be-ex] press secretary Sarah Huckabee Sanders’s performance, for example, some Americans excuse its encumbered biases — its partisan pandering to the reactionary policies of her boss, the president, and their party, the GOP — as the entitlement of a disparate viewpoint.

One New York Times letter-writer even noted recently, “[Against those with whom you disagree], you could start a shouting match, grace them with a withering stink eye, or you could just be nice to them.”

Without argument, the egalitarian sentiment in this writer’s apparent proposal, “just be nice,” works well in most of life’s daily social proceedings.

But it doesn’t work well in Trump America — and shouldn’t.

In every respect, in every analysis, Trump’s (and Sanders’s, et al.) notion of what’s good for America — climate destruction, big-business pandering, health-care apathy, ally alienation, corruption, lies, condescension, etc. — is Not good for America. It’s only good for them.

To excuse the likes of Sarah Sanders (and Trump) is not about excusing differences in viewpoint. Rather, it’s about excusing self-righteousness, prejudice, indecency, narcissism, cheating, belligerence, corruption,...

This simple, basic truth encapsulates what is dead wrong about the Trump presidency.

America the Prudish?

The light caress and kiss on the head mark an unfortunate predicament for a lively, gregarious old man for whom such public gestures are as naturally — yes, naturally — rooted in his generation’s social norms and mores as they are in his own heart: offered in kindness, without motives.

Vice President Joe Biden is famous for the idiosyncratic ways he shows appreciation to others: a social style that sometimes nudges barriers and, alas, has been deemed, in 21st-century parlance (just like Christmas Nativity scenes), wholly out of line.

I’m not condoning the extent of Mr. Biden’s show of admiration for public servant Lucy Flores; however, I’m not condemning it, either. For clearly his admiration did not signal a desire to make advances. He did not pat a butt.

Instead, Joe Biden merely pressed a shoulder and kissed a head: in appreciation. Comparable, one could argue convincingly, to male dignitaries in certain foreign countries greeting each other with kisses on the mouth.

So then, how much affection is too much these days?

In the U.S., it seems it’s anything more than a handshake.

The Elusive Wish

In her eloquent eulogy today to her father John McCain (who died on August 25), Meghan McCain was mostly correct in her assertion that the U.S. does not need to be made great again (as President Trump insists), because America has always been great. But she was not entirely correct.

I love the U.S. However, it is a difficult love, because the U.S. is as imperfect as the American war hero who was rightly held captive by a people who were wholly innocent of crimes against our nation, and thus undeserving victims of the U.S.’s aggression against them.

John McCain went on to do great things for his country, and became a friend of the Vietnamese. But his name will always be a metaphor for the exceedingly complex moral burden that must be borne by “the most powerful nation on earth.”

America is great in concept. But its ceaseless divisions, especially those now aggravated by our rogue president — and some 40% of Americans who worship him as fellow bigots and xenophobes — make America less than great in its daily applications.

Meghan McCain’s eulogy to her father was beautiful and spot on in its accurate testament to John McCain’s greatness. But a week from now it will be recalled as just one more fragile wish for a nation where too many wishes get blown away with the wind.

Juror 419

The last time I reported for jury duty (this month) my number was among the first to be called. Along with 71 other juror candidates, I was to be considered to hear the case of a husband and wife, early 40s, who were charged with crimes now epidemic in the U.S.: sexual assault, sodomy, and distribution of child pornography.

The man was accused of sodomizing a seven-year-old neighbor boy. Together, husband and wife were accused of first-degree sexual assault and distribution of child pornography, including a video of the episode with the neighbor boy.

Before the end of this interminable day, 12 of my peers would be selected for the jury box. Two additional souls would be named to suffer as alternates.

Disturbing as a case like this is, to test our fitness to serve without bias we were grilled with many questions, some of them uncomfortable: “Have you, a family member, or someone else you know, been a victim of sexual abuse?” The prosecuting attorney continued: “In your responses, please state only who was abused, for instance a sister, a friend — no names please — and whether the victim filed a police report.” One woman who stood said that her niece had been raped. Reminded to state whether the niece had reported the abuse, the aunt said, “She did not. She was murdered.”

In all, more than 20 responded yes to the attorney’s inquiry about sexual abuse. Of those, six beseeched the judge to allow them to share their stories privately.

Before this grating, intensely emotional part of the six-hour jury-selection process mercifully ended (having taken up the entire third hour), the prosecuting attorney asked to interview one more person.

That ‘one more person’ was I.

“Juror 419, please stand,” said the prosecuting attorney.

We were all seated on the benches where the public would gather during the trial, set to begin the next day. From the second bench, left of the center aisle, I rose.

“Mr. Vanderbeek, I understand you’re a writer. What do you write?” asked the attorney.

“Literature,” I answered, “fiction, nonfiction.”

“Have you written about sexual abuse?” asked the attorney.

“Yes, a short story,” I answered. “Perhaps coincidentally, it was recently published in a national literary journal.”*

[A hush overtakes the courtroom.]

“Would you please expound, Mr. Vanderbeek.”

“By expound, ma’am, do you mean—?”

“I mean, would you describe, briefly please, what your story is about.”

“Surely,” I said. “The tale concerns a priest and sexual abuse of children.”

[Commotion now. “Please, people,” warns a bailiff, making the dropping hand gesture to quiet down.]

“And did your story require research?” asked the attorney.

“Yes,” I answered.

“Thank you, Mr. Vanderbeek. No further questions.”

As I retook my seat I glanced peripherally at the old man beside whom I had sat for three hours (and would for three more). Up to that point we had endured the proceedings nearly shoulder-to-shoulder, yet had kept to ourselves, occasionally shuffling, otherwise maintaining silent composure. Yet at precisely the instant that my interview ended, whether in seeking comfort — he had been among the half dozen who requested private meetings — or in response to my testimony, the priest, resplendent in his impeccable black cassock and white clerical collar, suddenly wheezed and moved a good distance away.

Persistent Petty Partisanship

Ah, partisanship — America’s new-millennium pie: sour, unpalatable, constantly in your face. Now enter the latest mud fight, over Confederate monuments.

No matter to those who would tear down these artifacts that many were erected to courage, not disunion; no matter that those before them rationally kept any enmity to themselves (there being worthier issues to protest); to these 2017 partisans, the “symbols of racism” must go.

To demand total removal of Civil War monuments, as many of the sudden critics are doing, is at once to stereotype intent and demean history. In southern cities from Charlottesville to New Orleans and on to St. Louis, mayors and city council members still aching over Dylann Roof’s 2015 murder of nine African-Americans at a Charleston, SC church are using this white supremacist’s atrocity in their latest effort to ease Southern shame over the Treasonous War. The same movement that has demanded removal of the Confederate flag from public buildings is now demanding removal of Confederate monuments from public parks.

In New Orleans, workers wearing bulletproof vests have already vanquished the city’s Robert E. Lee monument. To celebrate the Confederate General’s dismissal, Mayor Mitch Landrieu made a big poetic public speech in which he asked, “How can we expect to inspire a nation when our most prominent public spaces are dedicated to the reverence of the fight for bondage and supremacy?” Reverence? Hardly. The mayor continued: “There are no slave ship monuments, no prominent markers on public land to remember the lynchings or the slave blocks.”

In such remarks, generalized and skewed, the speaker dismisses the Civil War’s useful lessons in favor merely of vilifying its transgressions. Considering Landrieu’s remark about slave blocks, slave ships, and African-American lynchings, one tries, without success, to conjure any democratic society that displays these kinds of memorials.

In Charlottesville, VA, city council members suddenly want the city to sell its hundred-year-old statue of Lee, because it was donated by a segregationist. (In deference to small-mindedness, the council’s demand makes no concession to Lee’s unimpeachable integrity, nor to what to do, if anything, about the name of the park in which the monument stands: Lee Park.)

To compound this partisan nonsense, most of those calling for total removal of Civil War monuments from public view are suggesting that they be relocated to museums — you know, those institutions designed for public view. (What next? Insistence on relocating Jefferson Davis’s house to a museum? And what about World War II’s concentration camps? Why are they still around?)

There is consolation in the fact that Henry Shrady, who sculpted Charlottesville’s Lee (on his horse Traveller), had previously completed a comparable monument to Union General Ulysses Grant: proof that, in fairness to historical preservation (and scholarship), if the image of one man is preserved in bronze and stone, that of the other must likewise be.

At issue with the partisan rallying cry against Confederate monuments is that it wholly rejects the contrasting viewpoint. An argument has been made in Memphis, TN, to remove the statue of Confederate General Nathan Bedford Forrest (astride his horse King Phillip) from the city’s Health Sciences Park on the grounds that he was a prolific slave trader and first Grand Wizard of the Ku Klux Klan. Given such logic, who next should be toppled? Caesar, bully of the Gauls? Napoleon, propagandist and plunderer? Benedict Arnold, traitor, egotist?*

Regardless, one presumes no rational person would act as the mayor of St. Louis, MO, who, keeping up with the Joneses, suddenly insists on removal of the city’s Forest Park monument to young Confederate soldiers marching off to war — a gift of their mourning mothers and grandmothers (the Ladies’ Confederate Monument Association).

Mayor Landrieu’s framing of the removal of the Robert E. Lee statue in New Orleans (and, by way of association, of all Confederate statues) as a national moral imperative adds fuel to my millennial fear that America is losing her soul. The suddenness of this imperative and its ox-pull on a legion of Americans suggests that hysteria is usurping rationality and thoughtful sensibility.

My allegiance to preserving Civil War monuments where they stand has nothing to do with partisan zeal, but rather, wholly with a mature adult’s prudent sense of decency.

Yes, remove the Confederate flag from public spaces; it is the creation of a nation founded by white supremacists, a nation that neither exists within, nor along with, the United States. But honor the Civil War monuments.

Along with the written, illustrative, and photographic records of the Civil War, its monuments — masterly artworks and historical markers — should remain right where they are.

Meanwhile, a good start to dismantling the partisan bitterness may be for everyone to stop referring to them as monuments. Call them statues.

  • In Saratoga National Historical Park, Stillwater, NY, stands a monument to the Revolutionary War American traitor Benedict Arnold (nowhere upon this small stone monument is Arnold’s name inscribed). The Boot Monument, as it is called, depicts a boot worn by the General during the Battle of Saratoga, during which his left foot and horse were shot almost simultaneously. The foot survived; the horse did not.

The United States of Burnout

The two-sided response to norms and mores (ideas) that today is ripping our nation’s social, economic, and political fabric resembles the oppositional forces that divided neighbor from neighbor and culminated in America’s Civil War.

Then and now, animosities created two camps: one of persons accepting of change (then: mechanizing Northerners, now: progressives), the other of persons suspicious or disdainful of change (then: Southern “aristocrats” and slaveholders, now: conservatives). At both times, society’s increasing complexity has been the engine of angry partisanship.

Just 50 years ago, the issues that are today’s raging adolescents were innocent babes. For the most part, rural Americans didn’t feel left behind and desperate, because agri-conglomerates hadn’t yet swallowed their family farms; obesity was not yet a prevailing problem for children, because malls, big box stores, and mega-highways had not yet usurped open spaces for play and exploration; young truants and derelicts weren’t as numerous, because parents set the rules, and set aside time each evening for the family to break bread together; the tragic struggle for equal rights and acceptance hardly existed for LGBTs, because most were still in the closet; women weren’t crying out for equal pay in the workplace, because most still worked at home; before Roe v. Wade, abortions were mostly handled privately, and without stigmatization; globalization was a word hardly uttered, because the world was still small;...

Most respected scientists who study the brain have concluded that humans are not wired to multitask successfully. Maybe this incapacity holds the key to our present partisan animosities: we’re all burned out.

The Morning After

Surely a bully liar misogynist xenophobe like Donald J. Trump could never win the U.S. presidency, the most respected job on the planet. (I was so sure he couldn’t that I wrote his “obituary” before the vote count had even begun; see OBIT 11/8/2016.) But Trump did win. More than 61 million adults voted for this childish brat to preside over our fair nation for the next four years.

I was shocked, of course, along with tens of millions of other adults who’d learned well enough in their formative years about right and wrong to understand that voting a prejudiced potty-mouthed narcissist to the highest office in the free world would have been to set a canker on their own character.

Donald Trump has a little more than two months to prepare for the presidency. Meanwhile, his victory has already put reason, logic, and fair play on notice. We who prize the qualities that make adults adults fell last night into a rabbit hole and woke up this morning in Trumptyland.

To his wayward worshippers, Trump may prove a worthy president.

But he is not my president.

OBIT 11/8/2016

Donald Trump was a log on a bump
(or to say it dif, a bumptious old grump).
Never knew a girl he wouldn't hump
or a reasonable man he could ever stump —
so down fell Trumpty, thump, thump, thump:
consigned by History to the Incorrigible's Dump.

History’s Footnote:
The sorry state of “the blind leading the blind” must rank as one of humankind’s most pathetic/idiotic predicaments, for it is self-destructive, leading to no positive outcome, only certain loss for the demented leader and the loony followers. (If the hundreds of thousands among Hitler’s pathetic/stupid supporters who ended up getting obliterated by Allied bombs could speak to the living, they’d say, “Yes! Yes! It is true: I was a pathetic idiot!”) Mob mentality — which holds, as the great Hannah Arendt wrote, that “truth was whatever respectable society had hypocritically passed over” — encapsulates the pathetic stupidity of Donald Trump and his rage-driven supporters, and their certain loss in today’s presidential election.

The Big Dump: Life After Trump

Thank goodness for last week’s gift of Donald Trump’s 2005 foot-in-midsection misogyny video, which has effectively demolished his bid for the presidency. Soon our nation will have its first woman president. Supreme Court nominee Merrick Garland will finally have his day in court. Adult-acting adults will thrive again. And all moral, balanced people (adults and children alike) who’ve had to endure Lewd Lout Trump will breathe easy: eyes wide open again, ears unclogged of incessant incivilities.

One wonders, after voters have thoroughly dismissed him on November 8, whether Trump will go quietly into the night. Easier to predict is how this oaf will be remembered. The obvious answer: not favorably.

Because Trump will be remembered not just as a blowhard, but also as a principal cause and symbol of the GOP’s collapse, a new word should enter the lexicon: “trumpish” — an adjective describing those who are devoid of decency, empathy, humility, and other basic moral traits.

For starters, trumpish may be used to aptly describe Speaker Paul Ryan, Senators Ted Cruz and Mitch McConnell, and every other GOP congressional egotist of the early 21st century who tried — unsuccessfully — to hijack women’s rights, stiff the poor and the middle class, censure LGBTs, deny Barack Obama’s U.S. citizenship, shut down the government, rape national parks of their natural resources, deny that humans have caused climate change, etc., etc., etc.

As trumpish Trump and his trumpish political cronies prove time and again, nothing good comes from being trumpish; that is, except throughout the 2016 presidential campaign, when Trump and the GOPers continuously trumped each other’s disgrace.

Greek. Latin. Palin.

Sarah Palin is like that mother-in-law — with all due respect to mothers-in-law everywhere — who just doesn’t know when to leave. We thought we’d been spared any more of her after she and John McCain lost the 2008 presidential election. But then she went and published a book. And got a reality show. And a news show (as a Fox “commentator”). And then her daughter kicked out her boyfriend. And her son literally kicked out his girlfriend. And now she’s stumping for Trumpty Dumpty for King.

By now I would have begged the State of Alaska to pack her off on a fast sled to the most desolate iceberg in the Arctic Ocean. That is, if it hadn't been for one thing.

Listening to Palin talk these days is actually entertaining. Funny, even!

Seriously, have you ever gotten past that mousy, nasal voice and actually listened to the words and whole sentences it forms? Priceless.

In January, Palin endorsed Donald Trump for the High Office. Here’s part of what she said in her speech:

“How about...us? Right-winging, bitter-clinging, proud clingers of our guns, our God, and our religion, and our Constitution.”

(How does she feel about your God, I wondered. O, but so entertaining!)

Sarah Palin probably couldn’t diagram a sentence if her next contract depended on it. But we can diagram her.

Start with her first name. Spell it backward. What you get is close to the actual word that describes how she handles opponents: Haras.

Next, fill in the blank: “A ____ in the rump.” (Clue: For the answer, remove the “L” from her last name.)

Hey, this is fun! Let's continue!

Question: Ever notice that when Palin speaks, she seems like a robot? Hey, she’s a ... Palindrone!

Perhaps. But to those lucky GOPers, she’ll always be their:

Avid Diva
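(For the programmers among us, the whole diagram fits in a few playful lines of Python: a sketch of the wordplay above and nothing more.)

```python
def backward(word: str) -> str:
    """Spell a word backward."""
    return word[::-1]

def drop_letter(word: str, letter: str) -> str:
    """Remove every occurrence of a letter, in either case."""
    return word.replace(letter.lower(), "").replace(letter.upper(), "")

def is_palindrome(phrase: str) -> bool:
    """True if the phrase reads the same both ways, ignoring case and spaces."""
    letters = [c.lower() for c in phrase if c.isalpha()]
    return letters == letters[::-1]

print(backward("Sarah"))           # -> haraS  (close to "harass")
print(drop_letter("Palin", "l"))   # -> Pain   (a ____ in the rump)
print(is_palindrome("Avid Diva"))  # -> True
```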

The Age of Snobs

Once again in early 21st century America — land of the 1% vs. the 99% — a fat-cat snob has won, and the big-hearted commoners who naively persist in filling his purse have lost. This time, the nauseating moneybags is Enos Stanley Kroenke, owner of the St. Louis Rams football team he yanked last night from our city.

Before pulling the plug, Kroenke trashed St. Louis in an 11th-hour report to the NFL in which he characterized our city as too weak, economically, to continue to support a professional football team. Note to Kroenke: Your contention misses the mark. Rather than the city, it is many of your Rams employees who are economically weak, because they lost their jobs today; and come next football season, the St. Louis business owners who thrived on the patronage of your Rams fans will suffer, too, economically.

So much for Best Business Practices, Enos. So much for Community.

Ah, Enos Stanley Kroenke. How telling that this snob bears the names of two Hall of Fame baseball players from the St. Louis Cardinals, Enos Slaughter and Stan Musial.

Of course, Enos and Stan we love.

Enos Stanley Kroenke we loathe....

Winning the Presidency By a Hair

Dear Donald Trump,

Thank you for busting through the age-old clutter of polyspeak these days with your neo-enlightening style, your clarity and concision of upright thought, and, above all, with your animus which hugs sort of as it lectures and improves us all.

You are the light in dark America.

In standing up against the duh in existing U.S. policies and providing, rather, brilliant, down-to-earth solutions for America's pressing needs, you stand out among all the other 2016 presidential candidates (your countless Republican peers included) as a flowering lily pad in a stagnant pool.

Your defiance of all the stupid, senseless laws no longer apposite in 21st century application (e.g., the 14th Amendment, with its nonsensical guarantee of citizen rights) is timely. Your call for building the 200,000-mile-long Great Wall of Xenophobia is monumental. Your deft insights about the President, Mexicans, liberals, women, aging models, LGBTs, TV networks, ISIS, stiffs, Rosie, Megyn, Jeb, tiny little guys, and others who are not you are priceless.

But the primary reason I'm writing to you today, Mr. Trump, is that I believe that to your bullhorn, to your body language — indeed, to all your pertinent invective and revolutionary ideas that have culled from our dark landscape such a horde of devotees — there is an asset as yet untapped which, when enjoined to those others, will guarantee you the 2016 presidential nomination. That last asset, Mr. Trump, is your hair.

Specifically, we Americans implore you to change your hairstyle. Get rid of the waveover. Pay your favorite stylist (or maid) to execute a tasteful trim. Tell her or him that you need a new mane: a style that reflects your awesome sense of order and propriety. This idea is genius, Mr. Trump. In a phrase, your new hair will seal the deal.

The People are blown away by your excellent stances for America. The People love you because, through your anger and nonconformity, you have given voice to their anger and torn down the commoners’ Great Wall of Fear of political incorrectness. The People love you, Mr. Trump — just as people love cultists — because their awe of your novelty blinds them to your façade. Cut your hair, sir, and those who follow you will grow to legions!

Trimming your tuft, neatening that coiffure, will fix your reputation as the one and only American who can get things done for America. Do this, Mr. Trump, for it will make you in the bedazzled eyes of The People even more worthy than they already believe you are.

In sum, Mr. Trump, perfect the only thing about you that as yet inhibits your full bearing as "one of us." Compromise that hair, Mr. Trump, and surely you will saunter to Nomination Night as if on an ornamental steed: America’s next “Brightest Hope.”

Thank you, and...

Best Regards,

Kenneth Vanderbeek

Does God Like Everybody?

The God Old Party (also known as the Grand Old Party or GOP) has just waxed its angel wings again. Last Thursday, during the first Republican debate of the 2016 presidential campaign, the 10 participating candidates invoked the name of God 19 times as they verbally crucified President Barack Obama. At the same time, the GOPers made it eminently clear that God “blesses” them.

After witnessing these mere Christians inflict such a holier-than-thou pummeling on Mr. Obama (not for the first time), one wonders if, in God, the feeling is mutual: Is He an angry God who likewise disparages Mr. Obama? Or is He a kind God who likes the president along with His other children? Hmm, let’s see.

Consider: During Mr. Obama’s nearly eight years as president of the United States, he has worked continually for the wellbeing of others via an inclusionary style that recognizes — and respects — the lawful rights and earthly potential of every human being. And he has sensibly focused this vision in the here and now, giving, as all presidents should, far more attention to the quality of life on earth than in heaven. If it may be presumed that God sits at the center of everyone (including the president), it may subsequently be construed that the difference between Mr. Obama and so many of today’s Republican ‘leaders’ is that the president correctly realizes that God need not be consulted for advice in nearly every discussion about what’s righteous and lawful in the world.

In one of the Republican debates, moderators at the Fox News Channel asked every GOP presidential candidate if he’d lately received “any word from God.” A stupendously odd question to ask public officials in public, but not surprising. Throughout Mr. Obama’s presidency, GOP zealots have vilified him as “demonic,” “Satanic,” and the “anti-Christ”; many have insisted that he is a practicing Muslim (clearly he is not; however, if he were, good for him); and lately, Tea Party members have been wielding signs that read “We Need A Christian President.” (Note: In Article VI, paragraph 3 of the U.S. Constitution it is written, “No religious Test shall ever be required as a Qualification to any Office or public Trust under the United States.” Our forefathers were not only wise; they were also fair.) It’s not that religion should be felt and not heard. The problem with today’s GOPers is that they don’t seem to abide that old common-sense dictum, “There’s a time and place”; nor do they seem compelled to support the laws that prudently separate church from state.

It’s a flippant mind that panders to God in the public square as if He’s a partisan, for such a mind opens itself to self-righteousness and ignorance. Consider, again, today’s GOP. As it preaches individual rights and strength of Union, it largely ignores the poor, the gun problem, gay entitlement, the right of women to equivalent pay in the workplace (etc.), and denies existence of the greatest ‘sin’ of our time: manmade Climate Change.

God is either laughing (out of incredulity) or crying (in pity of the hypocrites).

Callous Adults, Cheated Kids

Today’s youths find themselves staring up at a colossal character hurdle. No different in their mental curiosity and emotional fragility than youths of the past, today’s, however — more than their grandparents when they were children, more even than their mothers and fathers in their own formative years — are at a disadvantage against traditional forces that have become pervasive: peer pressure (now especially to drugs more deadly than ‘recreational’); bullying (much of it sadistic); puberty (complicated not just by temptation, but also by burgeoning affronts); and, perhaps most problematic of all, adults — too many of whom these days are anything but good role models.

Recently a friend and I (both of us parents) were discussing characteristics of these dysfunctional peers, and agreed that, in the realm of instruction (‘leading by example’), most share a common patriarch: negativity. That is, given the choice to lead by encouragement (positive reinforcement) or by disparagement (punishment), a majority chooses disparagement.

In sports, especially, many of today’s youth coaches use the game to bolster their own sensibilities (fulfill a desire to control; compensate for personal athletic shortcomings) rather than to extend the game’s gifts to the participants: enjoyment, camaraderie, collaboration, leadership-development. A telling example of this disturbing truth is the case of my friend’s daughter, the starting goalie of her high-school soccer team, who has helped tally more wins than losses so far this season, including eight shutouts. Yet because she missed a save in a recent game, her coach benched her for the next — his rationale being that punishment (not encouragement) is the right strategy for improvement.*

Because my friend’s daughter is a quite poised young lady, and exceptionally mature for her age, her coach’s misdeed will not sully her adultness. Yet other youths are not so fortunate. For many, it’s an unaffirmative style of instruction that principally shapes their childhood, thus also their adulthood — and, for those who make careers, their professionalism, as well.

Is it any surprise, then, that in today’s workforce a common managerial tack for remedying employee error is to chastise and proceed quickly to disciplinary action, rather than to discuss the mistake fairly and offer means (including encouragement) of improvement?

My friend asks this question another way: “Is it any wonder that so many are so beat-up and burned out by their ‘leaders’ that they lose their healthy zeal for joy?”

Ah, joy. For many today, this is something they can barely envision at the top of a colossal hurdle.

  • During 20 years of coaching soccer, of the dozens of other coaches I met, not more than a few led by compassion. The rest were dictatorial and aloof.

Open Letter to Economists and Other Human Types

Dear Fellow Earthlings,

Throughout my years on earth, most of them lived as an adult — thus it may be said that my conscience is fully formed and that, more often than not, I reason astutely — I have yet to understand coherently the relationship between economics and population.

Ever since the ancients wrote the Book of Genesis, it has been a principal motivation of mankind to ‘inherit the earth.’ Certainly, as it has measurably contributed to his economic wellbeing and advancement, man’s migration and propagation throughout the world have worked well in his favor — albeit mostly at times when natural resources have outnumbered him (such as during the early arrivals, in the 1500s, of scant Europeans to join the modest native populations of the primordial United States).

It seems reasonable to presume, considering that the landed United States today bear 319 million people, and counting (and the world billions more, and counting), that as humans continue to multiply, it will eventually be mankind that outnumbers natural resources. (Consider California, for example, where it has never been more evident that humans are gaining the upper hand over the earth. More on this later.)

Thus the pressing concern: If, indeed, we humans will someday outnumber our life-sustaining needs (those natural resources), how may that affect economic prosperity?

Back to the example of California. Note the instructive insights on economics and population in The New York Times article, “Brown’s Arid California, Thanks Partly to His Father” (May 17, 2015), which laments the state’s now devastating water shortages — devastating, all the more, amidst a long-term drought. As California’s governor from 1959 to 1967, Pat Brown promoted significant (that is to say, unchecked) population growth for California, in part through the California State Water Project, which ensures water delivery to two-thirds of the state’s population, particularly in urban centers and to farmers.

At the start of Governor Pat Brown’s tenure, California’s population was 15 million. Today it’s 39 million, and [you guessed it] increasing. As a result of this endless human swell, California’s natural-resources infrastructure is severely pinched and getting, of course, more so by the day: this, the legacy of Pat Brown’s loud call for settlers, as well as of his, and his contemporaries’, narrow foresight about responsible natural-resources — and, dare I say, also population — management for the future.

In truth, we humans are party to a compelling paradox: As population grows, economies generally prosper; but natural resources decline. As population shrinks, economies generally flounder; but natural resources recover.

Considering these two ‘equations,’ it seems to me that only the latter — entailing fewer babies, humbler lifestyles, natural-resources conservation — incorporates the element of responsibility, and thus would reduce our onslaught on earth and climate.

Utopian thinking? I cannot deny that it is. However, it’s also rational, and for all (except the super rich) would pose no substantial hardship.

Most Sincerely Yours,

Kenneth Vanderbeek

Tight Lips the Thespian

Recently a friend and I discovered, quite simultaneously, that each of us is lately trapped in an odd state of consciousness — odd in that, captives as we are to the forces of natural law, it stands to reason that we should not be so encumbered. Yet here we are, the two of us locked — along with (and not coincidentally) millions of other woozy souls across the planet — in incredulity, still, over the imponderable strangeness of Super Bowl XLIX; or, in a phrase, over how it happened that a nouveau Big Bang turned the football game, in its last 20 seconds, into an alternative universe.

Why, we remain stupefied to this day (and shall for years to come), did the Seattle Seahawks (reigning Super Bowl champs) — finding themselves all of a stretched-out jockstrap removed from the end zone and the win — why did they trade away the battering-ram efficiency of the run (and the game’s best batterer, Marshawn Lynch) for the Icarus risk of letting the football fly? Given a preponderance of precedents in the NFL that point to running, not passing, as the surer way to victory for teams blessed with Hulks who can thrash the ball at will a measly 36 inches to the promised land, how could the Seahawks wind up in the desert?

The most plausible answer: stupidity. (As in, dimwitted play calling.)

However, my friend offers another possibility: The fix was on.

Now, considering how ethically stunted so many people are these days — be it of poor upbringing and/or an acute need for peer acceptance at any cost and/or a honed worldview drowned in apathy — news of a fixed Super Bowl would surprise me no more than news of, say, an international bank giving rich clients illegal tax breaks, or a celebrity acting like an imbecile, or an elected official getting caught in a lie, or, to recommence the sports theme, a Hall of Fame athlete, say, Whorin (aka Warren) Sapp, getting arrested for, say, having propositioned (and possibly also pounded) a Lady of the Night.

And how do those among the moralistic square a conspiracy theory which supposes that the reason Seahawks Coach P. Carroll and his O. boys called “The Pass” was to keep the ball — and thus also the MVP trophy — out of the hands of their All-Pro (but also All-Embarrassing) running back Lynch? Personally I don't believe the speculation; however, if it turned out to be true, how ironically justified would such a dastardly act suddenly seem, considering how its pawn Marshawn ceaselessly patronizes the media (with bumbling and mumbling) and America's youth (with crotch-grabs on national TV) — that is, until a call to speak with the guarantee of a fattened pocketbook might come along, as, indeed, it has in the form of the latest Progressive Insurance commercial, in which Tight Lips miraculously transforms into a thespian.

It will be high drama to observe Sea-rattled again, starting next fall. Will this broken team be able to repair itself? — will quarterback Russell Wilson, the errant passer, his psyche? Will Mar’yawn continue to give America the silent treatment?

Personally, for the ’Hawks — as for all transgressors — I envision a precipitous decline toward average.

America's Two Camps

Unless you live at a curling rink, and without cell phone, radio, or TV, you’ve by now heard about the latest incident of ethics-gone-awry in the New England Patriots football organization — this one involving someone (or more than one) who allegedly released just enough air from footballs in the hope that this loony schoolyard ploy would give the team an advantage against its rivals, the Indianapolis Colts, in the AFC championship game (played January 18, 2015). To wit: It is claimed (by those who apparently know such things) that air pressure below the league standard — 12.5 psi to 13.5 psi — makes a football easier to throw, and catch, in cold weather. In Boston, where the now infamous game was played, it was cold, very cold.
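An aside of my own, no part of the league’s findings: the cold-weather wrinkle is plain old gas physics. At fixed volume, a ball’s absolute pressure scales with its absolute temperature (Gay-Lussac’s law), so a ball inflated legally in a warm locker room will measure lower on a cold field. A minimal sketch, with hypothetical temperatures:

```python
# Gauge pressure (what the rulebook's 12.5 psi means) is absolute pressure
# minus atmospheric pressure, roughly 14.7 psi at sea level.
ATM_PSI = 14.7

def f_to_rankine(temp_f: float) -> float:
    """Convert Fahrenheit to the absolute Rankine scale."""
    return temp_f + 459.67

def cold_weather_gauge_psi(gauge_psi: float, indoor_f: float, outdoor_f: float) -> float:
    """Gauge pressure of a ball inflated indoors, after it cools outdoors."""
    absolute = gauge_psi + ATM_PSI
    cooled = absolute * f_to_rankine(outdoor_f) / f_to_rankine(indoor_f)
    return cooled - ATM_PSI

# A ball at the 12.5 psi minimum in a 70 F locker room, measured on a 50 F field:
print(round(cold_weather_gauge_psi(12.5, 70.0, 50.0), 2))  # -> about 11.47
```

About a pound of pressure lost to the weather alone; whether more than that left the Patriots’ footballs is the nub of the allegation.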

Should anyone be surprised by the allegation that the Patriots cheated? Sadly, no. First, because (assuming “Deflategate” is true) the Patriots’ obsession with pigskin parameters will not have been the first incident in which they ignored good sportsmanship; in 2007 (in a dubious affair since dubbed “Spygate”), the team violated NFL rules by videotaping New York Jets coaches across the field as they flashed strategic signals to their players.

Media investigations increasingly suggest that cheating may be “business as usual” in many professional sports. Alas, this is no more surprising than the Patriots’ singular faux pas on the 18th; nor is a glaring consequence of that toe-stub: that yet again a misstep in principles has turned up the volume in two distinct camps, the one in which principles matter, the other in which they don't so much.

We who will forever refuse to submit to the anti-norms spreading like an aggressive cancer among persons who pooh-pooh ethics in favor of hubris (and looking the other way) say: There is just one way (and it's inarguable) to describe the Patriots' latest alleged infraction: It was wrong. To those ethically challenged persons who have forgiven the Patriots based on an argument that the deflated balls couldn’t have significantly affected the game's outcome, we add: The final score is “clearly not the point.” We who are ethically mindful know this, because we correctly understand — and live by — sound principles.

To those in that other camp — the camp of Excuses and Easy Forgiveness and Looking the Other Way — we of principles wish to inform you that it is your Apathy, your Denial, your Outright Compromise of Good Values that is turning a once upright nation into a fallen child.

Puerile Politicians

In recent comments about the zillions of ad dollars political candidates are spending this midterm election season, an astute big-time ad exec noted: “Having alternatives to spending a lot of money is a relatively new concept.” Thinking she meant that politicians are finally spending less on their overindulgent campaigns (thereby also discovering some measure of statesmanship), I rejoiced. That is, until I read on. She qualified her remark, explaining that, in view of increasing media venues, political-ad expenditures continue to soar as political candidates realize they’re more able to “target” citizens they (the candidates) rely on for votes. What a concept.

I have a better one.



Note to politicians: Next campaign season (and the one after that, and so on), before you spend a penny to lure our votes, tell your ad agency folks to write commercials that affirm you and your vision rather than rake your opponent over the coals. In other words, for a change, enlighten us. If you want our vote (next time), rather than deflect your message with the usual smut, once and for all address us positively and informatively; in a phrase, with words worthy of a fully developed adult. That way, in addition to being able to determine whether you're a decent, intelligent servant or just another crass, bumbling oaf, we voters will have a better chance of understanding your stance on key issues (a posture you owe us).

Meanwhile...



Note to my fellow voters: Research reveals us to be little better than gullible gluttons who persistently fall for off-putting political messages. I hold out hope that we common Americans — we who wield the power of election — are actually smarter than this; indeed, smart enough to unite in letting the puerile politicians, and their media grunts, know how utterly disgusted we are by blowhard politics, and that we won’t take it anymore.

A good start is to write your elected officials. Do this often, and in droves....

The Individual: 21st Century's Go-To

In democracy, the fundamental notion of governance is that it originates with the individual; that is, all citizens of the state share an equivalent voice in how they are governed. This viewpoint presides with particular significance in America, where early on economic, political, and moral conflict with an authoritarian patriarch emboldened the individuals under its dominion to act collectively in pursuit of independence.

In 21st-century America the individual is more relevant than ever: this magisterial voice on the Internet and in social media; this walking mixed-art canvas of often outspoken vocal, and fleshy, expression; this face of an increasingly accessible limelight. For centuries, the prevailing democratic powers of expression and action were well within reach of the individual; in today’s America, they are firmly grasped.

If democracy may be imagined as a fabric, each individual constitutes a thread. In early America, Jefferson-, Jackson-, and Lincoln-like individuality was enough to make our country’s quilt hardy. But no longer. As important as the individual still is, and (one hopes) always will be, in today’s eminently more complex society he faces hundreds of times as many challenges (and stresses) as his colonial forefathers, and two or three times as many as his mother and father. Thus, largely reserving attention to, and action for, one’s personal needs, desires, and space is no longer as practical, or as socially acceptable, as it was just a generation ago; nor may the individual any longer even consider his own exclusive version of morality as tenable on a planet where, in just 100 years, his species has despoiled much of it to oblivion.

In human terms, a primary consequence of nature change (dubbed by science as climate change/global warming) is a paradoxical shift in the notion of individuality. For more than 5,000 years, beginning with the first civilizations, societies were assessed in terms of the individual (or individual type). In the 21st century, however — due in large part to the significant run of climate-related disasters just 14 years in — the individual now is perceived as much for his role in the aggregate as for his autonomy: a key catalyst, in collective, of social, economic, and political action.

Lately, it seems that more and more Americans are embracing this 21st-century brand of empowerment. Particularly regarding our relationship with the earth, psychiatrist-author Robert Jay Lifton notes that “Americans appear to be undergoing a significant psychological shift” from individual concern about the environment to shared: a shift in which disbelief is steadily giving over to acceptance of science’s reliable documentation of climate change, and conscience is trumping apathy. This is good news for a species which, in brainpower, is the smartest on earth, yet in earthly stewardship, has surely been the stupidest.

Over the years, I’ve lamented a recurring suspicion that our species may be too intelligent for its own good. A case for this contention may be made from contrasting observations of ourselves as infants and as adults. In my experience I have yet to observe or hear of an infant setting wildfires; or tossing trash out of a moving vehicle; or shirking every chance to recycle paper, plastic, metal, glass, and all other things replenishable (in opposition to a rudimentary respect for the world’s finite resources).... That so many Americans (around 40% in the latest polls) still refuse to accept that humankind is aiding longer droughts, wilder rainstorms, bigger hurricanes and tornadoes, and an abundance of other tempestuous weather events (all veritable signs of climate change) has me stupefied.

Many social scientists have classified climate-change doubters into psychological groups: those who don’t believe in the existence of global warming because, so far, they haven’t personally been affected by any of the documented increases in hurricanes, tornadoes, droughts, wildfires, extreme hot and cold weather, floods, and rising sea levels; those, like the unaffected, whose localized bias has set up blinders to warming’s worldwide danger; those for whom climate change so far has posed no significant financial hardship (as it has for insurers, engineers, and many other business and government sectors); those who are easily brainwashed by certain political pundits and radio talk-show hosts against accepted scientific findings; and those whose understanding of the connection between the world’s increasingly nasty weather and climate change has so far been, for want of information, sketchy rather than honed.

Yet the number of naysayers is in decline. A 2014 survey conducted by Yale University reveals that, over the last three years, Americans “who think global warming is not happening have become substantially less sure of their position,” while the certainty of those who think it is happening has increased.

Still, many Americans remain uninterested in, apathetic to, or downright unconscious of the critical importance of individual responsibility for protecting the environment. A 2014 Pew Research survey on “Recycling and Reusing” reveals that less than 50% of American adults make recycling a daily habit (despite the fact that many pay for collection service at their homes). Seven in 10 Liberals (70%) say they recycle regularly, compared with four in 10 Conservatives (39%). In my own observation, I am continuously stunned by those I observe at work, and in other public places, who make a practice of tossing recyclables into trash cans rather than into the dashing green or blue recycle bins located just a few paces away; as, too, I am astounded by those who persist in this laxness at home — perhaps because there, too, they anguish over the few steps they would have to execute in order to get to the recycle tub. And what can one say about those who regularly ignore litter — aluminum cans, plastic water bottles, cigarette packs, and the like — as it amasses in chain-link piles along the curbs of their front yards?

Not a day passes that multitudes don’t proclaim, “I want a better life for my kids” and “I’m concerned about how the world will be for my grandchildren.” Yet how many take a stake (though “I’m only one person,” a crucial stake nonetheless) in helping — to the best of their individual ability — shape the future as they hope and imagine it will be for those who follow?

A significant flaw in mankind is that it often delays prudent decision-making until crises occur. (The powers that be in New Orleans knew it was only a matter of time before the Big One would hit. Yet they failed to bolster the city’s levees ahead of Hurricane Katrina.) Under the likelihood of increasing weather havoc, mankind is playing Russian roulette the longer it delays active worldwide action to reverse global warming.

In recalling the worldwide shift in response to the potential of a nuclear holocaust, from individual (‘fragmentary’) at the start of the Cold War in the 1940s, to collective (‘formed’) by the 1980s, the psychiatrist Robert Jay Lifton notes, “People came to feel that it was deeply wrong, perhaps evil, to engage in nuclear war, and are [now] coming to an awareness that it is deeply wrong, perhaps evil, to destroy our habitat and create a legacy of suffering for our children and grandchildren.”

Well, it’s about time.

However, for us humans to have a real chance to reverse the ravages of climate change, individuals must unite en masse to turn awareness into a local-regional-national-global coalition of action — a social movement.

Such a concerted charge may already have had its start, or at least a good kindling, on September 21, 2014 in New York City, when tens of thousands, angry about the absence of an established worldwide alliance to curb global warming, marched in Manhattan waving scrawled protests and bull-horning vocal ones to the world, as well as to American and global leaders among them who were to convene about climate change a few days later at the United Nations.

Meanwhile, starting this instant, it would behoove every one of us humans to look away from our TVs, smart phones, and tablets, that we may commence an individual movement to make environmental stewardship a foremost personal calling.

In Search of Domestic Bliss

Question: Might the billions and billions and billions of dollars endlessly poured into Foreign Wars, Foreign Treasuries, Wasteful Government Agendas, Election Advertising, Insanely Inflated Corporate Executive Salaries, Bonuses, and Stock Privileges, Insanely Inflated Professional Athlete Salaries and Add-Ons, and Insanely Inflated Congressional Expense Accounts and Pay Raises be better used toward Ending Poverty (not to mention a few other domestic needs, as well — you know, like proper veteran care, education, economic rejuvenation, climate change legislation, energy reform, business reform, natural-disaster relief, urban renovation, infrastructure renovation, voter registration and polling efficiency, medical research, technology research, charitable causes, tolerance education, earning back citizen trust,...)?

Now Appearing

My short story, “The Prophets,” is featured in the 2014 issue of the national literary magazine, Blue Moon Literary & Art Review, published by the University of the Pacific, Davis, California.

“The Prophets” is an account of the express hopes and challenges of an aluminum can in the twenty-first century.

To order copies of Blue Moon Literary & Art Review, and for more information, please visit its website.

The Odd Differences Over Climate Change

1

Insofar as most behavioral scientists have ascertained (note here, “most”), not one among the nine million other animal species on earth suffers the naïve inability of humans to accept facts always for what they are: information that is proven and irrefutable. (Of course, the reason those nine million don’t share our propensity for ignorance in the face of confirmation is that, irony noted, they aren’t as intelligent.)

Consider the opposing viewpoints about global climate change (global warming). Despite the preponderance of evidence — much, indeed, proven as fact — that human activity is altering earth’s life-support system, 25% of American adults (57 million) reject this information. Similarly, only 40% of American adults regard the observed consequences of global climate change as a major threat to physical wellbeing and property.1 Those consequences include: greater frequency and intensity of hurricanes, tornadoes, monsoons, hailstorms, snowstorms, and other weather systems; increasing greenhouse-gas levels; rampant drought and wildfires; melting polar icecaps; rising atmospheric and oceanic temperatures; and burgeoning health problems resulting from increasing pollution and summertime heat.
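
(For readers who care to check the head counts, the conversion is simple percentage arithmetic. Here is a minimal back-of-the-envelope sketch, in Python, using only the population figures from note 1 (317 million Americans, 230 million of them adults); the variable names are mine, purely illustrative.)

    # Converting the cited percentages to head counts, using the
    # population figure from note 1: 230 million American adults.
    US_ADULTS = 230_000_000

    reject_evidence = 0.25 * US_ADULTS   # adults who reject the evidence
    see_major_threat = 0.40 * US_ADULTS  # adults who see a major threat

    print(f"Reject the evidence: {reject_evidence / 1e6:.1f} million")   # 57.5, the ~57 million cited
    print(f"See a major threat:  {see_major_threat / 1e6:.1f} million")  # 92.0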

Doubters of climate change are even counted among meteorologists, those weathercasters who tell us whether tomorrow will bring sunshine or precipitation, hot or cold. In a random sampling of 571 TV meteorologists, only one-third of the respondents said that they believe climate change is “caused mostly by human activities.”2 Even the founder of the Weather Channel, John Coleman, dismisses the facts of mankind’s contribution to climate change. “There isn’t any climate crisis,” he said. “It’s totally manufactured.”3 Conversely, climatologists, who study weather patterns over time, agree nearly unanimously that the earth is warming and human activity is contributing to climate change.4

Above all, the world’s leading coalition for the scientific study of global climates, the Intergovernmental Panel on Climate Change (IPCC), confidently asserts a 95% probability that human activity is the primary cause of climate change.5

Here is a synopsis of the IPCC’s ongoing research on climate change:

Although the panel’s scientists concede that it is difficult to calculate precisely the past, present, and potential future effects of human activity on climate, their research shows irrefutably that the exponential increase of human population, production, and consumption is having a detrimental effect. In its 2013 report, the IPCC concluded, “It is extremely likely humans are the dominant cause of warming in the past 60 years.” The report further concluded that extreme weather is more common, more widespread, and longer lasting; dry regions are getting drier, and wet regions wetter; and, with more than a billion vehicles now operating worldwide, carbon dioxide (CO2) emissions are escalating beyond natural levels.

Three leading causes of the rift between public perception and scientific proof are journalists, politicians, and the human brain.

Journalists discuss the weather earnestly, plugging it around the clock on radio, giving it the page-one lead in newspapers whenever it acts up, and bequeathing it its own TV channel; yet coverage of climate change has been feeble. An ongoing study by the Center for Science and Technology Policy Research, University of Colorado-Boulder, reveals that mainstream media coverage of climate change has decreased significantly in countries throughout the world since 2010. The study tracks climate reportage at 50 newspapers in 20 countries on six continents.6 Not surprisingly, coverage has been more robust in industrialized than in non-industrialized nations.

In the political arena, polarization disrupts, and oftentimes defuses, clear-headed nonpartisan discussion of climate.

Declaring that “Americans across the country are already paying the price of inaction,”7 President Barack Obama, in June 2013, presented a comprehensive plan for combating climate change — chief among his proposals, to significantly reduce manufacturers’ emissions of CO2.8 But the plan has stalled as a result of lukewarm public opinion and congressional charges that its measures would restrict energy production and slow the nation’s economic recovery.9

Environmental-policy dissent across the political aisle hasn’t always been the norm in America. Republicans and Democrats united in praise of Theodore Roosevelt for championing environmental protection and natural-resources conservation, including his foresight in setting aside the national parks, forests, and monuments that would become the backbone of the National Park Service; both parties extended majority support to Franklin Roosevelt’s Soil Conservation Service; and the two sides joined in signing into law the landmark environmental legislation of the 1960s and 1970s (e.g., the Clean Water Act of 1972).

But collaboration quickly evolved into contention in the 1980s, when Ronald Reagan characterized many of the nation’s environmental laws as burdensome on business (an attitude accented by his infamous claim, “Trees cause more pollution than automobiles do”). The Reagan administration’s now-clichéd charge that “government is the problem, not the solution” exacerbated the two parties’ increasingly disparate earthviews in its suggestion that the Environmental Protection Agency had bloated nearly to the point of irrelevance under the spendthrift policies of Democrats.

By 1997, the year the U.N. established the Kyoto protocol10 to reduce worldwide CO2 emissions, Rush Limbaugh and other conservative commentators had already started rejecting evidence of global warming and proposals for its reversal, in what now seems as much an effort to vilify those who acknowledge the scientific facts as to propagandize the contrasting beliefs.

Today, “[n]owhere is the partisan gap on environmental issues more apparent than on climate change.”11 Surveys conducted by pollsters (Gallup, Pew Research Center, and others), as well as the news and wire services, show, on average, that more than 75% of congressional Democrats accept the scientific finding that global warming is a real phenomenon and human activity is its main cause. Conversely, only 40% of congressional Republicans share this viewpoint. Not surprisingly, as a demonstration of shared ideology (and party allegiance), the percentages are nearly identical among registered Democratic and Republican voters, respectively. (Responses of voters who describe themselves as independents are split nearly evenly, 50-50.)

Meanwhile, the world’s life-support system continues to face a bleak future unless Americans and their leaders of all political persuasions — indeed, leaders and their constituencies throughout the world — collaborate soon to establish and enforce an aggressive policy to reverse global warming. Notwithstanding the Kyoto protocol, the Obama plan, and other sovereign and international proposals to mandate this reversal, to date perhaps the bluntest is the Durban Platform, a U.N. assessment which declares that the worst impacts of global warming can be avoided only if carbon emissions are reduced at a rate that prevents the global temperature from rising more than two degrees Celsius (2°C). The Platform’s warning is stark: To ensure that the global temperature remains below the 2°C threshold, humans may emit only 250 billion tons of additional carbon into the atmosphere. Considering that humans currently burn about 10 billion tons of carbon a year, at this rate mankind will use up its allowance in just 25 years.12
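
(The Durban arithmetic is easy to verify for oneself. A minimal sketch, in Python, using only the figures quoted above; the names are mine, purely illustrative.)

    # Back-of-the-envelope check of the Durban Platform figures cited above:
    # a remaining "allowance" of 250 billion tons of carbon, burned at
    # roughly 10 billion tons per year.
    CARBON_BUDGET_GT = 250        # billion tons still permissible under the 2°C threshold
    BURN_RATE_GT_PER_YEAR = 10    # approximate current annual carbon emissions

    years_remaining = CARBON_BUDGET_GT / BURN_RATE_GT_PER_YEAR
    print(f"Allowance exhausted in about {years_remaining:.0f} years")  # about 25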

If humans do not accede to the emissions limit prescribed by the Durban Platform, the global temperature may rise by four degrees Celsius (4°C) before the end of this century.13 Translated into actual scenarios, by 2100 this heat spike could well have caused CO2 levels to exceed those at the time of the last mass extinction, 65 million years ago.14

2

Given the assertion by the Intergovernmental Panel on Climate Change, the world’s leading coalition for the scientific study of global climates, of a 95% probability that human activity is the primary cause of global warming, how can it be that 57 million American adults believe humans bear little, or no, responsibility? A majority of human adults understands that when iron is continuously exposed to water, the iron rusts. Why, then, do so many refuse to accept the truth that the same carbon emissions long proven to pollute skies, soils, and waterways are also proven to be altering the world’s climates?

Genetic factors certainly help sway avowal of the facts of climate change in some and disavowal in others; specifically, mental factors that can cause emotional/subjective responses (bias, partisanship, susceptibility to persuasion, etc.) to predominate over cognitive/objective responses (disinterest, tolerance, independence, etc.). Cultural influences also help shape mindsets about climate change; social factors (ignorance and denial), political factors (persuasion tactics and partisan loyalties), and economic factors (income status and threats to financial interests) all influence those who acknowledge global warming and those who dismiss it as illusion.

Ignorance (consequence of being uninformed) and denial (refusal to accept facts) may be the principal culprits in the still widespread disavowal of the effects of human activity on climate change. Among the 57 million American adults who dismiss the scientific data about climate change, the catalysts of their rejection of the facts range from unawareness and fear to repudiation stemming from socio-political viewpoints, religious beliefs, and suspicion (e.g., the influence of conspiracy theories). In 2013 polls by livescience.com, the number of respondents who said they “don’t know” if the global climate is changing increased from 16% to 23% between April and November.15

Long-term scientific study of the human brain — especially the theory of motivated reasoning — has revealed psychological factors that contribute to one’s prevailing tendency either to abide reason (acknowledge facts) or abide perception (kowtow to beliefs). Specifically, the theory refers to the rejection of information (i.e., facts) that refutes beliefs.

Applied to global climate change, the theory explains the repudiation of science.

For example, a 2013 study revealed that “people with a ‘conservative’ political world view could be more likely to reject climate science than ‘liberals,’ but less likely, say, to reject childhood vaccination. And people with a more ‘conservative’ world view who are more highly educated could be more skeptical of climate science than those who have fewer years of education.”16

Will mankind, en masse, ever achieve the 95% certainty of its scientist brethren that human activity is causing climate change? Only time will tell.

Swallowing hard with trepidation, many scientists predict that time is less than 25 years away.

  1. Drake, B. (2013). “Most Americans believe climate change is real, but fewer see it as a threat,” Pew Research Center. Also see: United States Census Bureau (2013). Of the current 317 million Americans, 230 million are adults.
  2. Maibach, E., Wilson, K., & Witte, J. (2010). “A National Survey of Television Meteorologists about Climate Change: Preliminary Findings.” George Mason University, Center for Climate Change Communication, Fairfax, Virginia.
  3. Coleman, J. (2010). “The experts explain the global warming myth: John Coleman,” YouTube presentation.
  4. Op. cit.
  5. Climate Change (2008). The Intergovernmental Panel on Climate Change (IPCC). The IPCC was established in 1988 by the United Nations Environment Programme (UNEP) and the World Meteorological Organization (WMO) to promote worldwide scientific evaluation of climate change and its potential environmental and socio-economic impact. The IPCC assesses risks of climate change on societies, and options toward confronting such change.
  6. Boykoff, M. & Nacu-Schmidt, A. (2013). “2004-2013 World Newspaper Coverage of Climate Change or Global Warming,” University of Colorado-Boulder, Cooperative Institute for Research in Environmental Sciences, Center for Science and Technology Policy Research. http://sciencepolicy.colorado.edu/media_coverage.
  7. Obama, B. (2013). “The President’s Climate Action Plan,” presidential address, Georgetown University, Washington, D.C.
  8. “The President’s Climate Action Plan” (2013). The White House, Washington, D.C. http://www.whitehouse.gov/sites/default/files/image/president27sclimateactionplan.pdf.
  9. Landler, M. & Broder, J. (2013). “Obama Outlines Ambitious Plan to Cut Greenhouse Gases,” The New York Times.
  10. Kyoto protocol (1997). U.N. Framework Convention on Climate Change, Kyoto, Japan. Responding to a 40% increase in carbon-dioxide emissions worldwide, from 1990-2009 (see Netherlands Environmental Assessment Agency report), the Kyoto protocol became the first agreement among nations to mandate country-by-country reductions in greenhouse-gas emissions. Nearly all industrialized nations worldwide have signed the treaty, with the notable exception of the United States. Policies of the protocol were enforced beginning in 2005.
  11. Dunlap, R. (2008). “Climate-Change Views: Republican-Democrat Gaps Expand,” Gallup. http://www.gallup.com/poll/107569/climatechange-views-republicandemocratic-gaps-expand.aspx.
  12. Durban U.N. Climate Change Conference (COP17/CMP7) (2011). Durban, South Africa.
  13. Nordhaus, W. (2013). The Climate Casino, Yale University Press, New Haven, Connecticut.
  14. Ibid. Note: Scientists universally agree that at least five mass extinctions have occurred on the earth. According to the National Geographic Society, “[M]any scientists think...evidence indicates a sixth mass extinction is under way. The blame for this one, perhaps the fastest in Earth’s history, falls firmly on the shoulders of humans. By the year 2100, human activities such as pollution, land clearing, and overfishing may have driven more than half of the world’s marine and land species to extinction.” http://science.nationalgeographic.com/science/prehistoric-world/mass-extinction. For more information, see Kolbert, E. (2014). The Sixth Extinction: An Unnatural History, Henry Holt & Company, New York, New York.
  15. Pappas, S. (2014). “More Americans Don’t Believe Global Warming is Happening: Survey.” livescience.com, in collaboration with weather.com. http://www.weather.com/news/science/environment/more-americans-dont-believe-global-warming-happening-survey-20140117.
  16. Bastian, H. (2013). “Motivated reasoning: Fuel for controversies, conspiracy theories and science denialism alike,” Scientific American, blog. http://blogs.scientificamerican.com/absolutely-maybe/2013/10/14/motivated-reasoning-fuel-for-controversies-conspiracy-theories-and-science-denialism-alike.

New & Noteworthy

The national literary magazine, The Griffin, has accepted my short story, “The Prophets,” an account of the express hopes and challenges of an aluminum can. “The Prophets” will appear in the 2014 edition of The Griffin, slated for publication in the fall. For more information, please visit its website.

The editors of Kudzu Review have nominated my short story, “Senseless Man,” for inclusion in the Best of the Net Anthology 2013, featuring writing judged most outstanding among all fiction, nonfiction, and poetry published in 2013 on literary websites. The anthology is the Web’s counterpart to the eminent Pushcart Prize and Best American series in print. In addition to appearing in the online Summer 2013 Issue 3.1 of Kudzu Review, “Senseless Man” was also published in the companion print edition. To read online, and for information about how to purchase the print edition, visit the website.

Of Baboons and Bimbos

I was recently fired from my job in Corporate America, in part, for being too proficient in my work. In the other part, for daring to expect the same from colleagues.

The company that canned me engages in engineering. You know, that useful trade in which the practitioners apply an egghead’s knowledge of math and physics to make sure buildings don’t fall down, electrical grids don’t blow up, that sort of thing.

Of course, proficiency — allow me to be more precise, allegiance to accuracy — matters in every line of work. However, we all know that especially in disciplines like the law, where carelessness can send an innocent to jail; medicine, where a physician’s smallest flinch or misdiagnosis can make for a patient’s bad day; and engineering, where imprecision can cause implosions, the importance of accuracy is redoubled by extraordinarily complex challenges that command not just smarts, but also levelheadedness.

Thus engineering actually requires even more than an allegiance to accuracy; it also demands an intensity of devotion.

Of course, this is how every kind of business should operate. Must operate.

So it’s troubling that my former employer doesn’t.

Instead, its engineers — and The Brass who persistently pussy-whip them into action beyond reasonable performance expectations — mindlessly enable a dysfunctional culture in which productivity is measured not by exactitude, but by volume and speed. At this company (as at far too many other American companies in this age of Profit First), bottom line is the Evil Queen and accuracy is Snow White.

Accountability also takes a back seat. I especially know this. I was responsible for proofreading the engineers’ technical reports, which, before being issued to clients, arrived on my desk invariably full of spelling, grammatical, and, in some cases, even factual mistakes. I also know this because, before client invoices went out, I routinely discovered mistakes in them as well, particularly omissions of time charges and expenses — the result of which was a continuous and significant loss of earnings.

Thus, as if engineer sloppiness isn’t troubling enough, many of the company’s other disciplines also suffer from accuracy apathy.

  • Accounting is known to pay credits as if they are debits.
  • I.T. is forever trying (and usually failing) to repair something or other in the company server.
  • Certain executives never send an email that isn’t full of typos. (I’ll never forget the emails, two in particular, of the most infamous keyboard-challenged executive in the company. In one, he’d typed the word “shit” when he’d meant “shift” — “Refer to the attached report for the specific changes in your shit.” In the other, he’d announced having terminated So-and-So, obviously oblivious to the fact that it is far less violent to terminate employment than an employee.)
  • Administrative Assistants, who are responsible for taking in client assignments, also regularly make a mockery of diligence: misspelling client names, getting phone numbers wrong, bastardizing the language, and distorting the details of scope of work.
  • And so on...

Consider the case of an engineer on assignment who’d started driving west through Missouri, toward Kansas City. Halfway there, he called me.

“Where the hell is this place?”

Naturally I asked him what was written on his assignment.

“Kennett,” he answered.

“Kennett is in Missouri’s bootheel, southeast,” I told him.

The engineer had missed his mark by 300 miles. (I almost suggested that next time he might consider consulting a map; better yet, an online mapping service. But I didn’t. By then I’d willfully ceased being proficient in babysitting.)

The more I pointed out the company’s plethora of problems to my supervisor, the more they fell on deaf ears. So I started taking them up the ladder to his supervisor (who I’d believed was a sensible ally), with a collaborative desire to help steer procedural changes that could fix the problems. In hindsight I realize, of course, that the more I’d solicited that executive’s assistance, the more black marks he’d compiled against me (probably on some spreadsheet where he’d entered my name, misspelled).

Baboons are known to place such a premium on accuracy that they can usually decipher the contents of shopping bags based on their exterior markings. How? Answer: a canny dedication to details. Thus the bag with the grocery store logo is the one they’ve come to know contains the mother lode, fruits and veggies; the other bags, the ones with the pharmacy symbol, hardware nomenclature, twine handles, etc., are superfluous. There’s a lot the bimbos of Corporate America need to learn from this.

Politicians, given their propensity to turn fibs into facts, are infamous for strutting a cavalier attitude about accuracy, especially in election seasons. And the media remind us daily of the mindless and apathetic inattention to precision that mangles projects and causes throughout our increasingly dumbed-down society.

At least two explanations of the preponderance of U.S. workplace negligence are those age-old villains, hubris and deceit. Yet research suggests that our faux pas now are just as often (or more so) the byproduct of post-modern personality quirks: burnout; anxiety; the human mind’s natural resistance to multitasking; the steady increase of favorable performance evaluations despite mediocre work; and, to return to the animal analogy above, the fact that many in today’s workforce couldn’t give a rat’s ass about quality: they’re far more interested in slothfulness.

Can organizational mistakes be a sign of systemic problems? Responding to this question for a recent New York Times article, a leading expert in human systems integration (a fancy phrase for collaboration) answered with a definitive Yes, and proceeded to provide examples of the most common causes of workplace mistakes in the twenty-first century: (1) failure to enforce fact-checking (checks and balances) and other corporate Best Practices; (2) breakdown in communication among departments; and, most telling in the context of my firing, (3) fear among subordinates to question higher-ups.1 (Of course, what did me in at the engineering firm was that I have no fear.)

Are any signs pointing to resolution of these problems? Sadly, at large, the answer is No. In fact, in the U.S. workplace mistakes are on the rise.

Meanwhile, at my former employer (as well as at many, many other companies throughout the U.S.), The Brass continue to get away with treating underlings — the core of company success or failure — not as equal contributors, but as chattel.

  1. “Making the Most of Your Workplace Mistakes,” by Phyllis Korkki, The New York Times, January 17, 2009.

Tight Squeeze

A few years back the most amazing thing happened; or didn’t, depending on your perspective. Either way, it was a game of inches.

It was midmorning, a time of transitions. The previous day’s mind had finally reawakened and the sated body was revving in full gear. I was situated at a corporate desk on a contract assignment, typing a brief about some sort of boiler explosion that an engineer I was assisting was soon to investigate, the workday well into its forward march, when one of those gurgitations announced itself in the innards in its precarious way.

Instinctively (and very tightly) I squeezed my cheeks and held my breath, in that same instant rising with the utmost care and a desperate desire to get to the rest station down the hall just as fast as a duck walk may allow.

To my horror, when I arrived housekeeping was...tending the thrones!

WHAT TO DO?

Firstly, I waddled right up to them. Now with every step a bubble of air popped behind me and there was no telling which one might suddenly yield to the liquid state. Secondly (actually, still firstly), I redoubled the squeezing. This segued naturally into a quite deliberate pacing in circles and spewing of Oh don’t fail me nows!, the former albeit somewhat measured such that it might spring ‘a hint’ but not the gurgitation.

To be sure, the cleaning people took their tissues and frightened looks and scrammed!

O telltale clock! No longer ticking, it was pounding. Marching bands were playing John Philip Sousa, fat men were bursting out of cannons; stars and birdies were dogfighting around my head. Time was nearing its end: Gravity was about to obliterate it!

And then—

And then—

Tsunami in a toilet!

“Is everything O.K. in there, sir?” asked housekeeping, who had just returned. “It sounded like something — crashed.”

I’m not sure whether my response was a wheeze or a toot.

“You needing some help, sir?”

“No.”

“You sure?”

I don’t need any help!

This time their exiting footsteps were even more fleeting. And in their trailing voices I thought I heard “Eee-uuu.”

Finally I reopened my eyes. Panting, sweating, heart still in swift revolt, I looked in front of me, behind me, and under me. My pants, they’d made it — undefiled!

For a time I didn’t move (‘movement’ in this case referring to the act of getting up and far away). First I had to square whether this was God’s punishment for my having returned to work before entirely kicking the flu and pampering the famished stomach with a half-pound burger piled high with blue cheese chunks, a plateful of seasoned fries, a bowl of creamy chicken soup, and three beers. But really, it didn’t matter. For I’d just defied the odds.

O relief! O miracle! O...

I rushed to turn on the fan.

Take Me Out Of The Ballgame

Of the four seasons (earth’s perfect quatrain), it is the spring that gives this poem of life its vibrancy. And baseball its music.

The spring’s revitalized sun wakens winter’s sloth and entertains his children well, its warming breath casting the lovers’ spell; and baseball gives rhythm to these rites.

And so it is baseball that turns the cold air warm and the sun so much brighter than the winter’s, glazed and steely; it is baseball that makes the dogwoods flower and the cherries blossom; baseball that makes the masses waken to crave again jumbo dogs and slushies, sudsies and pinstripes. Baseball that makes the old young and the young wiser.

This is how baseball has been in America for more than half its life. And how it also was for me.

But no longer does this game — I should say, rather, Major League Baseball’s (MLB’s) version of it — invigorate me. No longer does it resonate.

Its alienation (its crass infringement on trust) began in 1994, the year of the shortened MLB season: more precisely, the year when greed and egotistical pettiness ended play. On September 14 — almost exactly a month after the players had struck against the owners, and the day acting Commissioner Bud Selig called off all remaining games, including the World Series — I found myself, devastated (a former player and, at the time, a youth-league coach), suddenly barely clinging to Big League baseball, and this only because of the pure game’s indelible assets: its elegant, civilized form and function; its eminent social and celebratory caliber; and its profound gift of providing pleasure and reminiscence for innocents fortunate to have been born to the game’s graces ever since its own birth in the 1840s, in the time of my great-great-great grandparents.1 Surely, I thought, the professional game would survive the gamers (as it had at other grim intervals; e.g., the 1919 Black Sox Scandal), its fans once more to intervene and preserve The Great American Pastime; and I was right. But that summer, baseball’s professional steward, MLB, had missed a crucial sign and swung badly, outside and low.

Stunningly conspicuous about the 1994 players’ strike was its breathless deceit — the owners, the players, and their favorite mascot, American Capitalism, doing their best finally to strangle the fun out of baseball and solidify it as an extravaganza more about purse and payoff than pastime and pride. Not as immediately conspicuous was its effect on the fans, who, for weeks, months, for some, years afterward, suffered between numbness and bitterness and the woeful indecisiveness at their nexus: whether to remain loyal to MLB (or even to their favorite MLB team) or bail. The first sign of rebellion didn’t materialize until the next season, as attendance declined. Yet that winter the fans had already begun rising en masse against the business of baseball — and for me, MLB had begun to fade to irrelevance.

But foolishly (as time would prove, also ironically), I, and too many others among the outraged, in 1998 turned the other cheek to MLB’s indignities when Mark McGwire, of the St. Louis Cardinals, and Sammy Sosa, of the Chicago Cubs, slugged away at New York Yankee Roger Maris’s 37-year-old single-season home-run record (61), and then commenced a veritable love affair with the game when McGwire shattered the record with 70 — all the while giddily ignorant of the fact that all along he and his bat had been conspiring with a third force, drugs, to propel all those balls over the walls.

Enter “The Steroids Era,” legacy of that MLB firestorm of dingers. For the next 15 years, more than 100 MLB players would test positive for performance-enhancing drugs (PEDs) or, like McGwire and Sosa, be implicated in their use, and nearly as many would fall under suspicion, spawning widespread, and unprecedented, player disgrace (and penalties), fan disgust, MLB rules changes, even a congressional inquiry.

The consequence: fan revolt anew, culminating, this time, in a mass boycott of MLB. True? Unfortunately, false. For despite the now seemingly endless shames that taint Major League Baseball, it continues to draw fans by the tens of millions. How?

To get to the answer, begin with MLB’s latest embarrassment: the drug-related game suspensions of Milwaukee Brewers outfielder Ryan Braun, New York Yankees third baseman Alex Rodriguez, and 12 other MLB players.2 Many cheats had already preceded these latest among the infamous. But what sets Braun, Rodriguez, and the dirty dozen apart is how their indignities, as none before, sweepingly amplify not just the collective shames of MLB in the twenty-first century (icon worship, undue tolerance, enabling, etc.), but especially the fans’ collective response to them: a perpetual trance of forgiveness and forgetfulness, as if every childish player’s slipups this millennium haven’t really been “that bad,” at least not enough to warrant — boycott. Besides, think a majority of fans, if most of the guilty haven’t been caught by now, soon they will be, and then finally the whole bloody steroids era will have quickly ended, and all again will prevail in goodness, and Jupiter will align with Mars, etc., etc. But this is a dream tinged in ignorance.

Braun and Rodriguez, especially, each of the body and mind of the quintessential cheat, consummately portray the severely unbalanced consciences of all of today’s ethically challenged athletes (including, alas, those who shall follow ad infinitum). Their characteristics are as old as humanity, archetypal: first the rule-breaking (stupidity); then the lies (egotism): Braun, “All [of my achievements] are a result of me...carrying myself the right way and staying out of trouble...,” and Rodriguez, “Steroids? Gee, why would anybody take them? What do they do? I don’t know anything about it”; and more lies (narcissism): Braun, “Everything I’ve done in my career has been done out of respect,” and Rodriguez, characterizing, as “not legitimate,” allegations he had used PEDs provided by the Biogenesis clinic in Coral Gables, Florida; and then, upon proof that each, indeed, had violated MLB rules, admission [sort of] of culpability (self-righteousness); and finally, apology (manufactured contrition): Braun, “I realize now that I have made some mistakes,” and Rodriguez, “I got caught up in this ‘Everybody’s doing it’ era...I feel deep regret for that.”

As if this isn’t all trust-busting enough, nearly as disquieting is the ongoing profusion of halfwits (players and fans alike), who, despite the overwhelming evidence, the proof against, continue to stand by their men.

If, from the schoolyard, a child brought home the kind of disgraces that pervade MLB, his parents (that is, the normal ones) would surely ground him for at least a month of Sundays and probably also throw in counseling for good measure. Yet to a majority of fans the attitude persists: Who cares that Ryan Braun and Alex Rodriguez and so many others have been banned from playing out the season — and that others just like them lurk around the corner. Let’s keep buying tickets!

Are the hoodwinked forgivers and outright disbelievers just as stupid as the deviants they persist in cheering? Many are, certainly. However, I think that most persons who continue to support — emotionally and economically — the tainted players, and this tainted game, are more so moved by a Pavlovian-like earnestness to preserve the game at all costs, not to mention also a conscious (or unconscious) fear of feeling forever heartbroken if they don’t.

But what game? This baseball is business. And today, baseball is Big Business. Gigantic, more-form-than-function stadiums with pristine turf and squeamish butt-challenged seats with no elbowroom stacked in Twiggy aisles five stories high — all this to “enjoy,” in an afternoon or evening, for the mere equivalent (for most families) of one third, or one half, of one week’s earnings (or, if you like more comfy, a climate-controlled viewing station called a box, for a mere one year’s salary). And that’s not all, folks! Outrageously priced watery beers, wimpy hotdogs, box snacks with thumb-sized cardboard prizes inside — all delivered by vendors who regularly impede your view in a continuous marching up and down, down and up, in excessive numbers, like riot police. All told: your esteemed privilege to subsidize this game’s players in the form of loot they wouldn’t need for a dozen lifetimes, let alone just one.

I am sad. I miss this old pastime, uniquely American. Yet in its MLB version, baseball is the former good buddy who stole my girlfriend.

I hope I may never give in to temptation and attend another MLB game. For the love of baseball, I may give half an ear to the MLB version on radio or catch a few innings now and then on TV. But I shall no longer abuse my wallet by dragging it into an MLB stadium.

From now on, I will enjoy baseball’s professional version at fields at which the great game is played with the requisite skills by part-timers (as a “side line”), where the highest seat overlooks a somewhat healthy leap to a flower-lined walking path below, where clowns and children gleefully run the bases between innings, where the food and drink are not much pricier than fast food, and where most of the going tickets are graciously exchanged for all of a five-spot. From here in St. Louis, these are the ball fields of the River City Rascals, performing just west of the city in suburban O’Fallon, and the Gateway Grizzlies, performing not far east, across the Mississippi River in Illinois, on an old cornfield.3

For me, this, now, is baseball as the professional version ought to be played.

And not at all coincidental is that around the perimeter of one of these ball fields stands a quaint grove of decorative trees. Dogwoods.

  1. The first recorded “base ball” game, June 1846, was played at the woodland-encompassed Elysian Fields of Hoboken, New Jersey. The New York Baseball Club defeated the New York Knickerbockers 23-1.
  2. This summer, MLB levied suspensions ranging from 50 games (for 12 dopers) to 65 games for Braun and 211 for Rodriguez, including the entire 2014 season.
  3. With luck, a team of the Frontier League may play near you.

Happy [Less Than] Independence Day

On this day, July 4, 2013, the day of annual celebration of our forefathers’ declaration of freedom over oppression and equality for all, Americans across the country who care to are putting aside their beers and barbeques for a few moments to reflect on the larger celebration before us: that in our nation’s 237th year, the guarantees of life, liberty, and the pursuit of happiness that Americans in 1776 recognized “to be self-evident” endure; that still, today, our rights are endowed by our “Creator.”

And yet, sized up not under the laws of God, but critically, those of man, how do freedom and equality for all fare now in America? And what of those guarantees? Self-evident in concept, sure. But does the test of self-evidence hold up also in practice? In short: Do we Americans really have much to celebrate today?

Before anyone answers, it would be prudent for those who wrap themselves in America’s great mourning cloaks — shame and denial — to untie their barbeque aprons and refasten them, tightly, around their eyes; because for an increasing number of Americans, blindness seems vastly preferable to recognition — and for just as many, or more, far more palatable this day than burgers is bigotry.

Just ask the targets of their timeless prejudice: minorities. Ask the African American citizen whom, 237 years after slaveholder Thomas Jefferson declared that “all men are created equal,” millions of fellow Americans still demean, dismiss, and bully. Ask the gay citizen who, in addition to having to confront bigotry, had to wage a many-years-long struggle to finally enjoy the same “inalienable right” to marriage as his heterosexual friends. Ask the Hispanic, Muslim, and other “alien” citizens, whom fellow Americans regularly detain, based primarily on looks alone, for proof of legitimacy. Ask the elder citizen who, if he is living out his last days in an elder-care home, currently has a 30% chance of being neglected or abused.1 Ask the Veteran of the Iraq or Afghanistan War to whom, ever since his return to the Land of Opportunity, fellow Americans have yet to proffer gainful employment, let alone mental, emotional, and physical aid. Ask yourself...

It may soon become cliché (alas, rightfully so, given the state of our Union) to mock Jefferson’s meditation on freedom and happiness. A wonderful nod to human potential, his Declaration of Independence. Yet as words to be believed in the context of surety, at times ever since (as, at present), his honorable assertions have found themselves suffocating in an ever-bloating class of misfits, to name but a few: Uncle Sam, of his ongoing lie about getting fiscally responsible; Big Business, of its entrenched refusal to abide by regulations that would protect the welfare of its employees, the financial interests of its shareholders, and the health of air, water, and earth; George W. Bush, of his insistence that the U.S. had no other choice than to invade Iraq, though his decision was based only on a presumption: that Saddam Hussein was stockpiling weapons of mass destruction (an invasion which subsequently enabled the deaths of more than 100,000 Iraqi citizens); and Congress, of its childish members’ unending feuds and stalemates, and dizzying devotion to vacations and pay raises.

In our fair land of hyper-intolerance (and too much tolerance of intolerance), one of the rationales constantly given for why prejudice and inequality are still so rampant is that we are a nation too stressed-out, or too exhausted, or too identity-challenged, or...; and, in any case, that too many now are too angry and apathetic; thus have we either forgotten how, or lost the will, to control those particularly pervasive feelings: desperation, fear, hopelessness. Hogwash. The real problem is that our nation has grown too big and dumb from too many half-developed adults: eternal brats who in childhood were never taught the basics of healthy behavior (politeness, decency, social aptitude, honesty, and so on), or who were taught but, for one reason or another, decided upon reaching independence to subordinate good and goodwill.

(Consider the likes of Alec Baldwin, who last week characterized a British reporter as a “toxic little queen” after the reporter noted that Baldwin’s wife had tweeted during actor James Gandolfini’s funeral; consider ESPN reporter Chris Broussard, who, after NBA player Jason Collins in April had publicly announced he’s homosexual, responded (with a particularly well-spiced intolerance) that he believes Collins is in “open rebellion” against God; consider Todd Akin’s (R-MO) misogynist-bordering-on-prejudicial comment about rape in last fall’s Missouri senatorial race: “If it’s a legitimate rape, the female body has ways to try to shut the whole thing down”; consider the radio blowhard Rush Limbaugh, who, in recently commenting about the possibility of Hillary Clinton running for president in 2016, asked Americans if they “want to vote for somebody, a woman, and actually watch a woman get older before their eyes on a daily basis”; and, of course, consider Southern cook Paula Deen’s outed prejudice after statements like “I feel like the South is almost less prejudiced, because black folks [implying servants] played such an integral part in our lives” and “Come out here, Hollis [Johnson, her driver, bodyguard, and assistant, who at the time was gazing at her from offstage]. We can’t see you standing against that dark board.” And on and on and on...)

Prejudice is like sandstone: Its layers keep multiplying and multiplying, unendingly. As a twenty-first century American, Chef Paula may quite possibly be dean of a “new prejudice,” a kind through which (despite continuous advancements in what constitutes “right” and “fair,” advancements one would suppose would naturally kill prejudice) she chooses to stay stuck in old ways, despite their having been proven to be wrong ways — in her case, the ways of the Old South, from which she learned her particular brand of ignorance and its bad habits. Habits she has not corrected, because she refuses to evolve with correctness.

It is not without irony that this year’s Independence Day celebration coincides with the 150th anniversaries of the Battle of Gettysburg, the three-day (July 1-3) American-against-American bloodbath that culminated on the very eve of Independence Day, and of the Gettysburg Address — both of which shook mindful Americans to an abrupt reassessment of cultural and political best practices.

In one of perhaps millions of letters written by soldiers during the war, Union officer Sullivan Ballou, in a particularly soulful letter to his wife and children, declared: “...my love of country comes over me like a strong wind and bears me irresistibly on with all these chains to the battlefield.” He continued: “I know how strongly American Civilization now leans upon the triumph of the Government, and how great a debt we owe to those who went before us through the blood and suffering of the Revolution. And I am willing — perfectly willing — to lay down all my joys in this life, to help maintain this Government, and to pay that debt.”

Sullivan Ballou: American of conscience. Invested American. An American who, like most others of his time who supported the Union, believed that, in return for its guarantee of freedom, he owed his country, at once, a debt of intrinsic devotion as well as active participation in the good causes that preserve freedom.

Similarly, with his Gettysburg Address, Abraham Lincoln reaffirmed that “all men are created equal.” And then in earnest he hailed:

“...the great task remaining before us — that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion — that we here highly resolve that these dead shall not have died in vain — that this nation, under God, shall have a new birth of freedom....” [My italics.]

Six score and thirty years since that solemn call to devotion, our great country seems needful of yet another new birth of freedom.

  1. ABC News, citing a congressional report.

The Careless in Elder Care

Those who are reared by parents who understand and impart the basics of human civility, like ethics, social skills, caring for others — in other words, by parents who act like adults — are fortunate, indeed. For in this era of rampant selfishness, aloofness, and apathy, the fortunate are fewer and fewer.

It is old news to note the decline of things like neighborliness, good customer service, and loyalty in business: good old-fashioned qualities that went out with the ’80s. Yet to well-adjusted adults who mourn these losses, what is not old is our utter inability to rationalize them. We wonder: Why now are so many people indolent, angry, indifferent? In all their wisdom, my best teachers impressed upon me how much easier it is to live by acceptance than by animosity, and given how significantly different these opposites make not just the recipient feel, but also the giver, they were right, of course. Then why, given the choice between accommodating and alienating, do so many these days choose the latter?

Particularly confounding is why so many who make their living providing care to senior citizens actually care very little. (Although precise data about how many older Americans are being neglected, exploited or abused are currently not available, according to the National Center on Elder Abuse, “Evidence accumulated to date suggests that many thousands have been harmed.”1 According to best estimates, as many as two million Americans aged 65 years and older have been mistreated by someone on whom they depend for care.2)

I know a wonderful brood, three sisters and a brother, who recently had to help their aging parents make the physical, and emotional, adjustment from real home to nursing home: in the lingo of health care, to an assisted-living facility. In assisted living, apartments are made available for elderly persons who are still capable of living independently, yet who need (as one senior-living website notes) “a little extra help to maintain their independence.” In other words (for thousands of dollars a month), assisted-living residents are provided 24/7 access to a full breadth of health-care services and licensed health-care professionals, from care companions to nurses and physicians.

In concept, splendid.

However, in the case of my friends’ parents (and too many others in their position), in the brief time since they arrived at their elderly-care home they have been beset by grievances, because too many of their caregivers continuously demonstrate that they don’t care. Among many cases in point:

1. Due to chronic arthritis, Mom has limited mobility; consequently she sometimes gets crotchety. Rather than accommodate her mood swings, or, at the least, brush them off (either response should be routine for health-care professionals; after all, grouchiness is common among the aged), several on staff regularly patronize her; some outright ignore her.

2. Mom and Dad have a beloved cat (who also happens to be elderly). Recently the litter box began to stink — this, because Mom had to move temporarily to the rehabilitation center for physical therapy, and Dad, who has middle-stage Alzheimer’s, sometimes forgets. On the day of Mom’s departure, a member of the housekeeping staff — who was aware of Mom’s and Dad’s situation — made an effort during her regular visit to help Dad clean the box. This assistance continued another day. However, on the third day a supervisor told the housekeeper to stop: Cleaning cat litter boxes is not in her job description. Later, as the supervisor was informing the siblings of this facility “policy,” she made an offer: The housekeeper would resume litter-box cleaning for a monthly fee of $87.00 (this on top of the hundreds of dollars Mom and Dad had to pay, prior to moving in, for the “privilege” of keeping their cat), though not on weekends. When, naturally, the siblings queried the supervisor, she blamed the cat. “It should be gotten rid of,” she said.

Elderly-care homes are regulated at the federal level by the U.S. Department of Health and Human Services and the Centers for Medicare and Medicaid Services. One wonders how diligently these agencies watchdog their fair-practice standards.

At the senior-living community where Mom and Dad reside, its website boasts telling quotations: “It’s the most wonderful job in the world” (employee) and “Everyone is so friendly here” (resident). Of course, one would hope that these sentiments are real, given not only that they exude care at an enterprise at which care is paramount, but also given that more and more senior citizens are moving to elderly-care homes, including the Baby Boomers (5% of Americans aged 65 years and older currently live in elderly-care homes; by 2020 this number will have more than doubled3). Yet one wonders whether these appraisals are authentic or rather products of some ad agency’s creative brainstorm. (A third quotation — “Seriously. Fun. Living.” — is also attributed to a resident. Seriously, who talks like this?)

For a senior citizen who no longer possesses the strength or mental facility to maintain his or her own home, transitioning to an elderly-care home presents a major emotional challenge that should not be exacerbated by careless caregivers. Yet sadly — especially to dreamers like me, who keep wishing for a more perfect society — for too many elderly-care givers, “care” is a state of the mind and heart of which they are ignorant, or about which they couldn’t care less.

Someday many of these caregivers may feel differently when it comes their time to seek elderly care.

To those fortunate for whom caring comes naturally and from good upbringing, this irony is palpable.

  1. National Center on Elder Abuse.
  2. Elder Mistreatment: Abuse, Neglect and Exploitation in an Aging America. 2003. Washington, DC: National Research Council Panel to Review Risk and Prevalence of Elder Abuse and Neglect.
  3. U.S. Bureau of the Census.

Brave New World

Wonders never cease in our society of increasing artificial “intelligence”...

At Starbucks recently I ordered a latte and muffin: total cost, $5.88. When I offered a 10-dollar bill as payment, the cashier’s hands flew up. “I’m sorry,” she said, “but we cannot accept cash.”

I gazed left, then right, and also behind me, to make certain I’d not suddenly been transported somewhere else. I even looked down to make sure my fly wasn’t open. A store not accepting cash for payment? This was like a priest not accepting confessions of sin. In fact, never before had anyone I’d offered money to failed to snag it.

“Oh, I get it, you think this is a—!” I finally responded, waving the bill, yet stopping mid-thought at the possibility that this cashier suspected me of trying to pass a counterfeit. She was young, and quite radiant, like a computer screen in a dark room, with freckles and big saucer dimples, and blessedly of an age at which innocence still wrapped her snugly.

“I’m sorry,” she began again, “but as I said,” this time placing a hard emphasis on that legal-tender word nearly as old as humanity itself, “we cannot accept cash.”

“Then what am I to do?” I asked. “I want my coffee and muffin.”

“We’ll need a credit card,” she said.

“I don’t have a credit card,” I said. (I lied; I didn’t want to pay with a credit card.)

“Debit card, then.”

“Strike two.”

“Well, of course we also accept smart phones,” she pressed on, meaning, of course, apps.

“No smart phone, either,” I said, showing her my flip-up cell phone with holster, circa 2004.

By now the shift manager, who’d sort of been listening in, lumbered over. “Go get some paper and record this customer’s order,” he instructed, adding, “and start keeping a list in case you get any more of these cash orders,” his face as red as a coffee bean before it’s picked. Then he looked at me. “This machine, sir,” he said steadfastly, referring to the uppity point-of-sale unit (alias cash register) that was the cause of all this trouble. He paused. “That is, the system that makes this machine work, sir,” he elaborated, “—it is not presently allowing us to accept cash, only plastic and apps.” Again he paused, this time either to catch his breath or unscramble his brain. Then he quickly finished: “So why don’t we just go ahead and swipe a credit card now — we accept almost every kind — and then everything will be all hunky-dory, shall we?” And with this the girl leaned over and whispered the awful truth in his ear.

Meanwhile my latte and muffin had just arrived on the counter, and to free up my hands I put the lonely 10-spot back in my pocket. At the same time, to the girl the manager muttered something to the effect that of course they’d had no choice but to honor my order; it wasn’t my fault the cash feature of their compu-register had crashed so that they couldn’t access the money drawer. And then away he stomped, turning back only ever so briefly to add, “District ain’t gonna like this.”

I smiled at each of them in turn, and, backing away with my free breakfast, bid them adieu.

But by then I was just static to the Starbuckians.

For it seemed their present concern was how to make things right with the machine....

2 Stories Set For Publication

Two of my short stories, “The Mourning Cloak” and “Senseless Man,” have recently been accepted for publication.

“The Mourning Cloak” tells the story of a divorced father who perceives his imperfection, and ultimate self-forgiveness, in the form of an unusual savior. The story will appear in the 2013 issue of the Bryant Literary Review, published under the auspices of Bryant University and slated for publication in May. To order, please visit its website.

“Senseless Man” concerns an individual who literally loses his senses in a struggle for his original, his better, nature. The story will appear in the June issue (3.1) of the Kudzu Review, published under the auspices of Auburn University. To order, please visit its website.

Blink And You May Get Fired

This post is an excerpt from the essay ‘Blink And You May Get Fired’.

I just lost my fourth job in ten years.

The first ax fell at a national department store whose CEO decided to “trim fat” as he continued ringing up $300,000 in quarterly bonuses. I lost the next job at an ad agency, where the principals overcharged clients while freezing employee raises; and the next at a well-known corporation, where the new president decided to deep-six the marketing department in favor of outsourcing. The latest demise shares a core commonality with the others: termination without cause. In other words, I was fired simply because it was my employer’s will to fire me. Sue, you say? Gladly I would, were it not for a legal doctrine that makes any chance of winning in court all but impossible. It’s the doctrine of at-will employment; and it gives employers in every state but Montana carte blanche to dismiss employees at any time, for any reason (except discrimination and a few other provable wrongful motivations) — indeed, for no reason at all. (So much for the American Dream.)

How can such a clearly biased doctrine be legal?

Continue reading...