Thursday, October 26, 2017

Ishmael, Or The More Things Change…


[Note: This is a lengthy post.  I hope that by sticking with it you’ll be rewarded.  If you don’t have time right now, it will still be here when you do.]

When I read the final paragraph of Michael Powell’s “The American Wanderer, in All His Stripes” (NYTimes, “Week in Review,” 8.24.08, p10) a shudder ran across my shoulders.  First, I must acknowledge Mr. Powell’s courage in attempting the onerous task of compressing the archetype of America’s picaro or isolato into a very brief article.  This archetype includes some of America’s fondest and most curious characters, such as Natty Bumppo, Huck Finn, Jay Gatsby, Jack Kerouac (of the many whom Powell mentions in his review), and, of course, Ishmael.

Powell is not so much interested in literary character analysis as he is in tracing the legacy of wanderlust in the American cultural personality.  He cites familiar census data to illustrate this, but I think most American families can feel it in their bones.  We are, in fact, wanderers by virtue of our being here.  With the exception of what’s left of Native Americans and the descendants of slaves, we all got here by willfully moving away from somewhere else.  For most, this became a cultural modus vivendi—it’s what we do and who we are.  And for a few it was and is the result of desperation.  To put it succinctly: This is a highly complex part of the American character, one that first came under scrutiny in the wake of Frederick Jackson Turner’s proclamation that American freedom, as illustrated in this urge to roam, had finally come to an end—the frontier was closed.

[Please bear with me.  I will be dealing with Ishmael and my shudders very soon.  This is all very important background.  It’s the stuff most of your literature instructors neglected to tell you.]

American authors have tried, often successfully, to capture the voice of this persona.  Many scholars cite Fenimore Cooper as the first to create it in his character of Natty Bumppo, the white European who felt more comfortable among the Native Americans than he did among his fellow encroaching Europeans—when he heard the sound of the axe felling trees, the first sounds of settlement, he knew it was time to move on.  This sets the tone for the American who at least seriously doubts the virtues of civilization, who openly opposes it and willingly “plays” its true believers (cf. Melville’s “The Confidence-Man” and Twain’s “The Mysterious Stranger”).  As Powell indicates, Jay Gatsby, literally a “self-made man,” was just such a character, his wandering being part of the cover he needed to gain respect from the higher social orders as well as criminals.  Powell also cites this intolerance of the ways of civilization in Huck Finn’s coda to his funny, painful and ultimately dismaying narrative: “But I reckon I got to light out for the Territory ahead of the rest, because Aunt Sally she’s going to adopt me and sivilize me and I can’t stand it.”  What Huck, Natty, Nick Carraway, the voices of the Beats, Ralph Ellison, Richard Wright, etc.—and Melville—felt was that civilization American style was not an especially humanizing experience.

So what about Ishmael, and why is this relevant to the current presidential gavotte?  Last part first: Mr. Powell demonstrates that the biographies of our current presidential candidates fit the literary and cultural archetypes that he discusses.  He strongly implies that this neat fit will comfortably resonate across the media, so that each candidate will have (I suppose) a sub-textual identification with the American voter.  That’s my inference; be that as it may, the really serious issue is a deep understanding of the Ishmael personality.

First, he is subtly duplicitous.  Powell cites this, quoting the famous opening sentence, “Call me Ishmael.”  This provides the narrator with two essentials for his purpose: a cover for his anonymity (see Melville above) and a direct biblical connection to the wandering son looking for his mother (Freudian scholars loved that part).  Powell also cites the first part of Ishmael’s self-characterization, “Some years ago, having little or no money in my purse and nothing to interest me on shore, I thought I would sail about a little and see the watery part of the world.”  Remember: This is after the fact of the cataclysmic fate of the whaler Pequod, its monomaniacal captain and its inter-ethnic crew.  Directly after that introduction (this part Mr. Powell does not cite) Ishmael gets us a little closer to the truth of his life and personality, saying “Whenever I find myself growing grim about the mouth; whenever it is a damp, drizzly November in my soul; whenever I find myself involuntarily pausing before coffin warehouses, and bringing up the rear of every funeral I meet; and especially whenever my hypos [melancholia, blues] get such an upper hand of me that it requires a strong moral principle to prevent me from deliberately stepping into the street, and methodically knocking people’s hats off—then, I account it high time to get to sea as soon as I can.  This is my substitute for pistol and ball.”  A little more severe than “nothing to interest me on shore.”

OK.  So here’s a person most people would give a wide berth (sorry, couldn’t avoid that).  Let’s keep this picaro, this isolato, where he belongs—isolated.  And now for the source of my shudder.  Powell links both Obama and McCain with this character type.  And he closes the article by quoting Arnold Rampersad: “The next U.S. president is going to be Ishmael, whether we like it or not, and whether he knows it or not…Fortunately, both Obama and McCain know that they are Ishmael.”  And unfortunately, Donald Trump doesn’t know and doesn’t care.  I’ll jump the shark here and cite Bannon as Trump’s Ahab.

Personality analysis from literary character to living humans is a stretch, no doubt.  But what if it’s not?  Ishmael learned a lot from the trauma of his “sail about…the watery part of the world.”  First, don’t trust people; don’t expose who you actually are.  Second, knowing the dangers inherent in the “damp, drizzly November of [your] soul,” get away from people, or at least away from normal society’s “sivilizing” behaviors.  And finally, tell your story to evoke the greatest sympathy among your listeners.  This, then, is our choice in our November?  Hence my shudder, which has been resurrected.  If only Trump had moved on.

Addenda
Conventional wisdom among literary scholars is that the Pequod symbolizes the world, and its crew symbolizes representative humanity (albeit exclusively male, but that’s another long post; meanwhile, you could read Melville’s Redburn to get an idea of his exclusively male universe).


Ishmael is saved by the buoyancy of Queequeg’s empty coffin, his memory of his soul mate, an echo of the “coffin warehouses” of the opening paragraph.  So what we have learned is nothing except, of course, that the threat of death is the ultimate but unavoidable motivator.

Wednesday, October 25, 2017

Senator Flake and His “Complicit”


In all of the thoughtful words that Senator Jeff Flake articulated in his assessment of statesmanship and leadership, and the lack thereof in Donald Trump’s presidency, the word that put me on a course of deeper consideration was complicit.  Flake was obviously implying that those among his colleagues in both Houses who have withheld honest assessments of Trump’s behavior, demeanor and dissembling actually form and promulgate the basis for the dysfunction of United States governance and the decline of America’s international reputation.  And apparently his remarks triggered very little refutation among those colleagues…so far.

When one word jumps out from hundreds in a single communication and sticks in my head overnight, I wonder why.  Does the word have special significance?  Is it just a curiosity for me?  And in a larger sense, had Senator Flake chosen it to implicate more than just his colleagues in his denunciation?  When I suffer these kinds of questions, I resort to an etymology source to try to figure out what’s going on.

My search discovered that the word’s construction and the course of its usage have taken significant turns over the centuries.  From the Late Latin and Old French, the com- prefix meant coming or working together, or partnering to achieve a goal.  The -plic- root, from the Latin plicare, “to fold,” has cognates in Greek, Russian, Old Norse, High German and Old English—all suggesting or meaning a folding or plaiting together into a single unit. (Online Etymology Dictionary)  All of this raises the question of how this word could come down to twenty-first century American English as a pejorative or negative marker, indicating a kind of shared coming together in order to separate or to disunite, rather than to combine or literally entwine to form a stronger unit.

So contemporary usage implies that Senator Flake’s direct purpose was to scold and/or shame his colleagues.  And, I think, based on their apparently timid responses, he accomplished his purpose.  The proof will be whether their inaction turns to action.  But then I got to thinking some more.  Perhaps, consciously or unconsciously, the senator had us in mind.  I could be projecting here, but so much of what he said applies to the current state of our national consciousness, or lack thereof, that it unmistakably resonates with the disarray of civic purpose and direction that encumbers our communities large and small.

We don’t seem to know what to do about, or how to care about, our 17-year-old forever war in Afghanistan.  We don’t seem to know how to stop the cancerous scourge of opioid addiction and death across the land.  We don’t seem to be able to muster the desire to call out the bad and horrific behavior of some men toward women.  And in the midst of all this and more, we seem to be satisfied to burrow our eyes and our minds into the small and large screens we are addicted to, because they reinforce our insular and isolated thinking and beliefs.  The so-called social media blinds us and desensitizes us to the others surrounding us, because that media comprises anti-socially ensconced echo chambers.  We prefer navel gazing and sharing to seeing and thinking beyond our cubicled lives.


Yes, Senator Flake must have had all or some of this in mind as he chose his words.  The tone of futility and exasperation behind those words clearly meant them to reach beyond the halls of Congress and into the minds and souls of his fellow citizens.

Tuesday, October 24, 2017

Real World, Actual World, Virtual World


“Just wait until you get into the real world.”  “Things aren’t like that in the real world.”

We hear these jeremiads when we are in…what?  The unreal world?  The surreal world?  Mostly we hear them from people in the commercial or business world, the world of pirates, con artists, “masters of the universe”, the population of the “competitive” world, which exerts most of its energy and time scheming and scamming ways to avoid direct competition…the kind that happens in the actual world.

But that’s not my point.  My point is about our blithe acceptance of the word “real” to mean something akin to “substantive.”  Whatever is real is not only what we perceive but also how we perceive it.  All that is in our real world runs through the filter of who we are and what made us who we are.  For example, a person like me, partially color blind, does not see a green traffic light but rather a sort of grey-blue-green (maybe) light.  My wife sees orange lights, not yellow lights.  That person over there is really tall only because the perceiver is really short.  That’s the easy stuff.  There’s harsher stuff.  For example, every Vietnam veteran I talked to told me the experience “over there” was reality; this homeland stuff isn’t reality.

So any “real world” is about the perceiver of his or her outer experience.  The world exists as perceived, not as it is. This used to be called subjective idealism.  I’m certain it has a more obscure label now.  So, is there another world? Yes.  It is the “actual world.” 

The actual world is a stand-alone world.  As I look out the window to my left, I see various tree and bush leaves gently swaying in a shadowed breeze.  Seen from my air-conditioned room, it is a cooling experience.  But the outside temperature is actually nearly 100 degrees with high humidity, not at all cooling.  I know there is a bird feeder on the other side of the house, but I don’t know its current condition.  Have the squirrels ravaged it?  Has a new species decided to visit it?  Are the chipmunks having a picnic underneath it?  These things have occurred, and might actually be occurring now.  Before we humans actually landed on the actual moon, the moon held all sorts of visionary realities for humanity.  Actuality has a nasty tendency to drain the romance from reality.

And these days, because of the new magic of digital imaging, we have the virtual world.  The virtual world is a created combination of the real and the actual.  Some of each is placed in there, depending on the whimsy and perception of its creator.  In this sense, the virtual and the real worlds share a common progenitor—the mind of the perceiver/creator.  That is, the individual human.

The irony in all this is that most of us prefer the real and the virtual to the actual.  We seem to prefer our perceptions of the actual world to the actual world, in fact.


This, of course, affects what we mean by the “truth.”  The truth is not the same as the fact.  The truth is the meaning we give to the fact.  A fact has no significance to anything but itself, its ontology, its measurable characteristics and its history.  With truth, as with reality, its meaning and significance are determined by perception.


Monday, October 23, 2017

The Mute Anxiety


For all the words spoken and written about education “reform” and all the bloviating about teacher benchmarks and evaluations, very little is said about what and how teachers feel.  And part of the reason for this is that teachers keep their most important feelings about what they do to themselves.  They will share these feelings sometimes with intimates and occasionally with colleagues…but only sometimes and occasionally.  The crisis for teachers is that they never, ever know their actual effectiveness, never know on a daily basis how what they say and do affects the individual minds in their care.  And this is why they express such outrage at the thought that this phenomenon can be quantified from test results or even professional observations.  This is the mute anxiety that no one without years of classroom experience knows.

I have a souvenir, a T-shirt tacked on the wall above my workspace.  It was a gift from some of the media studies majors in my department, each one signing some good wishes and thanks on my retirement.  Most are pretty much what one would expect, but a particular note haunts me even today as I look up at it.  It haunts me because I can never know what the student meant.  It reawakens in me the very uncertainty I’m trying to explain.  She wrote:

“Roger
Thank you for your wonderful insights and inspiring lessons.  I’ll miss you terribly!”

First, let me explain this student.  She came to media studies, a bachelor-of-arts program, as someone new to higher education who was not certain what the degree initials even meant.  For her, it was college, and the major seemed like it might be interesting.  In other words, she was not goal/degree focused; she was interested in learning something, preferably something that would engage her interests.  I think what accounts for the enthusiasm of her farewell was that she had never before realized that learning could be engaging, challenging and interesting.  I was initially impressed by her eagerness, then by her diligence and finally by her originality.  She developed from someone very vague about what learning meant into a person for whom learning was critical.

Scan that paragraph.  What in it could be quantified?  What could be benchmarked?  What did I say or do that was so “insightful” so “inspiring” for that person?  What about those two words would have meaning inside some education rubric?  I don’t know, nor could any evaluation.  Only one person could know, and if she had not told me, I would never have known.

An unarticulated, seldom acknowledged experience of the person responsible for the learning in a classroom is the sense of guilt that comes from a feeling of inadequacy.  Simply stated, it says, “I don’t know what’s wrong.  I’m trying everything I know, but she’s not getting it.  I don’t know what to do.”  This is the feeling that comes from classroom teaching over a long period of time.  It reflects the conviction that the person responsible for the learning that’s going on is the person who has been assigned to the learning environment of that classroom.  No matter what other dynamics might be going on among that particular cohort of presumed learners, no matter what the test results show, the person “in charge” feels a sense of inadequacy, because someone in the room “didn’t get it.”  This is the ongoing anxiety of the teacher in the kindergarten classroom through the mentor in the graduate school seminar.  The person massaging the learning, the person who has used everything learned over long experience doing it, that person knows from the look on one or two or three faces that something has been missed.  And that’s what she takes home with her.

The tragedy of this is that some teachers eventually weary of the anxiety and fall into the abyss of routine, the very routine that the quantifiers are recommending as the salvation of our education “system,” the reform of America’s “failed” education system—whatever that means.  These teachers who release themselves to routine are the wounded in the classroom ranks.  Some of them—too many of them—are shunned, perhaps even mocked by their colleagues, thus emphasizing how ultimately lonely the task is.  And critics and so-called revolutionaries within the reform movement have been doing their best to sustain this feeling of desperate isolation, to enhance the feeling of failure.

So this, then, is what might be called the tragic paradox of the classroom teacher.  She or he knows that only one professional person, the person doing it, can actually experience what is happening in the classroom.  And a certain amount of pride attaches to that.  But that same pride becomes the source of the anxiety attached to the uncertainty of whether or not each mind in the process has been inspired not only to learn that much but to learn much more.

Nothing in what I have learned about the pedagogy behind the Common Core Standards or Race To The Top (that winner-take-all phrase wrapped in Social Darwinism) even begins to entertain the notion that this paradox exists.  Moreover, the local puppet masters who manage these programs represent an entirely new managerial class in American education, a class that runs education as a business enterprise, codified in their titles: CEO, CFO, etc.  An approach to education as a business enterprise will discourage learning while it creates loyal, uncritical androids.  It assures the common while it provides no time for and disparages the exceptional.  Just like in a factory.



A prescient article from 2012

When Will Social Media Elect a President?
Twitter and Facebook will change U.S. politics, as new technology always has. Think Nixon or 'Obama Girl.'

The Lincoln-Douglas debates of 1858 took place over seven venues, with 10,000-20,000 attendees and no microphones. One candidate would speak for an hour, followed by a 90-minute rebuttal and then a half-hour response from the original speaker (which alternated debate to debate). This description alone is almost 280 characters—clearly we've come a long way from Honest Abe to the Twitter age. But should we believe the hype about social media's impact on the 2012 election?
Pew Research says no. "Cable leads the pack as campaign news source," it concludes in a recently released 35-page report. "Twitter, Facebook play very modest roles."

Too bad that misses the point. New technologies have always altered campaigns and usually in mysterious ways. Party conventions were first televised in 1952 and soon lost their relevance, becoming scripted theater. Richard Nixon lost votes by sweating under harsh lighting during his televised debate with JFK. Bill Clinton bypassed the traditional news media, playing "Heartbreak Hotel" on his sax on Arsenio Hall's late-night show. MoveOn.org used the Internet to accumulate small donations and host a virtual primary won by Howard Dean, who in turn was brought down by a scream, which in turn went viral on the Web. YouTube was soon created and in 2008 hosted "Obama Girl" and other user-generated campaign ads.
In November 2008, Twitter had about four million users, and 100,000 followed candidate Obama. Today, President Obama has more than 12.5 million followers (while Mitt Romney has about 350,000 and Rick Santorum about 150,000). In 2008, Facebook had roughly 50 million users—nowhere near today's 845 million—and Google+ didn't exist.
Facebook and Twitter are already rivers of political banter—from Rick Perry's "oops" video to infographics of Mr. Obama's insider deals at the Department of Energy. Our friends find dirt and post it without thinking twice. So it tends to be partisan, extreme and divisive—more like a cocktail party than the evening news.
But campaigns can't just do "media buys" of $10 million on Facebook and expect anyone to notice. TV ads are effective because they're intrusive, and this year we'll see $3 billion worth of them, up from $2.1 billion in 2008. Social networks are more subtle media.
Jonathan Collegio of the American Crossroads political action committee explains that "you can bang a TV audience over the head with ads, but online content has to be hot to go viral. No one wants to tweet about or post a lame ad on their Facebook page." Corporations already know this. Vitamin Water "crowdsourced" its next drink flavor, allowing Facebook users to debate and choose it. Old Spice let us tweet to the shirtless guy in its commercial and post 180 response videos with six million views on YouTube—doubling sales in the month the campaign ran. Corona Light became the "most liked" beer on Facebook by letting users upload photos to a 40-foot Times Square billboard.
This viral marketing is what corporate and political campaigns increasingly thrive on, and today it's mostly free. By the 2016 election, it'll surely steal some of the $3 billion in TV ad money. It costs money to stock the campaign backrooms—herbal tea-infused, never smoke-filled—in which coders are tasked with finding innovative ways to bring undecided voters into the fold.
Far better to do that online than through, say, direct mail (which was still a $1 billion political industry in 2008, even though in so many homes it increasingly means mail thrown directly into the recycling bin). Online, one's political affiliation—Democrat, Republican or, most important, independent—can be easily ascertained. Campaigns can read your tweets and your Facebook "likes," plus those of your friends. Campaigns build new databases of independents every election because converting them to one side or the other is the name of the game.
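
[To make that database-building concrete, here is a minimal sketch of how affiliation might be guessed from “likes.”  The page lists and the scoring rule are invented for illustration; no campaign’s actual practice is being described.]

```python
# Toy affiliation tagger: guess partisan lean from liked pages and flag
# everyone else as an independent worth courting. Page names are invented
# examples, not real campaign data.
DEM_PAGES = {"MoveOn", "DNC", "Obama 2012"}
GOP_PAGES = {"American Crossroads", "RNC", "Romney 2012"}

def tag_affiliation(likes: set) -> str:
    dem = len(likes & DEM_PAGES)   # count of liked Democratic-leaning pages
    gop = len(likes & GOP_PAGES)   # count of liked Republican-leaning pages
    if dem > gop:
        return "Democrat"
    if gop > dem:
        return "Republican"
    return "independent"  # the name of the game: find and convert these

voters = {
    "v1": {"MoveOn", "Knitting Club"},
    "v2": {"RNC", "Romney 2012"},
    "v3": {"Knitting Club", "Local News"},
}
independents = [v for v, likes in voters.items()
                if tag_affiliation(likes) == "independent"]
print(independents)  # ['v3']
```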
The greatest effect of social networks on Election 2012 will take place behind the scenes. Social networks, like real life, are driven by influencers—not necessarily those with the most friends or followers, but those whose thoughts, ideas and opinions have the biggest impact. Mr. Collegio notes that for political action committees "to seed opinion makers, Twitter is the ultimate platform. Ideas grow into stories on blogs and eventually in the mainstream media." Not the other way around.

For years Google has ranked Web pages according to an algorithm called PageRank. Now there's a new field of study around ranking users in social networks—PeopleRank—according to their influence: how many of their tweets are read, re-tweeted, include links that others click on, etc. Corporations trying to sell high-ticket items are all over this, looking for industry experts, analysts and other buyers that people respect. Startups like Quora and Klout have their own algorithms, but you can bet that both major parties are investing in this new-age influence peddling (with Democrats way ahead so far).
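
[Here is a minimal sketch of how a “PeopleRank”-style influence score might be computed.  The engagement fields, the weights, and the PageRank-like damping step are illustrative assumptions, not the actual method of Google, Klout, or any campaign.]

```python
# Toy "PeopleRank": score each user by weighted engagement with their posts,
# then let influence flow from the people who amplify them, PageRank-style.
# Field names and weights are illustrative assumptions, not any platform's
# real metric.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    reads: int = 0          # how many of their tweets are read
    retweets: int = 0       # how often they are re-tweeted
    link_clicks: int = 0    # clicks on links they share
    amplifiers: list = field(default_factory=list)  # users who re-tweet them

def raw_score(u: User) -> float:
    # Active engagement (re-tweets, clicks) counts more than passive reads.
    return 1.0 * u.reads + 5.0 * u.retweets + 3.0 * u.link_clicks

def people_rank(users: list, rounds: int = 10, damping: float = 0.85) -> dict:
    # PageRank-like iteration: each user inherits part of the score of
    # everyone who amplifies them, blended with their own raw engagement.
    scores = {u.name: raw_score(u) for u in users}
    for _ in range(rounds):
        scores = {
            u.name: (1 - damping) * raw_score(u)
                    + damping * sum(scores[a.name] for a in u.amplifiers)
            for u in users
        }
    return scores

# The point of the article: influence is not audience size. A niche expert
# with few readers but strong amplification can outrank a user with a much
# larger passive audience.
alice = User("alice", reads=900, retweets=400, link_clicks=120)
bob = User("bob", reads=15000, retweets=2, link_clicks=10)
alice.amplifiers.append(bob)  # bob re-tweets alice
print(people_rank([alice, bob]))  # alice scores higher than bob
```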
Those with social-media "influence" are most likely to help campaigns convert interest into votes. Finding them in the haystack of the real world is tedious and expensive. But harnessing fast servers and constantly upgraded algorithms to find them on social networks is already happening—and it'll definitely sway who becomes our next president.

Mr. Kessler, a former hedge-fund manager, is the author most recently of "Eat People" (Portfolio, 2011).