“Originalism” and the U.S. Constitution

The late Supreme Court Justice Antonin Scalia was insistent that he was an advocate of something he called “originalism” in the interpretation of the U.S. Constitution.  The term can be slippery: it seems as open to individual interpretation as many of the cases that come before the court.  In the broadest possible terms, it is said to be a standard for interpreting the Constitution by attempting to understand clearly what its words meant to the Founding Fathers at the moment they put them to paper.  Scalia, in particular, expounded his view (as quoted in Wikipedia–hey, I’m not writing a legal treatise here) that

“If you are a textualist, you don’t care about the intent, and I don’t care if the Framers of the Constitution had some secret meaning in mind when they adopted its words.  I take the words as they were promulgated to the people of the United States, and what is the fairly understood meaning of those words.”  (Scalia’s own words, from a speech at the Catholic University of America in 1996.  See the Wikipedia article on Originalism, footnote 21.)

I am no legal scholar.  I am not even a lawyer.  In truth, I’m going to take a Scalia-like stand and say that I don’t care whether I persuade you of what I’m about to say, but it seems to me (and you are, as always, free to disagree with whatever vehemence you choose) that this method often fails to take into account the evolution of basic human rights and the evolution of the United States itself.  The originalist, textualist arbiter is locked into the “original” meaning of the words–no matter that there is room for disagreement between two such interpreters, or among all nine Justices of the Supreme Court.

A couple of glaring examples of how the Constitution did, in fact, evolve are illustrative.  The original text recognizes slavery as a fact, and even provides that each slave in a particular state shall count as 3/5 of a “person” for the purposes of apportioning members of Congress–even though those slaves were not accorded the right to vote.  Thus a free white man voting in Virginia had far more clout than one voting in Massachusetts, who was voting only for himself and not for any number of census-inflating but non-voting slaves.  All women, meanwhile, were completely disenfranchised.
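To make that arithmetic concrete, here is a minimal sketch (with hypothetical round numbers–nothing below is drawn from actual census figures) of how the three-fifths clause inflated a slave state’s apportionment count, and thus each free voter’s weight in the House:

```python
# Hypothetical illustration of the three-fifths clause: two states with the
# same number of free inhabitants, one holding a large enslaved population.

def apportionment_population(free, enslaved):
    """Population counted for House apportionment under the 3/5 clause."""
    # Integer arithmetic: each enslaved person counts as 3/5 of a "person."
    return free + (3 * enslaved) // 5

virginia_like = apportionment_population(free=400_000, enslaved=300_000)
massachusetts_like = apportionment_population(free=400_000, enslaved=0)

print(virginia_like)       # 580000
print(massachusetts_like)  # 400000

# House seats followed this count, so each free voter in the slave state
# carried 580,000 / 400,000 = 1.45 times the weight of his Northern peer.
print(virginia_like / massachusetts_like)  # 1.45
```

The same number of actual voters, in other words, commanded nearly half again as many seats.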

Slavery in the United States became unconstitutional (and thus illegal) with the ratification of the 13th Amendment in 1865.  Then, as now, ratification required the assent of 3/4 of the states; the 13th took effect once that number was reached, with several other states following in subsequent sessions.  Interestingly, Kentucky rejected ratification on first consideration, though it did fall into line–in 1976(!)  Mississippi did not ratify until 1995, and its ratification was not officially certified until 2013.

The 15th Amendment prohibited denying any person the right to vote on the basis of race, color, or previous condition of servitude (in other words, states could no longer bar voting based on race or on a voter’s status as a former slave).  The amendment went into effect upon ratification by the required number of states in 1870, but a few states were slow to ratify (Delaware, 1901; Oregon, 1959; California, 1962; Maryland, 1973; and Kentucky, again, in 1976).  Tennessee did not ratify this one until 1997.

The 19th Amendment finally got around to giving women the right to vote.  Many citizens would be surprised to learn that it took effect only in 1920.  No state ultimately failed to ratify, but late ratifications from 1952 to 1984 (yes, 1984) included Virginia, Alabama, Florida, South Carolina, Georgia, Louisiana, North Carolina, and Mississippi, with Mississippi again bringing up the rear in 1984.

If I were to hold originalists to a strict, rigid standard for these obvious failings of “original intent” in the wording of the original Constitution, they would probably reply, as Scalia often did, that these remedies came about through political rather than judicial processes, and that such processes were recognized and laid out in the original document itself.  In the narrowest sense, this is 100% true, but it leaves open the question of whether the institutionalized injustices in the original document were mere oversights, which came to light later and were corrected in a timely fashion.  You might even agree.  I don’t.

It appears evident that even the Founding Fathers were hamstrung in some instances by politics, and that they made compromises (who knows how willingly?) in order to win agreement and support from each other.  In some instances–the quartering of soldiers without the assent of the property owner, for example–the prohibition was absolute, English common law precedents be damned.  In others, as shown above, an enlightened, “justice for all” approach would not have carried the support of many representatives of the states that depended on slavery for their economies.  (Virginia was, at the time, one of the largest and most prosperous states, as well as home to thousands of slaves whose labor supported large plantations of tobacco or cotton.)  Women were, in a legal sense, essentially property at the time of the original Constitution, and the document the Founding Fathers produced reflected that status.  The cited provisions of the original framework were, to be sure, eventually corrected to more enlightened ones, but what comfort is that to the thousands who were never “given” the rights we all take for granted today?

The “originalist” position serves as a starting point, to be sure.  The Constitution enumerates certain rights reserved for individuals, others for states, and others for the only government that represents all of the above, the federal government.  But, as we often heard in the 20th century, “states’ rights” was frequently little more than a smokescreen for withholding individual rights from some.  No interpretation of the Constitution should be allowed to perpetuate that.  Today we see several states trying to suppress the voting rights of some, a continuation of restrictions that transparently disenfranchise some citizens so that others can continue to shape their states’ laws to reinforce their own hold on power.

I have wandered far from my starting point, but Justice Scalia will be the subject of another post, soon.  I think, as I am sure many others do, that his expressed philosophy on many controversies served his own philosophical/religious leanings, not the Constitution or the people of the United States.  More later.


WHEN DID I GET SO OLD? (PART I)

In recent months–in a phenomenon that I am sure is not unique to me–I have been bombarded with things (say, a new Star Wars movie) that set off a flood of memories: this reminds me of that, which was happening at the same time as something else, and pretty soon I am pondering a slew of happenings that were the focus of my life, even if briefly, oh so many years ago.  Things like these…

Childhood: the 1950’s

When I was four, Americans elected as President a man millions had called “General Eisenhower” a few years before.  Now they called him “Ike,” and it seemed the adults “liked” him, but a little bit the way we “like” things on Facebook today.  They didn’t really know him, but it seemed they trusted him with the country.  That all worked out OK, I guess; he got us “out” of Korea, or at least out of the shooting war that Korea was at the time.  Another good thing: as he left office, he warned of something he called the “military-industrial complex,” a prophetic utterance if there ever was one.

He forfeited, in retrospect, all that good will, though, by reaching down into the party ranks for a little-known Republican senator from California, Richard Nixon, as his vice-presidential running mate.  We’ve been paying for that one for decades.

Good things about the decade: cheap gasoline, full employment, the start of the interstate highway system, the WWII veterans moving into positions of responsibility and visibility.  A country not of 48 states any more, but 50, adding Alaska and Hawaii.

Bad things: the “Red” (i.e., Communist) scare, which propelled another obscure senator (Joseph McCarthy) to prominence, largely for seeing Communists all around him, and coincidentally had my schoolmates and me practicing how to survive a Russian atomic attack by hiding under our desks and covering our necks.  The Cincinnati pro baseball team even changed its nickname from “Reds” to “Redlegs.”  No, no, Senator, no godless Commies in this dugout.

Later childhood and early adult years: the 60’s

Where do we start?  In John Kennedy, Americans chose the first President born in the 20th century and one of the youngest ever.  He also had a wife with movie-star looks who spoke French.  The pair were American royalty while Camelot lasted: Kennedy had feet of clay, or maybe just a common eye for the ladies, but he was new and exciting.  He stared down the older and more experienced Nikita Khrushchev to end the Cuban missile crisis.  He also became, in 1963, the first assassinated leader in the experience of my same-age cohort and me.  Lyndon Johnson took over and, in about a year and a few months, had committed thousands of young Americans to jungle warfare in far-off Vietnam.  Not good.

1968 was perhaps the most pivotal year of my life.  I passed a draft physical as thousands shouted “Hell, no, we won’t go.”  I went a little crazy for a British blues trio called Cream (Eric Clapton, Ginger Baker, and Jack Bruce).  A different Senator McCarthy (Eugene) decided to challenge President Johnson in the Democratic primaries and drew enough support that Johnson decided to leave office at the end of his one elected term.  The election of 1968 saw the return of Nixon (yes, HIM again) against Hubert Humphrey, plus the third-party candidacy of former Alabama Governor George Wallace.  Nixon won, and then began courting the racially tilted Wallace voters for next time.  As the year closed, I had entered the military–the Navy, in my case, following a family tradition.

The decade’s “good” things: The music!  The Beatles, Rolling Stones, Zombies, Kinks, and more rocked from Britain.  Protest music (Dylan, “Eve of Destruction”) answered from this side of the Atlantic, and the Beach Boys showed off a Pacific Coast sound.  The beginnings of true diversity in public life.  The advances of the civil rights movement.  The Voting Rights Act.  A vigorous reaction to Soviet adventurism.

Bad things: The real beginnings of our current polarization in politics.  The apex of the military misadventure in Vietnam, which would eventually claim more than 58,000 American lives while our political leadership dithered, no one wanting to be responsible for the first war the U.S. ever lost.  The assassinations of John Kennedy’s brother, Senator Robert Kennedy; of prominent African-American civil rights advocates Medgar Evers and Martin Luther King; and of African-American icon and firebrand Malcolm X.

Coming of age: the 70’s

Late in this decade, I left my 20’s behind and began my 30’s.  I heard myself called “Dad” for the first time (a very sobering experience), and the country, similarly, continued through tremendous changes.

The 70’s, probably more than any other time in the country’s life, were characterized by crises, one after another, and by leadership that proved not to be up to those crises.  In the 1972 election campaign (which Nixon won overwhelmingly), operatives of something called “The Committee to Re-Elect the President” were paid to break into the Democratic National Headquarters to plant listening devices.  A long series of lies and distortions designed to insulate Nixon from anything so tawdry eventually led the House Judiciary Committee to approve articles of impeachment; finding that conviction in the Senate was virtually inevitable, Nixon resigned in disgrace rather than face the music.

Nixon’s Vice-President, Spiro Agnew, had earlier been investigated by the FBI on charges that he had accepted graft from his days as Governor of Maryland right through his term as VP.  He was charged and pled nolo contendere, a nice way of avoiding trial by accepting a punishment while “admitting” no wrongdoing, and resigned.  He was replaced as VP by Gerald Ford, who had never run nationally but, as the sitting VP, succeeded Nixon in 1974.

So we had the first-ever president who had never run for either president or vice-president.  Ford was well known to the country as the House Minority Leader, and he got an enormous dose of good will upon ascending to the presidency after two years of investigations, testimony, and sordid revelations.  Within a month, Ford had squandered that good will by pardoning Nixon before the latter could be charged or tried for any crime.  Saying “Our long national nightmare is over,” Ford apparently thought he had relegated the whole affair to the history books.  He was wrong.  He kept most of Nixon’s cabinet and served an undistinguished couple of years as president, though he did preside over the end of US involvement in Vietnam; despite years of slogans centering on “peace with honor,” Americans were, in 1975, treated to television news footage of the helicopter evacuation of the US Embassy in Saigon as the city was overrun by North Vietnamese forces.

1976 saw the nearly unknown ex-Governor of Georgia, Jimmy Carter, smile and “plain folks” his way to the Democratic presidential nomination and then to victory over Ford in November–but not before Ford had been significantly weakened by a challenge for his own party’s nomination.  The challenger: the former B-movie star and California Governor, Ronald Reagan.  We’ll talk about him (a lot) in the next decade.

Carter was honorable but not seen as strong by many, and he was unfortunate enough to be in office at the time of a convergence of events in far-off Iran that eventually tarnished and ended his presidency.  The shah had ruled despotically there for over 25 years after seizing power in a CIA-backed coup, engineered during Eisenhower’s presidency to make Iranian oil safe for Western oil companies.  The shah’s favored methods for maintaining control included an active domestic secret police expert in torture, maiming, and political intimidation through murder.  The Iranian in the street never forgot US interference in placing the shah in power.  In 1979 a revolution overthrew him and installed the Ayatollah Khomeini in the seat of power; when the deposed shah was later admitted to the US for cancer treatment, a mob stormed the US embassy and took the diplomats hostage, holding them until Carter left office following his loss in the presidential election of 1980.  The electorate, as usual, wanted Carter to “do something” and felt the US had been humiliated.

During this decade, I entered, and graduated from, college, and entered a career in teaching.  Events just kept on washing over me.  I started to think about another career, but at this stage it was only a thought.

Good things in the 70’s: Continued great music.  Led Zeppelin, The Who, Rush, and others, despite disco.  The beginning of the Star Wars phenomenon.

Bad things: Political messes, widespread drug abuse, the Japanese near-takeover of the US car industry, the beginning of the “Rust Belt” depopulation phenomenon.  And many more.

In Part II, I’ll talk about subsequent decades.

PRIMARIES ARE REALLY SECONDARY

I promised to say how I thought we might do better…

In my last post, I advanced the idea that primaries were not a very good way to choose party candidates for president, but I also said that we could do better.  Here are a few thoughts on how that might be the case.

Primary elections have become a way for the media to present the major parties’ choosing candidates as if the whole thing were a horse race or the NFL playoffs, complete with scores, substitutions (on the candidates’ management teams), and even talk of upcoming crucial contests.  After the early “Big Two” of Iowa and New Hampshire, all media eyes will shift to South Carolina.  Later major coverage will be devoted to other states, all of which will be described by some talking heads as “crucial tests” for one candidate or another.  Maybe.  But then, not likely.

South Carolina makes a very illustrative case study.  Care to guess how often since 1960 a Democrat has carried that state in a general election?  Twice: in 1960 (Kennedy) and 1976 (Carter).  The former was the last election before the South in general realigned Republican in a spasm of reaction to civil rights legislation, and the latter elected a Southerner who professed to be an “outsider” to the ways of Washington, D.C.–ways which had recently included Watergate and a President (Nixon) who resigned rather than face certain impeachment and removal from office.  What would you call the eventual Democratic nominee’s chances of winning South Carolina’s nine electoral votes in 2016?  As my late grandmother was fond of saying, two chances: slim and none.

So why have a contested Democratic primary election there at all?  It is not required by federal law–indeed, it’s not really an election at all but a preference poll, the winner of which gets some delegates pledged to support that candidate at the party convention, the place where the real choice is made.

Need another example?  Let’s pick a state just as out of reach for Republicans as South Carolina is for Democrats–a state like Hawaii.  Since becoming a state in 1959, Hawaii has voted for a Republican president twice, in 1972 and 1984.  In both instances the vote was part of a second-term landslide for the incumbent, Nixon in the former case and Reagan in the latter.  To its credit, Hawaii does not put on a presidential primary at all.

It is a waste of time, money, staff, and energy to have a primary in these two states, as well as in others.  Democrats in Wyoming?  You could get them all in one hotel lobby.  Republicans in Vermont?  They used to win, but have not done so in the general election since 1988, and indeed, their best showing since then was in 2000, when George W. Bush polled 40.7%.

The party elders of each major party should decide in any given year whether a given state should host the quadrennial circus known as a presidential primary.  Certain states–currently, say, California, New York, Florida, and maybe Texas–probably should have them, just to test the candidates’ appeal in a large-scale vote.  Others–Ohio, Pennsylvania, and Illinois, for example–carry enough allure as bellwethers to merit the bother of primaries.  Otherwise, state party committees should be doing the picking of delegates to the national convention, and their choices will have less to do with passionate commitment than with the effect a particular candidate will have “downticket”–on the party’s candidates in other elections that take place concurrently with the presidential one.

Undoubtedly, you have some better idea.  Put it forth, please.  I don’t relish one more election cycle with 100 appearances per candidate in Iowa.

PRESIDENT OF IOWA? NEW HAMPSHIRE?

The current system used by both political parties to choose a Presidential nominee is both wasteful and unrepresentative.  We can do better.

I promised my wife–and any readers–a break from politics this time.  So this post is not about politics.  It’s about civics–how the election process has become a cash cow for political consultants, advertisers, media, and the hospitality industry, while accomplishing little of use in choosing national candidates.

The media are full of stories about how one candidate or another is “polling well” or “lagging” in Iowa, in reference to local party caucuses that will be held in that state two months from now.  Here’s a primer on that event: we’ll look at the Republican side here.  The Democratic side is similar in outline, although even more complex in some ways.  The Republicans meet voluntarily in local gatherings, and, starting in 2016, must declare a preference for, and are then bound to, a particular candidate.

The local caucuses (there are over 1,000 of these) meet only to choose delegates to district or county conventions, held at a later date.  Iowa has 99 counties.  The county or district conventions, in turn, choose delegates to a state convention, which is held months later.  So you can speechify all you want at your local caucus in favor of Congressman Windbag, but unless your candidate is still showing a pulse much later, nearer the national convention, you will have wasted your time in a preliminary to the preliminary to the preliminary to the national event–where the nominee might be someone who was not even a candidate when the local caucuses took place.

New Hampshire?  The media cover this as if it were a do-or-die event, but New Hampshire votes in its primary in early February, when conditions might well resemble those of Antarctica.  Say what you like about how seriously New Hampshire voters take their outsize “responsibility” toward both national parties and turn out to vote; this is not a recipe for the “participatory democracy” we so often hear about.  We also hear how important it is for each New Hampshire native to meet each candidate personally at some rustic diner over pancakes and sausage.  Is this a serious way for voters to inform themselves about candidates, or for candidates (as they often insist) to “get to know” the voters?

Need more?  Let’s look at some demographics, just to see whether either of these states is a microcosm of the country at large.  The US is estimated today to have a population of 320,000,000.  Iowa has 3,107,000, or less than 1% of the US total.  New Hampshire has 1,327,000, or less than half of 1%.  Urban areas, where many federal programs matter most?  Iowa’s biggest is Des Moines, at 207,150.  Manchester is New Hampshire’s largest, at 110,448.  The foreign-born population of the United States stands at 12.9%; in Iowa it’s 4.1%, and in New Hampshire, 5.4%.  African-American population?  US, 13.2%; Iowa, 2.9%; New Hampshire, 1.1%.  (Note: all figures quoted here are drawn from Wikipedia and are estimates from 2010 to 2014.  As I’ve mentioned before, I’m not putting together a Ph.D. thesis here!)  So while there’s nothing wrong with being more native-born and more white than the country at large, these facts and more mean that these two states are no microcosm of the overall electorate.  And while New Hampshire’s primary has recently favored the eventual nominees, Iowa went to (wait for it…) Rick Santorum in 2012 and Mike Huckabee in 2008.
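The population shares are easy to check from the figures just quoted (all of them the rough Wikipedia estimates noted above–this is a sketch, not a census report):

```python
# Comparing the two early-voting states to the national total, using the
# approximate 2010-2014 population estimates quoted in the text.

US_POPULATION = 320_000_000

states = {
    "Iowa": 3_107_000,
    "New Hampshire": 1_327_000,
}

for name, pop in states.items():
    share = 100 * pop / US_POPULATION
    print(f"{name}: {share:.2f}% of the US population")

# Iowa comes out just under 1%; New Hampshire, well under half of 1%.
```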

So why are these two states holding political contests before everyone else?  Glad you asked.  Pay attention to how many field campaign operatives are quartered in each of these two states over the next year, and to how many consultants, spin masters, and media representatives file reports from Dubuque or Keene.  Then notice how rarely you hear of these places after the event is over.  Every one of these folks pays for lodging, food, and incidentals, pumping revenue from a national campaign or news organization into local coffers–most of which will dry up soon after.

So why doesn’t another state try to get in on this gravy train?  Ah, but they have.  Florida recently moved its primary up to an early date, and the national Republican organization reacted by sanctioning Florida and taking away some of its representation at the summer convention.  Why?  You can guess–people who own hotels, restaurants, and other businesses that profit from the coverage and competitiveness of the early contests set up a howl with their state party organizations, which in turn set up a howl with the national organizations.  And the national organizations do not want to anger business owners/voters/contributors.  They gave in, and the status quo ante prevailed.  In politics as in life, money talks.  In fact, it talks louder in politics.

How can we do better?  That’s the subject of another day.  Soon.


GIVING THANKS

Thanksgiving: everybody knows what it’s about.  Sort of.

To begin, let me say that Thanksgiving is one of my very favorite days of the year.  Food, family, good cheer–who could ask for more?  I am always reminded, though, when the store displays and the TV specials start, that the observances and meaning of the modern day are far from what they were in the 17th century.

Tradition is at the heart of much of what goes into Thanksgiving, and as is the case with any tradition, something gets handed down through generations.  The result is still a link to the original tradition, but often modified to a greater or lesser degree, and Thanksgiving is no exception.  Religion and food (a harvest or thanksgiving feast) are central to the holiday, and a look at both may surprise a few.

The small group of English colonists in what is now Massachusetts is known to millions of American schoolchildren as “the Pilgrims.”  The word means, literally, a person who undertakes a journey for religious reasons.  This particular group was not known as pilgrims by their contemporary English countrymen, but as “dissenters” or “separatists,” because of their quarrel with the Church of England–which had itself wrenched free of papal control only a few generations earlier.  One of their own number, William Bradford, gave the group its name.

King James (of England; also James VI of Scotland, and patron of the 1611 translation of the Bible known today as the King James Version) was only too happy to bid them farewell; England was bubbling with sectarian strife, and giving these folks land far across the Atlantic was a good way to remove them from the mix.  (The broader dissenting movement was itself split between those who hoped to reform the Church of England from within–the group later known as “Puritans”–and separatists like these colonists, who considered the church beyond reform.)

A well-known part of the Thanksgiving mythos in the United States is that these people came to these shores for “religious freedom.”  And they did–sort of.  That religious freedom simply did not extend to any other sect of Christianity, much less to non-Christians or non-believers.  Their reputation for repression is probably a bit overblown, but they are known to have exiled one woman in the early years of their settlements for talking too freely about her experience of marital pleasure!  And Roger Williams was likewise exiled for, among other things, questioning the right of one settlement to take over Indian lands without payment to those Indians.  Disagreement with the community elders, whether in matters of theology or law, was dealt with by excommunication.  One can admire the courage of their convictions while noting that they stifled dissent in a manner all too familiar to those “pilgrims” themselves: with intolerance and exile.

As to the feast aspect of Thanksgiving, both harvest festivals and days of thanksgiving were known to many cultures.  This one in particular celebrated and thanked God for the colony’s safe establishment on these shores; bear in mind that only some 50 of the original 100 members of the colony were still alive after the first year, when this celebration took place, so those who survived may have felt thankful indeed.  Squanto, a Native American of the local region, had been carried off years earlier by an English ship’s crew, and he returned speaking English.  He taught the settlers the ways his band grew crops and caught eels, and he served as translator for dealings between the two groups.  Again, one might imagine the settlers were very grateful for Squanto’s aid.  They hosted some 90 of the local natives at the feast, so the first Thanksgiving was indeed more American than English!

And the food bore little resemblance to what we eat today.  Turkeys there were, though not the domesticated, flightless type we consume by the millions; the colonists roasted or boiled wild turkeys, a stringy, tough cousin.  These were supplemented by other birds, from passenger pigeons (now extinct) to ducks.  No modern oven roasting, either: customary methods included boiling and roasting over an open fire, or a combination of the two.  The meal included neither potatoes nor yams, both of which were unknown in the region at the time.  There were beans and wild fruits–and squash, as close as they got to pumpkin pie, and corn, which they had learned to grow from Squanto and his band.  No wheat meant no wheat flour, and so no pies either, though they may have put berries or fruits in an earthen vessel and baked (by the fire) a sort of crustless “pie.”

In considering how the first American Thanksgiving was celebrated, I can think of a few things to be thankful for–that the proto-Puritans did not succeed in establishing their kind of commonwealth over the whole USA, and that we now have a much bigger variety of festive foods to eat!

Now if we could just dial back “Black Friday…”

IMMIGRATION, A FAVORITE TOPIC–ONCE EVERY FOUR YEARS

Here’s a bold prediction: nothing happens soon

In a presidential election cycle, certain topics make headlines until the election is over, and then just go away.  Such issues excite certain factions among voters, and politicians use such topics to attempt to turn out voters (I know, you’re shocked…) but then do little in the interim. The furor goes quiet until it proves a useful election tool again.  Immigration “reform” is the poster child of such issues.

The reform of actual immigration law and procedure would be a worthy goal.  It is not likely to come up for discussion in any serious forum.  The topic of the moment is “illegals” or “undocumented immigrants,” depending on your point of view.  Many Americans are disinclined to want more people from other countries admitted, based on our perceived inability to assimilate new arrivals.  Others favor the “Statue of Liberty” rhetoric (Give me your tired, your poor…) and want to liberalize current law and procedure in favor of more immigration.

There are arguments to be made for both points of view, but they are not currently being made with any effort to persuade anyone.  Pre-Columbian America had no national identity.  America as a nation had its beginning as a British colony, with substantial representation from elsewhere in Europe.  While these “colonists” (or “settlers,” “conquistadores,” etc., as you will) were European in their thinking and acculturation, they shared a sense that the old country–whether England, Ireland, Spain, or what have you–did not offer them the opportunity they wanted, and they set out to make a life in this New World.  Eventually they grew tired of being administered and directed by colonial powers and their representatives on site, and went their separate way.  From that experience arose an isolationist sentiment that is still strong in our national thought.

Those who want “open doors” see economic benefits in a stream of both skilled and unskilled labor coming from outside.  “We are all immigrants,” they say, bemoaning the lack of charity from the more nativist among us.  This strain is augmented by those who see the whole thing in terms of self-interest: they still have cousins, parents, or friends who are “over there” and would prefer to be “over here,” so restrictive immigration policy is “inhumane.”

As is usually the case, things are not as simple as either side would have you believe.  21st century reality is not as amenable to the nativist “pull up the drawbridge” thinking as it was in simpler times.  If we as a nation want the best scientists, not to mention baseball players and other skilled athletes, etc., we can recruit them only by being somewhat open to the world while we continue to produce our own.  Anyway, if someone chooses to marry a non-American, we, in principle, welcome the addition.

To the side that wants to let everyone come, I humbly point out that there are many like me who, while not members of some elite old-society set, are not exactly “immigrants” either.  Many of us can trace American-born ancestors back to before the Revolutionary War.  We’ve been Americans for more than two centuries.  Yes, there’s room for many more, but it’s not ignoble to want newcomers to abide by established law.  Ample exception is provided for refugees and asylees.  And there is, as a last resort, the “visa lottery” that lets people who meet minimal educational standards try their luck at joining the party.

So what is comprehensive immigration reform?  At this moment, it’s hard to say.  An overhaul of the immigration system might be a good thing in and of itself.  Another “amnesty” program like the one instituted in the 1980s would permit millions who entered under other than legal circumstances, or overstayed a legal but temporary stay, to get a “path to citizenship.”  While the idea has its boosters, those who are opposed are unalterably opposed.

How about this: we don’t round up eleven million illegal/undocumented aliens, but we don’t bend everything to “legalize” them either?  Say you’re here as of a date specific, and you’re self-supporting somehow or other, stay if you like.  If you have a US citizen spouse, he/she can petition for your legal immigration, and that is to be encouraged.  If you have a US citizen child, once he/she is old enough to file a petition legally, that is your path.  It obliterates any sense of fairness to those who abide by the law to oblige the country to accommodate itself to you.

Of course, this hypothetical approach gives something to each side…wait, that’s a compromise.  Never happen.

IF ONLY IT WERE THAT SIMPLE (PART II)

After finishing this thought, it’s on to something other than politics, I swear…

The upshot of the circumstances mentioned in the last post (a disincentive to discuss policy in campaigns, a lack of serious planning on the part of candidates and their campaigns, etc.) is rather simple, and one that will surprise no one who has observed a national campaign.  Campaigning politicians exaggerate, twist the words and positions of their rivals, and most of all, advance specious “solutions” to what their audiences may see as problems.  Such problems may seem blown far out of proportion, or even nonexistent, to the portion of the electorate not committed to a particular candidate, but that really doesn’t matter: the speechifying candidate has no real intention of carrying out most of the things he/she advocates anyway.  Perversely enough, his/her followers are fully aware of that fact, but often appear to be swayed simply by the idea of someone’s putting it into words.

I’ll just take a couple of examples to illustrate this paradox.  Once again, Donald Trump serves as the prime example, if only because of the sizeable outrageousness of a couple of his pronouncements, such as his recent statements on illegal immigration.  On different occasions, he stated that the “…Mexican government sends…” these people to the United States, as if some official in Mexico toured that country to draft people to decamp for the United States.  (And never mind, by the way, that many are not Mexicans anyway.)  This is so absurd as to need no rebuttal, but then he expanded on the proposition, stating that the number of such persons present in the US without benefit of legal immigration status was probably more like 30,000,000 than the commonly estimated 11,000,000.  Where did that number come from?  Never mind; it’s just to make the point that he could simultaneously get rid of all these people and stop more from coming by…wait for it…building a wall all along our southern border with a “wide doorway” to welcome in those we wish to welcome.

Huh?

A wall more than 2,000 miles long?  How high?  Built of what?  By whom? At what cost?  In what time period?  Don’t worry, there will be a detailed plan later.  Right after we announce how we will push out 30,000,000 people against their will.  (And, in the case of a good number, against the will of US employers, as well.)  You see?  It’s really simple.  Our leaders are just stupid.  And many “man in the street” interviews featured people who lauded Trump for his “plain speaking” or “saying what is on his mind.”  Non-serious talk is met by non-critical acceptance, because those who cheer Trump on know full well he is not serious…but are sure he’ll do something.

Let’s look at another campaign theme.  Climate change is an issue this time around, though the specifics are hard to nail down.  Burning fossil fuels increases the concentration of carbon dioxide in the atmosphere, and that added carbon dioxide absorbs more heat than said atmosphere used to; ergo, the earth is gradually heating up.  But climate science is complex, and the numerous variables present in a layer of air large enough to cover the entire earth mean that the atmosphere may react in quirky ways at times.  Though the general trend is toward a warmer and warmer climate, it is not possible to quantify it in neat tables leading toward a date specific when, for example, polar bears will definitely be extinct.

Rather than accept the scientific consensus, though, the fossil fuel industries point to that impossibility and spin the whole issue as “not settled science.”  Dollars continue to flow into the coffers of Exxon-Mobil, Shell, and Consolidation Coal (among many others) while the issue is “debated.”  This is not politics, in reality.  It may eventually be survival, but deep-pocketed industries are endangered by any effort to curtail current practices, and those industries will want to stall, at least until they can find somewhere else to make more millions.  So campaign contributions flow to oil-state senators like James Inhofe (R-Oklahoma), who used a snowball made in Washington, DC, on a winter day to “prove” climate change is not real.  Does this show up in the current campaign?

Texas Republican Senator and presidential candidate Ted Cruz has been quoted to the effect that there has been no significant evidence of a global warming trend for the last 17 years.  Former Pennsylvania (coal state) Republican Senator Rick Santorum also has spoken disparagingly of scientific studies that came down on the side of massive climate change, declaring that various predictions have not come true.  Others find different ways of putting off any genuine action.  While this topic is not exactly parallel to Trump’s outlandish posturing about immigration, he again takes any unknown quantity to the extreme, claiming climate change is a “hoax” perpetrated by people who want to intrude into, and regulate to a greater degree, the lives of the American public.  Many voters, fearful for their livelihoods if any change in energy generation and use is in the cards, react in the usual way.  If our candidate says it, it must be true.  Sort of.  Well, at least he won’t let “them” intrude and regulate any more than they already do…right?

IF ONLY IT WERE THAT SIMPLE (PART I)

One of these days, I’m going to write about something other than the contemporary political scene.  Honest.  But there is so much more to say…

It is hardly necessary to point out that the general public has neither the time nor the inclination to analyze in depth the political speech and assorted spectacle, posturing, and outright misrepresentation that the political class orchestrates each campaign.  The higher the stakes, the less the claims and promises made in the campaign resemble what happens on planet earth.  And when the dust clears after each election, there is rarely if ever any penalty paid by any candidate; an atmosphere of “winning is the only important thing” pervades the scene.

What often results is a sad breakdown of one of the foundations of democracy: an informed electorate.  It can be argued that it is the job of a political candidate to inform the electorate, and equally, the job of the electorate to take full advantage of competing claims offered to educate itself on the issues of the day, and then make a truly informed choice.  This is not what is happening; while one can make a case that the informed electorate has always been more of an ideal than reality, the trend is toward less honest debate and fewer informed voters.

The tendency to treat political campaigns less as a clash of ideas and philosophies and more as just another game becomes more evident with each passing cycle; whole presidential elections come and go featuring claims that Candidate X is a “proven winner” or the “most electable” without any reference at all to any ideas, policies, or accomplishments attributable to that candidate (not to mention whether those ideas or policies might be beneficial to the republic or any section thereof).  Primary campaigns are, if anything, worse, since it seems to be assumed that all candidates are playing to some “base” or other, and that, in the end, the survivor will “tack to the center” or move to expand his/her base of support–i.e., become more inclusive.

So there are really two campaigns in presidential cycles: first, a candidate must excite a narrow base in an intraparty series of primaries and hang on while the press of finance and fickle voter bases and donors winnow lesser-known and/or less well-financed candidates from the field.  If Candidate X survives this gauntlet, donors and endorsers typically fall in line behind him/her and a much more brutal one-on-one slugfest ensues, party machine vs. party machine in a winner-take-all general election.

So the world’s best-known democracy undertakes the election of its next leader, usually without any in-depth discussion or serious debate until both major parties have chosen standard bearers.  Both have, at this point, usually made outlandish claims that they will do wonderful things, that anyone who opposes these plans does so out of some nefarious plan, and that a glorious new era is just around the corner after the election if only voters are enlightened enough to choose correctly.

In this tedious (and I apologize for that) summary, there is no mention of detailed plans or of serious studies designed to gauge the possibility that this plan or that course of action will lead to a particular positive result (e.g., a reduction in American military commitments overseas, a long-term economic upturn for the populace at large, etc.).  Why is this?  Because to offer specifics in detail invites scrutiny by opposing campaigns or by front groups for those campaigns.  On the face of things, there is nothing wrong with that, and it should not, in theory, discourage a serious candidate from offering such detail.  However, the operative theory is that an opposition candidate or his campaign will offer only sneering rebuttals in an attempt to reduce the public stature of the candidate who rolls out ideas, so why bother?

So how does a candidate differentiate himself from other seekers of the same office?  By smearing the character, morals, or associates of any opponent; by attempting to make his audience see themselves in his self-descriptions; and most importantly, by insisting that nothing is really complex about governing; all that’s necessary is a little “common sense” (or business sense, or old-fashioned American something or other).  This is and always has been of dubious veracity, and at times, dangerously naïve.  But it has worked before, and probably will again.  In Part II, a few pertinent examples will help illuminate why and how.

In 2015, do you know any conservatives? I mean, with principles…

Is a conservative just a liberal who’s been mugged?

Did I get your attention?  Get you on your guard?  The boldface sentence above is one that has been quoted for years as a flippant way of “explaining” how one group becomes converts to the other.  Young, carefree people, with their implicit disregard for their own mortality, are liberals.  They believe that people are usually good at heart, and that most want to share in the bounty of the world with those less fortunate.  A mugging (or some equally unpleasant intrusion by “reality”) snaps them out of it.  They henceforth recognize that they are in a dangerous, unequal, Darwinian world, and that no paradise of equality and brotherhood is imminent.  They therefore come to their senses and become conservatives, the better to preserve their wealth and their place in society.  So it becomes a natural thing for people to become more conservative as they age or become more prosperous.

Alternatively, as many of us were taught in school during the 1950’s and 60’s, the difference between the two groups was one of interpretation.  Faced with the task of maintaining government “of the people, by the people, and for the people” and following the Constitution, in every decision, every dilemma, and every draft of every new law or government program, each group followed a sort of paradigm.  In each case the individual asked him/herself one of two questions.

1.  Does the Constitution say I can do this?

Or, alternatively…

2.  Does the Constitution prohibit my doing this?

Those who preferred to govern by question 1 were conservatives.  Those who governed by question 2 were liberals.  Isn’t that simple?  Well…actually, no, it was never quite that simple, but it served to delineate Franklin Roosevelt, vowing to use every power explicit or implicit in his oath and description of office to confront the horrors of the Depression and World War II, as a liberal, while casting Herbert Hoover, with his unshakeable belief in the invisible hand of the free market and a cautious hand on the tiller, as a conservative.

Fast forward a few decades, to the era of Richard Nixon (whose influence seems destined to outlast not only him but everyone who was alive when he flourished), who saw an opportunity for branding, even before that term was widely used.  His domestic opponents, primarily the anti-war crowd but also others who came down on the wrong (from his viewpoint) side of just about any issue, were “liberals”: people who upset the equilibrium of large segments of American society by forcing racial integration onto the society at large, or questioning the morality, or indeed, the utility of the Vietnam War.  (Don’t raise your hand, I know the original escalation of US involvement in Vietnam came under Democratic administrations.  Nixon painted opposition to it as disreputable at best…thus conflating loyalty to the President with loyalty to the country, and protest with something nearly treasonous.)  So, when the President called you a liberal, you were demeaned.

So, by extension and repetition in the intervening decades, “liberal” became nearly a curse word, and by contrast, those who thought of themselves as “conservatives” were the self-appointed guardians of the American way.  Modern political campaigns are based largely on the theme of not letting anyone else define your candidate; the conservatives did a much better job of seeing the advantage in self-definition.  You almost never hear one candidate sneeringly defining his opponent as a conservative, while calling an opponent a liberal is meant to get the villagers and their pitchforks marching.

In 2015, “conservatives” largely are people who deny science to claim climate change is an elaborate hoax; clamor for more domestic surveillance in the name of national security; endlessly predict the imminent end of Social Security; claim that “free trade” will cure all that ails the economy; and damn any effort to raise the minimum wage because it will “kill jobs.”  There are many more examples.  So, liberals, with the exception of Bernie Sanders, cringe for fear of being labeled; the more progressive of the two major parties seems bent on nominating for president a centrist with way too many Wall Street ties; and “conservatives” work tirelessly for the goals of people like Grover Norquist (taxophobia); the Koch brothers (drill, baby, drill); unhinged, harsh evangelicals like Mike Huckabee (who proudly supported that poor stage prop of a county clerk in Kentucky to insinuate he would stop America’s slide into depravity); and the “fortress America” crowd who think we can kill for peace.  Herbert Hoover, I think, would be embarrassed.  He died unrepentant about the Great Depression, since in his mind the Constitution left him powerless to do much about it, and he did not cause it directly; he believed strongly to the end that it would end without government intervention.  Being unrealistic is not a sin, and advising people that they are always in danger from a political party is not a very honorable tactic, but neither is a virtue, either.

To Hillary Clinton, thanks for the memories, and are there any more?

In this season of political madness and one-upmanship on the part of the fringe of the Republican Party that has lately become its core, it is sadly proper to observe that not all of the distracting noise comes from that side of the aisle.  Democrats, too, have a problem.  Hillary Clinton is not necessarily crumbling before our eyes as a potential president, but she may be slowly dissolving with the steady drip of revelations concerning her private server for e-mail communication while she served as Secretary of State.

Disclaimer: I was an employee of the Department during Secretary Clinton’s entire tenure at State.  Certain things about State I know firsthand, others at some remove, and still others only from the old “heard from a friend who heard from a friend” channel.  I do not know the former Secretary personally.  Everything I knew about her from the other two channels, though, suggests that she was a solid Cabinet member, involved and knowledgeable, concerned with those who reported to her as well as those to whom she reported.  I liked her direct style: for instance, under hectoring in a Congressional hearing on Benghazi, she simply said she was ultimately “responsible.”  Like a military commanding officer, she recognized that she would always carry that burden; the acts of political theater repeatedly orchestrated today add nothing to it.

It is utterly a mystery to me, then, why the “government e-mail on a private server” matter was ever permitted to take root, grow, and flourish.  Just today, Secretary Clinton stated for the consumption of all that she had started the use of that infamous private server in March, 2009; later in the day, accounts surfaced to the effect that there is an e-mail from that account to General David Petraeus dated in January of that year.  Her poll numbers on such qualities as “trustworthy” continue to drop.

I am not going to say that the matter can now be effectively explained away or otherwise made a non-factor.  In fact, it seems the media will continue to offer up nuggets on this same theme until Election Day, 2016, and gleeful Republicans will continue to tut-tut and speculate about nefarious reasons for the existence and use of the server.  Neither am I going to offer a spirited defense for, or even an attempt to account for, its use over a four-year tenure.

When HRC took over at State in January, 2009, the scene of Secretary Colin Powell arriving eight years earlier and being aghast at the lack of a computer in his office was already a matter of legend.  Secretary Powell also went to great pains to ensure that State employees, both domestically and posted abroad, had access to both internal State communications (intranet) and the common variety (internet) that high school students all over the country had.  E-mail through both the classified and unclassified systems was a well-established fact of life.  There had to be a concrete decision on the part of the Secretary herself to use an alternative system; whether that was instead of, or in addition to, the official channel, I just don’t know.

The new Secretary was no novice in the ways of Washington.  As First Lady from 1993 to 2001 and a U. S. Senator from New York after that, she had been a lightning rod for every criticism, every niggling negative comment, every sling, every arrow and broadside that every Republican operative and every media personality of the right-wing echo chamber could fire in her direction.  One would suppose that she would have been reflexively careful to walk the straight and narrow, so as not to give these sources anything to complain about.  Obviously, this is not what came to pass.

I don’t know why this all happened.  Indeed, it may be nothing but an administrative misstep, the kind that gets government employees mild rebukes from time to time, and comes off more careless than evil, but I wish Secretary Clinton would come out and say two things.  WHY would you do this (an “It was allowed” is beyond weak)?  And WHY, once it was plainly not going to be dismissed by the general public (that is, voters, rather than haters or apologists), did you not realize that the longer it took to have a “tell-all” with Anderson Cooper or just about any prominent media figure, the more it would look as if there was something hidden?  Harry Truman once spoke of heat and getting out of the kitchen if you couldn’t take that heat.  Madame Secretary, don’t try to wait for the heat to go down.  We can’t afford an overdone presidential candidate.