Saturday, March 29, 2014

An Unhappy 50th Anniversary

Most 50th anniversaries are considered happy if not “golden.” This one is neither. I’m speaking of Lyndon Johnson’s War on Poverty, whose 50th anniversary passed almost without notice early this year.

Johnson launched the “war” in his State of the Union speech a half century ago:

We have declared unconditional war on poverty. Our objective is total victory … I believe that 30 years from now Americans will look back upon these 1960s as the time of the great American Breakthrough . . . toward the victory of prosperity over poverty.

Well, the 30 years came and went, along with 20 more, and we still wait for the great American Breakthrough and its “victory of prosperity over poverty.”

Like so many government programs, this one – with a tip of the hat to Will Shakespeare – was “full of sound and fury, signifying nothing.” Taxpayers are $20 trillion poorer and 47 million people remain in poverty, an all-time record. The poverty rate today is 15% compared to the 17% rate in Johnson’s day 50 years ago, but the US population was also smaller then, so his 17% represented fewer people. Sadly, 22% of children live in poverty today. Even 1% is too many.
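For readers who want to check that head-count arithmetic, here is a minimal back-of-the-envelope sketch. The population figures are rough approximations I am supplying for illustration; they are not taken from the Census or from the report discussed below.

```python
# Rough head-count check: a lower poverty *rate* can still mean more poor people
# once population growth is accounted for. Population figures are approximations.
pop_1964 = 192_000_000   # assumed approximate US population in 1964
pop_2014 = 317_000_000   # assumed approximate US population in 2014

poor_1964 = 0.17 * pop_1964   # roughly 33 million people at a 17% rate
poor_2014 = 0.15 * pop_2014   # roughly 48 million people at a 15% rate

print(f"1964: about {poor_1964 / 1e6:.0f} million in poverty")
print(f"2014: about {poor_2014 / 1e6:.0f} million in poverty")
```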

Three and a half years ago I wrote about Johnson’s Lost War on Poverty. Obama was at the height of his powers, the country was mired in the Great Recession, and the solution to high unemployment and its induced poverty was, predictably, to spend more money. As I mentioned in that blog, poverty is quite different today than it was when Johnson declared war on it. In Johnson’s day a 21-inch B/W tabletop TV cost about $1,800 in today's dollars and could receive only a handful of channels, a refrigerator with freezer cost the equivalent of $1,510 in today’s dollars, and a two-speed automatic washing machine, primitive by today's standards, cost the equivalent of $1,100. Only 12% of homes had air-conditioning versus over 88% today.

Do these higher living standards for the poor mean that the war on poverty has succeeded? No.

Welfare has raised the material standard of what it means to be “poor,” but the underlying causes of poverty are worsening. A prosperous economy will always be the best anti-poverty program, but we’ve had little of that since Obama was first elected. Unlike Obama, LBJ aimed to give the poor improved economic opportunity, not a permanent dole, so that in time those on welfare would become independent of it, support themselves, and become taxpayers instead of tax recipients. That hasn’t happened. Quite the opposite. As so often happens with government programs, the Law of Unintended Consequences reared its ugly head. Today taxpayers pay the bottom 20% of income earners more than a trillion dollars a year essentially not to work.

Earlier this month the US House Budget Committee, chaired by Paul Ryan, issued a report entitled The War on Poverty: 50 Years Later. Ryan and his colleagues thought they had been sent to Washington to be good stewards of taxpayer money. To that end, the report concludes that government anti-poverty programs have grown, many are duplicative, and more than a few work at cross purposes. For example, there are 92 assistance programs for low-income families, 17 food aid programs, 20 housing programs, and dozens of education and job training programs.

Most of these programs fail to accomplish their purpose. If they existed in the private sector they would have been shut down faster than you can read this sentence. Has the government insisted that the administrative leaders of the agencies in charge of these programs (i) make them work, (ii) shut them down, or (iii) hand in their resignation? No. None of the above. The government – primarily Congress – simply creates more programs. The new programs, of course, are never coordinated with existing programs. So not infrequently you find the equivalent of one agency digging a hole and another filling it up – year after year, billions after billions. Managers in these agencies know that competing programs are a waste. But what the heck. Who really cares?

I’m sure you’ll be shocked, as I was, to learn that last year the CBO found that poverty programs cause many low-income households to face an implicit marginal “tax” rate of nearly 100%. In other words, for every additional dollar a low-income family earns through work, it loses a dollar either in income taxes or in lost welfare benefits. Now, these people are poor, not stupid. They figure out that more work nets nothing because government programs take it away, so they don’t work. That’s how we end up transferring a trillion dollars in program money to effectively pay many of the poor not to work, or not to work more than they do.
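To see how an implicit rate near 100% can arise, here is a minimal sketch with hypothetical numbers. The benefit amount, phase-out rate, and payroll tax share are illustrative assumptions of mine, not figures from the CBO study.

```python
# Sketch of an implicit marginal "tax" rate created by benefit phase-outs.
# All dollar amounts and rates are hypothetical, chosen only for illustration.

def net_resources(earnings):
    """Take-home pay plus means-tested benefits that shrink as earnings rise."""
    payroll_tax = 0.0765 * earnings        # employee share of payroll tax
    max_benefits = 15_000                  # hypothetical benefit package
    benefits = max(0.0, max_benefits - 0.90 * earnings)   # 90-cent phase-out per dollar
    return earnings - payroll_tax + benefits

low, high = net_resources(10_000), net_resources(11_000)
extra = high - low                         # what an extra $1,000 of work nets
print(f"Extra $1,000 earned nets only ${extra:,.0f}")
print(f"Implicit marginal rate: {1 - extra / 1_000:.0%}")
```

With those assumed numbers the family keeps about $24 of the extra $1,000 it earns, an implicit rate of roughly 98%, which is the kind of incentive the CBO finding describes.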

A while back, Morning in America talk show host Bill Bennett invited Ryan to discuss his government poverty program findings and his committee’s report. Ryan observed:

We have got this tailspin of culture, in our inner cities in particular, of men not working and just generations of men not even thinking about working or learning the value and the culture of work, and so there is a real culture problem here that has to be dealt with.

Oops. Wrong thing to say. You can’t use “inner cities,” “men not working,” and “culture” in the same sentence. If you do you’re raaaaa-cist. Predictably the liberal website Think Progress “just happened” to be recording Ryan’s appearance on the Bennett show and the Left Media went ballistic.

The headline on the Think Progress hit piece was “Paul Ryan Blames Poverty On Lazy ‘Inner City’ Men.” He never used those words, nor did he imply the allegation. But who’s quibbling about truth in journalism?

Representative Barbara Lee (D-CA) joined the Left chorus saying, “Let’s be clear, when Mr. Ryan says ‘inner city,’ when he says, ‘culture,’ these are simply code words for what he really means: ‘black.’” Ms. Lee happens to be black. She also happens to be a member of the Congressional Black Caucus, an organization that only black members of Congress may join.

Neither Ms. Lee nor members of the Black Caucus have offered a solution to the appalling ineffectiveness of government-sponsored poverty programs. This is surprising because poverty disproportionately affects black households. Nor did Ms. Lee’s criticism of Paul Ryan acknowledge the undisputed fact that 50 years after the war on poverty began, a large segment of our citizenry – mostly black – is even more dependent on government assistance and less capable of managing its own economic wellbeing. If conservative white lawmakers had set out 50 years ago to cripple the black community in American society and disintegrate its family structure, I believe there would be civil unrest. But that’s precisely what liberal and black lawmakers have done.

Is political ideology so important that the liberal bloc of elected officials would rather continue policies that have failed to help the poor? Are Ms. Lee and members of the Black Caucus willing to stick out their necks, as Ryan has done, and state for the record that the way government measures the success of these programs is by the number of people it enrolls for assistance, not the number who have been liberated from assistance?

I don’t have government experience, but in my world – the world of business – we would judge program success in very different terms.

It seems to me that the viciousness with which Ryan (and Rubio, who favors consolidating all poverty programs into a single state-administered grant) has been attacked is not so much a policy critique as a warning shot across the GOP bow that poverty is “our” program – i.e. the poverty franchise is owned by the Liberal Left. Yet if the GOP were to compete credibly with the Left and find more effective ways to help the most vulnerable in society, the votes of the poor could no longer be taken for granted by Democrats. That’s the real issue. And like any political issue, Democrats would then have to appeal to poor constituents by showing that their solution is demonstrably superior to the Republican plan.

The poor would benefit if there were competition to help them; I’m convinced of that. But the Democrat party wouldn’t benefit. I’m convinced of that too. I don’t think Paul Ryan has all the answers. He may not have the best answers. But at least he took a political risk, which is more than I can say for Ms. Lee and the Black Caucus. I think Ryan is a decent guy. He’s not looking for the votes of the poor. He’s looking for a more cost-effective solution than current poverty programs. Otherwise the poor remain the turf of the Democrat party, which persuades them – as it always has – that Democrats, not Republicans, really care about them, and the poverty aid shell game continues.

In commenting on the Ryan report, George Will recently recalled another report that was the impetus for the War on Poverty:

A year from now, there surely will be conferences marking the 50th anniversary of what is now known as the Moynihan Report, a.k.a. “The Negro Family: The Case for National Action.” In March 1965, Moynihan, then 37 and Assistant Secretary of Labor, wrote that “the center of the tangle of pathology” in inner cities – this was five months before the Watts riots – was the fact that 23.6% of black children were born to single women, compared with just 3.1% of white children. He was accused of racism, blaming the victims, etc.

Forty-nine years later, 41% of all American children are born out of wedlock; almost half of all first births are to unmarried women, as are 54% and 72% of all Hispanic and black births, respectively. Is there anyone not blinkered by ideology or invincibly ignorant of social science who disagrees [that family breakdown and poverty programs correlate]?


Fifty years ago, LBJ promised that the war on poverty would be an "investment" that would "return its cost manifold to the entire economy." It hasn’t done that.

It’s time to take a different approach and give ideas like those of Ryan and Rubio a chance. It’s hard to imagine their ideas making poverty worse or more expensive to fight.

Saturday, March 22, 2014

Putin’s Munich Moment

Protected by two oceans, with friendly neighbors on our northern and southern borders, Americans aren’t naturally inclined to concern themselves with disruptions in other parts of the world. Nothing shows our global indifference and detachment more than not knowing the location of a country or much about it – unless it’s one of the biggies like Germany, Russia, China, and maybe a handful of others.

Take Ukraine, for example. Where the heck is that place? What’s all the fuss about Russia and Crimea? And why should I care?

Google a map and you’ll see that Ukraine is due south of Moscow. Russia lies along about half of Ukraine’s northern border and all of its eastern border. To the south is the Black Sea. Jutting out into it like the Greek Peloponnese is Crimea, the Ukrainian peninsula so much in the news.

When the Union of Soviet Socialist Republics collapsed the day after Christmas in 1991, Ukraine was one of the 15 republics to escape the embrace of the Russian bear. Like the other escapees, Ukrainians are bilingual – Russian was the required lingua franca – and many are multilingual, since a great number of ethnic groups make up their society. But Slavic Ukrainian is the mother tongue for two-thirds of the country – increasingly so as one travels west today – and one-third are Russian speakers with Ukrainian as their second language. The majority of Russian speakers are concentrated in Crimea.

Looking at a map of the former USSR, one is struck by the size of Russia – 11 time zones wide – and by the indicators of its poor economic condition: Russia has 30% fewer miles of paved road than the state of Texas, and its GDP is slightly less than Brazil’s and slightly more than Italy’s. Russia’s GDP is about one-eighth (12%) of US GDP.

Crimea is an autonomous province within Ukraine. That status dates to 1954, when Nikita Khrushchev, who was half-Ukrainian, transferred the peninsula to Ukrainian oversight. At the time the transfer meant nothing, since both Crimea and Ukraine were part of the USSR. But lately Vladimir Putin has been crying, “We wuz robbed,” as he disputes the legality of the 1954 transfer. Since Putin was two years old at the time, he was not consulted. About 60% of Crimean residents are native Russian speakers, many of them living around Sevastopol, the principal port for the Russian Black Sea fleet. Crimea has a governing parliament, but the Ukrainian government has veto power over its actions. As the USSR was falling apart in 1991, 90% of Ukrainians voted to secede. The Crimean vote for independence was 56%, despite its large Russian population.

While Crimea may be an autonomous province, its economy and infrastructure are integrated with Ukraine’s. Over a billion dollars of Crimea’s annual budget revenue comes from the Ukrainian capital, Kiev. Most of its water, 80% of its electricity, and two-thirds of its natural gas come from Ukraine. In other words, Crimea is not a country that can go it alone.

The decade following independence was difficult for Ukrainians (including Crimeans) as the economy converted from the centralized Soviet command model to a market-based one. But relative prosperity arrived and continued until the 2008 world economic meltdown. As Ukrainians tasted the good life, they tilted toward the west. The west in turn tilted toward Ukraine, inviting the country to join the European Union provided it cleaned up its political act. One of several conditions required freeing Yulia Tymoshenko from prison, the details of which can be read at her hyperlink. (You may recall that she was Prime Minister when Viktor Yushchenko became the third post-independence president. He allegedly was poisoned by Russian agents who favored the pro-Russian candidate, Viktor Yanukovych. Yushchenko’s face was grotesquely pocked by the poison, but he eventually healed.)

A victim of the 2008 global meltdown, the Ukrainian economy went south, and Yushchenko’s popularity went with it. He lost the 2010 presidential election to Viktor Yanukovych, a corrupt Russian stooge. Despite Yanukovych’s promise to join the EU, when the time came to sign on the dotted line, he reneged. His best bud, Vlad Putin, had made him a better offer – a $15 billion loan, discounts on natural gas imports from Russia, and membership in the Eurasian Union, also called the Customs Union, which Putin intends to be a counterweight to the EU. The Customs Union does not even exist at this time and won’t exist until early 2015, if then. But one fact is certain: no democracies need apply.

The Customs Union will be a political and economic union more in the mold of the old Soviet suppression of civil, political, and economic liberty. It will be a throwback to old-style Cold War paranoia. Putin’s Kremlin suspects the real motive behind the EU’s flirtation with Ukraine is to weaken Russia and its influence in the world. Thus Putin’s Syrian mediation, which bested Team Obama-Kerry (game, set, and match), showed his determination to be a global alternative to the influence of the west.

President Yanukovych’s repudiation of the EU last November squelched hopes for westernizing Ukraine. Gone were the hopes of improving the lives of its citizens through the much-needed economic, political, and judicial reforms that EU membership would have compelled. Protestors responded by taking to the streets. That is what they had done to dispute the rigged 2004 election, which Yanukovych supposedly won until the courts nullified the result, allowing Yushchenko to become President. Unlike in 2004, when street protests were peacefully tolerated by the government, this time the incompetent Yanukovych government fought back. A number of protesters were killed. Yanukovych fled Kiev for eastern Ukraine, which is more pro-Russian. Parliament voted to remove him from office and replace him with an interim President. And Tymoshenko was released from prison, allowing her to fly to Germany for treatment of spinal injuries she received while imprisoned.

Seizing upon Ukrainian political instability, Putin expanded his military presence in Crimea. He alleged the move was made to protect the security of the Russian fleet base and Russian revenues from the natural gas lines that run through Ukraine to Black Sea ports. Hogwash. Russian troops also crossed the Crimean-Ukrainian border to seize a natural gas plant on Ukrainian soil.

An extraordinary election was called to give the pro-Russia Crimean peninsula the opportunity to secede from Ukraine and rejoin Mother Russia. The election ignored all constitutional requirements. People opposed to secession and repatriation with Russia stayed away from the polls, fearing for their lives. Some who voted may have done so illegally, including voting more than once. Election observers and cameras weren’t allowed. Remarkably, 125% of the registered voters voted. Not so remarkably, secession won with 96.7% of the vote. Accordingly, as of Sunday, Crimean time was set to Moscow time, banking reorganization began to integrate with Russian banking, and other forms of Russian integration were initiated.

Putin further claims he has the right to "defend" "Russian citizens" anywhere in Ukraine. Since Russians are scattered throughout the country, that means he can invade at will, despite the fact that more than a few Russians don’t want repatriation with Mother Russia.

If you’re a student of history thinking all of this seems strangely familiar, you’d be correct. In 1938 Hitler demanded that Czechoslovakia cede the Sudetenland – the portion of Czechoslovakia populated by German speakers. The cowards in the west, led by British Prime Minister Neville Chamberlain and French Prime Minister Édouard Daladier, did nothing – but talk.

Like Hitler, Putin is a bully and one heck of a good poker player. He effectively invaded Georgia almost six years ago and annexed two provinces. As it had done 70 years earlier, the west did nothing but talk. Putin now invades Crimea, crosses into Ukraine to seize a gas production facility, and reserves the right to make further incursions. Obama and Kerry talk, warn, wag fingers, and do nothing.

Putin has seen these two clowns in action before. Obama-Kerry established Syrian red lines and did nothing. They warned Iran that going nuclear wouldn’t be tolerated and did nothing. They warned North Korea about firing provocative missile test shots and did nothing. They warned China about maneuvers in the international waters of the South China Sea and did nothing. Putin watched as America shed its military might to increase welfare spending. He watched as Obama pulled out of Iraq and Afghanistan so inadvisably as to squander all gains paid by American blood and treasure. He watched as Obama abdicated the US role as world leader and focused his administration on US income inequality. He watched and watched and watched, and rightly concluded that Obama is an over-cautious professorial politician in the Woodrow Wilson mold … a good talker but a feckless leader.

We don’t have to respond to Putin with military action. In many respects Russia is a third-world country. Shutting down all business – and I mean ALL BUSINESS, including our energy purchases – with Russia and trying to get our allies to do likewise would bring Russia to its knees. Who will give Russia economic aid then? China? It has its own economic problems. Sending military aid to Ukraine would help salvage “Ukrainian military” from its current oxymoron status, arming it to resist. There are other non-military initiatives we could take to let Putin know he was in a bare-knuckle fight. But alas, Obama hasn’t the spine to face down Putin as his hero Kennedy faced down Khrushchev in the Cuban missile crisis.

With a little imagination anyone can see where Putin is headed. He wants to reconstruct the Russian empire, whose collapse he has called “the greatest geopolitical catastrophe of the century.” Putin’s geopolitical mentor, Aleksandr Dugin, promotes a radical ideology that sounds very much like Hitler’s Anschluss – except that now it is the reuniting of all the Slavic people in Europe. “Only a global crusade against the US, the West, globalization, and their political-ideological expression, liberalism, is capable of becoming an adequate response. ... The American empire should be destroyed.” And the Russian empire should be resurrected. Tough talk. Google Dugin’s name. You’ll get an eyeful.

Putin hasn’t been jetting around Eurasia making friends, loans, and weapon deals because he’s collecting frequent flyer points. As Admiral Painter famously told Jack Ryan in The Hunt for Red October, “Russians don't take a dump, son, without a plan.”

Those who think my concerns are just conspiracy theories should remember that six months after the 1938 Munich Agreement, Hitler swallowed up all of Czechoslovakia. Six months after that he invaded Poland and massacred its citizens. Six months later he invaded France, defeating it in one month. For the next five years, the world fought a war that cost 60 million lives. Almost one and a half million were Ukrainians – 15% of the Soviet losses.

Evil cannot reveal its aims when it initially appears. Its course and ultimate consequence are determined by the neglect of men anxious to minimize their response to it.

Now, what was that prophetic statement by George Santayana?

Ah, yes: "Those who cannot remember the past are condemned to repeat it."

Saturday, March 15, 2014

Our National Insanity

Like lemmings leaping off a cliff, Americans this past Saturday night dutifully joined in the lunacy of the political class by playing the twice-a-year game, “Pretend You Can Change Time.”

All of the clocks in our houses, the ones on our wrists, and the ones in our cars had to be moved forward an hour. (Phones and computers are smart enough to do it on their own.) In the fall we repeat the insanity and move all of the clocks back to the time they were. I have over 60 antique clocks so the DST game is a particular annoyance.

But the one clock that really matters couldn’t be changed – the one in your body. Most people, therefore, looked at their newly reset clocks Saturday evening and said, “My gosh! 11 o’clock already? It can’t be 11 o’clock; I’m not even sleepy.” Well, it wasn’t 11 o’clock in body time, so many of us stayed up until midnight. But if we had someplace to go Sunday morning – like church – we lost an hour of sleep according to our bodies and hit the road bleary-eyed behind the wheel of a 4,000-pound automobile.

Since it can take up to a week for people to acclimate their body to a new time schedule, millions of commuters were virtual guided missiles in their morning commutes this week, combining the still lingering loss of an hour of sleep with the hazard of a new hour of darkness.

Benjamin Franklin originally conceived the idea of resetting clocks in the summer, when the sun rises earlier and sets later. Legend has it he was awakened by sunlight streaming through a shutter one morning during his assignment as US ambassador to France. That set his boundless imagination to calculating that Parisians burned up about 65 million pounds of candles doing things in the dark that could be done during daylight hours if they would just roll out of the sack earlier. (Apparently he was unaware of the life of a farmer.) Franklin proposed waking the city folk at daybreak with cannons and church bells and, to give them an incentive to go to bed after sunset, imposing a tax on candles. The idea went nowhere in his lifetime, but it identified an ingenious new way government could interfere in the lives of citizens.

In Franklin’s time people never traveled far from their birthplace. The activities of life and work didn’t make it necessary. So time was determined locally, using the sun’s zenith to establish solar noon, to which all local clocks were set. Cross a river and you could be in a different time zone. After the industrial revolutions in England, Europe, and the American north, commerce and transportation scheduling were chaotic absent a more standardized time system. So in 1884 the International Meridian Conference agreed on a time standard that made zero time the time on the longitude passing through Greenwich, England – thus Greenwich Mean Time, or Zulu time after the military’s phonetic “Z”, was established. GMT became the time west of Greenwich until a westbound traveler reached 15 degrees of longitude. At the 15-degree boundary a new time zone started and continued for another 15 degrees.

The earth is roughly spherical, meaning that a circle is traced by traveling due west from Greenwich until the traveler returns to Greenwich from the east. Since a circle has 360 degrees, establishing time zones every 15 degrees of longitude means there are 24 time zones, each with a time standard one hour earlier than that of the zone immediately to its east.
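Here is that arithmetic as a few lines of Python. It follows only the simplified 15-degree-band scheme just described; real zone boundaries bend around political borders, so the example longitudes are approximations.

```python
import math

def zone_offset_westward(longitude_west_deg):
    """Hours behind GMT under the simplified scheme described above:
    each successive 15-degree band west of Greenwich is one hour earlier."""
    return -math.floor(longitude_west_deg / 15.0)

print(zone_offset_westward(0))     # Greenwich band -> 0
print(zone_offset_westward(84))    # 84 degrees west -> -5 hours (US Eastern)
print(zone_offset_westward(120))   # 120 degrees west -> -8 hours (US Pacific)
```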

Daylight Saving Time was instituted by government edict in the administration of Woodrow Wilson, who during World War I believed coal would be saved by moving clock time back an hour. (Yeah, I know the “spring forward” thing, but relative to the sun, clock time goes backward when the hour hand goes forward.) This was part of a general law, the Standard Time Act of 1918, which imposed a congressionally mandated time under the ever-elastic Commerce Clause of the US Constitution. Wilson wanted DST clock time to be permanent (he was an avid golfer), but farmers raised a ruckus because their routines were determined by solar time. So clocks resumed their Standard Time setting.

But along came World War II, and Franklin Roosevelt reinstituted clock jiggering, which he dubbed “War Time,” so DST prevailed throughout the year. After the war, states were given the choice to stick with Roosevelt time or return to Standard Time. Some states opted in, others opted out, and chaos reigned. Once again Congress intervened, passing the Uniform Time Act of 1966, under which states observing DST had to start it on the first Sunday in April and end it on the last Sunday in October. Those dates were changed by the Energy Policy Act of 2005, which beginning in 2007 started DST on the second Sunday in March and ended it on the first Sunday in November.

The energy savings that are the alleged purpose of DST have never been proven. One government study showed a 1% reduction in energy consumption – a pittance – but other studies show that the shift to DST costs American business $150 million. I know, that’s also a pittance in the age of Obama, but add to it the increased incidence of auto accidents immediately following the switch to DST and the increased adverse health incidents associated with disrupted body clocks, and it appears obvious that the cost of DST compliance exceeds the benefits.

Nevertheless last Sunday morning, March 9, at 2 a.m. a miracle occurred! By government edict an hour disappeared! The federal government ordered time to advance an hour and POOF! the hour between 2 a.m. and 3 a.m. vanished. Just like that. BAM! and it was gone. What if I’d had someone to meet at 2:30 a.m.? Would we have had to meet in an alternate universe? “Hey Matrix Man, the government just blew up the hour when we were going to meet. Can we do it one universe over?”

Where I live the sun rose – if any politicians are reading this, that happens in the east – at 6:57 a.m. last Saturday March 8. The sun set at 6:40 p.m. That gave me 11 hours, 42 minutes, and 38 seconds of daylight. That’s cool. I got everything done I needed to do. Solar noon occurred at 12:48 p.m. I hardly noticed that it was 48 minutes late. ‘Course I could have walked, clock in hand, about 700 miles east to the Boston longitude and my clock’s noon and solar noon would have been about the same. But that stuff doesn’t bother me. Besides, I’d have been 500 miles off the coast of Savannah in the ocean.

Thanks to DST solar noon now comes at 1:48 p.m. That puts clock noon in the solar mid-morning. The alleged purpose of the “spring forward” nonsense – to save daylight – does nothing of the kind. On the Monday morning following the beginning of DST, the sun rose at 7:55 a.m. where I live and set at 7:41 p.m. giving 11 hours, 46 minutes, and 49 seconds of daylight. The solar morning and afternoon daylight were equally divided – 5 hours and 53 minutes in each. However, if I reckon daylight by clock time, the day began with the 7:55 a.m. sunrise and continued for 4 hours and 5 minutes until clock noon. The afternoon ran from clock noon until sunset at 7:41 p.m. – that is, 7 hours and 41 minutes. So by clock time Monday saw a tad more than 4 hours of daylight in the morning and afternoon saw almost 8 hours. The duration of daylight was the same in hours as it was by solar reckoning. But DST silliness removes an hour from “morning” and adds it to “afternoon.”
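Here is a small sketch of that reckoning, using the sunrise, sunset, and noon figures quoted above (times are handled as minutes after midnight). The shift does not change the amount of daylight; it only relabels where “noon” falls within it.

```python
# Morning/afternoon daylight split, reckoned by solar noon vs. DST clock noon.
# Sunrise and sunset are the Monday figures quoted in the post.

def hm(h, m):
    """Convert an hour/minute clock reading to minutes after midnight."""
    return 60 * h + m

def split(sunrise, sunset, noon):
    """Minutes of daylight before and after the given 'noon'."""
    return noon - sunrise, sunset - noon

sunrise, sunset = hm(7, 55), hm(19, 41)     # 7:55 a.m. and 7:41 p.m.
solar_noon = (sunrise + sunset) // 2        # midpoint of daylight: 1:48 p.m. DST

for label, noon in [("solar", solar_noon), ("clock", hm(12, 0))]:
    am, pm = split(sunrise, sunset, noon)
    print(f"{label} reckoning: {am // 60}h{am % 60:02d} before noon, "
          f"{pm // 60}h{pm % 60:02d} after")
```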

DST doesn’t affect evening commuters unless they get home quite late. Commuters who normally get home at 6:30 p.m. clock time had daylight both before and after the annual onset of DST. However, morning commuters who normally leave home at 7 a.m. had daylight commutes before the onset of DST, but now their commute starts in the dark and continues so for almost an hour. Where’s the energy saving?

Office buildings, which burn electricity mostly when they are occupied, turn on the lights an hour earlier and turn them off an hour earlier. Where’s the energy saving?

But there is considerably more energy expenditure because of DST. It’s the energy expended by parents trying to get their children to bed and asleep now that government has pumped all this light into their afternoon. “But mom, it’s still light outside! Why do we have to go to bed earlier?” They don’t get it.

Neither do I.

On November 2 Americans will waste more time setting maybe a billion clocks and watches back an hour. For what? Why don’t we just pick a time and stick to it?

Notwithstanding the lunacy rampant in Washington, we have as much daylight as we’re going to have. It can’t be saved and it can’t be created. The creation of daylight and time happened before there was either. Or politicians.

Yes, Virginia; there was a time when there were no politicians.

Saturday, March 8, 2014

Zombie Nation

We don't read and write poetry because it's cute. We read and write poetry because we are members of the human race and the human race is filled with passion. Medicine, law, business, engineering, these are noble pursuits and necessary to sustain life. But poetry, beauty, romance, love, these are what we stay alive for.

To quote from Whitman, "O me! O life!... of the questions of these recurring; of the endless trains of the faithless... of cities filled with the foolish; what good amid these, O me, O life?" Answer: that you are here; that life exists and identity; that the powerful play goes on and you may contribute a verse; that the powerful play goes on and you may contribute a verse.

What will your verse be?


This is the question John Keating (Robin Williams) asked his students in a scene from the Dead Poets Society. Keating’s character was loosely fashioned after Sam Pickering who attended Montgomery Bell Academy in Nashville as a student and later returned there to teach for a year – just as the Keating character attended and taught at the fictitious Welton Academy in Vermont.

One of Pickering’s students, Tom Schulman, later wrote the screenplay for Dead Poets Society using Pickering’s unorthodox teaching methods as his inspiration.

I was never moved to write a screenplay about my English teacher, but when I attended a military prep school in Tennessee he was as passionate, though not as unconventional, as John Keating. Somehow Captain Standard managed to capture the interest of a classroom full of 16- and 17-year-olds whose hormones put anything but Shakespeare, Browning, and Byron in their minds. That he could sell us on reading, not to mention memorizing, Ode on a Grecian Urn, It Is a Beauteous Evening, and Ozymandias was a tribute to Captain Standard’s passion. It would have made Keats, Wordsworth, and Shelley proud.

In a world of mouse clicks and key words, it’s easy today to learn that Samuel Taylor Coleridge composed several hundred or more lines of poetry in a dream (aided by a bit of opium extract) after reading a book about Xanadu. After awakening he managed to get some of the poem’s lines on paper before he was interrupted by “a person from Porlock,” whose visit kept Coleridge from his writing desk for an hour. When he returned to his desk, the dream lines were mostly gone, and what little we have in the last dozen lines of Kubla Khan today is all that remained. Captain Standard carried tidbits like that in his pre-Internet head six decades ago and liberally sprinkled them into his lectures upon introducing a new assignment. I still remember them today.

Upon entering the engineering school of Tufts University (it was only a college then) as a freshman, I had to take the English courses that the academic worthies considered the minimum needed to be an educated person in civil society. In those classes I met the club-footed Philip Carey in Somerset Maugham’s Of Human Bondage. I met Pepe Torres in John Steinbeck’s Flight and Le Père Goriot in Honoré de Balzac’s eponymous novel.

I suspect I’ve not earned a dime more over the course of my professional career for having been exposed to these and other works in the humanities during my stays at Tennessee Military Institute and Tufts. But I think they made my life a more curious and richer experience. I appreciate that both Captain Standard and the Tufts worthies decided what a teenaged boy should know without consulting me. Thanks to their decision I became a lifelong learner. Appropriately, they served only the appetizer. The banquet choice was mine.

I don’t believe I’m an educational snob. But I was shocked to read in this week’s Wall Street Journal of Kyle Bishop’s Ph.D. dissertation proposal to the English department supervisory committee of the University of Arizona. It proposed that he do “scholarly research” on zombie movies! It was approved! When I submitted my dissertation proposal 40 years ago it was expected that doctoral students would have spent months confirming that the subject they chose to research was unique among prior doctoral research projects and would produce a scholarly contribution to its field of knowledge worthy of the university. More than a few proposals in my department were rejected as not scholarly enough. And yet Dr. Bishop is now chairman of the Southern Utah University English department and has lectured on zombies in Spain, Hawaii, and Canada.

I hadn’t realized how important zombie research was in academia until I also read in the article that Professor Juliet Lauro in the Clemson English department teaches how zombies represent the struggles of slavery and oppression. She is writing a book entitled The Transatlantic Zombie: Slavery, Rebellion and Living Death. Maybe she got her inspiration from Dr. Bishop who turned his dissertation into a book, American Zombie Gothic: The Rise and Fall (and Rise) of the Walking Dead in Popular Culture, which sold a thousand copies.

A professor in one of California’s universities has edited a book, The Economics of the Undead, which “raises issues of the use of resources” should our country or the world face a catastrophic misfortune. A zombie “scholar” in a university in England is working on a tome that "seeks to investigate zombie sexuality in all its forms and manifestations." Now that’s something I always wondered about.

Heather Mac Donald informed us in an article earlier this year that UCLA students majoring in English – majoring in English, mind you – were required prior to 2011 to take a course in Milton’s works, another in Chaucer’s works, and two courses in Shakespeare. That is, until the junior faculty staged a departmental coup d'état and announced that the pillars of English literature were so “yesterday.” A UCLA English major may now take three courses from four “academic disciplines” – (i) Gender, Race, Ethnicity, Disability, and Sexuality Studies, (ii) Imperial, Transnational, and Postcolonial Studies, (iii) genre studies, interdisciplinary studies, and critical theory, or (iv) creative writing. In other words, a UCLA English major may now graduate without taking an English course. I’m not making this up.

And in the general student population – i.e. those not specializing in English – Ms. Mac Donald informs us, “UCLA’s undergraduates can take courses in Women of Color in the US; Women and Gender in the Caribbean; Chicana Feminism; Studies in Queer Literatures and Cultures; and Feminist and Queer Theory.”

The collegiate educational establishment has gone nuts.

The psychobabble of today’s elite on university campuses obsesses over class, race, gender, inequality, victim status, the sins of our fathers … and zombies (I don’t want to leave that out). It is turning its narcissism into fraudulent disciplines of academic scholarship. This is the modern-day equivalent of Esau trading his birthright – his intellectual inheritance deeded in a millennium of art and knowledge – for a bowl of porridge, the “alternative rubrics of gender, sexuality, race, and class.” A college education may soon replace once-serious inquiries into the minds and expressions of ages past with navel-gazing exercises in self-discovery. Why bother to go to college for that?

Several days ago I was reading a book and came upon a passage that stopped me cold. The author, referring to another person, observed, “His life existed in a minor key – a symphony pathetique – until last year when he got it all together and those minor chords gave way to the major key of success.” Reading that I wondered how someone schooled in gender, race, disability, oppression, and the politics of inequality could have a clue what this author meant. When Reagan met Gorbachev at Reykjavík in the fall of 1986 the Soviet leader characterized the Cold War as “the labor of Sisyphus.” Would an education steeped in the “scholarship” of women’s studies, Chicana feminism, and studies in queer literatures and cultures inform its graduates of Gorbachev’s complaint?

Early this year I began reading Robert Edsel’s book, The Monuments Men: Allied Heroes, Nazi Thieves, and the Greatest Treasure Hunt in History. It chronicles the exploits of a small group of art experts who landed in Europe as the fighting still raged in 1944. They were not young men. They had been recruited for their expertise as art curators in civilian life. Armed with a signed order from General Eisenhower, they were tasked with recovering the objects of art that Hitler and the Nazis had pillaged from museums and private collectors during the German occupation.

While many priceless works were lost, the Monuments Men, as they came to be labeled, recovered most of what Eisenhower called the symbols of “all that we are fighting to preserve.” One of the men was directed by the French underground to Mad King Ludwig’s mountaintop castle, Neuschwanstein, where he found 12,000 stolen art objects. Two other Monuments Men found a salt mine in Altaussee, Austria, where Michelangelo’s Madonna and Child was found among 137 sculptures, 6,600 paintings, and thousands of rare books and art objects. The meticulous Nazis cataloged what they had stolen and noted its hidden location. Even as recently as last month, stolen art from the Nazi era was being found and repatriated.

A film – loosely based on the book – has been in theaters recently. While I recommend the book over the film, anything that informs the public of what the Monuments Men accomplished honors their achievement. It did not come cheaply. Two were killed in combat, as much for love of art as love of country – Captain Walter Huchthausen, an American art scholar, and Major Ronald Edmund Balfour, a British art scholar. Astonishingly, this small group of art conservators recovered the world’s hallowed heritage of artistic expression for future generations to study and enjoy.

I expect today’s academic elites wonder why they bothered.

Saturday, March 1, 2014

Unsettled Science

This is not opinion. This is about facts. This is about science. The science is unequivocal. And those who refuse to believe it are simply burying their heads in the sand. President Obama and I believe very deeply that we do not have time for a meeting anywhere of the Flat Earth Society … The bottom line is this:… climate change can now be considered another weapon of mass destruction, perhaps the world’s most fearsome weapon of mass destruction.

Well, there you have it. That’s how Secretary of State John Kerry laid it out for the good folks of Indonesia when he spoke there in mid-February.

Signaling a new global warming offensive by the Administration, Kerry’s speech synched up nicely with Obama’s; the President had left a trail of carbon emissions across the country to visit California at about the same time. Ostensibly the purpose of Obama’s visit was to give aid, comfort, and a major climate policy speech to the drought-stricken state. But most of his time was spent playing golf on several of the exclusive courses among the 124 in and around Palm Springs, which suck up a quarter of the water drawn from the area’s underground aquifer. Each of these 124 beauties drinks about a million gallons a day – yes, per day – because of the dry desert air. Emphasis on desert. It would be a shame if any of them browned out like the crops of California’s farms, which – thanks to Obama’s EPA regs – are on water rations in order to protect salmon and an endangered 3-inch fish, the Delta smelt.

Obama’s pre-golf speech assured his staged audience that he was “directing all federal facilities in California to take immediate steps to curb their water use, including a moratorium on water usage for new, non-essential landscaping projects.” I wonder if that includes golf courses. No mention of farms that are failing or cattle herds that are being broken up and sold, because Fish Before Families is Obama’s motto.

“The budget that I sent to Congress – the budget that I send to Congress next month will include $1 billion in new funding for new technologies to help communities prepare for a changing climate, set up incentives to build smarter, more resilient infrastructure.” Ah, yes! A billion dollars of taxpayer money in a Green Mafia venture fund. That’s the kind of solution I’d expect from a rookie who’s never had a real job. Most of California’s 20th-century infrastructure was built during an historically exceptional wet period. California is naturally arid. That’s why part of it is desert, where fools build golf courses so that an imperial American president and the 1% he despises have playgrounds.

John McCain erupted after the Kerry speech to say that instead of focusing on Syria, Iran, and the Israeli-Palestinian negotiations, Kerry has been “butterflying around the world, saying all kinds of things. So he has to go over to Asia and talk about climate change and say it’s the most important issue. Hello? On what planet does he reside?”

Former House Speaker Newt Gingrich called for Kerry to resign, saying he was “delusional” to rank climate change as a greater threat than unrest all over the world.

The “settled science” on which Obama and Kerry anchor their radical arguments for hamstringing American industry is anything but settled. In fact, if one travels back in time, examples abound of how unsettling settled science can become.

Take Galileo, for example. The poor fellow would likely have become a “crispy critter” had he lived in northwestern Europe instead of Rome during the inquisitions of the early 17th century. For disputing the settled science that the earth was the center of the universe, the Roman Inquisition let him off lightly – the rest of his life under house arrest. Three hundred and fifty years would pass before the Pontifical Academy of Sciences of the Catholic Church would admit (and only after 13 years of investigating Galileo’s condemnation) that the Church was wrong in 1633 and Galileo was right. Ah! Settled science dies hard.

Oh, and then there’s that Roman doctor guy, Galen I believe his name was. He settled some science around 150 years after Christ’s birth. Thanks to him everyone just knew – with the same certitude that Kerry and Obama know – that illness was caused by getting your humors (blood, phlegm, yellow and black bile) out of whack. The solution was to bleed the patient to get those bad old humors out of the body. Settled science – shorthand for “everybody knows” – therefore concluded that the feverish George Washington suffered from humors gone tilt, despite the fact that the 67-year-old former president had simply spent too much time outdoors in cold rain while inadequately clothed. The unsettled science of today would have diagnosed his symptoms as an inflammation of the epiglottis, that stiff but flexible piece of cartilage at the back of the tongue. Yet everyone present that December in 1799 just knew the old man had too much bad blood in his body. It’s estimated that his physicians removed 37% of it during his 16-hour treatment. In other words, the settled science of Washington’s day bled him to death.

Benjamin Rush, a signer of the Declaration of Independence, was a respected physician in Washington’s time. The settled science of humors led him to develop and patent his well-known Rush’s Thunderclappers – an appropriate name for a pill that would induce explosive diarrhea thus venting bad humors in a different manner. The members of the Lewis and Clark expedition had the misfortune of packing 50 dozen of these intestinal warheads because no doctor would accompany their journey, and the fame of Rush’s Thunderclappers had spread far and wide – probably quite literally. Since the diet of the Corps of Discovery was almost exclusively the game they shot along the way and little was eaten in the way of plant fiber, the explorers fell frequent victims to – you guessed it – constipation. Thunderclappers to the rescue!

Thunderclappers were laxatives on steroids. Each dose was 60% mercury, enough to kill a man except that the volatile pill shot through a digestive system at warp speed – too fast for much of the heavy metal to be absorbed into bodily cells. Mercury doesn’t break down in soil either. So in a quirk of science that’s really settled, the trail of the Lewis and Clark expedition is traceable even today by following the deposits of mercury chloride in the soil where nature and chemistry met but never mingled.  Talk about toxic waste.

These are but a few examples of the arrogance of the ages whose stakeholders could not conceive that science had withheld some of its secrets from them. All that could be known was known, or so they believed. If you didn’t imbibe the Kool-Aid of their conventional wisdom you were, in the words of John Kerry, a flat earth thinker.

But consensus and science make strange bedfellows in every generation. The Royal Society of London, more formally known as The Royal Society of London for Improving Natural Knowledge, was formed in 1663 – about 30 years after Galileo got himself wrapped around the settled-science axle. The motto adopted at its formation is Nullius in verba – Latin that translates roughly as “take nobody’s word for it.” A looser translation could be, “Don’t believe what other people tell you, however authoritative they may be.” And as I recall the lessons of history, the flat earth thinkers of the past were the Al Gores, John Kerrys, and Barack Obamas of their age – the true believers in settled science … settled science as it was believed to exist, not as it was known to exist. Along comes Columbus and … well, you know the rest.

How unfortunate that high altitude balloon and satellite direct measurements of global warming have corresponded closely with each other’s data but contradicted computer model projections which predicted warmer temperatures that never happened.

How unfortunate that after years of Henny Penny warnings that the earth was getting warmer, the Intergovernmental Panel on Climate Change – the most authoritative climate initiative yet launched – has now reversed itself and admitted that actual recorded temperatures have risen at only a quarter of the expected rate. The IPCC can’t explain why the temperature “pause” it acknowledged last year was not forecast by its computer models, and it can’t explain why there has been no statistically significant temperature rise since 1997. A search of the hyperlinked document above shows the words “uncertain” and “uncertainty” appear over 1,300 times.

No less than the New York Times found it necessary to say “yes, but” after Obama’s California speech, observing that no scientific evidence substantiated his claim that the drought is due to man-made climate change. Indeed, the Times went so far as to quote noted climatologist Richard Seager, who is “pretty sure the severity of this thing is due to natural variability.”

Apart from the fraud, rigged data, and outright incompetent measurements revealed in the climate change debate, the facts most embarrassing to the Gores, Kerrys, and Obamas are the recent winters. They have been among the coldest recorded in many parts of the world. The winter now ending in Europe has been one of its most severe in 70 years. Where I live we had two back-to-back storms that deposited several inches of snow and sleet in a major city with no snow removal equipment to speak of.

Where’s the global warming? Not to worry. Its advocates simply re-label the rhetoric of the debate and now call it climate change. Renaming the ideology “climate change” allows it to embrace hurricanes and typhoons, extreme winters, and extraordinary rains as well as droughts and heat waves.

But even as the climate change ideologues in the US are hamstringing our competitiveness with regulations and junk science, European and Asian industrial economies are abandoning their renewable and alternative energy initiatives for the plain fact that they can’t afford them and remain competitive in a global economy. More importantly, they aren’t convinced these initiatives are solving the right problem.

“No question is so difficult to answer as that to which the answer is obvious,” George Bernard Shaw once observed. Witness the 1970s, when President Carter tried to convince the American people of the “obvious” – that the world would run out of fossil-based energy in relatively few years. Today, almost 40 years later, we estimate that 1,000 years of energy reserves exist.

“It isn’t what people know that gits ’em in trouble,” Will Rogers often observed, “it’s what they know that ain’t so.” What Kerry, Obama, and climate change ideologues of their ilk “know” ain’t so. And while the industrial economies of Europe and Asia head in the opposite direction, the misguided regulations and executive orders of Obama & Co. will keep restricting our economy on the basis of what “ain’t so,” getting it – and those of us who depend on its vitality – into real trouble.