Saturday, December 28, 2013

Resolved

It’s that time again.

Time to change those things we don’t like about ourselves in the annual ritual called New Year’s resolutions. Time to surrender to the allure of every January 1: a new year, a new beginning, potent with fresh-startism.

The compulsion to do a self-makeover is stronger on January 1 than at any other time of the year (who ever heard of making resolutions on June 12?). Despite the fact that, as the popular aphorism goes, every day is the beginning of the rest of your life, a fresh start, and a second chance, January 1 is different – or seems so.

The hope and change euphoria of New Year’s resolutions does not admit that nine in ten of them will fail, according to a survey conducted a couple of years ago, and most of those failures occur quickly – within a month. So if one of your resolutions is to hit the gym and get the old bod in shape, my suggestion would be to buy the weekly or monthly plan, not the enticingly priced annual plan that gives “two months of ‘free’ membership.” Those clever gym owners know that most of the people who sign up in January won’t be there at year-end; in fact most won’t be there after Valentine’s Day, which is why they price annual memberships so attractively.

I’ve belonged to two gyms at different times over the past 25 years and I always hated January. Workout times lengthened because oversold memberships overwhelmed the capacity of the machines. My only solace was knowing that I only had to put up with it for a month. Most of the new members would be gone by then and workout times would return to normal. The gym owners knew the same thing: don’t buy more machines; just put up with the old members’ complaints for a month and this too shall pass.

If we used a product that failed as often as New Year’s resolutions, we’d stop using it and we’d tell all of our friends to avoid it. But we are convinced there’s a difference with resolutions. When products fail it’s the product’s fault; when resolutions fail, it’s our fault. Not enough willpower, the old scold looking back from the mirror tells us. Maybe more positive affirmations pasted around the house and office would have helped.

But changing the person you are is not a rational process. It’s an emotional process. And emotions can’t be controlled by force of will alone, at least not for long. Still, the temptation to try harder is compelling, all the while ignoring that the psychological “muscle” of willpower is like any other muscle. Pushed to the limit by having to resist one temptation after another now forbidden by a new resolution, it ultimately fails. And when we pursue too many New Year’s resolutions, the willpower muscle fails sooner.

The only way to deal with limited willpower is to realize that willpower is limited. Sounds obvious but apparently it isn’t. We can’t make up for limited willpower by trying harder. Trying harder assumes more effort will create more willpower. At the same time we don’t want to be limited by our limited willpower. So, how do we keep something that’s limited from limiting us? We avoid relying on it as much as possible. One way to avoid relying on willpower is to distract our focus away from whatever takes willpower to resist.

In his well-known book, Emotional Intelligence: Why It Can Matter More Than IQ, author Daniel Goleman cites the children’s marshmallow experiment developed by Walter Mischel at Stanford University. It was suspected that a child’s ability to delay immediate gratification was a precursor to success in later life as an adult. Mischel created the experiment to test his hypothesis.

A marshmallow was placed in front of each child in a group arranged around a table with the assurance that if they didn’t eat the marshmallow, they would be given another when the researcher returned. The researcher didn’t reenter the room for 20 minutes – an eternity for a child staring down temptation – during which time the behavior of the children was observed through one-way glass.

Some children surrendered to temptation almost immediately, others held out until they no longer could, and still others succeeded in holding out until the researcher reentered the room with their reward. The group was tracked longitudinally into their adult careers and it was found that the children who had been able to delay gratification later got better SAT scores, had fewer problems controlling their weight, and tended to have more career success than those who couldn’t delay gratification. Goleman labeled the ability to delay gratification “emotional intelligence” and concluded it was a better predictor of adult success than IQ.

Goleman’s conclusion is controversial because it’s unclear if success in life correlates with the ability to delay gratification or is caused by it. Correlation and causality are quite different. If a farmer observes that his rooster crows when the sun comes up, he might conclude (incorrectly) that the crowing rooster causes the sunrise. In fact there is no causality; they correlate by happening at the same time. Roosters crow during daytime also.

What is often ignored in reciting the Mischel research is the behavior of the children who “delayed” gratification. Some sang to themselves, others fiddled with their clothing, and still others put their heads in their hands or on the table top. In other words, they distracted themselves from the marshmallow. Quite possibly their willpower was no stronger than any of the other children’s. But they had somehow learned to distract themselves to avoid burning up their willpower reserve. In school they likely used that same strategy to distract themselves from the temptation to play, so they studied more. In their careers they distracted their attention from disruptive temptations, allowing them to advance faster. Their success was due less to the ability to delay gratification than to a simple and repeatable secret: when life serves dessert, look at the salad bowl.

In other words, the seemingly strong-willed children learned to avoid as much as possible putting their willpower to the test – a practice they carried into successful adulthoods.

So why do we put ourselves through the annual ritual of making New Year’s resolutions, knowing from past years that our willpower to keep them will fail? Isn’t doing the same thing and expecting different results the definition of insanity? I think the answer lies in the ritual of New Year’s resolution-making itself. It fires us up and inoculates us against what the late Zig Ziglar called “stinkin’ thinkin’” – i.e. doing something positive makes us feel positive, if only temporarily. We ignore past failure and hope for the best one more time, just as an alcoholic believes he can take one drink.

Am I suggesting that we should give up making New Year’s resolutions because they fail in large proportion and do so fast? No. Self-examination and making an effort to change ourselves are preferable to doing neither. And research has shown that people who formalize their goals as resolutions are more likely to achieve them than people who have the same goals and motivation but don’t formally commit to them. The keys to New Year’s resolution success are moderation and method.

Let me explain.

Typical resolutions might be “lose 50 pounds by the end of the year” (after all, that’s less than a pound a week, a paltry 3,500 calories, just 500 per day), or “read 25 books this year” (just two a month), or “exercise three hours a week” (good grief, with 112 average waking hours per week, surely three can be spared!). But a pound a week, a book every other week, and less than 3% of available weekly time for exercise are merely rational defenses of a resolution’s ease. Accomplishing the resolution, especially if it demands radical change, requires dealing with emotions – the feelings that make action come naturally – because that is where the resistance will come from. We do the things we like because our feelings work with us and acting comes naturally. We avoid the things we don’t like because our feelings work against us and acting feels unnatural.

A compelling argument could be made for the ease with which 500 calories could be eliminated from our daily intake either by avoidance or substitution. Yet 37% of Americans are obese, and for some ethnic groups, it’s over 50%. Deprivation never feels good.

A Huffington Post poll taken in October indicated that 28% of Americans haven’t read a book in the past year. A Pew Research poll taken a year ago indicated that the median number of books read by readers last year was 6. (I was surprised it was that many.) Despite the arguable ease with which 25 books could be read in a year, it’s four times the median number read. To make time for reading, something must be given up that’s more fun than reading. If it weren’t, we’d be reading.

Exercise trainers say 30 minutes of walking five days per week and ten push-ups from the knees three times weekly are sufficient to maintain body tone and weight (if combined with calorie management) for most age groups. Few people could argue that these requirements are Spartan. Yet the CDC says 80% of adults fail to get the weekly recommended amount of exercise.

Don’t forget that the person who is urging you to eat less, read more, and exercise is the same person who has a natural dislike for diets, reading, and exercise – a Dr. Jekyll and Mr. Hyde in the same skin. Good luck with that.

I’ve written previously in this blog that ObamaCare will fail primarily because it attempts too much change. A more sensible strategy would have been to string together lots of little changes which collectively would have amounted to major change. It would have taken longer, but it would have converted a sure failure into a possible success. Alas, that isn’t the political way. But it’s the way we achieve successful change in business – lots of little changes, a few at a time.

The same approach should be applied to New Year’s resolutions to improve their success. Don’t make a dozen resolutions. One or two is enough; more than three is too many. The object is to move the ball. In time, you’ll cross the goal line.

If you resolve to exercise five minutes a day, you’re more likely to do it than if you resolve to exercise 50 minutes, and you may find that once started you won’t stop at five and may go for 50. But every day, regardless of how much or how little you did the previous day, resolve to exercise only five minutes.

Instead of resolving to read 25 books this year, resolve to read about 18 pages every day. Even the slowest readers can achieve that number in about 15 to 20 minutes. If you want, read half of the pages in the morning and half in the evening. But every day, regardless of how much or how little was read the previous day, resolve to read only 18 pages. Over the period of a year that will get you through about 25 books. But never think about 25 books. Think only about 18 pages. Cut this quota in half and you’ll still read twice the books read at the national median.

Most authors don’t sit down to write all day. While output varies from author to author, most resolve to accomplish a less frightening goal – like writing 500 words and hanging it up for the day. Books average 100,000 words, and at that pace the author produces a book in about 200 days – or, with weekends off, a book in ten months! Move the ball 500 words at a time and a book is produced every year.
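For readers who like to check the arithmetic behind these daily quotas, here is a quick back-of-the-envelope sketch in Python. The 260-page average book length is my own assumption; the post itself supplies only the 18-pages-a-day, 500-words-a-day, and 100,000-word figures.

PAGES_PER_DAY = 18
AVG_BOOK_PAGES = 260          # assumed average book length, not a figure from the post
books_per_year = PAGES_PER_DAY * 365 / AVG_BOOK_PAGES
print(round(books_per_year))  # about 25 books a year

WORDS_PER_DAY = 500
BOOK_LENGTH_WORDS = 100_000   # the post's average book length in words
writing_days = BOOK_LENGTH_WORDS / WORDS_PER_DAY
print(writing_days)           # 200.0 writing days, roughly 40 five-day weeks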

Former wide receiver and now football commentator Cris Carter struggled with drugs and alcohol while he was with the Philadelphia Eagles until head coach Buddy Ryan cut him because of his addictions. The Minnesota Vikings claimed Carter off waivers in September 1990 for a $100 fee. He was a good player if he could conquer his substance abuse, so Vikings management immediately put him in rehab. The team substance abuse counselor, Betty Triliegi, challenged Carter to go one week without drinking. Just one week. That challenge was made on September 19, 1990, and Carter has been sober since – one week at a time.

The Iditarod Race is a grueling 1,100-mile test of dogs and mushers through an Arctic landscape beset by blizzards and temperatures that can swing from 40 degrees above zero to 60 below. The musher and dogs go all day and rest at night, or go all night and rest during the day – about 12 hours on and 12 off for the eight- to ten-day race. That is, until Susan Butcher came along. She changed the way the Iditarod is run today by introducing the four- to six-hour work-rest cycle. The dogs were rested before they became exhausted and recovered their stamina more quickly. Butcher’s short sprints were initially pooh-poohed by Iditarod veterans (all men) who believed in “go for broke” daily goals – until her short-cycle technique won four out of five consecutive races, proving small is better.

The lesson to be learned here is that New Year’s resolutions are more likely to be successful if they are few in number and they focus on process rather than outcomes. Like the children and the marshmallows, a focus on process distracts attention from the ultimate and often intimidating goal. And short cycles of process – 30 minutes here, five minutes there – are less likely to be derailed by procrastination or creative avoidance.

Plodding along in short process cycles will more often succeed in achieving the desired outcome than heroic long cycles because life by the inch is a cinch; life by the yard is hard.

So go ahead and make your New Year’s resolutions, and this year may the Force of good technique be with you!

Saturday, December 21, 2013

“He was despised and rejected …”

There are several things that make the Christmas of 1981 memorable in my mind. It was my youngest child’s second Christmas but the first that he was able to comprehend in some sense. One of our favorite Christmas photos is of him staring transfixed at the Christmas tree lights and decorations. After several accidents, his mother had told him he could look but not touch.

A second thing making that Christmas memorable was my dad turning 70 several weeks before it. His age seemed ancient to me at the time, but not so ancient today since I passed it a few years ago. My dad was nine months younger than Ronald Reagan almost to the day. The Reagan years had just begun and with each of my dad’s later birthdays I would remind him that he was now old enough to be President.

The third thing that makes the Christmas of 1981 memorable to me was happening in the country that gave the world Nicolaus Copernicus, Fryderyk Chopin (one of my favorite composers), Marie Curie-Sklodowska, John Paul II (in his fourth year as Pope), and Lech Walesa, the leader of the Solidarity movement which ultimately brought about the collapse of communism – although at the time there was no way of knowing that would be the outcome.

The Solidarity movement had been giving Moscow heartburn and the Russian overlords decided to replace Poland’s nominal leader, Stanislaw Kania, who they thought was too soft, with General Wojciech Jaruzelski, whose reputation for repression was reminiscent of Poland’s long history of foreign occupation and domination. Predictably, Jaruzelski struck like the villainous dictator he was and established harsh martial law to crush Solidarity. Strikes swept the country, many unionists were killed, Solidarity leaders were arrested and jailed, and only starvation brought the surrender of the last holdouts deep in the coal mines. It seemed to me that the world was moving toward another Cold War confrontation with the paranoid leadership of the Soviet Union.

Reagan was completing his first year in the White House and was still something of an unknown quantity. His love of freedom and hatred of communism were well-known, however, and there was little doubt that he would do something. One of his advisers later noted that Reagan was “livid” about the Polish situation. Moreover, the world’s political leaders, Pope John Paul, NATO, and activists outside of the Iron Curtain had united to denounce Jaruzelski’s crackdown.

On December 19, 1981, Polish Ambassador Romuald Spasowski, once an enthusiastic communist, notified the State Department that he was defecting and requested asylum. The next day he announced his defection in a radio message to the world saying he had acted to show support for Solidarity and Lech Walesa. "The cruel night of darkness and silence was spread over my country," he said. Reagan invited him to the White House for a private meeting on December 22 during which Spasowski asked Reagan if he would light a candle and put it in a White House window to show support for the people of Poland. Reagan did just that with his own hands as Spasowski and his wife watched and wept.

During the evening of December 23, as my family readied for Christmas, Reagan went on television to give his Christmas message to the nation and to address the worsening situation in Poland. I was able to find the text of his speech, which I’ve edited in the following paragraphs. It is a memorable commentary that, even in a society already secularizing 32 years ago, Reagan felt comfortable speaking of the spirit of Christmas in such religious terms.

Good evening.

At Christmas time, every home takes on a special beauty, a special warmth, and that's certainly true of the White House, where so many famous Americans have spent their Christmases over the years. This fine old home, the people's house, has seen so much, been so much a part of all our lives and history. It's been humbling and inspiring for Nancy and me to be spending our first Christmas in this place.

We've lived here as your tenants for almost a year now, and what a year it's been. As a people we've been through quite a lot—moments of joy, of tragedy, and of real achievement—moments that I believe have brought us all closer together. G. K. Chesterton once said that the world would never starve for wonders, but only for the want of wonder.

At this special time of year, we all renew our sense of wonder in recalling the story of the first Christmas in Bethlehem, nearly 2,000 years ago.

Some celebrate Christmas as the birthday of a great and good philosopher and teacher. Others of us believe in the divinity of the child born in Bethlehem, that he was and is the promised Prince of Peace. Yes, we've questioned why he who could perform miracles chose to come among us as a helpless babe, but maybe that was his first miracle, his first great lesson that we should learn to care for one another.

Tonight, in millions of American homes, the glow of the Christmas tree is a reflection of the love Jesus taught us. Like the shepherds and wise men of that first Christmas, we Americans have always tried to follow a higher light, a star, if you will. At lonely campfire vigils along the frontier, in the darkest days of the Great Depression, through war and peace, the twin beacons of faith and freedom have brightened the American sky. At times our footsteps may have faltered, but trusting in God's help, we've never lost our way.

Just across the way from the White House stand the two great emblems of the holiday season: a Menorah, symbolizing the Jewish festival of Hanukkah, and the National Christmas Tree, a beautiful towering blue spruce from Pennsylvania. Like the National Christmas Tree, our country is a living, growing thing planted in rich American soil. Only our devoted care can bring it to full flower. So, let this holiday season be for us a time of rededication.

Even as we rejoice, however, let us remember that for some Americans, this will not be as happy a Christmas as it should be. I know a little of what they feel. I remember one Christmas Eve during the Great Depression, my father opening what he thought was a Christmas greeting. It was a notice that he no longer had a job.

A few months before he took up residence in this house, one of my predecessors, John Kennedy, tried to sum up the temper of the times with a quote from an author closely tied to Christmas, Charles Dickens. We were living, he said, in the best of times and the worst of times. Well, in some ways that's even more true today. The world is full of peril, as well as promise. Too many of its people, even now, live in the shadow of want and tyranny.

As I speak to you tonight, the fate of a proud and ancient nation hangs in the balance. For a thousand years, Christmas has been celebrated in Poland, a land of deep religious faith, but this Christmas brings little joy to the courageous Polish people. They have been betrayed by their own government.

The men who rule them and their totalitarian allies fear the very freedom that the Polish people cherish. They have answered the stirrings of liberty with brute force, killings, mass arrests, and the setting up of concentration camps. Lech Walesa and other Solidarity leaders are imprisoned, their fate unknown. Factories, mines, universities, and homes have been assaulted.

The target of this [repression] is the Solidarity Movement, but in attacking Solidarity its enemies attack an entire people. Ten million of Poland's 36 million citizens are members of Solidarity. Taken together with their families, they account for the overwhelming majority of the Polish nation. By persecuting Solidarity the Polish Government wages war against its own people.

I urge the Polish Government and its allies to consider the consequences of their actions. How can they possibly justify using naked force to crush a people who ask for nothing more than the right to lead their own lives in freedom and dignity? Brute force may intimidate, but it cannot form the basis of an enduring society, and the ailing Polish economy cannot be rebuilt with terror tactics.

Yesterday, I met in this very room with Romuald Spasowski, the distinguished former Polish Ambassador who has sought asylum in our country in protest of the suppression of his native land. He told me that one of the ways the Polish people have demonstrated their solidarity in the face of martial law is by placing lighted candles in their windows to show that the light of liberty still glows in their hearts.

Ambassador Spasowski requested that on Christmas Eve a lighted candle will burn in the White House window as a small but certain beacon of our solidarity with the Polish people. I urge all of you to do the same tomorrow night, on Christmas Eve, as a personal statement of your commitment to the steps we're taking to support the brave people of Poland in their time of troubles.

Once, earlier in this century, an evil influence threatened that the lights were going out all over the world. Let the light of millions of candles in American homes give notice that the light of freedom is not going to be extinguished. We are blessed with a freedom and abundance denied to so many. Let those candles remind us that these blessings bring with them a solid obligation, an obligation to the God who guides us, an obligation to the heritage of liberty and dignity handed down to us by our forefathers and an obligation to the children of the world, whose future will be shaped by the way we live our lives today.

Christmas means so much because of one special child. But Christmas also reminds us that all children are special, that they are gifts from God, gifts beyond price that mean more than any presents money can buy. In their love and laughter, in our hopes for their future lies the true meaning of Christmas.

So, in a spirit of gratitude for what we've been able to achieve together over the past year and looking forward to all that we hope to achieve together in the years ahead, Nancy and I want to wish you all the best of holiday seasons. As Charles Dickens, whom I quoted a few moments ago, said so well in "A Christmas Carol," "God bless us, every one."


As Reagan spoke I looked at my small children, aged 2, 4, and 6, scattered about on the floor, eyeing the tree across the room already surrounded by gifts as their mother wrapped more. I couldn’t help but see the contrast between my circumstances and those of the people living six time zones east in Poland. We put a candle in our front window that night for the people of Poland.

I wondered then and continue to wonder today why Americans don’t get on their knees every day to thank God who for whatever reason – certainly not for deserving it – has blessed this country so much among the nations of the world.

I wonder if any public official today would use the terms Reagan used so openly above to talk about the “first Christmas in Bethlehem” two millennia ago, the “divinity of the child born,” who could perform miracles, who “chose to come among us as a helpless babe,” and that “Christmas means so much because of one special child”?

The world into which Jesus was born was under the boot of the Roman Empire whose cruelty would make Nazism pale in comparison. Prophets had foretold his coming for over a thousand years:

For a child will be born to us, a son will be given to us; and the government will rest on His shoulders; and His name will be called Wonderful Counselor, Mighty God, Eternal Father, Prince of Peace. There will be no end to the increase of His government.

It is an extravagant promise, don’t you think?

Yet, in the 32 years since Reagan gave his sunny address on the meaning of Christmas in a world about to explode, the secular elites have done their best to drive Christ out of Christmas. If Christmas weren’t already a holiday, they would prevent its becoming one. A crèche is now a desecration of the public square. When I was a child in public school, we began every day’s class with a Bible reading by the teacher and a prayer. The secular elites have ended that practice with a perversion of the First Amendment, using the State to bar religion instead of barring the State from sponsoring a religion. But that’s precisely what secular elitism is – a religion. Except that its religion tolerates no character claiming to be God, Prince of Peace, or the creator of a government that will increase without end.

Secular elitism rejects such outlandish claims. But that rejection was also foretold by prophets more than a thousand years before the miraculous birth in Bethlehem.

For He grew up … like a tender shoot, like a root out of parched ground; He has no stately form or majesty that we should look upon Him, nor appearance that we should be attracted to Him. He was despised and rejected – a man of sorrows, acquainted with deepest grief. We turned our backs on him and looked the other way. He was despised, and we did not care.

God knew He would be rejected by the world. But He came anyway.

Why?

Because if God had a refrigerator, your picture would be on it. If God carried a wallet, your picture would be in it. Your birthday would be circled if God kept a calendar. And if God lived in a house, one of its rooms would be specifically designated as yours.

No matter what we think of God, that’s what He thinks of us.

Merry Christmas!

Saturday, December 14, 2013

Obama Finds His Inner Elizabeth Warren

After the disastrous rollout of his ObamaCare website and the sticker shock sweeping in behind millions of private insurance policy cancelations, Obama promised to make his signature healthcare law the focus of the remainder of his second term. That lasted for 3½ minutes. Sagging poll numbers compelled him to shift to another focus. So last week he gave a predictable speech decrying income inequality. The audience was the Washington DC liberal think tank, The Center for American Progress, and he promised that would become his focus for the remainder of his second term.

If we Google the key words “Obama” and “focus” we find that over the years Obey’s “focus” has variously been jobs, women’s issues, wars, climate, energy, the Middle East – you name it. Lots of focusing and refocusing means no focusing because in truth Obama has the attention span of an amoeba.

Several newly-released polls show that almost 60% of adults disapprove of Obama’s job performance. Each negative poll shifts his focus du jour to another red herring issue tout de suite in order to distract the voting public from the crummy job he is doing as president. Oui? This latest refocus is a patent attempt to deflect the media and public attention away from the ObamaCare disaster.

The choice of income inequality as the latest Obama focus is ironic. Abraham Lincoln said, “God must love the common man, he made so many of them.” Obama and his Democrat buddies essentially paraphrase that: “We liberals must love the poor people because our policies create so many of them.”

"I take this personally," Obama harrumphed in his inequality speech, pointing out that members of his family have benefited from government programs. He must have been referring to his alcoholic uncle who is in this country illegally because Obama and Michelle have had every door opened, every privilege extended, and every preference given, allowing both Obamas to partake of the best education the country offers and benefit from the most influential political connections the liberal establishment could lay at their feet as they ascended to the heights of Olympus.

Calling rising income inequality "the defining challenge of our time" is laughable coming from Obama’s lips. More than any other modern president, he has used his office to live like an imperial monarch while pushing an inequality-inducing agenda that has crushed the median income, produced an anemic recovery, and given the Federal Reserve justification to rejigger monetary policy to drive up house and stock prices. Obviously, those who owned houses and stocks benefitted – not because of greed or exploitation but because of government policy.

What hypocrisy to blame private enterprise, markets, and technology entrepreneurship instead of government for the gap in incomes from top to bottom! Might the gap reflect a gap in skills, experience, and knowledge? Might the gap reflect a gap in the value one’s work contributes to an economic society? The term “income inequality” implies there ought instead to be income equality. What does that mean? Obama’s CAP speech reveals a man who believes the American economic system should provide everyone equal economic outcomes. Those on the Right believe in equal opportunity, not equal outcomes measured as equal incomes and wealth.

It is Obama and his liberal scourge that have stonewalled school vouchers for inner-city kids and protected education unions from accountability for the sorry state of public education. It is Obama and his congressional protectors who can’t connect the dots between high corporate taxes and heavy business regulation and the weakest post-recession recovery in history. It was Obama and his union minions who were quick to pick winners and losers in the stimulus bailouts rather than let the market decide who was “too big to fail.” It was Obama, Reid, and Pelosi who peddled their nauseous “fair share” rhetoric to justify playing Robin Hood in the redistribution of wealth as if government were engaged in anything other than outright theft. We can thank Obama and the ancestry of liberalism he mentioned in his speech for the generations of welfare recipients who have unwittingly become residents on the government plantation, dependent on the “big house” as surely as if society’s clock had been turned back 150 years.

So, what does Obama propose to solve “income inequality”? Easing the rules on union organizing. More stimulus spending (translation: union spending). An increase in the minimum wage – from $7.25 to $10.10. Alongside that he wants an increase in the wage floor for workers whose incomes come mostly from tips, which can average up to $20 an hour – though the tip income would not be counted even as their “minimum wage” is roughly tripled. He wants a Paycheck Fairness Act, which would make the government the referee in paying men and women equally for similar work. The government will define “similar.” He wants to extend unemployment benefits another 99 weeks (a new incentive for the unemployed to look for work). He wants an Employment Non-Discrimination Act, which would give government new powers to interpret bias in hiring against gay, lesbian, bisexual, and transgender workers. The government will define “bias.”

On top of this grab bag of new government rights and raises is the calamitous ObamaCare which will raise the cost of hiring everyone, but it will be especially felt by small businesses which are most sensitive to labor expense. Sane thinking would start by trying to solve that problem. Instead Obama’s health law will make it worse.

While he avoided an explicit accusation, the Obama CAP speech left little doubt that the “income inequality” problem is caused by a small group of greedy people at the top of the economic pile who spend every waking moment trying to keep everyone else at the bottom of the pile. Obama’s speech solution translates into two remedies: (i) give everyone at the bottom of the pile a raise by government fiat, and (ii) transfer the ill-gotten gains of the rich (the makers) to the “have-nots” (the takers).

I’m not making this up. Here’s what he had to say. “The top 10% no longer takes in one-third of our income, it now takes in half.” Note the use of the verb “take.” “Whereas in the past, the average CEO made about 20 to 30 times the income of the average worker, today’s CEO now makes 273 times more,” he further whined. “And meanwhile, a family in the top 1% has a net worth 288 times higher than the typical family, which is a record for this country.” (And Obama knows how to find those people when he’s raising money for his never-ending campaigning.) Of course, he denied that he was promoting equality of result. But he made clear his belief that wealth and income differences are unfair and government policy should eliminate them. That sure sounds like a complaint about the inequality of results.

Well, let’s look at the facts, which conveniently were released by the CBO this past week. The top 40% of households in a rank order of pretax income actually paid 106.2% of the 2010 income taxes, the latest year for which data are available. “How can any group pay more than 100%?” you ask. Short answer: because a large group of wage earners “pays” negative taxes – i.e., they pay no taxes AND they qualify to get money from the government. Households in the bottom 40% of the rank order paid no net income tax and received an average of $18,950 in government transfers in 2010, so says the CBO.

The top 20% of households ranked by pretax income PAID 92.9% of taxes in 2010. The next 20% down the rank order PAID 13.3%. Add those two together and the top 40% PAID 106.2% of the government’s net tax income. The next 20% down the rank order PAID 2.9%, bringing the total PAID by the top 60% to 109.1%.

Then we get to the next 20% down the rank order of household incomes. They PAID -2.9%. The bottom 20% PAID -6.2%.

Add those percentages up and they net to 100% even in Palm Beach County FL. It should be pretty clear that the makers are paying and the takers aren’t – their “taxes” are incomes.
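For anyone who wants to verify that those CBO shares really do net out, here is a minimal Python sketch; the quintile figures are simply the ones quoted above, and the rounding is mine.

shares = {
    "top 20%":    92.9,
    "next 20%":   13.3,
    "middle 20%":  2.9,
    "fourth 20%": -2.9,
    "bottom 20%": -6.2,
}
print(round(92.9 + 13.3, 1))           # 106.2 -- paid by the top 40%
print(round(92.9 + 13.3 + 2.9, 1))     # 109.1 -- paid by the top 60%
print(round(sum(shares.values()), 1))  # 100.0 -- once the negative shares are added back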

I never thought anyone could get to the left of Elizabeth Warren. But Obama’s CAP speech sure did it. Rather than the socialist agenda he promoted, conservatives have better ideas for creating a better society. To solve a problem, one should start with what causes it. It’s apparent that at least 40% of households make too little money to enjoy the lifestyle of the 60% above them in the rank order of pretax incomes. Is that because the top 60% are keeping them down? Only Obama and his liberal ideologues believe that hogwash. “It’s the economy, Stupid!” Isn’t that the advice James Carville gave candidate Bill Clinton to run against Bush 41? This economy is incompetently managed. All of the White House breast-beating that accompanied last week’s jobs report that the economy had added 203,000 jobs last month – better than expected – totally ignored that there are still four million fewer people in the full-time workforce than there were before the Great Recession. Each year more people reach workforce age, and the jobs aren’t there because of Obama’s anti-growth policies.

Inequality is bad when it’s caused by crony capitalism – wealth created by favors from government. Solyndra and other friends of Obama’s come to mind. Inequality is bad when government policy provides incentives that compete with hard work, when it protects constituencies from better alternatives (unions, schools), and when it produces an underclass (entitlements). Inequality is good when it provides role models like Bill Gates, Steve Jobs, and Warren Buffett, and when it provides proof that the American system of innovation and markets rewards companies like Intel and Facebook, Google and Amazon.

Inequality doesn’t hurt our economy. Wrong-headed government policies do. That would include the Fed’s near-zero interest rate policies that produced the recent stock market boom, much to the cheers of the Obama White House, whose resident now has the gall to vilify the people who benefitted from it. That would include ObamaCare, which is hated by the majority of the American public, who suspect healthcare quality will decline because of it.

Last week the death of Nelson Mandela was “honored” by politicians around the world. Mandela would have abhorred the encomiums. He was a modest man; photos of him were forbidden while he was in prison, and the only image of him from those years was taken illegally and released after his death by the South African government. In the grieving audience were Obama, British PM David Cameron, and Denmark’s PM Helle Thorning-Schmidt. Putting their best face forward for the occasion, they took a “selfie” like teeny-boppers cuddling their heads while Michelle fumed. Only God knows what transpired between the Nation’s representative to the world and his wife as they deplaned from Air Force One after the Mandela memorial. I’m glad I wasn’t there to hear it.

But Obama’s self-absorption brought this thought to mind. Shakespeare’s King Lear is one of his most complex plays. Among other faults, Lear could not separate his roles of king, father, and friend. As the providentially-endowed steward of the realm, his kingship absorbed all of his other roles. He was king of the realm, king of his family, and king of his friends. Ultimately he lost everything and went mad. In the end he discovers his true self, recovers his sanity, rises above treachery, and has his kingship restored to its proper perspective. Shakespeare has Lear declare himself “every inch a king,” but by the end he is every inch a man as well.

Obama is our president, not our king, and he is the steward of the Republic. I wonder if it will ever be said of him that he was “every inch a President” – or every inch a man?

Saturday, December 7, 2013

The Folly of a “Minimum” Wage

An obscure election last month in SeaTac, the small city surrounding the Seattle airport, was little noticed in the national press. SeaTac is a mostly poor community of 27,000 whose economic engine is the airport that employs most of them. The cause for notice was SeaTac’s November ballot initiative, Proposition 1. It would raise the minimum wage to $15 – 63% higher than the $9.19 minimum wage for Washington, already the highest of any state in the union. The jobs affected will be those in the airport and surrounding transportation and hospitality businesses.

Only 6,000 total votes were cast in the Prop 1 initiative and it passed by only 77 votes. The campaign for and against it was expensive, costing businesses and unions over $300 per vote cast. The losers, the business coalition, are calling for a hand recount so the outcome won’t be known until next week.

But will the SeaTac businesses be the losers in the end?

There is ample research to show that increases in the minimum wage by force of law rather than the force of the market always cost jobs. The job losses just don’t happen immediately. Maybe that’s why those who believe they can defy economic law by sheer force of will keep trying.

Setting a wage-rate floor is simply price-fixing in a different disguise. Suppose your favorite restaurant raised its prices 63% across the board. Most likely you’d dine at a different restaurant. When gasoline prices rose to record levels last year, what did people do? They car-pooled, they substituted public transportation, they telecommuted from home. Gasoline inventories piled up, forcing down the price. How is that any different from raising the price of labor – labor that is no more productive after its cost increases than it was before?

The laws of supply and demand are not voluntary. They weren’t established by ballot initiative or enacted by government. All of the cork-popping heard after the Prop 1 vote count came in will ultimately go for naught. The affected SeaTac businesses will not swallow higher labor costs and continue doing business as they were, which the Prop 1 advocates seem to think will happen. Some will relocate, as they threatened to do. Restaurants will replace wait staff with self-service buffets. Car rental customers will do more things for themselves which attendants once did for them. And automation will replace jobs. Service levels will go down, and prices will go up.

The concept of a “just wage” – advocated by the Prop 1 supporters – is rooted in the Middle Ages, when wages and justice were affairs of the church, before modern economies dictated prices and wages. But supporters of non-market mandates have continued to defy the will of the market. Australia experimented with minimum wage-setting in the late 1800s and couldn’t make it work. American wage reformers tried to impose their will on the market as early as 1912, but the US Supreme Court kept getting in their way by declaring unconstitutional any attempt to interfere with free bargaining between employers and workers over wages. Not until the socialistic New Deal passed the Fair Labor Standards Act of 1938 was a federal minimum wage legalized by Roosevelt’s newly “packed” Supreme Court. It reversed its previous rulings and upheld the minimum wage, deciding it was within Congress’s Commerce Clause powers. Go figure.

The Roosevelt minimum wage was 25 cents an hour. It has been raised many times since, the last change putting it at $7.25 an hour in July 2009. Minimum wage increases are always a battle to push through Congress, but states and municipalities can set their own minimum wages above the federal level, which the District of Columbia and 19 states have done. In his last State of the Union speech early this year, Obama made his customary appeal to Congress to raise the federal minimum wage almost 25% to $9, arguing that “No one who works full time should have to live in poverty.” Well, Obey, almost no one does.

But let’s take Obey a step further. If prosperity can be achieved by mandating it, why not raise the minimum wage to $20 an hour or $35? Any right-thinking person knows markets don’t obey mandates, Obey.

Setting a minimum wage has been the liberals’ dream since Roosevelt. The fact that employers will react, and the damage a wage floor does to jobs, never enters the liberal calculus. In 1950 only half of all jobs were covered by the federal minimum wage. Blacks and whites were employed in equal percentages and teen unemployment was not a problem. Today virtually all jobs are subject to minimum wage limitations. Teenagers – the least skilled part of the workforce – are the most unemployed. The jobs they once performed – pumping gasoline at service stations, for example – have been replaced by self-service because of minimum wages. Go to a fast food restaurant and you’re handed a cup to serve yourself at the beverage machine, something an employee once did. I bagged groceries as a kid and carried them to the shopper’s car. Today, you’ll check out your own groceries, bag them, and carry them to your car.

Minimum wages discriminate against teenagers because they lack skills and experience. But minimum wages discriminate against black teens most because they are even less skilled and less experienced than white teens. It’s not racism. Black teens are more likely to live in broken homes with poor role models. And absent parental support, they benefit less from public education, which is terrible on its own. Not surprisingly, then, the white teen unemployment rate is 21% and black teen unemployment is almost twice that – 37%. Laws that discriminate against low skilled workers, therefore, fall heaviest on blacks and Obama wants to increase that burden. Even if low skilled workers are willing to accept less pay, the minimum wage laws prevent it. There ought to be a law against the law!

The adverse impact of a minimum wage isn’t just economic. There’s more benefit in an entry level job than the money earned. Entry level employment introduces young people to the world of work. It teaches responsibility – and consequences. It teaches work habits, respect for authority, and that hard work usually earns rewards. Kids from broken homes with deficient education would benefit most from these experiences, but minimum wage laws discriminate against them.

Moreover, the belief that the minimum wage should allow its earners to support families and avoid poverty is fundamentally flawed. Fast food chains, retailers, and many other businesses whose business models operate on razor-thin margins rely on a low-cost, part-time workforce. Many of the jobs are seasonal. This is done to keep costs down and pass low prices on to consumers. These jobs were never intended to provide the financial support for a family. They were designed to supplement a principal income – not to be a principal income.

Not fair? Fairness has nothing to do with it. That’s the business model and every competitor follows it. Should government outlaw business models?

So, are great numbers of economic slaves toiling away at the edge of the abyss of poverty to enrich their capitalistic overlords, as advocates of the minimum wage would have us believe?

Less than 3% of the national workforce reports minimum wage income to the IRS. Of those people, less than one in four are in their prime working years of 20 to 64, and less than a fourth of those live below the poverty line. For the benefit of those who vote in Palm Beach County FL, let me explain that that’s one-sixteenth of less than 3% – i.e., less than two-tenths of 1%. That doesn’t sound like a national disaster to me.

The rest are teens – half of the minimum wage workforce – who are primarily supported by their family income, and the over-65s – about one in four – who work to supplement retirement and Social Security or work simply to get out of their house a few hours a week.

We don’t know how many of those who earn minimum wages in their prime working years are between jobs that pay more. But we do know that the stereotype of the single parent working full time at minimum wage to support a family is largely a myth. The Bureau of Labor Statistics reports that only one in 25 minimum wage earners fits that stereotype, and those people are eligible for the Earned Income Tax Credit, which raises their effective income well above the minimum wage. They are not living on the edge of economic destitution.

Hmm, if the minimum wage workforce is tiny and the portion of it fitting the single-parent-supporting-a-family stereotype is microscopic, and if raising the minimum wage primarily denies work to people not qualified to perform jobs paying the minimum wage, why do politicians keep raising the minimum wage?

I can think of a few reasons and I’m sure readers can think of more.

First, understand that the most common elements drifting around in the universal ether are hydrogen and stupidity. Those who advocate raising the minimum wage would quickly admit that increases in the price of beer, cars, eating out, and electricity would reduce the consumption of beer, cars, eating out, and electricity. In fact, increased taxes on tobacco cut consumption in half in less than three decades. But the same people freely park their wits at the door while convincing themselves that raising the cost of cheap labor won’t affect its consumption by employers.

New Jersey, renowned for its high taxes, saw businesses and residents flee the state for years until Republican Chris Christie was elected governor and fought unions and legislators to make the state competitive with surrounding venues. The NJ politicians learned nothing from the experience. Christie denounced their recent proposal to raise the state’s minimum wage to $8.25. Nevertheless, voters approved the increase as a ballot measure by a margin of 61%.

Congressional Democrats have introduced a bill to raise the minimum wage from its current level of $7.25 to $9.80 over two years – an increase of more than 35%. Obama, always undeterred by facts, wants Congress to raise the minimum wage by almost 25% immediately. This seems to prove the assertion that an ounce of a politician’s brains costs ten times more than an ounce from anyone else because ten times more politicians are needed to get an ounce of brain. Minimum wage laws are politically feel-good regulations. It never occurs to politicians to get proof that they are solving the problem they are trying to solve – something routinely done in business and science.

Second, the argument for a minimum wage is couched in the tired bromide of a moral wage. Wages are dictated by economics, not morality. However moral it might seem to argue that Joe Blow should be paid $10 an hour because he deserves it, if his work produces $5 an hour in value, paying him $10 is not morality. It’s stupidity. Morality can’t be legislated. If morality were the basis of politics there would be no politicians.

Politicians decry the outsourcing of work to other countries, allegedly exploiting a foreign workforce to build $500 flat screen TVs that would cost $1,500 if made with American labor. So here’s a question: ask American consumers if they’d like to pay a $1,000 “morality tax” to have TVs made in this country. I don’t think so.

Which brings me to the third reason politicians press to raise the minimum wage: unions. Unions are struggling to stay relevant while membership sinks to historic lows. Local ballot initiatives to raise the minimum wage, such as was done in SeaTac with Prop 1, may increase union membership and raise union wages.

Here’s how.

Many union contracts specify wages that are multiples of the minimum wage. If the minimum wage goes up, union wages go up. Democrats are the beneficiaries of union political contributions, so Democrats are the chief sponsors for wage increases.

The Prop 1 initiative was architected by the union and heavily promoted by union advertising, which outspent the business coalition three to one. Because it wrote the initiative, the union inserted a provision in it which requires paid sick leave, makes part-time work difficult, prohibits sharing of tips, etc. These provisions can be waived in a union contract, but businesses that aren’t unionized must comply with them. Obviously this puts non-union businesses at a disadvantage, which is an incentive to allow their workforce to unionize.

In Long Beach CA a similar initiative was on the ballot, requiring hotels with more than 100 rooms to pay a $13 minimum wage and provide sick leave. Two Hyatt hotels caved and were unionized. However, some hotels closed rooms to keep only 99 available for rent and laid off workers. The remaining workers were paid $13 an hour but had their hours cut.

It’s too early to tell if SeaTac and Long Beach are new strategies for unionizing using the minimum wage as a battering ram. The effort may fail if businesses react smartly to offset the damage. Unfortunately, the local community will lose either way. It becomes a less attractive location for new business. And its unemployment rate will rise.

This past week, Obama’s favorite union, the Service Employees International Union, orchestrated demonstrations in 100 cities outside fast food restaurants calling for hourly wages to be doubled to $15 – about twice the minimum wage – under threat of a nationwide strike. (Yes, SEIU was behind Prop 1 too.) Young fast food workers were interviewed by eager reporters. The workers’ complaints were predictable: they “deserved” to be paid $15 an hour, or at least $10, for their work. Really? No one is forcing you young folks to work at McDonald’s or Arby’s or Burger King. So leave. Get yourself a job paying the $10 or $15 an hour that you say you’re worth.

Uh? … not qualified for jobs paying that much, you say? I thought as much. Too bad you aren’t.

Saturday, November 30, 2013

What If …?

Over a dozen years ago Robert Cowley compiled an intriguing book of essays under the title What If? The essay authors – all outstanding historians of the day – were invited to compose a credible outcome, a “what if” alternative, for a pivotal moment in history if circumstances had taken a different, often minor, turn of events.

For example, in 1889 Buffalo Bill’s Wild West Show toured Europe featuring Annie Oakley’s famous shooting skills with her Colt .45. Her act concluded with an invitation for a gentleman in the audience to step into the arena and allow her to shoot off the ash from his cigar at a distance far enough away to make it interesting. It was all show because there were never any real takers. Her husband, Frank Butler, was planted in the audience as a stooge. He would bravely step forward with a Havana clenched in his teeth, allowing Annie to bring the crowd to its feet with her keen shot.

When the show made a stop in Berlin for a performance at the Charlottenburg Race Track, Annie offered her customary dare. In the audience was the young and showy Kaiser Wilhelm II, who immediately took her up on it, much to her shock and the horror of the Kaiser’s security detail. As he stepped into the arena, the local police tried to intervene, but Wilhelm waved them off.

Annie couldn’t back out without losing credibility, so she paced off her usual distance and took aim. Sweating under the pressure of shooting at a crowned head of Europe and wishing she hadn’t consumed so much whiskey the night before, she nevertheless shot away the Kaiser’s cigar ash to the crowd’s wild jubilation.

Cowley asks: what if she had missed? There would have been no bellicose Kaiser Wilhelm II alive 25 years later to start war in Europe. When World War I did in fact break out, Annie wrote the bombastic Kaiser and asked for a second shot. He never responded.

History offers many “what ifs.”

In the aftermath of World War I, the Treaty of Versailles was more punishment than peace treaty. It forced Germany to admit its “guilt” for the war as well as pay reparations for it. The Treaty’s major accomplishment was to invent Hitler.

Hitler rose to power in 1933 and obsessed over undoing the Treaty. He pursued this in a succession of trial provocations each intended to test the resolve of the former allies – primarily Britain and France. In 1934 he ordered the German home guard to arm for war and, the following year, reintroduced conscription – both flagrant violations of the Treaty. With no response forthcoming from the major world powers, Hitler was emboldened. In 1935 he began building tanks, planes, and submarines – further violations of Versailles. Still no intervention by England or France.

In 1936, Hitler ordered troops across the Rhine and created an armed threat in the demilitarized zone of the German Rhineland, the territory between the French border and the Rhine River – an unmistakable violation of the Versailles treaty. There his raw army recruits and 36,000 policemen faced nearly 100 French and Belgian divisions. France and Belgium were within their rights to cross the Maginot Line and confront the Germans. But the French and English heads of state said nothing and did nothing.

One of the Rhineland occupiers, General Heinz Guderian, said after World War II that if the French had responded to the provocation in 1936, “…we should have been sunk and Hitler would have fallen.” Another German officer confessed that the German General Staff considered Hitler’s move tantamount to a suicide mission.

I can tell you that for five days and five nights not one of us closed an eye. We knew that if the French marched, we were done. We had no fortifications, and no army to match the French. If the French had even mobilized, we should have been compelled to retire.

Hitler himself said:

The forty-eight hours after the march into the Rhineland were the most nerve-racking in my life. If the French had then marched into the Rhineland we would have had to withdraw with our tails between our legs, for the military resources at our disposal would have been wholly inadequate for even a moderate resistance.

The head of the French army, General Maurice Gamelin, believed that confronting the German occupiers of the Rhineland would be unpopular at home, costly, and likely to lead to a war requiring full mobilization of the French armed forces. He counseled against action.

In England, not only were there no anti-German Rhineland protests, there were peace demonstrations. One Member of Parliament observed that "the feeling in the House [of Commons] is terribly pro-German, which means afraid of war." The Prime Minister during the 1936 crisis, Stanley Baldwin, had tears in his eyes when he admitted that British public opinion would not support military intervention in the Rhineland.

Winston Churchill, a backbench Conservative MP in 1936, was the British Jeremiah – one of the few voices raised against German rearmament and its threat to future peace. He wanted his country to reinforce a French challenge to the Rhineland occupation under the coordination of the League of Nations. The League of Nations, however, was as useless then as the United Nations is now and nothing happened. Churchill predicted the Rhineland would become a hinge allowing Germany to swing its forces through Belgium to attack France. And that’s precisely what happened in 1940.

A “what if” opportunity thus passed unexploited in 1936 that could have prevented World War II.

An Austrian German by birth, Hitler provoked the 1938 Anschluss crisis two years after the Rhineland showdown. It was part of his plan to unite all German speakers in one state. "People of the same blood should be in the same Reich," he had written in his autobiography Mein Kampf. Hitler’s annexation of Austria was accomplished by threatening invasion so convincingly that the Austrian government resigned and Hitler’s armies were invited in.

German-Austrian union was forbidden by the Treaty of Versailles. Nevertheless, England and France did not protest, did not mobilize, and did not defend the Treaty.

When an aggressor senses his opponent’s lack of resolve to confront him, he will act without restraint. And that’s precisely what Hitler did. He was now convinced that British Prime Minister Neville Chamberlain, Baldwin’s successor, and French Prime Minister Édouard Daladier were both appeasers – paralyzed by their nightmarish memories of World War I – and therefore loath to confront his militaristic aggressions.

His next move would exploit their fecklessness.

In 1938 Hitler demanded that Czechoslovakia cede the Sudetenland – that portion of Czechoslovakia on its north, west, and southwest border which was populated by German speakers. If his demand was not accommodated, he would invade Czechoslovakia.

It was all bluff. At that time Czechoslovakia had one of the strongest armies in Europe. It was well trained and well equipped, thanks to the country’s armaments works – primarily those of Skoda, the Czech equivalent of Ford or GM. Moreover, its arms industries were protected by fortifications along the German and Austrian borders – all in the Sudetenland – which explains why Hitler wanted it.

Czechoslovakia probably could have won a fight with Germany given the poor quality of the German fighting force at the time. In fact, senior German generals were horrified by Hitler’s plans to invade, convinced an invasion would ignite a world war that Germany would lose. So convinced were the German generals of a bad outcome that one of the earliest conspiracies to overthrow and arrest Hitler was plotted. Representatives were sent to meet with Chamberlain, telling him that the moment Hitler gave the invasion order he would be arrested, and imploring Chamberlain to intervene militarily in support of Czechoslovakia. Czechoslovakia had mutual defense treaties with France and Russia, and England was a defense partner of France. Poland too would likely have joined if France and England were in the fight.

But it was not to be. A meeting was held in Munich involving France, England, Germany, and Italy. Czechoslovakia was not invited. Chamberlain followed his appeasement instincts and the Munich Agreement was signed, transferring the Sudetenland to Germany. Betrayed by its treaty allies, Czechoslovakia conceded. Hitler agreed not to invade Czechoslovakia – an agreement that lasted six months, until German troops swallowed up the remainder of the country, now defenseless without its Sudeten armaments and fortresses.

The cowardly Chamberlain flew back to England and landed among the adulation of adoring pacifists. Deplaning, he waved the infamous Munich Agreement, assuring the crowd at the airport that he had won for them “peace in our time.” That phrase has gone down in history as a verbal monument to the gullibility of naïve world leaders who appease international bullies expecting conciliation to preserve peace. Think Jimmy Carter and Barack Obama.

Chamberlain’s surrender to Hitler was criticized by some far-sighted British politicians, among them Winston Churchill who unleashed his rhetorical fury on the worthless document:

We are in the presence of a disaster of the first magnitude...we have sustained a defeat without a war, the consequences of which will travel far with us along our road... we have passed an awful milestone in our history, when the whole equilibrium of Europe has been deranged, and that the terrible words have for the time being been pronounced against the Western democracies: "Thou art weighed in the balance and found wanting." And do not suppose that this is the end. This is only the beginning of the reckoning.

Another “what if” moment was lost.

With the fall of Czechoslovakia, Hitler’s territorial ambitions became apparent even to the western fools of Munich. The English and French publicly assured Poland that it could rely on them for protection against German aggression. But the time for red lines had passed. Germany could no longer be denied its long-sought destiny. Equipped with the armaments seized through the Munich agreement, Germany rolled like a juggernaut toward the Polish frontier, which bordered the northern edge of the Sudetenland acquisition. Germany was now the best-armed military force in Europe, ironically riding in Panzer tanks that the Skoda works had built for the Czechs. England and France, Poland’s outfoxed treaty allies, could do no more than watch the massacre of their ally in the autumn of 1939.

Sixty million people would die before the German beast was slain in 1945.

History may not repeat itself but, as Mark Twain said, it does rhyme. What can we learn today from this disgraceful episode of serial incompetence? One could certainly argue that it teaches that preemptive military action, however unpopular at the time, is usually a winning antidote to an unbridled aggressor. Hitler had made his ambitions abundantly clear in Mein Kampf, assuming any of the western leaders had bothered to read it. And the provocations he initiated in 1936, twice in 1938, and in 1939 were straight out of his playbook.

After World War II ended, Churchill, whose resolve had saved the British, was ousted from office. So much for the thanks of a grateful nation.

However, a relatively unknown school, Westminster College in the small Missouri town of Fulton (population 7,000), wished to bestow an honorary degree on the statesman who had led England through the war and kept it fighting to the end. And because Churchill was indisputably the best rhetorician of the 20th century, he was invited to be the keynote speaker. The audience numbered over 40,000, and his speech, often referred to as the “Iron Curtain” speech, alerted the world to a coming “cold war.”

Churchill’s speech presaged many of his thoughts that would later appear in his seminal recollection of the Second World War, most especially its first volume entitled The Gathering Storm. Here is an excerpt from the Iron Curtain speech that verbalizes the many lost “what ifs”:

Up till the year 1933 or even 1935, Germany might have been saved from the awful fate which has overtaken her and we might all have been spared the miseries Hitler let loose upon mankind. There never was a war in all history easier to prevent by timely action than the one which has just desolated such great areas of the globe. It could have been prevented in my belief without the firing of a single shot, and Germany might be powerful, prosperous and honoured to-day; but no one would listen and one by one we were all sucked into the awful whirlpool. We surely must not let that happen again.

We surely must not let that happen again indeed.

Today we are once again confronted by the villainous face of evil. This time it’s Iran. Iranian leaders are as fanatical as Hitler but would be many times more dangerous armed with nuclear weapons. Iran’s mullahs are believers in the 12th imam and hold the apocalyptic theology that bringing the world to an end will usher in the “second coming” of this imam. He will establish the Shia religion as the world religion and begin an unending era of peace. In short, a worldwide nuclear holocaust favors Iran’s theology.

Moreover, the west is once again afflicted with gullible leaders who believe appeasement is the answer to aggression. The recent “agreement” championed by Kerry and Obama does nothing to stop Iranian bomb-making. Worse, it deludes the American people into thinking that something short of military intervention will stop a nation intent on eliminating Israel, whose existence it won’t officially recognize. Israel will be our Poland.

Once again, our leaders believe that peace can be won without victory over disturbers of peace. We are standing aside while a replay of 1936, 1938, and 1939 passes before our eyes … while, as Mark Twain observed, history rhymes.

Once again the Left believes – and tries to convince the rest of us – that our choice is between war and peace when in fact it is, and always will be, a choice between fighting and surrender.

Saturday, November 23, 2013

How We Got Thanksgiving Day

In early September of 1620, 104 men, women, and children crowded aboard a leaky ship that was about 90 feet long and 26 feet wide amidships and set sail for the New World. The ship, named the Mayflower, would be at sea for 66 days before making landfall on the point of the fish hook we call Cape Cod, where it anchored near the location that would become Provincetown. It was well north of its intended destination of Virginia and therefore the passengers had no patent from the English crown to settle in this place.

The passengers continued living on board for a month while a few men first explored the Cape area. Finding curious mounds, the explorers punched holes in several, revealing some to be granaries for corn and beans and others to be graves, whose desecration didn’t endear the trespassers to the natives. A boat was built to explore the leeward shoreline of Cape Cod, and after finding the natural harbor at modern-day Plymouth and a defensible hill above it, they decided to make their settlement there. With winter approaching, shelter had to be built before the majority of passengers could disembark.

The long ocean crossing and the additional month crammed aboard ship had done little to improve the disposition of the passengers, a problem compounded by the fact that 44 of them were religious dissenters from the Church of England while 66 made the voyage as a business venture. The dissenters called themselves the “Saints” and called the others “Strangers” – hardly a good way to create unity. Despite having more differences than similarities, their survival depended on cooperation, of which there was little on board the ship. Therefore William Bradford, who had emerged as the informal leader, recommended that before disembarking every passenger sign an agreement setting forth rules for self-government, which later came to be called the Mayflower Compact.

The first winter was ghastly. Over half of the settlers – who now called themselves Pilgrims – died within three months. They were buried at night for fear the surrounding Indians would learn that their number was dwindling, which might encourage an attack. Unlike the Indians encountered on the Cape, however, the Pilgrims had settled among the peaceful Wampanoags. In March the tribal chief, Massasoit, sent Samoset as his ambassador to the settlers because Samoset spoke English, which he had providentially learned from sailors who had fished the coast and briefly lived on land nearby. After his first encounter with the Pilgrims, Samoset returned with Tisquantum, known in history as Squanto, an Indian who had been kidnapped in 1614 by an English slave raider and sold in Málaga, Spain. There he had learned English from local friars, escaped slavery, and found his way back on an expedition ship headed to the New England coast in 1619 – the year before the Pilgrims arrived.

Since Squanto spoke better English than Samoset, he became the technical advisor to the Pilgrims, teaching them how to raise corn, where and how to catch fish, and how to make things needed for working and hunting. He showed them plants they could eat and plants the Indians used for medicinal purposes. Squanto was the reason that the settlement survived during its first two years.

The first year the Pilgrims farmed communally and nearly starved. William Bradford’s diary tells us he astutely learned from that failure and decided thereafter that each man should forsake communal farming and instead farm for his own family’s food needs. "This had very good success," Bradford wrote, "for it made all hands very industrious, so as much more corn was planted than otherwise would have been. By this time harvest was come, and instead of famine, now God gave them plenty, and the face of things was changed, to the rejoicing of the hearts of many." The Pilgrims’ experiment in socialism was a valuable lesson.

After taking in an abundant harvest in the fall of 1621, the Pilgrims invited Squanto, Samoset, Massasoit, and 90 other Wampanoag men to join them in a three-day celebration of their success. The festivities consisted of games and feasting – and a not-so-subtle display of Pilgrim musketry just in case the natives became unfriendly in the future. This celebration is recorded in history as the first Thanksgiving – which it wasn’t.

In fact, two years earlier, on December 4, 1619, a group of 38 English settlers had arrived at Berkeley Hundred, part of the Virginia Colony, in an area then known as Charles Cittie (sic). It was located about 20 miles upstream from Jamestown, the first permanent settlement of the Virginia Colony, which had been established in 1607. The Berkeley settlers celebrated the first known Thanksgiving in the New World. Their charter required that the day of arrival be observed yearly as a "day of thanksgiving" to God. On that first Thanksgiving day, December 4, Captain John Woodleaf presided over the service. The charter specified the thanksgiving observance: "Wee ordaine that the day of our ships arrival at the place assigned for plantacon in the land of Virginia shall be yearly and perpetually keept holy as a day of thanksgiving to Almighty God."

But not for long. Nine of the Berkeley settlers were killed in the Indian Massacre of 1622, which also wiped out a third of the population of the Virginia Colony. Berkeley and other outlying settlements were therefore abandoned as the colonists moved back to Jamestown and other more secure points. Thanksgiving was forgotten.

The first national celebration of Thanksgiving occurred in 1777. This was a one-time-only Thanksgiving in which the 13 colonies, rather than celebrating food and God’s providence, celebrated the Continental Army’s defeat of the British at Saratoga that October.

In 1789 President George Washington made the first presidential proclamation declaring Thanksgiving a national event. Under this proclamation it was to occur later that year on November 26. Some were opposed to it, particularly those in the south. They felt the hardships of a few Pilgrims did not warrant a national holiday and besides, such proclamations were excessively Yankee and Federalist – or so they thought.

John Adams, the second president, issued a Thanksgiving proclamation in 1798 enlisting the help of the Almighty not only against celestial evil but also in the more mundane battles of his administration. He seemed to be asking God to side with the Federalists in his struggles with the Jeffersonians. When he later revealed that the proclamation had been recommended by (gasp!) Presbyterians, it set off a firestorm of suspicion that Adams, a devout Unitarian, was leading a movement to establish the Presbyterian Church as the national religion. Adams became the first one-term president – a fact he attributed to his proclamation.

In 1779, as Governor of Virginia, Thomas Jefferson decreed a day of “Public and solemn thanksgiving and prayer to Almighty God.” But as the third president, he opposed national Thanksgiving proclamations. Writing to the Reverend Samuel Miller, Jefferson said, “I consider the government of the United States as interdicted by the Constitution from intermeddling with religious institutions, their doctrines, discipline, or exercises …”

In 1817, New York became the first of several states to officially adopt an annual Thanksgiving holiday. Each state celebrated it on a different day, but the South didn’t embrace the tradition. Therefore, for almost 60 years following Jefferson’s presidency, Thanksgiving remained a non-event on the national scene with no advocate until Sarah Josepha Hale.

Hale was no shrinking violet. She raised $30,000 for the construction of the Bunker Hill monument in Boston and started the movement to preserve Washington’s Mount Vernon home for future generations. She was a fervent believer in God and the American Union, as well as being a fierce abolitionist. Hale had made it her business to advocate and get action on symbols that celebrated America and what today is known as American exceptionalism.

Notwithstanding Andy Warhol, Hale had more than 15 minutes of fame. She authored the words to “Mary Had a Little Lamb” and was the editor for two prominent women’s magazines of her day. Beginning in 1846 she had written editorials calling for a uniform national celebration of Thanksgiving, writing four presidents and dozens of congressmen to push her cause.

In October 1863, America was embroiled in the Civil War, in which the concept of "Union" was very much at issue. Hale tried again, writing to her fifth president, Abraham Lincoln.

Hale’s proposal found a place in Lincoln’s heart. The battle of Gettysburg had been fought three months earlier, and he was to travel to the battlefield the following month. He had been invited to be the clean-up batter after the keynote speaker, Edward Everett, who orated for two hours. Lincoln’s 272-word dedicatory address would become his most famous utterance.

Touched by Hale’s pleas, Lincoln issued his Thanksgiving Proclamation on October 3, 1863, setting its observance on the last Thursday of November.

After Lincoln’s assassination, his successor, Andrew Johnson, ever the contrarian, issued an 1865 proclamation to “observe the first Thursday of December next as a day of national thanksgiving to the Creator of the Universe …” Yet the next three proclamations of the quirky tailor from Greeneville, Tennessee returned Thanksgiving to the last Thursday in November.

Despite the presidential proclamations, states went their own ways. Southern governors often opted for inexplicable dates of observance or none at all. Oran Milo Roberts, governor of Texas in the early 1880s, refused to observe Thanksgiving in the Lone Star State, snorting, “It’s a damned Yankee institution anyway.” But the South eventually succumbed to observing it.

Then along came Franklin D. Roosevelt whose finagling with the date of Thanksgiving created a national uproar.

In 1939 there were five Thursdays in November and the last one was the 30th, leaving only three weeks and change before Christmas. This wadded the boxers of the presidents of Gimbel Brothers, Lord & Taylor, and other retailers concerned less with tradition than with sales in the waning years of the Great Depression. They asked Roosevelt to move Thanksgiving to the 23rd, allowing an additional week for shopping. Although I’ve never understood why Christmas shopping couldn’t start before Thanksgiving, Roosevelt acceded and the country went ballistic.

Polls showed that 60% of the public opposed the change in date. Republicans in Congress were affronted that Roosevelt, a Democrat, would change the precedent of Lincoln, a Republican.

New England, from which the Thanksgiving tradition sprang, put teeth into its resistance. The selectmen of Plymouth, Massachusetts informed Roosevelt in no uncertain terms, “It is a religious holiday and [you] have no right to change it for commercial reasons.” Massachusetts Governor Leverett Saltonstall harrumphed that “Thanksgiving is a day to give thanks to the Almighty and not for the inauguration of Christmas shopping.”

The Reverend Norman Vincent Peale was outraged, calling it "...questionable thinking and contrary to the meaning of Thanksgiving for the president of this great nation to tinker with the sacred religious day with the specious excuse that it will help Christmas sales. The next thing we may expect is Christmas to be shifted to May first to help the New York World’s Fair of 1940."

Nor did all merchants favor the presidential rejiggering of the Thanksgiving date. One shopkeeper hung a sign in his window reading, “Do your shopping now. Who knows, tomorrow may be Christmas.”

Usually the states followed the federal government’s lead on Thanksgiving, but they had never relinquished their right to set their own date for the holiday. Predictably, 48 battles erupted.

Anti-New Deal Republicans had wit on their side in the national lampooning of Roosevelt. Republican Senator Styles Bridges of New Hampshire urged the President to abolish winter. The Republican mayor of Atlantic City recommended that Franklin Roosevelt’s holiday be renamed "Franksgiving," while the Republican Attorney General of Oregon came up with this bit of doggerel:

                                          Thirty days hath September,
                                          April, June, and November;
                                          All the rest have thirty-one.
                                          Until we hear from Washington.

Twenty-three states celebrated Thanksgiving 1939 on November 23, and another 23 stood fast with November 30. Two states, Colorado and Texas, shrugged their shoulders and celebrated both days, with Texas having an innovative reason – to avoid having to move the Texas versus Texas A&M football game. The 30th was labeled the Republicans’ Thanksgiving, while the 23rd became the Democrats’ Thanksgiving.

Roosevelt’s experiment in moving the Thanksgiving date to improve Christmas sales continued for two more years, even though the Novembers of 1940 and 1941 had only four Thursdays. But the evidence was against the assumption – more shopping days did not increase sales. Roosevelt conceded and agreed to move Thanksgiving back to the last Thursday in November.

Under public pressure, the US House of Representatives passed a joint resolution in October 1941 to put Thanksgiving back on the traditional last Thursday beginning in 1942. When the resolution reached the Senate in December, however, the Senate amended one word before the measure became law: “last” became “fourth,” so that Thanksgiving would never again fall on the 29th or 30th of November. The states followed suit, although Texas held on to the last Thursday until 1956.

So on this Thanksgiving, and on all future Thanksgivings, let’s raise a drumstick in salute to Sarah Josepha Hale, who championed its observance, and to Franklin Roosevelt, who went on to convince Americans that he could “save” daylight by moving an hour from the morning to the afternoon.

Now, that’s a nice trick!