Saturday, January 25, 2014

Hillary, Benghazigate, and 2016

Has Hillary Clinton begun her 2016 presidential campaign? “Oh no, not that!!!” you scream. “Not another presidential campaign season so soon.” Well, take heart. She won’t be flooding the airwaves with ads telling you why she should be the next great Democrat president of the United States. She will instead be campaigning on behalf of various mid-term Democrat candidates for congressional seats and state-level offices – all safe shoo-ins, of course, whose victories will give her the persona of kingmaker. And she will release a book. In other words, yes, she has begun her 2016 presidential campaign.

This is exactly the modus operandi Obama used in 2006 – campaign on behalf of mid-term candidates and be sure to drop an off-hand comment to their election organizations here and there, “Oh, by the way, did I mention that I’m running for president in two years?” It’s campaigning on the cheap because her stump expenses for other candidates are paid by their campaigns. And don’t forget the book. It’s coffee table publicity. Obama’s was entitled The Audacity of Hope. Wonder what happened to all that hope? Not audacious enough, I guess.

In the 2008 campaign, the media lined up behind Obama – the audacious guy – not Hillary – the surefire choice because of her last name. You’d think she’d still be torqued about that. But politics makes for strange candidate behavior. Prior to a televised debate, the presidential opponents smile and shake hands, trade barbs and insults for an hour, then smile and shake hands again. I’ve seen that behavior before in sumo wrestling. Before and after wrestling, the opponents squat – the sumo equivalent of a handshake. At least in political debates the opponents don’t bump bellies and appear almost naked except for those funny-looking thong things.

The media has already started lining up behind Hillary for the 2016 campaign, even though the campaign hasn’t begun and other candidates haven’t declared. (Shocking.) The January 27 cover of Time magazine, for example, shows a huge pant-suited leg and black pump shoe – Hillary’s standard garb – with a miniature hapless male opponent hanging off the heel. A miniature 2016 campaign poster has been discarded on the floor, and the owner of the leg is walking off the cover. Mission accomplished. The huge-miniature symbolism speaks volumes. “Can Anyone Stop Hillary?” the Time cover rhetorically asks. And a sub-headline proclaims, “How to scare off your rivals without actually running (yet).” Wow! What subtlety!

The fawning Time article, obviously written by an ardent Hillary admirer, trumpets “One widespread forecast holds that Clinton is poised for a cakewalk of historic proportions.” Well, there you have it. The 2016 presidential election may as well be canceled. It’s over. Hillary has cakewalked to victory. With historic proportions.

Hillary hasn’t officially declared her candidacy yet. That way she can duck all the specific policy questions while she’s running for president. “Madam Secretary, are you running for president?” Moi?

I wouldn’t crank up that cakewalk band just yet, however. The vice president usually gets a nod to carry on the policies of the current administration. So expect 74-year-old Joe “Foot in the Mouth” Biden to campaign through at least Round One of the 2016 election. Obama sorta has to support his candidacy, rather than Hillary’s, or else admit that the guy he twice selected to be one heartbeat away from the presidency for eight years is really a dummkopf, not qualified to be president.

An interesting Public Policy Polling survey last year showed 64% of Democrats supported Hillary and 18% supported Six-pack Joe. Support for the other Democrat favorites dropped off to single digits. But if Hillary weren’t in the race, PPP showed, Biden would get 49%, with barely double-digit support for the two other leading Dem favorites. Democrat pundits point to this as evidence that Hillary has to run to prevent “Loose Cannon Joe” from becoming the party candidate.

There will be other candidates in the race. Elizabeth Warren of Cherokee Nation fame will surely run, if only to test the waters for a future presidential bid. (Lord, take me home before that happens!) Possibly Andrew Cuomo will run, while neglecting his day job more than usual, until primary voters convince him he’s a boring boor. Don’t discount another run by the unctuous John Kerry, especially if he has one or two foreign policy coups – more than Hillary has – and, after all, Kerry will be unemployed after January 2017. A couple of Democrat governors (folks who have actually run a government) have raised their hands and shouldn’t be taken lightly. So the party’s nomination won’t be a cakewalk and coronation for Hillary – Time magazine notwithstanding. The White House is rarely handed to a candidate, especially a first-time campaigner who isn’t the sitting vice president.

You’ve got to wonder why a 67-year-old woman would put herself through the rigors of a nomination campaign and then a presidential election campaign. These days a 67-year-old is not elderly, but she had a blood clot in her brain last January and one in her leg during Bill Clinton’s second term; two years of 20-hour days will take their toll, and Hillary has a reputation for being a stress carrier and that word that starts with a “b.” And if Hillary wins – BIG if – it’s four more years of long stressful days – unless she’s like our current president, who takes more vacations than Rick Steves. Also, daughter Chelsea announced last fall that 2014 was the Year of the Baby. Campaigning means grandma won’t be taking trips to the zoo pushing her grandchild in a stroller.

I think the answer to “why put yourself through this?” is the same thing that has kept her married to her philandering husband. Politics runs in their veins. Both have been immersed in politics for 40 years. Who needs love when there are elections to be won? Bill and Hillary Clinton are a political party of two who happen to be married – casually, it seems. But political candidate Hillary is no Bill Clinton. I know that fact galls her feminist pals, but she ain’t got what he has.

She is married to a great politician, but none of it rubbed off on her. Hillary is “plain Jane” when it comes to political charm, instincts, and settling for small wins.

Her career as a US Senator was unremarkable. When General Petraeus testified before a Senate committee in 2007 to give a progress report on the Iraq war, Senator Hillary, who was a 2008 presidential contender at that point, essentially called him a liar. She said that his report required "a willing suspension of disbelief." Not too smart, Candidate Hillary. The video clip showing her insulting a cooperative Senate witness, not to mention the top commander of the war, ran over and over and over on news channels. With each rerun, her smug, obviously pre-planned use of the “willing suspension” phrase came across as a put-down intended to play to the growing public dissatisfaction with the war. Instead it showed the Independent voter, on whom every candidate depends, what a classless (that word with a “b” again) she is.

The “willing suspension” phrase was coined by Samuel Taylor Coleridge to describe the willingness of readers to accept a fictional story as true while immersed in its unfolding. (FYI: I attended a military prep school and my literature teacher, Captain Standard, was a Coleridge junkie who compelled us to memorize long passages from Kubla Khan and The Rime of the Ancient Mariner.) Hillary may be a closet Coleridge scholar for all I know, but the phrase isn’t widely used. And as I watched the video clip reruns, her use of it was so canned that I suspect a staff member suggested she would appear erudite if she inserted it in her comments. Instead of appearing eruditely doubting, it compounded her uncharismatic image as a (that word again that starts with a “b”).

Her political ineptness was on full display as HillaryCare was birthed in secret by her hand-picked liberal academics who were clueless about the real world of healthcare delivery. Bob Dole called it “dead on arrival” when she delivered it to the Senate with the fanfare of Pompey entering Jerusalem. It never made it to a floor vote, but it did help the Democrats lose Congress in their stunning defeat of 1994. Only when Democrats had bullet-proof majorities in both chambers in 2010 were they able to ram through the now-hated ObamaCare, sans a single Republican vote. Given Hillary’s parentage of HillaryCare, it will be hard for her to criticize Obama’s despised health law, which has a good chance of becoming a political albatross around her neck. (Note to Hillary: to understand this metaphor, refer to your bud Samuel Taylor Coleridge’s Rime of the Ancient Mariner.) When it became apparent that Obama’s promise that people could keep their insurance was at best a misrepresentation and at worst a lie, who called him out? Hillary? No, her husband, the ever-popular former president, did.

Supporters like to point out that Hillary put more miles on the State Department jet than any previous Secretary of State in an attempt to impersonate Henry Kissinger’s shuttle diplomacy (in fact, she didn’t log the most miles; Condi Rice did). Her shuttling accomplished little more than wear out government property. On her watch American prestige in the world fell to a new low. Iran is fast becoming a nuclear nation, blowing off US diplomatic attempts to stop it, and the mullahs supply, with impunity, the terrorists who kill American soldiers. Syria scoffed at stern warnings about a “red line,” eventually allowing our enemy, Russia, to negotiate a deal that would have made a good skit on Saturday Night Live. And al-Qaeda terrorists attacked the US consulate in Benghazi, killing our ambassador and a security agent, then hours later attacked another building, killing two more security personnel. All four were under diplomatic protection. Obama was nowhere to be found during the attack, Hillary knows what happened but isn’t talking, and the Democrat-controlled Senate issued a “bipartisan” report last week that would make “the dog ate my homework” a respectable explanation. It uses the word “terrorist” 70 times. But Hillary is mentioned only once – in a section written by Republicans. Who was running the State Department? Apparently no one.

An airhead on Governor Chris Christie’s staff closes a bridge and causes four days of traffic jams, launching three separate investigations, one by the US Attorney and FBI, as if this constituted high crimes and misdemeanors. Four Americans with US diplomatic protection are killed, and 18 months later no foreign national has been captured or killed, no one in the State Department has lost his/her job, and the US Senate’s investigation led it to conclude: “The committee found the attacks were preventable based on extensive intelligence reporting on the terrorist activity in Libya . . . and given the known security shortfalls at the US Mission.” Apparently the Senate report is the final word on this matter.

Even Susan Collins (R-ME), who is about as Republican as my Soft Coated Wheaten Terrier, said Benghazigate revealed "a broken system overseen by senior leadership [who] contributed to the vulnerability of US diplomats ... in one of the most dangerous cities in the world, and yet the Secretary of State has not held anyone responsible for the system's failings." The mirror! The mirror! Tell her to look in the mirror, Susan!

I don’t want to belabor Benghazigate. Greg Hicks, one of the Benghazigate whistle-blowers, published a Wall Street Journal editorial this week disputing recent political efforts to blame the dead guy – Ambassador Chris Stevens. This blog is about Hillary and her chances of winning the 2016 presidential election with her political baggage. But last January, when she appeared before the Senate Foreign Relations Committee, Senator Ron Johnson (R-WI) did a “Petraeus” on Hillary, pressing her hard to say what she believes caused the attacks (given the cockamamie video story that no one on Team Obama will admit to dreaming up). Hillary exploded in full fury at Johnson: "Was it because of a protest or because of guys out for a walk one night and decided to go kill some Americans? At this point what difference does it make, Senator?"

Well, it makes a lot of difference to the families of the dead guys, Hillary, and it’s going to make a lot of difference to you. Because you’re going to see that video clip run over and over and over by your Republican presidential opponent.

Saturday, January 18, 2014

The Case of the Missing Jobs

Do you remember those cartoons in which Daffy Duck hands Elmer Fudd a bomb in the shape of a black ball with a lighted fuse on it? Elmer stares at it in disbelief before it explodes, blackening his face and shredding his clothes.

Well, that’s what Ben Bernanke just did to Janet Yellen, the incoming Chairwoman of the Federal Reserve. The bomb in this case is Bernanke’s failed experiment to do the impossible – stimulate the economy from outside its markets, the only mechanism that can truly stimulate an economy – by flooding it with dollars. Somehow this monetary magic was supposed to create jobs using devalued currency. Tell me how that works. I think I missed that lecture in Economics 101.

Well, the economic chickens came home to roost last week in the surprise December jobs report announcing that instead of the 200,000-plus jobs that had been expected, only 74,000 net new jobs were produced by the economy, about half of them part-time. The mainstream media immediately went to DEFCON 1, flooding the airwaves and newsstands (does anyone buy newspapers anymore?) with misinformation, excuses, and “Ah, shucks, it’s just one month” analyses to hide the fact that the Obama recovery wasn’t recovering.

Over at the White House all you could see was rear ends and elbows as all the president’s men (and women) scrambled in a frenzy to produce a new legislative initiative that would distract low-information Democrat and Independent voters (I thought low-information voters were permanently distracted) and prevent the 2014 mid-term elections from being screwed up. Presidential press boy Jay Carney announced they had found one – an extension of the expired 99 weeks of unemployment benefits, which had already been extended once toward the end of Obama’s first term. The twice-extended benefits would go to people who, ironically, were looking for the jobs that Obama’s policies had vaporized since he assumed office.

Don’t get me wrong. I feel bad when anyone is deprived of earning an income that would make him or her independent of a government handout. But extending unemployment benefits is like transfusing a person with a gaping wound without sewing up the wound. What the unemployed need is a new supply of jobs, not a new supply of blood plasma.

Only the private sector can create jobs, and the December figures show it’s disinclined to do so because Obama policies create uncertainty, hamstring businesses with bureaucratic regulations, and expropriate the rewards of economic success through confiscatory taxes. If Obama has shown anything in his policies, it is that he is more concerned about redistributing income than allowing more of it to be produced.

The Labor Department, which is about as independent from the White House political machine as Bonnie was from Clyde, blamed the cold weather in December for the poor jobs report. The Labor Department’s smoke screen claimed that 273,000 people in its monthly job survey said weather prevented them from working. Sorry, that dog won’t hunt. December did not have unusually bad weather – the polar vortex didn’t happen until January – and weather is factored into the seasonal adjustment of employment numbers.

The way the Bureau of Labor Statistics (BLS) compiles its unemployment data is to survey a large sample of households and ask about each household’s employment status. The unemployment rate is the number of people who say they are unemployed but have been looking for work, expressed as a percentage of the workforce – the unemployed plus the number who say they are employed. Obviously if people “have been” looking for work, they are talking about the past. Was there extraordinarily cold weather in November and December? No.

The BLS specifies the economic sectors that gained jobs and those that lost jobs. Therefore we know that, despite recent high demand for housing, construction jobs were down in December by 16,000 – understandable since those jobs are susceptible to weather – but IT jobs were also down (12,000), as were healthcare jobs (6,000) and government jobs (13,000) – not the kinds of work affected by weather.

The jaw-dropper in the disappointing December jobs report was the unemployment rate. It fell from 7% to 6.7% despite the weak jobs environment. How can that be? Almost no new jobs were created, yet the unemployment rate fell. Well, one way to do this trick is to undercount who is unemployed. When people stop looking for a job, the BLS considers them no longer unemployed and doesn’t count them, even if they are able-bodied and 35 years old with a good education. This produces all sorts of number magic.

If not one new job were created but people “dropped out” of the workforce by giving up active job searching, the unemployment rate would still go down. That’s a neat trick! The unemployment rate is the ratio of people actively looking for jobs – the unemployed – to the total workforce – the unemployed plus the employed. The unemployed count is the numerator and the workforce is the denominator. When a discouraged worker drops out, the numerator and the denominator each shrink by one, and since the numerator is by far the smaller number, the ratio (which is expressed as a percentage) gets smaller. Employment appears to be improving, happy days are here again, and the president looks like Superman. And that is what has happened over the past five years of the Obama recovery as the long-term unemployed have gotten discouraged, exhausted their unemployment benefits, and stopped looking for work.
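For the numerate among you, here’s the trick as back-of-the-envelope Python. The figures are illustrative round numbers of my own choosing, not official BLS data:

    # Illustrative round numbers -- not official BLS figures.
    employed = 144_000_000       # people who say they are working
    unemployed = 10_400_000      # people looking for work

    def unemployment_rate(unemployed, employed):
        """Headline-style rate: unemployed as a share of the workforce."""
        workforce = unemployed + employed
        return 100 * unemployed / workforce

    print(f"Before dropouts: {unemployment_rate(unemployed, employed):.1f}%")

    # 500,000 discouraged workers stop looking; not one new job is created.
    dropouts = 500_000
    print(f"After dropouts:  {unemployment_rate(unemployed - dropouts, employed):.1f}%")

Run it and the headline rate falls from 6.7% to 6.4% without a single job being added.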

In the BLS survey, the operative question for a person who is unemployed is this: have you looked for a job in the past four weeks? (Active job searching is also what qualifies a person to receive unemployment benefits for a certain period of time.) If the answer is no, the person isn’t counted in the workforce numbers or in the unemployed numbers. In December, therefore, the workforce fell by 347,000. This made the workforce at the end of 2013 about 548,000 souls smaller than it was at the end of 2012. If dropouts continue, the unemployment rate theoretically would reach zero at some time in the future. TAH-DAH!

How do we know the workforce shrinkage isn’t due to baby boomer retirements?

About 7,000 workers a day reach the traditional retirement age of 65. Many don’t retire – they are in good health or they can’t afford to quit. So they remain in the workforce in their current jobs or take part-time work to supplement retirement income and to stay busy. At the same time, about 11,000 children born in the 1990s reach the age of 16 each day and become available to enter the workforce, even if only for part-time work. Therefore, even if all baby boomers retired at age 65, which they don’t, a net 4,000 people would become available to work every day. This means the available workforce is growing by about 120,000 per month, and that is the number of new jobs needed monthly. Some young people will defer work to attend college, of course, and some will work and go to college. But ultimately they graduate or drop out of college or reach age 21 or 22 and need full-time employment at a rate of about 120,000 new jobs per month – which this economy isn’t producing.
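For the record, here is the arithmetic behind that 120,000 figure, using the round numbers cited above:

    # Workforce growth from the daily population flows cited above.
    retirees_per_day = 7_000    # workers reaching age 65 daily
    entrants_per_day = 11_000   # 16-year-olds becoming available daily

    net_new_workers_per_day = entrants_per_day - retirees_per_day  # 4,000
    jobs_needed_per_month = net_new_workers_per_day * 30           # ~120,000

    print(f"Net new workers per day:   {net_new_workers_per_day:,}")
    print(f"New jobs needed per month: {jobs_needed_per_month:,}")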
 
Since the people who have given up looking for work aren’t counted in the BLS unemployment rate, the real unemployment rate is much higher. How much higher? Probably around 12% to 13%.

Here’s why.

If we counted all of the able-bodied people between 25 and 64 who aren’t in the armed forces or prison and labeled that number the “potential workforce,” we would have a pretty good idea of how many could work if there were jobs for them. And of course, we know how many people are actually working, because the BLS collects that number every month. So if we divide that number – the actual workforce – by the potential workforce, we get a ratio called the “labor participation rate.” In other words, we would know the percentage of people who are working versus those who could work. That percentage in December 2013 was 58.6%.

What happened to the other 41.4%, you ask? They aren’t working – about 102 million able-bodied souls. Think of the value their work could be creating if they were employed, the increase in Gross Domestic Product that our economy is missing, the additional taxes that aren’t being paid and have to be made up by those of us who are working. Instead some, perhaps many, of these “not working” people receive unemployment benefits or public assistance. In other words, for lack of jobs, they are consumers, not producers.

According to the National Bureau of Economic Research’s business cycle dating committee, the Great Recession, which began in December 2007, ended 18 months later in June 2009. A graph of the labor participation rate shows a precipitous decrease after 2008 which has gone sideways since. The Obama administration took office in January 2009. To be fair, his policies didn’t cause the Great Recession of 2008. To be fair, his policies haven’t produced a recovery from the Great Recession of 2008 either. When Obama moved into the Oval Office, the number of able-bodied people not working was 92.6 million souls. By December 2013 that number had grown by 10 million.

The labor participation rate has varied month-to-month in a narrow range around 58.5% for the past three years. That certainly can’t be blamed on the evil Bush. And if the potential workforce were accurately counted, the real unemployment rate would be closer to 13% – almost double the official government figure. That wouldn’t go over well politically if it were widely known.
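One way to back out a number in that neighborhood is to put the dropouts back into both sides of the ratio – the unemployed count and the workforce. A sketch, using approximate December 2013 official figures; the 11 million “missing workers” count is my illustrative assumption, not a BLS number:

    # Approximate official December 2013 figures; the "missing workers"
    # count is an assumption for illustration, not a BLS statistic.
    workforce = 154_900_000    # official workforce (approx.)
    unemployed = 10_400_000    # officially unemployed (approx.)
    missing = 11_000_000       # assumed dropouts who would work if jobs existed

    official_rate = 100 * unemployed / workforce
    real_rate = 100 * (unemployed + missing) / (workforce + missing)

    print(f"Official rate: {official_rate:.1f}%")  # ~6.7%
    print(f"'Real' rate:   {real_rate:.1f}%")      # ~12.9%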

We are missing 8 million jobs just from failing to keep up with population growth, and arguably we need 10 million jobs to employ all of the able-bodied unemployed. If the economy produced 120,000 jobs per month over and above the population-growth requirement stated earlier, it would take roughly 5½ to 7 years to close the gap from where we are to where we need to be on jobs.
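That timetable is simple division – gap divided by monthly surplus – assuming the surplus holds steady:

    # Years to close the jobs gap at a steady monthly surplus of jobs
    # created beyond the population-growth requirement.
    monthly_surplus = 120_000

    for gap in (8_000_000, 10_000_000):
        months = gap / monthly_surplus
        print(f"{gap:,} job gap: {months:.0f} months (~{months / 12:.1f} years)")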

What solutions have Obama and the Democrat Congress proposed to address the fact that the Obama recovery isn’t recovering? Scrubbing ObamaCare or its most egregious anti-jobs requirements? Nawh. Getting rid of the anti-jobs regulations that the administration’s shadow network of government agencies bypassed Congress to impose? Nawh. Reducing taxes, especially C-corp taxes, which gobble up the capital businesses need to expand? Nawh.

The president and his congressional lackeys want the minimum wage to be raised. As I’ve blogged previously, that’s a sure-fire way to kill more jobs. And the Dems want to extend the unemployment benefits at an unbudgeted cost of $24 billion – the blood plasma approach. Both, of course, are good election year political issues. But neither solves the jobs problem.

Democrats hope that Republicans will push back, handing Democrats the brush with which to paint them as the heartless party. Republicans have indeed pushed back on the minimum wage because it’s a terrible idea and has never solved the problem lawmakers convince voters they are solving. The minimum wage is pure politics.

And the Republicans did push back on extending unemployment benefits, insisting on offsetting cuts elsewhere in government spending. The Democrats resisted the offsetting-cuts counter-proposal, and Dictator Reid tabled it in a parliamentary maneuver this week.

Don’t expect much to happen on the jobs front for ten more months when we’ll have a chance to unemploy this Congress and employ a new one. In the run-up to the fall election the question I’d be asking every candidate – Republican or Democrat – is “what are you going to do to get government out of the way and let the private sector create jobs?”

One of the many ironies that Barack Obama embodies is his unending concern that some people aren’t paying “their fair share.” Well, the eight to ten million jobs Obama’s policies have eliminated certainly aren’t paying theirs!

Saturday, January 11, 2014

“What so proudly we hailed …”

As the professional football season slogs its way to Super Bowl Sunday on February 2, I wait with eager expectation for the traditional pre-game mutilation of the National Anthem, sung by someone who either can’t read music or couldn’t carry a tune in a barroom spittoon. Along with its current effort to investigate and prevent player concussions, I think the National Football League should investigate and prevent jazzed-up renditions of an anthem whose origin should oblige it to be sung with some reverence.

But alas, this is yet another example of how cheaply modern society holds its traditions. Perhaps the day will come when Saturday Night Live presents a comedic reenactment of the D-Day invasion, an irreverent telling of that awful night at Ford’s Theater in April 1865, or a desecrated version of Martin Luther King’s “I have a dream” speech.

The War of 1812 was fought for several reasons and took place in three major theaters – at sea, along the American-Canadian border, and in the South, where Andrew Jackson earned his reputation in the 1815 Battle of New Orleans. The British were fighting a two-front war against the United States in the west and the Napoleonic wars in the east until 1814, when Napoleon was defeated. This allowed the British to turn their full fury against the United States.

The British fought and won the Battle of Bladensburg (Maryland) in the late summer of 1814, allowing them to burn the Capitol and the President’s House (White House) in Washington and sending Dolley Madison scrambling for her life into northern Virginia near present-day McLean. Her husband was with the defeated American army at Bladensburg and penciled a note telling her to flee, inasmuch as the British had boasted that if they captured her, she would be sent to London and paraded in irons through its streets. Nice guys, the British.

From their “victory” in burning a deserted Washington, the British turned north to Baltimore and New York (and later New Orleans), where their fortunes turned – thankfully, or we would have been returned to colonial status and likely would have become a British-leaning country as Canada is today.

As they left Washington, the British snagged Dr. William Beanes, a 65-year-old physician living in Maryland, and held him as a prisoner of war. Beanes had participated in the capture and jailing of British stragglers, and their General Ross took umbrage at that act. Beanes’ friends went to Francis Scott Key, a lawyer practicing in Georgetown (a modern-day suburb of Washington), to help secure his release. With the permission of President Madison and the assistance of Madison’s Prisoner Exchange Agent, John Skinner, Key and Skinner sailed up the Chesapeake Bay in search of the British fleet, where American prisoners were being held and Beanes was likely to be found.

The pair found the flagship of Vice Admiral Sir Alexander Cochrane. On board they met with General Ross, who had reboarded the flagship with his troops. Ross refused the request to release Beanes. However, Skinner had the foresight to have collected letters from the British wounded left behind in the Battle of Bladensburg. They had been cared for by Beanes and treated well by other Americans. Upon reading the letters, Ross had a change of heart and agreed to Beanes’ release. Ross would be killed in a land battle near Baltimore days later.

Because Key, Skinner, and Beanes had learned of the British intention to bombard Fort McHenry during their time aboard the Admiral’s flagship, they were free to go – but not until the British military operation was complete, lest they warn those in the Fort. All three were held under British guard throughout the day of September 13, 1814 as the bombardment continued through the night and into the next morning.

Admiral Cochrane attempted to get his fleet in close to the Fort but shallow waters prevented him from sending forward his heaviest ships. So the fight was left to ten small warships and six bomb and rocket ships. They were able to get close enough to inflict damage on Fort McHenry while remaining outside the range of the Fort’s guns. On one occasion the ships moved in closer to increase the effectiveness of their fire but came within the range of the Fort’s guns. The return fire was so withering that the ship captains were forced to pull back. When morning came the British had fired between 1,500 and 1,800 rounds at Fort McHenry with little effect. Cochrane wisely decided to break off the battle and unwisely decided to sail to the Battle of New Orleans, where the Americans mauled the British, effectively ending the War of 1812.

The year before the attack, the commander of Fort McHenry had requested a flag to fly over the ramparts in order to make the stronghold visible to the British at a distance. A local committee selected Mary Pickersgill, a "maker of colors" who had previously made ship flags. They requested that she sew a garrison flag measuring 30 by 42 feet and a storm flag measuring 17 by 25 feet. (Yes, it was Mary Pickersgill, not Betsy Ross, who made the “star spangled banner.”) The garrison flag was huge, with 15 stars measuring two feet point to point and stripes – eight red and seven white – measuring two feet in width. (The surviving garrison flag now measures 30 by 34 feet because about eight feet was cut off and given away as souvenirs as people became aware of its historic significance. It resides in the Smithsonian Institution. The storm flag has been lost.)

During the shelling on the night of September 13, a torrential rain fell and the storm flag was flying. As morning broke, the storm flag was lowered and the garrison flag was raised. It was that flag Key saw when, on the back of an envelope in his pocket, he penned his famous lines:

O say can you see by the dawn's early light,
What so proudly we hailed at the twilight's last gleaming,
Whose broad stripes and bright stars through the perilous fight,
O'er the ramparts we watched, were so gallantly streaming?
And the rockets' red glare, the bombs bursting in air,
Gave proof through the night that our flag was still there;
O say does that star-spangled banner yet wave,
O'er the land of the free and the home of the brave?

This was the first of several stanzas he wrote, but when the National Anthem is sung today only the first stanza is performed.

The mythology surrounding the National Anthem asserts that it was written as a poem and later adapted to a popular drinking song of the era. In fact Key wrote the lyrics with the melody of “The Anacreontic Song” (also called "To Anacreon in Heaven") in mind. Hundreds of other lyrics had been written to the same melody. A decade earlier, Key had written similar words to the same melody – a song he called “When the Warrior Returns” – to honor Stephen Decatur’s exploits in Thomas Jefferson’s war against the Barbary pirates.

The melody of the “Star Spangled Banner” was composed by Londoner John Stafford Smith as a sort of theme song for the British Anacreontic Society, a men’s glee club of the 18th century. It later became popular in America. The lyrics were separated from the melody and were variously published under the title "Defense of Fort McHenry." However, Dr. David Hildebrand, Director of the Colonial Music Institute, says the structure of the lyrics doesn’t match that of a standalone poem. The lyrics and melody did not become known as “The Star Spangled Banner” until well after the Battle of Fort McHenry, as over time the song assumed increasing patriotic significance. In popular use several versions began to emerge, and a committee including John Philip Sousa met to agree on a standardized version.

In the late 19th century it became the official anthem to be sung or played when the flag was raised. Later President Wilson required that it be played at all military and certain other official events. In 1929 Robert Ripley’s syndicated Believe It or Not newspaper feature noted that the United States had no national anthem. Until then “Hail Columbia” and “My Country ‘Tis of Thee” (sung to the melody of the British national anthem, “God Save the King”) had served as the official music for occasions of State. Responding to Ripley’s claim, John Philip Sousa, the march king, published an editorial opinion stating that the stirring spirit of the music of the “Star Spangled Banner” with Key’s “uplifting words” made it a fit candidate for a national anthem. Many agreed and in March 1931 Herbert Hoover signed a law adopting the “Star Spangled Banner” as the official National Anthem of the United States.

This year is the bicentennial of the writing of what later became our National Anthem. It gives me goose bumps every time I hear it sung properly.

Saturday, January 4, 2014

Ready to Read in 2014?

“Read, read, read,” counseled William Faulkner; “Read everything – trash, classics, good and bad, and see [how authors write]. Just like a carpenter who works as an apprentice and studies the master. Read! You’ll absorb it.”

Faulkner was exhorting young writers in this admonition, but his advice applies equally to those who don’t aspire to be writers. The reason Faulkner’s urging is sage advice is that most of the world’s knowledge exists in written form … and therefore is not accessible to most of the world.

Why?

Because so few people read. (Last week’s blog noted that 28% of respondents in a survey conducted last year hadn’t read a book in the prior 12 months, and half of those who had read at least one book had read no more than six.) If this survey is an indication, most Americans deprive themselves of access to the world’s storehouse of knowledge.

And that choice is not without consequences. There’s ample scientific evidence that people who read literary fiction (as opposed to popular fiction) “read” the body language of others more effectively, are more empathic, interpret situations better, lower their stress levels faster by reading, suffer less memory decline with age, are less likely to suffer from Alzheimer's disease, and have more vivid imaginations. (Emily Dickinson observed, "There is no frigate like a book to take us lands away.") Reading is also entertaining. Many arguments can be made for reading.

But the most important argument is this. After we leave the formal environs of education in our early 20s, our remaining education in life – arguably the most valuable part – will be self-taught. That will come from two sources: personal experience and reading. It’s hard to make a more compelling argument for reading than that. Yet many people essentially stop learning – at least beyond their own experience.

What are you planning to read this year?

I’m afflicted with an overly curious mind and have found that unless I plan what I’m going to read with some sense of priority, I’ll end the year with parts of many books read and few of them finished. I try to plan a balance of non-fiction and fiction reading during the year, and while I start with a couple dozen books (my 2014 selections are bought, stacked in order, and waiting for me), I’ll add a few others during the year. I might occasionally substitute books purchased during the year for ones in my original reading plan, but if I do that too often I find that some books don’t get read. I’m linked to several book review sites, so unfortunately I can find books I want to read faster than I can read them. Some of the books in this year’s reading plan were purchased a year or more ago.

At one time I “read” several books concurrently. But I gave up that practice after discovering some weren’t finished because I had too many going at the same time. I’ve also found that if I’ve not turned on to a book in about 50 pages or so, chances are it isn’t going to happen by reading 50 more pages. The purpose of reading is to learn something, not to complete books as if I get to add a notch to the edge of my bookcase. Reading time is too valuable to waste on books I have to slog through.

Well … those are my reading habits for what they’re worth. Here’s what is in my 2014 reading plan with hyperlinks to Amazon if you have an interest.

In the non-fiction category, I’m already well into The Heart of Everything That Is, the story of the greatest chief the Sioux produced, Red Cloud. He assembled an army of 10,000 warriors from many tribes and waged a two-year war that defeated the US Army and forced it to sue for peace on Indian terms. No other chief in the American Indian wars accomplished that.

The book was released in November and immediately caught my interest. The authors sourced material from an autobiography of Red Cloud discovered in the 1990s at the Nebraska Historical Society as well as contemporary newspaper reports and interviews with descendants of sub-chiefs, notably Crazy Horse. The title is a translation of the Sioux name for the Black Hills – Paha Sapa, the sacred territory that contains the Wind Cave through which the Sioux believe their deities delivered them from a subterranean netherworld to a land teeming with game.

Regardless of where one’s sympathies lie, the post-Civil War conflict between America and its native Indians is one of the saddest stories in our country’s history. There was no “right” answer on the coexistence question, and indescribable suffering was meted out by both sides – probably made worse by US treachery in failing to keep its treaties. Almost a year ago to this day I recalled the 122nd anniversary of the final episode in this tragic struggle in the blog, Broken Hearts at Wounded Knee.

The second book in my non-fiction stack for the coming year is The Cave and the Light. The author contrasts the conflicting views of Plato, a student of Socrates, with those of Aristotle, a student of Plato who became the teacher of Alexander the Great and, later, Plato’s philosophical rival.

In Raphael’s famous painting, The School of Athens, Plato and Aristotle are seen in the center walking toward the viewer. Their positions are the embodiment of their philosophies. Plato, the idealist, is the right figure of the pair (from the figures’ own perspective), whereas Aristotle, the empiricist, is the left figure – thus appearing to be the equivalent of the “right brain” and “left brain” of their worldviews long before the functions of the brain lobes were understood. But in a sense the left-right arrangement seems to be the way their views manifest in matters of faith (Plato) versus matters of science (Aristotle).

The title comes from Plato’s cave-and-shadows thought experiment contrasting reality and the perception of it. Plato’s theoretical prisoners were chained in a manner that allowed them to see only the shadows on a wall in front of them – shadows cast by figures moving between the prisoners and a light source behind them. Seeing only shadows throughout their imprisonment, however, the prisoners took the shadows, rather than their cause, to be “reality.” It will be interesting to see how the author expands that idea into the “struggle for the soul of Western civilization” – the subtitle of the book.

Third in my non-fiction stack is The Great Debate, in which the author makes an argument for the origins of the modern political Right and Left. The “debaters” are Thomas Paine and Edmund Burke. Paine is known to most students of history as the author of Common Sense, which historian Gordon Wood called "the most incendiary and popular pamphlet of the entire revolutionary era." Burke is known to students of history as the critic of England’s mismanagement of the colonial uprising – the Revolutionary War – and for his well-known speech in 1775 appealing for reconciliation. On most issues of the American Revolution, the two were on the same side politically. The great debate at issue in this book, however, is the Paine-Burke dispute over the merits, and lack of them, of the bloody French Revolution. In that conflict the debaters were on opposite sides.

The author parlays Burke’s conservatism and Paine’s liberalism into the philosophical schism that divides the country politically today. That would seem to be quite a stretch, and I’m looking forward to learning whether he pulls it off credibly.

Now to my fiction stack.

As I’ve noted in another blog, fiction encompasses two brands. There is the “popular” fiction brand (think Tom Clancy, Clive Cussler, and Stephen King) and the “literary” fiction brand (think the classical authors – Austen, Dickens, Brontë – and their modern descendants). Literary fiction can be further divided into plot-driven novels (Lord of the Rings, The Count of Monte Cristo, The Girl with the Dragon Tattoo) and character-driven novels (The Remains of the Day, The Caine Mutiny, Frankenstein). Personally, I read fiction for the purpose of studying and understanding different types of characters, their flaws, the situations they get themselves into, how they change over the course of the novel, and the lessons I can learn from their experiences without suffering the consequences they suffered. Thus, my preference is character-driven literary fiction rather than the plot-driven genre.

That said, the top book on my fiction stack this year is The Goldfinch, a novel released this past fall that has been near the top of most best-seller lists. The story is about Theo Decker, a teenager who visits the Metropolitan Museum of Art with his mother to see Carel Fabritius’s 350-year-old painting, The Goldfinch, which (in the novel) is there on loan. A bomb explodes during their visit, Theo’s mother is killed, and he takes the painting and leaves the museum. The rest of the story is about the hold the painting has on Theo as he grows into adulthood, realizing that the longer he keeps a stolen painting, the harder it is to return it. His life is drawn into the contrasting worlds of the art underworld and the lives and homes of the rich.

What’s interesting here is that Fabritius, a student of Rembrandt, was himself killed by a gunpowder explosion in 1654, cutting short a life that would likely have produced many masterpieces. The explosion that killed him at age 32 destroyed most of his paintings; The Goldfinch survived but still bears the evidence of the blast.

Even at a hefty 784 pages, I’m looking forward to reading this novel.

Next in my fiction stack is another weighty tome, The Luminaries, which tips the scale at 2.5 pounds and runs 848 pages. It’s a tale set in New Zealand (the home of the author) at about the time of the American Civil War. A gold rush has attracted the central character, an English lawyer, who arrives after a harrowing sea crossing to make his fortune. After settling in at a hotel, he makes his way to the bar for a drink to calm his jangled nerves, and it becomes apparent to him that he has stumbled into the midst of a secret conclave of town citizens and prospectors. What they have in common are relationships with three events that happened two weeks prior – the death of a tramp whose shack at the edge of town contained a large fortune, the disappearance of a young man who had just “struck it rich” in the gold fields, and the near-suicide of a town prostitute.

The book won the Man Booker Prize for 2013, an award given for the best novel published in Britain. It is a remarkable achievement for its 28-year-old author, Eleanor Catton, the youngest writer ever to win the prize, and she did so with only her second novel. From the reviews I’ve read, it has a cast of finely-developed characters so numerous that a scorecard is almost needed to keep track of them. A third of the book is devoted to the opening scene.

The Adventures of Augie March is third in my stack of fiction. This book was published in 1953 and won the National Book Award for fiction. Augie March, the principal character, is a sort of Depression-era Huckleberry Finn living in Chicago – a street-smart kid adept at working the angles to survive. In fact, he is probably the personification of the book’s author, Saul Bellow, who himself was a roughneck slum kid before immigrating to America from Quebec. The late Christopher Hitchens regarded Bellow's characters and fiction as reflections of Bellow’s own struggle "to overcome not just ghetto conditions but also ghetto psychoses."

I was determined to read one of Saul Bellow’s books this year. Augie March would be found on most lists of the best American fiction, but so would his other books – Humboldt’s Gift, Henderson the Rain King, Herzog, and Ravelstein, his last book, written when he was 85 and based loosely on the life of his good friend and university colleague, philosopher Allan Bloom.

Bellow led a remarkably productive life for which he was awarded the Pulitzer Prize, the Nobel Prize for Literature, the National Medal of Arts, and the National Book Award for Fiction – the last of which he won three times, the only writer to achieve such recognition. The National Book Foundation also awarded him its lifetime Medal for Distinguished Contribution to American Letters in 1990.

I don’t have the space in this blog to mention every book in my 2014 plan or my reason for selecting it, so I’ll have to stop with the top three in the non-fiction and fiction stacks.

I hope you have a 2014 reading plan and that it includes several character-driven novels. I’m convinced you’ll meet some interesting people in their pages.