Saturday, July 31, 2010

The Iran Dilemma – Part I

Iran possesses about 10% of the world’s oil reserves and about 15% of the world’s gas, making it one of the largest energy-producing nations. Yet it has embarked on a nuclear program – allegedly for energy production – which is a thinly veiled attempt to build nuclear weapons in one of the most volatile regions of the world. Iran’s neighbors include Afghanistan, Pakistan, and Iraq – centers for the war on terror – but they also include American friends and interests: Saudi Arabia, a presumptive ally; Turkey, a Muslim democracy; the Persian Gulf sheikdoms; and the Persian Gulf itself, through which 40% of the world’s oil flows to market. The world’s major oil-producing region and the heart of the sponsors of terrorism are therefore superimposed on each other – a lethal concoction.

Iran’s decision to “go nuclear” poses a major threat to world peace. How the West should confront it is a major dilemma. To understand what seeded this confrontation it’s necessary to understand how Iran’s relationship with the West – primarily the US – soured in the past. Confronting Iran’s nuclear weapons threat will be the subject of next week’s blog.

Historical events have connections to other historical events, and those of Iran’s relations with the West are no exception. The most notable event in modern Iran-US relations – at least from the American perspective – is undoubtedly the November 1979 hostage-taking of 52 Americans and their incarceration for 444 days. However, like many antagonistic ventures, the hostage-taking was Iran’s retribution for the 25 years of misery it had been forced to endure as a consequence of the American foreign policy decision to overthrow the democratically elected Iranian Prime Minister, Mohammad Mosaddeq, in a CIA-engineered coup d’état in 1953. And the overthrow of Mosaddeq is linked to the nationalization of the Anglo-Iranian Oil Company, the predecessor of modern-day BP, which was formed at the beginning of the 20th century to exploit Iranian oil reserves.

Exploit is the operative word because, during the control of the Iranian oil fields by the Anglo-Iranian Oil Company (AIOC), the Iranian oil workers lived in squalid conditions and Iran received only a small royalty on the production of its reserves. Mohammad Reza Shah Pahlavi came to power during World War II, when the Iranian oil fields were a vital asset to the Allied war effort and the Allies needed a reliable figure in control of the country. After the war, the Shah allowed a constitutional monarchy to rule through an elected parliament and a Prime Minister. But the government was shaky, if not corrupt, and a succession of Prime Ministers came and went.

In 1951 Mohammad Mosaddeq, a nationalistic critic of the AIOC, received the parliamentary vote to become Prime Minister and the Shah was forced to confirm him in that position. Mosaddeq quickly moved to nationalize the oil industry, provoking British retaliation that brought the Iranian economy to its knees. Britain withdrew its tankers, preventing Iran from getting oil to market. The British PM, Winston Churchill, approached US President Truman to support a coup that would return the oil production to AIOC control, but Truman refused.

However, Cold War contentions led the Eisenhower administration to fear that Mosaddeq would drift toward the Soviet Union, cutting off a major oil supplier from the West. Churchill now found a willing partner in the plot to overthrow the legitimate government of Iran. Mosaddeq was toppled, tried for treason, and confined to house arrest – an event the Iranian people never forgot. In the wake of the coup, Shah Mohammad Reza Pahlavi ruled the country as an autocrat – essentially a puppet of the US. For the next 25 years, this arrangement gave the US the regional stability and reliable oil supply it desired, but it didn’t work well for the Iranian people.

The Shah signed an agreement with an international consortium of foreign companies to develop Iran’s oil industry, allegedly receiving half of the oil profits, although Iran was never allowed to look at the books. Revenues were used to fund Iranian land reform and the elimination of illiteracy, and reforms were instituted that allowed women to vote, hold office, and refuse marriage before age 15. The Shah also encouraged the adoption of western dress and culture, much to the horror of the clerical class.

However, the Shah also spent lavishly on himself and his family, which, in a poor society, alienated him from the people, and he spent heavily to build up a massive military – the largest in the region. His trickle-down spending didn’t improve the economic well-being of the lower classes. Social unrest was brutally suppressed, and the Shah’s secret police murdered hundreds and arrested and tortured many more. When the Shi'ite clergy became alarmed that their influence in Iranian society was threatened by the Shah’s policies, their leader, Ayatollah Khomeini, began to speak openly for overturning the Pahlavi dynasty. Khomeini was exiled, further polarizing the country.

In January 1979 the Shah announced he would leave Iran for 18 months to seek treatment for his terminal cancer. With the country collapsing, Ayatollah Khomeini returned from exile and established a de facto government. The US allowed the Shah to enter the country for medical treatment, which the Khomeini government saw as harboring a criminal and preventing his return to Iran for trial. This set the stage for the seizure of the American embassy in Tehran and the taking of its occupants hostage in November, to which the US responded by freezing Iranian assets. The new revolutionary government banned political parties, cracked down on symbols of western influence, and in an effort to unify the country, vilified the US as the “Great Satan”. In the end, Khomeini had accomplished the first religious revolution in modern times.

After an eight-year war with Iraq that devastated the Iranian economy and infrastructure, Khomeini died in 1989 and was replaced by Ali Khamenei as Supreme Leader. The allegiance of the clerical ruling class split into two factions: those who sought a pragmatic accommodation with the West and supported internal reform, and those who endorsed the ideology of Khamenei and wanted a confrontational foreign policy with the West in order to maintain the moral authority of the ideologues. To the present day, the conservatives have held sway, using the revolutionary constitution to disqualify reformers from office, close newspapers, control judges, and intimidate opponents. But their anti-western restrictions are wearing thin on Iran’s youthful society, which sees the revolutionary leadership, not the West, as largely responsible for its misery. This offers hope for the future.

Here’s why.

A country’s internal stability is a function of its openness. Countries in the free world are open societies that are quite resilient to the irritations of recessions, war, and unpopular government policies. Their citizens don’t take to the streets in mass demonstrations of civil unrest. Ironically, however, closed societies such as North Korea and Cuba are also quite stable because society is under the thumb of its government. No country can permanently maintain isolation from world influences unless it is governed by a personality cult, as North Korea and Cuba are. Fidel Castro is Cuba. Kim Jong-il is North Korea. Ali Khamenei is not Iran because Iran is a semi-open society. If the mullahs attempt to make it less open, there will be social upheaval. If Iran progresses toward more openness, the mullahs are out. Even among the mullahs there are longstanding philosophical differences tugging the country toward more or less openness. The struggle for and against openness is one of the internal tensions in Iran today.

Another internal tension is the growing population and the declining economic opportunity in Iran. Iran is one of the most populous countries in the Middle East. There were 30 million Iranians when Khomeini launched the religious revolution. Today there are 72 million Iranians of whom 70% were born after the revolution and, therefore, don’t have the emotional investment in despising the Shah’s regime or the past usurpations of the West. They are more interested in jobs than ideology.

However, because the most influential mullahs eschew openness, the Iranian economy is shrinking. During the Shah’s regime, oil output was six million barrels a day; today it’s more like four million barrels a day. Still, Iran has sufficient oil revenues and foreign reserves to build infrastructure, capitalize private entrepreneurship, and promote foreign investment and trade which would create the million or so new jobs needed annually to employ its young people. Instead, Iran’s financial assets are used to buy off restive reformers and prop up the country as its manufacturing sector wears out and the country labors under the sanctions imposed by the UN, EU, and US.

Yet a third source of internal tension in Iran is the pressure coming from young people for regime change. While the majority of Iranians may dislike the US, they like its people. Iranians admire western culture and fashion and want a similar open modern society for their own country. Last summer, Iranians took to the streets, chanting "Where is my vote?" Their protests in major cities in Iran and around the world were in support of opposition candidate Mir-Hossein Mousavi who lost in what was alleged to be the rigged reelection of Mahmoud Ahmadinejad as Iranian president. The protests have been called several names by their proponents but the one that seems to have stuck is the Green Revolution, reflecting Mousavi's campaign color.

Last summer’s protests were met with violence by Khamenei, the Supreme Leader, whose harsh repressions may have sealed his fate. More than 5,000 protestors were arrested and an unknown number of them killed. In an era when mass communications can’t be controlled by the government, the use of extreme violence is a threat to the regime's credibility. The death of Neda Agha Soltan, the young woman shot in the chest by security forces and whose dying moments were captured on a phone camera, shocked the world.

Revelations of rape and torture by security officials outraged Iranians and dissident mullahs. And the regime may be losing its influence over the military. There are indications that a large part of the Revolutionary Guard is no longer willing to be used as an instrument of oppression. Video images from demonstrations show Guard members joining the ranks of the protesters. A declaration signed by air force and army officers and published on the Internet warned radical Revolutionary Guard members to "Stop the violence against your own population."

These internal tensions could be used by US policy makers to drive a wedge between the clergy’s factions and produce a government more favorable to the West. In next week’s blog, I’ll discuss how.

Saturday, July 24, 2010

American Exceptionalism

In 1831 a 25-year-old French nobleman, Alexis de Tocqueville, came to this country and spent a year traveling it and observing its people, culture, and the workings of everyday life. He later published his observations in two volumes entitled Democracy in America in an effort to explain the American experiment in self-government to Europeans, especially the French.

As he journeyed about the country and mingled with Americans, what de Tocqueville saw up close was the world’s first functioning meritocracy – a society quite different from that of Europe, one without a class structure, an entrenched church hierarchy, or the European low regard for commercial work. Unlike his countrymen, Americans, de Tocqueville observed, were more independent, less inclined to subject themselves to the control of others, more tolerant of insecurity, and untroubled by economic inequity due to their belief that hard work was the solution. Although democracy existed nowhere else in the world, it permeated every corner of our culture, leading de Tocqueville to conclude that America was qualitatively different from all other countries.

“The position of the Americans is therefore quite exceptional,” he wrote, “and it may be believed that no democratic people will ever be placed in a similar one.”

American exceptionalism, according to de Tocqueville, was the product of a society of immigrants who built a nation of laws and ideals like none other in the world. He attributed our exceptionalism to five qualities: liberty, egalitarianism, individualism, populism and laissez-faire, which he collectively called “the American Creed.”

Liberty, he noted, was the first and most important element of the creed. The bequest of the Founders was a Constitution that strictly limited the role of government in American society by stating what it and its executives were forbidden to do. The negative tone of the Constitution underlined that its purpose was to protect the liberty of the people. Writing about the restrictions of the Constitution in Federalist 51, Madison stated, “It may be a reflection on human nature, that such devices should be necessary to control the abuses of government. But what is government itself, but the greatest of all reflections on human nature? If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary.”

A Pew Research poll two years ago surveyed 91,000 people in 50 countries. It revealed how exceptional Americans remain today. Americans are prouder of their national identity: 71% said they are "very proud" of their country, whereas only 38% of the French and 21% of the Germans and the Japanese said the same of theirs. Americans are more individualistic: only a third of Americans believed success was beyond their control, in contrast to two-thirds of the Germans and Italians. Sixty percent of Americans said they believe the value of hard work should be taught to their children, versus one-third of the British and Italians and one-fifth of the Germans. Over half of Americans believed economic competition stimulates people to work hard and develop new ideas, while only one-third of the French and Spanish agreed. Over three-fourths of Americans would like their views to spread throughout the world, versus one-fourth of the French, Germans, and Italians, and one-third of the British.

Our political culture, de Tocqueville believed, derived from our religious heritage. Fleeing religious persecution in their native lands, the early settlers brought with them a form of Christian worship that was both "democratic and republican." Today 73% of Americans believe in God, compared with 27% of the French and 35% of the British. As a consequence, nearly half of all Americans attend churches or synagogues weekly, compared to 4% of the British, 5% of the French, and similar low percentages in most of Western Europe.

American exceptionalism, however, finds no friend in Barack Obama. Every expression of it has been under attack since his administration took office with the avowed purpose of fundamentally transforming this country. Asked last year if he believed in American exceptionalism, he answered, "I believe in American exceptionalism, just as I suspect that the Brits believe in British exceptionalism and the Greeks believe in Greek exceptionalism." In other words, every nation is exceptional; therefore, no nation is.

It’s apparent that Obama equates American exceptionalism with American arrogance. While in France he said, "In America, there is a failure to appreciate Europe's leading role in the world...there have been times where America has shown arrogance and been dismissive, even derisive." For Obama the achievements of other countries are to be celebrated; those of his own country aren’t. Rather than defend the honor of the nation he was chosen to lead, he focuses on its failings and misdeeds. His multiple apologies for America’s past transgressions against the world and his bowing to foreign dignitaries are a national embarrassment.

Obama has shown repeatedly that he is uncomfortable displaying American patriotism. He made a point of not wearing a flag pin on his lapel. He sometimes fails to salute the flag and stands casually when the national anthem is played. He has praised every religion except Christianity, which is the foundation of the Republic. He expurgated the word “terrorism” from governmental dialog.

The fundamental nature of government is changing on his watch. Obama has used recess appointments to fill key positions with individuals who are so controversial they likely would not pass Senate approval. The most recent, Donald Berwick, bypassed Senate confirmation when the Congress recessed for a short July 4 weekend break. Similarly, Obama has subverted Senate scrutiny of key appointees by calling them “czars” – unelected people who are accountable to no one but Obama. His administration employs more of them than any previous administration.

Spending will reach 44% of GDP this year – on par with the intrusiveness of European governments. Yet while Europe and the UK are trying to unwind their decades-long addiction to welfare programs, the American government is going the other way. The stimulus failed to bring down unemployment as promised when it was sold to the American public. Therefore, unemployment benefits were extended to almost two years in a bill passed this week after having been stalled for seven weeks by Republicans who insisted that the benefits be paid for by spending offsets elsewhere. Predictably, applications for jobless benefits jumped.

With the bailouts of the auto companies and financial institutions the line between private enterprise and government enterprise is blurring. There is reason to fear that “too big to fail” will become a slogan justifying more government interventions in the future, leading to more government interference in business. When government intrudes in business it always leaves a trail of new regulations, as it did in the recent hamstringing of the financial industry – euphemistically called “reform.”

In his fundamental transformation of America, Obama and his Democrat congressional allies forced through an unpopular “reform” of the healthcare system, bringing one-sixth of the economy under government sway. After asserting for almost a year that the individual mandate was not a tax increase, the Obama administration is now taking the position that it is a tax in order to defend its constitutionality against suits by 20 states and several private organizations.

Labor unions have crippled European productivity for decades while they have been on the decline in this country. Under Obama they are enjoying a resurgence. His card check scheme would make it easier to force people into unions by eliminating secret-ballot elections. Obama’s recess appointment of Craig Becker to the National Labor Relations Board is a scandal. As former counsel to the SEIU, Obama’s favorite union, he ought to recuse himself from cases involving his former employer. He has refused to do so. In one such instance, a group of nurses refused to go on strike and resigned from the SEIU, which threatened to make trouble for the nurses if they didn’t toe the union line. Becker was one of the three Board members who heard the case.

Obama’s two nominees to the Supreme Court are especially troubling. Both have a “Silly Putty” concept of a flexible Constitution that is contemptuous of the Founders’ design. Both believe international legal opinions should have a bearing on their decisions, which, if implemented, would “Europeanize” justice in this country independently of our Constitution. In this week’s hearings, Elena Kagan refused to answer Senator Tom Coburn’s question asking if she believed in the principle of natural rights contained in the Declaration of Independence. Her answer, “I don’t have a view of what are natural rights independent of the Constitution,” is simply astonishing for a prospective constitutional judge. As Coburn said, refusing to acknowledge natural or God-given rights removes the morality from her progressive moral certitude. Without natural law there would have been no Constitution. Without natural law, “progressives” would take us back to the 17th century, when rights emanated from the state or the king rather than the Creator.

America is losing its exceptionalism under Obama.

This week’s polls have Obama’s job approval at all-time lows. Rasmussen puts his disapproval at 56% of likely voters. Almost two-thirds – 62% – believe the country is on the wrong track. And the approval rating of Congress is almost non-existent: only 11% believe it is doing a good job. These numbers aren’t likely to reverse course before the mid-term elections. Charlie Cook, Washington’s most reliable handicapper, says the Democrats will lose control of the House and lose six or seven seats in the Senate. If he is right, the country will have repudiated the agenda that Obama and his Democrat congress have pursued for two years.

Yet at least three Democrat senators are talking about calling the lame-duck Congress back into session in a last-chance effort to pass radical legislation that they do not want to be answerable for in the November elections. If they do that and if Obama acquiesces to it, it will hijack the wishes of a democratic society and sully the integrity of a representative Congress – perhaps permanently.

Sunday, July 18, 2010

Why ObamaCare Will Fail

Hubris, the overestimation of one’s competence and ability, especially among those in positions of power, has sent mankind on many a fool’s errand and has been the cause of much anguish through the ages.

One of the earliest recorded instances of it is in Genesis 11. In the days following the biblical flood, people spoke a common language, allowing them to collaborate in joint ventures, such as the building of the great tower of Babel in modern-day Iraq. “Come, let us build ourselves a city and a tower with its top in the heavens,” they said, in order “to make a name for ourselves.” God observes their hubris – the desire to be like Him – and confuses their language so they can no longer communicate with each other; then He scatters them so that their construction project is left incomplete.

Farther down mankind’s timeline, Solomon, allegedly the wisest man who ever lived, warned that “pride goes before destruction, and a haughty spirit before a fall.” Sage advice. Hubris is accompanied by a willingness to take excessive risk. It was at the root of the Challenger disaster, the Bay of Pigs catastrophe, and most recently the Deepwater Horizon oil rig explosion.

When Barack Obama assumed the office of the presidency, our country was facing high unemployment, a meltdown of financial institutions, two foreign wars, a near-nuclear Iran, the misadventures of a tyrant in Korea whose sanity is questionable, and a fulminating conflict in Palestine. Yet despite all of these challenges, Obama and his minions in Congress chose to “reform” the American healthcare system, which represents one-sixth of the economy and was not a smoldering problem. We are left to guess his motivation in this risky undertaking, but one thing is certain: Obama is not burdened with excessive modesty. His self-image borders on messianic. One wonders whether, like the ancient builders of the tower of Babel, he was driven in this large-scale government reengineering by the desire to “come; let us make a name for ourselves.”

However, even if it had the noblest motivations, ObamaCare is doomed to fail because of its sheer scale and risk. Here’s why.

The American healthcare system is, well, a system. A system by definition is the aggregation of interdependent parts or activities or functions or all three – very few of which, if any, are superfluous. Shut down one part, activity, or function and the system or a subsystem within it will cease to function. In complex systems, a failure in one part can cascade throughout the system causing failures in related subsystems.

In mid-July of 1977, for example, a New York City blackout occurred because a lightning strike at a substation tripped two circuit breakers. A loose locking nut in one breaker box together with a tardy restart cycle ensured that the breaker was not able to reengage and allow power to begin flowing over the lines again. This caused the loss of two more transmission lines, which caused the loss of power from the Indian Point nuclear power station, which caused two major transmission lines to become overloaded, which caused automatic circuit breakers to trip, which reduced power on the grid, which put the city in total darkness one hour after the lightning strike, which caused widespread looting and rioting.

Such is the nature of systems. Their greatest strength is their greatest weakness: their interconnectedness. Our inclination to think there is symmetry in causes and consequences – that disastrous system failures are caused by equally monstrous blunders – is usually wrong. The root cause is most often quite benign and escalates to a catastrophic ending. A lightning strike on a remote box worth less than $25 caused hundreds of millions of dollars in riot and looting losses and damage almost 100 miles away.

Unlike many technological and organizational systems, the American healthcare system is not the product of a master design. It has evolved over six decades and continues to evolve. It is so vast that there is no person who understands how it works. There are people who understand how parts of the system work – relatively small parts. There are people who possess a global view of how the system works. But there is no one who possesses a ground-level understanding of how inputs are converted to outputs from end to end throughout the system. No one.

Into this unknown world of cause and consequence fools rush in where angels fear to tread. Yet Obama and his Democrat lawmakers, academics, and policy wonks propose, with unbounded hubris, to improve the effectiveness and reduce the future cost of this system that no one fully comprehends – a system with perhaps billions of micro-connections and work-arounds, most of which are invisible to people working in the system, let alone people outside of it; a Pick-up Stix web of relationships whose equilibrium small disruptions can send into a tailspin of unintended consequences.

The flagships of the ObamaCare invasion will be over a hundred new government bureaucracies under the command of managers who will face implementation problems they have never confronted before – de novo bureaucracies with no legacy of precedent, whose operating procedures will have been composed by small armies of regulation writers who have never worked in the administrative environments whose functions they are prescribing, each army laboring independently of the others, thus assuring there is no coherency in their collective work product. There will be, however, a bumper crop of unintended outcomes, some of which will require years to erect adequate organizational defenses preventing their recurrence. As has happened with Social Security, Medicare, and Medicaid, costs will exceed the most pessimistic CBO estimate, perhaps two-fold or more, jeopardizing the U.S. economy for decades, if not forever. The bureaucracy managers will fail, although there will be few objective standards to reveal how badly they are failing. Their failures will be due not so much to the fact that they have never dealt with the issues facing them as to the fact that no one has.

Orbiting any new government program with the scale and intrinsic risks of ObamaCare will be two potentially fatal threats. One is the naïve optimism that things will go as planned. They won’t. However, instead of launching initiatives in provisional wrappers with the intent of adapting as new learning is acquired, as well-run business organizations do, they will be launched with a bureaucratic rule book whose effectiveness as a governing document is thought to correlate with its weight. Immeasurable resources and time will be spent trying to make the system work as planned. In predictable bureaucratic behavior, breakdowns and bottlenecks will be “fixed” with patch upon patch, rule upon rule – repairing rather than replacing defective operations.

The second fatal threat is that ObamaCare is not customer-centric. It is procedure-centric. Customer satisfaction was never its goal. This is by design. In their arrogant hubris Obama and his Democrat legislators assumed as an article of faith that government makes better decisions – certainly more rational ones – than the recipients and providers of healthcare services. The appointment of Donald Berwick to head the Centers for Medicare and Medicaid Services and its $900 billion annual budget made that abundantly clear. Berwick is an academic technocrat who has publicly stated multiple times his lack of confidence in private enterprise solutions for healthcare delivery. Yet one need only look to public education, Amtrak, and the U.S. Postal Service to see how Procrustean government-provided services are. These institutions have “survived” because there are alternatives to using their services. The aim of Berwick and ObamaCare is a single-payer system.

The failure to make ObamaCare customer-centric could be its undoing. The absence of a feedback loop from the market and of alternative choices assures that healthcare services will be substandard. Americans, with their legacy of enjoying the best products and services in the world, may suffer this for a while, but not for long. Democratic society works because of the consent of the governed. People pay their taxes, follow society’s rules, and accept civil authority voluntarily. The few that don’t are manageable because they are few. This country has not had to deal with large-scale civil disobedience since the Civil War, but it would be foolish to think that civil disobedience is not a possibility if society believes its public institutions are not serving the interests of the majority. Hopefully society’s frustration with ObamaCare will be resolved at the ballot box.

These criticisms of ObamaCare do not mean that the American healthcare system has no room for improvement. It does. But the system seems to work for about 85% of its users. Instead of focusing on the 15% for whom it doesn’t work well, the hubris of ObamaCare is its redesign of the entire system.

Why not take insurance and focus on improving that alone? Small-scale, highly focused interventions would produce improvements in a relatively short period of time. At the very least, new knowledge would be produced about what works and what doesn’t, and that new knowledge would lead to improvement. Such an approach is experimental, flexible, and adaptable. Notwithstanding Berwick, a private sector partnership would be critical to the success of the undertaking. Once insurance is “reformed,” perhaps unnecessary testing and treatment could be addressed next, followed by improvement initiatives confronting other failures of the healthcare delivery system.

This piecemeal approach has worked in improving business processes. It would work in improving the cost and quality of healthcare delivery. If performance improvement had been Obama’s aim, he would not have undertaken a large-scale, high-risk overhaul that has little chance of succeeding. He would have taken a more modest, less visible, and less risky approach. The hubris of his claim that, while he wasn’t the first president to try to reform the American healthcare system, he intended to be the last reveals an aim that is ages old: “Come; let us make a name for ourselves.”

Saturday, July 10, 2010

Another Poke in America's Eye

With the recess appointment this week of Dr. Donald Berwick to head CMS, Obama stuck his thumb once again in our country’s eye, demonstrating what kind of American president he intends to be – more imperial than representative, more a ruler than a governor, more the lawgiver than his party’s legislative agenda leader.

His first “in your eye” moment was the ramming through of his unpopular ObamaCare, for which the only bipartisanship was in its opposition. It remains to be seen whether his individual mandate, whose constitutionality has yet to be tested, will give him his third opportunity to show that when it comes to running America, he’s in charge.

Senator John Barrasso, an orthopedic surgeon from Casper, Wyoming, commented on Obama’s recess appointment: "I think this was his intention all along." Perhaps so. Why otherwise would a nominee who was only put forward a few months ago, who was in the process of preparing his responses to the hearing committee’s questionnaire and the information it requested, and whose hearings had not even been scheduled – who, therefore, faced no opposition yet – suddenly be given a recess appointment when Congress went on a break of less than two weeks?

Dan Pfeiffer, White House communications director, posted a possible answer Tuesday on the White House blog:

"With the agency facing new responsibilities to protect seniors' care under the Affordable Care Act, there's no time to waste with Washington game-playing. That's why tomorrow the president will use a recess appointment to put Dr. Berwick at the agency's helm and provide strong leadership for the Medicare program without delay."

So compliance with Article II, Section Two of the U.S. Constitution is playing games? The Advice and Consent provision says:

“[The President] … shall nominate, and by and with the Advice and Consent of the Senate, shall appoint Ambassadors, other public Ministers and Consuls, … and all other Officers of the United States, whose Appointments are not herein otherwise provided for….”

This doesn’t sound like “playing games,” unless, like Pfeiffer, you consider the Constitution an “oh so ‘yesterday’” document that gets in the way when the big boys are trying to get government done.

Apparently Obama & Company were planning to blame the Republicans for making the recess appointment necessary … until Senator Max Baucus, Democrat Chairman of the Finance Committee, opened his mouth: “Senate confirmation of presidential appointees is an essential process prescribed by the constitution that serves as a check on executive power and protects . . . all Americans by ensuring that crucial questions are asked of the nominee — and answered.”

Senator Chuck Grassley, Baucus’ Republican counterpart on the Committee, had added to Berwick’s homework assignment a request for information about the Institute for Healthcare Improvement – a non-profit founded by Berwick nearly two decades ago. After years of paying Berwick about $600,000 in annual income, the IHI suddenly quadrupled his income to about $2.3 million in 2008 – a year in which IHI revenues fell $7 million and it produced a deficit of over $600,000. It could be that there is nothing there, but a lot of money orbits around and through the foundation, and Grassley was right to pursue it even though questions about the IHI were not on Berwick’s questionnaire.

Berwick and the IHI have also been associated with controversial assertions about the efficacy of their research and outcomes in the so-called 100,000 Lives Campaign, which claimed to have prevented over 100,000 deaths among hospital inpatients. The methodology has been widely criticized, and one assessment is surprisingly frank in its judgment:

“We do not see ourselves as “cynics” accusing hospitals of “manipulating” their data but rather as students of epidemiology and evidence-based methods who know that humans, working with a predetermined belief (and hope) that a practice works and with an incentive system that rewards positive findings, are capable of producing biased results in ways that neither they nor investigators are even aware of or can anticipate. The rationale for controlled studies, audited data, and other accepted scientific methods is to protect not against fabrication but against the more subtle biases that can contaminate the work of well-meaning, honest individuals who deeply believe in what they are doing.”

Dr. Berwick apparently never planned to hold a public-policy position during his career, because he has been impolitic over the years in his public remarks and writings, to wit:

"I am romantic about the National Health Service," he told a London audience in 2008, referring to the British single-payer system. "I love it," going on to call it "such a seductress" and "a global treasure."

“Most people who have serious pain do not need advanced methods; they just need the morphine and counseling that have been around for centuries.”

“Any healthcare funding plan that is just, equitable, civilized, and humane must … must redistribute wealth from the richer among us to the poorer and the less fortunate. Excellent healthcare is by definition redistributional.”

In his 1996 book New Rules, Dr. Berwick theorized that the primary goal of healthcare regulation is "to constrain decentralized, individual decision making" and "to weigh public welfare against the choices of private consumers."

During a time when a majority of Americans want ObamaCare repealed, perhaps these and other radical remarks that Berwick has made during his career, coupled with Grassley’s nosing around the IHI, were sufficient handwriting on the wall to cause Team Obama to exploit the fact that Congress was out of town and take the flak for making a recess appointment. Article II, Section Two of the Constitution further says:

“The President shall have Power to fill up all Vacancies that may happen during the Recess of the Senate, by granting Commissions which shall expire at the End of their next Session.”

Madison’s notes taken during the Constitutional Convention are silent as to the Founders’ motives in the recess appointment clause. But keep in mind that 18th-century America was agrarian, and the Founders realized that those in government had farms to care for that would take them home for longer periods than they would spend in the halls of government. It made sense that the President had to keep the government running under these conditions. The recess clause was not meant for times when Congress went home for an extended weekend.

So Obama has his man in office until the end of 2011. With Berwick, Obama now has a like-minded lieutenant who will be fanatically committed to transforming American healthcare into a top-down, government-run system.

"Please don't put your faith in market forces," Berwick wrote a couple of years ago. "It's a popular idea: that Adam Smith's invisible hand would do a better job of designing care than leaders with plans can. I find little evidence that market forces relying on consumers choosing among an array of products, with competitors fighting it out, lead to the healthcare system you want and need."

However, both men have to consider in their political calculus the possibility, if not the likelihood, that the House will be lost to the Republicans this fall. Republicans will gain seats in the Senate – not likely a majority, but enough to force the Democrats to come courting in order to get the Democrat agenda passed. And unlike previous mid-term elections, between this November and next January a lot of important legislation will be shoved into the lame-duck session.

In bypassing the Senate hearing, Obama ensures that Berwick will face even tougher scrutiny if he is nominated again after his recess term ends in 2011. Senators have thin skins when treated as if they are insignificant – even the Democrat types. And Berwick arrives on his new assignment much as John Bolton did when Bush appointed him to the UN post – as damaged goods. There is a patina of legitimacy that a confirmed appointee gets and a recess appointee lacks. Berwick will lack stature with his CMS staff and in his relations with Congress.

After the 2010 election Obama and Berwick will likely look at each other and realize they aren’t in Kansas anymore. Therefore the recess appointment of Dr. Berwick could prove to be a Pyrrhic victory.

Saturday, July 3, 2010

The First July 4th

When the Second Continental Congress convened in Philadelphia in May 1775, the revolution against Great Britain was already underway. The outbreak of hostilities had occurred less than a month earlier with the bloodshed at Lexington and Concord; the Battle of Bunker Hill and the evacuation of the British army from Boston by sea would follow.

Although it had no authorization from the thirteen colonial governments to do so, the Second Congress began functioning as a national government – appointing generals, sending representatives to European governments, signing treaties, borrowing money, and issuing paper currency (called “Continentals”) – all with the primary focus of managing the war effort.

Initially, the war was not a war of independence; it was a rejection of the authority of Parliament to rule the Colonies without Colonial representation. But as the Second Congress labored on into 1776, it became apparent that the recalcitrance of the advisors to the Crown and of Parliament itself made any reasonable accommodation impossible and independence inevitable. A formal declaration of that independence would be necessary to proclaim to the world the right and the reasons for Colonial independence; otherwise no foreign nation would get involved in a family squabble between the Crown and a family member.

The Virginia delegation to the Second Congress included Peyton Randolph, a cousin of Thomas Jefferson. When Randolph was called home to preside over the Virginia House of Burgesses, Virginia’s government, much to the chagrin of Jefferson, sent him to replace his cousin. Compared to other delegates, Jefferson was relatively young at 33 years; he hated cities and public speaking, missed his wife and plantation, and, almost from the day he arrived in Philadelphia, began writing the Virginia officials asking to be recalled.

But his reputation for science, reading, and literary composition had preceded him. When the Second Congress decided on June 11 that a formal declaration was needed to proclaim Colonial independence, it delegated the task to a “Committee of Five” consisting of John Adams, Benjamin Franklin, Thomas Jefferson, Robert Livingston, the delegate from New York, and Roger Sherman, the delegate from Connecticut. There are no minutes from this committee, so it is something of a puzzle how one of its youngest members came to write the draft of the declaration. Franklin was not chosen because he was dealing with a severe attack of gout that kept him away from most of the Committee’s meetings. In letters written decades later, Adams explained that he and Jefferson were therefore assigned the task of writing a draft, and Adams deferred to his younger partner after refusing Jefferson’s deference to him. Adams’ letter says he gave Jefferson three reasons: “Reason first, you are a Virginian, and a Virginian ought to appear at the head of this business. Reason second, I am obnoxious, suspected, and unpopular. You are very much otherwise. Reason third, you can write ten times better than I can.”

For the next seventeen days, Jefferson toiled over the document, which was written on a mahogany traveling desk that he had commissioned Benjamin Randolph, a noted Philadelphia cabinetmaker, to make for him a year earlier. The desk survives today in the Smithsonian Institution.

When Jefferson had his draft, he showed it to Franklin and Adams, who made revisions (which can be seen today in their handwriting), along with revisions in Jefferson’s handwriting which may have come from conversations with Franklin, Adams, or the other committee members. From these revisions, Jefferson wrote a “fair copy,” which was titled “A Declaration by the Representatives of the United States of America, in General Congress assembled.” This was presented to the delegates of the Second Congress on June 28.

While Jefferson’s document would become one of the two most important documents of the founding era – the other being the Constitution – it was not regarded as such at the time it was written. To Jefferson and the other delegates it was simply one of the many bureaucratic papers that were written during those early days when the delegates were making things up as they went along in their process of becoming a republic. Ignominiously, the document was tabled (literally, as can be seen in John Trumbull’s famous painting) while the Congress took up more pressing matters.

One of those matters was a Resolution of Independence that the Virginia Convention had instructed its delegate, Richard Henry Lee, to put before the Philadelphia Congress back in May:

“Resolved, that these United Colonies are, and of right ought to be, free and independent States, that they are absolved from all allegiance to the British Crown, and that all political connection between them and the State of Great Britain is, and ought to be, totally dissolved.”

Some delegates had not been authorized by their state conventions to vote for independence and thought it was premature to act on Lee’s resolution. Now, on July 1, the delegates were prepared to debate and vote. However, Edward Rutledge of South Carolina, the youngest delegate at age 26, asked that the vote be delayed a day so that unanimity might be sought. On July 2 the vote was taken and the resolution passed with twelve affirmations and one abstention, that of New York, whose delegation had not yet received permission to vote for the resolution.

John Adams, writing to his wife on July 3, predicted that July 2 would become an American holiday because of the passage of Lee’s resolution, assuming that the date of the vote on the resolution, not the vote on the declaration announcing it, would become “Independence Day.”

For the next two days, the delegates reworded Jefferson’s text and deleted nearly one-fourth of it, which Jefferson absorbed in silent agony. As Pauline Maier writes in her excellent book, American Scripture: Making the Declaration of Independence, “[Jefferson] had forgotten, as has posterity, that a draftsman is not an author …” Nevertheless, Jefferson had produced a masterpiece, without benefit of books, strictly from the recollection of his well-read mind. His ideas as expressed in the Declaration have been the subject of much scholarship and many books. My personal favorite is Garry Wills’ Inventing America: Jefferson's Declaration of Independence.

Franklin had rejoined the delegates after his attack of gout and was sitting near Jefferson during this painful process. “[Franklin] perceived that I was not insensible to these mutilations,” Jefferson recalled years later, and in an effort to give Jefferson comfort while the editing and amputations were going on, Franklin told him a story of a hatter who was about to open a shop. The hatter had a sign made to advertise his business which said, “John Thompson, Hatter, makes and sells hats for ready money,” followed by an image of a hat. Before hanging his sign, Franklin continued, the hatter asked the opinion of several friends. One recommended that the word “hatter” be removed since it was redundant with “makes hats.” Another said the word “makes” should be excised because customers wouldn’t care that it was Thompson who made the hats. A third friend noted that the term “for ready money” wasn’t necessary since merchandise wasn’t sold on credit. Finally, a fourth opined that “sells hats” could be stricken because Thompson could surely assume no one expected him to give them away. In the end, Franklin sighed, the sign simply said, “John Thompson,” with a picture of a hat, which said what was needed quite well. It seemed Jefferson’s declaration was headed for the same fate.

On July 4, the Declaration of Independence, as it is called today, was approved and sent to the print shop of John Dunlap a few blocks away, where about 200 “broadside” (typeset) copies were printed during the night and distributed for public readings. The first public reading occurred in the yard of Independence Hall, and readings continued throughout the thirteen colonies. A copy was sent to General Washington, who was with the army in New York and who had it read to his troops. After the reading concluded, a long period of silence followed, which gave the General concern. But after contemplating what they had heard, the troops broke out with loud “huzzahs.”

John Adams and Thomas Jefferson became fast friends as a result of their service in the Second Continental Congress, no doubt helped by the eight-year difference in their ages. Both held positions as ministers abroad and were out of the country when the Constitutional Convention was held. The first presidential election unanimously elected George Washington; Adams became Vice President and Jefferson became Secretary of State. However, Jefferson left the Washington administration during its second term and returned to Monticello due to his ongoing disagreements with Secretary of the Treasury Alexander Hamilton over the fiscal management of the new nation.

When Washington refused to run for a third term, the election of 1796 set friend against friend because both Adams and Jefferson wanted to be the second president. Adams won, and because Jefferson received the next highest total of electoral votes, he became Vice President by default.

The tumultuous election of 1800 tore their relationship asunder when Jefferson’s election as the third president denied Adams a second term. They had little to do with each other again until both reached old age and their mutual friend, Dr. Benjamin Rush, the noted Philadelphia physician, arranged a reconciliation that got them started on a most remarkable letter-writing exchange beginning in 1812.

In an historical irony, Thomas Jefferson died on July 4, 1826 – the 50th anniversary of the passage of the Declaration of Independence. John Adams died a few hours later on the same day.

Jefferson never made pretensions about being president. After his first inaugural, he stood in line at his boarding house waiting his turn to eat. As president, he declined to deliver the annual message to Congress in person, sending it in writing instead. On one occasion, a visitor to the White House knocked on the front door, which Jefferson himself answered in a tattered house robe and slippers. Not surprisingly, his tombstone lists his three proudest accomplishments and makes no mention of his presidency:

Here was buried
Thomas Jefferson
Author of the
Declaration
of
American Independence
of the
Statute of Virginia
for
Religious Freedom
and Father of the
University of Virginia