Wednesday, December 7, 2011
The Prolix Patriot has written in the past on the virtues of the Electoral College system, but the liberal campaign to institute a popular vote for presidential elections is in the news again. The National Popular Vote Interstate Compact is a proposal whereby participating states agree to award their electoral votes to the winner of the national popular vote, regardless of the votes cast for either candidate within the state. The compact will automatically go into effect if enough states join it to give the NPV states control of at least 270 electoral votes.
The election of 1960 was razor thin. Thanks to shenanigans in Chicago and other major Democratic strongholds in the Northeast, Kennedy won the popular vote by a margin of about 100,000 votes, but because of the Electoral College system enshrined in the Constitution by our founding fathers, Kennedy had a clear mandate of 303 electoral votes to Nixon's 219. Now, let's pretend the NPV had been in effect.
Polls in Nixon's home state of California close a full two to three hours after those in Chicago, Dallas, Philadelphia, New York, and Boston. As reports come in that Nixon is only down by a small margin (less than 0.25%), the Nixon campaign pushes organizers and supporters in California to get a few more voters out to the polls in Republican-leaning precincts to swing the national popular vote over to Nixon's favor.
When California finally begins tallying votes, voila, the final count comes in with Nixon ahead by 500 votes. Even though Kennedy should have a clear victory in the Electoral College, the NPV rules require Kennedy strongholds of Illinois, Massachusetts, New Jersey, and Maryland (all current NPV compact members) to give all their votes to Nixon. Given the slim margin, both the Kennedy and Nixon campaigns start requesting recounts and filing legal challenges, counter-suits, and injunctions in almost every state in the Union.
California, Texas, Illinois, and New York are simultaneously adjusting their vote tallies when it becomes clear that Kennedy may still win the popular vote. Nixon supporters in California react by launching a signature drive for a ballot initiative to leave the NPV interstate compact. Meanwhile, disgusted with the possibility that a Republican may take the election, Mississippi's unpledged electors announce they will give their votes to Kennedy.
Lawsuits galore are now headed to the Supreme Court, and then, as an added twist, Kennedy supporters and the press start a campaign to pressure Eisenhower's Supreme Court appointees to recuse themselves from any election-related cases, because Nixon served as an advisor to the Eisenhower administration's nomination and vetting process. By the time the dust has settled, the decision on how to resolve the crisis is left in the hands of only four justices, with a real possibility of deadlock.
As popular outrage builds with the escalating crisis, Eisenhower convenes all 50 governors at an emergency meeting in Dallas to consider the possibility of deploying the National Guard to maintain order. While traveling from the airport to his hotel, he is assassinated by a disgruntled Communist sympathizer named Lee Harvey Oswald. Nixon is sworn in as acting president while the results of his own election are still being litigated and tabulated. Massive riots and violence break out across the nation and Nixon declares "temporary emergency measures" in an attempt to bring the situation under control.
We are used to thinking of such a constitutional crisis as a preposterous and impossible scenario, but if the NPV ever goes into effect, this is exactly the sort of crisis that could happen in the very near future. In fact, the Heritage Foundation and the State Leadership Foundation hosted an event this morning with Senate Minority Leader Mitch McConnell which examined just these sorts of problems. As former Federal Election Commission Chairman Bradley Smith once observed, “We are so accustomed to stable, generally good government that we sometimes forget that failure of government structures is historically much more common than success.…[W]e tinker with our success at our peril.”
Tuesday, December 6, 2011
As the Occupy Wall Street movement fizzles out with the approach of colder weather, it is worth revisiting the Occupiers’ central argument: namely, that income inequality between the top 1% and the rest is somehow relevant to the present economic crisis. This, however, is a dangerous diversion from the heart of the problem. Instead of focusing on income inequality, we as a society should be focused on improving economic growth.
Income inequality is an inescapable reality. Even in famously egalitarian countries like Sweden or Norway, there is no such thing as perfect equality. Whenever people engage in economic activity together, there is some element of wealth creation over what could be achieved if each person had to fend for himself. Although on a small scale it is possible to apportion wealth creation evenly to all participants, history has shown time and time again that it is more efficient to apportion the rewards of economic activity in proportion to individual contributions.
Furthermore, according to recent economic data from the CIA World Factbook, there is no real correlation between income inequality and economic growth. As an example, both South Korea with a low Gini coefficient of 31.4 and Mexico with a very high Gini of 51.7 have similar growth rates. Conversely, Japan and India have similar Gini coefficients, but the annual GDP growth for India is more than double that of Japan. Indeed, an academic paper on the subject concludes that, “Evidence from a broad panel of countries shows little overall relation between income inequality and rates of growth and investment.”
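For readers curious how such a claim would be checked, the comparison above amounts to computing a correlation coefficient between Gini and growth across countries. Here is a minimal Python sketch; the Gini figures for South Korea and Mexico are the ones quoted above, while the Japan/India Ginis and all four growth rates are illustrative placeholder numbers, not sourced statistics:

```python
# Illustrative only: Korea/Mexico Ginis are from the text above; the
# Japan/India Ginis and all growth rates are assumed placeholder values.
data = {
    "South Korea": (31.4, 4.0),  # (Gini coefficient, annual GDP growth %)
    "Mexico":      (51.7, 4.2),
    "Japan":       (37.6, 1.5),
    "India":       (36.8, 7.8),
}

def pearson(pairs):
    """Pearson correlation coefficient for a sequence of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x, _ in pairs) ** 0.5
    sy = sum((y - my) ** 2 for _, y in pairs) ** 0.5
    return cov / (sx * sy)

r = pearson(list(data.values()))
print(f"correlation between Gini and growth: {r:.2f}")  # near zero
```

With only four countries this proves nothing by itself, of course; it merely shows the calculation one would run over the full Factbook dataset to test for a relationship.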
Worst of all, policies which are intended to promote income equality are not always successful and almost never contribute to greater prosperity. When pursued on a large scale, such policies can result in economic disaster as was the case during Stalin’s infamous five-year plans and Chairman Mao’s “Great Leap Forward,” both of which led to crop failures and the deaths of tens of millions of people from starvation.
Income inequality is a distraction from policies which contribute more directly to promoting prosperity. Numerous studies have demonstrated the relationship between strong property rights and economic growth. As the evidence shows, Americans do not need the government to tell them how to live their lives. If the government fulfills its central purpose of protecting our long-cherished rights of personal liberty and private property, Americans will be free to pursue their own aspirations of happiness and prosperity in whatever way they see fit.
Monday, November 14, 2011
The Supreme Court has granted certiorari to one of the challenges to ObamaCare, which means that a decision one way or the other on the constitutionality of the individual mandate will be issued by the Court in June of next year. Conveniently, the Republican presidential primaries will be wrapping up at the same time, and President Obama will begin debating the presumptive Republican nominee. While Obama would surely rather talk about something else, next year's debate will be centered on his obsession with health care reform amidst a failing economy and rising unemployment.
All of this is bad news for President Obama. While he may have the bully pulpit, the President has no authority in the deliberations of the Supreme Court or in the process of amending the Constitution. The only influence the President has in the former is the appointment of justices. In this, his nomination of former Solicitor General Elena Kagan may come back to haunt him during the general election as her position on ObamaCare prior to its enactment is re-examined by the public after the Court's decision in June.
We cannot know how the Court will decide the case, but in some respects, it doesn't even matter which way the Court rules. The magnitude of public opinion against the ObamaCare individual mandate is overwhelming. In addition to the 28 states that have sought to challenge the law in the courts, 18 states--including, most recently, the key presidential battleground of Ohio--have enacted laws which oppose some element of ObamaCare, and no fewer than 45 states have proposed similar legislation.
The only states which have not mounted any challenge whatsoever to ObamaCare are the coastal liberal strongholds of California, Massachusetts, Vermont, Rhode Island, and Connecticut. Predictably, the District of Columbia also has no objections to the individual mandate. With numbers like these, the states are in a position to amend the Constitution regardless of how the Supreme Court rules. History has shown that even the threat of an Article V convention has usually been sufficient to pressure Congress to respond to the will of the people.
President Obama entered office with no real leadership experience and an ideological obsession with reshaping the very essence of American life. Ironically, Obama's allies in Congress and the media who proclaimed the "historic" achievement of health care reform will be proven correct, although not in the way they had hoped. The combination of incompetence and hubris that came together in the failed ObamaCare legislation will be remembered forever as the centerpiece of Obama's failed presidency.
Monday, October 17, 2011
As Americans, we hold religious freedom and the freedom of conscience as the most essential of all rights. The First Amendment protects the freedoms of speech, assembly, press, and protest all in their own right, but at their heart these rights all depend on the basic right of belief. At Mass this past Sunday, the Gospel reading centered on Jesus' command to the Pharisees and the Herodians to, "Render therefore unto Caesar the things which are Caesar's; and unto God the things that are God's." Almost two thousand years later, this is an important insight into the debate over separation of church and state in our country.
Whether by requiring taxpayer-funded abortions in the healthcare law and by executive order, by allowing military chaplains to perform same-sex marriages, or by providing taxpayer funding to openly political groups like Planned Parenthood, the Obama Administration has waged an all-out assault on traditional Christian--and especially Catholic--belief. Indeed, while many of President Obama's initiatives have met with failure, his attack on religious freedom has been devastatingly successful.
Meanwhile, the persecution of Christians by liberals has reached a new low as the Occupy Wall Street protests have spread to Europe with a new decidedly anti-Christian twist. Whether in the Soviet Union, Castro's Cuba, or the anarchist protests of our day, left-wing politicians have always been hostile to religion. Obama appears to be no different. Although nominally a Christian, he has nevertheless provided encouragement, if not an outright endorsement, to the Occupiers' reign of terror.
Even for non-Christians and non-believers, the lesson of history is clear. When a democracy is intolerant of different views, disaster and tragedy soon follow. In one especially vivid example, Ken Burns' recent documentary demonstrates that the deadly consequences of prohibition were in large part fueled by religious persecution. When a minority is persecuted, it is never long before the majority begins to suffer as well. As our laws encroach more and more into the private sphere of our daily lives, the danger only increases.
In all this, we can look to the example of Thomas More who was forced to choose between allegiance to his king and to his faith. When King Henry VIII made it a crime to deny his supremacy as the head of the Church of England, Thomas More stood fast to his faith as a Catholic. Throughout his life, More was obsessed with the meaning of virtue. In his study of theology, philosophy, and the law, he came to believe that above all, virtue cannot exist without integrity. In the end, he chose to die rather than sacrifice his integrity.
Whether Christian or not, pro-life or pro-choice, for or against gay marriage, we have a duty as Americans to be tolerant of the views of others, but that does not mean we must accept the views of those with whom we disagree. Although it is inevitable that the government will do things that are objectionable--even to a majority of the people--it is also imperative that the government does not violate the essential freedom of belief. It is one thing to pay taxes for dubious government projects, but it is quite another when the government uses tax dollars to persecute those who pay the bill.
Tuesday, October 11, 2011
With each passing year, a vocal minority uses Columbus Day as an occasion to clamor for the rebranding of the holiday as “Native American Day,” if not its outright abolition. Aside from cute slogans about colonialism and Columbus’ accidental discovery of the New World, there is little substance to these objections. It is true that Columbus did not prove the earth was round and he was not the first person to set foot in America. However, for good or for ill, Columbus’ voyages changed the course of history.
The principal objection is that Columbus symbolizes a legacy of genocide and brutality that we should not celebrate. However, the Europeans were no worse than the native peoples. In particular, the early Spanish explorers documented with horror the practices of human sacrifice and cannibalism by the Aztecs and other indigenous cultures. If roles were reversed and the Aztecs had discovered Europe instead of the other way around, the clash of civilizations would have been just as brutal and bloody.
Alternatively, the tired cliché of our time is that Christopher Columbus did not discover America because there were people already here. However, just because a thing is seen does not mean it is understood. Before Benjamin Franklin’s famous kite experiment, people knew that lightning existed, but they did not understand its significance. In the same way, the native peoples of the Americas did not realize that they inhabited a continent in a larger world until Columbus made contact in 1492.
That said, one could reasonably argue that the credit should go to Leif Ericson, who set foot on Newfoundland around the year 1000. However, Ericson was illiterate, and the only accounts of his voyage that survive were passed down by oral tradition for several centuries before being written down, by which time the only Norse settlement in North America had long since been abandoned. In contrast, Columbus was an expert navigator who kept detailed accounts of his voyages so that others could follow his route.
As a result, Columbus is just as important to the history of America as the first Thanksgiving. There are hundreds of towns, cities, mountains, rivers, roads, parks, museums, schools, monuments, statues, and sports teams—and even the ill-fated Space Shuttle Columbia—all dedicated to the memory of the man who truly discovered the American continent. We can lament that the early European explorers did not share our modern cultural sensibilities and candidly admit to Columbus’ personal failings, but as Americans, we cannot condemn Columbus’ achievement without condemning our entire existence.
Thursday, October 6, 2011
A visionary is someone who has foresight beyond the horizon that limits the vision of those around him. This word is often thrown around carelessly, but in the case of Steve Jobs, who passed away last night, this description is truly appropriate. As Isaac Newton once wrote in a letter to a friend, “If I have seen further, it is by standing on the shoulders of giants.” The same can be said of Steve Jobs, who was in some ways the Isaac Newton of the personal computer.
Isaac Newton realized the impact of Copernicus, Galileo, and Kepler while his contemporaries were still trapped in a geocentric Aristotelian universe. Similarly, Steve Jobs realized the true impact of the graphical user interface, the internet, and wireless communications. Ironically, the two inventions for which Jobs will be most remembered were invented by other people, but it was Steve Jobs who would bring these ideas to their fullest potential.
The graphical user interface was developed by Xerox. In 1979, Jobs was given a glimpse of the Xerox Alto at Xerox's Palo Alto Research Center. Xerox failed to grasp the potential of the personal computer, and by the time the company began to take interest, Jobs had already created the enormously successful Apple Macintosh. For the first time, the power of the computer was available to ordinary non-technical people at an affordable price.
Similarly, before there were smart phones, there was the aptly named Apple Newton, which was created in the 1990’s while Jobs was running Pixar. The CEO of Apple at the time, John Sculley, even coined the phrase, “personal digital assistant.” Unfortunately, the Newton tried to accomplish too much, too quickly, and was limited by the high cost of miniaturized components at the time. Also, the Newton was created before cell phone networks had the capability to provide wireless internet connectivity. As a result, it was an enormous flop.
After his return to Apple in 1996, Jobs aggressively restructured the company to focus on its core strengths. As always, the hallmark of Jobs's genius was the simplicity and intuitive nature of his products. The iPod and iPhone can trace their beginnings back to the Xerox Alto and the Apple Newton, but it was Jobs who built on the work of others to make these ideas attractive, profitable, easy to use, and ubiquitous. In short, Steve Jobs put the internet in your pocket.
Tuesday, October 4, 2011
Last week, a drone operated by the Central Intelligence Agency and U.S. military Special Forces dispatched Anwar al-Awlaki to meet his maker. Awlaki was a radical imam who recruited and encouraged terrorists to attack the United States, but he was also an American citizen. Leading Libertarian figures claim this incident represents a dangerous threat to the right of due process, but a look at history shows us that it is not without precedent. According to Wikipedia:
Before dawn on April 26[, 1865], the soldiers caught up with the fugitives, who were hiding in Garrett's tobacco barn. David Herold surrendered, but [John Wilkes] Booth refused [Colonel Everton] Conger's demand to surrender, saying "I prefer to come out and fight"; the soldiers then set the barn on fire. As Booth moved about inside the blazing barn, Sergeant Boston Corbett shot him.
Colonel Conger, a Union intelligence officer, immediately arrested Corbett for violating his orders to take Booth alive. During the investigation, Corbett claimed that he saw Booth moving toward his weapons, but the other witnesses disputed this account, stating only that Booth was moving around inside the barn, likely searching for some means of escape from his desperate and hopeless position.
Despite the evidence that Corbett shot Booth in cold blood, Secretary of War Edwin Stanton dismissed the charges against Corbett stating, "The rebel is dead. The patriot lives." Corbett was later given an honorable discharge from the Army but later descended into madness, likely caused by the use of mercury which was then common in his civilian profession as a hatter.
Although the 14th Amendment did not exist at the time of Booth's death, due process rights were guaranteed to all American citizens at the federal level by the 5th Amendment. However, civil rights were sharply curtailed in the occupied Confederate states following the end of the war. In particular, thousands of Virginians were arrested and their property seized without due process for providing support and aid to the Confederacy.
The fate of Anwar al-Awlaki is remarkably similar. Like Booth, he was an American citizen who turned against the country of his birth, in Awlaki's case by joining forces with al Qaeda to commit acts of terror. On the other hand, Booth was given a chance to surrender, whereas Awlaki was blasted away by a robotic drone. In both cases, due process was likely violated.
After the death of John Wilkes Booth, the country did not descend into totalitarianism. Quite to the contrary, the ratification of the 13th, 14th, and 15th Amendments to the Constitution greatly expanded the protection of civil liberties, an expansion finally consummated in the civil rights struggles of the 1960's.
Despite the horrors of slavery, civil war, Jim Crow, and segregation, America has emerged from the struggle stronger and freer than before. However, Booth was killed after the war was over, whereas Awlaki was killed at a time when there is no clear end in sight.
Today, al Qaeda is trying to radicalize American citizens so they can use our laws against us. The longer the Global War on Terror drags on, the more our rights will slowly be eroded in the name of "security." As the war goes on, terrorist masterminds like Osama bin Laden, Khalid Sheikh Mohammed, and Anwar al-Awlaki will devise ever more insidious and diabolical means to attack at the very heart of what it means to be an American.
Rather than the legal implications, the greater concern should be that the use of targeted killings could encourage new recruits to follow in Awlaki's misguided path to "martyrdom." Over the past several years, President Obama has relied heavily on Special Forces to carry out assassinations of Somali pirates and al Qaeda kingpins instead of engaging our enemies with conventional military resources.
If the death of Awlaki hastens the end of this war, a return to peace, and a higher standard of justice in the future, it is not all bad--and perhaps the deaths of bin Laden and Awlaki will be the fatal blow to al Qaeda that will allow a return to domestic tranquility. But if not, we must reassess and rethink our strategy before the killing of Awlaki ceases to be the exception that proves the rule.
Thursday, September 8, 2011
Today is the 170th anniversary of the birth of composer Antonín Dvořák. Celebrations are planned across his native Czech Republic and at the Czech Embassy here in Washington, D.C. throughout this month, but the composer should also be celebrated by all Americans as a personification of America’s greatest ideals and a testament to American exceptionalism.
For three years, the great composer lived in New York as the director of the National Conservatory of Music. During that time, he composed a prolific body of works that borrow significantly from American folk-musical traditions. Inspired by Longfellow’s epic Song of Hiawatha, Negro spirituals, and Native American culture, Dvořák created one of the most beloved symphonies of all time—his Symphony Number 9: “From the New World,” as well as several other of his most famous works.
In his brief time in America, Dvořák was immersed in the melting-pot of turn-of-the-century New York. During the 1890’s, almost 10 million people immigrated to the United States from every corner of the globe. In that decade, more than 5,000 people were born during the crossing itself. Different cultures were colliding and mixing in unprecedented ways that have come to define America today.
More than just the good food and music that we most often think of today, the millions of people who arrived on our shores brought different languages, different religions, and different histories. Since then, the multitude of different cultures and beliefs, hopes and fears, villains and heroes have all merged into the great story of America itself. We often take it for granted that so many things that we consider “American” today were actually brought here from other shores.
Baseball is as American as apple pie, but baseball is derived from a British game and apples came from eastern Turkey. Jazz and Rock ‘n’ Roll are American art forms, but both are derived from the musical traditions of Africans who were transported here against their will. In our language, our history, our culture, and everything that makes us American, we are truly a nation of immigrants.
As we approach another anniversary—the 10th anniversary of September 11th—it is worth considering that in the World Trade Center, the terrorists indiscriminately attacked people from more than 90 countries and from every continent. It was not just an attack on America, but rather it was an attack on what it means to be an American.
Just as in Dvořák’s time, people came together in New York in search of the American ideals of liberty, equality, and justice. We owe it to the memory of the victims of that dark day and to all who risked everything to start a new life here in the "New World" to continue to strive for those ideals, and pray that all people in the world will someday come to know the particular blessings and prosperity we enjoy as Americans.
Monday, August 29, 2011
The Prolix Patriot and the Missus took advantage of the beautiful weather yesterday in the wake of Hurricane Irene to visit the new Martin Luther King, Jr. National Memorial. We knew that there was some controversy surrounding the project, but we were completely unprepared for how truly underwhelming the final product really is. Sadly, the memorial fails to capture the awesome scope of what this Baptist preacher from Atlanta accomplished in his tragically short life.
As we approached the site along the edge of the tidal basin, the Missus offered her initial impression that the large mountain of stone at the center of the memorial might represent America, and the gap in the middle might represent the racial divide, while the large statue of Dr. King himself might represent his message of unity and racial harmony. Sadly, as we approached, the inscription on the half-finished monolith of Dr. King informed us that the monumental stones represented the “mountain of despair.”
Nowhere in the memorial is there any inscription of King’s most famous and uplifting words from the “I Have a Dream” speech. The “mountain of despair” quote is drawn from the speech, but completely misses the transcendent nature of what Dr. King was trying to accomplish. Indeed, in that speech, King himself implored his listeners not to “wallow in the valley of despair.” By memorializing despair, the memorial runs counter to everything that King stood for.
Worse still, although Dr. King was a Baptist minister, there is no mention of God anywhere in the memorial at all. In the Jefferson Memorial, we have the core ideals of America encapsulated in the preamble to the Declaration of Independence. In the Lincoln Memorial, we have the brilliant and haunting eulogy of the Gettysburg Address. Both refer to God as the source of liberty. Why doesn’t the Martin Luther King, Jr. Memorial include the recurring motif of “all God’s children” that epitomizes King’s achievement as the consummation of Jefferson’s and Lincoln’s ideals?
Instead of celebrating the triumph of the thoroughly Christian values of love, compassion, and charity over hatred and violence, the memorial has an unpleasant similarity to the eerily smiling pharaohs of Abu Simbel. Instead of celebrating the great advances in racial equality, the memorial is a puzzle with half the pieces missing, but Rev. Dr. Martin Luther King, Jr. was not a cryptic figure or an obscure mystic. He spoke plainly and honestly about the need for racial equality. It’s a shame the monument does not live up to that legacy.
Wednesday, July 27, 2011
As the debt limit negotiations drag on, the Prolix Patriot was reminded this morning by a cousin-in-law of this passage from Atlas Shrugged:
"Then you will see the rise of the double standard – the men who live by force, yet count on those who live by trade to create the value of their looted money – the men who are the hitchhikers of virtue. In a moral society, these are the criminals, and the statutes are written to protect you against them. But when a society establishes criminals-by-right and looters-by-law – men who use force to seize the wealth of disarmed victims – then money becomes its creators' avenger. Such looters believe it safe to rob defenseless men, once they've passed a law to disarm them. But their loot becomes the magnet for other looters, who get it from them as they got it. Then the race goes, not to the ablest at production, but to those most ruthless at brutality. When force is the standard, the murderer wins over the pickpocket. And then that society vanishes, in a spread of ruins and slaughter.
"Do you wish to know whether that day is coming? Watch money. Money is the barometer of a society's virtue. When you see that trading is done, not by consent, but by compulsion – when you see that in order to produce, you need to obtain permission from men who produce nothing – when you see that money is flowing to those who deal, not in goods, but in favors – when you see that men get richer by graft and by pull than by work, and your laws don't protect you against them, but protect them against you – when you see corruption being rewarded and honesty becoming a self-sacrifice – you may know that your society is doomed. Money is so noble a medium that it does not compete with guns and it does not make terms with brutality. It will not permit a country to survive as half-property, half-loot.
"Whenever destroyers appear among men, they start by destroying money, for money is men's protection and the base of a moral existence. Destroyers seize gold and leave to its owners a counterfeit pile of paper. This kills all objective standards and delivers men into the arbitrary power of an arbitrary setter of values. Gold was an objective value, an equivalent of wealth produced. Paper is a mortgage on wealth that does not exist, backed by a gun aimed at those who are expected to produce it. Paper is a check drawn by legal looters upon an account which is not theirs: upon the virtue of the victims. Watch for the day when it bounces, marked: 'Account overdrawn.'"
The Prolix Patriot does not advocate for a return to the gold standard, as this would be disastrously impractical even in the best case. At the same time, it is fast becoming obvious that our trust in the "full faith and credit" of the U.S. government as the basic measure of value is sorely misplaced. Even if default is avoided this time around, what's to stop future politicians from further corrupting our entire economic system?
More importantly, insurance companies, mortgage lenders, and retirement funds should not be punished because of the profligacy of our government. The last case is especially wicked. In exchange for a lifetime of honest work, millions of retirees and those nearing retirement are left with nothing except faith in the word of a politician.
It is written in scripture that, "No man can serve two masters: for either he will hate the one, and love the other; or else he will hold to the one, and despise the other. Ye cannot serve God and mammon." Alas, our elected officials thirst only for power and don't even serve mammon, and the motto "In God We Trust" stamped on our currency has become a twisted joke.
Tuesday, July 26, 2011
After last night’s speech, one gets a sense that President Obama is desperate to prove to a skeptical public that he is in charge. However, what the President fails to understand is that leadership is about more than just giving commands. Leadership requires respect, wisdom, patience, and above all, humility, but Obama never had to learn these hard-earned virtues prior to his inauguration. As Benjamin Franklin once said, "Experience keeps a dear school, but fools will learn in no other."
Indeed, only six other presidents have, like Obama, been elected without any prior executive experience--defined here as prior service as a governor, a military officer, or both:
- John Adams
- John Quincy Adams
- James Buchanan
- William Howard Taft
- Warren G. Harding
- Herbert Hoover
Many presidents failed to win re-election despite their experience in the military or the governor’s mansion. Recent one-term presidents like Ford, Carter, and George H.W. Bush all served with distinction in the military. Other one-termers like John Tyler, Andrew Johnson, and Rutherford B. Hayes were all governors. And then of course there is the oddity of Grover Cleveland, the only president to lose re-election and then come back to win a second, non-consecutive term.
It is clear that previous executive experience--even in the presidency itself--does not guarantee re-election. Sometimes events are too powerful for even otherwise great and accomplished men to overcome. Patience and humility are essential virtues when confronted with the unexpected and the uncontrollable. Wise presidents know the limits of the office--and the limits of human nature.
Obama doesn’t have any of these advantages. History shows that lack of experience almost guarantees an unsuccessful presidency. This is even worse for Obama, who is powerless in the face of continuing economic uncertainty. As Vice President Biden once said--as he so often does--in an unguarded moment of unintentional honesty, "The presidency is not something that lends itself to on-the-job training."
Wednesday, July 20, 2011
Almost a year ago in these pages, the Prolix Patriot dubbed the President’s flawed economic policies a new era of “Hoover-Carterism.” Little has changed since then. In the debate over increasing taxes vs. cutting spending, Obama has led the liberal chorus of doomsayers who warn that cuts to spending will harm the economy far worse than increasing taxes, but there is scant evidence to support their claims. Rather, the only real support for Obama’s dubious policies is his own hubris.
Prior to the 20th century, government spending was a minuscule fraction of what it is today. Even when the government was not the behemoth that it is now, the economy grew far more quickly than it has in recent years. Then, on February 3, 1913, the state of Delaware--which ironically has no income tax of its own--provided the 36th vote for ratification of the 16th Amendment to the Constitution which created the federal income tax. With increased taxation, the federal government also began spending more money. The effects are striking.
Using data from economists at Saint John’s University and Miami University summarized in the chart above, we see that prior to 1913, real growth rates (after accounting for inflation) averaged roughly 4.2% per year. Meanwhile, the economy only contracted in nine out of more than 100 years. At the same time, there were 43 out of 100 years with better than 5% real growth.
Since 1913, real growth rates have averaged only 3.4% per year--almost a full percentage point less each year than pre-income-tax levels. At the same time, the number of contraction years increased from 9 to 22 out of slightly less than a hundred, while the number of years with better than 5% real growth shrank from 43 to only 29. Worse still, after the creation of the modern welfare state in response to the Great Depression, each passing decade has seen declining growth rates. The last year with real GDP growth above 5% was 1984.
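The before-and-after comparison above is simple to reproduce. As a minimal sketch--using invented growth figures as placeholders, NOT the actual Saint John's/Miami University dataset--the three statistics cited (average real growth, contraction years, and years with better than 5% growth) can be computed like so:

```python
# A minimal sketch of the comparison above. The growth figures here are
# invented placeholders, NOT the actual Saint John's/Miami University data.
pre_1913 = [4.5, 5.2, -1.0, 6.1, 3.8, 5.5, 2.9]    # hypothetical annual real growth, %
post_1913 = [3.1, -2.0, 4.0, 2.5, -0.5, 3.6, 5.1]  # hypothetical annual real growth, %

def summarize(rates):
    """Return (average growth, contraction years, years above 5% growth)."""
    avg = round(sum(rates) / len(rates), 2)
    contractions = sum(1 for r in rates if r < 0)
    boom_years = sum(1 for r in rates if r > 5)
    return avg, contractions, boom_years

print("pre-1913: ", summarize(pre_1913))
print("post-1913:", summarize(post_1913))
```

With the real pre- and post-1913 series plugged in, this yields the averages and year counts quoted above.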
Some will argue that it’s not fair to compare growth rates during the 19th and 20th centuries due to technological and societal changes. However, the agricultural innovations and the invention of the microchip in the late 20th century are no less revolutionary--and probably more so--than the invention of the steam engine which launched the 19th-century industrial revolution. If anything, growth today from the agricultural and information revolutions should exceed that of the 1800’s. Instead, we live in a time of economic atrophy.
It’s clear from this exercise that as government taxation and spending have increased as a proportion of economic output, the growth of the economy has slowed to today’s virtual standstill. Despite living in an era of unprecedented technological advancement, the economy is so overburdened with taxation, government spending, and debt that a year with 5% real GDP growth is now rare.
Obama and his liberal allies claim that more taxation and spending will save us from our current plight, but even a cursory look at history shows just how absurd this Keynesian prescription really is. The hard truth is that the government really has no power to help the economy. Despite the best efforts of tinkering technocrats, the economic growth of the last 60 years has been pathetic compared to historical averages. Rather, the lesson of history shows that the government can do very little to help the economy, but is quite capable of doing harm.
Monday, July 18, 2011
With talks on raising the debt limit grinding along in Washington, D.C., it's worth considering just how big the problem really is. President Obama and the Democrats have been pushing tax increases as a solution to the nation's budget woes, but don't think for a moment that such increases will only affect the super rich. When the tax man comes around, we're all going to pay the price for the government's profligacy.
According to the Congressional Budget Office, the estimated 2011 U.S. federal budget deficit is $1,480 billion. But what does that really mean? Comparing this number to data from the International Monetary Fund, the U.S. federal deficit is roughly the same as the Gross Domestic Product of the following countries and regions:
- Canada - $1,574 billion
- India - $1,537 billion
- Russia - $1,465 billion
- Spanish-speaking South America (i.e., except Brazil) - $1,443 billion
- Spain - $1,409 billion
- Australia and New Zealand combined - $1,375 billion
- All of Africa, minus South Africa - $1,357 billion
Monday, July 11, 2011
During the 2008 election, vice-presidential candidate Sarah Palin was pilloried by the media for her initial support of the $400 million “bridge to nowhere” that would have been funded with federal taxpayer dollars. However, this is nothing compared to the vastness of the mounting debt crisis. A few hundred million dollars is laughable compared to a national debt that is measured in tens or even hundreds of trillions.
While discussing the national debt limit with a friend, he compared the looming debt limit to a car that is low on gas. He argued that just as you would not slam on the brakes when you realize that you need to stop for gas in 20 miles, politicians should not treat the debt limit as a crisis. This is a useful metaphor, and perhaps it has some validity, but even if the debt limit is raised, our nation will still be in dire straits.
Yes, the car of state, as it were, is low on gas. However, this is a better analogy for the problems of the private sector economy. Businesses in this country are burdened by a combination of too much uncertainty in all the wrong places plus plenty of bad certainty about over-regulation and excessive taxation. As a result, the economy is sputtering and we are on the verge of an even deeper double-dip recession.
However, the car of state is also being pushed by the government towards a bridge that has not yet been fully built. As in the climactic scene from “Back to the Future Part III,” we have already passed the point of no return, but unlike the movie, there is no such thing as time travel. The debt problem is not going to go away by itself. Instead of gliding gracefully across a bridge to the future, we are racing towards a bridge to nowhere.
As of today, the official national debt is hovering around $14.3 trillion, or about 98% of gross domestic product. Meanwhile, the balance sheet for Social Security, Medicare, and Medicaid contains about $115 trillion in red ink, or roughly eight times the current gross domestic product. Even if the government confiscated 100% of all income, assets, and property from every man, woman, and child in the nation, we still wouldn’t have enough to cover our current obligations.
As the deadline for a deal to raise the debt limit nears, the Obama Administration is asking Republicans to balance cuts in spending with tax increases, but increasing the debt limit and cutting spending will accomplish nothing unless politicians have the courage to reform the choking burden of entitlement spending.
Unlike the movie, it doesn’t make any difference if we go over the economic cliff at the magic Hollywood time-travel speed of 88 miles per hour or 5 miles per hour--the end result will be the same. If we are to avoid national bankruptcy, we don’t just need to slam on the brakes; we need to throw the engine into reverse. Current proposals call for cutting between $2 and $4 trillion from spending over the next decade, but unless politicians can add a few zeroes to those figures, we are only delaying the inevitable catastrophe.
Tuesday, July 5, 2011
In honor of Independence Day, the Prolix Patriot and the Missus watched two movies over the weekend listed under the “patriotic” section on the DVR: “Flags of Our Fathers” and “1776.” At first glance, the dark and ugly cynical realism of the former and the lighthearted Technicolor ebullience of the latter would seem to have little in common, but on a deeper level, the two films both explore the very essence of what it means to be an American.
A repeated theme in “Flags” is that the world-famous picture wasn’t actually of the first flag-raising. The six men immortalized in bronze on a hill overlooking Washington, D.C.--five Marines and a Navy corpsman--were actually the second group to raise a flag at the summit of Suribachi. Similarly, in “1776,” we learn that the vote for independence we celebrate on the fourth was actually achieved on the second. The fourth was simply the day that the Declaration of Independence was made public.
In addition to the temporal confusion surrounding both events that persists to this day, both movies are at pains to remind us that the men who took part in these famous events were ordinary people. A common motif in “Flags” is the facelessness of the Marines fighting at Iwo Jima. The photo of the second flag-raising was especially poignant--and popular--because those men could have been anybody’s son, anybody’s father, or anybody’s brother.
Similarly, men like Roger Sherman of Connecticut, Judge James Wilson of Pennsylvania, Dr. Lyman Hall of Georgia, and Governor Stephen Hopkins of Rhode Island are largely forgotten today despite their impressive accomplishments. Aside from the mostly erroneous dramatizations of these lesser-known figures in “1776,” even the more famous founding fathers were ordinary men with ordinary desires. Thomas Jefferson had an obsession with macaroni and George Washington had a sweet tooth for ice cream--he probably even got brain-freeze from time to time.
Finally, we must remember that the events depicted in both films were only the beginning of the fighting. The flag-raising on Suribachi came just four days after the landing, in a battle that would continue for another full month before the island was finally secured on March 26, 1945. Likewise, the Declaration of Independence would not be recognized by Britain until after the victory at Yorktown in 1781 and the Treaty of Paris in 1783, which in turn was not ratified until January 14, 1784.
In a larger sense, this is the story of America. The movie “1776” has Benjamin Franklin counseling John Adams at the climactic moment of the plot:
Besides, what would posterity think we were? Demi-gods? We're men, no more no less, trying to get a nation started against greater odds than a more generous God would have allowed. First things first, John. Independence; America. If we don't secure that, what difference will the rest make?

Even though Franklin probably never uttered those exact words, the movie correctly summarizes the founders' belief that the future of this new thing called the “United States of America” would depend on the care and judgment of future generations. The founders gave us independence and the Constitution, but they left the rest to us.
The sacrifices of those who fought and died on Iwo Jima and in every battle before and since make clear that the fight for freedom is one that never ends. All of us have a duty to never stop fighting for those self-evident rights of freedom of speech, freedom of religion, and freedom from persecution and tyranny which were endowed to us by our creator. It is easy to feel patriotic on the Fourth of July. What matters though is what we do on the other 364 days of the year.
Friday, June 24, 2011
With this week's announcement of "worse than expected" unemployment numbers, it is worth investigating how often the experts have gotten their predictions wrong. While admittedly an unscientific approach, a comparison of Google search results is a quick and easy way to compare news coverage of the economy. Below are screen captures of the Google results for the phrases "better than expected" and "worse than expected" plus unemployment.
While both searches show an increasing trend over time, this is probably attributable to better access to recent data with the growth of the Internet. On the other hand, there is a clear bias in favor of "worse than expected" economic reports by a margin of about 28,300 to 18,100, or roughly three "bad" stories for every two "good" ones.
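The three-to-two figure follows directly from the two result counts quoted above; a back-of-the-envelope check confirms it:

```python
# Back-of-the-envelope check of the "three bad stories for every two good
# ones" claim, using the Google result counts quoted above.
worse, better = 28_300, 18_100
ratio = worse / better
print(f"{ratio:.2f} 'worse than expected' stories per 'better than expected' story")
# A ratio near 1.5 is roughly three "bad" stories for every two "good" ones.
```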
It is impossible to predict the future with any certainty, so it's hardly newsworthy when reality fails to meet projected targets. However, we should be concerned when journalists distort uncertain economic predictions in order to manipulate public opinion. If journalists were honest in their reporting, we would expect an even number of articles for better than expected and worse than expected economic performance.
By consistently favoring bad news which undermines consumer confidence, journalists create a self-fulfilling prophecy of negative feedback that makes it all the more difficult for the economy to recover. Worse still, this unsophisticated approach to economics also undermines the vital role of the press in maintaining the educated public which is at the core of republican self-government.
Wednesday, June 15, 2011
Like the many-headed Hydra of Greek legend, the federal government is a massive and poisonous beast that defies all efforts to control it. Despite Herculean efforts at reforming government over the past several decades, the deep cuts, layoffs, and efficiency gains that were promised have never materialized. Worse still, despite many attempts at reform, the growth of government waste and abuse has only accelerated.
In 1978, President Jimmy Carter fulfilled a campaign promise to make it easier to fire incompetent federal employees with the Civil Service Reform Act. Nevertheless, anybody who has worked for or interacted with the federal government knows that incompetence is still rife. Even under the improved regulations, it still takes months of repeated offenses and endless piles of red tape to get rid of problem employees.
Because of the difficulty of eliminating incompetent civil servants, President Bill Clinton sought to reform the procurement process so that the federal government could more easily hire (and fire) contractors while at the same time eliminating the burden of lavish government benefits and pensions. Despite these changes, it is harder than ever for businesses to win government contracts and the government is spending more than ever on litigation due to conflicts of interest and non-performance.
Then, in 2001, President George W. Bush articulated his President’s Management Agenda, which sought to eliminate underperforming government programs and duplication of effort. Despite ten years under the new guidelines, Congress has created countless new commissions, agencies, and panels of bureaucrats with new powers while President Obama has filled his administration with “czars” for every conceivable pet issue.
Earlier this week, the Obama Administration’s Director of Digital Strategy observed in a blog post that “an overall online landscape of literally thousands of websites – each focusing on a specific topic or organization – can create confusion and inefficiency” and set a goal of eliminating half of them. If only the Obama Administration had figured this out sooner, we might have been spared the onslaught of new government websites that began with the dubious change.gov, continued with recovery.gov and healthcare.gov, and goes on and on.
While it is refreshing to hear somebody in the Obama Administration talk about making government smaller in some way, the lessons of history are clear: every attempt at reform that originated with the President has met with failure. Congress has overruled attempts at reform by creating new offices, new regulations, and increasing spending.
If we are to bring the vast federal bureaucracy under control, a new mindset is needed in Congress. Instead of creating new laws to tackle problems, Congress needs to revisit the laws that are already on the books and eliminate provisions that aren’t working or are no longer needed. Instead of hiding pet projects in massive 2,000 page omnibus spending bills, every dollar the government spends should be held up to scrutiny and every program should be forced to defend its existence.
Otherwise, the leviathan of government, like the Hydra of myth, will only continue to grow with a plethora of new and even uglier faces.
Friday, May 20, 2011
President Obama announced with great fanfare yesterday his solution to the Israel-Palestine conflict: just rewind the clock to 1967 and everybody will be happy!
There's one big problem with this solution though. Prior to the 1967 six-day war, the territories of the West Bank and Gaza Strip that now constitute the Palestinian Authority were controlled by Jordan and Egypt respectively. Thus, prior to 1967, the sum total of Palestinian lands was exactly zilch, zero, nothing, nada, etc.
In essence, Obama is telling one of our closest allies to give up more land to an entity that was created from Israeli territory under the 1993 Oslo Accords. In return, Israel will get...nothing. Even after previous Israeli concessions in the Gaza Strip and West Bank, terrorist attacks have continued unabated. Obama's rhetoric may win points with his UN admirers, but it is not a serious path to peace.
This is reminiscent of the old Jewish folktale about Herschel of Ostropol, who was fond of playing practical jokes. In one popular re-telling for children, Herschel uses a dreidel and the false logic of "heads I win, tails you lose" to outsmart one of the Hanukkah goblins that has been terrorizing a small town. It's a safe bet that the Israelis will not fall for the same trick.
Monday, May 16, 2011
Earlier in these pages, we compared DC's proposed streetcar network and existing Metro system to the original DC Transit, circa 1958 when the system was at its zenith.
Today, prompted by a conversation with the author of the Blog of the Courtier, we endeavor to show how mass transit is a critical ingredient for population density. Using the same streetcar map and comparing it to the 2010 census population data for each of the 180-odd census tracts within the District, we can see an almost uncanny coincidence between areas of high density and the original streetcar network.
Prior to the 1960's, there were few bridges across the Anacostia river, so large parts of far Southeast were never served by streetcars. However, at the same time, large areas of Northeast D.C. which are not served by Metro had extensive streetcar service.
Intuitively, it makes sense that density coincides with streetcars. Proponents of bus service argue that routes can be changed easily to accommodate changing trends and population shifts--indeed, Metro has a page dedicated to bus route changes--but this is exactly why bus lines do not promote density. Metro can decide on a whim to reroute the bus line that used to take you straight to your office.
On the other hand, a streetcar or subway line cannot be moved very easily, so developers and residents have confidence that they will continue to have regular transit service. Even after more than 50 years, the only areas of higher density that have emerged in the district are the pockets in far Southeast that are now served by Metro.
Wednesday, April 20, 2011
Let us ignore some of the more glaring inaccuracies with the new "Tax Receipt" calculator that was created by the White House on Monday and instead focus on the more subtle manipulation of the data to accomplish President Obama's political goal of shifting attention away from entitlement reform.
Using the White House's default setting for a family of four making $80,000/year, the application helpfully breaks down how much of this hypothetical family's taxes go to different categories. However, what immediately jumps out is that the (large) portion of taxes that go to Social Security and Medicare are not included in the percentages even though they are counted towards the total spending at the bottom.
By omitting entitlements, the percentage for defense spending is inflated drastically so that it appears as the largest share of spending, just narrowly edging out the next largest category--health care. However, if we include entitlements, the picture changes dramatically. According to the White House numbers, defense spending is really only 12% of overall spending, not 26%.
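The arithmetic behind this distortion is straightforward: removing a large category from the denominator inflates every remaining share. As an illustration with invented round numbers (NOT the White House's actual figures):

```python
# Illustration of the denominator trick described above. The figures are
# invented round numbers, NOT the actual White House tax-receipt data.
spending = {"defense": 12, "health": 11, "entitlements": 40, "other": 37}  # % of total

full_total = sum(spending.values())                    # 100
trimmed_total = full_total - spending["entitlements"]  # 60

share_with_entitlements = spending["defense"] / full_total
share_without_entitlements = spending["defense"] / trimmed_total

print(f"defense share, entitlements included: {share_with_entitlements:.0%}")
print(f"defense share, entitlements omitted:  {share_without_entitlements:.0%}")
```

In this toy example, a 12% slice of true spending masquerades as a 20% slice once entitlements vanish from the denominator--the same mechanism that turns the White House's 12% into an apparent 26%.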
Also, if we look at the broad categories of welfare spending vs. government spending on constitutionally enumerated powers like defense, law enforcement, and foreign relations, we can easily see that spending on the welfare state in this country accounts for more than 80% of the hypothetical family's tax payments for that year.
Worse still, if we conjecture that the parents of this family are under the age of 40, it is almost guaranteed that neither parent will ever see a dime of the money that they are currently paying into the system, because Social Security and Medicare will be completely bankrupt and interest on the national debt will have overtaken non-defense discretionary spending by the time they retire. Meanwhile, their children will be left with a debt they will never be able to pay down.
The hard truth that Obama doesn't want you to know is that the 20th century welfare state is at its breaking point. The welfare programs currently in place benefit some, but unless Congress is able to implement drastic reforms, these programs will cease to function within our lifetimes. The choice now is clear: we can follow President Obama in ignoring and obfuscating the problem, or we can have the courage to make serious reforms.
Friday, April 8, 2011
It's been a while since we last had one of these. In today's installment, you have to read the fine print to discover it's not actually a mistake. Tax Day will be postponed this year due to a District of Columbia holiday. If you've ever seen one of those D.C. license plates that says "Taxation Without Representation," now you know that it's a lie.
Wednesday, April 6, 2011
With the threat of a federal government shutdown growing with each passing day, it’s important to understand what functions would and would not continue operating after this Friday if Democrats and Republicans can’t reach a last-minute compromise. The list below is by no means complete, but provides at least a partial accounting of some of the far-reaching effects of a shutdown:
WON’T SHUTDOWN: the US Postal Service is considered self-funding, so mail will continue to be delivered. However, paychecks for millions of government employees and contractors will not be sent out, so postal workers will have slightly less mail to sort through.
WILL SHUTDOWN: all national museums, parks, memorials, cemeteries, and historical sites will be closed to the public. The Park Police will still patrol the monuments to ensure that nobody takes any pictures or listens to music on their headphones though.
WON’T SHUTDOWN: the IRS will continue to collect taxes, so don’t feel like you can slack off as tax day approaches next Friday, but if the government owes you money, don’t hold your breath waiting for your tax refund. Meanwhile, the Federal Reserve will still continue to print worthless currency to fund the existing debts of the United States.
WILL SHUTDOWN: all non-essential government offices will be closed and it will be a fireable offense for government employees to attempt to do any work during a shutdown. Some agencies are even exploring ways to remotely shut down the ubiquitous BlackBerry phones wielded by government employees.
WON’T SHUTDOWN: Metro will continue to run at full capacity even if all the trains are empty. Plush union contracts will also guarantee that even though nobody is riding Metro, senior employees will continue to accrue overtime hours while not fixing the broken escalators at Farragut West.
WILL SHUTDOWN: all personnel aboard the International Space Station will be requested to enter deep hibernation until the budget situation is resolved. Operation of the space station will be transitioned over to control by a new computer system called “HAL 9000.”
WON’T SHUTDOWN: the Sun will continue to shine. US Department of Agriculture farm subsidies will continue to be funded under the “provision of payments” clause which defines essential government functions. However, all federally-subsidized solar power plants will have to be pointed away from the Sun, unless the weather is cloudy.
WILL SHUTDOWN: the Moon will go dark and the tides of the ocean will cease. Functions of the commerce department relating to measurement of oceanographic phenomena will be suspended and as mentioned previously, NASA will be closed. During the shutdown, only the dark side of the Moon will be visible. The Earth will also stop spinning and fire and death will rain down from the heavens.
Tuesday, March 8, 2011
For Catholics and many Protestants, today is the last day before the fasting and penance of Lent. Known variously as Fat Tuesday, Mardi Gras, or Shrove Tuesday, this day began as a celebration in anticipation of the joy of Easter but has lately become just another commercialized excuse for bad behavior. However, what many people have forgotten is that today is also the last day of winter--or at least it should be.
A popular urban legend has it that spring does not begin until March 20 this year. In recent years, calendar makers and the mainstream media have popularized the misguided idea that the equinox is the precise moment that winter ends and spring begins even though the changing of the seasons is a natural and gradual process that cannot be exactly predicted from one year to the next.
March is a month of transition, so instead of an abrupt switch on March 20, evidence that the process of spring has already begun is everywhere. That process is captured by the adage that March “comes in like a lion and goes out like a lamb.” The high winds and freezing cold of February are already giving way to the verdure of spring.
We now live in an age where, for the first time in the history of the world, a majority of people live in cities and are detached from nature, so it is easy to understand why this urban legend has become so popular. Worse still, we are now subjected to faux-intellectual clichés such as the perennial, “it’s snowing in December but it’s not even winter yet.”
Such oxymoronic statements are evidence that the choice of any particular date for the beginning of each season is at best arbitrary. However, if we must choose a date, Ash Wednesday seems as good a choice as any, especially if we consider that the word “Lent” is derived from the old Germanic word for "spring." In the Mid-Atlantic region, this is a particularly good fit for the reasons below.
From a meteorological standpoint, the coldest and snowiest days of the year run from the first week in December to the first week in March:
1) In the Mid-Atlantic region, the average daily temperature drops to its lowest level almost exactly at the midpoint between December 1 and March 1, not at the midpoint between the winter solstice and the vernal equinox. Although temperatures are different, the midpoint of the temperature curve is similar in other regions.
2) Comparing daily temperatures for December and March, almost every day in December is colder than the same day in March.
3) Average daily snowfall is highest throughout the months of December to February, not from December 20 to March 20.
From a botanical standpoint, deciduous plants start growing in late February and go into dormancy around early December:
4) The first blooms of spring--especially crocuses and daffodils--can be found in early to mid-March, or sometimes even late February, well before the vernal equinox.
5) Depending on location, the changing leaves achieve their peak color in late September through early November and start to fall very soon thereafter.
6) The average date of the last overnight frost of spring, and thus the time for planting crops, ranges from mid-February in the South to mid-May in the Mountain West. Thus, the equinox is actually closer to the midpoint of spring from an agricultural standpoint.
From a cultural and historical standpoint, the season of spring has always included most or all of the month of March and possibly even February, whereas summer is understood to begin in June, May, or even late April, depending on whom you ask:
7) Groundhog Day (or Candlemas), which is celebrated on February 2, is a tradition which sometimes results in a proclamation of “early spring.” Although the predictions of Punxsutawney Phil and his cousins are wildly inaccurate when compared to actual weather data, the tradition is nevertheless a recognition that mild weather sometimes comes before March 1 and certainly long before the vernal equinox.
8) As mentioned above, Lent, whose name is derived from the Germanic root for "spring," begins between February 4 and March 10; the average date of Ash Wednesday is February 21.
9) Especially in the South, Easter is the first day it is considered acceptable to wear white pants, shoes, etc. which are traditionally associated with summer. The average date of Easter is on April 8.
10) Similarly, the feast of the Nativity of St. John is a popular holiday celebrated across Europe in late June which is known as midsummer, implying that spring has already ended and summer has begun long before the summer solstice.
11) The name of March comes from ancient Rome, where the month--named Martius after Mars, the Roman god of war--was the first month of the year. In Rome, where the climate is Mediterranean, March was also the first month of spring, a logical point for the beginning of the year as well as the start of the military campaign season.
12) Data going back to 1974 indicate that retail prices for gasoline are higher on average for the months of May through September because the summer months are traditionally associated with car trips and vacations.
Finally, from a bureaucratic standpoint, no agency or body of the United States government has ever declared the solstices and the equinoxes as the “official” start and end of the seasons:
13) According to the National Oceanic and Atmospheric Administration, which falls under the U.S. Department of Commerce, Atlantic hurricane season runs from June 1 to November 30 and thus coincides exactly with the traditional dates for summer and fall.
14) According to the Energy Information Administration which falls under the U.S. Department of Energy, the “winter heating season” runs from October 1 through March 31 and thus corresponds to the traditional dates for winter, with some extra padding on either end. The midpoint of the winter heating season is December 31.
15) The U.S. Naval Observatory, which falls under the U.S. Department of Defense, provides a listing of the exact times of the equinoxes, solstices, perihelion, and aphelion titled “Earth’s Seasons,” but does not state that these astronomical phenomena are in any way the “official” start or end of the seasons.
16) By law, Daylight Saving Time (DST) now starts on the second Sunday in March and ends on the first Sunday in November. Prior to 2007, when the new law went into effect, DST started on the last Sunday in April and ended on the last Sunday in October. Just as the government definition of “winter heating season” includes padding on both ends of the traditional season, so too does DST include padding on either end of the traditional summer season. The midpoint of DST is in mid-July.
In summary, the folklore of the equinox as the first day of spring is a modern idea which did not become fashionable until very recently. It has likely caught on in the last few decades because the equinox offers a single precise moment, whereas natural markers such as the last frost or the peak of fall leaves vary from year to year and across the vastness of our continent, and can only be noted after they have occurred.
However, instead of relying on the false precision of the astronomical equinox, we should base our understanding of the seasons on natural phenomena that we observe around us. Indeed, at this moment here in Virginia, the first tender green leaves and pink buds are coming out on the trees and shrubberies, the grass is starting to turn green again, and chirping birds are once again heralding the arrival of sunrise. Regardless of what the calendar makers would have you believe, spring is here.