If the Only Way You Can Get Your Great Idea Implemented…

Economics textbooks are full of clever-and-appealing policy proposals.  Proposals like: “Let’s redistribute money to the desperately poor” and “Let’s tax goods with negative externalities.”  They’re so clever and so appealing that it’s hard to understand how any smart, well-meaning person could demur.  When critics appeal to “public choice problems,” it’s tempting to tell the critics that they’re the problem.  The political system isn’t that dysfunctional, is it?  In any case, reflexively whining, “The political system will muck up your clever, appealing policy proposal,” hardly makes that system work better.  The naysayers should become part of the solution: Endorse the clever-and-appealing policy proposals – and strive to bring them to life.

When you look at the real world, though, you see something strange: Almost no one actually pushes for the textbooks’ clever-and-appealing policy proposals.  Instead, the people inspired by the textbooks routinely attach themselves to trendy-but-awful policy proposals.  If you point out the discrepancy, they’re often too annoyed to respond.  When they do, reformers shrug and say: “The clever-and-appealing policy has never had – and probably never will have – much political support.  So we have to do this instead.”

Examples?  You start off by advocating high-impact redistribution to help poor children and the severely disabled… and end up defending the ludicrously expensive and wasteful Social Security program.  “Unfortunately, the only politically viable way to help the poor is to help everyone.”  Or you start off advocating Pigovian taxes to clean the air, and end up defending phone books of picayune environmental regulations.  “Unfortunately, this is the way pollution policy actually works.”

Don’t believe me?  Here’s a brand-new example courtesy of Paul Krugman:

But if a nation in flames isn’t enough to produce a consensus for action — if it isn’t even enough to produce some moderation in the anti-environmentalist position — what will? The Australia experience suggests that climate denial will persist come hell or high water — that is, through devastating heat waves and catastrophic storm surges alike…

[…]

But if climate denial and opposition to action are immovable even in the face of obvious catastrophe, what hope is there for avoiding the apocalypse? Let’s be honest with ourselves: Things are looking pretty grim. However, giving up is not an option. What’s the path forward?

The answer, pretty clearly, is that scientific persuasion is running into sharply diminishing returns. Very few of the people still denying the reality of climate change or at least opposing doing anything about it will be moved by further accumulation of evidence, or even by a proliferation of new disasters. Any action that does take place will have to do so in the face of intractable right-wing opposition.

This means, in turn, that climate action will have to offer immediate benefits to large numbers of voters, because policies that seem to require widespread sacrifice — such as policies that rely mainly on carbon taxes — would be viable only with the kind of political consensus we clearly aren’t going to get.

What might an effective political strategy look like? … [O]ne way to get past the political impasse on climate might be via “an emphasis on huge infrastructural projects that created jobs” — in other words, a Green New Deal. Such a strategy could give birth to a “large climate-industrial complex,” which would actually be a good thing in terms of political sustainability.

Notice the pattern.

Step 1: Economics textbooks offer a clever-and-appealing policy proposal: Let’s tax carbon emissions to curtail the serious negative externalities of fossil fuels.  It’s cheap, it’s effective, it provides great static and dynamic incentives.  Public choice problems?  Don’t listen to those naysayers.

Step 2: Argh, Pigovian taxes are going nowhere.

Step 3: Let’s have a trendy-but-awful populist infrastructure program to get the masses on board.

So what?  For starters, any smart activist who reaches Step 3 tacitly concedes that public choice problems are dire.  You offer the public a clever-and-appealing remedy for a serious social ill, and democracy yawns.  To get action, you have to forget about cost or cost-effectiveness – and just try to drug the public with demagoguery.

Note: I’m not attacking Krugman for having little faith in democracy.  His underlying lack of faith in democracy is fully justified.  I only wish that Krugman would loudly embrace the public choice framework that intellectually justifies his lack of faith.  (Or better yet, Krugman could loudly embrace my psychologically-enriched public choice expansion pack).

Once you pay proper respect to public choice theory, however, you cannot simply continue on your merry way.  You have to ponder its central normative lesson: Don’t advocate government action merely because a clever-and-appealing policy proposal passes a cost-benefit test.  Instead, look at the trendy-but-awful policies that will actually be adopted – and see if they pass a cost-benefit test.  If they don’t, you should advocate laissez-faire despite all those shiny ideas in the textbook.
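
To see the lesson in action, run the numbers – purely illustrative numbers of my own, not anyone’s official estimates.  Suppose the textbook carbon tax would deliver $100 billion a year in climate benefits at a cost of $30 billion; that’s a net gain of $70 billion, so it passes.  Suppose the Green New Deal that actually gets adopted delivers the same $100 billion in benefits at a cost of $150 billion; that’s a net loss of $50 billion, so it fails.  Laissez-faire nets $0 – and $0 beats –$50 billion.  The textbook policy’s passing grade is beside the point, because the textbook policy isn’t on the menu.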

Krugman could naturally reply, “I’ve done the math.  Global warming is so terrible that trendy-but-awful policies are our least-bad bet.”  To the best of my knowledge, though, this contradicts mainstream estimates of the costs of warming.  That aside, why back a Green New Deal instead of deregulation of nuclear power or geoengineering?  If recalcitrant public opinion thwarts your clever-and-appealing remedy, maybe you started out on the wrong path in the first place.

Unfair?  Well, this is hardly the first time that Krugman has rationalized destructive populism when he really should have reconsidered.  Krugman knows that immigration is the world’s fastest way to escape absolute poverty.  He knows that standard complaints about immigration are, at best, exaggerated.  But he’s still an immigration skeptic, because:

The New Deal made America a vastly better place, yet it probably wouldn’t have been possible without the immigration restrictions that went into effect after World War I. For one thing, absent those restrictions, there would have been many claims, justified or not, about people flocking to America to take advantage of welfare programs.

Notice the pattern.

Step 1: You start with the textbook case for a welfare state to alleviate domestic poverty.  Public choice problems?  Bah.

Step 2: Next, you decide that you can’t get that welfare state without horrible collateral damage.

Step 3: So you casually embrace the status quo, without seriously engaging obvious questions, like: “Given political constraints, perhaps it’s actually better not to have the New Deal?” or even “How close can we get to the New Deal without limiting immigration?”

The moral: If the only way you can get your great idea implemented is to mutilate it and/or package it with a pile of expensive junk, you really should wonder, “Is it still worth it?”

Well, is it?

Iraq: America’s Other “Longest War”

As the calendar prepared to flip from 2019 to 2020, protesters stormed the US embassy in Baghdad.  As I write this, the action — a response to US airstrikes in Iraq and Syria which killed at least 25 and wounded more than 50 — hasn’t yet become a reprise of the Iran hostage crisis of 40 years ago, but it’s eerily reminiscent.

Although few Americans seem to notice, Iraq is arguably the second-longest war in US history.

Mainstream media often refer to the 18-year US occupation of Afghanistan as “America’s longest war.” That claim is wrong on its face.

Setting aside a century of “Indian wars” and two decades of involvement in Vietnam prior to the 1965 escalation, the Korean War handily takes the “longest war” prize:  It began in 1950 and has merely been in ceasefire status, with occasional flare-ups and no final settlement, since 1953. If wars were people, the Korean War would be collecting Social Security.

The US war in Iraq is approaching its 29th birthday, also with no end in sight.

It began in January of 1991 with Operation Desert Storm (“the liberation of Kuwait” from Iraqi occupation). The 12 years between that “mother of all battles” and the 2003 US invasion were punctuated by US bombings to facilitate a Kurdish secession movement in the north, protect persecuted Shiites in the south, and provide convenient distractions from assorted Clinton administration peccadilloes.

Following the short, sharp conventional fighting phase of the invasion, the war remained a very hot conflict — a combination of civil war and anti-occupation insurgency — for years following US president George W. Bush’s “mission accomplished” announcement in May of 2003.

A brief cooling period accompanied Barack Obama’s 2009 inauguration, but by 2014 American troops (and “civilian contractors,” i.e., mercenaries) were once again arriving to intervene in the new regime’s fight against the Islamic State of Iraq and Syria (ISIS).

The airstrikes which sparked the current protests were carried out in response to a rocket attack on a regime military base in which one of the aforementioned American mercenaries was killed.

The bigger picture:

The US government is using Iraq as a staging area for its ongoing actions in Syria and against Iran (which it blames for this specific rocket attack and for its backing of militias in Iraq in general).

US president Donald Trump talks a good “let’s get out of all these stupid wars” game. But in actuality he has increased, and continues to increase, the size of US military deployments to, and the tempo of US military operations in, the Middle East and Central Asia.

Several thousand US troops remain in Iraq and the war looks likely to stretch into a fourth decade.

There is, of course, an alternative: Trump could put his money where his mouth is and begin withdrawing US troops from the region instead of continuing to pour American blood and treasure into a series of conflicts which should never have happened in the first place.

Peace on Earth? Maybe not. But the US going home and minding its own business would be a good start.

Adventure May Never Find You

There’s an old motif in adventure stories, in fantasy and in history alike: the arrival of the guide and the call to adventure.

It’s young dweeby Steve Rogers being picked to be Captain America. It’s Gandalf rapping on Bilbo Baggins’ door. It’s Aslan transporting the Pevensie children into Narnia.

In these stories, adventure seems to find people and more or less push them into it.

That happened for a lot of young men of my grandfather’s generation. They had a depression and a world war that brought danger and risk and adventure to them. Many of them responded admirably.

But many of us now live in the richest societies that have ever existed. We have shopping malls and iPhones and advanced medical care and food delivery apps competing for our business.

We have bubbles into which we can go and never come out.

Maybe like me you happen to have been born or happen to have ended up in one of these bubbles (and I’d argue that most of the United States is a bubble). While there are multiple calls to adventure even for us, it’s possible to never hear those calls, or at least to not recognize them. Adventure may never find us where we are. If it does, we might find ourselves waiting for a long time.

The psychologist Nathaniel Branden was fond of saying “No one is coming to save you.” I would paraphrase to say that (probably) no one is coming to call you on an adventure, at least not when you’re inside a bubble of security and comfort. So there’s no point in sitting around and waiting. Adventure is outside the bubble. It won’t find you in there, but you may find it out there.

JEDI Mind Tricks: Amazon versus the Pentagon and Trump

Amazon is one of the largest companies in the world, boasting revenues of more than $230 billion last year. But last month the company sued the US Department of Defense over a paltry potential $10 billion spread over ten years.

Amazon lost out to Microsoft in bidding for the Pentagon’s Joint Enterprise Defense Infrastructure (yes, JEDI, because the most important part of a government program is coming up with a cool acronym) cloud computing program.

Amazon claims it lost the contract due to, well, JEDI mind tricks — “improper pressure” and “repeated and behind-the-scenes attacks” — played by US president Donald Trump on the Pentagon to set its collective mind against his perceived political opponent, Amazon president (and Washington Post owner) Jeff Bezos.

If so, Trump’s mind tricks pale next to the mind tricks used to justify the notion that the Pentagon needs a billion dollars a year to buy its own specialized, proprietary cloud computing system — one that the DoD’s own fact sheet boasts is merely “one component of the larger ecosystem that consists of different cloud models based on purpose” — from Microsoft, from Amazon, or from anyone else.

The great thing about cloud computing is that it’s a 50-year-old concept, generally available for years now in numerous off-the-shelf versions. The Pentagon doesn’t need its own cloud computing system any more than it needs its own brand of staplers.

Some JEDI knights might protest that the US armed forces need sturdier security than the everyday user, justifying a proprietary system. Per the fact sheet, “NSA, CYBERCOM, and the intelligence community provided input into JEDI’s security requirements.”

I suspect we’re talking about the same NSA, CYBERCOM and intelligence community we’ve listened to whine for the last 30 years about how civilian encryption technologies and other privacy protections are just too darn good and should be artificially hobbled to make them easier to crack.

Global Firepower lists 2019 defense budgets for 137 of the world’s countries. Of those countries, 61 — nearly half — spend less than $1 billion per year on their entire armed forces. That is, less than the Pentagon wants to spend per year on a single computing system.

It’s not Amazon who’s getting screwed here, it’s the American taxpayer. JEDI is Pentagon budget padding at one end and corporate welfare at the other, not an essential element of a robust national defense.

In other news, US Defense Secretary Mark Esper still hasn’t found the droids he’s looking for.

Rose Wilder Lane: Pioneer of Educational Freedom

My eight-year-old daughter Abby recently started reading Little House in the Big Woods by Laura Ingalls Wilder. It was prompted, in part, by watching the Little House on the Prairie television episodes with her great-aunt. Coincidentally, I have been reading more lately about some of the key women in history who promoted the ideals of individual freedom, limited government, non-coercion, and voluntary cooperation through trade. Rose Wilder Lane is one of these women. She was born on this day in 1886.

Liberty Should Always Trump Coercion

The daughter of Laura Ingalls Wilder and Almanzo Wilder, baby Rose is the child many of us remember from the ninth Little House book, The First Four Years. Perhaps those years of growing up on the prairie instilled in Lane a sense of rugged individualism and self-reliance that ultimately found their way into her writings throughout the 20th century. By the late 1920s, she was said to be one of the highest-paid women writers in the US. She became an outspoken critic of Roosevelt’s New Deal, Social Security, and other government programs she felt disempowered individuals and gave greater authority to the state.

In her 1943 book The Discovery of Freedom, Lane makes a compelling case for individual freedom and limited government power. She traces the roots of compulsion in many areas of life, including education, and explains why liberty should always trump coercion. She writes:

American schooling is now compulsory, enforced by the police and controlled by the State (that is, by the politicians in office) and paid for by compulsory taxes. The inevitable result is to postpone a child’s growing-up. He passes from the authority of his parents to the authority of the police. He has no control of his time and no responsibility for its use until he is sixteen years old. His actual situation does not require him to develop self-reliance, self-discipline and responsibility; that is, he has no actual experience of freedom in his youth. (pp. 259-60).

Lane goes on to say that this type of American education, imported from Prussia by 19th-century education reformers, “is ideal for the German state, whose subjects are not expected ever to know freedom,” but it is “not the best preparation for inheriting the leadership of the World Revolution for freedom” (p. 260). She laments the “substitution of compulsory State education for the former American free education,” saying that formerly “American children went to school because they wanted to go, or because their parents sent them,” not because it was mandated of parents under a legal threat of force (p. 258).

As Abby digs into the Little House series (which Lane was instrumental in helping to create as a record of her parents’ experiences), I learn alongside my daughter, fascinated by the life and works of baby Rose, who would grow up to become a pioneer of liberty.

Federal Gun Control in America: A Historic Guide to Major Federal Gun Control Laws and Acts

For Americans, the crux of gun control laws has been how to disarm dangerous individuals without disarming the public at large. Ever-present in this quest is the question of how the perception of danger should impact guaranteed freedoms protected within the Bill of Rights.

Not only is such a balancing act difficult as-is, but there are also two additional factors that make it even more challenging: America’s federal government is constitutionally bound by the Second Amendment, and politicians notoriously take advantage of tragedies to pass irrational laws when emotions are at their highest. As President Obama’s former Chief of Staff, Rahm Emanuel, once famously remarked:

“You never want a serious crisis to go to waste. And what I mean by that is an opportunity to do things you think you could not do before.”

This line of thought is not new to American politics. From the emancipation of enslaved Americans and the organized crime wave of the 1930s to the assassinations of prominent leaders in the 1960s and the attempted assassination of President Reagan in the 1980s, fear has proved a powerful catalyst for appeals about gun control.

Below is an overview of the history behind major gun control laws in the federal government, capturing how we’ve gone from the Founding Fathers’ America of the New World to the United States of the 21st century.

Second Amendment in America’s Bill of Rights: Ratified December 15, 1791

Congress added the Bill of Rights to the Constitution of the United States specifically “to prevent misconstruction or abuse of its powers.” The Second Amendment is the foundational cornerstone of every American’s right to bear arms, stating:

“A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.”

The right to bear arms was second only to the first – the most vital freedoms of religion, speech, the press, the right to assemble and the right to petition government for redress of grievances.  Meanwhile, conflicting views have left government and private interest groups struggling to reconcile technological advances and isolated but significant violent anomalies with the constitutional mandate protecting the natural right to self-defense, this most basic aspect of the Bill of Rights.

First and Second Militia Acts of 1792: Passed May 2 and 8, 1792

The U.S. Congress passed the Militia Acts of 1792 less than a year after the Second Amendment’s ratification. The first act’s purpose was “to provide for the National Defence, by establishing an Uniform Militia throughout the United States.” This measure established the need and command structure for a state-based militia. The second act defined conscription parameters for those militias, limiting armed service to “each and every free able-bodied white male citizen” aged 18 to 45.

Colonial Gun Regulations

Even today, the majority of firearms laws are state-based and vary considerably. While California, Connecticut and New Jersey have the most restrictive laws, Arizona, Vermont and Kentucky have some of the least stringent. For more than a century, the young United States relied primarily on “state” laws:

  • The earliest came from Virginia, the result of fear of attack by Native Americans. The 1619 law imposed a three-shilling fine on able-bodied men who failed to come armed to church on the Sabbath.
  • By 1640, slave codes in Virginia prohibited all “free Mulattos and Negroes” from bearing arms. In 1712, South Carolina enacted a similar law.
  • During this time in Virginia, gun laws for Native Americans were similar to those for white men – as they were not barred from possessing guns (unless they were gathering food on land held by white men). There were, however, prohibitions against providing “Indians” with weapons and ammunition. Native Americans could own weapons, but there were strict regulations on how they could obtain them.
  • Throughout the Antebellum South, Louisiana, Florida, Maryland, Georgia, North Carolina, Mississippi and even Delaware all passed multiple measures denying guns to people of color, requiring court-issued permits, and allowing search and seizure of weapons as well as punishment without trial.

Continue reading Federal Gun Control in America: A Historic Guide to Major Federal Gun Control Laws and Acts at Ammo.com.
