Social Media Statism and “Real” Libertarianism (29m) – Episode 024

Episode 024: Join your host as he swims in the cesspool of social media and discusses science deniers, armed protesters, and the difference between Libertarianism and Voluntaryism.

Listen to Episode 024 (29m, mp3, 64kbps)

Contact Jared by emailing voluntarycontrarian@gmail.com, on Twitter @TVC_Podcast, on Instagram @voluntarycontrarian, and on Facebook fb.me/TVCPodcast.

Subscribe via RSS here, or in any podcast app by searching for “voluntary contrarian”. Support the podcast at Patreon.com/evc or PayPal.me/everythingvoluntary.


Failing in a Crisis – And All The Time

Government-supremacists are desperately trying to interpret government’s actions during the pandemic in such a way as to make them seem smart– or at least honest. And they are failing. Hard.

As I have said before, political government is never a credible source. If your argument depends on government being a credible source, you’ve set yourself up to fail before you’ve begun. You’ve hitched your wagon to a mirage and kicked off down a steep, winding trail full of big rocks, potholes, and ditches. Things can only get worse from there.

Don’t take medical advice from government without checking credible sources first. Government is not your doctor, and any doctors working for government have rejected medicine for politics. You can’t mix medicine and politics without contaminating the medicine to the point of uselessness at best, and lethality at worst.

Don’t take government’s claims of scientific accuracy at face value without checking credible sources first. Government is not a scientist. Any scientists working for government gave up real science when they became political. For that matter, any scientist promoting a political agenda has betrayed the scientific method for politics and scientific thought for superstition. That’s not science.

Government is not an economist, a charitable organization, your parent, your master, your superior, your servant, a protector of rights, or a promoter of liberty. Government is not a safety team and is not on your side.

Government is a gang of thieves that uses initiated force to make you treat it as though it is all the good and helpful things it claims to be.

Government will not save you from the pandemic. Not even under the best-case scenario where they were right about the risks, did the right things at the right time, and didn’t do anything to make things worse.

This truth is something government-supremacists can’t take when it doesn’t align with “their side”– which it never will. It doesn’t matter whether they are the government-supremacists who support what was done or the government-supremacists who say government should have done something different. The truth is not with them. And they just keep digging themselves in deeper, to their discredit.

But it can be sort of fun to watch them flailing and failing– if you can ignore the fact that they are harming you with everything they advocate.


From Telework to Flexible Wages?

Lately I’ve been stunned by reports of nominal wage cuts.  They aren’t just in the news; several professionals that I personally know have received such cuts.  Employers routinely cut total pay during recessions by slashing bonuses and hours.  Even in good times, many employers cut real wages by freezing pay despite inflation.  Yet outright reductions of nominal base pay – hourly wages for hourly workers, base salary for salaried workers – have been exceedingly rare for as long as we’ve had data.  Economists have debated whether downward nominal wage rigidity is a bad thing, but virtually all agree that nominal wage cuts are rare.

What on Earth is going on in today’s labor market?

The simplest explanation is that the current recession is terrible.  Quite right; maybe it’s twice as terrible as the Great Recession.  But last time around, I heard zero first-hand reports of nominal wage cuts, and near-zero such stories in the news.  I can understand a doubling of incidents, but not this.

Another tempting tale: Workers today realize that they must take pay cuts or lose their jobs.  Alas, this trade-off is on the table during every recession.  And in every prior recession, falls in nominal base pay have stayed very rare.  What then is really afoot?

Let’s begin with a primordial fact: The best explanation for nominal wage rigidity is psychological. When employers cut workers’ nominal base pay, workers feel robbed and resentful.   This hurts morale, which hurts productivity, which hurts profits.  In contrast, when employers start doing layoffs, the fearful remaining workers respond by working harder.   Logically, of course, there’s no reason for workers to feel more robbed and resentful about a 1% nominal cut in the face of 0% inflation than a 0% raise in the face of 1% inflation.  Human beings, however, are not so logical.

Why then are nominal pay cuts suddenly on the table?  You could say, “Workers have suddenly become more logical,” but as far as I can tell, they’re crazier than ever.  But psychologically speaking, there is one radical and unprecedented change in the emotional experience of labor in the time of coronavirus: the explosion of telework.  Until recently, only 3% of workers teleworked, and a large majority of these teleworkers probably dropped by the office at least every week or two.  Now the telework share has plausibly multiplied tenfold, and our former offices are all but abandoned.

Loneliness is only the most obvious psychological effect.  Teleworkers have also lost most of their opportunities to complain and hear complaints, to feel bitterness and sow bitterness, to feel aggrieved and seek revenge.  As a result, I speculate, the effect of nominal wage cuts on morale has never been lower.

When an employer cuts the pay of a face-to-face work team, the workers constantly remind each other of the perceived affront.  They work down the hall from the executive they hold responsible for the pay cuts.  They see which fellow workers are standing up for themselves, and who’s kowtowing to The Man.  That’s how the classic mechanism – wage cuts –> bad morale –> low productivity –> reduced profits – worked.  Now, in contrast, teleworkers are stuck at home with their families.  They’re juggling childcare, housework, and safety in a chaotic situation.  As a result, they have neither the energy nor the forum to kvetch – verbally or otherwise – with coworkers.  Today’s teleworkers talk to their peers to get the job done, then get back to business.  Supervisors who cut your pay now feel more like a tiresome video than a human villain, which quells the urge to settle the score.

Think about it this way: If your firm had cut pay three months ago, what would have happened?  You would have arrived at work and started griping to your friends.  A few would have philosophically adjusted to the new normal, but a coterie of complainers would have whined, muttered, grumped, and sputtered for months.  In so whining, muttering, grumping, and sputtering, they would have disrupted not only their own work, but teamwork itself.

If your firm cut pay today, in contrast, you’d probably just read the email, groan, and resume your duties.  You might lament your fate to your partner or close friend.  Yet now that you’re teleworking, you plausibly won’t even mention the issue to a single coworker.  You almost certainly won’t lunch with coworkers to denounce the firm’s callousness and greed.  Stripped of this social feedback loop, neither morale nor productivity will fall much.  At long last, pay cuts finally do exactly what firms desire: mitigate losses by cutting costs.

On top of all this, executives and managers almost surely feel much less guilty about pay cuts than they ordinarily would.  Out of sight, out of conscience.

How can we test my story?  Most obviously, industries that switch to telework will be much more likely to impose nominal cuts.  To repeat, that means lower nominal base pay for salaried employees, and lower nominal wages for hourly employees.  In industries where some categories of workers switch to telework and others don’t, I also predict that the switching categories will be more likely to experience cuts.  (There, however, horizontal equity norms may get in the way.  If 95% of a firm’s employees telework, management might cheaply avoid outrage by also cutting pay for the 5% who work on-site).

Note: You don’t have to think that wage cuts are socially desirable to buy my story.  For a tenured GMU professor such as myself, nominal wage cuts are all pain, no gain.  That said, thirteen years after the Great Recession started, I remain convinced that nominal wage cuts are a greatly underrated way to alleviate the grave evil of unemployment.  Nominal wage cuts don’t merely save jobs within the firm; they also save jobs throughout the economy.  Keynes opposed wage cuts, but good Keynesians smile upon them.

Think of it this way: Suppose you have $1M total to pay workers.  Which is better for Aggregate Demand: Retaining your whole workforce and cutting pay 10% – or keeping wages constant and laying off 10% of your employees?  The latter route, though timeworn, reduces workers’ spending because the marginal propensity to consume falls with income – and reduces firms’ profitability in the process.
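To make that arithmetic concrete, here is a minimal sketch in Python.  The worker count, wage, and piecewise consumption schedule are illustrative assumptions of mine, not figures from the post; the only givens are the $1M payroll and the 10% cut-versus-layoff comparison.

```python
# Illustrative only: compares aggregate spending out of the same payroll
# under (a) an across-the-board 10% wage cut and (b) 10% layoffs.
# The consumption schedule below is an assumption chosen to exhibit a
# declining marginal propensity to consume (MPC), not an estimate.

def consumption(income: float) -> float:
    """Assumed consumption function: the first $5,000 of income is spent
    in full; income above that is spent at 70 cents on the dollar."""
    return min(income, 5_000) + 0.7 * max(income - 5_000, 0)

workers, wage = 100, 10_000  # $1M total payroll before the downturn

# Option (a): keep all 100 workers, cut every wage 10% (payroll: $900K).
demand_wage_cut = workers * consumption(wage * 0.9)

# Option (b): keep wages constant, lay off 10 workers (payroll: $900K).
demand_layoffs = (workers - 10) * consumption(wage) + 10 * consumption(0)

print(f"Demand after 10% wage cuts: ${demand_wage_cut:,.0f}")  # $780,000
print(f"Demand after 10% layoffs:   ${demand_layoffs:,.0f}")   # $765,000
```

The firm’s outlay is identical in both cases; layoffs nonetheless depress spending more because they concentrate the entire income loss on the workers pushed down to where the MPC is highest – which is exactly the Keynesian case for wage cuts over layoffs.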

Does this make me optimistic about the economy?  Hardly.  We’re already in the midst of a second Great Depression, and even perfect nominal wage flexibility won’t restore normalcy anytime soon.  Still, when word of nominal wage cuts reaches my ears, I feel a glimmer of hope.  Unemployment will skyrocket.  Without nominal pay cuts, however, unemployment would have been worse yet.  Unemployment will take years to subside.  Without nominal pay cuts, however, unemployment would have lingered longer still.  As I wrote a decade ago:

Is labor market rigidity a market failure?  I’m afraid so.  But strangely enough, this market failure is largely caused by anti-market bias!  The main reason workers hate wage cuts is that they imagine that wage-cutting employers are satanically “unfair.”  If workers saw wage cuts for what they are – a full-employment mechanism – they’d sing a different tune.  While they wouldn’t be happy to see their wages cut, they’d grudgingly accept that a little wage variability is a fair price to pay for near-total employment security.  Once this economically enlightened perspective took hold, employers would eagerly cater to it – and the market failure would largely go away.

According to Peter Pan, “Every time a child says ‘I don’t believe in fairies,’ there’s a little fairy somewhere that falls down dead.”  As far as I know, he’s wrong about fairies.  But if Peter had warned, “Every time a person says, ‘I don’t believe in markets,’ there’s a worker somewhere that loses his job,” he wouldn’t have been far from the truth.  Scoff if you must!  People can and do cause market failure by believing in it.

Teleworkers still don’t believe in markets, but at least they’re less likely to tell each other, “I don’t believe in markets” – or act on their resentment.  Thank goodness for small miracles.

P.S. Disclaimer: The best predictor of future data is past data – and we should never say, “This time it’s different” lightly.  So I wouldn’t be shocked if aggregate data ultimately revealed continued severe nominal wage rigidity despite my current impressions of drastic change.  If so, consider this piece an imaginative yet regrettable attempt to explain “facts” that barely happened…


Whither the Precautionary Principle?

The precautionary principle, per Wikipedia, is “a strategy for approaching issues of potential harm when extensive scientific knowledge on the matter is lacking. It emphasizes caution, pausing and review before leaping into new innovations that may prove disastrous.”

Over the last half century or so, regulators and activists have regularly invoked the precautionary principle against industrial and commercial concerns: Will this new car wash ruin the nesting grounds of the Great Purple-Crested Bandersnatch? Could construction of that pipeline conceivably pollute a river? Might the noise from a proposed refinery disturb the sleep of some nearby Mrs. Nimby?

Then came COVID-19, and all of a sudden many of the same voices who’d have followed the precautionary principle to hell and back to stop construction of a nuclear power plant or delay the logging of a plot of old growth forest completely abandoned it.

For THIS situation, panicking and screaming “SCIENCE!” at the top of one’s lungs suddenly and inexplicably became satisfactory substitutes for “caution, pausing and review” before radically transforming the lives of more than 300 million surprised human lab rats.

I’m pretty sure that placing millions of Americans under de facto house arrest and shutting down significant portions of the US economy constitute “new innovations that may prove disastrous.” And every day it becomes clearer that “extensive scientific knowledge on the matter was lacking” when it came to the rationales for doing so.

Over the course of the last month, projections of US COVID-19 deaths from supposed “experts,” based on their super duper magic … er, “scientific” … models, have fallen from a high of 1.7 million, to a likelihood of between 100,000 and 240,000, to perhaps 60,000.

None of those numbers are numbers we want to hear when we’re talking about dead people, of course, but the fall from 1.7 million to 1/28th that number is a strong indicator that the overall process was based on something resembling wild, panicked guesses (and in some cases raw political opportunism) more than realistic modeling based on smart assumptions and fed with good data.

Don’t take my word for it. Ask Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases: “I’ve looked at all the models. I’ve spent a lot of time on the models. They don’t tell you anything. You can’t really rely upon models.”

But those models were what federal bureaucrats, state-level politicians, and local health officials DID rely on, and point to, as the basis and justification for a cascade of crazed policy decisions that have already resulted in what will likely turn out to be the worst US economic collapse since the Great Depression.

Don’t let the government’s COVID-19 Catastrophe Caucus fool you into believing they saved America or humankind. Before this is all said and done, we will have gotten off very easily if their mistakes haven’t killed more people than COVID-19 would have killed if left to rage completely unchecked.

It’s time to start interpreting the precautionary principle as a strong presumption against trusting the state with any power whatsoever.


Are Kids Learning More at Home During COVID-19?

More than one billion students around the world are currently missing school due to the COVID-19 pandemic. Several US states have already canceled school for the remainder of the academic year, turning to online learning when possible, and other states are likely to extend their school closures soon. Some educationists panic about learning loss while children are at home with their families, and headlines abound about how “homeschooling during the coronavirus will set back a generation of children.”

Learning Outside of a Classroom

Rather than focusing on the alarmist narrative of what is lost during this time away from school, it is worth emphasizing what is gained. There is so much learning that can happen this spring, within families and outside of a conventional classroom.

In many school districts across the country, any assigned coursework has been deemed optional, compulsory attendance laws have been relaxed, and annual testing mandates have been removed. This regulatory respite can provide an opportunity for parents to regain control of their children’s education and expand knowledge using the abundant online learning resources now at our fingertips. Free from state and federal curriculum and testing directives, parents can nurture their children’s education and development, helping them to explore new interests, dive into self-directed projects, and reveal passions and talents.

Whether it’s taking a virtual tour of one of 2,500 museums around the world, listening to a live concert, learning in-demand technology and coding skills for free, engaging in livestream story or art time with renowned authors and artists, or just enjoying special, slower moments together as a family, this is a once-in-a-lifetime chance to disconnect from standard schooling and discover how much learning can really happen.

Some worry about children’s learning slipping away during this time at home. Writing recently for The Washington Post, former Tennessee education commissioner Kevin Huffman notes the alleged “summer slide” phenomenon, in which students purportedly lose over the summer much of what they learned during the academic year. He suggests several strategies for combating the learning loss that he says will occur during the pandemic, including adding “more instructional days next year and beyond,” and “opening schools in the middle of the summer, lengthening the school day and the school year, or potentially eliminating summer vacation for the next couple of years.”

Does Learning Loss Occur?

But as I’ve written previously for NPR, we should be skeptical about the overall idea of “summer slide,” or learning loss when children are away from school. If learning is so easily lost when a child’s school routine is disrupted, did they ever really learn at all? They may have been effectively schooled—that is, trained and tested on certain material—but they likely never learned.

Now, children and their parents have an unprecedented opportunity to learn without school. While this is a stressful time for all of us, as our routines are altered and we are mostly stuck inside, distanced from our larger community, it can also be a time to use the enormous, and mostly free, digital resources that are sprouting daily to support learning and discovery. It can be a time to nurture and rekindle our children’s natural curiosity and creativity, qualities that are so often dulled within a mass compulsory schooling system focused on compliance and conformity. It can be a time to get to know our children in ways that might have been difficult during our previously packed, always-on-the-go days.

Most parents will eagerly send their children back to school when this is all over, but some parents will be surprised by what they discover during this break from ordinary life. They may see how much calmer their children are and how school-related ailments such as ADHD are less problematic at home. They may see that their children’s mental health has improved, particularly for teenagers who report the most unhappiness at school.

Parents may see their children’s love of reading and writing reappear, when they are allowed to read books and write stories that are meaningful to them and not tied to an arbitrary school assignment or grammar lesson. They may see a strong interest in science and technology emerge, as their children want to know more about how viruses work and what inventions are being created to help fight the pandemic. Parents may see real learning happen and decide not to send their children back to school.

Fortunately, there are now so many more ways to facilitate education without schooling, including hybrid homeschooling models, virtual learning, microschools, self-directed learning centers, and co-learning spaces. With more demand from parents for innovative, out-of-school learning options, more entrepreneurs will build experimental K-12 education models that will expand choices for parents and learners. Opting out of conventional schooling has never been easier or more worthwhile.

Rather than dwelling on the schoolwork that isn’t getting done this spring, let’s celebrate the immense learning that is occurring, in our homes and with our families, as we experience this historic event together. Let’s focus on what we gain, not on what we lost.


Do Intellectuals Make Life Any Better?

There’s a path my life could have taken – could still take – toward the life of an intellectual.

I’ve just about always been interested in one or more of the favorite intellectual subjects of philosophy, history, politics, theology, economics, psychology, and sociology (whatever that is). I’ve always liked to have big opinions on things. And I’ve always preferred toying with ideas to toying with numbers or machines.

But I’m beginning to think this is an aptitude worth resisting. It’s not obvious to me that intellectuals as such bring a whole lot of benefit to the world.

Obviously this will be controversial to say.

For the sake of this post, I’ll be using a Wikipedia-derived definition:

“An intellectual is a person who engages in critical thinking and reading, research, and human self-reflection about society; they may propose solutions for its problems and gain authority as a public figure.”

Let me be clear that I think everyone ought to engage in critical thinking. It’s in the rest of the definition that the problems start to emerge.

Every intellectual is a person who not only has a pet theory about what’s wrong with the world – but who makes it their job to reflect/research on that problem and write about that problem.

When you think about these intellectuals, what do you think of?

My mind wanders to the endless number of think-pieces, essays, and books with takes on what’s wrong with humans, what’s wrong with society, or what’s wrong with intellectuals (that’s right – I’m currently writing a think-piece. Shit.) The history of this product of intellectualism is a stream of lazy, simplified pontifications from individuals about things vast and complex, like “society,” “America,” “the working classes,” “the female psyche,” etc. in relation to something even more vast and complex: “human life.”

It’s not that thinking about these things is wrong: it’s that most of the ink spilled about them is probably wasteful. Why?

Because core to the definition of intellectualism above is its divorce from action. Intellectuals engage in “reading, research, and human self-reflection,” “propose solutions,” and “gain authority as public figures,” but none of these acts requires them to get their hands dirty to test their hypotheses or solve their proposed problems.

The whole “ivory tower” criticism isn’t new, so I won’t belabor the point. But I will point out two consequences of intellectualism’s separation from practical reality.

First, intellectuals don’t tend to be great people. Morally, I mean. Tolstoy left his wife in the lurch when he gave up his wealth. Marx knocked up one of his servants and then kicked her out of his house. Rousseau abandoned his children. Even Ayn Rand (whom I love) could be accused of being cultlike in her control of her intellectual circle. Those are just the notable ones – it’s fair to say that most of the mediocre “public intellectuals” we have aren’t exactly action heroes. While they may not be especially bad, they aren’t especially good on the whole.

There seems to be some link between a career which rewards abstract thought (without regard for action) and the mediocre or downright bad lifestyle choices of our most famous intellectuals.

The second major problem with intellectuals springs from the fact that nearly everything the intellectual does is intensely self-conscious. Whether it’s a philosopher reflecting on his inability to find love and theorizing about the universe accordingly or an American sociologist writing about the decline of American civilization, the intellectual is reflecting back upon what’s wrong with himself or his culture or his situation constantly, usually in a way that creates a strong sense of mental unease or even anguish.

Have you ever seen an intellectual coming from an obvious place of joy? The social commentators are almost always operating from malaise and discontent, which almost always arise from a deep self-consciousness.

Of course it’s anyone’s right to start overthinking what’s the matter with the world, and to feel bad as a result. The real problem is that the intellectual insists on making it his job to convince everyone else to share in his self-conscious state of misery, too.

How many Americans would know, believe, or care that “America” or “Western Civilization” was declining if some intellectual hadn’t said so? How many working class people, or women, or men would believe they are “oppressed”? How many humans would be staying up at night asking themselves whether reality is real? These worries are utterly foreign to the daily experience of real, commonsense human life. And while the intellectual may draw on real examples in his theories, he’s usually not content to allow for the exceptions and exemptions which are inevitable in a complex world: his intellectual theory trumps experience. The people must *believe* they are oppressed, or unfulfilled, or unenlightened, or ignorant of the “true forms” of this, that, or the other.

I’m wary of big intellectual theories for this reason, and increasingly partial to the view that wisdom comes less from thinking in a dark corner and more from living in the sunshine and the dirt. The real measure of many of these theories is how quickly they are forgotten or dismantled when brought out into daily life.

People who use their intellects to act? The best in the world. But intellectuals who traffic solely in ideas-about-what’s-wrong for their careers? More often than not, they are more miserable and not-very-admirable entertainers than they are net benefactors to the world.

The ability to think philosophically is important. But that skill must be used in the arena. Produce art. Produce inventions. Be kind. Action is the redemption of intellectualism.

Disclaimers

*By “intellectuals,” I don’t mean scientists. On the humanities side, I don’t even mean artists. The problem isn’t artists: it’s art critics. It’s not scientists: it’s people who write about the “state of science.”

There are exceptions to this bad showing among intellectuals, but usually these are the intellectuals who are busy fighting the bad ideas of other intellectuals: people like Ludwig von Mises fighting the ideas of classical socialism. The best ideas to come from people like this are ideas which don’t require people to believe in them.*

And don’t get me wrong: this is as much a mea culpa as a criticism of others. I’ve spent much of my life headed down the path of being an intellectual. I’m starting to realize that it’s a big mistake.

Originally published at JamesWalpole.com.
