Factors of Production

Pearce

Speaking of the global financial meltdown that began in 2008, Jim Cramer once said that “the only guy who really called this right was Karl Marx.”  He was not the only person to bandy about language questioning the solvency of our current brand of Capitalism in the wake of the Great Recession, and not the only one to invoke the name of the founder of Communism.

Americans, particularly the Republican Party and their Tea Partiers, generally regard Communism as a bona fide plot from Satan.  Even its milder cousin, Socialism, is something nasty and indecent that only Europeans would do in public.  This rhetoric, though, belies the prescience of Marx’s insights into Capitalism.  The 2008 implosion is precisely the kind of thing that a Marxist lens predicts.  His vision of a global communist revolution may not have panned out, and those nations that attempted to realize his vision may have succumbed to corruption, violence, and decline, but the core insight that ignited Marx’s entire paradigm, that unfettered capitalism would lead to rampant inequality, remains vindicated by a century and a half of history.

That’s an inconvenient truth for many people today, but then we have arrived in a period of history largely characterized by the triumph of ideology over fact, so overlooking inconvenient truths is par for the course.  Today’s right wing has made a habit of rejecting logic in favor of dogma.  Prominent Republican and Tea Party figures reject evolution by natural selection and cling instead to an unreasoning Biblical literalism that would make Saint Augustine blush.  With equal fervor, many conservatives noisily talk down the threat of climate change, so noisily in fact that the number of Americans who were concerned about global warming actually declined between 2000 and 2004–and those levels of concern have never returned to their peak despite growing scientific consensus on the seriousness of the dangers of climate change and increasingly erratic weather events.

Perhaps even more importantly, a virulent strain of free market dogmatism has come to hold enormous sway over the global economy.  The roots of this overarching economic ideology go back to the 1960s, when economist Milton Friedman launched his campaign against Keynesian economics.  Though a generation of high school economics teachers would award ten to twenty points on their final exams for knowing that John Maynard Keynes’ theories were validated by events during the Great Depression and World War II, when government spending offset reduced demand in the marketplace, Friedman’s Chicago School of economics would dismiss most activist spending by governments as “naive.”

In the 1970s, the expanding global economy encountered a crisis stemming from what sociologist David Harvey has termed the “excessive power of labor,” and he notes that Capitalism “never solves its crisis problems, it moves them around geographically.”  So the reaction against “greedy unions” led directly to another, arguably greater, imbalance favoring finance capital.  The dominant theory of global Capitalism became one plank from the ideology of Friedman and his Chicago School economics.  In fact, President Reagan literally came into the White House with Friedman’s text under his arm.  Perhaps it is this simple fact that leads the Tea Party to apotheosize Reagan, since in almost every other regard he failed to live up to their standards–again, the facts of Reagan’s actual actions while in office are unimportant compared to his mythological importance as a founding father for this new breed of conservatism.

Now, poor John Maynard Keynes, who was so demonstrably right about government’s potential to offset the wild fluctuations of the business cycle, is worse than ignored; he is demonized.  A conservative panel assembled by the magazine Human Events included Keynes’ landmark text The General Theory of Employment, Interest and Money on its list of the most dangerous books of the past two hundred years, along with Marx’s greatest hits (one suspects the time range was specifically chosen to ensure Marx would make the cut).

The outcomes of this shift to privileging finance capital have been dramatic.  In the last forty years, global GDP has grown more than sevenfold.  That rate of GDP growth accelerated in the first decade of the twenty-first century, which accounts for half of the astronomical increase since 1980.  It is this fact that purveyors of supply-side thinking point to as evidence of the righteousness of their faith in unfettered free markets.  The numbers cannot lie, they cry to the heavens.

And so they cannot.  In that same period of time, real wages in the developed world have stagnated.  Even in countries like China, the share of overall national wealth paid out as wages has fallen.  As Keynes and Marx could have warned us, this progression is unsustainable.  We have paid for the astronomical growth in GDP–and the commensurate growth in the portfolios of the world’s billionaires, now numbering some twelve hundred–by generating fictional capital through debt.  Consumers and governments have become addicted to debt, and Big Capital has hovered over the whole affair like a dealer nursing along a crop of junkies.

Now the very people who celebrate this growth as the proper paying out of rewards for the deserving decry the endless cycle of debt as weakness and entitlement without realizing that the two are sides of the same coin.

In 2008, the whole system almost came unhinged, but bandaids were slapped on, a few of the most dangerous new financial animals euthanized, and the world has pressed on with the exact same agenda of achieving growth.

This time, Capitalism didn’t even bother to geographically relocate its problem.  It just shifted the rug over the hole in the floor and went on with business as usual.

The rug is fraying now.  Austerity measures in the Eurozone are having the exact effects that Keynes would have predicted–all the wrong ones–while only America’s checks and balances have spared the nation a recovery-ending legislative spree from the newly elected Tea Party ideologues (God bless you, James Madison!).  Under a parliamentary system, the Tea Party would have controlled the government, but thankfully Obama remained president through the attempted national self-destruct in 2010 and even more mercifully, the Senate remains (largely) a bulwark against the House’s hysteria.

Very soon, though, the United States and the world will have to face facts.  The most important of those facts is that this form of Capitalism that privileges finance as the dominant engine of economic growth is untenable.  Not only is its meteoric growth doomed to result in a collapse, but it has completely failed to distribute the benefits of that growth to the population as a whole, reserving the gains instead for a select few positioned to profit.

Of course, to the free market dogmatists of today’s conservatism, that fact is irrelevant.  Imagining capitalism to be some kind of natural order of the universe, they reckon that winners in the market are those most worthy of the spoils (ironic for a group of people known for rejecting Darwinism).  This they call freedom.  The flaw in their thinking, though, is that there is nothing natural about Capitalism.  Its engines are all artificial.  Corporations.  Property rights.  Even markets.  They are all constructs, tools we use to govern society.  We have created them.

The foundation of our understanding of markets, of course, comes from Adam Smith.  Far from a free market dogmatist, Smith understood how the “invisible hand” could jeopardize the bindings of society, and he noted the distinction between common goods and those which should be subjected to market forces.  Smith understood the importance of balancing the different interests society has in its economic choices, but despite his insights, he could not foresee the world we live in today.  He did not live through the crises our society has.  By now it is obvious to many–and it will become obvious to many, many more in the near future–that some of the fundamental assumptions of Capitalism must be rethought.

Perhaps one of Smith’s assumptions is a good place to start.

Smith described the “component parts of price,” what they call the factors of production in those high school economics classes, as including land, capital stock, and labor.  The crisis of labor in the 1970s that led indirectly to today’s crisis was one arising from privileging labor over the other factors of production.  The solution that society pursued involved quashing labor’s power through offshoring and other neo-liberal policies (under Reagan and Thatcher in particular).

After decades of stagnant real income, incredible growth in income inequality, and deepening consumer and national debt, that solution is obviously flawed.

I propose an alternate solution that might still be implemented to redirect the flow of power through our economic system:  Let’s take labor off the table.

If we cease to think of labor as a factor of production, then we have to completely reimagine how individuals in our society come together to accomplish economic ends.  If all employment required a contract and all employment contracts were closer to limited partnerships, then we could radically transform the relationship between capital and labor in a way that preserves the dignity of workers while making them real stakeholders in economic success.  Some might argue that a radical shift like this would remove too much incentive for entrepreneurship, but that logic is patently absurd, the same as the thinking that presupposes raising taxes on the rich will somehow cause them to opt out instead of continuing to earn as much as they can after taxes.

If we adjusted the rules of hiring and working, bright, creative individuals would still strive to succeed and still generate the ideas that drive our economy.  Start-ups are already the source of most new job creation.  It’s not really that sector of the economy that is the problem.  It’s mega-capitalist entities that drive the imbalance in question, but they can be reimagined as well.  In 2011, hundreds of thousands of Americans abandoned corporate banks for credit unions.  If so much of our political culture weren’t being driven by big money, more changes in the legal landscape could further trends like that.  If we privileged cooperatives in our laws half as much as we do corporations, then we might see more entities like Spain’s Mondragon, which has weathered the recession much more favorably than Spain as a whole.

We need a new model.  To be sure, market forces should remain an important part of how we manage our economy.  Otherwise, we might end up with something as deeply flawed as the old Soviet economy.  But we must also balance those forces against the interest of promoting opportunity for everyone, or we will end up with a system as imbalanced and corrupt as the current kleptocracy in Russia, a world of mobster billionaires run amok.


The Souls of Presidents

Jim Young

On November 4th, 2008, John McCain took the stage in Phoenix to mark the end of his bid to be president.  It would, presumably, be his last such campaign.  As he graciously conceded to then-Senator Barack Obama, the assembled crowd booed at the mention of his adversary’s name.  Twice.  McCain, showing what may have been restrained disgust, offered his palms to the audience and asked them to stop.

No one else can know what moved through John McCain’s mind that night, but I like to believe that that instant was one of anagnorisis.

In Aristotle’s analysis of the theater of Ancient Greece, anagnorisis was the moment of realization that must come to the tragic hero before he can accept his downfall.  To be sure, McCain did not suffer a horrendous reversal like Oedipus or Agamemnon, but it is equally certain that the candidate who had secured the Republican nomination for the 2008 presidential race was not the John McCain who had sought the Republican party’s nomination in 2000.

McCain, long heralded as the “Maverick of the Senate,” was vehemently opposed by powerful voices in the Republican party during the 2000 primaries.  He was too moderate, too liberal, some said, to be the Republican nominee.  After a vicious campaign which included racist attempts to malign the parentage of McCain’s daughter, George W. Bush secured the Republican nomination.  Eight years later, an aging McCain had one last opportunity to fight for the presidency.  He charged into it with aplomb.

What exactly happened is open for interpretation.  Had his views changed with age or with the shifts in American politics in the post-9/11 world?  Or did he cynically shift his positions to the right in order to quell the concerns of those within his party that he simply wasn’t Republican enough?  Who can say?  The facts, though, demand some sort of explanation.

McCain, who once argued that banning abortion would lead to unsafe, illegal procedures that would threaten women’s lives, suddenly called for the overturn of Roe v. Wade.  During the campaign, he spoke at Jerry Falwell’s Liberty University, despite having called its founder an “[agent] of intolerance” during his 2000 run.  He even turned against legislation he himself sponsored, saying that he would vote against his own 2006 proposal for immigration reform because in 2008 “the people” wanted the border secured.

Wanton hypocrisy?  I’m not sure.  Perhaps I am just eager to defend the man I wanted so badly to vote for in 2000 (though I would’ve cut off my hand before voting for him in 2008).  He didn’t, after all, reverse course on every issue.  He remained committed to addressing climate change, a priority shared by few Republicans.  He refused to employ the dirty, underhanded race baiting that had been used against him in South Carolina in 2000.

In Ancient Greek tragedy, the hero’s fault or error–called hamartia–lies unknown to himself until the moment of anagnorisis.  Is it possible that McCain unknowingly allowed himself to be swayed by advisors and others?  Could his single-minded pursuit of the office have slowly eroded his famous integrity?

Maybe he really didn’t see it until that November night, until he had to look out at the booing crowd and realize those were the people he had forsaken his honor to please–the reactionary right wingers that Heilemann and Halperin called “furies” in their 2010 post-mortem on the election, Game Change.  McCain had sold his soul to get that nomination and kept on selling it to try to win, but I don’t think he’d really realized it until that night in Phoenix.

One might argue that it’s a fool’s errand to probe the soul of any man through newspaper clippings and cable news sound bites, but sometimes it’s actually quite an easy task.

It’s hard to imagine that President Clinton had any illusions about whether or not he was doing wrong during his tawdry dalliance with Monica Lewinsky.  Though his actions have inspired novels, plays, and films, Aristotle would have seen nothing cathartic in this uncomplicated drama of an empowered, entitled man gratifying himself and lying to an entire nation.

No hamartia.  Bill Clinton undoubtedly knew himself to be a horndog, as we all know him to be.  No anagnorisis.  His scripted apology to the nation offered words only, no public penance, no gouging out of his own eyes as with Oedipus when he discovered his sin.  Nothing but more Slick Willie.

But again, perhaps I am too harsh, too quick to cast judgment because of a sense of personal betrayal.  He was the first president I’d voted for, and he looked out from the TV and pointed right out at America (right at me, I tell you!), assuring us that he “did not have sexual relations with that woman.”

Then came Bush.  If Clinton had been morally wrong, Bush was wrong in almost every other way.  Wrong about weapons of mass destruction.  Wrong about the dangers of climate change.  Wrong about torture.  Wrong, I still believe, about waging a “War on Terror” that would claim as its first casualty our own sacred civil liberties.

But was he, like Clinton, knowingly wrong?  Did Bush ever recline in that chair in the Oval Office with the smug satisfaction of a shoplifter with a fresh take–as Clinton must have done, post-fellatio?

Bush’s tragedy may still be unfolding, his hamartia waiting for the man to look back and see, in a blinding epiphany, that nation building for profit is morally wrong or that economic policies that favor only the flow of capital weaken the bedrock of the middle class.  Who knows, but for all the ways that George Bush weakened this country, I cannot level at him the kind of contempt I feel for his predecessor.

For all his faults–and even his supporters must at least admit that he failed to achieve his policies in Afghanistan and that his economic policies contributed to the 2008 financial collapse–it seems, from this distance, that George W. Bush did what he believed was right.  Galling as it is to liberals and moderates, I think he did believe that what was good for business must be good for the country as a whole and that violating people’s rights–here or abroad–to keep them safe really was the right thing to do in the face of the evil of terrorism.

What, then, of his successor?

There is no small number of disaffected liberals who simmer with rage at Barack Obama.  Despite the fact that the Republican Party and its radical Tea Party fringe have decried the Affordable Care Act as socialism, it is actually so business-friendly and so conservative a reform package that almost every single component of the law was at one time or another proposed by a Republican (most hilariously of all, the individual mandate that is such anathema to today’s right wing was enacted in Massachusetts under Mitt Romney).  Obama’s most liberal supporters were irritated that he surrendered the fight for a public option, which would have moved the United States much closer to the kind of universal healthcare provided by nearly every other industrialized democracy.

That, though, is only one example of the disappointments liberals feel when reviewing Obama’s four years as president.  Though President Obama has halted the use of the “enhanced interrogation” made famous by leaked photos from Abu Ghraib, he has reneged on his promise to shut down Guantanamo Bay.  What’s more, he has continued fighting Bush’s “War on Terror” with a vengeance.  A robotic death-from-above vengeance.  Eric Holder’s legal gymnastics to justify the drone-strike assassination of an American citizen now rival the logical loop-de-loops of Alberto Gonzales.

Progressives have been further infuriated by Obama policies that validate the Republicans’ supply-side economics, all while the President has failed, until recently, to move on important liberal causes célèbres like gay rights and immigration.  Policy reform on other crucial issues–education and action on climate change–remains largely on the shelf.

The question is: Has Obama sold his soul?

And does he know it?

Now, as the election cycle for 2012 goes into full swing, President Obama urges us “forward.”  Indeed, anyone with even a basic understanding of the history of recessions in this country must favor him over the bizarre logic of Mitt Romney, who argues that because what we need is jobs, we should cut federal spending (When, Governor Romney, has cutting government spending EVER resulted in more jobs?).  But while liberals and progressives know they want Obama to defeat his vulture capitalist opponent, it is another question whether he can actually persuade them to vote for him again.

As November approaches, I think back to the moment when Barack Obama really appeared on my own radar.  It wasn’t long after his Senate win and his coming-out party at the Democratic National Convention.  I read about him, and from him, in Time, and I was impressed.  His views and words were nuanced, careful, and reasonable.  In Time’s excerpt from The Audacity of Hope, he related his struggles to find middle ground between progressivism and faith, including with the anti-abortion protestors who occasionally visited his campaign stops.  What I took away from that first impression was that Obama was a man deeply committed to compromise, to meeting halfway.  I had no idea that two short years later he would be America’s first African-American president.

As it turns out, compromise has been the name of the game throughout the Obama presidency.  To slip past filibusters, Obama has had to offer concessions to the Republicans at every turn.  The public option, an early plank of the health care bill, was abandoned to appease them and get the bill passed.  He bought an extension of unemployment benefits with an extension of the sacred-cow Bush tax cuts.

Responding to criticism of Obama’s many compromises, Fareed Zakaria said in 2011 that the president was “a centrist and a pragmatist who understands that in a country divided over core issues, you cannot make the best the enemy of the good” and that his failure to live up to expectations was really an acknowledgment of the complexities of the current political reality.  To be sure, that will be the view taken by many an Obama apologist in the weeks to come.

This might be a more favorable picture of the president than that of the weak-kneed reed bending to every right-wing breeze, but it is none too inspiring when placed against the “Yes, We Can” and “Change” enthusiasm of 2008.

If we want to probe Obama’s integrity, though, we might look to his recent gesture of support for gay marriage.  Obama began his political life as an opponent of gay marriage, but added, “when you start playing around with constitutions, just to prohibit somebody who cares about another person, it just seems to me that’s not what America’s about.”  In keeping with that position, he opposed the Defense of Marriage Act.  In fact, he admitted in his book that, “It is my obligation…to remain open to the possibility that my unwillingness to support gay marriage is misguided…and that in years hence I may be seen as someone who was on the wrong side of history.”  Now, he has declared that, in fact, he believes he was wrong.

It would be easy enough to read all of this arc as a series of calculated moves.  Obama in 2004 read the handwriting on the wall and decided to keep evangelical voters satisfied by opposing gay marriage.  Now, in 2012, he has reversed positions in an effort to shore up his liberal supporters.

This narrative, though, neglects the simple fact that this reversal is unlikely to win him any significant number of votes, just as his previous soft stance was unlikely to really lose him much support from the left.  The gay and lesbian community is, ultimately, a small proportion of the population, and not many of them were ever likely to vote Romney.  At best, this move gets a few people off the sidelines, but demographically they probably live in urban centers in already blue states.

It could be that Obama really did make the move for the reason he claimed, because it was “the right thing to do.”

In that case, Obama may be skirting the boundaries of a different kind of drama, one in which he has compromised politically without compromising his integrity.  Or it could be that his willingness to be pragmatic is his hamartia, lying in wait for the final episode in his tragedy.



On the first day, the sky went out.

Davis had trouble remembering what they’d been doing when the noise started.  Whatever it had been, they had carried on unperturbed.

When the lights, television, and air conditioning gave out with the power, though, they all rose and looked about.  Hannah pried open the blinds with two fingertips coated in orange acrylic and said, “I can’t see anything.”

They could hear it, though.  Without all the background noise of whirring motors and vibrating speakers, the rumbling sound of the wind seemed overpowering.

Davis went to the front door and pulled it open.  He was struck by how suddenly the world simply was not there, the cracked cement walkway leading down to the street and the usual band of blue with wispy whitish accents replaced by a featureless brown howling.

“Some storm,” he said, forcing the door shut against the pressure of the wind.  A fine dust had coated the entryway in the few seconds he’d left it open.

“Guess there’s no Olive Garden for lunch,” Marcy said.

“Why not?” Hannah whined.

“We’re not going out in that,” her mother told her.  “Besides, their lights might have gone out, too.”

“I was looking forward to those breadsticks,” Davis said.  “I’ll call them, see if they’re open.”

But there was no signal on the phone either.

“Huh,” he grunted.  “The cell phone towers are out, too.”  Hannah exhaled a noisy puff of disgust and went to her room.  “Some storm,” he repeated.

For dinner, they ate cereal by candlelight, muttering their hopes that the power would come back soon so the rest of the milk wouldn’t spoil.  With nothing but the wind and dark to occupy their senses, they turned in early.  Marcy and Davis made love quietly and then lay still for two hours hoping to hear the hum of electricity returning to the house.

Hannah played games on her cell phone until the battery died.


In the morning, the brown haze had been replaced by gray.  When Davis opened the door again, the wind snapped at him and soaked his shirt almost immediately.  He still couldn’t see anything beyond the basic outline of their front porch as he pushed the door closed again.

Marcy brought him a towel.

He sat down in the living room, watching the sheets of water wash over the slits of window showing through the blinds.

At breakfast they went through the fridge, now lukewarm, and used up as many perishables as possible.  They laughed about the combinations.  Ketchup and eggs and cheese and yogurt and a glass of orange juice and milk for each of them.  The sodas they left alone, but Davis joked that they had to use up the horseradish.

“At least we’ve still got the gas,” Marcy remarked while scrambling the eggs over the top burners.

For dinner Davis would figure out how to light the oven without the digital controls, and they would use up a store-bought lasagna still unspoiled in the freezer.  Throughout the day, the three of them sat around in the living room telling stories.  Hannah reenacted everything she could remember from their vacation to Costa Rica when she was eleven.  Though they remembered it as clearly as she did, her parents just smiled and nodded and laughed at all the right places.

Next, Davis told the story of setting up a picnic for Marcy just outside the university computer lab as a surprise for her while she’d been working late on her master’s thesis.

Marcy, though, got Hannah in stitches telling her about Davis’s brush with the swine flu–which turned out to be low-level food poisoning.  “He said, ‘Honey, if I don’t make it, don’t remarry until Hannah’s in high school, okay?’” Marcy chortled.  Davis bobbed his head good-naturedly.

The next morning, nothing had cleared up.  There was still no world outside.

“Should we try to drive to work?”

“We can’t see three feet,” Davis said, shaking his head.

“But we can’t call in either.”

“They’ll know.  Half the city must be stranded.”

“I told you we should have kept a landline.  It would’ve still worked even without the power.”

“For thirty bucks a month?  How often is something like this going to happen?”

She shook her head as he closed the blinds.

Davis dug out some books he’d had boxed up in the garage and pushed the couch closer to the window to get enough light to read.  He looked up twenty pages later and caught Hannah sweeping the floor.

“What’re you doing?”

“I don’t know,” she answered.  “I’m bored.”  She kept sweeping.

The leaks started that night.  Every few minutes, one or all of them would leap up and chase the sound of a drip in the dark, damming it with Tupperware and dirty towels.

By morning, most of the bulwarks had held, but Hannah was eager to mop up the areas where buckets had overflowed or towels had given up under the weight of supersaturation.  They kept rotating and dumping most of the day.

That night, Davis and Marcy curled together in bed.  They told each other the storm would have to break.  It couldn’t go on much longer.  These whispers soothed them as they drifted off to sleep, only to wake at three in the morning (Davis checked the one wristwatch in the house) when they heard a crashing noise outside.  After the sudden explosion of metal ripping into metal, a car alarm roared above the trembling sound of the wind beating against the side of the house.

Davis went downstairs, shirtless in flannel pants, and opened the front door.  He could hear the alarm more clearly, and as the wind buffeted him with wet fists he thought he could make out a pulse of orange light behind the wall of gray outside the door.  He shone his flashlight into the mist.  No shapes emerged.  He reached out a foot to step beyond the threshold, but suddenly, as if in warning, the wind gusted and forced him back.

He closed the door on the sound of the car.

By morning the noise was gone.

They tended again to the intruding water, let the rotation of pots and pans structure their day.  When the light gave out, they had a meal by candlelight.  Hannah complained about the offerings: cans of corn and beans and some Ritz crackers.

The next morning they noticed that the water coming out of the faucets was growing more and more brown.  By late afternoon, it was almost sludge.  Davis cut the water to the toilets and they began using the rainwater to fill the bowls.

“We’ve got water,” he told them.

“You want us to drink what’s coming through the roof?”

“We’ve got water,” he said again.

That night, Hannah did not complain: more cans of corn and beans.  The Ritz were gone, but they had a dessert of four Chips Ahoy cookies.

He misplaced the watch.  The three of them looked around, overturning sopping towels and shuffling picture frames and knickknacks around to move their shadows.  With the house so dark and the batteries in the flashlights failing, they abandoned the search.

He read to them the mornings after that.  He picked a yellowed copy of Ender’s Game to begin.

“I loved this book when I was a kid,” he told them.

He was amazed when, sometime in what must have been the next afternoon, he closed the book on the last chapter.

“I don’t think I’ve ever read a whole book in two days before in my life.”

“And out loud, too,” Marcy said warmly.

“Can I go next?” Hannah asked.

They began sleeping in the living room, Marcy and Davis bundled together on the couch, Hannah’s feet dangling off the edge of the love seat.  They tended to the water invading the house, following an unofficial but increasingly efficient routine, and they read and talked together in the den.

As the light began to fail after Marcy had gotten forty pages into Little Women, Davis took one of the remaining candles into the kitchen and swept its orange glow back and forth across the open cabinet doors.  He sighed to himself and rejoined the others in the living room.

Once he heard Hannah’s breathing slip into long, steady sighs, he whispered to Marcy, “We’ll have to do something.  There’s no food left.”

“We’ll ration what’s left.”

“We’ve been doing that,” he told her, gripping her upper arm firmly.  “There’s nothing left.”

“There’s some.”

“Better now, before we’re weak with hunger.”

She wrapped her fingers around his and said nothing more.

In the morning, before Hannah stirred, he quietly fished out his winter coat from the hall closet and found his racquetball goggles in his duffle bag.  Still wrapped in blankets on the couch, Marcy shook her head at him.  He shrugged in reply and turned.

He walked to the front door and placed his hand on the knob.

It felt cold, colder than he’d expected.

The door came open easily, blown inward by the unrelenting wind.  He pulled the goggles over his eyes and wiped the rain off his lips.  He took two moon-landing steps into the gray outside and squinted, still unable to see anything.

He reached back and grasped the doorknob from the other side.

With a heave, he pulled it shut behind him.

The Vice of Simulation

My son is deep underground, searching a metal cavern that predates human civilization.  He’s walking halls created before the dawn of history.  In front of him, a giant trapezoidal door slides open, admitting him to the next chamber.  Pistol shots ring out.  A fellow soldier–crazed by the horrors he has witnessed–is firing wildly at him.

My son quickly snaps off a shot from his weapon, a bizarre alien contraption he’s picked up that fires out explosive purple needles.  One of the strange projectiles sinks into the shell-shocked soldier and the pistol falls silent.

This is one of the few moments in the storyline of the Halo video game franchise in which the player is invited–though not obliged–to kill a human character.  Generally in the story, the player is pitted against various alien nasties intent on wiping out humankind.  The most pervasive of these alien foes belong to a consortium of like-minded extra-terrestrials known in the Halo universe as The Covenant, a theocratic alliance that has been bent on the extermination of all life on Earth for the ten years since the original Halo game’s release.

Recently, author Jane McGonigal celebrated the apex of this simulated military struggle for mankind’s survival in her book Reality is Broken.  McGonigal is a leading voice in a growing movement known as gamification, the premise of which is basically that if games, and video games in particular, are so effective at motivating and engaging us, then why shouldn’t society itself be engineered more like a game?  She points to the online community of Halo players, millions strong, as an example of gaming’s ability to galvanize people and captivate their imaginations.  In particular, she applauds an online campaign throughout that virtual community to log ten billion Covenant kills in the third, and concluding, chapter of the Halo trilogy.

“Ten billion kills wasn’t an incidental achievement, stumbled onto blindly by the gaming masses,” she writes.  “Halo players made a concerted effort to get there.  They embraced 10 billion kills as a symbol of just how much the Halo community could accomplish–and they wanted it to be something bigger than anything any other game community had achieved before.”

To some, it may sound terribly silly, but the fifteen million players involved in this simulated endeavor were not merely hormonal thirteen-year-olds bent on frenetic simulated violence.  According to the Entertainment Software Association, the average age of gamers is now thirty.  Last year the industry’s sales topped twenty-four billion dollars.  That’s more than twice the total take from the North American box office.  In fact, that number is edging close to Hollywood’s combined total from sales of tickets, DVDs, Blu-ray discs, digital downloads, and even movie rentals, which is only about twenty-eight billion.

Video games have been big business for some time, of course.  Tech heavyweights like Sony and Microsoft would not have leveraged so much of their resources to enter the market over the last twenty years if they were not.  But just as video games have become a mainstay of pop culture, they are also growing in their capacity to draw their players into immersive worlds and environments.  Take my son’s favorite, for example.  The original Halo game was recently rereleased, and after ten years of technological progress, the new version features updated graphics, rendering its 3D world more lush, more detailed–more engrossing.  It’s a fitting tribute to the game that was the linchpin of Microsoft’s effort to win territory in the lucrative video game industry.  The company even went so far as to purchase the game’s developer, Bungie, outright before the title’s release in order to establish a “killer app” for Microsoft’s Xbox video game console.

Of course, on the surface the game looked a lot like a long line of games that see their players staring down the back of a gun–called first-person shooters.  In fact, Microsoft even redirected Bungie’s development to make the game less unique and more like every other game in the genre.  What remained distinctive about Halo then was something that most first-person shooters did not expend much energy on before:  story.

One of Halo’s predecessors in the genre, Doom, exemplifies the kind of “story” that typically went with these action-oriented shooter games.  In Doom, you are in hell.  With a gun.  You are supposed to shoot monsters.  A lot of monsters.  There are also locked doors so you must find keys to unlock them.  These keys will typically be guarded by monsters.  Shoot them, too.

Halo, though, featured actual character development and a plot that harkened back to tropes from a number of venerated science-fiction texts.  Yes, actual books.  In the video game press, titles are actually criticized now for the weakness of their stories.  The medium even garners serious study as an art form, and indeed, some games offer worlds so thoroughly rendered that players feel completely immersed, more than any play or poem could ever hope for.  Massively multi-player games like World of Warcraft and Eve Online draw players by the thousands into artificial worlds where they can cultivate entire online lives, complete with a personal biography, communities of other human players, and virtual property.  2011’s megahit game Skyrim features a system its creators call “radiant” story generation.  After creating a character, players of the game can choose to do anything they would like within the fantasy world of the game.  The player can truly feel like an actor in a living novel reshaped at every turn by the reader.  Players are free to chart their own paths through the game, taking sides in a mythical civil war between locals bent on autonomy and the imperial forces pressing for unity–or not.  They can pursue the game’s central quest to undo the machinations of an evil dragon god returning to dominate the world, or they can settle down and chop wood for a living.  This openness and the manner in which the game grants the player real choices made it both one of the most acclaimed and most popular games of the year.

Obviously, the growth of this medium has not been without its detractors.  Jane McGonigal and others applaud video games’ positive impact on hand-eye coordination and spatial intelligence, but science continues to dig around for possible links between violence in video games and real-life aggression from players.  The suggestion of such a link has become almost cliché more than a decade after Columbine.  After the tragedy, there were erroneous reports that the perpetrators had prepared by making custom levels of Doom modeled on their high school.  To be sure, the murderers did enjoy violent video games, though it seems safe to say that if any of the most dire warnings issued after that tragedy had been at all accurate, society would be suffering far more turmoil now that a whopping one hundred and eighty million Americans play these games.

Perhaps then it is not a question of whether killing ten billion aliens will harm us psychologically, but rather a question of what it says about us that we would spend our time this way.

Jane Addams said, “Action indeed is the sole medium of expression for ethics.”  If we must ultimately define ourselves by what we do, what does it mean about us when what we do is only part of a simulation?

More and more of us are investing more and more of our energy into worlds that do not, strictly speaking, exist.  We are doing things without actually doing them.

William Gibson’s prescient novel Neuromancer offered a description of cyberspace by borrowing Gertrude Stein’s quip, “there is no there there.”  If there’s no there, then are we there at all when we enter these fictional worlds?  If not, does it matter that we kill there?

I have been guilty of untold atrocities myself.  On countless occasions, I have dragged the whole of the human race into cataclysmic wars of domination so that my will could prevail over the entire world.

In a game, obviously.

For nearly twenty years, my own favorite digital diversion has been a strategy game series called Civilization.  In it, the player, looking down upon the world from a god-like height, moves armies, settles cities, develops new technologies, and generates schemes to raise a civilization from humble subsistence agriculture to the heights of the Space Age.  In my defense, I generally play as what folks in the Civilization online communities call “a builder.”  I erect cathedrals and universities to enrich the lives of my virtual citizens and generally try to achieve greatness through culture and refinement.

But I have been known to occasionally–in the interests of a lasting, stable peace, of course–start a war.  Even to conquer the entire world (it is the surest way to a high score, after all).  And I’ll admit to dropping a nuclear bomb in each release of the game, just to see what the graphics look like.

I have always been fond of the quote from Nietzsche warning that, “he who fights monsters must be careful that he himself does not become a monster, for when you look into an abyss, an abyss is looking back into you.”  What, though, becomes of our souls if we are making sport, not of hunting monsters anymore, but of also being them?

Video games have only just begun to grapple with these questions.  The medium’s Peckinpahs have pushed a few envelopes with series like Grand Theft Auto, which rewarded players for anti-social behaviors, including the callous murders of prostitutes (post-gratification, no less).  More recently, the series that eclipsed the popularity of the Halo franchise, Call of Duty, had to issue an update that allowed purchasers (and presumably, parents) to deactivate a level of the game in which the player was compelled to gun down civilians in order to maintain his cover while on a secret mission.

I owned that game.  I left the level turned off and never played it.  I sold it back to the store when my son became more and more interested in action games.  I steered him toward Halo instead.  It seemed closer to the sci-fi cartoons and space operas that had shaped my own childhood, and ultimately, like McGonigal, I thought it was preferable to be drawing purple alien blood from foes beyond the reach of negotiation than, as in Call of Duty, conducting extra-legal paramilitary actions in which human characters are targets.  I talked to my son about the difference, about the violence.  I don’t know that the exchanges would qualify as thoughtful conversations, but I think he gets it.  I think.

He and I, like society itself, are still trying to sort out what our actions mean in these artificial universes.  These days, we play Civilization together sometimes.  From the game he has learned quite a bit about history–from the range of Polynesian settlement in the Pacific to the extent of the conquests of Alexander.  I show him how I play, and discourage him from using the nuclear weapons.  Is it enough?  Is thinking seriously about our virtual actions enough of a concession to ethics?  Millennia ago, Aristotle was troubled by the simple fact that audiences derived pleasure from tragedy, and he sought to account for it in order to vindicate the arts.  Today, there is a great deal of research on the brain offering explanations of how these games flip all the right switches on their way to our pleasure centers, but still no Information Age Aristotle to pitch us some new catharsis to make us feel that it’s okay, to assure us that pulling the little plastic trigger is no different than turning the page in a murder mystery, no different than watching the frames advance in a war movie.

It’s not real, after all.  Any of it.  Even if we are the ones doing it, the ones making it happen.

Perhaps that just makes us finally and completely complicit in the crime of art.  The creative imagination of Western Civilization has arrived at a sort of Orient Express where we must each join the storyteller by taking a turn at the knife.

When my son reached his moment of complicity and fired that single shot into the crazed trooper deep in the catacombs of Halo, he was surprised.  “I didn’t think that would kill him.”  One shot, after all, is something most video game characters can simply shrug off.  “I feel bad.  Can I go back?”

No.  We can’t.

So yet another new home…

http://www.helmling.com has been forced to move.

I think this marks its 8th or 9th incarnation since 1997, but it’s the first time I’ve had to move.  (Stupid Apple!)  Usually in the past I’ve relocated or redesigned the site because I’d gotten distracted (Stupid Sid Meier!) and let the site go fallow and grow thick digital weeds of neglect.

I guess that actually happened this time, too.


So, there’s not much here right now.  My most recent (completed) work, the sci-fi series for young readers that I wrote for my kids, is available under novels.  I also threw up the same stories that were available on the last incarnation of the site (through some amazing application of cut and paste).

I’ll resurrect old content and put it up over the summer and there are a few stories I’m working on as we speak.  For now, enjoy the one new offering: a little poem below called “Piñata” that is sadly lacking the intended indentations (Stupid html!).


Someday, the well will go dry
There simply will not be any more sea-foam green granite for your countertop
expect delivery
in four million years
And we will tiptoe,
egg-shell cautious,
Over the papier-mâché skin of the hollow earth
Mourning quietly
Whatever it was that used to fill in the middle of things