Democracy’s awful little secret

It may be that democracy doesn’t scale up well. Just raw, dumb mathematics putting a lid on rule by the people. If so, the third of a billion of us in this continent-sized nation are in trouble when we want democracy to be more than in name only.

Those voters who stay at home in droves give us the line “So what? One vote, why bother?” And of course we say “Because Florida 2000,” however lame we know it sounds.

The problem is, they’re right — it’s the perfectly bad “consumer value proposition.” They don’t get anything for their vote that they wouldn’t get by staying at home. And if there’s one thing we certainly are, it’s a nation of consumers. That typically leaves us muttering something about “more education.”

Then a couple of observations kind of bumped together for me that I found provocative: Iceland and ancient Athens.

Iceland, of course, managed to dodge the bullet during the 2008 banking meltdown by the simple expedient of tossing out the gunmen — unlike the rest of the world, Iceland simply allowed the diseased banks to succumb to their own corruption. I took this as a heroic act of democracy in action, and kind of marveled at why Iceland could do such a thing, while the US could not.

Scratching the surface, though, I learned that Iceland — the nation — has a total population that’s about half that of my hometown, Austin.

That they are able to educate themselves so well and maintain a moral compass pointing to a democratic “true north” seems to support an idea that we should probably pay more attention to — democracy as a system that does not scale up well.

The arithmetic is inherent: in a small electorate, a participant’s vote carries more weight than in a large one. As it gets larger, the “consumer value proposition” of voting — well, we see well enough how that turns out.

For the history-minded, it’s worth noting that in ancient Athens, birthplace of democracy, the electorate numbered fewer than 30,000. The Pnyx, where the Assembly met, seated even fewer (about 6,000), and Plato recommended an ideal of 5,040 citizens in his Laws, companion piece to The Republic.

It’s easy to imagine: you can readily gather a coalition of 60 to 75 of your colleagues, which could have a substantial impact on the outcome of a vote in a 6,000-person electorate. With just one more degree of separation — a “two-hop” — each of your 60 has their own 60, and you’ve got 3,600 on your side.

That is a qualitatively different dynamic from an electorate of millions, which puts any such personal coalition-building way out of reach.
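The coalition arithmetic above can be sketched in a few lines of Python. This is a toy illustration using the essay’s own numbers (a coalition of 60, an Athenian-scale assembly of 6,000, a national electorate in the hundreds of millions); the function names and the exact electorate figures are mine, chosen for the demo.

```python
# Toy sketch of the "consumer value proposition" of a vote at different
# scales. Numbers are illustrative, following the essay's example.

def two_hop_reach(coalition: int = 60) -> int:
    """Each of your `coalition` allies recruits their own `coalition`."""
    return coalition * coalition

def vote_share(votes: int, electorate: int) -> float:
    """Fraction of the electorate a bloc of `votes` represents."""
    return votes / electorate

reach = two_hop_reach(60)  # 3,600 — the essay's round figure
for electorate in (6_000, 30_000, 200_000_000):
    print(f"electorate {electorate:>11,}: a two-hop coalition of "
          f"{reach:,} holds {vote_share(reach, electorate):.4%} of the vote")
```

Run as-is, the sketch shows the same personal coalition shrinking from a commanding majority of an Athenian-scale assembly to statistical noise in a national electorate.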

Since the days of ancient Athens, though, attempts to increase the scale of democracy have given us the questionable hack of “representative democracy.” What could possibly go wrong!?

It may even be safe to generalize: the smaller and more local a political entity is, the more often its political outcomes will be ones we’re willing to call “democratic.”

What Athens was in miniature, America will be in magnitude.
— Thomas Paine, in Rights of Man

…who recognized the difference in scale, but didn’t understand its consequences.

Wealth we employ more for use than for show, and place the real disgrace of poverty not in owning to the fact but in declining the struggle against it. Our public men have, besides politics, their private affairs to attend to, and our ordinary citizens, though occupied with the pursuits of industry, are still fair judges of public matters.

— Pericles

…who got it right.

The steel hatchet and culturecide

One of my favorite nuggets from an unapologetically liberal arts education turned up in a class in cultural anthropology, which also happened to be my major – one I stuck with even after learning that job prospects in the field were pretty much nil outside of academia and the CIA. This particular story was even better than the ever-useful “fifty words for snow” trope introduced the previous semester. Indeed, it serves as a fable for the unintended consequences of technological diffusion and the many-layered banality of cultural domination.

Anthropologists seek out “primitive” societies to study, not just for the purpose of chronicling the natural history of human social experience, but as microcosms of society in which they hope to discover principles that can be applied to society in general, even those grown too complex and opaque for direct comprehension.

I suspect that many are drawn to the subject, as I was, by a yearning to understand the particular crazinesses of their own heavily-industrialized tribes through “outsider” eyes, in much the same way as they might approach and analyze a band of out-and-out, bone-in-the-nose savages in the back forty of some forgotten rain forest. There is an odd sort of comfort in putting the two on an equivalent footing.

The primitive society in this case was an aboriginal Australian tribe known as the Yir Yoront, whose lands are on the west coast of the Cape York Peninsula in the northeast corner of the continent. The tribe remained isolated, living their ancient way of life well into the twentieth century, having only sporadic contact with Europeans. By the 1930s, though, contact had become more frequent, and in 1952 Cornell anthropologist Lauriston Sharp reported on its cumulative – and disastrous – effects. Even though there was little bloodshed and a lot of willing participation on the part of the natives themselves, the result could fairly be termed “culturecide.”

As with any aboriginal culture, the Yir Yoront’s technological inventory was limited to a few very crucial items, the most prominent of which was a polished stone hatchet. It was materially important to their subsistence economy – for gathering firewood, building shelter and storage, digging roots and wild vegetables, as well as for making other tools, weapons and ceremonial objects. This, of course, was the aspect of its value most visible to Western eyes.

Less visible, however, was its totemic and moral importance – a whole complex of cultural behaviors centered on the axe and its prominence in the interpersonal relations of the Yir Yoront. The older men of this patriarchal tribe controlled the axes, and anyone wanting to use one would have to ask to borrow it, according to a fixed kinship-based protocol, thus reinforcing lines of authority and kinship. Youngsters had to wait until their rite of passage into adulthood before being allowed to use one, and women were never allowed to own one.

The men were dependent upon interpersonal trading relations for the axe heads, since their home territory provided no suitable stone for the purpose. They traded spear tips made from stingray barbs to other groups further inland in exchange for the stone.

Trading partners came together at the tribe’s annual gathering, a festive affair primarily centered on initiations and other totemic ceremonies, but also an exciting, climactic event for the whole tribe. There, partners could renew their ties, catch up on news from other parts of the territory, and of course get a lot of trading done. People would party, young couples would hook up, and everybody would generally gain a renewed appreciation for how sweet life is.

This is the way it was for the Yir Yoront for most of the forty-odd thousand years that they had been Yir Yoront. Nothing ever changes in a culture this stable – every detail of daily life has a status that’s embedded in a vivid web of myth and totemic interrelations.

Then came the missionaries. Aside from bringing the blessings of Christianity to the heathen in exchange for grunt labor and servile gratitude, they did, in their own thick way, have a genuine urge to be helpful.

So when they noticed a bunch of half-naked savages grubbing in the dirt with this primitive stone artifact, they saw their opportunity: a plentiful supply of mass-produced steel hatchets would be just the ticket to civilization for the poor blighters. They started handing out axes to any and all who ingratiated themselves – younger men, women, even children.

As it turned out, the steel axes didn’t work any better for the Yir Yoront’s purposes than their traditional stone axe, so there was no “improvement” in their quality of life due to the improved technology. And even though most Yir Yoront individuals quickly came to prefer the steel axe, the easy new source of axes totally sidestepped the social relations bound up with the traditional patterns of axe production and distribution.

The result was a near-total breakdown of the social structures that were based on the traditional stone axe. Older men who acquired steel axes found themselves dependent on some erratic missionary, rather than self-reliant in acquiring an axe the traditional way. The elders’ moral authority that went along with lending out axes began to break down in the face of this new abundance. There was no longer any prestige in knowing how to make a stone axe or in having reliable, productive trading partners.

Among other things, this took away much of the excitement of the annual tribal gatherings, when instead of looking forward to acquiring a year’s worth of stone axe heads, a man might now “find himself prostituting his wife to strangers in return for steel axes… with trading partnerships weakened, there was less reason to attend the fiestas, and less fun for those who did.”

There were any number of ways that the Yir Yoront culture unraveled because of this one simple technological introduction, which, however well-intended, acted like an invasive species introduced into an ecology that had evolved without it. Nothing in Yir Yoront institutions or understanding of the world accounted for “rogue” axes, or for the kinds of relationships they would have to have with European foreigners. All too soon, they were no longer Yir Yoront, but second-rate Europeans. Their own culture ceased to work.

If this outcome had been intended, it would have provided a dandy, field-tested guide for anyone interested in manipulating a culture, up to and including “murdering” it, for their own political gain. Little wonder that the CIA is so interested in being a major employer of anthropologists.

When I think of this story, I’m often tempted to speculate what might be the equivalent of the steel axe for our own peculiar tribe, “advanced” though it may be. What shiny bit of technology would fit just so into some crucial cultural node holding together multiple strands of cultural ties in such a way as to unravel them all?

Chances are, there aren’t any “invasive species” of innovation that would be as comprehensively destructive for us as the steel hatchet was for the Yir Yoront, but a few could be said to come close. Coming up with candidates for such “culturecidal” technologies could actually be a fairly illuminating (if slightly grim) exercise in social and political analysis – different people would come up with different lists, of course, but I bet there would be a lot of overlap.

My own candidates would be 1) the automobile; 2) television; and 3) air conditioning. These have all impacted our social fabric to a profound degree, bringing the intended comfort and convenience as expected, but also a more complete (if splendid) isolation from each other.

Having originated these particular culture-killers, the USA doesn’t have any outsiders to blame for its own surrender to them, though as a result of its hawking them to the rest of the world, other countries arguably do. To borrow a line from the 1983 report A Nation at Risk on our educational system: if a foreign power had attempted to impose this on us, we might well have viewed it as an act of war.

This post was also published in The Rag Blog, online reincarnation of The Rag, Austin’s underground newspaper of record from 1966 to 1977.

Resilience and cooperative urban farming

We grow food in our neighborhood. It’s not a huge amount, and there aren’t many of us yet, but we’re learning how to feed each other. For the long term, it is a bid for sustainability, healthy food, community, and local empowerment. Cherrywood Farm is urban agriculture for the people.

At the very least, what we’re doing is a nice, green, eco-conscious bit of “walking the walk” after so much talk about resource limits and chronic mistreatment of Mama Gaia at the hands of industrial capitalism. But it’s also a political act, in the best tradition of lefty liberation.

The proposition is this: the more that real people grow real food for each other, the less dependent we are on the food industry, on the petroleum that drives it, and on the wage system that monopolizes our most basic economic activity — the getting of food.

Our farm operates as a cooperative CSA (community supported agriculture), where members subscribe at the beginning of a growing season and receive a share of fresh produce and eggs on a regular basis. Share distributions take place every other week through the two 18-week growing seasons in a year.

We’re talking real food, zero food-miles, right here, independent of the semis that have to keep rolling night and day in order to keep the store shelves full.

We currently have twenty families taking part. Subscription payment can be cash, work, or half and half. All the farm work is done by members, so we need a good balance among the types of participation to make sure we have our labor needs covered, as well as sufficient operating funds. After five successful seasons, the mix has gotten pretty well established. Day-to-day work is coordinated by our lead farmers and carried out cooperatively, so members don’t necessarily need advanced skills to enjoy success at growing vegetables.

Our fields are located in the back yards of our land-host members, which brings up one other way to participate. Land hosts receive a full regular share of produce in exchange for turning over a piece of their homestead for common growing space. We have eight locations throughout the Cherrywood neighborhood, including a “showcase” field at In.Gredients, a packaging-free natural foods store on the south edge of our territory. Fields average about 750 square feet and incorporate techniques for intensive cultivation and minimal water use — an important feature for growing things in Texas.

Cherrywood Farm is part of Urban Patchwork, a non-profit that helps organize neighborhood farms, providing expertise, discount seed and supplies, insurance, and advocacy at City Hall. Some of the City’s water policies, for example, are a lot more farm-friendly as a result of Urban Patchwork’s efforts.

In Austin, conditions are ripe for taking urban agriculture to the next level. Neighborhood identity and activity are both very strong here, with a great deal of public and municipal support. Local farms in the Austin area, some of them operating as CSAs, enjoy an increasing number of customers, both on-site and at weekly farmers’ markets around the city. It’s a natural next step, then, to do a lot more of the growing in the neighborhoods themselves.

It’s a small start, but in time, the practice of growing food where it is eaten could be extensive enough to keep us all happy and well-fed even when the top-heavy, oil-dependent industrial food-supply system gets the shakes and starts missing deliveries.

Mere mention of the prospect generally sends the business-as-usual crowd into fits, but as the empire continues to unravel, it will be harder and harder to ignore the need for local resilience and food security. Slow decline is harder to spot than, say, a nice, spectacular zombie apocalypse, but it does offer more time to learn and adopt serious food production on a local scale.

This post was also published in The Rag Blog, online reincarnation of The Rag, Austin’s underground newspaper of record from 1966 to 1977.

What’s left of liberals

Ever since George Bush the Elder made his dismissive quip about “the L-word,” liberals have sought to rehabilitate the term and restore it to the connotation it enjoyed throughout its postwar heyday, when “western liberal democracies” were the embodiment of mankind’s progress up from the jungle, and to be liberal was to be evolved, humane, rational, grownup, and on the side of the future.

For a couple of decades, the political spectrum was essentially unipolar. Gradually, though, the “Conservative Revolution” picked up steam, and while it may not be the juggernaut now that it was in 1988 when Bush dropped his L-bomb, it definitely made the rightward end of the spectrum respectable in the public mind, or at least not as unspeakable as it was up until, say, the Carter administration.

It’s not surprising, then, that liberals’ push-back against the weaponized language of the Right desperately enlists any and all tropes that amount to “anti-Right.” And what could be more opposite of “right” than “left”?

This leaves us with the all-too-frequent spectacle of ruling-class liberals referring to their turf as “the left,” as happened recently when Salon magazine ran a liberal-pride puff piece with images of Bernie Sanders and Elizabeth Warren sharing space with its call-to-arms headline: “’I’m proud to say I’m a liberal’: How conservatives vulgarized a term — and why the left must reclaim it.”

For reasons understandable if not exactly admirable, liberal advocates have boldly assumed that since they are “not-the-right,” it means they are “of-the-left,” at least kinda-sorta some shade of left — though, heaven forbid, not like those hirsute troublemakers far enough left to be socialists. Of course, right-wingers are only too happy to go along with the conceit, no doubt getting in a couple of snickers from such an overly-fine distinction between Democrats and Commies.

Contrary to the popular American usage, however, “liberal” is qualitatively different from “left.” It is a centrist position meant to preserve the status quo by allowing a few reforms that help appease or co-opt efforts at basic systemic change sought by the left. Chris Hedges, in a pithy comment about his book Death of the Liberal Class, told one interviewer “The liberal class was never meant to function as the political left. The liberal class was meant to function as the political center.”

Part of the confusion in terminology comes from the ever-present tendency in politics toward subverting the language to make a position appear better (or worse) than it actually is. It’s a political weapon first established in the popular mind by its treatment in Orwell’s novel 1984, but is certainly alive and well today.

It goes back a lot further, of course — the politically-mindful Confucians in ancient China asserted that “good government begins with calling things by their right names.”

To get some kind of useful handle on liberalism, referring to its history is probably the best way to avoid the subjectivity and vagaries of fashion that make most of the semantic wrangling about the term so unproductive. The work of left-oriented European historians, from the Annales school’s Braudel to world-systems analysts like Wallerstein and Arrighi, gives us some pretty interesting background on the matter.

As a political position and basis for policy, liberalism apparently came out of the Congress of Vienna in 1815, the start of Europe’s massive reorganization after the defeat of Napoleon, who represented the culmination (at that time) of the “Age of Revolutions.”

The powers-that-be recognized that revolutionary tendencies among the masses meant the established power structure could no longer be sustained by the current system of monarchy, so reforms were put in place with the aim of appeasing them — a program of concessions to “the dangerous classes.”

These reforms included free public education, nationalistic “patriotism” bolstered by universal military service, widening of suffrage into non-landowner classes, and, somewhat later, social insurance.

Immanuel Wallerstein describes how, in response to the burgeoning masses of disenfranchised wage-workers, “political leaders of the different states began to effectuate a program of reform designed to respond to the plaints of this group, palliate their miseries, and appease their sense of alienation.”

As the reforms continued, however, revolutions did also, and as Wallerstein notes in another essay, “the revolutions of 1848 showed the potential strength of a militant left force [that was] frightening to the centrist liberals, and even though the revolutions of 1848 all petered out or were suppressed, liberals were determined to reduce the volubility of what they saw as the too-radical, antisystemic demands of the dangerous classes.”

It’s also worth noting that out of this extended program of post-revolutionary reform came our present notion of “social progress.” This brilliantly successful bid to win working-class hearts and minds did much to help stabilize the status quo, since the promise of improvement created patience among those classes who would otherwise be quicker to agitate.

Yes, it’s true that when we’re talking “liberal” or “left,” we’re talking labels, so all the standard disclaimers are hereby invoked. Still, I’d say that in functional terms, “liberal” is ruling-class, “left” is working-class. I’d suggest, too, that anyone comfortable with the label “left” would also accept — or at least, not be offended by — the label “socialist.” Certainly, that works for Bernie Sanders.

This post was also published in The Rag Blog, online reincarnation of The Rag, Austin’s underground newspaper of record from 1966 to 1977.

When you always can’t get what you want

One thing we expect in a democracy is that the majority gets its way. But even though it does get its way in the end, getting there involves a certain amount of deliberative discussion and a healthy amount of compromise. Give and take is the name of the game – sometimes you get what you want, and sometimes you don’t.

Clearly, the Republicans have been missing that memo ever since Grover Norquist first undertook to “inoculate them against the virus of compromise,” whereupon they proceeded over the next two decades to fulfill their anti-government animus by effectively gridlocking Washington right out of the governing business.

In the absence of compromise, and especially the action it enables, what remains is talk, and the talk that remains serves mainly to hog up the available bandwidth of public discourse with pet ideological complaints that have little bearing on the practicalities of actually running a large industrialized nation-state.

One such bit of bluster that has had a lot of replay lately is a perennial favorite of the Republican core constituency: “states’ rights.” It’s the old song, and very much a song of the South. To hear the sons of Dixie tell it (the daughters don’t get to, of course), it seems that nearly anything that the government wants to do to promote the general welfare will somehow end up trampling states’ rights. Never mind voters’ rights or women’s rights – these must take a back seat to the state’s rights, depending on which particular state it is whose rights are under threat.

Almost always, the state or states in question once belonged to the Confederacy, and for a while the term “states’ rights” was code for resisting Washington on the matter of racial integration. Jim Crow, a post-slavery incarnation of the Southern caste system, seemed to be as emblematic of that nation and its way of life as George Washington is of the nation whose capital city bears his name, so the resistance came with a patriotism-like intensity.

Although race-based subjection has had to keep a low profile since then, the resistance to Washington’s “impositions” continues, through one diversionary objection or another, and the resistance is so enduring and visceral and seemingly pervasive throughout the Southern body politic that it calls for some sort of account beyond writing it off as bigotry – or just sheer cussedness – and simply leaving it at that.

An account that blunt, while accurate, seems inadequate. Simply noting that awful behavior is in fact awful doesn’t give us much in the way of understanding it well enough to deal with it, let alone deter it.

Likewise, there’s not much help to be had in making an apologetic case for the South, pointing out the pockets of relative enlightenment in mostly-urban areas – nor the flip side, pointing out the abundant racism and redneck-style orneriness that take place outside the borders of the old Confederacy. Again, these observations would be true, but kind of miss the point.

What we have here in the halls of our dysfunctional democracy are two groups that are roughly equal in power, each of whom wants something they value. The values tend to be cultural, further complicating reasoned discourse. What has broken down is the expectation that differences between opposing positions can be narrowed and overcome through deliberation and compromise.

In this situation, not only is compromise removed from the process, the values are so far apart as to be mutually exclusive. The Republicans, carrying the banner of Southern patriarchal values, refuse to accept gay marriage, for example, while the Democrats maintain that it’s a matter of equality to allow it.

There’s a whole litany of similar issues in which there is little or no common ground, no values shared, and in many cases, directly opposed. In each case, the majority prevails, and the opposition doesn’t get what it wants. In theory, fair enough – after all, you can’t expect to get your way all the time.

How this plays out in the long term, however, is interesting to consider. Following the premise that “you can’t always get your way,” there seem to be at least two possibilities: A) you get your way, certainly not always, but some of the time, or B) you can’t ever have the one particular thing you want, and if it’s something you insist on valuing, you will be the subject of scorn until you by golly come off it, change your values, and stop pushing for it.

This is a decidedly unpleasant dilemma. If it’s option “A,” I hate to think what the states’ rights crowd would inflict on the rest of us when they did get their way – especially considering the institutionalized racism and theocracy for which “states’ rights” is usually a proxy issue.

If it’s option “B,” on the other hand, there’s a built-in hazard that long-term denial of a particular advocacy group’s goals – however distasteful those may be – will form a kind of residue of resentment that eventually stands to gum up the works for everybody. Witness the gridlock we’re experiencing today. When we insist that an interest group take part in our governance, yet consistently outvote them on what they are most interested in, it does not turn out well for us. To put it starkly, we end up with a viper in the nest.

There’s a good case to be made that today’s GOP, with its predominance of Southern leaders and values, has effectively become a political arm of the cultural region that once tried for independence as a nation of its own, the Confederate States of America. They were anti-Washington then; they remained anti-Washington after their independence was suppressed by force of arms, and they are anti-Washington today, expressed more politely in Republican ideological terms as “smaller government.”

Patron saint of lost causes

Post-1865, the US became more like a continental empire and less like a federation. For the latter, we would have to assume participation of the various states to be voluntary, but now there was a conquered people, a subject nation, within its borders. Conquered people don’t just get over it in a few generations, and if their descendants who are obliged to share in the governing of such a forced union end up intentionally or unintentionally defeating the process for everyone, it should come as no surprise.

This is not to say that Southern aspirations, as curated by the GOP, necessarily include political independence from the US. It would probably be too much to hope for, given how badly that went for them last time.

However, it does seem likely that the passive sabotage of Washington’s legislative abilities is consistent with the reaction of a people – a nation – who are forced to be one kind of nation, when all they want to be is their own kind of nation. Unfortunately for the union, that nation is hierarchical, theocratic, patriarchal, and historically committed to a racial caste system.

Those of us who identify with the “true America” – the Union that was preserved and enforced by the Civil War – take it for granted that we have some say about who qualifies as a legitimate nation and who doesn’t. Official US policy certainly makes case-by-case distinctions throughout the world, e.g., Iraq qualifies, but Kurdistan doesn’t. There might be some disagreement with some of those calls as they apply outside US borders, but inside – well, the very idea is so un-American it’s usually taken as a fair subject for raucous humor.

And yet, here we are, two nations with irreconcilable differences. Apparently, we’re staying together for the kids’ sake. Even if there were some way to separate amicably, most Union sympathizers would cringe at the thought of what an irredeemably backward and benighted country Dixie would become if allowed to go its own way.

Arguably so, but it’s also true that there’s a missionary zeal in US tradition that often puts us in the redemption business, and almost as often, leaves us wishing we hadn’t. Another un-American notion, perhaps, but there’s a certain wisdom in dealing respectfully with foreign nations who happen to have customs and institutions that are contradictory to one’s own.

So, if indeed having two or more nation-states between Canada and Mexico is an effective and just outcome, how does it happen peaceably? Big question. But that’s the first step, to admit that civil war is not inevitable and that our own turned out to be pretty much a mistake – a crummy choice among the possible ways to end slavery, and a complete failure at winning Southern hearts and minds.

It’s also helpful to buy into the axiom that for any just end, there exist just means. Then it’s only a matter of will. Here in the US, it must be the will of the majority, so it would probably take a while. There are actually quite a few examples throughout the civilized world that we could benefit from – Scotland’s proposed separation from Britain and Catalonia’s from Spain, just to name two that are being taken seriously. Maybe it can’t happen here, but then again, it may be too hasty to say that it shouldn’t.

What won’t make it happen

An American satirist once observed that talking about the weather hasn’t led to anybody’s doing anything about it, but on the internet, a lot of people know how to do something about the climate, it seems. In the virtual land of tool-happy technophiles, however, most of the proposed solutions bring to mind the fabled man with a hammer for whom everything looks like a nail.

Now don’t misunderstand, I can totally sympathize with those who love their gadgetry, being a recovering technophile myself. I haven’t used a gadget for exactly eighteen seconds now, and my knuckles are only a little bit white.

Still, when I go looking for clues about how we as a species might successfully adapt to our twin carbon predicaments, there is little to be found but variants of a single idea: if it looks like a really big nail, just get a bigger hammer and bash harder.

There are any number of like-minded solutions being touted, each featuring a pet technology writ large — massive solar satellites, massive wind farms, massive networks of nuclear power plants, massive areas of desert states paved over with solar panels, et massively cetera.

“Massive” certainly applies to the ooh-shiny machinery so fondly on offer, but seldom mentioned are the massive amounts of capital necessary for executing any of these programs. Or, more to the point, the massive convocation of raw political power — the centralized, consensus-enabled, military-scale organizational power that would be capable of pulling off such madly-unprecedented accomplishments of civil engineering. Where is that power going to come from?

There’s not a credible shred of it in sight, of course: central government lost the will to govern decades ago, and the massive twin problems of climate change and fossil-fuel decline aren’t exactly amenable to “market solutions.”

Such grand schemes are now put forth so often and so casually — and with such obvious neglect of the basic data and a spreadsheet — that my own reaction has finally passed the “Dude, reeeally?” stage. Now it is just “AGH!” Please note that this is an acronym for “Ain’t Gonna Happen.”

Not that I don’t wish it could happen. It would be lovely indeed to build a whole technosphere of machines that provide the energy to build the machines that provide the energy, as well as the machines that keep us comfy and well tended.

Even better, if this brave array of machines were able to do it all without putting a lot of awful stuff into the atmosphere, well, we wouldn’t even have to bother with the remedial geo-engineering and space-mining and even more exotic acts of techno-desperation. Truly, who could wish for more?

So it’s come to this. We’re stuck on wishing. We punted a glorified two-man submarine as far as the Moon, and somehow that confirms all the Buzz Lightyear indoctrination we got growing up, promising us a future of jet-packs, Martian vacations and physics-busting interstellar exploration. We’re buying our own BS.

We are certainly not ready to hear “Ain’t Gonna Happen.” It goes against old beliefs about who we are. We developed a Make It Happen culture early in the fossil fuel-induced Industrial Age, an era that Kenneth Clark has dubbed “Heroic Materialism.” In keeping with it, we erect statues to industrious movers and shakers, their steady gaze filled with rising skylines and leveled at a bright horizon where Destiny itself awaits — the image and likeness of the “modern man” in us.

Over the century or so that we marinated in this peculiar ethic, we came to believe that we were able to Make It Happen simply because we dared to be heroic. However, this meant ignoring the very heroic amounts of fuel it took, working furiously behind the scenes to do all the heavy lifting, so that we frail humans could take the bows for being so danged clever and visionary.

A related premise, every bit as peculiar, is that somehow we’ve accomplished our “advanced” way of life because we have become clever enough and inventive enough to make it better through the ingenious use of machines. We often congratulate ourselves on harnessing their power. But right there, we are misled by a metaphor: technology doesn’t have any physical power of its own, of course. Machines are the way we harness natural sources of energy.

Machinery itself is hardly a modern invention. The ancient Greeks and Chinese fully understood intricate arrangements of gears, cams and levers that could be put to useful tasks. Leonardo da Vinci certainly did. What these machines had in common — and what kept technology from spreading sooner — was that somebody had to turn the crank to make them go. Given the limited availability of slaves or animals to take on the crank-turning duties, interested parties would usually just go ahead and do the work themselves, without all the mechanical fanfare.

The real breakthrough of the Industrial Revolution was the trick of using fuel to turn the crank. And once we realized we could get fuel by the ton out of holes in the ground, rather than from carefully-tended forests, we unleashed armies of slave-equivalents ready to turn the crank for any and all interested parties.

With each of us now attended by platoons of uncomplaining energy-slaves, very ordinary citizens of the late industrial era can live as comfortably as — and in many respects more lavishly than — kings did in former times. That leaves us anxious about keeping a comfortable life when the serfs begin showing signs of going away.

There are a number of ways to allay the anxiety — not the least of which is recalling that we were clever enough to learn the trick of using fuel to turn the crank in the first place. Even though that trick turned our civilization into something of a one-trick pony, we’ve learned quite a few others in the meantime — tricks that will help us adapt ourselves to the circumstances, rather than trying for the other way around.

Here’s where we learn even more. The methods that will most surely lead to our living comfortably in an era of contraction and energy scarcity will not likely include the one that served us so well when dense, abundant sources of energy were what allowed us to Make It Happen on such a grand scale.

Transition to renewables won’t match fossil fuel energy level

The post-carbon world is on its way. Most of us know it, but don’t like it. We’d prefer to keep things going as usual – “things,” of course, being a way of life that requires abundant energy. We got used to having extravagant amounts of energy with fossil fuels, which are unique in that way, but also finite and messy.

Inevitably, nature is taking fossil fuels out of the picture – whether with our cooperation, due to concern about climate change, or without it, due to the forces of depletion.

If the problem is that our current energy source is going away, then the solution – the conventional “powershift” scenario – is to replace it with other sources. The threat to business as usual is thereby averted, and the problem solved.

That’s a big if, arguably, when it comes to identifying “the” problem. It may well be that “business as usual” is itself the problem, with solutions that could give us much pleasanter outcomes than clinging to it at all costs.

However, for all its eagerness to cling to business as usual, the conventional wisdom never stops to consider the other possibility – a “downshift” to a more modest and human-scale way of life. It simply assumes that the transition we face is a transition between sources of energy, rather than between paradigms for living.

It might be interesting, just for the sake of discussion, to take the conventional wisdom at face value, and see where it leads if we follow it far enough. The well-prepared journeyer should know, however, that soon after we pass the 10,000 square mile chunk of Nevada paved over with solar panels, the destination is likely to challenge most notions of plausibility – enough to cause some serious reconsideration of what “needs” we actually do need.

Most who talk about energy transition propose a scenario of straight-out substitution, one that puts wind farms and solar arrays in place of the coal- and gas-fired power plants now plugged into the electric grid – plus quite a few more to account for powering our electric cars and trains.

It’s more plausible, certainly, than a scenario featuring nuclear power as the world’s energy mainstay – all other objections aside, nuclear just doesn’t scale up well enough to make the grade.

Even so, there are also problems of scale in the wind-and-solar substitution scenario, though less clear-cut. These come to light when you start identifying the details of an implementation program and actually try to put a pencil to it.

The first requirement in the scenario is to maintain business as usual, which means coming up with an alternate power infrastructure that produces enough energy to run the show that we know. So we need some figures.

Here’s the total – the current worldwide energy consumption per year, including all fuels and electrical sources:

  • 521 quadrillion BTU (quads); or
  • 550 exajoules (EJ)

Out of the total 521 quads (550 EJ), only 73 quads are electric power, primarily from coal and natural gas. The remaining 448 quads consists of fuel being burned – to run our cars, to smelt our steel, to bake our cement, to heat our houses.

Electricity is only 14% of our energy use —

Compared to the rest, wind and solar power are a blip – barely out of rounding-error range.

Non-electric energy use —

  • Coal: 111 quads
  • Oil: 175 quads
  • Gas: 102 quads

Fully 86% of the energy we use is to melt stuff, push pistons, and keep us cozy. Not a bit of this moves a single electron.

Altogether, this is the amount of energy it takes to run our present-day way of life. So now that we have some quantities to work with, it’s clear that the path toward replacing the entire amount of it with wind and solar power is many, many steps beyond where we are now.

In fact, to go with the metaphor, that path would be about 290 steps long, with every single step the equivalent of adding all the solar installations and wind farms that currently exist. The first milestone along the way – the point where wind and solar are generating all of our electricity – would be 40 steps away.
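To put numbers on the metaphor, here is a minimal back-of-envelope sketch. The ~1.8 quads of current worldwide wind-and-solar output is an assumption implied by the 290-step figure, not a number stated in the text:

```python
# One "step" = adding the equivalent of all the wind and solar capacity
# that currently exists. All figures in quads (quadrillion BTU) per year.
TOTAL_ENERGY = 521     # all fuels and electrical sources, worldwide
ELECTRICITY = 73       # the electric-power slice of the total
WIND_SOLAR_NOW = 1.8   # assumed current wind + solar output (roughly 521 / 290)

steps_all_energy = TOTAL_ENERGY / WIND_SOLAR_NOW
steps_all_electric = ELECTRICITY / WIND_SOLAR_NOW

print(round(steps_all_energy))    # ~289 steps to replace everything
print(round(steps_all_electric))  # ~41 steps — the "about 40" milestone
```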

It’s worth keeping in mind that if we aim to do everything with electricity, we’ll need to go far beyond replacing our current electric power generation. Seven times farther, in fact.

How long does it take to go one step? Renewables have been growing an average of 4.5% per year, so the next step would take 15 years. It would accelerate, similar to compound interest, but even if we got a big push and ramped it up to 10% per year, we wouldn’t reach our 40-step milestone until mid-century. It would take an additional 20 years to reach the “all energy” goal.
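The timeline is plain compound-growth arithmetic. A quick sketch, assuming smooth exponential growth (the generous case):

```python
import math

def years_to_reach(multiple, annual_growth):
    """Years for capacity growing at a fixed annual rate to reach a
    given multiple of today's installed base."""
    return math.log(multiple) / math.log(1 + annual_growth)

print(years_to_reach(2, 0.045))    # first doubling at 4.5%/yr: ~15.7 years
print(years_to_reach(40, 0.10))    # 40x (all electricity) at 10%/yr: ~39 years
print(years_to_reach(290, 0.10))   # 290x (all energy) at 10%/yr: ~59 years
```

At 10% growth, the 40× milestone lands roughly 39 years out, and the 290× goal about 21 years after that — consistent with the mid-century-plus-twenty estimate above.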

Recently, two engineers at Stanford University worked out a detailed scenario for achieving the goal of all-renewable power by 2050. Their mix included geothermal, hydroelectric and tidal sources, but 90% of the power would be wind and solar.

The numbers are, well, large. They ignored construction costs, but those are included here:

  • 3,800,000 wind turbines (5 MW) @ $10 million each
  • 49,000 solar thermal plants (300 MW) @ $1.2 billion each
  • 40,000 solar photovoltaic plants (300 MW) @ $570 million each
  • 1.7 billion rooftop solar installations (3 kW) @ $20,000 each

The tab comes to $153.6 trillion, or $3.8 trillion per year. Even if somehow the political leaders of the world were to agree on making this a modern Manhattan Project, their first difficulty would be the conversation they would have to have with their bankers.
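The tab is straightforward to verify from the list above — a minimal sketch using only the quoted unit counts and costs, with the per-week build rate as simple division over a 40-year program:

```python
# Totaling the scenario's hardware bill, using the unit costs as quoted.
costs = {
    "wind turbines (5 MW)":          3_800_000 * 10e6,       # $10 million each
    "solar thermal plants (300 MW)": 49_000 * 1.2e9,         # $1.2 billion each
    "solar PV plants (300 MW)":      40_000 * 570e6,         # $570 million each
    "rooftop solar (3 kW)":          1_700_000_000 * 20e3,   # $20,000 each
}
total = sum(costs.values())
print(round(total / 1e12, 1))       # 153.6 (trillion dollars)
print(round(total / 1e12 / 40, 2))  # 3.84 trillion per year, over 40 years

weeks = 40 * 52
print(round(3_800_000 / weeks))          # ~1,827 turbines per week
print(round((49_000 + 40_000) / weeks))  # ~43 utility-scale solar plants per week
```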

It could be awkward, since the entire amount of capital in the whole world that can be mustered in any given year (Gross Fixed Capital Formation) is currently about $14 trillion. They’d be asking for more than a quarter of it, every year for forty years, redirecting finances that would otherwise be putting up buildings, equipping factories and acquiring lots of cars.

If this project somehow got the green light, then it would become a matter of putting up a couple of thousand windmills and several dozen utility-scale solar plants every week for forty years.

We casually refer to a “Manhattan Project” or an “Apollo Project,” but these are pikers by comparison. This one is about 5000 times bigger than the Manhattan Project, and for that matter, bigger than all of World War II – by about 100 times.

Any government or alliance that took this on would show a level of commitment and real governance the likes of which we haven’t seen since, oh, before the Carter administration – which is about when we should have started something like this to begin with.

Given the numbers, we can judge how likely it is we’ll ever see a day when current energy “needs” are supplied exclusively from renewable, non-polluting sources. From every angle, it appears to be unlikely in the extreme – you definitely wouldn’t want to bet the farm on odds this long.

It makes a lot more sense to look at the other side of the equation, and reassess what our needs really are.

We certainly need to keep building wind and solar power sources. We just need to start rearranging the way we live so that we can get by comfortably on a lot less energy – walkable cities, few private cars, real public transport, local resources and farming, home-scale and neighborhood-scale solar power, doing it yourself, and getting together with the neighbors.

There will be fewer conveniences, no doubt, but also ample opportunity to reflect on how we ever thought those conveniences were worth the aggravation and sacrifice it took to maintain them.

The language of self-determination: independence vs. secession

American political sensibilities certainly have their share of contradictions, but one of the more ingrained is the idea of “independence.”

On the one hand, there’s 1776 and all that — an enshrined principle of self-rule that has even found its way into the United Nations charter. Throughout much of the 20th century, the US championed independence and national self-determination everywhere, especially places and groups subjugated by major colonial powers.

On the other hand, it takes only brief exposure to the chronic wrangling over the Palestinians’ desire for their own state to realize that the American conception of “independence” is selectively applied, at best.

It would be useful to know more about what the basis for selection might be. We seemed to approve of the Orange Revolution, for example, without giving much thought to how benign the Ukrainian state might be, and certainly without learning many particulars about what makes up the Ukrainian sense of national identity.

Scotland, with its current bid for independence from the UK, is also likely finding favorable American opinion. The term in use there is “devolution,” and it’s all very civil and proper, British style, with a referendum going before the Scots people next year, and the mother country willing to accept the outcome either way.

For other places in the world, Americans seem to be mostly indifferent to the compositional issues that continually bother multi-national states. Well, there’s Canada, with the Québécois nation sitting in uncomfortable union with the larger Anglophone state, though with one foot perennially out the door. It’s right in the neighborhood, after all, and some Americans may even have given some thought, one way or the other, to Québécois legitimacy.

Spain is another case. Unlike Britain, Spain has been taking a hard line with its restless nationalities — the Basques since practically forever, and more recently, the Catalans. For Spaniards, Basque partisans aren’t “freedom fighters,” but “terrorists.”

The Spanish central government eventually blocked a plebiscite for Basque independence in 2008 by a successful appeal to its constitutional court, and a similar move is now planned for defeating Catalonian independence when it comes to a vote next year. In Spain, independence is called “secession,” and it’s unconstitutional.

Other nationalities around Europe are in similar contention with the central governments of their ruling states: the Flemish in Belgium, the Corsicans in France, the Welsh in Britain, and Venetians and Sicilians in Italy. Other independence movements in the rest of the world, which tend to be less polite than in Europe but every bit as serious, include the Kurds in Iraq, Abkhazians in Georgia, Moluccans in Indonesia, and the Tamils in Sri Lanka.

The usual response to such a list is to judge each case for its legitimacy: some bids for independence are deemed clearly authentic, some puzzling but probably okay, others just rabble making trouble.

The very terms we use are loaded with foregone conclusions about legitimacy: “independence,” “self-determination,” “devolution,” “cultural nationalism,” “separatism,” “secession,” and “rebellion” all refer to the same phenomenon, but each carries its own shade of bias.

We see the same range of bias echoed in “revolution” versus “civil war,” and “freedom fighter” versus “terrorist.” It all depends on who’s got a dog in the fight.

If this seems obvious enough so far, the next question may not be so clear-cut: what, exactly, is the “self” in “self-determination”? We refer to “a people” or “a nationality” as the entity that seeks political autonomy in these independence/separatist movements. It would be good to have some kind of operational definition that isn’t muddled by political boundary lines drawn on a map.

Geography comes into it, of course, in the sense of a cultural area or region. Culture and place tend to be closely bound. Regions are difficult to outline, though, so the edges tend to blur. Better to focus on the people who live there.

In many places, the inhabitants share a cultural complex of values and customs — manners, dress, cuisine, outlook, institutions, perhaps a distinct dialect or language — that they identify with, and that help secure their identity as a group.

In some places, the sense is strong, and often leads to a wish for the group to govern itself; in other places, the inhabitants may be just as distinct, but for whatever reasons are perfectly content to let the “host” state do the governing.

In lieu of extensive anthropological field work for each case, probably the most operational definition of “a people” is when a majority of the population in question says so. The people of Veneto, for example, recently polled above 80% in favor of Venetian independence from Italy.

Readers who are following this line of argument so far will probably find the next step an uncomfortable leap, at least if they are Americans. Mere mention of regional self-determination here in the land of E Pluribus Unum is sure to trigger PTSD-like flashbacks to the events of 1861-1865.

Nervous dismissals of Rick Perry and Texas “secession” that paint the situation as merely the ranting of a silly right-wing crackpot are understandable enough. The dismissal works in his case because he is in fact a silly right-wing crackpot, and a fantasy replay of Texas circa 1861 is as unresponsive to the underlying issues as it is unlikely.

Secession is still probably as poor a means for ending overextended dominion as it was the first time around. Washingtonian rule is still very well backed up by sufficient firepower to make that idea a non-starter.

The “S” word is altogether unproductive, and misleading at best. In Vermont, for example, there is a quite respectable independence movement that bears little resemblance to the crackpot demands of a conquered people hollering for a rematch.

The spectacle of Perry and his ilk, unfortunately, tends to obscure some structural issues that would be worth a visit by some well-considered public discourse. From what I can tell, these involve at least two factors that affect governance: cultural unity and effective scale.

It’s probably reasonable to assume, as a general principle, that a smaller, more homogeneous group will reach consensus more consistently than a larger, more heterogeneous one. The grand experiment that is the USA certainly tests the limits on both size and diversity.

We have a continent-sized state with central rule from a capital that is more than a thousand miles away from the majority of locations in its territory, and it faces many of the same problems of scale that confronted ancient Rome.

It’s not just the number of miles to cover — though that is still an issue in the petroleum age, and will be even more so as the petroleum age winds down. It’s also about the number of different regions and cultures that such a large territory is bound to contain. In forming a federal union that consists of so many different cultures, we found it necessary to achieve cultural unity by simply declaring it: “Out of many, one.” Just to make sure, we declared it in Latin.

Although it’s not officially recognized as such, the USA is effectively a multi-national state, just as surely as Canada or India.

As a democracy, this union runs into further complications when trying to determine the public will about what shall be the law of the land. Each of the constituent cultures would have a pretty good internal consensus about the values and institutions that would be the law of their own land, but often, in order to have it so, they must get Washington to make it the law of the whole land — for all the constituent cultures, even if the values of the one conflict with those of the rest.

With many distinct regional cultures, the chances of irreconcilable conflicts in values and institutions are going to increase, just by the sheer numbers involved. We had our first such challenge in 1861, when the institution was slavery, the values were about caste, and the culture was Dixie.  Charleston found some allies in different regions with similar interests, including out-and-out marriages of convenience like Texas, and together they formed another multi-national, federated state.

Up to that point, it could be supposed that the idea of a “federal union” meant that participation was voluntary. By 1865, though, Washington had established unambiguously that such is not the case.

Obviously, Dixie was on the wrong side of the moral issues, and its leaders had a lot of awful ideas about how to run a society. Equally obviously, they didn’t have the same opinion, and a good many of them still don’t to this day.

Without expressing a shred of sympathy, though, we can still observe that the people of the former Confederacy were brought under Union rule by force of arms. In every practical historical sense, they were conquered. That gives the former Confederate states a unique status among the people of America and in its politics.

Historically, a conquered people tends to resent it for generations. They don’t “get over it.” They find any number of ways, sly and otherwise, to make life inconvenient for the conqueror.

It seems likely, then, that Southern resentment accounts for much of the gridlock in Washington, particularly in light of the very credible argument that Republican “Conservative Revolution” culture has Southern culture at its heart, and certainly in most of its political base.

If so, there are probably a couple of factors operating together here. One would be simple uncooperativeness due to resentment of the conquered; the other, an effect of the large-scale, inter-regional competition for central-government favor — Dixie is different, and has some values that are going to conflict severely with those of many others, should they become the law of the land.

Since about 1990, Dixie’s minions have had sufficient power in Washington to grind it to a halt and, well, here we are.

One effect of the long term decline in abundant energy will be the necessity of organizing our affairs on smaller scales, closer to home. Politically, this would mean some devolution of autonomy to more regional and local entities.

To assume that the American federal union will continue in its present capacity is probably not a good strategy for successful adaptation. But there are plenty of peaceful, constitutional options for making more realistic governing arrangements, and these will probably have to be explored at some point in the not-too-distant future.

How we get from here to there is a huge question that’s still mostly taboo, even though it would serve our longer-term interests to give it some serious and creative consideration. It’s not hard to gain an appreciation of how regional self-determination operates in the rest of the world. Perhaps that understanding — along with the cautionary tale of a conquered Dixie — can help show the way forward.

What is the value of where you live?

When we talk about where you live, let’s keep it within walking distance from where you keep your stuff, as we focused on in an earlier post. Presumably, you’re also paying rent on the place, or paying rent on the money that you used to obtain the right to put your name on the deed. Notice here the absence of the term “buy.”

People have bought things for millennia. It’s an old and widespread custom. You go to the market stall or its equivalent, pick out something you like; the vendor hands it to you, you put it in your basket, and if you simultaneously hand her the agreed-upon number of metal tokens or chickens or cocoa beans, you may then walk away with the goods and not have some muscular fellow chase after you.

Safe to say that most of the stuff in the place you live has accumulated there as a result of this very process.  It’s the primary way you gain possession of artifacts of every sort. These are things that you own. Even the artifact enclosing all the stuff got there the same way — somebody went to the builders’ supply, handed over some tokens, carted the materials to your particular “where,” and put it together into a nice, cozy shelter.

The unique and somewhat peculiar element in what eventually becomes “home” is the fact that there exists a place to stack the lumber and materials; a place to stand while nailing it all together; a place to put the resulting big box in which to stash your stuff and your self. This element did not come from any store. Rather, it’s a rectangular piece of this ancient and living planet that some original someone first marked off and then, for reasons difficult to justify, declared to one and all: “I OWN this land.”

Note that he did not have to exchange any tokens for it. Even so, we just know that the next guy to take his place gave plenty of tokens to the first guy, who was only too happy to receive them in exchange for “his” land, along with, no doubt, a keen sensation of having gotten away with something big. Note also that the subject of the transaction never went into a basket or a wagon, or got transported to any other place. Obviously not — it is a place.

For new householders who may have wondered about the ownership ancestry of their multi-hundred-kilobuck purchase, there’s a fable told in legal circles about tracing title to land — apocryphal, no doubt, but it does shed some light on how the first guy in the chain might have justified having his name on the deed. It seems that a New Orleans lawyer, going through a routine verification of title for a real estate deal, traced it as far back as 1803, but the lenders insisted that it be cleared “back to its origin.” Exasperated, he obliged them with a history lesson:

“Louisiana was purchased by the U.S. from France in 1803, the year of origin identified in our application. The title to the land prior to U.S. ownership was obtained from France, which had acquired it by Right of Conquest from Spain.”

For good measure, he went on to cite Spain’s acquisition of the territory by “Right of Discovery,” wherein Columbus had acted on the authority of Queen Isabella of Spain, whose title to the New World was confirmed by Pope Alexander, who in turn was serving as Earthly agent for God Almighty, the very creator of all the real estate on the planet. So much for origins; his clients got the loan.

It’s striking how the credibility of each earlier claim gets increasingly iffy prior to 1803: if we are to swallow the premise behind “Right of Conquest,” it’s another way of saying that might makes right. By the same token, “Right of Discovery” essentially means “finders keepers,” regardless of the circumstances surrounding the “find” and its earlier habitation; finally, the divine agency attributed to Rodrigo Borgia, a.k.a. Alexander VI, revisits an open question that has been the cause of considerable dispute and bloodshed down through the ages.

Only at the point of origin does the claim become more substantial, particularly if we recast “creation of God Almighty” as a more empirical “gift of Nature.” That’s what land is, and arguably, its proper recipient should be all humankind, or at least the portion of it currently inhabiting the land in question. That’s the idea behind the concept of the commons — it’s what belongs to us all, because it was here when we found it, and none among us can take credit for making it.

The commons includes a wide range of resources like air, water, the radio frequency spectrum, fish and wildlife, genes, and land explicitly set aside for public use. There are also artificial commons, like a national currency or public rights-of-way.

Ever since the 18th-century Enclosure movement of aristocratic landowners who appropriated public grazing land for the exclusive pasturing of their own flocks, those in a position to do so have systematically appropriated all sorts of public commons for private gain. This includes externalizing costs wherever possible, often in the form of using air and water as sinks for industrial waste.

The patch of ground that is yours to use exclusively by virtue of the name on the deed has a monetary value, as you’re only too well aware when you sign the monthly payment to your partner in ownership, the mortgage banker. The ground itself has a value independent of whatever “improvements” may be located there — most counties assess and tax them separately, in fact.

The odd thing is, you probably paid more for that piece of ground than the previous owner did, even though it’s exactly the same ground as before. If we’re considering urban land whose value is limited to its use as a location, rather than for productive value like logging or mining, then where did that extra value come from? The easy answer “market supply and demand” might not be untrue, but it only puts a specific amount to the extra value — it doesn’t account for how any extra value should happen at all.

Real estate agents pursuing the plumpest commissions have a well-worn identifier for value: “Location, location, and location.” This is their complete list of the important criteria; notably absent from the list are any qualities of the product itself. This surely makes real estate unique among all products, where the nature and qualities of the product are the main thing. Curious as that might be, the saying does offer some deeper insight than the average agent might suspect.

A valuable location has desirable bits of civilization around it: good schools, a good selection of stores and restaurants; parks, libraries, entertainment venues, and of course, other properties on the block of comparable value or better. Put another way, then, the basis of value for a property is the value of everything NOT the property that happens to share the same vicinity — the same place. It’s the whole place that is made valuable by the community that inhabits it. It’s the “where” in where you live.

Odder still, with all of this value that the community gives an individual property, most of it is captured by the individual owners — you and the bank. Some of it goes to you when you sell, but most, of course, goes to the bank in the form of interest. Very little of the value created by the community remains for the benefit of the community — the little that does return to the community comes in the form of property taxes, a generally inadequate and misplaced means to that end.

This value due to place, the community, is arguably part of the commons, every bit as much as a public park, but the bank has managed to “enclose” most of it by this peculiar method of defining and financing “private” property.

These ideas were first identified and worked out in some detail by economist and social philosopher Henry George, whose 1879 work Progress and Poverty was a best-seller in its day, and won him a sizable following for decades to come. Its relevance is enduring, for we still use land the same way today.

George worked out how to quantify the “commons value” of a piece of land, terming it “economic rent” (not to be confused with the money a tenant gives the landlord every month). He put it in the context of tax policy, advocating the idea that you tax things that society wants less of, and don’t tax things that society could use more of.

Property taxes may capture some of the economic rent (the “commons value”) for the public, as is appropriate, but they arguably put a misplaced burden on the wages of the homeowner, money that then cannot be spent elsewhere in the economy. Meanwhile, the bank captures most of the rent through interest on the mortgage. The idea behind the “Land Value Tax” that George proposed is to redirect that value from the bank’s income stream back into the hands of the public.

This has the advantage of being the least burdensome for the economy, in that interest income is not derived from productive activity, like wages and profit are — in fact, it is often referred to as “unearned” income. Rather, the interest from a land deal is largely due to the monopolist’s advantage of controlling exclusive access to that particular piece of land — which is another way of describing land “ownership.”
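The structural point can be put in toy numbers. Everything below is invented purely for illustration (the parcel values and tax rates are not from George), but it shows the incentive difference between a conventional property tax and a land value tax raising the same revenue:

```python
# Toy comparison of a conventional property tax with a land value tax.
# All figures are invented for illustration only.

land_value = 100_000    # the location's "commons value," created by the community
improvements = 200_000  # the building the owner paid to put there

# A 2% conventional property tax falls on land AND improvements...
property_tax = 2 * (land_value + improvements) // 100

# ...while a 6% land value tax raises the same revenue from the location alone.
land_value_tax = 6 * land_value // 100

print(property_tax, land_value_tax)  # 6000 6000: same revenue either way

# But add a $50,000 renovation, and only the conventional tax goes up,
# penalizing exactly the productive activity George wanted left untaxed.
print(2 * (land_value + improvements + 50_000) // 100)  # 7000
print(6 * land_value // 100)                            # still 6000
```

The revenue-neutral rates here are chosen to make the comparison clean; the point is what each tax responds to, not the particular numbers.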

The bank would still get enough of the action to have an incentive to lend, but no longer would it be in the business of enclosing the commons — nice work if you can get it, fellas, but sooner or later this racket has got to stop.

It’s a blunt message we can send them, with the help of Mr. George: this is where we live, its value is because we live here, and by rights the value stays here with us.

Where do you really live?

One simple fact of life got lost in the advance of instant communications and nearly-instant travel: our connection to place. Promoters of globalism tell us that “geography doesn’t matter anymore,” and we take it for granted that we can literally be in any other locale on the planet before even needing a change of clothes. We spend so much of our lives projecting ourselves elsewhere, through a screen or inside a vehicle, that we lose any abiding sense of where we do in fact abide. For the majority of people living the late-industrial workaday life, home is where the bedroom is.

On closer inspection, “where” turns out to be a little word with a big reach: it serves to identify not only the location of an object, but also to indicate the object’s characteristics that led to its being placed there. For example: “Where is the good stuff?” Answer: “It’s on the top shelf.” Or: “Where are your sweat socks?” Answer: “They’re in the laundry hamper.” When we speak of having “a place for everything,” we understand that the nature of the place is fitted to that of the thing that goes there, and vice versa. In short, it belongs in that place.

The same applies to people, arguably. Except that people, being more mobile than bottles of 30-year-old single malt or dirty gym apparel, tend to be a lot more slippery about the relative scale of location. Traveling abroad, you’d probably respond to the question “Where do you live?” with “I live in the U.S.” While that doesn’t narrow it down anywhere close enough for the purpose of actually locating you later, it does communicate “that’s where Yanks live,” along with the associated traits that your new acquaintance is now likely to expect from you.

It works much the same at smaller scales: visiting another state, you’d probably name your home state in response to the inquiry; in the same city, you’ll probably name a neighborhood, if not a street address.

At smaller scales, the answer to “where do you live?” also becomes more literally true. You really do live at that address: that’s where your stuff is, and that’s where you can be found — at least one-third of the time, even if it means waking you up. You really do live in that city, also: most of your habitual comings and goings are within its bounds. In ecological terms, it’s your range, your territory — your habitat. Do you live in that state, though? Not so much. And that enormous territory we call America? Only figuratively. You as one individual can’t literally inhabit a continent.

One way to de-muddle a definition is to find a way to make it operational. Instead of asking “where do you live?” we could substitute “what area do you inhabit?” and we’d have an operational equivalent. Returning to the ecological ideas of habitat and range, let’s pose a thought experiment: you take out a map and draw the route of every trip you’ve made for, say, the past year, day in and day out.

The result will be quite a dense web of lines, but one area will emerge that’s so dense with your habitual comings and goings that it’s nearly solid. It will probably be very obvious. Draw an outline around it, and you’ve identified your range; this is the area you inhabit. This is the real, unambiguous, data-driven, nitty-gritty “where you live.”
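For the curious, the thought experiment is easy enough to automate if you have a log of visited coordinates. A minimal sketch, assuming a plain list of (x, y) positions in kilometers; the function name, the centroid-distance method, and the 90% cutoff are all illustrative choices, not a standard definition of home range:

```python
import math

def habitual_range(points, keep=0.9):
    """Bounding box of the densest `keep` fraction of points, ranked
    by distance from the centroid of the whole set."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    # Keep the fraction of points closest to the centroid, discarding
    # the rare long trips that don't define day-to-day habitat.
    ranked = sorted(points, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
    core = ranked[: max(1, int(n * keep))]
    xs = [x for x, _ in core]
    ys = [y for _, y in core]
    return (min(xs), min(ys), max(xs), max(ys))

# Synthetic travel log: daily rounds cluster within a few km,
# plus two long out-of-town journeys.
daily = [(i % 5, (i * 3) % 4) for i in range(100)]
rare = [(250.0, 40.0), (-180.0, 300.0)]
box = habitual_range(daily + rare)
print(box)  # (0, 0, 4, 3): the long trips fall outside the reported range
```

With real GPS data you’d want something less crude than a centroid cutoff (the densest cluster, not the average, is what matters), but even this version makes the point: the core of the scribble is small.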

Ideally, the “where” that you live isn’t just the geographic coordinates of your daily rounds, any more than your home is just a house. The difference is the sense of belonging that the place gives you. It works much the same way as the sense of belonging you find with the important people in your life. If you’ve found the place you belong, it’s part of who you are.

This is of course an ideal that is systematically discouraged by the current industrial economic setup. The economic engine tends to function better with a workforce that is mobile, interchangeable, and undistracted by the desire for identity, with its complicating ties to persons and place.

For the sake of functionality, such pesky human needs do get addressed in a perfunctory way, typically with an array of substitutes and simulations: television, luxury goods, branding, patriotism, team sports and professional associations. This brings us inevitably to the all-too-familiar landscape of alienation and anomie that provided so much material for late twentieth-century social criticism and psychological storytelling.

Despite what the calendar says, we’re still trying to live in the century that created this great, all-encompassing engine, and this engine runs on fossil fuel. Throughout the whole era, there has never been a time when the essential premise of cheap, abundant energy was seriously challenged. The premise is not just abundance, but a steadily increasing abundance of cheap energy. “Growth” has become synonymous with “prosperity,” and it is probably no coincidence that the industrial-era economy has historically grown at about the same rate as oil production.

Almost coinciding with the turn of the new century came a truly new turn of events for our ever-growing oil supply: in 2005, it stopped growing. It has been flat ever since, and on a historical scale those intervening years represent but a few seconds of hang-time in the trajectory of world oil production; nothing in the oilfield development pipeline will even keep it flat, let alone put it on the ups again. After all of the “yes, buts,” the professions of faith in heroic technology, and the necessary deployment of small-scale renewable sources, we’re facing a future with maybe one-fifth of the energy we’re accustomed to today.

That’s bound to change things. Scenarios vary greatly — the major differences having to do mostly with the rate of change — but the upshot of most credible ones is an overall contraction in human affairs. The world at large will once again be, well, large, and the individual’s world will become a good bit smaller. As horizons contract, they will begin to match the dense scribble on the map that identifies your habitat.

The successful adaptation comes when all who share your habitat set things up in such a way that you get most of what you need from within it. Well-adapted life in such circumstances is decidedly local, where place matters not just emotionally, but also materially.

To devotees of things big, fast and shiny, the prospect of contraction is decidedly gloomy. Indeed, no one in our political and economic leadership is defining issues in any terms but those of the century past, nor making any gesture toward the realities of this one, where the central concern will be how best to manage contraction.

But for anyone untempted by the constant pursuit of bigger/faster/shinier, there are some definite bright spots in the gloom: living on a local scale is also living at a human scale, with all of its potential for reconnecting who we are with where we are. We might not be living as large, but odds are that we’ll be living a lot more real.