The normalization of the extraordinary, and other thoughts on Ferguson

1. As the events play out each night, we can't ignore the fact that for a great many people, rioting is fun. Not all the protestors are on the side of the angels.

That said, what we are seeing in Ferguson is, more or less, the making explicit of a number of post-9/11 trends in North America, beginning with:

2. The permanence of the temporary: the extraordinary police and judicial powers that were awarded immediately after 9/11 under the guise of defending liberalism against clear and present danger have become entrenched. To this extent, we have validated the Nazi philosopher Carl Schmitt's view that liberal democracy is at core a sham, because it is unable to defend itself on its own terms.

3. The normalization of the special: The S in SWAT might as well be changed to N, since we have asked police at all levels to be faster and more aggressive in their response to potential terrorist threats. Don't wait for backup if someone is shooting up a school, and so on. But this means giving beat cops tactical shotguns and Kevlar.

4. The militarization of policing in North America, especially the United States, has been a recognizable problem for a few years now. The most obvious problem of course is the outfitting of even rinky-dink county cop shops with LAVs and sniper rifles and MRAPs and camo and all the rest.

5. This is partly a response to points 2 and 3 above -- that is, it was driven by the feds -- but don't discount the visceral desire by many cops, especially those too dumb or too fat or too old or too young or too risk averse to have fought in Iraq or Afghanistan, to "get some". The more they are kitted out like the army or the marines, the more they are going to feel like soldiers, and the more they are going to want to act like soldiers, and the more they are going to want to earn their stripes in something resembling combat.

6. That is why the real problem with what we are seeing in Ferguson is not the equipment, but the culture. And by that I don't just mean the culture of policing, but our culture as a whole. Over the past decade, the dominant themes and motifs of our culture have become increasingly militaristic. Partly it's video games, but that's just a small part of it. Stuff that is essentially gym gear is now branded as "combat" or "tactical". People used to go running or to the gym; now they do Spartan Races or Tough Mudder courses and go to CrossFit, which has its origins in the military. Weekend warriors no longer play paintball; instead they participate in compressed versions of the SEALs' hell week. And so on.

7. What gets lost in all of this is the extent to which the military is a distinct culture, and you can't simply give its gear and its training methods to police and expect good results. One thing that is interesting about the military is the rigorous legal, administrative, and moral codes that govern the use of force. You could call it "honour," though you'd just get laughed at. Meanwhile, a number of veterans of Iraq and Afghanistan have been commenting on social media about how the police in Ferguson have operated under far looser rules of engagement than the soldiers did walking patrol in Fallujah or Kandahar.

8. This alone should make everyone stop and think really really hard about what has happened to policing, because

9. Enforcing the law and maintaining the civil order in a constitutional democracy is fundamentally different from forcing a political outcome upon a foreign power through the use of violence and deadly force. They are so different that the very notion that the equipment, training methods, culture, and norms that govern them should be shared is insane.

Exercise for further discussion: The Libertarian Conundrum

10. The events in Ferguson pose an interesting problem for libertarians. While it has been gratifying to see people like Mark Steyn chime in on the absolute gong show that is law enforcement in America, people like him quickly run up hard against what we can call the Libertarian Conundrum:

On the one hand, libertarians are in favour of what has been called the "night watchman" state. The state should enforce contracts, protect property, life and liberty against assault and theft, but otherwise leave people alone.

But a big part of leaving people alone is letting them own guns, so a libertarian society is one that will likely have a large number of individuals armed with military-grade weaponry. And as a number of police officials in the US have pointed out, we can't have police forces trying to enforce the law with revolvers when they are going up against people armed with AR-15s.

So if you're a libertarian, you have a problem. You can have your guns, or you can have your minimal state, but you can't have both. It would be interesting to debate which is worse, the Nanny State, or the Military State.


In praise of vat-grown meat

The world's first lab-grown hamburger to come out of Mark Post's lab was taste tested today, and by all accounts it wasn't too bad at all. Some key points from the CBC's story:

  • "The first (lab-made) meat products are going to be very exclusive," said Isha Datar, director of New Harvest, an international nonprofit that promotes meat alternatives. "These burgers won't be in Happy Meals before someone rich and famous is eating them."
  • Sergey Brin, a co-founder of Google, announced that he funded the 250,000-euro ($330,000) project, saying he was motivated by a concern for animal welfare.
  • Scientists agreed that improving the flavor probably won't be hard.
  • The animal rights group PETA has thrown its support behind the lab-meat initiative.

You couldn't find a more perfect combination of private entrepreneurialism, X-Prize-style achievement, social welfare, and status seeking. I love the idea of vat-grown meat. Here's a piece I wrote about the prospect a year and a half ago, when Post announced that his team was getting close to a viable product. It was originally published in the Ottawa Citizen:


Meat without the murder

Monday, February 27, 2012

It's getting increasingly hard to find anything good to say about meat. It is expensive, swallowing huge tracts of land and tons of grain. It has a large carbon footprint. Finally, industrial meat production is hard on the animals, even before they're slaughtered. The operational slogan of PETA, the animal rights organization known for its radical politics and attention-grabbing stunts, is "meat is murder." But you don't need to grapple with the metaphysics of personhood to concede that for most people, eating animals involves a certain amount of deliberate denial about the circumstances under which meat is produced and how the animals are treated.

As things stand, we are largely divorced from the animal origins of our food - what we see in the supermarket are not animal parts, but flat white slabs of chicken, dark cubes of beef, rounded discs of lamb with convenient bone handles. The extreme end of this distancing is a proposal that was presented recently by André Ford, a student in the architecture department of the Royal College of Art in London. He has designed a sort of vertical biomechanical latticework into which lobotomized chickens could be plugged, with food, water and air delivered by a network of tubes, with excrement removed in the same manner.

If that sounds disturbingly like the Matrix for chickens, that's pretty much what Ford is proposing. But while this project is more shock art than industrial architecture, Ford makes an important philosophical point: "It is time we stopped using the term 'animal' when referring to the precursor of the meat that ends up on our plates. Animals are things we keep in our homes and watch on David Attenborough programs. 'Animals' bred for consumption are crops and agricultural products like any other."

If that strikes you as repulsive, perhaps that is because it forces us to confront the bad faith that permeates our debate over industrial meat production and consumption: If chickens are not "animals" (in the David Attenborough sense) then there is no real objection to something like Ford's proposal. But if they are animals, then perhaps we should stop eating them. At the very least, their capacity for fear, pain, and suffering is morally relevant to how we treat them.

But what if there was a technology that did away with all of these drawbacks, cutting through the moral bad faith while giving us a constant supply of low-cost, environmentally sustainable, and suffering-free meat products? That is, if we could grow meat in a vat, would there be any reasonable objection to eating it?

It's still a hypothetical question, but it won't be for long. At the American Association for the Advancement of Science (AAAS) annual meeting in Vancouver last week, professor Mark Post of Maastricht University announced that his team is getting closer to a workable process for lab-grown meat. Their first successes were with pork, and by the end of this year they hope to replicate the process with beef, giving them a product "that looks, feels and hopefully tastes like meat." Another scientist working on meat in a vat is Vladimir Mironov of the Medical University of South Carolina. Mironov envisions "football field-sized buildings filled with large bioreactors, or bioreactors the size of a coffee machine in grocery stores" to produce this meat. "It will be functional, natural, designed food," he said. "How do you want it to taste? You want a little bit of fat, you want pork, you want lamb? We design exactly what you want. We can design texture."

The idea of custom-designed meat products opens up a whole new realm for interesting (and relatively harmless) experimentation and competition. You can imagine celebrity chefs designing their own special lines of meat textures and tastes; a well-designed "blend" could be sold for meatballs, or stews, or meat pies. Imagine the delights of a steak that was a mixture of lamb and venison, or - even better - polar bear and panda.

At the same time, in vitro meat will suffer from all the drawbacks of everything else that is produced cheaply and for mass consumption - it will be "inauthentic." And so it will inevitably drive a more pernicious form of authenticity-mongering among people who will only eat meat grown "on the hoof." At the extreme, you can imagine private or invitation-only restaurants and supper clubs opening up where certified on-the-hoof meat is provided to the privileged elite. The fact that it might be illegal would only add to the experience.

There is the ick factor, of course. Meat in a vat just sounds gross, even before you read that the Maastricht group's process requires stem cells from cows and calf serum as inputs. But we humans have a strong tendency to confuse esthetic reactions with moral judgments, and if you want to see something that is genuinely objectionable, as opposed to merely repellent, take a visit to your local abattoir.

Mironov's lab was shut down this month by the university over what it described as "human resources" issues. PETA has lobbied for him to continue his work, and is funding his research even though the U.S. government will not. This is telling, because PETA understands that the goal is not to micro-manage consumer preferences, it is to prevent harm to animals and to the environment. And if all of that is taken out of the equation, there's really not a lot to object to when it comes to eating meat.



How the world works

*Comments are open for this post*

Nature matters more than nurture

Sex matters more than gender

Friends matter more than parents

Situation matters more than character

Norms matter more than laws

Institutions matter more than culture

Economics matters more than morality

Family matters more than state

Narrative matters more than truth

Identity matters more than rationality

Cohort matters more than generation

Class matters more than income

Status matters more than well-being

Race matters



Hipster chickens coming home to roost


“It’s the stupid foodies. We’re just sick to death of it…. People don’t know what they’re doing.” — Mary Britton Clouse, owner of Chicken Run Rescue.

Some things are so predictable they are indistinguishable from a natural law. Night follows day, Leafs don't win the Cup, summertime construction on the streets of Montreal. And so it is with the rise in unwanted chickens being dropped off at animal shelters across North America.

It all began early in the previous decade, when forty years of cool-hunting was quickly supplanted by authenticity seeking, and food replaced fashion and music as the primary basis for urban one-upmanship. Organic turned to local became artisanal morphed into a full-blown back-to-the-19th-century self-sufficiency movement. Take a big helping of modern foodieism, fold in locavore-driven moralising, add a double dash of hipster status-seeking, and you got the urban chicken-farming movement. 

And now that the fad is getting a bit tired (the cool kids have moved on to shooting), the chickens are being abandoned by their owners. Chickens, it turns out, are a lot of work. They're also not cheap to own and operate. They are pretty mean animals. They stink and they are gross. And they can live for ten years or so, long long after they've stopped laying eggs. 

A lot of formerly eager chicken owners, having finally done the math, are dropping them off at the local humane society or animal rescue centre. (Why not just eat them, you ask? Good question.)

Anyway, it isn't like this wasn't completely predictable. In fact, the executive director of the Ottawa Humane Society made all of the obvious points three years ago, when the fad first came to the nation's capital, in an interview with CBC Radio. It's a fad, he said. It's expensive. It's hard, and it's gross. He predicted that the chickens would be coming his way soon enough, and said "frankly, we don't need the work".

You could argue, as my colleague David Reevely did at the time, that this is somewhat akin to the fire department objecting to new housing subdivisions: it is more of an argument for changing the funding model of the Humane Society than it is an argument for banning urban chickens. After all, he argued, we should expect the vast majority of urban chicken owners to be just as responsible as the vast majority of cat and dog owners. Why punish the many for the sins of the few?

Except the problem, as I see it, is that owning urban chickens is nothing like owning cats and dogs. Cats and dogs are domestic pets, while chickens are domesticated livestock. There is a large secondary market for abandoned cats and dogs, while the secondary market for urban chickens is likely to be non-existent (unless the market is the local food bank). But most importantly, unlike the urban chicken craze, cat and dog ownership is not a transient fad that will be supplanted by something even more authentic within a few years.

Nor does it help matters to concede, as Barbara Cartwright, the CEO of the Canadian Federation of Humane Societies does, that "people who attempt to raise backyard chickens are driven by good intentions — to be more environmentally conscious, humane and to eat healthier." The road to Perverse Outcomeville is paved with these intentions, and it does no good to praise good intentions while ignoring the actual consequences. 

So let's just state it plainly: Urban chicken farming is no more environmentally conscious, no healthier, no cheaper, and no more economically worthwhile than regular chicken farming. And now that urban chickens are being abandoned by their owners like hamsters by five-year-olds with ADD, the one selling point of the movement -- that it is more humane than factory farming -- is gone.

Once we get clear on this truth: that it is consequences that matter, not intentions, we can turn our attention to the broader problem of people wanting to bring animal husbandry back into our cities. There were sound reasons why we pushed this part of our economy out of the downtowns -- reasons based in public health, animal welfare, and simple economics -- and before we take our city centres back to the 19th century, we should at least make sure our decisions are based on something other than hipster fads wedded to dopey nostalgia.


Why Rob Ford is the Amy Winehouse of Canadian politics

"They tried to make me go to rehab but I said 'no, no, no'" — singer Amy Winehouse, before dying of alcohol poisoning.

"Everything's going fine" — Toronto mayor Rob Ford, last week, after a bunch more staffers resigned. 

By now it is pretty clear that whatever else he may be, Toronto mayor Rob Ford is a very sick man. He suffers from any number of pathological cravings, obsessions, and addictions, from food to sex to booze to whatever else. And like all addicts, he has more than his share of enablers — people who helpfully pave his road to self-destruction even as they pretend to be acting in his best interests.

I’m talking about the innumerable pundits and reporters and fly-by-night political commentators who have spent the better part of the last three years telling everyone who would listen that Rob Ford’s vices are actually virtues, that his addictions are features, not bugs, and that the Unique Selling Proposition that the mayor uses to keep Ford Nation barking at the moon is the fact that he’s “authentic”.

What is authenticity, and why does it matter to politics? That’s a tough question to answer in a short space — I wrote an entire book about the subject and don’t think I even really got to the heart of the matter. But like a lot of bad ideas, the cult of authenticity seems to have entered our political vernacular from the United States, where there has been, over the past decade, a growing conviction that the biggest problem with politics these days is that our leaders are not authentic enough.

The argument goes something like this: modern politics has become dominated by large political parties and their shiny, prefab leaders who are about image not substance, who speak only in sound bites and talking points, govern with both eyes on the overnight tracking poll, and who delight in breaking their promises while pretending they never made them in the first place and demonizing their opponents while purporting to take the high road. Modern politics is mass-marketed phoniness, and it is no surprise the electorate is completely alienated.

What we crave (to continue the story) is authenticity.  The American writer Joe Klein signposted the search for the authentic in his 2006 book Politics Lost, an essay about the decline of authenticity in presidential politics. Klein took his inspiration from what he called Harry Truman's "Turnip Day" speech at the Democratic convention in 1948 that confirmed his nomination for president. Coming on stage after midnight, speaking plainly, simply, and without notes, Truman challenged the "do-nothing Congress" to act upon those views they claim to endorse, and get back to work.

Klein thinks we need more Turnip Day moments, more politicians like Truman. He argued politicians need to "figure out new ways to engage and inspire us — or maybe just some simple old ways, like saying what they think as plainly as possible."

It’s a good anecdote. The problem is now every authenticity-mongering pundit wants their own Turnip Day homily with which to beat the audience into submission. The most famous variation is David Brooks’ throw-away line about how Americans always vote for the presidential candidate they would most like to have a beer with. (A principle which, if true, would see Joe Biden elected president-for-life.)

A year and a half ago, the Canadian pundit Allan Gregg delivered a lecture to the Public Policy Forum called "On Authenticity: How the Truth can Restore Faith in Politics and Government” in which he claimed that our leaders' most systematic failure is that "they have not picked up on the electorate's craving for authenticity nor adjusted their behaviour to conform to this new reality."

Gregg had his own Turnip Day homily to explain just what he's getting at. He tells a story about the night he went to see a band in a club in Manhattan when the guitar player's electric pickup broke. Instead of stopping the show to fix the guitar, the band unplugged their instruments, moved closer to one another, and performed an intimate number. "As the last chord was struck, the room literally exploded with rapturous cheering, hooting."

Gregg saw a lesson in this for our politicians. What they need to do, he suggested, is unplug from the way they've always done things and try to reconnect with the electorate. They must drop the prefab talking points designed to "conceal meaning." They need to stop claiming to be the only island of virtue in a sea of knaves. They should cancel all political advertising, and talk straight to the people, saying what they mean and meaning what they say.

And the avatar of this movement, according to Gregg, is Rob Ford, whom he describes as "a leather-lunged, no-necked know-nothing." And in case you think that’s an insult, Gregg goes on: "In Rob Ford’s instance, his very crudeness and unrefined nature made him seem ‘real’ and signalled he was not trying to hide anything from voters." That is to say, Rob Ford won the race for mayor of Toronto because he’s authentic.

Allan Gregg is far, far, far, from the only person to have made this argument. The "Rob Ford is popular because he’s authentic" line started during the 2010 election and continues even as he fights to keep his job over allegations that he’s a crackhead. Here are some selected examples:

We've got some fascinating artifacts of authenticity on our political stage today, some good, some troubling. When Rob Ford was first elected, I stood in a public square listening to him speak, thinking, uh oh, this man is trouble for all who oppose him. Why? Because the mayor says what he means, and he doesn't give a flying fig what opponents think of him. — Judith Timson, Toronto Star, April 2013

Ford, who won by running as an unrefined, yet garishly authentic, outsider, is an outsider once more. His war against the downtown establishment - they of bike lanes and gravy trains - can now continue with renewed relish, and perhaps even success; if Ford runs again, he may well win. — Adam Goldenberg, Ottawa Citizen, November 27 2012

Christie Blatchford has written a number of columns lauding Ford for his "authenticity", and while she has made a point of saying that she regrets voting for him, she also makes a point of reminding readers why she voted for him in the first place. The key for Blatchford is not who Rob Ford is, it is who he is not: He wasn’t a part of what she calls "that soft-left ruling class” that likes to think they run Toronto. And so we get to:

Mr. Ford is surely deeply flawed. Well, so are most of us, me anyway. But, to use a modern term, he is also authentic. — Christie Blatchford, National Post, November 26 2012.

I could pile up the examples like cordwood — just Google “Rob Ford” and “authentic” and your evening will be shot — but you get the picture. The question is, what are we to make of it? How can we get any critical traction on Rob Ford when we are told, over and over again, that what looks to all the world like a serious problem with his character is actually his greatest asset?

It is tempting to rehearse, yet again, the arguments for why the desire for authenticity in politics is self-defeating, and prone to catalyzing the very problems it purports to solve. But instead I’ll try a different approach and suggest that what is being pitched as “authenticity” is actually something far more dangerous, for both the electorate and for the leader who cloaks himself in its embrace.

For starters, Ford’s supporters consistently mistake populism for authenticity. Authenticity, at its purest, represents a perfect alignment between the inner self and its outer manifestation. It describes someone who is self-contained but transparent to the world, innocent without being naive, and sincere without being cloying. Such a person, if he or she ever existed, would make an absolutely atrocious politician.

Rob Ford is not authentic. Instead, he’s just another populist. And in the current climate of North American politics, populism is just another put-on, a mask, a front, that some politicians adopt in order to seem like one of the people. In America, populists thump bibles and kiss babies and warn against commies and talk about craw fishin’ or huntin’ and talk about the Heartland and Families and the Family Farmer. In Canada, populists write books about hockey and hold press conferences at Tim Hortons and warn against commies and talk about hockey and warn about crime and defend the Family Farm and give medals to hockey players.

Again, populism is not authenticity. It’s a pose, a marketing position, a brand. And it is just as phony as any other political posture out there. Sometimes it works, as it did for Rob Ford. And sometimes it flames out spectacularly, as it did for that moose huntin' maverick mom, Sarah Palin. 

But it isn’t clear that Rob Ford is even much of a populist. About the only truly populist kite he’s ever flown is the whole stop-the-gravy-train thing, which some people thought meant he was committed to lowering taxes. As it turned out, he actually thought there was a literal gravy train at City Hall and that stopping it would fix Toronto’s finances. He’s also a bigot and pretty obviously hates the gays, and if you want to call him a populist on those grounds, you’re welcome to the term and the baggage it brings with it.

No, there’s something more basic to Rob Ford’s personality, and there’s nothing that appealing about it: the man has zero self-control. Whether it is reading while driving himself to work, drinking at official functions, going to KFC while on a much-publicized diet, or allegedly smoking crack and hanging out with drug dealers, it is clear that Rob Ford is simply incapable of resisting temptation, delaying gratification, or otherwise acting in anything other than his short-term interest.

And — it is crazy that this needs pointing out — there is nothing politically or morally praiseworthy about this. In the Republic, Plato hailed rational self-mastery of the passions as the key to both personal well-being and the proper functioning of the city. A few thousand years later, Freud suggested that the control of the id by the super-ego, moderated by the ego, was the key to being a properly formed adult, and the lynchpin of civilisation. In between and since, no one has seriously made the case that rule by the passions, the id, the animal instincts, whatever you want to call it, is a viable way to run a polity of any size. More to the point, no one has credibly argued that this is any way for a grownup to behave. 

Except, that is, Rob Ford’s enablers, whose greatest fear is that Rob Ford will go to rehab and expose their ongoing support for what it really is: a dangerous and foolish egging-on of a very sick man. Which is what makes Rob Ford into less of a buffoon and more of a tragic figure. It turns Rob Ford into the Amy Winehouse of Canada.

Remember the first time you heard Winehouse singing “Rehab”? I do. I loved it. The casual defiance, the stick-it-to-the-man refusal to go along with square society’s medicalization of boozing. Which is weird, because I actually co-wrote a book criticizing that very attitude – the studied rebellion that treats every institution, from grade school to the hospital, as part of the great conformist system of mass society.

But love it I did. We all did, for mostly the same reasons. Why should Amy Winehouse go to rehab? After all, weren’t her problems – her drinking, the drugs, the depression and the self-harming – the very font of her art, her creativity, and her soul?  “Rehab” became a rallying cry for barflies everywhere. It also showed that, despite decades of public education on this issue, we still don’t take seriously the proposition that alcoholism, drug abuse, and even depression, are actual illnesses.

Imagine if, instead of being an alcoholic, Amy Winehouse had cancer. And imagine she wrote a song called “Chemo” with the lyrics “they tried to make me go to chemo, I said ‘no, no, no’”. Or if she had an infection, and she sang “they tried to give me antibiotics, and I said ‘no, no, no’”. It would be a joke. But deep down, most of us don’t quite accept that alcoholism or drug addiction are diseases like any other. They're self-destructive, sure, but there’s also something romantic about them. These are not new observations: the celebration of fucked-up artists is one of the defining features of our culture. When Amy Winehouse recorded “Rehab,” she was telling the world that she didn’t buy into the notion that her drinking was an illness that needed treatment. When we bought the record by the millions and gave her a Grammy for it, we told her we agreed.

Did this popular support play a role in her subsequent death? When she sang about not going to rehab and we cheered and called her authentic, did she internalise the value system we were pushing on her?  That is, I wonder if Winehouse, like others before her and since, bought into her self-image as a messed-up singer of the blues, which made it that much harder for her to get clean.

I'm not suggesting she was simply playing a role, or that she killed herself in the name of cred, but there is a powerful looping effect in all of our identities. All identities are social constructs which get their power from being recognized by others. As a result, there is a feedback loop in our identity construction, where we internalise the norms that govern our chosen (or assigned) identities. When the norms of a given identity contain a built-in mechanism for both radicalisation and self-destruction (as they do for an identity like "messed-up singer of the blues"), it is not hard to see how it could become literally inescapable.

So then imagine you one day find yourself the mayor of one of the biggest cities in North America. You aren’t without your charms, and the people around you aren’t without political savvy. But you also have serious personal problems, which play havoc with your health, your personal life, and threaten your ability to do even the most minimal parts of your job. Yet the worse things get, the more you spiral down, the more your so-called supporters cheer you on.

What would you do? Where would you go? Who would you turn to for advice? In such circumstances, I think you would hope you could rely on someone who has known you all your life, who loves you for who you are but who knows that who you are involves habits and appetites that, unchecked, might get you and even others killed. That is, you would hope there was someone close to you who loved you like a brother.

Does Rob Ford have such a person near him? I honestly hope he does. His life almost certainly depends on it. 


On the stupid "root causes" debate



One of the few benefits of growing old is seeing that the cycle of society is a cycle of stupidity; that the same moronic arguments come, and the same moronic arguments go, and that at a certain point it really isn't worth the time and effort explaining to the stupid just why they are so stupid. 

And so it is with this week's typically feigned outrage over Justin Trudeau's comments, made very early after the bombings of the Boston Marathon, that we should look for the "root causes" of these events. My colleague Andrew Coyne has taken time out of his life to explain why there is nothing objectionable about what Trudeau said: 

Recall that Trudeau was speaking in advance of anyone having been named as suspects, or any of their background or possible motives having been identified. We did not know what or whom we were dealing with: an organized conspiracy, or a lone nutter. But he was right to suggest that whoever did it would have to be someone who had become, for one reason or another, detached from basic social norms: as he put it, “who feels completely excluded, completely at war with innocents, at war with a society.”

This is so obvious that it beggars belief that anyone would try to find anything nefarious in Trudeau's remarks. But it brings to mind a similar occasion, over a decade ago now, when Jean Chretien was similarly chastised by the Canadian right for proposing that, as part of its response to the attacks of 9/11, the US government should maybe seek to understand the root causes of the attacks. 

For his efforts, Chretien was slammed in the pages of the Wall Street Journal by one Marie-Josee Kravis, a quasi-Canadian turned New York socialite whose major claim to fame is having been one of the most completely clueless members of the board of Hollinger in the early years of the millennium. I was asked to respond to Kravis's column by the opinion page editor of the Ottawa Citizen, long before I ever imagined I might one day work for the paper. 

The column I wrote bears the marks of the angry young man posturing that was a signature of my writing at the time, but there's so much of the piece that resonates with the current debate I think it is worth posting in full. To paraphrase Matthew McConaughey in Dazed and Confused: I get older, but the arguments stay just as stupid. Here's the column:

Gagging on U.S. whine: Offering an explanation for terrorism is different from justifying acts of terror. Why can't people understand that?

The Ottawa Citizen 

Sat Sep 28 2002 

Like Conservative leader Joe Clark a few weeks back, I find myself in the curious position of coming to the defence of Jean Chretien.

 Once again, our prime minister is under fire for an interview he gave to CBC-TV last summer and for a speech he recently gave at the United Nations, in which he allegedly suggested that "western arrogance" might have contributed to the Sept. 11 terrorist attacks. This latest salvo, by Marie-Josee Kravis, was published on Thursday in the Wall Street Journal and reprinted on this page yesterday.

Mrs. Kravis's article was so full of twisted analysis and half-explained history that it is hard to know where to begin. So why not begin with the first sentence, which finds Mrs. Kravis wondering, "Why is Jean Chretien so intent on finding a justification for terrorism?"

I have read the CBC transcript of the interview a dozen times, and I won't pretend to understand every sentence. But one thing Mr. Chretien was certainly not doing was trying to find a "justification" for terrorism, nor was he "blaming the victim" for the attacks. What he was trying to do was grope toward an understanding of the factors that might have inspired the attacks, and which might lead to similar problems "10 or 20 or 30 years from now."

Setting aside the question of whether "western arrogance" and global wealth disparities are what motivated Osama bin Laden and his crew -- I think not, and it is not obvious from the interview that Mr. Chretien thinks so either -- I fail to see what the fuss is about. Offering an explanation for something is conceptually distinct from offering a justification for it.

It is commonly observed in Canada that factors such as poverty, drug addiction and lack of education can lead to a life of crime. When we point this out, we do not thereby justify the crime, nor do we "blame the victim." This is so obvious it is painful to have to spell it out. Mr. Chretien was simply applying this pattern of domestic analysis to the global community. Again, there is nothing here that hints of what Mrs. Kravis calls "misplaced pity for terrorists."

Mrs. Kravis argues that what poor countries need is better access to world markets, which will give them sustained economic growth. Of course they do. But these countries also need a functioning, vertically integrated civil society, stable government, the rule of law, and civil and political liberties; otherwise, economic reform will simply make things worse. Just look at Russia, where most of the existing social capital was destroyed more than a decade ago by western economic geniuses more in the grip of ideology than good sense.

But don't take my word for it. This leftist "social capital" mumbo-jumbo, including the Russia example, is taken straight from the World Bank's Web site, at www.worldbank.org/poverty/scapital/.

I am sure this is all too soft-headed for Mrs. Kravis. She suggests that Mr. Chretien should learn some lessons from Pierre Trudeau, who knew how to deal with terrorists. Send in the army, arrest anyone who looks suspicious, and refuse to consider any sort of political accommodation. Any Americans getting their Canadian history from Mrs. Kravis's article would have been left with the distinct impression that Quebec separatism was killed off once and for all in 1970, since, as she says, it was "Trudeau's resolve that restored order and deterred future terrorist incidents."

Actually, a more plausible explanation is that the FLQ's violent energies were sublimated into the democratic separatist movement that still exists. Quebec separatists just went from blowing up mailboxes to trying to politically blow up the entire country, and they might well have succeeded if it weren't for 30 years of political accommodation, much of it led by Mr. Trudeau. But again, don't take my word for it, read a history book. Or, if that's too hard, read the entry on the "October Crisis" in the Canadian Encyclopedia.

Halfway through her article, Mrs. Kravis takes leave of her original argument and goes off on a rant about the unbearable lameness of Canadian nationalism. She attributes Mr. Chretien's pro-terrorist feelings to his frustrations with the U.S., and even dusts off the old stuff about anti-Americanism being our unfortunate substitute for a true national feeling and self-confidence. Spare me.

To begin with, even the most hostile reading of Mr. Chretien's remarks doesn't come close to the sorts of things that have appeared in Harper's and the New York Times. Second, Mr. Chretien suggested that the powerful should try to be "nice," which Mrs. Kravis interprets as a bit of pique over President George W. Bush's notorious failure to praise Canada for its post-Sept. 11 support. That is doubtful. If anything, Mr. Chretien meant "be nice" as in "don't give money and guns to evildoers like Osama bin Laden and Saddam Hussein" -- which the Americans did for a long time.

Mrs. Kravis insinuates that Canadians are hypocritical for feeling morally superior to Americans while enjoying access to U.S. technology, capital and television. Since when are you not allowed to feel superior to the people you're doing business with? Didn't Adam Smith set us all straight on this point? Besides, if you want to talk hypocrisy, let us start with Mr. Bush, the biggest hypocrite of them all.

Probably no man so dim has ever benefited so much from crony capitalism, yet Mr. Bush stands in sanctimonious judgment of the executives of Enron and WorldCom. He promised a "hemisphere of freedom," then slapped trade sanctions on Canadian lumber and jacked up subsidies for U.S. farmers. If there were any consistency to U.S. foreign policy, Saudi Arabia would be part of the axis of evil. And so on.

There is nothing wrong with Canada's national self-confidence that wouldn't be helped if only we had fewer grovelling, pro-Yankee fifth-columnists keeping addresses in Toronto and Montreal so they can dump on Canada and Canadians in newspapers at home and abroad.

There was nothing wrong with what Jean Chretien said to Peter Mansbridge and to the UN. Get over it.

Andrew Potter teaches philosophy at Trent University, Peterborough.

I no longer teach philosophy, at Trent or anywhere. I'm currently the managing editor at the Ottawa Citizen.



On the moral corruption of Zero Dark Thirty 

It is bad enough that Zero Dark Thirty is a laughably written, poorly plotted, badly directed and exfoliatingly dull telling of the most dramatic manhunt of our lifetimes. It is also completely, utterly, relentlessly immoral.

Kathryn Bigelow's attempt in the LA Times at defending herself on her handling of the torture issue is inept. 

Steve Coll eviscerates her in the New York Review of Books. 

I join in the pile-on in tomorrow's edition of the Ottawa Citizen, or you can read it online. 

By all means see the movie. 


A comment on Glen McGregor's "Toward a Dogme95 of political reporting"


I spent the first 17 years of my adult life in academia. I've spent the last five in journalism. And if there's one thing that has struck me about the career switch, it is that the only people on earth whose sense of self-importance rivals that of humanities professors are journalists. Which is why it was so disheartening that, when a journalist sat down to write something self-critical about his profession's use of academic sources, the people who were quickest to take offense were professors.

At issue is a blogpost by Glen McGregor, a parliamentary reporter for the Ottawa Citizen. His post, entitled "Toward a Dogme95 of political reporting," is a trim little call for a return to journalism's basics: pick up the phone, work sources, get stories. It asks journalists to stop filing easy stories skimmed from the froth of partisan posturing, social media, and self-styled rent-a-quote "experts." Fine advice, and, in my opinion, largely non-controversial. (Note: While I'm Glen's editor at the Ottawa Citizen, I had no input into his blogpost.)

But it's the first bullet point of Glen's post that seems to have got the most attention:

* No more quoting political scientists:  It’s lazy and signals the reporter couldn’t find any other apparently neutral or objective source to talk. These people work in academics, not politics, so I’m not interested in their opinions on anything but their own research.

This caused quite a ruckus in the cosy Canadian politics neighbourhood of the twittersphere. A number of academics -- most of whom are well known to journalists and to readers for their comments, op-eds, blogs and in some cases even their actual research -- took this as a raised middle finger to their presence in Canadian journalism. I'm not going to bother going over the he-said/she-said of it all; my view is that this comment of Glen's is entirely critical of journalists, not academics, and is less about telling professors to stay out of journalism than it is about telling reporters to stop relying on professors to pad out their stories and launder their political views. But like most serious misunderstandings, this one does a useful job of shedding light on the relationship between journalism and academic work, and on how technology-driven shifts in our conceptions of status, influence, and research itself have called that relationship into question.


The key thing to understand about journalists is that they are the lowest rank of intellectuals. That is to say: they are members of the intellectual class, but in the status hierarchy of intellectuals, journalists are at the bottom. That is why journalists have traditionally adopted the status cues of the working class: the drinking and the swearing, the anti-establishment values, and the commitment to the non-professionalization of journalism.

The key thing to understand about academics is that they are the highest rank of intellectuals. That is why they have traditionally adopted the status symbols of the 19th-century British leisured class -- the tweeds and the sherry and the learning of obscure languages -- while shunning the sorts of things that are necessary for people for whom status must be fought for through interaction with the normal members of society (e.g. proper clothing, minimal standards of hygiene, basic manners).

Despite inhabiting opposite ends of the intellectual status hierarchy, some journalists always saw some appeal in looking up towards academia (instead of down on the working classes) and some academics saw the appeal of journalism. Professors, after all, have the cachet of smarts. Journalists, on the other hand, can become folk heroes. And so within journalism there was a natural alliance to be found between journalists who wanted to give their stories some intellectual heft by quoting a serious researcher on the story at hand, and researchers who wanted an audience for their ideas beyond the faculty lounge and the conference circuit.

So far so good. In the pre-internet world of publishing, journalism served as a useful instrument for brokering academic research to the masses. Academic publishing is slow and research is hard to grasp even for PhDs, while a newspaper comes out every day and the language of the broadsheet is educated but relatively straightforward. The reporter who could become an "instant expert" in a difficult field of research, or the researcher who had a gift for explaining difficult research in straightforward language, played a valuable role in the realm of public debate.

There is a downside to this though. Journalists work under tight deadlines, and -- like everyone else on Earth -- they will take the easy path over the difficult, when given the choice. Meanwhile, it is tough for the lay reporter to know which experts are the ones to trust, and even then, academics can be difficult to reach (the better ones always seem to be on research leave somewhere other than at their home university). And so there has always been an interest amongst journalists in academics who are easy to reach and are willing to talk about a very broad range of topics, including those outside their areas of research expertise. This -- and I think this alone -- is the combination of lazy journalism and dial-a-quote academic punditry that Glen McGregor suggests we can do without.


It is hard to see how any journalist, or any academic, could object to this. No serious journalist wants to be seen as lazy, and no serious academic wants to be considered a lightweight. So why, then, did so many people take offense at McGregor's proposal?

I think the problem stems from the shifting place of academics in the popular discussion over the past decade. One of the great benefits of the rise of Web 2.0 was the way blogs gave professors a platform, independent of both mass media and niche publishing, to promote their work and to critically discuss the work of their peers in a forum that was free, public, dynamic, and immediate. And while it had the effect of making it easier for journalists to identify and reach useful sources, the more serious consequence (for journalists) was that it threatened to make them obsolete, by eliminating their role as intellectual middlemen.

The rise of the social web, Facebook and most especially Twitter, has only accelerated this process. The 2011 federal election in Canada was widely referred to as the first "Twitter election," but as I wrote in a blogpost for Canadian Business magazine, it was more accurate to call it the first "economists' election." It was the first election in which a large number of Canadian economists made direct, unmediated, real-time interventions into the debates over policy and the various party platforms.


My suspicion is that many professors interpreted Glen McGregor's manifesto as an attempt at pushing them out of this newly-carved niche in our popular debate. Nothing could be further from the truth. Yet as it progresses, this disintermediation of academic expertise will have a profound impact on how politics and public policy gets debated in this country. It should also have a profound impact on how both journalists and academics do their jobs.

For journalists, it should change their approach to political reporting pretty much along the lines suggested by McGregor. Thanks to technology, journalists no longer have to play the role of ideas broker between academia and the public. At the same time, there is very little status to be gained by quoting the same stale academic sources in story after story, when more insight can be found coursing through a well-cultivated Twitter stream. Finally, it means that reporters should stop trying to launder their political biases through a convenient academic who will say the things the reporter wants to say, but can't, given the conventions of unbiased reporting.

But it should also change the way academics work as well. One of the more poorly-kept secrets of the academic world is that humanities professors and social scientists are the most ideologically committed members of society. People like to complain about journalistic bias, but journalists are in fact far less politically biased than most professors. A great deal of what passes as academic political commentary is little more than partisan opinion-mongering (I reviewed a particularly egregious example for the LRC a few years ago). And so if academics are smart, they'll take Glen McGregor's no-academics pledge as a challenge: to offer comment to a reporter only when their research puts them in a unique position to inform or clarify the public debate, and serves the needs of the story the reporter is trying to tell.

If there is a big takeaway from "McDogme95" (as Stephen Maher calls it) it is this: It is an opportunity for political journalists to retrench and concentrate their energies on what they are best positioned and best qualified to do: work sources, file ATIP requests, comb through public databases, and break stories that are in the public interest. That in turn creates a space for academics to insert themselves directly into the conversation through their own devices (Twitter, blogs, etc), or through more traditional means such as op-eds or essays. (I can't think of a better example of this than Peter Loewen's recent essay for the Citizen looking at what Stephen Harper is up to.) 

Canadian politics is in need of both better reporting and better contributions by academics. Glen McGregor's manifesto is an excellent first step at articulating the proper division of labour that will take us in that direction.


Gun violence: the economics of abolition

There's lots of talk about America needing to step up on gun control. I suspect that for a lot of people, this is a disguised way of talking about abolition -- that is, the elimination of the private ownership of guns of any sort. 

If so, that's fine. It is certainly worth putting that option on the table and airing it. I doubt it would go anywhere, not in Canada, and not in the USA. But imagine for the sake of argument that a government passed a bill outlawing private ownership of guns: it would still be faced with the problem that there are a lot of guns in the country. Stats I've seen vary, with estimates between 250 million and 310 million private weapons in the USA. 

If you wanted to take these off the street through a buyback, what would it cost? Again, buyback programs vary. $200 for a handgun is common, with some programs offering as low as $20 for a rifle. Some programs I've seen have offered $100 gift cards to places like Target. But these are voluntary buybacks, taken advantage of by people who either want to go clean, or have guns they no longer want. A forced buyback program would be far more expensive. 

Assume you wanted to take 280 million guns off the street, at an average buyback price of $200. The total cost would be $56 billion. (That's probably low, but it's a ballpark.)

Would it be worth it? 

[Note: I fixed the math in the next graph thanks to Andrew Coyne's heads up]

Last year in the US, there were 11,500 homicides caused by guns. The actuarial value of a human life is $7.4 million. Multiply those and you get a benefit of $85.1 billion in a single year. Subtract the $56-billion buyback cost and you are left with a net savings of $29.1 billion, call it $30 billion, in the first year. And that's not a one-off: the $85.1 billion recurs every year afterward, though future years should properly be discounted back to present value. (I think. I forget how to calculate these sorts of things.) 
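The back-of-the-envelope arithmetic can be sketched in a few lines. The inputs below are the post's own rough figures, not authoritative data, and the 5% discount rate is an assumption added purely to illustrate how the recurring benefit would be valued:

```python
# Back-of-the-envelope sketch of the buyback arithmetic.
# All figures are rough estimates from the surrounding text;
# the 5% discount rate is an assumption for illustration only.

guns = 280_000_000           # assumed stock of private guns to buy back
price_per_gun = 200          # assumed average forced-buyback price, USD
gun_homicides = 11_500       # gun homicides in one year
value_of_life = 7_400_000    # actuarial value of a statistical life, USD

buyback_cost = guns * price_per_gun             # one-time cost
annual_benefit = gun_homicides * value_of_life  # recurring yearly benefit

print(f"Buyback cost:   ${buyback_cost / 1e9:.1f}B (one-time)")
print(f"Annual benefit: ${annual_benefit / 1e9:.1f}B")
print(f"First-year net: ${(annual_benefit - buyback_cost) / 1e9:.1f}B")
print(f"Payback period: {buyback_cost / annual_benefit * 12:.1f} months")

# The benefit recurs every year; valued as a perpetuity at an assumed
# discount rate r, its present value is annual_benefit / r.
discount_rate = 0.05
pv_benefits = annual_benefit / discount_rate
print(f"PV of benefits: ${pv_benefits / 1e12:.2f} trillion (perpetuity at 5%)")
```

On these figures the sketch reproduces the $56-billion cost, the $85.1-billion annual benefit, and the roughly eight-month payback period.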

There would be other benefits: lots of people are wounded by guns, so their health care costs and other associated costs would be eliminated as well. But there would also be costs: it would be foolish to suppose that private gun ownership is a 100% deadweight loss to the economy. 

At any rate, the upshot is that the American government could, if it wanted, easily afford to pay for gun abolition, and the program would more than pay for itself in about eight months. 



Politics: The Naïve and Cynical


1. Here is a naïve view of how politics works.

Politics is about policy. Groups of like-minded people coalesce around a set of ideas about how the world should work. This group is called a party. The party puts forth a platform of policies that will put those ideas into action. The role of the party then is to serve as the interface, or point of friction, where ideas become policies. To gain power, the party promotes and sells these policies to the public as better than those of their opponents.

Thus, the adversarial nature of politics is essentially a debate over which policies are objectively superior. An election campaign is when the marketplace of ideas is open for business. It is like a graduate seminar in philosophy, where ideas are freely debated, the principle of charity is in full operation, and the best ideas win, whatever their source.

The goal of this public debate is truth: truth regarding the demands of justice, the requirements of redistribution, and the scope and character of the public goods that the state should offer. The more people have input into the process, the closer we will be to the truth.

When the party with the best ideas wins, and the better policies are thereby implemented, the country as a whole is better off. As John Stuart Mill taught us, truth is both non-excludable and non-rival -- that is, everyone can share in the truth without it being minimised or depleted.

The crucial trait of a successful politician is that he or she be intelligent. Political leaders should be smart people. Better: they should be policy wonks, charismatic academics, philosopher kings who will rule in the better interest of all. The model naïve politician is someone like Pierre Trudeau, or Jack Layton.

2. Here is a cynical view of how politics works.

Politics has nothing to do with policy; it is about power. Joining a political party is not like joining a faculty club; it is more like joining a tribe or a gang. A party's overriding function is to gain power and relative status for its group at the expense of other tribes and gangs.

Therefore, a party platform is not a list of policies seen as being in the objective interest of the country. Rather, it is a statement of brand affiliation, or, more simply, identity. The function of the party is to sell its brand or identity as more appealing than that of their opponents. Policies are implemented because of how they appeal to the group and buttress its identity.

Elections are basically popularity contests, not much different from the process of voting for class president (read Robin Hanson on this point). So the point of an election is to make one tribe’s leader seem more appealing than the other’s. The ultimate goal of the exercise is to win power for one tribe. If that requires demonizing the other parties as bad patriots, or bad people, so be it.

For cynics, to govern is to choose between competing interests. There will be winners and losers, with some groups inevitably rising and dropping in status. This is because power is indivisible and rival. One group can only hold it at the expense of others. 

The best politicians are charismatic figures, or gang leaders. They are polarizing figures, ruthless at pursuing the interests of their tribe at the expense of others. Loved, or at least greatly admired by their followers, they are loathed by their opponents.

The successful cynical politician is not necessarily intelligent. What matters is that he is authentic. The relevant question is not “does he have good ideas” but rather “is he a proper representative of my tribe?” The model cynical politicians are men like Jean Chrétien, or George W. Bush.


As used here, the terms "naïve" and "cynical" are not intended invidiously. Instead, they are intended to describe the two extremes of a continuum. Different countries might have different political cultures: some might tend to be more naïve in practice, while others might be more cynical. Citizens of different countries might prefer to be at different points on the spectrum. Some institutions might be more conducive to one form over another.

Yet there is an obvious normative quality to this continuum. Not only can it be used to describe how politics does work, it can also be used as a language in support of reform (or in support of the status quo): we may think that politics ought to be more cynical, or ought to be more naïve.

In fact, the most significant political divide in Canada, and perhaps other polities, is not between left and right, but between those who are cynical and those who are naïve about politics. It informs almost all other opinions about how our political machinery -- including Parliament, the courts, the party system, the electoral system, the media -- should function.

Some examples:

  • The naïve will be in favour of coalition or minority governments and proportional representation. The cynical will prefer majority governments and first past the post.
  • The naïve will have faith in a deliberative approach to democracy. The cynical will rest content with more Schumpeterian forms.
  • The naïve will desire more power for individual MPs or representatives, calling for more free votes in particular. The cynic sees the party as paramount, with party discipline the basis of all political engagement.
  • The naïve will curse the growing reliance on negative advertising as antithetical to the truth-seeking essence of politics. The cynical will see such framing, and the resulting culture of "truthiness," as useful to the in-group/out-group definition that is at the core of political engagement.   

Most arguments between pundits and academics consist of disguised disagreements over which mode of politics is better, the naïve or the cynical. Indeed, most apparently partisan disagreements are, if you scratch the surface, differences of opinion between cynics and naïfs.

To decide whether one is cynical or naïve is the most important meta-political decision one has to make. It is unfortunate that we spend so much time arguing about our partisan biases, and pay so little attention to our meta-political commitments. Whether that itself suggests that we are all, deep down, cynics (or perhaps meta-cynics) is an important question.



Why the truth squads can't beat truthiness

My latest column for the Citizen looks at the entirely salutary development, during the last U.S. election cycle, of the media getting back to their old role as fact-checkers. The problem, though, is that fact-checking is only effective when truth is seen as a necessary element of political success.

In the age of truthiness, the "problem with the effort to truth-squad our way back to fact-based politics is it misunderstands the way political persuasion works. Successful politicians don’t win over the electorate by giving them a set of plausible facts that in turn motivate a set of policies, they sell them on an attractive narrative. The best politicians, from Reagan to Bill Clinton to Barack Obama, are storytellers."



There is no Muslim Tide

This isn’t to say that there aren’t problems with Muslim immigrant populations in parts of Europe, especially France, Germany and Holland. But in every case, the troubles can be traced to one of three causes: fallout from past colonial relationships; domestic policies that hinder the ability of immigrants to work, to worship and to naturalize; or the particular character of the immigrant community and how it interacts with the host country. So, Bangladeshis and Pakistanis in London are not the same as North Africans in Paris or Somalis in Ottawa. But regardless of how these isolated problems are (or are not) resolved, the key point is that they have virtually nothing to do with a grand Islamic takeover project.

That's from my review of Doug Saunders' new book, The Myth of the Muslim Tide. 


Is Mormonism crazier than other religions?

(The planet Kobol, as imagined on Battlestar Galactica)

Mormons have some pretty wacky ideas. For example, they believe that some of the native peoples of North America were followers of Jesus Christ hundreds of years before he was actually born. Mormon scripture refers to a planet called Kolob that is, or is near, the physical throne of God -- a belief that was the inspiration for the planet "Kobol" in the sci-fi show Battlestar Galactica (the show's creator was a Mormon). Craziest of all: Mormons refuse to consume alcohol, caffeine, or tobacco.

But is Mormonism wackier than other religions? Is all religious belief equally plausible, or implausible? Or, does plausibility fall on a continuum – a line running from the completely absurd to the thoroughly reasonable?

As a rule, most of us -- believers and atheists alike -- instinctively accept that there is a continuum. For example, consider the evolutionary biologist Richard Dawkins, who in recent years has made a name for himself as the leader of a new group of aggressive atheists, a group that also includes Sam Harris, the philosopher Daniel Dennett, and the late Christopher Hitchens. Last month, Dawkins went on a long Twitter rant accusing Mitt Romney (who in addition to being the Republican nominee for president was also a Mormon bishop) of being a “massively gullible fool.”

The focus of Dawkins’ attack was Romney’s adherence to the teachings of the Book of Mormon, which is the sacred text of the Latter Day Saints religion. While the book was published in 1830 by Joseph Smith, Mormons believe it contains the writings of prophets who lived in North America between 2200 BC and AD 421.

“Bible & Koran genuinely old, written in the language of their time. Book of Mormon written by 19thC charlatan. Romney too stupid to see it,” Dawkins tweeted. When he was challenged by readers who pointed out that president Barack Obama was also a believing Christian, Dawkins responded: “Christianity, even fundamentalist Christianity, is substantially less ridiculous than Mormonism (and Obama, if he is Christian at all, is certainly not fundamentalist),” he explained. “The idea that Jesus visited America is preposterous, and the idea th[at] Adam and Eve did too is even worse (it is at least arguable that Jesus existed).”

Another example: many Canadians will remember when Stockwell Day, an evangelical Christian who believes that the Earth is somewhere between 6,000 and 10,000 years old, was leader of the Canadian Alliance. During the 2000 federal election, Liberal operator Warren Kinsella mocked Day’s beliefs by brandishing a Barney the purple dinosaur doll on television, claiming "this was the only dinosaur ever to be on Earth with humans."

What makes this interesting is that Kinsella himself is a self-declared practicing Catholic. Yet as Kinsella's mockery of Day and the glee with which the "Flintstones" theme of his campaign was picked up by the media make clear, there is a widespread sense that Catholics are less brainlessly credulous than Young Earth evangelicals.

So the idea would seem to be that the more a religious belief accords with generally accepted scientific views of the world and the universe, the more credible it is.  Let’s call this the Kinsella-Dawkins thesis.

According to this thesis, it is one thing to believe in an omniscient, omnipotent, and benevolent deity who, a few thousand years ago, impregnated a Middle Eastern woman named Mary with His only begotten Son, and then sacrificed that Son to atone for all the sins of Mankind (sins which were invented in the first place by said deity). It is something else entirely to believe that 600 years before his son Jesus was born, that same deity led a people from Jerusalem to the Americas, where they grew and split into a pair of warring factions.

Or again: It is one thing to believe in the central doctrine of Christianity, namely, the literal resurrection of Jesus. It is something far stupider, though, to believe that the Earth is at most 10,000 years old, and that God put dinosaur bones and other artifacts in the historical record to test our faith (as many young-Earthers maintain).

This thesis has considerable plausibility. It would help account for our folk hierarchy of belief, which seems to allow for degrees of respectability between childish fears of the supernatural, at one end, and the wisdom of millennia that we find in the more robust religious traditions, especially the monotheistic ones.

But for a committed atheist, the Kinsella-Dawkins thesis concedes too much. What it gives up in the name of superficial plausibility is the underlying principle at the heart of the atheistic worldview. To see why, it's useful to recall something that Dawkins himself wrote in his best book, the primer on evolutionary biology The Blind Watchmaker. As Dawkins points out, what we are trying to explain through religion is exactly how organized complexity came to exist in the universe. The theistic answer is: God created it.

The problem with this answer is that it presupposes exactly what we are trying to explain. Whatever else God may be, he is organized and complex. If we can simply posit organized complexity, then we haven't really explained anything.

That is why evolutionary theory is so unanswerably powerful. Only evolution by natural selection, or some similarly "blind" process, is capable of explaining how organized complexity came from disorganized chaos. Every explanation that relies on a consciousness, a higher power, or any sort of pre-existing organizing principle is simply assuming the problem away.

But if that's the game we're playing, then what difference is there between Catholicism and Mormonism, or Hinduism and Islam? It's all of a piece: an equally adolescent commitment to wishful thinking and to the supernatural. After all, once you have accepted that there are conscious, invisible and unknowable forces at work in the universe, does it really matter how many of them you buy into? If God can resurrect his son for a long weekend, surely he could also arrange things so that a 10,000-year-old planet appears to be billions of years older. If there is an omniscient power in the universe, is it less plausible that he lives on a planet a few thousand light years away than that he resides in an unknowable realm where he hears our prayers and grants salvation according to whim?

Arguing over religious belief is like playing tennis without a net: almost any hit counts as a good return. Under these circumstances, it is pointless to debate the question of who is the better player. The only issue is why anyone finds it useful to play at all.


The past is the future of paid content

With the news that pretty much every newspaper in Canada is going to some sort of pay wall/"metered model," debate is raging once again over whether consumers will ever pay for content. They have and they will. The trick, as someone from the recording business taught me long ago, is to make it seem free, without actually being free. The model is radio. Here's a column I wrote nearly five years ago on the subject. My belief in the soundness of the central argument hasn't changed.




Mon Feb 25 2008 
Page: 14 
Column: OPINION 


The lengths to which some people will go to avoid picking up the cheque. At the end of January, a 28-year-old Brit named Mark Boyle began what promises to be a 30-month trek from England to India, for which he is bringing some T-shirts, bandages, and an extra pair of sandals. Significantly, he is leaving his wallet behind, hoping to survive entirely off the kindness of strangers.

Mr. Boyle is walking to promote the values of the "freeconomy" movement, a group that claims 3,000 members in 54 countries. Advancing the bold and original thesis that money is the root of all alienation, freeconomicists believe we need to shift from a "money-based, community-less society" to a "community-based, moneyless society." And so Mark Boyle will strike a blow for community by spending the next 2 1/2 years cadging free meals from Bristol to Porbandar.

It comes as no great surprise then that Boyle is a former dot-com businessman. It is cyberculture, and its confluence with hippie values, that is helping drive the copyright wars, one of the most pointless economic conflicts in recent memory. Dedicated to the proposition that "information wants to be free," the Free Culture movement believes content such as news, books, film, games, but above all music, should be free in two senses: free as in speech (there should be no censorship or control over how culture is used); and free as in beer (the culture should be free for the taking).

This movement is opposed by music producers, film studios, and other content producers, who are lobbying for more stringent penalties for illegal downloading and for stricter controls on how content can be used and copied. Here in Canada, the Conservative government is preparing to introduce an updated copyright bill, but it is facing stiff resistance from "copyleft" activists who worry that the new legislation will give in to Big Copyright's most outrageous demands.

And so the two sides are locked in an increasingly polarized dance, with each advocating a perverse and unsustainable business model. It was left to Paul McGuinness, the long-time manager of U2, to try to knock some sense into them. At a conference in France last week, McGuinness gave a speech in which he blamed internet service providers (ISPs), fund managers, and the hippie culture of Silicon Valley for destroying the recording industry, and he went on to propose that a fee for legitimate downloading should be collected by ISPs and paid out to copyright holders.

For his efforts, McGuinness was flogged around the blogosphere, where he was variously accused of being greedy, hypocritical and -- worst -- "corporate." Except that he's right about the influence of hippie values on Internet culture, and right in his suggestion for how to bring the copyright wars to an end.

The profound influence of the counterculture on cyberculture is not remotely controversial. Scratch a file-sharing activist and, more often than not, you'll find someone who deep down just doesn't like the idea of paying for music.

But that is a bit of a cheap shot. After all, nobody likes paying for music, any more than they like paying for food or drink or shelter or anything else. People pay for things when there is stuff they want and shelling out is better than the alternatives of stealing it or going without. All the Internet has done is make theft the most palatable option of the three, while a halfway measure such as 99-cent downloads on iTunes only serves to foreground the main question, namely, why should you pay for something that other people are getting for free?

If you're trying to square the notion of free culture with how the economy works, a handy rule of thumb is this: in the end, the consumer pays for everything. Seemingly free media like radio and television are funded for the most part by commercial advertising, which is in turn paid for at the cash register by consumers.

The trick to resolving the copyright wars once and for all is to come up with a scheme for making downloading a similar experience to listening to the radio or watching TV: it would seem free, while ensuring that copyright holders actually get paid.

So how can we make file sharing seem free without it actually being free? Canada currently has a levy on blank recording media (such as CDs) that is collected by the Copyright Board and passed on to copyright holders, but a plan to extend the levy to MP3 players was struck down in early January by the Federal Court of Appeal. The most promising idea is a version of McGuinness's tax-and-distribute model, in which the government charges a basic Internet access tax, collected by ISPs, that would give users an unlimited right to download songs, videos, books, games, and so on. The fee would then be paid out in royalties by the Copyright Board in much the same way it is currently done for radio.

Most importantly, it would allow artists to be paid, in a way that doesn't rely on draconian copyright controls on the one hand, or the kindness of strangers on the other. In the end, you get the culture you pay for, which is why the motto that everyone involved should be rallying around is "Free Lunch." As in, there's no such thing as a.



"the incoherent bleating of the Wasposphere elites"

Terry Glavin is a friend and a comrade, and man alive I hope it stays that way. If I ever find myself on his bad side, I hope he is a good long hike from the nearest computer. 

Terry took a summer break from columnizing for the Ottawa Citizen, but he's back today, weighing in on the hand-wringing over the closing of the Canadian embassy in Tehran and the expulsion of Iran's diplomats from Ottawa. His big target is Ottawa Centre MP Paul Dewar, who lamented the absence of more "robust diplomacy." After giving a short laundry list of the sorts of things Iran's fellow travelers get up to, he concludes:

It is by these instructive evidences that “robust diplomacy” betrays itself as something worse than mere war. It’s cannibalism with table manners, and nobody has any business calling themselves a socialist, a liberal, a progressive or a social democrat if they engage in anything of the kind.