Wednesday
Mar 20, 2019

Some questions and concerns about the Canadian media bailout

Yesterday’s federal budget included some long-expected provisions for helping the struggling newspaper business, on the grounds that  “A strong and independent news media is crucial to a well-functioning democracy.” The three main provisions are a refundable tax credit for journalistic labour (i.e. reporters), a tax rebate for up to 15 percent of a digital subscription, and allowing non-profit news orgs to become charities and issue tax receipts for donations.

I should say I'm generally opposed, in principle, to giving public money to the media. But if they're going to do it, I hope they do as little damage as possible. And so instead of just complaining about the whole enterprise, I’d like to try to be constructive, or at least offer some constructive complaints.

 In particular, I’d like to take issue with this strange claim, from the Toronto Star’s report on the budget:

Contrary to criticism that government assistance would compromise the independence of news outlets, the criteria makes almost no demands on content — save that the organization primarily focus on original news content such as current events or general interest.

 

So first, to qualify for any of these three measures, an organisation has to be a Qualified Canadian Journalism Organization, or QCJO (they should have put a bit more time into the acronym). Anyway, the budget lays out the criteria for what sorts of organisations will count: corporate structure and ownership (must be Canadian); the sort of content covered (print news for a consumer audience, not industry magazines and newsletters); and size (at least two employees).

But the budget also sets out a plan for an independent panel that will be tasked with making a number of key decisions. Which raises the question: Are the qualifying criteria for QCJO status exhaustive -- that is, are they sufficient conditions to qualify? Or are they merely necessary conditions, with further conditions to be decided by the panel? That is, could the panel decide that even though an organisation met the formal criteria outlined in the budget, for some reason or another it still wouldn't qualify for QCJO status? This is an important point on which I have not seen a proper answer.

Second: The labour tax credit provides for a 25% rebate on an employee’s salary, with eligible salary capped at $55k per employee, so the rebate is capped at $13,750 per employee. The obvious question is: Who counts as an eligible newsroom employee? It seems they want to limit this to "news" reporters, and the budget goes so far as to list the sorts of activities that would count, and even hazards a guess that an eligible employee should spend at least 75 per cent of their time on "news content".

No one who has spent more than a day working in a newsroom could endorse that sort of time-slicing definition of a news reporter, but leave that aside (since the budget itself does, handing the ultimate decision off to the panel). And the panel has a heck of a job here. What sort of news would count? The Ottawa Citizen still employs a rock music reporter and a bunch of sports guys and a science journalist. Ok, rock music maybe doesn't qualify. But does sports? The Citizen's sports reporters have been covering the Senators, a team owned by Eugene Melnyk, who is part of the gong-show redevelopment of LeBreton Flats, which is a heavily political issue. And Tom Spears writes science journalism, including coverage of the Canadian Space Agency and federal institutions like the museum of science and technology. Should he count?

Again I know the answer is "the panel will decide", but you can easily see: a) how the panel membership’s built-in biases will be a problem, and b) how newsroom managers will be forced to play games with reporters' assignments and job descriptions to get the maximum number of qualifiers.  At the very least, this will have the effect of pushing newsrooms into certain kinds of coverage and away from others, not because the audience wants it, but because the tax man demands it. And the Star thinks this budget makes no demands on content? Please.

Third, the Canadian Association of Journalists (CAJ) says in a statement that “the $55,000 cap will incentivize news outlets not to pay employees more than that. For unionized outlets, some of which have a starting salary in excess of $55,000, this structure could encourage the breaking of unions.” I don’t know if that’s the case, but it raises important questions. I would love to see an economist weigh in to help explain what this sort of tax credit does to wages, and how it might affect union bargaining.

Fourth: The government has allocated $360 million over 5 years for the labour tax rebate, which works out to $72 million a year. At a full rebate of $13,750 per employee, that would subsidize about 5,200 journalists -- somewhere just under half of the total number of journalists employed in the country, according to the last census. So ok, they aren't all news reporters, and they don't all make $55k, but what if the rebate hits the $72 million annual cap, which my napkin scribblings suggest it could? Presumably then the panel will have to decide how the limited money gets parcelled out, and to whom, and according to what principles. I'd love to be in the room when the formula for that disbursement gets negotiated.

Fifth: The focus of this plan is on print content. Why, apart from the fact that the Toronto Star and Postmedia have been the ones whining the loudest? Shouldn’t saving democracy be neutral with respect to media delivery? If a newsroom thinks it can succeed by pivoting to video or podcasts, don't we want it to try? This plan will dissuade newsrooms from doing so, and hence will hinder innovation and experimentation, by giving the big news organisations every incentive in the world to stick to what they’ve done for 100 years or more.

So what do we make of this? Right now, the only thing I think we can say for certain is that the independent panel is going to have enormous influence over Canadian media -- how it is structured, who it employs, and what it covers. And so it will matter enormously who is on the panel. But it will also matter enormously how much discretion they have, and what sort of decision procedure the panel will use in exercising that discretion. Is it a majority vote? Unanimity? The horse trading and bargaining and special pleading that will go on will not be pretty, to put it mildly. How transparent will the panel’s decisions be? Who will decide that? Do we trust the media to report fairly on all this?

If anyone could provide me with a model of something like this working anywhere on Earth, I’d like to see it. In the meantime, I’ll just leave you with this tweet, from a journalist whose work I respect very much:

Who is Ezra, you ask, and why should we be worried? If you don't know the answer to that, then that's part of the problem.

 

Friday
Jun 29, 2018

Self-Help for Partisans

As a public space in which good faith civil democratic debate can take place, I think the United States is pretty much finished. As Cass Sunstein framed it in a recent column for Bloomberg, America is beset by what he calls “Political Manicheanism”, where “political issues are seen not as reasonable disputes among fellow citizens, but instead as pitting decent people with decent character against horrible people with horrible character.” Basically, it’s good versus evil, with each side seeing itself as the good guys. 

We could probably have a good discussion about why this is the case. Does the steady march towards polarization go back to Reagan, or Nixon, or Kennedy? Is it the fault of Republicans, or of Democrats, or is the whole political system to blame? Is it third-party financing? The media? And if it’s the media, is it Fox News, or CNN? Was it the FCC’s elimination of the Fairness Doctrine? Is it because of the decline of mainstream media, or is it the internet - and is that just tomayto/tomahto? Is social media the real culprit, or fake news? Is it adherence to “balance” in journalism that drives the appetite for partisan media, or is it a bulwark against it?

And so on.

What is not really up for debate is the reality of the situation. Shortly after Trump got elected, I described American politics as “two troops of apes shrieking at one another across a great partisan divide.” That hasn’t changed, though the shrieking has got louder as the divide has got wider.

What has changed, though, is how the situation has evolved up here in Canada. We’ve long flattered ourselves that our politics is more civil than in the US, maybe because there’s less money involved, or because the stakes are lower, or our system is better, or our media are more concentrated, or just maybe because we’re all so much nicer.

But if the last few weeks of #cdnpoli Twitter are any sort of weathervane, Canadian politics is well down the same path as we’ve seen in America, perhaps irreversibly so. I’m not going to name names or describe incidents or exchanges -- if you’re paying the slightest attention, you know what I’m talking about. And arguing about who started it only underscores the problem: A line seems to have been crossed, where even the politicians and other actors who have seemed most committed to resisting the tug of Manicheanism have decided to go all in on painting their opponents not as basically decent people with different views on things, but as horrible people with horrible character. And again: I don’t think either side is blameless in all of this.

At some level, we all know that we’re entering a pretty dark place. And it is tempting to see correlation as causation and conclude that social media is a big part of the problem. (I think it is certainly part of the problem -- go read the first two thirds of Joe Heath’s Enlightenment 2.0 for the argument.) But we need to be wary of going full McLuhan on the situation, as Colin Horgan did on Twitter: The medium is the message, the message is garbage, so fixing the medium will fix the problem. QED.

The problem with blaming the medium is that it leads to hoping that if we only heckle the New York Times editorial page editors enough, or bully newspapers into calling Trump a liar in their headlines, or if enough of us complain to Jack Dorsey about Twitter bots or parody accounts, or we hold a mass prayer for Zuckerberg to be hit by a bus or for Facebook to be broken up by regulators -- then things will go back to the way they were.

But it’s not going to happen. This is the world we've built. It reflects who we are, our dispositions, our biases and our values. There’s no business model, regulation, or filter waiting to be discovered that is going to save us. As Craig Forcese wrote on Twitter, “The problem stems, not from the medium, but from ourselves.” And the ourselves who are most to blame for all of this are the partisans.

***

Look, some of my best friends are partisans. But there is something fundamentally wrong with the partisan brain. People -- smart, educated people -- who seem to have their feet firmly planted in the realm of reason and logic, cause and effect, inference and deduction, fairness and good judgment, suddenly lose their minds when faced with an issue over which there is partisan advantage to be had, or when a threat to their tribe’s hold on power looms.

And so in the interests of offering friendly but also urgently self-interested advice, here are three principles or guidelines partisans need to follow that will help lead them back to Planet Sanity.

 

1. “What if my opponent did that?”

 Shortly after the Liberals came to power in 2015, I had the idea of starting a blog called “What if Stephen Harper did that?”, which would simply link to something the Trudeau government did and ask the obvious question. Because despite riding to power on a wave of good intentions, in a lot of ways -- its control over Parliament, the lack of transparency and the abuse of process, the torquing of public policy initiatives for partisan advantage -- this government is no better than the one it replaced.

Of course, they don’t see it that way. No one ever does, because people tend to interpret their own behaviour in light of what they see as their true motives. And because they see their motives as fundamentally good,  the Liberals give themselves a pass for engaging in the behaviours for which they crucified Stephen Harper.

But here’s the thing: Stephen Harper almost certainly interpreted his own behaviour in exactly the same way. He no doubt justified his own control freakery and partisan gamesmanship on the same grounds -- that it was in the service of the public interest. To respond that no, what Harper was doing was advancing partisan interests makes the fundamental partisan error: “what I do is in the public interest, what my opponent does is for partisan reasons.”

Looking at your own actions the way your opponent might see them is very difficult, but you have to try. What you see might surprise you.

 

2. The principle of charity

The requirement that you assume your political opponent has more or less the same goals that you have -- namely, making the world a better place -- is just a specific form of a more general injunction, which is that you should always begin with the assumption that your opponents are rational. That is, you should assume that their beliefs are for the most part true, and that their beliefs and desires are connected to each other and to reality in some plausible way.

This is what philosophers call the “principle of charity.” It comes in various versions of varying strength, but the core of it is a demand that we interpret someone’s statements and behaviour in the most rational way possible. That is, we should avoid attributing irrationality, delusion, or bad faith to someone when a coherent or rational interpretation can be had. That doesn’t mean there are no irrational or deluded people, nor does it mean that no one ever acts in bad faith. But as Joe Heath puts it in Enlightenment 2.0, “If our understanding of the world depends crucially upon the claim that everyone else is an idiot, evil, on the take, or part of the conspiracy, then the problem almost certainly lies with our understanding and not with the world.”

In short, before calling your opponent insane, a lunatic, a criminal, or a total moron, check to make sure you are giving the best possible interpretation of those views that maximizes their status as rational people.

Applying the principle of charity is a good way of doing this. But an even better approach is to try to pass the ideological Turing Test.

 

3. The ideological Turing Test

You remember the original Turing Test, in which Alan Turing proposed to replace the ineffable question “can machines think?” with the behavioural question of whether a machine could interact with a human in a way that was indistinguishable from human-to-human interaction. As Turing saw it, if a human couldn’t tell the difference, then there was no further question as to whether the machine was actually thinking.

The ideological Turing Test is the brainchild of the economist Bryan Caplan, and it is designed as a test to see whether a partisan truly understands the arguments of his or her opponent. The idea is that the partisan (say, a Liberal) is asked to answer questions or write an essay in which they are posing as their ideological opponent (say, a Conservative). If a neutral judge can’t tell the difference between the arguments of a true Conservative and those of the Liberal trying to “pass” as a Conservative, then we can conclude that the Liberal does genuinely comprehend the Conservative point of view. 

How many Liberals out there think they could seriously pass as Conservatives, and vice versa? It’s not an idle question. Because if you can’t credibly represent your opponent’s views, a few things follow. First, you probably don’t understand those views. Second, you can’t have a proper argument with them. Which means you are probably not taking their ideas seriously, which means, finally, that what you’re doing is not debating them, or arguing, but performing.

 

***

And maybe that’s the big problem -- that everyone has stopped arguing with their opponents, and has decided to simply perform for their supporters.

Bagehot famously wrote:

Of all modes of enforcing moderation on a party, the best is to contrive that the members of that party shall be intrinsically moderate, careful, and almost shrinking men; and the next best to contrive, that the leaders of the party, who have protested most in its behalf, shall be placed in the closest contact with the actual world. Our English system contains both contrivances: it makes party government permanent and possible in the sole way in which it can be so, by making it mild.

 Once upon a time Parliament may have been composed of moderate, careful, and shrinking men and women. But that is no longer the case. Moreover, it would appear that whatever contact our representatives have with the actual world, it is not making them moderate and mild in their interactions with one another.

As Bagehot saw, party government in a parliamentary system must be mild if it is to be possible at all. And if our representatives can’t see their way to helping themselves out of their partisan echo chambers, if they can’t see their way back, to treating one another as rational interlocutors acting in good faith, then we are well and truly screwed.

 

Friday
Dec 22, 2017

On letting go of Luke Skywalker

 

“You hated it, obviously,” came the snarking email from one of my best friends. “Because you are not seven years old anymore.”

 Earlier, I’d sent him a quick note (“Ok saw it”) to let him know I’d finally seen The Last Jedi and it was safe to talk about it without worrying about spoilers. 

I didn’t hate it, not even close. While horrendously flawed in places, The Last Jedi is easily the best Star Wars movie since Return of the Jedi. But my friend was right about one thing: I’m not seven years old anymore. And the great virtue of The Last Jedi is that it is the film that has finally allowed me to make peace with the Star Wars franchise, for the first time in 40 years. And in a weird way, it’s also allowed me to make peace with myself.

***

It’s hard -- sorry, impossible -- to overstate the effect Star Wars had on me as a boy. I saw Star Wars when I was seven, and spent the next three years in an all-consuming daydream of X-Wings and lightsabers and stormtroopers and droids and Luke and Han and Leia and all the rest of it. And I’m not in the slightest bit unique in this; it was pretty much how it was for all of my friends. The first three movies are basically the data points of our common childhood -- boyhood, prepubescence, adolescence. George Lucas knew exactly what he was doing when he put Carrie Fisher in a bikini in the third film.

And then there was nothing. Lucas had talked about making nine films, but after Jedi, the series basically went into a state of suspended animation. The franchise got taken over by toys and books and other ephemera, which was absolutely fine, because it meant there was nothing to overwrite the movies that had installed themselves as the machine language of our common mental operating system. The Star Wars Trilogy was about as perfect a cultural moment as you’re going to find in this galaxy, and leaving it alone was probably for the best.

But Lucas couldn’t leave it alone. Getting his hands on new digital technologies, he started mucking about with the original films, adding footage and special effects and changing key scenes. In retrospect, the Greedo-shooting-first fiasco was an early indication that the man had seriously, and literally, lost the plot. And then came the rolling disaster of the prequels, the takeover of Lucasfilm by Disney, and the rebooting of the franchise with The Force Awakens (which is total garbage) and the extended universe story of Rogue One (which is a watchable mess).

The pattern was the same each time: anticipation followed by letdown, then rationalization. Repeat for each new movie, except with each iteration of the pattern the anticipation was less intense, the letdown less disappointing, the rationalizations less convoluted. After all, the films were selling lots of tickets, the toys were everywhere, the extended universe kept getting bigger. At a certain point it became easier just to chalk it up to getting older. I had more or less decided not to bother going to see The Last Jedi. But then I did, mostly because I wanted to earn the right to dump on it with my friends.

***

There’s a scene in Star Wars that kicks off the second act of the film. Luke has raced home to save his aunt and uncle from the stormtroopers, only to arrive to find the farmhouse torched, their charred bodies lying smoking on the ground. He goes back to meet Ben, and says “I want to learn the ways of the Force and become a Jedi like my father.” Mark Hamill is looking up at the camera as he says it, and there’s a pleading, but also ever so slightly excited look on his face.

Boom. At that point the heart of every kid in the theatre went: Boom. The hero has started his journey, and every boy watching suddenly is Luke Skywalker. It doesn’t stay that way, because over the arc of the trilogy, everyone ultimately identifies with either Luke or Han: they are a dreamer or they are a cynic; they want to save the galaxy, or they want to get the girl. Lots of people became Han guys. I was always, always, always, a Luke guy.

Which is why I didn’t realize just what exactly I’d been missing from Star Wars all these years, until The Last Jedi. The moment Luke speaks for the first time -- that boom again. Despite the grey hair and the salt and pepper beard, he sounds pretty much the same. Keep in mind: while Harrison Ford went on to become a huge star, Mark Hamill had a much quieter career. We simply haven’t heard his voice for 35 years.

The Luke Skywalker of The Last Jedi is not the idealistic dreamer of the first movie, nor is he the confident Jedi master of the third. He’s tired and grouchy, wants nothing to do with the galaxy and its problems, and certainly wants nothing to do with the Force. There’s been a lot of press about this, mostly because Mark Hamill has made it clear that he and TLJ writer/director Rian Johnson had long arguments over the direction of the character. You can find interview clips of Hamill saying that Luke would never turn his back on his friends, would never give up trying to save the galaxy. And a lot of people have chimed in, agreeing with Hamill and arguing that no matter what is supposed to have happened to Luke between RotJ and TLJ, nothing could justify the self-imposed island exile.

Really? I mean, come on. It’s been 35 years, and Luke has been fighting pretty much the entire time. How many of his friends have died? How many people has he killed? And that’s just what is obvious, before we find out that, in turning his sister’s son Ben Solo into Darth Vader wannabe Kylo Ren, he has recapitulated Obi-Wan Kenobi’s failure with Anakin, and started the whole sins-of-the-father cycle all over again.

At long last it seems that Luke, and only Luke, has grasped the basic fact of the Star Wars saga: the Skywalkers aren’t here to save the galaxy; the galaxy needs saving from the Skywalkers. Looking back on his life, and forward to what is in store for his friends, Luke can see only conflict and chaos, death and destruction, with his family at the heart of it. And so he’s retreated to an island, cut himself off from the Force, and is curating the museum of Jedi history while waiting to die. He’s tired the way Aragorn and Frodo and Legolas are tired at the end of Lord of the Rings -- the sheer exhaustion of the struggle that never ends, against an evil that cannot be vanquished because it is part of the fabric of the universe.

***

We would all have a much better sense of this if George Lucas hadn’t made the disastrous and almost unforgivable mistake of trying to re-engineer the story around the redemption of Anakin Skywalker. The Star Wars saga always was, and should have remained, Luke’s journey. The table was set for an epic story about youth and idealism, aging and betrayal, discipline and mentorship, love and -- this above all -- friendship.

What we’ll have to settle for is what Rian Johnson has managed to kludge together out of the bits and pieces he was given. He’s taken every inch of narrative elbow room available and made the most of it. The decisions he made for how to complete Luke’s arc are close to perfect, because what comes to Luke comes to us all, eventually. Age, frustration, failure, loss. It’s called living a life. It is to Mark Hamill’s credit as an actor that he delivers on all fronts. 

At one point in TLJ, Luke is sulking, just like the old days, and threatening to destroy a tree that houses some sacred Jedi texts. And then, just like the old days, the shimmery Force-ghost of Yoda appears. “Young Skywalker,” he croaks, with a bit of a laugh. “Missed you I have.” Then he starts lecturing Luke once again about always having his eyes on the future, never with his mind on where he was, on what he was doing.

I almost started crying. I hadn’t realized until that point just how much I had missed Luke Skywalker. But thinking about it after, I realized it wasn’t Luke Skywalker that I had been missing all these years. Who I really miss is the seven-year-old boy I once was. He was a wide-eyed dreamer, innocent and sweet, and he wanted to grow up and save the galaxy. For a brief moment in the theatre, I was with that boy again, and it made me happy.

It’s hard to make really close, deep friends after high school. Similarly, it is hard to allow yourself to be completely consumed by the culture once you reach adulthood. Star Wars was something unique: a cultural artifact that took over an entire generation of kids, serving as the mortar that sealed and strengthened our friendships. I mean, we still get together and argue about the movies for crying out loud. 

But for the first time in forty years, I don’t need anything from the Star Wars universe. I’m not seven years old anymore. Luke Skywalker is gone. It’s time to move on.  

Friday
Dec 15, 2017

Book Review: Hitchens' *Mortality*

This is a review I wrote for the Ottawa Citizen of Christopher Hitchens' posthumous book about dying of cancer. I loved the Hitch, but I didn't love this book. It's worth reading, like everything of his. But I thought he blinked, a bit.

***

Mortality

Christopher Hitchens. McClelland & Stewart, $22.99 hardcover, 107 pp.

Christopher Hitchens' 2010 memoir Hitch-22 is packed with gossip, jokes, confessions, and arguments, with the tenor and tone shifting wildly from ironic to sentimental, crude to sophisticated, learned to juvenile, frequently within the same page or paragraph. But for all the clever turns of phrase, the stiletto reasoning, the unfair erudition, you don't actually discover much about what made the Hitch tick.

In fact, it is only on page 330, well over three-fourths of the way through the book, that Hitchens condescends to reveal something of himself. The moment comes in his answers to "The Proust Questionnaire," a form of self-interrogation (and a popular feature in Hitchens' employer, Vanity Fair magazine) that asks you to answer questions like "What is your idea of earthly happiness?" and "What is the quality you most admire in a woman?"

The key moment of self-disclosure comes in the second question, "Where would you like to live?". Hitchens' fully-armed answer: "In a state of conflict or a conflicted state."

From his earliest days as a Trotskyite university radical, Christopher Hitchens was a man who lived to argue, to debate, to fight. The problem is, for an ideological combatant like Hitchens, the end-of-history '90s presented something less than a target-rich environment. As a result, he seemed destined to spend his life as a bit of a fringe figure, picking odd little fights with Bill Clinton, Henry Kissinger, and Mother Teresa, where the viciousness of the attack seemed completely out of proportion to the alleged crimes.

The attacks of 9/11 changed everything, as the world was launched once again into a state of high-level polarized conflict. The novelist Martin Amis wrote that if 9/11 had to happen, he was glad he was alive to see it, but it was his boyhood chum Christopher Hitchens who immediately signed up as one of the most enthusiastic combatants.

It was the post-9/11 intellectual climate that motivated Hitchens' two big intellectual shifts: the one that made him infamous was his Slurpee-sucker cheerleading of the neocons as they bullied George W. Bush into invading Iraq; the one that made him famous was his relentless assault on all forms of religion that began with his 2007 book God Is Not Great.

And there things might have stayed for a long while, with Hitchens settling comfortably into a highly public role as the scourge of believers everywhere, leavening his anti-jihad jihads with more of the fun little assignments his editor at Vanity Fair, Graydon Carter, would send him on: get your back waxed, Hitch. Have yourself waterboarded. Write about blowjobs or how women aren't funny.

But suddenly he got metastatic cancer of the esophagus, diagnosed on a book tour, the very day Hitch-22 hit the best-seller list. Suddenly, but surely not unexpectedly, since Hitchens liked to drink and smoke as much as he liked to argue. As he writes in the first essay in this short little book, Mortality: "I have been taunting the Reaper into taking a free scythe in my direction and have now succumbed to something so predictable and banal that it bores even me."

Equally predictable was the way Hitchens set about giving his disease the complete writerly treatment. He begins with a sneer at what he calls "one of the most appealing clichés in our language," namely, that people don't have cancer, they battle it. He then sets about informing the reader that for a cancer patient, "the image of the ardent soldier or revolutionary is the very last one that will occur to you. You feel swamped with passivity and impotence: dissolving in powerlessness like a sugar lump in water."

Given what follows, this is enormously disingenuous. Mortality consists of seven short chapters about Hitchens' "year of living dyingly," beginning with an essay that tracks his forced march from what he calls the country of the well to the land of malady, and ending with a scattered collection of thoughts, epigrams and quotations that he continued to jot down right up till the end.

Together they might as well be called "Christopher Hitchens versus Cancer," with the increasingly sick Hitch taking on one set of enemies after another. Whether it is the false hope of new therapies, the curse of losing his voice, the thin line between torture and treatment, or the idiotic thesis that what doesn't kill you makes you stronger, the tone of the book is relentlessly combative. And because it is Hitchens doing the fighting, it is always smart, insightful and entertaining, with one or two gorgeous turns of phrase on every page.

There is a problem, though, which is that while you learn a lot about what it is like to have terminal cancer, there is very little about the deeper theme promised in the book's title. Hitchens has little to say about the perverted irony of dying of the same disease that killed his father. There is virtually nothing here about his wife and kids, what his illness means to them and what they mean to him.

He doesn't appear interested in exploring what it is like to face the psychological -- and not just intellectual -- consequences of his atheism, which is guaranteed extinction. On the question of his legacy -- what people will think of him in a decade, or a century -- Hitchens seems unconcerned.

In fact, the reader has to wait until the very end for any trace of mortality as a theme to creep into the book, and in all three examples it is a matter of Hitchens cribbing from other writers. There's a flick at Larkin's poem "Aubade." There is a beautiful quotation from Alan Lightman on how death is the enabler of freedom. And a line from Saul Bellow: "Death is the dark backing that a mirror needs if we are to see anything." All three of these references trump anything Hitchens is able to conjure on the subject. Why this is the case is unclear, but Mortality is ultimately defeated by the writer's reluctance to say much at all on the subject at hand. Perhaps it is because Hitchens, the ultimate combatant, could not, right to the last, accept that he had finally found himself in a fight he could not win.

This is too bad, because as big a loss as his death has been for so many of his admirers, there is more to dying than cancer. What came to the Hitch will come to us all, in one form or another.

And for those of us who are less courageous, less stoic, less resolute, in the face of what Larkin devastatingly called "the anaesthetic from which none come round," it would have been nice to have had Hitchens to lead the way, as he did so often when he was alive.

-30-

Wednesday
Sep 27, 2017

It Might Get Loud: Some thoughts on obsession, addiction, and freedom

 

This is the second of what I hope will be an ongoing series of posts on the themes of obsession and authenticity. To see where I'm coming from, maybe read the first one  before continuing below.

***

I finally got around to watching It Might Get Loud, the documentary film by Davis Guggenheim that brings together Jimmy Page, Jack White, and The Edge to talk about music and their careers, and to jam with one another and teach one another some tricks. It’s been out for almost a decade, and it’s one of those films I’ve been dying to see while ever so slightly dreading.

And sure enough, about halfway through, I started to get a familiar feeling: I started to regret not having taken my own guitar playing more seriously. It was at the point in the film where the three of them are talking about their first guitars, and how much they loved those first instruments from the moment they got their hands on them.

Like a lot of my friends, I saved up for an electric guitar in high school. I paid $120 for a second hand piece of crap Stratocaster knockoff that played terribly and sounded worse. I never liked the guitar, and couldn’t tell you where it is now; I think maybe my brother used it for spare parts.

To some extent, the stories Page, White, and Edge tell are not unlike the experience a lot of kids have -- getting your first guitar, playing in bands going nowhere, bonding with schoolmates over music, struggling for an identity. But for me that’s more or less where it stopped. I’m musically anti-talented, and certainly wasn’t obsessed with music or the guitar the way some guys were. Even during the short time when I was in a band, I could barely get up the motivation to practice much. I loved the idea of having a guitar a lot more than I loved having a guitar. 

So it was probably inevitable that my career as a guitar hero never really went anywhere. So why the twinge of regret?

It wasn’t for the music, or its absence. It was, as it always is, for the absence of something, anything, to play the role that music plays for Edge or Jack White or Jimmy Page.  I mean just LOOK at their faces when, for example, Page shows them the riff to Whole Lotta Love:

 

 

 Call it passion, call it love, call it obsession. Not everyone has it, and if you don't you can't fake it.

The regret for its absence is the same regret I feel watching movies like Dogtown and Z Boys or Jiro Dreams of Sushi, or reading books like Barbarian Days, or every two years watching the Olympics: a dissatisfaction with the realization I have never taken something so seriously that it dominates my life, and probably never will.   

Not everyone who is obsessed becomes a famous musician or athlete or chef or artist, and not every famous musician or athlete or chef or artist is obsessed. In fact, there’s a fascinating bit in It Might Get Loud, when the three guitarists talk about the place of music in their lives. In one scene, the Edge is walking through his old high school and he points to a noticeboard in the hall where Larry Mullen Jr., the drummer for U2, had put up an ad looking for a guitarist. The Edge matter-of-factly says that if he hadn’t seen the notice, or hadn’t answered it, who knows if he’d even be a musician. If he hadn’t answered that ad on the board, he says, “I might be doing anything. I might be a banker.”

Compare this with Jack White, who was the tenth of ten children, most of whom were musicians. Can you imagine him doing anything else? You know whatever happened in his life, Jack White would be making music. But the real contrast here is with Jimmy Page talking about his relationship with music:  

“Whether I took it on, or it took me on, I don't know.  The jury's out on that. But I don't care. I just really really enjoyed it."

This is the sort of thing you can imagine Keith Richards saying about his own guitar playing, or Laird Hamilton saying about surfing, or Ivan Orkin about cooking. Did they choose their calling, or did it choose them? It’s a pointless question.

And that of course is why it is pointless for me to keep going through this exercise where I drink deeply of someone else’s obsession, only to sigh and wonder why I don’t share it. The problem is not that I didn’t try harder to become obsessed with something. I didn’t try harder at something because I wasn’t obsessed. I didn’t call it, because it wasn’t calling me.

The essence of a true obsession is that it serves as a complete and final answer to the question: what to do? That is, it takes the whole range of questions that preoccupy a normal life -- who to love, what to do, where to live, how to be -- and reduces them to a single question: how do I feed my obsession?

And so obsession is a close relative of addiction. For addicts, too, the addiction reduces life’s multitude down to a single overriding goal -- getting the next fix. Think for example of Renton’s famous “Choose Life” speech from Trainspotting, and its killer kicker: “Who needs reasons, when there’s heroin?”

 

 

Aside from the moral judgment of society, is there anything to distinguish an obsession from an addiction? What’s the difference between the two?

Before he died, when emphysema was slowly robbing him of his life force, Peter Gzowski started talking and writing about his lifelong addiction to cigarettes. He smoked three packs a day, which could only be achieved, he noted, by lighting up the minute he rolled out of bed, and butting out just before turning out the lights. But I remember he said something to the effect that, when he took his first drag on a cigarette, he knew immediately that it was for him -- it was like his body had found its soulmate.

That sounds more than a bit like Jimmy Page on his relationship to music. Except here’s what else Peter Gzowski had to say about his smoking:

"If anyone asked, and they did, all the time, I'd say I hated my habit. It's hard to duck the fact that I probably hated myself for being such a slave to it."

And that is ultimately the difference between an addiction and an obsession:

The first is enslavement, the second is freedom. 

Wednesday
Aug 9, 2017

High-performance sports and the cult of execution: a response to The Krauthammer Conjecture



Sportswise, odd-numbered summers are boring. Even-numbered summers, on the other hand, are great -- you can reliably anticipate either a World Cup of soccer or a Summer Olympics, and there’s also a Euro soccer championship as an appetizer during Olympic years. Yes, things are getting better, with the women’s soccer World Cup scheduled for odd-numbered years, and there are occasional one-offs like the World Cup of Hockey tournament. But the Olympics are the best, and both of the big soccer tournaments provide at worst a useful weeks-long distraction during long summer days at work, at best a heady fizz of genuine globally-focused drama.

Last summer’s Rio Olympics were better than most. I came out of it with two heroes, one expected and one not: Penny Oleksiak, the 16 year old Canadian swimmer who emerged from nowhere to win a whack of medals, and Usain Bolt, who cemented his status as the greatest sprinter (and maybe showman) of all time with one last triple gold performance.

And so the summer of 2017, already destined to be a letdown on the sports calendar, has been more of a bummer than anticipated. I was hoping for a bit of a repeat this summer from both my heroes of 2016, but at the world swim championships last month, Oleksiak had a middling meet, winning just two bronze medals in mixed relays. And then, more shockingly, Bolt finished third in his final 100 metre race at the Worlds in London and stumbled and fell in the 4x100 relay.

What seems a bit odd, though, is that these comparative failures seem to have bummed me out more than they have either Oleksiak or Bolt. For example, here’s Oleksiak after coming fourth in the 100m butterfly:

"I'm pretty happy with the race. I know there's stuff I can fix going forward. Overall I can't really complain, the race is over and I can't change anything.”

And she seemed even more unfazed about coming sixth in the 100m free, the race she won in Rio:

"It hurt a lot the last 20 metres, but overall it was a pretty good race," Oleksiak told reporters. "I've had kind of a difficult year, so I think being able to get around my best [time] was pretty good."

As for Bolt, he seemed resigned, almost comfortable with having lost (sorry, won bronze) in the last race of his career. He gave credit to gold and silver medallists Justin Gatlin and Christian Coleman, and made almost no excuses, aside from citing time the destroyer, for failing to deliver. He had been complaining a bit about the starting blocks during the heats, and he acknowledged that without a good start he knew he was in trouble. But he wasn’t making excuses, just pointing out the facts. (There was a bit of excuse-making after the 4x100, with Bolt's teammates complaining that race officials had made them wait too long in a cold changing room).

It’s not supposed to be like this, right? Athletes, especially the top competitors in the world, are supposed to live and breathe victory. Defeat isn’t just a disappointment, it is a soul-crushing event. At least that’s how Charles Krauthammer portrays things in a recent column entitled “Why do they even play the game?”

In it, he offers The Krauthammer Conjecture:

In sports, the pleasure of winning is less than the pain of losing. By any Benthamite pleasure/pain calculation, the sum is less than zero. A net negative of suffering.

Which, he goes on to add, makes you wonder why anyone plays at all.

As evidence in support of the conjecture, he points to a number of famous examples, old and new, of athletes being utterly devastated by defeat: LeBron James, Ronda Rousey, Ralph Branca, Sonny Liston. He also states that any parent who has a child in Little League or minor hockey or what have you can attest to the validity of the conjecture.

If Krauthammer is right, then we have a genuine conundrum on our hands: Because sports produces far more losers than winners, anyone who plays a competitive sport, at any level, is behaving in a fundamentally irrational way. And to make it clear, Krauthammer believes the negative return from losing is not cancelled out or overcome by the money professionals earn. The calculus comes out negative no matter how much you make in the losing.

There are a number of additional variables that affect the cost-benefit analysis, and which Krauthammer ignores. Things like camaraderie, physical fitness, character-building, and other aspects of sports that don’t get caught up in the straightforward winning/losing tradeoff. But we can ignore those for the time being, because I think The Krauthammer Conjecture can be refuted from my armchair: If anything, athletes these days seem less concerned about losing than the fans who cheer them on. And it’s not just professionals, it’s amateurs as well. Why is that?

I think it is due to what we can call the cult of execution.

Success in sports is a function of five components: preparation, strategy, tactics, execution, and chance. The relative importance of any one of these components varies considerably from one sport or event to another (for instance, strategy plays a bigger role in the 10,000 metres than in the 100 metres), but every competitor's outcomes are determined by the interplay of all five components.

Up until, I dunno, the end of the nineties, participants in most sports, even ultra high-level professional sports like World Cup soccer, exhibited considerable variation in all of these components. The sports world in general was, to use an inadequate term, a lot looser. Significant differences in fitness levels, in strategic innovation, tactical skills, and so on all helped give the tactical and strategic space a more open and almost random character.

To give just two examples of what I mean: The ‘72 Summit Series between Canada and the USSR was marked by significant differences in styles of play, tactics, and fitness levels. Their styles of hockey having evolved in basically closed ecosystems, when the two teams met there was a great deal of mismatch, and thus a lot more room for “intangible” things like heart, grit, or chance. In a completely different context, think of legendary footballers like Maradona or Paul Gascoigne, both of whom excelled and even dominated, despite being (more or less) fat guys. They could compensate with raw talent and tactical genius for what they lacked in preparation.

For both soccer and hockey, things have changed enormously. Both sports are now dominated by conformity at virtually every step in the success chain: everyone prepares like crazy (middle-school soccer kids now do two-a-days), and there’s been tremendous strategic compression, which has led to less and less tactical innovation. What’s left as a variable, aside from chance, is execution.

Execution is the performance of a tactical decision. In the simplest events or sports, execution of a task or movement is what puts the competition in motion and, iteratively, drives it towards its conclusion. As H.A. Dorfman puts it in his book The Mental ABCs of Pitching, "the execution of pitches, one at a time, is the singular task that moves a baseball game from its beginning to its close." The pitcher's role consists entirely of selecting a pitch (fastball, slider, changeup) and a location (inside, outside), focusing on the target (the catcher's mitt), and delivering the pitch to the target.

Sports like diving and gymnastics have a lot in common with pitching, in that they are almost zen-like in their simplicity: manoeuvre selection (the kind of dive, or vault), its mental visualization, then physical execution. But every sport ultimately comes down to execution, from stroke or stride quality and consistency in swimming or sprinting to serve delivery in volleyball to proper pace and accuracy of passing in soccer.

Every high performance athlete has learned to speak the language of execution -- go back and read the quotes above from Bolt and Oleksiak. And this is largely a good thing: athletes have learned to focus on what they can control and worry less about factors they cannot. And talk of “execution” helps break down their movements and decisions into manageable units that can be worked on or developed independently -- which is exactly where Oleksiak's head seems to be at, thankfully. 

But this is where almost all high-performance sports has ended up: Thanks to relentless globalisation and the remorseless ratchet of competition, everyone is using the same preparation techniques, drawing on the same narrow coaching playbooks and training regimens. In most sports there are no more strategic or tactical secrets, to the point where the role of intangibles like talent or heart or even chance is being squeezed out of the performance function. 

But this means the jargon of execution has a downside, which is that for many athletes talk of “execution” is starting to serve as a sort of security blanket -- a catch-all explanation that borders on the tautological. You lost because you didn’t execute, but the only proof you didn’t execute well was that you lost. For example, here’s Jamaican sprint star Elaine Thompson trying to explain how she stumbled to sixth in the final of the 100 metres at the recent Worlds:

"In the first and second rounds I got the starts but I don't know what happened. I'm going to have to look back, and watch but no excuses. It was a good race," she said. "I'm very disappointed but at the end of the day it's not the end of the world.”

Not the end of the world for her maybe. But what about the three million Jamaicans who were desperate for some sort of redemption after Bolt’s loss?

I’m being facetious, of course, but only slightly. Still, such dispassion certainly puts Krauthammer’s question in a different light than he intended. If they care so little, why do they even play the game?

Tuesday
Aug 1, 2017

Surfing and Sushi: Some notes on obsession and authenticity

I have a friend who is obsessed with TFC, the Toronto professional football (soccer) club. He doesn’t just follow the team like a normal grownup person, happy when they win and a bit sad when they lose. To a large extent, the sun in his life rises and sets with the fortunes of the team. As I put it to him in an email a while ago, I wish I cared about anything in my life as much as he cares about TFC.

Because aside from raising my kids (which is more of a mania than anything, and doesn’t count regardless), there is nothing in my life that plays the sort of role that TFC plays in my friend’s life, nothing similar that dominates my thoughts, focuses my attention, structures my work life, frames my worldview, and determines my relative happiness.

It wasn’t always this way. From ages 7 to 10, my head swam with Star Wars and little else. In middle school there were sports and friends and computers, then later, sports and the occasional band. But aside from volleyball in high school, and philosophy in my twenties, and the occasional girl, there hasn’t been much else along those lines. It isn’t that my life is empty, far from it. But for better and worse, it’s been pretty much empty of life-consuming obsessions.

What I do have is a bit of a meta-obsession with other highly obsessed people. The obsessed personality is fascinating, and I like to read books or see movies about people whose lives are dominated by a single thing, especially in the areas of food (usually Japanese), music (rock or punk), and sports (mostly surfing or skateboarding).  I don’t know if it says anything about anything that I was a lousy skateboarder, have surfed once, and was a terrible guitar player in a bad high school band. (I do make pretty good ramen though.)

Anyway, if you’re looking for a guide to my meta-obsessions, you can check out the documentary Jiro Dreams of Sushi or the memoir/cookbook by Ivan Orkin subtitled “love, obsession, recipes”. In the relatively recent rock doc category, there is the Nirvana documentary Montage of Heck and the one about Amy Winehouse that came out in 2015. You should also check out Dogtown and Z-Boys and Riding Giants.


But what really floored me last year was the book Barbarian Days: A Surfing Life by the New Yorker writer William Finnegan. I won’t try to give a summary -- there’s a good review here -- apart from saying that it’s the single finest essay on surfing, on life, and on obsession that I’ve ever read. Most pieces I’ve read or movies I’ve seen on these subjects leave you convinced that, if you haven't spent your life as a surf bum, or a skateboarder, or a ramen chef, or what have you, then you've wasted it. What Finnegan does is make clear the love-hate relationship he’s had with surfing, what it has cost him and why the obsession has been so problematic.

Every obsession, whether it’s sushi or surfing or anything else, has its gods, its priests, its canon, its approved rituals. And where they tend to become cults is in their extreme hostility to attention from outsiders, the secretive nature of their practices, and the utter devotion given to some of the top practitioners. And in many ways, that cultishness ends in tragedy -- most obviously in the case of rock musicians who succumb to the over-the-top lifestyles that are the mark of the truly committed.

But as surf god Laird Hamilton put it in Riding Giants:

If you applied the same amount of devotion to a religious pursuit, do you think anyone would call you a “religious bum”? Probably not. When you consider that surfing is really more than anything a faith, and that devotion to that faith becomes paramount in your life, there’s no such thing as a surf bum.

He’s completely right, of course. But it’s precisely this conception of an obsession as a faith, and one of overriding importance to the way a life is lived, that makes it so problematic for us normals. To put it bluntly, what stands in the way of the pursuit of an obsession is a concern for status.

Whatever else it amounts to, pursuing an obsession requires flouting a lot of the standards that govern “regular” society, and a great deal of the stuff that most of us do to maintain appearances has to go by the wayside. This includes, but is far from limited to: holding a steady job, committing to a relationship, being a good and consistent parent, dressing like a normal person, keeping regular hours, keeping up with non-obsessed friends and family, watching what you eat, maintaining your health, and keeping up with the news.

For the obsessed personality, some or all of these must be sacrificed (or better, merely abandoned) in the name of the obsession. At this point, you cease to be someone who surfs, or someone who cooks, or someone who plays music, and become a surfer, a chef, or a musician. These are identities many of us could never adopt, not just out of lack of talent, but also out of a deep reluctance to unchain ourselves from the tacit norms and status games that keep us in the loop of regular society.

A lot of my writing, from The Rebel Sell to The Authenticity Hoax and since, has been devoted to arguing that a lot of behaviours or worldviews that purport to be about rejecting traditional status markers are actually just disguised versions of those status markers, transposed into a political or spiritual key. What I find interesting about obsessed personalities is that they alone seem to be the real deal: people who simply DGAF. At which point, I’m led to the conclusion that the only true authentics are the obsessed.

 

Friday
Feb 12, 2016

Authentic Sports: Introducing the Primitive Biathlon

Got this today.  

 

***** NEWS RELEASE *****

 

For Immediate Release

For More Information Contact: 

Fred Hanss, Fort La Présentation Association, Snowshoe Biathlon Chair

(315)212-6213 / fhanss@twcny.rr.com

 

4th Annual Snowshoe Biathlon Scheduled for March 5 and 6, 2016

 

OGDENSBURG -- The Fort La Présentation Association will hold its 4th annual St. Lawrence Valley Snowshoe Biathlon on Saturday, March 5 and Sunday, March 6, 2016 at the St. Lawrence Valley Sportsman’s Club located off State Highway 37 in Lisbon. The event will kick off at 10:00 A.M. and close at 2:30 P.M. on each day. Sponsors for the popular event include: The St. Lawrence Valley Sportsman’s Club, Forsyth’s Rifles, Gray’s Gunshop, The Chazen Companies, The Heuvelton Lions Club, and the NYS Muzzle Loading Rifle Association.

 

The Snowshoe Biathlon is a local version of a winter sport that is very popular in New England, said Barbara O’Keefe, president of the Fort de la Présentation Association. The original snowshoe biathlon took place at Smuggler’s Notch, VT, and there are now a number of events in Vermont and New Hampshire that take place annually.

 

Event coordinator Fred Hanss said that the event requires participants to traverse a race course through the woods on snowshoes and fire a muzzle loading rifle or smoothbore twice at 5 targets in the woods; at the final station, participants may toss a tomahawk at a block of wood for extra points.  Participants will be timed as they run or walk the course and each time they hit a target, 5 minutes is deducted from their running time.  If there is insufficient snow to use snowshoes, the organizers may permit participants to carry their snowshoes on the course.

 

 

There are 3 divisions: traditional muzzle loading rifle (flintlock or percussion cap lock), traditional smoothbore, and modern in-line muzzleloader. Those running in the traditional class must wear wooden snowshoes, while in-line shooters can wear modern snowshoes. Each division is scored separately. There are classes for seniors, men, women and youth (12-16).

 

The Snowshoe Biathlon registration fee is $18 in advance of the event, $20 at the door; once registered, runners can run the course a second or third time for an additional $5 fee.  Registration forms can be found on the Fort Association’s website, www.fort1749.org.

 

The event is anticipated to attract participants from across New York State, Vermont, New Hampshire and Pennsylvania. “This year, we’ve had inquiries from people as far away as Indiana,” said Mrs. O’Keefe. In addition to raising funds for the Fort de la Présentation Association, the event also helps the local economy. “Our hosts at the Sportsman’s Club benefit from the event as do local hotels and restaurants,” said Mrs. O’Keefe. “Those travelling from out of town typically stay for the entire weekend and participate on both Saturday and Sunday.”

 

Funds raised by the event will be used as a local match to build a public-access pedestrian and bicycle trail on the Fort Association’s property located on Van Rensselaer Point, Ogdensburg. The Association has secured state and regional grant funding for the project, which is anticipated to break ground in 2016.

 

For more information on the St. Lawrence Valley Snowshoe Biathlon, visit the Fort’s website – www.fort1749.org or contact the event organizer, Fred Hanss at (315) 212-6213.

 

-30-
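A quick note from me on the scoring, since the release buries it in prose: a racer’s net time is their elapsed course time minus 5 minutes per target hit. Here’s a minimal sketch in Python, reading “twice at 5 targets” as up to 10 scored shots (my assumption), setting aside the tomahawk bonus, and using a function name and example numbers invented purely for illustration:

    def net_time(elapsed_minutes, hits):
        """Net score: elapsed course time minus 5 minutes per target hit."""
        # Assumes "twice at 5 targets" means up to 10 scored shots.
        assert 0 <= hits <= 10
        return elapsed_minutes - 5 * hits

    # Example: a 48-minute run with 7 hits nets 48 - 35 = 13 minutes.
    print(net_time(48, 7))  # 13

Lower is better, so good shooting can make up for slow snowshoeing.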

 

 

Tuesday
Jun232015

Elections for Naifs and Cynics: A primer

A few years ago, I wrote a post in which I described a blunt taxonomy of political attitudes. I suggested that everyone falls somewhere on a continuum between political naifs, at one extreme, and political cynics, at the other. My primary claim was that naifs believe that politics is fundamentally about devising and implementing good policy. Cynics believe that it is about acquiring and exercising power. 

While virtually no one is a pure cynic or an unalloyed naif, the distinction does articulate two clear approaches to understanding how politics does, or ought to, function. I also think that knowing whether a given columnist is coming at things from one side or the other can be a useful heuristic for understanding the argument being made. 

At any rate, the original post gets tweeted and mentioned on social media fairly regularly by people whose work I respect and admire, which suggests I'm not the only one who finds the schema useful. We're into the election season now, so I took some time to sketch out an election primer for naifs and cynics. There's obviously a lot more to say, so I might expand this post over the course of the campaign. (I'm also happy to take requests or suggestions for other ways of expanding the analysis.) 

Some Guiding Principles

1. Everyone is a naif about their own political commitments.

2. Everyone is a cynic about their opponents' political commitments. 

3. Everyone is a meta-cynic about politics. That is, both cynics and naifs are more interested in appearing cynical (or naive) to members of their tribe than they are in actually adhering to cynical or naive principles. 

4. We could all stand to be a bit more cynical about politics, and more naive about our meta-political commitments. 

OK, with that on the table, let's get started.

Elections for Naifs

What is an election? For naifs, an election is the opportunity for a national debate. As Andrew Coyne puts it, an election is “a conversation among the voters”, the outcome of which is a collective decision about what policies we would like our government to implement over the next four years or so.

The writ period is the time we have set aside for this conversation: the media deploys its resources to cover and facilitate that conversation, people open their doors to candidates, the pundits weigh in on how that conversation is going and try to help improve it. On this view, the longer the writ period the better, because the more time spent deliberating and debating, the sounder will be the reasoning that ultimately prevails.

Platforms: The most crucial element in an election for naifs is the party platform. This is the document that lays out the policies the party promises to enact once in power. Ideally, the platform advances a consistent package of evidence-based policies, properly costed out, with a sincere and credible plan for how the money will be raised and the policies implemented.

Debates: For naifs, debates between leaders play a key role in the election to the extent that they facilitate and amplify this national conversation, focused on the various party platforms. Debates should be about the substance of major issues – Defence, Foreign Affairs, Health Care, the Environment – such that voters are left enlightened and informed with respect to the choice they face. By the same token, naifs dread talk of the “knockout blow”, the largely fictitious moment when one leader completely pwns another, destroying his candidacy with one snappy line.

Polls: Naifs tend to decry polls and the effect they have on the nature of the conversation among voters. Polls, according to naifs, reduce elections to “horserace politics” or a “popularity contest.”

The Vote: The ballot box is the moment when the naive voter delivers his or her verdict on the outcome of the national conversation. By casting a ballot, the naive voter chooses one candidate over the rest, or one party over the others, with the aim of giving that party a mandate to enact its platform.

The electoral system should aggregate these votes and translate them into seats in parliament in a manner that reflects the proportion of votes received. Power, that is, should be proportional to support. A party that gets 15% of the votes should get 15% of the seats, which should then translate into 15% of the power that the government exercises. A majority government should only be installed if a party indeed receives a majority of votes. Hence the strong support amongst naifs for electoral systems that involve proportional representation, and the strength of the slogan “make every vote count”.
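(To make the naif's arithmetic concrete, here is a minimal sketch of one way of translating votes into seats: the largest-remainder method with a Hare quota. This is just one of several PR schemes, and the party names and vote counts are invented for illustration.)

    def allocate_seats(votes, total_seats):
        """Largest-remainder allocation: seats in proportion to votes."""
        total_votes = sum(votes.values())
        shares = {p: v * total_seats / total_votes for p, v in votes.items()}
        seats = {p: int(s) for p, s in shares.items()}  # whole-number shares first
        leftover = total_seats - sum(seats.values())
        # Any remaining seats go to the parties with the largest fractional remainders.
        for p in sorted(shares, key=lambda q: shares[q] - seats[q], reverse=True)[:leftover]:
            seats[p] += 1
        return seats

    # A party with 15% of the votes ends up with 15 of 100 seats.
    print(allocate_seats({"A": 45, "B": 40, "C": 15}, 100))  # {'A': 45, 'B': 40, 'C': 15}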

 

Elections for Cynics

What is an election? For the political cynic, an election is what Schumpeter described as a competitive struggle between elites for the people’s votes. It is first and foremost a mechanism for the orderly transfer of power and the cycling of elites; or if you prefer, an election is an instrument that allows the people to throw the bums out (and replace them with a new set of bums).

The writ period is the time we have set aside for this competition to play out. The role of the media is largely to heckle, to cheer and jeer, and to analyse, much like fans in the sporting arena. For cynics, a longer writ period serves to demonstrate certain things about the relevant players – their stamina, their resources – but as that arch-cynic Kim Campbell put it,  "an election is no time to discuss serious issues."

Platforms: Cynics don’t put much stock in party platforms as statements of policy. For the cynic, the party platform is more like an online dating profile than a curriculum vitae. That is, the cynic doesn’t much care whether the policies are ideologically consistent, properly costed out, or backed by coherent implementation strategies. Rather, the platform is how the party signals its leadership of a political tribe. The platform is ultimately how the party defines its market niche in the competitive struggle.

Debates: Cynics find debates useful to the extent that they allow voters to evaluate which set of elites they will entrust with the business of governing. And so the cynical voter will be less concerned, during the debate, with the details of policies and their implementation. Instead, the cynic looks for how the debates signal leadership traits such as competence, coolness, and charisma. The sparring nature of debates is appealing to cynics precisely because it provides the opportunity for these traits to reveal themselves. If a party leader does suffer a “knockout blow,” for the cynic that signals much about that leader’s ability to handle the pressures of high office.

Polls: Cynics love polls, precisely because the election is a popularity contest. Following polls is like following the announcer’s call of the Preakness or the Belmont -- it’s a gauge of how the race is going, and we all get to cheer on our favourite political tribe. At the same time, polls allow parties to evaluate their progress and adjust their strategies accordingly. Public polling allows voters to calibrate their own voting strategy in light of where the electorate seems to be headed. So for example, if a cynical voter is hell-bent on throwing out the current set of bums no matter what the cost, she might change who she plans to vote for by looking at which opposition party the polls say has the best shot at winning.

The Vote: For cynics, voting is foremost an exercise in tribal support and affiliation. The cynic votes for the party whose brand or identity they find most appealing, regardless of platform specifics. At the same time, for politically disaffected cynics the major function of the election is to enable democratic control over the cycling of political elites. As noted in the previous section, the cynical voter might then be inclined to vote strategically: it will matter more to them that “the other side loses” than that “their side wins”.

Because political power is indivisible and rival, enabling the cycling of elites is the crucial function of the electoral system. Its effectiveness is to be measured by how well it accomplishes this, not by how it allocates seats or apportions power. In a society marked by deep diversity and profound disagreement about the proper goals of government, an electoral system that allows power to be gained and controlled by a workable plurality of voters might be not only acceptable, but even welcome.

Homework: Read the Liberal Party's recent Real Change manifesto, and try to place it on the naive/cynical continuum. Once you have done that, try to give that placement a cynical interpretation. That is, ask yourself which political tribe the Liberals are trying to appeal to with this manifesto.  

Thursday
Apr232015

How not to go to graduate school

Those whom the gods choose to destroy, they first send to graduate school.

 That would be the gist of my commencement address, should any university be so unwise as to invite me to speak to the graduands.

There are many different programs and many different disciplines, so I would stress that my comments are aimed at anyone thinking of a Ph.D. in the humanities or the social sciences, with the thought of eventually looking for work as a professor. Anyone out there with these aspirations needs to understand that there are a few good ways of going to graduate school, and many bad ways; and virtually nobody within the academy has any interest in helping you sort out the first from the second.

 In light of my own experience, then, I offer this advice on how not to go to graduate school.

I graduated from McGill in 1993 with a B.A. in philosophy. I hadn't given much consideration to what I wanted to do with my education or my life, and maybe eventually going on to either law or journalism school was about as creative as my thinking got. Bored and broke, I took a job working in a fish market for minimum wage. It was a rotten job, essentially a performance-art demonstration of Bertrand Russell's maxim that work is of two kinds: altering the position of matter at or near the earth's surface relative to other matter, or telling other people to do so.

Desiring neither to alter the position of matter nor to require other people to do so, I slapped together a late application to an M.A. program at the University of Toronto, begged for some quick reference letters, and was duly accepted at the end of July. Four months after my university education had ended it began again, with no real direction or purpose save the bourgeois expectation that good things eventually happen to smart people who follow their bliss.

And at first, I certainly enjoyed being a graduate student. What's not to like? It is endless summer for eggheads, with a straightforward reward structure: read a lot of interesting books, talk to a lot of interesting people, and spend a great deal of time drinking and hanging out. Write it up all smart-like, and someone will give you an A.

I did well, and as a falling stone follows gravity I dropped into the Ph.D. program. There was still no firm plan of study, no pressing research project. I just liked the lifestyle, and doctoral work brought more of the same but now with scholarship money, some teaching, and a hint of status.

I continued to do well enough, but had trouble jumping through some of the tighter hoops. It took me three tries to get a thesis topic and committee together, and I thought -- a lot -- about quitting and doing something more productive with what remained of my twenties. But quitting would bring its own worries. My big problem was that while I had no idea what I wanted to do in philosophy, I at least thought I knew that I wanted to do philosophy. Quitting would only inscribe the same question on a much bigger blackboard: what to do?

Taking the path of less resistance, I stayed. I finally settled on a topic and a supervisor, and submitted my thesis a respectable 5 1/2 years after I had started. In hindsight, I was pretty lucky. I had a supervisor who took his professional duties very seriously. He dragged me through the writing of my thesis, forcing me to write, revise, and write again, until I had enough material to pass as a thesis. Then he told me to submit it and get out.

Not everyone is so fortunate. There was one student in my department who was pushing 40, a decade into his program, and who owed close to $100,000 in loans to U.S. banks. Today, I routinely run into people in my Ph.D. cohort who are under the illusion that they are going to finish their degrees, 12 years after they started. They are encouraged in this belief by the university, which tells them they are some of the best and the brightest, by professors, who like them and genuinely wish them well, and by family and friends who equate leaving with failure and who are not sufficiently acquainted with the fallacy of sunk costs.

The fact is, nobody wants you to quit graduate school. Certainly not the university or your department, for whom graduate students are a source of public money and of cheap labour. Professors are reluctant to tell their less capable students to quit, partly because it smacks of condescension, but also because it does not reflect well on them.

Besides, if you tell a student, six or eight years after you admitted her, that maybe she isn't cut out for the biz, she might rightly ask: why didn't you tell me sooner? Yet quit is what most of them do, eventually. Attrition rates at the doctoral level in Canada are very high, with only 45 per cent of humanities Ph.D. students pursuing the degree to completion. The social sciences are only slightly better, with a completion rate of about 55 per cent.

For those who do stick it out to the end, average time to completion has been going up steadily since the 1960s. In his book The Creation of the Future, Frank Rhodes, a former president of Cornell University, wrote that doctoral education is one of the most wasteful of all activities in the university: 

All too frequently, Ph.D. students are allowed to drift, neglected by the faculty, while serving them in useful but lowly functions in teaching and research. The intellectual wastage, individual frustration, and personal financial loss this attrition involves are a personal tragedy and an institutional reproach that cries for amendment.

It is enough to make you howl. Or Howl: I saw the best minds of my generation destroyed by grad school, starving hysterical naked, dragging themselves through the library at dawn...

Ludwig Wittgenstein once said that the only reason to do philosophy was if you are bothered by philosophical problems. This is a useful maxim that applies to every academic discipline, and which every prospective grad student should write on the back of their hand.

Because in the end, the path to succeeding as a Ph.D. is straight as an arrow:

1. Figure out what problem in your discipline bothers you. 

2. Ask yourself if you are willing to commit the time, money and energy to solving it. 

3. If yes, do some research and figure out where the best work on this problem is being done.

4. Apply to that department, and get in.

5. Once there, pursue your problem with single-minded dedication. Forget about becoming well-rounded in your discipline, turn down opportunities to teach, and go only to select conferences where the heavy hitters in your area will be.

6. Start writing as quickly as possible, and publish your results as soon as you have them.

7. Hit the job market with a thesis, some publications and a solid research project ahead of you.

Simple, sure. The problem is that most people do little to none of this. I certainly did not. Instead, like many of my cohort I got sidetracked into the penumbral areas of academic life: teaching, reading, coursework, student unions, journalism, university service. These are the sorts of things that are rewarded within the university and give the illusion of being relevant to your professional development. They aren't.

If you ever finish your thesis and hit the academic job market, no one cares if you were departmental union rep or served on the academic senate or won the graduate teaching award.  This point cannot be overemphasized: the only thing that matters is that you finished, that you published, and that you will continue to do so for years to come.

 The mistake most students make is to treat graduate school not as preparation for academic work but as a way of extending undergraduate life. They enrol in M.A. and then Ph.D. programs because they like being students and because they enjoy reading in their discipline.

But doing something because it is enjoyable is not a job; it is a hobby. The graduate-school bill of lading says that if you can manage to stick around long enough, your hobby will turn into a job and you'll suddenly get paid for doing something that, for the better part of a decade, you have paid to do. Hardly anyone reads the fine print, which notes that this promise will be fulfilled for something less than one candidate in five.

A friend of mine once got some excellent advice from one of her professors, after she approached him to ask about getting a letter of reference for an M.A. program. He told her that there were a lot of interesting ways for a smart young woman to make her way in this world, and that maybe graduate school wasn't one of them.

More professors should have the courage to give their best students similar advice. More than likely, pursuing a Ph.D. program will leave you some combination of broke and unhappy. There are more pleasant ways of being ill-paid in this world, and more remunerative ways of being unhappy.

 

Tuesday
Aug192014

The normalization of the extraordinary, and other thoughts on Ferguson


1. As the events play out each night, we can't ignore the fact that for a great many people, rioting is fun. Not all the protesters are on the side of the angels.

That said, what we are seeing in Ferguson is, more or less, the making explicit of a number of post-9/11 trends in North America, beginning with:

2. The permanence of the temporary: the extraordinary police and judicial powers that were awarded immediately after 9/11 under the guise of defending liberalism against clear and present danger have become entrenched. To this extent, we have validated the Nazi philosopher Carl Schmitt's view that liberal democracy is at core a sham, because it is unable to defend itself on its own terms.

3. The normalization of the special: The S in SWAT might as well be changed to N, since we have asked police at all levels to be faster and more aggressive in their response to potential terrorist threats. Don't wait for backup if someone is shooting up a school, and so on. But this means giving beat cops tactical shotguns and Kevlar.

4. The militarization of policing in North America, especially the United States, has been a recognizable problem for a few years now. The most obvious problem of course is the outfitting of even rinky-dink county cop shops with LAVs and sniper rifles and MRAPs and camo and all the rest.

5. This is partly a response to points 2 and 3 above -- that is, it was driven by the feds -- but don't discount the visceral desire of many cops, especially those too dumb or too fat or too old or too young or too risk-averse to have fought in Iraq or Afghanistan, to "get some". The more they are kitted out like the army or the marines, the more they are going to feel like soldiers, and the more they are going to want to act like soldiers, and the more they are going to want to earn their stripes in something resembling combat.

6. That is why the real problem with what we are seeing in Ferguson is not the equipment, but the culture. And by that I don't just mean the culture of policing, but our culture as a whole. Over the past decade, the dominant themes and motifs of our culture have become increasingly militaristic. Partly it's video games, but that's just a small part of it. Stuff that is essentially gym gear is now branded as "combat" or "tactical". People used to go running or to the gym; now they do Spartan Races or Tough Mudder courses and go to CrossFit, which has its origins in the military. Weekend warriors no longer play paintball; instead they participate in compressed versions of the SEALs' Hell Week. And so on.

7. What gets lost in all of this is the extent to which the military is a distinct culture, and you can't simply give its gear and its training methods to police and expect good results. One thing that is interesting about the military is the rigorous legal, administrative, and moral codes that govern the use of force. You could call it "honour", though you'd just get laughed at; regardless, a number of veterans of Iraq and Afghanistan have been commenting on social media about how the police in Ferguson have operated under far looser rules of engagement than soldiers did walking patrol in Fallujah or Kandahar.

8. This alone should make everyone stop and think really, really hard about what has happened to policing, because

9. Enforcing the law and maintaining the civil order in a constitutional democracy is fundamentally different from forcing a political outcome upon a foreign power through the use of violence and deadly force. They are so different that the very notion that the equipment, training methods, culture, and norms that govern them should be shared is insane.

Exercise for further discussion: The Libertarian Conundrum

10. The events in Ferguson pose an interesting problem for libertarians. While it has been gratifying to see people like Mark Steyn chime in on the absolute gong show that is law enforcement in America, they quickly run up hard against what we can call the Libertarian Conundrum:

On the one hand, libertarians are in favour of what has been called the "night watchman" state. The state should enforce contracts, protect property, life and liberty against assault and theft, but otherwise leave people alone.

But a big part of leaving people alone is letting them own guns, so a libertarian society is one that will likely have a large number of individuals armed with military-grade weaponry. And as a number of police officials in the US have pointed out, we can't have police forces trying to enforce the law with revolvers when they are going up against people armed with AR-15s.

So if you're a libertarian, you have a problem. You can have your guns, or you can have your minimal state, but you can't have both. It would be interesting to debate which is worse, the Nanny State, or the Military State.

 
Monday
Aug052013

In praise of vat-grown meat

The world's first lab-grown hamburger, out of Mark Post's lab, was taste-tested today, and by all accounts it wasn't too bad at all. Some key points from the CBC's story:

  • "The first (lab-made) meat products are going to be very exclusive," said Isha Datar, director of New Harvest, an international nonprofit that promotes meat alternatives. "These burgers won't be in Happy Meals before someone rich and famous is eating them."
  • Sergey Brin, a co-founder of Google, announced that he funded the 250,000-euro ($330,000) project, saying he was motivated by a concern for animal welfare.
  • Scientists agreed that improving the flavor probably won't be hard.
  • The animal rights group PETA has thrown its support behind the lab-meat initiative.

You couldn't find a more perfect combination of private entrepreneurialism, X-Prize-style achievement, social welfare, and status seeking. I love the idea of vat-grown meat. Here's a piece I wrote about the prospect a year and a half ago, when Post announced that his team was getting close to a viable product. It was originally published in the Ottawa Citizen:

***

Meat without the murder

Monday, February 27, 2012

It's getting increasingly hard to find anything good to say about meat. It is expensive, swallowing huge tracts of land and tons of grain. It has a large carbon footprint. Finally, industrial meat production is hard on the animals, even before they're slaughtered. The operational slogan of PETA, the animal rights organization known for its radical politics and attention-grabbing stunts, is "meat is murder." But you don't need to grapple with the metaphysics of personhood to concede that for most people, eating animals involves a certain amount of deliberate denial about the circumstances under which meat is produced and how the animals are treated.

As things stand, we are largely divorced from the animal origins of our food - what we see in the supermarket are not animal parts, but flat white slabs of chicken, dark cubes of beef, rounded discs of lamb with convenient bone handles. The extreme end of this distancing is a proposal that was presented recently by André Ford, a student in the architecture department at the Royal College of Art in London. He has designed a sort of vertical biomechanical latticework into which lobotomized chickens could be plugged, with food, water and air delivered by a network of tubes and excrement removed in the same manner.

If that sounds disturbingly like the Matrix for chickens, that's pretty much what Ford is proposing. But while this project is more shock art than industrial architecture, Ford makes an important philosophical point: "It is time we stopped using the term 'animal' when referring to the precursor of the meat that ends up on our plates. Animals are things we keep in our homes and watch on David Attenborough programs. 'Animals' bred for consumption are crops and agricultural products like any other."

If that strikes you as repulsive, perhaps that is because it forces us to confront the bad faith that permeates our debate over industrial meat production and consumption: If chickens are not "animals" (in the David Attenborough sense) then there is no real objection to something like Ford's proposal. But if they are animals, then perhaps we should stop eating them. At the very least, their capacity for fear, pain, and suffering is morally relevant to how we treat them.

But what if there was a technology that did away with all of these drawbacks, cutting through the moral bad faith while giving us a constant supply of low-cost, environmentally sustainable, and suffering-free meat products? That is, if we could grow meat in a vat, would there be any reasonable objection to eating it?

It's still a hypothetical question, but it won't be for long. At the American Association for the Advancement of Science (AAAS) annual meeting in Vancouver last week, professor Mark Post of Maastricht University announced that his team is getting closer to a workable process for lab-grown meat. Their first successes were with pork, and by the end of this year they hope to replicate the process with beef, giving them a product "that looks, feels and hopefully tastes like meat." Another scientist working on meat in a vat is Vladimir Mironov of the Medical University of South Carolina. Mironov envisions "football field-sized buildings filled with large bioreactors, or bioreactors the size of a coffee machine in grocery stores" to produce this meat. "It will be functional, natural, designed food," he said. "How do you want it to taste? You want a little bit of fat, you want pork, you want lamb? We design exactly what you want. We can design texture."

The idea of custom-designed meat products opens up a whole new realm for interesting (and relatively harmless) experimentation and competition. You can imagine celebrity chefs designing their own special lines of meat textures and tastes; a well-designed "blend" could be sold for meatballs, or stews, or meat pies. Imagine the delights of a steak that was a mixture of lamb and venison, or - even better - polar bear and panda.

At the same time, in vitro meat will suffer from all the drawbacks of everything else that is produced cheaply and for mass consumption - it will be "inauthentic." And so it will inevitably drive a more pernicious form of authenticity-mongering among people who will only eat meat grown "on the hoof." At the extreme, you can imagine private or invitation-only restaurants and supper-clubs opening up where certified on-the-hoof meat is provided to the privileged elite. The fact that it might be illegal would only add to the experience.

There is the ick factor, of course. Meat in a vat just sounds gross, even before you read that the Maastricht group's process requires stem cells from cows and calf serum as inputs. But we humans have a strong tendency to confuse esthetic reactions with moral judgments, and if you want to see something that is genuinely objectionable, as opposed to merely repellent, take a visit to your local abattoir.

Mironov's lab was shut down this month by the university over what it described as "human resources" issues. PETA has lobbied for him to continue his work, and is funding his research even though the U.S. government will not. This is telling, because PETA understands that the goal is not to micro-manage consumer preferences, it is to prevent harm to animals and to the environment. And if all of that is taken out of the equation, there's really not a lot to object to when it comes to eating meat.

 

Monday
Jul152013

How the world works

*Comments are open for this post*

Nature matters more than nurture

Sex matters more than gender

Friends matter more than parents

Situation matters more than character

Norms matter more than laws

Institutions matter more than culture

Economics matters more than morality

Family matters more than state

Narrative matters more than truth

Identity matters more than rationality

Cohort matters more than generation

Class matters more than income

Status matters more than well-being

Race matters

 

Friday
Jul122013

Hipster chickens coming home to roost


“It’s the stupid foodies. We’re just sick to death of it…. People don’t know what they’re doing." — Chicken Run Rescue owner Mary Britton Clouse.

Some things are so predictable they are indistinguishable from a natural law. Night follows day, Leafs don't win the Cup, summertime construction on the streets of Montreal. And so it is with the rise in unwanted chickens being dropped off at animal shelters across North America.

It all began early in the previous decade, when forty years of cool-hunting was quickly supplanted by authenticity seeking, and food replaced fashion and music as the primary basis for urban one-upmanship. Organic turned to local, became artisanal, and morphed into a full-blown back-to-the-19th-century self-sufficiency movement. Take a big helping of modern foodieism, fold in locavore-driven moralising, add a double dash of hipster status-seeking, and you got the urban chicken-farming movement. 

And now that the fad is getting a bit tired (the cool kids have moved on to shooting), the chickens are being abandoned by their owners. Chickens, it turns out, are a lot of work. They're also not cheap to own and operate. They are pretty mean animals. They stink and they are gross. And they can live for ten years or so, long long after they've stopped laying eggs. 

A lot of formerly eager chicken owners, having finally done the math, are dropping them off at the local humane society or animal rescue centre. (Why not just eat them, you ask? Good question.)

Anyway, it isn't like this wasn't completely predictable. In fact, the executive director of the Ottawa Humane Society made all of the obvious points three years ago, when the fad first came to the nation's capital, in an interview with CBC Radio. It's a fad, he said. It's expensive. It's hard, and it's gross. He predicted that the chickens would be coming his way soon enough, and said "frankly, we don't need the work". 

You could argue, as my colleague David Reevely did at the time, that this is somewhat akin to the fire department objecting to new housing subdivisions: it is more of an argument for changing the funding model of the Humane Society than it is an argument for banning urban chickens. After all, he argued, we should expect the vast majority of urban chicken owners to be just as responsible as the vast majority of cat and dog owners. Why punish the many for the sins of the few?

Except the problem, as I see it, is that urban chicken ownership is nothing like cat and dog ownership. Cats and dogs are domestic pets, while chickens are domesticated livestock. There is a large secondary market for abandoned cats and dogs, while the secondary market for urban chickens is likely to be non-existent (unless the market is the local food bank). But most importantly, unlike the urban chicken craze, cat and dog ownership is not a transient fad that will be supplanted by something even more authentic within a few years. 

Nor does it help matters to concede, as Barbara Cartwright, the CEO of the Canadian Federation of Humane Societies, does, that "people who attempt to raise backyard chickens are driven by good intentions — to be more environmentally conscious, humane and to eat healthier." The road to Perverse Outcomeville is paved with these intentions, and it does no good to praise good intentions while ignoring the actual consequences. 

So let's just state it plainly: Urban chicken farming is no more environmentally conscious, no healthier, no cheaper, and no more economically worthwhile than regular chicken farming. And now that urban chickens are being abandoned by their owners like hamsters by five-year-olds with ADD, the one selling point of the movement -- that it is more humane than factory farming -- is gone. 

Once we get clear on this truth -- that it is consequences that matter, not intentions -- we can turn our attention to the broader problem of people wanting to bring animal husbandry back into our cities. There were sound reasons why we pushed this part of our economy out of the downtowns -- reasons based in public health, animal welfare, and simple economics -- and before we take our city centres back to the 19th century, we should at least make sure our decisions are based on something other than hipster fads wedded to dopey nostalgia. 

Monday
Jun032013

Why Rob Ford is the Amy Winehouse of Canadian politics

They tried to make me go to rehab but I said 'no, no, no' — singer Amy Winehouse, before dying of alcohol poisoning.

"Everything's going fine" — Toronto mayor Rob Ford, last week, after a bunch more staffers resigned. 

By now it is pretty clear that whatever else he may be, Toronto mayor Rob Ford is a very sick man. He suffers from any number of pathological cravings, obsessions, and addictions, from food to sex to booze to whatever else. And like all addicts, he has more than his share of enablers — people who helpfully pave his road to self-destruction even as they pretend to be acting in his best interests.

I’m talking about the innumerable pundits and reporters and fly-by-night political commentators who have spent the better part of the last three years telling everyone who would listen that Rob Ford’s vices are actually virtues, that his addictions are features, not bugs, and that the Unique Selling Proposition that the mayor uses to keep Ford Nation barking at the moon is the fact that he’s “authentic”.

What is authenticity, and why does it matter to politics? That’s a tough question to answer in a short space — I wrote an entire book about the subject and don’t think I even really got to the heart of the matter. But like a lot of bad ideas, the cult of authenticity seems to have entered our political vernacular from the United States, where there has been, over the past decade, a growing conviction that the biggest problem with politics these days is that our leaders are not authentic enough.

The argument goes something like this: modern politics has become dominated by large political parties and their shiny, prefab leaders, who are about image, not substance; who speak only in sound bites and talking points; who govern with both eyes on the overnight tracking poll; and who delight in breaking their promises while pretending they never made them in the first place, demonizing their opponents while purporting to take the high road. Modern politics is mass-marketed phoniness, and it is no surprise the electorate is completely alienated.

What we crave (to continue the story) is authenticity.  The American writer Joe Klein signposted the search for the authentic in his 2006 book Politics Lost, an essay about the decline of authenticity in presidential politics. Klein took his inspiration from what he called Harry Truman's "Turnip Day" speech at the Democratic convention in 1948 that confirmed his nomination for president. Coming on stage after midnight, speaking plainly, simply, and without notes, Truman challenged the "do-nothing Congress" to act upon those views they claim to endorse, and get back to work.

Klein thinks we need more Turnip Day moments, more politicians like Truman. He argued politicians need to "figure out new ways to engage and inspire us — or maybe just some simple old ways, like saying what they think as plainly as possible."

It’s a good anecdote. The problem is that now every authenticity-mongering pundit wants their own Turnip Day homily with which to beat the audience into submission. The most famous variation is David Brooks’ throw-away line about how Americans always vote for the presidential candidate they would most like to have a beer with. (A principle which, if true, would see Joe Biden elected president-for-life.)

A year and a half ago, the Canadian pundit Allan Gregg delivered a lecture to the Public Policy Forum called "On Authenticity: How the Truth can Restore Faith in Politics and Government” in which he claimed that our leaders' most systematic failure is that "they have not picked up on the electorate's craving for authenticity nor adjusted their behaviour to conform to this new reality."

Gregg had his own Turnip Day homily to explain just what he was getting at. He told a story about the night he went to see a band in a club in Manhattan, when the guitar player's electric pickup broke. Instead of stopping the show to fix the guitar, the band unplugged their instruments, moved closer to one another, and performed an intimate number. "As the last chord was struck, the room literally exploded with rapturous cheering, hooting."

Gregg saw a lesson in this for our politicians. What they need to do, he suggested, is unplug from the way they've always done things and try to reconnect with the electorate. They must drop the prefab talking points designed to "conceal meaning." They need to stop claiming to be the only island of virtue in a sea of knaves. They should cancel all political advertising, and talk straight to the people, saying what they mean and meaning what they say.

And the avatar of this movement, according to Gregg, is Rob Ford, whom he describes as  "a leather-lunged, no necked, know-nothing." And in case you think that’s an insult, Gregg goes on: "In Rob Ford’s instance, his very crudeness and unrefined nature made him seem ‘real’ and signalled he was not trying to hide anything from voters." That is to say, Rob Ford won the race for mayor of Toronto because he’s authentic.

Allan Gregg is far, far, far, from the only person to have made this argument. The "Rob Ford is popular because he’s authentic" line started during the 2010 election and continues even as he fights to keep his job over allegations that he’s a crackhead. Here are some selected examples:

We've got some fascinating artifacts of authenticity on our political stage today, some good, some troubling. When Rob Ford was first elected, I stood in a public square listening to him speak, thinking, uh oh, this man is trouble for all who oppose him. Why? Because the mayor says what he means, and he doesn't give a flying fig what opponents think of him. — Judith Timson, Toronto Star, April 2013

Ford, who won by running as an unrefined, yet garishly authentic, outsider, is an outsider once more. His war against the downtown establishment - they of bike lanes and gravy trains - can now continue with renewed relish, and perhaps even success; if Ford runs again, he may well win. — Adam Goldenberg, Ottawa Citizen, November 27 2012

Christie Blatchford has written a number of columns lauding Ford for his "authenticity", and while she has made a point of saying that she regrets voting for him, she also makes a point of reminding readers why she voted for him in the first place. The key for Blatchford is not who Rob Ford is, it is who he is not: He wasn’t a part of what she calls "that soft-left ruling class” that likes to think they run Toronto. And so we get to:

Mr. Ford is surely deeply flawed. Well, so are most of us, me anyway. But, to use a modern term, he is also authentic. — Christie Blatchford, National Post, November 26 2012.

I could pile up the examples like cordwood — just Google “Rob Ford” and “authentic” and your evening will be shot — but you get the picture. The question is, what are we to make of it? How can we get any critical traction on Rob Ford when we are told, over and over again, that what looks to all the world like a serious problem with his character is actually his greatest asset?

It is tempting to rehearse, yet again, the arguments for why the desire for authenticity in politics is self-defeating, and prone to catalyzing the very problems it purports to solve. But instead I’ll try a different approach and suggest that what is being pitched as “authenticity” is actually something far more dangerous, for both the electorate and for the leader who cloaks himself in its embrace.

For starters, Ford’s supporters consistently mistake populism for authenticity. Authenticity, at its purest, represents a perfect alignment between the inner self and its outer manifestation. It describes someone who is self-contained but transparent to the world, innocent without being naive, and sincere without being cloying. Such a person, if he or she ever existed, would make an absolutely atrocious politician.

Rob Ford is not authentic. Instead, he’s just another populist. And in the current climate of North American politics, populism is just another put-on, a mask, a front, that some politicians adopt in order to seem like one of the people. In America, populists thump bibles and kiss babies and warn against commies and talk about craw fishin’ or huntin’ and talk about the Heartland and Families and the Family Farmer. In Canada, populists write books about hockey and hold press conferences at Tim Hortons and warn against commies and talk about hockey and warn about crime and defend the Family Farm and give medals to hockey players.

Again, populism is not authenticity. It’s a pose, a marketing position, a brand. And it is just as phony as any other political posture out there. Sometimes it works, as it did for Rob Ford. And sometimes it flames out spectacularly, as it did for that moose huntin' maverick mom, Sarah Palin. 

But it isn’t clear that Rob Ford is even much of a populist. About the only truly populist kite he’s ever flown is the whole stop-the-gravy-train thing, which some people thought meant he was committed to lowering taxes. As it turned out, he actually thought there was a literal gravy train at City Hall and that stopping it would fix Toronto’s finances. He’s also a bigot and pretty obviously hates the gays, and if you want to call him a populist on those grounds, you’re welcome to the term and the baggage it brings with it.

No, there’s something more basic to Rob Ford’s personality, and there’s nothing that appealing about it: the man has zero self-control. Whether it is reading while driving himself to work, drinking at official functions, going to KFC while on a much-publicized diet, or allegedly smoking crack and hanging out with drug dealers, it is clear that Rob Ford is simply incapable of resisting temptation, delaying gratification, or otherwise acting in a manner that serves anything other than his short-term interest.

And — it is crazy that this needs pointing out — there is nothing politically or morally praiseworthy about this. In the Republic, Plato hailed rational self-mastery of the passions as the key to both personal well-being and the proper functioning of the city. A few thousand years later, Freud suggested that the control of the id by the super-ego, moderated by the ego, was the key to being a properly formed adult, and the lynchpin of civilisation. In between and since, no one has seriously made the case that rule by the passions, the id, the animal instincts, whatever you want to call it, is a viable way to run a polity of any size. More to the point, no one has credibly argued that this is any way for a grownup to behave. 

Except, that is, Rob Ford’s enablers, whose greatest fear is that Rob Ford will go to rehab and expose their ongoing support for what it really is: a dangerous and foolish egging-on of a very sick man. Which is what makes Rob Ford less of a buffoon and more of a tragic figure. It turns Rob Ford into the Amy Winehouse of Canada.

Remember the first time you heard Winehouse singing “Rehab”? I do. I loved it. The casual defiance, the stick-it-to-the-man refusal to go along with square society’s medicalization of boozing. Which is weird, because I actually co-wrote a book criticizing that very attitude – the studied rebellion that treats every institution, from grade school to the hospital, as part of the great conformist system of mass society.

But love it I did. We all did, for mostly the same reasons. Why should Amy Winehouse go to rehab? After all, weren’t her problems – her drinking, the drugs, the depression and the self-harming – the very font of her art, her creativity, and her soul?  “Rehab” became a rallying cry for barflies everywhere. It also showed that, despite decades of public education on this issue, we still don’t take seriously the proposition that alcoholism, drug abuse, and even depression, are actual illnesses.

Imagine if, instead of being an alcoholic, Amy Winehouse had cancer. And imagine she wrote a song called “Chemo” with the lyrics “they tried to make me go to chemo, I said ‘no, no, no’”. Or if she had an infection, and she sang “they tried to give me antibiotics, and I said ‘no, no, no’”. It would be a joke. But deep down, most of us don’t quite accept that alcoholism or drug addiction are diseases like any other. It’s self-destructive, sure, but there’s also something romantic about it. These are not new observations: the celebration of fucked-up artists is one of the defining features of our culture. When Amy Winehouse recorded “Rehab,” she was telling the world that she didn’t buy into the notion that her drinking was an illness that needed treatment. When we bought the record by the millions and gave her a Grammy for it, we told her we agreed.

Did this popular support play a role in her subsequent death? When she sang about not going to rehab and we cheered and called her authentic, did she internalise the value system we were pushing on her?  That is, I wonder if Winehouse, like others before her and since, bought into her self-image as a messed-up singer of the blues, which made it that much harder for her to get clean.

I'm not suggesting she was simply playing a role, or that she killed herself in the name of cred, but there is a powerful looping effect in all of our identities. All identities are social constructs which get their power from being recognized by others. As a result, there is a feedback loop in our identity construction, where we internalise the norms that govern our chosen (or assigned) identities. When the norms of a given identity contain a built-in mechanism for both radicalisation and self-destruction (as they do for an identity like "messed-up singer of the blues"), it is not hard to see how it could become literally inescapable.

So then imagine you one day find yourself the mayor of one of the biggest cities in North America. You aren’t without your charms, and the people around you aren’t without political savvy. But you also have serious personal problems, which play havoc with your health and your personal life, and threaten your ability to do even the most minimal parts of your job. Yet the worse things get, the more you spiral down, the more your so-called supporters cheer you on.

What would you do? Where would you go? Who would you turn to for advice? In such circumstances, I think you would hope you could rely on someone who has known you all your life, who loves you for who you are but who knows that who you are involves habits and appetites that, unchecked, might get you and even others killed. That is, you would hope there was someone close to you who loved you like a brother.

Does Rob Ford have such a person near him? I honestly hope he does. His life almost certainly depends on it.