1. My day job is working as an editor at Canadian Business magazine. One of my favourite elements in the front section is our "Ode", which is meant as a sort of obituary for failed companies, brands, or product lines, with a bit of a moral or life lesson at the end. Recent Odes we've published have been to the Hummer, Little Orphan Annie, and the Ford Mercury.
I wrote the most recent one, about Old Spice Guy, and in this case it is more of a success story:
Like all top performers, Old Spice Guy knows that it's always best to leave them wanting more. And so, with online viewership just starting to tail off and the copycats (including an execrable one from Cisco Systems) already appearing, the Old Spice Guy campaign was cancelled. Isaiah Mustafa is off to make a film with Jennifer Aniston, while Old Spice Guy himself retires to an honoured spot, next to the BMW film series "The Hire" and Burger King's Subservient Chicken, in the pantheon of social-marketing success.
2. On the same subject but in a completely different tone -- my review of Sebastian Junger's War:
Like most books that give the grunt's-eye view of combat, War is really a book about masculinity: the distinctly male ways of bonding, in-group/out-group dynamics, and the relationship between male sexuality and violence. These are the same themes that haunt the film, only amplified, and I don’t have much more to add than what I wrote in my review of Restrepo.
Last week, the Wall Street Journal ran a piece by Brett McCracken complaining about the rise of "cool Christianity", which is the effort by churches to stem the outward flood of young believers by making belief seem cool or edgy. A largely unhelpful criticism of the piece can be found here, but I actually think McCracken is exactly right when he says, "As a twentysomething, I can say with confidence that when it comes to church, we don't want cool as much as we want real."
One of the most interesting sets of responses to the argument of the Authenticity Hoax has come from religious readers, many of whom have heartily endorsed the book's conclusions. At first I was confused by this (since the book more or less assumes secularism), but then I realised that many believers see my arguments as a vindication of what they have been saying all along. Insofar as the search for the "authentic" is an attempt at replacing the lost religious dimension of our lives, my argument, to the extent that it succeeds, can be read as endorsing the religious worldview. Of course the search for the "authentic" in all things eco- and organic is a hoax: the only place to get the real deal is on your knees, in the pews.
It's funny, the churches seem to be getting on the "cool" bandwagon about a decade too late. Cool is dead; it's all about authenticity now, and churches are abandoning their key product line just as it is coming back into style.
The BBC has a story today about the latest in downshifting: digital minimalism. Kids these days are discovering that atoms are for oldsters. Unlike their parents and older siblings, they don't need to lug around shelves full of books, folders full of files, boxes full of CDs. Instead, they live in the computing cloud, replacing all that heavy stuff with everything from "online photo albums to virtual filing cabinets to digital musical instruments."
The irony is that the star of the piece, Kelly Sutton, is a 22-year-old hipster who is clearly angling for a book deal with his blog, cultofless.com. As Rob Horning says over on his blog, this would be easier to take seriously if it weren't just another iteration of half-baked stunt-lifestylism, from No Impact Man to the guy who did a new job every week for a year.
But I'm struck by the relative absence of any moral or anti-consumerist dimension to this. Unlike the Bonfire of the Brands guy (from 2006) or Michael Landy's infamous Break Down work of performance art from 2001, the digital minimalists seem to be just looking for a new kind of consumerism. Again, I'm inclined to agree with Horning that it's mostly a form of implicit status-seeking. And as he says, "Not everyone can be 'minimalist' because then minimal will simply become normal, and some new distinctive posture will have to be adopted."
What pose might that be? It's funny, this came up in my discussion with Dan Gardner during the Ottawa Writer's Festival. I was talking about the successive iterations of Veblenian status-seeking, from conspicuous consumption to conspicuous rebellion to conspicuous authenticity, and Dan asked me what the end point would be, where it goes from here. I didn't have a good answer at the time, but I was thinking on it afterward and I think it probably goes in two directions.
First, we get an even greater fetishization of unique physical objects. You can already see this, with the cult of "artisanal" that has already replaced "local" as the definitive authentic consumer good (for a recent example: artisanal toothpicks). But beyond that, I think we'll start to see the body itself become the site of conspicuous lifestyle display. It's no accident, I think, that Fukuyama followed up The End of History with a book about posthumanism. When our sense of self is no longer wrapped up in world-historical ideologies or isms, what is left of human identity? When we no longer need stuff to express who we are or where we stand, how do we engage in status display? When almost everything of value is made of bits, the last field of contention is our physicality itself, whose tenuous hold on the real becomes the central narrative in the digital age.
The New Scientist reports on a study by Shahzeen Attari at the Center for Research on Environmental Decisions at Columbia University on voluntary efforts at reducing carbon emissions, one that underscores just how ineffective these measures are. Even if we set aside the obvious (and I think insuperable) free-rider problems and assume that everyone is on board with the need to cut emissions, there is a bigger problem, which is that people are not very good at estimating the energy use of various activities and appliances:
“For small devices and appliances, people have a pretty good understanding of how much energy they use,” says Attari. “However, for large devices they really underestimate the amount of energy they use.”
As a result people underestimated the energy consumed by devices by a factor of nearly three, she says. So when asked to compare moderate energy-consuming devices, respondents correctly identified that desktop computers consume more energy than laptops, but they significantly underestimated the difference.
One could argue that the solution in this case is a proper public education and labeling system. So just as you know how many calories and grams of sodium are in every bag of chips you buy, we could force manufacturers to put carbon footprint labels on everything they make. But one difficulty is that in order for labels to be effective, people need to be able to understand them — and as Chris MacDonald has been arguing, that’s a very shaky assumption.
A bigger problem is the Achilles heel of all attempts at changing society through moral incentives, which is the way even the best-intentioned people are prone to rationalization. There are a lot of things we want to do or consume, and when we are made to feel guilty about it we tend to “launder” our consumption through a moral filter. As Attari says, “Because people might just replace their light bulbs, and feel like they’re off the hook; that they’ve done their part.”
This feeling is even more pronounced when it comes to foreign travel, according to Graham Miller at the University of Surrey in Guildford, UK, and colleagues. When the researchers asked volunteers in the UK about their attitude to sustainable tourism, they found people felt they “deserved the right to fly because they took pro-environmental actions throughout the year”.
So here’s why voluntary efforts will never work:
1. It’s a massive collective action problem.
2. Consumers don’t have the information they need to make informed decisions.
3. Even when they do, they tend to rationalize their preferred behaviour.
The obvious — indeed, only — solution is a carbon tax, which handily eliminates all three problems at once.
The first book of philosophy I ever read was Walden, by Henry David Thoreau. I read it in high school, and was mesmerized. The volume I had included his essay on Civil Disobedience; it's fair to say that book is what eventually led me to study philosophy in university.
I read something the other day that said that the moral of Walden should not be that technology and civilization are bad, but rather that we need to pay more attention to how we use technology. That's fair enough. To that end, I'm off for a few days to paddle around Algonquin Park in Ontario. I'm looking forward to getting off Twitter and the rest of the internet for a while, but I'll see you all back here next week.
Feel free to keep sending me any links or ideas or criticisms or anything else; I really welcome the correspondence and I value the acquaintances I've made through this book and this blog.
1. A rather annoying interview with Christopher Hitchens conducted by the extremely annoying Jeffrey Goldberg of the Atlantic. As if the big issue is whether Hitch will discover god on his deathbed.
2. Stephen Hawking can't make up his mind. First he said that humans need to be worried about rapacious spacefaring aliens coming to steal our natural resources. Now he says that that species is most likely to be... us. He's rapidly turning into David Suzuki.
3. In praise of the enlightenment, from City Journal.
4. An adman quits to find his soul -- an authenticity quest. (via @ajkandy)
The fourth season of Mad Men is well underway now, and last night's episode saw 1964 topple into 1965. After showing us occasional hints of the cultural earthquake to come (especially Midge's beatnik friends in the first season), the big show is finally starting. Last night's show flagged just about everything to come -- Joan's abortions, her husband getting ready to ship out to Vietnam, and the Berkeley sit-ins of 1964-65. The last days of the old regime went out in style though, with Lane and Don spending their New Year's Eve getting drunk on scotch at the office, watching Godzilla, putting steaks on their genitals, and picking up hookers. You think that's bad behaviour? Wait till we see what the kids get up to this year.
What will it mean for the ad agency? As always, Natasha Vargas-Cooper's footnotes to the episode are excellent, and this point is bang-on:
In the 60s, the symbolic role that youth played in American culture—honesty to self, renewal, rejection of ancient values—became a driving market force. This notion was really that becoming an adult meant participating in consumer culture. This is perhaps the most loathsome legacy of the Boomer's ascent to cultural dominance: the perpetual teenage mentality of rebellion through buying things.
On the excellent ABC blog "The Drum", an Australian academic named Jeff Sparrow laments the importation of the discourse of "authenticity" into Australian politics. It seems that the current election campaign has become dominated by a competition over which candidate, prime minister Julia Gillard or opposition leader Tony Abbott, is "more real", complete with a call for Gillard's handlers to "let Julia be Julia". It's a shame, really.
The first half of Sparrow's piece pretty much tracks the argument in chapter six of The Authenticity Hoax -- he's even kind enough to give me a shoutout. Then he gets to the heart of the matter:
Quite obviously, Julia Gillard's pledge to find herself does not represent a renunciation of politics-as-usual so much as an intensification of it, with her attack on her handlers almost certainly scripted by those handlers themselves, fully aware of the electoral impact of a properly-designed Turnip Day.
The new focus on political character in Australian politics ('what is Julia really like?', 'just what kind of person is Tony?') directly replicates the preoccupations of US campaigning, in which all candidates try for authenticity, all of the time. But why, exactly, have we moved to this presidential style of politics?
Sparrow and I agree that while the temptation is to blame "the media", that's just not good enough. After all, the media is a consumer good like any other, and blaming the media for serving up bad politics is like blaming McDonald's for serving up bad hamburgers. Someone is buying it; the question is why.
Here is where Sparrow and I part company. He blames "market fundamentalism" and the ideology of "neoliberalism":
The neoliberal turn was always about more than pure economics, involving an insistence that notions of individual autonomy, consumerism, efficient markets and transactional thinking should be extended into all social relations, even - or, perhaps, especially - those that had previously been dominated by quite different rules.
I don't quite agree. Think back to the mid-2000s in Canada, when there was so much hand-wringing over "alienated voters" -- the recurring theme then was that none of the parties properly represented the views of individual voters, and the need to support a big-tent party was seen as a shameful compromise.
One of the recurring arguments in the book is that the "authentic turn", in politics as elsewhere, is, paradoxically, a consequence of anti-market thinking. And yes, I think Sparrow is right that much of the disaffection of these supposedly "alienated" voters seems to have been a product of their having adopted and internalized the ideology of consumer sovereignty. But we need to be a little more careful: the main ideological motivation here is not itself pro-market ideology -- just the opposite. Which is just to say that it is the authenticity-seekers who are creating the very conditions for their own exploitation.
It was inevitable that the book industry would get all shook up by technology, the way music, film, newspapers, and magazines had been before it. Publishing hung on longer only because it took a while for a delivery mechanism to come along that could rival the book as a technology.
The upside of this is that it is opening up new ways of experimenting with publishing -- Neal Stephenson's Mongoliad is a really interesting experiment.
But just as Amazon made the bricks-and-mortar bookstore obsolete, the Kindle, Kobo, iPad, and other such devices are making the book itself a quaint little objet de nostalgie.
I take no pleasure in this -- I write for a living, and writing is a very conservative business. But for that very reason, I'm worried that the publishing business is even worse situated to deal with the challenge than the music and film businesses were. I'm just guessing here, since I have no real experience on the inside. But someone who has worked in the biz, inside and out, is Megan Hustad. She has a great rundown of what the main problems are (with loads of good links as well), but her last point caught me by surprise:
All that said, however, there’s one blanketing sin that largely goes unmentioned. Any publisher that wants to exist let alone remain relevant in 2015 will have to figure out how to wriggle out from underneath it. The fundamental error, as I see it, is that the traditional publishing model privileges this formula:
- Underestimated costs + Overestimated benefits = Project approval.
In other words, before most publishers agree to publish anything, they run sales projections spun from a highly selective glance at the track records of “comparison titles” (as they’re called) that sold well. Comp titles that sold poorly are routinely ignored. Only projects for which all decision-makers have bought into best-case scenarios are pursued.
Now I'm dying to know what "comparison titles" my publishers looked at for the Authenticity Hoax.
As long as you turn the set on and put your finger in the air, if there's any songs out there, they'll come through you. It's very easy to get hung up on just the simple mechanics and craft of songwriting rather than the more important thing that real master musicians like the whirling dervishes can tell us about: just letting it go through you and come out the other side. -- Keith Richards, 1983
Sometimes you have to wonder what year the New York Times thinks it is. About seven years after it became a widespread (and widely reported) problem, Trip Gabriel had a piece this weekend reporting that the digital age is blurring the lines of what constitutes plagiarism for university students. Aside from containing nothing remotely new, the piece is an absolute mess, quoting academics tossing out one half-baked theory after another without even attempting some basic analysis of whether any of it makes the slightest sense.
Stop if you've heard this before: the cut 'n paste features of the internet haven't just made it easier for lazy students to cheat. No, the rip/mix/burn online culture has actually changed our definition of the self.
Lord, are we still talking this way? Apparently so:
A University of Notre Dame anthropologist, Susan D. Blum, disturbed by the high rates of reported plagiarism, set out to understand how students view authorship and the written word, or “texts” in Ms. Blum’s academic language....
In an interview, she said the idea of an author whose singular effort creates an original work is rooted in Enlightenment ideas of the individual. It is buttressed by the Western concept of intellectual property rights as secured by copyright law. But both traditions are being challenged.
“Our notion of authorship and originality was born, it flourished, and it may be waning,” Ms. Blum said.
Look, we may be on the road to some po-mo world where nobody ever says anything new, but the very fact that we find these cases worrisome proves that we haven't quite left modernity behind.
We should start by reminding ourselves that plagiarism is foremost a moral question. Sometimes it is illegal (such as when someone makes use of copyrighted material), but the essence of plagiarism is that it is one of a clutch of ethical offences that include fabricating memoirs or news reports, fraud, lying, hypocrisy, and forgery. What unites these is that they all involve some form of misrepresentation.
In many ways, plagiarism is just the flip side of forgery: The forger passes off his own work as that of someone else, while plagiarists pass off the work of others as their own. Plagiarism is an offence that involves the misrepresentation of the self. The reason why we get hung up about these things is because we hold fast to a number of moral ideals about the self. We give these ideals names like uniqueness, integrity and originality, but the motivating principle is what we can call the ethic of authenticity.
As an ethic, it is an injunction to be true to oneself, to place the cultivation of your real self at the forefront of your concern. Our culture remains strongly committed to the ethic of authenticity. Indeed, the reason plagiarism is on the rise is not because we care less about the morality of misrepresentation but -- paradoxically -- because we care about it too deeply.
Because of our commitment to authenticity, we tend to look down on ideas that are borrowed or derivative. We fight over credit for things, partly because there are potential financial or status rewards, but also because we believe there is something profoundly unjust about people receiving credit for books they didn't write or inventions they didn't invent.
But this actually gives us a strong incentive to lie about where we got our ideas. When her plagiarism scandal first hit, Harvard girl wonder Kaavya Viswanathan claimed that she had simply internalized themes and passages from her favourite books. This is the plagiarist's usual gambit, and it is parodied in the film The Squid and the Whale. At the school talent show, Walt announces that he is about to sing a song he wrote, and proceeds to play a song from Pink Floyd's The Wall. When he's caught, Walt denies that he has done anything wrong. He claims that because he believed that it was the sort of song he could have written, the fact that he didn't was immaterial.
Hilarious, yes, but not far from the truth. Every writer runs into situations where he reads something that seems so obvious, that is so perfectly phrased, that he feels that he would have put it exactly that way, if only he'd thought of it. On these occasions, plagiarism doesn't feel like stealing so much as the appropriation of part of one's true self.
But even this is probably overthinking the problem. It's only at the end of Gabriel's piece that a sane voice enters the scene, in the form of Donald J. Dudley, who oversees the discipline office at UC Davis. Most of the cases of plagiarism, he says, did not come from students in thrall to some metaphysical theory of postmodern identity. Instead, they were simply “unwilling to engage the writing process,” i.e. they were lazy.
Or as Keef might say, it's easier just to let the ideas flow through you and out the other side.
Occam's razor remains a very useful tool, even for reporters.
The New Scientist has a short interview with off-the-grid guru Nick Rosen, who explains why it's getting easier than ever to live minimally. Turns out, it's because the gridders have developed so much awesome technology:
Is living off-grid today very different to how it used to be?
In the 1970s we had the "back to the land" movement and in the 1980s and 1990s, the survivalists. In the last five years a much more sophisticated and multilayered off-grid population has emerged. These days, technology makes it quite comfortable, thanks to wireless communication, low-energy appliances and, of course, renewable energy, which has come a long way since the 1970s.
Someone tweeted this link about the evolution of foodieism out of French nouvelle cuisine. I'm not really sure I understand it, but the argument seems to be something like this: nouvelle cuisine became like a horse race -- about breeding the biggest and best horse on the field. In contrast, there is the dog-show conception of success:
“Best in show” is rooted in standards of authenticity – i.e., the best representation of a type. Increasingly, foodie culture in the U.S. and other parts of the world has begun to adhere to these “best in show” standards as they seek for authentic representations of food.
I'm not really sure I understand this.
Nick Rowe and I were both thinking about Borges’ Library of Babel the other day, though for different reasons (I think). For Nick it was part of another of his fun posts that start from way beyond left field and end up nice and close to home. I was trying to figure out something helpful and original to say about Wikileaks and what it means for journalism. This is where I’m at:
Imagine two libraries. The first library contains every important book that has ever been written. It’s a big library, but not that big. There is only one problem: it is very hard to get into. Access to the stacks is strictly controlled, and when it comes to the best and most important books, it is almost impossible for civilians to even see them. And so while the public knows where all the useful information is to be found, that doesn’t do them much good since they can’t get at it. It’s basically a useless library, so let’s call this the Library of Robarts.
The second library is much, much bigger than the first. It contains every possible book that could ever be written, from a book that is entirely blank except for a single “A” on the first page, to a book that is nothing but “zzzzzz” on every page. It also contains books of any arbitrary length, since individual volumes can be concatenated to form much longer books. Unlike the first library, this one is open to the public. Anyone can go in and wander the stacks to their heart’s content, and is free to spend days, months, or even years in the reading rooms.
But this library, too, is totally useless -- not despite its size, but because of it. Imagine you are looking for a copy of Moby Dick. You find one that you think is the right one, but it is very hard to know for sure. That is because in addition to the true copy of Moby Dick, the library also contains every possible version of Moby Dick that varies from the true one by a single letter or punctuation mark. And every version that varies by only two letters or punctuation marks. Here’s the key point: the only way you could ever know that you had the correct version of Moby Dick is if you already had a true version of Moby Dick! The only way to find what you want in this library (called the Library of Babel) is to already know what you are looking for. As Nick puts it, "What makes a library useful, indeed what makes a library a library, is not just what it contains, but what it does not contain. The optimal size of a library, even if we ignore the cost of books, librarians, and bricks and mortar, is finite."
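The scale of the problem is easy to check with a little arithmetic: a text of length n over an alphabet of size a has n × (a − 1) one-letter variants, so even the single-letter neighbourhood of a novel-length book runs into the tens of millions of near-identical volumes. Here is a toy sketch in Python (the thirteen-letter string and the lowercase-only alphabet are my own simplifications, purely for illustration):

```python
import string

def one_letter_variants(text, alphabet=string.ascii_lowercase):
    """Yield every string that differs from `text` in exactly one position."""
    for i, original in enumerate(text):
        for replacement in alphabet:
            if replacement != original:
                yield text[:i] + replacement + text[i + 1:]

# A 13-letter opening line already has 13 * 25 = 325 one-letter variants;
# a text of Moby Dick's length would have tens of millions.
variants = set(one_letter_variants("callmeishmael"))
print(len(variants))  # 325
```

None of those 325 strings announces itself as wrong; the only way to pick out the original is to compare against a copy you already trust -- which is exactly the Library of Babel predicament.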
From the public’s point of view, the ideal library would be a mixture of the two regimes. We want the limited size (only the important books!) of the first library, but the open access of the second.
So what does this mean for journalism? For most of its existence, journalism has taken place in a Library of Robarts world. Officials have secret information that the public wants. The job of the journalist has been to learn about, and hopefully obtain, information that is being kept secret. The journalist in this case is, literally, the medium through which important secret information becomes useful public information.
In the aftermath of the Wikileaks affair, some have argued that this is a sign that we are moving from a world where useful information is secret, and therefore scarce, to an era where useful information is public, and therefore plentiful. That in fact is Julian Assange’s stated goal: a world of absolute transparency, where there are no official secrets.
At first blush, this seems like the ideal mixed-library regime: All and only important official secrets will be made public. The truth will be out there, governments will be more accountable. And journalists will become obsolete, as we will have evolved, say some commentators, into a “post-journalism” world.
Is this plausible? I’m not sure. After all, the two libraries I talked about above are just examples of two ways of hiding a very important piece of information. You can secret it away in a place where no one can get it – put the papers in a safe, or secure it behind very hard encryption, for example. Or you can hide it in plain sight as it were, by embedding the one useful bit of information in a sea of irrelevant information.
Governments typically adopt the first tactic when trying to keep secrets. They put a classified stamp on it, limit its promulgation, lock it up, encrypt it, and so on. But sometimes, when faced with a pesky access to information request, they go the other way. They release the requested document along with a huge pile of other related documents, hoping to bury the needle of useful information in a big useless haystack. “You want information?” they say. “We’ll give you information!” That is, they switch from the Library of Robarts tactic to the Library of Babel tactic.
A big deal has been made about the sheer size of the Wikileaks document dump, with over 90,000 files made public and another 15,000 or so in the queue. Less frequently, it has been observed that the volume of information is not a feature of the leak, but a bug. In his post on the leak, Jay Rosen wondered if the sheer scale of the revelations would have a counterproductive effect. Here is what he wrote:
We tend to think: big revelations mean big reactions. But if the story is too big and crashes too many illusions, the exact opposite occurs. My fear is that this will happen with the Afghanistan logs. Reaction will be unbearably lighter than we have a right to expect— not because the story isn’t sensational or troubling enough, but because it’s too troubling, a mess we cannot fix and therefore prefer to forget.
I think he gets the effect right (reaction has been pretty muted) but not the rationale. I don’t think people kinda shrugged at the Afghanistan logs because the scale of the problems they reveal seems intractable. Rather, I think it is because the scale of the information that was revealed is journalistically intractable. Wikileaks didn’t give us the happy medium library, with its combination of useful and public information, it gave us the Library of Babel, where every good story was hidden in a sea of otherwise useless data.
The lesson for Wikileaks is that information is better when it comes not in a torrent but in useful drips. That is something the Telegraph understood last year, when it tormented the British political class with daily Chinese-water-torture revelations about MPs' spending habits.
The lesson for journalism, I think, is that it doesn’t really matter which library system we’re operating in. Whether it’s all hidden in Robarts or in plain view in Babel, the information still needs to be mediated. Except instead of making useful secrets public, the task of the journalist will be to show the public what is needle and what is haystack. If anything, journalists in the Library of Babel world will have to be more knowledgeable, more specialized in the fields they cover, because in order to find the good stories, they'll already have to know what they are looking for.
Over at Time magazine, Michael Grunwald argues that the environmental effects of the BP spill in the gulf have been greatly overstated, while his colleague Bryan Walsh replies, in effect, that it's too soon to tell. The last part of my interview for Canadian Business magazine with Matt Ridley seems relevant:
CB: The two big public crises right now are the sub-prime crash and its downstream economic consequences, and the BP spill in the Gulf of Mexico. Economist Richard Thaler has argued that these are symptoms of a very modern problem: neither the public nor the private sector seems to know what to do about them.
MR: I would put a slightly different slant on it, and instead of saying that neither public nor private sectors seem able to cope, I would say that public opinion doesn't seem able to cope. I'm terribly struck by the fact that there was an equally big, and certainly longer-lasting spill, in the Gulf in 1979 called the Ixtoc, and I'd never heard of the thing!
The bottom line from it, like from nearly all oil spills, is how quickly the system recovers once it is stopped. I hate saying that, because it makes me sound complacent now, and of course it hasn't stopped yet. And the thing about oil spills is that they are dreadful for the people in the localities. But I have actually covered a number of oil spills in one way or another over the years, from Exxon Valdez to a couple of British ones in Wales and Shetland, and the story is always the same: the media struggles to find the oiled bird, and when the media is gone you get a complaint from the local people saying that everyone thinks their beaches are ruined when in fact they are fine, six months on. So I have a feeling that it is the public perception that is getting worse. Fifty years ago you could get away with much worse. People didn't make as much of a fuss.
And that's no consolation; that's a good thing! That's a sign of how much better we're getting at being intolerant of bad things.