05 June 2006

Canada Gored

Al Gore brought his dog and pony show to the University of British Columbia on Thursday evening, speaking to a packed crowd of the faithful at the Chan Centre. His Keynote presentation (he is on the Board of Apple and obviously eschews Microsoft’s PowerPoint) was filled with images, graphs, and data on the looming crisis that is global warming. At times passionate, the famed policy wonk presented the issue with the scholarship befitting a university professor. On this topic the former Vice President has serious street cred: he was among the first politicians in the world to recognize its importance, co-organizing the first hearings on the topic by the US Congress in 1979.

Nearing the end of his speech, Gore showed a list of the 163 countries that have ratified the Kyoto Protocol, bemoaning the fact that the United States and Australia have not signed. And then he stopped. He slowly walked forward on the stage and, in a sonorous voice, pointed out that the Kyoto Accord is in trouble in Canada. The former Vice President of the United States of America unequivocally stated that Canada is the most respected country in the world, reminding us that Canada has participated in every peacekeeping mission that the United Nations has mounted since its founding. He directly challenged the audience, saying, “You are not going to let Canada exit Kyoto, are you?”

It was a highly emotional moment, one that drove home the fact that at this delicate time in human history the world is watching Canada very carefully. How we handle the debate over Kyoto will have a major impact on the future of the planet. It would be a disaster of unimaginable proportions were Canada’s good name to be besmirched by leaving the Kyoto Protocol. People around the world would rightly shake their heads, and, with a disheartening change in emphasis, wistfully intone the words “Oh, Canada”.

27 May 2006

Urban farming

I have been reading Michael Pollan's book The Omnivore's Dilemma, a lucid account of the path that food takes to arrive in our stomachs. The book is filled with facts and observations put together in ways that are at once fascinating and repulsive - reading this book might cause you to hesitate before putting a steak on the grill this weekend. Of course, that is the point: to get you to think about your food and where it comes from.

One observation which struck me as remarkable was the degree to which our food supply has become urbanized. No, we are not growing our food in the sprawling metropolises in which the majority of North Americans now live. Rather, we have turned the rural landscape into cities.

We are probably all aware that farming is no longer the pastoral occupation that it was before industrialization ushered in the 'green revolution'. What Michael Pollan alerts us to is the fact that most farms these days are really just monocultures of densely packed corn. In the 1920s, an acre of corn yielded 20 bushels. With the introduction of hybrid corn in the 1930s, the yield went up to 75 bushels per acre. Add modern fertilizer to the mix and you get a whopping 180 bushels of corn out of an acre of farmland. What you can see in this green revolution is the transformation from pastoral to urban: an acre of land whose population density has risen ninefold in under a century, spurred on by advances in technology.
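
The yield arithmetic can be sketched in a few lines (the bushel figures are the ones quoted above; the fold calculation is mine):

```python
# Corn yields quoted above, in bushels per acre.
yields = [
    ("1920s, open-pollinated", 20),
    ("1930s, hybrid corn", 75),
    ("modern, hybrid + fertilizer", 180),
]

base = yields[0][1]  # the 1920s baseline
for era, bushels in yields:
    print(f"{era}: {bushels} bu/acre ({bushels / base:.1f}x the 1920s yield)")
```

The modern acre packs in nine times the corn that its 1920s counterpart did, which is the "urbanization" of the field in a nutshell.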

The metropolitan analogy does not end there, but continues when one considers the feedlots in which cattle stand crowded shoulder to shoulder, their legs mired in a foot or more of manure. Just as with humans in cities, they are no longer threatened by natural predators but by the scourges of metropolitan life, the richness of their diets and the stress of crowded conditions.

It was probably not intentional, but it is now a fact of life: we humans have urbanized the farm.


25 April 2006

Social Context of Commercializing Science

Sir John Sulston delivered the 2nd Annual Michael Smith Lecture yesterday afternoon at UBC on the topic of Biology in the Public Domain. As one would expect when a Nobel Laureate comes to town, the lecture hall was packed, and Sir John certainly delivered a lively address. As former Director of the Sanger Institute in the UK, he oversaw roughly a third of the public project devoted to sequencing the human genome, and was a forceful advocate for keeping the sequence information in the public domain. It was no surprise, then, to hear Sir John decry the widespread growth of patents in the field of experimental biology, a topic that has been much discussed in recent years. Indeed, there is a modest movement to build bridges between the open-source movement and biotechnology.

One very interesting morsel that Sir John tossed out to the audience was his suggestion that the commercialization of academic science could be readily connected to the triumph of capitalism and the implosion of the Soviet Union. Although he did not mention the book by name, he was clearly referring to the premature declaration of victory for the capitalist enterprise by Francis Fukuyama in his 1992 book The End of History and the Last Man [the link is to the introduction; you can probably get an already-read copy of the book for a good price at your local used bookstore]. Personally, I think that the watershed was the Bayh-Dole Act, which allowed Universities not only to patent inventions but to retain and profit from the resultant intellectual property rights, even when the fundamental discovery was made using public funds.

Irrespective of the pressures that have led to the current state of affairs, it is clear that academics increasingly view their research not only as knowledge for the sake of knowledge, but also as a potential gateway to application in the real world, including full-scale commercialization. My personal view is that there is nothing wrong with scientists moving from the bench to the boardroom and back, so long as they keep their perspective on the propriety of what they are doing and do all that is possible to minimize the potential for conflict of interest. If one is seriously pursuing new cures for disease, the private sector is precisely the right place to practice the craft. On the other hand, crass commercialization that restricts access to knowledge is an inappropriate outcome for public funds. Even a coarsely-tuned moral compass can help lead the way.


22 April 2006

Truth in Advertising

Perhaps unsurprisingly, we can learn a thing or two from bees. Tom Seeley, a Professor at Cornell, reports (subscription required; for a precis of the original research, check out the media relations story from Cornell) that honeybees use a clever strategy for decision making, in this case, moving the swarm to a new hive. 'Scouts', a subset of the group, head off to find plausible sites for the swarm to settle. When they return, they use the famous waggle dance to let the others know where the site is and, most importantly, how good it is. The group assesses the scouts' reports, and then the swarm moves to the best site.

The system is not really all that surprising - but it does represent a remarkable display of decision by committee. The authors highlight the fact that the collective decision "is a product of disagreement and contest rather than consensus or compromise". What I found notable about the piece was the honesty of the scouts. If the site that an individual scout finds is excellent, the resultant waggle dance is exuberant. On the other hand, if the site is only acceptable, the waggle dance is more muted. The competition for having found the best site does not result in deception: given that the objective is the overall welfare of the swarm, the scouts are scrupulous about being honest in their assessments.
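
The scouts' honest-signalling scheme can be caricatured in a few lines of code. This is a toy sketch of the idea, not Seeley's actual model, and the site names and quality scores are invented:

```python
def choose_site(site_qualities, scouts_per_site=10):
    """Toy model of swarm site selection by honest advertising.

    Each discovered site attracts some scouts; each scout dances a
    number of rounds proportional to the site's true quality, so total
    recruitment effort tracks quality and the best site wins the contest.
    """
    support = {site: scouts_per_site * round(quality * 10)
               for site, quality in site_qualities.items()}
    return max(support, key=support.get)

# Invented example sites, each with a quality score between 0 and 1.
sites = {"hollow oak": 0.9, "rock crevice": 0.5, "old shed": 0.3}
print(choose_site(sites))  # prints "hollow oak", the best site
```

The crucial feature, as in the real swarm, is that no scout exaggerates: the dance effort is an honest function of site quality, so the "disagreement and contest" reliably converges on the best option.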

As highly evolved animals, we humans might be expected to treat important decisions with equal candor. Unfortunately the evidence goes against us. Earlier this month, Carl Elliott published a damning piece in the Atlantic entitled The Drug Pushers. Even more alarming, this past week witnessed the Inaugural Conference on Disease Mongering in Newcastle, Australia. Draw your own conclusions, but this much is clear: we would be better off if humans treated the marketing of medicines with the same honesty as honeybees.


02 April 2006

Multicultural baboons

Robert Sapolsky, a neurobiologist at Stanford with a long-standing interest in primatology, has written a thoughtful article in a recent issue of Foreign Affairs (of all places). In the piece, he describes a remarkable social phenomenon observed in a troop of Savannah baboons in Kenya. Normally, about half of the male members of a troop are very aggressive and the other half more social - a version of the storied alpha male phenomenon. When a tourist lodge expanded its territory into one occupied by this particular troop, the baboons became rather adept at pilfering food from the garbage dump. Soon thereafter an epidemic of tuberculosis swept through the troop, the infection apparently caused by some tainted garbage. Because the most aggressive males had preferential access to the food, they were also preferentially affected, and this caused a sea change in the behavior of the troop as a whole: the aggressive males died quickly, and the few remaining males were markedly less aggressive and more social. Despite the fact that more than 20 years have elapsed and all of the Savannah baboons that were alive during the tuberculosis event have died, the "Garbage Dump" troop remains highly social today, in stark contrast to other troops of Savannah baboons in the area.

This phenomenon is even more remarkable when one considers the mating behavior of Savannah baboons. As with many species, Savannah baboons exchange members between troops, an adaptation that presumably reduces inbreeding. But the social rules are quite precise: juvenile males leave their troop and join neighboring troops, working their way through the hierarchy of the new community. One would expect that about half of the males that joined the Garbage Dump troop would be aggressive and the other half social. Yet twenty years later, the entire troop remains highly social, suggesting that the newcomers adopted the social mores of the local group.

Apparently, a similar experiment has been going on in the Netherlands, except in reverse, and with tragic consequences. Writing in the New Yorker, Jane Kramer describes the approach that Holland has taken to multiculturalism, something called the "pillar model". Devised as a solution to the fighting between Catholics and Protestants in the seventeenth century, the pillar model allowed each group to manage its own affairs, with separate neighborhoods, hospitals, schools, and even state-supported media. The model worked well as tolerant Holland managed its affairs from the Enlightenment into the modern age, and by the twentieth century the country was essentially divided into Catholic, Protestant and humanist pillars.

The wrinkle appears to have come with the wave of labor recruitment from Turkey and Morocco that began in the 1960s. Rather than integrate these newcomers into Dutch society, the pillar model was applied, not only allowing the growing Muslim population to maintain its religious and cultural heritage but essentially walling it off from Dutch society at large. The experiment is now widely viewed as a dismal failure, defined perhaps most vividly by the murder of the provocateur Theo van Gogh in 2004 by a young Dutch-Moroccan fanatic who, incensed by the filmmaker's scurrilous depiction of Islam in his film Submission, shot van Gogh eight times, repeatedly slit his throat and then pinned a long diatribe to his chest with a knife.

Surely the abominable socioeconomic conditions of Dutch Muslims contribute to the problems that Holland is experiencing, but the concept of the pillar society plays a role as well. It is here that the story of the Garbage Dump troop of Savannah baboons is informative: cultural integration is an entirely normal occurrence, and probably needs to be achieved on some level if multicultural societies are to live in harmony. Certainly, tolerant societies must be respectful of cultural and religious diversity, and the heritage that immigrants bring to multicultural societies imbues them with vibrancy. At the same time, it is imperative that when social groups with different cultural backgrounds live together, efforts are made to bridge the inevitable gaps between them. Societies that ignore the lessons of the Dutch experiment do so at their peril.


01 April 2006

Sloth is now a disease

Scientists in Australia have come up with a new disease: extreme sloth. Australian neuroscientist Leth Argos has identified a condition dubbed Motivational Deficiency Disorder. In a new article in the British Medical Journal (subscription required), Ray Moynihan reveals that people who suffer from MoDeD are characterized by overwhelming and debilitating apathy. Moreover, the little-known biotechnology company Healthtec is testing a new drug, Indolebant, as a potential treatment for MoDeD. In a wonderful piece of scientific sleuthing, the researchers discovered that drugs that block the effects of THC (the active ingredient in marijuana) at the CB1 receptor antagonize the slothful behaviour of MoDeD sufferers. Critics have been quick to point out that the medicalization of laziness is just the latest, and perhaps most audacious, example of disease mongering by the pharmaceutical industry.

The British Medical Journal has a nifty Rapid Response feature. If you can, it is worth checking out the responses to this piece of news.

15 March 2006

Burning Wood

I have been thinking a lot about wood lately. First there was a post from Verlyn Klinkenborg in which he talked about some of the ambivalence that he feels about burning wood. Then we were visiting our friends Ian and Dianna, and she mentioned that someone had been horrified that anyone in this neck of the woods was burning wood - that electric baseboard heating was the only environmentally sound way to heat one's home, since all of the electricity in British Columbia derives from hydroelectric power. Hmmm. If this is the case, should I be thinking seriously about powering down my wood stove?

No, I think not. And here is why.

First there is the question of particulate matter. Admittedly, my wood stove emits various nasty molecules into the air, and certainly more than a hydroelectric plant does. However, I minimize this by burning with the air intake fully open. My wood burns a bit faster, but the burn is much, much cleaner - I can tell because there is no residue building up on the window of my stove, and my flue remains clean as a whistle. This burning strategy ends up causing the house to be a bit warm for an hour or so, and then, as we let the fire burn down, it gets a bit cool. We deal with this by putting on a sweater. It works. Still, on the particulate matter measure, a point for the hydroelectric strategy.

Then there is the issue of greenhouse gases. Again, burning wood releases greenhouse gases. But allowing a dead tree to rot in the woods releases roughly the same amount of carbon as burning it. Since the wood that I burn is all from deadfall, my woodstove doesn't change the carbon balance a bit. Moreover, a fact sheet from Charles Darwin University in Australia states the case quite clearly: "While the carbon contained in fossil fuels has been stored in the earth for hundreds of millions of years and is now being rapidly released over mere decades, this is not the case with plants. When plants are burned as fuel, their carbon is recycled back into the atmosphere at roughly the same rate at which it was removed, and thus makes no net contribution to the pool of carbon dioxide in the air." Woodstove and hydroelectric dam are even on this one.

Wood is a local product. I buy my wood every year or two from a guy that I know personally. He lives pretty modestly and the couple of hundred dollars that I spend on wood each year goes a long way to putting food on the table of his family. The electricity that comes through the wires is produced by a large and faceless utility. I have nothing against this company, as it does provide a meaningful service. It just represents one more highly impersonal transaction. Chalk up a point for the woodstove.

Finally, there is the meaning of the woodstove in my life. Historically, humans have used fire both for heating and cooking, and consequently the hearth has a prominent role as the center of the home. The fireplace even imposes a series of regular rhythms - the seasonal preparation of the wood; starting the fire first thing in the morning; tending the fire as the day progresses; even not lighting the fire as the weather warms in the spring. All of these rhythms put me in touch with my surroundings in a very intimate fashion. And then there is the unqualified joy that comes from sitting near a fire, watching it burn, and knowing that somehow, in the midst of the chaos out there, all is right with the world. Because mastering fire represents one of the essential adaptations that distinguish humans from all other species on this planet, it seems to evoke some ancient memory which is comforting like no other. Needless to say, the electric baseboard heater falls far short on this measure. A large and important point awarded to the woodstove.

In Carl Elliott's book Better than Well, he describes philosopher Albert Borgmann's use of the evolution of heating as the classic example of the distinction between a 'thing' and a 'device'. [The words thing and device are not as important as the distinction, so bear with me here.] Borgmann describes how the hearth was once the central focus of the household, but was gradually replaced by a variety of 'central heating devices' aimed at heating the house as efficiently as possible. The advance in technology completely satisfies the physical need, but increasingly leaves us disconnected from the natural world. I am no Luddite, but somehow it seems important that we maintain some connections to our shared past. I will draw my own personal line at my woodstove.

27 February 2006

MAD Petrodollars

The impending ribbon-cutting ceremony for the opening of the Iranian Oil Bourse is receiving scant attention - but it should be front and center for everyone. If the Bourse opens as anticipated in March 2006, it is likely to shake up the global order significantly.

Here is the background. In 1971, the US took the dollar off of the gold standard, essentially turning currency trading in the dollar into a confidence game. To reassure investors, the US made a (not so secret) deal with Saudi Arabia to ensure that Saudi oil would be denominated in dollars; the quid pro quo was a series of security assurances. Ever since then, essentially the entire world has been buying oil on the international market for dollars. For the world at large, this means finding a way to get their hands on US dollars so that they can purchase oil, and in the normal course of events this is accomplished by trading goods for dollars. The US, on the other hand, can buy oil with dollars that it prints itself.

As pointed out recently in Z Magazine, this essentially allows the US to purchase oil with fiat money - bills that have value only because the issuing country says they do - as opposed to commodity money, which the rest of the world must use to purchase oil. The interesting and compelling observation is that this situation has allowed the US to run up trade deficits ($725 billion in 2005!!!) that would otherwise be unsustainable, especially in a world where money can move so easily.

Of course, this is the nub of the matter. So long as oil is denominated in dollars, the US economy retains its ability to dominate. But what happens if this situation changes and oil becomes denominated in another currency, Euros for example? Several people (here and here and here) have speculated that this represents the underlying reason for the invasion of Iraq, and with the looming threat of the Iranian Oil Bourse, tongues are wagging at the relationship between the threat to petrodollars and the level of rhetoric about Iran's nuclear program.

In a world awash in misinformation, it is always difficult to know what is true (and there are contrary views). However, one observation that supports this line of argument is that the US dollar has not cratered in the face of astoundingly poor financial performance. In a sense, the scenario that has developed is analogous to the one that arose in the Cold War. All of the players need to keep the US dollar propped up lest their holdings of US dollars lose their value. If countries (Japan and China come to mind) stop buying oil in dollars, the entire house of cards comes crashing down and everyone loses. When the Russians were on the other end of the 'hot line', there was reason to expect that Robert McNamara's notion of Mutually Assured Destruction would stay their hand. It is worth pondering whether the Iranians will be equally prudent.

Beware the ides of March.


24 February 2006


Two recent news items demonstrate just how jumpy we have become. We need to get a collective grip on reality.

First came the news that scientists at the University of Texas Southwestern Medical Center have successfully developed a vaccine against the potent toxin ricin. The science behind the experiment is elegant, and was appropriately published in the prestigious journal Proceedings of the National Academy of Sciences (subscription required). But wait a minute - there are really only two known instances in which ricin has been used in a way that might warrant concern. The first was the killing of Bulgarian dissident Georgi Markov in London in 1978, after he was jabbed with a KGB agent's umbrella rigged to inject a ricin-filled pellet. The second was when two ricin-laced letters were intercepted in 2003 by postal authorities in the USA. One of the letters was addressed to the White House. Given this rather sparse history, the question arises: who is going to take a ricin vaccine?

The second piece of exaggerated news arises from a report released by the National Academy itself. Writing in MIT's Technology Review, Emily Singer describes the prospect of terrorists 'hijacking your brain' with a new generation of chemicals. Perhaps disruption is possible, but hijack is just a tad hysterical.

While it is understandable to be vigilant, this level of paranoia is just plain silly. Not only does it demonstrate that otherwise level-headed folks are getting a bit carried away, but the noise that it generates has the potential to blind us to real threats, which should be judiciously minimized.



17 February 2006

Take a Media Holiday

A well-dressed man in a three-piece suit walks out his front door with briefcase in hand, kisses his wife, and walks purposefully down the pathway apparently leading to the street. One more step and he disappears off a cliff, the camera following him as he plummets down the canyon, a parachute opening just before he lands next to an SUV. The marketing guys sure know how to get your attention. But what really interested me was the disclaimer at the bottom of the screen as the man opens the car door: Professional stunt man. Do not try at home.

Are people really as stupid as the mythical lemmings that follow each other over the cliff? More to the point, are we unable to distinguish between the drivel shown on TV and reality? Sadly, the answer is yes, and, as I have written about previously, the fact that our sense of well-being derives from how we stack up against our peers causes a fair bit of unhappiness. This is particularly the case when our peers are idealized media darlings.

Given how appealing our media-infested world is to our attention-seeking brains, it is hard to imagine this situation improving any time in the near future, at least on a large scale. A personal solution that a wise person offered up some time ago is to take a media holiday from time to time. No TV. No Internet. No Newspapers. I have tried it and can report that not only was I able to survive such deprivations, but remarkably the world continued to revolve on its axis with the same wobbliness as it had before I took my little media holiday. As for me, it seemed to provide excellent fodder for Clear Thinking.

08 February 2006

The quest for truth

The growing reliance on functional magnetic resonance imaging (fMRI) to examine the brain in action is leading, inexorably, to the search for better lie detectors. At least two companies, No Lie MRI and Cephos, are developing variants of this technology with several applications in mind.

At issue in all of this, of course, is the accuracy of the tests. Before anyone runs off and gets too excited about this new technology, it would be worth reading at least the abstracts of the academic papers (here and here) upon which the commercial strategies rest. It turns out that using fMRI provides accuracy in the range of 85-90%. Not bad, but hardly foolproof.
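
A back-of-the-envelope calculation shows why "85-90% accurate" falls far short of foolproof once base rates enter the picture. The screening numbers below are invented for illustration:

```python
# Illustrative only: screen 1,000 people, of whom 10 are actually lying,
# with a detector that is 90% sensitive and 90% specific.
liars, truthful = 10, 990
sensitivity = specificity = 0.90

true_positives = liars * sensitivity             # 9 liars correctly flagged
false_positives = truthful * (1 - specificity)   # 99 honest people flagged too
ppv = true_positives / (true_positives + false_positives)

print(f"{true_positives + false_positives:.0f} people flagged; "
      f"only {ppv:.0%} of them are actually lying")
```

In other words, when liars are rare, even a 90%-accurate test flags roughly ten innocent people for every liar it catches - the same base-rate problem that has long dogged the polygraph.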

Of course, the reason that this is important is the famous case of Aldrich Ames, the CIA officer who was convicted of spying for the Russians in 1994. Ames successfully deceived investigators using a polygraph, and has continued to watch the field of lie detection from Allenwood federal penitentiary. He sent a fascinating letter to Steven Aftergood at the Federation of American Scientists that is worth a read (I prefer the handwritten version, but you may wish to see the transcribed version). It seems that in the intelligence business, polygraphs are often used to coerce confessions from people, a conclusion that is hardly surprising given the current climate of paranoia.

The 'yuk factor' derives from the fact that this technology might breach a sanctuary that we all cherish, the privacy of thought. It may be only a minor comfort, but the truth of the matter is that scientists are still a long way off from reading your mind. So rest well.

30 January 2006

What could you do with Two Hundred and Fifty Billion Dollars?

Over on the right hand side of my blog, you can see a handy little counter that estimates the cost of the war in Iraq, based on Congressional appropriations. It is set to hit $250 billion in March 2006. It is really quite staggering to think about why that money was spent, and what else it could have been used for.

First, let's address why the money is being spent. The honest answer, of course, is that the war is about oil. Not only does Iraq produce tons of the stuff (or at least it did before the US bombed it flat), but Saddam Hussein was in a position to threaten oil shipments throughout the Persian Gulf. The calculation went something like this: a significant disruption of oil supplies would mortally wound the US economy, so removing Saddam from power was deemed to be worth the political and economic cost. Other explanations (weapons of mass destruction, building democracy, etc.) are convenient, but frankly, they are a load of horse manure.

Let us assume for a moment that Saddam's threat was real. The unsophisticated response is what we have seen: utilize the power of the military to secure the Persian Gulf. The more nuanced response would have been to seize the moment and launch a massive project to develop alternative energy sources. Imagine what $250B would buy in terms of research and development. Not only would the money have provided the only realistic opportunity to identify alternative energy sources that might come on line as cheap oil became a distant memory, but it also would have provided a much needed boost to the US economy.

Yes, that's right, the same US economy that the Bush gang invaded Iraq to defend. Two hundred and fifty billion dollars later, and what do we have to show for it?

26 January 2006

The vulnerability of energy supplies

After the recent attacks in Nigeria and Russia on oil and gas pipelines, the possibility of social disruption leading to a new energy crisis moves from dystopic fantasy to looming reality. The Global Guerrillas website adds an interesting twist: such attacks might come not just from groups with political objectives wishing to cripple governments or supranational corporations, but from individuals staging attacks to profit from the change in world oil prices. It has always been the case that individuals have reaped windfall profits following disasters. But now it seems that small-scale adventurers, operating independently or in collusion with others, can both cause anarchy in energy-producing countries (with secondary effects upon liberal democracies) and make substantial profits. Given how obvious this opportunity is, it would be surprising if it didn't happen in the very near future. Get ready for $100/barrel oil.

19 January 2006

On Happiness

I have been reading Happiness - Lessons from a New Science by Richard Layard, Director of the Centre for Economic Performance at the London School of Economics and a member of the House of Lords. He is among the rising tide of voices advocating that governments should include measures of happiness in devising policy, essentially adopting some form of Bhutan's pursuit of Gross National Happiness. This is, of course, a wonderful idea, and it seems like a fantastic platform for politicians seeking to convince voters to support them. Indeed, some governments are giving consideration to the issue, but the truth is that this is going to be a hard sell.

Layard recounts the evidence that despite the fantastic increase in our material wealth over the last 50 years, we are no happier than we used to be. He offers many reasons for our collective malaise, and it is worth reading the book to get the full picture - his writing is engaging and I found the book an enjoyable read. One of the insights that Layard emphasizes is the importance that we all place on our status relative to those around us. Essentially, we measure ourselves against others, and we do so with alarming regularity. This habit had little downside when our brains evolved as members of communities of 150 or so individuals on the African savannah. Even as industrialization rose to prominence, it was rare for individuals to encounter compatriots who lived lives that were radically different from theirs. But our brains are poorly equipped to deal with today's reality, where the availability of cheap oil allows us to travel widely, and in so doing observe and inevitably crave the lifestyles of others. Even more pernicious is television, which brings the rich and famous directly into our living rooms. Is it any wonder that the real lives of real people pale by comparison?

Anticipating the new science of happiness, Ferenc Mate devoted a chapter in his book A Reasonable Life to the ills of television, concluding that the best thing that you can do with your TV is to pick it up and chuck it out the window. It seems that his advice is sound indeed.

18 January 2006

Law of Unintended Consequences

A letter in today's issue of Nature draws attention to the law of unintended consequences. Scientists are not often thought of as frivolous types, but they can be: many of the genes that have been discovered have been given rather silly names (Homer, dunce, and Sonic Hedgehog, to name but a few). This trend was begun innocently by some 'cool' scientists who were involved in sequencing the fruit fly genome, but grew rapidly to include human sequences as well. Unfortunately, patients seem less than amused to find that they have a mutation in a gene with a whimsical name (Are you telling me that my son the genius has a dunce mutation?).

Who knew?


We seem to be hurtling down the tracks towards an uncertain future. Despite our best intentions, the impact of humans on the world around us seems to be less benign with each passing day. Whether it be hubris, the law of unintended consequences, or just plain stupidity, we are clearly in a bit of a bind.

It really is a pity that we are in such trouble, for the human experiment has much to recommend it. In the days and weeks to come, we'll take a look at some of the fine things that humans have achieved, and irreverently poke fun at a few that have been less well thought out. As with evolution, I don't know where it will lead, but the journey is always interesting.