Recently in Authors Category

Thomas Nagel's recent review in The New York Review of Books of Peter Singer's book, The Life You Can Save: Acting Now to End World Poverty, does a fine job of demonstrating why moral philosophy in its academic form has always baffled me: the blasted hypotheticals.

You walk past a drowning kid.  You can save the child, but you will have to wade into a shallow pond and muddy your trousers and ruin your shoes.  Is it immoral to keep walking and leave the kid to die?
From this hypothetical and variations thereon, Singer distills a principle: "'If it is in your power to prevent something bad from happening without sacrificing anything nearly as important, it is wrong not to do so.'" (p. 24.)  Building on this principle, Singer develops the general rule that "those who are financially comfortable" should donate 5% of their annual earnings (or more, if they are rich) to aid organizations that alleviate poverty.  (p. 25.)

While I agree with Singer that individuals have a moral responsibility for others less well off than themselves, and further that we should all be developing means of discharging that responsibility, I think the hypotheticals have led Singer astray.  As economists have learned, abstract models that work in theory tend not to operate so cleanly in the real world.  All of those "externalities" that economists - and moral philosophers - have ignored for the sake of elegant conceptualizing have a way of refusing to be ignored once the conceptual gets concrete.

One major externality in Singer's hypothetical is the response of the drowning kid.  Singer treats the drowning kid as a prop that serves to highlight the moral decision-making of the affluent actor.  But the needy, no less than those whose needs are met, are moral agents with responsibilities that they may choose to discharge or disregard.  "Internalizing" this externality in Singer's hypothetical might look like the following:

You walk past a drowning kid.  You can save the child, but the child will be ungrateful and, moreover, will steal your wallet while you are saving him or her.  Is it immoral to keep walking and leave the kid to die?

You walk past a drowning kid.  You can save the child, but the child will accuse you of implementing a non-sustainable intervention and of thereby preventing him or her from being able to survive without your assistance, a charge that will lead to your public humiliation and condemnation.  Is it immoral to keep walking and leave the kid to die?

You walk past a drowning kid.  You can save the child, but the child's brother will be furious at what he perceives to be foreign interference with his family and will subsequently blow up a bakery that foreigners in town frequent, with the result that several local youths die and several more people (including foreigners) are injured.  Is it immoral to keep walking and leave the kid to die?
In posing these hypotheticals, my point is not to suggest that aid recipients are immoral, but to illustrate the oversimplified nature of Singer's unilateral model for assessing moral responsibility and crafting general rules based thereupon.  The financially well-off may have a moral responsibility to help those in need, even if the needy prove to be ungrateful, cheat their benefactors of small sums, accuse them of acting injudiciously, humiliate them, or use them as an excuse for outrageous crimes; but the affluent also have a moral responsibility to discharge their obligations in a way that will have the most positive possible outcome.

With his 1975 book, Animal Liberation, Singer launched the animal rights movement, an impressive achievement that - despite its already numerous accomplishments - will continue to reverberate for generations to come.  As between humans and non-human animals, of course, humans are the only moral agents, a situation in which Singer's conceptual model has much greater emotional and logical force.  As between rich humans and poor humans, however, Singer's one-sided general rule both fails to persuade (the rich) and demeans (the poor).  What is necessary, instead, is a general rule that takes into account the moral responsibilities of both the donor and the aid recipient.  Only such an approach will have any chance of resulting in "the most positive possible outcome."

Such a rule cannot help but be more radical than Singer's current proposal.  Any general rule designed to promote optimal discharge of moral responsibilities on both sides of the wealth-redistribution equation must involve the affluent in more direct engagement with poverty than mere check writing.

And while nobody today thinks that people who won't give 5% of their salary to charity are going to face poverty more directly - by, for example, sharing the burdens of power outages and sub-par sanitation that result from volunteering in a developing-country slum - nobody thought that the indiscriminate and cruel slaughter of animals was noteworthy in 1975 either.

(Image of Peter Singer from The Guardian)   
In the last month, I've spoken with Tanzanians about their frustration with the rate of economic growth in Tanzania relative to Kenya (typically traced back to Julius Nyerere's collectivization experiments), and I've also talked with Indians about India's pace of development relative to China (typically attributed to China's embrace of foreign direct investment, which India has strictly limited).  In both instances, I have felt that focusing on the economic indicators was short-sighted.

More important than quarterly earnings reports and stock market performance, in my view, are indicators like the ability to change leaders without killing people and the degree of social tolerance for different groups.  These indicators necessarily signal a positive and deep-seated assimilation of modern governance and social norms; and these norms, in turn, lay the institutional foundation for modern economies.  Economic prosperity, on the other hand, can readily be achieved (but not sustained) without any such foundation and, as we have seen even in the U.S., can be the result, not of productivity, but of chicanery.

From this perspective, the good news is that "late bloomers" like Tanzania and India may actually be better off in the long run; the bad news, of course, is that they're poor(er) now, with all the ramifications (lack of international respect, shorter life spans, etc.) that poverty entails. 

Personally, I believe that the benefits are worth the costs.  Modernization is a slow process, and developing countries that allow it time are (in my opinion) better off than those that rush the process.  That said, I'm not from a developing country, and obviously perspectives can differ.  Therefore, I was interested to read, in Santosh Desai's recently published - and excellent - book, Mother Pious Lady, an affirmation of my perspective.

Mother Pious Lady is a collection of essays about India's middle class.  In a selection called "The Power of the Imperfect Solution," Desai argues that:

India understands time.  It understands the transience of all things, including solutions.  It understands that there are no final solutions to problems [hear that, "End of History" wallahs?]; at best there is a temporary equilibrium that must eventually get destabilized and give way to a new equilibrium. . . . The desire for lasting solutions is nothing but a desire to freeze time.
(p. 135.)  This understanding leads Desai to advocate as follows:

[While] Western analysis operates by reducing a problem to its components and freezing it in time . . . [t]hings are classified, labelled, put in boxes . . . . Perhaps a good place to start would be to stop labelling situations and conditions indiscriminately as problems.  Moving beyond the simplistic problem/solution mode into the process/time mode will allow for a much more realistic understanding of how things change and how little they do.  That we [Indians] understand this is a huge advantage; let us not take to the flashy shallowness of other modes of thinking in our quest to be seen as successful in the short run.
(p. 139.)

At the risk of being accused of "flashy shallowness" for praising a commentator with whom I agree, I think Desai's perspective is immensely valuable.  All countries - the so-called "developed," as much as the "developing" - are currently trying to strike the right balance between communal and individual, traditional and modern, indigenous and foreign.  Desai's explication of value in India's communal, traditional, indigenous views has done a great service to Indians . . . and to anyone interested in achieving balance in their own lives and countries.

(Image of Santosh Desai from Times of India)    

Way too modern

At a recent humanitarian training on hygiene promotion in emergencies, I had the opportunity to reflect on the extent to which modern thinking can impair learning.

The training involved one PowerPoint presentation after another, most of which entailed some stultifying combination of semantics, theory and complicated visual depictions of behavior models.  The training materials looked like they'd been held hostage in some business-management consulting firm that demanded, as ransom, adherence to its enthusiasm for inane diagrams supposedly representing conceptual analyses of real-world phenomena.

Earnestly attempting to stave off sleep by focusing on the slides, I recalled Walter Ong's explanation in his masterful book, Orality & Literacy: The Technologizing of the Word, that abstraction is a characteristic of thinking in literate (that is, modern) societies.  Pre-modern, oral societies think more situationally:

Illiterate subjects [in one experiment] consistently thought of the group [of drawings of a hammer, saw, log and hatchet] not in categorical terms (three tools, the log not a tool) but in terms of practical situations - "situational thinking" - without adverting at all to the classification "tool" as applying to all but the log. . . . A 25-year-old illiterate peasant: "They're all alike.  The saw will saw the log and the hatchet will chop it into small pieces.  If one of these has to go, I'd throw out the hatchet.  It doesn't do as good a job as a saw." . . . Asked why another person had rejected one item in another series of four that he felt all belonged together, he replied, "probably that kind of thinking runs in his blood."
(p. 51 (citations omitted).)

Of course, situational thinking is not worse, or less intelligent, than abstract, categorical thinking.  It's a different way of organizing information that, in certain contexts, is as appropriate as - or even superior to - abstract, categorical thinking.

One such context, I have discovered, is a training on hygiene promotion in emergencies.

Hygiene promotion involves persuading and cajoling people into washing their hands after using the toilet.  Safe water and food handling, safe disposal of excreta and solid waste, and safe management of "vectors" (rats, flies, mosquitoes, etc.) are also part of the job.

The job can be difficult and anxiety-provoking because the subject matter can be embarrassing, and people are often unwilling to discuss or change intimate habits, especially with or at the behest of strangers or foreigners.  In learning how to do the job, case studies, simulations and opportunities to work directly with relevant populations are helpful.  But as any parent who has toilet trained a child can affirm, diagrams of models of behavior change don't offer much assistance in getting a kid to use a toilet.

This retreat into business-consulting-speak may be a simple result of hiring too many engineers to do water- and sanitation-related work in emergencies.  Engineers are notoriously poor communicators.

But this silly and ineffective abstraction about hygiene promotion may also have another cause: anxiety about discussing embarrassing and, potentially, demeaning issues.  Making a behavior model about hand washing may seem, to some, more important work than actually communicating with others about hand washing; certainly, there's less risk of personal exposure and humiliation. 

Sadly, such a perspective simply leads to wasted efforts.  No matter how advanced the society in which we live, we are all practitioners of primitive functions, like defecating.  Modern thinking is powerless to change ancient facts.

(Image courtesy of the Global WASH Cluster website) 

Don't Marry Him

Reviews of Lori Gottlieb's new book Marry Him: The Case for Settling for Mr. Good Enough, along with Gottlieb's original Atlantic article (on which the book is based), miss an important opportunity for addressing a serious problem in American society.

In her Atlantic article, Gottlieb calls the problem one of the "most complicated, painful, and pervasive dilemmas many single women are forced to grapple with nowadays: Is it better to be alone, or to settle?"

In her review of Gottlieb's book in The New York Times Book Review, Amy Finnerty describes Gottlieb's restatement of the problem as follows:

Gottlieb makes a case that many women today end up alone because they hold men to insanely high standards. . . . She convinces us that we women are simply too fussy, entitled and downright delusional about our own worth in the mating marketplace. We overanalyze and seek undiluted sexual and intellectual fulfillment, thus setting men up for failure.
But both formulations of the problem miss the point.  Gottlieb is closest to the real issue when, in her Atlantic article, she observes:

I've been told that the reason so many women end up alone is that we have too many choices. I think it's the opposite: we have no choice. If we could choose, we'd choose to be in a healthy marriage based on reciprocal passion and friendship. But the only choices on the table, it sometimes seems, are settle or risk being alone forever. That's not a whole lot of choice.
Sadly, Gottlieb doesn't expand upon this insight.  Neither female pickiness nor a sense of being forced to choose between settling and a solitary life is the problem; these phenomena are side effects of the real problem: American men aren't well matched to America's post-feminist women.

The most serious failure of feminism was to ignore the fact that gender roles are relational.  Men's and women's roles fit together like puzzle pieces (or like yin and yang).  Radical alteration of one role requires a similar level of change in the other for the two to remain compatible.  Feminists devoted extensive thought, theory and action to the cause of revising a woman's role; to the extent that they gave any thought to men's roles, however, they seem to have assumed that men would adjust.

Men have not adjusted.  While women struggle under extraordinary social pressure to be educated and sociable, have careers and families, be sexy and mothers, be emotionally competent and financially wise, men grapple with the sense of being intimidated by women, of feeling inadequate and fearing they are a disappointment to the beloved women in their lives.  In my experience, they deal with this complex of issues by taking refuge in extended adolescence and staying stoned a lot.

In this context, settling - as Gottlieb advises - is insanity.  As anyone who has lived through a divorce (or who has witnessed parents get divorced) knows, a bad marriage causes vastly more damage than no marriage.  And if a society is grooming men who aren't suited to the women that the society is producing, the choice is not between settling and solitude, but between a bad marriage and a decent life.

I'm not alone in either my conclusion or my analysis: two hundred years ago, Jane Austen wrote a more persuasive argument than this blog post can offer in her novel, Pride and Prejudice.  As any reader of that novel can recognize, American women today live in a world where too many of the men are Wickhams, the con-artist scourge of Pride and Prejudice.  By the conclusion of that novel, Eliza Bennet has learned that her own haughtiness and preconceived notions prevented her from seeing both the dangers of the charming Mr. Wickham and the goodness of the more remote Mr. Darcy, her future husband.

Gottlieb would have American women unlearn the lesson of Eliza Bennet - would have American women blind themselves to the unsuitability of the available partners out of their prideful need to get married and their prejudice against carving out a satisfactory life for themselves beyond the bounds of marriage.  Gottlieb urges American women to settle for Wickham.

Jane Austen has already illustrated the perils of that choice.

(Picture of Lori Gottlieb from The New York Times Book Review)

The accidental jester

Storytellers don't have to be reliable to be entertaining.  Great narrative voices can be wildly off the mark - P.G. Wodehouse's marvelous Bertie Wooster is an example - and yet their own haplessness with facts and reality only deepens our delight in hearing what they have to say.

Lord Cranworth is an interesting example of an entertaining, unreliable narrative voice.  Unlike Bertie Wooster, who is fictional, Lord Cranworth was real.  And diverging further from Bertie Wooster, whose lack of reliability was the conscious intent of his creator, Cranworth didn't mean to be unreliable.

Cranworth has become unreliable in part because the passage of time has rendered so many of his opinions politically incorrect.  "I dislike making contact with a black race which emphatically dissents from the superiority I claim for my race and colour," he writes of Ethiopians.  (Lord Cranworth, Kenya Chronicles 178 (1939)).

But Cranworth has also become unreliable because his account of factual events diverges from other contemporaneous accounts.  Here, for example, is Cranworth's version of the events leading up to the deportation of Galbraith Cole:

Galbraith Cole was one of the earliest pioneers, a brother-in-law of Lord Delamere, and deservedly one of the most popular inhabitants both with black and white.  He had suffered repeatedly from thefts of cattle and sheep from his farm on Lake Elmenteita [sic], abutting the Masai Reserve.  One day he caught a party of Masai red-handed driving off his sheep, and, having a rifle, fired a shot to frighten the delinquents.  By an unfortunate mischance the shot struck one of the party, who subsequently died.  The Government were placed in a position of difficulty.  No local jury would, or indeed could, convict Cole of any major crime, and the tribe in question, with whom the punishment for cattle-stealing from time immemorial had been death, saw no justifiable grounds for complaint.  On the other hand, a considerable opinion at home said that in the interest of our own rule and good name an example must be made.  And again it is hard to dissent from that view.  The Governor decided that it was a case for deportation, unpopular though the course might be.
(Kenya Chronicles at 64).

His account omits several salient facts that Karen Blixen mentions about the event:

When Karen Blixen lectured at Lund University in 1938 she gave an example of Galbraith Cole's unswerving conviction, which a man of less fibre would have easily betrayed.  Like the Masai he had killed, he paid his price without question:

The Judge said to Galbraith, 'It's not, you know, that we don't understand that you shot only to stop the thieves.' 'No,' Galbraith said, 'I shot to kill.  I said that I would do so.'

'Think again, Mr. Cole,' said the judge.  'We are convinced that you only shot to stop them.'

'No, by God,' Galbraith said.  'I shot to kill.'  He was then sentenced to leave the country and, in a way, this really caused his death.
Errol Trzebinski, Silence Will Speak 76 (1977) (quoting Donald Hannah, Isak Dinesen and Karen Blixen: the mask and the reality 35-36 (1971)).

In highlighting this disparity, I am not so much interested in which version is accurate as in the relationship between an accurate grasp on facts and the formation of opinions that endure the test of time.  My guess is that Cranworth wasn't just unlucky that public opinion shifted away from his conviction of white superiority; rather, I hazard that a certain disposition on his part to tamper with facts supported the formation of opinions that could not survive the eventual triumph of reality.  Hence, the man could write of his early years in British East Africa:

Settlers were coming in with a steadily increasing flow.  New, beautiful and undeveloped territories were being discovered and occupied.  New crops were being tried out and new possibilities became probabilities almost monthly.  Land values improved with great rapidity and the native population became more prosperous and infinitely happier and safer.  No stigma rested at that time on the white settlers for the work that they were doing.
(Kenya Chronicles at 29 (emphasis added).)

Amusing to read now, but not very credible.

(Photo of Berkeley Cole and Lord Cranworth from Kenya Chronicles)

Starter of conversations, killer of poets

Publishers Weekly recently hosted a panel as part of its "Think Future: What's Next in Publishing" discussion series on the question of "Will Book Reviews Still Matter?"

I didn't attend the event, and I don't know what was said, but a fair guess is that the discussion, like the animating question, was of a piece with other expressions of the massive insecurity in the industry right now: will people read books in the future?  Will book stores continue to exist?

Not being one who views change as synonymous with annihilation, I am comfortable projecting the continued existence both of books and book stores.  My relaxed optimism extends with even more confidence to book reviews - though I might wish it to be otherwise.  Here's why:

Short of folks stuck in ski chalets during blizzards who are driven by boredom to peruse the only book on hand, people's choices in reading materials are rarely random.  They're usually guided by some previous knowledge about the book.  Their friend recommended it.  They've heard good things about the author.  The book got good reviews.

Although a friend's recommendation, or a prior positive experience with the author's work, will likely remain more influential than reviews in an individual's purchasing decision, reviews are nonetheless likely to remain important for sales.  Reviews start a public conversation about a book and set the agenda for that conversation, and such conversations prime an audience's appetite for the book.

Conversation, whether in meat-space, virtual space or mental space, is vital for any book-marketing effort because conversation is the social corollary to the private act of reading.  Most of us are social animals and most of us, therefore, want to talk about what we read.  In communities with a relatively high level of literary output but without an apparatus for sparking public conversation about books - for example, in Nairobi, where I've never seen a single book review, bookstores lack the space to accommodate book readings, and the Internet hasn't picked up the slack - books don't sell.

So conversation is necessary.  And, though any glance at the line-up of television pundits might lead one to another conclusion, conversation (even in America, even today) is a skill.  Good conversationalists have thought-provoking, witty and passionate things to say.  Poor conversationalists - which includes most of us at some moment or another - can nonetheless function tolerably if they have the decency to quote (with or without attribution) that which they've heard good conversationalists articulate.

Reviewers, if they excel at their jobs, are good conversationalists who provide book-meat to the public for roasting, mastication and regurgitation.  Reviewers thus serve a critical social function that will in some form transcend the rapid (and foolish, in my opinion) disappearance of book review sections in newspapers.

The question to my mind, therefore, is not, "Will Book Reviews Still Matter?" but "What are the media platforms from which book reviews will be disseminated?"  

If the answer is (as it likely will be), "the Internet," then we will probably see a similar pattern to that which has emerged elsewhere online: faced with overwhelming choice and no editorial filter, netizens will default to trusted familiar voices.  We will see, not a diminution in the importance of book reviews for book sales, but an increase in the importance of certain online reviewers' opinions about books.

And as anyone with even passing familiarity with Lord Byron's poem "Who Kill'd John Keats?" knows, concentration of the critics' power is never a positive development.   

(Image of Dr. Samuel Johnson, in Harold Bloom's words, "the most eminent of all literary critics," from The New York Times)  

