Quote of the Week

“The consolation of imaginary things is not an imaginary consolation.”

Roger Scruton


Authenticity and Truth

“The Adventures of Tom Sawyer was made by Mr. Mark Twain, and he told the truth, mainly. There was things which he stretched, but mainly he told the truth.”
(Mark Twain, The Adventures of Huckleberry Finn)

A few years ago I revealed on my web page that I was going to a yoga class so that I could write a newspaper column about the experience. This brought an immediate objection from one of my two loyal web readers. To paraphrase her question: if a writer artificially sets up a particular experience in order to write about it, isn’t the resulting work essentially false?

My answer was an infuriating “yes and no.” Consider two examples. Barbara Ehrenreich in Nickel and Dimed set out to experience the world of the lowest-paid workers. Bill Bryson went to Australia only in order to write In a Sunburned Country. Are these two excellent books somehow tainted by the calculated nature of the experience? Could we get a more “authentic truth” from a poor person on the one hand, and an Australian on the other?

This leads us into the darkest corners of philosophy, where I’ve been before and I don’t want to go again. But here are some thoughts on the subject.

Written experience is false by definition. As soon as we set out to put anything into words, the experience itself is changed, however “genuine” it was at the time. The philosophical movement called “deconstructionism,” represented by Jacques Derrida (who died in 2004), whose very name gives me a mild headache, proposed that what gets left out in the process of creation is just about everything that really matters. Any intelligent person can understand what that means. But, in practical terms, it means the death of literature. We have to swallow hard, and ignore it.

No writer can keep up a steady output without seeking out or drawing on artificial life experiences. Everyday life is just too dull – at least mine is. In search of material for my columns I’ve been to the yoga class, worked in a bookstore, flown a tiny, flimsy plane, hiked a remote trail in midwinter, and done dozens more improbable things, simply in order to write about them. If I wait for something interesting to happen to me I will be dead before I need to switch on the computer again. I started my working life as a journalist, and my instinct is to look for stories.

This is where journalists get in trouble, because news is different. In 2004 a series of scandals at The New York Times, The Boston Globe and other respected papers revealed that a few reporters and feature writers had been making their stories up. Shock and horror reverberated through the world of journalism. But most veteran writers took the shock and horror with a pinch of salt. We’ve all made stuff up. My very first job, as a sixteen-year-old apprentice journalist, was to invent short items to fill blank spaces on slow news days. Later I graduated to writing letters to the editor, and then answering them.

This created a mild sense of guilt, although everybody did it. But I think we all had a pretty clear idea where the borderline lay between a minor dishonesty, the journalistic equivalent of a white lie, and an important deception. In other words we knew how to distinguish between entertainment news and real news. Real news deals with political, economic and social issues that affect people’s lives. Entertainment news deals with sports, celebrities, fashions, and all the trivia on the margins of culture. If you can’t tell the difference you shouldn’t be a journalist. My journalistic career was short, probably for this reason.

Most writers are not reporters but creators. We don’t watch the AP wire, and we don’t feel bound by the rules of reporting taught in journalism schools. We are in the entertainment business, and we can’t possibly experience everything we write about. What kind of state would Ian Fleming have been in if he’d been through all the traumas suffered by James Bond? We won’t even ask about Barbara Cartland or some of today’s Chick Lit authors. The job of the writer is the recreation and transformation of the quotidian. That’s why TV soaps take place in hospitals, or police stations, or in families that suffer a major emotional trauma every twenty-four hours. Their lives are infinitely more interesting than ours, simply because they are so false.

This is all obvious. It was the paradox that Joseph Heller played with in his novel Something Happened, a book in which almost nothing happens. But the question of artificially creating experience leads us on to the much more treacherous terrain of truth and falsehood in writing, or anywhere else. “What is truth?” asked Pilate, and received no very convincing answer.

Most memoir, for example, is fiction. I like to say that my entire biography is fiction, which is half true. The other half is narrative non-fiction with some fictional elements. Most writers live in a state of chronic confusion between fact and fiction, especially when they write fiction based on the facts of their own lives. The boundaries dissolve into smoke as soon as we try to draw them. It’s human nature to improve our personal stories if we get the chance. That’s why fiction is always, always, always more interesting than life and, I’m tempted to say, why lies are usually more interesting than the truth.

“There are only two ways of telling the complete truth,” wrote Thomas Sowell, “anonymously and posthumously.”

Clinging to an imagined authenticity can be a terrible handicap. In workshops, when we are trying to make a piece of personal writing more interesting, more dramatic, or more humorous, there’s always one writer who complains: “But it didn’t happen like that” – to which my rather unkind response is: “Who cares?” Unless you are writing a Presidential memoir, or some other piece of future history, the precise details of what happened, how, and when are irrelevant, even if you have remembered them correctly!

How important is memory to a writer? Some people remember every last detail from their childhoods, every meal they have ever eaten and with whom, and even more intimate details. Others have very poor memories, or are blessed with a kind of built-in self-editor that tidies up and improves the past as they go along. I have a terrible memory, and I find that very useful. Nietzsche, going to extremes as always, said: “No man with a good memory can ever be creative.” The quote may not be exact, but it’s an interesting thought. If you have a bad memory, everything is always new, including the leftovers you put in the refrigerator yesterday and rediscover with pleasure today.

Research shows that a truly accurate long-term memory is uncommon. Very few people remember their childhoods accurately. Studies of siblings show that they recall identical childhood events very differently. I grew up with a female cousin of the same age. When we talk about those days now it’s as if we had two completely different childhoods with different families. If she decides to write her autobiography she will perpetrate many terrible and embarrassing errors.

So, problem number one is that memory is not necessarily a source of truth – which suggests in turn that personal truth is simply not available to us. We can never be sure of anything. Those with an academic turn of mind will immediately point out that “truth” and “authenticity” are divisible into those statements we can be sure of and those we cannot. Consider the following statement:

“I was born in Whitechapel, East London, on 14th February 1939. I was a perfect baby, and my mother was the most beautiful woman in the world.”

This statement obviously contains two kinds of information – facts and fantasies. The facts can be verified from my birth certificate. But statements involving memories, emotions, impressions, dreams and interpretations can never be verified. Mixing a few facts into a fantasy does not create a factual narrative. Most of the time we scarcely know the difference.

Problem number two is that we hate to admit problem number one. We have collectively been brainwashed into believing that any expression of personal experience must be accepted as authentic, just because it is personal. This is the driving force behind the tidal wave of irrationalism and craziness that washes over us every day – from the supermarket tabloids to religious extremists, flying saucer cultists, conspiracy theorists and writers of phony memoirs. We are willing, at least on the surface, to accept untested personal testimony as truth. It took almost two years for Dave Pelzer, the author of the fake memoir A Child Called It and other fictions, to be effectively challenged and exposed. His books continue to sell well.

Confusing the issue still further is the growing industry of “ghostwriting.” Celebrities, politicians, professionals, and people whose names are in the news for one reason or another, often decide to cash in on their fame or knowledge by producing a book. So they hire a writer from one of several agencies and, in due course, the book appears under the name of someone who did not write it. Most medical advice books, for example, are written not by doctors but by writers employed by doctors – which is a scary thought when you come to think about it. This doubt about the real authorship of books pushes the whole question of authenticity into even more murky waters.

In summary, a writer in this culture can get away with a great deal of faked authenticity, and the creator of fiction, memoir, non-fiction or faction is necessarily sailing without a compass across a limitless sea of ambiguity. Narrative writing is enormously popular today, even in newspapers. But reality never conforms to the stylish contours of fiction.

Narrative is always false. There’s no way around this. Facts are essential for a scientist or an academic, but fantasies are more fun for a writer. Which would you rather read – A Comprehensive History of Magical and Alchemical Practices in the Early Middle Ages or Harry Potter and the Philosopher’s Stone? I rest my case.

The Cult of Celebrity

“To be a celebrity…is to be forgiven everything.”

Mary McGrory

If a writer is someone whose name is attached to the front of a book, then we are in distinguished company. David Beckham, an English soccer player who became famous for wearing funny hair and marrying a pop star, sold 86,000 copies of his autobiography in the first two days after publication. This was more than all six books shortlisted for the Booker Prize in 2003. In fact one well-reviewed novel shortlisted for the prize sold just sixty-seven copies in the entire year. Mr. Beckham’s achievement is all the more impressive because when he appears on TV he seems quite inarticulate, with a rather limited vocabulary and an insecure grasp of English grammar.

Private Jessica Lynch hit the bestseller lists with her Iraq war story, and the sly royal butler Paul Burrell has had a huge success with his memoir A Royal Duty. Politicians get into the act. The former Tory party leader in Britain has published a thriller. In fact every politician in Britain seems to be writing a novel, and American politicians are not far behind. Madonna, Britney Spears, the Duchess of York, and Jerry Seinfeld have all emerged as “writers.”

It seems that everyone wants to be a writer – especially a novelist or a memoirist. But the book market is crowded, and the obscure writer finds it hard to break through the clutter. Celebrities have no such problem. Their names alone are enough.

What is so annoying about this is not that these celebrities write books (or have them ghostwritten). That’s their right. What drives up my blood pressure is that these artificially concocted works immediately shoot to the top of the bestseller lists, bypassing the books of hundreds of serious writers who have spent a lifetime learning the craft. I am not given the opportunity to play on a top sports team just because I have written a few books. Why should David Beckham be handed a million-dollar book contract just because he kicks a ball around?

Most of these celebrity books are probably never read by their buyers. They are more like keepsakes, or collector’s items. They give the fan a tangible connection with the object of his or her worship.

The New York literary agent Vicky Bijur, interviewed in the January 2004 newsletter of The American Society of Journalists and Authors, explained candidly what she looks for in an author.

“The big buzzword is ‘platform.’ Does the author have a column in the newspaper, a radio show, a track record, a killer Web site, is he or she quoted frequently in the press, has she racked up a number of appearances on early morning television, does she have videotapes of her TV appearances, etc? That’s a large part of selling nonfiction. If you look at the bestseller lists…it’s astonishing how many of the books are by people who have major platforms outside publishing…. It’s very important for authors to be telegenic or articulate or personable.”

In plain words, this agent is telling us that the publishing industry strongly favors authors who are already celebrities, or who at least have “star quality.”

Nobody can blame writers for using whatever qualifications and advantages they have. I’ve found in my own work that a little local celebrity goes an astonishingly long way. But, such as it is, that celebrity is all based on writing. Modern publishers and agents seem to believe that any celebrity is better than none. Mass murderers and corrupt corporate executives have an immediate advantage over any dull and law-abiding scribbler.

Celebrity also brings very practical advantages, such as resources. Al Franken made his name on television, and then began to publish satirical political books. The books are entertaining, and obviously, involve a lot of research. This is not a problem for the author who employs a team of fourteen full-time researchers to collect and check facts for him. This would stretch the budget of most freelances.

Famous and powerful people have always written books, sometimes very good ones. Winston Churchill and Julius Caesar spring to mind. But the super-competitive corporate publishing industry has changed the rules of the game. They want their investment to be a sure thing, which means starting with national (or international) name recognition.

So we can blame publishers on the one hand, which is always a pleasure. But we also have to blame ourselves and the bizarre celebrity worshipping culture we have created. Look at the front pages of the magazines at the supermarket checkout. They have an almost hypnotic effect. From week to month to year, the same names with the same faces confront me as I stand there clutching my cans of cat food. The stories are always the same – improbable dramas of divorce and betrayal, fortunes gained and lost, weight gained and lost, terrible diseases, and secret sorrows.

This is the shadow world where celebrities live. It’s so unlike our own world that the people who inhabit it seem unreal. Yet celebrities must be real, because they take up so much public space. They’re inescapable. At the checkout or on television, in magazines and movies, celebrities perform the soap opera of life, so the rest of us don’t have to.

Not all celebrities are equal, of course. The greatest of them are transcendent and eternal, like the undead Elvis and Princess Diana, and the still-alive Liz Taylor, who must be as old as my old mother, and Joan Collins and Charlie Manson and any and all royals and Kennedys.

Then there are lesser pantheons of fame, a shifting collage of micro-talented celebrities whose fame will last until public taste changes and they cease to be profitable. And there’s a whole mass of enormously famous people you’ve never heard of, who fill the blank spaces in the magazines. Who cares if Lori caught Scott cheating when we have no idea who they are? They may be familiar to people who spend all day and night in front of the TV, but not to me. A magazine I perused at the checkout today featured the “Ten Biggest Celebrity Weddings of 2004.” I hadn’t heard of a single one of the brides or bridegrooms, and they all looked identical and interchangeable.

Some celebrities, like Bart Simpson and Jay Leno, are just ideas in the mind of a producer and don’t really exist at all. Others are ordinary people caught in the spotlight of fame for a moment, like raccoons transfixed by headlights on the highway. How quickly Tonya Harding and John Wayne Bobbitt became celebrities, and how quickly we have forgotten them! Remember Joey Buttafuoco and Amy Fisher? Nationally famous for something or other just a moment ago, now faded into grayness like a badly fixed photograph. One day soon even Kylie Minogue’s bum will be forgotten, and a good thing too.

We obviously need our celebrities: but for what? In ancient times, individuals became famous because they were heroes of action – usually violent action – like Alexander the Great. There have been other times when, believe it or not, people were celebrated for intellectual accomplishments, including writing. In the eighteenth century, philosophers like Rousseau and Voltaire were internationally famous. Then in the nineteenth and early twentieth centuries, the great empire builders of capitalism were most admired, men like Carnegie, Rockefeller, and Ford.

Who are today’s celebrities? Entertainers. We don’t admire people who lead, or think, or invent, or build. But we give fame freely to those who pretend to amuse us. This is a peculiarly modern phenomenon. For most of history entertainers have been marginal people, almost outcasts. In the old Hindu culture of India, entertainers were an untouchable tribe. In medieval Europe, actors and musicians were seen as the dregs of society. This seems entirely appropriate.

So what do we see in entertainers to justify all the celebrity they attract these days? Surely we don’t want to be like them? No, that’s too awful to contemplate. We want them to be unlike us. Celebrities are gods for our time, and publicity is our modern form of worship. We are slipping back, unconsciously, towards paganism. The old pagan religions must have been very satisfying: they had a nymph or satyr for every occasion, an idol for every prayer, a god for every need. Each deity celebrated an aspect of human nature, and their mythical lives held the mirror up to human society.

We don’t have Venus now, but we do have the love goddess Madonna; we don’t have Proteus, but we do have the ever-changing Michael Jackson. We can’t open a magazine without reading modern tales of Dionysus, Achilles, Phaedra and, most often, Narcissus. Sometimes one longs for Zeus, but he is unaccountably absent from the chorus line.

Having elevated celebrities to the status of minor gods I suppose we must accept that we have given them godlike powers, like priority access to the publishing industry. But it does change the traditional advice given to beginning writers. Forget about “Write what you know.” The new formula for success is: “Write nothing until everyone knows you.”

First published in 2003 and still as true as ever!

Two Cultures

Those of us who never had a proper education in science are reminded of our ignorance every time we switch on a computer, or a microwave, or even a light bulb. It might as well be magic, but it’s not, and we have no more idea how these tricks are performed than we understand the flying broomsticks in Harry Potter. It is humiliating, and dangerous because science is behind just about everything we use and depend on. Satellites spin in the sky above our heads, we get miraculous drugs from the local pharmacy and make calls on phones that don’t seem to be connected to anything, and we don’t understand how any of it works.

The British scientist and novelist C.P. Snow wrote about this in 1959 in a famous book called The Two Cultures. The two cultures were science and art, and Snow was alarmed at how little they understood about each other. Nothing has changed since 1959. In fact, science has run so far ahead of scientific illiterates like me that we can only gape at it like the Pilgrim Fathers confronted by a video game.

I was painfully reminded of this by reading a memoir by the scientist and controversialist Richard Dawkins about his lifetime of research in biology. He writes well and most of his arguments can be followed by the scientifically illiterate, but the underlying knowledge of things like phenotypes and genotypes, memes and genes, arthromorphs and biomorphs is simply not in my brain, so that his descriptions of research sound as much like mysticism or alchemy as the products of reason and logic. So, I must trust Professor Dawkins, and I do more or less. But if I trust him, without being able to explain why, I might trust anybody with any theory about reality. That’s what’s so dangerous. The more a few scientifically educated people know, and the less the rest of us know, the more science begins to look like some sort of elite conspiracy. Ignorance breeds fear, and into the empty space comes every dumb idea and idiotic belief that has ever entered into the mind of man – and believe me that’s a long list.

Science doesn’t leave much room for the opinions and beliefs we cherish. We love to have pointless arguments about the best basketball team or religion or political party or TV show, because it is fun to argue and we can always consider ourselves right. My opinion is as good as yours. But when it comes, say, to finding the elusive infinitely small particles that may be the basic building blocks of the universe, my opinion is definitely not as good as yours, and totally worthless compared to the opinion of a physicist. My idea would be to get a large magnifying glass. Their idea was to build enormous machines, like the Large Hadron Collider outside Geneva. Guess whose method is more likely to succeed.

Of course, science can’t do everything: it can’t tell us the right word to use in a poem or explain Beethoven’s late quartets. There are different things to know, and different ways of knowing. But we need a better balance, more people like Alexander Borodin, the Russian composer who was also a research chemist, or indeed Richard Dawkins himself, a biologist with a wide-ranging knowledge of the arts.

We aren’t all smart enough to be scientists, but it is important that we understand how and why science works, and magic doesn’t. Even I know what a scientific proof should look like: observation, classification, experiment, repetition, comparison and so on. That’s how we know that our knowledge is knowledge. If you don’t believe that, you might believe anything.

The Knowledge Man

Denis Diderot is an almost-forgotten name in intellectual history, although everybody knows Mr. Google. But whenever we use an online search engine to answer one of our questions we are benefitting from the genius, bravery, and determination of Monsieur Diderot.

His idea, which was as simple as it was revolutionary, was to gather together all the knowledge in the world in a systematic way so that anybody could find information about anything. In other words, Diderot invented the encyclopedia.

It was a project on a heroic scale, completed in 1772, and filling twenty-seven large volumes containing seventy-five thousand entries. It was not at all popular among the rich and powerful of the time. The idea of spreading knowledge is never welcomed by people whose position depends on the ignorance of those less fortunate. Knowledge really is power. That’s why universal education evokes such mixed feelings. On the one hand, it is the force behind economic and social progress. On the other, once people start thinking for themselves, who knows what might happen?

Diderot was said to have written ten thousand of the Encyclopedia articles himself. Some of them were so radical that the entire book was banned for a while. But now it is treated as an intellectual monument, and the French state honored the author in his tricentennial year.

When I was growing up, encyclopedia salesmen came from door to door. “The encyclopedia man” became a kind of joke, spoofed by Monty Python among others. My parents had two different sets in multiple volumes, one for adults and one for children – that is to say, for me, my very own encyclopedia. An encyclopedia in the house was supposed to guarantee that your child would grow up both intelligent and knowledgeable, and obviously it worked in my case. You can still find printed encyclopedias gathering dust in public libraries, but rarely on family bookshelves – the Internet has seen to that. I leave it to you to decide whether the Internet in the house will guarantee that your child will grow up both intelligent and knowledgeable.

The great thing about an encyclopedia in book form, and especially a big one like Britannica or Americana, is that you can read it, explore it, and get lost in it. One thing leads to another, and another, and another. You may start by looking up Diderot and end up reading about speculative fiction or Italian opera. I know, because it happened to me. Internet search engines simply seek out a target and hit it: here’s your question, here’s your answer, end of story. This is very practical and useful, but there’s no adventure in it and precious little chance of making strange or unexpected discoveries.

Intellectual giants of the past like Goethe, Nietzsche, Hegel, and Freud all admired Diderot, not just for his massive encyclopedia but for his subversive satirical writing. He was a man with a mission, to promote knowledge in all its forms without prejudice and without censorship. It is ironic to reflect that if he were to pursue the same passion today, three centuries later, he would probably be in as much trouble with the authorities now as he was then.

Memento Kaypro

What happens to old computers? That’s a rhetorical question. I know the answer. Old computers go straight to my basement.

I’ve been tidying the basement because it’s cool down there, and I decided to organize the mess in a systematic way – books in one area, suitcases in another, boxes of old files in the far corner, and so on. One area was reserved for electrical stuff, and I put a superannuated computer there, then another, then another, then another. What are all these computers doing in our house? We’re not geeks or hackers. My wife and I both use computers under protest and in a state of high anxiety, like primitive savages forced to worship a new and unpredictable god.

A little electronic archaeology allowed me to arrange the computers in layers, according to their ancient origins. The oldest was something called a Kaypro IV, which came into our lives about 1983. This was called a “portable” computer, although “transportable” would be a better name for it. It weighed almost thirty pounds. When we traveled by air we couldn’t take any other baggage. But we became so attached to that monster that we lugged it to Europe four times. My bad back dates from the time we bought the Kaypro.

The Kaypro IV may have been the perfect computer. It used two five-and-a-quarter-inch floppy disks, one for the program and one for recording what you wrote. It didn’t record much, but that was OK. Unless you were setting out to write War and Peace, it was enough. The Kaypro never crashed, never froze, and never lost any data. It wasn’t Internet-compatible, but in 1983 who cared?

Next to the Kaypro was another machine called a DFL, and another from Hewlett Packard, and another from Packard Bell, plus an ancient laptop from Compaq that the cats had disliked on sight. I didn’t explore the furthest, darkest corners of the basement for fear I might find an old IBM mainframe lurking there.

The total number of computers in the house turned out to be eight, which was astonishing. We don’t even have kids, and clearly we can’t get rid of these things. Old computers are defined as hazardous material, and must be disposed of “properly.” Unfortunately, nobody agrees what “properly” means, which is why they end up in the basement. They also age very fast. Your smashing new machine will be history in about four years – a rate of planned obsolescence that even the auto industry cannot match. With two hundred and fifty million computers in the United States being junked every four years, the problem seems not so much their toxicity as their sheer volume. Whole western states could soon be several feet deep in discarded computers.

Old computers are toxic in more ways than one. Criminals can steal identities from them, and perhaps even material for blackmail. The contents of your hard drive may be emotional dynamite, like Madame Bovary’s letters, or they may be just embarrassing because of the web sites you have visited or the bad poetry you have written in the past. Those computers in the basement are not just junk: they are time bombs.

They are also a source of guilt because they are such obvious symbols of waste and hysteria. Our old computers are not dead. They all still work. They’re just not fast enough or clever enough for the maniacal modern world. They don’t have enough memory. They are not state of the art. If we applied the same brutal logic to ourselves we would all be languishing in the basement.

Putting On Appearances

Clothes make the man.

Old proverb (even older than Calvin Klein)

Many writers get hung up on small details: what books to read, what courses to take, what kind of paper and writing instruments to use, even how to set margins and spacing, what typefaces to choose, and so on. They are, quite naturally, searching for the professional secret. What does it take to be a real writer? But they are asking the wrong questions.

The really important question is: what to wear?

When I was young I made more of an effort to appear literary. It was obvious to me that the writing business was largely a matter of creating self-images, and that every writer should be not just a self-publicity agent but also a fashion plate.

There’s a faded black and white picture of me in Paris in 1958, taken by a street photographer. It shows how hard I was trying to achieve the right style for a bohemian writer of the period: corduroys, bow tie, horn-rimmed spectacles, and politically correct CND pin. I am carrying a large folder that suggests a substantial manuscript in progress. I was doing my best to look the part, even though my actual writing performance was lamentable.

The few real writers I was able to meet in those days confirmed my impression that it was important to wear the right uniform. The poets (male and female) wore black, as required by tradition, the male novelists wore baggy suits made of tweed or corduroy, while the females favored long, shapeless frocks made out of some unyielding substance like denim. Nobody could ever mistake them for bankers or lawyers, or for that matter artists or musicians. The uniform proclaimed the writer as clearly as an eighteenth-century wig proclaimed the barrister.

This is a long and honorable tradition. The flamboyant Oscar Wilde comes instantly to mind, and also the bardic W. B. Yeats with his cloak and floppy bow tie. In the past century, American writers from Hemingway to Mailer have affected a kind of hard-edged sporting macho style, designed to suggest that typing is a super-masculine thing requiring lots of testosterone. On the female side, Barbara Cartland set the standard of image-making excellence with her shocking pink outfits, and most women writers at least made some effort to look artistic or intellectual.

But in the twenty-first century, in this as in so many things, standards are slipping. When I go to places where writers gather, such as conferences, book fairs, bars in the Hamptons and the lobby of the Algonquin Hotel, I find it harder and harder to distinguish authors from real people. The spiffy Tom Wolfe is instantly recognizable, of course. But the rest of us might as well be schoolteachers, or Wall Street brokers on dress-down Friday.

I spoke to a publicist about this, after she agreed not to charge me fifty dollars a minute for the conversation. She agreed with my perception. Most writers don’t even try to look the part. They have become dull and conformist, and not at all inspiring for their image-makers in the publicity industry.

The British, as usual, are more blatant about this. If the publicity handouts are to be believed they prefer their young female writers to dress like streetwalkers, the older females like duchesses, and the men like teenage heroin addicts or stage villains. None of this improves the quality of their writing, but it makes their book jackets and TV interviews much more watchable.

I am a poor example myself. On good days I look so much like a professor that students I’ve never seen before greet me with “Hi professor.” On bad days I look like one of those old men who visit the town dump every Saturday in search of treasures. On no days do I look like a writer.

It’s time to remake my self-image. Where did I put my old corduroy suit? I think I tucked it safely away somewhere in 1963.

It’s the Thought That Counts

Here’s a book worth reading, although it was published back in 2011, which now seems almost like an age of innocence – Except When I Write by Arthur Krystal. He is a well-known essayist and critic (New Yorker, Harper’s, etc.) who has collected twelve of his best essays and critical reviews from 2005 to 2009.

The first essay alone is worth the price – “When Writers Speak.” In it Krystal offers a quotation that Edgar Allan Poe attributed to Montaigne.
“People talk about thinking, but for my part I never think except when I sit down to write.”

This mild but disturbing aperçu brought me up short. It had the same effect on Krystal, and perhaps on you. How true it is – at least for my mental habit. Most of the time I live and talk on automatic pilot, using the same stock repertoire of actions and phrases that I’ve been using all my life. Thought is not necessary, unless I try to solve a crossword or puzzle out some new outrage committed by my computer.

It’s the same for most of us. Recently I took a two-hour train trip during which I was surrounded by people who talked loudly and incessantly to each other or (more often) to some disembodied entity hiding inside a cell phone. I must have been an unwilling listener to eight or ten conversations on that train, and none of them made any sense whatsoever. They were (to use a good old-fashioned word) just blather, empty words, ungrammatical stream-of-consciousness noise. No communication was taking place, except the kind of communication that occurs when one monkey chatters to another – a kind of verbal grooming or exchange of recognition signals.

It was depressing to realize that I often do exactly the same thing. Faced with a social situation, or an unexpected phone call or an encounter in the post office, I can blather as well as anyone. Not a single thought enters my head while I am doing it.

But when I sit down to write, as Montaigne said, I begin to think. The rusty gears of my brain grind into action and (as has often been pointed out to me by my nearest and dearest) I disappear into a state of abstraction where I don’t want to talk to anybody. This is exactly why writing is so hard. It’s not the writing, it’s the thinking, that produces keyboard avoidance and writer’s block.

The reverse side of this phenomenon, as Krystal points out, is that writers are often poor speakers. Our literary skills don’t always translate into verbal skills, perhaps because the thinking part of our brains is reserved for, or used up by, the writing.

As someone who works in radio, where speech and writing (and, occasionally, thought) come together, I found Krystal’s speculations fascinating. But you have to read the whole of Krystal’s essay – if I describe it in any more detail here I will probably be guilty of copyright infringement. The rest of his book is excellent too, especially if you are interested in Hazlitt, Poe, Barzun, or Scott Fitzgerald (some of his favorite subjects). His essays are models of complex yet completely friendly writing, and they must have required a lot of thought.

Read It Out Loud

The habit of reading aloud to children is slowly dying out. Busy parents prefer to settle their little darlings down with the TV or a video. This seems a shame to me, because I was the beneficiary of countless hours of reading aloud by my parents. Not only was it a very warm and companionable thing, but it allowed me to see my parents as magical storytellers, so I have admired storytellers ever since. My father was a particularly good reader, having a resonant voice, good timing, and a gift for imitation. The family myth is that, at a very young age, I learned all my favorite stories by heart, so they could never get away with skipping a page, or even a single line.

In the eighteenth and nineteenth centuries it was normal for literate families to read aloud to each other, just as it was normal for them to play music together. Now the mass media have made those sociable habits largely redundant. Most of us, if called upon to read from Shakespeare, or even from the morning paper, will make a sad mess of it. We don’t have the skills that come from having the habit.

Yet how reassuring it is to hear a familiar voice beginning a familiar story. The voice draws us in, as plain print does not. We can lift our eyes from the printed page; the smallest distraction is enough to break its spell. But a voice holds our attention.

“The mole had been working very hard all the morning, spring-cleaning his little home. First with brooms, then with dusters, then on ladders and steps and chairs, with a brush and a pail of whitewash; till he had dust in his throat and eyes, and splashes of whitewash all over his black fur, and an aching back and weary arms.”

That’s the opening of Kenneth Grahame’s The Wind in the Willows, the story of a mole, a water rat, a toad and their friends in the English countryside. I loved that book as a child, and knew every word. After that first paragraph I still want the story to go on, to hear how Mole abandoned his spring cleaning and burrowed up into the sunshine and met the water rat, and all about their adventures. But I want someone to read it to me, I want to hear it.

Familiar opening lines draw us in like a magnet. This is one you know.

“Marley was dead: to begin with. There is no doubt whatever about that.”

Or, how about this?

“It is a truth universally acknowledged, that a single man in possession of a good fortune, must be in want of a wife.”

Or this, a little more tricky:

“For a long time I would go to bed early. Sometimes, the candle barely out, my eyes closed so quickly that I did not have time to tell myself: ‘I’m falling asleep.’”

If you recognized the opening of Marcel Proust’s Remembrance of Things Past, you get an A. Here’s an easy one.

“Call me Ishmael.”

Most book addicts will recognize all those openings without difficulty, and that’s the point. They stand for the whole unfolding story, the story that we know already and we want to hear again, like a child at bedtime.

So for some real, old-fashioned, entirely free home entertainment, choose a good story, preferably a mystery or a ghost story, gather your audience, and read it out loud.

Utopia Needed

We don’t hear much about utopias these days. We have conspiracy theories and apocalyptic religious wars, but no dreams of a perfect future. Perhaps we have just grown up. It may be that cultures are like people: at some point in the life cycle, we have to give up the idea of self-improvement and concentrate on self-maintenance and survival.

Yet the promise of utopia runs deep in American history. It’s an eighteenth-century idea, like America itself, and suggests that we can achieve an ideal society on this earth and by our own efforts, starting with the perfection of the self. The story of this country is full of utopias, from the Puritans themselves to the Shakers, Oneida, Amana, New Harmony, Brook Farm, the Fourierists, and hundreds more.

Most of these were already fading away a hundred years ago. Where can we look for utopia today? We can start, I think, by looking in college towns. With large, mobile populations of young and optimistic people, such places seem to offer a kind of laboratory in which to build a perfect life and a new world. That’s the charming thing about college towns like Madison, Chapel Hill, Ann Arbor, or Burlington. Among those I’ve visited, my favorite was Santa Cruz, California, where I spent a year in the 1970s. The place almost turned me into a hippie.

There were and are so many self-improving things on offer that, if you stayed long enough, you might actually become perfect – physically, morally, spiritually, and intellectually. Apart from the usual choices – therapy, yoga, meditation, new age philosophies and so on – there are opportunities to go and serve in the world by building environmentally friendly houses in India, or peace in Israel, or sustainable habitat in Costa Rica, or community awareness right here at home. This is where intellect and optimism are concentrated. Everything around you says “Yes, we can.” In such an environment a young person can stop and think for a while about changing his or her life, and by extension changing the world.

They could do worse than start with a dusty book from the back shelves of the library: Looking Backward, by Edward Bellamy, published in 1888. It was one of the most influential utopias ever published in America. In the closing years of the nineteenth century, Bellamy looked ahead to imagine how the world would be in our own time. It’s an old book, but it shows that we have some unfinished business.

Bellamy predicted that we would now have a world without war or poverty, a world of universal equality and understanding between classes and races. There is no crime in Bellamy’s twenty-first century America because there is no money. Money has been abolished – and with it making money from money, and envy, and robbery, and poverty, and wealth, and political corruption. There are, of course, no lawyers, and no financial advisers in this utopia. Medical care is free. Education is rigorous and disciplined, but free up to the limit of each person’s capacities. Culture, creativity, and good manners are valued above all things. And, talk about utopia, there are even term limits on Congress.

We don’t need a new utopia; the old one was fine. Here’s a wonderful summer project for any idealistic young student. Write a research paper that answers the question: how do we get from the world as it is now to the world as Edward Bellamy imagined it would be now? Extra credit for finding the right answer, whatever it is.

Copyright: David Bouchier

The Mixed Pleasures of Teaching

A lot of writers teach to pay the rent. It’s the almost ideal occupation because of the long vacations and the fact that teaching is (or was) all about words, books, and ideas. Some have made the leap from teaching to literary fame, like Frank McCourt (Teacher Man) and Allan Bloom (The Closing of the American Mind).

I’ve done my share of teaching, although I started late. The first time I faced a class, at a British university, I was already thirty years old. This gave me a certain advantage over younger assistant professors. The students assumed I was more senior than I was, and I didn’t disillusion them. For twenty years I continued teaching at that university, and in New York, Connecticut, and California. It was a great experience, although it left me with an ineradicable streak of irony.

Now I teach only at summer writing schools, and the occasional adult seminar, where the dynamic is very different from that of a college class. But an opportunity just came along to teach a special seminar at the local university, and that set me thinking again about the education process, which I thought I had left behind for good.

Teaching changes people. It’s great for self-esteem, perhaps too great. Professors feel sure of themselves. They are the experts, and they are on top of their subjects. They may occasionally be challenged by graduate students or a bright undergraduate, but basically they are invulnerable.

It’s a good feeling to be ‘the authority.’ It’s something I miss as a writer. My audience and work team consists of our two cats, who never defer to my authority in any way whatsoever. So the thought of a seminar is always rather tempting.

Other kinds of teaching may be less good for morale. Primary school teachers are (I hope) firmly in charge of their tiny pupils. But middle and high school teachers are on a battlefield, and they often seem crushed and embittered by the experience. Education works best from ages 0 to 10, and then again from about 20 to 30. The intermediate years are a waste of time.

Pedagogy has changed enormously in my lifetime. The boys’ school I went to was run by a team of authoritarian “masters” who used sarcasm and violence to keep us in line. But they were effective teachers. At university I encountered some of those old-fashioned professors of the type portrayed so vividly by John Houseman in the movie The Paper Chase. Teachers were always typecast as enemies, egomaniacs, or gurus. They fascinated us because they were so different, and we assumed (usually wrongly) that they had bottomless stores of secret knowledge.

All gone now: vanished beyond recall. Today’s teachers, at all levels, are supposed to be friends and confidantes, regular guys and gals who facilitate shared learning experiences instead of teaching. In fact, school has become much more like home, which is appropriate at a time when homeschooling has become an unwelcome necessity. I just hope that homeschooling parents are paid enough to compensate for the awesome responsibility and 24-hour timetable.