Text Patterns - by Alan Jacobs

Friday, January 31, 2014

Noah Millman is very smart about blogging

Here:

I started blogging in 2002, hanging out my own shingle on blogspot. I did it primarily as a belated response to the trauma of 9-11: I had been emailing news items to a variety of friends and family with an obsessiveness that nearly deserved a DSM number, and one of them finally told me I should stop emailing him and start a blog if I felt compelled to tell everyone what I thought. So, against my wife’s explicit instructions, I did.

And I loved it, right from the get-go. The thrill of instant response to what I said was a perfect fit for my latent writerly ambitions for recognition and my Wall Streeter’s inherent attention deficits. I would write, I would press “publish,” and someone out there would respond.

But that response wasn’t merely gratifying or instructive; it shaped what I wrote, shaped the persona (a better word than “self”) that I was developing on-line. My style, my subject matter, my politics, my sense of who I was and was meant to be evolved in part based on what got positive reinforcement and what didn’t, even though I wasn’t being paid anything at all. A gift economy is still an economy, and there’s nothing particularly pure about non-commercial social discourse. “No man but a blockhead ever wrote except for money” – so said Sam Johnson, but in fact the truer statement is that no man but a blockhead ever tried to earn money by writing. When it comes to money, Willy Sutton had a much better understanding. So all of us writers, whatever our medium, write out of some other compulsion than to earn a living. And to the extent that that compulsion has something to do with having readers, we have to watch the progress of our addiction, how it is changing us.

"A gift economy is still an economy," and the blogosphere may not even be that. The networks of exchange are complex and still imperfectly understood.  

the internet, mapped


By Martin Vargic. Click on the image for a much much much larger version.

Thursday, January 30, 2014

technocracy


This is one of the first references I can find to the word "technocracy."  (The only earlier one that seems to have the same feel is from one W. H. Smyth, who in an issue of a magazine called Industrial Management in 1919 wrote, “For this unique experiment in rationalized Industrial Democracy I have coined the term ‘technocracy’.”) What's noteworthy, of course, is how purely positive, even celebratory it is — as long as "scientists of all nations enter an understanding not to contribute their skill in the manufacture of munitions." Which is not exactly how it worked out. 

The word began to catch on a few years later, though, significantly, it declined at the very time (World War II) when a genuine technocracy was coming into being: 


"Technocracy," and its adjectival form "technocratic," rose sharply in popularity in the 1970s, though I suspect that by this time the terms had undergone significant pejoration. I'll try to find out what writers were responsible for this significant upturn. 

UPDATE: good folks on Twitter have pointed out that those early would-be technocrats were followers of Edward Bellamy and that, while they may have meant the term positively, it was highly contested from the start (that was Yoni Appelbaum); and that (this comes from Chris Beha) Daniel Bell may have played a significant role in the pejoration of the term in his 1973 book The Coming of Post-Industrial Society — though a quick look at that book suggests that Bell was drawing heavily on French social theory. I wonder what role Jacques Ellul played in this? More on all these matters as I learn more....

Monday, January 27, 2014

outside the circle

In Mark Edmundson’s essay about his graduate-school days at Yale, there’s a passage that plucks my own chords of memory:

The Yale English-department faculty was mostly white, male, and bald or graying. They wore ties, tweed jackets, thick glasses, and sensible shoes. Some of them even smoked pipes. It was the most aggressively senior group of people that I had yet encountered.

The first time I saw more than a few of them together was at the introductory reception, a couple of days after my first meeting with the director. That day the department served sherry, which I had never seen, much less tasted. For solid refreshments, we got Dunkin’ Munchkins, tiny balls that were, the conceit ran, the punched-out centers of doughnuts. More than half a dozen graduate-faculty members clumped together in a tweedy circle, pretty much ignoring the students. The denizens of the Yale English department seemed about as interested in meeting us graduate students as they were in eating the Munchkins.

The reception was held in the graduate seminar room, which looked like the throne room for a monarch soon to be deposed. There was a grand table, lined with about 20 comfortable leather chairs. Up front was the lion’s seat, a regally worn piece of business, where—we all knew without being told—the graduate professor would install himself to hold court. (I knew that if I ambled over then and sat down in it, I’d be vaporized.) Behind the throne was a bookcase full of genteelly decomposing books, many of them related to the study of Old English. Above the books, I saw a shape that looked like a crude map of Australia but on closer examination proved to be a water stain.

Occasionally a brave grad student tried to edge into the tweed clump, but the clump subtly bunched, closed ranks, and left him outside. Eventually the student returned to us, hangdog.

I didn't do grad school at Yale but rather at Virginia, just a few years after Edmundson; yet I know the experience. I think particularly of a day in, probably, 1983, when Stephen Greenblatt came to give a lecture for the English Department. Greenblatt was then only around forty, but already a full professor at Berkeley and a very fast-rising star in the academic firmament. The talk he gave was a version of what would become his most famous essay, “Shakespeare and the Exorcists,” and I certainly had never been, and probably have not since been, so bowled over by a lecture.

Perhaps my admiration was readable on my face. As the applause died down, James Nohrnberg, the immensely learned and deeply eccentric Spenserian of the faculty, leaned over to me and said, with an amiable grin, “Have you checked to see if you still have your wallet?” He then turned and ambled out of the room.

The rest of us withdrew to the Colonnade Club for a reception, where I thought I might get a chance to ask Greenblatt a question or two. (Such was my naïveté.) I found him in a corner of the room, surrounded by about a dozen betweeded or blue-blazered faculty members; I hovered uneasily around the edge of the group, hoping to catch his eye. And in fact I did: he nodded at me, smiled a bit, but then had to answer the next question that came from one closer to him in rank, status, and space. I waited, but one of the eminences surrounding him — I’m almost certain I remember who it was — noticed my presence, sneered, and closed the gap between him and his neighbor to make sure I could not intrude into the charmed circle. After a little while the whole phalanx ushered Greenblatt away to dinner and the rest of us crumpled and dispersed.

But I’ll always remember, gratefully, that he would have spoken to me — had he been allowed to.

(Now that I've told this story, let me recommend that you read Edmundson's essay for what it's really about: the power and the beauty of what used to be called "humane learning.")

Saturday, January 25, 2014

Pynchon, literacy, and Dickens

For the print version of my review of Thomas Pynchon's Bleeding Edge I added a sidebar on Pynchon and literacy. Here it is:

None of the major characters in Inherent Vice and Bleeding Edge, and few of the minor ones, read books. References to television, music, consumer brands, and (in Edge) web-sites abound, and those references are sharply observed and often extremely funny. There's a brilliant riff in Inherent Vice on how Charlie the Tuna, in the old StarKist commercials, is complicit in his own exploitation; and an ongoing joke in Bleeding Edge involves Maxine's husband's addiction to a biopic channel, with such memorable films as “The Fatty Arbuckle Story” featuring Leonardo di Caprio, and a dramatization of Mikhail Baryshnikov's life starring Anthony Hopkins. (“Would you look at this. Ol' Hannibal dancin up a storm here.”) But books, whether fictional or nonfictional, highbrow or lowbrow, are almost impossible to find, because they have played no role in shaping the hearts and minds of these characters.

Even Heidi, Maxine's academic friend who uses words like “mimesis” and “alexithymic,” is never seen reading and makes no references to books. (It's probably significant also that she studies popular culture: she visits ComicCon—more formally, Comic-Con International, held annually in San Diego since 1970—and observes trends in Halloween costumes.) I can think of only one book mentioned in Bleeding Edge, and not by name: a computer programmer is said to have on his desk a copy of “the camel book”—that is, Larry Wall's Programming Perl (probably the 2nd or 3rd edition). In 2001, when the book's events take place, Perl was still the most widely used scripting language, especially by those who coded the internet, though Python and a brand-new language called Ruby were on the rise. It says something about Pynchon's attention to detail that he gets this right. It says something about his book's themes that the only book referred to is a programming manual.

Pynchon writes long, complex, demanding, learned books about people who don't read long, complex, demanding, learned books, and while this could be said of many other writers as well, in Pynchon it has, I think, a particular significance. Most of Pynchon's characters in these recent books—and, I think it is fair to say, in all of his books in one way or another—are caught up in immensely complex semiotic fields. All around them events are happening that seem not just to be but to mean, but the characters lack the key to unlock those mysteries, and as they try to make their way are constantly buffeted by the sounds and images from movies, tv shows, tv commercials, popular songs, brands of clothing, architectural styles, particular makes of automobile … all combining to weave an almost impossibly intricate web of signification. Rare indeed is the Pynchonian character who is not entangled to some degree in this web.

By doing what he does in book after book, Pynchon clearly indicates not just that he finds this entanglement problematic in multiple ways—psychologically, socially, politically—but also that the primary means by which the entanglement may be described and diagnosed is that of books—large books comprised of dense and complicated sentences. In Pynchon's fiction we see an immensely bookish mind representing an unbooked world, and its great unspoken message is: Let the non-reader beware.

I have sometimes wondered whether, centuries from now, when the large hand of History has smoothed over differences that seem vitally important to us, readers might see Pynchon and Dickens as pretty much the same kind of writer: makers of big rambling eventful tragicomic books featuring outlandish characters with comical names. The Pynchonian and the Dickensian projects have a great deal in common, and as time goes on I think it will become more and more clear that there is something truly old-fashioned about Pynchon's career.

Friday, January 24, 2014

a little Narnian adventure

I'm suffering with a rather nasty cold at the moment, which, along with the typical craziness of the start of an academic term, has slowed the pace of writing around here; but I can't resist trying, even in this befogged condition, to say a few words in response to this typically smart post by Adam Roberts on the Narnia books.

First, about whether Lewis's portrayal of Aslan is consistent with “the Christ of the Gospels” — I would say both (a) Yes and (b) That depends. Taking the Christian scriptures as a whole, we get three interventions into this created order by the Second Person of the Trinity. First, we are told at the beginning of John's Gospel that “all things were made through him” and, similarly, in Colossians 1 that “by him all things were created”; for this, see The Magician's Nephew. Second, the Incarnate Christ is the one we see in the Gospels; for this, see The Lion, the Witch, and the Wardrobe. And then there's the Christ who “will come again in glory to judge the living and the dead,” as the Creed says, and whose role as Warrior and Judge is described most fully in Revelation; that's The Last Battle. Traditional Christian theology would say, and therefore CSL would say, that the character of the Second Person of the Trinity is utterly consistent but he plays different roles at different points in history.

Now, about Susan. The old Problem of Susan. Adam writes, “I do not claim that Susan is forever banished from heaven; I say that at the end of the Narnia books she is excluded, and so she is.” I genuinely don't think this is the right way of putting it, because it gives the agency to someone other than Susan — which might be okay if Lewis were a TULIP-affirming Calvinist, but he wasn't. I think that the right way to put it is to say that Susan simply chooses not to return to Narnia. That we paltry little humans have the power to refuse God is a point that Lewis returns to often in his theological writings. As he writes in The Problem of Pain, if we demand that God leave us alone, “that is what he does” — and, interestingly, Lewis prefaces that statement with an “Alas,” as though he might well prefer God to operate in another way. (Which also helps us understand that in sparing Susan from the train wreck that kills the rest of her family he is trying to give her a chance to turn back around towards Narnia. However, the emotional tenor of all this is muddled by this catastrophic contrivance to get the rest of the Pevensies into Narnia one last time; it's one of Lewis's unwisest narrative choices.)

I think this point — that we can refuse God and that some of us do — was important enough to Lewis that he was determined to get it into the Narnia books, but how was he to do it? The point wouldn't be made strongly enough if any of the less dominant characters embodied it, so it had to be one of the Pevensies. He couldn't make Lucy a backslider: she was the one who had always had the greatest faith and the greatest spiritual discernment. And he couldn't use Edmund either, since any renunciation of Aslan by Edmund would destroy the whole portrayal of Edmund's redemption in the first book. So it had to be either Peter or Susan, and I suspect that Lewis was not quite ready to face the possible theological implications of the High King of Narnia becoming a rebel against Aslan. So Susan it had to be. Lewis was backed into a structural corner, as it were.

This is not to say that Lewis didn't have some deeply troubling ideas about women, only that I think he couldn't have gone in another direction if he were going to make this theological point about our ability to be “successful rebels to the end.”

Finally, I want to look at this passage:

My real criticism of this novel relates to a different matter. It is that it ends just when it is getting interesting. The Pevensie kids become the kings and queens of Narnia: King Peter the Magnificent, Queen Susan the Gentle, King Edmund the Just and Queen Lucy the Valiant. They grow to adulthood in this world, until, many years later, they chance upon the lamppost again, and tumble back into our world, no longer adults, now children. Only a few hours have passed on Earth, for all the years (decades?) they spent in Narnia. Then Lewis stops; but this is where the story starts, surely—what would it be like to have an adult consciousness inside the body of a child? To have passed through puberty, and then suddenly to have the hormone tap switched off? You could hardly go back to your former existence; but neither could you expect to live as an adult. Would you go mad, or use your beyond-your-seeming-years wisdom to some purpose? How would you cope? Would you try to explain? Would you betray yourself, and reveal the Narnia portal to the world—would governments attempt to exploit it? The psychological interest in the story begins at the end; but that's exactly the place where Lewis drops the bar down and ends things. Grrr!

I want to look at this passage because I have wondered about this point too, and I suspect that Lewis never really thought this through. In this particular sense the Narnia books are connected in my mind with the old Tom Hanks movie Big, which enacts an especially intriguing wish-fulfillment dream: that it's possible for a kid to assume an adult body and have adult experiences, including living on his own, having sex, and working for a living — and then turn his back on all that and go back to live as a child with his parents. But then what would puberty have been like for him? There's a reason why Blake's Songs of Innocence and Experience can't be read in reverse: “The Lamb” necessarily comes before “The Tyger” and always will.

It's interesting that Lewis to some degree addresses this issue when going the other way: when the adult Kings and Queens of Narnia ride back into the forest where the Lamppost is they have clearly forgotten who they once were and only gradually recollect their previous lives as ordinary children; but whenever they come back into Narnia they seem to have perfect recall of everything they had done there before. Lewis might have done better to reverse the memorial polarity.

Monday, January 20, 2014

relevance and ignorance

A few days ago I wrote, “Between the writers who are desperate to be published and the editors desperate for “content,” the forces militating against taking time — time to read, time to think — are really powerful.” If you want evidence for that claim, you couldn’t do better than read this interview with cartoonist and writer Matthew Thurber, who cheerfully describes the pleasures of writing about a subject he can’t be bothered to learn anything about — in this case, the so-called and just-around-the-corner Singularity:

I like not-knowing in general. And if I’d waited until I’d read all of Ayn Rand and all of the singularity literature, I wouldn’t have been able to work fast enough to get this comic done. I felt an urgency to get it out before it became completely irrelevant. YouTube has been around for a decade. The Snowden stuff happened when this book was coming out. But I felt like it would be funny if I didn’t know what those things were. Writing a book responding to the singularity but not really knowing what it was. It was just a rumor. Ineptitude can be funny, too.

Thurber pushes this point hard enough that it eventually becomes clear that he wants to be thought of as knowing even less than he does: “I don’t know what the singularity really is. I understand that it involves the hybridization of humans and technology, or A.I. Or actually, no, I don’t know what it is. A robot? Like the movie D.A.R.Y.L.? Or any movie where there’s a robot who has feelings?” So he asks the interviewer: “And maybe it already happened? Do people think it already happened?”

There’s much to consider here. First of all, Thurber’s claim “I like not-knowing in general.” Well, have I got a political party for you, then! But more seriously, this reminds me of that prince among legal scholars, Hugo Grotius, who wrote, Nescire quaedam magna pars sapientiae est — “Not to know some things is the greater part of wisdom.” But Grotius’s point — made in an era of rapidly expanding knowledge, of having too much to know — was that you have to make a discipline of foregoing certain kinds of knowledge that are not necessary to your chosen intellectual path in order to cultivate other kinds that have first-order importance for you. (When I posted that Grotius quote to Twitter a while back, my internet friend Erin Kissane shot back a famous line from Sherlock Holmes asserting his indifference to the Copernican theory: “You say that we go round the sun. If we went round the moon it would not make a pennyworth of difference to me or to my work.... Now that I do know it I shall do my best to forget it.”)

But, obviously, this is a very different point than the one Thurber is making, which is that he likes knowing nothing about the very things he is writing about. Or at least he is quite willing to remain ignorant in order to avoid being slowed down in his work. It’s not hard to imagine making a pleasant intellectual game out of writing about what you don’t know, but Thurber is clear that for him this is all about being relevant — it’s specifically the quest for relevance that mandates ignorance. Thurber’s argument goes like this: A great many people are talking about something called the Singularity; I don't know much about it, but I’d like to draw the attention of those people; but those other people, like me, have short attention spans and may soon be talking about something else; so I’d better write something about the Singularity quickly so I can attract their eyeballs before it’s too late.

This is all phrased light-heartedly, but I wonder if that tone isn’t at least a little misleading: Thurber really does seem afraid of getting left behind. And he’s not the only one: it’s pretty clear that in writing The Circle Dave Eggers was so eager to make a Socially Relevant Intervention about tech companies that he didn’t bother to learn how they actually work. So what we have here is an urgency to be heard coupled with a need to be relevant. The result: social commentary made by people who have nothing but vague, uninformed speculations to guide their writing. This is how whole books become indistinguishable from the average blog comment.

Thursday, January 16, 2014

less rolling, less tumbling, more mindfulness

In my last post I explained that I’m thinking of moving away from my Tumblr because its frictionlessness, its ease of posting and re-posting, has become somewhat problematic for me. And I said there was another reason for moving away from it, but in fact I have at least two.

In my view, the number one thing that all the major social media do wrong is this: metrics. Follower and friend counts. I would greatly prefer not to know how many Twitter or Tumblr followers I have, and were I on Facebook I wouldn't want to know how many “friends.” But the world of social media is a world of counting. The whole business reminds me of what Auden called “Infernal Science”:

One of our greatest spiritual dangers is our fancy that the Evil One takes a personal interest in our perdition. He doesn’t care a button about my soul, any more than Don Giovanni cared a button about Donna Elvira’s body. I am his “one-thousand-and-third-in-Spain.”

One can conceive of Heaven having a Telephone Directory, but it would have to be gigantic, for it would include the Proper name and address of every electron in the Universe. But Hell could not have one, for in Hell, as in prison and the army, its inhabitants are identified not by name but by number. They do not have numbers, they are numbers.

And once you start conceiving of your online social world as countable, it’s hard, as many, many people have noted, not to make decisions based on what will boost those numbers. Since the major social media platforms don't even allow the possibility of hiding the numbers — it has probably never occurred to any of the people who work for such sites that a person might not want that data — you either have to find a way to ignore the numbers or stop using the service.

Which leads me to my second reason for moving away from Tumblr: the value of owning your turf — your turf also, not incidentally, being a place where you can take no notice of metrics. Thus Frank Chimero on changes he’s making in his online presence, based on, among other things, his realization that his being spread so thin online is making him grumpy:

While this callousness and irritability might be caused by the traits of certain environments online, it’s also just an attitude, so it can be modified. I can adjust how I look at the newness, change how I interact with these venues, and try to make a quieter, warmer, and slower place for my things. That’s good for the audience (I think), and good for my work and the things I share. You need to build a safe place so people don’t need to be on guard and stingy with their attention. If you can do that, we all get a breather.

It seems the best way for me to do this is to step out of the stream and “build my own house,” just like those architects. I don’t have to simplify or crop or be pulled out of context (unless I want that), which hopefully produces a fuller picture of who I am, what I like, and what I value. I’m returning to a personal site, which flips everything on its head. Rather than teasing things apart into silos, I can fuse together different kinds of content. Instead of having fewer sections to attend to distracted and busy individuals, I’ll add more (and hopefully introduce some friction, complexity, and depth) to reward those who want to invest their time. I won’t use analytics — actually, I won’t measure at all. What would I do with that data anyway? In this case, it’s just more noise. The singular thread that runs through everything is only “because I like it.”

So, I’m doubling down on my personal site in 2014. In light of the noisy, fragmented internet, I want a unified place for myself — the internet version of a quiet, cluttered cottage in the country.

This sounds great to me. So all I have to do is to figure out how to do this in my own life. I know I’ll be here; I know I’ll keep my Gospel of the Trees site up and running; but everything else is now negotiable. I will almost certainly do any personal blogging and posting of images, quotes, and the like to my own turf. Maybe I’ll even move my research notes from Pinboard to there, or move them offline entirely. Maybe I’ll make my Twitter account private, or create a new private one and leave the public one just for announcements and links.

It’s time for more mindfulness. And mindfulness can’t be done quickly.

Wednesday, January 15, 2014

rolling and tumbling

I’ve been writing lately about thinking — about, especially, the conditions under which thinking happens. Previous posts are, in order, here, here, here, and here. Now for another installment.

I started what used to be called a tumblelog but now is usually just called a Tumblr in March of 2007, so I was a fairly early adopter of what has since become an enormous social-media enterprise. I liked it right from the beginning because, in contrast to a conventional blog, its code promoted the posting and re-posting of ... stuff. Quotations, images, music, snippets of conversation: all were trivially easy to get onto my site. I began to think of my Tumblr as a digital commonplace book, an idea I wrote about here and here. Looking at those essays, I see that I registered a note or two of caution. For instance:

When I post quotations and images to my tumblelog I suppose I'm succumbing to the temptation to cheat: I'm not writing anything out by hand; I'm not even typing the words, which is what I used to do when as a teenager I kept a sheaf of favorite quotations in a desk drawer. I'm just copying and pasting, which is nearly frictionless. I don't have to think about whether I really want to record a passage or image: if it's even vaguely or potentially interesting, in it goes. I might not even read it with care, much less give it the kind of attention that would be required if I were to write it out by hand.

And this:

Keeping a commonplace book is easy, but using one? Not so much. I started my first one when I was a teenager, and day after day I wedged open books under a foot of my ancient Smith-Corona manual typewriter and banged out the day's words of wisdom. I had somewhat different ideas then of what counted as wisdom. The mainstays of that era — Arthur C. Clarke and Carl Sagan were perhaps the dominant figures — haven't made any appearances in my online world. But even then I suspected something that I now know to be true: The task of adding new lines and sentences and paragraphs to one's collection can become an ever tempting substitute for reading, marking, learning, and inwardly digesting what's already there. And wisdom that is not frequently revisited is wisdom wasted.

I really love posting to my Tumblr, and the “frictionless” quality of its code is a primary reason: with a few keystrokes and mouse-clicks I can fill the page with interesting tidbits and even the occasional profundity. And there’s an additional pleasure in seeing readers re-post things that I’ve dug up — I think my record for re-posts is this syllabus of W. H. Auden’s. (Images and short quotations generate the most re-posts and favorites, by the way.) I don’t have a great many readers of my Tumblr, but I think the ones who do read it enjoy it, at least to judge by the emails and tweets I’ve received on previous occasions when I went on Tumblr hiatus.

But I have come to think of these very pleasures as posing problems. Ease of posting makes me indiscriminate — I just throw anything up there that looks vaguely interesting, and then at the end of the day when I see that I’ve posted a dozen items I get this strange illusion of productivity. And sometimes when I want to post something especially important I find myself wishing that I hadn’t posted so much insignificant material that day, because I don’t want what’s important to get lost in the crowd. Moreover, once I start noticing the kinds of posts that get re-posted, it becomes a matter of discipline to ignore that and focus on what I think is interesting. I’ve started to believe that my relationship with my Tumblr isn’t altogether healthy, and — to circle back to the theme of this series of posts — that this particular technology may be encouraging me to post without thinking.

And there’s one more reason why I’m beginning to think that another model of online idea-presentation might be better for me. But that I’ll explain in my next post.

Tuesday, January 14, 2014

portrait of a Silicon Valley entrepreneur

There are some minds which give us the impression that they have passed through an elaborate education which was designed to initiate them into the traditions and achievements of their civilization; the immediate impression we have of them is an impression of cultivation, of the enjoyment of an inheritance. But this is not so with the mind of the Rationalist, which impresses us as, at best, a finely-tempered, neutral instrument, as a well-trained rather than as an educated mind. Intellectually, his experience is not so much to share the experience of the race as to be demonstrably a self-made man. And this gives to his intellectual and practical activities an almost preternatural deliberateness and self-consciousness, depriving them of any element of passivity, removing from them all sense of rhythm and continuity and dissolving them into a succession of climacterics, each to be surmounted by a tour de raison. His mind has no atmospheres, no changes of season and temperature; his intellectual processes, so far as possible, are insulated from all external influence and go on in the void. And having cut himself off from the traditional knowledge of his society, and denied the value of any education more extensive than a training in the technique of analysis, he is apt to attribute to mankind a necessary inexperience in all the critical moments of life, and if he were more self-critical he might begin to wonder how the race had ever succeeded in surviving. With an almost poetic fancy, he strives to live each day as if it were his first, and he believes that to form a habit is to fail. And if, with as yet no thought of analysis, we glance below the surface, we may, perhaps, see in the temperament, if not in the character, of the Rationalist, a deep distrust of time, an impatient hunger for eternity and an irritable nervousness in the face of everything topical and transitory.

— Michael Oakeshott, "Rationalism in Politics" (1947)

Monday, January 13, 2014

the confidence of the elect

Right after I wrote my last post I came across an interestingly related one by Tim Parks:

No one is treated with more patronizing condescension than the unpublished author or, in general, the would-be artist. At best he is commiserated. At worst mocked. He has presumed to rise above others and failed. I still recall a conversation around my father’s deathbed when the visiting doctor asked him what his three children were doing. When he arrived at the last and said young Timothy was writing a novel and wanted to become a writer, the good lady, unaware that I was entering the room, told my father not to worry, I would soon change my mind and find something sensible to do. Many years later, the same woman shook my hand with genuine respect and congratulated me on my career. She had not read my books.

Why do we have this uncritical reverence for the published writer? Why does the simple fact of publication suddenly make a person, hitherto almost derided, now a proper object of our admiration, a repository of special and important knowledge about the human condition? And more interestingly, what effect does this shift from derision to reverence have on the author and his work, and on literary fiction in general?

But Parks’s key point is not that people generally change their attitudes towards a writer once he or she gets published — the writer changes too:

I have often been astonished how rapidly and ruthlessly young novelists, or simply first novelists, will sever themselves from the community of frustrated aspirants. After years fearing oblivion, the published novelist now feels that success was inevitable, that at a very deep level he always knew he was one of the elect (something I remember V.S. Naipaul telling me at great length and with enviable conviction). Within weeks messages will appear on the websites of newly minted authors discouraging aspiring authors from sending their manuscripts. They now live in a different dimension. Time is precious. Another book is required, because there is no point in establishing a reputation if it is not fed and exploited. Sure of their calling now, they buckle down to it. All too soon they will become exactly what the public wants them to be: persons apart, producers of that special thing, literature; artists.

Notice that this is another major contributor to the problem of over-writing and premature expressiveness that I mentioned in my post: the felt need to sustain and consolidate an established reputation.

And then there’s the sense that most successful people have — and, again, need to have — that their success is not only deserved but inevitable. Immediately after reading this essay by Parks I read an interview with Philip Pullman in which he plays to the type that Parks identifies:

Yet on one thing, Pullman’s faith is profound and unshakeable. He’s now in his mid-60s, and though he thinks about death occasionally, it never wakes him up in a sweat at night. ‘I’m quite calm about life, about myself, my fate. Because I knew without doubt I’d be successful at what I was doing.’ I double-take at this, a little astounded, but he’s unwavering. ‘I had no doubt at all. I thought to myself, my talent is so great. There’s no choice but to reward it. If you measure your capacities, in a realistic sense, you know what you can do.’

Note the easy elision here between “knowing what you can do” and “knowing you’ll be recognized and rewarded for it.” If talent is so reliably rewarded, then I don't have to consider the possibility that my neighbor is getting less than he deserves — or that I’m getting more.

These reflections aren’t just about other people. How I think they apply to me is something I want to get to in another post.

reading and thinking, one more time

I want to put together two recent posts because I think when you look at them in conjunction with each other they indicate a significant, and troubling, trend. I wrote recently about how to acquire thoughts worth expressing; and I also noted that the same article on codex-reading vs. e-reading gets written over and over and over again, with the authors rarely showing any awareness that others have quite thoroughly covered their chosen theme.

The link is simply this: that one of the most reliable ways to sharpen your own thinking is to find out what other smart people have thought and said about the things you’re interested in — that is, to take the time to read. But the content-hungry world of online publishing creates strong disincentives for writers to take that time. Almost every entity that has an online presence wants to publish as frequently as possible — as long as the quality of the writing is adequate. And often “adequacy” is determined by purely stylistic criteria: a basic level of clarity and, when possible, some vividness of style. That the writer may be saying something indistinguishable from what a dozen or a hundred writers have said before is rarely a matter of editorial concern. Get the content out there!

And of course, writers want to be published and be read. If they can’t have their work in print magazines or books, then having it tied to a URL is the next best thing — sometimes even a better thing. The passion for self-expression is incredibly powerful. Consider, for instance, the unvarying lament of literary journals: that they have far more people submitting stories and poems to them than they have readers. (Would-be and actual creative writers rarely read, and often know nothing about, the journals to which they submit their work and whose approval-via-acceptance they so desperately crave.)

So between the writers who are desperate to be published and the editors desperate for “content,” the forces militating against taking time — time to read, time to think — are really powerful. So writers tend to trust the first thoughts that come to them, rarely bothering to find out whether others have already considered their topic and written well about it — and in fact not wanting to know about earlier writing, because that might pre-empt their own writing, their publication — the “content” that editors want and that will keep readers’ Twitter feeds clicking and popping with links. In the current system everyone feels stimulated or productive or both. And hey, it’s only reading and thinking that go by the wayside.

Friday, January 10, 2014

the experiment that wasn't

A couple of days ago this showed up in my Twitter stream:


I had followed Cole on Twitter for a couple of years but eventually unfollowed — I don't remember why. In fact I thought I was still following, something that can happen when the people you follow on Twitter retweet someone a lot. But I went to check it out, and sure enough, something was happening: a story was unfolding.

You can read the whole story here, under the title "Teju Cole orchestrates his Twitter followers into a collective short story," but the really important thing to note about this event is that it was not a "collective short story" — though that's what it appeared to be at the time. When I checked it out the story was about a dozen tweets in, and my assumption (which was also the assumption of thousands of others) was that Cole had gotten the story going and then was choosing the replies that in his view best moved the story along. Now that, I thought, was interesting.

But as it turned out the many people who submitted their own tweets in hopes of having them chosen as parts of the story were wasting their time: Cole had written the story in advance and was just asking some of his followers to tweet parts of it, making sure that the last word was given to a TV host with three times as many followers as Cole:


The tyranny of the single author continues unchallenged!

Later, Cole wrote:


Well, sort of. When the story depends on people agreeing in advance to tweet its parts, parts written for them by someone else, and on their being retweeted by the author according to his plan and his schedule, the collaborative "we" element of this is trivial. A number of people in my own feed expressed some disappointment that the "event" wasn't anything like what it had at first appeared to be.

What Cole did may be sort of cool — maybe — but it wasn't a "collective story" and it wasn't what some called it, an "experiment in narration." But if someone actually tried what thousands of us thought Cole was doing....

Tuesday, January 7, 2014

code and the homeless

My wife Teri worked for years in relief and development, and she told me that one of the open secrets of that world was that a certain amount of fudging always had to be done to bridge the gap between what donors wanted to do and what needed to be done. For instance, people liked to give money to sponsor a particular child, and to forge a relationship of some kind with that child, and to provide for that child — but of course the child's whole community had needs that any responsible relief and development agency had to be more concerned about. So the selling point to a donor might be that a new well could provide plenty of fresh water for little Anna, but the real point would be to have a reliable source of water for Anna's village. Still, it was always the individual, the one little girl or little boy, that set the juices of compassion flowing.

Surely this also explains this guy's desire to teach a homeless man how to write code — an idea that has been endlessly mocked, though not always for the right reasons. Compassion should never be discouraged, though sometimes it needs to be redirected. For a more appropriately big picture, read this post on the Code for America blog.

What that post teaches us is that the problem for the homeless and the chronically poor is not that they don't know how to write code themselves, but that their situation is routinely made at the very least more stressful and often materially worse by code ineptly written by others. They suffer disproportionately from automated governmental systems not only because they don't have the education to interpret and respond to the products of bad code but also because the code that they have to deal with is often much worse than what the wealthier among us regularly encounter. It should be no surprise that a bureaucracy (perhaps necessarily) lacking in compassion generates badly-designed and badly-implemented automated systems.

So people in the software industry who want to help the poor and the homeless might think about this. What if Google and Facebook and Twitter donated not just money but also coding expertise? What if they went to City Hall and made a deal to rewrite and re-implement the automated systems that are meant to help the poor and the homeless but in fact often make their suffering worse? Let's not worry about teaching individual poor people to code until we can create an environment in which bad code is no longer a principal ingredient of their misery.

Monday, January 6, 2014

that article again?

I'm not even going to bother to link to them: a new set of articles (they come out at least weekly, and in major magazines and newspapers) about how reading on a Kindle or Nook differs from reading a paper codex. They all say the same thing: Reading on an e-reader sure is convenient, but oh how I miss the tactile pleasures of the codex. The really odd thing is that none of these articles shows any awareness of the ten thousand previous articles saying exactly the same thing.

But if you want to think in more interesting and productive ways about these matters, then read Sarah Werner's Snarkmarket post and follow the links. We can do better — see?