Text Patterns - by Alan Jacobs

Monday, September 30, 2013

the Underground Man addresses the solutionists

In essays and books, Evgeny Morozov has outlined four intellectual pathologies of the code-literate and code-celebrant: populism, utopianism, internet-centrism, and solutionism. These are fuzzy overlapping categories and Morozov still seems to be in search of a stable set of terms to articulate his critique. We might simplify matters by saying that all the people who fit into these categories tend to have excelled, in their academic and professional careers, as problem-solvers, so that we might sum up their thinking in this way: to many men who can code, every moment of discomfort looks like a problem in need of a code-based solution.

Thus Morozov, from To Save Everything, Click Here:

Design theorist Michael Dobbins has it right: solutionism presumes rather than investigates the problems that it is trying to solve, reaching “for the answer before the questions have been fully asked.” How problems are composed matters every bit as much as how problems are resolved.

In one important passage in the book, Morozov responds to the proposal that cameras be installed in your kitchen to watch you cook a meal and correct you whenever you go astray. But this, Morozov points out, is to reduce people to machines carrying out instructions. It rules out any possibility of autonomous action, of actually learning an art-form, of creative improvisation, of discovering that your “error” led to a better result than the recipe you failed to follow. “Imperfection, ambiguity, opacity, disorder, and the opportunity to err, to sin, to do the wrong thing: all of these are constitutive of human freedom, and any concentrated attempt to root them out will root out that freedom as well.”

I simply want to note that the argument that Morozov makes here was made more powerfully, and with a greater sense of the manifold implications of this tension, one hundred and fifty years ago by Dostoevsky in Notes from Underground. In one especially vital passage Dostoevsky anticipates the whole of the current solutionist paradigm:

Furthermore, you say, science will teach men (although in my opinion this is a superfluity) that they have not, in fact, and never have had, either will or fancy, and are no more than a sort of piano keyboard or barrel-organ cylinder; and that the laws of nature still exist on the earth, so that whatever man does he does not of his own volition but, as really goes without saying, by the laws of nature. Consequently, these laws of nature have only to be discovered, and man will no longer be responsible for his actions, and it will become extremely easy for him to live his life. All human actions, of course, will then have to be worked out by those laws, mathematically, like a table of logarithms, and entered in the almanac; or better still, there will appear orthodox publications, something like our encyclopaedic dictionaries, in which everything will be so accurately calculated and plotted that there will no longer be any individual deeds or adventures left in the world. ‘Then,’ (this is all of you speaking), ‘a new political economy will come into existence, all complete, and also calculated with mathematical accuracy, so that all problems will vanish in the twinkling of an eye, simply because all possible answers to them will have been supplied. Then the Palace of Crystal will arise. Then….’ Well, in short, the golden age will come again.

All problems will vanish in the twinkling of an eye, simply because all possible answers to them will have been supplied. To this utopian prediction the Underground Man has a complex response:

Can man’s interests be correctly calculated? Are there not some which not only have not been classified, but are incapable of classification? After all, gentlemen, as far as I know you deduce the whole range of human satisfactions as averages from statistical figures and scientifico-economic formulas. You recognize things like wealth, freedom, comfort, prosperity, and so on as good, so that a man who deliberately and openly went against that tabulation would in your opinion, and of course in mine also, be an obscurantist or else completely mad, wouldn’t he? But there is one very puzzling thing: how does it come about that all the statisticians and experts and lovers of humanity, when they enumerate the good things of life, always omit one particular one? They don’t even take it into account as they ought, and the whole calculation depends on it. After all, it would not do much harm to accept this as a good and add it to the list. But the snag lies in this; that this strange benefit won’t suit any classification or fit neatly into any list.

What does he mean here? What is the “one particular” “good thing in life” that is always omitted from the list? We would see it quite clearly, the Underground Man says, if the solutionist utopia were ever actually realized, because at that moment someone would arise to say, “Come on, gentlemen, why shouldn’t we get rid of all this calm reasonableness with one kick, just so as to send all these logarithms to the devil and be able to live our own lives at our own sweet will?”

And such a figure would certainly find many followers. Why? Because

that’s the way men are made. And all this for the most frivolous of reasons, hardly worth mentioning, one would think: namely that a man, whoever he is, always and everywhere likes to act as he chooses, and not at all according to the dictates of reason and self-interest; it is indeed possible, and sometimes positively imperative (in my view), to act directly contrary to one’s own best interests. One’s own free and unfettered volition, one’s own caprice, however wild, one’s own fancy, inflamed sometimes to the point of madness – that is the one best and greatest good, which is never taken into consideration because it will not fit into any classification, and the omission of which always sends all systems and theories to the devil. Where did all the sages get the idea that a man’s desires must be normal and virtuous? Why did they imagine that he must inevitably will what is reasonable and profitable? What a man needs is simply and solely independent volition, whatever that independence may cost and wherever it may lead.

And so the solutionist utopia will inevitably sow the seeds of its own undermining. “This good” — the good of independent volition — “is distinguished precisely by upsetting all our classifications and always destroying the systems established by lovers of humanity for the happiness of mankind. In short, it interferes with everything.” This is true for good and for ill; but it is always and everywhere true.

popularizing

Two recent articles from the Guardian make nice companion pieces for reflection: a profile of Simon Schama and one of Malcolm Gladwell. Each article raises an important question: What counts as valid (useful, responsible) popularization of a subject?

Here’s Oliver Burkeman writing about Gladwell:

We are now sufficiently far into the Gladwell era that the Gladwell backlash is well under way. He is routinely accused of oversimplifying his material, or attacking straw men: does anyone really believe that success is solely a matter of individual talent, the position that Outliers sets out to unseat? Or that the strong always vanquish the weak? “You’re of necessity simplifying,” says Gladwell. “If you’re in the business of translating ideas in the academic realm to a general audience, you have to simplify … If my books appear to a reader to be oversimplified, then you shouldn’t read them: you’re not the audience!” (Another common complaint, that his well-paid speaking gigs represent a conflict of interest, is answered in a 6,500-word essay on Gladwell’s website.)

A subtler criticism holds that there is something more fundamentally wrong with the Gladwellian project, and indeed with the many Gladwellesque tomes it’s inspired. To some critics, usually those schooled in the methods of the natural sciences, it’s flatly unacceptable to proceed by concocting hypotheses then amassing anecdotes to illustrate them. “In his pages, the underdogs win … of course they do,” the author Tina Rosenberg wrote, in an early review of David and Goliath. “That’s why Gladwell includes their stories. Yet you’ll look in vain for reasons to believe that these exceptions prove any real-world rules about underdogs.” The problem with this objection is not that it’s wrong, exactly, but that it applies equally to almost all journalism, and vast swaths of respected work in the humanities and social sciences, too. You make your case, you illustrate it with statistics and storytelling, and you refrain from claiming that it’s the absolute, objective truth. Gladwell calls his articles and books “conversation starters”, and that’s not false modesty; ultimately, perhaps that’s all that even the best nonfiction writing can ever honestly aspire to be.

And here’s Andrew Anthony on Schama:

By the mid–90s, Schama was art critic for the New Yorker, had a chair at Columbia University and had written and presented two series on art for the BBC. Since then, he’s balanced his post in New York, from which he took a sabbatical to make The Story of the Jews, with making films in Britain, trailing an ever-growing army of fans in his wake.

With popularity, however, comes envy, particularly in academic circles, and Schama has not escaped the accusation that he has “dumbed down”. It’s a charge that he vehemently rejects.

To his accusers, he says he wants to say: “‘Try it, Buster. See how unbelievably demanding it is.’ Anyone can write an academic piece directed at other academics. To write something that delivers an argument and a gripping storyline to someone’s granny or eight-year-old takes the highest quality of your powers. I am completely unrepentant. One should not feel shifty.”

I’m with Schama on this one. I’ve spent a good chunk of my career trying to write for a general audience in ways that are faithful to my subjects — that don’t dumb down but rather translate from academic and intellectual terms into something like ordinary language — and it’s really, really hard. But I think it’s immensely worth doing, if you can resist the temptation to cook the data to make your story better.

I think it’s generally understood now that Gladwell does not resist that temptation, and indeed may not even realize that he’s succumbing to it. Reviewer after reviewer after reviewer — many of them experts in the fields that Gladwell dabbles in — has denounced his habit of clutching a piece of evidence that seems to support a good story and clinging to it for dear life while ignoring everything that might undermine its legitimacy or suggest a more complicated narrative. And yet he shows no signs of mending his ways. Gladwell has written some interesting and useful things, but overall he’s a very bad popularizer.

Simon Schama, on the other hand, is a very good one, and this is largely because he is an academic and can, so to speak, show his work. A book like Citizens, the one that put him on the general public’s map, is an absolutely riveting story, but it’s backed, as his notes and bibliography show, by extremely thorough research. He’s a very well-trained, highly skilled scholar who also happens to have a great gift for storytelling and a passionate love of it. It’s a rare combination, but the ideal one for the would-be popularizer.

Friday, September 27, 2013

learning with books!

So today on Twitter I asked:


I mainly got recommendations for websites, which is cool, but some of the people who recommended websites were extremely adamant that it is totally wrong to try to learn CSS from a book. “Books are probably the absolute worst way to learn tech/web/coding stuff.” “Books on a topic like this are a total waste of time.” (See what I mean by “extremely adamant”?) The denunciations of books on CSS made two points: that any book will be incomplete — which isn’t really relevant to someone just trying to learn the basics — and that books are outdated upon publication — which might be slightly more relevant, but not much. I’d be looking for a very recent book, and the CSS standards, especially for the kinds of minimal styling that I’d be interested in, aren’t changing that fast.

But you know, I could use one or more of the many online CSS tutorials out there — so why wouldn’t I? They would be free, which a book would not be; they’re instantly accessible. Seems like a no-brainer.

Except I’ve discovered from my pretty minimal past experience with coding — or the closest I’ve come to coding — that I really struggle with online guides and learn much more easily from books. Part of it is what Erin Kissane said:


But I also seem to find it visually more helpful to have a book open next to my computer rather than switch back and forth between online resources and my text editor. If I can keep my text editor open and visible at all times and then cut my eyes back and forth to the page with instructions and examples, I can stay better focused on the task — and on what’s wrong with the stuff I’ve typed. I can also highlight passages in the book, underline or annotate them, dog-ear the pages, go back and forth quickly between one section and the next…. By contrast, online tutorials are mechanistic, relentlessly linear, and controlling of my pace and my attention.

I learned most of what I know about AppleScript from a book; ditto with LaTeX; and I think I’ve had so little success learning my first real programming language, Python, because I haven’t found the right book. (I’m going to try this one next.) But I’ve never had any success at all learning from online tutorials.

YMMV, of course. Which is my chief point.

reading at speed

I’ve recently noticed a number of apps devoted to increasing reading speed: here’s a review of a couple of them. Whenever I think about speed-reading, my mind casts back to a story I read when I was a teenager — a story I discuss in my book The Pleasures of Reading in an Age of Distraction. Let me quote myself:

Consider a story by one of the great weirdos of American literature, R. A. Lafferty (1914-2002). It’s called “Primary Education of the Camiroi,” and it concerns a PTA delegation from Dubuque who visit another planet to investigate an alien society’s educational methods. After one little boy crashes into a member of the delegation, knocking her down and breaking her glasses, and then immediately grinds new lenses for her and repairs the spectacles — a disconcerting moment for the Iowans — they interview a girl and ask her how fast she reads. She replies that she reads 120 words per minute. One of the Iowans proudly announces that she knows students of the same age in Dubuque who read five hundred words per minute. (As Stanislas Dehaene explains, that’s pretty close to our maximum speed.)

“When I began disciplined reading, I was reading at a rate of four thousand words a minute,” the girl said. “They had quite a time correcting me of it. I had to take remedial reading, and my parents were ashamed of me. Now I’ve learned to read almost slow enough.”

Slow enough, that is, to remember verbatim everything she has read. “We on Camiroi are only a little more intelligent than you on Earth,” one of the adults says. “We cannot afford to waste time on forgetting or reviewing, or pursuing anything of a shallowness that lends itself to scanning.”

So you want to read faster? Are you sure?

messes

Johann Michael Bretschneider (1656-1727), Scholars in a study. Poznan, National Museum in Poznan

These new studies of the relations between messiness and creativity are really interesting, but they raise two questions for me:

1) Does the correlation between messiness and creativity persist over time? The studies seem to focus on how people respond to messy rooms that they're exposed to for short periods. But what if the messiness were everyday?

2) Again over time, does it matter whether the messiness, or neatness, is yours or someone else's? One of my favorite Malcolm Gladwell essays — I have a great many unfavorites, but never mind — is one on "The Social Life of Paper", in which he notes studies showing that "pilers," people who pile up paper on their desks, are remarkably skilled at finding their stuff, even though to an observer the desk might seem utterly chaotic. It's hard to imagine that messiness could be anything but frustrating if you had no control over it.

So: interesting studies, but I'd like to know more. (It seems like I'm always saying that.)


Thursday, September 26, 2013

the great spaces-after-a-period controversy

A few months ago, Farhad Manjoo of Slate got a lot of attention — well, in my Twitter feed anyway — by writing a post telling us “Why you should never, ever use two spaces after a period.” Why? Because “Typing two spaces after a period is totally, completely, utterly, and inarguably wrong.” The case he makes is largely historical — but guess what? Manjoo’s history is totally, completely, utterly, and inarguably wrong:

Unfortunately, this whole story is a fairy tale, made up by typographers to make themselves feel like they are correct in some absolute way.  The account is riddled with historical fabrication.  Here are some facts:
  • There were earlier standards before the single-space standard, and they involved much wider spaces after sentences.
  • Typewriter practice actually imitated the larger spaces of the time when typewriters first came to be used.  They adopted the practice of proportional fonts into monospace fonts, rather than the other way around.
  • Literally centuries of typesetters and printers believed that a wider space was necessary after a period, particularly in the English-speaking world.  It was the standard since at least the time that William Caslon created the first English typeface in the early 1700s (and part of a tradition that went back further), and it was not seriously questioned among English or American typesetters until the 1920s or so.
  • The “standard” of one space is maybe 60 years old at the most, with some publishers retaining wider spaces as a house style well into the 1950s and even a few in the 1960s.
  • As for the “ugly” white space, the holes after the sentence were said to make it easier to parse sentences.  Earlier printers had advice to deal with the situations where the holes became too numerous or looked bad.
  • The primary reasons for the move to a single uniform space had little to do with a consensus among expert typographers concerning aesthetics.  Instead, the move was driven by publishers who wanted cheaper publications, decreasing expertise in the typesetting profession, and new technology that made it difficult (and sometimes impossible) to conform to the earlier wide-spaced standards.
  • The lies do not just come from random Slate writers or bloggers, but also established typographers, who seem to refuse the clear evidence that they could easily see if they examine the majority of books printed before 1925 or so.  Even an authority like Robert Bringhurst is foolish enough not to do his research before claiming that double spacing is a “quaint Victorian habit” that originated in the “dark and inflationary age in typography” of the (presumably mid to late) nineteenth century.

I love history. Real history, like this, not fake history, like Farhad Manjoo’s. Still: I hate seeing two spaces after a period (as in the very post I am quoting). An unhistorical judgment, but my own.

Myst and its afterlife


Myst was the first computer game I bought — I had played some text-adventure games that I borrowed from friends, but didn't shell out money until I had my first computer with a color monitor (a Macintosh Performa 6116CD, if you must know). I played Myst a lot: I struggled to solve many of the puzzles but just couldn't let it go. When Riven came out a couple of years later I became fully absorbed in that too. I looked forward to playing many more games of this kind in the years to come. I waited for them to come out.

And — according to Emily Yoshida in this outstanding essay on the 20th anniversary of Myst — so too did the makers of Myst, Rand and Robyn Miller.


“We’re hearing lots of comments here on the 20th anniversary — people are going, ‘Man, I would love to see that same kind of experience [as Myst] again,’” Rand said. One of the first things that made him think there could still be a place for non-violent, open-world gaming came straight from the pages of that cultural barometer/hive mind Reddit. “On the front page, there was a [post] where somebody said, ‘Hey, I just put my grandparents in front of Assassin’s Creed in the gondola and let them sail around Venice for a couple hours.’ And then there’s a huge discussion after that — as there always is on good Reddit articles — where people are saying, ‘Yeah, why don’t people make these games? Why can’t we just explore? Why do we always have to shoot things?’ So, maybe the time is right again to try that. That’s exciting. I still think there’s plenty of room for something really cool in this genre out there. And I don’t think we’ve done it yet.”

So have we? I stopped playing computer games basically for one reason: I have no interest in shooting things (or cleaving them with a battle-axe or a light-saber). Are there exploratory, aesthetically interesting, non-violent games that I'm missing? And I mean wholly non-violent, not you-don't-have-to-do-a-lot-of-killing non-violent: games you play with no weapons at all.

Among recent games, the one that most closely meets this description is The Room, an iPad game that I absolutely loved. I'm glad to know that there will be a sequel. But I'd love to see a lot more like it.

Tuesday, September 24, 2013

prosaics of the digital life

In the best book yet written on my favorite twentieth-century thinker, Mikhail Bakhtin: Creation of a Prosaics, Gary Saul Morson and Caryl Emerson describe the influence of Tolstoy on Bakhtin, especially Tolstoy's emphasis on the cumulative effect of tiny decisions and thoughts on a person's whole life. Here's a key passage:

Levin in Anna Karenina and Pierre in War and Peace have both been troubled by the impossibility of grounding an ethical theory, and therefore of knowing for sure what is right and wrong. On the one hand, absolutist approaches not only proved inadequate to particular situations but also contradicted each other. On the other hand, relativism absurdly denied the meaningfulness of the question and led to a paralyzing indifference. After oscillating between absolutes and absences, they eventually recognize that their mistake lay in presuming that morality is a matter of applying rules and that ethics is a field of systematic knowledge. Both discover that they can make correct moral decisions without a general philosophy. Instead of a system, they come to rely on a moral wisdom derived from living rightly moment to moment and attending carefully to the irreducible particularities of each case.

I think we could say that "attending carefully to the irreducible particularities of each case" more or less is, or at the very least is an absolute precondition of, "living rightly moment to moment." Ethical action requires such mindfulness, a point that was also essential to the thought of Simone Weil, for whom attentiveness (as she called it) was the touchstone of ethical, intellectual, and spiritual action alike.

We might also connect such mindfulness with my recent reflections on the problem of adherence: the failure to adhere to one's determinations is at least in part a failure to be fully mindful about what one is doing.

It seems to me that most of our debates about recent digital technologies — about living in a connected state, about being endlessly networked, about becoming functional cyborgs — are afflicted by the same tendency to false systematization that, as Levin and Pierre discover, afflicts ethical theory. Perhaps if we really want to learn to think well, and in the end act well, in a hyper-connected environment, we need to stop trying to generalize and instead become more attentive to what we are actually doing, minute by minute, and to the immediate consequences of those acts. (Only after rightly noting the immediate ones can we begin to grasp the more distant and extended ones.)

That is, we need more detailed descriptive accounts of How We Live Now — novelistic accounts, or what Bakhtin would call prosaic accounts. We need a prosaics of the digital life.

Monday, September 23, 2013

Happy

Happy Humphrey

A while back I was writing about the mysteries of adherence, that is, why some people manage to discipline themselves in ways that they need to while others do not. I want to return here to that theme to relate a fable — but a true fable.

The man in the photograph above is William Joseph Cobb, better known in his wrestling days as Happy Humphrey. He was a very famous wrestler, though perhaps not as famous as the almost-equally-massive Haystacks Calhoun, whom he sometimes wrestled. Happy Humphrey weighed as much as 900 pounds, and his weight proved not to be good for his health. After years of trying to lose weight, and in fear of imminent death from the heart condition that had forced his retirement from wrestling, he decided to turn himself over to researchers at the Medical College of Georgia in Augusta. That is, they rather than he became responsible for his adherence to a diet.

For two years, from 1963 to 1965, Humphrey lived at the clinic, where the researchers confined him to 1000 calories per day. They cycled him through high-protein, high-carb, and high-fat diets, and came to the conclusion that he lost pretty much the same amount of weight on each, though he lost more actual fat on the high-protein diet, and felt much better also. At the end of the two years Humphrey, who had weighed over 800 pounds when he checked into the clinic, weighed in at a sleek 232.

It's important to note that during those two years Humphrey was completely confined to the clinic, and remained under supervision at all times. Also, by the time Humphrey died from a heart attack in 1989, he once more weighed over 600 pounds.

Something else caught my eye: it appears that when Humphrey left the clinic he stayed in Augusta and worked at a shoe-repair shop. Some of the websites I consulted in reading about Humphrey say that he was originally from Macon, Georgia, so he wasn't too far from home territory, but I still can't help wondering whether he wanted to remain near the place and the people who had done for him what he couldn't do for himself. As though some aura of will-power lingered in the neighborhood.

And I also can't help wondering how Humphrey felt about leaving the clinic. Was he desperate to recapture the freedoms of ordinary life? Or did he miss the peacefulness of an environment in which vital decisions were made for him? Did he ever long to return? Did he ever ask to return? How “happy” was he, really? And when?

Friday, September 20, 2013

public speaking

I don't want to do it any more. I explain why here. Feel free to dismantle my reasoning in the comments below.

on redshirting

Maria Konnikova writes about the practice of “redshirting,” that is, starting kids' schooling at a later age so they'll be among the older rather than among the younger kids in class:

On the surface, redshirting seems to make sense in the academic realm, too. The capabilities of a child’s brain increase at a rapid pace; the difference between five-year-olds and six-year-olds is far greater than between twenty-five-year-olds and twenty-six-year-olds. An extra year can allow a child to excel relative to the younger students in the class. “Especially for boys, there is thought to be a relative-age effect that persists across sports and over time,” said Friedman. “Early investment of time and skill developments appears to have a more lasting impact.” Older students and athletes are often found in leadership positions—and who can doubt the popularity of the star quarterback relative to the gym-class weakling?

It’s this competitive logic, rather than genuine concern about a child’s developmental readiness, that drives redshirting.

To that claim, I reply: How the hell do you know? Honestly, few things infuriate me more than writers who calmly assure you what other people's motives are. Are there some parents who make the decision to redshirt out of purely competitive motives? No doubt. How many? I have no idea and neither do you. The line often attributed to Rebecca West that “There's no such thing as an unmixed motive” applies well to this situation.

My son was born in August and we started him in first grade at age seven. We hadn't the least interest in competing with anyone, nor did Wes — his deeply non-competitive nature was evident from his toddler years. But he was very shy, very unsure of himself, and very small for his age. It was obvious that sports would not be his thing, so we didn't factor that into our decision-making. We just didn't want school to be a place of terror for him, or at least any more than it had to be. As it turned out, even as one of the oldest kids in his class he was also one of the smallest, and had to suffer through a good deal of bullying which was never addressed by his schools and eventually (in 7th grade) led to our decision to educate him at home. Thanks be to God, he's a healthy, happy, and somewhat-above-average-in-height young man today.

More than anything else, I wanted Wes to avoid having to go through what I went through. (As it turned out, I didn't achieve this goal, though I think I lessened the damage.) I know all about being the youngest: with my September birthday, I started school when I was five — and then skipped the second grade. The school's principal told my mother that I would have been bored and restless in second grade, which was probably true. So I was six when I started third grade, 12 when I started high school, 16 when I started college … and, people, it was terrible. My entire childhood and adolescence were miserable, largely because I was so far removed in size and maturity from my classmates. (Things got a little better when I had a massive growth spurt at age 13; before that I was the better part of a foot shorter than anyone in my class.) I was relentlessly, ceaselessly picked on by the older and larger kids who surrounded me — yes, it's called bullying: a phenomenon Konnikova seems unaware of.

Indeed, her celebration of how awesome it is to be the youngest kid in class is wholly based on studies of academic achievement: children who are younger than their classmates tend to do better academically than those who are older than theirs. For Konnikova this correlation is obviously causal: put younger kids among older ones and they get smarter; put older kids among young ones and they get dumber. It does not seem to occur to her that the causal arrows might run in the other direction: that kids get put among their elders because they're already smart, or other kids get held back and put among younger kids because they're already struggling.

But whether or not there are academic benefits to being put in class among one's elders, there are often immense social and psychological costs. I experienced them, and would give a great deal to be able to rewind my tape and avoid them. They taught me little of value, made me deeply unhappy, and helped me to develop lamentable character traits — chief among them the tendency to lie to generate interest and approval — that it took me years and years to overcome.

Now, such miseries will not happen to everyone. For instance, my friend Tim Carmody was also one of the youngest in his classes when he was growing up, but because he was big for his age — and, I don't know, maybe because of a different personality type — it wasn't a problem for him. But the potential dangers for the younger, especially if they're also the smaller, need to be factored in to any discussion of when to send kids to school. Konnikova ignores these dangers altogether in her eagerness to condemn redshirting parents as hyper-competitive gamers of the system.

The decision about when to send your kid to school is a big one, and it has very large and lasting consequences. These will vary from school to school, and family to family. So the conversation about how to weigh the various factors needs to be a nuanced one. Konnikova's simplistic smear of parents who make a choice she dislikes doesn't help at all.

Thursday, September 19, 2013

popularity sort


The problem with Instapaper's new Popularity Sort — essentially, an engine for finding out what everyone else is reading on Instapaper and suggesting that you read it too — is that it does what so many other social media do: exacerbate the gap between the attentional haves and have-nots. It's a matter of inertia: posts and essays and articles that get a little attention at first stand a good chance of getting more, and then more, and then still more, while those that don't get that little bump at first will be more likely to sink into neglect.

The net result of this kind of thing, as it becomes more dominant in online discourse — and it will indeed become more dominant — will be (a) more and more writers whoring for that initial burst of attention to get the ball rolling, and (b) a significant loss of what I'm going to call, on the model of biodiversity, graphodiversity.

So, you know, just never read something because other people are reading it. Avoid popularity sorting. Cultivate your own genuine interests and don't let them get crowded out by invasive species of popular but bad writing.

Wednesday, September 18, 2013

technology as prosthesis

As I said I would, I’ve been thinking more about Sara Hendren’s recent essay on assistive technology, and her claim that “all technology is assistive technology.” The key variable is what we’re trying to assist.

These thoughts are consonant with the view articulated by Marshall McLuhan in Understanding Media: The Extensions of Man, in which he explicitly speaks of technology in prosthetic terms, though he doesn’t, as far as I can discover, use the terms “prosthetic” or “prosthesis.” He writes, “Any extension, whether of skin, hand, or foot, affects the whole psychic and social complex,” and his goal in Understanding Media is to explore these effects.

McLuhan sees all technologies in these terms, not just electronic ones. Riffing on W. H. Auden’s identification of the 20th century as The Age of Anxiety, he writes,

If the nineteenth century was the age of the editorial chair, ours is the century of the psychiatrist's couch. As extension of man the chair is a specialist ablation of the posterior, a sort of ablative absolute of backside, whereas the couch extends the integral being. The psychiatrist employs the couch, since it removes the temptation to express private points of view and obviates the need to rationalize events.

Like so much of what McLuhan writes, this is simultaneously ludicrous — the “ablative absolute of backside”? — and provocative. But all I want to note at this point, and in this post, is that McLuhan’s description of our movement from the “mechanical age” to the “electronic age” is somewhat misleading, because it is increasingly obvious that electronic technologies do not obviate mechanical ones but enhance and supplement them. They are extensions of extensions, as the varying phenomena that go under the label internet of things indicate. People like Hugh Herr — about whom Sara Hendren has written here — are exploring technologies that establish an intersection of the mechanical, the electronic, and the biological: Herr calls this field biomechatronics.

I have no conclusions here — I don't know enough to draw conclusions — but I know that I am especially interested in how these convergences will affect our technologies of knowledge. More about that in future posts.

the narrowing world of literary fiction

I've been thinking a good deal about a recent blog post by Adam Roberts, a fine scholar and even better novelist whose work you should know. (Here are some thoughts of mine about his fascinating book New Model Army). In this post he's reflecting on the general tendency among the literati to dismiss science fiction and fantasy and, especially, young adult (YA) novels as being intrinsically less complex and innovative than what we usually call "literary" fiction.

Take the Twilight books. There are lots of ways in which these are very bad books, of course: clumsily written, derivative etc etc. BUT! They speak to and move millions, and I’m uncomfortable simply mocking that. It (the mockery) seems to me symptomatic of an attitude that defines ‘aesthetic merit’ solely in terms of stylistic or formal innovation. These novels are about something important (sex) and they write about it in an ahem penetrating way—sexual desire as a life-changing force that is at the same time something that doesn’t happen; sex as something simultaneously compelling and alarming, that draws you on and scares you away in equal measure. There are no Booker shortlisted novels that are about that. Indeed the post-Chatterley novel has taken it as more-or-less axiomatic that sex is something to be explicitly and lengthily portrayed in writing. The mainstream fiction attitude to sexual representation is ‘adult’ in the several senses of that word. I have no problem with that myself; I'm not advocating prudery, or Victorian sexual morality. I'm suggesting that that’s not actually how sex manifests in the lives of a great many people.

Or take Harry Potter, bigger even than Meyer. Formally conservative and stylistically flat novels, yes—but this series is one of the great representations of school in western culture. Perhaps the greatest. School dominates your life from 5-18; more if you go to college. When you’re 25 and reading fiction, school has been literally two thirds of your existence. It is our gateway to the adult world, our first experience of socialisation outside the family. It’s a massive thing. When do Booker shortlisted novels ever apprehend it? They don’t—the most you will get is a little background of character A’s schooltimes past, by way of fleshing out their characterisation as adults. Because it is as adults that we’re supposed to be interested in. [School is] a massive, global phenomenon. Yet where are the other great novels of school life?

Roberts is a consistently imaginative and provocative thinker, but this post is especially important. What he's showing us is that the typical story we tell about the rise of modern fiction — a story of expanding possibility, of increasing frankness about experiences, especially sexual ones, that our ancestors drew a discreet curtain across — is highly selective and hence distorting. The freedom to treat "adult" issues has led to a neglect in our fiction of experiences deemed to be less than adult. Yet, "Victorian sexual morality" notwithstanding, Charles Dickens could treat school experiences as of equal interest to adult love and loss. (Roberts: "And of course people often denigrate him — compared say, to ‘properly adult’ writers like Eliot or Thackeray — as somehow an immature figure, a child who never quite grew up. Bollocks to that. Feature not bug, people! Feature; not bug.") What if the much-celebrated "openness" of our recent fiction is simultaneously a closing-off?

All that said, I find myself thinking, Is Roberts actually correct about how thoroughly novelists of the past hundred years have neglected these experiences? There's a good deal about school in A Portrait of the Artist as a Young Man, isn't there? And I wonder if to some degree the narrative treatment of school experiences hasn't been offloaded to memoir and autobiography, as in Paul Watkins's lovely Stand Before Your God. But about the neglect of sex as something imagined and sought and feared rather than experienced, I think he must be correct. Thoughts? What have I missed?

Tuesday, September 17, 2013

on assistive technology

I was talking with some friends on Twitter the other day about the ever-shortening definition of what gets counted as a “long read” — it's enough to make me more sympathetic to those shrinking-attention-span arguments that I tend otherwise to be skeptical towards. Medium tells me, in its highly annoying way, that this essay by Sara Hendren should take me fifteen minutes to read, which I guess makes it a long read by many current definitions.

But whether you call it that or not, the key thing is this: I could read this essay in less than fifteen minutes, but it leaves me with a great deal to think about that will occupy me for considerably longer. (I won't hold my breath for Medium to estimate how long I'll be thinking about what I read there.)

Hendren's interests, here and on her website, Abler, revolve around the various forms of disability, the languages we use for disability, and the objects (with their overt or covert artfulness) we design to aid disabled people — the kind of objects that are generally lumped under the category “assistive technology.” But it's just that term that Hendren wants to call into question:

Well—it’s worth saying again: All technology is assistive technology. Honestly—what technology are you using that’s not assistive? Your smartphone? Your eyeglasses? Headphones? And those three examples alone are assisting you in multiple registers: They’re enabling or augmenting a sensory experience, say, or providing navigational information. But they’re also allowing you to decide whether to be available for approach in public, or not; to check out or in on a conversation or meeting in a bunch of subtle ways; to identify, by your choice of brand or look, with one culture group and not another.

Making a persistent, overt distinction about “assistive tech” embodies the second-tier do-gooderism and banality that still dominate design work targeted toward “special needs.” “Assistive technology” implies a separate species of tools designed exclusively for those people with a rather narrow set of diagnostic “impairments”—impairments, in other words, that have been culturally designated as needing special attention, as being particularly, grossly abnormal. But are you sure your phone isn’t a crutch, as it were, for a whole lot of unexamined needs? If the metrics were expansive enough, I think the impact of what’s designated as assistive would start to get blurry pretty quickly.

Please read the whole thing. And then think about it for however long it takes. There are a great many implications here, some of which I hope to take up in a later post.

Monday, September 16, 2013

how I horrified Ross Douthat


It was easy. I just confessed that I don't watch Breaking Bad.


Or maybe the crazy thing is that I don't watch the show but still read about it, I'm not sure — Ross seemed pretty distraught and I didn't want to insist on clarification.

But given some of the feedback I've received, maybe I should clarify my own position. First of all, an important fact I didn't mention in my previous post: until I moved to Texas a couple of months ago, I hadn't had cable TV for nearly a decade. So for much of that time my access to our era's most-celebrated TV shows (The Sopranos, The Wire, Mad Men, etc.) was iffy: only some were available on Netflix and Amazon Instant Video, and intermittently, and at variable prices. But I could have found a way, sure: so why didn't I?

The answer is very simple: often I would think to myself, Do I want to start watching Show X? — and then my answer would be, No, I think I'd rather read. That's all. That's the full explanation. When faced with a choice between watching a TV show or a movie and reading a book or even the articles in my Instapaper queue, I almost always end up choosing to read because, as I noted in my previous post, reading is just what I do.

But I'm not at all sure that my choice is always the best one. The more I learn about Breaking Bad the more I wish I had picked it up somewhere along the way, early enough that I wouldn't have a great deal of laborious catching-up to do. (And before I read so much about it.)  But that ship has sailed, I think.

I bring all this up because I think it's worth noting that over time we all develop what I might call a default medium — that is, when looking for entertainment, each of us tends to gravitate towards one medium or medium-plus-genre as the first choice. (So not just "reading" but "mystery novels" or "newspaper journalism"; not just "TV" but "nature documentaries" or "dramatic series" or "sitcoms.") Defaults can be overridden, of course, but they can be strong, and I suspect they get stronger with time.

the plain text gospel revisited

For some years now I have been a big believer in the Gospel of Plain Text: eschewing whenever possible word processors, and indeed anything in a proprietary file format that creates documents I might someday be unable to open — or could open only by paying a hefty upgrade fee to a software maker. Plain text files are very small and fully portable: they can be opened on any computer, and are as future-proof as anything in this vale of tears can be.

But still, I read with interest a recent post by Federico Viticci that points to the limits of a plain-text-only workflow:

I came to the conclusion that, in spite of the inner beauty of plain text, some things were better off as rich text or contained inside an app’s database. By saving links to interesting webpages as plain text URLs, I was losing the ability to easily tap on them and open them in a browser; by organizing tasks in a plain text list, I started feeling like a GTD hippie, presumptuously denying the higher appeal of tools like OmniFocus and Reminders. Plain text is great…until it’s not. It’s silly to deny the fact that, in today’s modern Web, we’re dealing with all kinds of rich content, which gets lost in the transition to plain text or hierarchical file structures (such as Dropbox); managing tasks in a text list is an elegant, old-fashioned concept that I’m sure Don Draper would have appreciated 50 years ago, but that, today, I wouldn’t choose over smart features like alarms, notifications, and location data in my todo app. I didn’t drop plain text altogether: I chose plain text for some tasks, and started using other apps for different purposes.

I get that — but I think Viticci is neglecting a really interesting development in recent writing software, what we might call enhanced plain text. (Maybe there's an actual name for this and I just don't know it; if so, please let me know in the comments.) For instance, I am writing this post, in plain text, in a remarkable new iPad app called Editorial. But this is what I see as I type:


In Editorial, I create plain text files but, instead of a .txt file extension, I label them .md, for Markdown, John Gruber's simple markup syntax. Editorial then provides appropriate syntax highlighting, as you can see from the image, and when I'm ready it will, with a single click, convert this document to nicely formatted HTML, ready for posting.

But, as when you view an HTML file in your browser (HTML files also being plain text), what changes here is merely the presentation, not the text itself. If I ever find myself trying to open this file in a text editor that doesn't recognize the .md extension, I just have to change it to .txt and my file will be perfectly readable. So an app like Editorial gives me the simplicity and portability of plain text with structural markup that makes that text easier for me to read and use.
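
To make that concrete, here is roughly what such a file looks like in any editor (an invented scrap for illustration, not the actual note in the screenshot above):

    # Notes on solutionism

    Morozov's *To Save Everything, Click Here* borrows from [Michael Dobbins](http://example.com)
    the point that solutionism *presumes* rather than investigates its problems.

    - follow up on the Dostoevsky parallel

The hash mark, the asterisks, the brackets, and the hyphen are just ordinary characters in a plain text file; a Markdown-aware app like Editorial renders them as a heading, italics, a link, and a list item, but rename the file to .txt and every word is still right there.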

Similarly, when I'm on my Mac I take all my notes in an app called nvALT, in which I use plain text files only — but the app recognizes links I paste or type in and makes them clickable, and in one note I can link to another one simply by placing its title in [[double brackets, like this]] — and now that title becomes clickable: a click takes me to that note. This, along with the use of tags, enables me to keep all my research for the book I'm working on highly organized and easily accessible — a major boon for me.

Or consider TaskPaper, an app that's far too little-known: it's a simple but very useful task manager, which can also serve as an outliner. In TaskPaper I can keep track of projects, tasks, and sub-tasks in a hierarchical list with clickable tags; and I can create a color scheme that keeps all these elements visually distinct from one another. And yet .taskpaper files are just text files: as with .md files, I can just change the extension to .txt with no loss of data.
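
To give a rough sense of the format (again, an invented fragment rather than one of my real lists), a .taskpaper file is nothing but lines like these:

    Book research:
        - re-read the Morozov chapters @reading
        - collect Underground Man passages @notes @done
    Blog:
        - draft the plain-text post @writing

A line ending in a colon is a project, a line beginning with a hyphen is a task, and anything after an @ sign is a tag; open the same file in any plain text editor and you still have a perfectly legible to-do list.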

And then there's LaTeX, about which I can geek out so enthusiastically that I probably shouldn't even allow myself to get started....

Anyway, I love apps that do this: that give the structural and visual appeal we typically associate with complicated and proprietary file formats while retaining the underlying simplicity and universality of plain text. So while there may be good reasons for going beyond plain text at times, those times don't come around as frequently as many people think. I'm gonna keep preaching that Gospel.

Saturday, September 14, 2013

Franz und Kraus


Most of what I have to say about Jonathan Franzen's ridiculous essay in the Guardian is communicated by the image above, and by this Hilary Kelly piece on his utter deafness to irony, but I want to add one small note.

Like many people who complain about the limitations of Twitter, Franzen seems unaware that you can write more than one tweet. If you don't get everything said in your first tweet, then you can write another one — and another after that! It's endless, actually! Rather like writing a novel, which, as I understand it, you do one sentence at a time.

For Franzen, though, frequency of publication seems to be all-important. If you reflect on your experience in tweets it's "yakking about yourself," but if you save up all those thoughts until you have a two-hundred-page memoir it's literature.

Which makes it very odd, then, that Franzen should choose Karl Kraus as his exemplar of excellence, for Kraus was a journalist whose particular writerly excellence lay in the creation of — yes — aphorisms. He produced many hundreds of these, the best of which have been collected in this book. They are not all tweet-sized, but a great many of them are — enough that I'm tempted to create a Karl Kraus Twitter account.

Kraus's aphoristic power was particularly striking to W. H. Auden, who, with Louis Kronenberger, produced an anthology of aphorisms in which Kraus features prominently and, following Kraus's example, became a devoted practitioner of the aphoristic art himself.

So it turns out that there's yet another way in which Franzen is deaf to irony. The man he sets heroically against the world of tweeters might very well have been the best tweeter of them all.

Friday, September 13, 2013

the most important Breaking Bad post of all

As the series cranks up the tension and suspense towards what will no doubt be a compelling ending, I can’t help joining in the speculation — which of course feels to me less like speculation than intuition. I’m quite confident in my ability to predict what we’re going to see at the beginning of the next episode, when the outcome of the desert shootout will be revealed. If you want, I can tell you the moment I identify as the tipping point for Walter White — the moment after which we shouldn’t have been able to root for him any more — and I can offer a pretty plausible account of the widespread hostility towards Skyler, along with my own views about whether she is or is not the Lady Macbeth of Albuquerque. I have a pretty solid theory about the whole Star Trek scene from earlier this season, in both its original version and its animated copy. I could make for you a compelling Breaking Bad color wheel. Come on, fellow fanatics, let’s talk.

Oh, one more thing: It doesn’t matter that I’ve never seen the show, does it?

Seriously, I haven’t seen a minute of it. I haven't even watched YouTube clips. However, I’ve read countless tweets and blog posts, starting about three years ago, so I’ve been following the show in something like real time. It’s been fun and instructive. I read James Meek’s superb essay on seasons 1–4, and this thoughtful reflection by A. O. Scott. And this one by Scott Meslow. And a bunch of stuff on Grantland, like this. Really good writing, for the most part.

And yet none of it has made me want to watch the show. I’ve never even considered watching it. Maybe if someone made a two-hour condensation of the whole series up to the last season, the way ABC did for Lost, I would … Nah. Who am I kidding? I don’t have the time, or, rather, I’d prefer to spend the time I have in other ways, probably by reading books.

The big, sprawling multi-season dramatic series that have received the greatest commendation in recent years — from The Sopranos to The Wire to Deadwood to Mad Men to Breaking Bad — have never seemed to me to be worth the enormous investment of time they require. The one that I followed the most closely, The Wire, is really fantastic — but I have to say, if a genie emerged from the lamp and told me that I could have all the hours spent watching The Wire back, and my memories of the show completely erased, as long as I used that time to read books, I would certainly take that deal.

That’s most emphatically not because I think written narrative intrinsically superior to filmed narrative. I don’t. It’s just that reading is the thing I do. Watching TV and movies, not so much. I’m far more likely to read about a TV show than to watch one; Breaking Bad is just the most recent illustration of that tendency. So sue me.

Oh, and in the end, Skyler is going to be the pivot on which the whole denouement turns. I could give you the details, but really, you should prefer to be surprised.

the Bodleian needs chairs


The Bodleian Library at Oxford needs new chairs, and has commissioned designs. Above are the three finalists. Like the author of that Guardian report, I strongly prefer the Barber Osgerby design on the left. It's a classically modernist shape, but with traditionalist elements I think will harmonize with its surroundings.

I post this as another installment in my ongoing love letter to libraries. Libraries need seats that people will want to sit in, and while big soft chairs are especially desirable for people who just want to read, there's still a need for good desk chairs to suit those occasions when one must summon all the research and do some serious typing — or even serious underlining and annotating. I could see myself getting good work done in that Barber Osgerby chair.

Wednesday, September 11, 2013

"internet addiction" and other fictions

Imagine a child who's on her computer all day long because she's playing World of Warcraft. Imagine another who is talking with friends and looking at photos on Facebook. Imagine a third relentless in his pursuit of hi-def porn videos. And imagine a fourth — this will be hard, I know — who is learning how to code and has found a great many like-minded people in the coding subreddit. Taken together these kids make it clear why there's no such thing as internet addiction.

“But they're online all the time and we can't get them to interact with real human beings!” their parents shout, in unison. “They're addicted to screens!” To which the proper answer is, Well, some of them are interacting with real human beings, for better or worse, and — more important — none of them is addicted to screens. They're addicted to certain experiences that they are getting access to via their computers. That is not at all the same thing.

Indeed, no one has ever been addicted to screens, or even addicted to the internet. Neither screens as such nor the internet as such has the power to enrapture people. Now, some might reply that this is a frivolous response, that such language is an easily-understood shorthand. But I would counter that it's a highly misleading shorthand, because it teaches us to conflate extremely different experiences, to place them under a single umbrella, which is exactly where they don't belong. Each of the four children I described above is moved by very different interests; and indeed, it would be easy to write more detailed stories for each of them that would show their behavior to be, in particular contexts, either more or less culpable or worrisome than my current bare-bones narration makes them sound.

So my suggestion would be this: whenever you hear that someone is suffering from “internet addiction,” just ask what, precisely, is it on the internet that he or she is addicted to. And get them to define addiction while you're at it. And treat the whole exchange as the beginning of a conversation about a person, not a definitive diagnosis.

tech intellectuals and the military-technological complex

I was looking forward to reading Henry Farrell’s essay on “tech intellectuals”, but after reading it I found myself wishing for a deeper treatment. Still, what’s there is a good start.

The “tech intellectual” is a curious newfangled creature. “Technology intellectuals work in an attention economy,” Farrell writes. “They succeed if they attract enough attention to themselves and their message that they can make a living from it.” This is the best part of Farrell’s essay:

To do well in this economy, you do not have to get tenure or become a contributing editor to The New Republic (although the latter probably doesn’t hurt). You just need, somehow, to get lots of people to pay attention to you. This attention can then be converted into more material currency. At the lower end, this will likely involve nothing more than invitations to interesting conferences and a little consulting money. In the middle reaches, people can get fellowships (often funded by technology companies), research funding, and book contracts. At the higher end, people can snag big book deals and extremely lucrative speaking engagements. These people can make a very good living from writing, public speaking, or some combination of the two. But most of these aspiring pundits are doing their best to scramble up the slope of the statistical distribution, jostling with one another as they fight to ascend, terrified they will slip and fall backwards into the abyss. The long tail is swarmed by multitudes, who have a tiny audience and still tinier chances of real financial reward.

This underlying economy of attention explains much that would otherwise be puzzling. For example, it is the evolutionary imperative that drives the ecology of technology culture conferences and public talks. These events often bring together people who are willing to talk for free and audiences who just might take an interest in them. Hopeful tech pundits compete, sometimes quite desperately, to speak at conferences like PopTech and TEDx even though they don’t get paid a penny for it. Aspirants begin on a modern version of the rubber-chicken circuit, road-testing their message and working their way up.

TED is the apex of this world. You don’t get money for a TED talk, but you can get plenty of attention—enough, in many cases, to launch yourself as a well-paid speaker ($5,000 per engagement and up) on the business conference circuit. While making your way up the hierarchy, you are encouraged to buff the rough patches from your presentation again and again, sanding it down to a beautifully polished surface, which all too often does no more than reflect your audience’s preconceptions back at them.

The last point seems exactly right to me. The big tech businesses have the money to pay those hefty speaking fees, and they are certainly not going to hand out that cash to someone who would like to knock the props right out from under their lucrative enterprise. Thus, while Evgeny Morozov is a notably harsh critic of many other tech intellectuals, his career is also just as dependent as theirs on the maintenance of the current techno-economic order — what, in light of recent revelations about the complicity of the big tech companies with the NSA, we should probably call the military-technological complex.

The only writer Farrell commends in his essay is Tom Slee, and Slee has been making these arguments for some time. In one recent essay, he points out that “the nature of Linux, which famously started as an amateur hobby project, has been changed by the private capital it attracted. . . . Once a challenger to capitalist modes of production, Linux is now an integral part of them.” In another, he notes that big social-media companies like Facebook want to pose as outsiders, as hackers in the old sense of the word, but in point of fact “capitalism has happily absorbed the romantic pose of the free software movement and sold it back to us as social networks.”

You don’t have to be a committed leftist, like Farrell or Slee, to see that the entanglement of the tech sector with both the biggest of big businesses and the powers of vast national governments is in at least some ways problematic, and to wish for a new generation of tech intellectuals capable of articulating those problems and pointing to possible alternative ways of going about our information-technology work. Given the dominant role the American university has long had in the care and feeding of intellectuals, should we look to university-based minds for help? Alas, they seem as attracted by tech-business dollars as anyone else, especially now that VCs are ready to throw money at MOOCs. Where, then, will the necessary voices of critique come from?

Monday, September 9, 2013

the libraries we need


The new central library of the city of Birmingham — England, that is, not the one in Alabama where I grew up — is pretty darn cool, I think. It's wonderful to see cities investing in big beautiful spaces in which to seek books and knowledge.


And maybe every major city needs a building like this to signal its commitment to learning and the arts. But we should probably also think about the limits, for the local population, of this kind of project. 

When I was growing up in the other Birmingham and making good use of local branch libraries, I don't know whether I even knew that a main library existed downtown; I certainly never visited it until I was in high school, and then only because a friend of mine worked there. What mattered to me was having a library within walking or biking or short driving distance from my house. 

A neighborhood library ought to be easily accessible, and ought to be a place of refuge for people who need to study or think or just relax. It should have stacks that can be browsed, with as many books and journals as the library can afford, but — and maybe I'm flirting with heresy here — only after the library is well-equipped with internet-enabled computers and a staff that knows how to help people find what they want. (And no, not everyone, even in America, has reliable internet access at home.) Every local library can be a portal to the vast multi-media resources of the Digital Public Library of America or the British Library, especially if the computers' interfaces are customized and the staff trained to emphasize the best-quality resources.

A big central library is an immensely attractive thing, as I've noted, but more intimate library spaces can be beautiful too. Consider the Julian Street Library at Princeton University:


Or, a space made on a smaller budget, the Anacostia Neighborhood Library in Washington, D.C.:


(See these and other award-winning library designs here.)

As I say, I love the big libraries — I find them irresistible. When I visit a city one of the first sights I want to see is the central library. But how many neighborhood libraries could be built, or renovated and more fully outfitted, with the money it takes to build one enormous place? We might do better by distributing our resources, and shifting them from the central city towards the periphery, towards the neighborhoods from which downtown can seem a very long way away.

Saturday, September 7, 2013

evolution and storytelling, one more time

Jennifer Vanderbes wants to make “The Evolutionary Case for Great Fiction” and begins by imagining two groups of early hominids. One group tells vivid stories and offers incisive critiques of those stories, and so survives; the other bumbles through its tales and so dies out. Read the whole thing for the details.

Okay, it's my turn to tell a story. Back in the Pleistocene there were two clans of hominids living in the same neighborhood. The leaders of one clan were careful observers of their environment: they paid close attention to the flora and fauna that surrounded them, and sought to understand the traits of the former and the behavior of the latter. They discovered which plants to avoid and which were healthy to eat; by trial and error they learned to be better hunters. Their curiosity about their world left them little time for song or story, except for narrowly mnemonic purposes; but they grew stronger and lived longer than others of their kind. They were, you might say, the first Empiricists.

And about those others — the neighboring clan: They were blessed, if that's the right word, by leaders who could act out vivid scenes and sing beautiful songs. They were, you might say, the first Artists. While the Empiricists were investigating every corner of their environment, the Artists entertained one another with their performances and lively responses, both positive and negative, to those performances. They grew so captivated by their arts that they neglected their environment. Eager to return to the clan to share songs and stories, they ate whatever came to hand. They never learned to hunt very efficiently. The food they chose was not especially nutritious and in some cases proved to be poisonous. Their lives grew short, their numbers declined, and eventually they died out.

My point, if you have not discerned it, is this: there is absolutely zero reason to think that Vanderbes's speculative narrative is any closer to the truth of what happened than mine. I'm not sure that what she does even amounts to a just-so story; it's more like randomly guessing at what might have happened. We know nearly nothing about this stage of human history. Maybe storytelling is evolutionarily adaptive, but not as adaptive as empirical attentiveness. Maybe storytelling wasn't adaptive at all in the Pleistocene, but was not sufficiently maladaptive to be expunged from the social and mental makeup of our ancestors. Maybe our Empiricist ancestors were sufficiently successful in improving our fitness as a species that there's more room now for pointless art — it doesn't endanger anyone's survival, even if it has no adaptive value. Who knows?

Even less plausible, if that's possible, is Vanderbes's belief that aesthetically superior stories — “Great Fiction” — are evolutionarily more adaptive than predictable, conventional, and highly simplistic stories. Indeed, the reverse seems more likely. Even assuming that the telling and understanding of stories are in some sense evolutionarily adaptive – a point which has never been clearly established, as far as I know – we might reasonably conclude that the stories that are readily accessible to larger numbers of a given group would be more likely to improve the survival rates of that group. The richness, depth, and complexity of our greatest works of art might be the very things that would make them comparatively useless for adaptive purposes.

Completely unsupported evo-psych guesswork like this is just not getting us anywhere. It's past time for us to acknowledge that.

Friday, September 6, 2013

my professor

On Twitter, my friend Matt Thomas has been retweeting students who tell us about their professors. (He’s just been searching “my professor” and RT'ing the more interesting results.)

According to my professor, unicorns exist. I'm running with that.

My professor is saying we are robots. Ok.

My professor told us today that we should have sex with hundreds of people & then have open marriages so we can have even more sex.

My english professor comes in 5 minutes late with a poptart singing and dancing to prince.

Also, it seems, many, many teachers have decided to lead off the semester with references to twerking, just to show that they know what’s what. I guess.

I was chatting with Matt and others about what all this tells us, and while it may give an indication of just how lame — how clueless, how vapid, how trying-too-hard, how not-trying-hard-enough — many American professors are, I’m not convinced. To be sure, I'm totally convinced that a great many professors are indeed lame in just these ways; I'm just not convinced that the tweets are reliable evidence of that fact. (I don't think Matt is convinced either, and he's not presenting the tweets to prove a particular point.)

After all, how many of these tweets can be assumed to be accurate transcriptions? I mean, maybe your professor said you should have sex with hundreds of people, but maybe you were texting someone when he prefaced that with “Some people think.” And by the time it became clear to you that he wasn’t in fact telling you to have sex with hundreds of people, if it ever did, the tweet was already out there and why take back something funny and interesting?

So I’m just wondering how much of this is a game called “Sh*t My Prof Says.” There’s no way to know, of course; but I’m wondering.

And I also think of the times in my life when I have misheard or half-heard something and then preferred the error to the truth — as did the English novelist Henry Green when he grew increasingly deaf. Once he thought an interviewer was asking him a question about suttee and was quite disappointed when it turned out that the word in question was “subtlety.” The world of real speech was never quite as lively as the one his bad hearing enabled him to imagine. Distraction can have the same effect on us, can it not?

In this context I find myself thinking of Richard Wilbur’s great poem “Lying,” which begins,

To claim, at a dead party, to have spotted a grackle,
When in fact you haven’t of late, can do no harm.
Your reputation for saying things of interest
Will not be marred, if you hasten to other topics,
Nor will the delicate web of human trust
Be ruptured by that airy fabrication.
Later, however, talking with toxic zest
Of golf, or taxes, or the rest of it
Where the beaked ladle plies the chuckling ice,
You may enjoy a chill of severance, hearing
Above your head the shrug of unreal wings.

From there he moves on to the lies of poetry, and ends with an invocation of the small event that became the Song of Roland. Forget about Twitter for a while: read it.

Thursday, September 5, 2013

enough about me

So here’s what I do, in the digital realm, to limit the powers of intermittent reinforcement and increase my powers of adherence: when I have work to do on my computer, I either disable all notifications or shut down social media (Twitter, email, IM) clients altogether.

Does this work? Variably well, and the key variable is how much I enjoy the task I need to work on. If I’m working on a book or article, I usually get sufficiently absorbed in the task that I forget social media. But if I’m, say, grading papers — which I do on my computer: I have students submit their essays as PDFs — then I get twitchy: I’m often tempted to check email or Twitter. In fact, I sometimes think I would do better if I just had the push notifications enabled, so then I would only be interrupted when something actually happened, instead of interrupting myself by wondering whether something has happened. But I’ve noticed that when I leave notifications on I get pinged just when I am actually concentrating on what a student is arguing — so no, turning them off is the best option.

I also have my computer set to auto-hide all applications that are not currently active, so when I’m writing my text editor is the only thing I can see, when I’m grading my PDF viewer is the only thing I can see, and so on.

So that’s my practice. I kind of enjoy talking about these things: productivity strategies and all that. But maybe that’s because those conversations keep me from having to think about more important and less pleasant things. Consider, for instance, a notable fact selected from the account I’ve just given: how much easier it is for me to concentrate on my own writing, my own thoughts, than on my responsibility to help my students develop their thoughts. It’s not especially discomfiting to investigate and critique what Cory Doctorow has called “your computer's ecosystem of interruption technologies”; it’s really discomfiting to realize how bored and distracted I can become when it’s not all about ME. And if I find myself less plagued by distraction than many others I know, perhaps that’s not because I am more disciplined, but because I am blessed in having a good deal of work to do that I really, deeply enjoy.

pre-tweeted for your convenience

Nick Carr:

Frankly, tweeting has come to feel kind of tedious itself. It’s not the mechanics of the actual act of tweeting so much as the mental drain involved in (a) reading the text of an article and (b) figuring out which particular textual fragment is the most tweet-worthy. That whole pre-tweeting cognitive process has become a time-sink.

That’s why the arrival of the inline tweet — the readymade tweetable nugget, prepackaged, highlighted, and activated with a single click — is such a cause for celebration. The example above comes from a C.W. Anderson piece posted today by the Nieman Journalism Lab. “When is news no longer what’s new but what matters?” Who wouldn’t want to tweet that? It’s exceedingly pithy. The New York Times has also begun to experiment with inline tweets, and it’s already seeing indications that the inclusion of prefab tweetables increases an article’s overall tweet count. I think the best thing about the inline tweet is that you no longer have to read, or even pretend to read, what you tweet before you tweet it. Assuming you trust the judgment of a publication’s in-house tweet curator, or tweet-curating algorithm, you can just look for the little tweety bird icon, give the inline snippet a click, and be on your way. Welcome to linking without thinking!

Please click through to the original and you’ll see that Nick has thoughtfully singled out his best aphoristic zingers for immediate tweetability. What a guy.

Wednesday, September 4, 2013

rational choice and your future self

In a typically superb post, Tim Burke explains the problem with trying to apply a rational-choice model to the decisions college students make about their education:

There is a lot of information that you could acquire about courses or about colleges that you could reasonably use to assemble a decision matrix. What size is the class or the college? Do you have a good reason for thinking that you flourish in small or large classes or institutions? What do you think you need in terms of knowledge or training? What kinds of environments and teaching styles do you enjoy or find stimulating? And so on — this is often information you could have, and sometimes, I agree, information that is hard to come by that shouldn’t be so hard to get.

But then think on all the things that make a difference in a class or a university that you cannot possibly know about no matter how much information you have. The friends you will make. The people you will love. The mentors who will strike a chord with you. The class that will surprise you and change your views of everything. The chance experience you have that will transform you. You can find an environment that is rich in people, in time, in resources, in the unexpected (and some colleges and classes are impoverished in all or most of those). But you can’t determine any of the specifics with all the information in the world and yet it is these specifics that create the most “added value”. Perhaps even more importantly, it’s not the person who just had the experience who will value the commodity most (or rue it most dearly), it’s the person you will be in the years to come. That person is not you in so many ways: you are today very very bad at predicting what that person needed or wanted and you always will be bad at it. If we could sue our younger selves, many of us probably would.

So the people who look and say, “Oh, just make sure there’s more information and people will make the choices that economists think they ought to make” are doomed to disappointment. Which would be fine if they would consent to just being disappointed but policy wonks tend to think that when the outcome that the models predicted doesn’t happen, the answer is to make people behave like the models said they would.

So wise. Please read it all.

Monday, September 2, 2013

adherence

Now I want to take the thoughts from my last post a little further.

Just as it is true in one sense to say “guns don’t kill people, people kill people,” though only at the cost of ignoring how much easier it is to kill someone if you’re holding a loaded gun than if you can’t get one, so also I don’t want my previous post to be read as simply saying “Tech doesn't distract people, people distract themselves.” I am easily distracted, I want to be distracted, but that’s easier for me to accomplish when I have a cellphone in my hand or lots of notifications enabled — thanks, Growl! — on my laptop.

Still, I really think we should spend more time thinking about what’s within rather than what’s without — the propensities themselves rather than what enables and intensifies them. Self-knowledge is good.

And along these lines I find myself thinking about a fascinating and provocative article in the Journal of the American Medical Association that says, basically, it’s time to stop studying the effects of various diets and debating about which ones are best because, frankly, there ain’t a dime’s worth of difference among them: “The long history of trials showing very modest differences suggests that additional trials comparing diets varying in macronutrient content most likely will not produce findings that would significantly advance the science of obesity.”

In short, such comparative studies are wasting the researchers’ time, because while countless studies have not told us anything conclusive about which diets are best, they have told us conclusively that, whatever diet you choose, the thing that really matters is whether you’re able to achieve the discipline to stick with it. Therefore, “Progress in obesity management will require greater understanding of the biological, behavioral, and environmental factors associated with adherence to lifestyle changes including both diet and physical activity.”

Adherence: that’s what matters in achieving weight loss and more general increases in health. Do you actually follow your diet? Do you actually keep to your exercise regimen? And that’s also what’s most mysterious: Why are some people able to adhere to their plans while others (most of us) are not? This, the authors suggest, is what we should be studying.

The same is true for technological addictions. Some people use apps like Freedom to try to break their addictions — which is great as long as they remember to turn the app on and resist the temptation to override it. Jonathan Franzen uses superglue to render his computer un-networkable — which is great as long as he doesn’t hunt down another computer or keep a smartphone within reach. Evgeny Morozov locks his phone and wireless router in a safe so he can get some work done — which is great as long as he actually does that when he needs to.

In all these cases, what people are trying to do — and it’s an intelligent thing to attempt — is to create friction, clumsiness, a set of small obstacles that separate the temptation to seek positive reinforcement from the giving in to that temptation: time to take a couple of deep breaths, time to reconsider, time to remind themselves what they want to achieve. But in the end they still have to resist. They have to adhere to their commitments.

Which takes us back to the really key question that the JAMA article points us to: whether it’s diet or exercise or checking Twitter, why is adherence so difficult? Why do most of us adhere weakly, like Post-It notes, rather than firmly, like Jonathan Franzen’s superglued ethernet port?

I’ll have more to say about this in another post.