Text Patterns - by Alan Jacobs

Thursday, March 31, 2011

rearranging the bookshelf

Via my editor Adam Keiper.

Wednesday, March 30, 2011

grown-ups and games

In light of my recent critique of gamification, you’ll not be surprised to learn that I loved this essay by Heather Chaplin. Here’s the conclusion:

Sometimes I feel bad for these gamification enthusiasts. Priebatsch longs to change the term valedictorian to White Knight Paladin. And McGonigal, whose games are filled with top-secret missions in which you get to play the superhero, says "reality is broken" because people don't get to feel "epic" often enough. This is a child's view of how the world works. Do adults really need to pretend they're superheroes on secret missions to have meaning in their lives?

In Reality Is Broken, McGonigal talks about a game she invented to help herself get over a concussion. SuperBetter, as she called it, involved her taking on a secret identity—Buffy the Concussion Slayer—and enlisting family and friends to call her to report on "missions." The purpose of SuperBetter, McGonigal writes, was to connect her with her support system. I felt sad when I read this. What, you couldn't just pick up the phone? You needed to jump through all those hoops just to talk to your friends?

Life is complex and chaotic. If some people need to do a little role-playing now and then to help them through the day, mazel tov. It's another thing entirely, though, to rely on role playing for human contact, or to confuse the comfort of such tricks with what's real. Having a firm grip on reality is part of being a sane human being. Let's not be so eager to toss it away.

Tuesday, March 29, 2011

conversations without boundaries

It was a busy week at Wheaton College last week. In addition to Edward Mendelson's visit, which I mentioned in a post the other day, the distinguished theologian Miroslav Volf came to town to discuss his new book Allah: a Christian Response. In the past few days he and I have been having an email conversation that touches on certain matters relevant to the concerns of this blog.

In his talk, Volf raised a simple but profound question: How do people from different religious traditions learn to live peacefully with one another in the same world? He outlined some ways in which we can be generous towards one another, and can identify some important common ground, especially among believers in the Abrahamic religions. His arguments were compelling, but they also (for me) raised a question: If it is often necessary to seek common ground, do rhetorical situations arise when it's important for a religious believer to emphasize what's unique about his or her own religion?

Miroslav's first response to this question was that emphasizing uniqueness as such is not a value for religious believers, or for Christians anyway: rather, we must strive to be faithful to our beliefs and commitments, and let uniqueness come as it will. A very wise response, I think, though one that had not occurred to me.

But then — and here's where the concerns of this blog come in — he pointed out that while one might want to speak differently in different rhetorical situations, might strive to adjust one's language to suit different audiences that have different needs, in practice we do not live in a world with "bounded" rhetorical situations. "Everyone is listening," he said, thanks to the World Wide Web, as it is accurately called, which takes what you say to one audience and broadcasts it — as text, audio, video, or all of the above — to pretty much anyone who's interested in finding it.

One of the most fundamental principles of rhetoric has always been decorum, that is, suiting one's language to occasion and audience. Those of us who teach writing typically think it vital to get our students to think in these terms — to see that they must adjust style and diction, evidence and argument, to reach the readers they most want to reach.

Such imperatives will never cease to be important. But it also seems likely that we will have to train students to be aware — and will have to train ourselves to be aware — that much of what we say and write can find audiences we never intended. And the consequences of our words' extended reach will not always be positive ones.

Increasingly, these will be matters of import for everyone. But given the intensity of feelings that people have for what Paul Griffiths calls the "home religion," religious believers whose lives have a public dimension should be especially thoughtful, careful — and prayerful.

Monday, March 28, 2011

Bogost on blogs

Here’s a wonderfully thoughtful post by Ian Bogost about the limitations of the blog as an intellectual tool, especially in academic contexts. This is an old theme of mine, so it’s nice to have someone pick up on it. Bogost writes,

Tim Morton is right to call out old forms like books and academic essays, rejoining [“exhorting,” maybe?] them to “figure out what they are about in this new environment.” But the same is true for blogs and other forms of digital writing as well. We’re no more stuck with the awkward tools that are blogs than we are stuck with awkward tools that are journals. . . .

I wonder what a writing and discussion system would look like if it were designed more deliberately for the sorts of complex, ongoing, often heated conversation that now takes place poorly on blogs. This is a question that might apply to subjects far beyond philosophy, of course, but perhaps the philosopher’s native tools would have special properties, features of particular use and native purpose. What if we asked how we want to read and write rather than just making the best of the media we randomly inherit, whether from the nineteenth century or the twenty-first?

I wish these were the sorts of questions so-called digital humanists considered, rather than figuring out how to pay homage to the latest received web app or to build new tools to do the same old work.

This is great stuff. Blogs are very poor tools for fostering genuine intellectual exchange, which is one reason why, increasingly, those exchanges happen for many on Twitter — despite the 140-character-at-a-time limit. We might ask why that is: Why do so many people prefer to exchange ideas on Twitter rather than on blogs? I don't think it’s just laziness. And then we might ask another question: What might a tool look like that combines the best features of blogging and tweeting, while minimizing the flaws of both instruments?

Friday, March 25, 2011

and while I'm co-signing

This is exciting to hear about:

Joseph Cohen says he’s fed up with Blackboard. The leading course-management software is overloaded with features and dreadfully designed, making simple tasks difficult, says Mr. Cohen, a student at the University of Pennsylvania’s Wharton School. . . . Mr. Cohen and a classmate, Dan Getelman, have launched Coursekit, a stripped-down online learning-management system that offers a discussion board, a calendar, a syllabus, and related resources for courses at Penn. Mr. Cohen says he hopes Coursekit’s simple interface and Facebook-inspired tools will help make online discussions in a course as social as the course itself.

I hope Coursekit flourishes. Blackboard is a terrible, terrible, terrible system: bloated, ugly, confusing. The Blackboard motto seems to be, “Why Do Something in Two Clicks When You Can Do It In a Dozen?” I don't know anyone who uses it for one minute more than absolutely necessary. But something like Blackboard would clearly be valuable to teachers and students everywhere. Since Blackboard ate WebCT (which was equally bad), competition in this arena has been badly needed. Maybe Coursekit can provide it.

I co-sign this proposal

Siva Vaidhyanathan:

We have the technological systems in place to connect the vast majority of people in the world with much, if not most, of the greatest collections of knowledge. We have impressive digital databases. We have millions of hours of sound recordings. We have 100 years of film and video available. We have, of course, millions of books. . . . 

We lack only one thing: the political will to fight for a great and noble information system—a global digital library. I'm not talking about the haphazard rush we've seen to date to digitize the stacks of major research libraries. Nor a commercial venture like Google's. I'm proposing what I call the "Human Knowledge Project" in my book, The Googlization of Everything (And Why We Should Worry). What I mean is a truly global digital library. To generate support for that, we need to identify the political and legal constraints, as well as articulate the payoffs. 

That entails a formidable series of tasks. It might take 10, 20, or even 50 years. But there is no reason we should settle for expediency at the expense of excellence. After a few conversations, we might decide it's not worth the effort or cost. But at least we would have tried. And that's so much healthier than waiting for the Big Rich Magic Company in the Clouds to do all this for us—on its terms.

Thursday, March 24, 2011

free advice

The other day Edward Mendelson was here at Wheaton, speaking to my Modern British Literature class about Auden and then, in the evening, delivering a spellbinding lecture on persons and categories in Homer. A really fine time was had by all.
Needless to say, a cell phone went off ten minutes into the talk. That always happens, though, doesn't it? Have you ever been to a lecture when that didn't happen?
But then, half-an-hour into the lecture, a few students and a couple of off-campus visitors strolled in and, in a quite leisurely fashion, found their way to seats in the middle of the room. They then began to chat with their neighbors because (I later learned) they were coming to hear a female philosopher deliver a lecture on Anselm and were disconcerted to hear a male literary critic talking about Homer. It appears that some other philosophy lectures had been given in that room, so they figured that one would be too, though they had received handouts in their classes telling them where the lecture actually was to be held. Eventually they got up and wandered out.
A little later someone came in at the door near which I was sitting. He had two styrofoam containers in his hands, which he handed to the students sitting next to me. He then turned and walked out. The students opened the containers and munched away quite contentedly. Even when the lecture was over they still sat there dragging French fries through ketchup.
So I wrote this.

Monday, March 21, 2011

gaming the system

Oliver Burkeman writes:

[Seth] Priebatsch's declared aim is to "build a game layer on top of the world" – which at first seems simply to mean that we should all use SCVNGR, his location-based gaming platform that allows users to compete to win rewards at restaurants, bars and cinemas on their smartphones. (You can practically hear the marketers in the room start to salivate when he mentions this.)

But Priebatsch's ideas run deeper than that, whatever the impression conveyed by his bright orange polo shirt, his bright orange-framed sunglasses, and his tendency to bounce around the stage like a wind-up children's toy. His take on the education system, for example, is that it is a badly designed game: students compete for good grades, but lose motivation when they fail. A good game, by contrast, never makes you feel like you've failed: you just progress more slowly. Instead of giving bad students an F, why not start all pupils with zero points and have them strive for the high score? This kind of insight isn't unique to the world of videogames: these are basic insights into human psychology and the role of incentives, recently repopularised in books such as Freakonomics and Nudge. But that fact, in itself, may be a symptom of the vanishing distinction between online and off – and it certainly doesn't make it wrong.

Note the covert assumption here that, while we can totally reconfigure how we evaluate student performance, we can’t think of it as anything but “performance,” and we can't resist the student tendency to think in terms of competition for grades or professorial approval.

In this case I think gamification would simply make a fundamentally unhealthy, counterproductive way of thinking somewhat more fun — at least for those who thrive on competition. (For those who dislike competition, and there are more such people than is commonly realized, it would just make things worse.) I’d rather see if we can re-think our educational system to limit or channel the ethos of competition — which, I grant, would be much harder than game-ifying it.

The Information (3): information and its near relations

One of my favorite passages in The Information comes when Gleick describes a series of conferences held, starting in the late 1940s, at the Beekman Hotel in New York. “A host of sciences were coming of age all at once — so-called social sciences, like anthropology and psychology, looking for new mathematical footing; medical offshoots with hybrid names, like neurophysiology; not-quite-sciences like psychoanalysis — and [neurophysiologist Warren] McCulloch invited experts in all these fields, as well as mathematics and electrical engineering.” In addition to Claude Shannon, participants included Norbert Wiener, John von Neumann, Margaret Mead, Gregory Bateson, and others.

There's an interesting moment (p. 248) when Shannon is speaking and just can't get people to focus on information as such — they keep wanting to get into semantics, meaning. (As I read this I found myself thinking of the way Neal Stephenson distinguishes between semantic and syntactic Faculties of philosophy in his novel Anathem.)

The Information is to some degree about this resistance, this inability that non-Shannon human beings have to see communication solely in terms of information transfer. But Gleick doesn't address this point as directly as I think he should: he tends instead to allow the confusions and elisions to be present in his narrative. Which is to some degree defensible, since that’s what real life has been like.

But let’s make some distinctions:

  • information: defined (in multiple ways) here
  • data: information recognized by humans as information
  • knowledge: information sorted by humans and translated into human terms
  • wisdom: the proper discerning of the human uses of knowledge
  • counsel: wisdom transmitted to others
     
That last point I decided to add after reflecting on Walter Benjamin’s great essay “The Storyteller” (PDF here).

Friday, March 18, 2011

the book surgeon

all the news that's fit to read

In the past year or so, as more and more websites — of all kinds — have acquired Twitter feeds, my daily newsreading habits have shifted: whereas I once began the day by going through a large collection of RSS feeds, now I start with Twitter. And as I have added Twitter feeds, I've noticed a good deal of redundancy: sites giving me links to their new posts through RSS and Twitter alike. I responded to this phenomenon by purging my RSS feeds, ultimately leaving in my RSS reader only those sites that don't have Twitter feeds, and making Twitter my chief portal for news as well as conversation with friends.
And you know what? This doesn't work so well. Twitter doesn't handle news as well as RSS, largely because of the 140-character limit. Given so little information, I often can't tell whether a story is worth reading or not, so — because I don't want to miss out on something awesome! — I often end up clicking through to stories that prove not to be especially interesting or informative. RSS, by contrast, typically gives me either a complete story or a full first paragraph, so it's a much more efficient conduit, leading to fewer unnecessary click-throughs.
Also, while for conversations I might want Twitter to refresh frequently, for news that's not necessary — unless it's breaking news, in which case what you want is not your regular stream but searches by relevant hashtag. Setting the RSS reader to refresh every hour at most, and the Twitter client more frequently, is the way to go. For me anyway.
Fortunately, before I started trimming my RSS feeds I made and stored a copy, as an OPML file, of my list when it was at its largest. So I'm restoring that, and cutting back my Twitter feed largely to friends. Twitter is great for conversations; RSS is better for the daily news.
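For anyone who keeps a similar OPML backup of their subscriptions, here is a minimal sketch of how to see what's in one, assuming a standard OPML export in which each feed is an outline element carrying an xmlUrl attribute; the file name "subscriptions.opml" is just a placeholder, not a reference to my actual file.

import xml.etree.ElementTree as ET

def list_feeds(opml_path):
    # Parse the OPML export and collect (title, feed URL) pairs.
    tree = ET.parse(opml_path)
    feeds = []
    for outline in tree.iter("outline"):
        url = outline.get("xmlUrl")  # only feed entries carry xmlUrl
        if url:
            feeds.append((outline.get("title") or outline.get("text") or "", url))
    return feeds

if __name__ == "__main__":
    for title, url in list_feeds("subscriptions.opml"):
        print(title, "->", url)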

Thursday, March 17, 2011

The Information (2)

In the early pages of The Information, Gleick writes a good deal about communication: African talking drums, for instance, and telegraphy. Someone wants to say something to another person, perhaps a distant person; how can that be accomplished? Only over much time, Gleick (implicitly) argues, does it become clear that the problem is one of information. And, it turns out, many other problems are problems of information also:

What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level — an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions. . . . If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.”

Then, in a curious turn, near the end of the book Gleick starts talking again about communication. The protagonist of The Information, insofar as it has one, is clearly Claude Shannon, the most important figure in modern history that hardly anyone has heard of, because it was Shannon who defined information and isolated it from other terms with which it is often confused. But Gleick seems to be contemplating near the end of the book the price we pay, or can pay, for Shannon’s world-changing insights: “The birth of information theory came with its ruthless sacrifice of meaning—the very quality that gives information its value and its purpose. Introducing The Mathematical Theory of Communication, Shannon had to be blunt. He simply declared meaning to be ‘irrelevant to the engineering problem.’ Forget human psychology; abandon subjectivity.” None of that matters to “the engineering problem.”
This seems to make Gleick uncomfortable, but in ways that he never quite sorts out. I think I will want to return to this point.
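For readers who haven't met Shannon's definition, its core can be stated in one line. A source that emits symbol i with probability p_i carries, on average,

H = -\sum_i p_i \log_2 p_i

bits per symbol. Notice what the formula depends on: only the probabilities of the symbols, never what they mean — which is exactly the "ruthless sacrifice of meaning" Gleick is describing.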

Tuesday, March 15, 2011

a problem of distance

Increasingly often, these days, I find myself picking up a magazine I subscribe to, starting to read, and then putting it aside with a sigh. The problem is simply that I do not see as well as I used to: as I've gotten older my eyesight has gotten worse in complicated ways, and the optometric arts only imperfectly compensate for these changes.
This creates more difficulties for me when reading magazines than when reading books, because the print is often smaller in magazines, and magazines, being larger than books, especially paperbacks, can be more awkward to hold. I can usually hold a paperback in one hand and move it closer or farther away, as necessary, until I find the right distance; and then when my hand gets tired I can switch to the other. Hardcovers are heavier and more awkward to hold, but their solidity allows them to be partially propped up — on my chest as I'm lying down, for instance — which makes the task of reading less of an upper-body workout. (Though Lord knows I need more of an upper-body workout, I don't want to combine that with the act of reading.) But magazines are floppy, especially if they're large-format — as are some of my favorites, including Books & Culture and The New York Review of Books — and I have to hold my hands farther apart to read them . . . It's really starting to bug me.
And of course this is happening because the Kindle and the iPad have provided an alternative reading ergonomic: if book or magazine print seems small to me now, and I have to adjust my arm position to find the right distance from my eyes, I can never now forget that it is much, much easier to hold the thing to be read wherever it feels comfortable and simply adjust the size of the type. This knowledge that there is a Better Way intensifies the annoyance but is also a distraction — instead of focusing on the article to be read I'm thinking, "Why isn't there a Kindle version of this magazine?" or "Now, why exactly did I subscribe to the print edition of this magazine?"
There are a great many people like me in this respect, and there will be more of them in the coming years. Any periodical that doesn't have a very clear, very fast-developing plan to make itself available to e-reader and iPad users is going to be in a lot of trouble, and soon.

Monday, March 14, 2011

mine and yours

Why do social media always have to be about social competition? Everyone on Facebook is aware of how many friends they have in relation to how many friends their friends have. On Twitter people celebrate follower milestones: five hundred, a thousand, ten thousand. For a while Tumblr was defaced by a comparative ranking called Tumblarity, which has now disappeared, I hope for good (though Tumblr still tells you how many followers you have).
In light of all this, consider Peer Evaluation, a tool for "empowering scholars." What's it all about? The home page says,
  • Promote and enjoy real-time Open Access to research
  • Share primary data, working papers, books, media links...
  • Receive feedback and reviews from your peers
  • Expose your work to those that matter
  • Aggregate qualitative indicators about your impact
  • Drive, build and share your online reputation
All (potentially) very cool, until those last couple of bullet points, the key ones really — yes: "your online reputation." Because that's what it's all about, isn't it? As Cassio laments in Othello, "Reputation, reputation, reputation! Oh, I have lost my reputation! I have lost the immortal part of myself, and what remains is bestial." Well, we don't want that to happen, do we? So let's consult the site's "Reputation Dashboard" to find out where we stand in the great striving for attention that infects, it would appear, every form of social media.

The Information (1)

As I understand it, the principal virtue of popular science writing — indeed writing for a general audience about any technical subject — lies in its ability to take the work of scholars and make it accessible to non-scholars, without fundamentally falsifying it. Twenty percent of the way into The Information, I find that James Gleick is doing little more than retelling stories already told by other popular writers. His account of Charles Babbage’s early mechanical calculators covers ground well-trod by others, including The Difference Engine: Charles Babbage and the Quest to Build the First Computer (2002), by Doron Swade. Almost everything he says about the telegraph will be familiar to anyone who has read Tom Standage’s The Victorian Internet (1998). The material on the relationship between orality and literacy — with its inevitable nods towards the paradoxes of Plato’s writing about that apostle of the spoken word, Socrates — has been covered thousands of times, most recently (at roughly the same length) in Nick Carr’s The Shallows. Of course, not everyone will have read all, or any, of these books — but even for those folks, Gleick’s narrative might well seem bland and colorless, and lacking a strong thematic center. Very much journeyman work so far; I had expected much better. And I still hope for better as I move along.

Saturday, March 12, 2011

they've got us where they want us

Disturbances in the Twitterverse: first Twitter releases a new iPhone client that prominently features trends — including “promoted” trends, that is to say, ads — and offers no way to hide them. (A new release makes the trends appear in a slightly less annoying way, but they are still mandatory.) Then Twitter issues new guidelines to developers that — as far as I can tell — pretty much eliminate further development of third-party clients: “Developers ask us if they should build client apps that mimic or reproduce the mainstream Twitter consumer client experience. The answer is no. . . . We need to move to a less fragmented world, where every user can experience Twitter in a consistent way.” And that seems to mean, through the Twitter website and through Twitter’s own clients for Mac, Windows, iOS, Android, and so on, with their prominently featured “promoted trends.”

Clearly this has nothing to do with "user experience": the people who run Twitter are casting around for ways to make money, which is understandable, and this is what they have settled on. I can’t say that I totally blame them: their investors are surely demanding return on their (hefty) investments. Twitter has got to be enormously expensive to run. But here we see the chief problem that arises when a major new form of communication — what, surprisingly to everyone, turned out to be a major new form of communication — consists of proprietary technology completely controlled by a single company. Imagine if Google had invented and owned email, so that you could only use email by navigating to the Gmail site and dealing with whatever ads Google chose to feature; so that Google had absolute control over your user experience; so that if Google’s servers went down the entire communicative ecosystem went down. That’s the situation the bosses at Twitter are clearly trying to create.

Again, it’s their technology and they can do what they want. Maybe the next step will be an ad-free, trend-free Twitter experience for a monthly fee; I wouldn’t be surprised, and I would have nothing legitimate to complain about. But Twitter has become so central to many people’s lives that it feels like a public utility, and the ads therefore feel like an unwarranted intrusion — as though we had to listen to 30-second pitches for dishwashing detergent before being able to complete a phone call. (And don't think I'm unaware that we've been here before. I've read The Master Switch.)

Just as Diaspora is being created as an open-source alternative to Facebook, in response to the rather more blatant and consistent tyrannies of the Zuckerbergian empire, these recent developments will prompt renewed attention to open-source and/or distributed alternatives to Twitter, like this one and this one.

But there’s a problem — probably an insurmountable one: these kinds of services only work when pretty much everyone you want to know is on them. Nobody wants to go back and forth between different Twitter-like services or Facebook-like services, trying to remember which friend is on which service. (I rarely remember to log in to Diaspora, and when I do, I find that my tiny handful of friends haven’t visited either.) In such an environment, what’s called for is some powerful aggregating technology that would allow us to have a single conduit through which we could see what all our friends are up to. But this is of course precisely what Facebook and Twitter are refusing to allow. In effect, they’re saying “You get what we offer the way we want to offer it, along with all your other friends, or you’re out in the cold — the silent, still, radically un-social cold. Deal with it.”

Thursday, March 10, 2011

evaluated

My colleague Heather Whitney has a post up at ProfHacker about one aspect of student evaluations: their occasional lack of truthfulness. Let me add my two cents:

A year ago, as some readers of this blog may recall, I spent some time in the hospital. My classes that semester met on Tuesdays and Thursdays, and as a result of illness I missed four classes: two weeks total, in a fourteen-week semester. A lot to miss! — but perhaps not enough to warrant students writing on their evaluations, “It’s hard to evaluate this class because we hardly ever met,” or “It’s not Dr. Jacobs’s fault for being sick, but the fact that he missed most of the semester really hurt the success of this class.”

I also recall, a couple of years back, some students complaining on their evaluations that they got poor grades on their papers because I didn’t give them any guidance — even though before turning in the final draft of their papers they had to submit (a) a proposal for the paper, to which I responded in writing with detailed suggestions, and then (b) a complete draft of the paper, to which I also responded in writing. If this was not “guidance,” I wonder what would have been.

I can only explain this phenomenon — which is consistent among a minority of students — by speculating that some students think that evaluations are an opportunity not for them to speak truthfully, according to any naïvely “objective” or dispassionate model of truth, but rather for them to share their feelings — whatever those feelings happen to be at the moment. And at the end of a long and stressful semester, those feelings will sometimes be rather negative. This is one of the many reasons why student evaluations as they are typically solicited and offered are useless, or worse than useless — and I speak as someone with a long history of very positive student evaluations.

Thus for a long time I have recommended, to anyone who will listen and to many who will not, that evaluations for a given course be solicited at least one full semester after the course is completed, when students are less emotionally involved in it. A year or more after would be even better. We might get fewer responses, especially from students who have graduated, but they would be better responses.

Whenever I make this suggestion, the first response I get is always the same: “But a semester [or a year] later, they won’t remember anything from the class!”

“That would be something worth knowing,” I reply.

Monday, March 7, 2011

information about The Information

So, remember when I was saying that I might just re-read books for the rest of the year? Seemed like a good idea — but I had forgotten that I had pre-ordered Crazy U for my Kindle. And when it showed up I couldn't help devouring it. Ah, well, some kind of road is paved with good intentions, I don't remember exactly what kind. I’m not worried, though.

It’s Spring Break around here, so I’m taking the rest of the week off from the blog, and will try to be as useless as possible. Though I do need to finish this big essay on the maddening Marshall McLuhan.

When I get back, I’m going to be blogging my way through a book again — and yes, it’s a new book, one I haven't read before. But I can’t resist this one: it’s James Gleick’s The Information: a History, a Theory, a Flood. You are all welcome to get started on it and read along with me.

Crazy U

Let me tell you a few things about Andy Ferguson’s new book Crazy U: it’s well-researched, insightful, thought-provoking, and sometimes hysterically funny. He’s good on everything: college admissions standards, evaluation of candidates, financial aid, you name it. And he links the themes together in sometimes unexpected ways.

Consider for example this passage, from a section on how the admissions policies of Ivy League universities have changed over the years:

“In a way you had more human diversity in the old Harvard,” a friend once told me, after a lifetime of doing business with Harvard graduates. His attitude was more analytic than bitter, however. “It used to be the only thing an incoming class shared was blue blood. But bloodlines are a pretty negligible thing. It allows for an amazing variety in human types. You had real jocks and serious dopes, a few geniuses, a few drunks, a few ne’er-do-wells, and a very high percentage of people with completely average intelligence. Harvard really did reflect the country in that way back then.

“You still have a lot of blue bloods getting in, multigeneration Harvard families. But now a majority of kids coming into Harvard all share traits that are much more important than blood, race, or class. On a deeper level, in the essentials, they’re very much alike. They’ve all got that same need to achieve, focus, strive, succeed, compete, be the best—or at least be declared the best by someone in authority. And they’ve all figured out how to please important people.”

Harvard grads disagree with this, of course. They like to say that the new Harvard represents the triumph of meritocracy. No, my friend said. “It’s the triumph of a certain kind of person.”

Then, some pages farther on, Ferguson is discussing the lamentable “Me essay,” the tell-us-everything-deeply-personal-about-yourself essay most colleges ask their applicants to write, and in that context he asks,

But what qualities does the Me Essay measure? If they were trying to capture the ability to write and reason, this could be accomplished by less melodramatic means. No, the admissions essay rewarded personal qualities beyond mathematical reasoning and verbal facility. Some of the traits were appealing enough, in appropriate doses. Refreshingly effusive kids, admirably enthusiastic kids, the all-American Eddie-attaboys might very well thrive on the essay. But it would also reward other characteristics, like narcissism, exhibitionism, Uriah Heep–ish insincerity, and the unwholesome thrill that some people get from gyrating before strangers. Which of these traits, I wondered, predicted scholarly aptitude or academic success? I saw it at every turn, as my friend had said of Harvard: the system “privileged” a certain kind of kid. And if you weren’t that kind of kid the best course was to figure out how to pretend you were.

Suddenly you see how the whole system of college admissions coalesces around, not just ambition, but a particular kind of ambition — something far more social than intellectual. It’s kind of nauseating, to be honest — though perhaps I feel that way because I’ve just helped to shepherd my son through college applications. But whether you’ve got college-aged kids or not, and even if you don't have kids and don't even plan to have them, you ought to read Crazy U. It’s a first-rate piece of popular cultural criticism, and it’s very, very funny.

Friday, March 4, 2011

three dimensions in two

I've been chatting recently with some Twitter friends about possibly doing a conference panel together, and we've been considering an exploration of skeuomorphs. This got linked to a discussion of three-dimensionality — a Tim Carmody favorite — and it occurred to me that many skeuomorphs are in fact attempts to mimic three-dimensionality. When I'm reading a book in the Kindle app on my iPad and the pixels of my screen arrange themselves to resemble a page turning, the effect is supposed to be a trompe-l'oeil of three-dimensionality: the page is coming up towards me before turning over. Similarly, some features of the design of the iCal app on the iPad are meant to resemble the embossings and indentations of old-timey cardboard or leather desk calendar sets.
In the middle of this discussion my friend Matt Frost sent me this:
Well, there you go.

answers to important questions

The other day a homeschooling parent, whose child is in the ninth grade, wrote to me to ask what books I thought are essential for a young person to have read before coming to college. My reply:

For what it's worth, I don't think what a young person reads is nearly as important as how he or she reads. Young people who learn to read with patience and care and long-term concentration, with pencil in hand to make notes (including questions and disagreements), will be better prepared for college than students who read all the "right" books but read them carelessly or passively.

Thursday, March 3, 2011

reading cues in three dimensions

I tend to be annoyed by evolutionary just-so stories (“Let’s see, how can whatever I’m doing at the moment be explained by my hunter-gatherer ancestors?”) but this one I like, perhaps because it addresses an ongoing problem for me:

Something's been bothering me ever since I started reading books, especially non-fiction, on my Kindle:

I can't remember where anything is. Physical books are full of spatial reference points; an especially beloved book is a physical topography in which we develop a vague sense of which chapters contain relevant information; even where, on a page, a particularly striking sentence or diagram lies.

Ebooks have none of these referents. They're searchable (or at least, some are) which mitigates this issue somewhat. But I'm unlikely to remember that a fact was at "41% through a book" for one simple reason: my hands never got a chance to find out what 41% through a particular ebook feels like.

This isn't to say that physical books are perfect — perhaps if we read off of giant scrolls laid out across a gymnasium floor, I'd have an even better memory of where I saw a fact: "upper left quadrant, approximately the fourth row..." or something like that. And perhaps some day a virtual interface for reading will give me those kinds of spatial referents.

But in the meantime, millions of years of evolution are going to waste. It's no secret that mnemonists — the mental athletes of the world of competitive memorization — use tricks like placing facts and sequential information on the walls of mansions they imagine walking through. And why? Because our brains are exquisitely well-tuned to remember where things are. Exactly what you'd expect from a species with a migratory, hunter-gatherer past; a species that re-applied those abilities to the navigation of cities long after it settled into an agricultural pattern.

Whether the evolutionary explanation for knowing how to find things in books is a good one or not, the relative lack of spatial cues in e-reading is a genuine lack. My internet friend Tim Carmody has been talking on Twitter lately about three-dimensionality, and here’s a great instance of how and why three-dimensionality works for us.

Wednesday, March 2, 2011

the social trap

Andrew Keen:

Today's digital social network is a trap. Today's cult of the social, peddled by an unholy alliance of Silicon Valley entrepreneurs and communitarian idealists, is rooted in a misunderstanding of the human condition. The truth is that we aren't naturally social beings. Instead, as Vermeer reminds us in The Woman in Blue, human happiness is really about being left alone. On Liberty, the 1859 essay by Bentham's godson and former acolyte, John Stuart Mill, remains a classic defence of individual rights in the age of the industrial network and its tyranny of the majority. Today, as we struggle to make sense of the impact of the internet revolution, we need an equivalent On Digital Liberty to protect the right to privacy in the social-media age. Tapscott and Williams believe that the age of networked intelligence will be equal to the Renaissance in its significance. But what if they are wrong? What if the digital revolution, because of its disregard for the right of individual privacy, becomes a new dark ages? And what if all that is left of individual privacy by the end of the 21st century exists in museums alongside Vermeer's Woman in Blue? Then what?

But this is just replacing one untenable generalization with another. As everyone knows, surely, we want and need to be alone sometimes and to be with others sometimes. Mental and spiritual health is found in the proper balance of the two, which must be different for different people. Even the most solitary crave connection sometimes; even the most social need time alone to re-charge. It would be hard to deny that the internet has been a profoundly rich and healing place for many lonely people. The questions are, or should be: What is the right balance for me? and, Have I achieved that balance?

As I commented in a post some months ago, advocates of the Big Social like Steven Johnson think we have been too solitary and too disconnected and need — as a society anyway — to move towards more connection, because connection generates ideas. My response to that was that (a) the generation of ideas is not the only social good, and (b) it’s hard to look at the proliferation of social media that I and my friends are occupied with and think that we don't have enough connections. My suspicion is that we need fewer connections, but (as I suggested in my previous post) more multi-dimensional ones. Keen overstates the value of privacy and under-rates the value of social connection, but he’s surely right to think that solitude is right now the endangered condition.

Tuesday, March 1, 2011

once more with feeling

Just a note: So far this year I have re-read several books, and I am giving serious consideration to making this The Year of Re-Reading. I will have to read some things I haven't read before because of reviewing commitments — David Foster Wallace's forthcoming The Pale King, for instance — but I think I may commit to making all my elective reading re-reading.
I do like shiny new things, even in book form, so I am hesitant — but if I do embark on this experiment it ought to be interesting. For me, anyway.