Thursday, March 31, 2011
Via my editor Adam Keiper.
Wednesday, March 30, 2011
Sometimes I feel bad for these gamification enthusiasts. Priebatsch longs to change the term valedictorian to White Knight Paladin. And McGonigal, whose games are filled with top-secret missions in which you get to play the superhero, says "reality is broken" because people don't get to feel "epic" often enough. This is a child's view of how the world works. Do adults really need to pretend they're superheroes on secret missions to have meaning in their lives?
In Reality Is Broken, McGonigal talks about a game she invented to help herself get over a concussion. SuperBetter, as she called it, involved her taking on a secret identity—Buffy the Concussion Slayer—and enlisting family and friends to call her to report on "missions." The purpose of SuperBetter, McGonigal writes, was to connect her with her support system. I felt sad when I read this. What, you couldn't just pick up the phone? You needed to jump through all those hoops just to talk to your friends?
Life is complex and chaotic. If some people need to do a little role-playing now and then to help them through the day, mazel tov. It's another thing entirely, though, to rely on role playing for human contact, or to confuse the comfort of such tricks with what's real. Having a firm grip on reality is part of being a sane human being. Let's not be so eager to toss it away.
Tuesday, March 29, 2011
In his talk, Volf raised a simple but profound question: How do people from different religious traditions learn to live peacefully with one another in the same world? He outlined some ways in which we can be generous towards one another, and can identify some important common ground, especially among believers in the Abrahamic religions. His arguments were compelling, but they also (for me) raised a question: If it is often necessary to seek common ground, do rhetorical situations arise when it's important for a religious believer to emphasize what's unique about his or her own religion?
Miroslav's first response to this question was that emphasizing uniqueness as such is not a value for religious believers, or for Christians anyway: rather, we must strive to be faithful to our beliefs and commitments, and let uniqueness come as it will. A very wise response, I think, though one that had not occurred to me.
But then — and here's where the concerns of this blog come in — he pointed out that while one might want to speak differently in different rhetorical situations, might strive to adjust one's language to suit different audiences that have different needs, in practice we do not live in a world with "bounded" rhetorical situations. "Everyone is listening," he said, thanks to the World Wide Web, as it is accurately called, which takes what you say to one audience and broadcasts it — as text, audio, video, or all of the above — to pretty much anyone who's interested in finding it.
One of the most fundamental principles of rhetoric has always been decorum, that is, suiting one's language to occasion and audience. Those of us who teach writing typically think it vital to get our students to think in these terms — to see that they must adjust style and diction, evidence and argument, to reach the readers they most want to reach.
Such imperatives will never cease to be important. But it also seems likely that we will have to train students to be aware — and will have to train ourselves to be aware — that much of what we say and write can find audiences we never intended. And the consequences of our words' extended reach will not always be positive ones.
Increasingly, these will be matters of import for everyone. But given the intensity of feelings that people have for what Paul Griffiths calls the "home religion," religious believers whose lives have a public dimension should be especially thoughtful, careful — and prayerful.
Monday, March 28, 2011
Tim Morton is right to call out old forms like books and academic essays, rejoining [“exhorting,” maybe?] them to “figure out what they are about in this new environment.” But the same is true for blogs and other forms of digital writing as well. We’re no more stuck with the awkward tools that are blogs than we are stuck with awkward tools that are journals. . . .
I wonder what a writing and discussion system would look like if it were designed more deliberately for the sorts of complex, ongoing, often heated conversation that now takes place poorly on blogs. This is a question that might apply to subjects far beyond philosophy, of course, but perhaps the philosopher’s native tools would have special properties, features of particular use and native purpose. What if we asked how we want to read and write rather than just making the best of the media we randomly inherit, whether from the nineteenth century or the twenty-first?
I wish these were the sorts of questions so-called digital humanists considered, rather than figuring out how to pay homage to the latest received web app or to build new tools to do the same old work.
This is great stuff. Blogs are very poor tools for fostering genuine intellectual exchange, which is one reason why, increasingly, those exchanges happen for many on Twitter — despite the 140-character-at-a-time limit. We might ask why that is: Why do so many people prefer to exchange ideas on Twitter rather than on blogs? I don't think it’s just laziness. And then we might ask another question: What might a tool look like that combines the best features of blogging and tweeting, while minimizing the flaws of both instruments?
Friday, March 25, 2011
Joseph Cohen says he’s fed up with Blackboard. The leading course-management software is overloaded with features and dreadfully designed, making simple tasks difficult, says Mr. Cohen, a student at the University of Pennsylvania’s Wharton School. . . . Mr. Cohen and a classmate, Dan Getelman, have launched Coursekit, a stripped-down online learning-management system that offers a discussion board, a calendar, a syllabus, and related resources for courses at Penn. Mr. Cohen says he hopes Coursekit’s simple interface and Facebook-inspired tools will help make online discussions in a course as social as the course itself.
I hope Coursekit flourishes. Blackboard is a terrible, terrible, terrible system: bloated, ugly, confusing. The Blackboard motto seems to be, “Why Do Something in Two Clicks When You Can Do It In a Dozen?” I don't know anyone who uses it for one minute more than absolutely necessary. But something like Blackboard would clearly be valuable to teachers and students everywhere. Since Blackboard ate WebCT (which was equally bad), competition in this arena has been badly needed. Maybe Coursekit can provide it.
We have the technological systems in place to connect the vast majority of people in the world with much, if not most, of the greatest collections of knowledge. We have impressive digital databases. We have millions of hours of sound recordings. We have 100 years of film and video available. We have, of course, millions of books. . . .
We lack only one thing: the political will to fight for a great and noble information system—a global digital library. I'm not talking about the haphazard rush we've seen to date to digitize the stacks of major research libraries. Nor a commercial venture like Google's. I'm proposing what I call the "Human Knowledge Project" in my book, The Googlization of Everything (And Why We Should Worry). What I mean is a truly global digital library. To generate support for that, we need to identify the political and legal constraints, as well as articulate the payoffs.
That entails a formidable series of tasks. It might take 10, 20, or even 50 years. But there is no reason we should settle for expediency at the expense of excellence. After a few conversations, we might decide it's not worth the effort or cost. But at least we would have tried. And that's so much healthier than waiting for the Big Rich Magic Company in the Clouds to do all this for us—on its terms.
Thursday, March 24, 2011
Monday, March 21, 2011
[Seth] Priebatsch's declared aim is to "build a game layer on top of the world" – which at first seems simply to mean that we should all use SCVNGR, his location-based gaming platform that allows users to compete to win rewards at restaurants, bars and cinemas on their smartphones. (You can practically hear the marketers in the room start to salivate when he mentions this.)
But Priebatsch's ideas run deeper than that, whatever the impression conveyed by his bright orange polo shirt, his bright orange-framed sunglasses, and his tendency to bounce around the stage like a wind-up children's toy. His take on the education system, for example, is that it is a badly designed game: students compete for good grades, but lose motivation when they fail. A good game, by contrast, never makes you feel like you've failed: you just progress more slowly. Instead of giving bad students an F, why not start all pupils with zero points and have them strive for the high score? This kind of insight isn't unique to the world of videogames: these are basic insights into human psychology and the role of incentives, recently repopularised in books such as Freakonomics and Nudge. But that fact, in itself, may be a symptom of the vanishing distinction between online and off – and it certainly doesn't make it wrong.
Note the covert assumption here that, while we can totally reconfigure how we evaluate student performance, we can’t think of it as anything but “performance,” and we can't resist the student tendency to think in terms of competition for grades or professorial approval.
In this case I think gamification would simply make a fundamentally unhealthy, counterproductive way of thinking somewhat more fun — at least for those who thrive on competition. (For those who dislike competition, and there are more such people than is commonly realized, it would just make things worse.) I’d rather see if we can re-think our educational system to limit or channel the ethos of competition — which, I grant, would be much harder than game-ifying it.
There's an interesting moment (p. 248) when Shannon is speaking and just can't get people to focus on information as such — they keep wanting to get into semantics, meaning. (As I read this I found myself thinking of the way Neal Stephenson distinguishes between semantic and syntactic Faculties of philosophy in his novel Anathem.)
The Information is to some degree about this resistance, this inability that non-Shannon human beings have to see communication solely in terms of information transfer. But Gleick doesn't address this point as directly as I think he should: he tends instead to allow the confusions and elisions to be present in his narrative. Which is to some degree defensible, since that’s what real life has been like.
But let’s make some distinctions:
- information: defined (in multiple ways) here
- data: information recognized by humans as information
- knowledge: information sorted by humans and translated into human terms
- wisdom: the proper discerning of the human uses of knowledge
- counsel: wisdom transmitted to others
Friday, March 18, 2011
Thursday, March 17, 2011
In the early pages of The Information, Gleick writes a good deal about communication: African talking drums, for instance, and telegraphy. Someone wants to say something to another person, perhaps a distant person; how can that be accomplished? Only over much time, Gleick (implicitly) argues, does it become clear that the problem is one of information. And, it turns out, many other problems are problems of information also:
What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level — an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions. . . . If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.”
Then, in a curious turn, near the end of the book Gleick starts talking again about communication. The protagonist of The Information, insofar as it has one, is clearly Claude Shannon, the most important figure in modern history that hardly anyone has heard of, because it was Shannon who defined information and isolated it from other terms with which it is often confused. But Gleick seems to be contemplating near the end of the book the price we pay, or can pay, for Shannon’s world-changing insights: “The birth of information theory came with its ruthless sacrifice of meaning—the very quality that gives information its value and its purpose. Introducing The Mathematical Theory of Communication, Shannon had to be blunt. He simply declared meaning to be ‘irrelevant to the engineering problem.’ Forget human psychology; abandon subjectivity.” None of that matters to “the engineering problem.”
Tuesday, March 15, 2011
Monday, March 14, 2011
- Promote and enjoy real-time Open Access to research
- Share primary data, working papers, books, media links...
- Receive feedback and reviews from your peers
- Expose your work to those that matter
- Aggregate qualitative indicators about your impact
- Drive, build and share your online reputation
As I understand it, the principal virtue of popular science writing — indeed writing for a general audience about any technical subject — lies in its ability to take the work of scholars and make it accessible to non-scholars, without fundamentally falsifying it. Twenty percent of the way into The Information, I find that James Gleick is doing little more than retelling stories already told by other popular writers. His account of Charles Babbage’s early mechanical calculators covers ground well-trod by others, including The Difference Engine: Charles Babbage and the Quest to Build the First Computer (2002), by Doron Swade. Almost everything he says about the telegraph will be familiar to anyone who has read Tom Standage’s The Victorian Internet (1998). The material on the relationship between orality and literacy — with its inevitable nods towards the paradoxes of Plato’s writing about that apostle of the spoken word, Socrates — has been covered thousands of times, most recently (at roughly the same length) in Nick Carr’s The Shallows. Of course, not everyone will have read all, or any, of these books — but even for those folks, Gleick’s narrative might well seem bland and colorless, and lacking a strong thematic center. Very much journeyman work so far; I had expected much better. And I still hope for better as I move along.
Saturday, March 12, 2011
Disturbances in the Twitterverse: first Twitter releases a new iPhone client that prominently features trends — including “promoted” trends, that is to say, ads — and offers no way to hide them. (A new release makes the trends appear in a slightly less annoying way, but they are still mandatory.) Then Twitter issues new guidelines to developers that — as far as I can tell — pretty much eliminate further development of third-party clients: “Developers ask us if they should build client apps that mimic or reproduce the mainstream Twitter consumer client experience. The answer is no. . . . We need to move to a less fragmented world, where every user can experience Twitter in a consistent way.” And that seems to mean, through the Twitter website and through Twitter’s own clients for Mac, Windows, iOS, Android, and so on, with their prominently featured “promoted trends.”
Clearly this has nothing to do with "user experience": the people who run Twitter are casting around for ways to make money, which is understandable, and this is what they have settled on. I can’t say that I totally blame them: their investors are surely demanding return on their (hefty) investments. Twitter has got to be enormously expensive to run. But here we see the chief problem that arises when a major new form of communication — what, surprisingly to everyone, turned out to be a major new form of communication — consists of proprietary technology completely controlled by a single company. Imagine if Google had invented and owned email, so that you could only use email by navigating to the Gmail site and dealing with whatever ads Google chose to feature; so that Google had absolute control over your user experience; so that if Google’s servers went down the entire communicative ecosystem went down. That’s the situation the bosses at Twitter are clearly trying to create.
Again, it’s their technology and they can do what they want. Maybe the next step will be an ad-free, trend-free Twitter experience for a monthly fee; I wouldn’t be surprised, and I would have nothing legitimate to complain about. But Twitter has become so central to many people’s lives that it feels like a public utility, and the ads therefore feel like an unwarranted intrusion — as though we had to listen to 30-second pitches for dishwashing detergent before being able to complete a phone call. (And don't think I'm unaware that we've been here before. I've read The Master Switch.)
Just as Diaspora is being created as an open-source alternative to Facebook, in response to the rather more blatant and consistent tyrannies of the Zuckerbergian empire, these recent developments will prompt renewed attention to open-source and/or distributed alternatives to Twitter, like this one and this one.
But there’s a problem — probably an insurmountable one: these kinds of services only work when pretty much everyone you want to know is on them. Nobody wants to go back and forth between different Twitter-like services or Facebook-like services, trying to remember which friend is on which service. (I rarely remember to log in to Diaspora, and when I do, I find that my tiny handful of friends haven’t visited either.) In such an environment, what’s called for is some powerful aggregating technology that would allow us to have a single conduit through which we could see what all our friends are up to. But this is of course precisely what Facebook and Twitter are refusing to allow. In effect, they’re saying “You get what we offer the way we want to offer it, along with all your other friends, or you’re out in the cold — the silent, still, radically un-social cold. Deal with it.”
Thursday, March 10, 2011
My colleague Heather Whitney has a post up at ProfHacker about one aspect of student evaluations: their occasional lack of truthfulness. Let me add my two cents:
A year ago, as some readers of this blog may recall, I spent some time in the hospital. My classes that semester met on Tuesdays and Thursdays, and as a result of illness I missed four classes: two weeks total, in a fourteen-week semester. A lot to miss! — but perhaps not enough to warrant students writing on their evaluations, “It’s hard to evaluate this class because we hardly ever met,” or “It’s not Dr. Jacobs’s fault for being sick, but the fact that he missed most of the semester really hurt the success of this class.”
I also recall, a couple of years back, some students complaining on their evaluations that they got poor grades on their papers because I didn’t give them any guidance — even though before turning in the final draft of their papers they had to submit (a) a proposal for the paper, to which I responded in writing with detailed suggestions, and then (b) a complete draft of the paper, to which I also responded in writing. If this was not “guidance,” I wonder what would have been.
I can only explain this phenomenon — which is consistent among a minority of students — by speculating that some students think that evaluations are an opportunity not for them to speak truthfully, according to any naïvely “objective” or dispassionate model of truth, but rather for them to share their feelings — whatever those feelings happen to be at the moment. And at the end of a long and stressful semester, those feelings will sometimes be rather negative. This is one of the many reasons why student evaluations as they are typically solicited and offered are useless, or worse than useless — and I speak as someone with a long history of very positive student evaluations.
Thus for a long time I have recommended, to anyone who will listen and to many who will not, that evaluations for a given course be solicited at least one full semester after the course is completed, when students are less emotionally involved in it. A year or more after would be even better. We might get fewer responses, especially from students who have graduated, but they would be better responses.
Whenever I make this suggestion, the first response I get is always the same: “But a semester [or a year] later, they won’t remember anything from the class!”
“That would be something worth knowing,” I reply.
Monday, March 7, 2011
So, remember when I was saying that I might just re-read books for the rest of the year? Seemed like a good idea — but I had forgotten that I had pre-ordered Crazy U for my Kindle. And when it showed up I couldn't help devouring it. Ah, well, some kind of road is paved with good intentions, I don't remember exactly what kind. I’m not worried, though.
It’s Spring Break around here, so I’m taking the rest of the week off from the blog, and will try to be as useless as possible. Though I do need to finish this big essay on the maddening Marshall McLuhan.
When I get back, I’m going to be blogging my way through a book again — and yes, it’s a new book, one I haven't read before. But I can’t resist this one: it’s James Gleick’s The Information: a History, a Theory, a Flood. You are all welcome to get started on it and read along with me.
Let me tell you a few things about Andy Ferguson’s new book Crazy U: it’s well-researched, insightful, thought-provoking, and sometimes hysterically funny. He’s good on everything: college admissions standards, evaluation of candidates, financial aid, you name it. And he links the themes together in sometimes unexpected ways.
Consider for example this passage, from a section on how the admissions policies of Ivy League universities have changed over the years:
“In a way you had more human diversity in the old Harvard,” a friend once told me, after a lifetime of doing business with Harvard graduates. His attitude was more analytic than bitter, however. “It used to be the only thing an incoming class shared was blue blood. But bloodlines are a pretty negligible thing. It allows for an amazing variety in human types. You had real jocks and serious dopes, a few geniuses, a few drunks, a few ne’er-do-wells, and a very high percentage of people with completely average intelligence. Harvard really did reflect the country in that way back then.
“You still have a lot of blue bloods getting in, multigeneration Harvard families. But now a majority of kids coming into Harvard all share traits that are much more important than blood, race, or class. On a deeper level, in the essentials, they’re very much alike. They’ve all got that same need to achieve, focus, strive, succeed, compete, be the best—or at least be declared the best by someone in authority. And they’ve all figured out how to please important people.” Harvard grads disagree with this, of course. They like to say that the new Harvard represents the triumph of meritocracy. No, my friend said. “It’s the triumph of a certain kind of person.”
Then, some pages farther on, Ferguson is discussing the lamentable “Me essay,” the tell-us-everything-deeply-personal-about-yourself essay most colleges ask their applicants to write, and in that context he asks,
But what qualities does the Me Essay measure? If they were trying to capture the ability to write and reason, this could be accomplished by less melodramatic means. No, the admissions essay rewarded personal qualities beyond mathematical reasoning and verbal facility. Some of the traits were appealing enough, in appropriate doses. Refreshingly effusive kids, admirably enthusiastic kids, the all-American Eddie-attaboys might very well thrive on the essay. But it would also reward other characteristics, like narcissism, exhibitionism, Uriah Heep–ish insincerity, and the unwholesome thrill that some people get from gyrating before strangers. Which of these traits, I wondered, predicted scholarly aptitude or academic success? I saw it at every turn, as my friend had said of Harvard: the system “privileged” a certain kind of kid. And if you weren’t that kind of kid the best course was to figure out how to pretend you were.
Suddenly you see how the whole system of college admissions coalesces around, not just ambition, but a particular kind of ambition — something far more social than intellectual. It’s kind of nauseating, to be honest — though perhaps I feel that way because I’ve just helped to shepherd my son through college applications. But whether you’ve got college-aged kids or not, and even if you don't have kids and don't even plan to have them, you ought to read Crazy U. It’s a first-rate piece of popular cultural criticism, and it’s very, very funny.
Friday, March 4, 2011
The other day a homeschooling parent, whose child is in the ninth grade, wrote to me to ask what books I thought are essential for a young person to have read before coming to college. My reply:
For what it's worth, I don't think what a young person reads is nearly as important as how he or she reads. Young people who learn to read with patience and care and long-term concentration, with pencil in hand to make notes (including questions and disagreements), will be better prepared for college than students who read all the "right" books but read them carelessly or passively.
Thursday, March 3, 2011
I tend to be annoyed by evolutionary just-so stories (“Let’s see, how can whatever I’m doing at the moment be explained by my hunter-gatherer ancestors?”) but this one I like, perhaps because it addresses an ongoing problem for me:
Something's been bothering me ever since I started reading books, especially non-fiction, on my Kindle:
I can't remember where anything is. Physical books are full of spatial reference points; an especially beloved book is a physical topography in which we develop a vague sense of which chapters contain relevant information; even where, on a page, a particularly striking sentence or diagram lies.
Ebooks have none of these referents. They're searchable (or at least, some are) which mitigates this issue somewhat. But I'm unlikely to remember that a fact was at "41% through a book" for one simple reason: my hands never got a chance to find out what 41% through a particular ebook feels like.
This isn't to say that physical books are perfect — perhaps if we read off of giant scrolls laid out across a gymnasium floor, I'd have an even better memory of where I saw a fact: "upper left quadrant, approximately the fourth row..." or something like that. And perhaps some day a virtual interface for reading will give me those kinds of spatial referents.
But in the meantime, millions of years of evolution are going to waste. It's no secret that mnemonists — the mental athletes of the world of competitive memorization — use tricks like placing facts and sequential information on the walls of mansions they imagine walking through. And why? Because our brains are exquisitely well-tuned to remember where things are. Exactly what you'd expect from a species with a migratory, hunter-gatherer past; a species that re-applied those abilities to the navigation of cities long after it settled into an agricultural pattern.
Whether the evolutionary explanation for knowing how to find things in books is a good one or not, the relative lack of spatial cues in e-reading is a genuine lack. My internet friend Tim Carmody has been talking on Twitter lately about three-dimensionality, and here’s a great instance of how and why three-dimensionality works for us.
Wednesday, March 2, 2011
Today's digital social network is a trap. Today's cult of the social, peddled by an unholy alliance of Silicon Valley entrepreneurs and communitarian idealists, is rooted in a misunderstanding of the human condition. The truth is that we aren't naturally social beings. Instead, as Vermeer reminds us in The Woman in Blue, human happiness is really about being left alone. On Liberty, the 1859 essay by Bentham's godson and former acolyte, John Stuart Mill, remains a classic defence of individual rights in the age of the industrial network and its tyranny of the majority. Today, as we struggle to make sense of the impact of the internet revolution, we need an equivalent On Digital Liberty to protect the right to privacy in the social-media age. Tapscott and Williams believe that the age of networked intelligence will be equal to the Renaissance in its significance. But what if they are wrong? What if the digital revolution, because of its disregard for the right of individual privacy, becomes a new dark ages? And what if all that is left of individual privacy by the end of the 21st century exists in museums alongside Vermeer's Woman in Blue? Then what?
But this is just replacing one untenable generalization with another. As everyone knows, surely, we want and need to be alone sometimes and to be with others sometimes. Mental and spiritual health is found in the proper balance of the two, which must be different for different people. Even the most solitary crave connection sometimes; even the most social need time alone to re-charge. It would be hard to deny that the internet has been a profoundly rich and healing place for many lonely people. The questions are, or should be: What is the right balance for me? and, Have I achieved that balance?
As I commented in a post some months ago, advocates of the Big Social like Steven Johnson think we have been too solitary and too disconnected and need — as a society anyway — to move towards more connection, because connection generates ideas. My response to that was that (a) the generation of ideas is not the only social good, and (b) it’s hard to look at the proliferation of social media that I and my friends are occupied with and think that we don't have enough connections. My suspicion is that we need fewer connections, but (as I suggested in my previous post) more multi-dimensional ones. Keen overstates the value of privacy and under-rates the value of social connection, but he’s surely right to think that solitude is right now the endangered condition.
Tuesday, March 1, 2011
Commentary on technologies of reading, writing, research, and, generally, knowledge. As these technologies change and develop, what do we lose, what do we gain, what is (fundamentally or trivially) altered? And, not least, what's fun?
Alan Jacobs is Distinguished Professor of the Humanities in the Honors Program of Baylor University and the author, most recently, of The “Book of Common Prayer”: A Biography and The Pleasures of Reading in an Age of Distraction. His homepage is here.