Wednesday, November 24, 2010
So what should e-readers be made of? How about, let’s see — yes: paper.
This article reports on the use of paper as the substrate for the formation of displays based on the effect of electric fields on the wetting of solids, the so-called electrowetting (EW) effect. One of the main goals of e-paper is to replicate the look-and-feel of actual ink-on-paper. We have, therefore, investigated the use of paper as the perfect substrate for EW devices to accomplish e-paper on paper. The motion of liquids caused by the EW effect was recognized early on by Beni and Hackwood to have very attractive characteristics for display applications: fast response time, operation with low voltage, and low power consumption. More recent EW structures utilize the voltage-induced contact angle (CA) change of an aqueous electrolyte droplet placed on the surface of a hydrophobic fluoropolymer layer and surrounded by oil. Insulating, nonpolar oils (usually alkanes) are used for this purpose because they (unlike water) do not respond directly to the applied electric field. EW technology is used in many applications, including reflective and emissive displays, liquid lenses, liquid-state transistors, and bio/medical assays.
Seriously? I’m imagining letters re-forming on the page as on the Marauder’s Map.
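For the technically curious: the voltage-induced contact-angle change the article describes is usually summarized by the Young–Lippmann relation, a standard result in electrowetting theory (not taken from the article itself):

```latex
\cos\theta(V) \;=\; \cos\theta_0 \;+\; \frac{\varepsilon_0 \varepsilon_r}{2\,\gamma\, d}\,V^2
```

Here $\theta_0$ is the contact angle at zero voltage, $\varepsilon_r$ and $d$ are the dielectric constant and thickness of the insulating fluoropolymer layer, $\gamma$ is the interfacial tension between the droplet and the surrounding oil, and $V$ is the applied voltage. The quadratic dependence on $V$ is why these displays can switch with modest voltages while drawing very little power.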
Monday, November 22, 2010
Well, the recent traveling and busyness may have kept me from posting, but it didn't keep me from reading. Nothing keeps me from reading. So here's what's been going on:
I've been working my way through Tony Judt's magisterial Postwar, but it's a very large book — exactly the kind of thing the Kindle was made for, by the way — and I've been pausing for other tastes. For instance, I read Jane Smiley's brief and brisk The Man Who Invented the Computer: The Biography of John Atanasoff, Digital Pioneer, the title of which is either misleading or ironic or, probably, both. Smiley begins her narrative with a straightforward claim:
The inventor of the computer was a thirty-four-year-old associate professor of physics at Iowa State College named John Vincent Atanasoff. There is no doubt that he invented the computer (his claim was affirmed in court in 1978) and there is no doubt that the computer was the most important (though not the most deadly) invention of the twentieth century.
But by the end of the narrative, less than half of which is about Atanasoff, she writes, more realistically and less definitively,
The computer I am typing on came to me in a certain way. The seed was planted and its shoot was cultivated by John Vincent Atanasoff and Clifford Berry, but because Iowa State was a land-grant college, it was far from the mainstream. Because the administration at Iowa State did not understand the significance of the machine in the basement of the physics building, John Mauchly was as essential to my computer as Atanasoff was — it was Mauchly who transplanted the shoot from the basement nursery to the luxurious greenhouse of the Moore School. It was Mauchly who in spite of his later testimony was enthusiastic, did know enough to see what Atanasoff had done, was interested enough to pursue it. Other than Clifford Berry and a handful of graduate students, no one else was. Without Mauchly, Atanasoff would have been in the same position as Konrad Zuse and Tommy Flowers — his machine just a rumor or a distant memory.
Each person named in that paragraph gets a good deal of attention in Smiley’s narrative, along with Alan Turing and John von Neumann, and by the time I finished the book I could only come up with one explanation for Smiley’s title and for her ringing affirmation of Atanasoff’s role as the inventor of the computer: she too teaches at Iowa State, and wants to bring it to the center of a narrative to which it has previously been peripheral. A commendable endeavor, but not one that warrants the book’s title.
In the end, Smiley’s narrative is a smoothly readable introduction to a vexed question, but it left me wanting a good deal more.
Friday, November 12, 2010
In a typically smart column about online education, my friend Reihan Salam quotes Anya Kamenetz:
The only way to restore the concept of higher education as a public good is to reinvent it as a truly public good: not subject to antiquated notions of scarcity and hierarchical expertise, but adapted to the current reality of free, open, and immediate sharing of knowledge.
Reihan says, “That sounds right to me,” but I can't say that because I have no idea what it means. I see the same problem here that I saw in Kamenetz’s book: enthusiasm misted over by terminal vagueness. To wit:
1) We restore something as a public good by reinventing it as a public good? Seems close to tautological, but beyond that: who is “we”? Who is going to go about the task of reinvention? College presidents working at the institutional level? Faculty reconfiguring their classes and intellectual activities whether they have administrative support or not? Congress passing new laws?
2) About “the current reality of free, open, and immediate sharing of knowledge”: certainly a great deal of knowledge is free, open, and easily shared. On the other hand, a great deal of knowledge is proprietary and controlled by patents, trademarks, copyrights, and various forms of institutional secrecy. Is the proportion of information that is free greater than it used to be? (I have no idea. Free information is more easily accessed than it used to be, but that’s not the same thing.) In any case, how will “reinventing the concept of higher education as a public good” change the current regime of knowledge control and regulation? How could it do so?
3) What does Kamenetz mean by “hierarchical expertise”? Obviously, expertise itself can't be hierarchical, so she probably (?) means something like, “a system in which people receive official rewards — jobs, promotions, accreditations, certifications, etc. — for demonstrated expertise.” But there’s a lot to be said for such a system. I like being able to choose a doctor by learning, among other things, where she got her medical degree and what board certifications she has earned. Even in Kamenetz’s book the people whom she celebrates for spreading their knowledge are people connected to, drawing funding from, and accredited by elite institutions. Is that a bad thing? Whether it is or not, it ain’t DIY education.
Frankly, I’d love to see a system — or rather (this is the point) a non-system — in which the circulation of knowledge through informal and fluid networks plays a much greater role than it does now. A non-system which circumvents much of the bureaucratic sclerosis of the modern university, perhaps with the help of universities that are willing to reconfigure themselves. ("Reinvention" is too Utopian a term for me.) It all sounds very cool, in the abstract. I’d just like someone to tell me how we're going to get there.
Thursday, November 11, 2010
One of Tim Burke’s colleagues is a little concerned about the breadth of interests represented by Tim’s syllabi:
My colleague suggested to me that I had to be responsible first (and last) to my discipline and my specialization in my teaching, that there was something unseemly about the heavy admixture of literature and popular culture and journalistic reportage and anthropology that populates some of my syllabi. I’ve heard similar sentiments expressed as an overall view of higher education in some recent meetings. At a small liberal-arts college and maybe even at a large research university, this strikes me as substantially off the mark. Or at least we need some faculty who are irresponsible to their disciplines and responsible first to integrating and connecting knowledge.
Let me repeat that for you: We need some faculty who are irresponsible to their disciplines and responsible first to integrating and connecting knowledge. This is a precise and concise summation of what I’ve tried to do for many years now. There’s a price to be paid for this kind of thing, of course: expanded interests do not yield expanded time. The day’s number of hours remains constant, and then there's the matter of sleep. So the more I explore topics, themes, books, films — whatever — outside the usual boundaries of my official specialization, the less likely it is that I will read every new article, or even every new book, in “my field.” But, to rephrase Tim’s point as a series of questions, Is the unswerving focus on a specifically bounded area of specialization the sine qua non of scholarship? Is it even intrinsic to scholarship? Is there not another model of scholarship whose primary activity is “integrating and connecting knowledge”?
I think there is such a model, and I think it deserves to be called scholarship, but I’m not going to fight about the point. Call it what you want, it’s what I love to do, and God willing, I’ll be looking for new and interesting connections for the rest of my life. That’s how my mind works, in any event, but it’s also what makes sense given my institutional situation. Tim and I both teach at liberal arts colleges where we are asked to teach a variety of courses, and to try to maintain a narrow specialization in such an environment is to set one’s teaching at odds with one’s research. I prefer to seek ways to make my teaching and my research feed each other, and since I can't do that by narrowing the range of courses I teach, I will do it by expanding the range of topics I research and write about.
And I love it this way. Had I ended up at a big research university, I seriously doubt I would have had the luxury of developing some of the major interests that I’ve pursued in the past decade (e.g., the issues pursued on this blog). And from my point of view, that would be a shame.
Tuesday, November 9, 2010
This sobering post from Nick Carr suggests that we ought to be worried, or at least seriously reflective, about “web revolutionaries” who are pushing the commercialism and commodification of human intimacy:
What most characterizes today's web revolutionaries is their rigorously apolitical and ahistorical perspectives — their fear of actually being revolutionary. To them, the technological upheaval of the web ends in a reinforcement of the status quo. There's nothing wrong with that view, I suppose — these are all writers who court business audiences — but their writings do testify to just how far we've come from the idealism of the early days of cyberspace, when online communities were proudly uncommercial and the free exchanges of the web stood in opposition to what John Perry Barlow dismissively termed "the Industrial World." By encouraging us to think of sharing as "collaborative consumption" and of our intellectual capacities as "cognitive surplus," the technologies of the web now look like they will have, as their ultimate legacy, the spread of market forces into the most intimate spheres of human activity.
I think Nick is right about this — as is Jaron Lanier when he sounds a similar note — and I say that as someone generally enthusiastic about the entrepreneurial possibilities of online culture.
On some level we all know this commodification of intimacy is happening: no thoughtful person can possibly believe that Mark Zuckerberg’s crusade for “radical transparency” is a genuine Utopian ethic; we know that he’s articulating a position that, if widely accepted, yields maximum revenue for Facebook. But we are just beginning to think about how radically transparent we are becoming, and if Nick Carr is right, we very much need some “web revolutionaries” who really are revolutionary in their repudiation of these trends.
In other words, the problem isn't the businessmen who want to dig around in our brains — of course the business world wants to dig around in our brains: haven't you seen “Mad Men”? — the problem is the failure of influential wired intellectuals to provide the necessary corrective pushback.
Speaking of personal vacillation and changeableness, remember how I returned my iPad? Yeah, well, I got another one, and basically for one purpose: teaching.
The iPad, it turns out, is a great tool for teachers. I don't rely heavily on presentation software (Keynote is of course my software of choice, though one of these days I’m going to figure out Beamer): most days I don't use it at all, and when I do use it, it tends to be for just a few minutes, after which I want to return to conversation. If I had to build a massive Keynote deck, I’d want to use my MacBook for that, but for brief (under a dozen slides) presentations, the iPad version of Keynote works great.
So I create my presentation and then, on my MacBook probably, write or update my class notes. These are always in plain text, and are in folders that are automagically and instantly backed up via Dropbox. Now Dropbox has a very nice iOS client, which means that I can write up those notes on my MacBook, then pick up the iPad and head to class, plug it in, make the Keynote presentation, unplug it, open the Dropbox client and lead discussion from those just-updated text notes.
I think this system is going to work for me.
Monday, November 8, 2010
It’s just not in my nature, I guess, to stick with one organizational method for very long. A few months ago, I wrote about my return to Backpack. Well, it only took about two weeks for me to leave Backpack again. There were a couple of reasons. First, I realized that while Backpack works well for me on my Mac — aside from some awkwardness of text entry — it didn’t work so well on my iPhone. The people at 37signals have done almost nothing to accommodate mobile devices, and the third-party Backpack client, Satchel, is usable but awkward.
And then, second, I discovered something I should have known already: that the app I had abandoned, Notational Velocity, has a fabulous Service: “New Note with Selection.” And even cooler, when you select a passage on a webpage and then invoke the service — which I do with a simple keystroke combination I assigned for the purpose — NV adds the URL also. Fabulous!* And then, because I sync my NV notes with Simplenote, I have these clippings available to me on my iPod — and any other iOS device I might have.** (More about that later.) Once NV can read and sync Simplenote’s tags I’ll be nearing my own private Singularity.
Also, I have been using Remember the Milk as my to-do list and calendar, largely because of its excellent iPhone app — the iPhone seems to me the natural place to manage to-do’s.
This has been my system for about three months now, which for me is a long time. Might I even stick with it? Stay tuned. . . .
* The URL-addition works in Safari and OmniWeb, but not currently in Chrome. Chrome adds the selected text only.
** I can't find it now, but just a few days ago I read a blog post about how iOS devices need their own Services menu. This is very, very true.
Friday, November 5, 2010
Keith Gessen: I want to move us into life choices. Does anybody regret the profession they have chosen?
Mark Greif: I have no profession. Whatever profession I do, I regret it.
Benjamin Kunkel: What do you . . . mean? What are you talking about?
Mark Greif: I regret it!
Benjamin Kunkel: What?
Mark Greif: Whatever it is that I’ve become.
Keith Gessen: You’ve become a philosopher.
Mark Greif: No philosopher would think so.
Keith Gessen: You’ve become an editor.
Mark Greif: But that’s something to be ashamed of.
Benjamin Kunkel: An essayist? A critic?
Mark Greif: Essayist! That’s interesting. You know, you go through life not really knowing who you are, and one day, somebody calls you an essayist. Out of all the pathetic categories that I read growing up, I knew there was no bigger joke than an essayist. Someone who couldn’t write something long enough to actually grab hold of anyone, someone without the imagination to write fiction, someone without the romantic inspiration to write poetry, and someone who would never make any money or be published. I’m an essayist!
— from n+1’s fabulous What We Should Have Known: Two Discussions.
Thursday, November 4, 2010
I’ve been reading a number of comics-slash-graphic novels, and too many of them are trying to do in comic form what word-only forms (novels, essays) do better. There’s really no point to a graphic story in which the visual element isn't pulling a heavy load of meaning and mood. Too often you have art that isn't doing a lot except, perhaps, to conceal how drearily familiar the story is.
But a fine example of art that contributes mightily to the character and power of the story is Asterios Polyp, which would be a fantastic book were it not marred by an utterly ridiculous ending. I get annoyed every time I think about it. But until that ending, the book is a great example of visual storytelling.
Wednesday, November 3, 2010
So far three friends of mine have signed up for letter.ly, and are producing newsletters that I can sign up for. I have very mixed feelings about this. On the one hand, I want to support my friends, and I know that it’s hard to write (or do any other skilled labor) for free. Heck, I’ve even thought about signing up for letter.ly myself.
But on the other hand, projects like this are more nails in the coffin of the open web, and I don't like seeing that happen, even though it’s probably inevitable. Almost any of the letter.ly newsletters would have been blogs as recently as a few months ago — most of them probably were blogs a few months ago — and thus part of the most public conversational space yet invented. Now they’ll be the property of a select few — which is pretty much how things used to be before the internet. Bill Gates’s famous “Open Letter to Hobbyists” appeared in a computer club newsletter; the great Bill James’s Baseball Abstract began life as a newsletter: people found out about it through ads in The Sporting News. Letter.ly marks an attempt to renew the newsletter as a genre for the digital age. (There are a number of free newsletters out there — e.g., Jason Calcanis’s — but I’m talking about newsletters as a means of revenue for their writers.)
Then there are experiments like the Times of London’s paywall experiment: results are mixed so far, but even if the Times site ends up making money, a formerly major player has been taken out of the general conversation of the Web. Similarly, as more and more people encounter newspapers and magazines not in web browsers but in purpose-built iPad apps, it may get harder to do the copying, pasting, and commenting that have been intrinsic to the blogging enterprise since its inception.
But again, if that does happen it will be a return to the Normal of twenty years ago. Then I bought magazines and newspapers individually, and if I wanted to keep items in them I literally cut them out and filed them — rarely did I paste. If now I buy them individually as iPad apps, with in-app purchase of single issues or subscriptions (which is what Wired, among others, wants me to do) then I have largely returned to old habits, though any copying and pasting I do will be digital and there won't be any cutting at all.
I remember when I had to think hard about how many magazines and newspapers I was subscribing to, and whether I could afford a new one without canceling something else. Maybe I’ll soon be making those decisions again. I understand the necessity of such changes, but I don't have to like them. I especially don't like the thought that I might hurt someone’s feelings by not subscribing to — or, worse, canceling my subscription to — his or her newsletter. And I am deeply uncomfortable with the thought that that One Great Conversation may be breaking up again. It’s starting to look like I’ll soon be nostalgic for those few years when I had a single-payee system — once a month to an ISP — after which the whole world came to my screen.
Monday, November 1, 2010
The vast library that is the internet is flooded with so many advertisements that many people claim not to notice them anymore. Ads line the top and right of the search results page, are displayed next to emails in Gmail, on our favourite blog, and beside reportage of anti-corporate struggles. As evidenced by the tragic reality that most people can't tell the difference between ads and content any more, this commercial barrage is having a cultural impact.
The danger of allowing an advertising company to control the index of human knowledge is too obvious to ignore. The universal index is the shared heritage of humanity. It ought to be owned by us all. No corporation or nation has the right to privatise the index, commercialise the index, censor what they do not like or auction search ranking to the highest bidder. We have public libraries. We need a public search engine.
Well . . . if advertising is the problem, then “a public search engine” won't solve the problem, will it? We wouldn't see ads while searching, but we would see them as soon as we arrived at the pages we were searching for. Moreover, if it’s wrong to have ads next to reportage online, then presumably it’s wrong to have ads in the paper version of the Guardian, in magazines, and on television as well.
What exactly is White asking for? A universal prohibition on internet advertising, brokered by the U.N.? An international tribunal to prosecute Google for unauthorized indexing? Yes, it would have been wonderful, as Robert Darnton has pointed out, if universities and libraries had banded together to do the information-indexing and book-digitizing that Google has done — but they didn’t.
So here we are, with an unprecedented and astonishing amount of information at our fingertips, and we’re going to complain about ads? — the same ads that give us television, newspapers, and magazines? Please. Why not just come right out and say “I want everything and I want it for free”?
Google gives us plenty to complain about; I have deeply mixed feelings about the company myself, as I have often articulated. But the presence of online ads ought to be the least of our worries.
(Update: here's Darnton on the possibility of creating a national digital library.)
Commentary on technologies of reading, writing, research, and, generally, knowledge. As these technologies change and develop, what do we lose, what do we gain, what is (fundamentally or trivially) altered? And, not least, what's fun?
Alan Jacobs is Distinguished Professor of the Humanities in the Honors Program of Baylor University and the author, most recently, of The “Book of Common Prayer”: A Biography and The Pleasures of Reading in an Age of Distraction. His homepage is here.