I think it’s really important to meditate on what Alex Payne says here:

The iPhone can, to some extent, be forgiven its closed nature. The mobile industry has not historically been comfortable with openness, and Apple didn't rock that boat when it released the iPhone. The iPhone was no more or less open than devices that preceded it, devices like those from Danger that required jumping similar bureaucratic hurdles to develop for.

That the iPad is a closed system is harder to forgive. One of the foremost complaints about the iPhone has been Apple's iron fist when it comes to applications and the development direction of the platform. The iPad demonstrates that if Apple is listening to these complaints, they simply don't care. This is why I say that the iPad is a cynical thing: Apple can't – or won't – conceive of a future for personal computing that is both elegant and open, usable and free. . . .

The thing that bothers me most about the iPad is this: if I had an iPad rather than a real computer as a kid, I'd never be a programmer today. I'd never have had the ability to run whatever stupid, potentially harmful, hugely educational programs I could download or write. I wouldn't have been able to fire up ResEdit and edit out the Mac startup sound so I could tinker on the computer at all hours without waking my parents. The iPad may be a boon to traditional education, insofar as it allows for multimedia textbooks and such, but in its current form, it's a detriment to the sort of hacker culture that has propelled the digital economy.

Wonderful as Apple’s recent products are, I am actively trying to figure out how to distance myself from a company I’ve been committed to for a quarter of a century. Whether I’ll discover something that doesn’t involve simply leaping into the arms of Google remains to be seen.

Text Patterns

February 1, 2010

5 Comments

  1. As Alex Payne says, the distinctive feature of what has come to be called a "computer" is the ability to change the system from inside the system itself (the von Neumann architecture). Perhaps the purest example of that is the programming language/OS called Forth, where the whole "OS" is mutable putty to be changed on a whim from the same interface you use for everything else (I believe Macs still have a Forth shell in Open Firmware).

    The problem is that this mutability is precisely what viruses rely on (in humans and in computers). I am disappointed that Apple seems to lack the vision for solving this problem without sacrificing the identity of what a computer is. You are not much better off with Android, as it doesn't seem to expect that you would want to develop on the device itself. But with Android nothing fundamentally stops someone from using the SDK to write in that ability. So they too lack the vision, but don't block it.

    Sure, it seems like no one would want to write programs directly on an iPhone, but there is no question that the device exists at an unprecedented scale, meaning that for the first time we have the potential to write software in places we never could before, in the moments that are most appropriate. For instance, the iPhone has a camera, and rudimentary object recognition is very possible. What if I can't find my keys? I could use the iPhone as a second set of eyes as I scan the room. Why do I need an app to do that instead of making that app on the fly out of composable parts? Here is a completely reasonable "program":

    BeepWhen(RecognizeImage(GetFlickrImages("keys")))

    That line of "code" could be made to work *as is* in C, C++, C#, F#, Haskell, Java, ML, Objective-C, OCaml, Perl, Python, Ruby… You could easily have a multitouch interface that lets you write that program in pictures.
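
    To make that concrete, here is a minimal sketch of the idea in Python. The three building blocks are hypothetical stand-ins, not real Flickr or recognition APIs; the point is only that each part is an ordinary function, so the whole "program" is plain composition:

    # Minimal Python sketch of the composable-parts idea above.
    # get_flickr_images, recognize_image, and beep_when are hypothetical
    # stand-ins, not real APIs: each stage is an ordinary function, so
    # the whole "program" is just function composition.

    def get_flickr_images(tag):
        # Stand-in: would fetch example photos tagged e.g. "keys"
        # from an image-search service.
        return ["keys-1.jpg", "keys-2.jpg"]

    def recognize_image(examples):
        # Stand-in: would compare the live camera view against the
        # example images and report whether a match is in sight.
        return len(examples) > 0

    def beep_when(found):
        # Stand-in: would sound the device's alert tone.
        if found:
            print("beep! keys spotted")

    # The one-liner from above, spelled in Python:
    beep_when(recognize_image(get_flickr_images("keys")))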

    Sometimes it seems the only step we have taken toward democratizing computation is the spreadsheet. It is a technology that people get and that "just works". And it is the only programming language on the iPad. *Sigh* Why is it that we computer programmers are the only ones who get to have fun?

  2. The "I can't tinker with it, so I would never have learned to program" argument is emotionally powerful because we can relate and feel sympathy for what "would not have been." But is it possible that Mr. Payne is presuming a learning model that was effective when one HAD to code to run a computer but is no longer as meaningful or needed now?

  3. Ryan, some of us who aren't programmers have a little bit of programming-like fun — if the occasional Python script and use of LaTeX count . . . but there sure won't be any of that on the iPad.

    Which leads me to Chris's point, which (in turn) leads me to Fraser Speirs's argument that I quoted on my tumblelog today. Fraser makes a strong case for the same point.

    So if what people really want is the seamless experience that the iPad offers, that's understandable — in the same way that it's understandable that people like cars that just tell them when to go to the dealer for service. But once upon a time I changed my own oil. . . .

  4. I might be more disposed toward this sort of argument if someone could point me to an example of another set of sophisticated programs that are as elegant and stable as Apple's. Why not think that you can have either something to tinker with or something that's reliable and easy to use, but not both?

  5. For elegance and stability you cannot do better than Dr. Knuth's TeX. Or any number of developer tools for Windows, or innumerable parts of Linux and the BSDs (which form the foundation of OS X). But these are all programs for programmers. Programs of this type have existed as long as computers, and their development follows a pattern: a programmer finds the tool they use every day inadequate, so they make a better one. Now we live in a world where non-programmers (fabulously creative people) are by far the majority of computer users. How can the software world do anything but stagnate unless computational creativity is put in the hands of those who use the computer?

    I would agree that Alex Payne is pining for something that doesn't exist anymore, but in a closed environment where all that exists are apps, not components, there is no room for creativity beyond what the apps already do. The users of the software that I create amaze me with the hurdles they jump through to accomplish what they need to get done. It is because they need to do something outside the domain of what the program was designed to do. They use their computational creativity to do it anyway, but wouldn't it be better if they had a standard way to do their workarounds? A way that was intuitive to them as computer users?

    The iPad and iPhone are wonderful in that they are tearing down the desktop metaphors. That needs to be done. However, it appears that Apple's iconoclastic hammer beats out appliances, not computers.

    All the examples I gave of elegant and sophisticated programs interact with their users via a programming language. Programmers can read that language; computers merely act on it, and lacking souls, they do no reading. The more a programming language looks like a set of equations, the less it pretends to be something a person would read. Non-programmers write to be read, so programming languages will always be somewhat of a mystery to them. What is needed is something intuitive to non-programmers that doesn't require the machine to have a soul. That is clearly a hard problem, and one I would love Apple to tackle (multi-touch is a step in the right direction). But their position doesn't even let others experiment to see whether they can take a crack at it.
