Saturday, December 31, 2005

Programming languages as options

Two related posts about programming languages, which, together with operating systems and programming style guidelines, are the Holy Trinity of "Things Most Likely To Cause Computer People To Turn Into Religious Fanatics":

- Joel Spolsky's post about the perils of JavaSchools
- The QofW take on Joel's post

My personal take on this is that there's no perfect programming language: you should always pick the language most appropriate for your task, and knowing more languages and programming paradigms just adds more tools to your toolbox. Expanding on this relatively content-free platitude and applying it to the question at hand:

I took the version of the Penn CSE course that Joel describes, where you're taught Scheme and ML before C/C++. It was definitely hard to wrap your head around Scheme when you were already used to an imperative style of programming, and lots of people said "Never mind" and dropped out of the major. For years afterwards, I reflexively recoiled from anything to do with AI, because I'd been told that Scheme/Lisp was used extensively in AI and I never wanted to go near anything with that many parentheses again. And I've never had to use Scheme or ML since. So, on the one hand, I agree with the argument that, for the most part, any reasonable language has all the facilities you're likely to need, that Java is a perfectly reasonable thing to teach, and that teaching people Lisp [and, to a much lesser extent, C] is a lot like forcing people to learn Latin.

On the flip side, though, I do think that, right now, starting out with languages like Scheme and C is the better way to go in the long run, but for somewhat different reasons:

- Scheme, and functional languages in general, are just a totally different way of doing things from imperative languages. You think about your data structures differently, you manipulate them differently, and there are facilities in functional languages that just make them an easier fit for certain tasks than imperative languages [the same, of course, is also true the other way around]. So learning a functional language increases your awareness of alternative approaches to tackling a problem, which, I claim, is always a Good Thing. [There's a small sketch of what I mean right after this list.]

- C, more than anything else, just forces you to get closer to the machine and, in the process, to be careful and aware of what you're doing. It's been argued that being close to the machine [eg having to do your own memory allocation, treat strings as null-terminated character arrays etc] is also becoming obsolete, because it promotes buggy code and reduces programmer productivity, and that it's much better to rely on innovations like garbage collection and the ready availability of class libraries etc. That's a reasonable argument, for the most part, but it's subject to the Law of Leaky Abstractions -- sometime, somewhere, something is going to break in the abstraction layer you're sitting on top of, and if you don't know how to go down to the appropriate level, figure out what's going on, and fix it, you're hosed [the second sketch after this list shows the kind of detail I mean]. Also, if you ever get into really hardcore systems hacking/optimization, the chances are that you're going to have to break your abstraction layer and dive down into the guts of the system. So while using languages like Java that relieve you of some of the more mundane aspects is the right thing to do a lot of the time, you're better off learning something lower-level, like C/C++, first.
You could, of course, extend this argument and say that everybody should learn assembly language; while that's not entirely unreasonable ;-), my general rule of thumb would be to stop one level down: learn something that's one level lower than the currently most-used level of abstraction [eg C instead of Java, at the moment]. That way, you have the option of choosing the right tool for the job.
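To put the first point in concrete terms, here's a rough sketch of my own [not something from Joel's post or the Penn course]: mapping a function over a list is a one-liner in Scheme -- `(map square xs)` -- but spelling the same idea out imperatively in C means a function pointer, an explicitly allocated result array, and a loop. The names `map_int` and `square` are just made up for illustration.

```c
#include <stdio.h>
#include <stdlib.h>

/* The imperative spelling of Scheme's one-liner (map square xs):
 * a function pointer, a malloc'd result array, and a loop that
 * fills it in element by element. */
static int *map_int(int (*f)(int), const int *xs, size_t n)
{
    int *ys = malloc(n * sizeof *ys);
    if (ys == NULL)
        return NULL;
    for (size_t i = 0; i < n; i++)
        ys[i] = f(xs[i]);
    return ys;
}

static int square(int x) { return x * x; }

int main(void)
{
    int xs[] = {1, 2, 3, 4, 5};
    size_t n = sizeof xs / sizeof xs[0];
    int *ys = map_int(square, xs, n);
    if (ys == NULL)
        return 1;
    for (size_t i = 0; i < n; i++)
        printf("%d ", ys[i]);   /* prints: 1 4 9 16 25 */
    printf("\n");
    free(ys);
    return 0;
}
```

Neither version is "wrong"; the point is just that each language nudges you toward thinking about the problem differently.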
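And for the second point, here's the kind of detail C makes you confront head-on -- again just an illustrative sketch of my own, a hand-rolled string copy. In C a "string" is nothing more than a character array that happens to end in a '\0' byte, and the memory behind it is yours to allocate, size correctly and eventually free:

```c
#include <stdlib.h>
#include <string.h>

/* A hand-rolled string copy.  Everything a higher-level language hides
 * is explicit here: finding the end of the string by walking to the
 * '\0' terminator, remembering the +1 when sizing the buffer, checking
 * that malloc succeeded, and leaving the caller responsible for
 * eventually calling free() on the result. */
char *copy_string(const char *s)
{
    size_t len = strlen(s);        /* scans bytes until it hits '\0' */
    char *copy = malloc(len + 1);  /* +1 for the terminator; forget it
                                      and you've corrupted memory */
    if (copy == NULL)
        return NULL;               /* allocation can fail; deal with it */
    memcpy(copy, s, len + 1);      /* copy the terminator too */
    return copy;                   /* caller owns this memory now */
}
```

Get any one of those details wrong and you have exactly the sort of bug that garbage collection and real string types exist to spare you from -- but it's also exactly what you need to understand when the abstraction above you springs a leak.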

So, in the end, I agree with Joel that it's better to start off with Scheme and C than with Java, but not for the reasons he gives -- it's not about pointers and recursion specifically; it's about having a wider range of options than the least common denominator gives you.

Updates:
- Thanks to Cosma for pointing me at Ook.
- Another reason it's really not about recursion specifically: in practice [ie when building a real-world system], using recursion isn't that great an idea anyway, at least not unless it's tail recursion, which is just iteration by another name. Otherwise you blow your stack space, and that's no fun for anybody.
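To illustrate, a quick sketch of my own in C [note that C compilers, unlike Scheme implementations, aren't required to eliminate tail calls, so take the second version as showing the shape of the idea rather than a guarantee]:

```c
#include <stdio.h>

/* Naive recursion: the addition happens after the recursive call
 * returns, so every call keeps its stack frame alive, and a large
 * enough n blows the stack. */
long sum_to(long n)
{
    if (n == 0)
        return 0;
    return n + sum_to(n - 1);
}

/* Tail-recursive version: the recursive call is the last thing the
 * function does, so the current frame can be reused -- written this
 * way, it's really just a loop with an accumulator, ie iteration by
 * another name. */
long sum_to_acc(long n, long acc)
{
    if (n == 0)
        return acc;
    return sum_to_acc(n - 1, acc + n);
}

int main(void)
{
    printf("%ld\n", sum_to_acc(10000, 0));  /* 50005000 */
    /* sum_to(100000000) would almost certainly die with a stack overflow */
    return 0;
}
```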
