Short essays on programming languages

I saw a link to So You Think You Know C? by Oleksandr Kaleniuk on Hacker News and was pleasantly surprised.

I expected a few comments about tricky parts of C, and found them, but there’s much more.

The subtitle of the free book is And Ten More Short Essays on Programming Languages.

Good reads.

This post gives a few of my reactions to the essays, my even shorter essays on Kaleniuk’s short essays.

My C

The first essay is about undefined parts of C.

That essay, along with this primer on C obfuscation that I also found on Hacker News today, is enough to make anyone run screaming away from the language.

And yet, in practice I don’t run into any of these pitfalls and find writing C kinda pleasant.

I have an atypical amount of freedom, and that colors my experience.

I don’t maintain code that someone else has written—I paid my dues doing that years ago—and so I simply avoid using any features I don’t fully understand.

And I usually have my choice of languages, so I use C only when there’s a good reason to use C.

I would expect that all these dark corners of C would be accidents waiting to happen.

Even if I don’t intentionally use undefined or misleading features of the language, I could use them accidentally.
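Here is the sort of accident the essay has in mind, a minimal sketch assuming a typical optimizing compiler (the function name is mine, for illustration). The overflow check looks perfectly sensible, but signed overflow is undefined behavior, so the compiler may assume it cannot happen and quietly delete the test.

#include <limits.h>
#include <stdio.h>

/* Intended as an overflow check, but x + 1 overflows signed int
   when x == INT_MAX, which is undefined behavior. The compiler
   is therefore free to assume x + 1 never wraps and fold this
   whole function down to "return 0". */
int will_overflow(int x)
{
    return x + 1 < x;
}

int main(void)
{
    /* With optimization enabled (e.g. gcc -O2) this commonly
       prints 0, even though the addition wraps on real hardware. */
    printf("%d\n", will_overflow(INT_MAX));
    return 0;
}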

And yet in practice that doesn’t seem to happen.

C, or at least my personal subset of C, is safer in practice than in theory.

APL

The second essay is on APL.

It seems that everyone who programs long enough eventually explores APL.

I downloaded Iverson’s ACM lecture Notation as a Tool of Thought years ago and keep intending to read it.

Maybe if things slow down I’ll finally get around to it.

Kaleniuk said something about APL I hadn’t heard before:

“[APL] didn’t originate as a computer language at all. It was proposed as a better notation for tensor algebra by Harvard mathematician Kenneth E. Iverson. It was meant to be written by hand on a blackboard to transfer mathematical ideas from one person to another.”

There’s one bit of notation that Iverson introduced that I use fairly often, his indicator function notation described here.

I used it in a report for a client just recently, where it greatly simplified the write-up.
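In case it helps, here is the idea in symbols; the summation identity is my own example of the kind of simplification the notation buys, folding a side condition into the formula itself.

\[
[P] =
\begin{cases}
1 & \text{if } P \text{ is true,} \\
0 & \text{otherwise,}
\end{cases}
\qquad\text{so, e.g.,}\qquad
\sum_{k} k\,[1 \le k \le n] = \frac{n(n+1)}{2}.
\]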

Maybe there’s something else I should borrow from Iverson.

Fortran

I last wrote Fortran during the Clinton administration and never thought I’d see it again, and yet I expect to need to use it on a project later this year.

The language has modernized quite a bit since I last saw it, and I expect it won’t be that bad to use.

Apparently Fortran programmers are part of the dark matter of programmers, far more numerous than you’d expect based on visibility.

Kaleniuk tells the story of a NASA programming competition in which submissions had to be written in Fortran.

NASA cancelled the competition because they were overwhelmed by submissions.

Syntax

In his last essay, Kaleniuk gives some ideas for what he would do if he were to design a new language.

His first point is that our syntax is arbitrarily constrained.

We still use the small collection of symbols that were easy to input 50 years ago.

As a result, symbols are highly overloaded.

Regular expressions are a prime example of this, where the same character plays entirely different roles depending on context.
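To make the overloading concrete, here is a minimal sketch using the POSIX regex API from C (error checking omitted for brevity). The same character, ^, anchors the match to the start of the string in one context and negates a character class in another.

#include <regex.h>
#include <stdio.h>

int main(void)
{
    regex_t anchor, negation;

    /* Outside a bracket expression, '^' anchors the match
       to the beginning of the string. */
    regcomp(&anchor, "^abc", REG_EXTENDED | REG_NOSUB);

    /* Inside a bracket expression, the same '^' negates the
       set: match any one character that is not a, b, or c. */
    regcomp(&negation, "[^abc]", REG_EXTENDED | REG_NOSUB);

    printf("^abc   vs \"abcdef\": %s\n",
           regexec(&anchor, "abcdef", 0, NULL, 0) == 0 ? "match" : "no match");
    printf("[^abc] vs \"xyz\":    %s\n",
           regexec(&negation, "xyz", 0, NULL, 0) == 0 ? "match" : "no match");

    regfree(&anchor);
    regfree(&negation);
    return 0;
}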

I agree with Kaleniuk in principle that we should be able to expand our vocabulary of symbols, and yet in practice this hasn’t worked out well.

It’s possible now, for example, to use λ rather than lambda in source code, but I never do that.

I suspect the reason we stick to the old symbols is that we’re stuck at a local maximum: small changes are not improvements.

A former client had a Haskell codebase that used one non-ASCII character, a Greek or Russian letter if I remember correctly.

The character was used fairly often and it did make the code slightly easier to read.

But it wreaked havoc with the tool chain and eventually they removed it.

Maybe a wholehearted commitment to using more symbols would be worth it; it would take no more effort to allow 100 non-ASCII characters than to allow one.

For that matter, source code doesn’t even need to be limited to text files, ASCII or Unicode.

But efforts along those lines have failed too.

It may be another local maximum problem.

A radical departure from the status quo might be worthwhile, but there’s not a way to get there incrementally.

And radical departures nearly always fail because they violate Gall’s law: “A complex system that works is invariably found to have evolved from a simple system that worked.”

Related posts

Software development is better in practice than in theory
Don’t be a technical masochist
Visual Basic and Coyotes
