Programmers have an easier time scaling up than scaling down.
You could call this foresight or over-engineering, depending on how things work out.
Scaling is a matter of placing bets.
Experienced programmers are rightfully suspicious of claims that something only needs to be done once, or that quick-and-dirty will be OK [*].
They’ve been burned by claims that something was “temporary” and they have war stories in which they did more than required and were later vindicated.
These stories make good blog posts.
But some things really do need to only be done once, or only so infrequently that it might as well be only once.
And it might be OK for intermediate steps to be quick-and-dirty if the deliverable is high quality.
As a small business owner and a former/occasional programmer, I think about this often.
For years I had a little voice in my head saying “You really should script this.” And I have automated a few common tasks. But I’ve made peace with the fact that I often do things that (1) could be done far more elegantly and efficiently, and that (2) I will likely never do again [**].
Related posts:

- Appropriate scale
- Scaling the number of projects, not the size
- Pareto’s 80-20 rule
- Objectives and constraints
- The bike shed principle

[*] “People forget how fast you did a job, but they remember how well you did it.” — Howard Newton

[**] I include in “never do again” things I might do in the future, but far enough in the future that I won’t remember where I saved the script I wrote to do the task last time, if I saved it.
Or I saved the script and was able to find it, but it depends on a library that has gone away. Or the task is just different enough that I’d practically need to rewrite the script.
Or ….