Optimizing Inefficiency: Human folly and the quest for the worst sorting algorithm

Daniel Miller · Apr 3

Look up Bogosort, Slowsort, or Stooge sort on Wikipedia sometime and prepare to enter a deep rabbit hole that leads to the strange world of “perversely awful” sorting algorithms.

These three relatively tame algorithms, used primarily for illustrative purposes in computer science classes, are only the beginning.

A strange algorithm known as Bogobogosort has been banging around the distant corners of internet message boards for years (possibly invented by Australian physicist David Morgan-Mar, although its provenance is unclear).

With a horrifying time complexity of O(n!ⁿ⁻ᵏ), Bogobogosort is expected to keep sorting until the heat death of the universe for any list with a length in the low double digits.

Bogobogosort is implemented by copying an unsorted list, recursively bogobogosorting the first n−1 elements of that copy, and then checking whether the last element of the copy is at least as large as the largest of those n−1 elements.

If not, the entire copy of the list is shuffled and the process is started over from the beginning.

David Morgan-Mar describes the following steps:

1. Make a copy of the list of numbers.
2. Sort the first n−1 elements of the copy using bogobogosort.
3. Check to see if the nth element of the sorted copy is greater than the highest element of the first n−1 elements. If so, the copy is now sorted; else randomise the order of the elements of the copy and go to step 2.
4. Check to see if the copy is in the same order as the original list.
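These steps translate almost directly into Python. The sketch below is my own rendering of Morgan-Mar's description (with >= rather than > in step 3, so that lists containing duplicates can ever finish); it is faithful enough to be catastrophically slow:

```python
import random

def bogobogosorted(lst):
    """Morgan-Mar's recursive 'is it sorted?' check (steps 1-4 above)."""
    n = len(lst)
    if n <= 1:
        return True
    copy = list(lst)                                  # step 1: copy the list
    while True:
        copy[:n - 1] = bogobogosort(copy[:n - 1])     # step 2: bogobogosort first n-1
        if copy[n - 1] >= max(copy[:n - 1]):          # step 3: nth element tops the rest?
            break                                     # the copy is now sorted
        random.shuffle(copy)                          # ...otherwise reshuffle and retry
    return copy == lst                                # step 4: copy matches the original?

def bogobogosort(lst):
    """Shuffle the whole list until the recursive check passes."""
    result = list(lst)
    while not bogobogosorted(result):
        random.shuffle(result)
    return result

print(bogobogosort([3, 1, 2]))  # fine; try 10 elements at your peril
```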

David reports attempting to sort a list of just 7 integers overnight with Bogobogosort and finally giving up and having to restart his computer.

Bogosort variants are just the beginning, however.

A 2007 paper by Miguel Lerma shows that a sorting algorithm called Worstsort can be made arbitrarily inefficient according to an increasing computable function f.

That is, we can make a sorting algorithm as inefficient as we want, even respecting the restriction that every part of the algorithm must be “useful” (that is, no iterations or delays added purely for the sake of perversity).
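Lerma's construction is short enough to sketch. As I read the paper, badsort(L, k) with k = 0 is an ordinary sort; for k > 0 it materializes all n! permutations of the input, sorts that enormous list with badsort(·, k − 1), and takes the first (smallest) permutation, which is the sorted input. Worstsort then calls badsort with depth f(n). A rough Python rendering (using Python's built-in sort as the base case, where the paper uses bubble sort):

```python
import itertools

def badsort(lst, k):
    """Sorts correctly at every k; larger k only adds (vast) wasted work."""
    if k == 0:
        return sorted(lst)                    # base case: any ordinary sort
    # Materialize all n! permutations, sort that list one level down,
    # and take its first element: the lexicographically smallest
    # permutation, which is the sorted order of lst.
    perms = [list(p) for p in itertools.permutations(lst)]
    return badsort(perms, k - 1)[0]

def worstsort(lst, f):
    """Inefficiency dialed up by any increasing computable function f."""
    return badsort(lst, f(len(lst)))

print(worstsort([3, 1, 2], lambda n: 1))      # [1, 2, 3], eventually
```

Note that every level of the recursion really is "useful" in Lerma's sense: each call genuinely sorts the list it is given.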

Worstsort could theoretically sort any list given infinite space and time, but given any implementation of Worstsort, an even less efficient version is always possible!

What’s the Point?

The quest to find optimally inefficient algorithms reflects something profound and even beautiful about the nature of computer science itself: inefficiency is apparently infinite.

When we come up with effective heuristics for a particular problem we are literally clawing back some hint of a signal from an infinite soup of noise.

Consider a problem that perplexed scholars for centuries: “species counterpoint.” Codified by 18th-century Austrian composer Johann Joseph Fux (among others), species counterpoint is a set of strict rules governing how musical notes should move with respect to one another (this was long before modern chord symbols were developed).

Even today students learn to write species counterpoint as a pedagogical exercise in music school.

Like Chess and Go, species counterpoint presents an interesting computational problem.

You “win” by successfully fulfilling the rules of the game under given conditions.

Along the way, you must avoid violating the rules of the game, and you must also avoid making choices that ultimately lead to impossible binds, situations where no valid solution exists based on past choices.

A valid solution in Fifth Species counterpoint with two melodic voices. (Image source)

I won’t bore you with the full list of the rules, but they involve detailed restrictions on how far apart notes can be from one another and where and when certain dissonant or consonant intervals can occur between the voices.

Writing a brute-force solution is rather easy.

Given a starting point, you just loop through the set of possible next notes, check if each option violates one of the rules, and choose a solution from the set of permissible moves.

This works for two voices, maybe even three voices, but if we want to compose counterpoint with four, five, or more voices, the set of possible moves rapidly exceeds practicality.

With n candidate notes per voice, checking the options at each step is O(n²) for two voices, and the cost grows like O(nᵛ) as the number of voices v increases.
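To make that concrete, here is a toy backtracking version of the brute-force approach in Python. The rule set is a deliberately tiny, hypothetical subset of the real thing (consonant intervals above the cantus firmus, no consecutive perfect intervals, no leaps larger than a fourth); pitches are MIDI note numbers:

```python
CONSONANT = {0, 3, 4, 7, 8, 9}   # unison/octave, 3rds, 5th, 6ths (semitones mod 12)

def ok(cantus, line):
    """Check the newest note of the counterpoint line against our toy rules."""
    i = len(line) - 1
    ivl = (line[i] - cantus[i]) % 12
    if line[i] <= cantus[i] or ivl not in CONSONANT:
        return False                              # must be consonant, above the cantus
    if i > 0:
        if abs(line[i] - line[i - 1]) > 5:        # no leap larger than a fourth
            return False
        prev = (line[i - 1] - cantus[i - 1]) % 12
        if ivl == prev and ivl in (0, 7):         # no consecutive unisons/5ths/octaves
            return False
    return True

def solve(cantus, line=None):
    """Depth-first backtracking: try each candidate note, recurse, undo on failure."""
    line = [] if line is None else line
    if len(line) == len(cantus):
        return line
    base = cantus[len(line)]
    for candidate in range(base + 1, base + 17):  # up to roughly a tenth above
        line.append(candidate)
        if ok(cantus, line) and solve(cantus, line):
            return line
        line.pop()
    return None

print(solve([60, 62, 64, 62, 60]))                # a five-note cantus firmus on C
```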

For hundreds of years, music students learned to do counterpoint in more or less this way, basically trying a series of notes, checking to see if they violate any rule of the species, and then erasing back to an earlier point whenever they got stuck.

It’s a bit like sudoku (and as a music student in a former life, I can tell you it’s often very frustrating!).

Then in 1956, Lejaren Hiller, a chemistry researcher and composer at the University of Illinois at Urbana-Champaign, saw certain parallels between music and scientific computation and had the idea to program the ILLIAC I computer to write counterpoint.

The ILLIAC IV, a later supercomputer developed at the University of Illinois at Urbana-Champaign and often cited as an inspiration for HAL in Stanley Kubrick’s 2001: A Space Odyssey. (Image source)

The result of the first computer-generated counterpoint was the Illiac Suite for string quartet, which Hiller wrote in collaboration with Leonard Isaacson.

With Illiac Suite the world was arguably introduced to a new way of thinking about creativity.

Hiller and Isaacson used a stochastic approach in which their software found only a single solution to a contrapuntal problem meeting certain minimum requirements. More recent research has improved on this method with decision-tree pruning and best-first search, exploiting the fact that certain species counterpoint rules are more restrictive than others.

It’s an elegant heuristic to first explore branches of the search tree that meet these more restrictive requirements, leading to an algorithm that can select a valid musical solution from larger sets of possible solutions.
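In code, the heuristic amounts to ordering the candidates before descending: score each candidate note by how well it satisfies the stricter rules (stepwise motion, contrary motion) and explore the most promising branches first. A hypothetical tweak to the solve() sketch above, reusing its ok() check:

```python
def restrictiveness(cantus, line, candidate):
    """Heuristic score: prefer stepwise and contrary motion, the more
    tightly constrained (and stylistically safer) choices."""
    score = 0
    if line:
        if abs(candidate - line[-1]) <= 2:                    # stepwise motion
            score += 2
        cantus_dir = cantus[len(line)] - cantus[len(line) - 1]
        if cantus_dir * (candidate - line[-1]) < 0:           # contrary motion
            score += 1
    return score

def solve_best_first(cantus, line=None):
    """Same backtracking as solve(), but best-scoring candidates first."""
    line = [] if line is None else line
    if len(line) == len(cantus):
        return line
    base = cantus[len(line)]
    candidates = sorted(range(base + 1, base + 17),
                        key=lambda c: -restrictiveness(cantus, line, c))
    for candidate in candidates:
        line.append(candidate)
        if ok(cantus, line) and solve_best_first(cantus, line):
            return line
        line.pop()
    return None
```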

The set of all possible species counterpoint contains many of the great compositions of the European Baroque era.

And similarly, that set contains an unimaginably large collection of garbage and noise.

It took centuries for humans to invent a way to procedurally traverse a subset of valid solutions to species counterpoint.

But we are only now just beginning to explore what makes certain solutions brilliantly creative and what makes others uninspired (through, for example, deep learning and neural nets).

Beauty In An Inefficient Universe

What can we learn from optimally inefficient algorithms? I think what we learn is that computer programming is not just a skill.

It literally touches problems that have perplexed human beings for generations, not only in the sciences but also in art and music.

When, as students, we learn about time complexity and how to choose the best heuristic for a particular problem, we are grappling with problems that have perplexed people since long before the first digital computers were invented.

And somewhere in that soup of noise and inefficiency, there are no doubt further optimizations that will make routine what was once only accessible via inspiration and genius.

Perhaps the most ironic thing about highly inefficient sorting algorithms is that the sort with the best time complexity may not be implemented by computers at all! The record may in fact be held by a bizarre “natural” sorting algorithm called Bead Sort: literally just beads falling on an abacus.

Assuming certain conditions, Bead Sort theoretically has a time complexity of O(1).

The fastest comparison sorting algorithms implemented by digital computers have a time complexity around O(n log n).
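The O(1) figure assumes the beads all fall simultaneously under gravity; simulated in software, bead sort costs O(S), where S is the total number of beads. A minimal Python simulation (my own sketch, for non-negative integers only):

```python
def bead_sort(nums):
    """Gravity sort: each number is a row of beads that fall down columns."""
    if any(n < 0 for n in nums):
        raise ValueError("bead sort handles non-negative integers only")
    columns = [0] * max(nums, default=0)
    for n in nums:                    # drop each number's beads into the columns
        for j in range(n):
            columns[j] += 1
    # read the rows back off, bottom-up: row i holds a bead in column j
    # iff at least i+1 beads landed there
    return [sum(1 for c in columns if c > i) for i in reversed(range(len(nums)))]

print(bead_sort([5, 3, 1, 7, 4, 1]))  # [1, 1, 3, 4, 5, 7]
```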

I challenge you to find a more frustrating glitch in the matrix…

Graphical illustration of Bead Sort. (Image source)
