Are there any worse sorting algorithms than Bogosort (a.k.a. Monkey Sort)?


Solution 1

From David Morgan-Mar's Esoteric Algorithms page: Intelligent Design Sort

Introduction

Intelligent design sort is a sorting algorithm based on the theory of intelligent design.

Algorithm Description

The probability of the original input list being in the exact order it's in is 1/(n!). There is such a small likelihood of this that it's clearly absurd to say that this happened by chance, so it must have been consciously put in that order by an intelligent Sorter. Therefore it's safe to assume that it's already optimally Sorted in some way that transcends our naïve mortal understanding of "ascending order". Any attempt to change that order to conform to our own preconceptions would actually make it less sorted.

Analysis

This algorithm is constant in time, and sorts the list in-place, requiring no additional memory at all. In fact, it doesn't even require any of that suspicious technological computer stuff. Praise the Sorter!
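
A minimal sketch of the idea in Python (the function name and comments are ours; the source describes the algorithm only in prose):

def intelligent_design_sort(lst):
    # The current order was chosen by an intelligent Sorter, so the
    # optimal move is to leave the list exactly as it is.
    # Constant time, in-place, no additional memory required.
    return lst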

Feedback

Gary Rogers writes:

Making the sort constant in time denies the power of The Sorter. The Sorter exists outside of time, thus the sort is timeless. To require time to validate the sort diminishes the role of the Sorter. Thus... this particular sort is flawed, and can not be attributed to 'The Sorter'.

Heresy!

Solution 2

Many years ago, I invented (but never actually implemented) MiracleSort.

Start with an array in memory.
loop:
    Check to see whether it's sorted.
    Yes? We're done.
    No? Wait a while and check again.
end loop

Eventually, alpha particles flipping bits in the memory chips should result in a successful sort.
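
A hedged Python sketch of the waiting loop (the hourly polling interval is an arbitrary assumption, and an in-memory Python list is, alas, unlikely to be flipped into order by alpha particles):

import time

def is_sorted(a):
    # True when every element is <= its successor.
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def miracle_sort(a):
    # Do nothing to the array; wait for cosmic rays to do the work.
    while not is_sorted(a):
        time.sleep(3600)  # wait a while and check again
    return a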

For greater reliability, copy the array to a shielded location, and check potentially sorted arrays against the original.

So how do you check the potentially sorted array against the original? You just sort each array and check whether they match. MiracleSort is the obvious algorithm to use for this step.

EDIT: Strictly speaking, this is not an algorithm, since it's not guaranteed to terminate. Does "not an algorithm" qualify as "a worse algorithm"?

Solution 3

Quantum Bogosort

A sorting algorithm that assumes that the many-worlds interpretation of quantum mechanics is correct:

  1. Check that the list is sorted. If not, destroy the universe.

At the conclusion of the algorithm, the list will be sorted in the only universe left standing. This algorithm takes worst-case Θ(N) and average-case Θ(1) time. In fact, the average number of comparisons performed is 2: there's a 50% chance that the universe will be destroyed on the second element, a 25% chance that it'll be destroyed on the third, and so on.
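
A tongue-in-cheek Python sketch: destroy_universe() is a hypothetical primitive, and a truly random shuffle is assumed as step 0 (see Beetle's comment below):

import random

def destroy_universe():
    # Hypothetical primitive; no known hardware implements it.
    raise RuntimeError("universe destroyed")

def quantum_bogosort(a):
    random.shuffle(a)              # step 0: quantumly randomize the list
    for i in range(len(a) - 1):    # step 1: check that the list is sorted
        if a[i] > a[i + 1]:
            destroy_universe()     # if not, destroy this universe
    return a                       # in the surviving universe, a is sorted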

Solution 4

Jingle Sort, as described here.

You give each value in your list to a different child on Christmas. Children, being awful human beings, will compare the value of their gifts and sort themselves accordingly.

Solution 5

I'm surprised no one has mentioned sleepsort yet... Or haven't I noticed it? Anyway:

#!/bin/bash
# Sleep sort: each argument sleeps for its own value in seconds and
# then prints itself, so smaller numbers wake (and print) first.
function f() {
    sleep "$1"
    echo "$1"
}
# Spawn one background sleeper per command-line argument.
while [ -n "$1" ]
do
    f "$1" &
    shift
done
wait  # block until every sleeper has printed

example usage:

./sleepsort.sh 5 3 6 3 6 3 1 4 7
./sleepsort.sh 8864569 7

In terms of performance it is terrible (especially the second example). Waiting almost 3.5 months to sort 2 numbers is kinda bad.




Comments

  • nuiun
    nuiun over 2 years

    My co-workers took me back in time to my University days with a discussion of sorting algorithms this morning. We reminisced about our favorites like StupidSort, and one of us was sure we had seen a sort algorithm that was O(n!). That got me started looking around for the "worst" sorting algorithms I could find.

    We postulated that a completely random sort would be pretty bad (i.e. randomize the elements - is it in order? no? randomize again), and I looked around and found out that it's apparently called BogoSort, or Monkey Sort, or sometimes just Random Sort.

    Monkey Sort appears to have a worst case performance of O(∞), a best case performance of O(n), and an average performance of O(n·n!).

    What is the currently officially accepted sorting algorithm with the worst average sorting performance (and therefore worse than O(n·n!))?

    • zombat
      zombat about 14 years
      How many bogomips per bogosort? Inquiring minds want to know.
    • tloflin
      tloflin about 14 years
      To clarify, are you excluding the trivial case where best case performance is O(∞)?
    • nuiun
      nuiun about 14 years
      @tloflin - generally yeah, I assume any named algorithm would have to have a remote chance of success. I'll clarify the question a bit - are there any algorithms that are worse than n*n! on average?
    • Patrick Karcher
      Patrick Karcher about 14 years
      My buttheadsort, detailed below, would be worse than n*(n!^n)!
    • Matteo Italia
      Matteo Italia about 14 years
      I heard that the monkey sort is also known as "drunk man sort", a name that I find much more evocative.
    • Debilski
      Debilski about 14 years
      You can adapt your BogoSort to use a very slow (and even better: erroneous) random generator.
    • Corey Richardson
      Corey Richardson over 11 years
      O(foo) is defined as the worst case time complexity, not a generic performance metric. Best case time complexity is Ω(foo). There is no notation for average time complexity, though Θ(foo) comes close. Not really important to the question, though.
    • Mooing Duck
      Mooing Duck over 11 years
      @Debilski: That will make it a constant factor slower, which is ignored in big-O notation. For a slower notation I don't think you'll find much.
    • Martin Capodici
      Martin Capodici almost 10 years
      @Matteo Italia - or it could be called "Toddler Sort", as anyone with a 2-year-old can attest.
    • HerpDerpington
      HerpDerpington over 8 years
      UserSort: prompt the user to sort the array manually.
    • ai8p588t
      ai8p588t over 7 years
      As of 2013, we have xkcd.com/1185
    • Scorix
      Scorix over 4 years
      As of 2019, worstsort is a sorting algorithm that does terminate, but can be made as inefficient as desired by choosing functions that grow quickly on the input. en.wikipedia.org/wiki/Bogosort#Related_algorithms
  • Patrick Karcher
    Patrick Karcher about 14 years
    No, this is worse because there's no chance of it succeeding. How would the index cards get into your kitchen? They're blowing around outside. It's called, uh, buttheadsort.
  • zombat
    zombat about 14 years
    Infinity + 1. Jinx, no returns.
  • nuiun
    nuiun about 14 years
    I think he means throw the cards up in the air outside, and then check your floor inside, where there are guaranteed to be no cards. Although not a "named" algorithm... it is certainly worse!
  • kennytm
    kennytm about 14 years
    @Patrick Quantum tunneling.
  • Patrick Karcher
    Patrick Karcher about 14 years
    @KennyTM. That had actually occurred to me. There is an extremely small but non-zero chance that any object might disappear and reappear at any other point in the universe. I guess it could happen to a thousand index cards . . . Oi. Dangit, my algorithm is flawed. I'll fix it . . .
  • zombat
    zombat about 14 years
    Not for extremely large values of 1 ;)
  • kennytm
    kennytm about 14 years
    Infinity + Infinity = Infinity.
  • erikkallen
    erikkallen about 14 years
    But it takes O(n) to check whether it's sorted, so you can get O(n*n!)
  • Rune Jeppesen
    Rune Jeppesen about 14 years
    It's kind of like having tea and no tea at the same time. Or space travel using an infinite improbability drive.
  • BioGeek
    BioGeek about 14 years
    Also known as "Assumption Sort": Assume the list is sorted, return!
  • nuiun
    nuiun about 14 years
    +100 - this answer is made out of 100% pure win.
  • zombat
    zombat about 14 years
    What really blows my mind about the concept of infinity, is that you can have different "sizes" of infinity. For example, consider the set of all integers - it is infinite in size. Now consider the set of all even integers - it is also infinite in size, but it is also clearly half the size of the first set. Both infinite, but different sizes. So awesome. The concept of "size" simply fails to work in the context of infinity.
  • Rune Jeppesen
    Rune Jeppesen about 14 years
    But time ceases to exist in the universe you just destroyed. So an observer in a universe that you have not yet checked will not be able to tell how much of the algorithm has been executed. Thus, this algorithm always takes O(1) time, since previous universe-destructions don't exist anymore.
  • BioGeek
    BioGeek about 14 years
    To quote the author of the above algorithm: On average, the universe is destroyed. It takes only O(1) time to decide to destroy the universe, and (assumption) O(1) time to destroy the universe. (destroyUniverse has all the time in the world, which should be a constant as the universe has no inputs, or else something is really screwed up with our understanding of the universe.) Therefore the average case is O(1). reddit.com/r/programming/comments/thtx/…
  • kennytm
    kennytm about 14 years
    @zombat: You're talking about cardinality, not infinity as a symbol indicating a trend on the real line / complex plane.
  • David Thornley
    David Thornley about 14 years
    @zombat. The size of the set of even integers is the same as the size of the set of the integers, as shown by the fact that you can place them in one-to-one correspondence. Now, there are more real numbers than integers, as first shown by Cantor.
  • David Thornley
    David Thornley about 14 years
    @erikkallen: Certainly we can come up with an algorithm to verify sortedness that's worse than O(n). For example, for each element in the array, verify that it's greater than all previous ones, much like insertion sort works. That's an O(n^2) algorithm, and I'm sure I could come up with worse given a little thought.
  • Brendan Long
    Brendan Long about 14 years
    Isn't it always O(n), since you have to check every element?
  • Svante
    Svante about 14 years
    @David Thornley: the following checking algorithm would perhaps show the same spirit as the bogosort: pick two random elements, check that the one with the smaller index is smaller or equal to the one with the larger index, then repeat. Keep a square bit matrix to see which combinations have already been checked. Of course, checking this matrix could also be done in a random walk...
  • BlueRaja - Danny Pflughoeft
    BlueRaja - Danny Pflughoeft about 14 years
    And if you really want to blow your mind about infinity: blog.xkcd.com/2010/02/09/math-puzzle
  • Nick Johnson
    Nick Johnson about 14 years
    Yes, in the only universe that observes the list sorted, it took O(n) time to execute - how long it took in other universes is irrelevant.
  • uckelman
    uckelman about 14 years
    Whether ω + 1 = ω depends on whether you're doing cardinal or ordinal arithmetic.
  • Joe D
    Joe D over 13 years
    Hey! Don't forget "Indecisive Sort" (Also know as "Schrodinger's Sort" or "Quantum Sort"), where the list may or may not be sorted, however checking it will reveal whether or not it is. Here is my sample implementation: void quantum_sort (void *b, size_t n, size_t s, int (*c)(const void *, const void*)) { if (rand () % 2) qsort (b, n, s, c); }.
  • Nick Johnson
    Nick Johnson over 12 years
    This algorithm has a much bigger problem, however. Assume that one in 10 billion times you will mistakenly conclude a list is sorted when it's not. There are 20! ways to sort a 20 element list. After the sort, the remaining universes will be the one in which the list was sorted correctly, and the roughly 240 million universes in which the algorithm mistakenly concluded the list was sorted correctly. So what you have here is an algorithm for massively magnifying the error rate of a piece of machinery.
  • ghord
    ghord over 11 years
    I assume one can use cosmic rays to prove correctness of this algorithm.
  • Boann
    Boann over 11 years
    This is obviously the best sorting algorithm, not the worst.
  • Mooing Duck
    Mooing Duck over 11 years
    What's the big O of this? O(2^N)?
  • Mooing Duck
    Mooing Duck over 11 years
    More interesting to analyze is the average case, which is...?
  • Mooing Duck
    Mooing Duck over 11 years
    This appears to be an O(N) sort, but in truth is constrained by however the OS implements timers.
  • Mooing Duck
    Mooing Duck over 11 years
    The second is almost the equivalent of a bozo sort. First is clever though.
  • Daniel
    Daniel over 11 years
    As all the best text books say, this is left as an exercise for the reader!
  • Mooing Duck
    Mooing Duck over 11 years
    Any way you cut it, this probably exhibits better growth than bogosort.
  • Keith Thompson
    Keith Thompson about 11 years
    @MooingDuck: I don't think it actually has a big O.
  • Mooing Duck
    Mooing Duck about 11 years
    There are many algorithms which are not guaranteed to terminate (Bogosort for one). I think this algorithm's average case is O(2^N), since each bit in the dataset halves the odds of success each round (but I'm not betting money on being right). You're right that technically it's big-O-infinity since it may not terminate.
  • Keith Thompson
    Keith Thompson about 11 years
    @MooingDuck: Strictly speaking, if it doesn't terminate it's not an algorithm, according to both what they taught me in college and the Wikipedia article.
  • Mooing Duck
    Mooing Duck about 11 years
    @KeithThompson: That's interesting, we might have to ask on TheoreticalCompSci or something. I agree with what you said, but clarify that both this and the bogosort do eventually produce output and terminate at a final state, there simply happens to be no upper bound on the time before that happens. I think it still qualifies as an algorithm, but I'm less certain.
  • Ilya O.
    Ilya O. about 11 years
    Mooing Duck: O(sometimes)
  • Admin
    Admin about 11 years
    I see a race condition there.
  • AJMansfield
    AJMansfield about 11 years
    For an easier-to-implement version, have it use a quantum postdecision device. That way, if, after we shuffle it, we determine that it is not correct, we can have the sorting process terminated before we check if it is sorted. This ensures that all we need to do is simply check the list after the shuffle, because if our check concludes it isn't sorted, we wouldn't have checked (and we already know for sure that we did check it). Much easier than destroying the universe (we don't have capable hardware yet).
  • Beetle
    Beetle about 11 years
    You have missed out a crucial first step: 0. Shuffle the list using true randomness.
  • Display Name
    Display Name almost 11 years
    @MooingDuck then we need to know cardinality of element type and distribution used to generate random elements in random arrays.
  • Olathe
    Olathe almost 11 years
    The Wikipedia article isn't conclusive. It says "While there is no generally accepted formal definition of 'algorithm,' an informal definition could be 'a set of rules that precisely defines a sequence of operations.'" Given the halting problem, it seems pretty strange to have terminology that requires us to solve it just in order to know what to call something. Also, what would you call the things that don't always terminate or the things that we're not sure if they terminate? That definition creates quite a mess.
  • Keith Thompson
    Keith Thompson almost 11 years
    @Olathe: The Halting Problem says we can't determine for all programs whether they halt, but there are plenty of programs for which we can make that determination. We know Quicksort and Bubblesort halt, and we know they're algorithms.
  • sisharp
    sisharp over 10 years
    @Olathe Hold on a sec while I clarify the wiki for you.... :P
  • A.Grandt
    A.Grandt about 10 years
    I like the idea that this algorithm is designed to never finish "before the heat death of the universe for any sizeable list"
  • Jim Balter
    Jim Balter almost 10 years
    There must be some severe misunderstanding from that duck to think that an "algorithm" that consists of waiting for an array to sort itself without doing anything to it has O(2^N).
  • Jim Balter
    Jim Balter almost 10 years
    "Given the halting problem, it seems pretty strange to have terminology that requires us to solve it just in order to know what to call something." -- Severe abstraction failure. Programs that do or don't halt are programs that do or don't halt regardless of any ability to determine that. "Also, what would you call the things that don't always terminate" -- Severe basic logic failure. Since an algorithm is a procedure that always halts, something that doesn't always halt is a procedure that isn't an algorithm.
  • Jim Balter
    Jim Balter almost 10 years
    "The Halting Problem says we can't determine for all programs whether they halt" -- it's remarkable how many people don't understand this. Virtually every program ever written by a human being can be shown to halt or not halt. It's just that a program that purportedly determines that for all programs can't determine that for itself, so we know it's not generally possible. This theoretical limit is constantly misapplied.
  • Ambuj Mehra
    Ambuj Mehra almost 10 years
    Failure to heed Beetle's advice may result in all universes being destroyed.
  • jakubiszon
    jakubiszon almost 10 years
    The complexity is O(N! * Z^N) where Z is the size of the set of possible values and N is the length of the array.
  • echochamber
    echochamber almost 10 years
    We should dub this Candide Sort: "This is the best of all possible worlds because it is the world that is, and so in the best possible world the array would already be sorted!"
  • Sam Kellett
    Sam Kellett almost 10 years
    You can change the sleep "$1" to sleep "0.$(printf "%010d" $1)" to improve performance markedly. time ./sleepsort.sh 8864569 7 then runs in 0.009s on my laptop.
  • Bryson
    Bryson over 9 years
    I, for one, welcome our new sorting overlord. All hail the sorter!
  • Ben Voigt
    Ben Voigt over 9 years
    @Jakub: I think you forgot the complexity of the comparison step.
  • Qwerty01
    Qwerty01 over 9 years
    This runs in O(N) complexity (dependent of course on the implementation of the timer), it's a simple bucket sort in different form.
  • Don Larynx
    Don Larynx about 9 years
    Turning doesntNeedToSort; into monkeySort;
  • Cole Tobin
    Cole Tobin almost 9 years
    @SamKellett That requires numbers be no bigger than 10 digits. And there could be something about sub millisecond precision race conditions, but I could be wrong.
  • MrKekson
    MrKekson almost 9 years
    Holy mother of code. I think we can even do a Bogolplex sort.
  • Erik Bergstedt
    Erik Bergstedt about 8 years
    bool intelligent_design_sort( list ){ return false; }
  • Charles
    Charles about 8 years
    The first is Miracle Sort.
  • Darren H
    Darren H over 7 years
    This answer has a faint whiff of Douglas Adams about it
  • ai8p588t
    ai8p588t over 7 years
    Wow, you've implemented a multi-thread counting sort.
  • Nic
    Nic about 7 years
    @BenVoigt That'd just be a + 2N (check if sorted = N, check if data is the same = N), and you only need to consider the highest-order term, which is clearly the factorial times the exponential
  • Ben Voigt
    Ben Voigt about 7 years
    @QPaysTaxes: Not if you use bogocompare.
  • swyx
    swyx almost 7 years
    Great worked answer. Sad it doesn't have visibility.
  • Ryan Haining
    Ryan Haining over 5 years
    pseudopolynomial!
  • Pablo H
    Pablo H almost 3 years
    In "average complexity", the average is taken over all possible inputs, not over all possible universes (unless stated otherwise). So step 1 is O(N) plus the cost of universe destruction, as Brendan Long stated.
  • Pablo H
    Pablo H almost 3 years
    Analyzing the complexity of destroyUniverse() gets tricky: yeah, size of universe is constant (?), but the size of (the representation of) the input list is less than size of universe, because it is contained therein...
  • António Oliveira
    António Oliveira over 2 years
    The time analysis assumes that alpha particles won't actually flip any values: if they did, accounting for the probability of a bit flip would make this sorting algorithm never-ending. A bit flip occurs with almost 96% probability every 3 days in a computer with 4 GB of memory (link source).
  • hamidb80
    hamidb80 about 2 years
    Oh My God you are a genius!