Template talk:Infobox algorithm


Complete?

Does anyone know if the term "complete" (in this sense) is widely used? I'm having a hard time finding any other references to it outside of Wikipedia. I think including it in this table is just confusing, especially with the other meanings of the word. BrokenSegue 11:05, 23 April 2008 (UTC)

Speaking of which, can anybody make out what "optimal" means? -- Smjg (talk) 14:18, 10 April 2009 (UTC)

  • Meaning: it is proven to run at the lower bound among all possible algorithms that solve the given problem. John Sheu (talk) —Preceding undated comment added 03:48, 18 April 2009 (UTC).

Merge Sort's best case is O(n)??

The box says the best-case runtime for merge sort is O(n). I am pretty sure that is wrong. The runtime is *always* O(n log n). —Preceding unsigned comment added by 98.232.181.18 (talk) 20:10, 5 June 2009 (UTC)

Darkhan Shalgynbayev (talk) 08:51, 9 June 2009 (UTC): The mergesort algorithm's time efficiency is Θ(n log n) in all cases [Anany Levitin. Introduction to The Design and Analysis of Algorithms, 2nd Edition. Addison Wesley, 2007. p. 155]. So I agree with the objection above!
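A minimal Python sketch (illustrative only, not from the cited text) of why the standard top-down mergesort does Θ(n log n) work even on already-sorted input: the merge step copies every element at each of the ⌈log₂ n⌉ recursion levels, regardless of the input order.

```python
def merge_sort(a):
    """Standard top-down mergesort. Each recursion level costs
    Theta(n) because the merge below touches every element, even
    when the input is already sorted."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge: walks through all elements of both halves in every case.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```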

Animation could be improved with a colored background.

Use one color to show the region of the image currently being sorted. Use two other colors to show which two regions are being merged. Would be nice if it were a bit slower, too. Dave Yost (talk) 19:53, 10 July 2009 (UTC)

Optimality definitions unnecessary and illogical?

I find the "Optimal" entry confusingly ill-defined. Bogosort is "Optimal: No". Heapsort is "Optimal: Never". Quicksort is "Optimal: Sometimes". Is this trying to imply that Heapsort is inferior to Bogosort? What criteria are being used to judge these? What are the possible responses? It's not feasible to assert that the optimality of a single algorithm can be stated for all possible cases. Shouldn't the optimality of an algorithm be considered more fully in the article itself, for given cases? Isn't it enough to list the best/average/worst time/space complexities? If my primary objective is to entertain, would I be remiss in changing Bogosort to "Optimal: Hilarious"? —Preceding unsigned comment added by 141.195.226.101 (talk) 22:29, 18 October 2009 (UTC)

I agree, the optimal parameter doesn't seem to have been very well thought out, and the lack of documentation appears to have led to a ridiculous number of whimsical variants. I suggest the choices Yes (it is an asymptotically optimal algorithm) and No (it isn't). Does anyone really have a good reason for middle ground? About the best example would be quicksort, which is currently Optimal: Sometimes but (using the standard of worst-case analysis) seems like it would be Optimal: No. Cheers, — sligocki (talk) 03:33, 17 November 2009 (UTC)
I strongly believe that the "optimal" parameter should be removed from the template. Some of my objections are explained here: Talk:Depth-first_search#Optimality. In short, these are my main objections:
  1. Optimality is not well-defined for algorithms which solve many problems (like BFS/DFS).
  2. Optimality is not well-defined if there are multiple correct solutions, and some are better than others. For instance, BFS might find shorter paths than DFS. Is BFS optimal and DFS non-optimal?
  3. Optimality can be discussed in the article. Putting it along with time and space complexity gives the issue WP:UNDUE WEIGHT.
  4. How many optimal algorithms do we have anyway? O(n log n) sorting algorithms are non-optimal, since the best lower bound is linear. We have no natural problems in NP with a super-linear lower bound, so almost all algorithms on Wikipedia are non-optimal. (Except binary search and those algorithms that take linear time for problems where the input must be read.)
  5. Editors interpret the word to mean whatever they feel like, which leads to silly things like Merge sort being optimal, Heapsort being "never" and Radix sort being "exactly correct".
Let's get rid of this confusing parameter. Robin (talk) 04:58, 17 November 2009 (UTC)
I support Robin's proposal per the discussion we've been having at Talk:Depth-first search. —David Eppstein (talk) 05:03, 17 November 2009 (UTC)
It seemed to me that optimal meant worst case, not best case, so all O(n log n) sorting algorithms are provably optimal. However, I agree that this parameter doesn't seem to have much good purpose and plenty of bad uses. I support getting rid of it. Cheers, — sligocki (talk) 00:12, 18 November 2009 (UTC)
Well, I've gotten rid of it. About optimality, O(n log n) sorting algorithms are not known to be optimal because we don't have an Ω(n log n) lower bound for sorting in the standard RAM model or the Turing machine model. The Ω(n log n) lower bound is in the decision tree model. The best lower bound known for sorting is just Ω(n). --Robin (talk) 00:43, 18 November 2009 (UTC)
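For context, the decision-tree bound mentioned above comes from a counting argument: a comparison sort must distinguish all n! input orderings, so its decision tree has at least n! leaves and depth at least ⌈log₂(n!)⌉, which is Θ(n log n) by Stirling's approximation. A quick numerical check (Python sketch; the function name is my own, for illustration):

```python
import math

def comparison_sort_lower_bound(n):
    """A comparison sort must distinguish all n! input orderings,
    so its decision tree has at least n! leaves and therefore
    depth >= ceil(log2(n!)) -- Theta(n log n) by Stirling."""
    return math.ceil(math.log2(math.factorial(n)))

# The bound tracks n * log2(n) closely as n grows.
for n in (10, 100, 1000):
    print(n, comparison_sort_lower_bound(n), round(n * math.log2(n)))
```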
This is the OP of this section (forgot to sign in that day, apparently), and I'm glad to see that someone who knows what they're talking about took a look at this. Unfortunately, I feel like the idea of "optimality" with regard to algorithms is pretty inconsistent over Wikipedia as a whole. Since this infobox ostensibly graces the pages of many prominent algorithms, I feel like this is a good place to start addressing this problem, but it could probably be served better by a page on algorithmic optimality.
To start, an example: Radix sort#Efficiency. This says that radix sort is optimal by assuming that the time to execute grows linearly with the input, and links to Optimal, which is a disambiguation page with no clear option for the sense of "optimal" that the article means. Furthermore, look here: Quicksort#Parallelizations says that there exists a parallelized quicksort algorithm with lower complexity, which makes radix sort's claims of optimality look silly.
But as for uniting this discussion somewhere appropriate, I feel that someone with knowledge of the subject would be well-served in making a page to discuss optimality with regard to algorithms and computational complexity theory. I have yet to find an alternative page that talks about an algorithm being "optimal" in any substantive way. Best, worst and average case is a start, but it has a vague name and repeats much of what is said in Big O notation, which is itself more about computing the complexity of an algorithm rather than giving methods of optimization, such as via dynamic programming. Optimal could maybe use a link to Best, worst and average case, as currently the only viable alternative is Program optimization, which is more about practical optimization for specific implementations than theoretical optimization of general algorithms.
KHAAAAAAAAAAN (talk) 05:53, 29 November 2009 (UTC)
I fixed the link in Radix sort to point to asymptotically optimal. --Robin (talk) 03:11, 30 November 2009 (UTC)

Inconsistency in space complexity

There doesn't seem to be any agreement on whether "space" is supposed to include the memory already occupied by the data on which the algorithm operates.

Most sorting algorithms start with an array of values, which obviously takes up O(n) space. What varies between them is the auxiliary space complexity. But even among algorithms with O(1) auxiliary space complexity, there is a discrepancy:

We need to standardise.... — Smjg (talk) 01:48, 16 December 2011 (UTC)

I'm not sure if this is the right place for this discussion; anyway, I think the space occupied by the input should not be counted. If you have input data of N pieces to be sorted, it already occupies some O(N) space, and this does not depend on the algorithm you will use to sort them. Moreover, it uses that memory even if you don't sort it at all. What is really important is not the obvious fact that the sorting algorithm will use the data it sorts, but how much memory we need in addition to complete the task. So my opinion is: we should give the auxiliary space only. --CiaPan (talk) 17:29, 3 January 2012 (UTC)
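A toy Python sketch of the distinction being drawn (function names are my own, purely illustrative): the same sorting routine can be reported as O(1) or O(n) space depending on whether the unavoidable input storage is counted, so the infobox should state which convention it uses.

```python
def insertion_sort_in_place(a):
    """O(1) auxiliary space: besides the O(n) already occupied by
    the input list, only a few scalar variables are allocated."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]  # shift larger elements one slot right
            j -= 1
        a[j + 1] = key
    return a

def sorted_copy(a):
    """Contrast: leaves the input untouched and returns a new list,
    so it needs O(n) auxiliary space on top of the input -- the
    extra cost the comment above suggests reporting."""
    return insertion_sort_in_place(list(a))
```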

"Image size" argument does not change the size of the image

Hi, I used the "Image size" argument of this template in the DPLL algorithm article. But even after removing "300px" from the File argument, "Image size", "size", and "imagesize" have no effect on the size of the picture. That is, the previous size is always shown, and changing this argument makes the image neither bigger nor smaller. Please correct this bug. Thanks, Hooman Mallahzadeh (talk) 13:16, 4 May 2021 (UTC)

The problem was your incorrect argument in the image parameter. It should be |image=Dpll11.png, without the "File:" and without the brackets. —David Eppstein (talk) 23:46, 4 May 2021 (UTC)
@David Eppstein: Thanks for your guidance, that works well.