What are some reasons you might choose an algorithm with a worse runtime? And what are some advantages of a linked list over a hash table or array, where access is constant time?

  • A canonical answer that covers all data structures and algorithms known to computer scientists would be far too broad for one post by a few orders of magnitude. There's a reason one of the best textbooks on this stuff is over a thousand pages long.
    – Ixrec
    Commented Feb 6, 2016 at 21:51
  • @Ixrec I guess what I meant is that I'm looking for a simple, generalized answer. Commented Feb 6, 2016 at 21:52
  • You'd have to drastically narrow the scope of your question before any kind of answer would become feasible, much less a simple answer.
    – Ixrec
    Commented Feb 6, 2016 at 21:53
  • @Ixrec I've made edits. Commented Feb 6, 2016 at 21:55

1 Answer


What are some reasons you might choose an algorithm with a worse runtime?

By far the most common reason is that the "worse" algorithm is a lot simpler, or is the first solution I think of, and getting mathematically optimal performance just doesn't matter in the part of the code I'm currently working on. Which is most of the code.
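
For instance (a hypothetical sketch, with made-up names): a plain linear scan is often the right call even where a hash set would be asymptotically faster.

    #include <algorithm>
    #include <string>
    #include <vector>

    // Hypothetical membership check over a handful of role names. A
    // std::unordered_set would make lookups O(1), but for a few elements
    // a linear O(n) scan is simpler and just as fast in practice.
    bool has_role(const std::vector<std::string>& roles,
                  const std::string& wanted) {
        return std::find(roles.begin(), roles.end(), wanted) != roles.end();
    }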

Another common reason is that, in general, theoretically faster algorithms often require making stronger assumptions about the input. Radix sort is asymptotically faster than quicksort/mergesort/heapsort/etc. (O(n·k) rather than O(n log n)), but it only works if you can bound the range, or the bit width, of your input numbers. I don't want to hardcode arbitrary mins and maxes in my program without a very good reason, since that's exactly the sort of thing that makes programs brittle and unpredictable in the long run. So most of the time I'm not going to use radix sort unless the business requirements happen to give me a min and a max for free.
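
To illustrate the kind of assumption involved, here is a minimal LSD radix sort sketch, assuming the keys are non-negative 32-bit integers. That fixed bit width is exactly the min/max-style constraint discussed above; a comparison sort has no such constraint.

    #include <array>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Minimal LSD radix sort sketch, assuming non-negative 32-bit integer
    // keys (a fixed, known key range). Four base-256 counting passes, one
    // per byte, from least to most significant.
    void radix_sort(std::vector<std::uint32_t>& a) {
        std::vector<std::uint32_t> buf(a.size());
        for (int shift = 0; shift < 32; shift += 8) {
            std::array<std::size_t, 257> count{};
            for (std::uint32_t x : a)          // histogram of this byte
                ++count[((x >> shift) & 0xFF) + 1];
            for (int i = 0; i < 256; ++i)      // prefix sums give the
                count[i + 1] += count[i];      // start of each bucket
            for (std::uint32_t x : a)          // stable scatter pass
                buf[count[(x >> shift) & 0xFF]++] = x;
            a.swap(buf);                       // output becomes next input
        }
    }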

And what are some advantages of a linked list over a hash table or array, where access is constant time?

The biggest advantage of a linked list over an array or hash table is that inserting or removing an element anywhere in the list is a constant time operation, even in the non-amortized worst case (as long as you already have a reference to the node you want to insert/remove at). For arrays, insertion and removal anywhere except the end are linear time operations, because every later element has to shift. For hash tables, insertion and removal are amortized constant time, but linear in the worst case (for example, when a rehash is triggered).
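
A short C++ sketch of that difference, assuming we have already saved an iterator to the node we want to remove (std::list is a doubly linked list, so erasing through an iterator just relinks the neighbouring nodes):

    #include <iterator>
    #include <list>
    #include <vector>

    int main() {
        std::list<int>   lst{1, 2, 3, 4, 5};
        std::vector<int> vec{1, 2, 3, 4, 5};

        auto node = std::next(lst.begin(), 2);  // saved "reference to the node"
        lst.erase(node);                        // O(1): relinks the neighbours

        vec.erase(vec.begin() + 2);             // O(n): shifts every later element
    }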

In some cases linked lists (and trees) are also useful because pointers to their nodes do not get invalidated by future insertions. They only become invalid when that particular node is deleted. With arrays and hash tables, enough insertions will eventually force a reallocation (or rehash), which invalidates all pointers and iterators to existing elements. In many languages this is a complete non-issue, but in something like C or C++ it can occasionally be very important.
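
A small C++ sketch of the invalidation difference; the element counts here are arbitrary, chosen only to make the vector grow past its initial capacity:

    #include <cstdint>
    #include <iostream>
    #include <list>
    #include <vector>

    int main() {
        std::vector<int> vec{42};
        std::list<int>   lst{42};

        // Record the first element's address as an integer, so it stays
        // safe to compare even after the original pointer is invalidated.
        auto vec_addr = reinterpret_cast<std::uintptr_t>(&vec[0]);
        auto lst_addr = reinterpret_cast<std::uintptr_t>(&lst.front());

        for (int i = 0; i < 1000; ++i) {  // enough growth to force reallocation
            vec.push_back(i);
            lst.push_back(i);
        }

        std::cout << "vector element moved: "
                  << (vec_addr != reinterpret_cast<std::uintptr_t>(&vec[0]))
                  << "\nlist node moved:      "
                  << (lst_addr != reinterpret_cast<std::uintptr_t>(&lst.front()))
                  << "\n";  // typically prints 1 (moved) and 0 (stable)
    }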
