Time Complexity


Time complexity is a concept in computer science that describes the amount of time a piece of code or an algorithm takes to run as a function of the size of its input.
In other words, time complexity is essentially a measure of efficiency: how long a function takes to process a given input.
Time complexity is typically expressed in "big O notation," but there are other notations as well.

Types of Notations for Time Complexity

1. Big Oh denotes "fewer than or the same as" <expression> iterations.
2. Big Omega denotes "more than or the same as" <expression> iterations.
3. Big Theta denotes "the same as" <expression> iterations.
4. Little Oh denotes "fewer than" <expression> iterations.
5. Little Omega denotes "more than" <expression> iterations.

Understanding Notations of Time Complexity

O(expression) is the set of functions that grow slower than or at the same rate as expression. It indicates the maximum time required by an algorithm for all input values. It represents the worst case of an algorithm's time complexity.
Omega(expression) is the set of functions that grow faster than or at the same rate as expression. It indicates the minimum time required by an algorithm for all input values. It represents the best case of an algorithm's time complexity.
Theta(expression) consists of all the functions that lie in both O(expression) and Omega(expression). It indicates a tight bound on an algorithm's running time and is commonly used to describe the average case of an algorithm's time complexity.
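
As a quick illustration (the class and method names below are ours, not from any library), consider a routine whose exact step count is f(n) = 3n + 2:

public class StepCount {
    // Hypothetical routine whose exact step count is f(n) = 3n + 2.
    // Since 3n + 2 grows at the same rate as n:
    //   f(n) is in O(n)     - it grows no faster than n (upper bound)
    //   f(n) is in Omega(n) - it grows no slower than n (lower bound)
    //   f(n) is in Theta(n) - both bounds hold, so the bound is tight
    static int countSteps(int n) {
        int steps = 2;                  // 2 fixed steps
        for (int i = 0; i < n; i++) {
            steps += 3;                 // 3 steps per element
        }
        return steps;                   // 3n + 2 total
    }

    public static void main(String[] args) {
        System.out.println(countSteps(10)); // prints 32 = 3*10 + 2
    }
}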
Typical Algorithm Complexities
The following table explains what each type of complexity (running time) means:

| Complexity | Running Time | Description |
| constant | O(1) | It takes a constant number of steps to perform a given operation (for example 1, 100, 10,000 or some other fixed count), and this count does not depend on the size of the input data. |
| logarithmic | O(log(n)) | It takes on the order of log(n) steps, where the base of the logarithm is most often 2, to perform a given operation on n elements. For example, if n = 1,000,000, an algorithm with complexity O(log(n)) would take about 20 steps (to within a constant factor). |
| linear | O(n) | It takes nearly as many steps as there are elements. For example, for 10,000 elements it takes about 10,000 steps. Linear complexity means the number of steps depends linearly on the number of elements: for n elements the step count can be n/3, 4*n or n+1. |
| linearithmic | O(n*log(n)) | It takes n*log(n) steps to perform a given operation on n elements. For example, for 1,000 elements it takes about 10,000 steps. |
| quadratic | O(n^2) | It takes on the order of n^2 steps, where n is the size of the input data. For example, for n elements the step count can be on the order of n^2/2. |
| cubic | O(n^3) | It takes on the order of n^3 steps, where n is the size of the input data. |
| exponential | O(2^n), O(n!), O(k^n), … | It takes a number of steps that grows exponentially with the size of the input data. For example, if n = 10, the exponential function 2^n has a value of 1,024. The function n! grows even faster: for n = 5 it has a value of 120, for n = 10 it is 3,628,800, and for n = 20 it is 2,432,902,008,176,640,000. |
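
To make the table concrete, here is a small Java sketch (the method names are illustrative, not from the original post) with one method per common complexity class:

import java.util.Arrays;

public class ComplexityExamples {

    // O(1) - constant: one step regardless of the array's size.
    static int first(int[] a) {
        return a[0];
    }

    // O(n) - linear: visits each of the n elements exactly once.
    static long sum(int[] a) {
        long total = 0;
        for (int x : a) total += x;
        return total;
    }

    // O(n*log(n)) - linearithmic: efficient comparison-based sorting;
    // Arrays.sort takes about n*log(n) steps on typical inputs.
    static void sortInPlace(int[] a) {
        Arrays.sort(a);
    }

    // O(n^2) - quadratic: a nested loop over all pairs of elements.
    static int countEqualPairs(int[] a) {
        int count = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] == a[j]) count++;
        return count;
    }

    public static void main(String[] args) {
        int[] data = {3, 1, 3, 2};
        System.out.println(first(data));           // 3
        System.out.println(sum(data));             // 9
        System.out.println(countEqualPairs(data)); // 1 (the pair of 3s)
        sortInPlace(data);                         // data is now {1, 2, 3, 3}
    }
}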



Best, Worst and Average Case
The complexity of an algorithm is usually evaluated in the worst case. This means that in the average case it can work faster, but in the worst case it works with the evaluated complexity and not slower.
Let's take an example: searching in an array. To find the searched key in the worst case, we have to check all the elements in the array. In the best case we get lucky and find the element at the first position. In the average case we can expect to check about half the elements in the array before finding the one we are looking for. Hence in the worst case the complexity is O(N) – linear. In the average case the complexity is O(N/2) = O(N) – linear, because constant factors are ignored when evaluating complexity. In the best case we have constant complexity O(1), because we make only one step and directly find the element.
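
The array-search example translates directly into Java (a minimal sketch; the class and method names are ours):

public class LinearSearch {
    // Linear search: returns the index of key in a, or -1 if absent.
    // Best case:    key is at index 0           -> O(1)
    // Average case: about N/2 elements checked  -> O(N/2) = O(N)
    // Worst case:   key is last or not present  -> O(N)
    static int linearSearch(int[] a, int key) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == key) {
                return i;  // found after i + 1 comparisons
            }
        }
        return -1;         // not found: all N elements were checked
    }

    public static void main(String[] args) {
        int[] a = {7, 3, 9, 3, 5};
        System.out.println(linearSearch(a, 7)); // best case: index 0, one step
        System.out.println(linearSearch(a, 5)); // worst case: index 4, N steps
        System.out.println(linearSearch(a, 8)); // not present: -1, N steps
    }
}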

| Data structure | Addition | Search | Deletion | Access by index |
| Array (T[]) | O(N) | O(N) | O(N) | O(1) |
| Linked list (LinkedList<T>) | O(1) | O(N) | O(N) | O(N) |
| Dynamic array (List<T>) | O(1) | O(N) | O(N) | O(1) |
| Stack (Stack<T>) | O(1) | - | O(1) | - |
| Queue (Queue<T>) | O(1) | - | O(1) | - |
| Dictionary, implemented with a hash table (Dictionary<K, T>) | O(1) | O(1) | O(1) | - |
| Dictionary, implemented with a balanced search tree (SortedDictionary<K, T>) | O(log(N)) | O(log(N)) | O(log(N)) | - |
| Set, implemented with a hash table (HashSet<T>) | O(1) | O(1) | O(1) | - |
| Set, implemented with a balanced search tree (SortedSet<T>) | O(log(N)) | O(log(N)) | O(log(N)) | - |

List

A list is an ordered collection of elements.

| List | Add | Remove | Get | Contains | Next | Data Structure |
| ArrayList | O(1) | O(n) | O(1) | O(n) | O(1) | Array |
| LinkedList | O(1) | O(1) | O(n) | O(n) | O(1) | Linked List |
| CopyOnWriteArrayList | O(n) | O(n) | O(1) | O(n) | O(1) | Array |
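
The Get and Remove columns show the key trade-off between the two main implementations; a minimal sketch:

import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class ListCosts {
    public static void main(String[] args) {
        List<Integer> arrayList  = new ArrayList<>();
        List<Integer> linkedList = new LinkedList<>();

        arrayList.add(42);         // O(1) amortized: append to a backing array
        linkedList.add(42);        // O(1): link a new node at the tail

        int a = arrayList.get(0);  // O(1): direct index into the backing array
        int b = linkedList.get(0); // O(n) in general: walks node by node

        // contains is a linear scan in both implementations: O(n)
        System.out.println(arrayList.contains(42) && linkedList.contains(42));
        System.out.println(a + b);
    }
}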

Set

A collection that contains no duplicate elements.

| Set | Add | Remove | Contains | Size | Next | Data Structure |
| HashSet | O(1) | O(1) | O(1) | O(1) | O(h/n) | Hash Table |
| LinkedHashSet | O(1) | O(1) | O(1) | O(1) | O(1) | Hash Table + Linked List |
| EnumSet | O(1) | O(1) | O(1) | O(1) | O(1) | Bit Vector |
| TreeSet | O(log n) | O(log n) | O(log n) | O(1) | O(log n) | Red-Black Tree |
| CopyOnWriteArraySet | O(n) | O(n) | O(n) | O(1) | O(1) | Array |
| ConcurrentSkipListSet | O(log n) | O(log n) | O(log n) | O(n) | O(1) | Skip List |

Here h is the capacity of the underlying hash table: advancing an iterator (Next) over a HashSet costs O(h/n) on average, because empty buckets must be skipped.
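
A short sketch of the HashSet vs. TreeSet trade-off from the table (illustrative code, not from the original post):

import java.util.HashSet;
import java.util.TreeSet;

public class SetCosts {
    public static void main(String[] args) {
        HashSet<String> hashSet = new HashSet<>();
        TreeSet<String> treeSet = new TreeSet<>();

        hashSet.add("apple");      // O(1) expected: hash the key, insert in a bucket
        treeSet.add("apple");      // O(log n): descend a red-black tree

        // Lookup follows the same pattern.
        System.out.println(hashSet.contains("apple")); // O(1) expected -> true
        System.out.println(treeSet.contains("apple")); // O(log n)      -> true

        // The trade-off: TreeSet pays O(log n) per operation but keeps
        // its elements sorted; HashSet is O(1) but unordered.
        System.out.println(treeSet.first()); // "apple"
    }
}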

Queue

A collection designed for holding elements prior to processing.

| Queue | Offer | Peek | Poll | Size | Data Structure |
| PriorityQueue | O(log n) | O(1) | O(log n) | O(1) | Priority Heap |
| LinkedList | O(1) | O(1) | O(1) | O(1) | Linked List |
| ArrayDeque | O(1) | O(1) | O(1) | O(1) | Array |
| ConcurrentLinkedQueue | O(1) | O(1) | O(1) | O(n) | Linked List |
| ArrayBlockingQueue | O(1) | O(1) | O(1) | O(1) | Array |
| PriorityBlockingQueue | O(log n) | O(1) | O(log n) | O(1) | Priority Heap |
| SynchronousQueue | O(1) | O(1) | O(1) | O(1) | None! |
| DelayQueue | O(log n) | O(1) | O(log n) | O(1) | Priority Heap |
| LinkedBlockingQueue | O(1) | O(1) | O(1) | O(1) | Linked List |
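
The PriorityQueue rows are worth a concrete look, since its heap makes Offer and Poll logarithmic while Peek stays constant (a minimal sketch):

import java.util.PriorityQueue;

public class QueueCosts {
    public static void main(String[] args) {
        PriorityQueue<Integer> pq = new PriorityQueue<>();

        pq.offer(5);   // O(log n): sift the new element up the heap
        pq.offer(1);
        pq.offer(3);

        System.out.println(pq.peek()); // O(1): the minimum sits at the root -> 1
        System.out.println(pq.poll()); // O(log n): remove root, sift down   -> 1
        System.out.println(pq.poll()); //                                    -> 3
    }
}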

Map

An object that maps keys to values. A map cannot contain duplicate keys; each key can map to at most one value.

| Map | Get | ContainsKey | Next | Data Structure |
| HashMap | O(1) | O(1) | O(h/n) | Hash Table |
| LinkedHashMap | O(1) | O(1) | O(1) | Hash Table + Linked List |
| IdentityHashMap | O(1) | O(1) | O(h/n) | Array |
| WeakHashMap | O(1) | O(1) | O(h/n) | Hash Table |
| EnumMap | O(1) | O(1) | O(1) | Array |
| TreeMap | O(log n) | O(log n) | O(log n) | Red-Black Tree |
| ConcurrentHashMap | O(1) | O(1) | O(h/n) | Hash Tables |
| ConcurrentSkipListMap | O(log n) | O(log n) | O(1) | Skip List |
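
As with sets, the HashMap vs. TreeMap rows capture the central trade-off (illustrative code, not from the original post):

import java.util.HashMap;
import java.util.TreeMap;

public class MapCosts {
    public static void main(String[] args) {
        HashMap<String, Integer> hashMap = new HashMap<>();
        TreeMap<String, Integer> treeMap = new TreeMap<>();

        hashMap.put("one", 1);     // O(1) expected: hash the key, insert in a bucket
        treeMap.put("one", 1);     // O(log n): descend a red-black tree

        System.out.println(hashMap.get("one"));          // O(1) expected -> 1
        System.out.println(treeMap.get("one"));          // O(log n)      -> 1
        System.out.println(hashMap.containsKey("one"));  // O(1) expected -> true

        // TreeMap keeps keys sorted, enabling range queries that a
        // HashMap cannot answer without a full scan.
        System.out.println(treeMap.firstKey()); // "one"
    }
}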


