Time and Space Complexity
Time Complexity
Definition
Time complexity is the growth trend of an algorithm's running time as the input data size grows.
Essence (Optional)
We denote the input data size as $n$ and the number of operations as $T(n)$. The definition of the asymptotic upper bound is as follows:

If there exist a positive real number $c$ and a real number $n_0$ such that for all $n > n_0$ we have formula (1):

$$T(n) \leq c \cdot f(n) \tag{1}$$

then we can conclude that $f(n)$ gives an asymptotic upper bound of $T(n)$, denoted as formula (2):

$$T(n) = O(f(n)) \tag{2}$$

Essentially, computing the asymptotic upper bound means finding a function $f(n)$ that has the same growth order as $T(n)$ when $n$ tends to infinity, up to a constant factor.
How do we find the asymptotic upper bound?
When counting the number of operations $T(n)$, we can ignore its coefficients and constant terms:

- Ignore all operations that are unrelated to $n$ (constant terms).
- Omit all coefficients of $n$ (coefficients).
- Use multiplication for nested loops.
The time complexity is determined by the highest-order term in $T(n)$, since that term dominates as $n$ grows large.
| Number of operations $T(n)$ | Time complexity $O(f(n))$ |
| --- | --- |
| $100000$ | $O(1)$ |
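To see these rules in action, here is a minimal sketch (the function and its operation counts are our own illustration, not from the original):

```python
def algorithm(n: int):
    a = 1                 # +1: constant operation, unrelated to n
    a = a + 1             # +1
    a = a * 2             # +1
    for _ in range(n):    # the loop body runs n times: +n
        print(0)

# T(n) = 3 + n; dropping the constant 3 leaves the highest-order term n, so O(n)
```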
Common Types
Common time complexity types are (listed from lowest to highest order) as formula (3):

$$O(1) < O(\log n) < O(n) < O(n \log n) < O(n^2) < O(2^n) < O(n!) \tag{3}$$
Linear Order
Linear order is often found in single-level loops. Operations such as traversing an array or traversing a linked list have a time complexity of $O(n)$, where $n$ is the length of the array or list.
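As a minimal sketch (the function `linear` and its counter are our own illustration):

```python
def linear(n: int) -> int:
    """Linear order: a single loop whose iteration count is proportional to n"""
    count = 0
    for _ in range(n):
        count += 1
    return count
```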
Squared Order
Squared order is often found in nested loops where both the outer and inner loops run $n$ times, for a total on the order of $n^2$ operations, i.e., a time complexity of $O(n^2)$.
Take bubble sort as an example: the outer loop runs $n - 1$ times, while the inner loop runs $n - 1, n - 2, \dots, 2, 1$ times, averaging $n / 2$ iterations, so the time complexity is $O((n - 1) n / 2) = O(n^2)$.
1 | """ bubble sort""" |
Exponential Order
In biology, cell division is a classic example of exponential growth: starting from one cell, every round of division doubles the count, so after $n$ rounds there are $2^n$ cells.
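As a sketch of this doubling process (the loop form and names are our own assumption), the following simulates $n$ rounds of cell division and performs $2^n - 1$ operations in total:

```python
def exponential(n: int) -> int:
    """Exponential order: simulate n rounds of cell division"""
    count, cells = 0, 1
    for _ in range(n):
        # every existing cell divides once this round
        for _ in range(cells):
            count += 1
        cells *= 2
    return count  # 1 + 2 + 4 + ... + 2^(n-1) = 2^n - 1
```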
If the time complexity of solving a problem by brute-force enumeration is $O(2^n)$, the running time explodes as $n$ grows. Exponential order is unacceptable for large data sizes, and such problems usually call for approaches such as dynamic programming or greedy algorithms instead.
Logarithmic Order
Logarithmic order is the mirror image of exponential order: exponential order reflects the "doubling each round" case, while logarithmic order reflects the "halving each round" case. Logarithmic order is second only to constant order; its running time grows very slowly, making it an ideal time complexity.
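A minimal loop-based sketch of the "halving each round" case (the names are our own assumption):

```python
def logarithmic(n: int) -> int:
    """Logarithmic order: n is halved each round until it reaches 1"""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count  # roughly log2(n) rounds
```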
Logarithmic order is often found in binary search and divide-and-conquer (D&C) algorithms, which embody the algorithmic ideas of "splitting one into many" and "reducing the complex to the simple".
Similar to exponential order, logarithmic order also often appears in recursive functions. The following code forms a recursive tree of height $\log_2 n$:
```python
def log_recur(n: int) -> int:
    """Logarithmic order: recursive implementation"""
    if n <= 1:
        return 0
    # each call halves n, so the recursion depth is log2(n)
    return log_recur(n // 2) + 1
```
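For instance, `log_recur(8)` returns $3$, matching $\log_2 8 = 3$: the input is halved three times before reaching the base case.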
Linear Logarithmic Order
Linear logarithmic order is often found in nested loops where the two layers have time complexities of $O(\log n)$ and $O(n)$ respectively, giving an overall time complexity of $O(n \log n)$:
1 | """ Linear logarithmic order """ |
Factorial Order
Factorial order is often implemented using recursion. For example, in the following code, the first level splits into $n$ branches, the second level into $n - 1$ branches, and so on, until splitting stops at the $n$-th level:
1 | """Factorial Order""" |
Worst, Best, and Average Time Complexity
The time complexity of some algorithms is not fixed but depends on the distribution of the input data. For example, suppose we search an array for the element $1$: if $1$ sits at the head of the array, we finish after a single step (the best case), but if $1$ sits at the tail, we must traverse the entire array (the worst case).
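A minimal sketch of this search (the helper `find_one` and the shuffled test array are our own illustration):

```python
import random

def find_one(nums: list[int]) -> int:
    """Return the index of element 1 in nums, or -1 if absent"""
    for i, num in enumerate(nums):
        # best case O(1): 1 sits at the head; worst case O(n): 1 sits at the tail
        if num == 1:
            return i
    return -1

# usage: search a shuffled array containing 1..n
n = 10
nums = list(range(1, n + 1))
random.shuffle(nums)
print(find_one(nums))
```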
The worst-case time complexity corresponds to the asymptotic upper bound of the function and is denoted by big-$O$ notation; correspondingly, the best-case time complexity corresponds to the asymptotic lower bound and is denoted by $\Omega$ notation.
We seldom use the best time complexity in practice, because it is usually achieved only with small probability and can be misleading. By contrast, the worst time complexity is the most practical, because it gives a guaranteed bound on efficiency and lets us use the algorithm with confidence.
In comparison, the average time complexity reflects an algorithm's efficiency on random input data and is denoted by $\Theta$ notation.
However, in practice, especially for more complex algorithms, the average time complexity is often difficult to calculate, because the overall mathematical expectation over the input distribution is hard to analyze. In such cases, we generally use the worst time complexity as the criterion for judging an algorithm's efficiency.
Reference
hello-algo on GitHub. The original author is krahets (GitHub username), together with other contributors. This blog article is only a translation of hello-algo.
This article is for my own study, notes, and translation only, not for commercial purposes. If there is any infringement, please contact me to delete the article. The original texts, code, images, photos, and videos in hello-algo are licensed under CC BY-NC-SA 4.0.