In computer science, the analysis of algorithms is the determination of the amount of resources (such as time and storage) necessary to execute them. Most algorithms are designed to work with inputs of arbitrary length. Usually the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps (time complexity) or storage locations (space complexity).
Algorithm analysis is an important part of the broader computational complexity theory, which provides theoretical estimates for the resources needed by any algorithm that solves a given computational problem. These estimates provide insight into reasonable directions of search for efficient algorithms.
Big O notation
In computer science, big O notation is used to classify algorithms by how they respond (e.g., in their processing time or working space requirements) to changes in input size.
Big O notation characterizes functions according to their growth rates: different functions with the same growth rate may be represented using the same O notation. A description of a function in terms of big O notation usually only provides an upper bound on the growth rate of the function. Associated with big O notation are several related notations, using the symbols o, Ω, ω, and Θ, to describe other kinds of bounds on asymptotic growth rates.
Big O notation is also used in many other fields to provide similar estimates.
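As a concrete illustration of the upper-bound idea, the sketch below spot-checks the big-O inequality f(n) ≤ c·g(n) numerically. The function name `witnesses_big_o` and the particular constants c = 4 and n₀ = 3 are assumptions chosen for this example, not anything from the original text, and a finite check like this is evidence rather than a proof.

```python
# Illustrative (not a proof): f is O(g) if there exist constants
# c and n0 such that f(n) <= c * g(n) for all n >= n0.

def witnesses_big_o(f, g, c, n0, n_max=10_000):
    """Spot-check the big-O inequality f(n) <= c*g(n)
    over the finite range n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n**2 + 2 * n + 1   # a quadratic-growth function
g = lambda n: n**2                   # the proposed bounding function

# c = 4 and n0 = 3 are valid witnesses: 3n^2 + 2n + 1 <= 4n^2 once n >= 3,
# so f(n) is in O(n^2) even though f(n) > n^2 everywhere.
print(witnesses_big_o(f, g, c=4, n0=3))   # True
print(witnesses_big_o(f, g, c=1, n0=3))   # False: c = 1 is too small
```

Note that the same f(n) also satisfies f(n) ∈ O(n³): big O only gives an upper bound, which is why the tighter Θ notation below is often preferred.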
Big theta notation
Big-Theta notation is an order notation used to compare the growth rates of two functions, typically the running times of algorithms.
Big-Theta is a stronger statement than Big-O and Big-Omega.
Suppose f: N → R and g: N → R are two functions.
Then:
f(n) ∈ Θ(g(n)) iff (f(n) ∈ O(g(n))) ∧ (f(n) ∈ Ω(g(n))), where O(g(n)) is Big-O and Ω(g(n)) is Big-Omega.
This is read as "f(n) is big-theta of g(n)".
Another method of determining the condition is the following limit: lim_{n→∞} f(n)/g(n) = c, where 0 < c < ∞.
If such a c exists, then f(n) ∈ Θ(g(n)).
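The limit condition can be observed numerically. The sketch below, using example functions chosen for this illustration (f(n) = 3n² + 2n and g(n) = n²), evaluates the ratio f(n)/g(n) at increasing n and shows it settling toward a finite positive constant.

```python
# Numerically approximating lim_{n -> inf} f(n)/g(n)
# for the example functions f(n) = 3n^2 + 2n and g(n) = n^2.

f = lambda n: 3 * n**2 + 2 * n
g = lambda n: n**2

for n in (10, 100, 1_000, 1_000_000):
    # ratio = 3 + 2/n, which tends to 3 as n grows
    print(f"n = {n:>9}: f(n)/g(n) = {f(n) / g(n):.6f}")
```

The ratio approaches c = 3, a constant with 0 < c < ∞, so f(n) ∈ Θ(n²). Had the ratio diverged to ∞, f would grow strictly faster than g (f ∉ O(g)); had it tended to 0, strictly slower (f ∉ Ω(g)).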