* How do the time or space requirements grow as the problem size increases?
...
Worst Case Efficiency
* is the maximum number of steps an algorithm can take for any collection of data values.
Best Case Efficiency
* is the minimum number of steps an algorithm can take for any collection of data values.
Average-Case Efficiency
* falls between these extremes. It is the efficiency averaged over all possible inputs, and it must assume a distribution of the inputs (the three cases are illustrated with the sequential-search sketch below).
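For illustration, here is a minimal Python sketch of sequential search (the function name and sample list are illustrative, not part of the original notes), where the three cases are easy to see:

```python
def sequential_search(items, key):
    """Return the index of key in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == key:          # one comparison per step
            return i
    return -1

data = [7, 3, 9, 1, 5]
sequential_search(data, 7)   # best case: key is first, 1 comparison
sequential_search(data, 5)   # worst case: key is last (or absent), n comparisons
# Average case: if the key is equally likely to be at any position,
# about n/2 comparisons are needed on average.
```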
Amortized Efficiency
* applies not to a single run of an algorithm but to a sequence of operations performed on the same data structure.
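As a rough illustration, appending to a Python list (a dynamic array) is occasionally expensive when the underlying buffer must be resized and copied, but averaged over a long sequence of appends the cost per operation is constant, i.e., amortized O(1). The use of sys.getsizeof below is only a way to observe CPython's resizes (an implementation detail), not part of the definition:

```python
import sys

lst = []
previous_size = sys.getsizeof(lst)
for i in range(64):
    lst.append(i)                 # usually cheap; occasionally triggers a resize
    size = sys.getsizeof(lst)
    if size != previous_size:
        print(f"resize after appending element {i}")
        previous_size = size
# Only a few of the 64 appends report a resize: the expensive operations are
# rare enough that the average (amortized) cost per append stays constant.
```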
Asymptotic complexity
* is a way of expressing the main component of the cost of an algorithm, using idealized units of computational work.
Notations used in comparing and ranking the order of growth.
1. Big-O Notation (O)
* Introduced in 1894 by Paul Bachmann, it is the most commonly used notation for specifying asymptotic complexity.
* Big-O means "on the order of" and is the formal method of expressing the upper bound of an algorithm's running time; it is a measure of the longest amount of time the algorithm could possibly take to complete. Typical growth rates are listed below (a code sketch follows the examples).
* O(n)
o Printing a list of n items to the screen, looking at each item once.
* O(log n)
o Taking a list of items, cutting it in half repeatedly until there’s only one item left.
* O(n²)
o Taking a list of n items and comparing every item to every other item.
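A minimal Python sketch of the three growth rates above; the function names are illustrative, and the counts are the "steps" being measured, not wall-clock time:

```python
def print_all(items):
    """O(n): look at each of the n items exactly once."""
    for item in items:
        print(item)

def halvings_until_one(n):
    """O(log n): count how many times n can be cut in half before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def count_pairwise_comparisons(items):
    """O(n²): compare every item with every other item."""
    count = 0
    for i in range(len(items)):
        for j in range(len(items)):
            if i != j:
                count += 1
    return count

halvings_until_one(16)                  # -> 4 halvings: 16 -> 8 -> 4 -> 2 -> 1
count_pairwise_comparisons([1, 2, 3])   # -> 6 ordered pairs for n = 3
```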
Sequential Algorithm
* A type of algorithm where each instruction is executed one after another.
Parallel Algorithm
* A type of algorithm where several instructions can be executed concurrently (see the sketch below).
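A small Python sketch of the same task done sequentially and then split into chunks that may run concurrently. This only illustrates the structure of the two approaches; in CPython, threads do not actually speed up CPU-bound work because of the global interpreter lock, and the chunk sizes are arbitrary:

```python
from concurrent.futures import ThreadPoolExecutor

data = list(range(1, 1_000_001))

# Sequential: each addition happens one after another.
sequential_total = 0
for x in data:
    sequential_total += x

# Parallel (conceptually): split the work into chunks that can run
# concurrently and then combine the partial results.
def chunk_sum(chunk):
    return sum(chunk)

chunks = [data[i:i + 250_000] for i in range(0, len(data), 250_000)]
with ThreadPoolExecutor() as pool:
    parallel_total = sum(pool.map(chunk_sum, chunks))

assert sequential_total == parallel_total
```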
Sorting
* is any process of arranging items in some sequence and/or in different sets; accordingly, it has two common meanings: ordering (arranging items in a sequence by some criterion) and categorizing (grouping items with similar properties). A sketch of ordering follows below.
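As a small illustration of ordering, here is a selection-sort sketch in Python; any standard sorting algorithm would serve, and selection sort is chosen only because it is short:

```python
def selection_sort(items):
    """Return a new list with the items arranged in nondecreasing order."""
    a = list(items)                      # work on a copy
    for i in range(len(a)):
        # Find the smallest remaining element and swap it into position i.
        smallest = min(range(i, len(a)), key=a.__getitem__)
        a[i], a[smallest] = a[smallest], a[i]
    return a

selection_sort([29, 10, 14, 37, 13])     # -> [10, 13, 14, 29, 37]
```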
Types of Problem
1. Searching
* a type of problem that deals with finding a given value, called the search key, in a given list or set (see the sketch after this list).
2. String Processing
* a type of problem that deals with strings, i.e., sequences of characters from an alphabet, such as text strings; a typical example is string matching, searching for a given word (the pattern) in a text (see the sketch below).
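A short Python sketch of both problem types; the sample data and the brute-force matcher are illustrative only:

```python
def brute_force_match(text, pattern):
    """Naive string matching: index of the first occurrence of pattern in text, or -1."""
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):           # try every possible starting position
        if text[i:i + m] == pattern:
            return i
    return -1

# Searching: find a given key in a list or set.
records = [15, 8, 42, 23, 4]
key_position = records.index(42) if 42 in records else -1          # -> 2

# String processing: locate a word (the pattern) inside a text.
brute_force_match("not all those who wander are lost", "wander")   # -> 18
```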