Big O notation describes the complexity of an algorithm with respect to the time or space it consumes during execution. It is an essential tool for expressing an algorithm's performance and determining the worst-case complexity of a program.
To understand it in detail, let's go through some code examples and use Big O notation to express their complexity.
If we calculate the complexity of the following method, its Big O notation will be O(1):
static int SumNumbers(int a, int b)
{
    // A single addition and return: the work does not depend on the input values.
    return a + b;
}
This is because, regardless of how the parameters are specified, the method performs a single addition and returns the result; the amount of work never grows with the input.
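To see this in practice, here is a minimal, self-contained sketch (the Program class, the Main method, and the argument values are illustrative assumptions, not taken from the original text). Each call performs exactly one addition, so it costs the same whether the arguments are small or large:
using System;

class Program
{
    static int SumNumbers(int a, int b)
    {
        return a + b;
    }

    static void Main()
    {
        // Hypothetical calls for illustration: both run in constant time, O(1).
        Console.WriteLine(SumNumbers(2, 3));                 // 5
        Console.WriteLine(SumNumbers(1_000_000, 2_000_000)); // 3000000
    }
}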
Let's consider another ...