Big O Notation – Why? When? Where?

What Is Big O?

Big O is a mathematical term that originated in number theory at the end of the 19th century and was later adopted by computer science as questions of resource optimization arose. As Wikipedia puts it, Big O notation describes the limiting behaviour of a function when its argument tends towards a particular value or infinity. Let's rephrase that definition to make it a little simpler and closer to software development. Any task can be solved with one approach or another, one algorithm or another. To compare the effectiveness of heterogeneous solutions written in different programming languages using different approaches, you can analyze how their execution scales as the input grows and express that growth with the notation (see the sketch below). Next, let's look at the most commonly encountered classes of time complexity.
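To make that concrete, here is a minimal Python sketch comparing two approaches to the same task, summing the integers from 1 to n. The task and the function names are illustrative assumptions, not taken from the article.

```python
# Two approaches to the same task: sum the integers 1..n.

def sum_with_loop(n):
    # O(n): the number of additions grows linearly with n.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_with_formula(n):
    # O(1): one multiplication and one division, no matter how large n is.
    return n * (n + 1) // 2

# Both return the same answer, but they scale very differently.
print(sum_with_loop(1_000))     # 500500
print(sum_with_formula(1_000))  # 500500
```

The loop performs n additions, so its running time grows as O(n); the closed-form formula does a fixed amount of work, so it is O(1).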

O(1) - Constant

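Constant time means an operation costs the same regardless of the input size. A minimal Python sketch, assuming a plain list: reading an element by index is O(1) because the cost does not depend on how many items the list holds.

```python
def first_item(items):
    # Indexing a Python list is O(1): the cost does not depend on len(items).
    return items[0]

print(first_item([1, 2, 3]))           # 1
print(first_item(list(range(10**6))))  # 0
```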
