Ivar Alexander Abusdal

Algorithms I: Big O Demystified

15 min read
#algorithms

In programming, the enchantment woven into your code hinges upon unraveling the secrets of the underlying algorithms and their arcane complexities. Behold, as this series of articles sets forth on a spellbinding journey through the realms of JavaScript, Python, and C#, delving deep into the esoteric tapestry of algorithms and the hidden structures of data.

In the incantations to follow, I shall illuminate the mystical concept known as Big O for each of these three languages. However, fear not, dear reader, for the tales shall soon diverge into separate series, aligning with your affinity for either JavaScript, Python, or the enchanting world of C#. Choose your preferred incantation, and let the journey unfold!

I have also created a repository for each language:

Unveiling the Symbol: The Essence of Big O

In this preamble before we embark on the arcane journey through each language's enchantments, it is paramount to forge a shared understanding of the sigil known as Big O notation. Whether you are conjuring spells in the ethereal realms of JavaScript, Python, C#, or any other enchanted language, the sacred principles of algorithmic efficiency hold their arcane sway.

Big O notation, a universal tongue in the hermetic lexicon, serves as a conduit to express the conjuring prowess of algorithms in the realms of time and space, tethered to the ever-shifting dance of input size. It is not a precise enchantment, but a means to weave a tapestry of generalized growth for your algorithm.

Imagine, if you will, when a sorcerer utters the incantation O(n), they foretell a linear dance with the input – a linear running time in the chronicles of time efficiency. This mystical utterance allows us to compare the complexities of diverse approaches, guiding us in selecting the most fitting incantations of data structures and algorithms for a given enigma.

In the arcane initiation into the mysteries of Big O, the uncharted depths may initially shroud themselves in complexity. Fear not, for the path to enlightenment often reveals its secrets through the unraveling of basic examples, much like ancient scrolls foretell.

Our gaze shall be fixed upon the mystical veil of Big O time complexity. The cosmic dance of space complexity, though a realm of significance, shall patiently await its turn, like a slumbering dragon in the caverns of dynamic programming and recursive enchantments. The journey unfolds, and the arcane secrets shall reveal themselves in due course.

The Swift Constant Charm

O(1) reveals itself as a constant-time operation. The size of the array matters not, for time remains steadfast in its constancy.

JS: O(1)
const myArray = [1, 2, 3, 4, 5];
const elementAtIndexTwo = myArray[2];
console.log(elementAtIndexTwo);
Python: O(1)
my_list = [1, 2, 3, 4, 5]
element_at_index_two = my_list[2]
print(element_at_index_two)
C#: O(1)
using System;
 
class Program
{
	static void Main()
	{
		int[] myArray = { 1, 2, 3, 4, 5 };
		int elementAtIndexTwo = myArray[2];
		Console.WriteLine(elementAtIndexTwo);
	}
}

The Labyrinth of Linear Complexity

Within the labyrinth of linear complexity (O(n)), the cosmic dance of iterations unfolds, entwined with the very essence of the input size. This mystical concept signifies that, at its zenith, the number of iterations mirrors the grandeur of the input scale.

Consider, for instance, the humble for loop, an incantation that traverses the realms of input, its steps echoing in harmony with the cosmic dance of n:

JS: O(n)
function linearSearch(arr, needle) {
	for (let i = 0; i < arr.length; i++) {
		if (arr[i] === needle) {
			return true;
		}
	}
	return false;
}
 
const myArray = [1, 2, 3, 4, 5];
const needleValue = 3;
const found = linearSearch(myArray, needleValue);
console.log(found);

In the ethereal quest of linear search (O(n)), the mystical linearSearch function unfolds its cosmic dance. At its zenith, when the needle aligns with the final element, 5, or is absent from the array entirely, the function embarks on a journey through the entire array, echoing in harmony with the very essence of the input scale.

Behold, in the realms of both Python and C#, the same enchanting operation takes form, a dance that transcends linguistic barriers, resonating with the universal song of linear exploration:

Python: O(n)
def linear_search(lst, needle):
	for element in lst:
		if element == needle:
			return True
	return False
 
my_list = [1, 2, 3, 4, 5]
needle_value = 3
found = linear_search(my_list, needle_value)
print(found)
C#: O(n)
using System;
 
class Program
{
	static bool LinearSearch(int[] arr, int needle)
	{
		for (int i = 0; i < arr.Length; i++)
		{
			if (arr[i] == needle)
			{
				return true;
			}
		}
		return false;
	}
 
	static void Main()
	{
		int[] myArray = { 1, 2, 3, 4, 5 };
		int needleValue = 3;
		bool found = LinearSearch(myArray, needleValue);
		Console.WriteLine(found);
	}
}

In the mystical realms of quadratic time complexity (O(n^2)), the worst-case scenario unveils a cosmic dance where running time grows with the square of the input. A cautionary tale, for the cost of this enchantment swells with unrestrained haste as the input scales.

Beware, dear reader, for if in your coded incantations you chance upon a nested loop, such as when iterating over spreadsheets, tables, and chessboards, where operations multiply in an intricate choreography, the specter of O(n^2) may linger.

A tale of bubble sort awaits, a mere glimpse into the cosmic intricacies of O(n^2). Pay no heed to the details for now; the nested loops are the telltale signs of this enchantment, a dance best approached with caution and understanding.

JS: O(n^2)
function bubbleSort(arr) {
	const n = arr.length;
 
	for (let i = 0; i < n - 1; i++) {
		for (let j = 0; j < n - 1 - i; j++) {
			if (arr[j] > arr[j + 1]) {
				// Swap elements if they are
				// in the wrong order
				const temp = arr[j];
				arr[j] = arr[j + 1];
				arr[j + 1] = temp;
			}
		}
	}
 
	return arr;
}
 
const myArray = [5, 3, 1, 4, 2];
const sortedArray = bubbleSort(myArray);
console.log(sortedArray);
Python: O(n^2)
def bubble_sort(lst):
	n = len(lst)
 
	for i in range(n - 1):
		for j in range(n - 1 - i):
			if lst[j] > lst[j + 1]:
				# Swap elements if they are
				# in the wrong order
				lst[j], lst[j + 1] = lst[j + 1], lst[j]
 
	return lst
	
my_list = [5, 3, 1, 4, 2]
sorted_list = bubble_sort(my_list)
print(sorted_list)
C#: O(n^2)
using System;
 
class Program
{
	static int[] BubbleSort(int[] arr)
	{
		int n = arr.Length;
 
		for (int i = 0; i < n - 1; i++)
		{
			for (int j = 0; j < n - 1 - i; j++)
			{
				if (arr[j] > arr[j + 1])
				{
					// Swap elements if they are
					// in the wrong order
					int temp = arr[j];
					arr[j] = arr[j + 1];
					arr[j + 1] = temp;
				}
			}
		}
 
		return arr;
	}
 
	static void Main()
	{
		int[] myArray = { 5, 3, 1, 4, 2 };
		int[] sortedArray = BubbleSort(myArray);
		foreach (int i in sortedArray)
		{
			Console.Write(i);
		}
	}
}

Whispers of the Logarithmic Dance

In the mystical whispers of logarithmic time complexity (O(log n)), the cosmic dance aligns with the essence of the logarithm: an elegant journey of dividing the input by two until the mystical realm of 'one' is unveiled. A dance of efficiency, harmonizing with the cosmic scales, and well-suited for the vast landscapes of large data realms.

At the heart of this enchantment lies the iconic algorithm: binary search. A tale to be told in the very next chapter of our magical chronicles, so fret not about the details for now. Witness how, with each iteration, the problem space gracefully diminishes, a dance of halving that befits the realms of O(log n).

JS: O(log n)
function binarySearch(arr, needle) {
	let low = 0;
	let high = arr.length;
 
	while (low < high) {
		const mid = Math.floor((low + high) / 2);
		const midValue = arr[mid];
 
		if (midValue === needle) {
			return mid;
		} else if (midValue < needle) {
			low = mid + 1;
		} else {
			high = mid;
		}
	}
 
	return -1;
}
 
const myArray = [1, 2, 3, 4, 5, 6, 7, 8, 9];
const needleValue = 7;
const foundIndex = binarySearch(myArray, needleValue);
console.log(foundIndex);
Python: O(log n)
def binary_search(lst, needle):
	low, high = 0, len(lst)
 
	while low < high:
		mid = (low + high) // 2
		mid_value = lst[mid]
 
		if mid_value == needle:
			return mid
		elif mid_value < needle:
			low = mid + 1
		else:
			high = mid
 
	return -1
 
my_list = [1, 2, 3, 4, 5, 6, 7, 8, 9]
needle_value = 7
found_index = binary_search(my_list, needle_value)
print(found_index)
C#: O(log n)
using System;
 
class Program
{
	static int BinarySearch(int[] arr, int target)
	{
		int low = 0;
		int high = arr.Length;
 
		while (low < high)
		{
			int mid = (low + high) / 2;
			int midValue = arr[mid];
 
			if (midValue == target)
			{
				return mid;
			}
			else if (midValue < target)
			{
				low = mid + 1;
			}
			else
			{
				high = mid;
			}
		}
 
		return -1;
	}
 
	static void Main()
	{
		int[] myArray = { 1, 2, 3, 4, 5, 6, 7, 8, 9 };
		int needleValue = 7;
		int foundIndex = BinarySearch(myArray, needleValue);
		Console.WriteLine(foundIndex);
	}
}

The Harmony of Linear and Logarithmic Realms

In the celestial realms of algorithmic symphony, the enchantment of O(n log n) gracefully intertwines the linear and logarithmic growth, creating a dance that resonates both with the scale of the input (n) and its logarithmic counterpart (log n). A tale of algorithms that deftly navigate between the realms of linear proportionality and logarithmic proportionality: a cosmic dance that divides the problem space into smaller parts, inviting iteration.

Enter the realms of Merge Sort, a classic algorithm adorned with the garb of O(n log n) time complexity. In the ballet of Merge Sort, the list gracefully succumbs to recursive division (logarithmic), and then arises anew as its halves merge, guided by the allure of linear operations. This elegant marriage of linear and logarithmic prowess often adorns the cloak of efficient sorting algorithms, making O(n log n) a signature of algorithmic elegance.

Embark on this journey with caution, for we shall delve deeper into the realms of Merge Sort and recursion in future tales. The details, though tempting, are mere breadcrumbs on this cosmic trail of algorithmic wonder.

JS: O(n log n)
function mergeSort(arr) {
	if (arr.length <= 1) {
		return arr;
	}
 
	const mid = Math.floor(arr.length / 2);
	const left = mergeSort(arr.slice(0, mid));
	const right = mergeSort(arr.slice(mid));
 
	return merge(left, right);
}
 
function merge(left, right) {
	const result = [];
	let leftIndex = 0;
	let rightIndex = 0;
 
	while (leftIndex < left.length && rightIndex < right.length) {
		if (left[leftIndex] < right[rightIndex]) {
			result.push(left[leftIndex]);
			leftIndex++;
		} else {
			result.push(right[rightIndex]);
			rightIndex++;
		}
	}
 
	return result.concat(left.slice(leftIndex))
				.concat(right.slice(rightIndex));
}
 
const myArray = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5];
const sortedArray = mergeSort(myArray);
console.log(sortedArray);
Python: O(n log n)
def merge_sort(lst):
	if len(lst) <= 1:
		return lst
 
	mid = len(lst) // 2
	left = merge_sort(lst[:mid])
	right = merge_sort(lst[mid:])
 
	return merge(left, right)
 
def merge(left, right):
	result = []
	left_index = right_index = 0
 
	while left_index < len(left) and right_index < len(right):
		if left[left_index] < right[right_index]:
			result.append(left[left_index])
			left_index += 1
		else:
			result.append(right[right_index])
			right_index += 1
 
	return result + left[left_index:] + right[right_index:]
 
my_list = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
sorted_list = merge_sort(my_list)
print(sorted_list)
C#: O(n log n)
using System;
 
class Program
{
	static int[] MergeSort(int[] arr)
	{
		if (arr.Length <= 1)
		{
			return arr;
		}
 
		int mid = arr.Length / 2;
		int[] left = new int[mid];
		int[] right = new int[arr.Length - mid];
 
		Array.Copy(arr, 0, left, 0, mid);
		Array.Copy(arr, mid, right, 0, arr.Length - mid);
 
		left = MergeSort(left);
		right = MergeSort(right);
 
		return Merge(left, right);
	}
 
	static int[] Merge(int[] left, int[] right)
	{
		int[] result = new int[left.Length + right.Length];
		int leftIndex = 0, rightIndex = 0, resultIndex = 0;
 
		while (leftIndex < left.Length && rightIndex < right.Length)
		{
			if (left[leftIndex] < right[rightIndex])
			{
				result[resultIndex++] = left[leftIndex++];
			}
			else
			{
				result[resultIndex++] = right[rightIndex++];
			}
		}

		// Drain whichever side still holds elements
		while (leftIndex < left.Length)
		{
			result[resultIndex++] = left[leftIndex++];
		}

		while (rightIndex < right.Length)
		{
			result[resultIndex++] = right[rightIndex++];
		}

		return result;
	}
 
	static void Main()
	{
		int[] myArray = { 3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5 };
		int[] sortedArray = MergeSort(myArray);
		foreach (int i in sortedArray)
		{
			Console.Write(i);
		}
	}
}

Glimpse of Other Temporal Realms

Behold, brave wanderers, beyond the cosmic veil lie other temporal realms, each cloaked in its own enchanting time complexity. While our journey has touched upon the most iconic dances – constant (O(1)), linear (O(n)), quadratic (O(n^2)), logarithmic (O(log n)), and the harmonious blend of linear and logarithmic (O(n log n)) – know that the celestial tapestry of algorithmic wonder extends further.

In future incantations, we may encounter exotic complexities, each with its own unique dance in the grand cosmic ballet. As the cosmic voyager, be ever curious and prepared, for the enchanted realms of algorithmic complexity are vast and ever-unfolding.
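One such exotic realm is exponential time, O(2^n). As a hedged sketch of what awaits (the function name naive_fib is my own illustration, not part of this series' repositories), behold the classic naive recursive Fibonacci in Python:

```python
def naive_fib(n):
    # Each call conjures two further calls, so the call tree
    # roughly doubles at every level: exponential O(2^n) time.
    if n <= 1:
        return n
    return naive_fib(n - 1) + naive_fib(n - 2)

print(naive_fib(10))  # 55
```

Even modest inputs summon a vast tree of invocations, which is why such enchantments demand memoization or iteration in practice, a tale for a later chapter.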

Harmony in the Cosmic Symphony

In the grand cosmic symphony of Big O notation, we embark on a harmonious quest to unveil the essence of algorithmic growth as the celestial input size expands. A key principle guides our journey—dropping constants—a practice where we cast aside the numerical coefficients that linger in the shadows of time and space complexity.

Consider these cosmic reasons for embracing the art of dropping constants:

  1. Focus on Dominant Terms:
    • Big O notation, a maestro in the cosmic orchestra, directs our attention to the dominant term, the crescendo that echoes most profoundly in the realms of growth. Constants, mere whispers in comparison, fade into the background as the input stage expands its cosmic canvas.
  2. Simplified Analysis:
    • In the dance of algorithmic efficiency, simplicity is our guiding star. Ignoring constants, we navigate a celestial path of analysis unburdened by the intricate details of hardware or ethereal implementation. A universal dance unfolds, where algorithms transcend the earthly confines of specificity.
  3. Consistency Across Platforms:
    • As constants gracefully exit the cosmic stage, a universal understanding of algorithmic efficiency emerges. The dropping of constants transforms Big O notation into a celestial tongue, spoken across diverse platforms and computing realms. A language harmonized with the very fabric of the algorithmic cosmos.

Embark, brave reader, on this cosmic dance where constants fade, and the true essence of algorithmic efficiency emerges. In the realms of Big O notation, let the dropping of constants be your celestial guide, weaving a tapestry of understanding that transcends the boundaries of the mundane.

A Stroll Through the Cosmic Meadow

In the celestial meadow of algorithms, let us unravel the secrets of a simple linear search algorithm, where the dance of time complexity takes the form of the expression:

T(n) = 5n + 10

Behold, where 5n represents the linear melody, and 10 stands as a constant note in this algorithmic symphony. As we delve into the cosmic language of Big O notation, we gracefully drop the constant, and the expression transforms into the enchanting notation:

O(n)

Why, you ask?

  • The linear melody 5n resonates dominantly as the input grandeur (n) swells.
  • The constant note 10 fades into the celestial background, overshadowed by the cosmic crescendo of 5n for vast values of n.
  • Our gaze is fixed upon the linear ballet between the input size and time complexity, a dance elegantly captured by O(n).

In this celestial stroll, dropping constants becomes our guiding philosophy, revealing the algorithm's essence without entangling ourselves in the minutiae of numerical coefficients. Let the linear search be a celestial melody, where constants echo faintly in the cosmic breeze, and the true essence of efficiency takes center stage.
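To witness the fading of the constants with your own eyes, here is a small Python sketch (the function name operation_count and the chosen sizes are my own illustration) that tallies the steps of a hypothetical algorithm costing T(n) = 5n + 10:

```python
def operation_count(n):
    # A hypothetical algorithm: 5 steps per element
    # plus 10 steps of fixed setup, so T(n) = 5n + 10.
    return 5 * n + 10

for n in (10, 1_000, 1_000_000):
    t = operation_count(n)
    # As n swells, T(n) / n drifts toward the constant 5:
    # the +10 fades away, and only the O(n) shape remains.
    print(f"n={n}: T(n)={t}, T(n)/n={t / n:.3f}")
```

The ratio settles toward 5 while the constant note 10 vanishes into the background, which is precisely why Big O keeps only the O(n) melody and lets the coefficients drift away.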

Epilogue: The Final Enchantment

In the enchanted realms of computer science, Big O notation emerges as a sacred language: an incantation for expressing and comparing the ineffable efficiency of algorithms. Whether it be the linear dance of O(n), the quadratic sonata of O(n^2), the logarithmic whispers of O(log n), or the constant echoes of O(1), Big O notation weaves a tapestry capturing the very essence of algorithmic efficiency.

As we traverse the mystical pathways of coding, Big O notation stands as a practical guide, a compass pointing toward optimal algorithms. It offers a concise and standardized framework, a cosmic lexicon for evaluating efficiency that transcends the intricate details of implementation.

In the next chapters of our cosmic odyssey, we shall delve into the mystical binary search, unraveling its secrets in the realms of JavaScript, Python, and C#. The charm-slinging gods shall be our companions on this journey, guiding us through the enchanted landscapes of algorithms.

Until our paths converge again in the sacred scrolls of coding, may your algorithms be swift, your code be enchanting, and may the charm-slinging gods smile upon your coding adventures!