Analysis of Common Algorithm Time Complexities: From O(1) to O(n!) in Daily Applications

Nov 23, 2025 · Programming

Keywords: Algorithm Complexity | Time Complexity | Big O Notation

Abstract: This paper provides an in-depth exploration of algorithms with different time complexities, covering O(1), O(n), O(log n), O(n log n), O(n²), and O(n!) categories. Through detailed code examples and theoretical analysis, it elucidates the practical implementations and performance characteristics of various algorithms in daily programming, helping developers understand the essence of algorithmic efficiency.

Constant Time Complexity Algorithms

Algorithms with constant time complexity O(1) take the same amount of time regardless of input size, making this the ideal complexity class. Such algorithms typically involve direct access or simple operations.

Array index access is a classic O(1) operation:

int[] arr = {1, 2, 3, 4, 5};
int element = arr[2]; // Direct access to third element

In linked list operations, inserting a new node after a node you already hold a reference to is also O(1):

class ListNode {
    int val;
    ListNode next;
    
    void insertAfter(ListNode prev, ListNode newNode) {
        newNode.next = prev.next;
        prev.next = newNode;
    }
}

Basic stack and queue operations are likewise O(1):

Stack<Integer> stack = new Stack<>();
stack.push(10); // O(1)
int top = stack.pop(); // O(1)

Queue<Integer> queue = new LinkedList<>();
queue.offer(20); // O(1)
int front = queue.poll(); // O(1)
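Hash table lookups are another common constant-time operation: `get` and `put` on Java's HashMap run in O(1) on average, independent of how many entries the map holds. A minimal sketch (class name and key format are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class HashLookup {
    // Builds a map with n entries; each put is O(1) on average
    static Map<String, Integer> buildIndex(int n) {
        Map<String, Integer> index = new HashMap<>();
        for (int i = 0; i < n; i++) {
            index.put("key" + i, i);
        }
        return index;
    }

    public static void main(String[] args) {
        Map<String, Integer> index = buildIndex(1_000_000);
        // Average-case O(1): this lookup costs the same whether the map
        // holds ten entries or a million
        System.out.println(index.get("key999999")); // prints 999999
    }
}
```

Note the hedge "on average": a pathological hash function can degrade lookups, but with a well-distributed hash the constant-time behavior holds in practice.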

Linear Time Complexity Algorithms

Algorithms with linear time complexity O(n) have execution times proportional to input size, commonly found in scenarios requiring traversal of entire datasets.

Array traversal is the most fundamental O(n) operation:

int[] numbers = {1, 2, 3, 4, 5};
for (int i = 0; i < numbers.length; i++) {
    System.out.println(numbers[i]);
}

Linear search finds a target in an unsorted array by examining each element in turn:

int linearSearch(int[] arr, int target) {
    for (int i = 0; i < arr.length; i++) {
        if (arr[i] == target) {
            return i;
        }
    }
    return -1;
}

Checking whether a string is a palindrome by comparing characters from both ends is another typical linear-time operation:

boolean isPalindrome(String str) {
    int left = 0, right = str.length() - 1;
    while (left < right) {
        if (str.charAt(left) != str.charAt(right)) {
            return false;
        }
        left++;
        right--;
    }
    return true;
}

Logarithmic Time Complexity Algorithms

Algorithms with logarithmic time complexity O(log n) halve the problem size with each iteration, achieving significantly higher efficiency than linear algorithms.

Binary search, which requires a sorted array, is the classic representative of O(log n) complexity:

int binarySearch(int[] arr, int target) {
    int left = 0, right = arr.length - 1;
    while (left <= right) {
        int mid = left + (right - left) / 2; // avoids int overflow, unlike (left + right) / 2
        if (arr[mid] == target) {
            return mid;
        } else if (arr[mid] < target) {
            left = mid + 1;
        } else {
            right = mid - 1;
        }
    }
    return -1;
}
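To see the O(log n) bound concretely: on a sorted array of 1,024 elements, the loop above halves the search range each pass, so it needs at most ⌊log₂ 1024⌋ + 1 = 11 iterations. A small instrumented sketch (the step counter is added here for illustration):

```java
public class BinarySearchSteps {
    // Same binary search as above, but returns the number of loop iterations
    static int searchSteps(int[] arr, int target) {
        int left = 0, right = arr.length - 1, steps = 0;
        while (left <= right) {
            steps++;
            int mid = left + (right - left) / 2;
            if (arr[mid] == target) {
                return steps;
            } else if (arr[mid] < target) {
                left = mid + 1;
            } else {
                right = mid - 1;
            }
        }
        return steps;
    }

    public static void main(String[] args) {
        int n = 1024;
        int[] arr = new int[n];
        for (int i = 0; i < n; i++) arr[i] = i;
        // Even the worst target stays within floor(log2(n)) + 1 = 11 iterations
        int worst = 0;
        for (int t = 0; t < n; t++) worst = Math.max(worst, searchSteps(arr, t));
        System.out.println(worst); // at most 11
    }
}
```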

Finding the extreme values in a binary search tree takes O(h) time, where h is the tree height; this is O(log n) for a balanced tree but degrades to O(n) for a skewed one:

class TreeNode {
    int val;
    TreeNode left, right;
    
    int findMin(TreeNode root) {
        while (root.left != null) {
            root = root.left;
        }
        return root.val;
    }
    
    int findMax(TreeNode root) {
        while (root.right != null) {
            root = root.right;
        }
        return root.val;
    }
}
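The balance caveat matters in practice: inserting keys in sorted order into an unbalanced BST produces a chain, and the walk above then follows all n links. A sketch demonstrating this (the insert helper and class names are added here for the demo):

```java
public class BstHeightDemo {
    static class TreeNode {
        int val;
        TreeNode left, right;
        TreeNode(int val) { this.val = val; }
    }

    // Standard BST insertion with no rebalancing
    static TreeNode insert(TreeNode root, int val) {
        if (root == null) return new TreeNode(val);
        if (val < root.val) root.left = insert(root.left, val);
        else root.right = insert(root.right, val);
        return root;
    }

    // findMax as above, but counting how many links it follows
    static int stepsToMax(TreeNode root) {
        int steps = 0;
        while (root.right != null) {
            root = root.right;
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        TreeNode root = null;
        // Sorted insertions degenerate the tree into a right-leaning chain
        for (int i = 1; i <= 100; i++) root = insert(root, i);
        System.out.println(stepsToMax(root)); // 99 links: O(n), not O(log n)
    }
}
```

Self-balancing variants such as AVL or red-black trees keep the height at O(log n) and restore the logarithmic bound.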

Linearithmic Time Complexity Algorithms

Linearithmic time complexity O(n log n) combines linear and logarithmic characteristics, commonly found in efficient divide-and-conquer algorithms.

Merge sort employs a divide-and-conquer strategy: split the array in half, sort each half recursively, then merge the sorted halves:

void mergeSort(int[] arr, int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;
        mergeSort(arr, left, mid);
        mergeSort(arr, mid + 1, right);
        merge(arr, left, mid, right);
    }
}

void merge(int[] arr, int left, int mid, int right) {
    // Merge two sorted subarrays
    int[] temp = new int[right - left + 1];
    int i = left, j = mid + 1, k = 0;
    
    while (i <= mid && j <= right) {
        if (arr[i] <= arr[j]) {
            temp[k++] = arr[i++];
        } else {
            temp[k++] = arr[j++];
        }
    }
    
    while (i <= mid) temp[k++] = arr[i++];
    while (j <= right) temp[k++] = arr[j++];
    
    System.arraycopy(temp, 0, arr, left, temp.length);
}
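Where the n log n comes from: the recursion produces about log₂ n levels, and the merges on each level together copy all n elements, i.e. O(n) work per level. A counting sketch (the same routines as above, with a copy counter added; for n a power of two the total is exactly n · log₂ n):

```java
public class MergeWork {
    static long copies = 0; // counts elements written during merging

    static void mergeSort(int[] arr, int left, int right) {
        if (left < right) {
            int mid = left + (right - left) / 2;
            mergeSort(arr, left, mid);
            mergeSort(arr, mid + 1, right);
            merge(arr, left, mid, right);
        }
    }

    static void merge(int[] arr, int left, int mid, int right) {
        int[] temp = new int[right - left + 1];
        int i = left, j = mid + 1, k = 0;
        while (i <= mid && j <= right)
            temp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
        while (i <= mid) temp[k++] = arr[i++];
        while (j <= right) temp[k++] = arr[j++];
        copies += temp.length; // O(n) total per recursion level
        System.arraycopy(temp, 0, arr, left, temp.length);
    }

    public static void main(String[] args) {
        int n = 1 << 10; // 1024 elements
        int[] arr = new int[n];
        for (int x = 0; x < n; x++) arr[x] = n - x; // reverse-sorted input
        mergeSort(arr, 0, n - 1);
        // 10 merge levels, 1024 copies each: 10 * 1024 = 10240
        System.out.println(copies); // prints 10240
    }
}
```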

Quick sort also achieves O(n log n) on average, although with this pivot choice (always the last element) an already-sorted input degrades it to O(n²):

void quickSort(int[] arr, int low, int high) {
    if (low < high) {
        int pivot = partition(arr, low, high);
        quickSort(arr, low, pivot - 1);
        quickSort(arr, pivot + 1, high);
    }
}

int partition(int[] arr, int low, int high) {
    int pivot = arr[high];
    int i = low - 1;
    
    for (int j = low; j < high; j++) {
        if (arr[j] < pivot) {
            i++;
            swap(arr, i, j);
        }
    }
    swap(arr, i + 1, high);
    return i + 1;
}
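Both partition above and the bubble sort below call a swap helper that the snippets assume but never define; a minimal version (the ArrayUtils class name is chosen here just to make the sketch compilable):

```java
public class ArrayUtils {
    // Exchanges two elements of arr in place; O(1)
    static void swap(int[] arr, int i, int j) {
        int tmp = arr[i];
        arr[i] = arr[j];
        arr[j] = tmp;
    }
}
```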

Quadratic Time Complexity Algorithms

Quadratic time complexity O(n²) typically arises from nested loops over the input, and its cost grows quickly enough to become impractical for large datasets.

Bubble sort is a typical O(n²) algorithm:

void bubbleSort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                swap(arr, j, j + 1);
            }
        }
    }
}
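The quadratic cost is easy to verify by counting: the inner comparison runs exactly n(n−1)/2 times in this version, regardless of the input's initial order. A counting sketch (the same algorithm with a comparison counter and an inlined swap):

```java
public class BubbleCount {
    static long comparisons = 0;

    // Bubble sort as above, counting inner-loop comparisons
    static void bubbleSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - i - 1; j++) {
                comparisons++;
                if (arr[j] > arr[j + 1]) {
                    int tmp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] arr = new int[n];
        for (int i = 0; i < n; i++) arr[i] = n - i; // worst case: reverse order
        bubbleSort(arr);
        // exactly n*(n-1)/2 = 499500 comparisons, whatever the input
        System.out.println(comparisons); // prints 499500
    }
}
```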

Traversing an n × n two-dimensional array is another O(n²) operation (in the side length n):

int[][] matrix = new int[3][3];
for (int i = 0; i < matrix.length; i++) {
    for (int j = 0; j < matrix[i].length; j++) {
        System.out.print(matrix[i][j] + " ");
    }
    System.out.println();
}

Factorial Time Complexity Algorithms

Factorial time complexity O(n!) grows the fastest of the classes discussed here and typically arises in brute-force solutions to combinatorial optimization problems.

A brute-force solution to the Traveling Salesman Problem enumerates every possible tour:

// Caller starts the tour at city 0, with visited[0] = true and count = 1
int tsp(int[][] graph, boolean[] visited, int currPos, int n, int count, int cost, int ans) {
    // All n cities visited: close the tour by returning to city 0
    if (count == n && graph[currPos][0] > 0) {
        ans = Math.min(ans, cost + graph[currPos][0]);
        return ans;
    }
    
    for (int i = 0; i < n; i++) {
        if (!visited[i] && graph[currPos][i] > 0) {
            visited[i] = true;
            ans = tsp(graph, visited, i, n, count + 1, cost + graph[currPos][i], ans);
            visited[i] = false;
        }
    }
    return ans;
}
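The recursion above needs a specific initial call: mark city 0 as visited, start with count = 1, and pass Integer.MAX_VALUE as the running answer. A runnable driver (the 4-city cost matrix is an illustrative example, not from the original article):

```java
public class TspDemo {
    // Brute-force TSP as above: tries every ordering of the remaining cities
    static int tsp(int[][] graph, boolean[] visited, int currPos, int n,
                   int count, int cost, int ans) {
        if (count == n && graph[currPos][0] > 0) {
            return Math.min(ans, cost + graph[currPos][0]);
        }
        for (int i = 0; i < n; i++) {
            if (!visited[i] && graph[currPos][i] > 0) {
                visited[i] = true;
                ans = tsp(graph, visited, i, n, count + 1,
                          cost + graph[currPos][i], ans);
                visited[i] = false;
            }
        }
        return ans;
    }

    public static void main(String[] args) {
        // graph[i][j] is the travel cost between cities i and j
        int[][] graph = {
            {0, 10, 15, 20},
            {10, 0, 35, 25},
            {15, 35, 0, 30},
            {20, 25, 30, 0}
        };
        boolean[] visited = new boolean[4];
        visited[0] = true; // the tour starts (and ends) at city 0
        int best = tsp(graph, visited, 0, 4, 1, 0, Integer.MAX_VALUE);
        System.out.println(best); // shortest tour 0 -> 1 -> 3 -> 2 -> 0 costs 80
    }
}
```

With n cities there are (n − 1)! tours to try, which is what makes this approach O(n!).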

Generating all permutations of a set:

void generatePermutations(int[] arr, int start) {
    if (start == arr.length - 1) {
        System.out.println(Arrays.toString(arr));
        return;
    }
    
    for (int i = start; i < arr.length; i++) {
        swap(arr, start, i);
        generatePermutations(arr, start + 1);
        swap(arr, start, i);
    }
}
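A quick way to confirm the n! growth is to count the outputs: for an array of 4 elements the recursion above produces 4! = 24 orderings. A counting variant (printing replaced by a counter, with the swap helper inlined to keep the sketch self-contained):

```java
public class PermutationCount {
    static int count = 0;

    // Same recursion as above, counting permutations instead of printing them
    static void generatePermutations(int[] arr, int start) {
        if (start == arr.length - 1) {
            count++;
            return;
        }
        for (int i = start; i < arr.length; i++) {
            swap(arr, start, i);
            generatePermutations(arr, start + 1);
            swap(arr, start, i); // restore order before the next choice
        }
    }

    static void swap(int[] arr, int i, int j) {
        int tmp = arr[i];
        arr[i] = arr[j];
        arr[j] = tmp;
    }

    public static void main(String[] args) {
        generatePermutations(new int[]{1, 2, 3, 4}, 0);
        System.out.println(count); // 4! = 24
    }
}
```

Each additional element multiplies the count by the new length, which is exactly the factorial growth pattern.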

Algorithm time complexity analysis is fundamental to computer science. Understanding the characteristics and application scenarios of different complexity classes is crucial for designing efficient algorithms. In practical development, appropriate algorithms should be selected based on specific requirements, finding the optimal balance between time efficiency and space overhead.

Copyright Notice: All rights in this article are reserved by the operators of DevGex. Reasonable sharing and citation are welcome; any reproduction, excerpting, or re-publication without prior permission is prohibited.