Quick Sort

What do you call a Quick Sort algorithm that's always late? A Quicksort Procrastinator!

What is Quick Sort?

Quick Sort is a highly efficient, comparison-based sorting algorithm known for its speed and widespread use. It follows the divide-and-conquer strategy to sort an array or list of elements. Quick Sort works by selecting a "pivot" element from the array and partitioning the other elements into two subarrays, according to whether they are less than or greater than the pivot. The subarrays are then recursively sorted.

How it Works

Here's a step-by-step explanation of how Quick Sort works:

  1. Pivot Selection: Choose a pivot element from the array. There are various pivot selection strategies, such as selecting the first element, the last element, or a random element.

  2. Partitioning: Rearrange the elements in the array so that all elements less than the pivot are placed to its left, and all elements greater than the pivot are placed to its right. The pivot itself is now in its final sorted position. For example, partitioning {7, 2, 9, 5} around the pivot 5 yields {2, 5, 9, 7}, with the pivot settled at index 1.

  3. Recursion: Recursively apply Quick Sort to the subarrays on the left and right of the pivot.

  4. Combine: No additional combining step is needed in Quick Sort because the elements are sorted in place.

Quick Sort is highly efficient in practice: it performs far fewer comparisons and data moves than simple algorithms such as Bubble Sort or Insertion Sort, and its in-place, localized memory access pattern gives it good cache performance.

Quick Sort Code in Java:

import java.util.Arrays;

public class QuickSort {
    public static void main(String[] args) {
        int[] arr = {64, 34, 25, 12, 22, 11, 90};

        quickSort(arr, 0, arr.length - 1);

        System.out.println("Sorted array: " + Arrays.toString(arr));
    }

    // Sorts arr[low..high] in place by recursively partitioning around a pivot.
    public static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            // Place the pivot in its final position and get its index.
            int pi = partition(arr, low, high);

            // Sort the elements on each side of the pivot.
            quickSort(arr, low, pi - 1);
            quickSort(arr, pi + 1, high);
        }
    }

    // Partitions arr[low..high] around the last element (the pivot) and
    // returns the pivot's final index (Lomuto partition scheme).
    public static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1; // boundary of the region holding elements <= pivot

        for (int j = low; j < high; j++) {
            if (arr[j] <= pivot) {
                // Grow the "<= pivot" region and move arr[j] into it.
                i++;

                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }

        // Place the pivot just after the "<= pivot" region.
        int temp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = temp;

        return i + 1;
    }
}

Quick Sort Code in C:

#include <stdio.h>

// Swaps the values pointed to by a and b.
void swap(int* a, int* b) {
    int t = *a;
    *a = *b;
    *b = t;
}

// Partitions arr[low..high] around the last element (the pivot) and
// returns the pivot's final index (Lomuto partition scheme).
int partition(int arr[], int low, int high) {
    int pivot = arr[high];
    int i = low - 1; // boundary of the region holding elements <= pivot

    for (int j = low; j < high; j++) {
        if (arr[j] <= pivot) {
            // Grow the "<= pivot" region and move arr[j] into it.
            i++;
            swap(&arr[i], &arr[j]);
        }
    }

    // Place the pivot just after the "<= pivot" region.
    swap(&arr[i + 1], &arr[high]);
    return i + 1;
}

// Sorts arr[low..high] in place by recursively partitioning around a pivot.
void quickSort(int arr[], int low, int high) {
    if (low < high) {
        int pi = partition(arr, low, high);

        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

int main() {
    int arr[] = {64, 34, 25, 12, 22, 11, 90};
    int n = sizeof(arr) / sizeof(arr[0]);

    quickSort(arr, 0, n - 1);

    printf("Sorted array: \n");
    for (int i = 0; i < n; i++) {
        printf("%d ", arr[i]);
    }
    return 0;
}

Time and Space Complexity Analysis

The time complexity of Quick Sort can be analyzed mathematically as follows:

Let n be the number of elements in the array to be sorted.

Quick Sort follows a divide-and-conquer approach:

  1. Divide: The array is divided into two subarrays around a chosen pivot element.

  2. Conquer: Each subarray is sorted independently using Quick Sort recursively.

  3. Combine: No additional combining step is needed because the elements are sorted in place.

Here's how we can mathematically analyze the time complexity:

  • In the best-case scenario, Quick Sort divides the array into nearly equal-sized subarrays at each level of recursion, resulting in a balanced partition. This leads to a time complexity of O(n log n).

  • In the worst-case scenario, if the pivot selection strategy consistently chooses the smallest or largest element, the partitioning is highly unbalanced, resulting in a time complexity of O(n^2) (this degenerate case is illustrated by the sketch after this list).

  • In the average-case scenario, Quick Sort typically exhibits O(n log n) time complexity because, on average, it achieves balanced partitions due to randomized or well-chosen pivot elements.
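
To make the worst-case scenario above concrete, here is a small, self-contained Java sketch that reuses the same last-element-pivot partition as the examples above and counts comparisons; the class name and counter are purely illustrative. On an already-sorted input the count grows roughly like n^2 / 2, while on a shuffled copy of the same values it stays on the order of n log n.

import java.util.Random;

public class QuickSortWorstCaseDemo {
    static long comparisons;

    public static void main(String[] args) {
        int n = 2000;

        // Already-sorted input: with a last-element pivot, every partition is
        // maximally unbalanced, so comparisons grow quadratically.
        int[] sorted = new int[n];
        for (int i = 0; i < n; i++) {
            sorted[i] = i;
        }

        // The same values in random order: partitions are balanced on average.
        int[] shuffled = sorted.clone();
        Random rng = new Random(42);
        for (int i = n - 1; i > 0; i--) { // Fisher-Yates shuffle
            int j = rng.nextInt(i + 1);
            int tmp = shuffled[i];
            shuffled[i] = shuffled[j];
            shuffled[j] = tmp;
        }

        comparisons = 0;
        quickSort(sorted, 0, n - 1);
        System.out.println("Sorted input:   " + comparisons + " comparisons");

        comparisons = 0;
        quickSort(shuffled, 0, n - 1);
        System.out.println("Shuffled input: " + comparisons + " comparisons");
    }

    static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            int pi = partition(arr, low, high);
            quickSort(arr, low, pi - 1);
            quickSort(arr, pi + 1, high);
        }
    }

    // Same Lomuto partition as above, instrumented with a comparison counter.
    static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            comparisons++;
            if (arr[j] <= pivot) {
                i++;
                int tmp = arr[i];
                arr[i] = arr[j];
                arr[j] = tmp;
            }
        }
        int tmp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = tmp;
        return i + 1;
    }
}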

To perform a precise mathematical analysis, you can use recurrence relations. The recurrence relation for Quick Sort is generally given as:

T(n) = T(k) + T(n - k - 1) + O(n)

Where:

  • T(n) is the time required to sort an array of size n.

  • k is the number of elements less than the pivot.

  • n - k - 1 is the number of elements greater than the pivot.
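
To see where the best- and worst-case bounds come from, consider the two extreme shapes of this recurrence:

  • Balanced partitions (k ≈ n/2): T(n) = 2T(n/2) + O(n), which unrolls into about log n levels of recursion with O(n) work per level, for a total of O(n log n).

  • Maximally unbalanced partitions (k = 0 or k = n - 1 at every step): T(n) = T(n - 1) + O(n), which sums to O(n) + O(n - 1) + ... + O(1) = O(n^2).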

The average-case behavior depends on the pivot selection strategy. In practice, randomized pivot selection or a median-of-three pivot helps keep the average-case time complexity at O(n log n).
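
As a rough sketch of how randomized pivot selection can be layered on the Java example above, the idea is to swap a randomly chosen element into the last position and then reuse the same last-element partition scheme. The class name RandomizedQuickSort and the helper randomizedPartition below are illustrative, not a standard API.

import java.util.Arrays;
import java.util.Random;

public class RandomizedQuickSort {
    private static final Random RNG = new Random();

    public static void main(String[] args) {
        int[] arr = {64, 34, 25, 12, 22, 11, 90};
        quickSort(arr, 0, arr.length - 1);
        System.out.println("Sorted array: " + Arrays.toString(arr));
    }

    static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            int pi = randomizedPartition(arr, low, high);
            quickSort(arr, low, pi - 1);
            quickSort(arr, pi + 1, high);
        }
    }

    // Swap a uniformly random element into the pivot slot (arr[high]), then
    // fall back to the usual last-element partition.
    static int randomizedPartition(int[] arr, int low, int high) {
        int r = low + RNG.nextInt(high - low + 1);
        swap(arr, r, high);
        return partition(arr, low, high);
    }

    static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] <= pivot) {
                i++;
                swap(arr, i, j);
            }
        }
        swap(arr, i + 1, high);
        return i + 1;
    }

    static void swap(int[] arr, int a, int b) {
        int t = arr[a];
        arr[a] = arr[b];
        arr[b] = t;
    }
}

A median-of-three variant would instead pick the median of arr[low], arr[low + (high - low) / 2], and arr[high] as the pivot before partitioning.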

In summary, the time complexity of Quick Sort can be summarized as follows:

  • Best-case: O(n log n)

  • Average-case: O(n log n)

  • Worst-case: O(n^2)

And its space complexity: O(log n) on average for the recursive call stack (up to O(n) in the worst case).

Quick Sort is known for its average-case time complexity of O(n log n), which makes it one of the fastest sorting algorithms for large datasets. However, its worst-case time complexity can be O(n^2) in rare cases. Quick Sort is preferred for practical sorting applications due to its speed and low space complexity. To mitigate the worst-case scenario, randomized pivot selection or choosing a median-of-three pivot can be used.
