Quick Sort Algorithm Time Complexity Explained
Q: Describe the time complexity of the quick sort algorithm in the best, average, and worst-case scenarios.
- Big-O Notation
- Mid-level question
The quick sort algorithm has different time complexities depending on the scenario:
1. Best Case: In the best case, quick sort operates in O(n log n) time complexity. This scenario occurs when the pivot chosen divides the array into two nearly equal halves at each recursive step. For example, if we always pick the median as the pivot, the depth of the recursion tree will be log n, and we perform linear work (O(n)) at each level to partition the array, resulting in O(n log n).
2. Average Case: The average case also has a time complexity of O(n log n). Assuming the pivot is chosen randomly (or the input is in random order), most partitions are reasonably balanced even if not perfectly even. Over many partitions the expected depth of the recursion remains O(log n), with O(n) partitioning work per level, so the total work averages out to O(n log n).
3. Worst Case: The worst case occurs when the pivot chosen results in the most unbalanced partitions possible, such as when the smallest or largest element is consistently chosen as the pivot. This leads to a time complexity of O(n²). For instance, if we sort an already sorted array and always select the last element as the pivot, we would end up with n recursive calls, each requiring O(n) work, leading to O(n²) overall.
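The three cases above all describe the same procedure; only the pivot quality changes. A minimal in-place sketch, using the last element as the pivot (Lomuto partition, one common choice among several):

```python
def quick_sort(arr, lo=0, hi=None):
    """Sort arr in place between indices lo and hi (inclusive)."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quick_sort(arr, lo, p - 1)   # recurse on elements left of the pivot
        quick_sort(arr, p + 1, hi)   # recurse on elements right of the pivot

def partition(arr, lo, hi):
    """Lomuto partition: O(hi - lo) comparisons, pivot is arr[hi]."""
    pivot = arr[hi]
    i = lo
    for j in range(lo, hi):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # place pivot in its final position
    return i

data = [7, 2, 9, 1, 5, 3]
quick_sort(data)
# data is now [1, 2, 3, 5, 7, 9]
```

Each call to `partition` does linear work over its subrange, which is why the total cost is (number of levels) × O(n): log n levels in the balanced cases, n levels in the worst case.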
In summary:
- Best Case: O(n log n)
- Average Case: O(n log n)
- Worst Case: O(n²)