# toolbox/sort

Sorting is one of the most common operations on a collection of data. This module provides the following algorithms:

• Bubble sort `O(n²)`
• Selection sort `O(n²)`
• Insertion sort `O(n²)`
• Merge sort `O(n log n)`
• Quicksort `O(n log n)`

## Sorting properties

### Stability

A sorting algorithm is said to be stable if it preserves the original relative order of items that compare as equal under the sort criteria.

For example, given

```js
const users = [
  { name: 'Animesh', age: 26 },
  { name: 'Nicoleta', age: 25 },
  { name: 'Chris', age: 32 },
  { name: 'Adam', age: 32 },
];
```

A stable sorting algorithm (ascending order by age) will always return

```js
const users = [
  { name: 'Nicoleta', age: 25 },
  { name: 'Animesh', age: 26 },
  { name: 'Chris', age: 32 },
  { name: 'Adam', age: 32 },
];
```

Notice that Chris stays before Adam, exactly as in the input, even though they fall in the same primary group (age = 32).

An unstable algorithm may or may not return the same result. It will sort by age alright, but may place Adam before Chris.
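Since ES2019 the `Array.prototype.sort` specification requires a stable sort, so the built-in can demonstrate the property — a minimal sketch:

```js
const users = [
  { name: 'Animesh', age: 26 },
  { name: 'Nicoleta', age: 25 },
  { name: 'Chris', age: 32 },
  { name: 'Adam', age: 32 },
];

// Sort by age only. A stable sort keeps Chris before Adam,
// because that is their relative order in the input.
const byAge = [...users].sort((a, b) => a.age - b.age);

console.log(byAge.map((u) => u.name));
// → [ 'Nicoleta', 'Animesh', 'Chris', 'Adam' ]
```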

### In-place

An in-place sorting algorithm has a space complexity of `O(1)`. It doesn't require extra memory because it moves items within the collection itself. Such algorithms are extremely useful in memory-constrained environments such as embedded devices.

### Online

An online sorting algorithm doesn't need to re-sort the whole collection when a new item is added; it can incorporate items as they arrive.

### Adaptive

An adaptive algorithm takes advantage of helpful properties of the input; a non-adaptive one doesn't. A simple example from manual arithmetic is multiplication: a non-adaptive method multiplies every digit no matter what, while an adaptive one would, say, tack a 0 onto the end of a number when multiplying by 10, without ever carrying out the actual multiplication.

In the context of sorting, a non-adaptive algorithm takes roughly the same amount of time for a given number of values no matter what. An adaptive algorithm takes advantage of properties of the data, usually by exploiting any nearly sorted or already sorted portions.


### Methods

#### (inner) BubbleSort(collection) → {Array}

Bubble sort repeatedly steps through the collection to be sorted, compares each pair of adjacent items and swaps them if they are in the wrong order. This continues until no swaps are needed.

Returns the sorted collection as an array.

#### Properties

• ✅ Stable
• ✅ In-place
• ✅ Online
• ✅ Adaptive, `O(n)` if the array is already sorted
• Time complexity: `O(n²)`
• Space complexity: `O(1)`
##### Parameters:

| Name | Type | Description |
| --- | --- | --- |
| `collection` | Iterable | Items must support the comparison operations `<`, `>`, `==` |
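A minimal sketch of the technique, assuming nothing about the module's actual implementation (the function name and copy-to-array step are illustrative):

```js
// Bubble sort: repeatedly swap adjacent out-of-order pairs.
// Stops early (adaptive) when a full pass makes no swaps.
function bubbleSort(collection) {
  const arr = [...collection]; // accept any iterable, return an array
  for (let i = arr.length - 1; i > 0; i--) {
    let swapped = false;
    for (let j = 0; j < i; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
        swapped = true;
      }
    }
    if (!swapped) break; // already sorted: O(n) best case
  }
  return arr;
}
```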

Source:

#### (inner) InsertionSort(collection) → {array}

The "natural" way of sorting. Starts from the second element and checks whether it's smaller than the values that come before it. If so, it's moved to its correct position. Otherwise, it moves on to the next element.

Returns the sorted collection as an array.

#### Properties

• ✅ Stable
• ✅ In-place
• ✅ Online
• Time complexity: `O(n²)`
• Space complexity: `O(1)`
##### Parameters:

| Name | Type | Description |
| --- | --- | --- |
| `collection` | Iterable | Items must support the comparison operations `<`, `>`, `==` |
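A minimal sketch of the idea, not the module's actual implementation (the function name is illustrative):

```js
// Insertion sort: grow a sorted prefix, inserting each new
// element into its correct position among those before it.
function insertionSort(collection) {
  const arr = [...collection];
  for (let i = 1; i < arr.length; i++) {
    const current = arr[i];
    let j = i - 1;
    while (j >= 0 && arr[j] > current) {
      arr[j + 1] = arr[j]; // shift larger items one slot right
      j--;
    }
    arr[j + 1] = current;
  }
  return arr;
}
```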


#### (inner) MergeSort(collection) → {array}

An efficient sorting algorithm that uses the divide-and-conquer technique to accomplish its task faster.

Splits the array into halves recursively until each half contains a single element, which is trivially sorted. It then merges the sorted halves back together until the whole collection is sorted.

Returns the sorted collection as an array.

#### Properties

• ✅ Stable
• ❌ In-place
• ❌ Online
• Time complexity: `O(n log n)`
• Space complexity: `O(n)`
##### Parameters:

| Name | Type | Description |
| --- | --- | --- |
| `collection` | Iterable | Items must support the comparison operations `<`, `>`, `==` |
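A minimal sketch of the split-and-merge idea, not the module's actual implementation (the function name is illustrative):

```js
// Merge sort: split until single elements, then merge sorted halves.
function mergeSort(collection) {
  const arr = [...collection];
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  // Merge: repeatedly take the smaller head element.
  // Using <= prefers the left half on ties, keeping the sort stable.
  const merged = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```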


#### (inner) QuickSort(collection) → {array}

An efficient sorting algorithm that uses the divide-and-conquer technique to accomplish its task faster. Unlike MergeSort it works in-place, requiring no auxiliary copy of the collection.

Quicksort works by picking a "pivot" element (preferably random) and moving all elements that are smaller than the pivot to its left. The ones that are larger than the pivot are moved to its right. This is done recursively until the collection is sorted.

Returns the sorted collection as an array.

#### Properties

• ❌ Stable
• ✅ In-place
• ❌ Online
• Time complexity: `O(n log n)` on average, `O(n²)` in the worst case
• Space complexity: `O(log n)` for the recursion stack

Typically, an already sorted or reverse-sorted array gives the worst-case `O(n²)` performance of quicksort when the pivot is always taken from a fixed position. To avoid this, the pivot is picked at random.

##### Parameters:

| Name | Type | Description |
| --- | --- | --- |
| `collection` | Iterable | Items must support the comparison operations `<`, `>`, `==` |
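A minimal sketch with a random pivot and an in-place partition (Lomuto scheme) — illustrative, not the module's actual implementation:

```js
// Quicksort with a random pivot to avoid the O(n²) worst case
// on already sorted input. Partitions in place.
function quickSort(collection) {
  const arr = [...collection];
  const sort = (lo, hi) => {
    if (lo >= hi) return;
    // Pick a random pivot and move it to the end of the range.
    const p = lo + Math.floor(Math.random() * (hi - lo + 1));
    [arr[p], arr[hi]] = [arr[hi], arr[p]];
    const pivot = arr[hi];
    let i = lo; // boundary of the "smaller than pivot" region
    for (let j = lo; j < hi; j++) {
      if (arr[j] < pivot) {
        [arr[i], arr[j]] = [arr[j], arr[i]];
        i++;
      }
    }
    [arr[i], arr[hi]] = [arr[hi], arr[i]]; // pivot to its final spot
    sort(lo, i - 1);
    sort(i + 1, hi);
  };
  sort(0, arr.length - 1);
  return arr;
}
```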


#### (inner) SelectionSort(collection) → {array}

Sorts an array by repeatedly finding the minimum element (considering ascending order) from the unsorted part and putting it at the beginning. The algorithm maintains two subarrays in a given array.

1. The subarray which is already sorted.
2. Remaining subarray which is unsorted.

In every iteration of selection sort, the minimum element (considering ascending order) from the unsorted subarray is picked and moved to the sorted subarray. Selection sort minimises the number of swaps: unlike insertion and bubble sort, it performs at most one swap per pass over the collection.

```
arr[] = 64 25 12 22 11

// Find the minimum element in arr[0...4]
// and place it at beginning
11 25 12 22 64

// Find the minimum element in arr[1...4]
// and place it at beginning of arr[1...4]
11 12 25 22 64

// Find the minimum element in arr[2...4]
// and place it at beginning of arr[2...4]
11 12 22 25 64

// Find the minimum element in arr[3...4]
// and place it at beginning of arr[3...4]
11 12 22 25 64
```

Returns the sorted collection as an array.

#### Properties

• ❌ Stable
• ✅ In-place
• ❌ Online
• Time complexity: `O(n²)`
• Space complexity: `O(1)`
##### Parameters:

| Name | Type | Description |
| --- | --- | --- |
| `collection` | Iterable | Items must support the comparison operations `<`, `>`, `==` |
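The trace above can be sketched as code — illustrative, not the module's actual implementation:

```js
// Selection sort: find the minimum of the unsorted suffix and
// swap it to the front — at most one swap per pass.
function selectionSort(collection) {
  const arr = [...collection];
  for (let i = 0; i < arr.length - 1; i++) {
    let min = i;
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[j] < arr[min]) min = j;
    }
    if (min !== i) [arr[i], arr[min]] = [arr[min], arr[i]];
  }
  return arr;
}

console.log(selectionSort([64, 25, 12, 22, 11]));
// → [ 11, 12, 22, 25, 64 ]
```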