Sort 100 thousand integers (range given) in linear time
  • sorting
  • lineartime
  • bigdata
  • array
  • countingsort
  • Posted: 5 years ago
  • Updated: 4 years ago
  • answers (1)
  • views (2326)

Suppose you have 100 thousand integers ranging from 1 to 1 million. How will you sort these integers in linear time?


Posted Answers

Since the range of the input values is given, we can apply the counting sort algorithm. The algorithm works as follows:


  • Allocate an array whose size equals the range of the input values (here, 1 million).
  • Initialize all elements of this new array to 0.
  • For each element of the input array, increment the count at that element's index in the new array by 1.
  • Iterate through the new array and output each index as many times as its count (a minimal sketch of this variant follows below).


Time complexity = \( O(n + k) \), where \( n \) is the number of input elements and \( k \) is the range of the values; here \( n = 100{,}000 \) and \( k = 1{,}000{,}000 \), so the sort runs in linear time.
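A minimal sketch of the "count and emit" variant described in the steps above might look like the following. The method name simpleCountingSort and the hard-coded range of 1 million are assumptions for illustration, and the sorted values are written into an output array rather than printed.

public static int[] simpleCountingSort(int[] input) {

    /* Steps 1 and 2: allocate a count array covering 0..1,000,000;
       Java zero-initializes int arrays, so no explicit initialization is needed.
       The range is assumed from the question for illustration. */
    int range = 1_000_000;
    int[] count = new int[range + 1];

    /* Step 3: count how many times each value occurs */
    for (int value : input)
        count[value]++;

    /* Step 4: emit each value as many times as it was counted */
    int[] output = new int[input.length];
    int pos = 0;
    for (int value = 0; value <= range; value++)
        for (int c = 0; c < count[value]; c++)
            output[pos++] = value;

    return output;
}

The version below achieves the same result by converting the counts to prefix sums and placing each input element directly at its final position, which also keeps equal elements in their original relative order (a stable sort).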

public int[] countingSort(int[] input, int range) {

    /* Allocate a count array covering the full range of input values */
    int[] countArray = new int[range + 1];

    /* Count the occurrences of each value */
    for (int i = 0; i < input.length; i++)
        countArray[input[i]]++;

    /* Convert the counts to prefix sums: countArray[i] becomes the index
       at which the first occurrence of value i belongs in the sorted output */
    int sum = 0;
    for (int i = 0; i <= range; i++) {
        int c = countArray[i];
        countArray[i] = sum;
        sum += c;
    }

    /* Place each input element at its final position (stable) */
    int[] output = new int[input.length];
    for (int i = 0; i < input.length; i++) {
        output[countArray[input[i]]] = input[i];
        countArray[input[i]]++;
    }
    return output;
}
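
As a usage sketch, the method could be called with the range from the question. The sample values below are assumptions for illustration, and the call assumes it is made from an instance of the same class (or that the method is declared static):

int[] data = {42, 999_999, 1, 500_000, 42};              // sample input, assumed for illustration
int[] sorted = countingSort(data, 1_000_000);            // range 1 to 1 million from the question
System.out.println(java.util.Arrays.toString(sorted));   // prints [1, 42, 42, 500000, 999999]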
