I have a string of size n containing only digits from 0 to 9. I can pick any character from this string, modify it using the formula Math.min(s[i] + 1, 9), and place it at any position in the string.
We can perform this modification any number of times; find the lexicographically smallest string possible.
Example
s ="26547".
Result = "24677"
Explanation:
In the input 26547, pick 6, change it to 7, and move it to the end to get 25477. Then pick 5, change it to 6, and insert it after the 4, giving 24677.
Hence, the string returned is "24677". It can be proved that this is the lexicographically minimal string possible.
> Constraints
• 1 ≤ |s| ≤ 2*10^5
• s has only digits from 0 to 9; s can have leading zeros.
Other examples:
s = "26547", result = "24677"
s = "04829", result = "02599"
s = "34892", result = "24599"
This is an interview question. I thought of using a PriorityQueue in Java, but I found that it is not correct. What approach suits this problem?
Instead of incrementing and moving digits one at a time, there's an equivalent set of operations that's easier to analyze. We can remove as many digits as we want, and put back "incremented" digits from our pile of removed digits whenever we want.
The smallest possible output must start with as many 0s as possible. The largest possible number of 0s is however many 0s were in the input. For these 0s to end up in front of the output, all nonzero digits before the last 0 must be removed. So we remove them.
After the 0s, the smallest possible output must start with as many 1s as possible. The largest possible number of 1s is however many 1s we need to add back, plus the number of 1s remaining in the input. (Actually we can't have any 1s to add back, but it'll be clear why I included that bit later.) For these 1s to end up where they need to go, all digits of value 2 or greater that appear before the last 1 must be removed. So we remove them, then we put in all the 1s we need to put in.
After the 1s, the smallest possible output must start with as many 2s as possible. The largest possible number of 2s is however many 2s we need to add back, plus the number of 2s remaining in the input. (We can actually have 2s to add back at this point, because we might have removed some 1s that appeared before the final 0.) For these 2s to end up where they need to go, all digits of value 3 or greater that appear before the last 2 must be removed. So we remove them, then we put in all the 2s we need to put in.
And so on, all the way up to 9. Looking at what gets removed, we find that a digit gets removed if and only if a digit of smaller value appears at a later index of the input.
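The removal rule in the previous paragraph can be stated on its own as a sketch (the class and method names here are mine, not from the problem): mark input[i] for removal if and only if a strictly smaller digit appears at a later index.

```java
// Illustration of the removal rule: a digit is removed iff some digit
// strictly smaller than it appears later in the input. We scan
// right-to-left, tracking the minimum digit seen so far.
class RemovalRule {
    static boolean[] removedMask(int[] input) {
        boolean[] removed = new boolean[input.length];
        int minSeen = 10; // larger than any digit
        for (int i = input.length - 1; i >= 0; i--) {
            removed[i] = input[i] > minSeen;
            minSeen = Math.min(minSeen, input[i]);
        }
        return removed;
    }
}
```

For the example input 26547, this marks exactly the 6 and the 5 for removal (both have the smaller 4 after them), matching the two moves in the worked explanation above.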
It was easiest to analyze the problem in terms of mixed waves of removing and putting back digits, but as long as we "put back" digits in the right places, it doesn't matter when we put them back. So for the implementation, it can be easiest to handle things differently.
I find that it's easiest to use a counting approach. The optimal output is guaranteed to be non-descending, so if we know our choice of removals will produce an optimal output, then we can determine what the output is based solely on the counts of how many of each digit appear in the output.
A removed digit of value x corresponds to a digit of value min(x+1, 9) in the output, while an un-removed digit of value x corresponds to an identical digit in the output.
Since whether a digit is removed depends on the digits that appear later in the input, it's easiest to iterate over the input in reverse.
Putting it all together, the implementation could look something like this:
// For convenience, I'm going to take and return int[] rather than String.
// The conversions required for String input and output are straightforward but tedious.
public static int[] solve(int[] input)
{
    int[] outputDigitCounts = new int[10];
    int smallestDigitSeen = 10; // initialized so the first digit we see is smaller than this
    for (int i = input.length - 1; i >= 0; i--) {
        int digit = input[i];
        int outputDigit;
        if (digit == 9) {
            outputDigit = 9;
        } else if (digit > smallestDigitSeen) {
            outputDigit = digit + 1;
        } else {
            outputDigit = digit;
        }
        outputDigitCounts[outputDigit]++;
        if (digit < smallestDigitSeen) {
            smallestDigitSeen = digit;
        }
    }
    int[] output = new int[input.length];
    int outputIndex = 0;
    for (int i = 0; i < 10; i++) {
        for (int j = 0; j < outputDigitCounts[i]; j++) {
            output[outputIndex] = i;
            outputIndex++;
        }
    }
    return output;
}
This produces the correct output when tested on all example inputs. I haven't tested it further.
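For reference, a String-based variant of the same counting approach could look like the sketch below (the class and method names are my own, and the counting logic is repeated so the block compiles on its own):

```java
class SmallestString {
    // Same counting approach as solve(int[]), operating on chars directly.
    static String solve(String s) {
        int[] counts = new int[10];
        int minSeen = 10; // larger than any digit
        for (int i = s.length() - 1; i >= 0; i--) {
            int d = s.charAt(i) - '0';
            // A digit is "removed" (and re-inserted incremented) iff a
            // strictly smaller digit appears later in the input.
            counts[d > minSeen ? Math.min(d + 1, 9) : d]++;
            minSeen = Math.min(minSeen, d);
        }
        // The optimal output is non-descending, so emit digits in order.
        StringBuilder sb = new StringBuilder(s.length());
        for (int d = 0; d <= 9; d++)
            for (int j = 0; j < counts[d]; j++)
                sb.append((char) ('0' + d));
        return sb.toString();
    }
}
```

On the three examples in the question, this returns "24677", "02599", and "24599" respectively.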