For my specific problem, I have the user enter a two-digit PIN into a text field, which then gets converted to hex (I think?) somehow and then to Base64. For example, I want the hex value 01 to output the Base64 equivalent AQ==, and 02 to output Ag==, up to a max of 99, which is mQ==.
First of all, is this okay: two digits, each in [0-9]? There shouldn't be an issue, correct? Will every PIN always have a unique Base64 value?
Second, I can't get it to work properly.
For the input "01" I am getting MDE=, which is too many bytes and incorrect; I need single-byte values. For the input "0" I get MA==, which is the correct number of bytes, but that only uses one digit.
I know I am doing something fundamentally wrong, and all the encoding/decoding solutions I've found only confuse me more. I think it may have to do with the value actually being a String, but I can't call a .data method on an Int. Asking for advice from someone much smarter than I am. Here is some of my Swift code, thanks in advance:
.onChange(of: channelKey, perform: { value in
    // Limit input to two digits because that is what we want
    let totalBytes = value.utf8.count
    if totalBytes > 2 {
        // truncated(to:) is a custom String extension that caps the length
        if let maxBytesString = value.truncated(to: 2) {
            channelKey = maxBytesString
        }
    }
    let base64EncodedString = channelKey.data(using: .utf8)!.base64EncodedString()
    print("base64EncodedString: ", base64EncodedString) // prints "1" as MQ==
})
To your first question: yes, that's fine. Every distinct byte value has a unique Base64 encoding, so each PIN will always produce a unique string. The actual problem is this: when you take the string "01" and convert it to data using .utf8, you are encoding the characters "0" and "1", which are the two bytes 0x30 0x31 in UTF-8 (same as ASCII here), not the single binary byte 0x01 that you want.
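To make the difference concrete, here is a small sketch using Foundation's Data:

```swift
import Foundation

// "01" as UTF-8 text is the two character bytes 0x30 and 0x31
// ("0" and "1"), so Base64 encodes two bytes, not the value 1.
let textBytes = Array("01".utf8)
print(textBytes)                              // [48, 49]
print(Data(textBytes).base64EncodedString())  // MDE=

// The single raw byte 0x01 is what you actually want to encode.
print(Data([0x01]).base64EncodedString())     // AQ==
```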
Try this instead:
if let intValue = UInt8(channelKey, radix: 16) { // parse the two digits as hex so "99" becomes 0x99
    let data = Data([intValue])
    let base64EncodedString = data.base64EncodedString()
    print("base64EncodedString: ", base64EncodedString)
}
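One caveat: to reproduce the mQ== you expect for "99", the digits have to be parsed as hexadecimal (radix 16), not decimal; plain UInt8("99") gives 99 = 0x63, which encodes to "Yw==". A minimal self-contained sketch (pinToBase64 is just an illustrative helper name):

```swift
import Foundation

// Parse the two digits as a hex byte, then Base64-encode that single byte.
// Returns nil if the input is not a valid one- or two-digit hex string.
func pinToBase64(_ pin: String) -> String? {
    guard let byte = UInt8(pin, radix: 16) else { return nil }
    return Data([byte]).base64EncodedString()
}

print(pinToBase64("01")!)  // AQ==
print(pinToBase64("02")!)  // Ag==
print(pinToBase64("99")!)  // mQ==
```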