Tags: arrays, swift, swift-extensions

Changing return type to a decimal?


I am trying to convert the current values that I am dealing with to hundredths. Here is my code:

extension String {
    var value: Int {
        if let found = Array("abc".characters).indexOf(Character(lowercaseString)) {
            return found + 1
        }
        return 0
    }
    var totalValue: Int {
        var result = 0
        for letter in self.characters {
            result += String(letter).value
        }
        return result
    }
}

So the letter a would = .01, the letter b would = .02, and the letter c would = .03

How would I modify my code so this can happen?

EDIT

This is the code I am using:

@IBAction func sum(sender: AnyObject) {
    strValue.text = String(format: "%.2f", "$\(strInput.text!.value.description)")
}

I have no idea if this is the right way to do it, but there it is.


Solution

  • Alphabet characters to number values: use pattern matching

    If you want to allow this kind of letter -> decimal value conversion only for the letters of the alphabet, it could be appropriate to use pattern matching for the unicode scalars of characters "a" to "z".
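
    The core mechanism here is interval pattern matching with the ~= operator: in Swift 2, UnicodeScalar is Comparable, so a ... range of scalars forms a ClosedInterval that can be matched against. A minimal sketch of just that idea:

    /* sketch: interval pattern matching on unicode scalars (Swift 2) */
    let lowercaseLetters = UnicodeScalar("a")...UnicodeScalar("z")
    lowercaseLetters ~= UnicodeScalar("c") // true
    lowercaseLetters ~= UnicodeScalar("[") // false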

    Note also that the Character initializer that takes a String will cause a fatal runtime error if called with a String instance containing more than a single grapheme cluster (e.g. "ab".value in your code example will trigger such an error). Hence, it's appropriate to make sure that .value is only ever actually "usable" for single-character strings, returning, say, the value 0 for anything but a single-character string whose character is a letter of the alphabet.
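
    For illustration, a short sketch of that failure mode (the second line traps at runtime if uncommented):

    let single = Character("a")    // OK: one grapheme cluster
    // let crash = Character("ab") // fatal runtime error: a Character must be a single grapheme cluster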

    Finally, a Float is just a number and will always be displayed as accurately as possible (limited by floating-point precision), but without redundant decimals, which is the reason why the exact value of e.g. 1/10 is displayed as 0.1 and not 0.10. To control the number of decimals to display, you need to convert the Float to a String, specifying the formatting in this conversion, e.g. with the String(format:) initializer or an NSNumberFormatter.
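
    For instance, a brief sketch of both formatting routes (Foundation is needed for NSNumberFormatter):

    import Foundation

    let f: Float = 1.0 / 10
    print(f)                          // 0.1  – no redundant trailing zero
    String(format: "%.2f", f)         // "0.10" – two decimals enforced

    let formatter = NSNumberFormatter()
    formatter.minimumFractionDigits = 2
    formatter.maximumFractionDigits = 2
    formatter.stringFromNumber(f)     // "0.10"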

    Alternative #1: several String extensions as in your own example

    Below follows a String extension example that uses pattern matching to compute the Float .value property of (single) letters of the alphabet, along with the String representation of such a value, accessed via .valueAsString (and analogously totalValue / totalValueAsString).

    /* Alternative #1 */
    extension String {
    
        /// 0.01 for "a", 0.02 for "b", ..., 0.26 for "z";
        /// 0 for anything but a single alphabetic character.
        var value: Float {
            guard case let uc = lowercaseString.unicodeScalars where uc.count == 1,
                let uc1 = uc.first where UnicodeScalar("a")..."z" ~= uc1 else {
                return 0
            }
            
            // "a" has unicode scalar value 97: subtracting 96 maps "a"..."z" to 1...26
            return Float(uc1.value - 96) / 100
        }
        
        var valueAsString: String {
            return String(format: "%.2f", self.value)
        }
        
        /// Sum of the .value of every character in the string.
        var totalValue: Float {
            var result: Float = 0
            for letter in self.characters {
                result += String(letter).value
            }
            return result
        }
        
        var totalValueAsString: String {
            return String(format: "%.2f", self.totalValue)
        }
        
    }
    
    /* example usage */
    "a".valueAsString // 0.01
    "b".valueAsString // 0.02
    "j".valueAsString // 0.10
    "abcn".totalValueAsString // 0.20
    

    Alternative #2: condensed pattern matching

    If, however, you're only interested in the "total value" of a String of characters (filtering out non-alphabetic characters), you can apply the pattern matching directly as a filter to the full string, without needing the intermediate .value extension, e.g.:

    /* Alternative #2 */
    extension String {
        
        var totalValueAsString: String {
            // keep only the scalars of "a"..."z", map each to 1...26, and sum
            let pattern = UnicodeScalar("a")..."z"
            let filteredSum = self.lowercaseString.unicodeScalars
                .filter { pattern ~= $0 }
                .reduce(0) { $0 + $1.value - 96 }
            return String(format: "%.2f", Float(filteredSum) / 100)
        }
    }
    
    /* example usage */
    "abc[n".totalValueAsString // 0.20 (ignoring the non-alphabetic "[" character)
    

    Applying the above to your specific sum(...) function

    Now, w.r.t. your edit

    @IBAction func sum(sender: AnyObject) {
        strValue.text = String(format: "%.2f", "$\(strInput.text!.value.description)")
    }
    

    Note that

    • You should avoid using the forced unwrapping operator (!) on the .text property of the (assumed to be) UITextField strInput, in case it is nil.
    • The .text property of strInput is a String, whereas the String(format:) initializer expects a Float as its second argument here.

    Hence, what you're looking for in the sum(...) function is most likely something more along the lines of

    @IBAction func sum(sender: AnyObject) {
        // fall back to "0.00" if the text field's .text is nil
        var decText: String = ""
        if let input = strInput.text {
            decText = input.totalValueAsString
        }
        else {
            decText = "0.00"
        }
        strValue.text = "$" + decText
    }
    

    This assumes you are using one of the two String extension alternatives above.
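
    If you prefer optional chaining over if/else, an equivalent, more compact form (a sketch, assuming the same extensions) is:

    @IBAction func sum(sender: AnyObject) {
        strValue.text = "$" + (strInput.text?.totalValueAsString ?? "0.00")
    }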


    Finally, for displaying currency in the general context of UITextFields, see e.g. the following Q&A:
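
    As a rough illustration of the locale-aware approach, an NSNumberFormatter sketch (the exact output depends on the user's locale):

    let currencyFormatter = NSNumberFormatter()
    currencyFormatter.numberStyle = .CurrencyStyle
    currencyFormatter.stringFromNumber(0.20) // e.g. "$0.20" in a US locale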