I have previously asked a question similar to this however the answer doesn't seem to work.
I have a dictionary called Settings:
var settings = [String: Any]()
This is filled by reading a text file, and that part all works fine.
When I run println(settings)
it returns the filled dictionary as it should:
[monsterRate: 1.0, monsterMinSpeed: 10.0, weaponPickupRate: 10.0, weaponPickupAmount: 50.0, goldPerMonster: 10.0, totalMonsters: 10.0, LevelNum: 1.0, monsterMaxSpeed: 15.0]
If I run println(settings["monsterMinSpeed"])
it returns Optional("10.0")
However, when I try to set my variables to the values the dictionary holds, it doesn't work:
monsterMinSpeed = (settings["monsterMinSpeed"] as? Double) ?? 0.0
monsterMaxSpeed = (settings["monsterMaxSpeed"] as? Double) ?? 0.0
monsterRate = (settings["monsterRate"] as? Double) ?? 0.0
weaponPickupAmount = (settings["weaponPickupAmount"] as? Double) ?? 0.0
weaponPickupKills = (settings["weaponPickupKills"] as? Double) ?? 0.0
goldPerMonster = (settings["goldPerMonster"] as? Double) ?? 0.0
Even though the values are in the dictionary, every cast falls back to 0.0, as though the keys I am providing don't exist: the as? Double cast returns nil, so the nil-coalescing operator supplies the default.
Could someone please help?
Those values are coming in as strings, not numbers; the quotes in Optional("10.0") give it away, and a String can never be cast to Double with as?. Try something like:
monsterMinSpeed = (settings["monsterMinSpeed"] as? NSString)?.doubleValue ?? 0.0
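For completeness, a minimal sketch of the same fix using Double's failable String initializer (available since Swift 2; the NSString.doubleValue route above is the Swift 1.x-era equivalent). The dictionary literal and helper function here are hypothetical, just to illustrate the pattern:

```swift
import Foundation

// Hypothetical stand-in for the dictionary parsed from the text file:
// every value is a String, which is why `as? Double` always returns nil.
let settings: [String: Any] = [
    "monsterMinSpeed": "10.0",
    "monsterMaxSpeed": "15.0",
    "monsterRate": "1.0"
]

// Cast to String first, then convert with Double's failable
// String initializer; fall back to 0.0 if either step fails.
func doubleValue(forKey key: String, in dict: [String: Any]) -> Double {
    return (dict[key] as? String).flatMap(Double.init) ?? 0.0
}

let monsterMinSpeed = doubleValue(forKey: "monsterMinSpeed", in: settings)
print(monsterMinSpeed) // 10.0 rather than 0.0
```

A cleaner long-term fix is to parse the file into a [String: Double] up front, so the rest of the code never needs to convert.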