My code seems to work, except it outputs a float with only one decimal place instead of the two I wrote in the dictionary.
The dictionary gives each item a price, e.g. "Taco": 3.00; surprisingly, when I run the debugger the dictionary holds a different value, e.g. "Taco": 3.0 instead of 3.00.
I would like to know why it behaves this way.
here is my code:
# make a dictionary of menu items and prices
dict = {
    "Baja Taco": 4.00,
    "Burrito": 7.50,
    "Bowl": 8.50,
    "Nachos": 11.00,
    "Quesadilla": 8.50,
    "Super Burrito": 8.50,
    "Super Quesadilla": 9.50,
    "Taco": 3.00,
    "Tortilla Salad": 8.00
}
# with a while loop, prompt for an item and add its price until the user exits
total = 0
while True:
    try:
        item = input("Item: ").strip().title()
        for order in dict:
            if order == item:
                total = total + dict[order]
                print(f"${total}")
    except EOFError:
        break
I expected my program to output the total with two decimal places instead of just one.
Use
print(f'Total: ${total:.2f}')
A Python float does not remember how many decimal places you typed: 3.00 and 3.0 are the exact same value, which is why the debugger shows the shorter form. The number of decimal places is a property of the string you print, not of the float itself, so you specify it with the :.2f format spec when you display the total.
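As a sketch of the same fix in context (order_total is a made-up helper, not part of your code, so the loop can be shown without input()):

```python
# Sum the prices of the chosen items and format the total
# with exactly two decimal places at print time.
def order_total(menu, items):
    total = 0
    for item in items:
        if item in menu:
            total += menu[item]
    return f"Total: ${total:.2f}"

menu = {"Taco": 3.00, "Burrito": 7.50}
print(order_total(menu, ["Taco", "Taco"]))  # Total: $6.00
```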
Test it in the Python terminal:
>>> x = 12.3
>>> print(f'{x:.2f}')
12.30
>>> print(f'{x:.3f}')
12.300
>>> print(f'{x:.4f}')
12.3000
>>> print(f'{x:.5f}')
12.30000
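To see why the dictionary shows 3.0 in the first place, you can check in the interpreter that the two literals are the same float, and format only at output time. (The decimal module shown at the end is a standard-library alternative, not something your code needs for this problem.)

```python
# A float has no memory of how many decimal places you typed:
# 3.00 and 3.0 are the same value, so the debugger shows the
# shortest representation, 3.0.
price = 3.00
print(price == 3.0)     # True
print(repr(price))      # 3.0

# Trailing zeros appear only when the number becomes a string:
print(f"${price:.2f}")  # $3.00

# If exact decimal precision matters (e.g. real money), the standard
# library's decimal module keeps the places you give it:
from decimal import Decimal
print(Decimal("3.00"))  # 3.00
```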