I'm trying to count how many rolls it takes to get 100 occurrences of a given sum r (2-12) of two dice, and print that count. This is what I've tried so far:
from random import randrange

def diceprob(r):
    count = 0
    while count < 100:
        roll = randrange(2, 13)
        if roll == r:
            count += 1
    print("It took {} rolls to get 100 rolls of {}".format(count, r))
But when run, it always prints "It took 100 rolls to get 100 rolls of 5", with the 5 changing based on what r is called with. Why isn't the count being incremented properly?
You were counting only the rolls that came up r, not all rolls, so count always ends at exactly 100. Also note that the distribution of the sum of two six-sided dice is not uniform, so you need to call random.randrange(1, 7) twice and add the results, rather than drawing a single number from 2 to 12:
import random

def diceprob(r):
    total_count = 0
    count = 0
    while count < 100:
        roll = random.randrange(1, 7) + random.randrange(1, 7)
        total_count += 1
        if roll == r:
            count += 1
    print("It took {} rolls to get 100 rolls of {}".format(total_count, r))
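For completeness, here's one way to sketch the calls that produce results like those below. I've duplicated the function and made it also return the total so the snippet runs standalone (exact counts vary from run to run):

```python
import random

def diceprob(r):
    # Same logic as above, but also returning the total roll count.
    total_count = 0
    count = 0
    while count < 100:
        roll = random.randrange(1, 7) + random.randrange(1, 7)
        total_count += 1
        if roll == r:
            count += 1
    print("It took {} rolls to get 100 rolls of {}".format(total_count, r))
    return total_count

for r in range(2, 13):
    diceprob(r)
```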
If we invoke diceprob for various values of r, we get results like:
It took 3849 rolls to get 100 rolls of 2
It took 1932 rolls to get 100 rolls of 3
It took 1394 rolls to get 100 rolls of 4
It took 881 rolls to get 100 rolls of 5
It took 717 rolls to get 100 rolls of 6
It took 537 rolls to get 100 rolls of 7
It took 748 rolls to get 100 rolls of 8
It took 798 rolls to get 100 rolls of 9
It took 1295 rolls to get 100 rolls of 10
It took 1881 rolls to get 100 rolls of 11
It took 3689 rolls to get 100 rolls of 12
Compare these counts with the graph of the distribution of two-dice sums.
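As a sanity check, the expected total number of rolls is 100 divided by the probability of each sum, which you can get by enumerating all 36 ordered outcomes of two dice (sum_probability is a helper name of my own, not from the original code):

```python
from fractions import Fraction

def sum_probability(r):
    # Count the ordered outcomes (a, b) of two fair dice whose sum is r.
    hits = sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == r)
    return Fraction(hits, 36)

for r in range(2, 13):
    # Expected total rolls to accumulate 100 hits of sum r.
    expected = 100 / sum_probability(r)
    print("sum {:2d}: expected about {} rolls".format(r, expected))
```

For example, a sum of 7 has probability 6/36 = 1/6, so you'd expect about 600 rolls, while a sum of 2 has probability 1/36 and needs about 3600, which lines up with the sample output above.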