python, database, list, algorithm, performance

Is there an easier or more efficient method to find the average running time of an algorithm?


import time
from random import randint

# create 50 linked lists, each filled with 2500 random integers,
# and record how long each one takes to build
runtimes = []
for _ in range(50):
    myList = LinkedList()
    start = time.time()

    for _ in range(2500):
        myList.append(randint(0, 9))

    end = time.time()
    runtime = end - start
    runtimes.append(runtime)

# average the recorded runtimes
average = 0
for runtime in runtimes:
    average += runtime
average /= len(runtimes)
print(average)

I'm wondering if there's a better method to find the average instead of adding each run time to a list, looping through the list of runtimes, adding each runtime to average, then dividing by the length of the list of runtimes.


Solution

  • Yes:

    import timeit

    # time 2500 appends in one go; timeit returns the total elapsed seconds
    alist = []
    ttl = timeit.timeit(lambda: alist.append(5), number=2500)
    print(f"Took {ttl} ... avg: {ttl / 2500}")
    

    Or, for the list of runtimes you already have, simply sum(runtimes) / len(runtimes).
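
    If you want to keep timing the full 2500-append build 50 times, as in the question, timeit.repeat can collect one measurement per build. A minimal sketch, assuming a LinkedList class like the one in the question:

    import timeit
    from random import randint

    def build():
        # one full build: a linked list of 2500 random integers
        # (LinkedList is assumed to be the question's own class)
        lst = LinkedList()
        for _ in range(2500):
            lst.append(randint(0, 9))

    # repeat() runs the build 50 times and returns a list of per-run times
    times = timeit.repeat(build, number=1, repeat=50)
    print(f"avg build time: {sum(times) / len(times):.6f}s")

    timeit also disables garbage collection while timing, so the numbers tend to be less noisy than hand-rolled time.time() deltas.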