Let's say I wrote a program containing an algorithm with exponential runtime, and then ran it on a data set large enough that it would not finish for years.
What would happen to the computer? Would it lock up? Would it run at 100% capacity until either it finishes or the power shuts off? Would the hardware fail before it finishes?
I'm doing homework on time complexity if you haven't guessed already. This isn't a homework question. It's just a random thought I had.
What would happen to the computer?
It will run the algorithm until it finishes [or hits an unexpected error].
Would it lock up?
It depends on how the algorithm is implemented, but usually the "program" itself will appear frozen while the rest of the computer remains usable [probably slower], especially if the machine is multi-core.
Would it run at 100% capacity until either it finishes or the power shuts off?
If the algorithm is implemented serially and the machine is multi-core, it will not run at 100% capacity [a single thread can keep only one core busy]. If it is multi-threaded, it probably will.
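To make the serial-versus-parallel point concrete, here is a minimal Python sketch. The `subset_sums` workload is just a hypothetical stand-in for an exponential-time (`2^n`) computation: run serially it keeps one core busy, while `multiprocessing` lets it keep all cores busy.

```python
import itertools
from multiprocessing import Pool

def subset_sums(items):
    """Brute-force over all 2^n subsets -- an exponential-time workload."""
    return sum(sum(combo)
               for r in range(len(items) + 1)
               for combo in itertools.combinations(items, r))

if __name__ == "__main__":
    work = [list(range(22))] * 4  # four independent exponential jobs

    # Serial: one thread, so at most one core is busy (~25% on a 4-core box).
    serial_results = [subset_sums(job) for job in work]

    # Parallel: one process per job, so all cores can run near 100%.
    with Pool() as pool:
        parallel_results = pool.map(subset_sums, work)

    assert serial_results == parallel_results
```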
Would the hardware fail before it finishes?
For an algorithm that needs `2^n` operations with `n = 1000`, on present-day machines, it most likely will [the Earth will be gone long before the computation is done]. But there are no guarantees either way.
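As a rough back-of-the-envelope check, here is a small Python sketch of that claim. The `10^9` simple operations per second is only an assumption standing in for "a modern machine"; the exact rate barely dents the result.

```python
ops = 2 ** 1000                      # operations required
ops_per_second = 10 ** 9             # assumed speed of a modern machine
seconds_per_year = 60 * 60 * 24 * 365

years = ops / (ops_per_second * seconds_per_year)
print(f"{years:.2e} years")          # on the order of 10^284 years
```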
Important: The problem with exponential algorithms is not their effect on the machine. The problem is that they take a long time. How long? Well, for an `O(n!)` algorithm [a naive TSP implementation], with `n == 20` the run time is roughly a decade. Increase `n` by one, just a small change in the problem size, and you get ~200 years of run time! One more increment makes it ~4000 years... [again, assuming a modern present-day machine, and a constant `c >= 1` for the `O(n!)` bound].
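To see where those figures come from, here is a short Python sketch. The `10^10` operations-per-second rate is an assumption for "a modern machine"; changing it shifts the scale but not the explosion from one `n` to the next.

```python
from math import factorial

ops_per_second = 10 ** 10            # assumed machine speed
seconds_per_year = 60 * 60 * 24 * 365

for n in (20, 21, 22):
    years = factorial(n) / (ops_per_second * seconds_per_year)
    print(f"n = {n}: ~{years:,.0f} years")
# n = 20: ~8 years, n = 21: ~162 years, n = 22: ~3,564 years
```

Each increment of `n` multiplies the run time by the new `n`, which is why the jump from "about a decade" to "about 4000 years" takes only two extra cities.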