I'm trying to understand the concept of algorithms and, basically, how they improve the performance of a computer program.
So assume I have to write a program which generates a list of numbers as follows:
Starts with the number 1.
Adds 3 to it.
Stores the result (1+3=4) in the list.
Adds 5 to the new number.
Stores the result (4+5=9) in the list.
Keeps alternately adding 3 and 5 to the latest number in the list.
Now this is a very simple program. Let's say the program has to stop when the number is greater than 1,000,000, and let's suppose a simple program to do this takes 10 seconds to generate the list.
How does one design an algorithm for this problem such that the program takes less time to generate the list?
NOTE: I am trying to understand the concept here with an example; the times mentioned above are made up, not factual. It would be great if someone could help me understand the concept with a 'simple' example, if they do not wish to use the one above.
What you've given above (the list of steps to produce your list) is an algorithm.
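For concreteness, here is what that algorithm might look like as a straightforward loop. This is just a minimal sketch: I'm assuming the 1,000,000 limit from your description and printing each number instead of storing it in an actual list.

#include <stdio.h>

int main(void)
{
    int current = 1;
    int step = 3;                    /* alternates between 3 and 5 */

    while (current <= 1000000) {
        current += step;             /* add 3 or 5 to the latest number */
        printf("%d\n", current);     /* "store" the result (here, just print it) */
        step = (step == 3) ? 5 : 3;  /* switch to the other increment */
    }
    return 0;
}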
Significant improvements in efficiency typically mean changing from one algorithm to another that accomplishes the same end with less work. For example, for the algorithm above, you might try to avoid creating the list (as such) at all, and instead substitute an algorithm that can quickly generate the result for any particular spot in the list -- given N as input, it would do something like
int n = N / 2;       // how many 5s have been added after N steps
int m = N - n;       // how many 3s have been added (one more than n when N is odd)
return 1 + m * 3 + n * 5;
Note that the only subtle part is splitting N correctly between the 3s and the 5s for odd versus even input numbers, but you get the general idea -- instead of carrying out an entire series of operations to get one result, it carries out a much smaller number of operations to produce the equivalent result.
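If it helps to see the two approaches side by side, here is a small self-contained sketch (the names nth_value and nth_value_by_loop are just my own labels for illustration) that computes the same entry both ways:

#include <stdio.h>

/* Closed form: value of the N-th stored number (N = 1 gives 4, N = 2 gives 9, ...). */
int nth_value(int N)
{
    int n = N / 2;       /* 5 is added on steps 2, 4, 6, ... */
    int m = N - n;       /* 3 is added on steps 1, 3, 5, ... */
    return 1 + m * 3 + n * 5;
}

/* Same value computed the slow way, by walking the whole sequence. */
int nth_value_by_loop(int N)
{
    int current = 1;
    for (int i = 1; i <= N; i++)
        current += (i % 2 == 1) ? 3 : 5;
    return current;
}

int main(void)
{
    for (int N = 1; N <= 5; N++)
        printf("N = %d: formula = %d, loop = %d\n", N, nth_value(N), nth_value_by_loop(N));
    return 0;
}

The loop does work proportional to N, while the formula does the same handful of operations no matter how large N is -- that difference is what switching to a better algorithm buys you.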