I want to run the Optimization Experiment until the maximum available memory on my computer is reached (I have 2 TB free). Therefore, I do not specify the number of iterations, and I set the maximum available memory to 1.5 TB (1,500,000 MB).
However, when I run the model, the experiment runs for a while and then stops automatically at iteration 250,000 (Status: Finished).
Do you think the 250,000 iterations consume the full 1.5 TB? (I have already disabled the "log model execution" option.) Or is this number (250,000) imposed by AnyLogic? If so, based on what statistics, since the solution space may contain millions or billions of solutions (each iteration evaluates one solution)?
I also tried the Activity Based Costing Analysis example model, and it stops at 250K iterations too.
I don't think there is a maximum limit. The "Activity-based costing" example goes well beyond 250k iterations on my machine.
Maybe you have a limited OptQuest license; do you use the Academic version of AnyLogic? (I have the Professional version.)
However, you are working under a big misunderstanding: the memory you assign is RAM. Typical computers have 16-64 GB of RAM (not terabytes). The 2 TB you mention is HDD space, which AnyLogic neither uses nor cares about for this setting.
So setting such a high value has no effect beyond letting AnyLogic use all of your RAM. Running out of RAM may cause the stop, but 250k may also simply be a fixed upper limit.
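If you want to see which resource the setting actually controls, here is a minimal standalone Java sketch (it does not use any AnyLogic API; the class name is just for illustration, and the assumption is that the experiment's memory setting simply becomes the JVM heap size via -Xmx). It prints the JVM heap limit next to total disk space; the first number is the one the experiment can actually use.

```java
// Minimal sketch: contrast the JVM heap limit (RAM available to the
// simulation process) with total disk space. Assumption: AnyLogic's
// "Maximum available memory" setting maps to the JVM heap (-Xmx).
import java.io.File;

public class MemoryCheck {
    public static void main(String[] args) {
        // Heap limit the JVM was started with, in MB
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        // Total size of the disk the root path lives on, in GB
        long diskGb = new File("/").getTotalSpace() / (1024L * 1024 * 1024);

        System.out.println("JVM max heap: " + maxHeapMb + " MB");   // this is what the experiment gets
        System.out.println("Total disk space: " + diskGb + " GB");  // your 2 TB; irrelevant here
    }
}
```

Note that a 64-bit JVM may accept an -Xmx far larger than your physical RAM, but once actual usage exceeds it the machine will start swapping or the process will be killed, so the huge value buys you nothing.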
Try running a simple experiment with 4 GB of memory and see if it also stops at 250k.