Dear all, I beg your pardon, I can’t get that sorted …
In general, I want to determine the total running time of an algorithm as the sum of the total AMPL time and the total solve time. There is one AMPL run containing a number, k > 1, of solve commands. So far, I have assumed:
Total algorithm sec. = _ampl_elapsed_time + <SUM of k _solve_elapsed_time’s>,
which always appeared to be correct when compared with manual stopwatch measurements, or simply with the (integer-second) difference between a time() call at the start and one at the end of the algorithm.
But now I came across Bob Fourer’s contribution here, where I learned that “Thus _ampl_elapsed_time includes the _total_solve_elapsed_time …”. From my point of view, this would also mean that _ampl_elapsed_time includes my <SUM of k _solve_elapsed_time’s>, provided this SUM equals _total_solve_elapsed_time, in which case my formula above would count the solve time twice.
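To make the comparison concrete, here is a minimal AMPL sketch of what I am doing. The model/data file names and the parameter K are placeholders; sum_solve is my own accumulator, while _ampl_elapsed_time, _solve_elapsed_time, and _total_solve_elapsed_time are AMPL’s predefined timing parameters:

```
# hypothetical model and data files
model mymodel.mod;
data  mymodel.dat;

param K := 3;                 # number of solves, k > 1 as above
param sum_solve default 0;    # my running <SUM of k _solve_elapsed_time's>

for {i in 1..K} {
    solve;
    let sum_solve := sum_solve + _solve_elapsed_time;
}

# compare the two candidate totals directly:
display _ampl_elapsed_time,                 # does this already include solves?
        _total_solve_elapsed_time,
        sum_solve,
        _ampl_elapsed_time + sum_solve;     # my formula above
```

If Bob’s statement holds, the displayed _ampl_elapsed_time alone should match the stopwatch, and my formula should overshoot by roughly sum_solve.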
I see a contradiction here … which doesn’t mean there really is one …
Thanks for any input,
Peter