Estimate of processing cost

Idea created by kimo on Feb 27, 2011
Status: New
Score: 90
• mlou
• BruceLang
• boffamiskell
• kimo
• jwarzi
• kwgis
• duri
• mping
• dknudsen
It would be useful to know how much processing a tool is likely to require before it is started: for example, whether the algorithm is O(n) (linear) or worse, or how many output records it is likely to generate. Perhaps a tool could run a series of quick tests to estimate the time, CPU, disk space, memory, or just overall 'effort' required, without actually running the process, similar to the way an SQL engine estimates the cost of a query before executing it.
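
A minimal sketch of what such a pre-flight check could do, assuming a made-up O(n log n) cost model; the constant K_SECONDS_PER_OP, the default bytes_per_record, and the cost formula are illustrative assumptions, not figures from any real geoprocessing engine:

    import math
    import os
    import shutil

    # Assumed cost model: time ~ k * n * log2(n) for an O(n log n) tool.
    # K would be calibrated once per machine by timing a small sample run.
    K_SECONDS_PER_OP = 2e-6  # placeholder constant, not a measured value

    def estimate_cost(record_count, bytes_per_record=200):
        """Rough pre-flight estimate of time and disk space for a tool run."""
        ops = record_count * max(math.log2(record_count), 1.0)
        est_seconds = K_SECONDS_PER_OP * ops
        est_output_bytes = record_count * bytes_per_record
        free_bytes = shutil.disk_usage(os.getcwd()).free
        return {
            "estimated_seconds": round(est_seconds, 1),
            "estimated_output_mb": round(est_output_bytes / 2**20, 1),
            "enough_disk": est_output_bytes < free_bytes,
        }

    # Warn the user before launching an expensive or impossible job.
    print(estimate_cost(5_000_000))

The point is not this particular formula but that the check completes in milliseconds, so the warning can be shown before any real work starts.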

This might save a lot of time in cases where the tool would otherwise run for a while and then fail for lack of memory or temporary disk space, and it would make users more aware of impossible tasks.
I note that interactive selections already warn that a process may take a long time, but these warnings do not seem to be very accurate for file geodatabases; I ignore them all because the operations finish in seconds.

Failing that, perhaps it could just be a formula added to each tool's documentation where possible.
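
For illustration only (the complexity class and the constant are assumptions, not documented figures), such an entry for a two-input overlay tool might read:

    estimated time ≈ k × (n + m) × log(n + m)

where n and m are the input feature counts and k is a per-machine constant obtained by timing the tool once on a small sample.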