I have a dataset with ~2,000 rows that I would like to iterate over, creating subsets and then staggering the geoprocessing at one-minute intervals, because the tool I'm using sends requests to an external API that is limited to 30 features per minute.
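Roughly the logic I'm trying to reproduce in ModelBuilder, sketched in plain Python (the feature list and process_batch() below are just placeholders for my actual data and geoprocessing call):

import time

def process_batch(batch):
    # placeholder for the real geoprocessing tool call / API request
    print("Processing {} features".format(len(batch)))

features = list(range(2000))   # stand-in for the ~2,000 rows
BATCH_SIZE = 30                # the API accepts 30 features per minute

for start in range(0, len(features), BATCH_SIZE):
    process_batch(features[start:start + BATCH_SIZE])
    if start + BATCH_SIZE < len(features):
        time.sleep(60)         # stagger requests at one-minute intervals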
You could try executing a simple Python script with time.sleep() within your model.
time — Time access and conversions — Python 3.10.4 documentation
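For example, something as minimal as this (a sketch only; how it is wired into the model, whether as a script tool parameter or with a precondition, depends on your setup):

import time

DELAY_SECONDS = 60   # matches the one-minute API limit from the question

if __name__ == "__main__":
    # pause so any downstream tool in the model runs after the delay
    time.sleep(DELAY_SECONDS)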
Thank you
BTW - this question should be posted in the Geoprocessing community. Just noticed it's posted in the Member Introductions.
First time I've posted. Please and thank you.
The question was specifically about ModelBuilder, so I think this is exactly the right place!
You can use Python modules in ModelBuilder by calling a Python function inside the Calculate Value tool, so time.sleep() is probably your go-to here. Connect the Calculate Value output as a precondition to your tool so the wait has to finish before the tool that requests the data runs.
Expression:
wait(60)
Code block:
import time

def wait(sec):
    # pause the model for the given number of seconds,
    # then return 1 so the output can be attached as a precondition
    time.sleep(sec)
    return 1
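One note on the design: returning 1 matters because it gives Calculate Value an output value, and it is that output variable you connect as the precondition, so the downstream tool waits for it. If you want the delay to be configurable, the expression can also read a model variable through inline variable substitution (assuming you add a numeric Delay variable to the model); the code block stays the same.
Expression:
wait(%Delay%)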