I'm using a Python script to send a list of addresses to the Google Geocoding API and parse the JSON results into a text list of coordinates. My problem is this: for some addresses in the list, the API fails to return coordinates and instead returns the status "OVER_QUERY_LIMIT", but then it continues to process later addresses successfully. For example, in a list of 200 addresses, I will get "OVER_QUERY_LIMIT" for roughly 40 to 60 of them, in no particular order, and the rest will be fine. But if I run the script again with the same list, some of the addresses that failed the first time will return coordinates with status "OK" as they should, while randomly different addresses in the list will fail. Does anyone have a clue why this would happen?
The script uses a loop to send each address as an API request one at a time, with a pause between requests to satisfy Google's limit on requests per second, and the total number of requests I'm sending should be well below Google's cap on free requests per day. So I don't believe I'm really going "over query limit" - note also that I continue getting successful results after the ones that fail. What could be making some addresses return this status at random?
Here's the script for anyone who's curious:
import urllib, json
import sys, time
import pprint

# specify input addresses as a list (for simplicity of demonstration):
addresses = [
    'address 1',
    'address 2',
    'address 3 etc.'
]
# I removed the real addresses for privacy, but you get the idea

for add in addresses:
    time.sleep(.1)  # pause between requests to stay under the per-second limit
    prefix = 'https://maps.googleapis.com/maps/api/geocode/json?'
    data = urllib.urlencode({"address": add})
    url = prefix + data
    gresp = urllib.urlopen(url)
    jresp = json.loads(gresp.read())
    if jresp['status'] == 'OK':
        lat = jresp['results'][0]['geometry']['location']['lat']
        lon = jresp['results'][0]['geometry']['location']['lng']
        print (str(lat) + "; " + str(lon))
    else:
        print jresp['status']
My results typically look something like this (for example):
33.782041; -84.4161232
33.8111927; -84.3640217
OVER_QUERY_LIMIT
33.7980347; -84.3691964
33.7396419; -84.3717205
33.7601601; -84.3955309
OVER_QUERY_LIMIT
33.7794269; -84.3675836
33.760549; -84.387061
33.80361; -84.3941337
33.814729; -84.3910812
OVER_QUERY_LIMIT
OVER_QUERY_LIMIT
33.7977534; -84.4076812
33.811963; -84.395412
OVER_QUERY_LIMIT
OVER_QUERY_LIMIT
OVER_QUERY_LIMIT
33.7682217; -84.3849992
33.8480993; -84.4284378
33.9229844; -84.3982854
Any insight would be appreciated!
GeoNet is really focused on Esri products, and this appears to have no connection to any Esri products. Have you tried any Google-specific discussion/support options?
Looking over Usage Limits for Google Maps APIs Web Services | Google Maps APIs Premium Plan | Google Develope... and reading your description of the problem, it seems you are running into a situation where the geocoding requests are getting submitted too quickly, so you are experiencing temporary timeouts.
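If that's what is happening, one quick way to sanity-check it might be to time the requests as they actually go out and slow the loop down. Here's a rough sketch (it reuses the addresses list from your script, and the one-second pause is just an arbitrarily conservative value, not anything Google documents):

import time
import urllib

# rough timing check: print how much time actually elapses between requests
# (assumes the 'addresses' list from the script above)
last = time.time()
for add in addresses:
    url = 'https://maps.googleapis.com/maps/api/geocode/json?' + urllib.urlencode({"address": add})
    urllib.urlopen(url).read()
    now = time.time()
    print "%.2f seconds since the previous request" % (now - last)
    last = now
    time.sleep(1)  # a much more conservative pause than 0.1 s

If the errors disappear at the slower pace, that would point to the per-second limit rather than the daily cap.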
I suppose you're right. I'm just trying the kitchen sink approach - posting wherever I think someone might have an answer, because this has been such a frustrating problem for me to figure out. Sorry if that's frowned upon here.
The API terms state:
Your code does:
time.sleep(.1)
So I am not sure why you hit these issues.
However, their help file has an example script that demonstrates how to work around this issue:
url = "MAPS_API_WEBSERVICE_URL" attempts = 0 success = False while success != True and attempts < 3: raw_result = urllib.urlopen(url).read() attempts += 1 # The GetStatus function parses the answer and returns the status code # This function is out of the scope of this example (you can use a SDK). status = GetStatus(raw_result) if status == "OVER_QUERY_LIMIT": time.sleep(2) # retry continue success = True if attempts == 3: # send an alert as this means that the daily limit has been reached print "Daily limit has been reached"
Thanks! I haven't tried implementing this yet, but it looks like a good idea.
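In case it helps anyone else who finds this thread, here's roughly how I think that retry logic would fold into my original loop. This is an untested sketch; the two-second wait and three attempts are just the values from the example above, and it assumes the same addresses list as my original script:

import urllib, json
import time

def geocode(add, max_attempts=3):
    # request a single address, retrying a few times with a pause
    # whenever the API reports OVER_QUERY_LIMIT
    prefix = 'https://maps.googleapis.com/maps/api/geocode/json?'
    url = prefix + urllib.urlencode({"address": add})
    jresp = {'status': 'OVER_QUERY_LIMIT'}
    for attempt in range(max_attempts):
        jresp = json.loads(urllib.urlopen(url).read())
        if jresp['status'] != 'OVER_QUERY_LIMIT':
            break
        time.sleep(2)  # wait before retrying
    return jresp

# same addresses list as in my original script
for add in addresses:
    time.sleep(.1)
    jresp = geocode(add)
    if jresp['status'] == 'OK':
        loc = jresp['results'][0]['geometry']['location']
        print str(loc['lat']) + "; " + str(loc['lng'])
    else:
        print jresp['status']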