I have a sample script that I am using as a proof of concept to loop through a JSON web service and write the results to a CSV, then to a table in a file geodatabase. I am using the ArcGIS Sample Server JSON web service as an example. The problem is that my script does not write all records to the CSV; in this case it only writes the last record.
import requests
import json
import urllib2
import csv
import arcpy

url = "http://sampleserver1.arcgisonline.com/ArcGIS/rest/services/Demographics/ESRI_Census_USA/MapServer?f=..."
parsed_json = json.load(urllib2.urlopen("http://sampleserver1.arcgisonline.com/ArcGIS/rest/services/Demographics/ESRI_Census_USA/MapServer?f=..."))

details = {'layers': 'name'}
headers = {'Content-type': 'application/x-www-form-urlencoded', 'Accept': '/'}
response = requests.post(url, data=json.dumps(details), headers=headers)

# tell computer where to put CSV
outfile_path = 'C:\Users\Administrator\PycharmProjects\untitled\json2fgdb.csv'
# open it up, the w means we will write to it
writer = csv.writer(open(outfile_path, 'wb'))
# create a list with headings for our columns
headers = ['name', 'id']
# write the row of headings to our CSV file
writer.writerow(headers)

for services in parsed_json["layers"]:
    # initialize the row
    row = []
# add every 'cell' to the row list, identifying the item just like an index in a list
row.append(str(services['id']).encode('utf-8'))
row.append(str(services['name']).encode('utf-8'))
# once you have all the cells in there, write the row to your csv
writer.writerow(row)
# increment our loop counter, now we're on the next time through the loop
i = 1
i = i + 1

f = open(outfile_path, 'wb')
writer = csv.writer(f)
f.close()

# Temporary Delete Function to Overwrite Table
if arcpy.Exists("C:\dev_folder\orginalDev.gdb\jsoncsv7"):
    arcpy.Delete_management("C:\dev_folder\orginalDev.gdb\jsoncsv7")

arcpy.TableToTable_conversion("C:\Users\Administrator\Desktop\json2gdb.csv", "C:\dev_folder\orginalDev.gdb", "jsoncsv7")

print response.text
my output is:
name,id
5,states
I would like to write all layer IDs and Names, not just the last one.
It looks like it is an indentation issue. The append statements and the writing to the csv happen only once after it has iterated through all of the layers. That is probably why you're only getting the last one.
for services in parsed_json["layers"]:
    row = []
    row.append(str(services['id']).encode('utf-8'))
    row.append(str(services['name']).encode('utf-8'))
    writer.writerow(row)
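The effect of moving writerow() inside the loop can be seen with a small self-contained sketch (written with Python 3's csv module and an io.StringIO buffer instead of a file on disk; the layer list is hard-coded here as a stand-in for parsed_json["layers"] so the sketch runs without a network call, and its id/name values are only illustrative):

```python
import csv
import io

# Hard-coded stand-in for parsed_json["layers"] from the map service
# (illustrative values only; a real response would come from the REST endpoint).
layers = [
    {"id": 0, "name": "Census Block Points"},
    {"id": 1, "name": "Census Block Group"},
    {"id": 5, "name": "states"},
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "id"])

# writerow() is indented inside the loop body, so a row is emitted
# for every layer, not just for the last one.
for services in layers:
    row = [services["id"], services["name"]]
    writer.writerow(row)

print(buf.getvalue())
```

With the write inside the loop, the buffer holds the header plus one row per layer; with it dedented outside the loop, only the final layer would survive.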
Worked a charm. Thank you Bruce Bacia!