
Arcpy TableToTable_Conversion writing extra fields?

02-06-2015 08:38 PM
GeoffreyWest
Occasional Contributor III

I would like to write two headers, but am receiving my two actual headers plus three blank fields. What could be the cause of this? Here is my script and output. The script parses a JSON web service and writes the lat/lng of buses to a table in a file geodatabase.

[two screenshots of the output omitted]

import requests
import json
import urllib2
import csv
import arcpy

if arcpy.Exists("C:\MYLATesting.gdb\API_Table"):
  arcpy.Delete_management("C:\MYLATesting.gdb\API_Table")

url = "http://api.metro.net/agencies/lametro/vehicles/"
parsed_json = json.load(urllib2.urlopen("http://api.metro.net/agencies/lametro/vehicles/"))
details = {'items': 'longitude'}
headers = {'Content-type': 'application/x-www-form-urlencoded', 'Accept': '/'}
response = requests.get(url, data=json.dumps(details), headers=headers)


# tell computer where to put CSV
outfile_path='C:\Users\Administrator\Desktop\API_Testing.csv'

# open it up, the w means we will write to it
writer = csv.writer(open(outfile_path, 'wb'))

#create a list with headings for our columns
headers = ['latitude','longitude']

for items in parsed_json['items']:
  row = []
  row.append(str(items['latitude']).encode('utf-8'))
  row.append(str(items['longitude']).encode('utf-8'))
  writer.writerow(row)

i = 1
i = i +1

f = open(outfile_path, 'wb')
writer = csv.writer(f)
#write the row of headings to our CSV file
writer.writerow(headers)

f.close()

#Temporary Delete Function to Overwrite Table

arcpy.TableToTable_conversion(outfile_path, "C:\MYLATesting.gdb", "API_Table")

print response.text

Accepted Solutions
JoshuaBixby
MVP Esteemed Contributor

The above concept can be expanded to download all the information and create a point feature class.

import arcpy  
import json  
import numpy  
import urllib2  

fc = #feature class location  
if arcpy.Exists(fc):    
  arcpy.Delete_management(fc)  

url = "http://api.metro.net/agencies/lametro/vehicles/"    
parsed_json = json.load(urllib2.urlopen(url))  
sr = arcpy.SpatialReference(4326)  #assuming wgs84 coords but didn't verify

ndtype = numpy.dtype([
    ('id', 'S12'),
    ('route_id', 'S12'),
    ('run_id', 'S12'),
    ('latitude', 'f8'),
    ('longitude', 'f8'),
    ('heading', 'f8'),
    ('seconds_since_report', 'i4'),
    ('predictable', '?')
])

vehicles = []
for item in parsed_json['items']:
    vehicles.append(tuple(item[k] for k in ndtype.names))

narr = numpy.array(vehicles, ndtype)
arcpy.da.NumPyArrayToFeatureClass(narr, fc, ['longitude', 'latitude'], sr)
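As an arcpy-free aside on the dict-to-tuple step above: the lookup should pull each value out of the item dict by field name, `item[k]`, so the tuple comes out in dtype order. A small sketch with made-up field names and values:

```python
import numpy

ndtype = numpy.dtype([('id', 'S12'), ('latitude', 'f8'), ('longitude', 'f8')])

# hypothetical vehicle record; the 'extra' key is not in the dtype
# and is simply skipped by the field lookup
item = {'id': '7562', 'latitude': 34.05, 'longitude': -118.25, 'extra': 'x'}

row = tuple(item[k] for k in ndtype.names)  # values in dtype order
narr = numpy.array([row], ndtype)
print(narr['id'][0], narr['latitude'][0])  # b'7562' 34.05
```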


10 Replies
JoshuaBixby
MVP Esteemed Contributor

Instead of troubleshooting your current approach, I am going to suggest a different one for a couple of reasons. First, there seem to be lines in your code that aren't doing anything, or at least not contributing to building your table. Fortunately, your screenshots can be used to figure out the intent of the code. Second, writing the results to an intermediate/temporary CSV file on disk adds extra steps and slows performance.

The NumPy conversion functions in the ArcPy Data Access module (arcpy.da) are robust, high-performing, and, I would argue, designed for situations just like this one. Although I still find the NumPy documentation lacking in clarity in many areas, NumPy is well worth learning and using, with or without ArcPy.

import arcpy
import json
import numpy
import urllib2

tbl = #table location
if arcpy.Exists(tbl):  
  arcpy.Delete_management(tbl)

url = "http://api.metro.net/agencies/lametro/vehicles/"  
parsed_json = json.load(urllib2.urlopen(url))

LatLongs = []
for item in parsed_json['items']:
    LatLongs.append((
        str(item['latitude']).encode('utf-8'),
        str(item['longitude']).encode('utf-8')
    ))
narr = numpy.array(LatLongs,
                   numpy.dtype([('latitude', '|S12'), ('longitude', '|S12')]))
arcpy.da.NumPyArrayToTable(narr, tbl)
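One side note on the CSV version in the question: it writes the data rows first and then reopens the file in 'wb' mode, which truncates everything already written before the header row goes in. If an intermediate CSV is still wanted, a minimal sketch (shown here in Python 3 form; the rows and path are illustrative, not the live service data) that writes the header once through a single file handle:

```python
import csv

# illustrative lat/lng pairs and output path (not the real service data)
rows = [(34.0522, -118.2437), (33.9416, -118.4085)]
outfile_path = 'API_Testing.csv'

# one open, header first, then the data rows -- no second open in
# write mode, which would truncate everything already written
with open(outfile_path, 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['latitude', 'longitude'])
    writer.writerows(rows)

with open(outfile_path, newline='') as f:
    print(next(csv.reader(f)))  # ['latitude', 'longitude']
```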
XanderBakker
Esri Esteemed Contributor

Hi Joshua Bixby, nice short piece of code to get the job done!

GeoffreyWest
Occasional Contributor III

Agreed, thank you Joshua.

BlakeTerhune
MVP Regular Contributor

Is it necessary to close the url when getting JSON?

python - closing files properly opened with urllib2.urlopen() - Stack Overflow

JoshuaBixby
MVP Esteemed Contributor

In this case, there is no URL to close.  The url variable in the script is just a string.  The script never keeps a reference to the file-like object returned by urllib2.urlopen; the object is passed straight into json.load, which iterates through it.  Once json.load returns, that file-like object goes out of scope, and garbage collection takes care of closing it.
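Relying on the object going out of scope works in practice because CPython's reference counting reclaims it promptly, but `contextlib.closing` makes the cleanup explicit and guaranteed. A minimal sketch, using `io.BytesIO` as a network-free stand-in for the response object `urlopen` returns:

```python
import contextlib
import io
import json

# io.BytesIO stands in for the file-like response object urlopen()
# returns, so this example runs without network access
response = io.BytesIO(b'{"items": [{"latitude": 34.05}]}')

# closing() guarantees close() is called even if json.load raises
with contextlib.closing(response) as resp:
    parsed = json.load(resp)

print(parsed['items'][0]['latitude'])  # 34.05
print(response.closed)  # True
```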

BlakeTerhune
MVP Regular Contributor

Thanks, Joshua.

A little off topic:

I am using Python with ArcGIS Server 10.2.2 to get tokens and stop/start services. To request a token, I use urllib2 to open the request URL and read the JSON response. To stop/start services, I use urllib to open the request URL and read the JSON response. In both cases, I thought it was necessary to do it in a with statement using contextlib to close it when finished. Here is my thread. Sorry for the hijack!

JoshuaBixby
MVP Esteemed Contributor

Let's jump over to the other thread to continue the discussion.

GeoffreyWest
Occasional Contributor III

Hi Joshua, this implementation works wonders. However, as you stated, the NumPy documentation isn't all that helpful.  My question is: will string data types always be 12 bytes?  This script is for a proof of concept, and another JSON web service will be used later, which is why I used the requests module in the initial script; I will have to execute a POST instead of a GET request against that service.  I appreciate your help.
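On the 12-byte question: nothing about NumPy strings is inherently 12 bytes; `'S12'` in the dtype above is simply a hard-coded fixed width, and values longer than that are silently truncated. A short sketch (the values are made up) showing the truncation and one way to derive the width from the data instead:

```python
import numpy

# 'S12' is a fixed width of exactly 12 bytes; longer values are
# silently truncated rather than raising an error
arr = numpy.array([('801', '0123456789ABCDEF')],
                  dtype=[('route_id', 'S3'), ('run_id', 'S12')])
print(arr['run_id'][0])  # b'0123456789AB'

# the width can be derived from the data instead of hard-coded
values = ['short', 'a much longer run identifier']
width = max(len(v) for v in values)
dtype = numpy.dtype([('run_id', 'S%d' % width)])
print(dtype['run_id'].itemsize)  # 28
```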
