ArcPy Table to Table Conversion Not Working Correctly

01-29-2015 09:01 AM
GeoffreyWest
Occasional Contributor III

I have a script which reads the lat/lng values of bus routes from a JSON web service. The script writes the latitude and longitude to a CSV correctly. However, my table-to-table conversion does not work correctly when run from Python.

# IMPORTS

#Make Python understand how to read things on the Internet
import urllib2

#Make Python understand the stuff in a page on the Internet is JSON
import json

# Make Python understand csvs
import csv

# Make Python know how to take a break so we don't hammer API and exceed rate limit
from time import sleep

# tell computer where to put CSV
outfile_path = r'C:\Users\Administrator\PycharmProjects\untitled\json2fgdb.csv'

# open it up; the 'wb' means we will write to it. Keep the file object
# around so we can close it before running the conversion below.
outfile = open(outfile_path, 'wb')
writer = csv.writer(outfile)

#create a list with headings for our columns
headers = ['latitude', 'longitude']

#write the row of headings to our CSV file
writer.writerow(headers)


# GET JSON AND PARSE IT INTO DICTIONARY

# We need a loop because we have to do this for every JSON file we grab

#set a counter telling us how many times we've gone through the loop, this is the first time, so we'll set it at 1
i=1

#loop through pages of JSON returned; i<2 means we make a single pass for now
while i<2:
    #print out what number loop we are on, which will make it easier to track down problems when they appear
    print i
    #fetch the JSON file for vehicles on route 704 and use the json library
    #to turn it into a Pythonic data structure
    parsed_json = json.load(urllib2.urlopen('http://api.metro.net/agencies/lametro/routes/704/vehicles/'))
    #now you have a dictionary. Print parsed_json to get a better look at it.
    #The bulk of the content is contained inside the value that goes with the key 'items'.
    #Just like list[1] refers to the second item in a list,
    #dict['items'] refers to the values associated with the key 'items'.
    print parsed_json



    #run through each vehicle in 'items' and pull out the fields we want
    for items in parsed_json['items']:
        #initialize the row
        row = []
        #add every 'cell' to the row list, in the same order as the headers
        row.append(str(items['latitude']).encode('utf-8'))
        row.append(str(items['longitude']).encode('utf-8'))
        #once you have all the cells in there, write the row to your csv
        writer.writerow(row)
    #increment our loop counter, now we're on the next time through the loop
    i = i + 1
    #tell Python to rest for 5 secs, so we don't exceed our rate limit
    #sleep(5)




# CONVERT THE CSV TO A GEODATABASE TABLE

#close the CSV file so everything is flushed to disk before converting it
outfile.close()

import arcpy
arcpy.TableToTable_conversion(outfile_path, r"C:\dev_folder\orginalDev.gdb", "jsoncsv2")

[Image: previewCatalog.PNG]
This is my table output when running in Python.

[Image: previewTableTool.PNG]
This is the output when using the tool from ArcToolbox in ArcCatalog.

2 Replies
XanderBakker
Esri Esteemed Contributor

I would probably create the output table first, add the fields and use an insert cursor to fill the table:

import os
import arcpy

fgdb = r"C:\dev_folder\orginalDev.gdb"
tbl_name = "jsoncsv2"
tbl = os.path.join(fgdb, tbl_name)
fld_lat = "Latitude"
fld_lon = "Longitude"

arcpy.CreateTable_management(fgdb, tbl_name)
arcpy.AddField_management(tbl, fld_lon, "DOUBLE")
arcpy.AddField_management(tbl, fld_lat, "DOUBLE")

flds = (fld_lon, fld_lat)
with arcpy.da.InsertCursor(tbl, flds) as curs:
    for items in parsed_json['items']:
        row = (float(items['longitude']), float(items['latitude']))
        curs.insertRow(row)
MahtabAlam1
Occasional Contributor

ArcMap/ArcCatalog will automatically create a schema.ini. Please see Adding an ASCII or text file table.

In the absence of a schema.ini, the arcpy.TableToTable_conversion tool might give unexpected results, especially when you are executing a standalone Python script. You can provide the field_mapping parameter or consider creating your own schema.ini file.
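
For example, a minimal sketch of writing a schema.ini next to the CSV before running the conversion; the column names and Double field types here are assumptions based on the script above:

import os
import arcpy

# path of the CSV produced by the script above (assumed)
csv_path = r'C:\Users\Administrator\PycharmProjects\untitled\json2fgdb.csv'

# schema.ini must sit in the same folder as the text file it describes,
# and its section header is the CSV's file name
schema = os.path.join(os.path.dirname(csv_path), 'schema.ini')
with open(schema, 'w') as f:
    f.write('[json2fgdb.csv]\n')
    f.write('Format=CSVDelimited\n')
    f.write('ColNameHeader=True\n')
    f.write('Col1=latitude Double\n')   # assumed field types
    f.write('Col2=longitude Double\n')

# with the schema in place, the conversion should read both fields as doubles
arcpy.TableToTable_conversion(csv_path, r'C:\dev_folder\orginalDev.gdb', 'jsoncsv2')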