
Arcpy TableToTable_Conversion writing extra fields?

02-06-2015 08:38 PM
GeoffreyWest
Frequent Contributor

I would like to write two headers, but I am receiving my two actual headers plus three blank fields. What could be the cause of this? Here is my script and the output. The script parses a JSON web service and writes the latitude/longitude of buses to a table in a file geodatabase.

[Attached screenshots: Fq99B.png, s2f3s.png]
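(For context, the parsed response that the loop below iterates over is expected to look roughly like this; the coordinate values are made up and any other fields returned by the API are omitted.)

# Expected shape of json.load(...) output; sample values only.
parsed_json = {
    "items": [
        {"latitude": 33.9416, "longitude": -118.4085},
        {"latitude": 34.0522, "longitude": -118.2437}
    ]
}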

import requests
import json
import urllib2
import csv
import arcpy

if arcpy.Exists("C:\MYLATesting.gdb\API_Table"):
  arcpy.Delete_management("C:\MYLATesting.gdb\API_Table")

url = "http://api.metro.net/agencies/lametro/vehicles/"
parsed_json = json.load(urllib2.urlopen("http://api.metro.net/agencies/lametro/vehicles/"))
details = {'items': 'longitude'}
headers = {'Content-type': 'application/x-www-form-urlencoded', 'Accept': '/'}
response = requests.get(url, data=json.dumps(details), headers=headers)


# tell computer where to put CSV
outfile_path='C:\Users\Administrator\Desktop\API_Testing.csv'

# open it up, the w means we will write to it
writer = csv.writer(open(outfile_path, 'wb'))

#create a list with headings for our columns
headers = ['latitude','longitude']

for items in parsed_json['items']:
  row = []
  row.append(str(items['latitude']).encode('utf-8'))
  row.append(str(items['longitude']).encode('utf-8'))
  writer.writerow(row)

i = 1
i = i +1

f = open(outfile_path, 'wb')
writer = csv.writer(f)
#write the row of headings to our CSV file
writer.writerow(headers)

f.close()

#Temporary Delete Function to Overwrite Table

arcpy.TableToTable_conversion(outfile_path, "C:\MYLATesting.gdb", "API_Table")

print response.text
10 Replies
JoshuaBixby
MVP Esteemed Contributor

No. The strings can be whatever length you specify; I simply chose 12 because, looking over the data, 12 characters appeared to be more than enough to hold all the potential values. If you need a 48-character string, either 'S48' or '|S48' would work. If you are working with Unicode, use a 'U' in place of the 'S'.
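As a minimal sketch of what that looks like in practice (the field names, the 48-character width, and the use of arcpy.da.NumPyArrayToTable are assumptions drawn from this reply rather than the full thread):

import numpy
import arcpy

# Each tuple is (field_name, dtype): '|S48' is a 48-byte string field; 'U48' would be Unicode.
dtype = [('latitude', '|S48'), ('longitude', '|S48')]
rows = [('33.9416', '-118.4085'), ('34.0522', '-118.2437')]
array = numpy.array(rows, dtype=dtype)

# The text fields in the output table are sized from the dtype widths above.
arcpy.da.NumPyArrayToTable(array, r"C:\MYLATesting.gdb\API_Table")

NumPyArrayToTable creates a new table and will fail if API_Table already exists, so the arcpy.Delete_management step from the original script still applies.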
