POST
This sounds like the likely solution. My post-processing is a field calculation and a search cursor in a Python script. From a conceptual point of view, would I disable editor tracking at the beginning of the script and re-enable it at the end?
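A minimal sketch of that idea, assuming the dataset already has the standard tracking fields; the feature class path below is a placeholder, and the exact keyword strings for the disable tool are worth double-checking in the tool help.

import arcpy

fc = r'C:\connections\prod.sde\dbo.Inspections'   # hypothetical feature class

# turn editor tracking off so the post-processing edits don't stamp the desktop user
arcpy.DisableEditorTracking_management(fc, 'DISABLE_CREATOR', 'DISABLE_CREATION_DATE',
                                       'DISABLE_LAST_EDITOR', 'DISABLE_LAST_EDIT_DATE')

# ... field calculation / search cursor post-processing goes here ...

# turn editor tracking back on with the same fields once the script finishes
arcpy.EnableEditorTracking_management(fc, 'created_user', 'created_date',
                                      'last_edited_user', 'last_edited_date',
                                      'NO_ADD_FIELDS', 'UTC')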
Posted 02-03-2015 09:12 AM

POST
I have a Collector for ArcGIS application that records last_edited_user from field edits; however, post-processing then overwrites the field users with my desktop user. Is there a way to lock last_edited_user so the original field editor is preserved? I am using ArcGIS for Server 10.2.1 and SQL Server 2008 R2.
Posted 02-03-2015 08:45 AM

POST
Figured it out: PlannedDate = CONVERT(DATE, GETDATE())
Posted 02-02-2015 10:46 AM

POST
I am writing a script to access each day's data. I would like to select by attribute and send those features to a local file geodatabase. After reading some documentation, CURRENT_DATE is the function that returns today's date. This is an example of my data: '2015-02-02 00:00:00'. My features have no meaningful time component, so in reality it is only YYYY-MM-DD. In a local file geodatabase, the query PlannedDate = CURRENT_DATE works; however, in my ArcSDE SQL Server connection it does not. PlannedDate = GETDATE() does not work either, yet no errors are returned. The data in my table displays as m/d/yyyy, while the data shown when selecting by attribute is yyyy-mm-dd.
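A minimal sketch of that selection against SQL Server, using the CONVERT(DATE, GETDATE()) expression from the follow-up post above; the feature class and output paths are placeholders.

import arcpy

sde_fc = r'C:\connections\prod.sde\dbo.WorkOrders'   # hypothetical SDE feature class
out_fc = r'C:\data\daily.gdb\WorkOrders_today'       # hypothetical local output

# SQL Server syntax: CONVERT(DATE, GETDATE()) drops the time portion,
# so it matches values stored as '2015-02-02 00:00:00'
where = "PlannedDate = CONVERT(DATE, GETDATE())"

arcpy.MakeFeatureLayer_management(sde_fc, 'today_lyr', where)
arcpy.CopyFeatures_management('today_lyr', out_fc)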
Posted 02-02-2015 10:30 AM

POST
I have a script which writes the lat/lng from a JSON web service of bus routes. The script writes the latitude and longitude to a CSV correctly; however, my table-to-table conversion does not work correctly in Python.

# IMPORTS
# urllib2 fetches pages from the Internet
import urllib2
# json parses the JSON returned by the service
import json
# csv writes the output file
import csv
# sleep lets the script pause so we don't hammer the API and exceed the rate limit
from time import sleep
import arcpy

# tell the script where to put the CSV (raw string so the backslashes are not treated as escapes)
outfile_path = r'C:\Users\Administrator\PycharmProjects\untitled\json2fgdb.csv'

# open the CSV for writing; 'wb' lets the csv module control line endings on Python 2
outfile = open(outfile_path, 'wb')
writer = csv.writer(outfile)

# write the row of column headings
headers = ['latitude', 'longitude']
writer.writerow(headers)

# GET JSON AND PARSE IT INTO A DICTIONARY
# loop counter; only one page of results is requested here
i = 1
while i < 2:
    # print the loop number to make problems easier to track down
    print i
    # fetch the vehicles JSON for route 704 and turn it into a Python dictionary
    url = 'http://api.metro.net/agencies/lametro/routes/704/vehicles/'
    parsed_json = json.load(urllib2.urlopen(url))
    print parsed_json
    # each vehicle is an entry in the 'items' list; pull out its coordinates
    for items in parsed_json['items']:
        row = []
        row.append(str(items['longitude']))
        row.append(str(items['latitude']))
        # once the cells are in, write the row to the CSV
        writer.writerow(row)
    # increment the loop counter
    i = i + 1
    # rest for 5 seconds so we don't exceed the rate limit
    # sleep(5)

# close the CSV so everything is flushed to disk before arcpy reads it
outfile.close()

arcpy.TableToTable_conversion(outfile_path, r"C:\dev_folder\orginalDev.gdb", "jsoncsv2")

This is my table output when running in Python. This is the output when using the tool from ArcToolbox in ArcCatalog.
Posted 01-29-2015 09:01 AM

POST
I have added a feature service to my content; however, the edit properties fields where I can choose whether users are able to export the data do not appear. What could be the issue? I am following this documentation: http://doc.arcgis.com/en/arcgis-online/share-maps/use-hosted-layers.htm#GUID-074D6BFF-60C4-43B6-822F-7BC04F5AF1A5 As the organization administrator I go to the feature service ---> Edit ---> Properties, but there are no properties available to allow exporting individual layers.
Posted 01-27-2015 02:15 PM

POST
I have a script which prints the lat/lon of bus routes; how would I use arcpy to write these to a table in a file geodatabase?

Example output:
45.266759 -75.741350
45.266221 -75.741675
45.311531 -75.739158
45.355146 -75.769231
45.412336 -75.718080

import xml.etree.ElementTree as ET
import urllib

# call the OC Transpo GetNextTripsForStop service (the query string is sent as the POST body)
u = urllib.urlopen('https://api.octranspo1.com/v1.1/GetNextTripsForStop', 'appID=7a51d100&apiKey=5c5a8438efc643286006d82071852789&routeNo=95&stopNo=3044')
data = u.read()

# save the response to disk so it can be parsed
f = open('route3044.xml', 'wb')
f.write(data)
f.close()

# each Trip element in the response carries the coordinates of one bus
doc = ET.parse('route3044.xml')
for bus in doc.findall('.//{http://tempuri.org/}Trip'):
    lat = bus.findtext('{http://tempuri.org/}Latitude')
    lon = bus.findtext('{http://tempuri.org/}Longitude')
    print lat, lon
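A minimal sketch of one way to push those values into a file geodatabase table with arcpy; the geodatabase path, table name, and field names are assumptions, and the coords list stands in for the lat/lon pairs gathered by the loop above.

import arcpy
import os

# hypothetical output geodatabase and table
gdb = r'C:\data\transit.gdb'
table = 'bus_positions'
out_table = os.path.join(gdb, table)

# example coordinates; in practice these would come from the XML-parsing loop above
coords = [(45.266759, -75.741350), (45.266221, -75.741675)]

# create the table and add two double fields for the coordinates
arcpy.CreateTable_management(gdb, table)
arcpy.AddField_management(out_table, 'LAT', 'DOUBLE')
arcpy.AddField_management(out_table, 'LON', 'DOUBLE')

# write each lat/lon pair with an insert cursor
with arcpy.da.InsertCursor(out_table, ['LAT', 'LON']) as cursor:
    for lat, lon in coords:
        cursor.insertRow((float(lat), float(lon)))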
Posted 01-26-2015 09:52 AM

POST
Hi Glenn, thank you for this. This is exactly what we are trying to do. Do you have any examples of this scripted? This is uncharted territory for me; I have a Siebel WSDL but am having difficulty knowing where to start with it.
Posted 01-16-2015 09:37 AM

POST
My team and I have a proposed workflow in which we would like to consume a SOAP service into our ArcGIS Server environment. I am new to SOAP services and feel as if I am missing something. The client's SOAP service exposes customer data (e.g. address, order type) entered into a CRM; we would like to retrieve this data via SOAP and store it in our database. Their SOAP service is not an ArcGIS map or feature service.
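A minimal sketch of one way to pull data from a SOAP service in Python using the suds library and stage it for loading into a geodatabase; the WSDL URL, operation name, and response attributes are hypothetical placeholders, not the actual CRM service.

from suds.client import Client
import csv

# point suds at the service's WSDL; it builds the request/response types from the contract
client = Client('http://crm.example.com/OrderService?wsdl')   # hypothetical WSDL

# call a (hypothetical) operation that returns recent orders
orders = client.service.GetRecentOrders(days=1)

# stage the attributes of interest in a CSV that arcpy or a cursor can load later
with open('orders.csv', 'wb') as f:
    writer = csv.writer(f)
    writer.writerow(['order_id', 'address', 'order_type'])
    for o in orders:
        writer.writerow([o.OrderId, o.Address, o.OrderType])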
Posted 01-15-2015 05:25 PM

POST
I would like to use the Bing Hybrid basemap in a web application. So far I have configured everything with little change to the code other than UI tweaks. Is this possible?
Posted 12-29-2014 09:27 AM

POST
I have a category field with 9 categories; what's an easy way to select one feature of each category? The features can be random, I just need one per category.
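A minimal sketch of one way to do this with arcpy; the feature class and field names are assumptions. It walks the table once, keeps the first ObjectID seen for each category, then selects those ObjectIDs.

import arcpy

fc = r'C:\data\example.gdb\parcels'   # hypothetical feature class
cat_field = 'CATEGORY'                # hypothetical category field

# keep the first ObjectID encountered for each category value
first_per_cat = {}
with arcpy.da.SearchCursor(fc, ['OID@', cat_field]) as cursor:
    for oid, cat in cursor:
        if cat not in first_per_cat:
            first_per_cat[cat] = oid

# build a selection of those ObjectIDs (one per category)
oid_field = arcpy.Describe(fc).OIDFieldName
where = '{0} IN ({1})'.format(oid_field, ', '.join(str(o) for o in first_per_cat.values()))
arcpy.MakeFeatureLayer_management(fc, 'one_per_cat', where)
print 'Selected one feature for each of', len(first_per_cat), 'categories'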
Posted 12-23-2014 09:55 AM

POST
Yes, I can. I was able to change the connections via SQL Server Management Studio.
Posted 12-22-2014 09:31 AM

POST
I am not using the application server service, and there is no SDE user.
Posted 12-22-2014 09:12 AM

POST
So for now I have no SDE service registered with Windows; my question is, do I begin at step 1 again by creating a service with an associated database? When running sdeconfig -o alter -v CONNECTIONS=128 -i sde:sqlserver:SRSSDGIS02 -u sa -p myPassword -D SCData I receive a bad-login-user message, even though my SQL Server database user is indeed sa.
Posted 12-22-2014 08:55 AM
| Title | Kudos | Posted |
|---|---|---|
|  | 1 | 03-16-2017 02:33 PM |
|  | 1 | 01-18-2022 07:40 AM |
|  | 1 | 04-28-2021 09:29 AM |
|  | 1 | 10-24-2016 12:07 PM |
|  | 1 | 04-28-2016 09:12 AM |
Online Status: Offline
Date Last Visited: 01-18-2022 03:08 PM