POST
Hi Matt,

To convert from shapefile to SDE feature class I'd use the Feature Class To Feature Class tool. The projection step introduces another layer of complexity. I'm not 100% sure how to check the spatial reference of a shapefile in ModelBuilder, but this is one thing arcpy is great at. If you haven't tried Python/arcpy, I'd encourage you to give it a shot. Assuming your desired spatial reference is NAD 1983 UTM Zone 10N, your script might look something like this:

```python
# import required modules
import arcpy, os

# create variables pointing to the desired spatial reference, a folder
# containing your input shapefiles, and your output SDE workspace
spRef = arcpy.SpatialReference(26910)
shapefile_folder = r"C:\WorkArea\ShapefilesFromContractor"
sde_workspace = r"C:\WorkArea\DatabaseConnections\MySDEWorkspace.sde"

# list all of the shapefiles in the shapefile folder
arcpy.env.workspace = shapefile_folder
shpList = arcpy.ListFeatureClasses()

# loop through the shapefiles
for shp in shpList:
    # test to see if the projection is undefined
    if arcpy.Describe(shp).spatialReference.GCSName == "":
        # define the projection
        arcpy.DefineProjection_management(shp, arcpy.SpatialReference(4269))
        # project to the desired spatial reference
        arcpy.Project_management(shp, os.path.join(sde_workspace, arcpy.Describe(shp).baseName), spRef)
    # if it is not undefined...
    else:
        # test to see if it is in the wrong projected coordinate system
        if arcpy.Describe(shp).spatialReference.PCSCode != 26910:
            # project the shapefile into the correct spatial reference, outputting to SDE
            arcpy.Project_management(shp, os.path.join(sde_workspace, arcpy.Describe(shp).baseName), spRef)
        # otherwise, just move it to SDE
        else:
            arcpy.FeatureClassToFeatureClass_conversion(shp, sde_workspace, arcpy.Describe(shp).baseName)
```
Of course, if the shapefiles come in with the spatial reference undefined, you may need to guess, sleuth out, or ask what geographic coordinate system was used to collect the data, and possibly use a transformation. For instance, GPS data would likely have a geographic coordinate system of WGS 1984. Here's the help for the geoprocessing tools that will get you there: Feature Class To Feature Class, Project, and Define Projection. Good luck! Hope this helps.

Micah
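The branching in the loop above can be separated from arcpy and checked on its own. Here is a plain-Python sketch (function name hypothetical, no ArcGIS required) that returns which action the script would take for a given spatial reference:

```python
def choose_action(gcs_name, pcs_code, target_pcs=26910):
    """Mirror the script's branching: decide what to do with a shapefile
    based on its spatial reference properties."""
    if gcs_name == "":
        # undefined: define a projection, then project to the target
        return "define_then_project"
    elif pcs_code != target_pcs:
        # defined but in the wrong PCS: project to the target
        return "project"
    else:
        # already in the target PCS: straight copy to SDE
        return "copy"

print(choose_action("", None))                          # → define_then_project
print(choose_action("GCS_North_American_1983", 0))      # → project
print(choose_action("GCS_North_American_1983", 26910))  # → copy
```

In the real script, the two Describe() properties (GCSName and PCSCode) would supply the arguments.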
Posted 09-21-2016 08:34 AM

POST
Terrific! I've modified it. If at first you don't succeed...
Posted 09-14-2016 01:07 PM

POST
I see. Can you post your full script? Also, maybe try the field calculator expression without the str() around the i variable and !OBJECTID!.
Posted 09-14-2016 11:57 AM

POST
Hi Zachary,

Would Cluster_Plot = '"{}-{}"'.format(i, '!OBJECTID!') work? I've become a big fan of using triple-quoted strings and the .format() method when creating a variable to use as a field calculator expression. Another option is to use an update cursor. Something like:

```python
import arcpy

table = "myTable"
fields = ("OBJECTID", "PLOT_ID")
i = 1
with arcpy.da.UpdateCursor(table, fields) as cursor:
    for row in cursor:
        row[1] = str(i) + "-" + str(row[0])
        cursor.updateRow(row)
        i += 1
```

Good luck!
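As a quick sanity check of the .format() suggestion, plain Python (no arcpy needed) shows exactly what literal string the field calculator would receive:

```python
# Build the field calculator expression string with .format()
i = 1
expr = '"{}-{}"'.format(i, '!OBJECTID!')

# The outer quotes survive in the result; the field calculator would then
# substitute the !OBJECTID! token per row.
print(expr)  # → "1-!OBJECTID!"
```

Printing the expression before passing it to Calculate Field is a cheap way to catch quoting mistakes.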
Posted 09-14-2016 11:47 AM

POST
Hi George. That's frustrating! Have you installed the SQL Server Native Client on the computer in question? I remember having trouble connecting to SQL Server SDE in the past, and I believe that fixed it. This ArcGIS Help page has some other things to check when experiencing problems connecting to SQL Server with ArcGIS. It might be worth a look. Good luck!
Posted 09-14-2016 11:37 AM

POST
Good morning Benjamin,

Do the tables all have the same schema? If not, are you doing any kind of field mapping for your merge? I'm not 100% sure how the Merge tool works when no field mapping info is supplied, but I suspect it may be adding the fields of each input table into the output, which could be hurting your performance. You may wish to create your output table ahead of time, and then use iteration to append the input tables. Something along the lines of:

```python
import arcpy

arcpy.env.workspace = "C:\\DeleteMe\\tomerge\\tables.gdb"
target_table = "C:\\DeleteMe\\output\\final.gdb\\target_table"
tblList = arcpy.ListTables()
for t in tblList:
    arcpy.Append_management([t], target_table, "NO_TEST")
```

I was using the Append tool a couple weeks ago to append 60+ feature classes to one target. I think it took about ten minutes. Good luck!

Micah
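The loop is just a fan-in pattern. Here it is sketched in plain Python with a stub standing in for arcpy.Append_management (all names here are illustrative), so the control flow can be exercised without ArcGIS installed:

```python
def append_all(tables, target, append_fn):
    """Append each input table to the target, one at a time."""
    for t in tables:
        # Each call appends a single table; "NO_TEST" skips schema checking,
        # mirroring the arcpy.Append_management call above.
        append_fn([t], target, "NO_TEST")

# Stub that records calls instead of touching a geodatabase.
calls = []
append_all(["a", "b", "c"], "target_table",
           lambda inputs, target, schema_type: calls.append((inputs[0], target)))
print(calls)  # → [('a', 'target_table'), ('b', 'target_table'), ('c', 'target_table')]
```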
Posted 09-13-2016 08:42 AM

POST
I had the exact same behavior with pyodbc. Have you looked into pypyodbc to see if it can get you what you need? It is pretty similar (although it can't reference rows quite the same as I recall). I think it may be more reliable. Too bad pyodbc has this problem within a custom script tool. It really is a fantastic package but for this issue! Good luck, Micah
Posted 09-12-2016 10:16 PM

POST
Hello Shannon,

Two thoughts come to mind:

1. Do the features contain many polygons that snake all over the place? Polygons with very complex edges (lots of vertices) that stretch over a large part of the feature class' extent will be very hard to index efficiently. If this is the case, it may be worth splitting up the features by some sort of regular boundary, like township. This may improve your drawing performance.
2. How many attributes does the feature class contain? If there is only one attribute that will be queried, convert the data to raster and it will draw super fast.

Otherwise, it may be worth researching the problem with the specific version of SQL Server added to your search terms. Or, if you have one, ask your SQL DBA if there is anything in the database that might be causing the problem. Good luck!

Micah
Posted 09-12-2016 03:26 PM

POST
Greetings,

What type of data sources are you publishing for use in your app? If the data is stored in a geodatabase, you may consider running the Check Geometry tool, which may yield some information about the invalid geometry. Typically these issues include null geometry, self-intersections, duplicate vertices, or line/polygon segments which are shorter than the XY tolerance for the dataset (typically 0.0001 meters, I believe). There is a corresponding Repair Geometry tool which usually does a nice job of fixing the issues with minimal pain.

Micah
Posted 09-12-2016 11:39 AM

POST
Thank you all very much for your input! In the end I only needed a table with three fields (SourceTablePath, SourceFieldName, DestinationFieldName). From there I wrote the following script (as it would be implemented in a script tool):

```python
# Name: appendWithFieldMap.py
# Author: Micah Babinski
# Date: 9/8/2016
# Description: Appends features to an output feature class using field
#              mapping information stored in a table
#
# USAGE NOTE: The table must contain three columns:
# 1. SourceTablePath (fully-qualified path to the source feature class)
# 2. SourceFieldName (name of the input field as it appears in the source feature class)
# 3. DestinationFieldName (destination field name as it appears in the destination feature class)

import arcpy

# obtain user parameters
outputFeatureClass = arcpy.GetParameterAsText(0)  # r"W:\FieldMappingResearch\gdb\Output.gdb\roads"
fieldMapTable = arcpy.GetParameterAsText(1)  # r"W:\FieldMappingResearch\gdb\Inputs.gdb\fieldMapTable"

# list the fields which contain the field map info
fields = ["SourceTablePath", "SourceFieldName", "DestinationFieldName"]

# define function to get unique field values from a table
def GetUniqueFieldValues(table, field):
    """
    Retrieves a sorted list of unique values in a user-specified field.

    Args:
        table (str): path or name of a feature class, layer, table, or table view
        field (str): name of the field for which the user wants unique values

    Returns:
        list: the unique values in the field
    """
    with arcpy.da.SearchCursor(table, [field]) as cursor:
        return sorted({row[0] for row in cursor})

# get a list of unique input feature classes
arcpy.AddMessage("Listing the unique input feature classes.")
inputs = GetUniqueFieldValues(fieldMapTable, "SourceTablePath")

# get a list of unique output fields
arcpy.AddMessage("Listing the unique output fields.")
outputFields = GetUniqueFieldValues(fieldMapTable, "DestinationFieldName")

# create an empty field mappings object
arcpy.AddMessage("Creating an empty field mappings object.")
fm = arcpy.FieldMappings()

# build the field mappings object
arcpy.AddMessage("Building the field mappings object.")
for f in outputFields:
    arcpy.AddMessage("\t" + f + "...")
    # create a field map object
    arcpy.AddMessage("\t...Creating a field map object.")
    fMap = arcpy.FieldMap()
    with arcpy.da.SearchCursor(fieldMapTable, fields, "{0} = '{1}'".format("DestinationFieldName", f)) as cursor:
        for row in cursor:
            # add the input field to the field map
            arcpy.AddMessage("\t...Adding " + row[1] + " as an input field.")
            fMap.addInputField(row[0], row[1])
    # set the output name
    arcpy.AddMessage("\t...Setting the output name.")
    outputFieldName = fMap.outputField
    outputFieldName.name = f
    fMap.outputField = outputFieldName
    # add the field map to the field mappings object
    arcpy.AddMessage("\t...Adding the field map to the field mappings object.")
    fm.addFieldMap(fMap)

# perform the append
arcpy.AddMessage("\nAppending!")
arcpy.Append_management(inputs, outputFeatureClass, "NO_TEST", fm)
```

To summarize, the workflow was:

1. Obtain a unique list of input feature classes (full paths, for use in the Append GP tool)
2. Obtain a unique list of the output fields (one arcpy.FieldMap() object is needed per output field)
3. Create an empty field mappings object
4. For each output field, create an empty field map object, obtain the input fields for that output field with a da.SearchCursor, add those to the field map object, set the name of the output field, and add the field map to the field mappings object

I imagine there is a lot more that could be done with this in terms of validation and error handling (for when field types are mismatched), but I think this will get me what I need. Thanks again!

Micah
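The heart of GetUniqueFieldValues is a set comprehension, and since a da.SearchCursor just yields row tuples, the same logic can be tried in plain Python with a list standing in for the cursor (sample rows below are made up):

```python
def get_unique_values(rows, index=0):
    """Return the sorted unique values at the given position in each row,
    mirroring the set comprehension inside GetUniqueFieldValues."""
    return sorted({row[index] for row in rows})

# Rows shaped like the field map table:
# (SourceTablePath, SourceFieldName, DestinationFieldName)
rows = [
    (r"C:\data\a.gdb\roads", "RD_NAME", "ROAD_NAME"),
    (r"C:\data\b.gdb\roads", "NAME", "ROAD_NAME"),
    (r"C:\data\a.gdb\roads", "RD_TYPE", "ROAD_TYPE"),
]
print(get_unique_values(rows, index=2))  # → ['ROAD_NAME', 'ROAD_TYPE']
```

The set deduplicates, and sorted() gives a stable ordering for the messages the script prints.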
Posted 09-08-2016 04:17 PM

POST
Thanks Joshua for pointing out that I wasn't very clear about the overall process at a high level! You are correct. We have a large number of desired output datasets, each with many inputs from state, district, and field offices. People I work with have developed "crosswalk" tables, which show each desired output field in the left column, the input datasets along the top row, and which input field should map into the output field in the cell where they intersect. A simple example might be:

| Destination Field | ROW_Oregon | ROW_Arizona | ROW_Wyoming | ROW_Nevada |
|---|---|---|---|---|
| ROW_NAME | ROW_NM | NAME | row_nm_txt | name |
| ROW_TYPE | ROW_TYPE | ROW_TP | row_tp_txt | type |
| GIS_ACRES | ACRES | GIS_ACRES | AREA_ACRES | acre |
| EFFECTIVE_DATE | ROW_DT | EFFECTV_DATE | ROW_DATE | date |

I am using another table which lists the desired field names, types, and lengths (for text fields) of each dataset, then adding fields to an empty feature class using code like this:

```python
import arcpy

table = "schema_table"
fields = ("field_name", "field_type", "field_length")
target = "output_fc"
with arcpy.da.SearchCursor(table, fields) as cursor:
    for row in cursor:
        if row[1] == "TEXT":
            arcpy.AddField_management(target, row[0], row[1], "#", "#", row[2])
        else:
            arcpy.AddField_management(target, row[0], row[1])
```

Once I have an empty feature class with the desired output schema, I want to create an arcpy field mapping object and use it as a parameter to the Append tool to load the data into the output feature class. Currently I am just using the stand-alone Append GP tool (with the NO_TEST option) and selecting the input fields for each output field, which is extremely time-consuming, as some of our datasets have more than 90 inputs. The examples given for creating field maps in the help documentation are pretty basic and use hard-coded field names and properties. I am wondering if it is possible to reformat the crosswalk table above in a way that I could read it with a SearchCursor and create the field mapping object automatically. Hope that clears it up.

Micah
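Reformatting the wide crosswalk into long (source dataset, source field, destination field) rows is plain list work. A sketch in ordinary Python, using a cut-down version of the example crosswalk and assuming blank cells mean "no input field for this dataset":

```python
# Wide crosswalk: one row per destination field, one column per input dataset.
header = ["Destination Field", "ROW_Oregon", "ROW_Arizona", "ROW_Wyoming", "ROW_Nevada"]
rows = [
    ["ROW_NAME", "ROW_NM", "NAME", "row_nm_txt", "name"],
    ["GIS_ACRES", "ACRES", "GIS_ACRES", "AREA_ACRES", "acre"],
]

def pivot_crosswalk(header, rows):
    """Yield (source_dataset, source_field, destination_field) triples."""
    long_rows = []
    for row in rows:
        dest = row[0]
        # pair each dataset column with the cell in this row
        for dataset, src in zip(header[1:], row[1:]):
            if src:  # skip blank cells
                long_rows.append((dataset, src, dest))
    return long_rows

print(pivot_crosswalk(header, rows)[0])  # → ('ROW_Oregon', 'ROW_NM', 'ROW_NAME')
```

The long rows can then be loaded into a geodatabase table and consumed by a SearchCursor, one FieldMap per distinct destination field.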
Posted 09-03-2016 09:45 AM

POST
Thanks Neil. Here was my concept: one table for each output dataset (there are many). The columns would be something along the lines of:

1. SourceTablePath
2. SourceFieldName
3. SourceFieldType
4. DestinationFieldName
5. DestinationFieldType
6. DestinationFieldLength (if text)

It seems like that should cover most of the parameters for a field mapping object. I just don't have quite enough field mappings experience to piece together a working script. This is for a compilation of nationwide datasets with tons of inputs from state, district, and field offices. My colleagues have prepared the "crosswalk" tables and I am responsible for merging them. One dataset had 91 inputs! So this could really save me a lot of time. Thanks again.
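With a table shaped like that, the key step is grouping the source fields under each destination field, since one arcpy.FieldMap would be built per group. A plain-Python sketch of the grouping (sample rows are made up; no arcpy needed):

```python
from collections import defaultdict

# Long-format rows: (SourceTablePath, SourceFieldName, DestinationFieldName)
rows = [
    ("a.gdb/roads", "RD_NM", "ROAD_NAME"),
    ("b.gdb/roads", "NAME", "ROAD_NAME"),
    ("a.gdb/roads", "RD_TP", "ROAD_TYPE"),
]

# Group input fields under each destination field -- one group per
# field map that would be created.
groups = defaultdict(list)
for table_path, src_field, dest_field in rows:
    groups[dest_field].append((table_path, src_field))

print(sorted(groups))            # → ['ROAD_NAME', 'ROAD_TYPE']
print(len(groups["ROAD_NAME"]))  # → 2
```

Each group's (table, field) pairs would feed FieldMap.addInputField, and the group key would become the output field name.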
Posted 09-01-2016 10:20 AM

POST
Does anyone know if it is possible to store information on input datasets, input fields, and destination fields in a table, and then use arcpy to build a field mapping object from said table? I have built arcpy field mapping objects in the past using blocks of code for each field, but I don't have any experience reading the info out of a table and building the field mapping object with an iterator. I'd love to get your thoughts!

Warm Regards,
Micah Babinski
Posted 09-01-2016 09:46 AM

POST
That should work for you, then. They've got TSR, land ownership, and field office areas.
Posted 08-31-2016 03:52 PM

POST
Hi Robert. I'm not sure how helpful this is, but here is the Colorado BLM GIS data download page. I don't believe there is a published national BLM cached basemap yet.
Posted 08-31-2016 09:36 AM