POST
The second code sample in the ArcGIS Help 10.2 - ListFeatureClasses (arcpy) topic gets you most of the way there. The code below is adapted from that example:

import arcpy
import os

gdb = #
arcpy.env.workspace = gdb
datasets = arcpy.ListDatasets(feature_type='feature')
datasets = [''] + datasets if datasets is not None else []

fcList = []
for ds in datasets:
    for fc in arcpy.ListFeatureClasses(feature_dataset=ds):
        fcList.append(os.path.join(gdb, ds, fc))
        # fcList.append(os.path.join(ds, fc))  # if you don't want gdb path included
fcList.sort()

The Esri sample code, beyond providing the functional framework, incorporates two notable practices. The first is that including an empty string '' in the datasets list allows the loop to pick up the feature classes that aren't in a feature dataset, i.e., the feature classes in the root of the geodatabase. The second is that os.path.join is used to build the full path of each feature class being listed. If you are going to do any further processing of the list of feature classes, having their full paths makes the next steps easier. Even if you are just listing feature classes to see what is present, including the dataset along with the feature class name provides more context.
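The empty-string trick works because os.path.join simply drops an empty path component, so '' stands in cleanly for "no feature dataset" and root-level feature classes still get a well-formed path. A quick sketch (the geodatabase and feature class names are made up for illustration; posixpath is used instead of os.path only so the separators are the same on any OS):

```python
# Demonstrate that an empty '' dataset component adds nothing to the path,
# which is why the sample seeds the datasets list with ''.
import posixpath  # same join logic as os.path on Linux/macOS

gdb = 'C:/data/parcels.gdb'

in_dataset = posixpath.join(gdb, 'Cadastral', 'Parcels')
at_root = posixpath.join(gdb, '', 'Roads')  # '' contributes no extra segment

print(in_dataset)  # C:/data/parcels.gdb/Cadastral/Parcels
print(at_root)     # C:/data/parcels.gdb/Roads
```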
02-23-2015 08:21 AM

POST
Huh, not sure then. For me, Background Geoprocessing generates that exact error, and disabling it makes the error go away with the code above. Maybe close out of ArcGIS Desktop, delete all of the PYC and PYO files in the comtypes site-packages folder, and clear out the gen folder under comtypes.
02-20-2015 03:20 PM

POST
Now I remember why this sounded so familiar; I ran into the same issue about six months back. Turn off "Background Processing." I am guessing you have it enabled, which means some of the code is being run out of process, so there is no AppRef.
02-20-2015 02:01 PM

POST
Posting the exact snippet and the exact error message would be helpful; there are lots of snippets. How are you running the code: in the interactive Python window in ArcGIS Desktop, or as a standalone script?
02-20-2015 01:13 PM

POST
I would look into creating a map package: "A map package contains a map document (.mxd) and the data referenced by the layers it contains, packaged into one convenient, portable file." The creation of map packages can be fine-tuned and scripted using the Package Map (Data Management) tool.
02-20-2015 07:55 AM

POST
It would be helpful if you could elaborate a bit more on the final data structure you are looking to create. The sample JSON object is large and sparse, which would create a large and sparse table if you load it all into a single table. I am unclear whether you are interested in all items in the list or only certain subsets. You seem to be working with electronic waste; is that the subset you want to extract? When you say you want to "iterate through most of the fields shown in the JSON response (not all)," what are your criteria for skipping a field?
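To illustrate the kind of subsetting being asked about, here is a minimal sketch of pulling only certain fields and records out of a large, sparse JSON response before tabulating it. The field names ('id', 'material', 'weight_kg') and the sample records are hypothetical placeholders, not taken from the actual response:

```python
# Keep only the fields you plan to load into the table, and only the
# records for the subset of interest (e-waste, in this hypothetical case).
import json

response_text = '''
[{"id": 1, "material": "e-waste", "weight_kg": 3.2, "notes": null},
 {"id": 2, "material": "plastic", "weight_kg": 1.1},
 {"id": 3, "material": "e-waste"}]
'''

wanted = ('id', 'material', 'weight_kg')

rows = []
for item in json.loads(response_text):
    if item.get('material') == 'e-waste':
        # .get() returns None for fields the sparse record is missing.
        rows.append({k: item.get(k) for k in wanted})

print(rows)
```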
02-20-2015 07:45 AM

DOC
In order to decide what tool to use on the fly, you would have to have an if...elif...else structure, a dictionary map, or some other control structure, and in that case, why not just call the tool as normal? The other hang-up is arguments, because different tools take different arguments. Granted, some tools share syntax/arguments, but not enough to make one argument list to rule them all. From a learning perspective, and for understanding more of the nuts and bolts of Python, code snippets like this are useful even if the real-world need is limited.
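A minimal sketch of the dictionary-map alternative mentioned above, using made-up handler functions as stand-ins for real geoprocessing tools (the tool names and signatures here are illustrative, not actual arcpy calls):

```python
# Dictionary-based dispatch: map a tool name to a callable, instead of
# an if/elif chain. The caller still has to supply the right arguments
# for each tool, since different tools take different ones.
def buffer_tool(in_fc, distance):
    return 'Buffer(%s, %s)' % (in_fc, distance)

def clip_tool(in_fc, clip_fc):
    return 'Clip(%s, %s)' % (in_fc, clip_fc)

dispatch = {
    'buffer': buffer_tool,
    'clip': clip_tool,
}

def run_tool(name, *args):
    try:
        tool = dispatch[name]
    except KeyError:
        raise ValueError('Unknown tool: %s' % name)
    return tool(*args)

print(run_tool('buffer', 'roads', '100 Feet'))
```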
02-19-2015 01:49 PM

POST
Owen Earley, although I agree that GeoNet is really about Esri and ArcGIS, the GeoNet home page does say "all things geo." Maybe the intern got excited and got a bit carried away with the scope statement, but one could argue Esri did open the door to non-Esri questions, not that one should expect to get many responses.
02-19-2015 10:36 AM

POST
How are you "seeing" the geodatabase tables? Are you logging into SQL Server using SQL Server Management Studio (SSMS) or SQLCMD? If the former, are there no views showing up under the Views folder/tab? Originally, with 10.1, versioned views ended with "_VW", but Esri changed that in 10.2 to end in "_EVW".
02-19-2015 09:13 AM

POST
Starting with ArcGIS 10.1, versioned views are automatically created for tables or feature classes that are registered as versioned. Are you sure the "Enable SQL Access" isn't greyed out because a versioned view already exists?
02-19-2015 06:42 AM

POST
Try the following, see if this is what you are after:

import arcpy

census = #census unit feature class
percentField = #field where percentage will be input
study = #walk or bikeshed feature class

# Make a feature layer for use with SelectLayerByLocation.
arcpy.MakeFeatureLayer_management(census, "census")

# Make a search cursor over the study areas, even if there is just one.
studyCursor = arcpy.da.SearchCursor(study, ["OID@", "SHAPE@"])
for oid, s_shape in studyCursor:
    # For the given study area, select the census blocks that intersect it.
    arcpy.SelectLayerByLocation_management("census", "INTERSECT", s_shape)
    # Make an update cursor over the census blocks selected above.
    censusCursor = arcpy.da.UpdateCursor("census", ["SHAPE@", percentField])
    for c_shape, _ in censusCursor:
        # Get the area of the census block.
        area = c_shape.area
        # Find the percentage of the census block that intersects the study area.
        pct = c_shape.intersect(s_shape, 4).area / area * 100
        # Update the record.
        censusCursor.updateRow([c_shape, pct])
    del censusCursor
del studyCursor

The code above is designed to be run as a standalone script, not as a Python tool. If the functional code works for you, it can be moved into a Python tool without much effort.
02-18-2015 01:50 PM

POST
I couldn't agree more. From the perspective of managing ArcGIS software and not just using it, things have degenerated ever since the dot releases came to light. Beyond the sheer number of patches and the lack of any roll-up, all of the back-porting of patches has made for an interesting landscape. Now we have patches that need to be applied to 10.2.1 and then re-applied after upgrading to 10.2.2, which seldom if ever happened when service packs were around, because back-porting was practically nonexistent between service packs. To complicate the matter, PatchFinder.exe hasn't had its metadata updated in more than a year. The file properties still show the same version number as the one I downloaded 12 months ago, but the one from 12 months ago doesn't see all of the newer patches. If the file properties aren't being updated as new versions of PatchFinder.exe are released, how are users supposed to know they have an out-of-date version?
02-18-2015 11:19 AM

POST
Do you have 64-bit Background Geoprocessing installed? If so, then you might be running into an issue where a different ArcPy site package is being loaded than what you expect. If you don't have 64-bit Background Geoprocessing installed, then the only issue should be which Python interpreter you are using, the 32-bit tied to Desktop or the 64-bit tied to Server. In general, though, assuming which ArcPy site package is loaded based on ArcGIS installation order isn't a good practice. You can find the path to the specific interpreter being used with sys.executable, just write a short script to dump that value to a file. That being said, how are you running the scheduled task? Are you specifying the Python executable and passing it your script as an argument or running the script directly and relying on Windows file extension associations to determine which interpreter is being used?
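The sys.executable check mentioned above can be a three-line script. A minimal sketch (the log file name is an arbitrary example; point it anywhere the scheduled task can write):

```python
# Log which Python interpreter a scheduled task actually runs under,
# plus its bitness, so you can tell 32-bit Desktop from 64-bit apart.
import struct
import sys

log_path = 'interpreter_check.txt'

with open(log_path, 'w') as log:
    log.write('interpreter: %s\n' % sys.executable)
    # 8 * size of a pointer = 32 or 64 bits.
    log.write('bitness: %d-bit\n' % (8 * struct.calcsize('P')))

print(open(log_path).read())
```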
02-18-2015 11:03 AM

POST
Did some quick tests selecting 25 random records against a ~700,000-record non-versioned feature class in Oracle. Of the three approaches I mentioned above (random.sample, reservoir sampling, and SQL sampling), random.sample was the quickest, taking about 4.5 seconds on average to make a feature layer. SQL sampling took about 1.4x longer than random.sample, while reservoir sampling took 5.1x longer. It seems the overhead of calling random over N items becomes quite impactful with hundreds of thousands of records, and that impact will only grow as N grows. Also, the memory impact of fully populating an OID list was quite a bit smaller than I anticipated. The reservoir sampling code above is quite simple and isn't distributed (I have seen some high-performing distributed reservoir sampling code), but it still demonstrates the trade-offs of the different approaches. The SQL sampling performed fairly well, a bit better than I expected; given it wasn't that much slower than random.sample, a more efficient piece of SQL might make the approach more competitive. It would be interesting to see a SQL Server comparison, but that will have to wait for another day.
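For reference, a pure-Python sketch of the two sampling strategies being compared: random.sample over a fully materialized OID list versus single-pass reservoir sampling (Algorithm R). The OIDs here are synthetic integers standing in for ObjectIDs read from a cursor, so the timings will differ from the database tests above:

```python
# Compare two ways of picking k uniform random items from N records:
# materialize-then-sample vs. a one-pass, O(k)-memory reservoir.
import random

def reservoir_sample(iterable, k, rng=random):
    """Return k items chosen uniformly from an iterable of unknown length."""
    reservoir = []
    for i, item in enumerate(iterable):
        if i < k:
            reservoir.append(item)
        else:
            # Each later item replaces a reservoir slot with probability k/(i+1).
            j = rng.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir

oids = range(1, 700001)  # stand-in for ~700,000 ObjectIDs

random.seed(42)
full_list_pick = random.sample(list(oids), 25)      # materializes the whole list
streaming_pick = reservoir_sample(iter(oids), 25)   # one pass, O(k) memory

print(len(full_list_pick), len(streaming_pick))
```

The memory/CPU trade-off in the post shows up directly here: random.sample needs the whole OID list in memory but makes only k random draws, while the reservoir holds only k items but calls the random number generator roughly N times.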
02-15-2015 10:18 PM