POST
Thank you for the suggestions. I tried running the Buffer wizard in ArcMap, which ran for over 8 hours; when I came back to the office in the morning, there was an error message saying "ArcMap has crashed". I then ran the Buffer tool again in ArcCatalog without the dissolve option checked. This worked; however, when attempting to run the Dissolve tool on the output I got an error stating "Invalid Topology". I then ran the Repair Geometry tool on the output and re-ran the Dissolve tool, but received the same invalid-topology error. Can someone from Esri please help?
Posted 12-24-2014 08:09 AM

POST
The process has not "bombed"; it is still running. In Task Manager the ArcCatalog.exe process is using 13% CPU, which is 100% of one core, since it is a single-threaded application (100% / 8 cores ≈ 13%). The amount of RAM used by the process also fluctuates rapidly (between 500 MB and 650 MB), so it is definitely still working. Splitting up and then merging is not an option because of the way the Buffer tool dissolves intersections to remove overlapping polygons; this has to be done as a single process (we do not have time to manually correct thousands of overlapping polygons).
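As a quick sanity check on the arithmetic above, a single-threaded process saturating one core shows up in Task Manager as its per-core share of total CPU; a minimal sketch (the 8-core count comes from the workstation specs in this thread):

```python
def single_thread_cpu_share(total_cores):
    """Percentage of total CPU that one fully busy thread can show
    in Task Manager on a machine with `total_cores` cores."""
    return 100.0 / total_cores

# One saturated core on an 8-core box is a 12.5% share,
# which Task Manager displays as roughly 13%.
share = single_thread_cpu_share(8)
```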
Posted 12-22-2014 12:12 PM

POST
I am trying to create a new "Cartway" feature class using the Buffer tool on a centerline feature class with ~750,000 features in ArcGIS 10.1 SP1. The tool has basic parameters as inputs, such as "FULL", "ROUND", and "ALL", and background geoprocessing is disabled. The problem I am having is that the tool gets stuck at 100% completion and never finishes, even though the ArcCatalog.exe process is still working (13% CPU, i.e. 100% / 8 cores). I even let it run over the weekend and it is still not done. The data and scratch workspace are all on solid-state drives (SSDs); there are no spinning disks in this workstation to slow things down, so there should be no bottleneck there. The workstation is a Dell Precision T5500 with the following specifications:

- 2 x Intel Xeon X5677 (2 physical quad-core CPUs, 8 physical cores total, HT disabled)
- 20 GB RAM
- 2 x NVIDIA Quadro 600 GPUs
- 120 GB Corsair V12 SSD
- 120 GB Corsair Force GT SSD

Any ideas as to how I can get this to finish?
Posted 12-22-2014 04:28 AM

POST
Yes, I have been using this tool, and following the same methodology, for years. The geodatabase was created using 10.1 on the same day (today), solely for this purpose. The data was standardized and the locator created today. I checked all address inputs and everything is exactly as it should be. The tool is simply aborting the operation before any processing is done.
Posted 07-29-2014 12:17 PM

POST
I have built an address locator using a parcel feature class after running it through the Standardize Addresses tool. When I go to geocode my table of about 3,500 addresses, the process does not run; it reports 0 matched, 0 tied, and 0 unmatched, with a calculated speed of N/A. I have been using this tool for years and have never had this issue before. Why is this suddenly happening?
Posted 07-29-2014 07:00 AM

POST
We compress weekly. We only have one user editing Land; I am not sure whether he chooses to favor the edit version or the target version when editing, but that should not matter, since there are no other editors. Yes, we are using the ArcFM extension for ArcGIS 10.1. The geodatabase is hosted on an enterprise server; I don't think the time and date are wrong. Thanks for your input.
Posted 03-07-2014 06:09 AM

POST
Hey everyone, thought I'd throw this out there, as everyone at our organization, including our vendors Schneider and IBM, has no clue. We went through an upgrade back in December from 9.1 to 10.1. Our three SDE databases are Land, Electric, and Gas, all running on the Oracle platform.

Since the upgrade we have had erroneous versions pop up on our Land instance. They are usually called LAND_ELECTRIC, which are normally created on Electric. Other times they are labelled SN_xxx (e.g. SN_145), indicating they were created with ArcFM Session Manager (again, only used in the Electric instance). They are almost always dated November 13, 2013, even to this day (this is the date we began a data freeze on database updates before the upgrade took place). Other times the versions have a current date (e.g. March 2014).

This morning, after a compress to state 0 last night by IBM, the default version on our Land instance had a date of November 13, 2013. There is no custom code running in the compress; it is done using core functionality through ArcCatalog. However, all of our updates since then are in place (meaning the database does not really reflect its state from November 2013). If anybody could give us a lead to pursue it would be very much appreciated. Thanks.
Posted 03-06-2014 06:28 AM

POST
Isn't the scratch.gdb designed to *always* be available and *guaranteed* to exist? If so, why attempt to delete it (if that's even possible)?

Edit: Also, the suggested "clean up" is not saying to delete the .gdb or the workspace. Rather, it is saying to delete the contents of the .gdb: http://resources.arcgis.com/en/help/main/10.1/index.html#//00570000006w000000

import arcpy
import os

inFC = arcpy.GetParameterAsText(0)
tempFC = arcpy.env.scratchGDB + os.path.sep + "tempFC"
arcpy.CopyFeatures_management(inFC, tempFC)

# Do some work here...

# Clean up when done...
arcpy.Delete_management(tempFC)

Why not use the "in_memory" workspace instead?

I had actually thought of using the in_memory workspace, but was not sure if you can use it with a map document. Part of my script uses a detailed layout saved in an .mxd in the same folder the script runs from, with the layers pointed into the scratchGDB for JPEG exports. I'll try just deleting the feature classes out of the scratch.gdb. I would feel more comfortable if the gdb were completely deleted, however, as "schema locks" seem to always come back to haunt me.
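As an aside on the path concatenation in the help snippet above, os.path.join inserts the platform separator for you and is equivalent to the manual scratchGDB + os.path.sep + "tempFC" construction (the workspace name below is hypothetical, used only for illustration):

```python
import os

# Build the same kind of scratch path without hand-inserting separators.
scratch_gdb = os.path.join("gis_data", "scratch.gdb")  # hypothetical workspace folder
temp_fc = os.path.join(scratch_gdb, "tempFC")

# temp_fc is "gis_data<sep>scratch.gdb<sep>tempFC" with the OS's separator.
```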
Posted 02-28-2014 05:05 AM

POST
Perhaps try:

arcpy.Delete_management(arcpy.env.scratchGDB)
arcpy.Delete_management(arcpy.env.scratchFolder)
arcpy.Delete_management(arcpy.env.scratchWorkspace)

This should delete the folders as defined in the user's environment variables. arcpy.env.scratchGDB and scratchFolder exist within the scratchWorkspace (or in the parent folder if the scratchWorkspace is a gdb). These methods also return a value, which should be 'true' (string) if the operation was successful, for additional checking if necessary.

Thanks for the suggestion. I added these lines, and they "work", but there is still a scratch.gdb folder left over, and now it has two files inside, both named "xxxx.sr.lock". Anybody from Esri care to comment? I feel like this should be so easy...
Posted 02-28-2014 03:14 AM

POST
Thanks for your reply. However, the problem is not a lock or getting things to work. The arcpy Delete function works: it deletes all features inside the gdb, and the gdb too. The problem is that there is a leftover folder called scratch where the file geodatabase scratch.gdb used to be. Inside that folder is another folder called "info", and inside "info" is arc.dir. This is causing the script to fail on the next pass, and since it runs daily via Task Scheduler, that is not acceptable. Why is the Esri Delete function not removing the scratch geodatabase? It would be better not to delete at all; then at least the scratch geodatabase and the features inside are overwritten. As it is, the arcpy Delete function only serves to corrupt the scratch geodatabase, rendering it useless, instead of actually deleting it.
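Since a file geodatabase is just a folder on disk, one possible workaround (a sketch, not Esri's documented approach; the function name and path are hypothetical) is to remove the leftover folder with the standard library after arcpy's Delete has run:

```python
import os
import shutil

def force_remove_gdb(gdb_path):
    """Remove a (possibly corrupted) file-geodatabase folder, including
    leftovers such as an 'info' subfolder containing arc.dir.
    Returns True if the folder is gone afterwards."""
    if os.path.isdir(gdb_path):
        shutil.rmtree(gdb_path, ignore_errors=True)
    return not os.path.exists(gdb_path)
```

Note this will still fail if another process (ArcCatalog, for example) holds an open handle on a file inside the folder, which would be consistent with the "Access is denied" error reported elsewhere in this thread.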
Posted 02-27-2014 10:08 AM

POST
I have a script that does a bit of spatial analysis, creates a relationship table, and exports layout images to JPEG for a report. It runs great, but I am having a problem doing proper cleanup. I delete the scratchGDB using arcpy's Delete, and it works fine; the problem is that a leftover scratch.gdb remains in the workspace directory. After running the Delete, the leftover scratch.gdb is not a true geodatabase; in ArcCatalog it appears as a folder with a plus sign to the left of it, and drilling down on the plus shows nothing inside. I also added a line to delete that folder in Python, and I get this: [Error 5] Access is denied: 'D:\\gis\\Data\\scratch.gdb' Sometimes the script works fine without deleting the scratch.gdb (it simply overwrites); other times it tells me it cannot create the initial output and fails. I would appreciate any help.
Posted 02-27-2014 05:22 AM

POST
I need to define my scratch workspace, but Python is not taking the input. Here is what I have:

# Setup working environment
path = "D:\\gis\\Data\\"
arcpy.env.scratchworkspace = path
print "Scratch workspace set to " + arcpy.env.scratchGDB + "\n"

This should set the scratchGDB to D:\gis\Data\scratch.gdb. However, it still sets it to C:\Users\tdmc0\AppData\Local\Temp\2\scratch.gdb. I cannot have the scratch GDB inside my temp user directory, as that directory changes daily and this is part of an automated process with layers in a map document pointing to layers inside my scratch GDB. Why can I not set my scratch workspace? I have followed all of the guidelines in the ArcGIS documentation. Thanks.
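One thing worth checking in the snippet above: the documented property name is arcpy.env.scratchWorkspace with a capital W. Whether arcpy honors the lowercase spelling depends on its implementation, but on a plain Python object an assignment with the wrong case silently creates a brand-new attribute and leaves the real setting untouched. An arcpy-free sketch of that pitfall (the Env class is illustrative only, not arcpy's implementation):

```python
class Env(object):
    """Stand-in for a settings object exposing a camelCase attribute."""
    def __init__(self):
        self.scratchWorkspace = None  # the "real" setting

env = Env()
env.scratchworkspace = "D:/gis/Data"  # wrong case: creates a NEW attribute

# The real setting is untouched; the assignment went somewhere else entirely.
assert env.scratchWorkspace is None
assert env.scratchworkspace == "D:/gis/Data"
```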
Posted 02-18-2014 07:08 AM

POST
The script takes about 45 minutes to complete. After the feature class gets stuck in "Load-Only Mode", it is unusable. I have tried replicating the issue in our development environment by launching the script from Task Scheduler there, but have been unsuccessful. I am thinking it is something unique to production, i.e. a conflict with another process, etc.
Posted 09-11-2013 11:48 AM

POST
I have developed a Python script that performs a variety of geoprocessing and data-loading tasks, including kicking off SQL statements through the command line. We have tested the script extensively in our development environment, and it runs successfully in production. However, when run through the Windows Task Scheduler, the target SDE feature class ends up stuck in "Load-Only Mode", even though the script reports success. All SQL statements run as a sub-process and report errors back to the Python environment, but messages that do not halt execution are not transferred. I'm at a loss as to why this would happen exclusively through automation and not when run manually. Any help, ideas, or comments would be appreciated.
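One way to surface the non-fatal messages mentioned above is to capture both output streams of each SQL sub-process and log them regardless of the exit code. A minimal standard-library sketch (the helper name is hypothetical; in the test, a Python one-liner stands in for the real SQL client):

```python
import subprocess

def run_and_capture(cmd):
    """Run a command, returning (exit_code, stdout, stderr) so that
    warnings that do not halt execution are still visible in logs."""
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        universal_newlines=True,  # decode bytes to str
    )
    out, err = proc.communicate()
    return proc.returncode, out, err
```

Logging `err` even when the exit code is 0 would reveal whether the SQL client is emitting warnings that currently vanish under Task Scheduler.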
Posted 09-11-2013 04:55 AM
| Title | Kudos | Posted |
|---|---|---|
| | 1 | 02-18-2014 07:08 AM |
| | 1 | 02-27-2014 05:22 AM |