Arcpy Delete Field Management General Function Failure

04-04-2018 02:54 AM
WengNg1
New Contributor III

Hi All,

I've been trawling the forums for an answer to a general function failure in one of my arcpy scripts.

I've got a script that runs a Select By Attribute query and exports the selection as a new feature class. It then does some field mapping from a list, deletes unwanted fields from the feature class, and runs a few Calculate Field operations.
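For context, the delete-unwanted-fields step boils down to computing which fields are not on a keep list. This is a minimal sketch of that logic (field names and the keep list are hypothetical; in the real script the result would be passed to arcpy.management.DeleteField, which is omitted here so the snippet runs without arcpy):

```python
def fields_to_drop(field_names, keep):
    """Return the fields not on the keep list, preserving order.

    Comparison is case-insensitive, since geodatabase field names
    are not case-sensitive. In the actual script the result would
    feed arcpy.management.DeleteField(fc, drop_list).
    """
    keep_lower = {k.lower() for k in keep}
    return [f for f in field_names if f.lower() not in keep_lower]

# Hypothetical field list; required fields stay on the keep list
# because DeleteField cannot remove them anyway.
fields = ["OBJECTID", "Shape", "SiteName", "TempField1", "TempField2"]
keep = ["OBJECTID", "Shape", "SiteName"]
print(fields_to_drop(fields, keep))  # ['TempField1', 'TempField2']
```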

The script works fine as is, but it starts breaking when it encounters a large dataset, complaining about a General Function Failure. I tried debugging and found that the script crashes at the Delete Field (management) stage. If I reduce the amount of data queried, it works fine, which is bizarre.

Since it works with a smaller dataset, it doesn't look like a file geodatabase locking issue.
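For anyone who wants to rule out locking anyway: a quick, non-arcpy way to check is to look for *.lock files inside the .gdb folder itself. A small sketch (the geodatabase path is hypothetical):

```python
from pathlib import Path

def gdb_locks(gdb_path):
    """List any *.lock files inside a file geodatabase folder.

    A file geodatabase records schema and shared locks as .lock
    files in its folder; an empty result suggests no process
    currently holds a lock on it.
    """
    return sorted(p.name for p in Path(gdb_path).glob("*.lock"))

# e.g. gdb_locks(r"C:\data\project.gdb")
```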

Are there any known bugs with the arcpy Delete Field (management) tool that cause this crash?

7 Replies
DanPatterson_Retired
MVP Emeritus

It could also be a resource issue rather than a file lock. Can you delete the fields using the toolbox interface instead of the script?

WengNg1
New Contributor III

Dan, you are quite right. I tried deleting with the toolbox and it returned the same error. It also corrupts the feature class I'm trying to delete fields from, which also happened when I was running the script.

It sounds like a bug to me. As it's a general function failure, I'm not even sure if it captures any logs.

DanPatterson_Retired
MVP Emeritus

At this point, I presume you have tried resetting your profile. 

If so, you could try a 'repair' in control panel 'add/remove programs'. 

If that doesn't work, then reinstall might be needed. 

Lastly... off to tech support with details and data may be your last option.

WengNg1
New Contributor III

Yeah, I did, and I also tried reinstalling ArcGIS Desktop.

I even converted the script to Python 3 and ran it with ArcGIS Pro, and it still complains about a general function failure, although with better error messages (it actually tells me which field it stopped at).

It's very temperamental: I ran it once with a small query and it fell over, then reran the script with the same query and it worked.
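Since a plain rerun sometimes succeeds, one stopgap (a sketch, not part of the original script) is to wrap the flaky tool call in a small retry loop:

```python
import time

def retry(func, attempts=3, delay=1.0, exceptions=(Exception,)):
    """Call func(); on failure, wait `delay` seconds and try again,
    up to `attempts` times, re-raising the last error.

    In the real script, func might be something like
        lambda: arcpy.management.DeleteField(fc, drop_list)
    and `exceptions` would be (arcpy.ExecuteError,).
    Those arcpy names are illustrative, not tested here.
    """
    for n in range(1, attempts + 1):
        try:
            return func()
        except exceptions:
            if n == attempts:
                raise
            time.sleep(delay)
```

This only papers over the intermittency, of course; it doesn't explain it.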

DanPatterson_Retired
MVP Emeritus

hmmmm. beginning to sound like a data issue or a script issue.

I doubt you're relying on anything that changed from 2 to 3 (did you run 2to3.py when you upgraded?) that would impact your script.

Is there anything special about your data?  a different codepage/regionality?  Does the code work on other data?

If the script isn't really long, then posting it here might twig some discussion (just in case: Code Formatting... the basics++).

WengNg1
New Contributor III

Quick update:

I've contacted ESRI support about this issue and consulted IT within my organisation. We were able to narrow it down to a new McAfee deployment from IT.

This issue was successfully replicated on multiple machines within the organisation and is independent of the script's code. McAfee will randomly corrupt the feature class if you try deleting multiple fields manually in a large dataset. We suspect it might have something to do with a memory scan of some sort, but obviously that is very difficult to log.

The only thing that stopped McAfee from blocking it was to turn off On-Access Scanning and Script Scan. However, due to company policy, that cannot be the solution. We've also tried excluding all possible ESRI-related folders, but still no luck.

This is also independent of the ArcMap version; I tried it in ArcGIS Pro and it came up with the same issue.

Although not fully tested, I've encountered this issue with other tools as well when working with large datasets, so it is not a failure of any one specific tool.

DanPatterson_Retired
MVP Emeritus

Weng, good to know. I use McAfee too, but mine is set up for home use, I suppose, so I haven't encountered the issues you have. Good luck.
