I found an article here: How To: Delete geoprocessing history from a geodatabase
The only explanation the article gives is "GIS administrators and managers sometimes need to delete the geoprocessing history from a feature class or workspace metadata."
Which is not very helpful.
However, since this metadata is not large, what is the point of removing it? It is obviously there for a reason, so can someone tell me why it would be useful to do this, and what negative impacts doing so could have?
The main reason, in my experience, is performance: I have seen overall performance degrade significantly over time because of accumulated geoprocessing history.
Take a look and see if these links help:
I agree performance is the main reason.
Another reason is that, while very few people look at it, there is additional information about your GIS system stored in it that you might not want to make public. For example, it may refer to a sensitive layer that was used in a Clip tool. While the history doesn't provide access to that layer, it does advertise the layer's existence. In addition, full paths are often stored, which gives people an idea of your network infrastructure. Most of the time the history is garbage to start with, but someone looking for information on your private GIS system could get something out of it. Removing the history lessens this risk, though it is really only a concern in highly sensitive GIS systems.
Also, is there a way to log the amount of data being deleted, storage-wise, or the number of entries being deleted? I was just using the arcpy script in the link I provided.
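The arcpy script itself doesn't report what it removed, but since geoprocessing history lives in the item's metadata XML, one way to log it is to count and measure the history entries yourself before stripping them. Below is a minimal sketch using only the standard library; it assumes the history is stored as `<Process>` elements under a `<lineage>` element (the layout the Esri "remove geoprocessing history" transforms target), and the sample XML fragment is hypothetical. You would export the real metadata to XML first, then run something like this:

```python
# Sketch: count and measure geoprocessing-history entries in Esri-style
# metadata XML before deleting them. Assumes history entries are <Process>
# elements under <lineage>; adjust the element names for your metadata format.
import xml.etree.ElementTree as ET

def strip_gp_history(metadata_xml: str):
    """Remove <Process> lineage entries; return (clean_xml, count, bytes_removed)."""
    root = ET.fromstring(metadata_xml)
    removed = 0
    bytes_removed = 0
    for lineage in root.iter("lineage"):
        for proc in list(lineage.findall("Process")):
            # Measure the serialized size of each entry before removing it.
            bytes_removed += len(ET.tostring(proc))
            lineage.remove(proc)
            removed += 1
    return ET.tostring(root, encoding="unicode"), removed, bytes_removed

# Hypothetical metadata fragment with two history entries:
xml = (
    "<metadata><Esri><DataProperties><lineage>"
    "<Process Name='Clip'>clip step details</Process>"
    "<Process Name='Append'>append step details</Process>"
    "</lineage></DataProperties></Esri></metadata>"
)
clean, count, nbytes = strip_gp_history(xml)
print(f"removed {count} history entries ({nbytes} bytes)")
```

Logging `count` and `nbytes` per feature class as you loop over a workspace would give you the running totals you're after; the byte figure is approximate, since the geodatabase stores the metadata in its own internal format.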
In addition to Asrujit's and Kevin's comments, you can limit how long geoprocessing results are kept. In ArcMap, go to the Geoprocessing menu, click Geoprocessing Options, and under Results Management change the "Keep results younger than:" parameter to a choice of days, weeks, months, never save, or never delete.
Another reason to delete geoprocessing history is that it can become large and unwieldy after many runs spent exploring different options, to the point where it no longer truly reflects the data. For example, a project may require an iterative approach, such as building a point layer of "features of interest" for a city: many rounds of Append bring in initial candidates, and many of those points are ultimately Deleted. Each time Append is run, the geoprocessing history automatically gains another entry (unless you turn that setting off). The resulting metadata can get quite large while not being very helpful for a reviewer to read.
Chris Donohue, GISP
Yes! This is the main reason I do it. No one wants to look through hundreds of often minor processing steps. It's unwieldy and makes it difficult for the reader to digest. I'd rather summarize the important processing steps in a supplemental section. In this case, more does not necessarily mean better.