I have some Python geoprocessing scripts I would like to share with several people. A problem is that each person would use a different path for where they read/write data. I could give the same scripts to each person and only customize the scripts with whatever paths each person is using, but that is cumbersome.

I was wondering if I could make a configuration file for these Python scripts, so the customizations would only be in this one config file. Back in the .AML days, I could make an .aml called, say, SetVars.aml that would contain all the variables, and I could make a geoprocessing .aml read SetVars.aml. That way I could install my geoprocessing .amls on different computers and I would only have to customize the SetVars.aml rather than having to customize every script. Or if I had to change some paths, these paths would be defined as variables in one nice, neat file rather than scattered among several scripts.

I want to do something similar to this in the ArcGIS Python world. I don't want these variables to be parameters the user would have to choose - I would just like to have certain variables in a convenient script that the other Python scripts could read, if that is possible. If anyone could point me to an example of this I would appreciate it.
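The SetVars.aml pattern described above maps almost one-to-one onto a plain Python module that every other script imports. A minimal sketch (the file name `setvars.py` and all the variable names and paths below are invented for illustration, not taken from any real setup):

```python
# setvars.py -- a hypothetical shared settings module.
# Each person edits only this one file; the paths are made-up examples.
data_root = 'X:\\GIS\\Data'            # where input data is read from
output_root = 'C:\\Temp\\GIS\\output'  # where results are written
overwrite_existing = True              # whether scripts may clobber outputs

# Any geoprocessing script on the same computer would then simply do:
#   import setvars
#   arcpy.env.workspace = setvars.data_root
#   arcpy.env.overwriteOutput = setvars.overwrite_existing
```

Because it is an ordinary module, Python only needs `setvars.py` to sit somewhere on the import path (e.g. in the same folder as the scripts).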
```python
'''defaultData.py
'''

def storesData(name):
    '''returns output file path for the queried person
    '''
    if name == 'Catherine':
        return 'D:\\Catherine\\GIS_output\\'
    elif name == 'James':
        return 'C:\\Temp\\GIS\\output\\'
    else:
        return 'C:\\Temporary\\'

def programSettings():
    '''returns default settings in a python list:
    e.g. [arg1, arg2, overwriteFiles, dataPath, outputTxtPath]
    '''
    return [0, 1, True, 'X:\\GIS\\Data', 'output.txt']
```
```python
'''genericOperationScript.py
Loads default file paths for the user and default settings, and does some
geoprocessing.
'''
import arcpy
import defaultData  # imports the python file above so you can call it

# get defaults - this unpacking looks complicated, but is handy for getting
# all the variables out in one place...
arg1, arg2, overwriteFiles, dataPath, outputTxtPath = defaultData.programSettings()
arcpy.AddMessage('Program defaults loaded...')
arcpy.AddMessage('  Using data path: ' + dataPath)

# get the path for this user...
outputPath = defaultData.storesData('Catherine')
arcpy.AddMessage('  Using output path: ' + outputPath)

if overwriteFiles:
    arcpy.AddMessage('  Existing files will be overwritten...')
else:
    arcpy.AddMessage('  Existing files will not be overwritten...')

# do geoprocessing...
```
Jacob,
tlsilveus' method will work just fine, but you can also store information directly in Python modules, use an import statement to load them, and request the data (optionally with an input). The main advantages are:
- things can be stored directly in Python structures (such as dictionaries or lists)
- things can be more dynamic (i.e. take an input, and vary the output depending on the input), and
- it can be incorporated within one of your scripts - so there would be no extra files (this gets a little more complicated, though...).
If you would rather keep the settings in a text file that non-programmers can edit, the ConfigObj library reads an INI-style file like this:

```
# This is the 'initial_comment'
# Which may be several lines
keyword1 = value1
'keyword 2' = 'value 2'

[ "section 1" ]
# This comment goes with keyword 3
keyword 3 = value 3
'keyword 4' = value4, value 5, 'value 6'

    [[ sub-section ]]    # an inline comment
    # sub-section is inside "section 1"
    'keyword 5' = 'value 7'
    'keyword 6' = '''A multiline value,
that spans more than one line :-)
The line breaks are included in the value.'''

        [[[ sub-sub-section ]]]
        # sub-sub-section is *in* 'sub-section'
        # which is in 'section 1'
        'keyword 7' = 'value 8'

[section 2]    # an inline comment
keyword8 = "value 9"
keyword9 = value10    # an inline comment
# The 'final_comment'
# Which also may be several lines
```
```python
from configobj import ConfigObj

config = ConfigObj(filename)
#
value1 = config['keyword1']
value2 = config['keyword2']
#
section1 = config['section1']
value3 = section1['keyword3']
value4 = section1['keyword4']
#
# you could also write
value3 = config['section1']['keyword3']
value4 = config['section1']['keyword4']
```
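If adding a third-party dependency is a concern, Python's standard-library `configparser` module reads a similar (flat, INI-style) file, though it does not support ConfigObj's nested sub-sections. A rough sketch of the same read pattern, with a made-up section and keys:

```python
import configparser

# An inline sample config; in practice this would live in a .ini file
# that config.read('settings.ini') loads. Section/key names are invented.
sample = """
[paths]
data = X:\\GIS\\Data
output = C:\\Temp\\GIS\\output
"""

config = configparser.ConfigParser()
config.read_string(sample)

data_path = config['paths']['data']
output_path = config['paths']['output']
```

Both approaches keep the per-machine customization in one editable file, which is exactly the SetVars.aml idea.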