<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Multiprocessing with ArcGIS Pro arcpy in Python Questions</title>
    <link>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1417898#M70509</link>
    <description>&lt;P&gt;The usual way is to ensure all sub-processes have their own workspace/gdbs. Your sub-processes can write their FCs to those&amp;nbsp;without any locking issues from other sub-processes. Then the main process can collect all the FCs and copy them to the final output workspace.&lt;/P&gt;</description>
    <pubDate>Thu, 02 May 2024 02:13:14 GMT</pubDate>
    <dc:creator>Luke_Pinner</dc:creator>
    <dc:date>2024-05-02T02:13:14Z</dc:date>
    <item>
      <title>Multiprocessing with ArcGIS Pro arcpy</title>
      <link>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1417627#M70501</link>
      <description>&lt;P&gt;I'm experimenting with using multiprocessing to perform some geoprocessing from a standalone Python script.&amp;nbsp; I have ArcGIS Pro 3.2.2 installed.&amp;nbsp; I'm calling this script in a command window with&amp;nbsp;"C:\Program Files\ArcGIS\Pro\bin\Python\Scripts\propy".&lt;/P&gt;&lt;P&gt;The script usually creates around half of the feature classes without issue, but the others fail randomly with the following errors:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;ERROR 000732: Feature Class Location: Dataset C:\Mapping\FileGDB/CIMS_CLU_Test.gdb/FeatDS does not exist or is not supported&lt;/LI&gt;&lt;LI&gt;ERROR 999999: Something unexpected caused the tool to fail. Contact Esri Technical Support&lt;/LI&gt;&lt;LI&gt;ERROR 160193: This release of the GeoDatabase is either invalid or out of date.&lt;/LI&gt;&lt;LI&gt;ERROR 160706: Cannot acquire a lock.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;I've tried decreasing and increasing the number of processes in the Pool declaration, to no avail.&amp;nbsp; Is it realistic to expect this kind of multiprocessing to work with file GDBs, or is there a trick I'm missing in the code that will help?&amp;nbsp; Thanks in advance for any suggestions.&lt;/P&gt;&lt;LI-CODE lang="python"&gt;import arcpy

from multiprocessing import Pool

path = r"C:\Mapping\FileGDB"
filename = "Test.gdb" 
 
# Set local variables
out_path = path + "/" + filename + "/FeatDS"

geometry_type = "POLYGON"
release = "04_2024"
has_m = "DISABLED"
has_z = "DISABLED"

def createFC(state):  
    try:
        arcpy.CreateFeatureclass_management(out_path, state + "_" + release, geometry_type, None, has_m, has_z)
        print(f"{ state } created")
    except Exception as e:
        print(f"{ state } error: { e }")    

def work():
    states = [ 'AL', 'AR', 'AZ', 'CA', 'CO', 'DC', 'FL', 'GA', 'IA', 'ID', 'IL', 'KS', 'KY', 'LA', 'MD', 'MI', 'MN', 'MO', 'MS', 'MT', 'NC', 'ND', 'NE', 'NM', 'NV', 'NY', 'OH', 'OK', 'PA', 'SC', 'SD', 'TN', 'TX', 'UT', 'VA', 'WA', 'WI', 'WY', 'AK', 'CT', 'DE', 'HI', 'ME', 'MA', 'NH', 'NJ', 'RI', 'VT', 'WV'  ]

    with Pool(processes=4) as pool:
        pool.map(createFC, states)

        pool.close()  # close() must be called before join(), or join() raises ValueError
        pool.join()

if __name__ == '__main__':
    work()

    print('end')&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 01 May 2024 16:21:37 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1417627#M70501</guid>
      <dc:creator>ckoenig77</dc:creator>
      <dc:date>2024-05-01T16:21:37Z</dc:date>
    </item>
    <item>
      <title>Re: Multiprocessing with ArcGIS Pro arcpy</title>
      <link>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1417653#M70503</link>
      <description>&lt;P&gt;With multiple processes accessing the same file geodatabase, especially to make edits, you will run into locking issues.&amp;nbsp; Might be worth reading through:&amp;nbsp; &lt;A href="https://support.esri.com/en-us/knowledge-base/how-are-the-various-lock-mechanisms-implemented-in-arcs-000006976" target="_blank"&gt;FAQ: How Are the Various Lock Mechanisms Implemented in ArcGIS Enterprise and the Geodatabase (esri.com)&lt;/A&gt;&lt;/P&gt;</description>
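File GDB locking is internal to Esri's driver, but the failure in the question (ERROR 160706: Cannot acquire a lock) behaves like an OS-level exclusive advisory lock: once one process holds it, a second process asking for it is refused. The following is only a rough Unix-only analogy using `fcntl.flock`, not an actual file GDB mechanism; the `Test.gdb.lock` name is made up for illustration.

```python
# Rough analogy for exclusive-lock refusal, using Unix advisory locks.
# This is NOT how a file GDB locks internally; it only illustrates why a
# second process can fail while the first holds an exclusive lock.
import fcntl
import os
import tempfile
from multiprocessing import Process, Queue

def try_lock(path, q):
    # A second process requests the same exclusive lock without blocking.
    with open(path, "w") as f:
        try:
            fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
            q.put("acquired")
        except OSError:
            q.put("cannot acquire a lock")  # cf. ERROR 160706

def demo():
    path = os.path.join(tempfile.mkdtemp(), "Test.gdb.lock")  # hypothetical name
    holder = open(path, "w")
    fcntl.flock(holder, fcntl.LOCK_EX)  # first process holds the exclusive lock
    q = Queue()
    p = Process(target=try_lock, args=(path, q))
    p.start()
    p.join()
    holder.close()
    return q.get()

if __name__ == "__main__":
    print(demo())
```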
      <pubDate>Wed, 01 May 2024 17:12:38 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1417653#M70503</guid>
      <dc:creator>JoshuaBixby</dc:creator>
      <dc:date>2024-05-01T17:12:38Z</dc:date>
    </item>
    <item>
      <title>Re: Multiprocessing with ArcGIS Pro arcpy</title>
      <link>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1417898#M70509</link>
      <description>&lt;P&gt;The usual way is to ensure all sub-processes have their own workspace/gdbs. Your sub-processes can write their FCs to those&amp;nbsp;without any locking issues from other sub-processes. Then the main process can collect all the FCs and copy them to the final output workspace.&lt;/P&gt;</description>
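The pattern described above can be sketched as follows. Since arcpy is only importable inside an ArcGIS install, plain per-worker directories stand in for the scratch file GDBs here; the equivalent arcpy calls are shown in comments, and the function names (`create_fc`, `run`) are illustrative, not from the original script.

```python
# Sketch: each worker writes into its own scratch workspace, then only the
# main process copies the results into the final output workspace.
import os
import shutil
import tempfile
from multiprocessing import Pool

RELEASE = "04_2024"  # naming scheme from the original script

def create_fc(args):
    state, scratch_root = args
    # One scratch workspace per task: no other process ever touches it, so
    # no other sub-process can hold a lock against it.
    # arcpy equivalent: arcpy.management.CreateFileGDB(scratch_root, f"scratch_{state}.gdb")
    scratch_gdb = os.path.join(scratch_root, f"scratch_{state}.gdb")
    os.makedirs(scratch_gdb, exist_ok=True)
    # arcpy equivalent: arcpy.management.CreateFeatureclass(scratch_gdb, f"{state}_{RELEASE}", "POLYGON")
    fc = os.path.join(scratch_gdb, f"{state}_{RELEASE}")
    open(fc, "w").close()  # stand-in for the created feature class
    return fc

def run(states, scratch_root, final_gdb):
    os.makedirs(final_gdb, exist_ok=True)
    with Pool(processes=4) as pool:
        created = pool.map(create_fc, [(s, scratch_root) for s in states])
    # Only the main process writes to the final workspace, one FC at a time.
    # arcpy equivalent: arcpy.management.Copy(fc, os.path.join(final_gdb, basename))
    for fc in created:
        shutil.copy(fc, os.path.join(final_gdb, os.path.basename(fc)))
    return sorted(os.listdir(final_gdb))

if __name__ == "__main__":
    root = tempfile.mkdtemp()
    print(run(["AL", "AK", "AZ"], root, os.path.join(root, "final.gdb")))
```

Because `pool.map` runs the workers in isolated workspaces, the serial copy loop at the end is the only code that touches the shared output, which is exactly what avoids the lock contention in the original script.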
      <pubDate>Thu, 02 May 2024 02:13:14 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1417898#M70509</guid>
      <dc:creator>Luke_Pinner</dc:creator>
      <dc:date>2024-05-02T02:13:14Z</dc:date>
    </item>
    <item>
      <title>Re: Multiprocessing with ArcGIS Pro arcpy</title>
      <link>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1418222#M70513</link>
      <description>&lt;P&gt;This makes sense, thank you for your suggestion!&lt;/P&gt;</description>
      <pubDate>Thu, 02 May 2024 16:53:43 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/multiprocessing-with-arcgis-pro-arcpy/m-p/1418222#M70513</guid>
      <dc:creator>ckoenig77</dc:creator>
      <dc:date>2024-05-02T16:53:43Z</dc:date>
    </item>
  </channel>
</rss>