Can arcpy accept input as a file-like object?

05-10-2023 07:37 AM
Jay_Gregory
Regular Contributor

I'm curious whether arcpy tools that take a CSV or text file as input can accept a file-like object instead, so I don't have to write to disk before invoking them.

For example, if I have a dataframe df:

 

import pandas as pd
from io import StringIO
import arcpy

# df is an existing DataFrame with x, y, d, and b columns
s_buff = StringIO()
df.to_csv(s_buff)
s_buff.seek(0)  # rewind so the buffer reads from the start
arcpy.management.BearingDistanceToLine(s_buff, r"memory\output", "x", "y", "d", bearing_field="b")

 

This generates an error: "Error in executing tool." Passing s_buff.read() instead, or rewinding with s_buff.seek(0) first, also fails. I know I could write to a temporary file, but I'd rather not.
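For reference, a minimal sketch of the temporary-file workaround mentioned above (the column names and values here are hypothetical, chosen to match the BearingDistanceToLine call in the question):

```python
import os
import tempfile

import pandas as pd

# Hypothetical frame with the columns the tool call above expects.
df = pd.DataFrame({"x": [10.0], "y": [20.0], "d": [100.0], "b": [45.0]})

# Geoprocessing tools validate in_table as a catalog path, so the CSV has
# to land on disk; delete=False keeps the file alive for the tool to read.
tmp = tempfile.NamedTemporaryFile(mode="w", suffix=".csv", delete=False, newline="")
df.to_csv(tmp, index=False)
tmp.close()

header = open(tmp.name).readline().strip()  # "x,y,d,b"

# In an ArcGIS Python environment the call would then be:
# arcpy.management.BearingDistanceToLine(tmp.name, r"memory\output",
#                                        "x", "y", "d", bearing_field="b")

os.remove(tmp.name)  # clean up once the tool has consumed the file
```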

1 Solution

Accepted Solutions
DannyMcVey
Esri Contributor

What about something like this?

import numpy as np
import pandas as pd
import arcgis  # registers the DataFrame.spatial (GeoAccessor) namespace
import arcpy

data = np.random.randint(5, 35, size=11)
df = pd.DataFrame(data, columns=['random_numbers'])

# write the plain DataFrame straight to the memory workspace
df.spatial.to_table(r'memory/processed_df')

arcpy.GetCount_management(r'memory/processed_df')

 


4 Replies
KenBuja
MVP Esteemed Contributor

According to the documentation, you can use the memory workspace in geoprocessing tools:

Use of memory-based workspaces in Python is only valid for geoprocessing tools. Memory is not a general-purpose virtual directory where you can write files or other data.

Jay_Gregory
Regular Contributor

Indeed, but I'm just working with a vanilla pandas DataFrame. It seems the memory workspace is limited to the outputs of arcpy operations.

Jay_Gregory
Regular Contributor

That works, thanks! I just have to deal with the overhead of importing the Python API to access the GeoAccessor 🙂
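If the arcgis import overhead is a concern, a lighter sketch (assuming a plain, non-spatial table) converts the frame to a structured NumPy array and hands it to arcpy.da.NumPyArrayToTable, which should also be able to write to the memory workspace:

```python
import numpy as np
import pandas as pd

# Same hypothetical table as the accepted answer.
data = np.random.randint(5, 35, size=11)
df = pd.DataFrame(data, columns=["random_numbers"])

# A structured NumPy array is enough for arcpy.da; no arcgis import needed.
arr = df.to_records(index=False)

# In an ArcGIS Python environment:
# import arcpy
# arcpy.da.NumPyArrayToTable(arr, r"memory/processed_df")
# arcpy.GetCount_management(r"memory/processed_df")
```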