
Process CSV to Feature Class every 30 minutes

09-18-2023 08:26 AM
kapalczynski
Frequent Contributor

I have a csv file that I can read from a specific URL.

https://stadlspoceastus.blob.core.windows.net/curated/CrashAnalysisData/v2/real_time_prediction........ 

The data in this link refreshes every 30 minutes or so. I am trying to find the best workflow to display this data in AGOL.

Note that I have to process this data a bit before displaying it. The csv file has XY, MeasureFrom, and MeasureTo fields.

I have a route layer that I am using with arcpy.lr.MakeRouteEventLayer. I take the MeasureFrom and MeasureTo in the csv file to create a line feature, and it is this line feature that I am trying to show in AGOL.

My initial thought was to

Step 1

  1. Take the csv file and run the GP tool to create this Feature Class in SDE
  2. Spin this FC up to a service and reference it in AGOL

Step 2

  1. Create a python file that will read the .csv file every 30 minutes or so and create a temp Feature Layer in Python using MakeRouteEventLayer
  2. Then use the new route event layer to truncate and append the data in SDE, thus updating the data in AGOL

Does that make sense? Any other ways to do this? I'm thinking local, as the processing every 30 minutes would eat credits...

Is there a way to add this .csv file to AGOL and then create a feature layer from it? But then how would I automate the truncate and append in AGOL?
9 Replies
clt_cabq
Frequent Contributor

I've run processes like this that update on shorter frequencies - up to every 5 minutes - so this whole process could be run from a python file that executes on a server as a Windows scheduled task. There's a lot going on here, though. The file you linked only has 80 records in it - is that a typical size, or are you handling more records than this? If it's a lot more, you might find there is enough overhead to make a 30-minute update frequency challenging. I think the best approach would be to create a permanent feature class that is served through ArcGIS Enterprise and then registered in AGOL, so it updates whenever your server-based feature class is updated. The less you mess around with recreating your feature layer or downloading and storing it in AGOL the better, because things like rendering and popups you might apply won't necessarily persist, and AGOL may not support the update cycle you are interested in.
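To illustrate the scheduled-task idea, here is a minimal Python scheduler sketch. The `refresh()` function is a hypothetical placeholder for the real CSV-to-feature-class work; a Windows scheduled task invoking the script every 30 minutes is the simpler option in practice.

```python
import time

INTERVAL_SECONDS = 30 * 60  # 30-minute cycle

def refresh():
    # Placeholder for the real work: read the CSV, rebuild the route
    # event layer, and update the feature class.
    print("refreshing...")

def seconds_until_next(elapsed, interval=INTERVAL_SECONDS):
    # If a cycle ran long, start the next one immediately rather than
    # sleeping for a negative amount of time.
    return max(0.0, interval - elapsed)

def run_forever():
    while True:
        start = time.monotonic()
        try:
            refresh()
        except Exception as exc:
            # Log and carry on so one bad cycle doesn't stop the service.
            print(f"refresh failed: {exc}")
        time.sleep(seconds_until_next(time.monotonic() - start))
```

Catching and logging the exception matters here: with a raw scheduled loop, one transient download failure would otherwise kill all future updates.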

ghaskett_aaic
Emerging Contributor

This is totally doable. 

You can either wipe the old file clean and replace it or append new data to it.

I had to do something similar a few years back; however, all those files are now on an external hard drive somewhere. I will see if I can find them.

Below is a link to a solution by Henry Lindemann that might point you in the right direction. Look for his reply in the thread:

https://community.esri.com/t5/arcgis-enterprise-portal-questions/append-data-to-hosted-feature-layer...

kapalczynski
Frequent Contributor

As you can see below, this csv file being grabbed from the internet is formatted oddly... I converted it to JSON to view it better. It lists field 1 and then all of its values, then field 2 and all of its values...

How do I write this to a Feature Layer in Python? I'm not sure how to parse this format to create a feature from the XY fields. Do I need to read it differently so that it reads like a table with rows?

[screenshot: JSON view of the CSV, with each field listed followed by all of its values]

 

BobBooth1
Esri Contributor

I'd suggest reading it into a Pandas dataframe. 

You could use plain Pandas, or an ArcGIS Spatially Enabled DataFrame.

https://developers.arcgis.com/python/guide/introduction-to-the-spatially-enabled-dataframe/

Pandas has lots of good tools for working with tables.

https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.transpose.html
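For the column-oriented structure described above (each field name followed by all of its values), the reshaping can also be sketched in plain Python before any pandas or arcpy work. The field names and values here are illustrative, borrowed from the fields mentioned elsewhere in the thread:

```python
# Hypothetical column-oriented structure like the JSON described above:
columns = {
    "RTE_NM": ["I95", "I95"],
    "MeasFrom": [10.2, 14.8],
    "MeasTo": [11.0, 15.3],
}

def columns_to_rows(cols):
    """Turn {field: [values...]} into a list of per-record dicts (rows)."""
    names = list(cols)
    # zip(*...) walks the value lists in parallel, one record at a time.
    return [dict(zip(names, values)) for values in zip(*cols.values())]

rows = columns_to_rows(columns)
# rows[0] is {"RTE_NM": "I95", "MeasFrom": 10.2, "MeasTo": 11.0}
```

This is the same reshaping that `DataFrame.transpose` performs when the frame has been read with fields as the index.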

 

kapalczynski
Frequent Contributor

I think I got it with this... unless anyone sees a better way?

  1. Read Online CSV
  2. Convert to CSV
  3. MakeRouteEventLayer
  4. Copy to Feature Class in GDB 
import arcpy
import pandas as pd
import os

arcpy.env.overwriteOutput = True

df = pd.read_csv("https://stadlspoceastus.blob.core.windows.net/curated/CrashAnalysisData/v2/real_time_prediction/crashprediction_latest.csv?sp=r&st=2023-06-13T17:50:36Z&httpsDmm4eAiY%3D")
print(df)
# save the dataframe to a local csv for MakeRouteEventLayer
# (note: to_csv also writes the index as an extra column unless index=False is passed)
df.to_csv(r'C:\Users\Desktop\Crash_Prediction\Crash_Prediction.csv')

# Set workspace
arcpy.env.workspace = r"C:\Users\Desktop\Crash_Prediction\CrashPrediction.gdb"
roads = "I95" # the road feature in the Geodatabase
csvtoProcess = r'C:\Users\Desktop\Crash_Prediction\Crash_Prediction.csv'

eventLayerCreated = arcpy.lr.MakeRouteEventLayer(roads, "RTE_NM", csvtoProcess, "RTE_NM; Line; MeasFrom; MeasTo", "crashprediction_latest (1) Events",
                             None, "NO_ERROR_FIELD", "NO_ANGLE_FIELD", "NORMAL", "ANGLE", "LEFT", "POINT")

# Make a layer from the feature class
arcpy.MakeFeatureLayer_management(eventLayerCreated,"CrashPrediction_lyr")

# Write the selected features to a new featureclass
arcpy.CopyFeatures_management("CrashPrediction_lyr", r"C:\Users\Desktop\Crash_Prediction\CrashPrediction.gdb\CrashPredictionOutput")

 

kapalczynski
Frequent Contributor

I can do the above and it works... but I am wondering if anyone knows how I could read in the csv without having to write it out to a CSV file for MakeRouteEventLayer. Something I can keep in memory for a short period of time...
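One option, as far as I know, is that the linear referencing tools still want a table they can open by path, so the download can at least be kept off the permanent file system by fetching into memory and spilling to a short-lived temp file. This is a sketch; `url`, `roads`, and the downstream call mirror the script above, and the function names are my own:

```python
import tempfile
import urllib.request

def fetch_csv_text(url):
    # Pull the CSV straight into memory instead of saving it first.
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def as_temp_csv(text):
    # MakeRouteEventLayer expects a table path, so spill the in-memory
    # text to a short-lived temp file and hand that path along.
    tmp = tempfile.NamedTemporaryFile(mode="w", suffix=".csv",
                                      delete=False, newline="")
    with tmp:
        tmp.write(text)
    return tmp.name

# Usage sketch:
# csv_path = as_temp_csv(fetch_csv_text(url))
# arcpy.lr.MakeRouteEventLayer(roads, "RTE_NM", csv_path, ...)
```

The temp file can be deleted after the event layer has been copied or appended; alternatively, writing the intermediate feature class to the `memory` workspace avoids the on-disk geodatabase step.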

kapalczynski
Frequent Contributor

I spun this up to a service, and now when I run it I get an error... Does the copy in the script have to be changed to something else, like a delete and insert cursor?

 

ERROR 000464: Cannot get exclusive schema lock. Either being edited or in use by another application or service.

kapalczynski
Frequent Contributor
(Accepted Solution)

I did a truncate and append and it works fine now.

import arcpy
import pandas as pd
import os

arcpy.env.overwriteOutput = True

df = pd.read_csv("https://stadlspoceastus.blob.core.windows.net/curated/CrashAnalysisData/v2/real_time_prediction/crashprediction_latest.csv?sp=r&st=2023-06-13T17:50:36Z&httpsDmm4eAiY%3D")
print(df)
# save the dataframe to a local csv for MakeRouteEventLayer
# (note: to_csv also writes the index as an extra column unless index=False is passed)
df.to_csv(r'C:\Users\Desktop\Crash_Prediction\Crash_Prediction.csv')

# Set workspace
arcpy.env.workspace = r"C:\Users\Desktop\Crash_Prediction\CrashPrediction.gdb"
roads = "I95" # the road feature in the Geodatabase
csvtoProcess = r'C:\Users\Desktop\Crash_Prediction\Crash_Prediction.csv'

eventLayerCreated = arcpy.lr.MakeRouteEventLayer(roads, "RTE_NM", csvtoProcess, "RTE_NM; Line; MeasFrom; MeasTo", "crashprediction_latest (1) Events",
                             None, "NO_ERROR_FIELD", "NO_ANGLE_FIELD", "NORMAL", "ANGLE", "LEFT", "POINT")

# Make a layer from the feature class
arcpy.MakeFeatureLayer_management(eventLayerCreated,"CrashPrediction_lyr")

target = r"Database Connections\data.sde\GIS_DATA.SDE_VDOT_CrashPredictionOutput"

arcpy.TruncateTable_management(target)

arcpy.Append_management(eventLayerCreated, target, "NO_TEST")
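One caution about the truncate-then-append pattern above: if the download or MakeRouteEventLayer step fails after the truncate, the SDE table is left empty until the next cycle. A small guard is to build the new event layer first and only truncate once it exists. This is a sketch of that ordering with the work passed in as callables; the lambdas in the usage note stand in for the arcpy calls from the script above:

```python
def safe_refresh(build_layer, truncate, append):
    # Build the new event layer *before* truncating, so a failed download
    # or a bad MakeRouteEventLayer run can't leave the target table empty.
    layer = build_layer()   # may raise; target table is still untouched
    truncate()
    append(layer)

# Usage sketch with the arcpy calls from the script above:
# target = r"Database Connections\data.sde\GIS_DATA.SDE_VDOT_CrashPredictionOutput"
# safe_refresh(
#     build_layer=lambda: arcpy.lr.MakeRouteEventLayer(...),
#     truncate=lambda: arcpy.TruncateTable_management(target),
#     append=lambda lyr: arcpy.Append_management(lyr, target, "NO_TEST"),
# )
```

The script above already has this ordering; the point of the wrapper is that any exception raised while building the layer aborts the run before the table is touched.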