I have a csv file in a folder location that gets updated every hour.
I need to take this info and update a Feature Class that is being published to a feature service.
Looking for the smartest and most effective way to do this.
A stored procedure and then to the FC? Any ideas would be appreciated.
Hi
An ETL approach can operate directly on both the CSV file and the feature service.
You could also script it entirely, in a few ways. Is the geometry in the data, or are you geocoding?
I have an XY field in the csv file
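With X/Y columns in the CSV, plain Python can parse the coordinates directly before anything touches the geodatabase. A minimal sketch, assuming a header row and hypothetical column names (`X`, `Y` here; adjust to the real file):

```python
import csv

# Hypothetical column names -- adjust to match the real CSV header.
X_FIELD, Y_FIELD = "X", "Y"

def read_csv_rows(path):
    """Read the hourly CSV and yield (x, y, attributes) tuples.

    Assumes the file has a header row and numeric X/Y columns.
    """
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            x = float(row[X_FIELD])
            y = float(row[Y_FIELD])
            # Everything that isn't a coordinate becomes an attribute dict.
            attrs = {k: v for k, v in row.items() if k not in (X_FIELD, Y_FIELD)}
            yield x, y, attrs
```

Each yielded tuple maps cleanly onto an arcpy insert-cursor row or a feature service edit later on.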
I can run a stored procedure and read the data....I think I then need to read the stored procedure's output via Python to create the feature class? Using an insert cursor?
ALTER PROCEDURE [dbo].[addData]
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    CREATE TABLE #ImportData
    (
        CFS varchar(MAX),
        Code varchar(MAX)
    )

    -- Bulk-load the hourly CSV, skipping the header row
    BULK INSERT #ImportData FROM 'C:\Scripts\Service\DailyCalls.csv' WITH
    (
        FIELDTERMINATOR = ',',
        FIRSTROW = 2
    )

    SELECT * FROM #ImportData

    DROP TABLE #ImportData
END
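Reading that result set from Python and writing it into a feature class with an insert cursor could look roughly like the sketch below. This is an assumption-heavy outline: the connection string, field list, and the X/Y column positions are placeholders, and it assumes the temp table is extended to also carry the CSV's X/Y columns (the procedure above currently returns only CFS and Code).

```python
def sp_row_to_cursor_row(record, x_index, y_index):
    """Convert one stored-procedure record into an insert-cursor row:
    the SHAPE@XY tuple first, then the remaining attribute values."""
    x = float(record[x_index])
    y = float(record[y_index])
    attrs = [v for i, v in enumerate(record) if i not in (x_index, y_index)]
    return [(x, y)] + attrs

def load_sp_into_fc(conn_str, fc, fields):
    """Execute dbo.addData and write its result set into a feature class.

    conn_str, fc, and fields are placeholders for your environment.
    """
    import pyodbc  # deferred imports so the helper above stays importable
    import arcpy

    with pyodbc.connect(conn_str) as conn, \
         arcpy.da.InsertCursor(fc, ["SHAPE@XY"] + fields) as cursor:
        for record in conn.cursor().execute("EXEC dbo.addData"):
            # Assumes X and Y were added as the 3rd and 4th result columns.
            cursor.insertRow(sp_row_to_cursor_row(record, x_index=2, y_index=3))
```

Alternatively, you could skip the stored procedure entirely and read the CSV straight from Python, since the insert cursor doesn't care where the rows came from.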
This topic describes using the ArcGIS API:
Alright, I simply created a model:
Event Layer, then to geodatabase. I'll set this Python script to run on a scheduled task.
What I've done in a similar situation is to turn the CSV into an XY event layer ( Make XY Event Layer—Data Management toolbox | Documentation ) and then create a feature class from that, all in arcpy and all in memory. Then truncate the target feature class and append the in-memory feature class to it.
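The steps above can be sketched as a single arcpy function. Paths, field names, and the spatial reference are assumptions for your environment (WGS 84 is guessed for the X/Y values):

```python
def run_refresh(csv_path, target_fc, x_field="X", y_field="Y",
                staging_name="csv_points"):
    """Rebuild target_fc from the hourly CSV, entirely in memory.

    Field names and paths are placeholders for your environment.
    """
    import arcpy  # deferred import so this module loads without ArcGIS

    staging = f"in_memory/{staging_name}"
    # 1. CSV -> XY event layer (no data copied yet, just a pointer)
    layer = arcpy.management.MakeXYEventLayer(
        csv_path, x_field, y_field, "csv_events",
        arcpy.SpatialReference(4326))  # assuming lon/lat coordinates
    # 2. Event layer -> in-memory feature class
    arcpy.management.CopyFeatures(layer, staging)
    # 3. Empty the published feature class, then reload it
    arcpy.management.TruncateTable(target_fc)
    arcpy.management.Append(staging, target_fc, "NO_TEST")
    arcpy.management.Delete(staging)
```

Note that between the truncate and the append the published service briefly serves an empty layer, which matters at an hourly refresh rate.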
Nice one Joe, however with this update frequency it may be a requirement to have no downtime, in which case I would lean towards the ETL approach, which operates one feature at a time, or towards the feature service REST API, so you only operate on the feature service's change set, in milliseconds. Sorry to be pedantic ;-).
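For the REST route, a feature layer exposes an applyEdits endpoint that accepts only the changed features, so the service never goes empty. A stdlib-only sketch, where the layer URL, token handling, and attribute names are all assumptions:

```python
import json
from urllib import parse, request

def build_apply_edits(adds):
    """Build the form body for a feature service applyEdits request.

    `adds` is a list of dicts like
    {"geometry": {"x": ..., "y": ...}, "attributes": {...}}.
    """
    return parse.urlencode({
        "adds": json.dumps(adds),
        "f": "json",
    }).encode()

def apply_edits(layer_url, adds, token=None):
    """POST only the changed features to <layer_url>/applyEdits.

    layer_url and token are placeholders for your service.
    """
    body = build_apply_edits(adds)
    if token:
        body += b"&" + parse.urlencode({"token": token}).encode()
    req = request.Request(layer_url.rstrip("/") + "/applyEdits", data=body)
    with request.urlopen(req) as resp:
        return json.load(resp)
```

The same endpoint also takes `updates` and `deletes` arrays, so an hourly job only has to diff the new CSV against the previous one and send the delta.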
Good point Bruce....I will have to continue with your approach, but I'm still confused....
I just did what Joe was talking about, so now to continue with your solution.