11,19770167;56,58325000;219016873;20,000000;1;2014-07-01 10:44:44
I am developing a live feed vehicle tracking web map that receives and consumes CSV file data from GPS feeds on vehicles. A dummy file of the live feed is provided in this email and the feed is polled or queried via a URL that downloads the latest instance of the current coordinates for the vehicles.
The process that I’ve tried to follow is similar to the one shown in the GeoEvent Extension Introduction tutorials (attached is the tutorial that shows how to create a ‘live’ map).
I can make the tutorial data (flights) update using the tcp-in and tcp-out connectors and the GeoEvent Simulator, but when I try the same process as the flights GeoEvent Service with the Location.csv data for the Sappi vehicles, it does not update.
I’ve also tried the ‘Watch a Folder for New CSV Files’ input connector (my preferred method for this task) and created a GeoEvent Service that connects to an fs-out output to update the point feature layer that matches the input CSV data. This also did not work.
Some aspects that I’ve troubleshot so far are:
The original CSV file that downloads has quotation marks at the beginning and end of each field, as well as commas to separate the fields. I've tested without the quotation marks and still have the same result. No data can be seen transferring on the monitor page of GeoEvent Manager.
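For reference, the pre-processing I tried when testing without the quotation marks amounts to something like the sketch below (the feed URL and output path are placeholders rather than the real feed):

    # Minimal sketch: poll the feed URL and rewrite the CSV without quoted fields.
    # FEED_URL and OUT_PATH are hypothetical placeholders for illustration only.
    import csv
    import io
    import urllib.request

    FEED_URL = "https://example.com/fleet/latest.csv"   # placeholder feed URL
    OUT_PATH = r"C:\GeoEvent\csv\locations.csv"          # placeholder output path

    def fetch_and_strip_quotes():
        with urllib.request.urlopen(FEED_URL) as response:
            text = response.read().decode("utf-8")

        # Read the comma-separated, fully quoted rows...
        rows = list(csv.reader(io.StringIO(text), delimiter=",", quotechar='"'))

        # ...and write them back out without quoting every field.
        with open(OUT_PATH, "w", newline="") as f:
            writer = csv.writer(f, delimiter=",", quoting=csv.QUOTE_MINIMAL)
            writer.writerows(rows)

    if __name__ == "__main__":
        fetch_and_strip_quotes()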
Hi Greg
I've taken a quick look at your Original-Locations file and have it successfully running as-is, quotes and all, using the 'Watch a Folder for New CSV Files' input and simply writing out to another CSV file.
A couple of possible things spring to mind.
As you have the coordinates coming in as two separate columns, you need to add an extra field to the input Event Definition for the geometry to be assembled into, switch "Build Geometry From Fields" to Yes, and populate the X and Y field names in the Input service.
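As a rough illustration (the column order and the WGS84 spatial reference here are assumptions on my part, so check them against your file), the geometry assembly amounts to building a point from the two coordinate values, along these lines:

    # Sketch of assembling a point geometry from separate X and Y columns.
    # Column order and the WGS84 spatial reference (wkid 4326) are assumptions.
    def build_point(x_value, y_value):
        return {
            "x": float(x_value),
            "y": float(y_value),
            "spatialReference": {"wkid": 4326},
        }

    # e.g. a row split into fields: x, y, vehicle id, speed, status, timestamp
    row = ["11.19770167", "56.58325000", "219016873", "20.0", "1", "2014-07-01 10:44:44"]
    geometry = build_point(row[0], row[1])
    print(geometry)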
You’ll also need to specify the Expected Date Format for your data, as it’s not the default the GEP expects. That isn’t what’s stopping the in/out process, but you’ll get incorrect data in the output field otherwise.
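For example, a timestamp like the one in your dummy row ('2014-07-01 10:44:44') would need a pattern along the lines of yyyy-MM-dd HH:mm:ss in the Expected Date Format (as far as I know the GEP takes Java-style date patterns, so treat that exact mask as something to verify); a quick Python check of the same parse is:

    # Quick check that the feed's timestamp string parses with the expected pattern.
    from datetime import datetime

    sample = "2014-07-01 10:44:44"            # timestamp from the dummy feed row
    parsed = datetime.strptime(sample, "%Y-%m-%d %H:%M:%S")
    print(parsed.isoformat())                 # 2014-07-01T10:44:44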
You need to drop a uniquely named file into the watched folder each time. GEP won't process a file with the same name as one it has processed before, even if the content is different.
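A simple way to guarantee that is to stamp each drop with the current time, something like the following (the folder path is just a placeholder):

    # Write each polled snapshot to a uniquely named file so GEP treats it as new.
    import os
    import shutil
    import time

    WATCH_FOLDER = r"C:\GeoEvent\csv"          # placeholder registered folder
    stamp = time.strftime("%Y%m%d_%H%M%S")
    shutil.copy("locations.csv", os.path.join(WATCH_FOLDER, f"locations_{stamp}.csv"))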
There is some good info on CSV file inputs in this thread: https://community.esri.com/thread/89695. Note the point in that thread about GEP expecting a two-line header in the file; otherwise you'll be dropping one or two rows from each CSV.
I’ve (hopefully) attached screenshots of how I set up the input Event Definition and the Input service. I’ve assumed the first column in the file is the identifier for the vehicle, so I’ve tagged that as the TRACK_ID.
Once you get the CSV-file-out version running and you’re happy with the content of the output file, switch out the CSV output in the GeoEvent Service for the feature service output, making sure the Event Definition data types from the feature service layer correspond to the equivalent data types in the GEP. You’ll need a Field Mapper Processor to point the GEP Geometry field to the Shape field in the feature service layer, and the other fields as well. Then you should be up and running.
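If it's useful, here's a quick way to list the field names and types on the feature service layer before setting up the Field Mapper; the layer URL below is a placeholder for your own service:

    # Fetch the feature service layer definition and list its fields, so the
    # GeoEvent Definition / Field Mapper can be matched against them.
    # LAYER_URL is a placeholder; substitute your own feature service layer.
    import json
    import urllib.request

    LAYER_URL = "https://myserver/arcgis/rest/services/Vehicles/FeatureServer/0"

    with urllib.request.urlopen(LAYER_URL + "?f=json") as response:
        layer = json.load(response)

    print("Geometry type:", layer.get("geometryType"))
    for field in layer.get("fields", []):
        print(field["name"], "->", field["type"])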
Hope this helps
Cheers Evan
Hi Evan,
Thanks for your help.
I have applied the suggestions from https://community.esri.com/thread/89695 and created GeoEvent Definitions accordingly before as well.
From what I can tell, it is not even reading the CSV file being written to the watch folder. I've now contacted Esri support for assistance and have had feedback that the file ran normally on their side, so all I can think is that it may be a server setting somewhere. I've checked permissions on the user, the services, and the GeoEvent side, and all have full admin rights, so I'm really not sure why GeoEvent Manager won't even register that a CSV file has been placed in the correct folder.
I'll reply with feedback should I find out what the problem was.
Thanks
Greg
Hi Greg,
No problem; I know how frustrating it gets when things don't want to work when they should. Sorry it didn't solve the issue.
From your email it looks like you've pretty much covered all the options. The only thing on permissions is that it's the account the GEP was installed under that needs full permissions on the input file.
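If it helps, one quick way to dump the ACL on the watched folder and confirm that account appears with read access is a small icacls call (the folder path is a placeholder):

    # List the Windows ACL on the watched folder; the GeoEvent service account
    # should appear with at least read permission. Path is a placeholder.
    import subprocess

    WATCH_FOLDER = r"C:\GeoEvent\csv"
    subprocess.run(["icacls", WATCH_FOLDER], check=True)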
I did my test under my own user login which doesn't have admin rights on the server and that server is still at 10.2.1
It does sound a bit odd, so it will be interesting to see what Esri support come back with.
Cheers Evan
Hi Evan,
OK, so I do have some feedback after meeting with the Esri support person. It seems that, for some reason, the default directory that I used was the problem.
Apparently creating a new directory (we made one directly on the C drive) and pointing the input and output services to it worked. I wonder if having a longer path to the folder had something to do with my problem. I also noticed that his suggestion to remove the 'Input Directory' value in the input settings seems to have helped in my case too.
Thank you very much for your help
Greg
Hi Greg
That's an interesting one, glad they have it sorted for you
All the best
Evan
Hello Evan / Greg -
I would remove the line specifying the attribute field names from the CSV file you are writing into the folder the 'Watch a folder for new .csv files' inbound connector is watching ... better to include only the actual event data in the file and leave out the attribute field names.
In a reply to Greg, Evan cross-references an older thread, "watching a folder with csv files", in which I state: "the first two lines of the CSV file are reserved for a comma separated list of field names and a comma separated list of field data types - your actual event data needs to start with the 3rd line of the text file." But then in this thread I suggest the opposite: to remove the specification of attribute field names from the CSV file.
So which is it, you're asking. Should a CSV file include a two-line header with field names and types, or shouldn't it?
My apologies for the confusion. In the older thread I must have been thinking of when a CSV file is used as the source of event enrichment, not when the CSV file is used as the source of the event data.
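To make the distinction concrete, here is a rough sketch of the two file shapes under that reading; the column names and values are illustrative placeholders, not taken from Greg's feed:

    # Two sketches of CSV shapes: event data (no header) vs. enrichment
    # (two-line header). Column names and values are illustrative placeholders.
    import csv

    # 1) Event data for the watch-folder input: rows only, no header lines.
    with open("locations_event_data.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["219016873", "11.1977", "56.5833", "20.0", "2014-07-01 10:44:44"])

    # 2) Enrichment source: line 1 = field names, line 2 = field types, data from line 3.
    with open("vehicle_enrichment.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["vehicle_id", "depot", "capacity"])
        writer.writerow(["String", "String", "Integer"])
        writer.writerow(["219016873", "Depot A", "30"])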
Evan is correct, above, when he suggests that the OS-level user responsible for running the GeoEvent Windows service is the account which needs permission to access a registered folder. I don't remember whether it was at 10.2.2 or a later release that we added a visual indication of the apparent permissions on a registered folder to the Site > GeoEvent > Data Stores page in Manager. That particular enhancement might help clear up potential issues with whether or not an input connector will be able to retrieve data from a registered folder.
Also, Greg points out that when registering the system folder, the user interface's default is to place the value 'input' in the 'Input Directory' parameter. I always remove this value and leave the 'Input Directory' parameter unspecified. The default would only work if you wanted your 'Watch a Folder for New CSV Files' input to look for a sub-directory named 'input' beneath a registered system path such as C:\GeoEvent\csv (for example).
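The effect of leaving the default in place is easy to demonstrate; the input ends up watching a sub-folder of the registered path rather than the registered path itself:

    # With 'Input Directory' left at its default of 'input', the effective watched
    # path is a sub-folder of the registered folder, not the registered folder.
    import os

    registered_folder = r"C:\GeoEvent\csv"     # example registered system path
    input_directory = "input"                  # UI default for 'Input Directory'

    print(os.path.join(registered_folder, input_directory))   # C:\GeoEvent\csv\input
    print(registered_folder)                                   # where files were actually dropped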
When working with file-based input and output (and enrichment) I usually register a separate folder for each.
I don't configure my 'Watch a Folder for New CSV Files' to expect an 'input' sub-directory, and I don't allow it to 'Include Subfolders'.
Finally, please keep in mind that the GeoEvent Extension generally supports RESTful data streams. The inputs capable of watching a system folder for files have some significant limitations and are generally intended to prove that the real-time analytics you have designed in a GeoEvent Service behave as you intend. Moving toward production, we expect real-time data feeds to arrive via HTTP POST or as replies to queries you make on an external server's URL.
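As a rough illustration, a production-style feed would push each update straight to a REST-based input such as a 'Receive JSON on a REST Endpoint' input; the receiver URL, port, and payload fields below are assumptions you would adapt to your own input:

    # Sketch of pushing one event to a REST-based GeoEvent input via HTTP POST.
    # The receiver URL, port, and payload fields are assumptions for illustration.
    import json
    import urllib.request

    RECEIVER_URL = "https://myserver:6143/geoevent/rest/receiver/vehicle-json-in"

    event = {
        "vehicle_id": "219016873",
        "x": 11.1977,
        "y": 56.5833,
        "speed": 20.0,
        "timestamp": "2014-07-01 10:44:44",
    }

    request = urllib.request.Request(
        RECEIVER_URL,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print(response.status)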
Best Regards -
RJ