BLOG
These days, the preferred method for monitoring a GeoEvent Server in a production system is ArcGIS Monitor. But sometimes you just want to see/monitor the local system directly within GeoEvent to quickly evaluate how the system is performing (or to alert you if it isn't performing well). Below are instructions for deploying an input connector that will collect some basic information from your system and output it to a CSV file. You are free to modify the configuration once imported, to write the data to a geodatabase and/or alert when certain parameters exceed certain tolerances.

Import the System Info Transport

Unzip the attached zip file below to a temporary location. You will deploy the new System Info Transport to your GeoEvent Server.

Method 1: Copy to the deploy folder
Copy the .jar file from the temporary location into your GeoEvent deploy directory. On Windows the default location for the deploy directory is C:\Program Files\ArcGIS\Server\GeoEvent\deploy\.

Method 2: Import the .jar file using GeoEvent Manager
In GeoEvent Manager, navigate to Site > Components > Transports and press the Add Local Transport button. Press the Choose Files button, browse to the temporary location, select the .jar file, press Open, then press the Add button.

Once the System Info Transport is deployed, it will show up in your list of inbound transports within GeoEvent Manager on the Site > Components > Transports page (you may have to refresh the page).

Import the configuration

In GeoEvent Manager, navigate to Site > GeoEvent > Configuration Store and press the Import Configuration button. Press the Choose File button, browse to the location of the GeoEventConfig SystemInfo.xml file, press the Open button, then press the Next button. Select Import Configuration and press the Import button. This will import the System Info Connector, GeoEvent Definition, Input, Output, and GeoEvent Service for you.
Explore the connector

Navigate to Site > GeoEvent > Connectors. The connector utilizes the System Info Transport, which generates information in generic JSON format. The adapter is the Generic JSON Adapter. The following fields are hidden from the user since they are not used: Expected Date Format, Construct Geometry from Fields, X/Y/Z Geometry Field, Default Spatial Reference, Learning Mode, As GeoJSON, and JSON Object Name. Note that the JSON Object Name is hard coded by default to 'OperatingSystemInformation'. The remaining fields can be left in the Shown Properties area. You may wish to modify the Update Interval, since the default value (1 second) may be too frequent for long-term monitoring.

Explore the GeoEvent Definition

A default system-info-in GeoEvent Definition is supplied in the configuration with some basic parameters. You may wish to have the input re-generate a new GeoEvent Definition for you, just to be sure that the properties reported on your system are the same. To do this, follow the steps below:

1. Rename the system-info-in GeoEvent Definition to something like system-info-in-original.
2. Navigate to Services > Inputs > system-info, make the following changes, and save the input:
   Create GeoEvent Definition: Yes
   GeoEvent Definition Name (New): system-info-in-auto
3. Run the input until at least one event is received, then turn the input off.
4. Navigate to Site > GeoEvent > GeoEvent Definitions and make a copy of the system-info-in-auto GeoEvent Definition. Name the copy system-info-in and save the copy.
5. Navigate back to Services > Inputs > system-info, make the following changes, and save the input:
   Create GeoEvent Definition: No
   GeoEvent Definition Name (Existing): system-info-in

At this point you can delete the system-info-in-auto and system-info-in-original GeoEvent Definitions, since they are no longer used.

Explore the GeoEvent Service

The GeoEvent Service system-info uses the system-info input to generate a system-info-in event every 60 seconds.
These events are written directly to an output that writes the events to a CSV file in the GeoEvent backup folder. By default this folder (on Windows) is located at C:\ProgramData\Esri\GeoEvent\systeminfo\. These CSV files will record the status of your server over time and can be easily manipulated in Excel to create graphs of the historical system resources.
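If you'd rather summarize the history without opening Excel, a short script can do it. Below is a minimal sketch that parses CSV text and reports min/mean/max for one numeric column; the column names in the sample are hypothetical, so substitute the field names your system-info output actually produces.

```python
import csv
import io
from statistics import mean

def summarize_metric(csv_text, column):
    """Parse system-info CSV text and return (min, mean, max) for one numeric column."""
    reader = csv.DictReader(io.StringIO(csv_text))
    values = [float(row[column]) for row in reader if row.get(column)]
    return min(values), mean(values), max(values)

# Hypothetical sample resembling the system-info output; real field names may differ.
sample = """receivedTime,cpuLoadPercent,freeMemoryMB
2020-07-17T14:00:00,12.5,2048
2020-07-17T14:01:00,55.0,1536
2020-07-17T14:02:00,20.0,1792
"""

low, avg, high = summarize_metric(sample, "cpuLoadPercent")
print(low, round(avg, 2), high)  # 12.5 29.17 55.0
```

To run this against the real files, read each CSV in the systeminfo folder with `open(...).read()` and pass the text in the same way.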
07-17-2020 02:15 PM | 1 | 3 | 1530

BLOG
With the 2020 Esri UC coming up next week, I wanted to highlight what Esri and the larger GIS community are doing this year to promote Diversity, Equity & Racial Justice. I believe that everyone should step up and participate in these activities. Click the links below to see what is going on and how you can get involved:

Diversity-Equity-Inclusion-Racial-Justice-at-the-esri-UC2020

Racial Equity & Social Justice SIG - Thursday, July 16 @ 9:15 AM PT
GIS for Equity & Social Justice SIG - Thursday, July 16 @ 11:45 AM PT
Women and GIS SIG - Thursday, July 16 @ 1 PM PT (https://userconference2020.schedule.esri.com/schedule/1335296033)
07-02-2020 12:21 PM | 0 | 0 | 610

BLOG
With the 2020 Esri UC coming up next week I wanted to highlight the ability to schedule appointments with experts to get your questions answered, your designs reviewed, or whatever it is that you feel may need a review by an expert. Click the link below to schedule your appointment today: UC/Experience/ArcGIS-Appointments
07-02-2020 12:08 PM | 1 | 1 | 816

BLOG
When writing a custom component for GeoEvent using the GeoEvent SDK, it is sometimes helpful to cache some information to the file system. This information could be a cache of the most recent data or configuration properties. This blog shows you how to allocate a folder within the GeoEvent file structure that will allow you to create files and store data that will persist through GeoEvent Server restarts.

Step 1: OSGI Blueprint Configuration

The first step is to modify the config.xml file under your src/main/resources/OSGI-INF/blueprint/ folder. You need to add the following property to your <bean>:

<property name="dataFolder" value="./data/yourfolder" />

For example:

<bean id="myTransportServiceBean" class="com.esri.geoevent.transport.example.MyTransportService" activation="eager">
  <property name="bundleContext" ref="blueprintBundleContext" />
  <property name="dataFolder" value="./data/mytransport" />
</bean>

The base directory is the install directory for GeoEvent Server (the default on Windows is C:\Program Files\ArcGIS\Server\GeoEvent\). Within the GeoEvent Server install folder there are two directories that are recommended for use:

./data/: The data directory is a file cache of information such as OSGI bundles, logs, and other temporary files. The files in this folder are used for caching the state of GeoEvent Server between restarts. If you are storing a cache of your custom component's data or configuration information, this is an ideal location.

./assets/: The assets folder is a place where you can store information that will be available on the GeoEvent Server Manager's web server. You can use this location to store any information that may potentially be accessed via a browser. This could be configuration information or a status page for your custom component.
Items in this folder can be accessed using the following URL: https://yourServerName:6143/geoevent/assets/

Step 2: Add a setDataFolder() method to your Service

After you make the config.xml file change above, OSGI will inject a folder location into your service bean. Within your Service Java class, you need to add a setter method to allow the framework to inject this folder object. Here's an example transport Service class:

public class MyTransportService extends TransportServiceBase
{
  protected static File dataFolder;

  public MyTransportService()
  {
    definition = new XmlTransportDefinition(getResourceAsStream("geotab-transport-definition.xml"));
  }

  public void setDataFolder(File inDataFolder)
  {
    dataFolder = inDataFolder;
    if (!dataFolder.exists())
    {
      dataFolder.mkdirs();
    }
  }

  @Override
  public Transport createTransport() throws ComponentException
  {
    return new MyTransport(definition, dataFolder);
  }
}

Step 3: Use the folder

After the above changes, your custom component will have access to a file folder location that can be used to store files. As mentioned above, these files can contain configuration information, cached data, status information, or whatever you like.
06-04-2020 02:19 PM | 0 | 0 | 680

POST
Hey Abdelhalim Ibrahim, if I understand your question correctly, you would like the output to recognize that each event is a duplicate of the previous one and not send that event. Unfortunately, the OOTB Push JSON to External Web Site output does not implement any mechanism to determine whether an event is unique or not. Thus, if you have 5,562 events in your input, it will write 5,562 events to your output.

The easiest solution would be to deploy the Updates Only Processor and insert it between your input and output. This processor will drop any event with a TRACK_ID + timestamp that is identical to a previous event. solutions-geoevent-java/solutions-geoevent/processors/updateOnly-processor at master · Esri/solutions-geoevent-java · Gi…

If you have some control over the website you are pushing data to and can accommodate an updated JSON schema, you can create your own GeoEvent Outbound Connector that uses the [Feature] JSON Outbound Adapter (not the Generic JSON Adapter) combined with the HTTP REST Outbound Transport. The Feature JSON Outbound Adapter will utilize the event's TRACK_ID to determine whether the event is unique or new (within the update interval setting). The easiest way to do this is to make a copy of the existing Push JSON to External Web Site connector, change the adapter to JSON, and update the parameters/properties appropriately. Unfortunately, your JSON format will change to the Esri Feature Set JSON format (see the JSON response example (returnIdsOnly=false and returnCountOnly=false) here).
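The core idea behind update-only filtering can be sketched outside GeoEvent in a few lines. This is only an illustration of the technique, not the processor's actual Java implementation, and the event field names are made up: cache each (track_id, timestamp) pair seen and drop any event that repeats one.

```python
# Minimal sketch of update-only filtering: drop events whose
# (track_id, timestamp) pair matches one already seen.
# Field names here are illustrative, not GeoEvent's internal API.
def updates_only(events):
    seen = set()
    for event in events:
        key = (event["track_id"], event["timestamp"])
        if key in seen:
            continue  # duplicate: suppress it
        seen.add(key)
        yield event

events = [
    {"track_id": "A", "timestamp": 100, "value": 1},
    {"track_id": "A", "timestamp": 100, "value": 1},  # duplicate, dropped
    {"track_id": "A", "timestamp": 160, "value": 2},
]
unique = list(updates_only(events))
print(len(unique))  # 2
```

Note that the cache grows with the number of distinct keys; a real processor would also expire old entries.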
05-26-2020 12:46 PM | 0 | 0 | 600

POST
Hey, I would use a "Watch a Folder for new JSON files" input to read the files in. If there aren't a lot of files, you could copy them by hand, one at a time, into the folder that is being watched. If there are a lot of them, you could set up a Python script to do it at a sustainable rate. Just be sure to copy them into the watched folder using a different extension than is being watched for, and rename them once they are copied. Basically, you want to be sure all the data is written to the file in the watched folder BEFORE GeoEvent tries to open it up and read it. For example, set GeoEvent up to watch for *.json files, copy all of the data files in as *.txt files, then rename them one by one to have the *.json extension. More details can be found in the following blog: https://community.esri.com/people/eironside-esristaff/blog/2019/04/11/geoevent-input-watch-a-folder-for-new-files
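The copy-then-rename pattern described above could be scripted roughly like this. The paths, extensions, and delay are placeholders; adjust them to your watched folder and a rate your input can sustain.

```python
import shutil
import time
from pathlib import Path

def stage_files(source_dir, watch_dir, delay_seconds=1.0):
    """Copy .json files into the watched folder under a temporary
    extension, then rename, so GeoEvent only ever sees complete files."""
    watch = Path(watch_dir)
    watch.mkdir(parents=True, exist_ok=True)
    for src in sorted(Path(source_dir).glob("*.json")):
        tmp = watch / (src.stem + ".txt")     # not matched by the *.json watcher
        shutil.copy2(src, tmp)                # full copy completes first
        tmp.rename(tmp.with_suffix(".json"))  # rename exposes the finished file
        time.sleep(delay_seconds)             # throttle to a sustainable rate
```

The rename is effectively atomic on the same volume, which is what guarantees GeoEvent never opens a half-written file.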
05-26-2020 12:20 PM | 0 | 0 | 1138

POST
Hey Tim Leefmann, long story short: best practice is to create a GeoEvent Definition with/for your output first, field map into that definition before the Field Calculator, then field calculate into an existing field. Some of the details behind the 'why' are contained in the following blog article: https://community.esri.com/people/eironside-esristaff/blog/2020/01/07/geoevent-consuming-new-data Basically, if you have the Field Calculator create the GeoEvent Definition, this definition is transient and created/destroyed at the platform's whim. Whenever possible, create GeoEvent Definitions that are owned by your user and can't be deleted by the system. Field map first, then use the existing fields whenever possible. Best, Eric
05-26-2020 12:11 PM | 2 | 0 | 1406

BLOG
One common input for GeoEvent is a REST endpoint created to receive XML, JSON, or CSV events. Once created, the URL is displayed on the input and is available for data providers to POST data to. But some have wondered how to secure such an endpoint. In general, this type of endpoint is relatively secure because it requires that an attacker know two very specific things:

1. The exact case-sensitive URL of the receiver endpoint.
2. The GeoEvent Definition that the receiver endpoint is listening for.

Current versions of GeoEvent expose a directory of GeoEvent configurations, making it easier to discover this information. When GeoEvent 10.8.1 is released, this will all change and the directory will be secured just like the /geoevent/admin/ API. This blog discusses one option for securing your receiver endpoints, regardless of version. These steps are not foolproof, but they go a long way toward making your receiver endpoints as secure as possible.

Step 1: Reverse Proxy

The first step is to obscure the endpoint behind a reverse proxy. The proxy will translate an externally available HTTP URL on a standard port (80/443) to the internal URL on port 6143. When communicating with external/3rd-party data providers, you can give them the external URL and don't have to expose your internal network names. Utilizing a reverse proxy also allows you to hide the directory information above so that external entities cannot browse your GeoEvent configuration.

Step 2: White List IPs

For the second step, you will need to know the IP addresses that your data provider will be POSTing data from. This could be a single IP, or a range/set of IPs in cases where the data provider is using a cluster of servers or has fail-over/disaster recovery set up on their system. Have your IT administration white list these IP addresses and block all other communication to the external URL. Once this step is done, only the white listed IP addresses will be able to POST data to your external receiver URL.
Step 3: Monitor

As with any good security system, your receiver endpoints must be monitored. At a minimum, be sure to periodically inspect the data that is received and compare it to known information you have about the event stream (like a separate API for querying data, equipment logs, or known operating parameters). In addition, you could monitor event rates, looking for situations where the number of events per period of time increases, which may indicate that someone is injecting events or mounting a denial-of-service attack.
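To make the white-listing idea in Step 2 concrete, here is a minimal sketch of an IP allow-list check. The real enforcement should happen in your firewall or reverse proxy, not in application code, and the networks below are documentation-range placeholders, not real provider addresses.

```python
from ipaddress import ip_address, ip_network

# Placeholder provider ranges; substitute the addresses your data
# provider actually POSTs from.
ALLOWED_NETWORKS = [ip_network("203.0.113.0/28"), ip_network("198.51.100.7/32")]

def is_allowed(source_ip):
    """Return True if the POSTing address falls within a white-listed range."""
    addr = ip_address(source_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

print(is_allowed("203.0.113.5"))  # True
print(is_allowed("192.0.2.10"))   # False
```

The same membership test works for IPv6 networks, since the ipaddress module handles both families.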
04-30-2020 08:51 AM | 1 | 0 | 1463

POST
Hey, at last check the Zonar feed was working OOTB with GeoEvent using the following endpoint. The documentation doesn't mention it at the top, but the method does return XML and JSON. https://support.zonarsystems.net/hc/en-us/articles/360016205232-Export-a-Path The status is bit-encoded, so you may have to use a Field Calculator or filter to do something with it (or just use the numbers as-is and symbolize on that). Best, Eric
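Decoding a bit-encoded status boils down to masking individual bits. A minimal Python illustration follows; the flag names and bit positions here are invented for the example, not Zonar's actual encoding, so check the vendor documentation for the real mapping.

```python
# Hypothetical bit positions -- consult the vendor docs for the real mapping.
FLAGS = {0: "ignition_on", 1: "moving", 2: "gps_fix"}

def decode_status(status):
    """Return the names of the bits that are set in a bit-encoded status value."""
    return [name for bit, name in FLAGS.items() if status & (1 << bit)]

print(decode_status(5))  # bits 0 and 2 set -> ['ignition_on', 'gps_fix']
```

The same mask-and-shift logic can be expressed in a GeoEvent Field Calculator expression one flag at a time.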
04-22-2020 10:25 AM | 0 | 0 | 925

POST
On March 1, MDS added a /vehicles end point to their specification for release 0.4.1. It should provide a cleaner way to get data without the paging start/end dates. Not sure if/when this will show up in vendor APIs. https://github.com/openmobilityfoundation/mobility-data-specification/pull/376 Best, Eric
04-22-2020 10:18 AM | 0 | 2 | 1436

BLOG
Contents: Complex Track Gap Requirements · Complex Reporting Requirements · 'Keep It Simple' Solution · Real-time Device Updates · Real-time Device Status · Group Status · Publish the Views

Complex Track Gap Requirements

I was recently asked to provide a solution that would identify the status of a set of devices. Each device could be 'Active' or 'Inactive' based on the amount of time that had elapsed since the last message received. At first glance this seemed like a good fit for the Track Gap Detector; however, there was an additional detail in the requirements that threw that solution out the window: each class of device had a different time window for active/inactive status (as shown below). So "Red" devices were sending events every 15 seconds and needed to be marked as inactive if more than 8 events (or 120 seconds) had been missed. But "Pink" devices were sending events every 10 minutes and needed to be marked as inactive if more than 2 events (or 20 minutes) had been missed.

Complex Reporting Requirements

To make matters even harder, a report was needed that would indicate the number of each device type that was active/inactive. This report needed to be accurate and updated in real time. GeoEvent can handle the real-time updates of singular events, but aggregation and caching of state is NOT something GeoEvent does (with the exception of a few operations like enter/exit).

'Keep It Simple' Solution

I considered having a different Track Gap Detector for each group, but that wasn't sustainable if the list of devices was to grow. I also considered writing some sort of custom processor to cache the state of each TRACK_ID and provide some sort of aggregate count of group status. But then I came to my senses and realized there is one tool in our GIS arsenal that is expert at on-the-fly status calculations and aggregations: database views (or query layers if you prefer).
A database view is created in a relational database (SQL Server Express in this instance) using ArcCatalog. You use SQL to define the view. Each view is based on an underlying database table (feature class) and is refreshed for each request, so you can use database views to do all sorts of wonderful things to your data that you can't do elsewhere.

Real-time Device Updates

To start, I used GeoEvent to write the latest updates from each of the devices to a feature class in the database. These records contained a lot of information from the device, but the most important fields for this blog are Name (TRACK_ID), Type (Group), and MsgDatetime (the last time the device reported in). This feature class was stored in the same relational database as the configuration table above (I left out the threshold in seconds because it can be calculated). The configuration table can easily be maintained, or even expanded, as more devices and/or new types of devices are added to the system.

Real-time Device Status

Now for our first view. To determine the status of any device, we need to compare the age of the last event record to the inactive threshold. In SQL Server, we can use the DATEDIFF function to subtract the message time from the current time. The result (in seconds) tells us how old each record is:

Age of Last Message = DATEDIFF(second, device.MsgDatetime, SYSUTCDATETIME())

To calculate the inactive threshold in seconds, we multiply the report interval by the inactive threshold:

Inactive Threshold (seconds) = config.ReportInterval * config.Threshold

For the status, we need a CASE WHEN statement to test whether the age (in seconds) is greater than the inactive threshold (in seconds):

CASE
    WHEN (DATEDIFF(second, device.MsgDatetime, SYSUTCDATETIME()) > (config.ReportInterval * config.Threshold))
    THEN 'Inactive'
    ELSE 'Active'
END
AS DeviceStatus

To join the two tables together, we use a LEFT OUTER JOIN. This type of join selects all the records of the first table and joins a matching record from the second table. We perform this join on the device's type:

FROM dbo.DEVICECURRENT AS device
LEFT OUTER JOIN dbo.DEVICECONFIG AS config ON device.DeviceType = config.DeviceType

The final view SQL is as follows, and the result of the view is below that:

SELECT
device.OBJECTID,
device.Name,
device.DeviceType,
device.MsgDatetime,
config.ReportInterval,
config.Threshold,
config.ReportInterval * config.Threshold AS MaxAgeSeconds,
DATEDIFF(second, device.MsgDatetime, SYSUTCDATETIME()) AS CurrentAge,
CASE
WHEN (DATEDIFF(second, device.MsgDatetime, SYSUTCDATETIME()) > (config.ReportInterval * config.Threshold))
THEN 'Inactive'
ELSE 'Active'
END
AS DeviceStatus,
device.SHAPE
FROM dbo.DEVICECURRENT AS device
LEFT OUTER JOIN dbo.DEVICECONFIG AS config ON device.DeviceType = config.DeviceType

Group Status

Now we need to aggregate these individual status records into a group status. To do that, we group the status view above by DeviceType and count how many records are in each type:

GROUP BY DeviceType

But we don't just want a count; we need to know how many are active and how many are inactive. To conditionally count records, we can sum a CASE expression, with COALESCE guarding against a group that has no records:

COALESCE(SUM(CASE WHEN DeviceStatus = 'Active' THEN 1 ELSE 0 END), 0) AS Active,
COALESCE(SUM(CASE WHEN DeviceStatus = 'Inactive' THEN 1 ELSE 0 END), 0) AS Inactive

To aggregate the geometry of the group, I've chosen the convex hull operation provided by SQL Server:

geometry::ConvexHullAggregate(SHAPE) AS SHAPE

So the final SQL that defines this view is as follows:

SELECT
    MIN(OBJECTID) AS OBJECTID,
    DeviceType,
    COALESCE(SUM(CASE WHEN DeviceStatus = 'Active' THEN 1 ELSE 0 END), 0) AS Active,
    COALESCE(SUM(CASE WHEN DeviceStatus = 'Inactive' THEN 1 ELSE 0 END), 0) AS Inactive,
    geometry::ConvexHullAggregate(SHAPE) AS SHAPE
FROM dbo.DeviceStatus
GROUP BY DeviceType

Which results in the following view table:

Publish the Views

Once I had the views working, I added them to a map service, then created a web map and a dashboard to display each device's current location and status as well as the group status. All of that information updates in real time based on each device's last event age.
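If it helps to see the view logic outside of SQL, the status test and group rollup can be sketched in plain Python. This is only an illustration of the calculation the views perform, not a replacement for them; the field names mirror the tables above and the per-type configuration values come from the Red/Pink examples.

```python
from datetime import datetime, timedelta, timezone

# Per-type configuration: (report interval in seconds, missed-event threshold).
CONFIG = {"Red": (15, 8), "Pink": (600, 2)}

def device_status(msg_datetime, device_type, now):
    """Mirror of the view's CASE expression: Inactive once the last
    message is older than interval * threshold seconds."""
    interval, threshold = CONFIG[device_type]
    age = (now - msg_datetime).total_seconds()
    return "Inactive" if age > interval * threshold else "Active"

now = datetime(2020, 4, 6, 12, 0, tzinfo=timezone.utc)
devices = [
    ("Red1", "Red", now - timedelta(seconds=30)),     # within 120 s -> Active
    ("Red2", "Red", now - timedelta(seconds=300)),    # older than 120 s -> Inactive
    ("Pink1", "Pink", now - timedelta(seconds=900)),  # within 1200 s -> Active
]

# Group rollup, equivalent to the GROUP BY DeviceType view.
counts = {}
for name, dtype, last in devices:
    status = device_status(last, dtype, now)
    counts.setdefault(dtype, {"Active": 0, "Inactive": 0})[status] += 1
print(counts)  # {'Red': {'Active': 1, 'Inactive': 1}, 'Pink': {'Active': 1, 'Inactive': 0}}
```

The advantage of the database views over a script like this is that the calculation reruns automatically on every map service request.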
04-06-2020 01:30 PM | 1 | 0 | 1323

BLOG
Contents: Ingesting Variable Length CSV Messages · Flexible CSV Adapter · Deploying & Compatibility · Connectors · Troubleshooting

Updated to GeoEvent 10.6.1 on June 27, 2022.

Ingesting Variable Length CSV Messages

In GeoEvent, you can use a Text Adapter on an inbound connector to parse comma-separated values (CSV), typically with either the TCP transport (like when you use the GeoEvent Simulator) or the File transport (to watch a folder for data files). In general, these CSV messages tend to be static in format, allowing you to use one GeoEvent Definition to read them in. However, I recently ran into a situation where a device was reporting a truncated message based on the amount of data it actually had to report. In this case, the devices were only reporting updates, so if there was no update for fields at the end of the message, those fields were omitted from the CSV message. A sample of the data would look like this (I've added some spaces to make the structure a bit clearer):

Name,IdNum,Type,MsgDatetime ,Value1,Value2,Value3,Value4,Value5,Value6,Value7,Value8,Value9,Value10
Red1,1 ,Red ,1585947092133, , , , , ,20.402, , , ,-0.57
Red1,1 ,Red ,1585947692133, , , , , ,1.5001, ,148 ,2
Red1,1 ,Red ,1585948292133, ,10 ,62 ,4 ,-0.22 ,17.701,170 , ,1
Red1,1 ,Red ,1585948892133,6 , ,65 , ,0.64 , ,220 , ,8
Red1,1 ,Red ,1585949492133, , , ,1
Red1,1 ,Red ,1585950092133,94 ,15 , ,1 , , ,480 ,139 ,3
Red1,1 ,Red ,1585950692133,42 ,3 , , ,0.64 , , , ,4 ,-0.99
Red1,1 ,Red ,1585951292133,67 ,-17 , , ,-0.05 ,5.4005, , , ,-0.74
Red1,1 ,Red ,1585951892133, , ,15 ,3 ,-0.72 , ,280 , ,7
Red1,1 ,Red ,1585952492133,22 ,14 , , ,-0.33 ,28.502, , ,8
Red1,1 ,Red ,1585953092133, ,2 , ,1 , , ,395 , ,3
Red1,1 ,Red ,1585953692133, , , ,1 , , , , ,4
Red1,1 ,Red ,1585954292133,55 ,3 ,20 , , , , , ,8
Red1,1 ,Red ,1585954892133,97 ,8 , , ,-0.2 , ,470 , ,8 ,-0.71

Using the OOTB CSV inputs, all but 4 of these messages would be dropped because they don't contain the full set of fields.
All of the messages conform to the expected GeoEvent Definition; however, some of them are missing the trailing set of fields because there is no updated value for those fields.

Flexible CSV Adapter

To get around this issue, I updated the OOTB inbound Text Adapter to be a bit more flexible. For each CSV message, it will attempt to put each field into the provided GeoEvent Definition. If there aren't enough fields in the CSV message to fill the GeoEvent, the remaining fields in the event will have their values set to NULL. If there are too many fields in the CSV message, the extra fields will be ignored (as if you only wanted the first 5 values in the above example messages).

Deploying & Compatibility

This adapter is compatible with any version of GeoEvent Server 10.4 or later. To deploy, unzip the attached file and upload the .jar file in GeoEvent Manager on Site > Components > Adapters using the Add Local Adapter button. Alternatively, you can copy the .jar file into the deploy folder of your GeoEvent Server installation (on Windows that is typically C:\Program Files\ArcGIS\Server\GeoEvent\deploy\). The current release is marked version 0.1 (proof of concept). Please report any issues in the comments section of this blog.

Connectors

Adding the .jar file to GeoEvent Server will create two inbound connectors for you. You can create additional connectors utilizing different transports if needed.

Troubleshooting

Loggers that can help troubleshoot:
com.esri.geoevent.adapter.flextext.TextInboundAdapterService
com.esri.geoevent.adapter.flextext.TextInboundAdapter

Source Code

The source code for this adapter is available in the following GitHub repository: https://github.com/EsriPS/geoevent-flextext-adapter
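The adapter's pad-or-truncate behavior can be illustrated in a few lines of Python. This is only a sketch of the idea, not the adapter's Java implementation, and it uses the field names from the sample data above.

```python
import csv
import io

FIELDS = ["Name", "IdNum", "Type", "MsgDatetime"] + [f"Value{i}" for i in range(1, 11)]

def parse_flexible(csv_text):
    """Pad short rows with None and ignore extra trailing fields,
    mirroring the flexible Text Adapter's behavior."""
    events = []
    for row in csv.reader(io.StringIO(csv_text)):
        row = [v.strip() or None for v in row[:len(FIELDS)]]  # truncate extras
        row += [None] * (len(FIELDS) - len(row))              # pad short rows
        events.append(dict(zip(FIELDS, row)))
    return events

sample = "Red1,1,Red,1585949492133,,,,1\n"
event = parse_flexible(sample)[0]
print(event["Value4"], event["Value10"])  # 1 None
```

Every parsed event ends up with the full 14-field shape, which is what lets a single GeoEvent Definition accept the variable-length messages.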
04-06-2020 11:29 AM | 3 | 3 | 1681

BLOG
Contents: Updating Data "On Change" · Custom, Non-null Feature JSON Adapter · Installation & Compatibility · Connectors · Troubleshooting

Updated 04/20/2021 - Now compatible with any GeoEvent Server version 10.6 or later.

Updating Data "On Change"

In GeoEvent, when you write data out to an Update Features output, it will update the fields in an existing record (if that record already exists). But I recently ran into a situation where a device was reporting several parameters, but only on change. So the events over time might look like the following: When I passed this data to the Update Features output, it updated the current values but replaced the past values with NULL. I could have used a filter and a Field Reducer to remove the null fields, but I would have needed to do that for each field (10 filter/field reducer pairs!).

Custom, Non-null Feature JSON Adapter

Instead, I created a custom JSON Adapter (based on the OOTB JSON Adapter) that omits any fields that are either NULL or empty strings. It has all of the same parameters as the OOTB adapter, with the addition of two more: Write Null Values? [Yes/No] and Write Empty Strings? [Yes/No].

Installation & Compatibility

This adapter is compatible with any version of GeoEvent Server 10.6 or later. To deploy, unzip the attached file and upload the .jar file in GeoEvent Manager on Site > Components > Adapters using the Add Local Adapter button. Alternatively, you can copy the .jar file into the deploy folder of your GeoEvent Server installation (on Windows that is typically C:\Program Files\ArcGIS\Server\GeoEvent\deploy\). The current release is marked version 0.1 (proof of concept). Please report any issues in the comments section of this blog.

Connectors

Installing the .jar file will create two outbound connectors for you. You can create additional connectors as needed.

Troubleshooting

Loggers that can help troubleshoot:
com.esri.geoevent.adapter.nonnulljson.FeatureJsonOutboundAdapterService
com.esri.geoevent.adapter.nonnulljson.FeatureJsonOutboundAdapter
com.esri.geoevent.adapter.nonnulljson.FeatureJsonOutboundConverter
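The adapter's core behavior, dropping NULL and empty-string fields before the JSON is written, can be sketched like this. This is an illustration only; the real adapter is written in Java and emits Esri Feature JSON, and the event field names below are made up.

```python
def strip_unset_fields(event, write_nulls=False, write_empty_strings=False):
    """Drop fields an on-change device didn't report, so an Update
    Features request leaves existing attribute values untouched."""
    return {
        k: v for k, v in event.items()
        if (v is not None or write_nulls)
        and (v != "" or write_empty_strings)
    }

event = {"track_id": "Red1", "temp": 20.4, "pressure": None, "note": ""}
print(strip_unset_fields(event))  # {'track_id': 'Red1', 'temp': 20.4}
```

Because the omitted attributes never appear in the update payload, the feature service keeps whatever values those fields already held.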
04-06-2020 10:58 AM | 2 | 0 | 1010

POST
Hey, GeoEvent is not designed to share information across multiple events. There are a few processors out there that might cache a previous event (enters, exits, event joiner, etc.), but these are the exception rather than the norm. Rather than using GeoEvent to aggregate your records, I would suggest using the platform that is storing your data:

RDBMS - If you are putting the data into a relational database, create a SQL view to join and aggregate your data.

ArcGIS Online - In ArcGIS Online you can take advantage of the Join Features functionality to create a join between two tables (e.g. county polygons and your Survey123 data table) that will update as the data changes. See the following blog for a discussion and example: Visualizing related data with Join Features in ArcGIS Online
03-24-2020 01:20 PM | 1 | 1 | 1914

POST
Hey Adam Repsher, Stefan P. Jung's suggestion is a good one. I would add the following additional step to your batch file: when you copy the file from the remote server to your local drive, make sure you use a different extension than what GeoEvent is looking for. So assuming your GeoEvent input is monitoring for '*.csv' files, you will want to xcopy the files as 'file.txt'. After the file is copied, add the following lines to your script to rename the file and include a date/timestamp in the name:

net use \\remoteserver\sharedfolder /user:domain1\user1 password
xcopy \\remoteserver\sharedfolder\file.txt C:\Temp\GeoEventWatch\ /Y
For /f "tokens=2-4 delims=/ " %%a in ('date /t') do (set mydate=%%c%%a%%b)
For /f "tokens=1-2 delims=/:" %%a in ("%TIME%") do (set mytime=%%a%%b)
rename c:\temp\GeoEventWatch\file.txt file_%mydate%_%mytime%.csv
net use \\remoteserver\sharedfolder /delete

Please see the following blog for more information: https://community.esri.com/people/eironside-esristaff/blog/2019/04/11/geoevent-input-watch-a-folder-for-new-files

Note: if you run the above commands outside of a batch file, change the '%%' to '%' in the "For ..." lines.
03-23-2020 10:34 AM | 3 | 3 | 3356
| Title | Kudos | Posted |
|---|---|---|
|  | 1 | 01-05-2023 02:58 PM |
|  | 2 | 07-21-2021 07:16 PM |
|  | 1 | 02-05-2024 11:02 AM |
|  | 1 | 09-14-2023 08:09 PM |
|  | 2 | 05-13-2019 09:32 AM |