BLOG
This blog is one in a series discussing debugging techniques you can use when working to identify the root cause of an issue with a GeoEvent Server deployment or configuration. Click any link in the quick list below to jump to another blog in the series.

Configuring the application logger
Add/Update Feature Outputs
Application logging tips and tricks
Geofence synchronization deep dive (this blog)

In this blog I will discuss a technique I have used to perform a targeted extraction of debug messages logged as GeoEvent Server queries a feature record set from a polygon feature service to synchronize its geofences. The technique expands on the command-line utilities first introduced in a previous blog. These utilities enable us to perform pattern matching on specific sections of a logged message, then extract and apply string substitution and formatting to the logged messages, live, as the messages are being written, to make them easier to read. A lot of the analysis I am about to demonstrate could be done using a static log file and a text editor, but I have come to really appreciate the power inherent in the command-line utilities I am covering in this blog. Our goal will be to find and review the HTTP requests GeoEvent Server makes on a feature service resource being used as the authoritative source of area-of-interest polygons, as well as the feature service's responses to those requests.

Scenario

A customer has published a feature service with several dozen polygons representing different areas of interest and configured a geofence synchronization rule to enable the polygons to be periodically imported and synchronized to keep a set of geofences up to date. We know that GeoEvent Server polls the feature service to obtain a feature record set and registers the geometries with its AOI Manager – in this context AOI is short for "area of interest". For this exercise we are interested in the interface between GeoEvent Server and an ArcGIS Server feature service, not the internal operations of the AOI Manager. We want to capture information on feature record requests and the responses to those requests.

GeoEvent Manager does not provide an indication of when geofence synchronization occurs, only that it occurs once every 10 minutes in the customer's configuration, so the customer would like to know whether enabling debug logging for a specific component logger will grant them additional visibility into the periodic geofence refresh as it takes place. Knowing when a synchronization is about to occur allows more deterministic testing of the configured real-time analytics without resorting to aggressive synchronizations every few seconds.

Geofence Synchronization

To begin testing the scenario described above I published a feature service and added a few dozen polygon feature records to the service's feature data set. I can query the feature records via the feature service's REST endpoint (an example query is shown below). Notice that each feature record has two attribute fields, gf_category and gf_name, which can be used to uniquely name and organize geofences when they are imported into GeoEvent Server. Next, in GeoEvent Manager, I configure a synchronization rule that will query the feature service every 10 minutes. The feature records will be loaded into the GeoEvent Server's AOI Manager, which handles the addition and update of geofences.
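The same query can be issued with any HTTP client. The Python sketch below reproduces it; the host, port, and certificate handling are assumptions for this test environment, and a token parameter would be required against a secured server.

import requests

# Hypothetical URL for the feature service layer published in this scenario.
URL = "https://localhost:6443/arcgis/rest/services/MyGeofences/FeatureServer/0/query"

params = {
    "where": "1=1",
    "outFields": "gf_name,gf_category",
    "outSR": 4326,
    "f": "json",
}

# verify=False only because this test server uses a self-signed certificate.
response = requests.get(URL, params=params, verify=False).json()

for feature in response["features"]:
    attributes = feature["attributes"]
    print(attributes["gf_category"], attributes["gf_name"])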
At this point I know that GeoEvent Server is periodically querying the feature service, but the GeoEvent Manager web application does not provide any indication of when the synchronizations will occur. I know the synchronization cycle starts when I click the Synchronize button and repeats every 10 minutes after that, but I do not know which component loggers would be most appropriate to watch for DEBUG messages. My only real choice, then, is to request debug logging for all component loggers by setting the level DEBUG on the ROOT component (knowing that this will cause the karaf.log file to grow very large very quickly).

In a previous blog, Application logging tips and tricks, I introduced tail and grep, a couple of command-line tools that can be used to help identify and isolate logged messages based on keywords or phrases. Using this technique to identify logged messages which include specific keywords allows me to focus on messages of particular interest. In this case, however, grep will not work as well because a pattern match may occur anywhere in a logged message's text. Using grep to look for something like MyGeofences[/]FeatureServer[/]0 is likely to match more than we are interested in, specifically because the feature service's URL appears in both the thread identifier and the actual message portion of numerous logged messages. So we need a more discriminating technique: a way to apply a regular expression pattern match to a specific portion of a logged message and associate a successful pattern match with an action we can run on the text of messages as they are written to the system's log file.

Power tools for text processing and data extraction

Consider the following command, which leverages awk rather than grep, along with a new stream editing utility, sed:

rsunderman@localhost //localhost/C$/Program Files/ArcGIS/Server/GeoEvent/data/log
$ tail -0f karaf.log | awk -F\| '$6 ~ /rest.*services.*MyGeofences/ { print $1 $4 $6; fflush() }' | sed 's/[&]token[=][0-9a-zA-Z_-]*/.../'

The awk command is typically used for data extraction and reporting. It is a text processing language developed by Aho, Weinberger, and Kernighan (yes, AWK is an acronym). The sed command is a stream editor used to filter and transform text. When interpreting the command line illustrated above, remember that logged messages have six parts and each part is separated by a pipe ( | ) character. As new messages are added to the karaf.log, each message's text is processed by the awk script, which specifies that a pipe character should be used as the field delimiter and that the sixth field, the actual message, must match the specified regular expression pattern. If the pattern is matched, then fields 1, 4, and 6 from the logged message are printed as output. The fflush() call is important: it forces the command's buffered content to be flushed as each line of text is processed so that the sed command can identify a string of characters matching a query parameter &token= and replace the entire string with a few literal dots (simplifying the overall string).

There is a lot of power packed into this command. It enables us to apply a dynamic if / then evaluation to each logged message as the message is committed to the system log file, discard any message when a specific field does not match a specific pattern, and reformat messages on the fly to simplify their display. Wow. You can read all about the power of sed and awk online.
O’Reilly Media has an entire book dedicated to using sed and awk as power tools for text processing and data extraction.

Determining which component logger(s) to watch

The following illustration shows the output produced when the command above is used to filter the large volume of messages logged by all components when debug logging is requested at the ROOT level. For this example, assume that the command was run just before the Synchronize button is clicked to force a geofence synchronization rule to perform a set of queries against the feature service. One pattern that stands out immediately is that there appear to be four requests made. Different component loggers represent the requests in their own way, but we see key phrases repeated such as "Executing following request", "MainClientExec ... Executing request", and the request's outgoing headers and the actual request going out over the HTTP wire.

We certainly don't need to see each request represented four different ways, and a quick search of the karaf.log for the keyword "MainClientExec" shows the raw (unprocessed) log messages are associated with a particular class and bundle. These are clues to loggers we can interrogate further.

If we are careful to leave DEBUG logging turned on at the ROOT level for only as long as it takes to navigate to the GeoFence Synchronization Rules and click Synchronize, then return to the Logs and change the setting back to WARN for the ROOT level, we can use the cached logged messages to generate a list of possible component loggers we might want to look at more closely. Two loggers that seem specifically appropriate are org.apache.http.impl.execchain.MainClientExec (because "MainClientExec" was identified as a class name of interest) and com.esri.ges.httpclient.Http (because the bundle identifier "com.esri.ges.framework.httpclient" was part of each logged message).

Requesting DEBUG logging on the HTTP client logger will still produce a large number of logged messages. By targeting a single logger, however, we reduce the number of messages being logged overall; we are not interested in examining debug messages from the header or wire components, for example. Also, we can tailor our sed and awk command to help further identify messages of particular interest. If we run our text extraction and formatting command on an active tail of the karaf.log – and take care to start and end the tail around the time we navigate to GeoFence Synchronization Rules and click Synchronize – the number of logged messages is surprisingly manageable. I have included the 24 lines extracted and formatted by the sample command below, which looks specifically for the key phrases "Executing request" and "Got response":

$ tail -0f karaf.log | awk -F\| '$6 ~ /(Executing request|Got response)/ { print $1 $6; fflush() }' | sed 's/[&]token[=][0-9a-zA-Z_-]*/.../'

2019-11-07T18:06:40,294 Executing request POST /arcgis/admin/machines/localhost/status HTTP/1.1
2019-11-07T18:06:40,342 Got response from HTTP request: <html lang="en">
2019-11-07T18:06:43,629 Executing request POST /arcgis/admin/system/configstore HTTP/1.1
2019-11-07T18:06:43,646 Got response from HTTP request: <html lang="en">
2019-11-07T18:06:46,622 Executing request GET /arcgis/help/en/geoevent HTTP/1.1
2019-11-07T18:06:46,626 Executing request GET /arcgis/help/en/geoevent/ HTTP/1.1
2019-11-07T18:06:47,250 Executing request GET /arcgis/rest/info?f=json HTTP/1.1
2019-11-07T18:06:47,253 Got response from HTTP request: {"currentVersion":10.8,"fullVersion":"10.8.0","soapUrl":"https://localhost:6443/arcgis/services","secureSoapUrl":null,"authInfo":{"isTokenBasedSecurity":true,"tokenServicesUrl":"https://localhost:6443/arcgis/tokens/","shortLivedTokenValidity":900}}.
2019-11-07T18:06:47,710 Executing request GET /arcgis/rest/services/?f=json..... HTTP/1.1
2019-11-07T18:06:47,720 Got response from HTTP request: {"currentVersion":10.8,"folders":["System","Utilities"],"services":[{"name":"AffectedTransLines-Buffers","type":"StreamServer"},{"name":"AffectedTransLines-Intersections","type":"StreamServer"},{"name":"CriticalInfrastructure","type":"FeatureServer"},{"name":"CriticalInfrastructure","type":"MapServer"},{"name":"Geofence_Stream","type":"StreamServer"},{"name":"MyGeofences","type":"FeatureServer"},{"name":"MyGeofences","type":"MapServer"},{"name":"SampleWorldCities","type":"MapServer"},{"name":"TropicalStormPolygons","type":"StreamServer"}]}.
2019-11-07T18:06:48,060 Executing request GET /arcgis/rest/services/?f=json..... HTTP/1.1
2019-11-07T18:06:48,068 Got response from HTTP request: {"currentVersion":10.8,"folders":["System","Utilities"],"services":[{"name":"AffectedTransLines-Buffers","type":"StreamServer"},{"name":"AffectedTransLines-Intersections","type":"StreamServer"},{"name":"CriticalInfrastructure","type":"FeatureServer"},{"name":"CriticalInfrastructure","type":"MapServer"},{"name":"Geofence_Stream","type":"StreamServer"},{"name":"MyGeofences","type":"FeatureServer"},{"name":"MyGeofences","type":"MapServer"},{"name":"SampleWorldCities","type":"MapServer"},{"name":"TropicalStormPolygons","type":"StreamServer"}]}.
2019-11-07T18:06:56,608 Executing request GET /arcgis/rest/services/MyGeofences/FeatureServer/0/query?f=json.....&where=1%3D1&outFields=gf_name%2Cgf_category&outSR=4326 HTTP/1.1
2019-11-07T18:06:56,635 Got response from HTTP request: {"objectIdFieldName":"objectid","globalIdFieldName":"","geometryType":"esriGeometryPolygon","spatialReference":{"wkid":4326,"latestWkid":4326},"fields":[{"name":"gf_name","alias":"gf_name","type":"esriFieldTypeString","length":50},{"name":"gf_category","alias":"gf_category","type":"esriFieldTypeString","length":50}],"features":[{"attributes":{"gf_name":"Alpha_003","gf_category":"Alpha"},"geometry":{"rings":[[[-120.252028,30.944518],[-119.784204,29.644623],[-120.566595,29.483390],[-121.447948,30.461197],[-121.275115,30.841066],[-120.252028,30.944518]]]}},{"attributes":{"gf_name":"Alpha_005","gf_category":"Alpha"},"geometry":{"rings":[[[-120.943999,33.487086],[-121.032831,32.575755],[-121.690575,31.901015],[-122.421752,32.119503],[-122.245335,33.447119],[-120.943999,33.487086]]]}},{"attributes":{"gf_name":"Alpha_006","gf_category":"Alpha"},"geometry":{"rings":[[[-122.691280,29.516679],[-123.226533,29.802332],[-122.749277,31.805495],[-122.429246,32.118518],[-122.421752,32.119503],[-121.690575,31.901015],[-121.275115,30.841066],[-121.447948,30.461197],[-122.691280,29.516679]]]}},{"attributes":{"gf_name":"Alpha_008","gf_category":"Alpha"},"geometry":{"rings":[[[-120.851764,33.649721],[-120.165423,33.593953],[-119.397317,32.932664],[-120.074747,32.236847],[-121.032831,32.575755],[-120.943999,33.487086],[-120.851764,33.649721]]]}},{"attributes":{"gf_name":"Alpha_010","gf_category":"Alpha"},"geometry":{"rings":[[[-116.132660,31.584451],[-116.047511,31.421341],[-115.681330,29.828279],[-116.996163,30.707680],[-116.957586,31.120668],[-116.132660,31.584451]]]}},{"attributes":{"gf_name":"Alpha_011","gf_category":"Alpha"},"geometry":{"rings":[[[-117.397404,29.178025],[-117.888219,29.222649],[-118.816738,30.272808],[-118.787736,30.371437],[-118.275610,30.321049],[-117.406325,29.315306],[-117.397404,29.178025]]]}},{"attributes":{"gf_name":"Alpha_013","gf_category":"Alpha"},"geometry":{"rings":[[[-118.461679,30.835274],[-118.017691,30.803414],[-118.017804,30.732340],[-118.275610,30.321049],[-118.787736,30.371437],[-118.830849,30.513622],[-118.712642,30.726205],[-118.461679,30.835274]]]}},{"attributes":{"gf_name":"Alpha_014","gf_category":"Alpha"},"geometry":{"rings":[[[-118.017804,30.732340],[-117.331629,29.805684],[-117.406325,29.315306],[-118.275610,30.321049],[-118.017804,30.732340]]]}},{"attributes":{"gf_name":"Alpha_018","gf_category":"Alpha"},"geometry":{"rings":[[[-118.291482,32.187915],[-118.236504,32.105900],[-118.461679,30.835274],[-118.712642,30.726205],[-118.910999,31.691403],[-118.291482,32.187915]]]}},{"attributes":{"gf_name":"Alpha_021","gf_category":"Alpha"},"geometry":{"rings":[[[-118.236504,32.105900],[-117.610715,31.368293],[-118.017691,30.803414],[-118.461679,30.835274],[-118.236504,32.105900]]]}},{"attributes":{"gf_name":"Alpha_022","gf_category":"Alpha"},"geometry":{"rings":[[[-118.415540,32.957686],[-118.306950,32.857367],[-118.241221,32.789177],[-118.291482,32.187915],[-118.910999,31.691403],[-119.764803,31.398264],[-120.074747,32.236847],[-119.397317,32.932664],[-118.415540,32.957686]]]}},{"attributes":{"gf_name":"Alpha_023","gf_category":"Alpha"},"geometry":{"rings":[[[-116.802682,34.081536],[-115.652745,33.453531],[-115.641477,32.336068],[-115.871176,32.188122],[-117.476538,32.953794],[-116.802682,34.081536]]]}},{"attributes":{"gf_name":"Alpha_024","gf_category":"Alpha"},"geometry":{"rings":[[[-122.562145,38.611239],[-122.186957,38.336612],[-122.195569,38.107073],[-123.151426,37.051286],[-122.951406,38.539622],[-122.562145,38
.611239]]]}},{"attributes":{"gf_name":"Alpha_026","gf_category":"Alpha"},"geometry":{"rings":[[[-121.521585,38.401753],[-121.317484,37.884274],[-121.542840,36.899873],[-121.646568,36.833617],[-122.195569,38.107073],[-122.186957,38.336612],[-121.521585,38.401753]]]}},{"attributes":{"gf_name":"Alpha_028","gf_category":"Alpha"},"geometry":{"rings":[[[-122.340072,35.533098],[-121.150046,34.948623],[-120.851764,33.649721],[-120.943999,33.487086],[-122.245335,33.447119],[-122.980897,34.359173],[-122.340072,35.533098]]]}},{"attributes":{"gf_name":"Alpha_030","gf_category":"Alpha"},"geometry":{"rings":[[[-122.195569,38.107073],[-121.646568,36.833617],[-122.497375,35.977360],[-123.316198,36.487165],[-123.151426,37.051286],[-122.195569,38.107073]]]}},{"attributes":{"gf_name":"Alpha_033","gf_category":"Alpha"},"geometry":{"rings":[[[-120.462003,35.205002],[-119.434621,34.271019],[-120.165423,33.593953],[-120.851764,33.649721],[-121.150046,34.948623],[-120.462003,35.205002]]]}},{"attributes":{"gf_name":"Alpha_034","gf_category":"Alpha"},"geometry":{"rings":[[[-120.314583,38.136693],[-119.722418,38.072922],[-120.702231,36.942569],[-121.542840,36.899873],[-121.317484,37.884274],[-120.314583,38.136693]]]}},{"attributes":{"gf_name":"Alpha_036","gf_category":"Alpha"},"geometry":{"rings":[[[-121.315802,38.800687],[-120.314583,38.136693],[-121.317484,37.884274],[-121.521585,38.401753],[-121.315802,38.800687]]]}},{"attributes":{"gf_name":"Alpha_038","gf_category":"Alpha"},"geometry":{"rings":[[[-117.414676,35.037767],[-116.873703,34.464625],[-116.878218,34.445776],[-117.452834,34.349222],[-117.874553,34.354357],[-117.900979,34.521991],[-117.414676,35.037767]]]}},{"attributes":{"gf_name":"Alpha_039","gf_category":"Alpha"},"geometry":{"rings":[[[-118.276782,35.277533],[-118.210805,35.269057],[-118.075170,35.188227],[-117.900979,34.521991],[-117.874553,34.354357],[-118.597699,33.869008],[-118.895173,34.292433],[-118.827343,34.677451],[-118.276782,35.277533]]]}},{"attributes":{"gf_name":"Alpha_041","gf_category":"Alpha"},"geometry":{"rings":[[[-119.837637,35.784433],[-118.827343,34.677451],[-118.895173,34.292433],[-119.434621,34.271019],[-120.462003,35.205002],[-119.837637,35.784433]]]}},{"attributes":{"gf_name":"Alpha_042","gf_category":"Alpha"},"geometry":{"rings":[[[-117.922812,36.598206],[-117.057992,36.394203],[-118.210805,35.269057],[-118.276782,35.277533],[-118.493063,35.699818],[-118.285579,36.557670],[-117.922812,36.598206]]]}},{"attributes":{"gf_name":"Alpha_043","gf_category":"Alpha"},"geometry":{"rings":[[[-118.075170,35.188227],[-117.434176,35.060678],[-117.414676,35.037767],[-117.900979,34.521991],[-118.075170,35.188227]]]}},{"attributes":{"gf_name":"Alpha_045","gf_category":"Alpha"},"geometry":{"rings":[[[-119.400803,37.662407],[-118.363534,37.428185],[-118.320407,36.580012],[-119.191764,36.798038],[-119.400803,37.662407]]]}},{"attributes":{"gf_name":"Alpha_046","gf_category":"Alpha"},"geometry":{"rings":[[[-119.629592,38.114023],[-119.400803,37.662407],[-119.191764,36.798038],[-119.814395,36.016233],[-120.702231,36.942569],[-119.722418,38.072922],[-119.629592,38.114023]]]}},{"attributes":{"gf_name":"Alpha_049","gf_category":"Alpha"},"geometry":{"rings":[[[-118.645011,38.788713],[-116.375726,38.414554],[-117.879285,37.762548],[-117.981593,37.812142],[-118.645011,38.788713]]]}},{"attributes":{"gf_name":"Alpha_050","gf_category":"Alpha"},"geometry":{"rings":[[[-117.981593,37.812142],[-117.879285,37.762548],[-117.850493,37.692040],[-117.922812,36.598206],[-118.285579,36.557670],[-118.320407,36.580012],
[-118.363534,37.428185],[-117.981593,37.812142]]]}},{"attributes":{"gf_name":"Bravo_003","gf_category":"Bravo"},"geometry":{"rings":[[[-114.966407,29.010553],[-115.556070,29.010613],[-115.565899,29.576853],[-114.877388,30.652564],[-113.973903,31.144828],[-113.874728,31.024001],[-114.966407,29.010553]]]}},{"attributes":{"gf_name":"Bravo_004","gf_category":"Bravo"},"geometry":{"rings":[[[-116.047511,31.421341],[-114.877388,30.652564],[-115.565899,29.576853],[-115.681330,29.828279],[-116.047511,31.421341]]]}},{"attributes":{"gf_name":"Bravo_005","gf_category":"Bravo"},"geometry":{"rings":[[[-113.665880,33.089569],[-113.578775,33.084994],[-113.595481,32.109917],[-113.642988,31.702545],[-113.997771,31.250836],[-115.038456,32.239077],[-114.828726,32.423734],[-113.665880,33.089569]]]}},{"attributes":{"gf_name":"Bravo_007","gf_category":"Bravo"},"geometry":{"rings":[[[-116.842159,36.261753],[-116.346604,35.977788],[-116.271652,35.913982],[-116.010855,34.931918],[-116.723780,34.610674],[-116.842159,36.261753]]]}},{"attributes":{"gf_name":"Bravo_009","gf_category":"Bravo"},"geometry":{"rings":[[[-115.760073,34.464355],[-115.288817,34.426023],[-115.232718,33.750965],[-115.651894,33.454681],[-115.760073,34.464355]]]}},{"attributes":{"gf_name":"Bravo_012","gf_category":"Bravo"},"geometry":{"rings":[[[-112.309934,34.738007],[-111.899280,34.156477],[-111.896265,34.000485],[-112.259220,33.812066],[-112.559940,33.882035],[-112.785387,34.665098],[-112.309934,34.738007]]]}},{"attributes":{"gf_name":"Bravo_013","gf_category":"Bravo"},"geometry":{"rings":[[[-113.866163,31.017642],[-112.821893,30.803882],[-112.208051,30.141331],[-112.201103,29.892425],[-113.605198,29.548920],[-113.866163,31.017642]]]}},{"attributes":{"gf_name":"Bravo_014","gf_category":"Bravo"},"geometry":{"rings":[[[-111.159012,32.522856],[-110.824649,32.494792],[-110.141501,31.918549],[-110.403977,31.052692],[-111.259674,30.945511],[-111.315692,30.965117],[-111.474622,31.590875],[-111.467117,31.957874],[-111.159012,32.522856]]]}},{"attributes":{"gf_name":"Bravo_016","gf_category":"Bravo"},"geometry":{"rings":[[[-113.946703,29.010449],[-114.966407,29.010553],[-113.874728,31.024001],[-113.866163,31.017642],[-113.605198,29.548920],[-113.946703,29.010449]]]}},{"attributes":{"gf_name":"Bravo_018","gf_category":"Bravo"},"geometry":{"rings":[[[-112.935806,32.530579],[-112.905623,32.022493],[-113.642988,31.702545],[-113.595481,32.109917],[-112.935806,32.530579]]]}},{"attributes":{"gf_name":"Bravo_019","gf_category":"Bravo"},"geometry":{"rings":[[[-111.317819,34.348952],[-110.978438,33.833734],[-111.375070,33.198857],[-111.896265,34.000485],[-111.899280,34.156477],[-111.317819,34.348952]]]}},{"attributes":{"gf_name":"Bravo_020","gf_category":"Bravo"},"geometry":{"rings":[[[-112.909116,33.056288],[-112.454372,32.971595],[-112.423551,31.937676],[-112.551553,31.875448],[-112.905623,32.022493],[-112.935806,32.530579],[-112.909116,33.056288]]]}},{"attributes":{"gf_name":"Bravo_021","gf_category":"Bravo"},"geometry":{"rings":[[[-112.150814,33.001880],[-111.414706,32.914951],[-111.159012,32.522856],[-111.467117,31.957874],[-112.423551,31.937676],[-112.454372,32.971595],[-112.150814,33.001880]]]}},{"attributes":{"gf_name":"Bravo_022","gf_category":"Bravo"},"geometry":{"rings":[[[-114.546436,34.566336],[-114.000237,34.215944],[-114.203561,33.611652],[-114.485905,33.625307],[-114.892467,33.864714],[-114.546436,34.566336]]]}},{"attributes":{"gf_name":"Bravo_028","gf_category":"Bravo"},"geometry":{"rings":[[[-116.878218,34.445776],[-116.802682,34.081536],[-117.476
538,32.953794],[-118.241221,32.789177],[-118.306950,32.857367],[-117.452834,34.349222],[-116.878218,34.445776]]]}},{"attributes":{"gf_name":"Bravo_029","gf_category":"Bravo"},"geometry":{"rings":[[[-116.010855,34.931918],[-115.974970,34.925304],[-115.760073,34.464355],[-115.651894,33.454681],[-115.652745,33.453531],[-116.802682,34.081536],[-116.878218,34.445776],[-116.873703,34.464625],[-116.723780,34.610674],[-116.010855,34.931918]]]}},{"attributes":{"gf_name":"Bravo_035","gf_category":"Bravo"},"geometry":{"rings":[[[-113.322646,36.492659],[-112.866662,35.823773],[-113.157328,34.983246],[-113.614524,35.263022],[-113.911112,36.188262],[-113.322646,36.492659]]]}},{"attributes":{"gf_name":"Bravo_037","gf_category":"Bravo"},"geometry":{"rings":[[[-115.337130,36.999392],[-114.286610,36.369889],[-114.190413,36.202457],[-114.581011,35.827259],[-114.984871,35.568119],[-115.038457,35.629228],[-115.332937,36.457187],[-115.360534,36.658888],[-115.337130,36.999392]]]}},{"attributes":{"gf_name":"Bravo_039","gf_category":"Bravo"},"geometry":{"rings":[[[-114.581011,35.827259],[-113.881352,35.215058],[-114.599067,34.811896],[-114.604860,34.809841],[-114.934130,35.107656],[-115.025696,35.248019],[-114.984871,35.568119],[-114.581011,35.827259]]]}},{"attributes":{"gf_name":"Bravo_040","gf_category":"Bravo"},"geometry":{"rings":[[[-115.635838,37.895516],[-114.244290,37.688867],[-114.286610,36.369889],[-115.337130,36.999392],[-115.635838,37.895516]]]}},{"attributes":{"gf_name":"Bravo_041","gf_category":"Bravo"},"geometry":{"rings":[[[-116.236850,38.430295],[-115.893363,38.243036],[-115.830905,38.042138],[-116.762045,36.853773],[-117.850493,37.692040],[-117.879285,37.762548],[-116.375726,38.414554],[-116.236850,38.430295]]]}},{"attributes":{"gf_name":"Bravo_043","gf_category":"Bravo"},"geometry":{"rings":[[[-113.032255,37.207939],[-112.165492,36.820693],[-111.999148,36.398822],[-112.273275,35.883587],[-112.866662,35.823773],[-113.322646,36.492659],[-113.032255,37.207939]]]}},{"attributes":{"gf_name":"Bravo_044","gf_category":"Bravo"},"geometry":{"rings":[[[-113.614524,35.263022],[-113.157328,34.983246],[-113.059816,34.730975],[-113.114472,34.672716],[-114.599067,34.811896],[-113.881352,35.215058],[-113.614524,35.263022]]]}},{"attributes":{"gf_name":"Bravo_045","gf_category":"Bravo"},"geometry":{"rings":[[[-113.372466,34.311458],[-113.327958,33.182361],[-113.578775,33.084994],[-113.665880,33.089569],[-114.203561,33.611652],[-114.000237,34.215944],[-113.372466,34.311458]]]}},{"attributes":{"gf_name":"Bravo_046","gf_category":"Bravo"},"geometry":{"rings":[[[-112.273275,35.883587],[-111.846116,35.154557],[-112.309934,34.738007],[-112.785387,34.665098],[-113.059816,34.730975],[-113.157328,34.983246],[-112.866662,35.823773],[-112.273275,35.883587]]]}},{"attributes":{"gf_name":"Bravo_047","gf_category":"Bravo"},"geometry":{"rings":[[[-111.265839,37.278314],[-111.175263,37.246313],[-110.752701,36.701053],[-110.999562,36.098601],[-111.999148,36.398822],[-112.165492,36.820693],[-111.896582,37.016041],[-111.265839,37.278314]]]}},{"attributes":{"gf_name":"Charlie_004","gf_category":"Charlie"},"geometry":{"rings":[[[-111.990476,39.623930],[-111.672383,39.412941],[-111.919637,38.212856],[-111.999787,38.197742],[-112.565550,38.183798],[-112.769221,38.858292],[-112.622809,39.071739],[-111.990476,39.623930]]]}},{"attributes":{"gf_name":"Charlie_005","gf_category":"Charlie"},"geometry":{"rings":[[[-112.811170,38.874945],[-112.769221,38.858292],[-112.565550,38.183798],[-113.120292,37.481661],[-113.857748,37.760542],[-113.852376,3
7.928864],[-113.462539,38.716036],[-112.811170,38.874945]]]}},{"attributes":{"gf_name":"Charlie_007","gf_category":"Charlie"},"geometry":{"rings":[[[-111.919637,38.212856],[-111.687861,38.152417],[-111.265839,37.278314],[-111.896582,37.016041],[-111.999787,38.197742],[-111.919637,38.212856]]]}},{"attributes":{"gf_name":"Charlie_008","gf_category":"Charlie"},"geometry":{"rings":[[[-111.315692,30.965117],[-111.259674,30.945511],[-110.796944,30.198316],[-110.956477,29.722505],[-112.072384,29.643578],[-112.201103,29.892425],[-112.208051,30.141331],[-111.315692,30.965117]]]}},{"attributes":{"gf_name":"Charlie_009","gf_category":"Charlie"},"geometry":{"rings":[[[-110.141501,31.918549],[-109.185745,31.877346],[-108.663232,31.118540],[-108.398365,30.600397],[-108.351497,29.968889],[-109.843830,30.505716],[-110.403977,31.052692],[-110.141501,31.918549]]]}},{"attributes":{"gf_name":"Charlie_012","gf_category":"Charlie"},"geometry":{"rings":[[[-110.403977,31.052692],[-109.843830,30.505716],[-110.466160,30.249463],[-110.796944,30.198316],[-111.259674,30.945511],[-110.403977,31.052692]]]}},{"attributes":{"gf_name":"Charlie_016","gf_category":"Charlie"},"geometry":{"rings":[[[-111.251511,38.338078],[-110.907121,38.232546],[-110.744795,37.758057],[-111.175263,37.246313],[-111.265839,37.278314],[-111.687861,38.152417],[-111.251511,38.338078]]]}},{"attributes":{"gf_name":"Charlie_017","gf_category":"Charlie"},"geometry":{"rings":[[[-110.848814,35.410481],[-110.395022,35.218355],[-110.629724,33.983079],[-110.978438,33.833734],[-111.317819,34.348952],[-111.227231,35.202718],[-110.848814,35.410481]]]}},{"attributes":{"gf_name":"Charlie_018","gf_category":"Charlie"},"geometry":{"rings":[[[-108.957760,35.823204],[-108.582120,34.448969],[-108.695642,34.413530],[-109.616894,35.160616],[-109.861048,35.391741],[-109.580750,35.671064],[-108.957760,35.823204]]]}},{"attributes":{"gf_name":"Charlie_019","gf_category":"Charlie"},"geometry":{"rings":[[[-109.616894,35.160616],[-108.695642,34.413530],[-109.209776,34.009453],[-109.616894,35.160616]]]}},{"attributes":{"gf_name":"Charlie_020","gf_category":"Charlie"},"geometry":{"rings":[[[-109.580423,33.485896],[-109.019515,32.893275],[-109.185745,31.877346],[-110.141501,31.918549],[-110.824649,32.494792],[-109.600599,33.481973],[-109.580423,33.485896]]]}},{"attributes":{"gf_name":"Charlie_026","gf_category":"Charlie"},"geometry":{"rings":[[[-107.448109,32.465305],[-107.543777,31.824440],[-108.398365,30.600397],[-108.663232,31.118540],[-108.172089,32.338211],[-107.448109,32.465305]]]}},{"attributes":{"gf_name":"Charlie_027","gf_category":"Charlie"},"geometry":{"rings":[[[-107.000647,33.131979],[-106.927124,33.119847],[-105.707650,32.307470],[-105.776106,31.834468],[-105.924658,31.432486],[-106.583294,31.970314],[-107.000647,33.131979]]]}},{"attributes":{"gf_name":"Charlie_028","gf_category":"Charlie"},"geometry":{"rings":[[[-106.583294,31.970314],[-105.924658,31.432486],[-105.907446,31.138703],[-106.150128,30.782825],[-107.095091,31.764133],[-106.583294,31.970314]]]}},{"attributes":{"gf_name":"Charlie_029","gf_category":"Charlie"},"geometry":{"rings":[[[-107.359704,33.554365],[-107.142280,33.201815],[-107.448109,32.465305],[-108.172089,32.338211],[-108.685156,32.962142],[-107.359704,33.554365]]]}},{"attributes":{"gf_name":"Charlie_030","gf_category":"Charlie"},"geometry":{"rings":[[[-105.994738,33.337477],[-105.436816,32.627620],[-105.422219,32.568667],[-105.707650,32.307470],[-106.927124,33.119847],[-105.994738,33.337477]]]}},{"attributes":{"gf_name":"Charlie_034","gf_catego
ry":"Charlie"},"geometry":{"rings":[[[-105.313424,33.867144],[-105.436816,32.627620],[-105.994738,33.337477],[-105.313424,33.867144]]]}},{"attributes":{"gf_name":"Charlie_036","gf_category":"Charlie"},"geometry":{"rings":[[[-110.055886,36.915457],[-110.000338,36.888631],[-109.580750,35.671064],[-109.861048,35.391741],[-110.224385,35.222138],[-110.395022,35.218355],[-110.848814,35.410481],[-110.999562,36.098601],[-110.752701,36.701053],[-110.055886,36.915457]]]}},{"attributes":{"gf_name":"Charlie_039","gf_category":"Charlie"},"geometry":{"rings":[[[-109.580447,36.961026],[-108.825944,35.920920],[-108.957760,35.823204],[-109.580750,35.671064],[-110.000338,36.888631],[-109.580447,36.961026]]]}},{"attributes":{"gf_name":"Charlie_040","gf_category":"Charlie"},"geometry":{"rings":[[[-108.381504,36.137964],[-106.993809,34.882178],[-107.053820,34.603020],[-107.376441,34.146431],[-108.254492,34.413094],[-108.396412,36.127488],[-108.381504,36.137964]]]}},{"attributes":{"gf_name":"Charlie_043","gf_category":"Charlie"},"geometry":{"rings":[[[-109.532017,37.942050],[-109.235501,37.903205],[-109.059044,37.837445],[-108.846176,37.328161],[-109.580447,36.961026],[-110.000338,36.888631],[-110.055886,36.915457],[-110.093643,37.604001],[-109.532017,37.942050]]]}},{"attributes":{"gf_name":"Charlie_045","gf_category":"Charlie"},"geometry":{"rings":[[[-110.295214,38.567151],[-110.205307,37.706183],[-110.744795,37.758057],[-110.907121,38.232546],[-110.295214,38.567151]]]}},{"attributes":{"gf_name":"Charlie_046","gf_category":"Charlie"},"geometry":{"rings":[[[-111.672383,39.412941],[-111.040607,39.348197],[-111.251511,38.338078],[-111.687861,38.152417],[-111.919637,38.212856],[-111.672383,39.412941]]]}},{"attributes":{"gf_name":"Charlie_047","gf_category":"Charlie"},"geometry":{"rings":[[[-108.586684,38.599815],[-107.751418,38.385245],[-107.882355,37.022654],[-108.096641,36.914409],[-108.846176,37.328161],[-109.059044,37.837445],[-108.586684,38.599815]]]}},{"attributes":{"gf_name":"Charlie_049","gf_category":"Charlie"},"geometry":{"rings":[[[-110.089960,38.906828],[-109.532017,37.942050],[-110.093643,37.604001],[-110.205307,37.706183],[-110.295214,38.567151],[-110.089960,38.906828]]]}},{"attributes":{"gf_name":"Charlie_050","gf_category":"Charlie"},"geometry":{"rings":[[[-109.171210,39.584714],[-108.586684,38.599815],[-109.059044,37.837445],[-109.235501,37.903205],[-109.527535,39.304970],[-109.171210,39.584714]]]}}]}.
2019-11-07T18:06:56,895 Executing request GET /arcgis/rest/services/MyGeofences/FeatureServer/0/query?f=json.....&where=1%3D1&outFields=gf_name%2Cgf_category&outSR=4326&returnIdsOnly=true HTTP/1.1
2019-11-07T18:06:56,906 Got response from HTTP request: {"objectIdFieldName":"objectid","objectIds":[3,5,6,8,10,11,13,14,18,21,22,23,24,26,28,30,33,34,36,38,39,41,42,43,45,46,49,50,53,54,55,57,59,62,63,64,66,68,69,70,71,72,78,79,85,87,89,90,91,93,94,95,96,97,104,105,107,108,109,112,116,117,118,119,120,126,127,128,129,130,134,136,139,140,143,145,146,147,149,150]}.
2019-11-07T18:06:57,294 Executing request GET /arcgis/rest/services/MyGeofences/FeatureServer/0/query?f=json.....&where=1%3D1&outFields=objectid%2Cgf_name%2Cgf_category&returnGeometry=false HTTP/1.1
2019-11-07T18:06:57,305 Got response from HTTP request: {"objectIdFieldName":"objectid","globalIdFieldName":"","geometryType":"esriGeometryPolygon","spatialReference":{"wkid":4326,"latestWkid":4326},"fields":[{"name":"objectid","alias":"OBJECTID","type":"esriFieldTypeOID"},{"name":"gf_name","alias":"gf_name","type":"esriFieldTypeString","length":50},{"name":"gf_category","alias":"gf_category","type":"esriFieldTypeString","length":50}],"features":[{"attributes":{"objectid":3,"gf_name":"Alpha_003","gf_category":"Alpha"}},{"attributes":{"objectid":5,"gf_name":"Alpha_005","gf_category":"Alpha"}},{"attributes":{"objectid":6,"gf_name":"Alpha_006","gf_category":"Alpha"}},{"attributes":{"objectid":8,"gf_name":"Alpha_008","gf_category":"Alpha"}},{"attributes":{"objectid":10,"gf_name":"Alpha_010","gf_category":"Alpha"}},{"attributes":{"objectid":11,"gf_name":"Alpha_011","gf_category":"Alpha"}},{"attributes":{"objectid":13,"gf_name":"Alpha_013","gf_category":"Alpha"}},{"attributes":{"objectid":14,"gf_name":"Alpha_014","gf_category":"Alpha"}},{"attributes":{"objectid":18,"gf_name":"Alpha_018","gf_category":"Alpha"}},{"attributes":{"objectid":21,"gf_name":"Alpha_021","gf_category":"Alpha"}},{"attributes":{"objectid":22,"gf_name":"Alpha_022","gf_category":"Alpha"}},{"attributes":{"objectid":23,"gf_name":"Alpha_023","gf_category":"Alpha"}},{"attributes":{"objectid":24,"gf_name":"Alpha_024","gf_category":"Alpha"}},{"attributes":{"objectid":26,"gf_name":"Alpha_026","gf_category":"Alpha"}},{"attributes":{"objectid":28,"gf_name":"Alpha_028","gf_category":"Alpha"}},{"attributes":{"objectid":30,"gf_name":"Alpha_030","gf_category":"Alpha"}},{"attributes":{"objectid":33,"gf_name":"Alpha_033","gf_category":"Alpha"}},{"attributes":{"objectid":34,"gf_name":"Alpha_034","gf_category":"Alpha"}},{"attributes":{"objectid":36,"gf_name":"Alpha_036","gf_category":"Alpha"}},{"attributes":{"objectid":38,"gf_name":"Alpha_038","gf_category":"Alpha"}},{"attributes":{"objectid":39,"gf_name":"Alpha_039","gf_category":"Alpha"}},{"attributes":{"objectid":41,"gf_name":"Alpha_041","gf_category":"Alpha"}},{"attributes":{"objectid":42,"gf_name":"Alpha_042","gf_category":"Alpha"}},{"attributes":{"objectid":43,"gf_name":"Alpha_043","gf_category":"Alpha"}},{"attributes":{"objectid":45,"gf_name":"Alpha_045","gf_category":"Alpha"}},{"attributes":{"objectid":46,"gf_name":"Alpha_046","gf_category":"Alpha"}},{"attributes":{"objectid":49,"gf_name":"Alpha_049","gf_category":"Alpha"}},{"attributes":{"objectid":50,"gf_name":"Alpha_050","gf_category":"Alpha"}},{"attributes":{"objectid":53,"gf_name":"Bravo_003","gf_category":"Bravo"}},{"attributes":{"objectid":54,"gf_name":"Bravo_004","gf_category":"Bravo"}},{"attributes":{"objectid":55,"gf_name":"Bravo_005","gf_category":"Bravo"}},{"attributes":{"objectid":57,"gf_name":"Bravo_007","gf_category":"Bravo"}},{"attributes":{"objectid":59,"gf_name":"Bravo_009","gf_category":"Bravo"}},{"attributes":{"objectid":62,"gf_name":"Bravo_012","gf_category":"Bravo"}},{"attributes":{"objectid":63,"gf_name":"Bravo_013","gf_category":"Bravo"}},{"attributes":{"objectid":64,"gf_name":"Bravo_014","gf_category":"Bravo"}},{"attributes":{"objectid":66,"gf_name":"Bravo_016","gf_category":"Bravo"}},{"attributes":{"objectid":68,"gf_name":"Bravo_018","gf_category":"Bravo"}},{"attributes":{"objectid":69,"gf_name":"Bravo_019","gf_category":"Bravo"}},{"attributes":{"objectid":70,"gf_name":"Bravo_020","gf_category":"Bravo"}},{"attributes":{"objectid":71,"gf_name":"Bravo_021","gf_category":"Bravo"}},{"attributes":{"objectid":72,"gf_
name":"Bravo_022","gf_category":"Bravo"}},{"attributes":{"objectid":78,"gf_name":"Bravo_028","gf_category":"Bravo"}},{"attributes":{"objectid":79,"gf_name":"Bravo_029","gf_category":"Bravo"}},{"attributes":{"objectid":85,"gf_name":"Bravo_035","gf_category":"Bravo"}},{"attributes":{"objectid":87,"gf_name":"Bravo_037","gf_category":"Bravo"}},{"attributes":{"objectid":89,"gf_name":"Bravo_039","gf_category":"Bravo"}},{"attributes":{"objectid":90,"gf_name":"Bravo_040","gf_category":"Bravo"}},{"attributes":{"objectid":91,"gf_name":"Bravo_041","gf_category":"Bravo"}},{"attributes":{"objectid":93,"gf_name":"Bravo_043","gf_category":"Bravo"}},{"attributes":{"objectid":94,"gf_name":"Bravo_044","gf_category":"Bravo"}},{"attributes":{"objectid":95,"gf_name":"Bravo_045","gf_category":"Bravo"}},{"attributes":{"objectid":96,"gf_name":"Bravo_046","gf_category":"Bravo"}},{"attributes":{"objectid":97,"gf_name":"Bravo_047","gf_category":"Bravo"}},{"attributes":{"objectid":104,"gf_name":"Charlie_004","gf_category":"Charlie"}},{"attributes":{"objectid":105,"gf_name":"Charlie_005","gf_category":"Charlie"}},{"attributes":{"objectid":107,"gf_name":"Charlie_007","gf_category":"Charlie"}},{"attributes":{"objectid":108,"gf_name":"Charlie_008","gf_category":"Charlie"}},{"attributes":{"objectid":109,"gf_name":"Charlie_009","gf_category":"Charlie"}},{"attributes":{"objectid":112,"gf_name":"Charlie_012","gf_category":"Charlie"}},{"attributes":{"objectid":116,"gf_name":"Charlie_016","gf_category":"Charlie"}},{"attributes":{"objectid":117,"gf_name":"Charlie_017","gf_category":"Charlie"}},{"attributes":{"objectid":118,"gf_name":"Charlie_018","gf_category":"Charlie"}},{"attributes":{"objectid":119,"gf_name":"Charlie_019","gf_category":"Charlie"}},{"attributes":{"objectid":120,"gf_name":"Charlie_020","gf_category":"Charlie"}},{"attributes":{"objectid":126,"gf_name":"Charlie_026","gf_category":"Charlie"}},{"attributes":{"objectid":127,"gf_name":"Charlie_027","gf_category":"Charlie"}},{"attributes":{"objectid":128,"gf_name":"Charlie_028","gf_category":"Charlie"}},{"attributes":{"objectid":129,"gf_name":"Charlie_029","gf_category":"Charlie"}},{"attributes":{"objectid":130,"gf_name":"Charlie_030","gf_category":"Charlie"}},{"attributes":{"objectid":134,"gf_name":"Charlie_034","gf_category":"Charlie"}},{"attributes":{"objectid":136,"gf_name":"Charlie_036","gf_category":"Charlie"}},{"attributes":{"objectid":139,"gf_name":"Charlie_039","gf_category":"Charlie"}},{"attributes":{"objectid":140,"gf_name":"Charlie_040","gf_category":"Charlie"}},{"attributes":{"objectid":143,"gf_name":"Charlie_043","gf_category":"Charlie"}},{"attributes":{"objectid":145,"gf_name":"Charlie_045","gf_category":"Charlie"}},{"attributes":{"objectid":146,"gf_name":"Charlie_046","gf_category":"Charlie"}},{"attributes":{"objectid":147,"gf_name":"Charlie_047","gf_category":"Charlie"}},{"attributes":{"objectid":149,"gf_name":"Charlie_049","gf_category":"Charlie"}},{"attributes":{"objectid":150,"gf_name":"Charlie_050","gf_category":"Charlie"}}]}.
2019-11-07T18:06:57,563 Executing request GET /arcgis/rest/services/MyGeofences/FeatureServer/0/query?f=json.....&where=1%3D1&outFields=objectid%2Cgf_name%2Cgf_category&returnGeometry=false&returnIdsOnly=true HTTP/1.1
2019-11-07T18:06:57,573 Got response from HTTP request: {"objectIdFieldName":"objectid","objectIds":[3,5,6,8,10,11,13,14,18,21,22,23,24,26,28,30,33,34,36,38,39,41,42,43,45,46,49,50,53,54,55,57,59,62,63,64,66,68,69,70,71,72,78,79,85,87,89,90,91,93,94,95,96,97,104,105,107,108,109,112,116,117,118,119,120,126,127,128,129,130,134,136,139,140,143,145,146,147,149,150]}.
2019-11-07T18:07:00,360 Executing request POST /arcgis/admin/machines/localhost/status HTTP/1.1
2019-11-07T18:07:00,414 Got response from HTTP request: <html lang="en">
2019-11-07T18:07:03,673 Executing request POST /arcgis/admin/system/configstore HTTP/1.1
2019-11-07T18:07:03,688 Got response from HTTP request: <html lang="en">

There is quite a bit of JSON data embedded in the results above, which can be helpful in identifying exactly what a feature service returns to a client when the client queries the service. The timestamps also help if you need to return to the full karaf.log and look for messages logged just before or just after a line matching the command's search patterns, to see if there is additional information not captured by the command which might help debug an issue.

The timestamps on each logged message also provide empirical evidence of exactly how long it takes to get a response back from the feature service each time an HTTP request is made. Computing a delta between the date/time a request is logged and the date/time of its response can be valuable if you suspect latency introduced by geofence synchronization is causing a problem (a short script which computes these deltas from the extracted lines is included at the end of this post). Remember, nothing happens in zero time, and frequent queries every few seconds against a large feature record set can impact overall GeoEvent Server operations. Also, keep in mind that a feature service may be configured to return a maximum number of feature records for any given query. GeoEvent Server may have to make several queries to page through a complete feature record set when there are, for example, more than 1,000 feature records being imported to update geofences.

The techniques I have described provide a way to delve deeply into geofence synchronization and examine the REST requests and responses exchanged with a feature service. You can use these techniques to obtain information on request latency as well as implementation details such as how GeoEvent Server pages through large feature record sets or how a feature service handles a series of queries. I have attached a PDF illustration of the above two dozen formatted log messages with additional formatting I applied manually to make the JSON in each logged message easy to read. I hope you find the combination of debug logging with scripted text extraction and string formatting a helpful debugging technique. – RJ
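Here is that delta-computing script. It is a minimal sketch which assumes the lines produced by the tail | awk | sed command were redirected into a file named extracted.log, and that each "Executing request" line is followed by its matching "Got response" line, as in the output shown above.

from datetime import datetime

def parse(line):
    # Each extracted line begins with the karaf.log timestamp, e.g. 2019-11-07T18:06:56,608
    stamp, _, message = line.strip().partition(" ")
    return datetime.strptime(stamp, "%Y-%m-%dT%H:%M:%S,%f"), message.lstrip()

pending = None
with open("extracted.log") as log:
    for line in log:
        when, message = parse(line)
        if message.startswith("Executing request"):
            pending = (when, message)
        elif message.startswith("Got response") and pending:
            sent, request = pending
            delta_ms = (when - sent).total_seconds() * 1000
            print(f"{delta_ms:6.0f} ms  {request}")
            pending = None

Run against the two dozen lines above, this would show, for example, that the first geofence query (logged at 18:06:56,608) received its response roughly 27 milliseconds later.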
11-08-2019 05:22 PM

POST
Hello Shengrong Liu – Stefan P. Jung is correct. There is an issue against the FILE inbound transport both to support multi-byte character sets and to provide a better error message than suggesting that the problem is with the UTF-8 character encoding. The suggested workaround is to have the inbound transport treat the file as non-text (setting 'Is File Text' to 'False') ... or to use a different transport like HTTP.

You indicated that the input's event count increases when the file is read as non-text ... but the In/Out event counts of the GeoEvent Service do not change. I would try copying the input to create a new instance and then incorporating that new instance in a new GeoEvent Service with no processors or filters, routing the inbound file's content directly to a 'Write to a JSON File' output. What I'm looking for here is whether there is a problem with the existing input instance such that an underlying Kafka topic / consumer is not recognizing that an inbound connector has successfully received and adapted event data. A GeoEvent Service should at least consume data adapted by one of its inputs, so the 'In' event count should increase even if there's some other problem such that no event records actually pass 'Out' from the GeoEvent Service.

For your last question, on how an input's Expected Date Format parameter works and what a "date" will be converted to – when a GeoEvent Definition specifies that a field is a Date, an inbound adapter will attempt to use the pattern in the Expected Date Format parameter to interpret a string and create a Date. In this context a Date is a Java data type like Long, String, or Double. You might want to take a look at a blog I published: https://community.esri.com/community/gis/enterprise-gis/geoevent/blog/2019/03/14/what-time-is-it-well-that-depends?sr=search&searchId=d2d0b578-09b4-44ae-9fc1-1f7872307741&searchIndex=0

A date can be represented several different ways.
As a verbose string ... "Thursday, August 22, 2019 3:30:00 AM (GMT)"
As an ISO 8601 value ... "2019-08-21T20:30:00-07:00"
As a long integer in epoch milliseconds ... 1566444600000
All three representations above are the exact same date/time.

When using a 'Write to a CSV File' output the default is to represent the time as an ISO 8601 formatted string. The +/- 0:00 at the end of the string tells you how many hours the value has been offset from UTC. When adding or updating feature records in a geodatabase like the spatiotemporal big data store, Date values are stored as epoch long integer values in milliseconds. When you click a feature in a web map it is up to the client how to represent the date value.

You should be aware that when an Expected Date Format pattern is specified it is likely that the date value will be considered a local date/time value. This also likely means that a web application, when it queries the date value from the geodatabase, will assume that it is receiving a UTC value and offset the value for you to a local date/time before representing it as a string. I mention this so that you will look for the possibility that a date you read from an input file is not represented as the expected date value once it is (a) written out to a feature record and (b) queried by a client and converted to a string for display in a pop-up dialog.

Hope this information is helpful – RJ
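If you want to convince yourself those three representations describe the same instant, Python's standard library makes the round trip easy. This is only a sketch to illustrate the conversions; it has nothing to do with GeoEvent Server's internal date handling.

from datetime import datetime, timezone, timedelta

epoch_ms = 1566444600000  # the epoch milliseconds value from the examples above

# Epoch milliseconds -> an aware datetime in UTC (GMT)
utc = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
print(utc.strftime("%A, %B %d, %Y %I:%M:%S %p (GMT)"))  # Thursday, August 22, 2019 03:30:00 AM (GMT)

# The same instant expressed with a -07:00 offset, formatted as ISO 8601
local = utc.astimezone(timezone(timedelta(hours=-7)))
print(local.isoformat())                                 # 2019-08-21T20:30:00-07:00

# ... and back to epoch milliseconds
print(int(local.timestamp() * 1000))                     # 1566444600000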
08-20-2019 08:30 PM

POST
Hello Nicholas – A few things have changed since I replied to Thibaut back in August 2015. From the GeoEvent Server side, it is important to know that a Field Calculator processor's expressions support a number of different string functions. From the ArcGIS Server side, the ArcGIS REST Services API changed at the 10.7 release to offer clients the ability to specify whether or not to roll back a transaction if a failure is encountered. GeoEvent Server has not yet integrated the changes made to the map/feature services REST API. That means that at the latest releases (10.6.x and 10.7.x) you will not be able to configure an output in GeoEvent Manager to specify whether a transaction should roll back when restrictions enforced by the feature service (such as string length) are not satisfied.

I also want to clarify what I meant when I said "use GeoEvent filters to screen for event attributes which do not satisfy your feature service's constraints". On the one hand, you could use a filter to discard event records whose eventId attribute is null. This would allow you to catch and discard event records whose undefined event identifier would otherwise fail to satisfy a nullable: false restriction enforced by a feature service. You cannot configure a filter with an expression such as length(eventId), however, so it's not possible to configure a filter to discard event records whose event identifier string is "too long" to be used when adding or updating a feature record.

What you can do is configure a Field Calculator to trim the string value to a compliant length or remove portions of a string you know make the string too long. The simpler approach is discussed in the Introduction to GeoEvent Server tutorial, Module 4, pages 10-12. The Field Calculator (Regular Expression) processor can be configured with a regular expression which applies a quantifier (or match count) to a portion of a pattern. The example from the tutorial anchors the pattern match to the beginning of the string in the Description attribute and matches zero up to thirty-two single characters. The matching sub-string is then written back into Description, effectively trimming the string's length.

Your other option is to use a regular Field Calculator, which supports a variety of string functions discussed on that processor's page in the on-line help. For example, say that you wanted to eliminate all of the text between two delimiting dashes to simplify a description. Given the string:

Flight SWA2607 - Departed 17:20 hours - OnTime

You could use an expression:

replaceAll(eventid, '^(.*)([ ][-][ ].*[ ][-][ ])(.*)', '$1 - $3')

to effectively cut out the departure time and rewrite just the first and third parts of the string. The regular expression pattern in that second example uses rounded parentheses to identify three groups, with two sets of three square-bracketed character classes identifying single characters ... a space followed by a dash followed by another space. Each .* in the pattern matches zero or more characters, so we effectively parse out the flight number and its status (e.g. "OnTime") and throw away all the stuff in the middle of the string.

Hope this information helps – RJ
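If you would like to sanity-check those expressions outside GeoEvent Server, the same patterns can be exercised with a few lines of Python. Note this is only a sketch: Python uses \1 rather than $1 for backreferences, and the ^.{0,32} pattern is my own stand-in for the "zero up to thirty-two characters" example from the tutorial.

import re

description = 'Flight SWA2607 - Departed 17:20 hours - OnTime'

# Equivalent of the replaceAll() expression above
print(re.sub(r'^(.*)([ ][-][ ].*[ ][-][ ])(.*)', r'\1 - \3', description))
# -> Flight SWA2607 - OnTime

# Trim the string to a compliant length: keep at most the first 32 characters
print(re.match(r'^.{0,32}', description).group(0))
# -> 'Flight SWA2607 - Departed 17:20 '  (the first 32 characters)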
08-20-2019 01:26 PM

POST
Did you perhaps find an answer to your original question on how to BOLD text in your notification message, and re-post to ask about formatting decimal values in more user-friendly formats rather than scientific notation? GeoNet sent me a message about the former, but looking at the thread this morning, I'm seeing the latter.

In any case, for the question GeoNet is showing now: if the data you are receiving is a high precision value such that Java wants to use scientific notation to represent the value, I think you're stuck with the scientific notation. It's not a question of how to use the MessageFormatter (the adapter used to format an event record prior to sending an email or SMS text message request to an SMTP server) to reformat a Double value 123456789000 to represent it as 123,456,789,000 rather than 1.23456789E11. The output's adapter does not support string formatting or data type conversion. The only formatting it offers is variable substitution ( e.g. ${VehicleID} ) and HTML / CSS formatting like what I've shown above.

As you've discovered, using a Field Calculator expression such as toString(myValue) to try and convert a Double value to a String simply represents 1.23456789E11 as the string literal "1.23456789E11". Unfortunately the Field Calculator does not support Java code such as String.format("%.0f", myValue) or java.text.DecimalFormat.format(myValue). The processor is only able to interpret a selection of string and mathematical functions for which wrappers have been developed as part of the processor's implementation.

Do you need to handle the cost value as a Double at any point in the GeoEvent Service? You might consider configuring the GeoEvent Definition used by your input to specify that costValue be treated as a String, even though the data is arriving as a long integer value. You could use the attribute costValue in your e-mail notification message and (if necessary) use a Field Mapper to explicitly cast the String to a Double by mapping costValue into an event attribute field costValueAsDouble whose data type is Double. Then you could use costValueAsDouble to perform any calculations you needed. But once cast to a Double I don't think you'll be able to revert a high precision value back to a String in order to avoid the data being represented in scientific notation. Also, using the input's inbound adapter to handle costValue as a String in the first place isn't going to format the string using commas to separate out the thousands portion of the value; it's just going to give you a really long string of digits. – RJ
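For what it's worth, the kind of formatting being asked about is straightforward in a general-purpose language. The sketch below is Python, not anything the Field Calculator or the outbound adapter can evaluate; it only illustrates the formatting GeoEvent Server does not expose.

value = 123456789000.0            # the Double value from the example above

print(value)                      # 123456789000.0 (Java would render this as 1.23456789E11)
print(f"{value:.0f}")             # 123456789000    - equivalent of String.format("%.0f", value)
print(f"{value:,.0f}")            # 123,456,789,000 - with thousands separators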
08-02-2019 11:54 AM

POST
GeoNet sent me a message that you were having some trouble using basic HTML tags in the Message Body to simply display certain fields in BOLD. The following worked for me (entered as a single line of text):

<span style='font-weight:600;'>Vehicle ID:</span> ${VehicleID}<br/> <span style='font-weight:600;'>LastUpdated:</span> ${LastUpdated}

The following also worked for me, using HTML tags rather than CSS properties:

<strong>Vehicle ID:</strong> ${VehicleID}<br/><strong>LastUpdated:</strong> ${LastUpdated}

– RJ
08-02-2019 11:53 AM

IDEA
Hello Minbin – We have something planned as part of enhancements being developed for the GeoEvent Manager web application which I think you'll like. The upcoming 10.8 release will include a "GeoEvent Sampler" as part of the service designer which will enable you to click on any node – input, processor, filter, or output – in a GeoEvent Service and "refresh" a cache to see a sample of the event records. Inputs, of course, would only show you a sample of the event records they had successfully ingested and adapted to create GeoEvents (e.g. event records for processing). But you could select a filter or processor and refresh the event cache to see what came into the processor (or filter) and what came out. You will be able to toggle this "GeoEvent Sampler" on/off to collect a sample of event records to see what an input is producing, what a processor or filter is receiving and producing, and what an output is receiving to process for dissemination. Again, this is actively being developed for the 10.8 release, so specific details are still being worked out. – RJ
08-01-2019 11:16 AM

POST
Hello Hossein – Every event record received by an inbound connector is generally atomic, by which I mean the adapter/transport used to implement the connector has no knowledge of event records previously received or of event records about to be received. The same goes for the nodes (e.g. the processors and filters) in a GeoEvent Service.

There are some exceptions, each with its own limitations. A filter configured with the spatial operation ENTER, for example, needs to know whether the geometry of an event with a previously observed TRACK_ID was outside or disjoint so that it knows the current event record's geometry, which is now inside, should evaluate as having entered the polygon used to model the geofence. Likewise, a Track Gap Detector processor needs to maintain a cache of TRACK_ID values it has previously observed so that it knows when a report from a given asset is expected but has not been received. There are other examples, but I won't go into those here. What you need to take away from this is that GeoEvent Server - fundamentally - does not hold onto or cache the data it is processing. Doing so runs counter to the objective of ingesting, processing, and disseminating data as quickly as possible at the highest possible volume.

Since you stipulated you only want to notify when the geometry from a new tracked asset, as identified by a unique TRACK_ID, has entered an area, I'm assuming that if a tracked asset leaves and later re-enters the area you do not particularly care. The best way I can think of, then, to (a) notify that an event record's geometry intersects or is "inside" a geofence and (b) not re-notify that the event record is still inside the area, or has later re-entered the area, is to save any event record you've determined intersects your geofence as a feature record. The feature record's schema will need to have an additional field, hasBeenNotified, whose value is initially set to, say, zero.

Part of the trick to making this work is using a partial structure, or schema, to update feature records (see: https://community.esri.com/community/gis/enterprise-gis/geoevent/blog/2018/10/03/using-a-partial-structure-or-schema-to-update-feature-records?sr=search&searchId=52457f48-aa43-4b7c-b83f-c070908bf928&searchIndex=6). If you deliberately omit the hasBeenNotified attribute from the GeoEvent Definition used to add/update feature records as real-time data is received from a sensor, and configure the feature service to apply a default value for hasBeenNotified when no value is specified, then only new feature records will have the hasBeenNotified attribute assigned a value of zero. It's important to recognize that this zero value is set as a default by the ArcGIS Server feature service, not by GeoEvent Server. Event records you receive from the real-time sensor are primarily driving updates into existing feature records, keyed on the unique TRACK_ID, for values such as date/time reported and location, and are not touching the hasBeenNotified attribute value. You are relying on a partial GeoEvent Definition to leave the hasBeenNotified attribute value of existing feature records unmodified.

Then, when your other input polls the feature service for feature records whose hasBeenNotified attribute value is zero, it uses a different GeoEvent Definition – one that does include the hasBeenNotified attribute – along with a Field Calculator to update the hasBeenNotified attribute of each feature record it just polled to the value '1'. The GeoEvent Service you configure for notification, then, is not only pushing out notifications; it is recording the fact that a notification has been made into the feature record set from which it collects feature records that require notification.

Hope this information helps – RJ
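To make the polling side of this concrete, here is a minimal Python sketch of the two requests involved: querying the feature service for records whose hasBeenNotified value is still zero, then flagging those records so they are not picked up again. The layer URL and field names are placeholders, and in the workflow described above GeoEvent Server's polling input and a Field Calculator processor would be doing this work for you; the sketch only illustrates the feature service interaction.

    import json
    import requests

    # Placeholder feature layer URL and field names, for illustration only.
    LAYER_URL = "https://gis.example.com/arcgis/rest/services/TrackedAssets/FeatureServer/0"

    # 1. Query for feature records that have not yet triggered a notification
    #    (the query the polling input would be configured to run).
    response = requests.get(
        f"{LAYER_URL}/query",
        params={
            "where": "hasBeenNotified = 0",
            "outFields": "OBJECTID,TRACK_ID",
            "returnGeometry": "false",
            "f": "json",
        },
    )
    features = response.json().get("features", [])
    print(f"{len(features)} feature record(s) still require notification")

    # 2. Flag the records just collected so the next poll ignores them
    #    (the role the Field Calculator plays in the GeoEvent Service).
    updates = [
        {"attributes": {"OBJECTID": f["attributes"]["OBJECTID"], "hasBeenNotified": 1}}
        for f in features
    ]
    if updates:
        edit = requests.post(
            f"{LAYER_URL}/applyEdits",
            data={"updates": json.dumps(updates), "f": "json"},
        )
        print(edit.json())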
Posted 07-25-2019 03:41 PM
Hello Hossein – I think the problem you're going to have with data like what you've illustrated will be similar to the challenge covered in the thread Streaming OpenSKY JSON problem. The following JSON, modeled from your original sample, organizes its data as nested arrays:

{
"time": 1564084010,
"states": [
["ac96b8", -90.34112, 39.6066, 1072.83],
["ae1fa3", -105.3463, 38.4699, 2026.92],
["bg3fm1", -115.8164, 37.1992, 1921.61],
["cx7ka0", -95.09134, 36.3191, 2235.56]
]
}

Neither array has a key which can be used to access a name/value pair. GeoEvent Server cannot really construct a GeoEvent Definition for this JSON because GeoEvent Definitions are structures in which every value has an attribute name. If you allow a Receive JSON on a REST Endpoint inbound connector to create a GeoEvent Definition for you, given the above data structure, the event definition produced will only have one attribute, time, because that is the only named key/value pair the inbound adapter is able to parse completely. Either every interior array will have to have a name, requiring that each be enclosed within an object:

{
"time": 1564084010,
"states": [
{"1": ["ac96b8", -90.34112, 39.6066, 1072.83]},
{"2": ["ae1fa3", -105.3463, 38.4699, 2026.92]},
{"3": ["bg3fm1", -115.8164, 37.1992, 1921.61]},
{"4": ["cx7ka0", -95.09134, 36.3191, 2235.56]}
]
} ... or the outer array will need to become an object so that the arrays in its collection can be named: {
"time": 1564084010,
"states": {
"1": ["ac96b8", -90.34112, 39.6066, 1072.83],
"2": ["ae1fa3", -105.3463, 38.4699, 2026.92],
"3": ["bg3fm1", -115.8164, 37.1992, 1921.61],
"4": ["cx7ka0", -95.09134, 36.3191, 2235.56]
}
}

If you have sufficient influence over the data provider to make these changes, then you'll be able to specify named values using syntax such as: states[2].3[0], which will access the value "bg3fm1" from the array named "3" in the first example, or states.2[0], which will access the value "ae1fa3" from the array named "2" in the second example.

If you cannot convince the data provider to change their format, you have two options: 1) Create a GeoEvent Definition with two attributes, time and states, specifying that the latter be handled as a String. 2) Develop a bridge whose responsibility is to receive the data in its original nested array structure and re-write the structure in a form that has named values for each array.

The first approach would allow you to use a series of Field Calculator processors with RegEx pattern matching to extract specific sub-string values from the nested array, which is now being handled as one giant string. But I'm assuming that the outer array will contain a variable number of inner arrays, so RegEx pattern matching may be difficult and error prone. I don't know whether it would be easier to configure a number of Field Calculators in this case, or to use the GeoEvent Server Java SDK to write a custom processor capable of pulling data out of the massive data string. (For that matter, maybe you want to develop a custom adapter that knows how to adapt the nested arrays to produce multiple event records ... that's what the geoJSON and Esri Feature JSON inbound adapters have to do, as those formats also include unnamed nested arrays.)

The second approach is probably more difficult up-front, as you have to script a Python parser to re-structure the data, or develop a web application whose JavaScript might make the data re-structuring easier. Either way, once your "bridge" has re-written the data in a format which is more friendly to GeoEvent Server, you can relay the re-formatted data to a GeoEvent Server inbound connector for further processing. Hope this information helps – RJ
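If you do end up writing the "bridge" described in the second approach, the restructuring itself is only a few lines of Python. The sketch below rewrites each unnamed inner array as an object with named values and relays one record per state to a Receive JSON on a REST Endpoint input. The attribute names, receiver URL, and port are assumptions you would adjust for your own input.

    import requests

    # Placeholder URL for a "Receive JSON on a REST Endpoint" input; adjust
    # the host, port, and endpoint name for your own GeoEvent Server input.
    RECEIVER_URL = "https://gis.example.com:6143/geoevent/rest/receiver/flight-states"

    raw = {
        "time": 1564084010,
        "states": [
            ["ac96b8", -90.34112, 39.6066, 1072.83],
            ["ae1fa3", -105.3463, 38.4699, 2026.92],
        ],
    }

    # Re-write each unnamed inner array as an object with named values,
    # carrying the outer 'time' value onto every record. The field names
    # here (id, lon, lat, alt) are invented for the example.
    events = [
        {"time": raw["time"], "id": s[0], "lon": s[1], "lat": s[2], "alt": s[3]}
        for s in raw["states"]
    ]

    # Relay the restructured records to the GeoEvent Server input.
    # verify=False only if the server uses a self-signed certificate.
    resp = requests.post(RECEIVER_URL, json=events, verify=False)
    print(resp.status_code)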
Posted 07-25-2019 02:01 PM
Hello Dave – I am not familiar with an out-of-the-box GeoEvent Server processor that would allow you to configure a query against a feature service and then sort the returned feature record set to identify the record whose distance from an event record's point location is the smallest. That sort of functionality would have to be developed as a custom processor using the GeoEvent Server Java SDK.

The objective, if I understand what you want to do, is to locate the bathymetry line closest to a point (denoted by the blue dot in the illustration below) which represents the point geometry received as part of a tracked asset's location/status event record. You want to enrich the received event record with a "depth" attribute taken from the polyline feature. GeoEvent Server can model a set of geofences using bathymetry polylines, but I would not recommend using a spatial operation such as INTERSECTS or TOUCHES to determine a spatial relationship between a point and a polyline, as there is no way to specify a tolerance. The Esri Java Geometry API used by GeoEvent Server for all spatial relationship tests does not provide a NEAREST spatial relationship operator to allow the determination of the closest bathymetry line to a vessel's reported position, which makes what you want to do difficult using out-of-the-box capabilities. For example, out-of-the-box, you can use a Field Enricher processor to enrich a received event record with attributes and/or geometry from a "related" feature record, but that processor only supports an attribute join for event enrichment. GeoEvent Server does support a spatial join, but only through the GeoTagger processor, which limits the event enrichment to incorporating the name of a spatially relevant geofence.

Eric Ironside suggested an approach in which he used a custom processor to obtain a pixel value from an image service. In his case the image service data was elevation contours, but the pixel could just as easily provide the depth for bathymetry. He developed his custom processor to enrich GPS data that wasn't reporting elevation with the z-values. Gregory Christakos suggested that developing a custom processor to invoke a GP Tool, exposed via a REST endpoint as a GP Service, might be an option. I'm not sure which GP tool you might want to use; Greg suggested ArcGIS Pro's Near tool. GeoAnalytics includes a similar tool in its standard analysis tool set for batch analysis. There's also the Find Nearest tool in the ArcGIS Enterprise portal's standard analysis toolset. Perhaps more important than which tool you want to try to expose as a GP Service is how you would obtain the bathymetry geometry to pass to the tool along with your vessel's real-time location.

I suppose you could use GeoEvent Server to buffer the vessel's reported point location by a discrete distance, then use a GeoTagger to get the "names" of bathymetric polylines you had loaded as geofences which intersect that buffer, then use a Field Enricher to pull the geometries of those polylines into the event record ... but that's a lot of work to do in advance of using a custom processor to invoke an operation, which is where I think you're going to end up. Determining the optimal distance to use when buffering the vessel's location, to try to guarantee intersection with at least one bathymetry line without intersecting multiple bathymetry lines, could be difficult. I think your concerns regarding event volume, efficiency, and leveraging spatial indexes are also justified. The number of spatial relationship queries a custom processor might need to make to determine the "nearest" bathymetry line to a point could devastate overall event throughput.

If you want development help from an engineer in Esri Professional Services, let me know and I can ask someone to reach out to you. – RJ
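For what it's worth, the core of such a custom processor, finding the candidate polyline nearest a point and pulling its depth attribute, is not much code. The sketch below is a minimal illustration that assumes the candidate features have already been retrieved (for example by the buffer-and-GeoTag workflow above), are held as Esri JSON polylines with a hypothetical "depth" attribute, and are in a projected coordinate system so that planar distances are meaningful; the coordinates shown are invented.

    from math import hypot

    def point_segment_distance(px, py, ax, ay, bx, by):
        # Distance from point (px, py) to the segment (ax, ay)-(bx, by).
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return hypot(px - (ax + t * dx), py - (ay + t * dy))

    def distance_to_polyline(px, py, esri_polyline):
        # Minimum distance from a point to an Esri JSON polyline ({"paths": [...]}).
        return min(
            point_segment_distance(px, py, x1, y1, x2, y2)
            for path in esri_polyline["paths"]
            for (x1, y1), (x2, y2) in zip(path, path[1:])
        )

    def nearest_depth(px, py, features):
        # Return the "depth" attribute of the candidate polyline nearest the point.
        nearest = min(features, key=lambda f: distance_to_polyline(px, py, f["geometry"]))
        return nearest["attributes"]["depth"]

    # Two hypothetical bathymetry contours around a vessel reported at (-122.41, 47.62).
    candidates = [
        {"attributes": {"depth": 10}, "geometry": {"paths": [[[-122.45, 47.60], [-122.40, 47.61]]]}},
        {"attributes": {"depth": 50}, "geometry": {"paths": [[[-122.44, 47.66], [-122.38, 47.65]]]}},
    ]
    print(nearest_depth(-122.41, 47.62, candidates))   # -> 10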
Posted 07-24-2019 05:00 PM
Hey Brian ... what you've illustrated above is a problem for GeoEvent Server. The JSON is valid from the point of view of the specification, but by making the data structure so concise the data provider has essentially overloaded an object key/name: the key is dynamic and is also supplying a data value. I cannot think of a way, out-of-the-box, to extract the string value of an event attribute's name and write that value as an attribute value. Field Mapper, for example, wouldn't support this. A custom processor could, but that's a lot of overhead to assume, and I'd only take that approach if there were no other way.

Could you possibly get the data provider to format the data as an array, rather than as nested objects with dynamic key names? What you have is a set of nested objects whose keys carry data; what I'd recommend is an array of objects in which that key becomes a named value (see the sketch following this reply).

If you've no influence over the data provider, you might want to consider writing a Python script (or something similar) to take the data being offered and restructure it into something that GeoEvent Server is able to ingest. Sometimes developing such a Python "bridge" between a data provider and GeoEvent Server to perform some simple data manipulation or clean-up is easier than using the GeoEvent Server Java SDK to develop a custom inbound adapter or processor. - RJ
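As an illustration of the kind of restructuring such a "bridge" would perform, the sketch below flattens a hypothetical payload whose object keys double as asset identifiers into the array form recommended above. Every key and field name here is invented for the example, not taken from Brian's actual data.

    # Hypothetical input: the asset identifier is being used as the object key.
    received = {
        "observations": {
            "asset-001": {"lat": 34.05, "lon": -117.19, "speed": 12.4},
            "asset-002": {"lat": 34.10, "lon": -117.25, "speed": 8.1},
        }
    }

    # Recommended shape: an array of objects in which the former key
    # becomes a named value GeoEvent Server can map to a field.
    restructured = [
        {"assetId": asset_id, **attrs}
        for asset_id, attrs in received["observations"].items()
    ]

    print(restructured)
    # [{'assetId': 'asset-001', 'lat': 34.05, 'lon': -117.19, 'speed': 12.4},
    #  {'assetId': 'asset-002', 'lat': 34.1, 'lon': -117.25, 'speed': 8.1}]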
Posted 07-23-2019 10:34 AM
Hello Emma ... When clicking the "node hierarchy" icon – to the left of the pencil when the attribute is of type 'Group' – I've had GeoEvent Manager pop back out to the 'Group' level after editing one of the attributes nested beneath the group. That is rather annoying, as I then have to click the "node hierarchy" icon again to dive back into the group and add/edit another attribute within the group. But no, I have not seen a case where changing an attribute from cardinality 'One' to cardinality 'Many' fails to take effect when I click 'Save' (to save my changes to the field attribute) and then 'Save' again (to save my overall changes to the GeoEvent Definition). Sorry for what may be a reply too late to be helpful. If this is still a problem for you, please submit an incident to Esri Technical Support and they can likely help you through a screen share. - RJ
Posted 07-23-2019 10:16 AM
Hello Rahul - My apologies for the delayed reply. Generally, yes, the understanding expressed in the points you call out is correct.

>> In a data push scenario, when an external web server, web service, or data provider sends data to GeoEvent Server, the data is sent to a specific resource endpoint via a fully-qualified hostname and port. If this resource is one of the three GeoEvent Server instances you have configured, it represents a single point of failure. You will need a load balancer which is smart enough to continue sending data to a specific endpoint as long as that endpoint is available and responding with HTTP 200 when data is received. The load balancer will be the solution component responsible for redirecting data to a different resource / endpoint when the primary receiver is not responding.

>> In a data pull scenario, when a GeoEvent Server inbound connector is polling an external web server or web service, GeoEvent Server (in a "site" deployment) should detect when a machine has left the site's configuration and allow another GeoEvent Server instance to adopt and begin running the inbound connector's polls. This resilience, allowing event record ingest to fail over to another instance, is one advantage of the "site" approach. You will want to administratively monitor and determine why an instance has failed or left the ArcGIS Server site configuration and confirm that another running instance of GeoEvent Server has adopted the running input. This is one of those "trust but verify" scenarios.

>> Yes, each instance of GeoEvent Server runs within its own JVM, and the web socket used to broadcast data for a stream service is run from within the GeoEvent Server's JVM container (not by ArcGIS Server as a SOC process). The stream service outbound connector implements a fan-out strategy which uses an internal message bus to forward copies of processed event records to the other stream service web sockets, so that client applications can subscribe to any GeoEvent Server's web socket and get all of the event records regardless of which instance(s) actually processed the event data. You will need to monitor the event record velocity / volume of each subscribing client. Too many clients subscribing to any single web socket instance will reduce data throughput to all subscribing clients.

>> Implementing some sort of reverse proxy to allow web mapping applications and clients to subscribe to stream service web sockets without knowing the specific resource they are connecting to is one way for you to take control of client subscription distribution. The ArcGIS Server stream service, as I understand it, directs client subscription requests to an available server within the site using a round-robin mechanism. But once a subscription connection has been made, the client web application is communicating with the web socket GeoEvent Server is running, not with the stream service. The problem here is that if / when a GeoEvent Server instance fails there is no notification to a client application to signal it to unsubscribe and re-subscribe, which would give it an opportunity to connect to an available / running GeoEvent Server instance. A brute-force workaround one of our distributors decided to try was to have their client apps actively, periodically, unsubscribe and re-subscribe. That way, if a client were connected to a "dead" web socket, the periodic unsubscribe / re-subscribe would allow automatic recovery and reconnection. Whether this is practical obviously depends on the velocity and volume of event records being broadcast in your solution. A second mitigation the distributor adopted was to train users that, if they had any reason to feel their web map's display was stale, they should manually refresh the web page, which explicitly causes the same unsubscribe / re-subscribe to occur.

I am not a system architect. You will probably want to work with your Esri Technical Advisor, or request to speak with a Technical Account Manager through Esri Technical Support, for help identifying a resource in Esri Professional Services who can help you with system / solution architecture best practices. Best Regards -- RJ
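A rough sketch of the periodic unsubscribe / re-subscribe mitigation, written with Python's third-party websockets package, is shown below. The stream service subscribe URL is a placeholder and the 60-second session length is arbitrary; the point is simply that the client tears the connection down and re-establishes it on a schedule so it never stays attached to a dead web socket for long.

    import asyncio
    import websockets  # third-party package: pip install websockets

    # Placeholder subscribe URL for a stream service web socket.
    STREAM_WS_URL = "wss://gis.example.com:6143/arcgis/ws/services/AssetTracks/StreamServer/subscribe"
    SESSION_SECONDS = 60   # how long to stay subscribed before cycling the connection

    async def subscribe_with_recycling():
        while True:
            try:
                async with websockets.connect(STREAM_WS_URL) as ws:
                    deadline = asyncio.get_running_loop().time() + SESSION_SECONDS
                    while asyncio.get_running_loop().time() < deadline:
                        try:
                            # Wait for the next broadcast; a timeout here is not fatal,
                            # it just means no event records arrived during the window.
                            message = await asyncio.wait_for(ws.recv(), timeout=5)
                            print(message)
                        except asyncio.TimeoutError:
                            pass
                # Leaving the 'async with' block closes (unsubscribes) the socket;
                # the outer loop immediately re-subscribes.
            except (OSError, websockets.WebSocketException):
                await asyncio.sleep(5)   # connection refused or dropped; retry shortly

    asyncio.run(subscribe_with_recycling())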
Posted 06-18-2019 01:19 PM
Praveen Ponnusamy Chris Whitmore I know that ArcGIS Pro recently implemented support for stream layers. Check out the on-line help on this topic: https://pro.arcgis.com/en/pro-app/help/mapping/layer-properties/stream-layers.htm Does anyone know if there are plans to close functional gaps between the ArcGIS Pro implementation for stream layers and the stream layer implementation used by the Enterprise portal web map and the ArcGIS Online web map? The latter do not support feature labeling or attribute-driven symbology, but I believe these are capabilities supported by ArcGIS Pro.
Posted 06-17-2019 06:43 PM
This blog is one in a series of blogs discussing debugging techniques you can use when working to identify the root cause of an issue with a GeoEvent Server deployment or configuration. Click any link in the quick list below to jump to another blog in the series. Configuring the application logger (this blog) Add/Update Feature Outputs Application logging tips and tricks Geofence synchronization deep dive

In this blog I will discuss GeoEvent Manager's user interface for viewing logged messages, the location of the actual log file on disk, and how logging can be configured -- specifically, how to control the size of the log file and its rollover properties.

The GeoEvent Manager Logging Interface

ArcGIS GeoEvent Server uses Apache Karaf, a lightweight, flexible container, to support its Java runtime environment. A powerful logging system, based on OPS4J Pax Logging, is included with Apache Karaf. The GeoEvent Manager web application includes a simple user interface for the OPS4J logging system. You can use this interface to see the most recent messages logged by different components of ArcGIS GeoEvent Server. The UI illustrated below caches up to 500 logged messages and allows you to scroll through logged messages, specify how many messages should be listed on a page, select a specific type of logged message (e.g. DEBUG, INFO, WARN, or ERROR), and perform keyword searches.

A significant limitation of this logging interface is that only the most recent 500 logged messages are maintained in its cache, so the review and keyword searches you perform are limited to recently logged messages. This means that the velocity and volume of event records being processed, as well as the number of GeoEvent Services, inputs, and outputs you have configured, can affect (and limit) your ability to isolate logged messages of interest. A valuable debugging technique is to locate the actual log file on disk and open it in a text editor.

Location of the log file on disk

On a Windows platform, assuming your ArcGIS GeoEvent Server has been installed in the default folder beneath C:\Program Files, you should be able to locate the following system folder which contains the actual system log files: C:\Program Files\ArcGIS\Server\GeoEvent\data\log

In this folder you will find one or more files with the base name karaf.log – these files can be opened in a text editor of your choice for content review and search. You can also use command-line utilities like tail, string processing utilities like sed, grep, and awk, as well as regular expressions, to help isolate logged messages. Examples using these are included in other blogs in this series. Only one log file, the file named karaf.log, is actively being written at any one time. When this file's size has grown as large as the system configuration allows, the file will automatically roll over and a new karaf.log file will be created. Log files which have rolled over will have a numeric suffix (e.g. karaf.log.1) and their last updated date/time will be older than that of the karaf.log currently being written. If you open the karaf.log in a text editor you should treat the file as read-only, as the logging system is actively writing to this file. Be sure to periodically reload the file's content in your text editor to make sure you are reviewing the latest messages.
How to specify an allowed log file size and rollover properties

Locate the org.ops4j.pax.logging.cfg configuration file in the ArcGIS GeoEvent Server's \etc folder: C:\Program Files\ArcGIS\Server\GeoEvent\etc

Using a text editor run as an administrator (because the file is located beneath C:\Program Files), you can edit properties of the system log, such as the default logging level for all loggers (a "logger" in this context is any of several components that are actively logging messages, such as the outbound feature adapter or the inbound TCP transport). For example, at the 10.7 release a change was made to quiet the system logs by reducing the ROOT logging level from INFO to WARN so that only warnings are logged by default. You can see this specified in the following lines in the org.ops4j.pax.logging.cfg configuration file:

# Root logger
log4j2.rootLogger.level = WARN

Searching the configuration file for the keyword "rolling" you will find lines which specify the karaf.log file's allowed size and rollover policy. Be careful -- not all of the lines specifying the rollover policy are necessarily in the same section of the configuration file; some may be located deeper in the file:

# Rolling file appender
log4j2.appender.rolling.type = RollingRandomAccessFile
log4j2.appender.rolling.name = RollingFile
log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log
log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i
log4j2.appender.rolling.append = true
log4j2.appender.rolling.layout.type = PatternLayout
log4j2.appender.rolling.layout.pattern = ${log4j2.pattern}
log4j2.appender.rolling.policies.type = Policies
log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
log4j2.appender.rolling.policies.size.size = 16MB
log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy
log4j2.appender.rolling.strategy.max = 10

The settings above reflect the defaults for the 10.7 release, which specify that the karaf.log should roll over when it reaches 16MB and that up to 10 indexed files will be used to archive older logged messages.

The anatomy of a logged message

Before we conclude our discussion on configuring the application logger I would like to briefly discuss the format of logged messages. The logged message format is configurable, and logged messages by default have six parts. Each part is separated by a pipe ( | ) character. The thread identifier's default specification (see illustration below) has a minimum of 16 characters but no maximum length; some thread identifiers can be quite long. The class identifier's specification includes a precision which limits the identifier to the most significant part of the class name. In the illustration above the fully-qualified class identifier com.esri.ges.fabric.core.ZKSerializer has been shortened to simply ZKSerializer. We will discuss the impact of this more in a later blog. You can edit the org.ops4j.pax.logging.cfg configuration file to specify different patterns for the appender. You should refer to https://logging.apache.org/log4j/2.x/manual/layouts.html#PatternLayout in the Apache logging services on-line help before modifying the default appender pattern layout illustrated below.
# Common pattern layout for appenders
log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %geoeventBundleID - %geoeventBundleName - %geoeventBundleVersion | %m%n
log4j2.out.pattern = \u001b[90m%d{HH:mm:ss\.SSS}\u001b[0m %highlight{%-5level}{FATAL=${color.fatal}, ERROR=${color.error}, WARN=${color.warn}, INFO=${color.info}, DEBUG=${color.debug}, TRACE=${color.trace}} \u001b[90m[%t]\u001b[0m %msg%n%throwable

Conclusion

Using the logging interface provided by GeoEvent Manager is a quick, simple way of reviewing logged messages recently produced by system components as they ingest, process, and disseminate event data. Event record velocity and volume can of course increase the number of messages being logged, and increasing the logging level from ERROR or WARN to INFO or DEBUG can drastically increase the volume of logged messages. If running components are frequently logging messages to the system's log file, only the most recent messages will be displayed in the GeoEvent Manager user interface. Messages which have been pushed out of the cache can be reviewed by opening the karaf.log in a text editor. This is a key debugging technique, but you must be aware that the karaf.log is actively being written and will roll over as it grows beyond a specified size.

As you make and save changes to the system logging, for example to request DEBUG logging on a specific logger, the changes will immediately be reflected in the org.ops4j.pax.logging.cfg configuration file. You can also edit this file as an administrator and any changes you save will be picked up immediately; you do not have to stop and restart the ArcGIS GeoEvent Server service.
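Because later blogs in this series split logged messages on the pipe character, here is a small Python sketch that parses a line from karaf.log into its six parts using the default pattern layout shown above. The sample line is fabricated for illustration; only the six-part structure is taken from the pattern.

    # Split a karaf.log line into the six parts produced by the default
    # pattern layout: timestamp | level | thread | class | bundle | message.
    sample = ("2019-06-14T17:45:10,123 | DEBUG | GeofenceSynchronizer | ZKSerializer "
              "| 312 - com.esri.ges.framework - 10.7.0 | refreshing geofence cache")

    def parse_karaf_line(line):
        parts = [p.strip() for p in line.split("|", 5)]
        if len(parts) < 6:
            return None   # continuation line (e.g. a stack trace), not a new message
        keys = ("timestamp", "level", "thread", "clazz", "bundle", "message")
        return dict(zip(keys, parts))

    record = parse_karaf_line(sample)
    if record and "geofence" in record["message"].lower():
        print(record["timestamp"], record["level"], record["message"])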
Posted 06-14-2019 05:45 PM