POST
Hi Jerry Stafurik, I have two ideas about how you can do this: one through GeoAnalytics, and one without GeoAnalytics.

Through GeoAnalytics: Create a text file with the field names you are interested in and one dummy row. For example, I created one that looked like this:

    Field1,Field2,Field3
    1,10.6,Hello

I then registered it as a big data file share and used that as the input to Copy To Data Store. It created a layer with the following fields: Field1, Field2, Field3, globalid, OBJECTID.

The other way to do this is to create a layer using the portal sharing API. You can use this endpoint to create a service: Create Service—ArcGIS REST API: Users, groups, and content | ArcGIS for Developers. To specify that you want to save it to the spatiotemporal big data store, add the following:

    "options": {"dataSourceType": "spatiotemporal"}

So your JSON might look something like this:

    {
      "currentVersion": 10.6,
      "serviceDescription": "",
      "hasVersionedData": false,
      "supportsDisconnectedEditing": false,
      "hasStaticData": true,
      "maxRecordCount": 2000,
      "supportedQueryFormats": "JSON",
      "capabilities": "Query",
      "description": "",
      "copyrightText": "",
      "allowGeometryUpdates": false,
      "syncEnabled": false,
      "editorTrackingInfo": {
        "enableEditorTracking": false,
        "enableOwnershipAccessControl": false,
        "allowOthersToUpdate": true,
        "allowOthersToDelete": true
      },
      "xssPreventionInfo": {
        "xssPreventionEnabled": true,
        "xssPreventionRule": "InputOnly",
        "xssInputRule": "rejectInvalid"
      },
      "tables": [],
      "name": "mylayer",
      "options": {"dataSourceType": "spatiotemporal"}
    }

And then you can populate the layer definition (fields and so on) using the addToDefinition endpoint of the service: https://<url>/<hosted server WA>/rest/admin/services/Hosted/mylayer/FeatureServer/addToDefinition. Make sure /admin/ is included in your URL, otherwise you won't see the addToDefinition option. A rough sketch of these two REST calls follows at the end of this post.

Additionally, note one difference between the spatiotemporal and relational data stores: at 10.6.1, you are not able to delete columns from a layer hosted in the spatiotemporal big data store.

Please let me know if you have any follow-up questions. Sarah Ambrose, Product Engineer, GeoAnalytics Team
01-04-2019
09:40 AM
POST
Hi Emmanuel Rosetti, That's awesome news that you are going to be using GeoAnalytics to power your analysis! You may already know about some or all of these links, but I wanted to provide them in one place in case anyone else is reading this question later on:

- Deployment patterns (includes minimum requirements for components): Deployment patterns for ArcGIS Enterprise—ArcGIS Enterprise | ArcGIS Enterprise and Base ArcGIS Enterprise deployment—ArcGIS Enterprise | ArcGIS Enterprise
- Steps to set up GeoAnalytics: Set up ArcGIS GeoAnalytics Server—Documentation | ArcGIS Enterprise
- GeoAnalytics settings (configured after you have it all set up, to fully use your resources): GeoAnalytics Server settings—Documentation | ArcGIS Enterprise

Currently, the documentation suggests a minimum of:

- 16 GB for GeoAnalytics Server
- 16 GB for a base deployment (portal, hosting server, and the relational + tile cache data stores)
- 16 GB for the spatiotemporal data store

I would personally recommend a 3-machine setup, one machine for each of the 16 GB items mentioned above (and 32 GB each if you have it!). If this is just for prototyping, and not for deployment, you could try it all on a single machine with 32 GB. If you have 2 machines, I would put the spatiotemporal data store on the second machine. Be warned that in that case it probably won't be performant, but you'll become familiar with the tools and workflows for scaling out later.

Please let me know if you have any follow-up questions, Sarah Ambrose, Product Engineer, GeoAnalytics Team
11-21-2018
08:38 AM
BLOG
Hi Daniel, Can you confirm for me which version of Enterprise you have (you'll need 10.6.1) and which version of the ArcGIS API for Python you have? Thank you! Sarah Ambrose, GeoAnalytics Product Engineer
11-06-2018
06:46 AM
POST
Hey Andrew Rudin, Thanks for the use case, it really helps me understand the workflow and the problem, so I really appreciate it. Can you expand on this (what you said above)?

> For example, maybe I want to investigate "how many tickets were created in various parts of the city over space and time the last 5 years?" and then "is there a spatiotemporal variance in how long tickets are open in parts of the city". With how it's set up now I don't see how I can do it without extracting the SBDS data to my desktop or another hosted copy where I can redefine the time-enablement on a Pro layer or a separate nearly-identical feature service. If I load the SBDS feature service into Pro, the time-enablement section of the properties is grayed out so I can't redefine it there before creating the space-time cube. I also don't see a way to alter the properties of the feature service in GeoEvent or Portal to tinker with the time-enablement properties. I think hosted views aren't an option for this kind of data either.

Here is the solution I was planning on proposing. I think there are two ways you can solve this:

1. Modify the time settings on the layer. You can change the time settings of the layer in portal. To do this, open the item in portal (the item page); you'll see the layers listed. From there, there should be a button called "time settings", and you can change the time on the service that way. Note that this does change the service definition.
2. Create a view of the layer, and modify the time settings on that. You can make a view of the layer (again, from the portal item page), and modify the time settings on that view (also in the portal item page). Here is an overview of views to get you started: Create hosted feature layer views—Portal for ArcGIS | ArcGIS Enterprise. A rough sketch of this approach follows at the end of this post.

So I'd like to know: Why can't you create a view on this layer? In the portal item page, does "create view layer" show up? In portal, you should be able to change the time settings on the item page. Are you the owner of the layer? Thanks!
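To make option 2 a bit more concrete, here is a hedged sketch using the ArcGIS API for Python. The portal URL, credentials, item ID, and time fields are hypothetical, and it assumes the source layer supports hosted views at all (which is part of the question above), so treat it as an illustration rather than a verified workflow for spatiotemporal big data store layers.

```python
# Sketch: create a hosted feature layer view and adjust its time settings,
# leaving the source service definition alone. All names/IDs are placeholders.
from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

gis = GIS("https://myportal.domain.com/portal", "myuser", "mypassword")  # assumed portal/credentials

# Look up the hosted feature layer item (replace with your item id).
item = gis.content.get("<item id of the hosted feature layer>")

# Create a hosted feature layer view on top of the layer.
flc = FeatureLayerCollection.fromitem(item)
view_item = flc.manager.create_view(name="tickets_time_view")

# Update the time settings on the view's layer definition. The timeInfo keys below
# are an assumption for illustration; inspect view_item.layers[0].properties to see
# the exact structure your layer uses before pushing an update.
view_layer = view_item.layers[0]
view_layer.manager.update_definition({
    "timeInfo": {
        "startTimeField": "created_date",   # hypothetical field names
        "endTimeField": "closed_date",
        "timeInterval": 1,
        "timeIntervalUnits": "esriTimeUnitsMonths",
    }
})
```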
10-12-2018
10:36 AM
POST
Hi Andrew Rudin - thanks for the question! Currently, you're only able to store data in the spatiotemporal data store in WGS84 (WKID 4326). Tools in GeoAnalytics that require binning (Create Space Time Cube, Aggregate Points) require that your data be projected for those analyses. To do this we recommend that you project on the fly (like you mentioned). Compared to the rest of your big analysis, we haven't noticed that projecting adds too much time.

I'm guessing you are at version 10.5? At 10.5.1 or greater, if your data is in a geographic coordinate system we will automatically project the data being binned to a World Equal Area projection (we don't overwrite the projection choice if you supply one!). Depending on how you are running your analysis (Pro, REST, Portal) you can specify the processing spatial reference. For example, if you were using the NAD_1927_StatePlane_Texas_Central_FIPS_4203 projection, you would use WKID 32039 (I like to look WKIDs up online, or in the spatial reference options in Pro, outlined below). To do this in each UI:

- In ArcGIS Pro, set the geoprocessing environment by going to the Analysis ribbon > Environments > Output Coordinate System, then paste in that WKID or browse for it and hit save.
- In the portal Map Viewer, when you open up a tool, hit the "gear" icon and set the processing spatial reference there.
- In REST, set the context parameter like so: {"processSR": {"wkid": 32039}}. A scripted example follows at the end of this post.

The results of your tool will be written to one of the following:

- the spatiotemporal data store, saved in WGS84 (projected back)
- the relational data store, saved in the spatial reference you processed in
- a space-time cube (which you download rather than store in a data store), in the spatial reference you processed in

Please let me know if you have any follow-up questions or if I missed anything, Sarah, Product Engineer, GeoAnalytics Team
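If you are scripting this rather than using Pro or the Map Viewer, here is a hedged sketch using the ArcGIS API for Python. The portal URL, credentials, item ID, and bin parameters are placeholders; if your API version doesn't expose arcgis.env.process_spatial_reference, pass the equivalent REST context shown above instead.

```python
# Sketch: set the processing spatial reference and run a binned GeoAnalytics tool.
# All URLs, credentials, and item IDs are placeholders for illustration.
from arcgis.gis import GIS
from arcgis import env
from arcgis.geoanalytics.summarize_data import aggregate_points

gis = GIS("https://myportal.domain.com/portal", "myuser", "mypassword")

# Equivalent to the REST context {"processSR": {"wkid": 32039}}:
env.process_spatial_reference = 32039

# A layer registered through a big data file share or hosted in the data store.
point_layer = gis.content.get("<item id of the point layer>").layers[0]

# Square bins of 1 mile, written out as a new result layer (parameters are examples).
result = aggregate_points(
    point_layer=point_layer,
    bin_type="Square",
    bin_size=1,
    bin_size_unit="Miles",
    output_name="aggregated_points_32039",
)
print(result)
```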
10-11-2018
12:49 PM
POST
Hi Dalinda Damm, Thanks for the update. Can you please send the support case number to sambrose at esri.com? I'd like to check it out and chat with the analyst to understand their interpretation. Once I do that, I'll update this thread to make sure we have an "official" answer here. Thank you! Sarah, Product Engineer, GeoAnalytics
08-09-2018
08:34 AM
POST
Hi Mody Buchbinder, In this case it doesn't make a big difference how you organize your data. GeoAnalytics will read in the data based on the number of features in the big data file share dataset (so all the shapefiles that form the dataset). The thing to be aware of is that you don't want a shapefile that is bigger than allowed, which is 2 GB. As long as each shapefile is smaller than that, you should be good to go (see the small check below). Please let me know if you have any follow-up questions. Thanks, Sarah, Product Engineer, GeoAnalytics Team
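If you want to double-check your folders against that 2 GB limit before registering the big data file share, here is a small Python sketch; the folder path is a placeholder.

```python
# Sketch: scan a big data file share dataset folder and flag any shapefile
# component at or over the 2 GB limit mentioned above.
from pathlib import Path

LIMIT_BYTES = 2 * 1024 ** 3  # 2 GB
share_folder = Path(r"\\fileserver\bigdata\roads")  # hypothetical dataset folder

for f in sorted(share_folder.glob("*.shp")) + sorted(share_folder.glob("*.dbf")):
    size = f.stat().st_size
    status = "TOO BIG" if size >= LIMIT_BYTES else "ok"
    print(f"{f.name}: {size / 1024 ** 3:.2f} GB [{status}]")
```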
08-08-2018
11:45 AM
POST
Hi Mody Buchbinder, Yes, if you want 3 GeoAnalytics machines for analysis, you should redo it with 4 machines: 3 GeoAnalytics machines and 1 file server. The file server machine doesn't need to have the same memory/cores as your GeoAnalytics machines. For example, you could do 1 core with decent memory and good IO. This machine is used for backup and recovery. Thanks, Sarah, Product Engineer, GeoAnalytics Team
08-08-2018
11:41 AM
BLOG
As part of the 10.6.1 release, we've published a blog post outlining how to create layers that aggregate large datasets at different zoom levels. You can check out the blog post here: Visualize aggregated data in ArcGIS GeoAnalytics Server 10.6.1. The blog contains a link to a sample notebook that you can modify to use with your own GeoAnalytics Server, as well as instructions on how to run this through REST. This functionality exposes capabilities of the spatiotemporal big data store that is also used with GeoEvent Server. I'm excited to hear any feedback or further requests! To see what else we added at 10.6.1 (spoiler: new tools and big data file share inputs), see What's new in ArcGIS Enterprise 10.6.1—ArcGIS Enterprise | ArcGIS Enterprise. Happy aggregating! Sarah Ambrose, Product Engineer, GeoAnalytics Team
07-31-2018
10:37 AM
POST
Hi Sergio Eduardo Galindo, Are you able to call in to tech support about this case? An analyst will be able to do a screen share with you and investigate what's going on. Thanks, Sarah
07-19-2018
02:11 PM
POST
Hi Morne Thero, The GIS Tools for Hadoop will help you complete this workflow. Here is an overview page: GIS Tools for Hadoop by Esri. In particular, you can use the Geoprocessing Tools for Hadoop ("The Connector") in the link above. These tools move data between Hadoop and a feature class (in ArcMap). The readme file on the wiki outlines how to use the tools, and there is also a tutorial on how to run them. If you have any issues while running the tools, I would recommend creating an issue in the GitHub repository. Hope that helps! Sarah, Product Engineer, GeoAnalytics Tools
06-27-2018
08:51 AM
POST
Hi GUO MEILING, Yes, lines are supported through CSV (a small example follows below). If you are unable to get this to work, please contact Support Services and they can help troubleshoot what the problem is. Thanks, Sarah Ambrose, Product Engineer, GeoAnalytics Server
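For reference, here is a hedged sketch of what a line-geometry CSV can look like, with the geometry carried as well-known text (WKT) in a single field; the field names and coordinates are made up, and you will still need the big data file share's manifest to identify the WKT field and the polyline geometry type for the dataset.

```python
# Sketch: write a CSV with line geometries stored as WKT in a "shape" field,
# suitable for a big data file share dataset. Values are placeholders.
import csv

rows = [
    {"track_id": 1, "name": "Route A",
     "shape": "LINESTRING (-117.19 34.05, -117.18 34.06, -117.17 34.07)"},
    {"track_id": 2, "name": "Route B",
     "shape": "LINESTRING (-117.20 34.04, -117.21 34.03)"},
]

with open("lines.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["track_id", "name", "shape"])
    writer.writeheader()
    writer.writerows(rows)
```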
06-04-2018
10:02 AM
POST
Hi Dalinda Damm, By default, GeoAnalytics uses the spatiotemporal data store as an output, because of its ability to scale out to multiple machines and handle large datasets. There is currently some documentation on when you might want to use the relational data store in the tool hover help:

> GeoAnalytics results are stored in an ArcGIS Data Store and exposed as a feature layer in Portal for ArcGIS. In most cases, results should be stored in a spatiotemporal data store. This is the default. The following are reasons why you may want to store results in a relational data store:
> - To use your results in portal-to-portal collaboration
> - To enable sync capabilities with your results
> You should not use a relational data store if you expect your GeoAnalytics results to increase and want to take advantage of the spatiotemporal big data store's capabilities to handle large amounts of data.

At 10.6.1 (releasing soon), we've also added some documentation here: Feature layers—Portal for ArcGIS | ArcGIS Enterprise. You won't be able to see it yet, but once the 10.6.1 documentation goes live with the 10.6.1 release, there will be a table outlining the differences between feature layers in relational and spatiotemporal data stores. I'm hoping that new documentation helps - we wanted it just for this purpose. Sarah, Product Engineer, GeoAnalytics
05-29-2018
01:19 PM
POST
Hi Ryan Bae, Yes, the GeoEvent Server and GeoAnalytics Server will always use the same spatiotemporal data store set up with your hosting server. As for sufficient resources, that will depend on a few different things:

- the amount of RAM/disk space on your spatiotemporal data store machines
- the amount of data stored in your spatiotemporal data store
- the amount of data you are ingesting through GeoEvent
- the size of the analysis results you are writing (and how often); this is related to the amount of data stored

So if you do find that you are running out of resources, you will need to install ArcGIS Data Store on additional machines and configure it with the same (hosting) server that your spatiotemporal data store is already configured with. For example, if you started with 3 spatiotemporal machines and installed and configured two more, you would now have a spatiotemporal data store with 3 + 2 = 5 nodes.

Let me know if you have any follow-up questions, Sarah, Product Engineer, GeoAnalytics
05-29-2018
12:55 PM
POST
Hi Joshua Bixby, There is some documentation on the health check for 10.6 - it's just isolated to the Server Administrator API reference. You can access that through the installed help by going to the admin endpoint for your server, clicking API Reference, and then searching for "health check". It will have a URL like this: http://<machine-name.domain.com>/<gax WA>/admin/www/doc/index.html#/Compute_Platform_Health_Check/02w0000000n2000000/ or online here: Compute Platform Health Check—ArcGIS REST API: Administer your server | ArcGIS for Developers.

As for the content of that new documentation, it's not a secret at all. I've pasted the intro paragraph below that shows you where to find the health check. I can email you the full content if you are interested - just let me know; it also goes over understanding the health check JSON format.

The health check operation indicates the status of compute resources and jobs within the GeoAnalytics Server compute platform and is the underlying framework that allows GeoAnalytics Server to distribute analysis across multiple GeoAnalytics Server cores and machines. The health check operation, which can be used for troubleshooting and monitoring, is available to the server or portal administrator and is accessed by logging in to your ArcGIS Server Administrator Directory using the URL format https://gis_geoanalytics_server.domain.com:6443/arcgis/admin. To navigate to the health check, start from your GeoAnalytics Server site and click System > Platform Services > Compute Platform > Health. Use the health check operation to do the following:

- Verify and identify GeoAnalytics Server machines that are being used for analysis.
- Verify that the allotted memory and cores are being used.
- Check the amount of cores or memory available on the GeoAnalytics Server machines. This can be helpful when using the GeoAnalytics Server settings.
- Check the number of GeoAnalytics Server jobs currently running or completed.
- Find the jobID of a GeoAnalytics Server job to view the REST endpoint.
- Troubleshoot and resolve error messages.

(A small scripted example of calling the health check is included at the end of this post.)

You're correct, when I say stop or cancel a job, I mean "terminate the job". There isn't currently a concept of pausing a job.

I just checked, and found the documentation on the one-minute limit in the GeoAnalytics Server settings documentation here: GeoAnalytics Server settings—Documentation | ArcGIS Enterprise

> When selecting the amount of memory to use, be sure to set a number that is lower than or equal to the percentage set for machine resources (default 80 percent). If you do not, jobs will wait for resources that are not actually available for one minute, and then be canceled with the following error: ERROR BD_101057: Unable to start distributed job. Please check your GeoAnalyticsTools service settings and ensure that there are enough resources available for the job to run. You will also see this error if resources are unavailable because other jobs are using the resources.

> The existing tools are OK for managing the umbrella of geoprocessing tools to date, I am not sure they are up for the task of managing a distributed geoprocessing environment for hundreds, possibly thousands, of users. Then again, maybe what I am envisioning for a distributed geoprocessing platform isn't what Esri is building, hence my questions.

To address this question, which I believe is similar to one we have discussed before, it would help us to understand the workflows and management tasks that you are looking to do. For example, a lot of this may be accomplished using ArcGIS Monitor. Without knowing specifics, it's difficult for us to outline best practices, or figure out the best enhancements to implement in server. Thanks, Sarah
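If it helps, here is a small hedged Python sketch for polling the health check programmatically. The server name and credentials are placeholders, and the exact health-check path should be copied from your own Administrator Directory (following the System > Platform Services > Compute Platform > Health navigation quoted above) rather than taken from this sketch.

```python
# Sketch: fetch the GeoAnalytics Server health check as JSON. The health-check
# path is intentionally left as a placeholder to be copied from the Admin Directory.
import requests

ADMIN = "https://gis_geoanalytics_server.domain.com:6443/arcgis/admin"
HEALTH_URL = ADMIN + "/<path copied from the Admin Directory health page>"

# Generate an admin token from the standard ArcGIS Server token endpoint.
token = requests.post(
    ADMIN + "/generateToken",
    data={
        "username": "admin_user",      # assumed credentials
        "password": "admin_password",
        "client": "requestip",
        "f": "json",
    },
    verify=False,  # only if your server uses a self-signed certificate
).json()["token"]

# Request the health check and print the JSON response.
health = requests.get(HEALTH_URL, params={"f": "json", "token": token}, verify=False).json()
print(health)
```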
05-15-2018
10:08 AM