POST
Hi Javier Distefano,

To clarify: at 10.6.1 we added support for Kerberos with HDFS. We do not currently support Kerberos with Hive.

- Sarah
Posted 05-22-2019 08:40 AM

POST
Hi Zakaria Douiri,

Can you please turn on debug on your GeoAnalytics Server and rerun the tool? The debug messages will be more informative about the issue. Please double-check that your keytab is up to date, and available to all machines in the GeoAnalytics Server site.

Thanks,
Sarah
Product Engineer, GeoAnalytics Server
Posted 05-09-2019 12:28 PM

POST
Hi Mody Buchbinder,

You'll want to turn on debug to see the cause of the invalid input. That will help figure out whether there is an issue with your query. The format of the input should look like this:

{"url": "https://machine.domain.com/gax_web_adaptor/rest/services/DataStoreCatalogs/bigDataFileShares_pyTest/BigDataCatalogServer/californiapoints", "filter": "fieldname > 10"}

Thanks,
Sarah
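A minimal sketch of assembling that input JSON programmatically. The URL is the example from the post; the field name and threshold in the filter are placeholders for your own data, and the helper name is hypothetical, not part of the product.

```python
import json

def make_catalog_input(url, where=None):
    """Build the JSON string GeoAnalytics expects for a big data
    catalog layer input: a url, plus an optional filter expression."""
    layer = {"url": url}
    if where:
        layer["filter"] = where
    return json.dumps(layer)

payload = make_catalog_input(
    "https://machine.domain.com/gax_web_adaptor/rest/services/DataStoreCatalogs/"
    "bigDataFileShares_pyTest/BigDataCatalogServer/californiapoints",
    where="fieldname > 10",
)
print(payload)
```

Building the string with `json.dumps` (rather than by hand) avoids the quoting mistakes that commonly cause the "invalid input" error described above.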
Posted 04-25-2019 05:55 AM

POST
Hi Mody Buchbinder,

Currently there isn't a way to set a filter on big data file shares when running tools through ArcGIS Pro (ENH-000118129) or your portal Map Viewer (ENH-000116581). However, filters are supported at the GeoAnalytics level by supplying them through REST, so you can definitely apply filters to your analysis that way. As you mentioned, Copy To Data Store in the Pro UI also exposes the filter parameter (as does Find Similar Locations). For example:

{"url": "https://myportal.domain.com/server/rest/services/Hosted/hurricaneTrack/FeatureServer/0", "filter": "Month = 'September'"}

Sarah Ambrose
GeoAnalytics Product Engineer
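A sketch of how that filtered input could be passed to a GeoAnalytics tool over REST. The server name and tool name are placeholders; the endpoint pattern assumes the standard GeoAnalytics Tools GP service layout, and a real request would need a token and the tool's other parameters.

```python
import json
from urllib.parse import urlencode

# The filtered input layer from the example above.
input_layer = {
    "url": "https://myportal.domain.com/server/rest/services/Hosted/"
           "hurricaneTrack/FeatureServer/0",
    "filter": "Month = 'September'",
}

params = {
    "inputLayer": json.dumps(input_layer),
    "f": "json",
    # a real submitJob call also needs tool-specific parameters and a token
}

submit_url = ("https://myserver.domain.com/server/rest/services/System/"
              "GeoAnalyticsTools/GPServer/AggregatePoints/submitJob")
print(submit_url + "?" + urlencode(params))
```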
Posted 04-10-2019 06:39 AM

POST
Thanks Josh Hevenor,

As mentioned in the other thread, the Gen 2 Data Lake isn't supported. If you moved this over to a blob store with a similar structure, you would register the blob container 'lakefilsystem' and not include a folder. The four datasets (AIS, flight_delays, NYCTaxi, and reference) would be recognized and registered in your big data file share. The NYCTaxi dataset would include every folder below it as a single dataset.

If you wanted to create a new dataset for each year in NYCTaxi (say you had folders for 2017, 2018, 2019, etc.), you would set the folder as "NYCTaxi", and every subfolder at the next "level" (what you see when you browse into NYCTaxi) would then be represented as an individual dataset.

- Sarah
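The layout rule above, where each immediate subfolder of the registered folder becomes one dataset, can be sketched as a small hypothetical helper (this is an illustration, not part of the product):

```python
import os

def datasets_in_share(registered_folder):
    """Names GeoAnalytics would recognize as datasets in a big data
    file share: the immediate subfolders of the registered folder."""
    return sorted(
        name for name in os.listdir(registered_folder)
        if os.path.isdir(os.path.join(registered_folder, name))
    )
```

Registering the container root yields AIS, flight_delays, NYCTaxi, and reference as datasets; registering "NYCTaxi" itself would instead make each year folder its own dataset.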
Posted 04-04-2019 12:10 PM

POST
Hi Josh Hevenor,

I've worked with the cloud store team to test this out. We do NOT currently support Gen 2 Data Lakes. We'll work on making that clear in the documentation, and look into supporting this in future releases. Please feel free to submit an enhancement request through support as well; we use those to prioritize new functionality.

If you are able to have your admin create a Gen 1 Data Lake (or use a blob store), that will work at 10.6.1. Thanks for bringing this to our attention!

Sarah
Posted 04-04-2019 11:34 AM

POST
Hi Josh Hevenor,

I'm checking in with the cloud store team now. I've been looking into the Gen 2 data lake, and given that it is very different from Gen 1 and was released much later than 10.6.1, I want to check with them to see whether it's supported. If possible, I would recommend not spending time testing until I get an official confirmation from them. I'm hoping to know by today.

- Sarah
Posted 04-04-2019 07:23 AM

POST
Josh Hevenor,

Are you able to share a screenshot of the folder structure of your Azure Data Lake (or mock it up)? I noticed you no longer have the same file name as in your other post. Right now I am assuming you have:

-- Azure Data Lake
---- Folder (NYC Taxi)
------ Taxi_data_1.csv

but from the other one I assumed:

-- Azure Data Lake
---- LakeFileSystem
------ NYCTaxi
-------- Taxi_data_1.csv

For example, this is the one I use. The blacked-out term is part of the account endpoint; "simple" is the folder I specified in the cloud registration. When I later register this as a big data file share, I will have 5 datasets (each a folder) show up in GeoAnalytics:

- Copy_to_Data_Store_for....
- Copy_to_Data_Store_.....
- earthquakes
- mineplant
- sdadsa

I have noticed mine is Gen 1, so I'll additionally try out a Gen 2 one today and let you know how it goes.

- Sarah Ambrose
Product Engineer, GeoAnalytics Team
Posted 04-04-2019 06:36 AM

POST
Hi Josh Hevenor,

I think you have correctly modified your account endpoint. For example, my testing env looks like "mydatalakestorename.azuredatalakestore.net". Please try removing the trailing slash if you still have it.

Another suggestion is to double-check the folder you are registering. For a big data file share you want to be registering the folder that contains folders of datasets. So if "NYCTaxi" contains all your NYC taxi files, you should only be specifying the folder "lakefilesystem". I don't think this is causing your issue, but you may run into a problem in the next step. Steps here. Please let me know what you get after these steps.

There is a possibility that you don't have the correct access permissions set up, but let's check that the above is completed before that step. We do have the following issues at 10.6.1, but if you are at 10.6.1, this shouldn't be an issue. I've pasted it here *just in case*, as well as a reference for anyone who finds this post later.

Keep me updated!

Sarah Ambrose
Product Engineer, GeoAnalytics Team
Posted 04-04-2019 06:26 AM

POST
Hi Rico Illes,

Can you do the following:

1. In Server Manager on your GeoAnalytics Server, set the log level to debug.
2. Rerun the tool and query the logs for the last 15 minutes (or however long it takes to fail), with the Source set to System/GeoAnalyticsTools.GPServer. Make sure you are on the GeoAnalytics Server.

Can you also verify that you have been able to run Copy on a smaller dataset, and that you are copying to the spatiotemporal data store (and not the relational)? I don't think the warning about the managed egdb is related, but I will have to see the full GeoAnalytics logs to be sure.

Please let me know if you have any questions,

Sarah Ambrose
Product Engineer, GeoAnalytics Team
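The same log query from step 2 can be run against the ArcGIS Server Administrator API instead of Manager. This is a sketch of building that request; the machine name and token are placeholders, and I'm assuming the standard admin logs/query endpoint with its JSON `filter` parameter.

```python
import json
from urllib.parse import urlencode

params = {
    "level": "DEBUG",
    # restrict results to the GeoAnalytics tools GP service, as in step 2
    "filter": json.dumps({"services": ["System/GeoAnalyticsTools.GPServer"]}),
    "token": "<your admin token>",
    "f": "json",
}
query_url = "https://gaserver.domain.com:6443/arcgis/admin/logs/query"
print(query_url + "?" + urlencode(params))
```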
Posted 02-26-2019 08:23 AM

POST
Hi Javier Distefano,

Did you have any luck?

- Sarah
Product Engineer, GeoAnalytics Team
Posted 01-28-2019 05:01 AM

POST
Thanks Anjitha Senarath,

Are you able to send me the case number? Can you please clarify what your input data is? (A big data file share registered with GeoAnalytics, or something else?)

The 1000 limit from server will not impact your results at all, and you are able to create a feature service with 4 million points. I will point out that you don't need to create a feature service to run GeoAnalytics tools, though; you can keep the data where it is, register it as a big data file share, and use that as input to any tool.

- Sarah
Posted 01-23-2019 06:02 AM

POST
Hi Anjitha Senarath,

What is the input of your data? When you are copying from ArcGIS Pro to GeoAnalytics, it's best to have your data local to your server, meaning either a hosted feature layer or a big data file share. If you're using other inputs, such as a dataset local to your Pro machine, the data will be copied from Pro to Server before the GeoAnalytics tool even runs, which can take a very long time.

If your data is a big data file share or a hosted layer and you're seeing this error message, I would recommend calling in to support. They will be able to turn on more informative error messages and help diagnose the issue.

Please let me know if you have any follow-up questions.

Thanks,
Sarah Ambrose
Product Engineer, GeoAnalytics Team
Posted 01-08-2019 05:52 AM

POST
Hi Jerry Stafurik,

You can use the REST call addToDefinition on the feature service. The documentation for that call is outlined here: Add to Definition (Feature Service)—ArcGIS REST API: Services Directory | ArcGIS for Developers

You will access it by going to the following URL of the feature service you created:

https://<url>/<hosted server WA>/rest/admin/services/Hosted/mylayer/FeatureServer/addToDefinition

and modify the JSON to outline the feature service you are interested in creating.

- Sarah
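A sketch of assembling that addToDefinition request. The host, web adaptor name, layer definition, and token are all placeholders; the admin URL follows the pattern in the post, and the request body is the form-encoded `addToDefinition` JSON the operation expects.

```python
import json
from urllib.parse import urlencode

host = "myportal.domain.com"
adaptor = "hostedserver"  # hypothetical hosting-server web adaptor name
admin_url = (f"https://{host}/{adaptor}/rest/admin/services/"
             "Hosted/mylayer/FeatureServer/addToDefinition")

# Minimal example definition: add one new layer to the existing service.
definition = {"layers": [{"name": "newLayer", "type": "Feature Layer"}]}

body = urlencode({
    "addToDefinition": json.dumps(definition),
    "f": "json",
    "token": "<your token>",
})
print(admin_url)
```

The request would be POSTed with `body` as the form data; the JSON in `definition` is where you describe the layers, fields, or indexes you want added.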
Posted 01-07-2019 09:20 AM
| Kudos | Posted |
|---|---|
| 1 | 07-18-2023 04:44 AM |
| 1 | 06-01-2021 07:01 AM |
| 1 | 07-29-2020 03:36 PM |
| 1 | 04-13-2015 10:32 AM |
| 2 | 05-29-2018 01:19 PM |