POST
Thanks for the response. I think I'm going to put in a support ticket. For me, running print(arcgis.__path__) returns the path to my default environment, even though the cloned environment is shown as the active one in the Pro Python package manager.
Posted 07-16-2019 03:03 PM

POST
I am trying to get going with Jupyter Notebooks and the ArcGIS API for Python, but I'm having some problems with the version installed with ArcGIS Pro 2.4. I know that in Pro you can create multiple Python environments, and you can choose which one the project uses via the GUI under Project->Python. I saw that Pro 2.4 installs version 1.6.1 of the API; however, the API website shows 1.6.2 as the latest. I created a clone of the default environment and upgraded the API using these instructions. Back in Pro I can see the cloned environment has arcgis API version 1.6.2, and it is the active environment for my project.

The problem is that when I open the Jupyter Notebook shortcut provided on the Start Menu under ArcGIS, I believe it is using the default environment and not my upgraded one. I don't see anything in the Jupyter Notebook interface that tells me which environment is being used. To check, I created a new notebook and entered the following code:

import arcgis
print(arcgis.__version__)

The result is 1.6.1, so I assume it's using the default environment. Any advice on how to get the Jupyter Notebook to use my cloned environment? I'm trying to stick with the out-of-the-box tools installed with Pro.
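One way to see which environment a notebook kernel is actually running in is to print sys.executable alongside the imported module's version and location. A minimal sketch of that check — using the stdlib json module here as a stand-in, since arcgis is only importable inside a Pro environment; in the actual notebook you would pass "arcgis":

```python
import importlib
import sys

def environment_report(module_name):
    """Return (interpreter path, module version, module location) to help
    diagnose which Python environment a notebook kernel is using."""
    mod = importlib.import_module(module_name)
    return (sys.executable,
            getattr(mod, "__version__", "unknown"),
            getattr(mod, "__file__", "unknown"))

# In the Pro notebook, call environment_report("arcgis");
# "json" is used below only as a stand-in module.
interp, version, location = environment_report("json")
print(interp)     # path of the interpreter the kernel runs under
print(location)   # where the module was actually imported from
```

If the interpreter path points at the default environment rather than the clone, the kernel was launched from the wrong environment regardless of what the Pro package manager shows.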
Posted 07-16-2019 08:21 AM

POST
I'm trying to set up a new 10.6.1 GeoEvent Server to read AVL data. I want to filter it to only vehicles whose string attribute starts with the letters AW, but ^AW isn't returning any results. Any recommendations on what to fix? Here's what it looks like in the GeoEvent Filter configuration screen:
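One possible explanation (an assumption — I can't confirm which semantics the GeoEvent filter engine applies): some pattern evaluators require the regex to consume the entire string rather than just match a prefix. Under full-match semantics, a bare ^AW matches nothing and AW.* is needed. A small Python sketch of the difference, with made-up vehicle IDs:

```python
import re

vehicles = ["AW1041", "BW2001", "AW3300", "XAW999"]

# Partial-match semantics: ^AW anchors at the start and matches a prefix.
prefix_matches = [v for v in vehicles if re.search(r"^AW", v)]

# Full-match semantics (what some filter engines apply): the pattern must
# consume the whole string, so ^AW alone fails and AW.* is required.
full_matches_bad = [v for v in vehicles if re.fullmatch(r"AW", v)]
full_matches_good = [v for v in vehicles if re.fullmatch(r"AW.*", v)]

print(prefix_matches)     # ['AW1041', 'AW3300']
print(full_matches_bad)   # []
print(full_matches_good)  # ['AW1041', 'AW3300']
```

If that is what's happening, trying AW.* in the filter configuration would be a quick way to confirm.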
Posted 03-12-2019 07:43 AM

POST
Thank you! I used the "Settings Keys" section here: (https://github.com/Esri/arcgis-pro-sdk/wiki/ArcGIS-Pro-Registry-Keys#settings-keys-1). I manually created a new registry key and pasted in my folder path, and it appears to work. If I open Pro and remove the add-in, then close and reopen Pro, the add-in reappears. I'll try it on Monday with a colleague logging in for the first time.
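For anyone following along, a hedged sketch of what such an entry might look like exported as a .reg file — the key path and value name ("AddInFolders") below are illustrative placeholders, not confirmed names; take the exact key and value from the "Settings Keys" table on the wiki page linked above:

```
Windows Registry Editor Version 5.00

; Hypothetical example -- verify the exact key and value names against the
; ArcGIS Pro SDK wiki "Settings Keys" table before using.
[HKEY_LOCAL_MACHINE\SOFTWARE\Esri\ArcGISPro\Settings]
"AddInFolders"="C:\\SharedAddIns"
```

Putting it under HKEY_LOCAL_MACHINE rather than HKEY_CURRENT_USER is what would make it apply to every user on the machine, if the key supports a machine-level location.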
Posted 02-01-2019 03:09 PM

POST
I'm setting up a shared PC with ArcGIS Pro, sort of like a computer lab PC. I have an add-in that I want turned on by default for every new user who logs in and starts up Pro. First I placed the add-in in a folder on the PC that all users can access. Then, when I logged into Pro, I was able to go to Project->Add-In Manager->Options, choose "Add Folder...", and point to that folder. Now that I've done the work to set that up, is there a way to make that add-in folder apply permanently for all Pro users on that machine?
Posted 02-01-2019 12:49 PM

POST
Okay, I think I see the issue. I do not get those options to create a view or alter time settings on the SBDS layer I created in GeoEvent Manager. To be specific: in GeoEvent Manager, go to Site->Spatiotemporal Big Data Stores->Create Data Source. I guess GeoEvent-created SBDS layers are not considered hosted layers. To test, I created a second SBDS layer a different way, by going into Pro and running GeoAnalytics->Copy To Data Store on the sample data housed on my PC. That creates a portal item labeled as hosted, and it does provide the "time settings" and "create view" options. Perhaps if I create a hosted SBDS layer first via Pro, I can get GeoEvent to write to it. Something to try later. Regardless, given this little discrepancy, it sounds like we'll have to put some forethought into how we want to use any data from GeoEvent for analysis before we put any GeoEvent services into production. Thanks for walking through this example with me. This really helps me understand how these servers fit into solutions for our business needs. -Andrew
Posted 10-12-2018 12:33 PM

POST
Thank you Sarah, that helps. Knowing the projection I'm aiming for lets me design GeoEvent processes that do as little as possible; I would hate to project to State Plane only to have the data auto-reprojected back. I'm using fresh 10.6.1 servers and Pro 2.2.3. I've now redeployed a default GeoEvent SBDS output layer and have some sample data populated. When I tried to create a space time cube from it using the GeoAnalytics toolbar in Pro, I saw that the coordinate system message was just a warning that the tool will apply the "World Cylindrical Equal Area" projection. However, right after that it bombed out because the input layer is time-enabled as interval based. I pasted the results window text below.

The data I'm using for my sample is city 311 service request tickets: 500,000 records covering requests for all the various city services, with history for the last 5 years, so it seemed like a good candidate for spatiotemporal hosting. I used GeoEvent to create it since I can see us loading in tickets as they come in every day. The data has a few date fields that would be useful in various spatiotemporal analysis operations, depending on what an analyst wants to look at; in my case those fields are created_date, last_status_update_date, and close_date. When I defined the GeoEvent Definition for this data, I tagged created_date with the START_TIME tag and close_date with the END_TIME tag. Therefore, I guess when the overlying feature service got created, it was time-enabled as interval based with those two fields hard coded as start/end time values. The space time cube wants a dataset that is instant time, meaning only one field like created_date is used to time-enable.

So as an analyst, if I stumble upon this amazing GeoEvent output layer in my Portal and I want to do spatiotemporal analysis on it, how would I go about it? For example, maybe I want to investigate "how many tickets were created in various parts of the city over space and time in the last 5 years?" and then "is there a spatiotemporal variance in how long tickets are open in parts of the city?" With how it's set up now, I don't see how I can do it without extracting the SBDS data to my desktop or another hosted copy where I can redefine the time-enablement on a Pro layer or a separate, nearly identical feature service. If I load the SBDS feature service into Pro, the time-enablement section of the layer properties is grayed out, so I can't redefine it there before creating the space time cube. I also don't see a way to alter the properties of the feature service in GeoEvent or Portal to tinker with the time-enablement properties, and I think hosted views aren't an option for this kind of data either.

I really wish users could alter/override the time-enablement of web layers inside applications (Pro, Portal map layers), rather than it being hard coded into service definitions. This would give analysts freedom to explore lots of different attributes in a single dataset, rather than being limited to whatever the data admin who initially created the web service picked.

Here is the output from the processing:

Parameters
Point Layer: AustinOpenData_311UnifiedData\AustinOpenData_311UnifiedData
Output Name: junkspacetimecube.nc
Distance Interval: 1000 Feet
Time Interval: 1 Months
Time Interval Alignment: REFERENCE_TIME
Reference Time: 1/1/2013
Summary Fields:
Output File:

Messages
Start Time: Thursday, October 11, 2018 5:04:48 PM
Running script Create Space Time Cube...
Submitted.
Executing...
Executing (CreateSpaceTimeCube): CreateSpaceTimeCube "Feature Set" 1000 Feet 1 Months ReferenceTime 1/1/2013 [] junkspacetimecube.nc #
Start Time: Thu Oct 11 17:04:51 2018
Using URL based GPRecordSet param: https://coagisentd1.coacd.org/server/rest/services/Hosted/AustinOpenData_311UnifiedData/FeatureServer/0
WARNING 120094: Bin generation and analysis requires a projected coordinate system and a default projection of World Cylindrical Equal Area has been applied.
ERROR 120040: Wrong time type for 'Input Features'. Expected 'instant', got 'interval'.
Failed to execute (CreateSpaceTimeCube).
Failed at Thu Oct 11 17:04:55 2018 (Elapsed Time: 3.91 seconds)
Failed.
ERROR 000582: Error occurred during execution.
Completed script Create Space Time Cube...
Failed to execute (CreateSpaceTimeCube).
Failed at Thursday, October 11, 2018 5:04:55 PM (Elapsed Time: 6.66 seconds)

-Andrew
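The ERROR 120040 above comes down to a data-model difference: interval time carries a start and end field per record, while instant time carries exactly one. A rough illustration, outside of ArcGIS, of reshaping interval records into instant records keyed on the single start field, with the interval preserved as a duration attribute (field names mirror the post; the records themselves are made up):

```python
from datetime import datetime

# Made-up ticket records with the interval fields described above.
tickets = [
    {"created_date": datetime(2018, 1, 5), "close_date": datetime(2018, 1, 9)},
    {"created_date": datetime(2018, 2, 1), "close_date": datetime(2018, 2, 3)},
]

# Instant-time view: one time field (created_date), with the interval
# information retained as a plain numeric attribute instead of an end date.
instant = [
    {"time": t["created_date"],
     "days_open": (t["close_date"] - t["created_date"]).days}
    for t in tickets
]

print([r["days_open"] for r in instant])  # [4, 2]
```

A copy of the data shaped this way would satisfy an instant-time tool while still letting the "how long are tickets open" question be summarized on days_open.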
Posted 10-12-2018 06:25 AM

POST
We just deployed GeoEvent, GeoAnalytics, and the Spatiotemporal Big Data Store (SBDS), and I ran into an anomaly while trying them out. In GeoEvent I'm reading in CSV data that has geometry in lat/long format, and I want to store the results in an SBDS layer for use in analysis later. On my first attempt I accepted the defaults when creating the SBDS layer in GeoEvent Manager, which means it was created as WGS84. Once I got some sample data loaded, I tried to use GeoAnalytics to create space time cubes from it, but I got a warning that the layer needs to be in a projected coordinate system. So, back to the drawing board.

I figured I needed to recreate the SBDS layer with a projected coordinate system (at my office we use one of the US State Plane systems). When I went back and tried to create a new Spatiotemporal Data Source, I saw there are options to define the coordinate system of the map and feature services attached to the data source, but no option to define the coordinate system of the underlying data. I went ahead and tried this option to have the services in State Plane, but the result was services that were still in WGS84, and when I look at the JSON output of the web service I see the coordinates are still lat/long.

So here are my questions:
1) Am I right in thinking I should store data in the SBDS in the same coordinate system as the rest of our organization, to avoid "projection on the fly" processing overhead during analysis?
2) If so, how do I define the coordinate system of the spatiotemporal layer? And then how do I get the lat/long feed to go into the data source properly?

Thanks for any help you can provide, Andrew
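For a sense of what the on-the-fly reprojection actually computes: the spherical forward equations for World Cylindrical Equal Area (the default GeoAnalytics falls back to) are just x = Rλ, y = R·sin φ. A small sketch using the spherical approximation — not the exact ellipsoidal math ArcGIS uses, and the sample coordinate is only roughly Austin, TX:

```python
import math

R = 6378137.0  # WGS84 semi-major axis in metres (spherical approximation)

def cylindrical_equal_area(lon_deg, lat_deg):
    """Forward World Cylindrical Equal Area projection, spherical form:
    x = R * lambda, y = R * sin(phi)."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    return R * lam, R * math.sin(phi)

# roughly downtown Austin, TX
x, y = cylindrical_equal_area(-97.74, 30.27)
print(round(x), round(y))
```

It is a cheap per-point transformation, but done on every feature of a 500,000-record layer on every analysis run, which is the overhead question 1 is getting at.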
Posted 10-11-2018 12:16 PM

POST
I want to get a report of all groups in AGOL and their user counts via the ArcGIS API for Python. I cannot figure out how to get the gis.groups.search() function to return all groups; all the Esri examples have some sort of text filter applied. The code I'm running from the ArcGIS Pro Python window is below. At first I tried the syntax gis.groups.search(), but this returns the error: RuntimeError: Unable to perform group search. 'q' parameter must be specified. Then I tried gis.groups.search('*'), which ran without error, but the result is an empty set.

Sample code run from the ArcGIS Pro Python window:

from arcgis.gis import GIS
gis = GIS("https://myorg.maps.arcgis.com", clientid='myclientid')
all_groups = gis.groups.search()
for group in all_groups:
    members = group.get_members()
    print(str(len(members)))
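I can't verify the clientid/OAuth side of this, but as I understand it the underlying portal REST endpoint for group search pages its results with start/num parameters and a nextStart marker that becomes -1 on the last page, so "all groups" means walking every page. A generic sketch of that paging pattern — fake_fetch here is a made-up stand-in for the real REST call, purely for illustration:

```python
def search_all_groups(fetch, page_size=100):
    """Collect every result from a paged search API that returns
    {'results': [...], 'nextStart': n}, where nextStart == -1 on the
    last page (the pattern used by the portal's REST search endpoints)."""
    groups, start = [], 1
    while start != -1:
        page = fetch(start=start, num=page_size)
        groups.extend(page["results"])
        start = page["nextStart"]
    return groups

def fake_fetch(start, num):
    """Stand-in for the real REST call: seven fake 'groups', paged."""
    data = list(range(1, 8))
    chunk = data[start - 1:start - 1 + num]
    next_start = start + num if start + num <= len(data) else -1
    return {"results": chunk, "nextStart": next_start}

print(search_all_groups(fake_fetch, page_size=3))  # [1, 2, 3, 4, 5, 6, 7]
```

An empty result from a query that "ran without error" is also consistent with only the first page (or a zero-match query) being requested, so checking nextStart in the raw response may show where the results went.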
Posted 06-13-2018 07:26 AM

POST
No, I have just started working with ArcGIS Enterprise for the first time in a sandbox server, and I didn't want to start hacking it on day one before exploring. I've also heard that hosted vector tile layers rely on the package item for their data when users view them, so I'm leery of deleting data from the back end.
Posted 01-25-2018 03:19 PM

POST
Hello, I'm curious if anyone has instructions or tips for generating self-signed certificates for use with development deployments of ArcGIS Enterprise within an AWS VPC. I'm brand new to AWS and I just want to test the platform, so I don't want to have to procure a CA certificate. We deployed an Enterprise machine in our on-premises data center using a domain certificate, which is good enough for early testing. In AWS, I want the Enterprise stack to be behind the VPC network, so no public DNS name is required. I tried using Route 53 to create a domain name (which I intend to use just inside the VPC), and then using AWS Certificate Manager to generate a certificate for that domain. But after the CNAME value is added to the domain in Route 53, the certificate never gets verified; after an overnight wait it still said "Pending Verification."

I ran into this issue while tinkering with the new ArcGIS Enterprise Cloud Builder CLI for AWS. The tool requires you to have a certificate prior to creating the EC2 instances and deploying the Esri images. The Route 53 / AWS Certificate Manager process described above was just to generate something that would let the install complete, but no luck. I also tried downloading OpenSSL and generating a self-signed certificate that way, but the Esri CLI / AWS returned an error when I used it. I know self-signed certificates are not for production, and I can figure that out later. Right now I just want to test deploying the software in an automated way using Esri's tools and do some basic performance tests. -Andrew
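For a throwaway certificate inside the VPC, one common OpenSSL incantation is a one-shot self-signed cert whose subject alternative name matches the private DNS name — a sketch, assuming OpenSSL 1.1.1+ for -addext, with gis.internal.example.com as a placeholder for the Route 53 name:

```shell
# Self-signed certificate with a SAN matching the private DNS name.
# gis.internal.example.com is a placeholder -- substitute your own name.
openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
  -keyout enterprise.key -out enterprise.crt \
  -subj "/CN=gis.internal.example.com" \
  -addext "subjectAltName=DNS:gis.internal.example.com"
```

A missing or mismatched SAN is a frequent reason tools reject hand-rolled certs, so this may be worth ruling out; if the Esri tooling wants a PFX/PKCS#12 bundle instead of separate key and cert files, the pair can be combined with openssl pkcs12 -export.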
Posted 01-22-2018 07:27 AM

IDEA
Although optimized hosted feature layers were added as an option to AGOL this year, the documentation has the following caveat: "The basemap on which the feature layer is drawn must use the Web Mercator Auxiliary Sphere projection (such as Esri default basemaps) to see improved drawing times. When the layer is added to basemaps with projections other than Web Mercator Auxiliary Sphere, the layer is loaded without optimization." This means organizations that have standardized on other projections cannot take advantage of this feature. Please make this feature compatible with all projections.
Posted 12-27-2017 08:08 AM

POST
Also, you mention that I could delete the .sd files after the layer is published via a script. I tried deleting the .sd file via the Portal user interface, but it gives the error message "This item cannot be deleted until these dependent layers are deleted:" and then lists the hosted feature layer name. That led me to believe the .sd file is used by the hosted feature layer service. Do you think a backend script would avoid this same error message?
Posted 12-27-2017 07:55 AM

POST
Thank you, Jonathan. This explanation really helped.
Posted 12-27-2017 06:20 AM

POST
Thanks for the clarification. In our existing pre-Enterprise architecture we have ~600 feature classes in an Oracle SDE geodatabase referenced by our ArcGIS Desktop users and our ArcGIS Server web applications. As we start to look at Enterprise, I've heard mumblings that migrating the SDE layers to data store hosted feature layers would increase performance and/or scalability, provided the data store server has very fast disk read/write speed. Ultimately we can size the portal server bigger to handle the .sd files if need be, should the hosted feature layers perform better than RDBMS-registered feature layers.
Posted 12-22-2017 12:30 PM