POST | 10-24-2024 10:06 AM
I have never used StreetMap Premium, but reading this page: https://doc.arcgis.com/en/streetmap-premium/latest/get-started/overview.htm shouldn't you be downloading the GCS mobile map package from your StreetMap Premium licensed organization's "My ESRI" portal? If I understand the Help text correctly, that map package contains pre-symbolized data similar to ESRI's "Navigation Map" style. See the last bullet, "Cartographic display".
POST | 10-22-2024 02:58 PM
@TimMinter wrote: "In further poking about the web, I ran across a bunch of grumbling about how Python will just hang during larger data loads. I'm assuming that the ArcGIS Pro Append GP tool uses Python code, and I have a suspicion (that I won't pursue) that it's just another example of that behavior."

AFAIU, quite a lot of the core ArcGIS geoprocessing tools are written in C++, not Python, but Vince can give a more informed and balanced answer. I also don't agree that "Python will just hang during larger data loads" is the cause of data load issues; see my response about OpenStreetMap data processing at planetary scale. Crappy programming and poor hardware / networks can cause major issues, though.

Marco
POST | 10-22-2024 02:09 PM
Well, I can't vouch for any specific tools like Append, but my experience has shown you can handle multi-billion record spatial tables in PostgreSQL and create ArcGIS Query Layers for them in ArcGIS Pro. However, this experience has also shown that some ArcGIS geoprocessing tools are very efficient and can handle such large datasets, while others cannot. It all depends on the tool and its exact implementation.

I currently run a > 4 TB OpenStreetMap database as a non-geodatabase PostgreSQL database (v17.0 currently) based on Facebook/Meta's "Daylight Map Distribution for OpenStreetMap", which includes nearly all of Google Open Buildings. The hardware I use is a refurbished HP Z840 workstation that I beefed up with 20 TB of NVMe disks (RAID-0), half of it serving as superfast backup, and 512 GB RAM. The largest table, containing all buildings and other polygon geometries of OpenStreetMap for the entire planet, has nearly 2.5 billion (!) records (yes, I am using 64-bit ObjectIDs); see the screenshot of DBeaver, with the actual styled data in Pro in the background as a PDF exported by ArcGIS Pro.

To handle import and export, I have successfully used Python modules available in ArcGIS Pro, like sqlite3, pyodbc and psycopg2. The original import of OpenStreetMap data uses osm2pgsql; subsequent processing steps use the other libraries. All of these tools have proven capable of moving hundreds of millions of records in and out of the database without significant slowdown, and of creating > 400M record File Geodatabases and SQLite spatial databases as export, but it requires careful thought about how to handle things in your Python code. E.g. storing the ObjectIDs to process as ordinary Python number objects in some huge Python list will quickly have you run out of RAM with billions of records, even if you have 64 GB or more available. I solved that issue by using numpy arrays and data types, which can be far more memory efficient (see the sketches below). But there is a plethora of issues to deal with if you want to write efficient Python / ArcPy code for handling these amounts of data, so I am not at all surprised some of the current ArcGIS geoprocessing tools fail to scale to those table sizes, likely because no developer ever tested them at those scales of data processing.

That said, using standard SQL and pyodbc / psycopg2, I have been able to e.g. run SQL UPDATE statements that modify an entire > 1B record table at a rate of 500M - 3.5B records per hour (up to roughly 1 million rows updated per second), depending on the operation performed, on this 2016 hardware using Python multi-threading options... Modern hardware should be able to easily double that.
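To make the numpy point concrete, here is a minimal sketch (not my production code) of why a plain Python list of ObjectIDs blows up memory while a numpy int64 array stays compact:

```python
# Memory cost of holding 10 million ObjectIDs as Python int objects
# in a list vs. as a numpy int64 array. At 2.5 billion records the
# same ratio means tens of GB saved.
import sys
import numpy as np

n = 10_000_000

ids_list = list(range(n))                 # ~28-byte int object per value, plus an 8-byte pointer
ids_array = np.arange(n, dtype=np.int64)  # exactly 8 bytes per value

list_bytes = sys.getsizeof(ids_list) + sum(sys.getsizeof(i) for i in ids_list)
print(f"list : ~{list_bytes / 1e6:.0f} MB")        # roughly 350+ MB
print(f"numpy: ~{ids_array.nbytes / 1e6:.0f} MB")  # 80 MB
```

And a hedged sketch of the kind of batched UPDATE described above, using psycopg2 with one transaction per ObjectID range; the table, columns and connection string are hypothetical, and the real runs used multi-threading on top of this idea:

```python
# Hypothetical batched UPDATE over a huge table, committed in
# 1M-row ObjectID ranges so no single transaction grows too large.
import psycopg2

BATCH = 1_000_000  # rows per transaction; tune to your hardware

conn = psycopg2.connect("dbname=osm user=gis")  # hypothetical DSN
with conn, conn.cursor() as cur:
    cur.execute("SELECT min(objectid), max(objectid) FROM osm_polygons")
    lo, hi = cur.fetchone()

for start in range(lo, hi + 1, BATCH):
    with conn, conn.cursor() as cur:  # 'with conn' commits each batch
        cur.execute(
            "UPDATE osm_polygons SET area = ST_Area(geom) "
            "WHERE objectid >= %s AND objectid < %s",
            (start, start + BATCH),
        )
conn.close()
```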
POST | 09-27-2024 09:52 AM
Well, there is the Recalculate Feature Class Extent (Data Management) geoprocessing tool, which you could easily call from ArcPy (see the sketch below). Do note though that it requires an exclusive schema lock, which might be problematic in some cases. It also calculates the extent based on the actual features, according to the Help page, while your first post suggests setting a predefined "organization" extent that may not reflect the data's true extent. I am not sure if that is possible with the ESRI tools, or if you would need to hack into the geodatabase's system tables to achieve that.
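For reference, a minimal ArcPy call; the feature class path is hypothetical:

```python
# Recalculate the stored extent from the actual features; this
# requires an exclusive schema lock on the feature class.
import arcpy

fc = r"C:\data\demo.gdb\parcels"  # hypothetical feature class
arcpy.management.RecalculateFeatureClassExtent(fc)
print(arcpy.Describe(fc).extent)  # inspect the refreshed extent
```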
POST | 09-01-2024 02:19 PM
If I remember it well, the "Number of Points" setting determines the minimum number of data points used to calculate a grid cell value, and the "Maximum Distance" setting can limit this:

- If you set a 'fixed' distance for the search radius and only 3 points are found within this distance, the cell's value will be based solely on those 3 points, not on the set "Number of Points" (e.g. the default of 12).
- If you set a 'variable' distance, the search for nearest data points will continue until "Number of Points" (e.g. 12) is found, irrespective of the distance.

So you have to answer these questions:

- Do I mind potentially adding data points beyond the set search radius? You likely shouldn't worry too much about data points beyond the search radius. Since Kriging is in a sense a form of IDW (Inverse Distance Weighted) interpolation, points further away influence the grid cell's value less anyway, and shouldn't overly contribute to or distort results even if less appropriate (unless some major break in data values is visible due to e.g. geological factors, in which case you might wish to set barriers).
- Do I care if fewer data points are included to calculate a cell's value? If the data is erratic (which already means it is less suitable for interpolation), including more data points might be better to get the overall picture.

Either way, I think the differences between the options will be limited. Try it out to find out and explore the error surface after interpolation; a sketch of setting both search-radius modes follows below.

A last question you always need to ask yourself: is my data suitable for interpolation in the first place? Sometimes other statistical methods are better suited for certain types of data, and classification of data points and correlation with environmental factors based on ordinary statistics is the more appropriate method to extract value from your dataset. E.g. if you have statistically proven that certain values correlate with certain geological strata, classifying a geological map based on this knowledge could also be a valid method of generating a space-filling dataset, instead of interpolation. Of course, with all the options for data exploration and post-result evaluation in a tool like Geostatistical Analyst, you should be able to tell whether your data is suitable for interpolation or not. But sometimes people forget that data should in fact have spatial auto-correlation, and stubbornly ignore indications otherwise, in a desperate attempt to "create a surface" from a set of data points, because the data "must be interpolated!" (no, it doesn't always, and if you've sampled at the wrong spatial scale to capture the actual phenomenon you're trying to get a handle on, spatial auto-correlation may also be virtually absent). If your data exploration says there is no real spatial auto-correlation, don't attempt to interpolate; find other ways to process your data. It may well still be suitable for some other type of statistical analysis with proper input of environmental factors.
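As a hedged sketch of the two search-radius modes discussed above, using the Spatial Analyst Kriging tool from ArcPy; the paths, field name, distances and cell size are all hypothetical:

```python
# Contrast 'variable' vs 'fixed' search radius in arcpy's Kriging tool.
import arcpy
from arcpy.sa import Kriging, KrigingModelOrdinary, RadiusFixed, RadiusVariable

arcpy.CheckOutExtension("Spatial")
points = r"C:\data\demo.gdb\samples"  # hypothetical point feature class

# Variable radius: search outward until 12 points are found, whatever the distance.
out_variable = Kriging(points, "VALUE", KrigingModelOrdinary("Spherical"),
                       cell_size=100, search_radius=RadiusVariable(12))

# Fixed radius: use whatever points fall within 2000 map units (at least 3).
out_fixed = Kriging(points, "VALUE", KrigingModelOrdinary("Spherical"),
                    cell_size=100, search_radius=RadiusFixed(2000, 3))

out_variable.save(r"C:\data\krig_variable.tif")
out_fixed.save(r"C:\data\krig_fixed.tif")
```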
POST | 08-27-2024 10:56 AM
Due to a historic JavaScript and Node.js limitation, 64-bit support is actually 53-bit support for some applications and uses in the ESRI product line. This might change in the future: https://blog.logrocket.com/how-to-represent-large-numbers-node-js-app/ https://v8.dev/features/bigint Not that you are likely to hit 53-bit ObjectIDs anytime soon, but see the "Caution" remark in this ESRI Help page. A quick illustration of the limit follows below.
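Classic JavaScript numbers are IEEE 754 doubles, and Python's float is the same 64-bit double type, so the 53-bit cliff is easy to demonstrate from Python:

```python
# 2**53 is the point past which consecutive integers are no longer
# exactly representable in an IEEE 754 double (JavaScript's classic
# Number type). Above it, distinct ObjectIDs can silently collide.
MAX_SAFE_INTEGER = 2**53 - 1  # same value as JavaScript's Number.MAX_SAFE_INTEGER

print(float(2**53) == 2**53)              # True  - still exact
print(float(2**53 + 1) == float(2**53))   # True  - 2**53 + 1 rounds down, precision lost
```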
POST | 08-25-2024 07:38 AM
This ESRI page mentioning Microsoft Azure Database for PostgreSQL (Flexible Server): https://enterprise.arcgis.com/en/system-requirements/latest/windows/databases-in-the-cloud.htm refers to this Microsoft Help page: https://learn.microsoft.com/en-us/azure/postgresql/flexible-server/how-to-connect-tls-ssl Maybe there is something there that explains the issue.
POST | 08-25-2024 07:27 AM
Can't speak for ESRI, and I'm not a web developer, but considering all the blog posts about new Calcite features released by ESRI in the past few years, I highly doubt Calcite will be ditched any time soon for another framework. Maybe the Jimu thing being the default is specific to the iOS and Mac platforms, as ESRI may feel Calcite is not yet mature enough on these platforms? Just speculation though...
POST | 08-02-2024 03:00 AM
Some tools, like Calculate Field, do not actually create a new dataset as output, but just modify the existing dataset. I think this may be why the "Add To Display" option is not working: the modified dataset may already have been added to the TOC in a previous step. You could add an extra "Make Feature Layer" step in the model just after the "Calculate Field" tool, and add the output of that instead (a rough script-side equivalent is sketched below).
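As a hedged script-side illustration of the same idea; the dataset and field names are hypothetical, and ModelBuilder handles display slightly differently than a standalone script:

```python
# Calculate Field edits the dataset in place, so it yields no new output;
# creating a fresh layer afterwards gives you something new to display.
import arcpy

fc = r"C:\data\demo.gdb\roads"  # hypothetical feature class
arcpy.management.CalculateField(fc, "LENGTH_KM", "!shape.length@kilometers!", "PYTHON3")
layer = arcpy.management.MakeFeatureLayer(fc, "roads_recalculated")  # add this output instead
```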
POST | 05-07-2024 05:33 AM
One thing you might still try, though, is to disable the graphics card in Windows before installing the graphics driver, then re-enable it once the driver has installed successfully, if the installation process hasn't already done so.
POST | 05-07-2024 05:23 AM
Also note that these "dedicated" graphics cards in laptops are not like true desktop graphics cards plugged into a PCIe slot. They are usually essentially co-processor chips soldered directly onto the motherboard to work together with the integrated graphics. That is fundamentally different from a high-end desktop graphics card, and may explain some of these issues as well.
POST | 05-07-2024 05:14 AM
@c1asse wrote: "The GPU drivers get disabled automatically and on startup there is no display currently. Please note that I have used other simulation software such as HEC-RAS and FLOW3D and have never faced any issue like this. I am unsure what mistake I am making while using the software or saving its files. Any help is appreciated. Thank you."

Do you mean the display in the Map view is permanently stuck on the "Loading Map" progress icon, the actual map display stays grey without showing anything, and Pro reports "Changes in your graphics hardware detected"? If so, welcome to the club ;-(. I have had a similar experience with what is still my only laptop, a Core i7-7700HQ with an NVIDIA GeForce 1050 graphics card. Despite contacting ESRI support about this, I have never managed to get Pro working with the graphics driver: not with DirectX, not with OpenGL, and not by upgrading to any of the latest versions of the drivers or any of the Windows updates since I acquired it.

I finally resorted to disabling the NVIDIA GeForce 1050 graphics card altogether through Windows settings, and only run on the Core i7-7700HQ's integrated graphics. That actually works better than initially expected, but of course it feels a bit dumb knowing I have "dead" unused hardware lying around that could probably do a bit better. Nonetheless, if you have no option to switch laptops, I would say go for it and run on integrated graphics only. More modern laptops than my six-year-old machine should have more muscle for that work too, so if I can run it on integrated graphics, you probably can too, and with a better experience.
IDEA | 04-30-2024 01:31 AM
@RTPL_AU wrote: "@SSWoodward From a paying customer perspective: if you already have internal testing processes that detect regressions, these regressions should be public and part of the release notes. And if not, this is still a valid idea and should be moved to a more appropriate section, such as 'things that Esri could do better that isn't a product feature'."

I think I would even go a step further. Regarding the nature of bugs in a new software release, there are two clearly distinguishable kinds in my opinion:

1) Bugs in newly added functionality that don't affect any of the existing functionality of ArcGIS. These allow the current user base to continue using Pro unhindered, as they did with the previous release, and to potentially upgrade to the latest release with the minor caveat of needing to avoid the affected functionality. While it might be a nuisance that some fancy new feature is not usable due to such an issue, these bugs are in my opinion relatively harmless and lower priority from the perspective of the user base (unless there is a risk that the bug causes major damage to e.g. an enterprise geodatabase, but such bugs are rare).

2) Regression bugs that directly affect the usability of existing functionality of ArcGIS, and make it impossible to use and / or upgrade to the latest release, either due to a severe performance issue or due to tools / functionality being truly broken and totally unusable.

In my opinion, the second kind - regression bugs - is far worse than the first, and should always be top priority for fixing, as they make it impossible for the existing user base to use or upgrade to the latest version while continuing their regular workflow.

"The specific issue you've brought up was introduced in Pro 2.9.6 and 3.1 and has been addressed in the coming release of ArcGIS Pro 3.3."

In my opinion, regression bugs should always be fixed in the next patch release, not in a major or minor release! That is, this bug should have been fixed in a 2.9.x or 3.1.x patch, not in Pro 3.3. Or, if the original fix was developed against the 3.3 code base, it should have been back-ported to all previously affected releases. I realize time constraints may make it impossible to fix things in the very next patch release, e.g. 2.9.7, but that still means the fix should have been part of 2.9.8 or so. If you do not follow such practices, you may end up with a perpetually broken product, because regressions are never fixed within the release cycle used by the existing user base.
POST | 04-28-2024 02:21 AM
With such a big difference, and all other things being equal, I would definitely recommend contacting ESRI support; they should be able to tell you whether this is a known issue and whether it is scheduled for a fix. It would not be the first time ArcGIS has suffered a performance regression that needed a fix.