POST
Similar behavior on our system, but in our case we found that invalid tokens attached to the Portal web map item for the REST service were causing the GP Print Service to fail when using secured services. Public services printed fine. This also affected the display of secured services with other export functionality: service information, legend, everything shows up; health checks come back fine; and no error messages specifically point to the cause of the failure. But we still got blank output, randomly, both on screen and in utilities like the Print Service. Not sure why the invalid tokens are being generated during map publication. We are using MXDs, not Pro. The fix that Tech Support worked out with us was to create a new Portal web map item and add the service to it as a web item. During that process we manually entered new credentials for authentication and stored the new token with the item. After updating the web apps to use the new web maps, the Print Service and everything else displays fine. It is a random condition in our ecosystem; not every publication fails. We verified the condition with services published using MXDs from 10.4.1 through 10.7.1, most recently this week, with a brand-new install of 10.7.1 Desktop publishing a service that stored an invalid token in Portal.
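The fix above re-stored credentials interactively through the web map item. For anyone who would rather script a fresh referer-scoped token against the standard Portal sharing REST endpoint, a minimal sketch is below. The portal URL, account, and referer are placeholders, and this only builds the request; sending it and updating the item are left to your own workflow.

```python
from urllib.parse import urlencode

def build_generate_token_request(portal_url, username, password, referer):
    """Build the POST target and body for Portal's generateToken endpoint.

    The endpoint path and parameter names follow the standard ArcGIS
    sharing REST API; the portal URL and credentials here are placeholders.
    """
    url = portal_url.rstrip("/") + "/sharing/rest/generateToken"
    body = urlencode({
        "username": username,
        "password": password,
        "client": "referer",   # token will only be valid for this referer
        "referer": referer,
        "expiration": 60,      # minutes
        "f": "json",
    })
    return url, body

url, body = build_generate_token_request(
    "https://portal.example.com/portal", "svc_user", "secret",
    "https://apps.example.com")
print(url)
```

Posting that body to the returned URL (for example with `urllib.request`) should yield a JSON response containing the new token.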
09-14-2019 10:26 AM | 1 | 0 | 1822
POST
Portal upgrade from 10.6.1 to 10.7.1 took 3.5 hours on our non-HA internal business system. All told, the complete Enterprise component upgrade to 10.7.1, patching, reconfiguration of scripts, and basic validation of the post-upgrade state of our business system took approximately 7 hours.
08-28-2019 03:02 AM | 1 | 3 | 2656
POST
Same problem after standing up a brand-new 10.6.1 base deployment to replace a prior deployment that had been upgraded over multiple versions. The issue did not exist in 10.6.1 when upgrading from 10.4.1; it is confined to both of our "brand new" clean 10.6.1 install attempts. The issue is quite extensive for our users: any service running through either IE or Edge that has to authenticate through "ArcGIS.com" (i.e., in a Web Map) fails with the message "The webpage you are viewing is trying to close the window. Do you want to close this window?". JavaScript loads fine, and no CORS limitations are in place. Chrome and Firefox work fine; the affected browsers are IE and Edge, both on-prem and off-prem. The only fix found so far is to add "https://www.arcgis.com" to the IE Options as a trusted site; adding it to Portal as a trusted site made no difference. This is hugely problematic, because our DB server is 2012, which means we can't upgrade to 10.7.1 to fix it, and both our content producers and our consumers are affected.
08-10-2019 05:05 PM | 0 | 0 | 2514
IDEA
I get that the ESRI message is "we don't want you using MS Access", but it's really the best software out there for the desktop user, far better than a file geodatabase, primarily because the fGDB is closed source. Ever try running strict SQL on an fGDB? Well, I do it, through the gdBee project on GitHub. That project is a life saver. Microsoft is revamping its OLEDB commitments, and maybe ESRI should follow suit; the need is there, which is why it was undeprecated in 2017. Plus, ODBC to MS Office is always possible. More to the point, maybe ESRI should release a proper ODBC driver for its file geodatabase. All of this fury would dissipate quickly if we could work with the fGDB the way we work with SQLite or MS Access. We need a regular, non-Catalog way to access file geodatabase contents for both data entry and data manipulation. ESRI offers full access to data contained in Excel; putting it back into Access would be an easy extension. Any refusal to do so is not because it isn't possible; it's purely a business decision. Please hear what we are saying, ESRI. Thanks.
07-24-2019 10:36 PM | 0 | 2 | 1752
IDEA
This may or may not be helpful, but if you can still access the older 32-bit Python environment from ArcGIS Desktop, you can script it without having to use a GUI. Not sure whether this can be done with Engine or Runtime, or whether there is a way to simply add in the ESRI components. Also, if you can spin up a VM and install 32-bit Desktop on it, that might be the way to go: give the VM access to the location where you store the pGDB and run the transition there. You can get an Edge VM image from Microsoft for free; all you need is the ESRI software and a VM host.
07-24-2019 10:08 PM | 0 | 0 | 1752
POST
Just encountered this in Excel 2016 as an available OLE data source. Curious about its origin and usefulness, and whether we can use it to connect to file geodatabases.
05-08-2019 03:27 PM | 0 | 0 | 948
POST
As an aside, watch out when working with 32-bit architectures. I got bitten recently when scripting work across Pro and Desktop through 32-bit personal geodatabases. I now have to keep three separate Python environments running to undertake simple workflows: 32-bit Desktop 2.7, 64-bit Desktop 2.7, and Pro's 3.6. To work with personal geodatabases in script, you have to switch to the 32-bit Python environment, which doesn't come up automatically if you have 64-bit Geoprocessing installed.
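Because the wrong interpreter fails in confusing ways, a cheap guard at the top of a script can save debugging time. A minimal sketch using only the standard library; the warning message and the 32-bit requirement for pGDB access reflect the situation described above, not an official check.

```python
import struct
import sys

def pointer_bits():
    """Return the pointer width of the running interpreter (32 or 64)."""
    return struct.calcsize("P") * 8

# Hypothetical guard for scripts that touch a personal geodatabase,
# which (per the workflow above) requires the 32-bit Desktop interpreter.
if pointer_bits() != 32:
    print("Warning: %d-bit Python %s; pGDB access needs the 32-bit "
          "Desktop interpreter." % (pointer_bits(), sys.version.split()[0]))
else:
    print("32-bit interpreter: OK for personal geodatabase work.")
```

Dropping this guard in before any geodatabase calls makes the environment mismatch fail loudly instead of silently.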
05-06-2019 06:33 PM | 0 | 0 | 734
POST
I'm leaning towards 64-bit MS Access as the front end for SQL Server. ArcGIS is meant to work with the architecture of spatial things, and that it does well. But it has never done the data component of spatial "data" well, and it seems to have less and less capability to work with data as the software matures. We keep losing functionality that worked (and works) so well, in favor of what? MS Access is a superb front end, regardless of whether the data lives in an MS Access container or is linked from SQL Server into one. As Tom mentions above, if the worry is that users might accidentally cause havoc with the ESRI columns, create views or data entry forms over the MS Access data. ESRI already has the proper replacement for the pGDB in SQLite, but for whatever reason won't push it forward into production. And the fGDB is absolutely just a virtual paperweight for real data applications (a black box with no gateway except ESRI tools): you can't connect to it to run standard SQL queries with anything other than QGIS (limited) and one or two independent GitHub projects (which work very well, imho).

We have a 10-year minimum life span on some spatial data apps; the solution certainly isn't to create "geodatabase relates" or ArcMap joins in proprietary software, knowing full well they will break when ESRI's stated update schedule makes them incompatible. The longevity solution for us is to use a proper container (in this case SQL Server), run data updates and maintenance in pure SQL (not SQL in Python), and join the non-spatial tables to the spatial framework through key fields, just as we were doing in the 1990s. We still use ESRI proprietary geodatabases on SQL Server because they have features that work well (versioning, etc.). But we also attach to those SQL Server databases with MS Access and let users work with the data tables there, or work with them directly in SQL Server Management Studio, which also works very well.

Office 365 is pushing default installs of 64-bit products, so there is no getting around migration to 64-bit compatibility. ArcGIS is already a bit late to the party on MS Access 64-bit compatibility. Upvoted the suggestion.
05-06-2019 06:16 PM | 3 | 1 | 1893
POST
Can you elaborate on the steps ESRI used? We are having the same issue, but when I was on with Tech Support they ended up bumping us to 'bug' status and closing the ticket. I'm a little concerned about the stability of our Portal and whether we should just rebuild. Rebuilding would be a last resort for us, as we have production content in it dating back to 10.3.1, and recreating it would take months. The debug logs show no errors, but users are out of sync, and reindexing does nothing. I decided to peek into the black box using pgAdmin and ran a few quick SQL queries to export a list of users from the store. Comparing that with the list of users produced by "ListUsers.bat", I found 43 ESRI users in the store that are not in the index. I'm unsure whether this is a problem or not, and would be very happy if there is a way to rebuild the index and move on. I've told my users not to create any more content until we have a resolution, which could be months away.
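The store-versus-index comparison described above is easy to repeat whenever the counts drift. A minimal sketch, assuming you have exported the two user lists to plain text by whatever means you prefer (the example inputs below are made up; substitute your pgAdmin export and the ListUsers.bat output):

```python
def users_out_of_sync(store_users, index_users):
    """Return (users in the store but missing from the index,
    users in the index but missing from the store).

    Inputs are plain username lists; comparison is case-insensitive
    and ignores blank lines, since the two exports may differ in case
    and whitespace.
    """
    store = {u.strip().lower() for u in store_users if u.strip()}
    index = {u.strip().lower() for u in index_users if u.strip()}
    return sorted(store - index), sorted(index - store)

missing_from_index, missing_from_store = users_out_of_sync(
    ["alice", "bob", "carol"],   # e.g. exported from the store via pgAdmin
    ["alice", "carol"])          # e.g. from ListUsers.bat
print(missing_from_index)        # ['bob']
```

Diffing normalized sets rather than raw lines avoids false mismatches from ordering or capitalization differences between the two exports.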
04-16-2019 05:18 PM | 0 | 1 | 2038
BLOG
Just adding a comment to my post: while I would posit in a theoretical discussion that web GIS is past its heyday, I see web databasing as just beginning to grow its own legs, hence the need for better DB management tools in Pro. Pro is ideally situated to be a leader in web DB technology, with its Conda environment leading the way. Spark clustering capabilities, and server-side processing in SQL Server via Python and R integration on the Microsoft side, are truly powerful aspects of the implementation. The problem, as I see it, is a disconnect between the data analytics capabilities that are so awesome in Pro and what the product team is actually implementing for the front end.

Maps are just 'views' of data, the end product of GIS data, not the starting point; you can't make a map if you don't have data. What Pro lacks are the tool interfaces to actually work with data and data storage containers. For instance, I'd love to just create some data matrices and then work with them in Pro, much the way 'R' or any other data analytics tool does. Data is easy: it's some spatial descriptors attached to attributes, stored in a container that allows it to be parsed. So why would I need to make a map to work with my spatial data? I've never understood that myself. Generally, maps are made so the data can be understood visually; I totally get that. But the data itself requires a certain integrity before it can be understood in a map, and that's where it all begins: working with data storage formats and the attributes contained therein.

Web GIS without direct access to the data storage mechanism is kind of a broken system, imho; it makes Pro just client-side software. If it isn't going to work for data management, they need to dump the FGDB and other proprietary formats, say "we will work with all your DBs", and open up their APIs. I miss the coverage model.
03-06-2019 01:16 PM | 4 | 0 | 11283
BLOG
Just getting into planning distributed workflows for our municipality. Web GIS does not work for us, due to a number of infrastructure-related problems and fiber limitations. We tried for years to make a single GIS server hosting data for editing work, but alas, we are throwing in the towel on that and moving to distributed, and even mobile, data servers to accommodate operations that stand up at different locations. It makes no sense for users to suffer through bandwidth limitations and bottlenecks throttling data flow between offices via a central repository and overworked servers, when distributing data directly to building and office networks works so much better and speeds up data editing and manipulation by factors that make the end user happy.

I'm somewhat disappointed to see Pro moving away from supporting distributed and enterprise needs in favor of workflows that really only work in urbanized areas, where data is always open, everyone is always connected, and fiber practically runs to each machine. That's only a small fraction of GIS users' realities. While I understand the "dream", it's not practical for many, and many more than ESRI is willing to admit will never be able to capitalize on their new business model. We can't even get a "view" of one of our contractors' DBs, hosted on the mainland, to work with any consistency or reliability inside our network. How in the world would we support all of our various business functions through "web GIS"? Data throughput would be through the roof just trying. We are moving from hosting much of our data on GIS Server to hosting very little of it there. Distributing data to local sites is the future of GIS, not the past, as data needs grow and infrastructure doesn't keep up. Plus, with ISPs now legally able to throttle bandwidth however they wish, moving to web GIS is simply a very risky prospect when you require guaranteed uptime, not to mention the increasing frequency and complexity of attacks on national and international backbone infrastructure by malicious actors.

My $.02 is that web GIS has already had its heyday, which is another reason we'll be on ArcGIS Desktop until ESRI finally retires the final version of it. We have to be able to function, and Pro doesn't contain the tools that will allow our business units to continue to operate. Some of my users are already switching to QGIS, and what we are seeing is a clear differentiation between workflow components: data maintenance and daily workflows on the desktop using DBs, and "published" final data sets being stood up on the GIS server. Offline editing of database data directly is absolutely a root-level business necessity for us. We are under a grant requirement to maintain local data stores, another use case that ESRI completely misses the ball on in Pro. +1 to the idea of branch versioning with complex requirements such as topologies.
03-06-2019 12:39 PM | 3 | 0 | 11283
POST
Don't mean to state the obvious, but IDLE 2.7.13 is Python 2.7 from ArcGIS Desktop, while IDLE (ArcGIS Pro) uses Python 3.6.5: two stated incompatible versions of Python. Analyze Tools For Pro is a surface-level tool. It does not work so well on complex scripts; its success rate is about on par with 2to3 (not great), and it doesn't patch most scripts well enough to create any semblance of true interoperability between Python 2 and Python 3. In my case, rewriting the code by hand after debugging did the trick for errors similar to yours. I found that I had to implement a bunch of try/except statements to get around incompatibilities between 2 and 3 that "Analyze Tools for Pro" doesn't catch.
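A minimal sketch of the try/except pattern mentioned above, for scripts that must run under both Desktop's Python 2.7 and Pro's 3.x. The specific renamed module and type shown here are standard examples of 2-to-3 moves, not taken from the thread's code:

```python
# Guard renamed imports: urlopen moved from urllib2 (Py2) to
# urllib.request (Py3), so fall back at import time.
try:
    from urllib2 import urlopen          # Python 2
except ImportError:
    from urllib.request import urlopen   # Python 3

# Guard renamed built-ins: 'unicode' exists only on Python 2.
try:
    text_type = unicode                  # Python 2
except NameError:
    text_type = str                      # Python 3

def to_text(value):
    """Coerce bytes or text to the interpreter's native text type,
    so downstream string handling behaves the same on 2.7 and 3.x."""
    if isinstance(value, bytes):
        return value.decode("utf-8")
    return text_type(value)

print(to_text(b"coordinates"))
```

The same shape (attempt the Python 2 name, fall back to the Python 3 one) covers most of the import and built-in renames that automated converters miss.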
12-28-2018 05:29 PM | 1 | 2 | 1486
IDEA
Pro is a no-go for my users until pGDBs are supported. Let me explain. We run enterprise GDBs for enterprise workflows; we have complex workflows at the organizational level that join disparate sources of data and run business units, and for those we use an enterprise databasing solution. However, the majority of our users have no need whatsoever for the level of complexity inherent in enterprise databasing. They want to fire up the software, do some editing, and get a result: wham, bam, done. And for that they use pGDBs, because pGDBs are accessible to SQL outside of ArcGIS. Again: wham, bam, done, reports printed, data delivered, actionable derivatives defined. ESRI fGDBs are almost entirely useless as a "data" storage back end; they can't be accessed by anything other than ArcGIS. Sure, those of us in the know use QGIS to perform standard SQL queries on fGDBs, but how many basic desktop users are going to do that? The answer in my organization is none. fGDBs are one-way storage containers: data goes in, but you can't get it back out. It's trapped there, and we still have no answer from ESRI as to why they won't open the spec. So my users complain that it doesn't meet their needs, and I, for my part, provide them with what they do need: pGDBs.

For the desktop user, pGDBs are ideal. They are hassle-free. They report in the formats desktop users want. They are relational and quickly interface with data in a multitude of other formats. Forms for data entry are easy to set up and quick to deploy, and SQL queries, while somewhat limited by the container design, are flexible enough to output all the tables any user needs. Plus, using pGDBs doesn't require your end user to be an IT expert; anyone who uses MS Office can use a pGDB. Yes, pGDBs are 'slow' in I/O, but they are extremely fast to the finish line for data users with desktop needs. Basic users don't care about I/O; all they care about is how much of a PITA it is to get a report or analysis.

So for everyday uses we set up our user ecosystem with pGDBs. For departmental workflows with multiple users we set up enterprise GDBs. File GDBs have no place in our workflows except as scratch workspaces. ESRI could change that paradigm by releasing the full spec on the fGDB, along with a non-map interface where we can implement the SQL and data analysis workflows that are higher level than map making and are what the users want. But they have not, so all we have is QGIS if we want to use fGDB data. Our users will continue to use pGDBs until Microsoft ends MS Access. If Pro doesn't work with pGDBs, Pro doesn't work for our basic users. The enhancement is necessary in the Pro product before we transition; if it isn't there, we don't transition to Pro. The calculus is really that simple.
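The kind of plain-SQL, no-map workflow described above is exactly what SQLite (the replacement the thread keeps pointing to) supports out of the box. A minimal sketch using Python's standard-library sqlite3 module; the "parcels" table, its schema, and its values are entirely made up for illustration:

```python
import sqlite3

# A hypothetical attribute table standing in for the kind of data the
# post describes; nothing here comes from a real geodatabase.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE parcels (
        parcel_id TEXT PRIMARY KEY,
        owner     TEXT,
        acres     REAL
    )""")
con.executemany(
    "INSERT INTO parcels VALUES (?, ?, ?)",
    [("P-001", "Smith", 2.5),
     ("P-002", "Jones", 0.8),
     ("P-003", "Smith", 4.1)])

# Plain SQL reporting: no map, no Catalog, no proprietary tooling.
rows = con.execute("""
    SELECT owner, COUNT(*) AS n, ROUND(SUM(acres), 1) AS total_acres
    FROM parcels
    GROUP BY owner
    ORDER BY owner""").fetchall()
for owner, n, total in rows:
    print(owner, n, total)
```

Swapping `":memory:"` for a file path gives a single-file container that any SQL-speaking tool can open, which is the accessibility argument the post makes for pGDBs.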
11-08-2018 03:04 PM | 9 | 0 | 1432
POST
Cloning doesn't actually "clone" your installation; it clones the packages in your installation. To accomplish the clone, it downloads all of the packages from online repos, as if you were creating a new installation for the first time. So if you don't let it finish, you won't end up with a complete environment. Conda everything. The ArcGIS package manager is not yet up to the task of managing Python environments; it'll get there some day, I'm sure, but it's not there yet. Dan Patterson is on the mark. I've set up .bat files that let me automate 'cloning' my setup to each of my advanced users' machines, including Spyder, Anaconda Navigator, and JupyterLab, while preserving the default ArcGIS Pro environment in case of a failure, a revert, or a new clone for an integration-with-R-type requirement. Or, if you want simplicity and want to keep your hard drive from filling up with utterly useless clones, conda straight into arcgispro-py3: add Spyder, upgrade the ESRI packages to 1.5, and off you go. If I didn't have users with unique needs, that's what I would choose to do.
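The .bat-file automation mentioned above boils down to a couple of standard conda command lines. A minimal sketch that only builds those commands (the environment name `arcgispro-py3-dev` and the extra packages are examples; running them is left to your batch files):

```python
def clone_commands(base_env, new_env, extras):
    """Build the conda commands for the clone-then-extend workflow:
    clone the stock environment rather than modifying it, then install
    the extra tooling into the clone only."""
    return [
        "conda create --yes --name {} --clone {}".format(new_env, base_env),
        "conda install --yes --name {} {}".format(new_env, " ".join(extras)),
    ]

for cmd in clone_commands("arcgispro-py3", "arcgispro-py3-dev",
                          ["spyder", "jupyterlab"]):
    print(cmd)
```

Keeping the stock `arcgispro-py3` untouched is what makes the revert path cheap: a broken clone is just deleted and recreated.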
10-22-2018 04:04 PM | 1 | 1 | 1308
POST
Also realizing this post is dated, but some orgs (us, for one) are still on SQL Server 2012, so it's relevant to share my experience troubleshooting poor performance with the platform over the last several weeks. We are running Enterprise 10.4.1 and SQL geometry. Load testing showed the DB was the bottleneck in our system, with response times of 5 to 15 seconds while the rest of the system responded in less than 0.04 seconds. This thread was truly helpful in figuring out how to troubleshoot, specifically the list of configuration changes posted above by Brian Leung. We kept hyperthreading enabled, went to 64 KB clusters for our data and log files, ran the calculations to set MAXDOP to a more appropriate number (it was 0 before optimizing), raised the cost threshold for parallelism to 50 for the machine, and kept default spatial indexing (for tweaking later). The results were phenomenal: the machine went from returning 2 transactions per second to 22 transactions per second, and response time balanced out against the rest of the system at less than 0.04 seconds under load. Thanks!
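For anyone repeating the MAXDOP calculation mentioned above, this sketch encodes my understanding of Microsoft's general guidance for SQL Server 2012-era systems (cap MAXDOP at the number of cores in one NUMA node, and at 8). Treat it as a starting point to validate against your own workload, not a rule:

```python
def suggested_maxdop(cores_per_numa_node):
    """Rough MAXDOP starting point: the cores in one NUMA node, capped
    at 8, per Microsoft's general guidance for pre-2016 SQL Server.
    MAXDOP 0 (the default the post moved away from) means 'use all
    available schedulers', which can over-parallelize spatial queries."""
    if cores_per_numa_node <= 0:
        raise ValueError("core count must be positive")
    return min(8, cores_per_numa_node)

for cores in (4, 8, 16):
    print(cores, "->", suggested_maxdop(cores))
```

The resulting number is then applied with `sp_configure 'max degree of parallelism'` alongside the cost-threshold change the post describes.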
10-01-2018 03:54 PM | 0 | 2 | 802