POST
Aaron: I've resolved my version of this, at least. What wound up fixing it was removing and reinstalling the 'X Window System' group package, which I did out of pure desperation. I'm not sure exactly what happened, but perhaps a package was updated since it was initially installed. (I did not do an update first to see what was different, though.) Hope that helps, -morgan
10-07-2013
01:47 PM
POST
Aaron: I'm running into what looks like the exact same problem on two of my servers. I've disabled SELinux, Puppet, and iptables on that server, and our networking guys confirm that the port is open (the fact that I can get to machine:6080 and into the manager helps illustrate that, too), so I'm rather bumfuzzled. I've done two completely fresh installations of AGS and still no dice. We've opened a support request and I'll update here with anything we figure out.

Morgan Harvey
GIS Application Developer
Portland State University
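For anyone else checking the same thing: the networking team's "port is open" claim can be sanity-checked from the client machine with a plain TCP connect test. A minimal sketch, assuming `gisserver` stands in for your ArcGIS Server machine name (the same host:6080 check the browser test performs):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, host unreachable, and timeouts.
        return False

if __name__ == "__main__":
    # "gisserver" is a placeholder for the ArcGIS Server machine name.
    print(port_open("gisserver", 6080))
```

This only proves TCP reachability, of course; it won't distinguish a firewall problem from a service that accepts the connection and then misbehaves.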
10-03-2013
10:42 AM
POST
We are currently testing and benchmarking ArcSDE with PostgreSQL, and I've run into some strange problems that I haven't been able to resolve on my own. I have been able to import some rasters and feature classes through both the ArcSDE service and a direct connection, but larger rasters and feature classes fail to load completely.

For rasters, ArcCatalog appears to simply stop sending data when the PostgreSQL table size reaches 539 MB. This happens consistently, with several separate rasters, when using LZ77 compression. With no compression, the upload stops consistently at 531 MB. (Reaching that point usually takes three to five minutes.) When using sderaster, the raster stops uploading after just 97 MB (about two minutes).

For features, I am able to import only about 3.5 million features (904 MB in PostgreSQL) of a shapefile containing about ten million points.

When these limits are hit, ArcCatalog acts as if it is still uploading data, and on the Postgres server I can see that the connections are still open but idle. I've left it running in this state for quite some time and it never seems to recover. Sometimes it fails with a Visual C++ Runtime Library error stating that the "application has requested the Runtime to terminate in an unusual way." Canceling the operation causes ArcCatalog to crash. When the ten-million-point import fails, ArcCatalog reports "ERROR 000224." There are no errors in the Postgres logs, there is more than enough disk space available for the Postgres tablespace (423 GB), and other applications are able to insert large amounts of data through long-running transactions. Again, I am able to import smaller datasets (for example, I've repeatedly imported a shapefile containing one million points) and have consistently and repeatedly imported smaller rasters.
I've tried raising the shared buffer size on the Postgres server (which needed to be done anyway), reinstalling ArcGIS Desktop, running ArcCatalog on both Windows XP and Windows 7, and loading many different raster sets. Nothing I've tried has produced a different result. The PostgreSQL server is running on a CentOS 5 box, fully updated. We're using ArcGIS Server 10, and I'm using ArcGIS Desktop 10 with an ArcInfo license.

Can anybody point me toward a solution, or at least some other things to try to troubleshoot the problem? Thanks,
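Since the symptom is connections that stay open but idle, one thing worth watching while a load stalls is `pg_stat_activity` on the Postgres side. A minimal sketch that shells out to psql, assuming a pre-9.2 PostgreSQL server (the CentOS 5 era): the `procpid` and `current_query` column names and the `<IDLE>` marker are the old forms (on 9.2+ they become `pid` and `state = 'idle'`), and the host, database, and user names below are placeholders:

```python
import subprocess

# pg_stat_activity columns in the pre-9.2 naming; an idle backend shows
# current_query = '<IDLE>' rather than state = 'idle'.
IDLE_QUERY = (
    "SELECT procpid, usename, backend_start, query_start "
    "FROM pg_stat_activity "
    "WHERE current_query = '<IDLE>';"
)

def psql_command(host="pg.example.com", dbname="sde", user="sde"):
    """Build a psql invocation that lists idle backend connections.
    The host, database, and user names are illustrative placeholders."""
    return ["psql", "-h", host, "-U", user, "-d", dbname, "-c", IDLE_QUERY]

if __name__ == "__main__":
    subprocess.run(psql_command())
```

Comparing `query_start` timestamps against when the ArcCatalog load stalled can at least tell you whether Postgres ever received a statement after the stall, or whether the client side simply went quiet.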
10-22-2010
03:30 PM