ArcGIS Data Store Under the Hood

06-11-2015 09:02 AM
New Contributor II

What is ArcGIS Data Store under the hood? SQL Server Express? SQLite? Something proprietary?


28 Replies
Regular Contributor

Thanks Elijah, this solution worked for me for the hosted feature service database (port 9876), and I really appreciate the help! We needed to connect to the backend database to troubleshoot converting a hosted feature service back to an enterprise data store. We ran into an issue with column names and case sensitivity: our widgets broke in an existing Web Application as we tried to port the database to the enterprise geodatabase.

This helped confirm that all column names exist as lowercase in the backend database, including objectid and globalid. Also, when column names exceeded 32 characters we did not know how the Data Store would name them; it turned out it renamed those columns with numeric suffixes and kept the entire original names as the alias values.
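For illustration, here is a rough Python sketch of that observed behavior: lowercase everything, truncate names longer than 32 characters and add a numeric suffix, and keep the full original name as the alias. To be clear, this is a hypothetical reconstruction of what we saw in the backend, not Esri's actual implementation.

```python
# Hypothetical illustration of the observed renaming behavior. The exact
# truncation/suffix rules the Data Store uses are not documented here;
# this just mimics what we saw: lowercase names, numeric suffixes on
# long names, and the full original name preserved as the alias.

def datastore_style_rename(names, max_len=32):
    """Return (column_name, alias) pairs mimicking the observed behavior."""
    seen = set()
    result = []
    for i, name in enumerate(names):
        col = name.lower()
        if len(col) > max_len:
            suffix = f"_{i}"
            col = col[: max_len - len(suffix)] + suffix
        # Guarantee uniqueness after truncation.
        base, n = col, 1
        while col in seen:
            col = f"{base}_{n}"
            n += 1
        seen.add(col)
        result.append((col, name))  # alias keeps the full original name
    return result

pairs = datastore_style_rename(
    ["OBJECTID", "GlobalID", "A_Really_Long_Descriptive_Column_Name_Field"]
)
for col, alias in pairs:
    print(col, "->", alias)
```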

Esri Frequent Contributor

Was there a reason you couldn't look at the Fields information within the feature service layer?


Regular Contributor

Yes, because there are instances when publishing fails due to underlying database issues, and these are not reported back to the end user. In the case of BUG-000122241, the publication failed. Knowing as much as possible before going into tech support saves us hours and hours on the phone. Knowing that the table had published but not all of the field names had associations in the service was part of this root-cause analysis.

New Contributor

Thank you so much, Elijah, for this useful answer. I have a follow-up question similar to this one. I recently failed to access Portal for ArcGIS and found that there was a space issue (disk full) on the /home directory. I then saw this error in db.log: "2020-04-25 15:22:26 PDT: [12017]: PANIC: could not write to file "pg_xlog/xlogtemp.12017": No space left on device".

We added additional storage and restarted the server, but we still couldn't access Portal for ArcGIS. I assume the PANIC state forced the internal PostgreSQL database to shut down. My question: is there a way I can start it manually? In which path can I find the PostgreSQL service to stop and start it?

(Restarting the Data Store did not help.)

Thank you for any hint on this.

Regular Contributor

Hello Celine,

We experienced a very similar issue about a month or so ago. I reported a support incident, and the only thing the support analyst advised me to do was to uninstall Data Store and reinstall it. If you had any patches installed, you will want to reinstall those after the base install as well.

In our case this allowed the database instance to start again and everything was intact, although the uninstall and reinstall took a very long time.

Hope this helps!

New Contributor III

Hi Celine.

Generally, when a server runs out of disk space unexpectedly like that, it's because the system has been up for a long time and some application with aggressive logging behavior has eventually filled the disk.

While you say you've added additional storage, are you sure the new space is available to /home? If so, I would try renaming 'pg_xlog/xlogtemp.12017' (i.e. mv pg_xlog/xlogtemp.12017 pg_xlog/xlogtemp.12017.OLD), then rebooting to see if things come back online. It's possible some aspect of that last-used log file is now corrupted, and Arc-something-or-other is trying to access it and crashing when it fails to open it. Removing that file from scope may force it to start over, clean.

I don't recall what that service name is or what it might be, and unfortunately I'm not in a position to check. But in the terminal you might try typing "service p" and then hitting Tab twice, and see if CentOS suggests known service names that start with "p". If there are multiple service entries that look postgres-like, try stopping and restarting each of those services. Do a full service stop, then a fresh service start, just in case.

The next thing I would try is clearing out the entire log directory (pg_xlog), just in case a related log file got hosed. If your organization looks down on deleting logs, copy them into a safe directory before removing them so that you'll have backups. Also worth considering: if free space in your /home directory is still an issue, deleting one or two recent log files probably won't help. You may need to figure out what happened to all your space and free some up relative to /home. It might be enough to find and delete the largest and/or oldest logs; I would aim to free up at least a gigabyte to give yourself some headroom. Once you've done the spring cleaning, reboot and see if the system comes back up.
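If it helps, here's a minimal Python sketch of that spring-cleaning step: report free space on a filesystem and list the largest files under a directory so you can decide what to archive or delete first. The paths are examples only; point them at your own log locations.

```python
# Minimal sketch: check free space and find the biggest files under a
# directory. Paths below are illustrative; on the affected server you
# would check /home and your Data Store / PostgreSQL log directories.
import shutil
from pathlib import Path

def free_space_gb(path="/home"):
    """Free space, in GiB, on the filesystem containing `path`."""
    usage = shutil.disk_usage(path)
    return usage.free / 1024**3

def largest_files(root, count=10):
    """Return the `count` biggest regular files under `root`, largest first."""
    files = [p for p in Path(root).rglob("*") if p.is_file()]
    files.sort(key=lambda p: p.stat().st_size, reverse=True)
    return files[:count]

if __name__ == "__main__":
    # "/" always exists; swap in "/home" on the affected machine.
    print(f"Free space: {free_space_gb('/'):.1f} GiB")
    # e.g. for p in largest_files("/var/log"): print(p.stat().st_size, p)
```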

Hopefully it doesn't come down to uninstalling and reinstalling the Data Store entirely, as someone else mentioned. That seems like an awfully heavy-handed solution. (Although, unfortunately, it wouldn't surprise me, since it comes from the same organization that tightly couples the system's FQDN to the software at install time, an absolutely terrible design decision, IMO.)

Anyway, I apologize this isn't more helpful. May the odds be ever in your favor.


New Contributor III

For a base deployment (Portal, Server, and Data Store on the same machine), how many copies of PostgreSQL get installed? Multiple PostgreSQL installations, or one PostgreSQL with multiple instances? Do they use the same port and database name for access in pgAdmin? Can anybody please clarify?

Esri Frequent Contributor

If Portal and Data Store are on the same machine, you'll have two instances of PostgreSQL running. They'll use different connection strings (ports, database names, credentials, etc.). Repeating what's been mentioned above, though: accessing the database directly is neither supported nor recommended. If there's an API missing that would let you retrieve the information you're looking for, feel free to post an idea on the Ideas site.

Regular Contributor

Since it is still not possible to group layers in web maps (the old Map Viewer is required for Web AppBuilder), I only use map image layers so I can do the grouping in ArcMap/Pro. But now I have some hosted layers I want to bring in, and publishing a map image layer that contains hosted feature layers is not supported. So I was thinking I would add them as query layers from the PostgreSQL database behind the Data Store. That's how I found this thread, but after reading it I guess that's not possible?

Or does anyone have a suggestion?
@ElijahRobison I guess your solution isn't applicable to query layers from ArcMap/Pro?
