|
POST
|
I've also had great luck exporting a raster layer from an .mxd to KML / KMZ using the custom tool found here (I didn't create this and don't know who did): wankoelias / MapToGarminCustomMap — Bitbucket. It seems to do a good job of preserving raster resolution in the output, and it can be run straight from ArcCatalog inside an ArcMap session. Just download and unpack the tool / toolbox locally, navigate to it in Catalog, and give it a shot. Hope this helps! -Rex
05-31-2018
05:46 AM
|
0
|
0
|
5638
|
|
POST
|
Additionally, has the maximum server memory setting been adjusted in the instance properties on the server? By default, SQL Server will grow to use as much memory as it can get. That's a good amount of data, and depending on the volume of requests this server is receiving and serving, I could see how 16 GB could become limiting. In the screenshot example below, SQL has been capped at 4 GB. If you suspect that specific databases are consuming too much memory or buffer pool space, you can identify the most intensive databases on the instance. The first step is to run the following query against the SQL instance to see a list of databases and their respective buffer pool contribution (% of total):

DECLARE @total_buffer INT;

SELECT @total_buffer = cntr_value
FROM sys.dm_os_performance_counters
WHERE RTRIM([object_name]) LIKE '%Buffer Manager'
  AND counter_name = 'Database Pages';

;WITH src AS
(
    SELECT database_id, db_buffer_pages = COUNT_BIG(*)
    FROM sys.dm_os_buffer_descriptors
    --WHERE database_id BETWEEN 5 AND 32766
    GROUP BY database_id
)
SELECT [db_name] = CASE [database_id]
           WHEN 32767 THEN 'Resource DB'
           ELSE DB_NAME([database_id]) END,
       db_buffer_pages,
       db_buffer_MB = db_buffer_pages / 128,
       db_buffer_percent = CONVERT(DECIMAL(6,3), db_buffer_pages * 100.0 / @total_buffer)
FROM src
ORDER BY db_buffer_MB DESC;

Once a particular database has been identified, the following query can be run against that database to return a list of all indexes within it and their respective size / % contribution to the buffer pool (just substitute the name of the high-contributing database(s) from the first query above for <databasename>):

USE <databasename>;

;WITH src AS
(
    SELECT [Object] = o.name,
           [Type] = o.type_desc,
           [Index] = COALESCE(i.name, ''),
           [Index_Type] = i.type_desc,
           p.[object_id],
           p.index_id,
           au.allocation_unit_id
    FROM sys.partitions AS p
    INNER JOIN sys.allocation_units AS au ON p.hobt_id = au.container_id
    INNER JOIN sys.objects AS o ON p.[object_id] = o.[object_id]
    INNER JOIN sys.indexes AS i ON o.[object_id] = i.[object_id] AND p.index_id = i.index_id
    WHERE au.[type] IN (1,2,3)
      AND o.is_ms_shipped = 0
)
SELECT src.[Object],
       src.[Type],
       src.[Index],
       src.Index_Type,
       buffer_pages = COUNT_BIG(b.page_id),
       buffer_mb = COUNT_BIG(b.page_id) / 128
FROM src
INNER JOIN sys.dm_os_buffer_descriptors AS b ON src.allocation_unit_id = b.allocation_unit_id
WHERE b.database_id = DB_ID()
GROUP BY src.[Object], src.[Type], src.[Index], src.Index_Type
ORDER BY buffer_pages DESC;

I hope this helps in determining the cause of the memory issues and confirms whether you need additional resources or whether current data or settings in the instance can be adjusted to suit your needs. -Rex
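If the cap has not been set yet, it can be adjusted with sp_configure rather than through the SSMS properties dialog. This is a sketch only - the 4096 MB value mirrors the 4 GB example above, and the right number depends on your server's total RAM and workload:

```sql
-- Expose advanced options so 'max server memory' is configurable (requires sysadmin).
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap the SQL Server buffer pool at 4 GB (the value is in MB); adjust to your environment.
EXEC sys.sp_configure 'max server memory (MB)', 4096;
RECONFIGURE;
```

The change takes effect immediately without a service restart, so it is easy to test a value and raise it again if performance suffers.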
05-31-2018
05:08 AM
|
3
|
14
|
6148
|
|
POST
|
To add onto what Asrujit has posted above - there is an underlying difference between registering data as versioned and the creation and use of child / edit (transactional) versions in the geodatabase. The former prepares data for versioned editing through the creation of the associated database objects - primarily the delta tables (tables tracking adds, edits, and deletes) for that dataset (a feature class, for example), as well as entries in the required system tables. A quick tour of registering and unregistering data as versioned—ArcGIS Help | ArcGIS Desktop. The latter refers to the process of creating versions to perform edits in, off of (below / as children of) the Default version, which will always exist and is owned by the gdb administrative user. The link shared above is helpful for more information on this. Put another way - you can have versioned data with only the Default version, and you can have transactional or edit versions created in a database with no versioned data... but if you wish to edit versioned data in a version other than Default, you will need to do both (register data as versioned, and create and maintain edit versions other than Default). Hope this is helpful!
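As a quick illustration of the first half of this: registering a feature class as versioned records it in the geodatabase's table registry and creates its delta tables, which are named after the registration ID (A<id> for adds, D<id> for deletes). A sketch of how you might see this from the database side, assuming a SQL Server geodatabase with the repository in the sde schema (adjust the schema name to your setup; note this lists every registered dataset, not only versioned ones - the object_flags bitmask distinguishes them, which I've omitted for simplicity):

```sql
-- List registered datasets and the delta table names derived from registration_id.
SELECT registration_id,
       owner,
       table_name,
       'a' + CAST(registration_id AS VARCHAR(10)) AS adds_table,    -- A (adds) delta table
       'd' + CAST(registration_id AS VARCHAR(10)) AS deletes_table  -- D (deletes) delta table
FROM sde.SDE_table_registry
ORDER BY registration_id;
```

This is only for inspection - always register and unregister versioning through the ArcGIS tools, never by touching these tables directly.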
05-29-2018
11:06 AM
|
1
|
0
|
1086
|
|
POST
|
Hello Muhammad, From the description and the attached screenshots, it appears the issue is related to ArcGIS Server's inability to connect to the underlying Oracle database using TNS names. This can have several culprits, which you can systematically test until the issue is hopefully resolved. Some things to consider / test:

If possible, try making a connection to the underlying Oracle database using Oracle EZConnect syntax (instance = servername/service_name) instead of TNS names. If this is successful and works for your needs, no further steps are needed. However, if TNS naming is a requirement, then you will need to do the following:

ArcGIS Server requires the 64-bit Oracle client to be installed and configured as per the suggested configurations (depending on how your ArcGIS Server, Oracle, and client are set up) here: Register an Oracle database with ArcGIS Server—Documentation | ArcGIS Enterprise. Be sure to place the location of the 64-bit Oracle client libraries first in the PATH variable on the ArcGIS Server machine.

If you install the 64-bit Oracle ADMIN client, run netca (the Oracle Network Configuration Assistant), which will create the appropriate tnsnames.ora and sqlnet.ora files on the server machine.

If you install the 64-bit Oracle INSTANT client, you will need a couple of additional steps: Copy the tnsnames.ora and sqlnet.ora files from the Oracle database server or from a client machine that CAN connect successfully. These files are typically found in dbhome\NETWORK\ADMIN on the database server for the instance. Paste these files into a folder on the ArcGIS Server machine that is accessible to the ArcGIS Server service. Create a new system environment variable named "TNS_ADMIN" with its value set to the path of that folder containing the newly copied tnsnames.ora and sqlnet.ora files. More on this is described here: Instant Client FAQ. Restart the ArcGIS Server service.

Then re-attempt connection / registration of the database with ArcGIS Server. This is a fairly exhaustive list, but it covers the most typical reasons people encounter ORA / TNS errors when registering an Oracle database with ArcGIS Server. I hope this is helpful - please feel free to reach out with any questions, and let me know if this works for you!
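For reference, a minimal tnsnames.ora entry looks like the sketch below. The alias, host, and service names here are hypothetical - substitute the values from the working copy on your database server:

```
MYDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbserver.example.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = orcl)
    )
  )
```

With TNS_ADMIN pointing at the folder containing this file, the alias on the left (MYDB) is what you supply as the instance when registering the database.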
05-29-2018
05:51 AM
|
2
|
1
|
1674
|
|
POST
|
Hello Kirsten, This is an interesting issue. If you run the Clear Workspace Cache tool on the Published database connection from which the FC is coming, and then re-try the Select By Attributes, does the query show up? A few other questions: How is the FC copied from Production to Published SQL geodatabases - is it just a copy / paste in Catalog? Are Production and Published both 10.3.1 geodatabases residing on the same database instance, or on different servers / instances? Have you tried this from a different machine / workstation or a different OS version to make sure it's not GUI-based? Does a simple attribute query still work in Published even though the SQL statement is missing, or does it fail? Thanks, and hope this is helpful! Best, Rex R
05-24-2018
09:58 AM
|
2
|
1
|
857
|
|
POST
|
Hi Claudio, You might be experiencing an issue similar to what was described here: Add Data Query with Custom SRID. You might want to reach out to Melita, who would likely be able to either answer your question or point you to someone who can. I hope this is helpful!
05-01-2018
09:44 AM
|
1
|
0
|
1576
|
|
POST
|
Hello Matthew, If you had a support call that didn't result in a successful workaround for this issue, I'd suggest either reopening the last ticket or requesting a new case, to see if this request is suitable in nature for an enhancement to be logged. Be sure to mention that you have already logged an ArcGIS Idea documenting this use case and the reasoning behind it. Although this can't guarantee that this functionality (supported PostGIS storage outside of the public schema) will be implemented, it will at least better the odds of getting a final word on whether this is possible or will be addressed in the future. I hope this is helpful!
04-24-2018
02:00 PM
|
1
|
1
|
1029
|
|
POST
|
Hello Edith, How large of a table are you loading? Also, by loading, do you mean performing writes to the database itself via data loading, or simply drawing or querying the database to view data in ArcGIS Desktop? Does this appear to be data-specific or only size-specific? It appears that the solution you mentioned (the SEND_TIMEOUT setting) is the typical recommendation: https://kb.informatica.com/solution/23/Pages/40/262530.aspx. Could you test a value lower than 600 that would still allow the operation to complete and that your DBA would be more satisfied with? Lastly, have you ruled out any network instability or intermittent connection drops from clients to the geodatabases involved in the testing? I hope this is helpful!
04-24-2018
07:50 AM
|
0
|
0
|
747
|
|
POST
|
Hello Mats! Is there a need to custom-build this address locator? I ask because China is a supported Level 2 country in the ArcGIS World Geocoding Service: Geocode coverage—ArcGIS REST API: World Geocoding Service | ArcGIS for Developers. More on using the service can be found here: Working with the ArcGIS Online World Geocoding Service—Help | ArcGIS Desktop - it should be available under the Geocoding toolbar in ArcGIS Desktop. Note that cities are essential for Chinese addresses to geocode as expected - see the note below: Note: There are additional requirements for geocoding addresses in China. To geocode any addresses in China, a valid Chinese city name must be included in the geocode request. To geocode English or Pinyin addresses in China specifically, the sourceCountry parameter must be included in the request: sourceCountry=CHN. I hope this is helpful!
04-20-2018
05:13 AM
|
1
|
0
|
2801
|
|
POST
|
As Jake said above - there will be no easy way to achieve this using ArcGIS tools, as you will need at minimum two geodatabases for gdb replication. It looks like something similar to this might be possible, but it will require some further research and extensibility: GoldenGate: Replicate data from SQLServer to TERADATA – Part 1 – GREPORA and GoldenGate: Replicate data from SQLServer to TERADATA – Part 2 – GREPORA
04-19-2018
05:55 AM
|
1
|
0
|
977
|
|
POST
|
Hi DEV APP, Have you been able to identify the field which is getting created with a LOB type? Chances are that upon XML import it's trying to create this table in a tablespace and encountering the ORA-02327 error. As the error indicates, you cannot move or rebuild an index on a LOB datatype in another tablespace: Oracle DBA: How to rebuild a LOB Index in oracle. Therefore, I'd suggest changing the column datatype, or using an import / load method that won't automatically try to create an index on this field. You could try running the Feature Class To Feature Class GP tool, or creating a schema-only table in Oracle mapped from the schema of the source table and then loading the data for this particular feature class. I hope this is helpful!
04-19-2018
05:21 AM
|
1
|
0
|
1351
|
|
POST
|
Hello Ewa - what version of SQL Server and which ArcGIS Desktop / Pro release are you running the Enable Enterprise Geodatabase tool from? Have any privileges been revoked from any user and / or the public role? Lastly, can you check the sde_setup.log file, which should be located in the C:\Users\<username>\AppData\Local\Temp directory?
04-19-2018
04:47 AM
|
0
|
0
|
7055
|
|
POST
|
Hi Doug, I'm sorry to hear you are still hitting some permissions issues at 2.1.2. Just to confirm - have any modifications been made to the public server role in this geodatabase? Specifically, do you know if any privileges have been manually revoked from the public role? There was a specific defect logged recently for AD / SQL Server permissions; however, after further investigation it was found that the geodatabases where this issue was reproducible had modified / revoked permissions on the public role. I just want to ensure that's not the case here. If no privileges have been modified for the public server role in this geodatabase, it sounds like a new defect needs to be logged for Pro 2.1.2 (and possibly for 10.5.1 or 10.6). I'd recommend either requesting to reopen case #02071470, citing that this is still an issue at Pro 2.1.2, or (it may be faster) requesting a new case created specifically for this behavior. I hope this is helpful, and please let me know if I can help with the process!
03-22-2018
05:49 AM
|
1
|
1
|
2859
|
|
POST
|
Hi Bill - sorry to hear you are still having issues with this compress / the locks. The filling of the transaction log could well be due to orphaned locks. Did you ensure the following tables were completely empty in SQL Server before attempting the compress? sde.SDE_object_locks, sde.SDE_layer_locks, sde.SDE_table_locks, sde.SDE_state_locks, sde.SDE_process_information. Additionally, I'd recommend truncating or shrinking the transaction log to free up space in SQL Server. Further, ensure the 'Recovery Model' of the geodatabase is set to 'Simple' before running the compress. You can change it back to 'Full' after the compress if you want, or leave it set to Simple and switch it to Full when you want to take a manual or scheduled full backup of the database. Please give the above suggestions a try, and hopefully this gets your GDB back up, running, and accessible / compressed.
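To illustrate, the lock-table check and the recovery model change can be done in one pass from SQL Server Management Studio. This is a sketch assuming the geodatabase repository lives in the sde schema and the database is named gisdb - substitute your own schema and database names:

```sql
-- All of these should report 0 rows before running the compress.
SELECT 'SDE_object_locks' AS lock_table, COUNT(*) AS rows_remaining FROM sde.SDE_object_locks
UNION ALL
SELECT 'SDE_layer_locks',  COUNT(*) FROM sde.SDE_layer_locks
UNION ALL
SELECT 'SDE_table_locks',  COUNT(*) FROM sde.SDE_table_locks
UNION ALL
SELECT 'SDE_state_locks',  COUNT(*) FROM sde.SDE_state_locks;

-- Switch to simple recovery so the compress doesn't fill the transaction log.
ALTER DATABASE gisdb SET RECOVERY SIMPLE;

-- After the compress, optionally switch back before your next full backup:
-- ALTER DATABASE gisdb SET RECOVERY FULL;
```

Under the Simple model the log is truncated automatically at each checkpoint, which is why the long compress transaction is far less likely to run the log file out of space.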
03-22-2018
05:41 AM
|
1
|
1
|
3796
|
|
POST
|
Hi Alex, thanks for posting those screenshots. Is the web app consuming the published map service or the published feature service? It appears you are publishing both. Also, for the feature class in question, can you post a screenshot of the Condition field's properties in ArcCatalog (right-click > Properties > Fields > click the Condition field), as in the example below? I want to confirm the proper domain is mapped to the proper field. Thanks!
03-16-2018
10:28 AM
|
1
|
0
|
2898
|