POST

I have a Python web tool I made that has an associated map service. The tool runs without issue and the result can be added to my map as a Map Image Layer, but if I print the map the layer's features won't be on it. All my other map services are added to the map without an issue; it's only this one that's a problem. API version 4.25.
Posted 09-15-2023 02:29 PM
POST

That is not the direct reason... as the error states, it's an underlying exception raised by pandas due to non-finite values being in one of the columns/fields; I get the same error with fields that have no coded domains. My guess is that in older versions of the API they were catching this error, or doing a conditional check before fixing and returning from the query method, but for whatever reason that workflow has been axed in the newest version. It could also be due to a newer pandas version making a change. Either way, I'm not going looking through all that code to figure it out, haha.

To check for the field, you could iterate through every field in the layer and try a query with as_df=True for each of them... whatever fields are causing the error will fail. @ConradSchaefer__DOIT_ good pick-up on the .sdf property still working... it's not ideal, but it is a workable fix until this is addressed (for me at least). I haven't tested it, but another solution would be to do what I am suggesting above to find the fields, and then run an update to change any NA or INF values (or rather, values that will convert to those types in pandas) to some default value. Hopefully ESRI will address this soon.
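As an aside, the pandas side of this failure can be reproduced without ArcGIS at all; once you have any DataFrame back, a quick way to find the offending fields is to scan the numeric columns for non-finite values. A minimal sketch (the column names here are made up for illustration):

```python
import numpy as np
import pandas as pd

# A frame with non-finite values, as might come back from a feature layer query
df = pd.DataFrame({"asset_id": [1.0, 2.0, np.nan], "flow": [3.5, np.inf, 7.2]})

def non_finite_fields(frame: pd.DataFrame) -> list:
    """Return the names of numeric columns containing NaN or +/-inf."""
    bad = []
    for col in frame.select_dtypes(include=[np.number]).columns:
        if not np.isfinite(frame[col]).all():
            bad.append(col)
    return bad

print(non_finite_fields(df))  # -> ['asset_id', 'flow']
```

On the repair side, `df.replace([np.inf, -np.inf], np.nan).fillna(some_default)` is the usual pattern for forcing such values to something downstream consumers can live with.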
Posted 09-23-2022 06:18 AM
POST

I'm trying to export a file geodatabase via the createReplica operation. I've tried both the Python API and the REST API directly (using requests), but both fail on my hosted feature layer due to an issue exporting the relationships for one of my layers... it does not give any more information than that. Does anyone know a possible fix or the root cause? I have an almost identical hosted feature layer I use for testing, and this works on it with no issue, so it must be something with the data, but I can't figure out what. The layer is Sync Enabled, and here are the parameters I'm using:

# Using requests
data = {'f': 'json',
        'replicaName': 'pi_features_prd',
        'layers': layers,
        'layerQueries': json.dumps(layer_queries),
        'returnAttachments': 'false',
        'returnAttachmentsDataByURL': 'false',
        'syncModel': 'none',
        'dataFormat': 'filegdb',
        'async': 'true',
        'token': token}
result = requests.post(replica_url, data=data).json()

# Using the Python API
from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection
from arcgis.features.managers import SyncManager

gis = GIS("https://arcgis.com", username, password)
fs_item = gis.content.get(item_id)
fc = FeatureLayerCollection.fromitem(fs_item)
sync_manager = SyncManager(fc)
result = sync_manager.create(
    output_fgdb_name,
    layers,
    json.dumps(layer_queries),
    return_attachments=True,
    sync_model='none',
    asynchronous=True,
    wait=False,
    attachments_sync_direction='bidirectional',
    data_format='filegdb'
)

# The error
"""
{'replicaName': 'pi_features_prd', 'replicaID': '', 'submissionTime': 1661540514537,
 'lastUpdatedTime': 1661540521653, 'status': 'Failed', 'error': {'code': 500,
 'message': 'Unable to create replica. Please check your parameters.',
 'details': ['Exporting relationships for layer 7 failed.']}}
"""
Posted 08-26-2022 12:49 PM
POST

The "Require update to latest version of this survey" setting in Connect/the web app is useless in the workflow where users access a survey via a map popup, if the previous version of the survey is already downloaded on their device. In this scenario the old version of the survey will automatically open within the Survey123 app (from either Field Maps or Collector) and the user can submit without issue. Is there any way to stop this from happening?
Posted 07-20-2022 08:08 AM
POST

Hmm, okay, thanks for the help. I'm starting to think it may have been either a corrupted service definition or an issue with the symbols used (they are a custom font). That being said, even after I switched back to the subtype field and set symbols on AGOL using one of their symbol types, the issue persisted, so more likely it's something with the SD, I would think. Probably gonna have to go to ESRI support... that'll be super fun.
Posted 01-12-2022 10:36 AM
POST

Can you publish the same layer again, but this time have the layer uniquely symbolized with a different field? That's how my layers were published when I encountered this. I'm going to do the inverse with my own data, but I can't share and show it due to the nature of it; it would be good to have a public example for ESRI if needed.
Posted 01-12-2022 08:37 AM
POST

No. Are you sure you have a subtype applied to that layer? Check your service definition. Also, what operating system are you using for Field Maps? I only tested iOS.
Posted 01-12-2022 04:42 AM
POST

What's the reasoning? Is this a side effect of something else, or an issue with the field apps (it happens in Collector also)? Symbology without the subtype field works fine in Map Viewer; it's only in the field apps that I'm having the issue. It's weird: the layers still come up as related layers to other layers in the map, but any Arcade expressions set with them no longer work, and the features don't display on the map (with no loading error). As soon as you switch back to the subtype field the problem is fixed. I can't find anything in the documentation regarding it. It's a pain for me, as I have to have subtypes for my fields for reasons I won't bore those reading this with.

Edit: Looks like it had nothing to do with subtypes; the symbols we were using may not be supported in the field apps for some reason... or possibly it's some side effect of them. With these symbols removed, the issue is gone. Does anyone know if there is a reference for the symbology and fonts supported by Field Maps and Collector?
Posted 01-11-2022 02:55 PM
POST

They could query the versioned views, but for the requirement being fulfilled, the weekly compress to state 0 is sufficient and simpler; not to mention it keeps our SDE running smoothly. What I need to figure out is what causes these unreferenced states to pop up and/or how to get them to compress. I have successfully gotten to state 0 with my method before, and from what I've read in ESRI docs and forum posts it should work without fail each time... so I'm obviously missing something here. Assuming it's not our services (and I'm 99% sure it's not; the data owner for all our datasets is a different user schema) and it's not a lock, where are these coming from? Thank you for the code, I'll give it a look over when I have some spare time.
Posted 10-04-2021 11:11 AM
POST

@George_Thompson wrote: "You are correct that locks may prevent a 'full' compress to state 0. Is the big thing that you are looking for getting all the edits flushed out of the delta tables?"

Yes, we want everything in the base tables after this script is run; it's part of our weekly process. Otherwise anyone trying to query the data at the database tier is in for a headache; we don't want to be messing with querying the delta tables to figure out what is what.

"If you have any services pointed to the data then it will put a lock on the tables, even if not editing."

I don't think that's exactly true, but feel free to correct me if I'm wrong. Defining "service" as something running on ArcGIS Server (not to be confused with a Windows service), it depends on the user who created the service, does it not? To my knowledge we have no services using our admin/sde user; we use a separate schema specifically for map services. Do you know how I can track down the origin of the states?
Posted 10-04-2021 10:00 AM
POST

Can you give your reasoning and state the relevance to the above? We assume everything of importance is posted at the time of this script run; any existing lock would interfere with getting to state 0, and that's why those tables are truncated. process_information is truncated to remove any lingering connections; I then reconnect after that to get the sde connection back. Regardless, the real question is: where are these excess states coming from? They shouldn't be there, because again I delete all versions (except default, of course) and turn all our services off (and by services I mean Windows services connecting to the SDE that manage our post and QA queues). No map service uses the SDE connection.
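To chase down where those states are coming from, querying the repository tables directly is the quickest check. The sketch below is only illustrative: an in-memory SQLite database stands in for the Oracle SDE schema, the table layout is simplified, and in a real geodatabase a state is still referenced if it sits in the lineage of any version's current state (SDE.STATE_LINEAGES), so this first pass can over-report.

```python
import sqlite3

# Toy stand-in for the SDE repository tables (the real names in an Oracle
# geodatabase are SDE.STATES and SDE.VERSIONS; columns simplified here).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE states (state_id INTEGER PRIMARY KEY);
CREATE TABLE versions (name TEXT, state_id INTEGER);
INSERT INTO states VALUES (0), (17), (42);
INSERT INTO versions VALUES ('SDE.DEFAULT', 0);
""")

# First-pass check: states that no version points at directly.
cur.execute("""
SELECT state_id FROM states
WHERE state_id NOT IN (SELECT state_id FROM versions)
""")
print([row[0] for row in cur.fetchall()])  # -> [17, 42]
```

States that show up here after all versions except DEFAULT are deleted are the candidates the compress should be removing.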
Posted 10-04-2021 09:55 AM
POST

No, there shouldn't be, but I'll double-check that. We truncate those tables because at the time this script runs everything should be posted, and our mappers are aware of that; we work on a weekly timeframe and have always gotten to state 0 at the end of the week. Locks could prevent the compression to state 0 because they prevent the deleting of a version (correct me if I'm wrong). This is probably not a common workflow, but the docs support it if you are assuming all non-posted versions are to be deleted (which we are).
Posted 10-04-2021 09:49 AM
POST

And I can provide this token to initiate a connection to AGOL via the Python API's GIS module?
Posted 10-04-2021 08:41 AM
POST

Sorry, I should have shared the code itself; I truncate all those tables prior to compression... but perhaps you could shed some light on this. I ran into an issue where, if I make a direct connection to the SDE w/ admin credentials and then truncate process_information, the compress would fail; I don't understand the underlying reason other than that it needs whatever info is in that table to compress. Maybe I need to separate my truncating of the lock tables from process_information? I do it in a loop since they are all the same commands.

# Connect to the sde; truncate all the lock tables and process information
# to help with getting to state 0. Note, this block **MUST** be run before
# making a connection to the SDE via the arcpy interface, as that connection
# adds a row to the process_information table that is essential to the
# compress operation.
logger.info("Truncating lock tables and process information for sde")
sde_conn = cx_Oracle.connect('{}/{}@{}'.format(admin_user, admin_pw, sde))
sde_cursor = sde_conn.cursor()
sde_truncate_ls = ["object_locks", "table_locks", "state_locks",
                   "layer_locks", "process_information"]
for table in sde_truncate_ls:
    sde_cursor.execute("TRUNCATE TABLE {}".format(table))

# Get an sde connection file for admin, prevent users from connecting,
# and disconnect anyone currently connected.
admin_sde = check_get_sde_conn(None, sde, admin_user, admin_pw, True)
arcpy.env.workspace = admin_sde
arcpy.AcceptConnections(admin_sde, False)
arcpy.DisconnectUser(admin_sde, "ALL")

# Before the compress we need to delete all versions if we want to get
# the sde to state 0.
logger.info("Getting and deleting all versions that have not been posted")
ver_list = [ver.name for ver in arcpy.da.ListVersions(admin_sde)
            if ver.name.lower() != 'sde.default']
for version in ver_list:
    logger.info("Deleting Version: {}".format(version))
    arcpy.DeleteVersion_management(admin_sde, version)

# Truncate the mm_session, mm_px_versions and gdbm_post_queue tables.
# Not necessary for state 0, but after deleting all versions any rows
# left in those tables are erroneous; there is no longer any state or
# version history associated with them. Erroneous rows can be in
# gdbm_post_queue, so it should be truncated also; everything should
# be posted at this point.
logger.info("Truncating process tables")
proc_conn = cx_Oracle.connect('{}/{}@{}'.format(proc_user, proc_pw, sde))
proc_cursor = proc_conn.cursor()
proc_truncate_ls = ["mm_session", "mm_px_versions", "gdbm_post_queue"]
for table in proc_truncate_ls:
    proc_cursor.execute("TRUNCATE TABLE {}".format(table))

# Time to compress the sde, then query the database and check how many
# states remain. If not at state 0 after the first compress, log the
# remaining states and try one more time; sometimes a second compression
# will clean up any lingering unreferenced states.
logger.info("Compressing the sde")
state_id, states = compress_and_check(admin_sde, sde_cursor)
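The compress_and_check helper isn't shown above; its state-counting half is just a query against the repository's STATES table. A sketch of that half, with SQLite standing in for Oracle (the real function would run arcpy.Compress_management first, and the table would be sde.states):

```python
import sqlite3

def check_states(cursor) -> tuple:
    """Return (max state_id, number of states) from a STATES table.
    At a full compress these should be 0 and 1: only state 0 is left."""
    cursor.execute("SELECT MAX(state_id), COUNT(*) FROM states")
    max_id, count = cursor.fetchone()
    return max_id, count

# Demo against a stand-in table
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE states (state_id INTEGER PRIMARY KEY)")
cur.execute("INSERT INTO states VALUES (0)")
print(check_states(cur))  # -> (0, 1)
```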
Posted 10-04-2021 08:39 AM
POST

Ah cool, so if I do it like this, even if the password expires on the account that registered the application, I can still generate the access token indefinitely?
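For context, the application-credentials flow being discussed posts the registered app's client_id and client_secret to the portal's oauth2/token endpoint with the client_credentials grant; each returned token is short-lived, but a new one can be requested at any time. A minimal, stdlib-only sketch of building that request (the IDs are placeholders, and the actual network call is omitted):

```python
import urllib.parse

TOKEN_URL = "https://www.arcgis.com/sharing/rest/oauth2/token"

def app_token_request(client_id: str, client_secret: str) -> bytes:
    """Build the form body for a client_credentials token request."""
    return urllib.parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "client_credentials",
        "f": "json",
    }).encode()

body = app_token_request("my_app_id", "my_app_secret")
print(b"grant_type=client_credentials" in body)  # -> True
```

POSTing that body to TOKEN_URL (e.g. with urllib.request.urlopen or requests) returns JSON containing an access_token and expires_in.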
Posted 10-04-2021 08:28 AM
| Title | Kudos | Posted |
| --- | --- | --- |
| | 1 | 08-26-2022 12:49 PM |
| | 6 | 07-20-2022 08:08 AM |
| | 1 | 06-26-2020 07:03 AM |
| | 1 | 10-04-2021 07:37 AM |
| | 1 | 02-25-2021 06:26 AM |
Online Status: Offline
Date Last Visited: 4 weeks ago