POST
@JakeSkinner Thanks - though this seems a little beefy for a lot of use cases. But my question is specific to large feature classes - I would like to use the following two methods:

```python
from arcgis.features import FeatureLayer
import numpy as np
import pandas as pd
from time import sleep

def truncate_portal_data(fl_url: FeatureLayer) -> object:
    """Truncates a feature layer or table through the REST endpoint.

    Args:
        fl_url (FeatureLayer): the feature layer to truncate

    Returns:
        object: results object
    """
    ids = fl_url.query(return_ids_only=True)['objectIds']
    results = fl_url.edit_features(deletes=ids) if len(ids) > 0 else {"results": "No features to delete"}
    return results

def update_portal_data(df: pd.DataFrame, fl_url: FeatureLayer, truncate: bool = True, chunk_size: int = 500) -> object:
    """Adds features from a dataframe to a Portal / AGOL feature service.

    Args:
        df (pd.DataFrame): DataFrame from which to update features
        fl_url (FeatureLayer): target feature layer
        truncate (bool, optional): Truncate table before updating. Defaults to True.
        chunk_size (int, optional): Rows per edit_features call. Defaults to 500.

    Returns:
        object: results of update operation
    """
    if truncate:
        truncate_portal_data(fl_url)
    numchunks = int(len(df) / chunk_size) or 1
    chunks = np.array_split(df, numchunks)
    for chunk in chunks:
        fl_url.edit_features(adds=chunk.spatial.to_featureset())
        sleep(5)
    return True
```

But my question is: is there any reason your script is preferable over the much simpler API methods above for large feature classes?
Posted 10-22-2024 11:21 AM | 0 | 0 | 1740

DOC
This is an excellent script - though it seems a little beefy for a simple truncate and append if your schema is the same. Doing it with two methods seems simpler. However, I was trying to understand whether this is supported for truncating/appending large feature classes in AGOL.

```python
from arcgis.features import FeatureLayer
import numpy as np
import pandas as pd

def truncate_portal_data(fl_url: FeatureLayer) -> object:
    """Truncates a feature layer or table through the REST endpoint, even if sync is enabled.

    Args:
        fl_url (FeatureLayer): the feature layer to truncate

    Returns:
        object: results object
    """
    ids = fl_url.query(return_ids_only=True)['objectIds']
    results = fl_url.edit_features(deletes=ids) if len(ids) > 0 else {"results": "No features to delete"}
    return results

def update_portal_data(df: pd.DataFrame, fl_url: FeatureLayer, truncate: bool = True, chunk_size: int = 500) -> object:
    """Adds features from a dataframe to a Portal / AGOL feature service.

    Args:
        df (pd.DataFrame): DataFrame from which to update features
        fl_url (FeatureLayer): target feature layer
        truncate (bool, optional): Truncate table before updating. Defaults to True.
        chunk_size (int, optional): Rows per edit_features call. Defaults to 500.

    Returns:
        object: results of update operation
    """
    if truncate:
        truncate_portal_data(fl_url)
    numchunks = int(len(df) / chunk_size) or 1
    chunks = np.array_split(df, numchunks)
    return list(map(lambda chunk: fl_url.edit_features(adds=chunk.spatial.to_featureset()), chunks))
```
Posted 10-18-2024 07:28 AM | 0 | 0 | 14512

POST
I'm curious what the best practices are for truncate and load of large datasets into ArcGIS Online feature services using Python. My inclination is to use a spatial data frame with the manager.truncate and edit_features methods on the FeatureLayer class. However, is writing 100k records using edit_features supported? Should I break it up into chunks of 10k and load that way? Maybe wait 10 seconds between each load?

Another pattern (one I don't like as much) is using a CSV, shapefile, or something else and then doing the "overwrite layer" pattern. However, in most cases I am querying the source data from an API or database elsewhere, so it's easier to create an SDF than a shapefile or CSV. Yet another is creating a new feature service from the data and choosing replace layer.

The context here is that I will sometimes get unexplained errors using the edit_features method - it will just crash and say "json decode error".
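For reference, the chunk-and-pause pattern described above can be sketched roughly as follows. This is a sketch, not a confirmed best practice: the `layer` object stands in for an authenticated arcgis `FeatureLayer`, and the chunk size and pause are guesses, not documented limits. Only the chunking helper is pure pandas/numpy.

```python
import time

import numpy as np
import pandas as pd


def split_into_chunks(df: pd.DataFrame, chunk_size: int = 10_000) -> list:
    """Split a DataFrame into roughly equal chunks of at most chunk_size rows."""
    numchunks = max(1, -(-len(df) // chunk_size))  # ceiling division
    return np.array_split(df, numchunks)


def truncate_and_load(layer, df: pd.DataFrame, chunk_size: int = 10_000, pause: float = 10.0) -> list:
    """Hypothetical loader: truncate the layer, then add features chunk by chunk."""
    layer.manager.truncate()  # drop existing rows before loading
    results = []
    for chunk in split_into_chunks(df, chunk_size):
        # SDF chunk -> FeatureSet -> adds; a retry wrapper here would be a
        # reasonable guard against the intermittent "json decode error"
        results.append(layer.edit_features(adds=chunk.spatial.to_featureset()))
        time.sleep(pause)  # give the service time between batches
    return results
```

Whether pausing between batches actually avoids the JSON decode errors is an open question; the helper simply makes the batch boundaries explicit.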
Posted 10-16-2024 09:03 AM | 0 | 2 | 1875

POST
I'm trying to query the AVG_CPU for my m2 data store per https://developers.arcgis.com/python/latest/api-reference/arcgis.gis.admin.html#datastoremetricsmanager and I'm trying to understand the units of the results. My code is:

```python
gis.admin.datastore_metrics.query(metric=DataStoreMetric.AVG_CPU, bin_size=1, bin_unit=DataStoreTimeUnit.HOUR, ago=7, ago_unit=DataStoreTimeUnit.DAY)
```

And my results are:

```python
[{'ts': datetime.datetime(2024, 10, 9, 17, 0), 'value': 793.0},
 {'ts': datetime.datetime(2024, 10, 9, 16, 0), 'value': 2905.0},
 {'ts': datetime.datetime(2024, 10, 9, 15, 0), 'value': 3252.0},
 {'ts': datetime.datetime(2024, 10, 9, 14, 0), 'value': 2101.0},
 {'ts': datetime.datetime(2024, 10, 9, 13, 0), 'value': 49.0},
 {'ts': datetime.datetime(2024, 10, 9, 12, 0), 'value': 29.0},
 {'ts': datetime.datetime(2024, 10, 9, 11, 0), 'value': 48.0},
 {'ts': datetime.datetime(2024, 10, 9, 10, 0), 'value': 24.0},
 {'ts': datetime.datetime(2024, 10, 9, 9, 0), 'value': 0.0},
 {'ts': datetime.datetime(2024, 10, 9, 8, 0), 'value': 1.0},
```

Am I reading this correctly that my avg_cpu is over 100% at certain times?
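To sanity-check the scale of the question, the returned bins can be summarized directly. Below is a quick sketch over a few of the bins shown above; `bins` just mirrors the sample output, and the summary says nothing about what the units actually are.

```python
import datetime

# A few of the bins from the query output above.
bins = [
    {'ts': datetime.datetime(2024, 10, 9, 17, 0), 'value': 793.0},
    {'ts': datetime.datetime(2024, 10, 9, 16, 0), 'value': 2905.0},
    {'ts': datetime.datetime(2024, 10, 9, 15, 0), 'value': 3252.0},
    {'ts': datetime.datetime(2024, 10, 9, 8, 0), 'value': 1.0},
]

values = [b['value'] for b in bins]
peak = max(values)
# Timestamps where the raw value exceeds 100 - the crux of the units question.
over_100 = [b['ts'] for b in bins if b['value'] > 100.0]
print(f"peak={peak}, bins over 100: {len(over_100)}")
```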
Posted 10-09-2024 10:59 AM | 0 | 3 | 1815
POST
@JamesTedrick et al - Any update on this? We have a public crowd-sourced survey that we want to rely on the inbox feature (for the app), since the initial report is complicated with multiple nested repeats, etc. Is it possible to use the inbox for public surveys in this way? Currently, when I try to open the desktop app anonymously with the appropriate link (arcgis-survey123://?itemID={formid}&action=edit&q:globalId={globalid}&update=true), the app just says "searching for surveys" and never finishes.
Posted 08-06-2024 07:34 AM | 1 | 0 | 1289

POST
We have survey users who can edit existing surveys using the web version of the survey, but cannot do so using the Survey123 app (either on mobile or desktop). Any ideas what could be going on here? The survey has the inbox enabled and all the permissions are set up correctly.
Posted 08-01-2024 08:12 AM | 0 | 1 | 729

POST
We have an application that uses app-based authentication via the OAuth 2.0 app ID and secret workflow. The development team is concerned that they might need to recycle the OAuth 2.0 application's secret (in case of a breach, or to comply with other security standards) while the Portal item owner is away (vacation, sick leave, etc.). We tried adding the OAuth Portal item to a shared update group, but members of the shared update group couldn't recycle the secret on behalf of the owner. Is there another pattern that Esri wants us to follow in this case, short of creating system / headless accounts (which I think is prohibited anyway)? These system accounts would need to own not only the OAuth 2.0 app but also any of the data layers it's scoped to.
Posted 07-19-2024 10:34 AM | 0 | 0 | 651

POST
I'm assuming this is unsupported, but wanted to make sure there is no workaround or that I'm doing something wrong. I am creating a table in my enterprise geodatabase, enabling global IDs and archiving, and sharing it as a referenced feature service with sync enabled to my Portal. I do this all with ArcGIS Pro / Python. When I share this feature layer with my distributed collaboration configured to send as copies, a hosted feature layer is created in my ArcGIS Online instance reflecting the data in my enterprise geodatabase. That works as expected.

However, this table is periodically updated outside ArcGIS - that is, with native SQL. This process runs a truncate on the table and repopulates it with new data, generating new global IDs on load as well. This is where the distributed collaboration breaks. My logs say "failure in processing exports for Replica", "Failed to export data changes message for replica with Guid," "Failed to export data changes to replica", "Invalid column value [globalid]." So I'm assuming something is happening with the globalid. They look to be the standard format {8}-{4}-{4}-{4}-{12}, where the number is the number of characters (e.g. 52B2EBC3-DBA2-46C1-93F1-0D6DD52A2F13).

So, two questions:
1. Is it unsupported to maintain a distributed collaboration when the source table is maintained outside of ArcGIS?
2. If not, is there a different process our DBA should follow so that the synchronization processes successfully?
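For what it's worth, the {8}-{4}-{4}-{4}-{12} layout described above is easy to check against the values the SQL load produces. The sketch below validates only the textual format - it is not the replica export's actual validation logic, and `looks_like_globalid` is an illustrative helper, not an Esri API.

```python
import re
import uuid

# Uppercase hex in the 8-4-4-4-12 layout, matching the example
# 52B2EBC3-DBA2-46C1-93F1-0D6DD52A2F13 from the post above.
GUID_RE = re.compile(r'^[0-9A-F]{8}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{12}$')


def looks_like_globalid(value: str) -> bool:
    """True if value matches the uppercase 8-4-4-4-12 GUID layout."""
    return bool(GUID_RE.match(value))


# A freshly generated GUID rendered in the same layout:
candidate = str(uuid.uuid4()).upper()
print(looks_like_globalid('52B2EBC3-DBA2-46C1-93F1-0D6DD52A2F13'), looks_like_globalid(candidate))
```

Even if every value passes this check, the export can still fail for other reasons (e.g. duplicates, or rows edited outside the versioning/archiving machinery), so this only rules out a malformed-format cause.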
Posted 07-10-2024 06:03 AM | 1 | 1 | 1222

POST
I can't seem to figure out how to measure from points on extruded features - differences in elevation from one feature to the next, distance from the z-value to the terrain, etc. Is this possible?
Posted 06-12-2024 11:58 AM | 0 | 1 | 837

POST
We have an Experience Builder app in AGOL that is shared with our organization, and we can point people to it. The URL takes the form https://experience.arcgis.com/experience/{itemid}. When a user navigates to that link, they are given the generic ArcGIS Online sign-in. However, our users have to use a specific ArcGIS organization URL to enable them to sign in with a SAML federation (they don't have usernames/passwords). So the first time, they have to click the button to enter your org URL, type in the org name, click remember this URL, then choose continue. Is it possible to edit the Experience Builder URL with URL params so that it knows which organization the experience is part of? It would be a much simpler process for the users (who sometimes don't know our org URL) to just be presented with their SAML authentication option.
Posted 06-06-2024 10:19 AM | 0 | 2 | 1201

POST
I've been exploring the Python API's AGOLAdminManager history queries, and was disappointed to learn that it does not seem to log data access on a per-user basis. So if we store organizational data on ArcGIS Online and need to pull logs on who accessed it and when, is that even possible with AGOL logs?
Posted 05-17-2024 05:51 AM | 0 | 0 | 726

POST
I'm working in ArcGIS Notebooks and would like users to be able to view the code in a shared notebook without having to actually open it and start up the kernel. I thought the Preview tab on the notebook's item information page had done this before, but currently in ArcGIS Online when I click on it (even on notebooks I own), it says "Preview is not available at this time". Is this a known issue?
Posted 04-03-2024 10:42 AM | 0 | 0 | 790

POST
@SHMAGIC This is a great solution, thank you. My question is, if you are doing this in vanilla Python, I'm curious how this works, since you need a token. I can generate an API key, but elevation is not technically in the "location services" available to your API keys. So is it possible to generate a token for API requests based on an API key without specifying your username and password (we log in via SAML)?
Posted 03-28-2024 08:38 AM | 0 | 0 | 4580
| Title | Kudos | Posted |
|---|---|---|
| | 2 | 02-23-2026 11:00 AM |
| | 1 | 07-08-2025 11:33 AM |
| | 1 | 11-07-2023 08:32 AM |
| | 2 | 10-01-2025 06:52 AM |
| | 5 | 09-08-2025 07:31 AM |