POST
@RichardHowe thanks! Yes, clone_items() worked well. Other team members using clone for system migrations have had issues with data integrity, so I was skeptical of using it at first. Works fine in this context though.
Posted 10-18-2023 08:12 AM
POST
Getting the same IndexError when I run the code snippet. No other code other than logging into the GIS. Put a couple of print statements in for the IDs and they came back with the expected IDs: 3 feature layers and a table, numbered 0-3.

LYR IDS: [0, 1, 2]
TBL IDS: [3]

If I change the table ID list to [0], I get a different error:

Exception: Unable to add feature service definition. Invalid definition for System.Collections.Generic.List`1[ESRI.ArcGIS.SDS.Metadata.LayerCoreInfo] Exception has been thrown by the target of an invocation. (Error Code: 400)
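For context, id lists like the ones printed above can be derived from a service's sublayer definitions. This is a hypothetical sketch (the function name and the `type` field convention are assumptions for illustration, not the poster's actual code):

```python
# Hypothetical helper: split layer ids from table ids given a list of
# sublayer definition dicts. Assumes each dict carries 'id' and 'type'.
def split_ids(sublayer_defs):
    lyr_ids = [d['id'] for d in sublayer_defs if d.get('type') != 'Table']
    tbl_ids = [d['id'] for d in sublayer_defs if d.get('type') == 'Table']
    return lyr_ids, tbl_ids
```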
Posted 09-29-2023 03:25 PM
POST
Could only find 3.0.3 in the downloads on My Esri. Still no joy on portal access after wiping the former Pro/Python install and reinstalling.
Posted 08-15-2023 09:21 AM
POST
Yes, using Pro 3.1.2 for this. Will test with 3.0.1.
Posted 08-14-2023 04:58 PM
POST
Wasn't sure whether to put this in the Portal community or the Python community. Have a user account set up on a recent upgrade to Portal 11.1. Scripts using the account have no problem accessing the portal if they use arcpy.SignInToPortal(), but scripts fail if they attempt to log in via arcgis.GIS(). The scripts work fine on Portal 10.9. Checking whether this is a bug or possibly a missed setting somewhere in the upgrade. The user is Creator type, Publisher role. However, the user ID used on 10.9 was Pro Advanced. Did have someone test it on 11.1 with another ID as Pro Advanced and still got the login failure.

from arcgis import GIS
import arcpy

portal_url = 'https://portal.ourportal.com/portal'
usr = 'username'
gis = GIS(portal_url, usr)
>>> Enter password:

# VERSUS
arcpy.SignInToPortal(portal_url, usr, pw)

TRACE DUMP...

Traceback (most recent call last):
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 1948, in _check_product
    res = self.get(
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 871, in get
    return self._handle_response(
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 1008, in _handle_response
    self._handle_json_error(data["error"], errorcode)
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 1031, in _handle_json_error
    raise Exception(errormessage)
Exception: User not allowed for this account (Error Code: 403)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\__init__.py", line 585, in __init__
    raise e
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\__init__.py", line 524, in __init__
    self._portal = _portalpy.Portal(
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_portalpy.py", line 205, in __init__
    self.con = Connection(
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 354, in __init__
    self._product = self._check_product()
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 1954, in _check_product
    res = self.get(baseurl + "info", params={"f": "json"})
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 871, in get
    return self._handle_response(
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 1008, in _handle_response
    self._handle_json_error(data["error"], errorcode)
  File "d:\ESRI_conda\envs\arcgispro-py3-clone\lib\site-packages\arcgis\gis\_impl\_con\_connection.py", line 1031, in _handle_json_error
    raise Exception(errormessage)
Exception: User not allowed for this account (Error Code: 403)

VERSIONS: Portal 11.1, arcgis 2.1.0.2, arcpy 3.1
Posted 08-14-2023 04:04 PM
POST
Problem: When using arcpy to publish a web layer/feature layer collection, I am not getting the item.get_data() object that lists the fields, pop-ups, etc. Code follows.

Scenario:
* Have a set of maps in a Pro project, each having multiple layers. Pro v3.0.3.
* A script loops through these, publishing each as a web layer (feature layer collection).
* The publish script runs fine, except that the data object (i.e. 'text' in the item properties, I believe) is not created when I look at it in Assistant.
* If I publish the same map manually from Pro (Share > Publish Web Layer), I do get the data object.

Need this object (specifically, the layers) for the remainder of the scripting that updates popups, field settings, etc. The item object looks as expected except for the missing item.get_data() object. The JSON at the REST endpoint looks as expected. The feature layer collection functions as expected. What am I missing? Is there a subsequent step needed to manifest this object in the item?

import arcpy

aprx_file = 'path/to/file.aprx'
folder_name = 'Existing_folder_name'
group_name = 'Some group name'
server_type = 'HOSTING_SERVER'

aprx = arcpy.mp.ArcGISProject(aprx_file)
map_objects = aprx.listMaps()
for map_object in map_objects:
    service_name = map_object.name
    # Placeholder paths added so the snippet is self-contained; the
    # original post left sddraft_filename and sd_filename undefined.
    sddraft_filename = f'{service_name}.sddraft'
    sd_filename = f'{service_name}.sd'
    sddraft = map_object.getWebLayerSharingDraft(
        server_type=server_type,
        service_type='FEATURE',
        service_name=service_name
    )
    sddraft.exportToSDDraft(sddraft_filename)
    arcpy.server.StageService(sddraft_filename, sd_filename)
    arcpy.server.UploadServiceDefinition(
        in_sd_file=sd_filename,
        in_server=server_type,
        in_service_name=service_name,
        in_folder_type='EXISTING',
        in_folder=folder_name,
        in_override='OVERRIDE_DEFINITION',
        in_public='PRIVATE',
        in_organization='NO_SHARE_ORGANIZATION',
        in_groups=[group_name])
Posted 04-13-2023 07:46 AM
POST
I was hoping the documentation was lagging/lacking. Saw another post where they guessed an undocumented name by adding the 'DE' prefix and it worked (DETraceNetwork). Have seen a handful of similar missing items mentioned in posts at times. Have tried Solution, GPSolution, and DESolution just to take shots at it. 🙂
Posted 02-09-2023 10:35 AM
POST
Is there a parameter data type that can be used to navigate to a deployed Solution in a Python toolbox? I currently have the user entering the item ID as a text string. Clunky. Would prefer navigating to it in the portal contents, like GPFeatureLayer allows for feature layers, etc.
Posted 02-08-2023 11:45 AM
POST
Hello @Clubdebambos,

Thank you for the reply. I think the section of my original post where I rename the layers (third code block) is essentially the same as what you have provided. My bad; I was inconsistent on naming and did not call out one abbreviation detail. My line...

fs_id = 'TheGUID'

...is for the published feature service item, so I think the rest aligns with what you suggested. And my...

lyr_renames = {'fc_name': 'fc_alias'}

...was meant to be just a placeholder for a full dictionary, just like you proposed. I compiled it when building the GDB/zip file that was published, but it is effectively the same. I should have noted that in the post.

If I am understanding the AGOL dynamics correctly, it is a bit circular. I can rename the layers in the feature service, but the next overwrite drops in another copy of the GDB with the old names (because of the same naming restrictions as the original), and that clashes with the renamed feature service. What I was hoping, in part, was that there is a similar name/alias relationship in the published feature layer that I could exploit, or a way of forcing the publish and overwrite to use the GDB aliases. For now, our workaround is to create a view that has the naming the users prefer.
Posted 01-16-2023 09:16 AM
POST
I have not deployed that particular solution, but I have deployed two others. Documentation, as you have found, is slim. It has been trial and error, exploring the various components and seeing how they relate.

Looking at the content page for the deployed Solution item itself will help. Look at how Esri has grouped the various components; the grouping should make sense in the context of the domain of the solution. For your Joint Use solution, these all tie into the chart at this link. Essentially, you will have a core set of 1-to-N feature services that feed a network of views, web maps, web apps, and dashboards. (Plus there are some pieces in Joint Use that I have not seen before.) It takes a little studying and walking through your typical domain workflows, but the groups and pieces will start to make sense. Along those lines, I would suggest you write out user stories and then see how they translate to the groupings.

Loading the data into those core feature services is a large mapping exercise; plan on spending a lot of time on that. I think Esri has a tool out there to help with the data load (a Google search should find it), but we built our own ETL scripts in Python. The dashboards are another area that will need some work, which is not surprising, as dashboards are nearly always customized to the particular business needs.
Posted 01-05-2023 07:35 AM
POST
Short story: I would like AGOL to use the layer alias names, not the feature class names, when I publish from a file GDB.

Long story: I am publishing a set of 20 feature classes contained in a file GDB to AGOL. This ends up as a feature layer collection after processing. The online layers are created with the feature class names instead of the aliases. I would like the online layers to use the feature class aliases, as there are spaces and punctuation in the names that the users prefer. That punctuation is not allowed in feature class names in the GDB but is OK for the AGOL names. Also, the AGOL feature service will subsequently be updated with an overwrite, so the names either need to be 'sticky' or the same question applies to the overwrite process. How do I use the alias name instead of the feature class name?

Publish code I am using...

from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

gis = GIS(userName, passWord)
service_properties = {
    'title': 'A Name',
    'type': 'File Geodatabase',
    'itemType': 'file',
    'tags': ['tags'],
    'typeKeywords': ['File', 'Geodatabase'],
    'description': 'A description',
    'snippet': 'A snippet',
    'spatialReference': '3857'
}
content_zipped = r'C:\ZippedUpFileGDB.zip'
new_item = gis.content.add(
    item_properties=service_properties,
    data=content_zipped
)
publish_properties = {
    'name': 'Service name',
    'description': 'Description',
    'layerInfo': {'capabilities': 'Query'},
    'targetSR': {'wkid': 3857}
}
published_item = new_item.publish(
    publish_parameters=publish_properties,
    file_type='filegeodatabase',
    overwrite=True
)

Update/overwrite code:

fs_id = 'TheGUID'
content_zipped = r'C:\ZippedUpFileGDB.zip'
agol_item = gis.content.get(fs_id)
agol_flc = FeatureLayerCollection.fromitem(agol_item)
results = agol_flc.manager.overwrite(content_zipped)

I know I can rename the AGOL layers like this...

fs_id = 'TheGUID'
agol_item = gis.content.get(fs_id)
fs_layers = agol_item.layers
lyr_renames = {'fc_name': 'fc_alias'}
for layer in fs_layers:
    current_name = layer.properties.name
    if current_name in lyr_renames:
        update_name = {
            'name': lyr_renames[current_name]
        }
        response = layer.manager.update_definition(update_name)

...but that breaks the update/overwrite.
Posted 01-03-2023 05:45 PM
POST
'''
arcgis_helper.py
'''
from arcgis.gis import GIS


class AGOAccountInfo:
    url = None
    username = None
    password = None


class ArcGISManager:
    # instance defaults
    account_info = None
    gis = None
    feature_layer = None
    test_run = False

    def __init__(self, ago_account_info, test_run=False):
        self.account_info = ago_account_info
        self.gis = GIS(ago_account_info.url, ago_account_info.username, ago_account_info.password)
        self.test_run = test_run

    def set_feature_layer(self, feature_service_id, sublayer_index=None):
        if sublayer_index is None:
            sublayer_index = 0
        self.feature_layer = self._get_feature_layer(feature_service_id, sublayer_index)

    def _get_feature_layer(self, feature_service_id, sublayer_index):
        feature_layer = None
        feature_service = self.gis.content.get(feature_service_id)
        if feature_service is not None and len(feature_service.layers) > sublayer_index:
            feature_layer = feature_service.layers[sublayer_index]
        if feature_layer is None:
            raise Exception('The layer with id {} could not be found.'.format(feature_service_id))
        return feature_layer

    def get_existing_features(self, where):
        query_results = self.feature_layer.query(where=where)
        features = query_results.features
        return features

    def get_feature_service_layers(self, feature_service_id):
        '''
        Gets the set of feature layers of a feature service. If the
        feature service does not exist or no layers are present,
        raises an exception.
        '''
        feature_layers = None
        feature_service = self.gis.content.get(feature_service_id)
        if feature_service and len(feature_service.layers) > 0:
            feature_layers = feature_service.layers
        elif feature_service and len(feature_service.layers) == 0:
            raise ValueError(f'The feature service with id {feature_service_id} has no layers.')
        elif feature_service is None:
            raise ValueError(f'The feature service with id {feature_service_id} could not be found.')
        return feature_layers

    def update_feature_layer_definition(self, update_data):
        '''
        Updates the layer definition. See the Esri doc for the various
        input data structures that can be used:
        https://developers.arcgis.com/rest/services-reference/online/update-definition-feature-layer-.htm
        Error results structure:
        {
            "error": {
                "code": 400,
                "message": "",
                "details": [
                    "Unable to update feature service layer definition."
                ]
            }
        }
        '''
        results = self.feature_layer.manager.update_definition(update_data)
        if 'success' in results:
            return True
        err_code = results['error']['code']
        err_msg = results['error']['message']
        err_details = results['error']['details']
        err_string = f'Code: {err_code} Message: {err_msg} Details: {str(err_details)}'
        raise RuntimeError(err_string)

    def get_attribute_type(self, check_field):
        '''
        Gets the data type associated with the given field.
        Returns None if the field is not found.
        '''
        field_type = None
        for field in self.feature_layer.properties.fields:
            if field.name == check_field:
                field_type = field.type
                break
        return field_type

    def get_layer_attribute_names(self):
        '''
        Gets the names of the current layer's attributes.
        '''
        layer_attributes = []
        for field in self.feature_layer.properties.fields:
            layer_attributes.append(field.name)
        return layer_attributes

    def get_domain_name(self, check_field):
        '''
        Gets the domain associated with the given field.
        Returns None if the field is not found or if there is no
        domain for the field.
        '''
        domain_name = None
        for field in self.feature_layer.properties.fields:
            if field.name == check_field and field.domain:
                domain_name = field.domain.name
                break
        return domain_name

    def get_domain_code_description(self, check_field, domain_code):
        '''
        Gets the description associated with the given field and domain code.
        '''
        descrip = None
        for field in self.feature_layer.properties.fields:
            if field.name == check_field and field.domain:
                for domain_pair in field.domain.codedValues:
                    if domain_pair.code == domain_code:
                        descrip = domain_pair.name
                        break
                break
        return descrip

    def get_domain_type(self, check_field):
        '''
        Gets the domain type for the given field.
        '''
        domain_type = None
        for field in self.feature_layer.properties.fields:
            if field.name == check_field and field.domain:
                domain_type = field.domain.type
                break
        return domain_type

    def get_domain_codes(self, check_field):
        '''
        Gets a list of the current domain codes for the field.
        '''
        domain_codes = []
        for field in self.feature_layer.properties.fields:
            if field.name == check_field and field.domain:
                for domain_pair in field.domain.codedValues:
                    domain_codes.append(domain_pair['code'])
                break
        return domain_codes

    def get_domain_coded_values(self, check_field):
        '''
        Gets a list of the current domain codedValues for the field.
        codedValues structure:
        {
            "name": "DESCRIPTION",
            "code": "CODE"
        }
        '''
        domain_coded_values = []
        for field in self.feature_layer.properties.fields:
            if field.name == check_field and field.domain:
                domain_coded_values = field.domain.codedValues
                break
        return domain_coded_values

    def append_deprecated_domain_codes(self, check_field, new_codes):
        '''
        Compares a codedValue list against the existing domain
        codedValue list. Retains any codes that have been dropped but
        are still used in the field, marking them as deprecated.
        Returns an updated codedValues list.
        Structure for new_codes and updated_codes:
        [
            {
                "name": "Description of code",
                "code": "Domain code"
            }
        ]
        '''
        updated_codes = new_codes[:]
        update_code_list = [x['code'] for x in new_codes]
        current_codes_used = self.get_attribute_unique_values(check_field)
        for code in current_codes_used:
            if code not in update_code_list:
                current_name = self.get_domain_code_description(check_field, code)
                if current_name:
                    new_name = '_'.join(['DEPRECATED', current_name])
                    updated_codes.append(
                        {
                            'name': new_name,
                            'code': code
                        }
                    )
                else:
                    raise ValueError('Domain description missing')
        return updated_codes

    def create_domain_update_dictionary(self, check_field, new_coded_values):
        '''
        Creates the dictionary to be used with the update_definition
        method of the ArcGIS layer.manager.
        '''
        update_dictionary = {'fields': []}
        domain_name = self.get_domain_name(check_field)
        domain_type = self.get_domain_type(check_field)
        update_dictionary['fields'].append(
            {
                'name': check_field,
                'domain': {
                    'name': domain_name,
                    'type': domain_type,
                    'codedValues': new_coded_values
                }
            }
        )
        return update_dictionary

    def get_attribute_unique_values(self, attribute):
        '''
        Gets a list of the unique values used in the field
        of the current feature layer.
        '''
        current_values = []
        query = '1=1'
        features = self.get_existing_features(query)
        for feature in features:
            # Checks that the attribute is present and has a value other than null/None
            if attribute in feature.attributes and feature.attributes[attribute]:
                current_values.append(feature.attributes[attribute])
        # Reduce to unique values
        if current_values:
            current_values = list(set(current_values))
        return current_values
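The deprecation-merge step in append_deprecated_domain_codes above can be exercised without a live feature layer. Here is a standalone sketch of the same logic, with the two layer lookups passed in as plain data (an assumption for illustration; the class method queries the layer itself):

```python
def merge_deprecated(new_codes, codes_in_use, current_descriptions):
    """Retain still-used codes that were dropped from the new list,
    renaming them with a DEPRECATED_ prefix. Mirrors the
    append_deprecated_domain_codes method, but takes the in-use codes
    and current code->description mapping as arguments."""
    updated = new_codes[:]
    keep = {entry['code'] for entry in new_codes}
    for code in codes_in_use:
        if code not in keep:
            name = current_descriptions.get(code)
            if name is None:
                raise ValueError('Domain description missing')
            updated.append({'name': 'DEPRECATED_' + name, 'code': code})
    return updated
```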
Posted 09-28-2022 09:05 AM
POST
Came across this thread a month or so ago while trying to answer the same question about updating domains from a list. It has been 19 months since the question, but I thought the attached code might be helpful for future searchers. The two modules are a mashup from a broader solution; see the comments in the code for the config and changes needed. This is set up to update multiple domains across multiple feature layers. The input is an Excel file, with one sheet per field/domain.

'''
domain_update.py
Imports from an Excel file. Each sheet in the file needs to be named
to match the field name containing the domain. Each sheet has two
columns: Code and Description. This is the list of code/name pairs
you want the domain to reflect.
'''
import os
import json
import logging
import pandas as pd
from arcgis_helper import ArcGISManager, AGOAccountInfo

#<<<<<<<<<<<<<<< Configuration Required >>>>>>>>>>>>>>>
DOMAIN_LOAD_EXCEL_FILE = r'C:\path\to\your\excel.xlsx'
AGOL_URL = r'https://yoururl.arcgis.com/'
FS_ID = 'ITEM_ID_FOR_FEATURE_SERVICE'
#<<<<<<<<<<<<<<< End of config >>>>>>>>>>>>>>>


class BaseDataMapper:
    def __init__(self, logger, arcgis_manager=None):
        self.logger = logger
        self.arcgis_manager = arcgis_manager
        self._layer_configs = {}

    def set_arcgis_manager(self, arcgis_manager):
        self.arcgis_manager = arcgis_manager


class DataMapper(BaseDataMapper):
    def __init__(self, logger, arcgis_manager=None):
        super().__init__(logger, arcgis_manager)

    def update_domains(self, update_data):
        '''
        Updates domains for the layers in the indicated feature service.
        JSON format for update_data:
        {
            "solutionID": "SOLUTION_FEATURE_SERVICE_ID",
            "domainValues": [
                {
                    "field": "ATTRIBUTE_NAME",
                    "codedValues": [
                        {
                            "name": "TEXT DESCRIPTION",
                            "code": "CODE"
                        }
                    ]
                }
            ]
        }
        '''
        self.logger.info('Decomposing incoming JSON')
        updates = json.loads(update_data)
        solution_id = updates['solutionID']
        new_domain_values = updates['domainValues'][:]
        self.logger.info('Getting layers from AGOL Comm Solution')
        fs_layers = self.arcgis_manager.get_feature_service_layers(solution_id)
        for lyr_indx, layer in enumerate(fs_layers):
            self.arcgis_manager.set_feature_layer(solution_id, lyr_indx)
            lyr_attributes = self.arcgis_manager.get_layer_attribute_names()
            for domain_update in new_domain_values:
                field_name = domain_update['field']
                if field_name in lyr_attributes:
                    msg = f'Attempting to update {field_name} domain for layer {layer.properties.name}'
                    self.logger.info(msg)
                    self.logger.info('Vetting incoming domain code type')
                    new_coded_values = self._check_domain_code_types(field_name, domain_update['codedValues'])
                    self.logger.info('Checking for deprecated values that need to be retained')
                    current_values = self.arcgis_manager.get_attribute_unique_values(field_name)
                    if current_values:
                        new_coded_values = self.arcgis_manager.append_deprecated_domain_codes(field_name, new_coded_values)
                    self.logger.info('Verifying there is an "Unknown" value present in the domain codes')
                    new_coded_values = self._verify_unknown_code_present(field_name, new_coded_values)
                    self.logger.info('Updating the field definition with the new domain codes')
                    new_update_dict = self.arcgis_manager.create_domain_update_dictionary(field_name, new_coded_values)
                    try:
                        self.arcgis_manager.update_feature_layer_definition(new_update_dict)
                    except RuntimeError as update_err:
                        err_msg = f'Failed to update {field_name} domain for layer {layer.properties.name}. Error: {update_err}'
                        self.logger.error(err_msg)
                    else:
                        msg = f'{field_name} domain updated for layer {layer.properties.name}'
                        self.logger.info(msg)
                else:
                    msg = f'Field domain {field_name} skipped for layer {layer.properties.name}. Field not in layer.'
                    self.logger.warning(msg)
        return

    def _check_domain_code_types(self, check_field, domain_values):
        '''
        Checks the incoming code values and matches the type to the
        related field. Returns an updated list as needed.
        '''
        field_type = self.arcgis_manager.get_attribute_type(check_field)
        updated_domain_values = []
        for coded_values in domain_values:
            if field_type == 'esriFieldTypeString':
                code_set = {
                    'name': coded_values['name'],
                    'code': str(coded_values['code'])
                }
            elif field_type in ['esriFieldTypeSmallInteger', 'esriFieldTypeInteger']:
                code_set = {
                    'name': coded_values['name'],
                    'code': int(coded_values['code'])
                }
            else:
                raise TypeError('Error in type for incoming domain data')
            updated_domain_values.append(code_set)
        return updated_domain_values

    def _verify_unknown_code_present(self, check_field, domain_codes):
        '''
        Verifies there is an 'Unknown' entry present in the domain codes.
        Inserts one if missing and returns the verified code list.
        '''
        domain_descriptions = [domain['name'] for domain in domain_codes]
        validated_domain_list = domain_codes
        if 'Unknown' not in domain_descriptions:
            field_type = self.arcgis_manager.get_attribute_type(check_field)
            if field_type == 'esriFieldTypeString':
                unknown_code = {"name": "Unknown", "code": "UNK"}
            else:
                unknown_code = {"name": "Unknown", "code": 0}
            validated_domain_list.insert(0, unknown_code)
        return validated_domain_list


def _create_domain_load(field_name, domain_data: pd.DataFrame):
    '''
    Takes the data from a dataframe and puts it into the JSON
    structure needed for the domain processing.
    '''
    domain_load = {
        'field': field_name,
        'codedValues': []
    }
    for row in domain_data.itertuples():
        code_set = {
            'name': row.Description,
            'code': row.Code
        }
        domain_load['codedValues'].append(code_set)
    return domain_load


def main():
    '''Setup and load from Excel'''
    # TODO: Roll your own on the logger
    logger = setup_your_logger()
    data_mapper = DataMapper(logger)
    # TODO: Need to create a method to input your UID/PW
    credentials = get_username_password()
    ago_account_info = AGOAccountInfo()
    ago_account_info.url = AGOL_URL
    ago_account_info.username = credentials[0]
    ago_account_info.password = credentials[1]
    arcgis_manager = ArcGISManager(ago_account_info, False)
    data_mapper.set_arcgis_manager(arcgis_manager)
    file_to_upload = DOMAIN_LOAD_EXCEL_FILE
    data_load = {
        "solutionID": FS_ID,
        "domainValues": []
    }
    logger.info('Getting the data from the manual upload template')
    if os.path.exists(file_to_upload):
        df_uploads = pd.read_excel(file_to_upload, sheet_name=None, usecols=['Code', 'Description'])
        for df_name, df_data in df_uploads.items():
            logger.info('Assembling JSON for %s field', df_name)
            domain_update = _create_domain_load(df_name, df_data)
            data_load["domainValues"].append(domain_update)
        logger.info('Starting domain upload process')
        data_mapper.update_domains(json.dumps(data_load))
    else:
        logger.critical('Excel template not found. Unable to process manual domain updates.')


if __name__ == '__main__':
    main()
    print('End of manual domain load script')
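To make the Excel-to-JSON step concrete, here is a tiny sample of what the _create_domain_load step produces for one sheet. The rows are shown as plain named tuples instead of a pandas DataFrame (an assumption so the sketch stands alone; DataFrame.itertuples yields the same Code/Description attributes):

```python
from collections import namedtuple

# Stand-in for one row of an Excel sheet with Code and Description columns.
Row = namedtuple('Row', ['Code', 'Description'])

def create_domain_load(field_name, rows):
    # Same output shape as _create_domain_load above, minus the DataFrame.
    domain_load = {'field': field_name, 'codedValues': []}
    for row in rows:
        domain_load['codedValues'].append({'name': row.Description, 'code': row.Code})
    return domain_load
```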
Posted 09-28-2022 09:04 AM
POST
Additional info for others that may be searching for this like I was. The RegisterWithGeodatabase tool now throws 'ERROR 160328: Cannot create a table with a duplicate column' if the GDB_GEOMATTR_DATA column exists in the table you are trying to register. In a Python script, I am using try...except arcpy.ExecuteError to trap for this: if ERROR 160328 is raised, the script deletes the GDB_GEOMATTR_DATA column and reruns RegisterWithGeodatabase(). Registration will recreate that column, so do not be spooked if you see it show up again. arcpy 2.9, MS SQL 14, Pro v2.9.1.
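The trap-and-retry flow described above can be sketched like this. Since arcpy is not importable outside of Pro, a stand-in ExecuteError class is used, and the register and drop_column callables are placeholders for whatever wraps arcpy.management.RegisterWithGeodatabase and arcpy.management.DeleteField in the real script (a minimal sketch under those assumptions, not the poster's actual code):

```python
class ExecuteError(Exception):
    """Stand-in for arcpy.ExecuteError, used here so the sketch runs
    without arcpy installed."""

def register_with_retry(register, drop_column, table):
    """Try to register the table; if ERROR 160328 is raised, drop the
    leftover GDB_GEOMATTR_DATA column and retry once."""
    try:
        return register(table)
    except ExecuteError as err:
        if '160328' not in str(err):
            raise  # some other geoprocessing failure; re-raise as-is
        drop_column(table, 'GDB_GEOMATTR_DATA')
        return register(table)
```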
Posted 02-03-2022 12:27 PM
Title | Kudos | Posted
---|---|---
| 1 | 04-13-2023 07:46 AM
| 1 | 08-17-2021 02:37 PM
| 1 | 10-18-2023 08:12 AM
| 2 | 09-28-2022 09:05 AM
| 2 | 09-28-2022 09:04 AM