POST
Just noticed Line 8 should be: with requests.get(replica_url, stream=True, timeout=30) as f:
09-15-2020 10:41 AM | 0 | 1 | 1334

POST
Naari, have you tried the Python requests module to stream the download?

import requests
from pathlib import Path

replica_url = "https://services9.arcgis.com/iERBXXD4hiy1L6en/arcgis/rest/services/Example/FeatureServer/replicaFiles/my_replica.zip"
save_dir = Path("C:/backup_utility/test.zip")
download_size = requests.get(replica_url, stream=True).headers['Content-Length']
with requests.get(replica_url, stream=True, timeout=30) as f:
    with open(save_dir, 'wb') as save:
        for chunk in f.iter_content(chunk_size=1024*1024):
            save.write(chunk)
size_on_disk = save_dir.stat().st_size
print(f"{size_on_disk} of {download_size} bytes downloaded")
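One caveat with the snippet above: requests returns the Content-Length header as a string, while st_size is an integer, so cast before comparing the two. A minimal sketch with toy values standing in for the real header and file:

```python
# Toy values standing in for the Content-Length header and Path.stat()
# results in the snippet above (hypothetical byte counts).
download_size = "1048576"   # headers arrive as strings
size_on_disk = 1048576      # st_size is an int

# Cast before comparing to confirm the download completed.
complete = size_on_disk == int(download_size)
print(complete)  # True when the byte counts match
```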
09-14-2020 04:36 PM | 0 | 3 | 1334

BLOG
Hi Jeanne, At a minimum, you'd have to remove the date query because it uses the datetime module, which is not part of the default ArcGIS Notebooks library. I don't think you can install additional Python modules to the Notebooks environment. However, you can probably do without the date query - it's only there to cull the list of inspections so the script doesn't have to comb through every inspection ever, just the ones from the past day. Besides that, I'd recommend just giving it a try on some non-production services. I'd love to hear how it works out!
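For example, the datetime-based filter in the script could be swapped for a pass-through where clause (a sketch, assuming the rest of the script is unchanged; "1=1" simply matches every record, and the drop_duplicates step still picks the most recent record per asset):

```python
# Original, datetime-dependent version:
# one_day = datetime.today() - timedelta(days=1)
# string_day = one_day.strftime('%Y-%m-%d %H:%M:%S')
# where_query = f"DateInspected >= DATE '{string_day}'"

# Notebooks-friendly version: match all records, no datetime needed.
where_query = "1=1"
print(where_query)
```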
09-14-2020 03:25 PM | 0 | 0 | 6751

POST
Yep. Nevertheless, your solution still solves my particular use case and is much appreciated.
09-07-2020 12:40 PM | 0 | 0 | 5462

POST
Joshua, I was saying that when you update any setting (either on the Settings tab or programmatically), this date is updated. So if you enable/disable sync, enable/disable editing, enable/disable editor tracking, enable/disable extract, etc., it updates that date.
09-07-2020 10:44 AM | 0 | 2 | 5462

POST
I'll just add that this date is apparently also updated any time a setting/capability is updated. Sync, extract, editing, etc. Which is unfortunate - it would be nice to have access to a date that always reflected the last time the data itself changed. I doubt that's possible without querying features on an editor tracking-enabled service.
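For reference, the REST metadata for a hosted feature service exposes an editingInfo.lastEditDate property reported in epoch milliseconds (readable via, e.g., FeatureLayer(url).properties.editingInfo.lastEditDate in the ArcGIS API for Python), which appears to be the value behind this date. A sketch of converting such a value, using a hypothetical timestamp rather than a live service:

```python
from datetime import datetime, timezone

# Hypothetical lastEditDate value; the service reports epoch milliseconds.
last_edit_ms = 1599433273000

# Convert to an aware UTC datetime for display.
last_edit = datetime.fromtimestamp(last_edit_ms / 1000, tz=timezone.utc)
print(last_edit.isoformat())
```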
09-06-2020 10:46 PM | 1 | 4 | 5462

POST
Excellent, that does exactly what I was looking for. Thanks Joshua.
09-06-2020 10:36 PM | 0 | 0 | 5462

POST
Hey Joshua, unfortunately this only gives the Modified date, i.e., the date the item details (tags, title, etc.) were changed, not the date the data was last changed. The screenshots below are all for the same item: the modified/updated date, the Data Last Updated date, and the script output, which returns the modified/updated date rather than the Data Last Updated date.
09-05-2020 03:41 PM | 1 | 8 | 5462

POST
Is there any way to programmatically access the 'Data Last Updated' value on the ArcGIS Online Item Details page, e.g., through the Python API? If not, are there any plans to make it available?
09-05-2020 01:13 PM | 0 | 11 | 6165

BLOG
I've had a longstanding need to visualize, query, and filter features using values from a related table, and came up with the following solution, which works great for my use case.

Note: You can also use joined hosted feature layer views to symbolize by related records, but these have some limitations, as listed in the documentation. If a joined view works for your use case, it's an easier option.

The use case involves managing stormwater catch basin inspection and cleaning (I've simplified the workflow for purposes of this post). The customer wanted field workers to be able to open Collector and quickly see basins that need attention or haven't been inspected/cleaned in over a year, and to provide that same information in a dashboard. It's easy to set up:

1. Add fields to the feature layer to hold the attributes to bring over from the related table
2. Grab the most recent record from the related table, and write values from that record over to the feature layer using the ArcGIS API for Python
3. Put the script on PythonAnywhere and set it to run every 60 seconds
4. Configure web map symbology with a simple Arcade expression to show expired and failed inspections

Details on each step above:

1. Self-explanatory. I named the fields "Status" and "LastCleaning".

2. I wrote the script below to grab records from the related table from the past day, sort them by time, and drop duplicate records related to an asset (in case there were two inspection/cleaning records within the past 24 hours -- for example, a failed inspection on Wednesday afternoon that was resolved on Thursday morning), then use a unique identifier ('FacilityID') to update the asset with data from the most recent inspection/cleaning.

from arcgis import GIS
from arcgis.features import FeatureLayer, SpatialDataFrame
import pandas as pd
from datetime import datetime, timedelta
import time

gis = GIS("https://someorg.maps.arcgis.com", 'someuser', 'somepass')

def update_basins():
    one_day = datetime.today() - timedelta(days=1)
    string_day = one_day.strftime('%Y-%m-%d %H:%M:%S')
    where_query = f"DateInspected >= DATE '{string_day}'"

    # get catch basin features
    catch_basins = gis.content.get('21343f6579b74cf212576e5614db8866')
    catch_basins_lyr = catch_basins.layers[0]
    catch_basins_sdf = SpatialDataFrame.from_layer(catch_basins_lyr)
    catch_basins_fset = catch_basins_lyr.query()
    catch_basins_features = catch_basins_fset.features

    # get cleaning records from the past day
    cleanings_url = 'https://services9.arcgis.com/iERASXD4kaw1L6en/arcgis/rest/services/this_is_an_example/FeatureServer/1'
    cleanings_lyr = FeatureLayer(cleanings_url)
    cleanings_sdf = SpatialDataFrame.from_layer(cleanings_lyr)
    cleanings_fset = cleanings_lyr.query(where=where_query, out_fields='DateInspected, FacilityID, Status')
    cleanings_features = cleanings_fset.features

    # sort by cleaning date and drop all but the most recent record per asset
    cleanings_df = cleanings_sdf.sort_values('DateInspected', ascending=False)
    cleanings_df = cleanings_df.drop_duplicates(subset='FacilityID')

    # find overlapping rows between catch basins and cleanings
    overlap_rows = pd.merge(left=catch_basins_sdf, right=cleanings_df, how='inner', on='FacilityID')

    def update(basins, cleanings):
        for facility_id in overlap_rows['FacilityID']:
            try:
                basin_feature = [f for f in basins if f.attributes['FacilityID'] == facility_id][0]
                cleaning_feature = [c for c in cleanings if c.attributes['FacilityID'] == facility_id][0]
                basin_feature.attributes['LastCleaning'] = cleaning_feature.attributes['DateInspected']
                basin_feature.attributes['Status'] = cleaning_feature.attributes['Status']
                catch_basins_lyr.edit_features(updates=[basin_feature])
                print(f"Updated {facility_id} status to {basin_feature.attributes['Status']}", flush=True)
            except Exception as e:
                print(f"Could not update {facility_id}. Exception: {e}")
                continue

    update(catch_basins_features, cleanings_features)

while True:
    update_basins()
    time.sleep(60)

3. Set up an "Always-On" task on PythonAnywhere to continually run the script. This is a very easy process: just set up a PythonAnywhere account (the free tier would probably be fine for this application), upload your script file, and add the script on the Tasks tab as an Always-On Task. Now the script writes the most recent inspection/cleaning record to the catch basins attribute table every 60 seconds.

4. And lastly, a simple Arcade expression to symbolize by the status of each basin (Current, Expired, or Needs Attention):

var present = Now()
var last_cleaning = $feature.LastCleaning
var cleaning_age = DateDiff(present, last_cleaning, 'years')

if (cleaning_age < 1 && $feature.Status == 'CleaningComplete') {
    return "Current"
} else if ($feature.Status == 'NeedsAttention') {
    return "Needs Attention"
} else if (cleaning_age > 1) {
    return "Expired"
} else if (IsEmpty($feature.Status)) {
    return "Record missing or incomplete"
}

I hope this is helpful to someone. Feel free to offer suggestions or ask questions.
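The sort-and-deduplicate step in the script can be illustrated with plain pandas on toy data (hypothetical IDs and dates): sorting newest-first means drop_duplicates keeps only the most recent record per asset.

```python
import pandas as pd

# Toy inspection records: basin CB-1 failed in the morning and was
# resolved in the afternoon of the same day.
records = pd.DataFrame({
    'FacilityID': ['CB-1', 'CB-2', 'CB-1'],
    'DateInspected': pd.to_datetime(
        ['2020-03-25 09:00', '2020-03-25 10:00', '2020-03-25 14:00']),
    'Status': ['NeedsAttention', 'CleaningComplete', 'CleaningComplete'],
})

# Sort newest-first, then keep the first (most recent) row per asset.
latest = (records.sort_values('DateInspected', ascending=False)
                 .drop_duplicates(subset='FacilityID'))
print(latest.set_index('FacilityID')['Status'].to_dict())
# {'CB-1': 'CleaningComplete', 'CB-2': 'CleaningComplete'}
```

CB-1's earlier NeedsAttention record is dropped, so the basin symbolizes as current again.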
03-25-2020 10:31 AM | 10 | 12 | 10551

POST
Rickey, de-clustering resolved the problem. Thanks for prodding me to check that. Unfortunately I need clustering, so it looks like I'll have to sacrifice printing capability. Thanks again.
02-18-2020 07:38 AM | 2 | 0 | 2370

POST
Hi David, I am using Esri's default print service. I edited my post to make this clearer. Thanks.
02-17-2020 10:42 AM | 0 | 0 | 7505

POST
Thanks Rickey - as I mentioned in the original post, MAP_ONLY is working. In any case, I need my users to be able to print a map without having to know a workaround or be limited to MAP_ONLY. This will be public-facing.
02-17-2020 09:56 AM | 0 | 4 | 7506

POST
Thanks Rickey...I forgot I had removed it. I added it back just now.
02-17-2020 09:12 AM | 0 | 1 | 7506

POST
Hi, I built a simple Web AppBuilder application in ArcGIS Online (with all data coming from hosted feature services), but I'm getting strange behavior from Esri's default print service. When I try to use any layout other than MAP_ONLY, it does nothing and instantly returns an "Error, try again" warning as shown below. When I hover over the error, a tooltip says "Cannot read property 'renderer' of undefined". Oddly, MAP_ONLY works fine. Also odd: I checked several other apps I've previously built, and the print service works fine on those. I've tried rebuilding the app from scratch to no avail, and tried a different browser. Any insight or workaround is appreciated. Below is a screenshot of the app as well as the developer console in Chrome.
02-17-2020 08:26 AM | 1 | 14 | 10582