POST
I had been using ArcGIS Pro 2.8 with this workflow to georeference images and share them with my organization.

Process to georeference plan sheets:
1. Download the plans PDF.
2. Open it in Bluebeam.
3. Extract the single page you want to georeference.
4. Save the page as a TIF.
5. Add that TIF to ArcGIS Pro.
6. The ArcGIS Pro map should be set to use the KRCS Zone 11 county system projection.
7. In ArcGIS Pro, change the appearance to grayscale stretch from RGB classified (if not in color).
8. Give the appearance some transparency, around 33%.
9. ArcGIS Pro - Imagery - Georeference.
10. Zoom to the project location, then rotate, zoom, and fit to display until the lot corners are very close, within about 5 ft.
10.5. Change the layer order so the image is under the plats/parcels.
11. Place a few control points to tie the lot corners down. If it gets weird, remove the TIF and go back to step 5.
12. Save the control points.
13. Save the image in the georeference toolbar.
14. Copy the image to another map for publishing to ArcGIS Online in the Web Mercator projection.
15. Publish the tiled image service: Share - Web Layer - Tile. Folder - plan review; share with the city. Configure - AGS/web maps; cache locally to save credits. Analyze and ignore the Web Mercator warning. Publish at the most detailed tile scale that gives a size of around 4-6 MB.
16. Add the image service to the map in the KRCS projection and remove the TIF.
17. Save the map.

Short version: I was georeferencing in the project coordinate system, then once that image was georeferenced I could publish Web Mercator tiles by copying the georeferenced (saved) image into the Web Mercator projection map.

Now I am using ArcGIS Pro 2.9, and this workflow results in an unknown spatial reference for the georeferenced image unless I temporarily save the georeferenced image as a new image. Has anybody else noticed this difference? It adds a step I would prefer not to add to my workflow. Was this an intended change, or is it a bug with copying a georeferenced file between maps? Even if I import control points to the image, Pro will not recognize the spatial reference when attempting to publish in the Web Mercator projection unless the georeferenced image has been saved as a new image. I think this is a bug; I prefer the previous behavior.
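For reference, the workaround step (saving the georeferenced image as a new image) can at least be scripted. This is only a rough arcpy sketch, not a fix for the 2.9 behavior itself - the paths are hypothetical and the KRCS Zone 11 projection file is a placeholder for whatever projection definition your organization uses:

import arcpy

# Hypothetical paths; the KRCS Zone 11 projection file below is a placeholder -
# substitute your organization's actual .prj file or WKID.
src_tif = r"C:\plans\sheet4_georef.tif"       # TIF saved from the Georeference toolbar
out_tif = r"C:\plans\sheet4_georef_copy.tif"  # new raster with a persisted spatial reference
krcs = arcpy.SpatialReference(r"C:\projections\KRCS_Zone11.prj")

# Export to a new raster dataset so downstream tools see a defined spatial reference
arcpy.management.CopyRaster(src_tif, out_tif)

# If the copy still reports an unknown spatial reference, define it explicitly
if arcpy.Describe(out_tif).spatialReference.name in ("Unknown", ""):
    arcpy.management.DefineProjection(out_tif, krcs)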
03-23-2022 12:24 PM

POST
You can look at the Environments tab there and check your horizontal and vertical coordinate system references; I think you can even change your desired output reference there, but note that the horizontal and vertical coordinate system references are set separately. Also consider that you are reconditioning for Arc Hydro, and accurate channel depth doesn't really matter for the hydrologic analysis, other than it needs to be deep enough to cut through, say, a road embankment.
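As a rough illustration (the WKIDs below are placeholders, not a recommendation - swap in whatever horizontal and vertical systems your project actually uses), setting both references for geoprocessing output looks something like this in arcpy:

import arcpy

# Placeholder WKIDs - substitute the horizontal and vertical systems your project uses
horizontal_wkid = 26915   # e.g., NAD 1983 UTM Zone 15N
vertical_wkid = 5703      # e.g., NAVD88 height

# A SpatialReference can carry both a horizontal and a vertical coordinate system
arcpy.env.outputCoordinateSystem = arcpy.SpatialReference(horizontal_wkid, vertical_wkid)

# Geoprocessing tools run after this point write their output in that reference
print(arcpy.env.outputCoordinateSystem.name)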
11-22-2021 12:50 PM

POST
Try importing pandas and using df = pd.DataFrame.from_records(data=arcpy.da.SearchCursor(fc, fields), columns=fields), where fc is your GIS feature class layer and fields is a list of fields in fc that you define. Format the data frame and print tables, display tables, or use display(HTML(table.to_html())) for larger outputs. Try using groupby with .sum or .apply and lambda functions for fast results printed in the Jupyter output window.
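Put together as a runnable sketch (the feature class path and field names here are made up - substitute your own layer and columns):

import arcpy
import pandas as pd
from IPython.display import display, HTML  # for rendering tables in the notebook

# Hypothetical feature class and fields
fc = r"C:\data\assets.gdb\water_mains"
fields = ["MATERIAL", "INSTALL_YEAR", "LENGTH_FT"]

# Pull rows straight from a SearchCursor into a DataFrame
df = pd.DataFrame.from_records(data=arcpy.da.SearchCursor(fc, fields), columns=fields)

# Quick summary printed in the Jupyter output window
print(df.groupby("MATERIAL")["LENGTH_FT"].sum())

# Render a larger table as HTML in the notebook
display(HTML(df.head(50).to_html()))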
11-19-2021 11:28 AM

POST
I burned culverts recently and ran into this. One way around it is to create a polygon the size of your DEM extent with a value of zero, then merge your culvert buffer polygons (with their nonzero cut depths) with that zero polygon and make the cut raster, so that your cut values are zero or the desired cut values rather than null. If there is an easier way to convert the null values in the raster to zero, I have not found it.
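Here is a rough arcpy sketch of that zero-polygon approach (paths, layer names, and the CutDepth field are hypothetical; it assumes the culvert buffers already carry their nonzero cut depths):

import arcpy

# Hypothetical inputs - substitute your DEM, culvert buffer layer, and cut depth field
dem = arcpy.Raster(r"C:\hydro\dem.tif")
culvert_buffers = r"C:\hydro\hydro.gdb\culvert_buffers"   # polygons with a CutDepth field
extent_poly = "in_memory/dem_extent"
merged = "in_memory/cut_polys"
cut_raster = r"C:\hydro\cut_raster.tif"

# Polygon covering the DEM extent, assigned a cut depth of zero
arcpy.env.outputCoordinateSystem = dem.spatialReference
arcpy.management.CopyFeatures([dem.extent.polygon], extent_poly)
arcpy.management.AddField(extent_poly, "CutDepth", "DOUBLE")
arcpy.management.CalculateField(extent_poly, "CutDepth", "0")

# Merge the zero polygon with the culvert buffers, then rasterize on the DEM grid;
# the priority field lets nonzero cut depths win where buffers overlap the zero polygon
arcpy.management.Merge([culvert_buffers, extent_poly], merged)
arcpy.env.snapRaster = dem
arcpy.conversion.PolygonToRaster(merged, "CutDepth", cut_raster, "CELL_CENTER",
                                 priority_field="CutDepth", cellsize=dem.meanCellWidth)

For what it's worth, the Spatial Analyst expression Con(IsNull(cut), 0, cut) will also turn the nulls in an existing cut raster into zeros, if you would rather stay raster-side.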
09-27-2021 07:25 AM

POST
I didn't develop the approaches; I only programmed them and tried to document them in the Jupyter notebook. It's just like you said: it's ages, materials, any factors you can identify, then relationships to breaks; put things in bins or categories and try to figure out what makes sense. That's where the AI approach would be helpful in making it less arbitrary and examining the sensitivity of various factors in your system (it's the scientific method - guess and test). It would be interesting to do more comparison between breaks and weather data as well. I have tried looking at emergent hot spots, but the break data isn't dense enough over time to make really useful statistical determinations.
09-16-2021 02:04 PM

POST
I've updated a couple of existing organization processes for this into Jupyter notebooks, shared here: https://github.com/gontek/CMED/blob/main/AssetPrediction.ipynb https://github.com/gontek/CMED/blob/main/WaterMainAssessment.ipynb I know of other people in the business doing great research into this stuff, utilizing AI to explore deeper into what causes breaks and poor service. You are right that there are many factors to consider and each system is different and unique, and you are very fortunate if you have good break/leak data to work with.
09-16-2021 11:22 AM

POST
For OSOW, one business case for certain would be having the routes as the common operating platform for referencing construction projects and bridge locations.
04-15-2021 08:31 AM

POST
In the meeting, the question about impedances comes up. One answer to that question (and it was discussed in the meeting), and the solution this discussion topic proposes, is to close the gap between state DOT linear referencing systems and the transportation and logistics management of non-state-DOT organizations.
04-14-2021 01:05 PM

POST
Cities and counties could easily calculate the impacts of traffic improvements by comparing before-and-after crash rates along the routes where traffic safety countermeasures are applied.
04-14-2021 12:57 PM

POST
Pavement Condition Index and PM2 (pavement performance measures) for MPOs could be easily collected and referenced over time on the same routes for pavement lifecycle analysis and management across DOT and MPO jurisdictions.
04-14-2021 12:56 PM

POST
It's the latest viral trend: sharing a screenshot and explanation of your Roads and Highways route concurrency rules. Ours is pretty basic: it's the lesser of the route ID parameters, but the top rule is a dominance event, "Route Dominance", that identifies exceptions to that logic, with 19 exceptions designed to prevent double-counting centerline mileage in locations where the opposite carriageway results in a logical primary on each side. I'd like to know how other states define concurrency rules to gain insight into their practices.
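Purely for illustration (this is a hedged sketch of the logic described above, not our actual Roads and Highways configuration - the route IDs and the exception table are made up), the concurrency decision boils down to something like:

def dominant_route(route_a, route_b, dominance_exceptions):
    # dominance_exceptions maps a frozenset of the two route IDs to the route that
    # should win, covering cases like opposite carriageways where the plain
    # 'lesser route ID' rule would double-count centerline mileage.
    pair = frozenset((route_a, route_b))
    if pair in dominance_exceptions:   # "Route Dominance" event overrides the default
        return dominance_exceptions[pair]
    return min(route_a, route_b)       # default: lesser route ID wins

# Hypothetical example
exceptions = {frozenset(("I070E", "I070W")): "I070W"}
print(dominant_route("I070E", "I070W", exceptions))  # exception applies -> I070W
print(dominant_route("K010", "US056", {}))           # lesser route ID wins -> K010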
02-02-2021 08:33 AM

POST
USE KHUB
SELECT [ActivityType], COUNT([ObjectId]) AS Edits
FROM [RH].[LRS_EDIT_LOG_evw]
WHERE TransactionDate BETWEEN '1/1/2020' AND '1/1/2021'
GROUP BY ActivityType
ORDER BY ActivityType

ActivityType | Edits
1 | 930
2 | 649
3 | 489
4 | 793
5 | 433
6 | 1003
7 | 534
12 | 24
01-07-2021 08:55 AM

POST
Good idea. To do this you will want to track edit changes and ensure your portal group security works with the change tracking, and yes, protect your Default version and manage LRS locks in Event Editor, with a lock root version under Default and conflict prevention enabled in the ALRS settings. I would configure Event Editor applications, secured and delivered to members assigned to portal groups, and customize the cartographic content to facilitate the workflow to the maximum extent possible. Also configure Event Editor so the editor has the full ability to create a version, post and reconcile, manage locks, and delete a version, all with Event Editor buttons.

What I do is publish the services with a shared user that is only used for publishing services - not a data owner user or an Active Directory user, but a named user in the database with privileges set for editing event data in versions. I think in a multi-user environment you would definitely want at least a version per user, sometimes more depending on the situation and the number of Event Editor configurations you set up. It is possible to configure Event Editor using portal maps that combine read-only services with the ALRS service, but snapping in Event Editor might not work on those non-ALRS services. You can also probably add read-only connections when publishing the service to enable the reference layers to support snapping, maybe.

For our installation it was difficult to get the federated portal set up exactly how it had to be for ADFS users and conflict prevention to work, but now that we have that figured out I wouldn't want to do it any other way. It appears this approach is going to work well to achieve your goal, and mine too, of supporting program areas in such a way that they can use Event Editor and spread the workload horizontally into the agency.
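If it helps, here is a minimal arcpy sketch of the "version per editor" part (the connection file path, version names, and user list are hypothetical; the Event Editor configuration and conflict prevention settings still have to be set up separately):

import arcpy

# Hypothetical SDE connection made as the shared publishing user
workspace = r"C:\connections\lrs_publisher.sde"
editors = ["jdoe", "asmith", "bjones"]   # portal group members who edit events

# One named version per editor, branched from the protected Default version
for user in editors:
    arcpy.management.CreateVersion(workspace, "sde.DEFAULT", "EE_" + user, "PROTECTED")

# On a schedule or on demand: reconcile the edit versions against Default and post
arcpy.management.ReconcileVersions(
    workspace,
    "ALL_VERSIONS",
    "sde.DEFAULT",
    acquire_locks="LOCK_ACQUIRED",
    abort_if_conflicts="NO_ABORT",
    conflict_definition="BY_OBJECT",
    with_post="POST",
    out_log=r"C:\logs\reconcile_log.txt",
)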
09-24-2020 03:59 PM

POST
Not quite like that... are you using Append Events?
07-28-2020 07:34 AM

POST
Shawn recently developed a geocoder based on the route concurrency table, so that concurrent route segments can be geocoded as intersections. We are also ironing out the process of maintaining GIS refposts in a hosted feature dataset (facilitating easy Collector/QuickCapture app access) and locating those features to the refpost control state LRM routes as events for geocoding and offsetting.

Here are the ingredients we use: ON_ROAD/Route, AT_ROAD/Milepost, OFFSET, DIRECTION. Here is the formula:
1. Geocode the ON/AT roads as intersections using the Esri geocoder.
2. Obtain the projected XY coordinate.
3. Buffer the point by the offset distance.
4. Intersect the buffer outline with the road centerline segments as points.
5. In SQL, query the matching ON road (standardized) to the centerline segment road name. CASE select coordinate directions from DIRECTION (i.e. N, E, S, W, NE, etc.) and RANK the offset coordinates so that you are selecting the most "east" of the intersected points, or keep the intersection if it is the most east.

I'll try to find some actual SQL that does the offsetting. With this method we can locate a few hundred thousand crash records in about a workday.
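In the meantime, here is a plain-Python sketch of the direction-ranking idea in step 5 (the coordinates and names are made up; it only shows the "pick the most-east candidate" logic, not our actual SQL):

def pick_offset_point(candidates, direction):
    # candidates: (x, y) projected coordinates - the base ON/AT intersection plus
    # the points where the offset buffer ring crosses the matching centerline.
    # direction: compass code from the crash record, e.g. 'N', 'E', 'SW'.
    def score(pt):
        x, y = pt
        s = 0.0
        if "N" in direction: s += y
        if "S" in direction: s -= y
        if "E" in direction: s += x
        if "W" in direction: s -= x
        return s
    return max(candidates, key=score)

# Hypothetical example: base intersection plus two buffer/centerline crossings
base = (1500.0, 2000.0)
crossings = [(1350.0, 2000.0), (1650.0, 2000.0)]
print(pick_offset_point([base] + crossings, "E"))   # easternmost point -> (1650.0, 2000.0)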
06-25-2020 08:41 AM