POST
You can look at the Environments tab there and check your horizontal and vertical coordinate system references; I think you can even set your desired output reference there, but note that the vertical and horizontal coordinate system references are separate. Also consider that you are reconditioning for ArcHydro, and accurate channel depth doesn't really matter for the hydrologic analysis itself, other than that the channel needs to be deep enough to cut through, say, a road embankment.
Posted 11-22-2021 12:50 PM

POST
Try importing pandas and using df = pd.DataFrame.from_records(data=arcpy.da.SearchCursor(fc, fields), columns=fields), where fc is your GIS feature class layer and fields is a list of fields in fc that you define. Format the data frame and print tables, display tables, or use display(HTML(table.to_html())) for larger outputs. Try using groupby with .sum or .apply and lambda functions for fast results printed in the Jupyter output window.
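A minimal sketch of the pattern above. Since arcpy.da.SearchCursor simply yields one tuple per row, a plain list of tuples stands in for it here so the sketch runs without ArcGIS; the field names and values are made up for illustration.

```python
import pandas as pd

# arcpy.da.SearchCursor(fc, fields) yields one tuple per row; we simulate
# that output here so the sketch runs without arcpy.
fields = ["ROUTE_ID", "SURFACE", "MILES"]
rows = [("R1", "Asphalt", 2.5),
        ("R2", "Concrete", 1.0),
        ("R3", "Asphalt", 4.0)]

# Same call shape as:
# df = pd.DataFrame.from_records(data=arcpy.da.SearchCursor(fc, fields), columns=fields)
df = pd.DataFrame.from_records(data=rows, columns=fields)

# Fast grouped summary printed straight to the Jupyter output window
miles_by_surface = df.groupby("SURFACE")["MILES"].sum()
print(miles_by_surface)
```

With a real feature class, you would pass the cursor object directly as `data=`; pandas consumes it row by row just like the list.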
Posted 11-19-2021 11:28 AM

POST
I burned culverts recently and ran into this. One way around it is to create a polygon the size of your DEM extent with a value of zero. Then merge your culvert buffer polygons (with nonzero cut depth) with that zero polygon and make the cut raster, so that your cut values are zero or the desired cut values rather than null. If there is an easier way to convert null values in the raster to zero, I have not found it.
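In array terms, the zero-background polygon trick is equivalent to forcing NoData cut cells to zero before subtracting. A small NumPy sketch (toy 2x2 grids, NaN standing in for raster NoData) shows why the merge matters:

```python
import numpy as np

# DEM, and a cut raster where cells outside the culvert buffers are NoData (NaN)
dem = np.array([[10.0, 10.0],
                [12.0, 12.0]])
cut = np.array([[np.nan, 2.0],
                [np.nan, 3.0]])

# Without the zero-background polygon, NaN cuts propagate NoData into the result
burned_bad = dem - cut

# The zero-extent-polygon merge is equivalent to treating NoData cuts as zero
cut_filled = np.where(np.isnan(cut), 0.0, cut)
burned = dem - cut_filled   # full-extent burned DEM
print(burned)
```

The same effect in raster algebra would be a conditional that substitutes zero where the cut raster is null, which is exactly what merging in the zero polygon accomplishes before rasterizing.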
Posted 09-27-2021 07:25 AM

POST
I didn't develop the approaches; I only programmed them and tried to document them in the Jupyter notebook. It's just like you said: it's ages, materials, any factors you can identify, then relationships to breaks. Put things in bins or categories and try to figure out what makes sense. That's where the AI approach would be helpful in making it less arbitrary and in examining the sensitivity of the various factors in your system (it's the scientific method: guess and test). It would be interesting to do more comparison between breaks and weather data as well. I have tried looking at emergent hot spots, but the break data isn't dense enough over time to make really useful statistical determinations.
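The binning step can be sketched in a few lines of pandas. The break records, bin edges, and material codes below are hypothetical, just to show the age-bin/material cross-count pattern:

```python
import pandas as pd

# Hypothetical break records: pipe age at failure and material code
breaks = pd.DataFrame({
    "age_years": [12, 35, 60, 41, 55, 8, 72, 38],
    "material":  ["PVC", "CI", "CI", "DI", "CI", "PVC", "CI", "DI"],
})

# Put ages into bins, then count breaks per material/age-bin combination
breaks["age_bin"] = pd.cut(breaks["age_years"],
                           bins=[0, 25, 50, 100],
                           labels=["0-25", "26-50", "51-100"])
counts = breaks.groupby(["material", "age_bin"], observed=True).size()
print(counts)
```

From a table like this you can start asking which material/age combinations break disproportionately often, which is where sensitivity testing of the factors comes in.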
Posted 09-16-2021 02:04 PM

POST
I've updated a couple of existing organization processes for this into Jupyter notebooks, shared here: https://github.com/gontek/CMED/blob/main/AssetPrediction.ipynb https://github.com/gontek/CMED/blob/main/WaterMainAssessment.ipynb I know of other people in the business doing great research into this, utilizing AI to explore deeper into what causes breaks and poor service. You are right that there are many factors to consider, each system is different and unique, and you are very fortunate if you have good break/leak data to work with.
Posted 09-16-2021 11:22 AM

POST
For OSOW, one certain business case would be having the routes as the common operating platform for referencing construction projects and bridge locations.
Posted 04-15-2021 08:31 AM

POST
In the meeting the question about impedances came up. One answer to that question (and it was discussed in the meeting), and the solution this discussion topic proposes, is to close the gap between state DOT linear referencing systems and the transportation and logistics management of non-state-DOT organizations.
Posted 04-14-2021 01:05 PM

POST
Cities and counties could easily calculate the impacts of traffic improvements by comparing before-and-after crash rates along the routes where traffic safety countermeasures are applied.
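The comparison itself is simple arithmetic once crashes are referenced to the same routes. A toy example (all counts and exposure figures hypothetical), normalizing by traffic exposure so before and after periods are comparable:

```python
# Hypothetical crash counts on one route, before and after a countermeasure,
# normalized by exposure (million vehicle-miles traveled, MVMT) to get rates
crashes_before, mvmt_before = 24, 6.0   # three years before installation
crashes_after,  mvmt_after  = 9,  6.0   # three years after installation

rate_before = crashes_before / mvmt_before   # crashes per MVMT
rate_after  = crashes_after / mvmt_after
pct_change = (rate_after - rate_before) / rate_before * 100
print(f"Crash rate change: {pct_change:.1f}%")   # negative = improvement
```

With an LRS, the "along the routes" part is the key: both periods of crash events locate to the same route and measure, so the segments being compared are guaranteed to match.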
Posted 04-14-2021 12:57 PM

POST
Pavement Condition Index and PM2 (pavement performance measures) for MPOs could easily be collected and referenced over time on the same routes for pavement lifecycle analysis and management across DOT and MPO jurisdictions.
Posted 04-14-2021 12:56 PM

POST
It's the latest viral trend: sharing a screenshot and explanation of your Roads and Highways route concurrency rules. Ours is pretty basic: it's the lesser of the route ID parameters, but the top rule is a dominance event ("Route Dominance") that identifies exceptions to that logic, with 19 exceptions designed to prevent double-counting centerline mileage where opposite carriageways would result in a logical primary on each side. I'd like to know how other states define concurrency rules, to gain insight into their practices.
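The rule structure above can be sketched as a default comparison plus an exception lookup. This is only an illustration of the pattern, not our actual configuration; the route IDs and the single exception entry are made up:

```python
# Hypothetical dominance exception table: pair of routes -> dominant route.
# In Roads and Highways this would be the "Route Dominance" event.
DOMINANCE_EXCEPTIONS = {
    frozenset({"I-70 EB", "I-70 WB"}): "I-70 EB",
}

def concurrency_winner(route_a: str, route_b: str) -> str:
    """Return the route that carries the concurrent section."""
    exception = DOMINANCE_EXCEPTIONS.get(frozenset({route_a, route_b}))
    if exception:
        return exception          # dominance event overrides the default rule
    return min(route_a, route_b)  # default: lesser route ID is primary

print(concurrency_winner("US-40", "US-287"))     # default rule applies
print(concurrency_winner("I-70 WB", "I-70 EB"))  # exception applies
```

The exception table is what keeps opposite-carriageway pairs from each claiming primary and double-counting the centerline mileage.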
Posted 02-02-2021 08:33 AM

POST
USE KHUB

SELECT [ActivityType], COUNT([ObjectId]) AS Edits
FROM [RH].[LRS_EDIT_LOG_evw]
WHERE TransactionDate BETWEEN '1/1/2020' AND '1/1/2021'
GROUP BY ActivityType
ORDER BY ActivityType

ActivityType  Edits
1             930
2             649
3             489
4             793
5             433
6             1003
7             534
12            24
Posted 01-07-2021 08:55 AM

POST
Good idea. To do this you will want to track edit changes and ensure your Portal group security works with the change tracking. And yes, protect your Default version and manage LRS locks in Event Editor, with a lock root subversion off Default and conflict prevention enabled in the ALRS settings.

I would configure Event Editor applications, secured and delivered to members assigned to Portal groups, and customize the cartographic content to facilitate the workflow to the maximum extent possible. Also configure Event Editor so the editor has full ability to create a version, post and reconcile, manage locks, and delete a version, all with Event Editor buttons.

What I do is publish the services with a shared user that is only used for publishing services: not a data owner user or an Active Directory user, but a named user in the database with privileges set for editing event data in versions. In a multi-user environment you would definitely want at least one version per user, sometimes more depending on the situation and the number of Event Editor configurations you set up.

It is possible to configure Event Editor using Portal maps that combine read-only services with the ALRS service, but snapping in Event Editor might not work on those non-ALRS services. You can also probably add read-only connections when publishing the service to enable the reference layers to support snapping, maybe.

For our installation it was difficult to get the federated Portal set up exactly how it had to be for ADFS users and conflict prevention to work, but now that we have that figured out I wouldn't want to do it any other way. This approach looks like it will work well to achieve your goal, and mine too, of supporting program areas so they can utilize Event Editor and spread the workload horizontally into the agency.
Posted 09-24-2020 03:59 PM

POST
Not quite like that... are you using Append Events?
Posted 07-28-2020 07:34 AM

POST
Shawn recently developed a geocoder based on the route concurrency table, so that concurrent route segments can be geocoded as intersections. We are also ironing out the process of maintaining GIS reference posts in a hosted feature dataset (facilitating easy Collector/QuickCapture app access) and locating those features to the reference-post-control state LRM routes as events for geocoding and offsetting.

Here are the ingredients we use: ON_ROAD/Route, AT_ROAD/Milepost, OFFSET, DIRECTION. Here is the formula:

1. Geocode the ON/AT roads as an intersection using the Esri geocoder.
2. Obtain the projected XY coordinate.
3. Buffer the point by the offset distance.
4. Intersect the buffer outline with the road centerline segments, producing points.
5. In SQL, query the matching ON road (standardized) against the centerline segment road name. CASE-select the coordinate direction from DIRECTION (i.e., N, E, S, W, NE, etc.) and rank the offset coordinates so that you select the most "east" of the intersected points, or keep the intersection itself if it is the most east.

I'll try to find some actual SQL that does the offsetting. With this method we can locate a few hundred thousand crash records in about a workday.
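The ranking in step 5 can be sketched in a few lines. This is only an illustration of the directional pick, not our production SQL; the candidate coordinates are made up, and the direction keys cover just the four cardinal cases:

```python
# Candidate points where the offset buffer crosses the matching road,
# as (x, y) projected coordinates (hypothetical values)
candidates = [(500100.0, 4200000.0),   # east of the intersection
              (499900.0, 4200000.0),   # west of the intersection
              (500000.0, 4200120.0)]   # north of the intersection

# For each offset direction, rank by the coordinate that grows that way
DIRECTION_KEY = {
    "E": lambda p: p[0],    # most east  = largest x
    "W": lambda p: -p[0],   # most west  = smallest x
    "N": lambda p: p[1],    # most north = largest y
    "S": lambda p: -p[1],   # most south = smallest y
}

def pick_offset_point(points, direction):
    """Keep the candidate farthest in the stated offset direction."""
    return max(points, key=DIRECTION_KEY[direction])

print(pick_offset_point(candidates, "E"))
```

Diagonal directions (NE, SE, etc.) would rank on a combination of both coordinates, which is what the CASE expression in the SQL version handles.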
Posted 06-25-2020 08:41 AM

POST
You end up with a lot of potential options. To update everything, you could end-date everything and start-date new records later than the end date of the last thing. Some of the Esri feature service and app capabilities let you play data through time using this sort of method, so you can see changes in conditions or measurements over time.

It gets complex when you compare multiple "related" historical things: were they always related? What if somebody finds a record from the past that should have been a different value? You can deal with all of this, but you have to organize virtual history and real history, reconcile why things changed over time, and work out how actual changes factor into predictive analytics versus virtual changes.

An example we deal with a lot is crashes at an intersection: the intersection is improved, and then the factors contributing to crashes can change significantly because of real changes. Say we change the name of the highway at the intersection; that doesn't change the actual history, just the relationship of the history to what the name was at the time versus now.
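The end-date/start-date pattern can be sketched as time-sliced records with an "as of" lookup. The route, names, and dates below are hypothetical, mirroring the highway-rename example:

```python
import datetime as dt

# Hypothetical time-sliced attribute records: each row is valid from
# "start" up to (but not including) "end"; end=None means still current.
history = [
    {"route": "R1", "name": "Main St",
     "start": dt.date(2000, 1, 1), "end": dt.date(2015, 6, 1)},
    {"route": "R1", "name": "Veterans Blvd",
     "start": dt.date(2015, 6, 1), "end": None},
]

def as_of(records, when):
    """Return the record that was in effect on a given date."""
    for r in records:
        if r["start"] <= when and (r["end"] is None or when < r["end"]):
            return r

# The rename didn't change history: crashes from 2010 still relate to the
# name in effect at the time, while current maps show the new name.
print(as_of(history, dt.date(2010, 1, 1))["name"])
print(as_of(history, dt.date(2020, 1, 1))["name"])
```

Closing the old record and opening the new one with a later start date is what lets the time-aware apps replay conditions as they stood on any date.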
Posted 06-13-2020 02:19 PM