POST
I wrote this script in an attempt to reduce the NaN routes on gapped local roads for our RH implementation: pydev106/CalibrateRouteParts.py at master · KDOTGIS/pydev106 · GitHub. It basically end-dates about 100,000 calibration points and appends about 140,000 new ones. I ran it from ArcGIS Desktop 10.6.1 against the default version in an edit session; it's a versioned SQL Server geodatabase. Processing began when I saved edits... a few days ago. The delta tables have counts of roughly D ~100K and A ~260K, and it has been processing for days. There are probably about 70,000 miles / 45,000 routes being recalibrated. The status message in ArcGIS Desktop switches between "timeslicing and recalibrating routes" and "updating events", although there are no events to update in this database right now, just networks.

Was this a bad idea? I'm thinking I should have approached this in batches of a few thousand records at a time, with version management happening between the batches, or maybe tried SQL methods using the versioned view. Even if I let this go for however long it takes, I fear the regenerate routes and compress steps are also going to take forever. I'm going to see where it stands Monday, but how would I even back out of this processing at this point? Restore my database to the state it was in before this started? Should I be second-guessing this approach so much? Is any of this even necessary?
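For what it's worth, the batching approach mentioned above can be sketched generically in plain Python. This is only an illustration: the arcpy edit-session, reconcile, and compress calls are not shown, and every name here is hypothetical.

```python
def batches(records, size):
    """Yield successive chunks of `records`, each at most `size` items."""
    for start in range(0, len(records), size):
        yield records[start:start + size]


def process_in_batches(records, size, apply_batch, manage_version):
    """Hypothetical driver: apply edits a few thousand records at a time,
    with version management (e.g. reconcile/post/compress) between batches."""
    for batch in batches(records, size):
        apply_batch(batch)   # e.g. end-date / append calibration points
        manage_version()     # e.g. reconcile, post, compress
```

The point of the pattern is simply to keep the delta tables small between version-management operations, instead of accumulating hundreds of thousands of adds and deletes in one open edit.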
03-02-2019 11:25 AM

POST
Here is a roundabout way to delete the identical records, keeping the one with the lowest (minimum) ObjectID, demonstrated with my sloppy Python copied from the Results tab:

```python
arcpy.Merge_management(
    inputs="CPNewMeasEndEventsY;CPNewMeasEndEventsX;CPNewMeasEventsY;CPNewMeasEventsX",
    output="C:/temp/routeparts.gdb/CalibrationPoints_MultipartLocals")
arcpy.FindIdentical_management(
    in_dataset="CalibrationPoints_MultipartLocals",
    out_dataset="C:/temp/routeparts.gdb/cp_identicals",
    fields="RID;MEAS;POINT_X;POINT_Y;Measure",
    xy_tolerance="", z_tolerance="0",
    output_record_option="ONLY_DUPLICATES")
arcpy.Statistics_analysis(
    in_table="cp_identicals",
    out_table="C:/temp/routeparts.gdb/cp_identicals_to_delete",
    statistics_fields="IN_FID MIN",
    case_field="FEAT_SEQ")
arcpy.AddJoin_management(
    in_layer_or_view="CalibrationPoints_MultipartLocals",
    in_field="OBJECTID",
    join_table="cp_identicals",
    join_field="IN_FID",
    join_type="KEEP_COMMON")
arcpy.SelectLayerByAttribute_management(
    in_layer_or_view="CalibrationPoints_MultipartLocals",
    selection_type="NEW_SELECTION",
    where_clause="1=1")
arcpy.AddJoin_management(
    in_layer_or_view="CalibrationPoints_MultipartLocals",
    in_field="cp_identicals.FEAT_SEQ",
    join_table="cp_identicals_to_delete",
    join_field="FEAT_SEQ",
    join_type="KEEP_COMMON")
arcpy.SelectLayerByAttribute_management(
    in_layer_or_view="CalibrationPoints_MultipartLocals",
    selection_type="REMOVE_FROM_SELECTION",
    where_clause="cp_identicals_to_delete.MIN_IN_FID = cp_identicals.IN_FID")
arcpy.RemoveJoin_management(
    in_layer_or_view="CalibrationPoints_MultipartLocals",
    join_name="cp_identicals_to_delete")
arcpy.RemoveJoin_management(
    in_layer_or_view="CalibrationPoints_MultipartLocals",
    join_name="cp_identicals")
arcpy.DeleteFeatures_management(in_features="CalibrationPoints_MultipartLocals")
```
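For reference, the keep-the-minimum-ObjectID logic those joins implement can be expressed directly in plain Python. The field names follow the post; the row data is hypothetical.

```python
from collections import defaultdict

def ids_to_delete(rows, key_fields, id_field="OBJECTID"):
    """Return IDs of duplicate rows, keeping the minimum ID in each
    group of rows that are identical on `key_fields`."""
    groups = defaultdict(list)
    for row in rows:
        groups[tuple(row[f] for f in key_fields)].append(row[id_field])
    doomed = []
    for ids in groups.values():
        if len(ids) > 1:
            keep = min(ids)                       # survivor: lowest ObjectID
            doomed.extend(i for i in ids if i != keep)
    return sorted(doomed)
```

With a cursor this could drive the deletes directly; the join-based approach above achieves the same selection without leaving the geoprocessing tools.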
02-26-2019 02:39 PM

POST
I had similar problems after OS upgrades on some servers. Part of the problem for me was that the tools were being run as the admin user, which made it difficult to troubleshoot running something as that user.
02-22-2019 09:03 AM

DOC
More editors added. I think we have at least one from each state on the list, plus Arizona and Oklahoma now.
02-22-2019 08:22 AM

DOC
Hi Nicole, I added you and some others as editors of this document.
02-21-2019 11:56 AM

POST
I'm just suspicious because the tools are called by different names in ArcMap vs. Python, and I have fallen victim to calling tools from the wrong library in an Esri product after a software update before.
02-21-2019 06:39 AM

POST
I think I made a mistake; that is the path to the locref library I meant to specify. So, same question, with this as the correct path. Sounds like at least Scott and I are on the same page here. Kyle
02-21-2019 06:33 AM

POST
I'm developing some Python scripts using the location referencing tools. I found I can include the location referencing tools if I add Program Files (x86)\ArcGIS\LocationReferencing\Desktop10.6\Bin to my Python path and import functions from the file locref.py. Is this the correct way to use the location referencing tools in Python? The functions are named a little differently in this file than they are when you run them from the toolbox, so I just want to be sure I've got the right tools. For example, if I import this: from locref import GenerateRoutes, and use GenerateRoutes in a Python script, am I correct to assume this is the exact same thing as running the Generate Routes toolbox tool, which results in a call to arcpy.GenerateRoutes_locref? I can't import GenerateRoutes_locref in my Python script, unless it's hiding in another Python lib other than the one I found. Kyle
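The import pattern being described looks something like the sketch below. The path comes from the post; the ImportError fallback is only there so the snippet degrades gracefully on a machine without the Location Referencing extension installed.

```python
import sys

# Location Referencing bin directory from an ArcGIS Desktop 10.6 install
# (path as given in the post)
LOCREF_BIN = r"C:\Program Files (x86)\ArcGIS\LocationReferencing\Desktop10.6\Bin"

if LOCREF_BIN not in sys.path:
    sys.path.append(LOCREF_BIN)

try:
    from locref import GenerateRoutes  # toolbox tool: Generate Routes
except ImportError:
    GenerateRoutes = None  # not on a machine with the LRS extension
```

Whether importing from locref.py is truly equivalent to invoking the toolbox tool is exactly the open question in the post, so treat this as the pattern under discussion, not a confirmed supported API.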
02-21-2019 06:11 AM

BLOG
PyDev for Eclipse has been my preferred Python development environment for many years now; I can't say it enough. Getting it set up can be a bit of an experience. It's not just a simple Windows executable and boom, go: there are lots of choices and potential ingredients, but with that comes support for many flavors, textures, and pythonic experiences to support GIS development and automation. Here's my recipe for getting started.

First, let's grab the latest Eclipse version. Lots of options here; it's like a Cheesecake Factory menu. I'll go with the Eclipse IDE for JavaScript and Web Developers: it's not a huge file, it has the Git client, and maybe I'll look at some JavaScript files there someday. Mostly I'll use PyDev though.

Oh hey, look, we need a JVM. I heard something about Java recently, what was it? Can't remember; I think it had something to do with Oracle. Probably no big deal. I'll just go with this OpenJDK. Think you would be safe if this link here will download a huge zip file? It will probably be fine, just look at the link. You might go with an Oracle or IBM option if that's more in your wheelhouse.

First unzip the Java files, then unzip the Eclipse files, right in my Downloads directory. I'll go to the Eclipse folder that was just unzipped and try to open the Eclipse application. OK, it worked. That wasn't supposed to happen; I was supposed to get a Java runtime error the first time. Maybe I did it wrong. I probably already had a compatible Java Standard Edition runtime environment 1.8 installed in C:\Program Files\Java along with this new VM; this never happens. I'm sure someone will hit the error if you follow this; what you will need to do is install the JRE there. Now that it's working for me, I will go back to the Eclipse folder and pin the application shortcut to my Windows taskbar. I will open to the default workspace for now.

In Eclipse, go to the Help menu and select "Install New Software". Add PyDev: http://www.pydev.org/updates. I'll order one PyDev and take the Mylyn integration on the side, and of course I will read ALL of the license terms and review them with my IT boards or directors and legal counsel before I choose to accept. Something about accepting the authenticity certificate that comes with PyDev, an Eclipse restart, and I am almost ready to go.

In Eclipse again, go to Window, Perspective, Open Perspective, Other, then choose PyDev. Close the welcome screen, and this is my home territory. In the Package Explorer, right-click to create a new project, select the PyDev/PyDev Project wizard, and make up a project name like PydevTest106. I will start with my 2.7 interpreter for a project I need to do in ArcGIS 10.6.1; later I will come back and set up the Pro Python 3 as another interpreter Python path. I need to click here to configure my interpreter. I'll do a manual config and browse for my Python environment at C:\Python27\ArcGIS10.6. I changed the name and added the 27 just to be extra clear which path and interpreter I'm using. Click OK, Apply, and/or/then Apply and Close. Now I'm good with this for now; let's just finish and get started already.

Create a new Python module in the Package Explorer window, now under the PydevTest106 environment. I'll start with an empty shell Python script. Can I import modules from arcpy libraries? I can, with autocomplete. That wasn't supposed to happen; I was supposed to have to go back to the environment under Window/Preferences/PyDev/Interpreters/Python Interpreter, in the Libraries tab, and add the ArcGIS libraries there. Oh good, it crashed. It must be working. I want to show my console under Window/Show View/Console just to prove it's working. That's all it takes to get started using my favorite Python development environment for medium to large Python projects. Way better than IDLE.
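As a first module, a small smoke test along these lines (hypothetical; written to run under either the 2.7 interpreter configured above or Python 3) confirms which interpreter is wired up and whether arcpy is importable:

```python
import sys

def interpreter_report():
    """Return a short status string describing the active interpreter
    and whether arcpy can be imported from it."""
    status = "python %d.%d" % sys.version_info[:2]
    try:
        import arcpy  # only present in an ArcGIS Python environment
        status += ", arcpy available"
    except ImportError:
        status += ", arcpy not available"
    return status

if __name__ == "__main__":
    print(interpreter_report())
```

Running it from the PyDev console is a quick way to verify the interpreter configuration before starting real work.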
02-18-2019 11:08 PM

POST
I love this question! So what is best for GIS: SQL Server or PostgreSQL (or Oracle)? It depends. It's almost a question akin to "what is your favorite flavor?" I think the answer depends a lot on the nature of your agency, your IT support, your security needs, and to some degree database administrator bias. My DBAs have traditionally supported Oracle and SQL Server; they have invested significant training and developed expertise in those platforms. Recently a project required them to deal with PostgreSQL. This does not make them happy, but it might open a door for a more serious discussion about this in the future. I use PostgreSQL on my desktop for development and local processing and I love it for that (I use pgAdmin III for admin), but I wouldn't run my enterprise on it at this point, because I would rather rely on my DBA team than try to go my own way.

So here are some more of my opinions. PostgreSQL with PostGIS has more native spatial functionality than MS SQL Server, and rivals Oracle's functionality. If you want to do a lot of tabular and spatial queries and use advanced native SQL functions like linear referencing, then PostgreSQL with PostGIS is great. VACUUM is a nice tool with PostgreSQL. I think more spatial SQL expertise exists in the Oracle community, and the transition for an analyst with Oracle Spatial SQL experience to PostgreSQL empowers that rare developer more than going to Microsoft SQL Server does. The geodatabase containers in PostgreSQL are a little more like SQL Server, at the database level, as opposed to Oracle, which is at the schema level. Updating a geodatabase in Oracle is a chore compared to SQL Server or PostgreSQL geodatabase updates, but you didn't ask about Oracle, so I digress... anyway, that is about the same between MSSQL and PostgreSQL, so for me each of those options scores a point against Oracle there.

The advantage to me of using MSSQL in my agency is the support and expertise I get from IT involved in hosting, backing up, configuring, and maintaining databases in a supported environment, not to mention from Microsoft and Esri. From what I've seen, Esri has a great relationship with the PostgreSQL developer community; some of the Esri GDB team are contributors to the FOSS project. PostgreSQL is enterprise ready (even Gartner says so), and PostgreSQL implementations are an increasing trend. However, unless you pay someone for support (like we do Microsoft and Oracle), who do you have to email or call, and who will call you back right away, when you have a support issue? There are companies that I'm sure will provide that service for a cost; just something to think about that can be important to an enterprise.
02-14-2019 10:24 AM

DOC
From version 1 to version 2, I deleted the row with the KDOT enhancement request about XML importing and exporting. XML for transfer works fine. Modeling event behaviors and LRS events in XML using Sparx EA (the real goal of the enhancement) could be worked around by populating the LRS event behavior table post-import any number of ways.
02-13-2019 02:06 PM

POST
I need to update that on the list. I believe XML will work IF you export all the data with the XML. If you were to export the XML without data, you would need to somehow carefully tend to the data in those LRS tables in conjunction with loading and appending data. My favorite approach (assuming enterprise to enterprise) is to have my DBAs back up and restore the whole database from environment to environment, then fiddle with user security at the target environment a little. That method should also give you some peace of mind that you can successfully back up and restore one of these databases, and your team will learn how to quickly and appropriately check and adjust the security settings when restoring from a backup. Other than that, I'd always follow what Amit says is best.
02-13-2019 06:31 AM

POST
I've thought this over for a minute, and I think I would try exporting to CSV from Pro, then importing to Access. Going multiuser with personal geodatabases via linked tables in other Access databases can be done if you contend with locks, don't mind occasional lost edits or crashes (situations where a versioning approach would have an edit conflict), and have good file backups, but it would be best done through an enterprise platform. In a one-way direction, CSV should work. If I needed cheap and easy multi-user edits, I'd look at ArcGIS Online: something like Survey123 for forms, the Collector app, or a hosted feature service solution, then export that data to CSV routinely for reports or applications built in Access.
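The one-way CSV handoff suggested above is simple to script. Here is a minimal sketch (Python 3 shown; all names hypothetical) of writing rows out so Access, or Excel, can import them:

```python
import csv

def write_rows_to_csv(path, fieldnames, rows):
    """Write an iterable of dict rows to a CSV file with a header row,
    suitable for import into Access or Excel."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```

In practice the rows would come from a table or feature class export out of Pro; scheduling the export keeps the Access side refreshed without any ODBC or linked-table plumbing.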
02-12-2019 06:14 AM

IDEA
I do not think Access is a good, stable database platform for enterprise GIS data, but I upvoted this because it is so convenient just to export data into a personal geodatabase, then open the .mdb file in Access to do crosstab queries and process data into reports, into Excel for charts, or back into the GIS for visualization. Yes, there are workarounds and other ways to do that data processing, but there are times when I want to use Access and Excel because they are so easy anyone can do it. I already use SQL Server, Oracle, and PostgreSQL. I want this personal geodatabase option because I don't want the hassle of code and ODBC drivers and a DSN just to get the data linked into Access, when all Esri has to do is support create, export to, and import feature classes from personal geodatabases in Pro, or all I have to do is open ArcMap or FME and create a personal geodatabase. I suppose I can just convert data to CSV then import that to Access, but there were things people could do with personal geodatabases... and linked tables, and application Access databases, potentially Esri-unsupported things... those were good times ten years ago.
02-12-2019 05:52 AM

POST
I have some code here that does this. This code had to deal with actual permitted routes, so date and time were also a major factor in aggregating the vehicle counts and weights per day. It might seem complicated at first, but the "monthly stats" function is where the joining happens: pydot/KTRIPS_Processor.py at master · KDOTGIS/pydot · GitHub
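The linked script is the real implementation; as a rough illustration of the kind of per-period aggregation it performs (all field names here are hypothetical stand-ins), the core idea is:

```python
from collections import defaultdict

def monthly_stats(trips):
    """Sum vehicle counts and weights per (route, 'YYYY-MM') month.
    `trips` is an iterable of dicts with 'route', 'date' ('YYYY-MM-DD'),
    'count', and 'weight' keys."""
    totals = defaultdict(lambda: {"count": 0, "weight": 0.0})
    for trip in trips:
        key = (trip["route"], trip["date"][:7])  # truncate date to month
        totals[key]["count"] += trip["count"]
        totals[key]["weight"] += trip["weight"]
    return dict(totals)
```

The monthly totals keyed by route can then be joined back to the route layer for reporting or visualization.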
02-05-2019 06:19 AM
| Title | Kudos | Posted |
|---|---|---|
| | 1 | 04-21-2025 06:20 AM |
| | 1 | 02-10-2025 06:22 AM |
| | 1 | 01-23-2025 07:01 AM |
| | 1 | 06-18-2024 03:18 PM |
| | 1 | 09-01-2023 11:54 AM |