POST
@davedoesgis wrote: Also open to other strategies, so fire away! thanks!

Does that willingness to be "open to other strategies" also extend to seriously paying for a high-quality, high-resolution PDF or TIFF/PNG file fully prepared for you, or does that openness stop right there? You are asking a lot here at a 36x24 feet final print size, but fortunately, I potentially have a lot to offer you.

I ask because I have been working on an unreleased personal multi-year project to develop a sophisticated "ArcGIS Renderer for OpenStreetMap". This has honestly been a gargantuan project and challenge, but I think the results are pretty astounding by now, and far surpass any of the existing well-known map services like the ones from ESRI, Mapbox, Google or Bing in terms of cartography and detail (especially at true topographic scales of >= 1:50k). I presented a poster of this work at last year's "State of the Map" OpenStreetMap conference in Florence, Italy. A screenshot of that poster is attached to this post. If you want to see the poster at its true size and in full glory, you can download it from the SOTM 2022 poster gallery here: https://2022.stateofthemap.org/posters/ It is "Poster 22 ArcGIS Renderer for OpenStreeMap". Click on the (unfortunately garbled) poor-quality preview to open the full PDF from the gallery and save it locally if you want.

About your project: 36x24 feet gives me a nice 1:500k scale map in UTM zone 14N centered on the middle of the US (lower 48 states). That would give the detail you can see in the other two attached images. I have actually developed two styles: one loosely based on the existing OpenStreetMap "Default" map style (but with significant enhancements), and a true topographic style similar to those of some of the world's best-known national mapping agencies. These styles do not match ESRI's "Topographic" style; they are my own custom ArcGIS Pro styles.
Also, the actual style format is my own proprietary design, not a Mapbox vector style derivative; it uses Pro layer files (*.lyrx) as part of the file format. It is a massive amount of data for the United States at the desired scale, but fortunately the Renderer employs sophisticated generalization to handle this volume of data without the typical crude "minimum area" filtering and Douglas-Peucker generalization causing random, ugly losses in e.g. forest polygons, as seen in many vector tile maps at small scales.

As you can see from the poster, I currently run a 1.5 TB spatial database with the entire planet's worth of OpenStreetMap data in true spatial vector format, so exporting the lower 48 states in the topographic style is no problem. You do have to realize, though, that the current map content of the US in OpenStreetMap, although significant, is not at the level of some European countries, and also doesn't perfectly match the content of ESRI's "Topographic" map, which is likely enhanced with data from Natural Earth and USGS or other local government agencies. The data for landuse, natural features like mountain ranges, and hydrography in particular is patchy and incomplete for the US in OpenStreetMap, which will influence the final map image. That said, the final map will still be impressive at the scale and size you desire, with the entire major road network of the US, the cities and the hillshade blended in.

If you want to review the attached images, I recommend downloading them. Unfortunately, the "Document Preview" option here on the ESRI forums seems to forcefully re-scale the images, causing blurry, low-quality previews, and doesn't allow you to set an exact 100% viewing scale.
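As a quick sanity check of that scale claim (my own back-of-the-envelope arithmetic, not part of the original thread; the Python below is purely illustrative):

```python
# Back-of-the-envelope check: what ground extent does a 36 x 24 ft
# sheet cover at a 1:500,000 print scale?
FT_TO_M = 0.3048  # exact international foot in metres

def ground_extent_km(paper_ft, scale=500_000):
    # paper dimension (feet) -> ground distance covered (kilometres)
    return paper_ft * FT_TO_M * scale / 1000

print(round(ground_extent_km(36)))  # → 5486 (ground width in km)
print(round(ground_extent_km(24)))  # → 3658 (ground height in km)
```

which comfortably spans the roughly 4,500 km east-west extent of the lower 48 states.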
Posted 05-14-2023 02:25 AM
POST
Hi Bern, Side note, but is ESRI actually planning to add color to the faces and roofs of buildings? I have noticed that ESRI's interpretation of the OpenStreetMap Simple 3D Buildings tagging specification is actually quite good, as witnessed by, for example, the below rendering of the Berlin cathedral in ArcGIS Pro based on the ESRI Scene Layer service, which shows proper interpretation of things like roof shapes and directions. However, it appears any color tagging is ignored. I think it would be nice if this were added; see the second example from F4Maps with the correct coloring of the same building based on the OpenStreetMap color tags.
Posted 04-24-2023 10:16 AM
POST
I have now tested this with a Pro 3.1 map document containing >50 label classes with clearly defined label priorities. Reversing the order of the classes in the "Label Priority Ranking" dialog does not fix my labeling issues related to the "Remove ambiguous labels" = "Remove all" setting, so I don't think it is actually true that reversing the order of the label classes in the dialog fixes things. More likely, you had an accidental combination of Maplex settings in which this seemed to work, but it is not a general solution for what, if true, would constitute a potential Pro bug.
Posted 03-10-2023 12:22 PM
POST
That's not weird behavior; it is a bug, if true. If I may ask, was this Pro 3.1, which has a new Maplex setting "Remove ambiguous labels" that defaults to "Remove all"? I have had issues with this new default setting and needed to switch to "Do not remove" or "Remove within same label class" to get proper behavior similar to Pro 3.0. I reported the issues with the "Remove all" default to ESRI as a potential bug in Pro 3.1, which is still under investigation. I now wonder whether this issue is related, and whether the "Remove all" problems I had were actually caused by what you observed: the interpretation of the priority list may be inverted for that specific setting, and likely not for the other two, as I got proper results there.
Posted 03-10-2023 11:39 AM
POST
That is essentially the same problem: an uninitialized variable. I have again updated the code (see the code sample in the previous post); it will now set the 'jurisleft' and 'jurisright' variables to the empty string "" if no jurisdiction is found. You may consider replacing that with null by changing the variable initialization to:
var jurisleft = null
var jurisright = null
Who actually developed this code? Is it part of an ESRI solution, or was it developed in-house? These are really basic errors that shouldn't be in released and properly tested code.
Posted 02-26-2023 12:56 AM
POST
I have updated the code. I noticed that "isLeftValue" and "isRightValue" were only initialized inside an 'if' block that may never be entered, depending on the 'if' condition. Moving the variable initialization outside the 'if' should work. I have also hoisted the 'jurisleft' / 'jurisright' initialization to the top, so the earlier assignments are no longer clobbered by a late re-declaration, and negated the test on the final loop so it matches its comment ("if either value is NOT set"):

// This rule calculates the left and right municipality for a road.
// It determines whether the road is completely within an intersectingArea or falls
// on the edge of an intersectingArea, and updates the appropriate values on the road.

// Define the Road Centerline fields
var jurisleft_field = "L_JURISDIC";
var jurisright_field = "R_JURISDIC";
// Define the Geopolitical Areas field
var jurisname_field = "JURISD";
// Initialize the result values up front so they are never undefined
var jurisleft = "";
var jurisright = "";
// Get the intersecting Geopolitical Areas
var intersectingAreas = Intersects(FeatureSetByName($datastore, "PublicSafetyGIS_Data.PUBLICSAFETYGIS.Test_Jurisdictions", [jurisname_field], true), $feature);

// Convert a polygon geometry to a polyline
function polygonToPolyline(p) {
    var json = Dictionary(Text(p));
    var polylineJSON = {
        "paths": json["rings"],
        "spatialReference": json["spatialReference"]
    };
    return Polyline(polylineJSON);
}

// Test if the road falls completely within an area without overlapping any of the
// area's outline. If it does, set both the left and right value to the area's value.
var isWithin = false;
var partialOverlap = [];
for (var intersectingArea in intersectingAreas) {
    if (Within($feature, intersectingArea)) {
        var line = polygonToPolyline(Geometry(intersectingArea));
        if (!Overlaps($feature, line)) {
            jurisleft = intersectingArea[jurisname_field];
            jurisright = intersectingArea[jurisname_field];
            isWithin = true;
            break;
        } else {
            // Store any areas whose boundary the road partially overlaps
            Push(partialOverlap, intersectingArea);
        }
    }
}

// If the road does not fall within an area, attempt to find any area outlines it
// lies on. Then test whether the polygon is on the right or left side of the line
// and update the right or left value accordingly.
var isRightValue = false;
var isLeftValue = false;
if (!isWithin) {
    for (var intersectingArea in intersectingAreas) {
        var line = polygonToPolyline(Geometry(intersectingArea));
        if (Within($feature, line)) {
            // Offset the geometry to the right and test if it intersects the intersectingArea
            var offset_geometry = Offset($feature, 5);
            if (Intersects(offset_geometry, intersectingArea)) {
                jurisright = intersectingArea[jurisname_field];
                isRightValue = true;
            }
            // Offset the geometry to the left and test if it intersects the intersectingArea
            offset_geometry = Offset($feature, -5);
            if (Intersects(offset_geometry, intersectingArea)) {
                jurisleft = intersectingArea[jurisname_field];
                isLeftValue = true;
            }
        }
    }
}

// If either the left or right value is still not set, loop through the partially
// overlapping areas
if (!isWithin && (!isLeftValue || !isRightValue)) {
    for (var i in partialOverlap) {
        var intersectingArea = partialOverlap[i];
        var line = polygonToPolyline(Geometry(intersectingArea));
        // Get the portion of the road that overlaps the area's outline
        var intersection_geometry = Intersection($feature, line);
        // Offset this portion of the road to the right and test if it intersects the intersectingArea
        var offset_geometry = Offset(intersection_geometry, 5);
        if (Intersects(offset_geometry, intersectingArea) && !isRightValue) {
            jurisright = intersectingArea[jurisname_field];
        }
        // Offset this portion of the road to the left and test if it intersects the intersectingArea
        offset_geometry = Offset(intersection_geometry, -5);
        if (Intersects(offset_geometry, intersectingArea) && !isLeftValue) {
            jurisleft = intersectingArea[jurisname_field];
        }
    }
}

return {
    "result": {
        "attributes": Dictionary(
            jurisleft_field, jurisleft,
            jurisright_field, jurisright
        )
    }
}
Posted 02-25-2023 01:59 PM
POST
Before the upgrade to Pro 3.1, were you using the default 'arcgispro-py3' Python environment, or a customized clone created with the ArcGIS Pro "Package Manager"? If the latter, you may need to update the cloned Python environment for the latest version. Pro will even automatically switch back to the default 'arcgispro-py3' environment if the clone is not up to date and display a warning, so you may now be missing vital packages that you had installed in the clone. The "Environment Manager" that is part of the "Package Manager" shows this with a special icon next to the environment, offering the option to upgrade it, which worked well in my personal case. Only after upgrading will you be able to switch back to your customized cloned environment and activate it for use in your workflows and scripts.
Posted 02-24-2023 01:39 PM
POST
Well, exactly as you say: you create a geodesic buffer (the only truly accurate buffer) and compare the result with a plain circle whose radius is calculated in what is likely a high-distortion projection like Web Mercator. This is the classic "Greenland-is-as-big-as-Africa" Web Mercator problem, when in reality Greenland is far smaller than the African continent. In such situations, at high latitude, the "plain circle" will be much smaller than the geodesic buffer and severely under-represent the real size of the buffer (which, again, can only be properly represented by the geodesic buffer). If you need accurate buffers in a high-distortion projection like Web Mercator, use only geodesic buffers. If you have a low-distortion local projection (e.g. data within one UTM zone), you can get away with planar buffer calculation, and it will be relatively accurate (though still not as good as geodesic).
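To put a number on that distortion (my own illustration, not from this thread): Web Mercator's local scale factor grows as 1/cos(latitude), so a circle drawn with its nominal radius in projected metres covers far less ground at high latitudes:

```python
import math

def ground_radius_m(planar_radius_m, latitude_deg):
    # In Web Mercator, one projected metre corresponds to roughly
    # cos(latitude) metres on the ground (spherical approximation).
    return planar_radius_m * math.cos(math.radians(latitude_deg))

# A "1000 m" planar circle at a Greenland-like latitude of 70 N spans
# only about 342 m on the ground, badly under-representing a true
# 1000 m geodesic buffer; at the equator there is no such distortion.
print(round(ground_radius_m(1000, 70)))  # → 342
print(round(ground_radius_m(1000, 0)))   # → 1000
```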
Posted 02-20-2023 11:29 PM
IDEA
I think your best bet to accomplish this is to convert the dynamically placed Maplex labels to annotation using the "Convert Labels To Annotation" geoprocessing tool in ArcGIS Pro, as this will give you a geodatabase table with data about the labels, which should include their text. I know of no other good way to get the data of the displayed labels; e.g. there is no "listVisibleLabels"-like property on something such as a Layer object in ArcPy.
Posted 01-28-2023 09:41 AM
POST
I have now managed to work around this performance issue of "Apply Symbology From Layer" with Query Layers. I had been contemplating options for a significant amount of time, but overlooked a relatively straightforward solution that works for my specific case: temporarily rename the original table in the database just before running the "Apply Symbology From Layer" tool, create a database view of this renamed table containing only a single record under the original table's name (I use a selection with MAX on the objectid column in the SQL), and then run the "Apply Symbology From Layer" tool against the layer, which now references the single-record database view. This is very fast and makes the tool's performance independent of the size of the original underlying table. After running the tool, I delete the view and rename the table back to its original name. Fortunately, a Query Layer already in the TOC doesn't seem to be affected by temporarily knocking out its data source, so this rather drastic measure doesn't seem to have negative side effects. It would still be nice, though, if the original issue were fixed.
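The rename-and-view shuffle can be sketched in plain SQL. The snippet below demonstrates the idea against an in-memory SQLite database via Python's stdlib purely for illustration (my actual setup is PostgreSQL, and the 'roads' table and its columns are made-up stand-ins):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Stand-in for the large original table that makes the tool slow
cur.execute("CREATE TABLE roads (objectid INTEGER PRIMARY KEY, name TEXT)")
cur.executemany("INSERT INTO roads (name) VALUES (?)", [("a",), ("b",), ("c",)])

# 1. Temporarily rename the original table out of the way
cur.execute("ALTER TABLE roads RENAME TO roads_tmp")

# 2. Create a single-record view under the original table's name,
#    selecting just the row with MAX(objectid)
cur.execute("""
    CREATE VIEW roads AS
    SELECT * FROM roads_tmp
    WHERE objectid = (SELECT MAX(objectid) FROM roads_tmp)
""")

# The layer's data source name still resolves, but now yields one row,
# so a tool scanning "roads" finishes almost instantly
rows = cur.execute("SELECT * FROM roads").fetchall()
print(len(rows))  # → 1

# 3. Afterwards, drop the view and restore the original name
cur.execute("DROP VIEW roads")
cur.execute("ALTER TABLE roads_tmp RENAME TO roads")
```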
Posted 09-29-2022 01:30 PM
POST
@ZihanSong Is there any news to share about development around this tool? This tool is absolutely vital for a custom-written geoprocessing toolbox of mine that extensively uses Python scripting. Unfortunately, I have seen deplorable performance with this tool when using ArcGIS Query Layers on ultra-large datasets (>100M records!) stored in an ordinary PostgreSQL / PostGIS database (not enterprise-geodatabase enabled!). E.g. I have seen it take >48 hours(!) to update symbology for a single Query Layer using this tool, which totally wrecks the performance of my tools. I have absolutely no idea why this tool is so slow in these situations (the underlying database and table are fully indexed), other than a hunch that it is doing some very inefficient full table scan involving terribly slow cursors and inappropriate cursor handling for this size and type of dataset. Very unfortunately, despite extensively reviewing arcpy options, I haven't found any realistic alternative for my custom tool and workflow. E.g. due to the specific and highly dynamic processing flow, I absolutely cannot use something like a script tool output parameter's "Symbology" option, because my tool generates a large number of dynamically created layers, not a fixed set of output parameter feature layers that could have symbology set via the tool parameter's "Symbology" option.

There is also a large difference in performance between different types of input layers for the tool. E.g., using a 390M-record dataset:
- It takes 2 minutes to run "Apply Symbology From Layer" when the data source of the feature layer in Pro is a SQLite-based ESRI "Mobile Geodatabase" feature class.
- It takes about 3 hours when the feature layer in Pro is a Query Layer whose SQL statement references a dedicated table stored in PostgreSQL / PostGIS containing only the records used in the Query Layer, so the SQL is therefore of the type "SELECT * FROM <TABLE_NAME>" without a WHERE clause.
- It takes >48 hours when the feature layer in Pro is a Query Layer whose SQL statement references a generic table stored in PostgreSQL / PostGIS containing the records used in the Query Layer but also others, so the SQL is therefore of the type "SELECT * FROM <TABLE_NAME> WHERE x=y", i.e. including a WHERE clause.

In all of the above cases, the data is actually the same (buildings from OpenStreetMap), just stored differently. As said, I don't understand why this tool needs to take so much time. It just needs to swap some symbols in the layer's symbology (note that I am explicitly using the 'update_symbology="MAINTAIN"' option in my arcpy code), and I would expect this to be near-instantaneous.
Posted 09-24-2022 02:58 AM
POST
I have been maintaining a good old "ArcMap"-style *.tbx toolbox plus Python scripts compatible with ArcMap and ArcGIS Pro 1.x, 2.x and 3.x for the past five years, since Pro 1.x came out. I adapted the toolbox to be compatible with Python 2.x and 3.x, and thus with ArcMap and ArcGIS Pro, according to the instructions ESRI provided in the Pro Help, using e.g. the "from __future__ import" Python 2.x/3.x compatibility functions. Overall, I've found this quite manageable, especially using the ArcPy arcpy.GetInstallInfo() function to get the product name and version, so as to distinguish them in code and adapt the code flow to the capabilities of each version of each product. So if you're reasonably well versed in Python and arcpy, I wouldn't worry too much about it. And regarding the specific upgrade from Pro 2.x to 3.x: this was essentially without issues; tools just run, at least for my code base. The previous 1.x to 2.x upgrade did present some issues.
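As an illustrative sketch of that branching pattern (arcpy.GetInstallInfo() and its 'ProductName'/'Version' keys are real, but the pick_code_path helper and its return values are hypothetical, purely my own):

```python
from __future__ import print_function, division  # Python 2/3 source compatibility

def pick_code_path(install_info):
    # install_info mimics the dictionary returned by arcpy.GetInstallInfo();
    # in real code you would pass arcpy.GetInstallInfo() directly.
    product = install_info.get("ProductName", "")
    major = int(install_info.get("Version", "0").split(".")[0])
    if product == "ArcGISPro" and major >= 3:
        return "pro3"    # branch using Pro 3.x-only capabilities
    elif product == "ArcGISPro":
        return "pro12"   # Pro 1.x / 2.x branch
    return "arcmap"      # ArcMap / Python 2.x fallback branch

print(pick_code_path({"ProductName": "ArcGISPro", "Version": "3.1"}))  # → pro3
print(pick_code_path({"ProductName": "Desktop", "Version": "10.8"}))   # → arcmap
```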
Posted 07-12-2022 02:49 PM
POST
Note that you don't have to install the full .NET 6 SDK unless you intend to do development with ESRI's Pro SDKs. Pro 3.0 runs fine with just the .NET 6 Desktop Runtime: https://dotnet.microsoft.com/en-us/download/dotnet/6.0
Posted 06-29-2022 02:46 AM
POST
Hi Tierney, Somehow my brain managed to miss those red major road errors through an optical illusion ;-)... I probably got distracted by the major difference in the display of the white minor roads. One last suggestion: increase the "Vector resolution" (in DPI) that you can set upon exporting a layout. I have seen some weird issues, e.g. entire layers missing upon export to PDF, when I had low values set for vector resolution. This was especially a problem with ArcMap, though, which uses a rather different display engine than Pro, so I don't know whether it will make a difference in Pro. Of course, since the display is fine in Adobe but fails in print, this ultimately seems to be more of an Adobe or printer driver issue, not so much related to ArcGIS itself.
Posted 03-09-2022 01:38 AM
POST
If you are referring to the very thin appearance of the white roads in the print version of your PDF map, then I am almost certain this is not an issue in ArcGIS, but a mis-configuration of Adobe Reader's settings. Adobe, in its eternal wisdom, decided to make the "Enhance thin lines" option, which "thickens" thin lines upon display in Adobe Reader, default to ON. This means that thin lines show up wider than they are in the actual document (and in your ArcGIS settings). Many graphic designers have lamented this choice, as it falsely suggests a width that is not actually set in the document and can ruin a good design. In your case, first disable the "Enhance thin lines" option in Adobe Reader's preferences:
- Go to Edit/Preferences, and under "Page Display", disable "Enhance thin lines".
You will now likely notice the lines display much thinner in Adobe Reader, similar to the print, and as you in fact set them in ArcGIS! To correct that:
- Go back to ArcGIS, set a wider width for your road lines, re-export the document to PDF, and check whether the widths are now satisfactory.
Posted 03-04-2022 02:09 PM