POST
Sorry for posting so much. I was able to sum the various raster layers using the Raster Calculator. Thank you again, Xander. My question now concerns adding raster layers with weighted fields. One of my layers contains data that is weighted between 0 and 1. Is it possible to add all the other layers normally (i.e. with a value of 1) and add this special layer based on the spatial location of the weighted fields (e.g. if an area with a weight of 0.5 overlaps another area, the result will be 1.5, and so on)? Thank you in advance. Side note: different areas within this single raster have different weights.

Hi Matthew,

If your individual rasters already contain a weight (e.g. values ranging from 0 to 1), you can simply add the rasters together. You may have to force integer rasters to float with "Float(myIntegerRaster)" to avoid the values being truncated. I don't think "Weighted Overlay (Spatial Analyst)" is what you are looking for in that case. The "Weighted Sum (Spatial Analyst)" tool also lets you add a weight per raster, but that is not necessary if your raster values already contain that weight. Weighted overlay and weighted sum are meant for assigning a weight to each individual raster; so if, say, the occurrence of a critical area is more important than the presence of ships, that can be done with those tools. If you want to make things more complex, there is such a thing as "Applying fuzzy logic to overlay rasters". For now I would stick to simply summing the rasters.

Kind regards, Xander
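To see the arithmetic Matthew describes, here is a minimal sketch in plain Python (standing in for the Raster Calculator; the cell values are invented): a 0.5-weighted cell that overlaps one other layer sums to 1.5, and one that overlaps two layers sums to 2.5.

```python
# Two binary presence rasters (1 = feature present) plus one layer
# already weighted between 0 and 1 -- cell values are made up.
layer_a = [[1, 1, 0],
           [0, 1, 0]]
layer_b = [[1, 0, 0],
           [0, 1, 1]]
weighted = [[0.5, 0.0, 0.5],
            [0.0, 0.5, 0.0]]

# Cell-by-cell sum, as the Raster Calculator would do it
total = [[layer_a[r][c] + layer_b[r][c] + weighted[r][c]
          for c in range(3)] for r in range(2)]
```

No per-raster weight is needed here, which is why a plain sum is enough and Weighted Sum is not required.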

Posted 11-13-2013 11:11 PM

POST

Hi Xander, yes, it would take me a lot of time if I had to digitize each water body individually. My LAS data has information on intensity, return number, number of returns, scan direction, color red, color green, color blue, etc. Can you please explain how this information can be helpful and how I can apply it to minimize my error? I am sure there must be breakline data available somewhere, but I don't know whom to contact or where to get it. Thanks, Regards, Sam

Hi Sam,

The reason I ask is that different features behave differently. If you have more information per LAS point, this will help in trying to distinguish between the wetlands and other features. If you have the LAS dataset toolbar in ArcScene, you can symbolize on each of those characteristics, like Class, Return, Intensity or RGB. If you can visually distinguish those features, you may be able to instruct the computer to do so as well. These LAS attributes can be converted to raster format using "LAS Point Statistics As Raster (Data Management)" or "LAS Dataset To Raster (Conversion)". The resulting rasters can be used in some raster calculations, or maybe even in a "Maximum Likelihood Classification (Spatial Analyst)" or a supervised classification (see "An overview of the Image Classification toolbar").

Kind regards, Xander

Posted 11-13-2013 06:55 AM

POST

Hi Sam, I see what you mean. That would be a lot of work. What other kind of information do you have in your LAS file (RGB, intensity, returns, ...)? This might be useful to detect the points that need correction. Otherwise, I think there should be some data source somewhere that could be helpful, right? Kind regards, Xander

Posted 11-13-2013 05:57 AM

POST

Hi Erin,

Do the pipelines with relative heights connect correctly at their start and end points? If not, adding the DEM information will not correct anything. Adding an average height for a line will create these kinds of errors. What you need is to distinguish between the "from" and "to" height (DEM elevation) for each polyline and use "Feature To 3D By Attribute (3D Analyst)". To do this you could follow these steps: create points at the start and end points with the tool "Feature Vertices To Points (Data Management)" using point_location = "BOTH_ENDS". This requires an Advanced (Arc/Info) license. If you don't have Arc/Info, you could use some Python code like this:

import arcpy, os
# featureclasses
polylineFC = r'C:\Path\To\Your\filegeodatabase.gdb\Pipelines' # edit this!
fldID = "NameOfYourUniqueIDfield"
# outputs
outWS = r'C:\Path\To\Your\filegeodatabase.gdb' # output workspace
outFromNodes = 'FromNodes' # output names
outToNodes = 'ToNodes' # output names
# Set local variables
geometry_type = "POINT"
has_m = "ENABLED"
has_z = "ENABLED"
# Use Describe to get a SpatialReference object
spatial_reference = arcpy.Describe(polylineFC).spatialReference
# Execute CreateFeatureclass
arcpy.CreateFeatureclass_management(outWS, outFromNodes, geometry_type, polylineFC, has_m, has_z, spatial_reference)
arcpy.CreateFeatureclass_management(outWS, outToNodes, geometry_type, polylineFC, has_m, has_z, spatial_reference)
# make nodeDict polylines
featuresFrom = []
featuresTo = []
lstIDs = []
curF = arcpy.da.InsertCursor(outWS + os.sep + outFromNodes,("SHAPE@", fldID))
curT = arcpy.da.InsertCursor(outWS + os.sep + outToNodes,("SHAPE@", fldID))
with arcpy.da.SearchCursor(polylineFC, ("SHAPE@", fldID)) as cursor:
    for row in cursor:
        polyline = row[0]
        uniqueID = row[1]
        pntF = polyline.firstPoint
        pntT = polyline.lastPoint
        curF.insertRow((pntF, uniqueID))
        curT.insertRow((pntT, uniqueID))
del curT
del curF

Use the output feature classes to extract the DEM values to the points. Create the fields "FromHeight" and "ToHeight" in the pipes feature class. Join both point feature classes to the pipes feature class based on the unique ID field. Fill the "FromHeight" and "ToHeight" fields with the field calculator. Apply the relative height to obtain the absolute height for the from and to points of the pipes. Now you can use "Feature To 3D By Attribute (3D Analyst)" with "FromHeight" and "ToHeight". If pipes share a from/to point, the height will be the same at that point IF the relative height for the pipes is the same.

Kind regards, Xander
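The "apply the relative height" step is plain addition per end point; a minimal sketch of that arithmetic in Python (made-up numbers, standing in for the field calculator):

```python
# DEM elevation extracted at the from/to nodes, plus the pipe's
# relative height (a depth below the surface would be negative).
dem_from, dem_to = 102.4, 99.1   # meters, from the extracted point values
relative_height = -1.5           # meters relative to the surface

from_height = dem_from + relative_height
to_height = dem_to + relative_height
```

These two results are what would go into the "FromHeight" and "ToHeight" fields before running Feature To 3D By Attribute.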

Posted 11-13-2013 05:47 AM

POST

Hi Samriddhi, There is a Help topic called "Incorporating breaklines with lidar". You will still need to have polygons that outline the waterbodies to correct the error. Kind regards, Xander

Posted 11-13-2013 04:46 AM

POST

Hi James,

I didn't test the code, but something like this should do the trick (see below). The differences are:

- For each folder a new table is created: C:/data/temp.gdb/matchtable plus a unique number.
- I use the ABSOLUTE setting, since I think RELATIVE may not work properly in this situation.
- If the output match table was created, it is added to a list.
- The list is used to merge all the tables into the output table.
- At the end, the intermediate tables are removed (no checks for locks, though).

# Import system modules
import arcpy, os
# Set local variables.
rootFolder = 'c:/work/'
parcelsFc = "C:/data/parcels.gdb/parcels"
matchTbl = "C:/data/temp.gdb/matchtable"
keyField = "AttachmentKeyField"
filterFile = "*property*.jpg"
relPathSetting = "ABSOLUTE" # in this setting I would use ABSOLUTE instead of RELATIVE
lstMatch = []
cnt = 0
for folder in os.walk(rootFolder):
    if folder[0].find('.gdb') == -1:
        # create unique output with counter
        cnt += 1
        matchTblOK = matchTbl + str(cnt)
        arcpy.GenerateAttachmentMatchTable_management(parcelsFc, folder[0], matchTblOK, keyField, filterFile, relPathSetting)
        if arcpy.Exists(matchTblOK):
            lstMatch.append(matchTblOK)
# merge the list of match tables into the final output table
arcpy.Merge_management(lstMatch, matchTbl)
# remove the intermediate tables
for tbl in lstMatch:
    arcpy.Delete_management(tbl)

Kind regards, Xander
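The folder walk itself can be tried without arcpy; a small sketch (throwaway folder names, no geoprocessing) showing how the '.gdb' check skips file geodatabase folders and how the unique table names are built:

```python
import os
import tempfile

# Build a throwaway folder tree: one ordinary subfolder and one .gdb
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'photos'))
os.makedirs(os.path.join(root, 'scratch.gdb'))  # must be skipped

matchTbl = 'matchtable'
lstMatch = []
cnt = 0
for folder in os.walk(root):
    if folder[0].find('.gdb') == -1:  # skip file geodatabase folders
        cnt += 1
        lstMatch.append(matchTbl + str(cnt))

# root itself and 'photos' qualify; 'scratch.gdb' does not
```

In the real script each of these names becomes the output table of one GenerateAttachmentMatchTable call.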

Posted 11-13-2013 04:37 AM

POST

Hello all! I am really new to Python. I had to convert a Matlab script into a Python script in order to be able to add it as a script tool in ArcMap. I am having a hard time figuring out how to get a shapefile into my script. My whole script runs based on the shapefile, so I have no way of checking whether it works unless it reads the shapefile. Thanks in advance!!

If you run the script standalone, you just use the path to the ".shp" file, like this:

import arcpy
inputShp = r'C:\Path\to\some\shapefile.shp'
outputShp = r'C:\Path\to\other\shapefile.shp'
# do something with the shapefile
arcpy.CopyFeatures_management(inputShp, outputShp)

If you added the script to a toolbox and the shapefile is a parameter, define the parameter as a shapefile and read it from the script like this:

import arcpy
inputShp = arcpy.GetParameterAsText(0)
# do something with the shapefile

If this is not what you're looking for, share the code and we can have a look at it. Kind regards, Xander

Posted 11-13-2013 04:16 AM

POST

I have a point shapefile. Some points have a Z value and others don't. I have to determine the missing values somehow. I want to do the following: 0) find a point (1) without a Z value; 1) select two points (2, 3) with Z values on opposite sides of the first point (1); 2) calculate the average Z of points (2, 3); 3) select point (1) and write the calculated average Z into the Z column of point (1). I once had code for this in Avenue (ArcView 3.3), but things are different now. Can you give me some advice?

Hi Franci,

I think it is necessary to provide some more information in order to solve this: how do you want to determine the points on opposite sides of the current point without a Z value? What should happen if one of those points doesn't have a Z value itself? Wouldn't it be more accurate to use a distance-weighted average? What if you can't interpolate between points and have to extrapolate? Do the points represent vertices on a line, or are they randomly distributed? Maybe the easiest way in this case would be to create a surface or TIN based on the known Z values and then use "Add Surface Information (3D Analyst)" to obtain the Z values in the point feature class.

Kind regards, Xander
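Assuming the points are ordered along a line (one of the open questions above), the two-neighbour averaging Franci describes can be sketched in plain Python; the Z values here are hypothetical and None marks a missing value:

```python
# Z values for points ordered along a line; None = missing
z_values = [10.0, None, 14.0, 15.0, None, 21.0]

filled = list(z_values)
for i, z in enumerate(z_values):
    if z is None and 0 < i < len(z_values) - 1:
        left, right = z_values[i - 1], z_values[i + 1]
        if left is not None and right is not None:
            # average of the two neighbours on opposite sides
            filled[i] = (left + right) / 2.0
```

Note the sketch simply leaves a gap unfilled when a neighbour is also missing, which is exactly the edge case raised in the reply.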

Posted 11-13-2013 03:59 AM

POST

Thank you Xander for taking the time to look at this issue. I just want to clarify: the original arcpy cursors, e.g. arcpy.UpdateCursor(), work just fine with annotation. FYI, they are not as fast as the new Data Access cursors, but they are quite a bit faster than using the arcpy.CalculateField_management() tool. I've had the pleasure of timing all of the above :rolleyes: Best regards, Adam

Hi Adam,

That's true. Since this (unwanted) behaviour only applies to the da cursor, I think it's best to report it to Esri Support.

Kind regards, Xander

Posted 11-13-2013 03:45 AM

POST

Hi Jasmine, Have you tried using the "Combine (Spatial Analyst)" tool? Kind regards, Xander

Posted 11-12-2013 10:14 PM

POST

Hello, and thank you in advance for any suggestions. I am attempting to create an output that conveys locations with overlapping features. I have between 4 and 5 layers in polygon and polyline format. I would like to show areas that overlap on a scaled basis (areas that overlap 1-10 times, using graduated symbology). The output may be in either raster or polygon form, but I would like it to convey a kind of "hot-spot" analysis where areas that overlap more often than others are shown with graduated symbology. Thanks.

Hi Matthew,

You can go either way:

Raster (requires Spatial Analyst). You can convert each polygon and polyline layer to a raster, then convert each raster to a 1/0 raster (1 being present, 0 being not present). To do this use the Raster Calculator, or do it directly in the Python window with something like:

import arcpy
from arcpy.sa import *
myRas = 'NameOfRasterInTOC'
myRes = Con(IsNull(myRas), 0, 1)
myRes.save(r'C:\Path\To\Output\Folder\or\filegeodatabase.gdb\RasterName')

When all the occurrence rasters have been created, simply sum them; the cell value will then represent the number of feature classes (lines and polygons) that overlap that pixel. If you want to apply a weight, you could also use something like Weighted Sum.

Vector. With vector data you would have to use Union (repeatedly, if you don't have access to Desktop Advanced). To also be able to include the polylines (and still have polygons as output), it's best to convert them to polygons by applying a small buffer. With each union you will obtain a feature class with the combined polygons. The output feature class will contain a FID_<name> attribute for each of the input feature classes. Use this field to determine whether that feature class was present for a certain feature; a value of -1 indicates that it was not present. Use the field calculator to create occurrence fields based on these FID_<name> fields (-1 should become 0, any other value 1), and sum these fields to obtain the number of features at a certain location. Symbolize on that field.

Kind regards, Xander
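The FID_<name> bookkeeping for the vector route can be sketched in plain Python (hypothetical field names and FID values; in ArcMap this would be done with the field calculator):

```python
# FID_<name> values for one unioned feature: -1 means the input
# feature class was absent at that location
fid_values = {'FID_roads': 3, 'FID_parcels': -1, 'FID_rivers': 0}

# -1 -> 0 (absent), anything else (including 0) -> 1 (present)
occurrence = {fld: (0 if fid == -1 else 1) for fld, fid in fid_values.items()}

# the value to symbolize on
overlap_count = sum(occurrence.values())
```

Note that a FID of 0 is a valid feature ID and must count as present; only -1 means absent.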

Posted 11-12-2013 10:09 PM

POST

Discharge at a location will, of course, vary with rainfall (and time... and catchment characteristics*): no rain = no flow/discharge. Your weight map can be used to represent this rain. The default value is 1, which assumes the amount of 'water' applied at each cell is the same, so the flow is hypothetical. However, if a rainfall event only affected half the catchment, the weighting might give values of 1 for the area where rain fell and 0 for areas where it didn't, meaning flow would only be present in cells below the rainfall extent. (* These characteristics are probably best calculated with other software.)

As Tim already outlined, the weight map can be used to distinguish between rain (1) and no rain (0) to obtain a "discharge". The discharge in that case will be the number of pixels that contribute to it. What I was referring to is that if you knew what amount of rainfall per time unit is contributed by each pixel, then using this information as the weight in a flow accumulation calculation would yield the discharge of the river. This is only theoretical, since it will be quite challenging to derive that kind of information. The attached schema shows some of the aspects involved (source: http://www.bbc.co.uk/scotland/education/int/geog/rivers/drainage/index.shtml).

Kind regards, Xander
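A minimal sketch of the weighted flow accumulation idea, reduced to cells on a single flow path draining left to right (plain Python, made-up weights; real flow accumulation works over a flow-direction grid):

```python
# 1 = rain fell on the cell, 0 = no rain (the weight map)
weights = [1, 1, 0, 0, 1]

# Flow accumulation counts the *upstream* contributions at each cell,
# so a cell's own weight is not included in its accumulation value
accumulation = []
upstream_total = 0
for w in weights:
    accumulation.append(upstream_total)
    upstream_total += w
```

With unit weights this is the usual cell count; with rainfall amounts per time unit as weights, the accumulated value would approximate a discharge, as described above.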

Posted 11-12-2013 09:32 PM

POST

Except for the local variables, the body of this script works fine. When working with one feature class, setting a workspace seems easy, but when setting the workspace for multiple feature classes it becomes more difficult. If anyone can help: a world of thanks, Larry.

import arcpy
import sets
#local Variables:
Asbuilt_Log = "C:/Projects/Asbuilt_Log.gdb"
Rancho = "C:/Projects/North/Rancho.gdb"
Clearlake = "C:/Projects/South/Clearlake.gdb"
#Body of script
set_one = set, int(r[0] for r in arcpy.da.SearchCursor("Asbuilt_Log", "GWO"))
set_two = set, int(r[0] for r in arcpy.da.SearchCursor("Rancho", "GWO"))
set_three = set, int(r[0] for r in arcpy.da.SearchCursor("Clearlake", "GWO"))
print "Items unique to Asbuilt_Log: {}".format(", ".join(sorted(set_one-set_two-set_three)))

Hi Larry,

There's nothing wrong with working with more than one workspace. In fact, you don't have to define a workspace (arcpy.env.workspace) in your script at all. There are a few errors in the script:

- The local variables should point to feature classes (or tables); now they point to file geodatabases.
- You don't use the local variables in the search cursors. Instead you use strings, which arcpy will look for in the current workspace (and will probably not find).
- The "set, int(..." statements are not valid syntax; also, join() needs strings, not integers.

Try this:

import arcpy
#local Variables:
Asbuilt_Log = "C:/Projects/Asbuilt_Log.gdb/NameOfFeatureClassHere" # edit this
Rancho = "C:/Projects/North/Rancho.gdb/NameOfFeatureClassHere" # edit this
Clearlake = "C:/Projects/South/Clearlake.gdb/NameOfFeatureClassHere" # edit this
fldName = "YourFieldToCompare" # edit this
#Body of script
set_one = set(int(r[0]) for r in arcpy.da.SearchCursor(Asbuilt_Log, fldName))
set_two = set(int(r[0]) for r in arcpy.da.SearchCursor(Rancho, fldName))
set_three = set(int(r[0]) for r in arcpy.da.SearchCursor(Clearlake, fldName))
# join() expects strings, so convert the integers first
print "Items unique to Asbuilt_Log: {}".format(", ".join(str(i) for i in sorted(set_one - set_two - set_three)))

Some more reading: SearchCursor (arcpy.da). Kind regards, Xander
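The set arithmetic itself can be checked without arcpy; a minimal sketch with hypothetical GWO values (Python 3 syntax here, the arcpy 10.x script above uses Python 2's print statement):

```python
# Hypothetical GWO key values read from the three feature classes
set_one = {101, 102, 103, 104}   # Asbuilt_Log
set_two = {102, 200}             # Rancho
set_three = {103, 300}           # Clearlake

# Values present in Asbuilt_Log but in neither of the other two
unique_to_one = sorted(set_one - set_two - set_three)
label = ", ".join(str(i) for i in unique_to_one)
```

The str() conversion before join() is the fix for the TypeError the original script would raise on integer set members.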

Posted 11-12-2013 06:30 AM

POST

I'm using ArcMap 10 and have 1 arc-second, 1/3 arc-second, and 1/9 arc-second DEMs. I understand that 1/9 arc-second is about 3 meters, 1/3 arc-second is about 10 meters, and 1 arc-second is about 30 meters (http://ned.usgs.gov/). 1. This means a 3 m by 3 m cell, 10x10, and 30x30? (Just making sure.) 2. To change the symbology of the DEM, I use the "classified" option and separate the cell values into certain heights (100, 250, 500 meters). Am I okay in assuming that the cell values of the DEM represent heights in meters for each cell (of whichever arc-second map I'm using)? 3. After I change the symbology of the DEM, is there any way to turn these subdivisions into shapefiles, similar to selecting certain areas/lines/points in a shapefile and then exporting the selection to its own shapefile? I would like to do this for certain elevation ranges (i.e. 100-250 meters, 250-500 meters). Is this possible? Thank you. I'm pretty new to ArcGIS.

1. Yes.

2. All elevation values are in meters. The data is distributed in geographic coordinates in units of decimal degrees, so your cell sizes will be very small numbers, since they are expressed in decimal degrees. You can project the raster using the "Project Raster (Data Management)" tool.

3. There are a few steps required. The symbology does not affect the values in the raster, so before you can convert them to a shapefile you will first have to reclassify them with the "Reclassify (Spatial Analyst)" tool. Please note that this requires the Spatial Analyst extension. If you select the DEM with the classified symbology, the ranges will already be filled in. The result will be a raster holding a unique value for each class you defined. The next step is to convert this raster to polygons. The resulting feature class will have a field called "gridcode"; this holds the unique value of your raster. A value of 1 will refer to your first class, and so on.

Kind regards, Xander. I just noticed that Jim already posted useful info...
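The reclassify step boils down to mapping each elevation to a class code, which later appears as "gridcode" in the polygon output. A minimal sketch of that mapping in plain Python (the break values match the ranges mentioned in the question):

```python
# Reclassify logic: elevation (m) -> class code
breaks = [(100, 1), (250, 2), (500, 3)]  # (upper bound, class)

def classify(elev):
    for upper, code in breaks:
        if elev <= upper:
            return code
    return None  # above the last break (would be NoData)

codes = [classify(e) for e in (42.0, 180.5, 499.9)]
```

The Reclassify tool does exactly this per cell, and Raster To Polygon then dissolves adjacent cells that share a code into one polygon.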

Posted 11-12-2013 05:49 AM

POST

Hi Enrich, In case you haven't done so already, your best option is to contact Esri Support and provide them with the necessary data so they can see whether they can solve this: http://support.esri.com/en/ Kind regards, Xander

Posted 11-12-2013 02:56 AM