POST
T.., reasonable question; there is some ambiguity in the nomenclature, since the N or S following the UTM zone number can refer either to latitude bands, which progress alphabetically from south to north, or simply to the Northern and Southern hemispheres, thus telling you which way to measure the northing or southing. Here is a good Wikipedia reference: http://en.wikipedia.org/wiki/Universal_Transverse_Mercator_coordinate_system Since the origin of every UTM zone is at the intersection of the zone's central meridian and the Equator, the latitude band information is redundant once you have a northing or southing for a point. Latitude bands N and S both occur in the northern hemisphere, so the S you refer to in California is just that, a latitude band, and it happens to be north of band N, which sits just above the equator. Potentially confusing, so you should state which nomenclature is being used, but in common practice the N and S just refer to North and South. Hardolph
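The band arithmetic is easy to sketch; here's a minimal Python illustration (the function names are mine, not any ESRI API), using the standard MGRS-style bands C through X, 8 degrees each, with I and O skipped:

```python
# Illustrative sketch of UTM zone numbers and latitude band letters.
# Bands run C..X from 80 S to 84 N; note that bands N and S are both in
# the northern hemisphere, which is the ambiguity discussed above.
BAND_LETTERS = "CDEFGHJKLMNPQRSTUVWX"  # I and O are skipped

def utm_zone(lon):
    """Zone 1 starts at 180 W; zones are 6 degrees wide."""
    return int((lon + 180) // 6) + 1

def latitude_band(lat):
    """Valid for -80 <= lat < 84, the limits of UTM coverage."""
    if not -80 <= lat < 84:
        raise ValueError("outside UTM latitude limits")
    # min() handles band X, which is 12 degrees tall (72 N to 84 N)
    return BAND_LETTERS[min(int((lat + 80) // 8), 19)]

# latitude_band(37.0) -> "S" (California), latitude_band(2.0) -> "N"
```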
07-04-2012 12:36 PM
POST
Mike, reading between the lines of your question, I think Lucas's suggestion "For example, you could have data gathered in WGS 1984 and then go in and Define the Projection as NAD 1983 UTM Zone 15N. That doesn't mean that the data are actually NAD 1983 UTM Zone 15N, rather it means that you simply gave your data that label." is on the mark. I would infer that you have defined the lat/long data as projected when in fact they are not, so they are being displayed as though degrees of lat and long were just meters. Go back and re-add the XY Data, but when it comes to entering the coordinate system (the Edit... button) enter only a Geographic Coordinate System: WGS 84, NAD 83, etc. Ditch the previously created shapefile. The base layer's coordinate system is irrelevant, as is that of the Data Frame. The GIS just has to know what kind of units the numbers in the spreadsheet represent and in what coordinate reference frame they were measured in order to digest them properly. Hardolph
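A quick numeric illustration of that symptom (the coordinates here are hypothetical, just to show the scale of the error):

```python
import math

# A point in Minnesota, in decimal degrees:
lon_deg, lat_deg = -93.1, 45.0

# If those numbers are mislabeled as UTM 15N *meters*, the software reads
# them as easting/northing directly:
easting_m, northing_m = lon_deg, lat_deg

# The point lands ~45 m north of the equator and ~500 km west of the
# false easting (500,000 m), instead of ~5,000 km up in Minnesota.
dist_from_false_origin_km = math.hypot(easting_m - 500_000, northing_m) / 1000
```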
06-29-2012 01:03 PM
POST
Henrik, OK, that's a bit weird; but this is also ArcGIS 10, I presume, because the attribute table looks different. Try some of my earlier suggestions about adding a new field and using Field Calculator to copy over the attributes. You could even try a couple of fields: one with the domain assigned and another without; the one without should show just coded values, and they should all be in the set 2, 3, 4, 998, with no distinction between new and old, or else something is corrupted. Also, can you manually edit the "old" records at all: delete, rewrite, select and recalc, etc.? Even try deleting one of the features and then undoing the delete. Another couple of prods would be to export or copy the feature class within the geodatabase, or to export to a shapefile and reimport to a feature class. I've seen domain pulldowns get "sticky" and not activate right away, or get stuck when switching subtypes such that I had to go back a few steps, but the records should remain editable. Maybe you have discovered a real bug/glitch and should contact ESRI for support. Hardolph
06-28-2012 12:22 PM
POST
|
Eydís, good to hear it worked out. I'd be interested to see a screen shot of it. What kind of birds? Hardolph
06-28-2012 07:21 AM
POST
Henrik, it looks like my followup on your last post disappeared. In case it pops up from the ether, here's most of it again: I had a look at the screen shots, but there is ambiguity. The only way to evaluate the domains is to check the editing response in the full attribute table, not the attribute-editing popup dialog. Open the full attribute table in an edit session and then check how the old vs. new records respond. There should be no difference unless the old values are truly not in the set of coded values in the domain. The potential ambiguity with looking at the popup dialog table is that it is based on selection, and you could have inadvertently created a new feature class with the old values, or even done something in creating a new field with a misleading alias etc. and moved it in the layer display. All the records in a field get treated the same; they are not like cells in a spreadsheet, where a formula could be applied to only part of a column of cells or the formatting could differ. Hardolph
06-27-2012 11:52 AM
POST
[ATTACH=CONFIG]15503[/ATTACH] Eydís, if you have more than one point per cell you should use the "JOIN_ONE_TO_MANY" option in ArcToolbox > Analysis Tools > Overlay > Spatial Join instead of the simple join in the TOC layer's right-click menu. With this tool you need to uncheck the "keep all target features" option, as shown in the clip of the tool dialog box. (With the TOC layer join you'll only get the first instance of a join.) From there you will have multiple overlapping polygons to create centroid points from, and you may have to manually shift the points to display them in the cells the way you want. Presumably these are the exceptions and the manual work will be minimal, or you might have to consider a smaller cell size. Hardolph
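The difference between the two join behaviors can be sketched in a few lines of Python (a toy stand-in for the spatial test, not arcpy):

```python
# Each tuple is a (cell_id, point_id) pair produced by a point-in-polygon test.
matches = [("A", 1), ("A", 2), ("B", 3)]

def join_first_only(pairs):
    """Mimics the simple TOC-layer join: only the first matching point survives."""
    out = {}
    for cell, pt in pairs:
        out.setdefault(cell, pt)
    return out

def join_one_to_many(pairs):
    """Mimics JOIN_ONE_TO_MANY: every matching pair becomes an output record."""
    return list(pairs)

# join_first_only(matches) -> {"A": 1, "B": 3}; cell A's second point is lost.
# join_one_to_many(matches) keeps all 3 records.
```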
06-26-2012 09:46 AM
POST
Henrik, are the coded values in the domain equal to the old values (1, 2, 3) used? The field type should be OK, because you cannot assign a domain except to a field of the same type. As a test, export the feature class to a shapefile or table outside the geodatabase and inspect the values in that field. That will show them as just coded values, and you should see a difference if the old values are not immediately translating into coded value descriptions in the feature class once the domain is assigned. Hardolph
06-26-2012 08:55 AM
POST
Henrik, I'm not sure what you mean by "related table value" (new terminology for coded value?), but the symptoms you describe would be expected if the coded values in the domain were not 1, 2, and 3 with the respectively corresponding coded value descriptions 8", 10" and 12" for the valves. If so, you could assign the domain to a new field of the correct field type (e.g. short integer, but it could be text) and use Field Calculator, selecting in turn the records with 1, 2 and 3 in the old field (you could do this recalc on the original field too), to apply the new coded values. It seems a bit odd that the pull-down pick list does not appear when you edit the persistent old values, so perhaps I have missed something, or you are using a version of ArcGIS 10 with some new idiosyncrasy. Hardolph
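A coded-value domain is essentially a code-to-description mapping; here is a hedged sketch (the codes and descriptions come from the post above, the function is mine):

```python
# Coded-value domain for valve diameters, as described in the post.
VALVE_DIAMETER_DOMAIN = {1: '8"', 2: '10"', 3: '12"'}

def display_value(stored_code):
    """What the attribute table shows once the domain is assigned."""
    # A stored code missing from the domain falls through and displays raw,
    # which is exactly the symptom of old values outside the coded set.
    return VALVE_DIAMETER_DOMAIN.get(stored_code, str(stored_code))

# display_value(2) -> '10"', display_value(998) -> "998"
```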
06-25-2012 09:51 PM
POST
Eydís, in general terms, to get the design of the map you refer to, and presuming the GPS point data contains the breeding attributes of one species and the grid is a polygonized fabric with at most one point per cell:

1. Do a spatial join with the polygons as target and the points as the join table in ArcMap (points to polygons, and select the option to add all attributes).
2. The resulting output polygon shapefile will have an automatic field called "Distance". Select from the new polygon shapefile those polygons with "0" distance (meaning the point fell within) and export this selection to a new shapefile (this step just eliminates the grid cells with no points).
3. Add 2 fields to the new joined polygon shapefile and use Field Calculator to create the X and Y coordinates of the centroid of each polygon.
4. Export this shapefile to a table (dbf etc.).
5. Using Add XY Data (or the ArcGIS 10 equivalent, Display XY Data), create a point file using the new XY coordinates of the centroids.
6. Symbolize the points using the breeding characteristics as unique categories (or create buffers), with nothing appearing for the
7. Overlay this on a map of the area with or without the grid polygons.

Each of those steps assumes a bit of how-to familiarity with attribute table operations (right-click menus and table Options), but all are basic and easily found in the help files as you proceed. No doubt there may be more elegant ways of doing this in Arc10, but I think this will work for what you want. Post again if you have questions specific to any step or find something better. Hardolph
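The centroid computed in step 3 is the standard area-weighted (shoelace) centroid; here is a self-contained sketch of the formula, in case you want to verify a value by hand:

```python
def polygon_centroid(ring):
    """Area-weighted centroid of a simple polygon.

    ring: list of (x, y) vertex tuples, open or closed.
    """
    if ring[0] != ring[-1]:
        ring = ring + [ring[0]]          # close the ring
    a = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(ring, ring[1:]):
        cross = x0 * y1 - x1 * y0        # shoelace term
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5                             # signed polygon area
    return cx / (6 * a), cy / (6 * a)

# A 1x1 grid cell with its corner at (10, 20) has centroid (10.5, 20.5).
```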
06-21-2012 12:45 PM
POST
Marcus, the offset you describe, 2800 miles south and 260 west of Utah, puts the CAD data on the Equator in the Pacific. That means either you have not defined the projection in such a way that the embedded coordinates are interpreted as being in Utah State Plane with the correct false origin (lat 38.2° N, long 111.3° W; 2,000,000 false easting), or they are in a local drawing or mine grid with numbers in the tens of thousands of feet, and the projection has placed them just that: a few tens of thousands of feet north of the equator. With that offset placing them on the equator, the effective false origin of the projection is zero north by some value of longitude other than zero or -180, so it looks more likely that you have projected them into the UTM zone for western Utah. I expect that you will have to georeference the CAD data, not project it. For this you use only 2 points, and otherwise it works just like a raster georeferencing job. Hardolph
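The arithmetic behind that inference is simple, using roughly 69 miles per degree of latitude:

```python
# Back-of-envelope check: a 2,800-mile southward offset from Utah (~40 N)
# lands almost exactly on the equator, consistent with an effective false
# origin of zero north.
MILES_PER_DEG_LAT = 69.0                 # approximate
offset_deg = 2800 / MILES_PER_DEG_LAT    # ~40.6 degrees
utah_lat = 40.0
resulting_lat = utah_lat - offset_deg    # just below 0, i.e. at the equator
```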
06-19-2012 08:39 PM
POST
Richard, try making polygons from them. If the polylines form closed loops, by themselves or by intersection with others, they will allow creation of a polygon. Easy to do in ArcView, no Python etc. required. Here's the simplest case; just ignore step 1: http://forums.arcgis.com/threads/6786-polyline-to-polygon Hardolph
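The precondition is just that each polyline closes on itself (or closes through intersections with others); self-closure is a trivial endpoint check, sketched here in plain Python:

```python
def is_closed(polyline, tol=1e-9):
    """True if the polyline's first and last vertices coincide within tol."""
    (x0, y0), (xn, yn) = polyline[0], polyline[-1]
    return abs(x0 - xn) <= tol and abs(y0 - yn) <= tol
```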
06-18-2012 11:05 AM
POST
Tim, you need a text header at the top of the column; otherwise it reads the first numbers in your lat-long data as the field names. Hardolph
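The symptom is easy to reproduce outside ArcGIS; Python's csv module does the same thing with a headerless file:

```python
import csv
import io

# Without a header row, the first lat/long pair becomes the field names:
no_header = io.StringIO("45.1,-93.2\n45.2,-93.3\n")
bad_fields = csv.DictReader(no_header).fieldnames     # ['45.1', '-93.2']

# With a text header, the fields are named properly:
with_header = io.StringIO("lat,long\n45.1,-93.2\n")
good_fields = csv.DictReader(with_header).fieldnames  # ['lat', 'long']
```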
06-17-2012 04:49 PM
POST
John, actually most of Lake Superior is in zone 16: a 1 meter offset would have implied more than the full 6 degree width of the UTM zone at that latitude, since most of Lake Superior is actually in UTM zone 16N, with its west boundary at 90° W. The east edge of Lake Superior that I used as an example is at 84.5° W, and the east side of UTM zone 14 is at 96° W, almost 850 km away. Right next to the zone boundary there would be zero offset, and it would ramp up in a slightly non-linear relation toward the other side.

Upon remeasuring a zone 16N projection of the lake against a zone 15N projection at the eastern side of Lake Superior, which is a good 430 km from the east side of zone 15N and almost at the east side of zone 16N, I get an error of only 10 cm in the projection. So if you are operating up to 100 km into an adjacent UTM zone at mid latitudes, I would expect less than 2 cm error max. At the equator you could probably go 150 km with that error. Thanks for prompting the question, John; I would not have thought of measuring this otherwise. Hardolph

One little afterthought: if you really go too far with misprojections, like a zone 16 feature projected to zone 8, they will get chopped by the algorithms, which only support 90 degrees of UTM projection display. You can visualize orthogonals to a flat tangential projection plane laid on the real zone starting to have really low incident angles, until in the limit they are tangents to the earth themselves. H
06-14-2012 03:23 PM
POST
John, I think I see what you're driving at. As Joe said, UTM zones do not overlap by definition, but that does not mean that you cannot project features existing in one zone using an adjacent one. The issue is not immediate locational error, but progressive shift and map distortion, and therefore area and length. Depending on how far north you are, you can probably use an adjacent UTM zone as the projection, say, half way across a few hundred km, or anywhere within say 3 degrees of longitude, before things start to get noticeably out of shape or projected out of alignment by say a meter at mid latitudes. That is why there are other projections, like Albers, to preserve area across several zones.

But project to excess across a few zones and you will see shifts and severe area distortion, like these shots of Lake Superior in a screen projection of UTM zone 15N (with a background map in NAD84 Web Mercator Auxiliary Sphere). The progressive offset is the result of projecting a shapefile from zone 15N into zones 14, 13, 12, 11, 10 and 9, shown overlapping to the west with zone 9 on top. In zone 14 there is only about a meter of offset, but the area is also significantly changed in the shapefile. Progressive westward errors in meters from the zone 15N projection: UTM 14N = 1 m, UTM 13N = 25 m, UTM 12N = 314 m, UTM 11N = 2,500 m, UTM 10N = 18,000 m, UTM 9N = 102,000 m. Areas are displayed as square meters in a jumble, with the minimum being the zone 15N projection and the max 9N. The larger red number is the area in a continental Albers projection. So based on that, you have to decide what precision you can tolerate, whether you are using a UTM projection or Albers. [ATTACH=CONFIG]15195[/ATTACH] Hardolph
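The roughly quadratic growth of that error shows up in the lowest-order spherical approximation of the transverse Mercator point scale factor, k ≈ k0(1 + (Δλ cos φ)²/2). This sketch is only that approximation, for illustration, not the full ellipsoidal formula:

```python
import math

K0 = 0.9996  # UTM scale factor on the central meridian

def tm_scale(lat_deg, dlon_deg):
    """Approximate point scale factor dlon_deg degrees from the central meridian."""
    t = math.radians(dlon_deg) * math.cos(math.radians(lat_deg))
    return K0 * (1 + t * t / 2)

# At ~47 N: 3 degrees off the meridian, the scale error is a few parts in
# 10,000; 30 degrees off (several zones away), it exceeds 6 percent.
```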
06-13-2012 02:36 PM
POST
Eric, frustrating business... Offhand suggestions regarding a wholesale spatial adjustment would be:

1. Work from copies of the original.
2. Review the help sections on spatial adjustment thoroughly.
3. Review the use of snapping.

As you observed, rubbersheeting the whole thing can lead to frustrating iterative effects, and you may be better off shifting sections of parcels and snapping them to your old ones. All that being technically feasible, the contractor's statement that the original shapefile was not correctly laid out seems ambiguous in light of the incorrect georeferencing of the photomosaic they used, unless they were given the photomosaic as is. I'd get clarification of that from them. I don't know how much is at stake in the new shapefile or what the intent was, so it is hard to recommend a course of action, but I'd resolve the georeferencing of the photomosaic first. If the new shapefile covers only the same parcels as the old one, and with the same attributes, I'd be tempted to abandon it, but I have not seen either except in glimpses from the screen shots. If you adjust the new file, the photomosaic will no longer be useful as an underlay for new development areas. At least it is just Pembroke and not all of Ottawa or Toronto.

So to reiterate: I'd not recommend spatially adjusting the new shapefile until you have resolved the georeferencing of the photomosaic. Get the ground control points and double-check their coordinates and that they were used correctly to georeference the photos. If there was an error in that, find out who was responsible. If the GIS contractor was only responsible for drawing the shapefile they might be off the hook, but it seems byzantine to me. If you want to upload the two shapefiles, copy them into a new folder in ArcCatalog and see if you can upload that here as a unit, or send them to me directly at my e-mail and I'll have a look. As I said, I could not do anything with the xml file alone. The photo raster is probably too large. Looks like you've done a thorough job so far. Hardolph

PS. As an afterthought on spatial adjustment, in case you really get stuck into it: avoid rubbersheeting as much as possible, and if you do use it, minimize the offset by first finding an absolute offset that comes as close as possible, using the editor Move function to shift all the polygons by some measured and recorded amount (e.g. 3 m west and 5 m south), or attempt to use a Projective transformation. Again, work from copies, and if it screws up, ditch it and start again. Rubbersheeting should be your last resort. H
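The absolute-offset move recommended in the PS amounts to one constant translation of every vertex; a trivial sketch (the 3 m west / 5 m south figures are the example from the post, the coordinates are made up):

```python
def shift(coords, dx, dy):
    """Translate every (x, y) vertex by a constant, recorded offset."""
    return [(x + dx, y + dy) for x, y in coords]

parcel = [(500100.0, 5030200.0), (500110.0, 5030200.0)]
moved = shift(parcel, -3.0, -5.0)   # 3 m west, 5 m south
```

Unlike rubbersheeting, this is exactly reversible: shift back by (+3, +5) and you recover the original vertices.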
06-07-2012 10:33 AM
Date Last Visited: 11-11-2020 02:23 AM