POST
I'm having trouble coming up with a good workflow for editing the boundaries between existing Right of Way and Tax Parcels in a parcel fabric (v10.1). I'd love to get any ideas from experienced parcel fabric users or hear how others might do this. Here are some things that I think are contributing to my problem, and some specific issues I'm running into:

Our countywide parcel and R/W data is of poor quality but is the only reasonable option for creating our parcel fabric. R/W polygons are, for the most part, only split at township lines. Thus, many of them are large "spider web" polygons with multiple donut hole areas, which can make them difficult to work with. More on that after the next two points...

I'm trying to take a day-forward approach to improving parcel accuracy by completely re-creating parcels using surveyed information, when available, before completing parcel splits, BLAs, etc. This often creates an obvious mismatch between the re-created parcel's road frontage lines and the existing R/W lines. Especially when curves are involved, as they often are with R/W, I'm not sure what approach to take to edit the mismatched R/W lines so they align exactly with the parcel lines.

I've tried to fix small portions of R/W by using Construct From Parent, marking some lines unbuildable and constructing more accurate lines, or by unjoining, then deleting and constructing new linework. By trying to fix only a small portion of a large R/W, I'm mixing accurate linework into inaccurate linework, and it never seems to work very well: things don't line up, or they get distorted.

If I use Construct From Parent on any of these "spider web" R/W polygons that have donut hole areas, then when I Build Parcels it creates R/W island polygons in every donut hole. This leads to annoying clean-up work to delete all of the island polygons, and the potential for erroneous parcels if I fail to catch them all. Is there any way to prevent this, besides the obvious route of splitting the R/W polygons so there are no longer any donut holes (a big task for a 2,500 sq mi county)?

If I unjoin a R/W that spans a large area, edit some lines, then join it back, I've got hundreds of join links to create. Auto Join doesn't always seem to work very well, but I don't know if I'm using it quite right. Any thoughts on this approach? If this doesn't make sense I can show a simple example. Thanks for any help!
Posted 02-14-2014 02:22 PM

POST
Mike, did you ever get any resolution to this issue? I'm seeing almost exactly the same thing using 10.1.
Posted 12-20-2013 12:26 PM

POST
In the Loading Data into a Parcel Fabric white paper (http://www.esri.com/library/whitepapers/pdfs/loading_data_parcel_fabric.pdf) that many of us are familiar with, there is a handy migration checklist. The last checklist item says: "Part Connectors: Check for part connection lines that are very long; if you have multipart parcels that are spatially away from each other, long part connectors have a bad impact on your geodatabase performance."

Of course, this is pretty ambiguous: "very long" could mean 100 feet or 1 mile. The pilot data I've loaded has many connection lines, a result of very large right-of-way polygons with their donuts and islands. Because of the way the loader created the part connector lines, many of them are very long (over 1,000 feet). So I'm trying to find more information on this to see if I need to do anything to eliminate some of the longer connector lines, but I haven't found much. The only other thing I've come across is that Data Reviewer for Tax Parcel Editing flags part connector lines greater than 100 feet as errors, and 100 feet doesn't seem very long to me.

Anyway, the checklist specifically mentions multipart parcels (which I don't have), but I'm dealing with donuts/islands, so I don't know if this even pertains to my situation. There are times I perceive the fabric as chugging quite slowly, but I usually blame it on the godzilla ROW polygons. Can anybody, perhaps one of the Esri folks, clue me in on this issue? There's got to be more information, as it was deemed important enough to mention specifically in the white paper.
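In the meantime, one way to get a handle on the problem is to audit connector lengths yourself. Below is a minimal sketch of that idea in plain Python; the connector IDs, coordinates, and the assumption that connector endpoints have been exported as (x, y) pairs are all hypothetical, not part of any Esri tool.

```python
import math

# Hypothetical part connector lines keyed by ID, each a pair of (x, y)
# endpoints in feet. In practice these would come from the fabric Lines
# table, filtered to the part connector category.
connectors = {
    1: ((0.0, 0.0), (50.0, 0.0)),      # short: 50 ft
    2: ((0.0, 0.0), (1200.0, 500.0)),  # long: well over 1000 ft
}

def connector_length(p1, p2):
    """Planar length of a straight part connector line."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

# Flag connectors longer than a chosen threshold. 1000 ft matches the
# lengths mentioned above; the Data Reviewer check uses 100 ft.
THRESHOLD_FT = 1000.0
flagged = [cid for cid, (a, b) in connectors.items()
           if connector_length(a, b) > THRESHOLD_FT]
```

At least this would tell you how many connectors exceed whatever threshold turns out to matter for performance.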
Posted 11-27-2013 09:40 AM

POST
Thanks Tim! That add-in is exactly what I was looking for, without knowing what to look for. It worked perfectly to remove the point. I worded my second question poorly, and I ended up figuring out the answer myself: with the LGIM disabled, there are by default relates on the lines layer between it and the points layer, based on the ToPointID and FromPointID fields. With the LGIM enabled, these relates aren't established. The relate is handy if you want to identify a curve center point and, through the relate, see which curve/parcel it belongs to.
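For anyone unfamiliar with how that relate behaves, here is a toy sketch of the lookup it performs. The table rows, IDs, and field values below are invented for illustration; only the FromPointID/ToPointID field names come from the actual fabric schema discussed above.

```python
# Toy rows standing in for the fabric Points and Lines tables.
points = {101: "boundary point", 102: "curve center point"}
lines = [
    {"OBJECTID": 1, "FromPointID": 101, "ToPointID": 102, "ParcelID": 55},
]

def lines_for_point(point_id):
    """Mimic the default relate: return lines whose From or To
    point field matches the selected point."""
    return [ln for ln in lines
            if point_id in (ln["FromPointID"], ln["ToPointID"])]

# Starting from a selected point, follow the relate to find the
# line(s) -- and thus the parcel(s) -- it belongs to.
related = lines_for_point(102)
parcel_ids = [ln["ParcelID"] for ln in related]
```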
Posted 11-22-2013 02:18 PM

POST
I checked my parcel fabric for errors and it found a point error called "Point does not belong to a valid parcel or control point". The point is symbolized like a boundary point but I suspect it is a remnant center point. It is not related to any other feature. I can't find any way to delete the point. Standard editing tools don't work. Selecting it with the Select Parcel Features tool and clicking delete will delete the parcel that it is floating on top of, but not the point itself. Any ideas? Also, is there a reason why you can't see features that are related to fabric points when the Local Government Information Model is enabled, but you can when the information model isn't enabled? It doesn't even appear that there is a relationship class between points and lines or control.
Posted 11-21-2013 02:30 PM

POST
Isn't it Historical <> 1? Historical can be 0 or null, so that doesn't work. I guess it would be best to calc Historical null values to 0, but I'm still migrating, so I haven't done that yet, and probably not everybody will think to do it. In that case, though, Historical = 0 from Esri's script would work. Like I said in my last post, historical parcels have already been queried out of the Tax Parcels layer, so I just removed that part of the query from Esri's script altogether.

Another thing I noticed about this script is that it will append all the parcels from the fabric's Tax Parcels layer to the ParcelPublishing\TaxParcels feature class every time you run it, even if the parcels already exist in that feature class. So you end up with parcels stacked on parcels stacked on parcels... I haven't messed with the script any further, but I'll probably add a line that runs Delete Features on ParcelPublishing\TaxParcels somewhere before the append runs, in order to clear it out for a fresh set of parcels.
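The stacking problem and the proposed delete-before-append fix boil down to a simple idempotency issue, sketched here with plain Python lists standing in for the feature classes (the list contents are made up; only the idea of clearing the target before appending comes from the post above).

```python
# Lists standing in for the fabric's Tax Parcels sub-layer and the
# ParcelPublishing\TaxParcels feature class.
tax_parcels = ["parcel_A", "parcel_B"]
publishing = list(tax_parcels)  # left over from a previous run of the script

# Re-running the append without clearing first stacks duplicates:
stacked = publishing + tax_parcels  # 4 features, 2 of them duplicates

# The fix: clear the target first (analogous to running Delete Features
# on ParcelPublishing\TaxParcels), then append a fresh set.
publishing = []
publishing = publishing + tax_parcels  # duplicate-free result
```

Clearing the target makes the publishing step safe to re-run any number of times.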
Posted 09-25-2013 07:54 AM

POST
It looks like this thread never really got an answer and it popped up in a search for me, so I'll repost my solution to this problem from another thread...

If you have enabled the Local Government Information Model on your parcel fabric, which you would have if you want to use the Tax Parcel Editing Template, then there is no "Parcels" sub-layer in the fabric. This is why the following line fails:

parcels = r"%s_Layer\Parcels"%(name)

With the LGIM enabled on the parcel fabric, there is a sub-layer called "Tax Parcels", which is what you want to use. So change the line I mentioned before to:

parcels = r"%s_Layer\Tax Parcels"%(name)

Also, the Tax Parcels sub-layer has already queried out historical parcels, so the query in the following line will return 0 parcels, and none will be appended to the parcel publishing feature class:

arcpy.management.SelectLayerByAttribute(parcels, "NEW_SELECTION", """Type = 7 AND Historical = 0""")

So remove AND Historical = 0 from the query to get the proper results. I suppose you could even remove the Select Layer By Attributes line completely, because the Tax Parcels sub-layer has already been queried to only include Type 7. These are the things I had to do to get the script working; I hope this helps anybody with this problem.
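Stripped of the arcpy calls, the fix is just a string substitution and a shorter where clause, shown below as a self-contained sketch. "MyFabric" is a hypothetical fabric name; the sub-layer names and the Type/Historical fields come from the script discussed above.

```python
# Hypothetical fabric name; in the real script this comes from the tool input.
name = "MyFabric"

# Fails with the LGIM enabled -- there is no "Parcels" sub-layer:
broken = r"%s_Layer\Parcels" % (name)

# Works with the LGIM enabled -- the sub-layer is "Tax Parcels":
parcels = r"%s_Layer\Tax Parcels" % (name)

# And the selection query, with "AND Historical = 0" removed, since the
# Tax Parcels sub-layer already filters out historical parcels:
where_clause = "Type = 7"
```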
Posted 09-05-2013 02:47 PM

POST
"I've found the 'Tax Parcel Publishing' script in the ParcelPublishingTools toolbox from the TaxParcelEditingfor10.1 download that I believe is supposed to migrate our data from the Parcel Editing dataset to the Parcel Publishing dataset. I have not been successful in running this tool. Looking at the updatetaxparcels.py in the same Parcel Publishing folder, I cannot figure out where to change the python code to make this script work. Is there an updated version of this script, or any reason for it to not be working?"

I want to provide an answer to this question, since I had the same problem as b.blackman and this thread came up for me in a search.

If you have enabled the Local Government Information Model on your parcel fabric, which you would have if you want to use the Tax Parcel Editing Template, then there is no "Parcels" sub-layer in the fabric. This is why the following line fails:

parcels = r"%s_Layer\Parcels"%(name)

With the LGIM enabled on the parcel fabric, there is a sub-layer called "Tax Parcels", which is what you want to use. So change the line I mentioned before to:

parcels = r"%s_Layer\Tax Parcels"%(name)

Also, the Tax Parcels sub-layer has already queried out historical parcels, so the query in the following line will return 0 parcels, and none will be appended to the parcel publishing feature class:

arcpy.management.SelectLayerByAttribute(parcels, "NEW_SELECTION", """Type = 7 AND Historical = 0""")

So remove AND Historical = 0 from the query to get the proper results. I suppose you could even remove the Select Layer By Attributes line completely, because the Tax Parcels sub-layer has already been queried to only include Type 7. These are the things I had to do to get the script working; I hope this helps anybody with this problem.

edit: sorry for reviving such an old thread, but it came up on a google search and I only noticed the last two posts were from a month ago!
Posted 09-05-2013 12:30 PM

POST
Thanks Steve. It's not the answer I was hoping for but at least I understand what's happening now.
Posted 04-23-2013 09:53 AM

POST
So it sounds like you're saying Spatial Analyst doesn't provide the user any way to control how it resamples data? I may also be seeing this same issue in areas of the mosaic that are being projected on the fly. That's a real bummer, because I thought one of the big advantages to using the mosaic dataset is that you could combine original source data even in different coordinate systems and resolutions, on the fly. The assumption being, you could run geoprocessing tools on the dataset and get acceptable results. Anyways, I do appreciate your responses, Steve.
Posted 04-23-2013 08:28 AM

POST
Steve, thanks for the reply. I think I get what you're saying, but I'm not sure because I don't understand how that matters in the scenario I'm describing. In my case, I have 6-foot and 3-foot pixels. If what you're saying applies, then the 3-foot pixels are getting resampled to 6-foot so they are the ones that should look jagged, or they all should look that way if all cells are being resampled. But if you look at my picture, the contours on the north half look fine, that is the area represented by 3-foot pixels. It's the ones generated from the original 6-foot pixels that look bad. And when I generate these contours from the source mosaic with all 6-foot pixels (the purple lines in my example) they look fine, so it's not like this resolution is "too low" to look good at this scale. Besides, I can take the 3-foot source mosaic, resample it to 6-foot using either Nearest or Bilinear, create contours from it and they look perfectly fine. So unless Spatial Analyst uses some other resampling method that isn't as good, why should this even matter?
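To illustrate why the choice of resampling method matters when 3-foot cells are converted to 6-foot cells, here is a toy sketch in plain Python. The grid values are invented; Nearest picks one source cell per output cell, while bilinear sampled at the new cell center (which, at an exact 2x reduction, works out to the average of each 2x2 block) produces smoother output, and smoother derived contours.

```python
# A toy 4x4 "3 ft" elevation grid; values are made up.
grid3 = [
    [10.0, 12.0, 20.0, 22.0],
    [11.0, 13.0, 21.0, 23.0],
    [30.0, 32.0, 40.0, 42.0],
    [31.0, 33.0, 41.0, 43.0],
]

def nearest_2x(grid):
    """Nearest neighbor at 2x: keep the upper-left value of each 2x2 block."""
    return [[grid[r][c] for c in range(0, len(grid[0]), 2)]
            for r in range(0, len(grid), 2)]

def bilinear_2x(grid):
    """Bilinear sampled at the new cell center, which at exactly 2x
    reduces to averaging each 2x2 block of source cells."""
    return [[(grid[r][c] + grid[r][c + 1]
              + grid[r + 1][c] + grid[r + 1][c + 1]) / 4.0
             for c in range(0, len(grid[0]), 2)]
            for r in range(0, len(grid), 2)]
```

If the tool silently uses something like nearest neighbor during the on-the-fly mosaic resampling, that would be consistent with the blocky contours described above, but that is speculation; this sketch only shows that the two methods give different surfaces from the same source cells.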
Posted 04-23-2013 07:15 AM

POST
Hmm, I wasn't sure whether to post this here, on the Spatial Analyst forum, or the desktop general forum. Maybe this was the wrong place... I can't be the only person who has encountered this before.
Posted 04-22-2013 06:49 AM

POST
I created a file geodatabase mosaic dataset combining lidar-derived DEM TIFFs from a number of different surveys. It combines multiple cell sizes and horizontal datums. There is a lot to love about the mosaic dataset, I like it. But I'm having a problem with generating contours from the master mosaic. In areas where the source data is the lower resolution (6ft pixels vs 3ft) the contours appear jagged, blocky, pixelized, however you want to say it. What is really throwing me off, though, is if I generate the contours directly from the source mosaic that has 6ft pixels, the contours are much more acceptably smooth, as they should be. I'm attaching an example, it is a transition area where there is overlap between 3ft and 6ft pixels. The yellow contours come from the master mosaic and it's easy to see what I'm talking about. The purple contours come from the 6ft pixel source mosaic. I'd appreciate any help with this!
Posted 04-19-2013 11:30 AM

POST
I am using a Trimble Geoexplorer 6000 GeoXH with ArcPad 10.0.4 and GPSCorrect. The ArcPad 10 Help states that EPE is only output by Garmin GPS receivers, so setting a GPS Quality threshold using maximum EPE is only valid when using a Garmin. But my setup does output an EPE, which can be viewed in the GPS Position Window. With this in mind, I've got a little EPE "wishlist". Maybe some of these things are possible, but I haven't been able to figure them out:

- Since the receiver is outputting EPE, allow it to be used as a data collection quality threshold (doesn't work now)
- Allow EPE to be displayed in the GPS Status Bar instead of elevation
- Allow EPE units to be changed to something other than meters
- Allow EPE to reflect your estimated postprocessed accuracy, like it can be displayed in GPSCorrect

These would be nice options to have.
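The first wishlist item amounts to a simple filter on incoming positions. Here is a minimal sketch of that behavior; the position records, EPE values, and threshold are all made up for illustration, and EPE is assumed to be in meters (per the units complaint above).

```python
# Hypothetical GPS fixes with their receiver-reported EPE, in meters.
positions = [
    {"fix": "A", "epe_m": 0.4},
    {"fix": "B", "epe_m": 2.5},
    {"fix": "C", "epe_m": 0.9},
]

# The quality threshold a user would set: reject any position whose
# estimated error exceeds this value.
MAX_EPE_M = 1.0

accepted = [p["fix"] for p in positions if p["epe_m"] <= MAX_EPE_M]
rejected = [p["fix"] for p in positions if p["epe_m"] > MAX_EPE_M]
```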
Posted 03-08-2013 09:31 AM

POST
Paul, I agree that it should be more obvious what units are being used for EPE and the user should be able to decide which units are used. It would also be nice if you could set EPE as the main display property on the GPS Position Window instead of elevation. If you are a Trimble GPSCorrect user and you turn on the status bar in your ArcPad map, it will display your positional accuracy as you have set it up in GPSCorrect. In my case, it displays the estimated accuracy after postprocessing, in inches. This spot on the status bar toggles between your map scale and the accuracy estimate.
Posted 03-08-2013 08:59 AM