POST
Mike, did you ever get any resolution to this issue? I'm seeing almost exactly the same thing using 10.1.
Posted 12-20-2013 12:26 PM

POST
In the Loading Data into a Parcel Fabric white paper (http://www.esri.com/library/whitepapers/pdfs/loading_data_parcel_fabric.pdf) that many of us are familiar with, there is a handy migration checklist. The last checklist item says: "Part Connectors: Check for part connection lines that are very long; if you have multipart parcels that are spatially away from each other, long part connectors have a bad impact on your geodatabase performance." Of course this is pretty ambiguous; "very long" could mean 100 feet or 1 mile.

The pilot data I've loaded has many connection lines, a result of very large right-of-way polygons with their donuts and islands. Because of the way the loader created the part connector lines, many of them are very long (over 1,000 feet). So I'm trying to find more information about this to see if I need to do anything to eliminate some of the longer connector lines, but I haven't found much. The only other thing I've come across is that Data Reviewer for Tax Parcel Editing flags part connector lines greater than 100 feet as errors. 100 feet doesn't seem very long to me.

Also, the checklist specifically mentions multipart parcels (which I don't have), but I'm dealing with donuts/islands, so I don't know if this even pertains to my situation. There are times I perceive the fabric as chugging quite slowly, but I usually blame it on the godzilla ROW polygons. Can anybody, perhaps one of the Esri folks, clue me in on this issue? There's got to be more information, as it was deemed important enough to mention specifically in the white paper.
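For anyone wanting to triage connectors before deciding what to eliminate, here is a minimal pure-Python sketch of the kind of length check the Data Reviewer rule performs. The function name, the 100 ft default (taken from the Data Reviewer threshold mentioned above), and the sample coordinates are all hypothetical; in practice you would pull the connector endpoints out of the fabric's lines table.

```python
import math

def long_connectors(connectors, max_len=100.0):
    """Return (id, length) for part connectors longer than max_len.

    connectors: iterable of (id, (x1, y1), (x2, y2)) endpoint tuples,
    coordinates in the same linear unit as max_len (feet here).
    """
    flagged = []
    for cid, (x1, y1), (x2, y2) in connectors:
        length = math.hypot(x2 - x1, y2 - y1)
        if length > max_len:
            flagged.append((cid, round(length, 1)))
    return flagged

# Hypothetical connector endpoints, in feet
sample = [
    (1, (0.0, 0.0), (30.0, 40.0)),     # 50 ft, under the threshold
    (2, (0.0, 0.0), (900.0, 1200.0)),  # 1500 ft, well over it
]
print(long_connectors(sample))  # only connector 2 exceeds 100 ft
```

Raising `max_len` lets you experiment with where "very long" should sit for your own data rather than taking the 100 ft figure at face value.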
Posted 11-27-2013 09:40 AM

POST
Thanks Tim! That add-in is exactly what I was looking for, without knowing what to look for. It worked perfectly to remove the point. I worded my second question really poorly, and ended up figuring out the answer myself. With the LGIM disabled, by default there are relates on the lines layer between it and the points layer, based on the ToPointID and FromPointID fields. With the LGIM enabled, these relates aren't established. They're handy if you want to identify a curve center point and, through the relate, see which curve/parcel it belongs to.
Posted 11-22-2013 02:18 PM

POST
I checked my parcel fabric for errors and it found a point error called "Point does not belong to a valid parcel or control point". The point is symbolized like a boundary point, but I suspect it is a remnant center point. It is not related to any other feature. I can't find any way to delete the point; standard editing tools don't work. Selecting it with the Select Parcel Features tool and clicking Delete will delete the parcel that it is floating on top of, but not the point itself. Any ideas?

Also, is there a reason why you can't see features that are related to fabric points when the Local Government Information Model is enabled, but you can when it isn't? It doesn't even appear that there is a relationship class between points and lines or control.
Posted 11-21-2013 02:30 PM

POST
Isn't it Historical <> 1? Historical can be 0 or null, so that doesn't work. I guess it would be best to calc the null Historical values to 0, but I'm still migrating so I haven't done that yet, and probably not everybody will think to do it. In that case, though, Historical = 0 from ESRI's script would work. Like I said in my last post, historical parcels have already been queried out of the Tax Parcels layer, so I just removed that part of the query from ESRI's script altogether.

Another thing I noticed about this script is that it will append all the parcels from the fabric's Tax Parcels layer to the ParcelPublishing\TaxParcels feature class every time you run it, even if the parcels already exist in that feature class. So you will end up with parcels stacked on parcels stacked on parcels. I haven't messed around with the script any further, but I'll probably add a line that runs Delete Features on ParcelPublishing\TaxParcels somewhere before the append runs, in order to clear it out for a fresh set of parcels.
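The clear-before-append fix described above can be sketched in plain Python (lists standing in for feature classes, so it runs anywhere; the real script would call the arcpy Delete Features and Append tools instead). The function name and sample rows are hypothetical; the filter mirrors the post's point that Historical can be 0 or null and both should count as current.

```python
def publish_tax_parcels(source_rows, target_rows):
    """Refresh the publishing 'feature class' (a plain list here) in place.

    Clearing the target first means repeated runs replace the parcels
    instead of stacking duplicate copies on top of each other.
    """
    target_rows.clear()  # stands in for Delete Features on the target
    # Stands in for Select Layer By Attribute + Append:
    # keep Type 7 parcels whose Historical flag is 0 or null.
    target_rows.extend(r for r in source_rows
                       if r["Type"] == 7 and not r.get("Historical"))
    return target_rows

source = [
    {"id": "A", "Type": 7, "Historical": 0},
    {"id": "B", "Type": 7, "Historical": None},  # null treated as current
    {"id": "C", "Type": 4, "Historical": 0},     # not a tax parcel type
]
target = [{"id": "A", "Type": 7, "Historical": 0}]  # stale copy from a prior run
publish_tax_parcels(source, target)
print([r["id"] for r in target])  # A and B once each, no stacking
```

Running it twice in a row leaves the target unchanged, which is exactly the behavior the unmodified script lacks.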
Posted 09-25-2013 07:54 AM

POST
It looks like this thread never really got an answer and it popped up in a search for me, so I'll repost my solution to this problem from another thread.

If you have enabled the Local Government Information Model on your parcel fabric, which you would have if you want to use the Tax Parcel Editing Template, then there is no "Parcels" sub-layer in the fabric. This is why the following line fails:

parcels = r"%s_Layer\Parcels"%(name)

With the LGIM enabled on the parcel fabric, there is a sub-layer called "Tax Parcels", which is what you want to use. So change the line mentioned above to:

parcels = r"%s_Layer\Tax Parcels"%(name)

Also, the Tax Parcels sub-layer has already queried out historical parcels, so the query in the following line will return 0 parcels and none will be appended to the parcels publishing feature class:

arcpy.management.SelectLayerByAttribute(parcels, "NEW_SELECTION", """Type = 7 AND Historical = 0""")

So remove AND Historical = 0 from the query to get the proper results. I suppose you could even remove the Select Layer By Attribute line completely, because the Tax Parcels sub-layer has already been queried to only include Type 7.

These are the things I had to do to get the script working; I hope this helps anybody with this problem.
Posted 09-05-2013 02:47 PM

POST
"I've found the 'Tax Parcel Publishing' script in the ParcelPublishingTools toolbox from the TaxParcelEditingfor10.1 that I believe is supposed to migrate our data from the Parcel Editing dataset to the Parcel Publishing dataset. I have not been successful in running this tool. Looking at the updatetaxparcels.py in the same Parcel Publishing folder I cannot figure out where to change the python code to make this script work. Is there an updated version of this script or any reason for it to not be working?"

I want to provide an answer to this question since I had the same problem as b.blackman and this thread came up for me in a search.

If you have enabled the Local Government Information Model on your parcel fabric, which you would have if you want to use the Tax Parcel Editing Template, then there is no "Parcels" sub-layer in the fabric. This is why the following line fails:

parcels = r"%s_Layer\Parcels"%(name)

With the LGIM enabled on the parcel fabric, there is a sub-layer called "Tax Parcels", which is what you want to use. So change the line mentioned above to:

parcels = r"%s_Layer\Tax Parcels"%(name)

Also, the Tax Parcels sub-layer has already queried out historical parcels, so the query in the following line will return 0 parcels and none will be appended to the parcels publishing feature class:

arcpy.management.SelectLayerByAttribute(parcels, "NEW_SELECTION", """Type = 7 AND Historical = 0""")

So remove AND Historical = 0 from the query to get the proper results. I suppose you could even remove the Select Layer By Attribute line completely, because the Tax Parcels sub-layer has already been queried to only include Type 7.

These are the things I had to do to get the script working; I hope this helps anybody with this problem.

Edit: sorry for reviving such an old thread, but it came up in a Google search and I only noticed the last two posts were from a month ago!
Posted 09-05-2013 12:30 PM

POST
Thanks Steve. It's not the answer I was hoping for but at least I understand what's happening now.
Posted 04-23-2013 09:53 AM

POST
So it sounds like you're saying Spatial Analyst doesn't provide the user any way to control how it resamples data? I may also be seeing this same issue in areas of the mosaic that are being projected on the fly. That's a real bummer, because I thought one of the big advantages to using the mosaic dataset is that you could combine original source data even in different coordinate systems and resolutions, on the fly. The assumption being, you could run geoprocessing tools on the dataset and get acceptable results. Anyways, I do appreciate your responses, Steve.
Posted 04-23-2013 08:28 AM

POST
Steve, thanks for the reply. I think I get what you're saying, but I'm not sure, because I don't understand how it matters in the scenario I'm describing. In my case, I have 6-foot and 3-foot pixels. If what you're saying applies, then the 3-foot pixels are getting resampled to 6-foot, so they are the ones that should look jagged, or they all should look that way if all cells are being resampled. But if you look at my picture, the contours on the north half look fine, and that is the area represented by 3-foot pixels. It's the ones generated from the original 6-foot pixels that look bad.

And when I generate these contours from the source mosaic with all 6-foot pixels (the purple lines in my example), they look fine, so it's not like this resolution is "too low" to look good at this scale. Besides, I can take the 3-foot source mosaic, resample it to 6-foot using either Nearest or Bilinear, create contours from it, and they look perfectly fine. So unless Spatial Analyst uses some other resampling method that isn't as good, why should this even matter?
Posted 04-23-2013 07:15 AM

POST
Hmm, I wasn't sure whether to post this here, on the Spatial Analyst forum, or the desktop general forum. Maybe this was the wrong place... I can't be the only person who has encountered this before.
Posted 04-22-2013 06:49 AM

POST
I created a file geodatabase mosaic dataset combining lidar-derived DEM TIFFs from a number of different surveys. It combines multiple cell sizes and horizontal datums. There is a lot to love about the mosaic dataset; I like it. But I'm having a problem generating contours from the master mosaic. In areas where the source data is the lower resolution (6 ft pixels vs. 3 ft), the contours appear jagged, blocky, pixelized, however you want to say it.

What is really throwing me off, though, is that if I generate the contours directly from the source mosaic that has 6 ft pixels, the contours are much more acceptably smooth, as they should be. I'm attaching an example; it is a transition area where there is overlap between 3 ft and 6 ft pixels. The yellow contours come from the master mosaic, and it's easy to see what I'm talking about. The purple contours come from the 6 ft pixel source mosaic. I'd appreciate any help with this!
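For readers following along, the jagged-vs-smooth difference described here is the classic signature of how coarse cells get upsampled before contouring: nearest-neighbor replication produces stair-steps, while linear interpolation produces a smooth ramp. This is just a generic illustration of that effect in one dimension, not a claim about what the mosaic dataset or Spatial Analyst actually does internally; all names and values are hypothetical.

```python
def nearest(row, factor):
    """Upsample a 1-D row of cell values by an integer factor,
    nearest-neighbor style: each value is simply repeated."""
    return [v for v in row for _ in range(factor)]

def bilinear(row, factor):
    """Upsample by linear interpolation between neighboring cell values
    (the 1-D analogue of bilinear resampling)."""
    out = []
    for i in range(len(row) - 1):
        a, b = row[i], row[i + 1]
        out.extend(a + (b - a) * k / factor for k in range(factor))
    out.append(row[-1])
    return out

coarse = [10.0, 20.0, 30.0]  # e.g. elevations along a profile of 6 ft cells
print(nearest(coarse, 2))    # [10.0, 10.0, 20.0, 20.0, 30.0, 30.0] stair-steps
print(bilinear(coarse, 2))   # [10.0, 15.0, 20.0, 25.0, 30.0] smooth ramp
```

A contour threaded through the stair-stepped values jumps at cell edges (blocky lines), while the interpolated ramp gives it intermediate values to pass through smoothly, which is why the resampling method chosen before contouring matters so much.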
Posted 04-19-2013 11:30 AM

POST
I am using a Trimble Geoexplorer 6000 GeoXH with ArcPad 10.0.4 and GPSCorrect. It is stated in the ArcPad 10 Help that EPE is only output by Garmin GPS receivers, so setting a GPS Quality threshold using maximum EPE is only valid when using a Garmin. But my setup does output an EPE, which can be viewed in the GPS Position Window.

With this in mind, I've got a little EPE "wishlist". Maybe some of these things are possible, but I haven't been able to figure them out:

- Since the receiver is outputting EPE, allow it to be used as a data collection quality threshold (doesn't work now)
- Allow EPE to be displayed in the GPS Status Bar instead of elevation
- Allow EPE units to be changed to something other than meters
- Allow EPE to reflect your estimated postprocessed accuracy, like it can be displayed in GPSCorrect

These would be nice options to have.
Posted 03-08-2013 09:31 AM

POST
Paul, I agree that it should be more obvious what units are being used for EPE and the user should be able to decide which units are used. It would also be nice if you could set EPE as the main display property on the GPS Position Window instead of elevation. If you are a Trimble GPSCorrect user and you turn on the status bar in your ArcPad map, it will display your positional accuracy as you have set it up in GPSCorrect. In my case, it displays the estimated accuracy after postprocessing, in inches. This spot on the status bar toggles between your map scale and the accuracy estimate.
Posted 03-08-2013 08:59 AM

POST
I've solved my problem. I incorrectly assumed that this was some kind of usual ArcMap behavior and that I simply needed to find a way to deal with it. Thank you for your replies, Jason; while I didn't go through the debugging process you suggested, you did get me thinking along the correct lines.

Since I have a number of other ArcMap add-ins installed on my machine, I decided to uninstall all of them and then test for this behavior. After doing this, it worked fine; the locks were going away even after saving edits. I reinstalled my current project and it still worked fine. Using the process of elimination, I narrowed the problem down to an add-in called Attribute Assistant, from ESRI's water utilities team. Maybe there is a memory leak, as Jason suggested, but I didn't look into it further since I don't need the Attribute Assistant anymore. They have also updated it a number of times since I installed it, so maybe it's fixed.
Posted 01-31-2013 09:38 AM