POST
You only have to do that once. After that, you have your fields, and Append does not require field maps if you set the Schema Type to TEST. If you are dealing with extra fields in some files, you can add those fields manually using the Append field map. But I would just try a big Merge to get all the fields first, empty the feature class, then use Append.
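A plain-Python sketch (not arcpy; the data is invented) of why merge-first works: the merged target carries the union of every source's fields, so later appends with a TEST-style schema check find every field already present and need no field map.

```python
# Illustration of the merge-then-append idea in plain Python.
# Each "file" is modeled as a dict of field names to values.
sources = [
    {"ID": 1, "NAME": "a"},
    {"ID": 2, "NAME": "b", "EXTRA": "x"},  # one file has an extra field
]

# The one-time "merge" builds a schema that is the union of all source fields.
schema = []
for rec in sources:
    for field in rec:
        if field not in schema:
            schema.append(field)

# "Append" with a TEST-style schema check: every source field must
# already exist in the target schema, so no field map is needed.
target = []
for rec in sources:
    assert all(f in schema for f in rec)  # TEST would reject a stray field
    target.append({f: rec.get(f) for f in schema})

print(schema)  # → ['ID', 'NAME', 'EXTRA']
```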
Posted 04-02-2021 02:04 PM

POST
The lakes themselves may work as an index if you experiment with the zoom percentage. You can also experiment with how much to round the scale. Rounding the scale to the nearest 2,000, for example, guarantees a minimum map scale of 1:2,000, but it could also mean a map page best generated at 1:10,500 will generate at 1:12,000.

There are two typical problems: shapes that fit rectangular pages poorly (such as very long and skinny ones) and shapes with outlier sizes; your tiny ones may look zoomed in too far and not include enough surrounding features, and the very large ones may not show enough detail because the scale is too small. Methods I have used to deal with these kinds of problems include excluding the outliers from the Data Driven Pages index feature, copying the index feature and editing some of the shapes to get a better fit, and creating two Data Driven Pages mxd layouts, such as one portrait and one landscape (using a layout field to select which shapes to include in each). One advantage of copying a polygon feature to use as your index is that you can then run a Page Query on it, which you cannot do directly on an index polygon feature.
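The scale-rounding arithmetic above can be sketched in a few lines of plain Python, under the assumption that Data Driven Pages rounds a page's best-fit scale up to the next multiple of the rounding value (the function name is mine, not an ArcGIS API):

```python
import math

def round_scale(best_fit_scale, nearest):
    """Round a map scale up to the next multiple of `nearest`,
    which also guarantees a minimum scale of 1:`nearest`."""
    return math.ceil(best_fit_scale / nearest) * nearest

# A page whose best-fit scale is 1:10,500 generates at 1:12,000.
print(round_scale(10500, 2000))  # → 12000
```

So a small rounding value keeps pages close to their best-fit scale, while a large one produces tidier scales at the cost of extra zoom-out.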
Posted 03-09-2021 03:24 PM

POST
At any precision, each contour represents a closed polygon marking where the elevation is above or below the break value, such as 10 m. 10 m contours draw completely different shapes than 100 m contours, with many more bends. They should not be evenly distributed, and they may even cross the 100 m contour lines because they are built from more information. Brown is the 20 m line; green is the detail.
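A toy one-dimensional illustration (elevations are invented) of why contours built from finer data bend more: at the same break value, the detailed profile crosses the break far more often than a coarse sampling of it.

```python
# Toy elevation profile (metres) at fine resolution, and a coarse
# version sampled from it, i.e. carrying less information.
fine = [95, 104, 98, 107, 101, 96, 112]
coarse = fine[::3]  # [95, 107, 112]

def crossings(values, break_value):
    """Count how many times the profile crosses a contour break."""
    return sum(
        (a < break_value) != (b < break_value)
        for a, b in zip(values, values[1:])
    )

# At the same 100 m break, the fine data crosses many more times,
# which is why detailed contours have more bends and can even cross
# contours drawn from coarser data.
print(crossings(fine, 100), crossings(coarse, 100))  # → 5 1
```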
Posted 02-18-2021 12:08 PM

POST
Are you referring to composite relationships (where the related table's records are maintained by the relationship) or just simple ones? My guess would be that simple relationships don't affect performance.
Posted 01-27-2021 01:52 PM

POST
I noticed that my last IP address is listed in my profile. Can we turn this off? Even if somehow other people cannot see this, this is not OK in my secure environment. I am sure others have this issue.
Posted 12-23-2020 09:54 AM

POST
Global IDs would not be my first choice for a relationship; they are used in distributed databases and other situations, but they can be hard to work with. Just use a good foreign key for the relationship: one you already have that is unique (on the one side of the relationship) and does not change. A relationship class can help prevent orphans, but you already have the bad data, so you will have to clean that up first. If you set the relationship up to cascade-delete the many-side records when you delete the one related feature, you probably won't get orphans again. But you could lose table records if you accidentally or temporarily delete a feature. I don't have many one-to-many situations.
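The cascade-delete idea can be sketched in plain Python (not arcpy; the features and records are invented): deleting a one-side feature also removes its many-side records, so no orphans remain afterward.

```python
# The "1" side (features) and the "many" side (related table records),
# linked by a foreign key. All names and values are hypothetical.
features = {"F1": "Building A", "F2": "Building B"}
records = [
    {"fk": "F1", "note": "inspection 2019"},
    {"fk": "F1", "note": "inspection 2020"},
    {"fk": "F2", "note": "inspection 2020"},
]

def delete_feature(key):
    """Delete a feature and cascade-delete its related records."""
    features.pop(key, None)
    records[:] = [r for r in records if r["fk"] != key]

delete_feature("F2")

# Nothing on the many side points at a missing feature anymore.
orphans = [r for r in records if r["fk"] not in features]
print(len(orphans))  # → 0
```

The trade-off the post mentions is visible here: the two "F2" rows are gone for good, which is exactly what happens if you delete a feature by accident.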
Posted 11-24-2020 11:47 AM

POST
It sounds like you can join-by-attribute the old property records to the new feature class table, assuming the field used to link the tables is usable. When you start a join from the many side (table), it should find all the one-side (feature) records that match. If the link field cannot be used to match the records, you will have to look at other fields and/or clean up the data until you get a single field that can match old records to new features.
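A minimal plain-Python sketch of the attribute join described above (field names and values are hypothetical): records whose link field matches a feature get the feature's attributes attached; the rest surface as candidates for cleanup.

```python
# One side: features keyed by the link field. Many side: old records.
features = {"P-101": {"shape": "polygon 1"}, "P-102": {"shape": "polygon 2"}}
old_records = [
    {"parcel_id": "P-101", "owner": "Smith"},
    {"parcel_id": "P-102", "owner": "Jones"},
    {"parcel_id": "P-999", "owner": "Lee"},  # no matching feature
]

# Join from the many side: each record picks up its feature's fields.
joined = [
    {**rec, **features[rec["parcel_id"]]}
    for rec in old_records
    if rec["parcel_id"] in features
]

# Records with no match are the ones that need data cleanup.
unmatched = [r for r in old_records if r["parcel_id"] not in features]
print(len(joined), len(unmatched))  # → 2 1
```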
Posted 10-26-2020 10:37 AM

POST
Yes, I have, back in 10.4. Nothing worked (reloading from table, typing/copying in everything in the domain, the GP append/remove value tools). I did not have the same problem with the domain in my file geodatabases. I had to use a new copy of the domain. After over a year the problem went away; I cannot explain why, but I was able to repoint the features to the fixed domain under the original standard domain name. Plus, the latest version of SDSFIE does not need that domain, so I rarely use it. It is possible that a compress or some other big change fixed it and no one noticed.
Posted 10-23-2020 08:58 AM

POST
Read up on Spatial Join: Spatial Join—Help | ArcGIS Desktop. This can add the parcel number data to your other table, if that is what you want. It is available through the ArcMap Join dialog, although you do have to choose to join spatially; join by attribute is the default. There is also a geoprocessing tool that does this and gives you more options.
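A toy plain-Python illustration of what a spatial join does (axis-aligned rectangles stand in for parcel polygons; all numbers are invented): each point receives the parcel number of the polygon that contains it.

```python
# "Parcels" as bounding boxes (xmin, ymin, xmax, ymax); hypothetical data.
parcels = [
    {"parcel_no": "A1", "bbox": (0, 0, 10, 10)},
    {"parcel_no": "B2", "bbox": (10, 0, 20, 10)},
]

def contains(bbox, pt):
    """True if the point falls inside the rectangle."""
    xmin, ymin, xmax, ymax = bbox
    return xmin <= pt[0] < xmax and ymin <= pt[1] < ymax

points = [(3, 4), (15, 2)]

# The spatial join: attach the containing parcel's number to each point.
joined = [
    next((p["parcel_no"] for p in parcels if contains(p["bbox"], pt)), None)
    for pt in points
]
print(joined)  # → ['A1', 'B2']
```

Real spatial joins test against true polygon geometry and offer match options (intersect, within a distance, closest, and so on), but the attach-attributes-by-location idea is the same.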
Posted 08-20-2020 02:57 PM

POST
Topology rules often don't make sense for most of the data, especially when someone at a higher level created them for several departments/organizations. They are just there to force you to look at the data and decide whether there is a real error. I probably have a 500-to-1 exception-to-real-error ratio, but that is what exceptions are for.
Posted 06-09-2020 08:53 AM

POST
I have to work with topologies, not of my design, that generate such errors. There are a huge number of exceptions and overlapping errors, but one of the bad areas that triggers the error is less than 100 square meters (out of a million-plus acres) and has almost no data in it. So I think the topology engine must be producing bad geometry, or it simply can't handle something there. If I watch Task Manager while these bad areas are being validated, I can see ArcGIS slow to almost no read/write activity, but it will continue for at least 10 minutes until it fails if I don't cancel validation. I mapped the bad areas, which I digitized as a few polygons, and now I validate in sections, avoiding them. This works, but even when I have had to rebuild the topology due to feature name changes, the bad areas are always there.
Posted 06-09-2020 08:47 AM

POST
Thanks. I went with 2020 and ArcPro 2.5 because of some important changes in 2.5.
Posted 05-12-2020 04:40 PM

POST
Relative paths would require the data paths to be set up the same way in each user's space, so that usually means the same folder, or a folder under the location of the mxd. Not sure why they can't copy it that way. You could look into Map Packaging, which keeps the links together; however, it is not always reliable or easy to use.
Posted 05-12-2020 04:37 PM

POST
What software are you using? If you can use SDE (enterprise) geodatabases, there is an archiving toolset for tracking the history of feature classes, as well as editor tracking, which operates at the record level. Search for those; they may be available with other software such as Pro.
Posted 05-01-2020 03:26 PM

POST
Been there... I would first focus on organizing better; that is, don't get caught up in whether data is in geodatabase format or shapefiles, for example. You can change the format once you determine what the best data is. Then you have to get current users to use only the best data.

To organize better, consider very general categories. Does some data belong to a certain group, and is that very important to how it is created and used? Are there other general-purpose data sources that don't belong to a group and that most users want to map? For those, I would look at the ISO 19115 topic categories, SDSFIE, or other keyword methods of modeling the world. Metadata often uses these terms: for example, biota/flora/ as a starting point for plant data.

You can keep raw data far down in the hierarchy on the share drive(s) and keep the best data in a geodatabase organized by dataset name. Keep in mind that moving data creates broken links, but moving maps and output does not. We store data, maps, and output separately. That is not a very project-oriented approach (and not one people learn in school); it is a long-term shared data management strategy, which is what it sounds like you need. It took a long time, but I now know where all the current data is and where new data should go.
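As a sketch of the kind of hierarchy described above (every folder name here is invented; the biota/flora path follows the ISO 19115 topic-category idea):

```
share_drive/
  data/
    biota/
      flora/
        raw/              <- raw deliveries stay deep in the tree
    transportation/
  best_data.gdb           <- curated "best" data, organized by dataset name
  maps/                   <- mxds kept separate from data
  output/                 <- exports and PDFs
```

Keeping maps and output outside the data tree is what makes it safe to reorganize them later without breaking data links.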
Posted 04-24-2020 02:18 PM
| Title | Kudos | Posted |
| --- | --- | --- |
| | 1 | 04-13-2022 10:06 AM |
| | 1 | 05-17-2016 09:37 AM |
| | 1 | 12-08-2023 04:22 PM |
| | 1 | 11-06-2023 09:43 AM |
| | 1 | 04-24-2020 02:18 PM |