POST
Some tips from someone who has done a lot of this for SDSFIE. Give the code and description fields the same names in all the tables; then you can use Geoprocessing Results to re-run each import with minimal changes. In Excel, you can make each domain a named range, using the name you will give the domain; this makes viewing and adding domains easier. Once the domains are in, don't export feature classes into the database; importing or copy/pasting instead may keep the domain name from getting "_1" appended, which leaves two near-duplicate domains in your database. If the fields in the feature classes don't have domains assigned to them yet, this should not be a problem. Also, if you are using user names to access your database, choose a generic owner for the domains so they don't end up belonging to someone who leaves.
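A minimal arcpy sketch of the re-runnable import, assuming each domain has been exported from Excel into a geodatabase table named after the domain, and that every table shares the same "Code" and "Description" field names (all paths and names here are hypothetical):

```python
import arcpy
import os

gdb = r"C:\data\master.gdb"                       # hypothetical workspace
domain_names = ["PipeMaterial", "SurfaceType"]    # hypothetical domain/table names

for name in domain_names:
    # Same field names in every table, so only the domain/table name changes per run.
    arcpy.management.TableToDomain(
        in_table=os.path.join(gdb, name),
        code_field="Code",
        description_field="Description",
        in_workspace=gdb,
        domain_name=name,
        domain_description=name,
        update_option="REPLACE")                  # re-running refreshes the domain values
```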
07-10-2017 08:48 AM

POST
Yes, so I no longer use that button. I just add it from the toolbar menu. It still disappears on occasion.
06-23-2017 02:11 PM

POST
When I had a much more complex situation involving products, companies, and articles, I handled obsolete product references by periodically purging some references and changing others I could not purge fully to something generic. In your case, something like "former employee" might work; in my case, I used vendor-independent terms such as "GIS". I recommend running Domain To Table before any purges so you have all the information about the former employee names first. After the purge I ran queries to check that they were indeed gone before removing the products from my file (this was not ArcMap, so my software did not enforce this rule). Another approach is to archive/remove older data periodically and disconnect the archive file from the domains. Bosses tend to like this method because it sounds like streamlining; it makes me nervous because the longer the data has been archived, the farther it may deviate from the current data standards and design, which means it could be hard to get archived data back into usable form if needed.
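A minimal sketch of that snapshot step, assuming a file geodatabase and a coded value domain; the workspace, domain, and field names are hypothetical:

```python
import arcpy

gdb = r"C:\data\master.gdb"                 # hypothetical workspace

# Dump the domain to a standalone table before purging any values,
# so the old employee names are preserved somewhere queryable.
arcpy.management.DomainToTable(
    in_workspace=gdb,
    domain_name="AssignedStaff",            # hypothetical domain of employee names
    out_table=gdb + "\\AssignedStaff_backup",
    code_field="Code",
    description_field="Description")
```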
06-14-2017 02:36 PM

POST
I have this problem constantly, and it is not limited to SDE. I found an answer on the forum quite a while ago: it is more of an interface/display problem than a database issue. Try clearing your map cache and/or switching back and forth between Data and Layout views; this seems to clear out whatever is causing the problem. If you have this problem with a non-OID table such as Excel, try converting it to a geodatabase table so it gets an ObjectID.
06-02-2017 08:07 AM

POST
Thanks, I did find something new while thinking about your question. I believe there used to be a way to set an option to validate immediately while editing, but I doubt it worked with field calculations. First, what I do now: I periodically summarize my domain-enforced fields, join them to an external domain table (exported from the domain), and/or toggle the "Display coded value domain and subtype descriptions" option on and off in a table view. This last method makes the bad values pop out (because they don't change when descriptions are shown), assuming your domain description text is longer than or different from your domain code, which makes them easy to fix. The new thing: http://desktop.arcgis.com/en/arcmap/latest/manage-data/editing-attributes/applying-the-same-attribute-values-to-multiple-features-in-a-layer.htm You can do quick global edits of domain-enforced fields from a drop-down list, without field calculations, by using the Attributes window while editing. Maybe you could train some of your editors on this method. I don't use the Attributes window much except for annotation because I don't like looking at records one at a time, so I might never have found this feature.
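As a rough illustration of the checking step, here is a small arcpy sketch that flags rows whose stored code is not in the field's coded value domain; the feature class, field, and domain names are hypothetical:

```python
import arcpy

gdb = r"C:\data\master.gdb"                 # hypothetical workspace
fc = gdb + "\\Pipes"                        # hypothetical feature class
field_name, domain_name = "Material", "PipeMaterial"

# Collect the valid codes from the coded value domain.
domains = {d.name: d for d in arcpy.da.ListDomains(gdb)}
valid_codes = set(domains[domain_name].codedValues)

# Report rows holding values the domain does not allow.
with arcpy.da.SearchCursor(fc, ["OID@", field_name]) as cursor:
    for oid, value in cursor:
        if value not in valid_codes:
            print("OID {0}: bad value {1!r}".format(oid, value))
```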
03-30-2017 10:17 AM

POST
Any rules that make label placement pickier/harder will help, e.g. use street placement and a small End of Street clearance; remove duplicates; add a hard label buffer; allow little or no overrun of the feature size; give feature weights to other features on your map. I did once get a geocoding dataset, with way too many segments, to produce a nice amount of labeling this way. But the label buffer and feature weights may adversely affect labeling of non-street features. To control more exactly (and more randomly) what percentage of features get labeled, you could use the MOD function in the SQL query of a label class to select only features whose attribute (ObjectID might work) divides evenly by some number. Or use a LIKE select for your labels that picks something meaningless but uncommon in part of an attribute.
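For example, a rough arcpy.mapping sketch (ArcMap 10.x) of thinning labels with a MOD expression on the ObjectID; the map document and layer names are hypothetical, and MOD support depends on the underlying database:

```python
import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\maps\streets.mxd")   # hypothetical map document
lyr = arcpy.mapping.ListLayers(mxd, "Streets")[0]         # hypothetical layer name

if lyr.supports("LABELCLASSES"):
    for label_class in lyr.labelClasses:
        # Label roughly one feature in ten.
        label_class.SQLQuery = "MOD(OBJECTID, 10) = 0"
    lyr.showLabels = True
mxd.save()
```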
03-29-2017 09:13 AM

IDEA
Great idea. I believe the default should be on, though. I have figured out the what and where of hundreds of files just from the GP history. Most of the field calc history is a waste of time to read, but the appends are useful. None of these files had real metadata (documentation), and you can't count on most people to create any metadata, especially non-GIS, non-IT people.
03-24-2017 08:28 AM

IDEA
Yes, oh yes. I export my domains to gdb tables often so I can document them through reports, check data against domains, compare databases, and join them to dirty data to clean up values.
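A minimal sketch of a batch version, exporting every domain in a workspace to its own table so the exports stay current; the path and field names are hypothetical:

```python
import arcpy

gdb = r"C:\data\master.gdb"                          # hypothetical workspace

for domain in arcpy.da.ListDomains(gdb):
    # Make sure the domain name is legal as a table name before exporting.
    out_table = gdb + "\\dom_" + arcpy.ValidateTableName(domain.name, gdb)
    arcpy.management.DomainToTable(
        gdb, domain.name, out_table, "Code", "Description")
    print("Exported " + domain.name)
```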
03-24-2017 08:18 AM

POST
Wow, I will stop complaining about my work computing environment, for a while. Have you tried exporting to PDF with no layers, no georeference data, and RGB color? That should make the file smaller and simpler. Also turn off all memory-hungry software such as Outlook. If you can open the PDF at all, can you save it as a reduced-size PDF? Sometimes we are able to save a reduced copy on our older machines, as the new ones seem to get messed up periodically by updates. We never, ever print maps directly without going through Adobe for tiling. I cannot get our HP plotters to tile PDFs using the Internet job interface. Some people suggest doing each tile as a separate job (horrors) by shifting the extent, but that would create a lot of margins between your sections. Good luck.
03-17-2017 12:01 PM

POST
Re DOD sites, Dropbox and similar sites are verboten at DOD and a lot of other places (I found this out when job searching, since I had a portfolio on one of those systems). There is a similar secure FTP site for DOD; your DOD contacts can give you the URL. The site's limit is 2 GB and 20 files. You do not have to have a DOD credential or email to receive files, but you may have to set up an account to send, since you don't have the credential. Re map packaging, I have found the results are often huge, for a lot of reasons. The Consolidate Map tool is cleaner; it just won't keep the file paths working for the recipient the way a map package does, but it is easy enough for a GIS user to work around the broken links.
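If scripting it, a minimal sketch of the Consolidate Map call with the optional parameters left at their defaults; the paths are hypothetical:

```python
import arcpy

# Copies the map document and its data into a loose folder instead of a .mpk,
# which is usually smaller and easier to inspect; paths are hypothetical.
arcpy.ConsolidateMap_management(
    r"C:\maps\project.mxd",
    r"C:\share\project_consolidated")
```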
03-17-2017 11:36 AM

POST
If one file is a raster, use the Spatial Analyst Extract Values to Points tool. It will produce an output of points with all the original point attributes, plus the raster cell value (optionally interpolated) as the last field.
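A minimal sketch of the tool call, assuming a Spatial Analyst license; the input and output names are hypothetical:

```python
import arcpy
from arcpy.sa import ExtractValuesToPoints

arcpy.CheckOutExtension("Spatial")

# Adds a RASTERVALU field holding the cell value under (or interpolated at) each point.
ExtractValuesToPoints(
    r"C:\data\master.gdb\SamplePoints",      # hypothetical input points
    r"C:\data\elevation.tif",                # hypothetical raster
    r"C:\data\master.gdb\SamplePoints_elev", # output points
    "INTERPOLATE")                           # optional: interpolate from nearby cells
```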
03-14-2017 09:12 AM

POST
Try searching for "Daltonize." These tools let you see what your image would look like to someone with red-green or another form of color blindness. It is not just about color choices or ramps; it is about contrast and which colors abut each other. The tools might be able to directly fix a JPEG, but they don't work with ArcMap. Section 508 compliance is way more subtle and complex than any one app can address. (You could say that about so many things.) I took a very time-consuming 1-unit class in web compliance several years ago. There are lots of things to look at, but a compliant site is better in many ways, such as being easier to find, search, and maintain.
03-13-2017 09:27 AM

POST
Yes, but with a huge caveat. Most of my maps in folders named "older" have broken links, and they open OK, albeit extremely slowly if the links were to SDE. However, much of the symbology, definition queries, and labeling rules may not be visible until you fix the links, and you may lose some of it. If you had created any .lyr files in 10.3, they usually work in 10.2, but not always.
02-16-2017 04:02 PM

POST
Not sure about some of your attempts or exactly what you want. As far as getting two data frames to show the same object, bookmarks belong to each data frame: you have to save one out to disk, then import it into the second data frame using Bookmark Manager. I just tried this, and it did adjust the scale on the new data frame so the bookmarked area showed in the center. As for symbols, the best way to handle them on inset maps is to use a reference scale for the data frame. If the reference scale is, say, 30 percent larger than the actual scale of the data frame, the symbols should all adjust (look smaller), assuming you keep the default on your layers to scale symbology when a reference scale is set. Labels won't scale, but annotation should.
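A rough arcpy.mapping sketch (ArcMap 10.x) of setting the inset data frame's reference scale about 30 percent larger (i.e., a smaller scale denominator) than its drawn scale; the document and data frame names are hypothetical:

```python
import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\maps\project.mxd")   # hypothetical map document
inset = arcpy.mapping.ListDataFrames(mxd, "Inset")[0]     # hypothetical data frame name

# A reference scale with a smaller denominator than the drawn scale
# makes symbols and annotation draw smaller in the inset.
inset.referenceScale = inset.scale / 1.3
mxd.save()
```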
02-16-2017 03:52 PM

POST
I believe this came in at about 10.3. I think the default was to show them before that. I love those prefixes when I am comparing data or copying from field to field.
02-03-2017 04:01 PM