POST
Sam, I can hardly read the screenshot you posted. Most forums, and this one seems to as well, resize screenshots to a predefined maximum size. It is wise to stay below that limit to prevent resizing, for example by selecting only the error window in an image editor (e.g. Windows Paint) and saving that to a file before attaching it to a post. However, from what I can see, it is clear that the Multiple Element Layout Manager (which I am not yet familiar with; is it a user-submitted add-on for ArcGIS?) expects an extra field in your DDP index layer; if I read it right, it must be named "pageNameTimID". You may be able to solve the issue by adding such a field, although it would be wise to consult the documentation, if any, to find out which fields, and which field types, this application requires. Marco
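If you decide to try adding the field, a minimal arcpy sketch might look like the following. Only the field name "pageNameTimID" comes from the error message; the field type, length, and layer path are my assumptions, so check the add-on's documentation first.

```python
# Hedged sketch: add the field the Multiple Element Layout Manager
# seems to expect. Field type and length are assumptions.
try:
    import arcpy  # only available inside an ArcGIS Python environment
except ImportError:
    arcpy = None

def add_field_args(layer, field="pageNameTimID", ftype="TEXT"):
    # Pure helper: the arguments we would pass to AddField_management.
    return (layer, field, ftype)

def add_page_name_field(layer):
    """Add the expected text field to the DDP index layer (sketch)."""
    layer, field, ftype = add_field_args(layer)
    if arcpy is None:
        return False
    arcpy.AddField_management(layer, field, ftype, field_length=50)
    return True
```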
Posted 11-09-2012 01:10 PM | 0 | 0 | 4250

POST
If you really don't want to use Python, I guess the answers to 1) and 2) are to first create the DDP index layer, then review manually which pages need to be portrait or landscape, and which ones need insets. Add this data as extra attributes to each index polygon in your DDP index layer. Now create the four ArcMap documents you need:
- Portrait / Inset
- Portrait
- Landscape / Inset
- Landscape
Alternatively, you might be able to get by with just a Portrait and a Landscape layout MXD, placing the insets outside the layout page and only manually shifting them into place (possibly using snap Guides) when you need them. I have done that with different legend types, reducing the need to create multiple MXDs while still using DDP. Now use the added attributes in the DDP index layer to select the appropriate index polygons to process by setting a Definition Query in the layer's properties, and then set up the DDP again in ArcMap so that it only reflects the selected pages. As for 3), this is simple: just add the fixed text outside the dynamic text tags, e.g.: Distribution map of <ITA><dyn type="page" property="Description"/></ITA> The ITA tag is for italic style, by the way, but I guess you knew that already.
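To make the bookkeeping concrete, here is a small, hypothetical sketch of that attribute scheme; the field names (`ORIENT`, `INSET`) and codes are made up for illustration and are not part of any ArcGIS standard.

```python
# Hypothetical attribute scheme: ORIENT is 'P' or 'L', INSET is 0 or 1.
def layout_for_page(orient, has_inset):
    """Which of the four MXD layouts a page should use."""
    base = "Portrait" if orient == "P" else "Landscape"
    return base + (" / Inset" if has_inset else "")

def definition_query(orient, has_inset):
    """Definition Query selecting only the pages for one layout."""
    return "ORIENT = '{0}' AND INSET = {1}".format(
        orient, 1 if has_inset else 0)
```

You would paste the string from `definition_query(...)` into the index layer's Definition Query before refreshing the DDP setup for each of the four layouts.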
Posted 11-08-2012 10:01 AM | 0 | 0 | 4250

POST
You may also be able to solve this last issue by setting the "Resample during display using" option on the layer properties' "Display" tab to bilinear interpolation or cubic convolution instead of the default nearest neighbour resampling. This will interpolate values during display and zooming, giving smoother results.
Posted 11-06-2012 10:55 PM | 0 | 0 | 3091

POST
Thanks for the feedback! That's good to know and I'll explore more into this. Jason

Hi Jason, Although Eric is right that, from a purely practical point of view, "if the data meets the assumptions ... kriging should work", I would really like to emphasize the small caveat Eric placed in between these quotes, which I hereby rephrase: "Does it make sense to interpolate your particular dataset???" People all too easily grab the first available tool that seems suited, while there may be better, and from a scientific point of view sounder, methods to accomplish what you want, e.g. creating a map of potential customers. Interpolation always seems attractive when you have a point dataset, as it is an "easy" way to get a surface, but is it sound from a logical and scientific point of view? I can't answer that question for you, but you might be able to, based on your knowledge and questions like the one below. E.g., do you find it likely that two people nearby have a higher likelihood of buying a certain product than two people further apart (spatial autocorrelation)? Does this almost always hold? (It seems unlikely with market research data, where two neighbours may have completely different social backgrounds and status, but for good continuous environmental datasets it usually does.) My gut feeling says that this type of "social" data is not the best suited for interpolation, and that you should be careful in your decision to use it and in interpreting the results. You really shouldn't ignore general (multivariate) statistical analysis, such as can be done with general statistics packages like NCSS and SPSS, and then use any statistical relationships you find to classify other datasets with your market data to answer your question of "where potential customers may be". Yes, it is more laborious than the "one-stop point-to-surface option" that interpolation seems to offer, but it might be more sound from a scientific and statistical point of view.
That is all not to say it can't be done, just that you should be careful in your decision and not ignore other options. And ESRI of course has "Business Analyst". Maybe time to look at that? Marco
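For a quick, informal check of the spatial-autocorrelation assumption before committing to kriging, a global Moran's I on the point values is one option. Below is a minimal pure-Python sketch with inverse-distance weights and no significance testing, for toy data only; Geostatistical Analyst and the Spatial Statistics toolbox provide proper implementations.

```python
import math

def morans_i(points, values):
    """Global Moran's I with inverse-distance weights (toy sketch).
    points: list of (x, y); values: list of numbers.
    I noticeably above 0 suggests spatial autocorrelation; near or
    below 0 suggests interpolation assumptions are doubtful."""
    n = len(values)
    mean = sum(values) / float(n)
    dev = [v - mean for v in values]
    num = 0.0   # weighted cross-products of deviations
    wsum = 0.0  # sum of all weights
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = math.hypot(points[i][0] - points[j][0],
                           points[i][1] - points[j][1])
            w = 1.0 / d  # inverse-distance weight (assumes no duplicates)
            num += w * dev[i] * dev[j]
            wsum += w
    denom = sum(d * d for d in dev)
    return (n / wsum) * (num / denom)
```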
Posted 10-24-2012 06:29 AM | 0 | 0 | 933

POST
Thanks Jeff, In my case, I finally decided to set the visibility of the layers on/off depending on the data driven page index, as this did the job I wanted for this particular case. It took some time to figure out how to implement this based on Python example scripts I found in the Help. I still think it would be nice if more of the DDP functionality were exposed through ArcToolbox and thus directly usable in ModelBuilder. Quite a lot of people are still unfamiliar with Python, and learning a new programming language can be a long story, especially for those without any previous programming experience, or who simply have difficulty adjusting to programming-type tasks. I am not one of them, but I have seen other people struggle with any kind of programming-style "logic". Connecting things up in a graphical designer like ModelBuilder is just more accessible for some. Marco
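For anyone landing here later, this is roughly the shape of that approach with `arcpy.mapping` at 10.x. The layer-to-page mapping is a made-up example; the pure helper keeps the visibility rule testable outside ArcGIS.

```python
try:
    import arcpy  # ArcGIS 10.x arcpy.mapping; absent outside ArcGIS
except ImportError:
    arcpy = None

# Assumed example: which layer names are visible on which page names.
VISIBLE_ON = {"Overview": {"page1", "page2"}, "Detail": {"page2"}}

def is_visible(layer_name, page_name, mapping=VISIBLE_ON):
    # Pure decision rule, kept separate so it can be tested anywhere.
    return page_name in mapping.get(layer_name, set())

def export_pages(mxd_path, out_dir):
    """Loop the DDP pages, toggling layer visibility per page."""
    if arcpy is None:
        return
    mxd = arcpy.mapping.MapDocument(mxd_path)
    ddp = mxd.dataDrivenPages
    for page_id in range(1, ddp.pageCount + 1):
        ddp.currentPageID = page_id
        page = ddp.pageRow.getValue(ddp.pageNameField.name)
        for lyr in arcpy.mapping.ListLayers(mxd):
            lyr.visible = is_visible(lyr.name, page)
        arcpy.mapping.ExportToPDF(mxd, "{0}/{1}.pdf".format(out_dir, page))
```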
Posted 09-26-2012 05:11 AM | 0 | 0 | 1131

POST
From what I have seen up to now, if you want to stay within the realm of the DDP functionality and be able to use the standard tools and toolbar to accomplish what you want, there is no real option other than creating *multiple* parcel polygons for the same lot, each having a single attribute of the type and content you want (parcel number, owner, etc.). This is not an ideal situation, because it will mean having overlapping polygons, but you might automate the maintenance of the DDP polygon index layer by using ModelBuilder tools to stack polygons based on the same lot and add the attributes from whatever source you are using, and then schedule an update of the layer each week or day or so... The problem is, it seems ESRI hasn't yet exposed much of the DDP functionality through Python, except the most basic functions. Hence some things, like the "Page Definition" on a layer, aren't available programmatically. It seems we need to wait for ESRI to upgrade the DDP functionality and expose more of it through programming interfaces like Python.
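As a sketch of the "stacked records" idea in pure Python (field names are illustrative; in ModelBuilder you would get the same effect with joins and appends):

```python
def stack_parcels(geom_id, attributes):
    """One output record per attribute, all sharing the same lot
    geometry -- i.e. the overlapping stacked polygons described above.
    attributes: dict such as {"PARCEL_NO": "12-34", "OWNER": "Smith"}."""
    return [{"geom_id": geom_id, "field": k, "value": v}
            for k, v in sorted(attributes.items())]
```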
Posted 09-21-2012 04:37 AM | 0 | 0 | 856

POST
Hi all, I need to dynamically set the "Page Definition" query of multiple layers based on a ModelBuilder variable retrieved from a field using an iterator. Each value in the field thus must lead to its own query, which has to be set as the Page Definition of an exported layer. However, looking around in ArcToolbox and the arcpy layer properties, it seems the "Page Definition" isn't actually exposed anywhere except through the user-operated dialogs of ArcGIS itself?? I may well be overlooking something, but are there really no options to programmatically set the DDP "Page Definition" through Python scripting that can be integrated with a ModelBuilder model? Marco
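For what it's worth, the ordinary `definitionQuery` property *is* exposed on `arcpy.mapping` layers at 10.x, so one hedged workaround is to drive it yourself inside a page loop. This only approximates a real Page Definition; the field name `PAGE_FIELD` and paths below are placeholders.

```python
try:
    import arcpy  # ArcGIS 10.x; absent outside ArcGIS
except ImportError:
    arcpy = None

def page_query(field, value):
    # Pure helper: per-page WHERE clause (field name is an assumption).
    return "{0} = '{1}'".format(field, value)

def export_with_queries(mxd_path, layer_name, field, out_dir):
    """Reset definitionQuery (which is exposed) before each page
    export, approximating a per-page Page Definition."""
    if arcpy is None:
        return
    mxd = arcpy.mapping.MapDocument(mxd_path)
    ddp = mxd.dataDrivenPages
    lyr = arcpy.mapping.ListLayers(mxd, layer_name)[0]
    for pid in range(1, ddp.pageCount + 1):
        ddp.currentPageID = pid
        name = ddp.pageRow.getValue(ddp.pageNameField.name)
        lyr.definitionQuery = page_query(field, name)
        arcpy.mapping.ExportToPDF(mxd, "{0}/{1}.pdf".format(out_dir, name))
```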
Posted 09-19-2012 12:57 PM | 0 | 2 | 1296

POST
Then when you do detrending, you're removing even more spatial autocorrelation, and there's nothing left to perform kriging on. Are you sure your data needed detrending? This is my interpretation too. It looks like the data doesn't need detrending. Try to read some of the documentation to understand what detrending is about and when there is, or isn't, a need to apply it. You shouldn't just use every tool or option available in Geostatistical Analyst because it is there; you need to apply them based on knowledge of your dataset and the statistics provided by Geostatistical Analyst. Simply viewing the dataset's values symbolized with an appropriate legend in ArcMap should tell you a lot about any possible need to detrend the data.
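As an informal illustration of "look before you detrend": fit a simple trend and see how much variance it actually explains. A toy one-dimensional ordinary-least-squares sketch (Geostatistical Analyst's trend analysis is the proper tool); if R² is near zero, detrending likely removes nothing useful.

```python
def trend_r2(xs, zs):
    """R^2 of a simple linear trend z = a + b*x (toy 1-D check)."""
    n = len(xs)
    mx = sum(xs) / float(n)
    mz = sum(zs) / float(n)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxz = sum((x - mx) * (z - mz) for x, z in zip(xs, zs))
    szz = sum((z - mz) ** 2 for z in zs)
    if sxx == 0 or szz == 0:
        return 0.0  # no spread in x, or perfectly flat data: no trend
    return (sxz * sxz) / (sxx * szz)
```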
Posted 07-11-2012 10:52 PM | 0 | 0 | 1271

POST
I tried the simple kriging model as you suggested, but can I assume that the mean/trend is known and constant for my data? Thanks again! Kirsten This is something only you can answer with your expert and field knowledge of the terrain / research site. There are statistical tools in Geostatistical Analyst that will tell you whether some assumptions are more or less reasonable or appropriate based on statistics, but in the end it is you who needs to decide which models or assumptions are best for your data.
Posted 07-11-2012 10:39 PM | 0 | 0 | 1208

POST
Are you using Background Processing? Processing in the background means sending the features you want to process over to another process and then getting a result back. Because it's another process, we have to persist them to disk for ArcMap to read them on the way back. Why oh why didn't ESRI document this crucial factor in getting in_memory to work in the Help file??? I really cannot find it in the main in_memory Help topic. This issue regarding background processing should be added under the "The following considerations must be made in deciding to write output to the in-memory workspace" text. I have been struggling with this issue too.
Posted 06-20-2012 06:12 AM | 0 | 0 | 1697

POST
You may also wish to have a look at this recent thread, as it discusses issues and solutions regarding 32-bit / 64 bit memory space and Python/ArcGIS too: Large Dictionary Compression? http://forums.arcgis.com/threads/58348-Large-Dictionary-Compression?highlight=32+bit+stream+network Marco
Posted 06-20-2012 06:05 AM | 0 | 0 | 1638

POST
Some geoprocessing tools cannot take in_memory datasets as inputs/outputs. Yes, I was aware of that possibility. However, I still cannot get the Con (conditional) tool of Spatial Analyst to accept the extent set on the "env" environment object, despite it being listed among the accepted "Environments" in the ArcGIS Desktop 10 Help. I clearly see it being set while debugging, as the correct extent with the correct XMin, YMin, XMax, YMax is visible in PyScripter, but as soon as I use the Con tool in the script, the output uses the input raster extent instead of the extent set in the environment settings. The mask setting is properly used as long as the dataset is physically written to disk, as I wrote before.
Posted 06-09-2012 05:31 AM | 0 | 0 | 850

POST
I don't know if it has anything to do with it, but I now noticed that only when I specifically write the mask file to disk, it is properly used as a Mask environment setting. I had been using the "in_memory" option for storing the polygon representing the mask instead of physically writing it to disk. Maybe there is an undocumented limitation with using "in_memory" datasets as environment settings for analysis, and you can't use them for that directly...? Anyone else experience this issue?
Posted 06-03-2012 08:57 AM | 0 | 0 | 850

POST
Hi all, I am trying to set some environment settings through Python. When I look in Debug mode in the PyScripter IDE at all environment variables (launched from ArcMap by right-clicking the script and choosing "Debug"), I see my Extent and Mask settings correctly applied to the GPEnvironment "env" object; the correct coordinates and path to the mask file are visible. However, when I look at the final result of a Spatial Analyst "Con" conditional operation, the resulting raster doesn't honor the Mask set for the environment in the script in which the Con command is used. According to the Desktop Help, the Con command should honor the Mask setting... In addition, the extent of the resulting raster, when viewed in PyScripter's debugging mode, is incorrect too: it is the extent of the original raster, not the extent set dynamically in the Python code of the script. Any clues as to why this is happening? ArcGIS 10 SP4, by the way. Marco
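For reference, this is roughly the setup in question (paths are placeholders). Later posts in this thread suggest that writing the mask to disk rather than using in_memory, and checking out Spatial Analyst, both matter.

```python
try:
    import arcpy  # ArcGIS 10.x with Spatial Analyst; absent otherwise
    from arcpy.sa import Con, Raster
except ImportError:
    arcpy = None

def extent_string(extent):
    # Pure helper: "XMin YMin XMax YMax", the string form env.extent accepts.
    return " ".join(str(v) for v in extent)

def run_con(in_raster, mask_path, extent, out_path):
    """Set env mask/extent, then run Con (sketch of the failing setup)."""
    if arcpy is None:
        return
    arcpy.CheckOutExtension("Spatial")
    arcpy.env.mask = mask_path          # a dataset on disk, not in_memory
    arcpy.env.extent = arcpy.Extent(*extent)  # (XMin, YMin, XMax, YMax)
    result = Con(Raster(in_raster) > 0, 1, 0)
    result.save(out_path)
```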
Posted 06-03-2012 05:19 AM | 0 | 3 | 1300

POST
Did you check out the Spatial Analyst extension? Hi Luke, Just before you posted, I found a similar topic regarding licensing of ArcGIS levels and extensions in the Desktop 10 Help: Accessing licenses and extensions in Python http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#//002z0000000z000000 As you suspected, I had not checked out the Spatial Analyst extension, as I wasn't aware I needed to do this. I have now changed the code with the help of the examples, and it works properly! I can see the Raster object listed in PyScripter's debugging mode and variables window now. Very nice and useful. Thanks for the response. Marco
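For anyone hitting the same thing, the pattern from the Help topic above boils down to checking availability before checkout. A small sketch that returns a status string and degrades gracefully when arcpy is absent:

```python
try:
    import arcpy  # only inside an ArcGIS Python environment
except ImportError:
    arcpy = None

def checkout_spatial():
    """Check out Spatial Analyst before using any sa tools."""
    if arcpy is None or arcpy.CheckExtension("Spatial") != "Available":
        return "Unavailable"
    return arcpy.CheckOutExtension("Spatial")  # "CheckedOut" on success
```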
Posted 06-02-2012 01:45 AM | 0 | 0 | 625