POST
Thanks for the feedback! That's good to know and I'll explore more into this. Jason

Hi Jason,

Although Eric is right that from a purely practical point of view "if the data meets the assumptions ... kriging should work", I would really like to emphasize the small caveat Eric placed in between these quotes, which I hereby rephrase: "Does it make sense to interpolate your particular dataset?" People all too easily grab the first available tool that seems suited, while there may be better, and from a scientific point of view sounder, methods to accomplish what you want, e.g. creating a map of potential customers. Interpolation always seems attractive when you have a point dataset, as it is an "easy" way to get a surface, but is it sound from a logical and scientific point of view? I can't answer that question for you, but you may be able to, based on your knowledge and on questions like the ones below. E.g. do you find it likely that two people nearby have a higher likelihood of buying a certain product than two people further apart (spatial autocorrelation)? Does this almost always hold? (This seems unlikely with market research data, as two neighbours may have a completely different social background and status, but for good continuous environmental datasets it usually does.)

My gut feeling says that this type of "social" data is not the best suited for interpolation, and that you should be careful in your decision to use it and in interpreting the results. You really shouldn't ignore general (multivariate) statistical analysis, as can be done with general statistics packages like NCSS and SPSS, and then use any statistical relationships you find to classify other datasets with your market data, to answer your question of "where potential customers may be". Yes, it is more laborious than the "one-stop point-to-surface option" that interpolation seems to offer, but it might be more sound from a scientific and statistical point of view.

That is all not to say it can't be done, just that you should be careful in your decision and not ignore other options. And ESRI of course has "Business Analyst". Maybe time to look at that? Marco
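To put the spatial autocorrelation question on a slightly more formal footing: a quick global Moran's I on the point values will tell you whether nearby values actually resemble each other. Below is a minimal plain-Python sketch, not an ArcGIS tool, using hypothetical sample data and simple binary neighbour weights (1 if two points are within a cutoff distance, 0 otherwise):

```python
import math

def morans_i(coords, values, cutoff):
    """Global Moran's I with binary weights: w_ij = 1 if 0 < dist(i, j) <= cutoff.
    Positive I: nearby values are similar; negative I: nearby values alternate."""
    n = len(values)
    mean = sum(values) / float(n)
    z = [v - mean for v in values]          # deviations from the mean
    num = 0.0                                # sum of w_ij * z_i * z_j, i != j
    wsum = 0.0                               # sum of all weights, W
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = math.hypot(coords[i][0] - coords[j][0],
                           coords[i][1] - coords[j][1])
            if d <= cutoff:
                num += z[i] * z[j]
                wsum += 1.0
    den = sum(zi * zi for zi in z)
    return (n / wsum) * (num / den)

# Hypothetical points along a line: clustered values vs. alternating values.
line = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(morans_i(line, [1, 1, 5, 5], 1.0))  # positive: neighbours are similar
print(morans_i(line, [1, 5, 1, 5], 1.0))  # negative: neighbours alternate
```

If market data shows an I close to zero, that is a strong hint interpolation is not the right tool for it.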
Posted 10-24-2012 06:29 AM

POST
Thanks Jeff,

In my case, I finally decided to set the visibility of the layers on/off depending on the data driven page index, as this did the job I wanted it to do for this particular case. It took some time to figure out how to implement this based on Python example scripts I found in the Help.

I still think it would be nice if more of the DDP stuff were exposed through ArcToolbox and thus directly usable in ModelBuilder. Quite a lot of people are still unfamiliar with Python, and learning a new programming language can be a long story, especially for those without any previous programming experience, or with difficulty adjusting to programming-type tasks. I am not one of them, but I have seen other people struggle with any kind of programming-style "logic". Connecting things up in a graphical designer like ModelBuilder is just more accessible for some. Marco
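In case it helps anyone later, here is a condensed sketch of that visibility-per-page approach, assuming ArcGIS 10.x `arcpy.mapping`. The `PAGE_LAYERS` mapping, layer names, and paths are hypothetical placeholders, not taken from the actual model:

```python
# Hypothetical mapping: which layers should be visible on which DDP page name.
PAGE_LAYERS = {
    "North": ["Parcels_North", "Roads"],
    "South": ["Parcels_South", "Roads"],
}

def should_be_visible(layer_name, page_name, page_layers=PAGE_LAYERS):
    """Pure decision logic: a layer is visible only if listed for this page."""
    return layer_name in page_layers.get(page_name, [])

def export_pages(mxd_path, out_folder):
    """Toggle layer visibility per DDP page and export; requires ArcGIS 10.x."""
    import arcpy  # imported lazily so the helper above works without ArcGIS
    mxd = arcpy.mapping.MapDocument(mxd_path)
    ddp = mxd.dataDrivenPages
    for page_id in range(1, ddp.pageCount + 1):
        ddp.currentPageID = page_id
        page_name = ddp.pageRow.getValue(ddp.pageNameField.name)
        for lyr in arcpy.mapping.ListLayers(mxd):
            lyr.visible = should_be_visible(lyr.name, page_name)
        arcpy.mapping.ExportToPDF(mxd, "%s/page_%s.pdf" % (out_folder, page_name))
    del mxd
```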
Posted 09-26-2012 05:11 AM

POST
From what I have seen up to now, if you want to stay within the realm of the DDP functionality and be able to use the standard tools and toolbar to accomplish what you want, there is no real option other than creating *multiple* parcel polygons for the same lot, each having a single attribute of the type and content you want (parcel number, owner, etc.). This is not an ideal situation, because it will mean having overlapping polygons, but you might automate the maintenance of the DDP polygon index layer by using ModelBuilder tools to stack polygons based on the same lot and add the attributes from whatever source you are using, and then schedule the layer to update each week or day or so...

Problem is, it seems ESRI hasn't yet exposed much of the DDP functionality through Python, except the most basic functions. Hence some stuff, like the "Page Definition" on a layer, isn't available programmatically. It seems we need to wait for ESRI to upgrade the DDP functionality and expose more of it through programming interfaces like Python.
Posted 09-21-2012 04:37 AM

POST
Hi all,

I need to dynamically set the "Page Definition" query of multiple layers based on a ModelBuilder variable retrieved from a field using an iterator. Each value in the field must thus lead to its own query, which has to be set as the Page Definition of an exported layer. However, looking around in ArcToolbox and the arcpy layer properties, it seems the "Page Definition" isn't actually exposed anywhere except through the user-operated dialogs of ArcGIS itself? I may well be overlooking something, but are there really no options to programmatically set the DDP "Page Definition" through Python scripting that can be integrated with a ModelBuilder model? Marco
Posted 09-19-2012 12:57 PM

POST
Then when you do detrending, you're removing even more spatial autocorrelation, and there's nothing left to perform kriging on. Are you sure your data needed detrending?

This is my interpretation too. It looks like the data doesn't need detrending. Try to read some of the documentation to understand what detrending is about and when there is, or isn't, a need to apply it. You shouldn't use every tool or option available in Geostatistical Analyst just because it is there; you need to apply them based on knowledge of your dataset and the statistics provided by Geostatistical Analyst. Simply viewing the dataset's values symbolized with an appropriate legend in ArcMap should tell you a lot about a possible need to detrend the data.
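To make that concrete: a first-order trend is just a least-squares plane (or, in one dimension, a line) fitted through the values; detrending subtracts it and kriging is then performed on the residuals. A minimal 1-D sketch with hypothetical sample data — if the residuals look essentially like the original data, there was no trend worth removing:

```python
def detrend_1d(xs, ys):
    """Fit y = a + b*x by ordinary least squares; return (residuals, a, b)."""
    n = len(xs)
    mx = sum(xs) / float(n)
    my = sum(ys) / float(n)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))   # slope
    a = my - b * mx                           # intercept
    residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
    return residuals, a, b

# Hypothetical strongly trended data: the fitted slope is close to 2,
# and the residuals (what kriging would work on) are near zero.
res, a, b = detrend_1d([0, 1, 2, 3], [1.0, 3.1, 4.9, 7.0])
print(b)
print(res)
```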
Posted 07-11-2012 10:52 PM

POST
I tried the simple kriging model as you suggested, but can I assume that the mean/trend is known and constant for my data? Thanks again! Kirsten

This is something only you can answer, with your expert and field knowledge of the terrain / research site. There are statistical tools in Geostatistical Analyst that will tell you whether some assumptions are more or less reasonable or appropriate based on statistics, but in the end it is you who needs to decide which models or assumptions are best for your data.
Posted 07-11-2012 10:39 PM

POST
Are you using Background Processing? Processing in the background means sending the features you want to process over to another process and then getting a result back. Because it's another process, we have to persist them to disk for ArcMap to read them on the way back.

Why, oh why, didn't ESRI document this crucial factor in getting in_memory to work in the Help file? I really cannot find it in the main in_memory Help topic. This issue regarding background processing should be added under the "The following considerations must be made in deciding to write output to the in-memory workspace" text line. I have been struggling with this issue too.
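Given that behaviour, one defensive pattern is to decide the output location up front: use in_memory only when you know the script runs in the foreground, and fall back to a disk workspace otherwise. A small sketch; the `use_in_memory` flag and the scratch folder are placeholders you would set yourself:

```python
import os

def output_path(name, use_in_memory, scratch_folder):
    """Return an in_memory path for foreground runs, a disk path otherwise.
    Background geoprocessing persists results to disk anyway, so writing
    to disk explicitly at least keeps the behaviour predictable."""
    if use_in_memory:
        return "in_memory/%s" % name
    return os.path.join(scratch_folder, name + ".shp")

print(output_path("mask_poly", True, r"C:\scratch"))   # in_memory path
print(output_path("mask_poly", False, r"C:\scratch"))  # on-disk path
```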
Posted 06-20-2012 06:12 AM

POST
You may also wish to have a look at this recent thread, as it discusses issues and solutions regarding 32-bit / 64-bit memory space and Python/ArcGIS too: Large Dictionary Compression? http://forums.arcgis.com/threads/58348-Large-Dictionary-Compression?highlight=32+bit+stream+network Marco
Posted 06-20-2012 06:05 AM

POST
Some geoprocessing tools cannot take in_memory datasets as inputs/outputs.

Yes, I was aware of that possibility. However, I still cannot get the Con (conditional) tool of Spatial Analyst to accept the extent set on the "env" environment object, despite extent being listed among the accepted "Environments" in the ArcGIS Desktop 10 Help. I clearly see it being set while debugging, as the correct extent with the correct XMin, YMin, XMax, YMax is visible in PyScripter, but as soon as I use the Con tool in the script, the output uses the input raster's extent instead of the extent set in the environment settings. The mask setting is properly used as long as the dataset is physically written to disk, as I wrote before.
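For reference, a minimal sketch of the pattern being described: set the environment before calling Con, with the mask written to disk rather than held in_memory. This assumes ArcGIS 10.x with Spatial Analyst; the raster and mask paths are hypothetical. Note that `arcpy.env.extent` also accepts a space-separated "XMin YMin XMax YMax" string, which is what the helper below builds:

```python
def extent_string(xmin, ymin, xmax, ymax):
    """Build the 'XMin YMin XMax YMax' string form accepted by arcpy.env.extent."""
    return "%s %s %s %s" % (xmin, ymin, xmax, ymax)

def run_con(in_raster, mask_on_disk, out_raster, xmin, ymin, xmax, ymax):
    """Run Con with explicit extent and mask environments (requires ArcGIS +
    Spatial Analyst; arcpy imported lazily so the helper above stays testable)."""
    import arcpy
    from arcpy.sa import Con, Raster
    arcpy.CheckOutExtension("Spatial")
    arcpy.env.extent = extent_string(xmin, ymin, xmax, ymax)
    arcpy.env.mask = mask_on_disk   # a dataset physically on disk, not in_memory
    result = Con(Raster(in_raster) > 0, 1, 0)
    result.save(out_raster)
```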
Posted 06-09-2012 05:31 AM

POST
I don't know if it has anything to do with it, but I have now noticed that the mask is only properly used as a Mask environment setting when I specifically write the mask file to disk. I had been using the "in_memory" option for storing the polygon representing the mask instead of physically writing it to disk. Maybe there is an undocumented limitation on using "in_memory" datasets as environment settings for analysis, and you can't use them for that directly...? Has anyone else experienced this issue?
Posted 06-03-2012 08:57 AM

POST
Hi all,

I am trying to set some environment settings through Python. When I look in Debug mode in the PyScripter IDE at all environment variables (launched from ArcMap by right-clicking the script and choosing "Debug"), I see my Extent and Mask settings correctly being applied to the GPEnvironment "env" object; the correct coordinates and path to the mask file are visible. However, when I look at the final result of a Spatial Analyst "Con" conditional operation, the resulting raster doesn't honor the Mask set for the environment in the script in which the Con command is used. According to the Desktop Help, the Con command should honor the Mask setting... In addition, the extent of the resulting raster, when viewed in PyScripter's debugging mode, is incorrect too: it is the extent of the original raster, not the extent set dynamically in the Python code of the script. Any clues as to why this is happening? ArcGIS 10 SP4, by the way. Marco
Posted 06-03-2012 05:19 AM

POST
Did you check out the Spatial Analyst extension?

Hi Luke,

Just before you posted, I found a similar topic regarding licensing of ArcGIS levels and extensions in the Desktop 10 Help: Accessing licenses and extensions in Python http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#//002z0000000z000000 Like you suggested, I had not checked out the Spatial Analyst extension, as I wasn't aware I needed to do this. I have now changed the code with the help of the examples, and it works properly! I can see the Raster object listed now in PyScripter's debugging mode and variables window. Very nice and useful. Thanks for the response. Marco
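For anyone hitting the same "tool is not licensed" error when running outside ArcMap: `arcpy.CheckOutExtension` returns a status string ("CheckedOut", "Unavailable", "NotInitialized", ...), so you can guard on it before using any Spatial Analyst tools. A sketch of the pattern, with the actual analysis elided:

```python
def license_ok(status):
    """Interpret the status string returned by arcpy.CheckOutExtension:
    only 'CheckedOut' means the extension is available for use."""
    return status == "CheckedOut"

def with_spatial_analyst():
    """Check the extension out, use it, check it back in (requires ArcGIS)."""
    import arcpy  # lazy import keeps license_ok testable without ArcGIS
    status = arcpy.CheckOutExtension("Spatial")
    if not license_ok(status):
        raise RuntimeError("Spatial Analyst license unavailable: %s" % status)
    try:
        from arcpy.sa import Raster
        # ... Spatial Analyst work here ...
        pass
    finally:
        arcpy.CheckInExtension("Spatial")
```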
Posted 06-02-2012 01:45 AM

POST
Hi all,

Just starting with Python, I decided it was time to see if I could get debugging functional in a Python IDE, to ease coding and debugging. Since I couldn't get PythonWin to function as a debugger for ArcGIS, even after specifically setting it as the debugging application in ArcGIS 10's Geoprocessing Options, I decided to download PyScripter instead, as I saw it mentioned on another forum. That works! I could launch it from ArcGIS by right-clicking a script and choosing "Debug". It properly loads and executes, and nicely shows all content and properties of the variables and functions used in the script. However, as soon as I use a Spatial Analyst command (which is licensed, and I have run the same script successfully from ArcGIS), the debugger stops at that point stating "The tool is not licensed", as visible in the PyScripter "Python Interpreter" window. Is there anything else I need to do to make an external Python IDE like PyScripter aware of the available licenses on extensions like Spatial Analyst? Any specific settings to be made, or arcpy modules to import? Any suggestions welcome! Marco
Posted 06-02-2012 12:18 AM

POST
Thanks Curtis, this is very useful information, certainly a help for future development with ModelBuilder.
Posted 05-21-2012 10:15 PM

POST
This is the way to do it. It seems elaborate, but there is no other way, as ModelBuilder accesses those variables accessed with "%" syntax as strings -- not objects. The commas separating arguments are Python syntax; there is no way around that.

Thanks Curtis, it is appreciated. This is a clear explanation, and I was suspecting something like that. As I am just starting with Python, it is all a bit new to me, and getting to know these specific issues and how to handle them takes time.

The way this all works is that the input expression to Calculate Value is interpreted by ModelBuilder (variables substituted in) -- and the resulting function is then passed to Python literally. This is the same reason that pathnames must be used with Calculate Value like this: def func(r"%mbuilder path variable%") Hope this helps!

Could you elaborate a bit more on this specific subject regarding pathnames? I haven't read about it before. And is the "r" before the inline variable a typing error, or does it need to be there?
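For context on the "r" question: it is not a typo but Python's raw-string prefix. Since ModelBuilder substitutes the inline variable as literal text before Python ever sees the expression, a Windows path pasted into the code would otherwise have its backslashes read as escape sequences (e.g. "\t" becomes a tab). A sketch of what Calculate Value effectively executes; the %Workspace% variable and func are hypothetical:

```python
# Hypothetical Calculate Value setup:
#   Expression:  func(r"%Workspace%")
#   Code Block:  the function below.
def func(path):
    """The r prefix matters: without it, backslashes in a substituted
    Windows path (e.g. C:\temp\new) would be read as escape sequences."""
    return path.replace("\\", "/")

# After ModelBuilder substitutes the variable, the call line literally
# becomes something like:
result = func(r"C:\temp\new")
print(result)  # C:/temp/new
```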
Posted 05-21-2012 01:14 PM