POST
I fixed the script but I'm not exactly sure why it decided to work consistently again.
Posted 03-11-2015, 11:56 AM

POST
Well, I thought that was a fix, but as soon as I tried to see what would happen if I used feature classes as input, it gave me an error, and now I can't get it to work again. The problem seems to be related to the fact that when I got this to run successfully, the input window showed the "Add" button for the Output Folder. Now it only shows me the "Save" option, and that always returns the error.
Posted 03-11-2015, 06:28 AM

POST
I fixed the script. I added two items and that fixed ERROR 000210. The first thing that I tried was to add the overwrite option (I already had that specified in my geoprocessing-->options, but I thought it couldn't hurt to include this). That change alone did not solve the problem. However, the class LicenseError(Exception): pass lines did the trick. I have no idea why that worked. Now, can anyone tell me how to get this to work with a file geodatabase feature class as input and output?

# Import system modules
class LicenseError(Exception):
    pass
import string
import os
import sys
import arcview
import arcpy
from arcpy import env
import traceback

try:
    # Create the Geoprocessor object
    arcpy.env.overwriteOutput = True
    arcpy.AddMessage("Starting script...")
    print arcpy.GetMessages()
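For context, the LicenseError class usually shows up in Esri's sample geoprocessing scripts as part of an extension check-out pattern, where it gets raised when a required extension is not available. Below is a minimal, hedged sketch of that common pattern (using Spatial Analyst as an example extension); it is illustrative only and is not the attached script.

# Common Esri sample pattern in which LicenseError is defined and raised.
# This is an illustrative sketch, not the toolbox script discussed above.
import arcpy

class LicenseError(Exception):
    pass

try:
    if arcpy.CheckExtension("Spatial") == "Available":
        arcpy.CheckOutExtension("Spatial")
    else:
        # Raise the custom exception when the license cannot be checked out.
        raise LicenseError
    # ... geoprocessing work would go here ...
    arcpy.CheckInExtension("Spatial")
except LicenseError:
    print "The required extension license is unavailable."
except arcpy.ExecuteError:
    print arcpy.GetMessages(2)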
Posted 03-11-2015, 06:00 AM

POST
I have used the attached toolbox/script in the past to automate Select by Location. In this case, I have points that I want to assign to polygons. The script allows you to enter the polygon layer and an ID field, as well as the item to be selected (the points layer). It places the polygons into memory and iterates through each polygon, selecting the points that intersect the polygon, writing the ID field to a table of those records, and exporting an output shapefile of the points with the polygon ID assignment. After it iterates through all of the polygons, it merges the output shapefiles into a single shapefile called Final_Merge.shp. (A rough sketch of this iterate-and-select workflow is included at the end of this post.)

(Note: some people may wonder why I'm not using a spatial join for this operation. The reason is that the "point in polygon" method skips any points that are on the boundary line. Yes, you could use the "closest" option of the spatial join, but I find that takes a long time to calculate the distances when you have large files. In addition, I like the flexibility and control of the Select by Location options to control just what gets selected and assigned.)

I've used the script for other layer combinations, not just points and polygons. I edit the script before running the toolbox to change which Select by Location method works best for the situation (e.g. INTERSECT versus WITHIN_A_DISTANCE) by using the # to comment out the intersect line and uncommenting the line that has the selection option that I want.

The script used to work great. With very large processes, it sometimes ran out of memory after about 500 shapefiles, and I would simply re-run the script with the records that were remaining. I ran this once this morning as a test, and it worked perfectly. However, when I tried to run it again, it always gives me an "ERROR 000210: Cannot create output". I tried rebooting, creating a new MXD, using a different ID, and using different output folders, and nothing has worked.

The main reason for my using the script now is that I wanted to try to change the input and output types to file geodatabase feature classes instead of shapefiles, but all of my attempts to make that change have failed, and until I can get the original script to run consistently with shapefiles, I don't want to try to make further changes. I am not a Python programmer and the original code was written by someone else. Can anyone help get this working again? I think this tool could help a LOT of people. I have attached a zip file with the toolbox file and the Python script.

Message was edited by: Susan Zwillinger
I uploaded the revised toolbox and script that is working consistently for me when using shapefiles.

Message was edited by: Susan Zwillinger
Removed the option that deleted the output folder when the script failed, in case someone chose an output folder that contained other data that should not be deleted.

Message was edited by: Susan Zwillinger
I got it working again. This time it is more consistent. I removed some of the output messages in the script. It will accept either a shapefile or a feature class as an input, but it still only outputs shapefiles. One note about the different inputs: if you use a file geodatabase as an input, you can get very different Select by Location results compared to the same data in shapefile format, depending on whether you used the default XY tolerance value when you created the file geodatabase or specified it in the geoprocessing-->environment settings.
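Because the toolbox itself is attached as a zip file rather than shown inline, here is only a rough sketch of the iterate-and-select workflow described above. It assumes ArcGIS 10.x with arcpy and Python 2; the parameter order, field names, and paths are placeholders, and this is not the attached script.

# Rough sketch of the iterate-and-select workflow (not the attached script):
# one Select By Location per polygon, export the selection, then merge.
import os
import arcpy

arcpy.env.overwriteOutput = True

polygons = arcpy.GetParameterAsText(0)    # polygon layer (placeholder parameter order)
id_field = arcpy.GetParameterAsText(1)    # polygon ID field
points = arcpy.GetParameterAsText(2)      # layer to be selected (e.g. points)
out_folder = arcpy.GetParameterAsText(3)  # output folder for the shapefiles

arcpy.MakeFeatureLayer_management(polygons, "poly_lyr")
arcpy.MakeFeatureLayer_management(points, "pt_lyr")
oid_field = arcpy.Describe(polygons).OIDFieldName

outputs = []
with arcpy.da.SearchCursor(polygons, ["OID@", id_field]) as cursor:
    for oid, poly_id in cursor:
        # Isolate one polygon, then select the features that intersect it.
        where = "{0} = {1}".format(arcpy.AddFieldDelimiters(polygons, oid_field), oid)
        arcpy.SelectLayerByAttribute_management("poly_lyr", "NEW_SELECTION", where)
        arcpy.SelectLayerByLocation_management("pt_lyr", "INTERSECT", "poly_lyr")
        # Swap "INTERSECT" for "WITHIN_A_DISTANCE" etc. as the situation requires.

        out_shp = os.path.join(out_folder, "sel_{0}.shp".format(oid))
        arcpy.CopyFeatures_management("pt_lyr", out_shp)

        # Tag the exported features with the polygon's ID value.
        arcpy.AddField_management(out_shp, "POLY_ID", "TEXT")
        arcpy.CalculateField_management(out_shp, "POLY_ID",
                                        "'{0}'".format(poly_id), "PYTHON_9.3")
        outputs.append(out_shp)

arcpy.Merge_management(outputs, os.path.join(out_folder, "Final_Merge.shp"))

Because the selection is made on the layers rather than on the source feature classes, CopyFeatures exports only the selected records in each pass.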
Posted 03-10-2015, 09:33 AM

POST
This is very helpful. It sounds like we should be creating a spatial view in SQL Server 2008 R2 (the client is using ArcSDE 10.2.1) and then setting up a trigger for when the lat/long data is updated for the points rather than using the Make Query Table option to create an event layer. I'm assuming that the term "spatial view" is equivalent to an "indexed view" in SQL Server terms. Can you point me to any good resources for further reading? Since the client wants to use Desktop Basic (and not Standard or Advanced), is there any problem with updating or adding new points in this scenario? Several years ago, I remember a different client having ArcSDE and the only way they could edit or update feature classes was to have an ArcEditor license. For cost reasons, it is obvious that the current client would want to avoid having to use the Standard license when updating or adding data to the map layer from the spatial view being stored in SQL Server.
Posted 02-13-2015, 08:06 AM

POST
This is not a question of functionality; it's about performance and best practices. I can see the database tables and views and I can create event layers in ArcGIS. To be more clear about what I am asking, I have numbered my questions below:

1) What are the best practices for using SQL Server tables, views, or indexed views?
2) Does ArcGIS work with an indexed view (I know it works with tables and regular views) and is there any benefit to an indexed view being used with ArcGIS?
3) Are we better off duplicating a view as a regular, indexed table? Would that be essentially the same as creating an indexed view?
4) Do tables that are "registered" have limited functionality for ArcGIS Desktop Basic users versus ArcGIS Desktop Standard users?
5) Can indexed views be "registered" and is there any benefit to doing that?

Regards,
-Susan-
Posted 02-12-2015, 10:24 AM

POST
Thanks for your helpful information, Jake. I have used the Make Query Layer tool in ModelBuilder, and this does address some of the issues with event layers, but it doesn't seem to address the underlying performance issues. I'm wondering if having an indexed view would help? In addition, I'm wondering whether it is possible to register an indexed view with the geodatabase and whether the combination of the indexed view and the registered view would improve performance.
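For reference, here is a minimal, hedged sketch of calling the Make Query Layer tool from Python rather than from ModelBuilder. The .sde connection file, the SQL text, and the field names are placeholders and are not tied to any particular database.

# Sketch of Make Query Layer run from Python; the connection, SQL, and OID
# field below are placeholders only.
import arcpy

sde_conn = r"Database Connections\prod.sde"               # placeholder connection file
sql = "SELECT objectid, shape, status FROM dbo.MY_VIEW"   # placeholder query
arcpy.MakeQueryLayer_management(sde_conn, "my_query_layer", sql, "objectid")

# The resulting layer behaves like any other layer and can be exported if needed:
arcpy.CopyFeatures_management("my_query_layer", r"C:\temp\scratch.gdb\my_view_fc")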
Posted 02-05-2015, 10:56 AM

POST
Typically, I don't look at the old help files, but in this particular case I couldn't find the corresponding help topic when I looked in the 10.2 help file, so I thought that it may still be a limitation that you can't register a view with the geodatabase. Have you ever tried to register a SQL Server View in 10.2 as opposed to registering SQL Server tables (which I know we can do)?
Posted 02-05-2015, 10:52 AM

POST
Did you try converting your shapefile to a feature class in a file geodatabase? An FGDB does not have the limitations that shapefiles have, and I think you might have much better luck exporting the entire table to a text file if the source data is not a shapefile.
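As an illustration only (all paths and names below are placeholders, and this assumes ArcGIS 10.x with arcpy), the conversion and export could look something like this:

# Copy a shapefile into a file geodatabase, then export its attribute table
# to a text file. Paths and names are placeholders.
import arcpy

arcpy.CreateFileGDB_management(r"C:\temp", "work.gdb")
arcpy.CopyFeatures_management(r"C:\data\parcels.shp", r"C:\temp\work.gdb\parcels")
# Table To Table can write a text file when the output name ends in .txt or .csv.
arcpy.TableToTable_conversion(r"C:\temp\work.gdb\parcels", r"C:\temp", "parcels.txt")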
Posted 02-05-2015, 10:28 AM

POST
To all of the ArcSDE experts,

What are your best practices for using nonspatial tables and/or SQL Server views as input for event layers?

The Help docs for 9.3 note that because ArcGIS cannot add an ObjectID field to a view, you cannot register a view with the geodatabase. (I'm using 10.2, but I don't think this has changed.) If a view is not registered, it means that the table (even if it has a field called ObjectID) will not work properly as input for an event layer. Sure, the event layer can be created, but if you try to query the data with a "Select by Attributes", you get very strange results. In addition, you are not able to select features on the map with the Select tool. This is a severe limitation when your data is changing frequently and you want users to be able to see those changes dynamically. With large data files, the process to convert the event layer to a feature class is painfully slow.

So, any suggestions on best practices? Would it help to create an indexed view in SQL Server? (Indexed views can provide better performance but this has to be weighed against the increased maintenance requirements of the database. Would ArcGIS "recognize" an indexed view?) Are we better off duplicating a view as a regular, indexed table and then registering that for the event layer? This would require some kind of automatic trigger to re-create the table after the underlying tables were changed by users and seems like a very bad idea when you have multiple people editing the data.

I'm also wondering whether tables that are registered have limited functionality for ArcGIS Desktop Basic users versus ArcGIS Desktop Standard users. Most of the users only have a Basic license and it would be costly to upgrade everyone to a Standard license. Currently, the underlying SQL tables can be edited through a custom form in ArcGIS regardless of the license level being used. When the data is refreshed, the event layers show the edits made by the users.

Regards,
-Susan-
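To make the event layer workflow mentioned above concrete, here is a minimal, hedged sketch (placeholder table, field names, and paths; assuming ArcGIS 10.x with arcpy) of creating an XY event layer from a nonspatial table and then exporting it to a feature class, which is the slow conversion step referred to in the post.

# Sketch of the XY event layer workflow; the view/table, coordinate fields,
# and output paths are placeholders.
import arcpy

table = r"Database Connections\prod.sde\dbo.POINT_VIEW"   # placeholder view or table
arcpy.MakeXYEventLayer_management(table, "LONGITUDE", "LATITUDE", "points_evt",
                                  arcpy.SpatialReference(4326))
# The event layer is dynamic but has the query/selection limitations described
# above; exporting it to a feature class is the slow step with large tables.
arcpy.CopyFeatures_management("points_evt", r"C:\temp\scratch.gdb\points_fc")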
Posted 02-04-2015, 08:30 AM

POST
Thanks, Kyle. That list is helpful. I know that there are many military and unique zip codes that end up in the point zip code file but not in the polygons that are created by companies like HERE and TomTom. My concern is related to the polygons decreasing. We generally have around 41,000 point zip codes and I sometimes use them for geocoding, but the main concern is related to looking at demographics and trade areas by zip code. Do you know if the USPS provides a polygon file as well as a point file? It would be great to have some cross-reference capability to validate the data that is included with the Esri Business Analyst software. (I think the current version of BA uses data from HERE.)

Regards,
-Susan-
Posted 07-18-2014, 11:43 AM

POST
Creating accurate trade areas for business locations that are not your own (i.e. you don't have customer addresses and sales data) usually involves multiple analysis techniques. The revenue estimates in the infoUSA data are not very accurate... so you can only use this as a rough guide of the "order of magnitude" difference in sales between different businesses. However, there are some things that you can do to try to analyze the locations. Below are some ideas:

1) Use the Business Analyst menu-->Trade Areas-->Data Driven Rings wizard to create rings based on the sales revenue data from infoUSA. For example, you can have a 1 mile ring represent 10000 from the revenue field--which translates to a store with $10 million in sales. (The infoUSA data reports sales data in thousands, $000.) You can then use this information to vary the size of the drive time areas... the assumption being that stores with larger revenue will have a larger trade area and longer drive time. Obviously, that assumption could be wrong in some cases, but I would say that the assumption would hold in many cases. If you have data from your own stores, you can use that information to guide how you choose to represent the size of the rings relative to revenue. You also may need to separate your grocery store data into different types. A super Walmart or a Super Target is going to have a very large trade area, but a local grocery store or specialty grocery store could have a much smaller footprint that serves only an immediate neighborhood.

2) Since many people tend to shop in their neighborhood for groceries (i.e. they want to be able to get home before their ice cream melts), you might want to consider using one or more neighborhood boundaries as a trade area (for example, you can download a shapefile from Zillow if you happen to be working in an urban area). You can translate these larger areas into a trade area by block group and then do manual customization of the trade area by selecting or removing block groups. Use the "Create Trade Area From Sub-geography Layer" tool in the Trade Areas tools in the Business Analyst Tools toolbox in your ArcToolbox window.

3) Try the Huff Equal Probability Model in the Trade Area wizard if you can weight the importance or market share for each location. This will help you understand the relative market share for each location, but it won't give you an easy way to compare one trade area to another; rings or grids are better for that type of analysis.

4) If you have time, you can use the Sales Potential Modeling under the BA menu (another form of the Huff Model) to create custom trade areas for each location based on an attractiveness factor for each of the grocery stores in the area. Once you have the results of the analysis, you can translate the block groups to a trade area by exporting the block group IDs to a table and importing that using the Business Analyst menu-->Trade Areas-->Standard Geographies wizard.

5) You can use the Territory Design toolbar to create trade areas (either automated or manually by block group) using the grocery stores as seed points and the "Food at Home" expenditure data for size parameters.

Regards,
-Susan-
Posted 10-03-2013, 11:56 AM

POST
Bret, I'm not sure I understand exactly what you are asking, but maybe the information below will help you. The infoUSA business data does not provide a numeric square foot area for each business (they only provide a size class--A, B, C, etc.), but I use the following proxy techniques:

1) Use Business Analyst menu-->Favorites-->Find Hot Spots (Grids) to create a grid to cover your study area (which could be the entire continental US). Select the size of the grid (e.g. 2 miles by 2 miles) based on your understanding of the standard trade area size of the business you are trying to analyze. (You can use spatial statistics tools to help you determine this, but as a new user, you'll probably be fine with a "gut instinct" decision.)

2) Use Business Analyst menu-->Favorites-->Append Data (Spatial Overlay) to aggregate the business count and demographic data from the standard BA BDS layers. The result will be a map layer with an attribute table that has data summarized by different business types. You can create a thematic map to highlight the hot spots and you can use the Select by Attributes and Select by Location options to query the data for the characteristics that are of interest. For example, you can find grids that have at least 1,000 daytime employees, 5,000 households and a median household income of $50,000 or higher (or whatever criteria you want).

3) If you don't like the business type breakdowns in the standard BDS layers, you can extract the data that you need from the infoUSA data and then use the Summarize Points option under the Business Analyst menu-->Analysis-->New Analysis-->Market Analysis-->Summarize Points. First, you would use the Business Analyst menu-->Add Business Listings-->Search option to extract the business data for a single category (e.g. NAICS 44) into a new map layer. I prefer to use the "Classic View" option in the Add Business Listings wizard since this will give you the most options to extract just the data that you need. After you have your business points, you can use Summarize Points to calculate the total number of business points, total number of employees, and estimated revenue for each grid. You can also use Summarize Points with the Shopping Center map layer to get a total of GLA in sq ft for the shopping centers in each grid.

Using grids (you can also create hexagons with ArcGIS) allows you to evaluate a large market area and see color-coded hot spots for the businesses that are of interest. Another benefit of using grids is that in addition to calculating the total for your business demand generators (counting the number of hotels if you have a rental car business) or potential residential customers (for retail stores), you can also combine that with an understanding of supply by creating a ranked ratio or score for each grid.

If your question was something like "Can this grid support another restaurant/department store/bank/hospital...?", then the analysis would have to involve an analysis of the specific competitors within each grid area. In many cases, when you are looking for suitability for a new site location, you don't want to avoid competition, but you do need to understand the market potential (demand) for the product or service in comparison to the supply provided by the competition. Business Analyst Online has a Retail MarketPlace Profile report that tries to identify surplus/leakage for various categories, but I prefer to do the analysis with grids in BA desktop first.

(Esri has not made grid analysis an option for Business Analyst Online... yet... I've asked for this enhancement, but it may take some time before this is available.)

Regards,
-Susan-
Posted 10-03-2013, 09:38 AM

POST
My 2011 BA Data had 32,086 zip codes. The 2012 BA Data has 31,698. While I know that zip codes change frequently, I would expect the number to increase and not decrease. Can someone verify that Navteq, in fact, reduced the number of zip code areas that they are delivering? As a side note, thanks to Esri for including the usa_zip5_pnts layer with the BA data that has 41,304 zip codes... some of which are point zip codes instead of area zip codes. (We used to have to get that layer from the Esri Data & Maps DVD.)

Regards,
-Susan-
Posted 04-30-2013, 09:04 AM

POST
I am surprised that this has not been documented previously, but my search did not yield any results. Here's my question: there is sample code to export all of your data driven pages to individual PNG files with the page number at the end of the file name. How do I change the code to create file names that use the page name index value instead of the page number? Here's the sample code:

mxd = arcpy.mapping.MapDocument("CURRENT")
for pageNum in range(1, mxd.dataDrivenPages.pageCount + 1):
    mxd.dataDrivenPages.currentPageID = pageNum
    arcpy.mapping.ExportToPNG(mxd, r"C:\ArcGIS\Map_Area_" + str(pageNum) + ".png")
del mxd
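One possible modification, shown only as a hedged sketch rather than a documented sample: the Data Driven Pages page-name field can be read for each page and used in the output file name instead of the page number.

# Sketch: use the page name (from the Data Driven Pages name field) in the
# exported file name instead of the page number.
import arcpy

mxd = arcpy.mapping.MapDocument("CURRENT")
ddp = mxd.dataDrivenPages
for pageNum in range(1, ddp.pageCount + 1):
    ddp.currentPageID = pageNum
    # pageRow is the index layer row for the current page; pageNameField
    # identifies the field that holds the page name.
    pageName = ddp.pageRow.getValue(ddp.pageNameField.name)
    arcpy.mapping.ExportToPNG(mxd, r"C:\ArcGIS\Map_Area_" + str(pageName) + ".png")
del mxd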
Posted 03-27-2013, 04:12 AM