POST
No, you didn't. You interpreted the data right. You undoubtedly know this already, but be aware that GEOMETRY and GEOGRAPHY in SQL Server don't use an f table; features are stored straight in the base table. Still a huge number of vertices per feature by any measure.

Glad to know I had the details right. We are looking forward to the removal of the f, s, and i tables from the architecture, and also to being able to use straight SQL (plus some follow-up spatial index management) to copy features from "master" sets to subsidiary subsets, without having to go through the pain of reprocessing the data for each variation. Since we are using county, census tract, and county subdivision boundary data obtained from outside sources (Esri and the Census), we're not predisposed to mess with them. Any idea why there might be over 145K vertices to define a single county boundary, even if that's an extreme outlier? -= Keith =-
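Once the shapes live in a GEOMETRY column, the "straight SQL" subset copy discussed here could look something like the sketch below. All table, column, and index names are invented for illustration, and OBJECTID allocation is glossed over:

```sql
-- Hypothetical sketch: copy Florida counties from a "master" feature class
-- to a subsidiary subset. Assumes both tables are registered feature
-- classes with matching schemas.
INSERT INTO dbo.SUBSET_COUNTIES (OBJECTID, NAME, FIPS, SHAPE)
SELECT OBJECTID, NAME, FIPS, SHAPE
FROM   dbo.MASTER_COUNTIES
WHERE  FIPS LIKE '12%';   -- state FIPS 12 = Florida

-- The follow-up spatial index management mentioned above:
ALTER INDEX ALL ON dbo.SUBSET_COUNTIES REBUILD;
```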
07-31-2013 12:30 PM

POST
... x2, but maybe also give some more information about the configuration details. What hardware is the server? What network bandwidth do you have (still 100 Mbit/s or gigabit?) etc...

So I thought I had responded to this, but I don't see it. When I ran the code from my workstation (which has a 100 Mbit network card and probably some other network bandwidth throttling that I can't see; it's Windows 7 with 4 GB of RAM and a bunch of other things running), it was taking 30-40 minutes to copy 4,800 polygon features. When I ran the same code from a server, which has a gigabit connection, it took about 1.5 minutes. Problem solved, or at least worked around. Not sure how the customer's going to feel about my using their servers as if they were my private workstation, but it will at least enable me to complete the task before I reach my retirement date a few years hence.

So now here's a new conundrum: why is it that if I point Catalog at a table-based feature class and choose "Preview", it takes up to 30 seconds to get a response (there's even a several-second lag time before the cursor changes or any other visible reaction takes place), whereas feature classes built as spatial views connecting to the same underlying data respond within a second or two? -= Keith =-
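A rough back-of-the-envelope on the bandwidth difference. The sizes here are assumptions (about 16 bytes per vertex for coordinate doubles, with a fudge factor for attributes and protocol overhead):

```python
def transfer_minutes(features, vertices_per_feature, link_mbits,
                     bytes_per_vertex=16, overhead_factor=4.0):
    """Very rough wire-time estimate for moving a feature class.

    overhead_factor lumps together attribute data, protocol framing,
    and retransmits; it is a guess, not a measurement.
    """
    payload_bytes = features * vertices_per_feature * bytes_per_vertex
    total_bits = payload_bytes * 8 * overhead_factor
    return total_bits / (link_mbits * 1_000_000) / 60

slow = transfer_minutes(4800, 2100, 100)    # 100 Mbit/s workstation
fast = transfer_minutes(4800, 2100, 1000)   # gigabit server
print(slow, fast)                           # wire time alone, in minutes
```

Interestingly, raw wire time comes out at under a minute even at 100 Mbit/s, nowhere near 30-40 minutes — which hints that per-feature round trips (latency and chattiness), not bandwidth alone, dominated on the workstation.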
07-31-2013 12:11 PM

POST
So you're saying you have 10K+ features comprising an average of 2100 vertices per feature at scale 1:100.000??? :eek: Is this data stream-digitized from imagery, with never a proper weeding out of unnecessary vertices? I can't imagine any polygon / line needing up to 2100 individual vertices to define its form at scale 1:100.000. Even at scale 1:1000, it would be highly undesirable to have such huge amounts of vertices per feature... For a highly detailed and high quality 1:1000 base map here in the Netherlands of the national highway system, where I was involved in the re-design of the photogrammetry-based workflow and database migration some ten years ago, a test dataset had an average of about 27 vertices per feature..., minimum 4, maximum 1026, but that was a rare exception. Writing to an old, but dedicated and otherwise unused Sun Sparc Ultra-2 single 100 MHz processor Unix server, and storing in SDE_BINARY, resulted in data load speeds of about 120 features / second, so about 27*120=3240 vertices / second. (Oracle 8.) Again, we're talking >10 years ago here... Your data loads at 14,700 to 18,900 vertices per second. I leave it to others to comment on whether that is a normal speed right now with the configuration details you posted... but you really may need to reconsider your workflow for collecting and storing this data...

Hi- This is initially a project to carry out a one-time data conversion / migration exercise of existing data. What happens after that remains to be seen. We have built the data over the years, and rebuild / reload big chunks of it every night. The data are not versioned. We are in the early planning and analysis stages of redesigning the data management process, but for now that is beside the point. At this point, the data are what they are, and the sole issue is how to get them from the old beat-up DBs that have gone through three version upgrades of both the DBMS and ArcGIS into brand shiny fresh new ones. At the end of the day, the new and old need to be indistinguishable from one another, except that the new will be in GEOMETRY where the old was SDE_BINARY.

The databases are on virtual servers. The destination server is configured as two Intel E5-2260 @ 2.2 GHz CPUs, Windows Server 2008 Standard Edition, 64-bit, 16 GB RAM. Incidentally, I checked and verified that the county, census tract, and county subdivision boundary data from which we are building the feature class came from Esri and the Census Bureau. We're just consuming them without having compiled them. They get used for spatial analyses, so they need to be at as fine a level of detail as we can manage. Thanks- -= Keith Adams =-
07-16-2013 01:26 PM

POST
The ArcSDE 'C' API doesn't interact with feature datasets at all (they do not exist in that API). Any table created by 'C' or Java API functions will be registered with ArcSDE, but not with the XML geodatabase (right-click "Register with Geodatabase" will still be available). ArcSDE SDK functions work fine with tables that happen to be in feature datasets, BUT:
- They will not honor topologies
- They're unlikely to handle CAD objects properly
- They will not modify geodatabase metadata

In addition, the 'se_toolkit' tools will not insert, update, or delete within versioned tables or tables where archiving is enabled, or interact with ArcGIS-only raster types. Finally, the ArcSDE 'C' API will be deprecated with the release of 10.2, so long-term plans for toolkit use should be limited. You should really determine where your I/O bottleneck is, since there isn't any clear indication of why your database is so slow. - V

I'm not writing any C code directly, so the C API doesn't enter into it from my side. It's either command window or Python, so whatever the SDE command line tools and se_toolkit offer is what I have to work with, along with the other geoprocessing tools available through arcpy. None of our data are versioned, CAD-based, archive-enabled, or raster. As for the postulated I/O bottleneck: we only see this issue when copying data that are being converted from SDE_BINARY to SQL Server GEOMETRY.
07-16-2013 12:48 PM

POST
10.1 expanded the number of tools you can use to administer a (geo-)database, and you can register datasets or enable geodatabase functionality easily from the toolboxes and geoprocessing tools available. Also, you can use Query Layers to access the data in read-only mode directly, without the need of registering the data with a geodatabase. See also my PDF here: http://forums.arcgis.com/threads/83644-quot-The-ESRI-Geodatabase-Framework-quot-PDF?p=295462&viewfull=1#post295462 and the "Future" PDF here: http://forums.arcgis.com/threads/83644-quot-The-ESRI-Geodatabase-Framework-quot-PDF?p=303021&viewfull=1#post303021

Hi- This is a data migration exercise at this point--- I am "just" (note caveat below) trying to get existing data from databases that have been through three SQL Server version updates, as well as at least two (maybe three) ArcGIS platform upgrades, into fresh, shiny, unadulterated 10.1 / SQL Server 2008 R2 native data. At the end of the process, the data have to look exactly as they did when I started, except for having all the shape data in SQL Server GEOMETRY instead of SDE_BINARY. (Any time someone uses the word "just" in such a context, it's a tip-off that they are not entirely aware of the implications of what they are saying. I include myself in that, in this context.) I've already discovered that I have to use the SDE command line tools to get spatial views to be included in SDE_LAYERS, without which the Spatial Join cannot use them. The "Register with Geodatabase" tool will not register views. In my question about asc2sde, I was merely trying to ascertain and verify that I would have to take some other measure to get that to happen for data transferred by the method Vince outlined. -= Keith =-
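For reference, the SDETABLE incantation for creating a spatial view that does get registered in SDE_LAYERS looks roughly like this. This is 10.1-era syntax sketched from the command's documented options; the view, table, and column names are placeholders, and the `^` continuations are for a Windows command window:

```
sdetable -o create_view -T my_spatial_view ^
         -t "parcels,owners" ^
         -c "parcels.objectid,parcels.shape,owners.owner_name" ^
         -w "parcels.owner_id = owners.owner_id" ^
         -i sde:sqlserver:myserver -D mydb -u sde -p <password>
```

Run `sdetable -h` on your install to confirm the exact option list for your ArcSDE version.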
07-16-2013 12:29 PM

POST
Hi- Perhaps what I was doing was exposing my naivete and relative inexperience when dealing with spatial data in "hard core" fashion. Here's how I got my numbers: I queried SDE_LAYERS to get the layer_id for the feature class I wanted to examine, then I simply queried the corresponding f table to take the average of the numofpts column. That is the number I reported as being the average number of vertices for a feature in that class. Did I misunderstand the content of the f table? I have the following stats for the layer in question: 366K features; numofpts: avg 2,127, min 13, max 144,969 (!!!), standard deviation about 5,086, median 1,023. The feature with 145K vertices is Monroe County, FL. All of our data are vector, no raster. -= Keith Adams =-

For what it's worth, this is a feature class built by a merge-and-dissolve process. In addition to the business data, the three feature class inputs to the merge are what were represented to me as 1:100,000 scale county, census tract, and county subdivision boundaries. We run this process every day, generating something on the order of +/- 4,150 final perimeters, and append the results to a growing "history" table that we can use for trending and historical change analysis. (I don't want to go further down the road of the hows, whys, and possible alternate strategies for doing this right now- that's neither here nor there in the context of this discussion, because at the end of the day the 90+ days of historical data I have in the feature class are what has to get transferred and converted. I can thin the data some by archiving some of the days of data, and am in the process of doing so, but even so there's a big gob that has to get transferred.)
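The lookup described here can be reproduced with two queries along these lines. The `sde` schema prefix, the feature class name, and the example layer_id are assumptions (SDE_layers may live under dbo depending on the install):

```sql
-- Step 1: find the layer_id for the feature class of interest.
SELECT layer_id, owner, table_name
FROM   sde.SDE_layers
WHERE  table_name = 'MY_FEATURE_CLASS';   -- hypothetical name

-- Step 2: summarize vertex counts in the matching f<layer_id> table,
-- here assuming Step 1 returned layer_id = 123.
SELECT COUNT(*)         AS features,
       AVG(numofpts)    AS avg_pts,
       MIN(numofpts)    AS min_pts,
       MAX(numofpts)    AS max_pts,
       STDEV(numofpts)  AS sd_pts
FROM   sde.f123;
```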
07-16-2013 11:52 AM

POST
Hi Vince- Thanks for the extended example. Since asc2sde is not geodatabase-aware, I assume that I'll have to find a way to register the results (and get them into the desired feature datasets) separately?
07-16-2013 11:27 AM

POST
Hi- Thanks Vince. Is it possible to accomplish the same task, at similar speeds, using the geoprocessing tools provided with ArcGIS Desktop? We have a bunch of tables and views scattered across a half-dozen databases, and I'm trying to do as much of the work as I can in a testable, repeatable fashion with a Python GP script. If I have to rely on the SE_TOOLKIT, what tool do I use to transfer data from one SQL Server-based feature class to another? The sdecopy tool comes back with a "Not yet supported" message... so I'm kind of at a loss as to how to get from one SDE / SQL Server feature class to another with the toolkit. Thanks- -= Keith Adams =-
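A minimal sketch of the geoprocessing-tool route being asked about here, assuming the ArcGIS 10.1 Python environment and made-up .sde connection-file paths:

```python
import arcpy

# Source feature class and target workspace via connection files (invented).
src = r"C:\conn\old_server.sde\dbo.SERVICE_AREAS"
dst_ws = r"C:\conn\new_server.sde"

# If the target feature class does not exist yet, let the tool create it.
# The geometry storage type comes from the target geodatabase's
# configuration keyword (e.g. DEFAULTS -> GEOMETRY at 10.1 on SQL Server).
arcpy.FeatureClassToFeatureClass_conversion(src, dst_ws, "SERVICE_AREAS")

# If the target already exists with a matching schema, append instead:
# arcpy.Append_management(src,
#                         r"C:\conn\new_server.sde\dbo.SERVICE_AREAS",
#                         "TEST")
```

Both routes are testable and repeatable from a Python GP script, which is the stated goal; whether they hit the same GEOMETRY write-speed wall discussed elsewhere in this thread is a separate question.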
07-16-2013 07:54 AM

POST
Thanks for the link and feedback. For this issue, both ArcGIS and the geodatabase are 10.1 SP1; MSSQL is 2008 R2. We imported a parcel base feature class of ~300,000 polygons, stored in SDE_BINARY, and it performs just fine, as it always has since the days of MSSQL 2000 and 9.something... but with an exact copy of it using GEOMETRY storage, display and query performance are unacceptable, almost shapefile-slow. The layer envelope is not the issue. I can't find any discussion on the support forums etc. or guidance in online help. Any suggestions? Thanks!

If this ever gets / got answered, I'd LOVE to know, because I'm in exactly the same boat, and on the same platform. To compound the problem, it is taking me 18+ hours to copy a single feature class of about 420,000 polygons, averaging 2,125 vertices / feature, and I have too many feature classes to do to have the luxury of that kind of time. They all have to get done, and they all have to get done in the same time window of less than 12 hours. Thanks- -= Keith Adams =- SAIC HRSA Data Warehouse Systems Analyst
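The throughput implied by those numbers, for comparison with the rates quoted elsewhere in this thread:

```python
# 420,000 polygons in 18+ hours at ~2,125 vertices per feature.
features = 420_000
hours = 18.0
vertices_per_feature = 2125

features_per_second = features / (hours * 3600)
vertices_per_second = features_per_second * vertices_per_feature
print(features_per_second, vertices_per_second)
```

That works out to roughly 6.5 features and ~13,800 vertices per second, in the same ballpark as the 7-9 features/second copy rates reported elsewhere in this thread.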
07-15-2013 01:16 PM

POST
I am wondering if it is some kind of projection / spatial reference issue, with the stored SRID of the shapes somehow being messed up? In your first post you were slightly unclear, suggesting that in some cases you did see shapes appearing? ("... In ArcCatalog or ArcMap it makes the views look like they are empty, or if they draw, you can't zoom in, out or pan because the features, if present disappear...") Do you, or don't you, ever see shapes appearing in ArcMap / ArcCatalog?

Hi- I'm a colleague of Joe's and have been at the pointy end of the conversion stick. Starting with the answer to the immediate question above: it depends. In some cases, we see the initial feature draw when viewing the spatial view at full extent, but any extent change causes some or all features to fail to show up. In other cases the features draw, but Identify operations are unable, ever, to find any features from which to select attributes. In such instances, attempting to display the view as a table instead of as a map shows an empty table. In yet other cases, no features ever draw. In no case did we change the join conditions when we dropped and re-created the spatial views during conversion. In all cases the data are (supposed to be) WGS84. We are relying on the Esri tools to correctly detect and handle that during data migration.

Now on to the longer, more sordid history. We have tried dropping the views, converting the data, then re-creating the views. They seem to work initially, but the instant we refresh the spatial data they fail again. Our refresh method uses SQL-based business and spatial data as inputs, file GDBs to build the data, then the Append tool to bring it back into an existing feature class in SQL Server. The views are built on these existing feature classes, which are emptied and reloaded each time the data are refreshed, rather than being dropped and re-created, so there can't be any schema change happening once the views themselves have been created after data migration. One side issue was that we created them on the first attempt with the new Create Database View tool, but the resulting views could not be used in Spatial Join operations because they didn't show up in the SDE_LAYERS table (and why would they?). Creating them with SDETABLE -o create_view made the Spatial Join tool happy, but then something else broke instead. We also tested the supposition that perhaps it was broken spatial indexes that were the culprit, and proved to our satisfaction that that was not the cause. Our test method was to delete and re-create the spatial index after a data reload, letting the tool calculate the grid sizes from the re-loaded data.

At this stage we have abandoned the idea of convert-in-place and are headed down the path of creating completely new, empty DBs in an isolated environment, so that they will never have gone through either SQL Server or ArcGIS upgrades and conversions-- they'll be ArcGIS 10.1 / SQL Server 2008 R2 from the get-go. We plan to create brand new spatial views, again using SDETABLE, from the newly-transferred data once the tables themselves have been copied. Once everything is ready, we will use the new DBs to replace the existing production version and, hopefully, everything will be hunky-dory.

The issue we face now is that writing data to the new databases is unbelievably slow- only a few features a second when writing polygons averaging 2,100 vertices per feature- and we have millions of features that have to be written. We can't possibly get the job done in a time frame that allows us to maintain our regular daily data update production schedule. (Joe alluded to this in the first message in this thread; we are pretty sure now that the GEOMETRY data type is the culprit, based on other threads that we've read.) We've seen that there's a patch for ArcGIS 10.1 when used with SQL Server 2012 that supposedly addresses this issue, but since we're at SQL Server 2008 R2 we haven't gone to get it. Thanks for any light you can shed on this vexing problem. We seem to be stuck between two equally non-viable courses of action! -= Keith Adams =- SAIC Systems Analyst HRSA Data Warehouse
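The spatial-index test described above, sketched in arcpy. This needs the ArcGIS 10.1 Python environment, and the connection path is a placeholder:

```python
import arcpy

fc = r"C:\conn\new_server.sde\dbo.SERVICE_AREAS"  # invented path

# Drop and re-create the spatial index after a data reload. Passing 0 for
# the grid size lets ArcGIS calculate grids from the freshly loaded data.
arcpy.RemoveSpatialIndex_management(fc)
arcpy.AddSpatialIndex_management(fc, 0)
```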
07-15-2013 12:09 PM

POST
Hi- We are finding that storing the shape data in SQL Server GEOMETRY format has a HUGE negative impact on performance. (I just replied to your thread about that migration process.) If you are doing that, that would be one place to look. Did the views exist prior to your conversion to GEOMETRY? If so, how was the performance then?

For a polygon feature class that has about 25 attribute columns, we are getting data copy rates of only 7 to 9 features a second, whether we use Feature Class to Feature Class, Copy Features, or Append. The polygons are based on 1:100,000 scale data at the county-level size (average of about 2,100 vertices / feature), to give you a sense of the level of detail. It does seem to us, at least anecdotally, that virtually ALL database operations are slower at 10.1 than they were at 10.0. We sit and sit and sit just waiting for ArcCatalog to connect to the geodatabase before we try to do anything else. It also seemed, again only at the anecdotal evidence level, as if spatial views actually drew faster than the underlying spatial table (again, using tables that store feature data in SQL Server GEOMETRY as opposed to SDE_BINARY format).

A couple of other threads I have read suggest that there is a patch for SQL Server 2012, but we're still only at 2008 R2. They also say that the "solution" is to go back to SDE_BINARY instead of using GEOMETRY, but that creates other problems for us (our spatial views inexplicably stopped working correctly, which was why we started down the GEOMETRY road in the first place), so it's not a viable alternative. Of course there is the basic issue of whether you have the business key data on which you are doing the SQL joins indexed properly- that can have an enormous impact on performance, particularly when you have thousands or tens of thousands of features involved.
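On that last point, making sure the business keys behind a spatial view's join are indexed is a one-liner per side. The table and column names here are invented for illustration:

```sql
-- Index the join keys on both sides of the spatial view's join condition.
CREATE NONCLUSTERED INDEX ix_business_site_id ON dbo.BUSINESS_DATA (site_id);
CREATE NONCLUSTERED INDEX ix_features_site_id ON dbo.SERVICE_AREAS (site_id);
```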
07-15-2013 09:56 AM

POST
Hi all- We have now stumbled onto the same land mine after (finally) completing our migration from 9.3.1 to 10.0 SP3. Our situation is: we have a series of Python-based GP scripts that get launched from inside Informatica workflows. The spatial join step (joining county data to point features, where the size of the point feature class ranges from a few dozen to about 10,000 features) fails and throws an OS-level exception. When the same code is run from a Windows CMD shell, we have no problem; it only happens when executed from within the Informatica workflows.

Our environment is:
- Windows Server 2003 with all the latest and greatest service packs, KB patches, etc.
- ArcGIS 10.0 SP3
- MS SQL Server 2005 plus ArcSDE
- Informatica 8.1.1 (which Informatica no longer supports- that's next up on the modernization list)
- Data are being processed from a combination of file GDB (point features) and ArcSDE layers (counties)

Text of the Windows Application event log is as follows:

    Event Type: Error
    Event Source: .NET Runtime 2.0 Error Reporting
    Event Category: None
    Event ID: 1000
    Date: 2/25/2012
    Time: 1:27:48 PM
    User: N/A
    Computer: XXXXXXX
    Description: Faulting application python.exe, version 0.0.0.0, stamp 4ba3e443, faulting module atio2kad.dll, version 4.0.0.1119, stamp 3bcc6880, debug? 0, fault address 0x000d0d49.
    For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

Thanks for any help / insights you have that fix this problem. Our data managers will greatly appreciate not having to start their work day at 2 a.m. every day!

The last post mentioned this was fixed in 10 SP2; I'm running SP3 and experiencing a similar issue. A model I've created that uses Spatial Join works fine in 9.3.1 but crashes in Arc10. I isolated it down to just the Spatial Join command, and created a model that just executes a spatial join, and it crashes every time. I tested the background-geoprocessing on/off solution with no success. The same process works fine in 10 when executed outside of ModelBuilder. The spatial join is one-to-many, closest, with a set maximum search distance. Any advice on solutions would be most appreciated.
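For anyone trying to reproduce the failing step outside ModelBuilder, the join being described maps onto a single geoprocessing call like the sketch below (ArcGIS Python environment; all paths and the search radius are invented):

```python
import arcpy

# One-to-many, closest-match spatial join with a maximum search distance,
# isolated from the surrounding model/workflow.
arcpy.SpatialJoin_analysis(
    target_features=r"C:\data\points.gdb\sites",
    join_features=r"C:\conn\prod.sde\dbo.COUNTIES",
    out_feature_class=r"C:\data\scratch.gdb\sites_with_county",
    join_operation="JOIN_ONE_TO_MANY",
    join_type="KEEP_ALL",
    match_option="CLOSEST",
    search_radius="5000 Meters")
```

Running this from a bare CMD shell versus from inside the workflow engine is a quick way to confirm whether the crash is specific to the hosting environment, as both posts above suggest.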
02-28-2012 05:46 AM

POST
I am trying to construct a model (or pair of models) that I can subsequently publish as an ArcGIS Server service. Each model needs to accept a coordinate pair as the starting point. Having created an in-memory feature class with one point based on the input coordinates and a pre-set spatial reference, one branch of the model (or one of the two models) would do a spatial join against a polygon layer to determine the "most likely" feature, if any, into which the point falls. The second part, or the second model, would create a buffer of predetermined radius around the point, then do an intersect against the same polygon feature layer to determine whether the result obtained from part 1 is "realistic", or whether there might be alternate results based on positional uncertainty of either the point location or the line location (i.e. the point is too close to a boundary line for the point result alone to be considered "definitive", based on inherent uncertainty in the location of the point, the boundary, or both).

PROBLEM NUMBER 1: I cannot find any tool or combination of tools that can be used, abused, or otherwise manipulated to create a point feature from coordinates (even disregarding the need for a spatial reference- I can specify that). Preferably the feature would be an in-memory feature rather than written to disk, but I cannot find anything one way or another. I've tried creating an event layer, but that starts from a table, and there is no tool that I could find which can be used to add rows to a table.

PROBLEM NUMBER 2: Assuming I could somehow find a way to create an event layer, I really need a feature class in order to do a spatial join. How do I get from the event layer to an actual by-gosh feature class? Again, bonus points if it exists only in memory, and only for the lifespan of model execution.

PROBLEM NUMBER 3: Having now succeeded in creating this fantasy in-memory feature class, I want to create an equally fantastic in-memory BUFFER layer. Again, no tool I know of will do this in memory- the buffer tool wants to write to disk. Perhaps all of this is serialized and happens in memory only when the model is published as an ArcGIS Server service?

PROBLEM NUMBER 4: I don't yet know what the problem will be, but given the way this process has gone so far, I'm sure there is something else I have overlooked. Any idea what it might be?
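One possible arcpy-side answer to Problems 1-3, as a hedged sketch rather than a model-only solution. The coordinates, names, and WGS84 choice are illustrative, and the `SpatialReference(4326)`-by-code constructor assumes 10.1 (at 10.0 you would load the spatial reference from a .prj file instead):

```python
import arcpy

# Problems 1 and 2: build a one-point feature class directly in memory.
sr = arcpy.SpatialReference(4326)  # WGS84; an assumption for this sketch
pt_fc = arcpy.CreateFeatureclass_management(
    "in_memory", "start_point", "POINT",
    spatial_reference=sr).getOutput(0)

# Classic (pre-arcpy.da) insert cursor adds the row from the coordinate pair.
cur = arcpy.InsertCursor(pt_fc)
row = cur.newRow()
row.shape = arcpy.Point(-77.0365, 38.8977)  # the input coordinate pair
cur.insertRow(row)
del row, cur

# Problem 3: Buffer will also write to the in_memory workspace; everything
# here evaporates when the session (or published GP task) ends.
buf_fc = arcpy.Buffer_analysis(
    pt_fc, "in_memory/start_buffer", "500 Meters").getOutput(0)
```

Both `pt_fc` and `buf_fc` are ordinary feature classes as far as Spatial Join and Intersect are concerned, which covers the "event layer to by-gosh feature class" gap without ever touching disk.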
11-21-2011 01:23 PM
Online Status: Offline
Date Last Visited: 11-11-2020 02:23 AM