POST
Interesting. Will that be 40 million points only, or also 40 million polylines? Also remember that a shapefile has a 2.1 GB file size limit, which comes into play at around 40 million points, or sooner when the .dbf has many fields. See https://en.wikipedia.org/wiki/Shapefile
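A rough back-of-the-envelope check shows why the attribute table (.dbf) tends to hit the 2 GB component limit before the geometry file (.shp) does. The field count and widths below are illustrative assumptions, not values from this thread:

```python
# illustrative size estimate for a 40-million-record point shapefile
records = 40_000_000

# .shp: 100-byte file header, then per point an 8-byte record header,
# a 4-byte shape type, and two 8-byte doubles (X, Y)
shp_bytes = 100 + records * (8 + 4 + 16)

# .dbf: one deletion-flag byte per record plus the summed field widths;
# assume 20 fields averaging 12 characters each (pure assumption)
dbf_bytes = records * (1 + 20 * 12)

limit = 2**31  # the ~2.1 GB per-component shapefile limit

print(shp_bytes / 1e9)  # ~1.12 GB: the geometry fits
print(dbf_bytes / 1e9)  # ~9.64 GB: the attribute table does not
```

So with wide attribute records, the .dbf can blow past the limit long before the point geometry itself does.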
07-17-2013 10:16 AM
POST
This is a great idea. But the only bottleneck I have is that I have to start with an SDC file as input. There is no syntax to take an SDC file as input, right? Thanks for sharing your wisdom!

Maybe take the selection and export logic you've been using in the Python window and put it into a search cursor. For example, the following would iterate through each state in a feature class:

import arcpy

fc = r"C:\temp.gdb\states"
with arcpy.da.SearchCursor(fc, "State") as cursor:
    for row in cursor:
        pass  # your code here

See http://resources.arcgis.com/en/help/main/10.1/index.html#//018w00000011000000 for more information on the search cursor.
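Putting the pieces together, a per-state select-and-export loop might look like the sketch below. The `STATE_NAME` field, the paths, and `points_fc` are assumptions, and the actual `arcpy.Select_analysis` call is left as a comment since it needs an ArcGIS install:

```python
# build one selection query and one output name per state, then export each;
# the field name "STATE_NAME" and all paths are hypothetical
states = ["Alabama", "Alaska", "Arizona"]  # in practice: collected via SearchCursor

jobs = []
for state in states:
    where = "STATE_NAME = '{}'".format(state.replace("'", "''"))  # escape quotes
    out_shp = r"C:\temp\out" + "\\" + state.replace(" ", "_") + ".shp"
    jobs.append((where, out_shp))
    # arcpy.Select_analysis(points_fc, out_shp, where)  # one export per state

print(jobs[0])
```

Run as a standalone script rather than in the Python window, so the machine can be left working overnight.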
07-17-2013 05:02 AM
POST
The idea of using Python to iterate is really intriguing. But what would the code look like? Currently I manually paste the arcpy.selectionmanagement... code into the Python window and then use arcpy to export the states, but I haven't tried a loop or anything else I can use to get the data. Would you give me a sense of what the code would look like? Thanks!

I would personally try the Split tool: use the 40 million features as the input and your states polygon as the split feature. From memory, this outputs a new feature class (or shapefile) for each value of the split field (in your case, the state name). I think this is an ArcInfo-level tool, though. You can then import each shapefile into PostGIS. Alternatively, I would use Python to iterate through each state feature, select all the features that intersect it, and then write the output to a shapefile.
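As a sketch, the Split-tool approach reduces the fifty manual select-and-export rounds to a single geoprocessing call. All paths and the split field name are assumptions, and the arcpy call is commented out because it requires ArcGIS with an ArcInfo/Advanced license:

```python
# one Split call in place of per-state manual selection and export;
# every name below is hypothetical
in_points   = r"C:\temp.gdb\points_40m"   # the 40-million-record input
split_polys = r"C:\temp.gdb\states"       # the state polygons
split_field = "STATE_NAME"                # field whose values name the outputs
out_ws      = r"C:\temp\by_state"         # output workspace

# import arcpy
# arcpy.Split_analysis(in_points, split_polys, split_field, out_ws)
# -> writes one feature class per distinct STATE_NAME value into out_ws

sample_states = ["Alabama", "Alaska"]
expected_outputs = [out_ws + "\\" + s for s in sample_states]
print(expected_outputs)
```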
07-16-2013 06:28 PM
POST
Thanks! I will be using pgAdmin3. According to this post, http://workshops.opengeo.org/postgis-intro/loading_data.html, it seems I have to load a shapefile into the system. So how long will it take to convert from file geodatabase to shapefile then? Anyway, I will try this method and let you know if it works better. Also, I am trying to do everything from the Python window, so should I use arcpy.FeatureClassToFeatureClass_conversion for the geodatabase-to-shapefile step? Thanks a ton! Your help means a lot to me!

What program do you intend to use as a PostGIS front end to work with the data? A shapefile may or may not be necessary. I believe it would go faster if you import to a personal or file geodatabase first. Also, when you are selecting and exporting, pause the drawing. The length of time required is directly related to the size of the data file and the power of your computer. Close all other programs on your computer and have ONLY the program you need running.
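For the PostGIS loading step itself, the workshop linked above uses shp2pgsql, which ships with PostGIS. Below is a minimal sketch of assembling that command from Python; the shapefile, table, database name, and SRID are all assumptions:

```python
# assemble the shp2pgsql load command; every name here is hypothetical
shp   = "states_subset.shp"
table = "public.states_subset"
srid  = "4326"   # assume WGS84; use the data's real SRID
db    = "gisdb"

# -s sets the SRID, -I builds a spatial index after load,
# -D uses PostgreSQL dump format, which is much faster for millions of rows
cmd = "shp2pgsql -s {} -I -D {} {} | psql -d {}".format(srid, shp, table, db)
print(cmd)
# the command could then be run via subprocess, or pasted into a terminal
```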
07-16-2013 06:43 AM
POST
I am not sure how to deal with an SDC file. Can I export it into a file geodatabase? I think the bottleneck I have at the moment is converting the SDC file.
07-16-2013 06:27 AM
POST
Dear All,

I am working on a project where I need to do the processing in PostGIS. To do so, I need shapefiles as input. However, what I have at hand are SDC files, so I need to convert the SDC files into shapefiles in order to get the results into PostGIS. The problem is that the dataset contains 40 million records, which is too big to "save as shapefile" in one go.

What I am doing now is dividing the records in some logical way (such as selecting by STATE); once I have the selection, I create a layer from it and then export that to a shapefile. Processing 1/50 of the whole dataset takes me 50 minutes to select and another hour or so to export, which is too tedious.

Does anyone know another approach that could speed up this process? Will ModelBuilder help at all? (I am looking for an automated way to select and export so that I can leave my computer running overnight.)

Thanks.
Best
07-16-2013 05:53 AM
POST
I am trying to export a subset from a really big dataset (millions of records). There are 20 fields in the shapefile, but I don't really need all of them, so I am weighing the processing time of deleting the fields against the processing time of exporting with the extra fields. Does anyone know whether deleting fields from the shapefile will speed up the export? Thanks.
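Fewer and narrower fields mean fewer bytes written per record, so dropping unneeded fields generally does speed up the export. A sketch of trimming the field list first is below; the field names are hypothetical, and the arcpy calls are commented out since they require ArcGIS:

```python
# keep only the fields needed downstream, drop the rest before exporting;
# both lists below are hypothetical stand-ins
keep = {"FID", "Shape", "STATE_NAME", "POP2010"}

# fields = [f.name for f in arcpy.ListFields(fc)]  # the real field list
fields = ["FID", "Shape", "STATE_NAME", "POP2010", "AREA", "PERIMETER"]

drop = [f for f in fields if f not in keep]
# arcpy.DeleteField_management(fc, drop)  # removes the unwanted fields in place
print(drop)
```

Note that `FID` and `Shape` cannot be deleted in any case, so they belong on the keep-list.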
07-15-2013 09:22 AM
POST
Hi asrujit,

Thanks a ton for the reply. The SDC file I am exporting contains 30,000,000 records and many fields. I tried exporting it by right-clicking, but it took three days for ArcGIS to process and the final file had some fields messed up. I am wondering if my server ran out of memory. In my case, should I delete some fields of the SDC file first and then export (and maybe first select a subset of the file)? If I export the data to a geodatabase, can I still import it into PostGIS? Thanks. Best, yk

Yunkex, I think you can simply perform this task using the ArcCatalog interface. From ArcCatalog, create a folder connection and browse to the folder where your SDC dataset is stored. Right-click the SDC feature class and use the "Export to Shapefile" option to get the desired shapefile. ** You didn't provide the version of ArcGIS you are using; the workflow above is suggested for 10.1. NOTE: If converting large amounts of data, it's probably best to use a geodatabase format for space efficiency and better capabilities; shapefiles are mainly useful for interchange among different vendors' spatial data processing products. Regards,
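Going through a file geodatabase first, as suggested, can also be scripted rather than done by right-click. A sketch is below; all paths and names are assumptions, and the arcpy calls are commented out since they require ArcGIS:

```python
# export the SDC feature class into a file geodatabase instead of a shapefile;
# every path and name below is hypothetical
sdc_fc   = r"C:\data\usa\streets.sdc\streets"  # feature class inside the SDC
out_gdb  = r"C:\data\work.gdb"                 # target file geodatabase
out_name = "streets"

# import arcpy
# arcpy.CreateFileGDB_management(r"C:\data", "work.gdb")
# arcpy.FeatureClassToFeatureClass_conversion(sdc_fc, out_gdb, out_name)
# the optional where_clause argument lets you export just a subset

print(out_gdb + "\\" + out_name)
```

A file geodatabase avoids the shapefile's 2 GB component limit, which matters at 30 million records.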
07-12-2013 06:32 PM
POST
Dear all, I am trying to get shapefiles from SDC files so that I can put them into PostGIS. Would you please let me know if there is an easy way of doing it? Thanks. Best, yk
07-12-2013 12:21 PM