POST
Yes, clicking Select All does show the total number; however, I'm still not able to browse the attribute table. I want to scroll down to the bottom, but the 'bottom' of the table is still record no. 2000.
03-28-2017 03:25 AM | 0 | 1 | 4231
POST
FGDB. Thanks for the help, Neil. I've gone for the method posted earlier now, where I create a new dataset with a date tag added to the end of the feature class name. I just have to remember to re-source the MXDs to this new layer after every update.
03-28-2017 03:11 AM | 0 | 0 | 1724
POST
I'm in ArcMap, viewing the attribute table for a point feature class... '0 out of *2000 Selected'... Why has it put this limit on the attribute table? There should be roughly 25,000 records, and on the map itself all 25,000 points do display/draw. I do have a definition query, but that should exclude only around 500 records. This issue only occurs with File Geodatabase feature classes; sourced to a shapefile copy of the dataset, it doesn't happen. What's going on? Lots of questions I've had recently! ArcGIS Desktop 10.4.
03-28-2017 03:09 AM | 1 | 13 | 11533
POST
Is that the 'Truncate Table' tool you use? And does your method work even if there's a LOCK file on the data?
03-28-2017 01:38 AM | 0 | 2 | 1673
POST
Hi. I have a script created in Python 2.7.10. It runs without issue on a machine with 2.7.10 installed (spreadsheet >>> XY event layer >>> feature class), but on a machine with 2.6.5 installed it errors. Why is this? Is it because I'm trying to make an XY event layer from a sheet within an Excel spreadsheet, and I don't have Microsoft Office installed on this second machine (no licence)? The screenshot below shows the spreadsheet needed for the XY event layer; it shows up as an unassociated file type (no program found to open it). I didn't think Microsoft Office would be needed for Python to be able to extract a sheet. I've also experimented with making a CSV version of the file and pointing the script to use that, but the same error appears. Any thoughts most welcome!
03-28-2017 01:19 AM | 0 | 6 | 1460
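A side note on the CSV route mentioned above: if the CSV version fails with the same error, one sanity check is that the CSV itself parses to numeric X/Y values before it ever reaches the XY-event-layer step (differences between Python 2.6.5 and 2.7.10, or a missing Office/Access driver for .xlsx files, are other plausible culprits; the actual traceback would settle it). A minimal sketch of that check, where the `X`/`Y` field names are assumptions to be matched against the real sheet:

```python
import csv

def read_xy_rows(lines, x_field="X", y_field="Y"):
    """Parse CSV lines of point records into (x, y, attrs) tuples.
    Raises ValueError/KeyError if a coordinate is non-numeric or
    the assumed field names don't exist -- a quick data sanity check."""
    rows = []
    for rec in csv.DictReader(lines):
        rows.append((float(rec[x_field]), float(rec[y_field]), rec))
    return rows

# Hypothetical sample standing in for the exported spreadsheet.
sample = ["ID,X,Y",
          "A1,530000.0,180000.0",
          "A2,531250.5,179800.2"]
points = read_xy_rows(sample)
print(len(points))      # 2
print(points[0][:2])    # (530000.0, 180000.0)
```

`csv.DictReader` accepts any iterable of lines, so the same function works on an open file object in both Python 2.6 and 2.7.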
POST
Of course. I haven't gone into details just yet and thoroughly appreciate Mitch's response. Anyway, there are 20 fields and 25,000 records. All the data in these records is subject to change, excluding a couple of ID fields. What I think I really need is to delete the existing shapefile and replace it with the updated version. However, a LOCK file originating from the server (which accesses the shapefile) stops me from deleting this dataset. I need a script which:
1. Stops the GIS Server service
2. Deletes Points.shp
3. Creates an updated version of Points.shp in the same directory with the same name
4. Starts the GIS Server service
03-27-2017 08:40 AM | 0 | 4 | 1673
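The four steps above can be sketched in plain Python. As far as I know, ArcGIS for Server 10.0 predates the REST admin API, so one option is stopping the Windows service itself. Everything specific here is an assumption: the service name (`"ArcGIS Server Object Manager"` is a guess; check services.msc), the `build_new` callback that recreates the dataset, and that the script runs with rights to use `net stop`/`net start`:

```python
import os
import subprocess

# ASSUMPTION: the actual Windows service name for ArcGIS Server 10.0
# on your machine may differ -- verify it in services.msc first.
AGS_SERVICE = "ArcGIS Server Object Manager"

def refresh_shapefile(build_new, shp_path, dry_run=False):
    """Stop the server service (releasing its .lock), replace the
    shapefile via build_new(shp_path), then restart the service."""
    stop_cmd = ["net", "stop", AGS_SERVICE]
    start_cmd = ["net", "start", AGS_SERVICE]
    plan = [" ".join(stop_cmd), "delete " + shp_path,
            "rebuild " + shp_path, " ".join(start_cmd)]
    if dry_run:                      # just report the intended steps
        return plan
    subprocess.check_call(stop_cmd)  # releases the LOCK file
    try:
        # a shapefile is several sidecar files, not just the .shp
        base = os.path.splitext(shp_path)[0]
        for ext in (".shp", ".shx", ".dbf", ".prj"):
            if os.path.exists(base + ext):
                os.remove(base + ext)
        build_new(shp_path)          # recreate with the same name
    finally:
        subprocess.check_call(start_cmd)  # restart even on failure
    return plan

plan = refresh_shapefile(None, r"D:\data\Points.shp", dry_run=True)
print(plan)
```

The `try/finally` matters: if the rebuild fails, the service still comes back up rather than leaving the web map down.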
POST
Bruce Harold Yes, it's the stopping and starting of the service I can't figure out. We have ArcGIS Desktop 10.4 with Python 2.7.10, and ArcGIS for Server 10.0 with Python 2.6.5 on our separate server machine. (We host a lot of mapping data on this server, including a web map interface which the whole company uses. It's old but it works, and we're reluctant to upgrade the software; no time to!)
03-27-2017 08:36 AM | 0 | 1 | 1673
POST
Yes, true, but two duplicate records may not always be identical; the newer one may have updated information within it. So how could the script decide which record is the newer, updated one, and keep that?
03-27-2017 07:46 AM | 0 | 6 | 1673
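If the records carry a unique ID plus some last-edited date stamp, the "keep the newer duplicate" rule becomes mechanical. A minimal pure-Python sketch; `UID` and `EditDate` are hypothetical field names, and the real data would need such a date field for this to work at all:

```python
def newest_per_id(records, id_field="UID", date_field="EditDate"):
    """Collapse duplicates, keeping the record with the latest edit
    date per ID. Assumes the date field sorts correctly as text
    (e.g. ISO yyyy-mm-dd) or is a real date type."""
    best = {}
    for rec in records:
        key = rec[id_field]
        if key not in best or rec[date_field] > best[key][date_field]:
            best[key] = rec
    return list(best.values())

rows = [
    {"UID": "P1", "EditDate": "2017-03-01", "Tenant": "old name"},
    {"UID": "P1", "EditDate": "2017-03-27", "Tenant": "new name"},
    {"UID": "P2", "EditDate": "2017-02-14", "Tenant": "unchanged"},
]
kept = newest_per_id(rows)
print(sorted(r["Tenant"] for r in kept))  # ['new name', 'unchanged']
```

Without a date stamp there is no reliable way for a script to know which duplicate is "newer", which is really the crux of the question above.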
POST
Thanks, Bruce, but the data contains 1000s of records containing confidential tenant information; I doubt I'd be allowed to share it. In an ideal world, I just need a simple script that deletes the existing dataset and places the new, updated dataset in the same directory with the same name. That way, none of the MXDs would need to be re-sourced each time. The reason I can't do this is an ever-present LOCK file on the existing dataset (it's under constant access by our GIS server...). Maybe a script that says:
1. Stop the GIS Server service
2. Delete Points.shp
3. Create an updated version of Points.shp in the same directory with the same name
4. Start the GIS Server service
03-27-2017 07:44 AM | 0 | 3 | 4801
POST
Also, when appending the new dataset to the original, I want the new dataset to take priority over the existing one, i.e. if two records match, then the old records are overwritten by the new ones.
03-27-2017 07:24 AM | 0 | 0 | 4801
POST
Dissolve won't work, as it creates a new dataset. It's crucial to me that the final dataset is the existing one (lots of ArcMap MXDs link to that shapefile).
03-27-2017 07:17 AM | 0 | 0 | 4801
POST
There are multiple separate points that share the exact same coordinates, so dissolving based on that would combine points that are in fact independent of each other. There is a unique ID field, however... I could dissolve the appended dataset based on that field?
03-27-2017 07:11 AM | 1 | 0 | 4801
POST
That's only available with an Advanced licence though; I only have Basic. And yes, most of the records are duplicates. Maybe there is a way to only append records that are new...
03-27-2017 06:57 AM | 1 | 3 | 4801
POST
Thank you very much! I would improvise if I knew how to. I'm still learning the ropes, you see... Cheers!
03-27-2017 06:52 AM | 0 | 0 | 644
POST
I have two point shapefiles, both with the exact same fields. There are some new records AND some duplicate records when comparing the two datasets. I want to use the Append tool, as I don't want to create a new dataset; I just want to add data to the existing original shapefile. However, when I append the two shapefiles, matching records are appended too, leaving lots of duplicate records. How can I tell my script to only append new records and ignore duplicates?
03-27-2017 06:50 AM | 1 | 29 | 16289
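One licence-free pattern for this: read the unique IDs already in the target first, then insert only the incoming rows whose ID is absent, instead of running Append over everything (with arcpy at 10.4, that might be an `arcpy.da.SearchCursor` to collect IDs and an `arcpy.da.InsertCursor` to add the rest; both work in place on the existing shapefile). The core set logic as a pure-Python sketch, with a hypothetical `UID` field:

```python
def records_to_append(existing, incoming, id_field="UID"):
    """Return only the incoming records whose ID is not already in
    the existing dataset; the Append tool by itself adds everything,
    duplicates included."""
    seen = set(rec[id_field] for rec in existing)
    return [rec for rec in incoming if rec[id_field] not in seen]

old = [{"UID": "P1"}, {"UID": "P2"}]
new = [{"UID": "P2"}, {"UID": "P3"}, {"UID": "P4"}]
to_add = records_to_append(old, new)
print([r["UID"] for r in to_add])  # ['P3', 'P4']
```

Building the `seen` set once keeps the membership test O(1) per incoming record, which matters at 25,000 rows.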
Title | Kudos | Posted
---|---|---
 | 1 | 03-27-2017 07:11 AM
 | 1 | 03-27-2017 06:57 AM
 | 1 | 05-08-2017 04:50 AM
 | 1 | 05-12-2017 12:52 AM
 | 1 | 12-07-2016 08:01 AM