I want to do spatial analysis with census data which can run to hundreds of thousands of rows and columns.
Last I remember, ArcMap limited Excel imports to something like 2,048 rows. I would think any database would trump Excel when handling that much data. It might even be better to put it in a File Geodatabase, then analyze from there.
And will you be using all the rows and columns at once? This sounds like one of those trick questions, given the limitations inherent to Excel, let alone Access.
Hundreds of thousands of rows, probably. But hundreds of thousands of columns? Whoever built that database needs to retake database-building 101...
I guess I need to clarify the question. I would like to work with the data ***BEFORE*** it goes into ArcMap. I'd rather import 5,000 selected records rather than a mess of 150,000 records with several hundred unwanted attributes.
Personally, I would forget Excel as an option. Access might be able to handle what you want to do; SQL Server or SQL Server Express are options as well.
I completely concur - don't even think about Excel (it isn't even a database); maybe consider Access for a second (just a second), but ultimately look at SQL... that is the way to go, hands down.
So if I want to delete 50 columns, better to do it with SQL (which I don't know) than with Access?
Access should work fine unless you get well past 500,000 rows. There is an .mdb file size limit of 2 GB.
I have used Access as a data store with tables of over 1 million rows. One has to be aware of the .mdb file size limit mentioned by Curtis.
But, create a new .mdb and import one or more data tables into it.
Then build a make-table query and add only those fields you need. That way, you still have all the original data, and you have a trimmed-down table you can use in ArcMap.
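The make-table workflow described above can be sketched in plain SQL (shown here via Python's built-in sqlite3, just as a stand-in for Access or SQL Server; the table and field names are hypothetical examples, not from the original data):

```python
import sqlite3

# Build a tiny stand-in "census" table with some unwanted fields.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE census (geoid TEXT, pop INTEGER, med_income INTEGER, extra1 TEXT)"
)
con.executemany(
    "INSERT INTO census VALUES (?, ?, ?, ?)",
    [("001", 5000, 42000, "x"), ("002", 120, 38000, "y")],
)

# Equivalent of an Access make-table query: keep only the fields
# (and rows) you actually want to bring into ArcMap.
con.execute(
    """
    CREATE TABLE census_subset AS
    SELECT geoid, pop, med_income
    FROM census
    WHERE pop > 1000
    """
)

rows = con.execute("SELECT * FROM census_subset").fetchall()
print(rows)  # only the selected fields and matching rows remain
```

The original table is untouched, so you can rebuild the subset with different fields or filters at any time.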