We had a customer who was seeing very slow render times for their feature service. After much testing we discovered that the layer was created by importing a CSV file using "Add XY Data", exporting to a file geodatabase, then to an enterprise geodatabase. The file geodatabase text field was 8,000 characters long, and the enterprise geodatabase field length was a whopping 1,073,741,822 characters! I tried setting the field length in Excel but it didn't carry over into ArcGIS. If there isn't a workaround anyone knows about, consider this an FYI to Esri.
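For anyone hitting the same issue, a quick way to see what lengths your text fields actually need (before building a properly sized feature class) is to scan the CSV yourself. A minimal sketch in plain Python; the column names and sample values here are made up for illustration:

```python
import csv
from io import StringIO

def max_column_widths(rows):
    """Return the widest value (in characters) seen in each column."""
    widths = {}
    for row in rows:
        for field, value in row.items():
            widths[field] = max(widths.get(field, 0), len(value or ""))
    return widths

# Hypothetical sample standing in for the exported CSV.
sample = StringIO(
    "SiteName,Comment\n"
    "Well 1,Routine inspection\n"
    "Well 2,Pump replaced after failure\n"
)
widths = max_column_widths(csv.DictReader(sample))
print(widths)  # -> {'SiteName': 6, 'Comment': 27}
```

Those numbers (plus some headroom) are what you'd use for the text field lengths, rather than the defaults the import picks.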
If it is a CSV file, have you looked at the first few records? Why not use the Excel To Table tool rather than a CSV?
Hi Dan,
I did try the .xlsx route, but it gives a 255-character limit.
I tried to delete the post in "mapping" but could not find a delete option and had to run.
We are trying the "Alter Field" tool.
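In case it helps, Alter Field can be scripted. One caveat: in many geodatabase/ArcGIS releases a text field's length can only be altered while the table is empty, so the create-an-empty-feature-class-and-append route may still be needed. A rough sketch (path and field name are made up):

```python
import arcpy

# Hypothetical enterprise geodatabase feature class.
fc = r"C:\data\connection.sde\gdb.owner.Sites"

# Shrink the oversized text field to a sane length. Note that in many
# releases AlterField can only change a text field's length on an
# empty table, which is why the append workaround is often required.
arcpy.management.AlterField(
    in_table=fc,
    field="Comment",
    field_length=500,
)
```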
I read the OP the other day and wondered out loud "how in the h-e-double-hockey-stick can you get a field that's billions of characters wide?"
Earlier today I found out...
I'm working on a project that involves migrating data off of a COBOL-programmed mainframe computer into the modern realm. (A quote from a meeting the other day that I just love: "The mainframe was deployed 30 years ago and it seems to me that we've been trying to get off of it for 25.")
Nonetheless, here's my story: I got a data dump of the main table on the mainframe in, wait for it... a text file. Hooray! I pounded that into CSV format, imported the CSV into Excel, and saved it as a spreadsheet. Yeah... my favorite... I then added the .xlsx into ArcMap and exported it to a table. (Has anyone kept count of how many times I've stepped on this data?)
I've actually worked with the data, even joining it to a feature class, for a couple of days. But today I took a look at the various field properties: the character/text type fields are on the order of 2,000,000,000 characters wide. That's 2 billion-plus; somebody hand me that double hockey stick!
First thing in the morning I'll be creating an empty feature class with all 62 (that's right kids, 62) fields of the proper dimensions and loading the existing data into it. (Why bother with related tables when you can have ONE humongous one?)
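Building that many properly sized fields by hand gets tedious, so a loop helps. A sketch of the idea in ArcPy; the geodatabase path, table name, and field spec below are all hypothetical:

```python
import arcpy

gdb = r"C:\data\migration.gdb"            # hypothetical file geodatabase
arcpy.management.CreateTable(gdb, "Mainframe")

# Hypothetical field spec: name -> text length the data actually needs,
# instead of the ~2-billion-character widths the import produced.
field_lengths = {"AcctNo": 12, "Owner": 100, "Remarks": 500}
for name, length in field_lengths.items():
    arcpy.management.AddField(gdb + "\\Mainframe", name,
                              "TEXT", field_length=length)
```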
Wowza....
BTW - I just looked at this: FAQ: Can the field length in an attribute table be modified? It mentions the longest length of a field is 255. Now that's funny!
Ahhh COBOL... when life was simpler pre Y2K and phones were just phones
Yes! Well, we ended up creating an empty feature class and appending the data into it. That feature service runs super fast now.
Can you give more specifics on how you did the append? Once I set up the empty feature class, I'm not sure how to bring in the CSV file.
Bring the CSV in first, making it a table, then append that table to the empty one.
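The two steps above can be sketched in ArcPy as follows; the paths and table names are made up, and this assumes the empty, properly sized table already exists:

```python
import arcpy

gdb = r"C:\data\work.gdb"                 # hypothetical paths throughout
csv_file = r"C:\data\export.csv"

# 1. Bring the CSV in as a geodatabase table.
arcpy.conversion.TableToTable(csv_file, gdb, "csv_import")

# 2. Append it into the empty, properly sized table.
#    NO_TEST relaxes strict schema matching so the differing
#    field lengths don't block the load.
arcpy.management.Append(
    inputs=gdb + "\\csv_import",
    target=gdb + "\\Sites_empty",
    schema_type="NO_TEST",
)
```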
Thanks for the help - got it to work!
To follow up on Dan's suggestion, check this out: https://www.arcgis.com/home/item.html?id=f3d91b8f852042e289e09a7ec8342431
It's a set of Excel and CSV conversion tools from Esri's teampython.