Can you elaborate on what you mean?
I'm trying to sum the value of every building in a given district, and ArcGIS Desktop 10.1 can't handle big numbers. Long integers can store up to about 2 billion; if your sum exceeds that, the spatial join returns Null. You don't get a warning or anything, which could be damaging if you weren't paying attention.
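To make the ceiling concrete: a minimal sketch of the limit being described, assuming the Sum field is a signed 32-bit Long. The district total here is a made-up number, and the Null result itself is ArcGIS behavior that plain Python can't reproduce.

```python
# Signed 32-bit Long ceiling: 2,147,483,647 (the "2 billion or so" above)
LONG_MAX = 2**31 - 1

district_total = 2_500_000_000  # hypothetical sum of building values

# Any sum above LONG_MAX won't fit the Long Sum field,
# which is where the silent Null comes from.
print(district_total > LONG_MAX)  # True
```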
I would be surprised if Esri used long integers for summation when double would be more appropriate, given that you can always downscale data. In my own work I use double as the default for all numeric operations and downscale if I need integers.
I was surprised. Go try it out. Maybe I'm missing something.
Downscaling the data is an inconvenient necessity that I've been doing for the last hour.
Have you tried computing the average as well as the sum? Count * Avg for your field should give you the total: Count and Sum are indeed Long types, but Avg is Double, and Long * Double should upscale to Double.
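The Count * Avg workaround above can be sketched in plain Python. The building values are illustrative, not from the original post; the point is only that the product of a small Count and a Double-typed Avg recovers a total that would overflow a 32-bit Long Sum field.

```python
LONG_MAX = 2**31 - 1  # 2,147,483,647: ceiling for a 32-bit Long field

# Hypothetical building values in one district
values = [900_000_000, 800_000_000, 700_000_000]

true_sum = sum(values)            # 2,400,000,000 — over the Long ceiling,
assert true_sum > LONG_MAX        # so the Long Sum field would come back Null

count = len(values)               # Count is a Long, but stays small
avg = true_sum / count            # Avg is stored as a Double
recovered = count * avg           # Long * Double upscales to Double

print(recovered)  # 2400000000.0
```

Doubles represent integers exactly up to 2^53, so for sums in the low billions this reconstruction is exact, not just approximate.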
That very well may work… seems like ESRI should just fix it instead of requiring data gymnastics.
I've written my own spatial join Python script to push the data into an appropriate datatype; I'd guess bigint would work fine. But the point of my post is a warning to users that the out-of-the-box spatial join will give BAD results if you've got a bunch of big numbers.
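A hedged sketch of the idea behind such a script: accumulate the per-district sums in a type with no 32-bit ceiling (Python ints are arbitrary precision). The `sum_by_district` helper and the sample records are hypothetical; a real script would read the features with arcpy, which isn't reproduced here.

```python
from collections import defaultdict

def sum_by_district(records):
    """records: iterable of (district_id, building_value) pairs."""
    totals = defaultdict(int)  # Python int: no 2**31 - 1 overflow
    for district, value in records:
        totals[district] += value
    return dict(totals)

# District 'A' totals 3 billion — past the Long ceiling, but fine here.
records = [("A", 1_500_000_000), ("A", 1_500_000_000), ("B", 10)]
print(sum_by_district(records))  # {'A': 3000000000, 'B': 10}
```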
I agree, Kevin, but I suspect your sums might be in a distinct minority.
not really big data... just too big for a great tool that needs improvement.
I've got a database of 62 billion data points that is the back end for the map at solarsimplified.org and each of those data points is a double. That's big data!