Hi Patrick,
This is definitely not the case. We have services in use on our Open Data site where we know the downloaded data is not up to date, or in some cases does not even download properly (in spreadsheet format, for instance).
We also use a script to force-update every item used in our Open Data site each night, to ensure the cache is refreshed. Even then, we see erratic and inconsistent behaviour. Only when we manually go into the back end and update the cache does this sometimes resolve the issue. We see the problem with two of our larger datasets: one with over 12,000 rows and another with 120,000 rows. We don't consider this big data, so we are unsure why it is occurring.
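For context, our nightly job is essentially along these lines. This is only a minimal sketch: the endpoint URL, the `force` parameter, and the batch size are placeholders for illustration, not the actual API of any particular platform.

```python
# Minimal sketch of the nightly cache-refresh job.
# REFRESH_ENDPOINT and the "force" parameter are hypothetical placeholders;
# substitute your platform's real refresh/update call.
from urllib.parse import urlencode

REFRESH_ENDPOINT = "https://example.org/api/items/{item_id}/refresh"  # hypothetical

def refresh_url(item_id, force=True):
    """Build the refresh URL for one item (force=True to bypass a stale cache)."""
    query = urlencode({"force": str(force).lower()})
    return REFRESH_ENDPOINT.format(item_id=item_id) + "?" + query

def batches(item_ids, size=50):
    """Yield item IDs in small batches so the large datasets don't time out."""
    for i in range(0, len(item_ids), size):
        yield item_ids[i:i + size]

# Usage (the actual HTTP call is omitted here):
# for batch in batches(all_item_ids):
#     for item_id in batch:
#         url = refresh_url(item_id)
#         ... POST to url, check the response, retry on failure ...
```

Even with a job like this running every night, the cached downloads are still stale for the two large datasets.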
Like the poster above, we are considering creating offline versions of our data to ensure our users get current data, which is not ideal.
If you could provide best practice methods for these large datasets, that would be great.