I have a very large data.frame in R (over 13 million rows with 16 variables). Attempts at using arc.write to output the data.frame as a new table object in a geodatabase have repeatedly failed, for what I assume are memory constraints. How can arc.write be used to chunk this large table into the geodatabase so as not to exceed memory limits?
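For reference, the failing call is essentially of this shape (the geodatabase path and table name here are placeholders, not my actual ones):

library(arcgisbinding)
arc.check_product()

# df is the 13M-row data.frame; "C:/data/project.gdb/big_table" is a placeholder path
arc.write("C:/data/project.gdb/big_table", df)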
Hello Wyatt,
You can use the pattern below to read, process, and write a large raster in chunks. In this scenario you open a large raster (call it raster_file), process it one row at a time, and write the result to a new raster (call it r_new):
library(arcgisbinding)
arc.check_product()

# Open the source raster
r <- arc.raster(arc.open(raster_file))

# Define a blank raster to write to, with the same extent and discretization as r
r2 <- arc.raster(NULL, path = tempfile("r_new", fileext = ".img"),
                 dim = dim(r), pixel_type = r$pixel_type,
                 nodata = r$nodata, extent = r$extent, sr = r$sr)

# Loop through the rows of the large raster (assuming you can hold one row in
# memory), reading, processing (here scaled by 1.5), and writing one row at a time
for (i in 1L:r$nrow)
{
  v <- r$pixel_block(ul_y = i - 1L, nrow = 1L)
  r2$write_pixel_block(v * 1.5, ul_y = i - 1L, nrow = 1L, ncol = r$ncol)
}

# Commit the buffered writes so the new raster is flushed to disk
r2$commit()
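To apply the same chunking idea to your table: as far as I know, arc.write cannot append rows to an existing table, so one sketch (untested; the path, table names, and chunk size below are placeholders) is to write the data.frame in fixed-size chunks to separate tables in the geodatabase and merge them afterwards, e.g. with the Append geoprocessing tool in ArcGIS:

# Sketch: write df in chunks of 500k rows, each chunk to its own table
gdb <- "C:/data/output.gdb"   # placeholder geodatabase path
chunk_size <- 500000L
starts <- seq(1L, nrow(df), by = chunk_size)
for (j in seq_along(starts))
{
  rows <- starts[j]:min(starts[j] + chunk_size - 1L, nrow(df))
  arc.write(file.path(gdb, paste0("big_table_", j)), df[rows, ])
}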