
R-ArcGIS Bridge for ArcPro - Memory Allocation Limit?

09-13-2017 04:36 PM
DavidHerries
Emerging Contributor

My R script fails for larger datasets (memory exhausted), but it succeeds when not connected to ArcPro and run natively in RStudio without the Bridge.

Error Message:  Error: Memory exhausted (limit reached?)

In troubleshooting this, we used the memory.limit() function to display the memory limit allocated for the script to run within. When we check the limit from RStudio we get 16000+ MB (the PC has 16 GB of RAM), but when we run the same memory.limit() check from within ArcPro with R, the limit reads 2047 MB.

Hence it is failing: it seems to be working with a different memory limit allocation when used in conjunction with ArcPro. Does that sound right, or are we missing something obvious?
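
For reference, the check itself is a one-liner (memory.limit() is Windows-only); running it in both environments shows the mismatch directly:

# Windows-only: report the current allocation limit for this R session, in MB.
# Run once in RStudio and once from an ArcPro script tool to compare.
print(memory.limit())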

 

Platform: ArcPro 2 (latest release)

R: 3.4.1 (64-bit)

OS: Windows 10

Hardware: i7 16GB RAM Dell XPS 15

I just moved this question to this group from the main ArcPro support discussion.

Thanks in advance

Dave


2 Replies
ShaunWalbridge
Esri Regular Contributor

Dave,

I can reproduce this issue locally. Despite the 64-bit DLLs involved, the memory limit is being capped at what you'd expect for a 32-bit process (2 GB). I'm looking into this issue; it appears to be a default value being set by R, but it isn't clear to me yet why that is. In the meantime, as a workaround, you can add this to the top of any scripts you're working with that need more memory:

# Query Windows for the total visible physical memory (reported in KB),
# strip the field name and trailing carriage return, and convert to MB.
limit.mb <- as.numeric(
    gsub("TotalVisibleMemorySize=", "",
         gsub("\r", "",
              system('wmic OS get TotalVisibleMemorySize /Value',
                     intern = TRUE)[3]))) / 1024

# Raise R's allocation limit to the machine's total physical memory.
set.limit <- memory.limit(limit.mb)

That will set the memory available to R to the total memory available on the machine.
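
A quick way to verify the workaround took effect is to print the limit again afterwards:

# Called with no arguments, memory.limit() reports the current limit in MB;
# after the workaround it should match the machine's total physical memory.
print(memory.limit())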

Cheers,
Shaun

DavidHerries
Emerging Contributor

Shaun

This worked perfectly and is a great workaround, thank you.

Cheers

Dave