<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic how to make big data statistical analysis run quicker in ArcGIS GeoEvent Server Questions</title>
    <link>https://community.esri.com/t5/arcgis-geoevent-server-questions/how-to-make-big-data-statistical-analysis-run/m-p/802082#M3338</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I am running a statistical analysis on a large amount of data on my own computer and need a way to make it run faster; 4-5 days for a single run is unworkable. I already have a Google Cloud GPU server for my Python code. Can that server be used for ArcMap? I assume I would have to install ArcMap on the server first and set up the connection in ArcCatalog, but I only have a student license. I have also seen an article about uploading data to the cloud and registering it for access, but I don't know whether that would give me a faster processor so my analyses run faster, or whether it would just be a convenient storage spot. Can anyone explain how to process large amounts of data quickly using Esri products? Thanks!&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Fri, 10 Apr 2020 21:55:30 GMT</pubDate>
    <dc:creator>JenniferCrosby1</dc:creator>
    <dc:date>2020-04-10T21:55:30Z</dc:date>
    <item>
      <title>how to make big data statistical analysis run quicker</title>
      <link>https://community.esri.com/t5/arcgis-geoevent-server-questions/how-to-make-big-data-statistical-analysis-run/m-p/802082#M3338</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I am running a statistical analysis on a large amount of data on my own computer and need a way to make it run faster; 4-5 days for a single run is unworkable. I already have a Google Cloud GPU server for my Python code. Can that server be used for ArcMap? I assume I would have to install ArcMap on the server first and set up the connection in ArcCatalog, but I only have a student license. I have also seen an article about uploading data to the cloud and registering it for access, but I don't know whether that would give me a faster processor so my analyses run faster, or whether it would just be a convenient storage spot. Can anyone explain how to process large amounts of data quickly using Esri products? Thanks!&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 10 Apr 2020 21:55:30 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-geoevent-server-questions/how-to-make-big-data-statistical-analysis-run/m-p/802082#M3338</guid>
      <dc:creator>JenniferCrosby1</dc:creator>
      <dc:date>2020-04-10T21:55:30Z</dc:date>
    </item>
  </channel>
</rss>