<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic BMP for updating attribute of large Hosted Feature Service on a weekly basis? in ArcGIS API for Python Questions</title>
    <link>https://community.esri.com/t5/arcgis-api-for-python-questions/bmp-for-updating-attribute-of-large-hosted-feature/m-p/806124#M2285</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;My organization has a point dataset with over 1 million records that needs to be symbolized by one of its attributes (let's call it status). Each week, approximately 10,000 of the &amp;gt;1 million statuses need to be updated based on what happened that week.&lt;/P&gt;&lt;P&gt;I'm curious whether the best method is simply to use the Python API to join on ID and calculate. Is there a more efficient way to apply such a small change to a large dataset? Am I overcomplicating this? A visual representation of the dataset is below.&lt;/P&gt;&lt;P&gt;&lt;IMG alt="Sample of the dataset" class="image-1 jive-image j-img-original" src="https://community.esri.com/legacyfs/online/447084_dataset.PNG" /&gt;&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Wed, 08 May 2019 21:05:38 GMT</pubDate>
    <dc:creator>ShelbyZemken1</dc:creator>
    <dc:date>2019-05-08T21:05:38Z</dc:date>
    <item>
      <title>BMP for updating attribute of large Hosted Feature Service on a weekly basis?</title>
      <link>https://community.esri.com/t5/arcgis-api-for-python-questions/bmp-for-updating-attribute-of-large-hosted-feature/m-p/806124#M2285</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;My organization has a point dataset with over 1 million records that needs to be symbolized by one of its attributes (let's call it status). Each week, approximately 10,000 of the &amp;gt;1 million statuses need to be updated based on what happened that week.&lt;/P&gt;&lt;P&gt;I'm curious whether the best method is simply to use the Python API to join on ID and calculate. Is there a more efficient way to apply such a small change to a large dataset? Am I overcomplicating this? A visual representation of the dataset is below.&lt;/P&gt;&lt;P&gt;&lt;IMG alt="Sample of the dataset" class="image-1 jive-image j-img-original" src="https://community.esri.com/legacyfs/online/447084_dataset.PNG" /&gt;&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
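      <!-- Editor's note: a minimal sketch of the targeted-update approach the question describes:
           push only the ~10,000 changed records via FeatureLayer.edit_features() rather than
           re-joining against all 1M+ features. The field names "site_id" and "status", the item
           ID placeholder, and the ID-to-ObjectID mapping are all hypothetical assumptions, not
           taken from the post. -->

```python
# Sketch, assuming the changed records arrive as {site_id: new_status} and a
# lookup from site_id to the layer's ObjectID has already been queried.
# Field names "site_id"/"status" are hypothetical.

def build_updates(changes, id_to_oid):
    """Turn {site_id: new_status} into edit_features() update payloads."""
    return [
        {"attributes": {"objectid": id_to_oid[sid], "status": new_status}}
        for sid, new_status in changes.items()
        if sid in id_to_oid  # skip IDs not present in the hosted layer
    ]

# Example payload for two changed records:
updates = build_updates(
    {"A-100": "Complete", "A-101": "In Progress"},
    {"A-100": 1, "A-101": 2},
)
# updates == [{"attributes": {"objectid": 1, "status": "Complete"}},
#             {"attributes": {"objectid": 2, "status": "In Progress"}}]

# With a live connection, the batch would then be pushed in chunks so each
# applyEdits request stays small (item ID below is a placeholder):
# from arcgis.gis import GIS
# gis = GIS("https://www.arcgis.com", "username", "password")
# layer = gis.content.get("<hosted feature layer item id>").layers[0]
# for i in range(0, len(updates), 2000):
#     layer.edit_features(updates=updates[i:i + 2000])
```

      <!-- Only the changed rows cross the wire, so cost scales with the ~10k weekly
           edits rather than the full 1M+ record layer. -->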
      <pubDate>Wed, 08 May 2019 21:05:38 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-api-for-python-questions/bmp-for-updating-attribute-of-large-hosted-feature/m-p/806124#M2285</guid>
      <dc:creator>ShelbyZemken1</dc:creator>
      <dc:date>2019-05-08T21:05:38Z</dc:date>
    </item>
  </channel>
</rss>

