<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Geoprocessing behind reports in Business Questions</title>
    <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443322#M532</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;SPAN&gt;Hi Julie,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;All Esri Business Analyst and Community Analyst products, including Desktop, Server, Online/Web, and the API, rely on the same data apportionment methodology through common business components "under the hood."&amp;nbsp; So the Desktop documentation that Jason provided should be a good reference for you.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Additionally, I wanted to call out an independent data accuracy study that compared our projections and estimates (computed with the underlying methodology we have been discussing in this thread) with those of major competitors:&lt;/SPAN&gt;&lt;BR /&gt;&lt;UL&gt;&lt;BR /&gt;&lt;LI&gt;&lt;A href="http://www.esri.com/data/esri_data/demographic-overview"&gt;Data Vendor Study Overview&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="http://www.esri.com/data/esri_data/demographic-overview/data-vendor-study/~/media/Files/Pdfs/library/brochures/pdfs/vendor-accuracy-study.pdf"&gt;Data Vendor Study Summary &amp;amp; Findings&lt;/A&gt;&lt;/LI&gt;&lt;BR /&gt;&lt;/UL&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Tony&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Tue, 02 Apr 2013 14:54:35 GMT</pubDate>
    <dc:creator>TonyHowser</dc:creator>
    <dc:date>2013-04-02T14:54:35Z</dc:date>
    <item>
      <title>Geoprocessing behind reports</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443317#M527</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;SPAN&gt;Hi all,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I have checked the API reference and the Concepts reference for the Community Analyst API, but I can't see anything that describes the geoprocessing routine that generates the reports. In order to use the API, I need to know what assumptions are being made. For example, in this sample:&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;A href="http://help.arcgis.com/en/communityanalyst/apis/flex/samples/index.html?sample=SummaryReports"&gt;http://help.arcgis.com/en/communityanalyst/apis/flex/samples/index.html?sample=SummaryReports&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;are the values merely a summary of the census geographies that fall within the selected polygon, or is a clip taking place? If the latter, is there an assumption that population is uniform across the entire census polygon, or is population apportioned based on other variables, e.g., land use?&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I suspect I'm just missing a key reference that documents how BA works - can someone please point me in the right direction?&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;TIA,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Julie&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 29 Mar 2013 16:22:15 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443317#M527</guid>
      <dc:creator>JulieKanzler</dc:creator>
      <dc:date>2013-03-29T16:22:15Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443318#M528</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;BLOCKQUOTE class="jive-quote"&gt;Hi all,&lt;BR /&gt;&lt;BR /&gt;I have checked the API reference and the Concepts reference for the Community Analyst API, but I can't see anything that describes the geoprocessing routine that generates the reports. In order to use the API, I need to know what assumptions are being made. For example, in this sample:&lt;BR /&gt;&lt;BR /&gt;&lt;A href="http://help.arcgis.com/en/communityanalyst/apis/flex/samples/index.html?sample=SummaryReports"&gt;http://help.arcgis.com/en/communityanalyst/apis/flex/samples/index.html?sample=SummaryReports&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;are the values merely a summary of the census geographies that fall within the selected polygon, or is a clip taking place? If the latter, is there an assumption that population is uniform across the entire census polygon, or is population apportioned based on other variables, e.g., land use?&lt;BR /&gt;&lt;BR /&gt;I suspect I'm just missing a key reference that documents how BA works - can someone please point me in the right direction?&lt;BR /&gt;&lt;BR /&gt;TIA,&lt;BR /&gt;Julie&lt;/BLOCKQUOTE&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Hi Julie,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;This is a very good question, and I can see that you are doing your due diligence.&amp;nbsp; I am happy to say that this discussion touches on one of Esri's distinctive competencies in spatial analysis and data apportionment.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;At a high level, our approach leverages &lt;/SPAN&gt;&lt;STRONG&gt;dasymetric interpolation&lt;/STRONG&gt;&lt;SPAN&gt; to provide accurate estimates of population/households and their associated demographics, lifestyle and consumer characteristics, spending, market potential, etc.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Our underlying methodology goes well beyond what a simple polygon intersection would provide, because we support &lt;/SPAN&gt;&lt;STRONG&gt;truly variable study areas&lt;/STRONG&gt;&lt;SPAN&gt;, which may cover only part of a standard geography/administrative boundary such as a ZIP code, county, Census tract, Census block group, Japanese prefecture, Russian oblast, or UK postal code (e.g., what if a single study area crossed multiple ZIP codes and covered only part of each?).&amp;nbsp; On top of this, we also account for the &lt;/SPAN&gt;&lt;STRONG&gt;variability of population/household distributions&lt;/STRONG&gt;&lt;SPAN&gt; within these administrative geographies, because households and populations are not uniformly distributed: the urban core of a metropolitan area may have a far greater population density than its outlying suburbs, even though both are in the same county or jurisdiction. So if my custom study area covered 50% of a county, I could not simply say that 50% of the county's population lies in the study area, because that assumes a uniform distribution of population. Regretfully, many third-party solutions make this assumption, which generally produces less accurate results.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;These are some of the enhancements in our underlying data apportionment "engine" that enable us to quickly and accurately provide estimates and projections for ad hoc or arbitrary study areas such as drive-time polygons, manually digitized/drawn polygons, and custom regions.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Hope this is helpful and sheds light on why we believe our content and analysis add significant value to your use cases and workflows.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Tony&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 29 Mar 2013 19:06:46 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443318#M528</guid>
      <dc:creator>TonyHowser</dc:creator>
      <dc:date>2013-03-29T19:06:46Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443319#M529</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;SPAN&gt;Tony,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks for the response! It sounds like you have a defensible approach, but I still would like to know a bit more. Every analytical result has caveats and pitfalls, and it's difficult to know what decisions can be made from data that isn't qualified by a clear understanding of the algorithm that produced it. I know that dasymetric mapping comes in many flavors, especially depending on the inputs you select for your interpolation. Do you have a whitepaper that summarizes the techniques you're using? Perhaps at least some peer-reviewed articles referencing the key components of your algorithm(s)?&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks again!&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Julie&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 29 Mar 2013 19:33:37 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443319#M529</guid>
      <dc:creator>JulieKanzler</dc:creator>
      <dc:date>2013-03-29T19:33:37Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443320#M530</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;BLOCKQUOTE class="jive-quote"&gt;Tony,&lt;BR /&gt;&lt;BR /&gt;Thanks for the response! It sounds like you have a defensible approach, but I still would like to know a bit more. Every analytical result has caveats and pitfalls, and it's difficult to know what decisions can be made from data that isn't qualified by a clear understanding of the algorithm that produced it. I know that dasymetric mapping comes in many flavors, especially depending on the inputs you select for your interpolation. Do you have a whitepaper that summarizes the techniques you're using? Perhaps at least some peer-reviewed articles referencing the key components of your algorithm(s)?&lt;BR /&gt;&lt;BR /&gt;Thanks again!&lt;BR /&gt;Julie&lt;/BLOCKQUOTE&gt;&lt;BR /&gt;&lt;SPAN&gt; &lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Julie,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;You may find the link below to the Business Analyst 10.1 Desktop online Help useful, as it details how the software summarizes data.&amp;nbsp; There are several methods that allow you to decide how data is summarized spatially.&amp;nbsp; I believe BAO uses a method similar to the Hybrid method detailed there, but I find the section on block apportionment vs. cascading the most informative.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;A href="http://resources.arcgis.com/en/help/main/10.1/#/Data_tab/000z000000v7000000/"&gt;http://resources.arcgis.com/en/help/main/10.1/#/Data_tab/000z000000v7000000/&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Regards,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Jason R.&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Sat, 30 Mar 2013 13:54:41 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443320#M530</guid>
      <dc:creator>Jason_RobinsonRobinson</dc:creator>
      <dc:date>2013-03-30T13:54:41Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443321#M531</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;SPAN&gt;Hi Jason,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks! I can see how the ArcGIS Desktop help could provide some clues here, because I'm guessing that everything is accessing the same server resources. But I'm just not sure how to rectify your input with what Tony provided. I was mainly asking about the SummaryReportsTask in the API, but I am also interested to learn more about the various ways that data can be pulled. How do the API, BAO, and the Extension relate to one another?&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks again,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Julie&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Mon, 01 Apr 2013 13:25:43 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443321#M531</guid>
      <dc:creator>JulieKanzler</dc:creator>
      <dc:date>2013-04-01T13:25:43Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443322#M532</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;SPAN&gt;Hi Julie,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;All Esri Business Analyst and Community Analyst products, including Desktop, Server, Online/Web, and the API, rely on the same data apportionment methodology through common business components "under the hood."&amp;nbsp; So the Desktop documentation that Jason provided should be a good reference for you.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Additionally, I wanted to call out an independent data accuracy study that compared our projections and estimates (computed with the underlying methodology we have been discussing in this thread) with those of major competitors:&lt;/SPAN&gt;&lt;BR /&gt;&lt;UL&gt;&lt;BR /&gt;&lt;LI&gt;&lt;A href="http://www.esri.com/data/esri_data/demographic-overview"&gt;Data Vendor Study Overview&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="http://www.esri.com/data/esri_data/demographic-overview/data-vendor-study/~/media/Files/Pdfs/library/brochures/pdfs/vendor-accuracy-study.pdf"&gt;Data Vendor Study Summary &amp;amp; Findings&lt;/A&gt;&lt;/LI&gt;&lt;BR /&gt;&lt;/UL&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Tony&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 02 Apr 2013 14:54:35 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443322#M532</guid>
      <dc:creator>TonyHowser</dc:creator>
      <dc:date>2013-04-02T14:54:35Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443323#M533</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;BLOCKQUOTE class="jive-quote"&gt;Julie,&lt;BR /&gt;&lt;BR /&gt;You may find the link below to the Business Analyst 10.1 Desktop online Help useful, as it details how the software summarizes data.&amp;nbsp; There are several methods that allow you to decide how data is summarized spatially.&amp;nbsp; I believe BAO uses a method similar to the Hybrid method detailed there, but I find the section on block apportionment vs. cascading the most informative.&lt;BR /&gt;&lt;BR /&gt;&lt;A href="http://resources.arcgis.com/en/help/main/10.1/#/Data_tab/000z000000v7000000/"&gt;http://resources.arcgis.com/en/help/main/10.1/#/Data_tab/000z000000v7000000/&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;Regards,&lt;BR /&gt;Jason R.&lt;/BLOCKQUOTE&gt;&lt;BR /&gt;&lt;SPAN&gt; &lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;I am writing a fact sheet to make folks at my university aware of the availability of Community Analyst. I would like to be able to give them a definitive answer as to how data is apportioned for non-standard areas (polygons). The answers in this thread point to methods outlined for Business Analyst Desktop (block apportionment, cascading centroid, and hybrid) in ArcGIS Desktop 10.x Help. However, I do not see a definitive answer in this thread as to whether these are exactly the same methods used in Community Analyst. Can anyone suggest where I could get that answer? Thanks&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 21 Jun 2013 19:03:09 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443323#M533</guid>
      <dc:creator>RobertSwett</dc:creator>
      <dc:date>2013-06-21T19:03:09Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443324#M534</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;BLOCKQUOTE class="jive-quote"&gt;I am writing a fact sheet to make folks at my university aware of the availability of Community Analyst. I would like to be able to give them a definitive answer as to how data is apportioned for non-standard areas (polygons). The answers in this thread point to methods outlined for Business Analyst Desktop (block apportionment, cascading centroid, and hybrid) in ArcGIS Desktop 10.x Help. However, I do not see a definitive answer in this thread as to whether these are exactly the same methods used in Community Analyst. Can anyone suggest where I could get that answer? Thanks&lt;/BLOCKQUOTE&gt;&lt;BR /&gt;&lt;SPAN&gt; &lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Bob,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Community Analyst uses the Hybrid method.&amp;nbsp; Reports such as the local Demographic and Income Profile run in Business Analyst Desktop should exactly match the same report run from Online Reports.&amp;nbsp; Making sure both products' report values line up has definitely been a development focus over the past couple of years.&amp;nbsp; I have attached a PDF that goes into more advanced detail on summarizations in general that might be beneficial.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Regards,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Jason R.&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 21 Jun 2013 20:59:09 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443324#M534</guid>
      <dc:creator>Jason_RobinsonRobinson</dc:creator>
      <dc:date>2013-06-21T20:59:09Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443325#M535</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;SPAN&gt;Thanks much Jason. Very helpful. &lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I have another question. The following Esri web page seems to indicate that Community Analyst Standard Plus currently provides access to 12,764 variables: &lt;/SPAN&gt;&lt;A href="http://www.esri.com/software/arcgis/community-analyst/variables"&gt;http://www.esri.com/software/arcgis/community-analyst/variables&lt;/A&gt;&lt;SPAN&gt;. &lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Is that right?&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Also, when I sum up the quantities that appear alongside the names of each dataset (or data category) on the left of the page, the total is 30,595. If these quantities indicate the number of variables in each dataset (or data category), then am I right in assuming that some variables are being "double counted"? I.e., whoever created the list decided that particular variables could be classified in various data categories? &lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 27 Jun 2013 18:51:26 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443325#M535</guid>
      <dc:creator>RobertSwett</dc:creator>
      <dc:date>2013-06-27T18:51:26Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443326#M536</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;BLOCKQUOTE class="jive-quote"&gt;Thanks much Jason. Very helpful. &lt;BR /&gt;&lt;BR /&gt;I have another question. The following Esri web page seems to indicate that Community Analyst Standard Plus currently provides access to 12,764 variables: &lt;A href="http://www.esri.com/software/arcgis/community-analyst/variables"&gt;http://www.esri.com/software/arcgis/community-analyst/variables&lt;/A&gt;. &lt;BR /&gt;&lt;BR /&gt;Is that right?&lt;BR /&gt;&lt;BR /&gt;Also, when I sum up the quantities that appear alongside the names of each dataset (or data category) on the left of the page, the total is 30,595. If these quantities indicate the number of variables in each dataset (or data category), then am I right in assuming that some variables are being "double counted"? I.e., whoever created the list decided that particular variables could be classified in various data categories? &lt;BR /&gt;&lt;BR /&gt;Thanks&lt;/BLOCKQUOTE&gt;&lt;BR /&gt;&lt;SPAN&gt; &lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Bob,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;There is definitely double counting going on in that category list.&amp;nbsp; Expand Employments/Jobs/Labor as well as Transportation and you will see that the 2012 Labor Force datasets are in both.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Regards,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Jason&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 27 Jun 2013 21:16:04 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443326#M536</guid>
      <dc:creator>Jason_RobinsonRobinson</dc:creator>
      <dc:date>2013-06-27T21:16:04Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443327#M537</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;SPAN&gt;Again, thanks for your help, Jason (another question :)).&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Based on what I have read, it seems apparent that USPS ZIP codes are used as a standard geography for summarizing data in Community Analyst. NOT ZIP Code Tabulation Areas (ZCTAs). But I want to be sure that is the case. ZIP Codes or ZCTAs? (If ZIP Codes - how often does Esri update its data summaries/tabulations?)&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;(P.S. This Esri page contains descriptions of Community Analyst data: &lt;/SPAN&gt;&lt;A href="http://help.arcgis.com/en/communityanalyst/online/data/data_descriptions.html"&gt;http://help.arcgis.com/en/communityanalyst/online/data/data_descriptions.html&lt;/A&gt;&lt;SPAN&gt;.&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;When I hover over the geography description for "Census Tract" on the page, it indicates that a tract generally has between 1,500 and 8,000 households. I believe that it should refer to people instead of households. Right?) &lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 28 Jun 2013 14:17:00 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443327#M537</guid>
      <dc:creator>RobertSwett</dc:creator>
      <dc:date>2013-06-28T14:17:00Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443328#M538</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;BLOCKQUOTE class="jive-quote"&gt;Again, thanks for your help, Jason (another question :)).&lt;BR /&gt;&lt;BR /&gt;Based on what I have read, it seems apparent that USPS ZIP codes are used as a standard geography for summarizing data in Community Analyst. NOT ZIP Code Tabulation Areas (ZCTAs). But I want to be sure that is the case. ZIP Codes or ZCTAs? (If ZIP Codes - how often does Esri update its data summaries/tabulations?)&lt;BR /&gt;&lt;BR /&gt;Thanks&lt;/BLOCKQUOTE&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Bob,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt; &lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Community Analyst just uses residential delivery zip codes (area/polygon zips) and does not use ZCTAs.&amp;nbsp; ZIP code boundaries come from the NAVTEQ Q4 2011 release for the Esri 2012/2017 data update.&amp;nbsp; Each time there is a new Esri data release there will be a newer zip code vintage.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Regards,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Jason R.&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 28 Jun 2013 22:53:45 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443328#M538</guid>
      <dc:creator>Jason_RobinsonRobinson</dc:creator>
      <dc:date>2013-06-28T22:53:45Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443329#M539</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;BLOCKQUOTE class="jive-quote"&gt;Bob,&lt;BR /&gt; &lt;BR /&gt;Community Analyst just uses residential delivery zip codes (area/polygon zips) and does not use ZCTAs.&amp;nbsp; ZIP code boundaries come from the NAVTEQ Q4 2011 release for the Esri 2012/2017 data update.&amp;nbsp; Each time there is a new Esri data release there will be a newer zip code vintage.&lt;BR /&gt;&lt;BR /&gt;Regards,&lt;BR /&gt;Jason R.&lt;/BLOCKQUOTE&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks Again, Jason. I have been diving deep into Community Analyst and questions keep coming up.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I created a 3 mile ring around my home address in Community Analyst Online using "Find Location." I then created a 3 mile ring around my home address in ArcGIS 10.1 (spatial reference WGS_1984_Web_Mercator_Auxiliary_Sphere) using the Business Analyst Evaluate Site tool and uploaded (imported) the shapefile to Community Analyst. The radius of the "3 mile" ring created on the desktop is slightly smaller than that created in CA online, and the reports generated by each are different. Is this a spatial reference issue?&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 02 Jul 2013 15:49:48 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443329#M539</guid>
      <dc:creator>RobertSwett</dc:creator>
      <dc:date>2013-07-02T15:49:48Z</dc:date>
    </item>
    <item>
      <title>Re: Geoprocessing behind reports [Esri data apportionment methodology]</title>
      <link>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443330#M540</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;BLOCKQUOTE class="jive-quote"&gt;Thanks Again, Jason. I have been diving deep into Community Analyst and questions keep coming up.&lt;BR /&gt;&lt;BR /&gt;I created a 3 mile ring around my home address in Community Analyst Online using "Find Location." I then created a 3 mile ring around my home address in ArcGIS 10.1 (spatial reference WGS_1984_Web_Mercator_Auxiliary_Sphere) using the Business Analyst Evaluate Site tool and uploaded (imported) the shapefile to Community Analyst. The radius of the "3 mile" ring created on the desktop is slightly smaller than that created in CA online, and the reports generated by each are different. Is this a spatial reference issue?&lt;/BLOCKQUOTE&gt;&lt;BR /&gt;&lt;SPAN&gt; &lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Bob,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Layers created by Business Analyst Desktop should be in GCS_North_American_1983, though the data frames in the default Business Analyst map documents are in that Web Mercator projection.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I tested creating a simple two-mile Evaluate Site ring in BA Desktop and ran a local D&amp;amp;I Report, then uploaded the same ring to BAO/CA and ran a D&amp;amp;I Report.&amp;nbsp; These two reports matched exactly.&amp;nbsp; However, when I ran an online D&amp;amp;I Report from Business Analyst Desktop, the numbers were different from the other two reports, so I will investigate that.&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Regards,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Jason R.&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 05 Jul 2013 23:07:49 GMT</pubDate>
      <guid>https://community.esri.com/t5/business-questions/geoprocessing-behind-reports/m-p/443330#M540</guid>
      <dc:creator>Jason_RobinsonRobinson</dc:creator>
      <dc:date>2013-07-05T23:07:49Z</dc:date>
    </item>
  </channel>
</rss>