MANUAL: Georgia PDM Project Workflow Data Sources (Part I of IV) WinGAP Template

Document created by Georgia_GIO on Dec 21, 2016. Last modified by Georgia_GIO on Dec 21, 2016. Version 2.

GARC Team - Please make edits directly in this document as you identify issues/errors/useful work-arounds/etc

Georgia PDM Project Workflow

Data Sources (Part I of IV)

WinGAP Template

 December 8, 2016

 

 

 

The Polis Center

www.polis.iu.edu

 

Coastal Regional Commission

www.crc.ga.gov

 

Table of Contents

Project Overview

File Management

Document Management

Workflow Overview

WinGAP Template – RegionW

Task 1.1 - Copy Template Folders

Task 1.2 – Download Data Sources

Task 1.3 - Install Hazus Databases

Task 1.4 – Export Boundaries

Task 1.5 – Create Buildings

Task 1.6 – Incorporate Essential Facilities from GMIS

Appendix 1       Glossary

Appendix 2       Issues and Questions

Appendix 3       Design Considerations

Appendix 4       FME Algorithms for Buildings

Appendix 5       FME Algorithms for Facilities

Appendix 6       Optional Process for Updating Essential Facilities

 

Project Overview

The Polis Center (Polis) is helping the Georgia Emergency Management Agency (GEMA) to implement a multi-county Pre-Disaster Mitigation (PDM) project.  Successful mitigation activities are based on careful assessment of what may occur in the event of a disaster.  GIS-based loss estimates using FEMA’s Hazus software satisfy a major objective of hazard mitigation.  Polis maintains Hazus databases to support these efforts.

Hazus was designed to model hurricane, flood and earthquake events.  The estimated losses are dependent upon the accuracy and completeness of the Hazus data sources.  The quality of the Hazus databases can be improved by using local data sources.

Polis has built tools to supplement the Hazus data using county assessor and parcel databases.  The assessor data is used to obtain physical characteristics of the structures (age, height, construction type, occupancy code, etc.).  The parcel data is used to determine the physical location of each structure.

Parcel and Assessor data have not been standardized across Georgia.  Tools to normalize the data sources into a common format are needed for each CAMA vendor.  This workflow describes the steps needed to install and run the tools used to create Building Inventory across all counties.

Separate data source workflows have been developed for each CAMA system in the State.   Workflows are named for the type of CAMA system they support.  RegionW, for example, would be applicable to a county or other jurisdiction that uses the WinGAP CAMA system.   Be sure to use the correct workflow for your county!

Workflow Name                        CAMA System
WinGAP Template – RegionW            WinGAP
BiTEK Template – RegionB             BiTEK
Govern Template – RegionV            Govern
GSI Template – RegionG               GSI
Cox A-Plus Template – RegionC        Cox A-Plus
Cox 1 Tax Template – RegionX         Cox 1 Tax
INSERT WORKFLOW NAME                 INSERT CAMA SYSTEM
INSERT WORKFLOW NAME                 INSERT CAMA SYSTEM
INSERT WORKFLOW NAME                 INSERT CAMA SYSTEM
INSERT WORKFLOW NAME                 INSERT CAMA SYSTEM

 

Each data source workflow has four main components.

Tasks

  1. Workflow and tools to create Facilities and Buildings
  2. Workflow and tools to create Facility Inventory and Building Inventory
  3. Workflow and tools to update Hazus databases
  4. Workflow and tools needed to run the models and create the deliverable reports

 

File Management

Backups

 

Hazus does not support server-based workflows.  PDM_Georgia is based on an architecture built on a local C:\ drive.  User logins are sufficient – Hazus no longer requires administrator passwords.

CRC: To support multiple needs, a change to the workflow has been made here. We have elected to copy the Templates to the server. We then complete the workflow to create Facility, Building, Parcels, Land Use and Addressing on the server. Once these tasks are complete, we copy the file structure for the county to the C:\ drive to complete the Hazus-related workflows.

 

Work performed on local PCs will need to be periodically secured.  References to the Q:\ drive in this workflow refer to the backup server used at Polis:

C:\Projects\Hazus_Projects\PDM_Georgia                      Local project drive

Q:\PDM_Georgia                                                           Backup drive
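
The periodic backup can be scripted rather than copied by hand.  A minimal Python sketch, assuming the Windows robocopy utility and the Polis paths above; the excluded folders follow the later notes that \Downloads and \Working do not need to be backed up:

    import subprocess

    # Mirror the local project tree to the Q:\ backup share.
    # /MIR mirrors (copies new/changed files, removes deleted ones);
    # /XD skips folders the workflow says need no backup.
    subprocess.run([
        "robocopy",
        r"C:\Projects\Hazus_Projects\PDM_Georgia",
        r"Q:\PDM_Georgia",
        "/MIR",
        "/XD", "Downloads", "Working",
    ])
    # Note: robocopy exit codes 0-7 indicate success, so check=True is not used.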

Project Management

Project documentation is stored under the following directory structure (optional):

C:\Projects\Hazus_Projects\PDM_Georgia\Project_Management

Admin                                       Private: Proposals and contracts

Status                                       Project management progress reports

Advisory                                   Reference materials

Workshops                               Meetings and workshop materials

Data Management

Setting up the file structure for sustained use should be considered when establishing the workflow. Regional commissions that set up a similar path name, \\[ServerName]\FMEWorkflow\CountyName\, can then use the completed county to expedite future translations.

 

Data sets are managed by geographic region.  For example:

[Server]:\Projects\Hazus_Projects\PDM_Georgia\Data_Management\

RegionW                                   RegionW data, analysis and results

RegionB                                   RegionB data, analysis and results

Georgia                                    Statewide data

 

Regional data sets are managed under four main themes.  For example:

[Server]:\Projects\Hazus_Projects\PDM_Georgia\Data_Management\RegionW

Data_Sources                           Raw data that is pre-processed for modeling

Hazus_Updates                         Updated RegionW statewide tables

Inventory                                   Inventory exposed to the model

Hazards                                    Hazard definitions and loss estimates
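
A new region’s folder skeleton can be stamped out programmatically instead of copied by hand.  A minimal Python sketch (the server root is a hypothetical placeholder):

    import os

    ROOT = r"\\Server\Projects\Hazus_Projects\PDM_Georgia\Data_Management"  # hypothetical
    REGION = "RegionW"

    # The four main themes, plus the supporting folders described later in this section.
    for folder in [
        "Data_Sources", "Hazus_Updates", "Inventory", "Hazards",
        "MXD_Documents", "Tools", "Working", "Reports",
    ]:
        os.makedirs(os.path.join(ROOT, REGION, folder), exist_ok=True)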

Data Sources

Data sources received from various agencies are organized by geography.  For example:

…\Data_Management\RegionW\Data_Sources\

Downloads                               Raw data repository

Improvements                           Parcel points and CAMA attributes

Buildings                                  Source for BI

Facilities                                   Source for FI

Inventory

The inventory datasets used as inputs to the model are organized by county or watershed.  Typically it is sized based upon the Hazus Study Region.  For example:

…\Data_Management\RegionW\Inventory\

Boundaries                               Corporate, Block and County boundaries

Building_Inventory                     All structures within the Study Region

Facility_Inventory                      Critical facilities modeled as sites

Hazards

The definition of each hazard and the results of the analysis are stored as shown in the following example:

…\Data_Management\RegionW\Hazards\

Flood                                       FL scenarios and loss estimates

Hurricane                                  Hurricane scenarios and loss estimates

Tornado                                    Tornado scenarios and loss estimates

Hazus_Updates

Updated Hazus inventory is organized by county and inventory type.  The Hazus_Updates folder contains the most current Hazus databases as well as the files used to update Hazus inventory for each county.

…\Data_Management\Georgia\Hazus_Updates\

GA                                           Updated Hazus databases

 

…\Data_Management\RegionW\Hazus_Updates\

General_Building_Stock             GBS MDB for importing into CDMS

Essential_Facilities                    EF MDB for importing into CDMS

User_Defined_Facilities             UDF MDB for importing into Hazus

HPR                                         Exported Hazus Study Region

[Other]

The remaining folders contain the tools and maps used to generate the output products.  For example:

 

…\Data_Management\RegionW\

MXD_Documents                      Production and final mapping documents

Tools                                        FME scripts, mapping templates, toolboxes

Working                                    Temporary area for work in progress

Reports                                    Output products: maps and tables

Template

The Template folders contain the seed databases and tools to kick-start a new model.  For example:

 

…\Data_Management\Template\

Data_Sources                           Raw data that is pre-processed for modeling

Hazus_Updates                         Updated RegionW statewide tables

Inventory                                   Inventory exposed to the model

Hazards                                    Hazard definitions and loss estimates

MXD_Documents                      Production and final mapping documents

Tools                                        FME scripts, mapping templates, toolboxes

Working                                    Temporary area for work in progress

Reports                                    Output products: maps and tables

Document Management

Four documents are provided within the project framework. 

  1. The Data Sources Workflow describes the steps to normalize the provided data into Facilities and Buildings.
  2. The Inventory Workflow describes the steps to create the Facility Inventory (FI) and Building Inventory (BI) modeling inventory.
  3. The Hazus Updates Workflow provides the methodology for integrating the FI and BI into a Hazus Flood or Earthquake Study Region.
  4. The Analysis Workflow outlines the steps and processes for each hazard analysis.

The workflows are provided as “best practice”, and may be universally applied.  Staff will be trained on how to implement these tools on their various PDM projects.

The workflow documents are maintained for use on Pre-Disaster Mitigation (Hazus) projects at Polis.  The names of the files are:

…\Workflow\GA_RegionW_1_Data_Sources_Workflow_v<V>_<R>.docx

…\Workflow\GA_RegionW_2_Inventory_Workflow_v<V>_<R>.docx

…\Workflow\GA_RegionW_3_Hazus_Updates_Workflow_v<V>_<R>.docx

…\Workflow\GA_RegionW_4_RiskAnalysis_Workflow_v<V>_<R>.docx

 

where

<V>                              Version number 1-9

<R>                              Revision number 1-9

 

The following abbreviations are used throughout the document:

[TBD]                            To be determined – decision point

[PIO]                             Process improvement opportunities – ideas for development

[Caution]                       Proceed with care.  Things to watch out for.

[Name]                          Contributions required by …

[Rev]                            Major revision marker

[Note]                           Miscellaneous hints to the reader

[Option]                        Optional or alternative workflow

 

Versions are incremented with each project milestone.

Version    Date           Milestone           Change
1.1        5-June-2015    Draft               Workflow started
1.1        15-June-2015   Implementation      RegionW County
2.0        28-Nov-2016    First Work through  File Clarification, tool explanation


Workflow Overview

Staff will apply this workflow to PDM projects that require inventory for Hazus Flood and/or Earthquake analysis.  Since the foundational data structure is Building Inventory, the tasks to integrate BI with Hazus will be common between projects.

The steps to make BI are designed to be as generic as possible.  The data sources used for BI are varied.  Parcels and assessor data are the foundational pieces, but these can be supplemented with field data collection, addresses and building footprints.  The PDM_Georgia design creates two common feature classes very early in the workflow so that the downstream steps to make BI can be standardized between projects:

  • impPoints – the locations of the Buildings
  • impCAMA – the attributes of the Buildings

  • Buildings – the locations and attributes of the Building Inventory

Similarly, the steps to make FI are designed to be generic even though data sources can be varied.  Critical facilities are the foundational pieces, but these can be supplemented with field data collection, addresses and building footprints.  The desired output is five feature classes (Schools, EOC, Fire, Police and Hospitals) plus Community Assets.

  • Facilities – the locations and attributes of the Facility Inventory

The tools used to create these three Improvement feature classes are described in this document.

Workflow Diagram

Steps to process the incoming data sources into a standard CAMA Extract are described in this document.   The desired outputs are impCAMA and impPoints.  These are the generic feature classes that are combined to make Building Inventory.

Steps to process the incoming data sources into a standard Facilities database are described in this document.   The desired output is an impFacilities GDB.  These are the generic feature classes that are used to make Facility Inventory.

Task 1 Prepare Data Sources

  • Copy Models\Template folder
  • Prepare template documents
  • Install Hazus databases
  • Download data sources
  • Create Improvements
  • Create Domains
  • Create Buildings
  • Create Facilities

 

Output example:

RegionW\Data_Sources\

GA_RegionW_Buildings_GDB.mdb

GA_RegionW_Facilities_GDB.mdb

Task 2 Building Inventory

  • Create BI Matrices
  • Create Building Inventory from Buildings
  • Optional: Deliver BI for QC/editing by GEMA

 

Output example:

RegionW\Inventory

GA_RegionW_BI_GDB.mdb

Task 3 Facilities Inventory (Optional)

  • Create FI Matrices
  • Create Facility Inventory from Facilities.
  • Optional: Deliver FI for QC/editing by GEMA

 

Output example:

RegionW\Inventory

GA_RegionW_FI_GDB.mdb

Task 4 Update Inventory Using CDMS

  • Create Hazus Essential Facilities for CDMS import
  • Use CDMS to replace the EFs for RegionW
  • Create Hazus GBS for CDMS import
  • Use CDMS to replace the GBS for RegionW

 

Output example:

GA_RegionW_CDMS_Import_GBS.mdb

GA_RegionW_CDMS_Import_EF_GDB.mdb

EF.mdb  |  bndryGBS.mdb

Task 5 Update Inventory Using Hazus

  • Create a Study Region for RegionW
  • Import UDFs into the Study Region

 

Output example:

GA_RegionW_Hazus_Import_UDF.mdb

GA_RegionW_FLEQ\

UDS.mdb

Task 6 Reports

  • GBS Exposure
  • UDF Exposure

 

Output example:

GA_RegionW_GBS_Exposure_Rpt.pdf

GA_RegionW_UDF_Exposure_Rpt.pdf

Task 7 Template Maker (Optional)

  • Refresh Template GDBs
  • Flush Template GDBs

 

Output example:

GA_RegionW_Template_Updates.tbx

GA_RegionW_Template_Updates.mxd

WinGAP Template – RegionW

 

Task 1 – Prepare Data Sources

Information in this portion of the workflow pertains specifically to counties that use the WinGAP CAMA system.   If you are working in a county that does not use the WinGAP CAMA system, please consult the workflow for that CAMA system.

The Data_Sources folder structure contains the empty databases and tools needed to prepare the incoming data.  The provided data is downloaded to \Downloads.  There is no attempt made to clean up the raw data.  However, the data sources are standardized:

  1. Projected to GCS_NAD83
  2. Points only
  3. Clipped to county boundary
  4. Required fields only
  5. Required records only
  6. Geodatabase format

Task 1.1 - Copy Template Folders

Task 1.1.1 – Create RegionW from Templates

Templates have been set up for the data processing activities.  Each template contains the folder structure and tools used to prepare the source data and model data for each county.  The Template contains the knowledge base for the project – it is updated on the Q:\ drive as processes are improved.

CRC: \\[ServerName]\FMEWorkflow\CountyName\

 

  • Copy the Data Source template from
                Q:\PDM_Georgia\Data_Management\RegionW
    to
                \\[ServerName]\FMEWorkflow\CountyName

The copied Template materials need to be set up for each county.  Rename all templates and change the file properties.  Modify the contents to reflect the active model.

  • Rename:

From:               GA_RegionW_*.*

To:                   GA_<CountyName>_*.*

  • Update the File Properties on all RegionW documents

Subject:            RegionW
Author:             <Enter your name here>
Comments:       2015 PDM Georgia
Category:          PDM Inventory for RegionW
Company:         Polis | Polis

  • Open each document in Word and replace all occurrences:

From:               <County>

To:                   The active county name (e.g. ‘RegionW’)

 

CRC: Download the Bulk Rename Utility at http://www.bulkrenameutility.co.uk/Main_Intro.php

Task 1.1.1.A – Use the Bulk Rename Utility to quickly rename all the template folders to the county you are working on.

 

 

This will rename all locations in the file template from RegionW-2 to the name of your county (in this case, Camden).  It will rename all .mdb files as well.
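
If the Bulk Rename Utility is not available, the same bulk rename can be done with a short Python script.  A minimal sketch (the server path and county name are illustrative, following the Camden example above):

    import os

    ROOT = r"\\Server\FMEWorkflow\Camden"  # hypothetical path
    OLD, NEW = "RegionW", "Camden"

    # Walk bottom-up so files and subfolders are renamed before their parents.
    for dirpath, dirnames, filenames in os.walk(ROOT, topdown=False):
        for name in filenames + dirnames:
            if OLD in name:
                os.rename(os.path.join(dirpath, name),
                          os.path.join(dirpath, name.replace(OLD, NEW)))

Note this renames files and folders only; the find-and-replace inside the Word documents is still a manual step.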

 

Task 1.1.2 - Working Folders

Working folders and GDBs are provided as temporary data stores to process intermediate feature classes.  Working folders or files are temporary – they may be removed after each model is complete.  There is no need to back up Working databases.  There are two databases provided, one in MS Access 2003 format and one as an ArcGIS personal geodatabase:

…\RegionW\Working

GA_RegionW_Working_GDB.mdb

GA_RegionW_Working.mdb

 

Task 1.2 – Download Data Sources

Typical data sets for processing include:

  1. Inventory sources (Building Footprints, CAMA, Parcels, Data Collection and Essential Facilities)
  2. Hazard sources (DFIRMs, DEM, depth grids)

Data source folders have been created as a repository for the raw DS databases provided by GEMA.  The data sources are organized by geography.  The Downloads folder contains the original data – do not modify data in \Downloads.  The \Downloads folder does not need to be backed up.

  • Copy the source data to
    \\[ServerName]\FMEWorkflow\CountyName\Data_Sources\
                      Downloads\
  • Unzip (as needed) and delete the downloaded Zip file
  • Organize the data sources into one of three subfolders:

\Downloads\Cadastral

\Downloads\Inventory_Data

\Downloads\Hazard_Data

  • Remove all WinGAP FPT files from \Downloads\Inventory_Data

 

The Bulk Rename Utility can also be used on the downloaded CAMA data. WinGAP CAMA data often arrives in a folder named “publicexport” or “AY2015” (Appraisal Year date). Within the folder, the CAMA file names have been seen in ALL CAPS or with a “PE_” prefix. To account for this, the user has two choices.

 

  1. Change the transformer to connect to the native naming convention.
  2. Change the file names to lowercase.

 

[Screenshot: Bulk Rename Utility options showing the boxes to change file names from capitals to lowercase.]

After renaming the files, the user should then delete the .FPT files.
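
Both clean-up steps (lowercasing the file names and deleting the FPT files) can also be scripted.  A minimal Python sketch (the folder path is hypothetical):

    import os

    INV = r"\\Server\FMEWorkflow\Camden\Data_Sources\Downloads\Inventory_Data"  # hypothetical

    for name in os.listdir(INV):
        path = os.path.join(INV, name)
        if name.lower().endswith(".fpt"):
            os.remove(path)  # FME errors out if FPT files remain
        elif name != name.lower():
            # e.g. Commbase.dbf -> commbase.dbf, REALPROP.DBF -> realprop.dbf
            os.rename(path, os.path.join(INV, name.lower()))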

 

Task 1.3 - Install Hazus Databases

Changes made to the Hazus statewide databases will be version controlled.  Updates to individual counties will be integrated with the master Hazus databases.   Details of the changes will be documented:

…\Hazus_Update\RegionW\Reports\Logs

GA_RegionW_Hazus_Updates_<yymm>.doc

GA_RegionW_Hazus_Updates_<yymm>.xls

 

Changes to the Hazus Study Region databases are not version controlled.

The Hazus GBS database must be installed on the local PC before inventory processing can begin.  The Block and County boundaries are derived from Hazus data.  Make sure that Hazus 2.2 databases are installed.  If in doubt, download a fresh set – select the homogeneous tables.

[Caution]  PDM Template tools are built for Hazus 2.2 databases which are not compatible with Hazus 2.1.

[Caution]  PDM Template tools assume that the Hazus 2.2 databases are installed in C:\HazusData_22.  This is NOT the default folder location.  If the data is installed in the default C:\HazusData\Inventory folder then the tools will need to be reconfigured. 

If the EFs have been updated for the project, then install the updated version (must be Hazus 2.2 compatible).

  • The default Hazus data is provided in:

…\Georgia\Hazus_Updates\GA\HazusData_22\
            EF.mdb

  • The updated Hazus data is provided in:

…\Georgia\Hazus_Updates\GA\2015\
            EF.mdb

  • Copy the updated EF database to:

            C:\HazusData_22\GA\

                        EF.mdb

Task 1.4 – Export Boundaries

The Block and County boundaries are derived from Hazus data.  The goal is to use standard boundaries for modeling and inventory inputs:

  1. Block and County boundaries
  2. Required fields
  3. Required records

 

Task 1.4.1 – Export Hazus Boundaries

Polygons for RegionW are extracted.  Additional attributes are inherited from hzDemographicsB and hzMeansCountyLocationFactor tables.

[Note]  Remember to change the County Name in the Workbench.  Annotation is provided in the Workbench as a reminder.

[Caution]  Remove the Sample_Clipper for final implementations if it still exists in the FME tool.  The clipper is used in RegionW to limit the size of the output data sets.  Annotation is provided in the Workbench as a reminder.

  • Open the RegionW Data Sources MXD provided in:

…\RegionW\MXD_Documents\

GA_RegionW_Data_Sources.mxd

  • Add to the Downloads Layer Group

…\RegionW\Data_Sources\Downloads\Cadastral\

add the parcel SHPs or GDB

  • Add the RegionW Data Sources toolbox to ArcTools from:
    …\RegionW\Tools\

GA_RegionW_Data_Sources.tbx

  • Right-click on the FME toolbox named RegionW_Data_Sources and select Properties. Change the tool Label to RegionW and modify the Description to reflect the current implementation.
  • Right-click on the FME tool named 01_Export_Hazus_Boundaries and select Properties.
  • Change the tool Description to reflect the current implementation.
  • Right-click | Edit the 01_Export_Hazus_Boundaries tool to open up the FME workbench.
  • Set the Published Parameters | SourceDataset (right-click “Edit Value” and browse to):

 

Caution:  Check to make sure you are clicking on the right dataset for Source and Destination paths listed under Published Parameters.  Wrong paths will lead to a failure in FME.

 

  • C:\HazusData_22\GA\
    mdb
  • Q:\PDM_Georgia\Data_Management\Georgia\Data_Sources\Jurisdictions\tl_2015_13_place

tl_2015_13_place.shp

CRC: Updated the place names to 2015 (in the old manual this was 2014). I also changed the file path so it was not mixed with the other downloads.

 

 

  • Set the Published Parameters | DestDataset (right-click “Edit Value” and browse to):
    …\RegionW\Inventory\Boundaries\
                      mdb
  • Open the County_Tester by clicking on the gear in the upper right corner and change the CountyName to the county you are working on.

CRC: During translation, some features were read that did not match a reader feature type in the workspace. This can happen if the reader dataset is changed, or a reader feature type removed or renamed.

 

This is due to moving the bndry file to the server and updating the shape file to a 2015 file. … I believe the data looks good.

 

The translator will throw an error if the name is not exactly as defined. So if using 2015 data where the translator expects 2014, you can rename your data to 2014 or change the Feature Type name. To do this, go to Tools > FME Options > Workbench > Allow reader feature type editing.

 

 

This will allow you to change the reader name; turn the option back off after the edit has been done.

 

The same applies to the commimp/Commimp and commbase/Commbase naming problem later in the Make CAMA Tables task.

 

  • Run the script and review the log file to make sure that the records were processed correctly.

[Note]  There are over 500,000 blocks to process in Georgia.  The process takes about 10 minutes.  Snack time.

[Caution]  Blocks are joined to the County based on a common FIPS … they are not clipped to the County boundary.  The Blocks and County boundaries do not line up in Hazus 2.2, so clipped Blocks result in slivers that need to be cleaned up.  Using the Join process eliminates the sliver issue.
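
The join-by-FIPS logic amounts to matching the first five characters of each Census block ID to the county FIPS.  A minimal pandas sketch of the idea (the field names are assumptions, not the actual Hazus schema):

    import pandas as pd

    blocks = pd.DataFrame({"CensusBlock": ["131910001001000", "131910001001001"]})
    county = pd.DataFrame({"CountyFips": ["13191"], "CountyName": ["RegionW"]})

    # A block ID embeds its county FIPS in the first five characters, so an
    # attribute join sidesteps the sliver problem a spatial clip would create.
    blocks["CountyFips"] = blocks["CensusBlock"].str[:5]
    print(blocks.merge(county, on="CountyFips", how="inner"))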

  • Add the new Block and County feature classes to the Boundaries Group Layer and review the results.
  • If all OK, save the log file to
    …\RegionW\Reports\Logs
                      GA_RegionW_DS_01_ExportHazusBoundaries_<yymmdd>.txt
  • Save your changes to the 01_Export_Hazus_Boundaries tool and exit the FME workbench.

Task 1.5 – Create Buildings

The provided data sources come in all shapes and sizes.  The goal is to standardize the inputs to a common format.  The outputs are impCAMA and impPoints which will be joined together to make Buildings.

  1. Personal GDB format
  2. GCS_NAD83 projection
  3. Required fields
  4. Required records

 

Task 1.5.0 – Make CAMA Tables

WinGAP data is provided as multiple DBFs (unknown vintage).  The DBFs will be imported into a temporary Working database for pre-processing.

 [Caution]  Check to make sure there are no FPT files in \Downloads\Inventory_Data.

Make sure the FPT files are removed … you will get an error message if they are not.

  • Right-click | Properties the 10_Make_CAMA_Tables tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench

Note: It is helpful to pull the Navigator window sidebar over so that you can see the entire path name.  Make sure that you point each file path to the correct location (reprop to reprop, commimp to commimp, etc.).

 

  • Set the Published Parameters | SourceDataset to:

…\RegionW\Data_Sources\Downloads\Inventory_Data\

acc_ctrl.dbf

acessory.dbf (yes, only one “c”)

basectrl.dbf

Commadds.dbf

Commbase.dbf

commimp.DBF

mobile.DBF

realprop.DBF

reprop.dbf

 

CRC: This is where case sensitivity and naming can cause problems. In the original Pilot, the data was apparently lowercase except for Commadds and Commbase.

[Caution]  It is currently unknown if the source WinGAP DBFs will be standard between counties.  Note the misspelling of accessory.dbf.  Some file extensions are upper-case, and some filenames are mixed-case.  FME is case-sensitive, so it may be better to browse for the DBFs rather than changing the pathnames.

[Caution]  Some vendors are providing DBFs with a ‘_’ suffix to indicate that this is a dBase IV format.  Rename these files to the standard filenames (or modify the FME script).

  • Set the Published Parameters | DestDataset to:

…\RegionW\Working\
            GA_RegionW_Working.mdb

  • Run the script and review the log file to make sure that the records were processed correctly.
  • Add the new CAMA tables to the Data Sources Group Layer and review the results.
  • If all is OK, save the log file to:

…\RegionW\Reports\Logs\
            GA_RegionW_DS_10_MakeCAMATables_<yymmdd>.txt

  • Save your changes and exit the FME workbench.

Task 1.5.1 – Make CAMA Domains

Domains will be created to convert the CAMA codes into CAMA descriptions.  There are two options:

  1. The CAMA data is interrogated to find the unique codes for Occupancy, Condition, Construction and Foundation types. The names of the output tables are CAMA_xx.  In this case, the user must provide the descriptions from the CAMA documentation.
  2. The WinGAP CAMA system provides Domain control tables – they do not need to be inherited from the data. The names of the output tables are Domain_xx.  The Control values may be imported into FME directly – the descriptions are provided.

WinGAP implementations are based upon Option #2.

  • Right-click | Properties the 11_Make_CAMA_Domains tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.

 

Caution:  Check to make sure you are clicking on the right dataset for Source and Destination paths listed under Published Parameters.  Wrong paths will lead to a failure in FME.

 

  • Set the Published Parameters | SourceDataset to:

…\RegionW\Working\
            GA_RegionW_Working.mdb

  • Set the Published Parameters | DestDataset to:

…\RegionW\Data_Sources\Improvements\
            GA_RegionW_Improvements_GDB.mdb

  • Run the script and review the log file to make sure that the records were processed correctly.
  • Add the new Domain_xx tables to the Data Sources Group Layer and review the results.
  • If all OK, save the log file to:

…\RegionW\Reports\Logs\
            GA_RegionW_DS_11_MakeCAMADomains_<yymmdd>.txt

  • Save your changes and exit the FME workbench.

 

[Note] For WinGAP systems the Control formats are converted to the CAMA data formats using FME.  The Domains may be used ‘as is’.  The Descriptions in the new Domain_xx tables do not need to be edited.

[Note] Check to make sure that the CAMA data values match the Domain values.  If the data does not match the domains, the target impCAMA tables will be populated with ‘Unk’.  All ‘Unk’ values will need to be resolved – either by fixing the data or populating defaults.  For example, the provided Condition domain ‘001’ is not the same as the Condition value ‘1’.  Also, the CAMA Condition value ‘0’ is not listed in the Control domain.  Ideally, the implemented FME mapping tables will match the values in the data.


Task 1.5.2 – Create impCAMA

Much of the raw CAMA data is not needed.  Required tables are identified, and the desired fields from these tables are extracted into a temporary CAMA table named impCAMA.  The impCAMA table is ‘flat’ – the indexes between the various CAMA tables have been applied to generate a record for each structure to be modeled.  The impCAMA table has the same field names and field types as the raw CAMA data.  Three fields are added – hzOccCode, impCost and impArea.

CAMA data is not fixed or repaired.  However, <Null> values are replaced with ‘Unk’ (Char fields) or ‘0’ (Numeric fields) so that they can be evaluated.  hzOccCodes are populated – only valid codes are passed through.

Records that do not have valid hzOccCodes need to be evaluated.  The value of outdoor accessories that may be destroyed in a natural disaster should be accounted for (gazebos, barns, sheds …).  Structures that will not suffer damage can be excluded (fences, patios, decks, etc…).  Other records are more difficult to assess.  For example, swimming pools may suffer little damage in a flood but may incur significant losses in an earthquake.  In all these cases, we cannot assign an hzOccCode – they cannot be modeled as individual structures.  However, the values may be rolled up into the parent structure to account for the added exposure.  These assumptions/rules should be documented in the Process report.

Two Matrix tables are needed:

  1. One to convert the CAMA occupancies to hzOccCode values
  2. One to filter out accessories that will not be modeled

  • Open the Improvements database in Access:

…\Data_Management\RegionW\Data_Sources\Improvements

GA_RegionW_Improvements_GDB.mdb

  • Copy the Domain_coOccupancy table as Matrix_coOccupancy_hzOccCode
  • Open the Matrix_coOccupancy_hzOccCode table in Design View and add a field named hzOccCode as Text(5)
  • Copy the Domain_reOccupancy table as Matrix_reOccupancy_hzOccCode
  • Open the Matrix_reOccupancy_hzOccCode table in Design View and add a field named hzOccCode as Text(5)

 

  • Copy the Domain_coUsedAsOccCode table as Matrix_coUsedAs_hzOccCode

  • Open the Matrix_coUsedAs_hzOccCode table in Datasheet View and manually add the equivalent hzOccCode values. Populate unwanted hzOccCodes with ‘Unk’.  Use Domain_hzOccCode as a guide.
  • Copy the Domain_acCompNo table as Matrix_acCompNo_StrYN
  • Open the Matrix_acCompNo_StrYN table in Design View and add a field named StrYN as Text(1)
  • Open the Matrix_acCompNo_StrYN table in Datasheet View and manually add ‘Y’ to those records considered structures and ‘N’ to the others. (A scripted alternative to these copy-and-add-field steps is sketched after this list.)
  • Exit Access
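
The copy-and-add-field steps above can be scripted against the same MDB.  A minimal sketch, assuming pyodbc and the Microsoft Access ODBC driver are installed (the path is hypothetical); the hzOccCode/StrYN values themselves still have to be entered by hand in Datasheet View:

    import pyodbc

    MDB = r"C:\Projects\Hazus_Projects\PDM_Georgia\Data_Management\RegionW\Data_Sources\Improvements\GA_RegionW_Improvements_GDB.mdb"  # hypothetical
    conn = pyodbc.connect(r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=" + MDB)
    cur = conn.cursor()

    # Copy each Domain table to its Matrix counterpart, then add the new field.
    for domain, matrix, field, size in [
        ("Domain_coOccupancy", "Matrix_coOccupancy_hzOccCode", "hzOccCode", 5),
        ("Domain_reOccupancy", "Matrix_reOccupancy_hzOccCode", "hzOccCode", 5),
        ("Domain_acCompNo",    "Matrix_acCompNo_StrYN",        "StrYN",     1),
    ]:
        cur.execute(f"SELECT * INTO {matrix} FROM {domain}")
        cur.execute(f"ALTER TABLE {matrix} ADD COLUMN {field} TEXT({size})")
    conn.commit()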

 

[Note]  Accessories will not be modeled.  There is not enough information – most modeling values would be defaulted.  Instead, Accessory impCost values are added to the parent structure where StructureYN is ‘Y’.

The Matrix tables will be used by FME to convert the vendor specific codes into a normalized Building Inventory architecture for modeling purposes.

The ultimate goal is to populate replacement cost in Building Inventory, but this may not be possible here.  If replacement costs are available in the CAMA data, then use them.  Various categories of building costs and areas are normalized to impCost and impArea.  That is, each record is equivalent to a structure (one cost, one area).  CAMA costs and areas may need to be combined from multiple fields.  Ideally, every impCAMA record will either have a cost or an area.  This may not be possible.  Do NOT filter out records that do not have areas or costs – populate them with ‘0’.

Codes are replaced with Descriptions using the Domains constructed previously.  If the BldgConstruction domain value ‘W’ designates ‘Wood’, then ‘Wood’ is captured in impCAMA.  Values ‘w’, ‘W ’, ‘W?’, ‘WW’ and ‘Wood’ are converted to ‘Unk’.  This provides the ability to review the data without having to understand or decode the underlying CAMA system.

[Caution]   The Make-impCAMA tool will run successfully even if the Value Mapper transformers have not been updated to reflect the values in RegionW.  However, the descriptions may be wrong (that is, a ‘3’ in RegionW is not the same thing as a ‘3’ in RegionB) or there will be many more illegitimate ‘Unk’ values (that is, a ‘7’ in RegionW was not even coded for in RegionB).

  • Right-click | Properties the 12_Make_impCAMA tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.
  • Set the Published Parameters | SourceDataset to:

…\RegionW\Working\
            GA_RegionW_Working.mdb

  • Set the Published Parameters | DestDataset to:

…\RegionW\Data_Sources\Improvements\
            GA_RegionW_Improvements_GDB.mdb

  • Update the ValueMapper transformers based upon the new Domains. A list of the transformers that need to be updated follows.

[Screenshot: the Residential Construction ValueMapper, shown as an example of the general update.]

Domain_reConstruction    |  _coConstruction  |  _moConstruction

Domain_reCondition         |  _coCondition       |  _moCondition

Domain_reFoundation      |  _coFoundation     |  _moFoundation

Matrix_reOccupancy         |  _coOccupancy     |  _moOccupancy

Matrix_reStructureYN       |   _coStructureYN   |  _moStructureYN

 

ADDENDUM IF WORKING WITH A COUNTY THAT HAS BEEN COMPLETED PRIOR – Added Aug 16

 

 

Task 1.5.2 – Create impCAMA on counties that have been completed before.

 

If you are starting with a county that has been completed before, you can “check” the domain tables instead of matching and editing the tables again; however, YOU MUST CHECK THEM.

 

To do this, follow these steps.

 

  1. Open the Improvements database in Access, as identified in the original Task 1.5.2 instructions:

…\Data_Management\RegionW\Data_Sources\Improvements

GA_RegionW_Improvements_GDB.mdb

  2. Click on the Database Tools ribbon and select “Relationships” (this has been set up in the new RegionW_2_Improvment_GDB). Create the following simple relationships:
     a. Domain_coOccupancy to Matrix_coOccupancy_hzOccCode by Code
     b. Domain_acCompNo to Matrix_acCompNo_StrYN
     c. Domain_reOccupancy to Matrix_reOccupancy_hzOccCode


  3. Close out of the Relationships window and open a new query from the Query Wizard.
  4. Select the “Find Unmatched Query Wizard”.
  5. For the first table, select Domain_coOccupancy.
  6. For the second table, select Matrix_coOccupancy_hzOccCode.
  7. Select Next when the matching-fields screen comes up.

  8. Move all the available fields to Selected Fields.

  9. Select Finish and view the results.
  10. If you have 0 results, then you have no new codes to update.


  11. If you have results, copy the rows in “Domain_coOccupancy Without Matching Matrix_coOccupancy_hzOccCode” and add them to the Matrix_coOccupancy_hzOccCode table.
  12. You will then only have to update the hzOccCode for the “new” codes.
  13. Open the Domain_coOccupancy and the Matrix_coOccupancy_hzOccCode tables and place them side by side.
  14. Sort the codes sequentially in both tables.
  15. Double-check visually that the tables are the same.


Now create the same unmatched query for the following:

  1. Domain_coOccupancy to Matrix_coOccupancy_hzOccCode by Code
  2. Domain_acCompNo to Matrix_acCompNo_StrYN
  3. Domain_reOccupancy to Matrix_reOccupancy_hzOccCode


After completing the unmatched queries, you will have these in your Improvements database for future runs of that county.  You do have to make the relationships one at a time, but once you complete the unmatched queries in one county, you can copy and paste them into new counties after the relationships are created.
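
Under the hood, the Find Unmatched wizard builds a LEFT JOIN … IS NULL query.  The equivalent check can be run through the same pyodbc connection used in the earlier sketch (Code and Description are the field names used by the ValueMapper import steps below; the path is hypothetical):

    import pyodbc

    MDB = r"C:\Projects\Hazus_Projects\PDM_Georgia\Data_Management\RegionW\Data_Sources\Improvements\GA_RegionW_Improvements_GDB.mdb"  # hypothetical
    conn = pyodbc.connect(r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=" + MDB)
    cur = conn.cursor()

    # Domain codes that have no match in the Matrix table are the "new" codes.
    cur.execute("""
        SELECT d.Code, d.Description
        FROM Domain_coOccupancy AS d
        LEFT JOIN Matrix_coOccupancy_hzOccCode AS m ON d.Code = m.Code
        WHERE m.Code IS NULL
    """)
    new_codes = cur.fetchall()
    print(len(new_codes), "new codes to add to the Matrix table")  # 0 means nothing to update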


If you have no new codes, then the next steps of updating the value mappers are not necessary.

  • Right-click > Properties > AttributeValueMapper
  • Delete all the current entries (empty Source Values and Destination Values)
  • Import > Source And Destination Values
  • Select Esri Geodatabase (Personal Geodb) in the Format dropdown box.
  • Navigate to the database containing the Domain table > Next >

…\Data_Sources\Improvements\GA_RegionW_Improvements_GDB.mdb

  • Select the relevant Domain table > Next >

Domain_reConstruction

  • Select the ‘Source Value’ column > Next >

Code

  • Select the ‘Destination Value’ column > Next > Import

Description

  • Update the Construction, Condition and Foundation ValueMappers to match the Mobile, Residential and Commercial Domain tables.
  • As you work through the list of ValueMappers, the Complete column can be used as a check-off to keep track of which ones are done when heavy interruptions are a problem.

 

Complete    Value Mapper     Matrix Table                    Source Value    Destination Value
            acStructureYN    Matrix_acCompNo_StrYN           Code            StrYN
X           moConstruction   Domain_moConstruction           Code            Description
X           moFoundation     Domain_moFoundation             Code            Description
X           moCondition      Domain_moCondition              Code            Description
X           coConstruction   Domain_coConstruction           Code            Description
X           coFoundation     Domain_coFoundation             Code            Description
X           cohzOccCode      Matrix_coOccupancy_hzOccCode    Code            hzOccCode
X           reConstruction   Domain_reConstruction           Code            Description
X           reFoundation     Domain_reFoundation             Code            Description
X           reCondition      Domain_reCondition              Code            Description
X           rehzOccCode      Matrix_reOccupancy_hzOccCode    ITEM_NO         hzOccCode

 

Note:  The coCondition_AttributeRangeMapper is a range mapper, not a value mapper.  Check the values in the transformer against the data in CAMA_coCondition in GA_RegionW_Improvements_GDB.mdb to verify that the range covers the extent of the data values.   Adjust if needed.

  • Run the script and review the log file to make sure that the records were processed correctly.
  • Open the impCAMA table in GA_RegionW_Improvements_GDB.mdb and review the results.
  • If all OK, save the log file to
    …\RegionW\Reports\Logs
                      GA_RegionW_DS_12_MakeimpCAMA_<yymmdd>.txt
  • Save your changes and exit the FME workbench.

Task 1.5.3 – Create impPoints

The provided parcel polygons are not needed.  Polygons greater than 100 sq ft are converted into points to use as approximate locations for the inventory.  Points outside the RegionW boundary are deleted.  The only attribute needed is the PARID.  Duplicate or <Null> PARIDs are eliminated.

[Caution]   impPoints must be captured inside the polygon.  Take care when processing irregular shapes.  Do not use the geometric center.  To generate points guaranteed to be inside an area feature, use the InsidePointReplacer.

[Caution]  Regional Parcel data may not be provided in a standard Georgia State Plane East projection.  A re-projection Transformer is provided, but the user is required to set the data source projection if it is not standard.

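Outside FME, the same two requirements (an inside point, then GCS_NAD83) look like this.  A minimal sketch using shapely and pyproj; EPSG:2240 is NAD83 / Georgia East in US feet, EPSG:4269 is geographic NAD83:

    from shapely.geometry import Polygon
    from pyproj import Transformer

    # Georgia State Plane East (US feet) -> GCS_NAD83 (lon/lat)
    to_gcs = Transformer.from_crs("EPSG:2240", "EPSG:4269", always_xy=True)

    def parcel_point(polygon):
        """Return a lon/lat point guaranteed to fall inside the parcel."""
        pt = polygon.representative_point()  # inside point, NOT the centroid
        return to_gcs.transform(pt.x, pt.y)

    # An L-shaped parcel whose geometric center falls outside the shape
    # (coordinates are illustrative State Plane feet).
    parcel = Polygon([(2200000, 1350000), (2200100, 1350000), (2200100, 1350020),
                      (2200020, 1350020), (2200020, 1350100), (2200000, 1350100)])
    print(parcel_point(parcel))
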
[Option]  The Parcel output is optional.  It is only needed if either Building Footprints or Addresses are used to make the impPoints.  In this case, Parcels will be spatially joined to the impPoints to populate the PARIDs.  This option is not implemented in RegionW.

NOTE: parcels may be delivered as either shapefiles or in a geodatabase.  The FME tool has a reader for both.  Delete the reader not being used and adjust the following directions accordingly.

  • Right-click | Properties the 13_Make_impPoints tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.

Caution:  Check to make sure you are clicking on the right dataset for Source and Destination paths listed under Published Parameters.  Wrong paths will lead to a failure in FME.

 

  • Set the Published Parameters | SourceDataset to:

…\RegionW\Inventory\Boundaries\
            GA_RegionW_Boundaries_GDB.mdb

…\RegionW\Data_Sources\Downloads\Cadastral\
            Parcels_GDB.mdb

  • Set the Published Parameters | DestDataset to:

…\RegionW\Data_Sources\Improvements\
            GA_RegionW_Improvements_GDB.mdb

  • Modify the ParPts_Reprojector transformer if the Parcels are not provided in GA State Plane East.
  • Run the script and review the log file to make sure that the records were processed correctly.
  • Add the new impPoints feature class to the Data Sources Group Layer and review the results.
  • If all OK, save the log file to
    …\RegionW\Reports\Logs
                      GA_RegionW_DS_13_MakeimpPoints_<yymmdd>.txt
  • Save your changes and exit the FME workbench.

Task 1.5.3 – Modify impPoints [Optional]

The parcel centroid locations can be improved upon with better sources.  Address points and/or building footprints can be used for more accurate building XYs.  In both options, a new set of impPoints are created.

This option is not implemented in RegionW.

Task 1.5.4 – Make Buildings

‘Buildings’ is a feature class made by joining impPoints to impCAMA.  Buildings are a window into the provided data sources to see if they are ready for BI.

  • Right-click | Properties the 14_Make_Buildings tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.
  • Set the Published Parameters | SourceDataset to:

…\RegionW\Data_Sources\Improvements\
            GA_RegionW_Improvements_GDB.mdb

  • Set the Published Parameters | DestDataset to:

…\RegionW\Data_Sources\Buildings\
            GA_RegionW_Buildings_GDB.mdb

  • Open the impAttributeCopier transformer and set any unmapped fields

Set Vendor to ‘WinGAP’

Set Name to <Name>

 

            (Name refers to the name of the building/owner. The value is <Name>)

  • Run the script and review the log file to make sure that the records were processed correctly.
  • Add the new Buildings feature class to the Data Sources Group Layer and review the results.
  • If all OK, save the log file to
    …\RegionW\Reports\Logs
                      GA_RegionW_DS_14_MakeBuildings_<yymmdd>.txt
  • Save your changes and exit the FME workbench.

 

NOTE:  If your impCAMA and impPoints do not merge properly (you don’t have any Buildings or you have a lot fewer than you should), check the formats of the parcel IDs for the two datasets.  Some typical issues are that there are extra spaces in one of the datasets or that multiple fields need to be concatenated.
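
The join and its hit rate can be rehearsed outside FME to debug exactly these problems.  A minimal pandas sketch, assuming impCAMA and impPoints have been exported to CSV (hypothetical file names):

    import pandas as pd

    cama = pd.read_csv("impCAMA.csv", dtype={"PARID": str})
    points = pd.read_csv("impPoints.csv", dtype={"PARID": str})

    # Buildings are made by joining impCAMA to impPoints (not the other way around).
    merged = cama.merge(points, on="PARID", how="left", indicator=True)
    hit_rate = (merged["_merge"] == "both").mean()
    print(f"hit rate: {hit_rate:.1%}")  # the workflow expects better than 85%

    # The misses are the rows to inspect for formatting problems.
    print(merged.loc[merged["_merge"] == "left_only", "PARID"].head())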

Task 1.5.5 – QC Buildings

Buildings data is not fixed or repaired.  However, too many ‘Unk’ or ‘0’ values may indicate that the wrong domains were applied.  Bad hit-rates (lower than 85%) may mean that the wrong PARIDs are being used for the join (ParcelNo, ParNum or ParcelID?).  Data processing events may be needed to clean up the PARID values (remove spaces, upper-case/lower-case, remove dashes, add county FIPs etc…) to improve the match rates.  It is important that the match rate between impCAMA and impPoints is optimized.

Buildings will be used to create Building Inventory.  BI cannot be populated with ‘0’ or ‘Unk’ values.  CDMS will trap invalid or missing values – one mistake will prevent the entire record set from being loaded into Hazus.  QC Buildings carefully to make sure it is ready to be handed-off.  Tools are provided:

  • Right-click | Properties the 15_QC_Buildings tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.

Caution:  Check to make sure you are clicking on the right dataset for Source and Destination paths listed under Published Parameters.  Wrong paths will lead to a failure in FME.

 

  • Set the Published Parameters | SourceDataset to:

…\RegionW\Data_Sources\Buildings\
            GA_RegionW_Buildings_GDB.mdb

…\RegionW\Data_Sources\Improvements\
            GA_RegionW_Improvements_GDB.mdb

  • Set the Published Parameters | DestDataset to:

…\RegionW\Data_Sources\Buildings\
            GA_RegionW_Buildings_GDB.mdb

  • Run the script and review the log file to make sure that the records were processed correctly.
  • If all OK, save the log file to
    …\RegionW\Reports\Logs
                      GA_RegionW_DS_15_QCBuildings_<yymmdd>.txt
  • Save your changes and exit the FME workbench.
  • Save your changes and exit ArcGIS.

The best available values will be calculated to replace all occurrences of ‘0’ and ‘Unk’ in the BI Maker.  The risk is that the BI appears to be much higher quality than it really is.  Buildings are a much better indicator of the quality of the ‘soon to be’ modeling data.  Review the Building QC reports in Access:

  • Open Access to the Buildings GDB

…\Data_Management\RegionW\Data_Sources\Buildings\
            GA_RegionW_Buildings_GDB.mdb

  • Select the QC report named impCAMA_QC and right-click > Design View. Change the report title to RegionW, Georgia.
  • Right-Click on the report named impCAMA_QC and export to PDF:

…\Data_Management\RegionW\Reports\Exposure\
            GA_RegionW_impCAMA_QC.pdf

  • Select the QC report named impDomains_QC and right-click > Design View. Change the report title to RegionW, Georgia.
  • Right-Click on the report named impDomains_QC and export to PDF:

…\Data_Management\RegionW\Reports\Exposure\
            GA_RegionW_impDomains_QC.pdf

  • Exit Access

Review the QC reports.  Open the QC report named impDomains_QC.pdf.

The Lipstick Application report shows how much of the CAMA values will be replaced in BI.  A Lipstick Factor of 14% means that 14% of the CAMA values either did not match the provided domains, were empty (<Null> or ‘0’) or were unusable (‘Other’, ‘Pending’).  Low Lipstick Factors are good.  High Lipstick Factors may indicate that the Control Domains are not being used (the data is bad), or that the wrong domains are being used (the translator is bad).  If the data is bad, consider making the domains from the data rather than the control tables.  The missed values are output to a table named impDomains_Failed.
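
The Lipstick Factor itself is just the share of values that will be replaced.  A tiny worked example:

    # Values drawn from a CAMA field; 'Unk', '0', 'Other' and 'Pending' will be replaced.
    values = ["Wood", "Unk", "0", "Masonry", "Other", "Wood"]
    replaced = sum(v in ("Unk", "0", "Other", "Pending") for v in values)
    print(f"Lipstick Factor: {replaced / len(values):.0%}")  # 50%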

Export Lipstick reports to the Process folder

Open the QC report named impCAMA_QC.pdf.  Buildings are made by joining impCAMA to impPoints – not the other way around.  There may be more than one impCAMA on an impPoint (they will be coincident). Review the ‘hit rate’ between impCAMA and impPoints.

Export Hit rate to Process folder

Since the Parcel and CAMA databases are maintained separately, a 100% hit rate is unlikely, but it should be better than 85%.  The risk is that the GBS will be degraded (not improved) with too many misses (records not migrated from CAMA).  Not all impPoints will have an impCAMA.  Missed records are ported through the NotMerged port to a table called ‘impCAMA_Failed’.  Review these to see why they failed, and whether they can be corrected.  The join field (PARID) comes from two different sources.  There is no guarantee that they will match.

Make sure that the PARIDs have identical formats (same types and field lengths) in impCAMA and impPoints.  Index both key fields to improve performance.  Review the records that did not join to see if they can be included (a cleanup sketch follows the list).  Common PARID data problems are:

  1. Extra characters (-, _ ) that need to be removed from impCAMA or impPoints
  2. Trailing or leading spaces
  3. Extra qualifiers that need to be added or removed. For example, one set of PARIDs may have a County FIPs suffix.
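
A normalization pass over both key fields often recovers most of the misses.  A minimal sketch of the fixes listed above (the county FIPS prefix is a hypothetical example of item 3):

    def clean_parid(parid: str, county_fips: str = "") -> str:
        """Normalize a PARID so impCAMA and impPoints keys compare equal."""
        parid = parid.strip().upper()                  # 2: stray spaces, mixed case
        for ch in ("-", "_", " "):
            parid = parid.replace(ch, "")              # 1: extra characters
        if county_fips and not parid.startswith(county_fips):
            parid = county_fips + parid                # 3: missing FIPS qualifier
        return parid

    assert clean_parid(" 013-0042 a ") == "0130042A"
    assert clean_parid("0042B", county_fips="13191") == "131910042B"

Apply the same function to both datasets; cleaning only one side simply moves the mismatch.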

After the Buildings have been QC’ed and pass inspection, add the QC statistics to the RegionW process document.  The Buildings are ready for migration to Building Inventory.  RegionW Inventory tools and workflows are provided separately.

  • Update the QC statistics in the RegionW Process document
    …\Data_Management\RegionW\Reports\Process
                      doc

Task 1.6 – Incorporate Essential Facilities from GMIS

The current process for managing essential facilities in the State of Georgia is for counties to upload updates to these facilities to the Georgia Mitigation Information System (GMIS) database managed by the University of Georgia’s (UGA) Carl Vinson Institute of Government.  Arrangements have been made with UGA to provide extracts of the GMIS database in a consistent format that can be converted to Hazus-MH compliant essential facilities using the tools described below.   Task 1.6 assumes that the county being modeled has provided credible essential facility updates to the GMIS.

Note: In the event that the county being modeled has not uploaded their current essential facilities inventory to GMIS and it is not possible to do this in a timely manner, you have the option of updating the inventory from local data.  That process is described in Appendix 6.

Open the GA_RegionW_Facilities.mxd and redirect the layers as necessary.

Get with ITOS to cross check

Note: Polis Center computers have been updated with the new EF.mdb.  As a result, it is only necessary to follow the steps below to create EFs for the county.

  • Add the facilities from C:\HazusData_22\GA\EF.mdb
  • Clip each feature class to the county boundary (see the sketch after this list) and save to

C:\Projects\Hazus_Projects\PDM_Georgia\Data_Management\RegionW\Hazus_Updates\Essential_Facilities\GA_RegionW_EFs.mdb

  • EOCFacilities
  • FireStationFacilities
  • MedicalCareFacilities
  • PoliceStationFacilities
  • SchoolFacilities
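
The clip loop above can be scripted with arcpy (which ships with ArcGIS Desktop).  A minimal sketch; the County feature class name inside the Boundaries GDB is an assumption:

    import os
    import arcpy

    SRC = r"C:\HazusData_22\GA\EF.mdb"
    DST = r"C:\Projects\Hazus_Projects\PDM_Georgia\Data_Management\RegionW\Hazus_Updates\Essential_Facilities\GA_RegionW_EFs.mdb"
    COUNTY = r"C:\Projects\Hazus_Projects\PDM_Georgia\Data_Management\RegionW\Inventory\Boundaries\GA_RegionW_Boundaries_GDB.mdb\County"  # assumed name

    # Clip each Hazus essential-facility feature class to the county boundary.
    for fc in ["EOCFacilities", "FireStationFacilities", "MedicalCareFacilities",
               "PoliceStationFacilities", "SchoolFacilities"]:
        arcpy.Clip_analysis(os.path.join(SRC, fc), COUNTY, os.path.join(DST, fc))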

 

However, if it is necessary to create the EF.mdb from the GMIS download:

Add the Hazus Updates toolbox: …\Data_Management\RegionW\Tools\GA_RegionW_Data_Sources.tbx.

  • Right-click | Properties the 18 EF Extractor tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.
  • Set the Published Parameters | SourceDataset to:

… \Data_Sources\Downloads\Inventory_Data\GMIS\<Name of GMIS spreadsheet download>.xlsx

…\Data_Management\RegionW\Inventory\Boundaries
            GA_RegionW_Boundaries_GDB.mdb

  • Set the Published Parameters | DestDataset to:

…\RegionW\Working\
            GA_RegionW_Working_GDB.mdb

  • Run the script and review the log file to make sure that the records were processed correctly.
  • If all OK, save the log file to
    …\RegionW\Reports\Logs
                      GA_RegionW_18_EF Extractor_<yymmdd>.txt
  • Save your changes and exit the FME workbench.

 

The Extractor tool extracts the facilities for a particular county.  Modifications need to be made to the attributes of the data to make it Hazus-MH compliant.

 

  • Right-click | Properties the 19 UC EF to HZ tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.
  • Set the Published Parameters | SourceDataset to:

…\RegionW\Working\
            GA_RegionW_Working_GDB.mdb

  • Set the Published Parameters | DestDataset to:

…\RegionW\Hazus_Updates\Essential_Facilities
            GA_RegionW_EFs_GDB.mdb

  • Run the script and review the log file to make sure that the records were processed correctly.
  • If all OK, save the log file to
    …\RegionW\Reports\Logs
                      GA_RegionW_19_EF to HZ_<yymmdd>.txt

 

Save your changes and exit the FME workbench.

 

Save your changes and exit ArcGIS.

[Note] The GMIS data on which the tools were built included BuildingValue with the type of valuation also noted.  ValuationType included “Assessed Value”, “Market Value”, “Replace Value”, “Other”, and “Unknown”.  There was no adjustment made to the Building Value based upon the Valuation Type; the data was used as provided.

 

Note: The appendices that follow give additional information about the tools and methods used in Tasks 1.1 to 1.6.  It is not necessary to perform any of the steps outlined; however, it is important to understand the techniques involved in creating the data.

Appendix 1    Glossary

The following terms and abbreviations are used throughout the workflow documentation.

Abbreviation                 Context          Definition
CDMS                         Abbreviation     Comprehensive Data Management System
Polis                        Abbreviation     Data 3.0 Professional Services
Polis                        Abbreviation     The Polis Center
EF                           Abbreviation     Hazus Essential Facility point feature classes
HPLF                         Abbreviation     Hazus High Potential Loss point feature classes
TRNS                         Abbreviation     Hazus Transportation point feature classes
UTIL                         Abbreviation     Hazus Utilities point feature classes
ETL                          Abbreviation     Extract Transform Load
FME                          Abbreviation     Feature Manipulation Engine
GBS                          Abbreviation     General Building Stock
HIPOC                        Abbreviation     Hazus International Proof of Concept
CTP                          Abbreviation     Contracted Technical Partner
PDM                          Abbreviation     Pre-Disaster Mitigation
UDF                          Abbreviation     User Defined Facilities
GA                           Abbreviation     State abbreviation (e.g. VW, GA, IN, PR)
ac                           Prefix           Accessories
re                           Prefix           Residential
mo                           Prefix           Mobile
co                           Prefix           Commercial and Industrial
Facilities                   Term             Data sources for FI
Buildings                    Term             Data sources for BI
BI                           Feature Class    Building Inventory
FI                           Feature Class    Facility Inventory
DEM                          Raster           Digital Elevation Model – 10m statewide
Building Inventory           Term             Editing point GDB for Hazus GBS or UDF analysis
Essential Facilities         Term             Hazus Care, Fire, EOC, Police, School facilities
Transportation Facilities    Term             Hazus Airports, Bridges, Bus
HPLF Facilities              Term             Hazus Hazmat, Dams, Military, Nuclear Power
Utility Facilities           Term             Hazus Electric, Gas, Water, Waste, Communication
Critical Facilities          Term             Provided data sources for Facility Inventory
Facility Inventory           Term             Editing point GDB for Hazus EF analysis
General Building Stock       Term             Hazus aggregate inventory by Tract or Block
Study Region                 Term             Hazus modeling extent
Study Area                   Term             Hazus area extent (sum of all Study Regions)
RegionW                      Term             Modeling area (usually a county, e.g. RegionW)
User Defined Facilities      Term             Hazus point inventory


Appendix 2    Issues and Questions

Data issues were discovered during the RegionW Pilot using Hazus 2.2.  Some have been resolved.  Others need to be resolved before proceeding to the next county.

Environment

  • Hazus 2.2 SP1 (released on 20-May-2015) was used to develop the Ver 1.1 workflow. The tools may need to be re-run on the current release.
  • Do we need to consider SQL Server Management Studio? Scripts are provided to compress the Study Regions without needing SQL Server.
  • Compress the Study Regions – the SQL log files are huge.

Data Sources

  • Many records contain <Nulls> or “ “ or “0” or “Unk”.
    All fields need to be populated with legitimate values.
  • The important CAMA fields are:
    PARID | Occupancy | Area | Cost | Construction | Foundation | Condition | Age | NumStories

Inventory

  • The xFactors ($/sqft) are derived from CAMA to determine impArea values for every structure. Replacement costs were calculated using Hazus 2.2 RSMeans hzDolSqFt values adjusted for RegionW.  New xFactors are determined for each county.
  • EFs were updated statewide.
          The steps are provided in \Georgia\Reports\Workflow.
          The tools are provided in \Georgia\Tools
  • Facilities were NOT migrated in RegionW. The Facilities workflow and tools for RegionW are provided in case EFs need to be updated or maintained locally.
  • Non-EF records are NOT migrated or modeled in Georgia PDM. [TBD] They may be reported if they fall inside the flood boundary.

Hazus Updates

  • UDFs are loaded into an existing Study Region using Hazus Import tools. FME tools exist to populate the Study Region SQL Server tables.  This workflow is not documented in Ver 1.1.

Reporting

  • SQL Server Management Studio is a better option for reporting losses in a Study Region. This option was not implemented in Ver 1.1.
  • Reporting tools for loss estimates were not documented in Ver 1.1. Hazus UDF reporting options are weak.  Ver 1.2 will explore GBS inventory to unleash better reporting tools (debris, shelter, business interruption losses etc…)

Appendix 3    Design Considerations

The RegionW Template is focused on flood (FL) models using inventory updated from local data sources.  There are known design limits:

  • The content and structure of the Data Source (DS) records belong to GEMA. DS records are migrated to Facility Inventory (FI) and Building Inventory (BI) for modeling purposes.
  • Do not add fields to ‘improve’ the DS data. Desired DS fields will be imported into BI (e.g. Age, Condition, Construction, Occupancy and Foundation).  Unwanted fields will be left behind.  The best way to improve the inventory is to populate the DS fields with legitimate values prior to delivery to Polis.
  • DS schemas need to be established during the Pilot. The more these change, the more the tools and workflows need to be changed.
  • DS formats will be standardized at the start of the workflow. Feature classes will be migrated to an ESRI PGDB.  Tables will be migrated to MS Access.  Critical fields will be selected.  Legitimate records will be selected.
  • DS records within a county need to be complete. The workflow is ‘replace’, not append.  There are no reconciliation steps in the workflow – out with the old, in with the new.  Records outside the county boundary will be ignored.
  • DS feature classes will be projected to GCS-NAD83 and clipped to the county boundary before starting work. This is the only projection system that Hazus supports.
  • The impCAMA design belongs to Polis. It represents the critical building attributes sourced from multiple CAMA vendors.  It is NOT enhanced – no lipstick on a pig.  The values are derived from CAMA – but the codes are replaced with descriptions so that they can be easily read and QC’ed.  Numbers are put in Numeric fields (Byte, Integer and Single).  Text attributes are sized AN(15).
  • The ImpPoint design belongs to Polis. It represents the ‘best available’ building point locations.  The Lat Lons are derived from parcel centroids, footprint centroids or E911 addresses.  The only field of interest is PARID – indexed and formatted the same as impCAMA | PARID.
  • The FI and BI designs belong to Polis. Values, domains, fields and formats are built to support several modeling tools (SAGA, Hazus, Aloha, FIA and SHERPA).  Although the tools to create BI and FI are dependent upon the various data sources that may be used, the approach can be standardized across projects.
  • The GBS and EF designs are based upon the target modeling systems. They are temporary databases used specifically for data loading.  Do not edit data in these databases.
  • BI values need to be populated with legitimate values. ‘0’ and ‘Unk’ will not work – they will be replaced with defaults.  CDMS performs strict data validation.  All values must be valid before any records can be imported into CDMS.
  • Hardcoded defaults (e.g. ‘1970’) are avoided when better values can be derived. For example, the average YearBuilt (by hzOccCode or Block boundary) is a preferred default where values do not exist.  However, this assumes that neighboring records have legitimate entries (a sketch of this fallback follows this list).
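
A minimal sketch of that fallback, assuming records carry hzOccCode and YearBuilt fields where 0 marks a missing year; the data and the 1970 last-resort value are illustrative.

    # Hedged sketch: replace missing YearBuilt (0) with the average YearBuilt
    # of legitimate records sharing the same hzOccCode, rather than a
    # hardcoded default like 1970.
    def fill_year_built(records):
        sums, counts = {}, {}
        for r in records:
            if r["YearBuilt"] > 0:
                sums[r["hzOccCode"]] = sums.get(r["hzOccCode"], 0) + r["YearBuilt"]
                counts[r["hzOccCode"]] = counts.get(r["hzOccCode"], 0) + 1
        averages = {occ: round(sums[occ] / counts[occ]) for occ in sums}
        for r in records:
            if r["YearBuilt"] <= 0:
                # Fall back to the group average; 1970 only if no neighbors exist.
                r["YearBuilt"] = averages.get(r["hzOccCode"], 1970)
        return records

    records = [
        {"hzOccCode": "RES1", "YearBuilt": 1962},
        {"hzOccCode": "RES1", "YearBuilt": 1988},
        {"hzOccCode": "RES1", "YearBuilt": 0},   # missing -> filled with 1975
    ]
    print(fill_year_built(records))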

Appendix 4    FME Algorithms for Buildings

The following filters and mapping schemes were applied to create impCAMA for RegionW.  impCAMA is a table representing the structures to be modeled.  Information to populate the characteristics of each structure is gleaned from multiple sources.

Domain tables that translate CAMA codes to descriptions are provided in the Improvements MDB.  The WinGAP domain tables can be customized for each Region.

Fields are prefixed with imp to indicate that this is raw data.  There is no attempt to make the data better than it is.  Defaults are NOT applied here.  Missing values are recorded as ‘Unk’ or ‘0’.  Domains are applied, meaning that the impCAMA table will contain the descriptions, not the codes.  Descriptions are AN(15).  Hazus Occupancy Codes are populated.

The unit for impCost is dollars ($).  The unit for impArea is square feet (sqft).

impCost is calculated as impArea * hzDolSqFt.  The hzDolSqFt values are the Hazus Ver 2.2 RSMeans values for each hzOccCode.
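
For example, a hypothetical 2,000 sqft RES1 structure with an hzDolSqFt of $85/sqft (an illustrative figure, not the actual RSMeans value) would yield impCost = 2,000 * 85 = $170,000.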

impConstruction values are set to ‘Unk’ unless provided in Domain_Construction.

impCondition values are set to ‘Unk’ unless provided in Domain_Condition.

Appendix 5    FME Algorithms for Facilities

The following filters and mapping schemes were applied to create Facilities for RegionW.  ‘Facilities’ is a GDB representing the feature classes to be modeled and reported individually. 

Standard processing flow using the GMIS database

The field values below were used to separate out the essential facilities.

 

Facility Type | Field/Value | Coded As
K-12 | Occupancy Contains “Grade School” | EFS1
Higher Education | Occupancy Contains “Colleges” | EFS2
Fire | Occupancy Contains “Emergency Response”; FacilityTypes Contains “Fire” or “EMS”; FacilityTypes does NOT Contain “Medical” | FDFLT
Police | Occupancy Contains “Emergency Response”; FacilityTypes Contains “Law Enforcement” | PDFLT
Care | Occupancy Contains “Hospital”, “Medical”, or “Nursing” | MDFLT
EOC | Occupancy Contains “Emergency Response” or “General Services”; FacilityTypes Contains “EMA” | EFEO
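
A minimal sketch of these filters, assuming each GMIS record exposes Occupancy and FacilityTypes as text; this is a plain-Python restatement of the table above, not the actual FME transformers.

    # Hedged sketch: classify a GMIS record into an essential-facility code
    # using the filter table above. Returns None for records that do not match
    # (candidates for Community Assets).
    def classify_ef(occupancy: str, facility_types: str):
        occ, fac = occupancy or "", facility_types or ""
        if "Grade School" in occ:
            return "EFS1"                      # K-12
        if "Colleges" in occ:
            return "EFS2"                      # Higher Education
        if "Emergency Response" in occ:
            if ("Fire" in fac or "EMS" in fac) and "Medical" not in fac:
                return "FDFLT"                 # Fire
            if "Law Enforcement" in fac:
                return "PDFLT"                 # Police
        if any(k in occ for k in ("Hospital", "Medical", "Nursing")):
            return "MDFLT"                     # Care
        if ("Emergency Response" in occ or "General Services" in occ) and "EMA" in fac:
            return "EFEO"                      # EOC
        return None

    print(classify_ef("Emergency Response", "Fire, EMS"))   # FDFLT
    print(classify_ef("General Services", "EMA"))           # EFEO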

 

 

Field names and data types (string or number) were adjusted to facilitate import into Hazus.

 

The GMIS data included BuildingValue with the type of valuation also noted.  ValuationType included “Assessed Value”, “Market Value”, “Replace Value”, “Other”, and “Unknown”.  There was no adjustment made to the Building Value based upon the Valuation Type; the data was used as provided.


 

Appendix 6    Optional Process for Updating Essential Facilities

The following instructions are provided should the county choose to update their facilities locally.

Facilities data sources come in all shapes and sizes.  The goal is to standardize the inputs to a common format.  The generic name for the provided sources is ‘Critical Facilities’.  The outputs are the Facilities which will be used to create Facility Inventory.

  1. Personal GDB format
  2. GCS_NAD83 projection
  3. Required fields
  4. Required records

GEMA is the provider of Critical Facilities statewide.

[Note]  Sample EFs for GA were migrated in Aug-2012.  The source was a FGDB (18k records) named GMIS.gdb provided by DCA (Terry Jackson).  DCA also provided HSIP data, and LawEnforcement, Nursing Home and EMS SHPs.  These various sources were migrated for discussion purposes at the Kick-Off.  A FGDB named GMIS_EF_Source was provided by GMIS after the Kick-Off.  Statewide EFs were updated in Jan-2013 from a source feature class called Facilities.

[Note]  GMIS provided a FGDB again in Jun-2015.  Statewide EFs were updated in Jul-2015 from a source feature class called Facilities. 

[Option]  The Facilities workflow is provided to users wanting to maintain or update EFs locally.  In this case, the Critical Facilities are exported from Hazus for updating.  The steps to export, edit and import Critical Facilities are provided in Section 2 of this appendix.  In this case, there is no need to create Facilities.

Section 1: Creating Facilities

The high-level workflow to create Facilities:

  1. Create Domain tables to compile Critical Facility codes
  2. Clip to the county boundary
  3. Project to GCS-NAD83
  4. Filter Essential Facilities (School, Care, Fire, Police and EOC)
  5. Optional – Filter Transportation Facilities (Bridges, Airports, Bus)
  6. Optional – Filter High Potential Loss (Hazmat, Dams, Military, Nuclear)
  7. Optional – Filter Utilities (Electric, Gas, Water, Wastewater, Communications)
  8. All other records are Community Assets (e.g. Court House, Library, Stadium)
  9. Export selected points to the Facilities GDB

 

Make Facilities Domains

Domains will be created to convert the Facility codes into Facility descriptions.  The Facilities data is interrogated to find the unique codes for Class, Condition, Construction and Foundation types.  The name of the output table is Domains_Facilities.  The user must provide the descriptions from the Facilities documentation if domain tables have not been provided.
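
A minimal sketch of that interrogation step, assuming the facility records have been exported to a CSV; the file name and field list are placeholders, and the real tool is the FME workbench described below.

    # Hedged sketch: collect the unique codes per domain field so descriptions
    # can be filled in from the Facilities documentation.
    import csv

    DOMAIN_FIELDS = ["Class", "Condition", "Construction", "Foundation"]

    codes = {field: set() for field in DOMAIN_FIELDS}
    with open("facilities_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            for field in DOMAIN_FIELDS:
                value = (row.get(field) or "").strip()
                if value:
                    codes[field].add(value)

    # One Domain_<Field> table per field: Code column filled, Description blank.
    for field in DOMAIN_FIELDS:
        print(f"Domain_{field}: {sorted(codes[field])}")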

[Note]  If the Domain tables have been provided they may be imported into FME directly and this task may be skipped.

The domain values will be extracted from the state data so that they may be applied to all counties, not just RegionW.

  • Right-click | Properties the 21_Make_Facilities_Domains tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.
  • Set the Published Parameters | SourceDataset to:

…\Georgia\Data_Sources\Downloads\Inventory_Data\
            GMISExport<Date>

  • Set the Published Parameters | DestDataset to:

…\RegionW\Data_Sources\Facilities\
            GA_RegionW_Facilities.mdb

  • Run the script and review the log file to make sure that the records were processed correctly.
  • Add the new Domains_Facilities table to the Data Sources Group Layer and review the results.
  • If all OK, save the log file to:

…\RegionW\Reports\Logs\
            GA_RegionW_DS_21_MakeFacilitiesDomains_<yymmdd>.txt

  • Save your changes and exit the FME workbench.

 

The new Domains_Facilities table must be edited.  The user must add the Descriptions from the Facilities documentation.  Brief descriptions are needed – all values will be truncated to 15 characters.

 

  • Open the Domains_Facilities table in Access:

…\Data_Management\RegionW\Data_Sources\Facilities

GA_RegionW_Facilities.mdb  |  Domains_Facilities

  • Enter the correct Descriptions into Domains_Facilities
  • Copy the Domains_Facilities table for each Domain table needed to populate the Occupancy, Condition, Construction and Foundation descriptions. Name each Domain using the following convention:

Domain_<Facilities_Field>

  • Sort on the Domain field and delete the domain records that do not apply
  • Exit Access

Create Facilities

Much of the raw GMIS data is not needed.  Required records are identified, and the desired fields are extracted into a Facilities GDB.  The Facilities GDB is preformatted to include the 22 point feature classes that can be modeled.  Facilities that cannot be modeled are migrated into a new feature class named Community Assets.

The Facilities feature classes are ‘flat’ – one record for each structure to be modeled.  The Facilities feature classes have the same names as the destination FI, prefixed with ‘fac’.  The ‘fac’ feature classes have the same field names and field types as the destination FI.  However, all domains and constraints have been removed.  This is not an editing GDB.

[Note] This is the same workflow as described in Create Buildings.  This time, the prefix ‘imp’ is replaced with ‘fac’.  The prefixes are used on fields whose values will potentially be changed.  This distinguishes the raw data from the modeling data.

The GMIS download includes facilities for the entire state.  The records will need to be clipped to the county boundary and re-projected.  There is only one feature class.  Records will need to be filtered into the desired Essential Facilities and Community Assets.

Facilities data is not fixed or repaired.  However, <Null> values are replaced with ‘Unk’ (Char fields) or ‘0’ (Numeric fields) so that they can be evaluated.  Class codes are populated – the Critical Facilities are categorized into one of 22 site specific Hazus feature classes.  In some PDM projects only the EF classes are passed through.

Records that do not have valid Class Codes need to be evaluated.  In some PDM projects, these records are aggregated to Community Assets.  Community Assets are not modeled in Hazus, but they are reported if they lie inside the flood boundary.

Codes are replaced with Descriptions using the Domains constructed previously.  If the BldgConstruction domain value ‘W’ designates ‘Wood’, then ‘Wood’ is captured in Facilities.  Values such as ‘w’, ‘W ’, ‘W?’, ‘WW’ and ‘Wood’ are converted to ‘Unk’ (see the sketch below).  This provides the ability to review Facilities without having to understand or decode the underlying Facilities system.
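
A minimal sketch of this exact-match behavior, with a hypothetical domain table; the real mapping is done by the FME ValueMapper transformers described below.

    # Hedged sketch: apply a domain exactly as described above - only codes
    # present in the domain table map to descriptions; everything else,
    # including near-misses like 'w' or 'W ', becomes 'Unk'.
    DOMAIN_CONSTRUCTION = {"W": "Wood", "M": "Masonry", "S": "Steel"}  # hypothetical

    def apply_domain(code, domain):
        return domain.get(code, "Unk")  # exact match only; no cleanup of the input

    for code in ["W", "w", "W ", "W?", "WW", "Wood"]:
        print(repr(code), "->", apply_domain(code, DOMAIN_CONSTRUCTION))
    # 'W' -> 'Wood'; all the rest -> 'Unk'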

[Caution]   The Make_Facilities tool will run successfully even if the ValueMapper transformers have not been updated to reflect the values in RegionW.  However, the descriptions may be wrong (that is, a ‘3’ in RegionW is not the same thing as a ‘3’ in RegionB) or there will be many more illegitimate ‘Unk’ values (that is, a ‘7’ in RegionW was not even coded for in RegionB).  These issues are avoided by making statewide Domains.

  • Right-click | Properties the 12_Make_Facilities tool and change the tool Description to reflect the current implementation.
  • Right-click | Edit to open up the FME workbench.
  • Set the Published Parameters | SourceDataset to:

…\Georgia\Data_Sources\Downloads\Inventory_Data\
            GMISExport<Date>

  • Set the Published Parameters | DestDataset to:

…\RegionW\Data_Sources\Improvements\
            GA_RegionW_Facilities.mdb

  • Update the ValueMapper transformers based upon the new Domains_Facilities values in Task 1.7.1. The Construction ValueMapper is shown below as an example:
  • Right-click Properties > AttributeValueMapper
  • Delete all the current entries (empty Source Values and Destination Values)
  • Import > Source And Destination Values
  • Navigate to the database containing the Domain table > Next >

…\Data_Sources\Improvements\GA_RegionW_Facilities.mdb

  • Select the relevant Domain table > Next >

Domain_Construction

  • Select the ‘Source Value’ column > Next >

Code

  • Select the ‘Destination Value’ column > Next > Import

Description

  • Continue to update all ValueMappers to match the Domain tables in

GA_RegionW_Facilities.mdb

Construction  |  Condition  |  Foundation

  • Run the script and review the log file to make sure that the records were processed correctly.
  • Add the new Facilities feature classes to the Data Sources Group Layer and review the results.
  • If all OK, save the log file to
    …\RegionW\Reports\Logs
                      GA_RegionW_DS_22_MakeFacilities_<yymmdd>.txt
  • Save your changes and exit the FME workbench.
  • Exit ArcGIS and save the changes.

 

This completes the Data Sources workflow.  The provided source data has been standardized into Facilities and Buildings.  Buildings and Facilities will be used to make the editing FI and BI GDBs.  The steps can be found in the Inventory workflow.

The process continues in Appendix 2 of the Inventory Workflow.

 

Section 2: Create Facility Inventory from Essential Facilities

Regions that cannot provide data sources to create EFs need another solution.  In this case, the best available data source is Hazus.  The option to export Hazus EFs and migrate them to FI_GDB for editing/updating purposes is provided here.  Skip this section if you already have new EFs ready to be imported into Hazus.

The default Essential Facilities exported from Hazus in the previous step will be used to make Facility Inventory.  Do NOT be tempted to share or edit the CDMS_Export_EF_DB.  This is not an editing GDB – domains and field constraints have not been applied.  Use the FI GDB to avoid data validation errors.

  • Open the Essential Facilities MXD in ArcGIS

…\Data_Management\RegionW\MXD_Documents

            GA_RegionW_Hazus_Updates.mxd

  • File > Add Data for RegionW EF to the Hazus Layer Group

…\RegionW\hazus_Updates\Site_Specific\
            GA_RegionW_CDMS_Export_EF_2014_GDB.mdb

FireStationFacilities

SchoolFacilities

PoliceStationFacilities

MedicalCareFacilities

EOCFacilities

 

Review the Essential Facility feature classes.  These are the default records exported for RegionW in the previous task.  These facilities will be updated in the FI_GDB.  Facilities will be added, changed, deleted and moved.

  • Right-click and select Edit to open the 22_OPT_EF_To_FI_GDB script from the RegionW_Hazus_Updates toolbox.
  • Set the Reader data source to:
    …\RegionW\Hazus_Updates\Site_Specific
                            mdb
  • Set the Writer data destination to:
    …\RegionW\Inventory\Facility_Inventory\
                            mdb
  • Click the run arrow to start the translator.
  • Review the log file for errors. Export the logfile to

…\RegionW\Reports\Logs
            GA_RegionW_HU_22_EF2FI_<yymmdd>.txt

  • Exit the Workbench and save changes to the 22_OPT_EF_To_FI_GDB FME script
  • Exit ArcGIS and save changes to the RegionW_Hazus_Updates MXD.

The exported EFs may be edited in GA_RegionW_FI_GDB.mdb using ArcGIS.  The goal is roof-top accuracy for the spatial locations.  The records should be current, so add missing Schools, Fire Stations, Police Stations and Hospitals.  Populate all missing values.

After the FI GDB has been updated, the feature classes will need to be migrated back to CDMS_Import_EF_GDB (see Task 5.1.2).

 

FME Algorithms

Information to populate the characteristics of each facility is gleaned from GA_Critical_2015.shp.

Matrix tables that translate DS values to target values are provided in the Facilities GDB.  The Pilot assumptions and matrix tables can be customized as the requirements change.

Feature classes are prefixed with ‘fac’ to indicate that this is raw data.  There is no attempt to make the data better than it is.  Defaults are NOT applied here.  Missing values are recorded as ‘Unk’ or ‘0’.  Domains are applied, meaning that the Facilities feature classes will contain the descriptions, not the codes.  Descriptions are AN(15).  CAT and SUBCAT codes are used to filter the feature classes.

Only Essential Facilities were migrated.  Records that could not be modeled as EFs were migrated as Community Assets.

The unit for facCost is dollars ($).  The unit for facArea is square feet (sqft).
