The ultimate goal of my project is to create a time series of predictive habitat suitability maps for an invasive species of interest to the MD DNR. We want to predict how climate change from 2010 to 2100 may affect the species' suitable habitat range.

I acquired observational temperature and precipitation point data from 2009, taken from a NARR reanalysis product. There are 35 points for each variable.

I also consulted the IPCC's most recent (2007) predicted change curves for temperature and precipitation. I chose the A1B scenario, calculated the total predicted change in each variable, divided by the 90-year time span to get the change per year, and multiplied that by 5, since I am interested in 5-year time steps. The results were a 0.135 C increase in temperature per 5 years and a 0.2% increase in precipitation per 5 years. (We decided that for the purposes of this project a linear increase was appropriate, even though the actual increase will most likely be exponential.)
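In plain Python, the arithmetic is as follows (the 90-year totals here are back-computed from the per-step increments above, so treat them as assumed rather than quoted from the report):

```python
# Sanity check: derive the per-5-year increments from the 90-year totals.
# The totals (2.43 degC, 3.6%) are back-computed from the increments, not
# taken from the IPCC report directly.
span_years = 2100 - 2010          # 90-year projection window
step_years = 5                    # 5-year time steps

temp_total = 2.43                 # assumed total temperature change, degC
prec_total = 3.6                  # assumed total precipitation change, %

temp_per_step = temp_total / span_years * step_years   # 0.135 degC / 5 yr
prec_per_step = prec_total / span_years * step_years   # 0.2 %    / 5 yr
```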

I imported the temperature and precipitation points into ArcMap, plotted the XY coordinates, and exported the layers as .shp files.

I was not sure which of two methods was more appropriate. I explain each below, along with the issues I am having...

1) First add fields (2010, 2015, 2020, ...) to the attribute table of each point layer, then use the Field Calculator to populate each field, creating a time series of "observation points". The calculation would be

2015 = 2010 + 0.135, etc. I could then interpolate each year's set of points to get a map for that year. (I researched interpolation methods and chose kriging as appropriate for temperature data.)
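That field arithmetic can be sketched outside ArcMap in plain Python (the 2010 point values here are invented for illustration; in ArcMap the equivalent would be a Field Calculator expression such as `!F2010! + 0.135` for each new field):

```python
# Sketch of option 1: build a 2010-2100 time series for each observation
# point by adding the fixed 0.135 degC increment per 5-year step.
# The base values below are made up for illustration.
points_2010 = {"pt1": 12.4, "pt2": 13.1, "pt3": 11.8}

years = range(2010, 2101, 5)               # 2010, 2015, ..., 2100
series = {
    pt: {yr: base + 0.135 * ((yr - 2010) // 5) for yr in years}
    for pt, base in points_2010.items()
}

# series["pt1"][2015] is 12.4 + 0.135, and
# series["pt1"][2100] is 12.4 + 18 * 0.135 = 14.83
```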

I completed this process, creating 18 interpolated maps in all, covering the IPCC temperature predictions from 2010 to 2100 in 5-year time steps. All of the maps look the same, which I think is because each map gets its own interval classification. I cannot figure out how to normalize the maps to one standard classification scheme, so that the user can see the differences between maps over time. (Also, is there any way to convert a floating-point raster to a discrete one??)
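The usual fix for that is to compute one global min/max (one shared set of class breaks) across all 18 rasters and apply it to every layer, rather than letting each layer stretch to its own range. A minimal sketch with NumPy arrays standing in for the rasters:

```python
import numpy as np

# Three tiny "rasters" standing in for the 18 interpolated surfaces.
rasters = [np.array([[10.0, 11.0], [12.0, 13.0]]) + 0.135 * i for i in range(3)]

# One shared value range across ALL rasters, so every map uses the same breaks.
lo = min(r.min() for r in rasters)
hi = max(r.max() for r in rasters)

# Classify every raster into the same 5 discrete classes (0..4).
# np.digitize with shared bin edges also answers the float-to-discrete
# question: the output is an integer raster.
edges = np.linspace(lo, hi, 6)[1:-1]       # 4 interior break values
classified = [np.digitize(r, edges) for r in rasters]
```

In ArcMap itself, the same idea is setting manual class breaks in each layer's symbology (or "Apply Symbology From Layer"), and the Reclassify tool (or Int() in Map Algebra) produces a discrete integer raster from a floating-point one.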

2) I used the original 2009 points to perform a kriging interpolation. I was then thinking of using Map Algebra to apply the IPCC change scenario to the original interpolated map, repeating this 18 times to produce all of the maps. In English, my equations would be:

2015 = 2010 + 0.135 .... 2020 = 2015 + 0.135 .... 2025 = 2020 + 0.135

Alternatively, I have a table for each variable showing the total change from 2010, so I could derive every map from the original interpolation, rather than chaining each output (using the 2015 output to create the 2020 output, and so on)...

I am not sure what the corresponding Map Algebra expression would be...
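Either formulation is a one-line raster expression. With NumPy arrays standing in for the rasters, the two forms look like this (in the Raster Calculator it would be the same arithmetic on the kriged layer, e.g. `"temp2010" + 0.135`):

```python
import numpy as np

# Stand-in for the kriged 2010 temperature raster (values invented).
temp_2010 = np.array([[12.0, 13.0], [11.5, 12.5]])

# Chained form: each map built from the previous one.
temp_2015 = temp_2010 + 0.135
temp_2020 = temp_2015 + 0.135

# Equivalent closed form from the original raster only:
def temp_at(year):
    steps = (year - 2010) // 5          # number of 5-year steps elapsed
    return temp_2010 + 0.135 * steps

assert np.allclose(temp_at(2020), temp_2020)
```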

Also, I have the fitted equations for each change curve:

Temp: y = 0.027x - 53.87

Prec: y = 0.04x - 79.8
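Assuming x in those equations is the calendar year, the slopes are consistent with the per-step increments above: 0.027 C/yr × 5 = 0.135, and 0.04 %/yr × 5 = 0.2. A quick check:

```python
# Evaluate the fitted change curves (x = calendar year, assumed).
def temp(x):
    return 0.027 * x - 53.87   # degC change curve

def prec(x):
    return 0.04 * x - 79.8     # % change curve

# The change over one 5-year step is just slope * 5.
temp_step = temp(2015) - temp(2010)   # ~0.135 degC
prec_step = prec(2015) - prec(2010)   # ~0.2 %
```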

Of course, I also thought about writing a Python script to automate this process... but I think that would take me too long right now, as I am supposed to have these layers by tomorrow afternoon.
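In case it helps, here is roughly what that script would look like; the arcpy calls are left as comments since I have not been able to test them (arcpy.sa.Raster assumes a Spatial Analyst licence):

```python
# Sketch of the automation loop: one output raster name and one additive
# constant per 5-year step. The arcpy lines are commented out so the
# bookkeeping can be checked without an ArcGIS licence.
temp_step = 0.135   # degC per 5 years
outputs = {}

for i, year in enumerate(range(2015, 2101, 5), start=1):
    delta = temp_step * i               # total change since 2010
    outputs[f"temp_{year}"] = delta
    # import arcpy
    # from arcpy.sa import Raster
    # arcpy.CheckOutExtension("Spatial")
    # (Raster("temp_2010") + delta).save(f"temp_{year}")

# outputs holds 18 entries, temp_2015 through temp_2100.
```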

Thanks for reading this incredibly long email, and for any help you might be able to provide!

- Rachel
