Hi

I have a layer of lat/long point data representing GPS tracks collected during vessel surveys. An individual survey is the time between departing from and returning to shore, and there can be more than one survey on a given date (i.e. when we returned to shore and then went out again later that day). I also have a layer of 1 km² and 2 km² polygon grid cells. I would like to calculate the time spent 'on survey effort' in each cell so I can adjust for effort in other analyses. I intend to weight each cell in these other calculations by the amount of effort spent in it, e.g. adjusting the density of a particular whale sighting based on the search effort in that cell.

So far I have used a Spatial Join to give each effort point the unique identifier of the grid cell it falls in. I then exported the table to Excel and summed the time spent on effort in each cell by that identifier: for a given date and survey, I summed the time intervals between consecutive points along the track within a cell, then used a Pivot Table to total the survey effort per cell by cell ID and survey number. Unfortunately, I realized afterwards that the total time on effort across all cells (i.e. for a given year) differs between the two spatial scales of 1 km² and 2 km². I believe this is because the calculation misses the intervals where the survey track crossed into a new cell: when consecutive points fall in different cells, the time between them is excluded entirely.

I would really appreciate it if anybody has a better idea of how I can calculate the sum of these survey effort times. I think I need to somehow convert the point data into line data and interpolate the times at which the line crosses into a new grid cell, but I'm not sure how to do this. Any thoughts?

Thanks!

That is an interesting analysis... It seems like you would also need to "length-weight" the times, i.e. assign each cell a share of the time between two fixes in proportion to the length of the track segment that lies within that summary grid cell.

Some basic Python scripting knowledge would surely be a big help.
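To illustrate the length-weighting idea, here is a minimal sketch in plain Python (no GIS libraries). It assumes projected coordinates in metres, square axis-aligned grid cells, and a straight-line track at constant speed between consecutive GPS fixes; all function and variable names are hypothetical. The segment between two fixes is clipped against each cell (Liang-Barsky clipping), and the elapsed time is split in proportion to the clipped lengths, so time is no longer lost when the track crosses a cell boundary.

```python
def fraction_inside(p1, p2, cell):
    """Fraction of the segment p1 -> p2 lying inside an axis-aligned cell
    (xmin, ymin, xmax, ymax), computed with Liang-Barsky clipping."""
    (x1, y1), (x2, y2) = p1, p2
    xmin, ymin, xmax, ymax = cell
    dx, dy = x2 - x1, y2 - y1
    t0, t1 = 0.0, 1.0  # parametric extent of the clipped segment
    for p, q in ((-dx, x1 - xmin), (dx, xmax - x1),
                 (-dy, y1 - ymin), (dy, ymax - y1)):
        if p == 0:
            if q < 0:          # parallel to this edge and outside it
                return 0.0
        else:
            t = q / p
            if p < 0:
                t0 = max(t0, t)
            else:
                t1 = min(t1, t)
    return max(0.0, t1 - t0)

def allocate_time(p1, p2, dt_seconds, cells):
    """Apportion the elapsed time dt_seconds between two consecutive fixes
    to each grid cell, in proportion to the length of the segment inside it.
    cells maps cell_id -> (xmin, ymin, xmax, ymax)."""
    effort = {}
    for cell_id, cell in cells.items():
        f = fraction_inside(p1, p2, cell)
        if f > 0:
            effort[cell_id] = dt_seconds * f
    return effort

# Two adjacent 1 km cells; a 60 s segment crossing from one into the other
# is split 30 s / 30 s instead of being dropped at the boundary.
cells = {"A": (0, 0, 1000, 1000), "B": (1000, 0, 2000, 1000)}
print(allocate_time((500, 500), (1500, 500), 60.0, cells))
# → {'A': 30.0, 'B': 30.0}
```

Summing these per-cell allocations over all consecutive pairs of fixes in a survey (grouped by date and survey number, as in the original Pivot Table) should then give yearly totals that agree between the 1 km² and 2 km² grids. In a real workflow you would likely do the clipping with a GIS library or toolbox rather than by hand, but the accounting is the same.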