I have a point feature layer containing information about water monitoring locations. This includes a unique location ID, upper and lower limits for the recorded values and (what I want to be) the current value at that location. (Table 1)
Table 1:
Gid | Location_ID | Upper_Limit_m | Lower_Limit_m | Current_Value_m |
1 | GW 1 | 880 | 850 | |
2 | GW 2 | 886 | 856 | |
3 | GW 3 | 884 | 854 | |
I also have a table that contains a history of recorded values at those locations listed above. (Table 2)
Table 2:
Location_ID | Date | Recorded_Value |
GW 1 | 01/06/2021 | 867.3 |
GW 2 | 01/06/2021 | 894.1 |
GW 3 | 01/06/2021 | 843.9 |
GW 1 | 05/06/2021 | 862.4 |
I am able to create a relationship between these two tables using Location_ID as the primary key, with a one-to-many cardinality.
What I am trying to achieve is to "join" the most recent value in the Recorded_Value column of Table 2 into the Current_Value_m column in Table 1 for each unique monitoring location (Location_ID). Furthermore, I need to ensure that the Recorded_Value (from Table 2) is within the upper and lower limits (from Table 1).
I'm thinking that I might be able to use a definition query (specifically using SQL) to achieve this.
Is this possible? If so, what would be the syntax to use a value from a related table in a definition query? Or is there an alternative way that I can achieve this?
There are a few possibilities, depending on what you're working with.
If you have an Enterprise geodatabase, showing the latest record is really easy: create a query layer (stored in the map document) or a database view (stored in the database) with the following SQL:
SELECT t1.*, t2.*
FROM Table1 t1
INNER JOIN (
SELECT Location_ID, MAX(Date) MaxDate
FROM Table2
GROUP BY Location_ID
) MaxDates
ON t1.Location_ID = MaxDates.Location_ID
INNER JOIN Table2 t2
ON MaxDates.Location_ID = t2.Location_ID AND MaxDates.MaxDate = t2.Date
Both tools can be found in the Data Management toolbox in the toolset "Layers and Table Views". However, making sure that the record is within the boundaries would be more complicated.
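To illustrate what that "more complicated" bounds check could look like, here is a sketch that runs the same latest-record join against an in-memory SQLite database, with an extra WHERE clause filtering on the limits. This is illustrative only: the table and column names come from the question, SQLite stands in for your actual enterprise geodatabase (SQL Server, Oracle, PostgreSQL, etc.), and date handling in your RDBMS may differ.

```python
# Illustrative sketch: SQLite stands in for an enterprise geodatabase.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Table1 (Gid INTEGER, Location_ID TEXT,
                     Upper_Limit_m REAL, Lower_Limit_m REAL);
CREATE TABLE Table2 (Location_ID TEXT, Date TEXT, Recorded_Value REAL);
INSERT INTO Table1 VALUES (1,'GW 1',880,850),(2,'GW 2',886,856),(3,'GW 3',884,854);
INSERT INTO Table2 VALUES
  ('GW 1','2021-06-01',867.3),
  ('GW 2','2021-06-01',894.1),
  ('GW 3','2021-06-01',843.9),
  ('GW 1','2021-06-05',862.4);
""")

# Same shape as the view above, plus a WHERE clause that keeps only
# latest values that fall inside the limits.
rows = con.execute("""
SELECT t1.Location_ID, t2.Recorded_Value
FROM Table1 t1
INNER JOIN (
    SELECT Location_ID, MAX(Date) MaxDate
    FROM Table2
    GROUP BY Location_ID
) MaxDates
    ON t1.Location_ID = MaxDates.Location_ID
INNER JOIN Table2 t2
    ON MaxDates.Location_ID = t2.Location_ID AND MaxDates.MaxDate = t2.Date
WHERE t2.Recorded_Value BETWEEN t1.Lower_Limit_m AND t1.Upper_Limit_m
""").fetchall()
print(rows)
```

Note the design consequence: a location whose *latest* reading is out of range simply drops out of the result; this query does not fall back to the most recent in-range reading. If you need that fallback behaviour, the bounds check has to move into the subquery.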
If you work in ArcGIS Pro, you can use Attribute Rules.
https://pro.arcgis.com/en/pro-app/2.7/help/data/geodatabases/overview/an-overview-of-attribute-rules...
// Constraint Rule
// Table 2
// Triggers: Insert, Update
// Rejects the edit if the Recorded_Value is not within the upper and lower limits defined in Table 1
// load table 1
var tbl_1 = FeatureSetByName($datastore, "Table1", "*", false)
// filter table 1 by Location_ID
var loc_id = $feature.Location_ID
var loc = First(Filter(tbl_1, "Location_ID = @loc_id"))
// test if the value falls within the bounds, return true or false
var val = $feature.Recorded_Value
return val >= loc.Lower_Limit_m && val <= loc.Upper_Limit_m
// Calculation Rule
// Table 2
// field: Recorded_Value
// Triggers: Insert
// If you add a new record to Table 2, the Recorded_Value will be written into Table1.CurrentValue_m
// This assumes that the record you add is actually the latest one!
// load table 1 and get the globalID of the location
var loc_id = $feature.Location_ID
var g_id = First(Filter(FeatureSetByName($datastore, "Table1", ["Location_ID", "GlobalID"], false), "Location_ID = @loc_id")).GlobalID
// write the Recorded_Value into Table1.Current_Value_m
return {
"result": $feature.Recorded_Value,
"edit": [
{
"className": "Table1",
"updates": [
{"globalID": g_id, "attributes": {"Current_Value_m": $feature.Recorded_Value}}
]
}
]
}
If none of that works for you, you can use a Python script, but you'd have to run it manually every time you insert new records.
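The core logic of such a script could look like the sketch below. It only demonstrates the calculation (latest in-range value per location) on plain Python data; in a real script you would read Table 2 with an `arcpy.da.SearchCursor` and write Table 1 with an `arcpy.da.UpdateCursor`, and the function name `current_values` is my own invention.

```python
# Sketch of the update logic only; plain dicts/lists stand in for
# arcpy cursors on the two tables.
from datetime import date

# (Location_ID, Date, Recorded_Value) -- rows of Table 2
records = [
    ("GW 1", date(2021, 6, 1), 867.3),
    ("GW 2", date(2021, 6, 1), 894.1),
    ("GW 3", date(2021, 6, 1), 843.9),
    ("GW 1", date(2021, 6, 5), 862.4),
]

# Location_ID -> (Lower_Limit_m, Upper_Limit_m) -- from Table 1
limits = {"GW 1": (850, 880), "GW 2": (856, 886), "GW 3": (854, 884)}

def current_values(records, limits):
    """Return the latest in-range Recorded_Value for each location."""
    latest = {}
    for loc, d, val in sorted(records, key=lambda r: r[1]):
        lower, upper = limits[loc]
        if lower <= val <= upper:
            latest[loc] = val  # later dates overwrite earlier ones
    return latest

print(current_values(records, limits))
```

Unlike the SQL view above, this variant falls back to the most recent *in-range* reading and silently skips out-of-range ones; pick whichever behaviour matches your monitoring requirements.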