I was able to figure out the error from earlier, but I have one more request.
If I have other fields in the table, I want them to also show in the output table; for example, see the tables below:
Input Table:

Data | UniqueID | Item ID | Begin | End
Observation1 | 555 | 1000 | 0 | 1
| 555 | 1001 | 0 | 5
| 555 | 1002 | 1 | 7
| 555 | 1003 | 4 | 5
| 555 | 1004 | 3 | 5
Observation2 | 600 | 2001 | 0 | 3
| 600 | 2002 | 2 | 6
| 600 | 2003 | 3 | 5
Output Table:

Data | UniqueID | Begin | End | Item ID
Observation1 | 555 | 0 | 1 | 1001
| 555 | 1 | 3 | 1001; 1002
| 555 | 3 | 4 | 1001; 1002; 1004
| 555 | 4 | 5 | 1001; 1002; 1003; 1004
| 555 | 5 | 7 | 1002
Observation2 | 600 | 0 | 2 | 2001; 2002
| 600 | 2 | 3 | 2001; 2002
| 600 | 3 | 5 | 2002; 2003
| 600 | 5 | 6 | 2003
Is there a way to differentiate between records that fall within a certain range (per UniqueID), as shown in the tables above, and also carry the other fields for that range (same UniqueID) into the output table?
I am really grateful for your help thus far, guys. Thanks.
Can you upload a copy of the actual input data table?
Clarification on the tables above:
For both the Input and Output tables, the Data field should not take part in the segmentation; it was only put there to help visually differentiate between the UniqueIDs.
I think you have some errors in the output table you put together here, but I think this modified code should accomplish what you're after. I also rearranged some things so it only opens the insert cursor once.
def main():
    import arcpy
    import os

    # Local variables
    sourcegdb = r"N:\TechTemp\BlakeT\Work\TEMP.gdb"
    table_in = os.path.join(sourcegdb, "SegmentSequence")

    # Create output table
    arcpy.CreateTable_management(sourcegdb, "SegmentSequence_output")
    table_out = os.path.join(sourcegdb, "SegmentSequence_output")
    arcpy.AddField_management(table_out, "UniqueID", "SHORT")
    arcpy.AddField_management(table_out, "Begin", "SHORT")
    arcpy.AddField_management(table_out, "End", "SHORT")
    arcpy.AddField_management(table_out, "SegmentItems", "TEXT", field_length=255)

    # Identify observation groups
    sqlprefix = "DISTINCT UniqueID"
    sqlpostfix = "ORDER BY UniqueID"
    observations = tuple(uid[0] for uid in arcpy.da.SearchCursor(
        table_in, ["UniqueID"], sql_clause=(sqlprefix, sqlpostfix)))

    fields_out = ["UniqueID", "Begin", "End", "SegmentItems"]
    with arcpy.da.InsertCursor(table_out, fields_out) as i_cursor:
        for obsv in observations:
            # Read each item and its (begin, end) into a dictionary
            fields_in = ["ItemID", "Begin", "End"]
            where_clause = "UniqueID = {}".format(obsv)
            itemsdict = {r[0]: (r[1], r[2]) for r in arcpy.da.SearchCursor(
                table_in, fields_in, where_clause)}

            # Identify segment breakpoints
            allsegments = [i[0] for i in itemsdict.values()] + [i[1] for i in itemsdict.values()]
            segments = tuple(sorted(set(allsegments)))
            del allsegments

            # Identify the items in each segment
            for enum, seg in enumerate(segments):
                try:
                    begin = seg
                    end = segments[enum + 1]
                except IndexError:
                    break
                seg_items = [k for k, v in itemsdict.items() if v[0] <= begin and v[1] >= end]

                # Write the segment's items to the output table
                row = (obsv, begin, end, str(seg_items).strip('[]'))
                i_cursor.insertRow(row)

if __name__ == '__main__':
    main()
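The interval-splitting idea in the script can also be exercised outside of arcpy. This is a minimal sketch over plain tuples (the function and variable names here are mine, not part of the script above): every Begin and End value in a UniqueID group becomes a breakpoint, and each resulting segment lists the ItemIDs whose range covers it.

```python
from collections import defaultdict

def segment(rows):
    """Split each UniqueID's intervals at every Begin/End breakpoint
    and list the ItemIDs that cover each resulting segment.

    rows: iterable of (unique_id, item_id, begin, end) tuples.
    Returns a list of (unique_id, begin, end, [item_ids]) tuples.
    """
    by_uid = defaultdict(dict)
    for uid, item, begin, end in rows:
        by_uid[uid][item] = (begin, end)

    out = []
    for uid in sorted(by_uid):
        items = by_uid[uid]
        # Every Begin and End value is a potential segment boundary
        breaks = sorted({b for b, e in items.values()} |
                        {e for b, e in items.values()})
        for begin, end in zip(breaks, breaks[1:]):
            covering = sorted(k for k, (b, e) in items.items()
                              if b <= begin and e >= end)
            if covering:  # skip gaps that no item spans
                out.append((uid, begin, end, covering))
    return out
```

Running this on the small example table gives one row per breakpoint pair, with the covering ItemIDs collected for each segment.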
Thanks Blake...
I just tested the new script. It picks up the UniqueIDs, but it returns many instances of the same segment (in other words, redundant rows/records). I have attached a table of what the actual data looks like.
UniqueID | ItemID | Begin_Station_Num_m | End_Station_Num_m | TestScore | TestDate |
1104401 | 55653 | 0 | 32.918 | 1440 | 11/17/1998 |
1104401 | 55650 | 32.918 | 1303.02 | 1440 | 11/12/1998 |
1104401 | 55649 | 1303.02 | 1310.923 | 1440 | 10/22/1998 |
1107700 | 55585 | -9.193 | 4526.89 | 1440 | 2/2/1999 |
1107700 | 55584 | 4526.89 | 5358.384 | 1440 | 2/3/1999 |
1107700 | 68200 | 5202.631 | 5373.014 | 1440 | 11/2/2011 |
1107700 | 55583 | 5358.384 | 6493.764 | 1440 | 2/3/1999 |
1107700 | 55582 | 6493.764 | 8760.562 | 1440 | 2/4/1999 |
1107700 | 55581 | 8760.562 | 13235.635 | 1440 | 2/4/1999 |
1107700 | 55578 | 13235.635 | 15883.738 | 1440 | 2/5/1999 |
1107700 | 55576 | 15883.738 | 23348.899 | 1440 | 1/25/1999 |
1107700 | 55577 | 23348.899 | 25256.033 | 1440 | 1/25/1999 |
1107700 | 55573 | 25256.033 | 27711.806 | 1440 | 1/24/1999 |
1107700 | 55572 | 27711.806 | 31326.43 | 1440 | 1/23/1999 |
1107700 | 55575 | 31326.43 | 33917.534 | 1440 | 1/23/1999 |
1107700 | 55574 | 33917.534 | 39080.237 | 1440 | 1/23/1999 |
1107700 | 55586 | 39080.237 | 40664.892 | 1440 | 2/2/1999 |
1107700 | 55587 | 40664.892 | 44023.788 | 1440 | 2/3/1999 |
1107700 | 55588 | 44023.788 | 51158.242 | 1440 | 2/3/1999 |
1107700 | 55579 | 51158.242 | 51166.776 | 1440 | 1/28/1999 |
1107700 | 55589 | 51166.776 | 55466.285 | 1440 | 2/4/1999 |
1107700 | 55590 | 55466.285 | 59929.166 | 1440 | 2/2/1999 |
1107700 | 55556 | 59929.166 | 70217.69 | 1440 | 11/21/1998 |
1107700 | 55554 | 70217.69 | 71075.397 | 1440 | 11/14/1998 |
1107700 | 55555 | 71075.397 | 97066.608 | 1440 | 11/16/1998 |
1107700 | 55550 | 97066.608 | 99547.985 | 1440 | 11/7/1998 |
1107700 | 55551 | 99547.985 | 102256.133 | 1440 | 11/7/1998 |
1107700 | 55552 | 102256.133 | 105059.683 | 1440 | 11/8/1998 |
1107700 | 55553 | 105059.683 | 112981.13 | 1440 | 11/10/1998 |
1107700 | 68201 | 112937.544 | 112981.13 | 1440 | 10/17/2011 |
1107700 | 55569 | 112981.13 | 113000.637 | 1440 | 1/6/1999 |
1107700 | 55609 | 113000.637 | 116894.153 | 1440 | 10/17/1998 |
1107700 | 55611 | 116894.153 | 118538.244 | 1440 | 10/23/1998 |
1107700 | 55612 | 118538.244 | 121654.824 | 1440 | 10/23/1998 |
1107700 | 55613 | 121654.824 | 123223.629 | 1440 | 10/26/1998 |
1107700 | 55614 | 123223.629 | 126232.31 | 1440 | 10/26/1998 |
1107700 | 55605 | 126232.31 | 139808.712 | 1440 | 10/8/1998 |
1107700 | 55616 | 139808.712 | 157859.577 | 1440 | 11/10/1998 |
1107700 | 55617 | 157859.577 | 177810.871 | 1440 | 11/17/1998 |
1107700 | 55618 | 177810.871 | 182154.881 | 1440 | 11/16/1998 |
1107700 | 55619 | 182154.881 | 195965.673 | 1440 | 11/18/1998 |
1107700 | 55620 | 195965.673 | 198532.699 | 1440 | 11/24/1998 |
1107700 | 55621 | 198532.699 | 203721.614 | 1440 | 11/28/1998 |
1107700 | 55622 | 203721.614 | 209226.302 | 1440 | 12/3/1998 |
1107700 | 55623 | 209226.302 | 213641.33 | 1440 | 12/4/1998 |
1107700 | 55624 | 213641.33 | 220382.287 | 1440 | 12/15/1998 |
1107700 | 55625 | 220382.287 | 224331.885 | 1440 | 12/31/1998 |
1107700 | 55626 | 224331.885 | 225924.77 | 1440 | 1/3/1999 |
1107700 | 55627 | 225924.77 | 227517.96 | 1440 | 1/3/1999 |
1107700 | 55628 | 227517.96 | 231409.341 | 1440 | 1/5/1999 |
1107700 | 55629 | 231409.341 | 231485.517 | 1440 | 1/6/1999 |
If it's also possible, could we have a date field that contains only the most recent date per segment of an ItemID?
Just to follow up on the date request:
Example: Input Table.
UniqueID | Item ID | Begin | End | Date |
555 | 1000 | 0 | 1 | 9/5/1999 |
555 | 1001 | 0 | 5 | 9/15/1999 |
555 | 1002 | 1 | 7 | 9/19/1999 |
555 | 1003 | 4 | 5 | 9/20/1999 |
555 | 1004 | 3 | 5 | 9/21/1999 |
600 | 2001 | 0 | 3 | 9/22/1999 |
600 | 2002 | 2 | 6 | 9/23/1999 |
600 | 2003 | 3 | 5 | 9/24/1999 |
Output Table:
UniqueID | Begin | End | Item ID | Date |
555 | 0 | 1 | 1001 | 9/5/1999 |
555 | 1 | 3 | 1001; 1002 | 9/19/1999 |
555 | 3 | 4 | 1001; 1002; 1004 | 9/21/1999 |
555 | 4 | 5 | 1001; 1002; 1003; 1004 | 9/21/1999 |
555 | 5 | 7 | 1002 | 9/19/1999 |
600 | 0 | 2 | 2001; 2002 | 9/23/1999 |
600 | 2 | 3 | 2001; 2002 | 9/23/1999 |
600 | 3 | 5 | 2002; 2003 | 9/24/1999 |
600 | 5 | 6 | 2003 | 9/24/1999 |
The most recent date among the items covering a segment is entered in that segment's date field.
If it is too much to ask, I understand.
Thanks again.
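The "most recent date per segment" rule above can be sketched in plain Python (the helper names are illustrative, not part of the script posted earlier): for each segment, take the maximum date over the items that cover it.

```python
from datetime import date

def latest_date_per_segment(items, segments):
    """For each segment, pick the most recent date among its items.

    items: {item_id: (begin, end, test_date)} with test_date a datetime.date.
    segments: list of (begin, end, [item_ids]) already computed by the
              segmentation step.
    Returns a list of (begin, end, [item_ids], most_recent_date).
    """
    out = []
    for begin, end, ids in segments:
        # max() over the covering items' dates gives the newest one
        newest = max(items[i][2] for i in ids)
        out.append((begin, end, ids, newest))
    return out
```

With the example input, segment 4-5 (covered by 1001, 1002, 1003, 1004) gets 9/21/1999, the newest of those four items' dates.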
Have you ever heard of the term, "Scope Creep"?
Seriously though, are there any other requirements we need to know about before spending time to solve your problem?
There are no further requirements other than the ones I've mentioned thus far...
I'm very grateful Blake and Xander for your solutions
I just noticed you have a negative number in there. Is that correct? Is that supposed to come before 0? Like this:
-9.193 to 0
0 to 32.918
32.918 to 1303.02
...
It is correct, but the -9.193 to 0 range belongs to a different UniqueID (1107700), and
32.918 to 1303.02 is a section of the 1104401 UniqueID...
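On the redundant rows reported above: one possible cause is that every Begin/End value in a group creates a breakpoint, so with many items per UniqueID, consecutive segments can end up with exactly the same item list and look like duplicates. If those should collapse into one row, a merge pass like this sketch (operating on already-computed (uid, begin, end, items) rows, names mine) could run before the rows are inserted:

```python
def merge_identical(segments):
    """Collapse consecutive segments that share a UniqueID and item list.

    segments: list of (uid, begin, end, items) tuples,
              ordered by uid then begin.
    """
    merged = []
    for uid, begin, end, items in segments:
        if merged:
            p_uid, p_begin, p_end, p_items = merged[-1]
            # Extend the previous row when it abuts this one
            # with the same UniqueID and the same item list
            if p_uid == uid and p_end == begin and p_items == items:
                merged[-1] = (uid, p_begin, end, items)
                continue
        merged.append((uid, begin, end, items))
    return merged
```

For example, two adjacent rows 0-2 and 2-3 that both carry "2001; 2002" would merge into a single 0-3 row, while a following row with a different item list stays separate.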