
Query Regarding FeatureLayerManager in ArcGIS API for Python: Need Assistance with Event Listener Registration

05-23-2024 02:03 PM
AbiDhakal
Frequent Contributor

Hello GIS friends,

I'm currently working on a project where I need to register an event listener using the FeatureLayerManager in the ArcGIS API for Python. However, I'm running into an issue: the FeatureLayerManager object does not appear to have an attribute called register_listener. How can I get this to work?

Here's a snippet of the code I'm using:

from arcgis.features import FeatureLayer

# Get feature layer
feature_layer_url = "…………………………………………….."

try:
    feature_layer = FeatureLayer(feature_layer_url)
    print("Successfully fetched the feature layer")
except Exception as e:
    print(f"Failed to get feature layer: {e}")
    exit(1)

# Callback function to handle feature collection events
def handle_feature_collection_event(event):
    try:
        for feature in event['features']:
            created_user = feature['attributes']['created_user']
            subject = "New feature collected"
            body = f"A new feature was collected by {created_user}."
            send_email(subject, body)
    except Exception as e:
        print(f"Error handling feature collection event: {e}")

# Set up feature collection event listener
try:
    feature_layer.manager.register_listener('onEdits', handle_feature_collection_event)
    print("Successfully registered event listener")
except Exception as e:
    print(f"Failed to register event listener: {e}")
    exit(1)

4 Replies
JohnEvans6
Regular Contributor

In poking around for a few minutes I don't see a register_listener operation in the docs. Do you have a link to it anywhere?

https://developers.arcgis.com/python/api-reference/arcgis.features.managers.html#featurelayermanager

If I do a quick dir() check, the operation doesn't exist:

dir(managers.FeatureLayerManager)

['__class__',
 '__delattr__',
 '__dict__',
 '__dir__',
 '__doc__',
 '__eq__',
 '__format__',
 '__ge__',
 '__getattribute__',
 '__gt__',
 '__hash__',
 '__init__',
 '__init_subclass__',
 '__le__',
 '__lt__',
 '__module__',
 '__ne__',
 '__new__',
 '__reduce__',
 '__reduce_ex__',
 '__repr__',
 '__setattr__',
 '__sizeof__',
 '__str__',
 '__subclasshook__',
 '__weakref__',
 '_check_status',
 '_get_status',
 '_hydrate',
 '_invoke',
 '_refresh',
 '_refresh_callback',
 '_token',
 'add_to_definition',
 'contingent_values',
 'delete_from_definition',
 'field_groups',
 'fromitem',
 'properties',
 'refresh',
 'truncate',
 'update_definition']

 

AbiDhakal
Frequent Contributor

You are correct; after further research, register_listener is not an available method. Thank you for the list.

What I'm trying to do is use Python to automate an email when a new feature is collected, and I'm hitting a dead end. Do you have any suggestions?

Thank you,
Abi

JohnEvans6
Regular Contributor

I actually might? No idea if this will be of any use; I haven't looked at it since 2019, but I'll dump it in here and you can pick through it and see if anything jumps out at you as helpful. I'll try to pull out only the relevant bits.

The notebook below was, at one point, more or less an email alert system that used Python pickling to store data based on a geospatial evaluation. The notebook ran on a schedule, and any changes detected between the stored/pickled dataframe and the new dataframe were emailed. I eventually abandoned the whole thing because it wasn't "great" and jumped over to GeoEvent Server instead.

I imagine your workflow could be something like your notebook (if you're using a notebook) is set to run every 15 minutes. The notebook "pickles" your FeatureLayer on first run. Every subsequent run the notebook compares the latest FeatureLayer pull with the pickled one. Changes are emailed and the pickle is overwritten with the latest. Your use case is probably a lot simpler than the one I have below.
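That pickle-and-diff pattern is simple to sketch. The snippet below is a minimal illustration using a plain pandas DataFrame in place of an actual FeatureLayer pull; the snapshot filename, column names, and `notify()` stub are assumptions for the sketch, not real notebook code. On first run it pickles the current snapshot; on later runs it loads the pickle, flags rows whose IDs weren't in the previous snapshot, and would email just those.

```python
import pandas as pd
from pathlib import Path

SNAPSHOT = Path("layer_snapshot.pkl")  # hypothetical pickle path

def notify(row):
    # Stand-in for an email call such as send_email_smtp()
    print(f"New feature {row['OBJECTID']} by {row['created_user']}")

def check_for_new_features(current_df: pd.DataFrame) -> pd.DataFrame:
    """Return rows of current_df not present in the last snapshot,
    then overwrite the snapshot with current_df."""
    if SNAPSHOT.exists():
        previous_df = pd.read_pickle(SNAPSHOT)
        new_rows = current_df[~current_df["OBJECTID"].isin(previous_df["OBJECTID"])]
    else:
        # First run: seed the snapshot, report nothing
        new_rows = current_df.iloc[0:0]
    current_df.to_pickle(SNAPSHOT)
    return new_rows

# Simulated first and second scheduled runs
run1 = pd.DataFrame({"OBJECTID": [1, 2], "created_user": ["amy", "bob"]})
run2 = pd.DataFrame({"OBJECTID": [1, 2, 3], "created_user": ["amy", "bob", "cal"]})

check_for_new_features(run1)             # seeds the snapshot
for _, row in check_for_new_features(run2).iterrows():
    notify(row)                          # only OBJECTID 3 is reported
```

In a scheduled notebook, `run1`/`run2` would instead come from something like `FeatureLayer(url).query(as_df=True)` on each run.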

 

from arcgis.gis import GIS
import os
import pickle
import logging
import pandas as pd
import smtplib
import csv
from arcgis.features import GeoAccessor, GeoSeriesAccessor
from IPython.core.display import display, HTML

# Logger used by the email helper below
log = logging.getLogger(__name__)

# Pandas Columns to Drop Post Join
droppable_columns = ['index_right', 'objectid']

# Historic File Checker
h_pickle = False

# Global Run Flag. Used in each block.
global_run_flag = True

# Who is getting these emails (one address per list element)
to_email_list = ['test1@test.com', 'test2@test.com']

def send_email_smtp(recipients, message, subject="Test Notification from AGOL Notebook - Tsunami"):
    # Sends the `message` string to all of the emails in the `recipients` list using the configured SMTP email server.
    try:
        # Set up server and credential variables
        smtp_server_url = "relay_server"
        smtp_server_port = 25
        sender = "alert_sample@test.com"
        # username = "your_username"
        # password = secrets["smtp_email_password"]

        # Instantiate our server, configure the necessary security
        server = smtplib.SMTP(smtp_server_url, smtp_server_port)
        server.ehlo()
        # server.starttls() # Needed if TLS is required w/ SMTP server
        # server.login(username, password)
    except Exception as e:
        log.warning("Error setting up SMTP server, couldn't send " +
                    f"message to {recipients}")
        raise e
        
    # For each recipient, construct the message and attempt to send
    did_succeed = True
    for recipient in recipients:
        try:
            message_body = '\r\n'.join(['To: {}'.format(recipient),
                                        'From: {}'.format(sender),
                                        'MIME-Version: 1.0',
                                        'Content-type: text/html',
                                        'Subject: {}'.format(subject),
                                        '',
                                        '{}'.format(message)])
            message_body = message_body.encode("utf-8")
            server.sendmail(sender, [recipient], message_body)
            print(f"SMTP server returned success for sending email "\
                  f"to {recipient}")
        except Exception as e:
            log.warning(f"Failed sending message to {recipient}")
            log.warning(e)
            did_succeed = False
    
    # Cleanup and return
    server.quit()
    return did_succeed

def create_empty_spatial() :
    default_df = pd.DataFrame(columns=['ASSOC_CITY','ASSOC_ST','ActivePerimeter','CENTER','FACID','FACTYPE','GEO_CAT','GeoTime','LATITUDE','LG_NAME','LONGITUDE','OBJECTID','SHAPE','alertcode'])
    return default_df

def create_spatial_evalulation_df(geospatialspatial_ww_area_frame) :
    evaluation_df = tsunami_point_sdf.spatial.join(geospatialspatial_ww_area_frame, how='left')
    evaluation_df = evaluation_df.drop(droppable_columns, axis = 1)
    evaluation_df = evaluation_df.dropna(subset=['alertcode'])
    evaluation_df['ActivePerimeter'] = 'Y'
    
    return evaluation_df

def pickle_eval() :
    
    default_df = create_empty_spatial()
    pickle_check = os.listdir('.')
    
    warning_pickle_file = 'tsunami_intersection_warning_previous.pkl' not in pickle_check
    watch_pickle_file = 'tsunami_intersection_watch_previous.pkl' not in pickle_check
    advisory_pickle_file = 'tsunami_intersection_advisory_previous.pkl' not in pickle_check

    if warning_pickle_file:
        default_df.to_pickle("tsunami_intersection_warning_previous.pkl")
    if watch_pickle_file:
        default_df.to_pickle("tsunami_intersection_watch_previous.pkl")
    if advisory_pickle_file:
        default_df.to_pickle("tsunami_intersection_advisory_previous.pkl")
        
    return True

def df_size_eval(frame):
    s = frame.size
    return s
    
def display_frames(framelist) :
    for f in framelist :
        display (f)    

# Misc Notes
        
def Cleanup(*args) :
    for filename in args :
        os.remove(filename)

 

The evaluation logic, if I've pulled it out correctly, is below.

 

# Get our information layers and create dataframes. Split by warning type
# (gis, Facility_Equipment_Points, and select_features come from earlier notebook cells not shown here)
tsunami_point_sdf = pd.DataFrame.spatial.from_layer(gis.content.get(Facility_Equipment_Points).layers[0])
tsunami_layer_sdf = select_features.sdf

# Query Watch/Warning/Advisory Layers and pull polygons based on alertCode
tsunami_layer_advisory_df = tsunami_layer_sdf.query('alertcode == "TSY"')
tsunami_layer_watch_df = tsunami_layer_sdf.query('alertcode == "TSA"')
tsunami_layer_warning_df = tsunami_layer_sdf.query('alertcode == "TSW"')

# Display Data
display_frames([tsunami_layer_advisory_df, tsunami_layer_watch_df, tsunami_layer_warning_df])

# If we have Data, create a df for evaluation
if df_size_eval(tsunami_layer_advisory_df) == 0 :
    tsunami_intersection_advisory_df = create_empty_spatial()
else :
    tsunami_intersection_advisory_df = create_spatial_evalulation_df(tsunami_layer_advisory_df)

if df_size_eval(tsunami_layer_watch_df) == 0 :
    tsunami_intersection_watch_df = create_empty_spatial()
else :
    tsunami_intersection_watch_df = create_spatial_evalulation_df(tsunami_layer_watch_df)

if df_size_eval(tsunami_layer_warning_df) == 0 :
    tsunami_intersection_warning_df = create_empty_spatial()
else :
    tsunami_intersection_warning_df = create_spatial_evalulation_df(tsunami_layer_warning_df)

### This is the pickle evaluation part ###

# Load our old data. We need to check to see if the files exist. If they don't, we create empty files based on the current advisories format. Basically, push an empty DF with the right columns to the directory.
if global_run_flag is True :
    if h_pickle is False:
        h_pickle = pickle_eval()

    # Read our previous (or new)
    previous_tsunami_intersection_advisory_df = pd.read_pickle("tsunami_intersection_advisory_previous.pkl")
    previous_tsunami_intersection_watch_df = pd.read_pickle("tsunami_intersection_watch_previous.pkl")
    previous_tsunami_intersection_warning_df = pd.read_pickle("tsunami_intersection_warning_previous.pkl")

    display_frames([previous_tsunami_intersection_advisory_df, previous_tsunami_intersection_watch_df, previous_tsunami_intersection_warning_df])

# Determine if a facility is no longer under a watch/warning. Only retain previous records that are still present in the new pull
if global_run_flag is True :
    
    display (previous_tsunami_intersection_advisory_df['FACID'].isin(tsunami_intersection_advisory_df['FACID']))
    display (previous_tsunami_intersection_watch_df['FACID'].isin(tsunami_intersection_watch_df['FACID']))
    display (previous_tsunami_intersection_warning_df.FACID.isin(tsunami_intersection_warning_df.FACID))

    previous_tsunami_intersection_advisory_df = previous_tsunami_intersection_advisory_df[previous_tsunami_intersection_advisory_df.FACID.isin(tsunami_intersection_advisory_df.FACID)]
    previous_tsunami_intersection_watch_df = previous_tsunami_intersection_watch_df[previous_tsunami_intersection_watch_df.FACID.isin(tsunami_intersection_watch_df.FACID)]
    previous_tsunami_intersection_warning_df = previous_tsunami_intersection_warning_df[previous_tsunami_intersection_warning_df.FACID.isin(tsunami_intersection_warning_df.FACID)]

# Now we do the opposite: which facilities are newly detected as under an advisory/watch/warning.
if global_run_flag is True :
    
    display (tsunami_intersection_advisory_df['FACID'].isin(previous_tsunami_intersection_advisory_df['FACID']))
    display (tsunami_intersection_watch_df['FACID'].isin(previous_tsunami_intersection_watch_df['FACID']))
    display (tsunami_intersection_warning_df['FACID'].isin(previous_tsunami_intersection_warning_df.FACID))

    email_tsunami_intersection_advisory_df = tsunami_intersection_advisory_df[~tsunami_intersection_advisory_df.FACID.isin(previous_tsunami_intersection_advisory_df.FACID)]
    email_tsunami_intersection_watch_df = tsunami_intersection_watch_df[~tsunami_intersection_watch_df.FACID.isin(previous_tsunami_intersection_watch_df.FACID)]
    email_tsunami_intersection_warning_df = tsunami_intersection_warning_df[~tsunami_intersection_warning_df.FACID.isin(previous_tsunami_intersection_warning_df.FACID)]

    display_frames([email_tsunami_intersection_advisory_df, email_tsunami_intersection_watch_df, email_tsunami_intersection_warning_df])

# We now want to concatenate our updated previous dataframe and our newest dataframe. Our previous dataframe only has facilities still under an advisory, while our email_tsunami_intersection_advisory_df has only new ones.
if global_run_flag is True :

    tsunami_intersection_advisory_df = pd.concat([previous_tsunami_intersection_advisory_df, email_tsunami_intersection_advisory_df])
    tsunami_intersection_watch_df = pd.concat([previous_tsunami_intersection_watch_df, email_tsunami_intersection_watch_df])
    tsunami_intersection_warning_df = pd.concat([previous_tsunami_intersection_warning_df, email_tsunami_intersection_warning_df])

# Now we just need to pickle the current dataframes we've generated for each alert code and set them to our previous.
if global_run_flag is True :

    tsunami_intersection_advisory_df.to_pickle("tsunami_intersection_advisory_previous.pkl")
    tsunami_intersection_watch_df.to_pickle("tsunami_intersection_watch_previous.pkl")
    tsunami_intersection_warning_df.to_pickle("tsunami_intersection_warning_previous.pkl")

if global_run_flag is True :

    send_email = (email_tsunami_intersection_advisory_df.size > 0
                  or email_tsunami_intersection_watch_df.size > 0
                  or email_tsunami_intersection_warning_df.size > 0)
    print (send_email)

# Now we have a list of facilities already split out by warning type. We want to build this into one html email.
if global_run_flag is True :

    # This sends only the additions/newest
    tsy_notification = build_notification_table("Tsunami Advisory (TSY)", email_tsunami_intersection_advisory_df)
    tsa_notification = build_notification_table("Tsunami Watch (TSA)", email_tsunami_intersection_watch_df)
    tsw_notification = build_notification_table("Tsunami Warning (TSW)", email_tsunami_intersection_warning_df)

    # This sends all data including new ones
    # tsw_notification = build_notification_table("Tsunami Warning (TSW)", tsunami_intersection_warning_df)
    # tsa_notification = build_notification_table("Tsunami Watch (TSA)", tsunami_intersection_watch_df)
    # tsy_notification = build_notification_table("Tsunami Advisory (TSY) TESTING USES FWW", tsunami_intersection_advisory_df)


    my_message = f"""

        <div style="width: 100%; font-family:Calibri, Arial, sans-serif">
            <p style="font-size: 16px">A Tsunami Advisory, Watch, or Warning has been issued with potential impact to the following facilities & equipment:<br></p>
        </div>
        <div style="font-family:Calibri, Arial, sans-serif">
            <p style="font-size: 12px; font-weight: bold">
            {tsw_notification}
            </p>
        </div>
        <br>
        <div style="font-family:Calibri, Arial, sans-serif">
            <p style="font-size: 12px; font-weight: bold">
            {tsa_notification}
            </p>
        </div>
        <div style="font-family:Calibri, Arial, sans-serif">
            <p style="font-size: 12px; font-weight: bold">
            {tsy_notification}
            </p>
        </div>
        <p style="font-family:Calibri, Arial, sans-serif;font-size: 10.5px">Questions? Need to be removed? Email john.r.evans@faa.gov</p>

        """

if global_run_flag is True :
    if send_email is True :
        send_email_smtp(recipients = to_email_list, message = my_message)
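One helper the excerpt relies on, build_notification_table, didn't make it into the paste. The original implementation is unknown; a minimal hypothetical stand-in that turns a DataFrame into an HTML section (column names guessed from create_empty_spatial above) might look like:

```python
import pandas as pd

def build_notification_table(title: str, df: pd.DataFrame) -> str:
    """Render a section title plus an HTML table of the affected
    facilities; returns an empty string when there is nothing to report."""
    if df.empty:
        return ""
    # Show only the columns that exist in this frame, in a fixed order
    cols = [c for c in ["FACID", "FACTYPE", "ASSOC_CITY", "ASSOC_ST"] if c in df.columns]
    table_html = df[cols].to_html(index=False, border=0)
    return f"<h3>{title}</h3>{table_html}"

demo = pd.DataFrame({"FACID": ["A1"], "FACTYPE": ["Radar"],
                     "ASSOC_CITY": ["Anchorage"], "ASSOC_ST": ["AK"]})
html = build_notification_table("Tsunami Warning (TSW)", demo)
```

The empty-string return keeps empty alert categories from adding blank sections to the assembled email body.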

 

 

AbiDhakal
Frequent Contributor

@JohnEvans6  - My sincere apologies for not responding to you sooner.

I have not had the chance to work on what you sent me yet, but I will give it a try once I get some time and will most definitely let you know. Thank you for understanding.
Abi
