
Convert Pandas Series to Dataframe or run query on series data?

03-30-2020 12:42 PM
RexRobichaux
Frequent Contributor

Hello Python Gurus- 

I've been doing some experimental design work for a recurring script that pulls in hourly JSON values for air quality monitoring stations from PurpleAir.com, parses the values in a field (PM2_5Value), and then, if values exceed a certain threshold (let's say 40 for testing), sends SMTP email notifications to staff for awareness.

I've been using Jupyter Notebooks and have been able to pull in the JSON values for all stations and parse the JSON into a Series showing the fields of interest, but I've hit a mental block. It seems that while the data is in Series format, I can't sort or query the records. I'm unsure whether this is by design, a bug, or an error on my part, but DataFrame.sort_values fails, as does just df['PM2_5Value'] > 40.

I'm guessing there is something obvious here that I'm missing. Is there an easier way to approach this, or a way to turn this Series output into a proper DataFrame so that I can work with the data more easily? If I attempt to search for values in the current Series (which is probably my problem), I receive the following KeyError:

KeyError                                  Traceback (most recent call last)
C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\site-packages\pandas\core\indexes\base.py in get_loc(self, key, method, tolerance)
   2656             try:
-> 2657                 return self._engine.get_loc(key)
   2658             except KeyError:
pandas/_libs/index.pyx in pandas._libs.index.IndexEngine.get_loc()
pandas/_libs/index.pyx in pandas._libs.index.IndexEngine.get_loc()
pandas/_libs/hashtable_class_helper.pxi in pandas._libs.hashtable.PyObjectHashTable.get_item()
pandas/_libs/hashtable_class_helper.pxi in pandas._libs.hashtable.PyObjectHashTable.get_item()
KeyError: 'PM2_5Value'
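
For reference, the failing attempt looks roughly like this (a sketch of what I'm running, where df is the frame defined in the code below):

# Sketch of the failing attempt; df comes from the pd.read_json(url) call below
df.results.apply(pd.Series)      # renders the 41-column table shown further down
df['PM2_5Value'] > 40            # KeyError: 'PM2_5Value' (presumably because the expanded
                                 # frame above was never assigned back to a variable)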

Here's the code so far and the series it produces in Jupyter:
import numpy as np
import pandas as pd
from requests import request
import json
from pandas.io.json import json_normalize 


# Read the station list straight into a DataFrame; the 'results' field comes back as a column of dicts
df = pd.read_json('https://www.purpleair.com/json?show=39183|31221|31179|31185|2827|12785|35779|15947|46911|3095|35885|10286|33089|38627|34099|27405|21109|27825|34579|34109|27401|21111|27873|34747|26961|38253|34153|27807|27833|27863|22355|9930|2221|8244|8248|4427|2944|2514|15019|28651|47173|25361|36607|37575|4591|6612|36139|30739|5512|3157|34933|38835|48625|34951|2239|12016|34803|14687|26249|35693|36965|34847|34797|44553|39593|46293|48047|30169')
# Expand each result dict into its own columns; this renders the table below but isn't assigned to anything
df.results.apply(pd.Series)

[Output: 136 rows × 41 columns, one row per station. Columns include ID, Label, DEVICE_LOCATIONTYPE, THINGSPEAK_PRIMARY_ID, THINGSPEAK_PRIMARY_ID_READ_KEY, THINGSPEAK_SECONDARY_ID, THINGSPEAK_SECONDARY_ID_READ_KEY, Lat, Lon, PM2_5Value, ..., pm2_5_atm, pm10_0_atm, isOwner, humidity, temp_f, pressure, AGE, Stats, ParentID, Flag.]


5 Replies
GeoJosh
Esri Regular Contributor

Hey Rex,

I'd first use the requests module to get the JSON response, then load it as a dictionary. From there, you can use pandas' DataFrame.from_dict to create the DataFrame from the parsed results. Something like this:

import numpy as np
import pandas as pd
import requests
import json

r = requests.get('https://www.purpleair.com/json?show=39183|31221|31179|31185|2827|12785|35779|15947|46911|3095|35885|10286|33089|38627|34099|27405|21109|27825|34579|34109|27401|21111|27873|34747|26961|38253|34153|27807|27833|27863|22355|9930|2221|8244|8248|4427|2944|2514|15019|28651|47173|25361|36607|37575|4591|6612|36139|30739|5512|3157|34933|38835|48625|34951|2239|12016|34803|14687|26249|35693|36965|34847|34797|44553|39593|46293|48047|30169')

dictionary = json.loads(r.text)
results = dictionary["results"]

# Build the DataFrame straight from the list of station records
# (effectively equivalent to the read_json/to_json round trip Joshua Bixby suggests below)
df = pd.DataFrame.from_dict(results)

Now the data is a DataFrame, as opposed to a Series. Hope this helps!
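
From there, the sorting and filtering you were after should behave as expected. An untested sketch, assuming PM2_5Value comes back as a string (as it appears to in your output):

# Untested sketch: convert the reading to numeric, then filter and sort
df['PM2_5Value'] = pd.to_numeric(df['PM2_5Value'], errors='coerce')
exceed = df[df['PM2_5Value'] > 40].sort_values('PM2_5Value', ascending=False)
print(exceed[['ID', 'Label', 'PM2_5Value']])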

Best,

Josh

Edit: Modified the last line of code to reflect @Joshua Bixby's suggestion. Thanks a lot!

JoshuaBixby
MVP Esteemed Contributor

You can replace the last line of your original code with:

df = pd.read_json(df.results.to_json(), orient='index')
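
In context, df there is the frame returned by the original pd.read_json(url) call in the question, so the full round trip looks roughly like this (sketch; the show= list is the same sensor list as above):

# Sketch of the round trip, using the same sensor list as in the question
df = pd.read_json('https://www.purpleair.com/json?show=...')    # top-level frame with a 'results' column of dicts
df = pd.read_json(df.results.to_json(), orient='index')         # one row per station, indexed by position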

RexRobichaux
Frequent Contributor

Hello and a huge thank you to you both, @Joshua Herman and @Joshua Bixby. From quick testing it looks like this is exactly what I needed. I've been swamped this week but will follow up as soon as I can dig into this script a little more. Thanks again for the help!

RexRobichaux
Frequent Contributor

Thanks again for the assistance with this script, all! I have it to a point where it's working with some bare-bones functionality. At this point, it searches for any values exceeding a threshold, and if any records exist above that threshold, it fires off an email with the count of records in a pre-scripted message to recipients to notify them.

I was wondering if it's difficult to embed the DataFrame results (only the offending stations, if they exist) into the email body or as an HTML attachment? I found a few methods to convert a DataFrame to HTML (https://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.to_html.html)

but am having issues getting it to embed properly into the body of the email as described here. I normally just end up with {None} getting embedded, which indicates to me that it's not properly pulling the offending values from the DataFrame. The current, clean / working version of the code is below; it should trigger if any records return pm2_5_atm values over 15 (this is easy to change in the script for testing). Any thoughts on an easy way to get additional data (Station Name / ID, etc.) added to the email notification?
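
For reference, the direction I've been experimenting in looks roughly like this (a sketch of my best guess at the approach, reusing the from_email / primary_email / SMTP_server variables from the script below; this isn't what I currently have wired up):

# Sketch (best guess): send the offending rows as an HTML table in the email body
from email.mime.text import MIMEText
import smtplib

html_body = "<p>Stations exceeding the threshold:</p>" + df2.to_html(index=False)
msg = MIMEText(html_body, "html")
msg["Subject"] = "AIR Quality Alert Notification"
msg["From"] = from_email
msg["To"] = primary_email

with smtplib.SMTP(SMTP_server) as server:
    server.sendmail(from_email, [primary_email], msg.as_string())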

Thanks again everyone. The help is much appreciated!

import numpy as np
import pandas as pd
import requests
import json
import smtplib


# In case of AQM Alert notification, this script will email the identified DEQ recipients. These configuration
# variables can be altered as needed.

scriptname = "AirJSONCalls.py"
primary_email = "Rex xxxxxxx <Rex.xxxxxxx@xxx.virginia.gov>"
backup_email = "xxxxxxx<xxxxxx@xxxx.virginia.gov>"
from_email = "xxx-xxx-xxxx <xxx-xxx-xxxx@xxx.virginia.gov>"
SMTP_server = "xxx.virginia.gov"
#email = (df2)
def pyMailer(alert):
    SERVER = SMTP_server
    FROM = from_email
    TO = [primary_email]
    SUBJECT = "AIR Quality Alert Notification"
    MSG = (f"This is an automated message. \n\rThere are Virginia AQM sites reporting concerning levels of AQI data. There are {alert} stations currently exceeding established thresholds. Please investigate further here: https://www.purpleair.com/map?opt=1/i/mAQI/a10/cC0&select=38835#6.09/38.171/-79.871")
    # Build the message (headers must start at column 0, so the string body is not indented)
    print("Building the message...")
    MESSAGE = """\
From: %s
To: %s
Subject: %s

%s
""" % (FROM, ", ".join(TO), SUBJECT, MSG)

    # Send the email
    print("Sending the email...")
    server = smtplib.SMTP(SERVER)
    server.sendmail(FROM, TO, MESSAGE)
    server.quit()

#Get VA station data and parse the JSON into a Python dictionary that can be loaded into a pandas DataFrame:

r = requests.get('http://www.purpleair.com/json?show=39183|31221|31179|31185|2827|12785|35779|15947|46911|3095|35885|10286|33089|38627|34099|27405|21109|27825|34579|34109|27401|21111|27873|34747|26961|38253|34153|27807|27833|27863|22355|9930|2221|8244|8248|4427|2944|2514|15019|28651|47173|25361|36607|37575|4591|6612|36139|30739|5512|3157|34933|38835|48625|34951|2239|12016|34803|14687|26249|35693|36965|34847|34797|44553|39593|46293|48047|30169')
dictionary = json.loads(r.text)
results = dictionary["results"]

#Set options for dataframe:

pd.options.display.max_columns = None
pd.set_option('display.max_rows', None)

#View results and convert appropriate field (pm2_5_atm) from string to float for quantitative analysis:

df = pd.DataFrame(results)
df.pm2_5_atm = df.pm2_5_atm.astype(float)

#Return / retain only values that exceed the threshold (set to 15 here for testing; ~26 would be close to 100 AQI, still need to confirm this number)
df2 = df[df.pm2_5_atm > 15]
#df.sort_values('pm2_5_atm', ascending=False)

df2
# Count the offending stations
total_rows = len(df2.axes[0])
print("Number of Rows: " + str(total_rows))

rowcount = total_rows
#print (rowcount)
alert = rowcount

#email = email.format(df2=df.to_html())


#Send the notification email if any stations exceed the threshold; otherwise the script concludes quietly.
if rowcount > 0:
    pyMailer(alert)
else:
    print("Concluded AQM script, no exceedingly high AQI records were found.")
RexRobichaux
Frequent Contributor

Sometimes... a little sleep and fresh eyes are all it takes to have an obvious "ah-ha" moment. It turns out DataFrame content can be inserted into the email text simply by passing in the variable for the DataFrame you want to include (here, interpolating {df2} into the f-string). Duh... It doesn't look great, but it's not terrible either. The final script is:

import numpy as np
import pandas as pd
import requests
import json
import smtplib


# In case of AQM Alert notification, this script will email the identified DEQ recipients. These configuration
# variables can be altered as needed.

scriptname = "AirJSONCalls.py"
primary_email = "Rex xxxxx <xxxxxx@xxxx.virginia.gov>"
backup_email = "xxxx xxxxx <xxxxxx@xxxx.virginia.gov>"
from_email = "xxx-xxx-xxx <xxx-xxx-xxx@xxxx.virginia.gov>"
SMTP_server = "xxxx.xxxx.virginia.gov"

def pyMailer(alert):
    SERVER = SMTP_server
    FROM = from_email
    TO = [primary_email, backup_email]
    SUBJECT = "AIR Quality Alert Notification"
    MSG = (f"This is an automated message. \n\rThere are Virginia AQM sites reporting concerning levels of AQI data. There are {alert} stations currently exceeding established thresholds. Please investigate further here: https://www.purpleair.com/map?opt=1/i/mAQI/a10/cC0&select=38835#6.09/38.171/-79.871  \n\rStation Details can be found below:\n\r{df2}")
    # Build the message (headers must start at column 0, so the string body is not indented)
    print("Building the message...")
    MESSAGE = """\
From: %s
To: %s
Subject: %s

%s
""" % (FROM, ", ".join(TO), SUBJECT, MSG)

    # Send the email
    print("Sending the email...")
    server = smtplib.SMTP(SERVER)
    server.sendmail(FROM, TO, MESSAGE)
    server.quit()

#Get VA station data and parse the JSON into a Python dictionary that can be loaded into a pandas DataFrame:

r = requests.get('http://www.purpleair.com/json?show=39183|31221|31179|31185|2827|12785|35779|15947|46911|3095|35885|10286|33089|38627|34099|27405|21109|27825|34579|34109|27401|21111|27873|34747|26961|38253|34153|27807|27833|27863|22355|9930|2221|8244|8248|4427|2944|2514|15019|28651|47173|25361|36607|37575|4591|6612|36139|30739|5512|3157|34933|38835|48625|34951|2239|12016|34803|14687|26249|35693|36965|34847|34797|44553|39593|46293|48047|30169')
dictionary = json.loads(r.text)
results = dictionary["results"]

#Set options for dataframe:

pd.options.display.max_columns = None
pd.set_option('display.max_rows', None)

#View results and convert appropriate field (pm2_5_atm) from string to float for quantitative analysis:

df = pd.DataFrame(results)
df.pm2_5_atm = df.pm2_5_atm.astype(float)

#Return / retain only values that exceed the threshold (set to 10 here for testing; ~26 would be close to 100 AQI, still need to confirm this number)
df2 = df[df.pm2_5_atm > 10]
#df.sort_values('pm2_5_atm', ascending=False)

df2
# Count the offending stations
total_rows = len(df2.axes[0])
print("Number of Rows: " + str(total_rows))

rowcount = total_rows
#print (rowcount)
alert = rowcount

#Check to see if any offending values are found for stations and if they are, send notification email. If not, script concludes.
if rowcount > 0:
    pyMailer(alert)
else:
    print("Concluded AQM script, no exceedingly high AQI records were found.")