Covid19 Cases Feature Layer Data and ESRI DUMPER

05-22-2020 02:29 AM
Dunn_Jeremy
New Contributor III

Morning All

I want to download the COVID-19 daily cases data from here:

https://services1.arcgis.com/0MSEUqKaxRlEPj5g/ArcGIS/rest/services/Coronavirus_2019_nCoV_Cases/Featu... 

I have tried the following:

1. ESRI DUMPER - works, but I then have difficulty converting the Esri GeoJSON output into something usable in a database.

Any thoughts on the best way to convert this output to a SQL-readable format, for example a CSV or tab-delimited flat file?

Many thanks 

Regards. 

Jeremy. 



4 Replies
RandyBurton
MVP Alum

You can use the json module to process the geojson. I tested the following using Python 2.7; it makes the URL request and writes a tab-delimited file.

import urllib, json, time, collections

fw = open("covidUpdate.txt", "w") # file to write

URL = "https://services1.arcgis.com/0MSEUqKaxRlEPj5g/ArcGIS/rest/services/Coronavirus_2019_nCoV_Cases/FeatureServer/1/query" # query this feature

# query parameters (urllib.urlopen sends them as POST data)
query_dict = {
    "where" : "1=1", # something always true
    "outFields" : "*",
    "f": "geojson" }

# results in geojson format
jsonResponse = urllib.urlopen(URL, urllib.urlencode(query_dict))

features = json.loads(jsonResponse.read(),
                      object_pairs_hook=collections.OrderedDict)[u'features']  # use the 'features' section

# Header
fw.writelines("{}\t{}\t{}\t{}\t{}\t{}\t{}\t{}\t{}\n".format(
    'OBJECTID','Province_State','Country_Region','Last_Update','Latitude','Longitude','Confirmed','Recovered','Deaths'))

# Data rows
for f in features:
    fw.writelines("{}\t{}\t{}\t{}\t{}\t{}\t{}\t{}\t{}\n".format(
        f['properties']['OBJECTID'],
        f['properties']['Province_State'] if f['properties']['Province_State'] is not None else '',
        f['properties']['Country_Region'],
        # convert timestamp to local time
        # time.strftime('%c', time.localtime(f['properties']['Last_Update']/1000)) if f['properties']['Last_Update'] is not None else '',
        # convert timestamp to GMT
        time.strftime('%Y-%m-%d %H:%M:%S', time.gmtime(f['properties']['Last_Update']/1000)) if f['properties']['Last_Update'] is not None else '',
        f['properties']['Lat'],
        f['properties']['Long_'],
        f['properties']['Confirmed'],
        f['properties']['Recovered'],
        f['properties']['Deaths']))

fw.close()
print 'Done.'

Results:

OBJECTID	Province_State	Country_Region	Last_Update	Latitude	Longitude	Confirmed	Recovered	Deaths
1	Abruzzo	Italy	2020-05-22 23:32:40	42.35122196	13.39843823	3220	1647	394
2	Acre	Brazil	2020-05-22 23:32:40	-9.0238	-70.812	3343	0	80
3	Aguascalientes	Mexico	2020-05-22 23:32:40	21.8853	-102.2916	586	402	21
.....

I noticed that Province_State is sometimes null/None, as was the timestamp on occasion. I kept the time in GMT/UTC, but you can convert it to local time if desired. It is also possible to use an insert cursor and create a feature layer if desired.

This should give you some ideas.
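For the insert cursor idea, a minimal sketch (untested) might look like the one below. It assumes an existing point feature class with matching fields - the path and field list here are hypothetical - and it reuses the features list parsed in the script above.

import arcpy

# Hypothetical target feature class; it must already exist with these fields
fc = r"C:\Path\to\file.gdb\covid_points"
fields = ["SHAPE@XY", "Province_State", "Country_Region", "Confirmed", "Recovered", "Deaths"]

with arcpy.da.InsertCursor(fc, fields) as cursor:
    for f in features:  # 'features' as parsed from the geojson response above
        p = f['properties']
        if p['Lat'] is None or p['Long_'] is None:
            continue  # skip rows without coordinates
        cursor.insertRow((
            (p['Long_'], p['Lat']),  # point geometry as an (x, y) tuple
            p['Province_State'] or '',
            p['Country_Region'],
            p['Confirmed'],
            p['Recovered'],
            p['Deaths']))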

RandyBurton
MVP Alum

If you requested json instead of geojson, you could use the JSONToFeatures tool (Pro version) to import the data into ArcMap.

My test script using Python 2.7:

import arcpy, urllib

URL = "https://services1.arcgis.com/0MSEUqKaxRlEPj5g/ArcGIS/rest/services/Coronavirus_2019_nCoV_Cases/FeatureServer/1/query" # query this feature

# query parameters (urllib.urlopen sends them as POST data)
query_dict = {
    "where" : "1=1",
    "outFields" : "*",
    "f": "json" }

# results in json format
response = urllib.urlopen(URL, urllib.urlencode(query_dict))

with open('covid_update.json', 'w') as f:
    f.write(response.read())
    
arcpy.JSONToFeatures_conversion(in_json_file="covid_update.json",
                                out_features="C:/Path/to/file.gdb/covid_update")

print 'Done.'
Dunn_Jeremy
New Contributor III

Good Afternoon Randy

Many thanks for the post. After a little re-working I have a script that works.

Many many thanks. The only issue I have is that the feature layer I am querying has 3466 features in it and the Max Record Count is limited to 1000. Is there any way to increase this?

Regards. 

Jeremy. 

Dunn_Jeremy
New Contributor III

Morning Randy

I solved this by simply running a where clause on the ObjectIDs.

Where "ObjectID>1000"

Seemed to work. 
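A rough sketch of the same idea as a loop (untested; it assumes the layer's 1000-record page size, that OBJECTID is the layer's ID field, that the service supports orderByFields, and it reuses the Python 2 query pattern from Randy's script):

import urllib, json

URL = "https://services1.arcgis.com/0MSEUqKaxRlEPj5g/ArcGIS/rest/services/Coronavirus_2019_nCoV_Cases/FeatureServer/1/query"

all_features = []
last_oid = 0
while True:
    query_dict = {
        "where": "OBJECTID > {}".format(last_oid),  # next page of OBJECTIDs
        "outFields": "*",
        "orderByFields": "OBJECTID",
        "f": "geojson"}
    response = urllib.urlopen(URL, urllib.urlencode(query_dict))
    features = json.loads(response.read())['features']
    if not features:
        break  # no more records
    all_features.extend(features)
    last_oid = features[-1]['properties']['OBJECTID']

print '{} features downloaded.'.format(len(all_features))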

Regards. 

Jeremy. 
