Good afternoon. I am a returning user after some time away. I used ArcGIS Pro for 3 1/2 years in college, and have now purchased the subscription for myself.
Context:
I am pulling weather data using an API key, with the goal of eventually automating the data pull, conversion, and creation of ArcGIS maps in one seamless process. However, I am stuck on one component. The data I have been pulling comes back as JSON. I have tried feeding JSON and GeoJSON files to Arc, and it will not accept them; it is, however, accepting shapefiles. So I have been feeding the data I collect into ChatGPT to help me write Python code to convert it, just to see whether the data could work, and I am now entering the semi-automation stage. The problem is that there are now over 2,700 lines of raw data, which is more than ChatGPT can take in to turn into code, so I now have to troubleshoot and find another way to solve this.
Update:
I have created a manual system to go from JSON, to CSV, to shapefile. This route works; however, the shapefile does not carry the correct formatting of the converted data, so when displayed in ArcGIS, the data shows up off the coast of Africa instead of in its intended location.
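For what it's worth, data landing off the west coast of Africa usually means coordinates near (0, 0) ("Null Island"), a missing .prj/spatial reference, or swapped latitude/longitude columns in the intermediate CSV. A quick stdlib-only sanity check for the swapped-axes case might look like this (the "lat"/"lon" column names are hypothetical; substitute whatever your CSV actually uses):

```python
import csv
import io

def looks_swapped(rows, lon_key="lon", lat_key="lat"):
    """Heuristic: latitudes must lie in [-90, 90]. If values in the
    lat column exceed that range while the lon column fits within it,
    the two axes were probably written in the wrong order."""
    lons = [float(r[lon_key]) for r in rows]
    lats = [float(r[lat_key]) for r in rows]
    return any(abs(v) > 90 for v in lats) and all(-90 <= v <= 90 for v in lons)

# Example CSV where longitude values ended up under the "lat" header
sample = io.StringIO("lat,lon\n-104.99,39.74\n-105.02,39.71\n")
rows = list(csv.DictReader(sample))
print(looks_swapped(rows))  # → True
```

If this flags your intermediate CSV, swapping the two columns before the shapefile step (and making sure the shapefile gets a .prj for WGS 84) should move the data to the right place.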
My question:
Explanation: The final product is this: a script pulls weather API data every 15 minutes, converts it, then delivers it to ArcGIS, and Arc compiles the 15-minute intervals into a visual map with a moving forecast for people to view and consume. Other sources, such as weather stations and satellites, will be incorporated later for larger data downloads and displays, but for now this specific goal is what I am aiming for.
Many thanks for reading this far; any help or advice is appreciated. Thank you for your time and consideration.
You'll need to extract that weather API's geometry format, convert it to the appropriate arcpy Geometry object, then feed that into your Insert Cursor using the magic "SHAPE@" field token. A hypothetical:
import arcpy
import json

fields = ("key", "SHAPE@")
crs = arcpy.SpatialReference(4326)  # WGS 84
with arcpy.da.InsertCursor(existing_feature_class, fields) as cur, open(my_json_file) as f:
    data = json.load(f)
    for obj in data["features"]:
        key = obj["id"]
        # Build the polygon boundary from the feature's coordinate list
        points = arcpy.Array([arcpy.Point(pt["x"], pt["y"]) for pt in obj["polygon"]])
        shp = arcpy.Polygon(points, spatial_reference=crs)
        cur.insertRow((key, shp))
The exact method you use to extract the info and turn it into data arcpy can work with will differ based on what the API is returning; it's up to you to interpret their format correctly.
As for automating: if you build a feature class properly, you can continue to load it with data from many input files. Alternatively, you can build a new feature class from scratch each time. Arcpy has all the tools to get you from point A to point B; play around a bit and you should find a solution.
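If you go the new-feature-class-per-run route, the main bookkeeping is generating a name that is unique per 15-minute pull and legal in a geodatabase (starts with a letter, no spaces or dashes). A minimal sketch — the "weather" prefix and the CreateFeatureclass call in the comment are illustrative, not part of your existing setup:

```python
from datetime import datetime

def run_name(prefix="weather", when=None):
    """Build a geodatabase-safe feature class name for one 15-minute pull.
    The timestamp is reduced to digits joined by underscores so the name
    starts with a letter and contains no illegal characters."""
    when = when or datetime.now()
    return f"{prefix}_{when:%Y%m%d_%H%M}"

name = run_name(when=datetime(2024, 3, 1, 14, 45))
print(name)  # → weather_20240301_1445
# Each run could then create its own class with something like:
# arcpy.management.CreateFeatureclass(gdb_path, name, "POLYGON",
#                                     spatial_reference=arcpy.SpatialReference(4326))
```

A scheduled task (Windows Task Scheduler or cron) firing the script every 15 minutes then gives you the unattended pull-convert-load loop without any manual monitoring.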
Take a look at the built-in json library to convert the API results into a dictionary; then you can extract the values you need and feed them into a feature class (existing or new) with an Insert Cursor. You'll also have to create your feature class in the data's coordinate system, or project the incoming data to match your feature class's CRS.
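That json-to-dictionary step can be tested entirely outside of Arc. A stdlib-only sketch, using the same hypothetical "features"/"id"/"polygon" layout as the cursor example (your API's keys will differ):

```python
import json

# Stand-in for the text body returned by the weather API
raw = '''{"features": [
    {"id": "cell_7", "polygon": [{"x": -105.0, "y": 39.7},
                                 {"x": -104.9, "y": 39.7},
                                 {"x": -104.9, "y": 39.8}]}
]}'''

data = json.loads(raw)  # API response text -> plain Python dict
# Flatten each feature into (key, [(x, y), ...]) pairs
rows = [(obj["id"], [(pt["x"], pt["y"]) for pt in obj["polygon"]])
        for obj in data["features"]]
print(rows[0][0])  # → cell_7
```

Each tuple is then ready to be turned into an arcpy.Polygon and passed to cur.insertRow, as in the answer above.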
Can this evolve from point to polygon formatting, or is it merely a placeholder solution to get the system reading the pulled data in a new format? And what about when there is no pre-existing feature class? Will this route of data conversion allow for automated creation of a feature class in a new Arc project, creating new feature classes without me having to monitor it 24/7?
I am simply trying to find a path that works with maximum efficiency. I don't mind how much code or effort it takes; I just want it to pull as much data as it can on its own.
I will give this a try in different ways to see whether it is feasible for my project. Many thanks and appreciation for your time and consideration.