I am producing a large GeoJSON file using Python, and some fields for some of the features are empty. Per the GeoJSON spec (RFC 7946), these are set to null. However, on importing the file into ArcGIS Pro, the nulls prevent the field from being recognized as numeric and it imports as text. Here is an example:
"features": [
{
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": [
-75.35539,
39.823217
]
},
"properties": {
"DEPTH": 0.0,
"PARENT_SAMPLE_NAME": "sdffssdf",
"STRATIGRAPHY_LAYER": 1.0,
"qc [MPa]": null,
"fs [MPa]": null,
"u2 [MPa]": null,
"Qnet [Mpa]": null,
"SBTi": null
}
},
{
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": [
-72.23549,
36.823217
]
},
"properties": {
"DEPTH": 8.4,
"PARENT_SAMPLE_NAME": "fsfsfd",
"STRATIGRAPHY_LAYER": 2.0,
"qc [MPa]": 1e-9,
"fs [MPa]": 1e-9,
"u2 [MPa]": 0.0,
"Qnet [Mpa]": -0.084208058,
"SBTi": 0.0
}
}
]
Because the first feature (and some other features throughout the file) has null values, qc, fs, u2, Qnet and SBTi are imported as text. I need them to be numeric. Replacing null with any number works, but I want them to remain as null or empty values.
Removing the properties that are null also doesn't work, as then that field is not present at all.
How can I get these numbers imported as numbers?
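For reference, the nulls come straight from Python's `None`. A minimal sketch of how one such feature might be serialized (the values mirror the first feature in the example above):

```python
import json

# A single feature mirroring the example above; Python's None
# serializes to a JSON null, which is where the nulls originate.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-75.35539, 39.823217]},
    "properties": {
        "DEPTH": 0.0,
        "qc [MPa]": None,  # None -> null in the output
        "SBTi": None,
    },
}
print(json.dumps(feature, indent=2))
```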
You're probably better off establishing the layer in a geodatabase first, then using the GeoJSON to append features to it.
When Pro imports what is essentially a text-based format, it often guesses wrong about the data type of a field. The inverse problem is often seen with numeric identifiers being imported as true number fields and losing their leading zeros. Establishing the destination layer first enforces the data types and should give you a more reliable output.
OK, so this leads to even more interesting behaviour. If the fields are set up as numeric, null values in the JSON are imported as 0. If the fields are completely missing from the JSON for a point, the values are still imported as 0. This shouldn't really happen: ArcGIS is inventing data that isn't there in the source?
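Before concluding that Pro is inventing values, it can help to confirm exactly what the source file contains. A small sketch (the function name and usage path are hypothetical) that tallies, for one field, how many features carry a true JSON null, how many omit the key entirely, and how many hold a number:

```python
import json

def audit_property(geojson, name):
    """Tally how a property appears across features: JSON null, key absent, or numeric."""
    counts = {"null": 0, "missing": 0, "numeric": 0}
    for feature in geojson["features"]:
        props = feature.get("properties") or {}
        if name not in props:
            counts["missing"] += 1
        elif props[name] is None:
            counts["null"] += 1
        elif isinstance(props[name], (int, float)):
            counts["numeric"] += 1
    return counts

# Hypothetical usage against the file on disk:
# with open("samples.geojson") as f:
#     print(audit_property(json.load(f), "qc [MPa]"))
```

If the audit shows genuine nulls and missing keys in the source but the imported layer shows zeros, the zeros were introduced at import time.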