This is a recurring problem of mine: How do you easily tell if a feature class is part of a feature dataset or not?
It isn't an option in the Feature Class Properties, or in Describe Properties, or in Table Properties.
The best way I've found (unless I'm missing something very obvious, in which case I apologize to you all) is to just check the literal path of the feature class, e.g.:
z = [r"\\...\example.gdb\featDS\fc", r"\\...\example.gdb\fc"]
for x in z:
    if ".gdb" in x:
        y = x.split(".gdb")[1].split("\\")
        print(y)
        if len(y) > 2:
            print("y is in a feature dataset")
        else:
            print("y is standalone")
# "y is in a feature dataset"
# "y is standalone"
This is kind of gross, you know?
I suppose I could also check out the path property ('path': '\\\\...\\exfgdb.gdb\\FD'), but then I still have to parse the dang thing to figure out how long it is. It's certainly more elegant than the code above, but it doesn't solve the problem of "I shouldn't have to work this hard".
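For reference, a minimal sketch of that path-property approach (the path is hypothetical, and because it keys off the ".gdb" extension it only really holds for file geodatabases):
import arcpy
# Hypothetical path; substitute a real feature class
desc = arcpy.da.Describe(r"C:\data\exfgdb.gdb\FD\inFD")
parent = desc["path"]  # e.g. '...\exfgdb.gdb\FD' when inside a feature dataset
# Still string parsing: if the parent container isn't the .gdb itself, assume it's a feature dataset
print(not parent.lower().endswith(".gdb"))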
I know that for the majority of geoprocessing, being in a feature dataset is immaterial to arcpy. That is, you can feed Buffer() both "\\...\ex.gdb\featDS\fc" and "\\...\ex.gdb\fc" and it'll work just fine.
That being said, there are times (e.g. replacing data sources) where you need to know whether you're in a feature dataset or not.
Please add a property letting us do one of the following:
1) .isInFeatureDataset(): Boolean
2) .FeatureDataset(): the actual feature dataset name, defaulting to None for shapefiles and for gdb feature classes outside a feature dataset (mocked up below)
or
3) Release an easy, surefire way for us to figure it out without parsing the path as a string.
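For illustration, a purely hypothetical mock-up of how options 1 and 2 might read (neither property exists in arcpy today):
# desc = arcpy.Describe(r"\\...\exfgdb.gdb\FD\inFD")
# desc.isInFeatureDataset()  ->  True / False
# desc.FeatureDataset()      ->  "FD", or None for a shapefile or a gdb feature class outside a feature dataset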
See below for a sample of a real feature class that lives in a feature dataset called "FD". Other than reading the catalogPath or path properties and then parsing them, there isn't any clean way to get the feature dataset name.
print(arcpy.da.Describe(r"\\...\exfgdb.gdb\FD\inFD"))
{'catalogPath': '\\\\...\\exfgdb.gdb\\FD\\inFD',
'FIDSet': None,
'aliasName': '',
'areaFieldName': '',
'attributeRules': [],
'baseName': 'inFD',
'canVersion': False,
'changeTracked': False,
'children': [],
'childrenExpanded': True,
'createdAtFieldName': '',
'creatorFieldName': '',
'dataElementType': 'DEFeatureClass',
'datasetType': 'FeatureClass',
'dataType': 'FeatureClass',
'defaultSubtypeCode': -1,
'DSID': 4,
'editedAtFieldName': '',
'editorFieldName': '',
'editorTrackingEnabled': False,
'extension': '',
'extensionProperties': {},
'extent': <Extent object at ...>,
'featureType': 'Simple',
'fields': [<Field object at ...>,
<Field object at ...>,
<Field object at ...>],
'file': 'inFD',
'fullPropsRetrieved': True,
'geometryStorage': '',
'globalIDFieldName': '',
'hasGlobalID': False,
'hasM': False,
'hasOID': True,
'hasSpatialIndex': True,
'hasZ': True,
'indexes': [<Index object at ...>,
<Index object at ...>],
'isCOGOEnabled': False,
'isCompressed': False,
'isTimeInUTC': True,
'isVersioned': False,
'lengthFieldName': 'Shape_Length',
'metadataRetrieved': False,
'MExtent': 'nan nan',
'modelName': '',
'name': 'inFD',
'OIDFieldName': 'OBJECTID',
'path': '\\\\...\\exfgdb.gdb\\FD',
'rasterFieldName': '',
'relationshipClassNames': [],
'representations': [],
'shapeFieldName': 'Shape',
'shapeType': 'Polyline',
'spatialReference': <SpatialReference object at ...>,
'subtypeFieldName': '',
'versionedView': '',
'ZExtent': 'nan nan'
}
This is extra annoying because if you check out a feature layer's properties, it straight-up tells you that it's in a feature dataset.
I agree, it would be nice if there were a simpler way to interact with feature dataset properties.
It's not as efficient as your suggestion, but I'd just like to point out that this functionality does exist via the layer CIM properties.
Note: if you are operating directly on a feature class, I believe you have to run MakeFeatureLayer first (a sketch of that variant follows the example below).
For example:
import arcpy

aprx = arcpy.mp.ArcGISProject('CURRENT')
mp = aprx.listMaps('TEST_MAP')[0]
lyr = mp.listLayers('TEST_LAYER')[0]
lyrCIM = lyr.getDefinition('V3')  # Layer CIM definition
dc = lyrCIM.featureTable.dataConnection  # Layer CIM data connection properties
if hasattr(dc, "featureDataset"):  # Check if the layer is in a feature dataset
    print(dc.featureDataset)
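For a feature class that isn't already in a map, here's a rough sketch of that MakeFeatureLayer route (the path is hypothetical, and I haven't verified that a layer made this way exposes getDefinition exactly like one pulled from a project):
import arcpy

fc = r"C:\data\exfgdb.gdb\FD\inFD"  # hypothetical path; substitute your own
tmp_lyr = arcpy.management.MakeFeatureLayer(fc, "tmp_cim_check").getOutput(0)
dc = tmp_lyr.getDefinition('V3').featureTable.dataConnection
if hasattr(dc, "featureDataset"):
    print(dc.featureDataset)  # e.g. "FD"
else:
    print("Not in a feature dataset")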
This is kind of a messy workaround, but this little function does a good job of finding a dataset and returning the dataset describe object:
import arcpy
import arcpy.typing.describe as typdesc
import os
from typing import Optional, Union

def get_dataset(fc: Union["typdesc.base.FeatureClass", os.PathLike]) -> Optional["typdesc.base.Dataset"]:
    if isinstance(fc, str):
        fc = arcpy.Describe(fc)
    fc_path: list[str] = fc.catalogPath.split(os.sep)
    workspace_path: list[str] = fc.workspace.catalogPath.split(os.sep)
    if len(fc_path) - len(workspace_path) == 1:
        # Only one level below the workspace, so the feature class is standalone
        return None
    return arcpy.Describe(os.sep.join(fc_path[:-1]))
It also uses arcpy's hidden describe typing system for autocomplete goodness. The code basically does what yours does, but uses the .catalogPath and .workspace.catalogPath properties of the FeatureClass describe object instead.
This approach should technically work even with non-file database sources.
Here's the function being used in a simple check:
fc = r"path\to\feature_class"
fc_dataset: Optional["typdesc.base.Dataset"] = get_dataset(fc)
if fc_dataset:
print(fc_dataset.name)
else:
print("No dataset found")
Thanks for the conversation around this Idea.
Describe properties typically only reference properties that exist in the data element of an object. Feature dataset information is not currently included in that, which is a reason for the current limitation.
The above methods are all perfectly viable for getting feature dataset information. I'll share the method I use, which saves me from having to parse strings directly.
import arcpy
import pathlib

def get_fds_name(tbl_path: str | pathlib.Path) -> str:
    parent = pathlib.Path(tbl_path).parent
    if arcpy.Describe(parent.as_posix()).datasetType == 'FeatureDataset':
        return parent.name
    return ""