Append skips records exceeding character length

09-04-2024 08:39 AM
BMCGIS
Occasional Contributor

I want to use the Append tool as part of a model process for adding new records to a database of building permits.

However, I have found that the Append tool fails for any records where the input text field has more characters than the corresponding target layer field width.

[attached screenshot: Append_AM_03b.png]

So is there a tool that will truncate these values to the required character length before I use the Append tool? Or is there a process I can use afterwards to review and fix the records that didn't append? The error message only lists 10 of the 50 failed records, but the tool also generates a file containing the full list of OIDs for the records that could not be processed.

Are there any other tools or best practices that I can use to prevent or work around this issue?

I can somewhat address this by using large field lengths in my target layer, but I have no control over the inputs, and there will always be a risk that some records will have longer text entries than I planned for.

8 Replies
DanPatterson
MVP Esteemed Contributor

Have you set up field mapping? And if so, what have you set it as?

Append (Data Management)—ArcGIS Pro | Documentation

Merge might be a better option if you have no control over source and destination field structures


... sort of retired...
BMCGIS
Occasional Contributor

I'm just using the "Input fields must match target fields" option. I don't need field mapping because the input fields will all be the same name and type as the target fields.

But I would use field mapping if that provided a way to solve this. I looked through the documentation, but I don't see a way to do it there.

And I think Merge won't be a good option because it creates a new layer instead of updating the original layer.

To clarify, I do have control over the destination field structures. I just can't control the source field sizes. If a permit office employee enters a very long description of the permitted work, or even if a permit has an unusually long name or address, I can't control that.

For more context, I work for a regional planning agency and our member counties are sharing their permit data with us. So we have no method for implementing any controls or limits on the individual county systems.

jcarlson
MVP Esteemed Contributor

Where is the source data coming from? If it's an Enterprise GeoDatabase, you could bring in the data as a Query Layer, then use SQL to pull in the field as "LEFT(County, 49)" or something.
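Roughly like this (an untested sketch -- the connection file, table, and field names below are placeholders, and the exact SQL depends on your database):

import arcpy

# Placeholder .sde connection file pointing at the enterprise geodatabase
sde = r"C:\connections\permits.sde"

# Truncate the long text columns in SQL as the data comes in
sql = (
    "SELECT OBJECTID, PermitID, "
    "LEFT(County, 49) AS County, "
    "LEFT(WorkDescription, 254) AS WorkDescription, "
    "Shape "
    "FROM dbo.BuildingPermits"
)

# Build a query layer from that statement; OBJECTID serves as the unique ID field
arcpy.management.MakeQueryLayer(sde, "permits_trimmed", sql, "OBJECTID")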

- Josh Carlson
Kendall County GIS
BMCGIS
Occasional Contributor

It will be coming from a regular file geodatabase. I'm not sure I totally follow what you're suggesting, but if I used that method, I would have to do that LEFT calculation for each of the 30 fields?

jcarlson
MVP Esteemed Contributor

For any field where the value might exceed the destination field length. But it's a FGDB, so you don't have that option anyway.

A more advanced option would be to use the ArcGIS Python API to read the source data into a pandas dataframe, make whatever adjustments you need *there*, then push the updated data to the destination layer that way.
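In rough outline (just a sketch using the Spatially Enabled DataFrame; the paths, field name, and 50-character limit are made up):

import pandas as pd
from arcgis.features import GeoAccessor  # registers the DataFrame .spatial accessor

src = r"C:\data\incoming.gdb\county_permits"          # placeholder paths
cleaned = r"C:\data\incoming.gdb\county_permits_clean"

# Read the source feature class into a pandas DataFrame
sdf = pd.DataFrame.spatial.from_featureclass(src)

# Trim any over-length text, e.g. a County field limited to 50 characters
sdf["County"] = sdf["County"].str.slice(0, 50)

# Write the cleaned copy back out, then Append that instead of the raw source
sdf.spatial.to_featureclass(location=cleaned)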

An easier, but still fairly advanced, approach might be to use ModelBuilder to create an in-memory layer from the FGDB, perform field calculations on it, and then append to the destination layer.

Basically, you just need some mid-stream processing to handle your invalid data and truncate the long values, or do a find-and-replace to abbreviate things, etc. Using Python or ModelBuilder would be nice because you can avoid creating temporary tables every time.
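If you stay in arcpy (which you could also wrap as a script tool inside the model), the truncation step could look something like this untested sketch -- the paths are placeholders and it assumes the field names already match the target:

import arcpy

src = r"C:\data\incoming.gdb\county_permits"     # placeholder paths
tgt = r"C:\data\master.gdb\building_permits"

# Look up the target's text-field widths, keyed by field name
max_len = {f.name: f.length for f in arcpy.ListFields(tgt) if f.type == "String"}

# Work on an in-memory copy so the source stays untouched
# (use CopyRows instead if the source is a standalone table)
tmp = arcpy.management.CopyFeatures(src, r"memory\permits_tmp")[0]

# Truncate any text value that exceeds the target field's width
fields = [f.name for f in arcpy.ListFields(tmp)
          if f.type == "String" and f.name in max_len]
with arcpy.da.UpdateCursor(tmp, fields) as cur:
    for row in cur:
        cur.updateRow([val[:max_len[name]] if isinstance(val, str) else val
                       for name, val in zip(fields, row)])

# Nothing should be skipped for field length once the values fit
arcpy.management.Append(tmp, tgt, "TEST")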

- Josh Carlson
Kendall County GIS
BMCGIS
Occasional Contributor

Thanks, this is a helpful idea and I can envision how doing the mid-stream processing in ModelBuilder would work for the system I'm building.

I thought maybe I was overlooking some Pro tool or data cleaning process that was intended for this scenario, but I guess not!

RandyKreuziger1
Frequent Contributor

Is there a reason you're not updating the target schema to match the source feature class? That's the best solution.

BMCGIS
Occasional Contributor

Good question, but this isn't really possible because my source data comes from 7 different counties in Excel or delimited text format, so there's no standard schema that they adhere to.

I am currently working to design a new GIS-based system for processing these inputs, so I am designing the new table to have large field widths for fields that sometimes have long entries. But there's no way to guarantee that there will never be longer entries that cause the Append tool to fail. So I am trying to build this in a way that would handle those cases.
