Thank you for your reply; it points out how poorly I framed my post. My problem comes from having many (400,000+) records that need to be updated. If this data were in a file geodatabase it would be fine, because I can set the number of records to write per transaction. The problem is I'm working against a versioned geodatabase and can't run transactions. What I want to avoid is manually splitting the data into smaller sets and running them one after the other, instead of automating the whole process in Data Interoperability (DI).
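For what it's worth, the splitting step itself can be scripted rather than done by hand. Here is a minimal Python sketch of chunking a large ID list into fixed-size batches; `apply_batch` is a hypothetical placeholder for whatever per-batch edit operation you run (a DI workspace call, an arcpy update cursor, etc.), not a real API:

```python
def chunked(records, size):
    """Yield successive batches of at most `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def apply_batch(batch):
    # Hypothetical stand-in: replace with the real update logic
    # (e.g. invoking a DI workspace or an edit session per batch).
    print(f"updating {len(batch)} records")

# Example: split 400,000 record IDs into 10,000-record batches.
record_ids = list(range(400_000))
for batch in chunked(record_ids, 10_000):
    apply_batch(batch)
```

This keeps each unit of work small enough to commit (or reconcile) independently, which is the same effect as the per-transaction record count a file geodatabase gives you.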