Hey Tim Reissen,
Your optimise may have finished by the time you get this message, but the speed of the optimise depends on the number of polygons and the number of vertices. Are your polygons detailed, with lots of vertices? If they aren't, you may wish to try restarting the process.
If this answer was helpful, please mark it as helpful. If it solved your question, please mark it as the answer to help others who have the same question.
Checked my computer again this morning, and it still says optimizing. Yes, there are lots of vertices, but I did not expect it to take a full day. How do I either cancel it or restart?
Late response, and you have probably solved your issue by now, but what is "lots of vertices"? Thousands? Millions?
If you get over 100k vertices per polygon or line geometry, things can slow down dramatically. For example, in a custom Python scripting workflow I've been generalizing data in a PostGIS database with records of up to 1.7M vertices per polygon. By monitoring the work in pgAdmin, the database management software of PostgreSQL, I discovered that a single record could take half a day to process when its geometry exceeded 1-2M vertices.
I don't know if it is acceptable for your workflow, but dicing the geometries to a 100k vertex limit improved processing times dramatically. For one medium-sized, long-running dataset (although not containing that 1.7M-vertex monster polygon), it cut the processing time from 1h23m to just 4-5 minutes.
Dicing may not always be acceptable, but in my case it was, and it helped solve the performance issue.
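In case it's useful to anyone else: a minimal sketch of what I mean by "dicing", here done in Python with shapely (assuming you have it installed; the function names `count_vertices` and `dice` are just my own for illustration). The idea is to recursively split a geometry along its bounding box until every piece is under the vertex limit. If you're working directly in PostGIS, `ST_Subdivide(geom, max_vertices)` does essentially the same thing server-side.

```python
# Sketch of dicing a polygon to a vertex limit by recursively
# halving its bounding box. Assumes shapely is installed; helper
# names here are illustrative, not a standard API.
from shapely.geometry import box

def count_vertices(geom):
    """Total vertex count over all rings of a (multi)polygon."""
    if geom.geom_type == "Polygon":
        return len(geom.exterior.coords) + sum(
            len(r.coords) for r in geom.interiors)
    if geom.geom_type == "MultiPolygon":
        return sum(count_vertices(g) for g in geom.geoms)
    return 0  # drop non-polygonal slivers from box clipping

def dice(geom, max_vertices=100_000):
    """Split geom into pieces, each with <= max_vertices vertices."""
    n = count_vertices(geom)
    if n == 0:
        return []
    if n <= max_vertices:
        return [geom]
    minx, miny, maxx, maxy = geom.bounds
    # Split the longer axis of the bounding box in half.
    if (maxx - minx) >= (maxy - miny):
        mid = (minx + maxx) / 2.0
        halves = (box(minx, miny, mid, maxy), box(mid, miny, maxx, maxy))
    else:
        mid = (miny + maxy) / 2.0
        halves = (box(minx, miny, maxx, mid), box(minx, mid, maxx, maxy))
    pieces = []
    for h in halves:
        part = geom.intersection(h)
        # Clipping can return a GeometryCollection; keep polygonal parts.
        if part.geom_type == "GeometryCollection":
            subs = [g for g in part.geoms
                    if g.geom_type in ("Polygon", "MultiPolygon")]
        else:
            subs = [part]
        for s in subs:
            pieces.extend(dice(s, max_vertices))
    return pieces
```

The pieces tile the original geometry exactly (their areas sum to the original's), so downstream generalization can run per piece and the results can be merged back afterwards.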