I have an ArcGIS Pro project that I use for data processing, with a notebook script and one map referencing file geodatabases, some SDE, and some feature services, including imagery. When I first packaged it up, it took about 5 hours (though the reported time seems off, since it apparently doesn't count the gathering phase, which takes forever). That was long but not unbearable, though I expected a larger file than the 94.584 MB I got.
Then I went over to another, less complex project, and it took minutes, not hours, to create the package. That one also has a lot of varied sources and about 30 maps.
Then I went back to the first one a week later, and creating the project package took about 12 hours.
So, in general terms, what is the tool doing during the gathering phase, and what can be optimized to get through it faster?
- Don't package History.
- Locally stored data will take less time to package.
- Packaging just for yourself will take less time.
What conditions are your two packages subjected to?
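For what it's worth, those tips correspond to parameters on the Package Project geoprocessing tool, so they can be applied in a script as well as in the Share ribbon. A minimal sketch, with the caveat that the keyword names below are assumptions recalled from the tool's signature and may differ by Pro version (the import is guarded because arcpy only exists inside an ArcGIS Pro Python environment; paths are hypothetical):

```python
# Sketch only: arcpy ships with ArcGIS Pro and is not pip-installable,
# so the import is guarded. Parameter keywords are assumptions based on
# the Package Project tool and may vary by Pro release.
try:
    import arcpy
    HAVE_ARCPY = True
except ImportError:
    HAVE_ARCPY = False

# Choices that mirror the tips above: share internally (keep data
# referenced rather than copied) and skip the geoprocessing History.
pkg_options = dict(
    sharing_internal="INTERNAL",         # package for use inside your org
    include_history_items="NO_HISTORY",  # drop History items
    version="CURRENT",                   # current Pro version only
)

if HAVE_ARCPY:
    arcpy.management.PackageProject(
        r"C:\Projects\MyProject.aprx",   # hypothetical project path
        r"C:\Projects\MyProject.ppkx",   # hypothetical output package
        **pkg_options,
    )
```

Scripting it also makes the run repeatable, so you can time the effect of each option on the same project.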
Thank you @DanPatterson. I can drop the History, but bringing all the data local isn't worth the effort and adds confusion, and some of the recipients are outside my org. What do you mean by conditions?
You answered the conditions question partially. Non-local data (SDE, web stuff, etc.) is going to add to the packaging time. Did they both have the same number of layouts? Maps? Tables, including open tables? Toolboxes?
Package a stripped-down version of your projects. In fact, there is rarely a need for a huge project in the first place, since each project can be broken down to some degree by use or functionality. That also saves the headache of losing everything because it was all in one project or package.
Good luck
Why does packaging online-referenced data take such a long time? What can be done to speed this process up?
We found a few things that may help. You may need to go to the item details and disable attachments. We found a bug where relationship tables/classes, when published online, cause a dramatic increase in packaging time. We also noticed that running the Package Map geoprocessing tool often works better than sharing a map package from the ribbon. When using the Package Map tool, speed is much better if you package only the current version instead of all versions; the package won't open in earlier releases, but it is much faster. Big kudos to #Kory Kramer for being so helpful with this. Hopefully these bugs are worked out soon so we can rely on attachments without compromising packaging speed.
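For anyone wanting to try the tool route described above, here is a hedged sketch of calling Package Map from a script with the current-version-only option. The keyword names are assumptions from the tool's signature and may vary by Pro version; the import is guarded because arcpy only exists in an ArcGIS Pro Python environment, and the output path is hypothetical:

```python
# Sketch only: arcpy is available inside an ArcGIS Pro Python environment.
try:
    import arcpy
    HAVE_ARCPY = True
except ImportError:
    HAVE_ARCPY = False

map_pkg_options = dict(
    convert_data="PRESERVE",  # keep enterprise/online sources referenced
    version="CURRENT",        # current version only: much faster to build,
                              # but the .mpkx won't open in older releases
)

if HAVE_ARCPY:
    aprx = arcpy.mp.ArcGISProject("CURRENT")
    m = aprx.listMaps()[0]    # hypothetical: first map in the open project
    arcpy.management.PackageMap(
        m,
        r"C:\temp\MyMap.mpkx",  # hypothetical output package
        **map_pkg_options,
    )
```

Running it as a script also sidesteps the ribbon sharing path entirely, which is what seemed to behave worse in our testing.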
Just a note for others who come across a .ppkx, .mpkx, .lpkx, etc. that won't open natively in their version: the file is essentially a wrapper around file geodatabase(s) and other relevant objects, depending on the package type. For example, an .lpkx will contain the gdb data and a .lyrx file.
You can open the file with something like WinRAR or 7-Zip and explore the contents.
If you export with the option for multiple versions, there will be a subfolder for each version. Depending on your Pro version or the underlying data, there may be a single copy of the gdb data in a commondata folder; otherwise, each version folder can contain a complete duplicate of the gdb data.
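Since the package is just a ZIP archive, the same exploration can be scripted with Python's stdlib zipfile module instead of WinRAR/7-Zip, for example to check whether the gdb payload got duplicated under each version folder (the path in the commented example is hypothetical):

```python
import zipfile


def list_package_contents(pkg):
    """List the entries inside a Pro package.

    .ppkx/.mpkx/.lpkx files are plain ZIP archives, so ZipFile can read
    them directly; `pkg` may be a path or an open file object.
    """
    with zipfile.ZipFile(pkg) as z:
        return z.namelist()


# Example: spot duplicated gdb data across version subfolders.
# for name in list_package_contents(r"C:\temp\MyMap.mpkx"):
#     print(name)
```

Counting how many entries start with each top-level folder name gives a quick read on where the package size is going.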
I have had similar confounding behavior when exporting map packages: layers sourced via a UNC path instead of a mapped-drive path (the same data, mind you) would produce different export times or sizes, because the tool would go haywire and copy the data in one case but 'reference' it in the other.
I think the whole set of packaging tools needs a lot more options to control exactly what you want to happen.