I am using the Parallel Processing Factor to speed up a process that has been taking too long, but I have a question. My workstation has a 16-core processor, and each core has two logical processors. Given that, how many processes would it use if arcpy.env.parallelProcessingFactor were set to 50%, 75%, or 100%?
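For context, here is a minimal sketch of how that setting is applied, alongside a check of the machine's logical processor count. The "32" figure is an assumption based on the 16-core / 2-thread workstation described above, and the exact number of worker processes each percentage produces is what the question is asking about:

```python
import multiprocessing
import arcpy

# cpu_count() reports logical processors;
# 16 physical cores x 2 threads each should show 32 here.
print(multiprocessing.cpu_count())

# The factor is set as a percentage string ("50%", "75%", "100%")
# or as an absolute number of processes.
arcpy.env.parallelProcessingFactor = "50%"
```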
Why not use the multiprocessing module that comes with the baseline Python libraries? It typically splits things out by "thread" (logical core). At least that was the case on the 6-core i7 I had, which had 12 logical processors. We also use it for multiprocessing on virtual server environments, which don't normally expose logical cores, so there your CPU count equates to the number of cores on the machine.
More to the point, if your machine does report "logical processors", the multiprocessing module lets you split the work out over every logical core that exists. On my 6-core i7, I had 6 physical cores but could spread the processes across all 12 logical processors.
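A minimal sketch of that approach, assuming a generic per-item worker function (the `process_feature` name and its inputs are placeholders, not tied to any particular arcpy tool):

```python
import multiprocessing

def process_feature(item):
    # Placeholder worker: swap in your actual per-item geoprocessing logic.
    return item * item

if __name__ == "__main__":
    # cpu_count() returns logical processors (12 on the 6-core i7 described above).
    worker_count = multiprocessing.cpu_count()

    # One worker process per logical processor.
    with multiprocessing.Pool(processes=worker_count) as pool:
        results = pool.map(process_feature, range(100))

    print(worker_count, results[:5])
```

You can also pass a smaller number to `Pool(processes=...)` if you want to mimic something like a 50% or 75% factor and leave headroom for other work on the machine.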