Please let me clarify some statements:
When I say best practices, I mean that best practices are suggestions, plus trapping for the things that would prevent your service from working.
Best practices appear as warnings and messages in the analyzer; everything else shows up as errors, which won't allow you to publish the service.
Warnings and messages are something you (the service publisher) can choose to ignore or implement. We don't force these on you.
I understand your concern about re-running and adding 30 tool executions to a service. Honestly, throughout the beta nobody who came to us had more than 3 or 4 tasks per service, so running this many tools never surfaced as a concern. To me, 30 seems like a lot of tasks on a single machine, but if you're deploying that many then you must have the system resources to handle that many running instances.
At this point all I can do is offer you some suggestions to reduce the time it takes to republish 30 tasks. Hopefully you can make use of one of these in your publishing:
1) In the Geoprocessing Options in ArcMap (Geoprocessing > Options), you can set Results Management to "Never Delete".
With this setting in place, run every tool you want to publish; you'll get a result for each.
Save the MXD.
Any time you want to republish the service, you simply open that MXD, run the updated tool, and then put the service together from a combination of old and new results.
This method is also handy because if any data required for the service has gone missing, the analyzers should catch it.
2) Go to Windows > ArcMap Options > Sharing tab and check "Show file location when saving draft service definitions".
Run all of the tools you want to publish (you'll get a result for each).
Get into the Service Editor by sharing a result as a service.
Add all of your results to the service.
Click the "x" on the Service Editor and say YES to saving a draft.
Note where the draft file gets saved.
You can re-open the draft later and add or remove results from it when you need to update the service.
Note: The .SDDRAFT files act more like pointers, so if you start moving tools/data from where they were when you originally saved, you could break references.
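One small time-saver on top of this: once you've re-opened the draft in the Service Editor, updated the results, and saved it again, you could stage and upload the finished draft from Python instead of clicking Publish. Here's a minimal sketch; the paths and server connection file are just examples, and note that staging consumes the draft it is given, so stage a copy:

    import arcpy
    import shutil

    # Example paths - substitute the draft location the Service Editor reported
    # and your own ArcGIS Server connection file.
    saved_draft = r"C:\Users\me\Documents\ArcGIS\MyGPService.sddraft"
    staging_copy = r"C:\Temp\MyGPService.sddraft"
    sd = r"C:\Temp\MyGPService.sd"
    server_con = r"C:\GIS\connections\myserver.ags"

    # Staging consumes the .sddraft it is given, so work on a copy and keep
    # the original around for the next time you need to update the service.
    shutil.copyfile(saved_draft, staging_copy)
    arcpy.StageService_server(staging_copy, sd)
    arcpy.UploadServiceDefinition_server(sd, server_con)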
3) For 10.1 SP1 we're working on a Pythonic (arcpy) way to publish GP services.
It's early in testing and looks good so far, but until it's 100% I can't guarantee it'll make SP1.
In short, you could write a runner or execution script to power you through the publishing process.
The script calls each toolbox/task, runs it, and assigns each run to its own result object.
You take each result object and pass the whole list into arcpy.GPCreateSDDraft([list of result objects], , , ), a new function, then analyze it, stage it into a .SD, and finally upload it. There's a rough sketch of what this could look like at the end of this post.
This entire workflow is already well documented and exists for Maps. Like I said, we're working on the GP equivalent for 10.1 SP1; hopefully it'll come together like we want.
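To give you an idea of the shape of that runner script, here's a rough sketch. Keep in mind that arcpy.GPCreateSDDraft isn't released yet, so the exact function name, signature, and the analysis check below are my best guess rather than the final API, and the toolbox, task names, paths, and server connection are just examples:

    import arcpy

    # Example toolbox and tasks - replace with your own 30.
    arcpy.ImportToolbox(r"C:\GPServices\MyTools.tbx", "mytools")

    # Run every task you want in the service, hanging onto each Result object.
    results = [
        arcpy.mytools.BufferPoints(r"C:\Data\points.shp", "500 Meters"),
        arcpy.mytools.ClipFeatures(r"C:\Data\roads.shp", r"C:\Data\study_area.shp"),
        # ...one entry per task...
    ]

    sddraft = r"C:\GPServices\AllTasks.sddraft"
    sd = r"C:\GPServices\AllTasks.sd"
    server_con = r"C:\GIS\connections\myserver.ags"

    # Pre-release function; the signature and return value are assumptions on my part.
    analysis = arcpy.GPCreateSDDraft(results, sddraft, "AllTasks")

    # Only stage and upload if the analyzers came back clean.
    if not analysis.get("errors"):
        arcpy.StageService_server(sddraft, sd)
        arcpy.UploadServiceDefinition_server(sd, server_con)
    else:
        print(analysis["errors"])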