This is a tough topic.
It's really hard to measure the exact impact of an SEO technique, even though it's obvious the results are there, because indexing and SERPs are highly time-dependent as well as highly content-dependent.
Scientifically comparing two SEO techniques would only make sense if we could run both under the same conditions for some time (a year or two), but we can't reproduce the exact same conditions: if we started two websites with identical content to make sure the only difference is our SEO trick, the duplicate content itself would interfere too much to draw any conclusion. Even so, I'm ready to bet good SEO techniques would give the advantage (SERPs and visits) to the website using them.
What's doable is to test whether something blocks indexing, or whether something generally helps to build better SERPs.
And even that doesn't necessarily mean many more visits, depending on the associated search queries and keywords (ranking first for a search query nobody ever performs isn't worth much).
If we compared, for example, a phpBB forum with SIDs against one that had only gotten rid of SIDs, I'm sure the one without SIDs would perform better. But this strongly depends on other factors; we all know phpBB.com is PR9 and fully indexed despite its SIDs.
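To illustrate what "getting rid of SIDs" means in practice, here's a minimal sketch (not phpBB's actual code; the function name and the `sid` key are my assumptions) of stripping the session-ID parameter so that every page keeps a single canonical URL:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_sid(url, param="sid"):
    """Remove the session-ID query parameter so every visitor
    (and every crawler) sees one canonical URL per page."""
    parts = urlsplit(url)
    # Keep every query parameter except the session ID.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))

print(strip_sid("http://example.com/viewtopic.php?t=42&sid=abc123"))
# -> http://example.com/viewtopic.php?t=42
```

Without that, crawlers see the same topic under endless `sid=` variants, which dilutes indexing.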
What I think is that SEO is a way to go faster: faster indexing, faster building of good SERPs, and faster traffic growth, compared with the exact same website without it.
So saying how much time would be saved, even as a percentage, is rather hard. It depends on how much content the website will have and what kind, on the number of backlinks, and on the traffic itself, since traffic is also a parameter that plays a role in PageRank and SERPs.
Everything is relative SEO-wise; what matters most is what's being optimized, not the optimization itself.
What is for sure is that you'll get much better indexing results with mod-rewritten URLs containing good keywords, plus a Google sitemap, than with awful URLs full of SID parameters. Anyone doing this, even phpBB.com I'm sure, would see a difference in SERPs and traffic (real users, not bots, which is what Google Analytics shows anyway since it only counts visitors with JS enabled, so it's rather a good tool for this).
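As a rough sketch of the mod_rewrite idea (the paths and URL pattern here are made up for illustration, not a recommended setup), an Apache rule mapping a keyword-rich URL onto the real script could look like:

```apache
# /forum/phpbb-seo-tips-t42.html  ->  /forum/viewtopic.php?t=42
# The keywords live in the public URL; the script still gets its topic ID.
RewriteEngine On
RewriteRule ^forum/[a-z0-9-]+-t([0-9]+)\.html$ /forum/viewtopic.php?t=$1 [L,QSA]
```

The point is that crawlers then index one stable, keyword-carrying URL per topic instead of a SID-laden query string.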
But if you already started from static rewritten URLs, the difference would of course be less obvious.
You should find better SERPs on many search queries, but that doesn't necessarily mean more visitors, depending on the queries. Even though better SERPs should mean more visitors, it can be hard to measure if your website is already gaining traffic every month when you start SEOing a bit more.
You can evaluate the better positioning by looking at the number of distinct search queries that bring visits: SEO should at least increase the number of possible entry points, even if some of them only bring one visitor per year.
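Here's a minimal sketch of that kind of counting, assuming you have a list of search-engine referrer URLs pulled from your access logs (the `q` parameter as the search phrase is an assumption based on how Google referrers looked; other engines may differ):

```python
from urllib.parse import urlsplit, parse_qs

def distinct_search_queries(referrers):
    """Return the set of distinct search phrases that brought at least
    one visit, reading the 'q' parameter from referrer URLs."""
    queries = set()
    for url in referrers:
        params = parse_qs(urlsplit(url).query)
        for q in params.get("q", []):
            # Normalize case so "PHPBB SEO" and "phpbb seo" count once.
            queries.add(q.strip().lower())
    return queries

refs = [
    "http://www.google.com/search?q=phpbb+seo",
    "http://www.google.com/search?q=remove+sid&hl=en",
    "http://www.google.com/search?q=PHPBB+SEO",
]
print(len(distinct_search_queries(refs)))  # -> 2 distinct entry points
```

Watching that number grow month over month is a rough but honest proxy for the "more possible entries" effect I'm describing.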
A way to better evaluate SEO improvements would be to remove them and see what happens, but I don't think anybody would be willing to run that experiment.