Well, this topic comes up often, and a lot has been said about it, much of it completely off the mark.
minstrel (the article author you mention) is right when he states that SEO does not account for the majority of a site's success, but his explanation of the difference between dynamic and static pages (not URLs) does not cover the whole topic.
The thing is, SEO is optimization, and optimization requires something to optimize: 0 × 100,000 is still 0. In the case of SEO, we are optimizing the content's impact in search engines, so we obviously need decent content before we can even talk about impact and success. That is the really hard part: content.
And more about content: not every type of content needs to appear in search engines. From an SEO point of view, the interesting content is content that is likely to interest many people and that can be displayed publicly (not private). This means that not every type of site needs SEO.
Then, since SEO is strongly tied to content, the SEO techniques used on a given site have to be adapted to it. Most of the basic ones are fairly universal, but there can be great differences in the best possible way to implement a technique, especially when it comes to URL rewriting (this is why we offer so many possibilities here with phpBB3 URLs).
One last thing before we go into more depth: a very simple fact to notice is that keywords in a URL don't only affect bots and search engines. You don't need to think very hard to see which of these two URLs makes more sense on its own:
If you add the fact that, in the given example, the text of the auto-generated link will contain interesting keywords in the first case (relevancy is the key here, hence the several URL rewriting modes offered here) and none in the second, you start to see the kind of side effects that can be going on.
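To illustrate the idea, here is a minimal sketch of building a keyword-rich URL from a topic title. This is hypothetical code for illustration only, not the actual phpBB SEO mod implementation; the `/forum/...-t<id>.html` pattern is just one assumed rewriting scheme.

```python
import re

def topic_slug(title: str, topic_id: int) -> str:
    """Build a keyword-rich URL from a topic title.
    Illustrative sketch only, not the actual phpBB SEO mod code."""
    # Lowercase, then turn every run of non-alphanumeric characters into a dash
    slug = re.sub(r'[^a-z0-9]+', '-', title.lower()).strip('-')
    return f'/forum/{slug}-t{topic_id}.html'

# A keyword URL carries meaning on its own:
print(topic_slug('Search engine optimisation for phpBB3', 1234))
# -> /forum/search-engine-optimisation-for-phpbb3-t1234.html
# whereas an opaque query-string URL like /viewtopic.php?t=1234 says nothing.
```

Note that the same slug also ends up as the anchor text of auto-generated links, which is exactly the side effect described above.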
Now let's try to go a bit deeper. The first thing to notice when talking about how a technique affects the way a search engine (SE) evaluates a site is that, as in quantum physics, the experiment alters the result.
If, for example, you wanted to benchmark a given SEO trick scientifically, the only direct way to do it properly would be to start two web sites or pages with this trick as the only difference, and measure the impact in SEs. The problem is that SEs moderate their results against many, many parameters, among which duplicate content is a key one. This means, for example, that two pages with the same content but different URLs will run into trouble in SEs.
This of course does not mean one cannot form a good opinion about an SEO technique, based on facts and empiricism, but there is no easy answer of the form "do this and get that".
The thing is, you'll even find examples where implementing SEO will barely change a thing (I'm only considering cases where no errors are made during the implementation, because a bad implementation can itself cause trouble).
That would probably be the case for the theadminzone.com forum, since many duplicates are indexed there (post URLs vs. topic URLs, mostly). So if they went for a clean URL structure without duplicates, which is unanimously recommended (at least by people who know what they are talking about), they would lose a lot of indexed pages at first.
Right now they have:
Threads: 52,234, Posts: 374,654, Members: 31,360, Registered Today: 7, Active Members: 3,167
for 126,000 indexed pages on Google.
With 52,234 topics and 374,654 posts, the page count should be around 52,234: 374,654 / 52,234 gives an average of 7.17 posts per topic, so the majority of topics should be a bit less than one page long.
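The arithmetic behind that estimate can be checked directly. A quick sketch using the numbers quoted above (the posts-per-page figure is an assumption based on phpBB's default setting; adjust it to the actual board configuration):

```python
topics = 52_234
posts = 374_654
indexed = 126_000        # pages Google reports as indexed
posts_per_page = 10      # assumed: phpBB's default posts-per-page setting

# Average topic length
posts_per_topic = posts / topics
print(f'average posts per topic: {posts_per_topic:.2f}')   # ~7.17

# A ~7-post topic fits on a single page, so with clean URLs the board
# should produce roughly one indexed page per topic.
expected = topics
print(f'expected ~{expected:,} pages, actually indexed: {indexed:,}')
print(f'surplus, likely duplicates: ~{indexed - expected:,}')
```

The surplus of roughly 74,000 pages over the topic count is what suggests heavy duplicate indexing.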
As you probably know, the total number of pages plays a significant role for SEs; they tend to give more overall importance to big websites.
So in this case, while removing all the duplicates will certainly please the SEs, the number of pages will decrease, so it is very likely that at first the SERPs and referrals would not change significantly with URL rewriting and duplicate reduction.
But I can tell you that in the long run it is worth it, because the overall quality of the content will significantly increase once there aren't as many duplicates as there are posts in each topic. There is no point in repeating the same content like this; even if it increases the number of indexed pages, it is not a good thing at all. It has been established for a while that duplicates are really bad SEO-wise and that SERPs are directly affected (a penalty is easier to demonstrate than a boost, because one can set up a test where the only difference between two pages is the URL, and thus measure the penalty very easily).
My bet is this will become more and more important as time passes and SEs get more resources to really deal with our content (computing more and more accurate statistics).
That being said, believe me, there are many cases where implementing some simple SEO techniques would make a tremendous difference.
The basics are obvious, but they are often not fully followed: one page, one title, one URL.
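To make "one page, one URL" concrete, here is a minimal sketch of URL canonicalization. This is hypothetical illustration code, not the actual mod: it collapses the query-string variants a forum can emit for the same topic onto one canonical URL (real post URLs, `?p=...`, would additionally need a database lookup to resolve the post to its topic).

```python
from urllib.parse import urlparse, parse_qs

def canonical_topic_url(url: str) -> str:
    """Collapse the URL variants phpBB can emit for the same topic
    (start offsets, highlight parameters, session ids) onto one
    canonical URL. Illustrative sketch only."""
    params = parse_qs(urlparse(url).query)
    topic_id = params.get('t', [''])[0]
    # Keep only the topic id: no sid, no highlight, no start offset
    return f'/viewtopic.php?t={topic_id}'

duplicates = [
    '/viewtopic.php?t=42',
    '/viewtopic.php?t=42&start=0&sid=abc123',
    '/viewtopic.php?t=42&highlight=seo',
]
# All three variants now map to a single page with a single URL
print({canonical_topic_url(u) for u in duplicates})
```

In practice the board would answer requests for the non-canonical variants with a 301 redirect to the canonical form, so SEs only ever index one URL per page.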
And the very first thing to achieve is full indexing of the site: no indexing, no SERPs; no SERPs, no referrals, period.
I recently posted about this at phpBB.com and noticed something interesting: http://www.phpbb.com/community/viewtopi ... 5&start=15
(you'll need to follow the link to find out the topic title)
So the question is, where did all these topics go?
Again, I'll repeat what was said earlier in the same topic:
dcz wrote:phpBB3 is just great as it is
For the same reasons stated above, SEO is not always needed, and there is no universal answer, because SEO is strongly tied to content, and because a good SEO strategy, which involves a lot more than just installing some code, must fit well with your marketing strategy.
So if the initial question was whether there are efficient techniques that can have a significant impact on search engines, the answer is yes, of course, and you'll find many examples on phpBB SEO of sites even going through a huge boost.
Anyone stating the opposite just does not know what he or she is talking about.
Again, it's hard to benchmark accurately, but give it a try, or take a deeper look at existing forums with no SEO: you'll see how often the very first basic principle (one page, one title, one URL) is not followed, and you'll also see that sites with URL rewriting are more often fully indexed.
With more time spent experimenting, you'll find that SERPs are affected as well, but that is where many more parameters come in, so only time and experience will make you understand that it is still better to have nice, relevant keywords in URLs, when of course that is possible.
The reason may just be as simple as stated above: keywords are keywords, they provide information to whoever reads them, including bots, and sometimes they are the only available source of information about the linked page, as in the example posted above.
There is just no way that bots would not use every source of information they can to evaluate a page; it's actually the opposite: SEs tend to use all the information they can collect and process about each page on the web.
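As a thought experiment (entirely hypothetical code, not any real crawler's logic), here is how little effort it takes to pull keyword evidence out of a rewritten URL path:

```python
import re

def url_keywords(url: str) -> list[str]:
    """Hypothetical sketch: treat URL path segments as keyword evidence
    for a page that has not been fetched yet."""
    path = url.split('?', 1)[0]
    tokens = re.split(r'[/\-_.]+', path.lower())
    # Keep word-like tokens; drop numeric ids and file extensions
    return [t for t in tokens
            if t.isalpha() and len(t) > 2 and t not in {'html', 'htm', 'php'}]

print(url_keywords('/forum/search-engine-optimisation-t1234.html'))
# -> ['forum', 'search', 'engine', 'optimisation']
# A query-string URL like /viewtopic.php?t=1234 yields almost nothing.
```

Whatever weight SEs actually give these tokens, a keyword URL hands them usable signal where an opaque one hands them none.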