Lou,
I'm simply remembering the chaos each time GD released its lists. The specific accusations and examples escape me, but surely you remember the threads? My impression was that some courses would go to significant expense to lavish raters with gifts and the like.
Mind you, I'm only getting that information from these threads, with no actual experience in that regard, but I assume there is some level of truth to it.
I have a question about your last paragraph: is a wider array of opinions in the production of a single list really a good thing? In the context of a discussion group? Absolutely. But in generating a list on a purely subjective topic such as golf course quality, does aggregation not equal dilution?
Unfortunately, a lot of bad information is posted here, much of it hearsay, and some of which has been augmented to reinforce a POV. I suspect that the universe of raters compares favorably to the general population in terms of on-course behavior, though no doubt some inappropriate conduct is displayed at times.
Both panels that I am familiar with stressed strong codes of conduct and near-zero tolerance for violations. In fact, I thought them too strict because of a lack of due process: a complaint from a club could result in an individual being dismissed without the opportunity to present his side of the story.
Unfortunately, there is no shortage of individuals attracted by the seeming glamour of the industry who lack the demeanor and service orientation required by some very demanding club positions (think of the turnover at the assistant professional level). Over the years I've fielded countless gripes about members, daily-fee golfers, raters, golf architects, maintenance staff, etc. Some may have been valid; most didn't warrant further thought.
One that really saddened me was a claim by a self-described "second in command" at a well-known multi-course facility in FL. We were early in a round on one of the best courses in the UK and he, as a comped guest of a golf writer/photographer doing a piece on the club, posited that one of his courses would have easily made a top 100 list had he approved a $20k ad placement in the magazine. I know the publication well and I am 100% certain that he made the story up, but that's how these things get started.
As to your question, I probably need to think a bit more about it. But assuming a large enough number of ballots across a wide number of courses, careful selection of raters, clear criteria for evaluating a course, and proper training to at least calibrate the scale, a wider range of perspectives should result in superior rankings. The dilutive effect, again assuming large numbers, should mostly wash out: it may pull the overall values lower across the rankings, but the relative order should stay largely consistent.
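To make the arithmetic behind that hunch concrete, here is a minimal sketch in Python. Everything in it is made up for illustration: the "true" course qualities, the size of each rater's personal bias (easy vs. hard grader), and the per-course noise (taste, conditions that day) are assumptions, not any panel's actual method. The point it shows is just this: a single ballot has inflated extremes and a scrambled order, while averaging many ballots trims those extremes (the dilution you describe) and the relative order converges on the consensus.

import random
import statistics

random.seed(1)

# Hypothetical "true" qualities for ten courses on a 1-10 scale (assumed).
TRUE = [9.5, 9.0, 8.6, 8.2, 7.8, 7.3, 6.9, 6.4, 5.9, 5.4]

def one_ballot(bias_sd=0.7, noise_sd=1.0):
    """One rater's ballot: true quality plus a personal bias
    (easy/hard grader) and per-course noise (taste, that day's conditions)."""
    bias = random.gauss(0.0, bias_sd)
    return [q + bias + random.gauss(0.0, noise_sd) for q in TRUE]

def rank_order(scores):
    """Course indices sorted best-first by score."""
    return sorted(range(len(scores)), key=lambda i: -scores[i])

true_order = rank_order(TRUE)
for n in (1, 5, 50, 500):
    ballots = [one_ballot() for _ in range(n)]
    # Aggregate by averaging each course's scores across all ballots.
    avg = [statistics.mean(course_scores) for course_scores in zip(*ballots)]
    spread = max(avg) - min(avg)
    agrees = rank_order(avg) == true_order
    print(f"{n:3d} ballots: top-to-bottom spread {spread:.2f}, "
          f"order matches consensus: {agrees}")

Run it and the spread of the averaged scores shrinks toward the true gap between best and worst as ballots accumulate, while the ranking stops flipping. That is the sense in which dilution and aggregation offset: lower, flatter absolute numbers, but a more reliable relative order.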
I see greater danger, including the possible conflict issues alluded to by JC, in having a small number of like-minded raters with no criteria by which to derive a value. It is certainly easier to manage that process, but I think it makes the product less useful for a wide variety of golfers. I've perused Golf's list and I can definitely see Ran's and gca.com's influence. And that's OK. Many of us know which of the major three speaks to our subjective preferences and can use the information accordingly.