Mac,
Very interesting comments. Reminds me of the old statistics joke
about the guy with his head in the oven and his feet in the freezer. On average the temperature's perfect.
But the main thought your comments spark for me is the declining relevance of consensus in an age of personalization and fragmentation. Think, for example, about how readers traditionally chose which books to buy -- a book, like a round of golf, being an experience good. It essentially boiled down to one of three methods (leaving aside the ol' "judge a book by its cover"):
1) Bestseller lists -- essentially consensus
2) Review by someone the reader trusted (critics as well as friends, relatives)
3) Reputation of or past experience reading the author (author's "brand")
These days that's antediluvian. Bestseller lists probably count for much less than they used to as a selection tool, and why? Because of your rationale: these lists generally are a sort of lowest-common-denominator consensus, and pretty much always have been.
But in the past people didn't have the ability to supercharge Source #2 in the list above. Today, thanks to technologies like collaborative filtering, they do. Books, movies, all sorts of experience goods now rely less on consensus and more on presenting the consumer with items rated highly by others who share that consumer's preferences.
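(To make the idea concrete, here's a minimal sketch of that kind of filtering in Python. Every rater, course, and 1-to-10 score below is hypothetical -- just enough to show the mechanics; a real system would use a proper library and far more data.)

# Minimal collaborative-filtering sketch. All names and scores are made up.
from math import sqrt

ratings = {
    "you":     {"Course A": 9, "Course B": 4, "Course C": 8},
    "rater_1": {"Course A": 8, "Course B": 3, "Course C": 9, "Course D": 9},
    "rater_2": {"Course A": 3, "Course B": 9, "Course D": 2, "Course E": 8},
}

def similarity(a, b):
    """Pearson correlation over the courses both raters have scored."""
    shared = sorted(set(a) & set(b))
    if len(shared) < 2:
        return 0.0
    mean_a = sum(a[c] for c in shared) / len(shared)
    mean_b = sum(b[c] for c in shared) / len(shared)
    num = sum((a[c] - mean_a) * (b[c] - mean_b) for c in shared)
    den = (sqrt(sum((a[c] - mean_a) ** 2 for c in shared))
           * sqrt(sum((b[c] - mean_b) ** 2 for c in shared)))
    return num / den if den else 0.0

def recommend(me, everyone):
    """Suggest courses 'me' hasn't rated, weighted by agreement with each rater."""
    scores, weights = {}, {}
    for name, theirs in everyone.items():
        if name == me:
            continue
        sim = similarity(everyone[me], theirs)
        if sim <= 0:
            continue  # ignore raters whose tastes run opposite to ours
        for course, score in theirs.items():
            if course in everyone[me]:
                continue  # already played/rated
            scores[course] = scores.get(course, 0.0) + sim * score
            weights[course] = weights.get(course, 0.0) + sim
    return sorted(((c, scores[c] / weights[c]) for c in scores),
                  key=lambda pair: pair[1], reverse=True)

print(recommend("you", ratings))

The point isn't the math -- it's that the rater whose scores run opposite to yours (rater_2) gets ignored, so the recommendation comes only from the rater who grades courses the way you do.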
That to me is what makes the three distinct but ultimately similar approaches taken by Ran Morrissett, Tom Doak, and Darius Oliver worthwhile. Each provides a written exposition on courses, so in the cases of Doak and Oliver one can read the detail underpinning their scores. Ran's approach is different in that it lacks a quantitative assessment but similar in that it provides a qualitative one. Essentially his approach is binary: if he likes a course, he writes it up; the omissions themselves carry meaning.
Why is this approach superior? Because we can determine their preferences from their writing and thereby filter according to our own. Find someone who shares your preferences -- even better, find a group that shares your preferences -- then leverage their experiences. It's a rudimentary way to "collaboratively filter."
Magazine lists fail on these counts. They are yesteryear's bestseller lists, a product of differing utilities mashed together into consensus. I wonder whether something else will come along, and what it might be.
Interesting that Golf Digest has been proliferating lists of the toughest, most fun, and so on. Ham-handed and designed solely to sell magazines -- of course! -- but this patronizing "ghettoizing" may well be a harbinger. It certainly would be cool to build a "course chooser" and "course ranker" using the technologies available today.
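(Continuing the made-up sketch above, a bare-bones "course chooser" is little more than a cutoff on those personalized predictions; the 7.0 threshold is arbitrary.)

# Hypothetical "course chooser": keep only unplayed courses that raters most
# like us scored well. The 7.0 cutoff is arbitrary.
chooser = [(course, score) for course, score in recommend("you", ratings)
           if score >= 7.0]
print(chooser)  # with the toy data above: [('Course D', 9.0)]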
Mark
PS Jim Colton: dagger!