Michael H:
I never remotely claimed superiority -- thanks for the plug though!
Bill S:
I like the idea of pushing overall ratings out to a time frame of, say, 3-4 years. That would give sufficient time to see what is happening at different courses.
Andy:
I don't doubt you are right.
I was a GD panelist for 17 years and always tried to convince Ron W that some sort of bifurcated panel would work best. What many people don't realize is that Digest years ago did have state/regional selectors and a smaller national panel -- many on the national panel were more of the who's who types in golf (e.g., Sam Snead, Gene Sarazen, etc.).
It's possible to streamline the process. I look at what Digest is producing for its top 100 and I'm just shaking my head in disbelief. When Ballyneal isn't even listed -- something is big time wrong. Ditto the omission of a place like Kingsley.
Sad to say, but Digest pays so much heed to private clubs, when the fact is that many public courses opened in the last 20-25 years are likely better than a good number of those on the list you see being trumpeted time after time.
I did create my own personal top 50 metro NYC listing. Going to a personal top 100 would take a bit of time to sort out -- likely I would split it into the top 100 private and the top 100 public courses I've played.
Bart:
Thanks for your comments -- however ...
You have people who weigh the value of one course and then completely different people weighing the value of another -- at no time do you have meaningful overlap from people playing a solid cross section of courses.
As I said before, there needs to be -- as Lou D pointed out -- some form of scaled voting power. Too many people are regional in their application of rating numbers -- some who do travel only see and play courses from other areas on a once-in-a-lifetime basis.
There are solutions for improving the overall process. No doubt the final result is still a subjective account. But simply adding more and more people to the overall rating numbers doesn't produce more meaningful analysis -- all it does is increase people.
Bart, your ideas on reform are somewhat patterned after what I and a few others have suggested -- a scaling that separates those who are really national panelists from those who are simply regional types.
Clearly, the process can be streamlined but the updated Digest list -- can't wait to see the state ratings -- is so inane as to be comical.
Jeff:
So what's your solution?
Add more raters to the braintrust they've got now!
Panelists can be broken down into regional and national levels. The people who handle the paperwork for the different mags know full well which people really get around and are therefore able to see and play the courses in question.
When you have one person who simply plays courses within a singular 100-mile circle and you give him the same weight as those who see 50 or more courses a year while traveling criss-cross the nation, you need to weigh such factors into the equation.
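Just to make the scaled-voting idea concrete, here is a minimal sketch in Python of how a ballot could be weighted by how widely a panelist gets around. Everything in it -- the panelist fields, the 50-courses and 5-regions saturation points, the 3x cap -- is an assumption for illustration, not any magazine's actual formula.

```python
from dataclasses import dataclass

@dataclass
class Ballot:
    panelist: str
    course: str
    score: float           # the raw rating this panelist submitted
    courses_per_year: int   # how many courses the panelist played in the past year
    regions_covered: int    # how many distinct regions those courses fall in

def voting_weight(b: Ballot) -> float:
    """Scale a panelist's influence by breadth of exposure.

    A purely local rater keeps a baseline weight of 1.0; someone who sees
    50+ courses a year across 5+ regions counts for up to roughly 3x.
    The saturation points and the 3x cap are placeholders, not real policy.
    """
    breadth = min(b.courses_per_year / 50, 1.0)  # saturates at 50 courses/year
    reach = min(b.regions_covered / 5, 1.0)      # saturates at 5 regions
    return 1.0 + 2.0 * (0.5 * breadth + 0.5 * reach)

def weighted_course_rating(ballots: list[Ballot]) -> float:
    """Weighted average of the scores submitted for one course."""
    total = sum(voting_weight(b) for b in ballots)
    return sum(voting_weight(b) * b.score for b in ballots) / total

# Two local raters and one well-traveled national panelist rate the same course.
# The national ballot carries roughly twice the weight of each local ballot,
# so the result (about 8.4) lands above the plain average (about 8.2).
ballots = [
    Ballot("local A", "Example GC", 7.5, courses_per_year=12, regions_covered=1),
    Ballot("local B", "Example GC", 8.0, courses_per_year=15, regions_covered=1),
    Ballot("national C", "Example GC", 9.0, courses_per_year=60, regions_covered=6),
]
print(round(weighted_course_rating(ballots), 2))  # -> 8.38
```

The point isn't the particular numbers -- it's that exposure can be folded into the math instead of every ballot counting the same.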
Tom Paul said it best: this is just a listing without any real sense of education -- it's just courses stuck together through some hodgepodge of numbers that certain individuals have decided to throw forward.
I'll say this again -- in the event you missed it -- there are ways to streamline the process and still get the necessary input from different levels. The question is whether the mags have the will to see the flaws that can be easily corrected.
Jeff, there is no perfect solution to this. I know that without doubt. I just think that if someone were really networked, you would get as much info -- if not more -- and could really elevate the top-tier places, many of which happen to be relatively newer courses, often by architects who don't have star-power status. That's all.