
Tom_Doak

  • Karma: +1/-1
Re: GOLF DIGEST by the numbers
« Reply #25 on: April 03, 2009, 10:06:07 AM »
Jim N:

I don't want to get off topic, but you just don't understand the GOLF Magazine system.  It does not ask raters to rate courses they haven't played.  It only asks them not to rate 10 courses among the top 10 in the world if they have only played 5 or 6 of the consensus top 10.

Richard C:

Your idea of having everyone rate a few "correlation" courses would be a good one, but only IF you really believe that every panelist should have the same opinion on every category, and that such votes are objective.  I don't think that's the case.  I'd be fine with having a panelist who disagreed with me about "shot values" or "conditioning" as long as they had a consistent viewpoint.  But that's the whole point, the GOLF DIGEST system is trying to break things down so the process looks objective, and it's not.

Mark B:

ANYONE could come up with a system that got the top seven pretty much the same, because we all have known for years what those seven should be.  You judge these lists by what courses are in the second half, although getting Riviera and SFGC outside the top 30 (and behind The Club at Black Rock) is indicative of something, too.

Richard Choi

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #26 on: April 03, 2009, 10:42:04 AM »
Tom,

Calibrating raters based on pre-selected courses would not force every panelist to have the same opinion on every category. It just ferrets out whether a particular rater has a tendency to rate too high or too low, so that tendency does not adversely skew the scores of courses that few other panelists have played.

Given the sheer size of the panel, it is pretty clear that GD wants this process to serve as a reflection of the general golfing public. So GD certainly does not want the same opinion from all raters.

I could not agree more with your statement that the "GD system is trying to break things down so the process looks objective, and it's not". And that is the point I have been trying DESPERATELY to convey to Matt W (though I have not made much headway :) ). Either accept the fact that this process is purely subjective and take the results for what they are, OR change the process so that it truly IS objective.

Tom_Doak

  • Karma: +1/-1
Re: GOLF DIGEST by the numbers
« Reply #27 on: April 03, 2009, 11:01:32 AM »
Richard:

We seem to be getting farther apart:

1)  When you say a panelist rates a course "too high" or "too low", what the hell does that mean?  I'm not allowed to think that the conditioning of Pacific Dunes is really good for that course and context?  Because clearly, from looking at all the conditioning numbers, conditioning could be renamed "fast greens".

2)  Clearly GOLF DIGEST does not want the rankings to reflect their readers, since nearly all of the 900 raters are 5 handicaps or better, and they are asked to rate the course from the championship tees.

3)  How could you change the process of ranking courses so that it "truly IS objective"?

Matt_Ward

Re: GOLF DIGEST by the numbers
« Reply #28 on: April 03, 2009, 11:27:58 AM »
Tom D:

I agree getting a cross section of keen eyes is critical -- the handicap dimension needs to be broadened by getting others involved -- the final tally demonstrates that in a host of specific instances.

The only point I would mention is that Digest should rate because it wants to highlight what is great -- if that means it needs to educate and inform its readers then so be it. With each new rating it throws forward I have to wonder if the mag has people who really understand how far its efforts have fallen.


Phil Young:

You were really moving along quite nicely with your thoughts UNTIL ...

"A class of national raters whose views are looked at as more accurate is a falacy on its face. As raters must pay their own way, the first and primary qualification for being one can't be knowledge but would have to be can you afford to do it? That is the glaring weakness of that idea."

Phil, let me help you out, OK -- national raters have the wherewithal to do what's needed. There are people who have the KNOWLEDGE AND $$ to do what's needed. It's not an either/or situation. The problem rests with the due diligence that the magazine could employ to ascertain who these folks are. RW could serve in that role -- plenty of magazines engage outsiders who have clear expertise in whatever is being evaluated. Digest opened up the process to a Yellow Pages listing of people -- hence you get a result that's really worthless given the limited range & understanding of so many of the people.


Richard:

You just can't admit it. I indicated to you the need for "national" raters -- you simply co-opted the idea and inserted the tag "super." Thanks for coming around to my thinking -- if belated.

"Super" raters are valued not solely because they can travel. They are at that level because they have the skills needed for the kind of analysis / cross comparisons that's missing now.

Subjectivity will always reign in such matters -- the issue is getting a pair of eyes that can truly appraise greatness. From what I have seen of Digest's list, and also the groupthink mode of the Golfweek list, I see plenty of gaping holes.

Dan Boerger

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #29 on: April 03, 2009, 11:41:11 AM »

Interestingly, in the world of wine, the most trusted  (as evidenced by market price impact) raters are the single tasters (Parker, Broadbent, Tanzer) and not the panels (like Wine Spectator). That's why I am always more interested in what a single person can tell me about their course likes, dislikes and inevitable comparisons to other courses. As with a single wine critic, I can calibrate my tastes accordingly and then consider where to spend my hard earned money and my limited leisure time.
"Man should practice moderation in all things, including moderation."  Mark Twain

Matt_Ward

Re: GOLF DIGEST by the numbers
« Reply #30 on: April 03, 2009, 11:45:31 AM »
Dan:

Well said ...

Richard Choi

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #31 on: April 03, 2009, 12:07:56 PM »
1)  When you say a panelist rates a course "too high" or "too low", what the hell does that mean?  I'm not allowed to think that the conditioning of Pacific Dunes is really good for that course and context?  Because clearly, from looking at all the conditioning numbers, conditioning could be renamed "fast greens".

No, it is just a calibration tool. Say you have two raters; A tends to play mostly harder courses, and when he rates the 10 calibration courses, his difficulty ratings are lower than the average. B, on the other hand, tends to play easier courses, and when he rates the same courses his difficulty ratings are higher than average.

If you have a course that is rated by more "A" raters than "B" raters, its difficulty rating may be much lower than it should be. If you calibrate the A raters so that their difficulty ratings are weighted higher, you can offset the bias and get better overall ratings.
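Here is a rough sketch of that calibration idea, purely for illustration -- the rater names, scores, and the simple mean-offset adjustment are all assumptions, not anything GD actually does:

```python
# Illustrative only: calibrating rater bias against a shared set of
# "calibration" courses. All names, scores, and the mean-offset method
# are assumptions, not GD's actual procedure.

CALIBRATION_COURSES = ["Course 1", "Course 2", "Course 3"]

# Difficulty scores each rater gave the calibration courses (1-10 scale).
calibration_scores = {
    "Rater A": {"Course 1": 6.0, "Course 2": 5.5, "Course 3": 6.5},  # tends to score low
    "Rater B": {"Course 1": 8.0, "Course 2": 7.5, "Course 3": 8.5},  # tends to score high
}

# Panel-wide average for each calibration course.
panel_average = {"Course 1": 7.0, "Course 2": 6.5, "Course 3": 7.5}

def rater_offset(rater: str) -> float:
    """How far, on average, this rater sits above (+) or below (-) the panel."""
    diffs = [calibration_scores[rater][c] - panel_average[c] for c in CALIBRATION_COURSES]
    return sum(diffs) / len(diffs)

def calibrated(rater: str, raw_score: float) -> float:
    """Remove the rater's systematic bias from a score on any other course."""
    return raw_score - rater_offset(rater)

# A course seen mostly by "A"-type raters is no longer dragged down by their
# habit of scoring difficulty low:
print(calibrated("Rater A", 6.5))  # 6.5 - (-1.0) = 7.5
print(calibrated("Rater B", 8.5))  # 8.5 - (+1.0) = 7.5
```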

2)  Clearly GOLF DIGEST does not want the rankings to reflect their readers, since nearly all of the 900 raters are 5 handicaps or better, and they are asked to rate the course from the championship tees.

I would argue that it is a reflection of what die-hard readers aspire to be. I believe most GD subscribers can appreciate good architecture, but they just have not been exposed to as much variety or quality due to other factors. The GD list, to me, is an attempt to get an idealized reflection of what the subscribers want.

Think of it as an equivalent of electing a Congressional representative. They are usually more educated and affluent than the typical voters in the district, but they are still elected to represent everyone.

3)  How could you change the process of ranking courses so that it "truly IS objective"?

There are a lot of different ways to go. I have mentioned several already. Here are just a few initial suggestions.

-Rate individual holes. The tendency is for raters to be blinded by several great holes and to overlook some really weak ones. The holes really should be judged on their own.

-Ask finer questions. For resistance to scoring, you can ask a question like this: "If you miss your shot to the wrong side on this hole, how severely are you punished? Rate from 1 to 5" and "How easy is it to miss the shot to the wrong side? Rate from 1 to 5". Or for conditioning: "How soft is the landing area? Rate 1 to 5" and "Is the design and strategy of the hole consistent with the softness/hardness of the landing area? Rate 1 to 5". And so on...

-You also need to ask the same questions from multiple angles so that you can judge whether there are inconsistencies or hidden biases.
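A minimal sketch of how those finer, hole-by-hole questions could be rolled up and cross-checked -- the question names, equal weighting, and the flagging rule are all assumed for the example:

```python
# Illustrative only: hole-by-hole scoring from finer 1-5 questions, plus a
# simple check that two questions probing the same idea from different angles
# roughly agree. Question names, equal weights, and the flag rule are assumed.

from statistics import mean

# One rater's answers for a single hole, each on a 1-5 scale.
hole_answers = {
    "penalty_for_wrong_side": 4,           # how severely a miss is punished
    "penalty_for_wrong_side_reworded": 1,  # same idea asked from another angle
    "ease_of_missing_wrong_side": 3,
    "landing_area_softness": 2,
    "design_fits_firmness": 5,
}

def hole_score(answers):
    """Average the fine-grained answers into a single hole score."""
    return mean(answers.values())

def flag_inconsistent_holes(all_holes, q1, q2, gap=3):
    """Return hole numbers where two related questions disagree badly."""
    return [i + 1 for i, h in enumerate(all_holes) if abs(h[q1] - h[q2]) >= gap]

course = [hole_answers] * 18  # pretend all 18 holes got the same answers

print(round(mean(hole_score(h) for h in course), 2))  # course-level score: 3.0
print(flag_inconsistent_holes(course, "penalty_for_wrong_side",
                              "penalty_for_wrong_side_reworded"))  # all 18 flagged
```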
« Last Edit: April 03, 2009, 12:21:45 PM by Richard Choi »

Garland Bayley

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #32 on: April 03, 2009, 12:10:49 PM »
...
Bottom line, I think all the numbers are a farce designed to present the illusion of serious study.  I would love to hear a couple of DIGEST panelists tell me it's not so, and provide examples of their voting to back it up.


Exactly what my pardner Richard Choi was saying on the other thread. Way to go pardner! ;)
"I enjoy a course where the challenges are contained WITHIN it, and recovery is part of the game  not a course where the challenge is to stay ON it." Jeff Warne

Richard Choi

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #33 on: April 03, 2009, 12:20:30 PM »
Interestingly, in the world of wine, the most trusted  (as evidenced by market price impact) raters are the single tasters (Parker, Broadbent, Tanzer) and not the panels (like Wine Spectator). That's why I am always more interested in what a single person can tell me about their course likes, dislikes and inevitable comparisons to other courses. As with a single wine critic, I can calibrate my tastes accordingly and then consider where to spend my hard earned money and my limited leisure time.

Dan, the ratings of wine and golf courses are really different beasts.

If you are a wine taster, you can pretty much sample every bottle of note in a given year. You can't really do that with golf courses. It is hardly surprising that wine ratings are more consistent than golf course ratings since a single person can rate all of them.

How many golf raters have played all three or four hundred golf courses of note in this country? One? Two?

Golf course ratings rely on big panels because they have no other choice...

Matt_Ward

Re: GOLF DIGEST by the numbers
« Reply #34 on: April 03, 2009, 12:26:01 PM »
Richard said, "How many golf raters have played all three or four hundred golf courses of note in this country? One? Two?

Golf course ratings relie on big panels because they have no other choice..."

Candidly, it doesn't need to be that way.

There are people who have played nearly all the top tier layouts -- those that have been missed are likely to have been played by the others. We live in a real time 24/7 info age and I can tell you this -- there are more people than you think who have played the top tier and can very easily provide their opinions and thoughts.

When you add more and more people to the pie -- all you do is dilute the votes of those who have a wider base of courses to start with. I'll say this again -- your "super" raters (piggybacking off my national rater theme) would work well. You can have a very good idea, given the speed of info today, of what is happening with just about any course.

Ratings don't have to be done every year (although certain mags do it for promo purposes), and frankly many courses don't have to be re-seen if nothing of note has happened since they were first rated.

Greg Tallman

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #35 on: April 03, 2009, 12:27:51 PM »
Tom

But doesn't Golf Magazine rig the deck, too?  My reading of how they do it is that the position of the course in an individual's list receives a weighting.  Top 3 get a weighting of 100, 4-10 of 85, etc.  Who decides that a top 3 listing is 15 "units" better than 4-10?  And how did they come up with that?  What's the rationale that explains why #3 on a list is, what, 17.6 percent more important than #4?

And anyway we judge a tree by the fruit it bears:

Top 5, GD
Augusta
Pine Valley
Shinnecock
Cypress
Oakmont

Top 5, GM
Pine Valley
Cypress
Augusta
Pebble
Shinnecock

Looks to me like both trees produce apples.  Or something like apples...

Mark

EDIT: An improvement on Golf's methodology IMHO would be if the panelists were allowed to assign their own weights, ideally in a bounded, "hundred pennies" type of exercise.

Mark, I believe you are on the right track but slightly off... I think GOLF assigns the following groupings in terms of ranking a facility:

1-10
11-50
51-75
76-100
Not Meriting Consideration

Garland Bayley

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #36 on: April 03, 2009, 12:28:48 PM »

Dan, the ratings of wine and golf courses are really different beasts.
...

The number of variables in tasting wine are miniscule compared to the number of variables in evaluating courses.
"I enjoy a course where the challenges are contained WITHIN it, and recovery is part of the game  not a course where the challenge is to stay ON it." Jeff Warne

Jim Nugent

Re: GOLF DIGEST by the numbers
« Reply #37 on: April 03, 2009, 12:41:38 PM »
Jim N:

I don't want to get off topic, but you just don't understand the GOLF Magazine system.  It does not ask raters to rate courses they haven't played.  It only asks them not to rate 10 courses among the top 10 in the world if they have only played 5 or 6 of the consensus top 10.


Tom, that does not really matter.  If you have not played all courses in the "consensus" top 10,  how can you say which are in the top 3?  Any of the courses you missed could well belong there.   Maybe all of them.  

Same arguments apply to top 15...top anything.  Unless you have played all the reasonable candidates, you have to compare courses you never played.  And even that "reasonable" requirement has problems.  Consensus usually means self-perpetuating.    

I've given some examples before, and they bear repeating.  If you have not played Shinnie, CPC and Pine Valley, how can you put Pebble in the nation's top 3?  

Similarly, if you have never seen a Tom Doak or C&C course, and you rank Nicklaus, Fazio and Norman as the best three modern architects, of what use is your ranking?  

Quote
Mark, I believe you are on the right track but slightly off... I think GOLF assigns the following groupings in terms of ranking a facility:

1-10
11-50
51-75
76-100


According to Golf Mag online, that is not correct.  The top 3 courses get 100 points...4-10 get 85 points...11-25 get 70 points...and so on. 
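For illustration, here is a sketch of the bucket-to-points scheme as described in this thread; the 100/85/70 values come from the posts above, while the remaining bucket values, the zero for unlisted courses, and the plain averaging are assumptions filling in the "...and so on":

```python
# Illustrative only: the ballot-to-points scheme described in this thread.
# The top-3 = 100, 4-10 = 85, 11-25 = 70 values come from the posts above;
# the remaining buckets, the zero for unlisted courses, and the plain
# averaging are assumptions.

BUCKET_POINTS = [
    (3, 100),   # placed in a panelist's top 3
    (10, 85),   # placed 4-10
    (25, 70),   # placed 11-25
    (50, 55),   # assumed
    (100, 40),  # assumed
]

def points_for_position(position):
    """Map where a panelist slotted a course to that ballot's point value."""
    for upper_bound, points in BUCKET_POINTS:
        if position <= upper_bound:
            return points
    return 0  # not meriting consideration on that ballot

def course_score(ballot_positions):
    """Average the per-ballot points into the course's overall score."""
    return sum(points_for_position(p) for p in ballot_positions) / len(ballot_positions)

# Mark's 17.6 percent figure is just the jump between the first two buckets:
print(round((100 / 85 - 1) * 100, 1))        # 17.6
print(course_score([2, 5, 8, 12, 30, 101]))  # one course across six ballots -> ~65.8
```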


Tom_Doak

  • Karma: +1/-1
Re: GOLF DIGEST by the numbers
« Reply #38 on: April 03, 2009, 12:47:46 PM »
Jim:

First of all, I doubt there are ANY panelists on the GOLF Magazine ranking that have not played Shinnecock, Pine Valley OR Cypress Point.  I would bet that 75% of the panelists have played all three, and 90% two out of three.

But let's go down one step and say that a panelist has only played 7 of the top 10 courses.  The instructions then are not to vote for three "top three" courses and ten "top tens", but to vote for seven "top tens" and use your judgment on how many "top three" votes you assign.  I really do not know how they handle the numbers; they've changed that since I used to run the thing.

Besides, nobody is arguing about the top ten.  It's the middle and bottom of the list where all the arguments happen.

Jim Nugent

Re: GOLF DIGEST by the numbers
« Reply #39 on: April 03, 2009, 12:59:15 PM »
Tom, plenty of arguments about the order of the top 3.  Same with the top 10 and the other categories. 

But the point I made still stands.  Take the guy who played 7 of the top 10.  Say he played Pebble and CPC, but not Shinnie or PV.  He ranks Pebble in the top 3.  That's a problem. 

Also, you have a chicken and egg dilemma. 


Phil_the_Author

Re: GOLF DIGEST by the numbers
« Reply #40 on: April 03, 2009, 01:41:22 PM »
Sorry Matt, but we seem to be disagreeing once again.

You stated, "Phil, let me help you out OK -- national raters have the wherewithal to do what's needed. There are people who have the KNOWLEDGE AND $$ to do what's needed. It's not an either or situation..."

Remember Matt, it is your proposition that this be done and that the "national raters" are the cream of the crop, so-to-speak. Now that is all well and good, but the FINANCIAL & TIME ABILITY to play MANY, MANY courses on a national level REQUIRES that the financial aspects come first.

A person who doesn't have the wherewithal to play 50-100 courses a year all over the country, with an overseas trip or two thrown in for laughs, might be a MUCH BETTER rater and evaluator of courses than many who can.

Since you can't guarantee that, say, the top 50 raters have the wherewithal to do this traveling, the idea of making a superior "national rater" category also becomes fallacious...

It's not the idea but the logistics that damn the idea...
 

Greg Tallman

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #41 on: April 03, 2009, 02:06:15 PM »
Quote
Mark, I believe you are on the right track but slightly off... I think GOLF assigns the following groupings in terms of ranking a facility:

1-10
11-50
51-75
76-100


According to Golf Mag online, that is not correct.  The top 3 courses get 100 points...4-10 get 85 points...11-25 get 70 points...and so on. 



Interesting, must be relatively new.

Wayne Wiggins, Jr.

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #42 on: April 03, 2009, 02:09:44 PM »
Can't believe I got sucked into this... I swore I wasn't even going to read this issue, these rankings, etc.  

Until someone sent me the link where a course about which I have obviously biased feelings (Aronimink GC) was highlighted.  And while I couldn't care less about where it was ranked, what struck me was the fact that four years ago it was ranked something like 51, then two years ago it wasn't even ranked, and now it's 70-something.  Nothing happened to the course during this period, and nothing was done to many of the courses that stayed on the list ahead of it either.

Can someone explain how that happens?  I just don't understand.

Doug Siebert

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #43 on: April 05, 2009, 04:30:05 AM »
Wayne,

Let's think about the case of a guy who rates Aronimink 50th (I know they don't list courses from 1 to 100, but we'll assume the rating he assigned would rank it there if everyone agreed with him).  He sees it is 51st when the ratings come out and he's fine with that, as he mostly agrees the courses rated higher are better and the courses rated lower are not as good.  Next time he sees it is out of the rankings entirely.  He thinks that's wrong, so the time after that he adjusts his ratings to effectively rank it 30th to compensate for the other raters being, in his mind, wrong.

Likewise, there could have been some guy who rates it 80th.  He sees it's 51st when the ratings come out, so possibly he leaves it off his ratings next time, thinking that if everyone else is overrating it he'll adjust things to make up for that.

Unless the population of raters is changing a lot every couple of years, it is either this or there is no consistency in the numbers raters come up with.  If you played it once 10 years ago, and you don't keep any notes to refer to or even look at the values you assigned it last time, then your own ratings will fluctuate a lot.  Maybe you review the ratings from last time to refresh your memory and the absence of Aronimink stands out to you.  You don't even consciously try to rate it higher, but just having it in your mind recalls memories of your experiences playing it, which unconsciously earns it a higher rating from you this time.  These guys are human; there's no objective way to do this, and those types of things will happen.

I wouldn't be surprised if there was some lobbying that occurred from some of the members who didn't like it when their club dropped off the list, so they called all their friends who they know are raters to ask them to remember Aronimink this time.  Not saying there is necessarily anything untoward going on; if all they said was to remind them of a round they played together a few years back, so the course is fresh in the mind of the rater, that's all it would take to pump the numbers back up again.


I think Matt Ward's idea of "super raters" might have some merit.  Given the choice between trusting the opinion of 1000 guys who have played some of the courses on the top 100 but on average probably playing well under half of them, versus the opinion of 20 guys who have played almost all of them, I'd go with the latter.  I know a lot of guys in GCA are raters and would hate to be kicked out of the party, so there's a lot of people against this viewpoint, but I think he's right.  I realize he's exactly the kind of guy who might merit selection as a super rater so he may have his biases as well, but that's no reason to reject his idea out of hand.

You could extend the super rater idea to state rankings as well.  The set of people who have played the top 30 in Iowa or Maine isn't necessarily going to have a lot of overlap with those playing all the courses of note nationwide.  But that doesn't matter, because no one really cares how the 10th best in Iowa compares to the 10th best in Maine or the 100th best in California.  It would be useful for the national super-raters, though: if, for instance, a new course in Iowa was rated higher than The Harvester, they might think it merits a visit, along with revisiting The Harvester, to see if this course deserves mention nationally or if The Harvester has lost something and deserves a demotion.
My hovercraft is full of eels.

Jonathan Cummings

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #44 on: April 05, 2009, 08:32:10 AM »

I think Matt Ward's idea of "super raters" might have some merit.  Given the choice between trusting the opinion of 1000 guys who have played some of the courses on the top 100 but on average probably playing well under half of them, versus the opinion of 20 guys who have played almost all of them, I'd go with the latter. 

Doug - I'll take your speculation a step further.  Assuming Matt is right and the best rating list is based on votes ONLY from raters who have seen damn near all of the top 100 courses, I suspect you don't have an appreciation of how few raters your panel would have.  The average share of the top 100 courses seen by a GD panelist (pure guess on my part) is probably closer to 10-20%.

Look at GCA.com - a group of passionate golfers where almost all are interested in seeing a wide range of golf courses because they are curious about these courses' architectural merits. 

I'll bet of the 1500 on this panel no more than 50 or so have seen half of the GD list.

JC   

Andy Troeger

Re: GOLF DIGEST by the numbers
« Reply #45 on: April 05, 2009, 08:39:21 AM »
Doug,
And to take Jonathan's remarks one step further--the few guys that have seen 50% of the GD list are likely raters that have only seen that many because of the access that being a rater has granted them. Sure, you might find a few people that are that well connected otherwise but likely not all that many. Even Matt Ward was a GD panelist for 17 years--I'm sure he saw a lot of courses during that time.

Eric_Terhorst

  • Karma: +0/-0
Re: GOLF DIGEST by the numbers
« Reply #46 on: April 05, 2009, 11:51:20 AM »

I think Matt Ward's idea of "super raters" might have some merit.  Given the choice between trusting the opinion of 1000 guys who have played some of the courses on the top 100 but on average probably playing well under half of them, versus the opinion of 20 guys who have played almost all of them, I'd go with the latter.  I know a lot of guys in GCA are raters and would hate to be kicked out of the party, so there's a lot of people against this viewpoint, but I think he's right.  I realize he's exactly the kind of guy who might merit selection as a super rater so he may have his biases as well, but that's no reason to reject his idea out of hand.

Maybe I'm missing something, but how does this differ from Golf Magazine's panel, described as follows:

"Our rankings are guided by our panel, whose 100 members represent 15 countries. The men and women who cast their votes include major-championship winners, Ryder Cup players, architects, leading amateurs, journalists and a cadre of nearly a dozen course connoisseurs who've had the doggedness to play all Top 100 Courses in the World." 

If a group of GD's raters, or anyone else for that matter, thinks they are "Super" they should set up their own publication and try to attract a following, like Robert Parker of wine-rater fame.  Put up your list, see if anybody cares, and try to make a living from it, knock yourselves out. 

Interesting to me is Golf Magazine's "10 Most Overrated Courses"

10  ANGC
9  Harding Park
8 Sahalee
7 Bandon Trails
6 The Country Club (US Open Course)
5 Champions GC
4 Torrey Pines North
3 Pinehurst #2
2 Pebble Beach
1 Medinah

Since ANGC, Pebble, and Pinehurst are in Golf's top 15 (2007 list), I guess the purpose of listing the "Most Overrated" (by what measure or according to whom is not clear) is to send a message to the esteemed panelists:  "You're wrong"    ???

This reinforces Mr. Schmidt's interesting comment that "The greatest achievement a ranking can attain is to be taken seriously and doubted simultaneously."  This happens on the Golf Magazine web site, all in one place !!!

All of these ranking systems appear from the peanut gallery to be wildly flawed, to be used for entertainment purposes only.  The only reason to use more than one moment or brain cell trying to "fix" them is that they have such an important commercial impact.





Matt_Ward

Re: GOLF DIGEST by the numbers
« Reply #47 on: April 05, 2009, 03:07:45 PM »
Phil:

Super raters, or whatever one wants to call them -- can work. You don't need a laundry list of Yellow Pages people who really are more connected locally / regionally than anything else.

Phil, you have this incorrect thinking that coverage could not happen. Let me point out, Phil, that I am no millionaire, yet I see and play the courses that have gained attention. If one takes the time to monitor various info sources such as GCA and others, you can find the courses that are clearly making waves in a variety of ways. In years past, getting detailed info on what was happening with different courses was more difficult -- that's not the case now.

I have my own sources of info, and they keep me abreast of what is happening in different sectors of the USA and elsewhere. One doesn't need to make visits to all courses, because many of these same courses have not really done anything of note to merit a visit / revisit or to be bumped higher, lower or considered at all.

Phil, many people aspire to be raters solely for the access aspect. It's a good bit more involved than that, and anyone who wants to be considered a bona fide national rater needs to demonstrate the desire to make the visits and see firsthand what is happening.

Many people can serve a valuable contributing role as a state / regional rater. I have said many times that state ratings should be weighted more toward those who live in the area, because they have likely played such courses more times than someone just dropping in for a one-time visit. When national ratings are weighed, you then need a certain core group of people who can make the cross comparisons that are lacking now. All you have is one static group of people who have played course "A" -- and an entirely different group of people who have played / rated course "B."

Phil, the idea is not "fallacious" -- people exist now who can do the job. The issue is more with the magazines believing that more and more people will add to the info that's inevitably produced. It does not, and the results one is seeing demonstrate that the existing system is indeed wanting in so many ways.

Doug S:

You are so right about the logrolling that goes on, with elite clubs reaching out to their "rater" contacts to push their clubs. Once you throw into the equation people who have relationships with others, it clearly makes plenty of people wonder if the final result is more related to real outcomes or just to who has the better Rolodex.

Doug, I think so much of what can be done today can be internalized, with little of the outside interference you see today. If you get a chance, read the reviews Laura Landro provides for the WSJ in their Friday / weekend pages on facilities she visits. They are well written, and although I might disagree with her accounts from time to time, I like the probing nature of what she provides. National raters can get beyond all the VIP preference treatment that many get simply to sway them into a course's corner.

I also like your idea on the state side of things.

Eric T:

No one doubts we are debating the subjectivity of this issue. There are no 100% right answers. But when the totality of what is produced is deemed to be America's greatest courses, you have to look at the range of other courses that get missed -- often it's the ones that get little attention or notice because they don't host majors, have all-star membership lists or provide VIP butt-kissing moments, etc, etc.

Let me point out that I didn't create the tag "super" to describe such people. They would simply be "national" raters. No doubt even such a grouping would make errors -- rating some courses too high, too low or not at all.

Eric, the existing system is flawed and can be strengthened. It would be no different than the small group of people who do the selection and seeding of teams invited to the NCAA b-ball tournament. If you notice, that group as a whole has done quite well in getting it right most of the time, not only with the teams invited but with the seedings as well.

One final thing -- the "most overrated" list does illuminate a number of interesting dimensions. That doesn't mean that a few of the courses listed should not be rated among the elite -- just not as high as they are. In fact, I agree on a number of them listed.


Andy:

Trust me -- there are people who have played more than 50 of the courses on the list and they are not raters. I know a fair number of people who have done this. The issue is that quality info is available, and people who truly love golf don't need the special invite of being a rater to do it.

Jonathan:

I agree with your take -- that few have played more than 50 of the GD list. When that happens, you get people who will likely ELEVATE those in their "neighborhood" to extra high levels. What's interesting is that while I believe the NYC metro area has the best overall depth of private clubs in the nation, I do believe a few of those clubs have glommed on to the status / star power of the ones located near them.

Andy Troeger

Re: GOLF DIGEST by the numbers
« Reply #48 on: April 05, 2009, 03:16:31 PM »
Matt,
Could you provide an estimate for "fair number" in your previous post? That could mean anything from 5-10 to hundreds. What percentage of them would be good raters?

Matt_Ward

Re: GOLF DIGEST by the numbers
« Reply #49 on: April 05, 2009, 07:54:42 PM »
Andy:

Let me just answer your question this way -- I have roughly 20-25 key sources scattered throughout the USA. Plenty of them are not raters for any magazine -- either now or previously. They love golf as much as anyone on this site. The info I receive from them is reliable, and while we may disagree on certain specifics, it is generally very helpful for me in determining my trip itinerary for the year ahead.

My point made previously is that in today's 24/7 info age, and with the benefit of the Internet, you can access a whole range of up-to-date info that previously was much harder to come by.

GD could very easily reduce the scale of "national" raters to a number of, say, 50-75 people tops. These people do get to the key places, and I can say from the contacts I have developed that the info I both receive and share has really helped me.

GD favors a Yellow Pages listing that does only one clear thing -- it elevates state / regional raters to the equal stature of those who really provide meaningful numbers at the national level. What that does is push down the more accurate info received from a few key people, while elevating the presence of those who spike rating results for courses in and around their own respective areas.

Just my opinion ...

