

Jon McNey

Differing philosophies in golf course ratings
« on: March 15, 2008, 11:38:09 PM »
Golf Digest, Golf Magazine, Golfweek--what are the panel biases?  Is one better than another? 

John Kirk

  • Total Karma: 4
Re: Differing philosophies in golf course ratings
« Reply #1 on: March 16, 2008, 12:43:58 AM »
That's a tough question, John.  Golf Magazine has a smaller panel of expert raters, who generally have some sort of pedigree within the game.  Golf tends to rate classic courses by famous designers higher, and their rankings closely reflect GolfClubAtlas sentiment.

Golf Digest has a large number of raters, generally low handicappers, who tend to place a premium on conditioning and difficulty.  Golf Digest's rankings tend to correlate least with GolfClubAtlas sentiment.

Golfweek has a bunch of clowns that Brad Klein has tried valiantly to train, but it is no use.  OK, Golfweek has 450 raters, with a fairly broad level of playing ability when compared to Golf Digest raters.  Golfweek's ratings correlate with GolfClubAtlas sentiment more than Golf Digest, but less than Golf.  Golfweek has separate lists for modern and classic courses, which I like a lot.

Golf rates with a single number, 1-10.
Golf Digest rates by combining several separate 1-10 ratings.
Golfweek rates in several categories, but the only one that matters is the overall rating, which again is 1-10.

My favorite ratings are Golfweek, followed by Golf and Golf Digest.  Although Golf uses a team of experts, I think they may be a bit too dogmatic.
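The three schemes described above can be sketched in a few lines. This is only a toy illustration: the category names and the equal-weight averaging are hypothetical, since none of the magazines publish an exact formula.

```python
# Toy sketch of the three magazines' scoring schemes as described above.
# Category names and equal weighting are assumptions, not published formulas.

def golf_score(overall):
    """Golf Magazine: a single 1-10 number per course."""
    return overall

def golf_digest_score(categories):
    """Golf Digest: combine several separate 1-10 category ratings
    (here, a plain average of the category scores)."""
    return sum(categories.values()) / len(categories)

def golfweek_score(categories, overall):
    """Golfweek: several categories are recorded, but only the
    1-10 overall rating drives the ranking."""
    return overall

card = {"shot values": 8, "design variety": 7, "conditioning": 9}
print(golf_score(8))                  # -> 8
print(golf_digest_score(card))        # -> 8.0
print(golfweek_score(card, 7))        # -> 7
```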

Doug Ralston

Re: Differing philosophies in golf course ratings
« Reply #2 on: March 16, 2008, 01:26:38 PM »
Quote from: John Kirk on March 16, 2008, 12:43:58 AM

That's a tough question, John.  Golf Magazine has a smaller panel of expert raters, who generally have some sort of pedigree within the game.  Golf tends to rate classic courses by famous designers higher, and their rankings closely reflect GolfClubAtlas sentiment.

Golf Digest has a large number of raters, generally low handicappers, who tend to place a premium on conditioning and difficulty.  Golf Digest's rankings tend to correlate least with GolfClubAtlas sentiment.

Golfweek has a bunch of clowns that Brad Klein has tried valiantly to train, but it is no use.  OK, Golfweek has 450 raters, with a fairly broad level of playing ability when compared to Golf Digest raters.  Golfweek's ratings correlate with GolfClubAtlas sentiment more than Golf Digest, but less than Golf.  Golfweek has separate lists for modern and classic courses, which I like a lot.

Golf rates with a single number, 1-10.
Golf Digest rates by combining several separate 1-10 ratings.
Golfweek rates in several categories, but the only one that matters is the overall rating, which again is 1-10.

My favorite ratings are Golfweek, followed by Golf and Golf Digest.  Although Golf uses a team of experts, I think they may be a bit too dogmatic.

Yes, but which of the three rate courses state by state and never send anyone into some of the states they rate? One? Two? Or all three?

"Don't know" should also be a possible response. If you rate based on opinions gleened otherwise, then "don't care" is true ranking.

Doug

Adam Clayman

  • Total Karma: 0
Re: Differing philosophies in golf course ratings
« Reply #3 on: March 16, 2008, 01:56:18 PM »


Quote from: Doug Ralston on March 16, 2008

Yes, but which of the three rate courses state by state and never send anyone into some of the states they rate?

Doug, The flaw in your premise is that they send anyone. Raters are not told where to go.
 
The state by state rankings can only be based on the information they have and get.
  Perhaps down the road, the numbers for KY will reflect the courses you seem to have such a beef with. But, something tells me you'll find issue with those when they come out, too.

"It's unbelievable how much you don't know about the game you've been playing your whole life." - Mickey Mantle

Tiger_Bernhardt

  • Total Karma: 0
Re: Differing philosophies in golf course ratings
« Reply #4 on: March 16, 2008, 01:59:58 PM »
John Kirk, that Golfweek comment makes me think you are jealous of another John K., John Kavanaugh, aka Barney Frank.

Norbert P

  • Total Karma: 0
Re: Differing philosophies in golf course ratings
« Reply #5 on: March 16, 2008, 02:17:28 PM »
 My rating system goes to eleven. 



Nigel Tufnel
"Golf is only meant to be a small part of one’s life, centering around health, relaxation and having fun with friends/family." R"C"M

cary lichtenstein

  • Total Karma: -1
Re: Differing philosophies in golf course ratings
« Reply #6 on: March 16, 2008, 02:25:05 PM »
Just think about how much money these 3 publications could save if they let Matt Ward be the sole rater. ;D
Live in Jupiter, FL; was a 4 handicap; played top 100 US, top 75 World. Great memories; no longer play after 4 back surgeries. I don't miss a lot of things about golf; life is simpler without it. I miss my 60 degree wedge shots; don't miss nasty weather, icing, back spasms. Last course I played was Augusta.

Jim Nugent

Re: Differing philosophies in golf course ratings
« Reply #7 on: March 16, 2008, 03:58:12 PM »
In all the magazines, don't the raters have to play the courses before they can rate them? 

Doug Ralston

Re: Differing philosophies in golf course ratings
« Reply #8 on: March 17, 2008, 09:00:57 AM »


Quote from: Adam Clayman on March 16, 2008

Yes, but which of the three rate courses state by state and never send anyone into some of the states they rate?

Doug, The flaw in your premise is that they send anyone. Raters are not told where to go.
 
The state by state rankings can only be based on the information they have and get.
  Perhaps down the road, the numbers for KY will reflect the courses you seem to have such a beef with. But, something tells me you'll find issue with those when they come out, too.



And Adam;

What exactly is that 'something' that whispers such assumptions in your ear?

Adam, I could live with people who play the courses and see them differently from myself. That is natural. What bothers me is a ranking given to courses unplayed, based on whatever rumor. So, again, why not admit what you do not know and give no list for unvisited states? You know, I assume, how these lists actually affect play on the courses named vs. those not named. By naming a course 'Best in State', you affect all other courses there. People from outside who go to, let us say, North Dakota, and play some mediocrity the magazine never visited rather than Bully Pulpit or Links of ND will likely think: "this is ND's best? No reason to come back here." But if those nice courses are indeed rated correctly, obviously those visitors are encouraged to return. These ratings, and their quality, do matter.

So Adam, my answer is an unequivocal maybe! I would not object if some course of obvious high quality in KY, say Old Silo or Stonecrest, were rated at the top. People who play those will still want to see more. But the last two years' picks were Cherry Blossom [only fair] and Quail Chase [OMG!]. Those are likely to completely misrepresent the available quality of courses in Kentucky .... and that does cause 'issues' for me.

Adam, I purposely did not name the Trail courses, though I strongly believe them best. Why? Because I know some people there, and keep in touch. I KNOW they were not visited. But! If I am wrong, and someone here did visit/rate them, please have them send me a PM and describe what they saw, so I can see how wrong I was.

But this isn't even about Kentucky; substitute any 'out of the way' small state and it is still accurate. A badly wrong rating, especially at the top, hurts play across the state from visitors.

"I don't know". It's not that difficult to say. If no one was 'sent' or went voluntarily to a state, then suitable list is:

1. Not assessed
2. Not assessed
3. Not assessed
4. Not assessed
5. Not assessed

Easy!

As for 'the numbers'; what are they? How can you tell without looking?

Doug

Chip Gaskins

  • Total Karma: 0
Re: Differing philosophies in golf course ratings
« Reply #9 on: March 17, 2008, 09:52:00 AM »
I assume raters can show up at a course and play without announcing who they are or why they are there?  I don't think identifying yourself is a requirement by any of the three magazines, no?
« Last Edit: March 17, 2008, 10:05:34 AM by Chip Gaskins »

Doug Ralston

Re: Differing philosophies in golf course ratings
« Reply #10 on: March 17, 2008, 09:54:55 AM »
Quote from: Chip Gaskins on March 17, 2008

I assume raters can show up at a course and play without announcing who they are or why they are there?  I don't think identifying yourself is a requirement by any of the three magazines, no?

Thus my invitation to any who did that to PM me. I really wanna know.

Doug

Adam Clayman

  • Total Karma: 0
Re: Differing philosophies in golf course ratings
« Reply #11 on: March 17, 2008, 10:19:08 AM »
Doug, I suspect that fewer than 10% of raters are members of this site. So, if you don't get any PMs, you will likely continue to indict the system.

One other fact that seems to have slipped your premise: a course does not receive a rating unless a minimum number of votes is submitted. Your assertion that a "Not assessed" notation would be more accurate is plain wrong. If they are on the list, they were assessed.
Now, if you want your Trail courses to get more ratings, I suggest you have someone in charge contact the magazines and ask them to inform their raters that they exist and would be receptive to having them come out and evaluate the courses.

Doug, It takes time for the dust to settle on any of the rankings. Be patient and if The Trails is worthy it will come out in the wash.

As for that little something, it is clearly your contrary nature that made me suspect what I suspect.
"It's unbelievable how much you don't know about the game you've been playing your whole life." - Mickey Mantle

Doug Ralston

Re: Differing philosophies in golf course ratings
« Reply #12 on: March 17, 2008, 11:20:04 AM »
Adam;

Thanks for the great news! One in ten. Then if just ten visits ever occurred to the whole state, odds are one person here will be sending me a PM describing what they experienced, won't they?  ;)
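As a back-of-the-envelope check of that "one in ten" argument (assuming, purely for illustration, that each visit is an independent draw with a 10% chance of the rater being a site member):

```python
# Chance that at least one of ten hypothetical rater visits to the state
# was made by a member of this site, assuming independent visits and a
# 10% membership rate among raters (Adam's figure above).
p_member = 0.10   # assumed share of raters who are on this site
visits = 10       # hypothetical total rater visits to the whole state

p_at_least_one = 1 - (1 - p_member) ** visits
print(round(p_at_least_one, 2))  # -> 0.65
```

So under those assumptions it is roughly a 65% chance, better than even but not the near-certainty the wink implies.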

I would agree with you about my contrariness, but that would negate the point, wouldn't it?  :D

Well, I got people. Maybe your plan will work.

Doug

Brian Cenci

Re: Differing philosophies in golf course ratings
« Reply #13 on: March 17, 2008, 12:14:36 PM »
Quote from: Chip Gaskins on March 17, 2008

I assume raters can show up at a course and play without announcing who they are or why they are there?  I don't think identifying yourself is a requirement by any of the three magazines, no?

It's not a requirement, and Golfweek tells its raters not to solicit play from any course considered "top tier".  Meaning, if you want to rate Inverness you find a way to play the course like a normal golfer.  You don't call up the pro and ask to come out and rate it; they know how good they are.

John Mayhugh

  • Total Karma: -1
Re: Differing philosophies in golf course ratings
« Reply #14 on: March 17, 2008, 12:17:49 PM »
Being a Kentuckian, it's easy for me to understand Doug's point.  It's OK if there is a difference of opinion on how a course is rated, but some of the results strongly suggest that worthy courses never even get considered. 

I would be interested to see each of the magazines post a simple table showing which courses were visited & the number of visits.  That would tell me how much merit to place in the rankings.  The results in KY are not very inspiring.  And I haven't even played several of the courses that Doug likes.

Brian Cenci

Re: Differing philosophies in golf course ratings
« Reply #15 on: March 17, 2008, 12:39:19 PM »
Quote from: John Kirk on March 16, 2008, 12:43:58 AM

That's a tough question, John.  Golf Magazine has a smaller panel of expert raters, who generally have some sort of pedigree within the game.  Golf tends to rate classic courses by famous designers higher, and their rankings closely reflect GolfClubAtlas sentiment.

Golf Digest has a large number of raters, generally low handicappers, who tend to place a premium on conditioning and difficulty.  Golf Digest's rankings tend to correlate least with GolfClubAtlas sentiment.

Golfweek has a bunch of clowns that Brad Klein has tried valiantly to train, but it is no use.  OK, Golfweek has 450 raters, with a fairly broad level of playing ability when compared to Golf Digest raters.  Golfweek's ratings correlate with GolfClubAtlas sentiment more than Golf Digest, but less than Golf.  Golfweek has separate lists for modern and classic courses, which I like a lot.

Golf rates with a single number, 1-10.
Golf Digest rates by combining several separate 1-10 ratings.
Golfweek rates in several categories, but the only one that matters is the overall rating, which again is 1-10.

My favorite ratings are Golfweek, followed by Golf and Golf Digest.  Although Golf uses a team of experts, I think they may be a bit too dogmatic.

John,
    I tend to disagree that Golf Magazine is as close to GCA sentiment as you suggest.  They tend, IMO, to have very "safe" rankings and overall I think are pretty good.  If the average golfer were to browse through their list you wouldn't really see anything that jumped out as being outrageous or crazy.  However, I tend to find myself more in disagreement with their public rankings.  They're missing a lot of courses and again, go very mainstream.  Just my opinion.  
     I think that GD has some crazy rankings sometimes that are out of left field.  Medinah No. 3 at #11 (ahead of NGLA, Pac Dunes, and Sand Hills, to name a few), Victoria National at #22 overall, Prairie Dunes at #30 (are you kidding me?).  Tullymore at #14 top public, Black Lake at #35 top public?  Some really off-the-wall rankings, IMO.  When you get into their state-by-state rankings it's even more head-scratching.  Find me one person who thinks Kingsley Club is the 16th best course in the state of Michigan, behind Red Hawk, Black Lake, Shepherd's Hollow, True North... sometimes common sense needs to prevail over the numbers.
    I think that Golfweek has the best starting point in separating modern and classic courses.  I don't know how a rater could ever compare Pacific Dunes vs. Crystal Downs, for example, like GD and GM do.  Two totally separate eras of golf and construction practices alone, even though Tom has the minimalist philosophy.  I wonder if Tom D. would say that Pacific Dunes is a better course than Crystal Downs (as it is ranked higher, #9 compared to #14, on Golf Mag's latest list in 2007).  Golfweek does a good job with the classics that other ranking systems tend to miss.  For example, Franklin Hills in Michigan.  What a great Ross course that many feel is superior to Oakland Hills, yet it isn't listed on GM's list and you won't find it until #21 overall on GD's best-in-state list for Michigan.

-Brian

John Kavanaugh

Re: Differing philosophies in golf course ratings
« Reply #16 on: March 17, 2008, 12:43:45 PM »
Quote from: Chip Gaskins on March 17, 2008

I assume raters can show up at a course and play without announcing who they are or why they are there?  I don't think identifying yourself is a requirement by any of the three magazines, no?

Quote from: Brian Cenci on March 17, 2008

It's not a requirement, and Golfweek tells its raters not to solicit play from any course considered "top tier".  Meaning, if you want to rate Inverness you find a way to play the course like a normal golfer.  You don't call up the pro and ask to come out and rate it; they know how good they are.


Shadow Creek and Pebble excluded.

Brian Cenci

Re: Differing philosophies in golf course ratings
« Reply #17 on: March 17, 2008, 12:49:36 PM »
John,
      There are always some exclusions.  Using BJP's should always prevail.

-Brian

Matt_Ward

Re: Differing philosophies in golf course ratings
« Reply #18 on: March 17, 2008, 02:48:01 PM »
John Mayhugh:

Your comments are indeed worth noting.

Oftentimes it's states that don't fly high on the rating radar that get shortchanged in terms of the information that comes forward. Knowing how many visits were made to specific courses would give you some idea of how valid the posted information on the so-called "best" courses in a given state really is.

I have issue with some of the results that have come from my home state of New Jersey in the Golfweek assessment. It's not as wide an issue as you and Doug have with Kentucky layouts but it does happen.

The likely situation is that results can be skewed when you have only a low number of visitors and the courses that are selected come from such a narrow base of comments.

I've often said that reviews of state courses should carry some weighting for those who actually live in the state, because they are far more likely to see and play such courses many times, rather than the single visit made by someone who is simply passing through.

Brian C:

Enjoyed your post and agree with many of your points.

Just realize this -- good public course information will likely not come from a group of people at Golf who are more likely to tee it up at some of the more private layouts here in the USA. To get really good public course info you have to be willing to play a few duds and criss-cross a good part of the country in order to have some sort of representative sampling. Many of the "mainstream" public courses you mentioned are simply thrown into the pie because of their past success -- sometimes from years and years ago.

Frankly, if you want to see the quality of any ratings check out the types of courses that make a public listing. If a magazine can really unearth those gems then you've got an information source that is clearly doing its due diligence and will be a benefit to its readers.

Last thought -- the position of Kingsley (via the Digest state poll) in MI is truly remarkable for its ignorance of just how good that layout truly is. Makes you wonder what people are really seeing and thinking about.


Andy Troeger

Re: Differing philosophies in golf course ratings
« Reply #19 on: March 17, 2008, 06:49:12 PM »
Quote from: Doug Ralston on March 17, 2008

I assume raters can show up at a course and play without announcing who they are or why they are there?  I don't think identifying yourself is a requirement by any of the three magazines, no?

Thus my invitation to any who did that to PM me. I really wanna know.

Doug

Doug,
I did a count and found that 40% of the courses I've submitted a rating for so far would not have known I was there. One of those was even in Kentucky! (Kearney Hill). However, I'm not with Golfweek so I probably don't count  ;D