
Jeff Bergeron

  • Karma: +0/-0
Process for developing the Golf Digest/Golfweek rankings
« on: February 07, 2015, 07:25:52 PM »
I have often wondered how these rankings are developed. The rankings are very important to the 'brand' of a golf course and are a source of pride for its members and, of course, spirited discussion on these pages. Specifically, I ask the following:

Are raters allowed to rate their own club? Or a club that denied them membership?

Do the supervisors of the rankings accept honorariums for visiting a golf course?

Do these supervisors consult on course restorations/renovations and/or participate in architect selection of ranked or potentially ranked courses?

Do these supervisors arrange, or accept an honorarium for arranging, large numbers of raters to visit a course with the expectation of influencing that course's rating?

Are other 'name' architects given undue influence with these supervisors in determining a course's rating?

If the answer to any of the above questions is yes, how does it affect the 'independence' of those supervising the rankings and, ultimately, the integrity of the rankings?

Frank M

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #1 on: February 07, 2015, 09:25:21 PM »
I don't get the big fuss over rankings. Why can't everyone accept them for what they are: a subjective list meant to elicit discussion? 
« Last Edit: July 05, 2024, 10:14:48 PM by Frank M »

Brad Tufts

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #2 on: February 07, 2015, 09:26:56 PM »
Hi Jeff,

I can speak in terms of GD, as I participate on that panel.

It seems you are driving toward some sort of conflict of interest, and I assure you great pains are taken to keep that from being a factor.

Panelists can rate their home courses or any others they play, but with 45 (I think) ratings over 8 years required to qualify for T100 status, a single panelist cannot shift the ratings very much. On a T100 course, someone with a supposed slight bias will not deviate more than a point or two in each category, which barely changes the average among 50+ ballots.
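
(To put rough numbers on that dilution effect - a hypothetical back-of-envelope sketch, not GD's actual math - here is what one inflated ballot does to an average of 50:)

ballots = [7.5] * 50                     # 50 hypothetical ballots, all 7.5
baseline = sum(ballots) / len(ballots)   # 7.5
ballots[0] += 2.0                        # one panelist inflates by 2 points
shifted = sum(ballots) / len(ballots)
print(round(shifted - baseline, 3))      # 0.04 -- the bias is diluted 50x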

As for supervisors, Ron Whitten has consulted on a couple of courses, and has always excluded them (Architects GC in NJ and Erin Hills in WI) from consideration.  This has changed slightly, as Erin Hills has been added as a candidate because there have been enough modifications that Ron views the course as no longer his own as a whole.  This idea is debatable, as EH still has much the same routing as before, but IMO it is one of America's T100 courses today.

I know Golfweek does set up a rater retreat or two for many people to play a course and hear from that course's architect.  GD tries not to do this, as the conditions on one day can affect the rating for many people at once.  I'm not sure the GW method is wrong, but I agree with the GD perspective.  I wish there was a happy medium somehow, but maybe there isn't.

There are no "honorariums" of fees or kickbacks that anyone gets for visiting courses, or promoting such.  We as panelistz are allowed to accept comped greens fees, but this is not the point.  I have never factored "requiring a greens fee or not" into my ratings, and I have made sure others I know on the panel are on the same page as me.  I would guess in my travels, 2/3 of private courses offer comped greens fees, and 1/3 have some fee associated, greens fee or caddie fees...which is well within their right.  Usually if I am rating a pubic course, I am paying my way anyways, as this is more seamless than making a request, etc.

Comments?

And thank you Frank for giving a good overall perspective, one that I wholeheartedly share even while being a panelist!
So I jump ship in Hong Kong....

Joel_Stewart

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #3 on: February 07, 2015, 09:35:58 PM »
The tone of the questioning seems suspect.  What's with all of these supervision questions?

At GD, one guy is in charge of the panelists and one guy is in charge of the courses to be rated. Panelists are free to play where they want. You may review your own course.  With that said, if you rate any course too high or too low, the rating is kicked out if it's more than 2 standard deviations from the norm.
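
(A minimal sketch of that kind of 2-standard-deviation screen, in Python. This is illustrative only, assuming a simple mean/SD cutoff, since the magazine hasn't published its exact procedure:)

import statistics

def drop_outliers(scores, k=2.0):
    # Discard any ballot more than k standard deviations from the mean.
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [s for s in scores if abs(s - mean) <= k * sd]

ballots = [7.1, 7.4, 6.9, 7.2, 7.3, 7.0, 7.2, 7.4, 6.8, 9.9]
print(drop_outliers(ballots))   # the 9.9 ballot is thrown out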

Tommy Williamsen

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #4 on: February 07, 2015, 09:39:52 PM »
Please, not another ranking thread!?!
Where there is no love, put love; there you will find love.
St. John of the Cross

"Deep within your soul-space is a magnificent cathedral where you are sweet beyond telling." Rumi

Dave Doxey

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #5 on: February 08, 2015, 08:59:42 AM »
How good can the rankings be when most courses get played by few, if any, raters?  Many courses do not get played at all.  Well-known courses and those high on past lists obviously get played by many raters.

What role does promotion play in a course's ranking?  I know that in my area, a couple of courses that spend a lot on advertising and play up to publications get rated well, despite not being all that good in design or condition relative to others.

Can anyone share any information on the mathematical process of generating the rankings?

I tend toward believing that the whole thing is largely the result of “group think” based on past perceptions, fed with quite a bit of politics and perhaps even some ad revenue.

Being a “numbers guy,” I'm probably over-analyzing, and Frank is correct that they are most likely just subjective lists to sell magazines and promote discussion.

Brad Tufts

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #6 on: February 08, 2015, 11:06:32 AM »
Dave,

The GD methodology is spelled out on the website.

Yes, there are varying levels of interest among the clubs in participating, so no panel can guarantee that every course is seen X number of times.  I don't know where the "groupthink" criticism comes from...methodology can be debated, but there must be criteria and rules to produce any list.  As trends shift over time, courses fall into and out of favor.  Whether or not a golf course advertises with anyone does not play into the rating process.

Frank has it right.  It's all to generate discussion, and bring interest to the magazine.  There will never be a perfect list.
So I jump ship in Hong Kong....

Tom_Doak

  • Karma: +2/-1
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #7 on: February 08, 2015, 11:08:31 AM »
Being a “numbers guy,” I'm probably over-analyzing, and Frank is correct that they are most likely just subjective lists to sell magazines and promote discussion.

Dave:

Being a "numbers guy," how can you seriously think that anything as subjective as golf course design can be "correctly" analyzed by a bunch of raters with differing viewpoints -- or, worse yet, with the same viewpoint?

Dave Doxey

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #8 on: February 08, 2015, 11:57:34 AM »
There are statistical methods that could be used that involve each rater putting the courses they've played in rank order, best to worst. Then it's possible to combine the rater lists via a statistical process to come up with a single ranked list. 

Such methods take into account that individual raters rank differently.  To make this work, a large sample size would be required - each course being played by multiple raters.  I think that Golfweek's Sagarin ranking of players uses a method like this - matching the relative finish positions of players in tournaments where they are in the same field, competing against each other. 

Given the large number of courses involved and the small number of raters, there would be a requirement to eliminate a lot of courses from consideration.  The number of raters and the number of courses that each ranked would determine the confidence in the results. Clearly not a perfect process, but one with some mathematical basis. 
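
(For illustration, a naive version of this kind of rank aggregation - ordering courses by their average position across rater lists. The course names are placeholders, and real systems like Sagarin's weight head-to-head results rather than raw positions:)

from collections import defaultdict

def aggregate(rank_lists):
    # Order courses by average position across all lists that include them.
    totals = defaultdict(float)
    counts = defaultdict(int)
    for ranking in rank_lists:
        for position, course in enumerate(ranking, start=1):
            totals[course] += position
            counts[course] += 1
    return sorted(totals, key=lambda c: totals[c] / counts[c])

raters = [
    ["Course A", "Course B", "Course C"],
    ["Course B", "Course A", "Course C"],
    ["Course A", "Course C", "Course B"],
]
print(aggregate(raters))   # ['Course A', 'Course B', 'Course C']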

It would be nice to learn how the magazines create their lists.  My guess is that somewhere in their process is a step where "a couple of us sit around a table..." :) Likely, they start with the generally held perceptions reflected in past lists and then debate changes to that. For new courses, marketing people probably work to influence raters' perceptions, while the old classics stand on their reputation.

Tom_Doak

  • Karma: +2/-1
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #9 on: February 08, 2015, 12:12:10 PM »
My guess is that somewhere in their process is a step where "a couple of us sit around a table..." :) Likely, they start with the generally held perceptions reflected in past lists and then debate changes to that. For new courses, marketing people probably work to influence raters' perceptions, while the old classics stand on their reputation.


Thirty years ago that's the way it was done.  Since then, each magazine has given the panelists a long list of courses and asked them to rate each one they've played.

GOLF DIGEST has the panelists rate each course 1-10 on seven different categories, and then they have a formula for adding them together.  ["Shot values" are twice as important as the other categories.]

GOLFWEEK asks panelists to give the courses an overall 1-10 grade.

GOLF MAGAZINE asks panelists where they think each course should fit into the rankings [top 3, top 50, second 100, off the ballot entirely, etc.] and assigns numbers to these results [I don't know if it's 1-10].

In short, they all do it with numbers.  GOLF DIGEST's ranking is steered more by a certain definition of what a great course is supposed to be; the others depend more on their panelists to decide what great is.  In the end, all of them are subjective, so any "statistical confidence" rests entirely on the panelists' opinions and their own individual biases.
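
(Based on Tom's description, a GD-style ballot total might look like the sketch below: one 1-10 score per category, with Shot Values counted twice. The category names and the exact weighting here are illustrative assumptions, not Golf Digest's published formula:)

def ballot_total(scores, doubled="Shot Values"):
    # Sum the category scores, counting the doubled category twice.
    return sum(v * (2 if cat == doubled else 1) for cat, v in scores.items())

ballot = {
    "Shot Values": 8.0,      # counts twice
    "Design Variety": 7.5,
    "Memorability": 7.0,
    "Aesthetics": 8.5,
    "Conditioning": 7.0,
    "Resistance to Scoring": 6.5,
    "Ambience": 7.5,
}
print(ballot_total(ballot))  # 60.0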


Dave Doxey

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #10 on: February 08, 2015, 01:14:56 PM »
Thanks Tom.  It sounds like there is somewhat of a mathematical process. 

It would be interesting for one of the magazines to publish more detail, perhaps on a website blog, shedding more light on the number of raters, the number of courses played, and the process for coming up with a ranked list.  Rank within each category would also be interesting.

I agree that it's an opinion-based process.  Opinions obviously don't always agree. It would be nice to see how the magazines resolve that.

For a new course, I guess that getting on the initial list given to reviewers is key to getting in the race.  Probably easier now that there are fewer new courses.  I still wonder about "hidden gems" out there, and also old places living in large part on their reputation.

Brad Tufts

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #11 on: February 08, 2015, 03:46:55 PM »
Every time the list comes out, Golf Digest for one DOES explain how the process works in the same section.  They have even in the past published a spreadsheet with all the averages for each T100 course in every category so it was all right in front of you.

I think we have something like 800 panelists, and all candidate courses (I would assume) are played at least once.  I believe the minimum # of ratings is 20 for best-in-state and 45 for T100.  I completely agree with establishing a minimum # of ratings to be listed on the BIS or T100.

Even if we did Sagarin-style head-to-head ratings, there would still be results that someone does not like.

At least in Golf Digest, there is no "roundtable ranking" that goes on.

If there is one thing the rankings have taught me, it's that nobody will ever be completely happy with any result other than their own personal rankings!
So I jump ship in Hong Kong....

Jeff Bergeron

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #12 on: February 08, 2015, 06:22:21 PM »
[quoting Brad Tufts's reply #2 in full; see above]

Brad, this is an excellent response. I appreciate that. Does anyone know what Golfweek does, along these same lines?

Dave Doxey

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #13 on: February 08, 2015, 10:03:05 PM »
Every time the list comes out, Golf Digest for one DOES explain how the process works in the same section.  They have even in the past published a spreadsheet with all the averages for each T100 course in every category so it was all right in front of you.


Thanks Brad.

I'm not quibbling about the results. I really don't have a dog in that fight.  As I work in the analytics field, I was just expressing an interest in the process itself – how it works and how effectively it covers the vast number of courses in existence. I was also not implying any failings by you or the other raters.

I've seen the article on the criteria and the results for the winning courses.  This is what you are referring to, correct? http://www.golfdigest.com/golf-courses/2015-02/100-greatest-by-the-numbers This is interesting, but not very detailed, in that it covers only the chosen 100.  I wish that it was indeed a spreadsheet, so one could look at the ranking in each individual category.

It would be interesting to know how many of the thousands of courses in the US were actually played by raters, vs. what I suspect was a focus on some subset believed to be top candidates.  According to the article, only those that get played 45 times over 8 years are eligible.  This means there are likely courses that didn't get played 45 times, yet might score higher than some in the Top-100. (Is there a means whereby a rater who is impressed by a seldom-played course can recommend that other raters play it?)

Looking at the numbers - best case: say there are 100 raters. If each rater played 25 different courses per year for 8 years, that would be 20,000 ratings filed.  Given that a course needs 45 ratings to qualify, the absolute most courses that could be eligible would be 444 (20,000 / 45).  That is the absolute best case - where no course ever gets played more than 45 times (which is unlikely). It would appear that the Top-100 are likely chosen from only 200 or 300 candidates.  Not very inclusive...
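
(Dave's best-case arithmetic, written out. The figures are his hypotheticals, not the panel's actual numbers:)

raters = 100
courses_per_year = 25
years = 8
min_ratings = 45

total_ratings = raters * courses_per_year * years   # 20,000 ratings filed
max_eligible = total_ratings // min_ratings         # at most 444 courses
print(total_ratings, max_eligible)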

Sorry.  I said up front that I was a numbers guy.  I'll shut up now :)


Tom_Doak

  • Karma: +2/-1
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #14 on: February 08, 2015, 11:07:32 PM »
Yes, Dave, but there are more than 1,000 panelists, and many of them play more than 25 courses a year.  There is a lot of data.  The question is how relevant any of it really is.

Brad Tufts

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #15 on: February 09, 2015, 12:40:21 AM »
Thanks, Dave...I was beginning to think I was talking to myself!

I like your math, actually...there are probably only 350ish courses that have a chance, with enough ratings to be considered.

There are a few that get the numbers but not the number of ratings...Nanea was one this past year, and they are not interested in hosting raters, which is totally their right.  There were only a few in this category that would have made the T100 - no more than 3, if my memory is correct.

The issue with having more than "350ish" candidates is that those outside this realm very rarely have the categorical #s to potentially make the T100 or best in state.  All courses "with a chance" of making best in state or the T100 stay as candidates, and many of the others fall off the candidate list eventually.  We do not want to waste a club's time if there is no shot of making any of the lists.

All of that being said, we do add courses year to year if there has been a renovation or change that would put a course back into consideration, or if a club that has long not been interested in the rating process has a change of heart.
So I jump ship in Hong Kong....

Carl Rogers

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #16 on: February 09, 2015, 12:02:07 PM »
What are the qualifications to become a rater?  Yes, it is all about the raters once you get deeper than the top-20 "round up the usual suspects."
I decline to accept the end of man. ... William Faulkner

Mike_Young

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #17 on: February 09, 2015, 07:01:27 PM »
After all the compilations and all of the reporting of ratings by all of the raters, there is no agency that watches over the tallies or the final rankings given by the mags.  There is nothing to stop them from rating any course any way they wish, if they so choose.  And I personally don't have a problem with that at all.  It's nothing more than entertainment.
"just standing on a corner in Winslow Arizona"

Frank M

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #18 on: February 09, 2015, 07:12:43 PM »
When you say "...there is no agency that watches over the tallies or the final rankings given by the Mags," what are you referring to exactly?   
« Last Edit: July 05, 2024, 10:13:36 PM by Frank M »

Mike_Young

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #19 on: February 09, 2015, 07:18:12 PM »
After all the compilations and all of the reporting of ratings by all of the raters, there is no agency that watches over the tallies or the final rankings given by the mags.  There is nothing to stop them from rating any course any way they wish, if they so choose.  And I personally don't have a problem with that at all.  It's nothing more than entertainment.

When you say "...there is no agency that watches over the tallies or the final rankings given by the Mags," what are you referring to exactly?

And whatever this agency entails, isn't everything nothing more than entertainment? Even golf course architecture itself? In the end, what can an agency give credit or discredit to? There's no right or wrong.  
Frank,
I agree with you 100%...as for what I mean by agency...I'm saying there is no government agency that they have to abide by...they can do as they wish...I'm fine with it.  Let the mags rate the courses that can help them the most the highest....hell, our biggest entertainment channels today are our news channels, and there is nothing wrong with that as long as you recognize it for what it is...same goes for golf design and rankings..
"just standing on a corner in Winslow Arizona"

PCCraig

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #20 on: February 10, 2015, 10:11:32 AM »
Yes, Dave, but there are more than 1,000 panelists, and many of them play more than 25 courses a year.  There is a lot of data.  The question is how relevant any of it really is.

Sounds like a race to the bottom to me...

I quit this scam when I started to be asked to pay for the privilege of being one of the herd, on top of the "mandatory" rater camps.  Last time I checked, those who prostitute themselves receive the money; they don't fork it over. 

The whole process is a farce.   At this point, it's a business plan to salvage jobs for its promoters.  Nothing more.  I don't blame them.  They have no choice.  But I don't have to enable it.   So I don't.  No more complicated than that. 

Dave,

To be fair to Golf Magazine and Golf Digest, you should note that you are talking about Golfweek's program above.
H.P.S.

Adam Clayman

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #21 on: February 10, 2015, 11:07:47 AM »
Dave, Jeff: there's so much wasted bandwidth on this subject within this forum. Please try searching for the questions you're asking. Unless the threads have been culled, all of your questions have already been answered several times.
"It's unbelievable how much you don't know about the game you've been playing your whole life." - Mickey Mantle

Tim Martin

  • Karma: +0/-0
Re: Process for developing the Golf Digest/Golfweek rankings
« Reply #22 on: February 10, 2015, 02:11:00 PM »
Reading the original post, it seems as if the author has an axe to grind. Every question insinuates that there is collusion, some form of financial gain, or dishonesty in the rating process at both of the above-referenced magazines. I think it's evident from the lack of responses/interested parties that this thread does little, if anything, to advance the discussion of golf architecture.
« Last Edit: February 10, 2015, 03:40:31 PM by Tim Martin »
