Jonathan Cummings

The Doak Scale
« on: December 07, 2008, 08:51:50 AM »
I posted earlier that I don't believe The Doak Scale to be the best scale to use as a top 100 measure because of its subjectivity.  I'd like to defend this opinion.

If I am tasked to generate a course ranking, I want the panelists thinking of rankings, not ratings.  I would want each "rater" to sort his own personal list before assigning any kind of numbers to it, and only then think about where courses should fall on an absolute scale.  Granted, it is tough to distinguish the 33rd from the 34th course, but you really don't need to distinguish to that detail.  A panelist need only group them using an absolute ranking guideline.  I think Golfweek does this best, with something along the lines of:

10 - top 10
9 - top 11-25
8 - top 26-50
7 - top 51-100

etc....

The beauty of a rater submitting grades tied to his own rankings is that when I average his list with other panelists' ranking lists, the composite list converges quickly.
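The grade-then-average mechanics described above can be sketched in a few lines. This is only a toy illustration of the arithmetic, not any magazine's actual code; the grade boundaries come from the guideline above, the course names are placeholders, and the value for rank 101+ is an assumed continuation of the scale.

```python
# Toy sketch: convert each panelist's personal ordering into absolute
# grades per the guideline above, then average the grades per course.
from collections import defaultdict

def grade_from_rank(rank):
    """Map a position in a panelist's personal ranking to a grade."""
    if rank <= 10:
        return 10
    if rank <= 25:
        return 9
    if rank <= 50:
        return 8
    if rank <= 100:
        return 7
    return 6  # below the top 100 (assumed continuation of the scale)

def composite(panelist_lists):
    """Each panelist submits an ordered list (best first); the
    composite score is the mean grade each course receives."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for ordered in panelist_lists:
        for pos, course in enumerate(ordered, start=1):
            totals[course] += grade_from_rank(pos)
            counts[course] += 1
    return {c: totals[c] / counts[c] for c in totals}

scores = composite([["Pine Valley", "Cypress Point"],
                    ["Cypress Point", "Pine Valley"]])
print(scores)  # both courses sit in each panelist's top 10, so both average 10.0
```

Because each panelist's grades are anchored to his own ordering, disagreement about exact positions (33rd vs. 34th) mostly lands in the same grade bucket, which is what makes the averaged list settle down quickly.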

Do you think Tom instructed his GM panelists to use the Doak Scale when he ran the GM panel?  Nope, he used a rating scale quite similar to GW's, although Tom's had less resolution.

I think Tom did the right thing.

I interviewed Tom about his rating technique for a travel log I wrote years ago.  Here’s what he told me……


In the late 1970s, the editors of Golf Magazine, recognizing the success and growing popularity of the Golf Digest US course rankings, embarked on compiling their own list, this one addressing the top courses in the world.  Golf Magazine selected John Ross to head this 'Top' committee.  Ross and his panel compiled a subjective list with little basis in criteria.  In 1983 Tom Doak wrote John Ross a letter arguing that Golf Magazine was going about rating courses all wrong: Doak wanted the Magazine to truly rank courses rather than just list them.  Doak, a 23-year-old photographer and contributing editor to the Magazine, had just graduated from Cornell with a degree in landscape architecture and had begun working as an apprentice course designer under Pete Dye.  He had also spent time after graduation in the United Kingdom studying course design and design history.  Golf Magazine responded to his letter by making Doak the head of the Top 100 committee.

The selection format and panel that Doak formed were markedly different from Digest's.  A panel of about 60 was assembled, made up of both unknown amateurs and famous golfers like Nicklaus, Player, and Palmer.  Doak selected each panelist under the main criteria that he had played a broad array of well-known international courses and was well versed in the aspects of golf architecture.  Unlike Digest, Doak felt that if you select a knowledgeable and experienced enough panel, a numeric system of rankings is unnecessary: you put your effort into selecting the panel and then simply trust its results.  This stands in contrast to Digest's philosophy, which uses amateurs as panelists to generate the 'draft' list, which the editors then fine-tune by applying modifying categories (tradition, etc.) to the panelists' averaged results - effectively fudge factors.

 
The Magazine panelists graded a list, generated subjectively by Doak, that included several hundred of the most recognized and highly regarded courses in the world.  A panelist was not required to have played a candidate course, only to have at least seen it at some point in his lifetime (unlike Golf Digest's requirement that panelists rate only courses they have played in the last 10 years).  Courses only seen, as opposed to played, were weighted less heavily in the overall averaging.  A simple school grading system was employed:

A   -   A Top Ten Course
B   -   An 11-50 Course
C   -   A 51-100 Course
D   -   A 101-200 Course
E   -   A 201-300 Course
F   -   Should not be on list

Another requirement of the list called for a minimum of 10% of the panelists to have played a course for it to receive ranking consideration.  Moreover, a panelist was allowed to give out only as many 'A's as A courses he/she had played, as many 'B's as B courses played, and so on.  So, if a panelist had played only two top-ten courses from the previous list, he/she could give out only two A's on the current list being ranked.  This helped ensure that Doak's list did not become 'top' heavy and that a panelist's votes were absolute rankings as opposed to relative (to that individual) rankings.  If a candidate course was not visited enough (not seen by at least 10% of the panelists), then Doak assigned 'E' values to make up the difference.  In other words, if only 4 panelists visited a course and each gave it a 'B' grade, Doak would add in 3 'E' grades and average the results, thus ensuring that a 'remote' but potentially 'great' course was not dropped from consideration simply for lack of panelist exposure.
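The 'E'-padding rule described above is a small piece of arithmetic, sketched below. The numeric values attached to the letter grades (A=5 down to F=0) are my assumption for illustration only; the interview does not say what numbers, if any, were actually used.

```python
import math

# Assumed numeric values for the letter grades; the source does not
# specify what numbers were attached to A through F.
GRADE_VALUE = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1, "F": 0}

def padded_average(grades, panel_size, min_fraction=0.10):
    """If fewer than min_fraction of the panel has seen the course,
    pad the vote list with 'E' grades up to the threshold, then average."""
    required = math.ceil(panel_size * min_fraction)
    padded = list(grades) + ["E"] * max(0, required - len(grades))
    return sum(GRADE_VALUE[g] for g in padded) / len(padded)

# With a panel of 60, the threshold is 6 votes; four 'B' votes get
# padded with two 'E' grades: (4*4 + 2*1) / 6 = 3.0
print(padded_average(["B", "B", "B", "B"], panel_size=60))
```

The padding pulls a thinly-seen course's average down toward 'E' rather than excluding it outright, which is exactly the compromise described above: a remote course stays in consideration, but it cannot rank highly on a handful of enthusiastic votes alone.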

The first Golf Magazine list was published in 1979 and specified the top 50 courses (in no specific order) in the world.  The list was published again in the November 1981 edition and has continued to be published every odd year since then.  In 1983, with Doak’s influence, Magazine became the first publication to rank the top courses in order, 1-50.  Digest quickly followed suit in 1985, with their ordered top 100 U.S. rankings.  In 1985, Magazine expanded their world rankings to cover the top 100.  Starting in 1991, and continuing today, Golf Magazine publishes both the top 100 in the world and the top 100 in the U.S.

 

Tom Huckaby

Re: The Doak Scale
« Reply #1 on: December 07, 2008, 10:48:20 AM »
Jonathan:

Did anyone say the Doak scale was a good tool for a Top 100?  Apologies, as I didn't follow that other thread.  My eyes glaze over when the minutiae of ratings and rankings get discussed.

My personal feeling remains that each magazine's rating has its own unique value: GW does what you describe, and does it well... GD attempts a more scientific, criteria-based consensus of a huge panel, and that works too, giving something different; GM gives the "we know it when we see it" take of a smaller panel of famous people and/or experts, and that has its value also.  We can debate until we're blue in the face about which is the best; consensus on that will likely never be achieved.

To me the Doak scale is a totally different thing:  it tells me how "worth it" a course is to seek out.  That won't necessarily translate to Top 100 rankings, for exactly the reasons you state.  But it too has its value.

Just count me as with you anyway that it ought not to be used for ranking purposes.  I just don't see that as its intent.

TH

Jonathan Cummings

Re: The Doak Scale
« Reply #2 on: December 07, 2008, 11:03:47 AM »
Tom - you are wise beyond your years!  Agree with you completely.  jaycee

Adam Clayman

Re: The Doak Scale
« Reply #3 on: December 07, 2008, 11:08:38 AM »
JC's advice is sound. The Doak scale appears to have inherent formulaic flaws for ranking purposes.

Hopefully Ian will see this and reconsider his criteria.

But, I can tell you up front that there's no way that GCA.com will ever sanction a list that is perceived to be an official opinion from this website.
"It's unbelievable how much you don't know about the game you've been playing your whole life." - Mickey Mantle

Mark_Fine

Re: The Doak Scale
« Reply #4 on: December 07, 2008, 11:23:51 AM »
Jonathan,
I am really hesitant to get into a discussion on a "ranking" thread, as rankings are such a SUBJECTIVE topic, but I'll raise a question: if I follow your logic and a panelist submits his list, and the highest-ranked course he has on it is Bay Hill, does this list really help you with Top 100 rankings?  How would that panelist know if Bay Hill is a 10 or a 9 or a 5...?  If he gives Bay Hill a 10 because it is the best he has played (in his opinion), how valid are your rankings and how useful is his input?
Mark

Adam Clayman

Re: The Doak Scale
« Reply #5 on: December 07, 2008, 12:00:19 PM »
Mark, I won't presume to answer for JC, but when I was considered to be a panelist there was a vetting process - not specifically on the ranking of courses, but on the experience and diversity of one's scope, nationally. I assume the same is true at GM, only globally.

Robert Thompson

Re: The Doak Scale
« Reply #6 on: December 07, 2008, 12:11:47 PM »
Quote from: Mark_Fine on December 07, 2008, 11:23:51 AM
Jonathan,
I am really hesitant to get into a discussion on a "ranking" thread, as rankings are such a SUBJECTIVE topic, but I'll raise a question: if I follow your logic and a panelist submits his list, and the highest-ranked course he has on it is Bay Hill, does this list really help you with Top 100 rankings?  How would that panelist know if Bay Hill is a 10 or a 9 or a 5...?  If he gives Bay Hill a 10 because it is the best he has played (in his opinion), how valid are your rankings and how useful is his input?
Mark

Mark: This is a fine point and an issue we have in Canada with our Canadian rankings at Scoregolf. If one hasn't seen Highlands Links in the east, St. George's in Toronto, Banff and Jasper in Alberta, and Capilano in the west, then your frame of reference isn't really there. I believe the Golf Mag list insists that its raters see a vast majority of the Top 100 before they are admitted - giving them a frame of reference out of the gate. Without it, I don't think one can judge what is great from what is merely good.
Terrorizing Toronto Since 1997

Read me at Canadiangolfer.com

Jonathan Cummings

Re: The Doak Scale
« Reply #7 on: December 07, 2008, 01:33:02 PM »
Mark - no question, you raise a valid concern.  Ideally I would only use panelists who have seen a fair number of courses and have a basis on which to form a comparative judgement.  In a perfect world I would specify a threshold to become a rater... having seen 100+ courses, including 5-10 of the top 100 in the world???

In regards to Ian's list, I can use some elementary statistics to numerically "test" how well a rater's numbers align.  Comparing a rater's individual votes to the average of all raters' votes on a course-by-course basis can suggest whether that rater's numbers are more widely spread (randomly generated) - and potentially of less value.
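One hypothetical way to run that test (my sketch, not Jonathan's actual method): compute the consensus score for each course, then each rater's mean absolute deviation from that consensus. A rater whose votes look random shows a noticeably larger deviation. The rater names and scores below are made up for illustration.

```python
import statistics

def rater_deviation(ratings):
    """ratings: {rater: {course: score}}. Returns each rater's mean
    absolute deviation from the consensus (the all-rater average)."""
    courses = {c for votes in ratings.values() for c in votes}
    consensus = {c: statistics.mean(v[c] for v in ratings.values() if c in v)
                 for c in courses}
    return {r: statistics.mean(abs(score - consensus[c])
                               for c, score in votes.items())
            for r, votes in ratings.items()}

votes = {
    "steady":  {"A": 8, "B": 6, "C": 4},
    "aligned": {"A": 7, "B": 6, "C": 5},
    "random":  {"A": 2, "B": 9, "C": 1},
}
print(rater_deviation(votes))  # "random" shows the largest deviation
```

A correlation coefficient between a rater's votes and the consensus would serve the same purpose; either way, a flagged rater's list can then be down-weighted or examined by hand rather than discarded automatically.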

JC

Ian_L

Re: The Doak Scale
« Reply #8 on: December 07, 2008, 05:07:50 PM »
Jonathan, I definitely agree that your system works better when experienced golfers are involved.  But what happens when a golfer (like me) who hasn't seen a whole lot of great courses comes along and is asked to rank his courses?  How can he say that the course he played is a 9 on your scale if he hasn't played any other "9's"?  I know I would not feel comfortable voting on this scale, because it requires extensive knowledge of the world's great courses.  Which is why it works very well for a vetted group of panelists.

The reasons I chose to use the Doak Scale are:
1. It is well known around gca.com.
2. It's easy to compute rankings from it.
3. If he's looking for the right things in a course, even a relatively sheltered golfer can pinpoint his opinion of the course on this scale.  He may not know how good Muirfield is, but he knows that XYZ course was an 8 in his book. Of course, this golfer's opinion will not be as informed as the more experienced golfer's, but at least he has a chance to give his honest opinion.


However, if nobody else wants to use the Doak Scale, then we'll have to change the plan.  Of course, there's nothing stopping us from making multiple rankings, so we could try one first and then the other if there is interest in both.

Mark_Fine

Re: The Doak Scale
« Reply #9 on: December 07, 2008, 05:18:26 PM »
The bottom line with ANY scale or system is that it comes down to the experience and course exposure of the people doing the ratings.  The scale and/or criteria you use are really secondary.  Some, for example, say that the GD criteria are too analytical, but frankly the review one gives a course comes down to the knowledge and experience of the panelist.  The numbers fall where they fall and you are not forced into anything.

I was getting ready for a trip to Hawaii a few weeks ago and a friend of mine said to me, "You have to make sure you see XYZ golf course while you are there.  It is the best course in the state."  I asked him how many courses he had played in Hawaii and he said three.  Need I comment more  ;)

Jonathan Cummings

Re: The Doak Scale
« Reply #10 on: December 07, 2008, 08:18:26 PM »
Ian - I'll be honest with you, I'd watch your ratings with an extra careful eye.  In a perfect world I wouldn't allow someone of your experience level to post votes.

Mark - I'd be curious which Hawaii course your friend suggested not to miss.  According to a long-past Links Magazine article, I've seen damn near them all  ;) and I would be interested in how my "best of" compares with your friend's "best of".  JC

Ian_L

Re: The Doak Scale
« Reply #11 on: December 07, 2008, 08:26:21 PM »
Quote from: Jonathan Cummings on December 07, 2008, 08:18:26 PM
Ian - I'll be honest with you, I'd watch your ratings with an extra careful eye.  In a perfect world I wouldn't allow someone of your experience level to post votes.

I agree completely.  However, in the poll we'll be conducting, all GCA members will be allowed to vote, for better or for worse, which is why I would not want to implement your method in that exercise.

For a more educated group of voters, I would agree with your system.

Mark_Fine

Re: The Doak Scale
« Reply #12 on: December 07, 2008, 10:10:45 PM »
Jonathan,
Mauna Kea was the course.  I've played it numerous times in the past but not this trip (might still be closed).  I did play Nanea as well as two others on the Big Island this past trip and I will say Nanea is superb! 
« Last Edit: December 08, 2008, 06:42:58 AM by Mark_Fine »

Andrew Summerell

Re: The Doak Scale
« Reply #13 on: December 07, 2008, 11:25:45 PM »
Jonathan,

How regularly do you expect your rankers to play/see each course?

It has always concerned me how many panellists continue to rank or rate courses they haven’t played in the last 5 years.

