I posted earlier that I don't believe The Doak Scale to be the best scale to use as a top 100 measure because of its subjectivity. I'd like to defend this opinion.
If I were tasked with generating a course ranking, I would want the panelists thinking in rankings, not ratings. I would want each "rater" to sort his own personal list before assigning any numbers to it, and only then think about where those courses should fall on an absolute scale. Granted, it is tough to distinguish the 33rd course from the 34th, but you really don't need that level of detail. A panelist need only group them using an absolute ranking guideline. I think Golfweek does this best, with something along the lines of:
10 - top 10
9 - top 11-25
8 - top 26-50
7 - top 51-100
etc....
The beauty of a rater submitting grades tied to his own rankings is that when I average his grades with the other panelists' lists, the composite list converges quickly.
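To make the idea concrete, here is a minimal sketch in Python of what I mean. The cutoffs mirror the guideline above; the function names and the plain averaging are my own illustration, not Golfweek's actual formula.

```python
# Hypothetical sketch (my own, not Golfweek's actual method): convert each
# panelist's personal ordering into absolute grades using cutoffs like the
# ones above, then average the grades per course to build the composite.
from collections import defaultdict

def grade_from_rank(rank):
    """Map a panelist's ordinal rank to an absolute grade (illustrative cutoffs)."""
    if rank <= 10:
        return 10
    if rank <= 25:
        return 9
    if rank <= 50:
        return 8
    if rank <= 100:
        return 7
    return 6  # everything outside the top 100 lumped together for this sketch

def composite(panelist_rankings):
    """panelist_rankings: one ordered list of course names per panelist, best first."""
    grades = defaultdict(list)
    for ranking in panelist_rankings:
        for position, course in enumerate(ranking, start=1):
            grades[course].append(grade_from_rank(position))
    averages = {course: sum(g) / len(g) for course, g in grades.items()}
    return sorted(averages.items(), key=lambda item: item[1], reverse=True)

# Two panelists who disagree on the exact order still produce a stable composite.
print(composite([
    ["Pine Valley", "Cypress Point", "Shinnecock Hills"],
    ["Cypress Point", "Pine Valley", "Shinnecock Hills"],
]))
```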
Do you think Tom instructed his GM panelists to use The Doak Scale when he ran the GM panel? Nope, he used a rating scale quite similar to Golfweek's, although Tom's had less resolution.
I think Tom did the right thing.
I interviewed Tom about his rating technique for a travel log I wrote years ago. Here's what he told me…
In the late 1970s, the editors of Golf Magazine, recognizing the success and growing popularity of Golf Digest's U.S. course rankings, embarked on compiling their own list, this one addressing the top courses in the world. Golf Magazine selected John Ross to initially head this 'Top' committee. Ross and his panel compiled a subjective list with little basis in criteria. In 1983 Tom Doak wrote John Ross a letter arguing that Golf Magazine was going about rating courses all wrong; Doak wanted the Magazine to truly rank courses rather than just list them. Doak, a 23-year-old photographer and contributing editor to the Magazine, had just graduated from Cornell with a degree in landscape architecture and had begun working as an apprentice course designer under Pete Dye. He had also spent time after graduation in the United Kingdom studying course design and design history. Golf Magazine responded to his letter by making Doak the head of the Top 100 committee. The selection format and panel that Doak formed were markedly different from Digest's. A panel of about 60 was formed, made up of both unknown amateurs and famous golfers like Nicklaus, Player, and Palmer. A panelist was selected by Doak under the main criterion that he had played a broad array of well-known international courses and was well versed in golf architecture. Unlike Digest, Doak felt that if you select a knowledgeable and experienced enough panel, a numeric system of rankings is unnecessary: you put your effort into selecting the panel and then simply trust its results. This stood in contrast to Digest's philosophy, which uses amateurs as panelists to generate a 'draft' list, after which the editors fine-tune the averaged results using modifying categories (tradition, etc.), effectively fudge factors.
The Magazine panelists graded a list that was generated subjectively by Doak and included several hundred of the most recognized and highly regarded courses in the world. A panelist was not required to have played a candidate course, only to have at least seen it sometime in his lifetime (unlike Golf Digest's requirement that panelists rate only courses they have played in the last 10 years). Courses only seen, as opposed to played, were not weighted as heavily in the overall averaging. A simple school grading system was employed:
A - A Top Ten Course
B - An 11-50 Course
C - A 51-100 Course
D - A 101-200 Course
E - A 201-300 Course
F - Should not be on list
Another requirement of the Magazine list called for a minimum of 10% of the panelists to have played a course for it to receive ranking consideration. Moreover, a panelist was allowed to give out only as many 'A's as A courses he or she had played, as many 'B's as B courses played, and so on. So, if a panelist had played only two top ten courses from the previous list, he or she could give out only two A's on the current list being ranked. This helped ensure that Doak's list did not become 'top' heavy and that a panelist's votes were absolute rankings as opposed to relative (to that individual) rankings. If a candidate course was not visited enough (not seen by at least 10% of the panelists), then Doak assigned 'E' values to make up the difference. In other words, if only four panelists had visited a course, and each gave it a 'B' grade, then Doak would add in three 'E' grades and average the results, ensuring that a 'remote' but potentially 'great' course was not dropped from consideration for lack of panelist exposure.
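A rough sketch of that 'E'-padding arithmetic, under my own assumptions: I've mapped A through F to 5 through 0 points (the numeric values weren't part of what Tom described), and I've picked a panel size that makes the 10% minimum work out to seven ballots, so the four-B's-plus-three-E's example above falls out of it.

```python
import math

# Illustrative grade-to-point mapping; the actual numeric values are my assumption.
GRADE_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1, "F": 0}

def padded_average(grades, panel_size):
    """Average a course's grades, padding with 'E's up to 10% of the panel."""
    minimum = math.ceil(panel_size * 0.10)      # coverage threshold
    shortfall = max(0, minimum - len(grades))
    padded = grades + ["E"] * shortfall         # make up the difference with E's
    return sum(GRADE_POINTS[g] for g in padded) / len(padded)

# The example from the text: four 'B' ballots padded with three 'E's before averaging.
print(padded_average(["B", "B", "B", "B"], panel_size=70))  # (4*4 + 3*1) / 7 ≈ 2.71
```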
The first Golf Magazine list was published in 1979 and specified the top 50 courses in the world (in no specific order). The list was published again in the November 1981 edition and has continued to be published every odd year since. In 1983, with Doak's influence, the Magazine became the first publication to rank the top courses in order, 1-50; Digest quickly followed suit in 1985 with its ordered top 100 U.S. rankings. In 1985, the Magazine expanded its world rankings to cover the top 100. Starting in 1991, and continuing today, Golf Magazine publishes both the top 100 in the world and the top 100 in the U.S.