Huck - GD requires a minimum of 30 raters. GW likes a minimum of 10 ratings, but for top courses it gets many more. It is not different strokes, but a troublesome difference in measurements...
Here is a history of the Digest Ranking system I wrote a number of years ago...
JC
In 1962, a map-making firm asked Golf Digest to compile a list of top courses in America for a charting project the firm was preparing. At first turned off, but later intrigued by the concept of relative comparisons of the then-popular US courses, the editors of Digest, under the direction of Bill Davis, set about preparing a list. The resulting list, called ‘America’s 200 Toughest Courses’, was based solely on course lengths and USGA rating values, and was first published in 1966. The list was revised in 1967 and has since been published by the magazine every odd year.

The editors soon realized the magnitude and potential complexity of evaluating and ranking a golf course. It could not be an undertaking performed by a library search based on just a few parameters; such a basis would miss many of the qualities of ‘greatness’. It was clear that the opinion of experienced players was needed. So a panel was formed in 1968, composed of both professionals and top amateurs, to assist Golf Digest in determining The List. This panel initially comprised about 100 regional selectors (mostly friends of Bill Davis) who provided course recommendations to a national selector board made up of 20 of the then (and some still!) Who’s Whos of golf - Snead, Beman, Boatwright, Demaret, Wind, Nelson, to name a few.

Panelists evaluated courses for this initial list based mostly on fame, tradition, and whether or not a course had a history of hosting tournaments. Architectural features were also assessed, but only subjectively. This national panel, in concert with the Golf Digest editors, then determined the final rankings by listing the top 50 courses in groups of ten - first ten, second ten, etc. - and listing the ten courses within a group alphabetically. The courses ranked 51-100 were lumped together in a single alphabetical list called ‘the second fifty’. The first product of the newly formed panel was published in 1969.
By 1975, the panel had begun to put more value on design features, like esthetics and quality of play, and to slightly de-emphasize a course's tournament history. This precipitated the change in name from ‘America’s 200 Toughest Courses’ to ‘America’s 100 Greatest Tests of Golf’, and a corresponding reshuffling of The List. Courses like Seminole and Pine Valley, both tremendous designs but lacking in tournament history, shot up to near the top of The List. The panel further modified their evaluations by downplaying course length and favoring shorter designs that promoted shotmaking and finesse. Golf as the thinking man’s game was being encouraged.
Much stayed the same until 1983, when several important changes impacting The List occurred. The regional (state) panel, which before 1983 had fluctuated between 118 and 144 people, was now increased to 208, an increase of almost 50%. The national panel had crept up from 20 in 1975 to 23 in 1983. Additional raters were required to sample and rate the new golf courses that were springing up in America at an ever-increasing rate. More raters also ensured a statistically adequate sampling ratio for accurate prediction, much as exit polls during elections require a certain number of people interviewed (samples) before they are deemed accurate and used to predict winners. The second and more profound change was the introduction of a point system based on specific categories. This system was used for ranking the top 20, and to ‘break ties’ in instances of extremely similar course ratings. The seven categories were:
(1) Shot Value
(2) Resistance to Scoring
(3) Design Balance
(4) Memorability
(5) Esthetics
(6) Conditioning
(7) Tradition
These categories were not without controversy, mostly regarding course conditioning. It was argued that a well-designed course retains its design features during transitory times of drought, frost, disease, and other natural events affecting living things. Greenskeepers come and go, and while their job is unquestionably important - even critical - their talents should play only a small role in determining the ‘greatness’ of a course. As one rater so aptly put it, “are we trying to find the most beautiful woman, or the best hairdresser?” (GD, Nov 1983 p64). The board argued that conditioning, either good or poor, could alter a course design’s intent, potentially devaluing architectural strength, so conditioning was retained as a category.
Several more important changes occurred with the 1985 list. First, a course was required to be at least three years old to be eligible for the top 100. Second, the raters were instructed to apply the seven categories to all courses they rated and generate a numeric value for each category for that course. Values of 1 to 10 (1 poor, 10 perfect) were assigned for all categories except shot value, in which the 1 to 10 values were doubled, and tradition, which ranged from 1 to 5. A perfect course score was 75. The national panel averaged the panelists’ results from the many courses rated and acted, along with the Digest editors, as arbiters. Third, a true 1 to 100 ranking was employed, based on composite average values; the alphabetical listing within groups was abandoned. Now there was a best course in the US, which turned out to be Pine Valley - a course that has never left the #1 position since. The 1985 listing also saw the emergence of Ron Whitten from rater and contributing editor to an elevated position as the ‘CEO’ of The List.
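The 1985 arithmetic above can be sketched in a few lines of code. This is purely illustrative - the category names come from the article, but the function and its checks are assumptions, not Digest's actual tooling:

```python
# Illustrative sketch of the 1985 point system described above.
# Category names are from the article; the code itself is hypothetical.

# Maximum points per category: 1-10 values, shot value doubled, tradition capped at 5.
CATEGORY_MAX = {
    "shot_value": 20,
    "resistance_to_scoring": 10,
    "design_balance": 10,
    "memorability": 10,
    "esthetics": 10,
    "conditioning": 10,
    "tradition": 5,
}

def course_score(ratings):
    """Sum one rater's card, doubling the 1-10 shot value per the 1985 rules."""
    total = 0
    for category, value in ratings.items():
        if category == "shot_value":
            value *= 2  # shot value counts double
        assert value <= CATEGORY_MAX[category], f"{category} over maximum"
        total += value
    return total

# A perfect card - 10s across the board, 5 for tradition - totals 75.
perfect = course_score({
    "shot_value": 10, "resistance_to_scoring": 10, "design_balance": 10,
    "memorability": 10, "esthetics": 10, "conditioning": 10, "tradition": 5,
})
print(perfect)  # 75
```

Note how the doubled shot value (20 of 75 possible points) makes it the single heaviest category, consistent with its later use as the tiebreaker.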
Whitten endeavored to broaden the pool of panelists to include women, minorities, and people who traveled and could sample a wide range of courses. Whitten still looked for low-handicap amateurs with demonstrated experience in the world of golf. Gone from the panel were professionals and USGA types. Architects were never allowed on Digest's panel, for it was felt that they would have too much of a vested interest and - more - a forum to promote their own courses.
By 1987, the panel had grown to 287 regional raters with 24 golf dignitaries on the national board. Making The List now required a course to be at least three years old and to have been played by at least 10% of the panelists.
In 1989, Resistance to Scoring and Tradition were dropped from the regional panelists’ scoring system. The course-ranking editorial panel (formerly the national board) felt that the USGA slope rating was a more detailed assessment of relative course difficulty, so a total of up to four points was added to a course’s overall average, using an equation based on course slope. Other changes included the devaluation of tradition from 5 points to 2. Tradition points were now added by the panel, based on their assessment of what impact the candidate course had had on the history and lore of the game. A ‘perfect’ score now became 66.
1991 saw a quick reversal of the 1989 decision to drop resistance to scoring as a rating category. Digest realized the fallacy of basing course difficulty on slope for two reasons: (1) slope is a measure of the average golfer's (bogey player's) ability, whereas raters derived their values from the tournament tees (scratch tees), and (2) slope is a static figure, measured and re-rated by the USGA all too infrequently, while course raters could gauge difficulty (resistance to scoring) as many times a year as the candidate course was visited. The highest score a course could receive was now 72. Ties were broken by prioritizing the highest shot value averages. This was also the last year in which the magazine published a roster of the names and states of the individual panelists.
Controversy again struck The List in 1993 with the assignment of two ‘bonus points’ to a course that allowed walking (one point if walking was restricted to certain times). A grassroots movement was afoot in golf to get back to the ‘traditional’ round: links courses were in vogue, caddie programs were strengthened, and walking was encouraged. Major golf magazines, including Golf Digest, championed the walking cause and likely put political pressure on the rating committee to reflect this policy in the course rating. Hence the added two points, which brought the perfect score to 74. The regional group of panelists had swelled to 430.
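Tracking the ‘perfect’ score through the rule changes recounted above is a simple exercise in arithmetic; a worked sketch, using only the point changes named in the text:

```python
# Evolution of the "perfect" score under the rule changes described above.
perfect_1985 = 75                         # the original seven-category system
perfect_1989 = perfect_1985 - 10 - 3 + 4  # resistance dropped (-10), tradition 5 -> 2 (-3), slope bonus (+4)
perfect_1991 = perfect_1989 + 10 - 4      # resistance restored (+10), slope bonus dropped (-4)
perfect_1993 = perfect_1991 + 2           # two walking bonus points
print(perfect_1989, perfect_1991, perfect_1993)  # 66 72 74
```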
This past year, 1995, saw further changes in The List. In an effort to correct the trend of new, expensive, highly advertised courses replacing some of the quiet old established ones on The List, the tradition category value was increased from 2 back to 10 points. The Digest rating board administered these points based on historical research into a candidate course; the raters, who now numbered 535, did not rate tradition. A course was now required to be at least five years old and to have been visited by at least 24 panelists (roughly 5%) before being considered for The List. All rating values except conditioning are currently maintained for 10 years, or until a panelist revisits a course and revises his values; conditioning values are considered too transitory and are purged from the averaging every two years. It is interesting to note that the resultant 1995 List contains just under half of the courses that appeared on the original 1966 List.