Could someone please explain how Essex County jumped 18 spots? I'm under the impression the course has changed little since Ross passed away?
First, don't forget that these 1-100 rankings obscure how close together the underlying numbers are. Essex is maybe only rated a fraction of a point higher than it was before, as a result of some panelists going to see it for the first time. But that's the difference between 91st and 73rd, because the courses in the bottom half of the list are bunched very closely together.
We did some work there years ago on bunkers and grassing lines, and my former associate Bruce Hepner has been back to clear away a bunch more trees and open up vistas.
But the change in its ranking is more of a change of perspective. The pecking order of Brookline - Salem - Winchester was established by rankings from 30+ years ago, and panelists knew those were supposed to be the top three, so it stayed that way for a long time. I suggested Darius Oliver go see Essex County and Myopia when he was making his big tour of America, and he rated Essex County as Ross's best course in America . . . so now there are a bunch of panelists who are freed up to vote that way, or maybe think they should.
Myopia's rise [76th from nowhere] is maybe even more impressive, but Essex County has leapfrogged a lot of sacred cow Ross courses from the old lists.
Could someone familiar with how Digest rates courses let me know how meaningful the small differences between scores actually are? For example, more than 30 courses have scores between 60 and 61, and, when you take the second 100 into account, even more have scores between 59 and 60. Are these differences truly statistically significant, or is it possible that one or two raters could make a difference in a course's aggregate rating?
Truly interested in understanding this. Thanks.
I HATE OUR SOFTWARE, as every post seems to always get jacked up and I have to edit after the fact... anyways......
Golf Digest is trying to double the number of raters to almost 1,900 by the year 2020, and I think they have about 1,100 now. A course presently has to have 45 ratings from their raters in the last 5 years to qualify, and they want to raise that to 70, which is good statistically: the more data points you have, the closer the average gets to a normal distribution around the true value, which mitigates the outliers. 45 is really only 9 a year over a 5-year period, and I know many courses carefully manage how many raters they let on their course, including offering slots during their best-conditioned months.
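The statistical point above is easy to demonstrate: the spread of a panel's average score shrinks roughly as 1/sqrt(n) as the minimum number of ratings rises from 45 to 70. Here is a minimal simulation sketch; the "true quality" of 6.0 and the rater noise of 1.0 point are invented numbers for illustration, not anything Golf Digest publishes.

```python
import random
import statistics

random.seed(1)

def spread_of_panel_means(n_raters, trials=2000):
    """Simulate many panels of n_raters and return the spread (stdev)
    of the panel averages. Each rater scores the course as its assumed
    true quality (6.0) plus random noise with a standard deviation of 1.0."""
    means = [
        statistics.mean(random.gauss(6.0, 1.0) for _ in range(n_raters))
        for _ in range(trials)
    ]
    return statistics.stdev(means)

# Theory says the spread of the average is about 1/sqrt(n):
# roughly 0.15 points at 45 ratings vs roughly 0.12 at 70.
print(spread_of_panel_means(45))
print(spread_of_panel_means(70))
```

So moving from 45 to 70 ratings narrows the noise band around a course's average by about 20 percent, which matters when dozens of courses sit within a single point of each other.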
GD has 7 categories, each contributing 12.5% to the total, except Shot Values, which is doubled to 25%.
1. SHOT VALUES-How well do the holes pose a variety of risks and rewards and equally test length, accuracy and finesse?
2. RESISTANCE TO SCORING-How difficult, while still being fair, is the course for a scratch player from the back tees?
3. DESIGN VARIETY-How varied are the holes in differing lengths, configurations, hazard placements, green shapes and green contours?
4. MEMORABILITY-How well do the design features provide individuality to each hole yet a collective continuity to the entire 18?
5. AESTHETICS-How well do the scenic values of the course add to the pleasure of a round?
6. CONDITIONING-How firm, fast and rolling were the fairways, and how firm yet receptive were the greens on the day you played the course?
7. AMBIENCE-How well does the overall feel and atmosphere of the course reflect or uphold the traditional values of the game?
Each is scored out of 10, and each category is averaged across panelists for the final score. You can see this year's category breakdown here:
https://www.golfdigest.com/story/how-our-panel-ranks-the-courses

You also have to take a look at their "raters": the qualification criteria are basically to have a 5.0 handicap index and be willing to pay $1,000 up front and $250 a year in dues. They must submit a minimum of 12 ratings a year (one rater told me), but others have told me 24.
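Putting the category weights above together, the aggregation can be sketched like this. This is a minimal sketch assuming a simple mean across panelists (the post doesn't say exactly how panelist scores are combined); the category keys and the sample scores are invented for illustration.

```python
# Seven categories scored 0-10; Shot Values weighted double
# (25% vs 12.5%), giving a total on a 0-80 scale, which matches
# the 59-61 range of scores mentioned earlier in the thread.
CATEGORIES = ["shot_values", "resistance", "variety",
              "memorability", "aesthetics", "conditioning", "ambience"]

def course_total(ratings):
    """ratings: list of dicts, one per panelist, category -> 0-10 score."""
    # average each category across panelists (assumed simple mean)
    avg = {c: sum(r[c] for r in ratings) / len(ratings) for c in CATEGORIES}
    # Shot Values counts double; the other six count once
    return 2 * avg["shot_values"] + sum(avg[c] for c in CATEGORIES[1:])

# Hypothetical two-panelist example:
panel = [
    {"shot_values": 8, "resistance": 7, "variety": 8,
     "memorability": 7, "aesthetics": 8, "conditioning": 7, "ambience": 8},
    {"shot_values": 7, "resistance": 7, "variety": 7,
     "memorability": 8, "aesthetics": 7, "conditioning": 8, "ambience": 7},
]
print(course_total(panel))  # -> 59.5
```

Note how little it takes to move the total: on an 80-point scale where 30+ courses sit inside one point, a single panelist scoring a course a point higher in a couple of categories can shift a small panel's average enough to reorder the list.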
Statistically, upping the minimum to 70 will improve the accuracy of the field on their criteria; however, the criteria themselves are hotly debated on this site and others. Increasing their pool of raters is also necessary, because otherwise some courses won't get the 70 scores needed every 5 years to qualify.