Hi Tom,
As I am only a panelist for GD, I am not 100% certain how other publications come up with their numbers.
GW disconnects the individual category numbers from the overall score...so why have categories at all?
Is there a minimum number of evaluations required to make another publication's T100 list? If not, are they basing evaluations of traditionally low-visit courses on only three ballots, or on hearsay? If there were some secret to getting a good number of panelists to every single worthwhile venue, wouldn't it be widely known? I'm guessing that if there were, it would be done!
My point (oft cited on GCA in the past) is that every system is flawed, and all we can do is try to get guys to see the worthwhile courses and do the best we can. As I said above, you can argue that the category criteria eventually produce the best possible list for the GCAer, but GD is very transparent about how its rankings are derived...that works better for my brain. I don't bow down to GD's list either....everyone interested in this stuff has a personal list.
As for the Erin Hills issue, I agree with you that the high road seems to have disappeared. Sounds like a question for JT and RW. Confusion aside, I have played EH and find it to be at least top 50 public in the US, maybe even scratching at the back door of the T100.
Personally, I think the rankings are fun because they get us all talking about golf courses...that alone makes the exercise worthwhile.