Do they explain their methodology anywhere? Does every reviewer get a vote?
I've always liked their site and respected their rankings, but I get confused by their "ball" ratings. I've seen courses with 6 balls end up with a lower rank than something with 5 balls.
The balls are ratings their readers give out, and their readers' opinions carry very little weight in the final voting, I think. They have a few panelists of their own, and I believe those guys get a heavy thumb on the scale, but the same does not apply to just anyone who posts an online review.
I believe their methodology starts with inputting all the other published rankings first. It's telling that most of the "big moves" Jeff and John mentioned in fact mirror the recent GOLF Magazine ranking. You can believe that two independent panels thought the same in all of those cases, if you want . . . but that's not what Occam's Razor suggests.
As to them adding Deal to the list, even though it's not on most of the others: you can do that if your own results for the U.K. show it clearly beating several of the other courses that got into the top 100 via outside opinion.
On the other hand, they stuck a couple of new courses into their ranking for which they have no data from any other publication. Those have to be based on just a handful of votes. I had asked Ran about the voting results for one of those courses, and they clearly didn't get the same votes on it that the GOLF Magazine panelists cast!
As for giving this ranking more credibility than others because it comes out of the UK . . . it's more European-centric, for sure, and that will appeal to some, but they don't have anywhere near the coverage of classic U.S. courses that others do. But then, consider that the list really comes off the internet, and think about what that does for its credibility.