In essence it's not nuclear physics to make good rankings; it's just a heck of a lot of work collecting the data from many respondents.
The method to use is to let individual golfers make a ranked list of courses from best to worst (they need to have played or walked the course). Better still is to use conjoint analysis, which forces you to make many successive choices between two courses and also allows other factors to be included (e.g. would you rather play Valderrama for 300 euros or De Pan for 100 euros, or would you rather play Swinley Forest for 120 pounds or New Zealand for 80 pounds?). By going through such a choice process an algorithm establishes not only the order of the courses but also the utility each course carries. Either method produces, for one person, a list of courses ranked from best to worst. For some people this list would be short, say 40 courses; for others it would be huge, say 1,000 courses.
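Just to make the idea concrete, here is a rough sketch (my own illustration, not a description of any existing site's system) of how utilities could be estimated from a respondent's pairwise choices, using a simple Bradley-Terry model fitted with the standard MM iteration. The choice data is invented, the price dimension is left out, and in a full conjoint setup you would add price and other attributes to the model.

```python
# Minimal sketch: estimate per-course "utilities" for ONE respondent from
# their forced pairwise choices, using a Bradley-Terry model (MM algorithm).
from collections import defaultdict

# Hypothetical choice data: (preferred course, rejected course) per question.
choices = [
    ("Swinley Forest", "New Zealand"),
    ("De Pan", "Valderrama"),
    ("Swinley Forest", "De Pan"),
    ("New Zealand", "Valderrama"),
    ("De Pan", "New Zealand"),
]

courses = sorted({c for pair in choices for c in pair})
# Small pseudo-count so a course that never "wins" keeps a non-zero utility.
wins = {c: 0.5 for c in courses}
matches = defaultdict(int)  # how often each ordered pair was compared
for w, l in choices:
    wins[w] += 1
    matches[(w, l)] += 1
    matches[(l, w)] += 1

# Bradley-Terry MM update: u_i <- wins_i / sum_j n_ij / (u_i + u_j)
utility = {c: 1.0 for c in courses}
for _ in range(100):
    new = {}
    for i in courses:
        denom = sum(matches[(i, j)] / (utility[i] + utility[j])
                    for j in courses if j != i)
        new[i] = wins[i] / denom if denom > 0 else utility[i]
    total = sum(new.values())
    utility = {c: u / total for c, u in new.items()}  # normalise to sum to 1

# This respondent's ranked list, best to worst, with the estimated utilities.
for course, u in sorted(utility.items(), key=lambda kv: -kv[1]):
    print(f"{course}: {u:.3f}")
```

The point of the utilities is that they give the ranking a scale: you can see not just that one course is preferred over another, but by roughly how much.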
Then you would need some demographic data from each person: things like gender, age, income, golf handicap, whether they are a GCA member, a pro, an architect or Joe Blow, address, and number of courses played. Using this data would, for instance, allow you to rank courses in GBI using GCA members with handicaps of 10-15 and average income as the control group (as long as the group is large enough to be statistically significant). You could do the same for Joe Blow, handicap 15-30, low income.
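The segmentation step could look something like the sketch below (again my own illustration; the field names and numbers are made up). It pools the per-respondent utilities from the previous step, filters on demographics, and averages per course, with a minimum-respondent threshold standing in for "large enough to be statistically significant".

```python
# Sketch: build a segment ranking by filtering respondents on demographics
# and averaging their per-course utilities.
from statistics import mean

# Each respondent: a demographic profile plus the utilities estimated above.
respondents = [
    {"group": "GCA member", "handicap": 12, "country": "GBI",
     "utilities": {"Swinley Forest": 0.42, "New Zealand": 0.31, "De Pan": 0.27}},
    {"group": "GCA member", "handicap": 14, "country": "GBI",
     "utilities": {"Swinley Forest": 0.38, "New Zealand": 0.35, "De Pan": 0.27}},
    {"group": "Joe Blow", "handicap": 22, "country": "GBI",
     "utilities": {"New Zealand": 0.55, "Swinley Forest": 0.45}},
]

def segment_ranking(respondents, predicate, min_votes=2):
    """Average utilities over one demographic segment.

    Only courses rated by at least `min_votes` respondents in the segment
    are kept, as a crude stand-in for statistical significance.
    """
    pool = {}
    for r in respondents:
        if not predicate(r):
            continue
        for course, u in r["utilities"].items():
            pool.setdefault(course, []).append(u)
    scores = {c: mean(us) for c, us in pool.items() if len(us) >= min_votes}
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Example segment: GCA members with handicap 10-15.
print(segment_ranking(
    respondents,
    lambda r: r["group"] == "GCA member" and 10 <= r["handicap"] <= 15,
))
```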
The key is to get as many people as possible to rank the courses they have played from best to worst, preferably through conjoint analysis with some utility like money to give the ranking a scale, and then to segment the overall pool by demographics.
This system should be relatively easy to set up for a website like Top 100, and would no doubt also increase the stickiness of the site. Approach a couple of hundred people around the world and get them to do the rankings, then open the system up to all readers of your website. That way it would become a living ranking that changes over time.
As far as I know, the only site to have done something like this is GolfWeekly in the Netherlands, with a panel of 150 voters.
Rankings by wine experts like Robert Parker are comparable to Tom Doak and the Confidential Guide: they work because we trust the judgement of the star, the expert. But that falls apart with sites like Top 100, because we are not sure their raters have the same standing/credibility (at least in the perception of the public)...