Welcome to the Golf Club Atlas Discussion Group!

Tom Huckaby

Re: Golfweek 2010 lists
« Reply #50 on: March 12, 2010, 04:33:37 PM »
Insert large sigh.

Jud, people have done a composite list for GW rankings many times, and each time we all need to be reminded that it's invalid, because the points aren't the same on each list.  That is, points are compiled on each list only RELATIVE TO THE OTHER COURSES ON THE LIST... it's not an overall scale... so an 8.2 on one list might be better than a 9.2 on the other.  The many GW raters who participate here can likely flesh this out more if you wish.

But the bottom line remains that while what you just posted might be interesting, in reality it's not, because the good folks at GW still refuse to compare golf courses to golf courses, preferring to set this arbitrary 1960 line so that more courses may get their moment of glory.

Sorry, should have posted something to this effect before you went to all that hard work.  I just figured one of the GW raters might do so.

 ;D

Jud_T

  • Karma: +0/-0
Re: Golfweek 2010 lists
« Reply #51 on: March 12, 2010, 04:34:36 PM »
 :'(
Golf is a game. We play it. Somewhere along the way we took the fun out of it and charged a premium to be punished.- - Ron Sirak

Mac Plumart

Re: Golfweek 2010 lists
« Reply #52 on: March 12, 2010, 04:54:24 PM »
Jud...

I think it would be neat to throw in the latest GW GB&I ratings to your composite list to get a sense for where courses rank in a more worldly sense.

I, for one, support your composite list...and I don't care at all if people protest.  It is an interesting exercise.
Sportsman/Adventure loving golfer.

Jerry Kluger

Re: Golfweek 2010 lists
« Reply #53 on: March 12, 2010, 04:57:26 PM »
Tom: Your statement about GW raters is confusing to me.  They rate based upon certain particular factors and then give an overall rating, but then you say that the overall rating is based upon the rating of other courses.  How can they do a comparative rating if they've never seen the other courses, especially those that are rated higher?  The next question would be what percentage of the raters have actually seen and played the top courses that have very limited access, and if they haven't, then what value is their rating?

Tom Huckaby

Re: Golfweek 2010 lists
« Reply #54 on: March 12, 2010, 05:10:23 PM »
Tom: Your statement about GW raters is confusing to me.  They rate based upon certain particular factors and then give an overall rating, but then you say that the overall rating is based upon the rating of other courses.  How can they do a comparative rating if they've never seen the other courses, especially those that are rated higher?  The next question would be what percentage of the raters have actually seen and played the top courses that have very limited access, and if they haven't, then what value is their rating?

I have no idea.  It has just been reported many times that the numbers they give are relative, and only relative Modern to Modern, Classic to Classic.

Someone who does this needs to respond.  I don't care to be honest.  I just have also wished for a composite list many times before, and always had my wishes dashed as well.

Jonathan Cummings

Re: Golfweek 2010 lists
« Reply #55 on: March 12, 2010, 05:25:33 PM »
As for Rock Creek - GW was unlikely to have the numbers needed.  Klein is aggressively improving the statistical basis of his lists - a good thing.  I assure you RC will be on next year's list - probably top 50.  JC

Pete Lavallee

Re: Golfweek 2010 lists
« Reply #56 on: March 12, 2010, 05:33:02 PM »
JC,

Can you enlighten us on whether it is possible to compare the Classic and Modern lists; if the answer is no, why not? It seems that even former GW raters are confused.
"...one inoculated with the virus must swing a golf-club or perish."  Robert Hunter

Tom Huckaby

Re: Golfweek 2010 lists
« Reply #57 on: March 12, 2010, 05:35:34 PM »
JC,

Can you enlighten us on whether it is possible to compare the Classic and Modern lists; if the answer is no, why not? It seems that even former GW raters are confused.

LOL
Who's the former GW rater?  Certainly not me.

BTW I have no doubt that someone could tweak the math and make a composite list.  All I am saying is that as I understand their methods, the numbers put out as they are NOW cannot be used.  The math tweaking would have to happen first.  Jonathan Cummings is a FINE candidate to do such tweaking.  If he does, the results would indeed be very interesting...




jim_lewis

Re: Golfweek 2010 lists
« Reply #58 on: March 12, 2010, 06:22:04 PM »
Pete:

Maybe confusion is the reason said former rater is a former rater.
"Crusty"  Jim
Freelance Curmudgeon

Andy Troeger

Re: Golfweek 2010 lists
« Reply #59 on: March 12, 2010, 07:34:21 PM »
The thing I don't really understand regarding the Classic/Modern lists not being comparable...

When you look at the state lists, in two cases there are states with public courses that make either the classic or modern listing. In California you have Pebble, Spyglass, and Pasatiempo. In South Carolina, you have Caledonia and Dunes Club. In both cases, the order of the state listing would be in the order of the scores presented in the Top 100 lists. Of course, this could be a coincidence, but I have a hunch that the courses are just compared by score (as Jud did with his composite listing).

In any case, if you can compare classic and modern courses for the state lists, then logically you can compare them (somehow) for the national ones. Obviously it would be interesting to hear how this works from someone involved in the process, since the rest of us can really only speculate.

Adam Clayman

Re: Golfweek 2010 lists
« Reply #60 on: March 12, 2010, 07:39:25 PM »
I think it is impossible to have a composite list because the rater is thinking about the different categories when they vote. A completely separate vote would need to be asked of the rater to quantify how they feel about where specific courses should be ranked.

What no one has mentioned (for the newbies who are less familiar with the origins) is the justification for the separate lists. Huck was just stretching his Digest muscles when he poked fun at GW above, but the reason for the separate lists is more akin to comparing apples to apples and oranges to oranges. The building equipment, the land available, and other factors such as WWII make 1960 a sensible demarcation point. The only course I know of that might not fit into the mix is Desert Forest, since it was built in 1962 but with techniques reminiscent of the pre-war era, using mules and carts to sculpt the desert.

BTW Huck, You have a composite list. Your own.
"It's unbelievable how much you don't know about the game you've been playing your whole life." - Mickey Mantle

Mac Plumart

Re: Golfweek 2010 lists
« Reply #61 on: March 12, 2010, 08:24:15 PM »
Adam...there is no doubt that you are correct.  That is indeed the rationale behind the lists.  But Andy's comment on logic is correct as well.  If the lists are combined for other GW lists, it is only logical to combine them for a composite list.

Nevertheless, and regardless of who is right or wrong, the composite list makes for interesting discussion.  And isn't that why most of us like lists?  Discussion, debate, etc.

Sportsman/Adventure loving golfer.

Adam Clayman

Re: Golfweek 2010 lists
« Reply #62 on: March 12, 2010, 11:34:08 PM »
Mac, I have no problem discussing any list. The problem with Jud's is it's based on the numeric values published. They do not correlate as Jud  intends. That was my only point.

I do not know if the lists are combined for the state lists, do you?

Having just looked at the state courses-you-can-play list, the only example that jumped off the page where discussion could be fruitful was Wisconsin. Placing Lawsonia (57th Classic) 3rd in the state behind WS (#3 Modern) and BWR (#39 Modern) doesn't compute in my opinion. Maybe it illustrates a bias I have towards Classics, though I don't think I have one; I just feel that Lawsonia is a better course. And I'm a huge fan of BWR. Can't comment on WS, have not been there. So, it's entirely possible for the 7.34 on the Modern to be inferior to the 7.19 on the Classic. Once again, apples to oranges numerically.
"It's unbelievable how much you don't know about the game you've been playing your whole life." - Mickey Mantle

Tim Bert

Re: Golfweek 2010 lists
« Reply #63 on: March 12, 2010, 11:43:01 PM »
I've heard the "can't mix classic and modern" comments before, but I do have one hypothetical question for the GW raters as I try to wrap my arms around the (lack of) ability to compare.

If you had two courses that were exactly the same in every way, except that one was designed in 1920 and one was designed in 1980, would those two courses get different scores from you just because one was modern and one was classic?

Jim Nugent

Re: Golfweek 2010 lists
« Reply #64 on: March 13, 2010, 01:49:29 AM »
As I understand it, the GW scores we see are strictly rankings.  Or really an average of how the GW raters rank courses in each category.  They are not scores against an absolute scale.  An analogy is how much money a pro golfer makes in any year, vs. where he stands on the money winnings list.  GW shows us where the courses stand.  Not how much actual money they made.

GW takes all the courses in, say, the modern category.  Raters end up giving each course they play a score from one to ten.  While I don't have the numbers exact, a ten means the ranker considers that course one of the best 5 modern courses in the country.  (Instead of 5, that number may be 3, or 10, but I don't think it's anything greater than that.)  A nine means it ranks somewhere between 5 and 25 (or similar numbers).  An 8 means it's between 26 and 75. 

Again, I don't know the precise numbers.  But I'm pretty sure they are close to the ones I cited.

Then GW averages all the scores from its raters.  Maybe they smooth the numbers out with some statistical magic.  From Jonathan's comment, sad to say, that seems likely.  But the scores you see are that average, statistically massaged or not.

Which is why (if all this is correct!) Huck is right.  When Sand Hills gets a 10, it only means that rater thinks Sand Hills is among the top 5 (or whatever) modern courses in the nation.  The same rater may rank Sand Hills 100 among all courses.  He may rank it first.  We have no way of knowing from the numbers GW supplies us.
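If that description is right, the scheme can be sketched like this (the band cutoffs below are my guesses from the numbers above, not Golfweek's actual ones):

```python
# Sketch of the rank-band scoring described above.
# Cutoffs are illustrative guesses (top 5 -> 10, ranks 6-25 -> 9,
# ranks 26-75 -> 8), NOT Golfweek's actual scheme.
BANDS = [(5, 10), (25, 9), (75, 8)]

def band_score(rank_within_category):
    """Published score for a rater's rank of a course WITHIN its own
    category (Modern or Classic) -- not against all courses."""
    for worst_rank, score in BANDS:
        if rank_within_category <= worst_rank:
            return score
    return 7  # everything deeper than the listed bands (also a guess)

# A "10" Modern and a "10" Classic are both just "top 5 in category";
# the published numbers say nothing about how the two compare.
print(band_score(3), band_score(3))  # prints "10 10"
```

Which is exactly why averaging the published numbers across the two lists doesn't produce a meaningful composite.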

Andy Troeger: the state lists only compare public access courses.  No private courses.  And almost no public courses are among the top 100 classic.  Only a few, by my quick count.  So no real basis to compare.   

Tim Bert, good question IMO.  A few years ago I asked Brad Klein how they, as individual raters, average their raw scores.  It was not a straight arithmetic average.  They could weight the categories differently.  And IIRC, each rater could decide how to weight the categories.  So one rater might give routing a bigger weight.  Another might give conditioning. 
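That per-rater weighting might look something like this (a sketch only; the category names and weights here are invented for illustration, not Golfweek's):

```python
def rater_score(category_scores, weights):
    """One rater's overall number: a weighted average of category scores.
    Each rater may choose their own weights, so identical raw scores
    can produce different overall numbers for different raters."""
    total = sum(weights.values())
    return sum(category_scores[c] * w for c, w in weights.items()) / total

raw = {"routing": 8, "conditioning": 6, "variety": 7}
routing_fan = rater_score(raw, {"routing": 3, "conditioning": 1, "variety": 1})
turf_fan = rater_score(raw, {"routing": 1, "conditioning": 3, "variety": 1})
print(routing_fan, turf_fan)  # 7.4 vs 6.6 from the same raw scores
```

So even before the relative-ranking issue, two raters seeing the same course can legitimately submit different overall numbers.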

I would like to see those "raw" scores.  Which are different from the scores we see in GW.   

Mac Plumart

Re: Golfweek 2010 lists
« Reply #65 on: March 13, 2010, 07:29:11 AM »
In relation to Jonathan's comments and Rock Creek, I have to believe that the statistical improvements he speaks of relate to the robustness of the data once ample observations are collected.  If you run the numbers and look at the r-squared, t-stat, etc., you can validate that a certain number of observations is needed to derive accurate outputs.  BUT you can think of it in basic terms: if only 2 people play a course, you get a much less reliable picture than if 30 played it, than if 50 played it, than if 1,000 played it.  I think that is what they are talking about.

Sportsman/Adventure loving golfer.

Jonathan Cummings

Re: Golfweek 2010 lists
« Reply #66 on: March 13, 2010, 08:10:06 AM »
From Jonathan's comment, sad to say, that seems likely.  But the scores you see are that average, statistically massaged or not.

No no.  I was simply saying more votes are better and Klein is striving to get more votes in his database to improve the stability of his averages. 

You've heard during political campaigns: "1,000 adults were asked during an exit poll... sampling error is +/- 3%."  That 3% is calculated simply as 1 over the square root of the number sampled (1,000).  The more numbers/votes, the lower the sampling error.
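As a quick check of that rule of thumb:

```python
import math

def sampling_error(n):
    """Rough margin of error for a sample of size n: 1 / sqrt(n)."""
    return 1 / math.sqrt(n)

print(round(100 * sampling_error(1000), 1))  # -> 3.2, the +/- 3% for a 1,000-person poll
print(round(100 * sampling_error(100), 1))   # -> 10.0, small samples are much noisier
```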

I doubt GW has the wherewithal to run anything other than overall averages when computing their rankings.  To identify and correct for bias, skew, non-normal distributions, etc. is not trivial, and I doubt that Brad and his folks go there.  It was easy for me to do it with the gca list because it was so small and manageable.  Brad is likely playing with a database approaching (maybe even surpassing) 100,000 votes.

JC

Jim Nugent

Re: Golfweek 2010 lists
« Reply #67 on: March 13, 2010, 08:16:09 AM »
I certainly agree each course needs to get enough votes to make the result significant.  Do you know the minimum votes required to make GW's rankings?


Andy Troeger

Re: Golfweek 2010 lists
« Reply #68 on: March 13, 2010, 08:33:39 AM »

Andy Troeger: the state lists only compare public access courses.  No private courses.  And almost no public courses are among the top 100 classic.  Only a few, by my quick count.  So no real basis to compare.   
  

Jim,
I understand that, but there are three instances (Adam pointed out one I missed) where a state has public-access courses on both the top 100 classic and modern lists. Unless those specific courses get double ballots (one for the classic/modern distinction and one for the public state distinction regardless of age), the ballots would seem to be merged somehow.

Essentially, though, what I'm saying is that if you can compare classic and modern for public-access courses in each state (regardless of whether they are top 100 candidates), why can't you compare the top of each group? Heck, just take the French Lick Resort in Indiana, which has the Ross and Dye courses. Both make the top three in Indiana, and a comparison is being made. How is that done?

Carl Nichols

Re: Golfweek 2010 lists
« Reply #69 on: March 13, 2010, 09:19:39 AM »
Andy-
Are you sure there aren't three voting categories -- classic, modern, and public (whether classic or modern)? That is, maybe they're not using the classic and modern votes to create the public list, but are just looking at a separate category of public only votes?

Jim Nugent

Re: Golfweek 2010 lists
« Reply #70 on: March 13, 2010, 09:21:33 AM »
Good questions.  We need someone who really knows to explain how GW does its scoring.  Paging Brad Klein...

mike_malone

Re: Golfweek 2010 lists
« Reply #71 on: March 13, 2010, 12:52:29 PM »
If a course moved significantly up or down, should they assume it was changes that happened at their course in the recent past?  Isn't there a lag time in these rankings?
AKA Mayday

Andy Troeger

Re: Golfweek 2010 lists
« Reply #72 on: March 13, 2010, 01:34:35 PM »
Andy-
Are you sure there aren't three voting categories -- classic, modern, and public (whether classic or modern)? That is, maybe they're not using the classic and modern votes to create the public list, but are just looking at a separate category of public only votes?

Carl,
I'm no expert on the GolfWeek methods--those are my questions though. Is there a separate category for the publics--essentially do raters need to turn in two separate scores for them? If so, what are the differences?

George Freeman

Re: Golfweek 2010 lists
« Reply #73 on: March 13, 2010, 02:01:16 PM »
Andy-
Are you sure there aren't three voting categories -- classic, modern, and public (whether classic or modern)? That is, maybe they're not using the classic and modern votes to create the public list, but are just looking at a separate category of public only votes?

Carl,
I'm no expert on the GolfWeek methods--those are my questions though. Is there a separate category for the publics--essentially do raters need to turn in two separate scores for them? If so, what are the differences?

Andy,

GW does not differentiate between public and private, only classic and modern.
Mayhugh is my hero!!

"I love creating great golf courses.  I love shaping earth...it's a canvas." - Donald J. Trump

Carl Nichols

Re: Golfweek 2010 lists
« Reply #74 on: March 13, 2010, 02:05:08 PM »
George:
GW ranks the top public courses (and only public courses) in each State.  See http://www.golfweek.com/news/2010/mar/11/2010-golfweeks-best-courses-you-can-play.  Andy's point is that in certain States, the lists contain both modern and classic courses, so GW has figured out a way to rank modern and classic courses in relation to each other, at least for public courses.