Mac Plumart

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #100 on: December 29, 2012, 01:48:30 PM »
Mark...

Almost any variance metric will capture what I'm interested in and that is, well, variance.

And I think you are correct on simply using rating and slope for resistance to scoring.
Sportsman/Adventure loving golfer.

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #101 on: December 29, 2012, 01:49:14 PM »


I've seen it take years for good golfers to get comfortable with their club selection off the tee. I hate it when my opponents leave their driver in the bag.

I totally agree about it taking a long time to get comfortable. The first two times I played it, I found myself trying to steer everything for weeks afterward. It's definitely the toughest course I have ever played.

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #102 on: December 29, 2012, 01:58:59 PM »
Are there any courses found on the GD and Golf lists that are not found among the 400 courses listed by Golfweek? I suspect that course #50 on any list is probably closer to course #400 than to the Pine Valleys, Shinnecocks, or Cypresses. I like that they rate that many courses. I'm encouraged that GD has added the second 100, and I'm eager to see it. I assume it will appear when the magazine listings come out.

The state rankings are helpful to me. I'm glad they have these. I just don't pay any attention to the actual ranking order.
« Last Edit: December 29, 2012, 02:08:18 PM by Nigel Islam »

Mark Bourgeois

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #103 on: December 29, 2012, 02:00:30 PM »
Mac...

I get that the two of you are using "beta" in a generic sense for sigma and that's a-ok. But when you operationalize the concept I think you'll end up having to use the coefficient of variation (the standard deviation divided by the mean). I think.

Anyway, when will Kevin and you have something to show us? Do you need a reader?
Charlotte. Daniel. Olivia. Josephine. Ana. Dylan. Madeleine. Catherine. Chase. Jesse. James. Grace. Emilie. Jack. Noah. Caroline. Jessica. Benjamin. Avielle. Allison.

Kevin Lynch

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #104 on: December 29, 2012, 02:20:32 PM »

Mac and Kevin, if I understand the concept correctly, don't you mean sigma not beta? (Actually you'd probably want to measure the coefficient of variation to account for the varying means across courses. Sorry, I'm a pedant and can't help it.)

Mark,

Funny you should chime in.  I recalled your past statistical analyses of Masters scoring, and thought this would be right up your alley.  I was about to send you a PM to see if you'd ever seen or performed some type of volatility analysis.

Since I have an accounting background, I generally think about beta for measuring volatility (e.g. for valuing stock options).  My understanding is that beta would measure how much the rankings (prices) for a course fluctuate compared to the average fluctuation, but it may not be the most appropriate measure here, since there isn't a "market fluctuation."  However, I defer to your statistical expertise in determining the most appropriate measure.
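
(To make the distinction concrete with purely made-up numbers, since none of us has the real ballot data: sigma needs only a course's own scores, while beta needs some "market" series to measure against.)

[code]
import statistics

# Hypothetical ratings for one course over four ranking cycles,
# plus an invented all-courses average for the same cycles.
course = [8.2, 7.9, 8.5, 8.0]
market = [7.0, 7.1, 7.3, 7.2]

# Sigma: the course's own volatility.
sigma = statistics.stdev(course)

# Beta: covariance with the "market," scaled by the market's variance.
mc, mm = statistics.mean(course), statistics.mean(market)
cov = sum((c - mc) * (m - mm) for c, m in zip(course, market)) / (len(course) - 1)
beta = cov / statistics.variance(market)

print(f"sigma = {sigma:.3f}, beta = {beta:.3f}")
[/code]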

I'm curious which courses have the widest range of ratings and/or have the most passionate extremes.  Put another way, courses that have the least amount of "moderate" or "centralized" opinions (As I type that, I guess sigma is most accurate).

Bill McKinley

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #105 on: December 29, 2012, 02:23:17 PM »
Tom Doak - I've hosted at least a dozen GD panelists at my home club, and not once has one of them played from the back tees OR legitimately been a 5 hdcp or lower.  Granted, it's a small sample size, but the ability of panelists to judge "resistance to scoring as a scratch golfer from the back tees" is just as lacking as the reasoning for even having that category in the first place.

And if you were going to include "resistance to scoring" why not use an objective, widely respected measure (USGA rating) instead of making up your own, subjective, unclear metric?

Why would anyone do that?


To make matters worse, resistance to scoring considers such a small number of players.  Take the definition, "how difficult is the course for the scratch player."  Apparently that doesn't include tour players.  We hosted the Sr. PGA at Canterbury in 2009; the winning score was -6, with only 3 players shooting under par for the week.  Conversely, the Sr. US Open was at Crooked Stick, and the winning score was -20, with 31 players shooting under par for the week.  So one might think that Canterbury's resistance to scoring numbers would be much higher than Crooked Stick's, but it was actually quite the opposite: Canterbury was a 7.35 vs. Crooked Stick at 7.78.  It may not seem like much, but had Canterbury been given a 7.80 rating, we would have been ranked #87 as opposed to just outside the list.
2016 Highlights:  Streamsong Blue (3/17); Streamsong Red (3/17); Charles River Club (5/16); The Country Club - Brookline (5/17); Myopia Hunt Club (5/17); Fishers Island Club (5/18); Aronomink GC (10/16); Pine Valley GC (10/17); Somerset Hills CC (10/18)

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #106 on: December 29, 2012, 02:28:32 PM »
I'm curious which courses have the widest range of ratings and/or have the most passionate extremes.  Put another way, courses that have the least amount of "moderate" or "centralized" opinions (As I type that, I guess sigma is most accurate).


I remember reading something on this site about Canyata being "the most polarizing" course in the ratings.
It seems that GD has a lot of those courses on its list. This is probably my biggest complaint about their rankings.

Kevin Lynch

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #107 on: December 29, 2012, 02:31:53 PM »
Mac...

I get that the two of you are using "beta" in a generic sense for sigma and that's a-ok. But when you operationalize the concept I think you'll end up having to use the coefficient of variation (the standard deviation divided by the mean). I think.

Anyway, when will Kevin and you have something to show us? Do you need a reader?

Mark,

You are correct - we were being more generic with the terms.

I'll have something to show you whenever I can get Golfweek or Golf Digest to give me their raw data, which I don't envision happening any time soon.  Sorry, it's just a theoretical exercise on my part for now, but I would love to see if our impressions of the "most polarizing" courses match up to actual numbers (if I can ever get the data).


Nigel,

Perhaps Canyata could be my first case study (after Tobacco Road, of course).

Joe_Tucholski

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #108 on: December 29, 2012, 02:55:31 PM »
On the topic of statistical analysis, the Golf Digest criteria indicate that "outlier" scores are thrown out before the list is published.  It's also interesting that Dean Knuth does their analysis, as he's often credited with creating the slope/rating system.
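
GD doesn't publish how it defines an outlier, but a typical trimming rule would be something like dropping scores more than two standard deviations from the mean. A rough sketch with invented ballots:

[code]
import statistics

def trim_outliers(scores, k=2.0):
    """Drop scores more than k standard deviations from the mean.
    One common trimming rule; GD's actual method is unpublished."""
    mu = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [s for s in scores if abs(s - mu) <= k * sd]

ballots = [7.5, 8.0, 7.8, 8.2, 7.9, 3.0]        # one rater hated it
print(statistics.mean(ballots))                 # ~7.07, dragged down by the 3.0
print(statistics.mean(trim_outliers(ballots)))  # ~7.88 once the 3.0 is trimmed
[/code]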

Mac Plumart

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #109 on: December 29, 2012, 03:20:45 PM »
Are there any courses found on the GD and Golf lists that are not found among the 400 courses listed by Golfweek? I suspect that course #50 on any list is probably closer to course #400 than to the Pine Valleys, Shinnecocks, or Cypresses. I like that they rate that many courses. I'm encouraged that GD has added the second 100, and I'm eager to see it. I assume it will appear when the magazine listings come out.

The state rankings are helpful to me. I'm glad they have these. I just don't pay any attention to the actual ranking order.

Nigel, I haven't done the precise work you are looking for.  But I do have a list of Top 100 courses that are rated as such on one, and only one, Top 100 list.

http://mrpgolf.com/controversial.html

I also have the "Unanimous Gems", which are courses rated Top 100 by all the lists...and a few other variations on that theme.

http://mrpgolf.com/gems.html


FYI, these are not yet updated for the brand new lists.  I hope to have that done by tonight.
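
The logic behind both pages is just set comparisons. A toy version with invented list memberships (not my actual data):

[code]
# Toy illustration of the two comparisons (invented memberships):
lists = {
    "GD":       {"Pine Valley", "Cypress Point", "Canyata"},
    "Golf":     {"Pine Valley", "Cypress Point", "Tobacco Road"},
    "Golfweek": {"Pine Valley", "Cypress Point"},
}

all_courses = set().union(*lists.values())

# "Controversial": courses on exactly one Top 100 list.
controversial = {c for c in all_courses
                 if sum(c in top100 for top100 in lists.values()) == 1}

# "Unanimous Gems": courses on every list.
gems = set.intersection(*lists.values())

print(controversial)  # {'Canyata', 'Tobacco Road'}
print(gems)           # {'Pine Valley', 'Cypress Point'}
[/code]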
Sportsman/Adventure loving golfer.

Mac Plumart

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #110 on: December 29, 2012, 03:22:47 PM »
On the topic of statistical analysis, the Golf Digest criteria indicate that "outlier" scores are thrown out before the list is published.

I, personally, don't like that part of their process.  It would seem to breed stagnation and herd mentality.
Sportsman/Adventure loving golfer.

Kevin Lynch

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #111 on: December 29, 2012, 03:31:33 PM »
On the topic of statistical analysis, the Golf Digest criteria indicate that "outlier" scores are thrown out before the list is published.  It's also interesting that Dean Knuth does their analysis, as he's often credited with creating the slope/rating system.

Thanks for the information.  I may try to contact Dean through his "Pope of Slope" website.

I doubt he would release data, but perhaps he'd be interested in a similar analysis (I can dream).

Terry Lavin

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #112 on: December 29, 2012, 03:33:01 PM »
Golfweek's list is pretty much the poster child for herd mentality.
Nobody ever went broke underestimating the intelligence of the American people.  H.L. Mencken

hhuffines

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #113 on: December 29, 2012, 03:33:09 PM »
Can anyone explain the disconnect between the State lists and the Top 200 rankings?

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #114 on: December 29, 2012, 03:40:39 PM »
Golfweek's list is pretty much the poster child for herd mentality.

Could you expand on that? I think I understand what you are saying, but I'm not 100% sure.

Mark Pritchett

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #115 on: December 29, 2012, 04:03:44 PM »
Can anyone explain the disconnect between the State lists and the Top 200 rankings?

Hart,

Courses listed on the best-in-state lists only need 10 ballots.  For inclusion in the top 200 list, I think a course needs a minimum of 45 ballots.  So, for example, Yeamans Hall is second in SC but lacks enough ballots to qualify for the top 200 rankings.

Hope this helps.

Happy New Year.   Come to Augusta sometime. 

Mark

Mark Bourgeois

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #116 on: December 29, 2012, 04:04:32 PM »
Regarding outliers, if it's done properly it can be a valid exercise. For example, small sample sizes (i.e., not a lot of golfers seeing a course) can create a lot of "noise" that you'd want to filter. Maybe somebody pegs all his numbers at the top to raise a course's average score when he doesn't actually think it's all 10s (or whatever).

Of course, that kind of raises some of the fundamental reasons why magazine rankings should come with black-box warning labels, don't you think?

And the process is totally opaque: who knows how the analysis is done and who knows what the actual scores are? There are just so, so many ways to manipulate the outcomes. Statistical manipulation is one of the easier ways -- all the better if there's zero transparency into how it's done.

Kevin, yes, I was "taking the piss" as the Brits say and just having fun. If through some miracle you got each rater's scores you could calculate mean and SD for each course using the population of rater scores. Because the courses' respective means would vary you'd want to use some type of normalization to put the SDs on equal footing, so to speak. Coefficient of variation would do the trick...I think.

I haven't thought about it too deeply, but to calculate beta you have to do a lot of work -- there's matrix algebra involved -- and anyway I think you'd probably end up with a less direct measure of what you really want, which is good ol' fashioned variance. Why do more work than necessary?  :)
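
If you ever did get the raw ballots, the whole calculation fits in a few lines. Made-up scores, obviously:

[code]
import statistics

# Hypothetical rater ballots -- nobody outside the magazines has the real data.
ballots = {
    "Course A": [9.1, 9.0, 9.2, 8.9, 9.1],   # beloved, little disagreement
    "Course B": [9.5, 5.0, 9.8, 4.5, 9.6],   # polarizing
}

for course, scores in ballots.items():
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    cv = sd / mean   # coefficient of variation: SD normalized by the mean
    print(f"{course}: mean={mean:.2f}, sd={sd:.2f}, cv={cv:.3f}")
[/code]

The polarizing course comes out with a far higher CV even though its mean is respectable, which is exactly the "passionate extremes" effect Kevin is after.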

(PS: once upon a time I actually took a look at Golf Magazine's numbers using some quasi-fancy analysis -- and concluded the numbers they posted next to each course were mathematically impossible. But I was told there was no way the numbers I was looking at were fixed. No way. My analysis looked okay, but the results of that analysis were absolutely wrong. They did say, however, that if I looked in a slightly different place I would see where the numbers really were fixed. I'm a little hazy on the details of my analysis -- it was years ago and I had the flu -- but some day I should recreate it and have the group show me where I went wrong. If I went wrong: I don't think I did.)
Charlotte. Daniel. Olivia. Josephine. Ana. Dylan. Madeleine. Catherine. Chase. Jesse. James. Grace. Emilie. Jack. Noah. Caroline. Jessica. Benjamin. Avielle. Allison.

hhuffines

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #117 on: December 29, 2012, 04:07:11 PM »
Mark, thanks!  That explains a lot...  Comparing North Carolina to the main list was confusing me.

Nigel Islam

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #118 on: December 29, 2012, 04:21:51 PM »
Regarding outliers, if it's done properly it can be a valid exercise. For example, small sample sizes (i.e., not a lot of golfers seeing a course) can create a lot of "noise" that you'd want to filter. Maybe somebody pegs all his numbers at the top to raise a course's average score when he doesn't actually think it's all 10s (or whatever).

Of course, that kind of raises some of the fundamental reasons why magazine rankings should come with black-box warning labels, don't you think?

And the process is totally opaque: who knows how the analysis is done and who knows what the actual scores are? There are just so, so many ways to manipulate the outcomes. Statistical manipulation is one of the easier ways -- all the better if there's zero transparency into how it's done.

Kevin, yes, I was "taking the piss" as the Brits say and just having fun. If through some miracle you got each rater's scores you could calculate mean and SD for each course using the population of rater scores. Because the courses' respective means would vary you'd want to use some type of normalization to put the SDs on equal footing, so to speak. Coefficient of variation would do the trick...I think.

I haven't thought about it too deeply, but to calculate beta you have to do a lot of work -- there's matrix algebra involved -- and anyway I think you'd probably end up with a less direct measure of what you really want, which is good ol' fashioned variance. Why do more work than necessary?  :)

(PS: once upon a time I actually took a look at Golf Magazine's numbers using some quasi-fancy analysis -- and concluded the numbers they posted next to each course were mathematically impossible. But I was told there was no way the numbers I was looking at were fixed. No way. My analysis looked okay, but the results of that analysis were absolutely wrong. They did say, however, that if I looked in a slightly different place I would see where the numbers really were fixed. I'm a little hazy on the details of my analysis -- it was years ago and I had the flu -- but some day I should recreate it and have the group show me where I went wrong. If I went wrong: I don't think I did.)

That's funny about the Golf Magazine list.  Maybe GD and Golf have to fudge the figures some. Imagine if, say, The Alotian was the actual number one with the raters. No way would a course that 98% of their readers have never even heard of sit well at the top. But having that course in the next 20 is barely even noticed by anyone but us. I'm not ripping on The Alotian Club, just making a point with a course I personally know very little about that is ranked quite high.

Kevin Lynch

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #119 on: December 29, 2012, 04:51:50 PM »

Kevin, yes, I was "taking the piss" as the Brits say and just having fun. If through some miracle you got each rater's scores you could calculate mean and SD for each course using the population of rater scores. Because the courses' respective means would vary you'd want to use some type of normalization to put the SDs on equal footing, so to speak. Coefficient of variation would do the trick...I think.

I haven't thought about it too deeply, but to calculate beta you have to do a lot of work -- there's matrix algebra involved -- and anyway I think you'd probably end up with a less direct measure of what you really want, which is good ol' fashioned variance. Why do more work than necessary?  :)


Mark - you are in another stratosphere of statistical knowledge relative to me.  I'd have to dust off many MBA textbooks to get within the same time zone. 

I'm not the least bit surprised that you have previously analyzed Golf Magazine's numbers.  I had the exact same thought as Nigel upon hearing that.

I can't even begin to contemplate the controls needed to get a truly accurate rating.  Theoretically, you'd have to normalize each "rater" as well, to determine which are the "soft graders" (who give only 8-10s) vs. the "hard graders" (who give 5-8s with a rare 9 or 10).
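
If I ever do dust off those textbooks, I gather the standard starting point would be converting each rater's scores to z-scores against his own history. Toy numbers, not real ballots:

[code]
import statistics

def z_scores(scores):
    """Express a rater's scores relative to his own mean and spread,
    so a hard grader's 8 and a soft grader's 8 stop meaning the same thing."""
    mu = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [(s - mu) / sd for s in scores]

soft_grader = [8.0, 9.0, 10.0, 9.5, 8.5]   # never goes below an 8
hard_grader = [5.0, 6.5, 8.0, 7.0, 5.5]    # an 8 is high praise

print(z_scores(soft_grader))  # the 10.0 is only ~+1.3 for this rater
print(z_scores(hard_grader))  # the 8.0 is ~+1.3 too -- comparable enthusiasm
[/code]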

Jim Colton

Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #120 on: December 29, 2012, 05:01:00 PM »
Can anyone explain the disconnect between the State lists and the Top 200 rankings?

If you use the state rankings as a guide, there are at least 345 courses that have a higher score than Seven Canyons, the 200th ranked course on their list. There are a bunch of private clubs that would be in the second 100 if they had the 45 votes. There are 18 or 19 public courses in the first 100, and around 59 in the second 100. That just doesn't pass the sniff test.

Mark Pritchett

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #121 on: December 29, 2012, 05:06:48 PM »
Can anyone explain the disconnect between the State lists and the Top 200 rankings?

If you use the state rankings as a guide, there are at least 345 courses that have a higher score than Seven Canyons, the 200th ranked course on their list. There are a bunch of private clubs that would be in the second 100 if they had the 45 votes. There are 18 or 19 public courses in the first 100, and around 59 in the second 100. That just doesn't pass the sniff test.

Wow! 

Jud_T

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #122 on: December 29, 2012, 05:07:53 PM »
Put Butler National on the ocean, hold a bunch of majors there, and voila, it's the best thing since Kirstie Alley got off coke, woke up and saw her first plate of fettuccine alfredo.  I'll wait for their top 50 fun courses list and recycle Links magazine from the pro shop men's room in the meantime. I suppose this list has some value as a Rorschach test of what the average player should avoid, but from my vantage point it does more harm than good.
« Last Edit: December 29, 2012, 05:09:31 PM by Jud Tigerman »
Golf is a game. We play it. Somewhere along the way we took the fun out of it and charged a premium to be punished.- - Ron Sirak

Tom_Doak

  • Karma: +3/-1
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #123 on: December 29, 2012, 05:12:22 PM »

I can't even begin to contemplate the controls needed to get a truly accurate rating.  Theoretically, you'd have to normalize each "rater" as well, to determine which are the "soft graders" (who give only 8-10s) vs. the "hard graders" (who give 5-8s with a rare 9 or 10).

There are two problems with this statement.

The second is that while there are always "hard" and "soft" graders, you have to take into account which courses they've seen.  If they have seen only a small percentage of the top 100 courses but are giving out a lot of 9's and 10's, then they really are a soft grader; but if they had seen the same number of courses and their roster was skewed toward better courses, then you would be reducing their votes for no good reason.  You would just be fudging the numbers back toward the consensus, which is what GOLF DIGEST does by throwing out "outlier" votes ... which is why some of the same stupid courses persist on their list for years and years.

But the first problem is the very idea that there is a "truly accurate rating" of such a subjective venture.  What makes the best courses the best is a truly subjective thing.  Everyone's got their own opinion, and the only reason one person's opinion is more valuable than another's is if you personally agree with their viewpoint.  Having hundreds of anonymous raters write down a bunch of numbers does not make the process objective.  Garbage in, garbage out.

Mark Bourgeois

  • Karma: +0/-0
Re: Golf Digest's 2013-14 Top 100 Rankings
« Reply #124 on: December 29, 2012, 05:15:37 PM »
Thanks, Kevin, I really appreciate that, but believe me, there are galaxies' worth of knowledge beyond mine. For example, the guy who made reply #120: I'd be surprised if he hasn't forgotten more than I've learned.
Charlotte. Daniel. Olivia. Josephine. Ana. Dylan. Madeleine. Catherine. Chase. Jesse. James. Grace. Emilie. Jack. Noah. Caroline. Jessica. Benjamin. Avielle. Allison.