Jeff Goldman

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #150 on: March 01, 2004, 02:36:37 PM »
Pat, can't the "regionalism" problem go both ways?  Haven't we heard some claim that certain courses are overrated because they are in a good neighborhood (i.e., Long Island or Monterey)?  And isn't it at least possible that some courses in other regions are underrated because they are not in the East, and therefore don't get much notice?  Folks here have said that Beverly CC (or Skokie or __) would be top 100 if it were in NY or Penn., rather than south of Chicago.

Jeff Goldman
That was one hellacious beaver.

Matt_Ward

Re:Questions about the 2004 Golfweek list
« Reply #151 on: March 01, 2004, 03:00:33 PM »
Jeff G:

In regard to regionalist tendencies, there's little doubt in my mind that if someone rates a course an 8 (on a scale of 1-10) and happens to be from State "A," which doesn't have many superior courses, that rater's 8 may not correspond to an 8 from another person who comes from an area of the country where the number means something entirely different.

You also have to ask, of the raters any of these publications has, how many really are national in the scope of the courses they rate? Keeping current is a very important aspect of rating, because even the great courses change over time -- not just when an architect is engaged, but through simple things that happen gradually. Sometimes it's not just a question of whether a person has played course "A" but when they played it.

There are plenty of people who are extremely knowledgeable about their "neck of the woods" but there are very few people who have the stamina, time and $$ to be able to make numerous visits throughout the USA during a condensed period of time. Just realize that I am not saying that people who play more should be given the benefit of the doubt about their opinions, because quality of analysis is no less important. Finding people who can do both is indeed a much more difficult proposition.

I have always believed that you clearly have a "homer" effect in certain instances -- the NY metro area may be one of those areas, as people not only rate the usual suspects (e.g. NGLA, SH, WF, QR, etc, etc) but add on others as well. In essence -- the rising tide lifts all the others too.

It may be important to limit the amount of reviews a course gets from those who live within 100 miles of a given metro area. This would force other raters from outside that area to weigh in with their numbers to balance the potential for "homers" getting even more support versus layouts that happen to be out in the prairie or other remote locale.

The key courses in the major metro areas should not have any problem with this since their visibility and fanfare should be able to provide enough different "eyes" to see them.
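Just to make the mechanics concrete, here's a rough sketch of how such a cap might be applied. The 100-mile radius is the figure suggested above; the 50% cap on local weight, the function names and the sample numbers are all invented for illustration, not anything any magazine actually does.

Code:
from statistics import mean

LOCAL_RADIUS_MILES = 100   # cutoff for a "local" rater, per the suggestion above
MAX_LOCAL_SHARE = 0.5      # invented cap: local ballots supply at most half the weight

def blended_average(ballots):
    """ballots: list of (score, miles_from_course) pairs."""
    local = [score for score, miles in ballots if miles <= LOCAL_RADIUS_MILES]
    distant = [score for score, miles in ballots if miles > LOCAL_RADIUS_MILES]
    if not distant:
        return mean(local)        # no outside eyes yet, nothing to blend against
    if not local:
        return mean(distant)
    # Local ballots can never account for more than MAX_LOCAL_SHARE of the result.
    share = min(MAX_LOCAL_SHARE, len(local) / len(ballots))
    return share * mean(local) + (1 - share) * mean(distant)

# Six hometown 9s and two out-of-region 6s.
print(blended_average([(9, 15)] * 6 + [(6, 900), (6, 1200)]))

With a straight average, the six hometown 9s in that example would pull the number up to 8.25; the capped blend holds it at 7.5.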

My observation is that few people can really provide the consistent cross comparison of courses that you need. I mean if one person plays course "A" and a completely different person plays course "B" how does one get any sense of perspective when different people rate different courses?

Years ago GD had national and regional ratings boards. I would submit that there are people who are truly national in the number of courses they see in a given time period. Maybe there is a way to incorporate this. At Jersey Golfer we will be asking people who help us with our biennial survey to indicate when they last played the course. Those whose last round at the course was more than five years ago will have their vote scaled down accordingly.
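For what it's worth, the mechanics amount to nothing fancier than this sketch. The five-year window is the one mentioned above; the 50% discount and the names are placeholders for illustration only.

Code:
from datetime import date

RECENCY_WINDOW_YEARS = 5   # votes based on a round older than this get scaled down
STALE_WEIGHT = 0.5         # placeholder discount, not the actual factor

def vote_weight(last_played, today):
    """Full weight for a recent visit, reduced weight for a stale one."""
    years_ago = (today - last_played).days / 365.25
    return 1.0 if years_ago <= RECENCY_WINDOW_YEARS else STALE_WEIGHT

def recency_weighted_score(votes, today):
    """votes: list of (score, date_last_played); returns the weighted average."""
    weights = [vote_weight(played, today) for _, played in votes]
    total = sum(score * w for (score, _), w in zip(votes, weights))
    return total / sum(weights)

# Relative to a March 2004 survey date, a 2003 round counts in full and a
# 1996 round counts at half weight: (8*1.0 + 5*0.5) / 1.5 = 7.0
print(recency_weighted_score([(8, date(2003, 7, 1)), (5, date(1996, 6, 1))],
                             today=date(2004, 3, 1)))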

P.S. I am a big fan of Skokie and would most certainly include it among my personal top 100.

Patrick_Mucci

Re:Questions about the 2004 Golfweek list
« Reply #152 on: March 01, 2004, 03:39:42 PM »
Jeff,
Pat, can't the "regionalism" problem go both ways?  

Haven't we heard some claim that certain courses are overrated because they are in a good neighborhood (i.e., Long Island or Monterey)?

I would think that just the opposite is true.
I'm not so sure that mediocre or bad golf courses are elevated simply by drafting  in the wake of the good ones.
Could you give me some examples ?


And isn't it at least possible that some courses in other regions are underrated because they are not in the East, and therefore don't get much notice?  

Sand Hills, Prairie Dunes and Bandon/Pacific dunes aren't exactly on the beaten path, I would classify them as isolated from any meaningful population center, let alone the East.
Yet, I doubt you could say that they are underrated.


Folks here have said that Beverly CC (or Skokie or __) would be top 100 if it were in NY or Penn., rather than south of Chicago.

Like all statements, you must consider the source.
Who has said these things ?


I think one of the difficult concepts for many people to grasp is that a course ranked 110th might trail the course ranked 65th by just a few thousandths of a percent, numerically.

And, out of 17,000 golf courses, being ranked 110th is pretty lofty.

Unfortunately, only 100 golf courses can fit into the TOP 100 golf courses.


Jeff Goldman

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #153 on: March 01, 2004, 04:15:41 PM »
Pat,  I have played no courses in the Northeast, so I am not saying that courses there are overrated because of their location, simply that it is possible that some may be elevated somewhat because of regional bias, just as raters from some areas that do not see enough courses may rate the ones they do see too highly (which I think is your point).  Matt Ward made this point well.  I do recall posts stating that, among others, Maidstone, Atlantic, The Bridge, The Creek, WF East and Quaker Ridge get an eastern wind advantage.  

  Similarly, Shivas, Paul Richards, Pat Hitt, Matt Ward, and others have stated that Skokie and Beverly, among others, are underrated, at least in part because they are stuck in the midwest.  None of these old-line clubs have the publicity or renown of Bandon, Sand Hills, or Prairie Dunes, so I don't think these clubs disprove the point.  If raters come to Chicago, don't they want to see Medinah and Chicago GC first, then Shoreacres and maybe Olympia Fields before these other clubs?  I wonder how many visits from raters each club gets.

  You're certainly correct that only 100 clubs make the list, and the differences are minute but that doesn't speak to your point about regionalism.  I just think that it is likely that regional biases go both ways.

Jeff Goldman
That was one hellacious beaver.

Patrick_Mucci

Re:Questions about the 2004 Golfweek list
« Reply #154 on: March 01, 2004, 05:08:52 PM »
Jeff,
Pat,  I have played no courses in the Northeast, so I am not saying that courses there are overrated because of their location, simply that it is possible that some may be elevated somewhat because of regional bias, just as raters from some areas that do not see enough courses may rate the ones they do see too highly (which I think is your point).  Matt Ward made this point well.  I do recall posts stating that, among others, Maidstone, Atlantic, The Bridge, The Creek, WF East and Quaker Ridge get an eastern wind advantage.  

The evaluation/rating is done in a vacuum, absent any comparison to other golf courses, in the east or anywhere else.

When a rater plays/walks a golf course he has a variety of categories that he is asked to assign numerical values to.
Those values are assigned solely in the context of the golf course the rater is evaluating that day.
The individual numerical values and their total are then forwarded via ballot to the magazine where they are put into the statistical hopper with other ballots for the purpose of determining an overall numerical value.  Only when the average of the ballots is compiled, and compared to other statistical values, do the rankings take shape.
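In rough code terms, the flow amounts to something like the sketch below. This is only an illustration of the general idea; the category names, weights and any tie-breaking are invented here and are the magazine's own business.

Code:
from statistics import mean

def ballot_total(category_scores):
    """One rater's ballot: a numerical value per category, summed into a single total."""
    return sum(category_scores.values())

def course_average(ballots):
    """The 'statistical hopper': average every ballot total submitted for one course."""
    return mean(ballot_total(b) for b in ballots)

def rankings(all_ballots):
    """all_ballots: {course: [ballot, ...]}. Comparison only happens at this last step."""
    averages = {course: course_average(bs) for course, bs in all_ballots.items()}
    return sorted(averages, key=averages.get, reverse=True)

# Two hypothetical courses, each rated on its own; the ranking emerges afterward.
print(rankings({
    "Course A": [{"routing": 8, "greens": 9}, {"routing": 7, "greens": 8}],
    "Course B": [{"routing": 9, "greens": 9}],
}))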


Similarly, Shivas, Paul Richards, Pat Hitt, Matt Ward, and others have stated that Skokie and Beverly, among others, are underrated, at least in part because they are stuck in the midwest.

That is solely their opinion, and not based on any statistical data.

None of these old line clubs have the publicity or renown of Bandon, Sand Hills, or Prairie Dunes, so I don't think these clubs disprove the point.

I've played Prairie Dunes about a dozen times and it's about as low key as you can get, and about as far off the beaten path as you can get, and, it's in the heart of the midwest, rural and isolated, not next to a big city like Chicago.

If raters come to Chicago, don't they want to see Medinah and Chicago GC first, then Shoreacres and maybe Olympia Fields before these other clubs?

How do you draw that conclusion ?
How do you know what raters want ?
How about the raters who live in the Chicago area, or within 200 miles of Chicago -- don't they have access to all of these clubs ?
Raters aren't pouring into Chicago from Florida, Oregon, Oklahoma, and Maine.  Regionalism and convenience have more to do with ratings than low airfares and extended travel.


I wonder how many visits from raters each club gets.

That would vary, but the magazines know exactly how many BALLOTS were cast for each club, not how many visits were made.  

You're certainly correct that only 100 clubs make the list, and the differences are minute but that doesn't speak to your point about regionalism.  

I just think that it is likely that regional biases go both ways.

I would disagree with your premise.

Jeff Goldman

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #155 on: March 01, 2004, 05:35:01 PM »
Pat, Ay caramba!  You asked for examples of courses getting a draft.  I gave you some.  You asked me to name individuals who thought courses outside the east were underrated.  I did so.  Of course it is "their opinion, not based on statistical data."  How could it be?  How could the regional advantage you propose be?  As to what courses raters want to visit, I spoke to a few, and, in any event, I wasn't stating a conclusion but asking your opinion.  That's what the question mark means.   ;D ;D  That is also why I asked about the number of site visits, although ballots may be more accurate.  Let me ask you a couple of others:  Are raters for these magazines equally distributed around the country?  Are there more on the coasts than elsewhere?  Do raters from the coasts get to play a whole bunch of great courses, and therefore have less inclination, in general, to visit courses and places of less renown?  I have no idea, but others might.

Jeff Goldman
That was one hellacious beaver.

SPDB

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #156 on: March 01, 2004, 05:55:15 PM »
Geoff -
If freebies are anathema in the ratings game, why do you abide it at RC?

Patrick_Mucci

Re:Questions about the 2004 Golfweek list
« Reply #157 on: March 01, 2004, 06:00:08 PM »
Pat, Ay caramba!  You asked for examples of courses getting a draft.  I gave you some.

How can you say those courses benefited from drafting if you've never seen or played them ?
 
You asked me to name individuals who thought courses outside the east were underrated.  I did so.  Of course it is "their opinion, not based on statistical data."  

You named three people, and you allege that they feel that courses outside the east are underrated.  It sounds more like hearsay.  I'd like to hear their opinions on the subject, with course specifics.  What about all the raters who played the Chicago area clubs, who rated them, who aren't from the east ?

How could it be?  How could the regional advantage you propose be?

It's simple, it's a matter of geography.
Most raters who live in Maine don't get to rate courses in Florida, Tennessee, Iowa, Oregon or New Mexico and vice versa, hence evaluations tend to be conducted within the framework of regionalism.  And, the quality of courses in each region has a profound effect on the other courses in the region.  The "big fish in a small pond" syndrome.

As to what courses raters want to visit, I spoke to a few, and, in any event, I wasn't stating a conclusion but asking your opinion.

I don't know what each individual rater thinks with respect to seeing and rating a golf course.  

I do think it's more random than planned.  Often, a rater will be vacationing or on business in Florida, Arizona, California and other states and try to combine that activity with rating a golf course or two.

I don't think someone from New York says, I think I'll fly to Billings, Montana and try to rate all the courses in that area this week.


That's what the question mark means.   ;D ;D  That is also why I asked about the number of site visits, although ballots may be more accurate.  Let me ask you a couple of others:  Are raters for these magazines equally distributed around the country?  

I wouldn't know the answer to that question.
GD used to publish the list of their raters, by state, so that may help you.  GW doesn't reveal the list of their raters, and I don't know what GM does.


Are there more on the coasts than elsewhere?

My assumption would be that raters are indigenous to population centers.

Do raters from the coasts get to play a whole bunch of great courses, and therefore have less inclination, in general, to visit courses and places of less renown?

I would have no way of knowing.
My guess is that those that are serious about this pursuit make the effort to get to as many courses as is practical, and that population centers remain a critical component.



Jeff Goldman

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #158 on: March 01, 2004, 06:57:21 PM »
Pat, where did I say definitively that certain courses benefitted from a draft??  I simply raised the possibility that the "regionalism" bias could go both ways, and gave examples that raised the possibility.  Don't you recall the posts about Maidstone benefitting from its neighborhood??  That's how I raise this possibility without playing the courses.  Moreover, even if I had played some of them, and even if I rated them lower than others, I could not state for sure that it was because of a "homer" effect or simply a difference of opinion.  Similarly, you can postulate a "regionalism" impact, which I agree could exist, but it can't be demonstrated scientifically.  We're not really far apart here; it's just that I see two kinds of regional impact and you see only one.

  As to others who say courses like Skokie and Beverly are underrated, do a search.  That's what I did.  Their opinions are posted, though I hope they aren't put out that I dragged them in here.

  Third, you misunderstood my "how could that be" statement.  You stated that the folks I listed who thought Skokie was underrated were simply giving their opinions, not based on statistics.  I in turn asked how your regionalism hypothesis could be based on statistical data, rather than opinion.

  Your point about the happenstance nature of rating in other regions supports my idea.  It is quite likely that raters make special trips to see Bandon, Sand Hills, and Prairie Dunes, rather than going to Chicago to see a bunch of courses.  This makes it more difficult for a lesser-known, though not necessarily lesser, course to make an impact.

  Last, your regionalism hypothesis is based on the effect of one course on another -- a big fish in a little pond ("the quality of courses in a region has a profound effect").  Yet you also state that ratings are done in a vacuum, "absent any comparison to other courses..." [how do you know this??] Which is it?  I tend to think the former.

  I would like to hear more of your views on the individual courses you see and play, because your reports have been very informative in the past, rather than this kind of stuff (though it is fun once in a while).  Come to Chicago and see what's in our neighborhood, and you can decide for yourself whether some of the courses here get a short or long shrift.

Jeff Goldman
« Last Edit: March 01, 2004, 06:59:33 PM by Jeff Goldman »
That was one hellacious beaver.

Geoff_Shackelford

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #159 on: March 01, 2004, 07:01:01 PM »
SPDB,

Uh, I don't "abide it" at Rustic Canyon because I don't run Rustic Canyon, influence the operation in any way or have any stake in the place.

As always, thanks for trying to keep me honest. Maybe one of these days you'll actually get it right.
Cheers,
Geoff

Top100Guru

Re:Questions about the 2004 Golfweek list
« Reply #160 on: March 01, 2004, 07:06:40 PM »
For those who feel the list is tied to "free goodies" and other "perks" for the raters, I will respectfully submit that there are:

17 Resort and 16 Daily Fee courses out of 100 on the Modern List, and that number in and of itself is very "questionable" at best... I can guarantee you that at least 1/3 of those "Resort & Daily Fee Courses" are not as good as some of the "private clubs" that are either no longer on this list or have never made the list in the past. :o

SPDB

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #161 on: March 01, 2004, 07:11:33 PM »
Geoff -

how much did your last round at Rustic cost you?

Again, just out of curiosity

By the way, just so I don't become another one of your drive-bys, I did respond to your Fazio-conflict accusations, back on pg. 2.
« Last Edit: March 01, 2004, 07:34:04 PM by SPDB »

DMoriarty

Re:Questions about the 2004 Golfweek list
« Reply #162 on: March 01, 2004, 07:18:20 PM »
I've heard so many different versions of what happened with Rustic last year that I wouldn't mind having it cleared up.

Why was Rustic left off the list last year?  Not enough raters?  (If so, how many ratings were needed, and how and when did this rule come into existence?)  Not high enough scores?  (If so, where would the scores have put Rustic, had they been allowed?)  Was anyone asked or pressured to lower their score, or was anyone's high score discarded?

Geoff is correct that no one from Golfweek has denied that scores have been manipulated in the past.  I find this odd, given Brad's usual willingness to explain what is going on with the ratings.  Plus, I've heard enough rumors from unrelated sources to be curious as to just what happened.  



SPDB

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #163 on: March 01, 2004, 07:23:56 PM »
David -

Quote
Why was Rustic left off the list last year?

Personal vendetta.

Forrest Richardson

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #164 on: March 01, 2004, 07:31:10 PM »
My belief: There is no monkey business going on with rating courses at Golfweek. I know some of the raters — and also have regard for the work being done at Golfweek to maintain a professional publication. I don't feel anyone involved is manipulating anything.

Rating is — at best — a mostly subjective undertaking. We all need to expect a certain degree of surprise when the results are posted.

What I do believe is that the system of getting raters to new projects is broken (sorry, no better word to describe it). In my view it is disheartening to learn that some new courses have attracted raters in flocks, while others have been left behind.

A "fix" for this situation would be to require — more forcefully — that raters get to new courses. I believe it makes the published list of "Best Courses" ultimately less useful and bona fide when new courses are not evaluated by enough raters.
« Last Edit: March 01, 2004, 07:32:04 PM by Forrest Richardson »
— Forrest Richardson, Golf Course Architect/ASGCA
    www.golfgroupltd.com
    www.golframes.com

Mike_Sweeney

Re:Questions about the 2004 Golfweek list
« Reply #165 on: March 01, 2004, 07:34:39 PM »
Geoff -

how much did your last round at Rustic cost you?

Again, just out of curiosity

Sean,

I don't agree with everything that Geoff has said on this thread (and others), but let's be respectful of his efforts and passion at RC.

DMoriarty

Re:Questions about the 2004 Golfweek list
« Reply #166 on: March 01, 2004, 08:16:06 PM »
David -

Quote
Why was Rustic left off the list last year?

Personal vendetta.
SPDB.   You obviously have very strong feelings about the inaccuracy of Geoff's allegations, but as far as I can tell, you have no factual basis for your strong feelings.  (At least Geoff claims he has a source . . . .)

Do you know that Brad or GW has never shuffled the numbers in the past?   Do you know that the things Geoff is alleging are false?  If so, how about you come clean?  If not, then I assume that you would agree with me that some clarification from Golfweek might help clear things up.  

Forrest, same goes for your post and the post of everyone else who wants to vouch for golfweek, as well as everyone who wants to crucify the magazine.   Unless you have all the facts, your opinion is not really going to clear anything up, is it?   What says Brad about Geoff's allegations?  What says Golfweek?  


Willie_Dow

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #167 on: March 01, 2004, 09:11:17 PM »
I like the breakdown of categories.  Classic, Modern, Private, Public, Links, Forest, what else?

This narrowness is for the birds!

Willie_Dow

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #168 on: March 01, 2004, 09:16:40 PM »
Maybe: Hickories only!

Steve Curry

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #169 on: March 01, 2004, 09:39:24 PM »
"1. What is it about Southern Highlands’ design that makes it worthy of its debut at #84 in the US Modern (note: no need to cite the complimentary wine list and free Pro-V1s, I already understand those gifts were well received by all the panelists attending the Las Vegas raters event)?"

Is this really a question, and who cares?  :P

"2. Considering it only opened in September, how did enough people get out to rate Wintonbury Hills, the #1 public access course in Connecticut?"

Is this really a question, and who cares?  :P

This thread is all a bunch of crap and who really cares, quit crying and arguing!



Forrest Richardson

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #170 on: March 01, 2004, 10:01:12 PM »
Mr. Moriarty — I merely gave my personal (and professional) opinion about Golfweek. I have always found the staff to be professional, and it appears they have the game in the forefront of their thoughts. I spent 20 years in advertising and served as a professional judge for scores of competitions. I know how to run and judge competitions — the Golfweek rating system seems very well thought out, save for the exception I noted in the previous post.

Mr. Curry — Who cares? Well, for one, many architects who consider the finished golf course the end result of many years of work and passion. It also extends to owners and private clubs. I'd say anyone who has a vested interest in golf courses relative to their professional life. To quote a well respected guy: "...because it's so important to our business..." That was T. Doak from a few years ago commenting about the Golf Digest "Best New" listings.
— Forrest Richardson, Golf Course Architect/ASGCA
    www.golfgroupltd.com
    www.golframes.com

Patrick_Mucci

Re:Questions about the 2004 Golfweek list
« Reply #171 on: March 01, 2004, 10:21:34 PM »
Jeff Goldman,

You've totally missed the underlying principles on the influence of regionalism.

Regionalism is acknowledged as a fact by some at GW for the simple reason that raters are more likely to rate golf courses located near their home base than golf courses 1,000, 2,000 or 3,000 miles from their homes.

Please tell me you understand that.

If you do, then you'll understand the influence of regionalism and the difficulty of overcoming it, and that it's not the two-way street you'd like to believe, with an overwhelming number of outsiders coming into Chicago and giving those area courses low marks to counter the high marks you believe those courses receive from local raters.

Mike_Cirba

Re:Questions about the 2004 Golfweek list
« Reply #172 on: March 01, 2004, 11:50:40 PM »
Forrest;

An effort is made by Golfweek to get raters to new courses.  In fact, each year we get a list of "assigned" courses, of which each of us has to play a certain number in multiple states to maintain our rater status.

The "assigned" list is made up of two types of courses;

1) Brand New courses that haven't had the requisite minimum number of ratings (10) to qualify for the lists.

and  

2) Other courses that have not had the requisite minimum number of voters but whose scores to date indicate that they might be top courses.

There is also the problem of the sheer number of raters, as well as their geographic distribution, and Golfweek is trying to address that too.  In the past two years, the number of raters has been increased from about 170 to 285, a very significant percentage increase.  That increase has been done with an eye towards geographic location as a major factor.  Fact is, there weren't enough raters in previous years from the midwest, northwest, and southwest away from major population centers, so steps were taken to locate folks in those areas who love the game and course architecture and ask them to help.

Like anything else, it's an imperfect system, but I think some of the past weaknesses have been recognized, helpful steps have already been taken, and everyone is working towards continuous improvement of the ratings.
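For anyone curious how the "assigned" list shakes out mechanically, here's a bare-bones sketch. The 10-ballot minimum is the requirement mentioned above; the "promising score" cutoff and all the names are invented placeholders.

Code:
from statistics import mean

MIN_BALLOTS = 10            # minimum number of ratings before a course can make a list
PROMISING_CUTOFF = 7.0      # invented placeholder for "scores to date look like a top course"

def assigned_courses(courses):
    """courses: {name: {"ballots": [scores], "is_new": bool}} -> courses raters are asked to play."""
    assignments = []
    for name, info in courses.items():
        if len(info["ballots"]) >= MIN_BALLOTS:
            continue                                     # already has enough ballots to qualify
        if info["is_new"]:
            assignments.append(name)                     # type 1: brand-new courses
        elif info["ballots"] and mean(info["ballots"]) >= PROMISING_CUTOFF:
            assignments.append(name)                     # type 2: promising but under-voted
    return assignments

print(assigned_courses({
    "New Course": {"ballots": [8, 8], "is_new": True},
    "Sleeper": {"ballots": [7.5, 8, 7], "is_new": False},
    "Established": {"ballots": [6] * 12, "is_new": False},
}))
# -> ['New Course', 'Sleeper']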


« Last Edit: March 01, 2004, 11:51:52 PM by Mike_Cirba »

Forrest Richardson

  • Karma: +0/-0
Re:Questions about the 2004 Golfweek list
« Reply #173 on: March 02, 2004, 12:35:27 AM »
Mike — I'm aware of this change. And the requirement. It still remains a weakness.
— Forrest Richardson, Golf Course Architect/ASGCA
    www.golfgroupltd.com
    www.golframes.com

Tom_Doak

  • Karma: +2/-1
Re:Questions about the 2004 Golfweek list
« Reply #174 on: March 02, 2004, 01:39:08 AM »
Obviously I have no inside information about the ranking of Rustic Canyon, but I do know the answer to why it wasn't ranked last year:

NO COMMENT

You really cannot run a ranking any other way.  You can't explain to 16,900 courses "why" they were left off a list, because ultimately the only reason is that they didn't get as high a score as the top 100.  If you start trying to explain, you're badgered about the answers until someone accuses you of a conspiracy.

When I ran the GOLF Magazine panel years ago, Shadow Creek debuted at No. 59 (or something like that) in the top 100 in the world.  They were furious at this ... said they knew exactly which panelists had been to the golf course and that the ratings must be rigged because everyone had told them how they voted.  They suspected that I had voted quite low on the course, although in fact I was one of its supporters ... other panelists were not so enthusiastic about the course on a secret ballot as they were to Steve Wynn's face.

I do think it's pretty obvious from all of the above that there are a lot of GOLFWEEK panelists here who are very defensive about the whole process and who are too invested in it personally.  Your own personal list is not THE list.  There is plenty of room in this business for differences of opinion.

I also agree with Forrest that the ranking of new courses is far from objective.  It's a pack mentality; when a lead wolf likes a certain new course he e-mails all his buddies on the panel, and all of a sudden 25 people are gushing about Course X while no one will even go see Course Y on the other side of town.  This, I think, is one of the real problems with having all the raters' events that GOLFWEEK likes to have ... individual thought is discouraged.
