Tom Doak

Re: GOLFWEEK:  strictly by the numbers?
« Reply #25 on: March 12, 2002, 02:20:49 PM »
David,

I was always conscious of my "conflict of interest" position and was very careful in making any recommendations for the committee.  I'd draft up a list of people, and let George Peper make the call on adding them.  I passed over many highly qualified people simply because they were friends, and I didn't want that to call into question whether any course I'd built was worthy.

Eventually, I got to be friends with several people who were on the committee, and that's one of the reasons I quit.

I do think I had some influence on the list, but indirectly.  THE CONFIDENTIAL GUIDE highlighted some courses which were previously lesser-known, and got panelists to take them seriously.  Also, certainly, my being a member of Crystal Downs made it easier for people to see it; it would have been on the lists forever except that so few people had traveled there.

Dr. Reynolds

Re: GOLFWEEK:  strictly by the numbers?
« Reply #26 on: March 12, 2002, 03:52:44 PM »
Mr. Doak,

My guess is this list is edited by Brad Klein or someone at Golfweek.

I feel this way for two reasons:
There are two golf courses with which I am contracted; one made the modern list and one did not. I was curious as to the number of actual raters that visited each course. When I spoke to the golf professionals at each course, I was surprised to hear that the rated course received one rater while the unrated course received over ten!

The other reason is that the simple process of rating one golf course against the next is absurd, and it has already been accomplished for the past 30-some years by a premier publication, which GW is not!



Paul Richards

Re: GOLFWEEK:  strictly by the numbers?
« Reply #27 on: March 12, 2002, 05:14:30 PM »
Dr. Reynolds:

And you are?
"Something has to change, otherwise the never-ending arms race that benefits only a few manufacturers will continue to lead to longer courses, narrower fairways, smaller greens, more rough, more expensive rounds, and other mechanisms that will leave golf's future in doubt." -  TFOG

jim_lewis

Re: GOLFWEEK:  strictly by the numbers?
« Reply #28 on: March 12, 2002, 05:37:51 PM »
Dr. Reynolds:

It is not likely that your golf professionals really know how many raters have played their courses. It is not unusual for raters to play courses without introducing themselves as raters to the pro. This is particularly true at private clubs if the rater is playing as a guest of a member. Contrary to popular belief, not all raters are looking for a free round, nor do they have to use their rater credentials to gain access to courses. I can assure you that no course appears on the GOLFWEEK America's Best list if it has been seen by only one rater. It would not even be considered.

Raters do not "rate a course against the next". Courses are rated against a set of criteria. They are then RANKED in relation to each other based on their average RATING. It is the ranking that appears in the publication, not the rating.
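To put the distinction in code terms, here is a minimal sketch in Python - the course names and scores are hypothetical, not actual ballot data:

    # Rate each course against the criteria (one score per visiting rater),
    # then RANK courses by their average RATING. Only the rank is published.
    ballots = {
        "Course A": [7.0, 6.5, 8.0],
        "Course B": [6.0, 7.5],
        "Course C": [9.0],
    }
    averages = {course: sum(s) / len(s) for course, s in ballots.items()}
    ranking = sorted(averages, key=averages.get, reverse=True)
    for place, course in enumerate(ranking, start=1):
        print(place, course, round(averages[course], 2))

The averages drive the order, but only the order is printed - which is why a course can "fall" several spots while its underlying rating barely moves.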

BTW, GOLFWEEK is THE premier golf publication unless you are looking for instruction, equipment recommendations, or a travel guide.
"Crusty"  Jim
Freelance Curmudgeon

John_Conley

Re: GOLFWEEK:  strictly by the numbers?
« Reply #29 on: March 12, 2002, 08:06:04 PM »
Crusty:

It isn't worth your breath or key-pressing to try to explain Golfweek to a guy who says it is not a premier publication.  He obviously doesn't like it.

Golfweek has come a long way from the Stine days. Golf World may be the best for covering professional golf, Links has by far the prettiest pictures, and Golf Magazine and Golf Digest cover a lot of things in an attempt to appeal to the masses.

You and I may prefer reading GW, but I can certainly see how somebody else might not.  The strength of the mag is that it DOESN'T try to broaden its appeal.

David Wigler

Re: GOLFWEEK:  strictly by the numbers?
« Reply #30 on: March 12, 2002, 09:46:16 PM »
Ron,

Thanks for answering my question. You really highlighted the catch-22. I have not yet played Architects Club, but it would be a shame if it were deserving of placement on GD's list and yet was not allowed. The great irony would be if it made GW's.
And I took full blame then, and retain such now.  My utter ignorance in not trumpeting a course I have never seen remains inexcusable.
Tom Huckaby 2/24/04

Brad Klein

Re: GOLFWEEK:  strictly by the numbers?
« Reply #31 on: March 13, 2002, 02:11:55 AM »
Sorry I couldn't get to this thread sooner but I was traveling for six days and was in a series of "quaint" hotels or golf cottages that didn't provide me with internet access.

Let's see. We have 175 raters, with about 15 more being added this month. Most of the new ones we are adding lately are from lesser-traveled places like Arkansas, Nebraska, Kansas, the Dakotas, Montana, etc. That way we get more of them to places like Chisholm Trail. We had only 3 visits there, far fewer than the numbers who visited Sand Hills (34), Prairie Dunes (51) or Bandon Dunes (26). Prosecutor Whitten wants me to "fess up." Ron, I confess. I wasn't there. But there's nothing else to fess up to.

Jonathan Cummings is the technical guru who runs the spreadsheets, collates the ballots, and spits out the results. I spend at least 5-8 hours every week on that stuff, but between November and March far more than twice that amount of time on the lists, rater rosters, etc. We print out the results, and yes, I do edit the list by knocking out of the top-100 lists those courses that don't get enough votes (8 is the minimum for the Classical and Modern lists, 3 is the minimum for state-by-state, though in many states like Fla. or S.C. all of our listed courses have 8+ votes).
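As a rough sketch of that collation step - the data shapes here are assumptions, not Jonathan's actual spreadsheets, but the vote minimums are the ones just stated:

    # Average the ballots, then knock out courses below the vote minimum.
    MIN_VOTES_NATIONAL = 8   # Classical and Modern top-100 lists
    MIN_VOTES_STATE = 3      # state-by-state lists

    def collate(ballots, min_votes):
        # ballots: {course: [rater scores]} -> ranked [(course, average, votes)]
        eligible = [
            (course, sum(scores) / len(scores), len(scores))
            for course, scores in ballots.items()
            if len(scores) >= min_votes
        ]
        return sorted(eligible, key=lambda row: row[1], reverse=True)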

I'm a little surprised and disappointed by both Tom Doak and Ron Whitten for asking these questions. I've had both over to my house for dinner and they didn't have the nerve to ask me face-to-face (though Doak was there before I did the ratings, but he has my phone number). They seem to find it okay to pose the question in public. I suspect there's a bit of grandstanding going on. But that's fine, as they raise important issues that go to the heart of being a reputable, credible magazine that is under amazingly close scrutiny - more than their lists have ever been under.

There are courses on the list that I personally think don't belong there; there are courses off the list that I'd like to see on there. Neither Golf Digest nor Golf Magazine does anything like we do to educate our raters on an ongoing basis - see Cosgrove's post above on this.

There is nothing more valuable than such meetings. It does a lot more for quality ratings than having celebrity voters or 750 single-digit golfers as raters.

There are also some embarrassing results I have to live with. Case in point: Salem CC's superintendent, Kip Tyler, wins our "Superintendent of the Year" award and yet his course, which went through winterkill (and recovery) but no design changes in years, falls 11 spots. That makes no sense to me, and I hated it, but I had to accept it and go with the results.

If we had room we'd run lists of the top 200, and I'm sure we'd then get the same questions about courses 201, 202, 203. I have also tried to get the room to run a complete statistical breakdown of all the votes - numbers cast, standard deviation, etc. - for the top-100 Classical and Modern lists, but we decided not to because of space and readability. But we have all that data, because questions like this arise all the time and we need to have statistical certainty. We're in the process of developing a comprehensive Web-based balloting system which will give us access to these outcomes on an ongoing basis.

By the way, many courses have no idea when raters show up. I don't know (or worry about) how the other magazines do it, but we have lots of raters who don't announce themselves. In fact, I just got a call from Nicklaus' organization wondering how The Bear's Club could have made the list when it was closed to all raters. Turns out 10 of our raters had been there!

Anyway, this is a healthy discussion. I doubt any other magazine's list draws this scrutiny.


Brad Klein

Re: GOLFWEEK:  strictly by the numbers?
« Reply #32 on: March 13, 2002, 02:32:07 AM »
By the way, Ron, if I were a Golf Digest panelist I'd be very upset about the way the allocation of "tradition" points totally undermines any legitimate rating criteria. You might as well announce that the ballots are cooked.

Now "fess up," tell me you think that's handled with the same integrity and mathematical precision as the rest of your balloting. I know you work hard to get those votes right mathematically, but that little jury-rigged system is very suspect to me - a point I did raise with you face-to-face over dinner.

Dunlop_White

Re: GOLFWEEK:  strictly by the numbers?
« Reply #33 on: March 13, 2002, 01:20:39 PM »
Golf Digest has been criticized for utilizing 'tradition' as a criterion in rating a golf course. Consequently, many raters concentrate on the famous and traditional layouts, those which have hosted a major championship. Other raters focus on prestigious golf clubs, those which are not accessible to the common player. Because of this preoccupation with playing the elite designs, many wonderful courses, among the 18,000+ in our country today, go unnoticed and are not even considered. If 'tradition' were not a criterion, perhaps many other courses would be discovered and gain the status they deserve.

Many famous golf clubs with years of history and tradition do in fact have great golf courses. Too many of these courses are also glorified just because they are famous, and 'fame' and 'greatness' are simply not synonymous. Just because a course has hosted a major tournament doesn't anoint it as having a great layout, though this perception prevails. For example, why should the Ocean Course at Kiawah be ranked any higher because the Ryder Cup was held there? Likewise, why should a course such as Merion lose its lofty rating simply because major championships have bypassed it for years?

Golf Digest has also been criticized for using 'ambience' as a criterion. 'Ambience' is defined as 'the quality of the atmosphere or setting'. Ambience too often distracts the rater from focusing on the elements of pure design. A rater should not be influenced by an immaculate clubhouse, a delicious lunch, or a helpful caddie, just as a film critic does not evaluate movies based upon the intimacy of the theatre, the fresh popcorn or the friendly attendant. These amenities are extrinsic to the matter at hand. Ambience, therefore, clearly does not have anything at all to do with a course's integrity of design. It can certainly complement the day; however, it is irrelevant in judging architecture. A golf course should not climb the charts simply because it hires a staff as accommodating as Augusta National's or constructs a locker room as charming as Seminole's.

Ideally, outside influences such as tradition, prestige, ambience, and amenities should not serve to influence raters. Raters are well rehearsed in being objective. Raters fully understand that the substance of the design outweighs the form of its parts. Nevertheless, raters are human and therefore cannot totally separate out prejudicial influences. Just as juries have difficulty disregarding incriminating, inadmissible evidence, raters too are inherently prejudiced by the subjectivities which bookend a round.

Since extrinsic matters such as 'tradition' and 'ambience' naturally influence many panelists anyway, why does Golf Digest find it necessary to list them specifically as criteria? By doing so, they are sending the wrong message to architects and clubs across the country. Everyone wants their course to be ranked. Everyone tries to be a part of something special. In an attempt to bolster their recognition and ranking, clubs are attempting to build tradition and create ambience.

Many greens committees have destroyed their original classic designs in an attempt to create traditional, championship layouts. Short, distinctive par 4s have been all but eliminated because of the obsession with length found on championship courses. Lilliputian Donald Ross designs are considered outdated even though they require a tremendous amount of finesse and skill to negotiate. Architects as well are designing 7,200-yard courses in hopes of attracting a major tournament. Developers are spending extra millions just to compete with neighboring courses which are doing the same. They are incorporating unnatural, eye-catching features into today's golf courses, trusting that raters will remember the glamour and glitz captured by lake fountains, waterfalls, island greens, and outlandish clubhouses above and beyond the strategic value of shotmaking options. Consequently, more new courses are becoming too long, too difficult, too artificial, and too expensive to play and maintain. Furthermore, many laid-back clubs with extension-of-the-home atmospheres have flooded their memberships with a surplus of extras and amenities, the absence of which made the club more appealing to begin with.

In reality, tradition is acquired over time. It cannot be created overnight. Similarly, ambience is heightened when it is natural and unforced. All too often, by trying to build tradition and create ambience, clubs have destroyed their own. If the 'tradition and ambience' criteria were replaced with a tree-management criterion, for instance, perhaps these very same clubs would shift gears and perform measures which would actually benefit their design. At least this criterion would be relevant in evaluating golf architecture instead of the club as a whole.

To top it off, Ron and his in-house panel supposedly can add bonus points to their raters' final tallies based upon tradition and ambience. Speaking of "cooked," or of a justification to alter... the potential resides at Golf Digest as well.

Golfweek does not specifically utilize 'tradition' and 'ambience' as criteria. These elements may perhaps influence individual panelists, but at least they are not promoted as integral to the overall process. All of Golfweek's guidelines do appear to focus on design integrity.





Mike Vegis @ Kiawah

Re: GOLFWEEK:  strictly by the numbers?
« Reply #34 on: March 13, 2002, 01:57:13 PM »
"For example, why should the Ocean Course at Kiawah be ranked any higher because The Ryder Cup was held there?"

No...  It should be ranked higher because it's such a fine, demanding-yet-fair layout ;D  ;)  ;D...  Each of the ranking/rating systems has its strong points and weak points and, as is mentioned quite often here, all are quite subjective. Take them for what they're worth -- they are a means to entertain readers and sell magazines -- nothing more, nothing less. As a side benefit, they open the eyes of many (including some posting on this board) to the fact that golf courses are more than dirt and grass. Don't take them so seriously...

Dunlop_White

Re: GOLFWEEK:  strictly by the numbers?
« Reply #35 on: March 13, 2002, 03:06:33 PM »
Dear Mr. Mike Vegis @ Kiawah:

Please understand that this is not my thread! The above post is a candid response to the accusations and/or implications initiated by Tom and subsequently followed up by Ron.

Although Brad is perfectly capable of defending the integrity of "America's Best", I simply added my support.

After all, you are right! Course ratings promote discussion and commentary. Hence this discussion board.  ;)

Such scrutiny reveals potential shortcomings inherent to the processes. Without healthy discussions such as this, the rating systems would never be tweaked or improved for purposes of credibility or accountability.

Thanks for sharing your concerns!  :)


Ron_Whitten

Re: GOLFWEEK:  strictly by the numbers?
« Reply #36 on: March 13, 2002, 04:09:26 PM »
Gosh Brad, it must have been a tiring trip. I reviewed my earlier entry and can't see anything accusatory about Golfweek's rankings. In fact, I stated that I'm sure you had plenty of evaluations for your two Top 100 lists. What I expressed surprise about, and asked about, was your new state-by-state ranking of public courses. (Something, by the way, that was still probably just a germ of an idea when I met you at your house two years ago. How was I supposed to ask you about that back then?) Anyway, you answered my question. Three panelists is the minimum. So I'll ask another question, one that often gets asked of me. Do you get those same three panelists to all the leading contenders in Kansas? Do you get all the leading contenders in Kansas covered by at least three panelists?
 
I'll answer my own questions as far as it goes with Golf Digest (which strives for 10 evaluations for each state candidate). No, we don't get the same 10 to every contender in the state, and no, we don't get every contender covered. Some do fall through the cracks.

Three's a mighty small number of opinions when ranking courses, even in my state of Kansas. And, as I said, I'm a fan of Chisholm Trail, but I was still surprised to see it finish third.

As for the tradition category, it's meant to counterbalance the hype surrounding brand-new courses (and the fact that our panelists play a lot of brand-new courses every year). Do we cook the books? No, that implies that we strive to determine the outcome. Do we apply another factor to the panelists' scores? Sure, we apply two - Tradition points (of which panelists' evaluations on Ambiance, defined as how well a course reflects or upholds the traditional values of the game, are a part) and a Walking Bonus (of which panelists' evaluations on Walkability are a part). We tabulate those numbers and add them to the panelist averages to determine total scores. It's not cooking anything. Yeah, it's kept Shadow Creek from the top 10, a fact Steve Wynn to this day resents. And maybe it's kept a couple of grand old courses from falling out of the Top 100. What's wrong with that?
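In code terms, the arithmetic just described amounts to the following sketch (the point values are invented for illustration - the actual weights aren't given here):

    # Total score = panelist average + Tradition points + Walking Bonus.
    # The two add-ons are tabulated separately and ADDED to the average.
    def total_score(panelist_average, tradition_points, walking_bonus):
        return panelist_average + tradition_points + walking_bonus

    # Invented numbers: a grand old course vs. a hyped new one.
    print(total_score(8.1, 1.0, 0.5))   # 9.6 - tradition lifts the old course
    print(total_score(8.4, 0.0, 0.0))   # 8.4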

Your solution was two lists. Our solution is Tradition points. Different methods, same objective: identify the nation's top courses as determined by a consensus of opinions. We've got Tradition and Walking. You've got the "Walk in the Park" test.

To imply that our tradition points undermine the legitimate evaluations of our panelists seems like a cheap shot. It would be like me suggesting your panelists don't know anything because you have to educate them at periodic seminars. I know better, and I think you know better about our Tradition points. For some reason, you seemed defensive this time. Doak and I are definitely not ganging up on you. I just happened to respond to a thread he started.

By the way, I returned your calls several times last week, but mostly left no messages. I figured you were on the road. So was I.


Geoff Shackelford

Re: GOLFWEEK:  strictly by the numbers?
« Reply #37 on: March 13, 2002, 05:26:16 PM »
Brad,
It seems to me that the Golfweek ranking is getting picked on, scrutinized and questioned for a variety of reasons which all seem to come back to one issue: the notion of publishing a yearly ranking. Not only does it appear overtly commercial, but the need to publish annually seems to fuel many of the debates and questions raised here.

The real-time ranking feel of the list seems extremely unfair to many of the courses that might have a one-year slip in maintenance, or perhaps are undergoing a redo, or simply had either a high ratio of panelist visits or not enough visits. The frequency of the list also seems to promote a revolving-door policy that reflects as poorly on the panel as it does on the courses that get shown the door. I wrote many articles pointing out how Golf Digest and its panelists were irresponsible on this front, but I no longer feel they are when compared to the Golfweek list. In fact, both Digest and Golf Magazine appear quite stable at the moment, while the Golfweek lists currently don't share a similar feel, and I think that's a shame. Again, it may be less an issue of the number of courses coming and going, and more the frequency with which the ranking is published.

Otherwise, it seems you have assembled a credible panel, a thoughtful ballot and an interesting concept, and it should be getting more respect than perhaps it is; I sense it's simply a matter of how often the list appears.
Geoff

Brad Klein

Re: GOLFWEEK:  strictly by the numbers?
« Reply #38 on: March 13, 2002, 05:52:09 PM »
Well, I do plead guilty, Ron, to the charge of being "cranky" - attributable not only to six days on the road but also to having to answer recently about 300 phone calls and e-mails about our lists. You know what that's like. It helps to have all the data on hand so that when someone claims - as they just did - that Desert Mountain should be on the top-100 Modern list, I'm able to spit out the results of our votes on all five courses to show that they didn't come as close as he wanted.

In any case, I appreciate your account of Tradition. As for the extent of our coverage, the 3 who play Chisholm Trail might not be covering the whole state - as you know, that's the hard part of less-populated states, where you get spotty coverage, as opposed to S.C. or Georgia, where we have vast coverage. In both cases, it's overlapping, though I have to tell you we had one amazing rater from Knoxville who spent a week in S.D. playing everything of note there - and then some.

As for Tom Doak, I have a lingering suspicion he's been inquiring a lot lately about whether the Kingsley Club in his hometown of Traverse City, Mich. really merits inclusion - I seem to recall some sniping at the place on his part elsewhere. Well, yes, the votes are there, and though I haven't seen it I'm happy to report that others did. Instead of casting suspicions, why not simply be grateful and celebratory - as we all were with Tom and Pacific Dunes during Archipalooza.

Some have criticized us for having too much Fazio, Dye, Rees Jones and Nicklaus. Last I looked they are doing a lot of work, and much of it very good work, so they are not disproportionately represented. But there are many others too on our list, as Jim Lewis showed in an earlier post. I'm glad we have a wide variety of designers and states represented - including Doak and DeVries, as well as Crenshaw/Coore and Strantz, Engh and Kay. They're just as important and hard-working as Brian Silva, Jeff Brauer, Gil Hanse and John Fought, among many others.

Tom MacWood (Guest)

Re: GOLFWEEK:  strictly by the numbers?
« Reply #39 on: March 13, 2002, 07:04:33 PM »
Brad/Ron/Tom
What percentage of your lists do you agree (did you agree) with?


 

John Lyon

Re: GOLFWEEK:  strictly by the numbers?
« Reply #40 on: March 13, 2002, 07:21:14 PM »
Here is the link to an earlier thread where Tom Doak offered his well-thought-out criticism of the Kingsley Club. For those that do not want to follow the link, I have included his response below, but it may not be fair to read it without the context of the thread on the Kingsley Club. You can draw your own conclusions from Tom's prior post.

www.golfclubatlas.com/board/ubbhtml/Forum1/HTML/005475.html

posted November 22, 2001 02:54 AM By Tom Doak          
 
Okay, Mr. Lyon, you've drawn me out of the closet. [Actually I just got back from New Zealand, where I was too busy looking at property to lurk on Golf Club Atlas.]
I am guilty of not posting on The Kingsley Club, because I had nothing to gain by doing so. If I like it, then I'm boosting someone who is taking work away from me in my own backyard and forcing me to go to New Zealand to find better property. If I don't like it, I'll be accused of sour grapes.
However, Mike DeVries will now have to get used to the fact that if you're not a name designer, your work can be overlooked by GOLF DIGEST ... as High Pointe and Stonewall were similarly overlooked. In ten years, if he's still designing good courses, they'll get their due. That's the nature of the signature design business.
As for The Kingsley Club, I've only played it once, with Mike. I liked it much better when I played than the previous time when I had just walked it ... but there are still too many semi-blind shots on the front nine for my tastes, and I know that will cost him points with some GOLF DIGEST types, too.
In general, too, I can say that while there were several holes which I really, really liked, Mike, like most other young designers, tends to miss on the severe side when he misses with a hole. I'd say the same thing about a lot of other designers of my generation ... including Mike Strantz, Steve Smyers, and Dana Fry, and even myself 5 or 10 years ago. They do some great stuff, but they always "go for it" on every shot of every hole, like Phil Mickelson ... and we know how many majors he's won that way.
Pacific Dunes is no boring golf course, but it is slightly restrained, and all the better for it. I'll be disappointed if it doesn't make GOLF DIGEST's Best New Top Ten, but not shocked ... they've missed others, too.

Mike Nuzzo

Re: GOLFWEEK:  strictly by the numbers?
« Reply #41 on: March 13, 2002, 07:56:28 PM »
Ron Whitten or any Kansas City native,
I read the following list in the Star.
By their own admission, their ratings aren't based totally on architecture; see the note below.

Kansas City Star
Posted on Sun, Mar. 03, 2002

Kansas City area's top-10 public golf courses

1. SYCAMORE RIDGE
2. SHIRKEY
3. EAGLE BEND
4. ADAMS POINTE
5. PRAIRIE HIGHLANDS
6. ALVAMAR
7. FALCON RIDGE
8. OVERLAND PARK GOLF CLUB
9. COUNTRY CREEK
10. SUNFLOWER HILLS

here is the link:
http://www.kansascity.com/mld/kansascitystar/2002/03/03/sports/2752751.htm

NOTE: Course rankings are based on total golf experience, which includes not only course design and condition, but value in relation to price, customer service, food and beverage service, etc.

Thanks
Mike
Thinking of Bob, Rihc, Bill, George, Neil, Dr. Childs, & Tiger.

Patrick_Mucci

Re: GOLFWEEK:  strictly by the numbers?
« Reply #42 on: March 14, 2002, 03:46:10 AM »
I think it's terrific that Brad Klein and Ron Whitten are having a discussion, initiated by Tom Doak, with respect to the rating methodology.

Ratings bring readers, lots of them - just look at the 6,351 hits to the rating thread Ran started - hence they will continue to be a popular magazine feature and fuel for your fires.

But, perhaps there is room for refinement in the evaluation process.

I've always been more interested in the process rather than the results and have a few questions for Ron and Brad.

If one of your raters rates Seminole, for example, how likely would they be, when rating it one, two or three years later, to substantively alter their rating? Doesn't repetitive rating by the same individual further embed a golf course's position, rather than offer fresh insight and evaluation? Wouldn't you get a better picture if a rater was prohibited from rating a substantively unchanged golf course for a period of, let's say, five years?

When polls are done, there is a random sampling of a population or test market. No one asks the same individual the same question five times and puts down the same response five times; otherwise the poll or sampling would have no validity. The same applies to your ranking procedure. You can't have the same person rank the same course year in and year out; they are just reinforcing their initial opinion, and you get no variance, which is critical to statistical analysis.

It would seem that this is more of a dilemma, as Ron pointed out, in sparsely populated areas, and at EXCLUSIVE clubs where accessibility may be limited to a privileged few. And the insidious part is: who, being granted the keys to the kingdom, is going to provide a critical assessment? No one! Especially not someone favorably connected to the club.

I would also suggest that no one be able to rate a club they belong to, belonged to, or are joining. And if they join within five (5) years of ranking it, their evaluation should be deleted.

I would suggest the following: place a rotation cycle in your process whereby raters cannot re-rate a substantively unchanged golf course for a three (3) to five (5) year period.
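A sketch of how these two suggestions (the membership ban and the rotation cycle) might look as an eligibility check - the window and inputs follow the proposal above, with assumed data shapes, not either magazine's actual practice:

    from datetime import date, timedelta

    REST_PERIOD = timedelta(days=5 * 365)   # the suggested three-to-five years

    def ballot_is_eligible(is_member, last_rated, course_changed, today):
        # Rule 1: members (past, present, or joining) never rate their club.
        if is_member:
            return False
        # Rule 2: no re-rating a substantively unchanged course too soon.
        if last_rated and not course_changed:
            return today - last_rated >= REST_PERIOD
        return True

    # A rater who scored an unchanged Seminole in 2000 can't re-rate it in 2002.
    print(ballot_is_eligible(False, date(2000, 6, 1), False, date(2002, 3, 14)))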

Revisit your evaluation forms, and segregate non-architectural issues or features from the architectural/playability section of your form.

Reconsider the Ambiance, Tradition, and Walk in the Park criteria, possibly creating some identifying symbol to indicate, favorably or unfavorably, the presence or absence of these categories.

Let architecture alone prevail.

But, that's just my opinion.

Paul Richards

Re: GOLFWEEK:  strictly by the numbers?
« Reply #43 on: March 14, 2002, 04:05:47 AM »
Patrick:

You said: "the insidious part is, who, being granted the keys to the kingdom, is going to provide a critical assessment?"

Keep in mind that these ratings are confidential, so even if a rater got access to Seminole, played it 5 times, and gave it 5 low ratings, no one but the editors would know!
"Something has to change, otherwise the never-ending arms race that benefits only a few manufacturers will continue to lead to longer courses, narrower fairways, smaller greens, more rough, more expensive rounds, and other mechanisms that will leave golf's future in doubt." -  TFOG

Brad Klein

Re: GOLFWEEK:  strictly by the numbers?
« Reply #44 on: March 14, 2002, 05:19:08 AM »
Patrick, Paul...

At Golfweek, we don't accumulate multiple rankings on one course. Everyone votes annually. If you gave Seminole a 6 last year and give it a 7 this year, the course gets the latest, updated vote - a 7.
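In storage terms, that is simply "latest vote wins" - a minimal sketch with invented names:

    # One current vote per (rater, course); a new ballot overwrites the old.
    votes = {}

    def record_vote(rater, course, score):
        votes[(rater, course)] = score   # replaced, never averaged across years

    record_vote("Patrick", "Seminole", 6)   # last year's ballot
    record_vote("Patrick", "Seminole", 7)   # this year's ballot supersedes it
    print(votes[("Patrick", "Seminole")])   # 7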

Patrick, you want us to "prohibit" this and "ban" that. We're not policemen here. We present guidelines, we deal with ladies and gentlemen who are honorable, and we have a basic code of conduct. If anyone violates it, we throw them off the committee. I know Whitten does the same at Digest, and I can assure you the selection process for recruiting raters makes sure that such rogues are few and far between.

So we have basic guidelines, but if I were to start monitoring the travel patterns of all our raters I'd:

- spend all my time on it
- never write anything
- annoy people

Having said that, we do ask people to play/rate courses they've not seen before and that do not get enough ballots. We encourage people to do this and discourage them from rating the Pine Valleys and Cypress Points of the world. If they want to play them for their own edification, fine, but I won't authorize ratings for those courses, as we prefer to focus attention on lesser-rated/modern, out-of-the-way layouts. I can't tell you how many times I get a call from a prominent, well-rated course that's been unchanged for years asking about a rater's request. I tell them, "don't let them on as a rater."

As I read these posts about the ratings, I'm struck by the need to clarify some basic issues.

First of all, this is supposed to be fun, not a chore. It's an art, not a science, and if people disagree with you it's not because they're wrong and you're right but because people have different aesthetic judgments.

Second, there's an obsessive focus on GCA with top-100 courses. People examine the relative fall and rise of courses as if it's the end of the world, and they forget what a narrow range of statistical variance there is among top-100 courses in a universe of 17,000 courses. A top-100 Classical/Modern course is, by definition, in the top 1 percent of all courses (actually, with 200 courses across the two lists, it's 200/17,000 = 1.17647 percent). So if it rises or falls five or ten spots, that's fairly minor in regard to its overall elite status.

Third, there's not enough regard here for basic mathematical laws of statistical variability and reliability. If you have fewer than 20 votes on any course you're on shaky ground. I think all of our Classical courses and many (but not all) of our Modern courses have 20+ ratings. We're willing to go out on a limb with fewer votes - but until you get to 20 you have a lot of variance from year to year. That's why our Modern list is more volatile than our Classical list. That's okay, it spawns debate. And we try hard to get the raters out to under-counted layouts.
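That 20-vote rule of thumb tracks the standard error of a mean, which shrinks with the square root of the number of ballots. A quick illustration, assuming (hypothetically) that individual ratings spread about 1.5 points around a course's true level:

    import math

    SIGMA = 1.5   # assumed standard deviation of individual ratings
    for n in (3, 8, 20, 50):
        stderr = SIGMA / math.sqrt(n)
        print(f"{n:2d} votes: average uncertain by roughly +/- {stderr:.2f}")
    # 3 votes: +/- 0.87,  8 votes: +/- 0.53,  20 votes: +/- 0.34,  50: +/- 0.21

With only 3 ballots, a course's average can drift most of a point from year to year - enough to move it many spots - while at 20-plus ballots the year-to-year noise is about a third of a point.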

My biggest concern with Golf Magazine is that their "celebrity" raters don't go out of their way to play much new or obscure work. In any case, there is a lot to the proper design of a ratings list, and all I can say is we examine and adjust the voting process and criteria as needed so that the outcome is fair and defensible. Discussions like this help a lot.

Ran Morrissett

Re: GOLFWEEK:  strictly by the numbers?
« Reply #45 on: March 14, 2002, 05:41:55 AM »
Conversely, two of GOLF's "celebrity" panelists (Matt Lauer, Bryant Gumbel) are based in NYC, so my guess is that they have seen lots of the neat northeast courses. While there are certainly obscure courses scattered throughout the country, far and away the preponderance is in the northeast.

I like Tom MacWood's question: I wonder which list more accurately reflects the tastes of its coordinator?

Geoff Shackelford

Re: GOLFWEEK:  strictly by the numbers?
« Reply #46 on: March 14, 2002, 06:45:45 AM »
Brad,
Perhaps you missed my post, but I'll try posing the question another way.

Would you feel it's reasonable if Business Week ranked the various golf publications based on varying numbers of readers in all 50 states subscribing to the publications and filling out a form, all on a semi-annual basis, with the list fluctuating, resulting in some writers or editors losing their jobs based on the twice-a-year shifts in the list, and advertisers fleeing to the top magazine?

I suspect you or some of our friends wouldn't consider that list fun or a fair assessment of their efforts, and this is why I think many superintendents, golf pros, GMs and other people in the business probably don't find the real-time nature of the Golfweek list fun or credible - the very people that I think Golfweek caters to most.

The architects probably can deal with this more easily, but the people who work at these places inevitably take a hit because of the frequency of the list and the fluctuating nature which you write off to basic math. I think it's a matter of coming out with your list so often that people don't have time to see a course or truly absorb their feelings about it compared to other designs. Again, I love everything else about what you guys are doing and I find all of this fun, but I don't think the list is going to carry much weight, because the annual nature of it gives too many people the appearance that this is a way to sell issues, not to do something fun or interesting.
Geoff

Brad Klein

Re: GOLFWEEK:  strictly by the numbers?
« Reply #47 on: March 14, 2002, 07:59:59 AM »
Geoff, I did miss this post elsewhere, so thanks for re-presenting it.

You raise a frightening point, which is that people take this stuff seriously and jobs are based on it. I know that colleges are subject to the same annual rankings in a few major national publications and that those polls have a profound impact - to some extent for the better, but certainly with some of the undertones you suggest.

To a certain degree, publications are already under comparable competitive ratings pressures - for news, stories, ads, subscriptions and sales. Though I understand that's "the market" at work, not some central rating agency.

By the way, I don't just "write off" the mathematical fluctuation here. I admit it's built in to a certain degree, but I also clearly acknowledge that we work hard to reduce it, and that there's less fluctuation each year owing to better results and more participation.

Your thinking on this has evolved. When you were one of the pioneer Golfweek raters we talked a lot about the process. It seems your thinking has developed to the point now where you think we're better off without the rankings. As you know, when Golfweek proposed to me in 1995 that we do this I was very hesitant for a year. But I decided we needed to try to make it work, warts and all, and I happen to believe we've done a decent enough job - but your concerns still impress me and give me pause to see how we can do even better.

Tom Doak

Re: GOLFWEEK:  strictly by the numbers?
« Reply #48 on: March 14, 2002, 09:49:00 AM »
Hey, Brad, I'm not picking on The Kingsley Club or Barona Creek or any particular course on your list.  We've got plenty of exciting projects to work on without needing to bitch-slap any of our competitors, much less our friends.  (I spoke to Mike just yesterday.)

You may or may not remember a conversation about this we had a few years ago, which inspired my question to you.

When I did the list for GOLF Magazine I had the idea of a "hidden gems" list for the highest-ranked courses that didn't have enough votes [ten] to qualify for our rankings. That was a good idea, which they've since abandoned. It told the truth -- there are some good opinions of these courses, but not enough for us to be sure about them -- thereby getting other panelists curious enough to seek them out.

Out of those hidden gems, about one-third to one-half eventually made the top 100, and the rest fell by the wayside.  That seems more consistent than ranking them 60th one year, and having them fall out of the top 100 the next.

I know I'm prejudiced, but I'll still cast my vote for the diversity of the GOLF Magazine panel.  The GOLF DIGEST panel has always been top-heavy with low-handicap guys who have a narrow view of good design, while what I know of the GOLFWEEK panel seems to overcompensate in the other direction.  I'd rather have Bryant Gumbel [and Walter Woods, Greg Norman, Dana Fry, etc.].

While I'm on the subject, I still don't understand why you have ten separate ratings for each course, with one number which supersedes all of them. Your example in the last issue, where your ten ratings of PGA West averaged 6.13 and you rated it 7.5, really confused me. What are all the other numbers for?

Paul Richards

Re: GOLFWEEK:  strictly by the numbers?
« Reply #49 on: March 14, 2002, 07:56:28 PM »
Tom:

The numbers aren't meant to be added, totaled, and divided by the ten categories. The overall total is not a mathematical average.
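In other words, the ballot carries ten category scores plus a separate, holistic overall. A sketch of that shape, with placeholder category values chosen to average to the 6.13 in Tom's PGA West example:

    # The published number is the rater's overall, NOT the category mean.
    ballot = {
        "categories": [6.0, 6.5, 5.5, 6.0, 7.0, 6.0, 6.5, 5.5, 6.3, 6.0],
        "overall": 7.5,   # a rater can judge the whole above its parts
    }
    category_mean = sum(ballot["categories"]) / len(ballot["categories"])
    print(round(category_mean, 2))   # 6.13 - informative, but not the rating used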
"Something has to change, otherwise the never-ending arms race that benefits only a few manufacturers will continue to lead to longer courses, narrower fairways, smaller greens, more rough, more expensive rounds, and other mechanisms that will leave golf's future in doubt." -  TFOG
