

Michael Moore

Re: Dr. Klein's article on Magazine Rankings
« Reply #25 on: March 27, 2003, 06:29:12 PM »
I could not care less about rankings, what industry insiders and professionals think about this discussion board, who stole what from whose web site, which clubs are so precious that we are not allowed to discuss what is happening to them, which anonymous internet stranger insulted another, the minutiae of course financing and permitting used as intellectual weapons, the possible exhaustion on this discussion board of the unusually finite topic of golf course architecture, which group of globetrotting bourgeois defeated the other, which utterly subjective aesthetic distinctions are preferable to others, and nonsensical hypothetical questions.

It is easy to tune all of that out. Take a look around this web site and tell me if anything remotely compares.

I'm writing because I always have to when George Wright Golf Club is on the table. What could they possibly be complaining about? That place is jam-packed at all times! What a course!

Hope to get the camera down there this summer and post some pictures. Isn't that what this site is about?
Metaphor is social and shares the table with the objects it intertwines and the attitudes it reconciles. Opinion, like the Michelin inspector, dines alone. - Adam Gopnik, The Table Comes First

Jim_Kennedy

Re: Dr. Klein's article on Magazine Rankings
« Reply #26 on: March 27, 2003, 06:42:29 PM »
Mark,
I think the idea has some merit.  If the listee says "Here are the numbers our teams came up with for these 100 courses. A course must have a minimum number of "x" in "x"  number of categories to make the list. Some courses score higher than others in some categories but lower in others. Figure it out and have fun while you are doing it", then I think the list would become more interactive and thought provoking. It would make for a more personalized list.
"I never beat a well man in my life" - Harry Vardon

Paul Richards

Re: Dr. Klein's article on Magazine Rankings
« Reply #27 on: March 27, 2003, 08:13:44 PM »
Here is the actual column from Superintendent's News:




Klein: ‘These lists don’t make anyone happy’

 
I always knew that people took our annual course ratings seriously, but this was ridiculous. Immediately after Golfweek’s annual list of top courses appeared in print earlier this month, I got a call from a lawyer in the City of Boston Law Department asking about our ratings. Turns out, he wasn’t referring to our list of top 100 classical courses (built before 1960) or the top 100 modern (1960 and after), but to our list of best public-access layouts in each state - in this case, not surprisingly, Massachusetts. (The public-access list appeared only in Golfweek.)

By way of a preface, he briefly mentioned some litigation involving the management of a city layout, George Wright Municipal. His subsequent questioning followed perfect deductive logic, moving from the general to the particular. My inquisitor was curious what role maintenance and conditioning played, and I explained to him that it was but one, relatively subordinate factor among many architectural concerns. He then asked some details about how George Wright Municipal had been rated No. 2 in the state last year and now stood at No. 5.

At the end of the conversation he asked whether I’d sign an affidavit attesting to what I had said. Wow, I thought, this is serious. So I signed the document and now await word whether it might become part of a legal fight.

When we started rating courses systematically in 1996, our hope was to educate golfers about what makes for good and bad courses. However, I’m finding out that the ratings are used as ammunition in turf wars involving egos, marketing and excessive expectations. Each year, we seem to arouse more ire and concern, a factor I attribute less to our mistakes than to an increasingly competitive work culture in which overzealous demands have become the norm.

This year, in the aftermath of our publishing the lists, my phones and e-mail lighted up more than ever.

Consider the call I got on behalf of one Midwestern course that had fallen a handful of spots on our Modern list. The caller said his committee would be upset and wanted to make sure this wouldn’t happen again.

“Tell them they’re lucky to be ranked where they are,” I said. “And that’s not a judgment on the course. There are 10,000 modern courses out there, with 300 or so coming on line each year, and you’re in the top 1 percent of them.”

The strange thing is, when it comes to rankings, the bitterness far outweighs the joy. You rarely hear from folks who are grateful to have made one of the lists.

Occasionally, an appreciative superintendent or an architect with a breakthrough course writes. Mainly what I get, however, is grumbling about not being rated; about having dropped marginally; about another course that made it; about being too low on the list. One architect who calls me annually (to complain, but only a little) said it best: “These lists don’t make anyone happy.”

We know that going in, and we also know that our work is taken seriously, even if too seriously. Lest anyone think we enter into such list-making casually, I can assure you we proceed systematically. I have more Excel spreadsheets in my PC than I care to be responsible for, with printouts galore showing statistical analyses, histograms, standard deviations and other individualized profiles covering our 235 raters and their combined 25,904 votes for the 1,429 courses on our nomination list.

None of which guarantees that our readers and the golfing industry will like the results.

One problem with such polls is that instead of measuring something, you begin to influence its shape. I’ve heard of architects and owners sitting down with balloting criteria for various lists and making sure they anticipated those elements. Likewise, I’ve gotten a few calls from superintendents who were curious how they ranked on our “basic quality of conditioning” standard in order to improve their performance. Like I told the attorney from Boston, that’s only one, rather subordinate element in our voting.

What I didn’t tell him is that I thought the whole process was getting out of hand. It’s fine to use course rankings to appreciate and improve a golf course. But when the lists become a weapon, then good, hard-working people are going to get hurt.

• • •

Bradley S. Klein is editor of Golfweek’s Superintendent News. To reach him e-mail bklein@golfweek.com.

 


Date Posted: 3/28/2003

"Something has to change, otherwise the never-ending arms race that benefits only a few manufacturers will continue to lead to longer courses, narrower fairways, smaller greens, more rough, more expensive rounds, and other mechanisms that will leave golf's future in doubt." -  TFOG

Scott_Burroughs

Re: Dr. Klein's article on Magazine Rankings
« Reply #28 on: March 27, 2003, 09:12:12 PM »
Jim,

Semi-private=public with a membership where tee times may be restricted to the public at certain times for member play.  Only truly private club members are shut out from the Publinx.  I am a member of a semi-private club and am eligible for the Publinx (I played in a Publinx qualifier a couple years ago).

As you said, Taconic is semi-private and thus is eligible for the public listings, as I'm sure quite a few courses on the list are semi-private as well.  They just choose to be labeled private for some reason.  They probably don't want mass (pun intended) hordes flooding the course.

Brad Klein

Re: Dr. Klein's article on Magazine Rankings
« Reply #29 on: March 28, 2003, 07:18:12 AM »
Paul, the reason I didn't simply post my column is a little legal matter called "copyright," which you and GCA have now technically violated. Personally, it doesn't bother me, but writers need to be careful about protecting their stuff, and some publications are assiduous about doing so. Golfweek doesn't worry about these things, but when work you do is distributed free of charge, a certain erosion of rights takes place, and in extreme cases people benefit unfairly.

John_Conley

Re: Dr. Klein's article on Magazine Rankings
« Reply #30 on: March 28, 2003, 07:36:17 AM »

Quote
John Conley,
I think ratings could be improved in how they are presented. Just give us the numbers and forget about totaling them up and telling us who is #1. If course A gets more "tradition" points than course B but fewer "conditioning" points, and course C gets more "shot value" points than either A or B, just show us that!  Let the public debate who's number whatever on the list of 100 courses provided to them. This way no one gets their noses out of joint, and there would be 10 times more locker room conversations, which would add to the buzz and probably add to the popularity of rankings, especially among people like myself who don't really care one way or the other if ratings are published.

Jim:

Whose rankings are you talking about?  It would appear Golf Digest.  If so, it is a VERY good thing they break it out for people because they've had Sand Hills 36th and 41st the last two times - impossible to explain unless you realize they are basically ineligible for Tradition points and suffer accordingly.

In case you aren't aware, Dr. Klein oversees a different panel for GOLFWEEK.

John_Conley

Re: Dr. Klein's article on Magazine Rankings
« Reply #31 on: March 28, 2003, 07:46:18 AM »

Quote
I do, however, disagree with part of what you said in that I feel it should be harder for a course to go backwards than forwards.  It should be much harder for a course ranked #57 to fall off the list than for a new one to show up in that position.

...

But when a course goes more than 43 spots backwards (remember no one knows if Hollywood is now at #101 or #300), that is much harder to explain.  I do realize that a course with only a few votes can move significantly, but a change like Hollywood tells me there is wide variation in the opinions of the panelists voting on it.  

Mark:

Agreed.  It might be hard to explain to the members of Hollywood.  It also could be said they were lucky to make the list in the first place.  (I don't know of any other similar accolades they've received.)

If the results were presented in a different format it wouldn't look the same.   The list could just as easily be Top 10, Next 20, Next 30, and Next 40 - or Top 10, Next 15, Next 25, Next 50.  Relative ranges are much more indicative than whether or not one is 16th or 18th.  In my latter example, a course falling from #57 to off the list would barely catch your attention.

I have many loves in life and one is basketball.  Last night Nick Collison tore up Duke, who by this time was supposed to be led by McDonald's All-America Casey Sanders.  Sanders is but a bit player and Collison a household name, yet it was Sanders with a better ranking from recruiting people.  Rip Hamilton led Connecticut to the National Championship and stars in the NBA, but was ranked around #35 when he was a high schooler.  You can get caught up in whether one is a few notches higher or lower than another, or you can just accept that there may not be much difference between #51 and #99 in such an endeavor.  (Remember, the population is NOT just 200 courses, but 20+ THOUSAND.)

Shades of grey.

Jim_Kennedy

Re: Dr. Klein's article on Magazine Rankings
« Reply #32 on: March 28, 2003, 08:05:27 AM »
John,
Substitute any categories of criteria used by any panel of raters, that was not my issue. I was suggesting that any list, using any criteria, could become interactive and would become more personalized if the category scores were not totaled by the listee, letting the reader come to his or her own conclusions.

The first Golfweek that I ever read was printed on folded over news stock, about 20 years ago.
"I never beat a well man in my life" - Harry Vardon

John_Conley

Re: Dr. Klein's article on Magazine Rankings
« Reply #33 on: March 28, 2003, 08:16:45 AM »
Jim:

Golfweek's approach, where a "bottom line" number supplants the score given to each category, has come under fire from some for doing what you are asking.  Can't please everyone, and that is the point of Brad's article.

Charlie Stine's GOLFWEEK is not the same as Rance Crain's.  In fact, if you like the old one, I can send you issues of Stine's current rag.  Just as bad as ever, with favorable coverage given to the courses where his son had an involvement.  Journalism at its best.

Jim_Kennedy

Re: Dr. Klein's article on Magazine Rankings
« Reply #34 on: March 28, 2003, 08:30:50 AM »
John,
The old Golfweek was a good place to keep up on who won what and where. Then, as now, I didn't read it for the rankings. Golfweek has done nothing but improve since the first time I picked one up.
"I never beat a well man in my life" - Harry Vardon

John_Conley

Re: Dr. Klein's article on Magazine Rankings
« Reply #35 on: March 28, 2003, 08:37:52 AM »
Jim:

Today's GOLFWEEK does a great job of covering all golf.  Minitour, Junior, College, Amateur - I don't think anyone even comes close.  If all you follow are the Professional Tours, I think "Golf World" may be better.

A big change happened about a decade ago or whenever Stine sold the magazine, making the move to a full-blown national publication complete.

Jim_Kennedy

Re: Dr. Klein's article on Magazine Rankings
« Reply #36 on: March 28, 2003, 02:45:54 PM »
John,
I don't remember saying that all I followed were the Pros, but the old Golfweek did a pretty good job of covering more than just professional golf, although the new Golfweek does it better. There weren't many options back then, and I have continually enjoyed the publication for some twenty years. I have no opinion on Charlie Stine other than that he started a magazine that has flourished into what we see now, albeit under new ownership.


"I never beat a well man in my life" - Harry Vardon

Danny Goss

Re: Dr. Klein's article on Magazine Rankings
« Reply #37 on: March 28, 2003, 06:32:13 PM »
Brad,

In your article you say ".....when we started rating courses.....our hope was to educate golfers about what makes for good and bad courses".

My question is whether the exercise of rating is so subjective as to render the process just a matter of opinion. And we all know that there are plenty of opinions in the world today!!

If it is as subjective as I think, does it serve any useful purpose?

Because many golfers are not as architecturally inclined as those who rate, does that make the raters' opinion more worthwhile than that of those who have no idea about a course's merits and who confuse conditioning with architecture?

It's probably easier here in Australia, where the list drops sharply after about number 6-8, but isn't the whole thing just about opinions? And who is to say mine is more valuable than others' - no matter how less (or more) educated they may be?

I often hear people tell me how great a course is when they are really talking about the conditioning - the rest of the course seems to have escaped their notice!

Tom_Doak

Re: Dr. Klein's article on Magazine Rankings
« Reply #38 on: March 29, 2003, 10:50:10 AM »
I'll take the blame for ranking courses in order.

When I started doing the list for GOLF Magazine in 1983, it was the first time that anyone had bothered keeping track of votes in any statistical way.  GOLF DIGEST's list was "top ten," "second ten," "second fifty" as someone suggested above, and every two years they got the panel of about 25 people together at the US Open and conferred and made a few adjustments on the basis of their discussion.  There were no numbers.

When I began to tabulate the results for the GOLF Magazine vote, the courses were simply listed in order of their average rating.  Previously the magazine had just listed the top 50 alphabetically, but George Peper asked me if we could list them in the order they finished, and I said why not?  Order is okay -- to me, "10th" and "11th" is better than listing one of those courses with Pine Valley and the other equal with the second ten.

What few people understand is that after the consensus top 50 courses, the numbers are very close together in every ranking, so it's silly to take very seriously the difference between 65th and 85th.  But everyone does.

Likewise, there is even less difference between #100 and #150 ... which makes me wonder why anyone bothers to rate courses down to #100 at all.  But if we only rated the top 50, it would hardly ever change, and CHANGE IS WHAT CREATES THE INTEREST in reading the rankings.

Brad gave a speech in Oregon in which he verbalized the true reason for rankings.  He said the reason for rankings is not to gain advertising, but to increase the prestige of your magazine.  Courses refer to their "GOLFWEEK ranking," and hang a plaque with the GOLFWEEK list on their walls, and so more people subscribe to GOLFWEEK and take it more seriously.

I wonder why he didn't print that in the magazine? :)

TEPaul

Re: Dr. Klein's article on Magazine Rankings
« Reply #39 on: March 29, 2003, 11:19:57 AM »
TomD:

So all this stuff with rankings is your fault huh?  ;)

Well then, could we interest you in the Michelin system, or perhaps the ten-goal polo system, which may be different from your own Doak rating?

Michelin doesn't seem to have the drawback of ranking really good courses against each other, possibly to their eventual detriment. I guess, theoretically, if a bunch of them are great and the best they can be, we would have a bunch of three-stars or five-stars, or whatever the top restaurants are in Michelin.

TEPaul

Re: Dr. Klein's article on Magazine Rankings
« Reply #40 on: March 29, 2003, 11:23:02 AM »
Or maybe even like generals---one-two-three and four stars and once in a blue moon along comes an Eisenhower who makes five stars! But there are times when there aren't any of those--like now! Does anyone really expect Tommy Franks or Myers to ever make five stars?

TEPaul

Re: Dr. Klein's article on Magazine Rankings
« Reply #41 on: March 29, 2003, 11:34:02 AM »
Michael Moore;

You're a good man! I like that post #25! Leave it to the level headedness of a Mainer, I say!

Question:

"Have you lived in Maine all your life?"

Mainer:

"Aaah, nahht yehhht!"

Matt_Ward

Re: Dr. Klein's article on Magazine Rankings
« Reply #42 on: March 30, 2003, 10:46:12 AM »
Tom Doak's point about assigning specific numbered positions to golf courses clearly itemizes what is so difficult to do, yet it still continues. I believe GD's former way of grouping courses in some larger context (first ten, second ten, etc.) is far more practical and realistic because, unlike sporting events, which provide a clear and definite outcome (i.e. who wins and loses), the subject of golf course evaluation / ratings is not something that can be carried out in that manner.

Even if courses were grouped in broader tiers, there would still be points of difference that engender much discussion. For example -- some courses may be borderline between the 10th and 11th positions, so a place in the "first ten" versus the "second ten" is certainly worthy of major debate.

I don't see how courses can be numbered #85 or #53. I know the motivation of the magazines is to create a "buzz," but there's a way to do this without jumping off the cliff and moving ahead with a college football-style ranking that dissolves into absurdity.

I agree with Mr. Doak -- the listing of courses doesn't change much at the very top -- nor should it. Those courses have achieved that high classification for a specific reason, and unless they demonstrate some clear deficiency, or other courses truly show an even higher level of accomplishment, you will likely not see much movement. That's why I enjoy seeing rankings of courses in particular categories (i.e. best public courses under $50, or best public courses per state) or other groupings that make for interesting inclusions that a general "Best of All in the USA" list will likely not provide.