Welcome to the Golf Club Atlas Discussion Group!

Bill Shamleffer

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #25 on: March 31, 2009, 05:34:44 PM »
It would be good to see a golf publication do for golf course rankings what Sight and Sound does for movie rankings.  Sight and Sound is a British monthly magazine devoted to movies.  They produce a very famous ranking of movies, but they only re-do the list every 10 years.

They currently use a small group of film directors and film critics, and then list each group's rankings as well as each individual's rankings.  The last list was in 2002, when close to 300 people submitted lists.  Of course this list creates disagreements, and the whole ranking process is very flawed.  But at least it offers the public each contributor's personal list.  This allows readers to see not only the "group opinion" but also the individual opinions of a variety of different people.
http://www.bfi.org.uk/sightandsound/topten/poll/

Granted it is a lot easier to see every significant movie, but there are some aspects of Sight & Sound's approach that can be applied to golf course rankings.

First, it would be better to limit these rankings to no more than once every five years.  (Although this is unlikely to happen, given the magazine sales & PR these lists generate.)

Second, each rater's list should be published.  (If not all in the magazine, at least list them all online.)

Third, get rid of all of the categories of consideration in the rating process.  It is fine for the editors to give a list of items for each rater to consider, and to even list a weighting for each item (e.g. conditioning 25%, routing 35%, historical significance 10%, etc.), but in the end it is better to just have each rater submit a list without any other breakdown or details.

The most interesting historical lists of golf course rankings are mostly by one person, and are just rankings of courses, with no breakdown by conditioning, etc.
“The race is not always to the swift, nor the battle to the strong, but that's the way to bet.”  Damon Runyon

Andy Troeger

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #26 on: March 31, 2009, 07:42:51 PM »
Matt,

I think you should just come up with your own top 100 list--that way it will be what you think deserves to be there and we can critique it!  ;D

In all seriousness, you might be one of the few that could put together a reasonable list just based on what you've played. It would be an interesting read. I don't think you're one that's ever going to like a consensus listing no matter what results come out of it. Some likely would be closer than others.

Mike Hendren

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #27 on: March 31, 2009, 07:49:21 PM »
Matt, please post your top 100. 
Two Corinthians walk into a bar ....

Sean Leary

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #28 on: March 31, 2009, 07:52:02 PM »
Please Matt.

Mike Sweeney

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #29 on: March 31, 2009, 08:06:44 PM »
Perhaps Matt will become golf's equivalent of The Bracketologist:

http://sports.espn.go.com/ncb/bracketology

Bart Bradley

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #30 on: March 31, 2009, 08:08:34 PM »
Matt:

If you asked an expert in statistics, they could look at the data and determine how large a panel would be necessary to produce statistically significant differences in the overall rating numbers (this is what would be called the "power" of the rankings).

My guess is that by pure statistics, many of the courses on the magazines' lists are not rated "statistically significantly" different from each other.  In other words, the difference in the final ratings between, say, course 60 and course 90 is due only to bias, chance, and the "luck" of which panelists saw which courses.  This effect alone makes the rankings dubious at best.
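Bart's point about power can be made concrete with a back-of-the-envelope calculation. Below is a minimal sketch using the standard normal-approximation sample-size formula for comparing two means; every number in it (the rater standard deviation, the true gap between courses, the function name itself) is an invented illustration, not data from any magazine's panel:

```python
import math

def panel_size_needed(sigma, delta):
    """Approximate ratings-per-course needed for a two-sample
    comparison of mean scores to detect a true difference of
    `delta` points, when individual raters' scores have
    standard deviation `sigma` (alpha=0.05, power=0.80)."""
    z_alpha = 1.96  # two-sided alpha = 0.05
    z_beta = 0.84   # power = 0.80
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)

# Hypothetical numbers: raters disagree by about 0.8 points,
# while courses ranked ~60th vs ~90th differ by only 0.1 points.
print(panel_size_needed(sigma=0.8, delta=0.1))  # → 1004
```

With those assumed numbers, each course would need on the order of a thousand ratings before the gap between mid-list courses was statistically meaningful, which is Bart's suspicion in a nutshell: with far fewer raters per course, the ordering that deep in the list is mostly noise.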

It is my opinion that these lists are far too affected by bias, accessibility, and even politics to be tremendously useful.  They are interesting and allow for hype, marketing, and magazine sales.

My idea for the best way to produce such a list is entirely different from any of the magazines and the basics are stolen from the publishing industry.

My wife is a contributor to a well known book reviewing magazine.  She is assigned to anonymously read and review certain books.  She passes on a written evaluation to a main editor who, if warranted, will also read and review the book.  Then a small committee including the main editor passes final judgement and the magazine publishes the final opinion.

I honestly believe that a similar system, with the right people, would produce the best list.  It would be biased toward the ideals of the committee, which could be published for everyone's knowledge.  The initial review(s) would need to be performed by a group of anonymous panelists, and the information garnered would then be filtered through the small committee.

This technique would not necessarily produce an "accurate" list, but at least it would not be biased by preferential treatment for panelists and the results should be more consistent with the published set of ideals.

Just my opinion,

Bart



Matt_Ward

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #31 on: March 31, 2009, 09:34:00 PM »
Michael H:

I never remotely claimed superiority -- thanks for the plug though ! ;D

Bill S:

I like the idea of pushing overall ratings to a time frame of, say, 3-4 years. This would give sufficient time to see what is happening at different courses.

Andy:

I don't doubt you are right.

I was a GD panelist for 17 years and always tried to convince Ron W that some sort of bifurcated panel would work best. What many people don't realize is that Digest years ago did have state / regional selectors and a smaller national panel -- many on the national panel were more of the who's who types in golf (e.g. Sam Snead, Gene Sarazen, etc, etc).

It's possible to streamline the process, because I look at what Digest is producing for its top 100 and I'm just shaking my head in disbelief. When Ballyneal isn't even listed -- something is big time wrong. Ditto the omission of a place like Kingsley.

Sad to say, but Digest pays so much heed to private clubs, and the fact is that many public courses that have opened in the last 20-25 years are likely better than a good number on the list you see being trumpeted time after time.

I did create my own personal top 50 metro NYC listing. Going to a personal top 100 would take a bit of time to sort out -- likely I would divide it up between the top 100 private and the top 100 public I've played.


Bart:

Thanks for your comments -- however ...

You have people who weigh the value of one course and then you have completely different people weighing the value of another course -- at no time do you have meaningful spillovers from people playing a solid cross section of courses.

As I said before -- there needs to be -- as Lou D pointed out -- some form of scaled voting power. Too many people are regional in their application of rating numbers -- some who do travel only see and play courses from other areas on a once-in-a-lifetime basis.

There are solutions to improving the overall process. No doubt the final result is still a subjective account. But, simply adding more and more people to the overall rating numbers doesn't increase real meaningful analysis -- all it does is increase people.

Bart, your ideas on reform are somewhat patterned after what I and a few others have suggested: a scaling toward those who are really national panelists versus those who are simply regional types.

Clearly, the process can be streamlined but the updated Digest list -- can't wait to see the state ratings -- is so inane as to be comical.


Jeff:

So what's your solution?

Add more raters to the braintrust they've got now!

Panelists can be broken down to regional and national levels. The people who handle the paperwork for the different mags know full well which people really get around and therefore are able to see / play the courses in question.

When you have one person who simply plays within a singular 100-mile circle of courses, and you give him the same weight as those who see and play anywhere from 50 or more courses in a year while criss-crossing the nation, you need to weigh such factors into the equation.
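The "scaled voting power" Matt and Lou describe could be sketched roughly as a weighted average, where each rater's score counts in proportion to how many courses they played that year. This is purely illustrative -- the weighting rule and every number below are assumptions, not any magazine's actual method:

```python
def weighted_course_rating(ratings):
    """ratings: list of (score, courses_played_this_year) tuples.
    A rater's vote is weighted by how widely they played, so a
    national rater who saw 50+ courses counts for more than a
    rater who stayed within a 100-mile circle."""
    total_weight = sum(courses for _, courses in ratings)
    return sum(score * courses for score, courses in ratings) / total_weight

# Hypothetical panel for one course:
ratings = [
    (9.0, 60),  # well-travelled national rater
    (6.5, 8),   # regional rater with a small comparison base
    (8.5, 45),
]
print(round(weighted_course_rating(ratings), 2))  # → 8.62
```

An unweighted average of those three scores would be 8.0; weighting by breadth of experience pulls the result toward the raters with the larger comparison base, which is the effect Matt is arguing for.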

Tom Paul said it best: this is just a listing without any real sense of education -- it's just courses stuck together through some hodgepodge of numbers that certain individuals have decided to throw forward.

I'll say this again -- in the event you missed it -- there are ways to streamline the process and still get the necessary input from different levels. The question is whether the mags have the will to see the flaws that can be easily corrected.

Jeff, there is no perfect solution to this. I know that without doubt. I just think if someone were really networked, you would get as much info -- if not more -- and could really elevate the top tier places -- many of which happen to be relatively newer courses, oftentimes by architects who don't have star power status. That's all.

Jeff_Brauer

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #32 on: March 31, 2009, 10:36:24 PM »
Matt,

......"could really elevate the top tier places -- many of which happen to be relatively newer courses and often times by architects who don't have star power status."

Keep talking, you will win me over yet!

I wonder, given the publishing business, if the mags can find the money to support the will to improve the ratings?
Jeff Brauer, ASGCA Director of Outreach

Paul Richards

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #33 on: March 31, 2009, 10:56:47 PM »
Matt:

Whatever the 'consensus' gives you, it HAS to be better than that Bizarro list that I just saw posted today on another thread..........


"Something has to change, otherwise the never-ending arms race that benefits only a few manufacturers will continue to lead to longer courses, narrower fairways, smaller greens, more rough, more expensive rounds, and other mechanisms that will leave golf's future in doubt." -  TFOG

Matt_Ward

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #34 on: April 01, 2009, 10:27:50 AM »
Jeff:

To elaborate on what I mentioned ...

Far too many raters are star chasers -- they are intent on playing the top tier layouts they have always heard about (e.g. Cypress, Shinnecock, Sand Hills, et al) and likely do the same thing with certain designers such as Fazio, Doak, Nicklaus, etc, etc.

During my time as a rater for GD it was not uncommon for Ron Whitten to implore raters to play a varied menu of courses beyond the usual suspects. It got so bad that Digest even went the route of assigning locally-based courses so that different people would need to play them.

The issue with improving the ratings, Jeff, stems from trying to go "on the cheap" and engage the willing services of countless people scattered around the country. Although you do get some superb contributions from some marvelous people, the sheer bulk of them are operating at a very low level -- the proof is in looking at the "consensus" final version that Digest has outlined, not only the current version but recent ones as well.

The issue is not the $$ but a desire to do the job thoroughly. Years back Digest used to take on issues (e.g. slow play campaigns and the like) -- now the magazine is really divorced from such worthy efforts. The ratings process is not a complicated one -- it does require solid networking to ascertain what is happening in any given area. In years past it would have been incredibly hard to get reliable info -- the Internet age has forever changed that.

What's so funny is that results from years ago were better than what you see today. And the sad part is that there are plenty of courses and plenty of unheralded designers who are left out in the wilderness while a number of tired predictable courses continually grab the spotlight.


Lou_Duran

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #35 on: April 01, 2009, 11:46:55 AM »
Mark B,

I don't know what you mean by "thanks for the bone".  Care to elaborate?

Matt W,

Education is important.  Everyone singing from the same hymn book is also helpful (I once sat there in total disbelief listening to an impassioned speech from a well-known male rater on why a wonderful classical course should be docked big time because it didn't have an adequate set of women's tees).  But nothing replaces experience.  The most open-minded, highly educated rater who has not played a good sampling of courses in each of the top four or five deciles has to have a problem assigning a rating value that is ordinal in nature.  Isn't it analogous to completing a budget variance analysis without knowing the budget?  The best one can do is to compare to other available information, say prior history or other courses one is familiar with.

No doubt Ron Whitten, Brad Klein, and others have thought long and hard about these issues.  It could very well be that what they've come up with through time is the best that can be had given all the factors they have to deal with.  There are a lot of ratings that I have to scratch my head about, but reasonable, smart enough, experienced people have those types of disagreements all of the time.  Personally, I can't understand how you and Redanman are so enamored with The Golf Club, but so what!

Matt_Ward

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #36 on: April 01, 2009, 01:42:13 PM »
Lou:

The major problem I have with the pubs -- Digest, GW, et al, is the silly idea that adding more and more raters provides better overall coverage. It certainly does not!

No doubt you get people who have opinions -- but opinions without some sort of basis behind them are meaningless. Lou, you also have people with ingrained perceptions -- I mentioned the ones that relate to Texas and how golf there has improved. You have some northeast blue bloods who think the Alamo battle is still being fought.

Disagreements will happen -- I'd like the mags to selectively include minority opinions -- if not in the pages of the pub, then through their websites.

In the Internet age the speed of quality info is better than it has ever been. Now no course can live in the shadows if something is amiss. That's good. The mags simply don't realize that the army of people they are engaging could be better prepped -- re-evaluating their roles at the state / national levels would be a start.

Last item -- On TGC -- let me know what you see as being deficient with the course. I'd be curious as to the Pete Dye courses you see as being superior to it.

Lou -- last item -- disagreements are good and should happen. Just demonstrate that the people making such decisions have a leg to stand on besides the pablum masquerading as solid info that one sees with such outputs as what Digest now claims as America's top 100 courses.




Paul Richards

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #37 on: April 02, 2009, 12:16:24 AM »
I think the point is that to truly understand the difference between a 'good' and a 'great' golf course, you must also understand what an 'average' course is as well as what is just plain 'crap'.

Part of the education of a Rater is to play some of the 'crap' and also courses that used to be on the lists, so you can understand the difference between what people value today versus what used to be considered 'quality.'

In order to kiss a princess, you have to kiss a few frogs first to understand the difference.


Matt_Ward

Re: Can Consensus Ratings Work & How Large a Panel is Really Needed ?
« Reply #38 on: April 02, 2009, 04:00:53 PM »
Paul:

I hear what you say.

I just have to wonder if Digest does any introspection on the process and the expansive nature of the panel they now use.

I do agree that playing a number of dogs can really make you appreciate the top tier ones.

But just keep this in mind -- there were people who knew so little about a place like BB until it hosted the US Open. I think too many people are simply interested in cherry-picking the elite courses, never realizing how many fascinating other courses are out there -- some of which should be rated among the nation's finest.