Andy Troeger

Re:The Failure in State Ratings
« Reply #50 on: April 18, 2007, 12:08:48 PM »
Jerry,
I don't disagree with your statement that raters find their way to courses even if they have to work at it. However, even a change of .2 of a point on the overall total can be a huge difference in where a course ends up being ranked. For the national ratings you need 30-40 minimum to be included, so admittedly the change there is not likely to be too significant.

However, for a state list the minimum is more like 10 raters. If a private club does not want raters and is not ranked very highly, it's not unheard of for it to stay close to that minimum. Thus the volatility if it gets a few good or bad ratings.
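To put rough numbers on that volatility, here is a minimal sketch in Python. The averages, the new score, and the pool sizes are invented for illustration (loosely following the 10 vs. 30-40 minimums mentioned above); this is not how any magazine actually computes its lists.

```python
# Illustrative only: why a small rater pool makes a course's average more volatile.
# All numbers are hypothetical.

def updated_average(current_avg: float, n_ratings: int, new_rating: float) -> float:
    """Average after folding one new rating into n_ratings existing ones."""
    return (current_avg * n_ratings + new_rating) / (n_ratings + 1)

course_avg = 7.0       # hypothetical current average score
new_score = 5.0        # one unusually low new evaluation

for pool in (10, 35):  # roughly the state vs. national minimums
    shifted = updated_average(course_avg, pool, new_score)
    print(f"{pool} raters: average drops {course_avg - shifted:.2f} points")

# 10 raters: average drops 0.18 points
# 35 raters: average drops 0.06 points
```

With ten raters a single score moves the course by almost the .2 of a point that can reshuffle a state list; with thirty-five it barely registers.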

There also has to be an impact when old ratings fall out after however many years. Those older ratings might have been better or worse than the subsequent evaluations, and hence a course will jump up or down without having changed anything in the last 2-5 years.

Matt,
I certainly agree with your premise that a rater who can play a course multiple times should be able to give a more accurate rating. However, do locals always give their home course or rival courses a fair shake and evaluate only the course itself?

The problem with this entire argument, no matter what opinion you have, is the lack of evidence. I don't doubt that there are instances of what you've mentioned. I also think there are instances that work the other way around. It would take a lot more information than any of us have (other than those in charge of the lists, who do have all the data) to actually "prove" any of it. The rest is just conjecture.

Matt_Ward

Re:The Failure in State Ratings
« Reply #51 on: April 18, 2007, 12:15:32 PM »
Andy:

A rater should NOT be rating their home course for the simple reason of a clear conflict.

Also, I have pointed out the evidence countless times. Simply read the examples previously provided. People can take them at face value and accept them or reject them. It's just my opinion and can be accepted or rejected.

Andy Troeger

Re:The Failure in State Ratings
« Reply #52 on: April 18, 2007, 12:35:31 PM »
Matt,
Your last sentence there sums up what I was getting at. We're talking about opinions, and I don't think there is a "one size fits all" answer to it.

Your examples to me are evidence that you don't feel the state ratings are done very well...which I think we've covered already.

I would agree with raters not rating their home course, but I think you could argue there are situations where it can be just as much of a conflict to rate competing courses as it is to rate your own home course(s).

Matt_Ward

Re:The Failure in State Ratings
« Reply #53 on: April 18, 2007, 12:45:02 PM »
Andy:

Without sounding like an egotist -- there are opinions, and there are opinions formed from having done serious legwork with specific and countless visits to a wide range of facilities. Sample size does add a good bit in forming opinions when courses are cross-compared. So does cogent analysis.

No doubt -- it's simply opinions. People can choose to accept or reject them at will.

On your last point -- I don't understand when you say "it can be just as much of a conflict to rate competing courses as it is to rate your own home course(s)." You need to run that by me again with a much more detailed explanation.

Jim Franklin

Re:The Failure in State Ratings
« Reply #54 on: April 18, 2007, 12:52:45 PM »
Matt -

I understand where Andy is coming from. For example, my home course, Baltimore Country Club, has dropped out of the top 100 and is now #2 in Maryland behind Congressional. If I were the type to do this (and I am not), I could go play Congo and give it artificially low scores in an effort to get my course ahead of it in the state rankings. I think that is where he is coming from.
Mr Hurricane

Matt_Ward

Re:The Failure in State Ratings
« Reply #55 on: April 18, 2007, 12:58:49 PM »
Jim:

I can understand the temptation for people to lower other courses in order for their home course to move ahead. Quite frankly, I would think a simple spot review of what raters who belong to "x" club submit for other clubs could be analyzed to prevent such a situation from happening.

For example, if Congressional were averaging an 8 across the board from, say, 15 different raters, and then you get a rating of 2 from someone who belongs to a club in the state, you could see how the situation you mentioned can happen and how it can be identified.
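That kind of spot review could be as simple as flagging any score that sits far from the rest of the panel. A minimal sketch, with invented numbers and an arbitrary cutoff; this is not any publication's actual procedure.

```python
# Illustrative only: flag ratings that deviate sharply from the panel consensus.
from statistics import mean, stdev

def flag_outliers(ratings, z_cutoff=2.0):
    """Return ratings more than z_cutoff standard deviations from the panel mean."""
    avg, spread = mean(ratings), stdev(ratings)
    if spread == 0:
        return []
    return [r for r in ratings if abs(r - avg) / spread > z_cutoff]

# Fifteen raters scoring 8, plus a 2 from a rater who belongs to a club in the state.
panel = [8.0] * 15 + [2.0]
print(flag_outliers(panel))  # -> [2.0]
```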

My God -- I would hope people are honorable in such endeavors, but I know the sheer temptation under which some people will operate in such matters.

Adam Clayman

Re:The Failure in State Ratings
« Reply #56 on: April 18, 2007, 01:03:42 PM »
Why is the mid range of state lists important? Especially if there's a strong contingent who feel that top-tier rankings are superfluous.

If Bobby Jones was correct, and it goes without saying I think he was, the sport is played with the five inches between one's ears. What does legwork have to do with it?

It's similar to the mistake most make when assuming risk in a casino game. They think past performance has an impact on future results. This is just not the case.
"It's unbelievable how much you don't know about the game you've been playing your whole life." - Mickey Mantle

Andy Troeger

Re:The Failure in State Ratings
« Reply #57 on: April 18, 2007, 01:04:58 PM »
Matt,
I have no problem with you saying that; most of what is said on here is just someone's opinion. Of the courses we've both seen, I tend to agree with your comments more often than not, although not always. It is what it is.

To protect the innocent  ;D (meaning not using any specific course as an example) here's an example of rating competing courses.

Mr. Doe is a rater and member at course A in a medium-sized town with two good golf clubs (A & B). The two clubs, sitting on opposite sides of town, have a fairly intense rivalry and are always trying to "one-up" the other. While not the only factor, the clubs always want to be the higher-rated club in the various publications. Mr. Doe as a rater can affect this in two ways: rating A highly and rating B lowly. Both could skew the overall ratings.

Now...there's no way to measure what a "competing club" is either. I certainly hope this doesn't even happen, but I'm cynical enough to think it probably has somewhere, just as courses are likely propped up by their "WOW" factor with first-time visitors. I'm just trying to point out that the idea of giving local raters added weight in state rankings has a hole or two in it IMO. Maybe the current format balances things out, maybe not, but I'm not convinced there is a better way.

EDIT: In my effort to be detailed Jim made my case in about two sentences :)  All I was pointing out was that not rating one's own course is hardly the only way to influence its position.
« Last Edit: April 18, 2007, 01:07:53 PM by Andy Troeger »

Jim Franklin

Re:The Failure in State Ratings
« Reply #58 on: April 18, 2007, 01:47:11 PM »
Andy -

Glad to be of assistance ;).

Matt -

I would hope this was not the case as well. I enjoy playing and seeing different places and rate the courses as I see them. As for my home course dropping out, I could see that a few years ago, but the changes have greatly improved the course from a rater's perspective IMO. I hope you can make it down some time this summer.
Mr Hurricane

Matt_Ward

Re:The Failure in State Ratings
« Reply #59 on: April 18, 2007, 04:54:01 PM »
Jim:

The drop in Five Farms is indicative of the weaknesses of the ratings system / re: Digest. Two years ago Plainfield CC dropped 50 positions in the Digest poll, and this was AFTER the extensive work by Gil Hanse / George Bahto on the layout. The course has rebounded somewhat in the current assessments but not back to where it was prior. How does a course that was top 50 PRIOR to the changes lose 50 spots when the sources I know have universally raved about what the new changes have provided?

The reality is that when different people view courses at different times and with different definitions of how the "rating numbers" apply, you get a wide variety of outcomes that cannot be explained in any comprehensive manner.

Plainfield and Five Farms are just two examples of such situations of timing and how various people can miss the boat by a large margin.

Five Farms is indeed a special place designed by a most gifted architect. Touch base with me offline and we can work out a time frame to hook up there.

Andy:

Much of the issue with state ratings comes from the more likely scenario I spelled out originally. When you have out-of-state / out-of-area raters, you get people who simply add numbers to a particular course's totals without really understanding how the rest of the field matters.

Too often the "newest" courses get this bump up, and unless some of the more obscure classical courses are spoken about, it is far more likely they will remain in the dark. To the credit of some of the magazines doing such ratings, they are now assigning people to certain courses to provide coverage and to minimize people looking to do what you mentioned.

The key to understanding state courses is to get repetitive plays. I don't think any person can rationally argue that a one-time play gives the same insight as playing the course multiple times in different conditions and time frames. Surely, you are not taking that flawed position -- right?

I also pointed out that any deliberate attempt to undermine another course for the sake of promoting others can be ascertained by a review of the numbers applied. I don't doubt that no fail-safe mechanism can be created that works 100% of the time.

I still stand by my original statement -- hometown people should form the bulk of what shakes out in their own area in regards to state ratings. No doubt there should be inclusion of out-of-area raters to provide some perspective. The question is really about proportion in how the two groups intersect for this purpose.

Andy Troeger

Re:The Failure in State Ratings
« Reply #60 on: April 18, 2007, 08:49:03 PM »
Matt,
I don't disagree that multiple plays are preferable...there's no question or argument there. The lists are what they are, and there are a lot of things that cannot be made "perfect" about them. You could argue (and you may have previously?) that not all panelists' opinions are created equal, based on depth and breadth of experience. Where do you draw the line and how do you weight the opinions? At some point, I think you have to trust your panel to rate courses accurately whether they see them once or multiple times. As much as I enjoy lists and all, again, they are what they are at some level...the opinions of the panelists.
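One way to make the "how do you weight the opinions" question concrete is to weight each panelist's score by how many times they have played the course. Everything below is a hypothetical sketch with invented scores, play counts, and cap; no panel is known to work this way.

```python
# Illustrative only: weight each score by the rater's number of plays, capped at 3.

def weighted_average(scores_and_plays):
    """scores_and_plays: list of (score, plays) tuples."""
    weighted = [(score, min(plays, 3)) for score, plays in scores_and_plays]
    total_weight = sum(w for _, w in weighted)
    return sum(score * w for score, w in weighted) / total_weight

panel = [(7.0, 1), (8.0, 1), (6.5, 5), (7.5, 2)]  # (score, plays), all hypothetical
print(f"unweighted: {sum(s for s, _ in panel) / len(panel):.2f}")  # 7.25
print(f"weighted:   {weighted_average(panel):.2f}")                # 7.07
```

The repeat-play rater's lower score pulls the weighted number down, which is exactly the judgment call being debated here.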

Matt_Ward

Re:The Failure in State Ratings
« Reply #61 on: April 19, 2007, 11:37:54 AM »
Andy:

I never asked or expected perfection.

You ask where I draw the line?

Simple.

I just ignore much of what comes from the magazines because a number of the listings -- and the number seems to be growing with each "new" & "improved" listing -- contain so many clear errors, either omissions and / or inclusions.

When particular magazines went on this "let's include more people on our panels" kick, they believed that having more and more people would ensure better coverage and awareness of the courses being reviewed. That has not happened in my mind.

At the end of the day this is still a subjective matter. There will never be a 100% answer for all to be satisfied. My main contention is that too many of the courses now being selected for such state listings are nothing more than the "flavor of the month" variety layouts -- specifically on the Digest front.

For those who watch and monitor this site, at least you get info about courses that aren't listed, and oftentimes I've gotten info that has truly panned out and led me to a stellar layout.

You close with a thought about "trust(ing) your panel." So be it for those so inclined.