
Don Mahaffey

  • Karma: +0/-0
How we evaluate courses
« on: December 02, 2017, 11:28:07 AM »
 We’re a data-driven bunch.

We are almost unanimous in naming the greats, Cypress, Pine Valley, The Old Course…we have some outliers, but for the most part we generally agree on the greatest in the world.
But why do we think they are great? The experience? The golf? We break ‘em down, develop all these metrics, then apply them to other courses we want to grade.
What other art form uses this process to critique art? And does it really work in golf?
For example, we know we like playing TOC; we like how it makes us FEEL. But how do you measure that? We try, and that means we must define that feeling analytically so we can attempt to recreate it. Then we talk about width and wild greens, firm and fast turf, and if a course has those things, done well, we might grade it high. But does breaking down the parts and applying them elsewhere come anywhere close to recreating the sum we have at TOC?
Kyle Harris’ post about green size got me thinking. In a million years, I could never tell you which greens are largest, or smallest, at Cypress Point. If I were given a similar site to work on, and I wanted to recreate the feeling of playing Cypress, would it do me good to go measure the greens and use that data when I build the new course? I don’t think so. Some “experts” might like it, people who study such things, but does doing something like that really create the experience you get when you play the great golf course you borrowed from?
It’s the sum that matters to me. How the parts are arranged to reach that sum is the key, IMO, and that isn’t so easily borrowed.
 
« Last Edit: December 02, 2017, 11:38:10 AM by Don Mahaffey »

Kyle Harris

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #1 on: December 02, 2017, 11:38:39 AM »

Don,

The idea behind my post was that it took me years, and in some cases hundreds of plays, to realize the variance in the size of the greens. I also realized it was only when the outliers engendered a negative review that something like green size was even mentioned!

Therefore, your point about Cypress Point is exactly the kind of experience I had - it was only in retrospect that I realized my favorites had some very well done variably-sized putting surfaces. So well done that I didn't even think about it.

As I am beginning to tell my more experienced staff members: "If I don't notice it, you probably did it right."
http://kylewharris.com

Constantly blamed by 8-handicaps for their 7 missed 12-footers each round.

Thank you for changing the font of your posts. It makes them easier to scroll past.

Peter Pallotta

Re: How we evaluate courses
« Reply #2 on: December 02, 2017, 12:19:12 PM »
Don:
you emphasized the right word, IMO - feel.
How do we quantify or evaluate any personal experience?
Years ago, a "10" reflected one young man's feelings about a given golf course; and, like most people at that age, he couldn't care less what anyone else felt about it.
Years later, despite that (now older) young man's reminders that all such evaluations are subjective, that "10" has become the gold standard - as if it's the truth, or a fact, or objectively provable and repeatable.
If I ever get a chance to play some of these great courses, I wonder if I'll be able to experience them with fresh eyes, to actually feel them in a personal/subjective way - unaffected by the 'objectification' (in ratings and magazines and here) and the external evaluations of the experience. 
I think I can do some of that now, with the little known courses. I had so much fun and such a great day of golf and was so impressed by a 1930 Stanley Thompson course I recently played — and later was very surprised to learn that ‘objectively’ it’s not on anyone’s list of top quality courses.
Peter   
« Last Edit: December 02, 2017, 02:42:00 PM by Peter Pallotta »

Tom_Doak

  • Karma: +3/-1
Re: How we evaluate courses
« Reply #3 on: December 02, 2017, 06:14:31 PM »
I think the most important thing is to evaluate a course for what it does, rather than what it doesn't do.  All of the rules and checklists propose that there are certain aspects of a course that are critical to its success, but ultimately the best courses stand out because they are different and violate everyone's silly rules.


For me the best object lesson of the last few years was to see the Himalayan Golf Club in Nepal.  Normally I'm a fan of firm and fast conditions, but those are impractical in that setting with such limited resources.  Instead, I found a golf course with narrow landing areas, and semi-shaggy fairways that limited bounces and roll - a different solution that was perfectly melded to local needs.  I don't believe that any professional architect (including myself) would have come up with such a practical solution.

Sean_A

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #4 on: December 02, 2017, 08:33:27 PM »
I think the most important thing is to evaluate a course for what it does, rather than what it doesn't do. 

Si.  The problem with ranking criteria is we are forced to look for features (and then evaluate what we looked for) rather than see what is there.  Even so, what bloody difference does it make if a course is great or good?  If this is the boiled-down reason for playing the game, I am not interested.  As Don states, the enjoyment of a course really comes down to how it makes one feel (and I would add think).  The thing is, though, no two people share identical feelings about a course.  I am quite happy to understand that course B is not as good as course A, but I like course B more anyway.  People get caught up in these quality debates when we are usually nitpicking among the very best.  Why?  Because of how courses make people feel and think.

Ciao
« Last Edit: December 04, 2017, 05:04:36 AM by Sean_A »
New plays planned for 2024: Nothing

Ally Mcintosh

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #5 on: December 03, 2017, 03:45:44 AM »
Completely agree it's down to an overall feel above everything else. Ranking categories are a complete sham.


However, as a designer you have to evaluate WHY it makes you feel that way if you are going to learn anything at all from it. If you are just playing golf, less so.

Thomas Dai

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #6 on: December 03, 2017, 04:39:05 AM »
"Do I want to go back?" would seem a pretty good basis for evaluation.
Atb

Sean_A

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #7 on: December 03, 2017, 05:08:09 AM »
"Do I want to go back?" would seem a pretty good basis for evaluation.
Atb


There are tons of courses I want to play again.  A more important question is which courses I would be willing to pay to play again...or, in other words, which courses have my full attention.  The number drops significantly at that point, but I am sure that, to a certain degree, not being willing to pony up for another game is a comment on how you feel/think about the quality and style of the course.


Ciao
New plays planned for 2024: Nothing

PCCraig

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #8 on: December 03, 2017, 09:04:33 AM »
All of the rules and checklists propose that there are certain aspects of a course that are critical to its success, but ultimately the best courses stand out because they are different and violate everyone's silly rules.



I absolutely love the above statement. Just terrific. I am going to have to noodle it a bit.
H.P.S.

Peter Pallotta

Re: How we evaluate courses
« Reply #9 on: December 03, 2017, 10:12:03 AM »
Pat - yes, that was a terrific line. So too was the Tom-Sean exchange about seeing what's there instead of looking for what's not.

(Brought to mind a Par 4 I know. As a "short par 4" it's meh; but as a "par 4 that just happens to be short" it's actually an interesting and nuanced golf hole.)

That might be a very good thread: What does your favourite/best golf course *not* have?

Dave McCollum

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #10 on: December 03, 2017, 11:00:19 AM »
Several times I’ve tried to comment on my feelings about ratings and given up each time.  One anecdotal feeling I think I’ve noticed is that when I’ve done my own research about where I wanted to play and picked my own courses, my “wow” factor is slightly elevated, indicating either a predisposed bias before playing or perhaps just a bit more knowledge going in.  I really don’t have much experience with others picking courses for me.  I did a couple of weeks in Ireland/N. Ireland: the first week as an arranged tour, the second on my own.  I didn’t like two courses I played the first week as much as the others, or as much as any of the courses I picked myself.  They also happened to be the hardest (Waterville, Euro Club).  So, for me, my suspicion is that not only is my playing experience subjective, it begins developing before I ever get there.  I suppose there is a special joy when built-up expectations don’t disappoint.

Rich Goodale

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #11 on: December 03, 2017, 11:24:42 AM »
Great post Don.


Vis-à-vis Art vs. Golf: you can observe the former, but observe and interact with the latter.  This is why I play 20+ different golf courses/year and only visit the Louvre and Musée d'Orsay every 5+ years or so.  It's all about form and function.  All art has form that one gets or doesn't get (or does or doesn't like), whereas all golf has a very strict function.  9 or 18 holes.  All 100-600 yard holes.  Fairways and greens.  3-6 hours per round.  Walk or pull or ride.  Sharing a beverage or three and chatter in the clubhouse, or driving home to change nappies or watch football (American or ROTW).

I've been fortunate enough to play many of the "great" courses in my life, but very few of these do I care to revisit, per se.  These days it is all (to me) who I am playing with rather than where I am playing, whether it be Cypress Point or Auchterderran.

All golf courses are interesting, some more interesting than others.

Rich
« Last Edit: December 03, 2017, 11:26:43 AM by Rich Goodale »
Life is good.

Any afterlife is unlikely and/or dodgy.

Jean-Paul Parodi

Jack Carney

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #12 on: December 03, 2017, 04:25:54 PM »
Magazine categories are interesting attempts to put criteria into words, which is very hard to do. If we all created a system, they would all be different - still, good attempts. One category that we all allude to but don't define, and that none of the magazine or rating systems define either, is fun! We all like fun courses, and fun remains outside these systems, but we refer back to it in one way or another. That's why we like course B regardless of it being rated lower than course A. It's more fun to play - why? Again, difficult to put into words! Just MO.

Tom_Doak

  • Karma: +3/-1
Re: How we evaluate courses
« Reply #13 on: December 03, 2017, 05:30:30 PM »
All of the rules and checklists propose that there are certain aspects of a course that are critical to its success, but ultimately the best courses stand out because they are different and violate everyone's silly rules.



I absolutely love the above statement. Just terrific. I am going to have to noodle it a bit.


You don't have to look much farther than the differences between The Old Course (or North Berwick) and Pine Valley.  They are almost two opposite poles, with all of the lesser courses falling somewhere in the boring middle.

Steve Lang

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #14 on: December 03, 2017, 05:45:24 PM »
 8)   The boring middle??????????????  Surely you jest.
Inverness (Toledo, OH) cathedral clock inscription: "God measures men by what they are. Not what they in wealth possess.  That vibrant message chimes afar.
The voice of Inverness"

Rich Goodale

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #15 on: December 03, 2017, 06:34:00 PM »
8)   The boring middle??????????????  Surely you jest.


Hopefully Tom was jesting, given that virtually all of his courses would be in the "boring middle," by his definition.






« Last Edit: December 03, 2017, 06:35:42 PM by Rich Goodale »
Life is good.

Any afterlife is unlikely and/or dodgy.

Jean-Paul Parodi

Peter Pallotta

Re: How we evaluate courses
« Reply #16 on: December 03, 2017, 07:16:57 PM »
8)   The boring middle??????????????  Surely you jest.
Hopefully Tom was jesting, given that virtually all of his courses would be in the "boring middle," by his definition.

Just a guess, of course - but I don't think Tom was joking as much as exaggerating to make a point.

From reading books and course profiles and the rankings, I've (tentatively) concluded:

That if you don't understand what makes for greatness, your own work will never stand the test of time. But if you don't know how to tailor that greatness for the time & place in which you live, you won't have much of your own work to begin with.

But "boring" isn't the right word for that very fine line. "Measured" might be a bit closer, it seems to me - but then again not really.

Peter 
« Last Edit: December 03, 2017, 08:06:57 PM by Peter Pallotta »

Steve Lang

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #17 on: December 03, 2017, 09:46:42 PM »
 8)  I was expecting... "stop calling me Shirley!"
Inverness (Toledo, OH) cathedral clock inscription: "God measures men by what they are. Not what they in wealth possess.  That vibrant message chimes afar.
The voice of Inverness"

Tom_Doak

  • Karma: +3/-1
Re: How we evaluate courses
« Reply #18 on: December 04, 2017, 06:14:50 AM »
8)   The boring middle??????????????  Surely you jest.


In math terms, I was saying that those two courses are several standard deviations from the norm, at either end of the extreme -- Pine Valley is islands of fairway with severe hazards all around, while St. Andrews is all fairway, just punctuated by some deep bunkers.


The norm, the courses that follow all of the rules, are boring in my opinion.  You may think they're fair to play, but there is no point in traveling to see them, because there is nothing different about them.  To me, the great courses are the ones that demand you go see them, because there is something different about them.  So I try to stay out of "the boring middle".

Jeff_Brauer

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #19 on: December 04, 2017, 08:53:32 AM »

As I said somewhere, golfers tend to judge a hole (or a course) on difficulty, beauty or uniqueness, depending on golf skill and personality type.


IMHO, I agree it's mostly feel, and I would bet that when a rater goes to his ballot, he makes the numbers match his gut feel, even if the point system is designed to make him think twice and be objective.


Since Cypress was mentioned, I will say that it's among my favorite courses for beauty, but Lanny Wadkins hates it because it's too short and easy.  A magazine did a tour pro survey and they basically said the same thing, ranking it lower.


So for me, my preferences are for beauty, a few unique holes, and then difficulty.  Others may disagree, and c'est la vie, non?
Jeff Brauer, ASGCA Director of Outreach

Ed Brzezowski

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #20 on: December 04, 2017, 09:26:10 AM »
Reading this topic reminded me of the part in Dead Poets Society about evaluating poetry on an X-Y axis. Only when done this way can a poem be properly evaluated.

Tastes change over the years, as does one's playing ability. You start seeing things as you get older that were not as relevant when you could muscle a drive past them. The greats will always stand out, as they should, but can a change in playing ability change your thoughts?

Great topic.
We have a pool and a pond, the pond would be good for you.

Ira Fishman

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #21 on: December 04, 2017, 09:49:00 AM »
Ed,


There is no question that a change (hmm, decline) in my playing ability has affected the way I evaluate courses.  But I think for the better.  When younger, I paid little attention to, and therefore did not appreciate, green contours as they affected both chipping and putting.  Now that the short game is one of the few things I can improve with practice and focus, I devote more time and attention to evaluating and admiring green complexes.


Ira

David Wuthrich

Re: How we evaluate courses
« Reply #22 on: December 04, 2017, 12:08:21 PM »

I'm not a data person.


When people ask, I answer Blondes, Brunettes and Redheads!


They are all different, they are all wonderful but some people prefer one over the other.


I happen to like them all!


There are difficult courses, beautiful courses, architecturally interesting courses, etc.


I try to judge each in its own category.

Ulrich Mayring

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #23 on: December 07, 2017, 05:32:22 PM »
Golf is, by definition (see the rule book), a very data-intensive sport played in definite categories. Success at the game is not determined by how you feel or by the beauty of your swing. Rather, it's determined by hard, cold numbers: how many strokes did you take against your handicap or against your opponent?

Obviously, you can play without counting and revel in the beauty of a course and the challenge of certain shots. You could find more satisfaction in sinking a curling 10 footer for a 10 than a tap-in for birdie. But I suspect that neither the rulebook nor golf courses were made with that type of player in mind. Historically, going back as far as you like, golf was always about numbers.

That doesn't mean that judging golf courses must be done by numbers, just that it seems logical. If someone finds a better way, then I'm all ears. But the only alternative seems to be "don't judge, it doesn't work", which is legitimate, but not very courageous.

Ulrich
Golf Course Exposé (300+ courses reviewed), Golf CV (how I keep track of 'em)

Sean_A

  • Karma: +0/-0
Re: How we evaluate courses
« Reply #24 on: December 07, 2017, 05:44:44 PM »
I am not sure the rules of golf and evaluating courses are analogous...if so...how? 


My belief is that numbers guys fit the numbers to how they feel and think about courses.  Just about any list author/editor will do an eye test after all is said and done, and if something doesn't look right they will make adjustments.  I tried the numbers gig for a while as an experiment and found that too often stuff didn't pass the eye test (which I consider far more important than any set of data).  It got to the point where I figured the system was broken, not the assigned numbers.  I never came close to figuring out how to make the system fit the eye test.  I just tried it again for another magazine and came up with some interesting results which I didn't buy, but that was mainly because of categories I didn't think mattered or could easily be wrapped into the larger picture as small-beer stuff.

Ciao
« Last Edit: December 07, 2017, 05:50:33 PM by Sean_A »
New plays planned for 2024: Nothing