Oh, I almost forgot ... back to the original topic of this thread ...
One of the perverse problems with course rankings is that the voters' information is often out of date: you're counting the votes of people who haven't been to some of these courses in several years. So when a course starts to deteriorate, as Yale has, its ranking tends to slide slowly over several years as more of the panelists see the decline firsthand. Meanwhile, panelists who haven't seen the deterioration start hearing about it and revise their votes downward as they make room for new courses in their own top 50. As a result, even if the club rights the ship, the course may continue to slide for a couple of years until panelists start seeing the positive changes for themselves.
This isn't an ideal situation, especially when a club takes the rankings so seriously that a slip leads it to second-guess positive work that has already been done to the course.
However, for me, letting nature take its course here is far better than trying to correct the problem editorially by manipulating the results. Brad should be commended in this instance for letting the chips fall where they may.