Sorry, but the idea that magazines should pay qualified raters to rate courses is an intellectual argument, not a practical one.
Of the roughly 18,000 courses in the US, let's say 5% qualify to be considered for any "top" list. That's 900 courses. Let's say further that we need some minimum number of ratings before we can trust the average rating generated for a course. Pick a number; let's arbitrarily say 10 ratings as a minimum. I now need at least 9,000 ratings to fully canvass my 900 courses. Let's continue by saying that each rater, whether a volunteer or a professionally paid magazine rater, is assigned 90 of those 9,000 ratings. If a magazine chose the professional route, it would need to hire, and pay the expenses of, 100 raters. Assume each rating costs the magazine $50 in reimbursable expenses. That means the Top 100 issue would cost that magazine in the neighborhood of an added half million dollars, something no magazine can remotely afford.
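For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch. The 5% share, 10-rating minimum, 90 ratings per rater, and $50 per rating are just the assumed figures from the paragraph above, not real magazine numbers.

```python
# Back-of-the-envelope cost of paid course ratings,
# using the assumed figures from the paragraph above.

TOTAL_US_COURSES = 18_000
TOP_CANDIDATE_SHARE = 0.05      # assume 5% are worth considering for a "top" list
MIN_RATINGS_PER_COURSE = 10     # arbitrary minimum before an average is trusted
RATINGS_PER_RATER = 90          # workload assigned to each rater
COST_PER_RATING = 50            # assumed reimbursable expenses per rating, in USD

candidate_courses = int(TOTAL_US_COURSES * TOP_CANDIDATE_SHARE)    # 900
total_ratings_needed = candidate_courses * MIN_RATINGS_PER_COURSE  # 9,000
raters_needed = total_ratings_needed // RATINGS_PER_RATER          # 100
total_cost = total_ratings_needed * COST_PER_RATING                # $450,000

print(f"Candidate courses:  {candidate_courses}")
print(f"Ratings needed:     {total_ratings_needed}")
print(f"Paid raters needed: {raters_needed}")
print(f"Added cost:         ${total_cost:,}")
```

Running it gives 900 courses, 9,000 ratings, 100 paid raters, and $450,000 in expenses, which is where the "half million dollars" figure comes from.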
GW puts out a list each year. It may surprise some, but there are only 65 or so employees in the whole company. I don't know the exact numbers (I don't work for Golfweek), but GW probably has annual revenues in the $5M range. I'll bet any employee suggesting that GW hire 100+ paid raters for that one issue would quickly find himself in the bread line.
Right, wrong or indifferent, volunteer raters are the only practical way to go about golf course ratings.