In the broader world (including art, movies, music, restaurants and drama), a critique of a piece of work will generally do four or five things:
1 - explain the style of the piece of work and what audience it will likely appeal to.
2 - make broad judgments about the quality of various aspects of the work.
3 - put the work into a broader context and further discussion of the art.
4 - be entertaining to read
5 - assign a numerical rating.
So to go through each point and assess the effect of bias:
1 - a golfer's skill level should not affect his critical ability to describe the style of golf course and who it would appeal to.
2 - a golfer's ability (as well as his level of creativity, intelligence, upbringing etc) will affect the way he assesses the quality of the work, but this is really no different to a film critic (professional or friend) preferring action movies or arthouse movies. Individuals have varying tastes in everything. This in itself is not a problem if you can put the critique into context.
3 - Going by the comments of Sean Arble and others in the layman routing thread, there seems to be a school of thought that furthering an understanding of the profession is irrelevant to critiques and reviews and the focus should be on the experience. Personally I would love it if more reviews discussed the context of a golf course in the way that the New Yorker reviews art, plays, music and film.
4 - There are very few professional writers critiquing golf courses, whether in publications, on the web or on this discussion board, leaving a vacuum in which the views of the uninformed carry more weight. A friend declaring that Rees Jones is the greatest architect of all time carries more weight than a friend declaring that Steven Seagal is the greatest actor of all time, because there just isn't a great body of educated opinion to counter the claim.
5 - Whilst in the broader artistic world a lot of reviews don't even carry a numerical rating, in the world of golf courses the aggregated numerical rating carries such disproportionate weight that critics are expected to be 'accurate', when critics will always be biased.
Even a website like Rotten Tomatoes, which aggregates ratings into a numerical summary, links back to the individual written reviews. In this way you can find reviewers that you like or dislike and steer your further attention towards or away from them. If the magazine top 100 lists are a guide to help you choose where to play, imagine how much more useful it would be if you could click on a course and see the score and a paragraph written by every rater who visited it, in the way that Rotten Tomatoes or Metacritic does. You could then find out pretty quickly which raters you respected and focus on the courses that they particularly enjoy.
Without the ability to do this, the rankings create a false impression that ratings are definitive and absolute, free of any biases from the raters. Which then leads others to claim that their rankings are more definitive. And on and on. Which is the real problem IMO.
Apologies if I rambled. In summary: of course everyone is biased; the problem is that this is not accounted for in discussions, critiques, and rankings.