Tom,
My experience is that complexity is less a problem than it was several years ago, when computer controls were first introduced. At that time I had more than one superintendent blow pipe out of the ground and/or not get enough water out rather than admit that they "didn't know a damn thing about computers." Even today, I have one course where the original super left all the run times at grow-in levels rather than reset them to something more manageable, and customers have complained that the course was too wet.
Mark,
The system is never more taxed than at grow-in, using up to twice the normal water, but of course not being limited to an 8-hour water cycle either. Certainly, using above-ground pipe and temporary sprinklers reduces permanent irrigation cost. I have seen some grow-in supers put five or six heads on wheels for easy transport, which worked well. However, the savings of temporary irrigation plus extra labor versus permanent irrigation is not that great. Thus, logic dictates just putting in the permanent system. The trick is, of course, not to use it all the time after installation!
I think the biggest problem with computer control is that supers can read every morning that they lost .25" of water to evapotranspiration, and can set the system to replace that amount. That's called the "checkbook method" of irrigation management. But in rainy climates, he/she doesn't have to replace the water fully, just as I don't replace my checkbook funds, and my balance sometimes gets close to zero. I don't blame supers for not cutting it close, as they are protecting an investment, with serious consequences of dropping below "0," like turf loss.
However, if the plant has available water of 3" (a typical number in many soils), it will survive down to 33% of maximum capacity, or 1", before going dormant or dying, depending on species. No super would cut it that close, but the roughs at least, and probably fairways and tees, could be managed to drop to half of field capacity, or 1.5" in this example.
Thus, if the super figured on rain within a week and was losing .25" daily, he really could skip watering for 7 days and the grass would survive, as available water would be 1.25" after that period, assuming the turf was at full water availability before the cycle started. That would be cutting it close, but he could also replace, say, .25" every other night, meaning that after the week his turf would have available water of 2.25", and 1.5" after two weeks, assuming no rain in the meantime, still a survivable mode for the turf, but somewhat dry.
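For what it's worth, that checkbook arithmetic is easy to sketch in a few lines of Python. This is just an illustration of the numbers above; the function name and the schedule format are my own invention, not anything out of a real irrigation controller:

```python
def simulate_balance(start_inches, daily_et, days, irrigation_schedule):
    """Track plant-available water like a checkbook balance.

    irrigation_schedule: dict mapping night number (1-based) to inches
    applied that night.
    """
    balance = start_inches
    history = []
    for day in range(1, days + 1):
        balance -= daily_et                            # daily ET withdrawal
        balance += irrigation_schedule.get(day, 0.0)   # nightly deposit, if any
        history.append(round(balance, 2))
    return history

# Numbers from the example: 3" available water, .25"/day ET.
# No irrigation at all for a week:
print(simulate_balance(3.0, 0.25, 7, {}))       # ends the week at 1.25"
# Watering .25" every other night (nights 1, 3, 5, 7):
every_other = {night: 0.25 for night in (1, 3, 5, 7)}
print(simulate_balance(3.0, 0.25, 7, every_other))  # ends the week at 2.25"
```

Running it reproduces the two cases above: 1.25" left after a dry week, 2.25" left with every-other-night replacement.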
The beauty of the wet/dry cycle is that water availability is partially a function of root depth, as well as soil type. Roots go deeper when there is no rain or irrigation, seeking moisture. Thus, watering less frequently improves the turf's ability to withstand drought.
I have read many theories, and some are now touting every-night irrigation as better. Larry Rodgers or someone who has studied it in more depth could probably explain why, but right now, I ain't buyin'! Of course, in the desert you can't count on rain helping you out, and in certain soils, like the clays we have here in Texas, you can only put out about .1-.2 inches per night, and the rest runs off, so my theories definitely have to be localized by the supers. And, despite our talk of the most modern irrigation systems, most supers will tell you that their system can barely put out what it needs in a night, so they are forced to underwater, in technical terms, for turf health. Or, older systems don't have even coverage, so some areas have to be overwatered to get proper water in other areas, i.e., they don't have enough control to get the right amount of water to the roughs.
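That intake limit on tight clays turns the checkbook math into a scheduling problem: a deficit has to be spread over several nights. A trivial sketch, with illustrative numbers of my own choosing:

```python
import math

def nights_needed(total_inches, max_per_night):
    """How many nights it takes to apply total_inches without
    exceeding the soil's nightly intake limit (the rest would run off)."""
    return math.ceil(total_inches / max_per_night)

# Replacing a 1" deficit on a clay that only takes .2" per night:
print(nights_needed(1.0, 0.2))   # 5 nights
```

So a single inch of replacement on that clay ties up five nights of the watering window, which is part of why these systems are so easily maxed out.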
I agree that golfers and/or budgets sometimes force the superintendent to abandon the mindset of managing water for health reasons only, in favor of managing for color.