Changes in how quickly the ball sheds spin? Seems unlikely, but maybe it's a function of the dimples? One of the spin reports in the thread noted that although the balata spin rates were close to the ProV's, the balata curved sideways in a more pronounced manner. Could shallower dimples be having a significant effect? That seems unlikely to me, because I doubt TopFlites and Titleist balatas had significantly different dimples, yet they sure moved differently in the air, which I have always attributed to their spin rates.
Well, aerodynamics, but more importantly the composition of the ball. Balls don't necessarily distribute mass evenly across their layers. If one ball had a higher-density core and the other had a higher-density cover, they'd have different moments of inertia, so at the same rpm they'd store different amounts of rotational kinetic energy, and one would lose more rpm per second of flight than the other even if the dimple patterns were identical. A ball with more of its mass packed into the core has a lower moment of inertia: the same tangential impulse at impact spins it up to a higher rpm, but the same aerodynamic torque in flight bleeds that spin off more quickly than it would from a ball carrying its mass out toward the cover.
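To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The two balls are idealized as a uniform core plus a uniform cover of the same outer size and total mass, and the aerodynamic torque is a made-up constant; none of these figures come from any actual ball spec. The only point is that shifting mass from the core toward the cover raises the moment of inertia, so the same torque bleeds off fewer rpm per second:

```python
import math

R = 0.02134          # ball radius in meters (1.68 in)
M = 0.04593          # ball mass in kg (1.62 oz)

def two_layer_ball(core_radius, core_fraction_of_mass):
    """Moment of inertia of an idealized two-layer ball: a uniform core
    plus a uniform outer shell, with the given split of the total mass."""
    m_core = M * core_fraction_of_mass
    m_shell = M - m_core
    # Solid sphere: I = 2/5 m r^2
    i_core = 0.4 * m_core * core_radius**2
    # Thick spherical shell between core_radius and R:
    # I = 2/5 m (R^5 - r^5) / (R^3 - r^3)
    i_shell = 0.4 * m_shell * (R**5 - core_radius**5) / (R**3 - core_radius**3)
    return i_core + i_shell

# Same total mass, same size -- only the mass distribution differs.
dense_core  = two_layer_ball(core_radius=0.015, core_fraction_of_mass=0.7)
dense_cover = two_layer_ball(core_radius=0.015, core_fraction_of_mass=0.3)

torque = 1e-5  # fixed aerodynamic torque in N*m (made-up number)
for label, inertia in [("dense core", dense_core), ("dense cover", dense_cover)]:
    # Angular deceleration = torque / I; bigger I -> spin bleeds off more slowly.
    decel_rpm_per_s = (torque / inertia) * 60 / (2 * math.pi)
    print(f"{label}: I = {inertia:.2e} kg*m^2, spin loss ~ {decel_rpm_per_s:.0f} rpm/s")
```

With these made-up inputs the dense-cover ball sheds spin about 30 percent more slowly than the dense-core ball, purely because of where the mass sits.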
So was there a difference between balatas and modern balls in this regard? Titleist balatas had a liquid center surrounded by a lot of rubber windings. I don't know the density of the windings compared to the overall average density of the ball, but wound rubber is probably not as dense as solid rubber. The liquid center was almost certainly less dense than the ball as a whole (unless it used a heavy liquid). That would mean the ball was probably lighter in the center and heavier toward the cover, which would take less spin from impact but lose less spin during flight. I have no idea of the densities of the various layers in a modern ball, but it is reasonable to assume they are not identical. Is it denser in the center or less dense in the center? I have no idea. Maybe those digging into the patents saw something about the densities of the different layers?
Based on the trajectory of the modern ball versus the balata, I'd say the balata loses less spin as it flies than the modern ball does. Recall how the balata flew off a driver (those too young to remember can read Patrick's description of it) versus what we all experience with every drive today. The balata started out lower, climbed to a peak, then dropped fairly quickly. That's what you'd expect from a ball that gained height primarily via backspin. The modern ball starts out higher in the early stages of flight and has a much flatter apex (which makes it appear to fall more slowly when viewed from behind). It reaches its apex mainly due to the initial launch angle, with backspin not contributing as heavily.
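For anyone who wants to poke at how those two flight shapes arise, here's a toy 2D simulation in Python. Every number in it is a guess (the drag and lift coefficients, the spin-decay rates, the launch conditions), so ignore the absolute yardages; it only illustrates the mechanism, lift that scales with a decaying spin rate on one side versus height that comes mostly from launch angle on the other:

```python
import math

def fly(ball_speed_mph, launch_deg, spin_rpm, spin_decay_per_s):
    """Toy 2D point-mass flight: quadratic drag plus a Magnus-style lift
    force that scales with the current spin rate, with the spin decaying
    during flight.  Every coefficient is a guess chosen for illustration,
    not a measured value for any real ball."""
    m = 0.0459                                # ball mass, kg
    area = math.pi * 0.02134 ** 2             # cross-section, m^2
    rho, g, dt = 1.225, 9.81, 0.01
    cd = 0.25                                 # made-up drag coefficient
    cl_per_krpm = 0.05                        # made-up lift per 1000 rpm
    v = ball_speed_mph * 0.44704
    vx = v * math.cos(math.radians(launch_deg))
    vy = v * math.sin(math.radians(launch_deg))
    x = y = apex = 0.0
    spin = spin_rpm
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        k = 0.5 * rho * area * speed / m      # shared aerodynamic factor
        cl = cl_per_krpm * spin / 1000.0
        ax = -k * (cd * vx + cl * vy)         # drag plus lift, x component
        ay = -k * (cd * vy - cl * vx) - g     # drag plus lift, y component
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        apex = max(apex, y)
        spin *= 1.0 - spin_decay_per_s * dt   # simple spin bleed-off
    descent = math.degrees(math.atan2(-vy, vx))
    return round(x / 0.9144), round(apex / 0.9144), round(descent)

# Guessed launch conditions: lower/spinnier/slow-decay vs higher/less spin/fast-decay.
for label, args in [("balata-ish", (150, 9, 3500, 0.02)),
                    ("modern-ish", (150, 13, 2500, 0.06))]:
    carry, apex, descent = fly(*args)
    print(f"{label}: carry ~{carry} yds, apex ~{apex} yds, descent ~{descent} deg")
```

Fiddle with the launch angle, spin, and decay rate and watch how the apex and the landing descent angle trade off.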
Balls lose rotational energy and forward velocity during flight due to aerodynamic drag, which depends on the dimple pattern. Obviously the primary goal in dimple design is to lose as little velocity per foot as possible. Spin is only needed at the start of flight, to the extent required to help the ball climb to its apex. Beyond that, spin is less desirable because it steepens the downward trajectory and reduces roll. It seems reasonable that a modern ball would be designed to lose as little of its initial velocity as possible, relying more on a higher launch and less on spin to reach its apex, which means it would be desirable for it to shed spin fairly quickly. If it were possible to measure rpm at 50, 100 and 150 yards after impact, I'll bet the numbers would show the modern ball losing rpm much more quickly.
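Just to make that bet concrete, here's what such readings might look like under two hypothetical per-yard spin-retention rates (the decay rates and launch rpm are invented; only the shape of the comparison matters):

```python
def spin_at(distance_yds, initial_rpm, loss_per_yard):
    """Spin remaining after a given carry, assuming the ball sheds a fixed
    fraction of its current spin for every yard flown (a made-up model)."""
    return initial_rpm * (1.0 - loss_per_yard) ** distance_yds

for label, rpm0, loss in [("balata-ish", 3500, 0.002), ("modern-ish", 2800, 0.006)]:
    readings = [round(spin_at(d, rpm0, loss)) for d in (50, 100, 150)]
    print(f"{label}: launch {rpm0} rpm -> at 50/100/150 yds: {readings}")
```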
So maybe controlling spin via the rules is more subtle. Rather than dealing solely with initial spin rate, you have to deal with how quickly the ball loses spin, i.e. a drive hit with an initial spin rate of x can lose at most y rpm per unit of distance or time. It isn't that you don't want balls spinning at 3000 rpm off the driver; it's that you want the ball carrying more of that spin throughout the drive rather than quickly shedding it.
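A rule phrased that way would be straightforward to write down as a conformance test. Purely as an illustration, with an invented limit and invented launch-monitor readings, it might look like this:

```python
def within_spin_decay_limit(spin_samples, max_loss_per_yard=8.0):
    """Check a hypothetical rule of the kind described above: between
    consecutive measurement points, the ball may not shed more than a fixed
    number of rpm per yard flown.  The limit value is invented."""
    for (d1, s1), (d2, s2) in zip(spin_samples, spin_samples[1:]):
        if (s1 - s2) / (d2 - d1) > max_loss_per_yard:
            return False
    return True

# spin_samples: (carry distance in yards, measured rpm) pairs
print(within_spin_decay_limit([(0, 3000), (50, 2750), (100, 2550)]))   # True
print(within_spin_decay_limit([(0, 3000), (50, 2300), (100, 1900)]))   # False
```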