Not to belabor the point, but I just wanted to make sure this additional point was not lost.
I understand what you want to achieve: run programs back to back, and also allow for varying program run times (due to weather adjustment).
Not only that: even when programs aren't scheduled strictly back-to-back, station waterings can still be silently dropped as a result of an increased watering percentage.
Here's a program preview with the watering percentage set (manually) to 100%. The programs are not running directly back-to-back, i.e. (interval time > program run time), which avoids the issue we discussed at length above.
Now, this image shows the exact same programs at 150% (manually set); notice that some station waterings have been dropped due to the issue described above:
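To make the mechanism concrete, here is a toy model of the behavior (this is my own sketch, not the actual firmware's queueing logic, and the "drop anything that would overlap" policy is an assumption for illustration): two programs whose run times scale with the watering percentage, where a program is skipped if the previous one is still running when its start time arrives.

```python
def schedule(programs, pct):
    """programs: list of (start_minute, base_runtime_minutes).
    Returns the runs that actually execute under a naive
    'drop anything that would overlap' policy (assumed, for illustration)."""
    executed = []
    busy_until = 0
    for start, base in sorted(programs):
        runtime = base * pct / 100  # run time stretches with watering percentage
        if start < busy_until:
            continue  # silently dropped: previous program still running
        executed.append((start, runtime))
        busy_until = start + runtime
    return executed

progs = [(0, 30), (40, 30)]       # 10-minute gap between programs at 100%
print(len(schedule(progs, 100)))  # both programs run
print(len(schedule(progs, 150)))  # second program dropped: 30 min stretches to 45
```

At 100% the first program finishes at minute 30, comfortably before the second starts at minute 40; at 150% it runs until minute 45, so the second program's start time falls inside it and its stations are dropped.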
Now, had I been using the automated Zimmerman method, and had that method produced a 150% watering percentage on my watering day, I would have had no warning ahead of time that my stations weren't going to be watered as intended. This is particularly unfortunate because the controller ends up delivering less water on exactly the days when more water is required. The only way the user would find out is by analyzing the log data, or by their lawn turning brown.
It seems the only way to guarantee this can't happen is to leave sufficiently large gaps between all programs so that there is no overlap even at the maximum percentage (250%?). That isn't practical for end users: it puts the burden on them to always preview their programs at 250% watering percentage to confirm this behavior can never occur.
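In the same toy model, "sufficiently large space" just means sizing the interval between program start times for the maximum percentage (250% is my assumption of the firmware maximum here):

```python
MAX_PCT = 250  # assumed maximum watering percentage

def min_gap(base_runtime_minutes, max_pct=MAX_PCT):
    """Minimum interval between program start times so that no overlap
    can occur even at the maximum watering percentage."""
    return base_runtime_minutes * max_pct / 100

print(min_gap(30))  # a 30-minute program needs starts at least 75 min apart
```

So a 30-minute program would need its successor scheduled at least 75 minutes later, which illustrates why this workaround gets impractical quickly for users with many programs.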
I do fully understand from our discussion above that this is not an easy thing to solve, but I think it is something your users (at least those using the Zimmerman method) should be aware of.