I’ve been thinking a lot more about this and so I wanted to share where I’m at with the problem and the algorithm to implement what I think we’re all talking about.
As R_W suggests, one thing that has become obvious to me is that if you are going to compare rainfall to sprinkler-based watering, you have to know how much water your sprinkler system puts out as well as how much water your plants need. The simple fact is that if you know it rained .27in yesterday, the only way to determine its impact on your 7-day watering schedule is to know A) how much water you put out when you turn on the zone valve, and B) how much water your plants need. Note: you also really ought to know the maximum amount of water you can put down before it starts to run off, but that’s potentially a second-order problem.
I’m going to refer to the amount of water the sprinklers put down, in inches per hour, as the Precipitation Rate, or Pr. To determine your Pr, you have to measure your actual water output using the method R_W links to in that video. You can use a kit (like this http://www.agrilifebookstore.org/product-p/sp-424.htm) or set out empty cat food tins or frankly anything else that will work. We need this data for every zone. I can’t overstate how important this is.
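To make the catch-can step concrete, here is a minimal sketch of how the measurement turns into a Pr value. The function name and numbers are just illustrative, not part of OSPi:

```python
# Sketch: estimate a zone's precipitation rate (Pr, in inches/hour) from
# catch-can measurements taken during a timed test run.

def precipitation_rate(catch_depths_in, run_minutes):
    """Average the depth collected in each catch can (inches) and
    scale it up to inches per hour."""
    avg_depth = sum(catch_depths_in) / len(catch_depths_in)
    return avg_depth * 60.0 / run_minutes

# Example: five cans collected these depths after a 20-minute test run.
print(precipitation_rate([0.15, 0.18, 0.16, 0.17, 0.14], 20))  # -> 0.48 in/hr
```

In practice you’d repeat this per zone and store each zone’s Pr, since head layout and nozzle types differ zone to zone.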
You also need to know how much water your plants (grass, flowers, vegetables, or whatever) need. From what I can find, most available information expresses the amount needed in inches of water per week (every 7 days). I’ve seen estimates for some grasses in my area of 2in of water needed per week, while some need as little as .5in per week. This value is driven by evapotranspiration, abbreviated ET. Note: a commonly used term in the literature is ETo, the reference evapotranspiration, which is the amount lost by a particular reference crop.
With this in mind, I believe the right thing to implement for OSPi is an advanced watering model that, on a per-zone basis, determines the amount to water on a given day by factoring in the rainfall and irrigation over the previous 6 days along with the amount of water that zone needs. The model I have in mind looks like this:
– ET – (rainfall + (Pr × hours watered)) = water needed
– for a zone that contains grass that requires 1.5in/week
– and contains a sprinkler system that can put out .5in/hour
– that has been used for 90 minutes in the past 6 days
– and has had rainfall of .25in in the previous 6 days
1.5in – (.25in + (.5 * 1.5)) = .5in water needed
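The model above can be sketched in a few lines of Python. This is just my reading of the formula, with illustrative names, clamped at zero so a wet week doesn’t produce a negative requirement:

```python
# Sketch of the per-zone model: ET minus (rain + irrigation applied)
# over the trailing 6-day window.

def water_needed(et_in_per_week, rain_in_6d, pr_in_per_hr, irrigated_hours_6d):
    """Inches of water the zone still needs today."""
    applied = rain_in_6d + pr_in_per_hr * irrigated_hours_6d
    return max(0.0, et_in_per_week - applied)

# The worked example: 1.5 - (0.25 + 0.5 * 1.5) = 0.5 in
print(water_needed(1.5, 0.25, 0.5, 1.5))  # -> 0.5
```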
Once this is determined for a given zone for that day, the controller can then decide how much to water this particular day. In the case of our example, .5in of water is needed, so 1 hour of watering is required. If we know in advance that runoff begins at .5in applied, then we’re fine: we can water for the full hour and everything is great. If we know that runoff occurs at .4in, then the system should activate the zone for 48 minutes (.4in ÷ .5in/hr) and carry the remaining .1in over to tomorrow.
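The runtime-and-runoff-cap step might look like this. Again a sketch with made-up names, assuming the runoff threshold is optional per zone:

```python
# Sketch: convert today's water requirement into zone run time, capped so a
# single application never exceeds the runoff threshold (when one is known).

def run_minutes(needed_in, pr_in_per_hr, runoff_cap_in=None):
    """Minutes to run the zone today; any water over the runoff cap
    is deferred to a later day."""
    apply_in = needed_in if runoff_cap_in is None else min(needed_in, runoff_cap_in)
    return apply_in / pr_in_per_hr * 60.0

print(run_minutes(0.5, 0.5))       # no cap known -> 60.0 minutes
print(run_minutes(0.5, 0.5, 0.4))  # runoff at .4in -> 48.0 minutes
```

Because yesterday’s irrigation feeds back into the 6-day window, the deferred .1in shows up automatically in tomorrow’s calculation.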
Does this seem right? Am I missing something?
If you had a UI that allowed you to put in per-zone data for the amount of water needed, ET (in/week), and the irrigation throughput, Pr (in/hour), would that be acceptable?