The watering level is the result of the weather over the previous 24 hours and has nothing to do with whether or not you watered. It’s always about the evapotranspiration that occurred in the prior 24 hours, which doesn’t change just because you watered. Watering just makes up for what was lost.
The baseline detect does a lookup of your location in a huge table. The result is the average worst-case day in the worst month of the year for your location. If you’re in the U.S., you can look up the historic ETo by month for your zipcode at http://www.rainmaster.com/historicET.aspx. Once looked up by the OS weather server, it should not change. Basically, it’s what you should calibrate your sprinkler system to deliver. More precisely, it’s what your watering program should deliver at 100%, evenly across all zones. You can use just about any other number, as long as you calibrate your sprinkler system and your default watering program to deliver that much. Using the detected baseline value means you are unlikely ever to need to run at 200%.
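To make the relationship concrete, here’s a minimal sketch (not the actual firmware code) of how a watering level falls out of the prior-24-hour ETo and the baseline; the function name, the inch units, and the 200% cap are my assumptions based on the discussion above:

```python
def watering_level(eto_last_24h_in, baseline_eto_in, cap=200):
    """Watering level in percent: prior-24h ETo as a fraction of the
    baseline ETo, capped (assumed cap of 200%, per the discussion)."""
    if baseline_eto_in <= 0:
        raise ValueError("baseline ETo must be positive")
    level = 100.0 * eto_last_24h_in / baseline_eto_in
    return min(level, cap)

# Example: baseline 0.2 in/day; yesterday's ETo was 0.374 in,
# so the level is 0.374 / 0.2 = 187%.
print(round(watering_level(0.374, 0.2)))
```

This is why the level is independent of whether you watered: only the two ETo numbers go in, and calibrating your system so that 100% delivers the baseline makes the percentage directly usable as a run-time multiplier.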
187% is a pretty high value. Has it been unusually hot and dry? I’m at 6,300 feet outside Denver, an area known as “high desert.” The baseline ETo here is 0.2 inch. Those are the kinds of losses we see in July and August, when the temperature may exceed 100°F, there’s no rain, the humidity is very low, there’s a fast, dry wind, and the solar radiation is off the charts. Unless you’re in a similar area, where it’s blazing hot, the humidity is close to zero, there’s been no rain, and the solar radiation is intense, 187% is suspicious.
And, yes, I need (and a lot of other people need) a method that accumulates the watering level and applies some logic about when to water. Unfortunately, having studied the problem, I believe there are too many variations in the logic requirements for it to be practical to implement on a system that runs on a chip in a small device; the user interface would be extremely complex.