Practical Guide to Determining Rainfall Rate and Rain Intensity Error
The rainfall rate should be determined over the shortest practical time period, per WMO and NWS best practices. Based on rain intensity and rain rate classifications, if one wants the precipitation rate uncertainty (error) to fall below 5% for any tipping-bucket rain gauge, the number of bucket tips per chosen time period must be at least 1/5% = 20 tips. From this, one can calculate the minimum time period required to reach a 5% rain rate error, as shown below.
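As a quick sketch of this arithmetic (the function names below are ours, for illustration only): the tip count fixes the counting error, and the averaging period then follows from the tip rate.

```python
import math

def min_tips(target_error: float) -> int:
    """Tips needed so a +/-1-tip counting error stays within target_error."""
    return math.ceil(1.0 / target_error)           # 1 / 0.05 = 20 tips

def min_period_minutes(rain_rate: float, resolution: float,
                       target_error: float = 0.05) -> float:
    """Shortest averaging period (minutes); rate and resolution share units."""
    tips_per_hour = rain_rate / resolution
    return 60.0 * min_tips(target_error) / tips_per_hour

# A 0.01"/tip gauge in 0.1"/h light rain: 10 tips/h, so 20 tips take 2 hours.
print(round(min_period_minutes(0.1, 0.01), 1))     # 120.0 minutes
```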
What is rain intensity error in practical terms?
With the 1 mm per tip resolution rain gauges common in agriculture, the error from a small 2 mm rain storm can be as much as 50%: the tipping bucket may start off empty, tip once, and be almost full without tipping again when the rain ends. When the sun comes out, the leftover water in the bucket simply evaporates. With this kind of error, you might as well use an old rag to check how much it rained. Higher-resolution rain gauges reduce this problem, down to about 0.1 mm (0.004") resolution, beyond which a rain gauge becomes too sensitive to wind and other vibrations.
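A back-of-the-envelope check of this worst case, assuming up to one full bucket of water can go unrecorded at the end of a storm:

```python
def worst_case_total_error(resolution_mm: float, storm_total_mm: float) -> float:
    """Worst-case relative error when up to one bucket's worth never tips."""
    return resolution_mm / storm_total_mm

print(worst_case_total_error(1.0, 2.0))   # 0.5  -> 50% for a 1 mm/tip gauge
print(worst_case_total_error(0.1, 2.0))   # 0.05 -> 5% for a 0.1 mm/tip gauge
```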
For a 0.01" (0.254 mm) resolution rain gauge, the minimum time interval for a 5% precipitation rate error is determined as follows:
1 divided by 5% error = 20 tips of the tipping bucket mechanism
The minimum required time period for evaluating precipitation intensity with an error of less than 5% is:
Light rain - (<0.1"/h) with fewer than 10 full tipping buckets per hour requires a minimum of 120 minutes to achieve a 5% error in rain intensity.
Moderate rain - (0.1" to 0.3"/h) with 10 to 30 full tipping buckets per hour requires a minimum of 40 to 120 minutes to achieve a 5% error in rain intensity.
Heavy rain - (0.3" to 2"/h) with 30 to 200 full tipping buckets per hour requires a minimum of 6 to 40 minutes to achieve a 5% error in rain intensity.
Violent rain - (>2"/h) with over 200 full tipping buckets per hour requires 6 minutes or less to achieve a 5% error in rain intensity.
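The same 20-tip rule, evaluated at the class boundaries above, reproduces these numbers. A minimal sketch, with the class limits taken from the list above:

```python
RESOLUTION_IN = 0.01   # inches of rain per tip
TIPS_NEEDED = 20       # tips required for a 5% rate error

# Rain-rate class boundaries in inches per hour, from the list above.
boundaries_in_h = {"light/moderate": 0.1, "moderate/heavy": 0.3, "heavy/violent": 2.0}

for name, rate in boundaries_in_h.items():
    tips_per_hour = rate / RESOLUTION_IN
    minutes = 60.0 * TIPS_NEEDED / tips_per_hour
    print(f"{name}: {tips_per_hour:.0f} tips/h -> {minutes:.1f} min for 20 tips")

# light/moderate: 10 tips/h -> 120.0 min for 20 tips
# moderate/heavy: 30 tips/h -> 40.0 min for 20 tips
# heavy/violent: 200 tips/h -> 6.0 min for 20 tips
```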
For a 0.2 mm resolution rain gauge, the minimum time interval for a 5% precipitation rate error is determined as follows:
1 divided by 5% error = 20 tips of the tipping bucket mechanism
The minimum required time period for evaluating precipitation intensity with an error of less than 5% is:
Light rain - (<2.5 mm/h) with fewer than 12.5 bucket tips per hour requires a minimum of 96 minutes to achieve a 5% error in rain intensity.
Moderate rain - (2.5 to 7.5 mm/h) with 12.5 to 37.5 bucket tips per hour requires a minimum of 32 to 96 minutes to achieve a 5% error in rain intensity.
Heavy rain - (7.5 to 50 mm/h) with 37.5 to 250 bucket tips per hour requires a minimum of 4.8 to 32 minutes to achieve a 5% error in rain intensity.
Violent rain - (>50 mm/h) with more than 250 bucket tips per hour requires 4.8 minutes or less to achieve a 5% error in rain intensity.
For a 0.1 mm resolution rain gauge, the minimum time interval for a 5% precipitation rate error is determined as follows:
1 divided by 5% error = 20 tips of the tipping bucket mechanism
The minimum required time period for evaluating precipitation intensity with an error of less than 5% is:
Light rain - (<2.5 mm/h) with fewer than 25 bucket tips per hour requires a minimum of 48 minutes to achieve a 5% error in rain intensity.
Moderate rain - (2.5 to 7.5 mm/h) with 25 to 75 bucket tips per hour requires a minimum of 16 to 48 minutes to achieve a 5% error in rain intensity.
Heavy rain - (7.5 to 50 mm/h) with 75 to 500 bucket tips per hour requires a minimum of 2.4 to 16 minutes to achieve a 5% error in rain intensity.
Violent rain - (>50 mm/h) with more than 500 bucket tips per hour requires 2.4 minutes or less to achieve a 5% error in rain intensity.
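Running the same 20-tip calculation at the metric class boundaries confirms both tables above:

```python
TIPS_NEEDED = 20  # tips required for a 5% rate error

for resolution_mm in (0.2, 0.1):
    for rate_mm_h in (2.5, 7.5, 50.0):          # class boundaries in mm/h
        tips_per_hour = rate_mm_h / resolution_mm
        minutes = 60.0 * TIPS_NEEDED / tips_per_hour
        print(f"{resolution_mm} mm/tip at {rate_mm_h} mm/h: "
              f"{tips_per_hour:.1f} tips/h -> {minutes:.1f} min")

# 0.2 mm/tip at 2.5 mm/h: 12.5 tips/h -> 96.0 min
# 0.2 mm/tip at 7.5 mm/h: 37.5 tips/h -> 32.0 min
# 0.2 mm/tip at 50.0 mm/h: 250.0 tips/h -> 4.8 min
# 0.1 mm/tip at 2.5 mm/h: 25.0 tips/h -> 48.0 min
# 0.1 mm/tip at 7.5 mm/h: 75.0 tips/h -> 16.0 min
# 0.1 mm/tip at 50.0 mm/h: 500.0 tips/h -> 2.4 min
```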
CONCLUSION: Practical Advice for Precipitation Rate Determination
One can see that with a 0.2 mm resolution rain gauge, the time interval for determining rain intensity should, for all practical purposes, be greater than about 1.5 hours (96 minutes in light rain), which is not very practical. We therefore recommend a rain gauge with 0.1 mm or finer resolution for determining accurate precipitation intensities: a 0.1 mm resolution rain gauge achieves below 5% error within a 1-hour (60-minute) time interval across all rain classes. To avoid long averaging intervals altogether, intelligent rain gauges such as the MeteoRain® IoT line report the exact time interval between two successive bucket tips, thus providing real-time rain rate information regardless of the averaging intervals above.
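The principle behind per-tip timing can be sketched as follows; this illustrates the idea only and is not the actual MeteoRain firmware or API:

```python
from datetime import datetime, timedelta

RESOLUTION_MM = 0.1  # mm of rain per bucket tip

def rain_rate_mm_h(previous_tip: datetime, current_tip: datetime) -> float:
    """Instantaneous rain rate from the interval between two successive tips."""
    interval_h = (current_tip - previous_tip).total_seconds() / 3600.0
    return RESOLUTION_MM / interval_h

# Two tips 12 seconds apart imply 0.1 mm / (12/3600 h) = 30 mm/h (heavy rain).
t0 = datetime(2024, 6, 1, 14, 0, 0)
print(round(rain_rate_mm_h(t0, t0 + timedelta(seconds=12)), 1))  # 30.0
```

Because each tip carries its own timestamp, the rate estimate refreshes with every tip instead of waiting out a fixed averaging window.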
One can see that with a 0.2 mm resolution rain gauge, a time interval for determining rain intensity for all practical purposes should be greater than 1.5 hours or 90 minutes, which is not very practical. Thus we recommend using a rain gauge of 0.1 mm or finer resolution for determining accurate rain precipitation rate intensities. A 0.1 mm resolution rain gauge is able to achieve lower than 5 % error within a 1 hour (60 minutes) time interval. To overcome this problem, intelligent rain gauges like the MeteoRain® IoT line report the exact time interval between two successive bucket fills, thus providing real-time rain rate information regardless of the below time intervals.