Submitted to: Acta Horticulturae Proceedings
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 8/2/2010
Publication Date: N/A
Citation: N/A

Interpretive Summary: Soil water sensors are widely marketed as aids for irrigation management. The lowest-cost and most easily used sensors are based on electronic measurements. This makes them sensitive to changes in soil electrical conductivity and temperature, which are common in agricultural soils, and may make them too inaccurate for irrigation scheduling. The USDA-ARS Conservation & Production Research Laboratory compared four direct-burial electronic water content sensors and five sensors designed for use in plastic access tubes inserted into the soil. The comparison was conducted in a clay loam soil known to present challenges to electronic sensors. Temperature and water content were highly variable during the comparison, exposing weaknesses in the sensors. The best, but most expensive, direct-burial system was conventional time domain reflectometry (TDR). Calibrated to the soil, it provided accurate measurements with only minor interference from temperature and soil conductivity changes and very little sensor-to-sensor variability. One relatively inexpensive electronic system, the Acclima, was nearly as good as the TDR system, even without soil-specific calibration. The other direct-burial systems were more temperature sensitive and exhibited considerable sensor-to-sensor variability, making them less useful for water management. The electronic sensors used in access tubes were too inaccurate for irrigation scheduling. Except for the Acclima and TDR, all of the electronic sensors were influenced by soil electrical conductivity and temperature. Recommendations for improving the sensors are forthcoming from this study.
Technical Abstract: Soil water sensors are widely marketed in the farming sector as aids for irrigation scheduling. Sensors report either volumetric water content (theta-v, m**3 m**-3) or soil water potential, with theta-v sensors being by far the most common. To meet yield and quality goals, irrigations are scheduled to keep soil water in the root zone above a management allowed depletion (MAD) level, which is specified as a percentage of the plant available water and which may change over the cropping season and with soil type and water quality. For example, representative values for maize grown in a silty clay loam and irrigated with good quality water may be (during pollination): MAD = 40%; field capacity = 0.33 m**3 m**-3; wilting point = 0.18 m**3 m**-3; so that the MAD threshold = 0.33 - 0.40 x (0.33 - 0.18) = 0.27 m**3 m**-3, and the water content range of the MAD = 0.33 - 0.27 = 0.06 m**3 m**-3. In a sandy soil, the water content range of the MAD would be much smaller, say 0.03 to 0.04 m**3 m**-3. An important question then is: Are available soil water sensors accurate enough in the field to be reliable for irrigation scheduling using MAD? We evaluated the accuracy of several down-hole access tube type electromagnetic (EM) sensors and several EM sensors that can be buried in or inserted into the soil. All sensors required soil-specific calibration, with the possible exception of conventional time domain reflectometry (TDR with waveform reduction). The EM sensors based on capacitance measurements were found to be the least accurate, the most affected by soil bulk electrical conductivity and temperature, and generally ineffective for irrigation scheduling by MAD. The neutron moisture meter and gravimetric sampling were accurate enough to use with MAD. Some insertion-type EM sensors showed promise for MAD-based irrigation scheduling. The capacitance sensors suffer from a fundamental problem in that their EM fields do not uniformly permeate the soil, instead preferentially following paths of greater bulk electrical conductivity.
Because of differing arrangements of conductive pathways at each sampling location, the capacitance sensors exhibited a variability that did not reflect the actual field variability in water content.
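The MAD arithmetic in the maize example above can be sketched in a few lines. This is a minimal illustration using the representative values quoted in the abstract; the function name and structure are ours, not part of the study.

```python
def mad_threshold(field_capacity, wilting_point, mad_fraction):
    """Water content (m**3 m**-3) below which irrigation is triggered.

    plant available water = field capacity - wilting point;
    the MAD level allows depletion of mad_fraction of that water.
    """
    plant_available_water = field_capacity - wilting_point
    return field_capacity - mad_fraction * plant_available_water


# Representative silty clay loam values from the abstract (maize,
# during pollination): FC = 0.33, WP = 0.18, MAD = 40%.
fc, wp, mad = 0.33, 0.18, 0.40

threshold = mad_threshold(fc, wp, mad)  # 0.33 - 0.40 * 0.15 = 0.27
mad_range = fc - threshold              # 0.06 m**3 m**-3

print(f"irrigate below {threshold:.2f} m**3 m**-3 "
      f"(MAD range {mad_range:.2f} m**3 m**-3)")
```

The narrow MAD range (0.06 m**3 m**-3 here, and only 0.03 to 0.04 in a sandy soil) is why sensor accuracy matters: a sensor whose field error approaches the width of this range cannot reliably tell "irrigate" from "wait".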