Submitted to: Decennial National Irrigation Symposium
Publication Type: Proceedings
Publication Acceptance Date: 10/1/2010
Publication Date: 12/5/2010
Citation: Evett, S.R., Schwartz, R.C., Lascano, R.J., Pelletier, M.G. 2010. In-Soil and Down-Hole Soil Water Sensors: Characteristics for Irrigation Management. In: Proceedings of the 5th Decennial National Irrigation Symposium, December 5-8, 2010, Phoenix, Arizona. Paper No. IRR10-8346. 2010 CDROM.
Interpretive Summary: Accurate soil water content data are important for effective irrigation management. Efficiency and low cost require that these data be obtained automatically by unattended sensors. The USDA-ARS Conservation & Production Research Laboratory compared four electronic water content sensors designed for measurements at specific points in the soil and four sensors designed to work from within tubes installed vertically in the soil. The comparison was conducted in a clay loam soil known to present challenges to electronic sensors. Temperature and water content were highly variable during the comparison, allowing weaknesses in the sensors to be exposed. The best, but most expensive, system was conventional time domain reflectometry, known as TDR. It provided accurate measurements with only minor interference from temperature and soil conductivity changes and very little sensor-to-sensor variability. The TDR system was calibrated to the soil. The Acclima system, which is relatively inexpensive, was nearly as good as the TDR system, even without soil-specific calibration. The CS616 and Hydra Probe systems were more temperature sensitive and exhibited considerable sensor-to-sensor variability, making them less useful for water management. Of the sensors that worked from within tubes in the soil, only the neutron probe was deemed useful for accurate irrigation scheduling. An inexpensive yet effective sensor for irrigation scheduling was identified in this study.
Technical Abstract: The past use of soil water sensors for irrigation management was variously hampered by high cost, onerous regulations in the case of the neutron probe (NP), difficulty of installation or maintenance, and poor accuracy. Although many sensors are now available, questions of their utility still abound. This study examined down-hole (access tube type) and insertion or burial type sensors for their ability to deliver volumetric water content data accurately enough for effective irrigation scheduling by the management allowed depletion (MAD) method. Down-hole sensors were compared with data from gravimetric sampling and field-calibrated neutron probe measurements. Insertion and burial type sensors were compared with a time domain reflectometry (TDR) system that was calibrated specifically for the soil; temperature and bulk electrical conductivity measurements were also made to help elucidate sensor problems. The capacitance type down-hole sensors were inaccurate using factory calibrations, and soil-specific calibrations were not useful in a Central Valley California soil or a Great Plains soil. In both soils, these sensors exhibited spatial variability that did not exist at the scale of gravimetric and NP measurements or of irrigation management, resulting in errors too large for the MAD approach. All but one of the point sensors that could be buried or inserted into the soil gave water contents larger than saturation using factory calibrations. The exception was also the least temperature sensitive; the others exhibited daily temperature-induced water content variations of >= 0.05 m^3 m^-3. Errors were related to the bulk electrical conductivity of this non-saline but clayey soil.
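The MAD criterion referenced in the abstract triggers irrigation once depletion from field capacity exceeds an allowed fraction of plant-available water. A minimal sketch of that decision rule follows; the function name and all soil parameter values (field capacity, wilting point, MAD fraction) are illustrative assumptions, not values from the study.

```python
# Sketch of the management allowed depletion (MAD) irrigation trigger.
# All parameter values below are hypothetical, chosen only to illustrate
# the arithmetic; they are not taken from the paper.

def mad_irrigation_needed(theta, theta_fc, theta_pwp, mad=0.45):
    """Return True when root-zone depletion exceeds the MAD fraction.

    theta     -- measured volumetric water content (m^3 m^-3)
    theta_fc  -- field capacity (m^3 m^-3)
    theta_pwp -- permanent wilting point (m^3 m^-3)
    mad       -- allowed fraction of plant-available water to deplete
    """
    paw = theta_fc - theta_pwp    # plant-available water capacity
    depletion = theta_fc - theta  # current depletion from field capacity
    return depletion >= mad * paw

# Hypothetical clay loam: theta_fc = 0.33, theta_pwp = 0.18, so the
# allowed depletion is 0.45 * 0.15 = 0.0675 m^3 m^-3.
print(mad_irrigation_needed(0.24, theta_fc=0.33, theta_pwp=0.18))  # True (depletion 0.09)
print(mad_irrigation_needed(0.30, theta_fc=0.33, theta_pwp=0.18))  # False (depletion 0.03)
```

Under these assumed numbers the entire decision band is about 0.07 m^3 m^-3 wide, which illustrates why the abstract treats temperature-induced sensor errors of >= 0.05 m^3 m^-3 as too large for the MAD approach: the error alone nearly spans the trigger threshold.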