|DATTA, SUMON - Oklahoma State University|
|MEHATA, MUKESH - Oklahoma State University|
|TAGHVAEIAN, SALEH - Oklahoma State University|
|STARKS, PATRICK|
Submitted to: Journal of Irrigation and Drainage Systems
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 1/30/2021
Publication Date: N/A
Interpretive Summary: Accurate measurement or estimation of the different water fluxes in irrigated agriculture is the first step toward developing and implementing scheduling approaches that improve irrigation water use efficiency and reduce water losses. A three-year study was conducted at the Fort Cobb Reservoir Experimental Watershed in west-central Oklahoma to quantify irrigation fluxes and to compare the actual fluxes with those calculated assuming well-watered (no stress) conditions. Crop evapotranspiration (ET), runoff (RO), and deep percolation (DP) were estimated using the root zone Soil Water Balance (SWB) and HYDRUS models. Measured applied irrigation data revealed that nearly all studied fields were under-irrigated, with an average applied amount that was only 30% of what would have been required to maintain well-watered conditions. Overall, soil water content model outputs computed from the actual irrigation fluxes were comparable to in-situ measurements from sensors installed at four depths at each study site. However, model results indicated that ET, RO, and DP fluxes computed using actual irrigation fluxes were 82%, 50%, and 33%, respectively, of what would have been experienced under the hypothetical well-watered conditions commonly used in hydrologic modeling studies. Based on these findings, it is important for studies such as the Conservation Effects Assessment Project (CEAP) and the Long-Term Agroecosystem Research (LTAR) network to use actual irrigation fluxes in crop growth, hydrologic, and irrigation energy use and emission models to ensure accurate outputs, which are used for decision and policy making.
Technical Abstract: Evaluating the adequacy and efficiency of irrigation practices and identifying potential irrigation management improvements in agricultural watersheds require accurate estimates of irrigation fluxes under actual management conditions. Such estimates are also beneficial for other applications, such as simulating physiological and hydrologic processes at field and basin scales. A three-year study was conducted at an agricultural watershed in west-central Oklahoma to quantify irrigation fluxes and to compare the actual fluxes with those calculated assuming well-watered conditions. Measured applied irrigation data revealed that almost all studied fields were under-irrigated, with an average applied amount that was only 30% of what would have been required to maintain well-watered (no stress) conditions. Other water fluxes, namely crop evapotranspiration (ET), runoff (RO), and deep percolation (DP), were estimated using two models with different levels of complexity: the root zone soil water balance (SWB) model and the HYDRUS model. The outputs of the two models were very close and showed that the common under-irrigation practices lead to a reduction in all fluxes compared to the hypothetical well-watered conditions. According to the HYDRUS model results, the average actual ET, RO, and DP fluxes were 82%, 50%, and 33%, respectively, of what would have been experienced under well-watered conditions. The soil water content simulations of HYDRUS under the actual scenario were similar to readings of in-situ sensors installed at four depths at each study site, with an overall root mean square difference of 0.06 cm3 cm-3. The findings of this study demonstrate that differences between actual fluxes and those calculated under no-water-stress assumptions can be notable, leading to major errors if well-watered fluxes are used in crop growth models, hydrologic simulations, irrigation energy use and emission models, and other applications.
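The root zone soil water balance referenced in both abstracts is, conceptually, a daily bookkeeping of inflows (precipitation, irrigation) and outflows (ET, RO, DP). The sketch below illustrates that idea only; the function name, the rule that excess water above field capacity leaves as deep percolation, and the omission of a separate runoff partition are simplifying assumptions for illustration, not the models actually used in the study.

```python
def daily_swb(swc, precip, irrigation, et_demand, field_capacity, wilting_point):
    """One day of a simplified root-zone soil water balance (all values in mm).

    Illustrative only: water above field capacity is assumed to leave as deep
    percolation (DP); a real model such as the study's SWB or HYDRUS would
    also partition runoff (RO) and resolve fluxes within the profile.
    """
    # Add the day's inputs to the root-zone store.
    swc += precip + irrigation

    # Excess above field capacity drains as deep percolation in this sketch.
    dp = max(0.0, swc - field_capacity)
    swc -= dp

    # Actual ET cannot draw the root zone below wilting point; under-irrigation
    # therefore reduces actual ET relative to well-watered demand.
    et_actual = min(et_demand, swc - wilting_point)
    swc -= et_actual

    return swc, et_actual, dp


# Example: a deficit-irrigated day where ET demand is fully met.
swc, et_actual, dp = daily_swb(
    swc=100.0, precip=20.0, irrigation=0.0,
    et_demand=5.0, field_capacity=110.0, wilting_point=30.0,
)
```

Running such a balance day by day with actual versus well-watered irrigation inputs is what allows the two flux scenarios in the study to be compared.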