Submitted to: Transactions of the ASABE
Publication Type: Peer reviewed journal
Publication Acceptance Date: 4/1/2008
Publication Date: 6/1/2008
Citation: Thorp, K.R., Jaynes, D.B., Malone, R.W. 2008. Simulating long-term performance of drainage water management across the midwestern United States. Transactions of the ASABE. 51(3):961-976.

Interpretive Summary: Excessive levels of nitrate in surface water bodies have had severe ecological and economic impacts throughout the drainage basin of the Mississippi River and have caused problems as far south as the Gulf of Mexico, where hypoxia threatens commercial and recreational fisheries. Many scientific studies have linked excessive surface water nitrate levels to the movement of nitrate through agricultural drainage systems, which have been used for 150 years throughout the Midwestern United States to improve the region's agricultural productivity. Drainage water management is a new technology that promises to reduce losses of nitrate from agricultural fields by regulating the flow of water through the drainage systems. Our study used an agricultural systems model to simulate the performance of drainage water management over 25 years of meteorological measurements at 48 locations across the Midwest. We used the simulation results to assess the efficacy of drainage water management for reducing nitrate loss across the region, and we also assessed its potential negative impacts on other aspects of an agricultural system, such as increased runoff and increased loss of nitrate to groundwater. Results of this study will be useful for researchers and practitioners who want to implement drainage water management practices in the Midwest. In particular, the findings will benefit the Agricultural Drainage Management Systems (ADMS) Task Force, a cooperative effort among several universities, government agencies, and agricultural drainage companies to research, implement, and promote improved agricultural drainage systems for decreasing nitrate losses from agricultural fields.
Our study evaluated drainage water management in eight states across the Midwest: Illinois, Indiana, Iowa, Michigan, Minnesota, Missouri, Ohio, and Wisconsin. This region is also the current focus area of the ADMS Task Force.
Technical Abstract: Under conventional drainage (CVD), excess soil water in agricultural fields is allowed to drain freely through artificial subsurface drainage lines. In contrast, drainage water management (DWM) utilizes a control structure at the end of the lines to regulate drain flow by varying the depth of the drainage outlet throughout the growing season. Developers of DWM have proposed the technology as a solution to reduce losses of nitrate (NO₃) from subsurface drainage systems in the Midwestern United States; however, tests of DWM efficacy have been performed only over short time periods and at a limited number of sites. To fill this gap, the RZWQM-DSSAT hybrid model, previously evaluated for a subsurface-drained agricultural system in Iowa, was used to simulate both CVD and DWM over 25 years of historical weather data at 48 locations across the Midwestern United States. Model simulations were used to demonstrate how variability in both climate and management practices across the region affects the ability of DWM to reduce losses of NO₃ in subsurface drainage. Across the 48 locations, the simulated annual average reduction in subsurface drain flow ranged from 22 to 364 mm yr⁻¹ when using DWM instead of CVD; percent reductions in total subsurface drain flow over the long term ranged from 35% to 68%. Similarly for nitrogen (N), the simulated average annual reduction in NO₃ losses through subsurface drains ranged from 8 to 46 kg N ha⁻¹ yr⁻¹ when using DWM instead of CVD, and the percent reduction of NO₃ losses through subsurface drains over the long term ranged from 33% to 56% across the region. In contrast to the 151 mm yr⁻¹ regional average annual decrease in subsurface drainage in response to DWM, the model simulated increases of 85 and 51 mm yr⁻¹ for surface runoff and evapotranspiration, respectively.
For the N cycle, the regional average annual decrease in NO₃ losses to subsurface drains in response to DWM was 19.4 kg N ha⁻¹ yr⁻¹, which was offset by increases of 8.5, 5.9, 3.2, and 2.9 kg N ha⁻¹ yr⁻¹ in soil N storage, plant N uptake, N losses in seepage, and denitrification, respectively. The amount of N in surface runoff across the region increased by an average of only 15.3 kg N ha⁻¹ over the entire 25-year simulation, and no individual site's average annual increase in runoff N exceeded 2.6 kg N ha⁻¹ yr⁻¹. When considering the pathways of subsurface drainage, surface runoff, and deep seepage collectively, DWM still reduced undesirable waterborne NO₃ losses by 27% regionally. Spatial regression analysis demonstrated that precipitation amount and relative humidity were the most important model inputs affecting simulated reductions in subsurface drain flow across the region, and daily minimum temperature and precipitation amount were the most important variables affecting simulated reductions in NO₃ loss through subsurface drains. The simulation results suggest that, if DWM can be practically implemented on a large scale, particularly in the southern states of the region, substantial reductions in the amount of NO₃ entering surface waters from agricultural systems can be expected, and these reductions will be counterbalanced mainly by increases in stored soil N and plant N uptake.
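The mass-balance bookkeeping summarized above can be tallied directly from the reported regional averages. The short Python sketch below is illustrative only: every number comes from the abstract, while the variable names and the treatment of residuals are our own. It sums the offsetting pathways and shows that soil N storage and plant N uptake together account for roughly 70% of the offset, consistent with the conclusion that these two pathways are the main counterbalance; small residuals reflect rounding of the reported terms and pathways not itemized here.

```python
# Illustrative bookkeeping of the regional average annual water and N
# budgets reported in the abstract. All numbers are taken from the text;
# variable names and the residual interpretation are ours.

# Water budget (mm/yr): decrease in subsurface drain flow under DWM
# versus the reported increases in surface runoff and evapotranspiration.
drain_flow_decrease = 151.0
runoff_increase = 85.0
et_increase = 51.0
water_residual = drain_flow_decrease - (runoff_increase + et_increase)

# N budget (kg N/ha/yr): decrease in NO3 lost to drains versus the
# reported offsetting increases in other pathways.
offsets = {
    "soil N storage": 8.5,
    "plant N uptake": 5.9,
    "N loss in seepage": 3.2,
    "denitrification": 2.9,
}
drain_no3_decrease = 19.4
total_offset = sum(offsets.values())

# Share of the offset due to the two pathways the abstract names as the
# main counterbalance: soil N storage and plant N uptake.
main_share = (offsets["soil N storage"] + offsets["plant N uptake"]) / total_offset

print(f"water residual: {water_residual:.0f} mm/yr")        # 15 mm/yr
print(f"total N offsets: {total_offset:.1f} kg N/ha/yr")    # 20.5 kg N/ha/yr
print(f"storage + uptake share of offset: {main_share:.0%}")  # 70%
```

The ~1.1 kg N ha⁻¹ yr⁻¹ difference between the summed offsets (20.5) and the reported drain-flow decrease (19.4) is consistent with rounding of the individually reported terms.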