• Authors:
    • Harriss, R. C.
    • Narayanan, V.
    • Li, C.
  • Source: Global Biogeochemical Cycles
  • Volume: 10
  • Issue: 2
  • Year: 1996
  • Summary: The Denitrification-Decomposition (DNDC) model was used to elucidate the roles of climate, soil properties, and farming practices in determining spatial and temporal variations in the production and emission of nitrous oxide (N2O) from agriculture in the United States. Sensitivity studies documented possible causes of annual variability in N2O flux for a simulated Iowa corn-growing soil. The 37 scenarios tested indicated that soil tillage and nitrate pollution in rainfall may be especially significant anthropogenic factors that have increased N2O emissions from soils in the United States. Feedbacks to climate change and biogeochemical manipulation of agricultural soil reflect complex interactions between the nitrogen and carbon cycles. A 20% increase in annual average temperature (in °C) produced a 33% increase in N2O emissions. Manure applications to Iowa corn crops enhanced carbon storage in soils but also increased N2O emissions. A DNDC simulation of annual N2O emissions from all crop and pasture lands in the United States indicated a value in the range 0.9-1.2 Tg N. Soil tillage and fertilizer use were the most important farming practices contributing to enhanced N2O emissions at the national scale. Soil organic matter and climate variables were the primary determinants of spatial variability in N2O emissions. Our results suggest that the United States Government, and possibly the Intergovernmental Panel on Climate Change (IPCC), have underestimated the importance of agriculture as a national and global source of atmospheric N2O. The coupled nature of the nitrogen and carbon cycles in soils results in complex feedbacks that complicate the formulation of strategies to reduce the global warming potential of greenhouse gas emissions from agriculture.
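  • Illustrative sketch (Python; not from the paper): a back-of-envelope check of how a mean per-hectare flux scales to the reported national total of 0.9-1.2 Tg N, plus the reported 33% response to a 20% temperature increase. The area and flux values are assumed round numbers chosen only to show the arithmetic.

      # Back-of-envelope scaling of a mean N2O flux to a national total.
      # Both constants are assumptions for illustration, not values from the paper.
      HA_CROP_AND_PASTURE = 4.0e8    # assumed U.S. crop + pasture area, ha
      FLUX_KG_N_PER_HA_YR = 2.5      # assumed mean N2O emission, kg N/ha/yr

      national_tg_n = HA_CROP_AND_PASTURE * FLUX_KG_N_PER_HA_YR / 1e9  # kg N -> Tg N
      print(f"National N2O emission: {national_tg_n:.1f} Tg N/yr")     # ~1.0, inside 0.9-1.2

      # Sensitivity reported in the abstract: +20% temperature -> +33% N2O emissions.
      print(f"With +20% temperature: {national_tg_n * 1.33:.2f} Tg N/yr")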
  • Authors:
    • Mosier, A. R.
    • Delgado, J. A.
  • Source: Journal of Environmental Quality
  • Volume: 25
  • Issue: 5
  • Year: 1996
  • Summary: Nitrous oxide (N2O) and methane (CH4) are greenhouse gases that contribute to global warming potential. Nitrogen (N) fertilizer is one of the most important sources of anthropogenic N2O emissions. A field study was conducted in irrigated spring barley (Hordeum vulgare L.) in northeastern Colorado to compare the N-use efficiency of, and the effect on N2O and CH4 flux of, urea, urea plus the nitrification inhibitor dicyandiamide (U + DCD), and a controlled-release fertilizer, polyolefin-coated urea (POCU). Each treatment received 90 kg urea-N ha(-1), and microplots labeled with N-15 fertilizer were established. Average N2O emissions were 4.5, 5.2, 6.9, and 8.2 g N ha(-1) d(-1) for the control, U + DCD, POCU, and urea, respectively. During the initial 21 d after fertilization, N2O emissions were reduced by 82 and 71% in the U + DCD and POCU treatments, respectively, but continued release of N fertilizer from POCU maintained higher N2O emissions through the remainder of the growing season. No treatment effect on CH4 oxidation in soils was observed. Fertilizer N-15 found 50 to 110 cm below the soil surface was lower in the POCU and U + DCD treatments. At harvest, recovery of N-15 fertilizer in the plant-soil system was 98, 90, and 85% for POCU, urea, and U + DCD, respectively. Grain yield was 2.2, 2.5, and 2.7 Mg ha(-1) for POCU, urea, and U + DCD, respectively. Dicyandiamide and POCU showed potential as mitigation alternatives to decrease N2O emissions from N fertilizer and the movement of N out of the root zone, but N release from POCU needs to be formulated to better match crop growth demands.
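  • Illustrative sketch (Python; not from the study): converting the reported mean daily fluxes into a seasonal total and expressing each treatment relative to urea. The 120-day season length is an assumption made only for the arithmetic.

      # Reported mean daily N2O fluxes (g N/ha/d) by treatment.
      daily_flux_g_n_ha = {"control": 4.5, "U+DCD": 5.2, "POCU": 6.9, "urea": 8.2}
      SEASON_DAYS = 120  # assumed season length; not stated in this summary

      for trt, flux in daily_flux_g_n_ha.items():
          seasonal_kg = flux * SEASON_DAYS / 1000.0                 # g N -> kg N per ha
          reduction = 100.0 * (1 - flux / daily_flux_g_n_ha["urea"])
          print(f"{trt:>8}: {seasonal_kg:.2f} kg N/ha per season, {reduction:.0f}% below urea")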
  • Authors:
    • Mikkelsen, R. L.
    • Parsons, J. E.
    • Gilliam, J. W.
  • Source: Buffer Zones: Their Processes and Potential in Water Protection
  • Year: 1996
  • Summary: Riparian buffers have proven to be very effective in the removal of sediment-associated nitrogen from surface runoff and nitrate from subsurface flows. In both surface and subsurface flows, hydrologic characteristics are key to determining how effective a buffer will be in nitrogen removal. Even though buffers are extremely important for minimising entry of non-point-source nitrogen into surface waters, and removals of 90% are common, they do not work well under some hydrologic conditions.
  • Authors:
    • Cai, Zucong
    • Shan, Yuhua
    • Xu, Hua
    • Callaway, J. M.
    • McCarl, Bruce A.
  • Source: Environmental and Resource Economics
  • Volume: 7
  • Issue: 1
  • Year: 1996
  • Summary: There is a growing body of literature on the costs of sequestering carbon. However, no studies have examined the interplay between farm commodity programs and carbon sequestration programs. This study investigates two dimensions of the interaction between farm commodity programs and afforestation programs, using a price-endogenous sector model of agriculture in the United States. First, this study compares the fiscal and welfare costs of achieving specific carbon targets through afforestation, with and without current farm programs. Second, it examines the welfare, fiscal, and carbon consequences of replacing existing farm subsidies, wholly or in part, with payments for carbon. Two approaches, Hicksian and Marshallian, are investigated. In the first, the sector model is used to quantify the carbon consequences and fiscal costs associated with various combinations of farm commodity and carbon sequestration programs that leave consumers and producers in the U.S. agricultural sector no worse off than under existing programs. The second approach focuses on the carbon and welfare consequences of various farm commodity and carbon sequestration programs that hold total program fiscal costs constant at current levels. Although the methodology and data are applied to the United States, the issues addressed are common in a number of developed nations, particularly within the European Union (EU). Adapting existing sector models in these nations to perform similar analyses would provide policy makers with more precise information about the nature of the trade-offs involved with second-best policies for replacing farm commodity subsidies with tree planting subsidies.
  • Authors:
    • Yoo, K. H.
    • Shirmohammadi, A.
    • Yoon, K. S.
    • Rawls, W. J.
  • Source: Journal of Environmental Science and Health, Part A: Environmental Science and Engineering and Toxicology
  • Volume: 31
  • Issue: 3
  • Year: 1996
  • Summary: A continuous version of the distributed-parameter model ANSWERS (ANSWERS 2000) was applied to a field-sized watershed planted to cotton in the Limestone Valley region of northern Alabama. The field was cultivated for three years with conventional tillage followed by three years of conservation tillage. Overall, the ANSWERS model simulated runoff and nutrient losses in surface runoff within an acceptable range for the conventional tillage system in continuous simulation mode. However, the sediment losses predicted by ANSWERS were initially on the order of fifteen times or more higher than measured, regardless of tillage system. In order to reproduce the measured data, the sediment detachment coefficients for rainfall and flow had to be reduced during calibration. The model poorly predicted soluble nutrient losses for the conservation tillage system because of its weakness in representing the surface application of fertilizer under this practice. The model simulates only one soil layer, in which soil moisture, nutrient concentration, and soil characteristics are assumed homogeneous; it currently does not consider vertical variation in nutrient concentration in the soil profile. During the conservation tillage period, corn stalks and the residue of a winter cover crop were spread on the soil surface. However, the model did not properly represent surface spreading of crop residue and was therefore unable to consider the organic-nitrogen contribution from crop residue to the erodible soil surface. This resulted in poor prediction of sediment-bound TKN, especially for the conservation tillage system.
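  • Illustrative sketch (Python; not the ANSWERS 2000 procedure): the kind of multiplicative adjustment implied by reducing the detachment coefficients until predicted sediment matches measurements. All values here are hypothetical.

      # Hypothetical single-factor calibration: scale a detachment coefficient by the
      # ratio of measured to predicted sediment yield (predictions ~15x too high).
      predicted_sediment_t_ha = 15.0   # assumed model output
      measured_sediment_t_ha = 1.0     # assumed observation
      initial_detachment_coeff = 0.5   # hypothetical coefficient value

      scale = measured_sediment_t_ha / predicted_sediment_t_ha
      calibrated_coeff = initial_detachment_coeff * scale
      print(f"Coefficient reduced ~{1/scale:.0f}x: {initial_detachment_coeff} -> {calibrated_coeff:.3f}")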
  • Authors:
    • Lyon, D. J.
    • Baltensperger, D. D.
  • Source: Journal of Production Agriculture
  • Volume: 8
  • Issue: 4
  • Year: 1995
  • Summary: Downy brome (Bromus tectorum L.), jointed goatgrass (Aegilops cylindrica Host), and volunteer cereal rye (Secale cereale L.) are winter annual grass weeds that are increasingly troublesome in the winter wheat (Triticum aestivum L. emend. Thell.)-fallow rotation areas of the western USA. Six dryland cropping systems (continuous no-till winter wheat, winter wheat-fallow with fall tillage, winter wheat-fallow with fall-applied herbicide, winter wheat-fallow-fallow, winter wheat-sunflower-fallow, and winter wheat-proso millet-fallow) were compared for their effect on winter annual grass densities in winter wheat. Winter annual grass densities averaged 145, 4.4, and 0.4 plants/sq yard for the 1-, 2-, and 3-yr systems, respectively. Eradication of the winter annual grasses was not achieved with any of the systems. Dockage and foreign material levels in wheat grain were lower in 3-yr than in 2-yr cropping systems. Jointed goatgrass was the most persistent annual grass investigated.
  • Authors:
    • Workman, J. P.
  • Source: Rangelands
  • Volume: 17
  • Issue: 2
  • Year: 1995
  • Authors:
    • Schulbach, K. F.
    • Jackson, L. E.
    • Wyland, L. J.
  • Source: The Journal of Agricultural Science
  • Volume: 124
  • Year: 1995
  • Summary: Winter non-leguminous cover crops are included in crop rotations to decrease nitrate (NO3-N) leaching and increase soil organic matter. This study examined the effect of incorporating a mature cover crop on subsequent N transformations. A field trial containing a winter cover crop of Merced rye and a fallow control was established in December 1991 in Salinas, California. The rye was grown for 16 weeks, so that plants had headed and were senescing, resulting in residue which was difficult to incorporate and slow to decompose. Frequent sampling of the surface soil (0-15 cm) showed that net mineralizable N (anaerobic incubation) rapidly increased, then decreased shortly after tillage in both treatments, but that sustained increases in net mineralizable N and microbial biomass N in the cover-cropped soils did not occur until after irrigation, 20 days after incorporation. Soil NO3-N was significantly reduced compared to winter-fallow soil at that time. A N-15 experiment examined the fate of N fertilizer, applied in cylinders at a rate of 12 kg N-15/ha at lettuce planting, and measured in the soil, microbial biomass and lettuce plants after 32 days. In the cover-cropped soil, 59% of the N-15 was recovered in the microbial biomass, compared to 21% in the winter-bare soil. The dry weight, total N and N-15 content of the lettuce in the cover-cropped cylinders were significantly lower; 28 v. 39% of applied N-15 was recovered in the lettuce in the cover-cropped and winter-bare soils, respectively. At harvest, the N content of the lettuce in the cover-cropped soil remained lower, and microbial biomass N was higher than in winter-bare soils. These data indicate that delayed cover crop incorporation resulted in net microbial immobilization which extended into the period of high crop demand and reduced N availability to the crop.
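  • Illustrative sketch (Python; editor's arithmetic from the reported percentages): the 32-day N-15 budget for the 12 kg N-15/ha application, with pools not listed in the summary (residual soil N, losses) lumped into an "other" term.

      # N-15 recovery 32 days after applying 12 kg N-15/ha, by soil management.
      APPLIED_KG_N15_HA = 12.0
      recovery_pct = {
          "cover-cropped": {"microbial biomass": 59, "lettuce": 28},
          "winter-bare":   {"microbial biomass": 21, "lettuce": 39},
      }

      for system, pools in recovery_pct.items():
          kg = {pool: APPLIED_KG_N15_HA * pct / 100 for pool, pct in pools.items()}
          other = APPLIED_KG_N15_HA - sum(kg.values())   # pools not reported in the summary
          print(system, {k: round(v, 2) for k, v in kg.items()}, f"other: {other:.2f} kg/ha")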
  • Authors:
    • Hardie, I. W.
    • Parks, P. J.
  • Source: Land Economics
  • Volume: 71
  • Issue: 1
  • Year: 1995
  • Summary: Supply schedules for forests planted on marginal agricultural lands are used to simulate a national carbon sequestration program. A cost-effective program should focus on establishing softwood forests on pastureland and select lands by minimizing cost per ton sequestered. A program similar to the Conservation Reserve Program would sequester 48.6 million tons of carbon per year (3.5 percent of U.S. emissions) on 22.2 million acres. Costs would include $3,700 million in land rental and forest establishment costs. Minimizing cost per acre would increase enrollment to 23.1 million acres and would sequester 45.0 million tons per year.
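  • Illustrative sketch (Python; editor's arithmetic from the reported figures): cost per ton and per acre for the CRP-like program, and sequestration per acre under both enrollment rules. The summary gives a cost only for the CRP-like program, so no cost is computed for the per-acre-minimizing option.

      # Reported program figures.
      COST_MILLION_USD = 3700.0                                   # CRP-like program only
      tons_per_yr = {"min cost per ton": 48.6e6, "min cost per acre": 45.0e6}
      acres       = {"min cost per ton": 22.2e6, "min cost per acre": 23.1e6}

      print(f"Cost per ton:  ${COST_MILLION_USD * 1e6 / tons_per_yr['min cost per ton']:.0f}")  # ~$76
      print(f"Cost per acre: ${COST_MILLION_USD * 1e6 / acres['min cost per ton']:.0f}")         # ~$167
      for rule in tons_per_yr:
          print(f"{rule}: {tons_per_yr[rule] / acres[rule]:.2f} tons C/acre/yr")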
  • Authors:
    • Lamm, F. R.
    • Manges, H. L.
    • Stone, L. R.
    • Khan, A. H.
    • Rogers, D. H.
  • Source: Transactions of the ASAE
  • Volume: 38
  • Issue: 2
  • Year: 1995
  • Summary: Irrigation development during the last 50 years has led to overdraft in many areas of the large Ogallala aquifer in the central United States. Faced with the decline in irrigated acres, irrigators and water resource personnel are examining many new techniques to conserve this valuable resource. A three-year study (1989 to 1991) was conducted on a Keith silt loam soil (Aridic Argiustoll) in northwest Kansas to determine the water requirement of corn (Zea mays L.) grown using a subsurface drip irrigation (SDI) system. A dryland control and five irrigation treatments, designed to meet from 25 to 125% of the calculated evapotranspiration (ET) needs of the crop, were examined. Although cumulative evapotranspiration and precipitation were near normal for the three growing seasons, irrigation requirements were higher than normal because of the timing of precipitation and high-evapotranspiration periods. Analysis of the seasonal progression of soil water revealed that the well-watered treatments (75 to 125% of ET) maintained stable soil water levels above approximately 55 to 60% of field capacity for the 2.4-m soil profile, while the deficit-irrigated treatments (no irrigation to 50% of ET) mined the soil water. Corn yields were highly linearly related to calculated crop water use, producing 0.048 Mg/ha of grain for each millimeter of water used above a threshold of 328 mm. Analysis of the calculated water balance components indicated that careful management of SDI systems can reduce net irrigation needs by nearly 25% while still maintaining top yields of 12.5 Mg/ha. Most of these water savings are attributable to minimizing nonbeneficial water balance components such as soil evaporation and long-term drainage. SDI is one technology that can significantly improve water use efficiency by better managing the water balance components.
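  • Illustrative sketch (Python; editor's restatement of the reported relation): grain yield increases 0.048 Mg/ha per millimeter of crop water use above a 328 mm threshold, which implies roughly 588 mm of water use at the reported top yield of 12.5 Mg/ha.

      SLOPE_MG_PER_MM = 0.048   # reported yield response, Mg/ha per mm
      THRESHOLD_MM = 328.0      # reported water-use threshold, mm

      def grain_yield(water_use_mm: float) -> float:
          """Corn grain yield (Mg/ha) as a linear function of seasonal crop water use (mm)."""
          return max(0.0, SLOPE_MG_PER_MM * (water_use_mm - THRESHOLD_MM))

      print(grain_yield(588))                          # ~12.5 Mg/ha
      print(THRESHOLD_MM + 12.5 / SLOPE_MG_PER_MM)     # ~588 mm implied by the top yield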