• Authors:
    • Peterson, G. A.
    • Westfall, D. G.
  • Source: Annals of Applied Biology
  • Volume: 144
  • Issue: 2
  • Year: 2004
  • Summary: In the Great Plains of North America potential evaporation exceeds precipitation during most months of the year. About 75% of the annual precipitation is received from April through September, and is accompanied by high temperatures and low relative humidity. Dryland agriculture in the Great Plains has depended on wheat production in a wheat-fallow agroecosystem (one crop year followed by a fallow year). Historically this system has used mechanical weed control practices during the fallow period, which leaves essentially no crop residue cover for protection against soil erosion and greatly accelerates soil organic carbon oxidation. This paper reviews the progress made in precipitation management in the North American Great Plains and synthesises data from an existing long-term experiment to demonstrate the management principles involved. The long-term experiment was established in 1985 to identify dryland crop and soil management systems that would maximize precipitation use efficiency (maximization of biomass production per unit of precipitation received), improve soil productivity, and increase economic return to the farmers in the West Central portion of the Great Plains. Embedded within the primary objective are subobjectives that focus on reducing the amount of summer fallow time and reversing the soil degradation that has occurred in the wheat-fallow cropping system. The experiment consists of four variables: 1) Climate regime; 2) Soils; 3) Management systems; and 4) Time. The climate variable is based on three levels of potential evapotranspiration (ET), which are represented by three sites in eastern Colorado. All sites have annual long-term precipitation averages of approximately 400-450 mm, but vary in growing season open pan evaporation from 1600 mm in the north to 1975 mm in the south. The soil variable is represented by a catenary sequence of soils at each site. Management systems, the third variable, differ in the amount of summer fallow time and emphasize increased crop diversity. All systems are managed with no-till techniques. The fourth variable is time, and the results presented in this paper are for the first 12 yr (3 cycles of the 4-yr system). Comparing yields of cropping systems that differ in cycle length and systems that contain fallow periods, when no crop is produced, is done with a technique called "annualisation". Yields are "annualised" by summing yields for all crops in the system and dividing by the total number of years in the system cycle. For example, in a wheat-fallow system the wheat yield is divided by two because it takes 2 yr to produce one crop (a worked sketch follows this entry). Cropping system intensification increased annualised grain and crop residue yields by 75% to 100% compared to wheat-fallow. Net return to farmers increased by 25% to 45% compared to wheat-fallow. Intensified cropping systems increased soil organic C content by 875 and 1400 kg ha(-1), respectively, after 12 yr compared to the wheat-fallow system. All cropping system effects were independent of climate and soil gradients, meaning that the potential for C sequestration exists in all combinations of climates and soils. Soil C gains were directly correlated to the amount of crop residue C returned to the soil. Improved macroaggregation was also associated with increases in the C content of the aggregates. Soil bulk density was reduced by 0.01 g cm(-3) for each 1000 kg ha(-1) of residue addition over the 12-yr period, and each 1000 kg ha(-1) of residue addition increased effective porosity by 0.3%. 
No-till practices have made it possible to intensify cropping beyond the traditional wheat-fallow system, and in turn water-use efficiency has increased by 30% in West Central Great Plains agroecosystems. Cropping intensification has also provided positive feedbacks to soil productivity via the increased amounts of crop residue being returned to the soil.
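To make the annualisation arithmetic concrete, here is a minimal Python sketch; the rotation layouts and yield figures are hypothetical placeholders, not values from the study.

```python
# Annualised yield: sum the grain produced over one full rotation cycle and
# divide by the number of years in the cycle; fallow years contribute zero.
# All yields below are hypothetical (kg/ha).

def annualised_yield(yields_by_year):
    """Total yield over a rotation cycle divided by cycle length in years."""
    return sum(yields_by_year) / len(yields_by_year)

wheat_fallow = [2000, 0]              # wheat year, fallow year (2-yr cycle)
intensified = [2000, 3500, 1500, 0]   # e.g. a 4-yr wheat-corn-millet-fallow cycle

print(annualised_yield(wheat_fallow))  # 1000.0 -> the wheat yield divided by two
print(annualised_yield(intensified))   # 1750.0
```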
  • Authors:
    • Linden, D. R.
    • Voorhees, W. B.
    • Hatfield, J. L.
    • Johnson, J. M. F.
    • Wilhelm, W. W.
  • Source: Agronomy Journal
  • Volume: 96
  • Issue: 1
  • Year: 2004
  • Summary: Society is facing three related issues: overreliance on imported fuel, increasing levels of greenhouse gases in the atmosphere, and producing sufficient food for a growing world population. The U.S. Department of Energy and private enterprise are developing the technology necessary to use high-cellulose feedstock, such as crop residues, for ethanol production. Corn (Zea mays L.) residue can provide about 1.7 times more C than barley (Hordeum vulgare L.), oat (Avena sativa L.), sorghum [Sorghum bicolor (L.) Moench], soybean [Glycine max (L.) Merr.], sunflower (Helianthus annuus L.), and wheat (Triticum aestivum L.) residues based on production levels. Removal of crop residue from the field must be balanced against its environmental impact (soil erosion), the need to maintain soil organic matter levels, and the need to preserve or enhance productivity. Our objective is to summarize published work on the potential impacts of wide-scale corn stover collection on corn production capacity in Corn Belt soils. We address the issue of crop yield (sustainability) and related soil processes directly. However, scarcity of data requires us to deal with the issue of greenhouse gases indirectly and by inference. All ramifications of new management practices and crop uses must be explored and evaluated fully before an industry is established. Our conclusion is that, within limits, corn stover can be harvested for ethanol production to provide a renewable, domestic source of energy that reduces greenhouse gases. Recommendations for removal rates will vary based on regional yield, climatic conditions, and cultural practices. Agronomists are challenged to develop a procedure (tool) for recommending maximum permissible removal rates that ensure sustained soil productivity.
  • Authors:
    • Wolt, J. D.
  • Source: Nutrient Cycling in Agroecosystems
  • Volume: 69
  • Issue: 1
  • Year: 2004
  • Summary: The effectiveness of nitrification inhibitors for abatement of N loss from the agroecosystem is difficult to measure at typical agronomic scales, since performance varies at the research field scale due to complex interactions among crop management, soil properties, length of the trial, and environmental factors. The environmental impact of the nitrification inhibitor nitrapyrin on N losses from agronomic ecosystems was considered with emphasis on the Midwestern USA. A meta-evaluation approach considered the integrated responses to nitrification inhibition found across research trials conducted in diverse environments over many years as measured in side-by-side comparisons of fertilizer N or manure applied with and without nitrapyrin. The resulting distributions of response indices were evaluated with respect to the magnitude and variance of the agronomic and environmental effects that may be achieved when nitrification inhibitors are used regionally over time. The indices considered (1) crop yield, (2) annual or season-long maintenance of inorganic N within the crop root zone, (3) NO3-N leached past the crop root zone, and (4) greenhouse gas emission from soil. Results showed that on average, the crop yield increased (relative to N fertilization without nitrapyrin) by 7% and soil N retention increased by 28%, while N leaching decreased by 16% and greenhouse gas emissions decreased by 51%. In more than 75% of individual comparisons, use of a nitrification inhibitor increased soil N retention and crop yield, and decreased N leaching and volatilization. The potential of nitrification inhibitors for reducing N loss needs to be considered at the scale of a sensitive region, such as a watershed, over a prolonged period of use as well as within the context of overall goals for abatement of N losses from the agroecosystem to the environment.
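As a rough illustration of the side-by-side response-index approach (a sketch under assumed data, not the author's meta-evaluation code), the snippet below computes a percentage response for each paired trial and summarizes the distribution; all trial values are hypothetical.

```python
# Response index for a paired (with vs. without nitrapyrin) comparison,
# expressed as percent change relative to the no-inhibitor treatment.
# Trial data are hypothetical placeholders.

def response_index(with_inhibitor, without_inhibitor):
    return 100.0 * (with_inhibitor - without_inhibitor) / without_inhibitor

# Hypothetical paired crop yields (Mg/ha) from several side-by-side trials.
paired_yields = [(9.4, 9.0), (8.1, 7.9), (10.2, 9.3), (7.8, 8.0)]

indices = [response_index(w, wo) for w, wo in paired_yields]
mean_response = sum(indices) / len(indices)
share_positive = sum(i > 0 for i in indices) / len(indices)

print(f"mean yield response: {mean_response:.1f}%")
print(f"fraction of trials with a positive response: {share_positive:.2f}")
```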
  • Authors:
    • Anderson, R. L.
  • Source: Weed Technology
  • Volume: 18
  • Issue: 1
  • Year: 2004
  • Summary: Dryland rotations are changing in the semiarid Great Plains because of no-till systems. Producers now rotate summer annual crops such as corn with winter wheat and fallow, which can disrupt weed population growth because of diverse life cycles among crops. This study estimated changes in weed populations as affected by rotation design, with the goal of suggesting crop sequences that lower weed community density. We used an empirical life-cycle simulation based on demographics of jointed goatgrass and green foxtail to compare various rotations consisting of winter wheat, corn, proso millet, and fallow across a 12-yr period. The simulation indicated that designing rotations to include a 2-yr interval when seed production of either jointed goatgrass or green foxtail is prevented will drastically reduce weed populations. Arranging four different crops in sequences of two cool-season crops, followed by two warm-season crops was the most beneficial for weed management. Fallow, if used, serves in either life-cycle category. However, if the same crop is grown 2 yr in a row, such as winter wheat, the benefit of rotation design on weed density is reduced considerably. Impact of rotation design on weed density was enhanced by improving crop competitiveness with cultural practices. Rotations with balanced life-cycle intervals not only reduce weed density but enable producers to use alternative weed management strategies, improve effectiveness of herbicides used, and minimize herbicide resistance.
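The effect of a 2-yr break in seed production can be illustrated with a toy seedbank model; the demographic rates and the 2-yr seed longevity assumed below are invented for illustration and are not the jointed goatgrass or green foxtail parameters used in the study.

```python
# Toy age-structured weed seedbank model: new seed is added only in years
# when the crop's life cycle lets the weed set seed; ungerminated seed
# carries over with annual mortality and is assumed viable for at most 2 yr.
# All rates are hypothetical.

FECUNDITY = 3.0    # new seeds produced per seedbank unit in a seed-set year
CARRYOVER = 0.3    # fraction of ungerminated seed surviving to the next year
MAX_AGE = 2        # assumed seed longevity in soil, years

def simulate(rotation, years=12, initial=100.0):
    """rotation: list of booleans, True = the weed can produce seed that year."""
    cohorts = [initial]                       # seedbank by cohort age
    for yr in range(years):
        bank = sum(cohorts)
        seed_set = rotation[yr % len(rotation)]
        new_seed = FECUNDITY * bank if seed_set else 0.0
        # age the existing cohorts and drop those past the longevity limit
        cohorts = [new_seed] + [c * CARRYOVER for c in cohorts][:MAX_AGE - 1]
    return sum(cohorts)

# For a cool-season weed: cool-season crops allow seed set, warm-season
# crops and fallow interrupt it.
cool_cool_warm_warm = [True, True, False, False]   # one 2-yr interruption
alternating = [True, False, True, False]           # only 1-yr interruptions

print(simulate(cool_cool_warm_warm))  # collapses once the 2-yr gap exceeds seed longevity
print(simulate(alternating))          # declines slowly but persists
# Complete elimination here is an artifact of capping seed longevity at 2 yr;
# real populations decline sharply rather than disappear.
```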
  • Authors:
    • Halvorson, A. D.
    • DeVuyst, E. A.
  • Source: Agronomy Journal
  • Volume: 96
  • Issue: 1
  • Year: 2004
  • Summary: Annualized yields with more intensive cropping (IC) systems tend to be greater than those of spring wheat-fallow (SW-F); however, little economic comparison information is available. The long-term (12 yr) effects of tillage system and N fertilization on the economic returns from two dryland cropping systems in North Dakota were evaluated. An IC rotation [spring wheat (Triticum aestivum L.)-winter wheat (T. aestivum L.)-sunflower (Helianthus annuus L.)] and a SW-F rotation were studied. Tillage systems included conventional till (CT), minimum till (MT), and no-till (NT). Nitrogen rates were 34, 67, and 101 kg N ha(-1) for the IC system and 0, 22, and 45 kg N ha(-1) for the SW-F system. Annual precipitation ranged from 206 to 655 mm, averaging 422 mm over 12 yr. The IC system generated higher profits than the SW-F system, but the IC profits were more variable. Within the IC system, MT generated higher profits than corresponding N treatments under CT and NT, but MT profits were more variable. Of the N rates evaluated, the largest N rates generated the largest profits. The dryland IC system with MT and NT was more profitable than the best SW-F system using CT for this location. Stochastic dominance analyses revealed that the SW-F system and IC system CT treatments were economically inefficient when compared with the IC system with MT and NT.
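For readers unfamiliar with the stochastic dominance criterion mentioned, the sketch below checks first-degree stochastic dominance between two series of annual net returns; the profit figures are hypothetical, and the study may well have applied a different (e.g. second-degree or generalized) dominance criterion.

```python
# First-degree stochastic dominance check on two equal-length samples of
# annual net returns. Series A dominates B if every sorted return in A is at
# least as large as the corresponding sorted return in B, with at least one
# strictly larger; a dominated alternative is economically inefficient for
# any profit-preferring decision maker. Returns below are hypothetical ($/ha).

def first_degree_dominates(a, b):
    a, b = sorted(a), sorted(b)
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

ic_mt = [120, 45, 210, 90, 160, 75]    # intensified rotation, minimum till
swf_ct = [60, 30, 110, 55, 85, 40]     # spring wheat-fallow, conventional till

print(first_degree_dominates(ic_mt, swf_ct))   # True -> SW-F CT is inefficient
```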
  • Authors:
    • Gibson, S. G.
    • Yarboro, W.
    • Hamrick, M.
    • Thompson, S.
    • King, R.
  • Source: Proceedings of the 26th Southern Conservation Tillage Conference for Sustainable Agriculture
  • Year: 2004
  • Summary: In addition to regular programming, County Agricultural Extension agents are often asked to respond to questions, suggestions, and concerns from their farmer clientele. In North Carolina, as in other states, an advisory leadership system is in place, and farmers can formally and informally make suggestions and requests for on-farm demonstration work. In many cases, what farmers observe in their fields and/or what they have read "sparks" the interactions with agents. Such has been the case in Cleveland County, NC. For example, in the early continuous no-till era, many area farmers were concerned about soil compaction. Measurements and simple demonstrations conducted by the Cleveland and Lincoln County agents and supported by the NCSU Soil Science Department and Cleveland County Government helped alleviate these concerns. Later, as fields remained in continuous no-till for 5 or more years, farmers began to notice a greater than expected development of their crops prior to major applications of fertilizer nitrogen. These observations led to a replicated test in wheat conducted by the Cleveland County Agricultural Extension agent comparing a field in a 2-year no-till wheat-soybean rotation versus a nearby field in a 5-year continuous no-till wheat-soybean rotation. A 6-year replicated test was also initiated on Cleveland County-owned land that had been in continuous no-till for 10 years. The test was set up as a continuous soybean-corn rotation, and in addition to the standard dryland portion, irrigation was used in part of the study to simulate a "good" corn year. Five nitrogen rates were used. The economics of fertilizer nitrogen cost were used to demonstrate that the Realistic Yield Expectation (RYE) method for determining nitrogen rates is very much applicable in continuous no-till. Both the wheat and corn tests indicated that residual soil nitrogen is indeed becoming a major factor in continuous no-till for these field crops, and that, when farmers consider the realities of the weather, nitrogen rates can very likely be reduced with confidence.
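As a rough sketch of how an RYE-based nitrogen rate is typically assembled (a realistic yield expectation multiplied by a crop N factor, minus credits such as residual soil N under long-term no-till); the factor and credit values here are hypothetical placeholders, not North Carolina recommendations or the demonstration's data.

```python
# RYE-based fertilizer N rate: expected yield times a crop-specific N factor,
# minus any soil N credit (e.g. residual N that accumulates under long-term
# continuous no-till). All numbers below are hypothetical placeholders.

def rye_n_rate(expected_yield, n_factor, soil_n_credit=0.0):
    """Return a fertilizer N recommendation in the same mass units as the credit."""
    return max(expected_yield * n_factor - soil_n_credit, 0.0)

# Hypothetical corn example: 140 bu/ac realistic yield, 1.0 lb N per bushel,
# and a 30 lb/ac credit for residual N under long-term continuous no-till.
print(rye_n_rate(140, 1.0))        # 140.0 lb N/ac without a soil N credit
print(rye_n_rate(140, 1.0, 30))    # 110.0 lb N/ac with the credit
```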
  • Authors:
    • Halvorson, A. D.
    • Nielsen, D. C.
    • Reule, C. A.
  • Source: Agronomy Journal
  • Volume: 96
  • Issue: 4
  • Year: 2004
  • Summary: No-till (NT) production systems, especially winter wheat (Triticum aestivum L.)-summer crop-fallow, have increased in the central Great Plains, but few N fertility studies have been conducted with these systems. Therefore, winter wheat (W) response to N fertilization in two NT dryland crop rotations, wheat-corn (Zea mays L.)-fallow (WCF) and wheat-sorghum [Sorghum bicolor (L.) Moench]-fallow (WSF), on a Platner loam (fine, smectitic, mesic Aridic Paleustoll) was evaluated for 9 yr. Five N rates, 0, 28, 56, 84, and 112 kg N ha(-1), were applied to each rotation crop. Wheat biomass and grain yield response to N fertilization varied with year but not with crop rotation, increasing with N application each year, with maximum yields being obtained with 84 kg N ha(-1) over all years. Based on grain N removal, N fertilizer use efficiency (NFUE) varied with N rate and year, averaging 86, 69, 56, and 46% for the 28, 56, 84, and 112 kg ha(-1) N rates, respectively. Grain protein increased with increasing N rate. Precipitation use efficiency (PUE) increased with N addition, leveling off above 56 kg N ha(-1). A soil plus fertilizer N level of 124 to 156 kg N ha(-1) was sufficient to optimize winter wheat yields in most years in both rotations. Application of more than 84 kg N ha(-1) on this Platner loam soil, with a gravel layer below 120 cm soil depth, would more than likely increase the amount of NO3-N available for leaching and ground water contamination. Wheat growers in the central Great Plains need to apply N to optimize dryland wheat yields and improve grain quality, but need to avoid over-fertilization with N to minimize NO3-N leaching potential.
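A minimal sketch of a difference-method NFUE calculation based on grain N removal; the removal values are hypothetical, chosen only so the computed percentages land near the averages quoted above, and the authors' exact calculation may differ.

```python
# Difference-method NFUE: the extra grain N removed relative to the
# unfertilized check, expressed as a percentage of the fertilizer N applied.
# Grain N removal values below are hypothetical (kg N/ha).

def nfue_percent(grain_n_fertilized, grain_n_check, n_applied):
    return 100.0 * (grain_n_fertilized - grain_n_check) / n_applied

check_removal = 40.0                                   # grain N removal at 0 kg N/ha
trials = {28: 64.0, 56: 78.6, 84: 87.0, 112: 91.5}     # N rate -> grain N removal

for rate, removal in trials.items():
    print(rate, "kg N/ha ->", round(nfue_percent(removal, check_removal, rate)), "% NFUE")
```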
  • Authors:
    • NASDA
  • Year: 2004
  • Authors:
    • Kaspar, T. C.
    • Parkin, T. B.
  • Source: Soil Science Society of America Journal
  • Volume: 68
  • Year: 2004
  • Summary: It is well known that soil CO2 flux can exhibit pronounced day-to-day variations; however, measurements of soil CO2 flux with soil chambers typically are made only at discrete points in time. This study evaluated the impact of sampling frequency on the precision of cumulative CO2 flux estimates calculated from field measurements. Automated chambers were deployed at two sites in a no-till corn/soybean field and operated in open-system mode to measure soil CO2 fluxes every hour from 4 March 2000 through 6 June 2000. Sampling frequency effects on cumulative CO2-C flux estimation were assessed with a jackknife technique whereby the populations of measured hourly fluxes were numerically sampled at regular time intervals ranging from 1 d to 20 d, and the resulting sets of jackknife fluxes were used to calculate estimates of cumulative CO2-C flux. We observed that as the sampling interval increased from 1 d to 12 d, the variance associated with cumulative flux estimates increased; at sampling intervals of 12 to 20 d, variances were relatively constant. With sampling once every 3 d, estimates of cumulative C loss were within ±20% of the expected value at both sites. As the time interval between samplings increased, the potential deviation in estimated cumulative CO2 flux increased, such that sampling once every 20 d yielded estimates that deviated by as much as 60% and 40% from the actual cumulative CO2 flux at the two sites. A stratified sampling scheme around rainfall events was also evaluated and was found to provide more precise estimates at lower sampling intensities. These results should help investigators develop sampling designs that minimize the effects of temporal variability on cumulative CO2-C estimation.
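The subsampling idea can be sketched as follows: the hourly flux record is sampled at a fixed interval, each retained measurement is scaled to the interval it represents, and the spread of cumulative-flux estimates across starting offsets indicates the precision at that frequency. The flux series below is synthetic, and this is a simplified stand-in for the authors' jackknife procedure, not a reproduction of it.

```python
import math

# Synthetic hourly soil CO2 flux record (~3 months) with a diurnal cycle and
# a slower multi-day cycle; units are arbitrary.
HOURS = 24 * 95
hourly_flux = [2.0 + math.sin(2 * math.pi * h / 24)
               + 0.5 * math.sin(2 * math.pi * h / (24 * 15))
               for h in range(HOURS)]

def cumulative_estimates(fluxes, interval_days):
    """Cumulative-flux estimates for every possible starting offset at a given
    sampling interval, with each sampled flux scaled by the hours it represents."""
    step = interval_days * 24
    return [sum(fluxes[offset::step]) * step for offset in range(step)]

true_total = sum(hourly_flux)
for days in (1, 3, 12, 20):
    estimates = cumulative_estimates(hourly_flux, days)
    worst = max(abs(e - true_total) for e in estimates) / true_total
    print(f"sampling every {days:2d} d: worst-case deviation {100 * worst:.1f}%")
```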
  • Authors:
    • Dale, B. E.
    • Kim, S.
  • Source: Biomass and Bioenergy
  • Volume: 26
  • Issue: 4
  • Year: 2004
  • Summary: The global annual potential bioethanol production from the major crops, corn, barley, oat, rice, wheat, sorghum, and sugar cane, is estimated. To avoid conflicts between human food use and industrial use of crops, only the wasted crop, which is defined as crop lost in distribution, is considered as feedstock. Lignocellulosic biomass such as crop residues and sugar cane bagasse is included as feedstock for producing bioethanol as well. There are about 73.9 Tg of dry wasted crops in the world that could potentially produce 49.1 GL year(-1) of bioethanol. About 1.5 Pg year(-1) of dry lignocellulosic biomass from these seven crops is also available for conversion to bioethanol. Lignocellulosic biomass could produce up to 442 GL year(-1) of bioethanol. Thus, the total potential bioethanol production from crop residues and wasted crops is 491 GL year(-1), about 16 times higher than the current world ethanol production. The potential bioethanol production could replace 353 GL of gasoline (32% of the global gasoline consumption) when bioethanol is used in E85 fuel for a midsize passenger vehicle. Furthermore, lignin-rich fermentation residue, which is the coproduct of bioethanol made from crop residues and sugar cane bagasse, can potentially generate both 458 TWh of electricity (about 3.6% of world electricity production) and 2.6 EJ of steam. Asia is the largest potential producer of bioethanol from crop residues and wasted crops, and could produce up to 291 GL year(-1) of bioethanol. Rice straw, wheat straw, and corn stover are the most favorable bioethanol feedstocks in Asia. The next highest potential region is Europe (69.2 GL of bioethanol), in which most bioethanol comes from wheat straw. Corn stover is the main feedstock in North America, from which about 38.4 GL year(-1) of bioethanol can potentially be produced. Globally, rice straw can produce 205 GL of bioethanol, which is the largest amount from a single biomass feedstock. The next highest potential feedstock is wheat straw, which can produce 104 GL of bioethanol. This paper is intended to give some perspective on the size of the bioethanol feedstock resource, globally and by region, and to summarize relevant data that we believe others will find useful, for example, those who are interested in producing biobased products such as lactic acid, rather than ethanol, from crops and wastes. The paper does not attempt to indicate how much, if any, of this waste material could actually be converted to bioethanol.
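To put the headline totals in proportion, the short sketch below back-calculates the average feedstock-to-ethanol conversion factors implied by the abstract's own figures; these are rough implied averages, not the crop-specific yields used in the paper.

```python
# Implied average ethanol yields per unit of dry feedstock, back-calculated
# from the totals quoted in the abstract (GL = 1e9 L, Tg = 1e9 kg, Pg = 1e12 kg).

wasted_crops_tg, wasted_ethanol_gl = 73.9, 49.1
residues_pg, residue_ethanol_gl = 1.5, 442.0

wasted_l_per_kg = (wasted_ethanol_gl * 1e9) / (wasted_crops_tg * 1e9)
residue_l_per_kg = (residue_ethanol_gl * 1e9) / (residues_pg * 1e12)

print(f"wasted grain crops: ~{wasted_l_per_kg:.2f} L ethanol per kg dry matter")
print(f"lignocellulosic residues: ~{residue_l_per_kg:.2f} L ethanol per kg dry matter")
print(f"total potential: {wasted_ethanol_gl + residue_ethanol_gl:.0f} GL per year")
```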