• Authors:
    • Dahl, B.
    • Gustafson, C.
    • Wilson, W.
  • Source: Agricultural Finance Review
  • Volume: 69
  • Issue: 1
  • Year: 2009
  • Summary: Malting barley is an important specialty crop in the Northern Plains, and growers mitigate risk with federally subsidized crop insurance and production contracts. The purpose of this paper is to quantify the risks growers face due to "coverage gaps" in crop insurance, which result in uncertain indemnity payments when their crop does not meet contract specifications. A stochastic dominance model is developed to evaluate alternative strategies for growers with differing risk attitudes and production practices (irrigated vs. dryland). The results illustrate how alternative crop insurance provisions affect efficient choice sets for growers. Risk premiums for irrigated growers consistently favor more coverage, contracts, and malting option B. As the crop insurance industry matures in the functions it performs, it will become increasingly important to address quality attributes.
  • Authors:
    • Hunt, J. R.
    • Dalgliesh, N. P.
    • McCown, R. L.
    • Whish, J. P. M.
    • Robertson, M. J.
    • Foale, M. A.
    • Poulton, P. L.
    • Rees, H. van
    • Carberry, P. S.
    • Hochman, Z.
  • Source: Crop & Pasture Science
  • Volume: 60
  • Issue: 11
  • Year: 2009
  • Summary: Relevance to real-world agriculture has long been a rationale for crop simulation model development. However, because crop models are generally developed and tested against experimental data, and because large systematic gaps are often reported between experimental and farmer yields, the relevance of simulated yields to the commercial yields of field crops may be questioned. This is the third paper in a series describing a substantial effort to deliver model-based decision support to Australian farmers. First, the performance of the cropping systems simulator, APSIM, in simulating commercial crop yields is reported across a range of field crops and agricultural regions. Second, actual case studies are used to describe how APSIM gained credibility with farmers for their planning and decision making. Information was collated on APSIM performance in simulating the yields of over 700 commercial crops of barley, canola, chickpea, cotton, maize, mungbean, sorghum, sugarcane, and wheat monitored over the period 1992 to 2007 in all cropping regions of Australia. This evidence indicated that APSIM can predict the performance of commercial crops at a level close to that reported for its performance against experimental yields. Importantly, an essential requirement for simulating commercial yields across the Australian dryland cropping regions is to accurately describe the resources available to the crop being simulated, particularly soil water and nitrogen. Five case studies of using APSIM with farmers are described in order to demonstrate how model credibility was gained in the context of each circumstance. The proposed process for creating mutual understanding and credibility involved dealing with the immediate questions of the farmers involved, contextualising the simulations to the specific situation in question, providing simulation outputs in an iterative process, and together reviewing the ensuing seasonal results against the provided simulations.
This paper is distinct from many other reports testing the performance and utility of cropping systems models. Here, the measured yields are from commercial crops not experimental plots and the described applications were from real-life situations identified by farmers. A key conclusion, from 17 years of effort, is the proven ability of APSIM to simulate yields from commercial crops provided soil properties are well characterised. Thus, the ambition of models being relevant to real-world agriculture is indeed attainable, at least in situations where biotic stresses are manageable.
  • Authors:
    • Davis, R. A.
    • Huggins, D. R.
    • Cook, R. J.
    • Paulitz, T. C.
  • Source: Canadian Journal of Plant Pathology
  • Volume: 31
  • Issue: 4
  • Year: 2009
  • Summary: Fusarium crown rot of wheat (Triticum aestivum), caused by Fusarium pseudograminearum and Fusarium culmorum, is a yield-limiting disease in the dryland wheat-production area of the intermountain Pacific Northwest and is exacerbated in water-stressed plants, a condition induced by overfertilization with nitrogen (N). Plants with excess N deplete water from the soil profile more rapidly and become drought stressed prematurely. Traditionally a problem on winter wheat in summer fallow, this disease has become more important for spring wheat in continuous cropping areas managed for high grain protein levels. During 3 years with direct seeding (no till) near Pullman, Washington, we investigated whether a split application of N, with some applied the previous fall and some with planting, could limit the disease compared with all N applied in the spring and with no N as the check. We also investigated the influence of the previous (rotation) crop (winter and spring canola, Brassica rapa; barley, Hordeum vulgare; or peas, Pisum sativum) on disease, grain yield, grain protein concentration, and populations of Fusarium in the soil. Overall, the DNA concentration of F. culmorum was significantly greater than that of F. pseudograminearum, and F. culmorum was highest following spring barley. Disease severity and yield were consistently lower in the no-N treatments compared with the other N treatments. The split application reduced disease in only 1 of 3 years. The all-spring application resulted in higher grain protein in 2 of 3 years compared with the split application, but yield was not affected. The previous crop had small but significant effects on disease, but they were not consistent from year to year and often interacted with the N treatment. Grain protein was higher in wheat after pea in 2 of 3 years. In conclusion, splitting of N had little effect on fusarium crown rot, probably because the N level in both treatments was conducive to disease development.
Even when it was not a host species, the previous crop had little effect on subsequent disease, probably because Fusarium persists for more than one season as chlamydospores and in crop residue in this dry-summer climate.
  • Authors:
    • NASS
    • USDA
  • Year: 2009
  • Authors:
    • Smith, K. A.
    • Edwards, A. C.
    • Reay, D. S.
  • Source: Agriculture, Ecosystems & Environment
  • Volume: 133
  • Issue: 3-4
  • Year: 2009
  • Summary: Direct and indirect nitrous oxide (N2O) emissions and leaching losses from an intensively managed grazed pasture in the Ythan catchment, Aberdeenshire, UK, were measured and compared over a 17-month period. Simultaneous measurements of farm-wide N2O leaching losses were made, and catchment-wide fluxes were estimated from existing N leaching data. The relative importance of direct and indirect N2O fluxes at the field, farm, and catchment scale was then assessed. At the field scale we found that direct N2O emissions were low (1.2 kg N ha-1 year-1, 0.6% of N input), with indirect N2O emissions via drainage waters comprising a significant proportion (25%) of total N2O emissions. At the whole-farm scale, the N2O-N emission factor (0.003) for leached NO3-N (EF5-g) was in line with the IPCC's recent downward revision. At the catchment scale, a direct N2O flux of 1.9 kg N ha-1 year-1 and an indirect flux of 0.06 kg N2O-N ha-1 year-1 were estimated. This study lends further support to the recent downward revision of the IPCC emission factor for N2O arising from leached N in surface and ground waters (EF5-g) and highlights the need for multiple point sampling to ensure that the importance of indirect N2O losses via drainage waters is not misrepresented at the farm and catchment scales.
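The field-scale figures in this abstract are internally consistent and can be back-calculated. A minimal sketch, using only the values quoted above (the variable names and the assumption that direct emissions make up the remaining 75% of the total are mine, not the paper's):

```python
# Back-of-envelope check of the field-scale N2O figures quoted in the abstract.
direct_n2o = 1.2                  # kg N2O-N ha-1 yr-1, direct field emission
direct_fraction_of_input = 0.006  # the quoted 0.6% of N input

# Implied annual N input to the pasture
n_input = direct_n2o / direct_fraction_of_input  # kg N ha-1 yr-1

# Indirect (drainage-water) N2O is stated as 25% of total N2O emissions,
# so direct emissions are assumed here to account for the remaining 75%.
total_n2o = direct_n2o / 0.75
indirect_n2o = total_n2o * 0.25

print(round(n_input))          # 200 kg N ha-1 yr-1
print(round(indirect_n2o, 2))  # 0.4 kg N2O-N ha-1 yr-1
```

An N input around 200 kg N ha-1 yr-1 is plausible for an intensively managed UK grazed pasture, which supports the internal consistency of the quoted percentages.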
  • Authors:
    • Lal, R.
  • Source: Soil & Tillage Research
  • Volume: 102
  • Issue: 2
  • Year: 2009
  • Summary: Global energy demand of 424 EJ year-1 in 2000 is increasing at the rate of 2.2% year-1. There is a strong need to increase biofuel production because of rising energy costs and the risks of global warming caused by fossil fuel combustion. Biofuels, being C-neutral and renewable energy sources, are an important alternative to fossil fuels. Therefore, identification of viable sources of biofuel feedstock is a high priority. Harvesting lignocellulosic crop residues, especially of cereal crops, is being considered by industry as one of the sources of biofuel feedstocks. Annual production of lignocellulosic residues of cereals is estimated at 367 million Mg year-1 (75% of the total) for the U.S., and 2800 million Mg year-1 (74.6% of the total) for the world. The energy value of the residue is 16 × 106 BTU Mg-1. However, harvesting crop residues would have a strong adverse impact on soil quality. Returning crop residues to soil as amendments is essential to: (a) recycling plant nutrients (20-60 kg of N, P, K, Ca per Mg of crop residues) amounting to 118 million Mg of N, P, K in residues produced annually in the world (83.5% of the world's fertilizer consumption), (b) sequestering soil C at the rate of 100-1000 kg C ha-1 year-1 depending on soil type and climate, with a total potential of 0.6-1.2 Pg C year-1 in world soils, (c) improving soil structure, water retention and transmission properties, (d) enhancing activity and species diversity of soil fauna, (e) improving water infiltration rate, (f) controlling water runoff and minimizing risks of erosion by water and wind, (g) conserving water in the root zone, and (h) sustaining agronomic productivity by decreasing losses and increasing use efficiency of inputs. Thus, harvesting crop residues as biofuel feedstock would jeopardize soil and water resources which are already under great stress.
Biofuel feedstock must instead be produced through biofuel plantations established on specifically identified soils which do not compete with those dedicated to food crop production. Biofuel plantations, comprising warm-season grasses (e.g., switchgrass), short-rotation woody perennials (e.g., poplar) and herbaceous species (e.g., miscanthus), must be established on agriculturally surplus/marginal soils or degraded/desertified soils. Plantations established on such soils would restore degraded ecosystems, enhance the soil/terrestrial C pool, improve water resources and produce biofuel feedstocks.
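The scale argument in this abstract can be reproduced from the quoted figures. A rough arithmetic sketch (all inputs are the abstract's numbers; the unit conversion and variable names are mine):

```python
# Implied totals and energy content from the abstract's residue figures.
us_cereal_residue = 367e6      # Mg yr-1, stated to be 75% of total US residues
world_cereal_residue = 2800e6  # Mg yr-1, stated to be 74.6% of the world total
energy_per_mg_btu = 16e6       # BTU per Mg of residue
btu_to_joule = 1055.06         # 1 BTU in joules (standard conversion)

us_total_residue = us_cereal_residue / 0.75        # ~489 million Mg yr-1
world_total_residue = world_cereal_residue / 0.746  # ~3750 million Mg yr-1

# Gross energy in the world's cereal residues, in exajoules (1 EJ = 1e18 J)
world_energy_ej = world_cereal_residue * energy_per_mg_btu * btu_to_joule / 1e18
# ~47 EJ yr-1, about a tenth of the 424 EJ yr-1 global demand cited above
```

Even harvesting every cereal residue worldwide would therefore supply only a modest share of global energy demand, which underlines the abstract's point that the soil-quality cost is not worth the energy gain.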
  • Authors:
    • Baraibar, B.
    • Westerman, P. R.
    • Recasens, J.
  • Source: Journal of Applied Ecology
  • Volume: 46
  • Issue: 2
  • Year: 2009
  • Summary: Agricultural intensification can cause a huge increase in productivity. However, associated costs in terms of reduced self-regulation and increased reliance on external inputs for the control of pests, diseases and weeds are seldom taken into account or acknowledged. A pro-active approach in which ecosystem services are documented and potential effects of changes in agricultural practices evaluated may lead to more informed decisions prior to implementation. We investigated the effects of management of cereal production in a semi-arid region on weed seed mortality caused by predators. Seed losses have a greater impact on weed population size than any other life cycle process and should therefore be of significance for natural weed control. We hypothesized that the conversion from rain-fed to irrigated production should lead to reduced seed predation, and the adoption of no-till techniques to increased seed predation. Seed removal and seed predator populations were monitored in irrigated (N = 3) and rain-fed cereal fields (N = 6) and field margins. Half of the dryland fields were conventionally tilled and the other half were no-till. Seed removal (g g(-1) 2-days(-1)) was followed from April 2007 until June 2008, using Petri dishes and exclosure cages. Populations of harvester ants were estimated by direct nest counts; rodent populations by Sherman live traps. Seed removal in dryland cereals, mainly by the harvester ant Messor barbarus, was high from mid April to mid October, and should cause a strong weed suppressive effect. Seed removal in irrigated cereals, mainly by the granivorous rodent Mus spretus, was low. Seed removal was higher in no-till than in conventional fields and corresponded to differences in harvester ant nest densities. Synthesis and applications. Our results show that tillage and irrigation in a semi-arid cereal production system result in a reduction and total annihilation of granivorous harvester ants, respectively.
The concurrent decline in weed seed mortality could lead to increased herbicide use and dependency. In particular, in areas where economic margins are small or the environmental costs of tillage and irrigation high, the increased costs of chemical weed control may exceed the benefits. Here, preserving biodiversity to enhance natural weed control is a viable alternative to agricultural intensification.
  • Authors:
    • Horwath, W.
    • Kallenbach, C.
    • Assa, J.
    • Burger, M.
  • Year: 2009
  • Authors:
    • Del Grosso, S. J.
    • Halvorson, A. D.
    • Alluvione, F.
  • Source: Journal of Environmental Quality
  • Volume: 38
  • Issue: 5
  • Year: 2009
  • Summary: Long-term effects of tillage intensity, N fertilization, and crop rotation on carbon dioxide (CO2) and methane (CH4) flux from semiarid irrigated soils are poorly understood. We evaluated the effects of (i) tillage intensity [no-till (NT) and conventional moldboard plow tillage (CT)] in a continuous corn rotation; (ii) N fertilization level [0-246 kg N ha(-1) for corn (Zea mays L.); 0 and 56 kg N ha(-1) for dry bean (Phaseolus vulgaris L.); 0 and 112 kg N ha(-1) for barley (Hordeum distichon L.)]; and (iii) crop rotation under NT soil management [corn-barley (NT-CB); continuous corn (NT-CC); corn-dry bean (NT-CDb)] on CO2 and CH4 flux from a clay loam soil. Carbon dioxide and CH4 fluxes were monitored one to three times per week using vented non-steady-state closed chambers. No-till reduced (14%) growing season (154 d) cumulative CO2 emissions relative to CT (NT 2.08 Mg CO2-C ha(-1); CT 2.41 Mg CO2-C ha(-1)), while N fertilization had no effect. Significantly lower (18%) growing season CO2 fluxes were found in NT-CDb than in NT-CC and NT-CB (11.4, 13.2, and 13.9 kg CO2-C ha(-1) d(-1), respectively). Growing season CH4 emissions were higher in NT (20.2 g CH4 ha(-1)) than in CT (1.2 g CH4 ha(-1)). Nitrogen fertilization and crop rotation did not affect CH4 flux. Implementation of NT for 7 yr with no N fertilization was not adequate for restoring the CH4 oxidation capacity of this clay loam soil relative to CT plowed and fertilized soil.
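The quoted 14% no-till reduction follows directly from the two cumulative totals in the abstract. A one-line sanity check (values from the abstract; names mine):

```python
# Verify the quoted NT vs. CT reduction in cumulative growing-season CO2 emissions.
ct_co2 = 2.41  # Mg CO2-C ha-1, conventional moldboard plow tillage
nt_co2 = 2.08  # Mg CO2-C ha-1, no-till

reduction_pct = (1 - nt_co2 / ct_co2) * 100
print(round(reduction_pct))  # 14
```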
  • Authors:
    • Lal, R.
    • Blanco-Canqui, H.
  • Source: Soil Science Society of America Journal
  • Volume: 73
  • Issue: 2
  • Year: 2009
  • Summary: Franzluebbers (2009) is right about the need for more intensive soil sampling, "repeated sampling with time," and "stratified sampling," as well as for the use of multiple fields and the collection of a larger number of pseudoreplicates to overcome the high field variability in soil organic carbon (SOC) pools within each Major Land Resource Area (MLRA). The selected fields were representative of each MLRA in terms of soil type, slope, and management, but it is correct that a single soil would not capture all the variability in soil and management for the whole MLRA. This study was not intended to relate the data from the single soil to the whole MLRA but rather to emphasize the differences in SOC sequestration rates among the three management systems within each soil.