• Authors:
    • Ruiz, J. C.
    • Vanderlinden, K.
    • Melero, S.
    • Madejon, E.
  • Source: The Journal of Agricultural Science
  • Volume: 147
  • Issue: 1
  • Year: 2009
  • Summary: Soil enzyme activities are widely used as rapid and sensitive indicators for discriminating among soil management effects. The objective of the present study was to compare the influence of conservation tillage, i.e. direct drilling (DD) (residue cover left on the soil surface), v. conventional tillage (CT) on soil chemical and biochemical properties in a crop rotation (cereals-sunflower-legumes) under dryland production in a semi-arid Mediterranean Vertisol after 23 years. A randomized experimental design was established. Soil biological status was evaluated by measuring enzymatic activities (dehydrogenase, beta-glucosidase, alkaline phosphatase and protease). Total organic carbon (TOC) contents were greater in soils managed by DD than in those managed by CT. Except for protease activity, enzymatic activity values were approximately 2-fold higher in soils under DD than in soils under CT. Beta-glucosidase, alkaline phosphatase and dehydrogenase values were highly correlated with TOC contents (from r=0.481 to r=0.886, P≤0.01) and with each other (from r=0.664 to r=0.923, P≤0.01). The coefficients of variation of biochemical properties were higher than those of chemical properties in both treatments. Principal component analysis (PCA) showed that two principal components explained 58% and 20% of the total variability. The first principal component was influenced mostly by beta-glucosidase, dehydrogenase and TOC, whereas the second was influenced by pH. The first component effectively differentiated the soils managed under the two agricultural practices. In general, long-term soil conservation management by DD in a dryland farming system improved the quality of this Vertisol by enhancing its organic matter content and biochemical activity.
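The summary above reports that two principal components explained 58% and 20% of the total variability. As a minimal sketch of what "variability explained" means in PCA (synthetic data, not the study's measurements; the property names in the comment are only illustrative):

```python
# Illustrative sketch: how PCA partitions total variance across components.
# Synthetic stand-in data, NOT the study's soil measurements.
import numpy as np

rng = np.random.default_rng(0)
# 40 samples x 5 hypothetical soil properties (e.g. TOC, dehydrogenase,
# beta-glucosidase, alkaline phosphatase, pH).
X = rng.normal(size=(40, 5))
X[:, 1] += 2 * X[:, 0]        # correlated properties load on one component
Xc = X - X.mean(axis=0)       # centre each property

# Eigen-decomposition of the covariance matrix yields the principal components;
# each eigenvalue's share of the eigenvalue sum is its fraction of variability.
cov = np.cov(Xc, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]   # sorted, largest first
explained = eigvals / eigvals.sum()       # fractions summing to 1

print(explained)   # the first entry dominates because of the built-in correlation
```

In the study, the first two such fractions were 0.58 and 0.20, with beta-glucosidase, dehydrogenase and TOC loading on the first component.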
  • Authors:
    • Eckard, R.
    • Henry, B.
  • Source: Tropical Grasslands
  • Volume: 43
  • Year: 2009
  • Summary: Agriculture is responsible for a significant proportion of total anthropogenic greenhouse gas emissions (perhaps 18% globally), and therefore has the potential to contribute to efforts to reduce emissions as a means of minimising the risk of dangerous climate change. The largest contributions to emissions are attributed to ruminant methane production and nitrous oxide from animal waste and fertilised soils. Further, livestock, including ruminants, are an important component of global and Australian food production and there is a growing demand for animal protein sources. At the same time as governments and the community strengthen objectives to reduce greenhouse gas emissions, there are growing concerns about global food security. This paper provides an overview of a number of options for reducing methane and nitrous oxide emissions from ruminant production systems in Australia, while maintaining productivity to contribute to both objectives. Options include strategies for feed modification, animal breeding and herd management, rumen manipulation and animal waste and fertiliser management. Using currently available strategies, some reductions in emissions can be achieved, but practical commercially available techniques for significant reductions in methane emissions, particularly from extensive livestock production systems, will require greater time and resource investment. Decreases in the levels of emissions from these ruminant systems (i.e., the amount of emissions per unit of product such as meat) have already been achieved. However, the technology has not yet been developed for eliminating production of methane from the rumen of cattle and sheep digesting the cellulose and lignin-rich grasses that make up a large part of the diet of animals grazing natural pastures, particularly in arid and semi-arid grazing lands. 
Nevertheless, the abatement that can be achieved will contribute significantly towards reaching greenhouse gas emission reduction targets.
  • Authors:
    • Graham, J.
    • Kelly, K.
    • Armstrong, R.
    • Phillips, F.
    • Officer, S.
  • Source: Proceedings of Greenhouse 2009
  • Year: 2009
  • Authors:
    • Wu, J. Q.
    • Singh, P.
    • Flury, M.
    • Schillinger, W. F.
    • Huggins, D. R.
    • Stoeckle, C. O.
    • Al-Mulla, Y. A.
  • Source: Applied Engineering in Agriculture
  • Volume: 25
  • Issue: 1
  • Year: 2009
  • Summary: Establishing winter wheat in the dryland Pacific Northwest requires soil water at the depth at which seeds are planted in the early fall. Usually, a soil mulch is created and maintained to conserve seed-zone water and to promote the early establishment of winter wheat. Unfortunately, the tillage used to create the soil mulch often results in unacceptable levels of wind erosion. Chemical (no-till) fallow (CF) and reduced-tillage fallow (RT) are two alternatives for reducing wind erosion, but their effectiveness in maintaining sufficient seed-zone water is unknown. Our objectives were to: (i) assess the effects of CF and RT on seed- and root-zone temperature and water; and (ii) test a model (Simultaneous Heat and Water, SHAW) for simulating management effects on soil temperature and water. Weather data, soil temperature, and water content were monitored in the CF and RT treatments. The RT treatment retained more seed-zone water over summer than CF, whereas during the wet winter CF gained more water than RT. Observed soil temperatures were higher in CF than in RT. SHAW-simulated water contents followed the trend of the field data, though the model slightly under-predicted soil water content for CF and over-predicted it for RT. We concluded that RT would provide more seed-zone water for winter wheat establishment than CF. In addition, the SHAW model proved adequate in simulating soil water and temperature, and may therefore serve as a useful modeling tool for evaluating tillage and residue management alternatives.
  • Authors:
    • Djurovic, D.
    • Dugalic, G.
    • Stevovic, V.
    • Paunovic, A.
    • Bokan, N.
  • Source: Agroznanje - Agro-knowledge Journal
  • Volume: 10
  • Issue: 3
  • Year: 2009
  • Summary: Average grain yields of maize, undoubtedly the most common field crop grown in the Balkans, are still significantly lower than its genetic and practical potential, and known cultural practices have not yet been applied sufficiently. Hence the constant need to conduct trials confirming the necessity of employing known technological practices in the cultivation of old and novel maize hybrids. The generally low average yields of maize grown under dryland conditions can be increased by available cultural practices, including the selection of drought-tolerant hybrids, adequate crop rotation, the most suitable tillage system and basic fertilization, optimal plant density, interrow cultivation and fertilization. The trial was set up as a randomized block design on leached alluvial soil. The following hybrids were studied: NS 50402, NS 540, ZP 570, ZP 580 and ZP 599, fertilized under three treatments: basic treatment (30 t/ha manure and 400 kg/ha of composite 15:15:15 fertilizer prior to sowing), N1 (250 kg/ha CAN) and N2 (500 kg/ha CAN). The average yield of dry maize grain was 9.65 t/ha. Averaged across hybrids, the low and high nitrogen application rates induced yield increases of 0.32 t/ha and 0.55 t/ha, respectively. The plot fertilized every second year with manure and composite mineral fertilizer gave a satisfactory yield of 9.36 t/ha. The highest average yield across all treatments, 10.61 t/ha, was produced by the ZP 580 hybrid. The above-average yield, achieved under non-irrigated conditions, was largely induced by combined organic and mineral fertilization, since a sufficient amount of readily available nutrients enabled the plants to better tolerate the drought conditions.
  • Authors:
    • Hunt, J. R.
    • Dalgliesh, N. P.
    • McCown, R. L.
    • Whish, J. P. M.
    • Robertson, M. J.
    • Foale, M. A.
    • Poulton, P. L.
    • Rees, H. van
    • Carberry, P. S.
    • Hochman, Z.
  • Source: Crop & Pasture Science
  • Volume: 60
  • Issue: 11
  • Year: 2009
  • Summary: Relevance to real-world agriculture has long been a rationale for crop simulation model development. However, because crop models are generally developed and tested against experimental data, and large systematic gaps are often reported between experimental and farmer yields, the relevance of simulated yields to the commercial yields of field crops may be questioned. This is the third paper in a series describing a substantial effort to deliver model-based decision support to Australian farmers. First, the performance of the cropping systems simulator APSIM in simulating commercial crop yields is reported across a range of field crops and agricultural regions. Second, actual case studies describe how APSIM gained credibility with farmers for their planning and decision making. Information was collated on APSIM performance in simulating the yields of over 700 commercial crops of barley, canola, chickpea, cotton, maize, mungbean, sorghum, sugarcane, and wheat monitored over the period 1992 to 2007 in all cropping regions of Australia. This evidence indicated that APSIM can predict the performance of commercial crops at a level close to that reported for its performance against experimental yields. Importantly, an essential requirement for simulating commercial yields across the Australian dryland cropping regions is to accurately describe the resources available to the crop being simulated, particularly soil water and nitrogen. Five case studies of using APSIM with farmers are described to demonstrate how model credibility was gained in each circumstance. The proposed process for creating mutual understanding and credibility involved dealing with the farmers' immediate questions, contextualising the simulations to the specific situation in question, providing simulation outputs in an iterative process, and jointly reviewing the ensuing seasonal results against the provided simulations.
This paper is distinct from many other reports testing the performance and utility of cropping systems models. Here, the measured yields are from commercial crops not experimental plots and the described applications were from real-life situations identified by farmers. A key conclusion, from 17 years of effort, is the proven ability of APSIM to simulate yields from commercial crops provided soil properties are well characterised. Thus, the ambition of models being relevant to real-world agriculture is indeed attainable, at least in situations where biotic stresses are manageable.
  • Authors:
    • Davis, R. A.
    • Huggins, D. R.
    • Cook, R. J.
    • Paulitz, T. C.
  • Source: Canadian Journal of Plant Pathology
  • Volume: 31
  • Issue: 4
  • Year: 2009
  • Summary: Fusarium crown rot of wheat (Triticum aestivum), caused by Fusarium pseudograminearum and Fusarium culmorum, is a yield-limiting disease in the dryland wheat-production area of the intermountain Pacific Northwest and is exacerbated in plants water-stressed by overfertilization with nitrogen (N). Plants with excess N deplete water from the soil profile more rapidly and become drought stressed prematurely. Traditionally a problem on winter wheat in summer fallow, this disease has become more important for spring wheat in continuous cropping areas managed for high grain protein levels. During 3 years of direct seeding (no till) near Pullman, Washington, we investigated whether a split application of N, with some applied the previous fall and some at planting, could limit the disease compared with all N applied in the spring and with no N as the check. We also investigated the influence of the previous (rotation) crop (winter and spring canola, Brassica rapa; barley, Hordeum vulgare; or peas, Pisum sativum) on disease, grain yield, grain protein concentration, and populations of Fusarium in the soil. Overall, the DNA concentration of F. culmorum was significantly greater than that of F. pseudograminearum, and F. culmorum was highest following spring barley. Disease severity and yield were consistently lower in the no-N treatments than in the other N treatments. The split application reduced disease in only 1 of 3 years. The all-spring application resulted in higher grain protein in 2 of 3 years compared with the split application, but yield was not affected. The previous crop had small but significant effects on disease, but these were not consistent from year to year and often interacted with the N treatment. Grain protein was higher in wheat after pea in 2 of 3 years. In conclusion, splitting of N had little effect on fusarium crown rot, probably because the N level in both treatments was conducive to disease development. 
Even when not a host species, the previous crop had little effect on subsequent disease, probably because Fusarium persists for more than one season as chlamydospores and in crop residue in this dry-summer climate.
  • Authors:
    • NASS
    • USDA
  • Year: 2009
  • Authors:
    • Hiatt, S.
    • Potter, C.
  • Source: Journal of Soil and Water Conservation
  • Volume: 64
  • Issue: 6
  • Year: 2009
  • Summary: The nonpoint source pollution model Soil and Water Assessment Tool (SWAT) was applied to understand management options that may improve water quality in the Laguna de Santa Rosa watershed in Sonoma County, California. Surface water quality in the Laguna watershed has been significantly impaired over recent years as natural land cover has been urbanized or converted to agricultural uses. We first generated new maps of land cover and major land uses for the watershed from satellite and airborne imagery. The SWAT model output was checked against six streamflow gauges in the watershed. At the monthly time step, the precalibrated model performed well at all gauges, with coefficient of determination (R²) values ranging from 0.81 to 0.92. Calibration by modifying groundwater extraction in the watershed notably increased correlation values at all gauges except at upstream locations on Santa Rosa Creek and Mark West Creek. Measured seasonal trends in sediment concentrations were tracked closely by the SWAT model predictions. In the model results, the highest sediment loading rates were associated with pasture, rangeland, and vineyard cover areas. Model scenarios were tested for vegetation filter strips and improved ground-cover conditions applied in subbasins where soil erosion was shown to be elevated in previous simulations.
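The SWAT evaluation above scores simulated monthly streamflow against gauge observations with the coefficient of determination. A minimal sketch of the standard R² calculation (the flow numbers below are hypothetical, not from the study's gauges):

```python
# Sketch of the coefficient of determination used to score simulated
# streamflow against observations. Data values are made up for illustration.
import numpy as np

def r_squared(observed, simulated):
    """R^2 = 1 - SS_res / SS_tot (standard definition)."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    ss_res = np.sum((observed - simulated) ** 2)          # residual sum of squares
    ss_tot = np.sum((observed - observed.mean()) ** 2)    # total sum of squares
    return 1.0 - ss_res / ss_tot

obs = np.array([3.2, 5.1, 8.4, 6.0, 2.7, 1.9])   # hypothetical monthly flows
sim = np.array([3.0, 5.5, 8.1, 6.3, 2.5, 2.2])   # hypothetical model output
print(round(r_squared(obs, sim), 3))
```

Values near 1 indicate close agreement; the study reports 0.81 to 0.92 for the precalibrated model at the monthly time step.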
  • Authors:
    • Science Applications International Corporation
  • Source: Bus Fleet Upgrade Projects
  • Year: 2009
  • Summary: This paper discusses the key issues with developing a GHG offsets methodology for bus fleet upgrades, including options for setting a performance threshold for identifying those projects that should receive credit. A performance standard sets a threshold emissions level that is significantly better than the average emissions performance for a specified service. In this case, we expect the threshold to be set by reference to the emissions performance of bus fleets. If a project for improving fleet performance has emissions that are equal to or better than the threshold, then the project would be considered to exceed the "business-as-usual" (BAU) performance and would be eligible for registration of emission reduction credits.