• Authors:
    • Sanderman, J.
  • Source: Agriculture, Ecosystems & Environment
  • Volume: 155
  • Year: 2012
  • Summary: In many important agricultural regions, soil inorganic carbon (SIC) stocks can rival the amount of carbon found in organic form. Land management practices, including irrigation, fertilization and liming, have the potential to greatly alter the soil inorganic carbon cycle, thus creating an important feedback to atmospheric CO2 concentrations. However, the current literature is unclear on the direction and magnitude of this feedback. Application of irrigation water, for example, can increase the rate of soil carbonate precipitation, but depending on the source of calcium and bicarbonate, the net reaction can be an atmospheric carbon sink, a carbon source or carbon neutral. Similarly, the accelerated dissolution of soil carbonates due to various acidifying processes can act as a net sink or source of atmospheric CO2 depending on the spatial and temporal frame of reference. While SIC stocks in agricultural soils have been found to increase or decrease by as much as 1.0 t C ha⁻¹ yr⁻¹, ascribing these stock changes to a net sink or net source activity is difficult, given the need to account for both the supply and fate of reactants and reaction products. This review provides an overview of the major inorganic carbon transformations in soils as affected by agricultural management, including the practice of liming to raise soil pH, and identifies when these transformations should be considered a net atmospheric carbon source or sink.
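  • Note: The sink-or-source ambiguity described in this abstract follows from the stoichiometry of carbonate precipitation (standard carbonate chemistry, added here for orientation; not figures from the paper):

```
Ca²⁺ + 2 HCO₃⁻  →  CaCO₃(s) + CO₂(g) + H₂O
```

    When the calcium and bicarbonate derive from silicate weathering (which consumes two CO₂ per Ca²⁺), one CO₂ remains locked in CaCO₃ and the cycle is a net sink. When the calcium comes from re-dissolved carbonate, the CO₂ consumed on dissolution is returned on precipitation (carbon neutral); when it comes from gypsum or applied lime, the CO₂ released on precipitation has no matching uptake (net source).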
  • Authors:
    • Dreccer, M. F.
    • Chenu, K.
    • Zheng, B. Y.
    • Chapman, S. C.
  • Source: GLOBAL CHANGE BIOLOGY
  • Volume: 18
  • Issue: 9
  • Year: 2012
  • Summary: Extreme climate, especially temperature, can severely reduce wheat yield. As global warming has already begun to increase mean temperature and the occurrence of extreme temperatures, it has become urgent to accelerate the 5-20 year process of breeding for new wheat varieties, to adapt to future climate. We analyzed the patterns of frost and heat events across the Australian wheatbelt based on 50 years of historical records (1960-2009) for 2864 weather stations. Flowering dates of three contrasting-maturity wheat varieties were simulated for a wide range of sowing dates in 22 locations for 'current' climate (1960-2009) and eight future scenarios (high and low CO 2 emission, dry and wet precipitation scenarios, in 2030 and 2050). The results highlighted the substantial spatial variability of frost and heat events across the Australian wheatbelt in current and future climates. As both 'last frost' and 'first heat' events would occur earlier in the season, the 'target' sowing and flowering windows (defined as risk less than 10% for frost (35°C) around flowering) would be shifted earlier by up to 2 and 1 month(s), respectively, in 2050. A short-season variety would require a shift in target sowing window 2-fold greater than long- and medium-season varieties by 2050 (8 vs. 4 days on average across locations and scenarios, respectively), but would suffer a lesser decrease in the length of the vegetative period (4 vs. 7 days). Overall, warmer winters would shorten the wheat season by up to 6 weeks, especially during preflowering. This faster crop cycle is associated with a reduced time for resource acquisition, and potential yield loss. As far as favourable rain and modern equipment would allow, early sowing and longer season varieties (i.e. in current climate) would be the best strategies to adapt to future climates.
  • Authors:
    • Grierson, P.
    • Goodwin, I.
    • D'Arrigo, R.
    • Cullen, L.
    • Allen, K.
    • Karoly, D. J.
    • Braganza, K.
    • Gallant, A. J. E.
    • Gergis, J.
    • McGregor, S.
  • Source: CLIMATIC CHANGE
  • Volume: 111
  • Issue: 3-4
  • Year: 2012
  • Summary: This study presents the first multi-proxy reconstruction of rainfall variability for the mid-latitude region of south-eastern Australia (SEA). A skilful rainfall reconstruction for the 1783-1988 period was possible using twelve annually-resolved palaeoclimate records from the Australasian region. An innovative Monte Carlo calibration and verification technique is introduced to provide the robust uncertainty estimates needed for reliable climate reconstructions. Our ensemble median reconstruction captures 33% of inter-annual and 72% of decadal variations in instrumental SEA rainfall observations. We investigate the stability of the relationship between regional SEA rainfall and the large-scale circulation associated with the El Nino-Southern Oscillation (ENSO) and the Inter-decadal Pacific Oscillation (IPO) over the past 206 years. We find evidence for a robust relationship between SEA rainfall, ENSO and the IPO over the 1840-1988 period. These relationships break down in the late 18th to early 19th century, coinciding with a known period of equatorial Pacific sea surface temperature (SST) cooling during one of the most severe periods of the Little Ice Age. In comparison to a markedly wetter late 18th/early 19th century containing 75% of sustained wet years, 70% of all reconstructed sustained dry years in SEA occur during the 20th century. In the context of the rainfall estimates introduced here, there is a 97.1% probability that the decadal rainfall anomaly recorded during the 1998-2008 'Big Dry' is the worst experienced since the first European settlement of Australia.
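  • Note: The Monte Carlo calibration-and-verification idea named in this abstract can be sketched as follows. This is a generic illustration on synthetic data; the variable names, the 2/3-1/3 random split and the simple least-squares transfer function are assumptions for the sketch, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies, n_iter = 100, 12, 500

# Synthetic stand-ins: 12 proxy series loosely tracking an instrumental rainfall target.
rainfall = rng.normal(600, 120, n_years)
proxies = (rainfall[:, None] * rng.uniform(0.5, 1.5, n_proxies)
           + rng.normal(0, 200, (n_years, n_proxies)))

recons = np.empty((n_iter, n_years))
skill = np.empty(n_iter)
X = np.column_stack([proxies, np.ones(n_years)])  # design matrix with intercept
for i in range(n_iter):
    # Randomly split years into calibration (2/3) and verification (1/3) sets.
    idx = rng.permutation(n_years)
    cal, ver = idx[:2 * n_years // 3], idx[2 * n_years // 3:]
    # Transfer function fitted on the calibration years only.
    coef, *_ = np.linalg.lstsq(X[cal], rainfall[cal], rcond=None)
    recons[i] = X @ coef
    # Verification skill: squared correlation on the withheld years.
    skill[i] = np.corrcoef(recons[i][ver], rainfall[ver])[0, 1] ** 2

# Ensemble median reconstruction with percentile uncertainty bounds.
median_recon = np.median(recons, axis=0)
lo, hi = np.percentile(recons, [2.5, 97.5], axis=0)
print(f"median verification R^2: {np.median(skill):.2f}")
```

    Repeating the calibration/verification split many times is what yields the ensemble median reconstruction and its uncertainty envelope, rather than a single fit with a single skill score.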
  • Authors:
    • Denham, A. J.
    • Auld, T. D.
    • Ooi, M. K. J.
  • Source: PLANT AND SOIL
  • Volume: 353
  • Issue: 1-2
  • Year: 2012
  • Summary: Background and aims: Understanding the mechanistic effects of climate change on species' key life-history stages is essential for predicting ecological responses. In fire-prone regions, long-term seed banks allow post-fire recovery and persistence of plant populations. For physically dormant species, seed bank longevity depends on the maintenance of dormancy, which is controlled primarily by temperature. Successful inter-fire recruitment is rare, and dormancy loss between fires produces a net loss to the seed bank. We assessed whether temperature increases related to climate change can affect seed dormancy and, potentially, seed bank longevity. Methods: We quantified the relationship between air temperatures and soil temperatures. Seeds of two shrub species, from four populations along an altitudinal gradient, were then exposed to a range of soil temperatures calculated to occur at the end of the 21st century, using projected mean and heat wave scenarios. Alterations to dormancy were assessed via germination. Results: For every 1°C increase in air temperature, associated soil temperature increased by 1.5°C. Mean temperature increase had no effect on seed dormancy. However, future heat wave conditions produced soil temperatures that significantly increased dormancy loss. This impact was greatest in seeds from cooler, high-elevation populations. Conclusions: Projected heat wave events produce conditions that provide a mechanism for seed bank compromise. Dormancy-breaking temperatures for each population were positively related to parental environment temperatures, indicating local adaptation. Whilst heat from fire may govern post-fire recruitment response, we suggest that parental climate is the key selective force determining dormancy-breaking threshold temperatures, ensuring inter-fire seed bank persistence.
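  • Note: The reported soil response of 1.5°C per 1°C of air warming is a simple linear scaling; a minimal sketch (the baseline soil temperature and warming values are illustrative, not the paper's data):

```python
# Empirical ratio from the abstract: soil temperature rises ~1.5 °C
# for every 1 °C increase in air temperature.
SOIL_PER_AIR = 1.5

def projected_soil_temp(current_soil_c, air_warming_c):
    """Project soil temperature under a given air-temperature increase."""
    return current_soil_c + SOIL_PER_AIR * air_warming_c

# Illustrative values: a 3 °C air warming amplifies to a 4.5 °C soil warming.
print(projected_soil_temp(40.0, 3.0))  # 44.5
```

    The amplification is why heat wave scenarios, rather than mean warming, pushed soil temperatures past dormancy-breaking thresholds in the study.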
  • Authors:
    • Scanlan, J. C.
    • Stokes, C. J.
    • Webb, N. P.
  • Source: CLIMATIC CHANGE
  • Volume: 112
  • Issue: 3-4
  • Year: 2012
  • Summary: There is an increasing need to understand what makes vegetation at some locations more sensitive to climate change than others. For savanna rangelands, this requires building knowledge of how forage production in different land types will respond to climate change, and identifying how location-specific land type characteristics, climate and land management control the magnitude and direction of its responses to change. Here, a simulation analysis is used to explore how forage production in 14 land types of the north-eastern Australian rangelands responds to three climate change scenarios: +3°C, +17% rainfall; +2°C, -7% rainfall; and +3°C, -46% rainfall. Our results demonstrate that the controls on forage production responses are complex, with functional characteristics of land types interacting to determine the magnitude and direction of change. Forage production may increase by up to 60% or decrease by up to 90% in response to the extreme scenarios of change. The magnitude of these responses is dependent on whether forage production is water or nitrogen (N) limited, and how climate changes influence these limiting conditions. Forage production responds most to changes in temperature and moisture availability in land types that are water-limited, and shows the least amount of change when growth is restricted by N availability. The fertilisation effects of doubled atmospheric CO2 were found to offset declines in forage production under 2°C warming and a 7% reduction in rainfall. However, rising tree densities and declining land condition are shown to reduce potential opportunities from increases in forage production and raise the sensitivity of pastures to climate-induced water stress. Knowledge of these interactions can be applied in engaging with stakeholders to identify adaptation options.
  • Authors:
    • Bellotti, B.
    • Ridoutt, B.
    • Page, G.
  • Source: Journal of Cleaner Production
  • Volume: 32
  • Year: 2012
  • Summary: There is growing interest in carbon footprints of products, but for horticulture water use can also be important, hence we studied both for fresh tomatoes supplied to the Sydney market. Carbon and water footprints for each kg of fresh tomato supplied to Sydney depend on the season and the type of production system (ranging from 0.39 to 1.97 kg CO2e and 5 to 53 L). Energy use of the systems was also reported, ranging from 6.16 to 27.42 MJ for each kg of fresh tomato supplied to Sydney. Tradeoffs exist within the studied production systems, such that a system with a higher carbon footprint had a lower water footprint; this complicates setting priorities for overall environmental improvement. To address this limitation, life cycle impacts of greenhouse gas (GHG) emissions and water use were subsequently modelled using endpoint indicators and compared. The results indicated that in all cases the climate change impacts were most important, representing 84-96% of the combined scores for damage to the environment. As such, the vegetable industry's priority to reduce GHG emissions is confirmed. In the case of field production, transportation of tomatoes to market was the hotspot in the carbon footprint, while for the medium- and high-technology greenhouses it was artificial heating. Although the results indicated priority should go to reducing the carbon footprint, further development and harmonisation of LCA impact assessment models for water use at the endpoint level is considered essential.
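  • Note: The endpoint comparison in this abstract reduces to a weighted aggregation of the two footprints; a minimal sketch (the characterisation factors below are illustrative placeholders, not the study's values):

```python
# Convert a product's carbon and water footprints into comparable endpoint
# damage scores, then take each impact's share of the combined score.
# Both factors are assumed placeholders, not the study's characterisation factors.
CLIMATE_FACTOR = 1.0e-3  # damage units per kg CO2e (assumed)
WATER_FACTOR = 1.0e-6    # damage units per L of water (assumed)

def climate_share(carbon_kg_co2e, water_l):
    """Fraction of the combined endpoint score attributable to climate change."""
    climate = carbon_kg_co2e * CLIMATE_FACTOR
    water = water_l * WATER_FACTOR
    return climate / (climate + water)

# e.g. a field-grown tomato at 0.39 kg CO2e and 53 L per kg of product
share = climate_share(0.39, 53.0)
print(f"climate change: {share:.0%} of combined score")
```

    Aggregating at the endpoint level is what lets the study rank a system that trades a higher carbon footprint for a lower water footprint against one that trades the other way.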
  • Authors:
    • Scheer,Clemens
    • Grace,Peter R.
    • Rowlings,David W.
    • Payero,Jose
  • Source: Plant and Soil
  • Volume: 359
  • Issue: 1-2
  • Year: 2012
  • Summary: Irrigation management affects soil water dynamics as well as soil microbial carbon and nitrogen turnover, and potentially the biosphere-atmosphere exchange of greenhouse gases (GHG). We present a study on the effect of three irrigation treatments on the emissions of nitrous oxide (N2O) from irrigated wheat on black Vertisols in south-eastern Queensland, Australia. Soil N2O fluxes from wheat were monitored over one season with a fully automated system that measured emissions on a sub-daily basis. Measurements were taken from 3 subplots for each treatment within a randomized split-plot design. The highest N2O emissions occurred after rainfall or irrigation, and the amount of irrigation water applied was found to influence the magnitude of these "emission pulses". Daily N2O emissions varied from -0.74 to 20.46 g N2O-N ha⁻¹ day⁻¹, resulting in seasonal losses ranging from 0.43 to 0.75 kg N2O-N ha⁻¹ season⁻¹ for the different irrigation treatments. Emission factors (EF = proportion of N fertilizer emitted as N2O) over the wheat cropping season, uncorrected for background emissions, ranged from 0.2 to 0.4% of total N applied for the different treatments. The highest seasonal N2O emissions were observed in the treatment with the highest irrigation intensity; however, the N2O intensity (N2O emission per unit crop yield) was highest in the treatment with the lowest irrigation intensity. Our data suggest that the timing and amount of irrigation can effectively be used to reduce N2O losses from irrigated agricultural systems; however, in order to develop sustainable mitigation strategies, the N2O intensity of a cropping system is an important concept that needs to be taken into account.
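  • Note: The two metrics contrasted in this abstract, emission factor and N2O intensity, are simple ratios; a minimal sketch (the fertiliser rate and yield below are assumed example values, not the study's):

```python
def emission_factor(seasonal_n2o_n_kg_ha, n_applied_kg_ha):
    """EF: share of applied N emitted as N2O-N, uncorrected for background (%)."""
    return 100.0 * seasonal_n2o_n_kg_ha / n_applied_kg_ha

def n2o_intensity(seasonal_n2o_n_kg_ha, yield_t_ha):
    """N2O intensity: emission per unit of crop yield (kg N2O-N per t grain)."""
    return seasonal_n2o_n_kg_ha / yield_t_ha

# Seasonal losses in the study ranged 0.43-0.75 kg N2O-N/ha; the 180 kg N/ha
# application rate and the 2 t/ha yield here are illustrative assumptions.
print(emission_factor(0.43, 180.0))  # ~0.24, within the reported 0.2-0.4 % range
print(n2o_intensity(0.43, 2.0))
```

    The two ratios can rank treatments differently: a low-irrigation treatment can emit less per hectare yet more per tonne of grain if the yield penalty outweighs the emission saving, which is the study's argument for tracking intensity alongside EF.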
  • Authors:
    • Thomas, M.
    • Sanderman, J.
    • Chappell, A.
    • Read, A.
    • Leslie, C.
  • Source: Global Change Biology
  • Volume: 18
  • Issue: 6
  • Year: 2012
  • Summary: Anthropogenically induced change in soil redistribution plays an important role in the soil organic carbon (SOC) budget. Uncertainty of its impact is large because of the dearth of recent soil redistribution estimates concomitant with changing land use and management practices. An Australian national survey used the artificial radionuclide caesium-137 (137Cs) to estimate net (1950s-1990) soil redistribution. South-eastern Australia showed a median net soil loss of 9.7 t ha⁻¹ yr⁻¹. We resurveyed the region using the same 137Cs technique and found a median net (1990-2010) soil gain of 3.9 t ha⁻¹ yr⁻¹, with an interquartile range from -1.6 to +10.7 t ha⁻¹ yr⁻¹. Despite this variation, soil erosion across the region has declined, a likely consequence of the widespread adoption of soil conservation measures over the last ca. 30 years. The implication of omitting soil redistribution dynamics from SOC accounting is to increase uncertainty and diminish its accuracy.
  • Authors:
    • Finlay, L. A.
    • Hulugalle, N. R.
    • Weaver, T. B.
  • Source: Renewable Agriculture and Food Systems
  • Volume: 27
  • Issue: 2
  • Year: 2012
  • Summary: Cover crops in minimum- or no-till systems are usually killed by applying one or more herbicides, thus significantly increasing costs. Applying herbicides at lower rates with mechanical interventions that do not disturb or bury cover crop residues can, however, reduce costs. Our objective was to develop a management system with the above-mentioned features for prostrate cover crops on permanent beds in an irrigated Vertisol. The implement developed consisted of a toolbar to which were attached spring-loaded pairs of parallel coulter discs, with one set of nozzles between the individual coulter discs directing a contact herbicide to the bed surfaces to kill the cover crop, and a second set of nozzles located to direct the cheaper glyphosate to the furrow to kill weeds. The management system killed a prostrate cover crop with less trafficking, and reduced the use of more toxic herbicides, the carbon footprint, labor and risk to operators. The maximum depth of compaction was greater than with the boom sprayer control, but the average increase in compaction was less.
  • Authors:
    • Cai, L.
    • Padovan, B.
    • Lee, B.
    • Ren, Y. L.
  • Source: Pest Management Science
  • Volume: 68
  • Issue: 2
  • Year: 2012
  • Summary: BACKGROUND: Methyl bromide is being phased out for use on stored commodities, as it is listed as an ozone-depleting substance, and phosphine is the fumigant widely used on grains. However, phosphine resistance occurs worldwide, and phosphine fumigation requires a long exposure period and temperatures of >15°C. There is an urgent requirement for the development of a fumigant that kills insects quickly and for phosphine resistance management. This paper reports on a new fumigant formulation of 95% ethyl formate plus 5% methyl isothiocyanate as an alternative fumigant for stored grains. RESULTS: The formulation is stable for at least 4 months of storage at 45°C. A laboratory bioassay with the formulation showed that it controlled all stages of Sitophilus oryzae (L.), Sitophilus granarius (L.), Tribolium castaneum (Herbst), Rhyzopertha dominica (F.), Trogoderma variabile Ballion and Callosobruchus maculatus (Fabricius) in infested wheat, barley, oats and peas at 80 mg L⁻¹ for 5 days, and in canola at both 40 mg L⁻¹ for 5 days and 80 mg L⁻¹ for 2 days at 25 ± 2°C. After an 8-14 day holding period, residues of ethyl formate and methyl isothiocyanate in wheat, barley, peas and canola were below the experimental permit levels of 1.0 and 0.1 mg kg⁻¹, respectively. However, fumigated oats needed an 18 day holding period. CONCLUSIONS: The findings suggest that the ethyl formate plus methyl isothiocyanate formulation has potential as a fumigant for the control of stored-grain insect pests in various commodities.