• Authors:
    • Lugnot, M.
    • Decuq, C.
    • Garnier, J.
    • Vilain, G.
  • Source: Nutrient Cycling in Agroecosystems
  • Volume: 98
  • Issue: 2
  • Year: 2014
  • Summary: Nitrous oxide is produced in soils and sediments essentially through the processes of nitrification and denitrification, although several rival processes may compete with them. This study was undertaken to better understand the controlling factors of nitrification, denitrification and the associated N2O production, as well as the contribution of these two processes to the average N2O production by soils and sediments. With this aim, soil and sediment samples were taken in contrasting periods and under different land use types, each time at different depths (upper and lower soil horizons). They were incubated in separate batches under conditions designed to promote denitrification or nitrification: (1) a completely anaerobic environment with KNO3 added for the denitrification assay and (2) an aerobic environment (21% O2) with NH4Cl added for the nitrification assay. Nitrification and denitrification potentials were determined from the rates of nitrate either reduced (denitrification) or produced (nitrification). Overall, denitrification potential varied from 70 to 2,540 ng NO3⁻-N g⁻¹ dry soil h⁻¹ and nitrification potential from 30 to 1,150 ng NO3⁻-N g⁻¹ dry soil h⁻¹. Nitrous oxide production by denitrification was significantly (P < 0.05) greater in topsoils (10-30 cm) than in subsoils (90-110 cm), ranging from 26 to 250 ng N2O-N g⁻¹ dry soil h⁻¹ versus 1.5 to 31 ng N2O-N g⁻¹ dry soil h⁻¹, i.e., a mean of 19.5 versus 6.0% of the NO3⁻ denitrified for the upper and lower horizons, respectively. Considering N2O production in relation to nitrate production (i.e., nitrification), no significant difference (P < 0.05) was found along the soil profile, with values ranging from 0.03 to 6 ng N2O-N g⁻¹ dry soil h⁻¹. This production accounts for 0.21 and 0.16% of the mean NO3⁻ produced for the topsoils and subsoils, respectively. On the basis of the average production by both top- and subsoils, N2O production by denitrification is clearly greater than by nitrification under the measurement conditions used in this study, by 30- to 100-fold. Such a high potential for N2O emission must be taken into account when an increase in denitrification is planned as a curative measure to reduce nitrate contamination, e.g., in the rehabilitation or construction of wetlands.
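  • Illustrative sketch: the Python snippet below (not the authors' code) shows how a potential rate and the share of transformed NO3⁻-N emitted as N2O-N can be derived from such batch incubations, assuming a linear NO3⁻ change over the incubation; the nitrate values are hypothetical, chosen only so that the N2O yield mirrors the topsoil mean reported above.

```python
# Hedged sketch, not the authors' code: potential rates and N2O yield ratios
# from batch incubations, assuming a linear NO3- change over the incubation
# and all quantities expressed per g dry soil.

def potential_rate(no3_start_ng_per_g, no3_end_ng_per_g, hours):
    """Potential rate in ng NO3--N g-1 dry soil h-1 (negative when NO3- is
    consumed, i.e. denitrification; positive when produced, i.e. nitrification)."""
    return (no3_end_ng_per_g - no3_start_ng_per_g) / hours

def n2o_yield_percent(n2o_rate_ng_per_g_h, no3_rate_ng_per_g_h):
    """Share of the transformed NO3--N emitted as N2O-N, in percent."""
    return 100.0 * n2o_rate_ng_per_g_h / abs(no3_rate_ng_per_g_h)

# Illustrative values only (the initial NO3- level after KNO3 addition is made up):
denit_rate = potential_rate(50_000, 50_000 - 1_280 * 24, hours=24)  # -1,280 ng g-1 h-1
n2o_rate = 250                                                       # ng N2O-N g-1 h-1
print(f"{n2o_yield_percent(n2o_rate, denit_rate):.1f} % of denitrified NO3- as N2O")  # ~19.5 %
```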
  • Authors:
    • Billen, G.
    • Anglade, J.
    • Garnier, J.
    • Benoit, M.
  • Source: Nutrient Cycling in Agroecosystems
  • Volume: 100
  • Issue: 3
  • Year: 2014
  • Summary: In the Seine Basin, characterised by intensive arable cropping, most of the surface water and groundwater is contaminated by nitrate (NO3⁻). The goal of this study was to investigate nitrogen leaching on commercial arable crop farms in five organic and three conventional systems. In 2012-2013, a total of 37 fields were studied across eight arable crop rotations under three different soil and climate conditions. Our results show a gradient of soil solution concentrations as a function of crop, lower for alfalfa (mean 2.8 mg NO3-N l⁻¹) and higher for crops fertilised after legumes (15 mg NO3-N l⁻¹). Catch crops decrease nitrate soil solution concentrations to below 10 mg NO3-N l⁻¹. Over a full rotation, the estimated mean concentration is lower for organic farming (12 ± 5 mg NO3-N l⁻¹) than for conventional farming (24 ± 11 mg NO3-N l⁻¹), albeit with a large range of variability. Overall, organic farming shows lower leaching rates (14-50 kg NO3-N ha⁻¹) than conventional farms (32-77 kg NO3-N ha⁻¹). Taking into account the slightly lower productivity of organic systems, we show that yield-scaled leaching values are also lower for organic (0.2 ± 0.1 kg N kg⁻¹ N year⁻¹) than for conventional systems (0.3 ± 0.1 kg N kg⁻¹ N year⁻¹). Overall, we show that organic farming systems have a lower impact on N leaching than conventional farming, although there is still room for progress in both systems on commercial farms.
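  • Illustrative sketch: yield-scaled leaching as reported above is simply the leaching loss divided by the nitrogen exported in harvested products; the snippet below (not from the paper) uses hypothetical numbers chosen to fall within the ranges given in the abstract.

```python
# Hedged sketch: yield-scaled N leaching = leached N / N removed in harvest.

def yield_scaled_leaching(leaching_kg_n_ha_yr, harvested_n_kg_n_ha_yr):
    """kg N leached per kg N harvested per year."""
    return leaching_kg_n_ha_yr / harvested_n_kg_n_ha_yr

# Hypothetical rotation-scale values consistent with the reported ranges:
organic = yield_scaled_leaching(30, 150)        # ~0.2 kg N kg-1 N yr-1
conventional = yield_scaled_leaching(55, 185)   # ~0.3 kg N kg-1 N yr-1
print(f"organic: {organic:.2f}, conventional: {conventional:.2f}")
```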
  • Authors:
    • de Cortazar-Atauri, I. G.
    • Huard, F.
    • Bourgeois, G.
    • Caubel, J.
    • Launay, M.
    • Bancal, M. O.
    • Brisson, N.
  • Source: Agriculture, Ecosystems & Environment
  • Volume: 197
  • Year: 2014
  • Summary: Since weather has a major influence on the occurrence and development of crop diseases, assessment tools that evaluate fungal disease pressure and regional crop suitability under projected future climatic conditions provide valuable insight for agricultural planning. The aim of this study was to develop two climatic indicators, the average infection efficiency (AIE) and the number of infection days (NID), to quantify the potential effects of weather on the intensity and occurrence of pathogen infection. First, a simple, continuous infection function accounting for daily variations in temperature and leaf wetness duration was implemented. The function was then parameterized from published data sets for five major, contrasting fungal diseases affecting crops in northern France: phoma of oilseed rape, late blight of potato, downy mildew of grapevine, leaf rust of wheat and net blotch of barley. Finally, AIE and NID were calculated for the recent past (1970-2000) and for the A1B climate scenario (2070-2100). An overall decrease in the risk of infection was shown for late blight of potato and downy mildew of grapevine for all months of the period when the host plant is susceptible to infection. Differences were larger for the other three diseases, depending on the balance between warmer temperatures and lower humidity. The future climate would result in a later onset of disease and higher infection pressure in late autumn. In spring, for leaf rust of wheat and net blotch of barley, the climatic risk of infection is expected to occur earlier but would result in lower infection pressure in May. These findings highlight the need to use an infra-annual (monthly or seasonal) scale to achieve a relevant analysis of the impact of climate change on infection risk. The described indicators can easily be adapted to other pathogens and may be useful for agricultural planning at the regional scale and in the medium term, when decision support tools are required to anticipate future trends and the associated risks of crop diseases.
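  • Illustrative sketch: indicators of this kind can be built from any daily infection-efficiency function of temperature and leaf wetness duration; the snippet below is a generic sketch with illustrative response shapes and parameters, not the functions fitted in the paper.

```python
# Generic sketch of a daily infection-efficiency indicator driven by mean
# temperature and leaf wetness duration; response shapes and parameters are
# illustrative assumptions, not the values parameterized in the study.
def temperature_response(t, t_min=5.0, t_opt=18.0, t_max=30.0):
    """Beta-type response: 0 outside [t_min, t_max], maximal (1.0) at t_opt."""
    if t <= t_min or t >= t_max:
        return 0.0
    return (((t_max - t) / (t_max - t_opt)) *
            ((t - t_min) / (t_opt - t_min)) ** ((t_opt - t_min) / (t_max - t_opt)))

def wetness_response(w_hours, w_min=4.0, w_opt=12.0):
    """Linear increase between a minimum and an optimal wetness duration."""
    return min(max((w_hours - w_min) / (w_opt - w_min), 0.0), 1.0)

def daily_infection_efficiency(t_mean, wetness_hours):
    return temperature_response(t_mean) * wetness_response(wetness_hours)

# Aggregating daily values into the two indicators for a (hypothetical) month:
days = [(14.0, 10.0), (17.0, 6.0), (22.0, 2.0)]          # (t_mean, wetness hours)
effs = [daily_infection_efficiency(t, w) for t, w in days]
aie = sum(effs) / len(effs)                               # average infection efficiency
nid = sum(e > 0 for e in effs)                            # number of infection days
print(aie, nid)
```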
  • Authors:
    • Aoun, W. B.
    • El-Akkari, M.
    • Gabrielle, B.
    • Flenet, F.
  • Source: Proceedings of the 9th International Conference on Life Cycle Assessment in the Agri-Food Sector
  • Year: 2014
  • Summary: Nitrogen fertilization practices have a significant effect on the LCA results of biodiesel chains, which warrants reliable inventory data. In this study, focused on the Lorraine region (eastern France), we established a typology of oilseed rape fields based on fertilization practices and used the agro-ecosystem model CERES-EGC, in lieu of generic emission factors, to simulate the productivity and externalities associated with oilseed rape farming. The results were subsequently used to generate an LCA of biodiesel from oilseed rape. We also tested the effect of improved practices on the LCA results. In Lorraine, oilseed rape crops appeared to be frequently over-fertilized compared to best management practices. Switching to improved practices with optimal fertilization has the potential to reduce the GWP of 1 megajoule of biodiesel by around 6 g CO2-eq, against a total life-cycle value of 43.9 g CO2-eq.
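  • Quick arithmetic check (illustrative, using only the two figures quoted in the abstract):

```python
# Reported potential saving of ~6 g CO2-eq per MJ of biodiesel, expressed
# relative to the 43.9 g CO2-eq MJ-1 life-cycle total given in the abstract.
saving_g_per_mj, total_g_per_mj = 6.0, 43.9
print(f"relative GWP reduction: {saving_g_per_mj / total_g_per_mj:.1%}")  # ~13.7 %
```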
  • Authors:
    • Goge, Fabien
    • Gomez, Cecile
    • Jolivet, Claudy
    • Joffre, Richard
  • Source: Geoderma
  • Volume: 213
  • Year: 2014
  • Summary: Numerous studies on the prediction of soil properties from visible and near-infrared spectroscopy, based either on large libraries at country scale or on small soil libraries at local scale, have been reported in the literature. However, difficulties appear when large libraries are used to estimate the soil properties of a small area. The aim of this paper was to compare various strategies for predicting the soil properties of local samples using a French national database. Models were built (i) from the national database alone and (ii) from the national database spiked with subsets of the local database. Two regression methods were tested: partial least squares (PLS) and a local regression method (fast Fourier transform local weighted regression, FFT-LW). No general rule emerged from this study, as the best strategy differed according to the property under study. It seems that when strong spectral features are related to the property under study (as for CaCO3 content), the addition of local samples does not bring a decisive advantage over calibration based on a wide national database. Three important and encouraging points of this work should be emphasized: (i) the added value brought by the national library for the prediction of some soil properties over a local area, (ii) the pertinence of spiking the national database with local samples to reach accurate predictions, and (iii) the value of the non-linear FFT-LW method. As we examined only one local site, with particular land-use and geological characteristics, further research is needed to elucidate how these results depend on the intrinsic properties of the local site samples and on the relationship between spectral features and the soil properties considered.
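  • Illustrative sketch: the "spiking" strategy compared in the paper can be prototyped in a few lines; the snippet below is a minimal sketch using synthetic arrays in place of the national and local spectral libraries and scikit-learn's PLSRegression for the PLS step (the FFT-LW local regression used in the paper is not reproduced here).

```python
# Hedged sketch: national-only calibration vs. national library spiked with
# a subset of local samples, evaluated on held-out local samples.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Synthetic stand-ins: spectra (rows) x wavelengths (columns) and a soil property.
X_national, y_national = rng.normal(size=(2000, 200)), rng.normal(size=2000)
X_local, y_local = rng.normal(size=(60, 200)), rng.normal(size=60)

# Strategy (i): calibrate on the national library alone.
pls_national = PLSRegression(n_components=10).fit(X_national, y_national)

# Strategy (ii): spike the national library with a subset of local samples.
spike = rng.choice(len(X_local), size=20, replace=False)
X_spiked = np.vstack([X_national, X_local[spike]])
y_spiked = np.concatenate([y_national, y_local[spike]])
pls_spiked = PLSRegression(n_components=10).fit(X_spiked, y_spiked)

# Compare both models on the remaining local samples.
test = np.setdiff1d(np.arange(len(X_local)), spike)
for name, model in [("national", pls_national), ("spiked", pls_spiked)]:
    rmse = mean_squared_error(y_local[test], model.predict(X_local[test]).ravel()) ** 0.5
    print(name, rmse)
```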
  • Authors:
    • Walter, C.
    • Viaud, V.
    • Michot, D.
    • McBratney, A.
    • Minasny, B.
    • Lacoste, M.
  • Source: Geoderma
  • Volume: 213
  • Year: 2014
  • Summary: Soil organic carbon (SOC) is a key element of agroecosystem functioning and has a crucial impact on global carbon storage. At the landscape scale, SOC spatial variability is strongly affected by natural and anthropogenic processes and by linear anthropogenic elements (such as hedges or ditches). This study aims to map the distribution of SOC stocks in the A-horizons down to a depth of 105 cm, at high spatial resolution, for a 10 km² heterogeneous agricultural landscape (north-western France). We used a data mining tool, Cubist, to build rule-based predictive models and predict SOC content and soil bulk density (BD) from a calibration dataset at 8 standard layers (0-7.5, 7.5-15, 15-30, 30-45, 45-60, 60-75, 75-90 and 90-105 cm). For model calibration, 70 sampling locations were selected within the whole study area using the conditioned Latin hypercube sampling method. Two independent validation datasets were used to assess the performance of the predictive models: (i) at the landscape scale, 49 sampling locations were selected using stratified random sampling based on a 300-m square grid; (ii) in the vicinity of hedges, 112 sampling locations were selected along transects perpendicular to 14 purposively chosen hedges. Undisturbed samples were collected at fixed depths and analysed for BD and SOC content at each sampling location, and continuous soil profiles were reconstructed using equal-area splines. Predictive environmental data consisted of attributes derived from a light detection and ranging digital elevation model (LiDAR DEM), geological variables, land use data and a predictive map of A-horizon thickness. For the two validation datasets (landscape scale and hedge vicinity), root mean square errors (RMSE) of the predictions, computed over all standard soil layers (down to 105 cm), were respectively 7.74 and 5.02 g kg⁻¹ for SOC content, and 0.15 and 0.21 g cm⁻³ for BD. The best predictions were obtained for layers between 15 and 60 cm depth. SOC stocks were calculated over a depth of 105 cm by combining the predictions of SOC content and BD. The final maps show that carbon stocks in the soil below 30 cm accounted for 33% of the total SOC stocks on average. The whole method produced consistent results between the two predicted soil properties. The final SOC stock maps provide continuous data along the soil profile down to 105 cm, which may be critical information for supporting carbon policy and management decisions.
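  • Illustrative sketch: layer-wise SOC stocks are obtained by combining the two predicted properties (SOC content and bulk density) with layer thickness; the snippet below is a simplified sketch over the eight standard layers, with a hypothetical profile (only the layer boundaries come from the abstract).

```python
# Hedged sketch: SOC stock per layer (t C ha-1) = content (g kg-1)
#   x bulk density (g cm-3) x thickness (cm) x 0.1 (unit conversion).
LAYERS_CM = [(0, 7.5), (7.5, 15), (15, 30), (30, 45), (45, 60), (60, 75), (75, 90), (90, 105)]

def layer_stock_t_ha(soc_g_kg, bd_g_cm3, top_cm, bottom_cm):
    """SOC stock of one layer in t C ha-1."""
    return soc_g_kg * bd_g_cm3 * (bottom_cm - top_cm) * 0.1

# Hypothetical profile: SOC content (g kg-1) and BD (g cm-3) for each standard layer.
soc = [28, 26, 20, 9, 6, 4, 3, 2]
bd = [1.25, 1.30, 1.40, 1.45, 1.50, 1.55, 1.55, 1.60]

stocks = [layer_stock_t_ha(c, d, top, bot) for c, d, (top, bot) in zip(soc, bd, LAYERS_CM)]
total = sum(stocks)
below_30 = sum(s for s, (top, _) in zip(stocks, LAYERS_CM) if top >= 30)
print(f"total 0-105 cm: {total:.1f} t C/ha, share below 30 cm: {below_30 / total:.0%}")
```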
  • Authors:
    • Siegfried, W.
    • Rohr, C.
    • Riemann, D.
    • Retso, D.
    • Pribyl, K.
    • Nordl, O.
    • Litzenburger, L.
    • Limanowka, D.
    • Labbe, T.
    • Kotyza, O.
    • Kiss, A.
    • Himmelsbach, I.
    • Glaser, R.
    • Dobrovolny, P.
    • Contino, A.
    • Camenisch, C.
    • Burmeister, K.
    • Brazdil, R.
    • Bieber, U.
    • Barriendos, M.
    • Alcoforado, M.
    • Luterbacher, J.
    • Gruenewald, U.
    • Herget, J.
    • Seneviratne, S.
    • Wagner, S.
    • Zorita, E.
    • Werner, J.
    • Pfister, C.
    • Wetter, O.
    • Soderberg, J.
    • Spring, J.
  • Source: Climatic Change
  • Volume: 125
  • Issue: 3-4
  • Year: 2014
  • Summary: The heat waves of 2003 in Western Europe and 2010 in Russia, commonly labelled as rare climatic anomalies outside of previous experience, are often taken as harbingers of more frequent extremes in a globally warmer future. However, a recent reconstruction of spring-summer temperatures for Western Europe indicated that temperatures in 1540 were likely significantly higher. In order to check the plausibility of this result, we investigated the severity of the 1540 drought, drawing on the known soil desiccation-temperature feedback. Based on more than 300 first-hand documentary weather report sources originating from an area of 2 to 3 million km², we show that Europe was affected by an unprecedented 11-month-long megadrought. The estimated number of precipitation days and the precipitation amount for Central and Western Europe in 1540 are significantly lower than the 100-year minima of the instrumental measurement period for spring, summer and autumn. This result is supported by independent documentary evidence of extremely low river flows and Europe-wide wild-, forest- and settlement fires. We found that an event of this severity cannot be simulated by state-of-the-art climate models.
  • Authors:
    • Labreuche, J.
    • Cohan, J. P.
    • Dimassi, B.
    • Mary, B.
  • Source: Agriculture, Ecosystems & Environment
  • Volume: 169
  • Year: 2013
  • Summary: Although continuous no-till (NT) is recommended for erosion control and carbon sequestration, it often has a limited duration, since farmers alternate between NT and full inversion tillage (FIT) to control weed infestation and avoid soil compaction. In this paper, we evaluate the effect of continuous tillage and of tillage conversion from NT to FIT (and vice versa) on SOC and SON stocks in a long-term experiment at Boigneville in northern France. Continuous NT (CNT) and FIT (CFIT) treatments were established in 1991 and maintained until 2011, while half of the plots were converted in 2005: from CNT to new FIT (NFIT) and from CFIT to new NT (NNT). Bulk densities and organic C and N contents were determined in 2001 and 2011 down to the old ploughing depth (opd), which was also measured. SOC and SON stocks were calculated at equivalent soil mass by correcting either the bulk densities or the opd. Both methods produced very close results and similar conclusions. A typical gradient of SOC and SON concentrations with depth was observed in CNT, as opposed to a rather uniform distribution in CFIT. CNT resulted in SOC concentrations in the topsoil (0-5 cm) that were 38% higher in 2001 and 53% higher in 2011 compared to CFIT. Conversely, it led to a SOC reduction of 14% in 2001 and 18% in 2011 in the deeper layer (ca. 10-28 cm). The net effect was no significant change in SOC and SON stocks between treatments over the old ploughed layer (4,060 t soil ha⁻¹) in either year: 43.2 and 45.0 t C ha⁻¹ in 2001 and 44.7 and 45.8 t C ha⁻¹ in 2011 in CNT and CFIT, respectively. In 2011, six years after tillage conversion, the stratification of SOC and SON had disappeared in NFIT, whereas a new one had appeared in NNT, with a smaller gradient than in CNT. SOC and SON stocks over the old ploughed layer did not differ significantly between treatments 6 years after conversion: SOC stocks were 45.8, 43.2, 44.7 and 43.1 t C ha⁻¹ in the CFIT, NFIT, CNT and NNT treatments, respectively. Furthermore, SOC stocks below the old ploughed layer (ca. 28-40 cm) were slightly greater in the FIT than in the NT treatments (10.9 vs 8.7 t C ha⁻¹). In this experiment, neither continuous no-till nor tillage conversion resulted in any C sequestration benefit.
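  • Illustrative sketch: the equivalent-soil-mass comparison can be sketched as summing stocks down to a fixed reference soil mass (e.g. the ~4,060 t ha⁻¹ of the old ploughed layer) rather than a fixed depth; the snippet below is a simplified illustration with hypothetical layer data, not the authors' correction procedure.

```python
# Hedged sketch: cumulate SOC stocks layer by layer until a reference soil
# mass is reached, taking only the needed fraction of the last layer.

def soc_stock_at_equivalent_mass(layers, reference_mass_t_ha):
    """layers: list of (soil_mass_t_ha, soc_stock_t_c_ha) per layer, top down."""
    cumulative_mass, stock = 0.0, 0.0
    for mass, c in layers:
        if cumulative_mass + mass <= reference_mass_t_ha:
            cumulative_mass += mass
            stock += c
        else:  # only the fraction of this layer needed to reach the target mass
            fraction = (reference_mass_t_ha - cumulative_mass) / mass
            return stock + fraction * c
    return stock

# Hypothetical NT profile: (soil mass, SOC stock) for 0-5, 5-10, 10-28, 28-40 cm layers.
nt_layers = [(650, 13.0), (700, 10.5), (2800, 21.0), (1750, 8.7)]
print(soc_stock_at_equivalent_mass(nt_layers, reference_mass_t_ha=4060))  # ~43.8 t C/ha
```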
  • Authors:
    • Smith, P.
    • Williams, M.
    • Forristal, D.
    • Lanigan, G.
    • Osborne, B.
    • Abdalla, M.
    • Jones, M. B.
  • Source: Soil Use and Management
  • Volume: 29
  • Issue: 2
  • Year: 2013
  • Summary: Conservation tillage (CT) is an umbrella term encompassing many types of tillage and residue management systems that aim to achieve sustainable and profitable agriculture. The objective of this paper was to investigate, through a global review of CT research, the impacts of CT on greenhouse gas (GHG) emissions. Based on the analysis presented, CT should be developed within the context of specific climates and soils. A number of potential disadvantages of adopting CT practices were identified, relating mainly to enhanced nitrous oxide emissions, together with a number of advantages that would justify its wider adoption. Almost all studies examined showed that the adoption of CT practices reduced carbon dioxide emissions, while also contributing to increases in soil organic carbon and improvements in soil structure.
  • Authors:
    • De Nocker, L.
    • Aertsens, J.
    • Gobin, A.
  • Source: Land Use Policy
  • Volume: 31
  • Year: 2013
  • Summary: Purpose: This paper aims to indicate the potential of agricultural measures for sequestering carbon as an option for climate change mitigation, and estimates the related value for society. Principal results: Agricultural practices such as agroforestry, the introduction of hedges, low and no tillage, and cover crops have an important potential to increase carbon sequestration. The total technical potential in the EU-27 is estimated at 1566 million tonnes CO2-equivalent per year, corresponding to 37% of all CO2-equivalent emissions in the EU in 2007. The introduction of agroforestry is the measure with the highest potential, accounting for 90% of the total potential of the measures studied. Considering only its value for climate change mitigation, the introduction of agroforestry is estimated to be worth 282 euro/ha in 2012, gradually increasing to 1007 euro/ha in 2030. Major conclusions: This implies a huge potential which represents an important value for society in general and for the agricultural sector in particular. At the European level, policy makers have only recognized the important benefits of agroforestry in the last few years. Some European countries now support farmers in introducing agroforestry through their rural development programmes, but the current level of support is still only a small fraction of the societal value of agroforestry. If this value were fully recognized by internalizing the positive externality, we expect that agroforestry would be introduced to a very large extent over the next decades, in Europe and the rest of the world, substantially changing rural landscapes.