- Authors:
- Reitsma, K.
- Carlson, G. C.
- Gelderman, R. H.
- Stone, J.
- Clay, S. A.
- Chang, J. Y.
- Clay, D. E.
- Jones, M.
- Janssen, L.
- Schumacher, T.
- Source: Agronomy Journal
- Volume: 104
- Issue: 3
- Year: 2012
- Summary: The corn (Zea mays L.)-based ethanol carbon footprint is affected by many factors, including the soil's C sequestration potential. The study's objective was to determine the South Dakota corn-based ethanol surface SOC sequestration potential and the associated partial C footprint. Calculated short-term C sequestration potentials were compared with long-term sequestration rates calculated from 95,214 producer soil samples collected between 1985 and 2010. National Agricultural Statistics Service (NASS) grain yields, measured root/shoot ratios and harvest indexes, soil organic C (SOC) and nonharvested C (NHC) first-order rate constants, measured SOC benchmarks [81,391 composite soil samples (0-15 cm) collected between 1985 and 1998], and 34,704 production surveys were used to calculate the short-term sequestration potentials. The short-term, area-weighted SOC sequestration potential for the 2004 to 2007 time period was 181 kg C ha-1 yr-1. This relatively low rate was attributed to a drought that reduced the amount of NHC returned to the soil. For the 2008 to 2010 time period, the area-weighted short-term sequestration rate was 341 kg C ha-1 yr-1. This rate was similar to the long-term measured rate of 368 kg C ha-1 yr-1. Findings from these independent SOC sequestration assessments support the hypothesis that many of the region's surface soils are C sinks when seeded with corn. Based on short-term C sequestration rates, corn yields, and the corn conversion rate to ethanol, the area-weighted surface SOC footprints for the 2004 to 2007 and 2008 to 2010 time periods were -10.4 and -15.4 g CO2 eq MJ-1, respectively.
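The partial SOC footprint reported above can be sketched as a unit conversion from a per-hectare sequestration rate to grams of CO2 equivalent per megajoule of ethanol. The grain yield, grain-to-ethanol conversion rate, and ethanol energy content used below are illustrative assumptions, not the paper's inputs.

```python
# Illustrative sketch (not the paper's exact calculation): converting a surface SOC
# sequestration rate into a partial carbon footprint for corn-grain ethanol.
# All numeric inputs below are assumptions for demonstration only.

C_TO_CO2 = 44.0 / 12.0  # kg CO2 per kg C

def partial_soc_footprint(seq_rate_kg_c_ha_yr,
                          grain_yield_kg_ha,        # assumed corn grain yield
                          ethanol_l_per_kg_grain,   # assumed conversion rate
                          energy_mj_per_l=21.2):    # assumed energy content of ethanol
    """Return the SOC-credit portion of the footprint in g CO2 eq per MJ of ethanol.

    A negative value indicates sequestration, i.e., a credit against the footprint.
    """
    co2_g_per_ha = seq_rate_kg_c_ha_yr * C_TO_CO2 * 1000.0            # g CO2 ha-1 yr-1
    ethanol_mj_per_ha = grain_yield_kg_ha * ethanol_l_per_kg_grain * energy_mj_per_l
    return -co2_g_per_ha / ethanol_mj_per_ha

# Example with assumed inputs: 341 kg C ha-1 yr-1 sequestration,
# 9400 kg ha-1 grain yield, and 0.40 L ethanol per kg grain.
print(round(partial_soc_footprint(341, 9400, 0.40), 1), "g CO2 eq MJ-1")
```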
- Authors:
- Howell, T. A.
- Evett, S. R.
- Schwartz, R. C.
- Colaizzi, P. D.
- Gowda, P. H.
- Tolk, J. A.
- Source: Agronomy Journal
- Volume: 104
- Issue: 2
- Year: 2012
- Summary: Relatively few radiation transfer studies have considered the impact of the varying vegetation cover that typifies row crops, and methods to account for partial row crop cover have not been well investigated. Our objective was to evaluate a widely used radiation model that was modified for row crops having sparse to full vegetation cover. The radiation model was combined with geometric view factors based on elliptical hedgerows that account for the spatial distribution of row crop vegetation, and this approach was compared with the more commonly used clumping index approach. Irradiance measurements included transmitted and reflected visible and shortwave, outgoing longwave, and total net radiation. The model used optimized parameters for corn (Zea mays L.), grain sorghum [Sorghum bicolor (L.) Moench], and cotton (Gossypium hirsutum L.). The elliptical hedgerow and clumping index approaches resulted in similar model agreement; however, the former resulted in up to 7.3 W m-2 smaller RMSE and up to 7.5 W m-2 smaller mean bias error compared with the latter. Both approaches resulted in similar model sensitivities to inputs, which were varied by 25%. Calculated shortwave irradiance fluxes were most sensitive to leaf area index (LAI; -3.25), canopy width (-1.94), ellipsoid leaf angle parameter (-0.77), and visible leaf absorption (-5.54) when LAI = 2.95 m2 m-2, and to visible soil reflectance (0.89) when LAI = 0.21 m2 m-2. Calculated outgoing longwave irradiance and net radiation were most sensitive to the soil directional brightness temperature (0.55 and -0.61, respectively) when LAI = 0.21 m2 m-2.
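The sensitivity values quoted above are relative (normalized) sensitivity coefficients: the fractional change in a calculated flux divided by the fractional change in an input. A minimal sketch of that calculation follows; the Beer's-law-style stand-in model and its parameters are assumptions for illustration, not the authors' row-crop radiation model.

```python
# Minimal sketch of a normalized (relative) sensitivity coefficient of the kind
# reported above: (fractional change in output) / (fractional change in input).
import math

def relative_sensitivity(model, inputs, name, delta=0.25):
    """Perturb one input by +/- delta (fractional) and return (dO/O)/(dI/I)."""
    base = model(**inputs)
    hi = dict(inputs, **{name: inputs[name] * (1 + delta)})
    lo = dict(inputs, **{name: inputs[name] * (1 - delta)})
    d_out = (model(**hi) - model(**lo)) / base
    d_in = 2 * delta
    return d_out / d_in

# Toy stand-in "model": transmitted shortwave decreasing exponentially with LAI
# (a Beer's-law placeholder, not the paper's elliptical hedgerow geometry).
def transmitted_shortwave(lai, solar=800.0, k=0.6):
    return solar * math.exp(-k * lai)

print(relative_sensitivity(transmitted_shortwave, {"lai": 2.95}, "lai"))
```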
- Authors:
- Abbaspour, K. C.
- Gaiser, T.
- Folberth, C.
- Schulin, R.
- Yang, H.
- Source: Agriculture Ecosystems and Environment
- Volume: 151
- Year: 2012
- Summary: Spatially explicit large-scale crop growth models are often applied at the global scale with little or no adjustment to regional conditions, which may produce unreliable model results. To tackle this issue, we regionalized a large-scale crop model for simulating maize cultivation in sub-Saharan Africa (SSA). Planting dates were estimated using reported planting seasons, plant growth parameters were adopted from the literature to reflect a low-yielding cultivar, and agricultural practice was mimicked by simulating continuous cultivation of maize with removal of plant residues. The analysis of different estimates of planting date showed that a monthly time step was too coarse in (semi-)arid regions and that a weekly step should be used. Limiting planting date estimates by reported seasons is especially important in regions with bimodal rainy seasons. Parameterizing a low-yielding cultivar by decreasing the maximum and minimum harvest index (HI) in the model resulted in HI estimates within the range of values reported in the literature. The most important step in the model adjustment was found to be the removal of plant residue. Together with low fertilizer inputs, this leads to soil nutrient and organic carbon depletion, which has been taking place in most parts of SSA during the past decades. If residue removal is not taken into account, the simulation results in organic carbon sequestration and only minor nutrient depletion. With the adjustments of cultivar, planting dates, and agricultural practice in the model setup, crop growth in most areas of SSA is constrained mainly by nutrient stress rather than by water or temperature stress. The estimated national and regional average yields compared well with reported yields for the major maize-producing countries, suggesting that the regionalized model is suitable for supporting policies on water and soil management in SSA.
- Authors:
- Ahuja, L. R.
- Hatfield, J. L.
- Ma, L.
- Malone, R. W.
- Heiman, P.
- Boyle, K. P.
- Kanwar, R. S.
- Source: Agricultural Systems
- Volume: 106
- Issue: 1
- Year: 2012
- Summary: A 45% reduction in riverine total nitrogen flux from the 1980-1996 time period is needed to meet water quality goals in the Mississippi Basin and Gulf of Mexico. This paper addresses the goal of reducing nitrogen in the Mississippi River through three objectives. First, the paper outlines an approach to the site-specific quantification of management effects on nitrogen loading from tile-drained agriculture using a simulation model and expert review. Second, information about the net returns to farmers is integrated with the nitrogen loading information to assess the incentives to adopt alternative management systems. Third, the results are presented in a decision support framework that compares the rankings of management systems based on observed and simulated values for net returns and nitrogen loading. The specific question addressed is how information about the physical and biological processes at Iowa State University's Northeast Research Farm near Nashua, Iowa, could be applied over a large area to help farmers select management systems that reduce nitrogen loading in tile-drained areas. Previous research has documented the parameterization and calibration of the RZWQM model at Nashua to simulate the effects of 35 management systems on corn and soybean yields and N loading in tile flow from 1990 to 2003. Because most management systems were studied for a 6-year period and weather had substantial impacts in some cases, a set of 30 alternative management systems was also simulated using a common 1974-2003 input climate dataset. To integrate an understanding of the economics of N management, we calculated net returns for all management systems using the DevTreks social budgeting tool. We ranked the 35 observed systems in the Facilitator decision support tool using N loading and net returns and found that rankings from simulated results were very similar to those from the observed results from both an on-site and an off-site perspective. We analyzed the effects of tillage, crop rotation, cover crops, and N application method, timing, and amount on net returns and N loading for the 30 long-term simulations. The primary contribution of this paper is an approach to creating a quality-assured database of management effects on nitrogen loading and net returns for tile-drained agriculture in the Mississippi Basin. Such a database would systematically extend data from intensively monitored agricultural fields to the larger area those fields represent.
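A conceptual sketch of the ranking comparison described above follows: management systems are ranked by a criterion (here N loading) from observed and from simulated values, and the agreement between the two rankings is summarized with a Spearman rank correlation. The data and the correlation measure are illustrative assumptions; the Facilitator tool's actual procedure may differ.

```python
# Conceptual sketch (not the Facilitator tool itself): compare management-system
# rankings built from observed vs. simulated N loading. Placeholder data only.

def rank(values, reverse=False):
    """Return the rank (1 = best) of each item; reverse=True ranks high values first."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=reverse)
    ranks = [0] * len(values)
    for position, idx in enumerate(order, start=1):
        ranks[idx] = position
    return ranks

def spearman(r1, r2):
    """Spearman rank correlation for rankings without ties."""
    n = len(r1)
    d2 = sum((a - b) ** 2 for a, b in zip(r1, r2))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical N loading per system (kg N ha-1, lower is better); the same
# comparison could be repeated with net returns ranked high-to-low.
observed_n  = [28, 35, 22, 40, 31]
simulated_n = [30, 33, 24, 42, 29]

print(spearman(rank(observed_n), rank(simulated_n)))
```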
- Authors:
- Finlay, L. A.
- Hulugalle, N. R.
- Weaver, T. B.
- Source: Renewable Agriculture and Food Systems
- Volume: 27
- Issue: 2
- Year: 2012
- Summary: Cover crops in minimum or no-till systems are usually killed by applying one or more herbicides, which significantly increases costs. Applying herbicides at lower rates combined with mechanical interventions that do not disturb or bury cover crop residues can, however, reduce costs. Our objective was to develop a management system with these features for prostrate cover crops on permanent beds in an irrigated Vertisol. The implement developed consisted of a toolbar fitted with spring-loaded pairs of parallel coulter discs, one set of nozzles between the individual coulter discs that directed a contact herbicide onto the bed surfaces to kill the cover crop, and a second set of nozzles that directed the cheaper glyphosate into the furrow to kill weeds. The management system killed a prostrate cover crop with less trafficking and reduced the use of more toxic herbicides, the carbon footprint, labor, and risk to operators. Maximum depth of compaction was greater, but the average increase was smaller, than with the boom sprayer control.
- Authors:
- Source: Plant and Soil
- Volume: 353
- Issue: 1-2
- Year: 2012
- Summary: Aims: A field experiment was conducted in which maintenance of indigenous arbuscular mycorrhizal (AM) fungal populations was attempted using AM host cover crops arranged temporally or spatially during growth of nonmycorrhizal crops. Methods: To arrange AM hosts temporally, sunflower or oat was grown as a cover crop after non-host cropping (cabbage) or fallowing. To arrange AM hosts spatially, red clover, white clover, or vetch was intercropped during growth of non-host cabbage. Results: The AM colonization and growth of maize following previously introduced sunflower or oat were much greater than those without introduction of cover crops or with introduction of non-host cover crops. The AM colonization and yield of winter wheat grown after cabbage with AM host intercropping were greater than those after cabbage-only cropping, suggesting that arrangement of AM hosts between cabbage rows is effective for maintaining the AM fungal population in soil during non-host cropping. Conclusions: Cropping mycorrhizal hosts after or during non-host cropping is an effective means of increasing indigenous AM fungal populations. The results show that AM colonization, P uptake, and productivity of crops after cultivation of nonmycorrhizal crops can be improved by arranging AM hosts temporally or spatially as cover crops.
- Authors:
- Singer, J. W.
- Moorman, T. B.
- Parkin, T. B.
- Jaynes, D. B.
- Kaspar, T. C.
- Source: Agricultural Water Management
- Volume: 110
- Year: 2012
- Summary: Much of the NO3 in the riverine waters of the upper Mississippi River basin in the United States originates from agricultural land used for corn (Zea mays L.) and soybean (Glycine max [L.] Merr.) production. Cover crops grown between the maturity and planting of these crops are one approach for reducing losses of NO3. In this experiment, we evaluated the effectiveness of oat (Avena sativa L.) and rye (Secale cereale L.) cover crops in reducing NO3 concentrations and loads in subsurface drainage water. The oat fall cover crop was broadcast seeded into living corn and soybean crops before harvest in late August or early September and was killed by cold temperatures in late November or early December. The rye winter cover crop, which had already been used annually for four years, was planted with a grain drill after corn and soybean harvest, overwintered, grew again in the spring, and was killed with herbicides before main crop planting. These treatments were evaluated in subsurface-drained field plots with an automated system for measuring drainage flow and collecting proportional samples for analysis of NO3 concentrations from each plot. The rye winter cover crop significantly reduced drainage water NO3 concentrations by 48% over five years, although this was less than the 58% reduction observed in its first four years of use. The oat fall cover crop reduced NO3 concentrations by 26%, or about half the reduction of the rye cover crop. Neither cover crop significantly reduced cumulative drainage or nitrate loads because of variability in cumulative annual drainage among plots. Both oat and rye cover crops are viable management options for significantly reducing NO3 losses to surface waters from agricultural drainage systems used for corn and soybean production.
- Authors:
- Okeyo, J.
- Vanlauwe, B.
- Kimetu, J. M.
- Waswa, B.
- Bationo, A.
- Kihara, J.
- Mukalama, J.
- Martius, C.
- Source: Experimental Agriculture
- Volume: 48
- Issue: 2
- Year: 2012
- Summary: Reduced tillage is considered one of the potential ways to reverse land degradation and ultimately increase the productivity of the degrading soils of Africa. We hypothesised that crop yield following a modest application of 2 t ha-1 of crop residue in a reduced tillage system is similar to the yield obtained from a conventional tillage system, and that incorporation of legumes in a cropping system leads to greater economic benefits than a cropping system involving continuous maize. Three cropping systems (continuous maize monocropping, legume/maize intercropping, and rotation) under different tillage and residue management systems were tested in sub-humid western Kenya over 10 seasons. While soybean performed equally well in both tillage systems throughout, maize yield was lower under reduced than conventional tillage during the first five seasons, but no significant differences were observed after season 6. Likewise, with crop residue application, yields in conventional and reduced tillage systems were comparable after season 6. Nitrogen and phosphorus application increased yield by up to 100% compared with the control. Gross margins were not significantly different among the cropping systems, being only 6 to 39% greater in the legume-cereal systems relative to similar treatments in the continuous cereal monocropping system. After 10 seasons of reduced tillage production, the economic benefits of our cropping systems are still not attractive enough to justify a switch from conventional to reduced tillage.
- Authors:
- Lawton-Rauh, A.
- Agudelo, P.
- Leach, M.
- Source: Plant Disease
- Volume: 96
- Issue: 1
- Year: 2012
- Summary: Rotylenchulus reniformis is a highly variable nematode species and an economically important pest in many cotton fields across the southeastern United States. Rotation with resistant or poor host crops is a method for management of reniform nematode. We studied the effect of six planting schemes covering four 120-day planting cycles on the predominant genotype of R. reniformis. Rotations used were: (i) cotton to corn; (ii) susceptible soybean to corn; (iii) resistant soybean to cotton; (iv) corn to cotton; (v) continuous susceptible soybean; (vi) continuous cotton. After each 120-day cycle, amplified fragment length polymorphisms (AFLPs) produced from four primer pairs were used to determine the effect of crop rotation on the predominant genotype of reniform nematode. A total of 279 polymorphic bands were scored using four primer combinations. Distinct changes in genotype composition were observed following rotations with resistant soybean or corn. Rotations involving soybean (susceptible and resistant) had the greatest effect on population structure. The characterization of field population variability of reniform nematode and of population responses to host plants used in rotations can help extend the durability of resistant varieties and can help identify effective rotation schemes.
- Authors:
- Rajput, T. B. S.
- Sarangi, A.
- Singh, M.
- Abedinpour, M.
- Pathak, H.
- Ahmad, T.
- Source: Agricultural Water Management
- Volume: 110
- Year: 2012
- Summary: Crop growth simulation models of varying complexity have been developed for predicting the effects of soil, water, and nutrients on grain and biomass yields and water productivity of different crops. These models are calibrated and validated for a given region using data generated from field experiments. In this study, the water-driven crop model AquaCrop, developed by FAO, was calibrated and validated for maize under varying irrigation and nitrogen regimes. The experiment was conducted at the research farm of the Water Technology Centre, IARI, New Delhi, during kharif 2009 and 2010. Calibration was done using the data of 2009 and validation with the data of 2010. Irrigation applications comprised rainfed, i.e., no irrigation (W1), irrigation at 50% of field capacity (FC) (W2), irrigation at 75% FC (W3), and full irrigation (W4). Nitrogen application levels were no nitrogen (N1), 75 kg ha-1 (N2), and 150 kg ha-1 (N3). Model efficiency (E), coefficient of determination (R2), root mean square error (RMSE), and mean absolute error (MAE) were used to test the model performance. The model was calibrated for simulating maize grain and biomass yield for all treatment levels with the prediction error statistics 0.95
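The four statistics named above (E, R2, RMSE, MAE) are standard crop-model evaluation measures. A small sketch using their common definitions (Nash-Sutcliffe efficiency for E, squared Pearson correlation for R2) is given below, with made-up observed and simulated yields standing in for the study's data.

```python
# Hedged sketch of the performance statistics named above, using common definitions;
# the observed and simulated yields are placeholders, not the study's data.
import math

def performance_stats(observed, simulated):
    n = len(observed)
    mean_obs = sum(observed) / n
    mean_sim = sum(simulated) / n
    residuals = [o - s for o, s in zip(observed, simulated)]

    rmse = math.sqrt(sum(r ** 2 for r in residuals) / n)
    mae = sum(abs(r) for r in residuals) / n
    # Nash-Sutcliffe model efficiency: 1 = perfect, <= 0 = no better than the observed mean.
    e = 1 - sum(r ** 2 for r in residuals) / sum((o - mean_obs) ** 2 for o in observed)
    # Coefficient of determination as the squared Pearson correlation.
    cov = sum((o - mean_obs) * (s - mean_sim) for o, s in zip(observed, simulated))
    var_o = sum((o - mean_obs) ** 2 for o in observed)
    var_s = sum((s - mean_sim) ** 2 for s in simulated)
    r2 = cov ** 2 / (var_o * var_s)
    return {"E": e, "R2": r2, "RMSE": rmse, "MAE": mae}

# Placeholder grain yields (t ha-1) for illustration only.
obs = [2.1, 3.4, 4.8, 5.6, 6.2]
sim = [2.3, 3.2, 4.5, 5.9, 6.0]
print(performance_stats(obs, sim))
```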