- Authors:
- Source: Nematropica
- Volume: 41
- Issue: 2
- Year: 2011
- Summary: Studies that utilized rotation crops for management of root-knot nematodes in the southeastern United States were examined to evaluate the overall performance of rotation crops. In general, nematode-susceptible crops that followed effective rotation crops produced yields and supported nematode numbers similar to those obtained on crops treated with most standard nematicides. Fumigation with methyl bromide was an exception: it resulted in low nematode numbers through the end of the susceptible target crop, whereas nematode numbers recovered following rotation crops. Performance of rotation crops was similar to clean fallow in most studies, and there was little evidence that rotation crops could suppress nematode numbers below levels obtained after clean fallow. Large reductions in nematode numbers were often achieved following rotation crops. In sites with relatively low initial population levels before rotation crops were used, effective rotation crops sometimes maintained relatively low nematode numbers through the following susceptible target crop, and nematode recovery was not observed until the second year of the rotation sequences. Where practical, very long rotations such as bahiagrass pastures were often effective in preventing increases in nematode numbers on subsequent susceptible crops. Rehabilitation of heavily infested sites is difficult, could require several years of rotation crops, and the benefit gained may last only through one susceptible crop.
- Authors:
- Canaday, C. H.
- Little, C. R.
- Chen, P.
- Rupe, . B.
- Wrather, A. J.
- Shannon, G. J.
- Bond, J. P.
- Arelli, P. A.
- Mengistu, A.
- Newman, M. A.
- Pantalone, V. R.
- Source: Plant Health Progress
- Issue: September
- Year: 2011
- Summary: Charcoal rot, caused by Macrophomina phaseolina, reduces soybean yield more than most other diseases in the midsouthern United States, yet there are no commercial genotypes marketed as resistant to charcoal rot. Reactions of 27 maturity group (MG) III, 29 Early MG IV, 34 Late MG IV, and 59 MG V genotypes to M. phaseolina were evaluated from 2006 through 2008 in a non-irrigated, no-till field that had been artificially infested for three years. There was significant variation in root colonization among genotypes and years, indicating the value of screening genotypes over multiple years. Based on the colony forming unit index (CFUI), no genotype was consistently immune to charcoal rot each year. However, a total of six genotypes (one in MG III, one in Late MG IV, and four in MG V) were identified as moderately resistant. Some commercial and public genotypes were resistant to M. phaseolina at levels equal to or greater than the standard DT97-4290, a moderately resistant cultivar. The genotypes identified as having moderate resistance across the three years could be useful as sources for developing resistant soybean genotypes.
- Authors:
- Mengistu, A.
- Bellaloui, N.
- Ray, J. D.
- Smith, J. R.
- Source: Plant Disease
- Volume: 95
- Issue: 9
- Year: 2011
- Summary: The seasonal progress of charcoal rot (caused by Macrophomina phaseolina) was measured over two growing seasons in four separate experiments: irrigated infested, irrigated non-infested, non-irrigated infested, and non-irrigated non-infested. Disease was assessed at the V5, R1, R3, R5, R6, and R7 growth stages based on colony forming units (CFU) of M. phaseolina recovered from lower stem and root tissues and on the area under the disease progress curve (AUDPC). The population density of M. phaseolina increased slowly from the V5 to R6 growth stages and then rapidly from R6 to R7 for all genotypes in all four experiments. Yield loss due to charcoal rot ranged from 6 to 33% in irrigated environments. The extent of yield loss was affected by the severity of charcoal rot, which in turn was affected by year. Yield loss due to charcoal rot was consistently measured in all paired comparisons in irrigated environments, suggesting that charcoal rot can be an important disease in irrigated environments. Disease severity based on CFU accounted for more of the variation in yield loss (42%) than did the AUDPC (36%). Growth stage R7 was found to be the optimum stage for assessing disease using CFU. In addition, screening soybean genotypes under irrigated environments may have utility in breeding programs that must evaluate soybean genotypes for both disease resistance and yield.
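The AUDPC statistic mentioned above is conventionally computed with the trapezoidal rule over successive disease assessments. A minimal sketch follows; the assessment dates and severity values are illustrative placeholders, not data from the study.

```python
# Trapezoidal area under the disease progress curve (AUDPC), the standard
# summary statistic for repeated disease assessments over a season.
def audpc(days, severity):
    """AUDPC from assessment times (days) and matching disease severity values."""
    return sum(
        (severity[i] + severity[i + 1]) / 2 * (days[i + 1] - days[i])
        for i in range(len(days) - 1)
    )

# Hypothetical example: six assessments (e.g., V5 through R7, as days after planting)
days = [30, 45, 60, 75, 90, 105]
severity = [1, 2, 4, 8, 15, 40]   # illustrative CFU-derived severity values
print(audpc(days, severity))
```

Because the trapezoids weight each interval by its length, AUDPC integrates both how severe the disease was and how long it persisted, which is why it is compared against a single end-of-season CFU reading in the abstract.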
- Authors:
- Ball, L. O.
- Vandever, M. W.
- Milchunas, D. G.
- Hyberg, S.
- Source: Rangeland Ecology & Management
- Volume: 64
- Issue: 3
- Year: 2011
- Summary: The effects of grazing, mowing, and type of cover crop were evaluated on former winter wheat-fallow cropland seeded to grassland under the Conservation Reserve Program in eastern Colorado. Prior to seeding, the fallow strips were planted to forage sorghum or wheat in alternating strips (cover crops), with no grazing, moderate to heavy grazing, and mowing (grazing treatments) superimposed 4 yr after planting and studied for 3 yr. Plots previously in wheat had more annual and exotic species than sorghum plots. Conversely, perennial native grasses and all native species were much more abundant in sorghum than in wheat cropped areas. The competitive advantage gained by seeded species in sorghum plots resulted in large increases in rhizomatous western wheatgrass. Sorghum is known to be allelopathic and is used in crop rotations to suppress weeds and increase crop yields, consistent with the responses of weed and desired native species in this study. Grazing treatment had relatively minor effects on the basal and canopy cover composition of annual or exotic species versus perennial native grasses or native species. Although grazing treatment was never a significant main effect, it occasionally modified cover crop or year effects. Opportunistic grazing reduced exotic cheatgrass by year 3 but also decreased the native, palatable western wheatgrass. Mowing was a less effective weed control practice than grazing. Vegetative basal cover and aboveground primary production varied primarily with year. Common management practices for revegetation/restoration currently use herbicides and mowing for weed control and restrict grazing in all stages of development. Results suggest that allelopathic cover crop selection and opportunistic grazing can be effective alternative grass establishment and weed control practices. Susceptibility, resistance, and interactions of weed and seeded species to allelopathic cover species/cultivars may be a fruitful area of research.
- Authors:
- Source: Transactions of the American Society of Agricultural and Biological Engineers
- Volume: 54
- Issue: 1
- Year: 2011
- Summary: Canopy resistance (r_c), which represents the composite diffusive resistance to water vapor transfer from vegetation surfaces to the atmosphere, plays an important role in describing water vapor and energy fluxes and CO2 exchange mechanisms and is an essential component of complex ecophysiological, turbulent transport, and evapotranspiration models. While one-step (direct) application of combination-based energy balance models (i.e., Penman-Monteith, PM) requires r_c to solve for actual evapotranspiration (ET_a), a remaining challenge in practical application of PM-type models is scaling up leaf-level stomatal resistance (r_s) to r_c to represent an integrated resistance of the plant community for quantifying field-scale evaporative losses. We validated an integrated approach to scale up r_s to the canopy. Through an extensive field campaign, we measured diurnal r_s for a subsurface drip-irrigated soybean [Glycine max (L.) Merr.] canopy and integrated several microclimatic and in-canopy radiation transfer parameters to scale up r_s to r_c. Using microclimatic and plant factors such as leaf area index for sunlit and shaded leaves, plant height, solar zenith angle, direct and diffuse radiation, and light extinction coefficient, we scaled up soybean r_s as a primary function of measured photosynthetic photon flux density (PPFD). We assumed that PPFD is the primary and independent driver of r_c; hence, the scaling approach relied heavily on measured PPFD-r_s response curves. We present experimental verification of the scaled-up r_c by evaluating its performance in estimating ET_a. In addition, we solved the PM model on an hourly time step using the scaled-up r_c values and compared the PM-estimated ET_a with the Bowen ratio energy balance system (BREBS)-measured ET_a.
The relationship between r_s and PPFD was asymptotic, and r_s showed strong dependence on PPFD: PPFD alone explained 67% to 88% of the variability in r_s. Beyond a certain level of PPFD (400 to 500 µmol m^-2 s^-1), r_s became less responsive to PPFD. In the range of smaller PPFD (0 to about 150 µmol m^-2 s^-1) and greater r_s (>70 to 80 s m^-1), r_s was very sensitive to PPFD. Throughout the season, r_c_min, r_c_avg, and r_c_max ranged from 42 to 104 s m^-1, 69 to 183 s m^-1, and 95 to 261 s m^-1, respectively. The seasonal averages of r_c_min, r_c_avg, and r_c_max were 54, 92, and 129 s m^-1, respectively. Canopy resistances were higher in the early growing season during partial canopy closure, lower during mid-season, and high again in late season due to leaf aging and senescence. The ET_a estimates from the PM model using scaled-up r_c values correlated very well with the BREBS-measured ET_a. The average root mean square difference (RMSD) between the BREBS-measured and PM-estimated ET_a was 0.08 mm h^-1 (r^2 = 0.91; n = 827), and estimates were within 3% of the measured ET_a on an hourly basis. On a daily time step, RMSD was 0.64 mm d^-1 (r^2 = 0.86; n = 83), and estimates were within 4% of the measured data. The approach successfully synthesized whole-canopy resistance for use in PM-type combination energy balance equations by scaling up from r_s using a straightforward model of in-canopy radiation transfer.
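The one-step use of r_c described above can be sketched with the generic textbook form of the Penman-Monteith combination equation. This is not the authors' exact parameterization; all numeric inputs below are illustrative assumptions (the r_c value merely echoes the reported seasonal average r_c_avg of 92 s m^-1).

```python
import math

def penman_monteith_et(Rn, G, Ta, RH, r_c, r_a):
    """Hourly actual ET (mm h^-1) from a generic Penman-Monteith combination equation.
    Rn, G: net radiation and soil heat flux (W m^-2); Ta: air temperature (deg C);
    RH: relative humidity (0-1); r_c, r_a: canopy and aerodynamic resistances (s m^-1)."""
    es = 0.6108 * math.exp(17.27 * Ta / (Ta + 237.3))   # saturation vapor pressure (kPa)
    ea = RH * es                                        # actual vapor pressure (kPa)
    delta = 4098.0 * es / (Ta + 237.3) ** 2             # slope of es curve (kPa K^-1)
    gamma = 0.0665                                      # psychrometric constant (kPa K^-1)
    rho_cp = 1.2 * 1013.0                               # air density * specific heat (J m^-3 K^-1)
    lam = 2.45e6                                        # latent heat of vaporization (J kg^-1)
    LE = (delta * (Rn - G) + rho_cp * (es - ea) / r_a) / (
        delta + gamma * (1.0 + r_c / r_a))              # latent heat flux (W m^-2)
    return LE / lam * 3600.0                            # convert W m^-2 to mm h^-1

# Hypothetical midday conditions for a well-watered canopy
print(round(penman_monteith_et(Rn=500, G=50, Ta=28, RH=0.55, r_c=92, r_a=40), 2))
```

The r_c/r_a ratio in the denominator is where the scaled-up canopy resistance enters: larger r_c (early or late season, per the abstract) suppresses the estimated latent heat flux and hence ET_a.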
- Authors:
- Source: Soil & Tillage Research
- Volume: 111
- Issue: 2
- Year: 2011
- Summary: Soil aggregate stability is a frequently used indicator of soil quality, but there is no standard methodology for assessing it. Current methods generally measure only a portion of the soil or use either dry-sieved or wet-sieved aggregates. Our objective was to develop a whole soil stability index (WSSI) by combining data from dry aggregate size distribution and water-stable aggregation with a 'quality' constant for each aggregate size class. The quality constant was based on the impact of aggregate size on soil quality indicators, which can be loosely defined as those soil properties and processes most sensitive to changes in soil function. The WSSI was hypothesized to relate better to the impacts of aboveground management than other soil aggregation indices such as the mean weight diameter (MWD), geometric mean diameter (GMD), and normalized stability index (NSI). Soil samples used in this study were collected from sites established on the same or similar soil types at the Northern Great Plains Research Laboratory in Mandan, ND. By utilizing dry aggregate size distribution, water-stable aggregation, and the quality constant, the WSSI detected differences in soil quality due to management (such as amount of disturbance, plant cover, and crop rotation), with the highest values occurring for undisturbed native range and the lowest for conventional-tillage, fallow treatments. The WSSI had the best relationship with management and is recommended as a standard measurement for soil aggregation.
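The paper defines the exact WSSI formula and quality constants; as a hedged illustration only, the kind of combination described (per-class dry-sieved fraction, water-stable fraction, and quality constant) could be sketched as a weighted sum. Every value and the functional form below are assumptions for illustration, not the published index.

```python
# Illustrative combination of dry aggregate size distribution, water-stable
# aggregation, and a per-class 'quality' constant into a single index.
# NOTE: hypothetical sketch -- the published WSSI formula may differ.
def whole_soil_stability_index(dry_fraction, wet_stable_fraction, quality):
    """Sum over size classes of (dry mass fraction * water-stable fraction * quality)."""
    return sum(d * w * q for d, w, q in zip(dry_fraction, wet_stable_fraction, quality))

# Three hypothetical size classes (e.g., macro-, meso-, and micro-aggregates)
dry = [0.5, 0.3, 0.2]   # dry-sieved mass fractions (sum to 1)
wet = [0.8, 0.6, 0.4]   # proportion of each class that is water-stable
q   = [1.0, 0.7, 0.4]   # assumed quality constants favoring larger aggregates
print(round(whole_soil_stability_index(dry, wet, q), 3))
```

The point of such an index is that it uses the whole soil: a treatment that shifts mass into small, unstable classes lowers the score even if the remaining large aggregates are individually stable.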
- Authors:
- Source: Transactions of the American Society of Agricultural and Biological Engineers
- Volume: 54
- Issue: 3
- Year: 2011
- Summary: Estimation of actual evapotranspiration (ET), especially its partitioning into plant transpiration (T) and soil evaporation (E), in agricultural fields is important for effective soil water management and conservation and for understanding the interactions of ET, T, and E with management practices. Direct field measurements of ET, T, and E rates are difficult and costly; hence, mathematical models are used to estimate them. The objective of this study was to evaluate the practical applicability of the Shuttleworth-Wallace (S-W) model to estimate and partition ET in a subsurface drip-irrigated soybean [Glycine max (L.) Merr.] field with partial residue cover. While its performance has been studied for various surfaces, the performance of the S-W model for such a surface has not been evaluated. An integrated approach of calculating bulk stomatal resistance (r_sc) as a function of soil water content (theta_i) was incorporated into the model to allow simulation of T over a range of theta_i, and a residue decomposition function was introduced to account for surface residue decay over time and thereby more accurately represent the actual residue cover in field conditions. Model performance was evaluated for different plant growth stages during the 2007 and 2008 growing seasons at the University of Nebraska-Lincoln, South Central Agricultural Laboratory near Clay Center, Nebraska. The sum of estimated T and E was compared to the Bowen Ratio Energy Balance System (BREBS)-measured actual ET on a daily time step. The model was able to capture the trends and magnitudes of measured ET, but its performance differed among plant physiological growth stages. The root mean square difference (RMSD) values between model-estimated and measured ET for the growing season (days after emergence until physiological maturity) were 1.26 and 1.03 mm d^-1 for 2007 and 2008, respectively.
The best performance was observed during the mid-season at full canopy cover, with a two-year average r^2 of 0.87, average RMSD of 0.94 mm d^-1, and average mean bias error (MBE) of 0.30 mm d^-1. Estimates for the initial and late-season growth stages, where E was dominant, had the least agreement with BREBS measurements. The proportions of T and E in the estimated ET varied with growth stage. The S-W-estimated seasonal total ET and the BREBS measurement were nearly equal in 2007 (S-W model ET = 496 mm and BREBS ET = 498 mm), and in 2008 the model underestimated by only 8.2% (S-W model ET = 452 mm and BREBS ET = 489 mm). While, in general, the model was successful in tracking the trends and magnitude of the BREBS-measured ET, further re-parameterization of the T module could improve its accuracy in estimating ET, especially T, during the initial and late season (before full canopy cover and after physiological maturity) for a subsurface drip-irrigated soybean canopy. Other enhancements needed for improved estimation of the E component include accurate determination of soil surface resistance coefficients and accounting for direct evaporation of rainfall intercepted by the canopy.
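The two agreement statistics used above, RMSD and MBE, are standard and easy to state explicitly. A minimal sketch follows; the daily ET values are hypothetical, not data from the study, and the sign convention here (estimated minus measured, so a positive MBE means overestimation) is an assumption.

```python
import math

def rmsd(measured, estimated):
    """Root mean square difference between paired measured and estimated values."""
    return math.sqrt(sum((m - e) ** 2 for m, e in zip(measured, estimated)) / len(measured))

def mbe(measured, estimated):
    """Mean bias error: average of (estimated - measured); positive = overestimation."""
    return sum(e - m for m, e in zip(measured, estimated)) / len(measured)

# Hypothetical daily ET series (mm d^-1)
measured  = [4.2, 5.1, 3.8, 6.0]
estimated = [4.0, 5.5, 3.5, 6.4]
print(round(rmsd(measured, estimated), 3), round(mbe(measured, estimated), 3))
```

RMSD penalizes scatter regardless of direction, while MBE preserves sign, which is why the abstract reports both: a model can have a small bias yet a large day-to-day error, or vice versa.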
- Authors:
- Roskamp, G. K.
- Glassman, K. R.
- Ortiz-Ribbing, L. M.
- Hallett, S. G.
- Source: Plant Disease
- Volume: 95
- Issue: 4
- Year: 2011
- Summary: Common waterhemp (Amaranthus rudis) and pigweeds (Amaranthus spp.) are troublesome weeds in many cropping systems and have evolved resistance to several herbicides. Field trials to further develop Microsphaeropsis amaranthi and Phomopsis amaranthicola as bioherbicides for control of waterhemp and pigweeds were conducted to test the effectiveness of these organisms in irrigated and nonirrigated pumpkin and soybean plots over 2 years at three locations in western Illinois. The bioherbicide was applied with lecithin and vegetable oil at 187 liters ha^-1 in 2008 and 374 liters ha^-1 in 2009. Treatments included spore suspensions of M. amaranthi and P. amaranthicola alone, a mixture of both organisms, and sequential treatments of the organisms with halosulfuron-methyl (Sandea Herbicide) in pumpkin or glyphosate (Roundup Original Max Herbicide) in soybean. Bioherbicide effectiveness was assessed approximately 7 and 14 days after treatment as disease incidence, disease severity, percent weed control, and weed biomass reduction. Significant reductions in weed biomass occurred in treatments with one or both of the fungal organisms, and potential exists to tank mix M. amaranthi with halosulfuron-methyl. Leaf surface moisture and air temperature following application may account for inconsistencies in field results between years and locations. These fungal organisms show potential as bioherbicides for weeds in the genus Amaranthus.
- Authors:
- Bonta, J. V.
- Owens, L. B.
- Shipitalo, M. J.
- Rogers, S.
- Source: Journal of Environmental Quality
- Volume: 40
- Issue: 1
- Year: 2011
- Summary: Winter application of manure poses environmental risks. Seven instrumented, continuous-corn watersheds (approximately 1 ha each) at the USDA-ARS North Appalachian Experimental Watershed research station near Coshocton, Ohio, were used to evaluate the environmental impacts of winter manure application under some of the Ohio Natural Resources Conservation Service recommendations. For 3 yr, on frozen, sometimes snow-covered ground in January or February, two watersheds received turkey litter, two received liquid swine manure, and three were controls that received N fertilizer at planting rather than manure. Manure was applied at an N rate for corn; the target level was 180 kg N ha^-1, with a 30-m setback from the application area to the bottom of each watershed. Four grassed plots (61 x 12 m) were used for beef slurry application (9.1 Mg ha^-1 wet weight); two plots had 61 x 12 m grassed filter areas below them, and two had 30 x 12 m filter areas. There were also two control plots. Nutrient concentrations were sometimes high, especially in runoff soon after application. However, most events with high concentrations occurred with low flow volumes; therefore, transport was minimal. Applying manure at the N rate for crop needs resulted in excess application of P, and the elevated P losses contributed to a greater potential for detrimental environmental impacts with P than with N. Filter strips reduced nutrient concentrations and transport, but the data were too limited to compare the effectiveness of the 30- and 61-m filter strips. Winter application of manure is not ideal, but by following prescribed guidelines, detrimental environmental impacts can be reduced.
- Authors:
- Shipitalo, M. J.
- Owens, L. B.
- Source: Agriculture, Ecosystems & Environment
- Volume: 141
- Issue: 1-2
- Year: 2011
- Summary: With the current emphasis on the role of carbon in the environment, agricultural systems and their impacts on the carbon cycle are important parts of the overall issue. Organic carbon lost to streams and rivers can promote bacterial production and microbial respiration of CO2 to the atmosphere. Although pastures and grasslands are major land uses in the humid U.S., row crop agriculture has received most of the carbon research focus. The objective of this study at the North Appalachian Experimental Watershed near Coshocton, Ohio, was to assess organic carbon transported from a pasture system, particularly on a runoff event basis. A beef cow-calf herd rotationally grazed a paddock during the growing season and was fed hay in this paddock during the dormant season (November-April). Surface runoff and sediment loss were measured and sampled throughout the year from the small watershed in the paddock. Most of the sediment samples were collected during the dormant season. With continuous winter occupancy, the percent vegetative cover was often low, and sediment losses from the watershed in the winter feeding area were often ≥10 kg ha^-1 per event. The six largest events carried nearly 50% of the total sediment and sediment-attached C lost during this period. Annual losses of sediment and sediment-C varied considerably but averaged 2642 and 140 kg ha^-1, respectively. There was no significant correlation between the amount of sediment transported during individual events and the C concentration of the associated sediment. The pasture sediments had a C enrichment ratio of 1.2-1.5 compared with the 0-2.5 cm soil layer, and pasture sediment-C concentrations were more than twice the C concentrations of sediments from nearby row crop watersheds.