• Authors:
    • Armstrong, R.
    • Nuttall, J.
  • Source: Australian Journal of Soil Research
  • Volume: 48
  • Issue: 2
  • Year: 2010
  • Summary: Subsoil physicochemical constraints can limit crop production on alkaline soils of south-eastern Australia. Fifteen farmer paddocks sown to a range of crops including canola, lentil, wheat, and barley in the Wimmera and Mallee of Victoria and the mid-north and Eyre Peninsula of South Australia were monitored from 2003 to 2006 to define the relationship between key abiotic/edaphic factors and crop growth. The soils were a combination of Calcarosol and Vertosol profiles, most of which had saline and sodic subsoils. There were significant correlations between ECe and Cl⁻ (r = 0.90), ESP and B (r = 0.82), ESP and ECe (r = 0.79), and ESP and Cl⁻ (r = 0.73). The seasons monitored had dry pre-cropping conditions and large variations in spring rainfall in the period around flowering. At sowing, the available soil water to a depth of 1.2 m (θa) averaged 3 mm for paddocks sown to lentils, 28 mm for barley, 44 mm for wheat, and 92 mm for canola. Subsoil constraints affected canola and lentil crops but not wheat or barley. For lentil crops, yield variation was largely explained by growing season rainfall (GSR) and θa in the shallow subsoil (0.10-0.60 m). Salinity in this soil layer affected lentil crops through reduced water extraction and decreased yields where ECe exceeded 2.2 dS/m. For canola crops, GSR and θa in the shallow (0.10-0.60 m) and deep (0.60-1.20 m) layers were important factors explaining yield variation. Sodicity (measured as ESP) in the deep subsoil (0.80-1.00 m) reduced canola growth where ESP exceeded 16%, corresponding to a 500 kg/ha yield penalty. For cereal crops, rainfall in the month around anthesis was the most important factor explaining grain yield, due to the large variation in rainfall during October combined with the determinate nature of these crops. For wheat, θa in the shallow subsoil (0.10-0.60 m) at sowing was also an important factor explaining yield variation. Subsoil constraints had no impact on cereal yield in this study, which is attributed to the lack of available soil water at depth and the crops' tolerance of the physicochemical conditions encountered in the shallow subsoil, where plant-available water was more likely to occur. Continuing dry seasonal conditions may mean that the opportunity to recharge soil water in the deeper subsoil under continuous cropping systems is increasingly remote. Constraints in the deep subsoil are therefore likely to have reduced impact on cereals under these conditions, and it is the management of water supply, from GSR and accrued soil water, in the shallow subsoil that will be increasingly critical in determining crop yields in the future.
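The two constraint thresholds reported above can be expressed as simple decision rules. A minimal sketch: the step form for the canola penalty is my reading of the summary (a threshold plus a single penalty figure), not the paper's fitted response surface.

```python
def canola_sodicity_penalty(esp_deep):
    """Yield penalty (kg/ha) from deep-subsoil (0.80-1.00 m) sodicity.

    Per the reported result, canola growth was reduced where ESP exceeded
    16%, corresponding to a 500 kg/ha yield penalty. Modelled here as a
    simple step; the response shape between 16% and higher ESP values is
    an assumption, not taken from the study.
    """
    return 500.0 if esp_deep > 16.0 else 0.0

def lentil_salinity_limited(ece_shallow):
    """True where shallow-subsoil (0.10-0.60 m) salinity exceeds the
    2.2 dS/m level at which lentil yields declined through reduced
    water extraction."""
    return ece_shallow > 2.2

print(canola_sodicity_penalty(20.0), lentil_salinity_limited(3.0))
```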
  • Authors:
    • Weeks, C.
    • Robertson, M.
    • Oliver, Y.
  • Source: Agricultural Water Management
  • Volume: 98
  • Issue: 2
  • Year: 2010
  • Summary: The practice of long fallowing, by omitting a year of cropping, is gaining renewed focus in the low rainfall zone of the northern agricultural region of Western Australia. The impetus behind this practice change has been a reduced use of pasture breaks in cereal crop rotations, and the belief that a fallow can improve soil water accumulation and thus buffer the negative effects of dry seasons on crop yields. We evaluated the benefits of long fallowing (full stubble retention, no weed growth allowed) in a continuous wheat sequence via simulation modelling with APSIM at two rainfall locations and five soil types. The simulated benefits of long fallowing were attributable to soil water accumulation only, as the effects on soil nitrogen, diseases or weeds were not evaluated. The long-term (100 years) mean wheat yield benefit from fallowing was 0.36-0.43 t/ha in clay, 0.20-0.23 t/ha in sand and loam, and 0-0.03 t/ha in shallow sand and shallow loam. Over the range of seasons simulated, the response varied from -0.20 to 3.87 t/ha in the clay and -0.48 to 2.0 t/ha for the other soils. The accumulation of soil water and associated yield benefits occurred in 30-40% of years on better soils and only 10-20% on poorer soils. For the loam soil, the majority of the yield increases occurred when the growing-season (May-September) rainfall following the fallow was low (30 mm), although yield increases did occur with other combinations of growing-season rainfall and soil water. Over several years of a crop sequence involving fallow and wheat, the benefits from long fallowing due to greater soil water accumulation did not offset the yield lost from omitting years from crop production, although the coefficient of variation for inter-annual farm grain production was reduced, particularly on clay soils during the 1998-2007 decade of below-average rainfall. We conclude that under a future drying climate in Western Australia, fallowing may have a role to play in buffering the effects of enhanced inter-annual variability in rainfall. Investigations are required on the management of fallows and of subsequent crops (i.e. earlier sowing and crop density) so as to maximise yield benefits to subsequent crops while maintaining groundcover to prevent soil erosion.
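The stability benefit reported above rests on the coefficient of variation, CV = standard deviation / mean. A minimal sketch with made-up yield series (these are illustrative numbers, not APSIM outputs from the study):

```python
import statistics

def cv(values):
    """Coefficient of variation: inter-annual variability relative to the mean."""
    return statistics.pstdev(values) / statistics.mean(values)

# Hypothetical farm grain production (t/ha) over six seasons. The
# fallow-wheat sequence sacrifices some mean production for steadiness,
# mirroring the trade-off the study describes.
continuous_wheat = [1.8, 0.4, 2.1, 0.3, 1.9, 0.5]
fallow_wheat = [1.2, 0.9, 1.4, 0.8, 1.3, 0.9]

assert statistics.mean(fallow_wheat) < statistics.mean(continuous_wheat)
assert cv(fallow_wheat) < cv(continuous_wheat)
print(round(cv(continuous_wheat), 2), round(cv(fallow_wheat), 2))
```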
  • Authors:
    • Kidd, C.
    • Ruchs, C.
    • D'Antuono, M.
    • Rayner, B.
    • Peirce, J.
    • Reeves, A.
  • Source: Plant Protection Quarterly
  • Volume: 25
  • Issue: 1
  • Year: 2010
  • Summary: Skeleton weed is under an eradication program in Western Australia. There was concern that, should the weed establish over large areas of the sandier soils in the cereal growing areas of Western Australia, the treatments to eradicate or control it would affect cropping rotations. This is because the persistent herbicides clopyralid and picloram used for skeleton weed control would suppress lupins and other legumes, which are a major part of cropping rotations in Western Australia. A cropping rotation experiment was established during 2002 in South Australia in an area heavily infested with skeleton weed. For six years crops were grown in a continuous rotation which included lupins in 2004 and 2006. Regular use of the herbicides clopyralid and picloram in the cereal phase and clopyralid as a pre-sowing application in the legume phase significantly reduced skeleton weed density without any deleterious impact on narrow-leaf lupins (Lupinus angustifolius).
  • Authors:
    • Brennan, J. P.
    • Murray, G. M.
  • Source: Australasian Plant Pathology
  • Volume: 39
  • Issue: 1
  • Year: 2010
  • Summary: The incidence, severity and yield loss caused by 40 pathogens associated with 41 diseases of barley were assessed from a survey of 15 barley pathologists covering the winter cereal growing areas of Australia. The survey provided data on the frequency of years that each pathogen developed to its maximum extent, the proportion of the crop then affected in each growing area, and the yield loss that resulted in the affected crops with and without current control measures. These data were combined with crop production and grain quality data to estimate the value of the losses aggregated to the Northern, Southern and Western production regions. Pathogens were estimated to cause a current average loss of $252 million/year, or 19.6% of the average annual value of the barley crop, in the decade 1998-99 to 2007-08. Nationally, the three most important pathogens are Pyrenophora teres f. maculata, Blumeria graminis f. sp. hordei and Heterodera avenae, with current average annual losses of $43 million, $39 million and $26 million, respectively. If current controls were not used, losses would be far higher, with potential average annual losses from the three most important pathogens, P. teres f. maculata, H. avenae and P. teres f. teres, being $192 million, $153 million and $117 million, respectively. The average value of control practices exceeded $50 million/year for nine pathogens. Cultural methods (rotation, field preparation) were the only controls used for 14 pathogens and contributed more than 50% of the control for a further 13 pathogens. Breeding and the use of resistant cultivars contributed more than 50% of control for five pathogens, and pesticides for four pathogens. The relative importance of pathogens varied between regions and zones.
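The survey method combines, for each pathogen, the frequency of bad years, the proportion of crop affected, and the yield loss in affected crops, scaled by crop value. A minimal sketch of that multiplicative structure; the parameter values are hypothetical, and the published figures also draw on region-level production and grain quality data not reproduced here.

```python
def average_annual_loss(crop_value, freq_years, prop_affected, yield_loss_frac):
    """Average annual loss ($/year) attributable to one pathogen.

    crop_value:      average annual value of the crop ($/year)
    freq_years:      fraction of years the pathogen reaches its maximum extent
    prop_affected:   proportion of the crop affected in those years
    yield_loss_frac: fractional yield loss in affected crops

    A simplified reading of the survey method, for illustration only.
    """
    return crop_value * freq_years * prop_affected * yield_loss_frac

# The paper puts total losses at 19.6% of average annual crop value,
# implying a crop value near $1.29e9 ($252e6 / 0.196). The pathogen
# parameters below are invented.
loss = average_annual_loss(crop_value=1.29e9, freq_years=0.5,
                           prop_affected=0.4, yield_loss_frac=0.1)
print(loss)
```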
  • Authors:
    • Bathgate, A.
    • Lawes, R. A.
    • Robertson, M. J.
    • Byrne, F.
    • White, P.
    • Sands, R.
  • Source: Crop and Pasture Science
  • Volume: 61
  • Issue: 3
  • Year: 2010
  • Summary: Break crops (e.g. pulses, lupins, canola, oats) underpin the continued profitability of cereal (wheat or barley) based cropping sequences. The area sown on farms to break crops varies widely across geographical regions according to climate, soil type mix, enterprise mix (crop v. livestock), and other constraints such as the prevalence of soil-borne disease. Given recent fluctuations in the area of established break crops in Western Australia, there are concerns about their long-term prospects in the farming system. A survey of the area and grain yield of break crops on-farm was combined with whole-farm bio-economic modelling to determine the upper limit to the area of break crops on representative farms in 4 agro-climatic regions. Sensitivity analysis was conducted to ascertain the potential effects of varying commodity prices (sheep and grain), costs of production, and assumptions on the yield of break crops and the boost to the yield of following cereals. The survey revealed that the two dominant break crops, lupins and canola, occupied 8-12% and 8-9%, respectively, of farm area on those farms that grew them in the medium-rainfall zone, and this declined to 6-8% and 7-10% in the drier region. Nevertheless, the modelling results show that break crops are an important component of the farming system, even where the area is small, and the response of whole-farm profit to the percentage of the farm allocated to break crops is relatively flat near the optimum of 23-38%. The modelled area of break crops at maximum profit is higher than that found in farm surveys. The discrepancy could possibly be explained by the lower break crop yields realised by farmers and a smaller boost to cereal yields following break crops than assumed in the models. Also, deterministic models do not account for risk, which is an important consideration in the decision to grow break crops. However, the yield difference does not explain the discrepancy entirely and raises questions about farmer motivations for adoption of break crops. The scope for an increased area of break crops beyond 23-38% of the farm is limited, even with increases in the yield enhancement of subsequent cereal crops, higher break crop prices, and higher fertiliser costs. Further research is required to better quantify the costs and benefits of break crops in Western Australian farming systems.
  • Authors:
    • Rogers, D. J.
    • Brier, H. B.
  • Source: Crop Protection
  • Volume: 29
  • Issue: 1
  • Year: 2010
  • Summary: The response of vegetative soybean (Glycine max) to Helicoverpa armigera feeding was studied in irrigated field cages over three years in eastern Australia to determine the relationship between larval density and yield loss, and to develop economic injury levels. Rather than using artificial defoliation techniques, plants were infested with either eggs or larvae of H. armigera, and larvae were allowed to feed until death or pupation. Larvae were counted and sized regularly, and infestation intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the two experiments where yield loss occurred, the upper threshold for zero yield loss was 7.51 ± 0.21 HIEs and 6.43 ± 1.08 HIEs, respectively. In the third experiment, infestation intensity was lower and no loss of seed yield was detected up to 7.0 HIEs. The rate of yield loss/HIE beyond the zero-yield-loss threshold varied between Experiments 1 and 2 (-9.44 ± 0.80 g and -23.17 ± 3.18 g, respectively). H. armigera infestation also affected plant height and various yield components (including pod and seed numbers and seeds/pod) but did not affect seed size in any experiment. Leaf area loss of plants averaged 841 and 1025 cm²/larva in the two experiments, compared to 214 and 302 cm²/larva for cohort larvae feeding on detached leaves at the same time, making clear that artificial defoliation techniques are unsuitable for determining H. armigera economic injury levels on vegetative soybean. Analysis of canopy leaf area and pod profiles indicated that leaf and pod loss occurred from the top of the plant downwards. However, there was an increase in pod numbers closer to the ground at higher pest densities as the plant attempted to compensate for damage. Defoliation at the damage threshold was 18.6 and 28.0% in Experiments 1 and 2, indicating that yield loss from H. armigera feeding occurred at much lower levels of defoliation than previously indicated by artificial defoliation studies. Based on these results, the economic injury level for H. armigera on vegetative soybean is approximately 7.3 HIEs/row-metre in 91 cm rows or 8.0 HIEs/m².
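The yield-loss relationship described above is piecewise: no loss up to a zero-loss threshold of HIEs, then a constant loss per additional HIE. A sketch using the Experiment 1 means (threshold 7.51 HIEs, 9.44 g lost per HIE beyond it); standard errors are omitted, and the piecewise-linear form is the one implied by the summary.

```python
def yield_loss_g(hies, threshold=7.51, loss_per_hie=9.44):
    """Seed yield loss (g) from Helicoverpa injury equivalents (HIEs).

    No loss below the zero-yield-loss threshold; beyond it, loss accrues
    linearly per HIE. Defaults are the Experiment 1 means from the study.
    """
    return max(0.0, hies - threshold) * loss_per_hie

# Below the threshold there is no detectable loss; above it, each extra
# HIE costs ~9.44 g of seed yield.
print(yield_loss_g(5.0), yield_loss_g(10.0))
```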
  • Authors:
    • Dougall, A.
    • Halpin, N. V.
    • Stirling, G. R.
    • Bell, M. J.
  • Source: Proceedings of the 2010 Conference of the Australian Society of Sugar Cane Technologists held at Bundaberg, Queensland, Australia, 11-14 May 2010
  • Year: 2010
  • Summary: Lesion nematode (Pratylenchus zeae) occurs in almost every sugarcane field in Queensland and is perhaps the most important of a community of nematode pests that cost the Australian sugar industry an estimated $82 million/annum in lost production. Legumes such as soybean and peanut are relatively poor hosts of the nematode and, when they are used as rotation crops in the sugarcane farming system, populations of P. zeae are markedly reduced. This paper provides data on the host status of other rotation crops that might have a place in the sugarcane farming system, together with some common weeds. The capacity of P. zeae to multiply on various plants was assessed after 70 days in pots at temperatures suitable for nematode reproduction, with multiplication factors calculated as Pf/Pi, where Pf was the final nematode population density and Pi the initial inoculum density. Sugarcane and forage sorghum had the highest multiplication factors (Pf/Pi > 40), whereas the nematode population on most other plants increased 5 to 13 times. Some cultivars of wheat, oats and Rhodes grass had multiplication factors of only 3 or 4, and three crops (Setaria cv. Splenda, barley cv. Grimmett and cowpea cv. Red Caloona) were non-hosts (Pf/Pi
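Host status in the pot assay is ranked by the multiplication factor Pf/Pi. A minimal sketch of that calculation, with classification bands inferred from the summary's groupings (the band boundaries are my reading of the text, not the paper's formal definitions):

```python
def multiplication_factor(pf, pi):
    """Nematode multiplication factor: final population density (Pf)
    divided by initial inoculum density (Pi)."""
    return pf / pi

def host_status(mf):
    """Classify host status from Pf/Pi. Bands inferred from the summary:
    >40 strong host (sugarcane, forage sorghum), 5-13 moderate host,
    >1 but below 5 poor host (e.g. some wheat, oats, Rhodes grass
    cultivars at 3-4), and no net increase for non-hosts."""
    if mf > 40:
        return "strong host"
    if mf >= 5:
        return "moderate host"
    if mf > 1:
        return "poor host"
    return "non-host"

print(host_status(multiplication_factor(4500, 100)))
```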
  • Authors:
    • Hoffmann, A. A.
    • Umina, P. A.
    • Weeks, A. R.
    • Arthur, A. L.
  • Source: Experimental and Applied Acarology
  • Volume: 52
  • Issue: 2
  • Year: 2010
  • Summary: Balaustium medicagoense and Bryobia spp. have recently been identified as emerging pests of winter crops and pastures in Australia. These mites have a high natural tolerance to currently registered pesticides, highlighting the need to develop alternative control strategies such as cultural controls which require an understanding of plant associations. In shade-house experiments, Bryobia spp. survived and reproduced successfully on pasture, lupins and oats, but progeny failed to reach the adult stage on canola and wheat. Balaustium medicagoense progeny failed to produce a generation on any crop but parental adults survived a few months on all crops, particularly wheat. Bryobia spp. damaged canola, pasture and lupins, but caused minimal damage to oats and wheat, whereas Ba. medicagoense caused considerable damage to wheat and lupins, but only moderate damage to canola, oats and pasture. Field survey data, taken from approximately 450 sites across southern Australia, combined with analysis of historical pest reports, suggest broadleaf crops such as canola, lucerne, lupins and weeds appear particularly susceptible to attack by Bryobia species. Balaustium medicagoense was more commonly found on cereals and grasses, although they also attacked broadleaf crops, particularly canola, lucerne and lupins. These findings show that the mites have the potential to be an important pest on several winter grain crops and pasture, but there are important differences that can assist in management strategies such as targeted crop rotations.
  • Authors:
    • Lawson, A. R.
    • Greenwood, K. L.
    • Kelly, K. B.
  • Source: Agricultural Water Management
  • Volume: 96
  • Issue: 5
  • Year: 2009
  • Summary: Knowledge of the components of the water balance - evaporation, transpiration and deep drainage - would be beneficial for targeting productivity improvements for irrigated forages in northern Victoria. We aimed to estimate these components using a simple water balance and the dual crop coefficients provided in FAO-56. Soil water deficits from a field experiment, comparing the water use of six border-check and one spray irrigated forage system, agreed well with the modelled values, except for alfalfa where irrigation intake was restricted. About 85% of the water applied to perennial forages (perennial ryegrass/white clover, tall fescue/white clover and alfalfa) was used for transpiration, 10% for evaporation and 5% was lost as drainage below the root zone. Evaporation was highest from the double-cropped (oats/millet) system (30%) and was 5-25% of the water used by winter-growing annual pastures (Persian clover/Italian ryegrass and both border-check and spray irrigated subterranean clover/Italian ryegrass). The high proportion of water used as transpiration by the perennial forages was due to their high ground cover maintained throughout the year. When compared over similar seasonal conditions, actively growing forages used similar amounts of water, indicating that any increases in water productivity will be mainly due to higher production and/or to matching the growing season of the forage to periods of lower potential evapotranspiration.
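The FAO-56 dual crop coefficient approach referred to above splits crop evapotranspiration into a transpiration component (basal coefficient Kcb) and a soil evaporation component (Ke): ETc = (Kcb + Ke) × ET0. A minimal sketch; the coefficient values below are illustrative, not the ones fitted in the study.

```python
def et_dual(et0, kcb, ke):
    """FAO-56 dual crop coefficient: split crop evapotranspiration into
    transpiration (Kcb * ET0) and soil evaporation (Ke * ET0) components.

    et0: reference evapotranspiration (mm/day)
    kcb: basal crop coefficient (transpiration)
    ke:  soil evaporation coefficient
    """
    transpiration = kcb * et0
    evaporation = ke * et0
    return transpiration, evaporation

# Illustrative mid-season day for a dense perennial pasture: ET0 = 6.0
# mm/day, Kcb = 1.0 (full cover), Ke = 0.05 (shaded, dry soil surface).
# High ground cover keeps the evaporation share small, consistent with
# the ~85% transpiration share the study reports for perennial forages.
t, e = et_dual(6.0, 1.0, 0.05)
print(t, e, e / (t + e))
```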
  • Authors:
    • Cameron, C.
    • Kearney, G.
    • Dowling, P.
    • Quigley, P.
    • Cousens, R.
    • Chapman, D.
    • Tozer, K.
  • Source: Crop and Pasture Science
  • Volume: 60
  • Issue: 11
  • Year: 2009
  • Summary: A field experiment was established in a southern Australian temperate pasture to investigate the effects of identity and proximity of perennial grasses on the demography of the annual grasses Vulpia spp. (V. myuros, V. bromoides) and Hordeum leporinum (barley grass). Annual grasses were grown either alone or in mixtures, at different distances from rows of Dactylis glomerata (cocksfoot) and Phalaris aquatica (phalaris). Dactylis had a greater suppressive effect than Phalaris on Vulpia and Hordeum. Biomass, tiller production, and panicle production of annual grasses increased linearly with increasing distance from the perennial row. Tiller and panicle production were greater for Vulpia than Hordeum. The estimated rate of population growth (lambda) for annual grasses was greater in Phalaris than in Dactylis and in Vulpia than in Hordeum, and increased with sowing distance from perennial grass rows. It was estimated that lambda, when seeds were sown directly adjacent to a row of perennial grasses, was 1 and 0.4 for Vulpia and Hordeum, respectively, within Dactylis stands, and 7 and 3, respectively, within Phalaris stands. However, 15 cm from the row, lambda reached 50 and 39 for Vulpia and Hordeum, respectively, within Phalaris stands, and 39 and 16, respectively, within Dactylis stands. In grazed, dryland pastures, perennial competition alone is therefore unlikely to prevent population growth of annual grasses, especially in systems heavily disturbed by grazing or drought. However, Dactylis showed more promise than Phalaris in limiting the abundance of these weeds.
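Lambda here is the finite (annual) rate of population increase, so values below 1 imply decline and values above 1 imply growth. Projecting populations as N(t+1) = lambda × N(t) with two of the reported values makes the contrast concrete; the starting population is an arbitrary illustrative figure.

```python
def project(n0, lam, years):
    """Project an annual population forward: N(t+1) = lambda * N(t)."""
    n = n0
    for _ in range(years):
        n *= lam
    return n

# Hordeum sown directly adjacent to a Dactylis row (lambda = 0.4):
# the population collapses within a few seasons.
adjacent = project(100, 0.4, 3)

# Hordeum 15 cm from a Dactylis row (lambda = 16): rapid increase,
# illustrating why perennial competition alone rarely contains these weeds.
nearby = project(100, 16, 3)

print(adjacent, nearby)
```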