Posted on 08/06/2015 by Jim Orson
A glance at the recent agricultural press would suggest that arbuscular mycorrhizae (AM; some say the plural is spelt mycorrhizas) are the sole basis of soil health. One article even argues that oilseed rape is damaging soil structure because it does not form an association with these soil fungi.
Let us start with a couple of definitions. Mycorrhizae are defined in the web version of the Oxford Dictionary as fungi which grow in association with the roots of a plant in a symbiotic or mildly pathogenic relationship. AM are those mycorrhizae that penetrate the outer layers of plant roots. They have a symbiotic relationship with plants, helping with nitrogen and phosphate uptake in exchange for some of the products of photosynthesis. However, under severe stress conditions these fungi switch into self-survival mode and primarily look after themselves at the expense of the plant.
The other benefit of AM is that they produce a protein, known as glomalin, which helps to bind soil particles into aggregates that make the soil easier to cultivate and able to form resilient seedbeds.
Some crops potentially have very good associations with AM, such as oats, barley and the legumes. Some crops do not form associations with AM, notably the brassicas including oilseed rape. Hence, the accusation that oilseed rape is damaging to soil structure.
It is this latter claim, which appears to me to overplay the properties of AM, that prompted me to contact Professor Penny Hirsch of Rothamsted Research. This was also partly because I remember being told at university that a high soil status of nitrates and/or phosphate restricts the abundance of AM.
Prof. Hirsch confirmed that a high nitrogen and/or phosphate status of the soil restricts the abundance of AM and she made the intriguing comment that modern wheats have largely lost the ability to benefit from AM compared to wheats bred before the 1950s. Perhaps the ability to form an association with and potentially benefit from AM has been lost because modern breeding programmes have been carried out in a background of high levels of nitrogen use.
So wheat and oilseed rape rotations using high levels of nitrogen and with a good phosphate supply are perhaps not the best for encouraging the abundance of AM. Despite this, a thirty-year-plus trial at NIAB TAG Morley, in almost continuous wheat, shows that current rates of nitrogen produce more straw dry matter, and presumably more root dry matter, than zero nitrogen application, leading to a slight increase in soil organic matter. However, despite only a small shift in soil organic matter, there is a significant improvement in soil aggregate stability where current rates of nitrogen have been used. This perhaps demonstrates that other soil microflora, which also produce glomalin, have responded to the higher annual incorporation of straw and roots.
It is widely reported in the scientific literature that high levels of AM are beneficial to plant nutrition only where the soil supply of nitrates and/or phosphate is limited. In our conventional systems the supply of these two nutrients is not limited and so the need for assistance from AM is less important.
It seems to me that whilst high levels of soil AM have theoretical advantages, the way we fertilise our crops in conventional agriculture largely or totally negates their ability to improve plant nutrition. The claimed advantage that they produce glomalin is also largely or totally neutralised by the fact that other microflora also form this protein structure that binds soil aggregates. I am an enthusiast for the regular application of organic material to the soil to increase soil microbial biomass but the current enthusiasm specifically to encourage AM seems to me to be overblown.
Posted on 28/05/2015 by Jim Orson
Black-grass heads are now appearing above wheat crops. Perhaps it is too early to estimate the levels of control achieved this year, but it is getting very close to the end of the window for patch spraying with glyphosate in order to prevent the setting of viable seed. In my opinion, based on field experience and limited experimentation, some of the final dates for this operation suggested in the press are far too late. It is also important to note the downside of this approach: the crop in the patches is also killed, so there is no green matter keeping the soil dry if there are meaningful summer rains. This can lead to complications in post-harvest cultivations.
As the ability to control weeds with herbicides reduces, the need to be pro-active and adopt other means of control increases. No opportunity should be ignored. This spring, I noted a large field of oilseed rape where there was charlock in a small area only. It might have taken a couple of hours to hand pull the plants and put them in a bag but it would perhaps have prevented a costly problem in the future. Similarly, I have seen very small patches of black-grass which would take only a few minutes to hand pull but have been ignored. This type of activity goes against the grain for many who have become accustomed to large scale control using herbicides but the current reality means such opportunities have to be taken.
Whilst singing the praises of the opportunist hand-pulling of weeds, it is important to remember that although it is often assumed to be 100% effective, the outcome can be disappointing unless it is done diligently. In the days of hand-roguing wild-oats, the practice was estimated to reduce seed shed by around 85%, and it was far more effective if the fields were re-walked a few days later.
There is now the challenge of increased resistance of septoria to the triazoles. This results in all kinds of issues, including less reliance on the eradicant properties of fungicides and adopting programmes that delay resistance to the SDHIs for as long as possible. It also creates the dilemma of how much fungicide to use, and of making sure that the time gap between applications is not too long. Currently, farmers are not holding back on fungicide use despite the low price of wheat. On the other hand, there are seed houses claiming that some varieties require less fungicide.
When the triazoles were providing effective control of septoria, there was a real opportunity to exploit the value of good varietal resistance through lower doses. The more resistant varieties only required two thirds of the dose normally applied to the most susceptible varieties. I suspect that in most cases this potential saving was ignored on the grounds of keeping it (too) simple or the “just in case” principle.
Nowadays, I suspect that the reason for using the more disease resistant varieties is largely to do with security of production in a difficult spraying season rather than a significant saving on inputs. There is a big financial risk to ‘getting it wrong’ when planning fungicide programmes. In high disease years, the yield penalties from using lower input programmes can be very significant compared with small potential savings in low disease years which are difficult to predict. I am sure NIAB TAG members are well aware of the options through the information they receive.
As always, it is well worth farmers and advisers keeping an eye out for all opportunities to save money in the short and/or medium term. This means being well informed and occasionally going back to practices that were commonplace a few decades ago.
Posted on 13/05/2015 by Jim Orson
The conditions that cause the weather pattern known as El Nino have been forming in the middle and eastern parts of the equatorial Pacific Ocean, where the temperature of the surface layers is significantly higher than normal. It will have a large impact on weather patterns across the globe, particularly at the end of the year and in the first few months of 2016. However, the impact of each El Nino is different because of other factors that influence our weather. For instance, of the 26 recorded El Ninos, 17 have caused drought in Australia, but on some occasions an El Nino has been associated with very wet weather in that country.
El Nino years have also been associated with increased rainfall in parts of South America, less rainfall in India, less hurricane activity in the Atlantic but more typhoons in the Sea of Japan. Closer to home, they have been associated with colder winters in Northern Europe. The El Nino in 2009 is said to have been a cause of the very cold weather in the UK at the end of that year and in the spring of 2010.
The way it is formed explains why it occurs in a cycle of at least two years. Currents normally bring cold water to the surface layers of the equatorial Pacific Ocean, a process called ‘upwelling’. In years when the winds are weak, less cold water is brought to these surface layers and consequently they warm, causing an El Nino. There is then a ‘feedback loop’ of at least two years because the warmer surface layers of the Pacific cause increased wind speeds that bring more cold water to the surface layers. The cooling phase lasts far longer than the warming phase. See the excellent Met. Office video on the causes of El Nino (https://www.youtube.com/watch?v=WPA-KpldDVc).
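The warming-and-delayed-cooling loop described above is often illustrated in the literature with a "delayed oscillator" toy model, in which the warm anomaly feeds back on itself while cold water upwelled earlier returns to damp it after a delay. Below is a minimal sketch of that idea in Python. The equation follows the form of the classic Suarez and Schopf delayed-oscillator; the parameter values are purely illustrative and are not anything used by the Met. Office.

```python
# Delayed-oscillator toy model of the El Nino feedback loop.
# T is the sea-surface temperature anomaly: the first two terms
# give self-reinforcing (but saturating) warming, and the delayed
# term represents cold water returned to the surface later.
# All parameter values are illustrative only.

def delayed_oscillator(alpha=0.75, delay=2.0, dt=0.01, t_end=50.0):
    steps = int(t_end / dt)
    lag = int(delay / dt)
    T = [0.1] * (lag + 1)  # start from a small warm anomaly
    for n in range(lag, lag + steps):
        # dT/dt = T - T^3 - alpha * T(t - delay)
        dT = T[n] - T[n] ** 3 - alpha * T[n - lag]
        T.append(T[n] + dt * dT)
    return T

anomaly = delayed_oscillator()
print(max(anomaly), min(anomaly))
```

The small initial anomaly grows, and the delayed term then works against it; lengthening the delay stretches the timescale of the cycle, which is one way of seeing why the whole sequence spans more than a single year.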
There have been studies on the impact of El Ninos on global food production, notably the Nature Communications paper “Impacts of El Niño Southern Oscillation on the global yields of major crops” (http://www.nature.com/ncomms/2014/140515/ncomms4712/full/ncomms4712.html).
Overall, the results of the study suggest that El Niño improves the global mean soybean yield by 2.1 to 5.4% but appears to change the global yields of maize, rice and wheat by between −4.3 and +0.8%. The authors conclude that the results of their studies can lead to farmers adapting crop choice and management in order to minimise the impact of El Ninos. This is brave stuff bearing in mind that no two El Nino events are ever the same.
El Nino is also known as the Southern Oscillation. There is also a North Atlantic Oscillation which, unlike El Nino, is a largely atmospheric phenomenon rather than one based on sea temperatures. It contributes to weather fluctuations in the North Atlantic and the surrounding maritime climates, such as the UK's. However, it does not have the global impact of El Nino.
A few weeks back I attended a lecture on El Nino in the Department of Mathematics at the University of Cambridge. The lecturer, who was from the department but also worked for the Met. Office, was very ‘kind’ to the audience as he explained the mathematical basis of El Nino in simple terms. While he was not a modelling zealot, he did outline a simple model to illustrate the phenomenon. This explained why the Atlantic and Indian Oceans’ water temperatures had little effect on global climates. El Nino is far more influential because of the size of the equatorial Pacific.
So what can UK farmers do to reduce the impact of El Nino? Higher soya yields possibly mean lower oilseed rape prices and early drilling of wheat is likely to be more appropriate should next winter’s weather be cooler than average. But do not count on it!
Posted on 30/04/2015 by Jim Orson
There is an aspect of black-grass cultural control practice with which I would like to take issue: the notion that multiple applications of glyphosate are needed in the autumn to maximise the emergence of black-grass plants and hence maximise seed loss prior to sowing the following winter crop. This rests on the unsubstantiated theory that allelopathy explains the observation of an apparently higher total emergence of black-grass when two or more applications of glyphosate are made to a false seedbed to kill newly emerged black-grass, rather than a single application just before sowing the subsequent autumn-sown crop. In my opinion there is a much more likely explanation than allelopathy.
The Oxford Dictionary defines allelopathy as “the chemical inhibition of one plant (or other organism) by another, due to the release into the environment of substances acting as germination or growth inhibitors”. In this case it is suggested that exudates from the roots of newly emerged black-grass prevent the germination of neighbouring viable black-grass seeds in the soil. Hence, it is argued, allelopathy has to be avoided to ensure maximum germination of black-grass seed, thereby maximising seed losses before sowing the following winter crop. However, it should be noted that no root exudates that damage the germination or growth of black-grass or other species have ever been identified in black-grass, although more evidence is needed to discount the claim of allelopathy entirely.
There are data in HGCA Project Report 381 (http://archive.hgca.com/publications/documents/cropresearch/PR381_Final_Project_Report.pdf) on the impact of cultivations and dormancy on black-grass populations that indirectly suggest allelopathy is not the explanation for the apparently higher total emergence of black-grass when several applications of glyphosate, rather than a single application, are made to a false seedbed. No part of the report covers this specific aspect, and some work has to be done to extract the relevant data. Luckily for you, I have attempted to do this work! In this project there were two dates of drilling of winter wheat, but for both drilling dates no glyphosate was applied after harvest until a single application just prior to sowing. Therefore, the difference in the number of black-grass plants counted at the time of glyphosate application indicates the increase in the size of the black-grass population between the two drilling dates.
The first diagram shows on the horizontal axis the number of black-grass seedlings just before the single glyphosate application made prior to the first drilling date of winter wheat. You will note that there were three cultivation systems: cultivation immediately after harvest of the preceding winter wheat (in the project a Simba Solo was used), cultivation immediately before drilling (with the glyphosate applied immediately before the cultivation) and direct drilling. The vertical axis shows how much additional black-grass emerged prior to the delayed drilling, expressed as a proportion of the initial count. So a value of two on the vertical scale means that twice as many plants had emerged before the single glyphosate application prior to the late drilling as had emerged before the application prior to the early drilling.
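To be clear about the arithmetic behind the vertical axis, the multiplication factor can be written down directly. The counts below are invented for illustration and are not values from the report.

```python
def multiplication_factor(count_before_early, count_before_late):
    """Black-grass emerged before the late-drilling glyphosate
    application, expressed as a multiple of the number emerged
    before the early-drilling application."""
    return count_before_late / count_before_early

# Illustrative numbers only: 60 plants/m2 emerged before the early
# drilling and 120 plants/m2 before the late drilling give a factor
# of two, i.e. double the number of plants had emerged.
print(multiplication_factor(60, 120))  # -> 2.0
```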
This first diagram agrees with the field observations that early emerged black-grass may impact on the establishment of later emerging black-grass. The higher the number of black-grass plants prior to the first drilling (particularly over 50-100 plants/m2), the smaller the proportional increase in numbers prior to the second drilling. In most situations at the higher infestations, the multiplication was one or below, indicating that there was no increase in the number of plants between the two drilling dates. Those who promote the notion of allelopathy would say that this proves their point, with allelopathy being absent or low at the low populations and high at the high populations. Accepting this notion means accepting that the proportion of ungerminated seed increases as allelopathy increases, leading to an expectation of proportionally higher numbers emerging after the later sowing date where the background populations are high.
To test this expectation, I have prepared the second diagram. This has the same horizontal axis, but the vertical axis is now the increase in the proportion of emerged black-grass plants in the late sown crop prior to Atlantis being applied at the 2-3 leaf stage (it worked when the project started) or in the untreated areas. The proportional increase follows the same pattern, but the rate of reduction is less, most probably because the count was taken when the most advanced black-grass plants had only 2-3 leaves. There is no apparent association between supposed low levels of allelopathy at the low populations in diagram 1 and lower levels of black-grass emergence after crop emergence, nor between supposed high levels of allelopathy at the highest populations and more viable seed being maintained to germinate in the following crop (diagram 2).
Hence, allelopathy appears not to be the explanation for the relationships between the background black-grass population of the site and the proportional increases in plant counts, either before late drilling or early post-emergence of the crop. The ‘shape’ of the diagrams strongly suggests that the far more logical explanation is intra-specific competition in black-grass (i.e. simply black-grass competing with other black-grass). This can take two forms. One is shading by the earlier emerged black-grass preventing light from reaching the soil surface and thus inhibiting germination, which would have a similar effect to allelopathy. The other is later emerging black-grass simply being out-competed by the earlier emerging plants, and this very limited data suggests that this is the much more likely cause. Accepting this explanation means that the seed loss through germination in the soil is the same regardless of the number of glyphosate applications made.
This latter and more logical explanation suggests that there is no need (in terms of black-grass seed loss) for multiple applications of glyphosate: a single application just prior to sowing will have the same impact on the numbers emerging in the crop. However, on a practical note, letting black-grass get very advanced means that drilling after the glyphosate application may have to be delayed.
Reducing the number of applications of glyphosate will save money and operational stress and reduce the risk of glyphosate resistance in black-grass. It is interesting to note that the first weed to develop glyphosate resistance in the world was rye-grass (Lolium rigidum) in Australia and in the first recorded cases, the resistance mutations made it weak and uncompetitive. Should this be the case with black-grass, then using intra-specific competition to kill the weaker plants in the stubbles may be an additional anti-resistance strategy.
I am the first to admit that these data do not totally disprove allelopathy in the field and I suggest that specific research is required. However, it seems to me that those who have counted a higher total black-grass emergence after multiple sprays of glyphosate rather than after a single spray just before drilling, have overlooked the simple and more logical explanation of plant competition reducing numbers once the black-grass grows beyond the early emergence stage.
Posted on 20/04/2015 by Jim Orson
In late March, the International Agency for Research on Cancer (IARC) announced that glyphosate would be added to its list of agents that are “probably carcinogenic to humans”. The IARC is an agency within the World Health Organization so the announcement has been widely reported. This news has delighted those who oppose GM technology and/or pesticides. There are many “we told you so” articles on their websites and in the popular press.
Does glyphosate really cause cancer? I am no expert on this issue but have been looking at the comments made by UK scientists (http://www.sciencemediacentre.org/expert-reaction-to-carcinogenicity-classification-of-five-pesticides-by-the-international-agency-for-research-on-cancer-iarc/) and also, I have read an excellent blog which carefully weighs up the evidence used by the IARC when coming to its preliminary conclusion (http://weedcontrolfreaks.com/2015/03/glyphosate-and-cancer-what-does-the-data-say/). Interestingly, the latter quotes a huge American study which shows that farm workers are less likely to get the cancer in question (non-Hodgkin lymphoma) than the general population.
The IARC admits that it is working from a very limited database of studies on the subject. This reminds me of the Daily Mail type of article where one quoted study suggests that there is evidence that a particular food type causes cancer, and yet within a few weeks another study concludes the opposite. The following diagram really puts such studies into context. The blobs are the average result for an individual study, but the inevitable error in each study often means that a blob to the right or left of the vertical line may not be significantly different from that line. According to the blog to which I referred, this is the case for the few studies on glyphosate and non-Hodgkin lymphoma.
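The point about the blobs and the vertical line can be made concrete. For a relative-risk estimate, the conventional 95% confidence interval is calculated on the log scale, and the result is "not significant" when the interval spans the null value of one. Here is a minimal sketch with made-up numbers, not values from any of the glyphosate studies.

```python
import math

def risk_confidence_interval(rr, se_log, z=1.96):
    """95% confidence interval for a relative risk, computed on
    the log scale as is conventional in epidemiology."""
    log_rr = math.log(rr)
    return (math.exp(log_rr - z * se_log), math.exp(log_rr + z * se_log))

def crosses_null(rr, se_log):
    """True when the interval spans 1, i.e. the study cannot
    distinguish its estimate from 'no effect'."""
    low, high = risk_confidence_interval(rr, se_log)
    return low < 1.0 < high

# Made-up numbers: a relative risk of 1.3 (the 'blob' sits to the
# right of the line) with a log-scale standard error of 0.2 gives
# an interval of roughly 0.88 to 1.92 - it spans 1, so the study
# is not significantly different from no effect.
print(crosses_null(1.3, 0.2))  # -> True
```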
There are two reasons why I might panic over the IARC announcement. One is that I have handled glyphosate over the last forty years or so; the other is that, if the claim were true, there could be implications for the availability of the herbicide. However, having read the comments on the evidence, I am not at all concerned.
It is worth mentioning that the IARC list of known carcinogens includes alcoholic beverages, emissions from coal fires, untreated or mildly treated mineral oils, outdoor air pollution, solar radiation, soot, wood dust and smoking tobacco. I suggest that it is worth paying more attention to our exposure to these than to the so-called ‘probable’ risk from glyphosate, but please remember that pesticides should always be handled with care.
After researching the issue, I am also a lot more reassured about my coffee addiction (see the diagram). In fact, I was once so concerned about my intake that I asked my GP if it was possible to drink too much coffee. He thoughtfully put down his cup of coffee and said “I hope not”. The diagram is also a comfort to those, like me, who love a glass of wine. A couple of years ago I heard a talk by an eminent scientist at the University of East Anglia who concluded that a healthy diet could be based on red wine and chocolate. Now where are those Easter eggs I have to finish eating? For health reasons you understand.