Posted on 03/09/2015 by Jim Orson
It is a fair time since I had a rant about GM. I am not alone as the following quote from a recent opinion piece in The Times demonstrates:
‘The debate around genetically modified crops would almost certainly puzzle an alien. “I don’t get it,” he’d beep, arriving on his first tour of Earth. “You’ve found a way to boost crop yields and reduce your reliance on pesticides. Repeated, peer-reviewed scientific studies have shown there is no downside. It could be the answer to a pending crisis of global hunger, with no ill-effect. Yet it still makes lots of people very angry, and many governments avoid it in horror. Why?”’
It is very easy to answer that question. Too many organisations have pinned their reputations and, in some cases, their finances on opposing the technology. It has also become an ideological struggle against ‘multi-nationals’. Hence, the same old scares are repeated and new issues are exploited as evidence against the technology. A prime example is the declining number of monarch butterflies in the US. These iconic butterflies migrate in the autumn from southern Canada and the northern US to a specific area in central Mexico in huge and spectacular clouds.
The monarch butterfly first entered the GM debate when it was found that monarchs died when confined in chambers heavily laced with pollen from Bt maize modified to control corn-borers. This is no surprise; if you continually expose an insect to very high levels of something that is known to kill it, it will die. The key issue is whether, in real life, the monarch butterfly would be exposed to sufficient amounts of this pollen to affect its health. The judgement of the experts is that it would not.
Despite this, there have been recent headlines claiming that GM Bt and glyphosate-tolerant maize is responsible for the recorded decline in the monarch butterfly. However, behind the headlines lies a more nuanced explanation. Monarch larvae feed exclusively on native milkweeds (Asclepias species) and cannot survive without them. Native milkweeds are perennials that flourish in semi-natural areas such as roadsides, field edges and uncropped land. Milkweed abundance declined by 58% in the plains states from 1999 to 2010; monarch populations dropped by 81% over the same period.
It now seems that the major factor in the decline of milkweed is that more land is cropped with maize to supply the ethanol plants. Nearly 4.5 million hectares have been taken out of the Conservation Reserve Programme, much of it going into maize. It has to be acknowledged that glyphosate tolerance enables a higher level of control of native milkweeds in maize, but the shift into maize and out of something akin to set-aside has been a major factor. This is an unintended consequence of promoting biofuels.
Another reason for the fall in the population of the monarch butterfly may be illegal logging in Mexico, where it overwinters on trees. The logging was almost halted at one stage but there are recent reports that it has recommenced.
Taking the populist decision to ban GM, as the Scottish Government has done, is an easy way out for European governments. They are really abdicating their responsibility to lead the debate rather than just blindly follow public prejudice that has been stoked up by misleading and misinterpreted information. It seems that getting a better balance between crop production and biodiversity is ever more challenging and it is my view that GM technology can play a role in improving the chances of getting it right.
Posted on 20/08/2015 by Jim Orson
There was some interesting weather in the UK in July. The 1st was the hottest July day on record, yet a temperature of only 10°C was recorded in Powys on the 31st. However, the extreme event of note for me was the overnight thunderstorm of 16th/17th in Cambridge. Up to that point, I was desperate for rain to perk up our garden and allotment. Rain had been forecast several times but we received tiny amounts, if any at all. As they say in Norfolk, “it never rains in a dry time”.
Unfortunately we were away when the thunderstorm occurred and so I did not experience it at first hand. The first I knew of it was when I checked the online weather station in Cambridge the following morning (I was that desperate for rain!). It showed that over 50mm of rain had fallen overnight. When we got back on the 18th, I cycled up to the allotment, where my rain gauge had recorded 80mm of rain. Up to 90mm was recorded locally. It must have been an amazing storm to witness.
This got me thinking about the challenges from weather events to our crops and their management. The dull and wet summer of 2012 was one such seasonal extreme event. It took a large toll on wheat yields in some parts of England, particularly on the heavy clays. I also clearly recall the dry and hot summer of 1976 (it was actually very dry from June 1975 to September 1976). It is not just the extreme summer weather events that can impact on our wheat crop; the wet autumn and winter of 2000 reduced the yields achieved in 2001. However, despite what we sometimes consider to be extreme weather variations, UK wheat yields are amazingly consistent from year to year when compared to some other parts of the world.
One notable example of large variations in yield over the years is in parts of Australia. There is an area in Victoria, north of Bendigo, called the Mallee, which is perhaps close to being marginal for arable crop production. Harm van Rees, a leading Australian agronomist, recently sent me the following data on its growing-season rainfall (GSR) expressed as deviation from average (rather than ‘of average’ as stated on the y-axis). As you can see, there has been a very high number of dry seasons over the last 15 years or so and the ten-year running mean (the red line) is still going down. In the very dry seasons, wheat yields can be alarmingly low (often below 1.0 t/ha) or even zero, and many farms are in financial crisis. There was a similar but shorter downward trend in the ten-year running mean in the 1940s. The anxiety about the current trend is heightened by concern over the influence that climate change may have had on rainfall in recent years and may have in the future.
So what would you do in this situation? Not an easy question to answer. Australian researchers have developed sophisticated risk management tools for use in arable farming, many of which are now based on seasonal forecasts. For instance, these help to decide how much nitrogen to use, as applying any nitrogen in a very dry year can reduce wheat yields as well as unnecessarily increase costs. A preliminary analysis suggests these tools are beneficial when compared to adopting the same management each year. As one researcher puts it, “the [seasonal] forecasts are too good to ignore, but not good enough to completely rely on. Even though there are uncertainties, we do have more information than if we were guessing by chance.” However, what farmers and bank managers in the area featured in the rainfall graph really need is a reliable forecast of rainfall not just for the season but also for the next few years.
A recently published report called Extreme Weather and Resilience of the Global Food System warns that major “shocks” to global food production will be three times more likely within 25 years because of an increase in extreme weather brought about by global warming. So we will have to accept that there are going to be even more challenges in the future. Achieving more resilience to extreme weather events in our production systems will have to be given a higher priority. It does not help the UK industry that pesticide resistance and legislation are together eroding the resilience of our arable systems. It is also regrettable to have to say that opposition to GM may well reduce our ability to respond to the challenges of the future. The report can be found at http://www.foodsecurity.ac.uk/assets/pdfs/extreme-weather-resilience-of-global-food-system.pdf
Posted on 06/08/2015 by Jim Orson
This is my 150th blog and perhaps too many have been on black-grass, the nation’s most talked-about weed. You have to believe me when I say that I was trying to avoid writing about this weed yet again, but recent press reports have driven me back to the keyboard.
The issue that I would like to raise is that of overly simplistic conclusions drawn from field data. These can over-estimate or under-estimate the role of cultural or chemical control of the weed. Good field data are not easily obtained and it is the industry’s responsibility to use them intelligently.
I have written before about the huge fuss made a few years ago about increased crop competition from wheat being able to reduce black-grass seed heads by half. Many took this as the answer to the black-grass problem but the maths of the weed’s dynamics told us differently. Models suggest we have to chemically control 97% of black-grass plants emerging in a continuous wheat crop grown in ‘normal circumstances’ in order to contain populations at current levels. This means that we can let only 3 out of 100 plants make it to seed shedding. Crop competition does not reduce black-grass plant numbers but reduces the ability of the plant to set viable tillers. Hence, reducing seed heads per plant by half through crop competition means that we can allow 6 rather than 3 out of 100 plants to make it to seed shedding. So a 50% reduction in seeding per plant reduces the need for chemical control to 94% rather than 97% control. Every reduction in the reliance on chemical control is useful but this is not the game-changer that was originally claimed. Obviously, greater reductions in seed return through competition would reduce the need for chemical control by a greater percentage. However, more than a 90% reduction in black-grass seed set from increased competition will be required to get down to a chemical control requirement of 70% of plants; a level that is now often achieved by pre- and/or early post-emergence herbicides.
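The arithmetic behind those figures is easy to check. Here is a minimal sketch in Python; the 3-in-100 survivor threshold comes from the population model quoted above, while the function name and its assumption that tolerable survivor numbers scale inversely with seed return per plant are mine, for illustration only:

```python
def required_chemical_control(seed_reduction, baseline_survivors=3, plants=100):
    """Required % kill of emerging black-grass plants, given the fraction
    by which crop competition cuts seed return per surviving plant.

    Assumes (as in the model above) that with no extra competition only
    3 of every 100 emerging plants may reach seed shedding, and that the
    tolerable number of survivors scales inversely with seed per plant.
    """
    allowed_survivors = baseline_survivors / (1 - seed_reduction)
    return round(100 * (1 - allowed_survivors / plants))

print(required_chemical_control(0.0))  # no competition effect -> 97
print(required_chemical_control(0.5))  # seed heads per plant halved -> 94
print(required_chemical_control(0.9))  # 90% cut in seed set -> 70
```

As the sketch shows, even a dramatic-sounding 50% cut in seed heads per plant only moves the required chemical control from 97% to 94%, whereas a 90% cut brings it down to the 70% that pre- and early post-emergence herbicides can often deliver.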
This is an example of when a large percentage figure can over-estimate the value of a control technique. In contrast, recent quotes in press reports seem to be under-estimating the benefit of stacking herbicide products to control the weed. Some say that using more than two products in a mix adds little to the percentage control of plants. These statements can undermine the value of using mixtures of three or more products.
The best way to explain what I mean is to give an example. Let us again assume that there are potentially 100 black-grass plants/m2 that will emerge in a winter wheat crop. A two-way mix applied pre-emergence may control 60 of these plants (i.e. 60% control) and a three-way mix may control 70%. At face value this is ‘only’ a 10 percentage-point increase. This does not sound a lot and some might say it would be better to apply this third component early post-emergence, where potentially it may give a higher level of control. The maths of this situation are illuminating.
In this example, the two-way mix will reduce numbers in the crop from 100 plants/m2 to 40/m2, and adding the third component of the mix will reduce them from 40/m2 to 30/m2. Reducing survivor numbers from 40 to 30 is in fact 25% control, a lot higher than the headline figure of an additional 10 percentage points. Put another way: using a two-way mix pre-emergence would let 40 plants/m2 survive, and achieving 25% control from an early post-emergence application would provide the same final count of 30 plants/m2 (more than 29 too many!).
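For anyone who wants to check the figures, the same arithmetic can be written out in a few lines of Python. The 60% and 70% control levels are the illustrative numbers assumed above, not trial results:

```python
plants = 100        # emerging black-grass plants per m2 (illustrative)
survive_two = 40    # survivors of a two-way pre-em mix (60% control)
survive_three = 30  # survivors of a three-way pre-em mix (70% control)

# Headline view: the third component adds 'only' 10 percentage points
# measured against the whole emerging population.
headline_gain = (survive_two - survive_three) / plants

# Measured on the plants actually left after the two-way mix, however,
# the third component removes a quarter of the survivors.
marginal_control = (survive_two - survive_three) / survive_two

print(headline_gain)     # 0.1
print(marginal_control)  # 0.25
```

The same 10 plants/m2 removed looks like 10% or 25% control depending on the denominator chosen, which is exactly why the headline figure under-sells the third component.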
I realise that this is a rather simplistic view and that herbicide resistance will ‘nuance’ the situation. We know little about the impact a particular pre-emergence application will have on the susceptibility of weeds to a particular early post-emergence application. Are the percentage controls achieved by individual products simply additive, or does an earlier application predispose its survivors to be more or less difficult to control with an early post-emergence application?
What I am saying is that data need to be carefully interpreted. In practice, it may be that stacks of more than two products applied pre-emergence are particularly relevant to later-sown winter wheat crops, provided the seedbed and the weather are conducive to good control. Simpler mixes may be better suited to pre-emergence or peri-emergence applications in earlier-sown winter wheat crops, particularly when the weather and/or seedbed are not suited to good control, with the expectation that a further application of a mix could well be necessary early post-emergence.
Posted on 23/07/2015 by Jim Orson
A few weeks back I wrote a blog (Black-grass allelopathy nonsense - posted on 30th April) on my analysis of the very limited and indirect evidence as to whether or not multiple applications of glyphosate to seedling black-grass in ‘false seedbeds’ were beneficial, in terms of reducing numbers in succeeding crops. I came to the conclusion that there was no compelling evidence to support multiple applications.
However, I have now been supplied with evidence from one well-conducted trial that specifically investigated this subject and which suggests that multiple applications may well be beneficial, and so I have to retract what I said a few weeks back. However, I remain of the view that the explanation cannot be allelopathy: allelopathic exudates from black-grass roots have never been identified. The most likely explanation is that emerged black-grass plants shade the soil surface, restricting further black-grass germination.
There are two further comments to add. Firstly, should a winter crop be sown if there are sufficient black-grass plants emerging in a ‘false seedbed’ to provide enough shade to reduce further germination? Numbers in the crop may well be too high to control adequately with herbicides. Secondly, is the threat of black-grass resistance to glyphosate sufficiently high to avoid multiple applications and to rely more on shallow cultivations to kill some or all of the emerged plants? Nobody knows the answer to that conundrum but black-grass has shown it is a worthy foe and has developed resistance to a range of modes of action.
One method of running down the seedbank of viable black-grass seed is to spring crop. However, in some areas this year there are horrific numbers of the weed emerging in spring-sown crops. I have noted that these instances tend to be in the areas that received the worst of the deluge in 2012, where the soils would have been waterlogged that summer. This has led me to the conclusion that these fields are not infested with a new spring-emerging strain of black-grass, as some suggest, but are continuing to suffer from the impact of exceptional levels of dormancy in the seed set in 2012. I realise that some may think I am off my head and will eventually have to offer another apology for this observation! However, the HGCA Project Report 498 on dormancy shows that with typical levels of high dormancy, the majority of seed will germinate at least 12 months after shedding. It is only a small step to say that with exceptional levels of dormancy, the germination of a significant proportion of the seed could be delayed for a couple of years.
The other explanation is that there is always a small proportion of black-grass that will germinate in spring crops and so a high spring emergence is a reflection of an exceptionally high soil seedbank. I do not quite buy that one in the cases that I have experienced this year. I have only witnessed such high populations in spring sown crops since the summer of 2012 and I have been around a long time. The inevitable conclusion is that the weather and soil conditions during seed maturation that year had a profound impact on future germination patterns.
I initially got it in the neck from one colleague this spring because he tried a half-field comparison of cultivating/drilling versus direct drilling a spring-sown cereal. The experience in the industry is that ‘true’ direct drilling will reduce black-grass emergence in spring-sown crops because of the lack of soil disturbance. It turns out that in this case there was no difference in black-grass emergence, which was extremely high, between the two cultivation approaches tested. However, where the land was direct drilled the soil soon cracked in the dry conditions and a very high proportion of the black-grass emerged from the cracks. It shows why agronomists often have to mention a few caveats when they give advice!
Posted on 09/07/2015 by Jim Orson
Since 2012, I have been attempting to predict wheat yields in a blog at about this time of year. Looking back, 2012 was perhaps the easiest year to attempt this task. It was relatively clear that the lack of sunshine and the saturated soils would result in lower than average yields. Last year it was also reasonable to think that overall there would be very good yields. Indeed, we had a record UK average of 8.6 t/ha with the East averaging 9.1 t/ha.
This year is much more complicated. Generally, crops over-wintered in good condition and the levels of radiation that establish potential yield have been above average for almost every month this year, including the critical month of June. Night-time temperatures were below average in June, so the crop respired less overnight and retained more of each day’s gains than average.
However, there have been negatives that may result in much of the potential not being realised. The heat of last week would have done wheat no favours. Luckily it occurred in perhaps the latter half of grain fill rather than earlier in the process. Around 20°C seems about ideal for wheat, 25°C is less than ideal, 30°C is harmful and 35°C even more harmful. Coupled with this we had high overnight temperatures, when the wheat must have been respiring very actively. I cannot get a clear picture of the critical night temperature above which respiration is particularly harmful: the scientific literature quotes between 9 and 14°C. In many areas last week, the night-time temperature over two or three nights barely fell below 20°C.
However, I think that the main determinant of how much of the good potential established by above average levels of radiation we harvest this year will be the availability of soil moisture. Much of the rainfall has been patchy and there are large areas of the country where rainfall in March, April and June was way below average. Looking back at the good yielding years of 1984, 2008 and 2014, it seems that May rainfall is very important to exploit wheat’s potential. This year rainfall during May was particularly patchy. The arable areas north of the Wash and also the West Midlands and parts of the South Coast tended to receive reasonable levels of rainfall in May.
This makes me think that in terms of yield we are in for a mixed UK wheat harvest, depending on the soil moisture availability in late May and June. Those areas north of the Wash which had some good May rains perhaps also missed the extremely high temperatures that were experienced further south. It seems to me, purely based on weather records, that those areas that received significantly below average rainfall in May will do well to achieve average wheat yields, except on the most moisture retentive soils. Those areas where the moisture deficits were closer to the average situation in mid-June may achieve more pleasing yields.