NIAB - National Institute of Agricultural Botany

Orson's Oracle

Reducing the risk of glyphosate-resistant black-grass

Posted on 21/01/2016 by Jim Orson

In my previous blog I discussed the role of dose in the speed of development of pesticide resistance. I also highlighted the concern over the possible development of black-grass resistance to glyphosate. It must be said that nobody is sure this will happen, but the mere possibility means that we have to take measures to reduce the risk.

Whilst dose is one aspect, the number of sprays of glyphosate is also an important issue: these need to be reduced to a minimum. I am reminded of a point that a local farmer made to me at an Australian conference where the introduction of GM glyphosate-tolerant canola was being discussed. Responding to my question as to whether he would introduce this crop, he said that he would if he could identify an opportunity to avoid a glyphosate spray elsewhere in his rotation. This discussion was heavily influenced by the presence of glyphosate-resistant annual ryegrass in Australia, and so the need for an anti-resistance strategy on his farm was absolutely clear.

The recent Pesticide Usage Surveys carried out on behalf of the Chemical Regulations Directorate (CRD) of HSE have confirmed an ever-increasing reliance on glyphosate in arable crop production. This is partly due to the decreasing number of options for effective control of black-grass within crops. Multiple applications of glyphosate to ‘stale seedbeds’ are commonly adopted before sowing winter wheat, and there is a strengthening case to support such an approach for reducing high black-grass populations.

The issue is that if populations are high enough to warrant multiple applications of glyphosate, in order to prevent the emerged black-grass in ‘stale seedbeds’ shading the soil and preventing further black-grass germination, then the background population is too high to grow a relatively weed-free crop of winter wheat unless it is sown exceptionally late.

The obvious solution is to reduce background black-grass populations to a level where the emerged black-grass plants do not shade the soil sufficiently to prevent further black-grass emergence. The limited data suggest that this means fewer than 10-15 plants/m2. In addition, such a low incidence of black-grass will remove the need to spray twice in order to keep the seedbeds in a good condition for sowing, although the number of emerging volunteer plants will also influence decision-making.

There are other compelling reasons to reduce background black-grass numbers. As well as reducing the number of glyphosate sprays prior to sowing wheat, low numbers mean that the herbicides used to control the weed in the crop will provide higher levels of control of black-grass heads. When black-grass plants are numerous enough to compete with each other as well as with the crop, reducing plant numbers with selective herbicides means that the survivors face less competition and produce more heads, as the sketch below illustrates.
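
A toy model makes this compensation effect concrete. Everything in it is hypothetical (the heads_per_plant function, max_heads and k are invented, not measured values); the point is only that the same percentage plant kill removes a much smaller percentage of heads at high densities.

```python
# Illustrative sketch: density-dependent head production in black-grass.
# All constants are hypothetical, chosen only to show the shape of the effect:
# heads per plant fall as plants compete with each other.

def heads_per_plant(density: float, max_heads: float = 20.0, k: float = 0.02) -> float:
    """Heads produced per plant at a given plant density (plants/m2)."""
    return max_heads / (1.0 + k * density)

def heads_per_m2(density: float) -> float:
    """Total heads per square metre at a given plant density."""
    return density * heads_per_plant(density)

for density in (10.0, 500.0):
    survivors = density * 0.05  # a selective herbicide killing 95% of plants
    reduction = 1.0 - heads_per_m2(survivors) / heads_per_m2(density)
    print(f"{density:5.0f} plants/m2: 95% plant kill -> {100 * reduction:.0f}% head reduction")
```

On these made-up numbers, a 95% plant kill removes about 94% of heads at 10 plants/m2 but only about 63% at 500 plants/m2, which is exactly the knife-edge described above.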

In addition, reducing background numbers provides more flexibility for the future. A poor herbicide performance will not be so critical and there may be more options available on sowing dates.

Hence, for a variety of reasons it is important to reduce background populations of black-grass. The often poor and variable selective control offered by herbicides in arable crops means that living on a knife-edge is no longer a realistic option. I will stop now because I recently read that one farmer’s New Year resolution is not to attend any more meetings on black-grass. Somehow, I know what he means!


Survival of the most resistant

Posted on 08/01/2016 by Jim Orson

I have been a life-long supporter of Leicester City and can now come out of the woodwork because they have the 40 points necessary to survive in the Premier League for another season. It would have been tempting fate to declare my support earlier in the season. I must admit that I have not watched them play for about 10 years and, instead, go and watch the mighty U’s (Cambridge United) a few times a season with one of my daughters.

Leicester City is not new to the dizzy heights that they have achieved so far this season. I watched them in the early 1960s when they were a top club. They were not far from doing the old League and FA Cup double in 1963 when I watched the best match that I have ever attended. They drew 2-2 at home to Spurs, then the glory team of the League. Jimmy Greaves of Spurs scored with a wonder strike and it was amazing that the great Gordon Banks in the Leicester goal even got his finger tips to the ball. That moment is forever imprinted on my mind.

There is a small core of Leicester supporters in NIAB, including Tina the director. Normally our discussions centre on Leicester’s survival in the Premier League, particularly this time last year when they were bottom. I am mentioning this because I am desperately trying to link the Leicester story to the theme of this blog, which is that small percentages of a population can count, particularly in the context of the development of pesticide resistance.

It is generally accepted that pesticide resistance is a process of selection of naturally occurring mutations which happen to be resistant to a specific or a range of pesticides. The continual exposure to the mode(s) of action of the pesticide(s) to which there is resistance results in these resistant individuals becoming dominant in the pest populations, whether these pests be insects, diseases or weeds.

There has long been a debate in the industry about whether high or low doses increase the rate of selection of these naturally occurring mutations. Logic suggests it must be high doses, in order to select more effectively for the most resistant individuals. Low or sub-optimal doses are more likely to result in the higher survival of non-resistant or less resistant individuals. This type of discussion always raises the question of what actually constitutes a ‘high’ or ‘low’ dose.

Field experience and experimental evidence on fungicide resistance also suggest that it is usually higher doses that result in the more rapid selection of resistant individuals. The following is from a recent statement from the Fungicide Resistance Action Group, which represents the whole industry, on reducing the speed of development of septoria resistance to the SDHI fungicides. “All effective fungicides exert a selection pressure on pathogen populations and carry a risk of resistance. This risk can be modified and reduced by either mixing or alternating with fungicides with an alternative mode of action, or by reducing the number of applications or dose of the fungicide.” This suggests that, at last, there is a more general acceptance that the higher the dose, the more likely there is to be selection for resistance.

I am positive that the same is generally true for herbicides. When first introduced, the ‘fops’ and ‘dims’ controlled around 99% of black-grass at the recommended dose. They were rarely used at reduced doses, but resistance developed very quickly. The field experience with the sulfonylurea herbicide product Atlantis was that, when it was first used, there were always a few survivors of a full dose. I remember suggesting to farmers that these could be resistant and should be rogued. Resistance to Atlantis is now widespread and continues to increase rapidly.

However, there is some evidence to suggest that ‘low’ or sub-optimal doses can speed up the development of weed resistance to glyphosate. That gave me cause to ponder why this could be true. It did not take me long to conclude that the development of resistance is speeded up not by whether high or low doses are used but initially by doses that result in a low number of survivors. When first introduced, crop-safe herbicides such as the ‘fops’ and ‘dims’ and the sulfonylureas left only a few survivors when used at recommended doses. The survivors were more likely to be resistant to these modes of action.
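
To make the ‘few survivors’ argument concrete, here is a minimal selection sketch. All the numbers are invented (the starting resistant frequency and the per-spray kill rates); it simply shows that the resistant fraction is enriched fastest when a dose kills nearly all the susceptible individuals while sparing a resistant minority.

```python
# Minimal selection sketch: a rare resistant type under repeated spraying.
# All parameters are invented for illustration. kill_s and kill_r are the
# fractions of susceptible and resistant plants killed by one application.

def resistant_fraction(p0: float, kill_s: float, kill_r: float, generations: int) -> float:
    """Resistant fraction of the population after repeated sprayed generations."""
    p = p0
    for _ in range(generations):
        survivors_r = p * (1.0 - kill_r)          # resistant survivors
        survivors_s = (1.0 - p) * (1.0 - kill_s)  # susceptible survivors
        p = survivors_r / (survivors_r + survivors_s)
    return p

p0 = 1e-6  # hypothetical starting frequency of resistant individuals
for kill_s, label in ((0.99, "dose leaving few susceptible survivors"),
                      (0.70, "dose leaving many susceptible survivors")):
    p = resistant_fraction(p0, kill_s, kill_r=0.2, generations=10)
    print(f"{label}: resistant fraction after 10 sprays = {p:.3f}")
```

On these invented figures, ten applications of the dose that leaves few susceptible survivors produce a population that is almost entirely resistant, whereas the leakier dose enriches the resistant fraction to only around 2%.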

Optimal doses of glyphosate should kill everything, provided that it is applied well, growing conditions are conducive to control and the weeds are at the appropriate growth stages for good activity. However, sub-optimal doses may leave a few individuals which may be more likely to have a level of resistance. The continued adoption of sub-optimal doses, particularly where minimal or no-tillage is employed, may form the basis of future populations which could perhaps cross-fertilise, resulting in individuals with even higher levels of resistance. It may have been significant that the first case of a partially glyphosate-resistant weed in the UK was in sterile brome, where the dose recommended for control on stubble (540 g ae glyphosate/ha) is often only marginally effective on this weed.
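
For readers more used to product rates than acid-equivalent doses, the conversion is simple arithmetic. The 540 g ae/ha stubble dose is the figure quoted above; the 360 g ae/L formulation strength below is an assumption for illustration only.

```python
# Converting an acid-equivalent (a.e.) dose to a product rate. The 540 g a.e./ha
# stubble dose is quoted in the text; the 360 g a.e./L formulation strength is
# an assumed value used only to illustrate the arithmetic.

def product_rate_l_per_ha(dose_g_ae_per_ha: float, conc_g_ae_per_l: float) -> float:
    """Litres of formulated product per hectare for a target a.e. dose."""
    return dose_g_ae_per_ha / conc_g_ae_per_l

print(product_rate_l_per_ha(540.0, 360.0))  # -> 1.5 L/ha
```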

So perhaps the speed of development of resistance is all about the dose required to select the most resistant types. This is often the recommended dose for crop-safe herbicides and fungicides, but could be sub-optimal doses for glyphosate, at least on some weeds. Hence, whilst it is not absolutely proven that sub-optimal doses speed the development of glyphosate resistance, it would be advisable to apply it correctly in the right circumstances, to use doses that will kill all the black-grass, and to inspect the results of treatment to ensure that there are no survivors. This is particularly the case where control by glyphosate is not supplemented by cultivations. For more information, please see the guidelines for minimising the risk of glyphosate resistance.

Best wishes for 2016.

The constructive comments on the script of this blog by Stephen Moss of Stephen Moss Consulting are gratefully acknowledged.


Warning – abide by pesticide authorisations

Posted on 18/12/2015 by Jim Orson

In my final blog of 2015 I would like to express my concern over the amount of mecoprop in the raw water that has recently been feeding some drinking water treatment works. Levels of up to four or five times those specified in the Drinking Water Directive have been detected. The source of this mecoprop is unclear. Products containing mecoprop-p are authorised for amenity use and for grassland, and the warm weather in the late autumn could have protracted the application season.

My concern is that there are a few fields of wheat where the volunteer beans have been showing symptoms of hormone herbicide damage. Mecoprop-p is no longer authorised for application to cereals in the autumn after 1st October, and some farmers may not have been aware of this fact. A restriction like this would not have been introduced just because levels in water had in the past exceeded those specified in the Drinking Water Directive; there must also be environmental impact issues associated with this usage.

No doubt the water companies can remove the mecoprop before it arrives at the consumers’ taps, but that is not the point. So does it matter that there are high levels in some water sources this autumn? Of course it does. It may send a signal to some anti-pesticide groups and legislators that some individuals are not following the rules on the responsible use of pesticides, and consequently open a can of worms. If the regulations are not followed, what hope is there that voluntary measures will be adopted generally by the industry?

The success of voluntary measures is already being questioned by outside organisations. This year the RSPB compiled a report intended to demonstrate that voluntary measures do not work. Some of the criticisms of the voluntary approaches mentioned in this report are unfair because they refer to observations made close to 10 years ago.

It does not stop there. Recently the Angling Trust and the World Wildlife Fund (now simply called WWF) settled their High Court dispute with the government over their accusation that Defra is failing to take effective action to protect waterways from agricultural pollution. They received courtroom reassurances from Defra that mandatory water protection zones (WPZs) are being actively considered alongside voluntary steps being taken by farmers to reduce pollution in rivers and wetlands.

It has long been accepted by the water companies that we have only until 2018 to demonstrate that voluntary measures are sufficient to meet the demands of the Drinking Water Directive. This does not mean eliminating all pesticide movement to water but it is critically important to reduce the size of the peaks* in pesticide content that can create real difficulties at the Water Treatment Works.

Hence, it is important to reassure the pressure groups, the public, the water companies and the legislators that the industry can be trusted to reduce the inevitable environmental impact of farming. The folly of just a few farmers illegally using mecoprop-p in cereals in the autumn, with intent or through ignorance, would demonstrate the opposite and, if regularly repeated, could lead to further restrictions on mecoprop-p usage. I realise that the alternatives for autumn control of volunteer beans in cereals are slower acting and more expensive, but that is a small price to pay in the context of the bigger picture.

Notes:
* See earlier blogs that mention the importance of reducing the size of peak concentrations:
2 November 2015: Pesticides in water – meeting the challenge
16 November 2015: IT and reducing pesticides in raw drinking water

 

 

 


Glyphosate cancer confusion

Posted on 07/12/2015 by Jim Orson

In my mid-April blog, “Roundup causing cancer?”, the classification of glyphosate by the International Agency for Research on Cancer (IARC - part of the World Health Organisation) as a probable carcinogen was discussed. Its evidence in humans came from correlating the occurrence of non-Hodgkin lymphoma with exposure to formulated glyphosate products, mostly agricultural, in the USA, Canada and Sweden. The concern about such studies is that there are no controls which allow for the exclusion of the many other farming practices which may have been causative. Recently, the European Food Safety Authority (EFSA) has concluded that “glyphosate is unlikely to pose a carcinogenic hazard to humans and the evidence does not support classification with regard to its carcinogenic potential”.

So why is there a difference of opinion between these two august bodies? Looking at the statements made by people far brighter than me, it seems that there are more reasons than the few I am about to mention.

Firstly, IARC, wholly or partly, inevitably based its opinion on studies of the product, whilst EFSA based its opinion on just the active substance. When Roundup was first introduced I was led to understand that the acute toxicity (symptoms within 24 hours) of the formulants was higher than that of the glyphosate itself, but the product was still very safe. The Soil Association raised the issue of the risk from the formulants even before the publication of the EFSA opinion. In the European Union context, the European Commission (EC) approves the active substance (in this case glyphosate) and individual zones or member states authorise the formulated products, which can only contain EC-approved active substances. I find it comforting that some very precautionary-minded EU member states have authorised formulated products based on glyphosate.

IARC admitted that its opinion was based on a small database of scientific evidence and EFSA was able to assess a larger database. It also seems that IARC re-analysed some of the data in the papers it considered. The reasons for this are unclear and the authors of those papers may or may not have agreed with the method of analysis carried out by IARC.

There also appears to be a different approach between the two organisations when coming to their conclusions. According to the genotoxicologist Dr Peter Jenkinson, “EFSA followed a weight of evidence approach whereas IARC took the view that if one study showed a positive result then it took precedence over negative studies, even though there may be many more negative than positive studies.”

These two opinions were about assessing the hazard posed by glyphosate. Just to remind you, electricity is hazardous but the risk of electrocution is acceptable. This is because risk is a combination of hazard and exposure. As always with the risk from chemicals, it is about the dose. For instance, formaldehyde is listed by IARC as a Group 1 carcinogen (its highest category, meaning there is sufficient evidence that it causes cancer in humans). No pesticides appear to feature in this category but some medicines do. Formaldehyde naturally occurs in apples at concentrations of up to 22 parts per million (ppm) but at this concentration does not appear to be a risk. The maximum residue limit (MRL) for glyphosate in cereal grains is currently 30 ppm, but this may change because its EC re-approval is currently being considered. I must emphasise that an MRL is not a value above which there is harm to health but is the highest level of a pesticide residue that is legally tolerated in or on food or feed when pesticides are applied correctly (Good Agricultural Practice).
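
To put the dose point into numbers: a concentration (ppm = mg per kg of food) only becomes an intake once multiplied by how much is eaten. The portion sizes below are hypothetical; the 22 ppm and 30 ppm concentrations are the figures quoted above.

```python
# Worked example: a residue concentration (ppm = mg/kg of food) becomes an
# intake only when multiplied by the amount eaten. Portion sizes are assumed
# for illustration; the 22 ppm and 30 ppm values are quoted in the text.

def intake_mg(portion_kg: float, residue_ppm: float) -> float:
    """Milligrams ingested from a portion of food at a given residue level."""
    return portion_kg * residue_ppm

print(f"Formaldehyde from a 150 g apple at 22 ppm: {intake_mg(0.15, 22.0):.1f} mg")
print(f"Glyphosate from 200 g of grain at the 30 ppm MRL: {intake_mg(0.20, 30.0):.1f} mg")
```

On these assumed portions the two intakes are of the same order, which is the point of the comparison: a hazard label alone says nothing about dose.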

There has been much fun made of IARC in terms of its opinion that red meat is probably carcinogenic and processed meat is carcinogenic. The statistic quoted on the TV programme Have I Got News For You was that IARC has examined 941 substances and has only found one to be non-carcinogenic. This may be a little unfair because much of their work may have been directed at substances where there was a real or perceived suspicion (there is no shortage of accusations from ‘green’ groups about glyphosate!). Many things when taken to excess are hazardous but in real life do not pose an unacceptable risk. It does not help IARC that the World Health Organisation has been widely criticised over recent weeks, notably for its slow response to the Ebola outbreak in Africa.

So my conclusion is very much the same as in my previous blog: glyphosate products are safe when used as directed.


Wheat 2015 - the right kind of sunshine?

Posted on 19/11/2015 by Jim Orson

Looking at the husbandry of those achieving extremely high winter wheat yields this year suggests that it was the weather, good crop management and possibly higher rates of applied N that were key. Those who sell a programmed approach of trace element applications are trying to claim that this was also a factor, but I am very far from convinced. Field yields of 16-17 t/ha were achieved without using any trace elements, except, in some cases, manganese sulphate.

The requirement for above-typical rates of nitrogen for extremely high yields is debatable, because some farmers achieved these yields with their typical doses of nitrogen: around 200 kg N/ha for a first wheat on long-term arable soils where no organic manures are being used. However, NIAB TAG has some trials evidence of cost-effective responses from doses above 220 kg/ha when plot yields of feed wheat exceeded 14 t/ha. We are still deliberating on how this information can be used in practice.
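
One way such trials evidence gets turned into practice is through the economic optimum: the dose at which the last kilogram of N just pays for itself. The quadratic response curve, coefficients and prices below are all hypothetical; the sketch only shows how a response curve that stays steeper for longer pushes the optimum above a typical dose.

```python
# Sketch: economic optimum N dose for a hypothetical quadratic yield response
# y = a + b*N - c*N^2 (t/ha). All coefficients and prices are invented numbers.

GRAIN_PRICE = 110.0  # GBP per tonne of grain (assumed)
N_PRICE = 0.65       # GBP per kg of N (assumed)

def optimum_n(b: float, c: float) -> float:
    """Dose where the marginal yield value equals the marginal N cost:
    (b - 2*c*N) * GRAIN_PRICE = N_PRICE."""
    return (b - N_PRICE / GRAIN_PRICE) / (2.0 * c)

# A response curve that stays steeper for longer (smaller c) lifts the optimum.
print(f"typical curvature:         {optimum_n(b=0.060, c=0.00012):.0f} kg N/ha")
print(f"high-yield-year curvature: {optimum_n(b=0.060, c=0.00010):.0f} kg N/ha")
```

On these invented coefficients the optimum moves from about 225 to about 270 kg N/ha, which is the kind of shift the trials question above is probing.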

It is also a matter of debate as to why this year was so special for achieving such high wheat yields. We did have more than 10% extra radiation compared with the average during the growing season, but this cannot be the sole explanation because it is certainly not unique.

Air temperatures were bang on or close to average from March onwards, and so the growth and development of wheat proceeded at its average rate. Higher than average radiation is often associated with warmer than average air temperatures, which in turn hurry growth and development, so there is less time to trap all the extra radiation and convert it into yield. Hence, higher than average radiation combined with typical or below-average air temperatures can only be beneficial for yield potential.
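
This trade-off can be sketched with simple thermal time. Crop development runs on accumulated degree-days, so for the same developmental phase a cooler season means more calendar days and therefore more radiation intercepted. Every number below (base temperature, phase length, daily radiation) is a hypothetical value chosen only to show the shape of the effect.

```python
# Sketch: development runs on thermal time (degree-days above a base temperature),
# so cooler air stretches a growth phase over more calendar days, letting the
# crop intercept more radiation. Every value here is hypothetical.

BASE_TEMP_C = 0.0          # assumed base temperature for wheat development
PHASE_DEGREE_DAYS = 600.0  # assumed thermal-time length of a growth phase

def phase_days_and_radiation(mean_temp_c: float, daily_radiation_mj: float) -> tuple[float, float]:
    """Calendar days to complete the phase and radiation received in it (MJ/m2)."""
    days = PHASE_DEGREE_DAYS / (mean_temp_c - BASE_TEMP_C)
    return days, days * daily_radiation_mj

for temp, label in ((14.0, "warmer than average"), (12.0, "average")):
    days, radiation = phase_days_and_radiation(temp, daily_radiation_mj=18.0)
    print(f"{label}: {days:.0f} days, {radiation:.0f} MJ/m2 intercepted")
```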

One reason why air temperatures may have been restrained this year was that the sea surface temperatures of the North Sea and the North Atlantic were significantly cooler than average. Temperatures of the surface of the North Atlantic go up and down in a multi-year cycle known as the Atlantic Multidecadal Oscillation. This year we seem to have been towards or at the bottom of the current cycle. There are many learned papers suggesting that this cycle influences our air temperatures. The map below, which compares current sea surface temperatures to the average, clearly shows how cold the sea surface was around the UK this June.

[Map: sea surface temperature anomalies, showing cooler-than-average water around the UK in June 2015. Source: NOAA]

Does all this explain the extremely high yields achieved by some growers? I personally do not think so. I discussed the subject with Eric Ober, a crop physiologist who works for NIAB. He gave me a few leads which I have followed up. As a result, I have concluded that it was not only the above-average radiation combined with average temperatures that lay behind the extremely high yields, but also that it was ‘the right kind of radiation’. I realise that this sounds a bit quirky and so I need to explain myself.

I examined the daily hours of sunshine received in March-July, recorded by the online automatic weather station at the Cambridge Computer Laboratory, and compared their pattern with the four years in the previous ten in which we received similar levels of sunshine hours during these same five months. 2015 was very different. There were fewer than half as many totally sunless days in these months, and the total sunshine hours were accumulated more on a little-and-often basis than in the other years. In addition, I remember that this spring and early summer we regularly had sunny mornings until around 11.00 BST and then light cloud for the rest of the day. Some academic research shows that little-and-often bursts of lower-intensity solar radiation (i.e. with a greater proportion of diffuse radiation) are better for photosynthetic efficiency than receiving the same overall level of radiation in longer and more intense bursts. Similarly, there are several studies showing that solar radiation is used more efficiently by wheat when it occurs in the morning or evening rather than around midday (13:00 BST), when it is at its most intense.
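
For anyone wanting to repeat this sort of comparison, a sketch of the analysis follows. The file name and column names are assumptions (daily sunshine-hours data would come from a weather station archive such as the one mentioned above); the key outputs are the count of totally sunless days and a measure of how evenly the hours were spread.

```python
# Sketch of the March-July sunshine comparison described above, using pandas.
# 'sunshine.csv' with 'date' and 'sun_hours' columns is a hypothetical file;
# real data would come from a weather station archive.

import pandas as pd

df = pd.read_csv("sunshine.csv", parse_dates=["date"])
df = df[df["date"].dt.month.between(3, 7)]  # keep March to July only

summary = df.groupby(df["date"].dt.year)["sun_hours"].agg(
    total_hours="sum",
    sunless_days=lambda s: (s == 0).sum(),  # totally sunless days
    mean_daily="mean",
    std_daily="std",  # a low spread suggests "little and often" sunshine
)
print(summary)
```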

In conclusion, I think that the reasons for the extremely high yields of wheat in some parts of the country this year are:

• The crops came out of the winter in good condition and there was little over-winter waterlogging.
• The very high yielding areas had a good dollop of rain in May, perhaps at the expense of some solar radiation.
• Temperatures from March onwards were around average but solar radiation was well above average, particularly in April and June. The April weather helped to ‘set up’ the foundations for potentially high yielding crops and June is the key month for grain fill.
• The higher than average radiation was supplied on a ‘less intense and more often basis’, which is conducive to high photosynthetic efficiency.

It was not just wheat that achieved very good yields this year and so other crops may have responded to some or all of these factors. Perhaps ‘the right kind of solar radiation’ is not so quirky after all.

