Posted on 08/01/2016 by Jim Orson
I have been a life-long supporter of Leicester City and can now come out of the woodwork because they have the 40 points necessary to survive in the Premier League for another season. It would have been tempting fate to declare my support earlier in the season. I must admit that I have not watched them play for about 10 years and, instead, go and watch the mighty U’s (Cambridge United) a few times a season with one of my daughters.
Leicester City is not new to the dizzy heights that they have achieved so far this season. I watched them in the early 1960s when they were a top club. They were not far from doing the old League and FA Cup double in 1963 when I watched the best match that I have ever attended. They drew 2-2 at home to Spurs, then the glory team of the League. Jimmy Greaves of Spurs scored with a wonder strike and it was amazing that the great Gordon Banks in the Leicester goal even got his finger tips to the ball. That moment is forever imprinted on my mind.
There is a small core of Leicester supporters in NIAB, including Tina the director. Normally our discussions centre on Leicester’s survival in the Premier League, particularly this time last year when they were bottom. I am mentioning this because I am desperately trying to link the Leicester story to the theme of this blog which is that small percentages of a population can count, particularly in the context of the development of pesticide resistance.
It is generally accepted that pesticide resistance is a process of selection of naturally occurring mutations which happen to be resistant to a specific or a range of pesticides. The continual exposure to the mode(s) of action of the pesticide(s) to which there is resistance results in these resistant individuals becoming dominant in the pest populations, whether these pests be insects, diseases or weeds.
There has long been a debate in the industry about whether high or low doses increase the rate of selection of these naturally occurring mutations. Logic suggests it must be high doses, in order to select more effectively for the most resistant individuals. Low or sub-optimal doses are more likely to result in the higher survival of non-resistant or less resistant individuals. This type of discussion always raises the question of what actually constitutes a ‘high’ or ‘low’ dose.
Field experience and experimental evidence on fungicide resistance also suggest that it is usually higher doses that result in the more rapid selection of resistant individuals. The following is from a recent statement from the Fungicide Resistance Action Group, which represents the whole industry, on reducing the speed of development of septoria resistance to the SDHI fungicides. “All effective fungicides exert a selection pressure on pathogen populations and carry a risk of resistance. This risk can be modified and reduced by either mixing or alternating with fungicides with an alternative mode of action, or by reducing the number of applications or dose of the fungicide.” This suggests that, at last, there is a more general acceptance that the higher the dose the more likely there is to be selection for resistance.
I am positive that the same is generally true for herbicides. When first introduced, the ‘fops’ and ‘dims’ controlled around 99% of black-grass at the recommended dose. They were rarely used at reduced doses but resistance developed very quickly. The field experience with the sulfonylurea herbicide product Atlantis was that, when it was first used, there were always a few survivors of a full dose. I remember suggesting to farmers that these could be resistant and should be rogued. Resistance to Atlantis is now widespread and continues to increase rapidly.
However, there is some evidence to suggest that ‘low’ or sub-optimal doses can speed up the development of weed resistance to glyphosate. That gave me cause to ponder why this could be true. It did not take me long to conclude that the development of resistance is speeded up not by whether high or low doses are used but initially by doses that result in a low number of survivors. When first introduced, crop-safe herbicides such as the ‘fops’ and ‘dims’ and the sulfonylureas left only a few survivors when used at recommended doses. The survivors were more likely to be resistant to these modes of action.
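The selection logic above (resistance spreads fastest when a treatment leaves only a few, disproportionately resistant, survivors) can be illustrated with a toy calculation. The survival rates and starting frequency below are invented for illustration; they are not field data or figures from this blog.

```python
# Toy model of herbicide resistance selection (illustrative only; the
# survival rates below are invented assumptions, not field data).
def select(gens, freq_r, surv_s, surv_r):
    """Track the resistant fraction of a weed population over generations.

    freq_r  - initial frequency of the resistant type
    surv_s  - fraction of susceptible plants surviving one treatment
    surv_r  - fraction of resistant plants surviving one treatment
    """
    s, r = 1.0 - freq_r, freq_r
    for _ in range(gens):
        s, r = s * surv_s, r * surv_r   # apply the treatment
        total = s + r
        s, r = s / total, r / total     # survivors found the next generation
    return r

# A dose leaving very few, mostly resistant, survivors (1% of susceptibles,
# 50% of resistants) versus a dose that many susceptibles also survive.
few_survivors = select(5, 1e-6, 0.01, 0.5)
many_survivors = select(5, 1e-6, 0.30, 0.6)
print(f"few survivors:  resistant fraction = {few_survivors:.3f}")
print(f"many survivors: resistant fraction = {many_survivors:.3f}")
```

After only five treated generations the "few survivors" dose has made the resistant type dominant, while the dose that many susceptible plants also survive has barely shifted the population: it is the ratio of survival rates, not the dose label, that drives selection.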
Optimal doses of glyphosate should kill everything, provided that it is applied well, growing conditions are conducive to control and the weeds are at the appropriate growth stages for good activity. However, sub-optimal doses may leave a few individuals which may be more likely to have a level of resistance. The continued adoption of sub-optimal doses, particularly where minimal or no-tillage is employed, may form the basis of future populations which could perhaps cross-fertilise, resulting in individuals with even higher levels of resistance. It may have been significant that the first case of a partially glyphosate resistant weed in the UK was in sterile brome, where the dose recommended for control on stubble (540 g ae glyphosate/ha) is often only marginally effective on this weed.
So perhaps the speed of development of resistance is all about the dose required to select the most resistant types. This is often the recommended dose for crop-safe herbicides and fungicides but could be a sub-optimal dose for glyphosate, at least on some weeds. Hence, whilst it is not absolutely proven that sub-optimal doses speed the development of glyphosate resistance, it would be advisable to apply glyphosate correctly in the right circumstances, to use doses that will kill all the black-grass, and to inspect the results of treatment to ensure that there are no survivors. This is particularly the case where control by glyphosate is not supplemented by cultivations. For more information, please see the guidelines for minimising the risk of glyphosate resistance.
Best wishes for 2016.
The constructive comments on the script of this blog by Stephen Moss of Stephen Moss Consulting are gratefully acknowledged.
Posted on 18/12/2015 by Jim Orson
In my final blog of 2015 I would like to express my concern over the amount of mecoprop in the raw water that has recently been feeding some drinking water treatment works. Levels of up to four or five times those specified in the Drinking Water Directive have been detected. The source of this mecoprop is unclear. Products containing mecoprop-p are authorised for amenity use and for grassland, and the warm weather in the late autumn could have extended the application season.
My concern is that there are a few fields of wheat where the volunteer beans have been showing symptoms of hormone herbicide damage. Mecoprop-p is no longer authorised for application to cereals in the autumn after 1st October, and some farmers may not have been aware of this fact. A restriction like this would not have been introduced just because levels in water had in the past exceeded those specified in the Drinking Water Directive; there must also be environmental impact issues associated with this usage.
No doubt the water companies can remove the mecoprop before it arrives at the consumers’ taps but that is not the point. So does it matter that there are high levels in some water sources this autumn? Of course it does. It may send a signal to some anti-pesticide groups and legislators that some individuals may not be following the rules on the responsible use of pesticides, and consequently open a can of worms. If the regulations are not followed, what hope is there that voluntary measures will be adopted generally by the industry?
The success of voluntary measures is already being questioned by outside organisations. This year the RSPB has compiled a report intending to demonstrate that voluntary measures do not work. Some of the criticisms of the voluntary approaches mentioned in this report are unfair because they refer to observations made close to 10 years ago.
It does not stop there. Recently the Angling Trust and the World Wildlife Fund (now simply called WWF) settled their High Court dispute with the government over their accusation that Defra is failing to take effective action to protect waterways from agricultural pollution. They received courtroom reassurances from Defra that mandatory water protection zones (WPZs) are being actively considered alongside voluntary steps being taken by farmers to reduce pollution in rivers and wetlands.
It has long been accepted by the water companies that we have only until 2018 to demonstrate that voluntary measures are sufficient to meet the demands of the Drinking Water Directive. This does not mean eliminating all pesticide movement to water but it is critically important to reduce the size of the peaks* in pesticide content that can create real difficulties at the Water Treatment Works.
Hence, it is important to re-assure the pressure groups, the public, the Water Companies and the legislators that the industry can be trusted to reduce the inevitable environmental impact of farming. The folly of just a few farmers illegally using mecoprop-p, with intent or through ignorance, in cereals in the autumn would demonstrate the opposite and, if regularly repeated, could lead to further restrictions on mecoprop-p usage. I realise that the alternatives for autumn control of beans in cereals are slower acting and more expensive but that is a small price to pay in the context of the bigger picture.
* See earlier blogs that mention the importance of reducing the size of peak concentrations:
2 November 2015: Pesticides in water – meeting the challenge
16 November 2015: IT and reducing pesticides in raw drinking water
Posted on 07/12/2015 by Jim Orson
In my mid-April blog, “Roundup causing cancer?”, the classification of glyphosate by the International Agency for Research on Cancer (IARC - part of the World Health Organisation) as a probable carcinogen was discussed. Its evidence in humans came from correlating the occurrence of non-Hodgkin lymphoma with exposure to formulated glyphosate products, mostly agricultural, in the USA, Canada and Sweden. The concern about such studies is that there are no controls which allow for the exclusion of the many other farming practices which may have been causative. Recently, the European Food Safety Authority (EFSA) has concluded that “glyphosate is unlikely to pose a carcinogenic hazard to humans and the evidence does not support classification with regard to its carcinogenic potential”.
So why is there a difference of opinion between these two august bodies? Looking at the statements made by people far brighter than me, it seems that there are more reasons than the few I am about to mention.
Firstly, IARC, wholly or partly, inevitably based its opinion on studies of the product whilst EFSA based its opinion on just the active substance. When Roundup was first introduced I was led to understand that the acute toxicity (symptoms within 24 hours) of the formulants was higher than the glyphosate itself but the product was still very safe. The Soil Association raised the issue of the risk from the formulants even before the publication of the EFSA opinion. In the European Union context, the European Commission (EC) approves the active substance (in this case glyphosate) and individual zones or member states authorise the formulated products that can only contain EC approved active substances. I find it comforting that there are some very precautionary inclined EU member states that have authorised formulated products based on glyphosate.
IARC admitted that its opinion was based on a small database of scientific evidence and EFSA was able to assess a larger database. It also seems that IARC re-analysed some of the data in the papers it considered. The reasons for this are unclear and the authors of those papers may or may not have agreed with the method of analysis carried out by IARC.
There also appears to be a different approach between the two organisations when coming to their conclusions. According to the genotoxicologist Dr Peter Jenkinson, “EFSA followed a weight of evidence approach whereas IARC took the view that if one study showed a positive result then it took precedence over negative studies, even though there may be many more negative than positive studies.”
These two opinions were about assessing the hazard posed by glyphosate. Just to remind you, electricity is hazardous but the risk of electrocution is acceptable. This is because risk is a combination of hazard and exposure. As always with the risk from chemicals, it is about the dose. For instance, formaldehyde is listed by IARC as a Group 1 carcinogen (its highest category, meaning there is sufficient evidence that it causes cancer in humans). No pesticides appear to feature in this category but some medicines do. Formaldehyde naturally occurs in apples at concentrations of up to 22 parts per million (ppm) but at this concentration does not appear to be a risk. The maximum residue limit (MRL) for glyphosate in cereal grains is currently 30 ppm but this may change because its EC re-approval is currently being considered. I must emphasise that an MRL is not a value above which there is harm to health but is the highest level of a pesticide residue that is legally tolerated in or on food or feed when pesticides are applied correctly (Good Agricultural Practice).
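To put the "it is about the dose" point into numbers, the back-of-envelope sketch below compares daily intake at the full MRL with an acceptable daily intake (ADI). The ADI, cereal consumption and body weight figures are illustrative assumptions on my part, not values from this blog.

```python
# Back-of-envelope comparison of the glyphosate MRL with an acceptable
# daily intake (ADI). The ADI, consumption and body weight values are
# illustrative assumptions, not figures from the blog.
mrl_ppm = 30             # MRL for glyphosate in cereal grains (mg per kg grain)
grain_kg_per_day = 0.3   # assumed daily cereal consumption per person
body_weight_kg = 70      # assumed adult body weight
adi_mg_per_kg_bw = 0.5   # assumed ADI (mg per kg body weight per day)

intake_mg = mrl_ppm * grain_kg_per_day          # intake if every grain were at the MRL
allowed_mg = adi_mg_per_kg_bw * body_weight_kg  # daily intake the ADI would permit
print(f"intake at MRL: {intake_mg:.1f} mg/day vs ADI: {allowed_mg:.1f} mg/day")
print(f"fraction of ADI used: {intake_mg / allowed_mg:.0%}")
```

Even on the extreme assumption that all cereal eaten carries residues at the full MRL, the illustrative intake sits well inside the assumed ADI, which is the sense in which an MRL is a trading standard rather than a safety limit.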
There has been much fun made of IARC in terms of its opinion that red meat is probably carcinogenic and processed meat is carcinogenic. The statistic quoted on the TV programme Have I Got News For You was that IARC has examined 941 substances and has only found one to be non-carcinogenic. This may be a little unfair because much of their work may have been directed at substances where there was a real or perceived suspicion (there is no shortage of accusations from ‘green’ groups about glyphosate!). Many things when taken to excess are hazardous but in real life do not pose an unacceptable risk. It does not help IARC that the World Health Organisation has been widely criticised over recent weeks, notably for its slow response to the Ebola outbreak in Africa.
So my conclusion is very much the same as in my previous blog. Glyphosate products are safe when used as directed.
Posted on 19/11/2015 by Jim Orson
Looking at the husbandry of those achieving extremely high winter wheat yields this year suggests that it was the weather, good crop management and possibly higher rates of applied N that were key. Those who sell a programmed approach of trace element applications are trying to claim that this was also a factor, but I am very far from being convinced. Field yields of 16-17 t/ha were achieved without using any trace elements except, in some cases, manganese sulphate.
The requirement for above-typical rates of nitrogen for extremely high yields is debatable, because some farmers achieved these yields with their typical doses of nitrogen: around 200 kg N/ha for a first wheat on long-term arable soils where no organic manures are being used. However, NIAB TAG has some trial evidence of cost-effective responses from doses above 220 kg N/ha when plot yields of feed wheat exceeded 14 t/ha. We are still deliberating on how this information can be used in practice.
It is also a matter of debate as to why this year was so special for achieving such high wheat yields. We did have more than 10% extra radiation compared with the average during the growing season but this cannot be the sole explanation because it is certainly not unique.
Air temperatures were bang on or close to average from March onwards and so the growth and development of wheat proceeded at its average rate. Higher than average radiation is often associated with warmer than average air temperatures which in turn result in a more hurried growth and development and so there is less time to trap all the extra radiation and convert it into yield. Hence, higher than average radiation combined with typical or below average air temperatures can only be beneficial for yield potential.
One reason why air temperatures may have been restrained this year was that the sea surface temperatures of the North Sea and the North Atlantic were significantly cooler than average. Temperatures of the surface of the North Atlantic rise and fall in a multi-year cycle known as the Atlantic Multidecadal Oscillation. This year we seem to have been towards or at the bottom of the current cycle. There are many learned papers that suggest that this cycle influences our air temperatures. The map below, which compares current sea surface temperatures to the average, clearly shows how cold the sea surface was around the UK this June.
Does all this explain the extremely high yields achieved by some growers? I personally do not think so. I discussed the subject with Eric Ober, a crop physiologist who works for NIAB. He gave me a few leads which I have followed up. As a result I have concluded that it was not only the above average radiation combined with average temperatures this year that were behind the extremely high yields but also that it was ‘the right kind of radiation’. I realise that this sounds a bit quirky and so I need to explain myself.
I examined the daily hours of sunshine received in March-July recorded by the online automatic weather station at the Cambridge Computer Laboratory and compared their pattern with the four years in the previous ten when we received similar levels of sunshine hours during these same five months. 2015 was very different. There were fewer than half as many totally sunless days in these months, and the total sunshine hours were accumulated on more of a little and often basis than in the other years. In addition, I remember that this spring and early summer we regularly had sunny mornings until around 11.00 BST and then light cloud for the rest of the day. Some academic research shows that little and often bursts of lower intensity solar radiation (i.e. with a greater proportion of diffuse radiation) are better for photosynthetic efficiency than receiving the same overall level of radiation in longer and more intensive bursts. Similarly, there are several studies that show that solar radiation is more efficiently used by wheat when it occurs in the morning or evening rather than at midday (13:00 BST) when it is at its most intense.
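The comparison described above can be sketched in a few lines of code: given daily sunshine hours for the season, count the totally sunless days and see how the same seasonal total can arrive "little and often" or in longer, more intense bursts. The two short series below are invented for illustration; they are not Cambridge weather records.

```python
# Sketch of the sunshine-pattern comparison. The daily-hours series are
# invented illustrations, not actual Cambridge weather station records.
def summarise(daily_hours):
    """Return (total hours, sunless days, mean hours per sunny day)."""
    total = sum(daily_hours)
    sunless = sum(1 for h in daily_hours if h == 0)
    mean_on_sunny = total / (len(daily_hours) - sunless)
    return total, sunless, mean_on_sunny

little_and_often = [4, 5, 3, 6, 4, 5, 4, 5]     # no sunless days
longer_bursts = [0, 12, 0, 10, 0, 14, 0, 0]     # same total, 5 sunless days

for name, series in [("little and often", little_and_often),
                     ("longer bursts", longer_bursts)]:
    total, sunless, mean = summarise(series)
    print(f"{name}: {total} h total, {sunless} sunless days, "
          f"{mean:.1f} h per sunny day")
```

Both series deliver 36 hours of sunshine, but the first spreads them thinly over every day while the second concentrates them into three intense days, which is the distinction the blog draws between 2015 and the comparison years.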
In conclusion, I think that the reasons for the extremely high yields of wheat in some parts of the country this year are:
• The crops came out of the winter in good condition and there was little over-winter water logging.
• The very high yielding areas had a good dollop of rain in May, perhaps at the expense of some solar radiation.
• Temperatures from March onwards were around average but solar radiation was well above average, particularly in April and June. The April weather helped to ‘set up’ the foundations for potentially high yielding crops and June is the key month for grain fill.
• The higher than average radiation was supplied on a ‘less intense and more often basis’, which is conducive to high photosynthetic efficiency.
It was not just wheat that achieved very good yields this year and so other crops may have responded to some or all of these factors. Perhaps ‘the right kind of solar radiation’ is not so quirky after all.
Posted on 16/11/2015 by Jim Orson
This is continuing the theme of my previous blog. It summarised some of the very sensible and practical approaches that can be taken to reduce pesticide movement to water in order to meet the very demanding targets set by the Drinking Water Directive (DWD).
Research in France over the last twenty years and practical experience in the UK show that pesticide choice and dose are not such significant issues in meeting the DWD standards when the soil profile is relatively dry at the time of application and there is no significant rain for at least a week or so following application. The issue of pesticide choice and dose becomes critical when the soil profile is close to soil moisture capacity at application, particularly if it rains within a few days after application. In France there are guidelines on pesticide choice and dose where the soil is relatively dry and where it is approaching soil moisture capacity. These guidelines make interesting reading, particularly because they are derived from field experimentation rather than modelling.
It is also interesting to note that in field trials in France there has been more pesticide movement to water in some Integrated Pest Management (IPM) approaches, when compared to more conventional treatments. The explanation is obvious. The conventional treatments were applied when the soil profile was reasonably dry but the IPM treatments involved delayed autumn drilling, so pesticide applications were made to a wetter soil profile. Although less pesticide was applied in the IPM treatments, they resulted in more pesticide movement to water. This goes back to the dilemma of what we are trying to achieve with IPM, namely a reduction in pesticide use or a reduction in pesticide impact. This example demonstrates that reducing pesticide use does not necessarily mean reducing impact, and I hope that our researchers have the nous to aim for a reduction in impact rather than use.
There are some situations in the UK where there can be a very real conflict between optimising pesticide performance and reducing pesticide movement to water. The clearest example was covered in the previous blog. Propyzamide is an essential element in black-grass control in autumn sown oilseed rape and it has to be applied to wet and cold soils to optimise control. Subsequent rain can mean huge (in terms of the DWD) spikes in propyzamide levels in the water courses feeding water works although such levels comply with the Environmental Quality Standards. Whilst buffer strips, particularly in high risk fields, will reduce surface run-off, the movement through the soil to field drains can alone cause spikes.
It is absolutely essential to reduce the size and number of these spikes which can result in the closure of water works until the content supplied to domestic users meets the DWD standard. It seems to me, and this is not an original thought, that a way out is to try to close off the intake of the water works during the few hours when the spike occurs in addition to the other stewardship guidelines issued by the pesticide and water companies and the Voluntary Initiative. This is more difficult than it seems. First of all the water company has to be able to shut off the intake i.e. be able to supply water from another source or have reservoir storage it can utilise whilst the intake is off. Unfortunately, these facilities are not available at all water treatment works. The water company then needs to know when and where the propyzamide is sprayed and then it has to calculate the travel time of the spike to the intake of the water works.
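The shut-off idea can be sketched as a simple plug-flow calculation: take the time the spike is released into the water course, divide the stream distance by an assumed mean travel velocity, and add an allowance for dispersion either side of the peak. All the figures below (distance, velocity, timings, dispersion window) are hypothetical illustrations, not values from any real catchment.

```python
# Rough plug-flow estimate of when a pesticide spike reaches a water works
# intake, so the intake could be closed for that window. All figures are
# hypothetical illustrations, not real catchment data.
from datetime import datetime, timedelta

def spike_window(released_at, distance_km, velocity_km_h, spread_h=6):
    """Return (start, end) of the period the intake should be closed.

    released_at   - when the spike enters the water course
    distance_km   - stream distance from the treated catchment to the intake
    velocity_km_h - assumed mean travel velocity of the water
    spread_h      - allowance for dispersion either side of the peak
    """
    travel = timedelta(hours=distance_km / velocity_km_h)
    peak = released_at + travel
    return peak - timedelta(hours=spread_h), peak + timedelta(hours=spread_h)

# Hypothetical example: spike enters the stream 24 km upstream at 09:00,
# water travelling at 2 km/h, so the peak arrives about 12 hours later.
start, end = spike_window(datetime(2015, 11, 20, 9, 0),
                          distance_km=24, velocity_km_h=2)
print(f"close intake from {start} to {end}")
```

In practice the release time depends on rainfall moving the pesticide into drains rather than on the spraying date itself, which is exactly why the geo-referenced application data discussed below would be so valuable to the water companies.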
Estimating that travel time is becoming more feasible with a better understanding of pesticide movement to water, but locating when and where propyzamide is applied remains a difficult issue for water companies. Perhaps help is at hand with so-called ‘crowd-sourcing’ software on mobile phones. NIAB TAG is pioneering such an approach in order to further improve the services to its farmer subscribers. It was first tried last April by asking farmers the stage of flowering of their oilseed rape in order to improve the prediction of sclerotinia risk. Each farmer’s response was automatically geo-referenced and within a few hours the map in this article was produced. NIAB TAG is rolling out other uses for this approach and it is envisaged that the information provided can only result in more precise and timely advice.
Whilst I realise that NIAB TAG members have a common cause for taking part in crowd-sourcing approaches, I would suggest that farmers in the relevant catchments have every reason for working with water companies in a similar way in order to reduce the problem of spikes of pesticide content in the raw water entering water works.
IT can also help by providing improved field-specific advice. The WaterAware app from Adama is an encouraging start, but its designers accept that improvements are needed. Nevertheless, it provides a platform for the farmer or adviser to assess the risk of pesticide movement to water on a field-by-field basis through geo-referenced information on soil type and weather. Such an approach will still need the expertise of the farmer or adviser in order to make final decisions but it should lead to an improvement in decision making.