NIAB - National Institute of Agricultural Botany

Orson's Oracle

Not as bad as first thought?

Posted on 24/01/2012 by Jim Orson

There have been some very doom-laden studies on the financial implications of the Drinking Water Directive for pesticide availability. These suggest that the herbicides relied upon to control black-grass in oilseed rape could disappear, with dire financial consequences for the whole rotation. The view of the Environment Agency (EA) is that these studies are far too gloomy.

So what’s the truth? Well, the EA is drawing up maps of the areas that influence the pesticide content of raw drinking water. These represent very much the minority of arable land. So by no means is the whole of the arable area expected to meet the demanding targets set by the Drinking Water Directive, which is incorporated into the Water Framework Directive.

And do the targets set by the Drinking Water Directive have to be fully met in raw drinking water? It seems not. The Water Framework Directive aims to stop the situation getting worse and to avoid further investment in treatment.

But, but, but…this does not let any arable farmer off the hook. They need to adopt every reasonable measure in order to minimise movement of pesticides into water. It may be that additional measures need to be adopted in those areas that are sources of drinking water.

One major management target is for farmers to do their best to avoid huge spikes in pesticide levels in watercourses that can occur after heavy rain. Reducing water run-off, particularly from recently sprayed fields, will help significantly.

Outside the areas where drinking water is sourced, water bodies (typically lakes and rivers) have to meet the Environmental Quality Standards (EQSs) for the content of specific pesticides. The comforting fact is that a 2010 report suggests that 99% of water bodies are meeting these standards. A new report is being prepared and we await its results.

But, but, but…there is concern in arable farming that there are now interim arrangements for wider aquatic buffer zones, which can be up to 20 metres wide.

These were introduced at the request of the pesticide companies so that the maximum number of existing and future pesticides remains available to farmers. This is because the standards required to prevent spray drift damaging water ecosystems have become more demanding over the last few years.

The question is: why have these standards increased? I have a sneaking suspicion that it’s because of the standards set in order to meet the requirements of the Water Framework Directive!

On the one hand, it may take only a small amount of spray drift to exceed EQS standards in a small field-side ditch. On the other hand, the impact of individual fields on pesticide levels in a larger water body is, quite literally, watered down by dilution with water from untreated areas.
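
Some rough numbers illustrate the point. The sketch below is a toy calculation; every figure in it is an invented assumption for illustration, not a measured value.

```python
# Toy dilution sums: the same drift deposit gives very different
# concentrations in a small field-side ditch and a larger river.
# All figures are invented, illustrative assumptions.

drift_deposit_mg = 50.0        # pesticide reaching the water as drift (mg), assumed

ditch_volume_l = 20_000        # ~100 m of ditch, 1 m wide, 0.2 m deep = 20 m3
river_volume_l = 20_000_000    # a river stretch fed largely by untreated land

ditch_ug_per_l = drift_deposit_mg / ditch_volume_l * 1000   # mg/L -> ug/L
river_ug_per_l = drift_deposit_mg / river_volume_l * 1000

print(f"Ditch: {ditch_ug_per_l:.4f} ug/L")   # 2.5000 ug/L
print(f"River: {river_ug_per_l:.4f} ug/L")   # 0.0025 ug/L, a thousand-fold dilution
```

On these made-up figures the concentration in the ditch is a thousand times that in the river, which is the nub of the buffer zone issue.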

EQS standards may largely be met for water bodies, but the impact may be felt at field level through the likelihood of wider buffer zones.

So, is the potential financial impact of the Water Framework Directive less than originally forecast in the reports on the subject? Probably yes, but, but, but…there is a potential financial impact from wider buffer zones that was not forecast in these reports.




Can N recommendations be improved?

Posted on 18/01/2012 by Jim Orson

The main focus of Rothamsted when it was set up in 1843 was to improve fertiliser practice. So it’s sobering to reflect on where we are nowadays with nitrogen recommendations for crops.

The depressing fact is that, as an industry, we haven’t got much further. In winter wheat, at best we can predict the optimum fertiliser N rate to within 50 kg/ha in only 75% of cases, even where there’s not much N in the soil, i.e. long-term arable soils with low organic matter and where no organic manures are used.

This sounds unimpressive, but the other methods of prediction are far worse. And they are even less reliable when it comes to predicting optimum fertiliser N rates where there’s a good bit of nitrogen already kicking about in the system.

But why?

It’s hard to blame the scientists at the blue sky end of the research spectrum. Nitrogen cycling in the soil is a terrifyingly difficult subject, even with the advantages of modern research techniques. In addition, the optimum fertiliser N requirement can be influenced by weather after application.

[Image: N fertiliser application]

Personally, I consider that some progress has been lost at the more practical end of research, because there was an aura that it had all been resolved. We had nice tables in RB209 that implied accuracy. Measuring soil mineral nitrogen implied even greater accuracy. However, some of the simple assumptions behind these prediction methods may be flawed. Scientists charged with improving recommendations need to look at all the recent evidence to see if things can be improved.

With this background in mind I was excited to read, and analyse, HGCA project report PR447 on canopy management of winter oilseed rape. There was a remarkably close relationship (remarkably close for nitrogen trials, at least) between N in the canopy in the spring, measured by laboratory analysis, and the optimum fertiliser N requirement. Perhaps this relationship is even more remarkable when the fickleness of the oilseed rape crop is taken into account.
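
For readers unfamiliar with the canopy management approach, the sketch below shows the style of calculation involved. The coefficients (N held per unit of Green Area Index, target canopy size, fertiliser recovery) are illustrative assumptions drawn from the general canopy management literature, not figures from PR447.

```python
# Sketch of a canopy-management style fertiliser N calculation for
# winter oilseed rape. All coefficients are illustrative assumptions.

N_PER_UNIT_GAI = 50.0   # kg N/ha held in the canopy per unit of Green Area Index (assumed)
TARGET_GAI = 3.5        # target canopy size at flowering (assumed)
FERT_RECOVERY = 0.6     # fraction of applied fertiliser N recovered by the crop (assumed)

def fertiliser_n_required(spring_gai: float) -> float:
    """Fertiliser N (kg/ha) needed to build the canopy from its spring size to the target."""
    canopy_n_now = spring_gai * N_PER_UNIT_GAI
    canopy_n_target = TARGET_GAI * N_PER_UNIT_GAI
    shortfall = max(canopy_n_target - canopy_n_now, 0.0)
    return shortfall / FERT_RECOVERY

# A small, backward crop needs far more fertiliser N than a lush, forward one:
for gai in (0.5, 1.5, 3.0):
    print(f"Spring GAI {gai}: apply ~{fertiliser_n_required(gai):.0f} kg N/ha")
```

The point of the method is that a measurement of what the crop already holds, however it is made, replaces a blanket assumption.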

However, the relationship between canopy N and optimum fertiliser N was made worse when soil mineral nitrogen was also taken into account. The same occurred in canopy management trials in Germany’s Schleswig-Holstein.

We now know the possible pitfalls when trying to accurately measure soil mineral nitrogen. Additionally, the assumptions used to predict the impact of soil mineral nitrogen on crop fertiliser N requirement may be erroneous.

However, it is clear that another stumbling block is getting a practical field method that provides an accurate estimate of N in the canopy.

There are electronic methods of estimating canopy size which can provide a reliable estimate of the total N in the canopy. However, their ability to predict the optimum economic dose of applied N is much reduced compared to the method that involves plant analysis.

So, perhaps, history repeats itself. Have we here an improved method of prediction, but it is not reaching its full potential because of underlying assumptions? Perhaps only a researcher will say this, but we need more R&D on this issue...



...and then came the rules

Posted on 10/01/2012 by Jim Orson


I was listening to one of my favourite tracks on my Christmas present over the holiday period. The song describes the opening-up of the US prairies and has the line: ‘First came the churches, then came the schools, then came the lawyers and then came the rules’. Does anyone know which song it is?

We may choose a different set of institutions that influence current developments in society, but the end point is the same. There would have to be, at least, an indirect reference to laboratories - if they can measure it, we get a rule. Mycotoxins in cereals and pesticides in water are cases in point.

Perhaps the most extreme example of this is the Environmental Quality Standard (EQS) for cypermethrin in water bodies - rivers and lakes - at 0.1 parts per trillion (0.0000000000001 by weight). I’ve long known that this pesticide could be a threat to aquatic life, but I didn’t know it was that risky! And yes, the chemists claim that they can accurately measure this concentration in water.

EQSs are set for all chemicals, not just pesticides, that may be deemed a threat to the ecological status of water bodies. In 2011 nine widely used pesticides were listed by the EU as being a particular threat. Of those, only chlorpyrifos still remains on the UK market, and then only by a thread.

Gone are trifluralin, isoproturon, simazine, atrazine, alachlor, diuron, chlorfenvinphos and endosulfan. All but one of this group failed to get approval in the EU during the re-registration process, although the threat to water may not have been the main reason why some of these failed.


Isoproturon was the one active ingredient in that group that did achieve EU approval. However, UK authorisation was not granted for products delivering 1,500 g ai/ha. I’ve assumed that this was because of the threat to water ecosystems, and nobody seems to say anything different. In the case of isoproturon, the EQS is well above the drinking water standard of 0.1 parts per billion (0.0000000001 by weight), but obviously not high enough to allow us to use 1,500 g ai/ha.
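
To put those two notations side by side, here is a quick conversion into everyday laboratory units, using the standard assumption that a litre of water weighs one kilogram:

```python
# Converting the two standards quoted above into ng/L,
# assuming 1 litre of water weighs 1 kg (= 1e12 ng).

NG_PER_LITRE_OF_WATER = 1e12

cypermethrin_eqs_ng = 0.1e-12 * NG_PER_LITRE_OF_WATER   # 0.1 parts per trillion
drinking_water_std_ng = 0.1e-9 * NG_PER_LITRE_OF_WATER  # 0.1 parts per billion

print(f"Cypermethrin EQS:        {cypermethrin_eqs_ng:.1f} ng/L")    # 0.1 ng/L
print(f"Drinking water standard: {drinking_water_std_ng:.1f} ng/L")  # 100.0 ng/L (= 0.1 ug/L)
```

In other words, the cypermethrin EQS is a thousand times more stringent than the drinking water standard.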

Not to be outdone by the EU, the UK also had its own list of chemicals deemed to be a threat to water. This list has had a better survival rate through the re-registration system, with only diazinon failing to get EU approval.

Now, with renewed zeal, the EU and the UK are preparing new lists of chemicals that pose a threat to the ecological status of water. There are a few pesticides on the draft EU list, and the only one of any significance in UK arable farming is bifenox (Fox). However, the draft UK list includes methiocarb (Draza), chlorothalonil (Bravo) and...wait for it...glyphosate.

Should they be on the final agreed list, their EQSs will be established and their presence will be monitored in water bodies. Fortunately, with the possible exception of chlorothalonil, these pesticides are not liable to move significantly through the soil to land drains, which means that the main source in water is likely to be point sources such as sprayer filling areas. Hence, their appearance in water is largely under the control of users.

Compared to isoproturon, chlorothalonil is less prone to move through the soil, is used at a lower dose, and is typically applied at a time of year when drains are not running. But it is being detected in watercourses in France.

Should chlorothalonil get on the finally agreed UK list, its EQS will be critically important to its future. Let us hope that this valuable pesticide does not fall foul of the analytical chemists and become another victim of the rules.



Reversing the decline

Posted on 28/12/2011 by Jim Orson

There was a very thoughtful reply to my blog on the impact of predators on farmland bird numbers, which included two key statements: ‘the predators have as much right to exist as their prey’; and ‘….controlling a species to benefit another is just wrong. It's manipulating nature, not conservation’.

I have sympathy with these statements, but there remains the need to reverse the slide in farmland bird numbers in order to meet one of Defra’s quality of life indicators.

Since the baseline year of 1970 there has been a fall of over 50% in farmland bird numbers. Over the same period, the populations of magpies and sparrowhawks have nearly doubled.

Now, I’m not saying that this is by any means cause and effect. However, the increase in predator numbers must have had some influence on farmland birds. Is it really fair to ask for a return to the farmland bird numbers recorded in 1970, when there were significantly fewer predators?

The response referred to earlier also stated ‘the fact that they [predators] are still doing well is because actually there is plenty of prey out there on our farms for them to flourish’. I suppose the tongue-in-cheek response must be: perhaps we should be using predator numbers as the indicator, rather than the population of farmland birds?

There is no doubt that much of the decline in farmland birds has been due to changes in land use. The reduction in mixed cropping and the increase in winter crops at the expense of spring crops have been major factors. We’ve seen publicly-funded schemes try to reverse these declines, but they don’t seem to have worked.

Research by the University of Reading has concluded that take-up of ELS options is very much out-of-line with the requirements of farmland birds. There needs to be more clarity on which options are required in order that declines can be reversed.

But this clarity will bring some unpalatable news to some farmers. There will be a need to devote land to wild bird seed mixes for winter food and to flowers for insects in the summer. These in-field options have not been popular, but they really are the essence of trying to strike a balance between biodiversity and food production.

The Government is examining future options under ELS and has also consulted on new quality of life indicators. Previous experience, and research like the University of Reading’s, will hopefully mean that we not only get realistic and pragmatic indicators but also the means of achieving them.

I say pragmatic partly because I have to ask why we cannot currently use pesticides or fertilisers to optimise the biodiversity gain from the hopefully limited area we have to devote to this aim.

Reducing the area needed will have wide support from farmers. It’s not these inputs that have been directly responsible for the decline in biodiversity, but they could in the future make a significant contribution to reversing it.



Is British weather a basket case?

Posted on 20/12/2011 by Jim Orson

We reviewed all our 2011 experimental results recently. One trial result particularly interested me - chlormequat increased wheat yield in the absence of lodging. This doesn’t typically occur, despite what many are led to believe. So why did it this year?

The yield increase was found at a trials location that witnessed a lengthy spring drought from late-February to late-May. It then rained in June and July and this, combined with low temperatures, provided very good conditions for grain fill.

This is the ideal scenario for chlormequat to increase yields. The growth regulator increases potential grain sites, usually by reducing tiller loss during the period of rapid growth. This year there was a very high loss of tillers in the drought conditions, often resulting in sub-optimal populations of heads. Hence, the additional grain sites from the use of chlormequat were able to exploit fully the excellent grain-filling conditions.

Mind you, the opposite can occur.

Yield losses from chlormequat can happen when good growing conditions in the early spring are followed by hot and dry weather before and during grain fill. So, the impact on yield from chlormequat use is a gamble on the weather.

This is true, to a greater or lesser extent, for most decisions on crop input management. If we knew future weather conditions then it would be a lot easier. But we don’t, which means compromises, in the guise of risk management, have to be made.

Everyone can quote examples where decisions are compromised by having to take into account the weather not doing what we would wish it to do. One good example is nitrogen application timing for wheat. If we knew that there was going to be a decent amount of rain at flag leaf emergence then we would apply most of the nitrogen just before that stage to optimise yield.

The unpredictability is not restricted just to input decisions. This year’s spring drought followed by rain in late May reduced the relative yield of early-developing winter wheat varieties in many cases. Obviously these varieties were less able to compensate for the spring drought than the varieties that were less advanced when the drought broke, which is just sod’s law!

Some crop physiologists say that one way to help combat the warmer and drier summers forecast is to grow earlier-flowering and earlier-ripening wheats. It just goes to prove that no-one should put all their eggs in one basket when the basket is as fickle as our climate.

After saying all that, we have little to complain about regarding the variable impact of weather on yields and crop input decisions. In parts of Australia the yields between seasons can vary from nothing to around five or six tonnes per hectare. In some seasons much of this variation is explained by rain falling after the time most input decisions are made. This is risk management in the raw.

By the way, I accept that the main reason for the use of chlormequat in winter wheat is to help keep it standing. Another unpredictable risk.


