NIAB - National Institute of Agricultural Botany

Orson's Oracle

IPM - filling an EU legislative gap

Posted on 20/12/2013 by Jim Orson

A few years ago ‘Brussels’ sat down to review EU pesticide legislation. There were existing directives on pesticide registration (91/414), on water and waste (the Water Framework Directive and a directive on hazardous waste) and on residues (Regulation 396/2005 on maximum residue levels (MRLs)). ‘Brussels’ realised that there was a yawning legislative gap between registering pesticides and the consequences of pesticide use. This was the gap that was eventually filled by the Sustainable Use Directive (2009/128/EC), which covers the use of pesticides.

Just to remind you, an EU Directive is really a set of objectives which individual member states interpret and then transpose into their own regulations. The UK Plant Protection Products (Sustainable Use) Regulations 2012 were the result of the transposition of the Sustainable Use Directive.

This will not have a great impact on the use of pesticides in the UK because we have had similar legislation or audit requirements since the mid-1980s. Our advisers and those who apply pesticides are already qualified, our sprayers are regularly tested, and so on. However, there will be some changes in the UK as a result of this Directive. Notably, ‘grandfather rights’ to apply pesticides will disappear in November 2015 and we will have to demonstrate that we are employing Integrated Pest Management (IPM). Some EU member states have taken a more draconian attitude than the UK to introducing IPM by setting pesticide reduction targets: France a 50% reduction and Belgium a 25% reduction.

The question is: what is IPM? The answer can be in the eye of the beholder. For instance, I once asked a leading proponent whether a particular farmer was practising IPM. The soil type on his farm was really only suited to continuous wheat, which he grew for the breadmaking market. This restricted variety choice and he often had to grow disease-susceptible varieties to meet the demands of the miller. However, he used pesticides rationally. The answer from the IPM proponent was in the affirmative. However, the Sustainable Use Directive does not take such a relaxed view and has a definition of IPM in an annex. For some light Christmas relaxation, I suggest that this Directive is a better read than most.

The main IPM principles it espouses are crop rotation, cultural control, resistant varieties, clean seed, balanced nutrition, cleaning machinery, encouraging beneficial organisms, monitoring and the use of thresholds.

Perhaps there is not a lot that we can argue with in that list, but the weight given to each principle is critical. For instance, taken to extremes, rotation can radically reduce pesticide use but can result in a tumble in income and food production. I still look back with some anger at a talk delivered by a Danish researcher a couple of years back. The headline was that with new rotations the UK grower could reduce pesticide use by 50%. He obviously was not an economist and so seemed totally unaware of the economic damage that his proposals would cause. It all comes down to the old dilemma: farmers farm to make a living and to do that they grow only those crops most suited to their soils and climate. This is also the principle that drives world trade. The dilemma is that such an approach will too frequently conflict with what some in society expect them to do.

The Sustainable Use Directive suggests that cultural measures should be adopted if they provide satisfactory pest control. Perhaps ‘satisfactory’ needs to be defined but I am sure no UK farmer would apply a pesticide if an alternative cultural control measure was as consistently effective and as cheap to implement.

It is interesting to note that the Directive also encourages the use of reduced (to be more precise – “appropriate”) doses of pesticides. It is a bit of a breakthrough to get official recognition of appropriate doses. Advisers in some countries around the world are still threatened with legal action if they dare to suggest to farmers that a dose lower than the label recommendation is appropriate in a particular situation.

It is understood that the principles of IPM, as defined by the Sustainable Use Directive, will be included in the Code of Practice for using Plant Protection Products, which is referenced in general cross-compliance (SMR 9). This should have little impact on the way we farm as I am sure that most UK farmers already employ the wider principles of IPM. The problem is: how do they prove it? An IPM plan is being developed by the Voluntary Initiative to be incorporated into assurance schemes (do not all groan together). This plan will include a self-assessment process. Those who groaned because they thought that the new IPM plan would be yet another form to fill in will be relieved to hear that it is intended that this will replace the current Crop Protection Management Plan!

Happy Christmas!

 


 

Goodbye KISS?

Posted on 13/12/2013 by Jim Orson

Times were tough for UK cereal producers around the turn of this century and costs were really under the hammer. Whilst some savings could be made with variable costs, labour and machinery costs were the main targets.

Trying to reduce labour and machinery costs per hectare had many implications, including an increase in the scale of production and the widespread adoption of non-plough tillage. As I have said in previous blogs, comparing contractor charges for a plough-based system with those for non-plough tillage to a depth of 20 cm does not reveal where large savings could have been made. It was the rate of work of non-plough tillage that enabled a reduced labour and machinery complement to maintain timeliness.

However, agronomy played its part. Wheat varieties suited to sowing earlier than the historical dates were identified, allowing a longer drilling window. In addition, there were effective pesticides that enabled crops to be protected from the potential problems of very early drilling. The KISS (Keep It Simple Stupid) principle became established as a way to maintain crop protection with a diminished availability of managerial time and labour input per hectare. Block cropping was increasingly adopted, and cereal variety choice was influenced by the ability to phase fairly predictable crop protection and nutrition requirements in order to help smooth out the labour peaks of the following spring. It now all seems so simple and straightforward.

The first real challenge to this approach came in the dry autumn of 2003. Control of blackgrass in cereals was often poor with the pre-emergence herbicides. However, the drought-induced delayed emergence resulted in blackgrass being small and not shaded by the crop in the following exceptionally warm February. These were ideal conditions for the newly introduced Atlantis. I dread to think what would have happened that year without that herbicide. In reality, I do not have to use my imagination too much. Wheat would have had infestations of blackgrass similar to those of the summers of 2010 and 2012, when the previous dry autumns also limited the activity of the pre-emergence herbicides but by then the potency of Atlantis was diminished by resistance.

This reduced ability to control blackgrass has implications for block cropping. I have regularly witnessed one or two fields in a block of several fields having particular control problems. A single common approach to blackgrass control across all the fields in the block therefore means either too much for some fields or too little for others. Hence, things are getting more complicated and starting to challenge the KISS approach.

Resistance is also making it more difficult to plan fungicide programmes in wheat well in advance. Increasing resistance in septoria is limiting the length of activity of the triazoles on the disease, particularly the period of eradicant activity. This reduces the flexibility in timing and may also mean an additional application between the traditional timings of the second node detectable stage of the crop and full flag leaf emergence in wet seasons. 

Pesticide resistance is not the only challenge to KISS in cereal production. Pesticide revocations have also added complications. The loss of trifluralin was a particular blow. Although not the most effective blackgrass herbicide in cereals, it had the attributes of being cheap and of having an efficacy unaffected by pesticide resistance or by dry seedbeds. Obviously the current withdrawal of the neonicotinoid seed dressings in oilseed rape will complicate management in the future.

So whilst adopting the KISS principle helped enable cereal farmers to survive the lean times early this century, it is becoming more difficult to put into practice. However, it is not yet time to kiss KISS goodbye.


 

What is a legitimate environmental target?

Posted on 05/12/2013 by Jim Orson

Professor Seralini is not a household name but he has been in the limelight in the ‘GMO world’ for a few years. He is Professor of Molecular Biology at the University of Caen in France and is known to be opposed to genetic modification. I’m not sure whether he held such views before he started to carry out studies on feeding GMOs to rats or formed them only after he saw the results of those studies.

In September 2012, his group published a peer-reviewed paper titled "Long-term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize". It concluded that rats fed GM maize developed more tumours than those fed conventional maize. There are various rumours about who funded the study, but all the organisations mentioned are anti-GM.

What was unusual was that steps appear to have been taken to ensure that the first news stories published about the study would not be critical of its methods or results. Science journalists were briefed prior to its official publication only on condition that they did not mention the paper to any independent researchers or outside experts. By the day of its publication, anti-GM activists were all briefed for interviews with the wider media and there was even a beautifully crafted video available on websites. 

However, the cunning plan did not quite work. A copy of the paper was circulated to independent researchers just prior to publication.  This enabled the Science Media Centre to produce a document on the day of publication in which leading scientists commented on the validity of the conclusions made by Professor Seralini and his colleagues. This resulted in most of the national media not mentioning the paper at all. This all suggested that there was a mood change in the media both towards GM and also to the methods being adopted by anti-GM groups.

That mood change has continued. Articles that support the cultivation of GM crops have started to appear in the national press. In addition, politicians, notably Owen Paterson, have spoken forcefully on the subject and various other governments are also taking a more pro-GM stance.  

After a year-long analysis, the journal has now retracted Seralini’s paper on the basis of the interpretation of the data generated by the feeding studies. However, Professor Seralini is not taking this lying down and is contesting the decision. He is even threatening legal action, mentioning that a member of the editorial board of the journal once carried out a piece of research for Monsanto. In the anti-GM world this is equivalent to supping with the devil. That view is now becoming outdated, as many in society are as suspicious of research funded by anti-GM groups as they are of research funded by commercial companies.

However, I hope that the alleged ‘getting rid of green crap’ remark made by our Prime Minister does not allow the pendulum to swing too far towards ignoring legitimate concerns over the environment. Reducing environmental damage can be an expensive process but may bring larger longer-term commercial and societal gains. The issue is defining a legitimate concern.

I was reminded of this during a lecture I recently attended. Arable weeds have prospered only since the adoption of cultivated crops. Prior to that they were restricted to small areas where the soil was regularly moved by natural processes, for example by soil erosion. Hence, the introduction of arable cropping gave them the ideal opportunity to thrive, particularly before the introduction of effective herbicides. During this period, many farmland bird species also thrived because these arable weeds supported their insect food.

GM herbicide tolerant crops have been questioned because their adoption could reduce the numbers of arable weed seeds in soils. However, fundamentalist ecologists would say that arable cropping is, by definition, damaging to a totally natural environment. This suggests that they are not over-concerned about arable weed seed levels in the soil. With such contrasting views, no wonder we’re all struggling to get a clear view on just what are the legitimate targets for environmental policy.


 

Nature more effective than black-grass herbicides

Posted on 27/11/2013 by Jim Orson

I have had a few enquiries this autumn about the fate of black-grass seed after shedding. This type of question is fostered by rather imprecise language describing the number of black-grass seeds formed on a head of the weed. On average one head of autumn-germinating black-grass sets 100 seeds; a convenient number. But are all these seeds viable, as some claim?

The simple answer is no. Although, of necessity, there is a lot of variability involved in determining the fate of black-grass seed, the average figures are intriguing. The best source of such data is page 86 of HGCA Project Report 466 – Integrated Management of Herbicide Resistance.

The data suggest that on average only 55% of the seeds are viable. This still seems a horrifying situation; on every black-grass head there are 55 viable seeds. However, be (relatively) comforted by the huge subsequent losses.

Only 45% of the seeds survive the period between shedding and sowing a following crop in late September. So 45% of the 55 viable seeds per head make it to a stage where they can potentially become a weed in the following crop. However, many are buried by cultivations to a depth from which they will not emerge, assumed to be more than 5 cm. For instance, non-inversion tillage carried out to a depth of 20 cm buries 40% of freshly shed seeds deeper than 5 cm. As you can see, the losses keep mounting and there is more to come.

Only a proportion (15%) of these freshly shed seeds in the top 5 cm will actually germinate and establish plants in the following crop. All this means that for every head of autumn germinating black-grass that survives to produce seed in June, approximately 2.2 plants will emerge in a following late September sown crop established after non-plough tillage to a depth of 20 cm.
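
Putting those averages together shows where the 2.2 figure comes from (my own multiplication of the figures quoted above, not a calculation taken from the report): 100 seeds × 0.55 viable × 0.45 surviving to sowing × 0.60 remaining in the top 5 cm × 0.15 establishing ≈ 2.2 plants per head.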

Of course life (certainly the life of black-grass) is not that simple. Some of the seeds shed will survive up to three or more years in the soil. There is a loss of about 70% of viable seeds a year but this is partly compensated for by the fact that a higher proportion (30%) of over-yeared viable seeds will germinate, because of the loss of dormancy, if they are in the top 5 cm of soil.

This complicates the maths but when it is all worked through, the seeds shed from one head will result in around three plants being formed over the following three years in continuous late September sown crops established after non-inversion tillage to 20 cm depth. So there is some comfort to be had from this information on seed losses but please note that these are average figures.
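
For those who like to check the maths, the figures quoted above can be strung together in a few lines. This is a minimal sketch only: the percentages are those given in this post, but the assumptions that each year's cultivation re-buries 40% of the surviving surface seeds and that buried seeds never return to the top 5 cm are mine, not the report's, so treat the output as illustrative.

```python
# Minimal sketch of the black-grass seed-fate arithmetic described above.
# Figures quoted in the blog: 100 seeds per head, 55% viable, 45% surviving
# shedding to sowing, 40% buried below 5 cm by tillage to 20 cm, 15% of
# fresh (30% of over-yeared) top-5 cm seeds establishing, and ~70% of
# viable seeds lost each year.
# Assumptions of mine (not from the blog or HGCA Project Report 466):
# each year's tillage re-buries 40% of the surviving surface seeds, and
# buried seeds never return to the top 5 cm.

SEEDS_PER_HEAD = 100
VIABLE = 0.55
SURVIVE_TO_SOWING = 0.45
STAY_IN_TOP_5CM = 0.60       # 1 minus the 40% buried below 5 cm
ESTABLISH_FRESH = 0.15       # fresh seeds in the top 5 cm
ESTABLISH_OVERYEARED = 0.30  # over-yeared seeds, dormancy lost
ANNUAL_SURVIVAL = 0.30       # ~70% of viable seeds are lost each year

top_5cm = SEEDS_PER_HEAD * VIABLE * SURVIVE_TO_SOWING * STAY_IN_TOP_5CM
plants = top_5cm * ESTABLISH_FRESH                # year 1: ~2.2 plants
for year in (2, 3):
    top_5cm *= ANNUAL_SURVIVAL * STAY_IN_TOP_5CM  # annual loss plus re-burial
    plants += top_5cm * ESTABLISH_OVERYEARED

print(f"plants per head over three years: {plants:.1f}")  # prints ~3.2
```

This lands close to the "around three plants" figure quoted above; a different re-burial assumption would move the result somewhat, which is a reminder that these are averages.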

The factor that I have not mentioned until now is the number of fertile tillers (i.e. heads) per plant. Generally, the earlier the autumn sowing the more heads per plant are formed. In addition, extreme summer weather events can make a huge difference. In 2012, the continuous wet conditions resulted in low black-grass tiller losses and so very high heads/plant. In contrast, the very dry spring of 2011 resulted in tiller losses right up to head emergence, leading to very low heads/plant.

In conclusion, and in the context of total seeds set, natural losses in continuous late-September sown cropping are higher than those that can now be achieved with cereal herbicides. However, these losses are a long way from being high enough to avoid the use of chemicals.

A lot of research and field observation is being carried out to see how these natural losses can be increased, for example by later drilling. Head numbers per plant can be reduced by higher crop seed rates, and the weed's lifecycle can be disrupted by spring sowing. It is heartening that we can do things that enhance natural losses, but they all cost money and, for many, will result in systems that are less reliable than continuous early-autumn sown crops.


 

Giving up on organic matter

Posted on 20/11/2013 by Jim Orson

Last week I visited Rothamsted Research with a few arable farmers. As always, it was very good value and we were updated on many aspects of the organisation’s research. Pesticide resistance, pesticide availability and soil issues were the main themes.

One area that caused particular interest was the evidence behind a new HGCA- and Defra-funded project on how to get the best out of organic manures. The whole industry is aware not only of the desirability of increasing organic matter levels in long-term arable soils but also of the futility of trying to achieve such an objective.

Changing to a system of interspersing short periods of arable crops between medium-term grass leys could help but would also have a negative impact on both production and profits. The alternative is to use a lot of organic manures or amendments at regular intervals. However, there are simply not enough of these organic sources to have a significant impact on organic matter levels on more than a very small percentage of our arable land.

So how best to use these limited organic sources to maximise their benefits? Some research at Rothamsted has provided a clue and is the basis of the new project. Researchers found that the benefits of annual farmyard manure applications, in terms of spring barley yields, appeared quickly when judged against the same amount of manure having been applied annually since 1852 (see figure). This yield benefit, recorded in 2002 after just two successive autumn applications of farmyard manure, could not be explained either by the nutrients in the manure or by an increase in soil organic matter. So what is the possible explanation?

[Figure: Rothamsted spring barley FYM project]

Figure: spring barley yields (t/ha) at the optimum doses of mineral (bag) nitrogen. The blue line is where mineral nitrogen only has been used annually since 1852, the green line is where both farmyard manure and mineral nitrogen have been used annually since 1852, and the red line is where organic manure and mineral nitrogen have been used annually since 2000 (i.e. first manure application in autumn 2000). Graph kindly provided by Andy Whitmore of Rothamsted Research.

 

Regular readers of my blogs (if there are any out there!) will have guessed the possible explanation because I’ve banged on about this issue before. The application of crop residues and organic manures and amendments increases the levels of fungal, bacterial and faunal biomass in the soil. These can act as a surrogate for organic matter. However, to maintain these benefits, annual applications are needed.

It is for this reason that the value of straw to the cereal farmer is more than just the nutrients it contains. Many farmers reported that their land worked more easily after a couple of years of incorporating rather than burning straw. However, despite positive effects on the soil, Rothamsted has not found a benefit in terms of winter wheat yields from the annual incorporation of up to four times the average straw yield.

Why were increased yields from the regular incorporation of organic materials recorded in spring barley and not in winter wheat? It’s my view, which cannot be verified, that spring barley yields are more likely to benefit because the crop is established in more hostile growing conditions and has a shorter growing period. We all recognise that winter wheat yields are more likely than spring barley yields to compensate for set-backs in the first couple of months of growth. The opposite may well also be true: spring barley is more likely than winter wheat to benefit from improved soil conditions in the early stages of establishment.

To continue my theorising, how best should we use the limited UK supplies of organic manures and amendments? Based on very limited information, it would seem that using smaller amounts annually, rather than large amounts intermittently, in largely spring-crop-orientated rotations may be the best way to maximise yield benefits. However, applying smaller amounts annually will result in extra costs. I’m sure that the Rothamsted project will provide guidance as to the way forward.


 
