NIAB - National Institute of Agricultural Botany

Orson's Oracle

Silencing scientists?

Posted on 04/03/2016 by Jim Orson

In the last couple of weeks I have been viewing the lights of Cambridge from the top of the Gog Magog hills (well they pass for hills in Cambridgeshire). To explain, I drive over the hills whilst returning home from my golf club.

There are now two levels or layers of lights. The lower layer comprises the normal lights of a city: houses, flats, streetlights etc. The upper layer comprises red lights that mark the top of the large cranes on the many construction sites in and around the City. I do not know how many cranes there are but the effect of their lights at night is quite striking. Some have likened the skyline to that of Dubai during its rapid expansion.

I have often wondered what it must be like to operate such cranes. However, I will never find out because I am a wimp when it comes to heights. The overviews of the City must be great but of course to see the precise detail of a specific location you have to be at ground level.

I am becoming increasingly concerned that the views of those who work at the ground level of science, and who know the nuances and details of the subject, are increasingly being pushed aside by those who have only an overview of the issues. This is a worrying trend and one which shows signs of getting worse.

Last year there was a missive from the Cabinet Office demanding that all civil servants (many scientists are civil servants) must seek prior permission from a minister to speak to the media. This feels like a restraint on scientists engaging proactively with topical controversies such as badger culling and gene editing. I realise that ministers have scientific committees to inform decisions but there are, inevitably, many expert scientists who are excluded from this process.

There is now further concern that UK scientists may be prevented from arguing for changes in national legislation or policy if research grants are not exempted from a government ban on the use of public funds for political lobbying. This ban was announced in February and will be introduced in May. This prohibition could affect university and research station staff who receive support from public funds. Much agricultural research is publicly funded by the BBSRC (Biotechnology and Biological Sciences Research Council). Does this mean that these scientists will not be able to take part in public debates on scientific issues relating to policy, or to respond to public consultations in their areas of expertise?

At the moment there is confusion on this issue and the government department responsible seems unable to provide clarification. This has prompted a petition which many scientists are signing. I sincerely hope that it is not the government’s intention to gag scientific opinion and that clarification will protect the right of scientists to make their informed opinions widely known.

Without such clarification life for scientists will become very difficult because ‘lobbying’ is not easily defined. Nor is it easy to define a policy issue. For example, some scientists have commented that a recent paper stating that ‘organic’ milk is more nutritious for consumers is flawed. The paper compares ‘organic’ milk from grass-fed cows with ‘conventional’ milk. The flaw is that the comparison should have been between ‘conventional’ milk from grass-fed cows and ‘organic’ milk from grass-fed cows. It has been known for some time that grass-fed cows produce more nutritious milk: even I knew this!

The criticism of this paper is perhaps just scientific opinion rather than lobbying. However, organic production could be seen by some as part of government policy. Hence, the uncertainties of the boundaries between scientific opinion and lobbying would inevitably inhibit scientific comment and progress should the lobbying by scientists receiving public funds be banned. Let’s hope further clarification from government will prove that this concern is a storm in a teacup.


Hard science vs. field observation

Posted on 19/02/2016 by Jim Orson

I attended the BCPC Pests and Beneficials review this week. It concentrated on flea beetle control in oilseed rape and I came away depressed about the prospects for the crop in some parts of the country unless the neonicotinoids are reintroduced or an alternative and at least equally effective seed dressing is introduced. I am sure that there will be press reports on some of the papers delivered at the review.

However, there are likely to be no press reports on a fascinating paper that included the issue of bias in scientific reports. Those who regularly read my blogs know that I have been tub-thumping on this issue and asking for more transparent systems of science reporting. During the delivery of this particular paper my mind wandered to some of the practices that we have adopted in arable agriculture without a scintilla of good scientific study.

The prime example is stubble cultivations to encourage the emergence of black-grass. Now I am sorry that I have to mention black-grass once again but this example demonstrates my concerns that a practice can be adopted with next to no science and where field observations are biased towards thinking that a practice is actually working.

Many years ago I addressed a large audience of agronomists on the control of black-grass and questioned whether the then widespread practice of cultivating stubbles to up to 20 cm depth immediately after harvest with something akin to a Simba Solo was significantly contributing to black-grass control. I was met with a wall of silent disbelief. I was told that black-grass emerges where this practice is adopted and it can be killed with glyphosate before sowing the following autumn-sown crop, so it must be contributing to the control of the weed.

This approach was originally sold on the basis that it would make long-term black-grass control in non-plough tillage as effective as where the soil was ploughed. There was little or no evidence for this and what ‘evidence’ I saw was incomplete and unconvincing. However, everyone seemed to believe it was a valuable approach and I assume that it did not benefit my reputation by having the temerity to question this new dawn.

Now we know that this practice of so-called ‘stale seedbeds’ did little to increase the technical sustainability of black-grass control and I am sure resistance built up more quickly where the plough was abandoned.

The time between harvest and the establishment of the following crop is a key opportunity to maximise the loss of viable black-grass seed; yet it is absolutely amazing that there has been no thorough scientific study of the best way to maximise such losses across the range of circumstances that are likely to occur. All we have is a series of un-coordinated field trials with incomplete sets of treatments and observations. Field trials can help, but they should be based on the knowledge gained from thorough mini-plot or laboratory studies that establish the principles of maximising the loss of viable black-grass seed.

There are some data in the AHDB Cereals and Oilseeds (what a mouthful) Project Report 381. I have tabulated the data and it seems that there was consistently more black-grass emergence between autumn-sown wheat crops when there was no soil disturbance at all, when compared to using a Solo to a depth of 15 cm immediately after harvest. In years when there was good soil moisture status from harvest onwards, this additional emergence between autumn-sown crops resulted in less black-grass in the following wheat crop. However, in dry autumns the opposite was true. Despite little or no black-grass emergence where the stubble was cultivated to 15 cm depth in dry conditions, there was less black-grass in the following winter wheat crop than where direct drilling was adopted. This is hard to explain and, very significantly, it questions whether the level of black-grass emergence between crops is a reliable guide to viable seed loss. I have a theory or two but hard facts are required.

Needless to say, in the trials covered by this report, ploughing was the most effective cultivation method in containing black-grass numbers.

This snippet of data only highlights more unknowns. The critical question must be: is black-grass emergence between autumn-sown crops a reliable way of estimating the loss of viable black-grass seed? In addition, we need to know how soil moisture, as well as the age, dormancy and depth of black-grass seed, influences the efficiency of the various approaches to stimulating viable black-grass seed loss.

No amount of un-informed field trials or abstract field observations will provide a robust answer to these questions. Rather than continue an unproductive debate that has gone on for far too long, we need to have some very closely monitored mini-plots and laboratory studies, where the impact of position, age and dormancy of black-grass seed as well as soil moisture status can be monitored. Even a small step in this direction will help to provide some foundation for more informed field testing of approaches that are much more likely to optimise the management of the soil between crops in order to maximise the loss of viable black-grass seed. It would be a huge prize for the industry.



Maximum yields versus margins?

Posted on 05/02/2016 by Jim Orson

I have just read a very interesting article in Landmark, the NIAB TAG membership magazine. The title is ‘The economic margin; farmers maximising yields but not financial returns’. It is compiled by the Rural Business Unit of the Department of Land Economy, University of Cambridge. It uses data collected in the Farm Business Survey carried out by universities across Great Britain between 2004 and 2013; 1,650 farms growing winter wheat and winter oilseed rape were involved.

I am sure that you have heard the term ‘big data’; well, this is it! Specific field data were collected from 5,341 winter wheat crops and 2,927 winter oilseed rape crops and analysed by a sophisticated statistical technique: the results reflect the title of the article. It seems that farmers are maximising yields of winter wheat and winter oilseed rape but using more seed, fertiliser and pesticide than would achieve the best gross margin.

The authors accept that it is easier to achieve maximum yields because farmers and their advisers like a clean crop. They also add that excess inputs cost farmers financially and impose extra costs on society.

I am a very poor statistician and cannot really comment on the approach taken. However, many will instinctively agree with the results of the analysis. It is interesting that the AHDB yield plateau report in 2012 also indicated the same conclusion. Interestingly, the Cambridge researchers found that the greatest yield returns from the last pound spent were from expenditure on crop protection, compared to smaller yield returns from the last pound spent on fertilisers or seeds.
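The gap between the yield-maximising and margin-maximising input rate is easy to illustrate with a toy response curve. The sketch below uses purely hypothetical numbers (the response function, grain price and nitrogen price are all my own assumptions, not figures from the survey or the article):

```python
# A toy illustration of yield-maximising versus margin-maximising input use.
# All numbers are hypothetical assumptions, not Farm Business Survey figures.

def wheat_yield(n):
    """Assumed quadratic yield response (t/ha) to nitrogen (kg N/ha)."""
    return 6.0 + 0.025 * n - 0.00005 * n * n

GRAIN_PRICE = 120.0  # pounds/t, assumed
N_PRICE = 0.70       # pounds/kg N, assumed

def gross_margin(n):
    """Output value minus nitrogen cost (pounds/ha)."""
    return GRAIN_PRICE * wheat_yield(n) - N_PRICE * n

rates = range(0, 401)
n_for_max_yield = max(rates, key=wheat_yield)    # 250 kg N/ha
n_for_max_margin = max(rates, key=gross_margin)  # 192 kg N/ha

# The margin-maximising rate sits well below the yield-maximising rate:
# the last kilograms of nitrogen buy extra grain worth less than they cost.
print(n_for_max_yield, n_for_max_margin)
```

Because the response curve flattens near its peak, the final units of input add less grain value than they cost, so the best margin is reached before the best yield. That is the article's central point in miniature.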

So why do farmers maximise yields and not gross margins? Primarily it is due to risk management and the need for the rotational control of weeds. Risk management has become increasingly important with labour and machinery costs cut to the bone and also because of the rising tide of pesticide resistance.

Minimising labour and machinery costs has proved a very effective way of reducing total costs of production (see diagram). The savings surpass anything that could be achieved by paring back on the variable costs of seeds, fertilisers and sprays. However, the cutting back of these ‘fixed costs’ has meant that perhaps more has to be spent on variable inputs of seeds, fertilisers and sprays in order to limit the risks engendered by the reduced opportunities and flexibility to carry out field operations.

The disparity between maximising yields and margins may have grown due to pesticide resistance: as the control of insects, weeds or diseases becomes weaker and more variable, the approach to chemical control has to become more risk-averse. The current approaches to disease control in wheat are a prime example, particularly as uncertain weather conditions after application have to be taken into account. However, farmers will have to rely totally on cultural controls should resistance to pesticides in a particular weed, disease or insect become extreme.

I have always maintained that we need very effective pesticides in order to achieve the classic IPM approach of assessing the risk to the crop from insects, weeds or diseases before applying any necessary treatment. As an example to support this view, I have regularly cited GM glyphosate tolerant sugar beet. Currently, because of the rather weak selective herbicides that are available, complex weed control programmes are started pre-emergence and/or very early post-emergence of the crop. Further treatments are applied when the later-emerging weeds are tiny. These applications are made before there is an opportunity to assess the weed challenge to the crop: waiting to assess the likely competition from weeds is too late to get good control. However, using glyphosate post-emergence of the crop means that a true IPM approach can be adopted, with the doses of glyphosate and number of applications adjusted according to need. Hence, ‘strong’ pesticides give us the opportunity to be even more compliant with the principles of IPM, a truth that the ‘green blob’ cannot seem to grasp. By the way, please do not take this as unfettered support for the adoption of GM glyphosate tolerant sugar beet because there are technical issues to be resolved, notably the danger of glyphosate-resistant weeds developing.

In terms of weed control, no one crop is an island and there is often a need to use more herbicide than is necessary in the current crop in order to keep weeds at manageable numbers in the following crops in the rotation. This is particularly so when the rotations comprise just autumn sown or spring sown crops. Common times of sowing throughout the rotation mean that the weed species that share the same growth patterns as the crops will flourish.

Hence, whilst I can reassure the authors of the article that farmers do not like spending more than necessary on their crops, it is still beyond the wit of the industry to tailor exactly the level of inputs required to maximise the margins of an individual field. More accurate medium-term weather forecasts would help. The authors rightly comment that approaches such as strip trials, coupled with more accurate adjustments from comparative farm data (‘big data’), will lead to a growth in efficiency. I can assure the authors that NIAB TAG will continue trying to close the gap so that input use more closely meets the requirement of maximising margins rather than maximising yields.

Please note that I have not used the word b****-g**** once in this blog and that, at the time of writing, Leicester City are still top of the premier league.


Reducing the risk of glyphosate-resistant black-grass

Posted on 21/01/2016 by Jim Orson

In my previous blog I discussed the role of dose in the speed of development of pesticide resistance. I also highlighted the concern over the possible development of black-grass resistance to glyphosate. It must be said that nobody is sure this will happen, but the mere possibility means that we have to take measures to reduce the risk.

Whilst dose is one aspect, the number of sprays of glyphosate is also an important issue: these need to be reduced to a minimum. I am reminded of a point that a local farmer made to me at an Australian conference where the introduction of GM glyphosate tolerant canola was being discussed. Responding to my question as to whether or not he would introduce this crop, he said that he would if he could identify an opportunity to avoid a glyphosate spray elsewhere in his rotation. This discussion was heavily influenced by the presence of glyphosate resistant annual rye-grass in Australia and so the need for an anti-resistance strategy on his farm was absolutely clear.

The recent Pesticide Usage Surveys carried out on behalf of the Chemical Regulations Directorate (CRD) of HSE have confirmed an ever increasing reliance on glyphosate in arable crop production. This is partly due to the decreasing number of options for effective control of black-grass within crops. Multiple applications of glyphosate to ‘stale seedbeds’ are commonly adopted before sowing winter wheat and there is a strengthening case to support such an approach for reducing high black-grass populations.

The issue is that if populations are sufficiently high to warrant multiple applications of glyphosate in order to prevent the emerged black-grass in ‘stale seedbeds’ shading the soil and preventing further black-grass germination, then the background population is too high to grow a relatively weed-free crop of winter wheat unless it is sown exceptionally late.

The obvious solution is to reduce background black-grass populations to a level where the emerged black-grass plants do not shade the soil sufficiently to prevent further black-grass emergence. The limited data suggest that this means fewer than 10-15 plants/m2. In addition, such a low incidence of black-grass will remove the need to spray twice in order to keep the seedbeds in a good condition for sowing, although the number of emerging volunteer plants will also influence decision-making.

There are other compelling reasons to reduce background black-grass numbers. As well as reducing the number of glyphosate sprays prior to sowing wheat, low numbers mean that the herbicides used to control the weed in the crop will provide higher levels of control of black-grass heads. At high populations, black-grass plants compete with one another as well as with the crop; reducing plant numbers with selective herbicides leaves the survivors with less competition, and they compensate by producing more heads.

In addition, reducing background numbers provides more flexibility for the future. A poor herbicide performance will not be so critical and there may be more options available on sowing dates.

Hence, for a variety of reasons it is important to reduce background populations of black-grass. The often poor and variable selective control offered by herbicides in arable crops means that living on a knife-edge is no longer a realistic option. I will stop now because I recently read that one farmer’s New Year resolution is not to attend any more meetings on black-grass. Somehow, I know what he means!


Survival of the most resistant

Posted on 08/01/2016 by Jim Orson

I have been a life-long supporter of Leicester City and can now come out of the woodwork because they have the 40 points necessary to survive in the Premier League for another season. It would have been tempting fate to declare my support earlier in the season. I must admit that I have not watched them play for about 10 years and, instead, go and watch the mighty U’s (Cambridge United) a few times a season with one of my daughters.

Leicester City is not new to the dizzy heights that they have achieved so far this season. I watched them in the early 1960s when they were a top club. They were not far from doing the old League and FA Cup double in 1963 when I watched the best match that I have ever attended. They drew 2-2 at home to Spurs, then the glory team of the League. Jimmy Greaves of Spurs scored with a wonder strike and it was amazing that the great Gordon Banks in the Leicester goal even got his finger tips to the ball. That moment is forever imprinted on my mind.

There is a small core of Leicester supporters in NIAB, including Tina the director. Normally our discussions centre on Leicester’s survival in the Premier League, particularly this time last year when they were bottom. I am mentioning this because I am desperately trying to link the Leicester story to the theme of this blog which is that small percentages of a population can count, particularly in the context of the development of pesticide resistance.

It is generally accepted that pesticide resistance is a process of selection of naturally occurring mutations which happen to be resistant to a specific or a range of pesticides. The continual exposure to the mode(s) of action of the pesticide(s) to which there is resistance results in these resistant individuals becoming dominant in the pest populations, whether these pests be insects, diseases or weeds.

There has long been a debate in the industry about whether or not high or low doses increase the rate of selection of these naturally occurring mutations. Logic suggests it must be high doses in order to select more effectively for the most resistant individuals. Low or sub-optimal doses are more likely to result in the higher survival of non-resistant or less resistant individuals. This type of discussion always raises the question of what actually constitutes a ‘high’ or ‘low’ dose.

Field experience and experimental evidence on fungicide resistance also suggest that it is usually higher doses that result in the more rapid selection of resistant individuals. The following is from a recent statement from the Fungicide Resistance Action Group, which represents the whole industry, on reducing the speed of development of septoria resistance to the SDHI fungicides. “All effective fungicides exert a selection pressure on pathogen populations and carry a risk of resistance. This risk can be modified and reduced by either mixing or alternating with fungicides with an alternative mode of action, or by reducing the number of applications or dose of the fungicide.” This suggests that, at last, there is a more general acceptance that the higher the dose the more likely there is to be selection for resistance.

I am positive that the same is generally true for herbicides. When first introduced, the ‘fops’ and ‘dims’ controlled around 99% of black-grass at the recommended dose. They were rarely used at reduced doses, yet resistance developed very quickly. The field experience with the sulfonylurea herbicide product Atlantis was that, when it was first used, there were always a few survivors of a full dose. I remember suggesting to farmers that these could be resistant and should be rogued. Resistance to Atlantis is now widespread and continues to increase rapidly.

However, there is some evidence to suggest that ‘low’ or sub-optimal doses can speed up the development of weed resistance to glyphosate. That gave me cause to ponder why this could be true. It did not take me long to conclude that the development of resistance is speeded up not by high or low doses as such, but by doses that leave only a low number of survivors. When first introduced, crop-safe herbicides such as the ‘fops’, ‘dims’ and sulfonylureas left only a few survivors when used at recommended doses, and those survivors were more likely to be resistant to these modes of action.
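The argument can be put in simple numbers. Below is a toy single-generation selection model; the survival rates and starting resistant frequency are my own illustrative assumptions, not field measurements. It shows that a dose leaving very few susceptible survivors enriches the resistant fraction far faster than a dose that many susceptible plants survive:

```python
# Toy single-generation selection model. Survival rates and the starting
# resistant frequency are illustrative assumptions, not field data.

def resistant_fraction_after_spray(freq_r, surv_resistant, surv_susceptible):
    """Frequency of resistant plants among the survivors of one application."""
    survivors_r = freq_r * surv_resistant
    survivors_s = (1.0 - freq_r) * surv_susceptible
    return survivors_r / (survivors_r + survivors_s)

START_FREQ = 1e-6  # assumed rare resistant mutants in the population

# A dose that kills 99% of susceptible plants (few survivors overall):
after_effective_dose = resistant_fraction_after_spray(START_FREQ, 0.8, 0.01)

# A sub-lethal dose that half the susceptible plants survive:
after_weak_dose = resistant_fraction_after_spray(START_FREQ, 0.9, 0.5)

print(after_effective_dose, after_weak_dose)
```

In this sketch the few-survivor dose multiplies the resistant fraction roughly eighty-fold in a single pass, while the many-survivor dose barely moves it. Repeated over seasons, it is the composition of the survivors, not the nominal dose, that drives selection.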

Optimal doses of glyphosate should kill everything, provided that it is applied well, growing conditions are conducive to control and the weeds are at the appropriate growth stages for good activity. However, sub-optimal doses may leave a few individuals which may be more likely to have a level of resistance. The continued adoption of sub-optimal doses, particularly where minimal or no-tillage is employed, may form the basis of future populations which could perhaps cross-fertilise, resulting in individuals with even higher levels of resistance. It may have been significant that the first case of a partially glyphosate-resistant weed in the UK was in sterile brome, where the dose recommended for control on stubble (540 g ae glyphosate/ha) is often only marginally effective on this weed.

So perhaps the speed of development of resistance is all about the dose required to select the most resistant types. This is often the recommended dose for crop-safe herbicides and fungicides, but could be a sub-optimal dose for glyphosate, at least on some weeds. Hence, whilst it is not absolutely proven that sub-optimal doses speed the development of glyphosate resistance, it would be advisable to apply it correctly in the right circumstances, use doses that will kill all the black-grass, and inspect the results of treatment to ensure that there are no survivors. This is particularly the case where control by glyphosate is not supplemented by cultivations. For more information, please see the guidelines for minimising the risk of glyphosate resistance.

Best wishes for 2016.

The constructive comments on the script of this blog by Stephen Moss of Stephen Moss Consulting are gratefully acknowledged.

