NIAB - National Institute of Agricultural Botany

Orson's Oracle

Goodbye KISS?

Posted on 13/12/2013 by Jim Orson

Times were tough for UK cereal producers around the turn of this century and costs were really under the hammer. Whilst some savings could be made with variable costs, labour and machinery costs were the main targets for savings.

Reducing labour and machinery costs per ha had many implications, including increasing the scale of production and the widespread adoption of non-plough tillage. As I have said in previous blogs, contractor charges for a plough-based system and for non-plough tillage to a depth of 20 cm do not suggest that this was where the large savings were made. It was the higher work rate of non-plough tillage that enabled a reduced labour and machinery complement to maintain timeliness.

However, agronomy played its part. Wheat varieties better suited to sowing earlier than the historical dates were identified, allowing a longer drilling window. In addition, there were effective pesticides that enabled crops to be protected from the potential problems of very early drilling. The KISS (Keep It Simple Stupid) principle became established as a way to maintain crop protection with a diminished availability of managerial time and labour input per hectare. Block cropping was increasingly adopted, and cereal varieties were chosen so that their fairly predictable crop protection and nutrition requirements could be phased to help smooth out the labour peaks of the following spring. It now all seems so simple and straightforward.

The first real challenge to this approach came in the dry autumn of 2003. Control of blackgrass in cereals was often poor with the pre-emergence herbicides. However, the drought-induced delayed emergence resulted in blackgrass being small and not shaded by the crop in the following exceptionally warm February. These were ideal conditions for the newly introduced Atlantis. I dread to think what would have happened that year without that herbicide. In reality, I do not have to use my imagination too much: wheat would have had infestations of blackgrass similar to those of the summers of 2010 and 2012, when the previous dry autumns also limited the activity of the pre-emergence herbicides but, by then, the potency of Atlantis had been diminished by resistance.

This reduced ability to control blackgrass has implications for block cropping. I have regularly witnessed one or two fields in a block of several having particular control problems. A single common approach to blackgrass control across the whole block therefore means doing too much in some fields or too little in others. Hence, things are getting more complicated and starting to challenge the KISS approach.

Resistance is also making it more difficult to plan fungicide programmes in wheat well in advance. Increasing resistance in septoria is limiting the length of activity of the triazoles on the disease, particularly the period of eradicant activity. This reduces the flexibility in timing and may also mean an additional application between the traditional timings of the second node detectable stage of the crop and full flag leaf emergence in wet seasons. 

Pesticide resistance is not the only challenge to KISS in cereal production. Pesticide revocations have also added complications. The loss of trifluralin was a particular blow. Although not the most effective blackgrass herbicide in cereals, it had the attributes of being cheap and of having its efficacy unaffected by pesticide resistance or by dry seedbeds. Obviously, the current withdrawal of the neonicotinoid seed dressings in oilseed rape will complicate management in the future.

So whilst adopting the KISS principle helped cereal farmers survive the lean times early this century, it is becoming more difficult to follow. However, it is not yet time to kiss KISS goodbye.



What is a legitimate environmental target?

Posted on 05/12/2013 by Jim Orson

Professor Seralini is not a household name but he has been in the limelight in the ‘GMO world’ for a few years. He is Professor of Molecular Biology at the University of Caen in France and is known to be against genetic modification. I’m not sure whether he held such views before he started to carry out studies on feeding GMOs to rats or only after he perceived the conclusions of those studies.

In September 2012, his group published a peer-reviewed paper titled "Long-term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize". It concluded that, in feeding studies, GM maize caused more tumours in rats than conventional maize. There are various rumours about who funded the study, but all the organisations mentioned are anti-GM.

What was unusual was that steps appear to have been taken to ensure that the first news stories published about the study would not be critical of its methods or results. Science journalists were briefed prior to its official publication only on condition that they did not mention the paper to any independent researchers or outside experts. By the day of its publication, anti-GM activists were all briefed for interviews with the wider media and there was even a beautifully crafted video available on websites. 

However, the cunning plan did not quite work. A copy of the paper was circulated to independent researchers just prior to publication. This enabled the Science Media Centre to produce a document on the day of publication in which leading scientists commented on the validity of the conclusions made by Professor Seralini and his colleagues. As a result, most of the national media did not mention the paper at all. This all suggested that there had been a mood change in the media, both towards GM and towards the methods being adopted by anti-GM groups.

That mood change has continued. Articles that support the cultivation of GM crops have started to appear in the national press. In addition, politicians, notably Owen Paterson, have spoken forcefully on the subject and various other governments are also taking a more pro-GM stance.  

After a year-long analysis, the journal has now retracted Seralini’s paper on the basis of the interpretation of the data generated by the feeding studies. However, Professor Seralini is not taking this lying down and is contesting the decision. He is even threatening legal action, mentioning that a member of the journal's editorial board once carried out a piece of research for Monsanto. In the anti-GM world this is equivalent to supping with the devil. That view is now becoming outdated, as many in society are as suspicious of research funded by anti-GM groups as they are of research funded by commercial companies.

However, I hope that the alleged ‘getting rid of green crap’ remark made by our Prime Minister does not allow the pendulum to swing too far towards ignoring legitimate concerns over the environment. Reducing environmental damage can be an expensive process but may bring larger commercial and societal gains in the longer term. The issue is defining a legitimate concern.

I was reminded of this during a lecture I recently attended. Arable weeds have prospered only since the adoption of cultivated crops. Prior to that they were restricted to small areas where the soil was regularly moved by natural processes, for example by soil erosion. Hence, the introduction of arable cropping gave them the ideal opportunity to thrive, particularly before the introduction of effective herbicides. During this period, many farmland bird species also thrived because these arable weeds supported their insect food.

GM herbicide tolerant crops have been questioned because their adoption could reduce the numbers of arable weed seeds in soils. However, fundamentalist ecologists would say that arable cropping is, by definition, damaging to a totally natural environment. This suggests that they are not over-concerned about arable weed seed levels in the soil. With such contrasting views, no wonder we’re all struggling to get a clear view on just what are the legitimate targets for environmental policy.



Nature more effective than black-grass herbicides

Posted on 27/11/2013 by Jim Orson

I have had a few enquiries this autumn about the fate of black-grass seed after shedding. This type of question is fostered by rather imprecise language describing the number of black-grass seeds formed on a head of the weed. On average, one head of autumn-germinating black-grass sets 100 seeds; a convenient number. But are all these seeds viable, as some claim?

The simple answer is no. Although, of necessity, there is a lot of variability involved in determining the fate of black-grass seed, the average figures are intriguing. The best source of such data is page 86 of HGCA Project Report 466 – Integrated Management of Herbicide Resistance.

The data suggest that on average only 55% of the seeds are viable. This still seems a horrifying situation; on every black-grass head there are 55 viable seeds. However, be (relatively) comforted by the huge subsequent losses.

Only 45% of the seeds survive the period between shedding and sowing a following crop in late September. So 45% of the 55 viable seeds per head make it to a stage where they can potentially become a weed in the following crop. However, many are buried by cultivations to a depth from which they will not emerge, assumed to be more than 5 cm. For instance, non-inversion tillage carried out to a depth of 20 cm buries 40% of freshly shed seeds deeper than 5 cm. As you can see, the losses keep mounting and there is more to come.

Only a proportion (15%) of these freshly shed seeds in the top 5 cm will actually germinate and establish plants in the following crop. All this means that for every head of autumn germinating black-grass that survives to produce seed in June, approximately 2.2 plants will emerge in a following late September sown crop established after non-plough tillage to a depth of 20 cm.

Of course life (certainly the life of black-grass) is not that simple. Some of the seeds shed will survive up to three or more years in the soil. There is a loss of about 70% of viable seeds a year but this is partly compensated for by the fact that a higher proportion (30%) of over-yeared viable seeds will germinate, because of the loss of dormancy, if they are in the top 5 cm of soil.

This complicates the maths but when it is all worked through, the seeds shed from one head will result in around three plants being formed over the following three years in continuous late September sown crops established after non-inversion tillage to 20 cm depth. So there is some comfort to be had from this information on seed losses but please note that these are average figures.
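The maths above can be sketched in a few lines. This is only a rough model built from the average figures quoted in the text; the three-year total it produces lands in the same region as the "around three plants" mentioned, but not exactly on it, because the published calculation tracks burial depth and cultivation effects year by year, which this sketch simplifies.

```python
# Rough model of black-grass seed fate per head, using the average
# figures quoted above (HGCA Project Report 466). All fractions are
# averages; real fields vary enormously.

SEEDS_PER_HEAD = 100
VIABLE = 0.55             # fraction of shed seeds that are viable
SURVIVE_TO_SOWING = 0.45  # survive from shedding to late-September sowing
TOP_5CM = 0.60            # left in top 5 cm by non-inversion tillage to 20 cm
FRESH_GERMINATION = 0.15  # fresh seeds in top 5 cm that establish plants

viable = SEEDS_PER_HEAD * VIABLE * SURVIVE_TO_SOWING  # 24.75 seeds
year1_plants = viable * TOP_5CM * FRESH_GERMINATION   # ~2.2 plants
print(f"Year 1 plants per head: {year1_plants:.1f}")

# Later years: ~70% of viable seeds are lost annually, but ~30% of the
# over-yeared seeds in the top 5 cm germinate once dormancy is lost.
# Assumption: cultivation keeps ~60% of surviving seed in the top 5 cm.
seedbank = viable - year1_plants
total = year1_plants
for year in (2, 3):
    seedbank *= 0.30                     # 70% annual loss of viable seed
    emerged = seedbank * TOP_5CM * 0.30  # over-yeared germination
    total += emerged
    seedbank -= emerged
print(f"Approximate plants over three years: {total:.1f}")
```

Running this gives roughly 2.2 plants in the first year, with the multi-year total somewhat above three; the point the numbers make is the same as the text's, namely that natural losses remove the overwhelming majority of the 100 seeds set per head.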

The factor that I have not mentioned until now is the number of fertile tillers (i.e. heads) per plant. Generally, the earlier the autumn sowing the more heads per plant are formed. In addition, extreme summer weather events can make a huge difference. In 2012, the continuous wet conditions resulted in low black-grass tiller losses and so very high heads/plant. In contrast, the very dry spring of 2011 resulted in tiller losses right up to head emergence, leading to very low heads/plant.

In conclusion, and in the context of total seeds set, natural losses in continuous late-September sown cropping are higher than those that can now be achieved with cereal herbicides. However, these losses are a long way from being high enough to avoid the use of chemicals.

There is a lot of research and field observation being carried out to see how natural losses can be increased, for example by later drilling. Head numbers per plant can be reduced by higher crop seed rates, and the weed’s lifecycle can be disrupted by spring sowing. It is heartening that we can do things that enhance natural losses, but they all cost money and, for many, will result in systems that are less reliable than continuous early-autumn sown crops.



Giving up on organic matter

Posted on 20/11/2013 by Jim Orson

Last week I visited Rothamsted Research with a few arable farmers.  As always, it was very good value and we were updated on many aspects of the organisation’s research. Pesticide resistance and availability and soil issues were the main themes.

One area that caused particular interest was the evidence behind a new HGCA and Defra funded project on how to get the best out of organic manures. The whole industry is aware not only of the desirability of increasing organic matter levels in long-term arable soils but also of the futility of trying to achieve such an objective.

Changing to a system of interspersing short periods of arable crops between medium-term grass leys could help but would also have a negative impact on both production and profits. The alternative is to use a lot of organic manures or amendments at regular intervals. However, there are simply not enough of these organic sources to have a significant impact on organic matter levels on more than a very small percentage of our arable land.

So how best to use these limited organic sources to maximise their benefits? Some research at Rothamsted has provided a clue and is the basis of the new project. Researchers found that the benefits of annual farmyard manure application, in terms of spring barley yields, appeared quickly when judged against the same amount of manure having been applied annually since 1852 (see figure). This yield benefit, recorded in 2002 after just two successive autumn applications of farmyard manure, could not be explained either by the nutrients in the manure or by an increase in soil organic matter. So what is the possible explanation?


Figure: spring barley yields (t/ha) at the optimum doses of mineral (bag) nitrogen.
The blue line is where mineral nitrogen only has been used annually since 1852, the green line is where both farmyard manure and mineral nitrogen have been used annually since 1852, and the red line is where organic manure and mineral nitrogen have been used annually since 2000 (i.e. first manure application in autumn 2000). Graph kindly provided by Andy Whitmore of Rothamsted Research.


Regular readers of my blogs (if there are any out there!) will have guessed the possible explanation because I’ve banged on about this issue before. The application of crop residues, organic manures and amendments increases the fungal, bacterial and faunal biomass in the soil. This biomass can act as a surrogate for organic matter. However, maintaining these benefits requires annual applications.

It is for this reason that the value of straw to the cereal farmer is more than just the nutrients it contains. Many farmers reported that their land worked more easily after a couple of years of incorporating rather than burning straw. However, despite positive effects on the soil, Rothamsted has not found a benefit in terms of winter wheat yields from the annual incorporation of up to four times the average straw yield.

Why were increased yields from the regular incorporation of organic materials recorded in spring barley and not in winter wheat? It’s my view, which cannot be verified, that spring barley yields are more likely to benefit because the crop is established in more hostile growing conditions and has a shorter growing period. We all recognise that winter wheat is more likely than spring barley to compensate for set-backs in the first couple of months of growth. The opposite may well also be true: spring barley yields are more likely than winter wheat yields to benefit from improved soil conditions in the early stages of establishment.

To continue my theorising, how best should we use the limited UK supplies of organic manures and amendments? Based on very limited information, it would seem that using smaller amounts annually, rather than large amounts intermittently, in largely spring-crop orientated rotations may be the best way to maximise yield benefits. However, applying smaller amounts annually will result in extra costs. I’m sure that the Rothamsted project will provide guidance as to the way forward.



Conquering hunger

Posted on 13/11/2013 by Jim Orson

In the late 1990s I gave a talk at a highly charged conference on GMs at which I stated that there was progressively less hunger in the world. Naturally, I checked my facts with some aid agencies before making such a statement. I attributed this good news to a more plentiful supply of food because of higher yields, but didn’t highlight the role of any specific agricultural technology. I didn’t even think of GM crops being a factor because at that time they had only just been introduced in the USA.

This did not stop the fledgling anti-GM movement from saying, both at the conference and in follow up statements, that I was lying. Their implication was that ‘modern agriculture’ was failing to meet the challenge of a rapidly expanding world population.

Nowadays, the FAO publishes estimates of world hunger which provide hard data rather than opinion. These show that a gradually decreasing proportion of the world population is described as undernourished - falling from around 19% in 1990 to around 12% in 2013. The percentage of undernourished people has fallen in all regions, but not in all countries, of the world. Tragically, around 20% of the population in sub-Saharan Africa and in South Asia still does not have sufficient food.

Whilst the percentage of undernourished people has fallen significantly over the last 20 years or so, the world population has increased significantly. So the question has to be asked: are there now fewer undernourished people than then? Rather amazingly, the world population has increased from just over 5 billion in 1990 to around 7 billion in 2013, almost as large a proportional increase as the decrease, over the same period, in the proportion that is undernourished. Hence, the absolute number of people recorded as undernourished (now 842 million) is not dramatically below that in 1990 (1,015 million). So whilst modern agriculture can pat itself on the back for feeding the world’s rapidly expanding population, there are still many challenges ahead.
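The proportion-versus-absolute point is easy to check with the figures quoted. A short sketch, in which the population values are my rounded assumptions chosen to be consistent with the quoted undernourishment numbers, not FAO-exact figures:

```python
# Proportion vs absolute numbers of undernourished people, using the
# rounded figures quoted above (population values are approximate).
pop_1990, share_1990 = 5.34e9, 0.19   # ~1,015 million undernourished
pop_2013, share_2013 = 7.0e9, 0.12    # ~842 million undernourished

undernourished_1990 = pop_1990 * share_1990
undernourished_2013 = pop_2013 * share_2013

share_fall = 1 - share_2013 / share_1990  # relative fall in the share, ~37%
pop_rise = pop_2013 / pop_1990 - 1        # rise in population, ~31%
absolute_fall = 1 - undernourished_2013 / undernourished_1990  # only ~17%

print(f"Share fell {share_fall:.0%}, population rose {pop_rise:.0%}, "
      f"but the absolute number fell only {absolute_fall:.0%}")
```

The two effects largely cancel: a 37% fall in the share against a 31% rise in the population leaves the absolute number only about 17% lower, which is the trap in quoting proportions alone.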

This emphasises the need to be very careful when quoting statistics. I would have been absolutely correct to say in my talk that the proportion of the world population that is undernourished was falling but this would have ignored the fact that the fall in the number of people in this category was not so significant. There are plenty of instances where ill-defined statements can be misleading. For example, a recent report showed that the UK has reduced carbon dioxide emissions by 20% since 1990. Great news, until you read in the same report that the UK has increased its carbon footprint by 10% over the same period because of the carbon dioxide emissions associated with the production and transport of the goods we’re importing in increasing amounts.

There’s an active debate about how farming should be structured in the less developed parts of the world in order to meet these food security challenges of the future. There appear to be two contrasting views. The first, which is being championed by some charities, is that very small scale family units are the solution. The opposite view, taken by some of the more industrially-based organisations, is that very much larger and professionally managed units are necessary. I can’t really comment on this dichotomy other than to say that the farms need to be sufficiently large to provide a surplus of food in order to create a market and to stimulate other parts of the economy. It is precarious, to say the least, to rely solely on barely self-sufficient units. This was the objective in Cambodia when Pol Pot tried to impose his communist-based agrarian Utopia.


