NIAB - National Institute of Agricultural Botany

Orson's Oracle

Nature more effective than black-grass herbicides

Posted on 27/11/2013 by Jim Orson

I have had a few enquiries this autumn about the fate of black-grass seed after shedding. This type of question is fostered by rather imprecise language describing the number of black-grass seeds formed on a head of the weed. On average, one head of autumn-germinating black-grass sets 100 seeds; a convenient number. But are all these seeds viable, as some claim?

The simple answer is no. Although there is, of necessity, a lot of variability involved in determining the fate of black-grass seed, the average figures are intriguing. The best source of such data is page 86 of HGCA Project Report 466 – Integrated Management of Herbicide Resistance.

The data suggest that on average only 55% of the seeds are viable. This still seems a horrifying situation; on every black-grass head there are 55 viable seeds. However, be (relatively) comforted by the huge subsequent losses.

Only 45% of the seeds survive the period between shedding and sowing a following crop in late September. So 45% of the 55 viable seeds per head make it to a stage where they can potentially become a weed in the following crop. However, many are buried by cultivations to a depth from which they will not emerge, assumed to be more than 5 cm. For instance, non-inversion tillage carried out to a depth of 20 cm buries 40% of freshly shed seeds deeper than 5 cm. As you can see, the losses keep mounting and there is more to come.

Only a proportion (15%) of these freshly shed seeds in the top 5 cm will actually germinate and establish plants in the following crop. All this means that for every head of autumn germinating black-grass that survives to produce seed in June, approximately 2.2 plants will emerge in a following late September sown crop established after non-plough tillage to a depth of 20 cm.
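
To spell out the arithmetic (my own working, using the average figures above): 100 seeds per head × 0.55 viable × 0.45 surviving to sowing × 0.60 left in the top 5 cm × 0.15 establishing ≈ 2.2 plants.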

Of course life (certainly the life of black-grass) is not that simple. Some of the seeds shed will survive for three or more years in the soil. About 70% of viable seeds are lost each year, but this is partly compensated for by the fact that a higher proportion (30%) of over-yeared viable seeds in the top 5 cm of soil will germinate, because of their loss of dormancy.

This complicates the maths but when it is all worked through, the seeds shed from one head will result in around three plants being formed over the following three years in continuous late September sown crops established after non-inversion tillage to 20 cm depth. So there is some comfort to be had from this information on seed losses but please note that these are average figures.
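
For anyone who wants to check the maths, below is a minimal sketch of the three-year seedbank, using only the average figures quoted above. The data say nothing about how later cultivations redistribute over-yeared seed, so the assumption that 60% of viable seed sits in the top 5 cm in every year is mine alone; with it, the model comes out a little above the ‘around three plants’ figure.

```python
# Toy black-grass seedbank model built from the average figures in the text.
# Assumption (mine, not from the data): 60% of the remaining viable seed is in
# the top 5 cm in every year, not just after the first cultivation.

SEEDS_PER_HEAD = 100     # seeds set per head, on average
VIABLE = 0.55            # proportion of shed seeds that are viable
SHED_TO_SOWING = 0.45    # viable seeds surviving from shedding to late-September sowing
TOP_5CM = 0.60           # proportion left in the top 5 cm after 20 cm non-inversion tillage
ESTABLISH_FRESH = 0.15   # fresh seeds in the top 5 cm that establish plants
ESTABLISH_OLD = 0.30     # over-yeared seeds that establish, dormancy having been lost
ANNUAL_LOSS = 0.70       # viable seeds lost in the soil each year

seedbank = SEEDS_PER_HEAD * VIABLE * SHED_TO_SOWING  # viable seeds at first sowing
total_plants = 0.0

for year in (1, 2, 3):
    establish = ESTABLISH_FRESH if year == 1 else ESTABLISH_OLD
    plants = seedbank * TOP_5CM * establish
    total_plants += plants
    print(f"Year {year}: {plants:.2f} plants")
    # Germinated seeds leave the bank; 70% of the remainder die each year.
    seedbank = (seedbank - plants) * (1 - ANNUAL_LOSS)

print(f"Total over three years: {total_plants:.1f} plants per head")
```

Run as written, this prints roughly 2.2, 1.2 and 0.3 plants in successive years, around 3.7 in total; different assumptions about seed burial in years two and three would nudge it towards the three plants quoted.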

The factor that I have not mentioned until now is the number of fertile tillers (i.e. heads) per plant. Generally, the earlier the autumn sowing the more heads per plant are formed. In addition, extreme summer weather events can make a huge difference. In 2012, the continuous wet conditions resulted in low black-grass tiller losses and so very high heads/plant. In contrast, the very dry spring of 2011 resulted in tiller losses right up to head emergence, leading to very low heads/plant.

In conclusion, and in the context of total seeds set, natural losses in continuous late-September sown cropping are higher than those that can now be achieved with cereal herbicides. However, these losses are a long way from being high enough to remove the need for chemicals.

A lot of research and field observation is being carried out to see how these natural losses can be increased, for example by later drilling. Head numbers per plant can be reduced by higher crop seed rates, and the weed’s lifecycle can be disrupted by spring sowing. It is heartening that we can do things that enhance natural losses, but they all cost money and, for many, will result in systems that are less reliable than continuous early-autumn sown crops.


Giving up on organic matter

Posted on 20/11/2013 by Jim Orson

Last week I visited Rothamsted Research with a few arable farmers. As always, it was very good value and we were updated on many aspects of the organisation’s research. Pesticide resistance, pesticide availability and soil issues were the main themes.

One area that caused particular interest was the evidence behind a new HGCA and Defra funded project on how to get the best out of organic manures. The whole industry is aware not only of the desirability of increasing organic matter levels in long-term arable soils but also of the futility of trying to achieve such an objective.

Changing to a system of interspersing short periods of arable crops between medium-term grass leys could help but would also have a negative impact on both production and profits. The alternative is to use a lot of organic manures or amendments at regular intervals. However, there are simply not enough of these organic sources to have a significant impact on organic matter levels on more than a very small percentage of our arable land.

So how best to use these limited organic sources to maximise their benefits? Some research at Rothamsted has provided a clue and is the basis of the new project. Researchers found that, in terms of spring barley yields, the benefits of annual farmyard manure applications appeared quickly when judged against the same amount of manure applied annually since 1852 (see figure). This yield benefit, recorded in 2002 after just two successive autumn applications of farmyard manure, could not be explained either by the nutrients in the manure or by an increase in soil organic matter. So what is the possible explanation?


Figure: spring barley yields (t/ha) at the optimum doses of mineral (bag) nitrogen.
The blue line is where mineral nitrogen only has been used annually since 1852, the green line is where both farmyard manure and mineral nitrogen have been used annually since 1852, and the red line is where organic manure and mineral nitrogen have been used annually since 2000 (i.e. first manure application in autumn 2000). Graph kindly provided by Andy Whitmore of Rothamsted Research.

 

Regular readers of my blogs (if there are any out there!) will have guessed the possible explanation because I’ve banged on about this issue before. The application of crop residues, organic manures and amendments increases the fungal, bacterial and faunal biomass in the soil. This biomass can act as a surrogate for organic matter. However, annual applications are needed to maintain the benefits.

It is for this reason that the value of straw to the cereal farmer is more than just the nutrients it contains. Many farmers reported that their land worked more easily after a couple of years of incorporating rather than burning straw. However, despite positive effects on the soil, Rothamsted has not found a benefit in terms of winter wheat yields from the annual incorporation of up to four times the average straw yield.

Why were yield increases from the regular incorporation of organic materials recorded in spring barley and not in winter wheat? It’s my view, which cannot be verified, that spring barley yields are more likely to benefit because the crop is established in more hostile growing conditions and has a shorter growing period. We all recognise that winter wheat is more likely than spring barley to compensate for set-backs in the first couple of months of growth. The opposite may well also be true: spring barley yields are more likely than winter wheat yields to benefit from improved soil conditions in the early stages of establishment.

To continue my theorising, how best should we use the limited UK supplies of organic manures and amendments? Based on very limited information, it would seem that using smaller amounts annually, rather than large amounts intermittently, in largely spring-crop-orientated rotations may be the best way to maximise yield benefits. However, applying smaller amounts annually will result in extra costs. I’m sure that the Rothamsted project will provide guidance as to the way forward.


Conquering hunger

Posted on 13/11/2013 by Jim Orson

In the late 1990s I gave a talk at a highly charged conference on GMs at which I stated that there was progressively less hunger in the world. Naturally, I checked my facts with some aid agencies before making such a statement. I attributed this good news to a more plentiful supply of food because of higher yields, but didn’t highlight the role of any specific agricultural technology. I didn’t even think of GM crops being a factor because at that time they had only just been introduced in the USA.

This did not stop the fledgling anti-GM movement from saying, both at the conference and in follow-up statements, that I was lying. Their implication was that ‘modern agriculture’ was failing to meet the challenge of a rapidly expanding world population.

Nowadays, the FAO publishes estimates of world hunger which provide hard data rather than opinion. These show that a gradually decreasing proportion of the world population is described as undernourished, falling from around 19% in 1990 to around 12% in 2013. The percentage of undernourished people has fallen in all regions, but not in all countries, of the world. Tragically, around 20% of the population in sub-Saharan Africa and in South Asia still does not have sufficient food.

Whilst the percentage of undernourished people has fallen significantly over the last 20 years or so, the world population has risen sharply. So the question has to be asked: are there now fewer undernourished people than then? Rather amazingly, the world population has increased from just over 5 billion in 1990 to around 7 billion in 2013, almost as large a proportional increase as the decrease, over the same period, in the proportion that is undernourished. Hence, the absolute number of people recorded as undernourished (now 842 million) is not dramatically below that in 1990 (1,015 million). So whilst modern agriculture can pat itself on the back for feeding the world’s rapidly expanding population, there are still many challenges ahead.
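
(As a rough check on those figures: 19% of just over 5 billion is around 1.0 billion people, while 12% of around 7 billion is around 0.84 billion, which squares with the 1,015 million and 842 million quoted.)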

This emphasises the need to be very careful when quoting statistics. I would have been absolutely correct to say in my talk that the proportion of the world population that is undernourished was falling but this would have ignored the fact that the fall in the number of people in this category was not so significant. There are plenty of instances where ill-defined statements can be misleading. For example, a recent report showed that the UK has reduced carbon dioxide emissions by 20% since 1990. Great news, until you read in the same report that the UK has increased its carbon footprint by 10% over the same period because of the carbon dioxide emissions associated with the production and transport of the goods we’re importing in increasing amounts.

There’s an active debate about how farming should be structured in the less developed parts of the world in order to meet these food security challenges of the future. There appear to be two contrasting views. The first, which is being championed by some charities, is that very small scale family units are the solution. The opposite view, taken by some of the more industrially-based organisations, is that very much larger and professionally managed units are necessary. I can’t really comment on this dichotomy other than to say that the farms need to be sufficiently large to provide a surplus of food in order to create a market and to stimulate other parts of the economy. It is precarious, to say the least, to rely solely on barely self-sufficient units. This was the objective in Cambodia when Pol Pot tried to impose his communist-based agrarian Utopia.


Can we afford any more revocations?

Posted on 05/11/2013 by Jim Orson

One of life’s treats for me is to settle down with a cup of coffee (or two) and read The Sunday Times. The process starts by throwing away all the bits we don’t read, then finding out about the latest cars and gadgets and then onto the sport and then business. I reach the news later in the day and so by that time I am very relaxed.

I was rudely roused from my relaxed state a couple of Sundays ago by an article that attributed all losses in UK amphibian life over the last few decades to pesticides. It quoted research sources that were said to have proved this state of affairs. The pesticides mentioned were atrazine, DDT, dieldrin and malathion.

My immediate reaction was that no other possible causes of losses of amphibians were mentioned, such as loss of habitat or other possible toxins. This seemed an extraordinary oversight in light of the fact that the pesticides mentioned have not been used in the UK for years.

It is now clear that the bold accusations were made by an anti-pesticide group and not by the researchers themselves, whose methods appear not to reflect the type of exposure that would occur outside the laboratory. I’m rather disappointed that a quality newspaper acted as a mouthpiece for this group rather than investigating the facts more thoroughly.

Since the article was published, I’ve been thinking a little more about my immediate reaction to it. I remember saying to my wife that the article was ill-founded because the pesticides had already been withdrawn because of fears over their safety in the environment. However, perhaps this makes me a bit of a hypocrite because I may have lamented and, at least subconsciously, defended these pesticides when they were first threatened with revocation. I realise that it is difficult to defend DDT and dieldrin, but please remember that DDT is still being used in a very targeted way to fight malaria (see my April 2013 blog ‘The Impatient Optimist’).

The new EU pesticide regulations have further increased the standards of environmental protection, particularly for aquatic life. I blogged only a couple of weeks ago about wider aquatic buffer zones (‘Two in one will go’).  The issue of endocrine disruptors is also coming to a head and it’s hard to imagine that there will not be further losses of active ingredients as a result.

As standards increase, the reasons for revocation become more marginal. Preventing the widespread use of DDT and dieldrin was perhaps an easy decision compared to the possible withdrawal of active ingredients due to the hazard (not the risk) of endocrine disruption. So getting the correct balance between food production and any threat that may exist to environmental and human health will become more challenging.

It’s also getting harder to achieve the right balance when there are single issue groups who are hell bent on getting their own way. To achieve this, truth and scientific facts become the first casualties. I suppose the prime example of this is Greenpeace’s campaign against GM Golden Rice (see January 2013 blog ‘A golden future’), the cultivation of which may prevent widespread disability and death amongst children.

Fortunately, we’ve continued to produce the same amount of food despite all the pesticide withdrawals which we’ve seen over recent years. Unfortunately, this may have given the impression that food production will continue to be maintained despite even further pesticide withdrawals. However, there are many of us in the industry who now feel that we cannot suffer more losses of pesticides without a significant fall in production. Hence, I sympathise with those at the sharp end of decision making on pesticide approvals. They’re trying to achieve the right balance between food production and the environment, which is of course a political as well as a scientific issue.

Politicians are becoming increasingly aware of the challenges to future food supplies and they also seem more aware that NGOs and single issue groups may not be as righteous as they themselves think they are. It is my firm belief that the debate on achieving the right balance between food production and the environment will be heightened as food supplies get tighter, resulting in more votes being swayed by food availability and prices.


Seeing is believing?

Posted on 29/10/2013 by Jim Orson

It’s always been said that many farmers adopt or adapt techniques by ‘looking over the hedge’ at their neighbours. Nowadays ‘looking over the hedge’ should not be taken too literally as it is a process that includes all forms of communication. However, seeing should not necessarily mean believing.

I don’t think we’ll ever regain the world wheat record from New Zealand now that they’ve really got their act together. In this context, this year’s UK record wheat crop was a real achievement. NIAB TAG’s monitoring of weather variables suggested that the area where the UK record crop was grown was favourably treated by nature this spring; higher than average levels of solar radiation and sufficient rainfall were recorded.   

Much is being made of the level of foliar nutrition that was received by the wheat crop that broke the UK wheat record. There’s the assumption that this significantly contributed to the high yields and it appears that the farmer, whose attention to detail is impressive, is convinced of this.

Now, I don’t want to be a killjoy but I’m not quite so convinced. Perhaps this is inevitable from a boring science-based agronomist whose views have been coloured by similar (and eventually proven to be unsubstantiated) claims throughout his career. Perhaps this time it is different but I should like it to be validated in field trials where yields with and without the foliar nutrition are compared.

What was different about the approach to the record UK wheat crop is that the farmer applied foliar nutrition from early post-emergence onwards. I don’t think that’s been assessed experimentally in the past. Usually these products have been tested in single applications rather than in a multi-application, multi-year approach. The farmer’s high yields suggest that this approach needs to be evaluated experimentally.

There are yield benefits from splitting the total season-long dose of ‘bag’ nitrogen. In trials in years gone by, there were yield benefits from increasing the number of spring nitrogen applications. The biggest yield benefit was going from one application to two. The yield benefit got progressively less with each additional application. The current standard of three applications is a pragmatic approach bearing in mind labour and machinery costs but there would be yield benefits from having one or two additional applications. However, it must be remembered that we farm to optimise margins and not to maximise yields.

In press reports, the farmer who grew the UK record crop laments the fact that he cannot use, for environmental reasons, the very high dose of applied nitrogen that was used to grow the world record crop of 15.64 t/ha in New Zealand. The UK farmer is reported to have used 220 kg N/ha. In fact he may not have had to use much or any more, because Eric Watson, who also farms in the South Island of New Zealand, has achieved a field yield slightly higher than the world record with around 250 kg/ha of applied nitrogen. The level of soil mineral nitrogen in this case was 100 kg/ha.

In NIAB TAG trials, the optimum applied nitrogen dose is not reduced by this level of soil mineral nitrogen. To reinforce this point, Craige MacKenzie, who farms a few miles from Eric Watson, has also achieved a field yield higher than the world record from a total of 262 kg/ha of applied nitrogen where the soil mineral nitrogen was 35 kg/ha. Unlike Mike Solari, the world record holder, both Eric and Craige have the benefit of variable rate irrigation.

This suggests that, while the amount of nitrogen that a crop can access may limit yield, very high wheat yields can be achieved with doses of applied nitrogen that are not exceptionally high. What is essential is the correct crop structure, prolonged ripening with high levels of sunlight and an ample supply of moisture. The final and critical element must be the skill of the farmer, with particular emphasis on attention to detail. I remember vividly having a prolonged discussion with Mike Solari about seed rates and tiller numbers and what constitutes a ‘too thick’ crop.

As an aside, one intriguing observation can be made when looking at the results of UK trials on trace elements and/or plant vigour sprays. Usually these trials have two control treatments. One is standard crop management; the other is the same but receives a spray of just water, applied at the same timing and volume as the trace element and/or plant vigour sprays. It’s not unusual to see a yield benefit from all the spray treatments, with or without the trace element and/or plant vigour products. I’ve often pondered why such a small amount of applied water can make such a difference.
