NIAB - National Institute of Agricultural Botany

Orson's Oracle

Nitrogen recommendations – it's very complicated

Posted on 04/11/2016 by Jim Orson

I wrote a blog in April (Precision application of nitrogen) on why I thought that experiments and field experience over the past 10-20 years have not shown a significant benefit in the spatial application of nitrogen in winter wheat. The recent publication of the AHDB project ‘Automating nitrogen fertiliser for cereals (Auto-N)’ has strengthened the argument that this approach to nitrogen fertilisation has little or no economic advantage.

The authors of the report seem to accept that the simplistic models that are currently used to predict nitrogen requirement do not take into account the unpredictability of nitrogen uptake by the crop, in this case winter wheat. In many cases there are interactions between the components used to predict optimum nitrogen requirement. For instance the authors highlight a possible negative correlation between soil nitrogen supply (SNS) and fertiliser recovery. Where SNS is high there is sometimes a lower recovery of fertiliser nitrogen by the crop.


However, despite the failure to define an approach that will make the spatial application of nitrogen pay in winter wheat, the report is full of data. I love data, but there is far too much for me to absorb. There are, though, some standout pieces of information and interpretation by the authors.

Highlighted in the project summary is the statement that high SNS is associated with high yields. We are familiar with the term SNS; in current recommendation systems it is the sum of Soil Mineral Nitrogen (SMN) in the spring, plus the amount of nitrogen in the crop at that time, plus an estimate of usable net mineralisable nitrogen released by the soil. However, in the report it has a different meaning: it is the amount of nitrogen in the crop at harvest in the plots that did not receive nitrogen, hence it is Harvested SNS. The interesting observation that high Harvested SNS is associated with high yields is only useful if it is predictable. Unfortunately it may not be; in this project the relationship between SMN and Harvested SNS was tenuous.

Other data collected in this project suggest that there is not a strong association between grain yield and nitrogen requirement. This confirms previous field and experimental evidence that very high yields can be achieved with moderate levels of nitrogen. In this project, plot yields in excess of 13 t/ha were achieved with 240 kg N/ha in a field which had modest levels of SMN. Last year, in a NIAB TAG experiment in a field with typical levels of SMN, treatment yields in excess of 14 t/ha were achieved with 220 kg N/ha before additional nitrogen was required to enable even higher yields. However, please remember that spot yields in excess of 14 t/ha can occur in fields averaging 10-12 t/ha.

To quote the report ‘the lack of a strong relationship between yield and nitrogen requirement raises some important questions’. The most important question is how yield expectation can be incorporated into future nitrogen recommendation systems. My view for feed wheats is that nitrogen doses should not be increased until field yields above 10-12 t/ha are expected. For field yields above this level any increase in nitrogen recommendation should be modest.

A new recommendation system cannot continue to adopt the principle that each kg of SMN is equivalent to a kg of SNS. Continued adherence to this principle explains why RB 209 has too large a change between soil nitrogen indices in recommended doses for feed wheat.

AHDB has funded two large projects that patently show that each kg of SMN changes SNS by only up to half a kg. Reviews of field experiments and the Auto-N project also indicate the surprising lack of influence of SMN levels below about 100 kg/ha on the economic optimum dose of nitrogen for feed wheat.

All this is perhaps summed up by this statement in the report: ‘However, the large and somewhat unexplained variation in measured N requirements means that any prediction system will inevitably produce errors and improvement over the standard recommendation system (or even a standard figure of, say, 200 kg N/ha) is likely to be relatively modest’. I think that the 200 kg N/ha quoted was in the context that a kg of nitrogen was five times more expensive than a kg of wheat.

I suppose that the only comfort is that errors of up to 50 kg N/ha either side of the optimum dose have a relatively small effect on the margin over nitrogen costs. Please remember that the doses of nitrogen quoted in this report are relevant to feed wheats rather than milling wheats.
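A rough sketch shows why errors either side of the optimum cost so little. The yield response curve below is invented for illustration (it is not fitted to the Auto-N data), but it uses the report's break-even ratio of a kg of nitrogen costing five times a kg of wheat:

```python
# Hypothetical quadratic yield response for feed wheat (illustrative only,
# not fitted to the Auto-N data): yield in t/ha as a function of N in kg/ha.
def yield_t_ha(n):
    return 6.0 + 0.035 * n - 0.00006 * n ** 2

WHEAT_PRICE = 0.12              # £/kg grain (assumed)
N_PRICE = 5 * WHEAT_PRICE       # the report's break-even ratio: 1 kg N = 5 kg wheat

def margin(n):
    """Margin over nitrogen cost, £/ha."""
    return yield_t_ha(n) * 1000 * WHEAT_PRICE - N_PRICE * n

# Economic optimum: the marginal grain response (kg grain per kg N) equals
# the price ratio of 5, i.e. 1000*(0.035 - 0.00012*n) = 5, giving n = 250.
optimum = 250
for err in (-50, 50):
    loss = margin(optimum) - margin(optimum + err)
    print(f"{err:+d} kg N/ha from the optimum costs about £{loss:.0f}/ha")
```

With these assumed numbers, being 50 kg N/ha out either way reduces the margin by under £20/ha against a margin of well over £1,000/ha, because the response curve is almost flat near the optimum.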



Average wheat yields

Posted on 21/10/2016 by Jim Orson

Defra has recently published its preliminary estimate of this year’s UK cereals and oilseed rape harvest. It suggests that wheat averaged 7.9 t/ha and oilseed rape a lowly 3.1 t/ha. Defra, like everyone else, has suggested that the wheat yield was ‘average’. That got me thinking; how is the average yield calculated, particularly if there is a current trend upwards or downwards?

Hence, I have produced two graphs of the average yields of wheat, using information from the excellent online FAOSTAT3 database for the years 1996-2014 and Defra data for 2015 and 2016. The data used by the FAO are taken from Defra’s final yield surveys but at the moment they cover only individual years up to the 2014 harvest.

The first graph shows the five year rolling average yield for UK wheat; hence, the 2016 average yield includes the yields achieved from 2012 to 2016. As you can see, this graph rather deflates the view that we are witnessing a breakthrough in yields. Average five year rolling yields have been virtually the same over this century.

The second graph shows the four year rolling average and this implies a different scenario. After a long plateau in yields, they appear to be on the rise following a recent dip.

The difference between the two graphs is explained by the year 2012. With that year out of the picture, the high yields achieved in 2014 and 2015 have raised the 2016 four year rolling average yield. It is tempting to say that we should ignore 2012, when exceptionally low summer solar radiation and summer waterlogging resulted in such poor yields. However, on the same basis, should we perhaps ignore the near perfect conditions, particularly up North, for wheat yields in 2015?
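The arithmetic behind the two graphs can be sketched in a few lines. The yields below are invented round numbers, not the Defra/FAOSTAT figures, but they show how dropping one poor year from the window shifts the rolling average:

```python
# Invented yearly wheat yields (t/ha) -- NOT the Defra/FAOSTAT data --
# with a deliberately poor 2012 to mimic the effect described above.
yields = {2011: 7.7, 2012: 6.7, 2013: 7.4, 2014: 8.6, 2015: 9.0, 2016: 7.9}

def rolling_mean(data, end_year, window):
    """Average of the `window` years ending at (and including) end_year."""
    vals = [data[end_year - i] for i in range(window)]
    return sum(vals) / window

five = rolling_mean(yields, 2016, 5)   # includes the poor 2012 harvest
four = rolling_mean(yields, 2016, 4)   # drops 2012, so 2014/15 dominate
print(f"5-year: {five:.2f} t/ha, 4-year: {four:.2f} t/ha")
```

With these made-up figures the four year average comes out roughly 0.3 t/ha higher than the five year one, purely because the window no longer reaches back to the bad year.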

So, in my opinion, we have yet to see clear evidence that wheat yields are increasing despite some spectacular results in 2015. In addition, this year’s oilseed rape yields have dampened a recent trend towards higher yields.

There is more than one reason for this year’s rather ‘average’ winter wheat yields and the poor winter oilseed rape yields. However, one common feature shared by both crops is that much of the country had poor levels of solar radiation in late May and June. This may have been more significant in winter oilseed rape than in winter wheat. This is because grain fill in oilseed rape is determined almost entirely by current solar radiation whilst a significant proportion of grain fill in wheat is provided by reserves laid down before flowering.

The generally low radiation levels in June are probably the dominant explanation for the relatively high protein levels in wheat this year. Grain fill was restricted whilst moist soils meant that there was adequate nitrogen in the wheat plants. Another explanation may be that farmers were mistakenly encouraged by the high yields in 2015 to apply more nitrogen in 2016. There is no evidence of a link between economic optimum nitrogen levels and feed wheat yields unless yields are exceptionally high. NIAB TAG data suggests that extra nitrogen may be required for yields above 14 t/ha but please remember that such yields occur in parts of fields that have an overall yield of 11-12 t/ha.

Defra publishes its final yield estimates in December along with regional yields. I will then comment on my predictions of wheat yields made in a blog in July. It sounds as if I was not too far out but, of course, the majority of seasons produce average or near average yields.



Variety choice more important with low yields

Posted on 07/10/2016 by Jim Orson

I recently looked at the results of this year’s and last year’s winter wheat recommended list variety trials on the AHDB website. I was struck by the small differences in percentage yield between varieties in the highest yielding trial, which averaged over 17 t/ha. This spurred me to look at the standard deviation of variety yields in each of the fungicide treated winter wheat recommended list trials in 2012 to 2016.

Standard deviation is simply a measure that is used to quantify the amount of variation or dispersion of a set of data values. The results of this study are expressed graphically in the figure below. This shows that there is no trend between the standard deviation of the yields of the individual varieties (t/ha) and the average yield of the trial. The outlier showing a very high standard deviation was a second cereal on a light soil: this may have suffered from take-all and the records show that it experienced high pressure from septoria.

Does this ‘interesting’ observation have any real practical significance? That is a matter of opinion but it suggests to me that variety choice is more critical for low yielding situations, particularly when margins are as tight as they are now.

Looking at the data in another way, when the yield results are expressed as percentages rather than t/ha, overall the spread of individual variety performances in each trial reduces as the mean yields of the control varieties increase. In the figure below, the 2012 data, the lowest yielding year of the five years, has been highlighted.
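The distinction between the two ways of expressing the spread can be shown with a small sketch. The variety yields below are invented, not the AHDB trial data; they are constructed so the absolute spread is the same in both trials:

```python
import statistics

# Invented variety yields (t/ha) from two hypothetical RL trials -- NOT the
# AHDB data. Both sets have the same absolute spread around their mean.
low_trial  = [7.1, 7.9, 8.4, 8.8, 9.3]        # mean 8.3 t/ha
high_trial = [15.8, 16.6, 17.1, 17.5, 18.0]   # mean 17.0 t/ha

for name, trial in [("low", low_trial), ("high", high_trial)]:
    mean = statistics.mean(trial)
    sd_t = statistics.stdev(trial)     # spread in t/ha
    sd_pct = 100 * sd_t / mean         # spread as a % of the trial mean
    print(f"{name}-yielding trial: sd = {sd_t:.2f} t/ha = {sd_pct:.1f}% of mean")
```

The standard deviation in t/ha is identical in the two trials, yet expressed as a percentage of the trial mean it is roughly half as large in the high-yielding one, which is exactly the pattern described above.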

Septoria pressure was very high in the extremely wet summer of 2012. Interestingly, I cannot find any evidence to suggest that there was an association between the level of septoria infection and the standard deviation of the physical yields of the varieties in the individual trials harvested that year. This indicates that the reasons why there are just as large yield differences between varieties, expressed in t/ha, in low yielding trials as in high yielding trials are probably more complicated than a single factor. They are possibly associated with the overall stresses incurred in individual trials that affect some varieties more than others. The trials were stressed by summer waterlogging in 2012 but in other years lack of moisture was a feature. Hence, this type of analysis may not help to improve variety selection in differing scenarios but it does emphasise that variety performance is economically more important in lower yielding situations.



GM or non-GM; it’s the trait, silly!

Posted on 21/09/2016 by Jim Orson

It is a fair time since I vented my spleen over the absurdity behind the public’s misguided views on the environmental impacts of GM crops. It is not the public’s fault; they have been led up the garden path by the green blob.

A basis of the registration process for GM crops is that they have no additional negative impacts on the environment than conventionally bred crops that have the same trait. Hence, it is hardly surprising that a recently published report from the Belgian plant science institute VIB concludes that:

‘Crop cultivation is by definition unnatural, and produces a negative impact on the environment. Plant breeding makes it possible to develop plants that reduce this impact. The impact, whether positive or negative, depends on the crop trait and the cultivation method, but not on the breeding technology used.’

It is comforting that the registration systems for GM are working and that there have been environmental benefits. Over the last 18 years, GM crops worldwide have saved 6.3 billion litres of fuel and 21.3 million kg of insecticide active substances. The report also highlights that when fields with insect-resistant GM crops are compared with conventional fields where insecticides are used, many more beneficial insects can be found in the fields with insect-resistant GM crops. Even opponents of GM agree that insect resistance can lead to more insect diversity in crops but I assume that they still object to using GM to achieve it. Surely by now they must suspect that they may be wrong in their unsubstantiated objections to this technology. However, they and (in some cases) their bank balances have nailed their colours to the mast and would find it impossible to backtrack.

The report does include a warning about the adoption of herbicide-tolerant crops. The continual use of a single mode of action to control weeds, particularly where minimum- or zero-tillage is adopted, is the strategy with the highest risk of developing herbicide resistance in weeds. I do worry that the easy route provided by GM herbicide tolerance could mean that European farmers might adopt these practices, despite the precedent set by US farmers that has resulted in glyphosate resistant weeds. Hence, the report emphasises that:

‘To prevent resistance in weeds, insects, and fungi, there must be integrated pest management, which involves using several means or techniques simultaneously against a particular pest.’

The report also contains some intriguing data on the role of plant breeding in feeding the world now and in the future. A group of economists has estimated the impact of removing the gains in cereal productivity attributed to the widespread adoption of improved varieties. They conclude that in 2004, between 18 and 27 million additional hectares of agricultural land would have been in use compared to 1965. Of this figure, an estimated 12 to 18 million hectares of land was spared in developing countries and 2 million hectares of deforestation prevented. There is no doubt that, as a result of widespread adoption of improved crop germplasm, increases in cereal yields have saved natural ecosystems from being converted to agriculture. It is a pity that the green blob does not yet ‘get’ the wider environmental benefits associated with agricultural technologies.



Over-precautionary principle?

Posted on 06/09/2016 by Jim Orson

Previous blogs have defined the difference between hazard and risk. Electricity is hazardous but the way we manage it means that it has a very low risk of doing harm. With pesticides, the level of risk is all about the dose likely to be received by operators, bystanders, customers and the wider environment. The way we manage pesticides can reduce the risk to acceptable levels for many pesticides hitherto classified as hazardous.

However, for the first time, hazard cut-offs have been adopted in the current EU pesticide regulations, perhaps due to the over-zealous application of the precautionary principle. This means that however low the risk of using a particular pesticide may be, its registration will be refused or revoked if it has in any way been classified as a hazardous substance.

There are, of course, other hazards to human health involved in crop production. The most significant are the hazardous pathogens that occur in animal manures. The most prominent is E. coli, but salmonella and listeria can also be present. No one would dream of banning animal manures from crop production, despite these hazards.

The greatest risk of these hazardous organisms causing harm must be when manure is used just before or during the production of short-season crops which are consumed raw. There are guidelines for composting the manure before use, but things can go wrong. The best known example of things going wrong is the deaths caused by E. coli contamination of organic bean sprouts in Germany a few years ago. Over 40 people died and more had their long-term health ruined. In this and other cases of E. coli contamination, the risk management procedures either failed catastrophically or were not followed.

Possibly as a result of this particularly disastrous failure to provide safe food, the regulators appear to have increased their monitoring of hazardous pathogens in food. Last year the New York Times reported on the dramatic increase in product recalls of organic vegetables in the US, mainly due to contamination from animal manure derived salmonella and listeria. These pathogens can, at the very least, cause severe discomfort for the consumer. The data suggest, at least in some cases in the US, that the risk management of animal manures still needs tightening up. It is interesting that these pathogens did not appear to be an issue in conventionally produced food, whose main problem was undeclared allergenic constituents.

Over the last few decades there has not been a human death in the UK directly attributable to pesticides, and I am not aware of any proven chronic effects except, perhaps, from pesticides that were withdrawn a long time ago. Interestingly, a recently published study by Oxford University on the diets of more than 600,000 women over a decade, carried out before hazard cut-offs were introduced, suggests that eating organic food does not reduce the incidence of the wide range of cancers monitored. This suggests that rigidly applied risk management, rather than hazard cut-offs, would be as appropriate for pesticides as it is for animal manures.

Perhaps Brexit will provide an opportunity for the UK to make sure that the risk of harm from pesticides is assessed in the same manner as for other inputs used to produce food.


