Posted on 27/05/2016 by Jim Orson
I have tried cover crops on my allotment over the last year. They looked great in mid-October (see pictures), with the late-summer sown phacelia and vetch growing to an enormous size before being killed by the mild frosts of this last winter. Early-autumn sown spring oats also grew rapidly to a huge size but the winter only killed about half the plants.
The following spring-sown vegetable crops are now growing well where I dug the ground before sowing the cover crops, but not where I have not dug or deep-cultivated the soil for a year or two. So I have reluctantly decided that digging is still required before sowing, or after removing, the cover crops. I have also decided that the only cover crop I will continue to grow is spring oats. The seed is cheap (well, free!), establishment is more reliable and the soil is in good shape afterwards. In addition, there are no disease implications for the vegetable crops I grow.
Over the spring I have witnessed a few large-scale on-farm experiences with autumn-sown cover crops. Their impact seems to have been favourable where the soil has been light enough to carry out sufficient cultivation before establishing spring-sown crops. A few half-field observations suggest a more vigorous spring-sown crop after cover crops. However, there is an active debate about the economic benefits of cover cropping, particularly on the heavier soils.
One intriguing aspect of autumn-sown cover crops is that they significantly reduce over-winter nitrate leaching but UK trials suggest that there is no opportunity to reduce nitrogen application to the following spring-sown crop. So what is going on? What happens to the nitrogen that has been prevented from leaching to the lower layers of the soil or to the drains?
There is a cover crop experiment in France investigating the impact of a late summer/early autumn-sown white mustard cover crop in a winter wheat, spring barley, spring pea rotation. A paper written after the first 12 years of the experiment confirms that the reduced nitrate leaching over winter does not reduce the economic optimum nitrogen dose for cereal crops. The measurements taken show that much of the nitrogen saved from leaching goes into storage in the soil organic matter.
Modelling based on the results of this and two other long-term cover crop trials in France suggests that, over time, the benefit of cover crops in reducing nitrate leaching decreases. However, it concludes that after twenty years or more the economic optimum nitrogen dose for crops can be reduced by around 20-24 kg N/ha, and that this reduced dose restores much of the benefit of reduced nitrate leaching with autumn-sown cover crops.
It is worth pointing out that these long-term experiments all have non-leguminous cover crops. The nitrogen implications will no doubt be different where leguminous cover crops are adopted.
In the UK, winter cropping may well have similar nitrogen dynamics to spring cropping plus autumn-sown cover crops. Hence, interpreting this French research in terms of practical implications for the UK is, to say the least, difficult. However, I am relieved that the modelling work sets out the possible long-term fate of the nitrogen that is prevented from leaching over the winter by the introduction of autumn-sown cover crops, and also the possible impact on nitrogen fertiliser requirements.
Posted on 13/05/2016 by Jim Orson
I must admit to getting emotional when Tottenham failed to get the points to challenge Leicester City for the Premier League title. Much has been written about the significance of Leicester’s win and, for me, it has added spice because Leicester and Tottenham have history. Leicester lost to them in the 1961 FA Cup Final and also lost out to them for the League title in 1963.
I am amused by the fact that the odds of a Leicester win were 5,000:1. This is quoted time and time again as if these odds are real. In fact they are just numbers adopted by the bookmakers. Apparently the odds are only 2,000:1 on finding Elvis Presley alive and kicking; well perhaps not kicking too vigorously as he was born in 1935. However you present it, the press and the public see Leicester’s League title as a triumph for the underdog and a real David v Goliath story.
This has got me thinking about the current travails of glyphosate, the biggest-selling pesticide in the world. It is for this reason, and its association with GM, that it has become a target of the green blob. Now they seem to be having some success in making registration authorities think through the extra ‘evidence’ they have presented against it. I have put ‘evidence’ in inverted commas because of the standards and relevance of some of the ‘science’ being used to question glyphosate’s safety.
The green blob’s breakthrough came when the International Agency for Research on Cancer (IARC, which is part of the World Health Organisation) listed glyphosate as probably carcinogenic to humans. I wrote two blogs on this subject questioning their decision-making: Glyphosate cancer confusion and Roundup causing cancer? The second of these blogs was written after the European Food Safety Authority questioned the IARC decision.
It may be that the IARC decision was heavily influenced by the presence of a member of the green blob on the inside who helped to lead the decision-making process. This is a quote from an article by Matt Ridley in The Times on 23rd April:
Yet the document depends heavily on the work of an activist employed by a pressure group called the Environmental Defense Fund: Christopher Portier, whose conflict of interest the IARC twice omitted to disclose. Portier chaired the committee that proposed a study on glyphosate and then served as technical adviser to the IARC’s glyphosate report team, even though he is not a toxicologist. He has since been campaigning against glyphosate.
The IARC study is surely pseudoscience. It relies on a tiny number of cherry-picked studies, and even these don’t support its conclusion. The evidence that it causes cancer in humans is especially tenuous, based on three epidemiological studies with confounding factors and small sample sizes “linking” it to Non-Hodgkin lymphoma (NHL). The study ignored the US Agricultural Health Study, which has been tracking some 89,000 farmers and their spouses for 23 years.
The study found “no association between glyphosate exposure and all cancer incidence or most of the specific cancer subtypes we evaluated, including NHL . . .”
The worrying implication is that far-reaching and important decisions made by major international bodies can now be unduly influenced by the green blob. Surely this cannot continue and there has to be a return to objective and independent decision-making.
The pressure on glyphosate continues. A recent article in The Ecologist suggests that glyphosate kills a key soil fungus. However, it seems that the soil fungus used in this study was artificially exposed in the laboratory to concentrations of formulated glyphosate at levels that would not be found in the field soil environment. Previous investigations using standardised tests show that glyphosate formulations have no long-term effects on microorganisms in soil. I now never read The Ecologist after its leader once suggested that the world should eschew conventional and organic agriculture and we should all go back to hunting and gathering. Need I say more?
So is glyphosate David or Goliath in the fight for its future? I suggest that it is David because of the mass ranks mustered by the green blob that regularly use arguments not based on realistic science. It is as well to remember that everyone loves the underdog; just take Leicester City as an example.
Posted on 29/04/2016 by Jim Orson
Recently, the David Attenborough Building opened in the centre of Cambridge. It provides office accommodation for 500 (yes, five hundred) conservationists. These are university researchers, post-graduate researchers and representatives of both UK and international organisations. Hence, it is a melting pot of ideas, but it seems that there is one thing that they cannot really settle upon: a universally agreed definition of conservation.
One definition has been suggested in a recent paper: ‘actions that are intended to establish, improve or maintain good relations with nature’. I do not think that anyone can disagree with this very broad-brush definition; a more precise one seems harder to agree on.
A few weeks ago I attended a seminar entitled “what is conservation?” at the David Attenborough Building. Incidentally, the building was so recently completed that the PA system did not work and there were some wires hanging down from the ceiling. I assumed that a full Health and Safety assessment had been carried out!
One speaker’s approach was to accept that conservation means halting or interfering with natural succession, otherwise all the ‘unmanaged’ land would eventually end up as deciduous woodland. However, with the example she gave, I felt a tiny bit uneasy about her approach. She described a deciduous wood on the North Norfolk coast that had grown on some boggy land. In the minds of the conservation agencies it would be better for general biodiversity to return the wood to a boggy area with few trees. The local villagers rather enjoyed their wood, but they were persuaded that chopping down the trees would increase the conservation value of the site. The speaker said that the villagers now enjoy the wildlife in the recently cleared area, but I think it has taken an enormous effort to reach this end point.
Of course there would be little natural succession to halt if it was not for farmers originally clearing land to produce food. It is the broad spectrum of habitats in the natural succession between intensively cultivated land and deciduous woodland that provides the home and food for the huge range of plants, insects and animals that form the biodiversity of this country. Hence, there is a real need to manage uncropped land to support biodiversity.
A recently published paper attempts to attribute the cause of the changes in biodiversity in the UK since 1970 and concludes that agricultural management and climate change are the major drivers. It suggests that we should adopt lower intensity farming to help reverse some of the negative trends in biodiversity measured over this time period. Of course, the overriding issue is that the world population has doubled since 1970 and demand for food will continue to grow.
What I have not been able to glean from the scientific literature is the cost to biodiversity of producing, say, one tonne of wheat under different approaches to arable production: organic, low-intensity and conventional. There are such studies for greenhouse gas production. They conclude that, despite the greenhouse gas emissions associated with the production and use of nitrogen fertiliser, emissions are lower when producing a tonne of wheat conventionally rather than organically. I suppose that biodiversity is too multi-faceted for the same analysis to be done for it.
There has been an attempt to compare the value to butterflies of conventionally and organically managed land. This concludes that ‘farming conventionally and sparing land as nature reserves is better for butterflies when the organic yield per hectare falls below 87% of conventional yield. However, if the spared land is simply extra field margins, organic farming is optimal whenever organic yields are over 35% of conventional yields’.
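That comparison boils down to a simple break-even rule on relative yield. The sketch below uses the 87% and 35% thresholds quoted above; the function name and the yield figures are my own, purely for illustration:

```python
# Break-even rule from the butterfly study quoted above: below the threshold
# relative yield, conventional farming plus spared land wins; above it, organic.
# The thresholds are from the study; everything else here is illustrative.
RESERVE_THRESHOLD = 0.87   # spared land managed as nature reserves
MARGIN_THRESHOLD = 0.35    # spared land is simply extra field margins

def better_for_butterflies(organic_yield, conventional_yield, spared_as_reserve):
    """Return which system the study's rule favours at a given relative yield."""
    relative_yield = organic_yield / conventional_yield
    threshold = RESERVE_THRESHOLD if spared_as_reserve else MARGIN_THRESHOLD
    return "organic" if relative_yield >= threshold else "conventional + spared land"

# Assuming, for illustration, organic wheat at 6 t/ha against 8 t/ha conventional:
print(better_for_butterflies(6.0, 8.0, spared_as_reserve=True))   # → conventional + spared land
print(better_for_butterflies(6.0, 8.0, spared_as_reserve=False))  # → organic
```

The interesting feature of the rule is how sensitive the answer is to what the spared land is actually used for, which is exactly the study’s point.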
This suggests that conventional farmers have to manage their uncropped land better. I think that this is supported by the paper I previously quoted on the cause of changes in biodiversity since 1970, which suggests that the way the habitat is managed has a greater impact than changes in its extent. I assume that the authors are suggesting that this applies not only to cropped land but also uncropped land.
This all suggests that in order to relieve the intense pressure to increase the biodiversity of arable land, the first step the industry needs to make is a significant improvement in the management of uncropped land.
P S – Despite the fact that they had a brand new building for 500 conservationists in the centre of Cambridge, the speakers at the seminar did appear to agree on one thing; ‘not enough funding’.
Posted on 15/04/2016 by Jim Orson
A few months ago I attended a seminar at which all of the farmers present said that they had abandoned the precision application of nitrogen to wheat. This came as no surprise to me as I am unaware of a strong scientific base to support it. On the contrary, the scientific arguments against it keep strengthening.
One key issue is the way that wheat responds to nitrogen. It is possible to stray from the economic optimum by up to 50 kg N/ha and have only a minor effect on both yield and margin. This is demonstrated by a slide shown at the HGCA (now AHDB Cereals and Oilseeds) conference a few years back.
Therefore, based on nitrogen response data from individual trials, the economic optimum has to vary very significantly within a field before there are meaningful economic advantages from precision application of nitrogen, provided of course that the average dose is just about right. This overriding basic fact alone explains much of the disenchantment with the precision application of nitrogen to wheat.
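To see why straying from the optimum costs so little, consider a simple quadratic response curve. The coefficients and prices below are hypothetical, chosen only to give a plausible-looking feed wheat response; they are not taken from any trial quoted here:

```python
# Illustrative quadratic response: yield (t/ha) as a function of N dose (kg N/ha).
# Coefficients are hypothetical, chosen only for a plausible feed wheat curve.
def yield_t_ha(n):
    return 6.0 + 0.04 * n - 0.0001 * n ** 2

GRAIN_PRICE = 120.0   # £/t, assumed
N_PRICE = 0.80        # £/kg N, assumed

def margin(n):
    """Grain revenue minus nitrogen cost, £/ha."""
    return GRAIN_PRICE * yield_t_ha(n) - N_PRICE * n

# Economic optimum where marginal grain revenue equals marginal N cost:
# 120 * (0.04 - 0.0002 * n) = 0.80
n_opt = (0.04 - N_PRICE / GRAIN_PRICE) / 0.0002

for n in (n_opt - 50, n_opt, n_opt + 50):
    print(f"N = {n:5.0f} kg/ha, margin = £{margin(n):7.2f}/ha")
```

With these assumed numbers the optimum is about 167 kg N/ha, and applying 50 kg N/ha more or less than that reduces the margin by only £30/ha out of roughly £1,050/ha. That flatness near the optimum is the whole story.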
Nitrogen trials are carried out in a small part of a field where the level of soil mineral nitrogen (SMN) may not vary much across the trial site. On a field scale, SMN levels will vary more significantly and may provide an opportunity for precision application of nitrogen. This approach was tested in a major LINK-funded project in the late 1990s. The test fields were divided into a grid; for each section of the grid an SMN measurement was taken and the estimated economic optimum was applied. The result was that the average yield of the grids was no higher than it would have been had the average dose been applied to the entire field.
This is perhaps explained by two factors. First, SMN is not a reliable indicator of soil nitrogen supply and, second, SMN has far less influence on the optimum dose than previously supposed. A leading soil scientist told me a few years ago that SMN levels up to 100 kg N/ha were “merely noise”.
Another possible reason to adopt precision application of nitrogen to wheat could be variations in potential yield across the field. However, yield does not influence the demand for nitrogen fertiliser by as much as suggested in some advisory systems. This is simply because high-yielding crops use applied nitrogen and SMN more efficiently. If this is not so, how do you explain plot yields last year of 14 t/ha of wheat being achieved from 220 kg N/ha in NIAB TAG trials carried out on long-term arable soils receiving no organic amendments? It did seem from these particular trials that additional nitrogen was necessary for plot yields above 14 t/ha.
Finally, there is an argument for spatially applying nitrogen according to the canopy size of wheat. Looking back at the graph at the beginning of this blog, this would typically bring small rewards in yield although it would help ‘even up’ the crop. The reward would be more significant in very variable oilseed rape crops, where there is a good relationship between canopy size in the very early spring and the optimum economic dose of applied nitrogen.
You have to understand that the examples I have quoted above for wheat are for feed wheat and I have ignored the critically important issue of the environmental impact of nitrogen. It may be that there is a stronger argument for precision application to milling wheat. However, the understanding of nitrogen nutrition is still relatively rudimentary.
To make progress there needs to be further research at the more basic level. The trouble is that funders are more drawn to projects that offer quick fixes. As far as I can see there are no quick fixes in nitrogen nutrition that will provide the basis for precision application of nitrogen. However, there are those enthusiasts who continue to try to develop a system of precision application of nitrogen that will reap real rewards. I genuinely wish them well whilst reminding them that accurately predicting nitrogen doses for feed wheat to within a few kg N/ha using the current relatively simplistic recommendation systems is at the moment not possible. The only potential case I can foresee for precision application of nitrogen to feed wheat is perhaps where there are very extreme soil types within a field and/or if there was a way of predicting variations in the efficiency of soil and applied nitrogen use across a field.
Remember, the first rule of precision farming is to assess how much variation there is and then assess whether or not there is an economic advantage in spatially varying an operation in order to correct or minimise it. Good basic and applied science, correctly interpreted, has a significant role to play here.
Posted on 01/04/2016 by Jim Orson
March is the month of the Cambridge Science Festival and I attended a few lectures. The programme was less plant-orientated this year and one of the major themes was big data.
The lecture I attended on big data and medical science was superb. The lecturer was brilliantly modest, ensuring that his points were understood by an audience with a wide spectrum of knowledge. Put it this way: even I understood it!
First of all, he undermined some of the bunkum that goes with the term big data. As he said, it is data analysis. The issue is that so much data is now being generated in the medical world that it is becoming ever more challenging to analyse it and extract meaningful messages.
He warned that correlations in data do not mean causation. The well known example he quoted was the close link between ice cream sales and shark attacks in Australia. This is simply explained by the fact that there are more people in the sea on the hot days when ice cream sales are high. Hence, stopping ice cream sales will not prevent shark attacks.
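That example is easy to reproduce: a hidden confounder (temperature) drives two otherwise unrelated variables into a strong correlation. A quick simulation, with all numbers invented purely for illustration:

```python
import random

random.seed(1)

# Hot days (the confounder) drive both ice cream sales and the number of
# people swimming, and hence shark attacks; neither variable causes the other.
temps = [random.uniform(15, 40) for _ in range(500)]
ice_cream_sales = [2.0 * t + random.gauss(0, 5) for t in temps]
shark_attacks = [0.3 * t + random.gauss(0, 1) for t in temps]

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# A strong correlation appears with no causation in either direction.
print(f"r(ice cream, shark attacks) = {pearson(ice_cream_sales, shark_attacks):.2f}")
```

The correlation comes out well above 0.8, yet banning ice cream would do nothing for the sharks.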
Another example I can remember was when someone found a correlation between the fall in sparrow numbers in the UK and the rise in lead-free petrol sales. This amazingly led to suggestions that there should be an investigation into the environmental impact of the additives in lead-free petrol, but that proposal soon ran out of gas (pun intended).
In the medical world, there is a real danger of correlations leading to false conclusions that may impact on the lives of many people and so much of the lecture was based on this issue. There are some similarities in arable agriculture.
During the 1970s it was fashionable to say that high wheat yields were correlated with crops that had a high above-ground biomass at harvest. When you think about it, it is stating the b******* obvious. At harvest much of the biomass is in the grain and the rest is in the remaining above ground plant matter, notably straw. The two are related. Varieties tend to have a consistent harvest index, which is the proportion of total above-ground dry matter that is in the grain. Hence, inevitably high yields must be associated with high levels of biomass.
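The circularity is easy to see in numbers. Assuming a harvest index of around 0.5 (a typical figure for modern wheat varieties, used here purely for illustration), grain yield is by definition a fixed fraction of above-ground biomass, so the two must correlate:

```python
# With a constant harvest index (HI), grain yield is a fixed fraction of
# above-ground biomass, so yield and biomass correlate by construction.
HARVEST_INDEX = 0.5  # assumed; modern wheat varieties are typically near 0.5

def grain_yield(total_biomass_t_ha):
    """Grain yield (t/ha) implied by total above-ground biomass (t/ha)."""
    return HARVEST_INDEX * total_biomass_t_ha

for biomass in (14.0, 18.0, 22.0):
    print(f"biomass {biomass:4.1f} t/ha -> grain {grain_yield(biomass):4.1f} t/ha")
```

Plotting grain against biomass from such a relationship gives a perfect straight line, which tells you nothing about how to grow a bigger crop.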
The correlation between wheat grain yield and high-biomass crops was the basis of the Schleswig-Holstein system. This attempted to maintain a high biomass throughout the life of the crop, starting with high seed rates and continuing by encouraging high growth rates from early spring onwards. Also in the 1970s there was another approach called the Laloux system. This stemmed from research by the then Professor of Agronomy at the University of Gembloux in Belgium. It was based on modest seed rates (roughly what we use now) and a more measured programme of feeding and protecting the crop in the spring and early summer. I did trials comparing the two approaches and the Laloux system consistently outyielded the Schleswig-Holstein system. The striking observation of the trials was that by harvest the two systems gave similar-looking crops. Obviously the weather intervened and influenced final tiller numbers and biomass.
In the event, the then best UK practice was equal to or superior to the Laloux system. This was partly due to the fact that the Laloux system relied on late nitrogen applications. These are relevant to Belgium because of the regular incidence of thunderstorms in late May and June but not to the UK where it may be as dry as toast at that time of year.
Hence, whilst there is an obvious correlation between wheat biomass at harvest and wheat yield, it may not necessarily be pointing a way towards growing higher yielding crops. Obviously, things would be different if there was a correlation between yield and wheat biomass at a far earlier growth stage.
Last year proved that good standard UK practice can result in very high levels of biomass and yields provided that the weather is with us. Therefore, the approach must surely be to ensure that crops are able to take advantage of such conditions in order to achieve high levels of biomass at harvest without spending a shed load of money.
By the way, Gembloux is a very small town. I have visited the superb Department of Agronomy of the Agricultural University a few times and on occasions my wife came along too. The first time I suggested she looked around the town and we would meet up for lunch. I think it took her less than an hour to ‘do’ the town but the salads in the restaurant at the gates of the University made up for that.