NIAB - National Institute of Agricultural Botany

Orson's Oracle

Winning is everything?

Posted on 10/03/2014 by Jim Orson

Sainsbury’s was selling snow shovels at a 25% discount today so I have to assume that winter is over. The rush on the land is accelerating and as always it will be a busy spring season with multiple passes of fertilisers and pesticides in each field.

I’m developing the view that the current debate about funding elite sport is an allegory for the approach we take to crop management. It can be argued that elite sport and arable agriculture are both aimed specifically at being successful against international competition rather than establishing a sounder (I am trying to avoid the word ‘sustainable’) base.

The debate in sport is about basketball losing its funding from UK Sport because it’s unlikely to get medals at the 2016 and 2020 Olympics. This is despite the GB basketball team losing to the eventual bronze medallists by the narrowest of margins in London 2012. This decision is in contrast to the increase in funding for some less inclusive and less popular sports.  The question being asked is whether elite sport is just about Olympic medals. Basketball is much more popular than some of the funded elite sports, particularly so in the inner cities where it can transform the lives of teenagers.

The equivalent question in arable agriculture is whether it is all about increasing yields and high short-term financial margins. Certainly projects or initiatives to increase yields attract a lot of attention and are followed with great interest in the press. I still clearly remember a lecture in the late 1970s at which a plant physiologist explained the path towards higher cereal yields. It was all the more pleasurable because I had been in a university cross-country team with him and it was good to catch up, in more than one way, as he always beat me! During our conversation he said that the then current developments would lead to higher wheat yields and it was only a few years later, in 1984, that we achieved yields that were far higher than those of our dreams a few years earlier.

This was all part of a process that took the UK from being a producer of just over four million tonnes of wheat in 1970 to not only supplying our home market, but also exporting over four million tonnes by 1990.  This remarkable growth in production was a combination of both a significant increase in the area of wheat and a significant increase in yields.

However, as I have written before, there are now no imminent developments that will lead to significantly higher on-farm yields. It is more likely, in the short term at least, that we may not be able to maintain such high levels of cereal production partly because many farmers are having to replace some winter wheat with spring cereals. The main reasons for this are pesticide resistance, removal of pesticides from the market and more regularly occurring extreme weather events. It seems that our quest to optimise margins in the past has resulted in pesticide resistance that is compromising our current productivity.

A further reminder of this issue was a discussion that I had with some leading researchers last week. I was hoping that they would tell me that we should not be concerned about fungicide resistance because we would soon get disease resistant cereal varieties. However, they said that this is unlikely and that what the industry should be doing is reducing the number of fungicide applications in wheat to two in the season. This could be achieved partly by choosing varieties with reasonable all-round disease resistance.

However, this approach is rather pie-in-the-sky in the cold reality of having to survive financially. It is the crux of the problem; in order to compete internationally UK farmers have been forced, almost knowingly, into practices that they suspected would lead to problems.  What happened to Atlantis is a prime example. The threat of black-grass resistance was recognised before its commercial introduction and warnings of over-reliance on it were hard to avoid. However, because of a lack of alternatives for the control of high weed populations, the inevitable occurred.

So elite sport funding and arable farming are very similar in the respect that competing successfully against international competition is essential to keep the coffers flowing. The difference is that in the recent past, in order for arable farming to survive financially, we may have reduced our ability to compete in the future.

Interestingly, the same has occurred for some of our international competitors and so the net result is likely to be felt in the consumers’ pockets. European producers are particularly hampered by the new pesticide registration requirements and the rather smug view of many politicians and bureaucrats that food supplies will be easily available without the aid of new technologies, such as genetic modification (GM). There have also been publicly expressed concerns over other new scientific developments that offer increased productivity in the medium and long-term. It seems that there is a part of our society that wants science taken out of agriculture. The net result could easily be increases in food prices that will dwarf any potential savings on snow shovels.




Posted on 27/02/2014 by Jim Orson

Last week I went to the Defra offices in London for a meeting. Whilst walking to Cambridge railway station I was reminded of the changes that have occurred on that route over the past thirty years.

Halfway to the station I passed the former site of a large and extensive Ministry of Agriculture (MAFF) office. Like many MAFF offices it was previously a hospital for those wounded in World War II. The buildings were thrown up and made desperately poor offices. I was working there when the then Minister of Agriculture, John Selwyn Gummer visited and apologised for the working conditions. The site is now a very modern housing development.

Things have changed at the railway station as well. In 1984 it had two ticket windows and now it has six plus self-service machines, although these are still not enough to cope with peak demand. Close to the station was an RHM mill that in its latter years milled Soissons wheat for baguette flour; the smell was glorious. That site is also now a very modern housing development.

There have been big changes in the places in London where I’ve attended meetings. In the 1980s, MAFF occupied three substantial buildings in London and the Department of the Environment occupied three tower blocks. As the joint department (Defra) they are now squeezed into one of the former MAFF buildings. Few have their own offices and most work in rooms resembling call centres.

Changes in arable agriculture over the last thirty years have been just as significant, if not more so. 1984 was the year of the great breakthrough in wheat yields and over the following twenty years many farmers strove but failed to match that yield. In fact, the overall approach to wheat growing has changed little over the last thirty years. The big changes have been in the size of arable enterprises and in labour and machinery costs.

I was recently in one of the university libraries (this particular one has not physically changed at all over the last thirty years!) and read the annual Farming in the Eastern Counties reports based on the farm business surveys carried out by the University. Starting with the 1984 report, I worked forward to 2004. There was a national re-organisation of surveys in 2005 and the type and format of data in the reports were changed. However, the trends in labour and machinery costs between 1984 and 2004 were intriguing.

Labour and machinery costs remained remarkably stable over that period, despite the Consumer Price Index increasing by 85%. On mainly cereal farms in the eastern counties, the labour cost per hectare was £102 in 1984 and it was still £102 in 2004. Machinery costs were £199/ha in 1984 and £165/ha in 2004. The cost of the two together rose from £301/ha in 1984 to £345/ha in 1997 but, with the subsequent bad years, fell to £266/ha in 2004. This is not far from the target of £250/ha set by farm business and machinery advisers at the end of the last millennium.
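A back-of-envelope check, using only the figures quoted above, shows just how large the fall in real-terms costs was:

```python
# Nominal stability against an 85% CPI rise implies a large real-terms fall.
cpi_factor = 1.85                     # CPI in 2004 relative to 1984
cost_1984 = 301                       # £/ha, labour + machinery, 1984 prices
cost_2004 = 266                       # £/ha, labour + machinery, 2004 prices

cost_1984_in_2004_money = cost_1984 * cpi_factor   # what £301/ha would be in 2004 money
real_fall = 1 - cost_2004 / cost_1984_in_2004_money

print(f"1984 cost in 2004 money: £{cost_1984_in_2004_money:.0f}/ha")
print(f"Real-terms fall: {real_fall:.0%}")
```

In other words, holding these costs flat in cash terms for twenty years amounts to roughly halving them in real terms.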

As far as I can ascertain from the more recent data, the costs have increased a little over the last few years as farmers have re-equipped. However, they still do not appear to exceed the joint total of 1984.    

These cost trends reflect an enormous increase in efficiency based on the benefits of scale and access to good and reliable crop protection. The latter has enabled a labour- and machinery-efficient approach to the management of broad-acre crops, including non-plough tillage and the adoption of block cropping, without unnecessarily compromising either the standard or the cost of crop protection.

As I mentioned in a blog a few weeks back, this KISS (Keep It Simple Stupid) approach to broad-acre crop management is under threat from pesticide resistance, pesticide product withdrawals (due to higher registration requirements) and perhaps extreme weather patterns. In addition, the ‘three crop rule’ may well reduce the efficiency of labour and machinery use on some blocks of land, particularly those under short term tenancy and contract farming agreements.

So as in all parts of our lives things are not what they used to be. Change and adaptation will always be required.



Up in the air?

Posted on 21/02/2014 by Jim Orson

Most people know what a CV is but not so many know what a CV% represents. It is the coefficient of variation and, crudely put, is a measure of the variation in the data from individual treatments in an experiment as a proportion of the mean of the individual treatments. Most research organisations use this as a guide to the reliability of the results of individual experiments.

In NIAB TAG we become concerned when the CV exceeds 5% in cereal field experiments and typically would reject results when the CV% rises significantly above this guideline.
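As a minimal sketch (with made-up plot yields, not NIAB TAG data), the calculation behind that guideline is simply the standard deviation of the replicate values expressed as a percentage of their mean:

```python
import statistics

def cv_percent(values):
    """Sample standard deviation as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Hypothetical yields (t/ha) from four replicate plots of one treatment
plots = [9.8, 10.2, 10.1, 9.9]
print(f"CV = {cv_percent(plots):.1f}%")   # well under the 5% guideline
```

In a real trial the CV is calculated from the residual variation of the whole experiment rather than a single treatment, but the principle is the same: the noisier the plots relative to the mean, the less the results can be trusted.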

The reason I mention all this is that I have just re-read the results from Defra Project NT26 because of the current debate over the role of additives/coatings for urea in reducing the loss of nitrogen due to ammonia emissions. NT26 suggests that there’s a need to increase the dose of nitrogen supplied as urea by 20% to match the crop nitrogen uptake (not yields) achieved by ammonium nitrate. The difference is attributed to losses due to ammonia emissions, which can be reduced by the addition of a chemical that inhibits the soil urease enzymes responsible for converting urea into ammonia.

The bland data from the ten field trials in NT26 (nine of which were in winter wheat and one in winter barley) identifying the optimum dose of nitrogen do indeed suggest that, on average, 20% more nitrogen in the form of urea is required to match the nitrogen uptake of ammonium nitrate at its optimum dose. However, the authors state that there is no significant difference in the nitrogen required to optimise winter cereal yields, or the protein content at optimum yields, whether it is applied as ammonium nitrate, urea, urea + the urease inhibitor Agrotain or UAN. The question is why such a difference is not statistically significant. In my opinion, the reason is that the errors in the majority of trials are unacceptably high.
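To illustrate why high errors can swallow a 20% effect (using assumed round numbers, not NT26 data), the least significant difference between two treatment means grows in proportion to the CV:

```python
import math

# Illustration with assumed numbers, not NT26 data: how large a difference
# between two treatment means must be before it is statistically significant.
cv = 12.0       # CV%, of the order seen in noisier trials (assumption)
mean = 200      # kg N/ha, a plausible optimum dose (assumption)
reps = 3        # replicates per treatment (assumption)
t_5pct = 2.1    # approximate 5% t-value for moderate residual d.f. (assumption)

se_mean = (cv / 100 * mean) / math.sqrt(reps)
lsd = t_5pct * math.sqrt(2) * se_mean   # least significant difference

print(f"LSD ≈ {lsd:.0f} kg N/ha vs a 20% effect of {0.2 * mean:.0f} kg N/ha")
```

On these assumed figures the LSD slightly exceeds the 20% effect itself, so even a real difference of that size would not register as statistically significant.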

I spent a little time looking at the results of the four trials that produced the most accurate response curves, with CVs of around 5% or less. These confirm that urea is slightly less efficient in crop uptake but the difference is a lot less than 20%; more like 6-7%. This is a tiny database. The much larger database, comprising the trials that ADAS carried out in the mid-1980s and the NIAB TAG trials, suggests that the difference is even lower.

So anyone can draw almost any conclusion that they wish from NT26. The evidence from previous ADAS trials, the four trials in NT26 with CVs of around 5% or less and the NIAB TAG data suggests that there is little difference between ammonium nitrate and urea in achieving optimum yields in winter cereals, but the protein levels can be a little down where urea is used. Published data on urea + Agrotain is restricted to NT26 and so is thin on the ground, but it suggests that, in terms of crop uptake, it is equivalent to ammonium nitrate.

There is one specific concern about the evaluation of Agrotain in NT26. One reading of the field trials implies that around 8% more applied nitrogen in the form of urea + Agrotain is required to get similar optimum yields to ammonium nitrate and urea. However, the efficiency of use of nitrogen is equivalent to ammonium nitrate because at these optimum yields, the protein contents are higher with urea + Agrotain. This may reflect how the total dose of nitrogen was split in the trials. After the first 40 kg N/ha applied in March, the remainder of the total dose was split between an application at GS30/31 and at GS32 of the wheat. Hence, particularly with the overall higher doses, relatively large doses were being applied as late as GS32. This kind of split may not be entirely suitable for Agrotain because there is the implication in these trials that it slowed the release of nitrogen during the growth stages that most influence yields but more was available later for protein formation.

So nitrogen fertiliser requirements still continue to confound everyone and, as with other projects, NT26 seems to raise as many questions as it answers. Who’d be a soil scientist?



Things go better with Coke

Posted on 13/02/2014 by Jim Orson

Last week I heard a fascinating talk on the minimal (i.e. critical) level of phosphate needed in the soil to achieve optimum economic yields of broad-acre crops. RB209 has said for years that it should be Index 2 as measured by Olsen-P. Like many things, this has become an industry truism but every truism needs to be questioned from time to time. About ten years ago, I became aware that the need to retain phosphate at Index 2 for most crops was primarily based on research from just two trial locations and so it seems prudent to test this particular truism.

So hats off to the HGCA for funding a project to investigate the critical level of P for cereals and rape on a wider range of soil types under modern cropping systems. All the results are now available and they are fascinating. I would never have believed that phosphate nutrition was so interesting!

The results indicate that, in general, Index 2 has much wider relevance than just the two sites where the research was originally carried out. There appears, however, to be an important caveat.

It seems that the long-term availability of applied phosphate (triple-superphosphate was used in the HGCA-funded experiments) may be strongly influenced by the level of extractable calcium in the soil. In this series of experiments, the long-term availability of applied phosphate was around 25% on those soils with lower levels of calcium and only 10% on thin soils over chalk and limestone i.e. after 12 months, there was a significantly higher level of lock-up of applied phosphates on chalk and limestone based soils.
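A quick worked example of what those recovery figures imply (the 20 kg P/ha target is my own illustrative number, not one from the project):

```python
# Illustration of the recovery figures above; the target quantity is an
# arbitrary example, not a recommendation.
def p_needed(target_available_kg_ha, recovery):
    """Applied P required to leave a given amount available long term."""
    return target_available_kg_ha / recovery

target = 20  # kg P/ha still available after 12 months (illustrative)
print(p_needed(target, 0.25))  # lower-calcium soils: 80 kg P/ha applied
print(p_needed(target, 0.10))  # chalk/limestone soils: 200 kg P/ha applied
```

On this arithmetic, a chalk or limestone soil needs two and a half times as much applied phosphate to leave the same amount available a year on, which is why the caveat matters.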

Sufficient available phosphate is required in particular during crop establishment. All this suggests to me that annual applications of phosphates either combined-drilled or applied to the seedbed may be more economical than rotational applications on chalk and limestone based soils because the crops would benefit more from the freshly applied nutrient before it was ‘locked-up’. A LINK project is suggesting that such a ‘targeted’ approach may avoid the need to go through the very expensive process of trying to build these soils up to index 2. I may be getting ahead of myself with these conclusions and it will be particularly worthwhile for those working on chalky or limestone soils to read the Project Reports when they appear on the HGCA website. The publication of the Critical P report is imminent and the LINK report will be published next year.

All this talk of the need for annual applications of phosphate reminds me of my experiences in Australia. I remember reading one account of a Western Australian farmer who said, in 1951, that annual applications of this nutrient transformed the yield potential of his farm. I emailed Harm van Rees (what a fantastic name for a great Aussie consultant) who confirmed that the use in many parts of Australia of annual combine-drilled applications of phosphate is because of the high levels of free calcium in the soil.

There is a particular problem with phosphate availability in some parts of the Eyre Peninsula, just west of Adelaide. These particular areas are on limestone and farmers there have found that fluid fertilisers based on phosphoric acid provide higher yields than granular fertilisers. These fluid fertilisers are expensive and require specialist equipment but they can be more economical; less phosphate is required to achieve higher yields.

It is interesting to note that in Australia the extreme problems of phosphate availability can occur on limestone soils. There is a hint in the HGCA Critical P results that the problem of phosphate ‘lock-up’ may be worse on limestone than on chalk soils.

As you may recognise, there has been a lot more research done on phosphate application in Australia. A few years ago I spoke at a conference in Bendigo, Victoria where another presentation described some of this great research effort. Novel techniques were being tested and one (totally tongue in cheek) approach was the application of 200 litres/ha of Coca-Cola. It certainly greened-up the emerging crop because of its phosphoric acid content. Based on the declared chemical content of Classic Coke, this treatment does not appear to apply sufficient phosphorus and it is expensive. On a practical point, the researcher added that it was essential that the Coke was flat, otherwise there were great problems in applying it!  So whilst in the context of phosphate nutrition “things go better with Coke”, it cannot meet the claim that “it’s the real thing”.
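The arithmetic behind that remark is easy to check. A commonly quoted figure for Classic Coke is roughly 0.17 g of phosphorus per litre; that number is my assumption, not one from the presentation:

```python
# Back-of-envelope check; the P concentration is an assumed figure, not
# taken from the conference presentation.
p_conc_g_per_l = 0.17   # g phosphorus per litre of Coke (assumption)
dose_l_per_ha = 200     # application rate quoted in the talk

p_applied_kg_ha = p_conc_g_per_l * dose_l_per_ha / 1000
print(f"{p_applied_kg_ha:.3f} kg P/ha")  # trivial beside typical fertiliser rates
```

A few tens of grams of phosphorus per hectare is three orders of magnitude short of a meaningful fertiliser dose, which explains the greening without the yield.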



Royal approval

Posted on 05/02/2014 by Jim Orson

This is the 100th Orson’s Oracle and there is no better way to start it than by saying that for once I agree with one of Prince Charles’ pronouncements.  

Last week he said before an invited audience “With a barrage of sheer intimidation, we are told by powerful groups of deniers that the scientists must be wrong and we must abandon all our faith in so much overwhelming scientific evidence”.  He was talking about climate change but I am sure that those who support conventional farming and the regulated introduction of GM crops will fully support that sentiment on the grounds that good science, correctly interpreted, should always be listened to and respected.

However, it is important to point out that excellent ‘laboratory’ or ‘theoretical’ science may not always provide the answers and field testing is essential. That is why field testing of GMs is so important, but of course there are “powerful groups of deniers” who have used intimidatory measures to prevent these going ahead.

Field testing is essential because there are good examples where excellent and robust laboratory research and theories have not been delivered in the field. One example with which I became closely involved concerned spray application. Large spray droplets produced by conventional nozzles are not retained by a target plant and small ones will drift. Hence, it seems logical to produce a spray that is solely comprised of droplets that are neither too large nor too small. This is the basis of Controlled Droplet Application (CDA).

Tests with CDA in the laboratory were indeed very promising, with individual target plants having four times as much spray retained on their foliage when compared to a standard flat fan nozzle. So far, so good. However, in independently run field trials, CDA proved to be, at best, as good as conventional nozzles and often inferior. Naturally, there was a huge furore between those who were involved in the independent field trials and those who were selling CDA machines.

However, to the credit of some laboratory researchers, the reasons why the theory was not delivered in the field were identified. The droplets from the CDA machines used in the trials fell relatively slowly downwards under the influence of gravity in an almost vertical trajectory. This meant that they were not good at penetrating a crop canopy. In addition, the pesticides were predominantly deposited on the horizontal parts of the target plant. Research subsequently proved that pesticides are far more effective if they are deposited on the vertical surfaces of target plants.

There were two other reasons for the disappointing results from CDA application. Although it was proven in field trials that on average there could be more pesticides on the target, the variation between the amounts of deposit on individual plants was far larger than with conventional nozzles. Finally, CDA involved very low volumes and some pesticides were too concentrated for optimum uptake by the plant. I witnessed trials where some herbicides just ‘shot-holed’ the leaves of otherwise unaffected susceptible weeds.

The next development in spray application was electrostatic sprayers. These also failed to deliver their theoretical advantages in the field because the charged pesticide spray stuck to the wrong part of the crop canopy and/or target plant for good activity. However, I wish to point out that I am not a spray application ‘denier’, and hand-held CDA machines are being used very successfully in many parts of the world. This is particularly so with pesticides whose position on the plant has little impact on their efficacy, which maintain their efficacy at very low volumes, and where there is no canopy to penetrate. You’ve guessed it; total weed control with glyphosate is a prime use for these machines.

One other theory which does not deliver in field trials is that the level of Soil Mineral Nitrogen (SMN) has a major influence on the optimum amount of bag nitrogen required by the crop. All the UK field trial databases show that this is not generally true for winter wheat, oilseed rape and sugar beet. I suspect that it is also generally not true for the other arable crops as well. In wheat, which has by far the largest field trials database, the optimum dose of bag nitrogen is only demonstrably reduced by levels of SMN in excess of the RB 209 N Index 3; i.e. in the minority of situations.

Nitrogen is a major and polluting input and we urgently need to know why SMN typically has so little influence on the optimum amount of bag N required. I may have just come across a clue in a paper by some Belgian soil scientists. In their research, the more efficiently the bag nitrogen was used by wheat, the less N was taken up from the soil. So the wheat plots in this research were using SMN more as a back-up than as a primary source of this nutrient. This may be one of the reasons why differences in low-to-moderate SMN levels have no influence on the optimum amount of bag N required by the crop. David Jones, the manager of Morley Farms, suggested an analogy for this suspected preferential uptake of bag N by wheat: “why drive down the road to buy a sandwich when your fridge is full of them?”


