Posted on 27/02/2014 by Jim Orson
Last week I went to the Defra offices in London for a meeting. Whilst walking to Cambridge railway station I was reminded of the changes that have occurred on that route over the past thirty years.
Halfway to the station I passed the former site of a large Ministry of Agriculture (MAFF) office. Like many MAFF offices, it had previously been a hospital for those wounded in World War II. The buildings were hastily thrown up and made desperately poor offices. I was working there when the then Minister of Agriculture, John Selwyn Gummer, visited and apologised for the working conditions. The site is now a very modern housing development.
Things have changed at the railway station as well. In 1984 it had two ticket windows; now it has six, plus self-service machines, although these are still not enough to cope with peak demand. Close to the station was an RHM mill that in its latter years milled Soissons wheat for baguette flour; the smell was glorious. That site is also now a very modern housing development.
There have been big changes in the places in London where I’ve attended meetings. In the 1980s, MAFF occupied three substantial buildings in London and the Department of the Environment occupied three tower blocks. As the joint department (Defra) they are now squeezed into one of the former MAFF buildings. Few have their own offices and most work in rooms resembling call centres.
Changes in arable agriculture over the last thirty years have been just as significant, if not more so. 1984 was the year of the great breakthrough in wheat yields, and over the following twenty years many farmers strove but failed to match that yield. In fact, the overall approach to wheat growing has changed little over the last thirty years. The big changes have been in the size of arable enterprises and in labour and machinery costs.
I was recently in one of the university libraries (this particular one has not physically changed at all over the last thirty years!) and read the annual Farming in the Eastern Counties reports based on the farm business surveys carried out by the University. Starting with the 1984 report, I worked forward to 2004. There was a national re-organisation of surveys in 2005 and the type and format of data in the reports were changed. However, the trends in labour and machinery costs between 1984 and 2004 were intriguing.
Labour and machinery costs remained remarkably stable over that period, despite the Consumer Price Index increasing by 85%. On mainly cereal farms in the eastern counties, the labour cost per hectare was £102 in 1984 and it was still £102 in 2004. Machinery costs were £199/ha in 1984 and £165/ha in 2004. The cost of the two together rose from £301/ha in 1984 to £345/ha in 1997 but, with the subsequent bad years, fell to £266/ha in 2004. This is not far from the target of £250/ha set by farm business and machinery advisers at the end of the last millennium.
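To put those flat nominal figures in real terms, a quick sketch (using only the numbers quoted above) deflates the 2004 costs back into 1984 pounds:

```python
# Deflate the 2004 labour and machinery costs (GBP/ha) back into 1984 pounds,
# using the 85% rise in the Consumer Price Index quoted above.
cpi_factor = 1.85  # 1984 -> 2004

costs_2004 = {"labour": 102, "machinery": 165, "combined": 266}

for item, nominal in costs_2004.items():
    real_1984 = nominal / cpi_factor
    print(f"{item}: £{nominal}/ha in 2004 is about £{real_1984:.0f}/ha in 1984 money")
```

In other words, the combined figure roughly halved in real terms over those twenty years.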
As far as I can ascertain from the more recent data, the costs have increased a little over the last few years as farmers have re-equipped. However, they still do not appear to exceed the joint total of 1984.
These cost trends reflect an enormous increase in efficiency based on the benefits of scale and access to good and reliable crop protection. The latter has enabled a labour- and machinery-efficient approach to the management of broad-acre crops, including non-plough tillage and the adoption of block cropping, without unnecessarily jeopardising the standard, or inflating the cost, of crop protection.
As I mentioned in a blog a few weeks back, this KISS (Keep It Simple Stupid) approach to broad-acre crop management is under threat from pesticide resistance, pesticide product withdrawals (due to higher registration requirements) and perhaps extreme weather patterns. In addition, the ‘three crop rule’ may well reduce the efficiency of labour and machinery use on some blocks of land, particularly those under short term tenancy and contract farming agreements.
So, as in all parts of our lives, things are not what they used to be. Change and adaptation will always be required.
Posted on 21/02/2014 by Jim Orson
Most people know what a CV is but not so many know what a CV% represents. It is the coefficient of variation and, crudely put, is a measure of the variation in the data from individual treatments in an experiment as a proportion of the mean of the individual treatments. Most research organisations use this as a guide to the reliability of the results of individual experiments.
In NIAB TAG we become concerned when the CV exceeds 5% in cereal field experiments and typically would reject results when the CV% rises significantly above this guideline.
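As a rough illustration of the statistic (a simplified sketch: in real trial analysis the CV% is derived from the residual mean square of the ANOVA, not a simple standard deviation), the calculation looks like this, with hypothetical plot yields:

```python
import statistics

def cv_percent(values):
    # Coefficient of variation: spread expressed as a percentage of the mean.
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate wheat plot yields (t/ha) for a single treatment
tight = [9.8, 10.1, 10.0, 9.9]   # little plot-to-plot variation
loose = [8.5, 10.9, 9.2, 11.4]   # large plot-to-plot variation

print(f"tight trial CV = {cv_percent(tight):.1f}%")  # comfortably under the 5% guideline
print(f"loose trial CV = {cv_percent(loose):.1f}%")  # well over it: results suspect
```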
The reason I mention all this is that I have just re-read the results from Defra Project NT26 because of the current debate over the role of additives/coatings for urea in reducing the loss of nitrogen due to ammonia emissions. NT26 suggests that there is a need to increase the dose of nitrogen supplied as urea by 20% to match the crop nitrogen uptake (not yields) achieved by ammonium nitrate. The difference is attributed to losses due to ammonia emissions, which can be reduced by the addition of a chemical that inhibits the soil urease enzymes responsible for converting urea into ammonia.
The bland data from the ten field trials in NT26 (nine in winter wheat and one in winter barley) identifying the optimum dose of nitrogen does indeed suggest that, on average, 20% more nitrogen in the form of urea is required to match the nitrogen uptake of ammonium nitrate at its optimum dose. However, the authors state that there is no significant difference in the nitrogen required to optimise winter cereal yields, or the protein content at optimum yields, whether it is applied as ammonium nitrate, urea, urea + the urease inhibitor Agrotain, or UAN. The question is why such a difference is not statistically significant. In my opinion, the reason is that the errors in the majority of trials are unacceptably high.
I spent a little time looking at the results of the four trials that produced the most accurate response curves, with CVs of around 5% or less. These confirm that urea is slightly less efficient in terms of crop uptake, but the difference is a lot less than 20%; more like 6-7%. This is a tiny database. The much larger database, comprising trials that ADAS carried out in the mid-1980s together with the NIAB TAG trials, suggests that the difference is even lower.
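The arithmetic linking a dose uplift to an implied loss is simple: if urea needs (1 + u) times as much nitrogen to match the uptake achieved with ammonium nitrate, the implied fraction of urea-N lost (e.g. as ammonia) is 1 − 1/(1 + u). A quick sketch using the figures above:

```python
# Implied fraction of urea-N lost, given the extra dose needed to match
# the crop uptake achieved with ammonium nitrate.
def implied_loss(uplift):
    return 1 - 1 / (1 + uplift)

for uplift in (0.20, 0.065):  # 20% (NT26 average) vs ~6.5% (the low-CV trials)
    print(f"dose uplift of {uplift:.1%} implies a loss of {implied_loss(uplift):.1%}")
```

So the headline 20% uplift implies losing about a sixth of the applied urea-N, whereas the low-CV trials imply a loss of only around 6%.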
So anyone can draw almost any conclusion that they wish from NT26. The evidence from the previous ADAS trials, the four trials in NT26 with CVs of around 5% or less and the NIAB TAG data suggests that there is little difference between ammonium nitrate and urea in achieving optimum yields in winter cereals, although protein levels can be a little lower where urea is used. Published data on urea + Agrotain is restricted to NT26 and so is thin on the ground, but it suggests that, in terms of crop uptake, it is equivalent to ammonium nitrate.
There is one specific concern about the evaluation of Agrotain in NT26. One reading of the field trials implies that around 8% more applied nitrogen in the form of urea + Agrotain is required to achieve optimum yields similar to ammonium nitrate and urea. However, the efficiency of nitrogen use is equivalent to ammonium nitrate because, at these optimum yields, the protein contents are higher with urea + Agrotain. This may reflect how the total dose of nitrogen was split in the trials. After the first 40 kg N/ha applied in March, the remainder of the total dose was split between an application at GS30/31 and one at GS32 of the wheat. Hence, particularly at the higher total doses, relatively large doses were being applied as late as GS32. This kind of split may not be entirely suitable for Agrotain because there is the implication in these trials that it slowed the release of nitrogen during the growth stages that most influence yield, while more was available later for protein formation.
So nitrogen fertiliser requirements still continue to confound everyone and, as with other projects, NT26 seems to raise as many questions as it answers. Who’d be a soil scientist?
Posted on 13/02/2014 by Jim Orson
Last week I heard a fascinating talk on the minimal (i.e. critical) level of phosphate needed in the soil to achieve optimum economic yields of broad-acre crops. RB209 has said for years that it should be Index 2 as measured by Olsen-P. Like many things, this has become an industry truism but every truism needs to be questioned from time to time. About ten years ago, I became aware that the need to retain phosphate at Index 2 for most crops was primarily based on research from just two trial locations and so it seems prudent to test this particular truism.
So hats off to the HGCA for funding a project to investigate the critical level of P for cereals and rape on a wider range of soil types under modern cropping systems. All the results are now available and they are fascinating. I would never have believed that phosphate nutrition was so interesting!
The results indicate that, in general, Index 2 has much wider relevance than just the two sites where the research was originally carried out. There appears, however, to be an important caveat.
It seems that the long-term availability of applied phosphate (triple-superphosphate was used in the HGCA-funded experiments) may be strongly influenced by the level of extractable calcium in the soil. In this series of experiments, the long-term availability of applied phosphate was around 25% on those soils with lower levels of calcium and only 10% on thin soils over chalk and limestone i.e. after 12 months, there was a significantly higher level of lock-up of applied phosphates on chalk and limestone based soils.
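A simple sketch of what those availability figures mean for application rates (the 10 kg/ha target is purely illustrative, not a figure from the trials):

```python
# How much phosphate must be applied so that a given amount is still
# available 12 months later, using the availabilities quoted above.
availability = {"low-calcium soil": 0.25, "chalk/limestone soil": 0.10}
target_kg_ha = 10.0  # hypothetical P still wanted a year after application

for soil, frac in availability.items():
    dose = target_kg_ha / frac
    print(f"{soil}: apply {dose:.0f} kg/ha for {target_kg_ha:.0f} kg/ha still available")
```

On the chalk/limestone figure, two and a half times as much phosphate has to be applied for the same residual availability, which is the arithmetic behind the case for fresh, little-and-often applications on those soils.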
Sufficient available phosphate is required particularly during crop establishment. All this suggests to me that annual applications of phosphate, either combine-drilled or applied to the seedbed, may be more economical than rotational applications on chalk and limestone based soils because the crops would benefit more from the freshly applied nutrient before it was ‘locked up’. A LINK project is suggesting that such a ‘targeted’ approach may avoid the need to go through the very expensive process of trying to build these soils up to Index 2. I may be getting ahead of myself with these conclusions and it will be particularly worthwhile for those farming on chalky or limestone soils to read the Project Reports when they appear on the HGCA website. The publication of the Critical P report is imminent and the LINK report will be published next year.
All this talk of the need for annual application of phosphate reminds me of my experiences in Australia. I remember reading one account of a Western Australian farmer who said, in 1951, that annual applications of this nutrient transformed the yield potential of his farm. I emailed Harm van Rees (what a fantastic name for a great Aussie consultant) who confirmed that the use in many parts of Australia of annual combine-drilled applications of phosphate is because of the high levels of free calcium in the soil.
There is a particular problem with phosphate availability in some parts of the Eyre Peninsula, west of Adelaide. These areas are on limestone, and it has been found that fluid fertilisers based on phosphoric acid provide higher yields than granular fertilisers. These fluid fertilisers are expensive and require specialist equipment, but they can be more economical; less phosphate is required to achieve higher yields.
It is interesting to note that in Australia the extreme problems of phosphate availability can occur on limestone soils. There is a hint in the HGCA Critical P results that the problem of phosphate ‘lock-up’ may be worse on limestone than on chalk soils.
As you may recognise, a lot more research has been done on phosphate application in Australia. A few years ago I spoke at a conference in Bendigo, Victoria, where another presentation described some of this great research effort. Novel techniques were being tested and one (totally tongue-in-cheek) approach was the application of 200 litres/ha of Coca-Cola. It certainly greened up the emerging crop because of its phosphoric acid content. However, based on the declared chemical content of Classic Coke, this treatment does not apply sufficient phosphorus, and it is expensive. On a practical point, the researcher added that it was essential that the Coke was flat, otherwise there were great problems in applying it! So whilst, in the context of phosphate nutrition, “things go better with Coke”, it cannot meet the claim that “it’s the real thing”.
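Out of curiosity, the back-of-envelope check is straightforward. Assuming roughly 17 mg of phosphorus per 100 ml for Classic Coke (a commonly quoted figure, and my assumption rather than a number from the talk):

```python
# Phosphorus applied by 200 litres/ha of Coca-Cola.
# Assumes ~17 mg P per 100 ml, i.e. 0.17 g/L (a commonly quoted figure;
# an assumption here, not a number from the original presentation).
p_g_per_litre = 0.17
rate_l_per_ha = 200

p_applied_g = p_g_per_litre * rate_l_per_ha
print(f"about {p_applied_g:.0f} g P/ha, i.e. {p_applied_g / 1000:.3f} kg P/ha")
```

Set against typical dressings measured in tens of kilograms of P2O5 per hectare, a few tens of grams of phosphorus is agronomically trivial, however green the crop looks.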
Posted on 05/02/2014 by Jim Orson
This is the 100th Orson’s Oracle and there is no better way to start it than by saying that for once I agree with one of Prince Charles’ pronouncements.
Last week, before an invited audience, he said: “With a barrage of sheer intimidation, we are told by powerful groups of deniers that the scientists must be wrong and we must abandon all our faith in so much overwhelming scientific evidence”. He was talking about climate change, but I am sure that those who support conventional farming and the regulated introduction of GM crops will fully support that sentiment, on the grounds that good science, correctly interpreted, should always be listened to and respected.
However, it is important to point out that excellent ‘laboratory’ or ‘theoretical’ science may not always provide the answers, and field testing is essential. That is why field testing of GMs is so important, but of course there are “powerful groups of deniers” who have used intimidatory measures to prevent these trials going ahead.
Field testing is essential because there are good examples where excellent and robust laboratory research and theories have not been delivered in the field. One example with which I became closely involved concerned spray application. Large spray droplets produced by conventional nozzles are not retained by a target plant and small ones will drift. Hence, it seems logical to produce a spray that is solely comprised of droplets that are neither too large nor too small. This is the basis of Controlled Droplet Application (CDA).
Tests with CDA in the laboratory were indeed very promising, with individual target plants having four times as much spray retained on their foliage when compared to a standard flat fan nozzle. So far, so good. However, in independently run field trials, CDA proved to be, at best, as good as conventional nozzles and often inferior. Naturally, there was a huge furore between those who were involved in the independent field trials and those who were selling CDA machines.
However, to the credit of some laboratory researchers, the reasons why the theory did not deliver in the field were identified. The droplets from the CDA machines used in the trials fell relatively slowly, under the influence of gravity, in an almost vertical trajectory. This meant that they were not good at penetrating a crop canopy. In addition, the pesticides were predominantly deposited on the horizontal parts of the target plant. Research subsequently proved that pesticides are far more effective if they are deposited on the vertical surfaces of target plants.
There were two other reasons for the disappointing results from CDA application. Although it was proven in field trials that on average there could be more pesticides on the target, the variation between the amounts of deposit on individual plants was far larger than with conventional nozzles. Finally, CDA involved very low volumes and some pesticides were too concentrated for optimum uptake by the plant. I witnessed trials where some herbicides just ‘shot-holed’ the leaves of otherwise unaffected susceptible weeds.
The next development in spray application was electrostatic sprayers. These also failed to deliver their theoretical advantages in the field because the charged pesticide spray stuck to the wrong part of the crop canopy and/or target plant for good activity. However, I wish to point out that I am not a spray application ‘denier’: hand-held CDA machines are being used very successfully in many parts of the world. This is particularly so with pesticides whose position on the plant has little impact on their efficacy, which can maintain their efficacy at very low volumes, and where there is no canopy to penetrate. You’ve guessed it: total weed control with glyphosate is a prime use for these machines.
One other theory which does not deliver in field trials is that the level of Soil Mineral Nitrogen (SMN) has a major influence on the optimum amount of bag nitrogen required by the crop. All the UK field trial databases show that this is not generally true for winter wheat, oilseed rape and sugar beet. I suspect that it is generally not true for the other arable crops either. In wheat, which has by far the largest field trials database, the optimum dose of bag nitrogen is only demonstrably reduced by levels of SMN in excess of RB209 N Index 3, i.e. in a minority of situations.
Nitrogen is a major and polluting input and we urgently need to know why SMN typically has so little influence on the optimum amount of bag N required. I may have just come across a clue in a paper by some Belgian soil scientists. In their research, the more efficiently the bag nitrogen was used by the wheat, the less N was taken up from the soil. So the wheat plots in this research were using SMN more as a back-up than as a primary source of this nutrient. This may be one of the reasons why differences in low-to-moderate SMN levels have no influence on the optimum amount of bag N required by the crop. David Jones, the manager of Morley Farms, suggested an analogy for this suspected preferential uptake of bag N by wheat: “Why drive down the road to buy a sandwich when your fridge is full of them?”
Posted on 29/01/2014 by Jim Orson
The quote “you can fool some of the people all of the time, and all of the people some of the time, but you cannot fool all of the people all of the time” is attributed to Abraham Lincoln. It came to mind when I watched the TV reports on the floods on the Somerset Levels. The Environment Agency has, according to the reports, spent a huge amount of money on a bird sanctuary but little on cleaning out the river. Apparently this, as well as cleaning out ditches, is bad for biodiversity. I suggest that three weeks under water is even worse for biodiversity.
I have admired Owen Paterson’s approach to GMs and badgers. He has looked at all the facts and taken a stand. Now he has to look at all the facts on the value of well maintained water channels, both in terms of getting water away and in terms of the impact of such maintenance on biodiversity.
Farmers in the flooded areas are reporting finding carcasses of wild animals, including badgers. I have to hand it to any burrowing animal that tries to establish itself in an area with such high water tables. I realise that the situation is “complicated”, to quote the Environment Agency, but allowing water channels to silt up, when previous generations had thought it necessary to keep them clear, was a ‘brave’ decision. I have been reading for years about farmers’ concerns on this issue. Some would regard farmers’ comments as biased, but it also has to be accepted that comments from single issue pressure groups are perhaps more biased because, unlike farmers, they do not feel obliged to weigh up all the issues.
It is so easy for single issue pressure groups to influence public opinion and decision makers. They tend to ignore the downsides and concentrate on the “cuddly” view of nature. This puts pressure on the media. Everyone with the full benefit of all the facts knows that badgers are not cuddly. They prey on a range of wildlife. Those with less than a full knowledge, including some of the media, can be influenced into thinking that these “cuddly” creatures would not do such a thing. This could explain why in a BBC programme on a hedgehog sanctuary a comment that badgers are a major predator was allegedly edited out.
The same applies to articles on hedgehogs in the Sunday Times. Not once is the predatory nature of the badger mentioned, and one of the reasons given for the decline in the hedgehog is, according to this paper, (inevitably) pesticides! The same paper has just published an article under the banner headline ‘Farmageddon’ which again, perhaps in ignorance, has misquoted the countryside survey data on hedgerows. Apparently, 16,000 miles of hedgerow disappeared between 1998 and 2007. It forgets to mention that much of this is due to farmers, for environmental reasons, letting hedges grow above a certain size, beyond which they are no longer defined as hedges. Nor was it mentioned that there are huge losses of hedgerows associated with new roads and other non-farming developments. I personally know of no hedge removal by farmers during this period, but I do know of many hedges that have been planted, although these may not yet have grown sufficiently tall to be classified as hedges in the survey.
Sometimes, the comments of single issue groups have a sinister edge implying that we all have to change our lifestyle and dietary habits in order to fit in with their own very narrow agendas.
There is little doubt that agriculture has environmental issues. This is inevitable because natural vegetation has to be destroyed to grow food. Getting the balance right is proving very difficult to achieve and will become even more difficult as the pressure of an increasing and more affluent population grows. Decisions should be based on all the facts and not just the views of single issue pressure groups. Life is more complicated than that. It should also be highlighted that farmers are not the only ones to “meddle” in nature. Protecting badgers has led to a reduction in biodiversity.