NIAB - National Institute of Agricultural Botany

Orson's Oracle

Micronutrients fail the ‘common sense’ test

Posted on 02/10/2013 by Jim Orson

Micronutrient treatments are often relatively cheap in terms of cost/ha, though not necessarily in terms of cost/kg of the nutrient(s) in question. Even so, treating the whole farm with them will often cost more than a good holiday, even after tax.

There is tremendous pressure to use them but limited independent evidence of their value. The farming press has printed articles over the last few months suggesting that they are of great value but the evidence put forward is compromised by the fact that they have been applied together with extra nitrogen or extra fungicides or both. Who is to know where the claimed extra yield comes from? So it’s good news for the industry that HGCA has recently published the results of a research project on the value of three micronutrients (copper, zinc and manganese) when applied to wheat (Current status of soils and responsiveness of wheat to micronutrient applications – Project Report no 518).

Part of the continuing interest in micronutrients is the fear that despite the fact that annual uptake is low, their reserves in the soil may be falling to or below critical levels to support crop production. So, quite sensibly, this project included analysing 132 soil samples and comparing the micronutrient levels in the soil to those in a survey carried out 30 years ago. The results indicate that levels have fallen marginally but overall the changes are not likely to be biologically significant.

Another issue frequently raised when discussing micronutrients is that yields are now higher than when much of the original research on the responsiveness of wheat to their application was carried out. Hence, the research team that carried out the project did a total of 15 experiments over three years in order to measure the potential benefit of micronutrients in the context of today’s wheat management and yields. Sites were on soils that had a risk of micronutrient deficiency: light sandy soils, soils high in organic matter and calcareous soils.

The problem came when interpreting yield responses because, whether positive or negative, they typically fell below the level of statistical significance. However, the yield response required to pay for a micronutrient application is typically far smaller than the response needed to reach statistical significance. This is an age-old issue and was the subject of my blog on 22nd March this year (Can you take the risk of not using it?).

In this situation, you have to be pragmatic and apply common sense by looking at all the results to see if there is a consistent trend. If a very high proportion of the trials showed a positive, albeit non-significant, yield response that was sufficiently large to more than pay for the micronutrient, then personally I would accept that its application is economically viable.
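To make that test concrete, here is a minimal sketch, in Python, of how one might tally trial responses against the break-even yield. Every figure in it is a hypothetical assumption for illustration — the wheat price, the treatment cost and the per-trial responses are mine, not numbers from the HGCA report.

```python
# A minimal sketch of the 'common sense' test: tally how many trials show
# a yield response large enough to pay for the micronutrient application.
# Every figure below is a hypothetical assumption, not data from the report.

wheat_price = 150.0     # GBP/tonne (assumed)
treatment_cost = 6.0    # GBP/ha (assumed)

# Break-even yield response (t/ha) needed to cover the treatment cost
break_even = treatment_cost / wheat_price  # 0.04 t/ha with these figures

# Hypothetical per-trial yield responses (t/ha), treated minus untreated
responses = [0.10, -0.05, 0.02, 0.08, -0.12, 0.01, 0.05, -0.03,
             0.00, 0.04, -0.07, 0.09, 0.03, -0.02, 0.06]

paying = sum(1 for r in responses if r > break_even)
negative = sum(1 for r in responses if r < 0)

print(f"Break-even response: {break_even:.3f} t/ha")
print(f"Trials more than paying for treatment: {paying} of {len(responses)}")
print(f"Trials with a negative response: {negative} of {len(responses)}")
```

A consistent pattern — say, 13 or 14 of the 15 trials clearing break-even — would pass the 'common sense' test; a roughly even scatter of small positives and negatives, as the report actually found, would not.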

However, the results of this project by no means reflect such a trend. For instance, at face value, the response to copper was at or below 0.1 t/ha in 12 of the 15 trials, with five trials indicating a negative yield response. The equivalent figures for manganese were nine out of 15 trials at or below 0.1 t/ha, with seven trials indicating a negative yield response; for zinc, eight out of 15 trials and six trials respectively. This suggests that, in general, the small and non-statistically significant differences in yield from the untreated plots were due to random error rather than to the application of micronutrients. So, with the best will in the world, the results suggest that there is no ‘common sense’ evidence that these micronutrients should be used as a matter of course, even on the soil types where yield responses are most likely to occur.

There were, however, two statistically significant results. A very large response to copper application was recorded on one site, where an economic response could reasonably have been expected from a soil analysis. There was also a statistically significant response to zinc on one site, which was likewise supported by a soil analysis but, in this case, not by a leaf analysis. In fact, overall the project indicated that leaf sampling in the spring was of no value for determining the need for an application of copper or zinc.

The real surprise was the lack of statistically significant yield responses to manganese. This led the researchers to say that it “has had attention disproportionate to its yield benefits”. However, it would be a brave person who did not treat a wheat crop showing deficiency symptoms, as deficiency might increase susceptibility to disease and pesticide damage. On the other hand, the results provide food for thought for those who are fond of multiple applications in the absence of symptoms.

So there you have it. Even on soil types where these micronutrient deficiencies are most likely to occur, the ‘common sense’ test, as well as the statistical tests, suggests that it is uneconomic to apply these micronutrients to modern high-yielding wheat as a matter of course. Soil analysis needs to be carried out to determine the probability of a response to zinc and copper, and the project identified critical values in the soil.

This is an important project but there are, of course, other sources of trials information on the value of micronutrients. These have been reviewed for HGCA and will be the subject of another blog in the near future.


Wheat harvest post-mortem

Posted on 23/09/2013 by Jim Orson

To tell you the truth, I’m not absolutely sure if my prediction for wheat yields was correct. If you remember, I predicted that “overall, timely sown wheat crops will not yield above average”. In a following blog I said that good yields may be achieved in areas where solar radiation levels during grain fill are above average and moisture supply is not too restricted. In particular I mentioned the NE of England above the Humber.

First of all, it doesn’t take a genius to predict average yields! Yields from year to year are amazingly stable in the UK when compared to many parts of the world.

It’s clear that moisture supply has had a major influence on yield this year. This is reflected by some high yields having been achieved in areas such as the more northerly parts of England and in the South West, both of which received higher amounts of rainfall in May and June.

However, in general, yields appear to have been limited by a lack of soil water. In particular, yields on the less moisture-retentive soils have suffered. A clue that moisture was limiting yields is that, in trials and in practice, there was a large benefit from drilling in early September rather than in early October.

Significantly higher yields from early drilling tend to occur when the soil water availability in the following spring and summer is limiting. However, this may not have been such a reliable guide this last season because of the wet soils in early October and the prolonged and cold winter. On the other hand, the high protein levels this harvest also suggest that potential yields have been restricted by a lack of water. 

It’s true to say that wheat yields overall were better than I’d feared and at the top of my forecast. It may be that the long cool spring resulted in a higher than average contribution to yields from reserves of water-soluble carbohydrates. Some scientific papers associate yield with the total solar radiation intercepted over the lifetime of the crop, the most important months being those with full crop cover, i.e. from April onwards. This year the levels of sunshine from April to mid-July were above average. I’m digging deeper into this aspect and will write another blog on the significance of growth before anthesis on final grain yield.
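For readers who want to see the shape of that relationship, here is a rough sketch of the radiation-interception view of yield: biomass as intercepted radiation multiplied by a radiation-use efficiency, and grain as biomass multiplied by a harvest index. Every value in it is an assumption I have picked to be broadly plausible, not data from this season.

```python
# Sketch of the radiation-interception view of yield: biomass is roughly
# intercepted PAR x radiation-use efficiency, and grain is biomass x
# harvest index. All values are rough assumptions for illustration.

par_mj_per_m2_day = 8.0      # assumed mean daily PAR during full cover (MJ/m2)
days_full_cover = 100        # assumed, roughly April to mid-July
fraction_intercepted = 0.9   # assumed canopy interception at full cover
rue_g_per_mj = 2.8           # assumed radiation-use efficiency (g DM/MJ PAR)
harvest_index = 0.5          # assumed fraction of biomass ending up as grain

intercepted = par_mj_per_m2_day * days_full_cover * fraction_intercepted
biomass_g_m2 = intercepted * rue_g_per_mj
grain_t_ha = biomass_g_m2 * harvest_index / 100.0  # g/m2 -> t/ha

print(f"Intercepted PAR: {intercepted:.0f} MJ/m2")
print(f"Grain yield estimate: {grain_t_ha:.1f} t/ha")
```

With these made-up inputs the estimate comes out at around 10 t/ha, which is why a sunny grain-fill period, longer full crop cover or a deeper canopy all push yield in the same direction.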

The cool conditions around flowering were also good for yield. I’ve always wondered why hot weather around flowering is bad for yields; a PhD project at the Waite Institute in Adelaide has provided me with an explanation. In this project, flowering field wheat plots were exposed to just three hours at 35C, compared with field temperatures at the time of up to 20C. This resulted in grain yield reductions of between 18 and 30% depending on variety.

These yield reductions were due to lower leaf chlorophyll content and earlier leaf senescence. In addition, the heat reduced grain numbers by decreasing pollen viability. Whilst I recognise that we don’t experience 35C, the French become concerned when average temperatures exceed 25C during flowering.

In addition, research at the University of Reading suggests that the very high temperatures we experienced towards the end of grain fill may not have damaged yields as much as we feared at the time.

This is my first shot at reviewing wheat yields; I will return to the subject when I have ‘dug deeper’. However, it’s pleasing that in a year with just average levels of solar radiation during grain fill in many areas, the potential for good yields was established. Last year we had more than adequate rain but not enough sunshine and this year we had reasonable sunshine but not enough rain.

Looking back, 2008 had a better balance between the two, with slightly above average solar radiation and average rainfall. 2008 was a record year for wheat yields, which would have been even higher were it not for such a rotten harvest. So we continue to wait for the perfect year.


Is farming advice too compartmentalised?

Posted on 12/09/2013 by Jim Orson

There has been a bit of a gap since my last blog because we were involved in a car accident when on holiday in France. This resulted in a week’s stay in a local French hospital and then a couple of days in a UK one. It was interesting to note the difference between the two.

All the ward staff in France seemed to be fully committed to our overall care and well-being within a fairly loose structure. They all wore the same uniform, from the orderly to the head ward nurse. Most of the day the orderly cleaned assiduously but also served the meals, often sharing that task with the head nurse.

In the UK there was a more rigid structure; there were those who cleaned; those who checked the cleaning; those who served meals; and those who nursed. Personally I think that the compartmentalised system in the UK is more likely to result in a breakdown in care when under financial pressure.

This set me thinking as to whether we have the right structures in agriculture to serve the well-being of our great industry. Have the support industries become too compartmentalised to the extent that the farmer is faced with an incomplete package of care? Each compartment may well be efficient in achieving its own targets but are individual farmers getting the support they need and deserve?

I fear that the answer to this latter question is no: farmers often admit to being confused by the different messages from the various parts of the industry, despite all the research being done on their behalf. This is nothing new but the situation doesn’t seem to be getting any better.

In a recent blog I referred to the reports in the press regarding what to do to control black-grass (Confused over blackgrass). Each report quoted conflicting views, particularly on the role played by different cultivation methods.

It’s easy to say that the cause of this has been the lack of applied research, which the Government stopped funding in the late 1980s. This certainly hasn’t helped. There’s still a lot spent on such research but, because it isn’t co-ordinated, it lacks coherence.

Let’s go back to the issue of cultivations and black-grass. There is much disjointed, short-term and incomplete research on this subject; no wonder different trials come to different conclusions. Using the same level of overall resource to fund a few key experiments, in which every relevant detail is recorded over a few years, would provide clear and definitive conclusions. However, the industry cannot achieve such a co-ordinated effort because of sectional interests.

It would be nice to finish this blog with a few hopeful words as to how we can move forward on these issues. However, we are where we are. In such a situation, individual farmers have to identify sources of information that have been interpreted in such a way as to benefit their businesses rather than the compartmentalised interests of others.

There are some tests that can be applied to the veracity of the information. Boringly, the basic information presented will not change much from year to year; when it does, it will have been updated by sound and independent research or by field experience that has been carefully interpreted. ‘New’ science is always being presented, but what may excitingly appear to be a superior technique may not stand up to review by informed, pragmatic agronomists.

So, it is up to those providing interpreted information to farmers not only to listen to and read the science but also to look at the data presented to see if it will bring any benefit to an individual farmer’s business. It is amazing what a fresh set of eyes can draw out of a data set generated by a scientist who may have an incomplete overall knowledge of agronomy. Let’s face it; scientists inevitably work in their own compartments.

By the way, one lovely touch about French medical care was that every meal came with a neatly ironed and folded linen napkin! Just a hint of Gallic flair.


A noisy summer

Posted on 02/09/2013 by Jim Orson

We live very close to the centre of Cambridge. Despite that, it’s normally a quiet area, but this summer has been different. The garden has been exceptionally noisy with bees buzzing and butterfly wings flapping.

The only thing missing is bird song. Despite the best efforts of the neighbourhood to feed them, small birds have all but disappeared, and those that do appear skulk in the bushes hoping not to be seen by the local sparrowhawks. One bird that has completely disappeared from our garden is the magpie. We had high numbers in the previous couple of years, when I assume they hoovered up the nests of small birds; having exhausted the local territory, they have now moved on. The only birds we now see and hear regularly are pigeons, although a green woodpecker makes an occasional visit.

The increased number of bees and butterflies could perhaps have been predicted. I’d noticed that the cool and consequently ‘long’ spring resulted in a relatively constant high number of flowers in hedgerows and field edges.  

The relatively high numbers at any one time were possibly due to two factors. Firstly, the flowering period of each plant lasted a bit longer in the cool conditions. And secondly, the constancy of the supply of flowers may have been because, in a cool spring, the flowering periods of different plants are more likely to be staggered, overlapping in sequence rather than all occurring together. This may be because the time of flowering is determined more by day length in some species and more by accumulated temperature in others.

So in a warm spring the flowering of some species may be brought forward to a time when other plant species whose development is more determined by day length are also flowering. I must admit that this is pure conjecture based on observation and that this year’s dry spring might also have contributed to higher numbers of some species.
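To show how the conjecture would work, here is a toy sketch contrasting a species whose flowering is triggered by accumulated temperature (thermal time) with one triggered by day length. The base temperature, thermal threshold, fixed flowering date and daily means are all invented for illustration.

```python
# A sketch of the conjecture: one species flowers on accumulated
# temperature (thermal time), another on day length (a fixed date).
# All thresholds and temperatures are assumptions, purely illustrative.

def thermal_flowering_day(daily_mean_temp, base=5.0, threshold=400.0):
    """Day on which accumulated degree-days above `base` reach `threshold`."""
    total = 0.0
    for day, temp in enumerate(daily_mean_temp, start=1):
        total += max(0.0, temp - base)
        if total >= threshold:
            return day
    return None

days = 200
cool_spring = [8.0] * days    # assumed constant 8C mean: 3 degree-days/day
warm_spring = [12.0] * days   # assumed constant 12C mean: 7 degree-days/day

daylength_flowering_day = 60  # assumed fixed, photoperiod-driven date

for label, temps in (("cool", cool_spring), ("warm", warm_spring)):
    t_day = thermal_flowering_day(temps)
    gap = abs(t_day - daylength_flowering_day)
    print(f"{label} spring: thermal species day {t_day}, "
          f"day-length species day {daylength_flowering_day}, gap {gap} days")
```

Under these made-up numbers the warm spring pulls the thermal-time species forward by more than two months, so the two species flower almost together; in the cool spring their flowering is spread across the season, giving the constant supply of flowers I observed.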

To take this conjecture a stage further: are the warmer springs reported in recent decades contributing to the decline in biodiversity? It is perhaps an issue worth investigating. Please do not take this as an explanation for all of the decline in biodiversity on arable land, but it may be worth exploring as a contributory factor.

There is a lot of debate in the national press about the decline of one ‘cuddly’ garden species. Whilst I realise that you cannot really cuddle a hedgehog, they are a very popular animal. The Sunday Times devoted a whole page to their decline and the usual suspects were mentioned: land-use change and pesticides. What was not mentioned was the fact that their main predator is the badger, whose numbers are flourishing. Surely there is a connection here, but badgers were not mentioned once. My allotment is in a semi-urban environment and yet, when it is dry, there are signs of badger activity, suggesting that badgers are a less shy species than I had presumed.

The Sunday Times article reflects much of the press output on other ‘cuddly’ species. Everything but natural predators gets a mention, along with an anti-food-production message. In the natural world there are inconvenient truths, but pressure groups do not mention them, possibly because they may upset other pressure groups and/or their own subscribers. So they retreat into a rather surreal world where there is a common enemy on which they can all agree. Perhaps this is a core reason why some consumers have totally unrealistic fears about pesticides and pesticide use.


Confused over blackgrass

Posted on 22/08/2013 by Jim Orson

A number of farmers have confessed to me that they are confused over the best approach to take to control black-grass. After reading many articles in the press over the past couple of weeks, I can see why. Conflicting views have been expressed and some are, in my opinion, plainly wrong.

Let’s take the simple issue of the impact of primary cultivation. What’s the impact on black-grass populations if the land is ploughed every year compared to the long-term adoption of shallow non-plough tillage, deeper non-plough tillage or direct drilling?

The answer is clearly that the lowest populations occur with ploughing. However, ploughing is painfully slow, an awful lot more equipment is needed to maintain timeliness, and there are other arguments against the plough from a weed control point of view. Ploughing can result in poorer seedbeds, with less black-grass germination and emergence prior to sowing; in black-grass roots establishing at greater depths than under very shallow tillage or direct drilling; and in a reduced level of control with pre-emergence herbicides.

Despite these negative features, every LONG-TERM experiment, in northern Europe as well as in the UK, has demonstrated that the plough is best, although this advantage must be weakened by the increased reliance on pre-emergence herbicides.

However, in the SHORT-TERM it may not be best. 

Typically, the plough can result in higher numbers of black-grass where seed shed has been low in the year of ploughing but high in the year before. Higher numbers of viable seed can then be brought to the surface when compared to non-plough tillage.
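A toy two-layer seedbank model makes the mechanism plain. This is my own sketch, not from any trial, and the shed quantities and emergence rates are assumptions chosen only to illustrate the point.

```python
# Toy two-layer seedbank model: why ploughing after a low-shed year that
# followed a high-shed year can RAISE black-grass numbers. All figures
# are assumptions for illustration, not from any experiment.

def plough(surface, buried):
    # Full inversion: buries this year's shed seed, exhumes last year's
    return buried, surface

def shallow_till(surface, buried):
    # Non-inversion tillage: seed stays roughly where it is
    return surface, buried

# Seeds/m2: a big reserve buried by last year's plough after a high-shed
# year, and only a little shed onto the surface this autumn
buried_from_last_year = 2000   # assumed high shed, ploughed down last year
shed_this_year = 100           # assumed low shed this autumn

for name, cultivate in (("plough", plough), ("shallow tillage", shallow_till)):
    surface, buried = cultivate(shed_this_year, buried_from_last_year)
    emergence = 0.3 * surface + 0.02 * buried  # assumed emergence by depth
    print(f"{name}: ~{emergence:.0f} plants/m2 emerging")
```

With a big reserve buried the year before and little shed this year, inversion exhumes far more viable seed than it buries; reverse the two shed figures and the plough wins comfortably, which is exactly why short-term trials and the long-term experiments can disagree.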

This is why there is always some experimental data around to show that an alternative approach may be better. It also shows that, in general, farmers should not slavishly follow one particular cultivation approach, because herbicide resistance results in us all having to be more tactical nowadays.

There has also been some remarkable advice in one or two recent press articles saying that the pre-emergence herbicides should be applied straight after drilling regardless of soil conditions. The advice is that even if it is bone-dry the herbicides should still be applied; the argument put forward is that they are true pre-emergence herbicides and the risk of a delay in application cannot be taken. 

There are lots of things to say about this.

Firstly, these herbicides enter the plant through the developing roots and/or shoots. If it is bone-dry the weeds will not germinate and so the herbicides will be degrading on the soil surface whilst not contributing at all to weed control. All pre-emergence herbicides apart from tri-allate are far more efficacious when applied to a moist soil surface and they also rely on soil moisture to be taken up by the plant. 

The scenario to dread is the one that occurred in the autumn of 2011, when the soil surface was dry but there was just sufficient moisture below the surface for some black-grass to germinate and emerge. This is the situation we should fear the most. If it is so dry that the black-grass will not germinate, do not apply pre-emergence residual herbicides, including tri-allate.

Secondly, on the subject of whether or not pre-emergence herbicides should be applied as soon as possible: peri-emergence application is just as effective as pre-emergence. In fact, one of the more widely used herbicides is consistently more effective when applied peri-emergence. Seedbeds are more likely to be moist and settled at peri-emergence. When they are dry pre-emergence but moist peri-emergence, the advantage of peri-emergence application is very significant. Having said this, I recognise that the peri-emergence window of application is vanishingly small, but please remember the principles.

I have to say it … in order to avoid confusion over the various and often conflicting statements in the press, consider subscribing to an organisation such as NIAB TAG which retains both a scientific and practical perspective on technical issues. 
