NIAB - National Institute of Agricultural Botany

Orson's Oracle

Seeing is believing?

Posted on 29/10/2013 by Jim Orson

It’s always been said that many farmers adopt or adapt techniques by ‘looking over the hedge’ at their neighbours. Nowadays ‘looking over the hedge’ should not be taken too literally, as it is a process that includes all forms of communication. However, seeing should not necessarily mean believing.

I don’t think we’ll ever regain the world wheat record from New Zealand now that they’ve really got their act together. In this context, this year’s UK record wheat crop was a real achievement. NIAB TAG’s monitoring of weather variables suggested that the area where the UK record crop was grown was favourably treated by nature this spring; higher than average levels of solar radiation and sufficient rainfall were recorded.   

Much is being made of the level of foliar nutrition that was received by the wheat crop that broke the UK wheat record. There’s the assumption that this significantly contributed to the high yields and it appears that the farmer, whose attention to detail is impressive, is convinced of this.

Now, I don’t want to be a killjoy but I’m not quite so convinced. Perhaps this is inevitable from a boring science-based agronomist whose views have been coloured by similar (and eventually proven to be unsubstantiated) claims throughout his career. Perhaps this time it is different but I should like it to be validated in field trials where yields with and without the foliar nutrition are compared.

What was different about the approach to the record UK wheat crop is that the farmer applied foliar nutrition from early post-emergence onwards. I don’t think that’s been assessed experimentally in the past. Usually these products have been tested as single applications rather than in a multi-application, multi-year approach. The farmer’s high yields suggest that this approach needs to be evaluated experimentally.

There are yield benefits from splitting the total season-long dose of ‘bag’ nitrogen. In trials in years gone by, there were yield benefits from increasing the number of spring nitrogen applications. The biggest yield benefit was going from one application to two. The yield benefit got progressively less with each additional application. The current standard of three applications is a pragmatic approach bearing in mind labour and machinery costs, but there would be yield benefits from having one or two additional applications. However, it must be remembered that we farm to optimise margins and not to maximise yields.

In press reports, the farmer who grew the UK record crop laments the fact that he cannot use, for environmental reasons, the very high dose of applied nitrogen that was used to grow the world record crop of 15.64 t/ha in New Zealand. He is reported to have used 220 kg N/ha. In fact he may not have needed much more, if any, because Eric Watson, who also farms on New Zealand’s South Island, has achieved a field yield slightly higher than the world record with around 250 kg/ha of applied nitrogen. The level of soil mineral nitrogen in this case was 100 kg/ha.

In NIAB TAG trials the optimum applied nitrogen dose is not reduced by this level of soil mineral nitrogen. To reinforce this point, Craige MacKenzie, who farms a few miles from Eric Watson, has also achieved a field yield higher than the world record from a total of 262 kg/ha of applied nitrogen where the soil mineral nitrogen was 35 kg/ha. Unlike Mike Solari, the world record holder, both Eric and Craige have the benefit of variable rate irrigation.

This suggests that, while the amount of nitrogen that a crop can access may limit yield, very high wheat yields can be achieved without exceptionally high doses of applied nitrogen. What is essential is the correct crop structure, prolonged ripening with high levels of sunlight and an ample supply of moisture. The final and critical element must be the skill of the farmer, with particular emphasis on attention to detail. I remember vividly having a prolonged discussion with Mike Solari about seed rates and tiller numbers and what constitutes a ‘too thick’ crop.

As an aside, one intriguing observation can be made when looking at the results of UK trials on trace elements and/or plant vigour sprays. Usually these trials have two control treatments. One control treatment is standard crop management and the other is the same but receiving a spray of just water applied at the same timing and volume of application as that used for the trace element and/or plant vigour sprays. It’s not unusual to see a yield benefit from all the spray treatments, with or without the trace element and/or plant vigour products. I’ve often pondered why such a small amount of applied water can make such a difference.


 

Science backs straw burning

Posted on 23/10/2013 by Jim Orson

I recently received an email from a farmer who forwarded a Tweet providing a link to a report in New Zealand. The heading on the linked website was “Science Backs Use of Stubble Burning”. My immediate thought was “was it ever thus?” Crop science has always backed straw burning. I was closely involved with MAFF’s machinations on the straw burning issue in the late 1980s. The crop science clearly backed straw burning but the practice was banned despite that knowledge.

Perhaps a couple of issues have changed since the 1980s. Only a few years after the ban was imposed, many farmers commented that their soil had become easier to work where they incorporated the straw. I think this was the result of regular incorporation of straw increasing soil fungal biomass, which to a certain extent acts as a proxy for organic matter.

The big change since the 1980s relating to straw burning is herbicide resistance in black-grass. This has increased the reliance on cultural measures, and the ban on straw burning removed one cultural option. Straw and stubble burning can reduce the number of viable seeds by up to 50%. That sounds impressive, but this loss of viable seed would only reduce the annual chemical control required in continuous winter wheat, established by non-plough tillage to 20 cm depth, from 97% to 94%.
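The arithmetic behind that 97%-to-94% comparison can be sketched in a few lines. This is my own back-of-envelope check, not the NIAB TAG population model: it simply asks what chemical kill, applied to the seeds that survive a burn, leaves the same overall seed survival as the 97% baseline.

```python
# Back-of-envelope sketch (my own, not NIAB TAG's model): if holding a
# black-grass population steady requires 97% chemical control on its own,
# and burning first kills a fraction of the viable seed, the herbicide
# only needs to remove enough of the survivors to reach the same overall
# seed survival.

def required_chemical_control(baseline_control, burn_kill_fraction):
    """Chemical control needed after burning to match the baseline overall kill."""
    overall_survival = 1 - baseline_control    # e.g. 3% of seeds may survive overall
    post_burn = 1 - burn_kill_fraction         # fraction of seeds left after the burn
    return 1 - overall_survival / post_burn

print(round(required_chemical_control(0.97, 0.50), 2))  # 0.94 -> the 94% in the text
```

In other words, a 50% burn kill only halves the seed that the herbicide must deal with, so the required control slips from 97% to 94% rather than to anything dramatic.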

In addition to killing viable seed, straw burning reduces dormancy in surviving seeds. This is potentially the most significant impact on black-grass control, particularly where a ‘hot’ burn is not possible. However, it comes with a huge proviso: to benefit fully from reduced dormancy, drilling has to be delayed. Going by the record of this year, you have to ask whether farmers are willing or able to delay drilling long enough to exploit fully the benefit of straw burning.

This year, even without straw burning, we have had the potential advantage of low dormancy. Despite this, many farmers drilled fields with known high levels of black-grass in mid-September. The subsequent dry conditions in many areas resulted in poor herbicide control and some of these fields have now been sprayed with glyphosate prior to re-drilling.

This suggests that there is no simple silver bullet for black-grass control. So whilst straw burning would be beneficial, particularly where drilling is delayed, the real answer is hard to swallow. Black-grass populations have to be reduced to low levels by adopting far-reaching cultural control measures, perhaps the most important of which, in this context, is spring cropping. With lower background levels of black-grass, the penalties of poor herbicide control in some seasons will be minimised. Once background levels have been reduced to manageable numbers, there are less daunting cultural measures that may enable continuous autumn cropping, provided that it is not all early drilled. Defining a more durable approach will depend on soil type, the farmer’s attitude to cultivation practice and crop choice, and also their attitude to risk.

Going back to the New Zealand report, I think it is important to look at it in context. It is in fact a report commissioned by Environment Canterbury and it is focussed on the farming systems of the Canterbury Plain in South Island. This area produces a lot of specialist seed crops that are sown after straw crops and are of huge value to the local economy. Herbicide options are limited in these seed crops and straw residues can affect their establishment. Hence, straw burning significantly increases the chance of optimising crop establishment and achieving the high levels of weed control necessary to remove weed competition and minimise harvested seed contamination. One obstacle to burning does not apply on the Canterbury Plain: soil organic matter levels there are high, and therefore the potential benefit to soil structure from incorporating straw is minimal.

Over the last few months I have become very familiar with the issues relating to straw burning on the Canterbury Plain. I was one of the authors of the report and had many hours of stimulating discussion with my fellow authors.  Not that I visited New Zealand to do this; I was at home and sat round the table via Skype…


 

Two into one will go

Posted on 14/10/2013 by Jim Orson

LERAPs (Local Environment Risk Assessment for Pesticides) was introduced in 1999 in order to meet the standards for protection of the aquatic environment from spray drift as laid out in the then pesticides registration directive. I’ve always been a fan of this scheme because it is practical, logical and rewards farmers by enabling reductions in buffer zone width through good spray application practice and responsible pesticide use.

The new EU pesticide registration regulations were introduced in 2011. These significantly increased the standards for protection of the aquatic environment. I’m not qualified to say whether these standards are realistic or far too precautionary but they could result in more unproductive land than is necessary unless we have a similarly successful approach to that demonstrated by LERAPs.

Initially, the Chemicals Regulation Directorate (CRD) introduced some Interim Arrangements to cope with the requirements of the new regulations. If it weren’t for these interim measures we would have lost some valuable products from the market. However, to keep these products on the market, wider buffer zones than those under LERAPs were introduced. These buffer zones stretch up to 20m from the edge of the watercourse but, unlike LERAPs, cannot be reduced with the adoption of good spray practice or lower than label-recommended doses. In addition, the size of the watercourse cannot be taken into account.

This, in effect, is resulting in the adoption, for each individual product, of one of two separate sets of rules in order to reduce the impact of spray drift on the aquatic environment; LERAPs or the Interim Arrangements. CRD is carrying out a consultation on possible changes to the Interim Arrangements that may allow even more products to meet the new regulations (albeit with wide buffer zones) or may reduce the buffer zone widths. It’s envisaged that these benefits will be delivered by the adoption of spray nozzles which reduce spray drift by more than 75% when compared to a standard conventional nozzle. These are designated as 3-star nozzles in LERAPs.

These proposals can only be welcomed but they don’t resolve the issue that there will still be effectively two separate sets of rules which govern the width of a buffer zone. Tank mixing could become a nightmare. Logically, these two separate sets of rules should be combined, not only for simplicity and clarity but also for achieving support from the farming industry.

The most logical way to combine them is, in my opinion, to incorporate the Interim Arrangements into LERAPs. This would mean taking into account the adoption of lower than label recommended doses (better described as appropriate doses) and the size of water course in addition to the adoption of 3-star nozzles in a unified scheme.

There also needs to be a coming together on the width of buffer zones.

Currently, the maximum in LERAPs, when a product is applied through conventional nozzles from the top of the bank of the watercourse, is 5 metres. This width can be reduced for many products when applied through 3-star nozzles. The Interim Arrangements specify 10, 15 or 20 metres from the top of the watercourse. This means that buffer zones currently set at more than 5 metres but less than 10 metres for conventional nozzles cannot possibly be reduced to 5 metres for 3-star nozzles. So there needs to be an additional buffer zone width of 5 metres for application through 3-star nozzles.
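To make the shape of that argument concrete, here is a purely illustrative sketch, not the CRD rules or the consultation text: a unified lookup in which 3-star low-drift nozzles earn a one-step reduction in buffer width, with a 5 metre step added below the current 10 metre interim zone. Both the one-step rule and the width table are my own assumptions for illustration only.

```python
# Purely illustrative (not the actual CRD/LERAP rules): a unified width
# lookup in the spirit of the proposal, where 3-star low-drift nozzles
# earn a one-step reduction in buffer width, including a hypothetical
# 5 m step below the current 10 m interim zone.
INTERIM_WIDTHS = [20, 15, 10, 5]  # metres; the 5 m step is the proposed addition

def buffer_width(label_width, three_star_nozzles):
    """Return the buffer width in metres for a product's label width."""
    if not three_star_nozzles or label_width not in INTERIM_WIDTHS:
        return label_width  # no reduction without low-drift nozzles
    idx = INTERIM_WIDTHS.index(label_width)
    return INTERIM_WIDTHS[min(idx + 1, len(INTERIM_WIDTHS) - 1)]

print(buffer_width(10, three_star_nozzles=True))   # 5
print(buffer_width(20, three_star_nozzles=False))  # 20
```

The point of the sketch is simply that a single table, with a 5 metre rung at the bottom, would let every product's buffer zone respond to good spray practice in the same way.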

This is a very important issue for many farmers who have established 5 to 6 metre permanent grass based buffer zones next to water courses. Buffer zones wider than these strips but less than 10 metres may result in weeds, diseases and insects establishing on the untreated edge of the crop, which in turn may necessitate further pesticide use.

Hence, I would argue that two into one will go: a combined scheme would not only protect water but also reward good spray practice and gain greater support from the arable industry.

Replies to the consultation need to be submitted by 21st October and it is available here.


 

Micronutrients fail the ‘common sense’ test

Posted on 02/10/2013 by Jim Orson

Micronutrient treatments are often relatively cheap in terms of cost/ha but not necessarily in terms of cost/kg of the nutrient(s) in question. However, despite their relatively low price/ha, treating the whole farm with them will often cost more than a good holiday, even after tax.

There is tremendous pressure to use them but limited independent evidence of their value. The farming press has printed articles over the last few months suggesting that they are of great value but the evidence put forward is compromised by the fact that they have been applied together with extra nitrogen or extra fungicides or both. Who is to know where the claimed extra yield comes from? So it’s good news for the industry that HGCA has recently published the results of a research project on the value of three micronutrients (copper, zinc and manganese) when applied to wheat (Current status of soils and responsiveness of wheat to micronutrient applications – Project Report no 518).

Part of the continuing interest in micronutrients is the fear that despite the fact that annual uptake is low, their reserves in the soil may be falling to or below critical levels to support crop production. So, quite sensibly, this project included analysing 132 soil samples and comparing the micronutrient levels in the soil to those in a survey carried out 30 years ago. The results indicate that levels have fallen marginally but overall the changes are not likely to be biologically significant.

Another issue frequently raised when discussing micronutrients is that yields are now higher than when much of the original research on the responsiveness of wheat to their application was carried out. Hence, the research team that carried out the project did a total of 15 experiments over three years in order to measure the potential benefit of micronutrients in the context of today’s wheat management and yields. Sites were on soils that had a risk of micronutrient deficiency: light sandy soils, soils high in organic matter and calcareous soils.

The problem came when interpreting yield responses, because the responses, negative as well as positive, were typically below statistical significance. However, the yield response required to pay for a micronutrient application is typically way below the level of statistical significance. This is an age-old issue and was the subject of my blog on 22nd March this year (Can you take the risk of not using it?).

In this situation, you have to be pragmatic and apply common sense by looking at all the results to see if there is a consistent trend. Should a very high proportion of the trials have a positive but non-significant yield response that is sufficiently high to more than pay for the micronutrient, then personally I would accept that their application is economically viable.

However, the results in this project by no means reflect this. For instance, at face value, the response to copper was at or less than 0.1 t/ha in 12 of the 15 trials with five trials indicating a negative yield response. The equivalent figures for manganese showed nine out of 15 trials had a response at or below 0.1 t/ha with seven trials indicating a negative yield response, and for zinc eight out of 15 trials and six trials respectively. This suggests that generally the small and non-statistically significant yield variations from those recorded in the untreated plots were due to random error and not to the application of micronutrients. So, with the best will in the world, the results suggest that there is no ‘common sense’ evidence that these micronutrients should be used as a matter of course, even on the soil types where yield responses are most likely to occur.
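As a rough ‘common sense’ check of my own (not an analysis from the HGCA report), the negative-response counts quoted above can be put through a simple sign test: if a micronutrient had no real effect, negative responses should be about as likely as positive ones, so the number of negatives across 15 trials should look like a Binomial(15, 0.5) draw.

```python
# A quick sign-test sketch (my own check, not part of the HGCA project):
# under a "no real effect" null, the count of negative yield responses in
# 15 trials follows a Binomial(15, 0.5) distribution.
from math import comb

def prob_at_most(negatives, trials=15):
    """P(X <= negatives) for X ~ Binomial(trials, 0.5)."""
    return sum(comb(trials, k) for k in range(negatives + 1)) / 2 ** trials

# Negative-response counts quoted in the text: copper 5, zinc 6, manganese 7
for name, neg in [("copper", 5), ("zinc", 6), ("manganese", 7)]:
    print(f"{name}: P(<= {neg} negatives by chance) = {prob_at_most(neg):.2f}")
```

None of these probabilities is small: roughly 0.15 for copper, 0.30 for zinc and 0.50 for manganese, which is entirely consistent with the small yield variations being random error rather than a product effect.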

There were, however, two statistically significant results. A very large response to copper application was recorded on one site; an economic response would reasonably have been expected from a soil analysis of the site. There was also a statistically significant response to zinc on one site, which was also supported by a soil analysis but, in this case, not by a leaf analysis. In fact, overall the project indicated that leaf sampling in the spring was of no value for determining the need for an application of copper or zinc.

The real surprise was the lack of statistically significant yield responses to manganese. This led the researchers to say that it “has had attention disproportionate to its yield benefits”. However, it would be a brave person not to treat a wheat crop showing deficiency symptoms which might increase susceptibility to disease and pesticide damage. On the other hand, the results provide food for thought for those who are fond of multiple applications in the absence of symptoms.

So there you have it. Even on soil types where these micronutrient deficiencies are most likely to occur, the ‘common sense’ test, as well as the statistical tests, suggests that it is uneconomic to apply these micronutrients to modern high-yielding wheat as a matter of course. Soil analysis needs to be carried out to determine the probability of a response to zinc and copper, and the project identified critical values in the soil.

This is an important project but there are, of course, other sources of trials information on the value of micronutrients. These have been reviewed for HGCA and will be subject of another blog in the near future. 


 

Wheat harvest post-mortem

Posted on 23/09/2013 by Jim Orson

To tell you the truth, I’m not absolutely sure if my prediction for wheat yields was correct. If you remember, I predicted that “overall, timely sown wheat crops will not yield above average”. In a following blog I said that good yields may be achieved in areas where solar radiation levels during grain fill are above average and moisture supply is not too restricted. In particular I mentioned the NE of England above the Humber.

First of all, it doesn’t take a genius to predict average yields! Yields from year to year are amazingly stable in the UK when compared to many parts of the world.

It’s clear that moisture supply has had a major influence on yield this year. This is reflected by some high yields having been achieved in areas such as the more northerly parts of England and in the South West, both of which received higher amounts of rainfall in May and June.

However, in general, yields appear to have been limited by a lack of soil water. In particular, yields on the less moisture-retentive soils have suffered. A clue that moisture was limiting yields is that, in trials and in practice, there was a large benefit from drilling in early September rather than in early October.

Significantly higher yields from early drilling tend to occur when the soil water availability in the following spring and summer is limiting. However, this may not have been such a reliable guide this last season because of the wet soils in early October and the prolonged and cold winter. On the other hand, the high protein levels this harvest also suggest that potential yields have been restricted by a lack of water. 

It’s true to say that wheat yields overall were better than I’d feared and at the top of my forecast. It may be that the long cool spring has resulted in a higher than average contribution to yields from reserves of water soluble carbohydrates. Some scientific papers associate yield with total levels of solar radiation intercepted over the lifetime of the crop; the most important months being when there is full crop cover i.e. April onwards. This year the levels of sunshine in April to mid-July were above average. I’m digging deeper on this aspect and will write another blog on the significance of growth before anthesis on final grain yield.

The cool conditions around flowering were also good for yield. I’ve always wondered why hot weather around flowering is bad for yields; a PhD project at the Waite Institute in Adelaide has provided me with an explanation. In this project, flowering wheat plots in the field were exposed for just three hours to a temperature of 35C, compared to field temperatures at the time of up to 20C. This resulted in grain yield reductions of between 18 and 30%, depending on variety.

These yield reductions were due to lower leaf chlorophyll content and earlier leaf senescence. In addition, the heat reduced grain numbers by decreasing pollen viability. Whilst I recognise that we don’t experience 35C, the French become concerned when average temperatures exceed 25C during flowering.

In addition, research at the University of Reading suggests that the very high temperatures we experienced towards the end of grain fill may not have damaged yields as much as we feared at the time.

This is my first shot at reviewing wheat yields; I will return to the subject when I have ‘dug deeper’. However, it’s pleasing that in a year with just average levels of solar radiation during grain fill in many areas, the potential for good yields was established. Last year we had more than adequate rain but not enough sunshine and this year we had reasonable sunshine but not enough rain.

Looking back, 2008 had a better balance between the two, with slightly above average solar radiation and average rainfall. 2008 was a record year for wheat yields, which would have been even higher were it not for such a rotten harvest. So we continue to wait for the perfect year.


 
