NIAB - National Institute of Agricultural Botany

Orson's Oracle

Sweet smelling stubble

Posted on 23/02/2012 by Jim Orson

Travels across Australia - cont

Farming in South Australia

I visited a farm today near Clare in South Australia. It’s in a relatively high-yielding area, achieving over 6 tonnes/ha of wheat last year. This year the soil is still dry throughout the profile, so yields are predicted to be lower for 2012, and the grain price has dropped due to the strength of the Australian dollar. So the financial shutters are coming down and every growing cost is being reviewed.

In neighbouring Victoria, grain farmers have seen a doubling of total costs over the past 15 years. Profit levels have remained roughly the same, but with enormous variations between years linked to rainfall levels. Overall, growers have made money in around half the years since the early 1990s, but this hasn’t prevented an increase in debt. Some have suffered droughts of 5–11 years that they thought would never end.

Back to this morning’s farm visit. The farmer was using soil probes to monitor water availability in the soil rather than using computer models. Neither method appeared to be perfect, but both have a role to play in arriving at a decision where the skill and the intuition of the farmer are paramount. A researcher recently reviewed the crop models and found them all pretty inaccurate in predicting yield, although some were OK in estimating water use.

The farmer has used ‘controlled traffic’ of sorts for the past seven years.  I say ‘of sorts’ because his combine header isn’t fitted to the system and he makes hay from oats, and many of those operations don’t fit in with the concept.

Despite the ‘of sorts’, the farmer says that water infiltration has improved to such an extent that his dams, which are ponds constructed at the bottom of slopes to catch run-off, are no longer filling with water. Obviously, Australian grain growers prefer the rain to be in the soil rather than running off it.

Talking to his consultant, it seems that adopting direct drilling using knife coulters in the mid-1990s increased water infiltration rates, due to residues being left on the surface (which reduced run-off) and to improved soil structure. Controlled traffic may have further increased infiltration rates.

The other advantage he claimed from controlled traffic is a reduction in power required to pull the 11-metre direct drill.  He suggests that there has been a reduction of around one-third since its initial adoption.

Despite these figures, many of his neighbours have not adopted any system of controlled traffic. This prompted a discussion on soil care. This farmer, and another one I met later in the day, reiterated comments I often hear from UK growers: when they take over new land, the horsepower needed to establish crops is high, but the adoption of measures to improve soil health results in a gradual reduction in power requirement and an improvement in yield. These measures include working the soil only at appropriate times and using as much organic manure or other amendments as possible.

This led me to ask one final question as I got into the Ute: why did the stubble smell so great?  Apparently it had recently received a locally sourced organic amendment - 2 t/ha of grape waste.  We tasted the non-waste portion at lunchtime.


 

Boring Brits?

Posted on 16/02/2012 by Jim Orson

We went out for lunch last Saturday. The weather was glorious and all the tables in the garden were occupied, so unfortunately we had to sit inside. The conversation soon got around to managing risk when growing wheat in low rainfall areas of Australia. I was talking to a consultant who has been a major driver in the development of risk management strategies in such situations; we were having lunch in Bendigo, Victoria...in Australia.

Obviously, much of the risk management strategy revolves around water availability. Last year soils were full of moisture at sowing in early May (our October equivalent), but this year they could be bone dry. So how do Victoria’s farmers cope with this extreme variation, when typically rainfall after sowing could be insufficient to meet the full needs of the crop?

When I was first in Australia, around ten years ago, this was done by estimating how much water there was in the soil at the start of the season. Either a steel rod was pushed into the soil or a calculation was made based on the amount of post-harvest rainfall, providing a crude measure of how many mm of water were available at the time of sowing.

The assumption was 20 kg of grain for each mm of moisture; as rain fell during the season it was measured and the estimated yield was adjusted accordingly, with nitrogen (N) applied based on yield potential.
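The arithmetic behind this rule of thumb can be sketched in a few lines. The 20 kg/mm figure is from the text; the sample water figures are invented for illustration:

```python
# Rule-of-thumb water-limited yield estimate described above:
# roughly 20 kg of grain per mm of available water.
KG_GRAIN_PER_MM = 20  # kg/ha of grain per mm of water (figure from the text)

def estimated_yield_t_ha(soil_water_mm, in_season_rain_mm):
    """Crude water-limited yield estimate, in tonnes/ha."""
    total_water_mm = soil_water_mm + in_season_rain_mm
    return total_water_mm * KG_GRAIN_PER_MM / 1000  # convert kg/ha to t/ha

# Invented example: 80 mm stored at sowing plus 120 mm of in-season rain.
print(estimated_yield_t_ha(80, 120))  # 200 mm x 20 kg/mm = 4000 kg/ha = 4.0 t/ha
```

As rain falls during the season, the estimate is simply recomputed with the updated rainfall total.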

The issue with N is that there should be sufficient to achieve the anticipated yield. However, too much means excessive green leaf and water loss and, as a result, a reduction in both yield and grain size.

Now, computerised systems are used that require soil core analysis at the start of the season. Available moisture and N are measured and, where prospects for yield are good and some additional N is justified, this is applied in the combine drill.

As the season progresses the computer programme predicts N uptake and, based on any rainfall after sowing, expected yield. Additional N is applied if a shortage is predicted.
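The top-up decision described above amounts to comparing predicted N demand against what is already available. A minimal sketch, in which the function name, the kg-N-per-tonne figure and the sample numbers are all invented for illustration (the text gives no actual parameters):

```python
# Hypothetical sketch of the in-season N top-up logic: apply extra N only
# if predicted demand exceeds the N already measured or applied.
N_PER_TONNE = 40  # kg N per tonne of expected grain yield (assumed figure)

def extra_n_needed(predicted_yield_t_ha, n_available_kg_ha):
    """Return the additional N (kg/ha) to apply, or 0 if none is needed."""
    demand = predicted_yield_t_ha * N_PER_TONNE
    return max(0, demand - n_available_kg_ha)

print(extra_n_needed(4.0, 120))  # demand 160, supply 120 -> 40 kg/ha top-up
print(extra_n_needed(2.5, 120))  # demand 100, supply 120 -> 0, no extra N
```

The point of the `max(0, …)` is the asymmetry in the text: a shortage triggers a top-up, but a surplus cannot be taken back out of the soil.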

Water availability is not the only risk to take into account; there are also plagues of mice and locusts.

Sow too early and the risk of frost damage increases. Crops are grown through the winter and harvested in November (equivalent to our April) with the threat of wipeout due to frost at flowering. Sow too late and there is an increased prospect of yield damage from high temperatures during flowering.

These risks can also be assessed by a computer programme based on weather records over the last 100 years. The image, captioned ‘Variables impacting Australian wheat production (specific variety)’, shows a printout for a specific variety at a specific location:

  • potential yield according to sowing date is the thick line;
  • the falling lines on the left are the decreasing risk of frost damage as sowing is delayed; and
  • the increasing lines on the right are the increasing risk of heat damage as sowing is delayed. 

The optimum date for sowing, assuming sufficient moisture for germination, is early May.
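The trade-off in the chart can be illustrated with a toy calculation: frost risk falls and heat risk rises as sowing is delayed, and the best date minimises the combined risk. The risk figures below are invented; a real system would derive them from the ~100 years of weather records mentioned above:

```python
# Toy sowing-date trade-off: frost risk decreases and heat risk increases
# as sowing is delayed. All probabilities are invented for illustration.
frost_risk = {"mid-April": 0.30, "early-May": 0.15, "late-May": 0.05}
heat_risk  = {"mid-April": 0.05, "early-May": 0.10, "late-May": 0.30}

def best_sowing_date():
    """Pick the date with the lowest combined frost + heat risk."""
    return min(frost_risk, key=lambda d: frost_risk[d] + heat_risk[d])

print(best_sowing_date())  # early-May: 0.25 beats 0.35 at either extreme
```

With these made-up numbers the optimum lands, conveniently, where the text puts it: early May.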

Doesn’t this make British wheat production sound boring?


Jim is on a study trip to Australia, so expect further Oz-blogs over the coming weeks.


 

That 70s nostalgia...

Posted on 09/02/2012 by Jim Orson

I went badly off-course in last week’s blog. It started with a suggestion that I was going to discuss some aspects of the contents of the 1970 edition of Approved Products for Farmers and Growers. Perhaps this objective will be met this week.

First of all, it wasn’t a very thick book - this was still the early days of pesticide development – and there were no really effective ‘approved’ grass-weed herbicides in cereals.

Tri-allate (Avadex) had an entry, but this didn’t fit the ‘very effective’ description. I think that chlorotoluron may have been in the following year’s edition and isoproturon in the year after that.

[By the way, chlorotoluron was for a few years known as chlortoluron before the extra ‘o’ was adopted. Nobody seemed to know where the extra ‘o’ came from, but most seemed to blame the Americans. It was fashionable to blame the Americans for anything unwelcome.]

I actually remember the first time I saw the results of isoproturon on black-grass. It was a March application to very large black-grass plants and they were all scythed down to the ground. At that time, both chlor(o)toluron and isoproturon averaged 99% control in trials. I will pause whilst a few nostalgic tears are shed.

In my opinion, it was these two pesticides that started the revolution in wheat growing and yields. Up to their introduction, farmers on clay soils couldn’t grow winter wheat very regularly because of black-grass. Spring cropping often resulted in low yields and, with the equipment and understanding of that time, terrible soil compaction. Those farmers who tried to grow winter wheat more regularly had to resort to late drilling to avoid black-grass, which also resulted in damaged soil structure. No wonder that the UK Government commissioned a report in the late 1960s on the poor state of soils.

Just think what high levels of control of black-grass offered. Farmers on potentially high yielding clays could grow wheat more often or continuously, and soil structure improved because of earlier wheat sowing and no spring cropping.

The pace to higher yields was accelerated by the introduction of semi-dwarf varieties in the mid-1970s. These placed more of their above ground growth into grain rather than straw. Then the triazoles were introduced in the late 1970s and as yields grew so did the use of nitrogen.

The strange thing is that during the 1970s we didn’t think we were in the middle of a revolution in crop management. We came to expect new products and techniques to be produced on a regular basis. It was only when we looked back a few years later that we recognised what had been achieved during those few years.

The sad reality is that we seem to have just about exhausted those technologies. The battle is now to maintain their benefits against the challenges of resistance and legislation.

We now need a new fledgling technology that could be developed within the industry over time. Perhaps GM technology is currently at a similar stage of development worldwide to that of pesticides in 1970. With a market, albeit initially for simple traits, it could bloom and be the foundation for the next step forward in yields. Unfortunately, the technology is being held back by a public that has been fed misconceptions peddled by those who go to bed every night with a full stomach.

Worldwide, the technology is racing ahead, and we have recently heard that BASF’s biotech operation is being moved from Europe to the US, where the technology has an immediate future.

There is much talk in the press of the rest of the world moving ahead of Europe. The anti-GM movement is perhaps just one example of a wider malaise in the EU. Many of our citizens still harbour the attitude that unwelcome technologies, such as GM, emanate from America.

Well, having got that off my chest, I realise that I have still not fully reviewed much of the content of the 1970 book of Approved Products. Perhaps I will have another go next week – as I am sitting on the plane on the way to Australia. Then again...


 

What should be on the label?

Posted on 03/02/2012 by Jim Orson

I said a few weeks ago that I would return to the 1970 edition of the Approved Products for Farmers and Growers. In those days approval meant that the product efficacy had been assessed and ‘it did what was said on the tin’. Efficacy assessment was not part of the registration process, but it was de rigueur to enter products into the voluntary Approvals Scheme.

Personally, I’ve regularly questioned why products have to be assessed for efficacy. It’s understandable that there are obligations on the label regarding crop safety and safety to following crops. The argument for statutory efficacy testing is that pesticides should not be used in the environment unless they do a useful job. On the other hand, the market would soon drop the products that didn’t do the job; that happens regardless of any statutory efficacy assessment.

Equally, a target needn’t necessarily be on the label for a product to be widely adopted to control it. Trifluralin had a huge market for black-grass control in cereals, but the weed was not listed on the label as being susceptible.

In addition, the label-recommended dose for a particular job is often ignored. The best example of this is with cereal fungicides. In efficacy testing for authorisation, the product is applied only once and no other fungicides are used. In practice, a programme of different fungicide products may be used to control the target. In this situation, adopting lower than recommended doses is inevitable.

The other issue with the recommended dose is that it’s set to control the most difficult target on the label. Significantly lower doses may suffice for many of the more susceptible targets listed. Everyone can quote examples where this takes place. It’s regrettable that much of the dose information generated during initial development trials isn’t detailed on the label.

The other efficacy issue that now has great emphasis during the authorisation of products is resistance management. Overall I welcome this, but there are issues relating to some of the restrictions imposed. All too often it is closing the stable door after the horse has bolted. A good example of this is resistance to the ‘fops’ and ‘dims’. To be fair, the labels suggest that a wider view is taken of resistance management than just the specific measures they outline.

In many cases resistance is inevitable unless rather extreme measures are taken. Bayer imposed the need to use mixtures with Atlantis. However, it soon became clear that the only way to prevent resistance to Atlantis, and products that share its mode of action, was not to use them on a regular basis. Mixtures may have delayed the inevitable, but not by much.

I am increasingly convinced that resistance management is more of a socio-economic issue than merely a matter of producing guidelines and introducing some restrictions of use on labels. Resistance to outstanding pesticide-based solutions can occur very quickly; take glyphosate, where weed resistance has developed in crops genetically modified to be tolerant to this herbicide. Roundup Ready was initially a great option, not only because of the level of weed control achieved but also the ease of management. The cost of the GM seed also meant that breeders invested more in developing varieties. This resulted in farmers relying totally upon this one herbicide solution year after year. Weed resistance has taken a little of the icing off the cake, but the area of Roundup Ready crops has not diminished.

There really is a need to learn the lessons of the past. They suggest that even more restrictive use may increase the long-term benefits of new modes of action known to be vulnerable to resistance development. After the experience of the strobilurins, is it wise to allow two SDHI-based fungicides per crop?

Unfortunately, new discoveries tend to be very active substances, with doses of at most a few hundred grams/ha required to achieve control. This typically results in them having a single site of action, making them more vulnerable to resistance development. The low doses and single site of action are a reflection of rapid screening methods in the discovery phase. They also mean that products based upon such active substances are more likely to be cheap to produce and, perhaps more importantly, more likely to get through our rigorous registration systems.

I’m rather surprised about the direction this blog has taken. I started by indicating I was going to discuss the contents of the 1970 Approvals book. That’ll have to wait for next week.

 


 

Not as bad as first thought?

Posted on 24/01/2012 by Jim Orson

There have been some very doom-laden studies on the financial implications of the Drinking Water Directive for pesticide availability. These suggest that the herbicides relied upon to control black-grass in oilseed rape could disappear, with dire financial consequences for the whole rotation. The view of the Environment Agency (EA) is that these studies are far too gloomy.

So what’s the truth? Well, the EA is drawing up maps of the areas that influence the pesticide content of raw drinking water. These represent very much the minority of arable land. So, by no means is the whole arable area expected to meet the demanding targets set by the Drinking Water Directive, which is incorporated into the Water Framework Directive.

And do the targets set by the Drinking Water Directive need to be totally met in raw drinking water? It seems not. The Water Framework Directive aims to stop the situation getting worse and to avoid further investment in treatment.

But, but, but…this does not let any arable farmer off the hook. They need to adopt every reasonable measure in order to minimise movement of pesticides into water. It may be that additional measures need to be adopted in those areas that are sources of drinking water.

One major management target is for farmers to do their best to avoid huge spikes in pesticide levels in watercourses that can occur after heavy rain. Reducing water run-off, particularly from recently sprayed fields, will help significantly.

Outside the areas where drinking water is sourced, water bodies (typically lakes and rivers) have to meet the standards set by the Environmental Quality Standards (EQS) for the content of specific pesticides. The comforting fact is that a 2010 report suggests that 99% of water bodies are meeting these standards. A new report is being prepared and we await its results.

But, but, but…there is concern in arable farming that there are now interim arrangements for wider aquatic buffer zones, which can be up to 20 metres wide.

These were introduced at the request of the pesticide companies so that the maximum number of existing and future pesticides remains available to farmers. This is because the standards required to prevent the impact of spray drift on water ecosystems have increased over the last few years.

The question is why have these standards increased? I have a sneaking suspicion that it’s because of the standards set in order to meet the requirements of the Water Framework Directive!

On the one hand, it may take only a small amount of spray drift to exceed EQS standards in a small field side ditch. On the other hand, the impact of individual fields on pesticide levels in a larger water body is literally watered down because of dilution of water from untreated areas.

EQS standards may largely be met for water bodies, but the impact may be felt at field level through the probability of wider buffer zones.

So, is the potential financial impact of the Water Framework Directive less than originally forecast in the reports on the subject? Probably yes, but, but, but……there is a potential financial impact due to wider buffer zones that was not forecast in these reports.

 


 
