Posted on 09/06/2013 by Jim Orson
The pressure for a limited return of straw burning to aid the control of black-grass has not gone away. I’ve just written a brief data review on this as part of a more comprehensive report on the impact of straw burning on crops and cropping. Don’t get too excited; this review has not been financed by a UK organisation and looks at the potential impact of straw burning in another part of the world.
The review has taken longer than expected because nearly all the papers I have had to read were so old that they weren’t available electronically. Plus, the libraries at Defra and Rothamsted are both moving and so many of the references were in packing cases.
So I had to employ other means to get the information, including contacting the authors of papers that were published up to 30 years ago! I also visited the Cambridge University Scientific Library where papers had to be dredged up from the very bowels of the earth. This is a great institution - I love going there and luckily it is only a mile from where I live.
There is no doubt that burning straw will help to control black-grass.
Updating the model produced by a young (at the time) Stephen Moss would suggest that if, without burning, one needs 95% control from herbicides in continuous autumn cropping to contain populations, then this would fall to around 92.5% control with burning.
Now, this doesn’t sound like much - but look at it the other way round. Without burning you can allow 5% of the plants to survive herbicide application. With burning this figure rises to 7.5% of plants; a relative increase of 50%. This is roughly the same reduction in the level of herbicide control required as is achieved with the cultural control method of adopting very high seed rates of winter wheat.
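The arithmetic above can be checked in a few lines of Python. This is purely an illustration of the percentages quoted in the text (95% required control without burning, 92.5% with), not Stephen Moss's population model itself; the function name is my own:

```python
# Illustrative arithmetic only, using the control figures quoted above
# (95% required control without burning, 92.5% with burning).

def survivors_allowed(required_control_pct):
    """Percentage of black-grass plants that may survive the herbicide."""
    return 100.0 - required_control_pct

no_burn = survivors_allowed(95.0)    # 5.0% may survive without burning
with_burn = survivors_allowed(92.5)  # 7.5% may survive with burning

# Relative increase in the tolerable survivor rate: (7.5 - 5.0) / 5.0 = 50%
relative_increase = (with_burn - no_burn) / no_burn * 100.0
print(no_burn, with_burn, relative_increase)
```

Seen this way, a 2.5 percentage-point relaxation in required control is a 50% increase in the survivor rate you can tolerate.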
But it’s not as simple as that. Straw burning breaks the dormancy (or stimulates the germination) of seed on the soil surface, so a higher proportion emerges when the pre-emergence herbicides are most active. This is provided that the seeds remain close to the soil surface after cultivation, i.e. direct drilling or very shallow tillage. A surprisingly high proportion of seed is buried below 5 cm where disc or tine cultivation to 15-20 cm is used.
Then there is the issue of the role of crop residues to consider.
A student at Long Ashton did some pot studies which suggested that well-distributed chopped straw covering a lowish proportion of the soil surface did not affect the control of black-grass with isoproturon. However, experiments in pots in Denmark and field trials in France suggest that as little as one tonne of chopped straw/ha lying on the soil surface and/or incorporated in the top few centimetres can reduce the efficacy of both soil-acting and soil/foliage-acting herbicides. The more straw, the greater the reduction.
The story is further complicated by the fact that straw ash can adsorb herbicides. This takes me back to the early 1980s when adsorption of herbicides by straw ash often explained poor chemical control in direct drilled crops. However, experiments at the time showed that it took a few years of direct drilling and straw burning for this to become significant. So it can be concluded that ash, as a result of the occasional burning of straw, would have little impact on the activity of soil-acting herbicides.
Trials in the 1970s/1980s compared black-grass numbers in continuous winter wheat established after straw burning or baling. In a trial series over four years, where paraquat was used pre-drilling but no selective herbicide was used in the wheat, there was no difference in black-grass head numbers between the burnt and the baled plots where the land was ploughed. However, after four years where the plots were tined to 15 cm depth, there were five times as many black-grass heads in the baled plots as in the burnt plots, and more than seven times as many where the land was direct drilled.
This indicates that straw burning would have the greatest impact on black-grass numbers where the land is not ploughed.
So there is no doubt that ‘strategic’ straw burning would help to reduce black-grass populations but alone, it is not a miracle cure. However, it has a potential role as part of a long term control strategy where other cultural control measures are also employed in order to reduce large black-grass populations down to manageable proportions.
On the other hand, please remember that where background organic matter levels are low, incorporating straw on a regular basis can help to maintain a higher level of soil fungal biomass that will itself help to support good soil structure.
Posted on 26/05/2013 by Jim Orson
I have recently taken a renewed interest in which nozzles farmers fit to their sprayers. On the heavy soils in East Anglia the vast majority regularly use Air Induction nozzles to reduce spray drift, but when I asked the question in the Cotswolds recently, the vast majority of farmers didn’t use these nozzles. The explanation for this difference is clear: LERAP (Local Environmental Risk Assessments for Pesticides). This was introduced in 1999 to prevent short-term shocks to aquatic life caused by spray drift, and there aren’t many water bodies and drainage ditches to protect in the Cotswolds. This admittedly rather folksy tale demonstrates that the original LERAP scheme was a classic case of farmers being rewarded for good practice.
For many pesticide products, the scheme enabled those who fitted drift-reducing nozzles to drastically reduce the standard 5-metre aquatic buffer zone alongside water bodies, including ditches. It also recognised that as dose (as a proportion of the label-recommended dose) decreased and the size of the water body increased, reductions in the buffer zone could also be adopted. This sounds complex but the look-up tables were easy to understand and the scheme was relatively easy to adopt. It’s a pity that the original explanatory leaflet made it sound more complex than it actually was!
However, the recent EU pesticide regulations have significantly increased standards i.e. there is now less tolerance of spray drift to aquatic ecosystems.
It became clear to the pesticide manufacturers that sticking to a maximum buffer zone of just 5 metres could mean that many existing pesticide products would not get through the 10-yearly re-registration process. So in 2011 CRD agreed to introduce interim arrangements that extended buffer zones up to 20 metres.
What surprised me and many others was that there was no flexibility to reduce buffer zones of between 6 and 20 metres through good spray practice, dose reductions or the increasing size of the water body.
I have calculated the percentage area of a square field that falls between 5 and 20 metres from the field edge. I recognise that this is rather simplistic and assumes a buffer strip on all four sides. Typically, the yield losses due to hedges and field-side vegetation do not extend beyond 5 or 6 metres into the field, so any additional buffer area will directly reflect lost production.
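The geometry behind that calculation is simple enough to sketch in Python. This is my own illustration of the square-field assumption described above (the field sizes chosen are examples, not figures from the original):

```python
# What percentage of a square field lies between 5 m and 20 m from the
# edge? The 5-20 m strip is the area that the interim arrangements put
# out of bounds beyond the 'old' 5-metre buffer zone.

def pct_between(side_m, inner=5.0, outer=20.0):
    """Percentage of a square field (side `side_m` metres) lying between
    `inner` and `outer` metres from the edge, buffered on all four sides."""
    beyond_inner = max(side_m - 2 * inner, 0.0) ** 2  # area more than 5 m in
    beyond_outer = max(side_m - 2 * outer, 0.0) ** 2  # area more than 20 m in
    return (beyond_inner - beyond_outer) / side_m ** 2 * 100.0

# Square fields of 4, 10 and 25 hectares (1 ha = 10,000 sq m)
for hectares in (4, 10, 25):
    side = (hectares * 10_000) ** 0.5
    print(f"{hectares} ha: {pct_between(side):.1f}% of the field")
```

For a 25 ha square field (500 m sides) the strip is about 11.4% of the area, but it rises to over 26% for a 4 ha field - which is precisely the small-field penalty discussed below.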
The figure demonstrates that it’s farmers with small fields who will be penalised most. Some kept smaller fields in order to retain landscape features and biodiversity, so it seems that, in those respects, good practice is being penalised by these interim arrangements.
A brief review of recently issued authorisations for new products or re-registered products shows that wider aquatic buffer zones will be appearing on new labels. There is a pendimethalin product with a 20-metre buffer zone and straight diflufenican has a 12-metre buffer zone.
This is more than a little worrying because complex tank mixes, often containing these two herbicides, have to be adopted for black-grass control in the autumn. The time of application means that there is more likely to be water in ditches.
There are some other recently authorised products that have significantly wider buffer zones than the ‘old’ 5 metres. This demonstrates that with today’s necessity to tank-mix in order to get the job done, there is a potential for farmers to be forced into wider buffer zones or to use a more limited range of chemistry to control weeds, insects and diseases; the latter will only increase the risk of resistance developing.
So I hope that the ‘powers-that-be’ can develop ways of encouraging good spray practice in order that these wider buffer zones can be reduced whilst still meeting the higher standards set by the recently introduced regulations.
By the way, notice my use of language. Just to remind you that nowadays active ingredients are ‘approved’ and products are ‘authorised’.
Posted on 17/05/2013 by Jim Orson
Jim reports from his recent trip to the south eastern states of the USA.
Rocky Springs sounds like the name of a small town in a 1950s or 1960s movie. In fact it is (or was) a small settlement a few miles east of the Mississippi river. In 1860 it had 2,616 residents; now it has none. So who, or what, killed Rocky Springs?
First of all, the name is misleading. There is not a rock in sight and there are no springs; water was accessed by wells. The soil is a deep loess, the result of the wind depositing fine soil particles over countless years. This is one of the reasons for the settlement being established. Originally, the newly cleared land was fertile and easy to work.
However, the main reason for the establishment of the settlement was that it was a day’s walk along old Indian trails north of the delightful town of Natchez, which is on the banks of the Mississippi. Settlers in the northern states would float timber and furs down the Mississippi, sell them in Natchez and walk the 1,000 miles or so back home. And they would stay the first night in Rocky Springs.
Now the land and the settlement itself, except a still active church, have returned to woodland. The only other signs of habitation are the Post Office safe and a few stone wells standing forlornly in deeply scarred woods. So who, or what, killed Rocky Springs?
There are two suspects; steam boats and soil erosion.
Steam boats were introduced in the 1850s and offered an easier and safer way for settlers returning home. Bandits soon proliferated on the walking tracks and it became very hazardous to walk home with all that cash. So the overnight accommodation business dried up.
The received wisdom is that the settlement had grown to such a size that it would have been self-sustaining, provided that arable cropping continued to flourish. However, the virgin arable land soon lost its initial fertility and things became tougher. On top of that, soil erosion became horrendous and there are still signs of the huge gullies that formed after heavy rain.
That set me thinking as to whether, with today’s technology, arable cropping would have survived and even prospered at Rocky Springs. Obviously, fertilisers would have enabled crop yields to be optimised in a soil that would soon lose any fertility built up under woodland. Secondly, would minimum tillage or no-till farming have prevented soil erosion? I don’t know the answer to this but we did see some arable cropping in the area and it was clear that no-till was being practised to establish both cotton and maize.
What struck me was the low intensity of arable farming in the south eastern states of the United States. It may be that there is tremendous potential to increase production in these states should the world food supply levels become more critical. In some areas at least, no-till would be essential to sustainability and glyphosate would be the key. Hence this herbicide, much demonised by some groups, would have an essential part to play. With a fresh start in these areas, there would be every reason to think carefully as to how it would be deployed in order to prevent resistance. The penalties of over-reliance are only too clear in other parts of the United States.
I suggest that those who wish to go back to more traditional methods of arable production visit Rocky Springs. There isn’t much to see but the lessons are clear.
Posted on 09/05/2013 by Jim Orson
I have spent the last three weeks on holiday in the USA. This came about because we got tickets for the final practice day and par 3 competition of the US Masters (golf) which is held in Georgia. As a consequence we toured south east USA. There was little farming but an awful lot of swamps in the areas we visited, but we did manage to visit a cotton plantation; fascinating.
No doubt you haven’t come to the NIAB website to read about the holiday of one of its staff. However, whilst we were out there the Boston bombing occurred. This appalling act of terrorism was a severe blow to the American public, and the news was dominated by the tragedy. CNN spent virtually all of its airtime on the subject for the following two weeks, most of it live from a nearby street in Boston. A speaker at the White House correspondents’ dinner said something like ‘CNN likes to cover all angles of a story, in the hope that one of them is correct’.
It was the CNN coverage of the bombing that led me to re-evaluate the meaning of ‘breaking news’. Invariably an interview with yet another witness of the bombing, who said something very similar to the previous witnesses interviewed, was classified as ‘breaking news’ and I’m sure the other news networks did the same.
In fact there is a ‘breaking news’ mentality in all news media including the UK agricultural press. There are plenty of instances where the same story is printed year after year as if it was hot news. Let me give you an example: every year I read that volunteer potatoes on dumps need to be controlled as they are a source of blight.
This is not to demean a valuable advisory message but the breaking news element is that someone new appears to make the statement. However, the sheer familiarity of such reports that are repeated annually results in little scrutiny of what has occurred in previous years or updating of the approach being covered. A good example of this is the annual coverage of canopy management of oilseed rape. The same old coverage occurs annually but there have been a lot of lessons learnt.
First of all, I have to say that OSR canopy management has enormous potential to optimise the nitrogen dose. This is no mean feat when you consider how contrary this crop can be. However, we’re falling short of this potential because of the difficulty in measuring nitrogen in the canopy. Despite the apparent sophistication of the rapid assessment techniques, the errors can be so large as to negate the value of the approach. The only sure way of getting an accurate assessment is the approach taken in the original trials: harvesting all the above-ground green area, measuring it and then analysing it for nitrogen content. Because of the potential reward of an accurate assessment of the optimum nitrogen dose, I am sure NIAB TAG is not alone in trying to shortcut this tedious and expensive approach.
You may have noted that I’ve not mentioned yield as a potential reward. This is difficult to ascertain because it depends on what you compare canopy-managed OSR with.
If it’s against the same nitrogen dose, but with conventional timings, then the initial trials only recorded a significant yield increase from canopy management in one instance, where it avoided an excessive dose being applied early to a largish canopy; that excessive dose had resulted in crop lodging. Where lodging was avoided by the use of a triazole fungicide there was no significant yield advantage from canopy management. And where the comparison was between canopy management and 240 kg N/ha with conventional timings then, surprisingly, there was still little difference in yields.
This leads me to the conclusion that unless lodging is avoided by canopy management, the overriding reason for its adoption is to optimise nitrogen doses and margins over nitrogen cost. So let’s hope that in the near future there will be some true breaking news on nitrogen for OSR. NIAB TAG has obtained a wealth of data over the last couple of years and I’m sure that soon we will be better able to assess quickly the nitrogen contained in the OSR canopy at the end of the winter.
Posted on 30/04/2013 by Jim Orson
A few weeks ago I recorded Bill Gates delivering the Richard Dimbleby lecture at the Royal Institution. It was entitled ‘The Impatient Optimist’ and centred on the efforts of the Bill and Melinda Gates Foundation to reduce child deaths from disease.
The Gates Foundation is huge, worth $US 36 billion. Not all the money has come from Bill Gates; Warren Buffet, the investment guru, has also contributed billions. They come from a long line of great American philanthropists who have given huge amounts of money to society.
I watched the recording of the lecture recently. What was refreshing about Bill Gates’ lecture was that he saw innovation as the key to reducing child deaths from disease. This is in contrast to the hand-wringing of European society over anything new and innovative. For instance, he sees merit in using GM mosquitoes in order to reduce the transmission of dengue fever and malaria. The great cynics of our society say that this is just an excuse to get GM introduced. But why go to the expense of developing and registering GM mosquitoes if they do not have a potentially useful role? Surely not just for good PR?
As far as I know, there are two types of GM mosquitoes being developed. One approach is to modify mosquitoes to make them sterile, an alternative to irradiating them. They are then released and mate with the local population but of course there are no progeny and so total numbers are reduced. The concern about this approach is that mosquitoes may be an essential component in local ecosystems. So an alternative approach is to modify the mosquitoes so that they do not transmit disease. Large releases will allow local populations to be dominated by these harmless (in terms of disease transmission) mosquitoes.
Whisper it quietly, but DDT is still being used in the fight against malaria. It seems sensible to me to spend a few pence to spray the inside of a one-room mud-brick house with DDT and kill mosquitoes (and other nasty creepy crawlies) rather than to spend £5 per mosquito net in order to give people an imperfect barrier against insect bites. Now this really sounds like heresy, but think about it.
The problems with DDT arose when it was massively sprayed and slopped all over the place, not when it was used in a targeted way and in quantities just sufficient to rid homes near malarial swamps of mosquitoes. So, as usual, it is how a pesticide is used that is important and not necessarily its potential for harm when used irresponsibly.
In addition, there is talk of developing vaccines against malaria but in his lecture Bill Gates described the huge challenge of getting all vulnerable groups vaccinated. You have only to think of the less than complete vaccination coverage against measles in the UK to understand the scale of the problem.
So there may be no single ‘silver bullet’ solution to dengue fever and malaria. Perhaps the biggest impediment to solving this huge source of human misery is those who do not suffer from the problem trying to impose their views on those who do.