NIAB - National Institute of Agricultural Botany

Orson's Oracle

Ex-ministers have their say

Posted on 22/05/2014 by Jim Orson

Last week I was lucky enough to be invited to a panel discussion at the Crop Protection Association’s Annual Convention involving three ex-Defra Under-Secretaries of State for Farming. It was fascinating to hear their comments now that they are largely unfettered from the responsibilities of Government.

One thing that they all agreed upon was that Defra is perhaps the government department that is most likely to have to react to issues that come ‘out of the blue’, such as animal diseases, floods and food scares. They were all concerned that the very significant cuts in the department make it less able to cope with such issues, particularly if two come along at the same time.

I suppose that the biggest unforeseen issue that Defra has had to deal with in recent years was the Foot-and-Mouth disease outbreak in 2001. Defra (then MAFF) had more resources available at that time but even then staff from its various agencies were called to the front-line. Indirectly, this prolonged the commercial availability of isoproturon in the UK. This was because the CRD (then PSD) review of the herbicide was at a critical stage when the staff involved were diverted to fight the disease. It took a few years before the review got back on track.

Another interesting aspect debated was the issue of self-sufficiency and I was delighted to find myself agreeing with the former ministers. Self-sufficiency is rather an empty term. One has only to walk around a supermarket to see the vast range of foods that we cannot economically or technically produce in this country. It reminds me of the whole life cycle analysis that was carried out a few years ago on buying out-of-season cut flowers, either shipped in from Holland or flown in from Kenya. The conclusion was that cut flowers from Holland resulted in 17 times more carbon emissions to produce and deliver to the UK customer. Repatriating that production to glasshouses in the UK would, if we were as efficient as the Dutch, narrow that carbon gap slightly, but certainly not close it.


I read a letter in the Farmers Weekly from Guy Smith saying that the USA is 123% self-sufficient. Again, I’m not sure what this means, but the geographic area of the USA and hence its different climatic zones make it easier for them to produce a wider range of foods for their citizens than we could ever achieve. For instance, in November two or three years ago, I saw the sowing and harvesting of huge areas of lettuce on the border between Arizona and Mexico. This is a winter-only activity because in the summer it is almost too hot to grow any crop successfully, particularly lettuce. The irony is that in almost every American restaurant and diner, lettuce is always served but hardly ever eaten.

We can grow some crops very competitively and the self-sufficiency argument is a distraction from us focusing on what we do and doing it even better. This does not reflect a lack of ambition but harsh economic realities. However, we should never stop looking for new and realistic opportunities such as the longer season of production that has been magnificently achieved by the UK strawberry industry. Such developments need to have a market and often require research as well as investment.

Amongst the panel there was total support for science being a key foundation in determining the future health and competitiveness of our industry. There was also recognition that science is a long-term process and not a matter of a succession of three-year research contracts. However, whilst all EU governments agree that decisions should be science-based, there was a warning that political self-interest can conflict with this principle, and that individual or small groups of MEPs can be susceptible to the pseudo-science presented by single issue groups. This raised a debate on the neonicotinoids, more on which later.



50 years on

Posted on 13/05/2014 by Jim Orson

1965 was my first year at university during which I had quite an intensive course on geology. It’s a fascinating subject once you’ve remembered the sequence and dates of the different geological periods. On the course we were told about how our landscapes were formed. Over the last couple of days I’ve finally seen at first-hand a couple of the major coastal features that were used as prime examples in the lectures and which we were told we should visit. It took me 49 years to get round to seeing them! However I’m partly comforted by the fact that this, in terms of geological time, is a very rapid response.

The reason I was ‘down south’ is that I am part of a team which is trying to develop a conventional farming system in a mini-catchment that supplies a potable water pumping station. Typically, water works have carbon filters that adsorb some of the pesticides in water and also an ozone treatment in order to oxidise the remainder. Together, these two processes will significantly reduce the levels of most pesticides. However, as you may well know, metaldehyde and clopyralid (e.g. Shield SG) can survive these treatments largely unscathed. Unfortunately, at this particular pumping station they cannot treat the raw water in order to reduce pesticides. Therefore, at the input, every pesticide has to be below the Drinking Water Directive level of 0.1 parts per billion (ppb) and the total content has to be below 0.5 ppb.
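The two thresholds described above can be expressed as a simple compliance check. The following is a minimal sketch in Python, assuming only the limits stated in the text (0.1 ppb for any single pesticide, 0.5 ppb for the total); the pesticide names and concentrations in the sample are hypothetical and for illustration only.

```python
# Minimal sketch of the Drinking Water Directive check described above.
# Limits are taken from the text: 0.1 ppb for any individual pesticide
# and 0.5 ppb for the sum of all pesticides. Sample data are hypothetical.

DWD_SINGLE_LIMIT_PPB = 0.1   # maximum for any individual pesticide
DWD_TOTAL_LIMIT_PPB = 0.5    # maximum for all pesticides combined

def meets_dwd_limits(concentrations_ppb):
    """Return True if a raw-water sample meets both Directive limits.

    concentrations_ppb: dict mapping pesticide name -> measured ppb.
    """
    if any(c > DWD_SINGLE_LIMIT_PPB for c in concentrations_ppb.values()):
        return False
    return sum(concentrations_ppb.values()) <= DWD_TOTAL_LIMIT_PPB

# Hypothetical sample: every pesticide is individually under 0.1 ppb,
# yet the sample still fails because the total exceeds 0.5 ppb.
sample = {"metaldehyde": 0.09, "clopyralid": 0.08, "mecoprop": 0.09,
          "carbetamide": 0.09, "propyzamide": 0.09, "bentazone": 0.09}
print(meets_dwd_limits(sample))  # False: total is 0.53 ppb
```

The point the sketch makes is the one that matters in a catchment where raw water cannot be treated: compliance is not just per product, so a programme of several individually compliant pesticides can still breach the total limit.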

Hence, there is a need to take immense care over pesticide management in this particular small catchment. There has been significant investment in farmyard facilities to ensure that where the sprayer is filled and where the pesticides and sprayer are stored are not sources of pesticides in the raw water. The next step is to ensure that pesticide use on the fields meets similar high standards.

This objective touches many farm activities; the choice of crops, the rotation, the crop protection programmes, the spraying operation, the placement of entry points to the fields and the establishment of buffer zones.

The first step has been to define which pesticides have physical/chemical properties and doses that should allow them to be used with little risk of exceeding the limits specified in the Drinking Water Directive. This is a very complex and challenging issue, as computer modelling for each individual pesticide is impractical because of a lack of knowledge of the actual ‘leaching’ routes through the subsoil to water. An additional challenge is that there must be a readily available and cheap analysis to identify each pesticide in water at levels at or below the standards set in the Drinking Water Directive. However, we’re getting there.

It’s clear that we cannot, at this stage, afford to have black-grass as an issue in this catchment. Its control would almost certainly mean that the standards pertaining to pesticides in water could not be met. Also, even without black-grass as a target, it is clear that robust weed control in oilseed rape is going to be at the least difficult and perhaps impossible.

It appears possible to have a robust crop protection programme in spring barley and probably in winter wheat. There are other alternative crops that might be grown but with some limitations to crop protection. A system is slowly being devised which may result in profitable conventional farming whilst meeting the Drinking Water Directive standards.

In this situation it’s easy to rely on the sulfonylurea herbicides which are used at a few g/ha. This may prove to be a short-term palliative but the threat of resistance occurring in broad-leaved weeds means that other groups of herbicides will have to be considered. There are one or two crops where only the sulfonylureas can be used. Hence, there has to be a rotational plan with non-sulfonylurea herbicides as the major means of weed control in other crops in the rotation.

The idea of the project is to start with a very precautionary approach and perhaps introduce more pesticide and cropping options if all goes well. It will be intriguing to see how this develops.



Soil back in fashion

Posted on 01/05/2014 by Jim Orson

Soil seems to be back in fashion. Last week, there was even an hour long programme on it on peak time telly. I didn’t really like the “cooking does not come tougher than this” Masterchef-like presentation. This style was also used for recent harvest programmes; “will the potato yield and quality be good enough?” was repeated several times in one episode. I know, I know; I’m getting old and curmudgeonly!

Also announced last week was a website called the UK Soil Observatory. This has collated the mapped information generated by some lead research organisations. Just bang in your postcode and you can get a huge amount of information on your soils. However, those wanting field specific data may be disappointed. It’s worth a look, perhaps at a slacker time of year than this.

Much of the increased interest in soils has come about because of the industry’s frustration over not being able to increase yields. We’ve just about exhausted the potential of plant nutrition and crop protection and plant breeders are making steady but not dramatic progress. So the question in the minds of many is can we significantly increase yield potential with improved soil management?

I’m not sure of the scale of yield improvement that we can expect but we must, as an industry, investigate potential approaches to improve soil management. The problem is that this must be done in the field and long-term trials are required. Unfortunately, government and levy-bodies are reluctant to commit to long-term field trials. Agricultural charities are less reluctant and have had the foresight to fund two long-term trials that are being carried out by NIAB TAG. These have been running for the best part of 10 years and it was only after a few years that the results became of interest.

NIAB TAG’s STAR Project on a clay soil in Suffolk is in its ninth year. The plots are of a size where farm scale equipment can be used. This has clearly demonstrated that, despite all the recent rhetoric, ploughing is certainly not inferior in generating yields when compared to non-inversion tillage to a depth of around 20cm. If anything, ploughing produces the highest yields, but not necessarily the highest margins. I realise that there are downsides to ploughing but these results challenge the views of some of the more evangelical supporters of non-plough tillage. The results of this trial also show that soil organic matter is, as yet, little different in plots cultivated to a depth of 10 cm every year when compared to deeper tillage. More information is available here.

The question remains as to whether or not the long-term adoption of cover crops and/or companion crops, in addition to the retention of crop residues, can result in real improvements. A 30 year trial at Morley in Norfolk has shown that incorporating straw every year rather than removing it has resulted in an increase in organic matter from 1.57 to 1.75%. This doesn’t sound a lot but the difference in soil aggregate stability is amazing. Part of this increase in aggregate stability may also be due to the increase in soil microbial biomass that results from the annual retention of straw residues. Together, this may result in easier seedbed preparation and less surface capping, but how to measure this in financial terms is problematic. It clearly demonstrates a major obstacle in transferring knowledge on soil related issues. Uptake of some techniques may be higher if a monetary value can be calculated.

In addition to these changes in soil organic matter, soil microbial biomass and aggregate stability, there is a non-significant reduction in the bulk density of the soil where the straw is incorporated rather than removed every year. This may mean that the draught requirement is reduced. Plans are afoot to measure this.

There is a strong argument that the annual incorporation of crop residues, whilst producing positive results, may not be enough to make a dramatic difference to the ease of cultivation and yield potential. The additional adoption of cover crops may result in more demonstrable differences. This could be true particularly where there are spring crops in the rotation. In this case, cover crops have more time for growth and will also result in less nitrate leaching over the winter. This is an area of intense interest at the moment and the basis of another charity funded long-term trial, this time on a medium soil type at NIAB TAG Morley in Norfolk. Again, there is more information here.

Book your place on the STAR Open Day on Friday 6 June, near Otley in Suffolk

Book your place on the ARTIS Soils Foundation Module at various dates and locations around the UK in June and July 2014



Jumping to conclusions

Posted on 17/04/2014 by Jim Orson

Recently I’ve started to go to the supermarket with my wife, where we’ve developed a systematic approach to the food shop. Initially we visit the wine department and then I go off to the cafeteria for an Americano (which used to be ordinary coffee) whilst my wife chooses the food. I then turn up to help bag it and get it to the car. It’s perhaps not what one would call total participation, but it is an advance on zero.

Whilst sipping coffee in the cafeteria this morning I read The Times. There were two articles that particularly interested me - one was on invasive species and the other was on research by the University of Reading showing that adding oilseed supplements to the cows’ diet resulted in milk with the same overall amount of fats but 25% less saturated fats.

As you know, we have been bombarded for years by statements that saturated fat is bad for us, so this could be good news for the dairy industry. The preoccupation with saturated fats has meant that for years and years we’ve been told that dairy products may actually be bad for our health. Now, after some scientifically based studies, it appears that those who eat or drink the most dairy products are less likely to suffer from cardiovascular problems than those who consume the least. Therefore, it seems that focussing solely on the saturated fats in milk has resulted in advice that may have been deleterious to our health.

It’s always a danger to select just one factor within a complex system and thus jump to conclusions. This isn’t really science and, in this case in particular, is not advantageous to human health.

There are similarities with issues that have arisen in arable agriculture. Scientists, advisers and farmers sometimes latch onto a single simple factor and think that it will determine the outcome of a very complex system, only to be proven wrong in field trials.

Systems do not come much more complex than the soil, and yet we seem hooked on simple indicators. To tell you the truth, perhaps that’s all we can do but at least we should be wary of these guidelines.

One generally held view, at least amongst soil scientists and very much less so amongst practitioners, is that the level of Soil Mineral Nitrogen (SMN) in the early spring has a profound and predictable impact on the level of applied nitrogen required to optimise yields. Field trials show that this is clearly not true. There may be a broad relationship, but field trials suggest that SMN, at the levels which occur in long-term arable soils not receiving organic manures, appears to have a limited influence on optimum levels of applied nitrogen.

The same appears to be true of the seemingly logical statement that higher yielding crops require more nitrogen. Field trials show that there is either no relationship or a weak relationship between yields and nitrogen requirement for feed wheat. I personally think that there may be a weak relationship between optimum yield levels and nitrogen requirement for feed wheat, but it may be no more than a few kg of nitrogen for each additional tonne/ha above average yields. Bread wheat is another matter, and the complexities of nitrogen nutrition mean that using previous farm experience is necessary to fine-tune the dose needed to achieve the required protein content in high yielding crops.
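To make the scale of that weak relationship concrete, here is a hedged sketch. The slope of 3 kg N per extra t/ha and the base dose are assumptions chosen purely for illustration; the text says only that it "may be no more than a few kg" per additional tonne above average yield for feed wheat.

```python
# Illustration of the weak yield/nitrogen relationship suggested above.
# The base dose and the slope (kg N per extra t/ha above average yield)
# are hypothetical values for illustration, not recommendations.

def adjusted_n_dose(base_dose_kg_ha, expected_yield_t_ha,
                    average_yield_t_ha, kg_n_per_extra_tonne=3.0):
    """Return a nitrogen dose nudged for yield expectation (feed wheat)."""
    extra_tonnes = expected_yield_t_ha - average_yield_t_ha
    return base_dose_kg_ha + kg_n_per_extra_tonne * extra_tonnes

# Expecting two tonnes above a 9 t/ha average adds only ~6 kg N/ha
# on these assumptions - a very small adjustment in practice.
print(adjusted_n_dose(200, 11, 9))  # 206.0
```

The arithmetic makes the point of the paragraph: even a yield expectation well above average would shift the dose by only a handful of kilograms, which is why field trials struggle to detect the relationship at all.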

One of the worst examples of jumping to conclusions that I have come across was when Calixin (tridemorph) was first introduced in the 1970s, initially to control mildew in spring barley. BASF said that it should be applied around the first node stage of the crop, but the more academic plant pathologists said that protecting the flag leaf was so important that it should be applied at flag leaf emergence. BASF was relying on field trials and the academics were relying on their knowledge of crop physiology. Guess who was correct.

NIAB TAG had a similar experience a few years back. When strobilurins were first introduced, we ran an HGCA funded project to identify the optimum time to apply them to winter wheat. The results were that the optimum timings to use two applications of strobilurins within an overall three spray programme were at T1 (final leaf three emerged) and T3 (mid-flowering). This happened consistently in all the trials over two or three years. Despite this, the standard advice at the time was that T2 (flag leaf fully emerged) must be the most important timing for the strobilurins and that they should either be applied at T1 and T2 or at T2 and T3. However, as we finalised the project, septoria developed resistance to the strobilurins and the results were quietly forgotten.



Grasses: mankind’s best friend?

Posted on 11/04/2014 by Jim Orson

Earlier this week I was watching TV with our grandchildren. It was a children’s programme on dinosaurs; a subject that captivates our grandson. The presenter said that dinosaurs did not eat grass because they existed before grass existed. This came as a complete surprise to me and so I looked into this a little further.

It appears that the programme was partly right and partly wrong. Dinosaurs almost certainly preceded grasses, but by the time they were wiped out at the end of the Cretaceous period (around 66 million years ago) there was some diversity in grass species, suggesting that grasses too had been around for a few million years.

Grasses now form an essential element of global nutrition, despite the fact that they evolved relatively late. Cereals, including wheat, barley, oats, millet, sorghum, maize and rice, now provide just below 50% of the world’s dietary energy. This varies from country to country and continent to continent. In developing countries the average is 55% and in industrialised countries it’s around 35%.

The data demonstrates that reliance on cereals as an energy source for humans declines with increasing affluence. This is reinforced by the fact that in Africa, this reliance has fallen from over 60% to 55% over the last few decades. Increasing affluence can also explain the much reduced reliance on rice in favour of wheat in many developing countries.

Another grass not included in the above list is sugar cane which produces around 75% of the world’s sugar supply in addition to being a major feedstock for ethanol production.

In terms of the history of the world, it’s been only a couple of ticks since most of these crops developed. I’ve written before about the chance hybridisation of three grass species around 10,000 years ago that resulted in what we now call wheat. This occurred in the Fertile Crescent, which includes Israel, Lebanon, Syria, Iran and Iraq. It took around 3,000 years before it was cultivated in France and another 1,000 years before any production in the UK.

Genetic evidence has shown that rice originates from a single domestication in China 8,200–13,500 years ago. It was introduced to Europe through Western Asia, and to the Americas through European colonisation. There is a huge area in the Po Valley in Italy that’s devoted to paddy rice and I’ve witnessed its production next to the Murray River in Australia so it remains a crop grown on a global scale.  

Maize was first cultivated in Mexico at about the same time as wheat was first grown in the Fertile Crescent. Whilst other grains such as wheat and rice have obvious wild relatives, there’s no wild plant that looks like maize, with soft, starchy kernels arranged along a cob. The abrupt appearance of maize in the archaeological record baffled scientists. Through the study of genetics, we now know that maize’s wild ancestor is a grass called teosinte. Teosinte does not look much like maize, especially when you compare its kernels to those of maize, but at DNA level the two are surprisingly alike. They have the same number of chromosomes and a remarkably similar arrangement of genes. In fact, teosinte is able to cross-breed with modern maize varieties to form maize-teosinte hybrids that can go on to reproduce naturally.

Of course pasture grasses also significantly contribute to human nutrition. However, this week I’m more interested in amenity grasses. It’s the US Masters Golf tournament, held every year on the same course in Augusta on the border between Georgia and South Carolina. It’s a place that I’d always wanted to visit since watching the tournament on TV as a child. We actually managed to attend in 2011 and 2013 and the course is stunningly beautiful and the grass is perfect even in parts of the course which only golfers of my aptitude visit. So it’s now a four-day lock-down until the final putt drops.


