Posted on 17/04/2014 by Jim Orson
Recently I’ve started to go to the supermarket with my wife where we’ve developed a systematic approach to the food shop. Initially we visit the wine department and then I go off to the cafeteria for an Americano (which used to be ordinary coffee) whilst my wife chooses the food. I then turn up to help bag it and get it to the car. It’s perhaps not what one would call total participation but it is an advance from total zero.
Whilst sipping coffee in the cafeteria this morning I read The Times. There were two articles that particularly interested me - one was on invasive species and the other was on research by the University of Reading showing that adding oilseed supplements to the cows’ diet resulted in milk with the same overall amount of fats but 25% less saturated fats.
As you know, we have been bombarded for years with statements that saturated fat is bad for us, so this could be good news for the dairy industry. The preoccupation with saturated fats has meant that we have long been told that dairy products may actually be bad for our health. Now, after some scientifically based studies, it appears that those who eat or drink the most dairy products are less likely to suffer from cardiovascular problems than those who consume the least. Therefore, it seems that focussing just on the saturated fats in milk has resulted in advice that may have been deleterious to our health.
It’s always a danger to select just one factor within a complex system and thus jump to conclusions. This isn’t really science and, in this case in particular, is not advantageous to human health.
There are similarities with issues that have arisen in arable agriculture. Scientists, advisers and farmers sometimes latch onto a single simple factor and think that it will determine the outcome of a very complex system, only to be proven wrong in field trials.
Systems do not come much more complex than the soil, and yet we seem hooked on simple indicators. To tell you the truth, perhaps that’s all we can do but at least we should be wary of these guidelines.
One generally held view, at least amongst soil scientists and very much less so amongst practitioners, is that the level of Soil Mineral Nitrogen (SMN) in the early spring has a profound and predictable impact on the level of applied nitrogen required to optimise yields. Field trials show that this is clearly not true. There may be a broad relationship but field trials suggest that SMN, at the levels which occur in long-term arable soils not receiving organic manures, appears to have a limited influence on optimum levels of applied nitrogen.
The same appears to be true of the seemingly logical statement that higher yielding crops require more nitrogen. Field trials show that there is either no relationship or a weak relationship between yields and nitrogen requirement for feed wheat. I personally think that there may be a weak relationship between optimum yield levels and nitrogen requirement for feed wheat, but it may amount to no more than a few kg of nitrogen for each additional tonne/ha above average yields. Bread wheat is another matter: the complexity of its nitrogen nutrition means that previous farm experience is necessary to fine-tune the dose needed to achieve the required protein content in high yielding crops.
One of the worst examples of jumping to conclusions that I have come across was when Calixin (tridemorph) was first introduced in the 1970s, initially to control mildew in spring barley. BASF said that it should be applied around the first node stage of the crop, but the more academic plant pathologists said that protecting the flag leaf was so important that it should be applied at flag leaf emergence. BASF was relying on field trials and the academics were relying on their knowledge of crop physiology. Guess who was correct.
NIAB TAG had a similar experience a few years back. When strobilurins were first introduced, we ran an HGCA-funded project to identify the optimum time to apply them to winter wheat. The results were that the optimum timings to use two applications of strobilurins within an overall three spray programme were at T1 (final leaf three emerged) and T3 (mid-flowering). This happened consistently in all the trials over two or three years. Despite this, the standard advice at the time was that T2 (flag leaf fully emerged) must be the most important timing for the strobilurins and that they should either be applied at T1 and T2 or at T2 and T3. However, as we finalised the project, septoria developed resistance to the strobilurins and the results were quietly forgotten.
Posted on 11/04/2014 by Jim Orson
Earlier this week I was watching TV with our grandchildren. It was a children’s programme on dinosaurs; a subject that captivates our grandson. The presenter said that dinosaurs did not eat grass because they existed before grass existed. This came as a complete surprise to me and so I looked into this a little further.
It appears that the programme was partly right and partly wrong. Dinosaurs almost certainly preceded grasses but by the time they were wiped out at the end of the Cretaceous period (around 66 million years ago) there was already some diversity in grass species, suggesting that grasses too had been around for a few million years.
Grasses now form an essential element of global nutrition, despite having evolved relatively late. Cereals, including wheat, barley, oats, millet, sorghum, maize and rice, now provide just below 50% of the world’s dietary energy. This varies from country to country and continent to continent: in developing countries the average is 55% and in industrialised countries it’s around 35%.
The data demonstrates that reliance on cereals as an energy source for humans declines with increasing affluence. This is reinforced by the fact that in Africa, this reliance has fallen from over 60% to 55% over the last few decades. Increasing affluence can also explain the much reduced reliance on rice in favour of wheat in many developing countries.
Another grass not included in the above list is sugar cane which produces around 75% of the world’s sugar supply in addition to being a major feedstock for ethanol production.
In terms of the history of the world, most of these crops were developed only a couple of ticks ago. I’ve written before about the chance hybridisation of three grass species around 10,000 years ago that resulted in what we now call wheat. This occurred in the Fertile Crescent, which includes Israel, Lebanon, Syria, Iran and Iraq. It took around 3,000 years before wheat was cultivated in France and another 1,000 years before any production in the UK.
Genetic evidence has shown that rice originates from a single domestication in China 8,200–13,500 years ago. It was introduced to Europe through Western Asia, and to the Americas through European colonisation. There is a huge area in the Po Valley in Italy that’s devoted to paddy rice and I’ve witnessed its production next to the Murray River in Australia so it remains a crop grown on a global scale.
Maize was first cultivated in Mexico at about the same time as wheat was first grown in the Fertile Crescent. Whilst other grains such as wheat and rice have obvious wild relatives, there’s no wild plant that looks like maize, with soft, starchy kernels arranged along a cob. The abrupt appearance of maize in the archaeological record baffled scientists. Through the study of genetics, we now know that maize’s wild ancestor is a grass called teosinte. Teosinte does not look much like maize, especially its kernels, but at the DNA level the two are surprisingly alike. They have the same number of chromosomes and a remarkably similar arrangement of genes. In fact, teosinte can cross-breed with modern maize varieties to form maize-teosinte hybrids that can go on to reproduce naturally.
Of course pasture grasses also significantly contribute to human nutrition. However, this week I’m more interested in amenity grasses. It’s the US Masters Golf tournament, held every year on the same course in Augusta on the border between Georgia and South Carolina. It’s a place I’d wanted to visit ever since watching the tournament on TV as a child. We actually managed to attend in 2011 and 2013: the course is stunningly beautiful and the grass is perfect, even in the parts of the course which only golfers of my aptitude visit. So it’s now a four-day lock-down until the final putt drops.
Posted on 03/04/2014 by Jim Orson
We have just spent the weekend in Dorset and yesterday morning our car, like many others, was covered in Saharan dust. It’s no surprise that disease spores can travel hundreds or even thousands of miles if larger dust particles can move with such ease.
Whilst driving back last night, I noticed smoke initially rising from a bonfire and then, rather than spreading out, travelling ribbon-like horizontally a few metres above the ground before eventually falling back to the ground some distance away. This is typical of a temperature inversion, which can occur on still evenings after a warm spring day. Such conditions can strongly suppress the dispersion not only of smoke but also of other airborne particles, such as pesticide spray.
The condition is called an inversion because the air gets warmer rather than cooler with height (i.e. the inverse of what typically occurs). In the UK, it is most likely to take place in spring during the evening and night, when the air warmed by a sunny day sits above a rapidly cooling ground and there is little wind to mix the layers.
My first practical experience of this was one April when spraying plots. Spray opportunities were rare and we had to take the opportunity to spray at dusk. The day had been warm and sunny but windy. Towards dusk the wind dropped completely and so we sprayed. As I finished the final plot, I turned round to see some of the spray just hovering above it. What proportion I’m not sure but the treatment worked. However, even a small proportion of the spray moving elsewhere is not desirable.
The largest UK incident of an inversion causing problems with spray drift was in 1979. Brittox, a product mix of bromoxynil, ioxynil and mecoprop esters, dominated spring broad-leaved weed control in cereals. Again it was a difficult spring for spraying and opportunities to do so were limited to late evening and early morning. The result was widespread damage to oilseed rape. On many occasions the oilseed rape in the adjoining field was untouched but a rape crop further away in the same direction could have discrete patches of damage. This was eventually pinned down to vapour of the mecoprop ester being ‘trapped’ by the inversion and moving above the adjoining crop before coming down on a crop further away.
There were two implications arising from these experiences. Firstly, esters of mecoprop were withdrawn and replaced by the less volatile salt formulations. Secondly, the enthusiasm for spraying at night came to an abrupt halt. Tramlines had been introduced in the 1970s and after a few difficult springs, many farmers were contemplating using them to guide the spraying operation at night in order to take advantage of low wind conditions. However, the implicit danger from inversion meant that the problems of spray drift could be higher than when spraying in windier conditions during the day.
The UK is less prone to inversions than some other countries. A few years ago I was travelling with a crop consultant in Australia when his phone rang. A farmer wanted to know how to kill wild melons (they are inedible) in the summer fallow period. He wanted an alternative to glyphosate and the initial advice was 2,4-D. However, almost as an afterthought, the consultant asked how far it was from the paddock (field!) to be sprayed to the nearest vineyard. Apparently, at that time of the year, there had to be a buffer of 35 km (yes, 35 km!) between spraying 2,4-D and vineyards. Contamination of vineyards (and potentially wine) by herbicide spray drift in inversion conditions remains a major issue in Australia.
The farmer wanted to kill the wild melons and other green vegetation because he was trying to store as much water as possible in the soil to maximise the supply to the autumn sown crops. When I first worked in Australia there were two schools of thought on this issue. The first, and the one which now prevails, is that all vegetation between the harvest of one crop and the sowing of the next should be destroyed using herbicides and the previous crop stubble left undisturbed to prevent wind erosion. The alternative traditional approach had been to cultivate regularly to a shallow depth. The argument for that was not just to prevent vegetation from establishing but also to stop capillary flow of water to the surface layers.
The second approach has not only been proved inferior in terms of water storage in the soil, but also encourages frightening levels of wind erosion. I remember sitting in a plane at Melbourne Airport just over 10 years ago, after witnessing a huge amount of cultivation and consequent wind erosion the day before. The amount of red dust in the air was blotting out much of the daylight and I was genuinely concerned that the jet engines would become clogged. After that, a sprinkling of Saharan dust seems trivial.
Posted on 24/03/2014 by Jim Orson
The Cambridge Science Festival is an amazing event. Over a fortnight there are hundreds of events on all aspects of science. Many of them have very catchy titles and appear extremely interesting and it’s clear that quite a few are aimed at children, including NIAB's own stand in the Plant Science Marquee talking about roots and shoots (see picture)! What a shame that I might stand out if I try one of these events.
Last Saturday I attended a panel discussion (for adults) on the publication and communication of science. I was the oldest person in the 200-plus audience! The peer review system, where a paper is checked before publication by two or three other scientists, has been the established method of trying to ensure the integrity of science for around 300 years. Generally it’s stood the test of time well but it’s certainly not perfect. For the author it can lead to frustration, particularly when the reviewers, usually anonymous, provide conflicting advice and comments. Then there are the re-writes to endure.
However, once through this process, the author can proudly say that he or she has a peer reviewed paper. There’s a hierarchy of journals in which to publish and scientists aim to submit papers to so-called high impact publications that only include the best of science. Down the pecking order are numerous journals that will have peer reviewed papers but there is a suspicion that some appoint reviewers who are less than forensic and who share the same views as the author.
A few years ago, a peer reviewed journal published a paper saying that organic agriculture produced similar or higher yields than conventional agriculture and could easily feed the world. It claimed to be a review of all relevant papers where organic and conventional systems were compared. The organic movement was triumphant. It must be true, they said, because it is in a peer reviewed paper.
The conclusions of this paper were a shock to anyone who had experience of the relative production levels of the two systems and prompted an American researcher to check the sources of information used in this review. His findings were shocking.
Over 100 of the papers included in the review described only a comparison between different conventional systems and not a comparison between organic and conventional systems. Also, organic yields were misreported and false comparisons were made to unrepresentatively low conventional yields. Where reasonable organic yields were achieved, they were counted two, three, even five times by citing different papers that referenced the same data. Finally, favourable and unverifiable “studies” from biased sources were given equal weight to rigorous studies. I must admit that when I read this critique, my faith in the peer review system plummeted.
There has also been a recent paper suggesting that GM ‘causes cancer’; it has had to be withdrawn because of weaknesses in the journal’s peer review system.
Trust in science is absolutely vital. The panel discussion I attended pointed towards a way that increases trust. Not surprisingly, it involves the dramatic innovation that has occurred since the peer review system started 300 years ago - the Internet.
The panel predicted that science will slowly move to a system along the following lines: a draft paper is submitted to a journal and reviewers, perhaps not anonymous this time, are appointed; the draft paper, together with the reviewers’ comments and the data generated in the project, then appears on the journal’s website for anyone to review and comment on.
This whole process could take a year before a final version of the paper is agreed. Such an approach would have prevented the publication of the examples that I have quoted above.
The same approach should now be contemplated by Defra and the UK levy organisations. Recently I aired my misgivings about the conclusions of the Defra project NT26, in which the efficacies of ammonium nitrate and urea were compared (‘Up in the air’).
I feel that a more open discussion should have taken place before the final conclusions were widely promulgated. The primary question should have been why the project concluded that the nitrogen in urea was, on average, taken up by the crop 20% less efficiently than that in ammonium nitrate, when a major review of UK trials data, published in a peer reviewed journal in the 1990s, suggested that the difference was 3.5%. I also have concerns about the lack of open debate on some of the conclusions in some levy-funded reports.
I think that the era of improved electronic communications will mean that the trend towards more openness and greater transparency will prove unstoppable and should lead us to say with more confidence that in science we trust.
Posted on 16/03/2014 by Jim Orson
As soon as spring gathers pace the trite rhyme ‘spring has sprung, the grass is ris, I wonders where the birdies is?’ comes into my mind (much to the annoyance of our middle daughter Emily). This year it has more relevance because the numbers of birds in our central Cambridge garden are the lowest I can remember. This is an unexpected observation after such a warm winter when there was plenty of bird food available. The berries in the garden are almost untouched.
The numbers have been falling for years whilst the numbers of magpies and hawks have increased. There is a history of bird eggshells on the lawn and I’ve witnessed hawks hunting. However, the number of magpies has fallen locally over the last couple of years, possibly because they were running out of victims’ nests and have gone elsewhere.
I am not alone with these observations and neighbours are questioning why they are feeding birds. This raises the possibility that success in encouraging bird numbers should be measured not by their numbers but by the prevalence of their enemies, such as magpies and hawks. On this basis, the measures that have been taken to increase birds locally over the last few decades have been a great success!
On farms there are additional hazards for ground nesting birds; these include badgers. They commonly hoover up eggs and chicks in the areas established to provide habitat for ground-nesting birds and in nearby skylark plots in crops. We all know in which direction badger numbers are going.
However, I do recognise that changes in arable agriculture, which have resulted in a dramatic increase in food production, have contributed to the fall in farmland birds. One generally recognised reason was the switch from spring to winter cropping in the 1970s and 1980s. Spring cereals provided a habitat for some farmland birds and the weeds in the crop were a source of food, either as hosts to insects and/or as seed.
The trend in some areas is now towards more spring cropping because of black-grass resistance to herbicides. So, logically, should this be good for farmland birds? It may be helpful but the management of spring cereals is not what it was forty years ago.
In the 1960s and 1970s, the land was ploughed for spring barley after the winter cereals were sown in mid-October. It was a steady winter job. The stubbles being ploughed also had recently shed crop and weed seeds on the soil surface, so birds like the skylark had large areas to rest in and a food source away from field edges. This is no longer true: the land is now prepared for spring cereals earlier in the autumn, when soil conditions are generally drier, and improved harvesting techniques and cleaner crops mean that there is less crop and weed seed around.
In addition, spring cereals are now sown a lot earlier, as soon as conditions are suitable after mid-November, and higher seed rates are used where there is a requirement to compete with black-grass. Generally more nitrogen is now being applied although some heavy-land farmers have been adopting modest doses in order to produce malting quality spring barley. This all means that the relatively open spring cereal crops, suitable habitats for the later clutches of skylarks, are now less likely to occur. In addition, the standard of weed control is now higher than it was a few decades ago. Some annual broad-leaved weed species that tend to emerge in the spring are particularly beneficial to farmland birds, e.g. knotgrass and fat hen. However, in situations where spring cereals are being adopted because of black-grass, the soil seed bank will be largely devoid of these species after many years of winter cropping.
This all suggests that spring cereals will not provide the habitat and food for birds that were provided in the 1960s and 1970s. However, it may be that I am being unduly pessimistic.
There are a number of other trite rhymes that come into my head on occasions (no comments please!). One is ‘the bleedin’ sparrer’, which was obviously written in the days when sparrowhawks were controlled. Sparrow numbers have fallen dramatically over the last few years, probably for a number of reasons, including more predation, less hay-fed livestock and the exclusion of birds from agricultural buildings. One statistician even found a relationship between the increasing adoption of lead-free petrol and the decreasing number of sparrows. I have commented many times about scientists putting lines through charts and then believing the line represents the truth; this must be a prime example where great care is needed in the interpretation.