Crops of winter wheat (Triticum aestivum L. cv. Hereward) were grown within temperature gradient tunnels at a range of temperatures at either c. 350 or 700 μmol mol(-1) CO2 in 1991/92 and 1992/93 at Reading, UK. At terminal spikelet stage, leaf area was 45% greater at elevated CO2 in the first year due to more tillers, and was 30% greater in the second year due to larger leaf areas on the primary tillers. At harvest maturity, total crop biomass was negatively related to mean seasonal temperature within each year and CO2 treatment, due principally to shorter crop durations at the warmer temperatures. Biomass was 6-31% greater at elevated compared with normal CO2 and was also affected by a positive interaction between temperature and CO2 in the first year only. Seed yield per unit area was greater at cooler temperatures and at elevated CO2 concentrations. A 7-44% greater seed dry weight at elevated CO2 in the first year was due to more ears per unit area and heavier grains. In the following year, mean seed dry weight was increased by > 72% at elevated CO2, because grain numbers per ear did not decline with an increase in temperature at elevated CO2. Grain numbers were reduced by temperatures > 31 degrees C immediately before anthesis at normal atmospheric CO2 in 1992/93, and at both CO2 concentrations in 1991/92. To quantify the impact of future climates of elevated CO2 concentrations and warmer temperatures on wheat yields, consideration of both interactions between CO2 and mean seasonal temperature, and possible effects of instantaneous temperatures on yield components at different CO2 concentrations, is required. Nevertheless, the results obtained suggest that the benefits to winter wheat grain yield from CO2 doubling are offset by an increase in mean seasonal temperature of only 1.0 degrees C to 1.8 degrees C in the UK.
Field experiments were conducted to test the seed yield responses of winter oilseed rape (Brassica napus L., cvs Libravo or Falcon) to the addition of different rates of S fertilizer, at three N application rates, on a sandy loam at Woburn, Bedfordshire, in 1990/91, 1991/92 and 1993/94. Large increases in seed yields, ranging from 0.7 to 1.6 t/ha, or 42-267% on a relative scale, were obtained in response to the application of 40 kg S/ha with 180 and 230 kg N/ha treatments. The effects of S were highly significant in 1991/92 (P < 0.01) and 1993/94 (P < 0.001) and close to significant (P = 0.053) in 1990/91. The yield benefits were obtained mainly from the application of the first 10 kg S/ha and further yield increases were unlikely above 40 kg S/ha. Increasing N application from 180 to 230 kg/ha decreased seed yield in 1990/91 and 1993/94, when no S was applied. In contrast, seed yield was not increased by S at zero or low (50 or 100 kg/ha) N rates. The interactions between N and S on seed yield were significant (P < 0.05) in 1990/91 but not in the other two seasons. Application of S also increased seed oil content in 1993/94, when the degree of S deficiency was particularly severe. With an application of 230 kg N/ha, the crops took up 5-22 kg S/ha at maturity when no S was applied and 26-51 kg S/ha when 40 kg S/ha was applied. The utilization efficiency of the fertilizer S ranged from 50 to 73% in the three seasons. Although the concentrations of total N in plants were largely unaffected by S treatments, large amounts of NO3-N accumulated in the leaves of S-deficient plants in 1993/94. This indicates that N metabolism was disrupted by S deficiency. The concentrations of S and the N:S ratios in different tissues and the whole plant changed considerably with time. The concentration of S in leaves at early flowering was found to be the best index in predicting S deficiency in terms of seed yield, and a critical value of 3.8 mg/g was obtained. 
In comparison, the N:S ratio in leaves at early flowering was a much poorer predictor of S deficiency.
A grazing experiment was conducted for 8 weeks in the spring/summer of 1993 at Palmerston North, New Zealand, to study the effects of condensed tannins (CT) in Lotus corniculatus (birdsfoot trefoil; cv. Grasslands Goldie) upon the lactation performance of ewes rearing twin lambs. Effects of CT were evaluated by studying the responses of ewes to twice daily oral supplementation with polyethylene glycol (PEG; MW 3500), which binds and inactivates CT. A rotational grazing system with restricted feed allowance was used. Measurements were made of pre- and post-grazing herbage mass, the composition of the feed on offer and diet selected, voluntary feed intake (VFI), milk yield and composition, liveweight gain and wool production. The concentration of metabolites in rumen fluid and in blood plasma was also measured. Lotus contained 35.5 g total nitrogen and 44.5 g total CT/kg dry matter in the diet selected, with an in vitro digestibility of 73%. At peak lactation (weeks 3 and 4) milk yield and composition were similar for control (CT-acting) and PEG-supplemented (CT-inactivated) ewes but, as lactation progressed, the decline in milk production and in the secretion rates of protein and lactose were less for control than for PEG-supplemented ewes. In mid and late lactation (weeks 6-11), control ewes secreted more milk (21%), more milk protein (14%) and more lactose (12%) than PEG-supplemented ewes. Milk fat percentage was lower for control than for PEG-supplemented ewes, but secretion rates of fat were similar for the two groups. VFI, liveweight gain and wool growth were similar for both groups. Plasma urea and glucose concentrations were lower for control than for PEG-supplemented ewes, but concentrations of non-esterified fatty acids (NEFA), growth hormone and insulin were similar for the two groups. 
The concentrations of ammonia and molar proportions of iso-butyric, iso- and n-valeric acids in rumen fluid were lower for control than for PEG-supplemented ewes; molar proportions of acetic, propionic and n-butyric acids were similar for the two groups. It was concluded that for ewes rearing twin lambs grazing L. corniculatus, the action of CT increased milk yield and the secretion rates of protein and lactose without affecting VFI, thereby increasing the efficiency of milk production. The increased milk production did not appear to be mediated by effects on plasma concentrations of growth hormone or insulin.
A grazing experiment, conducted for 22 weeks in 1992/93 at Aorangi Research Station, AgResearch Grasslands, Manawatu, New Zealand, compared the productivity of weaned lambs grazing Lotus corniculatus (birdsfoot trefoil) and lucerne (Medicago sativa). Effects of condensed tannins (CT) in lotus were evaluated by studying the responses of lambs to twice daily oral supplementation with polyethylene glycol (PEG). A rotational grazing system with restricted feed allowance was used. Measurements were made of pre- and post-grazing herbage mass, the composition of the feed on offer and diet selected, voluntary feed intake (VFI), liveweight gain (LWG), carcass growth, wool growth and the concentration of metabolites in rumen fluid. For both lotus and lucerne swards, the diet selected was mainly leaf. Lotus contained 34 g total CT/kg dry matter in the diet selected, whilst there were essentially no CT in lucerne. Compared to lambs grazing lucerne, lambs grazing lotus had slightly lower VFI, and higher LWG, carcass weight gain, carcass dressing-out percentage and wool growth. PEG supplementation had no effect on these measurements or upon the composition of rumen fluid in lambs grazing lucerne. However, in lambs grazing lotus, PEG supplementation reduced wool growth (10.9 v. 12.1 g/day), slightly reduced LWG (188 v. 203 g/day), increased rumen ammonia concentration, and increased the molar proportions of iso-butyric, isovaleric and n-valeric acids and protozoa numbers in rumen fluid. PEG supplementation did not affect carcass gain, carcass fatness or the molar proportion of rumen acetic, propionic or n-butyric acids in lambs grazing lotus. 
It was concluded that the principal effect of CT in growing lambs grazing lotus was to increase wool growth without affecting VFI, thereby increasing the efficiency of wool production, that the greater rate of carcass gain of lambs grazing lotus than those grazing lucerne was mainly caused by factors other than CT and that CT did not affect the rumen fermentation of carbohydrate to major volatile fatty acids.
Ammonia volatilization and denitrification were measured in a ryegrass field in Denmark after direct injection and application with trail hoses of an untreated cattle slurry and an anaerobically digested slurry in late May-early June 1993 and 1994. Ammonia volatilization was measured using a wind-tunnel system for a period of 8 days after slurry application. Denitrification was measured for a period of 21 days after slurry application. In an adjacent field experiment, nitrogen-uptake (N-uptake) was determined in the first two cuts of the ryegrass harvested after slurry application. N losses through ammonia volatilization were larger in 1993 than in 1994 due to differences in climatic conditions. Ammonia volatilization was lowered substantially (47-72%), when slurry was injected compared with surface application. In 1993 the loss from surface-applied digested slurry was only 35% of total ammoniacal nitrogen (TAN), while the loss from the raw slurry was 47%. There were no significant differences in ammonia volatilization from the two slurry types in the other experiments. N losses through denitrification were low (<2% of TAN), but there were clear differences in the losses, depending on slurry type, application method and experimental year. Injection of the slurry gave a larger N-uptake in the first cut of grass compared to the trail-hose application. In 1993 the digested slurry gave significantly larger N-uptake in the first cut than the raw slurry.
In laboratory tests using stable manure consisting of wheat straw and slurry, ammonia emission was found to have two peaks corresponding to the population dynamics of proteolytic bacteria and amino acid-degrading bacteria respectively. Cumulative ammonia emissions over 14 days were 0.8-23.2% of the initial total nitrogen (N-t) and were both abiotically and biotically induced. Changes in pH had the most significant effect on the abiotically induced ammonia emissions. After 14 days of decomposition, at pH values of 6.0 and 7.5, abiotically induced emissions remained close to the limit of detectability, whereas at pH 9.0 as much as 9.8% of the initial N-t was lost. An increase in storage pressure from 0 to 400 and 800 kp/m(2) generally decreased the biotic emissions to 9.6, 2.8 and 2.3%; while increasing the amounts of litter (2.5, 5.0 and 15.0 kg straw/LAU per day) led to a decline not only in the biotic (17.1, 12.8, 3.5%) but also in the abiotic emissions (6.1, 5.5, 1.6%). Varying the temperature (20, 30 and 40 degrees C) resulted in biotically induced emissions of 7.9, 11.7 and 11.6%, respectively, and abiotically induced emissions of 1.1, 1.4 and 2.2% of the initial N-t. At temperatures of 30 and 40 degrees C, the amount of microbially digested sources of carbon available was obviously sufficient to permit almost total reincorporation of NH4+ from 4 days onwards.
The Broadbalk Wheat Experiment at Rothamsted (UK) includes plots given the same annual applications of inorganic nitrogen (N) fertilizer each year since 1852 (48, 96 and 144 kg N/ha, termed N-1, N-2 and N-3 respectively). These very long-term N treatments have increased total soil N content, relative to the plot never receiving fertilizer N (N-0), due to the greater return of organic N to the soil in roots, root exudates, stubble, etc. (the straw is not incorporated). The application of 144 kg N/ha for 135 years has increased total soil N content by 21%, or 570 kg/ha (0-23 cm). Other plots given smaller applications of N for the same time show smaller increases; these differences were established within 30 years. Increases in total soil N content have been detected after 20 years in the plot given 192 kg N/ha since 1968 (N-4). There was a proportionally greater increase in N mineralization. Crop uptake of mineralized N was typically 12-30 kg N/ha greater from the N-3 and N-4 treatments than the uptake of c. 30 kg N/ha from the N-0 treatment. Results from laboratory incubations show the importance of recently added residues (roots, stubble, etc.) on N mineralization. In short-term (2-3 week) incubations, with soil sampled at harvest, N mineralization was up to 60% greater from the N-3 treatment than from N-0. In long-term incubations, or in soil without recently added residues, differences between long-term fertilizer treatments were much less marked. Inputs of organic N to the soil from weeds (principally Equisetum arvense L.) to the N-0 to N-2 plots over the last few years may have partially obscured any underlying differences in mineralization. The long-term fertilizer treatments appeared to have had no effect on soil microbial biomass N or carbon (C) content, but have increased the specific mineralization rate of the biomass (defined as N mineralized per unit of biomass). 
Greater N mineralization will also increase losses of N from the system, via leaching and gaseous emissions. In December 1988 the N-3 and N-4 plots contained respectively 14 and 23 kg/ha more inorganic N in the profile (0-100 cm) than the N-0 plot, due to greater N mineralization. These small differences are important as it only requires 23 kg N/ha to be leached from Broadbalk to increase the nitrate concentration of percolating water above the 1980 EC Drinking Water Quality Directive limit of 11.3 mg N/l. The use of fertilizer N has increased N mineralization due to the build-up of soil organic N. In addition, much of the organic N in Broadbalk topsoil is now derived from fertilizer N. A computer model of N mineralization on Broadbalk estimated that after applying 144 kg N/ha for 140 years, up to half of the N mineralized each year was originally derived from fertilizer N. In the short-term, the amount of fertilizer N applied usually has little direct effect on losses of N over winter. In most years little fertilizer-derived N remains in Broadbalk soil in inorganic form at harvest from applications of up to 192 kg N/ha. However, in two very dry years (1989 and 1990) large inorganic N residues remained at harvest where 144 and 192 kg N/ha had been applied, even though the crop continued to respond to fertilizer N, up to at least 240 kg N/ha.
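The link between a quantity of N leached and the resulting concentration in drainage water follows from simple unit arithmetic: 1 kg N/ha dissolved in 1 mm of drainage over 1 ha (10,000 litres) gives 100 mg N/l. A minimal sketch of this conversion (the drainage volume used in the example is an illustrative assumption, not a value reported in the study):

```python
def drainage_n_concentration(n_leached_kg_ha: float, drainage_mm: float) -> float:
    """Mean N concentration (mg N/l) of percolating water.

    1 mm of drainage over 1 ha is 10,000 l, and 1 kg is 1e6 mg,
    hence the factor of 100.
    """
    return 100.0 * n_leached_kg_ha / drainage_mm

# Assuming an illustrative winter drainage of 200 mm, leaching 23 kg N/ha
# gives a mean concentration of 11.5 mg N/l, just above the 11.3 mg N/l
# EC Drinking Water Quality Directive limit cited above.
print(drainage_n_concentration(23, 200))  # 11.5
```

The same arithmetic explains why the 11.3 mg N/l limit is equivalent to the 50 mg/l nitrate (NO3) limit: 50 divided by the NO3:N mass ratio of 62/14 gives c. 11.3.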
Agronomic practices can be modified to decrease autumn soil nitrate and nitrate leaching. This experiment aimed to measure the effectiveness of such practices when integrated into a farming system under UK conditions. The experiment started in autumn 1988 on a sandy soil in Nottinghamshire, UK, and comprised a four-course rotation of potatoes-cereal-sugarbeet-cereal. Three husbandry systems were superimposed, ranging from current commercial practice to most nitrate retentive. Plots were split further to receive either half or full recommended rates of nitrogen (N) fertilizer. Soil mineral N (Nmin) and nitrate leaching (using porous ceramic cups) were measured on selected treatments; this paper presents the findings after five winters. Autumn Nmin and N leached were strongly influenced by the previous crop, consistently following the order potatoes > cereal > sugarbeet. Pre-harvest management (chiefly N fertilizer input) affected Nmin, and post-harvest management also modified N loss. Cover crops (winter rye or forage rape) after cereals removed 10-30 kg/ha N, depending on previous N management, time and method of establishment. They decreased leaching and were particularly effective if they were able to establish fully before significant drainage occurred. Nmin following sugarbeet, which had received 125 kg/ha N, was less after November lifting than after October lifting (16 and 28 kg/ha N, respectively, as a mean of autumns 1989-92). Potatoes left most Nmin (a mean of 60 kg/ha for autumns 1989-92, receiving 220 kg/ha fertilizer N), and their late harvest gave little scope for decreasing leaching losses by establishing green cover before the start of winter. After late harvested root crops (both beet and potatoes), it was often preferable to leave the land fallow over winter, rather than ploughing and drilling a winter cereal. We show that nitrate leaching can be decreased by simple and inexpensive modifications to an existing crop rotation. 
Averaged over five winters, adopting such practices decreased the mean N concentration in drainage from 22.3 to 14.5 mg/l.
Liming is often recommended to minimize the plant uptake of potentially toxic elements from sludge-amended soils. In outdoor experiments conducted during 1989-91 in a rural location, near Brentwood (UK), wheat, carrots and spinach were grown on soils from a wide range of sites previously amended with heavy applications of sewage sludge. The objective of these studies was to examine the effect of liming on the accumulation of sludge-borne metals in the crop plants. The results showed that liming the soils to pH 7 prior to sowing significantly reduced metal concentrations in carrots and spinach, although the reduction appeared to be greater for Cd, Ni and Zn than for Cu and Pb. The wheat crop was grown on soils which had been limed 2 years previously, and the average pH of these soils was 6.5 compared to a pH value of 5.95 in the unlimed soils. This comparatively small pH difference between limed and unlimed soils (6.50 v. 5.95) generally had little influence on metal contents in wheat. These results suggested that maintaining the soil at pH 7 is better than pH 6.5 for minimizing the accumulation of potentially toxic elements from soils which have received relatively high levels of sludge application over many years. The data for winter wheat suggested either that metal uptake into the grain was not sensitive to differences in soil pH or that a relatively small residual effect of past liming was not high enough to reduce metal uptake.
An experiment was conducted at Palmerston North, New Zealand, to determine the effect of condensed tannins (CT) on the true and apparent digestion of methionine and cysteine in the small intestine (SI) of sheep fed fresh Lotus corniculatus. The lotus contained c. 30 g total CT/kg dry matter (DM) and was fed hourly to sheep in metabolism crates. Four sheep were prepared with rumen and abomasal cannulae which enabled the indigestible liquid phase marker, chromium ethylene diamine tetra-acetic acid (Cr-EDTA), to be infused into the rumen to estimate digesta flow. True digestibility of plant methionine and cysteine in the SI and their site of absorption in the SI were determined from S-35-labelled L. corniculatus homogenate continuously infused into the abomasum. After 9 h infusion of the S-35-labelled lotus homogenate, the sheep were slaughtered and digesta samples were taken at intervals along the small and large intestines. The effect of CT was determined by comparing two control sheep (CT-acting) with two sheep given a continuous intraruminal infusion of polyethylene glycol (PEG, MW 3500) to bind and inactivate the CT. The CT reduced the true digestibility of plant methionine (0.72 v. 0.88) and cysteine (0.65 v. 0.81) in the SI relative to sheep receiving PEG. Condensed tannins also appeared to alter the site of digestion of both [S-35]methionine and [S-35]cysteine in the SI, and increased the flux of both amino acids in the mid and latter thirds of the SI. CT did not affect the apparent digestibility of total methionine (0.82 v. 0.84) in the SI but reduced the apparent digestibility of total cysteine from 0.77 to 0.66. In control sheep CT increased the abomasal flux (as a proportion of eaten) of total digesta methionine (0.88 v. 0.76) and total digesta cysteine (0.74 v. 0.62). The apparent absorption of total methionine (plant + microbial + endogenous) was increased by the action of CT (0.72 v. 0.63 g/g eaten) but was similar for total cysteine (0.49 v. 
0.48 g/g eaten) in both groups. It was concluded that CT reduced the true digestibility of plant methionine and cysteine in the SI. However, it was calculated that the action of CT actually increased the total amounts (g/g eaten) of plant methionine and cysteine absorbed from the SI, due to its effect in increasing abomasal flux.
The effects of overwinter cover cropping, delayed ploughing and method of straw disposal on the quantities of nitrate leached (averaged over three winters during 1989-93) from a chalk loam in Eastern England were examined. The recovery of 'retained' nitrogen (retained through cover crop uptake, delayed ploughing and immobilization by straw) in a following spring crop was also assessed. In the first two winters, the rye cover crop decreased nitrate leaching by > 90% (28 kg N/ha per year), as compared with bare fallow treatments. In 1992/93 this decrease was only 23% (10 kg/ha), due to the early onset of drainage before cover was well established. Delayed ploughing on bare treatments, to decrease autumn N mineralization and subsequent nitrate leaching, was ineffectual in 1989/90 but had substantial effects in 1990/91 and 1992/93; N mineralization, inferred from soil mineral nitrogen content, and nitrate leaching were decreased by 31 and 35% in 1990/91 and by 36 and 61% in 1992/93, respectively. Nitrate leaching (averaged over three winters) was unaffected by straw incorporation. There was no evidence of recovery of cover crop N in the spring sown test crops (barley or sugarbeet). In the low soil N input situation encountered in this experiment, it was unnecessary to sow cover crops before early September in years of average or below average rainfall to ensure that the average soil solution concentrations remained below the EU drinking water limit of 11 mg NO3-N/l. However, in wetter seasons substantial N leaching occurred before cover had taken up much N. In 1992/93 N retained against leaching by a rye cover crop in previous years was apparently being remobilized and lost through leaching, although if cover was grown again there was less leaching than from bare land. In the future, an increase in the extent of cover cropping might increase transpiration rates and therefore lead to a decrease in aquifer recharge.
Differences amongst wheat cultivars in the rate of reproductive development are largely dependent on differences in their sensitivity to photoperiod and vernalization. However, when these responses are accounted for, by growing vernalized seedlings under long photoperiods, cultivars can still differ markedly in time to ear emergence. Control of rate of development by this 'third factor' has been poorly understood and is variously referred to as intrinsic earliness, earliness in the narrow sense, basic vegetative period, earliness per se, and basic development rate. Certain assumptions are made in the concept of intrinsic earliness. They are that differences in intrinsic earliness (i) are independent of the responses of the cultivars to photoperiod and vernalization, (ii) apply only to the length of the vegetative period up to floral initiation (as suggested by several authors), (iii) are maintained under different temperatures, measured either in days or degree days. As a consequence of this, the ranking of cultivars (from intrinsically early to intrinsically late) must be maintained at different temperatures. This paper, by the re-analysis of published data, examines the extent to which these assumptions can be supported. Although it is shown that intrinsic earliness operates independently of photoperiod and vernalization responses, the other assumptions were not supported. The differences amongst genotypes in time to ear emergence, grown under above-optimum vernalization and photoperiod (that is when the response to these factors is saturated), were not exclusively due to parallel differences in the length of the vegetative phase, and the length of the reproductive phase was independent of that of the vegetative phase. Thus, it would be possible to change the relative allocation of time to vegetative and reproductive periods with no change in the full period to ear emergence. 
The differences in intrinsic earliness between cultivars were modified by the temperature regime under which they were grown, i.e. the difference between cultivars (both considering the full phase to ear emergence or some sub-phases) was not a constant amount of time or thermal time at different temperatures. In addition, in some instances genotypes changed their ranking for 'intrinsic earliness' depending on the temperature regime. This was interpreted to mean that while all genotypes are sensitive to temperature they differ amongst themselves in the extent of that sensitivity. Therefore, 'intrinsic earliness' should not be considered as a static genotypic characteristic, but the result of the interaction between the genotype and temperature. Intrinsic earliness is therefore likely to be related to temperature sensitivity. Some implications of these conclusions for plant breeding and crop simulation modelling are discussed.
Iron toxicity is a nutrient disorder associated with high concentrations of iron in soil solutions. Deficiencies of other nutrients, such as P, K, Ca, Mg and Zn, have been implicated in its occurrence in rice plants. Field experiments were carried out in 1992 and 1993 in Ivory Coast to evaluate the iron toxicity tolerance of promising rice cultivars available in West Africa, and to provide additional information for selecting breeding materials. Two sites, differing in their potential to cause iron toxicity, were used. Glasshouse and field studies were also conducted to test the role of other nutrients in the occurrence of iron toxicity. The results showed that genetic tolerance to iron toxicity can significantly improve rice production in iron-toxic soils, with some cultivars producing yields in excess of 5 t/ha. The application of N, P, K and Zn in the field decreased the uptake of iron in rice tops, and this can be a significant factor in the iron-toxicity tolerance of the cultivars.
Soil mineral nitrogen (N-min) was measured to 90 cm at a total of 12 sites in the UK in the autumn after an oilseed rape experiment, which measured responses to fertilizer N. On average, N-min increased by 15 kg/ha per 100 kg/ha fertilizer nitrogen (N) applied to the rape, up to the economic optimum amount of N (N-opt). There were larger increases in N-min where fertilizer applications exceeded N-opt, thus super-optimal fertilizer applications disproportionately increased the amount of nitrate likely to leach over-winter. The small effects of sub-optimal N on N-min were associated with large increases in N offtake by the oilseed rape, whereas the larger effects of super-optimal N on N-min were associated with only small increases in N offtake. Over 70% of the variation in autumn N-min was explained by the previous rape's N fertilizer rate and the topsoil organic matter content. Nitrogen applied to the rape increased grain yields of the succeeding wheat crops when no further fertilizer N was applied to the wheat. It was concluded that N applied to oilseed rape significantly affected N-min after harvest, and these effects were not completely nullified by leaching over-winter, so soil N supply to the succeeding wheat crop was significantly increased. Responses in grain yield indicated that each 100 kg/ha N applied to the rape provided N equivalent to c. 30 kg/ha for the following cereal. Each 1% of soil organic matter further contributed N to the wheat, equivalent to 25 kg/ha. It is important to ensure that oilseed rape receives no more than the optimum amount of fertilizer N if subsequent leaching is to be minimized. Reductions below optimum amounts will have only a small effect on leaching. Substantial changes in the economic optimum N for rape production should be accompanied by adjustment in fertilizer N application to following wheat crops. 
Fertilizer recommendation systems for wheat should take account of the fertilizer N applied to the preceding oilseed rape and the topsoil organic matter content.
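The residual-value relationships reported above can be combined into a simple fertilizer-credit calculation for the succeeding wheat crop. A minimal sketch, using only the two relationships stated in the abstract (c. 30 kg/ha N supplied per 100 kg/ha applied to the preceding rape, and 25 kg/ha per 1% topsoil organic matter); the function name and the example figures are illustrative, not from the study:

```python
def wheat_n_credit(rape_fert_n_kg_ha: float, soil_om_percent: float) -> float:
    """Estimated fertilizer-N equivalent (kg/ha) supplied to a following
    wheat crop: 30 kg/ha per 100 kg/ha N applied to the preceding oilseed
    rape, plus 25 kg/ha per 1% topsoil organic matter."""
    return 0.30 * rape_fert_n_kg_ha + 25.0 * soil_om_percent

# e.g. rape given 200 kg N/ha on a soil with 2% organic matter
# (an illustrative combination):
print(wheat_n_credit(200, 2))  # 110.0 kg/ha
```

Such a credit would be subtracted from the wheat crop's standard recommendation, which is the adjustment the abstract argues recommendation systems should make.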
Stands of bambara groundnut (Vigna subterranea (L.) Verdc.) were grown in five controlled-environment glasshouses at the Tropical Crops Research Unit, University of Nottingham, Sutton Bonington Campus, in 1990. Five soil moisture regimes were imposed (one per house), from fully irrigated each week (treatment A), to no irrigation after crop establishment at 35 days after sowing (DAS) (treatment E). Decreasing the amount of water applied resulted in a decline in total dry matter production and harvest index, and a reduction in pod yield from 4.12 (treatment B) to 0.04 t ha(-1) (treatment E) at 125 DAS. A maximum leaf area index of 5.4 was achieved by treatments B and C at 90 DAS, resulting in a fractional interception of c. 0.8 of incoming radiation. Total accumulated radiation interception values were 749, 693, 688, 618 and 554 MJ m(-2) for treatments A, B, C, D and E, respectively. The efficiency of conversion of the radiation intercepted into dry matter was reduced from 1.41 to 0.50 g MJ(-1) by drought.
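The radiation figures above are linked by the standard relationship: total dry matter equals accumulated intercepted radiation multiplied by the conversion (radiation-use) efficiency. A minimal sketch; pairing the extreme interception totals with the extreme efficiencies is an illustrative assumption, since the abstract does not report total dry matter per treatment:

```python
def total_dry_matter_t_ha(intercepted_mj_m2: float, rue_g_mj: float) -> float:
    """Above-ground dry matter (t/ha) from accumulated intercepted
    radiation (MJ/m^2) and radiation-use efficiency (g DM/MJ).
    1 g/m^2 equals 0.01 t/ha."""
    return intercepted_mj_m2 * rue_g_mj * 0.01

# Using the extremes reported in the abstract (combination illustrative):
print(total_dry_matter_t_ha(749, 1.41))  # c. 10.6 t/ha, well-watered
print(total_dry_matter_t_ha(554, 0.50))  # c. 2.8 t/ha, droughted
```

The sketch makes clear that drought reduced yield through both terms: less radiation intercepted (smaller, earlier-senescing canopy) and a lower efficiency of converting what was intercepted.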
The changes in the buffer components and pH in the surface layer of a pig and a cattle slurry were studied in the laboratory of the Department of Soil Science, Lincoln University in 1994. The slurries were spread to a depth of 7 mm in Petri dishes open to the atmosphere. Slurry pH, total inorganic carbon (TIC = CO2 + HCO3- + H2CO3), total ammoniacal nitrogen (TAN = NH3 + NH4+) and volatile fatty acids (VFA = C-2-C-5 acids) were determined at 8-10 intervals after 1-96 h of incubation at 10, 16 and 22 degrees C. A large increase in pH over the first 8 h was due to the release of CO2. If the initial TIC > TAN, pH then increased steadily but slowly from 8 to 96 h. When the initial TIC < TAN, the pH declined or did not change after 20 h incubation. The initial pH elevation rate increased with temperature and initial concentration of TIC. Calculations indicated that the NH3 partial pressure (P-NH3) in equilibrium with the slurry increased and pH decreased at increasing temperature if gases could not exchange between the slurry and the atmosphere. In the open slurry system, P-NH3 increased with temperature during the first 1-20 h. At 16 and 22 degrees C the P-NH3 declined to low values after 20 h, whereas at 10 degrees C the P-NH3 remained appreciable after 20 h. This explains why high accumulated NH3 losses may occur when slurry is applied to the field at low temperatures.
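The dependence of P-NH3 on slurry pH and temperature follows from the NH4+/NH3 acid-base equilibrium: the fraction of TAN present as volatile free NH3 rises with both pH and temperature. A minimal sketch of the aqueous equilibrium only, using a commonly used empirical temperature dependence of the ammonium pKa (pKa = 0.09018 + 2729.92/T in kelvin); Henry's-law partitioning from the dissolved NH3 to the gas-phase P-NH3 is omitted, and the pH value in the example is illustrative:

```python
def nh3_fraction(ph: float, temp_c: float) -> float:
    """Fraction of total ammoniacal N (NH3 + NH4+) present as free NH3,
    from the NH4+ dissociation equilibrium. pKa(T) is a widely used
    empirical expression for aqueous ammonium."""
    pka = 0.09018 + 2729.92 / (temp_c + 273.15)
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

# At an illustrative slurry pH of 8.0, warming from 10 to 22 degrees C
# roughly doubles the free-NH3 fraction, raising the equilibrium P-NH3:
for t in (10, 16, 22):
    print(t, round(nh3_fraction(8.0, t), 3))
```

This is why the CO2-driven pH rise over the first 8 h matters so much for volatilization: each unit of pH below the pKa changes the NH3 fraction roughly tenfold.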
Two long-term field experiments were conducted on a clay soil near Carrickfergus, County Antrim from 1983 to 1993. The main experiment tested the effects of lime (0, 4, 8 and 12 t/ha applied in 1983), N (160 and 320 kg N/ha per year as ammonium nitrate/calcium carbonate), sward type (permanent pasture and perennial ryegrass reseed) and initial soil pH (5.1 and 5.5) on the yield and composition of herbage for 10 years. The secondary experiment studied the interaction between lime (0, 4 and 8 t/ha applied in 1985) and N (80 and 160 kg N/ha per year) for 8 years. In both experiments the plots were fertilized three times each year for three cuts of herbage. In the main experiment, dry matter (DM) yield and N offtake over all 10 years depended little on initial soil pH. Over all cuts and years, DM yields of both sward types increased with lime. Responses peaked after 3 years and were largest with the first cut of the reseed at the lower rate of N fertilizer. Over the first 6 years after lime application, the average responses from the reseed at the lower rate of N fertilizer to 4, 8 and 12 t/ha of lime were 1.02, 1.85 and 1.65 t DM/ha per year respectively at the first cut. At the higher rate of N fertilizer, the response in DM yield of the reseed to lime averaged 0.91 t/ha at the first cut over the same period. In the last 3 years of the experiment, lime had no effect on DM yield even though soil pH ranged from 5.0 to 6.3. A significant response in N offtake due to lime only occurred at the first cut. Responses at the first cut averaged over all treatments were 3.5, 6.5 and 6.6 kg N/ha per year for 4, 8 and 12 t/ha of lime respectively. In the secondary experiment, responses to lime were again mainly at the first cut. There were few lime x N interactions in either experiment. Liming increased N availability either by increasing mineralization of soil N or by improving the uptake of ammonium and nitrate by roots. 
The effects of soil pH and Ca supply on these two processes are difficult to separate. All rates of liming at both N rates were cost-effective for the reseed, but only the lower rates of liming at 160 kg N/ha per year were cost-effective on permanent pasture. Current recommendations for liming grasslands should continue, particularly for swards reseeded with perennial ryegrass.
In a 15-week animal-house experiment, 24 steers were offered one of six diets based on molasses and ad libitum barley straw. Three levels of dietary nitrogen (N) and three levels of dietary phosphorus (P), in factorial combination, were formulated by the addition of urea, formaldehyde-treated wheat gluten and monosodium orthophosphate. Food intake, liveweight gain, plasma metabolites and P kinetics were measured under dietary regimens similar to those experienced by cattle grazing Australia's northern semi-arid rangelands. The adverse effect of the low dietary N on both liveweight change and feed intake was greater and more immediate than that of the dietary P deficiency. The reduction in feed intake due to the P deficiency approached that caused by the N deficiency after 10 weeks. Under conditions of adequate dietary N, there was a trend for the effects of P deficiency on liveweight gain to be greater. Dietary N and P deficiency reduced the concentrations of plasma urea-N and inorganic P respectively. Dietary N deficiency had no effect on cortical rib bone thickness, but P deficiency markedly decreased bone thickness. Faecal endogenous loss of P and P absorption efficiency ranged from 9 to 21 mg/kg LW and 0.63 to 0.82 respectively for P intakes from 6 to 41 mg/kg LW. Faecal endogenous losses were closely related to dry matter intake and plasma inorganic P in combination. Dietary N deficiency affected the efficiency of absorption of P. The results of this experiment indicate that cattle consuming diets containing low levels of N and P require supplementary N and P in combination to avoid severe depletion, since an increase in N intake alone exacerbated the P deficiency. The results are also discussed in relation to published findings on P metabolism and the implications for the calculation of P requirements.
The effects of different rates of N fertilizer (0-180 kg N/ha) were tested on the growth, yield and processing quality of sugarbeet in 34 field experiments in England between 1986 and 1988. The experiments were performed using soil types, locations and management systems representative of the commercial beet crop in the UK. The responses obtained showed that current recommendations for N fertilizer use are broadly correct, but on some soil types, in some years, there were large differences between the recommended amounts and the experimentally determined optima for yield. The divergence was largest when organic manures had been applied in the autumn before the beet crop. Calculations using a simple nitrate leaching model showed that much of the N in the manures was likely to be leached, the extent of leaching being much less if the manure application was delayed until spring. In these circumstances, spring measurement of mineral N in the soil could improve fertilizer recommendations. Where higher than optimum rates of fertilizer N were used, the extra N had little effect on yield. Increasing the rate from 0 to 180 kg N/ha increased the amount of nitrate left in the soil at harvest by only 8 kg N/ha. The amount of inorganic N released into the soil from crop residues at harvest increased by 50 kg N/ha with increasing N application rate, and the fate of this N has not been established.
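The abstract does not name the leaching model used, but a widely cited single-parameter model of this kind is the Burns (1975) equation, which gives the fraction of surface nitrate moved below a given depth as a function of through-drainage and the soil's field-capacity water content. The sketch below uses that equation with assumed, illustrative values for drainage and soil properties, simply to show why autumn-applied manure N is more vulnerable than spring-applied N.

```python
def fraction_leached(drainage_cm, depth_cm, field_capacity):
    """Burns (1975) leaching equation (illustrative; not necessarily
    the model used in the experiments described above).

    Fraction of surface nitrate leached below `depth_cm` after
    `drainage_cm` of through-drainage, for a soil holding
    `field_capacity` cm of water per cm of depth at field capacity.
    """
    return (drainage_cm / (drainage_cm + field_capacity)) ** depth_cm

# Hypothetical sandy soil (~0.2 cm water per cm depth), 50 cm rooting depth.
# Autumn application sees most of the winter drainage, e.g. 25 cm;
# a spring application sees far less subsequent drainage, e.g. 5 cm.
autumn = fraction_leached(25.0, 50.0, 0.2)
spring = fraction_leached(5.0, 50.0, 0.2)
print(f"autumn: {autumn:.2f}, spring: {spring:.2f}")
```

With these assumed inputs roughly two-thirds of autumn-applied nitrate is carried below rooting depth against well under a fifth for the spring application, mirroring the abstract's conclusion that delaying manure until spring greatly reduces leaching losses.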