The Effect of Time of Shearing on Wool Production and Management of a Spring-lambing Merino Flock
Angus John Dugald Campbell, BVSc(Hons) BAnimSc
Submitted in total fulfilment of the requirements of the degree of Doctor of Philosophy
November 2006
University of Melbourne School of Veterinary Science
© Angus Campbell, 2006
ABSTRACT
Choice of shearing time is one of the major management decisions for a wool-producing Merino flock and affects many aspects of wool production and sheep health. Previous studies have investigated the effect of shearing on only a few of these factors at a time, so that there is little objective information at the flock level for making rational decisions on shearing time. This is particularly the case for flocks that lamb in spring, the preferred time in south-eastern Australia.
A trial was conducted in a self-replacing, fine wool Merino flock in western Victoria, from January 1999 to May 2004, comparing ewes shorn annually in December, March or May. Within each of these shearing times, progeny were shorn in one of two different patterns, aligning them with their adult shearing group by 15–27 months of age.
Time of shearing did not consistently improve the staple strength of wool. December-shorn ewes produced significantly lighter and finer fleeces (average 19.1 µm, 3.0 kg clean weight), whereas fleeces from March-shorn ewes were heavier and coarser (19.4 µm, 3.1 kg). Fleeces from ewes shorn in May were of similar weight to fleeces from March-shorn ewes (3.1 kg), but they were of significantly broader diameter (19.7 µm). In young sheep, beneficial changes in some wool characteristics for each shearing group were offset by undesirable changes in others.
Shearing ewes in March or May, and weaners in March, May or June, significantly increased the risk of post-shearing mortality about three- and four-fold, respectively, compared to unshorn sheep. Substantial, highly significant associations in young sheep between post-weaning mortality, bodyweight and growth rate were also quantified using various survival analysis techniques. For example, the lightest 20% of weaners at weaning contributed 31% of all deaths in the year following weaning, and increasing average growth rate over summer and autumn from 250 to 500 g/month reduced the risk of death by 74%. These results could be used to develop supplementary feeding systems that efficiently reduce weaner mortality, which is a significant animal welfare issue in many Australian Merino flocks.
Mortality effects were incorporated into estimates of the total value of wool produced by the different shearing times between birth and culling at 6¼ years of age. Using median historical (1991–2006) wool prices, shearing ewes in March and their progeny first in June, or October (weaner)-December (ewe) shearing produced the greatest total value of wool ($111/head). March (weaners)-March (ewes) shearing had a wool value of $107/head and December (weaners)-December (adults) shearing $103/head. May-shorn ewes produced the smallest value of wool, irrespective of whether their progeny were first shorn in May or July ($93–96/head).
No shearing time consistently improved all animal health measures. May-shorn ewes had significantly more fleece rot in late autumn than the other shearing groups (odds ratio 2.5) and were up to 0.4 condition score lighter during winter, although they had a lower cost of dag (average $0.64/head) and significantly less breech strike risk in spring, compared to December-shorn ewes (odds ratio 0.18). December-shorn ewes had the greatest cost of dag ($1.50/head). March-shorn ewes had an intermediate cost of dag ($1.03/head) but significantly less breech strike than May-shorn ewes (odds ratio 0.38).
Overall, December and March shearing were shown to be appropriate alternatives for a self-replacing Merino flock in south-eastern Australia, whereas May was an undesirable shearing time.
DECLARATION
This is to certify that:
(i) the thesis comprises only my original work
(ii) due acknowledgement has been made in the text to all other material used
(iii) the thesis is less than 100,000 words in length, exclusive of tables, maps, bibliographies and appendices
Angus JD Campbell
All we like sheep have gone astray; we have turned every one to his own way; and the Lord hath laid on him the iniquity of us all.—Isaiah 53:6
ACKNOWLEDGEMENTS
I wish to thank the Vizard Foundation and Mr Maurice and Mrs Jill Glover, who generously made their farms, facilities and sheep available for this trial, and the staff of ‘South Roxby’, particularly Mr Peter Lindeman, for the day-to-day management of the sheep. This work received financial support from the Vizard Foundation and the Australian Sheep Industry Cooperative Research Centre (CRC). The CRC also kindly supported me with a top-up scholarship.
I am very grateful to my supervisors, Dr John Larsen and Associate Professor Andrew Vizard, for their guidance and assistance throughout the study. Associate Professor Vizard devised and designed the trial, which was overseen by Dr Larsen and other members of the Mackinnon Project at the University of Melbourne from December 1998 to January 2002, when I took over. Mr Garry Anderson happily and ably guided the statistical analyses that I performed, for which I am very thankful.
The amassing of such a large amount of data is due in no small part to the dedicated and meticulous work of Ms Dianne Rees. I will be eternally grateful for her technical assistance and cheerful smile, which made 5am starts on shearing days bearable.
I would also like to thank the following people, who generously provided me with information or assistance during my work: Mr Rod Agar (Australian Wool Testing Authority Ltd) for helping with processing of the dyebanded wool samples; Dr Renick Peres (Department of Primary Industries, Geelong) for providing the weather data; and Dr Roger Thompson (Pasture and Veterinary Institute, Hamilton) for kindly providing me with the results of his lamb shearing experiments.
This work is dedicated to my partner, Sarah, for her abiding love and support, and to my son, Callum, for making anything I do worthwhile.
TABLE OF CONTENTS
Abstract
Declaration
Acknowledgements
Table of Contents
List of Tables
List of Figures

Chapter 1  General Introduction
  1.1 Background
  1.2 Determinants of Wool Enterprise Profitability & Their Relationship to Shearing Time
    1.2.1 Stocking Rate
    1.2.2 Wool Production per Sheep
    1.2.3 Average Price per Kilogram of Wool
    1.2.4 Enterprise Costs

Chapter 2  Literature Review
  2.1 The Effect of Shearing Time on Wool Quality
    2.1.1 Staple Strength & Position of Break
      2.1.1.1 Components of Staple Strength & Position of Break
        2.1.1.1.1 Fibre Diameter Properties
        2.1.1.1.2 Intrinsic Fibre Strength
        2.1.1.1.3 Fibre Shedding
      2.1.1.2 Associations between Staple Strength, Position of Break & Shearing Time
      2.1.1.3 Shearing Time & Recurring Factors Affecting Staple Strength
    2.1.2 Fleece Weight, Yield & Fibre Diameter
    2.1.3 Staple Length
    2.1.4 Vegetable Matter Contamination
    2.1.5 Time of Shearing & Wool Quality—Summary
  2.2 The Effect of Shearing Time on Flock Management & Sheep Health
    2.2.1 Sheep Nutritional Requirements & Farm Stocking Rate
    2.2.2 Reproductive Efficiency
      2.2.2.1 Shearing During Mating
      2.2.2.2 Shearing & Lamb Birthweight
      2.2.2.3 Shearing & Ewe Sheltering Behaviour
    2.2.3 Fleece Rot
    2.2.4 Dag
    2.2.5 Flystrike
    2.2.6 Other Health Issues
    2.2.7 Mortality
  2.3 Mortality of Weaner Merino Sheep
    2.3.1 The Extent of Merino Weaner Mortality in Australia
    2.3.2 Limitations of Weaner Mortality Investigations
    2.3.3 Causes of Mortality & Illthrift
    2.3.4 Factors Associated with Weaner Mortality
      2.3.4.1 Bodyweight
      2.3.4.2 Growth Rate
      2.3.4.3 Bodyweight & Death from Causes Other Than Malnutrition
      2.3.4.4 Physiology Underlying the Association between Mortality & Bodyweight
      2.3.4.5 Season & Year of Birth
      2.3.4.6 Disease
      2.3.4.7 Maternal Factors
      2.3.4.8 Sex
    2.3.5 Weaner Mortality—Summary
    2.3.6 Methodologies for Analysing Survival & Mortality
  2.4 Conclusion

Chapter 3  Experiment Introduction and Materials & Methods
  3.1 Introduction & Trial Overview
  3.2 Experimental Site
  3.3 Animals
  3.4 Flock Management
  3.5 Shearing
    3.5.1 Periodic Wool Growth
  3.6 Measurement
  3.7 Analysis
    3.7.1 Wool Production
    3.7.2 Body Condition & Weight
    3.7.3 Dag
    3.7.4 Fleece Rot
    3.7.5 Flystrike
    3.7.6 Ewe Reproduction
    3.7.7 Mortality
      3.7.7.1 Multivariate Weaner Survival Analysis
  3.8 Calculation of Fleece Values
  3.9 ‘Lifetime’ Wool Production

Chapter 4  Results—Weather Observations & Wool Production
  4.1 Weather Observations
  4.2 Overview of Sheep Numbers & Wool Production Results
  4.3 Staple Strength & Position of Break
    4.3.1 Ewes
    4.3.2 Weaners
  4.4 Fleece Weight & Yield
    4.4.1 Ewes
      4.4.1.1 Patterns of Annual Wool Growth in Ewe Shearing Groups
    4.4.2 Weaners
  4.5 Fibre Diameter
    4.5.1 Ewes
    4.5.2 Weaners
  4.6 Staple Length
    4.6.1 Ewes
    4.6.2 Weaners
  4.7 Summary of Wool Characteristics
  4.8 Fleece Value
    4.8.1 Ewes
    4.8.2 Weaners
  4.9 Lifetime Wool Production Results

Chapter 5  Discussion—Wool Production
  5.1 Staple Strength
    5.1.1 Ewes
    5.1.2 Weaners
  5.2 Fibre Diameter & Fleece Weight
  5.3 Fleece Value
    5.3.1 Ewes
    5.3.2 Weaners
    5.3.3 Lifetime Fleece Value

Chapter 6  Results—Sheep Health & Mortality
  6.1 Body Condition & Growth Rate
    6.1.1 Ewe Condition Score
    6.1.2 Weaner Growth Rate Following Shearing
  6.2 Fleece Rot
    6.2.1 Ewes
    6.2.2 Weaners
  6.3 Dag & Crutching
    6.3.1 Ewes
    6.3.2 Weaners
  6.4 Flystrike
    6.4.1 Ewes
    6.4.2 Weaners
  6.5 Reproduction
  6.6 Mortality
    6.6.1 Ewes
    6.6.2 Weaners
      6.6.2.1 Summary Mortality Data
      6.6.2.2 Mortality & Bodyweight
      6.6.2.3 Mortality & Mean Seasonal Growth Rate
      6.6.2.4 Mortality & Sex
      6.6.2.5 Mortality & Weaner Shearing Time
      6.6.2.6 Multivariate Survival Analyses
  6.7 Value of Lifetime Wool Production, Accounting for Survivorship

Chapter 7  Discussion—Sheep Health & Mortality
  7.1 Body Condition & Growth Rate
    7.1.1 Ewes
    7.1.2 Weaners
  7.2 Fleece Rot
  7.3 Dag & Crutching
  7.4 Flystrike
  7.5 Reproduction
  7.6 Mortality
    7.6.1 Ewes
    7.6.2 Weaners
  7.7 Value of Lifetime Production, Accounting for Survivorship

Chapter 8  Conclusion

Bibliography

Appendices
  Appendix 1  Fertiliser Application & Pasture Management Details on Farm A
  Appendix 2  Fleece Values Based on First & Ninth Decile Historical Micron Basis Prices
    A2.1 Ewes
    A2.2 Weaners
    A2.3 Lifetime Fleece Values
  Appendix 3  Price Schedules Used to Calculate Fleece Values
List of Tables
Chapter 1
Table 1.1: Summary of shearing times reported in different regions of Australia

Chapter 2
Table 2.1: Summary of experiments comparing shearing time and staple strength in Merinos, measured by % of tender fleeces, N/ktex and/or position of break
Table 2.2: Summary of results of studies comparing greasy fleece weight (GFW), fibre diameter (FD) and clean fleece yield of Merinos shorn annually at different times of the year
Table 2.3: Summary of trials comparing the net returns per head from shearing young Merino sheep once (shearing B only) or twice (shearing A then shearing B)
Table 2.4: Summary of the results of studies examining changes in sheep feed requirements following shearing at different times of the year
Table 2.5: Mortality of Merino and Merino-cross sheep between weaning and approximately 18 months of age reported in Australian commercial enterprises or field experiments
Table 2.6: Summary of methodologies of studies reporting risk factors for post-weaning mortality
Table 2.7: Post-weaning mortality of sheep in different nutritional groups reported by Allden (1968c)
Table 2.8: Mortality of weaners with different weaning weights and autumn growth rates reported by Hodge (1990)
Table 2.9: Maternal factors associated with weaner mortality

Chapter 3
Table 3.1: Calendar of flock management procedures and shearing times
Table 3.2: Statistical methods used to analyse differences between shearing groups in ewe and weaner results, and corresponding Stata® (StataCorp 2005) commands

Chapter 4
Table 4.1: Number of ewes in each age group, and mean age and standard deviation (years) of the ewe flock at each shearing
Table 4.2: Duration of wool growth (days) between, and dates of, shearings in each ewe shearing group
Table 4.3: Duration of wool growth (days) between birth and shearing 1, and shearing 1 and shearing 2, for weaner shearing groups in each birth-year cohort
Table 4.4: Least squares mean greasy fleece weight (GFW), yield, clean fleece weight (CFW), fibre diameter (FD), staple length (SL), staple strength (SS) and percentage of mid-breaks of wool from the ewe shearing groups
Table 4.5: Least squares mean greasy fleece weight (GFW), yield, clean fleece weight (CFW), fibre diameter (FD), staple length (SL), staple strength (SS) and percentage of mid-breaks of wool produced at shearings 1 and 2 by each weaner shearing group

Chapter 6
Table 6.1: Mean condition score prior to joining in March (join), during winter in July (wint) and at weaning in December (wean) of each ewe shearing treatment
Table 6.2: Mean fleece-free bodyweight at weaning and 95% CI (kg) of progeny from each ewe shearing group
Table 6.3: Mean growth rate and 95% CI (kg/month) between May and December (8–15 months old) of weaners that, in May, were shorn (MAY-MAY), or were carrying two (MAR-MAR) or five (DEC-DEC) months wool
Table 6.4: Prevalence of severe fleece rot (score ≥3) in ewe shearing groups
Table 6.5: Odds ratios (95% CI) of weaner shearing groups having severe fleece rot, relative to unshorn weaners or weaners carrying 12 months wool, at different times between 6 and 20 months of age
Table 6.6: Mean cost of dags (cents/head) in each dag score category; mean cost in each shearing group, weighted by prevalence of each dag score category, in 2002–03 (only crutched prior to shearing) and 2003–04 (crutched prior to lambing and shearing); and mean total annual cost of dag
Table 6.7: Odds (95% CI) of severe dag (score ≥3) in weaner shearing groups in March, October & March (6, 13 & 18 months old, respectively), relative to unshorn weaners or weaners carrying 12 months wool
Table 6.8: Flystrike prevalence in ewe shearing groups during November and early December
Table 6.9: Proportion of weaners in each shearing group affected by flystrike in May (8 months old) in two years of the trial
Table 6.10: Proportion of weaners in each shearing group affected by flystrike and odds of flystrike, relative to DEC-DEC weaners, in November (14 months old) in three years of the trial
Table 6.11: Proportion of weaners in each shearing group affected by flystrike and odds of flystrike, relative to MAR-MAR weaners, in March (18 months old) in two years of the trial
Table 6.12: Proportion of ewes in each shearing group lactating at lamb marking and odds (95% CI) of ewes lactating at marking, relative to DEC ewes
Table 6.13: Number of lambs weaned, number of ewes and weaning percentage in each year
Table 6.14: Mean weaning weight and 95% CI (kg) of weaners dying and surviving during the post-weaning period in each year-trial cohort
Table 6.15: Number of weaners in each weaning weight quintile (range in kg) that died and were at risk during the post-weaning period, total mortality, cumulative mortality rate, and mortality risk (95% CI), relative to the middle quintile
Table 6.16: Average weaner growth rate (kg/month; ‘GR’) at different times throughout the post-weaning period (PWP) and total mortality
Table 6.17: Total mortality, cumulative mortality rate (deaths/1000 weaners/month) and mortality risk of males relative to females
Table 6.18: Mortality rate (deaths/1000 weaners/month) of shorn and unshorn weaners, mortality rate difference, and mortality rate ratio between shorn and unshorn weaners at each shearing
Table 6.19: Coefficients of statistically significant terms in Weibull, Cox, cubic spline, log-logistic and interval-censored Weibull survival analysis models, in analyses of a) all data, and b) Trial 2 data only
Table 6.20: Mean annual (Ann.) and cumulative (Cum.) survival prior to each shearing of sheep in each shearing group between birth and 6 years old (y.o.)

Appendices
Table A-1.1: Annual fertiliser application and pasture management details for Farm A
Table A-3.1: First, median and ninth decile basis prices in the period July 1992–February 2006 for wool of specified fibre diameter (FD; µm), and first, median and ninth decile percentage changes from basis price for wool of specified staple length (SL; mm) and staple strength (SS; N/ktex)
List of Figures
Chapter 1
Figure 1.1: Average contribution of individual factors to variation in quarterly clean wool price April 1996–June 1998, after Hatcher (2000)

Chapter 2
Figure 2.1: An example of the effect of rate of change of fibre diameter on staple strength. The two wool fibres have a similar minimum fibre diameter and peak breaking force, but the bottom fibre has a greater linear density and therefore a lower staple strength
Figure 2.2: An example of how shearing time (dotted arrows) locates position of break (POB) on a wool fibre. ABCD is the pattern of diameter change along the wool fibre. Shearing at A & C places POB in the middle of the fibre. Shearing at B & D places POB at the end of the fibre
Figure 2.3: Price discounts due to staple length of 18 µm, 35 N/ktex wool at first, median and ninth decile prices from 1992–2006
Figure 2.4: Average vegetable matter levels in Australian states (Source: AWEX sale data 2000–01 to 2002–03, inclusive, compiled by B. Swain)
Figure 2.5: Mortalities in Merino weaner sheep, classified by weight at weaning, from: 3–12 months of age on pasture (Lloyd Davies 1983), 3–6 months of age in a drought feedlot (Lloyd Davies et al. 1988), and 4½–6 months of age on pasture (Holmes 1992)

Chapter 3
Figure 3.1: Timing of ewe and progeny shearing treatments (age in months at weaner shearings)

Chapter 4
Figure 4.1: Monthly cumulative rainfall and average daily maximum and minimum temperatures at Farm A during the Time of Shearing Trial
Figure 4.2: Mean staple strength of fleeces from ewe shearing groups (error bars indicate 95% confidence interval (CI); LSM: least squares means)
Figure 4.3: Mean proportion of staple mid-breaks in fleeces from ewe shearing groups (error bars indicate 95% CI)
Figure 4.4: Mean staple strength of fleeces from shearing 1 and shearing 2 of each weaner shearing group (error bars indicate 95% CI)
Figure 4.5: Mean proportion of staple mid-breaks in fleeces from shearing 1 and shearing 2 of each weaner shearing group (error bars indicate 95% CI)
Figure 4.6: Mean greasy fleece weight of ewe shearing groups, adjusted to 365 days growth (excluding October-shorn maiden ewes; error bars indicate 95% CI)
Figure 4.7: Mean clean fleece yield of ewe shearing groups (error bars indicate 95% CI)
Figure 4.8: Mean clean fleece weight of ewe shearing groups, adjusted to 365 days growth (excluding October-shorn maiden ewes; error bars indicate 95% CI)
Figure 4.9: Mean clean wool growth rates of ewes from each shearing group in each dyeband period (■ denotes shearing)
Figure 4.10: Mean weight of clean fleece grown by ewes in each shearing group during each dyeband period ( denotes shearing)
Figure 4.11: Mean clean fleece weight of each weaner shearing group at shearing 1 ( ) and shearing 2 ( ) (LSM: least squares means)
Figure 4.12: Mean clean wool growth rates of each weaner shearing group between birth and shearing 1 (“Shearing 1”) and shearings 1 & 2 (“Shearing 2”) (error bars indicate 95% CI)
Figure 4.13: Mean yields of clean fleece from shearings 1 and 2 of each weaner shearing group (error bars indicate 95% CI)
Figure 4.14: Mean fibre diameter of ewe shearing groups (error bars indicate 95% CI)
Figure 4.15: Mean fibre diameter of weaner shearing groups at shearing 1 and shearing 2 (error bars indicate 95% CI)
Figure 4.16: Mean staple length of ewe shearing groups, excluding October-shorn maiden ewes (error bars indicate 95% CI)
Figure 4.17: Mean staple lengths of each weaner shearing group from shearing 1 and shearing 2 (error bars indicate 95% CI)
Figure 4.18: Estimated fleece value of each ewe shearing group, based on median historical basis price and median, first decile (lower bar) and ninth decile (upper bar) staple length and strength premiums and discounts
Figure 4.19: Estimated fleece values from first ( ) and second ( ) shearing of each weaner shearing group, calculated using median historical micron basis prices and length & strength discounts

Chapter 5
Figure 5.1: Significant seasonal events during years 1 & 2, and hypothesised fibre diameter profile, showing the effect of different shearing times on the position of break and measured staple strength

Chapter 6
Figure 6.1: Mean condition score of ewes in each shearing group at different times during the trial (bars indicate 95% CI)
Figure 6.2: Mean fleece-free bodyweight of weaner shearing groups between weaning and 15 months of age (December to December; bars indicate 95% CI)
Figure 6.3: Proportion of each weaner shearing group with severe fleece rot (score ≥3) between weaning and 20 months of age, amongst weaners born in 2000, 2001 and 2002
Figure 6.4: Proportion of each ewe shearing group with no (dag score (DS) 0), mild (DS 1–2) and severe (DS 3–5) dag (arrows indicate time of crutching or shearing)
Figure 6.5: Proportion of each weaner shearing group with severe dag (score ≥3) between weaning and 20 months of age, amongst weaners born in 2001 and 2002
Figure 6.6: Cumulative mortality rates of ewes in each shearing group at different times of the year
Figure 6.7: Turnbull survival estimates of year-trial cohorts
Figure 6.8: Number of lambs weaned and deaths in each censoring interval of the PWP, by year-trial cohort (timelines not to scale)
Figure 6.9: Turnbull survival estimates of within-cohort weaning weight quintiles
Figure 6.10: Turnbull survival estimates of males and females
Figure 6.11: Mortality rate ratio of shorn and unshorn weaners, with different growth rates during the early (‘GR(early)’) or middle (‘GR(mid)’) post-weaning period, estimated by the cubic spline (──) and Cox (─ ─) models (bars indicate 95% CI)
Figure 6.12: Effect of shearing in March on death rates of female weaners with GRearly = 1 kg/month (──) or 2 kg/month (─ ─), estimated by the cubic spline model*
Figure 6.13: Mortality rate ratio for a 2 kg increase in weaning weight or time-varying bodyweight (BWT) from the labelled value, estimated by Weibull, Cox and cubic spline models (bars indicate 95% CI)
Figure 6.14: Mortality rate ratio for a 0.25 kg/month increase in GRearly from the labelled value for Weibull, Cox and cubic spline models (bars indicate 95% CI)
Figure 6.15: Smoothed mortality rate estimate from all data, and estimated Weibull & cubic spline hazard functions for female weaners with median covariate values (weaning weight 17 kg; GRearly, GRmid and GRlate = 0.9, 1.6 and 2.8 kg/month, respectively)
Figure 6.16: Smoothed curve of the scaled Schoenfeld residuals for growth rate early in the post-weaning period (GRearly) vs. time since weaning
Figure 6.17: Per head value of wool production of each shearing group to 6.25 years of age, accounting for cumulative survival, at median historical micron basis prices using first, median and ninth decile historical staple length and strength premiums and discounts

Appendices
Figure A-2.1: Estimated fleece value ($/head) produced by each ewe shearing group, based on median, first decile (lower bar) and ninth decile (upper bar) micron basis prices and median staple length and staple strength discounts
Figure A-2.2: Estimated fleece values ($/head) from shearing 1 ( ) and shearing 2 ( ) of each weaner shearing group, calculated using first decile historical basis prices and median staple length & strength premiums and discounts
Figure A-2.3: Estimated fleece values ($/head) from shearing 1 ( ) and shearing 2 ( ) of each weaner shearing group, calculated using ninth decile historical basis prices and median staple length & strength premiums and discounts
Figure A-2.4: Estimated value ($/head) of wool produced over the lifetime (from birth to 75 months of age) of ewes shorn in each weaner and ewe shearing combination, net of shearing costs, at first, median and ninth decile historical micron basis prices with median staple length and strength premiums and discounts
Figure A-2.5: Estimated value ($/head) of wool produced over the lifetime of ewes shorn in each weaner and ewe shearing combination, net of shearing costs & accounting for cumulative survival, at first, median and ninth decile historical micron basis prices with median staple length and strength discounts
CHAPTER 1
GENERAL INTRODUCTION
“The question of ‘When should I shear?’ is one which should be considered seriously by every grazier for such a decision will affect farm efficiency, labour, markets, and returns.”—Sheep & Wool Branch, Tasmanian Journal of Agriculture 1964
1.1 Background

The two most important management decisions at the discretion of a wool producer with a self-replacing Merino flock are when to lamb and when to shear (Morley 1994). In the winter rainfall zone of Australia, a large body of research demonstrates that spring is the preferred lambing time (Lloyd Davies 1987; Foot and Vizard 1993). However, there is little holistic, objective evidence of a preferred time of shearing for spring-lambing flocks.
Table 1.1: Summary of shearing times reported in different regions of Australia

Victoria
  central Victoria: majority (>80%) August–November (Foot & Vizard 1993)
  Gippsland: spring 57%; summer 25%; autumn 7%; winter 2% (Irving 1991)
  state-wide: spring 52%; summer 18%; autumn 13%; winter 17% (Court and Lawless 1995)
Tasmania (Statham 2004)
  Ewes: mainly later winter; also significant numbers late autumn and late spring
  Wethers: majority spring
  Weaners: more in spring, but remainder spread throughout the year
Western Australia: spring 36%; summer 25%; autumn 22%; winter 17% (McFarland & Shaw 1998; Bell 1993)
South Australia: ~50% summer or autumn (Ashton 1992)
Victoria, NSW & South Australia (Reeve & Thompson 2004)
On Australian farms, adult Merinos are usually shorn annually. Shearing times vary widely within and between regions in Australia (Table 1.1). In surveys of Victorian and South Australian woolgrowers, availability of labour was the most common reason given for choosing a particular shearing time, although flystrike risk, reliability of weather and available daylight hours were also considered to be important (Irving 1991; Ashton 1992). Notably, wool quality issues were not cited as important determinants of shearing time, but this may be because these surveys were conducted before staple strength was commonly measured.
Studies have shown that shearing time affects many aspects of wool production and sheep health. For example, time of shearing affects wool quality characteristics such as staple strength, position of break, fleece weight, yield and fibre diameter (Arnold et al. 1984). It can also influence stocking rate, a key determinant of farm profitability, by affecting the pattern of nutrient demand of the flock and the timing of cull stock sales (Dabiri et al. 1996; Salmon et al. 2006). Time of shearing can also influence the susceptibility of sheep to diseases such as flystrike and fleece rot (Raadsma 1988). These studies have investigated the effect of changing time of shearing on each of these factors in isolation. However, no study has estimated the combined effect of shearing time on all of the factors influencing wool enterprise profitability. Therefore, the aim of this study was to examine the totality of effects of shearing time on a spring-lambing, self-replacing Merino enterprise in south-eastern Australia. The remainder of Chapter 1 provides an overview of the important determinants of profitability for a wool enterprise in south-eastern Australia. Chapter 2 reviews the literature on the relationship between time of shearing and these factors.
1.2 Determinants of Wool Enterprise Profitability & Their Relationship to Shearing Time

Together, stocking rate, wool production per sheep, average price received for wool and enterprise costs determine the gross margin of a wool enterprise (Quinn et al. 2005; Equation 1.1).
GM = SR × (W × p − E)    (Equation 1.1)

where:
  GM = gross margin ($/ha)
  SR = stocking rate (DSE/ha)
  W  = wool production per sheep (kg/DSE)
  p  = average wool price received per kilogram of wool ($/kg)
  E  = enterprise costs ($/DSE)
The factors influencing each of these will be examined in turn.
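As a minimal numerical sketch of Equation 1.1, the calculation below uses entirely hypothetical input figures, chosen only to illustrate the arithmetic; they are not results from the trial described in this thesis.

```python
# Illustrative gross margin calculation for a wool enterprise,
# following Equation 1.1: GM = SR * (W * p - E).
# All input values below are hypothetical.

def gross_margin(stocking_rate, wool_per_dse, price_per_kg, costs_per_dse):
    """Return gross margin in $/ha.

    stocking_rate : DSE/ha
    wool_per_dse  : clean wool production per sheep (kg/DSE)
    price_per_kg  : average clean wool price received ($/kg)
    costs_per_dse : enterprise costs ($/DSE)
    """
    return stocking_rate * (wool_per_dse * price_per_kg - costs_per_dse)

gm = gross_margin(stocking_rate=12.0, wool_per_dse=4.0,
                  price_per_kg=10.0, costs_per_dse=15.0)
print(f"Gross margin: ${gm:.2f}/ha")  # 12 * (4*10 - 15) = $300.00/ha
```

The structure of the equation makes clear why stocking rate acts as a multiplier on the whole per-head margin, while wool weight, price and costs trade off against each other within it.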
1.2.1 Stocking Rate

Stocking rate refers to the number of sheep grazed per unit area. It is commonly expressed as the number of dry sheep equivalents (DSE) per hectare, where one DSE is the amount of feed consumed by an adult Merino sheep in medium body condition (Morley 1994). The use of DSE allows a direct comparison of animals with different feed intakes per head.
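The DSE comparison described above can be sketched as follows. The ratings assumed here (a dry adult Merino = 1 DSE by definition; a lactating ewe ≈ 3 DSE; a weaner ≈ 0.8 DSE) are illustrative figures only, since actual ratings vary with bodyweight and stage of production, and the flock composition is hypothetical.

```python
# Sketch of using dry sheep equivalents (DSE) to express total flock feed
# demand, so that different stock classes can be compared directly.
# DSE ratings below are assumed for illustration, not measured values.

DSE_RATINGS = {
    "dry_sheep": 1.0,      # adult Merino in medium condition (defines 1 DSE)
    "lactating_ewe": 3.0,  # assumed rating at peak lactation
    "weaner": 0.8,         # assumed rating for a growing weaner
}

def flock_dse(flock):
    """Total DSE for a flock given as {stock_class: head_count}."""
    return sum(DSE_RATINGS[cls] * n for cls, n in flock.items())

flock = {"lactating_ewe": 400, "weaner": 350, "dry_sheep": 50}
total = flock_dse(flock)
stocking_rate = total / 120.0  # DSE/ha on a hypothetical 120 ha farm
print(f"Total demand: {total:.0f} DSE; stocking rate: {stocking_rate:.1f} DSE/ha")
```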
There is an important interaction between stocking rate and time of lambing. In south-eastern Australia, it is well recognised that more sheep can be grazed per hectare in spring-lambing flocks than in those lambing at other times (Caple et al. 1989; Lean et al. 1997). This is because lambing in spring best matches the flock’s peak nutritional requirements during late pregnancy and lactation with the time of highest pasture growth rate. Time of shearing is considered to have considerably less impact on stocking rate than time of lambing and, for this reason, it has been recommended to choose time of lambing before time of shearing (Foot and Vizard 1993). Time of shearing affects farm stocking rate in two different ways. Firstly, the feed requirement of sheep can increase after shearing and thereby influence stocking rate (Black and Bottomley 1980). Secondly, cull and cast-for-age sheep are usually sold soon after shearing, and consequently different times of shearing will result in different seasonal grazing pressures on a farm. These issues are discussed in greater detail in Section 2.2.1.
1.2.2 Wool Production per Sheep
At any given stocking rate, genetics fundamentally determines average wool production per sheep (Morley 1994). Genetic improvement is the dominant method used in Australia to increase per head wool production of Merinos and allows improvements to be made without necessarily compromising other fleece characteristics (Hatcher 2000). Although shearing at certain times of the year can result in a relatively small increase in fleece weight (Lightfoot 1967; Arnold et al. 1984), it is accompanied by an increase in fibre diameter, which lowers the wool’s value. Time of shearing has therefore not been considered an important method for increasing wool production per sheep (Arnold et al. 1984). The relationship between shearing time, fleece weight and fibre diameter is discussed further in Section 2.1.2.
1.2.3 Average Price per Kilogram of Wool
Different studies report slightly varying contributions of the various wool quality factors to wool price, but all show that average fibre diameter is the dominant determinant of price received by a grower for their wool (Figure 1.1). Variation in fibre diameter typically accounts for 50–70% of the variation in price received (Couchman et al. 1993; Hatcher 2000). Fibre diameter is one of the more heritable wool characteristics, with a heritability of approximately 0.50 (Morley 1994). Selection indexes can be devised that result in fibre diameter reduction, whilst avoiding or minimising deleterious changes to other aspects of wool production, such as fleece weight, and have been used widely in Australia (Hygate et al. 2006). Shearing time has been shown to influence fibre diameter (Arnold et al. 1984), but because greater, and permanent, progress in decreasing fibre diameter has been achieved through genetic selection, manipulating time of shearing has been a secondary consideration in strategies to lower fibre diameter.
After fibre diameter, wool quality factors that have a large effect on first stage processing performance have the most significant impact on wool price. These factors are staple strength, ‘position of break’ (the point along the staple where it breaks under tension), vegetable matter content and staple length (Hatcher 2000). Wools with a fibre diameter less than 20.5 µm tend to receive price penalties when staple strength is less than about 36 N/ktex (Oldham 2000). For example, between 1992 and 2006, 19 µm wool with a staple strength of 26 N/ktex received up to a 24% price discount compared to 36 N/ktex wool of the same fibre diameter (Independent Commodity Services 2005). Position of break also tends to reduce wool price if more than 55% of tested staples break in the middle, although the discounts are usually only in the order of 1–2% (AWI 2006).
The presence of vegetable matter in wool affects processing efficiency and also lowers wool price (Charlton et al. 1981). Wool containing small or linear-shaped vegetable matter particles, such as seed and shive, tended to receive higher price penalties than burr because these contaminants were more difficult to remove (Atkinson 1989).
Time of shearing can have substantial and direct effects on staple strength, position of break and vegetable matter contamination (Warr et al. 1979; Arnold et al. 1984; Adams et al. 2000), and hence on the price growers receive for their wool.
Wool with staple length less than about 65 mm has received substantial price discounts because it is unsuitable for worsted processing (Adams et al. 2000; Independent Commodity Services 2005). Adult Merinos shorn annually invariably produce wool of adequate length, irrespective of when they are shorn (Arnold et al. 1984). However, sometimes adult sheep are shorn prematurely prior to sale, which can result in wool of short staple length that is heavily discounted (Salmon et al. 2006). Similarly, short staple length wool is commonly produced from young sheep whose shearing time is being aligned
Figure 1.1: Average contribution of individual factors (fibre diameter, staple strength, staple length, vegetable matter, colour, style, wool marketing) to variation in quarterly clean wool price, April 1996–June 1998, after Hatcher (2000). Fibre diameter accounted for 64% of the variation.
with the adult flock. Consequently, a short shearing interval is a dominant cause of short staple length wool (Donnelly 1991a).
1.2.4 Enterprise Costs
Enterprise, or variable, costs are those which are directly attributable to a particular enterprise and that vary with its scale (Morley 1994). The main variable costs in a wool enterprise include contract services associated with shearing, supplementary feeding, animal health management and pasture maintenance (Quinn et al. 2005). Shearing time may influence costs by increasing sheep’s nutritional requirements and influencing their susceptibility to diseases such as fly strike and fleece rot (Black and Bottomley 1980; Irving 1991).
In summary, time of shearing is not considered to be the dominant factor influencing stocking rate, wool production per head, wool price received or enterprise costs, although it impacts upon all to some degree. The only factor primarily influenced by shearing time is staple length. The relatively small and diverse effects of shearing time on many aspects of a wool enterprise, rather than large effects on one or two factors, may partially explain the variety of observed shearing times that occur even within a given region.
C H A P T E R 2
LITERATURE REVIEW
This literature review discusses the effects of shearing time on wool quality characteristics and animal health, and, more generally, the survival of sheep after weaning. Wool quality, which is discussed in Section 2.1, is difficult to define, although it may be best measured via the characteristics that most influence spinning performance and yarn quality (Vizard and Hansford 1999). Staple strength, mean fibre diameter, staple length and level of vegetable matter contamination are amongst the most important of these characteristics. The ways in which time of shearing affects sheep nutrition, health and mortality are reviewed in Section 2.2, with a more detailed examination of Merino weaner mortality in Section 2.3.
2.1 The Effect of Shearing Time on Wool Quality
2.1.1 Staple Strength & Position of Break
Staple strength (SS) is a measurement that represents the tensile strength of a staple of wool fibres (Lamb 2004). As outlined in Chapter 1, staple strength influences the price farmers receive for their wool because it is related to the performance of wool when it is processed into top. The adverse effect of low staple strength on processing performance “…is of considerable manufacturing significance” (Bigham et al. 1983).
Staple strength is calculated from several fibre measurements. It is the quotient of the tensile force required to break a wool staple and its linear density (Schlink 2000):
SS (N/ktex) = peak breaking force (N) / linear density (ktex) ...... Equation 2.1
The linear density of the wool fibre is its average mass per unit length and is measured in kilotex (ktex):
linear density (ktex) = clean fibre mass (g) / fibre length (m) ...... Equation 2.2
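Equations 2.1 and 2.2 combine in a few lines of code. The measurement values in the sketch below are invented, purely to illustrate the units: a staple of 0.03 g clean mass and 80 mm length has a linear density of 0.375 ktex, so a peak breaking force of 15 N corresponds to a staple strength of 40 N/ktex.

```python
# Sketch of Equations 2.1 and 2.2 with hypothetical measurement values.

def linear_density_ktex(clean_mass_g, length_m):
    """Equation 2.2: linear density (ktex) = clean fibre mass (g) / fibre length (m)."""
    return clean_mass_g / length_m

def staple_strength(peak_force_n, density_ktex):
    """Equation 2.1: SS (N/ktex) = peak breaking force (N) / linear density (ktex)."""
    return peak_force_n / density_ktex

ld = linear_density_ktex(0.03, 0.08)  # 0.375 ktex
ss = staple_strength(15.0, ld)        # 40 N/ktex
print(round(ss))  # 40
```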
Staple strength was originally assessed subjectively, according to how easily a staple was broken when subjected to manually applied tension and flicked with a finger. Although this ‘flick test’ is a crude
estimate of staple strength, it broadly classified wool as being sound (>30 N/ktex) or tender (<30 N/ktex), a distinction that was related to the wool’s processing performance (Bigham et al. 1983; Hansford 1987; Couchman et al. 1993). The tender classification can be further divided into part-tender (25–30 N/ktex), tender (18–24 N/ktex) or rotten (<18 N/ktex). Objective measurement of staple strength was incorporated throughout the 1990s into the set of objective characteristics that now form the basis for the sale of wool in Australia, a process known as ‘sale by description’.
Staple strength is routinely measured with either an ATLAS or AgriTest Staple Breaker machine (Vizard et al. 1994), both of which produce comparable results (Schlink 2000). The ATLAS and Staple Breaker machines first measure staple length under a standard extension, then the force required to pull the staple apart. The two resultant pieces are weighed to calculate linear density, and thence staple strength (Adams et al. 2000; Lamb 2004).
The location of the staple’s breaking point along its length, or ‘position of break’ (POB), is usually assessed alongside staple strength. As discussed in Section 1.2.3, POB is also an important characteristic of wool quality. It is calculated from the relative weights of the two pieces of broken wool from strength testing, and a minimum of 40 staples are broken to give a result describing the percentage of staples breaking in the lower third (that closest to the skin, or ‘base’), middle (‘mid’) or upper third (‘tip’) of the staple (Australian Wool Testing Authority 2004).
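The derivation of POB from the relative weights of the broken pieces can be sketched as follows. This is an illustration only, not the Australian Wool Testing Authority procedure itself: classifying the break into thirds by weight alone is an assumption made here for simplicity, and the function name and thresholds are hypothetical.

```python
# Illustrative sketch (not the AWTA method): classify position of break from
# the relative weight of the base-end piece after strength testing. A light
# base-end piece implies the break was near the base (closest to the skin).
# Dividing the staple into thirds by weight is an assumption for illustration.

def position_of_break(base_piece_g, tip_piece_g):
    frac = base_piece_g / (base_piece_g + tip_piece_g)
    if frac < 1 / 3:
        return "base"
    elif frac <= 2 / 3:
        return "mid"
    return "tip"

results = [position_of_break(b, t) for b, t in [(0.01, 0.05), (0.03, 0.03), (0.05, 0.01)]]
print(results)  # ['base', 'mid', 'tip']
```

In practice at least 40 staples are broken and the result is reported as the percentage of breaks falling in each region.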
A significant proportion of wool grown in southern Australia has been classified as tender (Couchman et al. 1993). Foot and Vizard (1993) described this as the biggest limitation to quality of wool from ewes and young sheep in south-eastern Australia. For example, when that statement was made, 25% and 35% of the wool sold in Victoria and Western Australia, respectively, was assessed as weak (Doyle et al. 1993). Adams et al. (2000) presented unpublished survey data suggesting that 35–45% of wool from southern Australia was “[financially] penalised” (p. 61) for poor staple strength.
2.1.1.1 Components of Staple Strength & Position of Break
In order to discuss the effect of shearing time on staple strength, it is necessary to consider the physical wool properties that determine staple strength and POB.
A great deal of research has investigated the wool fibre properties that determine staple strength, and this has been well summarised in reviews such as those by Bigham (1983) and (1994). Because staple strength is calculated from several different wool staple measurements, it has been described as a “summary” of several “biological” properties of the wool fibre (Schlink et al. 2000a p. 21). The properties of interest are (Schlink and Hynd 1994):
• components of fibre ‘diameter’, comprising minimum diameter, and diameter variation along and between fibres
• intrinsic fibre strength
• fibre shedding
Although there has been debate in the past about the different factors that influence staple strength, it is now broadly accepted that the wool fibre’s size and shape, which are related to fibre diameter, have a greater influence on staple strength than properties of the wool material itself (Yang and Lamb 1997).
2.1.1.1.1 Fibre Diameter Properties
The breaking force of a wool fibre or staple is mostly determined by the size of its minimum cross-sectional area, rather than differences in the intrinsic strength or chemical composition of the wool material present in the cross section (Adams et al. 2000). Less force is required to break a wool fibre of small diameter (and hence lower cross-sectional area) than a fibre of large diameter.
Initial examinations focused on the relationship between minimum fibre diameter and staple strength. Various authors have reported that 20–66% of the variation in staple strength is accounted for by variation in minimum fibre diameter (Bigham et al. 1983; Hunter et al. 1983; Denney 1990; Hynd et al. 1997; Thompson and Hynd 1998). However, other studies have shown that rate of change of diameter along the fibre was correlated with staple strength to a similar or greater extent than minimum diameter, with fibres having a larger diameter change in a given length being weaker (Hansford and Kennedy 1988; Hansford and Kennedy 1990b; Friend et al. 1996; Peterson et al. 1998; Brown et al. 2002).
Minimum diameter and rate of diameter change along fibres help determine the overall shape of the wool fibre, which is referred to as the fibre diameter profile (Schlink et al. 2000a). Different aspects of the fibre diameter profile affect different components from which staple strength is calculated, explaining the association between staple strength and minimum fibre diameter, as well as rate of fibre diameter change. Minimum fibre diameter determines the fibre’s peak breaking force—the numerator of Equation 2.1—because it is related to the quantity of wool material present at the fibre’s weakest point. On the other hand, diameter variation along the fibre influences its linear density, which is the denominator of Equation 2.1. A fibre whose diameter changes gradually has a lower linear density
Figure 2.1: An example of the effect of rate of change of fibre diameter on staple strength. The two wool fibres have a similar minimum fibre diameter (16 µm) and peak breaking force, but the bottom fibre has a greater linear density and therefore a lower staple strength.
than one whose diameter changes suddenly along its length. If the two fibres have the same minimum diameter (i.e. the same peak breaking force), the fibre with the greater rate of fibre diameter change will be weaker because of its greater linear density (Adams et al. 2000). This is illustrated in Figure 2.1.
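The argument of Figure 2.1 can be checked numerically against Equation 2.1. The values below are hypothetical, chosen only to show that an equal peak breaking force combined with different linear densities yields different measured staple strengths.

```python
# Numerical illustration of Figure 2.1 using Equation 2.1 (hypothetical values):
# two fibres with the same minimum diameter share the same peak breaking force,
# but the fibre whose diameter changes suddenly is thick over most of its length,
# so its higher linear density lowers its measured staple strength.

peak_force = 12.0          # N, identical for both fibres (same minimum diameter)
gradual_taper_ktex = 0.30  # lower average mass per unit length
sudden_taper_ktex = 0.40   # thicker over most of its length

ss_gradual = peak_force / gradual_taper_ktex  # 40 N/ktex
ss_sudden = peak_force / sudden_taper_ktex    # 30 N/ktex
print(ss_gradual > ss_sudden)  # True: the gradually tapering fibre measures stronger
```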
POB usually occurs at the wool fibre’s point of minimum fibre diameter (Bigham et al. 1983; Schlink et al. 1998; Schlink et al. 2000b), although it is possible for it to occur at other locations due to the mechanics of fibre and staple breaking (Orwin et al. 1980; de Jong et al. 1985).
Fibres with thin regions of varying length will vary in their elongation and load-bearing properties, and thus staple strength. Fibres with a short thin section elongate less, and fail under a lower load, than fibres with a long thin section. Analyses that include fibre elongation have very high correlations with staple strength (Masters et al. 1998; Lamb 2004). This may help to explain why fibres with large diameter variation and short, thin sections have a POB at the sudden diameter change, and are often weaker than ones whose diameter changes more gradually (Brown 1971).
Diameter variation between fibres is also related to staple strength, although it is not part of the fibre diameter profile (Adams et al. 1997). Between-fibre variation is thought to be both indirectly and directly associated with staple strength (Lamb 2004). The indirect association occurs because fibres of different diameters tend to also have different lengths. When tested to breaking point, fibres of different lengths come under load at different times (Masters et al. 2000). The direct association occurs because fibres with varying diameters elongate different amounts before breaking (Lamb 2004). The indirect and direct associations have the same result: fibres within the staple do not load simultaneously, so the peak force is shared by fewer fibres at one time. Thus, there is less wool material resisting the breaking force than if all fibres in the cross section were resisting the tension together. As a result, a staple containing fibres of widely differing diameters breaks at a lower peak force than one with less between-fibre diameter variation (Adams et al. 2000; Schlink et al. 2000a; Peterson et al. 1998).
2.1.1.1.2 Intrinsic Fibre Strength
Historically, great emphasis was placed on an association between intrinsic fibre strength—the strength of the wool material itself—and staple strength. However, intrinsic fibre strength only varies moderately between individual Merinos (Hunter 1980; Yang and Lamb 1997), and even when variation in intrinsic fibre strength has been observed, it is very poorly correlated with staple strength (Lamb 2004). For example, no correlation between intrinsic fibre strength and staple strength was found in a Western Australian study of fine and broad wool Merino hoggets receiving three different levels of supplementary feeding in summer and shorn in spring (Peterson et al. 1998).
Differences in the cellular or chemical composition of wool that are related to staple strength have not been observed in Merinos (Hansford and Kennedy 1990a), although they have been in Romney sheep (Orwin et al. 1980). Even the effects on staple strength of physiological processes such as pregnancy and lactation do not appear to be mediated through changes in intrinsic fibre strength. For example, no
changes in intrinsic fibre strength were detected, even when large changes in staple strength occurred in lambing ewes under different nutritional regimens (Hunter et al. 1990).
Any relationship between intrinsic fibre strength and staple strength may be overridden by other fibre characteristics that cause more variation in staple strength. Reis (1992) pointed out that the breaking force of wools with the same average fibre diameter varies significantly and proposed that intrinsic fibre strength played a significant role in staple strength. However, this overlooked the possibility that wools with the same average diameter may have different minimum diameters and peak breaking forces, as explained previously.
2.1.1.1.3 Fibre Shedding
Fibre shedding occurs when a wool follicle stops producing a fibre, which is then expelled from the follicle before it can be shorn off. Shed fibres do not span the distance between the jaws of the testing machine and so do not bear any load, but they still contribute to the staple’s linear density and consequently decrease staple strength (Schlink et al. 2000a). Fibre shedding has been reported to be well correlated with staple strength in wool weaker than 30 N/ktex, but not in stronger wool (Schlink and Hynd 1994), and its inclusion in analyses does not seem to explain staple strength variation any more than minimum fibre diameter and fibre diameter variation alone (for example, see Hynd et al. 1997).
2.1.1.2 Associations between Staple Strength, Position of Break & Shearing Time
Position of break and the along-fibre components of staple strength (minimum diameter and rate of diameter change) occur at specific points in time as wool grows. Thus, these are the components that can potentially interact with shearing time, which also happens at a single point in time along the wool staple. In contrast, other components of staple strength, such as between-fibre variation or intrinsic fibre strength, affect the staple along its entire length. It is therefore unlikely that a single event could interact with them in a consistent way.
A number of surveys have shown an association between time of shearing and staple strength. For example, Curtis and Stanton (2000 cited by Oldham 2000) demonstrated a seasonal variation in staple strength of wool produced in all states of Australia, based on average figures from wool sold between 1995 and 2000. This analysis assumed that shearing occurred 1–2 months before sale. In Victoria, staple strength was greatest in wool sold in August (and therefore probably shorn in June or July), with an average staple strength of 38 N/ktex, whilst wool sold in December (shorn October or November) had the worst staple strength of 34 N/ktex. The staple strength of wool sold from other states also varied throughout the year, although the timing of the maximum and minimum strength differed between states. South Australia and New South Wales had similar patterns to Victoria. Western Australia had a similarly shaped curve, although the peak coincided with autumn shearing (36 N/ktex) and the trough with spring shearing (31 N/ktex). Queensland wool showed less variation throughout the year: wool sold in spring was strongest (38 N/ktex) and wool from the rest of the year was only 2–3 N/ktex weaker. Wool from Tasmania sold between January and July had an average strength of about 34 N/ktex, but then there was a sudden increase to about 40 N/ktex for wool sold between August and December.
It should be noted that whilst Curtis and Stanton’s data showed a definite pattern to staple strength throughout the year, in Victoria, New South Wales, Queensland and Western Australia, the annual range was only 3–5 N/ktex. The range in staple strength of wool from Tasmania, the state with the largest variation, was still only 8 N/ktex. Nonetheless, average staple strength was still less than the critical value of 36 N/ktex for more than half the year in all states except New South Wales and Queensland.
Other authors have reported a similar association with shearing time across Australia (Douglas 1989; Couchman et al. 1993) and in Western Australia (Arnold and Gordon 1973; J. Stanton 1987 cited by Hansford 1987), although these data may have been biased because staple strength was not measured in all wool sold at those times. Furthermore, the patterns are not consistent in every year. In the 1991/92 selling year, Baker (1993) reported that Western Australian wool sold in winter was weakest (30 N/ktex), whereas Curtis and Stanton found wool sold in winter had the second greatest average staple strength after late autumn.
Baker et al. (1994) conducted a more detailed survey of sheep, management and environmental factors affecting staple strength of Western Australian fleece wool sold between 1989/90 and 1991/92. The survey examined how much variation in staple strength was explained by average fibre diameter, sheep
Table 2.1: Summary of experiments comparing shearing time and staple strength in Merinos, measured by % of tender fleeces, N/ktex and/or position of break

Location     Shearing time   Staple strength measurement    Lambing time   Reference
WA           March           4%, 10%*                       May            (McGarry and Stott 1960)
             September       23%, 60%
WA           March           2%                             June           (Arnold et al. 1984)
             March           8%                             March
             October         9%                             June
             October         11%                            March
Tasmania     July            43, 30 N/ktex†                 –              (Butler et al. 1994)
             November        30, 27 N/ktex
Sth Africa   February        35 N/ktex; 35% mid-breaks      March          (Cloete et al. 2000)
             September       27 N/ktex; 60% mid-breaks
WA           January         37 N/ktex; tip‡                August         (Hansford 1997)
             February        32 N/ktex; tip                 July
             April           37 N/ktex; tip                 July
             August          35 N/ktex; middle              June
             August          21 N/ktex; middle/base         April
             September       33 N/ktex; middle              May
             October         36 N/ktex; tip                 July

Results from: * year 1, year 2; † less nutritional variation, more nutritional variation experienced by flock; ‡ position of break
age and sex, shearing time, length of growing season, timing of seasonal break, and number of false seasonal breaks. Time of shearing was significantly associated with staple strength in all years but was only of mid to low importance in explaining staple strength variation. Frustratingly, the study did not report which shearing time was associated with greater staple strength.
Several field experiments, summarised in Table 2.1, have also demonstrated that shearing time affects staple strength and indicated which times produce stronger wool. In a Western Australian trial spanning two years, March shearing reduced the likelihood of autumn (May)-lambing Merino ewes producing ‘tender’ fleeces by 83%, compared to September shearing (McGarry and Stott 1960). This result was the same in both years, despite the proportions of tender fleeces varying considerably between the two years. A similar, more recent experiment involving autumn (March)-lambing Merinos in a Mediterranean-type environment in South Africa made similar findings (Cloete et al. 2000). Shearing ewes in February (before lambing) produced wool of greater staple strength than shearing in September (before joining) and reduced the proportion of ewes with mid-staple breaks.
The previous two experiments only examined ewes lambing in autumn; however, a similar Western Australian trial showed that there was a statistically significant interaction between shearing and lambing time (Arnold et al. 1984). March shearing and June lambing produced the fewest tender fleeces. March shearing and March lambing produced the next lowest proportion of tender fleeces, although it was similar to October shearing with March or June lambing. Thus, March shearing generally reduced tenderness, although the largest improvement in staple strength came from the combined choice of shearing and lambing time. Despite the effect of lambing time on staple strength, only shearing time affected processing performance.
Hansford (1997) directly measured staple strength, rather than the proportion of tender fleeces, of Merino wool from seven farms in the Kojonup region (Western Australia) between April 1993 and February 1994. In this trial, shearing time was clearly associated with POB. The POB was at the end of the staple in flocks shorn in summer or autumn (3 flocks) but was in the middle of the staple in flocks shorn in winter or spring (4 flocks). In contrast, shearing time was not clearly associated with staple strength; it appeared to influence POB alone.
Complex interactions between shearing time, nutrition and staple strength have also been demonstrated in Merino wethers in Tasmania (Butler et al. 1994). In contrast to the Western Australian ewe experiments, wool shorn in July had a greater staple strength (43 N/ktex) than wool shorn in November (27 N/ktex). However, amongst wethers that experienced sudden changes in nutrition, such as changes in grazing rotation length and multiple changes from pasture to crop paddocks, July shearing produced wool only marginally stronger (30 N/ktex) than the November shearing. As observed in the Western Australian study, any influence of shearing time on staple strength can be tempered by other extraneous factors.
McGarry et al. (1960), Arnold et al. (1984), Butler (1994), Hansford (1997), Cloete et al. (2000) and Peterson et al. (2000) all speculated that shearing improved staple strength when it coincided with the
POB. This is illustrated in Figure 2.2. The timing of shearing determines where the minimum fibre diameter, and hence POB, is situated relative to the ends of the staple. What might otherwise be a mid-break in the staple can be a tip or base break if shearing occurs just before or after the minimum fibre diameter. This will affect the measured staple strength because 10–20 mm of wool at either end of the staple is held in the jaws of the ATLAS machine used to measure staple strength (Hansford 1997). If the POB is clamped, the staple must break at a different, stronger location, resulting in a higher measured peak breaking force and staple strength (Butler et al. 1994). Although the greater staple strength may seem to be an artefact of the testing methodology, such wools have improved processing performance (Arnold et al. 1984; Hansford 1997).
2.1.1.3 Shearing Time & Recurring Factors Affecting Staple Strength
If the point of minimum fibre diameter corresponds to a recurring annual event, such as the start of lambing or the onset of autumn, then when shearing occurs relative to this event will determine the POB. The final part of this section briefly discusses the ‘recurring events’ that have been associated with the point of minimum fibre diameter. If such an event occurred reliably each year, staple strength may be improved by shearing at that time.
In Merinos, diameter variation along a wool fibre and POB are mainly influenced by seasonal variation in a sheep’s nutritional intake and requirements (McManus et al. 1964; Allden 1979). Variation in staple strength was shown to be mainly caused by the effects of nutritional changes on the fibre diameter profile by Thompson and Hynd (1998). They measured minimum fibre diameter, and along- and between-fibre diameter variation in Merino wethers, bred for high or low staple strength, that either maintained or lost then gained weight from consuming diets containing different amounts of metabolisable energy. Minimum diameter and along-fibre variation accounted for 75% of the variation
Figure 2.2: An example of how shearing time (dotted arrows) locates position of break (POB) on a wool fibre. ABCD is the pattern of diameter change along the wool fibre. Shearing at A & C places POB in the middle of the fibre. Shearing at B & D places POB at the end of the fibre.
in staple strength, and these two factors were more strongly associated with nutrition than genotype. In contrast, between-fibre variation, which was attributed to genetics, only accounted for 8% of the variation. Mata et al. (2002) reported similar effects on the fibre profile, but not staple strength, in sheep grazing pasture.
In sheep grazing pastures in southern Australia, rapid nutritional change frequently occurs at or near the start of autumn, and many authors have observed that POB also occurs at this time (McManus et al. 1964; Allden 1979; Reis 1992; Earl et al. 1994). Prior to the autumn break, sheep have grazed dry pastures of low nutritional value. Once pasture growth resumes after the opening rains, sheep begin to graze increasing quantities of highly digestible, high-protein, green feed, which causes substantial increases in wool growth rate and fibre diameter (Mata et al. 1990). POB is particularly sensitive to such changes in protein nutrition (Schlink et al. 1998; Schlink et al. 2000b). In Mediterranean-type environments, sheep experience changing climatic conditions, as well as nutritional ones, at the autumn break. However, it has been shown that the change in diet, rather than exposure to rain and/or cooler temperatures, affects wool growth, fibre diameter and position of break (Mata et al. 1999; Woodgate et al. 2000; Woodgate et al. 2001).
Variations in pasture availability and feed intake, rather than pasture quality, can also alter the fibre profile and POB (Butler et al. 1994; Schlink et al. 1999; Mata et al. 2002). This is particularly seen in environments where there is less seasonal variation in pasture quality, such as regions of Tasmania (Butler 1994). In these environments, POB occurs during winter, when pastures tend to be at their lowest availability.
The physiological status of the sheep also influences fibre diameter properties and hence staple strength (Hansford 1987). In particular, ewe nutritional requirements vary throughout the year according to reproductive state, and wool growth and reproduction compete for nutrients (Corbett 1979). Many authors have observed that the minimum fibre diameter and the POB in ewes can coincide with pregnancy or lactation (Hunter et al. 1990; Robertson et al. 1996; Masters et al. 1997). Thus, in the one environment, the POB may occur in autumn in wethers but in winter or spring in spring-lambing ewes, depending on how well pasture availability matches ewe requirements during pregnancy or lactation. For example, Thornberry et al. (1988) examined staple strength in 14 spring-lambing flocks across NSW and found that POB in ewes occurred at widely varying times between early pregnancy and post-weaning, but consistently occurred in autumn in wethers on the same farms. This suggests that POB in ewes is quite sensitive to the match between nutritional requirements and intake, which in turn is significantly influenced by seasonal conditions, time of lambing and number of lambs born and reared. This has been confirmed experimentally. For example, nutritional restriction of pregnant ewes lasting only a fortnight reduced staple strength by up to 27 N/ktex, with the largest reductions occurring in twin-bearing ewes (Robertson et al. 2000b). The staple strength of ewes grazing about 550 kg dry matter (DM)/ha throughout pregnancy and lactation was 34 N/ktex lower than that of ewes grazing >1500 kg DM/ha (14 vs. 48 N/ktex; Robertson et al. 2000a). Similarly, a Victorian field trial showed that changing lambing time by just 3 weeks (from late winter to spring) increased staple strength by 4
N/ktex and reduced the likelihood of producing tender wool nearly three-fold (Scrivener and Vizard 1997).
It is important to note that, in spring-lambing ewes, the two factors most commonly associated with the POB, lambing and autumn nutrition, occur about six months apart. Most work has concentrated on one factor or the other and little attention has been paid to their combined influence on the POB within and between years (Masters et al. 1997). For example, if the POB in ewes varies between spring and autumn from year to year, no annual shearing time will consistently improve staple strength.
In summary, it can be seen that staple strength is substantially influenced by the minimum fibre diameter and along-fibre variation, and that these fibre properties are largely determined by nutrition, relative to a sheep’s physiological requirements. Shearing time is thought to increase staple strength if it places the POB at the end of the wool staple. As explained previously, spring is the preferred lambing time in southern Australia because it improves the profitability of a self-replacing Merino flock, yet no experiments have compared the effects of different shearing times on staple strength in spring-lambing flocks. Furthermore, the POB may vary substantially with changes in nutrition or nutritional requirements, and no trials have been conducted for sufficient time to test if one particular shearing time consistently improves staple strength.
2.1.2 Fleece Weight, Yield & Fibre Diameter

Although shearing time affects both fibre diameter and fleece weight, the main way to manipulate these characteristics is through genetic selection, as discussed previously. Several trials, all conducted for two or three years, have investigated the effect of shearing time on fleece weight, fibre diameter and yield in Merinos, and their results are summarised in Table 2.2. These studies were conducted in Western Australia, Tasmania, NSW and South Africa, in wether flocks and in ewe flocks with different lambing times. Five of eight studies showed that fleece weight differed with shearing time. In four of these trials, sheep shorn in autumn produced greasy fleeces approximately 10% heavier than sheep shorn at other times of the year (McGarry and Stott 1960; Lightfoot 1967; Arnold et al. 1984; Cloete et al. 2000). In the fifth trial, a similar increase in fleece weight was observed in wethers shorn in July, compared to wethers shorn in November (Butler et al. 1994). Of the three studies that did not observe an association between fleece weight and shearing time, two were conducted in ewes in Tasmania (McLeod 1967). In these trials, although ewe feed intake was not reported, the lack of an association between fleece weight and shearing time may have been due to low winter pasture availability that restricted the feed intake, and consequent wool growth, of autumn-shorn ewes after shearing (Dabiri et al. 1994). The final study showing no effect of shearing time was conducted in western NSW, an environment without a strong seasonal rainfall and pasture growth pattern (Kennedy et al. 1982). In summary, whilst there is evidence that autumn-shorn wethers and non-pregnant ewes produce heavier fleeces than sheep shorn at other times of the year, in flocks lambing in spring, the preferred lambing time in southern Australia, this association may be modulated by other factors such as winter pasture availability.
Table 2.2: Summary of results of studies comparing greasy fleece weight (GFW), fibre diameter (FD) and clean fleece yield of Merinos shorn annually at different times of the year
Stock class (lambing time) | Location | Shearing time | GFW (kg) | FD (µm) | Yield | Comment | Reference
Ewes (May) | WA | March | 5.3 | – | 62% | | (McGarry and Stott 1960)
 | | September | 4.8 | – | 62% | |
Ewes (late winter*) | Tasmania | January | 4.4 | – | – | Jan. > Oct. in 2 of 3 years | (McLeod 1967)
 | | July | 4.1 | – | – | |
 | | October | 4.4 | – | – | |
Ewes (late winter*) | Tasmania | March | 4.4 | – | – | Oct. > Mar. in 2 of 3 years | (McLeod 1967)
 | | October | 4.5 | – | – | |
Wethers | WA | April | 5.9 | – | 66% | | (Lightfoot 1967)
 | | October | 5.2 | – | 71% | |
Ewes (March & June) | WA | March | 5.0 | 23.0 | 67% | | (Arnold et al. 1984)
 | | October | 4.1 | 22.3 | 71% | |
Ewes (winter) | western NSW | March | 3.6 | – | – | No effect of shearing time once variation in rainfall accounted for | (Kennedy et al. 1982)
 | | May | 3.5 | – | – | |
 | | August | 3.8 | – | – | |
 | | November | 3.9 | – | – | |
Ewes (February) | South Africa | February | 5.7 | 23.5 | 69% | FD difference not significant | (Cloete et al. 2000)
 | | September | 4.9 | 23.0 | 75% | |
Wethers | Tasmania | July | 4.3 | – | – | | (Butler et al. 1994)
 | | November | 4.0 | – | – | |
* not explicitly stated
–: not reported
Table 2.2 also shows that autumn-shorn wools have lower yields than wools from other shearing times. Arnold et al. (1984) suggested that this occurs for two reasons. Firstly, autumn-shorn sheep have longer fleeces during summer than sheep shorn at any other time of the year, and this longer wool accumulates more dust and environmental contaminants than shorter fleeces do. Secondly, the increased wool growth stimulated by autumn shearing is accompanied by extra wax and suint production, and this further lowers the yield of autumn-shorn fleeces. However, this lower yield is not large enough to completely offset the increase in greasy fleece weight, so that autumn shearing still tends to produce heavier clean fleece weights than other shearing times.
Shearing time is thought to affect fibre diameter, fleece weight and yield through the same mechanism, by stimulating feed intake (Arnold et al. 1984), so they will be considered together. The relationship between shearing time and feed intake is discussed in detail in Section 2.2.1 and only the underlying mechanisms are summarised here. Removing a sheep's fleece substantially increases heat loss, and this stimulates feed intake in an effort to maintain body temperature. In turn, a higher feed intake increases wool growth rate (Arnold et al. 1964; McFarlane 1965; McGuirk et al. 1966; Allden 1979). This stimulation of wool growth may persist for weeks to months after shearing. The extent of additional heat loss following shearing is determined by ambient temperatures and, consequently, shearing at different times of the year will have different effects on feed intake and hence wool growth rate. Furthermore, in southern Australia, factors that modulate feed intake, such as pasture availability and sheep body condition, vary considerably with season, and therefore will have differing influences on wool growth rate at different shearing times. For example, autumn shearing exposes sheep to cooler ambient temperatures than spring shearing does. Additionally, sheep are usually in lighter body condition in autumn than in spring and so have a greater capacity to increase their feed intake following shearing (Freer et al. 1997). Arnold (1984) proposed that the combination of these factors results in a larger increase in feed intake after shearing in autumn than in spring and, thus, autumn-shorn sheep produce heavier fleeces than sheep shorn at other times. When the growth rate of wool increases, mean fibre diameter increases (McFarlane 1965; Fernandez Abella et al. 1991; Doyle et al. 1995; Masters et al.
1998), and so when it has been observed that autumn shearing is associated with increased fleece weight, the average fibre diameter of autumn-shorn wool has also been greater than that of wool shorn at other times. Depending on the price premiums for wool of finer fibre diameter, the monetary value of any increase in fleece weight due to autumn shearing may be reduced, or even negated, by the reduction in wool price per kilogram. The potential for shearing time to have opposite effects on wool characteristics has been previously noted (Foot and Vizard 1993) but the overall impact of shearing time on a commercially managed, spring-lambing enterprise needs to be quantified.
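This trade-off between fleece weight and fibre diameter can be illustrated with a simple calculation. The sketch below is hypothetical: the price schedule (a base price and a per-micron discount) is an assumption for illustration only, not market data, while the fleece measurements are those reported for the trial described in this thesis.

```python
# Sketch of the fleece weight vs. fibre diameter trade-off described above.
# The price schedule is a hypothetical assumption; actual micron premiums
# vary between sales and seasons.

def clean_price_per_kg(fd_micron: float) -> float:
    """Hypothetical clean wool price (AU$/kg), falling as fibre broadens."""
    base_price = 12.00           # assumed price at 19.0 um
    discount_per_micron = 1.50   # assumed change in price per um
    return base_price - discount_per_micron * (fd_micron - 19.0)

def fleece_value(clean_weight_kg: float, fd_micron: float) -> float:
    """Value of one fleece = clean weight x clean price per kg."""
    return clean_weight_kg * clean_price_per_kg(fd_micron)

# Fleece measurements reported for this trial:
december = fleece_value(3.0, 19.1)  # lighter but finer
march = fleece_value(3.1, 19.4)     # heavier but broader

print(f"December-shorn fleece: ${december:.2f}")
print(f"March-shorn fleece:    ${march:.2f}")
```

Under this assumed schedule, the extra 0.1 kg of clean wool from March shearing is offset by the 0.3 µm diameter penalty, which is why the two characteristics must be valued jointly rather than in isolation.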
2.1.3 Staple Length

As explained previously, top-makers have traditionally aimed to produce a top with hauteur of 65–90 mm (Oldham 2000) and wool shorter than 65 mm attracts considerable price discounts because it reduces hauteur (Independent Commodity Services 2005; Figure 2.3). In particular, the staple length of
wool from weaner sheep is often less than 65 mm and consequently attracts significant discounts (Taylor 1989).
In commercial sheep flocks, the shearing time of young sheep must eventually be aligned with the shearing time of the adult flock. Seven studies, listed in Table 2.3, have compared different weaner shearing patterns that aligned a weaner flock to a single adult shearing time, and all found that shearing weaners twice was less profitable than shearing them once. In weaners that are shorn twice, the first shearing usually occurs soon after weaning and produces wool approximately 30 mm long (Donnelly 1991a; Rogan et al. 1995). This short staple length wool attracts a substantially lower price than longer wool and, in five of the seven trials listed in Table 2.3, short length was the most or second most important determinant of the lower profitability of twice-shorn weaners. The cost of the extra shearing was the other major factor that reduced the profitability of a two-shearing pattern. Flocks with the same lambing date but differing adult shearing times will inevitably have different weaner shearing
Figure 2.3: Price discounts due to staple length of 18 µm, 35 N/ktex wool at first, median and ninth decile prices from 1992–2006
[Figure: clean price discount (%, from +10% to -70%) plotted against staple length class (16–25 mm to 105–114 mm), at median, first decile and ninth decile basis prices]
Table 2.3: Summary of trials comparing the net returns per head from shearing young Merino sheep once (shearing B only) or twice (shearing A then shearing B)

Time of birth | Shearing A–Shearing B | Wool returns, once | Wool returns, twice | Difference | Reference
winter | December–May | $8.06 | $5.70 | -29% | (Rogan et al. 1995)
spring | December–September | $20.41 | $15.07 | -26% | (Irving 1991)
spring | January–July | $15.41 | $9.22 | -40% | (Donnelly 1991a)
– | January–December | $25.10 | $14.94 | -40% | (Sackett 1990)
spring | February–November | $15.52* | $14.12* | -9% | (Newman et al. 1996)
autumn | October–May | $16.20 | $15.56 | -4% | (Little et al. 1993)
– | 3–12 months old | | | $7–8/hd | (Taylor 1989)
* gross returns
patterns to align the young sheep with the adult flock. No study has attempted to directly compare the net effect in both weaner and adult sheep of various adult shearing times.
2.1.4 Vegetable Matter Contamination

The importance of vegetable matter (VM) contamination in wool varies between regions in Australia (Foot and Vizard 1993). The effect of shearing time on vegetable matter contamination is highly dependent on geographic region because of the great variation in the pasture species responsible for it and the time of year when contamination occurs. For example, vegetable matter content differed by 23 percentage points between shorn and unshorn weaners during spring at Trangie, NSW (1.9% vs. 24.9%, respectively; Warr and Thompson 1976), but there was no difference in vegetable matter between shorn and unshorn weaners in a similar experiment in Gippsland, Victoria (Irving 1991). Vegetable matter contamination is generally a small problem in Victoria and wool from the south-west of the state tends to be free or nearly free of vegetable matter (Court and Lawless 1995). By contrast, 60% of wool produced in the pastoral region of South Australia can carry significant contamination (Foot and Vizard 1993).
Figure 2.4 suggests a relationship between sale date, and by inference shearing time, of wool and its average vegetable matter content, at a state level. Wool sold in South Australia, New South Wales and Queensland had significant vegetable matter content that varied throughout the year. However, as outlined above, such broad surveys can overlook variation in associations between vegetable matter contamination and shearing time. In contrast to the results presented in Figure 2.4, two NSW surveys of vegetable matter contamination found that there was generally no association between shearing time and total amounts of vegetable matter contamination (Cornish and Beale 1974; Warr et al. 1979). Warr and his colleagues did show that summer shearing in one region of the state reduced vegetable matter
Figure 2.4: Average vegetable matter levels in Australian states (Source: AWEX sale data 2000– 01 to 2002–03, inclusive, compiled by B. Swain)
[Figure: average vegetable matter (%) by month of wool sale, July–June, for SA, NSW, QLD, WA, VIC and TAS]
compared to winter or autumn shearing, but in other districts, although levels of individual seed and burr were reduced by particular shearing times, overall vegetable matter levels did not change. It was suggested that vegetable matter contamination remained relatively constant in these districts because, as one potential vegetable matter contaminant species declined, it was replaced by another. Also, the slow maturation of the seeds of some contaminant species resulted in vegetable matter being available for periods beyond the protective effect of shearing.
In contrast to the New South Wales surveys, several studies in Victoria and Western Australia have shown a more widespread association between time of shearing and vegetable matter contamination of wool (Arnold and Gordon 1973; Arnold et al. 1984; Stanton 1987 cited by Hansford 1987; Bowman 1989; Ralph 1984 cited by Bell and Ralph 1993). These studies reported vegetable matter contamination in wool sold in different months and assumed that shearing occurred one to two months before sale. They showed that the highest vegetable matter levels were in wools shorn between March and May and the lowest from shearing between October and January.
Vegetable matter contamination can be influenced by time of shearing because extraneous matter accumulates more rapidly in long than in short wool. An experiment conducted in Western Australia showed that, between late spring and early autumn, autumn-shorn Merino wethers accumulated vegetable matter up to 14 times faster than spring-shorn sheep. These sheep were grazing pastures dominated by barley (Hordeum sp.) and brome grass (Bromus sp.), both of which are significant causes of vegetable matter contamination (Ritchie et al. 1992).
The price discount incurred for wool with significant vegetable matter may also vary seasonally. For example, Bowman (1989) showed that the wool price discount for vegetable matter contamination was highest for wool sold in August (about a 12% discount, compared to wool with <1% vegetable matter) and lowest for wool sold in January (a 4% discount). This probably reflected the different levels and type of vegetable matter contamination present in wool sold at different times.
In summary, the importance of vegetable matter contamination is inconsistent from region to region, and possibly from farm to farm, and consequently its significance when choosing a shearing time will vary. In the Western District of Victoria, vegetable matter contamination is a small problem and usually does not warrant particular attention through change of shearing time or any other management strategies. In other parts of Australia, the wide range of potential contaminant pasture species can mean that no shearing time effectively controls vegetable matter contamination, and pasture improvement or tactical control of problem species, such as spraying or slashing of pastures, may be a more effective option (Little et al. 1993). It should be noted that, even if vegetable matter poses little threat to wool quality, it can cause significant health problems in young sheep, which is discussed in more detail in Section 2.2.6.
2.1.5 Time of Shearing & Wool Quality: Summary

Choosing a time of shearing to optimise wool quality is not a straightforward decision. In a self-replacing flock, shearing and lambing time interact to influence staple strength, and the extent of the effect varies between environments. Furthermore, the effect of shearing time on staple strength cannot be examined in isolation, because changing shearing time also affects fleece weight and fibre diameter, which are, arguably, more important determinants of wool income than staple strength. The shearing time-lambing time interaction is also important because it affects the staple length of wool produced by young sheep as their shearing time is aligned with the adult flock, and this can substantially influence the value of the wool that they produce. There is a need for a study that examines the effect of shearing time by simultaneously measuring all important wool characteristics and that is conducted in spring-lambing flocks.
2.2 The Effect of Shearing Time on Flock Management & Sheep Health
2.2.1 Sheep Nutritional Requirements & Farm Stocking Rate

This review is restricted to Australian experiments conducted in Merinos or similar breeds, such as Polwarths, and to non-Merino experiments whose results are particularly relevant to production systems in southern Australia. A number of experiments, using a variety of methodologies, have shown that shearing increases the nutritional requirements of sheep; these are listed in Table 2.4. It has been pointed out that this may have an important financial impact (Salmon et al. 2006): if shearing increases requirements at a time of the year when there is limited pasture availability, it may constrain the overall stocking rate of the farm and hence profitability.
The amount by which shearing increases feed requirements and affects grazing sheep depends on a variety of factors, including the prevailing weather conditions, as determined by season (Elvidge and Coop 1974), the sheep's reproductive status (Hudson and Bottomley 1978) and pasture availability (Arnold and Birrell 1977). In the first few weeks after shearing, nutritional requirements have been shown to increase by up to 69% in non-breeding sheep in Australia (Hutchinson and McRae 1969; Arnold and Birrell 1977; Hudson and Bottomley 1978; Love et al. 1978). The largest increase in these studies was observed following shearing in winter in Tasmania; however, cold temperatures at other times of the year, particularly if accompanied by wind and/or rain, can also cause significant increases in feed requirements. For example, a 62% increase was observed following shearing in December in Armidale, New South Wales, when minimum temperatures were as low as 7°C and 25 mm of rain fell soon after shearing (Wheeler et al. 1963).
Studies have shown that, in the month after shearing, the post-shearing increase in the energy requirements of pregnant ewes was less than that of non-reproducing sheep. For example, in three studies, the increase in pregnant ewes was only 38–85% of the increase in their non-pregnant counterparts (Elvidge and Coop 1974; Hudson and Bottomley 1978; Bottomley 1979). In these studies, the largest difference between pregnant and non-pregnant ewes occurred after August shearing, whereas the difference was smaller following shearing in June. However, environmental temperature differences between these shearing times may not have been the only cause of the different increases in feed intake. It has been hypothesised that the smaller increase in the energy requirement of pregnant ewes after shearing is associated with their pre-existing greater feed intake (Bottomley and Hudson 1976). The extra ruminal heat production associated with this greater intake partially offsets the effects of shearing and, consequently, the post-shearing increase in energy requirement will be less.
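The practical scale of post-shearing increases of this kind can be translated into feed terms with a back-of-envelope calculation. The sketch below is illustrative only: the maintenance requirement, the size and duration of the increase, and the feed energy density are all assumed values, not figures drawn from the studies cited above.

```python
# Back-of-envelope sketch: converting a post-shearing increase in energy
# requirement into extra supplementary feed. Every input below is an
# illustrative assumption, not a figure from the studies cited above.

def extra_feed_kg(maintenance_mj_per_day: float,
                  increase_fraction: float,
                  duration_days: int,
                  feed_mj_per_kg_dm: float) -> float:
    """Extra kg of feed dry matter needed to cover the increase."""
    extra_energy_mj = maintenance_mj_per_day * increase_fraction * duration_days
    return extra_energy_mj / feed_mj_per_kg_dm

# Assumed example: a dry ewe at ~8 MJ ME/day maintenance, a 40% increase
# lasting the 4 weeks after shearing, fed grain of ~10 MJ ME/kg DM:
print(round(extra_feed_kg(8.0, 0.40, 28, 10.0), 2))  # ~9 kg DM per head
```

Even under these modest assumptions, the increase amounts to several kilograms of supplement per head, which must otherwise come from pasture or bodyweight loss.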
Table 2.4: Summary of the results of studies examining changes in sheep feed requirements following shearing at different times of the year

Sheep breed & class | Time of shearing | Location | Change in feed intake or requirements | Observation period post-shearing (weeks) | Method used to estimate feed requirement | Reference
Merino wethers | July | Canberra, ACT | ~20–70% | 0–4 | Total faecal nitrogen output | (Arnold et al. 1964)
Merino & Corriedale | August | Hamilton, Vic. | 10–44% (fat–thin sheep) | 0–4 | Total faecal nitrogen output | (Arnold and Birrell 1977)
Merino & Corriedale | August | Bakers Hill, WA | 0 (but weight gain of shorn …) | 0–4 | Total faecal nitrogen output | (Arnold and Birrell 1977)
Merino wethers | July | Armidale, NSW | 30%, 59% | 0–1, 1–2 | Cr2O3 faecal marker | (Hutchinson and McRae 1969)
Merino wethers | ? | Hamilton, Vic. | 0%, 14%, 0% | 0–1, 2–5, 6 | ? | (Love et al. 1978)
Merino wethers & ewes (non-pregnant) | December | Armidale, NSW | 42–62% (grazing), 20–51% (penned) | 0–2 | Cr2O3 faecal marker | (Wheeler et al. 1963)
Polwarth ewes (pregnant) | July–August | Cressy, Tas. | 26%, 9% | 0–4, 4–8 | Intake required to maintain bodyweight | (Hudson and Bottomley 1978)
Polwarth ewes (non-pregnant) | July–August | Cressy, Tas. | 69%, 27% | 0–4, 4–8 | Intake required to maintain bodyweight | (Hudson and Bottomley 1978)
Polwarth ewes (spring-lambing) | July | Cressy, Tas. | 22% | 0–7 | Intake required to maintain bodyweight | (Bottomley and Hudson 1976)
Polwarth ewes | ? | Cressy, Tas. | 33%, 33% (non-pregnant); 28%, 23% (pregnant) | 0–4, 4–8 | ? | (unpublished data cited by Bottomley 1979)
Romney ewes (non-pregnant) | January–April | Canterbury, NZ | 24% | 0–13 | Energy associated with bodyweight loss | (Elvidge and Coop 1974)
Romney ewes (non-pregnant) | April–July | Canterbury, NZ | 73% | 0–10 | Energy associated with bodyweight loss | (Elvidge and Coop 1974)
Romney ewes (3rd trimester) | June–July | Canterbury, NZ | 50% | 0–6 | Energy associated with bodyweight loss | (Elvidge and Coop 1974)
Pregnant ewes | June | Swansea, Tas. | 61% (lambing in October), 18% (lambing in July) | 0–12 | Computer simulation | (Black and Bottomley 1980)

Although energy requirements probably increase immediately after shearing (Farrell and Corbett 1970; Holmes et al. 1992), some studies in grazing sheep have reported a lag of one to three weeks between shearing and an increase in feed intake (Wheeler et al. 1963; Wodzicka-Tomaszewska 1963; Love et al. 1978). On the other hand, Hutchinson and McRae (1969) observed feed intake to increase in the week immediately after shearing. The reason for these differing observations is not clear but could reflect either true physiological or methodological differences. Acute cold stress can depress grazing behaviour in sheep (Sykes and Slee 1969), although this usually only persists for a few days after shearing (Webster and Lynch 1966; Donnelly et al. 1974). Alternatively, the chromium oxide faecal marker technique, used in several of the above studies, does not reflect small changes in feed intake well (Holmes et al.
1992), so changes in feed consumption immediately after shearing, before heat production reached its maximum (Farrell and Corbett 1970), might not have been readily detected.

It is not clear how long the period of increased energy requirement following shearing lasts. Hudson and Bottomley's study (1978) compared the intakes of shorn and unshorn sheep for eight weeks after shearing, the longest published comparison. Shorn sheep were still eating more than unshorn ones at its conclusion, suggesting that feed intake is increased by shearing for at least two months. Although no field experiment has shown how long the effect of shearing on intake persists, a computer simulation suggested that the residual effects of shearing on energy requirement last for at least six months (Black and Bottomley 1980). This finding is consistent with observations of heat production in sheep following shearing: in grazing wethers shorn in June in Armidale, heat production did not return to pre-shearing levels until 19 weeks after shearing, despite the sheep being thermoneutral in their measurement environment much earlier than this (Farrell and Corbett 1970). This led to the conclusion that the increase in metabolic rate, and hence energy requirement, following shearing is due to a metabolic adjustment that, although proportional to the degree of cold exposure following shearing, lasts much longer than the stimulus does. Thus it is likely that the energy requirement of sheep shorn in winter may remain elevated for several months following shearing.

A sheep will lose weight if it cannot increase its feed intake after shearing to meet its greater energy requirement (Elvidge and Coop 1974; Bottomley and Hudson 1976). In southern Australia, this is most likely to occur with a winter shearing time, when cold conditions stimulate the greatest increase in energy requirement but feed availability often limits intake to less than that required to maintain bodyweight.
Avoiding weight loss in this situation may require supplementary feeding or a reduction in stocking rate, to increase pasture availability. For example, Hudson and Bottomley (1978) estimated that the stocking rate of pregnant and non-pregnant sheep shorn in winter would have to be about 13% and 33% lower, respectively, than unshorn sheep, to avoid bodyweight loss immediately after shearing. Similarly, the computer simulation of Black and Bottomley (1980) estimated that winter shearing of ewes with an optimised mid-October lambing time lowered stocking rate by 38%, compared to spring shearing. In summary, the substantial and prolonged increase in energy requirement that may occur after shearing in winter, combined with the generally poor pasture availability at this time, probably makes it an undesirable choice of shearing time in southern Australia.

Shearing time can also affect farm stocking rate through its interaction with the timing of the sale of cull sheep. In southern Australia, surplus sheep are commonly sold in early summer, as a risk management strategy, to reduce grazing pressure before pasture availability decreases prior to the autumn break (Vizard and Foot 1994). If this occurs at, or soon after, the main shearing time, sheep leave the farm carrying only a small amount of wool, and little wool value is foregone (McLeod 1967). A main shearing time falling between late summer and early winter, however, comes several months after the preferred time for cull sales, and has two possible outcomes. Firstly, as described in Chapter 1, cull sheep can be shorn immediately prior to sale, but this produces short staple length wool that receives substantial price discounts (Independent Commodity Services 2005). Alternatively, sheep can remain on the farm to be sold after the main shearing, but simulations of a farm enterprise suggested that this can substantially reduce gross margins by reducing stocking rate (Salmon et al. 2006).
The reduction occurs because selling sheep after shearing in autumn reduces winter grazing pressure, and hence annual per-hectare wool production and income. Even if extra stock are retained to maintain winter stocking rate, the gross margin is still reduced because extra supplementary feeding is often required to carry the extra animals throughout summer and autumn until they can be shorn. In Salmon et al.'s simulation, selling culls after shearing in May reduced winter stocking rate in a wether enterprise by 17%, and maintaining winter stocking rate increased summer supplementary feeding costs by 54%, compared to December shearing. This is another undesirable consequence of late autumn or winter shearing in southern Australia.

In New Zealand, the disadvantage of winter shearing to ewe nutrition and hypothermia risk has been addressed through the use of 'cover combs', which leave a greater depth of residual fleece on the sheep (Dabiri et al. 1995). Even when cover combs are used, sheep metabolic rate and energy requirement increase after shearing (Holmes et al. 1992), which may reduce winter stocking rate. Therefore, it is difficult to advocate winter shearing until benefits can be consistently demonstrated that outweigh its known disadvantages.

There is a common belief amongst farmers that shearing young sheep increases their bodyweight, despite a number of Australian studies showing that the growth rate of Merino or Merino-cross weaners does not increase after shearing, compared to unshorn ones (Tucker 1964; Sumner 1984; Donnelly 1991a; Irving 1991; Rogan et al. 1995; R. Thompson pers. comm.). Although these studies used different times of lambing, all except the unpublished work of Thompson compared the effect of shearing three to six month old lambs at weaning. In contrast, Thompson's study, conducted in western Victoria, compared spring-born Merinos first shorn when they were three, six, twelve or 15 months old.
He also found no bodyweight differences at 15 and 27 months of age. The one study in which shearing was associated with increased bodyweight of weaners was conducted by Drinan and Ferguson (1966), who found that winter-born, second-cross weaners shorn at weaning in December were about 2 kg heavier than unshorn sheep in February. Some, but not all, of this difference was caused by weight loss in several unshorn sheep that were affected by flystrike. However, in this study, as in the others, economic returns were not increased by shearing because the cost and lower wool value associated with the extra shearing were not offset by any increases in bodyweight, carcase value or wool production. In conclusion, although early shearing is not generally associated with increased bodyweight of weaners, there may be particular animal health situations, such as flystrike, in which an increase occurs.

2.2.2 Reproductive Efficiency

It has been suggested that reproductive efficiency, as measured by the number of offspring successfully weaned as a percentage of the number of ewes mated, may be influenced by the time of shearing. Depending on its timing relative to the reproductive state of the ewe, shearing has the potential to directly influence several stages of the reproductive pathway, including mating (MacKenzie et al. 1975), gestation, parturition and lamb survival (Kenyon et al. 1999). The time of shearing can also have an indirect effect on reproductive efficiency through its influence on the energy requirement and body condition of the ewe (Hudson and Bottomley 1978). The time of shearing, relative to reproduction, and the ways it affects the above events are discussed in the remainder of this section.

2.2.2.1 Shearing During Mating

A study of Bungaree Merino ewes in New South Wales concluded that shearing three months prior to joining might improve the success of mating, compared to shearing prior to lambing (McGuirk et al. 1966).
In the study, ewes were shorn either in June or December and were mated in late February. By the end of the mating period, 72% of June-shorn ewes had been mated, compared to 88% of December-shorn ewes. No comment was made about other factors that could have directly affected fertility, such as ewe condition score, which may or may not have been associated with shearing time. Despite the statistically significant difference in the proportion of ewes mated, there was no significant difference in the percentage of ewes lambing between the two shearing times (33% vs. 30%).

Several studies have observed deleterious effects of shearing just prior to mating on reproductive parameters. Parr et al. (1989) found that Merino ewes in Victoria shorn less than three weeks before mating had a lower incidence of oestrus than ewes shorn more than three weeks before mating. Similar observations have been made in New Zealand in Romney Marsh ewes that were naturally mated in autumn (McMillan and Knight 1982). MacKenzie et al. (1975) suggested that the main effect of shearing was to suppress behavioural oestrus, rather than inhibit ovulation. They examined Merino × Border Leicester ewes that were either left unshorn or shorn one week prior to exposure in winter to the severe, cold weather of the New South Wales Tablelands, when they were either fed or fasted. No difference in ovarian activity or ovulation rate was observed between shorn and unshorn ewes at necropsy, but fewer fasted or fully-fed shorn ewes exhibited oestrus signs, compared to unshorn ewes (three, six and ten ewes out of ten, respectively). Similarly, Fernandez et al.
(1996) found that Corriedale ewes mated in summer in Uruguay (at a similar latitude to south-eastern Australia) were unaffected by shearing if it occurred between three months and three weeks before mating, but that ewes shorn closer to mating had a lower incidence of oestrus, despite shearing time having no effect on ovarian activity, as observed laparoscopically.

2.2.2.2 Shearing & Lamb Birthweight

Birthweight is an important factor in neonatal lamb survival (McCutcheon et al. 1981). Excessively heavy lambs, usually singletons, are more likely to experience a difficult or prolonged birth, which causes anoxia or central nervous system trauma that then predisposes the lamb to the neonatal starvation-mismothering syndrome (Dalton et al. 1980; Haughey 1981). Low birthweight lambs, commonly twins, are at risk of death from starvation, as well as hypothermia in adverse weather (Alexander et al. 1980; McCutcheon et al. 1981). Thus the manipulation of lamb birthweight may alter rates of lamb survival, especially if fewer lambs are born at excessively light or heavy weights.

Numerous studies, many of which were performed in Romneys in New Zealand, have investigated the effect of shearing during pregnancy on lamb birthweight, but they have not produced consistent results. For example, two studies showed that shearing during pregnancy had no effect on the birthweight of single or twin lambs (Dabiri et al. 1994; Dabiri et al. 1996); three studies showed that shearing during pregnancy increased the birthweight of singletons only (Morris et al. 2000; Kenyon et al. 2002b; Revell et al. 2002); two studies showed that shearing increased the birthweight of twins only (Black and Chestnutt 1990; Morris 1997); whilst one study showed that shearing increased the birthweight of both singles and twins (Kenyon et al. 2002c).
The variable results may reflect differences in ewe nutrition, in environmental conditions at the time of shearing (as determined by lambing time) or in the timing of shearing relative to gestation. Nevertheless, the effect of shearing during pregnancy on birthweight appears to be variable and difficult to control, and does not always achieve the aim of increasing the birthweight of lightweight lambs without adversely affecting the weight of heavier ones. More consistent, and greater, beneficial effects on lamb birthweight and survival are known to be achievable by ensuring that ewes are not undernourished during pregnancy (Kenyon et al. 2002b). Some of the studies that observed an association between pre-lamb shearing and birthweight showed no improvement in lamb survival (Alexander et al. 1980; Grosser et al. 1991; Dabiri et al. 1994; Falck et al. 2001; Kenyon et al. 2002a; Kenyon et al. 2006b). Two studies concluded that pre-lamb shearing improves lamb survival, but the evidence for their conclusions is weak. In one, the conclusion was based on a difference in survival observed in one of two years that was statistically non-significant (73% vs. 64% survival, P = 0.08, in South African Merinos; Cloete et al. 1994). In the other, shearing increased survival of Romney-cross lambs by 6% at one trial site in New Zealand but had no effect on survival at another (Kenyon et al. 2006a).

2.2.2.3 Shearing & Ewe Sheltering Behaviour

Shearing during pregnancy has also been examined to determine whether it can improve lamb survival by encouraging ewes to seek shelter around the time of lambing. An experiment showed that substantially more shorn than unshorn Merino ewes sought shelter beside windbreaks around the time of lambing during winter at Armidale, New South Wales (Lynch and Alexander 1977). 
A separate experiment showed that shearing up to four weeks prior to lambing stimulated sheltering behaviour, but that fewer ewes sought shelter at lambing if they were shorn eight weeks beforehand (Lynch and Alexander 1980). Despite these behavioural responses, no studies have reliably demonstrated that this leads to improved lamb survival (Lynch and Alexander 1976; Alexander et al. 1980). In Alexander and Lynch’s study, overall neonatal mortality was relatively low (12%) and there were few twin births, which may have affected the statistical comparisons, but fewer singleton lambs born to shorn ewes died within three days of birth than lambs born to unshorn ewes (7% vs. 16%, 0.05 < P < 0.1). Lynch and Alexander’s study reported data from five consecutive years and showed no statistically significant effect of shearing on the mortality of singletons (P = 0.16) or twins (P = 0.064). Furthermore, when bad weather occurred and ewes did not have shelter, deaths increased much more among lambs from shorn ewes than among lambs from unshorn ones. The fact that shearing encouraged ewes to seek shelter but did not improve survival may have been because shearing still did not induce a large majority of ewes to seek shelter (Lynch and Alexander 1977), or because it affected ewe nutritional requirements and lactation, and hence lamb nutrition and survival. The results of these studies suggest that providing abundant shelter per se, or strategically locating it where ewes are most likely to use it, is much more effective in improving lamb survival than using shearing to ‘force’ ewes to use pre-existing, limited shelter (Alexander et al. 1979; Alexander et al. 1980; Lynch et al. 1980).

2.2.3 Fleece Rot

Fleece rot is a bacterial exudative dermatitis, mainly associated with Pseudomonas spp., that primarily affects young sheep (Burrell 1988; Colditz et al. 1996). 
It is important to animal health and wool quality because it is a major predisposing factor for body flystrike and can permanently discolour wool (Gherardi et al. 1983; Raadsma 1988). Prolonged or severe wetting of the skin is a prerequisite for the development of fleece rot because it leads to maceration of the epidermis, which allows infection to become established. Wetting of the skin is influenced by fleece length, so time of shearing is important to the pathogenesis of the disease. Sheep carrying 4–6 months’ wool growth during wetting events known to induce fleece rot have had up to twice the prevalence of fleece rot of sheep carrying either less or more wool (Raadsma 1988). The greater prevalence appears to occur because fleeces 4–6 months after shearing absorb considerable quantities of water but are slow to dry out. In contrast, shorter wool dries quickly and longer wool reduces the risk of saturation of the skin in the first place (Copland 1986). A further two experiments showed a relationship between shearing time and fleece rot, although their results differed slightly from those above (Hemsley et al. 1984). In both experiments, shearing weaners increased the subsequent prevalence of fleece rot, compared to unshorn sheep. In the first trial, fleece rot was assessed at 13 months of age and more fleece rot occurred in weaners shorn at five months of age than in unshorn weaners. In the second trial, fleece rot was assessed at seven months of age and was more prevalent amongst weaners shorn at three months of age than amongst unshorn weaners of the same age or two months younger. Although neither weather conditions nor staple length at the onset of fleece rot were reported in either experiment, the first set of results could be consistent with Raadsma’s (1988) observation that sheep with wool 4–6 months long are most susceptible to fleece rot. 
In the second experiment, a shorter wool length than that reported by Raadsma was associated with more prevalent fleece rot, because more fleece rot developed in weaners with 0–4 months wool (shorn group) than in ones with 0–7 or 0–5 months wool (unshorn groups seven and five months old, respectively). The younger unshorn weaners had less fleece rot than the shorn ones, although their staple lengths were similar, and Hemsley et al. suggested that retention of the wool present at birth, or ‘lambs’ tip’, on the end of the staple reduced the development of fleece rot. Few sheep were used in this experiment, and other differences between Hemsley et al.’s and Raadsma’s trials, such as sheep genotype and environmental conditions, might also have contributed to the variation in results.

2.2.4 Dag

Dag, or faecal soiling of the breech area in sheep, is important because it predisposes sheep to breech flystrike (Watts et al. 1979), and affects enterprise profitability through the cost of crutching and lowered income from affected wool (Larsen et al. 1995). Dag accumulates more readily in longer wool (French et al. 1996) but few studies have investigated the relationship between dag accumulation and time of shearing. Nevertheless, it is likely to be an important relationship because shearing time determines wool length at different times of the year and, in southern Australia, the diarrhoea that causes dag is a seasonal phenomenon that tends to occur during winter and early spring (Larsen et al. 1994). In a study in western Victoria, different weaner shearing times incurred different crutching costs, although the actual differences were not reported (R. Thompson pers. comm.). In that study, all weaners were crutched in April when they were seven months old, but the presence of dag had necessitated jetting one month previously to control the risk of breech strike. 
2.2.5 Flystrike

Blowfly strike, or cutaneous myiasis, is one of the major diseases of sheep in Australia and has been estimated to be second only to intestinal parasitism in economic importance (Sackett 2006). Wool is an important factor in the development of flystrike because it traps urine, faeces and exudate on the sheep’s skin, which attract adult female flies and provide the moisture necessary for survival of deposited blowfly eggs and the establishment of strikes (Raadsma 1988). The principal blowfly species causing flystrike in Australia, Lucilia cuprina, is most active during spring and autumn in southern Australia, when there is sufficient environmental warmth and humidity for its development. Thus, time of shearing largely determines susceptibility to flystrike because it dictates the length of wool carried by a sheep at times when blowflies are present. The important types of flystrike in a self-replacing Merino flock are body, breech (around the crutch and tail region) and pizzle (around the preputial opening of rams or wethers) strike. The main predisposing factors for body strike are fleece rot and mycotic dermatitis, whilst soiling around the crutch with dag or urine greatly increases the risk of breech strike, which is the most common form of flystrike in Australia (Gherardi et al. 1983; Raadsma 1988; Wardhaugh and Morton 1990). The presence of urine and other moisture is the main predisposing factor for pizzle strike (Raadsma 1988). Shearing immediately before the onset of the main period of blowfly activity, which is usually spring in southern Australia, is a major control strategy for flystrike (Fels 1971). It has the advantages that it achieves protection without the use of insecticides, it has to be performed anyway, and it is effective in preventing body, breech and pizzle strike (Raadsma 1988). Shearing usually protects against breech strike for up to three months and against body strike for up to two months. 
However, in the case of body strike, shearing at the start of one risk period (spring) can increase susceptibility to flystrike at a later risk period (autumn), because sheep will then be carrying 4–6 months wool, the length most predisposed to developing fleece rot. Nonetheless, shearing usually improves overall flystrike control if it precedes the more significant period of fly activity. For example, a postal survey of 200 sheep producers in the New England region of New South Wales found that shearing in November was associated with a lower incidence of flystrike (Brodbeck and Hill 1984). Similarly, in western New South Wales, Kennedy et al. (1982) found that fewer spring-shorn ewes than autumn-shorn ewes were affected by flystrike in February (3.6% vs. 11.9%). Despite this difference, Kennedy et al. (1982) suggested that the greater incidence of mortality amongst spring-shorn ewes observed in a separate experiment may have been due to a higher prevalence of flystrike in these ewes during spring and summer. This appears to contradict the findings of Brodbeck and Hill, although Kennedy et al. did not specify whether the strikes occurred before or after shearing, whether they were of a particular type (for example, an increased prevalence of fleece rot in spring-shorn sheep in summer) or whether they were associated with particular rainfall events that increased susceptibility to strike.

2.2.6 Other Health Issues

Shearing can indirectly affect other aspects of sheep health if it alters risk factors for those conditions. For example, shearing before lambing may induce pregnancy toxaemia if ewes do not have access to feed whilst they are yarded for shearing, or if shearing significantly increases their energy requirements (Irving 1991). 
Shearing has been shown to significantly increase a sheep’s susceptibility to photoactive toxins, such as St John’s Wort (Hypericum), because it exposes the skin to sunlight, which converts toxin molecules in the circulation into a physiologically active state (Bourke 2003). Shearing also increases sensitivity to hot ambient temperatures (Parer 1963) because, although fleece is a thermal insulator, it also provides excellent protection against heating by solar radiation (Bottomley 1979). Thus, in various situations, shearing might be timed to avoid high-risk periods associated with particular conditions, such as prior to lambing or sunny times of the year, if they are important on a farm. Choice of shearing pattern is also important if a lice eradication program is being conducted on a farm, because eradication is facilitated if all sheep on a property can be shorn and treated with insecticide at the same time (Hungerford 1990). This might mean, for example, that young sheep are shorn at the same time as adults in the year that an eradication program is undertaken. As discussed in Section 2.1.4, shearing is an important factor affecting the accumulation of vegetable matter in wool. In addition to its effects on wool quality, grass seed infestation can be an important sheep health issue because seed awns caught in the wool can penetrate the skin and eyes, leading to discomfort, loss of production, infection and death, particularly in young sheep (Campbell et al. 1972; Cornish and Beale 1974). Because shearing reduces the accumulation of vegetable matter in wool (Ritchie et al. 1992), its timing might be chosen so that sheep carry short wool before the time of major seed production by problem pasture species, such as barley grass (Hordeum spp.) and spear grass (Stipa spp.). 
This is a relatively common practice amongst farmers (Cornish and Beale 1974), although one study reported no difference in liveweight gain or feed intake between unshorn and shorn Merinos grazing pastures containing problem species at Trangie, New South Wales (Warr and Thompson 1976). The authors suggested that the lack of production differences between shearing groups arose because grass seeds caused discomfort to shorn sheep, making them reluctant to move around the paddock and graze, thereby counteracting the benefit of shearing. However, several other experiments have shown that shearing is likely to reduce the risk of death from grass seed infestation and improve production. Weaner mortality has been shown to be highly correlated (r² = 0.52) with the density of barley and spear grass seeds in the pasture (Campbell et al. 1972), and shearing Merino-cross weaners grazing maturing barley grass pastures has been shown to give an immediate and sustained advantage in growth rate over unshorn sheep (Holst et al. 1996). As with vegetable matter contamination and wool quality, the importance of grass seed infestation to animal health will vary according to the prevalence of problem species in pastures and, where grass seeds are a potential problem, control measures other than shearing, such as manipulation of pasture species, may be more appropriate or cost-effective. For example, Little et al. (1993) showed that shearing weaner sheep once was more profitable than shearing them twice, despite the extra shearing being timed to assist in the control of grass seed infestation and flystrike risk. In that instance, very effective control of grass seed infestation was achieved by slashing and spraying pastures, so shearing afforded the sheep little extra protection but increased costs, thus reducing net wool returns. 
2.2.7 Mortality

Shearing has been shown to increase the mortality risk of sheep, although this occurs intermittently and is influenced by a number of other factors (Hutchinson and Bennett 1962). Nevertheless, anecdotal evidence suggests that, when it occurs, post-shearing mortality can be severe. For example, during a period of windy, rainy weather in September in south-eastern South Australia, 8% of sheep that had been shorn in the preceding fortnight died (Geytenbeek 1962), and at Darkan, Western Australia, 28% of sheep shorn in the preceding month (approximately 100,000 animals) died during storms in January (Buckman 1982). In an Australia-wide survey, Hutchinson (1968) reported that, whilst about half of respondents reported no post-shearing deaths, 6% reported that more than 2% of their sheep died in the month following shearing. At that time, the average mortality rate of 0.7% within a month of shearing translated to more than one million sheep dying after shearing annually in Australia, although this figure was not compared to estimates of death rates of unshorn sheep. In direct field studies comparing shorn and unshorn sheep, shearing has been shown to increase mortality risk 17-fold in Merino wethers at Armidale, New South Wales in April (Hutchinson and McRae 1969), five-fold in Border Leicester-cross ewes at Palmerston North, New Zealand in July (Dabiri et al. 1995) and by 63% amongst Corriedale ewes in California, regardless of whether they were shorn in April, July or December (Torell et al. 1969). The difference in these risk ratios may be at least partially explained by the period over which the observations were made: a fortnight, a month and three to five months in the Armidale, Palmerston North and Californian studies, respectively. 
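Risk ratios of the kind reported in these field studies are simple to compute from group sizes and death counts. The following sketch (the counts are invented for illustration only, chosen to mirror the magnitude of the Armidale result, and are not data from any cited study) calculates a relative risk of post-shearing death with an approximate 95% confidence interval:

```python
import math

def relative_risk(d_exposed, n_exposed, d_control, n_control):
    """Risk ratio of death for shorn ('exposed') vs unshorn ('control') sheep,
    with an approximate 95% Wald confidence interval on the log scale."""
    r1 = d_exposed / n_exposed      # mortality risk in shorn group
    r0 = d_control / n_control      # mortality risk in unshorn group
    rr = r1 / r0
    # standard error of log(RR) for a cohort design
    se = math.sqrt(1 / d_exposed - 1 / n_exposed + 1 / d_control - 1 / n_control)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# Invented example: 17 of 100 shorn wethers die vs 1 of 100 unshorn,
# giving a 17-fold relative risk of the size reported at Armidale.
rr, ci = relative_risk(17, 100, 1, 100)
print(rr, ci)
```

Note how wide the confidence interval is when the unshorn group records only a single death: small death counts, common in these trials, make the risk ratios imprecise, which is one reason the published estimates vary so widely.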
Hutchinson’s (1968) and Geytenbeek’s (1962) surveys showed that the majority of post-shearing mortalities occur within a fortnight of shearing and that risk is particularly great in the first few days after shearing. Post-shearing deaths are generally attributed to hypothermia, associated with adrenal gland insufficiency following excessive stimulation of the pituitary-adrenal system (Panaretto 1967; Hutchinson and McRae 1969). Sheep are at greater risk of dying if shearing precedes periods of cold, wet or windy weather (Hutchinson 1968), which has also been shown to be associated with altered adrenal function (Reid 1962). Under such environmental conditions, the ambient temperature may fall below the sheep’s lower critical temperature, the temperature below which it must produce extra heat in order to maintain homeothermy (Gregory 1995). The further the ambient temperature falls below the lower critical temperature, the greater the requirement for extra heat production; if the sheep cannot produce this extra heat, it will become hypothermic and die. The impact of shearing on lower critical temperature is not restricted to the removal of the fleece. Sheep can experience several days of fasting around shearing as they are mustered and held away from pasture, and because acute cold stress after shearing discourages sheep from grazing (Hutchinson and McRae 1969; Sykes and Slee 1969; Donnelly et al. 1974). Fasting reduces ruminal heat production, thus raising the lower critical temperature and the risk of death (Donnelly 1991b). The particularly severe mortality risk experienced by sheep soon after shearing occurs for two reasons. Firstly, a sheep must rely on tissue catabolism and non-metabolic thermogenesis, such as shivering, to meet its need for extra heat production while appetite is suppressed in the immediate post-shearing period (Farrell and Corbett 1970; Holmes et al. 1992). 
Secondly, although metabolic acclimatization, to the point where non-metabolic mechanisms are no longer required, occurs within about two weeks of shearing (Donnelly et al. 1974), feed intake may take longer to increase (Wodzicka-Tomaszewska 1963), requiring further tissue catabolism for heat production. Thus, during the fortnight after shearing, sheep are particularly vulnerable to hypothermia if non-metabolic mechanisms or body energy reserves cannot generate sufficient heat. The important link between body condition and risk of death after shearing has also been identified in experiments and surveys showing that more sheep in thin condition died after shearing than sheep in moderate to good condition (Hutchinson and Lynch 1966; Hutchinson 1968). In the 1966 study, bodyweight and growth rate prior to shearing accounted for 82% of the variation in mortality, with rate of weight loss being much more important than absolute weight in explaining the variation. In a subsequent experiment, only rate of weight loss, and not body condition or post-shearing pasture availability, was associated with mortality risk, suggesting that sheep in a catabolic state before shearing are at greater risk of death afterwards (Hutchinson and McRae 1969). Similarly, weight loss in the months prior to shearing has been shown to be associated with increased susceptibility to cold off shears (Slee 1966). This may occur because sheep experiencing nutritional stress have a reduced metabolic rate. Donnelly et al. (1974) showed that such animals experience prolonged weight loss after shearing and, since body fat reserves are a primary source of energy for increased heat production following shearing (Holmes et al. 1992), thin body condition will impede a sheep’s ability to maintain body temperature after shearing, putting it at greater risk of death. 
In contrast to the above studies, no association was reported between body condition and post-shearing mortality of sheep exposed to severe storms in January 1982 at Darkan in south-western Western Australia (Buckman 1982), although this may reflect the difficulty of collecting accurate bodyweight data via a survey of farmers after the mortality events had occurred. In this report, the large number of deaths after shearing was attributed to sheep not being metabolically acclimated to cold weather during the summer months. Thus, shearing sheep at a time when their body condition is decreasing, when they are under nutritional stress or when cold storms are likely carries a greater risk of post-shearing mortality. In southern Australia, this combination is most likely in autumn, when sheep have lost weight in the preceding months, environmental temperatures are cooling and rain is becoming more likely (Hutchinson and McRae 1969). Furthermore, at this time of the year sheep have experienced warm environmental conditions for several months and thus are not acclimated to cold. Cold-acclimated sheep have greater basal heat production and are more sensitive to the thermogenic effects of circulating catecholamines, making them better able to withstand sudden exposure to colder weather (Graham and Christopherson 1981). In summary, sheep shorn in autumn will require a greater increase in basal heat production, and greater stimulation by endogenous catecholamines to achieve it, than sheep shorn at colder times of the year, and are likely to have smaller body energy stores that are more slowly mobilised to increase heat production. All of these factors increase the likelihood of death from hypothermia after shearing in autumn, compared to other times of the year. 
No studies have specifically examined the effect of shearing time on mortality of weaner Merino sheep, but it could be expected that the underlying physiological mechanisms and risk factors would be similar, although the greater body surface area to weight ratio of the smaller animal might make it more susceptible to chilling than an adult. The factors that have been shown to be associated with deaths of young sheep are reviewed in the following section.

2.3 Mortality of Weaner Merino Sheep

This review discusses the extent of Merino weaner mortality in Australia, its underlying causes and factors that have been shown to be associated with mortality in weaner sheep flocks in Australia and overseas. In Merinos, weaning is recommended to occur at approximately 12 weeks of age (Lean et al. 1997), and the term ‘weaner’ usually refers to a sheep between weaning and about 12 months of age. No maximum acceptable mortality rates of sheep between weaning and 12–18 months of age exist in the published literature; however, a maximum of approximately 4% has been suggested as a realistic target (Larsen 2002; Behrendt 2003).

2.3.1 The Extent of Merino Weaner Mortality in Australia

The problem of excessive mortality of weaner sheep, and Merinos in particular, is not new to the Australian sheep industry. For example, whilst visiting the Midland and Great Southern Districts of Western Australia, Sir Ian Clunies Ross (1934) reported that farmers were experiencing “considerable difficulty” rearing young sheep. It is difficult to assess the current extent of Merino weaner mortality in Australia because very few recent surveys or scientifically based estimates of weaner mortality in pasture-based grazing enterprises exist. Despite a paucity of reports concerned primarily with weaner mortality, commercial surveys and field experiments that incidentally or indirectly report mortality data allow estimates to be made. 
The average and range of Merino weaner mortality obtained from such studies in Australia are listed in Table 2.5. The table indicates that rates of Merino weaner mortality exceeding 10% per annum occur in many regions of Australia, and that death rates have not decreased in recent times. Excessive post-weaning mortality can have important, adverse effects on the efficiency of a farming enterprise. The death rate of 24% in Merino weaners reported by Rose (1972) in the semi-arid environment of north-western Queensland was more than twice the maximum that could be sustained by the flock if any form of genetic selection through culling were to occur. In an analysis of reproduction over four years in a Merino flock in a similar environment, post-weaning mortality was identified as one of the three most serious causes of reproductive wastage (Kennedy et al. 1976). Such high rates of mortality in northern Australia have also been reported more recently: in an experiment conducted in the late 1980s under grazing conditions common to the region, 14% of weaner sheep died in only three months (Cobon et al. 1990). The mortality results from some of the studies listed in Table 2.5 warrant further discussion because of particular details or because it is difficult to summarise the reported death rates. Harris and Nowara’s (1995) survey of sheep mortality was of mixed cropping-sheep enterprises in the Victorian Mallee. Mortality patterns in these enterprises may differ from grazing-only farms because of the availability of grain for supplementary feeding and the opportunity to graze crop stubbles. In this study, average Merino weaner mortality was 4% per annum, which is low compared to the other listed studies. However, more than one fifth (6/29) of Merino flocks had annual weaner mortality rates higher than 5% and, in 8% of all flocks surveyed, between 10 and 25% of weaners died each year. 
Studies reporting specific animal health conditions, such as trace element or vitamin deficiencies, or gastrointestinal parasitism, may include mortality data. In such situations, high mortality might not be unexpected in the absence of treatment. However, in the studies listed in Table 2.5, weaner mortality was still high in the groups that received treatment. For example, an average of 59% of Merino weaners died of vitamin A deficiency in a feeding trial of drought rations (Franklin et al. 1955), yet the death rate was still 16% amongst weaners that received vitamin A. In selenium supplementation trials in Victoria, Western Australia and South Australia, 6–16% of Merinos receiving selenium still died after weaning (Gabbedy 1971; Walker et al. 1979; Holmes 1992). Mulhearn (1958) reported up to 50% post-weaning mortality amongst Merinos and Corriedales in the high-rainfall zone of South Australia but could attribute only some of these cases to trace element deficiency or specific diseases.

Table 2.5: Mortality of Merino and Merino-cross sheep between weaning and approximately 18 months of age reported in Australian commercial enterprises or field experiments

Post-weaning mortality (range) | Location | Reference
16% | feedlot | (Franklin et al. 1955)
up to 50% | south-eastern South Australia | (Mulhearn 1958)
25% | north-western Queensland | (Moule 1966)
7% | Western District, Victoria | (Russell 1968)
14% | Western Slopes, NSW | (Drinan 1968)
6% (up to 24%) | southern Western Australia | (Gabbedy 1971)
24% | Julia Creek, Queensland | (Rose 1972)
12–22% | Western District, Victoria | (Anderson et al. 1976)
18% | Central Victoria | (McDonald 1975)
14% | Broken Hill, NSW | (Kennedy et al. 1976)
7–10% | feedlot | (Davis et al. 1976)
10% | Kybybolite, South Australia | (Brown 1977)
17% (11–29%) | Kangaroo Island, South Australia | (Walker et al. 1979)
5% (up to 13%) | Katanning, Western Australia | (Norris 1984)
6% | Western Slopes, NSW | (Langlands et al. 1984)
up to 56% | Kybybolite, South Australia | (Brown et al. 1985)
39% | Condobolin, New South Wales | (Denney et al. 1988)
22% | n/a | (Mulholland 1986 cited by Allworth 1994)
14% | Julia Creek, Queensland | (Cobon et al. 1990)
11% | East Gippsland, Victoria | (Holmes 1992)
4% (up to 25%) | Mallee, Victoria | (Harris and Nowara 1995)
4% (1–10%) | Hamilton (Vic), Struan (SA) & Cowra (NSW) | (Fogarty et al. 2005a)
12% (2–27%) | Victoria | (J. Webb Ware pers. comm.)

Field trials assessing parasite control programs have reported high mortalities even in weaner groups receiving recommended worm control treatments. For example, 12% of Corriedale weaners died in a trial in the Western District of Victoria, despite the use of double summer drenching, a strategy which remains the mainstay of effective worm control in that environment (Anderson et al. 1976). In a similar experiment involving Haemonchus, Trichostrongylus and Ostertagia infections of sheep in East Gippsland, 13% of Merino weaners died in one year despite weekly anthelmintic treatment, and 19% died in groups managed under the common district approach of the time, which involved giving between three and six annual treatments (Barton and McCausland 1987). All of these results suggest that excessive weaner mortality cannot always be attributed to a single, treatable aetiology. It is striking that virtually none of the studies listed above comment on the rates of weaner mortality that they report. This suggests that mortality rates of 10%, or even 20%, are not considered unusual or noteworthy, and may explain how excessive mortality could exist in the Australian sheep industry yet receive little attention. For example, in Barton and McCausland’s (1987) trial of parasite control, ‘salvage’ drenching “to…prevent…high mortality” only occurred once deaths exceeded 10%, implying that 10% mortality was not considered to be concerning or controllable. 
This attitude does not appear to differ from one expressed in the 1950s, when Franklin (1955) considered that supplementary feeding during drought satisfactorily reduced mortality when only 15% of Merino weaners died. In an extensive review of progress in reproductive research in the Australian sheep industry, Scaramuzzi (1988) discussed the improvements made over many decades in our understanding of the drivers of neonatal lamb mortality but made no mention of the extent, or role, of post-weaning survival in the intergenerational ‘efficiency’ of sheep reproduction. Nonetheless, excessive weaner sheep mortality has been labelled one of the “biggest issues for Merino producers” (Ripper 2003, p. 2) and a recent economic evaluation suggested it is the fourth most costly animal health issue in the Australian sheep industry (Sackett 2006). Many farmers have difficulty successfully managing weaner survival. More than one quarter of Western District woolgrowers enrolled in an extension program for innovative farmers nominated successful weaner feeding, and achieving target growth rates and bodyweights, as their primary concern (Larsen 1999). Despite this, the fatalistic and frustrated attitude expressed by many farmers, and the very fact that so few recent reports of mortality on commercial farms exist, imply that the problem is under-reported, under-researched and possibly one that the industry feels is beyond its control.

2.3.2 Limitations of Weaner Mortality Investigations

Studies that seek to elucidate the risk factors underlying mortality must necessarily be observational because it is not ethical to observe animals dying without intervening. No study has specifically examined mortality in Merino weaners grazed in southern Australia or a similar, Mediterranean-type environment. 
Thus, because reports of weaner mortality are frequently incidental to a study’s principal aims, associations between proposed risk factors and mortality are often confounded with other variables. Plausible physiological explanations of the observed data can be offered; however, ethical considerations prevent their testing in controlled experiments. Furthermore, interpretation of mortality data is difficult in small trials where differences in mortality proportions may be due to the death of only one or two extra animals, and no Australian studies have used statistical techniques appropriate for analysing mortality data. Therefore, despite the different production systems examined, some quantitative estimates of the association between risk factors and weaner mortality must be obtained from overseas studies of sheep and goats.

Table 2.6: Summary of methodologies of studies reporting risk factors for post-weaning mortality

Breed/Species | Location | Features of production system | Observation period* | Exp/Obs study; n† | Analytical methodology | Study
Merino, Merino × | Kybybolite, SA‡ | autumn lambing | 6–12 | Exp; ~20×3 | descriptive | (Allden and Anderson 1957)
Merino | Adelaide, SA | autumn lambing | 6–12 | Exp; 24×4 | descriptive, χ2 test of proportions | (Allden 1968c)
Merino, other | western Vic | autumn lambing | ~6–12 | Obs; unspecified | descriptive case reports | (Engel 1958)
Rambouillet | north-western India (temperate climate) | extensive grazing + supplementary feeding | 5–12 | Obs; 3285 | difference of least squares means | (Ganai and Pandey 1996)
Merino, other | Mallee, Vic | mixed enterprises (sheep + cropping) | ?–12 | Obs; 79 farms × 2 years | descriptive | (Harris and Nowara 1995)
goat | Kenya (semi-arid savannah) | extensive grazing + supplementary feeding | 4–24 | Obs; 270 | discrete-time logistic regression based on cubic spline hazard functions | (Hary 2002)
Merino | Werribee, Vic | spring lambing; feedlot then pasture | ~4–12 | Exp; 10×3×3 | χ2 test of mortality proportions | (Hodge 1990)
goat | subhumid Nigeria | smallholder; foraging ± access to reserved fodder | 5–12 | Obs; 877 | univariate analysis of cumulative mortality using maximum likelihood estimates | (Ikwuegbu et al. 1995)
Merino, Merino × | SA | autumn lambing | 3–16 | Exp; 591 | χ2 test of mortality proportions | (Kleemann et al. 1983)
Merino | WA | autumn, winter or spring lambing at average to high district stocking rates | 3–12 | Exp; 375 | χ2 test of mortality proportions | (Lloyd Davies 1983)
Merino | NSW | drought feedlot | 3–6 | Obs; 209 | t-test of survivor vs. non-survivor weaning weights | (Lloyd Davies et al. 1988)
goat | French West Indies | irrigated pasture | 3–9 | Obs; 837 | parametric & semi-parametric survival analysis | (Mandonnet et al. 2003)
Merino, Corriedale | Hamilton, Vic | autumn, winter or spring lambing | 3–~11 | Exp; unspecified | χ2 test of mortality proportions | (McLaughlin 1973)
Merino, Corriedale | Kangaroo Island, SA | winter lambing | 6–10 | Exp; 20–40/group | descriptive | (Mulhearn 1958)
African sheep | Kenya (sub-humid tropics) | extensive grazing | 3–12 | Obs; 1785 | semi-parametric survival analysis, including time-varying covariates | (Nguti et al. 2003)
Indian sheep | north-western India (temperate) | extensive grazing + supplementary feeding | 3–6 | Obs; 1272 | difference of least squares means | (Nivsarkar et al. 1982)
sheep | north-western India (temperate) | extensive grazing | 3–6 | Obs; 10000 | descriptive | (Sharma et al. 1981)
meat sheep | USA | feedlot | 2–12 | Obs; 8642 | logistic regression, semi-parametric & parametric survival analysis | (Southey et al. 2001)
goat | Ghana | extensive grazing | 3–12 | Obs; 351 | χ2-test of mortality proportions; univariate odds ratios | (Turkson et al. 2004)
African sheep | Ghana | extensive grazing | 5–12 | Obs; 453 | χ2-test of mortality proportions; univariate odds ratios | (Turkson and Sualisu 2005)
African sheep & goat | Ghana | extensive grazing, average flock size = 10 | ~5–12 | Obs; 868 | χ2-test of mortality proportions; univariate odds ratios | (Turkson 2003)
Merino × | n/a | pen-study | 1–3 | Exp; 26 | correlation between weaning weight and duration of post-weaning survival | (Walker and Hunt 1980)
? | California | extensive grazing | ?–22+ | Obs; ~26×3 × 3 years | comparison of mortality proportions | (Weir and Torell 1967)

* age in months, beginning at weaning
† x × y means x number in y groups
‡ Abbreviations refer to Australian states: NSW: New South Wales; SA: South Australia; Vic: Victoria; WA: Western Australia

2.3.3 Causes of Mortality & Illthrift

Although weaner sheep can die from numerous diseases and syndromes, no single aetiological agent is consistently associated with Merino weaner mortality in Australia. It has thus been identified as a component of the ‘weaner illthrift’ syndrome (Gordon 1981) and a manifestation of illthrift in its most extreme form (McLaughlin 1967). However, its importance is usually alluded to without being specifically identified. For example, Wilkinson (1981) defines illthrift as a “failure of weaner sheep to thrive…when all other classes of sheep appear to be of satisfactory health” but does not mention mortality, despite going on to list illthrift’s primary manifestation as “survivability”. Weaner illthrift is a syndrome of multiple aetiologies that is frequently associated with the grazing of dry pastures (McLaughlin 1967). Inadequate nutrition more generally, ranging from insufficient protein or energy intake to specific trace element deficiencies, has been identified as the primary factor predisposing Merino weaners to illthrift and high mortality in Australia (Gordon 1981).
Weaners in poor body condition, particularly whilst grazing dry summer pastures and suffering insufficient energy or protein intake, are considered to be more susceptible to other diseases (Allworth 1983). After inadequate nutrition, gastrointestinal parasitism is a common secondary contributor to weaner illthrift and death (Gordon 1981). In winter rainfall environments, death solely due to parasitism is thought to be rare during summer (Wilkinson 1981). Rather, it is commonly considered that undernutrition places weaners at risk of mortality and that parasitism then precipitates death (Beveridge et al. 1985). A variety of diseases, either following these two major predisposing conditions or independent of them, have also been implicated in outbreaks of weaner mortality in southern Australia. These include myopathies (usually associated with selenium or vitamin E deficiency), acidosis due to cereal grain overload, polioencephalomalacia, grass seed infestation, flystrike, mycotic dermatitis, scabby mouth and lupinosis (Wilkinson 1981; Hungerford 1990). The risk factors associated with weaner mortality are discussed in the following sections, and the design and methodologies of the experiments cited therein are outlined in Table 2.6.

2.3.4 Factors Associated with Weaner Mortality

2.3.4.1 Bodyweight

Poor body condition and low growth rates at various ages have been shown to be associated with reduced survival of small ruminants after weaning. A postal survey of 79 enterprises in the Victorian Mallee (Harris and Nowara 1995) characterised patterns of sheep mortality on mixed farms (i.e. cereal cropping and sheep). In flocks experiencing more than 5% weaner deaths, loss of body condition was the clinical sign most frequently reported by farmers, with 86% of flocks in moderate or thin body condition when high mortalities were most likely to occur.
A number of field studies have further characterised the relationship between mortality and bodyweight (Ganai and Pandey 1996; Hary 2002; Turkson and Sualisu 2005). In general, these have shown that the relationship is independent of a weaner’s age at death and the cause of death. For example, in studies of Rambouillet lambs in India (Ganai and Pandey 1996), East African Dwarf goat kids in Kenya (Hary 2002) and Sahelian lambs in Ghana (Turkson and Sualisu 2005), animals lighter than the average birthweight of 2–3 kg were approximately twice as likely to die after weaning as heavier-than-average birthweight animals, even when weaning occurred up to five months after birth. In all instances, the association between birthweight and mortality persisted for as long as the progeny were observed, which was to one (lambs) or two (kids) years of age.

Weight at weaning has been shown to have a similar, persistent association with post-weaning survival. Increases in weaning weight of 5–10 kg in Merinos have been associated with approximately two- to four-fold decreases in post-weaning mortality risk, as follows:
• 9% deaths at 22 kg vs. 2% at 27 kg (Allden 1968c)
• 80% at 12 kg vs. 33% at 21 kg (Lloyd Davies 1983)
• 18% at 11 kg vs. 6% at 15 kg (Lloyd Davies et al. 1988)
• 21% at 15 kg vs. 4% at >27 kg (Holmes 1992)
All of these results equate approximately to an odds ratio of death of 1.3–1.5 for each 1 kg decrease in weaning weight. A similar mortality pattern across weight groups was also observed in a drought feeding study, even though the sheep were only followed until four months of age (Franklin et al. 1964). Two trace element supplementation trials showed that a greater proportion of weaners that were below average bodyweight at weaning died after weaning than ones of above average bodyweight.
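The conversion of the mortality pairs listed above into a per-kilogram odds ratio can be sketched in plain Python. This is an illustrative calculation only; it assumes each kilogram of weaning weight has the same multiplicative effect on the odds of death, which the original studies did not formally test.

```python
def odds(p):
    # Odds of death corresponding to a mortality proportion p
    return p / (1 - p)

def per_kg_odds_ratio(p_light, p_heavy, weight_diff_kg):
    # Overall odds ratio between the light and heavy groups, scaled to a
    # per-1-kg ratio assuming a constant multiplicative effect per kilogram
    overall = odds(p_light) / odds(p_heavy)
    return overall ** (1 / weight_diff_kg)

# Allden (1968c): 9% deaths at 22 kg vs. 2% at 27 kg weaning weight
or_allden = per_kg_odds_ratio(0.09, 0.02, 27 - 22)

# Lloyd Davies et al. (1988): 18% deaths at 11 kg vs. 6% at 15 kg
or_lloyd_davies = per_kg_odds_ratio(0.18, 0.06, 15 - 11)
```

Both pairs yield per-kilogram odds ratios of approximately 1.4, consistent with the 1.3–1.5 range quoted above.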
Amongst sheep grazing native pastures in East Gippsland, Victoria, 21%, 8% and 4% of sheep died in the lightest, middle and heaviest thirds of weaners respectively (Holmes 1992; Figure 2.5). On Kangaroo Island, South Australia, 75–83% of the weaners grazing improved pastures that died were lighter than the average weaning weight for their breed/strain (Walker et al. 1979). Similar results were observed in a feedlot trial involving Merino-cross weaners receiving wheat-based rations, where survivors were significantly heavier than weaners that died (6.5–9.5% deaths between 3 and 9 months of age; Davis et al. 1976). The similarity of the results reported by all the studies discussed above is striking because they were conducted in widely differing environments in Western Australia, South Australia and New South Wales, and in different production systems with different lambing times. They also included studies of sheep grazing pasture or being fed in drought or production feedlots.

It is difficult to compare directly the relative importance to survival of weight at birth, weaning and later times. However, two studies have shown that bodyweight had a similar association with risk of death of weaners, regardless of when it was measured. One examined survival of Red Maasai, Dorper and crossbred sheep in Kenya between weaning (at three months old) and twelve months of age (Nguti et al. 2003), during which time 31% (443/1442) of weaners died. Decreased weaning weight was associated with increased mortality rate, with the largest effect at low bodyweights. A 1 kg decrease in weaning weight increased the mortality risk of lightweight weaners by about 60%, equating to a hazard ratio of about 1.6, and this effect became progressively smaller for heavier weaners. A similar effect was estimated for bodyweight at any time in the post-weaning period if this time-varying term was used instead of weaning weight in the statistical model.
Although no techniques exist to readily estimate how comprehensively one survival model accounts for observed variation compared to another, this result suggests that it is bodyweight in general, rather than birth or weaning weight in particular, that is associated with mortality. The second study to analyse the effects on mortality of both weaning weight and general bodyweight examined deaths caused by gastrointestinal parasites of Creole goat kids grazing irrigated pastures in the tropical French West Indies (Mandonnet et al. 2003). The hazard ratio for a 1 kg decrease in weaning weight was 1.33; the hazard ratio for a bodyweight decrease of 1 kg at any time was 1.69. The reason for the difference between these two hazard ratios was not discussed. The inclusion of bodyweight as a time-varying covariate eliminated significant associations between mortality and three other explanatory factors: sex, rearing method and anthelmintic treatment. The authors noted that this did not necessarily imply that bodyweight was the only factor to significantly influence mortality. This was because kids in the study died after severe weight loss and, although other factors may have contributed to mortality, the reduction in bodyweight prior to death was so profound that other effects were statistically diminished to the point of non-significance. Only death caused by gastrointestinal parasites, which are not typically associated with acute mortality, was examined.

Figure 2.5: Mortalities in Merino weaner sheep, classified by weight at weaning, from: 3–12 months of age on pasture (Lloyd Davies 1983), 3–6 months of age in a drought feedlot (Lloyd Davies et al. 1988), and 4½–6 months of age on pasture (Holmes 1992). [Bar chart: proportion of deaths (0–100%) by weaning weight range (10–12 kg to >27 kg)]
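The practical meaning of hazard ratios such as those reported by Nguti et al. (2003) and Mandonnet et al. (2003) can be illustrated with a small calculation. The sketch below assumes a constant (exponential) hazard over the year and a hypothetical 10% baseline annual mortality; neither assumption comes from the cited studies.

```python
def annual_mortality(base_mortality, hazard_ratio):
    # Under a constant (exponential) hazard, S(t) = exp(-lambda * t),
    # so scaling the hazard by HR raises survival to the power HR
    return 1 - (1 - base_mortality) ** hazard_ratio

# Hypothetical baseline: 10% of average-weight weaners die in the
# year after weaning (illustrative figure, not from the studies)
base = 0.10

# Hazard ratio of ~1.6 per 1 kg decrease at low weights (Nguti et al. 2003)
one_kg_lighter = annual_mortality(base, 1.6)
three_kg_lighter = annual_mortality(base, 1.6 ** 3)
```

Under these assumptions, a weaner 1 kg lighter would suffer roughly 15% annual mortality and one 3 kg lighter roughly 35%, showing how modest weight differences compound into large differences in deaths.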
All the results discussed above show that similar associations between post-weaning mortality risk and bodyweight exist, regardless of whether the sheep are weighed at birth, weaning or a later time. Generally, a 1 kg weight decrease was associated with an odds or hazard ratio ranging from 1.3 to 2. The consistency of these observations is remarkable because they were made in widely ranging environments (Mediterranean-type, semi-arid and tropical), involved very different breeds and crosses of sheep and goats and were independent of the cause of death. Weaning weight has been shown to be well correlated with weight at subsequent times (Lloyd Davies et al. 1968, in Merinos), so it is likely that the association between mortality and weight at a particular time, such as weaning, actually reflects a more general association between bodyweight and survival.

Lamb bodyweight at premature weaning has also been shown to be associated with risk of death in several experiments. Within groups of Merino lambs weaned at different ages between 4 and 12 weeks, survivors weaned onto a cereal grain and chaff diet were 1–2 kg heavier than lambs that died (Franklin et al. 1964). The heavier weaning weight of survivors was associated with a heavier birthweight and higher pre- and post-weaning growth rates. All the lambs that died either lost weight or experienced negligible gain after weaning. Lambs weaned at a younger age also died more quickly than later-weaned lambs, despite being of similar bodyweight (7–9 kg) at weaning. For lambs weaned at the same age, there was no association between weaning weight and survival time, but the range in weaning weight was small. In contrast, when Merino-cross lambs were weaned at three weeks onto a pelleted ration, there was a strong correlation (r = 0.94) between weaning weight and duration of survival after weaning (Walker and Hunt 1980).
In this experiment, the range of weaning weights was wider and three quarters of the lambs died within four weeks as a result of poor consumption of the ration. Walker and Hunt concluded that bodyweight was associated with post-weaning survival in these young sheep because increasing weight represents the accumulation of the energy stores required to sustain the lamb during adaptation to a new nutritional situation. Thus, lightweight lambs were more likely to die because their lower body energy stores were exhausted before full adaptation to the new diet rendered the reserves unnecessary. This concept is discussed in more detail below.

2.3.4.2 Growth Rate

Allden’s (1968c) study of the effects of undernutrition of growing Merinos on lifetime production presented data that suggest associations between survival and post-weaning growth rate, as well as weaning weight. The sheep were born in May in South Australia and weaned at six months of age. Nutrition was restricted before or after weaning by varying the stocking rate of ewes and lambs grazing green pasture during winter and spring, or by withholding supplementary feeding when weaners grazed dry pasture throughout summer and autumn. Thus “low” and “high” pre-weaning nutrition groups were further divided by “low” and “high” post-weaning nutrition. This created four groups of 24 sheep that had experienced different nutritional regimens throughout their first year of life. The mortality proportions are summarised in Table 2.7. No vaccination against clostridial diseases was performed and 13% of lambs died of enterotoxaemia before weaning. These deaths, from a preventable disease, obscure any relationship between bodyweight, growth rate and survival in the 0–6 month and 0–12 month periods. However, from 6 to 12 months, more light weaning weight (average 22 kg) sheep died than heavy weaning weight (average 27 kg) sheep.
In addition, more weaners that were nutritionally restricted after weaning (average weight loss of 0.8 kg/month from 6–12 months of age) died than weaners fed ad libitum after weaning (average growth rate 0.8 kg/month). Allden commented that, “It is probable that under extensive grazing conditions and with no immediate access to water more of these [low pre- and post-weaning nutrition] sheep would have died”. The trial involved small numbers of animals, so no difference between proportions was statistically significant. Furthermore, because the survival of individuals was not analysed, it is not clear whether weaner survival was associated with absolute bodyweight or with rate of weight change. Nevertheless, the results point to relationships between mortality and both weaning weight and post-weaning growth rate that are consistent with those of the more specific mortality studies discussed previously.

There was no relationship between the level of supplementary feeding and mortality in Merino weaners kept in a drought feedlot in New South Wales (Lloyd Davies et al. 1988). However, weaner growth rate and final bodyweight in each treatment were similar, despite the diets differing in metabolisable energy by 2 MJ/kg. This lack of difference in growth rate might explain why similar numbers of sheep died in each group. Similarly, weaner bodyweight and rate of weight loss were related to survival of Merino-cross sheep being fed wheat-based rations in production feedlots (Davis et al. 1976). In that experiment, 43% of the sheep that died had lost more than 100 g/day prior to death, and the weaners that died were significantly lighter than those that survived. The deaths were independent of ration formulation and feedlot management.
Table 2.7: Post-weaning mortality of sheep in different nutritional groups reported by Allden (1968c)

Nutritional treatment | Average weight at end of period (kg) | Post-weaning mortality
pre-weaning: low | 22.0 | 9%
pre-weaning: high | 27.0 | 2%
post-weaning: low | n/a | 9%
post-weaning: high | n/a | 2%
pre- & post-weaning: low-low | 17.4 | 13%
pre- & post-weaning: low-high | 26.0 | 4%
pre- & post-weaning: high-low | 22.8 | 5%
pre- & post-weaning: high-high | 31.2 | 0%

Mulhearn’s (1958) report of Merino weaner management in South Australia contained information about bodyweight and mortality, although it was confounded with the effect of season and property-specific management practices, such as supplementary feeding. He observed that no winter-born weaners died during summer in a year when the average peak summer bodyweight was 37 kg. However, 15% died in the following year, when post-weaning growth rates were lower and average peak summer bodyweight was 27 kg. In the high-mortality year, groups of weaners grazing dry pasture were supplemented with oats or linseed meal ad libitum and compared to weaners receiving no supplement. Supplemented weaners gained an average of 2.5 kg/month (linseed) and 1.6 kg/month (oats), but unsupplemented weaners lost 1 kg/month between February and May. In the oats and linseed groups, 8% and 10% died, respectively, compared to 28% of weaners in the control group. Deaths also continued for longer in the control group and nearly all the weaners that died were of low weight or emaciated. In the supplemented groups, it was noted that the weaners that died were lighter prior to the start of supplementation. This pattern of mortality was similar to those described by Allden (1968c), Lloyd Davies (1983; 1988) and Hodge (1990). Mulhearn described a situation of fewer deaths and smaller differences between groups on another property where different supplements were used. However, on this farm the controls experienced only a slight weight loss whilst grazing dry pasture.
This suggests that it is the bodyweight or growth rate achieved by supplementary feeding, rather than the provision of feed itself, that is associated with mortality. In a field experiment in the same environment, 20% (3/15) of a monitor group of unsupplemented autumn-born Merino weaners died, compared to no deaths in groups receiving various supplementary feeds (Allden and Anderson 1957). During summer it became necessary to start feeding the unsupplemented group, as “many of the young sheep were…in critical condition” (p. 75). The unsupplemented sheep were at least 1 kg lighter than their counterparts at the start of feeding in January and lost 4 kg throughout February, when the deaths occurred, compared to gains of 0.2 and 2.2 kg/month by the weaners receiving supplementary feed. In another study, in Victoria, approximately four times as many deaths occurred in a group of Merino weaners grazing dry pasture over summer without supplementation as in supplemented groups (McLaughlin 1973). Although detailed data were not presented, unsupplemented and supplemented weaners were approximately the same weight at weaning, but the unsupplemented weaners always grew at a slower rate than the supplemented groups.

Mortality patterns also coincided with bodyweight changes, rather than feeding regimens, in a seven-year trial of supplementary feeding strategies for weaner wool sheep in a Mediterranean-type climate in inland California (Weir and Torell 1967). Significant mortalities occurred in only one year, and were associated with light weaning weight and weight loss over summer, when the nutritional quality of pastures was particularly poor. In this year, 17% of unsupplemented (control) weaner sheep died between the start of summer and the following spring, whereas there were only “minor…losses” in the groups receiving supplementary feed.
These results were in contrast to those from another year when, although similar weight loss occurred during summer, sheep had been considerably heavier at weaning and the death rate was much lower. These results are similar to those of Allden (1968c), discussed above.

In the studies described above, the associations between bodyweight, bodyweight change and mortality rate were confounded with the provision of supplementary feed. However, the patterns of mortality across the different bodyweight or growth rate groups were similar, regardless of the type of supplementary feed provided. Furthermore, supplementary feeding was only associated with differences in survival if it was accompanied by a growth rate response. This suggests that bodyweight or bodyweight change, rather than supplementary feeding per se, is associated with survival of weaner sheep. However, the notion that simply providing supplementary feed to Merino weaners effectively prevents excessive mortalities was challenged by Norris (1986), who found no correlation between the cost of supplementary feed and Merino weaner survival on 10 different farms in south-west Western Australia. He concluded that farmers did not understand the specific protein and energy requirements of Merino weaners and therefore failed to prevent weaner mortalities because they provided inappropriate supplementary feed, especially hay.

It is difficult to make general statements about the extent to which dry pastures match the nutritional requirements of weaner sheep because the nutritional characteristics of summer pastures vary tremendously between locations and seasons. Furthermore, the nutritional requirements of weaners also change as they grow older.
One study of supplementary nutrition for Merino weaners grazing typical dry, summer pastures in southern Australia showed a linear response of liveweight gain to increasing energy supplementation but a declining response to increasing levels of supplementary protein (Allden 1959). From these results Allden concluded that such pastures primarily fail to meet weaners’ energy requirements but are only nominally deficient in protein. Similarly, Foot et al. (1983) commented that weaners do not show a response to supplementary protein if even small amounts of green feed, which is high in protein, are present in the pasture being grazed. In contrast, another experiment achieved higher growth rates in Merino weaners by increasing the protein content of isocaloric supplementary rations offered from December to May (9–15 months of age) (Lloyd Davies et al. 1968). Weaners offered a supplement containing 22% crude protein (CP) grew at 1 kg/month from December to March, whereas weaners offered a supplement containing 8% CP gained 0.6 kg/month, whilst a control group offered no supplement lost 0.8 kg/month throughout this time. However, no comparison was made between these diets and the expected protein intake from grazing pastures containing different quantities of green feed.

2.3.4.3 Bodyweight & Death from Causes Other Than Malnutrition

Many of the studies reporting an association between bodyweight and survival have focussed on deaths of weaners during summer, when growing sheep frequently suffer malnutrition while grazing the dry pastures typical of the Mediterranean-type environment of southern Australia (Bellotti et al. 1993). An association between bodyweight and survival of sheep grazing dry pastures and dying principally from malnutrition is unsurprising, as weight loss inevitably precedes death. However, several studies have reported a similar association between bodyweight and death at other times of the year and when weaners die from other causes.
For example, a study of three strains (Merryville, Peppin and Collinsville) of Merino weaners showed that bodyweight was related to risk of death due to gastrointestinal parasitism during winter in southern Victoria (Hodge 1990, summarised in Table 2.8). In that study, lambs born in October were weaned at between nine and 20 weeks of age, in order to create different weaning weight groups. They were then fed to maintain weight until the following April. At this time they were grazed on either abundant or restricted pasture (>1500 kg DM/ha or <700 kg green DM/ha), resulting in growth of 3.3 kg/month or approximate maintenance of weight (less than 0.3 kg/month lost or gained) respectively. This resulted in bodyweights at the end of the autumn break that varied approximately twofold: from 16 kg to 30 kg (for the Merryville and Peppin strains), and 21 kg to 37 kg (Collinsville). Most deaths occurred in the groups that were light at weaning and then only maintained weight during autumn. Significantly fewer weaners died in the two groups that achieved an intermediate weight by the end of autumn. No deaths occurred in the group that was heaviest at weaning and then gained weight throughout autumn. It is significant that similar numbers of deaths occurred in the two groups of equal weight at the end of autumn, regardless of how that weight was achieved. This suggests that it is absolute bodyweight, rather than the timing of its acquisition, that is associated with survival. No association was observed between the proportion of weaners dying and their rate of growth during winter; however, results for individual animals were not reported, so more detailed relationships between growth rates and mortality may have been overlooked. An association between bodyweight and mortality that extended beyond times of malnutrition was also observed in overseas survival analyses of sheep (Nguti et al. 2003) and goats (Mandonnet et al. 2003).
In these analyses, the ratio between the death rates of animals in different weight classes remained constant over time. This indicated that bodyweight differences accounted for the same difference in risk of death, regardless of the time of year. Additionally, no cause of death was specified in Nguti et al.’s (2003) study, yet bodyweight was significantly associated with survival. In Hary’s (2002) study, gastrointestinal parasites, and not malnutrition, were a leading cause of death. These results are particularly significant, since bodyweight was associated with survival even though the cause of death was not directly related to nutrition.

Table 2.8: Mortality of weaners with different weaning weights and autumn growth rates reported by Hodge (1990)

Weaning weight | Autumn growth rate: 0 kg/month | 3.3 kg/month
low | 11–22%* | 4–5%
intermediate | 0–5% | 0%
high | 0–4% | 0%
* range is for different strains

2.3.4.4 Physiology Underlying the Association between Mortality & Bodyweight

It is appropriate to consider briefly how bodyweight might confer a survival advantage on weaner sheep and whether such mechanisms are consistent with the results described above. Following a period of weight loss, death occurs when body energy reserves have been fully consumed and the function of the vital organs is compromised as they are catabolised in an attempt to meet the animal’s daily metabolic requirements (Thornton et al. 1979). Lightweight sheep are at greater risk of death from this process than their heavier counterparts for several different reasons, which are discussed in more detail below. An important aspect of the association between bodyweight and survival is that the total body energy stores of weaner sheep increase with liveweight.
A linear relationship, independent of breed, between total body energy and liveweight was demonstrated in an experiment that measured body composition and total energy of Merino and Merino × Dorset Horn sheep between weaning at 7 weeks (average liveweight 15 kg) and 10 months of age (average liveweight 45 kg; Allden 1970). Heavy sheep had a higher energy density than lightweight ones and the proportion of energy stored as fat also increased as the animals grew. For example, at 15 kg, energy density was approximately 1 MJ/kg and 13% of total body energy was stored as fat, whereas the energy density of weaners weighing 30 kg was 2.2 MJ/kg and 70% was stored as fat. The variation of fat content with bodyweight has also been shown to be independent of the growth rate of sheep of different breeds (Kellaway 1973). These results indicate three mechanisms by which heavy weaners can mobilise body energy with less risk of dying than light weaners. Firstly, heavy weaners have a greater total body energy reserve upon which they can draw. Secondly, heavy weaners will lose less weight in mobilising a given quantity of body energy than light weaners because of their higher energy density. Finally, more of this reserve is fat, the catabolism of which spares protein in the vital organs (Thornton et al. 1979). Therefore, light weaners forced to mobilise body energy, for any reason, will lose more weight than heavy weaners, and a greater proportion of this weight loss will be protein contained in vital tissues, to the detriment of organ function and ultimately survival (Allden 1970; Doyle and Egan 1983). Once weaners are lightweight, their mortality risk is increased further because they replace lost body protein more slowly than heavy sheep under conditions of limiting protein and energy intake (Allden 1970). Such nutritional conditions are typically experienced by sheep grazing dry summer pastures in southern Australia. 
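The second of these mechanisms can be made concrete with a small calculation using the energy densities quoted above from Allden (1970). The sketch below is illustrative only: it treats energy density as fixed during weight loss (in reality density falls as fat is depleted) and uses an arbitrary 10 MJ energy deficit.

```python
def weight_loss_for_deficit(deficit_mj, energy_density_mj_per_kg):
    # Liveweight that must be catabolised to cover an energy deficit,
    # treating energy density as constant during the loss (a simplification)
    return deficit_mj / energy_density_mj_per_kg

deficit = 10  # MJ shortfall; an arbitrary illustrative figure

# Allden (1970): ~1 MJ/kg at 15 kg liveweight, ~2.2 MJ/kg at 30 kg
light_loss = weight_loss_for_deficit(deficit, 1.0)  # 15 kg weaner
heavy_loss = weight_loss_for_deficit(deficit, 2.2)  # 30 kg weaner
```

At these densities, a 15 kg weaner must catabolise about 10 kg, two-thirds of its liveweight and clearly lethal, to cover the same deficit that costs a 30 kg weaner only about 4.5 kg, and a greater proportion of the heavy weaner's loss is fat rather than protein from vital organs.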
For example, in a field study of Merino weaners grazing poor quality, dry phalaris/subterranean clover pastures in South Australia, sheep that were heavy (average 36 kg) prior to the period of summer weight loss regained bodyweight more quickly during the following winter than light (28 kg) ones (Donald and Allden 1959). This was despite the more severe nutritional circumstances experienced by the heavier group during summer. The age at which growth is restricted also affects how quickly young sheep regain weight, which in turn influences the length of time they remain at risk of death from low bodyweight. Merinos whose growth was restricted before 6 months of age recovered bodyweight more slowly than weaners restricted between 6 and 12 months of age (Allden 1968b). When Corriedale ewes were nutritionally restricted (by weaning onto dry pasture) at 8 weeks of age, they were 10% lighter at their first mating (38 vs. 42 kg) than if they were weaned at 12 or 16 weeks (McLaughlin 1966). Finally, weaner sheep can consume nearly 20% more highly digestible feed than adults, relative to metabolic liveweight, but their relative feed intake decreases at lower digestibilities and is similar to that of adults at feed digestibilities of 46% and 56% (Egan and Doyle 1982). Thus, the nutritional deficiencies of dry summer pastures are exacerbated for weaner sheep because their intake is relatively lower when grazing feeds of lower digestibility. The nutritional deficiencies that occur when grazing dry pasture explain the improvement in survival achieved by providing weaners with appropriate supplementary feed. Providing high-digestibility supplementary feed during times of nutritional deprivation increases, or at least reduces the decline in, the fat and total body energy content of the weaner sheep’s carcase (Weir and Torell 1967).
Furthermore, the growth response to supplementary feeding over a weaner's first summer of life has been shown to be independent of the weaner's age (Marshall 1985). This shows that weight loss, and its associated survival outcomes, is not obligatory even for young sheep grazing dry summer pastures, provided they receive appropriate supplementary nutrition.

2.3.4.5 Season & Year of Birth

Studies that have spanned more than one year show that weaner mortality can vary substantially from season to season and/or year to year, although the effects have not always been reported directly or quantified. In southern Australia, most weaner deaths occur during summer. For example, in field experiments in South Australia, 82% and 79% of all weaner deaths occurred between January and March, regardless of the age of the sheep (Lloyd Davies et al. 1968; Lloyd Davies 1983). Similarly, in western Victoria, half of all the occurrences of excessive mortality in weaner flocks occurred between February and April (Engel 1958). Such a seasonal peak in mortality is consistent with the decline in bodyweight, and increase in risk of death, experienced by weaners grazing the dry summer pastures of southern Australia, as discussed in the previous section. In contrast to these findings, Harris and Nowara (1995) reported that the deaths in nearly all weaner flocks with greater than 5% mortality occurred between May and August, or during December. Unfortunately, no breakdown between Merino and non-Merino flocks was provided. This survey was primarily of cereal grain cropping enterprises, where weaners often graze crop stubbles during summer and autumn. Summer weaner mortality may therefore have been lower than in solely pasture-grazing enterprises because of the improved nutrition afforded by stubbles and residual grain soon after harvest.
In contrast to the association between dry conditions and increased weaner mortality, some studies have reported that weaner mortality is greater during seasons of higher rainfall, when it would be expected that feed quantity and quality would satisfy the nutritional requirements of weaner sheep (for example, Beveridge et al. 1985; Mandonnet et al. 2003; Nguti et al. 2003). In these situations, gastrointestinal parasites are invariably implicated in the increased mortality risk. For example, an Australian field study that reported the seasonal distribution of deaths of Merino weaner sheep compared different parasite control strategies at Kybybolite in the high-rainfall zone of South Australia (Beveridge et al. 1985). Significant deaths occurred only in the groups that were not treated for gastrointestinal parasites, and most occurred during the winter months (up to 15%/month). However, significant numbers of untreated weaners did die between December and April in two of five years. In these instances, the poor nutrition of weaners during summer was considered to have predisposed them to death from gastrointestinal parasitism, despite the relatively low parasite burdens at this time. A multi-year study of weaner sheep in the sub-humid tropics of Kenya quantified the seasonal differences in mortality risk (Nguti et al. 2003). During high-rainfall years, weaners died at 1.5–5.0 times the rate of weaners in low-rainfall years. The authors considered gastrointestinal parasitism to underlie this year-to-year variation in mortality, with the higher worm burdens acquired in wetter years outweighing any benefit to survival from improved feed quantity or quality at these times.

2.3.4.6 Disease

Many studies have reported associations between weaner mortality and specific diseases, although there are few surveys of causes of weaner mortality in Australia.
Amongst Merino weaner flocks in the Victorian Mallee, loss of condition, blowfly strike and diarrhoea were the signs most frequently reported by farmers in flocks experiencing more than 5% mortality (40%, 30% and 20% of flocks, respectively; Harris and Nowara 1995). The authors more generally considered that pyrrolizidine alkaloid poisoning from grazing heliotrope (Heliotropium europaeum) was a leading cause of mortality in the Mallee and that gastrointestinal parasitism was less common because of the low rainfall environment and the fact that many sheep grazed crop stubbles for a part of each year. However, they did not discuss weaner mortality in more detail. Oral selenium supplementation reduced mortalities (6% vs. 16%) in the first 52 days after weaning amongst Merino weaners in a field trial in the East Gippsland region of Victoria, where soil deficiencies of selenium, cobalt and copper were suspected (Holmes 1992). Selenium administration appeared to have a direct effect on mortality because it was not associated with increased growth rate or bodyweight. Similar reductions, independent of bodyweight change, were observed on some farms in south-western Western Australia during a selenium supplementation trial (Gabbedy 1971). However, in another study, repeated oral selenium supplementation increased weaner bodyweight by 2 kg between 3 and 12 months of age, as well as decreasing mortalities (0% vs. 18% in the untreated groups), despite untreated lambs showing no post-mortem signs of white muscle disease (McDonald 1975). Two overseas studies have quantified the risk of death associated with weaner faecal egg count (FEC), packed cell volume (PCV) and anthelmintic treatment (Mandonnet et al. 2003; Nguti et al. 2003). These studies were conducted in the tropics or sub-tropics and, where specific parasite species were identified, Haemonchus contortus was the most common.
An increase of 1000 eggs per gram was associated with a hazard ratio of 1.0–2.1 (weaner sheep) and 4.1 (weaner goats), similar to the hazard ratio associated with a 1 kg decrease in bodyweight. In the study by Nguti et al., a 1% decrease in PCV was associated with a 16% increase in mortality rate. Changes in PCV in this study may have been associated with disorders other than haemonchosis because all causes of death were examined. In the weaner goat study, a one standard-deviation decrease in PCV increased mortality rate by up to 3 times, although the absolute PCV values were not reported. Haemonchosis was the primary cause of death in this study, which may explain the larger hazard ratio associated with PCV. Anthelmintic treatment also reduced the risk of death by 77% in the 3 weeks following treatment. As has been discussed previously, increased bodyweight prior to acquiring nematode infection has also been shown to reduce the risk of death from parasitism (Hodge 1990).

2.3.4.7 Maternal Factors

Maternal bodyweight, body condition, parity, stocking rate and milk yield, as well as size of the litter produced, have been shown to be associated with the survival of offspring following weaning (Table 2.9). However, no studies have actually investigated the mechanism of action of these associations.

Table 2.9: Maternal factors associated with weaner mortality

Maternal factor          | Species/breed           | Measurement of mortality | Magnitude of effect                                    | Reference
Stocking rate/liveweight | Merino                  | % mortality              | 20% at 12 ewes/ha & 48 kg vs. 9% at 4 ewes/ha & 52 kg  | (Lloyd Davies 1983)
Parity                   | Rambouillet             | % mortality              | lower mortality from lower parity ewes                 | (Ganai and Pandey 1996)
Litter size              | Rambouillet             | % mortality              | 77% twins vs. 79% singletons (P < 0.05)                | (Ganai and Pandey 1996)
Age                      | Dorper & Maasai sheep   | hazard ratio             | 0.7–0.4 for >2 y.o. vs. 2 y.o. (P < 0.001)             | (Nguti et al. 2003)
Age                      | meat breed sheep        | hazard ratio             | 0.63: 4 y.o. vs. younger                               | (Southey et al. 2001)
Weight at lambing        | Indian sheep breeds     | % mortality              | decreased as maternal weight at lambing increased      | (Nivsarkar et al. 1982)
Parity                   | West African Dwarf goat | cumulative mortality     | lowest for 2nd & 3rd parity does (P > 0.05)            | (Ikwuegbu et al. 1995)
Rearing method           | Creole goats            | hazard ratio             | 2.5 ± 1.2 artificial vs. maternal                      | (Mandonnet et al. 2003)
Rearing method           | meat breed sheep        | hazard ratio             | 5.9 ± 1.2 artificial vs. maternal                      | (Southey et al. 2001)
Milk yield               | East African goat       | hazard ratio             | ≈ 2 for total dam milk yield < 22.5 kg vs. ≥ 22.5 kg   | (Hary 2002)

The results of one study suggested that some of these factors may indirectly influence survival through their effect on weaner bodyweight (Hary 2002). Although maternal milk yield was significantly associated with mortality rate, the relationship between litter size and post-weaning survival became non-significant once the statistical model was adjusted for birthweight and total dam milk production. This suggested that litter size was not a “primary risk factor” for weaner mortality but acted via direct factors such as neonatal bodyweight and growth. An indirect relationship between litter size—or other similar factors—and post-weaning survival, mediated via weaning weight, is appealing. However, other mechanisms of action—increased colostrum intake and improved offspring immune function, for example—cannot be excluded. Indeed, total milk yield continued to account for variation in survival along with bodyweight, suggesting that the effect of the former was, at least in part, independent of the latter. Other studies have also suggested associations between maternal factors and weaner survival but do not propose or elucidate mechanisms of action for them. In a study of the effect of stocking rate and time of lambing on Merino ewe and weaner production in Western Australia, post-weaning mortality was two times greater (20% vs.
9%) amongst ewes and lambs stocked at 12 ewes/ha than those stocked at 4 ewes/ha (Lloyd Davies 1983). Lloyd Davies did not comment on the cause for this difference but, in a subset of this data, average ewe liveweight declined from 52 kg to 48 kg as stocking rate was increased over this range (Lloyd Davies 1962). This, in turn, may have been associated with decreased maternal milk production, to the detriment of offspring weaning weight, since heavier dams in better condition usually produce more milk (Foot et al. 1987). In a less sophisticated analysis than Hary’s or Lloyd Davies’, a lower proportion of weaner deaths was observed amongst the offspring of ewes that were heavier at lambing (Nivsarkar et al. 1982), although no summary data were presented. An explanation of this relationship was not proposed, although it is reasonable to assume that ewe bodyweight was related to milk production, as in the other studies. Similarly, the slightly lower survival noted amongst Rambouillet weaner sheep in India born as twins, compared to singletons, was presumed to be mediated via birthweight, although no further analysis was performed (Ganai and Pandey 1996). In Ganai and Pandey’s study, associations between weaner survival, litter size and ewe parity were also “largely explained” by the effect of the latter two factors on birthweight, although these relationships were not explored in detail. Over the three months following weaning, more offspring of seventh parity ewes died than offspring of younger ewes (10% vs. 6%). Conversely, in two other studies, weaners born to primiparous ewes had a greater risk of death than the offspring of multiparous ewes (Southey et al. 2001; Nguti et al. 2003). A similar mortality pattern was present amongst West African Dwarf goats, where there was a tendency for lower mortalities amongst goat kids born to second and third parity does than those born to first, fourth or higher parity (Ikwuegbu et al. 1995). 
All of these results might be explained by younger, multiparous ewes having higher milk yields and thus weaning heavier offspring, to the benefit of weaner survival. This was demonstrated in Ikwuegbu et al.’s study, in which kids born to first parity does had lower weaning weights and post-weaning growth rates than kids born to second- and third-parity does. Southey et al. (2001) reported that artificially reared weaner sheep in feedlots died at 5.9 times the rate of maternally reared weaners. The death rate amongst artificially reared Creole weaner goats was similarly greater than that of maternally reared kids (Mandonnet et al. 2003). In Mandonnet et al.’s study, the association between rearing method and mortality, along with other factors, became non-significant when time-varying covariates representing FEC, PCV and bodyweight were added to the model. As with the other maternal factors discussed above, such a change in the statistical model suggests that rearing method is not a primary risk factor. Instead, it is likely that its association with post-weaning survival is mediated via another factor or factors. Although likely mechanisms were not proposed by Southey et al. or Mandonnet et al., possibilities could include an effect of rearing method on lamb bodyweight or immunocompetence.

2.3.4.8 Sex

Several overseas studies have consistently demonstrated that male weaners face a greater risk of death than females, although no comparable results have been reported in Australia. Southey et al. (2001) reported the odds of male weaner meat sheep dying in feedlots to be 40% higher than the odds of females dying. This is consistent with the odds ratios (OR) and hazard ratios (HR) reported in Sahelian (Turkson and Sualisu 2005, OR 1.7), and Maasai, Dorper and crossbred (Nguti et al. 2003, HR 1.4) sheep in Africa, as well as West African Dwarf goats (Turkson et al. 2004, OR 1.8).
Similarly, in a study of survival of Indian sheep breeds, 10% of male weaners died between 3 and 6 months of age, compared to 5% of females. In a study of Creole goat kid deaths due to gastrointestinal parasitism, males died at 3 times the rate of females. No explanation of this widely observed phenomenon has been offered. It has previously been recognised that males are more susceptible to strongyle infection than females, although the reason remains unclear (Barger 1993). However, this does not explain the higher mortality rate reported in the other studies, where deaths were not caused only by parasitism. In fact, in the analysis of the subset of Maasai and Dorper weaners that died of gastrointestinal parasitism, there was no difference in mortality rate between the sexes (Nguti et al. 2003). Since male lambs and kids, but not females, are often castrated, increased deaths amongst males may be associated with post-operative infection or other complications following this procedure. However, castration was not routinely performed in all studies discussed here. When it was performed, it was often done at birth, yet sex differences in mortality sometimes persisted to 12 months of age. This is an unlikely duration of effect for a surgical procedure that heals in several weeks. Another potential reason for the difference in mortality risk could be differences in body composition between male and female weaners. Sex hormones are known to affect the deposition of muscle and fat (Mersmann 1987) and the consequent difference may influence risk of death.

2.3.5 Weaner Mortality – Summary

Few surveys, and no specific mortality studies, of Merino weaners have been conducted in southern Australia, but the available evidence suggests that post-weaning mortality commonly exceeds 15%, which is no better than that achieved in goat and sheep flocks run by subsistence farmers in Africa.
In overseas studies, survival analyses have demonstrated associations between mortality and a variety of factors that have been incidentally suggested or remarked upon in Australian experiments involving Merinos. The outstanding feature of this review is that remarkably similar results have been obtained over a tremendous variation in countries, climatic conditions, ruminant species and production systems. Bodyweight is the factor most frequently reported to be associated with post-weaning mortality and this association has also been the most widely investigated. Weaners of lower bodyweight have invariably been shown to be at higher risk of death than heavy weaners, regardless of when weight is measured or the actual rate of weight gain, with a 1 kg bodyweight decrease usually associated with a 30–70% increase in risk. In several studies, specifically increasing the bodyweight of lightweight weaners has reduced mortality more than weight increases amongst heavy weaners. Providing supplementary feed to weaners appears to reduce mortality by increasing bodyweight, rather than via some direct effect of the feed itself. Similarly, weaner bodyweight has appeared, at least partly, to mediate the association between mortality and maternal factors such as parity, litter size and condition score. Seasonal and annual variation in weaner mortality is common, with more favourable survival associated with increased availability of high quality pasture to meet the critical nutritional requirements of weaner sheep. Perversely, such environmental conditions may also favour the development of increased burdens of gastrointestinal nematodes, which can lead to the death of large numbers of weaner sheep. The association between mortality and specific measures of the severity of some parasite infections, such as faecal egg count and packed cell volume, has also been quantified.
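Under a proportional hazards model, per-kilogram hazard ratios of this size compound multiplicatively, which is why modest bodyweight differences imply large differences in risk. A minimal sketch: the 1.3 and 1.7 per-kg ratios correspond to the 30–70% bounds quoted above, while the 5 kg weight difference is an arbitrary illustrative value.

```python
# Compounding of per-unit hazard ratios under a proportional hazards model.
# Per-kg ratios of 1.3 and 1.7 correspond to the 30-70% per-kg increase
# in risk quoted in the text; the 5 kg gap is an illustrative assumption.

def compound_hazard_ratio(hr_per_unit, units):
    """Overall hazard ratio for a covariate difference of `units`."""
    return hr_per_unit ** units

low = compound_hazard_ratio(1.3, 5)   # ~3.7-fold risk of death
high = compound_hazard_ratio(1.7, 5)  # ~14-fold risk of death
print(f"a weaner 5 kg lighter: {low:.1f}- to {high:.1f}-fold hazard")
```

This compounding assumes the per-kg effect is constant across the weight range, which the studies above do not all test; it is an illustration of the model's arithmetic, not a result from any one study.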
The modest body of work concerning post-weaning survival has translated into few quantitative recommendations for the management of Merino weaners in Australia. Allden’s findings that Merino weaners can resume normal growth if they reach 40% of adult weight before experiencing summer weight loss (1968a) and that weaners can substitute a substantial amount of summer pasture for supplementary feed (1969) led him to conclude that “supplementation [is un]attractive except when feeding for survival”. The principal advice given for minimising post-weaning mortality is that lambs should reach at least 45% of adult bodyweight by weaning or drying-off of pastures, whichever is the later, and then grow at 1 kg/month until the onset of autumn rains (Foot et al. 1987). Furthermore, as has been discussed here, for the Merino weaner in southern Australia, the relationships between mortality and bodyweight or growth, let alone other potential risk factors, remain unquantified.

2.3.6 Methodologies for Analysing Survival & Mortality

Survival analysis is the term commonly used to describe the statistical procedures that uniquely analyse time-to-event (also known as ‘survival time’ or ‘failure time’) data. It can be applied to the analysis of the occurrence of any well-defined event, not just death. However, the event of interest is usually non-recurring. Data describing survival or other ‘time-to-event’ outcomes typically consist of two variables for each individual: (1) a continuous variable describing how long the individual was under observation, and (2) a binary variable indicating whether or not the individual experienced the event of interest, such as death (Hosmer and Lemeshow 1999). Analysis of such data must take into account several unique characteristics. The ‘classical’ approach to analysing survival data involves analysis of variance or linear regression of survival proportions from specific points in the time under study.
These methods, however, assume that all error terms are normally distributed and have equal variance. These assumptions are infrequently met by survival time data (Hosmer and Lemeshow 1999; Hary 2002). On the other hand, logistic regression is suitable for analysing a binomially distributed outcome such as death. It provides intuitive information on the probability of an event’s occurrence but ignores the extra information offered by survival data on the timing of events (Allison 1997). No information can be gleaned from logistic regression regarding differences between an individual that dies early in a study’s course and one that dies at the end (Southey et al. 2001). Survival analysis methodology addresses these key characteristics of time-to-event data that limit the use of other statistical techniques. Survival analysis uses a variety of techniques to examine time-to-event data (Dohoo et al. 2003). They are used to estimate the hazard and survivorship functions, make comparisons between them or estimate the effect of particular factors (or ‘covariates’ or ‘predictors’) on survivorship outcomes. Parametric models assume that the hazard is distributed as a defined function, whereas semi-parametric models make no assumption about the underlying hazard function; both techniques model multivariate effects of categorical and continuous predictors. In contrast, non-parametric techniques graphically describe survival data or make univariate comparisons between levels of a categorical covariate. The most commonly used technique is the semi-parametric Cox proportional hazards model (Cox 1972). A more recent parametric approach to survival analysis starts with a non-parametric graph of the hazard, which is then described mathematically using functions known as cubic splines (Royston and Parmar 2002). Covariates are then modelled in terms of their effects on this cubic spline function.
This approach has the advantage of generating a more accurate mathematical representation of the observed hazard function, rather than approximating it by a simpler curve, such as the commonly used Weibull hazard (Royston and Parmar 2002). In practical terms, survival analysis must accommodate censored data (Hosmer and Lemeshow 1999). Right-censoring occurs when an individual’s observation ends before it experiences the event of interest, either because the observation period finished first or because the individual was lost to follow-up. Left-censored observations are ones where the event has already occurred before a subject comes under study. Finally, interval censoring occurs when an event has taken place within a certain time interval but the precise time of the event is not known (Dohoo et al. 2003). Interval-censored data can take two different forms. When all individuals are observed at common time points, the data are termed synchronous interval-censored data. However, if the censoring intervals are not the same for all individuals—if there is overlap, for example—then the data are termed asynchronous (Radke 2003). Interval censoring occurs almost universally in data from veterinary and agricultural experiments but is seldom taken into account during analysis (Radke 2003). Several commonly used statistical software packages are capable of analysing interval-censored data, including SAS/STAT (SAS Institute Inc. 2004), S-plus (Insightful Corp. 2004), Minitab (Minitab Inc. 2005) and Stata (StataCorp 2005). All of these programs are restricted to parametric or non-parametric analytical techniques and cannot perform semi-parametric interval-censored data analysis (Lindsey and Ryan 1998).
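The handling of right-censored records in the non-parametric techniques mentioned above can be made concrete with a minimal Kaplan–Meier product-limit estimate. The data are invented for illustration; they are not from any study cited here.

```python
# Minimal Kaplan-Meier product-limit estimator for right-censored data.
# Records are (time, event) pairs; event=0 marks a right-censored record
# (e.g. a weaner still alive when observation ended). Data are invented.

def kaplan_meier(records):
    """Return {event time: estimated survival probability}."""
    records = sorted(records)
    n_at_risk = len(records)
    surv, out = 1.0, {}
    i = 0
    while i < len(records):
        t = records[i][0]
        deaths = seen_at_t = 0
        while i < len(records) and records[i][0] == t:
            seen_at_t += 1
            deaths += records[i][1]
            i += 1
        if deaths:  # censored-only times do not step the curve
            surv *= 1 - deaths / n_at_risk
            out[t] = surv
        n_at_risk -= seen_at_t
    return out

# Deaths at days 30 and 90; records censored at days 60, 120 and 120.
data = [(30, 1), (60, 0), (90, 1), (120, 0), (120, 0)]
print(kaplan_meier(data))  # survival steps to 0.8 at day 30, ~0.53 at day 90
```

Note how the record censored at day 60 contributes to the number at risk up to day 60 but never steps the survival curve itself; discarding such records, as logistic regression effectively does, would waste that information.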
Interval-censored data can be approximated by using the beginning, midpoint or end of the censoring interval as an exact failure time and then performing the analysis with standard techniques. Approximating data using any of these approaches can produce variable results (Leung et al. 1997), depending on the extent of interval censoring in a survival dataset, as well as the methodology used. For example, Radke (2003) compared parametric analyses of interval-censored data and approximated data which used the beginning, midpoint or end of the intervals as exact failure times. In these data, many observations shared common censoring intervals and 61% of cases were left-censored. He used the same parametric hazard function a priori in all four analyses but its fit to the data was not tested. The models that ignored interval censoring substantially underestimated the categorical covariate effects. For example, several of the effects estimated from the approximated data were non-significant. Even when they were significant, they were not large enough to be considered important in a practical sense. Alternatively, it has been suggested that interval censoring in parametric analysis can often be ignored and the midpoint of each failure interval used to approximate exact failure times instead (Lindsey 1998). This author examined “difficult” datasets, although all had relatively few left-censored observations. Although indicating that the results may not always be reliable, he nevertheless stated that conclusions drawn from such analyses are remarkably robust, irrespective of the hazard function chosen for the parametric analysis. J. C. Lindsey and L. M. Ryan (1998) compared both commercially available and novel interval-censored data analysis methods to Cox proportional hazards analysis of approximated data. All methods estimated similar covariate effects in one reasonably large data set (n=94; 54 interval-censored observations).
However, in smaller (n=31) and more heavily censored data with wide intervals and many right-censored observations, the Cox analysis of the approximated failure times underestimated the magnitude of the covariate effects by 25–50%. In conclusion, it appears that approximating interval-censored data using the midpoint of the censoring intervals is generally considered to produce satisfactory results, although the magnitude of the covariate effects may be underestimated. Other techniques do exist for the analysis of interval-censored survival data. These include the piecewise constant (or piecewise exponential) hazard model, as well as non-parametric estimators and statistical tests (Lindsey and Ryan 1998). The piecewise constant model provides a result similar to the more common techniques. It analyses the survival outcomes from each censoring interval with logistic regression, which is applied to a dataset that has been transformed by what is known as a ‘link’ function (Hosmer and Lemeshow 1999). This method was used by Hary (2002) and Southey et al. (2003) in analyses of weaner survival. In both these examples, all animals were observed at common time points. Therefore the data were synchronously interval-censored. Although a technique for applying this methodology to asynchronous interval-censored data has been suggested (Carstensen 1996; Hosmer and Lemeshow 1999), it is not implemented in any commercially available statistical software packages. It is also difficult to obtain correct standard errors from this technique when it is applied to a proportional hazards model containing time-varying covariates (Farrington 1996). Survival analysis is also suited to the analysis of time-to-event data where the covariates to be analysed may take on different values over time, such as whether an animal has received an anthelmintic drench or not in the last month (Allison 1997). Such factors are termed time-varying covariates.
However, no commercial software exists that can analyse time-varying covariate effects in interval-censored data (Radke 2003). Survival analysis of interval-censored data containing time-varying covariates therefore requires a compromise. If commercial software is to be used, then one of these aspects of the data must be approximated. Whether this occurs with the time-varying effect or the interval censoring will depend on which is considered the least important to the analysis. Several studies illustrate the advantages of survival analysis and the differences between the results produced by different methodologies when analysing weaner mortality. For example, Mandonnet et al. (2003) analysed survival of goat kids with the Cox proportional hazards and Weibull parametric models, although only the Cox model results were reported. They also used time-varying covariates to identify several factors that had important associations with survival throughout the entire post-weaning period. They did not discuss the manner in which time-to-death data were collected or interval censoring issues. Southey et al. (2001) also used Weibull and Cox models to analyse survival of meat lambs raised in a feedlot, and compared these results to those from logistic regression. The hazard ratios from the two survival analysis models and odds ratios from the logistic regression were all very similar, although the standard errors of the odds ratios were larger and the authors noted that a methodology that properly accounts for censoring, such as survival analysis, is preferable. Nguti et al. (2003) carried out a Cox survival analysis of post-weaning mortality of lambs in Kenya. Again, the method for recording deaths was not stated. It appears that weaners were yarded approximately every four to six weeks, which would have created interval-censored data if this were the time when deaths were noted.
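The midpoint approximation discussed above amounts to a simple pre-processing step: each record's censoring interval is collapsed to a single 'exact' failure time before a standard analysis is run. A minimal sketch, with invented observation days rather than data from any study cited here:

```python
# Collapse interval-censored records to approximate exact failure times.
# Each record is (last_seen_alive, first_seen_dead); first_seen_dead of
# None marks a right-censored record. Times (in days) are invented.

def approximate_failure_times(records, rule="midpoint"):
    """Return (time, event) pairs using the start, midpoint or end of
    each censoring interval as the 'exact' failure time."""
    out = []
    for last_alive, first_dead in records:
        if first_dead is None:            # right-censored: no event
            out.append((last_alive, 0))
        elif rule == "start":
            out.append((last_alive, 1))
        elif rule == "end":
            out.append((first_dead, 1))
        else:                             # midpoint (default)
            out.append(((last_alive + first_dead) / 2, 1))
    return out

records = [(30, 60), (60, 90), (100, None)]
print(approximate_failure_times(records))
# → [(45.0, 1), (75.0, 1), (100, 0)]
```

The studies above suggest this is usually adequate, but with wide intervals the choice of `rule` shifts every failure time by up to half an interval, which is one route by which covariate effects become attenuated.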
Hary (2002) estimated odds ratios for covariate effects by performing logistic regression on biweekly survival records for goat kids. Hazard curves for different covariate levels were estimated using cubic spline functions and hazard ratios were calculated from these curves. This approach appears to favour comparisons between a small number of categories only; it would not be practical for analysing the effect of a continuous covariate unless it could be subdivided in this way. Whilst detailed comparisons were made between different time points on various hazard rate curves, the methodology did not generate hazard or odds ratios that summarised covariate effects well across larger time periods, or indeed the entire study.

2.4 Conclusion

Previous research has not clearly identified the most appropriate time for shearing a self-replacing Merino flock in southern Australia. Even when wool quality or animal health is considered alone, no single annual shearing time is clearly preferable because, for example, the effect of autumn shearing on fibre diameter and staple strength has opposite consequences for wool price. Even if the net effect on wool quality favoured autumn shearing, for instance, such an advantage would have to be weighed against the increased susceptibility of autumn-shorn sheep to flystrike in spring and post-shearing mortality, and the costs of managing such conditions might negate any increase in wool income. Similarly, a single shearing time may not always provide benefits for individual factors. For example, any given shearing date cannot reduce both spring and autumn flystrike risk. Therefore, a study is required that measures the effect of different shearing times on wool production and animal health parameters concurrently.
CHAPTER 3

EXPERIMENT INTRODUCTION AND MATERIALS & METHODS

3.1 Introduction & Trial Overview

Time of shearing affects many aspects of wool production and sheep health but no study has examined these factors concurrently in a spring-lambing, self-replacing Merino flock in southern Australia. A long-term trial was conducted to compare wool production, sheep bodyweight, fleece rot, dag, flystrike risk and mortality in spring-lambing Merino ewes and their progeny shorn at different times and managed under commercial conditions in south-eastern Australia. Adult ewes were shorn in December (DEC), March (MAR) or May (MAY), and within each adult shearing time, two patterns of shearing the progeny were examined. The shearing times were chosen in consultation with farmers to ensure that they were practical alternatives for a spring-lambing flock, and the trial was conducted for five years in order to measure the long-term effects of shearing time on production under a range of seasonal and environmental conditions.

3.2 Experimental Site

The trial was conducted from January 1999 to May 2004 on two farms in western Victoria, a major wool growing area of Australia. The trial started on Farm A, 15 km west of Geelong and, following the sale of this farm, all sheep in the trial were moved in February 2003 to Farm B, 20 km north-west of Farm A. The region has an annual average rainfall of about 535 mm. The soils on Farm A were a heavy, basalt type, to which large amounts of fertiliser had been applied for the previous 15 years (Appendix 1). A single paddock on Farm A was allocated for grazing by the ewe flock for the duration of the trial. This paddock contained a mixture of annual grasses, including barley grass (Hordeum spp.), silver grass (Vulpia spp.) and wallaby grass (Austrodanthonia spp.), as well as smaller amounts of volunteer Phalaris aquatica and subterranean clover (Trifolium subterraneum).
The paddocks grazed by the weaner and hogget sheep on Farm A contained well-established pastures of subterranean clover (cv. Leura and cv. Trikkala) and phalaris (cv. Sirosa and cv. Holdfast), which were sown in autumn 1995. Farm B had lighter, sandy soils that had not received fertiliser for many years. The pastures were dominated by annual grasses and contained mainly onion grass (Oxalis pes-caprae), silver grass and barley grass. They also contained some volunteer cocksfoot (Dactylis glomerata) and phalaris. The soil concentrations of phosphorus at the sites on Farms A and B in October 2003 were 18 and 8 mg Olsen P/kg, respectively. The winter stocking rate in the trial paddocks on Farms A and B was approximately 10 dry sheep equivalents per hectare (DSE/ha). This was more than the district average but lower than that achieved elsewhere on Farm A, and was consistent with a sustainable rate for a ‘high input’ farm under best-practice management in that environment (Vizard 1997). Monthly rainfall was measured manually on Farm A with a rain gauge, and maximum and minimum monthly temperatures were obtained from the nearest Bureau of Meteorology recording site, 15 km away.

3.3 Animals

Six hundred fine-wool Merino ewes of a single genotype were used to establish the trial flock. Approximately one third were born in September 1995 and the remainder in September 1996. With this sample size, the power of detecting a 5% difference in measured characteristics between ewe shearing groups was greater than 0.9. Assuming the flock’s weaning percentage was at least 75%, the number of progeny in the six weaner shearing groups gave a power of at least 0.8 for detecting a 10% difference in measured characteristics between weaner shearing groups.
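The text does not state the variance assumptions behind these power figures. A sketch of how such a two-sample power calculation is typically made, using a normal approximation, is shown below; the coefficient of variation (12%) and significance level (5%) are illustrative assumptions, not values from the trial.

```python
from math import sqrt
from statistics import NormalDist

def power_two_sample(pct_diff, cv, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test to detect a
    between-group difference of `pct_diff` (as a fraction of the mean)
    for a trait with coefficient of variation `cv`, with `n_per_group`
    animals in each group."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)      # two-sided critical value
    se = cv * sqrt(2.0 / n_per_group)       # SE of the difference, in mean units
    return nd.cdf(pct_diff / se - z_crit)

# Illustrative: ~200 ewes per shearing group, assumed trait CV of 12%
print(round(power_two_sample(0.05, 0.12, 200), 2))  # -> 0.99
```

Under these assumed values the power exceeds 0.9, consistent with the figure quoted for the ewe shearing groups; the actual calculation in the trial may have used different variance estimates.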
3.4 Flock Management

The sheep were managed as a spring-lambing, self-replacing fine-wool flock, with the first progeny born in the trial in September 1999. The routine management calendar of the flocks is shown in Table 3.1. Sheep were managed as three age groups: mixed-sex weaners (weaned sheep younger than 12 months of age), mixed-sex hoggets (sheep born one year prior to the current weaner flock, up to 19 months of age) and adult ewes (19 months of age or older). Within each management group, all sheep grazed together in the same paddock and were managed identically, apart from their shearing time.

Ewes were naturally mated as a group in a single paddock for five weeks to fine-wool Merino rams. Mating started on about 5th April each year, so lambing started on or about 1st September. Lambs were marked as a single group three weeks after the end of lambing, at which time they were identified to their dams’ shearing treatment using the marked-udder method (Davis et al. 1981): coloured branding fluid corresponding to each ewe shearing treatment, mixed with vegetable oil, was applied to each ewe’s udder one day prior to lamb-marking, and the udder and breech area of each ewe was examined to determine lactational status (Dun 1963). Ewes were classified as either lactating (‘wet’) or non-lactating (‘dry’). Lambs were identified to ewe treatment according to the colour of branding fluid on their heads at marking on the following day. Each lamb had a copy of a uniquely numbered tag placed in both ears, as well as another tag identifying to which of the six progeny shearing treatments it belonged (see below). Routine marking procedures were performed by experienced farm staff or a contractor. Lambs were vaccinated against Clostridium tetani, C. perfringens type D, C. novyi, C. chauvoei and C.
septicum, mulesed with a ‘radical-V’ operation (Morley 1983b), and cyromazine 0.1% w/v (Vetrazin Liquid® 500 g/L, Novartis Animal Health Australasia Pty Ltd) was applied over the breech to prevent flystrike of mulesing wounds. Male lambs were castrated by placing a rubber ring around the neck of the scrotum. Lambs and ewes were returned to their paddock immediately after marking. Lambs were weaned, received a booster vaccination and were drenched with an effective anthelmintic at the start of December, 12–13 weeks after the start of lambing.

At 19 months of age, replacement ewe hoggets were selected to enter the adult ewe flock and wether hoggets exited the trial. Selection of replacement ewes was based on a 12% micron premium index that incorporated fibre diameter, clean fleece weight, staple strength and bodyweight in its selection criteria (Ponzoni 1986). Within each year, the same number of ewe hoggets was selected from each of the six progeny shearing treatments; the number available for selection was determined by the weaner shearing treatment with the fewest ewe hoggets. About 24–29 sheep from each weaner shearing group entered the ewe flock each year. At the same time, the oldest ewes, ranked lowest on index, were culled from each ewe shearing treatment so as to leave, after the addition of the replacement ewe hoggets, approximately 200 ewes in each adult shearing treatment.

In order to compare the flystrike risk of the shearing groups, no sheep received prophylactic chemical treatment for flystrike. Flocks were monitored at least twice weekly throughout times of flystrike risk, and individual strikes were treated by clipping wool away from the affected and surrounding area and applying a topical insecticide. Struck sheep were not removed from the trial, although withholding periods for treated fleeces were observed. A flock was jetted with cyromazine if flystrike incidence exceeded 2% strikes/week.
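The replacement rule described above (equal numbers selected from every progeny shearing treatment, capped by the smallest treatment, taking the animals ranked highest on index) can be sketched as a short routine. The group names and index values below are synthetic illustrations, not trial data.

```python
def select_replacements(index_by_group):
    """Select the same number of replacement ewe hoggets from every
    progeny shearing treatment, capped by the size of the smallest
    treatment, taking the animals ranked highest on the selection index."""
    n = min(len(scores) for scores in index_by_group.values())
    return {group: sorted(scores, reverse=True)[:n]
            for group, scores in index_by_group.items()}

# Synthetic example: three treatments with unequal numbers of ewe hoggets
groups = {
    "DEC-DEC": [112, 98, 105, 120],
    "MAR-MAR": [101, 117, 99],
    "MAY-MAY": [108, 95, 110, 103, 121],
}
chosen = select_replacements(groups)
print({g: len(v) for g, v in chosen.items()})
# -> {'DEC-DEC': 3, 'MAR-MAR': 3, 'MAY-MAY': 3}
```

Capping by the smallest group keeps the shearing treatments balanced each year, at the cost of forgoing some high-index animals in the larger groups.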
Neither crutching nor the removal of wool from around the preputial opening of wethers (‘ringing’) was routinely performed. Some shearing treatments were crutched to reduce an anticipated fly risk on three occasions: March- and May-shorn ewes were crutched in December 2000, December- and March-shorn ewes in July 2001, and all three ewe treatments in July 2003. The weight of crutchings removed from each sheep was measured on each of these occasions. The 18-month-old wether progeny of May-shorn ewes were ringed in March 2002.

A strain of intermediate-virulence footrot (Dichelobacter nodosus) was present in some sheep on Farm A, although no clinical signs of footrot were observed in any of the sheep involved in the trial. However, at the request of the owners of Farm B, the sheep were inspected in the yards by Farm A’s manager in February 2002, before the trial flocks were moved, and up to four sheep were culled from each ewe and progeny shearing treatment based on the appearance of their feet. No footrot lesions were detected in the trial flocks at two subsequent inspections on Farm B.
Table 3.1: Calendar of flock management procedures and shearing times

Sept
  Ewe flock: lambing starts.
Oct
  Ewe flock: record BWT*, CS, DS, FR; identify lactating and dry ewes.
  Lamb/weaner flock: lamb marking and allocation to weaner shearing treatments (see text).
  Hogget flock: record BWT, CS, DS, FR; shearing: OCT-DEC.
Nov
  All flocks: monitor for flystrike, and record and treat individual cases (ongoing; see text).
Dec
  All flocks: record BWT, CS, DS, FR.
  Ewe flock: weaning; shearing: DEC†.
  Lamb/weaner flock: 1st summer drench; shearing: DEC-DEC; supplementary feeding as required (ongoing throughout summer and autumn).
  Hogget flock: shearing: DEC-DEC.
Feb
  Weaner flock: 2nd summer drench.
March
  All flocks: record BWT, CS, DS, FR.
  Ewe flock: shearing: MAR; cull ewes exit trial after shearing.
  Weaner flock: shearing: MAR-MAR.
  Hogget flock: shearing: MAR-MAR, JUN-MAR; replacement ewe hoggets enter adult ewe flock; ewe hogget culls and all wether hoggets exit trial after shearing.
April
  All flocks: monitor flock worm egg counts monthly and drench if necessary (ongoing).
  Ewe flock: mating starts (~April 5).
May
  All flocks: record BWT, CS, DS, FR.
  Ewe flock: mating ends after 5 weeks; shearing: MAY.
  Weaner flock: shearing: MAY-MAY.
  Hogget flock: shearing: MAY-MAY, JUL-MAY (culls exit trial off shears).
June
  Ewe flock: supplementary feeding as required (ongoing, determined by farm management).
  Weaner flock: record BWT, DS, FR; shearing: JUN-MAR.
July
  All flocks: record BWT, CS, DS, FR.
  Weaner flock: shearing: JUL-MAY. [Weaner flock now considered to be the hogget flock.]

* BWT: bodyweight; CS: condition score; DS: dag score; FR: fleece rot score
† See Section 3.5 for description of shearing group abbreviations

3.5 Shearing

The shearing times in the trial were recommended by a workshop of approximately 100 farmers and researchers participating in a farm extension program (Larsen et al. 2002).
The recommended shearing times were all considered to be realistic alternatives for commercial farms in the district, each offering potential benefits and disadvantages that warranted full examination. In January 1998, the ewes in the trial flock were randomly allocated, after stratification for age, to one of three annual shearing treatments: December (DEC), March (MAR) or May (MAY). Ewes allocated to December shearing were shorn immediately. Observations commenced in December 1999, when the first 12-month DEC shearing took place.

Progeny from each ewe shearing time were randomly allocated to one of two weaner shearing patterns that aligned them with the adult shearing time over their first two shearings (Figure 3.1), giving three ewe shearing and six progeny shearing treatments in total. Progeny of DEC ewes were either first shorn at weaning (3 months old, in December) and then at 15 months of age (December; abbreviated ‘DEC-DEC’), or first shorn at 13 months of age (October) and then at 27 months of age (December) in the DEC ewe flock itself (OCT-DEC). Progeny of MAR ewes were either first shorn at 6 months of age (March) and then at 18 months of age (March; MAR-MAR), or first shorn at 9 months of age (June) and then at 18 months of age (March; JUN-MAR). Progeny of MAY ewes were either first shorn at 8 months of age (May) and then at 20 months of age (May; MAY-MAY), or first shorn at 10 months of age (July) and then at 20 months of age (May; JUL-MAY).

Sheep were shorn by professional shearers in about the middle of the respective month. As each sheep was shorn, the belly wool and any dags were separated from the main fleece. The unskirted fleece,

[Figure 3.1: Timing of ewe and progeny shearing treatments (age in months at weaner shearings)]