The contributions of high-quality colostrum to the health and productivity of dairy calves – both in their early weeks and throughout their lifetimes – have been well documented. But just how much of the colostrum produced in the United States can be classified as “high-quality”? Researcher Kim Morrill and a team of colleagues at Iowa State University conducted a study to find out.

The team collected 827 samples of first-milking colostrum from 67 farms in 12 states between June and October 2010. The parity of each donor cow was recorded, as was the storage method of the colostrum when it was sampled – either fresh, refrigerated or frozen.

Among their findings:

  • IgG concentration ranged from <1 to 200 mg/mL, with a mean concentration of 68.8 mg/mL
  • Almost 30 percent of samples contained IgG concentrations of <50 mg/mL
  • IgG concentration increased with parity (42.4, 68.6 and 95.9 mg/mL in first, second and third or later lactations, respectively)
  • No significant differences in IgG levels were noted among breeds or storage methods
  • IgG was highest in samples collected in the Midwest (79.7 mg/mL) and lowest in the Southwest (64.3 mg/mL)
  • Nearly 43 percent of samples had total plate count (TPC) of >100,000 CFU/mL, and nearly 17 percent of samples had TPC of >1 million CFU/mL

Morrill concludes that only 39.4 percent of the samples collected met industry standards for both IgG concentration and TPC. That means more than 60 percent of colostrum on dairy farms is inadequate, putting a large number of calves at risk of failure of passive transfer and/or bacterial infections.
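For readers tracking colostrum quality on their own operations, the pass/fail logic described above can be sketched in a few lines of code. This is a hypothetical illustration, not the study's analysis; it assumes the two commonly cited industry thresholds implied by the article (IgG of at least 50 mg/mL and TPC no greater than 100,000 CFU/mL), and the example samples are invented values, not data from the study.

```python
# Hypothetical sketch: classify colostrum samples against the two
# industry thresholds described in the article. Threshold values are
# assumptions based on the cutoffs the article cites, not study code.

IGG_MIN_MG_PER_ML = 50        # minimum acceptable IgG concentration
TPC_MAX_CFU_PER_ML = 100_000  # maximum acceptable total plate count

def meets_industry_standards(igg_mg_per_ml: float, tpc_cfu_per_ml: float) -> bool:
    """Return True only if a sample passes both quality thresholds."""
    return (igg_mg_per_ml >= IGG_MIN_MG_PER_ML
            and tpc_cfu_per_ml <= TPC_MAX_CFU_PER_ML)

# Illustrative samples (invented values):
samples = [
    {"igg": 68.8, "tpc": 50_000},     # passes both thresholds
    {"igg": 42.0, "tpc": 20_000},     # fails on IgG concentration
    {"igg": 95.9, "tpc": 1_200_000},  # fails on total plate count
]
passing = [s for s in samples if meets_industry_standards(s["igg"], s["tpc"])]
print(f"{len(passing)} of {len(samples)} samples meet both standards")
```

A sample must clear both cutoffs to count as high-quality, which is why the share meeting both standards (39.4 percent in the study) is lower than the share passing either threshold alone.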