Utilizing Dried Distillers Grains as a Roughage Source in Finishing Rations

This article was originally published in February 2010 on pages A8-A9 of the Prairie Bull Breeders 2010.

The 2009-2010 farming year in Alberta has been, and continues to be, a challenging one for beef producers. Cattle prices have been low and a dry summer has created forage shortages in the province. Consequently, producers are looking to use alternative fiber sources, such as by-product feedstuffs, to stretch their silage and hay supplies. Feedstuffs such as screenings, oat hulls and straw have a lower TDN (total digestible nutrients) value than barley silage, which may limit the growth of feedlot cattle when these sources are substituted for barley silage in backgrounding and finishing rations.

Over the past two to three years, the substitution of dried distillers grains with solubles (DDGS) for a portion of the barley grain in finishing feedlot diets has become an increasingly common practice. Feed trials conducted in the U.S. and Canada suggest that performance of feedlot cattle is optimized when DDGS is included in the diet at 20% to 30% of diet dry matter. When DDGS replaces barley or corn grain in the ration, the starch content of the diet is decreased and the amount of more slowly fermentable fiber is increased. This may decrease the extent and rate of acid production in the rumen and reduce the incidence of subacute ruminal acidosis. Consequently, the level of forage that must be included in the diet to maintain rumen health may be reduced when DDGS is substituted for a portion of the grain in the diet. Our group recently conducted a feedlot trial at AAFC Lethbridge Research Centre to investigate this possibility.

One of the goals of our project was to explore the extent to which triticale could be used in the bioethanol industry, and so we utilized DDGS created from the fermentation of triticale for ethanol production. However, given our past experience, we would expect to see similar results if we had used either corn or wheat DDGS.

In our study, we formulated four diets (Table 1). The first diet, designated as the control (CON), contained no triticale DDGS and was typical of diets widely used in western Canadian feedlots before DDGS became readily available. In the second diet, triticale DDGS was substituted for 20% of the barley grain in a ration containing 10% barley silage on a dry matter basis (D-10S). The third diet contained more triticale DDGS so that barley silage composed only 5% of the diet dry matter (D-5S). Triticale DDGS in the final diet was increased to the point that no barley silage was included in the formulation (D-0S).

Table 1. Diet Composition

One hundred and sixty crossbred yearling steers were fed these diets for 112 days while we measured intake, gain, feed efficiency and post-slaughter carcass quality. Additionally, we placed rumen windows in four steers on each diet and, through the use of a sophisticated pH meter, we continuously monitored the severity of rumen acidosis for four 7-day periods during the experiment. We also monitored intake of individual steers in two pens on each diet using the GrowSafe feed intake monitoring system, whereby we could tell when and how much feed each individual steer consumed at any point during the day or night.
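For readers interested in how a continuous pH record can be turned into acidosis measures, the sketch below (in Python) shows one common approach: counting the time, and the accumulated pH-by-time area, that the trace spends below a threshold. The pH 5.8 threshold, the one-minute logging interval and the function names are illustrative assumptions for this example, not the definitions used in our trial.

```python
# Illustrative sketch only (not the study's analysis code): summarizing a
# continuously logged rumen pH trace into simple acidosis indicators.
# The pH 5.8 threshold and one-reading-per-minute interval are assumptions
# for this example; the trial's actual definitions may differ.

def summarize_ph_trace(ph_readings, threshold=5.8, minutes_per_reading=1):
    """Return minutes below the threshold and the accumulated area
    (pH units x minutes) below it, two common ways of expressing the
    severity of subacute ruminal acidosis from indwelling pH data."""
    minutes_below = 0
    area_below = 0.0
    for ph in ph_readings:
        if ph < threshold:
            minutes_below += minutes_per_reading
            area_below += (threshold - ph) * minutes_per_reading
    return minutes_below, area_below

# A short, made-up trace that dips below the threshold after feeding.
trace = [6.4, 6.1, 5.9, 5.7, 5.5, 5.6, 5.9, 6.2]
print(summarize_ph_trace(trace))  # roughly (3, 0.6): 3 minutes below pH 5.8
```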

Rumen pH was higher, indicating a reduced severity of rumen acidosis, in steers fed the diet in which triticale DDGS was substituted only for barley grain (D-10S) as compared to the traditional feedlot ration containing no DDGS. Furthermore, based on the measurement of rumen pH, we found that cattle fed the D-10S diet experienced fewer cases of subclinical acidosis (12 versus 21) than cattle fed the diet that contained no triticale DDGS. This suggests that diluting the dietary starch content through inclusion of triticale DDGS in the ration may reduce the incidence of digestive disturbances in feedlot cattle fed high-grain finishing diets.

However, as triticale DDGS replaced barley silage, acid production in the rumen increased linearly, and was highest for steers that received no silage in their diet. As a result, the number of cases of subclinical acidosis we measured rose from 12 to 30 and 40 as the level of silage in the diet decreased from 10% to 5% and 0%, respectively. This result may have arisen from the smaller particle size of triticale DDGS as compared to barley silage.

Diets that contain small particles do not stimulate chewing activity as much as those containing longer particles. A reduction in chewing reduces the flow of saliva into the rumen and, as a result, the animal's ability to buffer acid in the rumen is reduced. This buffering action is very important in maintaining rumen function and, in fact, this relationship is one of the main reasons for including silage in high-grain finishing diets. So in our study, substitution of triticale DDGS for barley silage increased the incidence of subclinical acidosis, but perhaps the more important question is whether this practice influenced intake, growth, feed efficiency or carcass quality of the cattle.

Work at Agriculture and Agri-Food Canada and by others has shown that greater variation in intake is often associated with an increased incidence and severity of ruminal acidosis. However, in our study, even though acid production in the rumen increased as triticale DDGS replaced barley silage, intake variation did not noticeably increase. This suggests that the greater incidence of subclinical acidosis did not reach the threshold required to seriously influence the intake of the cattle. This was the case even though the rate of feed consumption (lbs/min) increased as triticale DDGS was substituted for barley silage, a result consistent with our contention that chewing was reduced as more triticale DDGS replaced barley silage.
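As an illustration of what intake variation means in practice, the short sketch below computes the day-to-day coefficient of variation of one steer's intake from daily records such as those produced by a system like GrowSafe. The numbers and the choice of the coefficient of variation as the statistic are assumptions for the example, not the exact measures reported here.

```python
# Illustrative sketch only: summarizing day-to-day intake variation from
# individual-animal intake records (e.g., daily totals from GrowSafe).
# The numbers and the use of the coefficient of variation are assumptions
# for this example, not statistics from the trial.
from statistics import mean, stdev

def intake_cv(daily_intakes_kg):
    """Coefficient of variation (%) of one steer's daily feed intake."""
    return 100 * stdev(daily_intakes_kg) / mean(daily_intakes_kg)

# A made-up week of daily dry matter intakes (kg) for one steer.
steer_week = [10.2, 9.8, 10.5, 9.9, 10.1, 10.4, 9.7]
print(f"Intake CV: {intake_cv(steer_week):.1f}%")
```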

In fact, as we substituted triticale DDGS in the diet for barley silage, intake decreased but without any negative effect on the gain of the cattle (Table 2). The fact that the substitution of triticale DDGS for silage reduced intake, but did not affect gain, resulted in the improved feed efficiency (i.e., lbs of feed needed per lb of gain) of the steers. This reflects the higher energy content of triticale DDGS as compared to barley silage. So in this study, even though substitution of triticale DDGS for barley silage increased subclinical rumen acidosis, it did not adversely affect the productivity of the steers.
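To make the feed efficiency calculation concrete, the brief example below works through the feed-to-gain arithmetic with hypothetical numbers; the values are illustrative only and are not results from this trial.

```python
# Illustrative arithmetic only (hypothetical numbers, not trial results):
# feed efficiency expressed as feed-to-gain, i.e., lbs of feed dry matter
# required per lb of live-weight gain.
feed_intake_lb_per_day = 22.0   # hypothetical daily dry matter intake
gain_lb_per_day = 3.4           # hypothetical average daily gain
feed_to_gain = feed_intake_lb_per_day / gain_lb_per_day
print(f"Feed:gain = {feed_to_gain:.2f} lb feed per lb of gain")  # about 6.47
```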

Post-slaughter analysis also showed that this practice could be undertaken with no reduction in carcass quality. However, we did notice a substantial increase in the number of liver abscesses in cattle that were not fed any barley silage. For experimental consistency, we did not include a liver abscess preventative in our diets, so this problem may be solved if such an additive were included in the diet.

Table 2

Although substitution of DDGS for silage in finishing diets may increase the incidence or severity of rumen acidosis, this outcome does not appear to adversely impact the performance of the cattle and, in fact, resulted in better feed efficiency. Such a practice could provide significant savings to feedlot operators, both from a reduced cost of gain and from potentially allocating fewer acres to silage production. Findings from this experiment support using DDGS to extend forage supplies in finishing rations. However, decreasing the forage content of high-grain rations will likely require producers to pay increased attention to feed bunk management and the degree of grain processing, and will likely require the inclusion of ionophores and other antibiotics in the diet to modulate rumen fermentation and to reduce the incidence of liver abscesses.

Kris Wierenga recently completed his M.Sc. degree at the University of Alberta, and both he and Darryl Gibb are consulting nutritionists with Viterra Feed Products. Tim McAllister and Karen Beauchemin are research scientists with Agriculture and Agri-Food Canada in Lethbridge, and Masahito Oba is an associate professor at the University of Alberta.