This issue, political as much as scientific, has persisted for more than 50 years. About 80% of all antibiotics in the United States are fed to animals, as disease prophylactics and as growth stimulants. Recent developments promise to revive the dispute.
Antibiotics have been used in animals for about as long as they have been used in humans. They brought under control a number of animal diseases that could wipe out a flock or herd very quickly. Streptomycin, the second “miracle antibiotic” after penicillin, was developed at an agricultural experiment station and came into veterinary use in the 1940s.
The 1940s were also an era of rapid advances in animal feeds. Adding urea and vitamins to commercial feed spurred growth. So did small, subtherapeutic doses of antibiotics given before animals became ill: healthy animals grow faster. This, plus breeding for faster growth, made large, concentrated feedlot operations possible, and commodity economics took over from there. Veterinarians and the livestock industry still regard antibiotics as a pillar of their business model.
Misgivings about bulk use of antibiotics arose almost immediately. Research in 1956 noted that microbes constantly exposed to antibiotics could mutate and become resistant to them, and that antibiotics released by animals into the larger environment would spread that resistance. Furthermore, humans eating “factory farmed” meat ingested small amounts of antibiotics regularly.
Evidence that resistant microbes weaken the effectiveness of antibiotics for therapeutic treatment of both animals and humans has mounted over the intervening years. The FDA initiated action to ban antibiotic feeding in 1973, and in 1977 issued a notice that it would withdraw approval unless manufacturers and users could prove that the practice caused no harm. Over the past 35 years a stream of proposed bills, lobbying, and petitions has resolved nothing, while medical and environmental interests have jockeyed back and forth with pharmaceutical and agricultural interests.
On March 22 this year, Judge Theodore Katz issued a directive to the FDA in response to another lawsuit, filed by the Natural Resources Defense Council, over the agency’s inaction. The judge directed the FDA to do now what it said it would do 35 years ago: require that antibiotics fed to animals be shown to be safe, or ban the practice. The same day, an unrelated research report found that fluoroquinolones, a class of antibiotics the FDA had banned from use in poultry in 2005, were still showing up in chickens and chicken products.
This issue shows the importance of the precautionary principle: that parties taking an action be required to demonstrate, to the best of their ability, that it will do no harm to others or to the environment. In the commercial world, of course, that means incurring substantial expense before a revenue stream begins. Once revenue is flowing, companies prefer to insist that their practices be proven unsafe before they give up existing revenue, and perhaps a whole business model. Delay postpones the reckoning, but whether companies and the public are better served in the long run is questionable.
If Judge Katz’s order actually leads to a ban on antibiotic feeding this time, how the agribusiness model will shift is not easily foreseen, but years from now the industry may see it as something it should have addressed much sooner. That is part of learning to improve ongoing performance rather than being content with a cash cow.