The current standard for determining nitrogen application rates is way off course, according to Richard Mulvaney, a soil scientist at the University of Illinois.
He notes that for more than 30 years, no-tillers and other corn growers have relied on a nitrogen application guideline known as the proven yield method.
In this method, the grower’s expected yield in bushels per acre is multiplied by 1.2 pounds of nitrogen per bushel, and any credits for nitrogen supplied by legumes or manure are subtracted.
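The arithmetic behind the proven yield method can be sketched in a few lines of code. This is only an illustration of the calculation described above; the function name, parameter names, and the choice to floor a negative result at zero are assumptions of this sketch, not part of any official guideline.

```python
def proven_yield_n_rate(expected_yield_bu, legume_credit_lb=0.0, manure_credit_lb=0.0):
    """Nitrogen recommendation (lb N/acre) under the proven yield method:
    expected yield (bu/acre) x 1.2 lb N per bushel, minus nitrogen credits.
    Flooring at zero is an assumption of this sketch."""
    rate = expected_yield_bu * 1.2 - legume_credit_lb - manure_credit_lb
    return max(rate, 0.0)

# Example: a 180 bu/acre yield goal with a 40 lb/acre soybean credit
print(proven_yield_n_rate(180, legume_credit_lb=40))  # → 176.0
```

For a 180-bushel yield goal, the method would recommend 216 pounds of nitrogen per acre before credits, or 176 pounds after a 40-pound legume credit.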
The formula is flawed, Mulvaney says, because it rests on several faulty assumptions: that the crop receives most of its nitrogen from the fertilizer rather than from the soil; that all fields need fertilization; that fertilizer efficiency is constant regardless of conditions; that higher yields require more fertilizer nitrogen; and that whole-field nitrogen management is adequate.
Mulvaney says of the five assumptions, “They’re a bunch of baloney; they don’t jibe with reality. So you shouldn’t expect a lot of accuracy from the kind of recommendations that come from this method.”
Those assumptions were proven wrong in a number of studies done during the past three decades, he says.
In the studies, chemically labeled fertilizer allowed researchers to distinguish the crop’s uptake of fertilizer nitrogen from its uptake of soil nitrogen.
In a test field at Monmouth, Ill., the uptake of fertilizer nitrogen increased as the fertilizer rate increased. However, even at a rate of 240 pounds of fertilizer per acre, by the end of the growing season the crop had still taken up more of…