When it comes to fall manure application, timing matters.
Applying manure too early in the fall creates more risk of nitrogen (N) loss
before your crop can use it. Two main factors drive this: temperature and time.
The nitrogen in manure is primarily in two forms: organic N
and ammonium N. Both forms are relatively stable when first applied and tend to
stay in place if incorporated into the soil. But once in the soil, microbes
start working. Under aerobic conditions, microbes convert ammonium into
nitrate—a form of nitrogen that is both mobile in the soil and vulnerable to
loss through leaching or denitrification.
If manure is applied earlier in the fall, microbes have:
· Warmer soils, which speed up the conversion of ammonium to nitrate, and
· More time before crop uptake, which means more opportunity for nitrate to be lost.
Combine those two with fall and spring precipitation moving
through the soil, and you’ve created a recipe for nitrogen loss.
Tracking Microbial Activity with μGDD
To make sense of this, researchers have developed a
microbial growing degree day (μGDD) index, a simple way to compare how much
microbial activity (and therefore nitrogen conversion) might occur depending on
when you apply.
Microbial activity roughly doubles with every 18°F increase
in soil temperature but drops to zero when soils freeze. Using that
relationship, we can calculate μGDD to compare the relative risk of nitrogen
conversion across application dates and locations.
The formula looks like this:

μGDD = Σ Q10^((Tt − Tref)/18), summed over days, with a day’s term set to zero when the soil is frozen

Where:
· Tt is the daily average temperature of the soil
· Tref is a reference temperature, which I set at 32°F
· Q10 is 2, which is essentially saying activity doubles every 18°F temperature increase
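As a concrete illustration, here is a minimal Python sketch of that daily accumulation, assuming daily average soil temperatures in °F as input; the function name and defaults are mine for illustration, not the researchers’ implementation.

```python
# Minimal sketch of the daily Q10 accumulation described above.
# Assumes daily average soil temperatures in degrees F; the function name
# and defaults are illustrative, not the published implementation.
def microbial_gdd(daily_soil_temps_f, t_ref=32.0, q10=2.0):
    """Accumulate relative microbial growing degree days (unitless)."""
    total = 0.0
    for t in daily_soil_temps_f:
        if t > t_ref:  # frozen soils contribute no microbial activity
            total += q10 ** ((t - t_ref) / 18.0)  # doubles every 18 F
    return total
```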
This index doesn’t tell us how many pounds of N are lost (that also takes water movement), and it doesn’t even tell us how much of the nitrogen is converted into nitrate. What it does do is let us compare the relative risk of N loss at different application dates and across geographic locations.
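To see how such a comparison might look, here is a hypothetical example with made-up soil temperatures; a real comparison would use measured or modeled soil temperature records from application through crop uptake.

```python
# Hypothetical daily soil temperatures (°F) following two application dates;
# real use would span from application until crop uptake the next spring.
early_october = [58, 56, 55, 52, 50, 48, 45, 44, 42, 40]
early_november = [42, 40, 38, 36, 35, 34, 33, 31, 30, 30]

print(microbial_gdd(early_october))   # higher value = more relative risk
print(microbial_gdd(early_november))  # lower value = less relative risk
```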
What the Results Show
When we apply this μGDD approach, a clear pattern emerges:
Earlier fall applications carry more microbial activity and, therefore, more risk. A well-timed fall application (approximately November 1) carries about double the nitrification risk of a spring application, but it is really in September and October that the relative risk of nitrification starts climbing rapidly. Risk does not equal loss, but it does show the opportunity for loss when weather patterns don’t cooperate and bring wet soil conditions, water movement, and drainage.
The second point I want to make is that there isn’t a one-size-fits-all recommendation for Iowa. Northern Iowa behaves like “late” southern Iowa. In other words, applying in early October in northern Iowa may carry about the same microbial risk as applying 10–14 days later in southern Iowa. The map shown represents the relative range in microbial degree days a site in southern Iowa would have experienced compared to northern Iowa if both applied at the same time.
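One way to express that north-south offset in code is to scan for the southern application date whose remaining-season μGDD best matches a northern site’s. The sketch below reuses the microbial_gdd function above; the matching logic is my illustration, not the method behind the map.

```python
# Illustrative only: given full-season daily soil temperature series for a
# northern and a southern site, find how many days later the southern site
# would need to apply to accrue a similar muGDD. Reuses microbial_gdd above.
def equivalent_delay(north_temps_f, south_temps_f, apply_day):
    """Delay (days) at the southern site that best matches the northern muGDD."""
    target = microbial_gdd(north_temps_f[apply_day:])
    best_delay, best_gap = 0, float("inf")
    for delay in range(len(south_temps_f) - apply_day):
        gap = abs(microbial_gdd(south_temps_f[apply_day + delay:]) - target)
        if gap < best_gap:
            best_delay, best_gap = delay, gap
    return best_delay
```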
Putting It in Context
This finding reinforces what we’ve discussed in other
articles:
· Wait until soils are cool (below 50°F) before applying fall manure to slow microbial conversion (a simple date check is sketched after this list).
· Use nitrification inhibitors if you need to apply earlier; they can buy some time, though they won’t prevent all conversion.
· Consider cover crops to capture nitrate if early application is unavoidable.
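The first recommendation is simple enough to script against a soil temperature record. A small sketch with made-up dates and readings:

```python
# Made-up example: find the first date the average soil temperature drops
# below the 50 F guideline. Dates and readings are hypothetical.
from datetime import date

readings = {
    date(2024, 10, 20): 53.0,
    date(2024, 10, 21): 51.5,
    date(2024, 10, 22): 49.0,  # first day below 50 F
    date(2024, 10, 23): 47.5,
}
first_cool = next((d for d, t in sorted(readings.items()) if t < 50.0), None)
print(first_cool)  # 2024-10-22
```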
At the end of the day, earlier fall applications mean more
time for microbes to work and more chance for N loss. The μGDD framework gives
us a way to quantify that relative risk and better understand why timing matters.
Timing isn’t the only factor, but the μGDD framework can help explain why the risk changes with it.