27 Apr 2021
Budbreak is one of the most observed and studied phenological phases in perennial plants. Two dimensions of exposure to temperature are generally used to model budbreak: accumulation of time spent at low temperatures (chilling) and accumulation of heat units (forcing). These two effects have a well-established negative correlation: the more chilling, the less forcing required for budbreak. Furthermore, temperate plant species are assumed to vary in the amount of chilling required to complete endodormancy and begin the transition to breaking bud. Still, prediction of budbreak remains a challenge. The present work demonstrates, across a wide range of species, how bud cold hardiness must be accounted for to study dormancy and accurately predict time to budbreak. Cold hardiness defines the path length to budbreak: the difference between the cold hardiness buds attain during the winter and the cold hardiness at which deacclimated buds are predicted to open. This distance varies among species and throughout winter within a species. Increases in the rate of cold hardiness loss (deacclimation) measured throughout winter show that chilling controls deacclimation potential – the proportion of the maximum rate response attained at high chill accumulation – which has a sigmoid relationship to chilling accumulation. For forcing, rates of deacclimation increase non-linearly in response to temperature. Comparisons of deacclimation potential show that dormancy progresses similarly for all species. This observation suggests that comparisons of the physiological and genetic control of dormancy require an understanding of cold hardiness dynamics, and that the framework for studying dormancy and its effects on spring phenology needs updating.
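The model structure described above – a sigmoid chilling response scaling a non-linear forcing response, with budbreak occurring once the cold-hardiness path length has been traversed – can be sketched as follows. This is a minimal illustration only: the functional forms (logistic chilling response, power-law temperature response) and all parameter values (`c50`, `slope`, `t_base`, `beta`, `k_max`) are assumptions for demonstration, not the fitted values or exact equations from the study.

```python
import math

def deacclimation_potential(chill, c50=50.0, slope=0.15):
    """Sigmoid response (0 to 1) of deacclimation potential to chill
    accumulation; c50 is the (hypothetical) chill accumulation at which
    half the maximum rate response is attained."""
    return 1.0 / (1.0 + math.exp(-slope * (chill - c50)))

def temperature_response(temp_c, t_base=0.0, beta=1.8):
    """Non-linear increase of deacclimation rate with forcing temperature;
    a power law above a base temperature is assumed here for illustration."""
    return max(temp_c - t_base, 0.0) ** beta

def days_to_budbreak(path_length, chill, temp_c, k_max=0.05):
    """Days for buds to traverse the cold-hardiness 'path length'
    (hardiness lost, in degrees C) at a constant forcing temperature.
    Rate = max rate x chilling potential x temperature response."""
    rate = k_max * deacclimation_potential(chill) * temperature_response(temp_c)
    return float("inf") if rate == 0.0 else path_length / rate
```

Under this sketch, buds with high chill accumulation reach budbreak faster than buds with low chill accumulation at the same forcing temperature, reproducing the negative chilling-forcing correlation the abstract describes.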