Expanding phenological insights: automated phenostage annotation with community science plant images.
Journal:
International Journal of Biometeorology
Published Date:
Jul 4, 2025
Abstract
Plant phenology plays a pivotal role in understanding the interactions between plants and their environment. Despite increasing interest in plant phenology research, documenting the spatial and temporal variability of phenological events at large spatial scales remains a challenge for many species and a variety of phenostages. The use of plant identification apps has produced a vast repository of plant occurrence records spanning large spatial and temporal scales. As these observations are usually accompanied by images, they could be a rich source of fine-grained, large-scale phenological information. However, manually annotating phenological stages is time-intensive, necessitating efficient automated approaches. In this study, we developed a machine learning-based workflow to automatically classify plant images into the phenological stages of flowering bud, flower, unripe fruit, ripe fruit, and senescence for nine common woody shrub and tree species. Although the process required only a small number of training images, the classification achieved an overall accuracy of 96% across all species and phenostages. To evaluate the phenological relevance of these automatically annotated observations, we compared their temporal and spatial patterns over three years (2020-2022) with systematically collected phenological data from the German Meteorological Service (DWD). This comparison revealed strong spatial and temporal consistency, particularly for the flowering stages, with interannual phenological trends aligning well between the datasets. Our results demonstrate that automatic annotation of phenological stages can be achieved with high reliability even with low manual labeling effort. Provided that a large number of images is available, these automatically labeled observations carry a strong phenological signal.
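The evaluation step described in the abstract, comparing the timing of automatically annotated flowering observations with DWD station records, could in principle look like the following minimal sketch. All data values, function names, and the mean day-of-year comparison here are illustrative assumptions for exposition; they are not the study's actual data or method:

```python
from statistics import mean

# Hypothetical day-of-year (DOY) values of flowering-stage observations
# per year -- purely illustrative, not data from the study.
app_observations = {
    2020: [95, 98, 101, 103, 99],   # automatically annotated app images
    2021: [102, 105, 108, 104],
    2022: [97, 99, 100, 96, 98],
}
# Hypothetical DWD flowering-onset DOYs for the same species and years.
dwd_onset = {2020: 96, 2021: 104, 2022: 97}

def mean_flowering_doy(observations):
    """Mean day-of-year of flowering-stage observations for each year."""
    return {year: mean(doys) for year, doys in observations.items()}

def yearly_bias(app_means, reference):
    """Per-year difference (in days) between app-derived means and reference onsets."""
    return {year: app_means[year] - reference[year] for year in reference}

app_means = mean_flowering_doy(app_observations)
bias = yearly_bias(app_means, dwd_onset)
```

If the per-year bias stays small and the interannual ordering of the app-derived means tracks the reference onsets (here, a late 2021 in both series), the automatically labeled observations carry a phenological signal consistent with the systematic data.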