Abstract

Vertical leaf angles and their variation through time are directly related to several ecophysiological processes and properties. However, there is no efficient method for tracking leaf angles of plant canopies under field conditions. Here, we present AngleCam, a deep learning-based approach to predict leaf angle distributions from horizontal photographs acquired with low-cost time-lapse cameras. AngleCam is based on pattern recognition with convolutional neural networks and was trained on leaf angle distributions obtained from visual interpretation of more than 2500 plant photographs across different species and scene conditions. Leaf angle predictions were evaluated over a wide range of species and scene conditions against independent samples from visual interpretation (R² = 0.84) and against leaf angle estimates obtained from terrestrial laser scanning (R² = 0.75). AngleCam was further tested for the long-term monitoring of leaf angles of two broadleaf tree species in a temperate forest. The plausibility of the predicted leaf angle time series was supported by their close relationship with environmental variables related to transpiration. These evaluations confirm that AngleCam is a robust and efficient method for tracking leaf angles under field conditions. The output of AngleCam is compatible with a range of applications, including functional-structural plant modelling, Earth system modelling and radiative transfer modelling of plant canopies. AngleCam may also be used to predict leaf angle distributions from existing data, for instance from PhenoCam networks or citizen science projects.
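At its core, the approach is an image-regression setup in which a convolutional backbone maps each canopy photograph to a leaf angle distribution. The following minimal sketch illustrates this idea only; the backbone choice, the discretization of the 0-90° angle range into bins, and the softmax output are assumptions for demonstration and not the authors' implementation.

```python
# Illustrative sketch (not the AngleCam implementation): a CNN that maps an RGB
# canopy photograph to a discretized leaf angle distribution. The number of bins
# and the torchvision backbone are assumptions chosen for demonstration only.
import torch
import torch.nn as nn
from torchvision import models


class LeafAngleCNN(nn.Module):
    def __init__(self, n_bins: int = 41):
        super().__init__()
        # Pretrained backbone used as a feature extractor (assumed choice).
        self.backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        # Replace the classification head with a regression head over angle bins.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, n_bins)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax yields a probability distribution over leaf angle bins (0-90 degrees).
        return torch.softmax(self.backbone(x), dim=1)


model = LeafAngleCNN()
dummy_batch = torch.rand(2, 3, 224, 224)   # two example photographs
angle_distributions = model(dummy_batch)   # shape: (2, 41); each row sums to 1
```

In such a setup, the reference distributions derived from visual interpretation would serve as training targets, and a distribution-comparison loss (e.g., cross-entropy or an earth mover's distance) could be minimized; the specific loss used here is not stated in the abstract.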