Fruit Detection and Yield Mass Estimation from a UAV Based RGB Dense Cloud for an Apple Orchard

Publication: Contributions to journals › Journal articles › Research › peer-reviewed

Standard

Fruit Detection and Yield Mass Estimation from a UAV Based RGB Dense Cloud for an Apple Orchard. / Hobart, Marius; Pflanz, Michael; Tsoulias, Nikos et al.
In: Drones, Vol. 9, No. 1, 60, 16.01.2025.


Harvard

Hobart, M, Pflanz, M, Tsoulias, N, Weltzien, C, Kopetzky, M & Schirrmann, M 2025, 'Fruit Detection and Yield Mass Estimation from a UAV Based RGB Dense Cloud for an Apple Orchard', Drones, vol. 9, no. 1, 60. https://doi.org/10.3390/drones9010060

APA

Hobart, M., Pflanz, M., Tsoulias, N., Weltzien, C., Kopetzky, M., & Schirrmann, M. (2025). Fruit Detection and Yield Mass Estimation from a UAV Based RGB Dense Cloud for an Apple Orchard. Drones, 9(1), Article 60. https://doi.org/10.3390/drones9010060

Vancouver

Hobart M, Pflanz M, Tsoulias N, Weltzien C, Kopetzky M, Schirrmann M. Fruit Detection and Yield Mass Estimation from a UAV Based RGB Dense Cloud for an Apple Orchard. Drones. 2025 Jan 16;9(1):60. doi: 10.3390/drones9010060

BibTeX

@article{5f0e9d943ce14b9ebba893add9c5a066,
title = "Fruit Detection and Yield Mass Estimation from a UAV Based RGB Dense Cloud for an Apple Orchard",
abstract = "Precise photogrammetric mapping of preharvest conditions in an apple orchard can help determine the exact position and volume of single apple fruits. This can help estimate upcoming yields and prevent losses through spatially precise cultivation measures. These parameters also are the basis for effective storage management decisions, post-harvest. These spatial orchard characteristics can be determined by low-cost drone technology with a consumer grade red-green-blue (RGB) sensor. Flights were conducted in a specified setting to enhance the signal-to-noise ratio of the orchard imagery. Two different altitudes of 7.5 m and 10 m were tested to estimate the optimum performance. A multi-seasonal field campaign was conducted on an apple orchard in Brandenburg, Germany. The test site consisted of an area of 0.5 ha with 1334 trees, including the varieties {\textquoteleft}Gala{\textquoteright} and {\textquoteleft}Jonaprince{\textquoteright}. Four rows of trees were tested each season, consisting of 14 blocks with eight trees each. Ripe apples were detected by their color and structure from a photogrammetrically created three-dimensional point cloud with an automatic algorithm. The detection included the position, number, volume and mass of apples for all blocks over the orchard. Results show that the identification of ripe apple fruit is possible in RGB point clouds. Model coefficients of determination ranged from 0.41 for data captured at an altitude of 7.5 m for 2018 to 0.40 and 0.53 for data from a 10 m altitude, for 2018 and 2020, respectively. Model performance was weaker for the last captured tree rows because data coverage was lower. The model underestimated the number of apples per block, which is reasonable, as leaves cover some of the fruits. However, a good relationship to the yield mass per block was found when the estimated apple volume per block was combined with a mean apple density per variety. Overall, coefficients of determination of 0.56 (for the 7.5 m altitude flight) and 0.76 (for the 10 m flights) were achieved. Therefore, we conclude that mapping at an altitude of 10 m performs better than 7.5 m, in the context of low-altitude UAV flights for the estimation of ripe apple parameters directly from 3D RGB dense point clouds.",
keywords = "apple trees, fruit detection, structure from motion (SfM), unmanned aerial vehicle (UAV), yield estimation, Environmental Governance",
author = "Marius Hobart and Michael Pflanz and Nikos Tsoulias and Cornelia Weltzien and Mia Kopetzky and Michael Schirrmann",
note = "Publisher Copyright: {\textcopyright} 2025 by the authors.",
year = "2025",
month = jan,
day = "16",
doi = "10.3390/drones9010060",
language = "English",
volume = "9",
journal = "Drones",
issn = "2504-446X",
publisher = "MDPI AG",
number = "1",
}
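
The abstract above describes deriving the yield mass per block by combining the photogrammetrically estimated apple volume with a mean fruit density per variety. The Python snippet below is a minimal sketch of only that volume-to-mass step; the function name and the density values are hypothetical placeholders, not figures from the paper.

# Hypothetical sketch of the volume-to-mass step described in the abstract.
# Density values are illustrative assumptions, not values from the paper.

MEAN_DENSITY_KG_PER_M3 = {
    "Gala": 800.0,        # assumed placeholder density
    "Jonaprince": 790.0,  # assumed placeholder density
}

def block_yield_mass(apple_volumes_m3, variety):
    """Estimate the yield mass (kg) of one block of trees.

    apple_volumes_m3 : list of per-apple volumes derived from the point cloud
    variety          : cultivar name used to look up a mean fruit density
    """
    density = MEAN_DENSITY_KG_PER_M3[variety]
    return sum(apple_volumes_m3) * density

# Example: 120 detected apples of ~0.00025 m^3 each (roughly 7.8 cm diameter)
print(block_yield_mass([0.00025] * 120, "Gala"))  # ~24 kg for the block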

RIS

TY - JOUR

T1 - Fruit Detection and Yield Mass Estimation from a UAV Based RGB Dense Cloud for an Apple Orchard

AU - Hobart, Marius

AU - Pflanz, Michael

AU - Tsoulias, Nikos

AU - Weltzien, Cornelia

AU - Kopetzky, Mia

AU - Schirrmann, Michael

N1 - Publisher Copyright: © 2025 by the authors.

PY - 2025/1/16

Y1 - 2025/1/16

N2 - Precise photogrammetric mapping of preharvest conditions in an apple orchard can help determine the exact position and volume of single apple fruits. This can help estimate upcoming yields and prevent losses through spatially precise cultivation measures. These parameters also are the basis for effective storage management decisions, post-harvest. These spatial orchard characteristics can be determined by low-cost drone technology with a consumer grade red-green-blue (RGB) sensor. Flights were conducted in a specified setting to enhance the signal-to-noise ratio of the orchard imagery. Two different altitudes of 7.5 m and 10 m were tested to estimate the optimum performance. A multi-seasonal field campaign was conducted on an apple orchard in Brandenburg, Germany. The test site consisted of an area of 0.5 ha with 1334 trees, including the varieties ‘Gala’ and ‘Jonaprince’. Four rows of trees were tested each season, consisting of 14 blocks with eight trees each. Ripe apples were detected by their color and structure from a photogrammetrically created three-dimensional point cloud with an automatic algorithm. The detection included the position, number, volume and mass of apples for all blocks over the orchard. Results show that the identification of ripe apple fruit is possible in RGB point clouds. Model coefficients of determination ranged from 0.41 for data captured at an altitude of 7.5 m for 2018 to 0.40 and 0.53 for data from a 10 m altitude, for 2018 and 2020, respectively. Model performance was weaker for the last captured tree rows because data coverage was lower. The model underestimated the number of apples per block, which is reasonable, as leaves cover some of the fruits. However, a good relationship to the yield mass per block was found when the estimated apple volume per block was combined with a mean apple density per variety. Overall, coefficients of determination of 0.56 (for the 7.5 m altitude flight) and 0.76 (for the 10 m flights) were achieved. Therefore, we conclude that mapping at an altitude of 10 m performs better than 7.5 m, in the context of low-altitude UAV flights for the estimation of ripe apple parameters directly from 3D RGB dense point clouds.

AB - Precise photogrammetric mapping of preharvest conditions in an apple orchard can help determine the exact position and volume of single apple fruits. This can help estimate upcoming yields and prevent losses through spatially precise cultivation measures. These parameters also are the basis for effective storage management decisions, post-harvest. These spatial orchard characteristics can be determined by low-cost drone technology with a consumer grade red-green-blue (RGB) sensor. Flights were conducted in a specified setting to enhance the signal-to-noise ratio of the orchard imagery. Two different altitudes of 7.5 m and 10 m were tested to estimate the optimum performance. A multi-seasonal field campaign was conducted on an apple orchard in Brandenburg, Germany. The test site consisted of an area of 0.5 ha with 1334 trees, including the varieties ‘Gala’ and ‘Jonaprince’. Four rows of trees were tested each season, consisting of 14 blocks with eight trees each. Ripe apples were detected by their color and structure from a photogrammetrically created three-dimensional point cloud with an automatic algorithm. The detection included the position, number, volume and mass of apples for all blocks over the orchard. Results show that the identification of ripe apple fruit is possible in RGB point clouds. Model coefficients of determination ranged from 0.41 for data captured at an altitude of 7.5 m for 2018 to 0.40 and 0.53 for data from a 10 m altitude, for 2018 and 2020, respectively. Model performance was weaker for the last captured tree rows because data coverage was lower. The model underestimated the number of apples per block, which is reasonable, as leaves cover some of the fruits. However, a good relationship to the yield mass per block was found when the estimated apple volume per block was combined with a mean apple density per variety. Overall, coefficients of determination of 0.56 (for the 7.5 m altitude flight) and 0.76 (for the 10 m flights) were achieved. Therefore, we conclude that mapping at an altitude of 10 m performs better than 7.5 m, in the context of low-altitude UAV flights for the estimation of ripe apple parameters directly from 3D RGB dense point clouds.

KW - apple trees

KW - fruit detection

KW - structure from motion (SfM)

KW - unmanned aerial vehicle (UAV)

KW - yield estimation

KW - Environmental Governance

UR - http://www.scopus.com/inward/record.url?scp=85215707213&partnerID=8YFLogxK

U2 - 10.3390/drones9010060

DO - 10.3390/drones9010060

M3 - Journal articles

AN - SCOPUS:85215707213

VL - 9

JO - Drones

JF - Drones

SN - 2504-446X

IS - 1

M1 - 60

ER -
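
The abstract also describes detecting ripe apples by their color and structure in an RGB dense point cloud. As a minimal sketch (not the authors' algorithm), the snippet below keeps only points whose color is dominated by the red channel; the threshold and the synthetic example points are illustrative assumptions.

import numpy as np

def filter_red_points(xyz, rgb, red_margin=40):
    """Return the XYZ coordinates of points dominated by the red channel.

    xyz : (N, 3) float array of point positions in metres
    rgb : (N, 3) uint8 array of per-point colors
    """
    r = rgb[:, 0].astype(np.int16)
    g = rgb[:, 1].astype(np.int16)
    b = rgb[:, 2].astype(np.int16)
    mask = (r - g > red_margin) & (r - b > red_margin)
    return xyz[mask]

# Example with a tiny synthetic cloud: two reddish points, one green (foliage) point
xyz = np.array([[0.0, 0.0, 2.1], [0.1, 0.0, 2.2], [0.2, 0.1, 2.0]])
rgb = np.array([[200, 60, 50], [190, 70, 60], [60, 180, 70]], dtype=np.uint8)
print(filter_red_points(xyz, rgb))  # keeps the two reddish points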

DOI

https://doi.org/10.3390/drones9010060