Background: Participatory forest monitoring has been promoted as a means of engaging local forest-dependent communities in concrete climate mitigation activities: it gives communities a sense of ownership and hence increases the likelihood that forest preservation measures will succeed. Sceptics of this approach, however, argue that local community forest members cannot easily attain the level of technical proficiency that accurate monitoring requires. It is therefore important to establish whether local communities can attain such proficiency. This paper addresses the issue by assessing the robustness of biomass estimation models based on airborne laser scanning (LiDAR) data, using models calibrated with two different field sample designs: field data gathered by professional forester teams and field data collected by local communities trained by professional foresters, at two study sites in Nepal. The aim is to determine whether the two field sample data sets yield similar LiDAR models and whether the data can be combined and used together for estimating biomass.
Results: Even though the sampling designs and principles of the two field campaigns differed, they produced equivalent LiDAR-based regression models at one of the sites (Gorkha). At the other site (Chitwan), however, major discrepancies remained between model-based estimates calibrated with the different field sample data sets. This discrepancy can be attributed to the site's complex terrain and dense forest, which make it difficult to derive an accurate digital terrain model (DTM) from LiDAR data; there, neither data set produced satisfactory results.
Conclusions: Field sample data collected by professional foresters and field sample data collected by local communities trained by professional foresters can be used together without degrading prediction performance, provided that the correlation between the LiDAR predictors and the biomass estimates is sufficiently strong.
Keywords: Above-ground biomass; LiDAR; Participatory forest monitoring; REDD+.