
The biggest problem with carbon credits is not measurement error

Tuesday, 23 Jul 2024

Many believe that advanced technologies or geospatial data, used to support carbon projects’ measurements, will improve the quality of their claims. However, for many project types, measurement is not the key “quality” issue when it comes to carbon accounting.

Why measurement error is not the key GHG integrity issue

There is a common misconception in the voluntary carbon market (VCM) that new technological tools can “save” the carbon market: in particular, tools or high-resolution geospatial data that help projects measure, for example, changes in forest cover or forest biomass more accurately. But this is misleading. The key issues that plague the voluntary carbon market today are not related to what experts call “measurement error.”

In this blog, we provide examples of why measurement is not the key issue for the greenhouse gas integrity of most carbon projects. We also provide some information about the use and accuracy of geospatial data. Subscribers can find more detail on these issues in our subscriber-only blog.

Renewable energy: Additionality is the key issue

The largest volume of issuances comes from large-scale grid-connected renewable energy projects that use the methodology ACM0002. When estimating emission reductions, the key variable for such projects is the assumed grid emission factor (GEF). Our finding is that the GEF, which serves as the baseline, is not the primary GHG integrity issue for ACM0002 projects: baseline issues result in only a minor risk of over-crediting. Rather, it is well known that ACM0002 projects suffer from risks of non-additionality. While we assess over-crediting due to the GEF, at Calyx Global we focus our efforts on whether ACM0002 projects are additional.
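To make this concrete, here is a minimal Python sketch with invented numbers showing why the GEF is a second-order issue next to additionality: a plausible spread in the assumed GEF shifts the credit volume only modestly, whereas a non-additional project produces no real reductions at all.

```python
# Hypothetical illustration of ACM0002-style crediting. All numbers are
# invented; they are not drawn from any real project or grid.

generation_mwh = 100_000  # electricity delivered to the grid in a year

# A plausible spread of combined-margin grid emission factors (tCO2e/MWh)
for gef in (0.65, 0.70, 0.75):
    credits = generation_mwh * gef
    print(f"GEF = {gef:.2f} tCO2e/MWh -> {credits:,.0f} credits")

# A ~15% spread in the GEF moves the credit volume by ~15%. But if the
# plant would have been built anyway (i.e., it is non-additional), the
# true emission reduction is zero and every credit is over-credited.
true_reductions_if_non_additional = 0
print(f"If non-additional: {true_reductions_if_non_additional} real reductions")
```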

REDD: It’s all about the baseline

REDD is the next largest source of credit issuances in the voluntary carbon market. According to our analysis, overestimation of baseline emissions is the highest risk for REDD projects. In many instances, REDD projects using older methodologies (i.e., those that VCS is sunsetting) have a baseline that is overestimated by more than 100%. The significant issue for the baseline is not the estimation of carbon stocks per hectare (which better measurements might improve), nor the historical amount of deforestation (which is used to inform the baseline), but the assumptions about what would happen in the absence of the project (i.e., the baseline scenario). In such cases, the accuracy of emissions during the crediting period (called “project emissions”), which rarely carry more than 20% uncertainty, becomes a relatively insignificant issue. This is illustrated in the figure below.
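To put rough numbers on this dynamic, here is a hypothetical worked example in Python (all figures invented, not drawn from any real project) comparing the over-crediting caused by an inflated baseline against the swing caused by ±20% measurement uncertainty in project emissions:

```python
# Hypothetical REDD arithmetic: the baseline assumption dwarfs measurement
# error. None of these figures come from a real project.

realistic_baseline = 100_000  # tCO2e/yr actually at risk without the project
claimed_baseline   = 200_000  # tCO2e/yr claimed, i.e. overestimated by 100%
project_emissions  =  50_000  # tCO2e/yr measured during the crediting period
meas_uncertainty   = 0.20     # +/-20% uncertainty on project emissions

claimed_credits = claimed_baseline - project_emissions    # 150,000
honest_credits  = realistic_baseline - project_emissions  #  50,000

# Measurement error moves credits by +/-10,000 tCO2e at most...
meas_swing = project_emissions * meas_uncertainty
# ...while the baseline assumption alone adds 100,000 tCO2e of over-crediting.
baseline_swing = claimed_baseline - realistic_baseline

print(f"Claimed credits:      {claimed_credits:,} tCO2e")
print(f"Honest credits:       {honest_credits:,} tCO2e")
print(f"Measurement swing: +/-{meas_swing:,.0f} tCO2e")
print(f"Baseline swing:       {baseline_swing:,} tCO2e")
```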

Higher resolution geospatial data can help to reduce measurement error and, in some cases, can be helpful in assessing a REDD project. However, in the vast majority of REDD projects to date, the high projected baseline emissions associated with a substantially overestimated deforestation rate are the key issue, and that estimate would not be improved with higher resolution data. Model assumptions, not measurement error, are the most significant factor driving GHG integrity.

Calyx Global spends most of its geospatial effort on determining whether the project-defined baseline is realistic. In our REDD Ratings Framework, we describe in detail how we look at multiple scenarios to set the most realistic trajectory for baseline emissions. Of particular importance is a proprietary model we developed in 2021 to generate a Calyx-derived REDD baseline, one that is very similar to the new VCS baseline approach (found in the VM0048 methodology).
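Our model itself is proprietary, but the general idea of checking a project-defined baseline against independent scenarios can be sketched generically. The scenario names and values below are entirely hypothetical:

```python
# Generic sketch (NOT Calyx's proprietary model): compare a project's
# proposed baseline against independent estimates and prefer a
# conservative choice. All names and numbers are hypothetical.

baseline_scenarios = {
    "project_proposed":   200_000,  # tCO2e/yr claimed by the project
    "regional_trend":     110_000,  # extrapolated from regional deforestation
    "matched_controls":    95_000,  # from comparable unprotected areas
    "jurisdictional_avg": 105_000,  # allocated from a jurisdiction-level rate
}

# A conservative reviewer sets the project's own figure aside and takes,
# for example, the median of the independent estimates.
independent = sorted(
    v for k, v in baseline_scenarios.items() if k != "project_proposed"
)
median_baseline = independent[len(independent) // 2]
print(f"Conservative (median) baseline: {median_baseline:,} tCO2e/yr")
```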

Focus on what matters: Uncertainty

For many projects, the GHG integrity risk is not whether a project has done a good job of measuring emissions, but rather whether the project is additional, or if it has made unfounded assumptions to set a baseline.

Furthermore, with regard to geospatial data, current technologies still produce multiple, differing maps. Even higher resolution data can still carry significant uncertainty (error). In other words, there is no single source of “truth”. In Mexico, for instance, government-run, field-based inventories of forest carbon stocks have an uncertainty of 17%, while data derived from LiDAR, SAR and Landsat have uncertainties ranging from 20% to 40%. We recognize that this could change over time, and at Calyx Global we will ensure that we use the right tool for the job. Given this situation, we believe comparing many independent, reputable estimates is the best approach.
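As an illustration of that comparison, here is a small sketch with invented carbon-stock values (only the uncertainty percentages echo the ranges cited above); it lays several independent estimates side by side instead of trusting any single map:

```python
# Sketch: compare independent carbon-stock estimates, each with its own
# stated uncertainty. Stock values are invented; the uncertainty levels
# mirror the 17% (field) and 20-40% (remote sensing) ranges cited above.

estimates = [
    ("field_inventory", 120.0, 0.17),  # tC/ha, +/-17%
    ("lidar_model",     135.0, 0.25),  # tC/ha, +/-25%
    ("sar_model",       105.0, 0.35),  # tC/ha, +/-35%
    ("landsat_model",   140.0, 0.40),  # tC/ha, +/-40%
]

for name, value, unc in estimates:
    low, high = value * (1 - unc), value * (1 + unc)
    print(f"{name:15s}: {value:6.1f} tC/ha  (range {low:.1f}-{high:.1f})")

# If the ranges overlap, a value near the low end of the overlap is a
# defensible conservative choice; if they do not overlap, the divergence
# itself signals that no single map can be treated as "truth".
```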

Finally, it’s worth noting that measurement uncertainty does not actually matter if a project applies the conservativeness principle. This is what the ICVCM has also suggested: projects do not need to achieve a certain level of accuracy; only the claimed emission reductions need to be conservative. The easiest way to do this is to: (1) ensure a robust estimation of uncertainty; and (2) reduce the emission reduction (or removal) claim so that it falls at or below the lower bound of the error bars.
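In code form, that two-step recipe amounts to a single discount. A minimal sketch with invented numbers:

```python
# Sketch of the conservativeness principle: claim no more than the lower
# bound of the uncertainty range. Numbers are invented.

point_estimate = 150_000  # estimated emission reductions, tCO2e
uncertainty    = 0.20     # +/-20%, i.e. half-width of the error bars

# Step (1): robust uncertainty estimate -> Step (2): claim the lower bound
conservative_claim = point_estimate * (1 - uncertainty)
print(f"Point estimate:     {point_estimate:,} tCO2e")
print(f"Conservative claim: {conservative_claim:,.0f} tCO2e")
```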

Our analysis is consistent with this approach. We do not require projects to achieve a specific accuracy to score well. But we do assess whether the overall emission reduction (or removal) claim is overstated. 

For our subscribers: Read a fuller discussion of these issues, including additional examples, in our subscriber-only blog post.

If you are interested in subscribing to Calyx Global, contact us.

About the author

Calyx Global

This article includes insights and input from multiple experts in Calyx Global.