Verra’s consolidated REDD methodology: Wins and lost opportunities, in four parts
Part 1: Emissions monitoring: Under the Radar
Part 2: Baselines: All We’ve Been Waiting For?
Part 3: Baseline revision schedule: Timing matters
Part 4: Permanence and Additionality: Waiting in the Wings
Baselines: All we’ve been waiting for?
Verra’s consolidated REDD methodology (VCRM) marks a major improvement in the way baselines are set for REDD projects. Calyx sees big wins. We also see potential risks, and hope they will be addressed before the final version comes out later this year.
Three big wins…
The biggest win is the approach taken to “allocate a jurisdictional baseline”. A plausible amount of deforestation is estimated for a jurisdiction, and that amount must then be allocated, using a risk map, across the entire landscape within the jurisdiction. In other words, there is one pie and everyone must share it. This cap on aggregated baselines in the jurisdiction imposes a discipline on baseline setting that did not exist in the original methodologies and will surely reduce existing baselines.
Second, for the first time, Verra will hire a third party to determine the amount and location of baseline deforestation, rather than having each project developer do so. This could reduce the likelihood of the overestimated baselines that have plagued REDD projects in the past.
Finally, there is an improvement in the treatment of uncertainty. In another first, uncertainty in activity data (the estimated hectares of deforestation that would have occurred without the project) must be quantified, reported, and kept below 20%, and the baseline activity data must be adjusted downward when that uncertainty exceeds 10%. Furthermore, uncertainty in baseline emissions (activity data multiplied by carbon stocks, or emission factors) must be estimated, and baseline emissions must likewise be adjusted downward if that uncertainty exceeds 10%.
…but also two potential concerns
First, the “independent” third party chosen by Verra to create the baselines may itself be a project developer seeking to work in a specific jurisdiction. Our analysis to date suggests that outcomes tend to be biased toward overestimation (and thus overcrediting) when project developers are allowed to set their own baselines. Creating a water-tight rule set to constrain modeling of where deforestation may occur is extremely challenging, and so is creating guidance for a robust review of the resulting risk map and allocation. In the past, review (in the form of auditing by validation and verification bodies) has not led to conservative baselines. For this reason, Calyx Global suggested in its recent report that, for this particular project type, genuinely independent third parties should create the baseline. We understand there are trade-offs – project developers bring deep knowledge of their project areas – but independence is a tried and true path to integrity.
Second, we have a concern around the level of conservativeness proposed in the new methodology. One of the ICVCM’s core carbon principles is “Robust quantification of emission reductions and removals”, which specifically states that “conservative approaches” should be used. This means, in practice, that projects should account for uncertainties – for example, by taking deductions commensurate with the level of uncertainty to ensure “a ton is a ton”.
There are two main sources of uncertainty in REDD baselines: (1) model uncertainty – the estimation of how much deforestation will occur in the future; and (2) uncertainty in the estimation of emissions from that deforestation. Calyx is pleased to see that the VCRM requires disclosure of the uncertainty associated with modeled deforestation, requires that this uncertainty be less than 20%, and requires a downward adjustment when it exceeds 10%. We note, however, that methodologies under other standards (e.g. the Climate Action Reserve’s Mexico Forest Protocol) require adjustments when uncertainty exceeds 5%. According to the equations in the VCRM, the maximum downward adjustment to the number of hectares of deforestation in the baseline would be 5.2%, for uncertainty just under the 20% cap. Other standards take a percent-for-percent approach to adjustment above a permitted threshold. Under that approach, with the VCRM’s 10% threshold, an uncertainty of 19.9% would trigger a downward adjustment of 9.9% (or 14.9% if a 5% threshold were used).
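The percent-for-percent arithmetic above can be sketched in a few lines. Note this illustrates the alternative rule used by some other standards, not the VCRM’s actual adjustment equation (which, per the figures above, caps out at a 5.2% deduction):

```python
def percent_for_percent_adjustment(uncertainty: float, threshold: float) -> float:
    """Downward adjustment equal to the uncertainty in excess of the threshold.

    A sketch of the percent-for-percent rule described in the text;
    not the VCRM's own adjustment equation.
    """
    return max(0.0, uncertainty - threshold)

# Uncertainty just under the VCRM's 20% cap:
u = 0.199
adj_10 = percent_for_percent_adjustment(u, threshold=0.10)  # 10% threshold
adj_5 = percent_for_percent_adjustment(u, threshold=0.05)   # 5% threshold
print(f"10% threshold: {adj_10:.1%} deduction; 5% threshold: {adj_5:.1%} deduction")
```

Running this reproduces the 9.9% and 14.9% deductions cited above; uncertainty at or below the threshold yields no deduction.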
In addition to dealing with model uncertainty, the VCRM brings new rigor to the uncertainty of key elements that determine forest carbon stocks. With uncertainty propagation prescribed by VCRM, the project is to determine total baseline emissions uncertainty and apply a discount for uncertainty above 10%. This is in addition to the downward adjustment of activity data. Good so far.
However, as noted in our post yesterday, the bigger problem may be that the deductions themselves are too small for a given level of uncertainty. If total uncertainty for baseline emissions were ± 50%, baseline emissions would be reduced by only 13%.
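To put those figures side by side: the 13% deduction is the result reported above for the VCRM’s equations, while the percent-for-percent figure is our hypothetical comparison using the same 10% threshold:

```python
# Total baseline-emissions uncertainty of +/-50%:
uncertainty = 0.50
threshold = 0.10

# Deduction reported in the text for the VCRM's equations at this uncertainty:
vcrm_deduction = 0.13

# A hypothetical percent-for-percent rule would deduct the full excess
# of uncertainty over the threshold:
pfp_deduction = max(0.0, uncertainty - threshold)

print(f"VCRM: {vcrm_deduction:.0%} vs percent-for-percent: {pfp_deduction:.0%}")
```

Under the percent-for-percent rule the deduction would be 40%, roughly three times the 13% the VCRM’s equations produce.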
That said, we recognize the trade-off being made: some may argue that, as long as there is no bias, it is fine to “live with” a certain amount of uncertainty – some baselines would be overestimated and some underestimated. In our experience, however, the market tends toward overestimation because the incentives point that way. A more conservative approach is therefore warranted.