Addressing Data Uncertainty

March 15, 2021

The following piece is an excerpt from the report, Embodied Carbon: A Clearer View of Emissions.

As with many engineering tasks, quantifying embodied carbon involves working with uncertain data. With this comes a responsibility not currently addressed in common WBLCA tools such as Tally and Athena Impact Estimator for Buildings: to quantify the uncertainty in our predictions. Quantifying uncertainty allows LCA practitioners to highlight what we can control while still accounting for unknowns when we report our data. For many of us, however, this process is nothing new.

As structural engineers, we already quantify uncertainty in our codes and designs through methods such as probabilistic design. The same way of thinking can and should be applied to reducing embodied carbon, so that we can make well-informed sustainable design decisions just as we do with our structures. This isn’t to say we must rethink the whole process before conducting more WBLCAs; we can still operate effectively within the current imprecise framework.

Before we can manage and reduce the environmental impacts of building materials, we must be able to measure and analyze those impacts properly. In the case of embodied carbon, uncertainty in these measurements stems from a variety of sources: material volume assumptions, the use of industry-average Environmental Product Declarations (EPDs), and differing methodologies for developing impact factors, to name a few. However, by taking a simplified approach and focusing on the largest sources of impact, we can be confident in the direction, if not the precise magnitude, of the reductions our design decisions achieve.

Consider the use of cement in concrete. Although broad assumptions underlie the concrete mix data and impact values, we know cement is one of the largest sources of impact in a concrete building. If we focus on simple carbon reduction strategies, such as using structural systems with less material or specifying concrete mixes with lower cement content, we can be more confident in our impact reductions. Structural engineers already employ probabilistic analysis and design to ensure strength and serviceability requirements are met while maintaining design efficiency. Although the severity of a collapsed building may be more intuitive than the adverse effects of climate change, this presents a prime opportunity to improve sustainable design practices. So, how might we apply this same line of thinking to environmental impact data and reductions?
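One way this line of thinking could look in practice is a simple Monte Carlo propagation: treat the impact factors as distributions rather than point values, and compare a baseline mix against a lower-cement mix. The sketch below is purely illustrative, with hypothetical quantities and spreads that are not drawn from any EPD or from the report’s data:

```python
import random

# Hypothetical illustration: Monte Carlo propagation of uncertainty in a
# concrete structure's embodied carbon. Every number below is an assumption
# for demonstration, not sourced from any EPD or WBLCA tool.
random.seed(42)

CONCRETE_VOLUME_M3 = 1000.0  # assumed total concrete volume

def sample_total_gwp(cement_kg_per_m3):
    """Draw one estimate of total GWP (kg CO2e) for the concrete."""
    # Cement impact factor: assumed ~0.9 kg CO2e/kg with ~15% spread
    cement_factor = random.gauss(0.9, 0.9 * 0.15)
    # Aggregates, water, transport, etc., lumped into one uncertain term
    other_gwp_per_m3 = random.gauss(40.0, 10.0)
    return CONCRETE_VOLUME_M3 * (cement_kg_per_m3 * cement_factor
                                 + other_gwp_per_m3)

def summarize(cement_kg_per_m3, n=10_000):
    """Return (median, 5th percentile, 95th percentile) of total GWP."""
    samples = sorted(sample_total_gwp(cement_kg_per_m3) for _ in range(n))
    return samples[n // 2], samples[int(0.05 * n)], samples[int(0.95 * n)]

baseline = summarize(400.0)  # baseline mix: assumed 400 kg cement/m3
reduced = summarize(300.0)   # mix with 25% less cement

print(f"baseline: median {baseline[0]:,.0f} kg CO2e "
      f"(90% range {baseline[1]:,.0f} to {baseline[2]:,.0f})")
print(f"reduced:  median {reduced[0]:,.0f} kg CO2e "
      f"(90% range {reduced[1]:,.0f} to {reduced[2]:,.0f})")
```

Even with wide uncertainty bands on every input, the two ranges barely overlap, which is the point: the reduction is directionally robust even though no single total is exact.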

We must demand impact data that is statistically transparent and software that is straightforward about its uncertainty and assumptions. It is paramount that we treat impact analysis with the same rigor we apply to structural design. While we have solutions that fit within the current framework, improvements to the methodology are essential to achieving goals such as SE 2050—a challenge proposed by the Carbon Leadership Forum to eliminate embodied carbon in all projects by 2050.

We are tackling a diabolical problem in a compressed time frame. It is our responsibility as building design professionals to improve our practice by identifying its shortcomings and developing progressive, forward-thinking solutions. Sustainability is not just about checking a box for a certification. It’s about being honest in our efforts and responding quickly to improve our methods as we continue to learn.


Case Study in Cement Reduction

This WBLCA of a concrete structure was conducted in Tally and includes the enclosure, superstructure, and foundation. By holding Tally’s material and transportation impact data constant and manipulating only the concrete mix design (particularly its cement content), we focus on a change we know can achieve reductions. Fine-tuning other materials would be far less effective given how much cement contributes relative to everything else. Because current impact data sets and software do not expose their underlying uncertainty, we are limited in how rigorous an analysis we can conduct; concentrating on the highest emitters gives us confidence in our results despite that limitation. Here, we prioritize actual impact reduction, which matters far more than pinning down an exact value for total output.