Abstract

One of the most discussed issues in the design community is the performance gap. In this research, we investigate for the first time whether part of the gap might be caused by the modelling literacy of design teams. A total of 108 building modellers were asked to comment on the importance of obtaining and using accurate values for 21 common modelling input variables, from U-values to occupancy schedules, when using dynamic simulation to estimate annual energy demand. The questioning was based on a real building for which high-resolution energy, occupancy and temperature data were recorded. A sensitivity analysis was then conducted using a model of the building (based on the measured data) by perturbing one parameter in each simulation. The effect of each perturbation on the annual energy consumption given by the model was found and a ranked list generated. The order of this list was then compared to that given by the modellers for the same changes in the parameters. A correlation analysis indicated little correlation between which variables were thought to be important by the modellers and which proved to be objectively important. k-means cluster analysis identified subgroups of modellers and showed that 25% of the people tested were making judgements that appeared worse than those of a person responding at random. Follow-up checks showed that higher-level qualifications, or many years of modelling experience, did not improve the accuracy of people's predictions. In addition, there was no correlation between modellers, with many ranking as important parameters that others thought irrelevant. Using a three-part definition of literacy, it is concluded that this sample of modellers, and by implication the population of building modellers, cannot be considered modelling literate. This indicates a new cause of the performance gap.
The results suggest a need, and an opportunity, for both industry and universities to increase their efforts in building physics education; if this is done, a part of the performance gap could be rapidly closed.
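The analysis pipeline the abstract describes — a one-at-a-time (OAT) sensitivity analysis producing an objective importance ranking, then a rank correlation against a modeller's subjective ranking — can be sketched in outline. Everything below is illustrative only: the parameter names and numbers are placeholders, not values from the study, which used 21 input variables and a full dynamic simulation rather than this toy sensitivity table.

```python
def rank(values):
    """Return 1-based ranks; the largest value gets rank 1 (most important)."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(xs, ys):
    """Spearman correlation via the rank-difference formula (assumes no ties)."""
    n = len(xs)
    d2 = sum((x - y) ** 2 for x, y in zip(xs, ys))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# OAT sensitivity: hypothetical |change in modelled annual kWh| when each
# input is perturbed in turn, all other inputs held at their measured values.
sensitivity = {"U-value": 420.0, "infiltration": 310.0,
               "occupancy schedule": 150.0, "lighting gains": 90.0}

# A made-up modeller's subjective importance scores for the same inputs.
subjective = {"U-value": 5, "infiltration": 2,
              "occupancy schedule": 4, "lighting gains": 1}

params = list(sensitivity)
objective_ranks = rank([sensitivity[p] for p in params])
modeller_ranks = rank([subjective[p] for p in params])
rho = spearman(objective_ranks, modeller_ranks)
print(f"Spearman rho, objective vs. subjective ranking: {rho:.2f}")  # → 0.80
```

A rho near 1 would mean the modeller's intuition tracks the model's actual sensitivities; the paper's finding is that, across the sample, such correlations were weak.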
Language: English
Pages: 351-375
Number of pages: 25
Journal: Building Services Engineering Research and Technology
Volume: 38
Issue number: 3
Early online date: 18 Jan 2017
DOI: 10.1177/0143624416684641
Status: Published - 1 May 2017

Cite this

The building performance gap: Are modellers literate? / Imam, Salah; Coley, David; Walker, Ian.

In: Building Services Engineering Research and Technology, Vol. 38, No. 3, 01.05.2017, p. 351-375.

Research output: Contribution to journal › Article

@article{f73b69b018e34e369b1863d1e49fa039,
title = "The building performance gap: Are modellers literate?",
abstract = "One of the most discussed issues in the design community is the performance gap. In this research, we investigate for the first time whether part of the gap might be caused by the modelling literacy of design teams. A total of 108 building modellers were asked to comment on the importance of obtaining and using accurate values for 21 common modelling input variables, from U-values to occupancy schedules when using dynamic simulation to estimate annual energy demand. The questioning was based on a real building for which high-resolution energy, occupancy and temperature data were recorded. A sensitivity analysis was then conducted using a model of the building (based on the measured data) by perturbing one parameter in each simulation. The effect of each perturbation on the annual energy consumption given by the model was found and a ranked list generated. The order of this list was then compared to that given by the modellers for the same changes in the parameters. A correlation analysis indicated little correlation between which variables were thought to be important by the modellers and which proved to be objectively important. k-means cluster analysis identified subgroups of modellers and showed that 25{\%} of the people tested were making judgements that appeared worse than a person responding at random. Follow-up checks showed that higher level qualifications, or having many years of experience in modelling, did not improve the accuracy of people’s predictions. In addition, there was no correlation between modellers, with many ranking some parameters as important that others thought irrelevant. Using a three-part definition of literacy, it is concluded that this sample of modellers, and by implication the population of building modellers, cannot be considered modelling literate. This indicates a new cause of the performance gap. 
The results suggest a need and an opportunity for both industry and universities to increase their efforts with respect to building physics education, and if this is done, a part of the performance gap could be rapidly closed.",
author = "Salah Imam and David Coley and Ian Walker",
year = "2017",
month = "5",
day = "1",
doi = "10.1177/0143624416684641",
language = "English",
volume = "38",
pages = "351--375",
journal = "Building Services Engineering Research and Technology",
issn = "0143-6244",
publisher = "Sage Publications",
number = "3",

}

TY - JOUR

T1 - The building performance gap: Are modellers literate?

T2 - Building Services Engineering Research and Technology

AU - Imam, Salah

AU - Coley, David

AU - Walker, Ian

PY - 2017/5/1

Y1 - 2017/5/1

N2 - One of the most discussed issues in the design community is the performance gap. In this research, we investigate for the first time whether part of the gap might be caused by the modelling literacy of design teams. A total of 108 building modellers were asked to comment on the importance of obtaining and using accurate values for 21 common modelling input variables, from U-values to occupancy schedules when using dynamic simulation to estimate annual energy demand. The questioning was based on a real building for which high-resolution energy, occupancy and temperature data were recorded. A sensitivity analysis was then conducted using a model of the building (based on the measured data) by perturbing one parameter in each simulation. The effect of each perturbation on the annual energy consumption given by the model was found and a ranked list generated. The order of this list was then compared to that given by the modellers for the same changes in the parameters. A correlation analysis indicated little correlation between which variables were thought to be important by the modellers and which proved to be objectively important. k-means cluster analysis identified subgroups of modellers and showed that 25% of the people tested were making judgements that appeared worse than a person responding at random. Follow-up checks showed that higher level qualifications, or having many years of experience in modelling, did not improve the accuracy of people’s predictions. In addition, there was no correlation between modellers, with many ranking some parameters as important that others thought irrelevant. Using a three-part definition of literacy, it is concluded that this sample of modellers, and by implication the population of building modellers, cannot be considered modelling literate. This indicates a new cause of the performance gap. 
The results suggest a need and an opportunity for both industry and universities to increase their efforts with respect to building physics education, and if this is done, a part of the performance gap could be rapidly closed.

UR - https://doi.org/10.1177/0143624416684641

U2 - 10.1177/0143624416684641

DO - 10.1177/0143624416684641

M3 - Article

VL - 38

SP - 351

EP - 375

JO - Building Services Engineering Research and Technology

JF - Building Services Engineering Research and Technology

SN - 0143-6244

IS - 3

ER -