TY - GEN

T1 - A Compositional Cost Model for the λ-calculus

AU - Laird, James

N1 - Publisher Copyright:
© 2021 IEEE.

PY - 2021/6/29

Y1 - 2021/6/29

N2 - We describe a (time) cost model for the (call-by-value) λ-calculus based on a natural presentation of its game semantics: the cost of computing a finite approximant to the denotation of a term (its evaluation tree) is the size of its smallest derivation in the semantics. This measure has an optimality property enabling compositional reasoning about cost bounds: for any term A, context C[_] and approximants a and c to the trees of A and C[A], the cost of computing c from C[A] is no more than the cost of computing a from A and c from C[a]. Although the natural semantics on which it is based is nondeterministic, our cost model is reasonable: we describe a deterministic algorithm for recognizing evaluation tree approximants which satisfies it (up to a constant factor overhead) on a Random Access Machine. This requires an implementation of the λv-calculus on the RAM which is completely lazy: compositionality of costs entails that work done to evaluate any part of a term cannot be duplicated. This is achieved by a novel implementation of graph reduction for nameless explicit substitutions, to which we compile the λv-calculus via a series of linear cost reductions.

AB - We describe a (time) cost model for the (call-by-value) λ-calculus based on a natural presentation of its game semantics: the cost of computing a finite approximant to the denotation of a term (its evaluation tree) is the size of its smallest derivation in the semantics. This measure has an optimality property enabling compositional reasoning about cost bounds: for any term A, context C[_] and approximants a and c to the trees of A and C[A], the cost of computing c from C[A] is no more than the cost of computing a from A and c from C[a]. Although the natural semantics on which it is based is nondeterministic, our cost model is reasonable: we describe a deterministic algorithm for recognizing evaluation tree approximants which satisfies it (up to a constant factor overhead) on a Random Access Machine. This requires an implementation of the λv-calculus on the RAM which is completely lazy: compositionality of costs entails that work done to evaluate any part of a term cannot be duplicated. This is achieved by a novel implementation of graph reduction for nameless explicit substitutions, to which we compile the λv-calculus via a series of linear cost reductions.

UR - http://www.scopus.com/inward/record.url?scp=85113898140&partnerID=8YFLogxK

U2 - 10.1109/LICS52264.2021.9470567

DO - 10.1109/LICS52264.2021.9470567

M3 - Chapter in a published conference proceeding

SN - 9781665448956

VL - 2021-June

T3 - Proceedings - Symposium on Logic in Computer Science

SP - 1

EP - 13

BT - 2021 36th Annual ACM/IEEE Symposium on Logic in Computer Science, LICS 2021

PB - IEEE

T2 - 36th Annual ACM/IEEE Symposium on Logic in Computer Science, LICS 2021

Y2 - 29 June 2021 through 2 July 2021

ER -