Strictness analysis


In computer science, strictness analysis refers to any algorithm used to prove that a function in a non-strict functional programming language is strict in one or more of its arguments. This information is useful to compilers because strict functions can be compiled more efficiently. Thus, if a function is proven to be strict (using strictness analysis) at compile time, it can be compiled to use a more efficient calling convention without changing the meaning of the enclosing program.
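
As an illustration of the compilation benefit, consider the following minimal Haskell sketch (assuming a GHC-like compiler; the function names are chosen for the example). The local loop is strict in its accumulator, so a compiler that has proved this may pass the accumulator already evaluated (call-by-value) instead of building a chain of suspended additions, without changing the program's meaning:

```haskell
-- Sketch only: 'go' is strict in 'acc', because every call path eventually
-- demands it.  With that fact proved, a compiler may evaluate 'acc' at each
-- call rather than allocating a thunk for (acc + i).
sumTo :: Int -> Int
sumTo n = go 0 1
  where
    go acc i
      | i > n     = acc
      | otherwise = go (acc + i) (i + 1)

main :: IO ()
main = print (sumTo 1000000)   -- 500000500000
```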

A function f is said to diverge if it returns ⊥ ("bottom"): operationally, this means that f either causes abnormal termination of the enclosing program (for example, failure with an error message) or loops forever. The notion of divergence is significant because a strict function is one that always diverges when given an argument that diverges, whereas a lazy (or non-strict) function is one that may or may not diverge when given such an argument. Strictness analysis attempts to determine the "divergence properties" of functions, and thereby identifies some of the functions that are strict.
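
For example, in Haskell the following sketch (function names are illustrative) contrasts a strict function with a non-strict one: plusOne demands its argument and therefore diverges whenever the argument does, while ignoreSecond never demands its second argument and so returns a result even when that argument diverges.

```haskell
-- A strict function: it always inspects x, so if x diverges, so does plusOne.
plusOne :: Int -> Int
plusOne x = x + 1

-- A non-strict (lazy) function: it never inspects y, so it returns x even
-- when y is a diverging computation such as 'undefined'.
ignoreSecond :: Int -> Int -> Int
ignoreSecond x y = x

main :: IO ()
main = do
  print (ignoreSecond 0 undefined)  -- prints 0; y is never demanded
  -- print (plusOne undefined)      -- would diverge (here: raise an exception)
```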

Strictness analysis can be characterized as a forward abstract interpretation that approximates each function in the program by a function mapping divergence properties of the arguments to divergence properties of the results. In the classical approach pioneered by Alan Mycroft, the abstract interpretation uses a two-point domain, with 0 denoting the set {⊥} considered as a subset of the argument or return type, and 1 denoting all values in the type.
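
A minimal sketch of this two-point analysis for a tiny first-order expression language is given below (the data types and helper names are hypothetical, not Mycroft's original notation). An abstract value of 0 is represented as False ("definitely ⊥") and 1 as True ("possibly any value"); a function is reported strict in a parameter when setting that parameter to 0, and all others to 1, forces the abstract result to 0.

```haskell
-- A minimal sketch of Mycroft-style strictness analysis over the two-point
-- domain, for a tiny first-order expression language (all names hypothetical).
-- Abstract value: False ~ 0 (only bottom), True ~ 1 (possibly any value).
import qualified Data.Map as Map

data Expr
  = Var String          -- reference to a function parameter
  | Lit Int             -- constant: never demands any parameter
  | Add Expr Expr       -- primitive that is strict in both operands
  | If Expr Expr Expr   -- strict in the condition, joins the two branches
  deriving Show

type AbsEnv = Map.Map String Bool   -- parameter name -> abstract value

-- Abstract interpretation: may the expression yield a non-bottom result,
-- given the abstract values of the parameters?
absEval :: AbsEnv -> Expr -> Bool
absEval env (Var x)    = Map.findWithDefault True x env
absEval _   (Lit _)    = True
absEval env (Add a b)  = absEval env a && absEval env b            -- needs both
absEval env (If c t e) = absEval env c && (absEval env t || absEval env e)

-- A function with parameters 'ps' and body 'e' is strict in parameter 'p'
-- if mapping p to 0 (and every other parameter to 1) forces the result to 0.
strictIn :: [String] -> Expr -> String -> Bool
strictIn ps e p = not (absEval env e)
  where env = Map.fromList [(q, q /= p) | q <- ps]

main :: IO ()
main = do
  -- f x y = if x /= 0 then y + 1 else 0   -- strict in x, not strict in y
  let body = If (Var "x") (Add (Var "y") (Lit 1)) (Lit 0)
  print (strictIn ["x", "y"] body "x")  -- True
  print (strictIn ["x", "y"] body "y")  -- False
```

A full analysis would also iterate this abstract evaluation to a fixed point to handle recursive function definitions; the sketch above covers only non-recursive bodies.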

