I never thought n+k patterns were anything special. For example, aren't the following two definitions equivalent?

    fac1 0     = 1
    fac1 (n+1) = (n+1) * fac1 n

    fac2 0 = 1
    fac2 n = n * fac2 (n-1)

In fact, they aren't. GHCi infers fac1's type as (Integral t) => t -> t, and fac2's as (Num t) => t -> t. Why's that?

The answer is in the Haskell report. The informal semantics says: "Matching an n+k pattern (where n is a variable and k is a positive integer literal) against a value v succeeds if v >= k, resulting in the binding of n to v - k, and fails otherwise." And the formal semantics says:

    case v of { x+k -> e; _ -> e' }
      = if v >= k then (\x -> e) (v-k) else e'

That's why fac3 terminates while fac4 diverges:

    fac3 (n+1) = (n+1) * fac3 n
    fac3 0     = 1

    fac4 n = n * fac4 (n-1)
    fac4 0 = 1

Matching (n+1) against 0 fails, because 0 >= 1 does not hold, so fac3 falls through to its second equation; fac4's variable pattern matches 0 as well, so its second equation is never reached and the recursion never stops (the desugared version sketched at the end of this post spells this out). And that's also why GHCi warns that fac1's and fac3's pattern matches are non-exhaustive: an n+k pattern does not cover values smaller than k.

As the Haskell report points out, many people feel that n+k patterns should not be used. These patterns may be removed or changed in future versions of Haskell.
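A quick way to check the claim about the inferred types is to load the definitions into GHCi and ask with :t. A minimal session sketch, assuming the definitions above have been loaded from a file (the prompt and the exact way constraints are printed vary between GHC versions; the constraints themselves are the ones quoted above):

    *Main> :t fac1
    fac1 :: (Integral t) => t -> t
    *Main> :t fac2
    fac2 :: (Num t) => t -> t

Note that fac2 comes out as plain (Num t) => t -> t here because, in Haskell 98, Eq is a superclass of Num; with a more recent GHC the literal match on 0 shows up as an extra Eq constraint.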
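To spell out why fac3 0 terminates, here is a hand-desugared version of fac3 following the report's translation above. This is only a sketch: the name fac3Desugared, the concrete Integer type, and the error message are assumptions made for illustration, not part of the original definitions.

    -- What the two clauses of fac3 roughly turn into under the report's
    -- translation of n+k patterns: the (n+1) clause becomes a >= test plus
    -- a subtraction, and failure of that test falls through to the 0 clause.
    fac3Desugared :: Integer -> Integer
    fac3Desugared v
      | v >= 1    = let n = v - 1 in (n + 1) * fac3Desugared n  -- fac3 (n+1) = (n+1) * fac3 n
      | v == 0    = 1                                           -- fac3 0 = 1
      | otherwise = error "fac3Desugared: no matching clause"   -- inputs below 0 match nothing

Applied to 0, the guard v >= 1 fails and the second alternative returns 1. By contrast, a plain variable pattern like the one in fac4's first clause matches 0 too, so the recursion never bottoms out.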