Python Development, memoized, "lazy" functions

Antti Karttunen antti.karttunen at gmail.com
Sun Jan 7 06:40:51 CET 2007


Antti Karttunen wrote:

> Jon Awbrey wrote:
>
>>
>> is this the same as "course of values" recursion?
>>
> In a sense yes, in that we keep all the values of f from f(0) to f(n),
> although storing them a bit more efficiently than in the exponents of
> a prime factorization or some other Gödelization scheme... ;-)
> Fortunately, programming languages also have data structures other
> than integers.
>
A much more interesting point, in my view, is that with memoized
functions we can effectively implement a kind of "execution on demand"
or "lazy evaluation", even in strictly strict (i.e. non-lazy) languages.
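To make the idea concrete, here is a minimal sketch in Python of what
such memoization looks like (the decorator name `memoized` is my own,
illustrative choice, not any particular library's API):

```python
def memoized(f):
    """Cache f's results so each argument is computed only once."""
    cache = {}
    def wrapper(n):
        if n not in cache:
            cache[n] = f(n)
        return cache[n]
    return wrapper

@memoized
def fib(n):
    # Each fib(k) is computed exactly once, on first demand;
    # later calls just read the cache.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```

The cache fills itself only as far as the arguments actually demanded,
which is exactly the "execution on demand" behaviour described above.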

I just implemented Hofstadter's A005228 = 
1,3,7,12,18,26,35,45,56,69,83,98,114,131,150,170,191,213,236,260,...
and its complement and first differences, A030124 = 
2,4,5,6,8,9,10,11,13,14,15,16,17,19,20,...
with the following three lines of code, using the Scheme-macros I wrote:

;; Defined here as the partial sums of A030124+
;; Note that to kick start this recursion, we need to specify the three initial values explicitly:
(definec (A005228 n)
  (cond ((= n 1) 1)
        ((= n 2) 3)
        ((= n 3) 7)
        (else (+ (A005228 (- n 1)) (A030124+ (- n 1))))))

(define A030124+ (COMPLEMENT 1 A005228)) ;; One-based.
(define A030124 (compose-funs A030124+ 1+)) ;; Zero-based in OEIS.
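For readers without the Scheme macros at hand, here is a rough Python
rendering of the same interlinked definitions. The `definec` memoization
is imitated with `functools.lru_cache`, and the `COMPLEMENT` macro is
written out by hand as a search for the n-th positive integer that is
not a term of A005228 (the helper name `A030124_one_based` is my own):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def A005228(n):
    # Partial sums of the one-based A030124; three initial
    # values are needed to kick-start the mutual recursion.
    if n == 1: return 1
    if n == 2: return 3
    if n == 3: return 7
    return A005228(n - 1) + A030124_one_based(n - 1)

@lru_cache(maxsize=None)
def A030124_one_based(n):
    # n-th positive integer NOT of the form A005228(k):
    # a hand-rolled version of the COMPLEMENT macro.
    count, k = 0, 0
    while count < n:
        k += 1
        # Membership test: scan A005228 until it reaches or passes k.
        j = 1
        while A005228(j) < k:
            j += 1
        if A005228(j) != k:
            count += 1
    return k

def A030124(n):
    # Zero-based, as in the OEIS.
    return A030124_one_based(n + 1)

print([A005228(n) for n in range(1, 11)])  # [1, 3, 7, 12, 18, 26, 35, 45, 56, 69]
print([A030124(n) for n in range(10)])     # [2, 4, 5, 6, 8, 9, 10, 11, 13, 14]
```

The two functions call each other, yet the recursion is well-founded:
A005228 grows much faster than its complement, so each membership test
only ever needs earlier, already-cached terms.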

In Haskell (the only inherently lazy language I know) it should be
possible to create two lists similarly interlinked by their
definitions, which then grow on demand. However, why hassle
with lists, when one can have a clean functional interface
to these sequences? (Of course there are lists hidden inside
these definitions, namely their memoization/cache vectors,
but fortunately we don't need to think about them.)



-- Antti.

More information about the SeqFan mailing list