I am reading SICP and working through this recitation. I am using MIT/GNU Scheme version 12.1, the same implementation that MIT course 6.5151 (6.905) uses.
I am stuck on Problem 3. I tried to use the limit definition of e, i.e. (1 + 1/x)^x for a large x, so I wrote the following code:
(define (e x)
  (newline)
  (display (/ 1.0 x))           ; debug: show 1/x
  (newline)
  (display (+ 1.0 (/ 1.0 x)))   ; debug: show 1 + 1/x
  (newline)
  (display x)                   ; debug: show x
  (expt (+ 1.0 (/ 1.0 x)) x))   ; return (1 + 1/x)^x
(e (expt 10 100))
It returns 1. instead of something close to e.
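For comparison (my own check, not part of the recitation), the same procedure looks fine with a moderately large argument:

(e (expt 10 6))   ; displayed 1/x is clearly nonzero; value is roughly 2.71828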
I added the display calls to check where the resolution stops behaving as expected. It turns out that (display (+ 1.0 (/ 1.0 x))) outputs 1., which is unexpected.
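A quick REPL experiment of mine (assuming MIT/GNU Scheme's flonums are IEEE doubles, so increments smaller than roughly 1e-16 near 1.0 are rounded away) reproduces that behavior in isolation:

; my own REPL checks, not part of the recitation
(+ 1.0 1e-100)              ; => 1.  (1e-100 is far below double-precision resolution)
(+ 1.0 1e-16)               ; => 1.  (still too small to change the nearest double)
(+ 1.0 1e-15)               ; => 1.000000000000001
(+ 1 (/ 1 (expt 10 100)))   ; exact rational 1 + 10^-100, no rounding at all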
The instructor's solution uses a Taylor series, which is a different approach from the one above.
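I have not reproduced the instructor's code here; the following is only my own minimal sketch of what a Taylor-series approach could look like, summing e = 1 + 1/1! + 1/2! + ... with exact rationals so that no precision is lost until the final conversion:

; my own sketch, not the instructor's solution
(define (e-taylor n-terms)
  (let loop ((k 1) (term 1) (sum 1))   ; term = 1/(k-1)!, sum = partial series
    (if (> k n-terms)
        (exact->inexact sum)
        (let ((next (/ term k)))       ; next = 1/k!
          (loop (+ k 1) next (+ sum next))))))

(e-taylor 20)   ; => 2.718281828459045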