Unicode does not accept an arbitrary integer as a character number.

There is a limit on the valid interval: it is 0 .. ~0x7dffffff or so (the exact number is quoted somewhere in the Gambit docs). Perhaps you will find it in the definition of |integer->char|.
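
As a rough illustration, here is a minimal sketch (at a Gambit REPL) of how an out-of-range code triggers this kind of error; the constants are only illustrative, and the exact upper bound is whatever integer->char's definition/docs say:

    ;; Try to build a character from an integer, catching the
    ;; range exception instead of letting it reach the REPL.
    (define (try-char n)
      (with-exception-catcher
        (lambda (e) (list 'out-of-range n))   ; handler for the range error
        (lambda () (integer->char n))))

    (try-char 65)          ; => #\A -- well within range
    (try-char #x7FFFFFFF)  ; => (out-of-range 2147483647) on builds where
                           ;    this exceeds the supported character range

The same range check applies to procedures that build strings from characters, which is presumably where your "Character out of range" came from.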

Best regards

2012/6/14 François Magnan <francois.magnan@licef.ca>
Hi,

I got the following error in Gambit-C (4.6.4, compiled for iOS):

*** ERROR IN (string)@1.7862 -- Character out of range

What does it mean?

Thank you,
Francois


