On Feb 9, 2007, at 7:13 PM, Phil Bewig wrote:
I'm disappointed that my code is four times slower than C. Could someone more knowledgeable than I look at my code and tell me where the time is going?
Well, I'm not that guy. However, the buzz (from the internets, now in the back of my mind) seems to be that Gambit is always ~4x slower when you read strings. gsc can be compiled with a character width of 1 byte (as someone said in this thread; the configure option is --enable-char-size=1), and I've been stewing on a (with-lickity-split-strings ...) macro that replaces all of the string operations in a piece of code with their u8vector equivalents, which, being single-byte, should recover the speed.
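Something like this is what I have in mind. It's only a sketch: the macro name is my own invention, it covers just a handful of operations, it leans on Gambit's non-hygienic define-macro so the let bindings actually capture the names used in the body, and string literals in the body would still be real strings.

(define-macro (with-lickity-split-strings . body)
  ;; lexically rebind a few string operations to u8vector equivalents,
  ;; so code written against strings runs on single-byte vectors
  `(let ((string?       u8vector?)
         (string-length u8vector-length)
         (string-ref    (lambda (s i) (integer->char (u8vector-ref s i))))
         (string-set!   (lambda (s i c) (u8vector-set! s i (char->integer c))))
         (make-string   (lambda (n #!optional (fill #\space))
                          (make-u8vector n (char->integer fill)))))
     ,@body))

;; for example:
(with-lickity-split-strings
  (let ((s (make-string 3 #\a)))   ;; s is really a u8vector
    (string-set! s 0 #\b)
    (string-ref s 0)))             ;; => #\b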
The overhead comes from read-char having to recognize multi-byte characters, and the buzz reports that other systems incur similar overhead to handle UTF-8. There was a comparison to Python (and maybe Perl as well) on a blog somewhere.
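For what it's worth, there may also be a way to dodge the decoding on a stock build, without recompiling: Gambit's port settings take a char-encoding: option, so forcing a single-byte encoding should turn read-char into a straight byte fetch (assuming your build's default is a multi-byte encoding). A sketch, untested and unbenchmarked by me, with a made-up file name:

(define (count-chars filename)
  ;; open with a single-byte encoding so read-char never has to
  ;; assemble multi-byte UTF-8 sequences
  (let ((port (open-input-file (list path: filename
                                     char-encoding: 'ISO-8859-1))))
    (let loop ((c (read-char port)) (n 0))
      (if (eof-object? c)
          (begin (close-input-port port) n)
          (loop (read-char port) (+ n 1))))))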
I'm a newbie, though, and I haven't actually tried any code related to this before writing.