2009/9/11 Bradley Lucier <lucier@math.purdue.edu>:
> Or home-grown extensible u8vectors? Or strings (after configuring with --enable-char-size=1)? How many distinct "characters" are being distinguished?
Warning: *beware of the following question, for I should already be sleeping.* Suppose I have an alphabet of 2 letters (to keep things simple); I can code each letter on two bits. Setting aside the possible algorithmic and computational pain, since we have bignums, how about encoding a string over this alphabet as a single number, using bitwise operations? Appending a char to a string is then a SHIFT by two bits followed by an OR. Referencing should be a matter of a LOG (to get the size), then a shift and an AND 3.
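
A rough, untested sketch of what I mean, in Scheme (the bstr-* names are made up; I'm assuming the usual bitwise procedures arithmetic-shift, bitwise-ior, bitwise-and and integer-length, which I believe Gambit provides):

  (define bstr-empty 0)                    ; the empty string is the number 0

  ;; append: shift the whole string two bits left, then OR in the new code
  (define (bstr-append s letter-code)
    (bitwise-ior (arithmetic-shift s 2) letter-code))

  ;; size: an integer "log" of the number, rounded up to whole 2-bit pairs
  ;; (only works if the first letter's code is not 00 -- see below)
  (define (bstr-length s)
    (quotient (+ (integer-length s) 1) 2))

  ;; reference: shift the wanted pair down to the bottom, then AND 3
  (define (bstr-ref s i)
    (let ((n (bstr-length s)))
      (bitwise-and (arithmetic-shift s (* -2 (- n i 1))) 3)))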
I'm still awake enough to see that on 2 bits I can encode just 2 letters and not three (concatenating "01" onto "00" would not work as expected, since the leading zeros simply disappear). I guess that how well any of this works relies on the representation of bignums in memory.
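
For instance (still untested), giving the letter a the code 01 and b the code 10:

  (define a #b01)
  (define b #b10)
  (define s (bstr-append (bstr-append (bstr-append bstr-empty a) b) a))  ; "aba"
  (bstr-length s)  ; => 3
  (bstr-ref s 1)   ; => 2, i.e. b

whereas if a were coded 00, (bstr-append bstr-empty a) would still be 0, so a leading run of a's would vanish and the length could no longer be read back from the number.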
What would such a misuse of bignums be worth?
P!