[gambit-list] Input buffering and other IO issues
Marc Feeley
feeley at iro.umontreal.ca
Sun Jan 8 13:30:28 EST 2006
On 8-Jan-06, at 2:45 AM, Christian wrote:
> Hello
>
> I wanted to write a small program which reads exactly n bytes from a
> port (and calculates an md5 hash on them, like "head -c n | md5sum").
>
> It turned out that even with buffering: #f, read-subu8vector still
> reads through a buffer, as strace shows (it reads in 1024-byte
> chunks, not in the lengths I give to read-subu8vector).
>
> The reason why I want it not to read past the given number of bytes
> is:
> - I wanted to checksum a cdrom (by reading from /dev/cdrom), and
> reading past the length of the cdrom image causes read errors on my
> OS (Linux on PPC).
> - I might need it for reading from pipes without blocking (I haven't
> checked whether Gambit would block the thread or not), or worse, for
> reading from pipes used for interprocess locking (even if Gambit
> doesn't block when trying to read too many bytes from a pipe, reading
> ahead would break the idea of using pipes for locking).
>
>
Fixed.
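With this fix, reading exactly n bytes should be possible through an
unbuffered port. A minimal sketch (read-exactly is a hypothetical
helper; it assumes an unbuffered port performs OS-level reads no
larger than what is requested):

  (define (read-exactly path n)
    ;; Open the file unbuffered so the OS-level reads are limited
    ;; to what read-subu8vector actually asks for.
    (let ((port (open-input-file (list path: path buffering: #f)))
          (buf (make-u8vector n)))
      (let ((got (read-subu8vector buf 0 n port)))
        (close-input-port port)
        (if (< got n)
            (error "short read:" got)
            buf))))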
>
>
> The strace output also shows a second issue: output to the console
> is done one byte (or perhaps one character) at a time. This makes
> the output of a Gambit process to Emacs (when running under
> run-scheme) slow (especially so on my laptop, where frequent system
> calls between programs seem to be costly). Either writing each
> object as a whole instead of character by character (the way
> unbuffered output works in e.g. Perl), or buffering on line
> boundaries, would be an improvement.
>
On terminals, I/O is unbuffered by default, so that output appears on
the terminal as soon as it is produced, without having to call
force-output (for example, when printing dots to show progress, or a
prompt before reading input from the user). I know this is a
performance problem, but I think it is the right default ("least
surprise" for novices who are not aware of buffering issues). I was
surprised to discover that there was no runtime option to override
the default buffering setting, so I have added this to the runtime
options:
Usage: program [-:OPTION,OPTION...] ...

where OPTION is one of:
  ...
  f[OPT...]   set file options; see below for OPT
  t[OPT...]   set terminal options; see below for OPT
  -[OPT...]   set standard input and output options; see below for OPT

where OPT is one of:
  a/1/2/4/8   character encoding (ASCII/LATIN1/UCS2/UCS4/UTF8)
  l/c/cl      end-of-line encoding (LF/CR/CR-LF)
  u/n/f       buffering (unbuffered/newline buffered/fully buffered)
  e/E         [for terminals only] enable line-editing (on/off)
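For example (following the syntax above), starting a program with

  program -:-f

makes standard input and output fully buffered, and

  program -:tu

forces unbuffered I/O on terminals.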
>
> Finally, I can't understand why I'm getting this:
>
>> (test 10 "/dev/cdrom")
> *** ERROR IN (console)@7.1 -- Unknown error
> (call-with-input-file '(path: "/dev/cdrom" char-encoding: ascii
> buffering: #f) '#<procedure #3>)
> 1>
>
> head -c 10 /dev/cdrom works just fine, as the same system user. It
> might help if the error message included the OS-level message.
>
> strace output:
> open("/dev/cdrom", O_RDONLY|O_NONBLOCK) = 4
> ioctl(4, TCGETS, 0x7ffff148) = -1 EINVAL (Invalid argument)
> fstat(4, {st_mode=S_IFBLK|0660, st_rdev=makedev(22, 0), ...}) = 0
> close(4) = 0
Fixed (block devices can now be read like normal files).
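With this fix, Christian's test case should behave like
"head -c 10 /dev/cdrom". A sketch mirroring the call from the error
message above:

  (call-with-input-file
      '(path: "/dev/cdrom" char-encoding: ascii buffering: #f)
    (lambda (port)
      (let ((buf (make-u8vector 10)))
        ;; read the first 10 bytes of the device
        (read-subu8vector buf 0 10 port)
        buf)))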
Keep those bug reports coming!
Marc