Hello Gambitious Schemers,
I've read the Wiki Namespaces page and I'm still trying to figure out how to incorporate the information.
Here's my situation:

* I have copied ~~/syntax-case.scm to ~~/gambcext.scm since I want to use some syntax-rules macros
* I compile a file that "include"s all the relevant files and then prints a funny message:

;;; test.ss
(include "c-header.ss")
(include "scheme-header.ss")
(include "gsl_structures.ss")
(include "gsl_lalgebra.ss")
(include "gsl_wrapper.ss")
(include "f64matrix.ss")
(include "life-table.ss")
;; life-table.ss includes:
;; (define-structure life-table gens m p a)
(define C (make-f64matrix 2 2 (f64vector 8.67 53.09 3.14 2.17)))
(define C-gsl (make-gsl-matrix 2 2 (f64matrix-data C)))
(define C->gsl (f64matrix->gsl-matrix* C))
(define vec2 (f64vector 1.0 1.0))
(define tbl (make-life-table '(1 2) '(5 10 20) '(0.5 0.75 0.9) '(0.5 0.5 0.5)))
(display "Good evening, Dave. Everything's running smoothly, and you?")
(newline)
I compile this with:
# a rudimentary Makefile for Gambit targets
LDOPTS="-g -L/usr/lib/gsl -lgslcblas -lgsl"
CCOPTS="-I/usr/include"
GSCOPTS=-keep-c -debug -track-scheme -warnings -expansion
libs = gsl_lalgebra.ss gsl_wrapper.ss f64matrix.ss

test:
	gsc $(GSCOPTS) -ld-options $(LDOPTS) -cc-options $(CCOPTS) test.ss
Everything compiles just fine.
* I "run-scheme" in Emacs, and then the action happens: Gambit v4.2.0
(load "test")
Good evening, Dave. Everything's running smoothly, and you? "/home/joel/lisp/scm/population/libgenx/test.o11"
##make-life-table
*** ERROR IN (stdin)@2.1 -- Unbound variable: ##make-life-table 1> ,t
make-life-table
*** ERROR IN (stdin)@4.1 -- Unbound variable: sc#make-life-table 1> ,t
sc#make-life-table
*** ERROR IN (stdin)@6.1 -- Unbound variable: sc#make-life-table 1>
* So, what's going on here? What namespace is my structure attached to? Do I need to define namespaces in my files to be able to access these things from the REPL (and in future code)?
* The wiki page mentions something about the namespace functions not being supported in the future. What is the status of the namespace mechanism: supported or not? Will it be supported in the future? It seems like I need to use it if I am going to use syntax-case, or will that change too?
Thanks, Joel
Joel J. Adamson wrote:
... syntax-rules macros
syntax-rules is incompatible with most Gambit specialities. Maybe search the mailing list for more details, or wait until Marc can give you more specifics.
(I've never really used syntax-rules with Gambit. Currently I'm using define-macro and whole-sourcefile transformers for my stuff, and am planning on reading about and understanding s48's macro system, and maybe working on implementing that for Gambit/chjmodule/Snow.)
- The wiki page mentions something about the namespace functions not being supported in the future.
Hm, I think there are roughly four levels of support or removal of a feature:
0. having decided that a feature is not to be removed at any time.
1. not being sure whether something is the one true way to do it forever, but not having any reason to actually think of removing it.
2. having found actual reasons to believe something should be removed or replaced.
3. having decided to remove something.
The wording in the wiki is supposed to refer only to level 1. Not going to level 0 means leaving room for further decisions. That's just how I understand it, of course, and it has been the thinking behind writing that particular statement in the wiki.
What is the status of the namespace mechanism: supported or not? Will it be supported in the future?
Speaking for myself, I'm currently counting on it being there for chjmodule. Should I/we find other ways to achieve the things I/we want, I'll have no issue changing to those.
(BTW you could always reimplement namespaces for yourself if the following points are true:

(1) you can filter the code coming from file and REPL inputs, maintaining source location information,
(2) you've got a code analyzer (lexical analysis),
(3) possibly for convenience, filter also the output of the debugging functions.
Point 1 is already there in Gambit (undocumented, though; BTW that's something I think would have made more sense to standardize in R*RS than an actual module system). Point 2 isn't hard, if you're going to handle imports/exports anyway. Point 3 may be achievable by overriding some pretty-print hook; I haven't tried yet.
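To make point 2 a bit more concrete, here is a toy sketch of such a rewriter (only an illustration, not actual chjmodule code): it works on plain sexprs, so it ignores the location-keeping of point 1, and it does not know about binding forms or special forms, which a real analyzer would have to.

(define (qualify sym prefix)
  (string->symbol (string-append prefix (symbol->string sym))))

;; prefix every symbol that is not in the `bound' list, e.g. foo -> utils#foo
(define (prefix-free-symbols expr prefix bound)
  (cond ((symbol? expr)
         (if (memq expr bound) expr (qualify expr prefix)))
        ((pair? expr)
         (cons (prefix-free-symbols (car expr) prefix bound)
               (prefix-free-symbols (cdr expr) prefix bound)))
        (else expr)))

;; e.g. (prefix-free-symbols '(vector-set! v i x) "utils#" '(v i x))
;;   => (utils#vector-set! v i x)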
It's not clear to me yet how well layering would/will work, e.g. mixing code that uses both the namespaces feature and future chjmodule functionality; you're seeing such problems currently with syntax-case, but that may just be because nobody has spent much effort on integrating syntax-case better (well, I could be wrong, I don't know about the issues).)
It seems like I need to use it if I am going to use syntax-case, or will that change too?
(I rather think there's something fishy going on in the way you're trying to use the not very deeply integrated syntax-case.)
Christian.
Christian Jaeger christian@pflanze.mine.nu writes:
Joel J. Adamson wrote:
... syntax-rules macros
syntax-rules is incompatible with most Gambit specialities.
Interesting:

* I have had a lot of trouble understanding the pattern language anyway, and I have a hard time seeing what the advantages of define-syntax over define-macro are. Any thoughts?
* Can you elaborate on Gambit's specialties that are incompatible with syntax-rules?
I don't have my heart set on using syntax-rules; I was under the impression that there is some huge advantage of it since in other implementations (MzScheme especially) it is used A LOT.
However, last night after figuring out the namespaces thing, I wrote a syntax-rules macro that looked exactly like I could have written it in define-macro language instead. Also after doing a quick find-grep of the examples and finding no instances of syntax-rules, and many examples of define-macro, I think I'm alright with using define-macro.
whole-sourcefile transformers
Can you tell me more about this technique?
Thanks -- this discussion is very helpful, Joel
Joel J. Adamson wrote:
Christian Jaeger christian@pflanze.mine.nu writes:
Joel J. Adamson wrote:
... syntax-rules macros
syntax-rules is incompatible with most Gambit specialities.
Interesting:
- I have had a lot of trouble understanding the pattern language anyway, and I have a hard time seeing what the advantages of define-syntax over define-macro are. Any thoughts?
The disadvantages of define-macro:
* with define-macro you have to care about hygiene yourself (e.g. you have to use (gensym) to introduce new identifiers into the transformed code, and you have to fully qualify symbols that you want to refer to a particular package instead of to the identifiers in the package of the user of the macro); a small sketch of the gensym point follows below
* with define-macro you're working with sexprs that have lost their source location information; as a result, when an error happens in code which has been built by a macro, the Gambit debugger will just point at the start of the region with the generated code; with macros spanning big code parts, this makes tracking errors hard. syntax-{case,rules} is said to keep source location information automatically.
Gambit has an undocumented define-macro variant which gives you the sexprs with location information; thus with some care, you can still implement macros that keep location information. (I started implementing the same in chjmodule before I knew about the Gambit low-level define-macro, and actually I can't find the name of the latter right now.)
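To illustrate the gensym point above (a toy example, not from the thread): any temporary that a define-macro expansion introduces has to be a gensym, or it can capture the user's variables.

(define-macro (swap! a b)
  (let ((tmp (gensym)))           ; fresh, uncapturable name for the temporary
    `(let ((,tmp ,a))
       (set! ,a ,b)
       (set! ,b ,tmp))))

;; had we written the literal symbol `tmp' instead of (gensym),
;; (swap! x tmp) would silently expand to code that captures the
;; user's own `tmp' and leaves both variables unchanged.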
- Can you elaborate on Gambit's specialties that are incompatible with
syntax-rules?
Read the comments at the top of /usr/local/Gambit-C/current/syntax-case.scm
I don't have my heart set on using syntax-rules; I was under the impression that there is some huge advantage of it since in other implementations (MzScheme especially) it is used A LOT.
However, last night after figuring out the namespaces thing, I wrote a syntax-rules macro that looked exactly like I could have written it in define-macro language instead. Also after doing a quick find-grep of the examples and finding no instances of syntax-rules, and many examples of define-macro, I think I'm alright with using define-macro.
whole-sourcefile transformers
Can you tell me more about this technique?
See the bottom of the file /usr/local/Gambit-C/current/syntax-case.scm, namely the ##expand-source and c#expand-source hooks.
e.g.
(set! ##expand-source (lambda (v) (pp v) v))
(define a "hello")
#(#(source1) (#(#(source1) define (console) 65537) #(#(source1) a (console) 524289) #(#(source1) "hello" (console) 655361)) (console) 1)
a
#(#(source1) a (console) 2)
"hello"
I've written a library of functions for dealing with the source location information which is wrapped around the data (it is to be released sooner than RSN™ with the rest of my modules and the chjmodule system).
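For what it's worth, here is a minimal sketch of such a transformer (a toy relying only on the vector layout visible in the transcript above; the helper names are not Gambit API): it reports every define that passes through and hands the source object back unchanged.

(define (src-code s) (vector-ref s 1))   ; the datum carried by a source vector

(define (log-defines src)
  (let ((code (src-code src)))
    (if (and (pair? code)
             (pair? (cdr code))
             (vector? (car code))
             (eq? (src-code (car code)) 'define))
        (begin
          (display "defining: ")
          (write (src-code (cadr code)))   ; the name (or the (name . args) form)
          (newline))))
  src)                                     ; return the source unmodified

(set! ##expand-source log-defines)         ; and presumably c#expand-source for compiled code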
Christian.
Just a quick note about a pet-peeve of mine regarding define-macro:
The disadvantages of define-macro:
- with define-macro you have to care about hygiene yourself (e.g. you
have to use (gensym) to introduce new identifiers into the transformed code, and you have to fully qualify symbols that you want to refer to a particular package instead of to the identifiers in the package of the user of the macro)
It's worse than this. Consider the following example:
(define-macro (do-times i-and-n . body)
  (let ((i (car i-and-n))
        (nn (cadr i-and-n))
        (n (gensym)))
    `(let ((,n ,nn))
       (do ((,i 0 (+ ,i 1)))
           ((= ,i ,n))
         ,@body))))
(define v (vector 1 2 3 4 5 6 7 8 9 10))
(define v-of-v (vector v v v v v v v v v))
;; This works
(do-times (i 10)
  (if (not (= (vector-ref v i) (+ i 1)))
      (display "not equal")
      (display "equal"))
  (newline))
;; This barfs, because we've *locally* bound "=" (pretend we've suitably
;; defined vector=).  The error will be something like
;; "vector=: expected <vector> got 1", coming from the use of '= in the
;; do test the macro expands into.
(let ((= vector=))
  (do-times (i 10)
    (if (not (= (vector-ref v-of-v i) v))
        (display "not equal")
        (display "equal"))
    (newline)))
I've been bit by this bug before myself, in real code. And *there's nothing you can do about it* using define-macro. It's *far* worse than having to define a new gensym each time you want to introduce a local variable. If you want to use define-macro, *don't ever* re-define *any* standard symbol (even locally) if you want to be sure that you're not hosing macros. Syntax-case/rules just does the right thing here---and saves you from a very hard-to-find class of bugs.
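For contrast, here is roughly what the hygienic version of the same macro looks like with syntax-rules (a sketch, assuming the syntax-case expander from ~~/syntax-case.scm is loaded):

(define-syntax do-times
  (syntax-rules ()
    ((_ (i n-expr) body ...)
     (let ((n n-expr))
       (do ((i 0 (+ i 1)))
           ((= i n))
         body ...)))))

;; `n', `=' and `+' in the template refer to the macro's definition
;; environment, so no gensym is needed and a caller's
;; (let ((= vector=)) (do-times (i 10) ...)) no longer breaks the loop test.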
Will
.. and you have to fully qualify symbols ...
It's worse than this. Consider the following example:
Assuming this macro is part of a package utils# :
(define-macro (do-times i-and-n . body)
  (let ((i (car i-and-n))
        (nn (cadr i-and-n))
        (n (gensym)))
    `(let ((,n ,nn))
       (do ((,i 0 (+ ,i 1)))
           ;; was: ((= ,i ,n))
           ((utils#= ,i ,n))
         ,@body))))
;; This barfs, because we've *locally* bound "=" (pretend we've suitably defined vector=)
What I said ;).
Well, OK, the above assumes chjmodule semantics, e.g. if the package utils imports the Gambit / R5RS standard identifiers, utils#= is the same as the standard Gambit = (e.g. there's a (define utils#= (let () (##namespace ("")) =)) somewhere in your code).
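A quick sanity check of the fully-qualified variant (a sketch reusing the definitions above; the shadowing lambda is only there to make any accidental use of the local = observable):

(define utils#= (let () (##namespace ("")) =))   ; alias the standard = as utils#=

(let ((= (lambda args (error "the shadowed = was called"))))
  (do-times (i 3)              ; the utils#= variant of do-times from above
    (display i) (newline)))    ; prints 0 1 2 -- the expansion calls utils#=, not =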
Christian.
Christian Jaeger christian@pflanze.mine.nu writes:
Joel J. Adamson wrote:
[...]
What is the status of the namespace mechanism: supported or not? Will it be supported in the future?
[...]
It's not clear to me yet how well layering would/will work, e.g. mixing code that uses both the namespaces feature and future chjmodule functionality; you're seeing such problems currently with syntax-case, but that may just be because nobody has spent much effort on integrating syntax-case better (well, I could be wrong, I don't know about the issues).)
I got my code to work and I think I understand the ##namespace mechanism much better. Now my concern is: if I'm building an application (my ultimate goal), am I going to have to re-work everything once a module system gets implemented? I really like how fast Gambit development happens: it has already enabled me to implement things I expected to be there, simply by voicing them to the newsgroup. I also know that Gambit has some good advantages over most other implementations.
However, being a limited-time, "spare-time" programmer, is my time better spent learning another implementation with slower development and more stable features (as in "probably not going to be reimplemented in such a way that you would have to change large portions of code")? Is Gambit meant for building applications, or is it meant as a testing ground for features of a Scheme implementation?
Joel
...
I got my code to work and I think I understand the ##namespace mechanism much better. Now my concern is: if I'm building an application (my ultimate goal), am I going to have to re-work everything once a module system gets implemented? I really like how fast Gambit development happens: it has already enabled me to implement things I expected to be there, simply by voicing them to the newsgroup. I also know that Gambit has some good advantages over most other implementations.
However, being a limited-time, "spare-time" programmer, is my time better spent learning another implementation with slower development and more stable features (as in "probably not going to be reimplemented in such a way that you would have to change large portions of code")? Is Gambit meant for building applications, or is it meant as a testing ground for features of a Scheme implementation?
Hi Joel,
This part of your message caught my attention. From everything I have seen over the years, Gambit is really meant for building real applications. I have seen many people using it to develop sophisticated software. Personally, my company is in the process of porting some commercial applications we developed to Gambit. It has exactly what we need: blazing speed, stability, great core functionality and a very nice debugger.
Hope this helps,
Best regards,
Guillaume Cartier
Joel J. Adamson wrote:
Now my concern is: if I'm building an application (my ultimate goal), am I going to have to re-work everything once a module system gets implemented?
As long as you are just importing and exporting identifiers across module boundaries, you're just using a namespaces system of some sort. Thus the switch to Snow! should (aside from changing to portable libraries) just be one of using a different import/export declaration; someone might even write a converter between ##namespace and Snow! (and chjmodule) modules.
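For concreteness, the plain-##namespace style being referred to looks roughly like this (file and identifier names are only illustrative, not the actual life-table code):

;; life-table.ss -- map these names into the life-table# namespace before
;; defining them; all other identifiers stay as they are:
(##namespace ("life-table#" make-life-table life-table-gens))
(define (make-life-table gens m p a) (vector gens m p a))
(define (life-table-gens tbl) (vector-ref tbl 0))

;; client code -- the same declaration doubles as the "import":
(##namespace ("life-table#" make-life-table life-table-gens))
(define tbl (make-life-table '(1 2) '(5 10 20) '(0.5 0.75 0.9) '(0.5 0.5 0.5)))
(life-table-gens tbl)   ; => (1 2)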
If you want to use parametrization, you have to do that manually with a namespace-like module (e.g. you have to write functors, export those, and call them manually in the client module); you won't have to rewrite such code once the module system supports parametrization itself (at least if you're only parametrizing functions and not macros; the latter is open for thought), it will just help make it cleaner.
If you want to write lazy code, it's the same thing: you will have to use delay and force like everyone(?) else at the moment; once there's a transformer present in a module system, you can forget about those and let it do that for you automatically, but you don't have to if you don't want to touch your code (BTW I did write such a transformer, without optimizations or general import/export mechanisms, in just an hour or so a few days ago; it should be available in chjmodule a little less RSN).
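For example, the manual delay/force style in question might look like this (a sketch; names are made up for illustration):

;; tables.ss -- export a lazily built value by wrapping it in a promise:
(define tables#life-expectancy
  (delay (begin (display "building...") (newline)
                (vector 78.1 80.9 83.2))))    ; stand-in for an expensive computation

;; client code forces it on first use; later forces reuse the memoized value:
(vector-ref (force tables#life-expectancy) 0)   ; prints "building..." once, => 78.1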
Is Gambit meant for building applications, or is it meant as a testing ground for features of a scheme implementation?
Gambit as an R5RS Scheme implementation is very mature; what I'm experimenting with is everything on top of R5RS (both as a learning process now as well as for building applications in the future).
Christian.
Christian Jaeger christian@pflanze.mine.nu writes:
Joel J. Adamson wrote:
Now my concern is: if I'm building an application (my ultimate goal), am I going to have to re-work everything once a module system gets implemented?
As long as you are just importing and exporting identifiers across module boundaries, you're just using a namespaces system of some sort.
Right right, that makes sense.
Thus the switch to Snow! should (aside from changing to portable libraries) just be one of using a different import/export declaration; someone might even write a converter between ##namespace and Snow! (and chjmodule) modules.
Just to clarify, you're suggesting I use the Snow system _as a module system_? That I write my libraries as Snow packages, etc?
Joel
Joel J. Adamson wrote:
Just to clarify, you're suggesting I use the Snow system _as a module system_? That I write my libraries as Snow packages, etc?
I have not written any Snow package yet, so I can't tell how well that works for development (as opposed to distribution).
Of course you could wait a week or two until I've got an updated chjmodule out the door and use that, but you might need to wait a bit more until you can seamlessly load Snow packages from chjmodules (not sure yet what that needs). And at the moment my intent with chjmodule is implementing features, not stability, and this might sometimes mean having to make incompatible changes; in any case, as long as it isn't being discussed (other than me holding monologues with myself), I'll just take the path of least resistance, which means that I will just care about not breaking too many of my own modules, or otherwise write converters for their migration.
Snow, on the other hand, may take backwards compatibility seriously, so if you're looking for that, it may serve you better; although I don't know how it can handle the mentioned define-macro namespacing issue, for example. If you want the simplest and most stable approach and don't care about portability right now, you might also just use Gambit namespaces directly; maybe writing a converter to (a future version of) Snow at some point is a fun experience after all for someone.
Christian.
Christian Jaeger christian@pflanze.mine.nu writes:
Joel J. Adamson wrote:
Just to clarify, you're suggesting I use the Snow system _as a module system_? That I write my libraries as Snow packages, etc?
I have not written any Snow package yet, so I can't tell how well that works for development (as opposed to distribution).
Well, one of the reasons I switched from PLT Scheme was that PlaneT packages did not seem a good fit for distribution: any user would have to get PLT Scheme and PlaneT, and I wasn't interested in using my application to promote DrScheme (largely a teaching tool).
[Will Farr, since I now know you're on this list, please comment if you like]
If you want the simplest and most stable approach and don't care about portability right now, you might also just use Gambit namespaces directly; maybe writing a converter to (a future version of) Snow at some point is a fun experience after all for someone.
I think that using ##namespace directly is a good idea for now, since I've figured out how to use it ;)
Joel
At Tue, 12 Feb 2008 16:10:02 -0500, Joel J. Adamson wrote:
Christian Jaeger christian@pflanze.mine.nu writes:
Joel J. Adamson wrote:
Just to clarify, you're suggesting I use the Snow system _as a module system_? That I write my libraries as Snow packages, etc?
I have not written any Snow package yet, so I can't tell how well that works for development (as opposed to distribution).
Well, one of the reasons I switched from PLT Scheme was that PlaneT packages did not seem a good fit for distribution: any user would have to get PLT Scheme and PlaneT
That's not the case for module-based programs. When you create a distribution (via DrScheme's "Create Executable" menu item or via `mzc --exe' plus `mzc --exe-dir'), then all needed binaries, libraries, Planet packages, etc., are assembled into the distribution. The end user of a distribution won't have to download anything else or connect to the Planet server.
Matthew
Matthew Flatt mflatt@cs.utah.edu writes:
At Tue, 12 Feb 2008 16:10:02 -0500, Joel J. Adamson wrote:
Christian Jaeger christian@pflanze.mine.nu writes:
Joel J. Adamson wrote:
Just to clarify, you're suggesting I use the Snow system _as a module system_? That I write my libraries as Snow packages, etc?
I have not written any Snow package yet, so I can't tell how well that works for development (as opposed to distribution).
Well, one of the reasons I switched from PLT Scheme was that PlaneT packages did not seem a good fit for distribution: any user would have to get PLT Scheme and PlaneT
That's not the case for module-based programs. When you create a distribution (via DrScheme's "Create Executable" menu item or via `mzc --exe' plus `mzc --exe-dir'), then all needed binaries, libraries, Planet packages, etc., are assembled into the distribution. The end user of a distribution won't have to download anything else or connect to the Planet server.
Good to know; I guess I or the person who informed me otherwise had ill communication.
Joel