On Thu, Apr 12, 2012 at 4:05 PM, Mikael mikael.rcv@gmail.com wrote:
On 12 April 2012 23:44, Marc Feeley feeley@iro.umontreal.ca wrote:
My question is this: is it important that your application be a "windows" application? What are the benefits over a "console" application?
"console" applications are good for apps where you have a practical use of the app being hardwired to the presence of a console window only. So it's a nice default setting for developers, to have a console to print out to, take text input, and so on.
The issue is just that "console" applications, if launched from somewhere other than a console (say, from Explorer), get a console window allocated to them at startup, even if they do no console I/O.
If you want to make a UI app (for example, an alarm clock), obviously you want your users' experience to be about your UI only, and a console window popping up in the background would be quite shocking for them.
^^ This is indeed the reason why my app has to be a windows app. The users would complain if a console window appeared when they ran the app (they would probably interpret that to mean something was wrong :-D ).
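For anyone following along: "console" vs. "windows" here refers to which subsystem the executable is linked for. With a MinGW-based Gambit I believe something roughly like the following links the program as a windows-subsystem app (the file name is just a placeholder; an MSVC toolchain would use /SUBSYSTEM:WINDOWS instead):

    gsc -exe -ld-options "-mwindows" myapp.scm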
In console apps, you can deallocate the console I think, but even then it flashes by quickly on the screen, giving the user a feeling that something is not right.
Yes, one solution would be to use GetConsoleWindow() and hide the window immediately after it is allocated (e.g. right after calling AllocConsole()). But, again, the users wouldn't like the flashing window any better.
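For what it's worth, a rough sketch of that approach through Gambit's C interface might look like this (untested; it assumes a Win32 build where user32 is available to the linker, and hide-console-window! is just a name I made up):

    (c-declare "#include <windows.h>")

    ;; Hide the console window if one has been allocated.
    ;; GetConsoleWindow() returns NULL when there is no console,
    ;; in which case we simply do nothing.
    (define hide-console-window!
      (c-lambda () void
        "if (GetConsoleWindow() != NULL) ShowWindow(GetConsoleWindow(), SW_HIDE);"))

The idea would be to call (hide-console-window!) as early as possible in the program, but as you say, the window would still flash briefly before it gets hidden.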
So the most straightforward thing for GUI apps is to be compiled as a "windows" application, and then to not use any console I/O at all (so, no access to current-output-port etc. at all, not even a (force-output) to it), because on such use a console window would be dynamically allocated anyhow. (There's some way to deallocate it, just like above, though of course then it would flash by briefly, just like above.) REPLeffect presumably had some routine in his app that did such console output, and this is why the dynamic console window allocation he experienced was remedied by your suggestion of overwriting current-output-port with a dummy port.
Yes, one possible solution is to not print to stdout or stderr at all, but rather have some other facility for logging debugging messages. However, in my debugging I often ssh into the Windows box via Cygwin, and run the app from that ssh session on a separate monitor from the monitor where the Windows application is being run. Under Cygwin, I *do* have a stdin/stdout/stderr, and I can get debugging messages to show up there. The original problem I was trying to solve was how to easily prevent these messages from showing up when the app is run from the Windows box directly (and not via Cygwin). A non-stdout/stderr debugging facility is something I may set up (and I have the beginnings of that for handling exceptions -- something I'll have to deal with on users' behalf). For now the stdout/stderr debugging is a simple way to get the information on a separate monitor.
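In the meantime, Marc's dummy-port suggestion looks roughly like this as I understand it (just a sketch; run-app is a placeholder for the application's entry point):

    ;; Redirect all standard output and error to a dummy port, so that
    ;; nothing the runtime or the app prints ever reaches a console.
    ;; A string port works; so would a file port opened on the Windows
    ;; "nul" device.
    (define dummy-port (open-output-string))

    (parameterize ((current-output-port dummy-port)
                   (current-error-port  dummy-port))
      (run-app))

With that in place, stray writes to current-output-port (or a (force-output) on it) go to the dummy port instead of triggering the dynamic console allocation Mikael describes.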
Plus, it's good to know how this works with the Gambit runtime library (and thus how to avoid unwanted consoles popping up).
This console vs. windows application thing is not a completely obvious abstraction. How does it work in Unix? GUI apps have their own stdin/stdout, completely decoupled from the GUI workings, and generally it's piped to the app's console, which may be the host terminal window of the X server, or /dev/null, something like that.
This is one of my gripes about Windows. There's no reason that I'm aware of that they couldn't have set up a stdin/stdout/stderr facility for their GUI applications (without having to have the separate console); they just didn't. I suppose it might have something to do with DOS application compatibility from Windows 3.0, or something. I think mostly they just didn't care (though I'm open to being proved wrong about that).
REPLeffect