You should probably go with a Lisp process per session.
One subprocess that stays open as long as the user keeps sending commands back to the server? I thought about that, but from watching people use it in the logs, most people ran maybe one or two commands; (+ 1 2) was the biggest hit. Plus the server didn't even break a sweat opening a new subprocess per request, so it just doesn't seem like the highest priority. Am I missing something?
Consider using CLISP because it starts up really fast and has very little memory overhead. (Although both CLISP and SBCL share their Lisp image VM pages across processes, loading FASLs causes some of those pages to be modified and unshared; if you make a new image file with Parenscript preloaded, all the processes will share their entire image.)
You mean make an image with save-lisp-and-die? That seems like a good idea.
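Something like this, I'm guessing; a rough sketch assuming SBCL with Quicklisp installed (the core file name is just a placeholder):

    ;; build-image.lisp -- run once with: sbcl --load build-image.lisp
    (ql:quickload :parenscript)           ; pull Parenscript into the running image
    (sb-ext:save-lisp-and-die "ps.core")  ; dump the image to disk and exit

Then each subprocess could start with sbcl --core ps.core, so Parenscript never gets FASL-loaded per request and the image pages stay shared.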
The tricky thing is going to be detecting a stuck process. One easy way to bypass this is to do a new Lisp process per eval: just re-run the entire REPL history every time the user presses enter, and have something like a 5-second timer that kills the process if it doesn't complete. I guess that's exactly like CGI. Users might run into trouble if something depends on randomly generated names (gensyms, at least, should come out identical across invocations, since the whole history is replayed from scratch each time).
Oh, that's an interesting idea! The timeouts wouldn't even be difficult because Node.js supports setTimeout on the server. Very cool.
You can use FreeBSD jails or one of the Linux jail tools to sandbox the Lisp and allow defmacros (although the latter are kind of a pain to set up).
Unfortunately, it looks like the way I have the server set up, I can't run this stuff inside a chroot-type jail. It is currently set up as a startup script in /etc/init.d/, and apparently Ubuntu 10.04 won't let startup scripts create chroots anymore...
As far as the code goes, the repository version of Parenscript has implicit returns for defuns and lambdas. I should put together a new release soon.
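For example, with the repository version something like this should now get the return added automatically (a quick sketch; exact output formatting may vary):

    (ps:ps (defun add1 (x) (+ x 1)))
    ;; => "function add1(x) { return x + 1; };"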
Awesome! My implementation of implicit returns is crappy and hacked together at best.
_Nick_