Hi.
I have written a small and very simple web service to which I want to upload files. With the current version 1.2.1 of hunchentoot I have problems uploading big files, say 1GB. As far as I remember, big uploads worked fine with the older versions (1.0 and 1.1).
Basically the following code is in use:
--8<---------------cut here---------------start------------->8---
(hunchentoot:define-easy-handler (handle-upload :uri "/path/to/upload-service")
    ()
  (let ((uploaded (when (and (boundp 'hunchentoot:*request*)
                             (hunchentoot:post-parameter "filename"))
                    (handle-file (hunchentoot:post-parameter "filename")))))
    (generate-html-code)))
--8<---------------cut here---------------end--------------->8---
And handle-file looks like this:
--8<---------------cut here---------------start------------->8---
(defun handle-file (post-parameter)
  (ht-log :info "Handling file upload with params: '~A'." post-parameter)
  (when (and post-parameter (listp post-parameter))
    (destructuring-bind (path filename content-type) post-parameter
      (declare (ignore content-type))
      ;; Strip directory info sent by Windows browsers.  Note that the
      ;; backslash has to be escaped twice, once for the Lisp reader and
      ;; once for CL-PPCRE.
      (when (search "Windows" (hunchentoot:user-agent) :test #'char-equal)
        (setf filename (ppcre:regex-replace ".*\\\\" filename "")))
      (fad:copy-file path
                     (ensure-directories-exist
                      (merge-pathnames filename *unsecure-upload-dir*))
                     :overwrite t)
      filename)))
--8<---------------cut here---------------end--------------->8---
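For what it's worth, the stripping itself works as intended once the backslash is escaped for both the Lisp reader and CL-PPCRE; the string ".*\\\\" denotes the regex .*\\, i.e. "everything up to and including the last backslash" (the example filename is made up):

--8<---------------cut here---------------start------------->8---
(ppcre:regex-replace ".*\\\\" "C:\\Users\\me\\report.pdf" "")
;; => "report.pdf"
--8<---------------cut here---------------end--------------->8---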
It seems that hunchentoot tries to read the whole upload into memory, and that the heap is too small for this (the server has only 1GB of RAM, and the heap of the sbcl process is limited to about 600MB).
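For reference, the heap limit can be checked from the REPL on SBCL (the value is in bytes); just raising it would only paper over the problem, since memory use would still grow with the file size:

--8<---------------cut here---------------start------------->8---
;; Configured size of SBCL's dynamic space (the heap), in bytes:
(sb-ext:dynamic-space-size)
;; => 629145600   ; i.e. the ~600MB mentioned above

;; It can only be changed at startup, e.g.:
;;   sbcl --dynamic-space-size 2048   ; megabytes
--8<---------------cut here---------------end--------------->8---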
Is there any (easy) way to make hunchentoot read the data in small chunks, so that the maximal amount of memory used stays bounded, independent of the file size?
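In case there is no built-in switch, would something along the following lines be the way to go? This is only an untested sketch: it bypasses the post-parameter machinery via raw-post-data with :want-stream and copies the body to disk in 64KB chunks. It assumes the client sends the file as the plain request body (not as multipart/form-data); the handler name, URI, and target filename are made up.

--8<---------------cut here---------------start------------->8---
(hunchentoot:define-easy-handler (handle-raw-upload :uri "/path/to/raw-upload")
    ()
  ;; Ask hunchentoot for the request body as a binary stream instead of
  ;; a parsed post parameter, then copy it in fixed-size chunks so the
  ;; memory use stays constant no matter how big the file is.
  (let ((in (hunchentoot:raw-post-data :want-stream t))
        (buffer (make-array 65536 :element-type '(unsigned-byte 8))))
    (with-open-file (out (merge-pathnames "upload.bin" *unsecure-upload-dir*)
                         :direction :output
                         :element-type '(unsigned-byte 8)
                         :if-exists :supersede)
      (loop for end = (read-sequence buffer in)
            while (plusp end)
            do (write-sequence buffer out :end end))))
  (generate-html-code))
--8<---------------cut here---------------end--------------->8---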