Although I understand why it's not done by default, it seems slightly odd to me that Drakma doesn't provide a URL-ENCODE function whereas Hunchentoot does. I would've thought the need was considerably more immediate in Drakma...
With greetings, Herbert Snorrason
On 21 July 2010 13:28, Herbert Snorrason methanal@gmail.com wrote:
[...] Drakma doesn't provide a URL-ENCODE function [...]
And after thinking about it, turns out it _does_, just doesn't expose it. The thinking is, I presume, that all URL-encoded data should pass through another interface that handles the cleaning?
In any case, the attached patch updates url-encode to RFC 3986 (or at least passes only that standard's "unreserved characters" through unencoded) and exposes it.
With greetings, Herbert Snorrason
On Wed, Jul 21, 2010 at 11:46 AM, Herbert Snorrason methanal@gmail.com wrote:
In any case, the attached patch updates url-encode to RFC 3986 (or at least passes only that standard's "unreserved characters" through unencoded) and exposes it.
I suspect that the original portions of
-                        (find char "$-_.!*'()," :test #'char=))
+                        (find char "-_.~" :test #'char=))
                     (write-char char out))
-                   ((char= char #\Space)
-                    (write-char #\+ out))
are based on RFC 1738's more restricted list of characters:
Thus, only alphanumerics, the special characters "$-_.+!*'(),", and reserved characters used for their reserved purposes may be used unencoded within a URL. [1]
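The narrower RFC 3986 rule the patch adopts can be sketched like this (Python purely for illustration; url_encode here is a hypothetical standalone function, not Drakma's): every octet outside the unreserved set ALPHA / DIGIT / "-" / "." / "_" / "~" is percent-encoded, and a space becomes %20 rather than the form-style "+".

```python
def url_encode(s: str, encoding: str = "utf-8") -> str:
    # Pass through only RFC 3986 "unreserved characters":
    # ALPHA / DIGIT / "-" / "." / "_" / "~" -- the "-_.~" set in the patch.
    unreserved = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ"
                     "abcdefghijklmnopqrstuvwxyz"
                     "0123456789-_.~")
    out = []
    for octet in s.encode(encoding):
        ch = chr(octet)
        # Percent-encode everything else, one octet at a time.
        out.append(ch if ch in unreserved else "%%%02X" % octet)
    return "".join(out)

print(url_encode("foo bar/baz~"))  # foo%20bar%2Fbaz~
```

Note that a character outside ASCII is first encoded to octets (UTF-8 here) and each octet is percent-encoded separately, which matches how the patched Drakma function walks the octet stream.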
Since request parameters (which could include URIs) might appear in the HTTP request's URL, the parameters need to be URL-encoded, not merely composed of valid URI characters.
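To illustrate that point, Python's standard urllib.parse.quote (with safe="") shows what a URI must look like once it is carried as a parameter value; the example URL is made up:

```python
from urllib.parse import quote

# A URI used as a parameter value must itself be percent-encoded,
# or its ':', '/', '?', '=' and '&' would be read as URL structure
# by the server rather than as part of the value.
param = "http://example.com/path?x=1&y=2"
encoded = quote(param, safe="")  # safe="" forces even "/" to be encoded
print(encoded)  # http%3A%2F%2Fexample.com%2Fpath%3Fx%3D1%26y%3D2
```

Every character of the embedded URI that is "reserved" in RFC 3986 terms gets escaped, even though the raw string was already a perfectly valid URI on its own.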
[1] http://www.rfc-editor.org/rfc/rfc1738.txt
//JT