It does look nice, and parser combinators are very cool, but I scrape a lot of semi-structured documents. Sometimes it's hard to work with all of it in memory, which cl-parser-combinators currently requires.
Thank you though,
Matt

On 02/04/2011 01:14 AM, Cyrus Harmon wrote:
I've been very pleased with cl-parser-combinators. Not sure what you're trying to parse, but it's pretty flexible and powerful. I've used it for parsing SMILES strings (a printed representation of molecules) and have found it a pleasure to work with.
Cyrus
On Feb 3, 2011, at 10:33 PM, Matthew D. Swank wrote:
I suppose this is only marginally related to Common Lisp, but everything I'm talking about is written in Common Lisp.
I use cl-yacc for a lot of parsing, but one thing that has always seemed harder than it needs to be is writing the lexers that feed it. One thing I've found helpful is creating a custom lexer for each parser state by making that state's action-table entry available to the lexer. The entry provides the terminals the parser is looking for, which narrows the tokens the lexer has to consider at each step. However, this means I am also maintaining my own fork of cl-yacc.
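To make the idea concrete, here is a minimal sketch of the narrowing technique, independent of cl-yacc's actual interface (stock cl-yacc's lexer protocol takes no arguments; the `expected-terminals` parameter below stands in for the action-table entry the fork would supply, and all names and recognizers are invented for illustration):

```lisp
;;; Sketch only: a lexer that, instead of trying every token rule,
;;; tries only the recognizers for the terminals the parser's current
;;; state can shift.

(defun make-narrowing-lexer (recognizers)
  "RECOGNIZERS is an alist of (TERMINAL . FUNCTION); each function
takes an input string and a start position and returns the end
position of a match, or NIL."
  (lambda (input pos expected-terminals)
    ;; EXPECTED-TERMINALS would come from the action-table entry for
    ;; the parser's current state (the fork described above).
    (dolist (entry recognizers)
      (destructuring-bind (terminal . recognize) entry
        (when (member terminal expected-terminals)
          (let ((end (funcall recognize input pos)))
            (when end
              (return (values terminal (subseq input pos end) end)))))))))

(defun digits-end (input pos)
  "Return the end of a run of digits starting at POS, or NIL."
  (let ((end (position-if-not #'digit-char-p input :start pos)))
    (cond ((= pos (or end (length input))) nil)
          (end end)
          (t (length input)))))

(defparameter *lexer*
  (make-narrowing-lexer
   (list (cons :number #'digits-end)
         (cons :plus (lambda (in p)
                       (when (char= (char in p) #\+) (1+ p)))))))
```

With `:number` as the only expected terminal, `(funcall *lexer* "42+x" 0 '(:number))` returns the values `:NUMBER`, `"42"`, and `2`; a state expecting only `:plus` at position 2 would instead match the operator.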
It seems (from my admittedly limited search) that this is not a common modification of yacc. Before I start bugging the maintainer about my changes, I want to know: am I abusing yacc?
I do like the separation between low-level tokenization and the higher-level symbol parse, but is there another general parsing technique (available as a Lisp library, of course) that either works at a lower level than yacc usually does or allows the lexer to access more context about the parse?
Matt
pro mailing list
pro@common-lisp.net
http://common-lisp.net/cgi-bin/mailman/listinfo/pro