I am wondering if we even need the content trick to detect what needs transmitting. Consider this very slightly modified (and untested) xml rule:
(defun xmlrule (class html-attrs)
  `(c? (with-output-to-string (s)
         (with-html-output (s nil :prologue nil)
           (,(intern (symbol-name class) :keyword)
            ,@(loop for (cl-meth attr) in html-attrs
                    append `(,(intern (symbol-name attr) :keyword) (,cl-meth self))) ;; (^x) -> (x self)
            ,@(loop for (cl-meth attr) in '((id id) (cls class) (title title) (style style))
                    append `(,(intern (symbol-name attr) :keyword) (,cl-meth self))) ;; (^x) -> (x self)
            (str (apply #'concatenate 'string
                        (loop for k in (^kids)
                              ;; got to see kids change
                              ;; but not if same kid changes its xml:
                              collecting (without-c-dependency (xhtml k))))))))))
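To make the intended wiring concrete, here is one way the generated rule might land in a Cells model class. The def-xml-node macro, the DIV example, and the href accessor are all made up for illustration; only defmodel, family, and c? are real Cells operators, and the class would also need the id/cls/title/style accessors that xmlrule assumes:

(defmacro def-xml-node (class &rest html-attrs)
  ;; sketch only: give the class an xhtml slot ruled by the form XMLRULE builds
  ;; (xmlrule itself must be available at macroexpansion time)
  `(defmodel ,class (family)
     ((xhtml :accessor xhtml :initform ,(xmlrule class html-attrs)))))

;; e.g. (def-xml-node div (href href)) would define a DIV model whose xhtml
;; rule renders a <div> carrying id/class/title/style plus an href attribute.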
Now the observer on xmlrule just pushes the string onto the updates slot as an assoc entry of (cons self (^xhtml)), and the transmitter simply skips any entry that has an ascendant also in the updates list.
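Here is a minimal sketch of that observer/transmitter pair, assuming Cells' defobserver and fm-parent, with a special variable *updates* and an xml-node class standing in for the real updates slot and model class:

(defvar *updates* nil "Assoc list of (node . fresh-xhtml-string).")

(defobserver xhtml ((self xml-node))
  ;; each time a node's xhtml string is recomputed, queue it for transmission
  (push (cons self new-value) *updates*))

(defun ascendant-updated-p (node)
  ;; walk up the family tree looking for an ascendant that is itself queued
  (loop for p = (fm-parent node) then (fm-parent p)
        while p
        when (assoc p *updates*) return t))

(defun entries-to-transmit ()
  ;; skip any entry with an ascendant also queued: the ascendant's fresh
  ;; xhtml already embeds this node's xhtml
  (remove-if (lambda (entry) (ascendant-updated-p (car entry)))
             *updates*))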
What would be fun would be the custom xget or xlookup extension I suggested. Then we let the browser do the work of assembling the xhtml and simply have an observer that transmits xhtml when it changes, because all references to kids would look like:
(loop for k in (^kids) collecting `(<xlookup ,(xuid k)>))
... or whatever the syntax would be.
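For instance, the only part of xmlrule that would need to change is the (str ...) form; something along these lines, where the xlookup element syntax and the xuid accessor are both assumptions:

(defun kids-as-xlookups (self)
  ;; sketch: emit one placeholder element per kid instead of inlining its xhtml
  (apply #'concatenate 'string
         (loop for k in (kids self)
               collecting (format nil "<xlookup xuid=\"~a\"/>" (xuid k)))))

The rule's (str ...) call would then become (str (kids-as-xlookups self)): it still re-runs when the kid list changes, but it no longer touches any kid's xhtml at all, so without-c-dependency is not even needed there.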
The nice thing here is the added efficiency. What if the apropos search parameters are changed to, say, "exported only"? Instead of sending over the whole list of exported functions afresh, we send over only a new list of xlookups, because their xhtml bodies are all over there already. How cool is that?
And once the demo gets fleshed out and a single match is its own div with multiple columns, we are seriously reducing traffic.
Thoughts?
kt