@Mark Evenson evenson@panix.com would a GitHub fork + pull request work for you if I end up working on this?
On Thu, 30 Jul 2020 at 13:40, Alessio Stalla alessiostalla@gmail.com wrote:
Hmm, so Spark being a distributed computing library, I guess you need full serialization of functions/closures. Well, I can give it a shot, but don't expect anything too soon.
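For what it's worth, here is a minimal sketch (plain Common Lisp, not ABCL internals) of why named functions are easy to ship but closures need full serialization:

;; A named function can travel by symbol and be re-resolved on the
;; remote side with SYMBOL-FUNCTION, as long as the same definition
;; is loaded there.
(defun line-length (s)
  (length s))
(symbol-function 'line-length)

;; A closure, by contrast, captures anonymous lexical state: there is
;; no symbol to ship, so the captured environment itself has to be
;; serialized.
(let ((prefix-length 10))
  (lambda (s) (- (length s) prefix-length)))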
On Thu, 30 Jul 2020 at 13:26, Steven Nunez steve_nunez@yahoo.com wrote:
Sorry, meant to send to the ABCL-dev list the first time.
The use case is Spark lambda functions. I couldn't explain it better than the Spark RDD Programming Guide http://spark.apache.org/docs/latest/rdd-programming-guide.html does, starting at the Basics heading. The ideal case would be the ability to take Java code like this:
JavaRDD<Integer> lineLengths = lines.map(s -> s.length());
and write it in ABCL like this:
(let ((line-lengths (#"map" *lines* (lambda (s) (length s)))))
  line-lengths)
This uses the ABCL length function; being able to use Lisp functions to map across data structures would be a huge win. I've already got abcl.jar accessible to Spark on all the nodes of a cluster. I'd probably shadow cl:lambda with a spark:lambda to make the syntax natural.
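Concretely, here is a minimal sketch of what such a spark:lambda might look like, using ABCL's java:jinterface-implementation to wrap a Lisp closure in a proxy for Spark's org.apache.spark.api.java.function.Function interface. The package and macro names are hypothetical, and since Function extends java.io.Serializable, actually shipping the proxy to executors still hinges on the closure-serialization work discussed in this thread:

(defpackage #:spark
  (:use #:cl)
  (:shadow #:lambda)
  (:export #:lambda))

(in-package #:spark)

(defmacro lambda (args &body body)
  "Expand to a Java proxy implementing Spark's Function interface,
whose call method invokes the enclosed Lisp closure."
  `(java:jinterface-implementation
    "org.apache.spark.api.java.function.Function"
    "call" (cl:lambda ,args ,@body)))

With that in place, the example above would become (#"map" *lines* (spark:lambda (s) (length s))), mirroring the Java version.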
On Thursday, July 30, 2020, 7:08:35 PM GMT+8, Alessio Stalla <alessiostalla@gmail.com> wrote:
Somewhere in between. I could give it a shot. It would be useful if Steven detailed his use case a bit more.
On Thu, Jul 30, 2020, 13:04 Mark Evenson evenson@panix.com wrote:
On Jul 30, 2020, at 10:00, Alessio Stalla alessiostalla@gmail.com wrote:
Correction: indeed it was merged, but I didn't go as far as to make closures serializable.
Any idea how much work it would be for someone (i.e. me) to make closures serializable? Just a bit of elbow grease, or a major implementation effort?
-- "A screaming comes across the sky. It has happened before but there is nothing to compare to it now."