On Thursday, July 30, 2020, 11:19:07 AM GMT+8, Steven Nunez <steve_nunez@yahoo.com> wrote:

Is there a way to implement multiple interfaces on a single Java proxy? This code almost works:

(java:jinterface-implementation "org.apache.spark.api.java.function.Function"
                                "call" (lambda (s) (length s)))

except that the proxy also needs to implement Serializable. The jproxy code in java.lisp (https://abcl.org/trac/browser/trunk/abcl/src/org/armedbear/lisp/java.lisp#L118) seems to suggest that multiple implementations are allowed:

(defgeneric jmake-proxy (interface implementation &optional lisp-this)
  (:documentation "Returns a proxy Java object implementing the provided interface(s)...

but I can't see adding multiple implementations in the code. I see there are a few jmake-proxy methods in there, though: is there any documentation, or are there examples of their usage? LSW2 doesn't use this at all, and I can't find any other good examples of using ABCL.

Multiple interfaces from the jinterface-implementation function would be ideal, as the above code could then be wrapped in a macro to produce a 'spark-lambda' and be used nearly like the regular ABCL lambda.
On Thu, 30 Jul 2020 at 05:22, Steven Nunez <steve_nunez@yahoo.com> wrote:

Apologies, when I said "but I can't see adding multiple implementations", I meant multiple interfaces.
On Thu, 30 Jul 2020 at 09:57, Alessio Stalla <alessiostalla@gmail.com> wrote:

You may have luck by providing a list.

However, I see a deeper problem. Serializable is a marker interface: it has no methods; it only declares the type serializable. But you cannot just declare that an object is serializable to make it so: all of its components must be serializable as well. That includes the invocation handler that ABCL creates under the covers, as well as all the Lisp objects you use for the implementation, particularly functions and closures. And, bad news: those aren't serializable. So if Serializable is a requirement because those instances will actually be serialized (e.g., to persist them to a file or to send them over the network), you're out of luck.
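An untested sketch of the list idea, under two assumptions: that one of the jmake-proxy methods dispatches on a list of interface names, and that a function used as the implementation is called with the proxy, the Java method name, and the arguments. Both are guesses from the java.lisp source, not documented behavior:

(java:jmake-proxy '("org.apache.spark.api.java.function.Function"
                    "java.io.Serializable")
                  (lambda (this method-name &rest args)
                    ;; assumed convention: dispatch on the Java method name
                    (declare (ignore this))
                    (when (string= method-name "call")
                      (length (first args)))))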
Ages ago I had started a branch to make most Lisp objects serializable, but I don't remember how far I got. I don't think it was ever mature enough to be merged, but many years have passed.
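Incidentally, a plain java.io round trip is a quick way to check what actually survives serialization; for closures it is expected to fail today:

(defun java-serialization-roundtrip (obj)
  ;; Write OBJ through an ObjectOutputStream and read it back.
  ;; A java.io.NotSerializableException means OBJ, or something it
  ;; holds onto, is not serializable.
  (let* ((baos (java:jnew "java.io.ByteArrayOutputStream"))
         (oos  (java:jnew "java.io.ObjectOutputStream" baos)))
    (#"writeObject" oos obj)
    (#"close" oos)
    (#"readObject"
     (java:jnew "java.io.ObjectInputStream"
                (java:jnew "java.io.ByteArrayInputStream"
                           (#"toByteArray" baos))))))

;; e.g.: (java-serialization-roundtrip (lambda (s) (length s)))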
On Jul 30, 2020, at 10:00, Alessio Stalla <alessiostalla@gmail.com> wrote:

Correction: indeed it was merged, but I didn't go as far as to make closures serializable.
Steven Nunez <steve_nunez@yahoo.com> wrote:

Bugger. I'd hate for this to come to a dead end, as it was looking like an elegant solution. The Spark tuning guide mentions using Kryo to provide serialization in Spark. It's not entirely free, however; you need to 'register' your classes with Kryo for it to work. Would that be sufficient to provide serialization for the needed Lisp objects?
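For concreteness, this is the kind of configuration I had in mind, done through the usual interop. The property name and serializer class are standard Spark; registering ABCL's base LispObject class is only my guess at what "registering your classes" would mean for Lisp:

(let ((conf (java:jnew "org.apache.spark.SparkConf")))
  (#"set" conf "spark.serializer" "org.apache.spark.serializer.KryoSerializer")
  ;; registerKryoClasses takes a Java Class[] array
  (#"registerKryoClasses" conf
   (java:jnew-array-from-list "java.lang.Class"
                              (list (java:jclass "org.armedbear.lisp.LispObject"))))
  conf)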
On Thu, Jul 30, 2020, 13:04, Mark Evenson <evenson@panix.com> wrote:

Any idea how much work it would be for someone (i.e. me) to be able to serialize closures? Just a bit of elbow grease, or a major implementation effort?
On Thursday, July 30, 2020, 7:08:35 PM GMT+8, Alessio Stalla <alessiostalla@gmail.com> wrote:

Somewhere in between. I could give it a shot. It would be useful if Steven detailed his use case a bit more.
On Thu, 30 Jul 2020 at 13:26, Steven Nunez <steve_nunez@yahoo.com> wrote:

Sorry, meant to send this to the ABCL-dev list the first time.

The use case is Spark lambda functions. I couldn't do a better job than the Spark RDD Programming Guide (http://spark.apache.org/docs/latest/rdd-programming-guide.html) does at explaining it; see the "Basics" heading. The ideal case would be the ability to take Java code like this:

JavaRDD<Integer> lineLengths = lines.map(s -> s.length());

and write it in ABCL like this:

(let ((line-lengths (#"map" *lines* (lambda (s) (length s)))))

This uses the ABCL length function, which would be a huge win if we can use Lisp functions to map across data structures. I've already got abcl.jar accessible to Spark on all the nodes of a cluster. I'd probably shadow cl:lambda with a spark:lambda to make the syntax natural.
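The wrapper I'm imagining is roughly this, setting aside the Serializable problem discussed above (spark-lambda is just a name I made up):

(defmacro spark-lambda (args &body body)
  ;; Wrap a Lisp function in a proxy that implements Spark's
  ;; Function interface, so it can be passed to JavaRDD methods
  ;; such as map.
  `(java:jinterface-implementation
    "org.apache.spark.api.java.function.Function"
    "call" (lambda ,args ,@body)))

;; The example above would then read:
;; (let ((line-lengths (#"map" *lines* (spark-lambda (s) (length s)))))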
On Thursday, July 30, 2020, 7:41:15 PM GMT+8, Alessio Stalla <alessiostalla@gmail.com> wrote:

Hmm, Spark being a distributed computing library, I guess you need full serialization of functions/closures. Well, I can give it a shot, but don't expect anything too soon.
On Jul 30, 2020, at 14:38, Alessio Stalla <alessiostalla@gmail.com> wrote:

@Mark Evenson: would a GitHub fork + pull request work for you in case I work on this?
Steven Nunez <steve_nunez@yahoo.com> wrote:

Let me know where/if you get a GitHub repo. To the extent that I can, I'll contribute.
On Jul 30, 2020, at 19:11, Mark Evenson <evenson@panix.com> wrote:

Certainly. I will add your admin rights as a maintainer forthwith!
Mark Evenson <evenson@panix.com> wrote:

Done.

Travis builds via https://travis-ci.org/github/armedbear/abcl are the current test suite run on pull requests.

Welcome,
Mark