Greetings all,

I have what I think is a problem with the ABCL class loader. I am working with a 'big data' library, Spark, but run into an issue on line 2 of the programming guide example. I am able to load the JARs from Maven with the ASDF system definition:

```
(asdf:defsystem #:spark
  :description "Wrapper for Spark 3.0"
  :serial t
  :defsystem-depends-on (abcl-asdf)
  :depends-on (#:jss #:javaparser)
  :components ((:mvn "org.apache.spark/spark-core_2.12" :version "3.0.0")
               (:file "package")
               (:file "spark")))
```
and can create a SparkConf object:

```
(defvar *spark-conf*
  #1"new SparkConf()
       .setAppName("abcl-app")
       .setMaster("local")")
```
But when I try to create a 'context':

```
(defvar *sc* (new 'JavaSparkContext *spark-conf*))
```

I get an error in the initialisation:

```
Java exception 'java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.package$'.
```
There isn't much on this except from the Jenkins guys, who have attempted to put Spark and Spark applications into a CI system. They seem to think that it's related to a call that reads a properties file in the package class, and in a StackOverflow discussion suggested that "you should make sure that you set the classloader that Spark was loaded through using the Thread.currentThread().setContextClassLoader(myCustomLoader) call".
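In plain Java, the suggested fix looks roughly like the sketch below. `myCustomLoader` is just a stand-in `URLClassLoader` here; in the ABCL case it would be whatever loader actually loaded the Spark JARs:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ContextLoaderFix {
    public static void main(String[] args) {
        // Stand-in for the class loader that actually loaded the Spark JARs
        // (in ABCL, the org.armedbear.lisp.JavaClassLoader).
        ClassLoader myCustomLoader =
                new URLClassLoader(new URL[0], ContextLoaderFix.class.getClassLoader());

        // The suggested fix: make that loader the context class loader of the
        // current thread, so libraries that resolve resources through
        // Thread.currentThread().getContextClassLoader() can see them.
        Thread.currentThread().setContextClassLoader(myCustomLoader);

        System.out.println(Thread.currentThread().getContextClassLoader() == myCustomLoader);
    }
}
```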
I've verified with (java:dump-classpath) that the JAR is on the ABCL classpath, and the JAR file does contain the spark-version-info.properties file. I've also tried getting the file myself with:

```
(defvar rs
  #1"Thread.currentThread()
       .getContextClassLoader()
       .getResourceAsStream("spark-version-info.properties")")
```

which returns nil, so their theory may be correct. Messing around with class loaders is a bit beyond my 20-year-old Java knowledge, so I thought I'd ask here if anyone has any ideas on how I can load Spark in a way that uses the default Java class loader. Alternatively, it occurs to me to ask why the ABCL class loader isn't able to find the properties file if the JAR is on the classpath, and then to correct whatever that problem is.
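For what it's worth, the nil result can be reproduced in plain Java: getResourceAsStream only finds a resource if the particular loader you ask (or one of its parents) knows about the JAR or directory containing it. A minimal sketch — the file name mirrors Spark's, but the temp directory is just a stand-in for the Spark JAR:

```java
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class ResourceLookup {
    public static void main(String[] args) throws Exception {
        // A directory with a properties file, standing in for the Spark JAR
        // that contains spark-version-info.properties.
        Path dir = Files.createTempDirectory("spark-jar-standin");
        Files.writeString(dir.resolve("spark-version-info.properties"), "version=3.0.0\n");

        // A loader that knows about that directory finds the resource...
        ClassLoader jarLoader = new URLClassLoader(new URL[] { dir.toUri().toURL() });
        InputStream found = jarLoader.getResourceAsStream("spark-version-info.properties");
        System.out.println(found != null);

        // ...while a loader that never saw it returns null -- the analogue of
        // the NIL that the context class loader returns in the ABCL session.
        InputStream missing = ClassLoader.getSystemClassLoader()
                .getResourceAsStream("spark-version-info.properties");
        System.out.println(missing == null);
    }
}
```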
Cheers, Steve
24 hours later and little progress. I have determined that moving the properties file into the ABCL project directory enables me to get an InputStream on it from ABCL, but the application library still fails to load.
It (still) looks like a class loader issue. What I'd really like is a macro along the lines of:

```
(with-class-loader 'foo
  ...)
```

which would quickly confirm or eliminate that hypothesis. Anyone know if one exists, or something similar?

Cheers, Steve
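Such a macro would essentially be the Lisp analogue of the usual Java swap-and-restore pattern around the context class loader. A hypothetical `withContextClassLoader` helper (not an existing API, just a sketch of the idea) would look like:

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.util.function.Supplier;

public class WithClassLoader {
    // Hypothetical helper: install a loader as the thread's context class
    // loader, run the body, and always restore the previous loader.
    static <T> T withContextClassLoader(ClassLoader loader, Supplier<T> body) {
        Thread thread = Thread.currentThread();
        ClassLoader previous = thread.getContextClassLoader();
        thread.setContextClassLoader(loader);
        try {
            return body.get();
        } finally {
            thread.setContextClassLoader(previous);
        }
    }

    public static void main(String[] args) {
        ClassLoader original = Thread.currentThread().getContextClassLoader();
        ClassLoader other = new URLClassLoader(new URL[0]);

        ClassLoader seenInside = withContextClassLoader(other,
                () -> Thread.currentThread().getContextClassLoader());

        // Inside the body the swapped-in loader is visible; afterwards the
        // original context class loader is back in place.
        System.out.println(seenInside == other);
        System.out.println(Thread.currentThread().getContextClassLoader() == original);
    }
}
```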
On Tuesday, July 21, 2020, 3:22:13 PM GMT+8, Steven Nunez steve_nunez@yahoo.com wrote:
On Wed, Jul 22, 2020 at 09:52:45AM +0000, Steven Nunez wrote:
I've verified with (java:dump-classpath) that the JAR is on the ABCL classpath, and the JAR file does contain the spark-version-info.properties file. I've also tried getting the file myself with: (defvar rs #1"Thread.currentThread() .getContextClassLoader() .getResourceAsStream("spark-version-info.properties")" ) which returns nil, so their theory may be correct. Messing around with class loaders is a bit beyond my 20 year old Java knowledge [...]
Just to get you a bit unblocked, it seems you can indeed set the current context class loader and then the call to create the `JavaSparkContext` succeeds:
```
# verify that it doesn't work by default
CL-USER> #1"Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties")"
NIL

# have to find the right one, for me the first one in the list worked
CL-USER> (car (car (dump-classpath)))
#<org.armedbear.lisp.JavaClassLoader org.armedbear.lisp.JavaClassLoad.... {EF8EDD9}>

CL-USER> #1"Thread.currentThread()"
#<java.lang.Thread Thread[repl-thread,5,main] {16B905B3}>

# well, thread first, then class loader
CL-USER> (#"setContextClassLoader" * **)
NIL

# looks like it works
CL-USER> #1"Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties")"
#<sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream sun.net.www.protocol.jar.JarURLC.... {7B45ACFC}>

CL-USER> (defvar *spark-conf* #1"new SparkConf().setAppName("abcl-app").setMaster("local")")
*SPARK-CONF*

# important to only attempt this call last, otherwise it might throw errors
# (cf. *inferior-lisp*) about already having one in the process of being constructed
CL-USER> (defvar *sc* (jss:new 'JavaSparkContext *spark-conf*))
*SC*
```
Hopefully there's a better way of course, since this is hardly convenient.
On Jul 22, 2020, at 12:58, Olof-Joachim Frahm olof@macrolet.net wrote:
I’ve started to tool around with getting Spark working, but it doesn’t quite work for me yet.
My current progress is in [Ember][]
[Ember]: https://github.com/easye/ember.git
Thank you Olof; that was just what I needed to get things working. Well, that and another half day struggling with what turns out to be a 5-year-old bug (#338). I really wish someone had mentioned in the JSS documentation, "Oh, and this doesn't yet work in top-level forms". All told, it was a lot more difficult to get started with ABCL than I expected it to be, but I'm glad it's done and grateful to those that helped. For reference, here's the code that finally works:

```
(defun change-class-loader ()
  (#"setContextClassLoader" #1"Thread.currentThread()"
                            (java:get-current-classloader)))
(change-class-loader)

(defun make-spark-config (&key (app-name "abcl-app") (conf-master "local"))
  "Return a spark configuration. Required to work around ABCL bug 338,
otherwise we'd just do this in a top-level form.
See https://abcl.org/trac/ticket/338"
  (let ((conf (jss:new (jss:find-java-class "org.apache.spark.SparkConf"))))
    (java:chain conf
                ("setAppName" app-name)
                ("setMaster" conf-master))))

(defun make-spark-context (spark-config)
  (jss:new 'JavaSparkContext spark-config))

;;; Now we can create our context and configuration object
(defvar *spark-conf* (make-spark-config))
(defvar *sc* (make-spark-context *spark-conf*))
```

At least it gets me as far as line two of the Spark 'hello world'; hopefully there aren't any other surprises lurking. If anyone can recommend any best practices or improvements, especially around the class loader bits, I'd be very happy to hear them.
Regards, Steve On Wednesday, July 22, 2020, 6:59:32 PM GMT+8, Olof-Joachim Frahm olof@macrolet.net wrote:
On Jul 23, 2020, at 08:37, Steven Nunez steve_nunez@yahoo.com wrote:
Hopefully there's a better way of course, since this is hardly convenient.
Slightly more convenient perhaps is to change the context ClassLoader in the ASDF :PERFORM clause https://github.com/easye/ember/blob/master/ember.asd#L7 before the Spark Maven artifacts are loaded:
```
(defsystem ember
  :description "Wrapper for Spark 3.0"
  :defsystem-depends-on (jss abcl-asdf)
  :depends-on (#:jss #:javaparser)
  :perform (load-op (o c)
             (#"setContextClassLoader" (#"currentThread" 'Thread)
                                       (java:get-current-classloader))
             (call-next-method o c))
  :components ((:mvn "org.apache.spark/spark-core_2.12" :version "3.0.0")
               (:file "package")
               (:file "ember")))
```
Such a setting of the context ClassLoader seems to be quite useful for integrating quite a few Java libraries with ABCL, so I wonder whether it shouldn't be the default for ABCL-ASDF to set the context ClassLoader in this manner when loading Maven artifacts. Would such a choice adversely affect anyone's current usage? I almost exclusively use ABCL via SLIME, so maybe production use of ABCL (i.e. as a standalone packaged application) would run into problems here. Thoughts?
For those using openjdk17, one needs to set the [following run-time switch in the JVM][1]:

```
--add-opens java.base/sun.nio.ch=ALL-UNNAMED
```
As a bonus, that annoying [seven (!) year old bug][338] has recently been fixed by Alejandro Zamora Fonseca for the [upcoming abcl-1.9.1 release][1.9.1].
[1]: https://stackoverflow.com/questions/72230174/java-17-solution-for-spark-java-lang-noclassdeffounderror-could-not-initializ
[338]: https://abcl.org/trac/ticket/338
[1.9.1]: https://github.com/armedbear/abcl/pull/534
On 2/6/23 10:09, Mark Evenson wrote: […]
I've [sketched out an implementation][1] of three symbols in the JAVA package for working with the context classloader:
```
(defun classloader (&optional java-object) …)

(defun context-classloader (&optional java-thread) …)

(defmacro with-classloader ((thread-context) &body body) …)
```
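In plain Java terms, the first two presumably reduce to the standard accessors (a sketch of the intended semantics, not the actual implementation in the pull request):

```java
public class LoaderAccessors {
    public static void main(String[] args) {
        // classloader(java-object): the defining loader of the object's class.
        // For a platform class like java.lang.String that is the bootstrap
        // loader, which Java reports as null.
        System.out.println("x".getClass().getClassLoader() == null);

        // context-classloader(java-thread): the per-thread context loader,
        // non-null by default for application threads.
        System.out.println(Thread.currentThread().getContextClassLoader() != null);
    }
}
```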
The implementation of these symbols will likely make it into abcl-1.9.1 barring any major objections.
[1]: https://github.com/armedbear/abcl/pull/548
armedbear-devel@common-lisp.net