Greetings all,
I have what I think is a problem with the ABCL class loader. I am working with the 'big data' library Spark, but run into an issue at line 2 of the programming guide example. I am able to load the JARs from Maven with this ASDF system definition:
(asdf:defsystem #:spark
  :description "Wrapper for Spark 3.0"
  :serial t
  :defsystem-depends-on (abcl-asdf)
  :depends-on (#:jss #:javaparser)
  :components ((:mvn "org.apache.spark/spark-core_2.12" :version "3.0.0")
               (:file "package")
               (:file "spark")))
and can create a SparkConf object:
(defvar *spark-conf*
  #1"new SparkConf()
      .setAppName("abcl-app")
      .setMaster("local")")
But when I try to create a 'context',
(defvar *sc* (new 'JavaSparkContext *spark-conf*))
I get an error in the initialisation:
Java exception 'java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.package$'.
There isn't much written about this error except from the Jenkins guys, who have attempted to run Spark and Spark applications inside a CI system. They seem to think it's related to a call that reads a properties file in the package class, and in a StackOverflow discussion they suggested that "you should make sure that you set the classloader that Spark was loaded through using the Thread.currentThread().setContextClassLoader(myCustomLoader) call".
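If I understand that suggestion, I'd guess it translates to ABCL roughly as follows (just a sketch on my part, using the plain java: interface, and assuming the right loader to install is whichever one actually defined the Spark classes):

(let* ((spark-class (java:jclass "org.apache.spark.SparkConf"))
       ;; the loader that defined the Spark classes
       (spark-loader (java:jcall "getClassLoader" spark-class))
       (thread (java:jstatic "currentThread" "java.lang.Thread")))
  ;; install it as the current thread's context class loader
  (java:jcall "setContextClassLoader" thread spark-loader))

though I haven't been able to confirm that this is the right loader to install.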
I've verified with (java:dump-classpath) that the JAR is on the ABCL classpath, and the JAR file does contain the spark-version-info.properties file. I've also tried getting the file myself with:
(defvar rs
  #1"Thread.currentThread()
      .getContextClassLoader()
      .getResourceAsStream("spark-version-info.properties")")
which returns nil, so their theory may be correct.
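The next check, I suppose, would be to ask the loader that actually loaded the Spark classes for the same resource, something along these lines (untested sketch):

;; ask the loader that defined SparkConf, rather than the
;; thread's context loader, for the same resource
(java:jcall "getResourceAsStream"
            (java:jcall "getClassLoader"
                        (java:jclass "org.apache.spark.SparkConf"))
            "spark-version-info.properties")

If that returns a stream where the context-loader version returns nil, it would seem to confirm their theory.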
Messing around with class loaders is a bit beyond my 20-year-old Java knowledge, so I thought I'd ask here whether anyone has ideas on how I can load Spark in a way that uses the default Java class loader. Alternatively, it occurs to me to ask why the ABCL class loader can't find the properties file when the JAR is on the classpath, and then to correct whatever that problem is.
Cheers, Steve