On Wednesday, July 22, 2020, 6:59:32 PM GMT+8, Olof-Joachim Frahm <olof@macrolet.net> wrote:
On Wed, Jul 22, 2020 at 09:52:45AM +0000, Steven Nunez wrote:
> I've verified with (java:dump-classpath) that the JAR is on the ABCL
> classpath, and the JAR file does contain the
> spark-version-info.properties file. I've also tried getting the file
> myself with:
> (defvar rs
> #1"Thread.currentThread()
> .getContextClassLoader()
> .getResourceAsStream("spark-version-info.properties")" )
> which returns nil, so their theory may be correct.
> Messing around with class loaders is a bit beyond my 20-year-old Java knowledge [...]
Just to get you a bit unblocked: it seems you can indeed set the current
thread's context class loader, after which the call to create the
`JavaSparkContext` succeeds:
```
;; verify that it doesn't work by default
CL-USER> #1"Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties")"
NIL
;; have to find the right class loader; for me the first one in the list worked
CL-USER> (car (car (dump-classpath)))
#<org.armedbear.lisp.JavaClassLoader org.armedbear.lisp.JavaClassLoad.... {EF8EDD9}>
CL-USER> #1"Thread.currentThread()"
#<java.lang.Thread Thread[repl-thread,5,main] {16B905B3}>
;; JSS argument order: the thread (*) first, then the class loader (**)
CL-USER> (#"setContextClassLoader" * **)
NIL
;; looks like it works
CL-USER> #1"Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties")"
#<sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream sun.net.www.protocol.jar.JarURLC.... {7B45ACFC}>
CL-USER> (defvar *spark-conf* #1"new SparkConf().setAppName("abcl-app").setMaster("local")" )
*SPARK-CONF*
;; important to only attempt this call last, otherwise it might throw
;; errors (cf. *inferior-lisp*) about already having one in the process
;; of being constructed
CL-USER> (defvar *sc* (jss:new 'JavaSparkContext *spark-conf*))
*SC*
```
Hopefully there's a better way, of course, since this is hardly
convenient.
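
For what it's worth, here's a rough sketch of how the workaround could be
wrapped up so it's a single call (untested; `make-spark-context` is just a
name I made up, and it assumes the first class loader returned by
`(java:dump-classpath)` is the right one, as it was in the transcript
above):

```
;; Hypothetical helper: point the current thread's context class loader
;; at ABCL's JavaClassLoader, then build the SparkConf and
;; JavaSparkContext in one go.
(defun make-spark-context (app-name master)
  (let ((thread (java:jstatic "currentThread" "java.lang.Thread"))
        ;; assumes the first loader in the classpath dump is the one
        ;; that can see the Spark jars
        (loader (car (car (java:dump-classpath)))))
    (#"setContextClassLoader" thread loader)
    ;; SparkConf setters return the conf itself, so the calls chain
    (let ((conf (#"setMaster" (#"setAppName" (jss:new 'SparkConf) app-name)
                 master)))
      (jss:new 'JavaSparkContext conf))))

;; e.g. (defvar *sc* (make-spark-context "abcl-app" "local"))
```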