Thank you, Olof; that was just what I needed to get things working. Well, that and another half-day struggling with what turns out to be a five-year-old bug (#388). I really wish the JSS documentation had mentioned, "Oh, and this doesn't yet work in top-level forms." All told, it was a lot more difficult to get started with ABCL than I expected, but I'm glad it's done and grateful to those who helped. For reference, here's the code that finally works:

(defun change-class-loader ()
  "Install ABCL's class loader as the current thread's context class
loader, so that Spark can find resources on the ABCL classpath."
  (#"setContextClassLoader" #1"Thread.currentThread()" (java:get-current-classloader)))
(change-class-loader)

(defun make-spark-config (&key (app-name "abcl-app") (conf-master "local"))
  "Return a Spark configuration.
Required to work around ABCL bug 388; otherwise we'd just do this in a
top-level form. See https://abcl.org/trac/ticket/388"
  (let ((conf (jss:new (jss:find-java-class "org.apache.spark.SparkConf"))))
    (java:chain conf
                ("setAppName" app-name)
                ("setMaster" conf-master))))

(defun make-spark-context (spark-config)
  (jss:new 'JavaSparkContext spark-config))

;;; Now we can create our context and configuration object
(defvar *spark-conf* (make-spark-config))
(defvar *sc* (make-spark-context *spark-conf*))
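
For anyone following along, the whole setup can be bundled into a single entry point so the class-loader fix is guaranteed to run before the context is created. This is just a sketch on top of the functions above; `start-spark` is a name I made up:

```
(defun start-spark (&key (app-name "abcl-app") (master "local"))
  "Fix the context class loader, then build a configuration and return
a live JavaSparkContext."
  (change-class-loader)
  (make-spark-context (make-spark-config :app-name app-name
                                         :conf-master master)))
```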

At least it gets me as far as line two of the Spark 'hello world'; hopefully there aren't any other surprises lurking. If anyone can recommend best practices or improvements, especially around the class-loader bits, I'd be very happy to hear them.
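
For instance, I'm wondering whether something like this scoped version would be safer, restoring the previous class loader when the body exits. Entirely untested; `with-abcl-class-loader` is just a name I invented:

```
(defmacro with-abcl-class-loader (&body body)
  "Run BODY with ABCL's class loader installed as the thread context
class loader, restoring the previous loader afterwards."
  (let ((thread (gensym "THREAD"))
        (old (gensym "OLD")))
    `(let* ((,thread (java:jstatic "currentThread" "java.lang.Thread"))
            (,old (#"getContextClassLoader" ,thread)))
       (#"setContextClassLoader" ,thread (java:get-current-classloader))
       (unwind-protect
            (progn ,@body)
         ;; put the original loader back even if BODY signals
         (#"setContextClassLoader" ,thread ,old)))))
```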

Regards,
    Steve
On Wednesday, July 22, 2020, 6:59:32 PM GMT+8, Olof-Joachim Frahm <olof@macrolet.net> wrote:


On Wed, Jul 22, 2020 at 09:52:45AM +0000, Steven Nunez wrote:

> I've verified with (java:dump-classpath) that the JAR is on the ABCL
> classpath, and the JAR file does contain the
> spark-version-info.properties file. I've also tried getting the file
> myself with:
> (defvar rs
>   #1"Thread.currentThread()
>     .getContextClassLoader()
>     .getResourceAsStream("spark-version-info.properties")" )
> which returns nil, so their theory may be correct.
> Messing around with class loaders is a bit beyond my 20 year old Java knowledge [...]


Just to get you a bit unblocked, it seems you can indeed set the current
context class loader and then the call to create the `JavaSparkContext`
succeeds:

```
# verify that it doesn't work by default
CL-USER> #1"Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties")"
NIL

# have to find the right one, for me the first one in the list worked
CL-USER> (car (car (dump-classpath)))
#<org.armedbear.lisp.JavaClassLoader org.armedbear.lisp.JavaClassLoad.... {EF8EDD9}>

CL-USER> #1"Thread.currentThread()"
#<java.lang.Thread Thread[repl-thread,5,main] {16B905B3}>

# well, thread first, then class loader
CL-USER> (#"setContextClassLoader" * **)
NIL

# looks like it works
CL-USER> #1"Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties")"
#<sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream sun.net.www.protocol.jar.JarURLC.... {7B45ACFC}>

CL-USER> (defvar *spark-conf* #1"new SparkConf().setAppName("abcl-app").setMaster("local")" )
*SPARK-CONF*

# important to only attempt this call last, otherwise it might throw
# errors (c.f. *inferior-lisp*) about already having one in the process
# of being constructed
CL-USER> (defvar *sc* (jss:new 'JavaSparkContext *spark-conf*))
*SC*
```

Hopefully there's a better way of course, since this is hardly
convenient.