I am trying to use a Livy statement to submit a piece of Scala code that reads a Lucene index and then reads S3 files. In other words, I am trying to use Spark/Livy to search data. Is that possible? How do I include the Lucene jars in the session? I assume I need to set them up when calling the Livy API to create the session.
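Something like this is what I have in mind for the session creation, with the Livy host, port and jar locations below standing in for my real ones (the jars would have to live somewhere the cluster can read, e.g. S3 or HDFS):

curl -X POST http://livy-host:8998/sessions \
  -H "Content-Type: application/json" \
  -d '{
        "kind": "spark",
        "jars": [
          "s3://my-bucket/jars/lucene-core-9.6.0.jar",
          "s3://my-bucket/jars/lucene-queryparser-9.6.0.jar"
        ]
      }'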
I even tried this in spark-shell first. I copied lucene-queryparser-9.6.0.jar and lucene-core-9.6.0.jar to a directory and then ran:
./spark-shell --jars /jar_location
But inside the Scala prompt, val analyzer = new StandardAnalyzer() gives a "not found" error.
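From reading the spark-submit docs, I believe --jars wants a comma-separated list of the jar files themselves rather than a directory, and that StandardAnalyzer needs an import, so I think the shell session should look roughly like this (the /jar_location paths are placeholders for my actual directory):

./spark-shell --jars /jar_location/lucene-core-9.6.0.jar,/jar_location/lucene-queryparser-9.6.0.jar

and then at the prompt:

import org.apache.lucene.analysis.standard.StandardAnalyzer
val analyzer = new StandardAnalyzer()

Is that the right way to pass the jars?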
How do I make sure the jar is included in spark-shell? I opened the Spark context Web UI, and I cannot see my jar there.
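My understanding (which may be wrong) is that jars passed with --jars should appear under Classpath Entries on the Environment tab, and that they can also be listed from the prompt with:

sc.listJars().foreach(println)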
Now back to Livy: I assume it is even harder there. Is it even possible?
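If it is possible, I imagine that once the session above is created with the jars, the search code would be submitted as a statement, roughly like this (the session id 0 and the code string are just a sketch):

curl -X POST http://livy-host:8998/sessions/0/statements \
  -H "Content-Type: application/json" \
  -d '{"code": "import org.apache.lucene.analysis.standard.StandardAnalyzer; val analyzer = new StandardAnalyzer()"}'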
Thanks.