runtime error

JAVA_HOME is not set
Traceback (most recent call last):
  File "/home/user/app/app.py", line 24, in <module>
    df_chapters = create_spark_dataframe(text)
  File "/home/user/app/app.py", line 15, in create_spark_dataframe
    spark = SparkSession.builder.appName("Counting word occurrences from a book, under a microscope.").config("spark.driver.memory", "4g").getOrCreate()
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/sql/session.py", line 477, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/context.py", line 512, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/context.py", line 198, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/context.py", line 432, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/java_gateway.py", line 106, in launch_gateway
    raise RuntimeError("Java gateway process exited before sending its port number")
RuntimeError: Java gateway process exited before sending its port number
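The traceback points at one root cause: PySpark launches a JVM subprocess ("Java gateway") when building the `SparkSession`, and no Java runtime is installed or `JAVA_HOME` is unset, so the gateway dies before it can report its port. A minimal defensive sketch follows, assuming nothing beyond the Python standard library; the helper name `find_java_home` is hypothetical, not part of PySpark's API. It tries to locate a JVM before the session is created, so the failure surfaces as a clear message instead of the opaque gateway error:

```python
import os
import shutil

def find_java_home():
    """Return a plausible JAVA_HOME, or None if no JVM can be located."""
    java_home = os.environ.get("JAVA_HOME")
    if java_home:
        return java_home
    java = shutil.which("java")
    if java:
        # Resolve symlinks (e.g. /usr/bin/java -> /usr/lib/jvm/.../bin/java)
        # and strip the trailing /bin/java to recover the JVM home directory.
        return os.path.dirname(os.path.dirname(os.path.realpath(java)))
    return None

java_home = find_java_home()
if java_home is None:
    print("No JVM found: install a JDK (e.g. `apt install default-jdk`) "
          "or set JAVA_HOME before creating the SparkSession.")
else:
    # Export it so PySpark's gateway launcher can see it.
    os.environ.setdefault("JAVA_HOME", java_home)
```

If this app runs on a container platform that installs apt packages from a manifest (Hugging Face Spaces uses a `packages.txt` file for this), adding a JDK entry such as `default-jdk` there is typically the actual fix; the check above only makes the failure easier to diagnose.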