Django + Celery + PySpark inside Docker raises SystemExit: 1 and NoSuchFileException when creating SparkSession
I'm running a Django application that uses Celery tasks and PySpark inside a Docker container. One of the Celery tasks calls a function that initializes a SparkSession with SparkSession.builder.getOrCreate(). As soon as that call runs, the worker exits unexpectedly with SystemExit: 1, the Spark JVM process throws a NoSuchFileException, and the worker is finally killed with SIGKILL.
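For reference, the task looks roughly like this (a simplified sketch; the task name, app name, and the DataFrame work are placeholders for my real code):

```python
# tasks.py -- minimal sketch of the failing task; names are placeholders
from celery import shared_task
from pyspark.sql import SparkSession

@shared_task
def process_data():
    # The worker dies as soon as this getOrCreate() call runs.
    spark = (
        SparkSession.builder
        .master("local[*]")
        .appName("celery-spark-task")
        .getOrCreate()
    )
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
    return df.count()
```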
Here is the relevant part of the worker log and stack trace:
SystemExit: 1
[INFO] Worker exiting (pid: 66009)
...
WARN NativeCodeLoader: Unable to load native-hadoop library for your platform...
WARN DependencyUtils: Local jar /...antlr4-4.9.3.jar does not exist, skipping.
...
Exception in thread "main" java.nio.file.NoSuchFileException: /tmp/tmpagg4d47k/connection8081375827469483762.info
...
[ERROR] Worker (pid:66009) was sent SIGKILL! Perhaps out of memory?
How can I solve this? From what I can tell, the connection*.info file is a temporary file PySpark uses to pass gateway connection details between the Python driver and the JVM it spawns, so it looks as if that file (or its temp directory) is removed before the JVM can use it. Is the SIGKILL really an out-of-memory problem, as the log suggests, or just a consequence of the worker exiting?