Error
This error occurs repeatedly while running PySpark code:
TaskMemoryManager: Failed to allocate a page.
Solution
Adding one Spark config to the SparkSession solved the problem for me: set autoBroadcastJoinThreshold to -1.

"spark.sql.autoBroadcastJoinThreshold": "-1"

Setting it to -1 disables automatic broadcast joins, so Spark no longer tries to materialize an entire join side in memory, which can relieve the memory pressure behind this warning.
Code example
from pyspark.sql import SparkSession

spark = (
    SparkSession
    .builder.appName('my_spark')
    # Disable automatic broadcast joins to relieve memory pressure
    .config("spark.sql.autoBroadcastJoinThreshold", "-1")
    .getOrCreate()
)
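If a session is already running, the same setting can also be applied at runtime through spark.conf.set, since spark.sql.autoBroadcastJoinThreshold is a runtime SQL conf. A minimal sketch, assuming an existing session named spark:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('my_spark').getOrCreate()

# Disable automatic broadcast joins on the existing session;
# takes effect for subsequent queries without recreating the session.
spark.conf.set("spark.sql.autoBroadcastJoinThreshold", -1)

# Verify the current value
print(spark.conf.get("spark.sql.autoBroadcastJoinThreshold"))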