If this memory is allocated off-heap, why isn't it being cleaned up?
One instance of org.apache.spark.unsafe.memory.HeapMemoryAllocator loaded by jdk.internal.loader.ClassLoaders$AppClassLoader 1,61,06,14,312 (89.24%) bytes. The memory is accumulated in one instance of java.util.LinkedList, loaded by <system class loader>, which occupies 1,61,06,14,112 (89.24%) bytes.
I also checked the "Storage" tab in the Spark UI, and it does not show any cached RDDs.
Have you looked at Spark tuning?
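One detail worth noting: `org.apache.spark.unsafe.memory.HeapMemoryAllocator` is Spark's *on-heap* Tungsten allocator, so the accumulated memory in that report is on the JVM heap; Spark only switches to its unsafe off-heap allocator when off-heap mode is explicitly enabled. As a starting point for tuning, the standard memory-related configuration keys could be reviewed along these lines (a sketch with illustrative values, not a recommendation for this workload):

```properties
# spark-defaults.conf (illustrative values only)
spark.memory.fraction        0.6   # fraction of (heap - 300MB) for execution + storage
spark.memory.storageFraction 0.5   # share of the above protected from eviction
spark.memory.offHeap.enabled true  # use off-heap Tungsten allocation instead of HeapMemoryAllocator
spark.memory.offHeap.size    2g    # must be set when off-heap is enabled
```

With `spark.memory.offHeap.enabled=false` (the default), Tungsten allocations go through `HeapMemoryAllocator` and will show up in heap dumps like the one quoted above.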