Is it possible to check whether a DataFrame exists in PySpark?

Locally in R, I know how to check whether a data frame exists:

exists(df_name) && is.data.frame(get(df_name))

How can this be done in PySpark? Calling exists there raises an error.
from pyspark.sql import DataFrame

# Build a small DataFrame from an RDD (assumes an active SparkContext `sc`)
df = sc.parallelize([(1, 2, 3), (4, 5, 7)]).toDF(["a", "b", "c"])

# `df` is bound and is actually a Spark DataFrame
if df is not None and isinstance(df, DataFrame):
    # <some operation>
    print("dataframe exists")
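Note that the `isinstance` check above assumes the name `df` is already bound; if it might be undefined, merely referencing it raises `NameError`. A closer analogue of R's `exists(df_name) && is.data.frame(get(df_name))` is to look the name up in a namespace dictionary first. A minimal sketch, where the helper name `exists_as` is hypothetical (not part of any library):

```python
def exists_as(name, cls, namespace=None):
    """Return True only if `name` is bound in `namespace` (default: globals())
    and the bound value is an instance of `cls` -- mirroring R's
    exists(df_name) && is.data.frame(get(df_name))."""
    ns = globals() if namespace is None else namespace
    return name in ns and isinstance(ns[name], cls)

# With PySpark you would pass pyspark.sql.DataFrame as `cls`:
#   from pyspark.sql import DataFrame
#   if exists_as("df", DataFrame):
#       <some operation>

# Demonstration with an explicit namespace and a plain Python type:
ns = {"xs": [1, 2, 3]}
print(exists_as("xs", list, ns))   # True: name bound and value is a list
print(exists_as("df", list, ns))   # False: name not bound in this namespace
```

This avoids the `NameError` entirely, because the name is tested for membership in the namespace before its value is retrieved.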