hadoop: Why do I need to start HDFS before starting Spark/PySpark?
Running /usr/spark/sbin/start-all.sh fails to start Spark. What can I try? My spark-env.sh is set to: export SPARK_MASTER_IP=127.0.0.1 and export SPARK_LOCAL_IP=127.0.0.1
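For reference, here is a minimal sketch of the configuration and startup order I have in mind; the Hadoop path /usr/hadoop is an assumption (the post only shows /usr/spark), so adjust it to your installation. Newer Spark releases use SPARK_MASTER_HOST in place of the older SPARK_MASTER_IP.

    # /usr/spark/conf/spark-env.sh -- bind the standalone master and this node to loopback
    export SPARK_MASTER_IP=127.0.0.1     # SPARK_MASTER_HOST on Spark 2.x and later
    export SPARK_LOCAL_IP=127.0.0.1

    # Start HDFS first with Hadoop's script (assumed path), then Spark's standalone daemons.
    # Both Hadoop and Spark ship a start-all.sh, so use full paths to avoid mixing them up.
    /usr/hadoop/sbin/start-dfs.sh
    /usr/spark/sbin/start-all.sh

HDFS itself is only required if your Spark jobs read or write HDFS paths (or event logging points at HDFS); a standalone Spark master and worker can start without it.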