
Spark 2.0 Source Code Reading: Dissecting spark-shell


Contents: the spark-shell script, the spark-submit script, the spark-class script, and a summary.


The spark-shell script
function main() {
  if $cygwin; then
    # Workaround for issue involving JLine and Cygwin
    # (see http://sourceforge.net/p/jline/bugs/40/).
    # If you're using the Mintty terminal emulator in Cygwin, may need to set the
    # "Backspace sends ^H" setting in "Keys" section of the Mintty options
    # (see https://github.com/sbt/sbt/issues/562).
    stty -icanon min 1 -echo > /dev/null 2>&1
    export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
    "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
    stty icanon echo > /dev/null 2>&1
  else
    export SPARK_SUBMIT_OPTS
    "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
  fi
}

We can see that this script ends up invoking bin/spark-submit, which expands to:
/usr/local/spark/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell"
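
If you want to confirm this hand-off on your own machine, one simple check (a sketch; it assumes Spark is installed under /usr/local/spark and that --version makes SparkSubmit print the version and exit instead of starting the REPL) is to run the script under bash -x and look for the spark-submit call in the trace:

# Run spark-shell with shell tracing enabled and show the first spark-submit
# invocation that appears in the trace output (xtrace goes to stderr).
bash -x /usr/local/spark/bin/spark-shell --version 2>&1 | grep -m1 'bin/spark-submit'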

The spark-submit script
if [ -z "${SPARK_HOME}" ]; then
  source "$(dirname "$0")"/find-spark-home
fi

# disable randomized hash for string in Python 3.3+
export PYTHONHASHSEED=0

exec "${SPARK_HOME}"/bin/spark-class org.apache.spark.deploy.SparkSubmit "$@"

This, in turn, executes bin/spark-class:
/usr/local/spark/bin/spark-class org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell"
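
The only real work spark-submit does is forwarding: it pins the main class to org.apache.spark.deploy.SparkSubmit and passes every user argument through "$@". The standalone sketch below (forward.sh is a made-up name used only for illustration) shows why an argument containing a space, such as Spark shell, survives each script hop as a single argument:

#!/usr/bin/env bash
# forward.sh (hypothetical): print each received argument on its own line.
# "$@" expands each original argument as a separate word, preserving spaces
# inside arguments, which is how --name "Spark shell" stays one argument
# across spark-shell -> spark-submit -> spark-class.
printf '[%s]\n' "$@"

Running ./forward.sh --class org.apache.spark.repl.Main --name "Spark shell" prints [--class], [org.apache.spark.repl.Main], [--name], and [Spark shell]: four arguments, not five.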

The spark-class script
# Find the java binary
if [ -n "${JAVA_HOME}" ]; then
  RUNNER="${JAVA_HOME}/bin/java"
else
  if [ "$(command -v java)" ]; then
    RUNNER="java"
  else
    echo "JAVA_HOME is not set" >&2
    exit 1
  fi
fi
...
build_command() {
  "$RUNNER" -Xmx128m -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@"
  printf "%d\0" $?
}
...
exec "${CMD[@]}"
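
Between build_command and the final exec, spark-class collects the launcher's output into the CMD array. The snippet below is a paraphrase of that part of the script (details vary slightly across Spark versions): build_command prints every token of the final java command separated by NUL bytes, followed by its own exit code, and the loop reads those tokens back into a bash array:

# Read the NUL-delimited tokens printed by build_command into the CMD array.
CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <(build_command "$@")
# The last element is build_command's exit code; spark-class checks it, strips
# it off, and then runs: exec "${CMD[@]}"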

Commands actually executed

spark-class first runs the launcher to build the command line:

/usr/local/java/bin/java -Xmx128m -cp /usr/local/spark/jars/ org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell"

and then execs the java command the launcher produced:

/usr/local/java/bin/java -cp /usr/local/spark/conf/:/usr/local/spark/jars/ -Dscala.usejavacp=true -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell" spark-shell
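
If you would rather have Spark print this assembled command for you than reconstruct it from the scripts, the launcher honors the SPARK_PRINT_LAUNCH_COMMAND environment variable (present in Spark 1.x and 2.x; the exact output format may differ by version):

# Ask the launcher to echo the final "Spark Command: ..." line before starting the REPL.
SPARK_PRINT_LAUNCH_COMMAND=1 /usr/local/spark/bin/spark-shell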


Summary

As you can see, these scripts do nothing more than assemble a java command line with org.apache.spark.deploy.SparkSubmit as the main class and launch the application with it.
