Note:
Configure either jobmanager.memory.heap.size or jobmanager.memory.process.size, not both; process.size must be larger than heap.size, since it covers the whole JVM process (heap plus off-heap, metaspace and JVM overhead) rather than the heap alone.
classloader.check-leaked-classloader: false and taskmanager.memory.process.size: 5120m are workarounds for errors hit when the job was first started, which is why they are included here.
env.hadoop.conf.dir: /opt/hadoop/etc/hadoop,
env.java.home: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.181-7.b13.el7.x86_64/jre and
env.yarn.conf.dir: /opt/hadoop/etc/hadoop only pass environment settings to Flink. If your /etc/profile already exports the corresponding paths (a sketch follows the configuration block below), only the three non-env.* entries are needed:
## jobmanager.memory.heap.size: 1024m
jobmanager.memory.process.size: 2048m
classloader.check-leaked-classloader: false
taskmanager.memory.process.size: 5120m
env.hadoop.conf.dir: /opt/hadoop/etc/hadoop
env.java.home: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.181-7.b13.el7.x86_64/jre
env.yarn.conf.dir: /opt/hadoop/etc/hadoop
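If you prefer to rely on the shell environment instead of the env.* entries, a minimal /etc/profile sketch is shown below; the paths are the same ones used in the configuration above and are assumptions that must match your own installation:

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.181-7.b13.el7.x86_64/jre
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop
export YARN_CONF_DIR=/opt/hadoop/etc/hadoop
# reload the profile in the current shell before submitting the job
source /etc/profile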
Test:
Remember to submit from the node where the ResourceManager is in the active state!
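If ResourceManager HA is enabled, you can confirm which RM is currently active before submitting. A minimal sketch, assuming the RM IDs configured in yarn.resourcemanager.ha.rm-ids of yarn-site.xml are rm1 and rm2 (adjust to your cluster):

yarn rmadmin -getServiceState rm1
yarn rmadmin -getServiceState rm2

Then submit the bundled WordCount example: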
/opt/flink-1.13.6/bin/flink run -t yarn-per-job /opt/flink/examples/batch/WordCount.jar
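The same memory settings can also be overridden per submission with -D dynamic properties instead of editing flink-conf.yaml. A sketch reusing the values from the configuration above (an optional alternative, not a required step):

/opt/flink-1.13.6/bin/flink run -t yarn-per-job \
  -Djobmanager.memory.process.size=2048m \
  -Dtaskmanager.memory.process.size=5120m \
  /opt/flink/examples/batch/WordCount.jar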