git clone https://github.com/jvm-profiling-tools/async-profiler
# https://github.com/wankunde/async-profiler
make
# The build directory now contains three files
ls build/
async-profiler.jar jattach libasyncProfiler.so
# Package everything as a zip and upload it to HDFS
zip -r async-profiler.zip profiler.sh build/
A Java program can manage profiling of itself by instantiating an AsyncProfiler object.
When testing locally, the program needs to load the async-profiler native library.
On Linux, this can be done by adding a library path to the local Java test run:
-Djava.library.path=/Users/wankun/ws/wankun/async-profiler/build/
This approach does not work on macOS, because macOS expects dynamic libraries in dylib format, while make produces a .so file. At runtime the program tries to load /Users/wankun/ws/wankun/async-profiler/build/libasyncProfiler.so or libasyncProfiler.dylib.
So for testing you can instead pass the native library path directly:
var profiler: AsyncProfiler = AsyncProfiler.getInstance("/Users/wankun/ws/wankun/async-profiler/build/libasyncProfiler.so")
Alternatively, load the shared library with the -agentpath JVM argument and comment out the System.loadLibrary("asyncProfiler"); call, since the library has already been loaded:
-agentpath:/Users/wankun/ws/wankun/async-profiler/build/libasyncProfiler.so
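The agent can also start profiling at JVM startup by appending options after the library path, following async-profiler's agent-argument syntax (the event and output file chosen below are illustrative assumptions):

```shell
# Build the -agentpath flag with startup options appended after '='.
LIB=/Users/wankun/ws/wankun/async-profiler/build/libasyncProfiler.so
OPTS=start,event=cpu,file=/tmp/profile.svg
echo "-agentpath:${LIB}=${OPTS}"
# -agentpath:/Users/wankun/ws/wankun/async-profiler/build/libasyncProfiler.so=start,event=cpu,file=/tmp/profile.svg
```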
# Cluster node configuration
sysctl -w kernel.perf_event_paranoid=1
sysctl -w kernel.kptr_restrict=0
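These sysctl -w changes do not survive a reboot; a common way to persist them (shown as a sketch, requires root) is via /etc/sysctl.conf:

```shell
# Append the perf settings so they are re-applied at boot, then reload.
cat >> /etc/sysctl.conf <<'EOF'
kernel.perf_event_paranoid = 1
kernel.kptr_restrict = 0
EOF
sysctl -p
```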
# Upload the zip package
HADOOP_USER_NAME=schedule hdfs dfs -mkdir -p /deploy/config/async-profiler
HADOOP_USER_NAME=schedule hdfs dfs -put -f async-profiler.zip /deploy/config/async-profiler/
HADOOP_USER_NAME=schedule hdfs dfs -setrep 10 /deploy/config/async-profiler/async-profiler.zip
# Spark configuration
Configuration :+ ("spark.yarn.dist.archives",
"hdfs:///deploy/config/async-profiler/async-profiler.zip#async-profiler")
Configuration :+ ("spark.yarn.dist.files", "hdfs:///deploy/config/profile.sh")
Configuration :+ ("spark.executor.extraLibraryPath", "./async-profiler/build/")
Configuration :+ ("spark.executor.plugins", classOf[ProfileExecutorPlugin].getName)
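For reference, the same settings expressed as spark-submit flags; the fully qualified plugin class name and the application jar are placeholders:

```shell
# Equivalent spark-submit invocation (com.example.* names are placeholders).
spark-submit \
  --conf spark.yarn.dist.archives=hdfs:///deploy/config/async-profiler/async-profiler.zip#async-profiler \
  --conf spark.yarn.dist.files=hdfs:///deploy/config/profile.sh \
  --conf spark.executor.extraLibraryPath=./async-profiler/build/ \
  --conf spark.executor.plugins=com.example.ProfileExecutorPlugin \
  --class com.example.Main app.jar
```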
ProfileExecutorPlugin is a Spark executor plugin that starts and stops profiling automatically when the executor starts up and shuts down. profile.sh is a self-written helper script that can start and stop profiling at any time.

# Build dependencies for honest-profiler (newer autotools are built from source below)
yum install cmake libtool gcc gcc-c++
yum remove automake autoconf cppunit-devel libtool
wget http://ftp.gnu.org/gnu/autoconf/autoconf-2.69.tar.gz
tar -zxvf autoconf-2.69.tar.gz
cd autoconf-2.69
./configure
make; make install
autoconf --version
wget http://ftp.gnu.org/gnu/automake/automake-1.14.tar.gz
tar -zxvf automake-1.14.tar.gz
cd automake-1.14
./bootstrap.sh
./configure
make; make install
automake --version
wget http://ftpmirror.gnu.org/libtool/libtool-2.4.2.tar.gz
tar -xzf libtool-2.4.2.tar.gz
cd libtool-2.4.2
./configure && make && sudo make install
libtool --version
git clone https://github.com/unittest-cpp/unittest-cpp
cd unittest-cpp
./autogen.sh
./configure
make && sudo make install
# cd unittest-cpp/
# cd builds/
# cmake ..
# cmake --build ./ --target install
# ln -s /usr/local/include/UnitTest++ /usr/include/UnitTest++
# yum install glibc-static libstdc++-static
git clone https://github.com/jvm-profiling-tools/honest-profiler.git
cd honest-profiler
UNITTEST_INCLUDE_DIRS="/usr/include/UnitTest++/" UNITTEST_LIBRARIES="UnitTest++" cmake CMakeLists.txt
# or, without overriding the UnitTest++ paths:
cmake CMakeLists.txt
export LC_ALL=C
mvn clean package -DskipTests
The build artifact is target/honest-profiler.zip
scp root@172.16.34.133:~/ws/honest-profiler/target/honest-profiler.zip ~/tmp/
spark.driver.extraJavaOptions -XX:+UseG1GC -agentpath:/appcom/home/hadoop/wankun/honest/liblagent.so=interval=7,logPath=/appcom/home/hadoop/wankun/logs/honest.log,start=0,host=localhost,port=8888
echo status |nc localhost 8888
echo start |nc localhost 8888
echo stop |nc localhost 8888
Use the dump-flamegraph script to parse the hpl log file:
cd honest-profiler
./dump-flamegraph ../logs/honest.log ../logs/honest.txt
grep -v AGCT ../logs/honest.txt > ../logs/honest2.txt
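The grep step filters out samples where AsyncGetCallTrace (AGCT) could not resolve a Java stack. A self-contained illustration with made-up collapsed-stack lines:

```shell
# Two resolvable samples and one AGCT failure frame (contents are made up).
cat > /tmp/honest-demo.txt <<'EOF'
main;Foo.run;Bar.work 10
AGCT::Unknown 3
main;Foo.run;Baz.calc 7
EOF
grep -v AGCT /tmp/honest-demo.txt
# main;Foo.run;Bar.work 10
# main;Foo.run;Baz.calc 7
```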
pip install hprof2flamegraph
# or, without root:
pip install --user hprof2flamegraph
perl ~/.local/bin/flamegraph.pl ~/tmp/honest2.txt > ~/tmp/honest.svg
The resulting SVG file can be opened in a browser.