1. Extract the packages
tar -xvf apache-kyuubi-1.4.1-incubating-bin-spark-3.1-hadoop3.2.tar.gz
tar -xvf spark-3.1.3.tar.gz
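The paths used in the following steps assume the extracted directories live under /opt; that layout is an assumption for illustration, so adjust it to your environment.
# Move the extracted directories into /opt (directory names are assumptions; match them to what tar actually produced)
mv apache-kyuubi-1.4.1-incubating-bin-spark-3.1-hadoop3.2 /opt/kyuubi-1.4.1
mv spark-3.1.3-bin-2.7.4 /opt/spark-3.1.3-bin-2.7.4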
2. Modify the Kyuubi configuration
vim kyuubi-env.sh
export JAVA_HOME=/usr/local/java
export SPARK_HOME=/opt/spark-3.1.3-bin-2.7.4
export HADOOP_CONF_DIR=/etc/hadoop/conf
vim kyuubi-defaults.conf
kyuubi.ha.zookeeper.quorum 192.168.19.11:2181,192.168.19.11:2181,192.168.19.11:2181
spark.master yarn
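Besides the two required entries above, a few optional kyuubi-defaults.conf settings are often set explicitly; the values below are illustrative defaults rather than part of the original setup.
# Optional settings (illustrative; the built-in defaults are usually fine)
kyuubi.frontend.bind.host       192.168.19.11
kyuubi.frontend.bind.port       10009
kyuubi.ha.zookeeper.namespace   kyuubi
kyuubi.engine.share.level       USER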
3. Modify the Spark configuration
vim spark-env.sh
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_LOCAL_IP=192.168.12.11
vim spark-defaults.conf
spark.master=yarn
spark.driver.host=192.168.12.11
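Resource settings for the Spark engines launched on YARN can also go in spark-defaults.conf; the values below are placeholders for illustration, not from the original notes.
# Example engine resources on YARN (placeholder values; size them to your cluster)
spark.yarn.queue=default
spark.executor.memory=2g
spark.executor.cores=2
spark.executor.instances=2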
4. Start Kyuubi
${kyuubi}/bin/kyuubi start
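Once the server is up, it can be verified with the beeline client shipped with Spark. The JDBC URL below assumes the ZooKeeper address from kyuubi-defaults.conf and the default ZooKeeper namespace kyuubi; adjust both if your setup differs.
# Connect through ZooKeeper service discovery and run a trivial query
$SPARK_HOME/bin/beeline -u 'jdbc:hive2://192.168.19.11:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=kyuubi' -n hive -e 'select 1'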
Troubleshooting
1. Cannot connect to YARN on port 8032 (the ResourceManager). This is a proxy-user problem: Kyuubi sets the proxy user to hive, but the Kyuubi process itself is not running as hive.
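One common way to clear this (an assumption here, not stated in the original notes) is either to start Kyuubi as the hive user, or to allow the account that runs Kyuubi to impersonate other users via Hadoop's proxyuser settings in core-site.xml and then restart/refresh the ResourceManager. A minimal sketch, assuming Kyuubi runs as a user named kyuubi:
<!-- core-site.xml: let the Kyuubi service account impersonate other users ("kyuubi" is an assumed account name) -->
<property>
  <name>hadoop.proxyuser.kyuubi.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.kyuubi.groups</name>
  <value>*</value>
</property>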
2. Spark cannot find Jersey classes.
This happens because the jersey-client-1.9.jar and jersey-core-1.9.jar in the Spark package do not match the versions used by Hadoop. Copying hadoop/share/hadoop/yarn/lib/jersey-client* into Spark's jars directory resolves it.
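A concrete form of the copy step (jar names and paths may differ in your Hadoop distribution):
# Copy Hadoop's Jersey 1.x jars into Spark's jars directory
cp $HADOOP_HOME/share/hadoop/yarn/lib/jersey-client-*.jar $SPARK_HOME/jars/
cp $HADOOP_HOME/share/hadoop/yarn/lib/jersey-core-*.jar $SPARK_HOME/jars/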