
Setting up a local Spark environment on Windows

  • Host system: Windows 10
  • Hadoop: hadoop-3.2.0, with hadoop.dll and winutils.exe replaced (simply swap in the whole bin directory)
    from GitHub: https://github.com/steveloughran/winutils/blob/master/hadoop-3.0.0/bin
  • scala-SDK-2.12.10; Spark 2.4.3 needs no local Spark installation, the pom configuration below is enough (note: Spark 3.0 fails locally with a Java 9 related error, and none of the workarounds found online helped)
  • Configure the JAVA_HOME and HADOOP_HOME environment variables, and set JAVA_HOME inside hadoop-env.cmd (a quick sanity check of these settings is sketched right after the pom.xml below)
  • In that JAVA_HOME path, write Program Files as PROGRA~1 and Program Files (x86) as PROGRA~2, e.g. C:\PROGRA~1\Java\jdk1.8.0_172
  • pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>HelloWord</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.12</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.7.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>2.7.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.8.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.4</version>
        </dependency>
    </dependencies>
</project>
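
Before running the test program, it can help to confirm that the variables from the checklist above are actually visible to the JVM and that winutils.exe and hadoop.dll ended up in %HADOOP_HOME%\bin. The following is only an optional sketch (the EnvCheck object name is made up for illustration):

import java.nio.file.{Files, Paths}

object EnvCheck {

  def main(args: Array[String]): Unit = {
    // The two variables from the checklist above.
    val javaHome   = sys.env.getOrElse("JAVA_HOME", "<not set>")
    val hadoopHome = sys.env.getOrElse("HADOOP_HOME", "<not set>")
    println(s"JAVA_HOME   = $javaHome")
    println(s"HADOOP_HOME = $hadoopHome")

    // winutils.exe and hadoop.dll have to sit under %HADOOP_HOME%\bin,
    // otherwise Spark on Windows typically fails with a
    // "Could not locate executable ...\bin\winutils.exe" error.
    val bin = Paths.get(hadoopHome, "bin")
    println(s"winutils.exe present: ${Files.exists(bin.resolve("winutils.exe"))}")
    println(s"hadoop.dll present:   ${Files.exists(bin.resolve("hadoop.dll"))}")
  }
}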

Test run

// If this import is flagged red, make sure the Hadoop dependencies above have been added
import org.apache.spark.{SparkConf, SparkContext}

object HelloWord1 {

  def main(args: Array[String]): Unit = {
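    // "local" runs Spark in-process with a single worker thread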
    val conf = new SparkConf().setMaster("local").setAppName("大爷")

    val sc = new SparkContext(conf)

    val helloWorld = sc.parallelize(List("Hello,大爷4!","Hello,大爷3!","Hello,大爷1!"))

    helloWorld.foreach(line => println(line))

  }
}

Run result

[result.png: console output of the test run above]
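
The pom above also pulls in spark-sql_2.12, so the same local setup can be smoke-tested through the DataFrame/SQL API as well. This is only a minimal sketch under the same assumptions (the HelloSql object name and the sample rows are made up):

import org.apache.spark.sql.SparkSession

object HelloSql {

  def main(args: Array[String]): Unit = {
    // Same "local" master as the RDD example above.
    val spark = SparkSession.builder()
      .master("local")
      .appName("HelloSql")
      .getOrCreate()

    import spark.implicits._

    // Tiny DataFrame round trip through a temp view to confirm spark-sql works.
    val df = Seq(("a", 1), ("b", 2), ("c", 3)).toDF("key", "value")
    df.createOrReplaceTempView("t")
    spark.sql("SELECT key, value * 10 AS value FROM t").show()

    spark.stop()
  }
}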

