
[Spark][spark_core] #1 Getting Started with Spark

姚正真
2023-12-01

Start spark-shell in local mode with 2 worker threads:

[root@node00 sbin]# spark-shell --master local[2]

Inside the shell, run a word count over a local text file:

val file = spark.sparkContext.textFile("file:///usr/local/wc.txt")                        // read the local file as an RDD of lines
val wordCounts = file.flatMap(line => line.split(",")).map(word => (word, 1)).reduceByKey(_ + _)  // split each line on commas, pair every word with 1, sum counts per word
wordCounts.collect                                                                         // bring the (word, count) pairs back to the driver
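
As a follow-up, here is a minimal sketch of how the same result could be sorted by frequency and written back to disk from the shell; the output directory wc_output is an assumption, not part of the original example:

val sorted = wordCounts.sortBy(_._2, ascending = false)     // order (word, count) pairs by descending count
sorted.take(10).foreach(println)                             // preview the 10 most frequent words in the shell
sorted.saveAsTextFile("file:///usr/local/wc_output")         // hypothetical output directory; must not already exist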