Update Spark_Transformation和Action算子.md

Author: heibaiying
Date: 2019-06-04 14:44:17 +08:00 (committed by GitHub)
Parent: c51a4ebd39
Commit: c636fbfd07
@@ -319,7 +319,7 @@ sc.parallelize(list,numSlices = 2).aggregateByKey(zeroValue = 0,numPartitions =
(spark,7)
```
The second parameter of `aggregateByKey(zeroValue = 0, numPartitions = 3)`, `numPartitions`, determines the number of partitions of the output RDD. To verify this, you can rewrite the code above and use the `getNumPartitions` method to check the partition count:
```scala
sc.parallelize(list,numSlices = 6).aggregateByKey(zeroValue = 0,numPartitions = 3)(