From 10b5ad9d097b88b1d670d3e7af5c72ad2c3fdea8 Mon Sep 17 00:00:00 2001
From: heibaiying <31504331+heibaiying@users.noreply.github.com>
Date: Sat, 18 May 2019 16:19:10 +0800
Subject: [PATCH] =?UTF-8?q?Update=20Spark-Streaming=E4=B8=8E=E6=B5=81?=
 =?UTF-8?q?=E5=A4=84=E7=90=86.md?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 notes/Spark-Streaming与流处理.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/notes/Spark-Streaming与流处理.md b/notes/Spark-Streaming与流处理.md
index dfbce30..b29fc50 100644
--- a/notes/Spark-Streaming与流处理.md
+++ b/notes/Spark-Streaming与流处理.md
@@ -55,13 +55,13 @@ Spark Streaming is a submodule of Spark, used to quickly build scalable, high-throughput
 + It integrates seamlessly with other Spark modules, combining stream processing with batch processing;
 + Spark Streaming can read data from HDFS, Flume, Kafka, Twitter, and ZeroMQ, and also supports custom data sources.

-
+

 ### 2.2 DStream

 Spark Streaming provides a high-level abstraction called a discretized stream (DStream), which represents a continuous stream of data. A DStream can be created from input data streams from sources such as Kafka, Flume, and Kinesis, or derived by applying transformations to other DStreams. **Internally, a DStream is represented as a sequence of RDDs.**

-
+

@@ -75,4 +75,4 @@ Storm and Flink are true stream processing frameworks, whereas Spark Streaming only

 ## References

-[Spark Streaming Programming Guide](https://spark.apache.org/docs/latest/streaming-programming-guide.html)
\ No newline at end of file
+[Spark Streaming Programming Guide](https://spark.apache.org/docs/latest/streaming-programming-guide.html)
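
Not part of the patch itself, but as context for the DStream hunk above: a minimal Scala sketch of the abstraction the note describes, assuming a local Spark Streaming job; the socket source, host, port, and 5-second batch interval are illustrative choices, not taken from the note.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object DStreamSketch {
  def main(args: Array[String]): Unit = {
    // Batch interval of 5 seconds: each batch of the DStream becomes one RDD.
    val conf = new SparkConf().setMaster("local[2]").setAppName("DStreamSketch")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Input DStream from a socket source (hypothetical host/port, for illustration only).
    val lines = ssc.socketTextStream("localhost", 9999)

    // Each transformation produces a new DStream derived from the previous one.
    val wordCounts = lines.flatMap(_.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    wordCounts.print()

    ssc.start()            // start receiving and processing data
    ssc.awaitTermination() // block until the streaming job is stopped
  }
}
```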