
Monday, December 5, 2016

What is Apache Spark and how it works, Part I


Apache Spark is an open source cluster computing framework, originally developed at the University of California, Berkeley's AMPLab.

Apache Spark provides programmers with an application programming interface centered on a data structure called the resilient distributed dataset (RDD): a read-only multiset of data items distributed over a cluster of machines that is maintained in a fault-tolerant way. It was developed in response to limitations in the MapReduce cluster computing paradigm, which forces a particular linear dataflow structure on distributed programs: MapReduce programs read input data from disk, map a function across the data, reduce the results of the map, and store the reduction results back on disk. Spark's RDDs instead function as a working set for distributed programs, offering a (deliberately) restricted form of distributed shared memory.



Spark's components include Spark Core, Spark Streaming, and Spark SQL.

Spark applications can be written in Java, Scala, Python, and other supported languages.


Here are some tutorials and slides for Spark:


https://www.tutorialspoint.com/apache_spark/apache_spark_tutorial.pdf


http://lintool.github.io/SparkTutorial/slides/day1_context.pdf

http://www.bigdataeverywhere.com/files/israel/BDE-OverviewApacheSpark-GULMAN.pdf







