Spark (cluster computing framework)

Apache Spark
Original author(s) Matei Zaharia
Developer(s) Apache Software Foundation, UC Berkeley AMPLab, Databricks
Initial release May 30, 2014
Stable release v2.1.1 / May 2, 2017
Repository github.com/apache/spark
Development status Active
Written in Scala, Java, Python, R
Operating system Microsoft Windows, OS X, Linux
Type Data analytics, machine learning algorithms
License Apache License 2.0
Website spark.apache.org

Apache Spark is an open-source cluster-computing framework. Originally developed at the University of California, Berkeley's AMPLab, the Spark codebase was later donated to the Apache Software Foundation, which has maintained it since. Spark provides an interface for programming entire clusters with implicit data parallelism and fault-tolerance.

Apache Spark provides programmers with an application programming interface centered on a data structure called the resilient distributed dataset (RDD), a read-only multiset of data items distributed over a cluster of machines, which is maintained in a fault-tolerant way. It was developed in response to limitations in the MapReduce cluster-computing paradigm, which forces a particular linear dataflow structure on distributed programs: MapReduce programs read input data from disk, map a function across the data, reduce the results of the map, and store the reduction results on disk. Spark's RDDs function as a working set for distributed programs, offering a (deliberately) restricted form of distributed shared memory.
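The read-only, lineage-based character of RDDs can be illustrated with a toy sketch. This is plain Python, not Spark's actual API: the `ToyRDD` class and its methods are hypothetical stand-ins, showing only how a derived dataset can record the transformation that produced it, so that lost data can be recomputed from its parent rather than replicated.

```python
class ToyRDD:
    """Toy stand-in for an RDD: immutable, lazily derived via recorded
    lineage. Illustrative only -- not Spark's real API."""

    def __init__(self, compute):
        # `compute` rebuilds this dataset from scratch (its lineage).
        self._compute = compute

    @classmethod
    def from_list(cls, data):
        snapshot = list(data)  # read-only source data
        return cls(lambda: list(snapshot))

    def map(self, f):
        # Transformation: nothing is computed now; the new ToyRDD only
        # records "parent + map(f)" as its lineage.
        return ToyRDD(lambda: [f(x) for x in self._compute()])

    def filter(self, pred):
        return ToyRDD(lambda: [x for x in self._compute() if pred(x)])

    def collect(self):
        # Action: materialize the dataset by replaying the lineage.
        return self._compute()


nums = ToyRDD.from_list([1, 2, 3, 4])
squares = nums.map(lambda x: x * x).filter(lambda x: x > 4)
print(squares.collect())  # recomputed from lineage on demand: [9, 16]
```

Because every derived dataset is defined purely by its parent plus a deterministic transformation, a failed node's portion of the data can be rebuilt by re-running that lineage, which is the restricted form of shared memory the paragraph above describes.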

The availability of RDDs facilitates the implementation of both iterative algorithms, which visit their dataset multiple times in a loop, and interactive/exploratory data analysis, i.e., the repeated database-style querying of data. The latency of such applications (compared to a MapReduce implementation, as was common in Apache Hadoop stacks) may be reduced by several orders of magnitude. Among the class of iterative algorithms are the training algorithms for machine learning systems, which formed the initial impetus for developing Apache Spark.
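The benefit for iterative algorithms can be sketched with a small, hypothetical example in plain Python (no Spark involved): the `load_points` function stands in for reading input from distributed storage. A MapReduce-style loop runs each iteration as a separate job that re-reads the input, while a Spark-style loop loads the working set once, keeps it cached, and iterates over it in memory.

```python
disk_reads = 0

def load_points():
    # Stand-in for reading the input dataset from storage (e.g. HDFS).
    global disk_reads
    disk_reads += 1
    return [1.0, 2.0, 3.0, 4.0]

ITERATIONS = 10

# MapReduce-style: every iteration is a fresh job that re-reads the input.
disk_reads = 0
for _ in range(ITERATIONS):
    points = load_points()
    _ = sum(points) / len(points)  # one "map + reduce" pass
mapreduce_reads = disk_reads

# Spark-style: load once into a cached working set, then iterate in memory
# (analogous to calling cache() on an RDD before the loop).
disk_reads = 0
cached = load_points()
for _ in range(ITERATIONS):
    _ = sum(cached) / len(cached)
spark_reads = disk_reads

print(mapreduce_reads, spark_reads)  # 10 reads vs. 1 read
```

The counter makes the difference concrete: the per-iteration storage round trip, not the computation itself, is what the cached working set eliminates, which is where the large latency reductions for iterative and interactive workloads come from.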

