Spark and Streaming with Matei Zaharia

A podcast by Software Engineering Daily (Greatest Hits Archives)

Apache Spark is a system for processing large data sets in parallel. The core abstraction of Spark is the resilient distributed dataset (RDD), a working set of data that sits in memory for fast, iterative processing. Matei Zaharia created Spark with two goals: to provide a composable, high-level set of APIs for performing distributed processing;