Sunday, 15 April 2012

parallel processing - Apache Spark vs Akka


Can you please explain the difference between Apache Spark and Akka? I know that both frameworks are meant for distributed and parallel computation, but I do not see the link between them, or the difference.

In addition, I would like to know the appropriate use cases for each of them.

Apache Spark is actually built on Akka.

Akka is a general-purpose framework for creating reactive, distributed, parallel, and resilient concurrent applications in Scala or Java. Akka uses the Actor model to hide all the thread-related code and gives you a really simple and helpful interface to implement a scalable and fault-tolerant system easily. A good example for Akka is a real-time application that consumes data coming from mobile phones, processes it, and sends it to some kind of storage.
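
To make that concrete, here is a minimal sketch of an Akka actor (classic API) along those lines; the Reading message type, the actor names, and the println standing in for real storage are all assumptions made for illustration.

    import akka.actor.{Actor, ActorSystem, Props}

    // Hypothetical message type: a reading sent from a phone.
    case class Reading(deviceId: String, value: Double)

    // The actor hides all thread handling; messages are processed one at a time.
    class StorageActor extends Actor {
      def receive: Receive = {
        case Reading(deviceId, value) =>
          // A real system would write to a database or a queue here.
          println(s"Storing reading $value from device $deviceId")
      }
    }

    object PhoneDataApp extends App {
      val system = ActorSystem("phone-data")
      val storage = system.actorOf(Props[StorageActor](), "storage")

      // Sending a message is asynchronous and non-blocking.
      storage ! Reading("phone-42", 36.6)

      system.terminate()
    }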

Apache Spark (not Spark Streaming) is a framework for batch data processing, using a generalized version of the map-reduce algorithm. A good example for Apache Spark is the calculation of some metrics over stored data to get better insight into it. The data is loaded and processed on demand.
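
As an illustration, the sketch below uses Spark's RDD API to compute a simple metric (average value per device) over data that is already stored; the file path, record format, and field meanings are assumptions, not anything prescribed by Spark.

    import org.apache.spark.{SparkConf, SparkContext}

    object SparkBatchExample extends App {
      val conf = new SparkConf().setAppName("metrics-over-stored-data").setMaster("local[*]")
      val sc = new SparkContext(conf)

      // Load stored data on demand (hypothetical path and CSV layout: deviceId,value).
      val readings = sc.textFile("/data/readings.csv")
        .map(_.split(","))
        .map(fields => (fields(0), fields(1).toDouble))

      // Map-reduce style metric: average value per device.
      val averages = readings
        .mapValues(v => (v, 1))
        .reduceByKey { case ((s1, c1), (s2, c2)) => (s1 + s2, c1 + c2) }
        .mapValues { case (sum, count) => sum / count }

      averages.collect().foreach(println)
      sc.stop()
    }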

Apache Spark Streaming is able to perform similar actions and functions on near real-time small batches of data, in the same way you would if the data were already stored.
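
A rough sketch of the same idea with Spark Streaming: a per-device aggregation like the batch example, but applied to small micro-batches arriving over a socket; the host, port, batch interval, and record format are assumptions for illustration.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object SparkStreamingExample extends App {
      val conf = new SparkConf().setAppName("near-real-time-metrics").setMaster("local[2]")
      val ssc = new StreamingContext(conf, Seconds(10))   // 10-second micro-batches

      // Hypothetical source: lines of "deviceId,value" arriving on a socket.
      val lines = ssc.socketTextStream("localhost", 9999)

      // The same kind of logic you would run on stored data, applied per batch.
      val totals = lines
        .map(_.split(","))
        .map(fields => (fields(0), fields(1).toDouble))
        .reduceByKey(_ + _)      // total value per device, per micro-batch

      totals.print()

      ssc.start()
      ssc.awaitTermination()
    }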

Update April 2016

As of Apache Spark 1.6.0, Apache Spark no longer relies on Akka for communication between Spark nodes. Thanks to @EugeneMi for the comment.

