________ is a programming model designed for processing large volumes of data in parallel by dividing the work into a set of independent tasks.
(a) Hive
(b) MapReduce
(c) Pig
(d) Lucene
This question was posed to me in a quiz.
The question is from the Data Flow topic in the HDFS – Hadoop Distributed File System section of Hadoop.
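The correct choice is (b) MapReduce. To explain: MapReduce is a programming model for processing large datasets in parallel, where a map phase splits the work into independent tasks that run across the cluster and a reduce phase aggregates their results. Hive and Pig are higher-level layers that compile queries and scripts down to such jobs, and Lucene is a text-search library, so none of them is the programming model itself.

To make the map/reduce division of work concrete, below is a minimal sketch of the classic word-count job written against Hadoop's MapReduce Java API. The class name WordCount and the command-line input/output paths are illustrative, not part of the question.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: each map task independently tokenizes its slice of the
  // input and emits (word, 1) pairs. Tasks share no state, which is
  // what lets the framework run them in parallel.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: the framework groups all counts for the same word
  // and each reduce task sums them to produce the final tally.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values,
        Context context) throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Input and output paths are assumptions taken from the command line.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar, a job like this would typically be launched with hadoop jar wordcount.jar WordCount followed by the input and output paths.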