Let's say you have to analyse or process a huge amount of data. How would you do it? How many processing units, nodes, or computers would you need, and how would you make them communicate with each other?
The answer is Hadoop.
It is a framework that allows you to process huge amounts of data in a distributed environment.
The two phases involved in any big data processing job with Hadoop are Map and Reduce. You can learn more by visiting the official Hadoop website.
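To make the Map and Reduce idea a bit more concrete, here is a minimal word-count sketch in plain Python. This is not Hadoop's actual API; the names `map_phase` and `reduce_phase` and the sample input lines are just illustrative, showing how a job splits into a map step that emits (key, value) pairs and a reduce step that aggregates values per key.

```python
# A minimal, local sketch of the MapReduce idea (not the Hadoop API).
# Word count: map emits (word, 1) pairs, reduce sums counts per word.

from collections import defaultdict
from typing import Iterable, Tuple

def map_phase(line: str) -> Iterable[Tuple[str, int]]:
    """Map: turn one input line into (word, 1) pairs."""
    for word in line.split():
        yield (word.lower(), 1)

def reduce_phase(word: str, counts: Iterable[int]) -> Tuple[str, int]:
    """Reduce: sum all counts emitted for the same word."""
    return (word, sum(counts))

if __name__ == "__main__":
    lines = [
        "big data needs big clusters",
        "hadoop processes big data",
    ]

    # Shuffle: group every (word, 1) pair by its key before reducing.
    grouped = defaultdict(list)
    for line in lines:
        for word, count in map_phase(line):
            grouped[word].append(count)

    for word, counts in sorted(grouped.items()):
        print(reduce_phase(word, counts))   # e.g. ('big', 3)
```

In a real Hadoop cluster the map tasks run in parallel across many nodes, the framework handles the shuffle and grouping for you, and the reduce tasks also run in parallel, which is what lets the same simple pattern scale to huge datasets.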