Tags: distributed-computing, hadoop2

What are the minimum requirements of a Hadoop cluster for testing?


I am new to Hadoop. I am trying to build a Hadoop cluster to test Hadoop's performance. I want to know the minimum cluster size, memory, disk space, and number of cores for each node (master and slave), and what the size of the test file should be. I am trying to process a text file.


Solution

  • For HortonWorks

    • Runs on 32-bit and 64-bit OSes (Windows 7, Windows 8, Mac OS X, and Linux)
    • Your machine should have a minimum of 10 GB of RAM to be able to run the VM, which is allocated 8 GB (a quick way to check this programmatically is sketched after the download link below)
    • Virtualization enabled in the BIOS (only if you're running it in a VM)
    • Browser: Chrome 25+, IE 9+, Safari 6+, or Firefox 18+ recommended (the Sandbox will not run on IE 10)

    Just go to their download page http://hortonworks.com/hdp/downloads/
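    If you want to verify programmatically that the host actually has enough RAM for the sandbox VM, the following is a minimal sketch (the class name MemoryCheck is mine, and it relies on the HotSpot-specific com.sun.management bean, which is present on the Oracle/OpenJDK JVMs Hadoop runs on):

        import java.lang.management.ManagementFactory;

        public class MemoryCheck {
            // 10 GB minimum host RAM suggested above for the HortonWorks sandbox.
            private static final long MIN_RAM_BYTES = 10L * 1024 * 1024 * 1024;

            public static void main(String[] args) {
                // Cast to the HotSpot-specific bean to read total physical memory.
                com.sun.management.OperatingSystemMXBean os =
                        (com.sun.management.OperatingSystemMXBean)
                                ManagementFactory.getOperatingSystemMXBean();
                long totalRam = os.getTotalPhysicalMemorySize();
                System.out.printf("Physical RAM: %.1f GB%n",
                        totalRam / (1024.0 * 1024 * 1024));
                if (totalRam < MIN_RAM_BYTES) {
                    System.out.println("Warning: less than 10 GB of RAM; "
                            + "the sandbox VM (8 GB) may not start.");
                }
            }
        }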


    If you look at the Cloudera requirements, you'll find the following:

    • RAM: 4 GB minimum
    • IPv6 must be disabled.
    • No blocking by iptables or other firewalls; in particular, port 7180 must be open because it is used to access Cloudera Manager after installation. Cloudera Manager communicates over specific ports, all of which must be open (a quick connectivity check is sketched after this list).
    • Filesystem: ext3 is the most tested underlying filesystem for HDFS.
    • CPU: the more cores, the better
    • JDK 1.7 or later
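
    As a quick pre-flight check of the JDK version and the Cloudera Manager port, here is a minimal sketch (the class name ClouderaPreflight and the host name cm-host.example.com are hypothetical; substitute your own Cloudera Manager host):

        import java.net.InetSocketAddress;
        import java.net.Socket;

        public class ClouderaPreflight {
            public static void main(String[] args) {
                // JDK 1.7+ required: java.version looks like "1.7.0_80", "1.8.0_171", ...
                System.out.println("JDK version: " + System.getProperty("java.version"));

                // Port 7180 is where the Cloudera Manager web UI listens after installation.
                String host = "cm-host.example.com"; // hypothetical; use your CM host
                try (Socket socket = new Socket()) {
                    socket.connect(new InetSocketAddress(host, 7180), 3000); // 3 s timeout
                    System.out.println("Port 7180 is reachable on " + host);
                } catch (Exception e) {
                    System.out.println("Port 7180 is NOT reachable: " + e.getMessage());
                }
            }
        }

    If the connection attempt fails even though Cloudera Manager is running, that usually points to iptables or another firewall blocking the port, which is exactly what the requirement above is about.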

    For more information, you can check the following link:

    http://www.cloudera.com/documentation/enterprise/latest/topics/cm_ig_cm_requirements.html#cmig_topic_4_1_unique_1