
Error in reducing ram size in hortonworks hadoop


I have to reduce the RAM size of my VirtualBox VM from 4 GB to 1 GB. I tried to change it, but the setting is unchangeable, so please suggest the right way to do this. I am attaching a screenshot.


Solution

  • The same error occurred when I tried this with Hadoop; the following settings resolved it. Configuring YARN: in a Hadoop cluster, it is vital to balance the usage of RAM, CPU, and disk so that processing is not constrained by any one of these cluster resources. As a general recommendation, allowing for 1-2 Containers per disk and per core gives the best balance for cluster utilization. So for an example cluster node with 12 disks and 12 cores, we will allow a maximum of 20 Containers to be allocated to each node. Each machine in the cluster has 48 GB of RAM. Some of this RAM should be reserved for Operating System usage: on each node, we assign 40 GB of RAM for YARN to use and keep 8 GB for the Operating System. The following property sets the maximum memory YARN can utilize on the node, in yarn-site.xml:

    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>40960</value>
    </property>


    The next step is to give YARN guidance on how to break up the total available resources into Containers. You do this by specifying the minimum unit of RAM to allocate for a Container. We want to allow a maximum of 20 Containers, and thus need (40 GB total RAM) / (20 Containers) = 2 GB minimum per Container. In yarn-site.xml:

    <property>
      <name>yarn.scheduler.minimum-allocation-mb</name>
      <value>2048</value>
    </property>


    YARN will only allocate Containers with RAM amounts greater than or equal to yarn.scheduler.minimum-allocation-mb. For more information you can visit hortonworks.com/blog/how-to-plan-and-configure-yarn-in-hdp-2-0/
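
    The arithmetic behind the two values above can be sketched as follows. This is just an illustration of the sizing steps described in the answer, using the example node from the post (48 GB RAM, 8 GB reserved for the OS, 20 Containers); the variable names are my own, not Hadoop settings.

    ```python
    # Example node from the answer above.
    total_ram_gb = 48
    reserved_for_os_gb = 8

    # RAM left for YARN -> yarn.nodemanager.resource.memory-mb (in MB)
    yarn_ram_gb = total_ram_gb - reserved_for_os_gb
    nodemanager_memory_mb = yarn_ram_gb * 1024

    # Roughly 1-2 Containers per disk and per core; with 12 disks and
    # 12 cores the post settles on 20 Containers per node.
    max_containers = 20

    # Minimum RAM per Container -> yarn.scheduler.minimum-allocation-mb
    min_allocation_mb = nodemanager_memory_mb // max_containers

    print(nodemanager_memory_mb)  # 40960
    print(min_allocation_mb)      # 2048
    ```

    Plugging in different node sizes (for example, a 4 GB sandbox VM) gives correspondingly smaller values for both properties.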