Tags: java, hdfs, hadoop2, hadoop-streaming

Permission Denied when copying a file from the local system to HDFS from an STS Java program


I am working on HDFS and trying to copy a file from the local system to the HDFS file system using the Configuration and FileSystem classes from the Hadoop conf and fs packages, as follows:

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

public class FileCopyWithWrite {

    public static void main(String[] args) {

        String localSrc = "/Users/bng/Documents/hContent/input/ncdc/sample.txt";
        String dst = "hdfs://localhost/sample.txt";
        try {
            // Buffer reads from the local source file
            InputStream in = new BufferedInputStream(new FileInputStream(localSrc));
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create(dst), conf);

            // Create the destination file on HDFS; the callback prints a dot
            // each time Hadoop reports write progress
            OutputStream out = fs.create(new Path(dst), new Progressable() {

                public void progress() {
                    System.out.print(".");
                }
            });

            // Copy with a 4096-byte buffer; 'true' closes both streams when done
            IOUtils.copyBytes(in, out, 4096, true);

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

}

But running this program throws the following exception:

org.apache.hadoop.security.AccessControlException: Permission denied: user=KV, access=WRITE, inode="/":root:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:238)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:179)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6545)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6527)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6479)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2712)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2632)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2520)

The error itself is correct: the current user KV does not have write permission on the HDFS root directory (inode "/"), which is owned by root:supergroup with mode drwxr-xr-x, so only root may create files there.
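
This can be verified from the console. As an illustration (the -d flag makes ls show the directory entry itself rather than its contents; it is available in the Hadoop 2 FileSystem shell):

hadoop fs -ls -d hdfs://localhost/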

Copying the file from the console works fine. I used the following commands:

sudo su
hadoop fs -copyFromLocal /Users/bng/Documents/hContent/input/ncdc/sample.txt hdfs://localhost/sample.txt

This works because sudo su switches to the root user, which owns the HDFS root directory. I found a lot of search results on Google, but none worked for me. How do I solve this issue? How can I run a specific class from STS or Eclipse with sudo permissions? Or is there another option?


Solution

  • Granting the required permissions to the current user in HDFS solved the problem for me. I recursively changed the owner of the HDFS root to the current user:

    hadoop fs -chown -R KV:KV hdfs://localhost
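
    Note that -chown itself has to be run as a privileged user (e.g. after sudo su, as above), and recursively changing ownership of the file system root is only reasonable on a single-user development setup.

    If changing ownership is not an option, the Hadoop client can also be told which user to connect as. Below is a minimal sketch (assuming simple authentication, i.e. no Kerberos; the class name and the choice of "root" are illustrative): the three-argument overload FileSystem.get(uri, conf, user) performs requests as the given user instead of the OS user running STS/Eclipse.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConnectAsUser {

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Connect as "root" (the owner of "/") rather than the local OS user
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost/"), conf, "root");
            // Quick sanity check: print the status of the root directory
            System.out.println(fs.getFileStatus(new Path("/")));
        }
    }

    With simple authentication, setting the HADOOP_USER_NAME environment variable in the Eclipse/STS run configuration has a similar effect without touching the code.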