scala - Move file from local to HDFS


My environment uses Spark, Pig, and Hive.

I'm having trouble writing code in Scala (or any other language compatible with my environment) to copy a file from the local file system to HDFS.

Does anyone have advice on how I should proceed?

You can write a Scala job that uses the Hadoop FileSystem API. Then use IOUtils from Apache Commons IO to copy the data from the input stream to the output stream:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.commons.io.IOUtils

val hadoopConf = new Configuration()
val hdfs = FileSystem.get(hadoopConf)
// The local file must be opened through the local file system,
// not the HDFS-backed FileSystem returned by FileSystem.get
val localFs = FileSystem.getLocal(hadoopConf)

// Create an output stream for the HDFS file
val outFileStream = hdfs.create(new Path("hdfs://<namenode>:<port>/<filename>"))

// Create an input stream for the local file
val inStream = localFs.open(new Path("file://<input_file>"))

// Copy the data from the input stream to the output stream
IOUtils.copy(inStream, outFileStream)

// Close both streams
inStream.close()
outFileStream.close()
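As an aside, the Hadoop FileSystem API also provides a one-call helper, copyFromLocalFile, that performs the same stream copy internally. A minimal sketch, reusing the same placeholder paths from the answer above:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val conf = new Configuration()
val fs = FileSystem.get(conf)

// Copy the local file to HDFS in a single call; the source path is
// resolved against the local file system, the destination against fs
fs.copyFromLocalFile(new Path("file://<input_file>"),
                     new Path("hdfs://<namenode>:<port>/<filename>"))

This avoids managing the streams yourself, at the cost of less control over the copy (e.g., no progress reporting or transformation of the data in flight).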

