scala - Move file from local to HDFS


My environment uses Spark, Pig, and Hive.

I am having trouble writing code in Scala (or another language compatible with my environment) to copy a file from the local file system to HDFS.

Does anyone have advice on how I should proceed?

You can write a Scala job using the Hadoop FileSystem API.
Then, use IOUtils from Apache Commons IO to copy the data from the input stream to the output stream:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.commons.io.IOUtils

val hadoopConf = new Configuration()
val fs = FileSystem.get(hadoopConf)

// Create an output stream for the HDFS file
val outFileStream = fs.create(new Path("hdfs://<namenode>:<port>/<filename>"))

// Create an input stream for the local file
// (use the local FileSystem here; opening a file:// path through the
// HDFS FileSystem would fail with a "wrong FS" error)
val inStream = FileSystem.getLocal(hadoopConf).open(new Path("file://<input_file>"))

IOUtils.copy(inStream, outFileStream)

// Close both streams
inStream.close()
outFileStream.close()
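Alternatively, the Hadoop FileSystem API provides `copyFromLocalFile`, which manages both streams for you. A minimal sketch, reusing the same placeholder paths from above (assumes `fs.defaultFS` in your configuration points at the cluster, and requires a running HDFS to actually execute):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val conf = new Configuration()
// Picks up fs.defaultFS from the classpath configuration,
// e.g. hdfs://<namenode>:<port>
val fs = FileSystem.get(conf)

// Copies the local file into HDFS in a single call;
// opens, copies, and closes the streams internally
fs.copyFromLocalFile(new Path("file://<input_file>"), new Path("/<filename>"))
```

This avoids manual stream handling entirely, at the cost of less control over buffering and progress reporting.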
