Append data to a file in Hadoop using the Java API


I have created a file holding the results of a sequence of map-reduce jobs. The program I've made outputs results iteratively, and I want to append that data to the result file using the Java API. I have tried fs.append but it doesn't work. For the time being I'm using the built-in Java libraries (Eclipse 4.2.2), and once I'm done debugging I'll build a jar and throw it in the cluster.

First of all, is "append" supported in HDFS at all? And if yes, can someone tell me how it's done? Thanks in advance.

The code I am using for the job is the following:

    try {
        Path pt = new Path("/home/results.txt");
        FileSystem fs = FileSystem.get(new Configuration());
        BufferedWriter br = new BufferedWriter(new OutputStreamWriter(fs.append(pt)));
        String line = "something";
        br.write(line);
        br.close();
    } catch (Exception e) {
        System.out.println("File not found");
    }

Early versions of HDFS had no support for the append operation. Once a file was closed, it was immutable and could only be changed by writing a new copy under a different filename.

See here for more information.
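In Hadoop versions where append exists but is disabled by default (the 0.20.x/1.x line), it was gated behind the dfs.support.append property. Whether this applies depends on your cluster's version, so treat the snippet below as an assumption to verify against your release; it would go in hdfs-site.xml on the cluster:

```xml
<!-- Enable the (then-experimental) append operation on HDFS;
     only relevant on Hadoop releases that gate append behind this flag. -->
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
```

With this set and the cluster restarted, fs.append(pt) as in the question's code should return a usable output stream instead of failing.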

If you are using an old version without append support, the following worked for me:

    BufferedReader bfr = new BufferedReader(new InputStreamReader(hdfs.open(path))); // open the existing file first
    String str = null;
    BufferedWriter br = new BufferedWriter(new OutputStreamWriter(hdfs.create(path, true)));
    while ((str = bfr.readLine()) != null) {
        br.write(str);      // write the existing file content back
        br.newLine();
        System.out.println("   ->>>>>  " + str);
    }
    br.write("hello     "); // append to the file
    br.newLine();
    br.close();             // close the file
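The same read-then-rewrite pattern can be sketched with plain java.nio on a local file, with no Hadoop dependency, which is handy for debugging the logic before running against HDFS. The class and method names (RewriteAppend, appendLine) are illustrative, not part of any Hadoop API:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Hypothetical local-file sketch of the read-then-rewrite workaround above.
public class RewriteAppend {

    // Read every existing line first, then rewrite the file with the new
    // line added at the end -- the same idea as the open()/create() pair.
    static void appendLine(Path file, String line) throws IOException {
        List<String> lines = new ArrayList<>(Files.readAllLines(file)); // read old content
        lines.add(line);                                                // "append" the new line
        Files.write(file, lines);                                       // rewrite the whole file
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("results", ".txt");
        Files.write(tmp, List.of("line one", "line two"));
        appendLine(tmp, "something");
        System.out.println(Files.readAllLines(tmp)); // [line one, line two, something]
    }
}
```

Note that, unlike the HDFS snippet above, this reads the whole file into memory before rewriting it, so it is only suitable for small result files.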
