Hadoop File Read
Posted by user3684584 on Stack Overflow, 2014-08-25.
Tags: hadoop
This is a Hadoop DistributedCache WordCount example on Hadoop 2.2.0. I copied a file into HDFS (a sketch of that step is below) so that it can be read inside the setup() of the Mapper class.
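For reference, this is roughly how the cache file was pushed into HDFS beforehand. A minimal sketch, assuming a local file /home/user12/cache and a target HDFS path /user/user12/cache (both are placeholders):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CacheUploader {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // Copy the local cache file into HDFS; adjust both paths to your setup
        fs.copyFromLocalFile(new Path("/home/user12/cache"),
                             new Path("/user/user12/cache"));
    }
}

The mapper then reads each localized cache file line by line in setup():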
protected void setup(Context context) throws IOException, InterruptedException {
    // Local (node-side) paths of the files registered with the DistributedCache
    Path[] uris = DistributedCache.getLocalCacheFiles(context.getConfiguration());
    cacheData = new HashMap<String, String>();
    for (Path urifile : uris) {
        try {
            BufferedReader readBuffer1 = new BufferedReader(new FileReader(urifile.toString()));
            String line;
            while ((line = readBuffer1.readLine()) != null) {
                System.out.println("**************" + line);
                cacheData.put(line, line);
            }
            readBuffer1.close();
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }
}
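As an aside, DistributedCache is deprecated in Hadoop 2.x; the newer way is to read the cache URIs back from the task context. A minimal sketch of that variant, assuming the same cacheData field and a java.net.URI import:

@Override
protected void setup(Context context) throws IOException, InterruptedException {
    // Hadoop 2.x API: cache URIs come from the context, not from DistributedCache
    URI[] cacheFiles = context.getCacheFiles();
    if (cacheFiles != null) {
        for (URI cacheFile : cacheFiles) {
            // YARN symlinks each localized file into the task's working
            // directory under its file name, so it can be opened by name
            BufferedReader reader = new BufferedReader(
                    new FileReader(new Path(cacheFile.getPath()).getName()));
            String line;
            while ((line = reader.readLine()) != null) {
                cacheData.put(line, line);
            }
            reader.close();
        }
    }
}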
Inside the driver's main() method:
Configuration conf = new Configuration();
String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
if (otherArgs.length != 3) {
    System.err.println("Usage: wordcount <in> <out> <cachefile>");
    System.exit(2);
}
Job job = new Job(conf, "word_count");
job.setJarByClass(WordCount.class);
job.setMapperClass(Map.class);
job.setReducerClass(Reduce.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
// Remove any previous output so the job does not fail on an existing directory
Path outputpath = new Path(otherArgs[1]);
outputpath.getFileSystem(conf).delete(outputpath, true);
FileOutputFormat.setOutputPath(job, outputpath);
System.out.println("CachePath****************" + otherArgs[2]);
DistributedCache.addCacheFile(new URI(otherArgs[2]), job.getConfiguration());
System.exit(job.waitForCompletion(true) ? 0 : 1);
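(If the deprecated class is a concern: the Hadoop 2.x counterpart of the cache registration above would be the single line below, paired with context.getCacheFiles() in the mapper as sketched earlier. A sketch only; I have not verified that it changes the behaviour on 2.2.0.)

job.addCacheFile(new URI(otherArgs[2]));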
But I am getting this exception:
java.io.FileNotFoundException: file:/home/user12/tmp/mapred/local/1408960542382/cache (No such file or directory)
So the cache functionality is not working properly. Any ideas?