Java Hadoop: FileReader vs. InputStreamReader
I want to use a Java class on Hadoop HDFS, so I must rewrite some of its functions. The problem is that if I use InputStreamReader, the app reads wrong values.
Here is the code (it works as-is; I want to use the commented-out code part instead):
public static GeoTimeDataCenter[] readCentersArrayFromFile(int iteration) {
    Properties pro = new Properties();
    try {
        pro.load(GeoTimeDataHelper.class.getResourceAsStream("/config.properties"));
    } catch (Exception e) {
        e.printStackTrace();
    }
    int k = Integer.parseInt(pro.getProperty("k"));
    GeoTimeDataCenter[] centers = new GeoTimeDataCenter[k];
    BufferedReader br;
    try {
        // HDFS version (the part I want to use):
        //Path pt = new Path(pro.getProperty("seed.file") + (iteration - 1));
        //FileSystem fs = FileSystem.get(new Configuration());
        //br = new BufferedReader(new InputStreamReader(fs.open(pt)));
        // Local filesystem version (this works):
        br = new BufferedReader(new FileReader(pro.getProperty("seed.file") + (iteration - 1)));
        for (int i = 0; i < centers.length; i++) {
            String[] temp = null;
            try {
                temp = br.readLine().split("\t");
                centers[i] = new GeoTimeDataCenter(Integer.parseInt(temp[0]),
                        new LatLong(Double.parseDouble(temp[1]), Double.parseDouble(temp[2])),
                        Long.parseLong(temp[3]));
            } catch (Exception e) {
                // fall back to a random seed if the line is missing or malformed
                temp = Seeding.randomSingleSeed().split("\t");
                centers[i] = new GeoTimeDataCenter(i,
                        new LatLong(Double.parseDouble(temp[0]), Double.parseDouble(temp[1])),
                        dateToLong(temp[2]));
            }
        }
        br.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return centers;
}
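For reference, here is a minimal self-contained sketch of the HDFS-backed read that the commented-out lines are meant to do, assuming the same seed.file property from config.properties; openSeedFile is a hypothetical helper name, not part of the original code:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical helper: open the seed file for a given iteration from HDFS.
static BufferedReader openSeedFile(String seedFileBase, int iteration) throws IOException {
    Path pt = new Path(seedFileBase + (iteration - 1));
    FileSystem fs = FileSystem.get(new Configuration());
    // fs.open() returns an FSDataInputStream; wrap it like any other InputStream.
    return new BufferedReader(new InputStreamReader(fs.open(pt)));
}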
Does anyone know what the problem might be?
Best regards
I have found the problem: a ChecksumException. I deleted the .crc files next to the input file. That way there is no checksum exception and the BufferedReader works fine with the HDFS code (the commented-out part above, once uncommented).
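As an alternative to deleting the .crc files by hand, Hadoop's FileSystem API can skip client-side checksum verification; a minimal sketch:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class NoChecksumRead {
    public static void main(String[] args) throws IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        // Tell this FileSystem instance not to verify checksums on read,
        // so stale or mismatched .crc sidecar files no longer raise ChecksumException.
        fs.setVerifyChecksum(false);
        // Subsequent fs.open(...) calls read the data without checking .crc files.
    }
}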