
I'm trying to write a value to a file on HDFS. Here is my code:

FileSystem fsys = FileSystem.get(new Configuration());
String fileName = "/user/root/TestData/Parameter.txt";
Path path = new Path(fileName);
FSDataOutputStream fos = null;
try {
    fos = fsys.create(path);
} catch (IOException e1) {
    e1.printStackTrace();
}
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(fos));
writer.write(iterations);
writer.close();
fos.close();

However, the command hadoop fs -ls shows the file has a size of 4, yet when I run hadoop fs -cat, no content is displayed.


1 Answer


I'm assuming that iterations is an integer / short / byte. Either way, the byte representation of such a small value is zero or more 0x00 bytes followed by a 0x04 byte. Neither 0x00 nor 0x04 is a printable character, so it's no surprise that hadoop fs -cat shows nothing in the output.
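You can reproduce this in memory without a cluster. The sketch below mirrors the question's write path (BufferedWriter over an OutputStreamWriter), but targets a ByteArrayOutputStream instead of an FSDataOutputStream; the value 4 for iterations is an assumption for illustration:

```java
import java.io.BufferedWriter;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;

public class CharWriteDemo {
    // Same write path as the question, minus HDFS: the bytes that reach the
    // underlying stream are exactly what would land in the HDFS file.
    static byte[] writeAsChar(int iterations) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(buf));
        writer.write(iterations); // writes the *character* with code point `iterations`
        writer.close();
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] b = writeAsChar(4);
        System.out.println(b.length);        // a single byte
        System.out.printf("0x%02x%n", b[0]); // 0x04 -- a control character, invisible to `cat`
    }
}
```

The file contains a control character, so a terminal renders nothing even though the data is there.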

If you pipe the output to hexdump -C, though, I imagine you'll see your output:

hadoop fs -cat /path/to/file | hexdump -C

EDIT

You're also using BufferedWriter.write(int), which actually writes out the int as a character. You should be using FSDataOutputStream.writeInt(iterations) (it's actually a method inherited from DataOutputStream) and then FSDataInputStream.readInt() to read it back again.
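Since FSDataOutputStream and FSDataInputStream extend java.io.DataOutputStream and DataInputStream, the round trip can be sketched with plain in-memory streams; the byte layout in an HDFS file written with writeInt would be identical. The value 4 is again an assumed example:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class WriteIntDemo {
    // Bytes produced by DataOutputStream.writeInt(value): big-endian, always 4 bytes.
    static byte[] intBytes(int value) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(value);
        out.close();
        return buf.toByteArray();
    }

    // Mirrors FSDataInputStream.readInt() on the same bytes.
    static int readBack(byte[] bytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        return in.readInt();
    }

    public static void main(String[] args) throws IOException {
        byte[] b = intBytes(4);
        System.out.println(b.length); // 4 bytes on disk, matching the size reported by `hadoop fs -ls`
        System.out.printf("%02x %02x %02x %02x%n", b[0], b[1], b[2], b[3]); // 00 00 00 04
        System.out.println(readBack(b)); // 4
    }
}
```

Note the output is still binary (00 00 00 04), so hadoop fs -cat will continue to show nothing printable; readInt() is the matching way to get the value back.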

Answered 2012-07-19T10:29:55.350