Wednesday, 15 September 2010

hadoop - How to connect to remote datanode using hdfs client?


My goal is to download files from HDFS to the local file system. I am using a client that connects to a remote HDFS NameNode:

  hadoop fs -get hdfs://sourceHDFS:8020/path_to_file/file /path_to_save_file

and I got an exception:

  15/03/17 00:18:49 WARN client.ShortCircuitCache: ShortCircuitCache(0x11bbad83): failed to load 1073754800_BP-703742109-127.0.0.1-1398459391664
  15/03/17 12:18:49 WARN hdfs.BlockReaderFactory: I/O error constructing remote block reader.
  java.io.IOException: Got error for OP_READ_BLOCK, self=/127.0.0.1:57733, remote=bigdatalite.localdomain/127.0.0.1:50010, for file /user/hive/warehouse/b2_olap_hive.db/dim_deal_log/000000_0, for pool BP-703742109-127.0.0.1-1398459391664 block 1073754800_13977
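For context, the same download can be attempted through the HDFS Java API instead of the shell. This is only a minimal sketch: the NameNode URI and paths are the ones from the question, the class name is made up, and it hits the same kind of exception when the client cannot reach the DataNode address returned by the NameNode.

  import java.net.URI;
  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  public class HdfsDownload {
      public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();
          // Connect to the remote NameNode from the question.
          FileSystem fs = FileSystem.get(URI.create("hdfs://sourceHDFS:8020"), conf);
          // Copy the HDFS file to the local file system (equivalent of hadoop fs -get).
          fs.copyToLocalFile(new Path("/path_to_file/file"), new Path("/path_to_save_file"));
          fs.close();
      }
  }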

My understanding of the situation: the HDFS client connects to the NameNode, but the NameNode returns the DataNode's local IP (because the NameNode and DataNode are located on the same machine), and 127.0.0.1 is the wrong DataNode address for a remote client.

How can I connect to the right DataNode? Or is my understanding wrong?
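(A hedged aside, not part of the original question: one way to confirm this theory is to check which address each DataNode registered with the NameNode, for example with the standard dfsadmin report run on the cluster or from a configured client. If it shows 127.0.0.1, the DataNode is advertising its loopback address.)

  hdfs dfsadmin -report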

Thank you in advance

You cannot have the DataNode bound to 127.0.0.1. Ensure that the hostname entry in /etc/hosts points to the non-loopback interface. Bounce your datanodes and namenodes.
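A minimal sketch of what that advice looks like, assuming an example IP that is not in the original post: on the DataNode machine, map the host's name to its real interface rather than to the loopback address, and optionally tell a remote client to resolve DataNodes by hostname instead of by the IP the NameNode reports.

  # /etc/hosts on the DataNode host (192.168.1.10 is an assumed example IP)
  192.168.1.10   bigdatalite.localdomain   bigdatalite

  <!-- hdfs-site.xml on the client (optional, assumed workaround):
       resolve DataNodes by hostname rather than the advertised IP -->
  <property>
    <name>dfs.client.use.datanode.hostname</name>
    <value>true</value>
  </property>

With the hosts entry fixed and the daemons restarted, the NameNode should hand the client a reachable DataNode address instead of 127.0.0.1.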

