Tuesday, January 6, 2015

[cdh-user] Failed shortcircuit with CDH5.3.1: shortcircuit.DomainSocketFactory: error creating DomainSocket

I am running the HBase PerformanceEvaluation tool against the MR1 installation of CDH 5.3.1, but I get the error below:

2015-01-05 04:00:33,783 WARN  [main] mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
2015-01-05 04:00:37,178 INFO  [main] input.FileInputFormat: Total input paths to process : 1
2015-01-05 04:00:37,198 WARN  [main] shortcircuit.DomainSocketFactory: error creating DomainSocket
java.net.ConnectException: connect(2) error: No such file or directory when trying to connect to '/var/run/hadoop-hdfs/dn.50010'
        at org.apache.hadoop.net.unix.DomainSocket.connect0(Native Method)
        at org.apache.hadoop.net.unix.DomainSocket.connect(DomainSocket.java:250)
        at org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory.createSocket(DomainSocketFactory.java:163)
        at org.apache.hadoop.hdfs.BlockReaderFactory.nextDomainPeer(BlockReaderFactory.java:719)
        at org.apache.hadoop.hdfs.BlockReaderFactory.createShortCircuitReplicaInfo(BlockReaderFactory.java:441)
        at org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.create(ShortCircuitCache.java:783)
        at org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.fetchOrCreate(ShortCircuitCache.java:717)
        at org.apache.hadoop.hdfs.BlockReaderFactory.getBlockReaderLocal(BlockReaderFactory.java:394)
        at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:305)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:574)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:797)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:844)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.util.LineReader.fillBuffer(LineReader.java:180)
        at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:216)
        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:174)
        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:370)
        at org.apache.hadoop.mapreduce.lib.input.NLineInputFormat.getSplitsForFile(NLineInputFormat.java:102)
        at org.apache.hadoop.mapreduce.lib.input.NLineInputFormat.getSplits(NLineInputFormat.java:79)
        at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1107)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1124)
        at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:178)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:1023)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:976)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:976)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:582)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:612)
        at org.apache.hadoop.hbase.PerformanceEvaluation.doMapReduce(PerformanceEvaluation.java:409)
        at org.apache.hadoop.hbase.PerformanceEvaluation.runTest(PerformanceEvaluation.java:1080)
        at org.apache.hadoop.hbase.PerformanceEvaluation.run(PerformanceEvaluation.java:1282)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.hadoop.hbase.PerformanceEvaluation.main(PerformanceEvaluation.java:1303)
2015-01-05 04:00:37,203 WARN  [main] shortcircuit.ShortCircuitCache: ShortCircuitCache(0x76b06322): failed to load 1073741837_BP-1525690555-10.154.8.10-1420448369642

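If it helps, my understanding of where the `dn.50010` path in the error comes from: the `_PORT` token in `dfs.domain.socket.path` is replaced with the DataNode's transfer port (50010 by default), and the client then tries to connect to the resulting path. A small sketch of that substitution (the port value here is taken from the log above, not from any config I can see):

```shell
# Sketch: how the client-side socket path is derived from the
# dfs.domain.socket.path template. The DataNode port (50010) is an
# assumption based on the path shown in the stack trace.
path_template='/var/run/hadoop-hdfs/dn._PORT'
dn_port=50010
expected=$(printf '%s' "$path_template" | sed "s/_PORT/$dn_port/")
echo "$expected"
```

So the `ConnectException` means no socket file exists at `/var/run/hadoop-hdfs/dn.50010` on the host where the client is reading.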

Configuration in my hdfs-site.xml:
      <property>
        <name>dfs.block.local-path-access.user</name>
        <value>root</value>
      </property>
      <property>
        <name>dfs.client.read.shortcircuit.streams.cache.size</name>
        <value>1000</value>
      </property>
      <property>
        <name>dfs.client.read.shortcircuit.streams.cache.size.expiry.ms</name>
        <value>1000</value>
      </property>
      <property>
        <name>dfs.client.read.shortcircuit.streams.cache.size</name>
        <value>1000000</value>
      </property>
      <property>
        <name>dfs.client.read.shortcircuit.streams.cache.size.expiry.ms</name>
        <value>1600000</value>
      </property>
      <property>
        <name>dfs.client.domain.socket.data.traffic</name>
        <value>false</value>
      </property>
      <property>
        <name>dfs.domain.socket.path</name>
        <value>/var/run/hadoop-hdfs/dn._PORT</value>
      </property>
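As far as I know, the DataNode itself creates the UNIX domain socket at `dfs.domain.socket.path` when short-circuit reads are enabled on its side, and the parent directory (`/var/run/hadoop-hdfs` here) must already exist and be writable by the HDFS user before the DataNode starts. A minimal sketch of the check I would run on the DataNode host (simulated under a temp directory here, since the real path depends on the cluster):

```shell
# Sketch: verify the domain socket file exists where the client
# expects it. On a real DataNode host, $dir would be
# /var/run/hadoop-hdfs; a temp dir stands in for it here.
dir=$(mktemp -d)
sock="$dir/dn.50010"
if [ -S "$sock" ]; then
  status="socket present"
else
  status="socket missing"
fi
echo "$status"
rmdir "$dir"
```

If the socket file is missing on the DataNode host, the client-side config alone cannot make short-circuit reads work.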

Configuration in my hbase-site.xml:

      <property>
        <name>dfs.client.read.shortcircuit</name>
        <value>true</value>
      </property>
      <property>
        <name>hbase.regionserver.checksum.verify</name>
        <value>true</value>
      </property>

What is wrong with it? I installed Hadoop MR1 and HBase from RPMs and modified the configs by hand. Did I forget something that is needed to enable the socket path?

