
After loading and initializing RHive (with rhive.init()), rhive.connect() fails with the following error:

java.lang.UnsatisfiedLinkError: no MapRClient in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734)
        at java.lang.Runtime.loadLibrary0(Runtime.java:823)
        at java.lang.System.loadLibrary(System.java:1028)
        at com.mapr.fs.MapRFileSystem.<clinit>(MapRFileSystem.java:1298)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1028)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1079)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1413)
        at org.apache.hadoop.fs.FileSystem.access$100(FileSystem.java:69)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1453)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1435)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:232)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:115)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at RJavaTools.invokeMethod(RJavaTools.java:386)
Unable to load libMapRClient.so native library
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  :
  java.lang.UnsatisfiedLinkError: no MapRClient in java.library.path
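
For reference, the full sequence of R calls that triggers this is simply the following (a minimal sketch of my session; nothing beyond the setup described here is assumed):

library(RHive)
rhive.init()     # picks up HADOOP_HOME, HADOOP_CONF_DIR and HIVE_HOME from the environment
rhive.connect()  # fails with the UnsatisfiedLinkError shown above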

Running rhive.connect("10.2.138.168") with the Hive server's IP address produces the same error. My environment variables are set as follows:

MAHOUT_HOME=/opt/mapr/mahout/mahout-0.7  
JAVA_HOME=/usr/java/default
HADOOP_HOME=/opt/mapr/hadoop/hadoop-0.20.2
HADOOP_CONF_DIR=/opt/mapr/hadoop/hadoop-0.20.2/conf  
HIVE_HOME=/opt/mapr/hive/hive-0.9.0

Running rhive.env() produces the following warnings/errors:

Hive Home Directory : /opt/mapr/hive/hive-0.9.0
Hadoop Home Directory : /opt/mapr/hadoop/hadoop-0.20.2
Hadoop Conf Directory : /opt/mapr/hadoop/hadoop-0.20.2/conf
Default RServe List
################################# IMPORTANT #############################################
############################### /\/\/\/\/\/\/\ ##########################################
# Use of slaves and masters file to start/stop jobtracker/tasktracker is not supported.
# Please use maprcli:
# /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker start
# /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker start
# /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker stop
# /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker stop
#########################################################################################
warning: cant't connect to a Rserver at ################################# IMPORTANT #############################################:6311
warning: cant't connect to a Rserver at ############################### /\/\/\/\/\/\/\ ##########################################:6311
warning: cant't connect to a Rserver at # Use of slaves and masters file to start/stop jobtracker/tasktracker is not supported.:6311
warning: cant't connect to a Rserver at # Please use maprcli::6311
warning: cant't connect to a Rserver at # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker start:6311
warning: cant't connect to a Rserver at # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker start:6311
warning: cant't connect to a Rserver at # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker stop:6311
warning: cant't connect to a Rserver at # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker stop:6311
warning: cant't connect to a Rserver at #########################################################################################:6311
Disconnected HiveServer and HDFS
Warning messages:
1: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  ################################# IMPORTANT #############################################:6311 cannot be opened
2: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  ############################### /\/\/\/\/\/\/\ ##########################################:6311 cannot be opened
3: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  # Use of slaves and masters file to start/stop jobtracker/tasktracker is not supported.:6311 cannot be opened
4: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  # Please use maprcli::6311 cannot be opened
5: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker start:6311 cannot be opened
6: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker start:6311 cannot be opened
7: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -jobtracker stop:6311 cannot be opened
8: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  # /opt/mapr/bin/maprcli node services -nodes <list of ip addrs> -tasktracker stop:6311 cannot be opened
9: In socketConnection(host, port, open = "a+b", blocking = TRUE) :
  #########################################################################################:6311 cannot be opened
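
The "hosts" in those warnings are clearly lines of the MapR banner rather than real hostnames, so RHive seems to be parsing a config file that only contains that banner. A quick way to check this (a sketch, assuming RHive builds its default Rserve list from the slaves file under HADOOP_CONF_DIR) is:

readLines(file.path(Sys.getenv("HADOOP_CONF_DIR"), "slaves"))  # shows the MapR banner instead of hostnames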

Edit:

I then set the following:

export LD_LIBRARY_PATH=/opt/mapr/hadoop/hadoop-0.20.2/lib/native/Linux-amd64-64/

Now rhive.connect() returns the following error:

2012-12-27 17:09:23,1578 ERROR Client fs/client/fileclient/cc/client.cc:676 Thread: 139849490286464 Unlink failed for file rhive_udf.jar, error Permission denied(13)
2012-12-27 17:09:23,1578 ERROR JniCommon fs/client/fileclient/cc/jni_common.cc:1219 Thread: 139849490286464 remove: File /rhive/lib/rhive_udf.jar, rpc error, Permission denied(13)
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  :
  java.io.IOException: Target hdfs:/rhive/lib/rhive_udf.jar already exists
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/mapr/hive/hive-0.9.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/mapr/hadoop/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl,  :
  org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
NULL
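
The first two lines of that output look like a MapR-FS permission problem on /rhive/lib/rhive_udf.jar; a quick way to look at the ownership and permissions of that path from the same session (a sketch, assuming the hadoop CLI from HADOOP_HOME is on the PATH) is:

system("hadoop fs -ls /rhive/lib")  # check owner/permissions of the existing rhive_udf.jar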

Any ideas where I am going wrong? Thanks!


1 Answer


Java cannot find the native library (libMapRClient.so). You need to set the java.library.path property so that it points to the directory containing the library when the application starts. Try something like this:

java -Djava.library.path=/opt/mapr/hadoop/hadoop-0.20.2/lib/native/
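
Since RHive starts the JVM inside R via rJava, there is no java command line to add the flag to; one way to get the same effect (a sketch, assuming rJava/RHive pass the java.parameters option through when the JVM is initialized, and using the Linux-amd64-64 directory mentioned in the question) is:

# set before library(RHive) / rhive.init(), i.e. before the embedded JVM starts
options(java.parameters = "-Djava.library.path=/opt/mapr/hadoop/hadoop-0.20.2/lib/native/Linux-amd64-64/")

Alternatively, exporting LD_LIBRARY_PATH to that directory before starting R (as in your edit) should also let the JVM find libMapRClient.so.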