NoSuchMethodError exception when connecting to HBase
I get a peculiar exception when accessing HBase from Scala code.
Here is the long version of the story:
I installed HBase 1.2.6 in standalone mode on my machine (Ubuntu 16.04). Then I tried to run the following code:
package hbase

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.client.ConnectionFactory
import org.apache.hadoop.hbase.io.compress.Compression.Algorithm
import org.apache.hadoop.hbase.{HBaseConfiguration, HColumnDescriptor, HTableDescriptor, TableName}

object HelloWorld {
  private val TABLE_NAME = "MY_TABLE_NAME_TOO"
  private val CF_DEFAULT = "DEFAULT_COLUMN_FAMILY"

  def main(args: Array[String]): Unit = {
    // Picks up hbase-site.xml from the classpath
    val config: Configuration = HBaseConfiguration.create()
    val connection = ConnectionFactory.createConnection(config)
    val admin = connection.getAdmin
    // Describe a table with a single, uncompressed column family
    val table = new HTableDescriptor(TableName.valueOf(TABLE_NAME))
    table.addFamily(new HColumnDescriptor(CF_DEFAULT).setCompressionType(Algorithm.NONE))
    System.out.print("Creating table. ")
    admin.createTable(table)
    System.out.println(" Done.")
  }
}
The build.sbt file:
name := "myapp"
version := "1.0"
scalaVersion := "2.12.1"
libraryDependencies += "org.apache.storm" % "storm-core" % "1.0.3"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.6" exclude("org.slf4j","slf4j-log4j12")
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.6" exclude("org.slf4j","slf4j-log4j12")
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.8.0" exclude("org.slf4j","slf4j-log4j12")
libraryDependencies += "org.apache.hbase" % "hbase-protocol" % "1.2.6" exclude("org.slf4j","slf4j-log4j12")
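(One thing I am not sure about: whether the hadoop-common version I pin explicitly, 2.8.0, agrees with the Hadoop line that hbase-client 1.2.6 pulls in transitively. If it does not, a variant that lets them match might look like the sketch below — the 2.5.1 version number is an assumption on my part, not something I have verified against HBase's POM:)

```
// Hypothetical variant: use the Hadoop line hbase-client 1.2.6 is assumed
// to have been built against (2.5.x), instead of pinning the newer 2.8.0.
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.5.1" exclude("org.slf4j", "slf4j-log4j12")
```

(The effective, post-conflict-resolution versions can also be listed with sbt's built-in `evicted` task.)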
I run the code from sbt with these commands:
$ sbt
> compile
> run
I get the following error:
[info] Running hbase.HelloWorld
[info] 566 [main] WARN o.a.h.u.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[error] Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosTicket(Ljavax/security/auth/Subject;)Z
[error] at org.apache.hadoop.security.UserGroupInformation.<init>(UserGroupInformation.java:652)
[error] at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:843)
[error] at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:802)
[error] at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:675)
[error] at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:293)
[error] at org.apache.hadoop.hbase.security.User.getCurrent(User.java:191)
[error] at org.apache.hadoop.hbase.security.UserProvider.getCurrent(UserProvider.java:167)
[error] at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:215)
[error] at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
[error] at hbase.HelloWorld$.main(HelloWorld.scala:48)
[error] at hbase.HelloWorld.main(HelloWorld.scala)
java.lang.RuntimeException: Nonzero exit code returned from runner: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code returned from runner: 1
[error] Total time: 1 s, completed Jun 12, 2017 2:56:48 PM
The exception is thrown when the line
val connection = ConnectionFactory.createConnection(config)
is reached.
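To at least narrow it down, a small reflection probe (class and method names taken from the error message above; this is only a diagnostic sketch, not a fix) can show whether the method the JVM complains about is visible on the runtime classpath at all:

```scala
// Diagnostic sketch: probe for the method named in the NoSuchMethodError.
// Possible results: the method exists, the class exists but the method is
// missing (suggesting an older hadoop-auth jar won dependency resolution),
// or the class is absent entirely.
def checkKerberosUtil(): String =
  try {
    val cls = Class.forName("org.apache.hadoop.security.authentication.util.KerberosUtil")
    if (cls.getMethods.exists(_.getName == "hasKerberosTicket")) "method present"
    else "method missing"
  } catch {
    case _: ClassNotFoundException => "class not on classpath"
  }

println(checkKerberosUtil())
```

If this prints "method missing", the jar that resolved onto the classpath is presumably older than the one hadoop-common 2.8.0's UserGroupInformation was compiled against.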
This makes no sense to me. I hope someone can shed some light on it.