HBase 0.96 with Spark v1.0+


Problem Description

This combination of HBase/Spark versions appears to be pretty toxic. I have spent hours trying to find a MergeStrategy that would work, but to no avail.
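For reference, the MergeStrategy tuning mentioned above is of the following kind (a sketch assuming the sbt-assembly plugin; the rules shown are illustrative, not the exact ones tried). Note that a MergeStrategy only dedupes files inside the assembled fat jar; it does not change the classpath that sbt run uses, which is where the error below occurs.

// build.sbt -- illustrative sbt-assembly merge rules (assumed, not the
// asker's exact configuration)
assemblyMergeStrategy in assembly := {
  // Keep a single copy of duplicated javax.servlet classes.
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
  // Discard jar signature files so the fat jar ends up unsigned.
  case PathList("META-INF", xs @ _*) if xs.lastOption.exists(_.endsWith(".SF")) =>
    MergeStrategy.discard
  // Defer to sbt-assembly's default behaviour for everything else.
  case x => MergeStrategy.defaultMergeStrategy(x)
}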

Here is the core of the present build.sbt:

val sparkVersion = "1.0.0"
// val sparkVersion = "1.1.0-SNAPSHOT"

val hbaseVersion = "0.96.1.1-cdh5.0.2"

libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-client"   % hbaseVersion,
  "org.apache.hbase" % "hbase-common"   % hbaseVersion,
  "org.apache.hbase" % "hbase-server"   % hbaseVersion,
  "org.apache.hbase" % "hbase-protocol" % hbaseVersion,
  "org.apache.hbase" % "hbase-examples" % hbaseVersion,
  ("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(ExclusionRule("org.mortbay.jetty")),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()
)

The following is the error message that inevitably resurfaces:

14/06/27 19:49:24 INFO HttpServer: Starting HTTP Server
[error] (run-main-0) java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
        at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
        at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:794)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
        at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
        at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
        at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:98)
        at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:89)
        at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:65)
        at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:58)
        at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:58)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:58)
        at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:66)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:42)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:222)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
        at com.huawei.swlab.sparkpoc.hbase.HBasePop$.main(HBasePop.scala:31)
        at com.huawei.swlab.sparkpoc.hbase.HBasePop.main(HBasePop.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run last *:runMain for the full output.
14/06/27 19:49:44 INFO ConnectionManager: Selector thread was interrupted!
java.lang.RuntimeException: Nonzero exit code: 1
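
The root cause of this SecurityException is two providers of the javax.servlet package on the classpath, one of them signed (Jetty's org.eclipse.jetty.orbit servlet-api) and one unsigned (pulled in transitively through the HBase/Hadoop artifacts). A quick way to see which jar the class is actually loaded from is a one-off check like this sketch (a diagnostic aid, not part of the original post):

// Prints the jar that javax.servlet.FilterRegistration was loaded from;
// run it on the same classpath as the failing application.
object WhichServletJar {
  def main(args: Array[String]): Unit = {
    val src = classOf[javax.servlet.FilterRegistration]
      .getProtectionDomain.getCodeSource
    // getCodeSource is null for classes loaded by the bootstrap loader.
    println(Option(src).map(_.getLocation).getOrElse("bootstrap classpath"))
  }
}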

Solution

I was getting the exact same exception with my Spark/HBase application. I fixed it by moving the org.mortbay.jetty exclusion rule to my hbase-server dependency:

libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.6-cdh5.2.0" excludeAll ExclusionRule(organization = "org.mortbay.jetty")

If you have hadoop-common as one of your direct dependencies, then I also found it necessary to create an exclusion rule for the javax.servlet dependencies:

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.5.0-cdh5.2.0" excludeAll ExclusionRule(organization = "javax.servlet")
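
To verify that the exclusions actually took effect, it helps to inspect the resolved dependency tree. One option in the sbt 0.13 era is the sbt-dependency-graph plugin (the coordinates and version below are an assumption; adjust them for your sbt release):

// project/plugins.sbt -- assumed plugin for printing the resolved graph
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.5")

With the plugin added, running its dependencyTree task should show no servlet-api or org.mortbay.jetty artifacts left under hbase-server or hadoop-common.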

I left my Spark dependencies untouched:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0-cdh5.2.0"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.1.0-cdh5.2.0"

libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0-cdh5.2.0"
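
Mapped back onto the versions from the question, the same fix would look roughly like this (a sketch, untested against 0.96.1.1-cdh5.0.2):

val sparkVersion = "1.0.0"
val hbaseVersion = "0.96.1.1-cdh5.0.2"

libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-client"   % hbaseVersion,
  "org.apache.hbase" % "hbase-common"   % hbaseVersion,
  // The exclusion moves here: hbase-server is what drags in the
  // unsigned org.mortbay.jetty servlet classes.
  ("org.apache.hbase" % "hbase-server" % hbaseVersion)
    .excludeAll(ExclusionRule(organization = "org.mortbay.jetty")),
  "org.apache.hbase" % "hbase-protocol" % hbaseVersion,
  "org.apache.hbase" % "hbase-examples" % hbaseVersion,
  // The Spark dependencies no longer need the jetty exclusion.
  "org.apache.spark" % "spark-core_2.10" % sparkVersion,
  "org.apache.spark" % "spark-sql_2.10"  % sparkVersion
)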
