Scala: Cannot load main class from JAR file

Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must likewise follow the CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/40830638/

Date: 2020-10-22 08:52:47  Source: igfitidea

Cannot load main class from JAR file

Tags: scala, hadoop, apache-spark, sbt

Asked by sirine

I have a Spark-Scala application that simply prints the message "Hello my App". It compiles fine with sbt compile, and sbt run also works: the message is printed successfully, but then an error is displayed, like this:

Hello my application!
16/11/27 15:17:11 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
    at org.apache.spark.ContextCleaner$$anon.run(ContextCleaner.scala:67)
16/11/27 15:17:11 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
[success] Total time: 13 s, completed Nov 27, 2016 3:17:12 PM
16/11/27 15:17:12 INFO DiskBlockManager: Shutdown hook called

I can't tell whether this is OK or not. Also, when I try to run my JAR file afterwards, it displays another error.

My command line looks like:

spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

And the error is:

Error: Cannot load main class from JAR file:/root/projectFilms/appfilms
Run with --help for usage help or --verbose for debug output
16/11/27 15:24:11 INFO Utils: Shutdown hook called

Can you please help me?

Answered by Paul Velthuis

The error occurs because the SparkContext is not stopped; stopping it explicitly is required in versions higher than Spark 2.x. To prevent this error, stop it with SparkContext.stop() or sc.stop(). Inspiration for solving this error came from my own experience and the following sources: Spark Context, Spark Listener Bus error
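As a sketch, the fix in the asker's driver would look roughly like this (the object name AppFilms and the app name are assumptions, since the original code is not shown; assumes Spark is on the classpath):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical driver matching the question; "AppFilms" is an assumed name.
object AppFilms {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("appfilms").setMaster("local[4]")
    val sc = new SparkContext(conf)
    println("Hello my application!")
    sc.stop() // stop the SparkContext explicitly so shutdown doesn't interrupt its threads
  }
}
```

With sc.stop() in place, the SparkListenerBus and ContextCleaner threads are shut down cleanly instead of being interrupted by the JVM exit.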

Answered by devD

You forgot to use the --class parameter in spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar. It should be:

spark-submit --class "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

Please note that if appfilms belongs to a package, don't forget to add the package name, as in packagename.appfilms.
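To illustrate how the --class argument must match the fully qualified name, here is a minimal sketch (the package name "recommender" and the object name AppFilms are hypothetical):

```scala
// If the main object is declared inside a package, e.g.:
//   package recommender
// then spark-submit must be given the fully qualified class name:
//   spark-submit --class recommender.AppFilms --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar
object AppFilms {
  val greeting = "Hello my application!"

  def main(args: Array[String]): Unit = println(greeting)
}
```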

I believe this will suffice.