Fail to create SparkContext (Scala)
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/30662084/
Asked by Jinho Yoo
I'm testing Spark with Scala code in the spark-shell, building a prototype that uses Kafka and Spark.
I ran spark-shell as shown below.
spark-shell --jars ~/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar
And I ran the code below in the shell.
import kafka.serializer.StringDecoder
import org.apache.spark.streaming._
import org.apache.spark.streaming.kafka._
import org.apache.spark.SparkConf
// Create context with 2 second batch interval
val sparkConf = new SparkConf().setAppName("DirectKafkaWordCount")
val ssc = new StreamingContext(sparkConf, Seconds(2) )
Then I got an error when creating ssc; spark-shell printed the message below.
scala> val ssc = new StreamingContext(sparkConf, Seconds(2) )
15/06/05 09:06:08 INFO SparkContext: Running Spark version 1.3.1
15/06/05 09:06:08 INFO SecurityManager: Changing view acls to: vagrant
15/06/05 09:06:08 INFO SecurityManager: Changing modify acls to: vagrant
15/06/05 09:06:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vagrant); users with modify permissions: Set(vagrant)
15/06/05 09:06:08 INFO Slf4jLogger: Slf4jLogger started
15/06/05 09:06:08 INFO Remoting: Starting remoting
15/06/05 09:06:08 INFO Utils: Successfully started service 'sparkDriver' on port 51270.
15/06/05 09:06:08 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:51270]
15/06/05 09:06:08 INFO SparkEnv: Registering MapOutputTracker
15/06/05 09:06:08 INFO SparkEnv: Registering BlockManagerMaster
15/06/05 09:06:08 INFO DiskBlockManager: Created local directory at /tmp/spark-d3349ba2-125b-4dda-83fa-abfa6c692143/blockmgr-c0e59bba-c4df-423f-b147-ac55d9bd5ccf
15/06/05 09:06:08 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
15/06/05 09:06:08 INFO HttpFileServer: HTTP File server directory is /tmp/spark-842c15d5-7e3f-49c8-a4d0-95bdf5c6b049/httpd-26f5e751-8406-4a97-9ed3-aa79fc46bc6e
15/06/05 09:06:08 INFO HttpServer: Starting HTTP Server
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 INFO AbstractConnector: Started SocketConnector@0.0.0.0:55697
15/06/05 09:06:08 INFO Utils: Successfully started service 'HTTP file server' on port 55697.
15/06/05 09:06:08 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:444)
at sun.nio.ch.Net.bind(Net.java:436)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect(JettyUtils.scala:199)
at org.apache.spark.ui.JettyUtils$$anonfun.apply(JettyUtils.scala:209)
at org.apache.spark.ui.JettyUtils$$anonfun.apply(JettyUtils.scala:209)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort.apply$mcVI$sp(Utils.scala:1837)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:120)
at org.apache.spark.SparkContext$$anonfun.apply(SparkContext.scala:309)
at org.apache.spark.SparkContext$$anonfun.apply(SparkContext.scala:309)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:309)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:50)
at $line35.$read$$iwC$$iwC.<init>(<console>:52)
at $line35.$read$$iwC.<init>(<console>:54)
at $line35.$read.<init>(<console>:56)
at $line35.$read$.<init>(<console>:60)
at $line35.$read$.<clinit>(<console>)
at $line35.$eval$.<init>(<console>:7)
at $line35.$eval$.<clinit>(<console>)
at $line35.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret(SparkILoop.scala:856)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
at org.apache.spark.repl.SparkILoop.processLine(SparkILoop.scala:656)
at org.apache.spark.repl.SparkILoop.innerLoop(SparkILoop.scala:664)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply$mcZ$sp(SparkILoop.scala:996)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply(SparkILoop.scala:944)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@e067ac3: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:444)
at sun.nio.ch.Net.bind(Net.java:436)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect(JettyUtils.scala:199)
at org.apache.spark.ui.JettyUtils$$anonfun.apply(JettyUtils.scala:209)
at org.apache.spark.ui.JettyUtils$$anonfun.apply(JettyUtils.scala:209)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort.apply$mcVI$sp(Utils.scala:1837)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:120)
at org.apache.spark.SparkContext$$anonfun.apply(SparkContext.scala:309)
at org.apache.spark.SparkContext$$anonfun.apply(SparkContext.scala:309)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:309)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:50)
at $line35.$read$$iwC$$iwC.<init>(<console>:52)
at $line35.$read$$iwC.<init>(<console>:54)
at $line35.$read.<init>(<console>:56)
at $line35.$read$.<init>(<console>:60)
at $line35.$read$.<clinit>(<console>)
at $line35.$eval$.<init>(<console>:7)
at $line35.$eval$.<clinit>(<console>)
at $line35.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret(SparkILoop.scala:856)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
at org.apache.spark.repl.SparkILoop.processLine(SparkILoop.scala:656)
at org.apache.spark.repl.SparkILoop.innerLoop(SparkILoop.scala:664)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply$mcZ$sp(SparkILoop.scala:996)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply(SparkILoop.scala:944)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/06/05 09:06:08 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/06/05 09:06:08 INFO Utils: Successfully started service 'SparkUI' on port 4041.
15/06/05 09:06:08 INFO SparkUI: Started SparkUI at http://localhost:4041
15/06/05 09:06:08 INFO SparkContext: Added JAR file:/home/vagrant/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar at http://10.0.2.15:55697/jars/spark-streaming-kafka-assembly_2.10-1.3.1.jar with timestamp 1433495168735
15/06/05 09:06:08 INFO Executor: Starting executor ID <driver> on host localhost
15/06/05 09:06:08 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:51270/user/HeartbeatReceiver
15/06/05 09:06:08 INFO NettyBlockTransferService: Server created on 37393
15/06/05 09:06:08 INFO BlockManagerMaster: Trying to register BlockManager
15/06/05 09:06:08 INFO BlockManagerMasterActor: Registering block manager localhost:37393 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 37393)
15/06/05 09:06:08 INFO BlockManagerMaster: Registered BlockManager
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
$iwC$$iwC.<init>(<console>:9)
$iwC.<init>(<console>:18)
<init>(<console>:20)
.<init>(<console>:24)
.<clinit>(<console>)
.<init>(<console>:7)
.<clinit>(<console>)
$print(<console>)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:606)
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
org.apache.spark.repl.SparkIMain.loadAndRunReq(SparkIMain.scala:840)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
org.apache.spark.repl.SparkILoop.reallyInterpret(SparkILoop.scala:856)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$$anonfun$apply.apply(SparkContext.scala:1812)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$$anonfun$apply.apply(SparkContext.scala:1808)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning.apply(SparkContext.scala:1808)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning.apply(SparkContext.scala:1795)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1795)
at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:1847)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:1754)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
at $iwC$$iwC$$iwC.<init>(<console>:50)
at $iwC$$iwC.<init>(<console>:52)
at $iwC.<init>(<console>:54)
at <init>(<console>:56)
at .<init>(<console>:60)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret(SparkILoop.scala:856)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
at org.apache.spark.repl.SparkILoop.processLine(SparkILoop.scala:656)
at org.apache.spark.repl.SparkILoop.innerLoop(SparkILoop.scala:664)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply$mcZ$sp(SparkILoop.scala:996)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply(SparkILoop.scala:944)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I wonder why StreamingContext produces this error. Could you shed some light on the problem?
I also checked port 4040.
This is the list of open ports before running spark-shell.
vagrant@vagrant-ubuntu-trusty-64:~$ netstat -an | grep "LISTEN "
tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:47078 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN
tcp6 0 0 :::22 :::* LISTEN
tcp6 0 0 :::44461 :::* LISTEN
tcp6 0 0 :::111 :::* LISTEN
tcp6 0 0 :::80 :::* LISTEN
And this is the list of open ports after running spark-shell.
vagrant@vagrant-ubuntu-trusty-64:~$ netstat -an | grep "LISTEN "
tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:47078 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN
tcp6 0 0 :::22 :::* LISTEN
tcp6 0 0 :::55233 :::* LISTEN
tcp6 0 0 :::4040 :::* LISTEN
tcp6 0 0 10.0.2.15:41545 :::* LISTEN
tcp6 0 0 :::44461 :::* LISTEN
tcp6 0 0 :::111 :::* LISTEN
tcp6 0 0 :::56784 :::* LISTEN
tcp6 0 0 :::80 :::* LISTEN
tcp6 0 0 :::39602 :::* LISTEN
Answered by blueskin
A default SparkContext, sc, is created when you start the spark-shell. The constructor you are using tries to create another instance of SparkContext, which isn't what you should do. What you should really do is use the existing SparkContext to construct the StreamingContext via the overloaded constructor
new StreamingContext(sparkContext: SparkContext, batchDuration: Duration)
So now your code should look like this:
// Set the existing SparkContext's master, app name and other params
sc.getConf.setMaster("local[2]").setAppName("NetworkWordCount").set("spark.ui.port", "44040")
// Use 'sc' to create a streaming context with a 2-second batch interval
val ssc = new StreamingContext(sc, Seconds(2))
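For comparison, here is a minimal, self-contained sketch of the same pattern in a standalone application rather than the shell (my assumption, using the Spark 1.3-era APIs from the question; the object name is made up). One SparkContext is created once and shared with the StreamingContext, so the SPARK-2243 check never fires:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}
object ReuseContextExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("ReuseContextExample")
    val sc   = new SparkContext(conf)               // the only SparkContext in this JVM
    val ssc  = new StreamingContext(sc, Seconds(2)) // reuses sc; no second context is created
    // ... define the streaming computation on ssc here ...
    ssc.start()
    ssc.awaitTermination()
  }
}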
Answered by Tinku
You can change the Spark UI port by setting a property in the Spark config:
spark.ui.port=44040
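A minimal sketch of setting it programmatically (my assumption: the property must be set before the context is created; it can equally be passed on the command line as --conf spark.ui.port=44040 to spark-shell or spark-submit):
import org.apache.spark.SparkConf
// Build a conf that moves the web UI off the contended default port 4040.
val conf = new SparkConf()
  .setAppName("DirectKafkaWordCount")
  .set("spark.ui.port", "44040")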
Answered by Jinho Yoo
If you launch spark-shell, one SparkContext, sc, is already running. If you need to create a new SparkContext for streaming, you also need to use a port other than 4040, because that port has already been allocated by the first SparkContext.
So finally, I wrote the code below to create another SparkContext for the streaming process.
import kafka.serializer.StringDecoder
import org.apache.spark.streaming._
import org.apache.spark.streaming.kafka._
import org.apache.spark.SparkConf
// Create context with 2 second batch interval
val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount").set("spark.ui.port", "44040").set("spark.driver.allowMultipleContexts", "true")
val ssc = new StreamingContext(conf, Seconds(2))
....
Thank you to everybody who suggested a solution. ;-)
Answered by Priyank Desai
I came here looking for this answer: I was trying to connect to Cassandra through the spark shell. Since a SparkContext, sc, is running by default, I was getting the error:
Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
All I had to do was:
sc.stop
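For context, a minimal sketch of the stop-and-recreate pattern this relies on (assumption: run inside spark-shell, where sc already exists; the app name is made up):
import org.apache.spark.{SparkConf, SparkContext}
sc.stop() // releases port 4040 and clears the active-context slot (SPARK-2243)
val conf = new SparkConf().setMaster("local[2]").setAppName("CassandraTest")
val newSc = new SparkContext(conf) // now succeeds: no other context is running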
[I know this doesn't answer the question above, but this seems to be the only question on Stack Overflow that comes up in a search, and others might find it useful.]
Answered by kerabanaga
Maybe not the same case, but I had a similar warning: "WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041." I restarted the machine and then it was OK: I started spark-shell and got the scala> prompt.
Answered by Shubham Sharma
I faced the same issue while starting spark-shell. I resolved it with the procedure below: first I went to the spark/sbin directory, then I started the Spark standalone daemons with this command:
./start-all.sh
Or you can use ./start-master.sh and ./start-slave.sh for the same purpose.
Now if you run spark-shell or pyspark or any other Spark component, it will automatically create the SparkContext object sc for you.

