
I'm testing Spark in spark-shell with Scala code, building a prototype that uses Kafka and Spark, but I'm unable to create a SparkContext.

I launched spark-shell as follows:

spark-shell --jars ~/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar 

Then I ran the code below in the shell:

import kafka.serializer.StringDecoder 
import org.apache.spark.streaming._ 
import org.apache.spark.streaming.kafka._ 
import org.apache.spark.SparkConf 


// Create context with 2 second batch interval 
val sparkConf = new SparkConf().setAppName("DirectKafkaWordCount") 
val ssc = new StreamingContext(sparkConf, Seconds(2)) 

Then I hit an error when creating ssc; spark-shell printed the message below:

scala> val ssc = new StreamingContext(sparkConf, Seconds(2)) 
15/06/05 09:06:08 INFO SparkContext: Running Spark version 1.3.1 
15/06/05 09:06:08 INFO SecurityManager: Changing view acls to: vagrant 
15/06/05 09:06:08 INFO SecurityManager: Changing modify acls to: vagrant 
15/06/05 09:06:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vagrant); users with modify permissions: Set(vagrant) 
15/06/05 09:06:08 INFO Slf4jLogger: Slf4jLogger started 
15/06/05 09:06:08 INFO Remoting: Starting remoting 
15/06/05 09:06:08 INFO Utils: Successfully started service 'sparkDriver' on port 51270. 
15/06/05 09:06:08 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.2.15:51270] 
15/06/05 09:06:08 INFO SparkEnv: Registering MapOutputTracker 
15/06/05 09:06:08 INFO SparkEnv: Registering BlockManagerMaster 
15/06/05 09:06:08 INFO DiskBlockManager: Created local directory at /tmp/spark-d3349ba2-125b-4dda-83fa-abfa6c692143/blockmgr-c0e59bba-c4df-423f-b147-ac55d9bd5ccf 
15/06/05 09:06:08 INFO MemoryStore: MemoryStore started with capacity 267.3 MB 
15/06/05 09:06:08 INFO HttpFileServer: HTTP File server directory is /tmp/spark-842c15d5-7e3f-49c8-a4d0-95bdf5c6b049/httpd-26f5e751-8406-4a97-9ed3-aa79fc46bc6e 
15/06/05 09:06:08 INFO HttpServer: Starting HTTP Server 
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT 
15/06/05 09:06:08 INFO AbstractConnector: Started SocketConnector@0.0.0.0:55697 
15/06/05 09:06:08 INFO Utils: Successfully started service 'HTTP file server' on port 55697. 
15/06/05 09:06:08 INFO SparkEnv: Registering OutputCommitCoordinator 
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT 
15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use 
java.net.BindException: Address already in use 
     at sun.nio.ch.Net.bind0(Native Method) 
     at sun.nio.ch.Net.bind(Net.java:444) 
     at sun.nio.ch.Net.bind(Net.java:436) 
     at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214) 
     at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) 
     at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187) 
     at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316) 
     at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265) 
     at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64) 
     at org.spark-project.jetty.server.Server.doStart(Server.java:293) 
     at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64) 
     at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199) 
     at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209) 
     at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209) 
     at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837) 
     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) 
     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828) 
     at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209) 
     at org.apache.spark.ui.WebUI.bind(WebUI.scala:120) 
     at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309) 
     at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309) 
     at scala.Option.foreach(Option.scala:236) 
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:309) 
     at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643) 
     at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48) 
     at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:50) 
     at $line35.$read$$iwC$$iwC.<init>(<console>:52) 
     at $line35.$read$$iwC.<init>(<console>:54) 
     at $line35.$read.<init>(<console>:56) 
     at $line35.$read$.<init>(<console>:60) 
     at $line35.$read$.<clinit>(<console>) 
     at $line35.$eval$.<init>(<console>:7) 
     at $line35.$eval$.<clinit>(<console>) 
     at $line35.$eval.$print(<console>) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) 
     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338) 
     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) 
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) 
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) 
     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856) 
     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901) 
     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813) 
     at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656) 
     at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664) 
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944) 
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) 
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944) 
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058) 
     at org.apache.spark.repl.Main$.main(Main.scala:31) 
     at org.apache.spark.repl.Main.main(Main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED [email protected]: java.net.BindException: Address already in use 
java.net.BindException: Address already in use 
     at sun.nio.ch.Net.bind0(Native Method) 
     at sun.nio.ch.Net.bind(Net.java:444) 
     at sun.nio.ch.Net.bind(Net.java:436) 
     at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214) 
     at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) 
     at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187) 
     at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316) 
     at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265) 
     at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64) 
     at org.spark-project.jetty.server.Server.doStart(Server.java:293) 
     at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64) 
     at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199) 
     at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209) 
     at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209) 
     at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837) 
     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) 
     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828) 
     at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209) 
     at org.apache.spark.ui.WebUI.bind(WebUI.scala:120) 
     at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309) 
     at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309) 
     at scala.Option.foreach(Option.scala:236) 
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:309) 
     at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643) 
     at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46) 
     at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48) 
     at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:50) 
     at $line35.$read$$iwC$$iwC.<init>(<console>:52) 
     at $line35.$read$$iwC.<init>(<console>:54) 
     at $line35.$read.<init>(<console>:56) 
     at $line35.$read$.<init>(<console>:60) 
     at $line35.$read$.<clinit>(<console>) 
     at $line35.$eval$.<init>(<console>:7) 
     at $line35.$eval$.<clinit>(<console>) 
     at $line35.$eval.$print(<console>) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) 
     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338) 
     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) 
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) 
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) 
     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856) 
     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901) 
     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813) 
     at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656) 
     at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664) 
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944) 
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) 
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944) 
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058) 
     at org.apache.spark.repl.Main$.main(Main.scala:31) 
     at org.apache.spark.repl.Main.main(Main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null} 
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null} 
15/06/05 09:06:08 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041. 
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT 
15/06/05 09:06:08 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041 
15/06/05 09:06:08 INFO Utils: Successfully started service 'SparkUI' on port 4041. 
15/06/05 09:06:08 INFO SparkUI: Started SparkUI at http://localhost:4041 
15/06/05 09:06:08 INFO SparkContext: Added JAR file:/home/vagrant/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar at http://10.0.2.15:55697/jars/spark-streaming-kafka-assembly_2.10-1.3.1.jar with timestamp 1433495168735 
15/06/05 09:06:08 INFO Executor: Starting executor ID <driver> on host localhost 
15/06/05 09:06:08 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@10.0.2.15:51270/user/HeartbeatReceiver 
15/06/05 09:06:08 INFO NettyBlockTransferService: Server created on 37393 
15/06/05 09:06:08 INFO BlockManagerMaster: Trying to register BlockManager 
15/06/05 09:06:08 INFO BlockManagerMasterActor: Registering block manager localhost:37393 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 37393) 
15/06/05 09:06:08 INFO BlockManagerMaster: Registered BlockManager 
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at: 
org.apache.spark.SparkContext.<init>(SparkContext.scala:80) 
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016) 
$iwC$$iwC.<init>(<console>:9) 
$iwC.<init>(<console>:18) 
<init>(<console>:20) 
.<init>(<console>:24) 
.<clinit>(<console>) 
.<init>(<console>:7) 
.<clinit>(<console>) 
$print(<console>) 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
java.lang.reflect.Method.invoke(Method.java:606) 
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) 
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338) 
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) 
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) 
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) 
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856) 
     at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1812) 
     at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1808) 
     at scala.Option.foreach(Option.scala:236) 
     at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1808) 
     at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1795) 
     at scala.Option.foreach(Option.scala:236) 
     at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1795) 
     at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:1847) 
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:1754) 
     at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643) 
     at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75) 
     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29) 
     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34) 
     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36) 
     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38) 
     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40) 
     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42) 
     at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44) 
     at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46) 
     at $iwC$$iwC$$iwC$$iwC.<init>(<console>:48) 
     at $iwC$$iwC$$iwC.<init>(<console>:50) 
     at $iwC$$iwC.<init>(<console>:52) 
     at $iwC.<init>(<console>:54) 
     at <init>(<console>:56) 
     at .<init>(<console>:60) 
     at .<clinit>(<console>) 
     at .<init>(<console>:7) 
     at .<clinit>(<console>) 
     at $print(<console>) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) 
     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338) 
     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) 
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) 
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) 
     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856) 
     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901) 
     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813) 
     at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656) 
     at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664) 
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944) 
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944) 
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) 
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944) 
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058) 
     at org.apache.spark.repl.Main$.main(Main.scala:31) 
     at org.apache.spark.repl.Main.main(Main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

I wonder why the StreamingContext throws this error. Could you shed some light on the problem?

I also checked port 4040.

This is the list of open ports before running spark-shell:

[email protected]:~$ netstat -an | grep "LISTEN " 
tcp  0  0 0.0.0.0:22    0.0.0.0:*    LISTEN 
tcp  0  0 0.0.0.0:47078   0.0.0.0:*    LISTEN 
tcp  0  0 0.0.0.0:111    0.0.0.0:*    LISTEN 
tcp6  0  0 :::22     :::*     LISTEN 
tcp6  0  0 :::44461    :::*     LISTEN 
tcp6  0  0 :::111     :::*     LISTEN 
tcp6  0  0 :::80     :::*     LISTEN 

And this is the list of open ports after running spark-shell:

[email protected]:~$ netstat -an | grep "LISTEN " 
tcp  0  0 0.0.0.0:22    0.0.0.0:*    LISTEN 
tcp  0  0 0.0.0.0:47078   0.0.0.0:*    LISTEN 
tcp  0  0 0.0.0.0:111    0.0.0.0:*    LISTEN 
tcp6  0  0 :::22     :::*     LISTEN 
tcp6  0  0 :::55233    :::*     LISTEN 
tcp6  0  0 :::4040     :::*     LISTEN 
tcp6  0  0 10.0.2.15:41545   :::*     LISTEN 
tcp6  0  0 :::44461    :::*     LISTEN 
tcp6  0  0 :::111     :::*     LISTEN 
tcp6  0  0 :::56784    :::*     LISTEN 
tcp6  0  0 :::80     :::*     LISTEN 
tcp6  0  0 :::39602    :::*     LISTEN 

You can't create another Spark context inside the shell! Which version of Spark are you using? – eliasah


I pasted the wrong log; please look at the new one. My Spark version is 1.3.1. I'm running Spark on a VM created with Vagrant. –


You have another program running on your port 4040! – eliasah

Answers

Answer (8 votes)

A default SparkContext, sc, is created when you start spark-shell. The constructor you are using tries to create a second SparkContext instance, which is not what you should be doing. What you should really do is reuse the existing SparkContext to build the StreamingContext, via the overloaded constructor:

new StreamingContext(sparkContext: SparkContext, batchDuration: Duration) 

So your code should now look like this:

// The existing context's configuration cannot be changed at runtime
// (sc.getConf returns a copy), so set the master, app name, and ports
// when launching spark-shell rather than here.
// Use the existing 'sc' to create a StreamingContext with a 2-second batch interval
val ssc = new StreamingContext(sc, Seconds(2)) 
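
From there, the Kafka prototype can continue on top of ssc, using the direct (receiver-less) API from the assembly jar loaded above. A minimal sketch; the broker address ("localhost:9092") and topic name ("test") are placeholders:

import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

// Placeholder broker and topic values; adjust to your Kafka setup
val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
val topics = Set("test")
// Each element of the stream is a (key, value) pair
val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, topics)
messages.map(_._2).print() // print each batch's message values
ssc.start()
ssc.awaitTermination()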
Answer (4 votes)

You can change the Spark UI port using a property in the Spark configuration:

spark.ui.port=44040 
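
For example, in a standalone application you could set it on the SparkConf before the context is created. A minimal sketch, where the app name and port value are arbitrary:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local[2]")          // local mode, just for the sketch
  .setAppName("UiPortDemo")       // hypothetical app name
  .set("spark.ui.port", "44040")  // UI binds here instead of the default 4040
val sc = new SparkContext(conf)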

May I ask why I should change the Spark UI port? That isn't the reason the Spark context with Kafka fails to start. –


If some other process is using port 4040, then you can change it – Tinku


@JinhoYoo Maybe a Spark instance is already running. You can just kill the old Spark process. – sheh

Answer (1 vote)

When you launch spark-shell, a Spark context, sc, is basically already running. If you need to create a new Spark context for streaming, you have to use a port other than 4040, because that one is allocated to the first Spark context.

So, in the end, I wrote the code below to create another Spark context for the streaming process:

import kafka.serializer.StringDecoder 
import org.apache.spark.streaming._ 
import org.apache.spark.streaming.kafka._ 
import org.apache.spark.SparkConf 


// Create context with 2 second batch interval 
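// Note: spark.driver.allowMultipleContexts only disables Spark's
// one-context-per-JVM check; the shell's default sc keeps running
// in the same JVM alongside this new context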
val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount").set("spark.ui.port", "44040").set("spark.driver.allowMultipleContexts", "true") 
val ssc = new StreamingContext(conf, Seconds(2)) 
.... 

Thanks to everyone who suggested a solution. ;-)

Answer (1 vote)

I came here looking for this answer: I was trying to connect to Cassandra through Spark. Since a SparkContext, sc, is running by default, I kept getting this error:

Service 'SparkUI' could not bind on port 4040. Attempting port 4041. 

All I had to do was:

sc.stop 
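
After stopping the shell's default context, a new one can be created with whatever settings are needed. A minimal sketch, assuming spark-shell (so the master is inherited from the launch command) and a hypothetical app name:

import org.apache.spark.{SparkConf, SparkContext}

// Build a replacement context once the default one is stopped
val conf = new SparkConf().setAppName("CassandraDemo") // hypothetical name
val newSc = new SparkContext(conf)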

[I know this doesn't answer the question above, but this seems to be the only StackOverflow question that comes up in search, and others might find it useful.]

Answer (1 vote)

Maybe it's not the same case, but I had similar warnings, such as "WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041." I rebooted the machine and then it was fine: I started Spark and saw the scala> prompt.

Answer (0 votes)

I had already run into the same problem when starting Spark. I solved it with the following procedure: first I went to the spark/sbin directory, then I started a Spark session with this command:

./start-all.sh 

Or you can use ./start-master.sh and ./start-slave.sh for the same purpose. Now if you run spark-shell or pyspark or any other Spark component, it will automatically create the SparkContext object sc.