scala - Running a Spark Application in IntelliJ 14.1.3


I am trying to run a Spark application written in Scala in IntelliJ 14.1.3. The Scala SDK is scala-sdk-2.11.6. I get the following error when executing the code:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
    at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
    at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
    at akka.actor.RootActorPath.$div(ActorPath.scala:159)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
    at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
    at scala.util.Try$.apply(Try.scala:191)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at scala.util.Success.flatMap(Try.scala:230)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
    at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:584)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:577)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
    at LRParquetProcess$.main(LRParquetProcess.scala:9)
    at LRParquetProcess.main(LRParquetProcess.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)

Process finished with exit code 1

My pom.xml is shown below:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>parquetgeneration</groupId>
    <artifactId>parquetgeneration</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <hadoop.version>2.7.0</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.3.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-client</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.eclipse.jetty</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-app</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.10.5</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-compiler</artifactId>
            <version>2.10.5</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>com.typesafe.akka</groupId>
            <artifactId>akka-actor_2.10</artifactId>
            <version>2.3.11</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.3.1</version>
        </dependency>
    </dependencies>
</project>

Go with Scala 2.10; it is the better choice at the moment. The NoSuchMethodError is a Scala binary-compatibility problem: every dependency in the pom (spark-core_2.10, spark-sql_2.10, spark-hive_2.10, akka-actor_2.10, scala-library 2.10.5) is built against Scala 2.10, while the IntelliJ project is using the scala-sdk-2.11.6 SDK. Either switch the module's Scala SDK to a 2.10.x version or move every dependency to the _2.11 artifacts; what matters is that only one Scala major version ends up on the classpath.
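If you stay on 2.10, one way to keep the versions from drifting apart again is to centralise them in Maven properties and reference those everywhere. Below is a minimal sketch of that idea; the scala.version, scala.binary.version and spark.version property names are my own convention (they are not in the pom above), and it also assumes you align spark-sql with the 1.3.1 version already used by spark-core and spark-hive instead of the 1.2.1 currently declared:

    <properties>
        <!-- assumption: stay on Scala 2.10 to match the existing _2.10 artifacts -->
        <scala.version>2.10.5</scala.version>
        <scala.binary.version>2.10</scala.binary.version>
        <spark.version>1.3.1</spark.version>
        <hadoop.version>2.7.0</hadoop.version>
    </properties>

    <dependencies>
        <!-- Scala standard library pinned to the single version used everywhere -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <!-- all Spark artifacts share the same binary suffix and Spark version -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <!-- the Hadoop and Akka dependencies follow the same pattern as in the original pom -->
    </dependencies>

After changing the pom, also check the module's Scala SDK in IntelliJ (File > Project Structure): with the dependencies above it should be a 2.10.x SDK rather than scala-sdk-2.11.6.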

