Errors and solutions when setting up an environment to run the Spark source code

Compiling the Spark 1.6.0 source (Scala 2.10.5, JDK 1.8.0_66) in IntelliJ IDEA on Windows fails with the following scalac error:
Error:scalac: while compiling: C:\Users\Administrator\IdeaProjects\spark-1.6.0\sql\core\src\main\scala\org\apache\spark\sql\util\QueryExecutionListener.scala
during phase: jvm
library version: version 2.10.5
compiler version: version 2.10.5
reconstructed args: -nobootcp -deprecation -classpath C:\Program Files\Java\jdk1.8.0_66\jre\lib\charsets.jar;C:\Program
......
C:\Program Files\Java\jdk1.8.0_66\jre\lib\rt.jar;C:\Users\Administrator\IdeaProjects\spark-1.6.0\sql\core\target\scala-2.10\classes;C:\Users\Administrator\IdeaProjects\spark-1.6.0\core\target\scala-2.10\classes;C:\Users\Administrator\.m2\repository\org\apache\avro\avro-mapred\1.7.7\avro-mapred-1.7.7-hadoop2.jar;
......
C:\Users\Administrator\.m2\repository\org\objenesis\objenesis\1.0\objenesis-1.0.jar;C:\Users\Administrator\.m2\repository\org\spark-project\spark\unused\1.0.0\unused-1.0.0.jar -feature -javabootclasspath ; -unchecked
last tree to typer: Literal(Constant(org.apache.spark.sql.test.ExamplePoint))
symbol: null
symbol definition: null
tpe: Class(classOf)
symbol owners:
context owners: anonymous class withErrorHandling$1 -> package util

== Enclosing template or block ==
Template( // val : , tree.tpe=org.apache.spark.sql.util.withErrorHandling$1
"scala.runtime.AbstractFunction1", "scala.Serializable" // parents
ValDef( private
"_"
)
...... ExecutionListenerManager$$anonfun$org$apache$spark$sql$util$ExecutionListenerManager$$withErrorHandling$1.super."<init>" // def <init>(): scala.runtime.AbstractFunction1 in class AbstractFunction1, tree.tpe=()scala.runtime.AbstractFunction1
Nil
)
()
)
)
)
== Expanded type of tree ==
ConstantType(
value = Constant(org.apache.spark.sql.test.ExamplePoint)
)
uncaught exception during compilation: java.lang.AssertionError