Whenever I try to collect my RDD I start getting the following error. This began after I installed Java 10.1, so of course I removed it and reinstalled it, same error. I then installed Java 9.04 instead, same error. I then removed and reinstalled Python 2.7.14, Apache Spark 2.3.0 and Hadoop 2.7, same error. Does anyone have any other reason why I keep getting this error?

>>> from operator import add
>>> from pyspark import SparkConf, SparkContext
>>> import string
>>> import sys
>>> import re
>>>
>>> sc = SparkContext(appName="NEW")
2018-04-21 22:28:45 WARN Utils:66 - Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
>>> rdd = sc.parallelize(xrange(1,10))
>>> new = rdd.collect()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\pyspark\rdd.py", line 824, in collect
port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\lib\py4j-0.10.6-src.zip\py4j\java_gateway.py", line 1160, in __call__ File "C:\spark\s
park-2.3.0-bin-hadoop2.7\python\pyspark\sql\utils.py", line 63, in deco return f(*a, **kw)
File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\lib\py4j-0.10.6-src.zip\py4j\protocol.py", line 320, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.IllegalArgumentException
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
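For reference, here is a minimal diagnostic sketch (my own addition, not part of the failing session) that I run in the same Python 2.7 shell to confirm which Java installation the PySpark driver actually picks up. It assumes a `java` binary is on the PATH:

import os
import subprocess

# JAVA_HOME as this Python process sees it (prints None if it is not set)
print("JAVA_HOME: " + str(os.environ.get("JAVA_HOME")))

# `java -version` writes its report to stderr, so fold stderr into the captured output
print(subprocess.check_output(["java", "-version"], stderr=subprocess.STDOUT))

This is only to rule out that the shell is still launching one of the Java installs I thought I had removed.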