Scala and Spark are installed fine; both can be launched from cmd, and their versions are compatible with my Java version.
I am compiling with IntelliJ IDEA.
CLASSPATH: .;%JAVA_HOME%\lib\dt.jar;%JAVA_HOME%\lib\tools.jar;%SPARK_HOME%\jars\spark-core_2.12-3.3.2.jar;%SPARK_HOME%\jars\scala-library-2.12.15.jar (ChatGPT told me to add the last two jar files...)
Error:
java: cannot find symbol: class JavaRDD  location: package org.apache.spark.api.java
java: package scala does not exist
java: package org.apache.spark.api.java.function does not exist
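For context, here is a minimal sketch of the kind of Java code that produces these errors (the class name and the sample data are placeholders, not my actual project code); it should only compile if spark-core_2.12-3.3.2.jar and scala-library-2.12.15.jar are really on the module's compile classpath, which is what I thought the CLASSPATH entries above would cover:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;            // "cannot find symbol: class JavaRDD"
import org.apache.spark.api.java.JavaSparkContext;

import java.util.Arrays;

public class WordCountSketch {                        // placeholder class name
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("sketch").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // The lambda passed to map() compiles against
        // org.apache.spark.api.java.function.Function, the package from the third error.
        JavaRDD<String> lines = sc.parallelize(Arrays.asList("a b", "b c"));
        JavaRDD<Integer> lengths = lines.map(s -> s.length());

        System.out.println(lengths.collect());
        sc.stop();
    }
}
```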
Thanks in advance!