# from pyspark import SparkConf, SparkContext
import pyspark

# A SparkContext object is the entry point to the execution environment.
# setMaster sets the Spark run mode; "local" means run on this machine.
# setAppName names the Spark application.
conf1 = pyspark.SparkConf().setMaster("local[*]").setAppName("local_spark")
# Create a SparkContext from the SparkConf object.
sc = pyspark.SparkContext(conf=conf1)
# Print the running PySpark version.
print(sc.version)
# Stop the SparkContext (shut down the PySpark program).
sc.stop()
Below is the error:
E:\Python\python3.10.4\python.exe E:/Python/pycode/5pyspark/ONE.py
The system cannot find the specified path.
Traceback (most recent call last):
File "E:\Python\pycode\5pyspark\ONE.py", line 11, in <module>
sc = pyspark.SparkContext(conf=conf1)
File "E:\Python\python3.10.4\lib\site-packages\pyspark\context.py", line 195, in __init__
SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
File "E:\Python\python3.10.4\lib\site-packages\pyspark\context.py", line 417, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway(conf)
File "E:\Python\python3.10.4\lib\site-packages\pyspark\java_gateway.py", line 103, in launch_gateway
time.sleep(0.1)
KeyboardInterrupt
Process finished with exit code -1073741510 (0xC000013A: interrupted by Ctrl+C)
Answers:
Does it run OK under cmd?
Are the PYTHONPATH and SPARK_HOME environment variables set up?
You can refer to this environment setup guide: https://www.codenong.com/52726043/
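For reference, here is a minimal sketch of setting those variables in code before pyspark is imported. The SPARK_HOME and HADOOP_HOME paths below are hypothetical placeholders; substitute your actual install locations.

import os

# Hypothetical install location: replace with your own Spark directory.
os.environ["SPARK_HOME"] = r"E:\spark\spark-3.3.0-bin-hadoop3"
# Point pyspark at the interpreter shown in the traceback above.
os.environ["PYSPARK_PYTHON"] = r"E:\Python\python3.10.4\python.exe"
# On Windows, Spark usually also needs HADOOP_HOME with winutils.exe in its bin folder (hypothetical path).
os.environ["HADOOP_HOME"] = r"E:\hadoop"

import pyspark  # import only after the variables are set

Setting the variables in code only affects the current process; setting them in the Windows system environment, as the linked guide does, works for every run.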
This answer quotes ChatGPT:
According to the error message, the running environment appears to be faulty, possibly because Spark was interrupted or hit an exception during startup. It is recommended to check the following points:
1. Check that Java is installed and the Java environment variables are configured.
2. Check that Spark is correctly installed and that its version is compatible with the installed pyspark.
3. Check whether Spark fails to start because another program or process is occupying the port.
4. Run the script correctly in the IDE or on the command line, and make sure the code has no syntax or path errors.
You can check each of the above issues, or try running the script in another environment to see if Spark starts properly; a quick check for points 1 and 2 is sketched below.
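Along the lines of points 1 and 2, a small sketch (standard library only) that checks whether Java and Spark look reachable from Python; if java is not on PATH, the gateway launch that hangs in the traceback above cannot succeed:

import os
import shutil
import subprocess

# None here means java.exe is not on PATH, which would explain a failed gateway launch.
print("java on PATH:", shutil.which("java"))
print("JAVA_HOME  :", os.environ.get("JAVA_HOME"))
print("SPARK_HOME :", os.environ.get("SPARK_HOME"))

if shutil.which("java"):
    # "java -version" prints to stderr; this just shows the installed version.
    subprocess.run(["java", "-version"], check=False)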
Regarding "The system cannot find the specified path.": check which path is wrong, as sketched below.
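One way to find the bad path is to print each Spark-related variable and whether the location it points to actually exists. A minimal sketch using only the standard library:

import os

for name in ("SPARK_HOME", "JAVA_HOME", "HADOOP_HOME", "PYTHONPATH"):
    value = os.environ.get(name)
    if value is None:
        print(f"{name} is not set")
        continue
    # PYTHONPATH can hold several entries separated by os.pathsep (";" on Windows).
    for part in value.split(os.pathsep):
        print(f'{name}: "{part}" exists = {os.path.exists(part)}')

Any entry that prints "exists = False" is a path Windows cannot find.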