PySpark from a Flask app

Posted 2024-05-16 04:02:09


I am trying the approach for accessing PySpark from a Flask app described in this post: Access to Spark from Flask app
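For context, the pattern from that post is a Flask app that creates a SparkContext when the script starts and then runs Spark jobs from its request handlers, with the whole script launched via spark-submit. A minimal sketch of that pattern, assuming a trivial route and job of my own (this is an illustration, not the actual code from the post or my yourfilename.py):

# Hypothetical sketch, not the actual yourfilename.py from the question.
from flask import Flask
from pyspark import SparkConf, SparkContext

app = Flask(__name__)

# One SparkContext for the whole driver process, created at startup.
conf = SparkConf().setAppName("flask_spark_demo")
sc = SparkContext(conf=conf)

@app.route("/sum/<int:n>")
def spark_sum(n):
    # Trivial Spark job triggered by an HTTP request.
    return str(sc.parallelize(range(n)).sum())

if __name__ == "__main__":
    # app.run() blocks, so the driver stays alive and keeps serving requests.
    app.run(host="0.0.0.0", port=5000)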

But when I run this command:

 ./bin/spark-submit yourfilename.py

I get an error.


Is there any solution?
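For reference, the log below shows Spark 2.2.0 installed under D:\opt\spark on Windows; there the submit would normally go through the .cmd launcher shipped in the same bin folder (the script path here is just a placeholder, not my real path):

 cd /d D:\opt\spark\spark-2.2.0-bin-hadoop2.7
 bin\spark-submit.cmd path\to\yourfilename.py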

I also tried putting a copy of the .py file (app.py) into the bin folder and running it with spark-submit. The result is as follows:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/03/21 01:52:00 INFO SparkContext: Running Spark version 2.2.0
18/03/21 01:52:01 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/03/21 01:52:01 INFO SparkContext: Submitted application: app.py
18/03/21 01:52:01 INFO SecurityManager: Changing view acls to: USER
18/03/21 01:52:01 INFO SecurityManager: Changing modify acls to: USER
18/03/21 01:52:01 INFO SecurityManager: Changing view acls groups to:
18/03/21 01:52:01 INFO SecurityManager: Changing modify acls groups to:
18/03/21 01:52:01 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(USER); groups with view permissions: Set(); users  with modify permissions: Set(USER); groups with modify permissions: Set()
18/03/21 01:52:02 INFO Utils: Successfully started service 'sparkDriver' on port 62901.
18/03/21 01:52:02 INFO SparkEnv: Registering MapOutputTracker
18/03/21 01:52:02 INFO SparkEnv: Registering BlockManagerMaster
18/03/21 01:52:02 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/03/21 01:52:02 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/03/21 01:52:02 INFO DiskBlockManager: Created local directory at C:\Users\USER\AppData\Local\Temp\blockmgr-5504ca97-3578-4f22-9c0e-b5230bc02369
18/03/21 01:52:02 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
18/03/21 01:52:02 INFO SparkEnv: Registering OutputCommitCoordinator
18/03/21 01:52:03 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/03/21 01:52:03 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.56.1:4040
18/03/21 01:52:03 INFO SparkContext: Added file file:/D:/opt/spark/spark-2.2.0-bin-hadoop2.7/bin/app.py at file:/D:/opt/spark/spark-2.2.0-bin-hadoop2.7/bin/app.py with timestamp 1521568323605
18/03/21 01:52:03 INFO Utils: Copying D:\opt\spark\spark-2.2.0-bin-hadoop2.7\bin\app.py to C:\Users\USER\AppData\Local\Temp\spark-de856657-5946-4d4f-a7ea-9c2740c88add\userFiles-8c88d3b0-5a05-4c54-861d-ce4397ed0bd5\app.py
18/03/21 01:52:04 INFO Executor: Starting executor ID driver on host localhost
18/03/21 01:52:04 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 62910.
18/03/21 01:52:04 INFO NettyBlockTransferService: Server created on 192.168.56.1:62910
18/03/21 01:52:04 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/03/21 01:52:04 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.56.1, 62910, None)
18/03/21 01:52:04 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.56.1:62910 with 366.3 MB RAM, BlockManagerId(driver, 192.168.56.1, 62910, None)
18/03/21 01:52:04 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.56.1, 62910, None)
18/03/21 01:52:04 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.56.1, 62910, None)
18/03/21 01:52:06 INFO SparkContext: Invoking stop() from shutdown hook
18/03/21 01:52:06 INFO SparkUI: Stopped Spark web UI at http://192.168.56.1:4040
18/03/21 01:52:06 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/03/21 01:52:06 INFO MemoryStore: MemoryStore cleared
18/03/21 01:52:06 INFO BlockManager: BlockManager stopped
18/03/21 01:52:06 INFO BlockManagerMaster: BlockManagerMaster stopped
18/03/21 01:52:06 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/03/21 01:52:06 INFO SparkContext: Successfully stopped SparkContext
18/03/21 01:52:06 INFO ShutdownHookManager: Shutdown hook called
18/03/21 01:52:06 INFO ShutdownHookManager: Deleting directory C:\Users\USER\AppData\Local\Temp\spark-de856657-5946-4d4f-a7ea-9c2740c88add\pyspark-5cc3aa4c-3890-4600-b4c0-090e179c18eb
18/03/21 01:52:06 INFO ShutdownHookManager: Deleting directory C:\Users\USER\AppData\Local\Temp\spark-de856657-5946-4d4f-a7ea-9c2740c88add

and again I could not run the Flask app.


Tags: to, py, info, app, bin, on, with, spark