<p>Please follow these steps:</p>
<h3>Relevant AWS JARs</h3>
<p>Make sure the following JARs are available (a minimal download sketch follows the list):</p>
<ul>
<li><a href="http://central.maven.org/maven2/com/amazonaws/aws-java-sdk/1.7.4/aws-java-sdk-1.7.4.jar" rel="nofollow noreferrer">aws-java-sdk</a></li>
<li><a href="http://central.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.7.3/hadoop-aws-2.7.3.jar" rel="nofollow noreferrer">hadoop-aws</a></li>
</ul>
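<p>For example, here is a minimal sketch of fetching both JARs straight from the Maven URLs above; placing them in <code>$SPARK_HOME/jars</code> is only an assumption about your image layout, any directory you later pass to <code>--jars</code> works:</p>
<pre><code># fetch both JARs into Spark's jars directory (target directory is an assumption)
wget -P "$SPARK_HOME/jars" \
  http://central.maven.org/maven2/com/amazonaws/aws-java-sdk/1.7.4/aws-java-sdk-1.7.4.jar \
  http://central.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.7.3/hadoop-aws-2.7.3.jar
</code></pre>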
<p>You can then launch your <code>pyspark</code> application as follows:</p>
<p><code>pyspark --jars "aws-java-sdk-1.7.4.jar,hadoop-aws-2.7.3.jar"</code>
(or from the docker <code>CMD</code>)</p>
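<p>Alternatively, a sketch that lets Spark resolve the same artifacts from Maven with <code>--packages</code> instead of shipping the JAR files yourself; the coordinates below simply mirror the versions listed above:</p>
<pre><code># let Spark download the dependencies at startup (requires network access to Maven Central)
pyspark --packages "com.amazonaws:aws-java-sdk:1.7.4,org.apache.hadoop:hadoop-aws:2.7.3"
</code></pre>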
<h3>Inside the notebook, set the Hadoop configuration</h3>
<pre><code>sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", "access_key")
sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", "secret_key")
sc._jsc.hadoopConfiguration().set("fs.s3a.proxy.host", "minio")
sc._jsc.hadoopConfiguration().set("fs.s3a.endpoint", "minio")
sc._jsc.hadoopConfiguration().set("fs.s3a.proxy.port", "9000")
sc._jsc.hadoopConfiguration().set("fs.s3a.path.style.access", "true")
sc._jsc.hadoopConfiguration().set("fs.s3a.connection.ssl.enabled", "false")
sc._jsc.hadoopConfiguration().set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
</code></pre>
<p>See the <a href="https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html#General_S3A_Client_configuration" rel="nofollow noreferrer">S3A client configuration</a> for the full list of parameters.</p>
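<p>If you prefer not to touch <code>sc._jsc</code> directly, an equivalent sketch is to pass the same options with the <code>spark.hadoop.</code> prefix when building the <code>SparkSession</code>; the endpoint URL, app name, and credentials below are placeholders for this kind of setup:</p>
<pre><code>from pyspark.sql import SparkSession

# Same S3A settings as above, applied via the "spark.hadoop." config prefix
spark = (
    SparkSession.builder
    .appName("minio-example")  # placeholder app name
    .config("spark.hadoop.fs.s3a.endpoint", "http://minio:9000")  # assumed minio address
    .config("spark.hadoop.fs.s3a.access.key", "access_key")
    .config("spark.hadoop.fs.s3a.secret.key", "secret_key")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .config("spark.hadoop.fs.s3a.connection.ssl.enabled", "false")
    .config("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
    .getOrCreate()
)
sc = spark.sparkContext
</code></pre>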
<p>You should now be able to query data from <code>minio</code>, for example:</p>
<p><code>sc.textFile("s3a://<file path>")</code></p>
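<p>Or, as a minimal sketch with the DataFrame API (the bucket and object names here are hypothetical, just to illustrate the <code>s3a://</code> URI scheme against minio):</p>
<pre><code># Hypothetical bucket/key, for illustration only
df = spark.read.csv("s3a://my-bucket/data/example.csv", header=True, inferSchema=True)
df.show(5)
</code></pre>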