Hive table not visible in Spark 2.1.1 (Java)

I created a database and a table and loaded data into the table from the Hive CLI. However, when I try to access the table from Spark, I get the following exception:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Table or view not found: employee;

As mentioned in some other SO answers, I have already placed my hive-site.xml in the spark/conf directory.

Here is what I added in hive-site.xml:

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Local scratch space for Hive jobs</description>
</property>

<property>
  <name>hive.querylog.location</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Location of Hive run time structured log file</description>
</property>

<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://localhost:1527/metastore_db;create=true </value>
  <description>JDBC connect string for a JDBC metastore </description>
</property>
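
To narrow down whether Spark is reading this hive-site.xml at all (rather than starting with its own empty local metastore), one check I understand should work is to list the databases and tables Spark sees and print the warehouse directory it resolved at startup. This is only a diagnostic sketch; the class name MetastoreCheck is just for illustration:

import org.apache.spark.sql.SparkSession;

public class MetastoreCheck {
    public static void main(String[] args) {
        // Same builder settings as in the main example below.
        SparkSession session = SparkSession.builder()
          .master("local")
          .appName("metastore check")
          .enableHiveSupport()
          .getOrCreate();

        // If hive-site.xml is being picked up, the database and table
        // created from the Hive CLI should show up in these listings.
        session.sql("SHOW DATABASES").show();
        session.sql("SHOW TABLES").show();

        // Warehouse directory Spark resolved; compare it with the one Hive uses.
        System.out.println("spark.sql.warehouse.dir = "
          + session.conf().get("spark.sql.warehouse.dir"));
    }
}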

Here is my Java code:

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class SparkHiveExample {
    public static void main(String[] args) {

        SparkSession session = SparkSession.builder()
          .master("local")
          .appName("spark session example")
          .enableHiveSupport()
          .getOrCreate();
        System.out.println("show tables ---> " + session.sql("select * from employee").count());
    }
}
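
I also understand that the warehouse directory can be set explicitly on the builder and the table name fully qualified, in case the XML settings are not being applied. Below is only a sketch of what that would look like: the path /user/hive/warehouse and the default database qualifier are assumptions, not values I have verified for my setup:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkHiveExplicitWarehouse {
    public static void main(String[] args) {
        // Assumed warehouse location -- replace with whatever
        // hive.metastore.warehouse.dir points to in the Hive installation.
        String warehouseDir = "/user/hive/warehouse";

        SparkSession session = SparkSession.builder()
          .master("local")
          .appName("spark session example")
          .config("spark.sql.warehouse.dir", warehouseDir)
          .enableHiveSupport()
          .getOrCreate();

        // Qualifying the table with its database rules out a
        // default-database mismatch ("default" is assumed here).
        Dataset<Row> employees = session.sql("select * from default.employee");
        System.out.println("row count ---> " + employees.count());
    }
}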

I am running it with the following command:

spark-submit --class SparkHiveExample --master local[4] /Users/akanchan/Documents/workspace/testone/target/simple-project-1.0.jar

Please help me resolve this issue.


0 Answers