
./spark-shell does not start correctly (spark-1.6.1-bin-hadoop2.6)

Stack Overflow user
Asked on 2016-03-28 23:55:04
2 answers · 7K views · 0 followers · 3 votes

I installed this Spark release: spark-1.6.1-bin-hadoop2.6.tgz.

Now when I start it with the ./spark-shell command I get this problem (it shows a lot of error lines, so I only include the ones that seem most important):

     Cleanup action completed
        16/03/27 00:19:35 ERROR Schema: Failed initialising database.
        Failed to create database 'metastore_db', see the next exception for details.
        org.datanucleus.exceptions.NucleusDataStoreException: Failed to create database 'metastore_db', see the next exception for details.
            at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:516)

        Caused by: java.sql.SQLException: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
            org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
            ... 128 more
        Caused by: ERROR XBM0H: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.


        Nested Throwables StackTrace:
        java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details.
  org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
            ... 128 more
        Caused by: ERROR XBM0H: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
            at org.apache.derby.iapi.error.StandardException.newException


        Caused by: java.sql.SQLException: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
            at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
            at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
            at 
            ... 128 more

        <console>:16: error: not found: value sqlContext
                 import sqlContext.implicits._
                        ^
        <console>:16: error: not found: value sqlContext
                 import sqlContext.sql
                        ^

        scala> 

I tried some configurations to fix this, searching other questions about the "value sqlContext not found" problem, for example:

/etc/hosts file:

    127.0.0.1   hadoophost localhost localhost.localdomain localhost4 localhost4.localdomain4
    ::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
    10.2.0.15   hadoophost

echo $HOSTNAME returns:

hadoophost

The .bashrc file contains:

export SPARK_LOCAL_IP=127.0.0.1

But it doesn't work. Can you give some help to understand why Spark is not starting properly?

hive-default.xml.template:

<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
--><configuration>
  <!-- WARNING!!! This file is auto generated for documentation purposes ONLY! -->
  <!-- WARNING!!! Any changes you make to this file will be ignored by Hive.   -->
  <!-- WARNING!!! You must make your changes in hive-site.xml instead.         -->

From the home folder I get the same problem:

[hadoopadmin@hadoop home]$ pwd
/home
[hadoopadmin@hadoop home]$ 

Folder permissions:

[hadoopdadmin@hadoop spark-1.6.1-bin-hadoop2.6]$ ls -la
total 1416
drwxr-xr-x. 12 hadoop hadoop    4096 .
drwxr-xr-x. 16 root   root      4096  ..
drwxr-xr-x.  2 hadoop hadoop    4096  bin
-rw-r--r--.  1 hadoop hadoop 1343562  CHANGES.txt
drwxr-xr-x.  2 hadoop hadoop    4096  conf
drwxr-xr-x.  3 hadoop hadoop    4096  data
drwxr-xr-x.  3 hadoop hadoop    4096  ec2
drwxr-xr-x.  3 hadoop hadoop    4096  examples
drwxr-xr-x.  2 hadoop hadoop    4096  lib
-rw-r--r--.  1 hadoop hadoop   17352  LICENSE
drwxr-xr-x.  2 hadoop hadoop    4096  licenses
-rw-r--r--.  1 hadoop hadoop   23529  NOTICE
drwxr-xr-x.  6 hadoop hadoop    4096  python
drwxr-xr-x.  3 hadoop hadoop    4096  R
-rw-r--r--.  1 hadoop hadoop    3359  README.md
-rw-r--r--.  1 hadoop hadoop     120  RELEASE
drwxr-xr-x.  2 hadoop hadoop    4096  sbin

2 Answers

Stack Overflow user

Accepted answer

Posted on 2016-03-29 12:21:12

You are using Spark built with Hive support.

There are two possible solutions, depending on what you want to do later in your spark-shell or in your Spark jobs:

  1. You want to access Hive tables from your hadoop+hive installation: put hive-site.xml in the conf subdirectory of your Spark installation. Take hive-site.xml from your existing Hive installation; for example, in my Cloudera setup hive-site.xml is at /usr/lib/hive/conf. After this step, launching the shell should connect to the existing Hive metastore and will not try to create a temporary metastore_db database in the current working directory.
  2. You do not want to access Hive tables from your hadoop+hive installation: if you don't care about connecting to Hive tables, you can follow Alberto's solution and fix the permission problem in the directory from which you launch spark-shell. Make sure you are allowed to create directories/files in that directory.
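The first solution above can be sketched as a pair of shell commands. The paths below are examples (my assumption, not from the question): HIVE_CONF and SPARK_HOME should point at your own installations; the mktemp scaffolding only stands in for the real directories so the snippet runs anywhere.

```shell
# Stand-ins for real paths -- replace with e.g. /usr/lib/hive/conf and
# /usr/local/spark-1.6.1-bin-hadoop2.6 on your machine.
HIVE_CONF=$(mktemp -d)/hive/conf
SPARK_HOME=$(mktemp -d)/spark-1.6.1-bin-hadoop2.6
mkdir -p "$HIVE_CONF" "$SPARK_HOME/conf"
echo '<configuration/>' > "$HIVE_CONF/hive-site.xml"   # stand-in for the real file

# The actual fix: copy hive-site.xml from the Hive installation into
# Spark's conf directory so spark-shell uses the existing metastore.
cp "$HIVE_CONF/hive-site.xml" "$SPARK_HOME/conf/hive-site.xml"
ls "$SPARK_HOME/conf"
```

With the file in place, `$SPARK_HOME/bin/spark-shell` should no longer attempt to create a local Derby metastore_db.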

Hope this helps.

4 votes

Stack Overflow user

Posted on 2016-03-29 00:04:12

Apparently you don't have permission to write in that directory. I suggest you run ./spark-shell from your HOME directory (you may want to add that command to your PATH), or from any other directory accessible and writable by your user.
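The point here is that metastore_db is created in spark-shell's current working directory, so the shell must be started from somewhere writable. A minimal probe (a throwaway temp directory stands in for $HOME so the sketch runs anywhere):

```shell
# metastore_db is created in spark-shell's *current working directory*,
# so start the shell from a directory your user can write to.
WORKDIR=$(mktemp -d)    # stand-in for $HOME or any writable directory
cd "$WORKDIR"

# Probe: can we create (and remove) a directory here, as Derby would?
mkdir metastore_db_probe && rmdir metastore_db_probe && echo "writable: $WORKDIR"
```

If the probe fails inside /usr/local/spark-1.6.1-bin-hadoop2.6/bin (owned by `hadoop` while you run as `hadoopadmin`), that matches the XBM0H error above.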

This may also be relevant to you: Notebooks together with Spark.

10 votes
Original content from Stack Overflow; translation supported by Tencent Cloud's IT-domain translation engine.
Original link:

https://stackoverflow.com/questions/36273166
