Hadoop Certification - 02 Sqoop Import - Add on

  • Published 21 Jan 2025

Comments • 35

  • @ramat9238
    @ramat9238 7 years ago

    Great explanation; the "Add on" topic was well covered too. Thanks a lot for including viewers' technical feedback in your videos as well.

  • @aniruddhaparwekar737
    @aniruddhaparwekar737 8 years ago

    Thanks Durga for the detailed videos. I also found another option, and I am sure it may be covered in other videos of yours that I have not seen yet: just like the "--hive-database" property, we can specify "--hive-table" in the Sqoop import, and this will create the table in the desired database.

    • @itversity
      @itversity  8 years ago +1

      Yes, but you cannot use --hive-table with import-all-tables.
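      For example, something like this should work (host and credentials below are placeholders; adjust them to your VM):

      # Single-table import: --hive-table accepts a database-qualified name
      sqoop import \
        --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
        --username retail_dba -P \
        --table departments \
        --hive-import \
        --hive-table sqoop_import.departments

      # All tables: use --hive-database instead, since --hive-table cannot be
      # combined with import-all-tables
      sqoop import-all-tables \
        --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
        --username retail_dba -P \
        --hive-import \
        --hive-database sqoop_import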

  • @81sinnombre
    @81sinnombre 8 years ago

    Could the --create-hive-table parameter be dropped because the database sqoop_import already exists in Hive? I ran it without the parameter and I am able to query the data.

  • @singhaideepak
    @singhaideepak 8 years ago

    While running the import-all-tables command for Hive import it always gets stuck here and the Hive tables are not loaded; nothing appears after this:
    16/04/10 11:35:28 INFO mapreduce.ImportJobBase: Retrieved 58 records.
    16/04/10 11:35:29 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
    16/04/10 11:35:29 INFO hive.HiveImport: Loading uploaded data into Hive
    Logging initialized using configuration in jar:file:/usr/jars/hive-common-1.1.0-cdh5.4.2.jar!/hive-log4j.properties

    • @itversity
      @itversity  8 years ago

      +Deepak Singhai Where are you running these? On a VM on your PC?

    • @singhaideepak
      @singhaideepak 8 years ago

      I am running it on a VM.

    • @itversity
      @itversity  8 years ago

      +Deepak Singhai Can you send the details of the VM? How much memory have you given it? Is it a Cloudera VM or Hortonworks?

    • @singhaideepak
      @singhaideepak 8 years ago

      I am running the Cloudera VM 5.4.2 in Oracle VirtualBox with 5054 MB of memory.

    • @praveenv1498
      @praveenv1498 8 years ago

      +itversity
      Hi Durga, I am also facing the same issue. I am using the Cloudera VM 5.7, and my settings are as below:
      Memory 12.1 GB
      Processors 6
      Hard disk 64 GB
      Could you please suggest? Thanks

  • @dherendrachaudhary6728
    @dherendrachaudhary6728 7 years ago

    When I try to import all tables from the MySQL database to Hive, only two tables are imported and then some errors and exceptions come up.

  • @mathewvarghese3589
    @mathewvarghese3589 7 years ago

    It worked perfectly for me, but only 4 out of the 6 tables got imported. I did not get any errors either. Any idea?
    hive> dfs -ls /apps/hive/warehouse/sqoop_import.db;
    Found 4 items
    drwxrwxrwx - root hdfs 0 2018-01-27 12:42 /apps/hive/warehouse/sqoop_import.db/categories
    drwxrwxrwx - root hdfs 0 2018-01-27 12:43 /apps/hive/warehouse/sqoop_import.db/customers
    drwxrwxrwx - root hdfs 0 2018-01-27 12:44 /apps/hive/warehouse/sqoop_import.db/departments
    drwxrwxrwx - root hdfs 0 2018-01-27 12:45 /apps/hive/warehouse/sqoop_import.db/order_items
    hive> [root@sandbox-hdp ~]#
    But mysql has 6 tables:
    mysql> show tables;
    +---------------------+
    | Tables_in_retail_db |
    +---------------------+
    | categories |
    | customers |
    | departments |
    | order_items |
    | orders |
    | products |
    +---------------------+
    6 rows in set (0.00 sec)
    Please suggest. Thanks!

  • @tarunpande9707
    @tarunpande9707 9 years ago

    Getting an issue: "libjars/ant-eclipse-1.0-jvm1.2.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation." I guess the DataNode is stopped. Could you please suggest how to resolve this?

    • @itversity
      @itversity  9 years ago

      +Tarun Pande Are you using the Cloudera VM?

    • @tarunpande9707
      @tarunpande9707 9 years ago

      Cloudera 5.4.2 is the version.

    • @itversity
      @itversity  9 years ago

      Are you able to launch Cloudera Manager? If yes, you can restart all the services using Cloudera Manager.
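      If Cloudera Manager is not available, a rough command-line alternative (assuming a package-based CDH install such as the QuickStart VM; service names can differ if the services are CM-managed):

      # Check how many DataNodes the NameNode can see
      sudo -u hdfs hdfs dfsadmin -report

      # Restart the HDFS daemons as plain Linux services
      sudo service hadoop-hdfs-datanode restart
      sudo service hadoop-hdfs-namenode restart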

  • @shivakonar
    @shivakonar 9 years ago

    It's really strange that the --hive-database parameter isn't covered in the online documentation.
    However, when you run sqoop import-all-tables --help in a terminal, you should be able to see the flag.
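    For example (run in a terminal on the VM):

    # The flag shows up in the built-in help even though the online docs skip it
    sqoop import-all-tables --help 2>/dev/null | grep -- "--hive-database"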

  • @atulgupta1348
    @atulgupta1348 6 years ago

    After running this command I am getting only 6 tables in Hive, so please suggest how I can see all the tables or get into the Hive database.

  • @dherendrachaudhary6728
    @dherendrachaudhary6728 7 years ago

    In my case --hive-database sqoop_import is not working; I am using the BigInsights VM.

  • @bashful88
    @bashful88 8 years ago

    When I run a Sqoop import for Hive, the tables are getting created under /user/cloudera. Do I need to mention any location for the Hive import to see the tables under /user/hive/warehouse?

    • @itversity
      @itversity  8 years ago

      No, you have to import with --hive-import.
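      Something like this, for example (host and credentials are placeholders; adjust them to your VM):

      # Without --hive-import the data only lands in HDFS under your home
      # directory (e.g. /user/cloudera/<table>); with it, Sqoop also creates
      # and loads the Hive table under the warehouse directory.
      sqoop import \
        --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
        --username retail_dba -P \
        --table departments \
        --hive-import \
        --hive-database sqoop_import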

  • @ragch2261
    @ragch2261 8 years ago

    When I specify the --hive-database sqoop_import.db parameter, I get the error "Database does not exist". The database exists in Hive, though. Please help.

    • @ragch2261
      @ragch2261 8 years ago +2

      Please ignore this. When I did not specify ".db", it worked.
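      In other words, the flag expects the Hive database name; the ".db" suffix only appears in the warehouse directory on HDFS (a quick check, assuming the default warehouse location):

      # The Hive database is "sqoop_import"; only its HDFS directory carries ".db"
      hive -e "SHOW DATABASES LIKE 'sqoop_import'"
      hdfs dfs -ls /user/hive/warehouse/sqoop_import.db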

  • @pavanreddy4759
    @pavanreddy4759 8 years ago

    This import doesn't go through; it takes hours and hangs on initializing the import, even though I have specified the sqoop_import database.

    • @itversity
      @itversity  8 years ago

      Please raise technical issues on discuss.itversity.com

  • @jithendrapoola2922
    @jithendrapoola2922 6 years ago

    Where can I get that document?

    • @annapurnachinta7806
      @annapurnachinta7806 6 years ago

      You can refer to the link below: github.com/dgadiraju/code/blob/master/hadoop/edw/cloudera/sqoop/sqoop_demo.txt

  • @itversity
    @itversity  8 years ago

    For any technical discussions or doubts, please use our forum - discuss.itversity.com
    For practicing on a state-of-the-art big data cluster, please sign up at labs.itversity.com
    The lab is under free preview until 12/31/2016; after that, subscription charges are $14.99 per 31 days, $34.99 per 93 days, and $54.99 per 185 days.

  • @cocowu4887
    @cocowu4887 7 years ago

    Where's Hortonworks?

  • @pawanmalwal2314
    @pawanmalwal2314 8 years ago

    Whenever I run a Sqoop command with --password I get the error below. Without --password it works fine. I created the user exactly the same way you have shown in previous videos. I am able to run all the commands without a password, but I am still curious to know what I did wrong.
    [root@sandbox ~]# sqoop list-databases --connect "jdbc:mysql://sandbox.hortonworks.com:3306" --username retail_dba --password hadoop
    16/05/12 07:28:37 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
    16/05/12 07:28:38 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
    16/05/12 07:28:39 ERROR manager.CatalogQueryManager: Failed to list databases
    java.sql.SQLException: Access denied for user 'retail_dba'@'sandbox.hortonworks.com' (using password: YES)
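
    From the message, the grant or password for retail_dba likely does not match what is being sent; a sketch of re-creating it (assuming root access to MySQL on the sandbox; MySQL 5.x syntax):

    # Re-grant retail_dba with the expected password, allowed from any host
    mysql -u root -p -e "GRANT ALL PRIVILEGES ON retail_db.* TO 'retail_dba'@'%' IDENTIFIED BY 'hadoop'; FLUSH PRIVILEGES;"

    # Prompting with -P also keeps the password off the command line
    sqoop list-databases \
      --connect "jdbc:mysql://sandbox.hortonworks.com:3306" \
      --username retail_dba -P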

  • @dherendrachaudhary6728
    @dherendrachaudhary6728 8 years ago

    Sir, the --hive-database argument is working in my BigInsights VM.