Great explanation, well covered "Add on" topic as well. Thanks a lot for including viewers technical feed-back as well in your videos.
Thanks Durga for the detailed videos. I also found another option, and I am sure it may be covered in other videos of yours that I have not seen yet. Just like the "--hive-database" property, we can specify "--hive-table" in the Sqoop import. This will create the table in the desired database.
Yes, but you cannot use --hive-table with import-all-tables.
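To illustrate the difference, here is a sketch of the two forms. The hostnames, usernames, and passwords are assumptions based on the Cloudera quickstart VM used in this series; substitute your own:

```shell
# Single-table import: --hive-table can redirect/rename the target table,
# and accepts a database-qualified name.
sqoop import \
  --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
  --username retail_dba --password cloudera \
  --table departments \
  --hive-import \
  --hive-table sqoop_import.departments

# All tables at once: only --hive-database applies here, since one
# --hive-table name cannot cover multiple source tables.
sqoop import-all-tables \
  --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
  --username retail_dba --password cloudera \
  --hive-import \
  --hive-database sqoop_import
```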
Can the --create-hive-table parameter be omitted because the database sqoop_import already exists in Hive? I ran it without the parameter and I am able to query the data.
While running the import-all-tables command for the Hive import, it always gets stuck here and the Hive tables do not get loaded; nothing appears after this:
16/04/10 11:35:28 INFO mapreduce.ImportJobBase: Retrieved 58 records.
16/04/10 11:35:29 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/04/10 11:35:29 INFO hive.HiveImport: Loading uploaded data into Hive
Logging initialized using configuration in jar:file:/usr/jars/hive-common-1.1.0-cdh5.4.2.jar!/hive-log4j.properties
+Deepak Singhai Where are you running these? On a VM on your PC?
I am running it on a VM.
+Deepak Singhai Can you send the details of the VM? How much memory you have given? Is it Cloudera VM or Hortonworks?
I am running Cloudera VM 5.4.2 from Oracle virtualbox with memory 5054MB
+itversity
Hi Durga, I am also facing the same issue. I am using Cloudera VM 5.7, and my configuration is as below:
My settings are:
Memory: 12.1 GB
Processors: 6
Hard disk: 64 GB
Could you please suggest? Thanks
When I try to import all tables from the MySQL database to Hive, only two tables are imported and then some errors and exceptions come up.
It worked perfectly for me, but only 4 out of the 6 tables got imported. I did not get any errors either. Any idea?
hive> dfs -ls /apps/hive/warehouse/sqoop_import.db;
Found 4 items
drwxrwxrwx - root hdfs 0 2018-01-27 12:42 /apps/hive/warehouse/sqoop_import.db/categories
drwxrwxrwx - root hdfs 0 2018-01-27 12:43 /apps/hive/warehouse/sqoop_import.db/customers
drwxrwxrwx - root hdfs 0 2018-01-27 12:44 /apps/hive/warehouse/sqoop_import.db/departments
drwxrwxrwx - root hdfs 0 2018-01-27 12:45 /apps/hive/warehouse/sqoop_import.db/order_items
hive> [root@sandbox-hdp ~]#
But mysql has 6 tables:
mysql> show tables;
+---------------------+
| Tables_in_retail_db |
+---------------------+
| categories |
| customers |
| departments |
| order_items |
| orders |
| products |
+---------------------+
6 rows in set (0.00 sec)
Please suggest. Thanks !
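One way to narrow this down is to re-import just the missing tables one at a time, so any per-table error surfaces on its own. This is a sketch; the hostname and credentials are assumptions (taken from the sandbox setup in this thread), not verified values:

```shell
# Re-import only the two missing tables so any failure is visible per table
for t in orders products; do
  sqoop import \
    --connect "jdbc:mysql://sandbox-hdp:3306/retail_db" \
    --username retail_dba -P \
    --table "$t" \
    --hive-import \
    --hive-database sqoop_import
done
```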
I am getting an issue: "libjars/ant-eclipse-1.0-jvm1.2.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation." I guess the datanode is stopped. Could you please suggest how to resolve this?
+Tarun Pande Are you using Cloudera VM?
Cloudera 5.4.2 is the version.
Are you able to launch cloudera manager? If yes, you can restart all the services using cloudera manager.
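If Cloudera Manager is not reachable, the HDFS daemons can also be checked and restarted from the shell. The service names below are for CDH 5.x package installs and may differ on other images; treat this as a sketch:

```shell
# Check whether a DataNode process is actually running
jps | grep DataNode

# Restart the HDFS daemons (CDH 5.x quickstart service names)
sudo service hadoop-hdfs-datanode restart
sudo service hadoop-hdfs-namenode restart

# Confirm HDFS now reports at least one live datanode
hdfs dfsadmin -report
```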
It's really strange that the --hive-database parameter isn't documented in the online documentation.
However, when you run sqoop import-all-tables --help in a terminal, you should be able to see the flag.
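A quick way to confirm this on any machine with Sqoop installed:

```shell
# The flag appears in the tool's own help output even where the web docs omit it
sqoop import-all-tables --help 2>/dev/null | grep -e '--hive-database'
```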
After running this command I am getting only 6 tables in Hive. So please suggest how I can see all the tables, or get into the Hive database.
In my case --hive-database sqoop_import is not working; I use the BigInsights VM.
When I am running a Sqoop import for Hive, the tables are getting created under /user/cloudera. Do I need to specify a location for the Hive import to see the tables under /user/hive/warehouse?
No, you have to import with --hive-import
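A minimal sketch of a Hive import that lands under the warehouse directory rather than the user's home directory. Connection values are assumptions for the Cloudera quickstart VM; substitute your own:

```shell
# Without --hive-import, Sqoop leaves the files under the user's HDFS home
# (e.g. /user/cloudera/<table>). With it, the staged data is moved into the
# Hive warehouse as the final step of the job.
sqoop import \
  --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
  --username retail_dba -P \
  --table departments \
  --hive-import \
  --hive-database sqoop_import

# Verify the final location in the Hive warehouse
hdfs dfs -ls /user/hive/warehouse/sqoop_import.db/departments
```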
When I specify the --hive-database sqoop_import.db parameter, I get the error "Database does not exist". The db exists in Hive though. Please help.
Please ignore this; when I did not specify ".db", it worked.
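For reference, the flag takes the bare Hive database name; the ".db" suffix belongs only to the warehouse directory on HDFS:

```shell
# Correct: bare Hive database name
--hive-database sqoop_import

# Wrong: ".db" is part of the HDFS directory name, not the database name
#   /apps/hive/warehouse/sqoop_import.db   <- directory on HDFS
--hive-database sqoop_import.db            # -> "Database does not exist"
```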
This import doesn't go through; it has been taking hours, stuck on initializing the import, even though I have specified the sqoop_import database.
Please raise the technical issues on discuss.itversity.com
Where can I get that document
You can refer to the link below: github.com/dgadiraju/code/blob/master/hadoop/edw/cloudera/sqoop/sqoop_demo.txt
For any technical discussions or doubts, please use our forum - discuss.itversity.com
For practicing on state of the art big data cluster, please sign up on - labs.itversity.com
Lab is under free preview until 12/31/2016, and after that subscription
charges are $14.99 per 31 days, $34.99 per 93 days, and $54.99 per 185 days.
where's hortonworks?
Whenever I run a Sqoop command with --password, I get the error below. Without --password it works fine. I created the user exactly the same way you showed in previous videos. I am able to run all the commands without the password, but I am still curious to know what I did wrong.
[root@sandbox ~]# sqoop list-databases --connect "jdbc:mysql://sandbox.hortonworks.com:3306" --username retail_dba --password hadoop
16/05/12 07:28:37 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/05/12 07:28:38 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/05/12 07:28:39 ERROR manager.CatalogQueryManager: Failed to list databases
java.sql.SQLException: Access denied for user 'retail_dba'@'sandbox.hortonworks.com' (using password: YES)
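This "Access denied" usually means the MySQL grant does not cover the host Sqoop connects from, or the stored password differs from the one passed on the command line. A sketch of how to check and re-grant from the MySQL root account; the user, host, and password values are taken from the log above, and the GRANT ... IDENTIFIED BY form assumes MySQL 5.x as shipped on these VMs:

```shell
mysql -u root -p <<'SQL'
-- See which user@host entries exist for retail_dba
SELECT user, host FROM mysql.user WHERE user = 'retail_dba';
-- Re-grant with the expected password for any host
GRANT ALL ON retail_db.* TO 'retail_dba'@'%' IDENTIFIED BY 'hadoop';
FLUSH PRIVILEGES;
SQL
```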
Sir, the --hive-database argument is working in my BigInsights VM.