It's because you are using the --warehouse-dir import argument, which creates one more department directory underneath it. Use --target-dir instead to get the data imported with the normal directory structure.
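For anyone who wants to see the difference side by side, here is a rough sketch (the connection string, credentials and paths are placeholders for your own environment):

sqoop import --connect jdbc:mysql://localhost/retail_db --username retail_dba -P --table departments --warehouse-dir /user/cloudera/sqoop_import
# data lands in /user/cloudera/sqoop_import/departments/ (table name appended)

sqoop import --connect jdbc:mysql://localhost/retail_db --username retail_dba -P --table departments --target-dir /user/cloudera/departments
# data lands directly in /user/cloudera/departments/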
Hi Sir, I have executed the Sqoop import command using avrodatafile as the file format. After execution I could see departments_avsc instead of sqoop_import_departments.avsc. Any particular reason?
+Narayana Pratti Are you using 5.5 VM?
+itversity Yes, it is.
It might be something to do with the Sqoop configuration. No need to worry about it.
For any technical discussions or doubts, please use our forum - discuss.itversity.com
For practicing on state of the art big data cluster, please sign up on - labs.itversity.com
Lab is under free preview until 12/31/2016 and after that subscription
charges are $14.99 per 31 days, $34.99 per 93 days and $54.99 per 185 days
Hello Sir,
I have the below 2 questions regarding Sqoop import:
1) When I import data with --as-avrodatafile, I cannot see the .avsc file created under the directory from where I am running the Sqoop command.
2) Instead of the default of 4 mappers, when I import a table having 6 rows, it creates 6 mappers while importing. Why so?
Thanks
Which distribution are you using?
I am working on Ubuntu 12.04 on Oracle VM with Hadoop version 1.2.1, Sqoop 1.4.4 & Hive 0.10.0.
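On question 2: the mapper count is controllable with -m (--num-mappers), and Sqoop works out the actual split boundaries from the --split-by column, so the number of map tasks you see can differ from the default of 4. A minimal sketch to force a single mapper (connection details are placeholders):

sqoop import --connect jdbc:mysql://localhost/retail_db --username retail_dba -P --table departments --target-dir /user/cloudera/departments -m 1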
Hi Durga,
You have explained creating a Hive table with data imported into HDFS as an Avro data file. Is there any way to create a Hive table with data imported into HDFS as a sequence file?
I tried to give hive-import with the file format argument as sequence file, and it failed with "hive import is not supported for sequence file format".
Thanks,
Vasanth Mahendran
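One possible workaround, offered as an assumption rather than something from the video: import with --as-sequencefile into a target directory, then define an external Hive table stored as SEQUENCEFILE over that location (the column names below assume the standard retail_db departments table):

sqoop import --connect jdbc:mysql://localhost/retail_db --username retail_dba -P --table departments --as-sequencefile --target-dir /user/cloudera/departments_seq

CREATE EXTERNAL TABLE departments_seq (department_id INT, department_name STRING)
STORED AS SEQUENCEFILE
LOCATION '/user/cloudera/departments_seq';

Be aware that Sqoop writes the sequence-file values with a generated record class, so Hive may still fail to deserialize them (the WritableName error in the next comment is exactly that); importing as text with --hive-import is the reliable route.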
Hello Durga Sir,
I tried to do a sqoop-import as a sequence file. However, I was not able to see the rows in Hive.
Could you help?
I am getting the exception: java.io.IOException: WritableName can't load class: order_Items
Please raise it on discuss.itversity.com
Hello Sir, when I use the CREATE EXTERNAL TABLE command in Hive, it creates the schema but the table is not being loaded with data. Can you think of any reason why this happened?
I am not sure, but I have observed that we have to load the data after the table creation. I ran the below command after doing everything as Durga sir told, and it worked for me.
load data inpath '/user/cloudera/department/' into table departments;
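Another thing worth checking (my assumption, not from the video): an external table only shows data when its LOCATION points at the directory Sqoop actually wrote to, so giving LOCATION at creation time avoids the separate load step altogether. A sketch, assuming comma-delimited text files and the standard departments columns:

CREATE EXTERNAL TABLE departments (department_id INT, department_name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/cloudera/department';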
Will this .avsc be created in all versions, or is it specific to recent versions? When I tried, I didn't see the .avsc file along with the .java file.
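In the versions I have used, Sqoop generates the .avsc (and the .java) in the local directory where the command is run, not in HDFS, so check there first. To point a Hive Avro table at it, the schema has to be copied into HDFS; a rough sketch (the file and directory names are placeholders, and as this thread shows the generated .avsc name varies by version):

hdfs dfs -mkdir -p /user/cloudera/avsc
hdfs dfs -put departments.avsc /user/cloudera/avsc/

CREATE EXTERNAL TABLE departments_avro
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/user/cloudera/departments'
TBLPROPERTIES ('avro.schema.url'='hdfs:///user/cloudera/avsc/departments.avsc');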