For any technical discussions or doubts, please use our forum - discuss.itversity.com
For practicing on a state-of-the-art big data cluster, please sign up on - labs.itversity.com
The lab is under free preview until 12/31/2016, and after that subscription
charges are $14.99 per 31 days, $34.99 per 93 days, and $54.99 per 185 days
Thank you
Again, nice presentation on the Sqoop export and update-mode options. Also, when there is a primary key constraint on the target table in MySQL, how can the failed map task's four attempts create duplicate keys in the table? That's against the data integrity constraint we have on the table, isn't it?
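For reference, a minimal sketch of an export that updates existing rows and inserts new ones; the connection string, table, and key column (daily_revenue, order_date) are assumptions for illustration, not taken from the video:

    # Upsert export: rows matching --update-key are updated, the rest inserted
    sqoop export \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user -P \
      --table daily_revenue \
      --export-dir /user/retail/daily_revenue \
      --update-key order_date \
      --update-mode allowinsert

On the duplicate-key question: Sqoop exports are not atomic, so a failed job can leave partial results behind, but a retried map task that re-inserts the same rows would hit the primary key constraint and fail rather than create duplicates. With --update-mode allowinsert the retry becomes an upsert, so re-processing the same rows is harmless. For plain inserts, the --staging-table option makes an export all-or-nothing (note it cannot be combined with --update-key).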
Hi Durga,
Could you please explain how to use the shortcut keys you mentioned in the video?
Ex:
/eval
/-rm
Hi Durga sir,
While exporting data from Hadoop using --update-key and --update-mode, the script errors out, but I see the data being updated/inserted correctly.
Here is the error message:
ERROR tool.ExportTool: Error during export: Export job failed!
Any input on this would be greatly appreciated.
Tushar
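When the job reports failure even though the rows look correct, the real cause is usually buried in the failed map task's log. A sketch of how one might dig it out on a YARN cluster with log aggregation enabled; the application ID shown is a placeholder for whatever ID the failed job printed:

    # List recently failed applications, then pull the logs for the one in question
    yarn application -list -appStates FAILED
    yarn logs -applicationId application_1480000000000_0042 | grep -A5 'ERROR\|Exception'

Typical culprits are a few malformed records, a delimiter mismatch, or a constraint violation on rows the --update-key did not match.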
If the data in the export file does not match the data format/type in the table, we will get an error even though the export completed. Try with a .txt and a .csv file and you can see the difference. If anything I said is wrong, please let me know.
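For context: Sqoop export parses the files in the export directory using its input-formatting arguments, so a CSV file needs its delimiters spelled out. A sketch, with the connection details and table as assumptions:

    # Tell Sqoop how the export files are delimited before it parses them
    sqoop export \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user -P \
      --table daily_revenue \
      --export-dir /user/retail/daily_revenue_csv \
      --input-fields-terminated-by ',' \
      --input-lines-terminated-by '\n'

If even one record fails to parse or violates a column type, the map task that owns it fails, which is how an export can end up partially completed yet still report an error.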
Hi sir, is there any document on how to install Sqoop on a Mac? I am getting the error below before the MR job executes in Sqoop:
Location:
org/apache/hadoop/hdfs/DFSClient.getQuotaUsage(Ljava/lang/String;)Lorg/apache/hadoop/fs/QuotaUsage; @160: areturn
Reason:
Type 'org/apache/hadoop/fs/ContentSummary' (current frame, stack[0]) is not assignable to 'org/apache/hadoop/fs/QuotaUsage' (from method signature)
Current Frame:
bci: @160
flags: { }
locals: { 'org/apache/hadoop/hdfs/DFSClient', 'java/lang/String', 'org/apache/hadoop/ipc/RemoteException', 'java/io/IOException' }
I think to set up Sqoop, you first need to set up HDFS. Have you set up HDFS yet?
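A few sanity checks that might help. This particular verifier error (ContentSummary not assignable to QuotaUsage) usually points at mixed Hadoop jar versions on the classpath rather than the installation steps themselves, so comparing versions is worth a try; the commands are standard, but the output depends on the setup:

    jps                    # NameNode and DataNode should be listed if HDFS is up
    hdfs dfs -ls /         # confirms HDFS is actually reachable
    hadoop version         # note the Hadoop version in use...
    sqoop version          # ...and check Sqoop was built against a compatible one
    echo $HADOOP_COMMON_HOME $HADOOP_MAPRED_HOME   # Sqoop loads Hadoop jars from here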
Just a simple question, sir.
How do you complete a command when you type "/eval" (that is, what completes the eval command)?
I couldn't find out how to make this work.
Thanks
+gagan chhabra You have to use the terminal in vi mode. Here is the video which covers it: th-cam.com/video/Op6DS0VlaIM/w-d-xo.html
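For anyone else stuck here, a quick sketch of what that looks like in bash: /eval is not a Sqoop feature but a backwards history search once the shell's line editor is in vi mode:

    set -o vi    # switch the line editor to vi mode (add to ~/.bashrc to keep it)
    # Then press Esc to enter command mode and type:
    #   /eval<Enter>   jump to the most recent history entry containing "eval"
    #   n              repeat the search further back in history
    #   <Enter>        run the recalled command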
Thanks a lot :)