
Sqoop export command in Big Data Hadoop

•Exports data from HDFS back to a table in the RDBMS database.
•The target table must already exist in the target database.
•The files given as input to Sqoop contain records, which become rows in the table. Sqoop reads the files and parses them into a set of records using a user-specified delimiter.
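To make the last point concrete, here is a minimal sketch of what such an input file looks like and how its delimited records map to fields. The file name `part-m-00000` and the `id`/`name` columns are assumptions for illustration; the shell loop only mimics the parsing step that `sqoop export` performs.

```shell
#!/bin/sh
# An HDFS export file (e.g. part-m-00000) is plain text: one record
# per line, fields separated by the chosen delimiter (',' here).
printf '1,alice\n2,bob\n' > part-m-00000

# Split each line on the delimiter -- this mirrors how sqoop export
# parses records into column values before inserting them as rows.
while IFS=',' read -r id name; do
  echo "row: id=$id name=$name"
done < part-m-00000
```

Each line of the file yields one row, so a mismatch between the file's delimiter and `--fields-terminated-by` is a common cause of failed or garbled exports.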

sqoop export --connect jdbc:mysql://localhost:3306/database --username root --driver com.mysql.jdbc.Driver --table tablename --export-dir /hadoopfolder/part-m-00000 --fields-terminated-by ','

•Verify the exported rows in MySQL: mysql> SELECT * FROM tablename;
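Putting the steps together, a typical session might look like the following command sequence. The database name, table name, and columns are placeholders for illustration, and the commands assume a running MySQL server and Hadoop cluster, so this is a sketch rather than a copy-paste recipe.

```shell
# Create the target table first -- sqoop export does not create it.
# (Hypothetical database, table, and columns for illustration.)
mysql -u root -p -e "CREATE TABLE database.tablename (id INT, name VARCHAR(50));"

# Export the HDFS file into the table, splitting fields on ','.
sqoop export \
  --connect jdbc:mysql://localhost:3306/database \
  --driver com.mysql.jdbc.Driver \
  --username root \
  --table tablename \
  --export-dir /hadoopfolder/part-m-00000 \
  --fields-terminated-by ','

# Verify that the rows arrived.
mysql -u root -p -e "SELECT * FROM database.tablename;"
```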


Copyright ©2022 coderraj.com. All Rights Reserved.