
Sqoop import table with columns in Big Data Hadoop

The command below imports selected columns of the customers table from MySQL into HDFS, filters rows with a WHERE clause, runs eight parallel mappers split on cust_id, and maps every imported column to a Java String:

sqoop import \
  --connect jdbc:mysql://localhost:3306/telecom \
  --driver com.mysql.jdbc.Driver \
  --username root --password root \
  --table customers \
  --columns "cust_id,name,address,date,history,occupation" \
  --where "item >= 1234" \
  --target-dir /tmp/customers \
  -m 8 \
  --split-by cust_id \
  --fields-terminated-by ',' \
  --escaped-by '\\' \
  --hive-drop-import-delims \
  --map-column-java cust_id=String,name=String,address=String,date=String,history=String,occupation=String

Note that the column list passed to --columns must not contain spaces (or must be quoted as one argument), and the types given to --map-column-java are Java class names, so String is capitalized.
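With -m 8 and --split-by cust_id, Sqoop queries the minimum and maximum of cust_id and divides that range into one contiguous interval per mapper, each mapper importing its slice with its own WHERE clause. A minimal Python sketch of that even-split partitioning (the function name and the example id range 1..1000 are illustrative, not Sqoop's actual code):

```python
def split_ranges(min_id, max_id, num_mappers):
    """Divide [min_id, max_id] into num_mappers contiguous ranges,
    mimicking how Sqoop assigns per-mapper WHERE clauses."""
    span = max_id - min_id + 1
    base, extra = divmod(span, num_mappers)
    ranges = []
    lo = min_id
    for i in range(num_mappers):
        size = base + (1 if i < extra else 0)  # spread the remainder
        hi = lo + size - 1
        ranges.append((lo, hi))
        lo = hi + 1
    return ranges

# e.g. cust_id spanning 1..1000 imported with -m 8
for lo, hi in split_ranges(1, 1000, 8):
    print(f"WHERE cust_id >= {lo} AND cust_id <= {hi}")
```

This is why --split-by should name a column with fairly uniform values: a skewed split column leaves some mappers with most of the rows.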
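The --hive-drop-import-delims flag removes characters that would break Hive's default text format, namely newline, carriage return, and the Ctrl-A (\x01) field terminator, from imported string fields. An illustrative Python equivalent of that cleanup (the function name is hypothetical):

```python
# Characters Hive's default text layout treats as row/field delimiters.
HIVE_DELIMS = {"\n", "\r", "\x01"}

def drop_import_delims(value: str) -> str:
    """Strip newline, carriage return, and Ctrl-A characters,
    as --hive-drop-import-delims does for string columns."""
    return "".join(ch for ch in value if ch not in HIVE_DELIMS)

# An address containing an embedded newline survives as a single field.
print(drop_import_delims("12 Park\nStreet\x01Flat 3\r"))
```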



Copyright ©2022 coderraj.com. All Rights Reserved.