Load data into Hive with custom delimiter

Solution 1

By default, Hive only allows a single character as the field delimiter. Although RegexSerDe can be used to specify a multi-character delimiter, it can be daunting to use, especially for beginners.

The patch (HIVE-5871) adds a new SerDe named MultiDelimitSerDe. With MultiDelimitSerDe, users can specify a multi-character field delimiter when creating tables, in a way very similar to a typical table creation.

hive> CREATE TABLE logs (foo INT, bar STRING, created_date TIMESTAMP)
    > ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe' 
    > WITH SERDEPROPERTIES ("field.delim"="<=>")
    > STORED AS TEXTFILE;

hive> dfs -put /home/user1/multi_char.txt /user/hive/warehouse/logs/. ;

hive> select * from logs;
OK
120 abcdefg 2016-01-01 12:14:11
Time taken: 1.657 seconds, Fetched: 1 row(s)
hive> 

Solution 2

CREATE TABLE logs (foo INT, bar STRING, created_date TIMESTAMP)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe'
WITH SERDEPROPERTIES (
    "field.delim"="<=>",
    "collection.delim"=":",
    "mapkey.delim"="@"
);

Load the data into the table:

load data local inpath '/home/kishore/Data/input.txt' overwrite into table logs;
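
Note that collection.delim and mapkey.delim only come into play when the table actually has ARRAY or MAP columns; the logs table above has none, so they go unused. A rough sketch of a hypothetical table (table name, column names and sample line are invented for illustration) where they would matter:

CREATE TABLE logs_complex (
    foo   INT,
    tags  ARRAY<STRING>,          -- array elements split on ':' (collection.delim)
    attrs MAP<STRING, STRING>     -- map entries split on ':', keys from values on '@' (mapkey.delim)
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe'
WITH SERDEPROPERTIES (
    "field.delim"="<=>",
    "collection.delim"=":",
    "mapkey.delim"="@"
)
STORED AS TEXTFILE;

-- a matching input line: 120<=>a:b:c<=>k1@v1:k2@v2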

Solution 3

I suggest you go with the MultiDelimitSerDe answers mentioned earlier over mine. You can also give RegexSerDe a try, but you need an additional step of casting to your data types, since RegexSerDe returns every column as a STRING by default.

RegexSerDe comes in handy when dealing with log files where the data is not uniformly arranged around one single delimiter.

CREATE TABLE logs_tmp  (foo STRING,bar STRING, created_date STRING) 
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe' 
WITH SERDEPROPERTIES (
 "input.regex" = "(\\d{3})<=>(\\w+)<=>(\\d{4}-\\d{2}-\\d{2}\\s\\d{2}:\\d{2}:\\d{2})"
) 
STORED AS TEXTFILE;

LOAD DATA LOCAL INPATH 'logs.txt' overwrite into table logs_tmp;

CREATE TABLE logs  (foo INT,bar STRING, created_date TIMESTAMP) ;

INSERT INTO TABLE logs SELECT CAST(foo AS INT) AS foo, bar, CAST(created_date AS TIMESTAMP) AS created_date FROM logs_tmp;

output:

OK
Time taken: 0.213 seconds
hive> select * from logs;
120     abcdefg 2016-01-01 12:14:11
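
If the separate CREATE TABLE plus INSERT feels verbose, a CTAS statement should give roughly the same result in one step (a sketch along the same lines, assuming logs does not already exist):

CREATE TABLE logs AS
SELECT CAST(foo AS INT)                 AS foo,
       bar,
       CAST(created_date AS TIMESTAMP)  AS created_date
FROM logs_tmp;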

Comments

  • shriyog, almost 2 years ago

    I'm trying to create an internal (managed) table in hive that can store my incremental log data. The table goes like this:

    CREATE TABLE logs (foo INT, bar STRING, created_date TIMESTAMP)
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY '<=>'
    STORED AS TEXTFILE;
    

    I need to load data into this table periodically.

    LOAD DATA INPATH '/user/foo/data/logs' INTO TABLE logs;
    

    But the data is not getting inserted into the table properly. There might be some problem with the delimiter, but I can't figure out why.

    Example log line:

    120<=>abcdefg<=>2016-01-01 12:14:11
    

    On select * from logs; I get:

    120  =>abcdefg  NULL
    

    The first attribute is fine; the second contains part of the delimiter but still gets inserted since it's a string; the third is NULL since it expects a timestamp.

    Can anyone please help on how to provide a custom delimiter and load the data successfully?

  • shriyog, over 7 years ago
    Thanks for such a quick response, it worked and now I'm able to load the data. But when I perform queries other than select * I get a Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found exception. Can you specify the source and where to add it?
  • Ronak Patel, over 7 years ago
    You need to get the updated jar and place it in the $HIVE_HOME/lib dir. See the Jira ticket in my answer for details on which version this class was made available (Hive 0.14 and later).
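
A hedged sketch of the fix described in the last comment: besides copying the jar into the $HIVE_HOME/lib directory, the contrib jar can also be registered for the current session with ADD JAR (the path and version below are placeholders; adjust to your installation):

-- placeholder path/version: point this at wherever your hive-contrib jar lives
ADD JAR /usr/lib/hive/lib/hive-contrib-0.14.0.jar;

-- verify it is registered, then re-run a query beyond select *
LIST JARS;
SELECT bar FROM logs WHERE foo = 120;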