How can I partition a table with HIVE? [hadoop]

How can I partition a table with HIVE?


If I understand correctly, you have files in folders four levels deep under the directory logs. In that case, you define your table as external with path 'logs' and partitioned by 4 virtual fields: year, month, day_of_month, hour_of_day.

The partitioning is essentially done for you by Flume.
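As an illustration, the directory tree below shows how such a layout maps onto the four partition keys. The exact structure and file names are assumptions; they depend on how your Flume sink is configured (FlumeData is Flume's default file-name prefix, but yours may differ):

```
logs/
  2013/                        <- log_year
    03/                        <- log_month
      15/                      <- log_day_of_month
        10/                    <- log_hour_of_day
          FlumeData.1363341600 <- data file written by Flume
```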

EDIT 3/9: A lot of the details depend on exactly how Flume writes the files. But in general terms, your DDL should look something like this:

CREATE EXTERNAL TABLE table_name (fields...)
PARTITIONED BY (log_year STRING, log_month STRING, log_day_of_month STRING, log_hour_of_day STRING)
-- format description --
STORED AS TEXTFILE
LOCATION '/your user path/logs';

EDIT 3/15: Per zzarbi's request, I'm adding a note that after the table is created, Hive needs to be informed about the partitions that exist. This needs to be done repeatedly as long as Flume or another process creates new partitions. See my answer to the Create external with Partition question.
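A minimal sketch of how that registration can be done in HiveQL. The table name, partition columns, values, and path below are illustrative and follow the DDL above; note that MSCK REPAIR only discovers directories that use the key=value naming convention (e.g. log_year=2013), which is not what Flume writes by default, so with plain date directories you typically add each partition explicitly:

```sql
-- Option 1: bulk-discover partitions, only if directories are named key=value
MSCK REPAIR TABLE table_name;

-- Option 2: register one partition explicitly, pointing at the directory
-- Flume wrote (partition values and path are illustrative)
ALTER TABLE table_name ADD IF NOT EXISTS
  PARTITION (log_year='2013', log_month='03',
             log_day_of_month='15', log_hour_of_day='10')
  LOCATION '/your user path/logs/2013/03/15/10';
```

The ALTER TABLE form is the one you would script and re-run (e.g. hourly) as Flume creates new directories.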