
Viewing the Test Result

Use BulkLoad to import 1 TB of data in batches and record the elapsed time.

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.columns=HBASE_ROW_KEY,f1:H_NAME,f1:ADDRESS \
  -Dimporttsv.separator="," \
  -Dimporttsv.skip.bad.lines=true \
  -Dmapreduce.split.minsize=1073741824 \
  -Dimporttsv.bulk.output=/tmp/hbase/hfile \
  ImportTable /tmp/hbase/datadirImport
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /tmp/hbase/hfile ImportTable
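One way to record the elapsed time for each phase is to capture timestamps around each command, as in the sketch below. The `sleep` is a placeholder standing in for the actual hbase command, which only runs on a real cluster; replace it with the ImportTsv or LoadIncrementalHFiles invocation above.

```shell
#!/bin/bash
# Sketch: measure wall-clock seconds for one bulk-load phase.
start=$(date +%s)
sleep 1   # placeholder for the hbase ImportTsv / LoadIncrementalHFiles command
end=$(date +%s)
echo "Phase took $((end - start)) s"
```

On a long-running MapReduce job, whole-second resolution from `date +%s` is sufficient; for short commands, `time` prefixed to the command gives finer-grained user/sys breakdowns.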

Before running the test again, delete the badline and hfile output from the /tmp/hbase directory.
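The cleanup could look like the following sketch, assuming the outputs live on HDFS under /tmp/hbase (the exact badline path depends on your configuration):

```shell
# Remove the previous run's HFile output and bad-line records (assumed HDFS paths).
hdfs dfs -rm -r -f /tmp/hbase/hfile
hdfs dfs -rm -r -f /tmp/hbase/badline
```

The `-f` flag keeps the command from failing if a path is already absent, so the cleanup is safe to run unconditionally before each test.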