| Modifier and Type | Field and Description |
|---|---|
| `static String` | `CHECKPOINT_MODE` In this mode, checkpoint data is provided to the job. |
| `protected org.apache.hadoop.conf.Configuration` | `conf` |
| `static String` | `END_TIME_MILLIS` If this filter is set, an event is provided to the MapReduce job only if it happened before this timestamp, in milliseconds. |
| `static String` | `HOME_DIR` The home directory for GemFire XD data in HDFS. |
| `static String` | `INPUT_TABLE` The name of the table to process, for example `APP.CUSTOMERS`. |
| `static String` | `PROPERTY_PREFIX` A valid property starting with this prefix is used to tune the input format's behavior. |
| `static String` | `START_TIME_MILLIS` If this filter is set, an event is provided to the MapReduce job only if it happened after this timestamp, in milliseconds. |
| Constructor and Description |
|---|
| `RowInputFormat()` |
| Modifier and Type | Method and Description |
|---|---|
| `org.apache.hadoop.mapreduce.RecordReader<Key,Row>` | `createRecordReader(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext context)` |
| `List<org.apache.hadoop.mapreduce.InputSplit>` | `createSplits(Collection<org.apache.hadoop.fs.FileStatus> hoplogs)` |
| `static String` | `getFullyQualifiedTableName(String name)` |
| `Collection<org.apache.hadoop.fs.FileStatus>` | `getHoplogs(org.apache.hadoop.conf.Configuration job)` Identifies filters provided in the job configuration and creates a list of sorted hoplogs satisfying the filters. |
| `List<org.apache.hadoop.mapreduce.InputSplit>` | `getSplits(org.apache.hadoop.mapreduce.JobContext job)` |
public static final String HOME_DIR
public static final String INPUT_TABLE
public static final String START_TIME_MILLIS
public static final String END_TIME_MILLIS
public static final String CHECKPOINT_MODE
public static final String PROPERTY_PREFIX
protected org.apache.hadoop.conf.Configuration conf
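As a minimal sketch of how these constants might be used, the driver below configures a MapReduce job that reads a GemFire XD table through `RowInputFormat`. The package name `com.pivotal.gemfirexd.hadoop.mapreduce`, the HDFS path `/gemfirexd`, the job name, and the `CustomerMapper` class are assumptions for illustration; only the constant names (`HOME_DIR`, `INPUT_TABLE`, `START_TIME_MILLIS`, `END_TIME_MILLIS`) come from this reference. Compiling and running it requires the Hadoop and GemFire XD jars on the classpath.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.output.NullOutputFormat;

// Assumed package for the GemFire XD MapReduce API classes.
import com.pivotal.gemfirexd.hadoop.mapreduce.Key;
import com.pivotal.gemfirexd.hadoop.mapreduce.Row;
import com.pivotal.gemfirexd.hadoop.mapreduce.RowInputFormat;

public class CustomerScanJob {

  // Hypothetical identity mapper: each input record is a (Key, Row) pair
  // produced by RowInputFormat's RecordReader.
  public static class CustomerMapper extends Mapper<Key, Row, Key, Row> {
    @Override
    protected void map(Key key, Row value, Context context)
        throws IOException, InterruptedException {
      context.write(key, value);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // HOME_DIR: home directory of GemFire XD data in HDFS (path is an example).
    conf.set(RowInputFormat.HOME_DIR, "/gemfirexd");
    // INPUT_TABLE: fully qualified name of the table to process.
    conf.set(RowInputFormat.INPUT_TABLE, "APP.CUSTOMERS");
    // Optional time-window filters, in milliseconds since the epoch:
    // only events after START_TIME_MILLIS and before END_TIME_MILLIS are read.
    conf.setLong(RowInputFormat.START_TIME_MILLIS, 0L);
    conf.setLong(RowInputFormat.END_TIME_MILLIS, System.currentTimeMillis());

    Job job = Job.getInstance(conf, "customer-scan");
    job.setJarByClass(CustomerScanJob.class);
    job.setInputFormatClass(RowInputFormat.class);
    job.setMapperClass(CustomerMapper.class);
    job.setNumReduceTasks(0);
    job.setOutputFormatClass(NullOutputFormat.class);
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Because the fields are `String` configuration keys rather than values, they are passed as the name argument to `Configuration.set`/`setLong`; the framework then instantiates `RowInputFormat`, which calls `getHoplogs` and `createSplits` internally to turn the filtered hoplog files into input splits.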
public org.apache.hadoop.mapreduce.RecordReader&lt;Key,Row&gt; createRecordReader(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext context) throws IOException, InterruptedException
Specified by: createRecordReader in class org.apache.hadoop.mapreduce.InputFormat&lt;Key,Row&gt;
Throws: IOException, InterruptedException

public List&lt;org.apache.hadoop.mapreduce.InputSplit&gt; getSplits(org.apache.hadoop.mapreduce.JobContext job) throws IOException, InterruptedException
Specified by: getSplits in class org.apache.hadoop.mapreduce.InputFormat&lt;Key,Row&gt;
Throws: IOException, InterruptedException

public Collection&lt;org.apache.hadoop.fs.FileStatus&gt; getHoplogs(org.apache.hadoop.conf.Configuration job) throws IOException
Throws: IOException

public List&lt;org.apache.hadoop.mapreduce.InputSplit&gt; createSplits(Collection&lt;org.apache.hadoop.fs.FileStatus&gt; hoplogs) throws IOException
Throws: IOException

Copyright © 2010-2015 Pivotal Software, Inc. All rights reserved.