| Modifier and Type | Field and Description |
|---|---|
| `static java.lang.String` | `CHECKPOINT_MODE`: In this mode, checkpoint data is provided to the job. |
| `protected org.apache.hadoop.conf.Configuration` | `conf` |
| `static java.lang.String` | `END_TIME_MILLIS`: If this filter is set, an event is provided to the map reduce job only if it happened before this timestamp (in milliseconds). |
| `static java.lang.String` | `HOME_DIR`: The home directory for gemfirexd data in HDFS. |
| `static java.lang.String` | `INPUT_TABLE`: The name of the table to process, for example `APP.CUSTOMERS`. |
| `static java.lang.String` | `PROPERTY_PREFIX`: A valid property starting with this prefix is used to tune the input format's behavior. |
| `static java.lang.String` | `START_TIME_MILLIS`: If this filter is set, an event is provided to the map reduce job only if it happened after this timestamp (in milliseconds). |
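The properties above are set on the job configuration before the job is submitted. A minimal driver sketch follows, assuming a standard Hadoop `Job` setup; the import path for `RowInputFormat` and the sample table name and paths are assumptions for illustration, not taken from this page:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

// Assumed package; adjust to the actual location of RowInputFormat in your distribution.
import com.pivotal.gemfirexd.hadoop.mapreduce.RowInputFormat;

public class RowInputJobSetup {
    public static Job configure() throws Exception {
        Configuration conf = new Configuration();

        // HOME_DIR: the home directory for gemfirexd data in HDFS (example path).
        conf.set(RowInputFormat.HOME_DIR, "/gemfirexd");

        // INPUT_TABLE: the table to process, e.g. APP.CUSTOMERS.
        conf.set(RowInputFormat.INPUT_TABLE, "APP.CUSTOMERS");

        // Optional time-window filters, in milliseconds since the epoch:
        // only events after START_TIME_MILLIS and before END_TIME_MILLIS are provided.
        conf.setLong(RowInputFormat.START_TIME_MILLIS, 0L);
        conf.setLong(RowInputFormat.END_TIME_MILLIS, System.currentTimeMillis());

        Job job = Job.getInstance(conf, "row-input-example");
        job.setInputFormatClass(RowInputFormat.class);
        return job;
    }
}
```

This is a configuration sketch only; running it requires the Hadoop and gemfirexd libraries on the classpath.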
| Constructor and Description |
|---|
RowInputFormat() |
| Modifier and Type | Method and Description |
|---|---|
| `org.apache.hadoop.mapreduce.RecordReader<Key,Row>` | `createRecordReader(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext context)` |
| `java.util.List<org.apache.hadoop.mapreduce.InputSplit>` | `createSplits(java.util.Collection<org.apache.hadoop.fs.FileStatus> hoplogs)` |
| `static java.lang.String` | `getFullyQualifiedTableName(java.lang.String name)` |
| `java.util.Collection<org.apache.hadoop.fs.FileStatus>` | `getHoplogs(org.apache.hadoop.conf.Configuration job)`: Identifies filters provided in the job configuration and creates a list of sorted hoplogs satisfying the filters. |
| `java.util.List<org.apache.hadoop.mapreduce.InputSplit>` | `getSplits(org.apache.hadoop.mapreduce.JobContext job)` |
public static final java.lang.String HOME_DIR
public static final java.lang.String INPUT_TABLE
public static final java.lang.String START_TIME_MILLIS
public static final java.lang.String END_TIME_MILLIS
public static final java.lang.String CHECKPOINT_MODE
public static final java.lang.String PROPERTY_PREFIX
protected org.apache.hadoop.conf.Configuration conf
public org.apache.hadoop.mapreduce.RecordReader<Key,Row> createRecordReader(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext context) throws java.io.IOException, java.lang.InterruptedException
public java.util.List<org.apache.hadoop.mapreduce.InputSplit> getSplits(org.apache.hadoop.mapreduce.JobContext job)
throws java.io.IOException,
java.lang.InterruptedException
public java.util.Collection<org.apache.hadoop.fs.FileStatus> getHoplogs(org.apache.hadoop.conf.Configuration job)
throws java.io.IOException
public static java.lang.String getFullyQualifiedTableName(java.lang.String name)
public java.util.List<org.apache.hadoop.mapreduce.InputSplit> createSplits(java.util.Collection<org.apache.hadoop.fs.FileStatus> hoplogs)
throws java.io.IOException
Copyright © 2010-2015 Pivotal Software, Inc. All rights reserved.