public class HadoopScanRunner extends Object
| Constructor and Description |
|---|
| HadoopScanRunner() |
| Modifier and Type | Method and Description |
|---|---|
| static ScanMetrics | runJob(org.apache.hadoop.conf.Configuration hadoopConf, Class<? extends org.apache.hadoop.mapreduce.InputFormat> inputFormat, String jobName, Class<? extends org.apache.hadoop.mapreduce.Mapper> mapperClass) |
| static ScanMetrics | runJob(Configuration conf, String confRootField, org.apache.hadoop.conf.Configuration hadoopConf, Class<? extends org.apache.hadoop.mapreduce.InputFormat> inputFormat, String jobName, Class<? extends org.apache.hadoop.mapreduce.Mapper> mapperClass)<br>Run a ScanJob on Hadoop MapReduce. |
| static ScanMetrics | runScanJob(ScanJob scanJob, Configuration conf, String confRootField, org.apache.hadoop.conf.Configuration hadoopConf, Class<? extends org.apache.hadoop.mapreduce.InputFormat> inputFormat) |
| static ScanMetrics | runVertexScanJob(VertexScanJob vertexScanJob, Configuration conf, String confRootField, org.apache.hadoop.conf.Configuration hadoopConf, Class<? extends org.apache.hadoop.mapreduce.InputFormat> inputFormat) |
public static ScanMetrics runJob(Configuration conf, String confRootField, org.apache.hadoop.conf.Configuration hadoopConf, Class<? extends org.apache.hadoop.mapreduce.InputFormat> inputFormat, String jobName, Class<? extends org.apache.hadoop.mapreduce.Mapper> mapperClass) throws IOException, InterruptedException, ClassNotFoundException
The confRootField parameter must be a string in the format
package.package...class#fieldname, where fieldname is the
name of a public static field on the class specified by the portion of the
string before the #. The # itself is just a separator and
is discarded.
When a MapReduce task process prepares to execute the ScanJob, it reads the
public static field named by confRootField and casts it to a
ConfigNamespace. This namespace object becomes the root of a
Configuration instance, which is populated with the key-value pairs
from the conf parameter and then passed into the ScanJob.
This method blocks until the ScanJob completes, then returns the metrics generated by the job during its execution. It does not time out.
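The confRootField lookup described above can be sketched with plain Java reflection. The class ConfRootFieldDemo and its ROOT field below are hypothetical stand-ins (in JanusGraph the resolved value would be a ConfigNamespace), used only to show how a package.Class#fieldname string resolves to a public static field:

```java
import java.lang.reflect.Field;

public class ConfRootFieldDemo {

    // Hypothetical stand-in for a public static ConfigNamespace root
    public static final String ROOT = "example-namespace";

    // Resolve a "package.Class#fieldname" string to the field's value,
    // mirroring how a MapReduce task locates the ScanJob's config root.
    static Object resolveConfRoot(String confRootField) throws Exception {
        int sep = confRootField.indexOf('#');            // '#' is only a separator
        String className = confRootField.substring(0, sep);
        String fieldName = confRootField.substring(sep + 1);
        Field f = Class.forName(className).getField(fieldName); // public fields only
        return f.get(null);                               // static field: no instance
    }

    public static void main(String[] args) throws Exception {
        System.out.println(resolveConfRoot("ConfRootFieldDemo#ROOT"));
    }
}
```

A string that names a non-public field, or omits the #, would fail here just as it would during job setup.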
conf - configuration settings for the ScanJob
confRootField - the root of the ScanJob's configuration
hadoopConf - the Configuration passed to the MapReduce Job
inputFormat - the InputFormat<StaticBuffer, Iterable<Entry>> that reads (row, columns) pairs out of a JanusGraph edgestore
jobName - the name assigned to the MapReduce job
mapperClass - the Mapper implementation that runs the ScanJob
IOException - if the job fails for any reason
ClassNotFoundException - if the ScanJob's class cannot be loaded, or if Hadoop MapReduce's internal job-submission-related reflection fails
InterruptedException - if interrupted while waiting for the Hadoop MapReduce job to complete
public static ScanMetrics runJob(org.apache.hadoop.conf.Configuration hadoopConf, Class<? extends org.apache.hadoop.mapreduce.InputFormat> inputFormat, String jobName, Class<? extends org.apache.hadoop.mapreduce.Mapper> mapperClass) throws IOException, InterruptedException, ClassNotFoundException
public static ScanMetrics runScanJob(ScanJob scanJob, Configuration conf, String confRootField, org.apache.hadoop.conf.Configuration hadoopConf, Class<? extends org.apache.hadoop.mapreduce.InputFormat> inputFormat) throws IOException, InterruptedException, ClassNotFoundException
public static ScanMetrics runVertexScanJob(VertexScanJob vertexScanJob, Configuration conf, String confRootField, org.apache.hadoop.conf.Configuration hadoopConf, Class<? extends org.apache.hadoop.mapreduce.InputFormat> inputFormat) throws IOException, InterruptedException, ClassNotFoundException
Copyright © 2012–2024. All rights reserved.