Uses of Class org.apache.hadoop.hive.ql.plan.MapWork
| Packages that use MapWork |
|---|

| Package | Description |
|---|---|
| org.apache.hadoop.hive.ql.exec | Hive QL execution tasks, operators, functions and other handlers. |
| org.apache.hadoop.hive.ql.io.rcfile.merge | |
| org.apache.hadoop.hive.ql.io.rcfile.stats | |
| org.apache.hadoop.hive.ql.io.rcfile.truncate | |
| org.apache.hadoop.hive.ql.optimizer | |
| org.apache.hadoop.hive.ql.optimizer.physical | |
| org.apache.hadoop.hive.ql.parse | |
| org.apache.hadoop.hive.ql.plan | |
| Uses of MapWork in org.apache.hadoop.hive.ql.exec |
|---|
| Methods in org.apache.hadoop.hive.ql.exec that return MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| static MapWork | Utilities.getMapWork(org.apache.hadoop.conf.Configuration conf) |
| Methods in org.apache.hadoop.hive.ql.exec with parameters of type MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| static void | Utilities.createTmpDirs(org.apache.hadoop.conf.Configuration conf, MapWork mWork). Hive uses tmp directories to capture the output of each FileSinkOperator. |
| static int | Utilities.estimateNumberOfReducers(HiveConf conf, org.apache.hadoop.fs.ContentSummary inputSummary, MapWork work, boolean finalMapRed). Estimate the number of reducers needed for this job, based on job input, and configuration parameters. |
| static double | Utilities.getHighestSamplePercentage(MapWork work). Returns the highest sample percentage of any alias in the given MapWork. |
| static List<org.apache.hadoop.fs.Path> | Utilities.getInputPaths(org.apache.hadoop.mapred.JobConf job, MapWork work, org.apache.hadoop.fs.Path hiveScratchDir, Context ctx). Computes a list of all input paths needed to compute the given MapWork. |
| static List<org.apache.hadoop.fs.Path> | Utilities.getInputPathsTez(org.apache.hadoop.mapred.JobConf job, MapWork work). On Tez we're not creating dummy files when getting/setting input paths. |
| static org.apache.hadoop.fs.ContentSummary | Utilities.getInputSummary(Context ctx, MapWork work, org.apache.hadoop.fs.PathFilter filter). Calculate the total size of input files. |
| static long | Utilities.getTotalInputFileSize(org.apache.hadoop.fs.ContentSummary inputSummary, MapWork work, double highestSamplePercentage). Computes the total input file size. |
| static long | Utilities.getTotalInputNumFiles(org.apache.hadoop.fs.ContentSummary inputSummary, MapWork work, double highestSamplePercentage). Computes the total number of input files. |
| void | MapOperator.initializeAsRoot(org.apache.hadoop.conf.Configuration hconf, MapWork mapWork). Initializes this map op as the root of the tree. |
| static void | Utilities.setInputAttributes(org.apache.hadoop.conf.Configuration conf, MapWork mWork). Set hive input format, and input format file if necessary. |
| static void | Utilities.setMapWork(org.apache.hadoop.conf.Configuration conf, MapWork work) |
| static org.apache.hadoop.fs.Path | Utilities.setMapWork(org.apache.hadoop.conf.Configuration conf, MapWork w, org.apache.hadoop.fs.Path hiveScratchDir, boolean useCache) |
| Uses of MapWork in org.apache.hadoop.hive.ql.io.rcfile.merge |
|---|
| Subclasses of MapWork in org.apache.hadoop.hive.ql.io.rcfile.merge |
|---|

| Modifier and Type | Class and Description |
|---|---|
| class | MergeWork |
| Uses of MapWork in org.apache.hadoop.hive.ql.io.rcfile.stats |
|---|
| Subclasses of MapWork in org.apache.hadoop.hive.ql.io.rcfile.stats |
|---|

| Modifier and Type | Class and Description |
|---|---|
| class | PartialScanWork. Partial Scan Work. |
| Uses of MapWork in org.apache.hadoop.hive.ql.io.rcfile.truncate |
|---|
| Subclasses of MapWork in org.apache.hadoop.hive.ql.io.rcfile.truncate |
|---|

| Modifier and Type | Class and Description |
|---|---|
| class | ColumnTruncateWork |
| Uses of MapWork in org.apache.hadoop.hive.ql.optimizer |
|---|
| Methods in org.apache.hadoop.hive.ql.optimizer that return MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| static MapWork | GenMapRedUtils.createRCFileMergeTask(FileSinkDesc fsInputDesc, org.apache.hadoop.fs.Path finalName, boolean hasDynamicPartitions). Create a block level merge task for RCFiles. |
| Methods in org.apache.hadoop.hive.ql.optimizer with parameters of type MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| static String | GenMapRedUtils.findAlias(MapWork work, Operator<?> operator) |
| static Set<String> | GenMapRedUtils.findAliases(MapWork work, Operator<?> startOp) |
| static void | GenMapRedUtils.replaceMapWork(String sourceAlias, String targetAlias, MapWork source, MapWork target). Replace the Map-side operator tree associated with targetAlias in target with the Map-side operator tree associated with sourceAlias in source. |
| static void | GenMapRedUtils.setMapWork(MapWork plan, ParseContext parseCtx, Set<ReadEntity> inputs, PrunedPartitionList partsList, Operator<? extends OperatorDesc> topOp, String alias_id, HiveConf conf, boolean local). Initializes the MapWork. |
| static void | GenMapRedUtils.setTaskPlan(String path, String alias, Operator<? extends OperatorDesc> topOp, MapWork plan, boolean local, TableDesc tt_desc). Sets the current task in the MapredWork. |
| Uses of MapWork in org.apache.hadoop.hive.ql.optimizer.physical |
|---|
| Methods in org.apache.hadoop.hive.ql.optimizer.physical with parameters of type MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| long | AbstractJoinTaskDispatcher.getTotalKnownInputSize(Context context, MapWork currWork, Map<String,ArrayList<String>> pathToAliases, HashMap<String,Long> aliasToSize) |
| Uses of MapWork in org.apache.hadoop.hive.ql.parse |
|---|
| Methods in org.apache.hadoop.hive.ql.parse that return MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| MapWork | GenTezUtils.createMapWork(GenTezProcContext context, Operator<?> root, TezWork tezWork, PrunedPartitionList partitions) |
| Methods in org.apache.hadoop.hive.ql.parse with parameters of type MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| protected void | GenTezUtils.setupMapWork(MapWork mapWork, GenTezProcContext context, PrunedPartitionList partitions, Operator<? extends OperatorDesc> root, String alias) |
| Uses of MapWork in org.apache.hadoop.hive.ql.plan |
|---|
| Methods in org.apache.hadoop.hive.ql.plan that return MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| MapWork | MapredWork.getMapWork() |
| Methods in org.apache.hadoop.hive.ql.plan with parameters of type MapWork |
|---|

| Modifier and Type | Method and Description |
|---|---|
| void | MapWork.mergingInto(MapWork mapWork) |
| void | MapredWork.setMapWork(MapWork mapWork) |