| Uses of Class org.apache.hadoop.hive.ql.Context |
|---|
| Packages that use Context | |
|---|---|
| org.apache.hadoop.hive.ql | |
| org.apache.hadoop.hive.ql.exec | Hive QL execution tasks, operators, functions and other handlers. |
| org.apache.hadoop.hive.ql.exec.mr | |
| org.apache.hadoop.hive.ql.exec.tez | |
| org.apache.hadoop.hive.ql.lockmgr | Hive Lock Manager interfaces and some custom implementations |
| org.apache.hadoop.hive.ql.optimizer.physical | |
| org.apache.hadoop.hive.ql.parse | |
| Uses of Context in org.apache.hadoop.hive.ql |
|---|
| Methods in org.apache.hadoop.hive.ql that return Context | |
|---|---|
| Context | DriverContext.getCtx() |
| Constructors in org.apache.hadoop.hive.ql with parameters of type Context | |
|---|---|
| | DriverContext(Context ctx) |
| Uses of Context in org.apache.hadoop.hive.ql.exec |
|---|
| Methods in org.apache.hadoop.hive.ql.exec with parameters of type Context | |
|---|---|
| static List<org.apache.hadoop.fs.Path> | Utilities.getInputPaths(org.apache.hadoop.mapred.JobConf job, MapWork work, org.apache.hadoop.fs.Path hiveScratchDir, Context ctx) Computes a list of all input paths needed to compute the given MapWork. |
| static org.apache.hadoop.fs.ContentSummary | Utilities.getInputSummary(Context ctx, MapWork work, org.apache.hadoop.fs.PathFilter filter) Calculate the total size of input files. |
| static boolean | Utilities.isEmptyPath(org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.fs.Path dirPath, Context ctx) |
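As a sketch of how these Utilities helpers fit together. The method names and signatures come from the table above; everything else is an illustrative assumption, including the class name, the `org.apache.hadoop.hive.ql.plan` package for MapWork, passing a null PathFilter to mean "no filtering", and the handling of empty inputs. This fragment needs Hive's jars on the classpath and is not runnable on its own:

```java
import java.util.List;

import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.hive.ql.Context;
import org.apache.hadoop.hive.ql.exec.Utilities;
import org.apache.hadoop.hive.ql.plan.MapWork;

class InputInspector {
  // Hypothetical helper showing the three Utilities calls from the table together.
  void inspectInputs(JobConf job, MapWork work, Path scratchDir, Context ctx)
      throws Exception {
    // Every input path the given MapWork will read.
    List<Path> inputs = Utilities.getInputPaths(job, work, scratchDir, ctx);

    // Total size of the input files (null filter assumed to mean "count everything").
    ContentSummary summary = Utilities.getInputSummary(ctx, work, null);
    long totalBytes = summary.getLength();

    // Detect inputs that contain no data before launching a job for them.
    for (Path p : inputs) {
      if (Utilities.isEmptyPath(job, p, ctx)) {
        // ... skip or substitute a dummy input for this path ...
      }
    }
  }
}
```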
| Uses of Context in org.apache.hadoop.hive.ql.exec.mr |
|---|
| Methods in org.apache.hadoop.hive.ql.exec.mr with parameters of type Context | |
|---|---|
| static String | ExecDriver.generateCmdLine(HiveConf hconf, Context ctx) Given a Hive Configuration object, generates a command line fragment for passing such configuration information to ExecDriver. |
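A minimal sketch of the call above. Only the `generateCmdLine` signature is taken from this page; the wrapper class and method are hypothetical, and the `throws Exception` clause is a conservative assumption:

```java
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.Context;
import org.apache.hadoop.hive.ql.exec.mr.ExecDriver;

class CmdLineSketch {
  // Hypothetical wrapper: the returned fragment encodes the configuration so a
  // separately launched ExecDriver process can reconstruct it.
  String childJvmArgs(HiveConf hconf, Context ctx) throws Exception {
    return ExecDriver.generateCmdLine(hconf, ctx);
  }
}
```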
| Uses of Context in org.apache.hadoop.hive.ql.exec.tez |
|---|
| Methods in org.apache.hadoop.hive.ql.exec.tez with parameters of type Context | |
|---|---|
| org.apache.tez.dag.api.Vertex | DagUtils.createVertex(org.apache.hadoop.mapred.JobConf conf, BaseWork work, org.apache.hadoop.fs.Path scratchDir, org.apache.hadoop.yarn.api.records.LocalResource appJarLr, List<org.apache.hadoop.yarn.api.records.LocalResource> additionalLr, org.apache.hadoop.fs.FileSystem fileSystem, Context ctx, boolean hasChildren, TezWork tezWork) Create a vertex from a given work object. |
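A sketch of the call with each argument annotated. The parameter list is from the signature above; how a `DagUtils` instance (`dagUtils` below) and the argument values are obtained is not shown on this page and is assumed here:

```java
// Illustrative fragment only; 'dagUtils' and all arguments are assumed
// to be in scope from surrounding Tez task-compilation code.
org.apache.tez.dag.api.Vertex vertex = dagUtils.createVertex(
    conf,        // org.apache.hadoop.mapred.JobConf for this stage
    work,        // BaseWork describing the stage's operators
    scratchDir,  // per-query scratch directory
    appJarLr,    // LocalResource for the Hive exec jar
    additionalLr,// extra LocalResources to ship to the cluster
    fileSystem,  // FileSystem holding the scratch dir
    ctx,         // the query Context
    hasChildren, // whether downstream vertices exist
    tezWork);    // the enclosing TezWork DAG description
```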
| Uses of Context in org.apache.hadoop.hive.ql.lockmgr |
|---|
| Methods in org.apache.hadoop.hive.ql.lockmgr with parameters of type Context | |
|---|---|
| void | HiveTxnManager.acquireLocks(QueryPlan plan, Context ctx, String username) Acquire all of the locks needed by a query. |
| void | DbTxnManager.acquireLocks(QueryPlan plan, Context ctx, String username) |
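A sketch of acquiring locks through the interface. The `acquireLocks` signature is from the table above; the wrapper is hypothetical, and `throws Exception` is a conservative assumption about the method's checked exceptions. Coding against `HiveTxnManager` means the same call works whether the runtime implementation is `DbTxnManager` or another manager:

```java
import org.apache.hadoop.hive.ql.Context;
import org.apache.hadoop.hive.ql.QueryPlan;
import org.apache.hadoop.hive.ql.lockmgr.HiveTxnManager;

class LockSketch {
  // Hypothetical helper: blocks until every lock the plan needs is granted,
  // or the transaction manager gives up and throws.
  void lockForQuery(HiveTxnManager txnMgr, QueryPlan plan, Context ctx,
                    String username) throws Exception {
    txnMgr.acquireLocks(plan, ctx, username);
  }
}
```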
| Uses of Context in org.apache.hadoop.hive.ql.optimizer.physical |
|---|
| Methods in org.apache.hadoop.hive.ql.optimizer.physical that return Context | |
|---|---|
| Context | PhysicalContext.getContext() |
| Methods in org.apache.hadoop.hive.ql.optimizer.physical with parameters of type Context | |
|---|---|
| long | AbstractJoinTaskDispatcher.getTotalKnownInputSize(Context context, MapWork currWork, Map<String,ArrayList<String>> pathToAliases, HashMap<String,Long> aliasToSize) |
| Task<? extends Serializable> | SortMergeJoinTaskDispatcher.processCurrentTask(MapRedTask currTask, ConditionalTask conditionalTask, Context context) |
| Task<? extends Serializable> | CommonJoinTaskDispatcher.processCurrentTask(MapRedTask currTask, ConditionalTask conditionalTask, Context context) |
| abstract Task<? extends Serializable> | AbstractJoinTaskDispatcher.processCurrentTask(MapRedTask currTask, ConditionalTask conditionalTask, Context context) |
| void | PhysicalContext.setContext(Context context) |
| Constructors in org.apache.hadoop.hive.ql.optimizer.physical with parameters of type Context | |
|---|---|
| | PhysicalContext(HiveConf conf, ParseContext parseContext, Context context, List<Task<? extends Serializable>> rootTasks, Task<? extends Serializable> fetchTask) |
| Uses of Context in org.apache.hadoop.hive.ql.parse |
|---|
| Fields in org.apache.hadoop.hive.ql.parse declared as Context | |
|---|---|
| protected Context | BaseSemanticAnalyzer.ctx |
| Methods in org.apache.hadoop.hive.ql.parse that return Context | |
|---|---|
| Context | ParseContext.getContext() |
| Methods in org.apache.hadoop.hive.ql.parse with parameters of type Context | |
|---|---|
| void | ColumnStatsSemanticAnalyzer.analyze(ASTNode ast, Context origCtx) |
| void | BaseSemanticAnalyzer.analyze(ASTNode ast, Context ctx) |
| protected void | TezCompiler.decideExecMode(List<Task<? extends Serializable>> rootTasks, Context ctx, GlobalLimitCtx globalLimitCtx) |
| protected abstract void | TaskCompiler.decideExecMode(List<Task<? extends Serializable>> rootTasks, Context ctx, GlobalLimitCtx globalLimitCtx) |
| protected void | MapReduceCompiler.decideExecMode(List<Task<? extends Serializable>> rootTasks, Context ctx, GlobalLimitCtx globalLimitCtx) |
| void | BaseSemanticAnalyzer.initCtx(Context ctx) |
| protected void | TezCompiler.optimizeTaskPlan(List<Task<? extends Serializable>> rootTasks, ParseContext pCtx, Context ctx) |
| protected abstract void | TaskCompiler.optimizeTaskPlan(List<Task<? extends Serializable>> rootTasks, ParseContext pCtx, Context ctx) |
| protected void | MapReduceCompiler.optimizeTaskPlan(List<Task<? extends Serializable>> rootTasks, ParseContext pCtx, Context ctx) |
| ASTNode | ParseDriver.parse(String command, Context ctx) |
| ASTNode | ParseDriver.parse(String command, Context ctx, boolean setTokenRewriteStream) Parses a command, optionally assigning the parser's token stream to the given context. |
| ASTNode | ParseDriver.parseSelect(String command, Context ctx) |
| void | ParseContext.setContext(Context ctx) |
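A sketch of the three-argument `parse` overload from the table above. The wrapper class and method are hypothetical, and `throws Exception` is a conservative assumption; the comment about the token stream reflects the method's documented behavior of optionally assigning the parser's token stream to the given context:

```java
import org.apache.hadoop.hive.ql.Context;
import org.apache.hadoop.hive.ql.parse.ASTNode;
import org.apache.hadoop.hive.ql.parse.ParseDriver;

class ParseSketch {
  // Hypothetical helper: parses a HiveQL command into an AST.
  ASTNode parseQuery(String command, Context ctx) throws Exception {
    ParseDriver pd = new ParseDriver();
    // Passing true asks the parser to hand its token stream to ctx,
    // so later phases can recover the original query text.
    return pd.parse(command, ctx, true);
  }
}
```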