Uses of Class
org.apache.hadoop.hive.ql.metadata.HiveException

Packages that use HiveException
org.apache.hadoop.hive.ql   
org.apache.hadoop.hive.ql.exec Hive QL execution tasks, operators, functions and other handlers. 
org.apache.hadoop.hive.ql.exec.mapjoin   
org.apache.hadoop.hive.ql.exec.mr   
org.apache.hadoop.hive.ql.exec.persistence   
org.apache.hadoop.hive.ql.exec.tez   
org.apache.hadoop.hive.ql.exec.vector   
org.apache.hadoop.hive.ql.exec.vector.expressions   
org.apache.hadoop.hive.ql.exec.vector.expressions.aggregates   
org.apache.hadoop.hive.ql.exec.vector.expressions.aggregates.gen   
org.apache.hadoop.hive.ql.exec.vector.udf   
org.apache.hadoop.hive.ql.index   
org.apache.hadoop.hive.ql.index.bitmap   
org.apache.hadoop.hive.ql.index.compact   
org.apache.hadoop.hive.ql.io   
org.apache.hadoop.hive.ql.io.rcfile.merge   
org.apache.hadoop.hive.ql.io.rcfile.truncate   
org.apache.hadoop.hive.ql.lockmgr Hive Lock Manager interfaces and some custom implementations.
org.apache.hadoop.hive.ql.metadata   
org.apache.hadoop.hive.ql.metadata.formatting   
org.apache.hadoop.hive.ql.optimizer   
org.apache.hadoop.hive.ql.optimizer.ppr   
org.apache.hadoop.hive.ql.parse   
org.apache.hadoop.hive.ql.plan   
org.apache.hadoop.hive.ql.security   
org.apache.hadoop.hive.ql.security.authorization   
org.apache.hadoop.hive.ql.security.authorization.plugin   
org.apache.hadoop.hive.ql.session   
org.apache.hadoop.hive.ql.udf.generic Standard toolkit and framework for generic User-defined functions. 
org.apache.hadoop.hive.ql.udf.ptf   
org.apache.hadoop.hive.ql.udf.xml   
 

Uses of HiveException in org.apache.hadoop.hive.ql
 

Methods in org.apache.hadoop.hive.ql that throw HiveException
 boolean DriverContext.addToRunnable(Task<? extends Serializable> tsk)
           
 Task<? extends Serializable> DriverContext.getRunnable(int maxthreads)
           
 void DriverContext.launching(TaskRunner runner)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.exec
 

Subclasses of HiveException in org.apache.hadoop.hive.ql.exec
 class AmbiguousMethodException
          Exception thrown by the UDF and UDAF method resolvers in case a unique method is not found.
 class NoMatchingMethodException
          Exception thrown by the UDF and UDAF method resolvers in case no matching method is found.
 class UDFArgumentException
          Exception thrown when a UDF argument is invalid.
 class UDFArgumentLengthException
          Exception thrown when a UDF receives the wrong number of arguments.
 class UDFArgumentTypeException
          Exception thrown when UDF arguments have the wrong types.
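All of these resolver exceptions extend UDFArgumentException, which is itself a HiveException, so callers can handle every argument error through the common parent. The following is a minimal, self-contained sketch using stand-in classes (not the real Hive types) to illustrate how the hierarchy is typically used; `checkArgs` is a hypothetical check in the style of a UDF's argument validation:

```java
// Stand-ins mirroring the hierarchy listed above; the real classes live in
// org.apache.hadoop.hive.ql.metadata and org.apache.hadoop.hive.ql.exec.
class HiveException extends Exception {
    HiveException(String msg) { super(msg); }
}

class UDFArgumentException extends HiveException {
    UDFArgumentException(String msg) { super(msg); }
}

class UDFArgumentLengthException extends UDFArgumentException {
    UDFArgumentLengthException(String msg) { super(msg); }
}

class UDFArgumentTypeException extends UDFArgumentException {
    UDFArgumentTypeException(String msg) { super(msg); }
}

public class ArgCheckDemo {
    // Hypothetical argument check in the style of a UDF resolver.
    static void checkArgs(Object[] args) throws UDFArgumentException {
        if (args.length != 1) {
            throw new UDFArgumentLengthException("expected 1 argument, got " + args.length);
        }
        if (!(args[0] instanceof String)) {
            throw new UDFArgumentTypeException("argument 0 must be a string");
        }
    }

    public static void main(String[] argv) {
        try {
            checkArgs(new Object[0]);
        } catch (UDFArgumentException e) { // catches both length and type errors
            System.out.println("argument error: " + e.getMessage());
        }
        // prints: argument error: expected 1 argument, got 0
    }
}
```

Catching the parent `UDFArgumentException` (or `HiveException`) is usually enough unless the caller wants to report length and type problems differently.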
 

Methods in org.apache.hadoop.hive.ql.exec that throw HiveException
protected  Object ExprNodeNullEvaluator._evaluate(Object row, int version)
           
protected  Object ExprNodeGenericFuncEvaluator._evaluate(Object row, int version)
           
protected  Object ExprNodeFieldEvaluator._evaluate(Object row, int version)
           
protected  Object ExprNodeEvaluatorRef._evaluate(Object row, int version)
           
protected  Object ExprNodeEvaluatorHead._evaluate(Object row, int version)
           
protected abstract  Object ExprNodeEvaluator._evaluate(Object row, int version)
          Evaluate value
protected  Object ExprNodeConstantEvaluator._evaluate(Object row, int version)
           
protected  Object ExprNodeColumnEvaluator._evaluate(Object row, int version)
           
 void FileSinkOperator.FSPaths.abortWriters(org.apache.hadoop.fs.FileSystem fs, boolean abort, boolean delete)
           
static void FunctionTask.addFunctionResources(List<ResourceUri> resources)
           
static URI ArchiveUtils.addSlash(URI u)
          Ensures that the URI points to a directory by appending a slash to it.
 void PTFPartition.append(Object o)
           
protected  void CommonJoinOperator.checkAndGenObject()
           
 void Operator.cleanUpInputFileChanged()
           
 void TableScanOperator.cleanUpInputFileChangedOp()
           
 void SMBMapJoinOperator.cleanUpInputFileChangedOp()
           
 void Operator.cleanUpInputFileChangedOp()
           
 void MapOperator.cleanUpInputFileChangedOp()
           
 void MapJoinOperator.cleanUpInputFileChangedOp()
           
 void FetchTask.clearFetch()
          Clear the Fetch Operator.
 void FetchOperator.clearFetchContext()
          Clear the context, if anything needs to be done.
 void SkewJoinHandler.close(boolean abort)
           
 void ScriptOperator.close(boolean abort)
           
 void Operator.close(boolean abort)
           
protected  void UDTFOperator.closeOp(boolean abort)
           
 void TableScanOperator.closeOp(boolean abort)
           
 void SMBMapJoinOperator.closeOp(boolean abort)
           
protected  void ReduceSinkOperator.closeOp(boolean abort)
           
protected  void PTFOperator.closeOp(boolean abort)
           
protected  void Operator.closeOp(boolean abort)
          Operator specific close routine.
protected  void MuxOperator.closeOp(boolean abort)
           
 void MapOperator.closeOp(boolean abort)
          close extra child operators that are initialized but are not executed.
 void MapJoinOperator.closeOp(boolean abort)
           
 void LimitOperator.closeOp(boolean abort)
           
 void JoinOperator.closeOp(boolean abort)
          All done.
 void HashTableSinkOperator.closeOp(boolean abort)
           
 void HashTableDummyOperator.closeOp(boolean abort)
           
 void GroupByOperator.closeOp(boolean abort)
          We need to forward all the aggregations to children.
 void FileSinkOperator.closeOp(boolean abort)
           
protected  void DemuxOperator.closeOp(boolean abort)
           
 void CommonJoinOperator.closeOp(boolean abort)
          All done.
 void AbstractMapJoinOperator.closeOp(boolean abort)
           
 void FileSinkOperator.FSPaths.closeWriters(boolean abort)
           
 Integer ExprNodeGenericFuncEvaluator.compare(Object row)
          If the genericUDF is a base comparison, it returns an integer based on the result of comparing the two sides of the UDF, like the compareTo method in Comparable.
static ArrayList<Object> JoinUtil.computeKeys(Object row, List<ExprNodeEvaluator> keyFields, List<ObjectInspector> keyFieldsOI)
          Return the key as a standard object.
protected  MapJoinKey MapJoinOperator.computeMapJoinKey(Object row, byte alias)
           
static Object[] JoinUtil.computeMapJoinValues(Object row, List<ExprNodeEvaluator> valueFields, List<ObjectInspector> valueFieldsOI, List<ExprNodeEvaluator> filters, List<ObjectInspector> filtersOI, int[] filterMap)
          Return the value as a standard object.
static List<Object> JoinUtil.computeValues(Object row, List<ExprNodeEvaluator> valueFields, List<ObjectInspector> valueFieldsOI, boolean hasFilter)
          Return the value as a standard object.
static String ArchiveUtils.conflictingArchiveNameOrNull(Hive db, Table tbl, LinkedHashMap<String,String> partSpec)
          Determines if one can insert into partition(s), or there's a conflict with archive.
static void PTFOperator.connectLeadLagFunctionsToPartition(PTFDesc ptfDesc, PTFPartition.PTFPartitionIterator<Object> pItr)
           
static PTFPartition PTFPartition.create(HiveConf cfg, SerDe serDe, StructObjectInspector inputOI, StructObjectInspector outputOI)
           
static ArchiveUtils.PartSpecInfo ArchiveUtils.PartSpecInfo.create(Table tbl, Map<String,String> partSpec)
          Extracts the partial prefix specification from a table and a key-value map.
protected  void FileSinkOperator.createBucketFiles(FileSinkOperator.FSPaths fsp)
           
protected  void FileSinkOperator.createBucketForFileIdx(FileSinkOperator.FSPaths fsp, int filesIdx)
           
 PTFPartition PTFOperator.createFirstPartitionForChain(ObjectInspector oi, HiveConf hiveConf, boolean isMapSide)
          Create a new Partition.
 org.apache.hadoop.fs.Path ArchiveUtils.PartSpecInfo.createPath(Table tbl)
          Creates the path in the filesystem where partitions matching the prefix should reside.
protected  void Operator.defaultEndGroup()
           
protected  void Operator.defaultStartGroup()
           
 void Operator.endGroup()
           
 void MuxOperator.endGroup()
           
 void MapJoinOperator.endGroup()
           
 void JoinOperator.endGroup()
          Forward a record of join results.
 void GroupByOperator.endGroup()
           
 void DemuxOperator.endGroup()
           
 void CommonJoinOperator.endGroup()
          Forward a record of join results.
 Object ExprNodeEvaluator.evaluate(Object row)
           
protected  Object ExprNodeEvaluator.evaluate(Object row, int version)
          Evaluate the expression given the row.
 void TopNHash.flush()
          Flushes all the rows cached in the heap.
 void Operator.flush()
           
 void GroupByOperator.flush()
          Forward all aggregations to children.
protected  void TemporaryHashSinkOperator.flushToFile()
           
protected  void HashTableSinkOperator.flushToFile()
           
protected  void GroupByOperator.forward(Object[] keys, GenericUDAFEvaluator.AggregationBuffer[] aggs)
          Forward a record of keys and aggregation results.
protected  void Operator.forward(Object row, ObjectInspector rowInspector)
           
 void MuxOperator.forward(Object row, ObjectInspector rowInspector)
           
 void DemuxOperator.forward(Object row, ObjectInspector rowInspector)
           
 void UDTFOperator.forwardUDTFOutput(Object o)
          forwardUDTFOutput is typically called indirectly by the GenericUDTF when the GenericUDTF has generated output rows that should be passed on to the next operator(s) in the DAG.
 void MapJoinOperator.generateMapMetaData()
           
static ExprNodeEvaluator ExprNodeEvaluatorFactory.get(ExprNodeDesc desc)
           
static int ArchiveUtils.getArchivingLevel(Partition p)
          Returns the archiving level, i.e. how many fields were set in the partial specification that ARCHIVE was run for.
 Object PTFPartition.getAt(int i)
           
static String[] Utilities.getDbTableName(String dbtable)
          Extracts the db and table name from a dbtable string, where db and table are separated by ".". If there is no db name part, the current session's default db is used.
protected  FileSinkOperator.FSPaths FileSinkOperator.getDynOutPaths(List<String> row, String lbDirName)
           
protected  List<Object> CommonJoinOperator.getFilteredValue(byte alias, Object row)
           
static List<LinkedHashMap<String,String>> Utilities.getFullDPSpecs(org.apache.hadoop.conf.Configuration conf, DynamicPartitionCtx dpCtx)
          Construct a list of full partition spec from Dynamic Partition Context and the directory names corresponding to these dynamic partitions.
 URI ArchiveUtils.HarPathHelper.getHarUri(URI original, org.apache.hadoop.hive.shims.HadoopShims shim)
           
static Hive FunctionRegistry.getHive()
           
 String ArchiveUtils.PartSpecInfo.getName()
          Generates name for prefix partial partition specification.
abstract  void KeyWrapper.getNewKey(Object row, ObjectInspector rowInspector)
           
static List<ObjectInspector>[] JoinUtil.getObjectInspectorsFromEvaluators(List<ExprNodeEvaluator>[] exprEntries, ObjectInspector[] inputObjInspector, int posBigTableAlias, int tagLen)
           
 ObjectInspector FetchOperator.getOutputObjectInspector()
          returns output ObjectInspector, never null
static String ArchiveUtils.getPartialName(Partition p, int level)
          Gets a prefix of the given partition's string representation.
static PartitionDesc Utilities.getPartitionDesc(Partition part)
           
static PartitionDesc Utilities.getPartitionDescFromTableDesc(TableDesc tblDesc, Partition part)
           
static String[] FunctionUtils.getQualifiedFunctionNameParts(String name)
           
static RowContainer<List<Object>> JoinUtil.getRowContainer(org.apache.hadoop.conf.Configuration hconf, List<ObjectInspector> structFieldObjectInspectors, Byte alias, int containerSize, TableDesc[] spillTableDesc, JoinDesc conf, boolean noFilter, org.apache.hadoop.mapred.Reporter reporter)
           
static
<T extends OperatorDesc>
Operator<T>
OperatorFactory.getVectorOperator(T conf, VectorizationContext vContext)
           
 void SkewJoinHandler.handleSkew(int tag)
           
protected static ObjectInspector[] Operator.initEvaluators(ExprNodeEvaluator[] evals, int start, int length, ObjectInspector rowInspector)
          Initialize an array of ExprNodeEvaluator from start, for specified length and return the result ObjectInspectors.
protected static ObjectInspector[] Operator.initEvaluators(ExprNodeEvaluator[] evals, ObjectInspector rowInspector)
          Initialize an array of ExprNodeEvaluator and return the result ObjectInspectors.
protected static StructObjectInspector ReduceSinkOperator.initEvaluatorsAndReturnStruct(ExprNodeEvaluator[] evals, List<List<Integer>> distinctColIndices, List<String> outputColNames, int length, ObjectInspector rowInspector)
          Initializes array of ExprNodeEvaluator.
protected static StructObjectInspector Operator.initEvaluatorsAndReturnStruct(ExprNodeEvaluator[] evals, List<String> outputColName, ObjectInspector rowInspector)
          Initialize an array of ExprNodeEvaluator and put the return values into a StructObjectInspector with integer field names.
 void Operator.initialize(org.apache.hadoop.conf.Configuration hconf, ObjectInspector[] inputOIs)
          Initializes operators only if all parents have been initialized.
protected  void Operator.initialize(org.apache.hadoop.conf.Configuration hconf, ObjectInspector inputOI, int parentId)
          Collects all the parent's output object inspectors and calls actual initialization method.
 void DefaultFetchFormatter.initialize(org.apache.hadoop.conf.Configuration hconf, Properties props)
           
 ObjectInspector ExprNodeNullEvaluator.initialize(ObjectInspector rowInspector)
           
 ObjectInspector ExprNodeGenericFuncEvaluator.initialize(ObjectInspector rowInspector)
           
 ObjectInspector ExprNodeFieldEvaluator.initialize(ObjectInspector rowInspector)
           
 ObjectInspector ExprNodeEvaluatorRef.initialize(ObjectInspector rowInspector)
           
 ObjectInspector ExprNodeEvaluatorHead.initialize(ObjectInspector rowInspector)
           
abstract  ObjectInspector ExprNodeEvaluator.initialize(ObjectInspector rowInspector)
          Initialize should be called once and only once.
 ObjectInspector ExprNodeConstantEvaluator.initialize(ObjectInspector rowInspector)
           
 ObjectInspector ExprNodeColumnEvaluator.initialize(ObjectInspector rowInspector)
           
 void MapOperator.initializeAsRoot(org.apache.hadoop.conf.Configuration hconf, MapWork mapWork)
          Initializes this map op as the root of the tree.
protected  void Operator.initializeChildren(org.apache.hadoop.conf.Configuration hconf)
          Calls initialize on each of the children with outputObjectInspector as the output row format.
protected  void MuxOperator.initializeChildren(org.apache.hadoop.conf.Configuration hconf)
          Calls initialize on each of the children with outputObjectInspector as the output row format.
protected  void DemuxOperator.initializeChildren(org.apache.hadoop.conf.Configuration hconf)
           
 void SMBMapJoinOperator.initializeLocalWork(org.apache.hadoop.conf.Configuration hconf)
           
 void Operator.initializeLocalWork(org.apache.hadoop.conf.Configuration hconf)
           
 void SMBMapJoinOperator.initializeMapredLocalWork(MapJoinDesc mjConf, org.apache.hadoop.conf.Configuration hconf, MapredLocalWork localWork, org.apache.commons.logging.Log l4j)
           
protected  void UnionOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
          UnionOperator will transform the input rows if the inputObjInspectors from different parents are different.
protected  void UDTFOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void TableScanOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void SMBMapJoinOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void SelectOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void ScriptOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void ReduceSinkOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void PTFOperator.initializeOp(org.apache.hadoop.conf.Configuration jobConf)
           
protected  void Operator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
          Operator specific initialization.
protected  void MuxOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
 void MapOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void MapJoinOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void ListSinkOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void LimitOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void LateralViewJoinOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void JoinOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void HashTableSinkOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void HashTableDummyOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void GroupByOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void FilterOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void FileSinkOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void ExtractOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void DummyStoreOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void DemuxOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void CommonJoinOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void CollectOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void AbstractMapJoinOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void CommonJoinOperator.internalForward(Object row, ObjectInspector outputOI)
           
static Object FunctionRegistry.invoke(Method m, Object thisObject, Object... arguments)
           
protected static short JoinUtil.isFiltered(Object row, List<ExprNodeEvaluator> filters, List<ObjectInspector> ois, int[] filterMap)
          Returns true if the row does not pass through filters.
 PTFPartition.PTFPartitionIterator<Object> PTFPartition.iterator()
           
 void Operator.jobClose(org.apache.hadoop.conf.Configuration conf, boolean success)
          Unlike other operator interfaces which are called from map or reduce task, jobClose is called from the jobclient side once the job has completed.
 void Operator.jobCloseOp(org.apache.hadoop.conf.Configuration conf, boolean success)
           
 void JoinOperator.jobCloseOp(org.apache.hadoop.conf.Configuration hconf, boolean success)
           
 void FileSinkOperator.jobCloseOp(org.apache.hadoop.conf.Configuration hconf, boolean success)
           
 T PTFPartition.PTFPartitionIterator.lag(int amt)
           
 T PTFPartition.PTFPartitionIterator.lead(int amt)
           
 void HashTableLoader.load(MapJoinTableContainer[] mapJoinTables, MapJoinTableContainerSerDe[] mapJoinTableSerdes)
           
protected  FileSinkOperator.FSPaths FileSinkOperator.lookupListBucketingPaths(String lbDirName)
          Lookup list bucketing path.
static void Utilities.mvFileToFinalPath(org.apache.hadoop.fs.Path specPath, org.apache.hadoop.conf.Configuration hconf, boolean success, org.apache.commons.logging.Log log, DynamicPartitionCtx dpCtx, FileSinkDesc conf, org.apache.hadoop.mapred.Reporter reporter)
           
protected  GenericUDAFEvaluator.AggregationBuffer[] GroupByOperator.newAggregations()
           
static int JoinUtil.populateJoinKeyValue(List<ExprNodeEvaluator>[] outMap, Map<Byte,List<ExprNodeDesc>> inputMap, Byte[] order, int posBigTableAlias)
           
static int JoinUtil.populateJoinKeyValue(List<ExprNodeEvaluator>[] outMap, Map<Byte,List<ExprNodeDesc>> inputMap, int posBigTableAlias)
           
 Object MuxOperator.Handler.process(Object row)
           
 void MapOperator.process(org.apache.hadoop.io.Writable value)
           
 void Operator.processGroup(int tag)
           
 void MuxOperator.processGroup(int tag)
           
protected  void PTFOperator.processInputPartition()
           
protected  void PTFOperator.processMapFunction()
           
 void UnionOperator.processOp(Object row, int tag)
           
 void UDTFOperator.processOp(Object row, int tag)
           
 void TableScanOperator.processOp(Object row, int tag)
          Other than gathering statistics for the ANALYZE command, the table scan operator simply forwards the row.
 void SMBMapJoinOperator.processOp(Object row, int tag)
           
 void SelectOperator.processOp(Object row, int tag)
           
 void ScriptOperator.processOp(Object row, int tag)
           
 void ReduceSinkOperator.processOp(Object row, int tag)
           
 void PTFOperator.processOp(Object row, int tag)
           
abstract  void Operator.processOp(Object row, int tag)
          Process the row.
 void MuxOperator.processOp(Object row, int tag)
           
 void MapOperator.processOp(Object row, int tag)
           
 void MapJoinOperator.processOp(Object row, int tag)
           
 void ListSinkOperator.processOp(Object row, int tag)
           
 void LimitOperator.processOp(Object row, int tag)
           
 void LateralViewJoinOperator.processOp(Object row, int tag)
          An important assumption for processOp() is that for a given row from the TS, the LVJ will first get the row from the left select operator, followed by all the corresponding rows from the UDTF operator.
 void LateralViewForwardOperator.processOp(Object row, int tag)
           
 void JoinOperator.processOp(Object row, int tag)
           
 void HashTableSinkOperator.processOp(Object row, int tag)
           
 void HashTableDummyOperator.processOp(Object row, int tag)
           
 void GroupByOperator.processOp(Object row, int tag)
           
 void ForwardOperator.processOp(Object row, int tag)
           
 void FilterOperator.processOp(Object row, int tag)
           
 void FileSinkOperator.processOp(Object row, int tag)
           
 void ExtractOperator.processOp(Object row, int tag)
           
 void DummyStoreOperator.processOp(Object row, int tag)
           
 void DemuxOperator.processOp(Object row, int tag)
           
 void CollectOperator.processOp(Object row, int tag)
           
 boolean FetchOperator.pushRow()
          Gets the next row and pushes it down to the operator tree.
protected  void FetchOperator.pushRow(InspectableObject row)
           
protected  void PTFOperator.reconstructQueryDef(HiveConf hiveConf)
          Initializes the visitor to use the QueryDefDeserializer, visiting the QueryDef in the order defined in QueryDefWalker.
static void Utilities.rename(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path src, org.apache.hadoop.fs.Path dst)
          Rename src to dst, or in the case dst already exists, move files in src to dst.
static void Utilities.renameOrMoveFiles(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path src, org.apache.hadoop.fs.Path dst)
          Rename src to dst, or in the case dst already exists, move files in src to dst.
 void PTFPartition.reset()
           
 void PTFPartition.PTFPartitionIterator.reset()
           
protected  void GroupByOperator.resetAggregations(GenericUDAFEvaluator.AggregationBuffer[] aggs)
           
 Object PTFPartition.PTFPartitionIterator.resetToIndex(int idx)
           
 void MapOperator.setChildren(org.apache.hadoop.conf.Configuration hconf)
           
protected  void PTFOperator.setupKeysWrapper(ObjectInspector inputOI)
           
 int DDLTask.showColumns(Hive db, ShowColumnsDesc showCols)
           
protected  List<Object> SMBMapJoinOperator.smbJoinComputeKeys(Object row, byte alias)
           
static String[] FunctionUtils.splitQualifiedFunctionName(String functionName)
          Splits a qualified function name into an array containing the database name and function name.
 void Operator.startGroup()
           
 void MuxOperator.startGroup()
           
 void MapJoinOperator.startGroup()
           
 void GroupByOperator.startGroup()
           
 void FileSinkOperator.startGroup()
           
 void DemuxOperator.startGroup()
           
 void CommonJoinOperator.startGroup()
           
 int TopNHash.startVectorizedBatch(int size)
          Perform basic checks and initialize TopNHash for the new vectorized row batch.
 int TopNHash.tryStoreKey(HiveKey key)
          Try store the non-vectorized key.
 void TopNHash.tryStoreVectorizedKey(HiveKey key, int batchIndex)
          Try to put the key from the current vectorized batch into the heap.
static void FunctionRegistry.unregisterTemporaryUDF(String functionName)
           
protected  void GroupByOperator.updateAggregations(GenericUDAFEvaluator.AggregationBuffer[] aggs, Object row, ObjectInspector rowInspector, boolean hashAggr, boolean newEntryForHashAggr, Object[][] lastInvoke)
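Several of the helpers in this table have behavior that is easy to illustrate standalone. For example, Utilities.getDbTableName is documented above to split a "db.table" string on "." and to fall back to the session's default database when no db part is given. The sketch below is a minimal, self-contained reimplementation of that documented behavior (not the actual Hive code); "default" stands in for the session's default database:

```java
public class DbTableNameDemo {
    // Sketch of the behavior documented for Utilities.getDbTableName:
    // split "db.table" on "."; if there is no db part, use the current
    // session's default database (passed in here as currentDb).
    static String[] getDbTableName(String dbtable, String currentDb) throws Exception {
        if (dbtable == null) {
            return new String[2];
        }
        String[] names = dbtable.split("\\.");
        switch (names.length) {
            case 1:  return new String[] { currentDb, dbtable };
            case 2:  return names;
            default: throw new Exception("invalid table name: " + dbtable);
        }
    }

    public static void main(String[] args) throws Exception {
        String[] qualified = getDbTableName("sales.orders", "default");
        System.out.println(qualified[0] + " / " + qualified[1]); // prints: sales / orders

        String[] bare = getDbTableName("orders", "default");
        System.out.println(bare[0] + " / " + bare[1]);           // prints: default / orders
    }
}
```

Note that `split` takes a regular expression, so the dot must be escaped; a string with more than one "." is rejected rather than split greedily.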
           
 

Constructors in org.apache.hadoop.hive.ql.exec that throw HiveException
ArchiveUtils.HarPathHelper(HiveConf hconf, URI archive, URI originalBase)
          Creates helper for archive.
ExprNodeFieldEvaluator(ExprNodeFieldDesc desc)
           
ExprNodeGenericFuncEvaluator(ExprNodeGenericFuncDesc expr)
           
MuxOperator.Handler(ObjectInspector inputObjInspector, List<ExprNodeDesc> keyCols, List<ExprNodeDesc> valueCols, List<String> outputKeyColumnNames, List<String> outputValueColumnNames, Integer tag)
           
PTFPartition(HiveConf cfg, SerDe serDe, StructObjectInspector inputOI, StructObjectInspector outputOI)
           
SecureCmdDoAs(HiveConf conf)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.exec.mapjoin
 

Subclasses of HiveException in org.apache.hadoop.hive.ql.exec.mapjoin
 class MapJoinMemoryExhaustionException
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.exec.mr
 

Methods in org.apache.hadoop.hive.ql.exec.mr that throw HiveException
 void HashTableLoader.load(MapJoinTableContainer[] mapJoinTables, MapJoinTableContainerSerDe[] mapJoinTableSerdes)
           
static void ExecDriver.main(String[] args)
           
 

Constructors in org.apache.hadoop.hive.ql.exec.mr that throw HiveException
ExecDriver(MapredWork plan, org.apache.hadoop.mapred.JobConf job, boolean isSilent)
          Constructor/Initialization for invocation as independent utility.
MapredLocalTask(MapredLocalWork plan, org.apache.hadoop.mapred.JobConf job, boolean isSilent)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.exec.persistence
 

Methods in org.apache.hadoop.hive.ql.exec.persistence that throw HiveException
 void LazyFlatRowContainer.add(MapJoinObjectSerDeContext context, org.apache.hadoop.io.BytesWritable value, boolean allowLazy)
          Called when loading the hashtable.
 void LazyFlatRowContainer.addRow(List<Object> t)
           
 void MapJoinRowContainer.addRow(Object[] value)
           
 void LazyFlatRowContainer.addRow(Object[] value)
           
 void PTFRowContainer.addRow(Row t)
           
 void RowContainer.addRow(ROW t)
           
 void AbstractRowContainer.addRow(ROW t)
          Adds a row to the RowContainer.
 void RowContainer.clearRows()
          Remove all elements in the RowContainer.
 void PTFRowContainer.clearRows()
           
 void AbstractRowContainer.clearRows()
          Remove all elements in the RowContainer.
protected  void RowContainer.close()
           
 void PTFRowContainer.close()
           
 MapJoinRowContainer MapJoinRowContainer.copy()
           
 MapJoinRowContainer LazyFlatRowContainer.copy()
           
 void RowContainer.copyToDFSDirecory(org.apache.hadoop.fs.FileSystem destFs, org.apache.hadoop.fs.Path destPath)
           
 ROW RowContainer.first()
           
 Row PTFRowContainer.first()
           
 List<Object> LazyFlatRowContainer.first()
           
 ROW AbstractRowContainer.RowIterator.first()
           
 byte MapJoinRowContainer.getAliasFilter()
           
 byte LazyFlatRowContainer.getAliasFilter()
           
 Row PTFRowContainer.getAt(int rowIdx)
           
 MapJoinTableContainer MapJoinTableContainerSerDe.load(ObjectInputStream in)
           
 ROW RowContainer.next()
           
 Row PTFRowContainer.next()
           
 ROW AbstractRowContainer.RowIterator.next()
           
protected  boolean RowContainer.nextBlock(int readIntoOffset)
           
 void MapJoinTableContainerSerDe.persist(ObjectOutputStream out, MapJoinTableContainer tableContainer)
           
static MapJoinKey MapJoinKey.readFromRow(ByteStream.Output output, MapJoinKey key, Object row, List<ExprNodeEvaluator> fields, List<ObjectInspector> keyFieldsOI, boolean mayReuseKey)
           
protected  void MapJoinKeyObject.readFromRow(Object[] fieldObjs, List<ObjectInspector> keyFieldsOI)
           
static MapJoinKey MapJoinKey.readFromVector(ByteStream.Output output, MapJoinKey key, VectorHashKeyWrapper kw, VectorExpressionWriter[] keyOutputWriters, VectorHashKeyWrapperBatch keyWrapperBatch, boolean mayReuseKey)
           
 void MapJoinKeyObject.readFromVector(VectorHashKeyWrapper kw, VectorExpressionWriter[] keyOutputWriters, VectorHashKeyWrapperBatch keyWrapperBatch)
           
 int LazyFlatRowContainer.rowCount()
           
 int AbstractRowContainer.rowCount()
           
 AbstractRowContainer.RowIterator<List<Object>> LazyFlatRowContainer.rowIter()
           
 AbstractRowContainer.RowIterator<ROW> AbstractRowContainer.rowIter()
           
protected  void RowContainer.setupWriter()
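The AbstractRowContainer methods above describe a simple protocol: fill the container with addRow, iterate with first()/next(), and reset with clearRows. The sketch below is a minimal in-memory stand-in for that protocol (not the real RowContainer, which also spills row blocks to disk); next() returning null past the last row is an assumption based on typical usage, not a documented guarantee:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal in-memory stand-in for the AbstractRowContainer protocol:
// addRow() to fill, first()/next() to iterate, clearRows() to reset.
class SimpleRowContainer<ROW> {
    private final List<ROW> rows = new ArrayList<>();
    private int cursor;

    void addRow(ROW t) { rows.add(t); }
    int rowCount() { return rows.size(); }

    // Resets the cursor and returns the first row (null if empty).
    ROW first() { cursor = 0; return next(); }

    // Returns the next row, or null once the container is exhausted.
    ROW next() { return cursor < rows.size() ? rows.get(cursor++) : null; }

    void clearRows() { rows.clear(); cursor = 0; }
}

public class RowContainerDemo {
    public static void main(String[] args) {
        SimpleRowContainer<List<Object>> rc = new SimpleRowContainer<>();
        rc.addRow(List.of((Object) "a", 1));
        rc.addRow(List.of((Object) "b", 2));

        int seen = 0;
        for (List<Object> row = rc.first(); row != null; row = rc.next()) {
            seen++;
        }
        System.out.println(seen);          // prints: 2
        rc.clearRows();
        System.out.println(rc.rowCount()); // prints: 0
    }
}
```

The first()/next() loop shown in main mirrors how join operators in this package drain a row container for one key group before moving to the next.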
           
 

Constructors in org.apache.hadoop.hive.ql.exec.persistence that throw HiveException
PTFRowContainer(int bs, org.apache.hadoop.conf.Configuration jc, org.apache.hadoop.mapred.Reporter reporter)
           
RowContainer(org.apache.hadoop.conf.Configuration jc, org.apache.hadoop.mapred.Reporter reporter)
           
RowContainer(int bs, org.apache.hadoop.conf.Configuration jc, org.apache.hadoop.mapred.Reporter reporter)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.exec.tez
 

Methods in org.apache.hadoop.hive.ql.exec.tez that throw HiveException
 void HashTableLoader.load(MapJoinTableContainer[] mapJoinTables, MapJoinTableContainerSerDe[] mapJoinTableSerdes)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.exec.vector
 

Methods in org.apache.hadoop.hive.ql.exec.vector that throw HiveException
 void VectorizedRowBatchCtx.addPartitionColsToBatch(VectorizedRowBatch batch)
          Add the partition values to the batch
 void VectorizedRowBatchCtx.addRowToBatch(int rowIndex, org.apache.hadoop.io.Writable rowBlob, VectorizedRowBatch batch, org.apache.hadoop.io.DataOutputBuffer buffer)
          Adds the row to the batch after deserializing the row
static void VectorizedBatchUtil.addRowToBatch(Object row, StructObjectInspector oi, int rowIndex, VectorizedRowBatch batch, org.apache.hadoop.io.DataOutputBuffer buffer)
          Iterates through all the columns in a given row and populates the batch.
 void VectorizationContext.addToColumnMap(String columnName, int outputColumn)
           
 T VectorUtilBatchObjectPool.IAllocator.alloc()
           
 void VectorColumnAssign.assignObjectValue(Object val, int destIndex)
           
 void VectorColumnAssign.assignVectorValue(VectorizedRowBatch inBatch, int batchIndex, int valueColumn, int destIndex)
           
static VectorColumnAssign[] VectorColumnAssignFactory.buildAssigners(VectorizedRowBatch outputBatch)
           
static VectorColumnAssign[] VectorColumnAssignFactory.buildAssigners(VectorizedRowBatch outputBatch, ObjectInspector outputOI, Map<String,Integer> columnMap, List<String> outputColumnNames)
          Builds the assigners from an object inspector and from a list of columns.
static VectorColumnAssign VectorColumnAssignFactory.buildObjectAssign(VectorizedRowBatch outputBatch, int outColIndex, ObjectInspector objInspector)
           
 void VectorSMBMapJoinOperator.closeOp(boolean aborted)
           
 void VectorMapJoinOperator.closeOp(boolean aborted)
           
 void VectorGroupByOperator.closeOp(boolean aborted)
           
static VectorHashKeyWrapperBatch VectorHashKeyWrapperBatch.compileKeyWrapperBatch(VectorExpression[] keyExpressions)
          Prepares a VectorHashKeyWrapperBatch to work for a specific set of keys.
protected  MapJoinKey VectorMapJoinOperator.computeMapJoinKey(Object row, byte alias)
           
 VectorizedRowBatch VectorizedRowBatchCtx.createVectorizedRowBatch()
          Creates a Vectorized row batch and the column vectors.
 void VectorHashKeyWrapperBatch.evaluateBatch(VectorizedRowBatch batch)
          Processes a batch: evaluates each key vector expression, copies out each key's primitive values into the key wrappers, and computes the hash code of each key wrapper.
 VectorAggregateExpression VectorizationContext.getAggregatorExpression(AggregationDesc desc)
           
 T VectorUtilBatchObjectPool.getFromPool()
           
 void VectorHashKeyWrapper.getNewKey(Object row, ObjectInspector rowInspector)
           
 VectorExpression VectorizationContext.getVectorExpression(ExprNodeDesc exprDesc)
           
 VectorExpression VectorizationContext.getVectorExpression(ExprNodeDesc exprDesc, VectorExpressionDescriptor.Mode mode)
          Returns a vector expression for a given expression description.
 Class<?> VectorExpressionDescriptor.getVectorExpressionClass(Class<?> udf, VectorExpressionDescriptor.Descriptor descriptor)
           
 VectorExpression[] VectorizationContext.getVectorExpressions(List<ExprNodeDesc> exprNodes)
           
 VectorExpression[] VectorizationContext.getVectorExpressions(List<ExprNodeDesc> exprNodes, VectorExpressionDescriptor.Mode mode)
           
 Object VectorHashKeyWrapperBatch.getWritableKeyValue(VectorHashKeyWrapper kw, int i, VectorExpressionWriter keyOutputWriter)
          Gets the row-mode writable object value of a key from a key wrapper.
 void VectorizedRowBatchCtx.init(org.apache.hadoop.conf.Configuration hiveConf, org.apache.hadoop.mapred.FileSplit split)
          Initializes the VectorizedRowBatch context based on the split and the Hive configuration (job conf with the Hive plan).
protected  void VectorSMBMapJoinOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void VectorSelectOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void VectorReduceSinkOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
 void VectorMapJoinOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void VectorGroupByOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void VectorFilterOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void VectorFileSinkOperator.initializeOp(org.apache.hadoop.conf.Configuration hconf)
           
protected  void VectorSMBMapJoinOperator.internalForward(Object row, ObjectInspector outputOI)
           
protected  void VectorMapJoinOperator.internalForward(Object row, ObjectInspector outputOI)
          Forwards the (row-mode) record into the (vectorized) output batch.
 void VectorMapOperator.process(org.apache.hadoop.io.Writable value)
           
 void VectorSMBMapJoinOperator.processOp(Object row, int tag)
           
 void VectorSelectOperator.processOp(Object row, int tag)
           
 void VectorReduceSinkOperator.processOp(Object row, int tag)
           
 void VectorMapJoinOperator.processOp(Object row, int tag)
           
 void VectorLimitOperator.processOp(Object row, int tag)
           
 void VectorGroupByOperator.processOp(Object row, int tag)
           
 void VectorFilterOperator.processOp(Object row, int tag)
           
 void VectorFileSinkOperator.processOp(Object data, int tag)
           
protected  List<Object> VectorSMBMapJoinOperator.smbJoinComputeKeys(Object row, byte alias)
           
 

Constructors in org.apache.hadoop.hive.ql.exec.vector that throw HiveException
VectorFilterOperator(VectorizationContext vContext, OperatorDesc conf)
           
VectorGroupByOperator(VectorizationContext vContext, OperatorDesc conf)
           
VectorMapJoinOperator(VectorizationContext vContext, OperatorDesc conf)
           
VectorReduceSinkOperator(VectorizationContext vContext, OperatorDesc conf)
           
VectorSelectOperator(VectorizationContext vContext, OperatorDesc conf)
           
VectorSMBMapJoinOperator(VectorizationContext vContext, OperatorDesc conf)
           
 

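The VectorizedRowBatchCtx methods above form a lifecycle: initialize from a split, create a batch, fill rows, then add the partition columns. A minimal sketch under the assumption of a Hive/Hadoop classpath (hiveConf, fileSplit, and rowBlobs are illustrative names, and the public size field on VectorizedRowBatch is assumed):

```java
// Sketch only; init(), createVectorizedRowBatch(), addRowToBatch() and
// addPartitionColsToBatch() all throw HiveException.
VectorizedRowBatchCtx ctx = new VectorizedRowBatchCtx();
ctx.init(hiveConf, fileSplit);
VectorizedRowBatch batch = ctx.createVectorizedRowBatch();
org.apache.hadoop.io.DataOutputBuffer buffer = new org.apache.hadoop.io.DataOutputBuffer();
for (int i = 0; i < rowBlobs.size(); i++) {
  ctx.addRowToBatch(i, rowBlobs.get(i), batch, buffer);  // deserializes and appends the row
}
ctx.addPartitionColsToBatch(batch);  // fills the partition-column vectors
batch.size = rowBlobs.size();        // size field assumed public, as in VectorizedRowBatch
```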
Uses of HiveException in org.apache.hadoop.hive.ql.exec.vector.expressions
 

Methods in org.apache.hadoop.hive.ql.exec.vector.expressions that throw HiveException
static VectorExpressionWriter VectorExpressionWriterFactory.genVectorExpressionWritable(ExprNodeDesc nodeDesc)
          Compiles the appropriate vector expression writer based on an expression description (ExprNodeDesc).
static VectorExpressionWriter VectorExpressionWriterFactory.genVectorExpressionWritable(ObjectInspector fieldObjInspector)
          Compiles the appropriate vector expression writer based on an object inspector.
static VectorExpressionWriter[] VectorExpressionWriterFactory.getExpressionWriters(List<ExprNodeDesc> nodesDesc)
          Helper function to create an array of writers from a list of expression descriptors.
static VectorExpressionWriter[] VectorExpressionWriterFactory.getExpressionWriters(StructObjectInspector objInspector)
          Returns VectorExpressionWriter objects for the fields in the given object inspector.
static VectorExpressionWriter[] VectorExpressionWriterFactory.getSettableExpressionWriters(SettableStructObjectInspector objInspector)
           
 Object VectorExpressionWriter.initValue(Object ost)
           
static void VectorExpressionWriterFactory.processVectorExpressions(List<ExprNodeDesc> nodesDesc, List<String> columnNames, VectorExpressionWriterFactory.SingleOIDClosure closure)
          Creates the value writers for a column vector expression list.
static void VectorExpressionWriterFactory.processVectorExpressions(List<ExprNodeDesc> nodesDesc, VectorExpressionWriterFactory.ListOIDClosure closure)
          Creates the value writers for a column vector expression list.
 Object VectorExpressionWriter.setValue(Object row, ColumnVector column, int columnRow)
           
 Object VectorExpressionWriter.writeValue(byte[] value, int start, int length)
           
 Object VectorExpressionWriter.writeValue(ColumnVector column, int row)
           
 Object VectorExpressionWriter.writeValue(Decimal128 value)
           
 Object VectorExpressionWriter.writeValue(double value)
           
 Object VectorExpressionWriter.writeValue(long value)
           
 

Constructors in org.apache.hadoop.hive.ql.exec.vector.expressions that throw HiveException
FilterStringColLikeStringScalar(int colNum, byte[] likePattern)
           
FilterStringColRegExpStringScalar(int colNum, byte[] regExpPattern)
           
 

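The writer factory and writeValue overloads above convert column-vector values back into row-mode writable objects. A hypothetical sketch (hive-exec classpath assumed; rowInspector, batch, and rowIndex are illustrative, and the public numCols/cols fields on VectorizedRowBatch are assumptions drawn from the wider API):

```java
// Sketch only: reconstructs one row of a VectorizedRowBatch as writable objects.
VectorExpressionWriter[] writers =
    VectorExpressionWriterFactory.getExpressionWriters(rowInspector);  // throws HiveException
Object[] row = new Object[batch.numCols];
for (int c = 0; c < batch.numCols; c++) {
  row[c] = writers[c].writeValue(batch.cols[c], rowIndex);             // throws HiveException
}
```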
Uses of HiveException in org.apache.hadoop.hive.ql.exec.vector.expressions.aggregates
 

Methods in org.apache.hadoop.hive.ql.exec.vector.expressions.aggregates that throw HiveException
 void VectorUDAFSumDecimal.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFCountStar.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFCount.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFAvgDecimal.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
abstract  void VectorAggregateExpression.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch unit)
           
 void VectorUDAFSumDecimal.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFCountStar.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFCount.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFAvgDecimal.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int bufferIndex, VectorizedRowBatch batch)
           
abstract  void VectorAggregateExpression.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch vrg)
           
 Object VectorUDAFSumDecimal.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFCountStar.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFCount.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFAvgDecimal.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
abstract  Object VectorAggregateExpression.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFSumDecimal.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFCountStar.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFCount.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFAvgDecimal.getNewAggregationBuffer()
           
abstract  VectorAggregateExpression.AggregationBuffer VectorAggregateExpression.getNewAggregationBuffer()
           
 void VectorUDAFSumDecimal.init(AggregationDesc desc)
           
 void VectorUDAFCountStar.init(AggregationDesc desc)
           
 void VectorUDAFCount.init(AggregationDesc desc)
           
 void VectorUDAFAvgDecimal.init(AggregationDesc desc)
           
abstract  void VectorAggregateExpression.init(AggregationDesc desc)
           
 void VectorUDAFSumDecimal.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFCountStar.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFCount.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFAvgDecimal.reset(VectorAggregateExpression.AggregationBuffer agg)
           
abstract  void VectorAggregateExpression.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 

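The abstract methods of VectorAggregateExpression listed above define a shared lifecycle: init, allocate a buffer, reset it, aggregate batches into it, then evaluate the output. A hedged sketch of that sequence for a ungrouped aggregation (the VectorUDAFCount constructor argument is an assumption; aggregationDesc and batch are illustrative):

```java
// Sketch only; every call below throws HiveException.
VectorAggregateExpression agg = new VectorUDAFCount(inputExpression);  // ctor arg assumed
agg.init(aggregationDesc);
VectorAggregateExpression.AggregationBuffer buf = agg.getNewAggregationBuffer();
agg.reset(buf);
agg.aggregateInput(buf, batch);   // one buffer for the whole batch (no GROUP BY keys);
                                  // with keys, use aggregateInputSelection() instead
Object result = agg.evaluateOutput(buf);
```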
Uses of HiveException in org.apache.hadoop.hive.ql.exec.vector.expressions.aggregates.gen
 

Methods in org.apache.hadoop.hive.ql.exec.vector.expressions.aggregates.gen that throw HiveException
 void VectorUDAFVarSampLong.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFVarSampDouble.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFVarSampDecimal.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFVarPopLong.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFVarPopDouble.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFVarPopDecimal.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFSumLong.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFSumDouble.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFStdSampLong.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFStdSampDouble.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFStdSampDecimal.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFStdPopLong.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFStdPopDouble.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFStdPopDecimal.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFMinString.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFMinLong.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFMinDouble.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFMinDecimal.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFMaxString.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFMaxLong.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFMaxDouble.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFMaxDecimal.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFAvgLong.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFAvgDouble.aggregateInput(VectorAggregateExpression.AggregationBuffer agg, VectorizedRowBatch batch)
           
 void VectorUDAFVarSampLong.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFVarSampDouble.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFVarSampDecimal.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFVarPopLong.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFVarPopDouble.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFVarPopDecimal.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFSumLong.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFSumDouble.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFStdSampLong.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFStdSampDouble.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFStdSampDecimal.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFStdPopLong.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFStdPopDouble.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFStdPopDecimal.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFMinString.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregrateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFMinLong.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregrateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFMinDouble.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregrateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFMinDecimal.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregrateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFMaxString.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregrateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFMaxLong.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregrateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFMaxDouble.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregrateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFMaxDecimal.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int aggregrateIndex, VectorizedRowBatch batch)
           
 void VectorUDAFAvgLong.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int bufferIndex, VectorizedRowBatch batch)
           
 void VectorUDAFAvgDouble.aggregateInputSelection(VectorAggregationBufferRow[] aggregationBufferSets, int bufferIndex, VectorizedRowBatch batch)
           
 Object VectorUDAFVarSampLong.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFVarSampDouble.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFVarSampDecimal.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFVarPopLong.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFVarPopDouble.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFVarPopDecimal.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFSumLong.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFSumDouble.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFStdSampLong.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFStdSampDouble.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFStdSampDecimal.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFStdPopLong.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFStdPopDouble.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFStdPopDecimal.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFMinString.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFMinLong.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFMinDouble.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFMinDecimal.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFMaxString.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFMaxLong.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFMaxDouble.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFMaxDecimal.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFAvgLong.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 Object VectorUDAFAvgDouble.evaluateOutput(VectorAggregateExpression.AggregationBuffer agg)
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFVarSampLong.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFVarSampDouble.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFVarSampDecimal.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFVarPopLong.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFVarPopDouble.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFVarPopDecimal.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFSumLong.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFSumDouble.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFStdSampLong.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFStdSampDouble.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFStdSampDecimal.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFStdPopLong.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFStdPopDouble.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFStdPopDecimal.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFMinString.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFMinLong.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFMinDouble.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFMinDecimal.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFMaxString.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFMaxLong.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFMaxDouble.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFMaxDecimal.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFAvgLong.getNewAggregationBuffer()
           
 VectorAggregateExpression.AggregationBuffer VectorUDAFAvgDouble.getNewAggregationBuffer()
           
 void VectorUDAFVarSampLong.init(AggregationDesc desc)
           
 void VectorUDAFVarSampDouble.init(AggregationDesc desc)
           
 void VectorUDAFVarSampDecimal.init(AggregationDesc desc)
           
 void VectorUDAFVarPopLong.init(AggregationDesc desc)
           
 void VectorUDAFVarPopDouble.init(AggregationDesc desc)
           
 void VectorUDAFVarPopDecimal.init(AggregationDesc desc)
           
 void VectorUDAFSumLong.init(AggregationDesc desc)
           
 void VectorUDAFSumDouble.init(AggregationDesc desc)
           
 void VectorUDAFStdSampLong.init(AggregationDesc desc)
           
 void VectorUDAFStdSampDouble.init(AggregationDesc desc)
           
 void VectorUDAFStdSampDecimal.init(AggregationDesc desc)
           
 void VectorUDAFStdPopLong.init(AggregationDesc desc)
           
 void VectorUDAFStdPopDouble.init(AggregationDesc desc)
           
 void VectorUDAFStdPopDecimal.init(AggregationDesc desc)
           
 void VectorUDAFMinString.init(AggregationDesc desc)
           
 void VectorUDAFMinLong.init(AggregationDesc desc)
           
 void VectorUDAFMinDouble.init(AggregationDesc desc)
           
 void VectorUDAFMinDecimal.init(AggregationDesc desc)
           
 void VectorUDAFMaxString.init(AggregationDesc desc)
           
 void VectorUDAFMaxLong.init(AggregationDesc desc)
           
 void VectorUDAFMaxDouble.init(AggregationDesc desc)
           
 void VectorUDAFMaxDecimal.init(AggregationDesc desc)
           
 void VectorUDAFAvgLong.init(AggregationDesc desc)
           
 void VectorUDAFAvgDouble.init(AggregationDesc desc)
           
 void VectorUDAFVarSampLong.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFVarSampDouble.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFVarSampDecimal.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFVarPopLong.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFVarPopDouble.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFVarPopDecimal.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFSumLong.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFSumDouble.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFStdSampLong.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFStdSampDouble.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFStdSampDecimal.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFStdPopLong.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFStdPopDouble.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFStdPopDecimal.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFMinString.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFMinLong.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFMinDouble.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFMinDecimal.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFMaxString.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFMaxLong.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFMaxDouble.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFMaxDecimal.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFAvgLong.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 void VectorUDAFAvgDouble.reset(VectorAggregateExpression.AggregationBuffer agg)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.exec.vector.udf
 

Methods in org.apache.hadoop.hive.ql.exec.vector.udf that throw HiveException
 void VectorUDFAdaptor.init()
           
 

Constructors in org.apache.hadoop.hive.ql.exec.vector.udf that throw HiveException
VectorUDFAdaptor(ExprNodeGenericFuncDesc expr, int outputColumn, String resultType, VectorUDFArgDesc[] argDescs)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.index
 

Methods in org.apache.hadoop.hive.ql.index that throw HiveException
 void HiveIndexHandler.analyzeIndexDefinition(Table baseTable, Index index, Table indexTable)
          Requests that the handler validate an index definition and fill in additional information about its stored representation.
 void AggregateIndexHandler.analyzeIndexDefinition(Table baseTable, Index idx, Table indexTable)
           
 boolean HiveIndexResult.contains(org.apache.hadoop.mapred.FileSplit split)
           
 List<Task<?>> TableBasedIndexHandler.generateIndexBuildTaskList(Table baseTbl, Index index, List<Partition> indexTblPartitions, List<Partition> baseTblPartitions, Table indexTbl, Set<ReadEntity> inputs, Set<WriteEntity> outputs)
           
 List<Task<?>> HiveIndexHandler.generateIndexBuildTaskList(Table baseTbl, Index index, List<Partition> indexTblPartitions, List<Partition> baseTblPartitions, Table indexTbl, Set<ReadEntity> inputs, Set<WriteEntity> outputs)
          Requests that the handler generate a plan for building the index; the plan should read the base table and write out the index representation.
protected abstract  Task<?> TableBasedIndexHandler.getIndexBuilderMapRedTask(Set<ReadEntity> inputs, Set<WriteEntity> outputs, List<FieldSchema> indexField, boolean partitioned, PartitionDesc indexTblPartDesc, String indexTableName, PartitionDesc baseTablePartDesc, String baseTableName, String dbName)
           
 

Constructors in org.apache.hadoop.hive.ql.index that throw HiveException
HiveIndexResult(List<String> indexFiles, org.apache.hadoop.mapred.JobConf conf)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.index.bitmap
 

Methods in org.apache.hadoop.hive.ql.index.bitmap that throw HiveException
 void BitmapIndexHandler.analyzeIndexDefinition(Table baseTable, Index index, Table indexTable)
           
protected  Task<?> BitmapIndexHandler.getIndexBuilderMapRedTask(Set<ReadEntity> inputs, Set<WriteEntity> outputs, List<FieldSchema> indexField, boolean partitioned, PartitionDesc indexTblPartDesc, String indexTableName, PartitionDesc baseTablePartDesc, String baseTableName, String dbName)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.index.compact
 

Methods in org.apache.hadoop.hive.ql.index.compact that throw HiveException
 void CompactIndexHandler.analyzeIndexDefinition(Table baseTable, Index index, Table indexTable)
           
protected  Task<?> CompactIndexHandler.getIndexBuilderMapRedTask(Set<ReadEntity> inputs, Set<WriteEntity> outputs, List<FieldSchema> indexField, boolean partitioned, PartitionDesc indexTblPartDesc, String indexTableName, PartitionDesc baseTablePartDesc, String baseTableName, String dbName)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.io
 

Methods in org.apache.hadoop.hive.ql.io that throw HiveException
static boolean HiveFileFormatUtils.checkInputFormat(org.apache.hadoop.fs.FileSystem fs, HiveConf conf, Class<? extends org.apache.hadoop.mapred.InputFormat> inputFormatCls, ArrayList<org.apache.hadoop.fs.FileStatus> files)
          Checks whether the files are in the same format as the given input format.
static FileSinkOperator.RecordWriter HiveFileFormatUtils.getHiveRecordWriter(org.apache.hadoop.mapred.JobConf jc, TableDesc tableInfo, Class<? extends org.apache.hadoop.io.Writable> outputClass, FileSinkDesc conf, org.apache.hadoop.fs.Path outPath, org.apache.hadoop.mapred.Reporter reporter)
           
static FileSinkOperator.RecordWriter HiveFileFormatUtils.getRecordWriter(org.apache.hadoop.mapred.JobConf jc, HiveOutputFormat<?,?> hiveOutputFormat, Class<? extends org.apache.hadoop.io.Writable> valueClass, boolean isCompressed, Properties tableProp, org.apache.hadoop.fs.Path outPath, org.apache.hadoop.mapred.Reporter reporter)
           
 

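HiveFileFormatUtils.getHiveRecordWriter, listed above, resolves a RecordWriter from the table's output format. A minimal sketch, assuming a Hive classpath (jc, tableDesc, fileSinkDesc, and outPath are illustrative; write(Writable) and close(boolean abort) are taken from the FileSinkOperator.RecordWriter interface):

```java
// Sketch only; getHiveRecordWriter() throws HiveException.
FileSinkOperator.RecordWriter writer = HiveFileFormatUtils.getHiveRecordWriter(
    jc, tableDesc, org.apache.hadoop.io.Text.class, fileSinkDesc, outPath,
    org.apache.hadoop.mapred.Reporter.NULL);
writer.write(someWritable);
writer.close(false /* abort */);
```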
Uses of HiveException in org.apache.hadoop.hive.ql.io.rcfile.merge
 

Methods in org.apache.hadoop.hive.ql.io.rcfile.merge that throw HiveException
static org.apache.hadoop.fs.Path RCFileMergeMapper.backupOutputPath(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path outpath, org.apache.hadoop.mapred.JobConf job)
           
static void RCFileMergeMapper.jobClose(org.apache.hadoop.fs.Path outputPath, boolean success, org.apache.hadoop.mapred.JobConf job, SessionState.LogHelper console, DynamicPartitionCtx dynPartCtx, org.apache.hadoop.mapred.Reporter reporter)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.io.rcfile.truncate
 

Methods in org.apache.hadoop.hive.ql.io.rcfile.truncate that throw HiveException
static org.apache.hadoop.fs.Path ColumnTruncateMapper.backupOutputPath(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path outpath, org.apache.hadoop.mapred.JobConf job)
           
static void ColumnTruncateMapper.jobClose(org.apache.hadoop.fs.Path outputPath, boolean success, org.apache.hadoop.mapred.JobConf job, SessionState.LogHelper console, DynamicPartitionCtx dynPartCtx, org.apache.hadoop.mapred.Reporter reporter)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.lockmgr
 

Subclasses of HiveException in org.apache.hadoop.hive.ql.lockmgr
 class LockException
          Exception from lock manager.
 

Uses of HiveException in org.apache.hadoop.hive.ql.metadata
 

Subclasses of HiveException in org.apache.hadoop.hive.ql.metadata
 class HiveFatalException
           
 class InvalidTableException
          Thrown when a referenced table does not exist in the metastore.
 
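Because InvalidTableException and HiveFatalException extend HiveException, callers can handle the specific failure before the general case. The sketch below uses simplified stand-in classes (not the real org.apache.hadoop.hive.ql.metadata types) to show why the subclass must be tested first:

```java
// Simplified stand-ins for the Hive exception hierarchy, to show
// handling from most specific to least specific.
class HiveException extends Exception {
    HiveException(String msg) { super(msg); }
}

class InvalidTableException extends HiveException {
    final String table;
    InvalidTableException(String table) {
        super("Table not found: " + table);
        this.table = table;
    }
}

public class CatchOrder {
    // Classifies a failure the way a caller of Hive.getTable(...) might:
    // check the specific subclass first, then fall back to HiveException.
    public static String classify(HiveException e) {
        if (e instanceof InvalidTableException) {
            return "missing-table:" + ((InvalidTableException) e).table;
        }
        return "hive-error";
    }
}
```

The same ordering applies to catch blocks: a `catch (HiveException e)` placed before `catch (InvalidTableException e)` would make the specific handler unreachable.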

Methods in org.apache.hadoop.hive.ql.metadata that throw HiveException
 void Hive.alterDatabase(String dbName, Database db)
           
 void Hive.alterFunction(String dbName, String funcName, Function newFunction)
           
 void Hive.alterIndex(String dbName, String baseTblName, String idxName, Index newIdx)
          Updates the existing index metadata with the new metadata.
 void Hive.alterPartition(String tblName, Partition newPart)
          Updates the existing partition metadata with the new metadata.
 void Hive.alterPartition(String dbName, String tblName, Partition newPart)
          Updates the existing partition metadata with the new metadata.
 void Hive.alterPartitions(String tblName, List<Partition> newParts)
          Updates the metadata of the existing partitions with the new metadata.
 void Hive.alterTable(String tblName, Table newTbl)
          Updates the existing table metadata with the new metadata.
 void Hive.cancelDelegationToken(String tokenStrForm)
           
 void HiveMetaStoreChecker.checkMetastore(String dbName, String tableName, List<? extends Map<String,String>> partitions, CheckResult result)
          Check the metastore for inconsistencies, data missing in either the metastore or on the dfs.
 void Table.checkValidity()
           
static StorageDescriptor Partition.cloneSd(Table tbl)
          Clones the storage descriptor of the given table.
 void Hive.compact(String dbname, String tableName, String partName, String compactType)
          Enqueue a compaction request.
 Table Table.copy()
           
protected static void Hive.copyFiles(HiveConf conf, org.apache.hadoop.fs.Path srcf, org.apache.hadoop.fs.Path destf, org.apache.hadoop.fs.FileSystem fs)
           
protected  void Table.copyFiles(org.apache.hadoop.fs.Path srcf)
          Inserts files specified into the partition.
 void Hive.createDatabase(Database db)
          Create a Database.
 void Hive.createDatabase(Database db, boolean ifNotExist)
          Create a database.
 void Hive.createFunction(Function func)
           
 void Hive.createIndex(String tableName, String indexName, String indexHandlerClass, List<String> indexedCols, String indexTblName, boolean deferredRebuild, String inputFormat, String outputFormat, String serde, String storageHandler, String location, Map<String,String> idxProps, Map<String,String> tblProps, Map<String,String> serdeProps, String collItemDelim, String fieldDelim, String fieldEscape, String lineDelim, String mapKeyDelim, String indexComment)
           
static Partition Partition.createMetaPartitionObject(Table tbl, Map<String,String> partSpec, org.apache.hadoop.fs.Path location)
           
 Partition Hive.createPartition(Table tbl, Map<String,String> partSpec)
          Creates a partition.
 List<Partition> Hive.createPartitions(AddPartitionDesc addPartitionDesc)
           
 void Hive.createRole(String roleName, String ownerName)
           
 void Hive.createTable(String tableName, List<String> columns, List<String> partCols, Class<? extends org.apache.hadoop.mapred.InputFormat> fileInputFormat, Class<?> fileOutputFormat)
          Creates the table metadata and the directory for the table data.
 void Hive.createTable(String tableName, List<String> columns, List<String> partCols, Class<? extends org.apache.hadoop.mapred.InputFormat> fileInputFormat, Class<?> fileOutputFormat, int bucketCount, List<String> bucketCols)
          Creates the table metadata and the directory for the table data.
 void Hive.createTable(Table tbl)
          Creates the table with the given objects.
 void Hive.createTable(Table tbl, boolean ifNotExists)
          Creates the table with the given objects.
 boolean Hive.databaseExists(String dbName)
          Query metadata to see if a database with the given name already exists.
 boolean Hive.deletePartitionColumnStatistics(String dbName, String tableName, String partName, String colName)
           
 boolean Hive.deleteTableColumnStatistics(String dbName, String tableName, String colName)
           
 void Hive.dropDatabase(String name)
          Drop a database.
 void Hive.dropDatabase(String name, boolean deleteData, boolean ignoreUnknownDb)
          Drop a database.
 void Hive.dropDatabase(String name, boolean deleteData, boolean ignoreUnknownDb, boolean cascade)
          Drop a database.
 void Hive.dropFunction(String dbName, String funcName)
           
 boolean Hive.dropIndex(String db_name, String tbl_name, String index_name, boolean deleteData)
           
 boolean Hive.dropPartition(String tblName, List<String> part_vals, boolean deleteData)
           
 boolean Hive.dropPartition(String db_name, String tbl_name, List<String> part_vals, boolean deleteData)
           
 List<Partition> Hive.dropPartitions(String tblName, List<DropTableDesc.PartSpec> partSpecs, boolean deleteData, boolean ignoreProtection, boolean ifExists)
           
 List<Partition> Hive.dropPartitions(String dbName, String tblName, List<DropTableDesc.PartSpec> partSpecs, boolean deleteData, boolean ignoreProtection, boolean ifExists)
           
 void Hive.dropRole(String roleName)
           
 void Hive.dropTable(String tableName)
          Drops table along with the data in it.
 void Hive.dropTable(String dbName, String tableName)
          Drops table along with the data in it.
 void Hive.dropTable(String dbName, String tableName, boolean deleteData, boolean ignoreUnknownTab)
          Drops the table.
 InputEstimator.Estimation InputEstimator.estimate(org.apache.hadoop.mapred.JobConf job, TableScanOperator ts, long remaining)
          Estimate input size based on filter and projection on table scan operator
 void Hive.exchangeTablePartitions(Map<String,String> partitionSpecs, String sourceDb, String sourceTable, String destDb, String destinationTableName)
           
 PrincipalPrivilegeSet Hive.get_privilege_set(HiveObjectType objectType, String db_name, String table_name, List<String> part_values, String column_name, String user_name, List<String> group_names)
           
static Hive Hive.get()
           
static Hive Hive.get(HiveConf c)
          Gets the Hive object for the current thread.
static Hive Hive.get(HiveConf c, boolean needsRefresh)
          Gets a connection to the metastore.
 List<String> Hive.getAllDatabases()
          Get all existing database names.
 List<Index> Table.getAllIndexes(short max)
           
 Set<Partition> Hive.getAllPartitionsOf(Table tbl)
          Get all the partitions; unlike Hive.getPartitions(Table), does not include auth.
 List<String> Hive.getAllRoleNames()
          Get all existing role names.
 List<String> Hive.getAllTables()
          Get all table names for the current database.
 List<String> Hive.getAllTables(String dbName)
          Get all table names for the specified database.
static HiveAuthenticationProvider HiveUtils.getAuthenticator(org.apache.hadoop.conf.Configuration conf, HiveConf.ConfVars authenticatorConfKey)
           
 HiveAuthorizationProvider HiveStorageHandler.getAuthorizationProvider()
          Returns the implementation specific authorization provider
 HiveAuthorizationProvider DefaultStorageHandler.getAuthorizationProvider()
           
static HiveAuthorizationProvider HiveUtils.getAuthorizeProviderManager(org.apache.hadoop.conf.Configuration conf, HiveConf.ConfVars authorizationProviderConfKey, HiveAuthenticationProvider authenticator)
           
static HiveAuthorizationProvider HiveUtils.getAuthorizeProviderManager(org.apache.hadoop.conf.Configuration conf, HiveConf.ConfVars authorizationProviderConfKey, HiveAuthenticationProvider authenticator, boolean nullIfOtherClass)
          Create a new instance of HiveAuthorizationProvider.
static HiveAuthorizerFactory HiveUtils.getAuthorizerFactory(org.apache.hadoop.conf.Configuration conf, HiveConf.ConfVars authorizationProviderConfKey)
          Return HiveAuthorizerFactory used by new authorization plugin interface.
 Database Hive.getDatabase(String dbName)
          Get the database by name.
 Database Hive.getDatabaseCurrent()
          Get the Database object for current database
 List<String> Hive.getDatabasesByPattern(String databasePattern)
          Get all existing databases that match the given pattern.
 String Hive.getDelegationToken(String owner, String renewer)
           
static List<FieldSchema> Hive.getFieldsFromDeserializer(String name, Deserializer serde)
           
 Function Hive.getFunction(String dbName, String funcName)
           
 List<String> Hive.getFunctions(String dbName, String pattern)
           
 Index Hive.getIndex(String qualifiedIndexName)
           
 Index Hive.getIndex(String baseTableName, String indexName)
           
 Index Hive.getIndex(String dbName, String baseTableName, String indexName)
           
 List<Index> Hive.getIndexes(String dbName, String tblName, short max)
           
static HiveIndexHandler HiveUtils.getIndexHandler(HiveConf conf, String indexHandlerClass)
           
 Class<? extends org.apache.hadoop.mapred.InputFormat> Partition.getInputFormatClass()
           
 Class<? extends HiveOutputFormat> Partition.getOutputFormatClass()
           
 Partition Hive.getPartition(Table tbl, Map<String,String> partSpec, boolean forceCreate)
           
 Partition Hive.getPartition(Table tbl, Map<String,String> partSpec, boolean forceCreate, String partPath, boolean inheritTableSpecs)
          Returns partition metadata.
 Map<String,List<ColumnStatisticsObj>> Hive.getPartitionColumnStatistics(String dbName, String tableName, List<String> partNames, List<String> colNames)
           
 List<String> Hive.getPartitionNames(String tblName, short max)
           
 List<String> Hive.getPartitionNames(String dbName, String tblName, Map<String,String> partSpec, short max)
           
 List<String> Hive.getPartitionNames(String dbName, String tblName, short max)
           
 List<Partition> Hive.getPartitions(Table tbl)
          Gets all the partitions that the table has.
 List<Partition> Hive.getPartitions(Table tbl, Map<String,String> partialPartSpec)
          Gets all the partitions of the table that match the given partial specification.
 List<Partition> Hive.getPartitions(Table tbl, Map<String,String> partialPartSpec, short limit)
          Gets all the partitions of the table that match the given partial specification.
 boolean Hive.getPartitionsByExpr(Table tbl, ExprNodeGenericFuncDesc expr, HiveConf conf, List<Partition> result)
          Get a list of Partitions by expr.
 List<Partition> Hive.getPartitionsByFilter(Table tbl, String filter)
          Get a list of Partitions by filter.
 List<Partition> Hive.getPartitionsByNames(Table tbl, List<String> partNames)
          Get all partitions of the table that matches the list of given partition names.
 List<Partition> Hive.getPartitionsByNames(Table tbl, Map<String,String> partialPartSpec)
          Gets all the partitions of the table that match the given partial specification.
 org.apache.hadoop.fs.Path[] Partition.getPath(Sample s)
           
 List<RolePrincipalGrant> Hive.getRoleGrantInfoForPrincipal(String principalName, PrincipalType principalType)
           
static HiveStorageHandler HiveUtils.getStorageHandler(org.apache.hadoop.conf.Configuration conf, String className)
           
 Table Hive.getTable(String tableName)
          Returns metadata for the table named tableName
 Table Hive.getTable(String tableName, boolean throwException)
          Returns metadata for the table named tableName
 Table Hive.getTable(String dbName, String tableName)
          Returns metadata of the table
 Table Hive.getTable(String dbName, String tableName, boolean throwException)
          Returns metadata of the table
 List<ColumnStatisticsObj> Hive.getTableColumnStatistics(String dbName, String tableName, List<String> colNames)
           
 List<String> Hive.getTablesByPattern(String tablePattern)
          Returns all existing tables from default database which match the given pattern.
 List<String> Hive.getTablesByPattern(String dbName, String tablePattern)
          Returns all existing tables from the specified database which match the given pattern.
 List<String> Hive.getTablesForDb(String database, String tablePattern)
          Returns all existing tables from the given database which match the given pattern.
 boolean Hive.grantPrivileges(PrivilegeBag privileges)
           
 boolean Hive.grantRole(String roleName, String userName, PrincipalType principalType, String grantor, PrincipalType grantorType, boolean grantOption)
           
protected  void Partition.initialize(Table table, Partition tPartition)
          Initializes this object with the given variables.
 List<Role> Hive.listRoles(String userName, PrincipalType principalType)
           
 ArrayList<LinkedHashMap<String,String>> Hive.loadDynamicPartitions(org.apache.hadoop.fs.Path loadPath, String tableName, Map<String,String> partSpec, boolean replace, int numDP, boolean holdDDLTime, boolean listBucketingEnabled)
          Given a source directory name of the load path, load all dynamically generated partitions into the specified table and return a list of strings that represent the dynamic partition paths.
 void Hive.loadPartition(org.apache.hadoop.fs.Path loadPath, String tableName, Map<String,String> partSpec, boolean replace, boolean holdDDLTime, boolean inheritTableSpecs, boolean isSkewedStoreAsSubdir)
          Load a directory into a Hive Table Partition - Alters existing content of the partition with the contents of loadPath.
 void Hive.loadTable(org.apache.hadoop.fs.Path loadPath, String tableName, boolean replace, boolean holdDDLTime)
          Load a directory into a Hive Table.
 Table Hive.newTable(String tableName)
           
protected static boolean Hive.renameFile(HiveConf conf, org.apache.hadoop.fs.Path srcf, org.apache.hadoop.fs.Path destf, org.apache.hadoop.fs.FileSystem fs, boolean replace)
           
 void Hive.renamePartition(Table tbl, Map<String,String> oldPartSpec, Partition newPart)
          Renames an old partition to a new partition.
protected  void Table.replaceFiles(org.apache.hadoop.fs.Path srcf)
          Replaces the directory corresponding to the table by srcf.
protected static void Hive.replaceFiles(org.apache.hadoop.fs.Path srcf, org.apache.hadoop.fs.Path destf, org.apache.hadoop.fs.Path oldPath, HiveConf conf)
          Replaces files in the partition with new data set specified by srcf.
 boolean Hive.revokePrivileges(PrivilegeBag privileges)
           
 boolean Hive.revokeRole(String roleName, String userName, PrincipalType principalType)
           
 void Table.setBucketCols(List<String> bucketCols)
           
 void Table.setInputFormatClass(String name)
           
 void Table.setOutputFormatClass(String name)
           
 void Table.setSkewedColNames(List<String> skewedColNames)
           
 void Table.setSkewedColValues(List<List<String>> skewedValues)
           
 void Table.setSkewedInfo(SkewedInfo skewedInfo)
           
 void Table.setSkewedValueLocationMap(List<String> valList, String dirName)
           
 void Partition.setSkewedValueLocationMap(List<String> valList, String dirName)
           
 void Table.setSortCols(List<Order> sortOrder)
           
 void Table.setStoredAsSubDirectories(boolean storedAsSubDirectories)
           
 void Partition.setValues(Map<String,String> partSpec)
          Sets the partition's values.
 ShowCompactResponse Hive.showCompactions()
           
 List<HiveObjectPrivilege> Hive.showPrivilegeGrant(HiveObjectType objectType, String principalName, PrincipalType principalType, String dbName, String tableName, List<String> partValues, String columnName)
           
 GetOpenTxnsInfoResponse Hive.showTransactions()
           
 boolean Hive.updatePartitionColumnStatistics(ColumnStatistics statsObj)
           
 boolean Hive.updateTableColumnStatistics(ColumnStatistics statsObj)
           
 void Hive.validatePartitionNameCharacters(List<String> partVals)
           
 
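HiveMetaStoreChecker.checkMetastore compares what the metastore declares against what actually exists on the filesystem and reports the inconsistencies in both directions. A minimal standalone sketch of that comparison (the result map below is a simplified stand-in for Hive's CheckResult, and the class name is hypothetical):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

// Illustrative sketch of the checkMetastore idea: partitions present in the
// metastore but missing on the filesystem, and vice versa.
public class MetastoreCheck {
    public static Map<String, Set<String>> check(Set<String> inMetastore, Set<String> onFs) {
        Set<String> missingOnFs = new TreeSet<>(inMetastore);
        missingOnFs.removeAll(onFs);              // declared, but no data directory
        Set<String> notInMetastore = new TreeSet<>(onFs);
        notInMetastore.removeAll(inMetastore);    // data directory, but no metadata
        Map<String, Set<String>> result = new HashMap<>();
        result.put("missingOnFs", missingOnFs);
        result.put("notInMetastore", notInMetastore);
        return result;
    }
}
```

An empty result in both directions means the metastore and the filesystem agree; anything else is the kind of drift a repair operation would then reconcile.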

Constructors in org.apache.hadoop.hive.ql.metadata that throw HiveException
DummyPartition(Table tbl, String name)
           
DummyPartition(Table tbl, String name, Map<String,String> partSpec)
           
Partition(Table tbl)
          Creates an empty partition.
Partition(Table tbl, Map<String,String> partSpec, org.apache.hadoop.fs.Path location)
          Create partition object with the given info.
Partition(Table tbl, Partition tp)
           
Sample(int num, int fraction, Dimension d)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.metadata.formatting
 

Methods in org.apache.hadoop.hive.ql.metadata.formatting that throw HiveException
 void MetaDataFormatter.describeTable(DataOutputStream out, String colPath, String tableName, Table tbl, Partition part, List<FieldSchema> cols, boolean isFormatted, boolean isExt, boolean isPretty, boolean isOutputPadded)
          Describe table.
 void JsonMetaDataFormatter.describeTable(DataOutputStream out, String colPath, String tableName, Table tbl, Partition part, List<FieldSchema> cols, boolean isFormatted, boolean isExt, boolean isPretty, boolean isOutputPadded)
          Describe table.
 void MetaDataFormatter.error(OutputStream out, String msg, int errorCode, String sqlState)
          Write an error message.
 void JsonMetaDataFormatter.error(OutputStream out, String msg, int errorCode, String sqlState)
          Write an error message.
 void MetaDataFormatter.error(OutputStream out, String errorMessage, int errorCode, String sqlState, String errorDetail)
           
 void JsonMetaDataFormatter.error(OutputStream out, String errorMessage, int errorCode, String sqlState, String errorDetail)
           
 void MetaDataFormatter.showDatabaseDescription(DataOutputStream out, String database, String comment, String location, String ownerName, String ownerType, Map<String,String> params)
          Describe a database.
 void JsonMetaDataFormatter.showDatabaseDescription(DataOutputStream out, String database, String comment, String location, String ownerName, String ownerType, Map<String,String> params)
          Show the description of a database
 void MetaDataFormatter.showDatabases(DataOutputStream out, List<String> databases)
          Show the databases
 void JsonMetaDataFormatter.showDatabases(DataOutputStream out, List<String> databases)
          Show a list of databases
 void MetaDataFormatter.showTablePartitons(DataOutputStream out, List<String> parts)
          Show the table partitions.
 void JsonMetaDataFormatter.showTablePartitons(DataOutputStream out, List<String> parts)
          Show the table partitions.
 void MetaDataFormatter.showTables(DataOutputStream out, Set<String> tables)
          Show a list of tables.
 void JsonMetaDataFormatter.showTables(DataOutputStream out, Set<String> tables)
          Show a list of tables.
 void MetaDataFormatter.showTableStatus(DataOutputStream out, Hive db, HiveConf conf, List<Table> tbls, Map<String,String> part, Partition par)
          Show the table status.
 void JsonMetaDataFormatter.showTableStatus(DataOutputStream out, Hive db, HiveConf conf, List<Table> tbls, Map<String,String> part, Partition par)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.optimizer
 

Methods in org.apache.hadoop.hive.ql.optimizer that throw HiveException
static Set<Partition> IndexUtils.checkPartitionsCoveredByIndex(TableScanOperator tableScan, ParseContext pctx, Map<Table,List<Index>> indexes)
          Check the partitions used by the table scan to make sure they also exist in the index table.
 

Uses of HiveException in org.apache.hadoop.hive.ql.optimizer.ppr
 

Methods in org.apache.hadoop.hive.ql.optimizer.ppr that throw HiveException
static Object PartExprEvalUtils.evalExprWithPart(ExprNodeDesc expr, Partition p, List<VirtualColumn> vcs, StructObjectInspector rowObjectInspector)
          Evaluates an expression with partition columns.
static Object PartExprEvalUtils.evaluateExprOnPart(ObjectPair<PrimitiveObjectInspector,ExprNodeEvaluator> pair, Object partColValues)
           
static ObjectPair<PrimitiveObjectInspector,ExprNodeEvaluator> PartExprEvalUtils.prepareExpr(ExprNodeGenericFuncDesc expr, List<String> partNames)
           
static PrunedPartitionList PartitionPruner.prune(TableScanOperator ts, ParseContext parseCtx, String alias)
          Get the partition list for the TS operator that satisfies the partition pruner condition.
static boolean PartitionPruner.prunePartitionNames(List<String> columnNames, ExprNodeGenericFuncDesc prunerExpr, String defaultPartitionName, List<String> partNames)
          Prunes partition names to see if they match the prune expression.
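PartitionPruner.prunePartitionNames works on partition names rather than full Partition objects, removing every name whose key=value spec fails the pruner expression. A simplified standalone sketch (the real method evaluates an ExprNodeGenericFuncDesc; a plain Predicate stands in for it here, and the class name is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Sketch of name-based partition pruning: parse "ds=.../hr=..." names into
// a spec map and drop the ones the predicate rejects.
public class NamePruner {
    // Returns true if any name was removed, mirroring the boolean result
    // of the real prunePartitionNames.
    public static boolean prune(List<String> partNames, Predicate<Map<String, String>> expr) {
        return partNames.removeIf(name -> {
            Map<String, String> spec = new LinkedHashMap<>();
            for (String kv : name.split("/")) {
                String[] p = kv.split("=", 2);
                spec.put(p[0], p[1]);
            }
            return !expr.test(spec);   // remove names the expression rejects
        });
    }
}
```

Pruning on names alone avoids fetching full partition metadata for partitions the query can never touch, which is the point of doing this in the optimizer.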
 

Uses of HiveException in org.apache.hadoop.hive.ql.parse
 

Subclasses of HiveException in org.apache.hadoop.hive.ql.parse
 class SemanticException
          Exception from SemanticAnalyzer.
 

Methods in org.apache.hadoop.hive.ql.parse that throw HiveException
 PTFExpressionDef PTFTranslator.buildExpressionDef(ShapeDetails inpShape, ASTNode arg)
           
 List<Task<? extends Serializable>> IndexUpdater.generateUpdateTasks()
           
static ExprNodeEvaluator WindowingExprNodeEvaluatorFactory.get(LeadLagInfo llInfo, ExprNodeDesc desc)
           
 Hive HiveSemanticAnalyzerHookContextImpl.getHive()
           
 Hive HiveSemanticAnalyzerHookContext.getHive()
           
 PrunedPartitionList ParseContext.getPrunedPartitions(String alias, TableScanOperator ts)
           
 boolean BaseSemanticAnalyzer.isValidPrefixSpec(Table tTable, Map<String,String> spec)
          Checks whether the given specification is a valid prefix of the partition columns. For a table partitioned by (ds, hr, min), valid specifications are (ds='2008-04-08'), (ds='2008-04-08', hr='12'), and (ds='2008-04-08', hr='12', min='30'); (ds='2008-04-08', min='30') is invalid because it skips hr.
 void WindowingExprNodeEvaluatorFactory.FindLeadLagFuncExprs.visit(ExprNodeGenericFuncDesc fnExpr)
           
 
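The prefix rule behind isValidPrefixSpec can be sketched as a standalone check (not the Hive implementation; class and method names here are illustrative): the spec is valid exactly when it covers the first N partition columns with no gaps.

```java
import java.util.List;
import java.util.Map;

// Standalone sketch of the prefix-spec rule: a spec of size N must name
// precisely the first N partition columns.
public class PrefixSpec {
    public static boolean isValidPrefixSpec(List<String> partCols, Map<String, String> spec) {
        if (spec.size() > partCols.size()) return false;
        for (int i = 0; i < spec.size(); i++) {
            if (!spec.containsKey(partCols.get(i))) return false;  // gap in the prefix
        }
        return true;
    }
}
```

With columns (ds, hr, min), a spec naming ds and min has size 2 but does not contain hr (the second column), so the check correctly rejects it.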

Uses of HiveException in org.apache.hadoop.hive.ql.plan
 

Methods in org.apache.hadoop.hive.ql.plan that throw HiveException
static void PlanUtils.addInputsForView(ParseContext parseCtx)
           
protected  void PTFDeserializer.initialize(BoundaryDef def, ShapeDetails inpShape)
           
protected  void PTFDeserializer.initialize(PartitionedTableFunctionDef def)
           
protected  void PTFDeserializer.initialize(PTFExpressionDef eDef, ShapeDetails inpShape)
           
protected  void PTFDeserializer.initialize(PTFQueryInputDef def, StructObjectInspector OI)
           
protected  void PTFDeserializer.initialize(ShapeDetails shp, StructObjectInspector OI)
           
 void PTFDeserializer.initializePTFChain(PartitionedTableFunctionDef tblFnDef)
           
 void PTFDeserializer.initializeWindowing(WindowTableFunctionDef def)
           
 

Constructors in org.apache.hadoop.hive.ql.plan that throw HiveException
PartitionDesc(Partition part)
           
PartitionDesc(Partition part, TableDesc tblDesc)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.security
 

Methods in org.apache.hadoop.hive.ql.security that throw HiveException
 void SessionStateUserAuthenticator.destroy()
           
 void SessionStateConfigUserAuthenticator.destroy()
           
 void HiveAuthenticationProvider.destroy()
           
 void HadoopDefaultAuthenticator.destroy()
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.security.authorization
 

Methods in org.apache.hadoop.hive.ql.security.authorization that throw HiveException
 void StorageBasedAuthorizationProvider.authorize(Database db, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
           
 void HiveAuthorizationProvider.authorize(Database db, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
          Authorizes privileges against a database object.
 void BitSetCheckedAuthorizationProvider.authorize(Database db, Privilege[] inputRequiredPriv, Privilege[] outputRequiredPriv)
           
 void StorageBasedAuthorizationProvider.authorize(Partition part, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
           
 void HiveAuthorizationProvider.authorize(Partition part, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
          Authorizes privileges against a Hive partition object.
 void BitSetCheckedAuthorizationProvider.authorize(Partition part, Privilege[] inputRequiredPriv, Privilege[] outputRequiredPriv)
           
 void StorageBasedAuthorizationProvider.authorize(org.apache.hadoop.fs.Path path, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
          Authorization privileges against a path.
 void StorageBasedAuthorizationProvider.authorize(Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
           
 void HiveAuthorizationProvider.authorize(Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
          Authorizes user-level privileges.
 void BitSetCheckedAuthorizationProvider.authorize(Privilege[] inputRequiredPriv, Privilege[] outputRequiredPriv)
           
 void StorageBasedAuthorizationProvider.authorize(Table table, Partition part, List<String> columns, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
           
 void HiveAuthorizationProvider.authorize(Table table, Partition part, List<String> columns, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
          Authorizes privileges against a list of columns.
 void BitSetCheckedAuthorizationProvider.authorize(Table table, Partition part, List<String> columns, Privilege[] inputRequiredPriv, Privilege[] outputRequiredPriv)
           
 void StorageBasedAuthorizationProvider.authorize(Table table, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
           
 void HiveAuthorizationProvider.authorize(Table table, Privilege[] readRequiredPriv, Privilege[] writeRequiredPriv)
          Authorizes privileges against a Hive table object.
 void BitSetCheckedAuthorizationProvider.authorize(Table table, Privilege[] inputRequiredPriv, Privilege[] outputRequiredPriv)
           
protected  boolean BitSetCheckedAuthorizationProvider.authorizePrivileges(PrincipalPrivilegeSet privileges, Privilege[] inputPriv, boolean[] inputCheck, Privilege[] outputPriv, boolean[] outputCheck)
           
protected  boolean BitSetCheckedAuthorizationProvider.authorizeUserPriv(Privilege[] inputRequiredPriv, boolean[] inputCheck, Privilege[] outputRequiredPriv, boolean[] outputCheck)
           
 PrincipalPrivilegeSet HiveAuthorizationProviderBase.HiveProxy.get_privilege_set(HiveObjectType column, String dbName, String tableName, List<String> partValues, String col, String userName, List<String> groupNames)
           
 Database HiveAuthorizationProviderBase.HiveProxy.getDatabase(String dbName)
           
protected  org.apache.hadoop.fs.Path StorageBasedAuthorizationProvider.getDbLocation(Database db)
           
static HivePrincipal.HivePrincipalType AuthorizationUtils.getHivePrincipalType(PrincipalType type)
          Converts a thrift principal type to an authorization plugin principal type.
static HiveObjectRef AuthorizationUtils.getThriftHiveObjectRef(HivePrivilegeObject privObj)
          Converts a plugin HivePrivilegeObject to a thrift HiveObjectRef.
static HiveObjectType AuthorizationUtils.getThriftHiveObjType(HivePrivilegeObject.HivePrivilegeObjectType type)
          Converts a plugin privilege object type to the corresponding thrift type.
static PrivilegeGrantInfo AuthorizationUtils.getThriftPrivilegeGrantInfo(HivePrivilege privilege, HivePrincipal grantorPrincipal, boolean grantOption, int grantTime)
          Gets the thrift privilege grant info.
 void StorageBasedAuthorizationProvider.init(org.apache.hadoop.conf.Configuration conf)
           
 void HiveAuthorizationProvider.init(org.apache.hadoop.conf.Configuration conf)
           
 void DefaultHiveMetastoreAuthorizationProvider.init(org.apache.hadoop.conf.Configuration conf)
           
 void DefaultHiveAuthorizationProvider.init(org.apache.hadoop.conf.Configuration conf)
           
 
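The authorizePrivileges/authorizeUserPriv pair above takes boolean[] check arrays so that requirements satisfied at one scope (user level, database level, table level) accumulate across calls instead of being re-checked from scratch. A minimal standalone sketch of that accumulation pattern (the class and method names are illustrative, not the Hive API):

```java
import java.util.Set;

// Sketch of the boolean[]-accumulation idea behind
// BitSetCheckedAuthorizationProvider: each scope marks the required
// privileges it can satisfy, and authorization succeeds once all are marked.
public class BitSetAuthz {
    public static void markSatisfied(String[] required, Set<String> granted, boolean[] check) {
        for (int i = 0; i < required.length; i++) {
            if (granted.contains(required[i])) check[i] = true;  // earlier marks stay set
        }
    }

    // Authorization succeeds only when every required privilege was matched
    // by at least one scope.
    public static boolean allSatisfied(boolean[] check) {
        for (boolean b : check) if (!b) return false;
        return true;
    }
}
```

This design lets a privilege granted at the user level satisfy a requirement even when the table-level grants lack it, since both scopes write into the same check array.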

Constructors in org.apache.hadoop.hive.ql.security.authorization that throw HiveException
AuthorizationPreEventListener.PartitionWrapper(Partition mapiPart, PreEventContext context)
           
AuthorizationPreEventListener.PartitionWrapper(Table table, Partition mapiPart)
           
AuthorizationPreEventListener(org.apache.hadoop.conf.Configuration config)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.security.authorization.plugin
 

Subclasses of HiveException in org.apache.hadoop.hive.ql.security.authorization.plugin
 class HiveAccessControlException
          Exception thrown by the authorization plugin API (v2) when an authorization check denies access.
 class HiveAuthzPluginException
          Exception thrown by the authorization plugin API (v2) for internal plugin errors.
 

Uses of HiveException in org.apache.hadoop.hive.ql.session
 

Methods in org.apache.hadoop.hive.ql.session that throw HiveException
 void SessionState.applyAuthorizationPolicy()
          If authorization mode is v2, then pass it through authorizer so that it can apply any security configuration changes.
static CreateTableAutomaticGrant CreateTableAutomaticGrant.create(HiveConf conf)
           
 

Uses of HiveException in org.apache.hadoop.hive.ql.udf.generic
 

Methods in org.apache.hadoop.hive.ql.udf.generic that throw HiveException
 void NGramEstimator.add(ArrayList<String> ng)
          Adds a new n-gram to the estimation.
 void GenericUDAFEvaluator.aggregate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
          This function will be called by GroupByOperator when it sees a new input row.
 Object GenericUDFConcat.binaryEvaluate(GenericUDF.DeferredObject[] arguments)
           
 void GenericUDTFStack.close()
           
 void GenericUDTFPosExplode.close()
           
 void GenericUDTFParseUrlTuple.close()
           
 void GenericUDTFJSONTuple.close()
           
 void GenericUDTFInline.close()
           
 void GenericUDTFExplode.close()
           
abstract  void GenericUDTF.close()
          Called to notify the UDTF that there are no more rows to process.
 void UDTFCollector.collect(Object input)
           
 void Collector.collect(Object input)
          Other classes call collect() with the data they have.
 Integer GenericUDFBaseCompare.compare(GenericUDF.DeferredObject[] arguments)
           
 void GenericUDAFAverage.GenericUDAFAverageEvaluatorDouble.doReset(org.apache.hadoop.hive.ql.udf.generic.GenericUDAFAverage.AverageAggregationBuffer<Double> aggregation)
           
 void GenericUDAFAverage.GenericUDAFAverageEvaluatorDecimal.doReset(org.apache.hadoop.hive.ql.udf.generic.GenericUDAFAverage.AverageAggregationBuffer<HiveDecimal> aggregation)
           
protected abstract  void GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.doReset(org.apache.hadoop.hive.ql.udf.generic.GenericUDAFAverage.AverageAggregationBuffer<TYPE> aggregation)
           
 Object GenericUDAFEvaluator.evaluate(GenericUDAFEvaluator.AggregationBuffer agg)
          This function will be called by GroupByOperator to get the final aggregation result.
 Object UDFCurrentDB.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFWhen.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFUpper.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFUnixTimeStamp.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFUnion.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFTranslate.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFToVarchar.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFToUnixTimeStamp.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFToDecimal.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFToDate.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFToChar.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFToBinary.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFTimestamp.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFStruct.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFStringToMap.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFSplit.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFSortArray.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFSize.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFSentences.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFRound.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFReflect2.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFReflect.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFPrintf.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFPower.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPPositive.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPOr.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPNull.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPNotNull.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPNotEqual.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPNot.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPNegative.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPLessThan.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPGreaterThan.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPEqualOrLessThan.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPEqualOrGreaterThan.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPEqualNS.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPEqual.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFOPAnd.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFNvl.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFNamedStruct.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFMapValues.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFMapKeys.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFMap.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFMacro.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFLower.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFLocate.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFLeadLag.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFInstr.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFInFile.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFIndex.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFIn.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFIf.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFHash.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFFromUtcTimestamp.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFFormatNumber.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFFloorCeilBase.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFField.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFEWAHBitmapEmpty.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFEncode.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFElt.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFDecode.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFDateSub.evaluate(GenericUDF.DeferredObject[] arguments)
           
 org.apache.hadoop.io.IntWritable GenericUDFDateDiff.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFDateAdd.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFDate.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFConcatWS.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFConcat.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFCoalesce.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFCase.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFBridge.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFBetween.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFBaseTrim.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFBasePad.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFBaseNumeric.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFAssertTrue.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFArrayContains.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFArray.evaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDFAbs.evaluate(GenericUDF.DeferredObject[] arguments)
           
abstract  Object GenericUDF.evaluate(GenericUDF.DeferredObject[] arguments)
          Evaluate the GenericUDF with the arguments.
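The many evaluate(GenericUDF.DeferredObject[]) entries above all follow the same contract: arguments arrive as lazily evaluated DeferredObjects, get() may throw HiveException, and a SQL NULL input yields a NULL output. The following self-contained sketch models that calling pattern with a local stand-in for DeferredObject (the real interface lives in org.apache.hadoop.hive.ql.udf.generic and is not used here), using an upper-casing function in the spirit of GenericUDFUpper:

```java
// Simplified model of the GenericUDF evaluation contract. The local
// DeferredObject is a stand-in for GenericUDF.DeferredObject, whose real
// get() throws HiveException rather than Exception.
import java.util.Locale;

public class UpperUdfSketch {
    /** Stand-in for GenericUDF.DeferredObject: a lazily evaluated argument. */
    interface DeferredObject {
        Object get() throws Exception;
    }

    /** Modeled after GenericUDFUpper.evaluate(DeferredObject[]). */
    static Object evaluate(DeferredObject[] arguments) throws Exception {
        Object value = arguments[0].get();   // argument pulled on demand
        if (value == null) {
            return null;                     // NULL in, NULL out
        }
        return value.toString().toUpperCase(Locale.ROOT);
    }

    public static void main(String[] args) throws Exception {
        DeferredObject arg = () -> "hive";
        System.out.println(evaluate(new DeferredObject[] { arg })); // HIVE
    }
}
```

The deferred wrapper is what lets short-circuiting UDFs such as GenericUDFIf and GenericUDFOPAnd skip evaluating arguments they never need.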
 Object AbstractGenericUDFEWAHBitmapBop.evaluate(GenericUDF.DeferredObject[] arguments)
           
protected  void GenericUDTF.forward(Object o)
          Passes an output row to the collector.
 Object GenericUDF.DeferredObject.get()
           
 Object GenericUDF.DeferredJavaObject.get()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFVariance.GenericUDAFVarianceEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFSum.GenericUDAFSumHiveDecimal.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFSum.GenericUDAFSumDouble.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFSum.GenericUDAFSumLong.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFRank.GenericUDAFRankEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFNTile.GenericUDAFNTileEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFnGrams.GenericUDAFnGramEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFMkCollectionEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFMin.GenericUDAFMinEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFMax.GenericUDAFMaxEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFLastValue.GenericUDAFLastValueEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.getNewAggregationBuffer()
           
abstract  GenericUDAFEvaluator.AggregationBuffer GenericUDAFEvaluator.getNewAggregationBuffer()
          Get a new aggregation object.
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFCount.GenericUDAFCountEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFComputeStats.GenericUDAFDecimalStatsEvaluator.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFAverage.GenericUDAFAverageEvaluatorDouble.getNewAggregationBuffer()
           
 GenericUDAFEvaluator.AggregationBuffer GenericUDAFAverage.GenericUDAFAverageEvaluatorDecimal.getNewAggregationBuffer()
           
protected abstract  org.apache.hadoop.hive.ql.udf.generic.LeadLagBuffer GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.getNewLLBuffer()
           
protected  org.apache.hadoop.hive.ql.udf.generic.LeadLagBuffer GenericUDAFLead.GenericUDAFLeadEvaluator.getNewLLBuffer()
           
protected  org.apache.hadoop.hive.ql.udf.generic.LeadLagBuffer GenericUDAFLag.GenericUDAFLagEvaluator.getNewLLBuffer()
           
 ArrayList<Object[]> NGramEstimator.getNGrams()
          Returns the final top-k n-grams in a format suitable for returning to Hive.
protected  double[] GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.getQuantileArray(ConstantObjectInspector quantileOI)
           
protected abstract  Object GenericUDFLeadLag.getRow(int amt)
           
protected  Object GenericUDFLead.getRow(int amt)
           
protected  Object GenericUDFLag.getRow(int amt)
           
 ObjectInspector GenericUDAFVariance.GenericUDAFVarianceEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFSum.GenericUDAFSumHiveDecimal.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFSum.GenericUDAFSumDouble.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFSum.GenericUDAFSumLong.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFRank.GenericUDAFRankEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFPercentRank.GenericUDAFPercentRankEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFPercentileApprox.GenericUDAFSinglePercentileApproxEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFPercentileApprox.GenericUDAFMultiplePercentileApproxEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFNTile.GenericUDAFNTileEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFnGrams.GenericUDAFnGramEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFMkCollectionEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFMin.GenericUDAFMinEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFMax.GenericUDAFMaxEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFLastValue.GenericUDAFLastValueEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
          Initialize the evaluator.
 ObjectInspector GenericUDAFCumeDist.GenericUDAFCumeDistEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFCount.GenericUDAFCountEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFComputeStats.GenericUDAFDecimalStatsEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFBridge.GenericUDAFBridgeEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 ObjectInspector GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.init(GenericUDAFEvaluator.Mode m, ObjectInspector[] parameters)
           
 void NGramEstimator.initialize(int pk, int ppf, int pn)
          Sets the 'k', 'pf', and 'n' parameters.
 void GenericUDAFVariance.GenericUDAFVarianceEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFSum.GenericUDAFSumHiveDecimal.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFSum.GenericUDAFSumDouble.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFSum.GenericUDAFSumLong.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFRank.GenericUDAFRankEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFNTile.GenericUDAFNTileEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFnGrams.GenericUDAFnGramEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFMkCollectionEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFMin.GenericUDAFMinEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFMax.GenericUDAFMaxEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFLastValue.GenericUDAFLastValueEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
abstract  void GenericUDAFEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
          Iterate through original data.
 void GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFCount.GenericUDAFCountEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFComputeStats.GenericUDAFDecimalStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFBridge.GenericUDAFBridgeEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg, Object[] parameters)
           
 void GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer aggregation, Object[] parameters)
           
 void GenericUDFInFile.load(InputStream is)
          Load the file from an InputStream.
 void GenericUDAFVariance.GenericUDAFVarianceEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFSum.GenericUDAFSumHiveDecimal.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFSum.GenericUDAFSumDouble.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFSum.GenericUDAFSumLong.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFRank.GenericUDAFRankEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFNTile.GenericUDAFNTileEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFnGrams.GenericUDAFnGramEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFMkCollectionEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFMin.GenericUDAFMinEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFMax.GenericUDAFMaxEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFLastValue.GenericUDAFLastValueEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
abstract  void GenericUDAFEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
          Merge with partial aggregation result.
 void GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFCount.GenericUDAFCountEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object obj)
           
 void GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFComputeStats.GenericUDAFDecimalStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFBridge.GenericUDAFBridgeEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg, Object partial)
           
 void GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer aggregation, Object partial)
           
 void NGramEstimator.merge(List<org.apache.hadoop.io.Text> other)
          Takes a serialized n-gram estimator object created by the serialize() method and merges it with the current n-gram object.
 void GenericUDF.DeferredObject.prepare(int version)
           
 void GenericUDF.DeferredJavaObject.prepare(int version)
           
 void GenericUDTFStack.process(Object[] args)
           
 void GenericUDTFPosExplode.process(Object[] o)
           
 void GenericUDTFParseUrlTuple.process(Object[] o)
           
 void GenericUDTFJSONTuple.process(Object[] o)
           
 void GenericUDTFInline.process(Object[] os)
           
 void GenericUDTFExplode.process(Object[] o)
           
abstract  void GenericUDTF.process(Object[] args)
          Give a set of arguments for the UDTF to process.
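The UDTF entries above (process, forward, close) form a lifecycle: process() is invoked once per input row, each output row is handed to the collector via forward(), and close() signals that no more rows will arrive. This is a minimal, self-contained sketch of that lifecycle with local stand-in types rather than the real GenericUDTF/Collector classes, shaped after GenericUDTFExplode:

```java
// Simplified model of the GenericUDTF lifecycle: process() per input row,
// forward() per output row, close() when input is exhausted. Stand-in
// types only; real UDTFs extend org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.
import java.util.ArrayList;
import java.util.List;

public class ExplodeUdtfSketch {
    private final List<Object> collected = new ArrayList<>();

    /** Stand-in for GenericUDTF.forward(Object): emits one output row. */
    protected void forward(Object row) {
        collected.add(row);
    }

    /** Modeled after GenericUDTFExplode.process: one output row per element. */
    public void process(Object[] args) {
        List<?> list = (List<?>) args[0];
        for (Object element : list) {
            forward(new Object[] { element });
        }
    }

    /** Modeled after GenericUDTF.close(): no more rows to process. */
    public void close() {
        // A real UDTF may flush buffered rows here; this sketch buffers nothing.
    }

    public List<Object> getCollected() {
        return collected;
    }

    public static void main(String[] args) {
        ExplodeUdtfSketch udtf = new ExplodeUdtfSketch();
        udtf.process(new Object[] { List.of("a", "b", "c") });
        udtf.close();
        System.out.println(udtf.getCollected().size()); // 3
    }
}
```

Because forward() may be called zero, one, or many times per process() call, a UDTF can both filter and expand rows, which is why the throws-HiveException close() hook exists for emitting any final buffered output.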
 void GenericUDAFVariance.GenericUDAFVarianceEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFSum.GenericUDAFSumHiveDecimal.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFSum.GenericUDAFSumDouble.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFSum.GenericUDAFSumLong.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFRank.GenericUDAFRankEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFNTile.GenericUDAFNTileEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFnGrams.GenericUDAFnGramEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFMkCollectionEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFMin.GenericUDAFMinEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFMax.GenericUDAFMaxEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFLastValue.GenericUDAFLastValueEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
abstract  void GenericUDAFEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
          Reset the aggregation.
 void GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFCount.GenericUDAFCountEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFComputeStats.GenericUDAFDecimalStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFBridge.GenericUDAFBridgeEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
           
 void GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer aggregation)
           
 ArrayList<org.apache.hadoop.io.Text> NGramEstimator.serialize()
          In preparation for a Hive merge() call, serializes the current n-gram estimator object into an ArrayList of Text objects.
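The serialize()/merge() pair above is the distributed-aggregation handshake: one NGramEstimator flattens its state into text records, and another estimator folds those records into its own counts. The sketch below models that protocol with plain Strings standing in for the real ArrayList of org.apache.hadoop.io.Text, and a simple frequency map standing in for the estimator's internal top-k structure:

```java
// Simplified model of the NGramEstimator serialize()/merge() protocol.
// Records are "ngram<TAB>count" strings; the real class uses Hadoop Text
// and a bounded top-k structure, which this sketch does not reproduce.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class NGramSketch {
    private final Map<String, Integer> counts = new HashMap<>();

    /** Adds one observed n-gram, like NGramEstimator.add(). */
    public void add(String ngram) {
        counts.merge(ngram, 1, Integer::sum);
    }

    /** Flattens the current counts into text records for shipping. */
    public List<String> serialize() {
        List<String> out = new ArrayList<>();
        counts.forEach((gram, count) -> out.add(gram + "\t" + count));
        return out;
    }

    /** Folds a serialized estimator's counts into this one. */
    public void merge(List<String> other) {
        for (String record : other) {
            int tab = record.lastIndexOf('\t');
            counts.merge(record.substring(0, tab),
                         Integer.parseInt(record.substring(tab + 1)),
                         Integer::sum);
        }
    }

    public int countOf(String ngram) {
        return counts.getOrDefault(ngram, 0);
    }
}
```

Merging counts rather than raw rows is what lets the nGrams and context_ngrams UDAFs run their heavy lifting map-side and combine only compact summaries.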
 String GenericUDFConcat.stringEvaluate(GenericUDF.DeferredObject[] arguments)
           
 Object GenericUDAFVarianceSample.GenericUDAFVarianceSampleEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFVariance.GenericUDAFVarianceEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFSum.GenericUDAFSumHiveDecimal.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFSum.GenericUDAFSumDouble.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFSum.GenericUDAFSumLong.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFStdSample.GenericUDAFStdSampleEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFStd.GenericUDAFStdEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFRank.GenericUDAFRankEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFPercentRank.GenericUDAFPercentRankEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFPercentileApprox.GenericUDAFSinglePercentileApproxEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFPercentileApprox.GenericUDAFMultiplePercentileApproxEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFNTile.GenericUDAFNTileEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFnGrams.GenericUDAFnGramEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFMkCollectionEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFMin.GenericUDAFMinEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFMax.GenericUDAFMaxEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFLastValue.GenericUDAFLastValueEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
abstract  Object GenericUDAFEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
          Get final aggregation result.
 Object GenericUDAFCumeDist.GenericUDAFCumeDistEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFCovarianceSample.GenericUDAFCovarianceSampleEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFCount.GenericUDAFCountEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFDecimalStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFBridge.GenericUDAFBridgeEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer aggregation)
           
 Object GenericUDAFVariance.GenericUDAFVarianceEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFSum.GenericUDAFSumHiveDecimal.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFSum.GenericUDAFSumDouble.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFSum.GenericUDAFSumLong.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFRank.GenericUDAFRankEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFNTile.GenericUDAFNTileEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFnGrams.GenericUDAFnGramEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFMkCollectionEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFMin.GenericUDAFMinEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFMax.GenericUDAFMaxEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFLastValue.GenericUDAFLastValueEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
abstract  Object GenericUDAFEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
          Get partial aggregation result.
 Object GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFCount.GenericUDAFCountEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFComputeStats.GenericUDAFDecimalStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFBridge.GenericUDAFBridgeEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
           
 Object GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer aggregation)
           
 
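The `terminate` and `terminatePartial` methods listed above implement the two output stages of Hive's UDAF contract: `terminatePartial` emits an intermediate result for shuffling, and `terminate` emits the final value. The plain-Java sketch below models that lifecycle for a sum aggregate; it is illustrative only — the real `GenericUDAFEvaluator` API works through `ObjectInspector`s and `AggregationBuffer`, and each method may throw `HiveException`. All class and field names here are hypothetical.

```java
// Simplified model of the GenericUDAFEvaluator lifecycle for a SUM-style
// aggregate. Not the real Hive API: names and types are illustrative.
class SumEvaluatorSketch {
    // Stands in for GenericUDAFEvaluator.AggregationBuffer.
    static class SumBuffer {
        long sum;
        boolean empty = true;
    }

    SumBuffer getNewAggregationBuffer() { return new SumBuffer(); }

    // iterate(): fold one input row into the buffer (map side).
    void iterate(SumBuffer agg, long value) {
        agg.sum += value;
        agg.empty = false;
    }

    // terminatePartial(): emit a partial aggregation result for shuffling.
    Object terminatePartial(SumBuffer agg) {
        return agg.empty ? null : agg.sum;
    }

    // merge(): combine a shuffled partial result into the buffer (reduce side).
    void merge(SumBuffer agg, Object partial) {
        if (partial != null) {
            agg.sum += (Long) partial;
            agg.empty = false;
        }
    }

    // terminate(): emit the final aggregation result.
    Object terminate(SumBuffer agg) {
        return agg.empty ? null : agg.sum;
    }
}
```

A typical run folds rows on the map side, ships the partial, then merges and terminates on the reduce side.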

Uses of HiveException in org.apache.hadoop.hive.ql.udf.ptf
 

Methods in org.apache.hadoop.hive.ql.udf.ptf that throw HiveException
protected  PTFPartition TableFunctionEvaluator._transformRawInput(PTFPartition iPart)
           
protected  PTFPartition NoopWithMap._transformRawInput(PTFPartition iPart)
           
 void WindowingTableFunction.execute(PTFPartition.PTFPartitionIterator<Object> pItr, PTFPartition outP)
           
protected abstract  void TableFunctionEvaluator.execute(PTFPartition.PTFPartitionIterator<Object> pItr, PTFPartition oPart)
           
 void MatchPath.execute(PTFPartition.PTFPartitionIterator<Object> pItr, PTFPartition outP)
           
 PTFPartition TableFunctionEvaluator.execute(PTFPartition iPart)
           
 PTFPartition NoopWithMap.execute(PTFPartition iPart)
           
 PTFPartition Noop.execute(PTFPartition iPart)
           
static ArrayList<Object> MatchPath.getPath(Object currRow, ObjectInspector rowOI, PTFPartition.PTFPartitionIterator<Object> pItr, int sz)
           
static Object MatchPath.getSelectListInput(Object currRow, ObjectInspector rowOI, PTFPartition.PTFPartitionIterator<Object> pItr, int sz)
           
 void TableFunctionResolver.initialize(PTFDesc ptfDesc, PartitionedTableFunctionDef tDef, TableFunctionEvaluator evaluator)
           
 void WindowingTableFunction.WindowingTableFunctionResolver.initializeOutputOI()
           
abstract  void TableFunctionResolver.initializeOutputOI()
          This method is invoked at runtime (during deserialization of the QueryDef).
 void NoopWithMap.NoopWithMapResolver.initializeOutputOI()
           
 void Noop.NoopResolver.initializeOutputOI()
           
 void MatchPath.MatchPathResolver.initializeOutputOI()
           
 void TableFunctionResolver.initializeRawInputOI()
           
 void NoopWithMap.NoopWithMapResolver.initializeRawInputOI()
           
static MatchPath.SymbolFunctionResult MatchPath.SymbolFunction.match(MatchPath.SymbolFunction syFn, Object row, PTFPartition.PTFPartitionIterator<Object> pItr)
           
protected abstract  MatchPath.SymbolFunctionResult MatchPath.SymbolFunction.match(Object row, PTFPartition.PTFPartitionIterator<Object> pItr)
           
protected  MatchPath.SymbolFunctionResult MatchPath.Symbol.match(Object row, PTFPartition.PTFPartitionIterator<Object> pItr)
           
protected  MatchPath.SymbolFunctionResult MatchPath.Star.match(Object row, PTFPartition.PTFPartitionIterator<Object> pItr)
           
protected  MatchPath.SymbolFunctionResult MatchPath.Plus.match(Object row, PTFPartition.PTFPartitionIterator<Object> pItr)
           
protected  MatchPath.SymbolFunctionResult MatchPath.Chain.match(Object row, PTFPartition.PTFPartitionIterator<Object> pItr)
           
 PTFPartition TableFunctionEvaluator.transformRawInput(PTFPartition iPart)
           
 void MatchPath.ResultExpressionParser.translate()
           
 
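The `execute` overloads above share one contract: a `TableFunctionEvaluator` consumes an input partition row by row and fills an output partition, as the `Noop` evaluator does by passing every row through. The sketch below models that contract with plain collections; in Hive the rows flow through `PTFPartition.PTFPartitionIterator` and failures surface as `HiveException`, so the class and types here are illustrative stand-ins.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Simplified model of TableFunctionEvaluator.execute(): drain the input
// partition iterator into an output partition. Mirrors the pass-through
// behavior of Noop.execute(PTFPartition); a real PTF may transform,
// drop, or buffer rows instead.
class NoopEvaluatorSketch {
    List<Object> execute(Iterator<Object> pItr) {
        List<Object> outP = new ArrayList<>();
        while (pItr.hasNext()) {
            outP.add(pItr.next());
        }
        return outP;
    }
}
```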

Uses of HiveException in org.apache.hadoop.hive.ql.udf.xml
 

Methods in org.apache.hadoop.hive.ql.udf.xml that throw HiveException
 Object GenericUDFXPath.evaluate(GenericUDF.DeferredObject[] arguments)
           
 
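`GenericUDFXPath.evaluate` applies an XPath expression to an XML document held in a string argument, throwing `HiveException` on invalid input. The JDK's `javax.xml.xpath` API, used in the sketch below, illustrates the kind of evaluation involved; the helper class and method names are hypothetical and this is not Hive's implementation.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;

// Illustrative XPath-over-a-string evaluation, as an xpath() UDF might
// perform it. Uses only JDK classes; error handling is left to the caller.
class XPathSketch {
    static String evalXPath(String xml, String expr) throws Exception {
        org.w3c.dom.Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        XPath xpath = XPathFactory.newInstance().newXPath();
        // Returns the string value of the first node the expression matches.
        return xpath.evaluate(expr, doc);
    }
}
```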



Copyright © 2014 The Apache Software Foundation. All rights reserved.