object GCBackupAndRestore
Value Members
- val AzureStorageAccountKeyName: String
- val S3AccessKeyName: String
- val S3SecretKeyName: String
- def constructAbsoluteObjectPaths(objectsRelativePathsDF: DataFrame, srcNamespace: String, storageType: String): Dataset[String]
- def constructDistCpCommand(hadoopProps: Array[(String, String)], absoluteAddressesTextFilePath: String, dstNamespaceForHadoopFs: String): Array[String]
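The signature above suggests the DistCp command line is assembled from the parsed Hadoop properties plus a file listing source addresses and a destination namespace. A minimal sketch of what such an assembly could look like; the exact flag set is an assumption, not taken from the source:

```scala
// Sketch only: the real constructDistCpCommand may use different flags.
def buildDistCpArgs(
    hadoopProps: Array[(String, String)],
    absoluteAddressesTextFilePath: String,
    dstNamespaceForHadoopFs: String): Array[String] = {
  // Pass each Hadoop property as a -D key=value pair.
  val propFlags = hadoopProps.flatMap { case (k, v) => Array("-D", s"$k=$v") }
  propFlags ++ Array(
    "-f", absoluteAddressesTextFilePath, // text file listing source object URIs
    dstNamespaceForHadoopFs              // copy destination namespace
  )
}
```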
- def eliminatePathsOfNonExistingObjects(absolutePathsDF: Dataset[String], hc: Configuration): Dataset[String]
  Eliminates objects that don't exist on the underlying object store from the path list.
  - absolutePathsDF
    a dataset containing object absolute paths
  - hc
    Hadoop configuration
  - returns
    a dataset including only absolute paths of objects that exist on the underlying object store
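One way such a filter could be written is an existence check against the Hadoop `FileSystem` per partition. A sketch under that assumption (not the project's actual implementation; in practice the non-serializable `Configuration` would be broadcast via a serializable wrapper rather than recreated per task):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.spark.sql.Dataset

// Sketch: keep only paths whose objects exist on the object store.
def keepExistingPaths(absolutePaths: Dataset[String]): Dataset[String] = {
  import absolutePaths.sparkSession.implicits._
  absolutePaths.mapPartitions { paths =>
    // Fresh Configuration per task, since Configuration is not serializable.
    val hc = new Configuration()
    paths.filter { p =>
      val path = new Path(p)
      path.getFileSystem(hc).exists(path)
    }
  }
}
```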
- def getTextFileLocation(prefix: String, hc: Configuration): String
- def main(args: Array[String]): Unit
  Required arguments are the following:
  1. address of the parquet that includes the relative paths of files to backup or restore, created by a GC run
  2. src: the namespace to backup or restore objects from/to (i.e. the repo storage namespace or, compatibly, an external location)
  3. backup/restore destination: the namespace to backup or restore objects from/to (i.e. an external location or, compatibly, the repo storage namespace)
  4. object storage type: "s3" or "azure"
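A hypothetical invocation showing the four positional arguments in order; all bucket names and paths below are illustrative, not taken from the source:

```scala
// Sketch: argument order per the main() documentation above.
GCBackupAndRestore.main(Array(
  "s3://example-bucket/gc-run/addresses.parquet", // parquet of relative paths from a GC run
  "s3://example-bucket/repo-storage-namespace/",  // src namespace
  "s3://backup-bucket/backup-location/",          // backup/restore destination
  "s3"                                            // object storage type: "s3" or "azure"
))
```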
- lazy val spark: SparkSession
- def validateAndParseHadoopConfig(hc: Configuration, storageType: String): Array[(String, String)]
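From the signature, this member presumably checks that the Hadoop configuration carries the credentials required for the given storage type and returns them as key/value pairs. A sketch under that assumption; the property names here are illustrative stand-ins for the `S3AccessKeyName`, `S3SecretKeyName`, and `AzureStorageAccountKeyName` constants defined on this object:

```scala
import org.apache.hadoop.conf.Configuration

// Sketch only: real key names come from the object's val constants.
def validateAndParse(hc: Configuration, storageType: String): Array[(String, String)] = {
  val requiredKeys = storageType match {
    case "s3"    => Seq("fs.s3a.access.key", "fs.s3a.secret.key") // assumed names
    case "azure" => Seq("fs.azure.account.key")                   // assumed name
    case other   => sys.error(s"unsupported storage type: $other")
  }
  requiredKeys.map { k =>
    val v = hc.get(k)
    require(v != null, s"missing required Hadoop property: $k")
    (k, v)
  }.toArray
}
```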