package clients
Type Members
- case class APIConfigurations(apiUrl: String, accessKey: String, secretKey: String, connectionTimeoutSec: String = "", readTimeoutSec: String = "", source: String = "") extends Product with Serializable
  - source
  a string describing the application using the client; sent as part of the X-Lakefs-Client header.
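As a sketch of how such configurations might be constructed (the values below are illustrative, and the case class is mirrored locally so the example is self-contained, not the packaged class):

```scala
// Illustration only: a local mirror of the documented APIConfigurations
// signature. The real class lives in the io.treeverse.clients package.
case class APIConfigurations(
    apiUrl: String,
    accessKey: String,
    secretKey: String,
    connectionTimeoutSec: String = "",
    readTimeoutSec: String = "",
    source: String = ""
)

// Timeouts default to empty strings; `source` identifies the calling
// application in the X-Lakefs-Client header.
val conf = APIConfigurations(
  apiUrl = "https://lakefs.example.com/api/v1",
  accessKey = "example-access-key",
  secretKey = "example-secret-key",
  source = "my-export-job"
)
```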
- class ApiClient extends AnyRef
- trait BulkRemover extends AnyRef
- class Conf extends ScallopConf
  Conf options and argument usage are documented at https://github.com/scallop/scallop/wiki
- class ConfigMapper extends Serializable
  Maps a broadcast object containing Hadoop configuration values into a Hadoop configuration.
- class EntryRecord[Proto <: Message] extends AnyRef
- class EntryRecordReader[Proto <: GeneratedMessage with Message[Proto]] extends RecordReader[Array[Byte], WithIdentifier[Proto]]
- case class ExportStatus(path: String, success: Boolean, msg: String) extends Serializable with Product
  Export status of a single file.
  - path
  the path of the file
  - success
  indicates whether the file export succeeded
  - msg
  additional information about the status
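A minimal sketch of consuming ExportStatus values, using a local mirror of the documented case class (illustration only):

```scala
// Illustration only: mirrors the documented ExportStatus signature.
case class ExportStatus(path: String, success: Boolean, msg: String)

val statuses = Seq(
  ExportStatus("data/a.parquet", success = true, msg = ""),
  ExportStatus("data/b.parquet", success = false, msg = "access denied")
)

// Split results into successes and failures, and build a failure report.
val (ok, failed) = statuses.partition(_.success)
val report = failed.map(s => s"${s.path}: ${s.msg}")
```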
- class Exporter extends AnyRef
- class GarbageCollector extends Serializable
- class GravelerSplit extends InputSplit with Writable
- class Handler extends Callable[ExportStatus] with Serializable
- class Item[T] extends AnyRef
- trait KeyFilter extends Serializable
- class LakeFSAllRangesInputFormat extends LakeFSBaseInputFormat
- abstract class LakeFSBaseInputFormat extends InputFormat[Array[Byte], WithIdentifier[Entry]]
- class LakeFSCommitInputFormat extends LakeFSBaseInputFormat
- class LakeFSJobParams extends AnyRef
- class LakeFSRangeGetter extends RangeGetter
- trait RangeGetter extends Serializable
- class RequestRetryWrapper extends AnyRef
- trait S3ClientBuilder extends Serializable
  Interface to build an S3 client. The object io.treeverse.clients.conditional.S3ClientBuilder -- conditionally defined in a separate file according to the supported Hadoop version -- implements this trait. (Scala requires companion objects to be defined in the same file, so it cannot be a companion.)
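The file-separation constraint can be sketched as follows; the `build` method shape here is an assumption for illustration, not the real lakeFS signature:

```scala
// Sketch of the pattern described above: the trait and its
// Hadoop-version-specific implementation live in separate source files,
// so the implementing object cannot be a companion object.
trait S3ClientBuilder extends Serializable {
  // Stand-in return type; a real builder would return an S3 client.
  def build(region: String): String
}

// Conceptually defined in a separate, version-conditional file:
object DefaultS3ClientBuilder extends S3ClientBuilder {
  def build(region: String): String = s"s3-client($region)"
}
```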
- class S3RetryCondition extends SDKDefaultRetryCondition
- class SSTableIterator[Proto <: GeneratedMessage with Message[Proto]] extends Iterator[Item[Proto]]
- class SSTableReader[Proto <: GeneratedMessage with Message[Proto]] extends Closeable
- class SparkFilter extends KeyFilter
- class WithIdentifier[Proto <: GeneratedMessage with Message[Proto]] extends AnyRef
Value Members
- object ApiClient
- object BuildInfo extends Product with Serializable
  This object was generated by sbt-buildinfo.
- object BulkRemoverFactory
- object Exporter
- object GarbageCollector extends Serializable
- object GravelerSplit
- object HadoopUtils
- object LakeFSContext
- object LakeFSInputFormat
- object LakeFSJobParams
- object Main
  Main exports lakeFS objects to an object-store location. There are three ways to run the Export main:
  1. Export all objects from branch 'main' of the 'example-repo' repository to S3 location 's3://example-bucket/prefix/':
     submit example-repo s3://example-bucket/prefix/ --branch=main
  2. Export all objects from commit 'c805e49bafb841a0875f49cd555b397340bbd9b8' of the 'example-repo' repository to S3 location 's3://example-bucket/prefix/':
     submit example-repo s3://example-bucket/prefix/ --commit_id=c805e49bafb841a0875f49cd555b397340bbd9b8
  3. Export only the diff between branch 'main' and commit 'c805e49bafb841a0875f49cd555b397340bbd9b8' of the 'example-repo' repository to S3 location 's3://example-bucket/prefix/':
     submit example-repo s3://example-bucket/prefix/ --branch=main --prev_commit_id=c805e49bafb841a0875f49cd555b397340bbd9b8
- object SSTableReader
- object StorageClientType extends Enumeration
- object StorageUtils
- object URLResolver extends Serializable
- object VarInt
  Adapted from https://github.com/addthis/stream-lib/blob/master/src/main/java/com/clearspring/analytics/util/Varint.java
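The linked Java implementation encodes integers in 7-bit groups with a continuation bit; below is a self-contained Scala sketch of that unsigned-varint scheme (an illustration of the encoding, not the packaged VarInt object itself):

```scala
import java.io.ByteArrayOutputStream

// Unsigned varint: 7 payload bits per byte, with the high bit set on
// every byte except the last.
def writeUnsignedVarInt(value: Int): Array[Byte] = {
  val out = new ByteArrayOutputStream()
  var v = value
  while ((v & 0xFFFFFF80) != 0) {
    out.write((v & 0x7F) | 0x80)
    v >>>= 7
  }
  out.write(v & 0x7F)
  out.toByteArray
}

def readUnsignedVarInt(bytes: Array[Byte]): Int = {
  var value = 0
  var shift = 0
  var i = 0
  var more = true
  while (more) {
    val b = bytes(i) & 0xFF
    value |= (b & 0x7F) << shift
    shift += 7
    i += 1
    more = (b & 0x80) != 0
  }
  value
}
```

Values below 128 fit in a single byte, which is why varints are compact for small integers.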