sealed abstract case class AnalyzerPipe[F[_]] extends Product with Serializable

AnalyzerPipe provides methods to tokenize a possibly very long Stream[F, String] or Stream[F, Byte], such as from a file. When possible, prefer starting with a Stream[F, Byte] and use tokenizeBytes.
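As a rough usage sketch (not from this page: the `analyzerPipe` parameter stands in for however an `AnalyzerPipe[IO]` is constructed in your build, and the fs2 file-reading calls are the usual `fs2.io.file` API; only `tokenizeBytes` and its parameters come from the documentation below):

```scala
import cats.effect.IO
import fs2.Stream
import fs2.io.file.{Files, Path}

// Read a file as bytes and tokenize it, preferring tokenizeBytes
// over tokenizeStrings since we start from a Stream[IO, Byte].
def tokensFromFile(analyzerPipe: AnalyzerPipe[IO], path: Path): Stream[IO, String] = {
  val bytes: Stream[IO, Byte] = Files[IO].readAll(path)
  analyzerPipe.tokenizeBytes(bytes, tokenN = 64)
}
```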

Source
AnalyzerPipe.scala
Linear Supertypes
Serializable, Product, Equals, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  8. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  9. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  10. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  11. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  12. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  13. def productElementNames: Iterator[String]
    Definition Classes
    Product
  14. val readerF: (Reader) => Resource[F, TokenGetter]
  15. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  16. def tokenizeBytes(in: Stream[F, Byte], tokenN: Int): Stream[F, String]

    Emits a string for every token, as determined by the Analyzer, in the input stream. Decoding from bytes to strings is done using the default charset.

    in
      input stream to tokenize
    tokenN
      maximum number of tokens to read at a time
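Since decoding uses the default charset, behaviour depends on the JVM's platform charset. A small plain-Scala illustration of the caveat (no fs2 involved; this only demonstrates standard `java.nio.charset` decoding, not this library's internals):

```scala
import java.nio.charset.{Charset, StandardCharsets}

// UTF-8 bytes for a string containing a non-ASCII character.
val bytes = "héllo".getBytes(StandardCharsets.UTF_8)

// Decoding with an explicit charset is always predictable:
val utf8Decoded = new String(bytes, StandardCharsets.UTF_8) // "héllo"

// Decoding with the default charset (what tokenizeBytes does) is only
// correct when the platform default is UTF-8 or compatible:
val defaultDecoded = new String(bytes, Charset.defaultCharset())
```

If the input's encoding matters, decode the byte stream yourself with an explicit charset and use tokenizeStrings instead.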

  17. def tokenizeStrings(in: Stream[F, String], tokenN: Int): Stream[F, String]

    Emits a string for every token, as determined by the Analyzer, in the input stream. A space is inserted between each element in the input stream to avoid accidentally combining words. See tokenizeStringsRaw to avoid this behaviour.

    in
      input stream to tokenize
    tokenN
      maximum number of tokens to read at a time
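The interspersed space matters when stream elements are separate words. A plain-Scala illustration (no fs2; `mkString` stands in for how adjacent elements would meet at the Analyzer's input):

```scala
// Two stream elements that are distinct words:
val words = List("hello", "world")

// Joined with nothing between them, the words fuse into one bogus token:
val fused = words.mkString        // "helloworld"

// Joined with a space (the tokenizeStrings behaviour), they stay separate:
val kept = words.mkString(" ")    // "hello world"
```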

  18. def tokenizeStringsRaw(in: Stream[F, String], tokenN: Int): Stream[F, String]

    Emits a string for every token, as determined by the Analyzer, in the input stream. Be careful: the end of one string will be joined with the beginning of the next in the Analyzer. See tokenizeStrings to automatically intersperse spaces.

    in
      input stream to tokenize
    tokenN
      maximum number of tokens to read at a time
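Raw joining is the behaviour you want when stream elements are arbitrary splits of the underlying text, such as fixed-size chunked reads, where a word may straddle an element boundary. A plain-Scala illustration (again using `mkString` as a stand-in for how adjacent elements meet):

```scala
// A word split across two stream elements by chunking:
val chunks = List("tokeni", "zation")

// Raw joining (the tokenizeStringsRaw behaviour) restores the word:
val joined = chunks.mkString      // "tokenization"

// Interspersing a space here would instead yield two fragments:
val broken = chunks.mkString(" ") // "tokeni zation"
```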

  19. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  20. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  21. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
