ml.dmlc.xgboost4j.scala.spark

XGBoost

object XGBoost extends Serializable

Linear Supertypes
Serializable, Serializable, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  10. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  12. def loadModelFromHadoopFile(modelPath: String)(implicit sparkContext: SparkContext): XGBoostModel

    Load an XGBoost model from a path in an HDFS-compatible file system (see the example sketch below).

    modelPath

    the path of the file containing the model

    returns

    The loaded model
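
    A minimal usage sketch, assuming an existing SparkContext and a model previously saved to the illustrative HDFS path shown here:

      import org.apache.spark.SparkContext
      import ml.dmlc.xgboost4j.scala.spark.XGBoost

      implicit val sc: SparkContext = SparkContext.getOrCreate()
      // the path is illustrative; point it at a previously saved model file
      val model = XGBoost.loadModelFromHadoopFile("hdfs:///models/xgboost/model")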

  13. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. final def notify(): Unit

    Definition Classes
    AnyRef
  15. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  16. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  17. def toString(): String

    Definition Classes
    AnyRef → Any
  18. def train(trainingData: RDD[org.apache.spark.ml.feature.LabeledPoint], params: Map[String, Any], round: Int, nWorkers: Int, obj: ObjectiveTrait = null, eval: EvalTrait = null, useExternalMemory: Boolean = false, missing: Float = Float.NaN): XGBoostModel

    Train an XGBoost model with RDD-represented training data (see the example sketch below).

    trainingData

    the training set, represented as an RDD of LabeledPoint

    params

    Map containing the configuration entries

    round

    the number of boosting iterations

    nWorkers

    the number of XGBoost workers; when set to 0, the number of workers equals the number of partitions of the trainingData RDD

    obj

    the user-defined objective function, null by default

    eval

    the user-defined evaluation function, null by default

    useExternalMemory

    whether to use the external memory cache; setting this flag to true may reduce the RAM cost of running XGBoost within Spark

    missing

    the value treated as missing in the dataset

    returns

    the trained XGBoostModel

    Exceptions thrown

    ml.dmlc.xgboost4j.java.XGBoostError when model training fails
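
    A minimal usage sketch; the data, parameter values, and worker count are illustrative, not recommendations:

      import org.apache.spark.SparkContext
      import org.apache.spark.ml.feature.LabeledPoint
      import org.apache.spark.ml.linalg.Vectors
      import ml.dmlc.xgboost4j.scala.spark.XGBoost

      val sc = SparkContext.getOrCreate()
      // a tiny in-memory training set; real data would come from storage
      val trainingData = sc.parallelize(Seq(
        LabeledPoint(1.0, Vectors.dense(1.0, 0.5, 0.2)),
        LabeledPoint(0.0, Vectors.dense(0.1, 0.8, 0.3))
      ))
      val params = Map(
        "eta" -> 0.1,
        "max_depth" -> 6,
        "objective" -> "binary:logistic"
      )
      val model = XGBoost.train(trainingData, params, round = 10, nWorkers = 2)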

  19. def trainWithDataFrame(trainingData: Dataset[_], params: Map[String, Any], round: Int, nWorkers: Int, obj: ObjectiveTrait = null, eval: EvalTrait = null, useExternalMemory: Boolean = false, missing: Float = Float.NaN, featureCol: String = "features", labelCol: String = "label"): XGBoostModel

    Train an XGBoost model with DataFrame-represented training data (see the example sketch below).

    trainingData

    the training set, represented as a DataFrame

    params

    Map containing the parameters used to configure XGBoost

    round

    the number of boosting iterations

    nWorkers

    the number of XGBoost workers; when set to 0, the number of workers equals the number of partitions of trainingData

    obj

    the user-defined objective function, null by default

    eval

    the user-defined evaluation function, null by default

    useExternalMemory

    whether to use the external memory cache; setting this flag to true may reduce the RAM cost of running XGBoost within Spark

    missing

    the value treated as missing in the dataset

    featureCol

    the name of the feature column, "features" by default

    labelCol

    the name of the label column, "label" by default

    returns

    the trained XGBoostModel

    Annotations
    @throws( classOf[XGBoostError] )
    Exceptions thrown

    ml.dmlc.xgboost4j.java.XGBoostError when model training fails
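
    A minimal usage sketch, assuming a SparkSession and a DataFrame with a vector "features" column and a numeric "label" column (the default column names); all values are illustrative:

      import org.apache.spark.sql.SparkSession
      import org.apache.spark.ml.linalg.Vectors
      import ml.dmlc.xgboost4j.scala.spark.XGBoost

      val spark = SparkSession.builder().appName("xgboost-example").getOrCreate()
      import spark.implicits._

      // a tiny in-memory DataFrame using the default column names
      val trainingDF = Seq(
        (1.0, Vectors.dense(1.0, 0.5, 0.2)),
        (0.0, Vectors.dense(0.1, 0.8, 0.3))
      ).toDF("label", "features")

      val params = Map("eta" -> 0.1, "max_depth" -> 6, "objective" -> "binary:logistic")
      val model = XGBoost.trainWithDataFrame(trainingDF, params, round = 10, nWorkers = 2)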

  20. def trainWithRDD(trainingData: RDD[org.apache.spark.ml.feature.LabeledPoint], params: Map[String, Any], round: Int, nWorkers: Int, obj: ObjectiveTrait = null, eval: EvalTrait = null, useExternalMemory: Boolean = false, missing: Float = Float.NaN): XGBoostModel

    Variant of train() that trains an XGBoost model with RDD-represented training data (see the example sketch below).

    trainingData

    the training set, represented as an RDD of LabeledPoint

    params

    Map containing the configuration entries

    round

    the number of boosting iterations

    nWorkers

    the number of XGBoost workers; when set to 0, the number of workers equals the number of partitions of the trainingData RDD

    obj

    the user-defined objective function, null by default

    eval

    the user-defined evaluation function, null by default

    useExternalMemory

    whether to use the external memory cache; setting this flag to true may reduce the RAM cost of running XGBoost within Spark

    missing

    the value treated as missing in the dataset

    returns

    the trained XGBoostModel

    Annotations
    @throws( classOf[XGBoostError] )
    Exceptions thrown

    ml.dmlc.xgboost4j.java.XGBoostError when model training fails
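
    A minimal sketch, identical in shape to train() but showing the optional external-memory flag and an explicit missing-value marker; the data and values are illustrative:

      import org.apache.spark.SparkContext
      import org.apache.spark.ml.feature.LabeledPoint
      import org.apache.spark.ml.linalg.Vectors
      import ml.dmlc.xgboost4j.scala.spark.XGBoost

      val sc = SparkContext.getOrCreate()
      val trainingData = sc.parallelize(Seq(
        LabeledPoint(1.0, Vectors.dense(1.0, -999.0, 0.2)),
        LabeledPoint(0.0, Vectors.dense(0.1, 0.8, 0.3))
      ))
      val params = Map("eta" -> 0.1, "max_depth" -> 6, "objective" -> "binary:logistic")
      // -999.0 marks missing entries in this sketch; external memory trades RAM for disk
      val model = XGBoost.trainWithRDD(trainingData, params, round = 10, nWorkers = 2,
        useExternalMemory = true, missing = -999.0f)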

  21. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
