5. Regressor

  1. Click Regressor in the Machine Learning category.

  2. Model Type: Choose the regression model.

  3. Allocate to: Enter the variable name to which the created machine learning model will be assigned.

  4. Code View: Preview the generated code (a sketch of the typical output follows this list).

  5. Run: Execute the code.
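
The generated code typically follows the pattern sketched below. This is a minimal sketch assuming a scikit-learn model type; the variable name `model` is a placeholder for whatever is entered in Allocate to.

```python
# Hypothetical output: Model Type = LinearRegression, Allocate to = model
from sklearn.linear_model import LinearRegression

model = LinearRegression()
# The allocated model is then typically fitted in a later step, e.g.:
# model.fit(X_train, y_train)
```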


Linear Regression

  1. Fit Intercept: Choose whether to include the intercept.
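
A minimal sketch of what the Fit Intercept option corresponds to, assuming scikit-learn's LinearRegression:

```python
from sklearn.linear_model import LinearRegression

# fit_intercept=False forces the fitted line through the origin
model = LinearRegression(fit_intercept=True)
```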


Ridge / Lasso

  1. Alpha: Adjust the level of regularization.
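
A minimal sketch assuming scikit-learn's Ridge and Lasso classes; the alpha values are placeholders:

```python
from sklearn.linear_model import Ridge, Lasso

# Larger alpha means stronger regularization (coefficients shrink more)
ridge = Ridge(alpha=1.0)
lasso = Lasso(alpha=0.1)
```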


ElasticNet

  1. Alpha: Adjust the level of regularization.

  2. L1 ratio: Adjust the balance (ratio) between L1 (Lasso) and L2 (Ridge) regularization.
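
A minimal sketch assuming scikit-learn's ElasticNet; values are placeholders:

```python
from sklearn.linear_model import ElasticNet

# l1_ratio=1.0 is pure Lasso (L1); l1_ratio=0.0 is pure Ridge (L2)
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
```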


SVR (Support Vector Machine Regressor)

  1. C: Controls the strength of regularization (C is inversely proportional to the regularization strength). Higher values of C make the model more complex, fitting the training data more closely.

  2. Kernel: Function mapping data to a higher-dimensional space, controlling model complexity.

  • Degree (poly): Determines the degree of the polynomial.

  • Gamma (poly, rbf, sigmoid): Adjusts the curvature of the decision boundary.

  • Coef0 (poly, sigmoid): Additional kernel parameter controlling the offset. Higher values fit the training data more closely.

  3. Random state: Sets the seed value for the random number generator used in model training.
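
A minimal sketch assuming scikit-learn's SVR; parameter values are placeholders, and the kernel-specific options only take effect for the kernels noted above:

```python
from sklearn.svm import SVR

# degree applies to poly; coef0 to poly and sigmoid;
# gamma to poly, rbf, and sigmoid
model = SVR(C=1.0, kernel='poly', degree=3, gamma='scale', coef0=0.0)
```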


DecisionTree Regressor

  1. Criterion: Specifies the measure used for node splitting.

  2. Max depth: Specifies the maximum depth of the tree.

  3. Min Samples Split: Specifies the minimum number of samples required to split a node.

  4. Random state: Sets the seed value for the random number generator used in model training.
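
A minimal sketch assuming scikit-learn's DecisionTreeRegressor (recent versions name the default criterion 'squared_error'); values are placeholders:

```python
from sklearn.tree import DecisionTreeRegressor

model = DecisionTreeRegressor(
    criterion='squared_error',   # measure used for node splitting
    max_depth=5,                 # maximum depth of the tree
    min_samples_split=2,         # minimum samples required to split a node
    random_state=0,              # seed for the random number generator
)
```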


RandomForest Regressor

  1. N estimators: Specifies the number of trees in the ensemble.

  2. Criterion: Specifies the measure used for node splitting.

  3. Max depth: Specifies the maximum depth of the tree.

  4. Min Samples Split: Specifies the minimum number of samples required to split a node.

  5. N jobs: Specifies the number of CPU cores or threads to be used during model training.

  6. Random State: Sets the seed value for the random number generator used in model training.
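
A minimal sketch assuming scikit-learn's RandomForestRegressor; values are placeholders:

```python
from sklearn.ensemble import RandomForestRegressor

model = RandomForestRegressor(
    n_estimators=100,            # number of trees in the ensemble
    criterion='squared_error',   # measure used for node splitting
    max_depth=None,              # None lets trees grow until leaves are pure
    min_samples_split=2,         # minimum samples required to split a node
    n_jobs=-1,                   # -1 uses all available CPU cores
    random_state=0,              # seed for the random number generator
)
```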


GradientBoosting Regressor

  1. Loss: Specifies the loss function used.

  2. Learning rate: Specifies the learning rate.

  3. N estimators: Specifies the number of trees in the ensemble.

  4. Criterion: Specifies the measure used for node splitting.

  5. Random State: Sets the seed value for the random number generator used in model training.
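
A minimal sketch assuming scikit-learn's GradientBoostingRegressor; values are placeholders:

```python
from sklearn.ensemble import GradientBoostingRegressor

model = GradientBoostingRegressor(
    loss='squared_error',        # loss function to optimize
    learning_rate=0.1,           # learning rate
    n_estimators=100,            # number of boosting stages (trees)
    criterion='friedman_mse',    # measure used for node splitting
    random_state=0,              # seed for the random number generator
)
```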


XGB Regressor

  1. N estimators: Specifies the number of trees in the ensemble.

  2. Max depth: Specifies the maximum depth of the tree.

  3. Learning rate: Specifies the learning rate.

  4. Gamma: Specifies the minimum loss reduction required to make a further partition.

  5. Random State: Sets the seed value for the random number generator used in model training.
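
A minimal sketch assuming the xgboost package's XGBRegressor; values are placeholders:

```python
from xgboost import XGBRegressor

model = XGBRegressor(
    n_estimators=100,    # number of trees in the ensemble
    max_depth=6,         # maximum depth of each tree
    learning_rate=0.3,   # learning rate
    gamma=0,             # minimum loss reduction required to split further
    random_state=0,      # seed for the random number generator
)
```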


LGBM Regressor

  1. Boosting type: Specifies the boosting type used in the algorithm.

  2. Max depth: Specifies the maximum depth of the tree.

  3. Learning Rate: Specifies the learning rate.

  4. N estimators: Specifies the number of trees in the ensemble.

  5. Random State: Sets the seed value for the random number generator used in model training.
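
A minimal sketch assuming the lightgbm package's LGBMRegressor; values are placeholders:

```python
from lightgbm import LGBMRegressor

model = LGBMRegressor(
    boosting_type='gbdt',   # boosting type (gradient boosting decision trees)
    max_depth=-1,           # -1 means no depth limit
    learning_rate=0.1,      # learning rate
    n_estimators=100,       # number of trees in the ensemble
    random_state=0,         # seed for the random number generator
)
```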


CatBoost Regressor

  1. Learning rate: Specifies the learning rate.

  2. Loss function: Specifies the loss function used.

  3. Task Type: Specifies the hardware used for data processing.

  4. Max Depth: Specifies the maximum depth of the tree.

  5. N estimators: Specifies the number of trees in the ensemble.

  6. Random State: Sets the seed value for the random number generator used in model training.
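
A minimal sketch assuming the catboost package's CatBoostRegressor; values are placeholders:

```python
from catboost import CatBoostRegressor

model = CatBoostRegressor(
    learning_rate=0.1,      # learning rate
    loss_function='RMSE',   # loss function to optimize
    task_type='CPU',        # 'GPU' to train on a GPU instead
    max_depth=6,            # maximum depth of each tree
    n_estimators=100,       # number of trees in the ensemble
    random_state=0,         # seed for the random number generator
)
```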
