Ensemble Learning: Bagging and Boosting | by Jonas Dieckmann | Feb, 2023


Example: Image classification


Bias-Variance tradeoff


Code example for bagging

from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression

# define base estimator
est = LogisticRegression()  # or est = SVC() or est = DecisionTreeClassifier()

# n_estimators defines the number of base estimators in the ensemble
# max_samples defines the number of samples to draw from X to train each base estimator
bag_model = BaggingClassifier(estimator=est, n_estimators=10, max_samples=1.0)
# note: scikit-learn versions before 1.2 name this parameter base_estimator

bag_model = bag_model.fit(X_train, y_train)

prediction = bag_model.predict(X_test)


Code example for boosting

from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression

# define base estimator (requires support for sample weighting)
est = LogisticRegression()  # or est = SVC() or est = DecisionTreeClassifier() …

# n_estimators defines the maximum number of estimators at which boosting is terminated
# learning_rate defines the weight applied to each classifier at each boosting iteration
boost_model = AdaBoostClassifier(estimator=est, n_estimators=10, learning_rate=1.0)
# note: scikit-learn versions before 1.2 name this parameter base_estimator

boost_model = boost_model.fit(X_train, y_train)

prediction = boost_model.predict(X_test)


Similarities

Differences

Implications
