XGBoost is a robust ensemble technique based on gradient boosting. Let's learn how to build an XGBoost classifier.

# First, install xgboost (e.g. pip install xgboost); installation instructions:
# https://xgboost.readthedocs.io/en/latest/build.html
# Imports
from sklearn.datasets import load_iris
from xgboost import XGBClassifier
import pandas as pd
import numpy as np

# Load Data
iris = load_iris()

# Create a dataframe
df = pd.DataFrame(iris.data, columns = iris.feature_names)
df['target'] = iris.target

# Let's see a sample of created df
df.sample(frac=0.01)
    sepal length (cm)  sepal width (cm)  petal length (cm)  petal width (cm)  target
57                4.9               2.4               3.3               1.0       1
85                6.0               3.4               4.5               1.6       1
# Let's see target names
targets = iris.target_names
print(targets)
['setosa' 'versicolor' 'virginica']
# Prepare training data for building the model
X_train = df.drop(['target'], axis=1).values
y_train = df['target']

# Instantiate the model
cls = XGBClassifier()

# Train/Fit the model 
cls.fit(X_train, y_train)

# Make prediction using the model
X_pred = np.array([[5.1, 3.2, 1.5, 0.5]])  # one sample, shape (1, 4)
y_pred = cls.predict(X_pred)

print("Prediction is: {}".format(targets[y_pred]))
Prediction is: ['setosa']

That's how we build an XGBoost classifier.

That’s all for this mini tutorial. To sum it up, we loaded the iris dataset, trained an XGBoost classifier, and used it to make a prediction.

Hope it was easy, cool and simple to follow. Now it’s on you.
