Machine Learning for Cybersecurity Cookbook

Over 80 recipes on how to implement machine learning algorithms for building security systems using Python

Product type: Paperback
Published in: Nov 2019
Publisher: Packt
ISBN-13: 9781789614671
Length: 346 pages
Edition: 1st
Author: Emmanuel Tsukerman
Table of Contents

Preface
1. Machine Learning for Cybersecurity
2. Machine Learning-Based Malware Detection
3. Advanced Malware Detection
4. Machine Learning for Social Engineering
5. Penetration Testing Using Machine Learning
6. Automatic Intrusion Detection
7. Securing and Attacking Data with Machine Learning
8. Secure and Private AI
9. Other Books You May Enjoy
Appendix

Training an XGBoost classifier

Gradient boosting is widely regarded as one of the most reliable and accurate algorithms for general-purpose machine learning, particularly on tabular data. We will utilize XGBoost, a popular gradient boosting library, to create malware detectors in future recipes.

Getting ready

The preparation for this recipe consists of installing the scikit-learn, pandas, and xgboost packages with pip. The command for this is as follows:

pip install scikit-learn xgboost pandas

In addition, a dataset named file_pe_headers.csv is provided in the repository for this recipe.
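
If you want to confirm that the packages import correctly and that the dataset has the columns the recipe expects, an optional check along the following lines can help (this is a sketch, not part of the original recipe; it assumes file_pe_headers.csv is in your working directory):

import pandas as pd
import sklearn
import xgboost

# Confirm the installed packages import and print their versions
print("scikit-learn:", sklearn.__version__)
print("xgboost:", xgboost.__version__)

# Peek at the dataset to verify the "Name" and "Malware" columns are present
df = pd.read_csv("file_pe_headers.csv")
print(df.shape)
print("Name" in df.columns, "Malware" in df.columns)
print(df["Malware"].value_counts())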

How to do it...

In the following steps, we will demonstrate how to instantiate, train, and test an XGBoost classifier:

  1. Start by reading in the data:
import pandas as pd

df = pd.read_csv("file_pe_headers.csv", sep=",")
y = df["Malware"]
X = df.drop(["Name", "Malware"], axis=1).to_numpy()
  2. Next, train-test split the dataset:
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)
  3. Create an instance of an XGBoost model and train it on the training set:
from xgboost import XGBClassifier

XGB_model_instance = XGBClassifier()
XGB_model_instance.fit(X_train, y_train)
  4. Finally, assess its performance on the testing set:
from sklearn.metrics import accuracy_score

y_test_pred = XGB_model_instance.predict(X_test)
accuracy = accuracy_score(y_test, y_test_pred)
print("Accuracy: %.2f%%" % (accuracy * 100))

The output shows the measured accuracy of the classifier on the testing set.
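
Accuracy alone can hide important details for a malware detector, where false positives and false negatives carry different costs. As an optional follow-up (not part of the original recipe), you could also inspect a confusion matrix and per-class metrics on the same test split:

from sklearn.metrics import confusion_matrix, classification_report

# Rows are true classes, columns are predicted classes
print(confusion_matrix(y_test, y_test_pred))

# Precision, recall, and F1 for each class (benign vs. malware)
print(classification_report(y_test, y_test_pred))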

How it works...

We begin by reading in our data (step 1) and creating a train-test split (step 2). We then instantiate an XGBoost classifier with default parameters and fit it to our training set (step 3). Finally, in step 4, we use the trained classifier to predict labels for the testing set and compute the accuracy of those predictions.
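
The default XGBClassifier settings are a reasonable starting point, but the model exposes many hyperparameters. The following sketch shows how you might pass a few common ones when instantiating the classifier; the values here are illustrative assumptions, not tuned settings from the book:

from xgboost import XGBClassifier

# Illustrative hyperparameters (assumed values, not taken from the recipe)
tuned_model = XGBClassifier(
    n_estimators=200,   # number of boosting rounds (trees)
    max_depth=4,        # maximum depth of each tree
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
)
tuned_model.fit(X_train, y_train)
print(tuned_model.score(X_test, y_test))  # mean accuracy on the test split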

You have been reading a chapter from Machine Learning for Cybersecurity Cookbook (Packt, Nov 2019, ISBN-13 9781789614671).