Improving the performance of the model
Earlier in this chapter, we discussed some factors to consider when designing our baseline architecture for sentiment analysis. In Chapter 8, Handling Overfitting, we also explored foundational concepts for mitigating overfitting, such as early stopping and dropout regularization. To curb overfitting, let's begin by tuning some of our model's hyperparameters. To do this, let's construct a function called sentiment_model. This function takes in three parameters: vocab_size, embedding_dim, and the size of the training set.
Increasing the size of the vocabulary
One hyperparameter we may consider changing is the size of the vocabulary. Increasing the vocabulary size empowers the model to learn more unique words from our dataset. Let's see how this will impact the performance of our base model. Here, we adjust vocab_size from 10000 to 20000, while keeping the...