Language modeling is the task of computing the probability of a sequence of words. Language models are crucial to many applications, such as speech recognition, optical character recognition, machine translation, and spelling correction. For example, in American English the two phrases "wreck a nice beach" and "recognize speech" are almost identical in pronunciation, but their meanings are entirely different. A good language model can determine which phrase is more likely to be correct, based on the context of the conversation. This section provides an overview of word-level and character-level language models and how RNNs can be used to build them.
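To make the task concrete: a language model assigns a probability to a word sequence, typically by factoring it with the chain rule, P(w_1, ..., w_n) = P(w_1) P(w_2 | w_1) ... P(w_n | w_1, ..., w_{n-1}). The sketch below estimates a simple bigram model from a toy corpus and scores the two phrases from the example; the corpus, the add-alpha smoothing, and the function names are illustrative assumptions and not part of the original text, and an RNN would replace these count-based estimates with learned conditional probabilities.

```python
from collections import Counter
import math

def train_bigram_lm(sentences):
    """Count unigrams and bigrams over tokenized sentences (toy example)."""
    unigrams, bigrams = Counter(), Counter()
    for tokens in sentences:
        tokens = ["<s>"] + tokens + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def sequence_log_prob(tokens, unigrams, bigrams, vocab_size, alpha=1.0):
    """log P(w_1 ... w_n) under a bigram model with add-alpha smoothing."""
    tokens = ["<s>"] + tokens + ["</s>"]
    logp = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        numerator = bigrams[(prev, cur)] + alpha
        denominator = unigrams[prev] + alpha * vocab_size
        logp += math.log(numerator / denominator)
    return logp

# Toy corpus: the model should prefer the phrase it has seen in training.
corpus = [
    "it is hard to recognize speech".split(),
    "we went to the beach".split(),
]
unigrams, bigrams = train_bigram_lm(corpus)
vocab_size = len(unigrams)

for phrase in ["recognize speech", "wreck a nice beach"]:
    lp = sequence_log_prob(phrase.split(), unigrams, bigrams, vocab_size)
    print(f"log P({phrase!r}) = {lp:.2f}")
```

Running this prints a higher (less negative) log-probability for "recognize speech" than for "wreck a nice beach", illustrating how a language model ranks competing transcriptions of the same audio.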