Java: Data Science Made Easy

You're reading from Java: Data Science Made Easy (data collection, processing, analysis, and more)

Product type: Course
Published: Jul 2017
Publisher: Packt
ISBN-13: 9781788475655
Length: 734 pages
Edition: 1st Edition
Authors (3):
Alexey Grigorev
Richard M. Reese
Jennifer L. Reese
Table of Contents (29 chapters)

Title Page
Credits
Preface
1. Module 1
2. Getting Started with Data Science
3. Data Acquisition
4. Data Cleaning
5. Data Visualization
6. Statistical Data Analysis Techniques
7. Machine Learning
8. Neural Networks
9. Deep Learning
10. Text Analysis
11. Visual and Audio Analysis
12. Visual and Audio Analysis
13. Mathematical and Parallel Techniques for Data Analysis
14. Bringing It All Together
15. Module 2
16. Data Science Using Java
17. Data Processing Toolbox
18. Exploratory Data Analysis
19. Supervised Learning - Classification and Regression
20. Unsupervised Learning - Clustering and Dimensionality Reduction
21. Working with Text - Natural Language Processing and Information Retrieval
22. Extreme Gradient Boosting
23. Deep Learning with DeepLearning4J
24. Scaling Data Science
25. Deploying Data Science Models
26. Bibliography

Recurrent Neural Networks


RNNs differ from feed-forward networks in that their input includes information carried over from the previous iteration, or step. They still process the current input but use a feedback loop to take into account the inputs from prior steps, also called the recent past, for context. This feedback effectively gives the network memory. One popular type of recurrent network uses Long Short-Term Memory (LSTM) units, which help the network retain useful context over longer sequences.
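
To make the feedback loop concrete, here is a tiny, self-contained sketch of a single vanilla recurrent layer in plain Java. It is not DL4J code, and the class name, sizes, and weight values are invented purely for illustration; the point is that the hidden state produced at one step is fed back in at the next step, which is the "memory" described above.

import java.util.Arrays;

// Illustrative vanilla RNN cell (not DL4J). The hidden state from the
// previous step is combined with the current input at every step.
public class TinyRnnCell {
    static final int INPUT = 3;   // size of each input vector (assumed)
    static final int HIDDEN = 4;  // size of the hidden "memory" state (assumed)
    static final double[][] WX = new double[HIDDEN][INPUT];   // input-to-hidden weights
    static final double[][] WH = new double[HIDDEN][HIDDEN];  // hidden-to-hidden (feedback) weights

    // One recurrent step: h_t = tanh(WX * x_t + WH * h_{t-1})
    static double[] step(double[] x, double[] hPrev) {
        double[] h = new double[HIDDEN];
        for (int i = 0; i < HIDDEN; i++) {
            double sum = 0.0;
            for (int j = 0; j < INPUT; j++)  sum += WX[i][j] * x[j];       // current input
            for (int j = 0; j < HIDDEN; j++) sum += WH[i][j] * hPrev[j];   // the "recent past"
            h[i] = Math.tanh(sum);
        }
        return h;
    }

    public static void main(String[] args) {
        // Fixed, arbitrary weights so the example is reproducible.
        for (double[] row : WX) Arrays.fill(row, 0.10);
        for (double[] row : WH) Arrays.fill(row, 0.05);

        double[][] sequence = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}};
        double[] hidden = new double[HIDDEN];   // starts at zero
        for (double[] x : sequence) {
            hidden = step(x, hidden);           // output now depends on all earlier inputs
            System.out.println(Arrays.toString(hidden));
        }
    }
}

Running the sketch prints a different hidden vector at each step even though the weights never change, because each step's result folds in the state left behind by the steps before it.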

RNNs are designed to process sequential data and are especially useful for analysis and prediction with text. Given a sequence of words, an RNN can predict the probability of each candidate word being the next in the sequence, which also allows the network to generate text. RNNs are versatile and handle image data well too, especially in image labeling applications. Their flexibility in design and purpose, together with their ease of training, makes RNNs popular choices for many data science applications. DL4J also provides support...
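
The chapter continues past this point, so as a rough sketch of the kind of RNN support DL4J offers, the following configures a small LSTM network for next-token prediction. The class names come from the publicly documented DL4J API (GravesLSTM, RnnOutputLayer), but the layer sizes and other settings are assumptions for illustration only, not an example taken from this book.

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LstmConfigSketch {
    public static void main(String[] args) {
        int vocabularySize = 64;   // assumed one-hot input/output size
        int lstmLayerSize = 128;   // assumed hidden state size

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(42)
                .list()
                // LSTM layer: its recurrent connections carry state between time steps.
                .layer(0, new GravesLSTM.Builder()
                        .nIn(vocabularySize)
                        .nOut(lstmLayerSize)
                        .activation(Activation.TANH)
                        .build())
                // Softmax output at every time step: a probability for each
                // candidate "next" token in the sequence.
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(lstmLayerSize)
                        .nOut(vocabularySize)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        System.out.println(net.summary());
    }
}

Training such a network would then feed in sequences as 3-D arrays shaped [miniBatchSize, inputSize, timeSteps], which is the layout DL4J uses for recurrent data.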
