Modern Generative AI with ChatGPT and OpenAI Models

Leverage the capabilities of OpenAI's LLM for productivity and innovation with GPT3 and GPT4
Product type: Paperback
Published: May 2023
Publisher: Packt
ISBN-13: 9781805123330
Length: 286 pages
Edition: 1st Edition
Author: Valentina Alto
Table of Contents (17)

Preface
Part 1: Fundamentals of Generative AI and GPT Models
Chapter 1: Introduction to Generative AI
Chapter 2: OpenAI and ChatGPT – Beyond the Market Hype
Part 2: ChatGPT in Action
Chapter 3: Getting Familiar with ChatGPT
Chapter 4: Understanding Prompt Design
Chapter 5: Boosting Day-to-Day Productivity with ChatGPT
Chapter 6: Developing the Future with ChatGPT
Chapter 7: Mastering Marketing with ChatGPT
Chapter 8: Research Reinvented with ChatGPT
Part 3: OpenAI for Enterprises
Chapter 9: OpenAI and ChatGPT for Enterprises – Introducing Azure OpenAI
Chapter 10: Trending Use Cases for Enterprises
Chapter 11: Epilogue and Final Thoughts
Index
Other Books You May Enjoy

Road to ChatGPT: the math of the model behind it

Since its foundation in 2015, OpenAI has invested in the research and development of a class of models called Generative Pre-trained Transformers (GPT), which have captured everyone's attention as the engine behind ChatGPT.

GPT models are built on the transformer architecture, introduced in 2017 by Google researchers in the paper Attention Is All You Need.
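
As a pointer to the math that paper introduces (it is not reproduced in this excerpt), the transformer's central building block is scaled dot-product attention, defined over query, key, and value matrices Q, K, and V, where d_k is the key dimension:

\[
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V
\]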

The transformer architecture was introduced to overcome the limitations of traditional Recurrent Neural Networks (RNNs). RNNs were first introduced in the 1980s by researchers at the Los Alamos National Laboratory, but they did not gain much attention until the 1990s. The original idea behind RNNs was to process sequential or time series data while keeping information across time steps. A minimal sketch of this idea follows.
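
The following sketch (not taken from the book; shapes, initialization, and names are illustrative assumptions) shows the core RNN idea in Python: a hidden state h is updated at every time step, so information from earlier steps is carried forward through the sequence.

import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Run a simple (Elman-style) RNN over a sequence of input vectors."""
    hidden_size = W_hh.shape[0]
    h = np.zeros(hidden_size)          # initial hidden state
    states = []
    for x_t in inputs:                 # one step per element of the sequence
        # the new state depends on the current input AND the previous state
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return states

# Toy usage: a sequence of 5 random 3-dimensional inputs, hidden size 4
rng = np.random.default_rng(0)
inputs = [rng.normal(size=3) for _ in range(5)]
W_xh = rng.normal(scale=0.1, size=(4, 3))
W_hh = rng.normal(scale=0.1, size=(4, 4))
b_h = np.zeros(4)
states = rnn_forward(inputs, W_xh, W_hh, b_h)
print(len(states), states[-1].shape)   # 5 hidden states, each of shape (4,)

Because the same weights W_hh link each step to the next, the sequence length is not fixed in advance, which is exactly what made RNNs attractive for sequential data before transformers.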

Indeed, up to that point, the classic Artificial Neural Network (ANN) architecture was the feedforward ANN, where the output of each hidden...
