Advanced Python Programming

Chapter 10


What is a thread? What are the core differences between a thread and a process?

A thread of execution is the smallest unit of programming commands that can be managed and scheduled independently. More than one thread can be implemented within the same process, usually executing concurrently and accessing/sharing the same resources, such as memory; separate processes do not share their resources in this way.
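A minimal sketch (not from the book) of that difference: a thread started with the threading module shares the module-level counter with the main program, while a child process started with multiprocessing works on its own copy of it.

import threading
import multiprocessing

counter = 0

def bump():
    global counter
    counter += 1

if __name__ == "__main__":
    t = threading.Thread(target=bump)
    t.start()
    t.join()
    print("after thread:", counter)   # 1 -- the thread shared our memory

    p = multiprocessing.Process(target=bump)
    p.start()
    p.join()
    print("after process:", counter)  # still 1 -- the child changed only its own copy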

What are the API options provided by the thread module in Python?

The main feature of the thread module is its fast and efficient method of creating new threads to execute functions: the thread.start_new_thread() function. Beyond this, the module only supports a small number of low-level primitives for running multiple threads that share their global data space, and it provides simple lock objects (for example, mutexes and semaphores) for synchronization purposes.
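A minimal, hedged sketch of that API: in Python 3 this module is available as _thread, with the same start_new_thread(function, args[, kwargs]) call described above; the function and argument names below are illustrative.

import _thread
import time

def greet(name, delay=0.1):
    time.sleep(delay)
    print("Hello from", name)

_thread.start_new_thread(greet, ("worker-1",))
_thread.start_new_thread(greet, ("worker-2",), {"delay": 0.2})

time.sleep(0.5)  # crude wait: _thread has no join(), so give the workers time to finish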

What are the API options provided by the threading module in Python?

In addition to all of the functionalities for working with threads that the thread module provides, the threading module also supports a number of extra methods, as follows:

  • threading.activeCount(): This function returns the number of currently active thread objects in the program.
  • threading.currentThread(): This function returns the Thread object corresponding to the caller's current thread of control.
  • threading.enumerate(): This function returns a list of all of the currently active thread objects in the program.
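A short sketch of these helpers in action; note that current Python versions also expose snake_case equivalents (active_count() and current_thread()) alongside the camelCase names listed above.

import threading
import time

def wait_briefly():
    time.sleep(0.2)

workers = [threading.Thread(target=wait_briefly) for _ in range(3)]
for w in workers:
    w.start()

print(threading.active_count())         # the main thread plus the three workers
print(threading.current_thread().name)  # "MainThread" when called from the main thread
print(threading.enumerate())            # all currently alive Thread objects

for w in workers:
    w.join()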

What are the processes of creating new threads via the thread and threading modules?

The processes for creating new threads using the thread and threading modules are as follows:

  • In the thread module, new threads are created to execute functions concurrently. The way to do this is by using the thread.start_new_thread() function: thread.start_new_thread(function, args[, kwargs]).
  • To create and customize a new thread using the threading module, there are specific steps that need to be followed (see the sketch after this list):
    1. Define a subclass of the threading.Thread class in our program
    2. Override the default __init__(self [,args]) method inside the subclass to add custom arguments for the class
    3. Override the default run(self [,args]) method inside the subclass to customize the behavior of the thread class when a new thread is initialized and started
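The following is a minimal sketch of those steps; the class and argument names (MyThread, delay) are illustrative rather than the book's own.

import threading
import time

class MyThread(threading.Thread):
    def __init__(self, name, delay):
        super().__init__()           # step 2: extend __init__ with custom arguments
        self.name = name
        self.delay = delay

    def run(self):                   # step 3: customize what the thread does once started
        print("Starting thread %s." % self.name)
        time.sleep(self.delay)
        print("Finished thread %s." % self.name)

thread = MyThread("A", delay=0.5)
thread.start()   # invokes run() in a new thread of execution
thread.join()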

What is the idea behind thread synchronization using locks?

In a given program, when a thread is accessing or executing the critical section of the program, any other threads need to wait until that thread finishes executing it. The typical goal of thread synchronization is to avoid potential data discrepancies when multiple threads access a shared resource; allowing only one thread to execute the critical section at a time guarantees that no data conflicts can occur in our multithreaded applications. One of the most common ways to apply thread synchronization is through the implementation of a locking mechanism.
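A hedged sketch (not from the book) of the problem a lock solves: two threads perform a read-modify-write on shared data, so updates can be lost if the interpreter switches threads between the read and the write.

import threading

counter = 0

def unsafe_increment(times):
    global counter
    for _ in range(times):
        current = counter   # read the shared value
        current += 1        # modify a local copy
        counter = current   # write back -- another thread may have written in between

threads = [threading.Thread(target=unsafe_increment, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 200000, but lost updates can leave the total lower; whether they
# actually appear on a given run depends on how the threads are scheduled.
print(counter)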

What is the process of implementing thread synchronization using locks in Python?

In the threading module, the threading.Lock class provides a simple and intuitive approach to creating and working with locks. Its main usage includes the following methods:

  • threading.Lock(): This method initializes and returns a new lock object.
  • acquire(blocking): A thread calls this method to acquire the lock before entering the critical section. If another thread already holds the lock and blocking is set to True (the default), the calling thread waits until the lock is released, so only one thread can execute the critical section at a time.
  • release(): When this method is called, the lock is released.
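A minimal sketch of threading.Lock protecting the same kind of shared counter as above; the with statement used here simply calls acquire() on entry and release() on exit.

import threading

counter = 0
lock = threading.Lock()

def safe_increment(times):
    global counter
    for _ in range(times):
        with lock:        # acquire(): only one thread at a time past this point
            counter += 1  # the critical section; release() runs when the block exits

threads = [threading.Thread(target=safe_increment, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # always 200000 -- the lock serializes access to the shared counter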

What is the idea behind the queue data structure?

A queue is an abstract data structure: a collection of elements maintained in a specific order; these elements can be any objects in a program.
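As a small illustration (not the book's code), the standard library's queue.Queue preserves insertion order, so elements come out first in, first out.

import queue

q = queue.Queue()
for item in ["task-1", "task-2", "task-3"]:
    q.put(item)       # enqueue at the back

while not q.empty():
    print(q.get())    # dequeue from the front: task-1, task-2, task-3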

What is the main application of queuing in concurrent programming?

The concept of a queue is even more prevalent in the subfield of concurrent programming, as the order of elements maintained inside a queue plays an important role when a multithreaded program handles and manipulates its shared resources.
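A hedged producer-consumer sketch of that idea, assuming only the standard library queue and threading modules; the worker function and the sentinel-based shutdown are illustrative choices, not the book's code.

import queue
import threading

task_queue = queue.Queue()

def worker():
    while True:
        item = task_queue.get()   # blocks until an item is available
        if item is None:          # sentinel: no more work
            break
        print(threading.current_thread().name, "processed", item)
        task_queue.task_done()

workers = [threading.Thread(target=worker, name=f"worker-{i}") for i in range(2)]
for w in workers:
    w.start()

for item in range(5):             # the producer: the main thread in this sketch
    task_queue.put(item)

task_queue.join()                 # wait until every queued item has been processed
for _ in workers:
    task_queue.put(None)          # one sentinel per worker to shut it down
for w in workers:
    w.join()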

What are the core differences between a regular queue and a priority queue?

The priority queue abstract data structure is similar to the queue data structure, but each of the elements of a priority queue, as the name suggests, has a priority associated with it; in other words, when an element is added to a priority queue, its priority needs to be specified. Unlike in regular queues, the dequeuing principle of a priority queue relies on the priority of the elements: the elements with higher priority are processed before those with lower priority.
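A short sketch contrasting the two with the standard library's queue.Queue and queue.PriorityQueue; note that PriorityQueue returns the lowest priority value first, so a smaller number stands for a higher priority here.

import queue

fifo = queue.Queue()
prio = queue.PriorityQueue()

for priority, task in [(3, "low"), (1, "urgent"), (2, "normal")]:
    fifo.put(task)
    prio.put((priority, task))    # the priority is specified when the element is added

while not fifo.empty():
    print("FIFO:", fifo.get())    # low, urgent, normal -- plain insertion order

while not prio.empty():
    print("PRIO:", prio.get())    # (1, 'urgent'), (2, 'normal'), (3, 'low')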
