Index
A
- apply() method
- about / Working with tasks
- apply_async() method
- about / Working with tasks
- arguments, Server class
- ncpus / Discovering PP
- ppservers / Discovering PP
- arguments, submit method
- func / Discovering PP
- args / Discovering PP
- modules / Discovering PP
- callback / Discovering PP
- Asgard-desktop / Using PP to make a distributed Web crawler
- asynchronous operations
- asyncio
- URL / Using event loops
- about / Using asyncio
- using / Using asyncio
- coroutine, defining / Understanding coroutines and futures
- coroutine and asyncio.Future, using / Using coroutine and asyncio.Future
- asyncio.Task class, using / Using asyncio.Task
- incompatible library, using with / Using an incompatible library with asyncio
- asyncio.Future object
- and coroutine, using / Using coroutine and asyncio.Future
- asyncio.Task class
- using / Using asyncio.Task
- AsyncResult class / Using Celery to obtain a Fibonacci series term
B
- BaseEventLoop.run_in_executor method
- BaseEventLoop.run_until_complete function
- about / Using asyncio.Task
- blocking operations
- broker
C
- callback function
- Celery
- about / Understanding Celery
- using / Why use Celery?
- used, for obtaining Fibonacci series term / Using Celery to obtain a Fibonacci series term
- used, for creating distributed Web crawler / Using Celery to make a distributed Web crawler
- Celery architecture
- about / Understanding Celery's architecture
- tasks, working with / Working with tasks
- broker / Discovering message transport (broker)
- workers / Understanding workers
- result backends / Understanding result backends
- Celery class / Dispatching a simple task
- Celery module
- client components
- about / Working with tasks
- client machine, Celery
- setting up / Setting up the client machine
- concurrent.futures module
- used, for web crawling / Crawling the Web using the concurrent.futures module
- concurrent programming
- Condition mechanism
- conn.send(value) / Using multiprocessing.Pipe
- consumer_task function / Using multiprocessing to compute Fibonacci series terms with multiple inputs
- coroutine
- about / Using asyncio
- and futures / Understanding coroutines and futures
- and asyncio.Future, using / Using coroutine and asyncio.Future
- countdown parameter
- about / Working with tasks
- CPU registry / Understanding the process model
- CPU scheduler
- CPU scheduling / Understanding the process model
- cpu_count function / Using multiprocessing to compute Fibonacci series terms with multiple inputs
- CPython
- about / Taking care of Python GIL
- crawl_task function / Crawling the Web using ProcessPoolExecutor, Using Celery to make a distributed Web crawler
- current state / Understanding the process model
- current_process function / Using multiprocessing to compute Fibonacci series terms with multiple inputs
D
- data decomposition
- using / Using data decomposition
- data exchange tasks
- identifying / Identifying the tasks that require data exchange
- data_queue variable / Using multiprocessing to compute Fibonacci series terms with multiple inputs
- deadlock / Deadlock
- delay(arg, kwarg=value) method
- about / Working with tasks
- distributed programming
- distributed Web crawler
- making, Parallel Python (PP) used / Using PP to make a distributed Web crawler
- creating, Celery used / Using Celery to make a distributed Web crawler
- divide and conquer technique
- about / The divide and conquer technique
E
- environment, Celery
- setting up / Setting up the environment
- client machine, setting up / Setting up the client machine
- server machine, setting up / Setting up the server machine
- epoll() function
- about / Polling functions
- Level-triggered / Polling functions
- Edge-triggered / Polling functions
- epoll_wait() function / Polling functions
- eventlet
- URL / Using event loops
- event loop
- about / Understanding event loop
- using / Using event loops
- event loop, implementing applications
- Tornado web server / Using event loops
- Twisted / Using event loops
- asyncio / Using event loops
- gevent / Using event loops
- eventlet / Using event loops
- expires parameter
- about / Working with tasks
F
- feeder thread / Understanding multiprocessing.Queue
- Fibonacci function
- Fibonacci sequence
- Fibonacci series
- obtaining, threading module used / Using threading to obtain the Fibonacci series term with multiple inputs
- computing, multiprocessing used / Using multiprocessing to compute Fibonacci series terms with multiple inputs
- Fibonacci series term
- obtaining, Celery used / Using Celery to obtain a Fibonacci series term
- Fibonacci series term, on SMP architecture
- calculating, Parallel Python (PP) used / Using PP to calculate the Fibonacci series term on SMP architecture
- fibonacci_task function / Using threading to obtain the Fibonacci series term with multiple inputs
- fibo_dict variable / Using multiprocessing to compute Fibonacci series terms with multiple inputs
- file descriptors
- about / Exploring named pipes
- URL / Exploring named pipes
- First-In, First-Out (FIFO)
- about / Exploring named pipes
- futures
- about / Using asyncio
- and coroutines / Understanding coroutines and futures
- future_tasks
G
- get() function / Using Celery to obtain a Fibonacci series term
- gevent
- URL / Using event loops
- GIL
- about / Taking care of Python GIL
- Global Interpreter Lock (GIL) / Summary
- group_urls_task function / Crawling the Web using the concurrent.futures module, Crawling the Web using ProcessPoolExecutor
H
- highest Fibonacci value
- obtaining, for multiple inputs / Obtaining the highest Fibonacci value for multiple inputs
- calculating, example / Obtaining the highest Fibonacci value for multiple inputs
I
- I/O information / Understanding the process model
- Iceman-Q47OC-500P4C / Using PP to make a distributed Web crawler
- Iceman-Thinkad-X220 / Using PP to make a distributed Web crawler
- incompatible library
- using, with asyncio / Using an incompatible library with asyncio
- independent tasks
- identifying / Identifying independent tasks
- interprocess communication (IPC)
J
- join() method / Using multiprocessing.Pipe
K
- kernel thread
- about / Understanding different kinds of threads
- advantages / Understanding different kinds of threads
- disadvantages / Understanding different kinds of threads
L
- link parameter
- about / Working with tasks
- link_error parameter
- about / Working with tasks
- load balance / Load balance
M
- Manager object / Using multiprocessing to compute Fibonacci series terms with multiple inputs
- manage_crawl_task function / Using Celery to make a distributed Web crawler
- manage_fibo_task function / Using Celery to obtain a Fibonacci series term
- max_workers parameter / Crawling the Web using the concurrent.futures module
- memcached
- memory allocation / Understanding the process model
- merge sort / The divide and conquer technique
- message passing
- about / Understanding message passing
- advantages / Understanding message passing
- multiprocessing
- used, for computing Fibonacci series terms / Using multiprocessing to compute Fibonacci series terms with multiple inputs
- multiprocessing.Pipe
- using / Using multiprocessing.Pipe
- multiprocessing.Queue / Understanding multiprocessing.Queue
- multiprocessing communication
- implementing / Implementing multiprocessing communication
- multiprocessing.Pipe, using / Using multiprocessing.Pipe
- multiprocessing.Queue / Understanding multiprocessing.Queue
- multiprocessing module
- mutex
- about / Understanding shared state
N
- named pipes
- about / Exploring named pipes
- using, with Python / Using named pipes with Python
- writing in / Writing in a named pipe
- reading / Reading named pipes
- ncpus argument / Discovering PP
- non-blocking operations
- non-determinism / Race conditions
- number_of_cpus variable / Using multiprocessing to compute Fibonacci series terms with multiple inputs
O
- os.getpid() / Using multiprocessing.Pipe
- os module
P
- parallelism
- example / Why use parallel programming?
- parallel programming
- need for / Why use parallel programming?
- about / Exploring common forms of parallelization
- advantages / Exploring common forms of parallelization
- shared state / Communicating in parallel programming, Understanding shared state
- message passing / Communicating in parallel programming, Understanding message passing
- parallel programming, problems
- identifying / Identifying parallel programming problems
- deadlock / Deadlock
- starvation / Starvation
- race conditions / Race conditions
- Parallel Python (PP)
- discovering / Discovering PP
- URL, for documentation / Discovering PP
- URL, for arguments / Discovering PP
- used, for calculating Fibonacci series term on SMP architecture / Using PP to calculate the Fibonacci series term on SMP architecture
- used, for making distributed Web Crawler / Using PP to make a distributed Web crawler
- parallel Python module
- about / The parallel Python module
- URL / The parallel Python module
- parallel systems
- pipeline
- tasks, decomposing with / Decomposing tasks with pipeline
- poll() function
- features / Polling functions
- polling functions
- about / Polling functions
- select() / Polling functions
- poll() / Polling functions
- epoll() / Polling functions
- kqueue / Polling functions
- PPES
- about / Discovering PP
- ppservers argument / Discovering PP
- priority / Understanding the process model
- process
- Process Control Block (PCB)
- about / Understanding the process model
- Process ID / Understanding the process model
- program counter / Understanding the process model
- I/O information / Understanding the process model
- memory allocation / Understanding the process model
- CPU scheduling / Understanding the process model
- priority / Understanding the process model
- current state / Understanding the process model
- CPU registry / Understanding the process model
- process mapping
- defining / Processing and mapping
- independent tasks, identifying / Identifying independent tasks
- data exchange tasks, identifying / Identifying the tasks that require data exchange
- load balance / Load balance
- ProcessPoolExecutor
- used, for web crawling / Crawling the Web using ProcessPoolExecutor
- ProcessPoolExecutor class / Crawling the Web using ProcessPoolExecutor
- process states
- running / Defining the states of a process
- ready / Defining the states of a process
- waiting / Defining the states of a process
- producer_task function / Using multiprocessing.Pipe
- producer_task method / Using multiprocessing to compute Fibonacci series terms with multiple inputs
- program counter / Understanding the process model
- proposed solution, Web crawler
- about / Crawling the Web
- Python
- named pipes, using with / Using named pipes with Python
- Python, parallel programming tools
- threading module / The Python threading module
- multiprocessing module / The Python multiprocessing module
- parallel Python module / The parallel Python module
Q
- queue parameter
- about / Working with tasks
- queues
- specifying, for task types / Defining queues by task types
- fibo_queue / Defining queues by task types
- sqrt_queue / Defining queues by task types
- webcrawler_queue / Defining queues by task types
- quick sort / The divide and conquer technique
R
- race conditions / Race conditions
- ready() method / Using Celery to obtain a Fibonacci series term
- readiness notification scheme / Polling functions
- regular expression
- request module
- request object / Using Celery to obtain a Fibonacci series term
- resource descriptor / Understanding event loop
- result backend
- about / Understanding result backends
- retry parameter
- about / Working with tasks
- RPC
S
- select() function
- disadvantages / Polling functions
- serializer parameter
- about / Working with tasks
- server machine, Celery
- setting up / Setting up the server machine
- set_result method
- shared state
- about / Understanding shared state
- shared_queue / Using threading to obtain the Fibonacci series term with multiple inputs
- sleep_func function
- sockets
- Software Transactional Memory (STM) / Summary
- solution scheme
- about / Crawling the Web
- start() method / Using multiprocessing.Pipe
- starvation / Starvation
- submit method / Crawling the Web using the concurrent.futures module
T
- task execution parameters
- countdown / Working with tasks
- expires / Working with tasks
- retry / Working with tasks
- queue / Working with tasks
- serializer / Working with tasks
- link / Working with tasks
- link_error / Working with tasks
- task methods
- delay(arg, kwarg=value) / Working with tasks
- apply_async() / Working with tasks
- apply() / Working with tasks
- tasks
- decomposing, with pipeline / Decomposing tasks with pipeline
- working with / Working with tasks
- dispatching / Dispatching a simple task
- tasks class
- about / Using asyncio
- task types
- queues, defining by / Defining queues by task types
- task_dispatcher.py module / Using Celery to obtain a Fibonacci series term
- task_done() / Using threading to obtain the Fibonacci series term with multiple inputs
- threading module
- about / The Python threading module
- URL / The Python threading module, Choosing between threading and _thread
- and _thread module, selecting between / Choosing between threading and _thread
- used, to obtain the Fibonacci series term with multiple inputs / Using threading to obtain the Fibonacci series term with multiple inputs
- ThreadPoolExecutor object
- threads
- defining / Defining threads
- advantages / Advantages and disadvantages of using threads
- disadvantages / Advantages and disadvantages of using threads
- thread states
- defining / Defining the states of a thread
- creation / Defining the states of a thread
- execution / Defining the states of a thread
- ready / Defining the states of a thread
- blocked / Defining the states of a thread
- concluded / Defining the states of a thread
- thread types
- kernel thread / Understanding different kinds of threads
- user thread / Understanding different kinds of threads
- Tornado web server
- URL / Polling functions, Using event loops
- Twisted
- URL / Using event loops
U
- Uniform Resource Locators (URLs) / Crawling the Web
- user thread
- about / Understanding different kinds of threads
- advantages / Understanding different kinds of threads
- disadvantages / Understanding different kinds of threads
W
- Web crawler
- about / Crawling the Web
- concurrent.futures module, used for / Crawling the Web using the concurrent.futures module
- web crawling
- ProcessPoolExecutor, used for / Crawling the Web using ProcessPoolExecutor
- with statement
- workers
- about / Understanding workers
- concurrency mode / Understanding workers
- remote control / Understanding workers
- revoking tasks / Understanding workers