Kotlin Design Patterns and Best Practices - Second Edition (Packt, January 2022)
Author: Alexey Soshin
Chapter 8: Designing for Concurrency

Concurrent design patterns help us to manage many tasks at once and structure their life cycle. By using these patterns efficiently, we can avoid problems such as resource leaks and deadlocks.

In this chapter, we'll discuss concurrent design patterns and how they are implemented in Kotlin. To do this, we'll be using the building blocks from previous chapters: coroutines, channels, flows, and concepts from functional programming.

We will be covering the following topics in this chapter:

  • Deferred value
  • Barrier
  • Scheduler
  • Pipeline
  • Fan out
  • Fan in
  • Racing
  • Mutex
  • Sidekick channel

After completing this chapter, you'll be able to work with asynchronous values efficiently, coordinate the work of different coroutines, and distribute and aggregate work, as well as have the tools needed to resolve any concurrency problems that may arise in the process.

Technical requirements

In addition to the technical requirements from the previous chapters, you will also need a Gradle-enabled Kotlin project to be able to add the required dependencies.

You can find the source code used in this chapter on GitHub at the following location:

https://github.com/PacktPublishing/Kotlin-Design-Patterns-and-Best-Practices/tree/main/Chapter08

Deferred Value

The goal of the Deferred Value design pattern is to return a reference to a result of an asynchronous computation. A Future in Java and Scala, and a Promise in JavaScript are both implementations of the Deferred Value design pattern.

We've already discussed deferred values in Chapter 6, Threads and Coroutines. We've seen that the async() function returns a type called Deferred, which is also an implementation of this design pattern.

Interestingly enough, the Deferred value itself is an implementation of both the Proxy design pattern that we've seen in Chapter 3, Understanding Structural Patterns, and the State design pattern from Chapter 4, Getting Familiar with Behavioral Patterns.

We can create a new container for the result of an asynchronous computation using the CompletableDeferred constructor:

val deferred = CompletableDeferred<String>()

To populate the Deferred value with a result, we use the complete() function, and if an exception was thrown while computing it, we can pass it to the awaiting side with the completeExceptionally() function.
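
As a minimal, self-contained sketch of how this fits together (not the book's exact listing), the following completes a CompletableDeferred from one coroutine and awaits it from another; the greeting value and the failure branch are purely illustrative:

import kotlinx.coroutines.CompletableDeferred
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    // A container for a value that will be supplied later
    val deferred = CompletableDeferred<String>()

    launch {
        try {
            // Illustrative computation; replace with real asynchronous work
            val greeting = "Hello, " + "Deferred!"
            deferred.complete(greeting)
        } catch (e: Exception) {
            // Report the failure to whoever awaits the value
            deferred.completeExceptionally(e)
        }
    }

    // Suspends until the value (or the exception) becomes available
    println(deferred.await())
}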

Barrier

The Barrier design pattern provides us with the ability to wait for multiple concurrent tasks to complete before proceeding further. A common use case for this is composing objects from different sources.

For example, take the following class:

data class FavoriteCharacter(
    val name: String,
    val catchphrase: String,
    val picture: ByteArray = Random.nextBytes(42)
)

Let's assume that the catchphrase data comes from one service and the picture data comes from another. We would like to fetch these two pieces of data concurrently:

fun CoroutineScope.getCatchphraseAsync(
    characterName: String
) = async { … }

fun CoroutineScope.getPicture(
    characterName: String
) = async { … }

The most basic way to implement concurrent fetching would be as follows:

suspend fun fetchFavoriteCharacter(name: String) = coroutineScope {
  ...
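
One way the body might be completed, shown here as a hedged sketch rather than the book's exact code, is to start both requests and then await them together, so the coroutine proceeds only once both values have arrived. It assumes the FavoriteCharacter class and the two functions declared above are in scope:

import kotlinx.coroutines.coroutineScope

suspend fun fetchFavoriteCharacter(name: String) = coroutineScope {
    // Start both requests concurrently
    val catchphrase = getCatchphraseAsync(name)
    val picture = getPicture(name)

    // The barrier: proceed only once both results are ready
    FavoriteCharacter(
        name = name,
        catchphrase = catchphrase.await(),
        picture = picture.await()
    )
}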

Scheduler

The goal of the Scheduler design pattern is to decouple what is being run from how it's being run and optimize the use of resources when doing so.

In Kotlin, Dispatchers are an implementation of the Scheduler design pattern that decouple the coroutine (that is, the what) from underlying thread pools (that is, the how).

We've already seen dispatchers briefly in Chapter 6, Threads and Coroutines.

To remind you, the coroutine builders such as launch() and async() can specify which dispatcher to use. Here's an example of how you specify it explicitly:

runBlocking {
    // This will use the Dispatcher from the parent 
    // coroutine
    launch {
        // Prints: main
        println(Thread.currentThread().name) 
    }
    launch(Dispatchers.Default) {
   ...
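
As a minimal, runnable sketch of the same idea (not the book's exact listing), the following starts coroutines on different dispatchers and prints the thread each one runs on; the exact thread names depend on your environment:

import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    // Inherits the dispatcher from runBlocking, so this runs on the main thread
    launch {
        println("Parent dispatcher: ${Thread.currentThread().name}")
    }
    // Explicitly scheduled on the shared Default pool, sized to the number of CPU cores
    launch(Dispatchers.Default) {
        println("Default dispatcher: ${Thread.currentThread().name}")
    }
    // The IO dispatcher is intended for blocking, I/O-heavy work
    launch(Dispatchers.IO) {
        println("IO dispatcher: ${Thread.currentThread().name}")
    }
}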

Pipeline

The Pipeline design pattern allows us to scale heterogeneous work, made up of multiple steps of varying complexity, across multiple CPUs by breaking it into smaller, concurrent pieces. Let's look at the following example to understand it better.

Back in Chapter 4, Getting Familiar with Behavioral Patterns, we wrote an HTML page parser. It was assumed that the HTML pages themselves were already fetched for us, though. What we would like to design now is a process that would create a possibly infinite stream of pages.

First, we would like to fetch news pages once in a while. For that, we'll have a producer:

fun CoroutineScope.producePages() = produce {
    fun getPages(): List<String> {
        // This should actually fetch something
        return listOf(
            "<html>...

Fan Out

The goal of the Fan Out design pattern is to distribute work between multiple concurrent processors, also known as workers. To understand it better, let's look again at the previous section but consider the following problem:

What if the amount of work at the different steps in our pipeline is very different?

For example, it takes a lot more time to fetch the HTML content than to parse it. In such a case, we may want to distribute that heavy work between multiple coroutines. In the previous example, only a single coroutine was reading from each channel. But multiple coroutines can consume from a single channel too, thus dividing the work.

To simplify the problem we're about to discuss, let's have only one coroutine producing some results:

fun CoroutineScope.generateWork() = produce {
    for (i in 1..10_000) {
        send("page$i")
    }
    ...
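
A hedged, self-contained sketch of fanning this work out might look as follows; the worker count of four and the println() body are illustrative choices rather than the book's exact code:

import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.channels.ReceiveChannel
import kotlinx.coroutines.channels.produce
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking

@OptIn(ExperimentalCoroutinesApi::class)
fun CoroutineScope.generateWork() = produce {
    for (i in 1..10_000) {
        send("page$i")
    }
}

// Each worker reads from the shared channel; items are distributed
// between the workers rather than duplicated
private fun CoroutineScope.doWork(
    id: Int,
    channel: ReceiveChannel<String>
) = launch(Dispatchers.Default) {
    for (p in channel) {
        println("Worker $id processed $p")
    }
}

fun main() = runBlocking {
    val work = generateWork()
    // Fan out: several workers consume from the same channel
    repeat(4) { id -> doWork(id, work) }
}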

Fan In

The goal of the Fan In design pattern is to combine results from multiple workers. This design pattern is helpful when our workers produce results and we need to gather them.

This design pattern is the opposite of the Fan Out design pattern we discussed in the previous section. Instead of multiple coroutines reading from the same channel, multiple coroutines can write their results to the same channel.

Combining the Fan Out and Fan In design patterns is a good base for MapReduce algorithms. To demonstrate this, we'll slightly change the workers from the previous example, as follows:

private fun CoroutineScope.doWorkAsync(
    channel: ReceiveChannel<String>,
    resultChannel: Channel<String>
) = async(Dispatchers.Default) {
    for (p in channel) {
        resultChannel.send(p.repeat(2))
    }
}

Now, instead of returning its results, each worker sends them into the shared resultChannel.
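
A hedged sketch of how the pieces might be wired together follows; the worker count, the number of items, and the page doubling are illustrative rather than the book's exact code:

import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.async
import kotlinx.coroutines.channels.Channel
import kotlinx.coroutines.channels.ReceiveChannel
import kotlinx.coroutines.channels.produce
import kotlinx.coroutines.runBlocking

@OptIn(ExperimentalCoroutinesApi::class)
fun CoroutineScope.generateWork() = produce {
    for (i in 1..1_000) {
        send("page$i")
    }
}

private fun CoroutineScope.doWorkAsync(
    channel: ReceiveChannel<String>,
    resultChannel: Channel<String>
) = async(Dispatchers.Default) {
    for (p in channel) {
        // Fan in: every worker writes into the same result channel
        resultChannel.send(p.repeat(2))
    }
}

fun main() = runBlocking {
    val work = generateWork()
    val results = Channel<String>()
    // Fan out the work to four workers...
    repeat(4) { doWorkAsync(work, results) }
    // ...and fan their results back in through a single channel
    repeat(1_000) { println(results.receive()) }
}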

Racing

Racing is a design pattern that runs multiple jobs concurrently, picking the result that returns first as the winner and discarding others as losers.

We can implement Racing in Kotlin using the select() function on channels.

Let's imagine you are building a weather application. For redundancy, you fetch the weather from two different sources, Precise Weather and Weather Today. We'll describe them as two producers that return their name and temperature.

If we have more than one producer, we can subscribe to their channels and take the first result that is available.

First, let's declare the two weather producers:

fun CoroutineScope.preciseWeather() = produce {
    delay(Random.nextLong(100))
    send("Precise Weather" to "+25c")
}
 
fun CoroutineScope.weatherToday() = produce {
    delay(Random.nextLong(100))
    send("Weather Today" to ...

Mutex

Also known as mutual exclusion, a mutex provides a means to protect shared state that can be accessed by multiple coroutines at once.

Let's start with the same old dreaded counter example, where multiple concurrent tasks try to update the same counter:

var counter = 0
val jobs = List(10) {
    async(Dispatchers.Default) {
        repeat(1000) {
            counter++
        }
    }
}
jobs.awaitAll()
println(counter)

As you've probably guessed, the result that is printed is less than 10,000 – totally embarrassing!

To solve this, we can introduce a locking mechanism that will allow only a single coroutine to interact with the variable at once, making the operation atomic.

Each coroutine will try to obtain ownership of the counter. If another coroutine is updating it at that moment, the current one suspends until the lock is released.
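
A hedged sketch of the counter example protected by a Mutex follows; compared to the snippet above, only the locking around the increment is new (kotlinx.coroutines provides Mutex and the withLock helper):

import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.async
import kotlinx.coroutines.awaitAll
import kotlinx.coroutines.runBlocking
import kotlinx.coroutines.sync.Mutex
import kotlinx.coroutines.sync.withLock

fun main() = runBlocking {
    var counter = 0
    val mutex = Mutex()

    val jobs = List(10) {
        async(Dispatchers.Default) {
            repeat(1000) {
                // Only one coroutine at a time may execute this block
                mutex.withLock {
                    counter++
                }
            }
        }
    }
    jobs.awaitAll()
    println(counter) // Now reliably prints 10000
}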

Sidekick channel

The Sidekick channel design pattern allows us to offload some work from our main worker to a back worker.

Up until now, we've only discussed the use of select as a receiver. But we can also use select to send items to another channel. Let's look at the following example.

First, we'll declare batman as an actor coroutine that processes 10 messages per second:

val batman = actor<String> {
    for (c in channel) {
        println("Batman is beating some sense into $c")
        delay(100)
    }
}

Next, we'll declare robin as another actor coroutine that is a bit slower and processes only four messages per second:

val robin = actor<String> {
    for (c in channel) {
        println("Robin is beating some sense into $c")
 ...
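
A hedged sketch of how the two actors might be combined with select and onSend follows; the list of villains is made up for illustration. Since select is biased toward its first clause, Batman gets a message whenever he is free, and Robin steps in only when Batman is busy:

import kotlinx.coroutines.ObsoleteCoroutinesApi
import kotlinx.coroutines.channels.actor
import kotlinx.coroutines.delay
import kotlinx.coroutines.runBlocking
import kotlinx.coroutines.selects.select

@OptIn(ObsoleteCoroutinesApi::class)
fun main() = runBlocking {
    val batman = actor<String> {
        for (c in channel) {
            println("Batman is beating some sense into $c")
            delay(100)
        }
    }
    val robin = actor<String> {
        for (c in channel) {
            println("Robin is beating some sense into $c")
            delay(250)
        }
    }

    val villains = listOf("Joker", "Bane", "Penguin", "Riddler", "Killer Croc")
    for (villain in villains) {
        // Offer the villain to Batman first; if he is busy, Robin takes over
        val handler = select<String> {
            batman.onSend(villain) { "Batman" }
            robin.onSend(villain) { "Robin" }
        }
        println("$handler will handle $villain")
    }

    batman.close()
    robin.close()
}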

Summary

In this chapter, we covered various design patterns related to concurrency in Kotlin. Most of them are based on coroutines, channels, deferred values, or a combination of these building blocks.

Deferred values are used as placeholders for asynchronous values. The Barrier design pattern allows multiple asynchronous tasks to rendezvous before proceeding further. The Scheduler design pattern decouples the code of tasks from the way they are executed at runtime.

The Pipeline, Fan In, and Fan Out design patterns help us distribute the work and collect the results. Mutex ensures that only one task at a time can access a shared state. The Racing design pattern allows us to improve the responsiveness of our application. Finally, the Sidekick Channel design pattern offloads work onto a backup task in case the main task is not able to process the incoming events quickly enough.

All of these patterns should help you to manage the concurrency of your application...

Questions

  1. What does it mean when we say that the select expression in Kotlin is biased?
  2. When should you use a mutex instead of a channel?
  3. Which of the concurrent design patterns could help you implement a MapReduce or divide-and-conquer algorithm efficiently?