Thread Safe Property and Resource Access with the Transaction Wrapper
Here is a transaction type to copy & paste into your projects that encapsulates thread-safe read/write access:
struct Transaction {
    let queue: DispatchQueue

    func read<T>(_ value: @autoclosure () -> T) -> T {
        return read(block: value)
    }

    func read<T>(block: () -> T) -> T {
        return queue.sync {
            block()
        }
    }

    func write(block: @escaping () -> Void) {
        queue.async(flags: .barrier) {
            block()
        }
    }
}
Just make sure not to use the main queue, because a .sync call from the main queue to the main queue will deadlock your app!
It ensures that values are read synchronously, which isn’t dangerous, and that write operations are enqueued and executed in order. This is useful if you need to access a resource from multiple threads and want to avoid the overhead of mutex locks.
I called it Transaction because the result is access to resources in a way similar to ACID transactions: complex operations are executed as a batch (Atomicity), and write operations don’t get in each other’s way (Consistency & Isolation); the Durability part is debatable, though.
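Here’s a minimal sketch of how that can look outside of a class, assuming a dedicated concurrent queue (the label and the cache dictionary are just placeholders). On a concurrent queue the .barrier flag actually pays off: reads can run in parallel while a write gets the queue to itself; a plain serial queue works too, it just serializes the reads as well.

import Foundation

let isolationQueue = DispatchQueue(label: "transaction-example", attributes: .concurrent)
let tx = Transaction(queue: isolationQueue)

var cache = [String: Int]()

// Enqueue a write; the barrier keeps it exclusive on the queue.
tx.write { cache["answer"] = 42 }

// A sync read submitted afterwards waits for the pending write, then returns.
let answer = tx.read(cache["answer"])
print(answer as Any) // Optional(42)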
Usage example
Here’s how you’d wrap read and write operations for a totally contrived example:
class Counter {
    private static let queue = DispatchQueue(label: "counter", qos: .background)

    private var count = 0

    var tx: Transaction {
        return Transaction(queue: Counter.queue)
    }

    func increment() {
        tx.write {
            self.count += 1
        }
    }

    var currentValue: Int {
        return tx.read(count)
    }
}
Checking if this really does help
Here’s an example program to demonstrate the effects:
let counter = Counter()

for _ in (0 ..< 20) {
    DispatchQueue.global(qos: .background).async {
        // This is actually bad sample code because it's two calls where time passes in between
        counter.increment()
        print(counter.currentValue, terminator: ", ")
    }
}
Its output:
10, 10, 10, 10, 10, 10, 10, 10, 10, 17, 10, 18, 18, 18, 19, 20, 20, 20, 20, 20
You can see there are 20 print statements, but they don’t happen right after their increment. They are still in order, but apparently some “read” calls were held up by other “write” operations and had to wait, e.g. the last 5.
Here’s one without the Transaction’s queue being used, demonstrating how random the values can appear if you read and write willy-nilly:
14, 14, 15, 14, 16, 15, 15, 19, 20, 20, 15, 20, 15, 20, 20, 20, 20, 20, 20, 20
^^ ^^ ^^ ^^ ^^
It’s even more apparent with 100+ iterations, but I won’t print these here.
Property Wrapper variant
It cannot be a struct because async mutations need to capture a mutable reference to self in the async setter.
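To illustrate, here’s a sketch of the struct variant (the StructTransaction name is hypothetical): the setter of a computed property on a struct is mutating, so the escaping async block cannot capture self, and the compiler rejects it.

@propertyWrapper
struct StructTransaction<Value> {
    private let queue: DispatchQueue
    private var value: Value

    init(wrappedValue: Value, queue: DispatchQueue) {
        self.queue = queue
        self.value = wrappedValue
    }

    var wrappedValue: Value {
        get { queue.sync { value } }
        set {
            queue.async(flags: .barrier) {
                // error: escaping closure captures mutating 'self' parameter
                self.value = newValue
            }
        }
    }
}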
Also keep in mind that a one-line increment like += 1 actually equals one get plus one set call. That will produce results you don’t anticipate: the counter example above with a naive += 1 ends at the value 5 after 100 iterations on my machine, for example, because a lot of reading happens quickly and asynchronously before a write operation sets up the barrier.
Instead, access the backing property directly via the underscore prefix and call e.g. _count.mutate { $0 += 1 } to do it all at once. That is, in effect, similar to tx.write { self.count += 1 }.
@propertyWrapper
final class TransactionWrapper<Value> {
    private let queue: DispatchQueue
    private var value: Value

    init(wrappedValue: Value, queue: DispatchQueue) {
        self.queue = queue
        self.value = wrappedValue
    }

    var wrappedValue: Value {
        get { queue.sync { value } }
        set { queue.async(flags: .barrier) { self.value = newValue } }
    }

    func mutate(_ mutation: (inout Value) -> Void) {
        // Use a barrier here, too, so the in-place mutation cannot overlap
        // with concurrent reads.
        queue.sync(flags: .barrier) {
            mutation(&value)
        }
    }
}
And then use it like so:
class Counter {
    private static let queue = DispatchQueue(label: "counter", qos: .background)

    @TransactionWrapper(wrappedValue: 0, queue: Counter.queue)
    private(set) var count: Int

    func increment() {
        _count.mutate { $0 += 1 }
    }
}
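As a quick sanity check, here’s a sketch mirroring the earlier loop (the DispatchGroup is only there so the program waits for all increments; count is readable from outside because only its setter is private). With the mutate-based increment the counter reliably reaches 20, whereas a naive count += 1 inside increment() would be a separate get plus set and can lose updates.

let counter = Counter()
let group = DispatchGroup()

for _ in 0 ..< 20 {
    DispatchQueue.global(qos: .background).async(group: group) {
        counter.increment()
    }
}

group.wait()
print(counter.count) // 20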
Conclusion and usage
I use the Transaction type for access to a cache and for object repositories. Works like a charm, and the call site doesn’t get too distracting. A very useful tool.
Heads up: There’s no need to go all-in with this. The overhead of async calls with the .barrier flag is not to be underestimated. Simply setting the value directly is orders of magnitude faster. Only protect resources that you absolutely need to, and only introduce concurrency when you cannot get by without it. Network API calls should not write to resources via a transaction; they should finish in the background and then deliver the result to your app on the main queue.
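For the network case, here’s a sketch of that pattern (the function name, URL handling, and “parsing” are just placeholders): do the work off the main thread, then hand the finished value over on the main queue instead of routing every intermediate write through a transaction.

import Foundation

func fetchCount(from url: URL, completion: @escaping (Int) -> Void) {
    URLSession.shared.dataTask(with: url) { data, _, _ in
        // Placeholder "parsing": interpret the payload as a UTF-8 integer.
        let count = data.flatMap { Int(String(decoding: $0, as: UTF8.self)) } ?? 0

        // Deliver the finished result on the main queue.
        DispatchQueue.main.async {
            completion(count)
        }
    }.resume()
}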
I didn’t invent any of this. There’s plenty of discussion on the web. For some more recent iterations on the topic, see also:
- Swift Atomic Properties with Property Wrappers by Vadim Bulavin
- Concurrent vs Serial DispatchQueue: Concurrency in Swift explained by Antoine van der Lee