The lockable library offers a thread-safe HashMap (see `LockableHashMap`) and a thread-safe LruCache (see `LockableLruCache`) in which individual keys can be locked and unlocked, even if there is no entry for that key in the map.
This can be very useful for synchronizing access to an underlying key-value store or for building cache data structures on top of such a key-value store.
This example builds a simple LRU cache and locks some entries.
```rust
use lockable::{AsyncLimit, LockableLruCache};

let lockable_cache: LockableLruCache<i64, String> = LockableLruCache::new();

// Insert an entry
lockable_cache
    .async_lock(4, AsyncLimit::no_limit())
    .await?
    .insert(String::from("Value"));

// Hold a lock on a different entry
let guard = lockable_cache
    .async_lock(5, AsyncLimit::no_limit())
    .await?;

// This next line would wait until the lock gets released, which in this case
// would cause a deadlock because we're on the same thread.
// let guard2 = lockable_cache.async_lock(5, AsyncLimit::no_limit()).await?;

// After dropping the corresponding guard, we can lock it again
std::mem::drop(guard);
let guard2 = lockable_cache.async_lock(5, AsyncLimit::no_limit()).await?;
```
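The introduction mentions building cache data structures on top of a key-value store. The sketch below shows one way to do that with the LRU cache from the example above: a populate-on-miss helper that holds the per-key lock while querying a slow backing store. It is a minimal sketch under a few assumptions: `load_from_store` is a hypothetical placeholder, the guard's `value()` accessor is assumed to exist alongside the `insert()` call shown above, and the lock result is `unwrap()`ed instead of propagated with `?`.

```rust
use lockable::{AsyncLimit, LockableLruCache};

// Hypothetical slow backend; stands in for a database or remote store.
async fn load_from_store(key: i64) -> String {
    format!("value for key {key}")
}

async fn get_or_load(cache: &LockableLruCache<i64, String>, key: i64) -> String {
    // Lock the key. Tasks asking for the same key wait here; tasks working
    // on other keys are unaffected.
    let mut guard = cache
        .async_lock(key, AsyncLimit::no_limit())
        .await
        .unwrap(); // assumption: the no-limit error type can be unwrapped
    if let Some(value) = guard.value() {
        // Cache hit: return the cached value (`value()` is assumed here).
        return value.clone();
    }
    // Cache miss: fetch while still holding the per-key lock, so concurrent
    // misses for the same key query the store only once.
    let value = load_from_store(key).await;
    guard.insert(value.clone());
    value
}
```

Because the lock is per key, misses for different keys can query the store concurrently, while a second task that raced on the same key finds the freshly inserted value once it acquires the lock.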
This example builds a simple lock pool using the `LockableHashMap` data structure. A lock pool is a pool of locks that can be locked and unlocked by key. In this example, the entries don't have a value assigned to them; the lock pool is only used to synchronize access to some keyed resource.

```rust
use lockable::{AsyncLimit, LockableHashMap};

let lockable_cache: LockableHashMap<i64, ()> = LockableHashMap::new();

// Hold locks on two different entries
let entry1 = lockable_cache
    .async_lock(4, AsyncLimit::no_limit())
    .await?;
let entry2 = lockable_cache
    .async_lock(5, AsyncLimit::no_limit())
    .await?;

// This next line would wait until the lock gets released, which in this case
// would cause a deadlock because we're on the same thread.
// let entry3 = lockable_cache.async_lock(4, AsyncLimit::no_limit()).await?;

// After dropping the corresponding guard, we can lock it again
std::mem::drop(entry1);
let entry3 = lockable_cache.async_lock(4, AsyncLimit::no_limit()).await?;
```
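The same lock pool can also cover the other use case from the introduction: synchronizing access to a keyed resource that lives outside the map. The sketch below is a minimal illustration; `read_resource` and `write_resource` are hypothetical stand-ins for whatever the keyed resource is (files, database rows, ...), and the lock result is `unwrap()`ed instead of propagated with `?`.

```rust
use lockable::{AsyncLimit, LockableHashMap};

// Hypothetical keyed resource; stands in for files, database rows, etc.
async fn read_resource(_key: i64) -> u64 {
    0
}
async fn write_resource(_key: i64, _value: u64) {}

// Read-modify-write one resource without racing other tasks that go through
// the same pool for the same key. Tasks touching other keys are not blocked.
async fn increment(pool: &LockableHashMap<i64, ()>, key: i64) {
    let _guard = pool
        .async_lock(key, AsyncLimit::no_limit())
        .await
        .unwrap(); // assumption: the no-limit error type can be unwrapped
    let current = read_resource(key).await;
    write_resource(key, current + 1).await;
    // `_guard` is dropped here, releasing the per-key lock.
}
```

Note that the guard only synchronizes tasks that acquire it through the same pool, so every code path touching the resource needs to go through the pool for the locking to be effective.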
License: MIT OR Apache-2.0