
Cache

Production-grade response caching with TTL, SWR, request deduplication, and pluggable backends.

Cache query results in memory, Redis, Cloudflare KV, or any storage backend. Powered by ocache with built-in TTL, stale-while-revalidate, and request deduplication.

Basic usage

Wrap a procedure with cacheQuery() to cache its results:

import { cacheQuery } from 'silgi/cache'

const listUsers = s.$use(cacheQuery({ maxAge: 60 })).$resolve(({ ctx }) => ctx.db.users.findMany())

The first call runs the resolver and caches the result. Subsequent calls with the same input return the cached value for 60 seconds.

Options

cacheQuery({
  maxAge: 60, // TTL in seconds (default: 60)
  swr: true, // stale-while-revalidate (default: true)
  staleMaxAge: 60, // max stale age in seconds (default: maxAge)
  name: 'users_list', // cache key prefix (default: auto-generated)
  getKey: (input) => input.id, // custom cache key from input
})
Option                 Type                 Default        Description
maxAge                 number               60             Cache TTL in seconds
swr                    boolean              true           Return stale data while revalidating in the background
staleMaxAge            number               maxAge         Maximum seconds to serve stale data during SWR
name                   string               auto           Cache key prefix for invalidation
getKey                 (input) => string    hash           Custom cache key generator
shouldBypassCache      (input) => boolean   -              Skip cache entirely (e.g. admin, debug)
shouldInvalidateCache  (input) => boolean   -              Force re-resolve and update cache
validate               (entry) => boolean   -              Return false to treat entry as miss
transform              (entry) => T         -              Transform cached value before returning
base                   string | string[]    '/cache'       Multi-tier storage prefixes
integrity              string               auto           Custom integrity hash (auto-invalidates on redeploy)
onError                (error) => void      console.error  Error handler for cache read/write failures
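
To see how maxAge and staleMaxAge combine, here is a small freshness check. It illustrates one plausible reading of the semantics (the stale window starting where maxAge ends) and is not silgi's internal code:

```typescript
// Illustration only, not silgi's internals: entries younger than maxAge
// are fresh; entries past maxAge but within the staleMaxAge window can be
// served stale while a background revalidation runs; anything older is a miss.
type Freshness = 'fresh' | 'stale' | 'expired'

function classify(ageSeconds: number, maxAge: number, staleMaxAge: number): Freshness {
  if (ageSeconds <= maxAge) return 'fresh'
  if (ageSeconds <= maxAge + staleMaxAge) return 'stale' // serve + revalidate
  return 'expired' // treated as a cache miss
}
```

With the defaults (maxAge: 60, staleMaxAge: 60), a 90-second-old entry would still be served stale while the resolver re-runs in the background.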

Bypass and conditional invalidation

Skip the cache entirely for specific requests, or force a refresh:

import { cacheQuery } from 'silgi/cache'

const listUsers = s
  .$use(
    cacheQuery({
      maxAge: 300,
      // Admin users bypass cache
      shouldBypassCache: (input) => (input as any)?.asAdmin === true,
      // Refresh flag forces re-resolve
      shouldInvalidateCache: (input) => (input as any)?.refresh === true,
      // Only cache non-empty arrays
      validate: (entry) => Array.isArray(entry.value) && entry.value.length > 0,
    }),
  )
  .$resolve(({ ctx }) => ctx.db.users.findMany())

Invalidation

Invalidate cached queries by name — typically after a mutation:

import { cacheQuery, invalidateQueryCache } from 'silgi/cache'

const listUsers = s.$use(cacheQuery({ maxAge: 300, name: 'users_list' })).$resolve(({ ctx }) => ctx.db.users.findMany())

const createUser = s.$resolve(async ({ ctx, input }) => {
  const user = await ctx.db.users.create(input)
  await invalidateQueryCache('users_list')
  return user
})

Storage backends

By default, cache lives in memory. For production, plug in any storage backend via unstorage (included as a dependency).

Configure it once at server startup — typically in your server.ts:

Redis

server.ts
import { cacheQuery, setCacheStorage } from 'silgi/cache'
import { createStorage } from 'silgi/unstorage'
import redisDriver from 'unstorage/drivers/redis'

// Set up Redis cache before creating the handler
const storage = createStorage({
  driver: redisDriver({ url: 'redis://localhost:6379' }),
})
setCacheStorage(storage)

// Then create your server
const s = silgi({ context: () => ({ db }) })
const app = s.router({ ... })
export default { fetch: app.handler() }

Cloudflare KV

server.ts
import { cacheQuery, setCacheStorage } from 'silgi/cache'
import { createStorage } from 'silgi/unstorage'
import cloudflareKVBindingDriver from 'unstorage/drivers/cloudflare-kv-binding'

const storage = createStorage({
  driver: cloudflareKVBindingDriver({ binding: 'MY_KV' }),
})
setCacheStorage(storage)

Custom storage

Any object with get and set methods works:

import { setCacheStorage } from 'silgi/cache'

setCacheStorage({
  get: (key) => myStore.get(key),
  set: (key, value, opts) => myStore.set(key, value, opts?.ttl),
})
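
For reference, the myStore above could be as simple as an in-memory map with per-entry expiry. This is a hypothetical sketch, not part of silgi:

```typescript
// Minimal in-memory store with TTL — a sketch of a custom backend,
// not part of silgi. Values expire after the given TTL (in seconds).
class TtlStore {
  private entries = new Map<string, { value: unknown; expiresAt: number }>()

  get(key: string): unknown {
    const entry = this.entries.get(key)
    if (!entry) return undefined
    if (Date.now() > entry.expiresAt) {
      // Lazily evict expired entries on read
      this.entries.delete(key)
      return undefined
    }
    return entry.value
  }

  set(key: string, value: unknown, ttlSeconds = 60): void {
    this.entries.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 })
  }
}

const myStore = new TtlStore()
```

In production you would point get and set at Redis, KV, or another shared store instead, so cache entries survive restarts and are shared across instances.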

HTTP Cache vs Server Cache

Silgi has two caching mechanisms — they serve different purposes:

Feature       $route({ cache })       cacheQuery()
Where         Browser / CDN           Server memory / Redis / KV
How           Cache-Control header    ocache (in-process)
Input-aware   No                      Yes (hashes input)
SWR           CDN-level               Built-in
Invalidation  TTL only                invalidateQueryCache()
Use case      Public, static data     DB queries, expensive compute

Use both together for maximum performance:

const listUsers = s
  .$route({ cache: 60 })
  .$use(cacheQuery({ maxAge: 300, name: 'users' }))
  .$resolve(({ ctx }) => ctx.db.users.findMany())

How it works

  1. cacheQuery() wraps the procedure as onion middleware
  2. On each call, input is hashed to produce a cache key
  3. Cache hit → return immediately (zero resolver overhead)
  4. Cache miss → run resolver, store result, return
  5. With SWR → return stale data immediately, revalidate in background
  6. Concurrent identical requests share one in-flight promise (deduplication)
  7. Cache entries auto-expire after maxAge seconds
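
Step 6, request deduplication, can be sketched as a shared map of in-flight promises. This illustrates the technique only and is not silgi's actual implementation:

```typescript
// Sketch of request deduplication: concurrent calls with the same cache
// key share one in-flight promise, so the resolver runs once per key
// at a time.
const inFlight = new Map<string, Promise<unknown>>()

function dedupe<T>(key: string, run: () => Promise<T>): Promise<T> {
  const existing = inFlight.get(key)
  if (existing) return existing as Promise<T>
  // Remove the entry once settled so later calls trigger a fresh run
  const promise = run().finally(() => inFlight.delete(key))
  inFlight.set(key, promise)
  return promise
}
```

Both concurrent callers receive the same promise, so a burst of identical requests on a cache miss hits the resolver exactly once.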

The cache key is computed from the serialized input using ohash. Two calls with the same input always produce the same key, regardless of property order.
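
The property-order independence can be illustrated with a hand-rolled stable serializer. silgi itself uses ohash for this; the sketch below only shows the underlying idea of sorting keys before serializing:

```typescript
// Sketch of order-independent keying (silgi uses ohash, not this code):
// serialize objects with sorted keys so { a, b } and { b, a } produce
// identical strings, and therefore identical cache keys.
function stableKey(input: unknown): string {
  if (Array.isArray(input)) return `[${input.map(stableKey).join(',')}]`
  if (input !== null && typeof input === 'object') {
    const entries = Object.keys(input as object)
      .sort()
      .map((k) => `${JSON.stringify(k)}:${stableKey((input as Record<string, unknown>)[k])}`)
    return `{${entries.join(',')}}`
  }
  return JSON.stringify(input)
}
```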

What's next?

  • Rate Limiting — protect endpoints from abuse
  • Server — HTTP-level Cache-Control headers via route.cache
  • Plugins — all available plugins
