Server
Start your API with serve(), deploy anywhere with handler(), enable HTTP/2, and auto-generate API docs.
Silgi gives you two ways to run your API: serve() for a quick Node.js server, and handler() for any runtime that supports the Fetch API (Bun, Deno, Cloudflare Workers, etc.).
serve()
The simplest way to get a server running. One call, and you have an HTTP server with automatic port selection, optional API docs, and WebSocket support:
```ts
import { silgi } from 'silgi'
import { z } from 'zod'

const s = silgi({
  context: () => ({ db: getDB() }),
})

const appRouter = s.router({
  health: s.$resolve(() => ({ status: 'ok' })),
})

s.serve(appRouter, {
  port: 3000,
})
```

Run it with `npx tsx src/server.ts` and you'll see:

```
Silgi server running at http://127.0.0.1:3000
```

All options
```ts
s.serve(appRouter, {
  port: 3000,          // auto-finds next available if busy
  hostname: '0.0.0.0', // bind to all interfaces
  scalar: true,        // API docs at /reference
  ws: true,            // WebSocket RPC on the same port
  http2: {
    // HTTP/2 with TLS
    cert: './certs/cert.pem',
    key: './certs/key.pem',
  },
})
```

| Option | Type | Default | Description |
|---|---|---|---|
| `port` | `number` | `3000` | Port to listen on. If it's taken, Silgi automatically picks the next free port in 3000-3100. |
| `hostname` | `string` | `"127.0.0.1"` | Network interface. Use `"0.0.0.0"` to accept connections from other machines. |
| `scalar` | `boolean \| ScalarOptions` | `false` | Enable auto-generated API docs at `/reference`. |
| `ws` | `boolean` | `false` | Enable WebSocket RPC on the same port. |
| `http2` | `{ cert, key }` | `undefined` | Enable HTTP/2 with TLS certificates. Falls back to HTTP/1.1 for older clients. |
Automatic port selection
If your requested port is already in use, serve() finds the next available one in the 3000-3100 range. You never need to handle EADDRINUSE yourself.
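The scan is simple to picture. Here is an illustrative sketch of the strategy (not Silgi's actual source; `pickPort` and `isFree` are hypothetical names):

```typescript
// Try each candidate port in [start, max]; return the first one the
// isFree predicate accepts, mirroring serve()'s documented fallback.
function pickPort(start: number, max: number, isFree: (port: number) => boolean): number {
  for (let port = start; port <= max; port++) {
    if (isFree(port)) return port
  }
  throw new Error(`No free port in ${start}-${max}`)
}

// With 3000 and 3001 busy, the server would land on 3002:
const busy = new Set([3000, 3001])
pickPort(3000, 3100, (port) => !busy.has(port)) // → 3002
```

In the real server the predicate is a bind attempt that catches EADDRINUSE; the sketch only shows the selection order.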
handler()
For runtimes other than Node.js, or when you want to plug Silgi into an existing server, use handler(). It returns a standard Fetch API function: (Request) => Promise<Response>.
```ts
const handle = s.handler(appRouter)
// handle is: (request: Request) => Promise<Response>
```

This works everywhere the Web Fetch API exists:

```ts
// Bun
const handle = s.handler(appRouter)

Bun.serve({
  port: 3000,
  fetch: handle,
})
```

```ts
// Deno
const handle = s.handler(appRouter)

Deno.serve({ port: 3000 }, handle)
```

```ts
// Cloudflare Workers
const handle = s.handler(appRouter)

export default {
  fetch: handle,
}
```

handler() supports content negotiation automatically. If the client sends an `Accept: application/x-msgpack` header, the response is encoded as MessagePack. Same for devalue. JSON is the default fallback.
Content negotiation
Both serve() and handler() inspect the Accept header and respond in the format the client prefers:
| Accept header | Response format | When to use |
|---|---|---|
| `application/json` (or none) | JSON | Default, works everywhere |
| `application/x-msgpack` | MessagePack | Smaller payloads, binary |
| `application/x-devalue+json` | devalue | Rich types (Date, Map, Set, BigInt) |
The server also checks Content-Type on incoming POST requests to decode the body correctly. No configuration needed on the server side — it all happens automatically.
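The selection rule can be sketched as a standalone function (an illustration of the documented behavior, not Silgi's source; `negotiate` is a hypothetical name):

```typescript
type Format = 'json' | 'msgpack' | 'devalue'

// Map an incoming Accept header to the response encoding.
// JSON is the default fallback when no known format is requested.
function negotiate(accept: string | null): Format {
  if (accept?.includes('application/x-msgpack')) return 'msgpack'
  if (accept?.includes('application/x-devalue+json')) return 'devalue'
  return 'json'
}

negotiate('application/x-msgpack') // → 'msgpack'
negotiate(null)                    // → 'json' (default)
```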
Scalar API docs
Silgi can generate an OpenAPI 3.1.0 spec from your router and serve an interactive API reference powered by Scalar:
```ts
s.serve(appRouter, {
  port: 3000,
  scalar: true,
})
```

This gives you two extra routes:

- `/reference` — interactive API documentation UI
- `/openapi.json` — the raw OpenAPI specification
The spec is generated once at startup, so there is no per-request cost.
Customizing the docs
Pass an options object instead of true:
```ts
s.serve(appRouter, {
  scalar: {
    title: 'My API',
    description: 'Built with Silgi',
    version: '2.0.0',
    servers: [{ url: 'https://api.example.com' }],
    contact: { email: '[email protected]' },
    security: { type: 'http', scheme: 'bearer', bearerFormat: 'JWT' },
  },
})
```

Scalar UI source
By default, the Scalar UI JavaScript is loaded from a CDN. You can change this with the cdn option:
```ts
s.serve(appRouter, {
  scalar: {
    title: 'My API',
    cdn: 'local', // serve from node_modules (offline)
  },
})
```

| Value | Description |
|---|---|
| `'cdn'` (default) | Load from cdn.jsdelivr.net |
| `'unpkg'` | Load from unpkg.com |
| `'local'` | Serve from node_modules — no external requests, fully offline |
| Custom URL | Any URL string, e.g. `'/assets/scalar.js'` for self-hosting |
The 'local' option requires @scalar/api-reference as a dependency. If the package is not found, Silgi falls back
to CDN with a warning.
```bash
# pnpm
pnpm add @scalar/api-reference

# npm
npm install @scalar/api-reference

# bun
bun add @scalar/api-reference
```

Adding metadata to procedures
The generated docs pull information from your procedure definitions. Add route metadata to make the docs more useful:
```ts
const listUsers = s
  .$input(z.object({ limit: z.number().optional() }))
  .$route({
    summary: 'List all users',
    description: 'Returns a paginated list of users, ordered by creation date.',
    tags: ['users'],
  })
  .$resolve(({ ctx, input }) => ctx.db.users.findMany({ take: input.limit }))
```

The route field accepts:
| Property | Description |
|---|---|
| `summary` | Short description shown in the endpoint list |
| `description` | Longer description shown in the detail view |
| `tags` | Group endpoints by tag |
| `deprecated` | Mark an endpoint as deprecated |
| `successStatus` | Override the default 200 status |
| `successDescription` | Describe what a successful response looks like |
| `cache` | Cache-Control header for GET procedure responses |
Caching
Add a cache option to route to set Cache-Control headers on query responses. This enables HTTP-level caching by browsers, CDNs, and reverse proxies.
```ts
// Shorthand: cache for 60 seconds
const listUsers = s.$route({ cache: 60 }).$resolve(({ ctx }) => ctx.db.users.findMany())

// Full control: custom Cache-Control value
const getConfig = s
  .$route({ cache: 'public, max-age=300, stale-while-revalidate=60' })
  .$resolve(() => loadConfig())
```

| Value | Behavior |
|---|---|
| `number` | Sets `Cache-Control: public, max-age=N` (seconds) |
| `string` | Sets the exact `Cache-Control` header value |
The cache header is compiled once at startup and added to every response for that procedure — zero per-request overhead.
Caching only applies to GET procedures (`.$route({ method: 'GET' })`). POST procedures and subscriptions never get Cache-Control headers, even if you set `cache` on them.
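The number/string mapping from the table can be written out as a one-liner (an illustration of the documented behavior, not Silgi's source; `cacheControl` is a hypothetical name):

```typescript
// A number N becomes `public, max-age=N`; a string is used verbatim.
function cacheControl(cache: number | string): string {
  return typeof cache === 'number' ? `public, max-age=${cache}` : cache
}

cacheControl(60)         // → 'public, max-age=60'
cacheControl('no-store') // → 'no-store'
```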
HTTP/2
For production deployments that benefit from multiplexing and header compression, pass TLS certificates:
```ts
s.serve(appRouter, {
  port: 443,
  http2: {
    cert: './certs/cert.pem',
    key: './certs/key.pem',
  },
})
```

Output:

```
Silgi server running at https://127.0.0.1:443
HTTP/2 enabled (with HTTP/1.1 fallback)
```

HTTP/2 requires TLS. Older clients that don't support HTTP/2 automatically fall back to HTTP/1.1 on the same port.
For local development, generate self-signed certificates with mkcert. For production, use certificates from your hosting provider or Let's Encrypt.
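Assuming mkcert is installed, a typical invocation that writes the files where the `http2` example above expects them looks like this:

```shell
# One-time: install the local CA into the system trust store
mkcert -install

# Generate a cert/key pair for localhost
mkdir -p certs
mkcert -cert-file certs/cert.pem -key-file certs/key.pem localhost 127.0.0.1
```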
WebSocket support
Enable bidirectional RPC alongside the HTTP server:
```ts
s.serve(appRouter, {
  port: 3000,
  ws: true,
})
```

```
Silgi server running at http://127.0.0.1:3000
WebSocket RPC at ws://127.0.0.1:3000
```

HTTP and WebSocket share the same port. The server detects WebSocket upgrade requests and routes them accordingly. See the WebSocket protocol page for the message format and how to connect from the client.
Lifecycle hooks
Silgi fires hooks at key points during request processing. Register them in the hooks option when creating the instance:
```ts
const s = silgi({
  context: () => ({ db: getDB() }),
  hooks: {
    request: ({ path, input }) => {
      console.log(`--> ${path}`)
    },
    response: ({ path, output, durationMs }) => {
      console.log(`<-- ${path} (${durationMs.toFixed(1)}ms)`)
    },
    error: ({ path, error }) => {
      console.error(`ERR ${path}:`, error)
    },
    'serve:start': ({ url, port, hostname }) => {
      console.log(`Server ready at ${url}`)
    },
  },
})
```

You can also add or remove hooks after the instance is created:

```ts
// Add a hook
s.hook('error', ({ error }) => reportToSentry(error))

// Remove a hook
s.removeHook('error', myHookFn)
```

| Hook | When it fires | Payload |
|---|---|---|
| `request` | Before a request is processed | `{ path, input }` |
| `response` | After a successful response | `{ path, output, durationMs }` |
| `error` | When any error occurs | `{ path, error }` |
| `serve:start` | When the server starts listening | `{ url, port, hostname }` |
Hooks are for side effects like logging and metrics. They don't modify the request or response. For that, use guards and wraps.
Raw Response and binary streaming
Procedures can return a Response or ReadableStream for full control over the HTTP response — useful for file downloads, PDF generation, image serving, and binary data.
Returning a Response
Return a standard Response object for complete control over status, headers, and body:
```ts
const downloadReport = s.$input(z.object({ id: z.string() })).$resolve(async ({ ctx, input }) => {
  const file = await ctx.storage.getFile(input.id)
  return new Response(file, {
    headers: {
      'content-type': 'application/pdf',
      'content-disposition': `attachment; filename="${input.id}.pdf"`,
    },
  })
})
```

Returning a ReadableStream
Return a ReadableStream for binary streaming. Silgi sends it with application/octet-stream:
```ts
const exportUsers = s.$resolve(async ({ ctx }) => {
  return new ReadableStream({
    async start(controller) {
      controller.enqueue(new TextEncoder().encode('id,name\n'))
      for await (const user of ctx.db.users.cursor()) {
        controller.enqueue(new TextEncoder().encode(`${user.id},${user.name}\n`))
      }
      controller.close()
    },
  })
})
```

When a procedure returns a Response or ReadableStream, content negotiation and JSON serialization are skipped entirely. The response is passed through as-is.
callable()
For server-side code where you need to call a procedure directly — without HTTP, serialization, or the client proxy. Useful in scripts, cron jobs, and seed files.
```ts
import { callable } from 'silgi'

const listUsers = s
  .$input(z.object({ limit: z.number().optional() }))
  .$resolve(({ ctx, input }) => ctx.db.users.findMany({ take: input.limit }))

const listUsersFn = callable(listUsers, {
  context: () => ({ db: getDB() }),
})

// Direct call — compiled pipeline, no HTTP
const users = await listUsersFn({ limit: 10 })
```

The compiled pipeline (guards, wraps, validation) runs exactly as it would through serve() or handler(). The only difference is there's no network involved.
What's next?
- Client — connect to your server from the browser or another service
- Middleware — guards and wraps that run inside your request pipeline
- Fastify integration — mount Silgi inside an existing Fastify app
- Plugins — add CORS, logging, rate limiting, and tracing
- Protocols — learn about JSON, MessagePack, devalue, and WebSocket