Vercel Integration
Serverless function traces, edge middleware latency, cold start analysis, and deploy event correlation for Vercel. Full Next.js RSC and SSR instrumentation included.
How It Works
Install @tigerops/node
Add @tigerops/node to your Next.js or serverless function project. The SDK includes a Vercel-specific flush mode that ensures traces are exported before the function response is returned.
Configure vercel.json
Add TIGEROPS_API_KEY to your Vercel environment variables. The SDK auto-detects the Vercel runtime environment and configures OTLP export with synchronous flush for cold-start-safe tracing.
Enable Vercel Log Drains
Connect the TigerOps Vercel integration from the Vercel marketplace. This sets up a log drain to forward all function logs, edge runtime logs, and build logs to TigerOps.
Track Deployments
TigerOps automatically receives Vercel deployment webhooks. Every production deployment creates an event marker on your metric charts, correlating deploys with performance changes.
What You Get Out of the Box
Serverless Function Traces
Cold start duration, invocation count, execution time, and memory usage per serverless function. TigerOps traces the full request lifecycle including external API calls and database queries.
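The SDK records this automatically once initialized, but the shape of the data is easy to picture. The sketch below shows the idea with a hypothetical `traced` helper (not a TigerOps API) that times an external call and records a span:

```typescript
// Sketch: how per-call timing is captured around an external dependency.
// `traced` is a hypothetical helper for illustration only — the TigerOps SDK
// records equivalent spans automatically after init().
type SpanRecord = { name: string; durationMs: number }

const spans: SpanRecord[] = []

async function traced<T>(name: string, fn: () => Promise<T>): Promise<T> {
  const start = performance.now()
  try {
    return await fn()
  } finally {
    // Record the span even if fn() throws, so failed calls are visible too
    spans.push({ name, durationMs: performance.now() - start })
  }
}

// Usage inside an API route handler: the database call becomes a child
// span of the request trace
async function handler() {
  const user = await traced('db.query.users', async () => ({ id: 1 }))
  return user
}
```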
Edge Middleware Latency
Request processing time in Vercel Edge Middleware, matcher pattern hit rates, and redirect/rewrite counts. Identify which middleware rules add the most latency.
Cold Start Analysis
Cold start frequency and duration per function and per region. TigerOps correlates cold starts with traffic patterns and suggests warm-up strategies for latency-sensitive routes.
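Cold starts are conventionally detected with a module-scope flag: module initialization runs once per function instance, so the first invocation on a fresh instance sees the flag unset. A minimal sketch of that mechanism (illustrative, not SDK code):

```typescript
// Sketch: module-scope state survives across warm invocations of the same
// instance, so the first call after initialization is the cold start.
let warm = false

function invocationKind(): 'cold' | 'warm' {
  if (!warm) {
    warm = true // subsequent invocations on this instance are warm
    return 'cold'
  }
  return 'warm'
}
```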
Next.js SSR & RSC Traces
Server-side rendering time, React Server Component execution, data fetch duration, and streaming chunk timing for Next.js App Router and Pages Router applications.
Build & Deploy Metrics
Build duration, bundle size trends, and function count per deployment. TigerOps alerts when build times regress significantly or bundle sizes exceed your configured limits.
Log Drain Ingestion
All Vercel function logs, edge runtime logs, and build logs are ingested and indexed by TigerOps. Correlate log errors with traces and metric anomalies in one unified view.
vercel.json + OTel Config
Configure TigerOps for your Vercel deployment with environment variables and the SDK initializer.
// vercel.json — environment variables for all functions
{
  "env": {
    "TIGEROPS_SERVICE_NAME": "my-nextjs-app",
    "TIGEROPS_ENVIRONMENT": "production"
  },
  "build": {
    "env": {
      "TIGEROPS_SERVICE_NAME": "my-nextjs-app"
    }
  }
}
// Set TIGEROPS_API_KEY in Vercel dashboard → Settings → Environment Variables
---
// instrumentation.ts — Next.js App Router (src/ or root)
// This file is automatically loaded by Next.js before your app code
export async function register() {
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    const { init } = await import('@tigerops/node')
    init({
      serviceName: process.env.TIGEROPS_SERVICE_NAME ?? 'nextjs-app',
      // Vercel serverless mode: flush traces before response
      flushOnExit: true,
    })
  }
  if (process.env.NEXT_RUNTIME === 'edge') {
    const { initEdge } = await import('@tigerops/node/edge')
    initEdge({
      serviceName: process.env.TIGEROPS_SERVICE_NAME ?? 'nextjs-edge',
    })
  }
}
---
// middleware.ts — trace edge middleware execution time
import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'
import { withTigerOps } from '@tigerops/node/edge'
export const middleware = withTigerOps(async (request: NextRequest) => {
  // Your middleware logic here
  return NextResponse.next()
})

export const config = {
  matcher: ['/((?!_next/static|favicon.ico).*)'],
}
Common Questions
Does TigerOps work with Vercel Edge Functions and the Edge Runtime?
Yes. TigerOps uses a lightweight OTLP/HTTP exporter (not gRPC) that is compatible with the Vercel Edge Runtime. The edge-compatible export path uses fetch() for telemetry export, which is available in all Edge Runtime environments.
How does TigerOps handle the cold start penalty when exporting traces?
TigerOps uses synchronous flush mode on Vercel: traces are exported before the function response is sent, using waitUntil() in the Edge Runtime and an awaited exporter flush in Node.js functions. This adds minimal latency (typically under 10ms) while ensuring all traces are captured before the runtime freezes the instance.
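The flush-before-response pattern can be sketched as a wrapper that branches on runtime capability. Here `flush` stands in for the SDK's exporter flush; the wrapper itself is illustrative, not a TigerOps API:

```typescript
// Sketch: flush telemetry without dropping it when the instance freezes.
// Edge Runtime: ctx.waitUntil() lets export finish after the response is
// sent, without blocking it. Node.js: the flush is awaited directly.
type Ctx = { waitUntil?: (p: Promise<unknown>) => void }

async function respondWithFlush(
  handler: () => Promise<string>,
  flush: () => Promise<void>,
  ctx: Ctx = {},
): Promise<string> {
  const response = await handler()
  if (ctx.waitUntil) {
    ctx.waitUntil(flush()) // Edge: non-blocking, runtime keeps instance alive
  } else {
    await flush() // Node.js: block briefly until the export completes
  }
  return response
}
```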
Can I monitor Vercel preview deployments separately from production?
Yes. TigerOps reads the VERCEL_ENV environment variable and tags all telemetry with the deployment environment (production, preview, development). You can filter dashboards and alerts by environment to monitor preview deployments independently.
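The tagging itself is a small mapping from VERCEL_ENV to a telemetry attribute. A sketch (the attribute key is an illustrative choice, and unknown values fall back to "development" in this sketch only):

```typescript
// Sketch: tagging telemetry with Vercel's deployment environment.
// VERCEL_ENV is set by Vercel to "production", "preview", or "development".
function environmentTag(
  env: string | undefined = process.env.VERCEL_ENV,
): Record<string, string> {
  const allowed = ['production', 'preview', 'development']
  return {
    // Attribute key is a hypothetical example, not a documented SDK name
    'deployment.environment': allowed.includes(env ?? '') ? env! : 'development',
  }
}
```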
Does TigerOps integrate with Vercel Speed Insights and Web Analytics?
TigerOps complements Vercel Speed Insights by adding server-side trace data, API route performance, and backend dependency monitoring. Vercel Speed Insights focuses on client-side Core Web Vitals; TigerOps covers the full request path including database and external API calls.
Can I use TigerOps with non-Next.js Vercel deployments?
Yes. Any Node.js or Edge Runtime function deployed to Vercel can use @tigerops/node. Framework-specific auto-instrumentation is available for Next.js, Remix, SvelteKit, Astro, and bare serverless functions.
Full Observability for Your Vercel Deployments
Cold start analysis, edge middleware traces, and deploy correlation. Works with Next.js out of the box.