Customer Audit Exports

Provide your customers with automated, scheduled exports of their audit logs to their own S3 buckets. Exports are available in JSONL and CSV formats, with checksums and event counts for integrity verification.

Overview

Enterprise customers often require complete copies of their audit logs for compliance, data sovereignty, or backup purposes. Trailbase's export system automatically delivers audit data to customer-controlled storage on a schedule.

Export Configuration

Create Export Configuration

Set up an export destination for a customer:

// API route to configure customer export
export async function POST(req: Request) {
  const session = await requireSession();
  const body = await req.json();

  const exportConfig = await db.exportConfig.create({
    data: {
      tenantId: session.tenantId,
      destination: 'S3',
      s3Bucket: body.bucket,
      s3Region: body.region,
      s3KeyPrefix: body.keyPrefix || 'trailbase-audit-logs/',
      s3AccessKeyId: body.accessKeyId,
      s3SecretAccessKey: encrypt(body.secretAccessKey), // Encrypt secrets!
      format: body.format || 'JSONL', // 'JSONL' or 'CSV'
      schedule: body.schedule || 'daily', // 'daily', 'weekly', 'monthly'
      compression: body.compression || 'gzip',
      enabled: true,
    },
  });

  // Audit the export configuration
  await trailbase.log({
    action: 'export.configure',
    actor: { id: session.userId, email: session.userEmail },
    resource: { type: 'export_config', id: exportConfig.id },
    outcome: 'success',
    metadata: {
      destination: 'S3',
      bucket: body.bucket,
      format: body.format,
      schedule: body.schedule,
    },
  });

  return Response.json({ exportConfig });
}

Configuration Schema

// Database schema (Prisma)
model ExportConfig {
  id                 String   @id @default(cuid())
  tenantId           String
  destination        String   // 'S3', 'GCS', 'Azure'

  // S3 configuration
  s3Bucket           String?
  s3Region           String?
  s3KeyPrefix        String?
  s3AccessKeyId      String?
  s3SecretAccessKey  String?  // Encrypted

  // Export settings
  format             String   // 'JSONL', 'CSV'
  schedule           String   // 'daily', 'weekly', 'monthly'
  compression        String?  // 'gzip', 'none'
  enabled            Boolean  @default(true)

  // Metadata
  createdAt          DateTime @default(now())
  lastRunAt          DateTime?
  lastRunStatus      String?  // 'success', 'failure'

  exports            Export[]

  @@index([tenantId])
}

Export Formats

JSONL Format

Newline-delimited JSON, one event per line. Ideal for programmatic processing:

// Example JSONL output (each line is a complete JSON object)
{"event_id":"evt_123","event_time":"2026-02-10T14:30:00Z","action":"user.login","actor":{"id":"user_alice","email":"alice@acme.com"},"resource":{"type":"session","id":"sess_xyz"},"outcome":"success"}
{"event_id":"evt_124","event_time":"2026-02-10T14:31:00Z","action":"document.create","actor":{"id":"user_alice","email":"alice@acme.com"},"resource":{"type":"document","id":"doc_abc"},"outcome":"success"}
{"event_id":"evt_125","event_time":"2026-02-10T14:32:00Z","action":"document.share","actor":{"id":"user_alice","email":"alice@acme.com"},"resource":{"type":"document","id":"doc_abc"},"outcome":"success"}
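Consuming a JSONL export is a one-liner per line: split on newlines, skip blanks, and parse each line independently. A minimal sketch (the `AuditEvent` shape below is a subset of the fields shown above):

```typescript
// Parse a JSONL export into typed objects. Each line is a complete,
// independent JSON document, so a partial file still parses up to the
// last full line.
interface AuditEvent {
  event_id: string;
  event_time: string;
  action: string;
  outcome: string;
}

function parseJsonl(content: string): AuditEvent[] {
  return content
    .split('\n')
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as AuditEvent);
}

const sample =
  '{"event_id":"evt_123","event_time":"2026-02-10T14:30:00Z","action":"user.login","outcome":"success"}\n' +
  '{"event_id":"evt_124","event_time":"2026-02-10T14:31:00Z","action":"document.create","outcome":"success"}\n';

const events = parseJsonl(sample);
console.log(events.length); // 2
console.log(events[0].action); // user.login
```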

CSV Format

Comma-separated values for spreadsheet analysis:

// Example CSV output
event_id,event_time,action,actor_id,actor_email,resource_type,resource_id,outcome,metadata
evt_123,2026-02-10T14:30:00Z,user.login,user_alice,alice@acme.com,session,sess_xyz,success,"{}"
evt_124,2026-02-10T14:31:00Z,document.create,user_alice,alice@acme.com,document,doc_abc,success,"{}"
evt_125,2026-02-10T14:32:00Z,document.share,user_alice,alice@acme.com,document,doc_abc,success,"{""shared_with"":""bob@acme.com""}"
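Note the metadata column: per RFC 4180, a quoted CSV field escapes embedded quotes by doubling them, which is what keeps JSON metadata intact. A small helper sketch:

```typescript
// RFC 4180 field escaping: wrap in quotes, double any embedded quotes.
// This is what lets a JSON blob survive a round trip through CSV.
function escapeCsvField(value: string): string {
  return `"${value.replace(/"/g, '""')}"`;
}

const metadata = JSON.stringify({ shared_with: 'bob@acme.com' });
console.log(escapeCsvField(metadata));
// "{""shared_with"":""bob@acme.com""}"
```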

Export Worker

Scheduled Export Job

Run the export worker on a schedule using cron or a task queue:

// app/api/cron/export/route.ts
import { NextRequest } from 'next/server';
import { runExports } from '@/lib/export-worker';

export async function GET(req: NextRequest) {
  // Verify cron secret
  const authHeader = req.headers.get('authorization');
  if (authHeader !== `Bearer ${process.env.CRON_SECRET}`) {
    return new Response('Unauthorized', { status: 401 });
  }

  await runExports();

  return Response.json({ success: true });
}

// Schedule: 0 2 * * * (daily at 2 AM UTC)
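If you deploy on Vercel (an assumption; any scheduler that can hit the route with the bearer secret works), the schedule above can be declared in vercel.json:

```json
{
  "crons": [
    { "path": "/api/cron/export", "schedule": "0 2 * * *" }
  ]
}
```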

Export Worker Implementation

// lib/export-worker.ts
import crypto from 'crypto';
import { createGzip } from 'zlib';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { db } from './db';
import { decrypt } from './encryption'; // wherever your encrypt/decrypt helpers live

export async function runExports() {
  // Get all enabled export configurations
  const configs = await db.exportConfig.findMany({
    where: { enabled: true },
  });

  for (const config of configs) {
    try {
      await runExportForConfig(config);
    } catch (error) {
      console.error(`Export failed for config ${config.id}:`, error);
      await db.exportConfig.update({
        where: { id: config.id },
        data: {
          lastRunStatus: 'failure',
          lastRunAt: new Date(),
        },
      });
    }
  }
}

async function runExportForConfig(config: any) {
  // Calculate date range. Shown for the daily schedule; derive the
  // window from config.schedule for weekly/monthly exports.
  const endDate = new Date();
  const startDate = new Date(endDate);
  startDate.setDate(startDate.getDate() - 1); // Last 24 hours

  // Fetch events from database
  const events = await db.auditEvent.findMany({
    where: {
      tenantId: config.tenantId,
      eventTime: {
        gte: startDate,
        lt: endDate,
      },
    },
    orderBy: { eventTime: 'asc' },
  });

  // Format events (map DB fields to your export schema as needed)
  let content: string;
  if (config.format === 'JSONL') {
    content = events.map((e) => JSON.stringify(e)).join('\n');
  } else {
    content = formatAsCSV(events);
  }

  // Compress if configured
  let finalContent: Buffer;
  const filename = generateFilename(config, startDate, endDate);

  if (config.compression === 'gzip') {
    finalContent = await compress(Buffer.from(content, 'utf8'));
  } else {
    finalContent = Buffer.from(content, 'utf8');
  }

  // Calculate checksum
  const checksum = crypto
    .createHash('sha256')
    .update(finalContent)
    .digest('hex');

  // Upload to S3
  const s3 = new S3Client({
    region: config.s3Region,
    credentials: {
      accessKeyId: config.s3AccessKeyId,
      secretAccessKey: decrypt(config.s3SecretAccessKey),
    },
  });

  await s3.send(
    new PutObjectCommand({
      Bucket: config.s3Bucket,
      Key: `${config.s3KeyPrefix}${filename}`,
      Body: finalContent,
      ContentType: config.format === 'JSONL' ? 'application/x-ndjson' : 'text/csv',
      Metadata: {
        trailbase_tenant_id: config.tenantId,
        trailbase_checksum: checksum,
        trailbase_event_count: events.length.toString(),
        trailbase_start_date: startDate.toISOString(),
        trailbase_end_date: endDate.toISOString(),
      },
    })
  );

  // Record export in database
  await db.export.create({
    data: {
      exportConfigId: config.id,
      tenantId: config.tenantId,
      startDate,
      endDate,
      eventCount: events.length,
      format: config.format,
      filename,
      s3Key: `${config.s3KeyPrefix}${filename}`,
      checksum,
      status: 'completed',
      completedAt: new Date(),
    },
  });

  // Update config status
  await db.exportConfig.update({
    where: { id: config.id },
    data: {
      lastRunStatus: 'success',
      lastRunAt: new Date(),
    },
  });
}

function generateFilename(config: any, startDate: Date, endDate: Date): string {
  const dateStr = startDate.toISOString().split('T')[0];
  const extension = config.format === 'JSONL' ? 'jsonl' : 'csv';
  const compression = config.compression === 'gzip' ? '.gz' : '';
  return `audit-logs-${config.tenantId}-${dateStr}.${extension}${compression}`;
}

function formatAsCSV(events: any[]): string {
  const headers = [
    'event_id',
    'event_time',
    'action',
    'actor_id',
    'actor_email',
    'resource_type',
    'resource_id',
    'outcome',
    'metadata',
  ];

  const rows = events.map((e) => [
    e.eventId,
    e.eventTime.toISOString(),
    e.action,
    e.actor.id,
    e.actor.email,
    e.resource.type,
    e.resource.id,
    e.outcome,
    JSON.stringify(e.metadata || {}),
  ]);

  return [
    headers.join(','),
    ...rows.map((row) =>
      row.map((cell) => `"${String(cell).replace(/"/g, '""')}"`).join(',')
    ),
  ].join('\n');
}

async function compress(buffer: Buffer): Promise<Buffer> {
  const chunks: Buffer[] = [];
  const gzip = createGzip();

  gzip.on('data', (chunk) => chunks.push(chunk));

  return new Promise((resolve, reject) => {
    gzip.on('end', () => resolve(Buffer.concat(chunks)));
    gzip.on('error', reject);
    gzip.end(buffer);
  });
}
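The stream-based compress() above can be written more compactly with util.promisify; behavior is identical, so which you use is a style choice (sketch):

```typescript
import { promisify } from 'node:util';
import { gzip, gunzipSync } from 'node:zlib';

// Equivalent to the stream-based compress() helper, via the callback
// form of zlib.gzip wrapped in a promise.
const gzipAsync = promisify(gzip);

async function compressSimple(buffer: Buffer): Promise<Buffer> {
  return gzipAsync(buffer);
}

// Round-trip check: decompressing recovers the original bytes.
const original = Buffer.from('{"event_id":"evt_123"}', 'utf8');
compressSimple(original).then((compressed) => {
  console.log(gunzipSync(compressed).equals(original)); // true
});
```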

Export Verification

Checksum Verification

Customers can verify the integrity of exported files using the checksum:

// Customer-side verification script
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';
import crypto from 'crypto';
import { Readable } from 'stream';

// Collect an S3 response body stream into a single Buffer
async function streamToBuffer(stream: Readable): Promise<Buffer> {
  const chunks: Buffer[] = [];
  for await (const chunk of stream) {
    chunks.push(Buffer.from(chunk));
  }
  return Buffer.concat(chunks);
}

async function verifyExport(bucket: string, key: string) {
  const s3 = new S3Client({ region: 'us-east-1' });

  // Download file and metadata
  const response = await s3.send(
    new GetObjectCommand({ Bucket: bucket, Key: key })
  );

  const fileContent = await streamToBuffer(response.Body as Readable);
  const expectedChecksum = response.Metadata?.trailbase_checksum;

  // Calculate checksum
  const actualChecksum = crypto
    .createHash('sha256')
    .update(fileContent)
    .digest('hex');

  // Verify
  if (actualChecksum === expectedChecksum) {
    console.log('✓ Checksum verified - export is authentic');
    return true;
  } else {
    console.error('✗ Checksum mismatch - export may be corrupted');
    return false;
  }
}

Event Count Validation

// Verify event count matches metadata (decompress .gz exports first)
import { gunzipSync } from 'zlib';

async function validateEventCount(bucket: string, key: string) {
  const s3 = new S3Client({ region: 'us-east-1' });
  const response = await s3.send(
    new GetObjectCommand({ Bucket: bucket, Key: key })
  );

  let fileContent = await streamToBuffer(response.Body as Readable);
  if (key.endsWith('.gz')) {
    fileContent = gunzipSync(fileContent);
  }
  const expectedCount = parseInt(response.Metadata?.trailbase_event_count || '0', 10);

  // Count lines in JSONL (for CSV, subtract 1 for the header row)
  const actualCount = fileContent
    .toString('utf8')
    .split('\n')
    .filter((line) => line.trim()).length;

  console.log(`Expected: ${expectedCount}, Actual: ${actualCount}`);
  return expectedCount === actualCount;
}

Export Re-run

Backfill Historical Data

Re-run exports for a specific date range:

// POST /api/v1/exports/rerun
export async function POST(req: Request) {
  const session = await requireSession();
  const body = await req.json();

  const { startDate, endDate, format } = body;

  // Validate date range
  const start = new Date(startDate);
  const end = new Date(endDate);

  if (Number.isNaN(start.getTime()) || Number.isNaN(end.getTime()) || start >= end) {
    return Response.json(
      { error: 'Invalid date range' },
      { status: 400 }
    );
  }

  if (end > new Date()) {
    return Response.json(
      { error: 'End date cannot be in the future' },
      { status: 400 }
    );
  }

  // Queue export job
  await queue.publish('export', {
    tenantId: session.tenantId,
    startDate: start,
    endDate: end,
    format: format || 'JSONL',
    requestedBy: session.userId,
  });

  return Response.json({
    message: 'Export queued',
    estimatedCompletion: '5-10 minutes',
  });
}

Customer-Facing UI

Export Configuration Page

// app/tenant/[tenantId]/exports/configure/page.tsx
'use client';

import { useState } from 'react';

export default function ExportConfigurePage() {
  const [config, setConfig] = useState({
    bucket: '',
    region: 'us-east-1',
    accessKeyId: '',
    secretAccessKey: '',
    format: 'JSONL',
    schedule: 'daily',
  });

  async function handleSubmit(e: React.FormEvent) {
    e.preventDefault();

    const response = await fetch('/api/exports/configure', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(config),
    });

    if (response.ok) {
      alert('Export configured successfully!');
    }
  }

  return (
    <form onSubmit={handleSubmit}>
      <h1>Configure Audit Log Exports</h1>

      <label>
        S3 Bucket Name
        <input
          type="text"
          value={config.bucket}
          onChange={(e) => setConfig({ ...config, bucket: e.target.value })}
          placeholder="my-audit-logs-bucket"
          required
        />
      </label>

      <label>
        AWS Region
        <select
          value={config.region}
          onChange={(e) => setConfig({ ...config, region: e.target.value })}
        >
          <option value="us-east-1">US East (N. Virginia)</option>
          <option value="us-west-2">US West (Oregon)</option>
          <option value="eu-west-1">EU (Ireland)</option>
        </select>
      </label>

      <label>
        Access Key ID
        <input
          type="text"
          value={config.accessKeyId}
          onChange={(e) => setConfig({ ...config, accessKeyId: e.target.value })}
          required
        />
      </label>

      <label>
        Secret Access Key
        <input
          type="password"
          value={config.secretAccessKey}
          onChange={(e) => setConfig({ ...config, secretAccessKey: e.target.value })}
          required
        />
      </label>

      <label>
        Export Format
        <select
          value={config.format}
          onChange={(e) => setConfig({ ...config, format: e.target.value })}
        >
          <option value="JSONL">JSONL (Newline-Delimited JSON)</option>
          <option value="CSV">CSV (Comma-Separated Values)</option>
        </select>
      </label>

      <label>
        Schedule
        <select
          value={config.schedule}
          onChange={(e) => setConfig({ ...config, schedule: e.target.value })}
        >
          <option value="daily">Daily</option>
          <option value="weekly">Weekly</option>
          <option value="monthly">Monthly</option>
        </select>
      </label>

      <button type="submit">Save Configuration</button>
    </form>
  );
}

Monitoring Exports

Export History Dashboard

// Display recent exports
export default async function ExportsPage() {
  const session = await requireSession();

  const exports = await db.export.findMany({
    where: { tenantId: session.tenantId },
    orderBy: { completedAt: 'desc' },
    take: 30,
  });

  return (
    <div>
      <h1>Export History</h1>
      <table>
        <thead>
          <tr>
            <th>Date Range</th>
            <th>Events</th>
            <th>Format</th>
            <th>Status</th>
            <th>File</th>
          </tr>
        </thead>
        <tbody>
          {exports.map((exp) => (
            <tr key={exp.id}>
              <td>
                {exp.startDate.toLocaleDateString()} -{' '}
                {exp.endDate.toLocaleDateString()}
              </td>
              <td>{exp.eventCount.toLocaleString()}</td>
              <td>{exp.format}</td>
              <td>
                <span className={`status-${exp.status}`}>
                  {exp.status}
                </span>
              </td>
              <td>
                <code>{exp.filename}</code>
              </td>
            </tr>
          ))}
        </tbody>
      </table>
    </div>
  );
}

Best Practices

  • Use IAM roles when possible: Avoid storing access keys; use AWS IAM roles with AssumeRole
  • Encrypt credentials: Always encrypt S3 credentials before storing in your database
  • Set lifecycle policies: Configure S3 lifecycle rules to archive old exports to Glacier
  • Monitor export failures: Alert when exports fail to ensure continuity
  • Compress large exports: Use gzip to reduce storage costs and transfer time
  • Validate exports: Always include checksums and event counts in metadata
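The encrypt()/decrypt() helpers referenced in the configuration route and export worker are left to you. A minimal AES-256-GCM sketch (assuming a 32-byte hex key in an EXPORT_ENCRYPTION_KEY environment variable; prefer a KMS and key rotation in production):

```typescript
import crypto from 'node:crypto';

// AES-256-GCM helpers for the encrypt()/decrypt() calls used above.
// Assumes a 32-byte hex key in EXPORT_ENCRYPTION_KEY (assumption:
// the zero key below is a placeholder for testing only).
const KEY = Buffer.from(process.env.EXPORT_ENCRYPTION_KEY ?? '00'.repeat(32), 'hex');

export function encrypt(plaintext: string): string {
  const iv = crypto.randomBytes(12); // fresh IV per message
  const cipher = crypto.createCipheriv('aes-256-gcm', KEY, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  // Store IV + auth tag alongside the ciphertext so decrypt is self-contained.
  return [iv, cipher.getAuthTag(), ciphertext].map((b) => b.toString('base64')).join('.');
}

export function decrypt(encoded: string): string {
  const [iv, tag, ciphertext] = encoded.split('.').map((p) => Buffer.from(p, 'base64'));
  const decipher = crypto.createDecipheriv('aes-256-gcm', KEY, iv);
  decipher.setAuthTag(tag); // GCM authenticates as well as encrypts
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}
```

GCM is used here (rather than CBC) so tampering with stored credentials fails loudly at decrypt time instead of yielding garbage keys.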

S3 Bucket Policy

Customers must grant your exporter's IAM role write access to their S3 bucket. Provide them with a bucket policy template:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowTrailbaseExports",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::YOUR_ACCOUNT_ID:role/trailbase-exporter"
      },
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::CUSTOMER_BUCKET/trailbase-audit-logs/*"
    }
  ]
}

Next Steps

Explore the SDK Reference for complete API documentation and advanced usage patterns.
