Writing a Database Log Handler for Laravel


Log files are great for tailing and grepping during development. In production, on a multi-tenant app, they fall apart fast. When a customer opens a support ticket saying something broke Tuesday night, you're running grep company_47 against a 200MB file, then piping that into another grep for the right time window, then parsing the output by eye. If your app handles multiple companies with overlapping activity, that process is miserable.

The fix is writing logs to the database. Not instead of files, necessarily, but in addition to them. Once logs are in a table, you can query by user, company, date range, severity, or any combination. You can join against your other tables. You can build an admin UI on top of it. grep does not scale for "show me everything that touched company 47 between midnight and 3 AM on Tuesday" and SQL does.

The Migration

The table is intentionally minimal. You want to store just enough to filter and correlate, without turning it into a second application table.

Schema::create('debug_logs', function (Blueprint $table) {
    $table->id();
    $table->foreignId('company_id')->nullable()->constrained()->nullOnDelete();
    $table->foreignId('user_id')->nullable()->constrained()->nullOnDelete();
    $table->foreignId('integration_id')->nullable()->constrained()->nullOnDelete();
    $table->string('level', 20);
    $table->string('type', 60)->default('general_log');
    $table->text('message');
    $table->json('context')->nullable();
    $table->json('extra')->nullable();
    $table->timestamp('created_at')->useCurrent();
});

Notice there's no updated_at. Logs are immutable. Once written, they don't change, so the column would just waste space and never hold meaningful data.

Index everything you'll filter on:

$table->index('level');
$table->index('type');
$table->index('company_id');
$table->index('user_id');
$table->index('integration_id');
$table->index('created_at');
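If your hottest query shape is always "one company over a time window," a composite index can serve it in a single lookup. This is an optional addition, not part of the original schema; on MySQL and Postgres the leftmost column of a composite index also covers queries that filter on company_id alone:

$table->index(['company_id', 'created_at']);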

The type column is how I categorize logs beyond severity. Things like webhook_received, payment_processed, sync_failed. It lets you ask "show me all failed syncs for company 47 this week" without digging through noise from other log types.
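With the DebugLog model from the next section in place, that question becomes a direct query. The 'sync_failed' value here is illustrative; it's whatever string your code writes into type:

DebugLog::query()
    ->where('company_id', 47)
    ->where('type', 'sync_failed')
    ->where('created_at', '>=', now()->startOfWeek())
    ->get();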

The Model

Nothing complicated here. The main things to get right are turning off updated_at and casting the JSON columns so they come back as arrays.

<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Relations\BelongsTo;

class DebugLog extends Model
{
    public $timestamps = false;

    protected $fillable = [
        'company_id',
        'user_id',
        'integration_id',
        'level',
        'type',
        'message',
        'context',
        'extra',
        'created_at',
    ];

    protected $casts = [
        'context' => 'array',
        'extra' => 'array',
        'created_at' => 'datetime',
    ];

    public function company(): BelongsTo
    {
        return $this->belongsTo(Company::class);
    }

    public function user(): BelongsTo
    {
        return $this->belongsTo(User::class);
    }
}

The Handler

Monolog's AbstractProcessingHandler is the right base class. It takes care of level filtering and running processors, and asks you to implement only write().

<?php

namespace App\Logging;

use App\Models\DebugLog;
use Monolog\Handler\AbstractProcessingHandler;
use Monolog\LogRecord;

class DatabaseLogger extends AbstractProcessingHandler
{
    protected array $context = [];
    protected array $extra = [];

    protected function write(LogRecord $record): void
    {
        $this->context = $record->context;
        $this->extra = $record->extra;

        DebugLog::create([
            'company_id' => $this->getCompanyId(),
            'integration_id' => $this->getIntegrationId(),
            'user_id' => $this->getUserId(),
            'level' => $record->level->getName(),
            'type' => $this->getType(),
            'message' => $record->message,
            'context' => $this->context ?: null,
            'extra' => $this->extra ?: null,
        ]);
    }
}

The critical ordering detail: $this->context and $this->extra are assigned before calling any of the get*() methods. Each of those methods mutates both arrays by unsetting keys as it pulls values out. By the time the DebugLog::create() call runs, $this->context and $this->extra contain only the leftovers that weren't claimed as structured fields.
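To make that concrete, here is a sketch of one call and what lands where. The 'payment_failed' and 'gateway' keys are illustrative, not part of the handler:

Log::channel('database')->info('Payment failed', [
    'type' => 'payment_failed', // claimed by getType()
    'company_id' => 47,         // claimed by getCompanyId()
    'gateway' => 'stripe',      // unclaimed, stays behind
]);

// Stored row: level = INFO, type = 'payment_failed', company_id = 47,
// context = {"gateway": "stripe"}. Only the unclaimed key survives.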

Extracting Context

The getType() method is simple:

protected function getType(): string
{
    $type = $this->context['type'] ?? $this->extra['type'] ?? 'general_log';

    unset($this->context['type'], $this->extra['type']);

    return $type;
}

getModelId() is the flexible one. It handles three cases: you passed an integer ID directly, you passed an Eloquent model instance, or you passed nothing and it should return null.

protected function getModelId(string $name): ?int
{
    $idKey = "{$name}_id";

    if (isset($this->context[$idKey])) {
        $id = $this->context[$idKey];
        unset($this->context[$idKey]);
        return (int) $id;
    }

    if (isset($this->extra[$idKey])) {
        $id = $this->extra[$idKey];
        unset($this->extra[$idKey]);
        return (int) $id;
    }

    if (isset($this->context[$name]) && is_object($this->context[$name])) {
        $id = $this->context[$name]->id;
        unset($this->context[$name]);
        return (int) $id;
    }

    if (isset($this->extra[$name]) && is_object($this->extra[$name])) {
        $id = $this->extra[$name]->id;
        unset($this->extra[$name]);
        return (int) $id;
    }

    return null;
}

The per-field methods build on top of getModelId() and add fallbacks for when nothing was explicitly passed:

protected function getCompanyId(): ?int
{
    return $this->getModelId('company') ?? auth()->user()?->company_id;
}

protected function getIntegrationId(): ?int
{
    return $this->getModelId('integration');
}

protected function getUserId(): ?int
{
    return $this->getModelId('user') ?? auth()->id();
}

getCompanyId() is where your tenant resolution lives. If your app resolves the current company differently (a middleware-set property, a config value, a scope), swap out auth()->user()?->company_id for whatever makes sense. The point is that if you don't explicitly pass a company, it falls back to the authenticated user's company automatically. Most log calls don't need to specify a company because they're already running in that context.
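For example, if a middleware resolves the tenant and binds it into the container, the fallback might read like this. The 'tenant' binding name is an assumption for illustration, not something Laravel provides:

protected function getCompanyId(): ?int
{
    $explicit = $this->getModelId('company');

    if ($explicit !== null) {
        return $explicit;
    }

    // Hypothetical container binding set by tenant-resolution middleware.
    return app()->bound('tenant') ? app('tenant')->id : null;
}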

Registering the Channel

In config/logging.php, add a new channel:

'database' => [
    'driver' => 'monolog',
    'handler' => \App\Logging\DatabaseLogger::class,
    'level' => 'debug',
],

If you want both file and database logging at the same time, wrap them in a stack:

'stack' => [
    'driver' => 'stack',
    'channels' => ['daily', 'database'],
    'ignore_exceptions' => false,
],

Set LOG_CHANNEL=stack in your .env and both channels run for every log call.

Using It

Log a message with context:

Log::channel('database')->info('Webhook received', [
    'type' => 'webhook_received',
    'company_id' => 47,
    'event' => 'payment.completed',
    'payload_size' => strlen($rawPayload),
]);

You can also pass model instances directly and the handler will extract the ID:

Log::channel('database')->info('Sync started', [
    'type' => 'sync_started',
    'company' => $company,
    'integration' => $integration,
]);

Once the data is in the table, querying it is just Eloquent:

DebugLog::query()
    ->where('company_id', 47)
    ->where('type', 'webhook_received')
    ->where('created_at', '>=', now()->subHours(3))
    ->orderBy('created_at', 'desc')
    ->get();

That's the whole point. You could not write that query against a log file.

A Note on Performance

Writing to the database adds an extra INSERT for every single log call. In a low-volume app that's fine. In a high-volume app, especially with multiple log calls per request, it adds up.

A few ways to handle it: set the channel level to warning or error in production and only drop to debug when you're actively investigating something. Or only write specific type values to the database and let everything else go to the file log. Or dispatch the insert as a queued job if you can tolerate slight delays in log availability.
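The queued variant can be sketched with Laravel's closure dispatch. This is one way to do it, and it assumes the attribute array is plain serializable data, which it is here:

protected function write(LogRecord $record): void
{
    $this->context = $record->context;
    $this->extra = $record->extra;

    $attributes = [
        'company_id' => $this->getCompanyId(),
        'integration_id' => $this->getIntegrationId(),
        'user_id' => $this->getUserId(),
        'level' => $record->level->getName(),
        'type' => $this->getType(),
        'message' => $record->message,
        'context' => $this->context ?: null,
        'extra' => $this->extra ?: null,
    ];

    // Defer the INSERT. afterResponse() runs it once the response
    // has been sent, so the request doesn't pay for the write.
    dispatch(function () use ($attributes) {
        DebugLog::create($attributes);
    })->afterResponse();
}

One caveat: if the deferred write can itself log on failure, guard against a loop.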

The database channel works best as an intentional tool. You direct specific log calls to it because you know you'll want to query that data. Routing all application logging through it without thinking about volume will eventually catch up with you.