
Backups Are the Most Important Thing You're Not Doing

If you’re building with AI coding agents and you don’t have backups, you’re one bad prompt away from losing everything.

I’m not exaggerating. In the past year, AI agents have deleted production databases, wiped home directories, destroyed family photos, and fabricated fake data to cover their tracks. These aren’t hypothetical risks — they’re documented incidents involving real people and real tools.

I’ve been building a SaaS product (Growth Method) with Laravel, Livewire, and Claude Code. As someone who can’t write PHP and relies entirely on an AI agent to write code, backups aren’t optional for me. They’re the safety net that makes the whole workflow possible.


AI agents delete things

The pattern in every one of these incidents is the same: an AI agent has write access to something important, makes a catastrophic decision, and acts faster than you can intervene. Git helps — but only if you’ve committed recently, and only for code. Your database, your uploaded files, your server configuration — none of that lives in git.

The 3-2-1 backup rule

The standard backup strategy is the 3-2-1 rule, popularised by photographer Peter Krogh and endorsed by US-CERT: keep three copies of your data, store them on two different types of media, and keep one copy offsite.

The modern evolution adds two more numbers: 3-2-1-1-0 — one immutable copy (that can’t be overwritten or deleted) and zero errors after verification testing.

This sounds like overkill until you read about Jason McCreary’s experience. JMac — creator of Laravel Shift — lost roughly 600 completed Shifts during a production database outage. In the panic, he accidentally overwrote his most recent backup while troubleshooting, forcing a restore from the weekly backup instead. He had to reconstruct orders from Stripe data. Panic-driven decisions destroyed his backup integrity. (jasonmccreary.me)

JMac went on to author the pull request for DB::prohibitDestructiveCommands() in Laravel 11.9 — the feature that blocks migrate:fresh, migrate:refresh, migrate:reset, and db:wipe in production. One line in your AppServiceProvider and the most dangerous database commands are blocked:

DB::prohibitDestructiveCommands($this->app->isProduction());

That’s a safety rail born from real data loss.
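For context, that line lives in the boot method of App\Providers\AppServiceProvider. A minimal sketch of the full provider, assuming a default Laravel 11+ application skeleton:

```php
<?php

namespace App\Providers;

use Illuminate\Support\Facades\DB;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        // Blocks migrate:fresh, migrate:refresh, migrate:reset and db:wipe
        // whenever the application is running in the production environment.
        DB::prohibitDestructiveCommands($this->app->isProduction());
    }
}
```

In local and testing environments the guard stays off, so migrate:fresh still works during development.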

What I’d recommend

There are three levels of backup, and the right choice depends on what you’re protecting.

|  | Backblaze Personal | Backblaze B2 + spatie/laravel-backup | Time Machine (macOS) |
| --- | --- | --- | --- |
| What it backs up | Your entire laptop, continuously | Your Laravel app files + database | Your entire laptop, hourly |
| Where it stores backups | Backblaze cloud | Backblaze B2 (S3-compatible cloud) | External hard drive |
| Cost | $99/year, unlimited storage | $6/TB/month (first 10GB free) | Free (drive cost only) |
| Recovery | Full restore or individual files | php artisan backup:restore | Full restore or individual files |
| Best for | Protecting your development machine | Protecting your production app | Local quick recovery |
| Offsite? | Yes | Yes | No |

For your laptop: Backblaze Personal at $99/year with unlimited storage. It runs continuously in the background and backs up everything. Pair it with Time Machine to an external drive and you’ve got the 3-2-1 rule covered for your development machine.

For your Laravel app: Backblaze B2 combined with spatie/laravel-backup. B2 is roughly one-fifth the cost of AWS S3 for storage, with free egress through CDN partners. The spatie package has over 21 million Packagist installs and backs up your entire application — files and database — to any Laravel filesystem disk.

Since B2 is S3-compatible, the Laravel integration is just a filesystem disk configuration. Add this to config/filesystems.php:

'b2' => [
    'driver' => 's3',
    'key' => env('B2_ACCESS_KEY_ID'),
    'secret' => env('B2_SECRET_ACCESS_KEY'),
    'region' => env('B2_DEFAULT_REGION'),
    'bucket' => env('B2_BUCKET'),
    'endpoint' => env('B2_ENDPOINT'),
    'use_path_style_endpoint' => false,
    'throw' => true,
    'request_checksum_calculation' => 'when_required',
    'response_checksum_validation' => 'when_required',
],

The last two checksum lines are critical — Backblaze B2 rejects the CRC32 checksum headers that newer versions of the AWS SDK (which Laravel’s S3 driver uses under the hood) send by default. Without these two lines, you’ll get 400 errors and no backups.
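For reference, the matching .env entries look something like this. Every value here is a placeholder: your key ID, application key, bucket name, and endpoint all come from the Backblaze B2 console, and the endpoint region must match the region your bucket lives in.

```ini
# Placeholder B2 credentials; substitute your own from the B2 console.
B2_ACCESS_KEY_ID=your-key-id
B2_SECRET_ACCESS_KEY=your-application-key
B2_DEFAULT_REGION=us-west-004
B2_ENDPOINT=https://s3.us-west-004.backblazeb2.com
B2_BUCKET=your-bucket-name
```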

Then point spatie/laravel-backup at the disk, schedule php artisan backup:run in your Laravel scheduler, and your app is backed up automatically.
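Wiring that up takes two small changes: point the package’s destination at the b2 disk in config/backup.php, and register the commands in your scheduler. A minimal sketch, assuming Laravel 11+ defaults and the package’s published config (the 01:00/01:30 times are just an example):

```php
// config/backup.php (excerpt): send backup archives to the b2 disk
// defined in config/filesystems.php.
'destination' => [
    'disks' => [
        'b2',
    ],
],

// routes/console.php: run backup:clean before backup:run so old archives
// are pruned according to the package's retention rules before the new
// nightly backup is written.
use Illuminate\Support\Facades\Schedule;

Schedule::command('backup:clean')->daily()->at('01:00');
Schedule::command('backup:run')->daily()->at('01:30');
```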

The Laravel community takes backups seriously

Taylor Otwell co-founded Ottomatik — a dedicated backup service for servers — with Jeffrey Way and Eric Barnes. The creator of Laravel thought backups were important enough to build a product around.

Freek Van der Herten built spatie/laravel-backup after DigitalOcean physically lost Spatie’s production server in 2016. His lesson: never rely solely on your hosting provider for backups. The package now has over 6,000 GitHub stars and is maintained as part of Spatie’s ecosystem of 300+ open source packages.

Aaron Francis and Caleb Porzio — two of the most prominent voices in the Laravel community — have both endorsed Backblaze publicly. Aaron mentioned having several terabytes stored there.

And Toby Maxham, a senior software engineer in the Laravel community, shared a cautionary tale about accidentally deleting all customer invoices due to a SQL typo — saved only because he had Backblaze backups in place.

Why this matters more for agentic engineers

If you’re a traditional developer, backups are good practice. If you’re an agentic engineer — someone who directs AI agents to build and maintain software — backups are essential infrastructure.

Here’s why: AI agents operate faster than you can review. A developer typing DELETE FROM invoices has a moment to pause before hitting enter. An AI agent executing that same query does it instantly, embedded in a chain of other operations, potentially without you even seeing it scroll past. The incidents above all share this characteristic — the damage happened faster than the human could intervene.

Git protects your code but not your data. Committing frequently is critical, and it’s something I do after every meaningful change. But your database, your uploaded files, your .env configuration, your server state — none of that is in git. A production backup strategy covers the gaps that version control can’t.
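That committing habit is easy to make mechanical. One approach (my suggestion, not something from the article’s tooling) is a two-command checkpoint you run before every agent session; --allow-empty means it never fails when the working tree is already clean:

```shell
# Demonstration in a throwaway repo; in practice you run the last two
# commands inside the real project you are about to hand to an agent.
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email "you@example.com"
git config user.name "you"

# The actual checkpoint: stage everything, then commit. --allow-empty
# guarantees a snapshot commit exists even when nothing has changed.
git add -A
git commit --allow-empty -q -m "checkpoint: before AI agent run"
```

Anything the agent then deletes or mangles is one `git reset --hard` away from recovery — for files under version control, at least.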

Laravel 12’s tagline is literally “The clean stack for Artisans and agents.” The framework is leaning into the agentic future. Laravel Boost restricts AI agent database tools to read-only queries. DB::prohibitDestructiveCommands() blocks the most dangerous artisan commands in production. These are safety rails, not substitutes for backups.

The minimum viable backup strategy for an agentic engineer:

  1. Commit before every AI operation — 30 seconds to commit versus hours to recover without one
  2. Backblaze Personal on your laptop — continuous, set-and-forget protection for your development machine
  3. spatie/laravel-backup to Backblaze B2 — automated daily backups of your production app and database
  4. Time Machine to an external drive — local quick-recovery for when you need a file back in seconds

That’s the 3-2-1 rule with almost no effort. The laptop backup runs silently. The Laravel backup runs on a schedule. Time Machine runs hourly. You set it up once and forget about it — until the day you need it, at which point it’s the most valuable thing you own.

Key takeaway

AI agents are powerful precisely because they have write access to your systems. That same power means they can destroy things faster than you can react. Backups are the only safety net that works regardless of what goes wrong — whether it’s an AI agent running terraform destroy, a SQL typo deleting every invoice, or a panicked engineer overwriting their own backup. Set up Backblaze, install spatie/laravel-backup, and commit before every AI operation. It takes an afternoon to configure and it protects everything you’ve built.

