My app has always run on MySQL. It works. Nothing is broken. But I kept seeing signals that PostgreSQL was becoming the database Laravel was building for — Laravel Cloud runs on it, the new AI SDK’s vector search needs it, and official Laravel products like Nightwatch use it.
So I asked Claude Code:
What would be some good reasons to move to Postgres rather than MySQL for this project?
What came back was more useful than I expected. Not just a list of features, but an assessment of what would need to change in my codebase — and how little that turned out to be.
Why I started thinking about this
I had three reasons already in mind when I opened the GitHub issue:
- PostgreSQL is fully open source — no Oracle dual-licensing concerns
- pgvector for AI features — my app already uses AI for scoring, classification, and summaries
- Replace Algolia with Postgres full-text search — one fewer paid dependency
But I wanted to know: are there more reasons? And what would the migration actually involve?
What Claude Code found in my codebase
I asked Claude to scan the entire codebase for anything MySQL-specific. The results surprised me.
This is where one of Laravel’s best features pays off — and one that non-developers might not fully appreciate. Laravel has a “query builder” and a “schema builder” that sit between your code and the database. Instead of writing raw MySQL or raw PostgreSQL, you write Laravel code, and the framework translates it into whatever dialect your database speaks. It’s like writing in English and having a translator handle the French or Spanish depending on who’s listening.
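As a quick illustration (table and column names here are hypothetical, not from my app), the same builder call compiles to different SQL depending on the configured driver:

```php
// One query-builder call, two dialects.
$users = DB::table('users')
    ->where('active', true)
    ->orderBy('created_at', 'desc')
    ->limit(10)
    ->get();

// The MySQL grammar compiles this to roughly:
//   select * from `users` where `active` = ? order by `created_at` desc limit 10
// The PostgreSQL grammar compiles it to roughly:
//   select * from "users" where "active" = ? order by "created_at" desc limit 10
```

Identifier quoting, limit syntax, and type handling all change underneath you, and your PHP never has to know.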
Because almost all of my code uses these builders rather than raw SQL, the scan found three categories of change across the entire codebase:
1. A MySQL-only date function → Carbon in PHP
The initial plan suggested replacing DATEDIFF() with PostgreSQL’s EXTRACT(DAY FROM ...). But when I actually made the change, I realised the original MySQL code had a deeper problem — date_test_inprogress lives on the ideas table but was inside a whereHas('team', ...) subquery, relying on MySQL’s correlated subquery resolution, which was fragile.
The better fix was moving the logic out of SQL entirely and into PHP with Carbon:
```php
// Before: MySQL-specific DATEDIFF inside a fragile subquery
->whereHas('team', fn ($q) => $q->whereRaw('DATEDIFF(NOW(), date_test_inprogress) > experiment_duration_target+1'))
```

```php
// After: Database-agnostic with Carbon
->whereNotNull('date_test_inprogress')
->get()
->filter(function ($idea) {
    $daysSinceStart = $idea->date_test_inprogress->diffInDays(now());

    return $daysSinceStart > ($idea->team->experiment_duration_target + 1);
});
```
2. A MySQL-only ordering function → a reusable Eloquent scope
FIELD() is MySQL-only. The plan was to replace it with CASE WHEN, but since the same ordering appeared in multiple places, I created a reusable scope on the model instead:
```php
// Before: MySQL FIELD() duplicated across files
->orderByRaw(sprintf(
    "FIELD(stage, '%s', '%s', '%s', '%s', '%s')",
    IdeaStage::STAGE_PRIORITISED->value,
    IdeaStage::STAGE_IN_DESIGN->value,
    IdeaStage::STAGE_IN_PROGRESS->value,
    IdeaStage::STAGE_ANALYSING->value,
    IdeaStage::STAGE_COMPLETE->value
))
```

```php
// After: One-line call using a scope
->orderByStage()
```
The scope itself uses parameterised CASE WHEN, which works on both databases:
```php
public function scopeOrderByStage(Builder $query): Builder
{
    return $query->orderByRaw(
        'CASE stage '.
        'WHEN ? THEN 1 '.
        'WHEN ? THEN 2 '.
        'WHEN ? THEN 3 '.
        'WHEN ? THEN 4 '.
        'WHEN ? THEN 5 '.
        'ELSE 6 END',
        [
            IdeaStage::STAGE_PRIORITISED->value,
            IdeaStage::STAGE_IN_DESIGN->value,
            IdeaStage::STAGE_IN_PROGRESS->value,
            IdeaStage::STAGE_ANALYSING->value,
            IdeaStage::STAGE_COMPLETE->value,
        ]
    );
}
```
3. LIKE → whereLike() for case-insensitive search
This one wasn’t in the original scan. MySQL’s LIKE is case-insensitive by default (because of the utf8mb4_unicode_ci collation), but PostgreSQL’s LIKE is case-sensitive. A where('name', 'like', '%test%') clause that finds “Test Campaign” on MySQL would therefore miss it on PostgreSQL.
Laravel 11.34+ added whereLike(), which emits LIKE on MySQL and ILIKE on PostgreSQL — preserving case-insensitive behaviour on both:
```php
// Before: Case-insensitive on MySQL, case-sensitive on Postgres
$query->where('name', 'like', '%'.$search.'%')
```

```php
// After: Case-insensitive on both databases
$query->whereLike('name', '%'.$search.'%')
```
This needed changing in three files — comments search, ideas table search, and the MCP tools listing.
Everything else — the 40+ unsigned integer columns, the 5 enum columns, the 7 JSON columns, all the booleans and decimals — was handled automatically because the builders know how to translate it for PostgreSQL.
I asked the follow-up question any non-developer would ask:
What would a migration look like?
And the answer was reassuringly simple:
| Category | Effort | Details |
|---|---|---|
| Raw MySQL SQL | Must fix | DATEDIFF() → Carbon, FIELD() → CASE WHEN scope |
| Case-sensitive LIKE | Must fix | where('col', 'like', ...) → whereLike() (3 files) |
| Unsigned integers (40+ columns) | Auto-handled | Laravel abstracts these |
| Enum columns (5 locations) | Auto-handled | Laravel uses CHECK constraints on Postgres |
| JSON columns (7 locations) | No changes | Fully compatible |
| Booleans, decimals, floats | No changes | Fully compatible |
| Config | Trivial | Already has a pgsql section |
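To make the “auto-handled” rows concrete: a schema definition like the sketch below (table, columns, and values are hypothetical) needs no changes at all. The schema builder emits an ENUM column on MySQL but a varchar with a CHECK constraint on PostgreSQL, and it simply drops the unsigned modifier where Postgres has no equivalent:

```php
Schema::create('campaigns', function (Blueprint $table) {
    $table->id();

    // MySQL:    status ENUM('draft', 'live')
    // Postgres: status varchar(255) CHECK (status IN ('draft', 'live'))
    $table->enum('status', ['draft', 'live']);

    // MySQL:    views BIGINT UNSIGNED
    // Postgres: views bigint (unsigned is a MySQL-only modifier)
    $table->unsignedBigInteger('views')->default(0);

    // JSON and boolean columns map cleanly on both drivers.
    $table->json('settings')->nullable();
    $table->boolean('archived')->default(false);
});
```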
The three options I considered
Option A: Stay on MySQL
The safe choice. Nothing is broken, no migration effort, no downtime risk. But it means no access to pgvector (which the new Laravel AI SDK uses for semantic search), continued dependence on Algolia for search, and falling behind the framework’s direction.
Option B: Migrate fully to PostgreSQL
Change .env, fix the two raw SQL queries and the three LIKE searches, test locally, and use a tool like pgloader for the production data migration. Small effort, big unlock.
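For the data step, pgloader can be driven by a small command file; a minimal sketch (connection strings and database names are placeholders, and the options worth choosing vary by app):

```
LOAD DATABASE
    FROM mysql://user:password@127.0.0.1/myapp
    INTO postgresql://user:password@127.0.0.1/myapp

WITH include drop, create tables, create indexes,
     reset sequences, quote identifiers;
```

pgloader copies the schema and data in one pass and resets Postgres sequences to match MySQL’s auto-increment counters, which is the part that is easy to forget when migrating by hand.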
Option C: Run both databases in parallel
Keep MySQL for existing features, add Postgres for new AI/vector features only. This sounds clever but creates real problems — two databases to manage, no cross-database joins, related data scattered across two places. It’s the kind of “temporary” complexity that tends to become permanent.
What the official Laravel docs say
This is what convinced me. The Laravel 12 Search documentation is explicit:
Vector search requires a PostgreSQL database with the pgvector extension and the Laravel AI SDK. All Laravel Cloud Serverless Postgres databases already include pgvector.
Laravel 12 has first-party support for storing and querying vector embeddings directly in your database — but only if that database is PostgreSQL. The framework now includes built-in tools for creating vector columns, enabling the pgvector extension, and searching by similarity — all with a few lines of code.
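Hedging that the exact API may differ between SDK versions, the shape sketched in the docs looks roughly like this; the table name, column name, dimensions, and the source of $queryEmbedding are all placeholders of mine:

```php
// Migration: add a vector column (requires the pgvector extension).
Schema::table('ideas', function (Blueprint $table) {
    $table->vector('embedding', dimensions: 1536);
});

// Query: rank rows by similarity to a query embedding.
// $queryEmbedding would come from an embeddings model via the AI SDK.
$results = Idea::query()
    ->whereVectorSimilarTo('embedding', $queryEmbedding)
    ->limit(10)
    ->get();
```

None of this has a MySQL equivalent in the framework, which is the whole point of the section.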
For search, the Scout documentation confirms that the built-in database engine works with PostgreSQL full-text indexes:
The database engine uses MySQL / PostgreSQL full-text indexes and LIKE clauses to search your existing database directly… no external service or additional infrastructure required.
This means I could drop Algolia entirely and use Scout’s database driver with Postgres full-text search — no extra service, no monthly bill.
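Concretely, moving a model off Algolia would look something like the sketch below. The Searchable trait and SearchUsingFullText attribute are from the Scout docs; the model and column names are examples from my own schema:

```php
use Laravel\Scout\Attributes\SearchUsingFullText;
use Laravel\Scout\Searchable;

class Idea extends Model
{
    use Searchable;

    // Columns listed in the attribute are searched via a Postgres
    // full-text index; anything else falls back to LIKE clauses.
    #[SearchUsingFullText(['title', 'description'])]
    public function toSearchableArray(): array
    {
        return [
            'title' => $this->title,
            'description' => $this->description,
        ];
    }
}
```

With SCOUT_DRIVER=database in .env, the existing Idea::search('...') calls keep working; only the engine underneath changes.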
Laravel Cloud also published an official Migrate MySQL to PostgreSQL guide, which recommends pgloader for the conversion. When the platform publishes a migration guide, it tells you which database they’re betting on.
What the Laravel community thinks
I looked for real signals from well-known Laravel community members — real quotes, blog posts, and product decisions, not speculation.
Freek Van der Herten (Spatie) migrated Flare (the error tracking service) from MySQL to PostgreSQL. In a blog post about the migration, the team wrote that the main reason was full-text search not working properly in MySQL for their use case, and that PostgreSQL offered “many exciting opportunities thanks to its extensibility.”
Marcel Pociot added PostgreSQL as a first-class service in Laravel Herd Pro. It was the #1 most requested feature from the community.
Jess Archer built Nightwatch — Laravel’s official error tracking, performance monitoring, and logging service — on PostgreSQL. When the Laravel team’s own products choose Postgres, that tells you something.
Taylor Otwell built Laravel Cloud’s Serverless Postgres on Neon, with autoscaling, auto-hibernation, and connection pooling. In a Laravel Podcast episode, he said: “It’s more that we’ve happened to do the Postgres stuff first.” Pragmatic, not dogmatic — but the infrastructure choices are clear.
What would Taylor Otwell do?
I don’t know exactly what Taylor would say. But I do know what he’s done:
- He built Cloud on Postgres — because Neon’s serverless technology gave him autoscaling, hibernation, and connection pooling that MySQL couldn’t match
- He built the AI SDK’s vector search as a Postgres-only feature: whereVectorSimilarTo() and $table->vector() don’t work on MySQL
- He published an official MySQL-to-Postgres migration guide on Cloud, so he clearly expects people to make this move
Taylor’s philosophy has always been simplicity and fewer moving parts. Dropping a paid search dependency in favour of the database you’re already running is exactly that kind of simplification. Based on where he’s investing the framework’s energy, I think he’d switch to Postgres, use the built-in tools, and not overthink it.
Key takeaway
If you’re starting a new Laravel project today, use PostgreSQL. The framework’s newest features (vector search, the AI SDK’s similarity queries, Cloud’s serverless infrastructure) either require it or are built on it.
If you’re already on MySQL (like I was), the migration is smaller than you think. Laravel’s query builder and schema builder abstract away almost everything. In my case, an entire production app came down to three categories of change — replacing DATEDIFF() with Carbon, turning FIELD() into a reusable scope, and swapping where('col', 'like', ...) for whereLike(). The heaviest part isn’t the code — it’s the production data migration itself, and tools like pgloader handle that.
The Laravel ecosystem isn’t abandoning MySQL. It still works, and Laravel still supports it. But the momentum — the new features, the infrastructure investments, the product decisions — is clearly moving toward PostgreSQL. If you’re building AI features, or if you want to remove a paid search dependency, or if you simply want to be aligned with where the framework is heading, now is a good time to make the switch.