Case Study

Scaling a Real Estate SaaS Platform from 5 Agents to 200+

How we re-architected a property search platform across the US and Canada — handling millions of MLS listings with sub-second response times.

Industry
Real Estate / PropTech
Duration
8 Months
Services
Architecture · Backend · DevOps
Markets
USA & Canada

Real Estate SaaS · Platform Architecture

200+
Active Agents
2M+
Property Listings
<200ms
Search Response
99.9%
Uptime SLA

The Client's Vision

A growing real estate brokerage needed a modern, white-label website platform that could serve individual agents across the United States and Canada. Each agent needed their own branded IDX website with MLS-integrated property search, lead capture, and CRM capabilities — all powered by a single, centralized backend.

The initial version worked for 5 agents. But as demand grew, the platform buckled under the weight of multi-tenant data, slow property searches, and infrastructure that wasn't built to scale. They came to us to rebuild the core architecture and turn a fragile MVP into a production-grade SaaS platform.

What Was Breaking

When we audited the existing system, we identified four critical bottlenecks that were preventing the platform from scaling beyond a handful of agents.

Slow Property Search

Property searches ran as direct MongoDB queries and took 3–5 seconds under load. Auto-suggestions were unusable for agents showing properties to buyers in real time.

MLS Data Sync Failures

The cron-based sync was running as a single process with no error recovery. One failed MLS feed would stall updates for all agents across the platform.

No Multi-Tenant Isolation

All agents shared configuration, leads, and property data in flat collections. One agent's misconfiguration could affect every other agent's website.

Sold & Active Data Conflicts

Sold and active properties lived in the same collection with no clear status management. Agents were showing sold properties as active listings on their sites.

The Architecture We Built

We designed a three-layer data architecture that separates concerns cleanly: MongoDB as the source of truth, Elasticsearch for fast search, and Redis as a high-speed cache layer sitting on top. Each layer does what it's best at — nothing more.

Data Flow Architecture

MLS Data Feeds (RETS / RESO Web API)
Property data ingested from multiple US & Canadian MLS boards via scheduled sync
Cron Sync Engine — Every 2 Hours
Delta detection, error handling, retry logic. Processes only changed records to minimize load.
MongoDB — Source of Truth
Stores all property data (active + sold), website configurations, lead data, and agent profiles
Elasticsearch — Search Layer
Handles property search, filters, geo-queries, auto-suggestions. Local per environment.
Redis — Cache Layer
Sits on top of Elasticsearch. Serves repeated queries instantly. Refreshed every 2 hours with cron sync.
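The delta-detection step in the sync engine can be sketched as follows. This is a minimal illustration, not the platform's actual sync code — the record shape and the `modificationTimestamp` field name are assumptions standing in for whatever the MLS feed reports.

```typescript
// Sketch of delta detection: only records modified since the last
// successful sync are forwarded for re-indexing. Field names are
// illustrative assumptions, not the platform's actual MLS schema.
interface MlsRecord {
  listingId: string;
  modificationTimestamp: string; // ISO 8601, as reported by the feed
}

function detectDeltas(records: MlsRecord[], lastSyncedAt: Date): MlsRecord[] {
  return records.filter(
    (r) => new Date(r.modificationTimestamp) > lastSyncedAt
  );
}

// Example: only the listing touched after the last sync gets processed.
const changed = detectDeltas(
  [
    { listingId: "A1", modificationTimestamp: "2024-01-01T00:00:00Z" },
    { listingId: "B2", modificationTimestamp: "2024-01-02T12:00:00Z" },
  ],
  new Date("2024-01-02T00:00:00Z")
);
console.log(changed.map((r) => r.listingId)); // → [ 'B2' ]
```

Processing only the changed records is what keeps a 2-hour sync cycle cheap even across millions of listings.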

For individual property detail pages, we bypass both Redis and Elasticsearch entirely and query MongoDB directly. This ensures agents always see the most current property data — including status changes, price updates, and newly uploaded photos — without waiting for the cache to refresh.
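That split read path can be sketched as a simple router. The data sources below are stand-in async functions, not the platform's real Redis/Elasticsearch/MongoDB clients — the point is only which path each request type takes.

```typescript
// Sketch of the split read path: search queries go through the cache +
// search layer, while single-property detail reads always hit the source
// of truth directly. The "clients" here are stand-in async functions.
type Property = { id: string; status: string; price: number };

interface DataSources {
  cachedSearch: (query: string) => Promise<Property[]>; // Redis → Elasticsearch
  primaryDb: (id: string) => Promise<Property | null>;  // MongoDB
}

async function searchProperties(
  sources: DataSources,
  query: string
): Promise<Property[]> {
  // Search tolerates up-to-2-hour-stale data in exchange for speed.
  return sources.cachedSearch(query);
}

async function getPropertyDetail(
  sources: DataSources,
  id: string
): Promise<Property | null> {
  // Detail pages bypass the cache so status and price changes show immediately.
  return sources.primaryDb(id);
}

// Demo with in-memory stand-ins: the cache still holds the stale "active"
// record while the database already knows the property sold.
const demo: DataSources = {
  cachedSearch: async () => [{ id: "A1", status: "active", price: 450000 }],
  primaryDb: async (id) =>
    id === "A1" ? { id: "A1", status: "sold", price: 440000 } : null,
};
```

The demo captures exactly the trade-off described above: the search results may lag by up to one sync cycle, but the detail page is always current.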

Tech Stack

Next.js
Node.js
MongoDB
Elasticsearch
Redis
AWS (SES, S3)
Linode / VPS
MLS / RETS API

How We Delivered It

Phase 1 — Weeks 1–3
Audit & Architecture Design

Full codebase audit. Mapped existing data models, identified bottlenecks, designed the three-layer architecture (MongoDB → Elasticsearch → Redis). Defined multi-tenant data isolation strategy.

Phase 2 — Weeks 4–10
Search Engine & Cache Layer

Deployed Elasticsearch per environment. Built property search indexing pipeline, auto-suggestion engine, and geo-based filtering. Implemented Redis as a read-through cache with 2-hour TTL synced to the cron schedule.
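The read-through pattern from this phase can be sketched as below. A `Map` stands in for Redis here, and the loader stands in for an Elasticsearch query; a real deployment would use a Redis client with native key expiry instead of checking timestamps in application code.

```typescript
// Sketch of a read-through cache with a TTL matched to the sync cadence.
// A Map stands in for Redis; the loader stands in for an Elasticsearch query.
const TTL_MS = 2 * 60 * 60 * 1000; // 2 hours, matching the cron cycle

interface CacheEntry<T> {
  value: T;
  expiresAt: number;
}

class ReadThroughCache<T> {
  private store = new Map<string, CacheEntry<T>>();

  constructor(
    private loader: (key: string) => Promise<T>,
    private ttlMs: number = TTL_MS
  ) {}

  async get(key: string): Promise<T> {
    const hit = this.store.get(key);
    if (hit && hit.expiresAt > Date.now()) return hit.value; // cache hit
    const value = await this.loader(key); // miss: fall through to search layer
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    return value;
  }
}

// Demo: the loader runs once; the repeat query is served from cache.
let loads = 0;
const cache = new ReadThroughCache<string>(async (key) => {
  loads++;
  return `results-for-${key}`;
});
```

Tying the TTL to the cron schedule means a cached entry never outlives the data it was built from by more than one sync cycle.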

Phase 3 — Weeks 11–18
Cron Sync & Multi-Tenant Hardening

Rewrote the MLS sync engine with delta detection, retry logic, and per-feed error isolation. Restructured MongoDB collections for clean multi-tenant separation. Split environment configs for USA Live, Canada Live, and Staging.
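The per-feed isolation described above can be sketched with `Promise.allSettled` plus a retry wrapper. Feed names and the sync functions are illustrative assumptions; the real engine also does delta detection and persistence per feed.

```typescript
// Sketch of per-feed error isolation: each MLS feed syncs independently
// with retries and exponential backoff, so one failing board cannot stall
// the rest. Feed names and sync functions are illustrative assumptions.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 1000
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff between attempts: base, 2x base, 4x base, ...
      if (i < attempts - 1) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}

async function syncAllFeeds(
  feeds: Record<string, () => Promise<number>>, // feed name → record count
  baseDelayMs = 1000
): Promise<Record<string, "ok" | "failed">> {
  const names = Object.keys(feeds);
  // allSettled isolates failures: one rejected feed never aborts the batch.
  const results = await Promise.allSettled(
    names.map((name) => withRetry(feeds[name], 3, baseDelayMs))
  );
  return Object.fromEntries(
    names.map((name, i) => [
      name,
      results[i].status === "fulfilled" ? "ok" : "failed",
    ])
  );
}

// Demo: one healthy board, one permanently failing board (1 ms backoff).
const statuses = await syncAllFeeds(
  {
    "example-board-a": async () => 42,
    "example-board-b": async () => {
      throw new Error("feed down");
    },
  },
  1
);
```

This is the structural difference from the original single-process cron: a failed feed ends up marked `failed` and alerted on, instead of blocking every other board's updates.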

Phase 4 — Weeks 19–24
Frontend Optimization & Launch

Optimized Next.js frontend with ISR, dynamic imports, image optimization via S3 + CDN. Deployed white-label agent themes. Load-tested the full stack with simulated 200-agent traffic before go-live.

The Impact

Within 3 months of the new architecture going live, the platform handled a 40x increase in agent count without any degradation in performance or reliability.

Search response time
3.5s → 180ms
94% faster property search
Active agents on platform
5 → 200+
40x scale with zero downtime
MLS sync reliability
99.9% uptime
Zero missed sync cycles post-launch
Lead capture rate
+65% increase
Faster pages = more conversions
“Our agents used to complain about slow search and broken listings. After the rebuild, the platform became our biggest competitive advantage — agents actually want to sign up now.”
— Brokerage Operations Director

What Made This Work

Real estate SaaS platforms fail when they treat scaling as a hosting problem. The real challenge is data architecture — how property data flows from MLS feeds through your search layer and into the agent's browser. Getting that pipeline right is what separates platforms that cap out at 10 agents from ones that can serve hundreds.

The combination of Elasticsearch for search, Redis for cache, and MongoDB for source-of-truth isn't novel — but configuring them to refresh in sync with a 2-hour cron cycle, while keeping property detail pages on direct-to-DB reads, is the kind of nuance that only comes from building in this space repeatedly.

Every environment — USA Live, Canada Live, and Staging — runs its own local Elasticsearch and Redis instances. This isolation means a Canadian MLS sync failure never impacts US agent websites, and staging changes never touch production data.
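That isolation boils down to each deployment resolving only its own infrastructure. A minimal sketch, assuming a static config map — the URLs and database names below are placeholders, not the platform's real endpoints (each environment's services are local to its own host, which is why the hosts can all read `localhost`):

```typescript
// Sketch of per-environment isolation: each deployment resolves its own
// local Elasticsearch/Redis endpoints and its own database, so no
// environment can accidentally read another's data. All values are
// placeholders, not the platform's real infrastructure.
type Environment = "usa-live" | "canada-live" | "staging";

interface EnvConfig {
  elasticsearchUrl: string;
  redisUrl: string;
  mongoDb: string;
}

const ENV_CONFIG: Record<Environment, EnvConfig> = {
  // Services run locally on each environment's own host.
  "usa-live": {
    elasticsearchUrl: "http://localhost:9200",
    redisUrl: "redis://localhost:6379",
    mongoDb: "platform_usa",
  },
  "canada-live": {
    elasticsearchUrl: "http://localhost:9200",
    redisUrl: "redis://localhost:6379",
    mongoDb: "platform_canada",
  },
  staging: {
    elasticsearchUrl: "http://localhost:9200",
    redisUrl: "redis://localhost:6379",
    mongoDb: "platform_staging",
  },
};

function configFor(env: Environment): EnvConfig {
  const cfg = ENV_CONFIG[env];
  if (!cfg) throw new Error(`Unknown environment: ${env}`);
  return cfg;
}
```

Because the resolver is the only way to obtain connection details, there is no code path through which a staging process can reach a production database.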

Building a Real Estate Platform?

We help PropTech companies scale from MVP to production-grade SaaS. Let's talk about your architecture.

Book a Strategy Call · Case Studies