Building a social discovery platform powered by an AI recommendation engine
The AI engine drives engagement even when recommendations land far from the top of the feed.

The challenge
Event discovery on mobile has a hard problem: users don't know what they're looking for, and recommendations that miss the mark get ignored. Joyraft's founding team saw this in user research — when events were buried below the fold, even strong matches failed to convert because users never saw them. The discovery experience itself became the bottleneck.
They needed a way to surface the right events to the right people across three platforms (iOS, Android, and web) without ballooning the engineering cost of maintaining three separate applications.
The core constraint was structural. Building three native apps meant tripling the work — design consistency broke down, feature releases staggered, and the team couldn't ship fast enough to iterate on the AI layer. The business needed to grow, but the platform couldn't scale with the team size.
What we learned
| Lesson | Detail |
| --- | --- |
| Social proof beats algorithmic confidence | Users engage more when they see friends attending than when an algorithm ranks an event as a perfect match. |
| Three codebases, three teams | Maintaining iOS, Android, and web independently means features stagger, bugs replicate, and small teams stop shipping. |
| Cold start breaks recommendations | Models trained on empty data produce noise, but new users have no history to learn from. |
The solution
Twistag built Joyraft as a hybrid stack: React Native for mobile (iOS and Android from a single codebase) and Next.js for web, connected by a shared component library that enforced design consistency across platforms. This architecture meant a six-person team could ship features once and see them live on three surfaces simultaneously.
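One way a shared component library can enforce that consistency is a single source of design tokens consumed by both the React Native app and the Next.js site. A minimal sketch, with illustrative names and values (not Joyraft's actual code):

```typescript
// Shared design tokens: one source of truth imported by both the
// React Native app and the Next.js web app. Values are illustrative.
const tokens = {
  color: { primary: "#5A31F4", surface: "#FFFFFF" },
  spacing: { sm: 8, md: 16, lg: 24 },
  radius: { card: 12 },
} as const;

// Components read from the tokens and only branch where platforms
// genuinely differ, so the visual contract stays identical everywhere.
function cardStyle(platform: "ios" | "android" | "web"): Record<string, string | number> {
  const base: Record<string, string | number> = {
    backgroundColor: tokens.color.surface,
    borderRadius: tokens.radius.card,
    padding: tokens.spacing.md,
  };
  if (platform === "web") {
    // Browsers use CSS box shadows...
    base.boxShadow = "0 2px 8px rgba(0,0,0,0.08)";
  } else {
    // ...while React Native on iOS/Android uses elevation.
    base.elevation = 2;
  }
  return base;
}
```

Ship a token change once and all three surfaces pick it up in the same release.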
The recommendation engine sits at the centre. We chose Google Cloud Platform's Vertex AI as the managed ML platform — letting us iterate on the model without building infrastructure — paired with a hybrid filtering approach. The system combines collaborative filtering (what users like you engaged with) and content-based signals (event metadata, category, venue, user interests). When a user marks interest in an event or their friend attends something, the engine ingests that signal immediately.
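A minimal sketch of how such a hybrid blend can work. All interfaces, similarity measures, and weights here are illustrative assumptions, not Joyraft's actual model:

```typescript
// Hybrid recommendation sketch: a collaborative signal (overlap with
// similar users' activity) blended with a content signal (event
// metadata vs. user interests). Names and weights are illustrative.

interface EventItem { id: string; category: string; tags: string[] }
interface AppUser { id: string; interests: string[]; attended: string[] }

// Jaccard similarity between two sets, reused for both signals.
function jaccard(a: Set<string>, b: Set<string>): number {
  const inter = [...a].filter((x) => b.has(x)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : inter / union;
}

// Collaborative: credit an event by how many *similar* users engaged with it.
function collaborativeScore(user: AppUser, event: EventItem, others: AppUser[]): number {
  const mine = new Set(user.attended);
  let score = 0;
  for (const other of others) {
    if (other.attended.includes(event.id)) {
      score += jaccard(mine, new Set(other.attended));
    }
  }
  return others.length === 0 ? 0 : score / others.length;
}

// Content-based: overlap between event metadata and stated interests.
function contentScore(user: AppUser, event: EventItem): number {
  return jaccard(new Set(user.interests), new Set([event.category, ...event.tags]));
}

// Hybrid blend; the 0.5/0.5 weights are placeholders to tune offline.
function hybridScore(user: AppUser, event: EventItem, others: AppUser[]): number {
  return 0.5 * collaborativeScore(user, event, others) + 0.5 * contentScore(user, event);
}
```

In production the heavy lifting (training, serving) would sit behind a managed platform like Vertex AI; the sketch only shows how the two signal families combine.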
The crucial insight came from user testing: social proof outperforms descriptions. When users saw that friends had marked interest in an event, they engaged. So we surfaced that signal prominently, not as an afterthought, but as the primary discovery mechanism. Rather than ranking by predicted rating alone, we weighted friend activity and public interest counts alongside algorithm confidence.
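That weighting can be sketched as a scoring function where friend activity, public interest, and model confidence each contribute. The field names, log damping, and weights below are illustrative assumptions:

```typescript
// Feed-ranking sketch: blend model confidence with social-proof
// signals. All weights and thresholds are illustrative placeholders.
interface Candidate {
  eventId: string;
  modelConfidence: number;   // 0..1 score from the recommender
  friendsInterested: number; // friends who marked interest
  publicInterest: number;    // total public interest marks
}

function discoveryScore(c: Candidate): number {
  // log1p gives diminishing returns, so one viral event
  // does not drown out everything else in the feed.
  const friendSignal = Math.log1p(c.friendsInterested) / Math.log1p(10);
  const publicSignal = Math.log1p(c.publicInterest) / Math.log1p(1000);
  // Social proof carries the largest weight, per the user-testing insight.
  return 0.5 * friendSignal + 0.2 * publicSignal + 0.3 * c.modelConfidence;
}

function rankFeed(candidates: Candidate[]): Candidate[] {
  return [...candidates].sort((a, b) => discoveryScore(b) - discoveryScore(a));
}
```

With these weights, an event three friends are attending can outrank a higher-confidence match that nobody in the user's circle has touched.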
We solved the cold-start problem (how do you recommend events to users with no history?) by shipping an MVP experience first. The app launched without the AI engine. We let behaviour accumulate naturally for weeks, then introduced recommendations gradually. This sequencing meant the model had real user data to learn from before users ever saw its output.
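The same sequencing idea applies per user: gate the personalised feed behind a minimum amount of recorded behaviour. A minimal sketch, with hypothetical thresholds and field names:

```typescript
// Cold-start gating sketch: serve the AI feed only once enough
// behaviour exists. Thresholds and names are illustrative, not
// Joyraft's actual values.
interface UserActivity { interestsMarked: number; eventsViewed: number }

const MIN_INTERESTS = 3;
const MIN_VIEWS = 20;

// Below the thresholds, fall back to a non-personalised feed
// (e.g. popular events) instead of serving a noisy model output.
function feedSource(a: UserActivity): "personalised" | "popular" {
  return a.interestsMarked >= MIN_INTERESTS && a.eventsViewed >= MIN_VIEWS
    ? "personalised"
    : "popular";
}
```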
What this shaped
| Principle | Detail |
| --- | --- |
| Hybrid signal, not pure algorithm | Weight friend activity and public interest alongside algorithmic confidence — social proof guides discovery better than ranking does. |
| One codebase, three platforms | React Native and Next.js let one team ship features simultaneously to iOS, Android, and browsers. |
| Defer AI until behaviour exists | Ship the MVP without recommendations — let usage accumulate, then introduce algorithms to data that actually matters. |
The impact
The AI engine now drives 40% of clicks to recommended events sitting outside the top three positions in the feed. Users engage with recommendations even when the algorithm places them low — a clear signal that the matching logic is sound and users trust it. The engine works quietly.
Joyraft reached 10,000 weekly active users across the three platforms. Feature parity across iOS, Android, and web eliminated the stagger that plagued earlier multi-platform attempts. One release cycle. One design system. Three user experiences that feel identical. The team shipped the first AI-powered recommendation layer in three months, then iterated monthly — something that would have taken three times as long with separate native codebases.
What this proved
| Result | Detail |
| --- | --- |
| Quiet engine outperforms top results | Forty percent of clicks land outside the top three — the engine works quietly. |
| Parity reduces friction | Identical experience across platforms eliminates switching cost — the platform reads as one product, not three. |
| Reasoning earns return visits | Users return when they understand why a thing was recommended — friends going, behaviour matching, signal visible. |
Technologies used
- React Native
- Next.js
- Vertex AI
- GCP

