Co-Founders Behind Reface And Prisma Join Hands To Improve On-Device Model Inference With Mirai

What if your phone could run powerful AI models without relying on the cloud? That's the promise of on-device AI, and a new London-based startup called Mirai is building the infrastructure to make it happen. Founded by the creators behind viral apps Reface and Prisma, Mirai just raised $10 million to help smartphones and laptops handle complex AI tasks locally. This approach could mean faster responses, better privacy, and lower costs for the next generation of consumer apps. For developers and tech leaders asking what's next in mobile AI, Mirai represents a focused effort to bring intelligent features directly to the devices people use every day.

What Is Mirai and Why Does On-Device AI Matter?

Mirai is developing a specialized framework designed to optimize how artificial intelligence models run directly on consumer hardware. Instead of sending every request to distant data centers, the technology enables phones and laptops to process AI workloads locally. This shift addresses growing concerns about latency, data privacy, and the rising costs of cloud-based inference. For everyday users, the result could be smoother experiences in apps that use AI for photos, voice, or real-time translation. The team believes that as models become more efficient, the future of consumer AI lives on the edge, not just in the cloud. By prioritizing on-device execution, Mirai helps applications feel more responsive while reducing dependence on constant internet connectivity.

The Founders Behind Mirai: Reface and Prisma Veterans

The startup is led by Dima Shvets and Alexey Moiseenkov, two entrepreneurs with proven track records in consumer AI applications. Shvets previously co-founded Reface, the popular face-swapping app that gained millions of users worldwide. Moiseenkov co-founded Prisma, the app that brought artistic AI filters to smartphones years before the generative AI boom, and served as its CEO. Both founders have deep experience scaling apps that rely on machine learning, giving them unique insight into the limitations of current on-device AI infrastructure. Their shared vision is to remove technical barriers so developers can build richer, more responsive AI features without heavy cloud dependencies. Having navigated the challenges of viral consumer apps, they understand what it takes to deliver seamless AI experiences at scale.

Why On-Device AI Is the Next Frontier for Consumer Apps

While much of the AI conversation focuses on massive cloud infrastructure, consumer developers face different challenges. Running models on the device reduces latency, which is critical for real-time features like camera effects or voice assistants. It also enhances user privacy since sensitive data never leaves the phone. Plus, avoiding constant cloud calls can significantly cut operational costs for app makers. Shvets and Moiseenkov noticed that many developers were seeking better cost optimization and higher margins per token when building AI-powered features. Mirai aims to solve these pain points by creating a streamlined pipeline for on-device model inference that works across different hardware platforms. This developer-first approach could accelerate innovation in categories from social media to productivity tools.

How Mirai's Framework Optimizes Model Inference on Phones

At its core, Mirai's technology focuses on making AI models run more efficiently on resource-constrained devices. The framework handles tasks like model compression, quantization, and hardware-aware optimization automatically. This means developers can deploy sophisticated models without needing deep expertise in edge computing or mobile hardware. The system is designed to adapt to various chipsets, ensuring consistent performance whether the app runs on a flagship phone or a mid-range laptop. By abstracting away the complexity of on-device deployment, Mirai lets creators focus on building great user experiences rather than wrestling with technical constraints. The framework also includes monitoring tools to help teams track performance and iterate quickly based on real-world usage data.
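
Mirai has not published its SDK, so the snippet below is only a rough sketch of the kind of optimization such a framework would automate, using PyTorch's built-in dynamic quantization as a stand-in. The toy model, layer sizes, and input shapes are invented for illustration; they are not Mirai's API or any real app's model.

```python
# Illustrative sketch only: uses PyTorch dynamic quantization as a stand-in
# for the weight compression that on-device inference frameworks automate.
import torch
import torch.nn as nn

# A small example model standing in for an app's AI feature (hypothetical).
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Dynamic quantization converts Linear weights to int8, shrinking the model
# and speeding up CPU inference on phones and laptops.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Inference runs exactly as before, just with the compressed weights.
x = torch.randn(1, 512)
with torch.no_grad():
    output = quantized(x)
print(output.shape)  # torch.Size([1, 10])
```

In practice, a framework like Mirai's would layer hardware-aware decisions on top of steps like this, picking the right precision and kernels for whatever chipset the app happens to be running on.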

The $10 Million Bet on Edge AI's Future

Mirai's $10 million seed round signals growing investor confidence in the on-device AI market. The funding will support the 14-person technical team as they refine their framework and expand partnerships with app developers. Unlike many AI startups chasing large language model training, Mirai is deliberately focused on inference optimization for consumer hardware. This niche is becoming increasingly valuable as major tech companies invest heavily in neural processing units for phones and PCs. The round positions Mirai to become a key infrastructure player as the industry shifts toward hybrid AI architectures that blend cloud and edge capabilities. With this backing, the team can accelerate development while maintaining a lean, product-focused culture.

What This Means for Developers and Everyday Users

For app developers, Mirai's tools could lower the barrier to entry for adding advanced AI features. Instead of managing complex deployment pipelines, teams can integrate the framework and start experimenting with on-device models faster. This acceleration could lead to a new wave of innovative apps that leverage local AI for personalized, context-aware experiences. For end users, the benefits include quicker response times, offline functionality, and stronger privacy protections. Imagine photo editing apps that apply complex filters instantly, or voice assistants that understand commands without an internet connection. These scenarios become more feasible when models run efficiently on the device itself. Over time, this could reshape expectations for what mobile apps can do without relying on remote servers.
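
To give a rough sense of what "running a model locally" looks like in practice, here is a minimal inference call using ONNX Runtime, a widely used open-source engine, rather than Mirai's unreleased tooling. The file name "model.onnx" and the input name "input" are placeholders for whatever a real app would ship in its assets.

```python
# Purely illustrative: a generic on-device inference call with ONNX Runtime,
# not Mirai's SDK. "model.onnx" and "input" are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

# Load a pre-exported model bundled with the app; no network call is involved.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Run inference entirely on the device, even when the user is offline.
features = np.random.rand(1, 512).astype(np.float32)
outputs = session.run(None, {"input": features})
print(outputs[0].shape)
```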

The Road Ahead for Mirai and On-Device Intelligence

Looking forward, the Mirai team plans to expand support for more model types and hardware configurations. They're also exploring ways to enable collaborative inference, where lightweight tasks run on-device while heavier computations seamlessly offload to the cloud when needed. This hybrid approach could offer the best of both worlds: the speed and privacy of local processing with the power of cloud-scale models. As consumer expectations for AI-powered features continue to rise, infrastructure that makes on-device intelligence practical will become increasingly essential. Mirai's early progress suggests that the next chapter of AI innovation might happen right in your pocket. The startup is actively engaging with developer communities to gather feedback and prioritize features that deliver immediate value.
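
To make the hybrid idea concrete, here is a minimal sketch of how an app might route requests between a local model and the cloud. The helper functions and the routing heuristic are hypothetical; Mirai has not described how its collaborative inference will actually work.

```python
# A minimal sketch of hybrid routing: lightweight tasks stay on-device,
# heavy tasks fall back to the cloud when a connection is available.
# run_local_model and call_cloud_model are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class Request:
    prompt: str
    needs_heavy_compute: bool  # e.g. summarizing a very large document


def run_local_model(prompt: str) -> str:
    # Placeholder for an on-device call to a small, quantized model.
    return f"[local] {prompt[:40]}"


def call_cloud_model(prompt: str) -> str:
    # Placeholder for a cloud API call reserved for heavy workloads.
    return f"[cloud] {prompt[:40]}"


def route(request: Request, online: bool) -> str:
    # Prefer local execution for speed and privacy; offload only when
    # the task is heavy and the device is actually connected.
    if request.needs_heavy_compute and online:
        return call_cloud_model(request.prompt)
    return run_local_model(request.prompt)


print(route(Request("Translate this caption", needs_heavy_compute=False), online=True))
print(route(Request("Summarize this 200-page report", needs_heavy_compute=True), online=True))
```
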
The rise of on-device AI represents a pivotal shift in how we interact with intelligent applications. By focusing on the infrastructure that powers local model inference, Mirai is addressing a critical gap in the AI ecosystem. With experienced founders, strong investor backing, and a clear technical vision, the startup is well-positioned to help bring faster, smarter, and more private AI experiences to billions of devices worldwide. As the technology matures, we may look back on this moment as the beginning of a more distributed, user-centric AI future. For anyone building or using AI-powered apps, the move toward on-device intelligence isn't just a technical trend—it's a fundamental rethinking of where intelligence lives and how it serves people in their daily lives.
