Summary
Disclaimer: This summary has been generated by AI. It is experimental, and feedback is welcomed. Please reach out to info@qconsf.com with any comments or concerns.
The presentation "Designing Fast, Delightful UX With LLMs for Mobile Frontends" by Bala Ramdoss explores the integration of large language models (LLMs) into mobile app user experiences to create fast and engaging interfaces.
Key Concepts Discussed:
- AI-powered Features: Highlighted the challenges and solutions in creating dynamic customer experiences (CX) with LLMs, focusing on breaking down responses into chunks and managing latency effectively.
- Frontend and Backend Dynamics: Discussed the role of frontend architecture in managing user interactions, and why backend support is crucial for feeding data to models efficiently.
- Prompt Engineering: Emphasized the importance of crafting prompts that let the model select the most appropriate CX component, illustrating how this enables seamless app interactions.
- On-device AI: The talk also covered the benefits of on-device AI models provided by major platforms like Apple and Google, including privacy advantages and reduced latency.
- Future Directions: Encouraged app developers to focus on prompt design and consider leveraging server-driven UI and BFF (Backend for Frontend) patterns for scalable AI feature implementation.
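To make the "breaking down responses into chunks" idea above concrete, here is a minimal, hedged sketch (illustrative only, not code from the talk; all names are hypothetical) of rendering an LLM response incrementally so the user sees partial output instead of waiting for the full reply:

```java
import java.util.List;
import java.util.function.Consumer;

// Sketch: append each streamed chunk to the visible text and flush it
// to the UI immediately, rather than blocking until the full response.
public class StreamingRenderer {
    private final StringBuilder visibleText = new StringBuilder();
    private final Consumer<String> updateUi; // e.g., a TextView setter on Android

    public StreamingRenderer(Consumer<String> updateUi) {
        this.updateUi = updateUi;
    }

    // Called once per chunk from the model's streaming response.
    public void onChunk(String chunk) {
        visibleText.append(chunk);
        updateUi.accept(visibleText.toString()); // flush partial text now
    }

    public String currentText() {
        return visibleText.toString();
    }

    public static void main(String[] args) {
        StringBuilder screen = new StringBuilder();
        StreamingRenderer renderer = new StreamingRenderer(text -> {
            screen.setLength(0);
            screen.append(text);
        });
        // Simulated token stream from an LLM endpoint.
        for (String chunk : List.of("Designing ", "fast ", "UX")) {
            renderer.onChunk(chunk);
        }
        System.out.println(screen); // prints "Designing fast UX"
    }
}
```

On Android the `updateUi` callback would typically post to the main thread; that detail is omitted here to keep the sketch self-contained.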
Bala Ramdoss concluded by urging developers to explore on-device AI for lighter use cases and to continue evolving frontend systems to manage modern AI capabilities effectively.
This is the end of the AI-generated content.
Abstract
Delivering AI-powered features in mobile apps is not just about calling an LLM API. It is about crafting fast, reliable, and engaging user experiences. In this talk, I’ll share practical lessons from designing and scaling LLM-driven experiences in mobile apps, where frontend architecture and UX design played an important role.
We’ll explore how to architect for speed and interactivity, when to use on-device LLMs versus backend inference, and how to design interfaces that gracefully handle latency. We'll also examine the trade-offs that shape responsive, cost-effective, and scalable AI features.
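One common way to "gracefully handle latency" is to give an inference call a strict time budget and fall back to a cached or non-AI default when it is exceeded. A hedged sketch of that pattern (illustrative only, not code from the talk; names are hypothetical):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.function.Supplier;

// Sketch: race an inference call against a latency budget; if the model
// is too slow, cancel it and show a fallback experience instead.
public class LatencyBudget {
    public static String fetchWithFallback(Supplier<String> inference,
                                           String fallback,
                                           long budgetMillis) {
        CompletableFuture<String> call = CompletableFuture.supplyAsync(inference);
        try {
            return call.get(budgetMillis, TimeUnit.MILLISECONDS);
        } catch (TimeoutException | InterruptedException | ExecutionException e) {
            call.cancel(true);
            return fallback; // e.g., a cached result or a non-AI default UI
        }
    }

    public static void main(String[] args) {
        // Fast call completes within budget; slow call falls back.
        String fast = fetchWithFallback(() -> "model answer", "default", 500);
        String slow = fetchWithFallback(() -> {
            try { Thread.sleep(2_000); } catch (InterruptedException ignored) {}
            return "too late";
        }, "default", 100);
        System.out.println(fast + " / " + slow);
    }
}
```

The same budget-and-fallback shape applies whether inference runs on-device or on a backend; only the typical budget changes.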
Whether you're experimenting with LLMs in your product or already in production, you’ll leave with insights to make your AI-powered experiences feel fast, native, and user-first.
Speaker
Balakrishnan (Bala) Ramdoss
Senior Android Engineer @Amazon - Building Camera-Based AI Features, Specializes in Scalable Solutions for Complex Challenges
Bala Ramdoss is a Senior Android Engineer at Amazon, where he builds camera-based AI features like Amazon Lens to enhance the visual shopping experience. With over 10 years of Android development experience, Bala specializes in scalable solutions for complex challenges, including AR-powered experiences and high-performance Android UI. When not building apps, he enjoys exploring new tech and engaging conversations over coffee.