The developer behind AliExpress’s BDUI engine explains how backend-driven interfaces enable the fluid, evolving experiences that emotional AI applications require.
The App Store review process takes days. Google Play isn’t much faster. For applications promising dream-like experiences that shift and evolve based on user emotion, this deployment bottleneck creates a fundamental problem: how do you build software that feels alive when every change requires a week-long approval cycle?
This tension between creative ambition and platform constraints surfaced repeatedly at DreamWare Hackathon 2025, where 29 teams spent 72 hours building applications meant to “engineer the surreal.” Projects promised interfaces that behave organically, visuals that metamorphose based on mood, and experiences that remember feelings rather than just data points. Each vision collided with the same technical reality: native mobile apps are static artifacts, frozen at deployment until the next update.
The BDUI Solution
Egor Korotkii builds systems that solve this problem at scale. As an iOS developer at AliExpress, he created Fusion, a Backend-Driven UI (BDUI) engine that lets interface changes ship without going through App Store or Google Play review. The interface itself becomes data, streamed from servers and rendered dynamically on the client. Changes that once required app updates now deploy instantly.
The technical architecture is straightforward in concept, demanding in execution. Instead of hardcoding UI components, the app receives JSON or similar structured data describing what to render: component types, layouts, styling, interaction handlers. The client interprets this specification and constructs the interface at runtime. Update the server payload, and every user sees the new experience immediately.
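In Swift terms, a minimal sketch might look like the following, assuming a deliberately simplified JSON schema. The payload format, the `ComponentSpec` model, and the `render` walk are illustrative only, not Fusion's actual implementation.

```swift
import Foundation

// Hypothetical BDUI payload: the server describes components, not pixels.
let payload = """
{
  "type": "stack",
  "children": [
    { "type": "text",   "value": "Good evening", "style": "title" },
    { "type": "button", "value": "Start dream",  "action": "open_dream" }
  ]
}
"""

// Client-side model mirroring the payload.
struct ComponentSpec: Decodable {
    let type: String
    let value: String?
    let style: String?
    let action: String?
    let children: [ComponentSpec]?
}

// The client interprets the specification at runtime instead of hardcoding
// the screen. A real engine would map each spec to a UIKit or SwiftUI view;
// this walk just prints the tree to show the shape of the idea.
func render(_ spec: ComponentSpec, indent: String = "") {
    print("\(indent)<\(spec.type)> \(spec.value ?? "")")
    spec.children?.forEach { render($0, indent: indent + "  ") }
}

do {
    let spec = try JSONDecoder().decode(ComponentSpec.self, from: Data(payload.utf8))
    render(spec)
} catch {
    print("Invalid UI payload: \(error)")
}
```

Change the JSON the server returns, and every client renders the new screen on its next fetch; the binary shipped through the store never changes.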
For e-commerce, this enables rapid experimentation—new promotional layouts, personalized product presentations, A/B tests that don’t require deployment cycles. For creative applications promising fluid, evolving interfaces, BDUI provides the architectural foundation that makes such promises technically achievable.
His background spans the major Russian technology platforms: VK, where he led refactoring of the main application feed from Objective-C to Swift; Mail.ru Group, where he worked on the Youla marketplace app; and now AliExpress, building infrastructure that serves millions of users. His open-source library UIRefreshKit—handling pull-to-refresh and pagination patterns—reflects an orientation toward reusable, well-architected solutions. As a lecturer in advanced iOS development at VK Education, he’s taught these patterns to the next generation of mobile developers.
Fluid Interfaces Without App Updates
DreamWare’s “Dream Fragments” criteria included “Fluid Interface”—UI that behaves organically, shifting like sand or water. Traditional mobile development makes this extraordinarily difficult to iterate on. Each experimental variation requires a build, a submission, a review, and user adoption of the update. The feedback cycle stretches across weeks.
BDUI collapses this cycle to minutes. An interface that morphs based on time of day? Define the transformation rules server-side, push different layouts at different hours. A visual style that evolves as users interact? Track interaction patterns on the backend, adjust the UI specification in response. The app becomes a rendering engine; the experience becomes data.
Projects like “The Living Dreamspace,” which generates reactive music based on user emotional states, or “Marina Ocean,” which creates emotion-reactive environments, implicitly require this kind of dynamic interface capability. The visual environment must respond to detected emotional states—but hardcoding every possible emotional-visual mapping into a native app creates an explosion of complexity. Server-driven approaches let the mapping logic live on the backend, where it can be tuned, expanded, and personalized without touching the client.
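A rough sketch of what that backend mapping can look like, covering both the time-of-day and emotion-driven cases described above; the context fields, theme names, and layout identifiers are hypothetical, not drawn from Fusion or from any of these projects.

```swift
import Foundation

// Hypothetical server-side selection: the backend already holds the context,
// so it can emit a different UI specification per request, and every client
// picks up the change on its next fetch with no app release in between.
struct UIContext {
    let hour: Int          // local hour of day reported by the client
    let emotion: String    // label produced by whatever analysis the product uses
    let interactions: Int  // e.g. taps on the dream journal this week
}

func layoutSpec(for context: UIContext) -> String {
    // Time-of-day transformation rules live on the server.
    let nightHours = Set(20...23).union(0...5)
    let palette = nightHours.contains(context.hour) ? "night_palette" : "day_palette"

    // The emotion-to-visual mapping is plain data here, so it can be tuned,
    // expanded, and personalized without touching the client.
    let themes = ["calm": "ocean_theme", "anxious": "soft_dim_theme"]
    let theme = themes[context.emotion, default: "neutral_theme"]

    // Interaction patterns adjust what the screen leads with.
    let hero = context.interactions > 10 ? "dream_journal_card" : "onboarding_card"

    return """
    { "type": "screen", "palette": "\(palette)", "theme": "\(theme)",
      "children": [ { "type": "\(hero)" } ] }
    """
}

print(layoutSpec(for: UIContext(hour: 22, emotion: "anxious", interactions: 14)))
```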
The DreamWare submission “DreamGlobe” enables users to share dreams on a global map with AI voice interaction. The voice feature orchestrates music pausing, speech processing, and audio synthesis. Coordinating these modalities while maintaining visual coherence across different dream types and user contexts is precisely the problem BDUI architectures address. The server can emit different interface configurations based on dream content, user preferences, and contextual factors—dynamic orchestration that static apps cannot achieve.
Memory Leak Detection and Production Reliability
Creative ambition means nothing if the app crashes. Korotkii's work at AliExpress included designing a memory leak detection mechanism and integrating it into CI, with logging and real-time data visualization in Grafana. This infrastructure significantly increased the app's crash-free rate, a metric that directly affects user retention and App Store rankings.
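The AliExpress implementation itself isn't detailed here, but one common building block for this kind of CI check is weak-reference tracking: register objects that should be deallocated with their screen, then flag any that outlive it. A minimal sketch, with hypothetical names:

```swift
import Foundation

// A minimal version of the weak-reference pattern a CI leak check can build
// on: register objects that should die with their screen, then report any
// that are still alive afterwards. This is an illustration of the general
// technique, not the AliExpress mechanism.
final class LeakTracker {
    private var tracked: [(name: String, ref: () -> AnyObject?)] = []

    func track(_ object: AnyObject, name: String) {
        // Capture weakly so the tracker itself never keeps the object alive.
        tracked.append((name: name, ref: { [weak object] in object }))
    }

    // A CI job can fail the build on a non-empty result and push the count
    // to a dashboard such as Grafana.
    func leakedObjects() -> [String] {
        tracked.filter { $0.ref() != nil }.map { $0.name }
    }
}

final class DreamScreenViewModel {}  // stand-in for an object under test

let tracker = LeakTracker()
var viewModel: DreamScreenViewModel? = DreamScreenViewModel()
tracker.track(viewModel!, name: "DreamScreenViewModel")

viewModel = nil  // simulate dismissing the screen
print(tracker.leakedObjects())  // prints [] if nothing kept the object alive
```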
For emotional AI applications, reliability carries additional weight. Users engaging with wellness applications, AI companions, or therapeutic tools are often in vulnerable states. A crash during an emotional check-in doesn’t just frustrate—it potentially harms. The user sharing difficult feelings with an AI companion, interrupted by an unexpected termination, may not return.
The engineering discipline required to maintain crash-free rates at scale—millions of users, diverse devices, unpredictable network conditions—transfers directly to creative applications. DreamWare projects that impressed in demos but lacked this production hardening will struggle in deployment. Memory leaks accumulate. Edge cases multiply. The beautiful interface that worked perfectly on the developer’s device fails unpredictably on older hardware with constrained memory.
This is where hackathon evaluation diverges from production readiness assessment. A 72-hour project cannot implement comprehensive crash analytics and memory profiling. But evaluators with production experience recognize which architectures will scale and which contain hidden fragility. The difference between a promising prototype and a deployable product often lies in these invisible infrastructure decisions.
GraphQL and Flexible Data Requirements
At VK, Korotkii integrated GraphQL-based API endpoints that allowed flexible configuration of the feed output. GraphQL, a query language developed at Facebook, lets clients specify exactly what data they need rather than accepting fixed server responses. For dynamic interfaces, this flexibility is essential.
Consider an emotional AI application that visualizes user mood history. On a phone, you might show a simple timeline. On a tablet, a richer visualization with multiple data dimensions. A traditional REST API returns the same payload regardless of device context, forcing the client to either over-fetch (wasting bandwidth, slowing load times) or make multiple requests (adding latency, complexity). GraphQL lets each client request precisely the data its interface requires.
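A minimal sketch of that request, assuming a hypothetical `moodHistory` field and endpoint; the phone client asks only for the two fields its timeline actually renders.

```swift
import Foundation

// Hypothetical GraphQL request: the phone asks only for the fields its
// simple timeline needs. The endpoint URL and schema are illustrative.
let phoneQuery = """
query {
  moodHistory(last: 30) {
    date
    dominantEmotion
  }
}
"""
// A tablet client could extend the same query with more dimensions
// (intensity, notes, tags) without any change on the backend.

var request = URLRequest(url: URL(string: "https://api.example.com/graphql")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try? JSONSerialization.data(withJSONObject: ["query": phoneQuery])

URLSession.shared.dataTask(with: request) { data, _, _ in
    // The response contains exactly the requested fields: no over-fetching,
    // no second round trip to fill in missing data.
    if let data, let body = String(data: data, encoding: .utf8) {
        print(body)
    }
}.resume()
```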
Combined with BDUI, GraphQL enables interfaces that adapt not just visually but informationally. The server specifies what UI to render; the UI requests exactly the data it needs to populate that rendering. The architecture supports genuine adaptation rather than responsive scaling of static designs.
DreamWare projects involving data visualization—emotional analytics dashboards, dream pattern recognition, mood tracking over time—benefit from this architectural pattern. “DearDiary,” which implements real-time sentiment analysis with an analytics dashboard showing emotional patterns, faces exactly these data-fetching challenges. Showing “your anxious Mondays in a chart” requires different data granularity than a simple mood log. Flexible query systems support diverse visualization needs without backend proliferation.
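Against the same hypothetical schema, a chart like that would send a coarser, aggregated query rather than pulling every entry and grouping on the device; the field and argument names below are again assumptions.

```swift
// Hypothetical aggregate query for an "anxious Mondays" chart: the
// granularity changes, the endpoint does not.
let weekdayBreakdown = """
query {
  emotionBreakdown(emotion: "anxious", groupBy: WEEKDAY, last: 90) {
    weekday
    entryCount
    averageIntensity
  }
}
"""
print(weekdayBreakdown)
```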
The Architecture Merger Problem
One of Korotkii’s notable achievements at VK involved redesigning project architecture to facilitate the codebase merger of two applications, reducing development costs. This unglamorous work—reconciling conventions, resolving conflicts, creating unified patterns—represents the kind of architectural judgment that determines long-term project viability.
Creative applications often start as experiments. A hackathon project becomes a prototype becomes a product. Each transition introduces integration challenges. The AI component built as a standalone service must integrate with the user management system. The visualization layer built with one framework must coexist with analytics built in another. The memory system designed for single-user testing must scale to concurrent access.
Teams that architect for eventual integration—clean interfaces between components, consistent conventions, documented boundaries—navigate these transitions smoothly. Teams that optimize only for immediate demonstration accumulate technical debt that compounds with each evolution. The codebase merger experience, painful as it is, teaches patterns that prevent future pain.
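One concrete form this takes is a small protocol between components, so a demo-day stub and a later production service are interchangeable. A sketch with hypothetical names:

```swift
import Foundation

// Illustrative boundary: the emotion-analysis component is consumed through
// a small protocol, so the UI layer never knows whether it is talking to a
// hackathon stub or a production model.
struct EmotionReading {
    let label: String      // e.g. "calm", "anxious"
    let confidence: Double // 0.0 ... 1.0
}

protocol EmotionAnalyzing {
    func analyze(_ text: String) -> EmotionReading
}

// Demo-day implementation: keyword matching, good enough for a prototype.
struct KeywordEmotionAnalyzer: EmotionAnalyzing {
    func analyze(_ text: String) -> EmotionReading {
        let anxiousWords = ["worried", "stressed", "afraid"]
        let isAnxious = anxiousWords.contains { text.lowercased().contains($0) }
        return EmotionReading(label: isAnxious ? "anxious" : "calm", confidence: 0.5)
    }
}

// The UI depends only on the protocol, so swapping in a networked model
// later is a one-line change at the composition root.
let analyzer: EmotionAnalyzing = KeywordEmotionAnalyzer()
print(analyzer.analyze("I'm worried about tomorrow").label)  // "anxious"
```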
For DreamWare submissions aiming to become real products, this consideration matters. The judges evaluating technical execution look not just at what works today but at what will work as the project grows. Architecture that supports evolution differs from architecture that merely supports demonstration.
Teaching and Mentorship
Beyond production work, Korotkii serves as a lecturer in advanced iOS development at VK Education and has mentored students entering the field. This teaching orientation influences how he evaluates projects: not just assessing current implementation but recognizing growth trajectories.
A hackathon submission with rough edges but sound architectural instincts differs fundamentally from a polished demo built on fragile foundations. The first can be refined; the second must be rebuilt. Experienced educators recognize this distinction because they’ve watched students progress from shaky beginnings to solid implementations—and watched other students polish surfaces while avoiding structural problems.
This perspective applies to DreamWare’s creative ambitions. Teams attempting genuinely novel interactions—typing patterns becoming music, dreams rendered as navigable spaces, emotions preserved across digital afterlives—will produce imperfect first implementations. The question is whether the imperfection is superficial or fundamental. Does the architecture support what the concept promises, even if the current execution falls short? Or does the impressive demo mask structural limitations that will prevent the concept from ever being fully realized?
Server-Driven Creative Applications
The broader pattern emerging from BDUI architecture suggests a direction for creative AI applications: treat the interface as a living specification rather than a static artifact. The emotional AI companion doesn’t just generate text responses—it emits interface configurations that change how responses are presented based on emotional context. The dream visualization tool doesn’t just render images—it defines interaction patterns that adapt to dream content.
This architectural shift has implications beyond deployment convenience. When interfaces become data, they become subject to the same machine learning techniques applied to other data. An AI system can learn which interface configurations correlate with positive user outcomes. Personalization extends from content to presentation. The app doesn’t just remember what you told it—it remembers how you prefer to interact and adapts its interface accordingly.
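A toy sketch of that feedback loop, treating each interface configuration as an option whose observed success rate guides what to show next; this illustrates the idea rather than describing any shipping system.

```swift
import Foundation

// Toy feedback loop: track how often each interface configuration leads to a
// positive outcome (such as a completed check-in), and mostly show the best
// performer while occasionally exploring alternatives.
struct ConfigStats { var shows = 0; var successes = 0 }

final class ConfigSelector {
    private var stats: [String: ConfigStats]
    private let explorationRate = 0.1

    init(configIDs: [String]) {
        stats = Dictionary(uniqueKeysWithValues: configIDs.map { ($0, ConfigStats()) })
    }

    // Epsilon-greedy choice: explore a random configuration 10% of the time,
    // otherwise pick the one with the best observed success rate.
    func nextConfig() -> String {
        if Double.random(in: 0...1) < explorationRate {
            return stats.keys.randomElement()!
        }
        return stats.max { successRate($0.value) < successRate($1.value) }!.key
    }

    func record(config: String, success: Bool) {
        stats[config]?.shows += 1
        if success { stats[config]?.successes += 1 }
    }

    private func successRate(_ s: ConfigStats) -> Double {
        s.shows == 0 ? 0 : Double(s.successes) / Double(s.shows)
    }
}

let selector = ConfigSelector(configIDs: ["warm_minimal", "deep_ambient"])
let shown = selector.nextConfig()
selector.record(config: shown, success: true)  // e.g. the user finished a check-in
print(shown)
```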
For DreamWare’s vision of “Emotional Memory”—applications that remember feelings, not just data points—this interface-as-data pattern provides technical grounding. The feeling isn’t just stored in a database; it’s reflected in how the application presents itself. Return to the app in a state similar to a previous visit, and the interface echoes that previous configuration, creating continuity of experience that static apps cannot achieve.
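A minimal sketch of that recall, assuming emotional state is reduced to a small feature vector and each visit records which server-side configuration was shown; both the representation and the names are assumptions.

```swift
import Foundation

// Sketch of "the interface echoes a previous visit": each visit stores the
// emotional state and the UI configuration shown; a return visit reuses the
// configuration recorded for the most similar state.
struct EmotionalSnapshot {
    let features: [Double]   // e.g. [valence, arousal]
    let uiConfigID: String   // identifier of the UI spec the server emitted
}

func distance(_ a: [Double], _ b: [Double]) -> Double {
    zip(a, b).map { pair in (pair.0 - pair.1) * (pair.0 - pair.1) }
        .reduce(0, +)
        .squareRoot()
}

func recalledConfig(for current: [Double], history: [EmotionalSnapshot]) -> String? {
    history.min { distance($0.features, current) < distance($1.features, current) }?.uiConfigID
}

let history = [
    EmotionalSnapshot(features: [0.2, 0.8], uiConfigID: "stormy_night_layout"),
    EmotionalSnapshot(features: [0.9, 0.3], uiConfigID: "sunlit_morning_layout"),
]
// A return visit in a similar low-valence, high-arousal state echoes the earlier layout.
print(recalledConfig(for: [0.25, 0.75], history: history) ?? "default_layout")
```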
Production Realities for Dream-Like Software
The gap between hackathon demonstration and production deployment is where most creative software projects fail. The demo works because the developer controls the conditions. Production fails because users are unpredictable, devices are diverse, networks are unreliable, and edge cases multiply faster than engineers can address them.
BDUI architectures, memory leak detection, GraphQL flexibility, clean component boundaries—these aren’t glamorous features to list in a hackathon submission. They’re the infrastructure that determines whether the glamorous features actually work when real users encounter them. The dream-like interface that flows like water becomes a nightmare when it crashes, lags, or drains battery. The emotional AI companion that promises understanding becomes frustrating when it loads slowly or renders incorrectly on older devices.
DreamWare Hackathon 2025 showcased creative visions pushing the boundaries of what software can understand about human emotion and experience. The teams that translate those visions into products users can rely on will be the ones that pair creative ambition with engineering discipline—building infrastructure as carefully as they design experiences.
The surreal, it turns out, requires solid foundations.
DreamWare Hackathon 2025 was organized by Hackathon Raptors, a Community Interest Company supporting innovation in software development. The event featured 29 teams competing across 72 hours with $2,300 in prizes. Egor Korotkii served as a judge evaluating projects for technical execution, conceptual depth, and originality.
