JetpackCompose.app's Dispatch Issue #17

💌 In today's edition, you get the inside scoop from Google I/O 2025: updates from the event that will impact your daily work, plus thoughts from other Android experts on what they're most excited about.

1/3 of mobile release time is wasted on low-value work [2025 Report]

Coordination overload, manual bottlenecks, and ineffective automation silently undermine your mobile team’s productivity.

Learn what’s holding you back and how to fix the top issues šŸ‘‡šŸ»

GM Friends. This is JetpackCompose.app’s Dispatch. The #1 Doctor recommended source for your weekly Android knowledge.

This is a very special issue of the newsletter because it's the yearly Google I/O special 🥳 I had the fortune of attending Google I/O 2025 in person and had a blast meeting a lot of friends, old and new. There are a lot of exciting updates to cover, so let's dive right in 🤿

(L-R) Mike Evans, Etienne Caron, Vinay Gaba (me) & Kaushik Gopal at the Keynote

šŸ”„ Top Highlights from Google I/O 2025

Google unleashed a waterfall of announcements this year: AI‑infused everything, smarter tools, and a few genuine jaw‑droppers. I watched the keynote so you don’t have to; what follows is the distilled, no‑fluff rundown of the features most likely to change your day‑to‑day as an Android developer.

This was an action-packed I/O, and one worth remembering: a ton of what was announced actually launched and was available to use the same day 👏🏻

Android XR: The Future is Here (And It's Mind-Blowing)

The Android XR Glasses got the crowd hyped up!

Last October's Android XR announcement didn't wow me, but the I/O demo sure did. The new Android XR glasses fuse Gemini AI with lightweight hardware: live translation, heads‑up directions, voice‑triggered photos, even "what am I looking at?" answers. Because they run Android XR, your existing Android skills translate over. Unlike Wear OS, these glasses solve real problems and feel mainstream‑ready. And this is coming from someone who's no stranger to this form factor (I worked at Snap in a past life). If you're itching for a moon‑shot side project, start here.

Jules: Your New Asynchronous Coding Buddy

Google's Jules (public beta) is Copilot on steroids. Give it a task like "write tests for the login page," "bump the Kotlin version," or "fix the flaky login test," and it:

  1. Clones your repo into an isolated Cloud VM.

  2. Reads the whole project, not just the open tab.

  3. Shows its plan, then edits code, pushes a branch, and attaches an audio changelog (!).

I connected my GitHub repo and let it tackle a year‑old bug I hadn't been motivated to fix myself. It produced a working fix and pushed the branch; all I had to do was merge the change. It's not perfect, but for grunt work it's a tireless junior dev running at 10× speed, and it's free while in beta.

Just remember – like all AI tools, it's getting better fast, so even if it's not perfect today, it'll probably be scary good in six months.

As we go through the new and exciting features from Google I/O 2025, let's pause to test our chops and see if we've still got it!

Despite our best efforts, sometimes issues slip through the cracks. That's why having an amazing observability tool in your corner is important to keep your app humming. Bitdrift gives you on-demand observability that's custom-built for mobile teams: no log limits, no overage fees, and no waiting on your next app release. Learn More

When an item in the list is clicked, our users keep hitting a ConcurrentModificationException. Can you spot the bug that makes the app crash?

@Composable
fun SmoothScrollingList(
    items: List<String>,
    onItemClick: (String) -> Unit
) {
    val listState = rememberLazyListState()
    val scope = rememberCoroutineScope()

    LazyColumn(state = listState) {
        items(items) { item ->
            Card(
                modifier = Modifier
                    .fillMaxWidth()
                    .padding(8.dp)
                    .clickable {
                        onItemClick(item)            
                        scope.launch {
                            listState.animateScrollToItem(0)
                        }
                    }
            ) {
                Text(item, Modifier.padding(16.dp))
            }
        }
    }
}

Find the answer in a section below ā¬‡ļø

Android Studio’s AI Glow‑Up

Google I/O 2025 brought some genuinely exciting updates to Android Studio that go way beyond typical "AI washing." Let me break down the features that actually matter:

Agent Mode finally brings Cursor‑style "vibe coding" abilities to Android Studio. Additionally, specialized helpers like the Version Upgrade Agent can read release notes (!!!) and auto‑fix breaking changes. Goodbye dependency dread.

Journeys: Describe a user flow in plain English like "User logs in, navigates to profile, updates their name, and saves changes" and Gemini writes & runs Espresso tests for you. The demo was a bit bumpy and clearly still a WIP but I’m willing to bet this will improve meaningfully over time.

App Quality Insights now proposes one‑click crash fixes after analyzing your source code.

Resizable Previews + Transform UI with Gemini let you tweak Compose layouts in plain English and instantly test across form factors. Right-click in the preview, select "Transform UI With Gemini," and say something like "Center align these buttons".

Stitch: From Idea to App in Minutes

Stitch is Google's new experiment that turns natural language descriptions or rough sketches into functional UI code. But here's the exciting bit – it doesn't just give you code, it also exports to Figma, creating a seamless bridge between design and development. I mocked a new JetpackCompose.app landing page in minutes.

This addresses the handoff between design and development, one of the biggest pain points in our industry.

The tool leverages Gemini 2.5 Pro's multimodal capabilities, which explains why it can understand both text descriptions and visual inputs so well.

What excites me most is the bidirectional workflow. You can start with code, export to Figma for design refinement, then bring it back to code. Or start with a design, get the code, and iterate from there. This kind of flexibility could dramatically speed up the design-development cycle.

Gemini in Android Studio for Business addresses the elephant in the room

Google clearly heard the feedback about data privacy concerns in enterprise environments. Gemini in Android Studio for businesses now offers enterprise-grade privacy and security features.

The key differentiator is their data governance policy: your code, inputs, and AI-generated recommendations won't be used to train shared models. You control and own your data and IP. They're also providing IP indemnification – protection against third parties claiming copyright infringement related to AI-generated code.

For enterprise customers, there's also code customization where Gemini can connect to your GitHub, GitLab, or BitBucket repositories to understand your team's preferred frameworks and coding patterns.

Jetpack Navigation 3: You Own the Back Stack

After years of complaints about the existing Navigation library, Google went back to the drawing board and built Jetpack Navigation 3 from scratch, specifically for Compose. And honestly, it's about time! 🎉

The fundamental shift here is philosophical: you own the back stack. Instead of the library managing an opaque back stack that you can only observe indirectly, Nav3 gives you a simple SnapshotStateList<T> where T can be any type you choose. Want to navigate? Just add or remove items from the list. It's that simple.

// Create a back stack with your routes
val backStack = remember { mutableStateListOf<Any>(Home) }

// Navigate by adding to the back stack
backStack.add(Product("123"))

// Go back by removing from the back stack
backStack.removeLastOrNull()

This approach solves several major pain points:

  • No more dual sources of truth – your back stack state is the single source of truth

  • Adaptive layouts are finally possible – you can display multiple destinations simultaneously for list-detail layouts on large screens

  • Complete control over navigation behavior – no more fighting the library to implement custom navigation patterns

While the API is still evolving, the mental model feels so much more natural for Compose development. Check out the recipes repository for practical examples.
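The ownership model is easy to see even outside Compose. Here's a minimal, Compose-free sketch (the Route, Home, and Product types are hypothetical stand-ins of my own) that treats a plain MutableList the way Nav3 treats your mutableStateListOf back stack:

```kotlin
// Hypothetical route types; Navigation 3 lets the back stack hold any type T.
sealed interface Route
object Home : Route
data class Product(val id: String) : Route

fun main() {
    // In Compose this would be remember { mutableStateListOf<Route>(Home) };
    // a plain MutableList demonstrates the same ownership model.
    val backStack = mutableListOf<Route>(Home)

    backStack.add(Product("123"))            // "navigate" = push a route
    check(backStack.last() == Product("123"))

    backStack.removeLastOrNull()             // "back" = pop the top route
    check(backStack.last() == Home)

    println("back stack size: ${backStack.size}") // prints "back stack size: 1"
}
```

Because the list is just state you own, things like clearing to root or swapping two destinations are ordinary list operations rather than library APIs.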

Jetpack Compose gets serious about performance and stability

The latest Compose updates address two major developer concerns: performance and API stability.

On the performance front, Google introduced Pausable Composition, which allows composition work to be split across multiple frames, eliminating jank. Combined with background text prefetch and improved LazyLayout prefetch, these changes reportedly eliminate nearly all jank in Google's internal benchmarks.

They also added a new diagnostic feature that provides better stack traces in debug builds:

class App : Application() {
    override fun onCreate() {
        super.onCreate()
        // Only enable the richer Compose stack traces in debug builds
        Composer.setDiagnosticStackTraceEnabled(BuildConfig.DEBUG)
    }
}

Two new lint checks help catch common performance issues:

  • @FrequentlyChangingValue warns when reading frequently changing values (like scroll position) might cause excessive recompositions

  • @RememberInComposition catches cases where you should be using remember but aren't

Google also mentioned that their apps now develop against daily snapshots of Compose, meaning issues are caught and fixed long before reaching public releases. This should significantly improve the upgrade experience we've all been struggling with.

The theme across all these announcements is clear: Google is betting big on AI-assisted development, and they're building the tools to make it happen. What strikes me most is how these aren't just gimmicky AI features – they're solving real problems we face every day.

The question isn't whether AI will change how we develop Android apps – it's how quickly we adapt to these new capabilities. The developers who embrace these tools early will have a significant productivity advantage.

🩺 Community Pulse

I also used this opportunity to ask some really amazing droidheads a simple question—

ā€œIf you are only allowed to name one thing, what are you most excited about from Google I/O 2025?ā€

Journeys for Android Studio

Ben Schwab | Member of Technical Staff @ Anthropic • ex-Airbnb, Coursera

Android XR was truly a remarkable demo. We've already seen a paradigm shift with AI, and maybe this is what gets Google Glass take 2 to work. As an Android dev, it was great to see that we can use our familiar stack of Android Studio, Compose, and Firebase to power XR apps. With developer preview 2 out, I am eager to get my hands dirty and build the much sought after timer app 😉

Saurabh Arora | GDE • Android @ X • ex-Viki

Performance, probably, is one of the topics not a lot of people talk about, and it's one of the most important areas for any product. Especially how the tooling and guidance to make it better has been growing in the last year: wide adoption of Baseline Profiles, integrations inside Android Studio, and dynamic and static scores that give us metrics to help us take action toward a better experience for our users.

Dinorah Tovar | GDE • Mobile Engineer

Google always gives us tantalizing glances into their research, AR glasses are probably a top pick for a lot of folks, Gemini Robotics was quite impressive. But if we stick to tech that's actually available? Gemini Live APIs is my answer.

Etienne Caron | GDE • Technical Founder @ Kanastruk • ex-Shopify, Intel

All things AI! Especially Gemini, Gemma, on device models, LLMs and so much more. It’s both incredibly exciting and a little nerve wracking to see how fast things are evolving.

Meha Garg | Android @ Tinder • ex-IBM, CloudOn, Siemens

I’m excited about the enhanced AI tools in Android Studio. Using the agent-powered Gemini directly in the IDE should streamline the development workflow—especially compared to my current setup with Cursor, which requires frequent app switching. Automatic library upgrades will save a ton of time, and the new Journeys-based UI testing looks like a promising way to make writing integration tests more intuitive and effective.

Michael Evans | GDE • Android @ Block • Formerly @Twitter

🧩 Pop Quiz: The Answer

What's really happening

  • LazyColumn starts its layout pass, iterating the same MutableList instance.

  • onItemClick mutates that list (remove, update, etc.) before the frame ends.

  • The iterator notices its structure changed mid-stride and throws ConcurrentModificationException.

The scroll animation isn’t the direct culprit—it just widens the window in which iteration and mutation overlap.
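You can reproduce the same failure mode in plain Kotlin, no Compose required. This minimal sketch (the function name is mine) mutates a list while an iterator is mid-walk, exactly like onItemClick mutating the list LazyColumn is iterating:

```kotlin
// Structurally changing a list while an iterator is walking it throws
// ConcurrentModificationException, just like in the LazyColumn case.
fun mutateWhileIterating(): Boolean {
    val items = mutableListOf("a", "b", "c")
    return try {
        for (item in items) {                   // iterator over the same instance
            if (item == "a") items.remove(item) // structural change mid-iteration
        }
        false
    } catch (e: ConcurrentModificationException) {
        true                                    // the iterator detected the change
    }
}

fun main() {
    println(mutateWhileIterating()) // prints "true": the crash reproduces
}
```

The iterator tracks the list's modification count and fails fast on the next step, which is why the crash appears on the frame after the click rather than at the mutation itself.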

How do we fix it?

Re-order the operations: scroll first (while the list is still intact), then mutate.

@Composable
fun SmoothScrollingList(
    items: List<String>,
    onItemClick: (String) -> Unit
) {
    val listState = rememberLazyListState()
    val scope = rememberCoroutineScope()

    LazyColumn(state = listState) {
        items(
            items = items,
            key = { it }          // stable keys help recycling
        ) { item ->
            Card(
                modifier = Modifier
                    .fillMaxWidth()
                    .padding(8.dp)
                    .clickable {
                        scope.launch {
                            // ⬅️ scroll first, mutate afterwards
                            listState.animateScrollToItem(0)
                            onItemClick(item)
                        }
                    }
            ) {
                Text(item, Modifier.padding(16.dp))
            }
        }
    }
}

🦄 How you can help

If you enjoy reading this newsletter and would like to keep Dispatch free, here are two quick ways to help:

šŸ‘‰šŸ» Chip in via Github Sponsors. If your company offers an education budget, contributing a ā€œcoffeeā€ to keep this newsletter going will certainly qualify. You’ll get an instant receipt for reimbursement, and I’ll keep Dispatch running ā˜•ļø

šŸ‘‰šŸ»Spread the word. Tweet, Bluesky, toot, Slack, or carrier‑pigeon this issue to someone who loves Android. Each new subscriber pushes Dispatch closer to sustainability.

šŸ‘‚ Let me hear it!

What’d you think of this email? Tap your choice below and lemme hear it šŸ‘‡

šŸš€šŸš€šŸš€šŸš€šŸš€ AMAZING-LOVED IT!
šŸš€šŸš€šŸš€ PRETTY GOOD!

On that note, here's hoping that your bugs are minor and your compilations are error-free,

Vinay Gaba
Tech Lead Manager @ Airbnb | Google Developer Expert for Android | ex-Snap, Spotify, Deloitte

Vinay Gaba is a Google Developer Expert for Android and serves as a Tech Lead Manager at Airbnb, where he spearheads the UI Tooling team. His team's mission is to enhance developer productivity through cutting-edge tools leveraging LLMs and Generative AI. Vinay has deep expertise in Android, UI Infrastructure, Developer Tooling, Design Systems and Figma Plugins. Prior to Airbnb, he worked at Snapchat, Spotify, and Deloitte. Vinay holds a Master's degree in Computer Science from Columbia University.
