JetpackCompose.app's Dispatch Issue #17
In today's edition, you get the inside scoop from Google I/O, interesting updates from the event that impact your daily life, and hear from other Android experts about what they are most excited about.
1/3 of mobile release time is wasted on low-value work [2025 Report]
Coordination overload, manual bottlenecks, and ineffective automation silently undermine your mobile team's productivity.
Learn what's holding you back and how to fix the top issues.
GM Friends. This is JetpackCompose.app's Dispatch, the #1 doctor-recommended source for your weekly Android knowledge.
This is a very special issue of the newsletter because it is the yearly Google I/O special. I had the fortune of attending Google I/O 2025 in person and had a blast meeting a lot of friends, old and new. There are a lot of exciting updates to cover, so let's dive right in.

(L-R) Mike Evans, Etienne Caron, Vinay Gaba (me) & Kaushik Gopal at the Keynote
Top Highlights from Google I/O 2025
Google unleashed a waterfall of announcements this year: AI-infused everything, smarter tools, and a few genuine jaw-droppers. I watched the keynote so you don't have to; what follows is the distilled, no-fluff rundown of the features most likely to change your day-to-day as an Android developer.

This was an action-packed I/O, one that deserves to be remembered given how much actually launched and was available to use on the same day.
Android XR: The Future is Here (And It's Mind-Blowing)

The Android XR Glasses got the crowd hyped up!
Last October's Android XR announcement didn't wow me, but the I/O demo sure did. The new Android XR glasses fuse Gemini AI with lightweight hardware: live translation, heads-up directions, voice-triggered photos, even "what am I looking at?" answers. Because they run Android XR, your existing Android skills translate over. Unlike Wear OS, these glasses solve real problems and feel mainstream-ready. And this is coming from someone who is no stranger to this form factor (I worked at Snap in a past life). If you're itching for a moon-shot side project, start here.
Jules: Your New Asynchronous Coding Buddy
Google's Jules (public beta) is Copilot on steroids. Give it a task like "write tests for the login page," "bump the Kotlin version," or "fix the flaky login," and it:
Clones your repo into an isolated cloud VM.
Reads the whole project, not just the open tab.
Shows its plan, then edits code, pushes a branch, and attaches an audio changelog (!).

I connected my GitHub repo and asked it to fix a long-standing bug that I wasn't motivated to spend time on. It did an alright job and even pushed a branch with the fix. All I had to do was merge the change.
I let it tackle a year-old bug; it produced a working fix I just merged. It's not perfect, but for grunt work it's a tireless junior dev running at 10x speed, and it's free while in beta.
Just remember: like all AI tools, it's getting better fast, so even if it's not perfect today, it'll probably be scary good in six months.
As we go through the new and exciting features from Google I/O 2025, let's pause and take the time to test our chops and see if we've still got it!
Despite our best efforts, sometimes issues slip through the cracks. That's why having an amazing observability tool in your corner is important to keep your app humming. Bitdrift gives you on-demand observability that's custom-built for mobile teams: no log limits, no overage fees, and no waiting on your next app release. Learn More
When an item in the list is clicked, our users keep tripping a ConcurrentModificationException. Can you see why? Where's the bug that makes the app crash?
@Composable
fun SmoothScrollingList(
    items: List<String>,
    onItemClick: (String) -> Unit
) {
    val listState = rememberLazyListState()
    val scope = rememberCoroutineScope()

    LazyColumn(state = listState) {
        items(items) { item ->
            Card(
                modifier = Modifier
                    .fillMaxWidth()
                    .padding(8.dp)
                    .clickable {
                        onItemClick(item)
                        scope.launch {
                            listState.animateScrollToItem(0)
                        }
                    }
            ) {
                Text(item, Modifier.padding(16.dp))
            }
        }
    }
}
Find the answer in a section below.
Android Studio's AI Glow-Up
Google I/O 2025 brought some genuinely exciting updates to Android Studio that go way beyond typical "AI washing." Let me break down the features that actually matter:
Agent Mode finally brings Cursor-style "vibe coding" abilities to Android Studio. Additionally, specialized helpers like the Version Upgrade Agent can read release notes (!!!) and auto-fix breaking changes. Goodbye, dependency dread.
Journeys: Describe a user flow in plain English like "User logs in, navigates to profile, updates their name, and saves changes" and Gemini writes and runs Espresso tests for you (see the Espresso sketch after this list for the boilerplate this saves you from writing). The demo was a bit bumpy and clearly still a work in progress, but I'm willing to bet this will improve meaningfully over time.
App Quality Insights now proposes oneāclick crash fixes after analyzing your source code.
Resizable Previews + Transform UI with Gemini let you tweak Compose layouts in plain English and instantly test across form factors. Right-click in the preview, select "Transform UI With Gemini," and say something like "Center align these buttons."
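To appreciate what Journeys is automating away, here is a rough, hand-written sketch of the Espresso test that the "logs in, updates their name, saves" flow above would normally require. MainActivity and every R.id.* reference are hypothetical placeholders, and the tests Gemini generates will certainly look different; this just shows the boilerplate that a plain-English journey stands in for.

import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.action.ViewActions.click
import androidx.test.espresso.action.ViewActions.closeSoftKeyboard
import androidx.test.espresso.action.ViewActions.replaceText
import androidx.test.espresso.action.ViewActions.typeText
import androidx.test.espresso.assertion.ViewAssertions.matches
import androidx.test.espresso.matcher.ViewMatchers.withId
import androidx.test.espresso.matcher.ViewMatchers.withText
import androidx.test.ext.junit.rules.ActivityScenarioRule
import org.junit.Rule
import org.junit.Test

class UpdateProfileNameJourneyTest {

    // MainActivity and all R.id.* references below are hypothetical placeholders
    @get:Rule
    val activityRule = ActivityScenarioRule(MainActivity::class.java)

    @Test
    fun userUpdatesProfileName() {
        // "User logs in"
        onView(withId(R.id.email)).perform(typeText("test@example.com"))
        onView(withId(R.id.password)).perform(typeText("hunter2"), closeSoftKeyboard())
        onView(withId(R.id.login_button)).perform(click())

        // "navigates to profile, updates their name"
        onView(withId(R.id.profile_tab)).perform(click())
        onView(withId(R.id.display_name)).perform(replaceText("Vinay"), closeSoftKeyboard())

        // "and saves changes"
        onView(withId(R.id.save_button)).perform(click())
        onView(withId(R.id.display_name)).check(matches(withText("Vinay")))
    }
}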
Stitch: From Idea to App in Minutes
Stitch is Google's new experiment that turns natural-language descriptions or rough sketches into functional UI code. But here's the exciting bit: it doesn't just give you code, it also exports to Figma, creating a seamless bridge between design and development. I mocked up a new JetpackCompose.app landing page in minutes.

This addresses the handoff between design and development, which is one of the biggest pain points in our industry.
The tool leverages Gemini 2.5 Pro's multimodal capabilities, which explains why it can understand both text descriptions and visual inputs so well.
What excites me most is the bidirectional workflow. You can start with code, export to Figma for design refinement, then bring it back to code. Or start with a design, get the code, and iterate from there. This kind of flexibility could dramatically speed up the design-development cycle.
Gemini in Android Studio for Business addresses the elephant in the room
Google clearly heard the feedback about data privacy concerns in enterprise environments. Gemini in Android Studio for businesses now offers enterprise-grade privacy and security features.
The key differentiator is their data governance policy: your code, inputs, and AI-generated recommendations won't be used to train shared models. You control and own your data and IP. They're also providing IP indemnification: protection against third parties claiming copyright infringement related to AI-generated code.
For enterprise customers, there's also code customization where Gemini can connect to your GitHub, GitLab, or BitBucket repositories to understand your team's preferred frameworks and coding patterns.
Jetpack Navigation 3: you finally own the back stack
After years of complaints about the existing Navigation library, Google went back to the drawing board and built Jetpack Navigation 3 from scratch, specifically for Compose. And honestly, it's about time!
The fundamental shift here is philosophical: you own the back stack. Instead of the library managing an opaque back stack that you can only observe indirectly, Nav3 gives you a simple SnapshotStateList<T> where T can be any type you choose. Want to navigate? Just add or remove items from the list. It's that simple.
// Create a back stack with your routes
val backStack = remember { mutableStateListOf<Any>(Home) }
// Navigate by adding to the back stack
backStack.add(Product("123"))
// Go back by removing from the back stack
backStack.removeLastOrNull()
This approach solves several major pain points:
No more dual sources of truth: your back stack state is the single source of truth
Adaptive layouts are finally possible: you can display multiple destinations simultaneously for list-detail layouts on large screens
Complete control over navigation behavior: no more fighting the library to implement custom navigation patterns
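To make the "you own the back stack" idea concrete, here is a minimal, hand-rolled sketch that renders whatever sits on top of a plain SnapshotStateList and wires the system back button to it. The Route types, AppNavHost, and the toy screens are made up for illustration, and the real Nav3 APIs layer much more on top of this pattern; treat it as a mental model rather than the library's API.

import androidx.activity.compose.BackHandler
import androidx.compose.foundation.clickable
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.mutableStateListOf
import androidx.compose.runtime.remember
import androidx.compose.ui.Modifier

// Hypothetical route types for this sketch
sealed interface Route
data object Home : Route
data class Product(val id: String) : Route

@Composable
fun AppNavHost() {
    // The back stack is just snapshot state that you own
    val backStack = remember { mutableStateListOf<Route>(Home) }

    // The system back button pops your list directly
    BackHandler(enabled = backStack.size > 1) {
        backStack.removeLastOrNull()
    }

    // Render whatever is on top; on a large screen nothing stops you
    // from showing the top two entries side by side instead
    when (val current = backStack.last()) {
        is Home -> Text(
            "Home - tap to open a product",
            Modifier.clickable { backStack.add(Product("123")) }
        )
        is Product -> Text("Product ${current.id}")
    }
}

Because the list is snapshot state, any mutation automatically recomposes the affected destinations; there is no separate navigation state to keep in sync.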
While the API is still evolving, the mental model feels so much more natural for Compose development. Check out the recipes repository for practical examples.
Jetpack Compose gets serious about performance and stability
The latest Compose updates address two major developer concerns: performance and API stability.
On the performance front, Google introduced Pausable Composition, which allows composition work to be split across multiple frames, eliminating jank. Combined with background text prefetch and improved LazyLayout prefetch, these changes reportedly eliminate nearly all jank in Google's internal benchmarks.
They also added a new diagnostic feature that provides better stack traces in debug builds:
class App : Application() {
    override fun onCreate() {
        super.onCreate()
        // Only enable richer Compose stack traces in debug builds
        Composer.setDiagnosticStackTraceEnabled(BuildConfig.DEBUG)
    }
}
Two new lint checks help catch common performance issues:
@FrequentlyChangingValue warns when reading frequently changing values (like scroll position) might cause excessive recompositions (see the sketch below)
@RememberInComposition catches cases where you should be using remember but aren't
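To make the first check concrete, here is the classic scroll-position case it is presumably aimed at (my own illustration, not code from the talk; ScrollToTopButton is a made-up composable). Reading firstVisibleItemIndex directly in composition invalidates on every scrolled frame, while derivedStateOf only recomposes when the derived value actually changes.

import androidx.compose.foundation.lazy.LazyListState
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.derivedStateOf
import androidx.compose.runtime.getValue
import androidx.compose.runtime.remember

@Composable
fun ScrollToTopButton(listState: LazyListState, onClick: () -> Unit) {
    // Reading listState.firstVisibleItemIndex directly in composition would
    // recompose this composable on every scrolled frame - the kind of read
    // the new lint check is designed to flag.
    val showButton by remember(listState) {
        // derivedStateOf only invalidates when the Boolean actually flips
        derivedStateOf { listState.firstVisibleItemIndex > 0 }
    }
    if (showButton) {
        Button(onClick = onClick) { Text("Back to top") }
    }
}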
Google also mentioned that their apps now develop against daily snapshots of Compose, meaning issues are caught and fixed long before reaching public releases. This should significantly improve the upgrade experience we've all been struggling with.
The theme across all these announcements is clear: Google is betting big on AI-assisted development, and they're building the tools to make it happen. What strikes me most is how these aren't just gimmicky AI features; they're solving real problems we face every day.
The question isn't whether AI will change how we develop Android apps; it's how quickly we adapt to these new capabilities. The developers who embrace these tools early will have a significant productivity advantage.

𩺠Community Pulse
I also used this opportunity to ask some really amazing droidheads a simple question:
"If you are only allowed to name one thing, what are you most excited about from Google I/O 2025?"
"Journeys for Android Studio."
Ben Schwab | Member of Technical Staff @ Anthropic • ex-Airbnb, Coursera
"Android XR was truly a remarkable demo. We've already seen a paradigm shift with AI, and maybe this is what gets Google Glass take 2 to work. As an Android dev it was great to see that we can use our familiar stack of Android Studio, Compose, and Firebase to power XR apps. With developer preview 2 out, I am eager to get my hands dirty and build the much sought-after timer app."
Saurabh Arora | GDE • Android @ X • ex-Viki
"Performance is probably one of the topics not a lot of people talk about, and it is one of the most important areas for any product. Especially how the tooling and guidance to make it better has been growing in the last year, taking into consideration the wide adoption of Baseline Profiles. Integrations inside Android Studio, with dynamic and static scores, give us metrics that can help us take concrete actions toward a better experience for our users."
Dinorah Tovar | GDE • Mobile Engineer
"Google always gives us tantalizing glances into their research. AR glasses are probably a top pick for a lot of folks, and Gemini Robotics was quite impressive. But if we stick to tech that's actually available? The Gemini Live APIs are my answer."
Etienne Caron | GDE • Technical Founder @ Kanastruk • ex-Shopify, Intel
"All things AI! Especially Gemini, Gemma, on-device models, LLMs, and so much more. It's both incredibly exciting and a little nerve-wracking to see how fast things are evolving."
Meha Garg | Android @ Tinder • ex-IBM, CloudOn, Siemens
"I'm excited about the enhanced AI tools in Android Studio. Using the agent-powered Gemini directly in the IDE should streamline the development workflow, especially compared to my current setup with Cursor, which requires frequent app switching. Automatic library upgrades will save a ton of time, and the new Journeys-based UI testing looks like a promising way to make writing integration tests more intuitive and effective."

What's really happening
1. LazyColumn starts its layout pass, iterating the same MutableList instance.
2. onItemClick mutates that list (remove, update, etc.) before the frame ends.
3. The iterator notices its structure changed mid-stride and throws ConcurrentModificationException.
The scroll animation isn't the direct culprit; it just widens the window in which iteration and mutation overlap.
How do we fix it?
Re-order the operations: scroll first (while the list is still intact), then mutate.
@Composable
fun SmoothScrollingList(
    items: List<String>,
    onItemClick: (String) -> Unit
) {
    val listState = rememberLazyListState()
    val scope = rememberCoroutineScope()

    LazyColumn(state = listState) {
        items(
            items = items,
            key = { it } // stable keys help recycling
        ) { item ->
            Card(
                modifier = Modifier
                    .fillMaxWidth()
                    .padding(8.dp)
                    .clickable {
                        scope.launch {
                            // Scroll first (while the list is intact), mutate afterwards
                            listState.animateScrollToItem(0)
                            onItemClick(item)
                        }
                    }
            ) {
                Text(item, Modifier.padding(16.dp))
            }
        }
    }
}
How you can help
If you enjoy reading this newsletter and would like to keep Dispatch free, here are two quick ways to help:
Chip in via GitHub Sponsors. If your company offers an education budget, contributing a "coffee" to keep this newsletter going will certainly qualify. You'll get an instant receipt for reimbursement, and I'll keep Dispatch running.
Spread the word. Tweet, Bluesky, toot, Slack, or carrier-pigeon this issue to someone who loves Android. Each new subscriber pushes Dispatch closer to sustainability.
Let me hear it!
What'd you think of this email? Tap your choice below and lemme hear it!
AMAZING - LOVED IT!
PRETTY GOOD!
MEH - NOT GREAT
On that note, here's hoping that your bugs are minor and your compilations are error-free,
Vinay Gaba | Tech Lead Manager @ Airbnb | Google Developer Expert for Android | ex-Snap, Spotify, Deloitte
Vinay Gaba is a Google Developer Expert for Android and serves as a Tech Lead Manager at Airbnb, where he spearheads the UI Tooling team. His team's mission is to enhance developer productivity through cutting-edge tools leveraging LLMs and Generative AI. Vinay has deep expertise in Android, UI Infrastructure, Developer Tooling, Design Systems, and Figma Plugins. Prior to Airbnb, he worked at Snapchat, Spotify, and Deloitte. Vinay holds a Master's degree in Computer Science from Columbia University.