Last Friday, the starOS Labs team showed up to Google's offices for one of Peter Danenberg's bi-weekly Gemini developer meetups. We came to listen. We left with a lot to think about.

Who's Peter Danenberg?
If you haven't encountered him yet, Peter Danenberg is a Senior Software Engineer at Google DeepMind, leading rapid prototyping for Gemini. He's also the person who built the Gemini Meetup into one of the most active developer communities in the Bay Area — 69 events run, 14,000+ RSVPs. That's not a side project. That's a signal about how serious Google is about getting developers hands-on with the technology.
He's given invited talks at MIT, Oxford, Stanford, Harvard, TEDx, and the UN. The meetup format reflects that background — rigorous, demo-heavy, and focused on what's actually buildable today, not vaporware.
The Room
The session was packed. Laptops open, notebooks out, a screen showing live demos. The energy was focused — the kind of crowd that shows up to build, not to network for business cards.

The demos moved fast. Presenters walked through what Gemini is capable of right now — not polished marketing material, but live builds and real integrations. One demo featured a Gemini-powered SparkBot — a conversational AI assistant built on the Gemini API, showing how quickly you can get from API call to functional product.
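To make the "API call to functional product" path concrete, here is a minimal sketch of a SparkBot-style conversational loop. It assumes the google-genai Python SDK (`pip install google-genai`) and an API key in a `GEMINI_API_KEY` environment variable; the function names and message format here are illustrative, not the actual demo's code.

```python
import os

def build_history(turns: list[tuple[str, str]]) -> list[dict]:
    """Convert (role, text) pairs into the content format the Gemini API expects."""
    return [{"role": role, "parts": [{"text": text}]} for role, text in turns]

def chat(message: str, history: list[tuple[str, str]]) -> str:
    """Send the running conversation plus a new user message; return the reply."""
    from google import genai  # deferred import so build_history stays usable offline
    client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])
    response = client.models.generate_content(
        model="gemini-2.5-pro",
        contents=build_history(history + [("user", message)]),
    )
    return response.text
```

The point the demo made is that this loop, plus a thin UI, already gets you a working assistant; everything else is product polish.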
What We Heard
The through-line across talks was practical capability:
- Gemini 2.5 Pro with its Code Canvas feature — native code editing integrated directly into the model's output, not bolted on
- Live API with Native Audio — 24 languages, real-time voice, designed for agentic applications that need to listen and respond in conversation
- Model Context Protocol (MCP) support in Gemini SDKs — the same protocol pattern we use in our own tooling, now a first-class citizen in Google's ecosystem
- Gemma 4 — Google's open models, 26B and 31B variants, for teams that need to run locally or on private infra
- Memorilabs — a persistent memory layer for AI agents that caught the room's attention. The idea: your AI tools remember context across sessions, projects, and tools — not just within a single conversation. For anyone building agents that need to maintain state over time, this is the missing piece. We left the meetup actively thinking about how to integrate it into our own stack.
The MCP discussion was the most relevant for us technically — we've been building around that pattern for months and seeing Google formally standardize on it confirms the direction. But Memorilabs was the one that sparked the most conversation on the way home. Persistent, cross-session AI memory changes what's possible for agent-driven workflows in a fundamental way.
Why We Went as a Team
Going as a team wasn't about optics. It was about coverage. Different people hear different things, ask different questions, and make different connections.

The synthesis conversation on the way back was as valuable as the sessions. When four people who work on adjacent problems all process the same information, you get a richer picture than any one of them would get alone.
What It Means for How We Build
Google is moving fast on Gemini. The gap between frontier model research and developer-accessible tooling is closing quickly. A few things we're taking seriously coming out of this:
- MCP is the integration layer — it's becoming the shared protocol for how AI systems talk to tools and data. Build to it.
- Audio and multimodal are next — the Live API with native audio isn't a gimmick. Real-time voice interaction at this quality level changes what's possible for user-facing AI products.
- Open models matter — Gemma 4 means teams can run capable models on private infra without the cloud dependency. Worth evaluating for use cases where data sensitivity is a concern.
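"Build to MCP" means structuring your integrations around the pattern MCP standardizes: tools registered with names and schemas, with a model-facing layer that lists and dispatches them. The sketch below illustrates that pattern in plain Python for clarity — it is conceptual, not the official MCP SDK, and the tool and class names are invented for this example.

```python
import json
from typing import Callable

class ToolRegistry:
    """Illustrative stand-in for the tool-listing/dispatch pattern MCP standardizes."""

    def __init__(self):
        self._tools: dict[str, Callable] = {}

    def tool(self, name: str):
        """Decorator that registers a function as a callable tool."""
        def wrap(fn: Callable) -> Callable:
            self._tools[name] = fn
            return fn
        return wrap

    def list_tools(self) -> list[str]:
        """What the model sees when it asks which tools are available."""
        return sorted(self._tools)

    def call(self, name: str, args: dict) -> str:
        # Results are serialized back to the model as text, mirroring
        # MCP's content-block responses.
        return json.dumps(self._tools[name](**args))

registry = ToolRegistry()

@registry.tool("get_weather")
def get_weather(city: str) -> dict:
    return {"city": city, "forecast": "sunny"}  # stubbed data for illustration
```

The win of standardizing this shape is that any MCP-aware client can discover and call your tools without bespoke glue code per integration.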
The Gemini Meetup is one of the better developer events running in the Bay Area right now. Practical, technically serious, and free. If you're building with AI, it's worth being in the room.
We'll be back.