Rizel Scarlett led open-source developer relations at Block, where Goose continues to serve as the reference implementation for new protocol features. From MCP UI to MCP Apps, Goose is where standards meet reality.

This article is adapted from an interview with Rizel Scarlett, then Tech Lead for Open Source Developer Relations at Block.


Before MCP had a name, Block had a problem. The company was going through a restructuring, and an internal AI agent called Goose was helping teams move faster. But its extension system required Python expertise and custom integration work for every new tool.

Block reached out to Anthropic about the friction, only to find that Anthropic was already working on a solution: something called the Model Context Protocol, or MCP. Block became a contributor to the spec before it shipped, and Goose became the first MCP client available to the public.

That early involvement set a pattern that continues today. When new MCP features need a reference implementation, Goose is often where they land first. Thus, it’s fitting that Block donated Goose alongside Anthropic’s donation of MCP to the Agentic AI Foundation (AAIF) in November 2025.

What makes an agent an agent

“The way I would describe [an agent] is software that does tasks on your behalf,” Rizel Scarlett told RL Nabors during an interview for Arcade’s MCP MVP series. “As opposed to something like ChatGPT or Claude.ai where you’re having a conversation back and forth—you ask it something, it suggests something, you have to copy and paste. With something like Goose, it can actually do the task. It can edit the code in your IDE. It can even touch your system settings.”

Agents are more than a language model. You could say they're only as good as the harness around the model. The LLM provides reasoning, but the harness provides the loop, the tools, and the ability to act. LLM, loop, and tools harnessed together make an agent, and the interactions among those three determine its utility and quality.

Rizel used a retro console gaming analogy: the LLM is the game cartridge. It has the knowledge, but without the console, the power supply, and the controller, it can’t do anything useful. Goose is that harness. You plug in whichever model you want (Claude, GPT, Gemini, or even local models like Llama or Qwen), and Goose handles the agentic loop: sending user input to the model, receiving a plan, executing tools, and iterating until the task is complete.
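The loop Rizel describes can be sketched in a few lines. This is a conceptual illustration, not Goose's actual internals: the `model` callable, message format, and tool registry here are hypothetical stand-ins for whatever model API and tools the harness wires up.

```python
# Minimal sketch of an agentic loop, in the spirit of a harness like Goose.
# The model callable and tool registry are hypothetical stand-ins.

def run_agent(model, tools, user_input, max_steps=10):
    """Send input to the model, execute any tool calls it requests,
    and feed results back until the model returns a final answer."""
    messages = [{"role": "user", "content": user_input}]
    for _ in range(max_steps):
        reply = model(messages)      # model proposes text or a tool call
        if reply.get("tool") is None:
            return reply["content"]  # no tool requested: task is complete
        # Execute the requested tool and hand the result back to the model.
        result = tools[reply["tool"]](**reply["args"])
        messages.append({"role": "tool", "content": str(result)})
    return "stopped: step limit reached"
```

The `max_steps` guard matters in practice: without it, a model that keeps requesting tools would loop forever.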

“We’re open source,” Rizel said. “You can say, I like Llama, I like Qwen—plug it in. That way you can have a fully locally running agent.”

The Firefox of agents

Block is a co-founder of the Agentic AI Foundation alongside Anthropic and OpenAI. This makes Goose neutral infrastructure, not a product that can be deprecated on a corporate whim.

“One thing we’re prioritizing with Goose is making sure it’s the reference implementation for MCP,” Rizel said. “Any new stuff that comes out in the spec—you should be able to try it out in Goose, whether it’s elicitations, sampling, or MCP Apps.”

This positions Goose as a proving ground for MCP features before they’re finalized. Developers building MCP servers can test against Goose to see how their tools will behave in a compliant client.

From MCP UI to MCP Apps

The journey from MCP UI to MCP Apps shows how the ecosystem evolves. MCP UI was an experimental feature that let agents render interactive interfaces, not just text responses. “Your agent is able to respond with more than just text,” Rizel explained. “Instead of saying ‘I want to buy this shirt’ and Goose just saying ‘Yeah, you can buy a blue shirt,’ it’ll show you an interface that says, here’s what the blue shirt looks like, you can add this to your cart.”

Andrew Harvard, a design engineer on the Goose team, saw the potential and implemented MCP UI support quickly. The community response was enthusiastic enough that MCP UI evolved into MCP Apps, now part of the official spec.

The key difference: MCP Apps can make direct tool calls. With MCP UI, clicking a button could only prompt the interface again. With MCP Apps, clicking a button can invoke a tool. You can display a list of tweets, click “reply,” and have that action call the reply tool without tearing down the interface and falling back to text.
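The contrast can be sketched as two event handlers. This is a conceptual illustration only, not the actual MCP UI or MCP Apps API; the event shape, `reprompt` callback, and tool registry are all hypothetical.

```python
# Conceptual sketch (not the real MCP Apps API) of the difference between
# a UI click that can only re-prompt the model and one that can call a tool.

def handle_click_mcp_ui(event, reprompt):
    # MCP UI: a click can only become another prompt to the model.
    return reprompt(f"user clicked {event['action']}")

def handle_click_mcp_apps(event, tools, reprompt):
    # MCP Apps: a click can invoke a tool directly, keeping the UI alive;
    # anything without a matching tool still falls back to a re-prompt.
    if event["action"] in tools:
        return tools[event["action"]](**event.get("args", {}))
    return reprompt(f"user clicked {event['action']}")
```

The fallback branch is the “subtle difference” in action: the interface stays interactive, and only unhandled actions round-trip through the model.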

“It’s a subtle difference,” Nabors noted during the demo, but it makes a huge difference in bringing app-like interactions to what would otherwise be a chatbot experience.

MCP Apps are now supported in Goose, Claude, ChatGPT, Postman, and MCP Jam. The pattern of hosting UI components at endpoints, whether locally or publicly, opens up possibilities that feel like the next generation of design systems.

Scheduling, lead-worker models, and local flexibility

Goose includes features that even paid tiers of commercial agents don’t always offer. Scheduled tasks let you run agents on a timer. The lead-worker model lets you assign different models to different roles: one for planning, another for execution, and a third for code review.

“I often do that if I want my code to get reviewed,” Rizel said. “I want one to execute with the planning, and then I want the other one to review the code. That way there’s less bias.”

For developers running local models, this flexibility matters. Small models like Qwen can struggle with tool calling but handle execution well. Larger models reason better but run slower. Splitting responsibilities across models lets you optimize for both speed and quality.
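The split Rizel describes can be sketched as a simple pipeline. In Goose this is handled by configuration rather than user code; the three model callables below are hypothetical stand-ins, and the prompts are illustrative.

```python
# Hedged sketch of a lead-worker split: one model plans, another executes,
# a third reviews. The model callables here are hypothetical stand-ins.

def lead_worker(lead, worker, reviewer, task):
    plan = lead(f"Plan the steps for: {task}")         # stronger model reasons
    draft = worker(f"Carry out this plan: {plan}")     # faster model executes
    review = reviewer(f"Review this result: {draft}")  # third model checks, reducing bias
    return draft, review
```

Assigning review to a model that didn't write the draft is what gives the “less bias” effect Rizel mentions: the reviewer has no stake in the plan it is checking.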

Rizel’s personal setup uses Claude Opus 4.5 as the primary model with Gemini or GPT-5.2 as alternatives. For local experimentation, she’s seen developers get good results with Qwen 2.5 30B, though tool calling remains a weak point for most local models.

Building your own Claude Code

“If you wanted to build your own Claude Code from scratch, you could start with Goose,” Rizel said. “It’s open source, it’s part of the Linux Foundation. It wouldn’t be too tough to grab Goose, open it up, customize its internals, make it yours.”

This is the pitch for Goose as infrastructure: not a product you use as-is, but a foundation you build on, the way Electron builds on Chromium for app development. And because Goose runs locally, you can integrate it into workflows that commercial agents can’t touch: internal tools, air-gapped systems, custom security requirements.

Getting involved

Goose development happens in the open. The team is active on Discord, where users ask questions and propose features. The GitHub Discussions include the 2026 roadmap, and issues are available for contributors to pick up.

“The best part is you can use Goose to help you solve a lot of the issues,” Rizel said. “We’re open to using AI as long as you’re reviewing what was created and testing.”

Find Goose on YouTube for live streams and tutorials. Follow Rizel on social media as @blackgirlbytes (that’s “bytes” with a Y).


MCP MVP is a video series from Arcade with RL Nabors that spotlights the builders shaping the agentic ecosystem. Watch the full interview with Rizel Scarlett →

Want to give your agents authenticated access to APIs without managing tokens yourself? See how Arcade handles OAuth for MCP tools.