
Figma for Agents

Design with AI agents, connected to your design system

AI-generated designs break brand standards because agents can't see your design system. Figma's use_figma MCP tool changes that. For product teams bridging design and code with AI agents.

Top comment

Today Figma opened the canvas to agents.

What is it:

Figma's use_figma MCP tool lets AI agents create and edit designs directly in Figma, working with your actual components, variables, and auto layout, not against them.

The problem:

Every AI-generated design has the same tell: it doesn't look like your product.

Components are invented. Spacing is arbitrary.

The output is technically a UI, but it's nobody's design system.

So designers throw it out and start over.

The solution:

Skills are markdown files that encode your team's design conventions.

Agents read them before touching the canvas.

Combined with use_figma, agents now have both access and context: they know how to work in Figma, and they know how to work in your Figma.
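As an illustration only (the actual Skill format isn't shown here, and these conventions are invented), a skill file might read something like:

```markdown
# Design conventions: Acme web app

## Components
- Always instantiate components from the "Acme UI" library;
  never draw raw rectangles for buttons or inputs.

## Spacing
- Use the spacing variables (space/xs through space/xl);
  never hard-code pixel values.

## Color
- Only use color variables from the "Brand" collection.
```

Because it's plain markdown, the same conventions stay readable to both humans and agents, and can live in version control next to the code.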

What you can do with it:

  • 🏗️ Generate component libraries from a codebase

  • 🔗 Sync design tokens between code and Figma variables, with drift detection

  • ♿ Auto-generate screen reader specs from UI designs

  • 🔄 Run parallel workflows across multiple agents
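To make the drift-detection idea concrete, here is a minimal sketch in Python. It assumes tokens from both sides have already been exported to flat name-to-value dictionaries; the function name and sample data are illustrative, not Figma's actual API.

```python
# Hypothetical drift check between code-side design tokens and
# tokens exported from Figma variables. All names/values are invented.

def find_drift(code_tokens: dict, figma_tokens: dict) -> dict:
    """Report tokens missing on either side, plus value mismatches."""
    shared = set(code_tokens) & set(figma_tokens)
    return {
        "missing_in_figma": sorted(set(code_tokens) - set(figma_tokens)),
        "missing_in_code": sorted(set(figma_tokens) - set(code_tokens)),
        "value_mismatch": sorted(
            name for name in shared if code_tokens[name] != figma_tokens[name]
        ),
    }

code_tokens = {"color/brand": "#6046FF", "space/sm": "8px"}
figma_tokens = {"color/brand": "#5A3FE0", "space/sm": "8px", "space/lg": "24px"}

print(find_drift(code_tokens, figma_tokens))
```

An agent running a check like this on every sync can flag `color/brand` as drifted and `space/lg` as missing from code before either side ships a stale value.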

Who it's for:

Product and design-engineering teams that use Figma as the shared source of truth and want their AI agent workflows to stay connected to it. Heavy users of Claude Code, Codex, Cursor, and Copilot will feel this immediately.

Question for makers: Skills make agent behaviour more deterministic by encoding specific conventions.

How are you thinking about versioning skills as your design system evolves?