Allows AI assistants like Cursor, Claude Desktop, Windsurf, or ChatGPT to query your Jira instance and perform actions in it.

Example prompts

Show tickets assigned to me with links and a summary of progress.

What's the link and status of that ticket with the new button feature?

Add a comment to PROJ-456 explaining the fix in my current branch.

Add this spreadsheet of user stories as story issues to PROJ. Make the stories medium priority with a due date of 2 weeks from now.

Get started

Quickstart

Install and configure Jira MCP in minutes.

Configure your AI tool

Set up Jira MCP with your preferred AI assistant.
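
Most MCP clients are configured with a small entry that tells them how to launch or reach the server. The sketch below shows the general shape of such an entry (for example, the mcpServers section of Claude Desktop's claude_desktop_config.json), written as a TypeScript object for illustration; the server name, package, and environment variables are placeholder assumptions, not the actual Jira MCP settings.

```typescript
// Shape of a typical MCP client configuration entry. The package name and
// environment variables are placeholders, not real values.
const mcpServers = {
  "jira-mcp": {
    command: "npx",                       // how the client launches the server
    args: ["-y", "jira-mcp-server"],      // hypothetical package name
    env: {
      JIRA_BASE_URL: "https://your-company.atlassian.net",
      JIRA_API_TOKEN: "<your-api-token>",
    },
  },
};
```

See the Quickstart for the exact server name, command, and credentials to use with your AI tool.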

How it works

1. Prompt your AI assistant

You type a question or command into an LLM client such as Claude Desktop, Cursor, Windsurf, or ChatGPT.
2. LLM analyzes available tools

The LLM analyzes the available MCP tools and decides which one(s) to use. Each tool is published with a human-readable name and description, so the LLM knows what it is meant for (a client-side sketch of steps 2-4 follows these steps).
3. MCP server executes commands

The client executes the chosen tool(s) through the MCP server. The MCP server runs locally on your machine or remotely via an endpoint.
4. Results returned to LLM

The results are sent back to the LLM, which formulates a natural-language response, presenting the returned data or confirming the actions performed through the MCP server.
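
As a minimal client-side sketch of steps 2-4 using the MCP TypeScript SDK (@modelcontextprotocol/sdk); the server command and tool name are illustrative assumptions, not the actual Jira MCP interface.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Connect to a locally running MCP server over stdio.
// The command and tool name below are placeholders.
const client = new Client({ name: "example-host", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({ command: "npx", args: ["-y", "jira-mcp-server"] }),
);

// Step 2: fetch the tool list; each tool carries a human-readable description
// the LLM uses to decide which tool fits the prompt.
const { tools } = await client.listTools();
console.log(tools.map((t) => `${t.name}: ${t.description ?? ""}`));

// Step 3: execute the chosen tool through the MCP server.
const result = await client.callTool({
  name: "add_comment", // hypothetical tool name
  arguments: { issueKey: "PROJ-456", body: "Fix shipped in the current branch." },
});

// Step 4: the result content goes back to the LLM, which phrases the reply.
console.log(result.content);
```

In practice the host (Cursor, Claude Desktop, and so on) performs these calls for you; the sketch only illustrates what happens under the hood.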

Architecture

MCP follows a client-server architecture where an MCP host (an AI application like Cursor or Claude Desktop) establishes connections to one or more MCP servers. The MCP host accomplishes this by creating one MCP client for each MCP server. Each MCP client maintains a dedicated connection with its corresponding MCP server.
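
A host that talks to several servers therefore holds several clients, one per server. A minimal sketch of that one-client-per-server pattern, again with placeholder server commands:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// One dedicated client (and connection) per MCP server; commands are placeholders.
const servers = {
  jira: { command: "npx", args: ["-y", "jira-mcp-server"] },
  filesystem: { command: "npx", args: ["-y", "fs-mcp-server"] },
};

const clients: Record<string, Client> = {};
for (const [name, spec] of Object.entries(servers)) {
  const client = new Client({ name: `host-${name}-client`, version: "1.0.0" });
  await client.connect(new StdioClientTransport(spec));
  clients[name] = client; // each client maintains its own connection to its server
}
```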

Learn more about MCP architecture

Read the official MCP documentation for details on architecture.