

How To Rev Up Your Rails Development with MCP

By Jack Rosa

Shipping new features on legacy Rails applications requires deep codebase context. The rails-mcp-server gem closes the gap between AI agents and your Rails projects, enabling more relevant code analysis and context-aware refactoring suggestions. Whether you're dealing with tech debt in a brownfield application or building new greenfield features, this tool can help you move faster with confidence.

The Model Context Protocol (MCP) is a standard that lets LLMs interact with development environments and external tools. The rails-mcp-server gem is a Ruby implementation that enables LLMs to interact directly with Rails projects through MCP. Once you have it set up with an agent like Claude Code or Copilot, the model has far more context about your app's architecture, which removes much of the nonsense and guesswork associated with AI-driven development. Check out the repo here

I'll walk you through setting up the rails-mcp-server gem for your Rails projects.

Installation

Installing rails-mcp-server is as simple as installing any other Ruby gem, with one caveat: don't install it inside your project's directory or add it to a Rails Gemfile. This gem is meant to be installed globally and configured to work with multiple projects. Open your terminal and run:

gem install rails-mcp-server

Config

Setting Up Your Projects

Once you run the server for the first time, you can configure the gem to access your Rails projects. The configuration location depends on your operating system:

  • macOS: $XDG_CONFIG_HOME/rails-mcp or ~/.config/rails-mcp if XDG_CONFIG_HOME is not set
  • Windows: %APPDATA%\rails-mcp

The first time the server runs, these directories will be created.

Running the Rails MCP Server

The server can be run in two modes, but for the purposes of this article we'll stick to HTTP mode; if you want to learn about STDIO mode, check out the docs here. Running the server for the first time will also create the config directory.

HTTP Mode

HTTP mode runs as an HTTP server with JSON-RPC and Server-Sent Events (SSE) endpoints, perfect for web applications. Let's start it up.

# Start on the default port (6029)
rails-mcp-server --mode http

# Starting on a custom port
rails-mcp-server --mode http -p 8080

When running in HTTP mode, the server can be accessed at these endpoints:

  • JSON-RPC endpoint: http://localhost:<port>/mcp/messages
  • SSE endpoint: http://localhost:<port>/mcp/sse
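To get a feel for what travels over that JSON-RPC endpoint, here's a minimal sketch in Ruby of the "initialize" message an MCP client sends first. The exact params follow the general MCP spec, not anything specific to rails-mcp-server, and the protocol version string is an assumption; check the gem's docs for what it expects.

```ruby
require "json"

# A minimal JSON-RPC 2.0 "initialize" request, the first message an MCP
# client sends to a server. Field names follow the MCP spec; the
# protocolVersion value below is an assumption for illustration.
payload = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05", # assumption: a commonly used MCP version
    capabilities: {},
    clientInfo: { name: "demo-client", version: "0.1.0" }
  }
}

puts JSON.generate(payload)

# To actually send it, you could POST the JSON above to the endpoint:
#   curl -X POST http://localhost:6029/mcp/messages \
#     -H "Content-Type: application/json" \
#     -d '{"jsonrpc":"2.0", ...}'
```

In practice your agent handles this handshake for you; this is just what's happening under the hood when Claude Code or Copilot connects.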

Configuring Your Rails Projects

The server will also create a projects.yml file in your config directory when you run it; to include your Rails projects, just provide a project name and a path to the directory:

# ~/.config/rails-mcp/projects.yml
test_app: "~/projects/test_app"
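As a quick sanity check (this is plain Ruby, not part of the server), you can confirm where a tilde path in projects.yml actually points by parsing the YAML and expanding it the way Ruby does:

```ruby
require "yaml"

# Parse a projects.yml-style mapping and expand the "~" shorthand
# so you can verify each entry points where you expect.
config = YAML.safe_load(<<~YML)
  test_app: "~/projects/test_app"
YML

config.each do |name, path|
  puts "#{name} => #{File.expand_path(path)}"
end
```

If an entry prints a path that doesn't exist, the server won't be able to read that project.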

Integrating with Claude Code

Add the following to your claude/config.json:

{
  "mcpServers": {
    "railsMcpServer": {
      "command": "ruby",
      "args": ["/full/path/to/rails-mcp-server/exe/rails-mcp-server"]
    }
  }
}
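Note that the config above launches the server over STDIO via the ruby command. If you'd rather point Claude Code at the HTTP server you started earlier, recent Claude Code releases also accept an HTTP transport entry; the shape below (type and url keys) is based on current Claude Code docs, so double-check it against your version, and confirm which endpoint rails-mcp-server expects for this transport:

```json
{
  "mcpServers": {
    "railsMcpServer": {
      "type": "http",
      "url": "http://localhost:6029/mcp/messages"
    }
  }
}
```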

To find the full path to your rails-mcp-server executable:

which rails-mcp-server

Restart your Claude Code session to pick up the new config.

With a new Claude Code session started, use the /mcp command to confirm the session has access to rails-mcp-server.

Integrating with Copilot

Add the following to your .vscode/mcp.json:

{
  "servers": {
    "railsMcpServer": {
      "command": "ruby",
      "args": ["/full/path/to/rails-mcp-server/exe/rails-mcp-server"]
    }
  }
}

To find the full path to your rails-mcp-server executable:

which rails-mcp-server

Using The Rails MCP Server

Once configured, your AI assistant can interact with your Rails projects using the provided tools; check out the docs to see the full list.

Analyzing Your Code

A helpful thing to note is that you don't need to use the tool names verbatim in LLM chats. You can simply reference them in plain English.

load the Turbo guides and then show me how to refactor my blog feed with turbo streams

Before you can use the MCP tools on your project, you'll have to tell the MCP server to switch to it, and the project name must match one listed in projects.yml.

Here are some of the tools you can use in your LLM chats, and how you might use them in a prompt:

project_info

=> break down each of the 3rd party integrations used in the project

get_routes

=> which routes are being used for the messaging endpoints?

analyze_models

=> how is the user model associated with the blogpost model?

Notice how the earlier prompt says "load the Turbo guides"; the LLM is smart enough to know it should use the load_guide tool to respond.

Provide Even More Context

While the MCP server gives your AI assistants access to your code structure, providing more context about what you're working on helps generate better suggestions. Be sure to mention:

  • The specific files you want to work with
  • A clear description of the feature or bug you're addressing
  • Any architectural or logistical constraints (obviously)
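Putting those three together, a well-contextualized prompt might look something like this (the file name and scenario here are purely illustrative):

switch to test_app, then look at app/models/order.rb — orders can be charged twice when a payment is retried, and we can't change the payment provider's API; find the bug and suggest a fix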

Conclusion

The rails-mcp-server gem fills in the context gap between AI and your Rails development workflow.

Whether you're working on a single application or switching between multiple Rails projects, the MCP server provides a way for AI, and hopefully you, to understand your codebase better.

Extras

Reach out to Hashrocket if you need help modernizing your Rails Project! 🚀

