rails-ai-bridge

Stop letting AI guess your Rails app.

AI coding assistants work better when they know your app's actual structure. This gem gives them a map — so they write code that fits your conventions, not generic Rails patterns.

Zero config · Read-only · Open source

The problem

AI without context makes expensive guesses.

Without a map

  • Guesses table names that do not match your conventions
  • Misses custom associations, scopes, and callbacks
  • Wastes tokens rediscovering structure you already have
  • Generates generic Rails code that does not fit your patterns

Result: slower responses, more corrections, higher token bills.

With rails-ai-bridge

  • Knows your exact schema, routes, and model relationships
  • Understands your team conventions and architecture
  • Drills into specifics on demand via live MCP tools
  • Writes code that actually fits your codebase

Result: faster first useful response, fewer corrections.

The mental model

Two layers. One purpose.

Think of it as giving your AI both a guidebook and a librarian. The guidebook sets orientation. The librarian answers follow-up questions.

LAYER 1

Static context files

Compact, assistant-specific files generated once and committed to your repo. They give the AI passive orientation every time a session starts.

  • CLAUDE.md for Claude Code
  • AGENTS.md for OpenAI Codex
  • .cursorrules for Cursor
  • .github/copilot-instructions.md for Copilot

≤ 150 lines each. Compact by default.
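The exact contents depend on your app. As a rough illustration only (a hypothetical excerpt, not the gem's literal output), a generated CLAUDE.md might orient the assistant like this:

```markdown
# App context (generated by rails-ai-bridge)

- Rails 7.1, PostgreSQL, Hotwire (Turbo + Stimulus)
- Core models: User, Order, OrderItem, Product, Category
- Conventions: service objects in app/services; thin controllers
- For exact details, call the live MCP tools
  (rails_get_schema, rails_get_model_details, rails_get_routes)
```

The point is orientation, not completeness: broad facts live in the file, specifics come from the tools below.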

LAYER 2

Live MCP tools

A read-only server that answers exact questions on demand. The AI calls these only when it needs a specific detail.

  • rails_get_schema: tables, columns, indexes
  • rails_get_model_details: associations, validations, scopes
  • rails_get_routes: HTTP verbs, paths, actions
  • rails_search_code: ripgrep across your codebase

11 built-in tools. All read-only.
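Under the hood these are standard MCP tool calls. Per the Model Context Protocol specification, a client invokes a tool with a JSON-RPC 2.0 `tools/call` request; the tool name comes from the list above, and the arguments here are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "rails_get_model_details",
    "arguments": { "model": "Order" }
  }
}
```

Your AI assistant sends these frames for you; you never write them by hand.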

How it works

Introspect. Generate. Serve.

Introspect (scan schema, models, routes, controllers, gems) → Generate (write assistant-specific context files) → Serve (run MCP server with read-only tools)

Introspect

Up to 27 built-in scanners read your app: schema, models, routes, controllers, gems, tests, conventions, views, Turbo, Stimulus, auth, API, DevOps, and more.

Generate

rails ai:bridge writes compact files tailored to each AI tool's format and size limits. Split rules keep per-file guidance focused.

Serve

rails ai:serve starts an MCP server that Claude Code and Cursor auto-detect. It supports both stdio and HTTP transports.
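If your client does not auto-detect the server, you can register it manually. For Claude Code, a project-level .mcp.json entry would look roughly like this (the server name is arbitrary; the command is the `rails ai:serve` task from above):

```json
{
  "mcpServers": {
    "rails-ai-bridge": {
      "command": "bin/rails",
      "args": ["ai:serve"]
    }
  }
}
```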

Concrete example

What your AI actually sees.

Here is how a tool call works in practice. Start broad, then drill in.

REQUEST rails_get_schema
{"detail": "summary"}
RESPONSE
users ............... 12 columns
orders .............. 8 columns
order_items ......... 6 columns
products ............ 14 columns
categories .......... 4 columns

The AI gets oriented fast without drowning in detail.

REQUEST rails_get_model_details
{"model": "Order"}
RESPONSE
belongs_to :user
has_many :order_items, dependent: :destroy

validates :total, numericality: { greater_than_or_equal_to: 0 }
scope :pending, -> { where(status: 'pending') }
enum status: { pending: 0, paid: 1, shipped: 2 }

Exact details, only when needed. No hallucinated columns.

Semantic model tiering

Models are classified by importance so the AI knows where to focus first:

  • core_entity: User, Order
  • rich_join: OrderItem
  • supporting: AuditLog
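The gem's actual classification heuristic is not documented on this page. As a minimal sketch of the idea, a model could be tiered by how rich its associations and attributes are; the tier names mirror the labels above, but the thresholds and method name are assumptions for illustration:

```ruby
# Sketch of semantic tiering: classify a model from simple counts.
# Inputs are plain numbers so the example runs without Rails loaded.
def tier_model(belongs_to_count:, has_many_count:, column_count:)
  if belongs_to_count >= 2 && column_count <= 8
    :rich_join        # a join table that carries its own data (e.g. OrderItem)
  elsif has_many_count >= 2 || column_count >= 10
    :core_entity      # a central domain object (e.g. User, Order)
  else
    :supporting       # a peripheral record (e.g. AuditLog)
  end
end

puts tier_model(belongs_to_count: 2, has_many_count: 0, column_count: 6)  # rich_join
puts tier_model(belongs_to_count: 1, has_many_count: 3, column_count: 12) # core_entity
puts tier_model(belongs_to_count: 1, has_many_count: 0, column_count: 5)  # supporting
```

Whatever the real heuristic is, the payoff is the same: the AI reads core entities first and skips the long tail until asked.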

Safety

Read-only by design.

Read-only tools

Every MCP tool inspects structure. None write files or mutate the database.

Secrets filtered

Credentials, .env files, master.key, and private config paths are excluded from generated context.

Production-safe

HTTP MCP requires explicit opt-in and an auth token in production. stdio is local-only.

Opt-in extras

database_stats is opt-in because it queries PostgreSQL table statistics.
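The configuration API for enabling it is not shown on this page; hypothetically, an initializer might look like the following (`RailsAiBridge.configure` and `extra_tools` are assumed names, not confirmed by the gem's docs):

```ruby
# config/initializers/rails_ai_bridge.rb
# NOTE: `configure` and `extra_tools` are hypothetical names for illustration.
RailsAiBridge.configure do |config|
  config.extra_tools = [:database_stats] # opt in to PostgreSQL table statistics
end
```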

Next steps

Try it in five minutes.

Add the gem, run the installer, generate your bridge files, and start the MCP server.

$ bundle add rails-ai-bridge
$ rails generate rails_ai_bridge:install
$ rails ai:bridge
$ rails ai:serve

Commit the generated files so your whole team benefits from the same AI context.