graphiti/.env.railway
Tyler Lafleur b56d469648 Fix Railway deployment: Remove cache mounts, add port config, create deployment guides
- Fix Docker cache mount issues that caused Railway build failures
- Add port argument support to MCP server for Railway compatibility
- Create Railway-optimized Dockerfile without cache mounts
- Add railway.json configuration for proper deployment
- Create comprehensive deployment and ChatGPT integration guides
- Add environment variable templates for Railway deployment
- Support Railway's PORT environment variable handling
- Ready for ChatGPT MCP SSE integration

🚀 Generated with Claude Code (https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-19 19:44:48 -05:00


# Railway Environment Variables Template
# Copy these to your Railway project environment variables
# Required: OpenAI API Configuration
OPENAI_API_KEY=sk-proj-your-openai-api-key-here
MODEL_NAME=gpt-4.1-mini
SMALL_MODEL_NAME=gpt-4.1-nano

# Neo4j Database Configuration
# Option 1: Neo4j Aura Cloud (Recommended for production)
NEO4J_URI=neo4j+s://your-instance.databases.neo4j.io
NEO4J_USER=neo4j
NEO4J_PASSWORD=your-aura-password

# Option 2: Local Neo4j (Development only)
# NEO4J_URI=bolt://localhost:7687
# NEO4J_USER=neo4j
# NEO4J_PASSWORD=password

# Optional Configuration
LLM_TEMPERATURE=0.0
SEMAPHORE_LIMIT=10
GRAPHITI_TELEMETRY_ENABLED=false

# Railway injects PORT (and HOST) at runtime; the application reads them at startup
# PORT=8000
# MCP_SERVER_HOST=0.0.0.0
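
The comments above note that Railway supplies PORT at runtime. A minimal sketch of how an application might read these values, assuming the MCP_SERVER_HOST name from this template and the commented-out 8000 as a local fallback:

```python
import os

# Railway injects PORT at runtime; fall back to 8000 for local development.
port = int(os.getenv("PORT", "8000"))
# MCP_SERVER_HOST comes from this template; default to all interfaces.
host = os.getenv("MCP_SERVER_HOST", "0.0.0.0")
print(f"MCP server will bind to {host}:{port}")
```

Because both values have sensible defaults, the same code runs unchanged locally and on Railway.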