Commit
Β·
0c00a27
1
Parent(s):
982515c
feat: Implement unified launcher and update unified app for Gradio UI and MCP SSE server integration
Browse files- README.md +15 -8
- launcher.py +65 -0
- unified_app.py +210 -0
README.md
CHANGED
|
@@ -6,7 +6,7 @@ colorTo: purple
|
|
| 6 |
sdk: docker
|
| 7 |
app_file: app.py
|
| 8 |
pinned: false
|
| 9 |
-
short_description: MCP server with 29 AI tools
|
| 10 |
tags:
|
| 11 |
- mcp
|
| 12 |
- building-mcp-track-enterprise
|
|
@@ -14,6 +14,7 @@ tags:
|
|
| 14 |
- delivery-management
|
| 15 |
- postgresql
|
| 16 |
- fastmcp
|
|
|
|
| 17 |
- enterprise
|
| 18 |
- logistics
|
| 19 |
- gemini
|
|
@@ -25,7 +26,7 @@ tags:
|
|
| 25 |
|
| 26 |
**π MCP 1st Birthday Hackathon - Track 1: Building MCP Servers (Enterprise Category)**
|
| 27 |
|
| 28 |
-
Industry-standard Model Context Protocol server for AI-powered delivery dispatch management. Exposes **29 AI tools** (including Gemini 2.0 Flash intelligent assignment) and 2 real-time resources for managing delivery operations through any MCP-compatible client.
|
| 29 |
|
| 30 |
[](https://github.com/jlowin/fastmcp)
|
| 31 |
[](https://www.python.org/)
|
|
@@ -36,8 +37,10 @@ Industry-standard Model Context Protocol server for AI-powered delivery dispatch
|
|
| 36 |
## π Links
|
| 37 |
|
| 38 |
- **GitHub Repository:** https://github.com/mashrur-rahman-fahim/fleetmind-mcp
|
| 39 |
-
- **HuggingFace Space:** https://huggingface.co/spaces/MCP-1st-Birthday/fleetmind-dispatch-ai
|
| 40 |
-
- **
|
|
|
|
|
|
|
| 41 |
|
| 42 |
## πΊ Demo & Submission
|
| 43 |
|
|
@@ -61,15 +64,19 @@ This project is submitted as part of the **MCP 1st Birthday Hackathon** (Track 1
|
|
| 61 |
|
| 62 |
## π― What is FleetMind MCP?
|
| 63 |
|
| 64 |
-
FleetMind is a production-ready **Model Context Protocol (MCP) server** that transforms delivery dispatch management into AI-accessible tools.
|
|
|
|
|
|
|
|
|
|
|
|
|
| 65 |
|
| 66 |
### Key Features
|
| 67 |
|
| 68 |
β
**29 AI Tools** - Order, Driver & Assignment Management (including Gemini 2.0 Flash AI)
|
| 69 |
β
**2 Real-Time Resources** - Live data feeds (orders://all, drivers://all)
|
| 70 |
-
β
**Google Maps Integration** - Geocoding & Route Calculation
|
| 71 |
β
**PostgreSQL Database** - Production-grade data storage (Neon)
|
| 72 |
-
β
**SSE Endpoint** - Server-Sent Events
|
| 73 |
β
**Multi-Client Support** - Works with any MCP-compatible client
|
| 74 |
|
| 75 |
### β Unique Features
|
|
@@ -218,7 +225,7 @@ FleetMind isn't just an MCP serverβit's a **blueprint for enterprise AI integr
|
|
| 218 |
import mcp
|
| 219 |
|
| 220 |
client = mcp.Client(
|
| 221 |
-
url="https://
|
| 222 |
)
|
| 223 |
|
| 224 |
# Use any of the 29 tools
|
|
|
|
| 6 |
sdk: docker
|
| 7 |
app_file: app.py
|
| 8 |
pinned: false
|
| 9 |
+
short_description: Enterprise MCP server with 29 AI tools and Gemini 2.0 Flash intelligent assignment
|
| 10 |
tags:
|
| 11 |
- mcp
|
| 12 |
- building-mcp-track-enterprise
|
|
|
|
| 14 |
- delivery-management
|
| 15 |
- postgresql
|
| 16 |
- fastmcp
|
| 17 |
+
- gradio
|
| 18 |
- enterprise
|
| 19 |
- logistics
|
| 20 |
- gemini
|
|
|
|
| 26 |
|
| 27 |
**π MCP 1st Birthday Hackathon - Track 1: Building MCP Servers (Enterprise Category)**
|
| 28 |
|
| 29 |
+
Industry-standard Model Context Protocol server for AI-powered delivery dispatch management. Exposes **29 AI tools** (including Gemini 2.0 Flash intelligent assignment) and 2 real-time resources for managing delivery operations through any MCP-compatible client. Features a professional web landing page with connection instructions.
|
| 30 |
|
| 31 |
[](https://github.com/jlowin/fastmcp)
|
| 32 |
[](https://www.python.org/)
|
|
|
|
| 37 |
## π Links
|
| 38 |
|
| 39 |
- **GitHub Repository:** https://github.com/mashrur-rahman-fahim/fleetmind-mcp
|
| 40 |
+
- **HuggingFace Space (Landing Page):** https://huggingface.co/spaces/MCP-1st-Birthday/fleetmind-dispatch-ai
|
| 41 |
+
- **Live Space URL:** https://mcp-1st-birthday-fleetmind-dispatch-ai.hf.space
|
| 42 |
+
- **MCP SSE Endpoint:** https://mcp-1st-birthday-fleetmind-dispatch-ai.hf.space/sse
|
| 43 |
+
- **Note:** Visit the Space URL above to see connection instructions and tool documentation
|
| 44 |
|
| 45 |
## πΊ Demo & Submission
|
| 46 |
|
|
|
|
| 64 |
|
| 65 |
## π― What is FleetMind MCP?
|
| 66 |
|
| 67 |
+
FleetMind is a production-ready **Model Context Protocol (MCP) server** that transforms delivery dispatch management into AI-accessible tools.
|
| 68 |
+
|
| 69 |
+
**Access Methods:**
|
| 70 |
+
- **Web Landing Page**: Professional HTML page with connection instructions and tool documentation
|
| 71 |
+
- **MCP SSE Endpoint**: Direct API access for Claude Desktop, Continue, Cline, or any MCP client
|
| 72 |
|
| 73 |
### Key Features
|
| 74 |
|
| 75 |
β
**29 AI Tools** - Order, Driver & Assignment Management (including Gemini 2.0 Flash AI)
|
| 76 |
β
**2 Real-Time Resources** - Live data feeds (orders://all, drivers://all)
|
| 77 |
+
β
**Google Maps Integration** - Geocoding & Route Calculation with traffic data
|
| 78 |
β
**PostgreSQL Database** - Production-grade data storage (Neon)
|
| 79 |
+
β
**SSE Endpoint** - Standard MCP protocol via Server-Sent Events
|
| 80 |
β
**Multi-Client Support** - Works with any MCP-compatible client
|
| 81 |
|
| 82 |
### β Unique Features
|
|
|
|
| 225 |
import mcp
|
| 226 |
|
| 227 |
client = mcp.Client(
|
| 228 |
+
url="https://mcp-1st-birthday-fleetmind-dispatch-ai.hf.space/sse"
|
| 229 |
)
|
| 230 |
|
| 231 |
# Use any of the 29 tools
|
launcher.py
ADDED
|
@@ -0,0 +1,65 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
"""
|
| 2 |
+
FleetMind Unified Launcher
|
| 3 |
+
Runs both Gradio UI and MCP SSE server together
|
| 4 |
+
"""
|
| 5 |
+
|
| 6 |
+
import subprocess
|
| 7 |
+
import sys
|
| 8 |
+
import time
|
| 9 |
+
import signal
|
| 10 |
+
import os
|
| 11 |
+
|
| 12 |
+
# Store processes for cleanup
|
| 13 |
+
processes = []
|
| 14 |
+
|
| 15 |
+
def signal_handler(sig, frame):
|
| 16 |
+
"""Handle shutdown gracefully"""
|
| 17 |
+
print("\n\nπ Shutting down FleetMind...")
|
| 18 |
+
for proc in processes:
|
| 19 |
+
proc.terminate()
|
| 20 |
+
sys.exit(0)
|
| 21 |
+
|
| 22 |
+
# Register signal handler
|
| 23 |
+
signal.signal(signal.SIGINT, signal_handler)
|
| 24 |
+
signal.signal(signal.SIGTERM, signal_handler)
|
| 25 |
+
|
| 26 |
+
print("=" * 70)
|
| 27 |
+
print("FleetMind - Unified Launcher (Gradio UI + MCP SSE Server)")
|
| 28 |
+
print("=" * 70)
|
| 29 |
+
|
| 30 |
+
# Start MCP SSE server (app.py) in background
|
| 31 |
+
print("\n[1/2] Starting MCP SSE server...")
|
| 32 |
+
mcp_process = subprocess.Popen(
|
| 33 |
+
[sys.executable, "app.py"],
|
| 34 |
+
stdout=subprocess.PIPE,
|
| 35 |
+
stderr=subprocess.STDOUT,
|
| 36 |
+
text=True,
|
| 37 |
+
bufsize=1
|
| 38 |
+
)
|
| 39 |
+
processes.append(mcp_process)
|
| 40 |
+
print("β
MCP SSE server started (background)")
|
| 41 |
+
|
| 42 |
+
# Give MCP server time to initialize
|
| 43 |
+
time.sleep(2)
|
| 44 |
+
|
| 45 |
+
# Start Gradio UI (ui/app.py) in foreground
|
| 46 |
+
print("\n[2/2] Starting Gradio UI...")
|
| 47 |
+
print("=" * 70)
|
| 48 |
+
ui_process = subprocess.Popen(
|
| 49 |
+
[sys.executable, "ui/app.py"],
|
| 50 |
+
stdout=sys.stdout,
|
| 51 |
+
stderr=sys.stderr
|
| 52 |
+
)
|
| 53 |
+
processes.append(ui_process)
|
| 54 |
+
|
| 55 |
+
print("\nβ
Both services running!")
|
| 56 |
+
print("=" * 70)
|
| 57 |
+
print("π Gradio UI: http://0.0.0.0:7860")
|
| 58 |
+
print("π MCP SSE: http://0.0.0.0:7860/sse")
|
| 59 |
+
print("=" * 70)
|
| 60 |
+
|
| 61 |
+
# Wait for UI process (it blocks)
|
| 62 |
+
try:
|
| 63 |
+
ui_process.wait()
|
| 64 |
+
except KeyboardInterrupt:
|
| 65 |
+
signal_handler(None, None)
|
unified_app.py
ADDED
|
@@ -0,0 +1,210 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
"""
|
| 2 |
+
FleetMind Unified App
|
| 3 |
+
Serves both Gradio UI and MCP SSE endpoint on the same port
|
| 4 |
+
Simple UI showing MCP connection information and server status
|
| 5 |
+
"""
|
| 6 |
+
|
| 7 |
+
import sys
|
| 8 |
+
from pathlib import Path
|
| 9 |
+
sys.path.insert(0, str(Path(__file__).parent))
|
| 10 |
+
|
| 11 |
+
from fastapi import FastAPI
|
| 12 |
+
import uvicorn
|
| 13 |
+
import gradio as gr
|
| 14 |
+
import os
|
| 15 |
+
import json
|
| 16 |
+
|
| 17 |
+
print("=" * 70)
|
| 18 |
+
print("FleetMind - Unified Server (Gradio UI + MCP SSE)")
|
| 19 |
+
print("=" * 70)
|
| 20 |
+
|
| 21 |
+
# Configuration
|
| 22 |
+
MCP_SSE_ENDPOINT = "https://mcp-1st-birthday-fleetmind-dispatch-ai.hf.space/sse"
|
| 23 |
+
|
| 24 |
+
# Import MCP server
|
| 25 |
+
print("\n[1/2] Loading MCP server...")
|
| 26 |
+
try:
|
| 27 |
+
from server import mcp
|
| 28 |
+
print("[OK] MCP server loaded (29 tools, 2 resources)")
|
| 29 |
+
mcp_available = True
|
| 30 |
+
except Exception as e:
|
| 31 |
+
print(f"[WARNING] MCP server failed to load: {e}")
|
| 32 |
+
mcp_available = False
|
| 33 |
+
|
| 34 |
+
def get_claude_config():
|
| 35 |
+
"""Generate Claude Desktop configuration"""
|
| 36 |
+
config = {
|
| 37 |
+
"mcpServers": {
|
| 38 |
+
"fleetmind": {
|
| 39 |
+
"command": "npx",
|
| 40 |
+
"args": ["mcp-remote", MCP_SSE_ENDPOINT]
|
| 41 |
+
}
|
| 42 |
+
}
|
| 43 |
+
}
|
| 44 |
+
return json.dumps(config, indent=2)
|
| 45 |
+
|
| 46 |
+
def get_tools_list():
|
| 47 |
+
"""Get all 29 MCP tools"""
|
| 48 |
+
return [
|
| 49 |
+
["geocode_address", "Geocoding & Routing", "Convert address to GPS coordinates"],
|
| 50 |
+
["calculate_route", "Geocoding & Routing", "Calculate route with vehicle optimization"],
|
| 51 |
+
["calculate_intelligent_route", "Geocoding & Routing", "Weather + traffic aware routing"],
|
| 52 |
+
["create_order", "Order Management", "Create new delivery order"],
|
| 53 |
+
["count_orders", "Order Management", "Count orders by status"],
|
| 54 |
+
["fetch_orders", "Order Management", "Get list of orders with filters"],
|
| 55 |
+
["get_order_details", "Order Management", "Get full order details"],
|
| 56 |
+
["search_orders", "Order Management", "Search orders"],
|
| 57 |
+
["get_incomplete_orders", "Order Management", "Get pending/in-transit orders"],
|
| 58 |
+
["update_order", "Order Management", "Update order"],
|
| 59 |
+
["delete_order", "Order Management", "Delete order"],
|
| 60 |
+
["create_driver", "Driver Management", "Register new driver"],
|
| 61 |
+
["count_drivers", "Driver Management", "Count drivers by status"],
|
| 62 |
+
["fetch_drivers", "Driver Management", "Get list of drivers"],
|
| 63 |
+
["get_driver_details", "Driver Management", "Get full driver details"],
|
| 64 |
+
["search_drivers", "Driver Management", "Search drivers"],
|
| 65 |
+
["get_available_drivers", "Driver Management", "Get active drivers"],
|
| 66 |
+
["update_driver", "Driver Management", "Update driver"],
|
| 67 |
+
["delete_driver", "Driver Management", "Delete driver"],
|
| 68 |
+
["create_assignment", "Assignment Management", "Manual assignment"],
|
| 69 |
+
["auto_assign_order", "Assignment Management", "π€ Auto-assign to nearest driver"],
|
| 70 |
+
["intelligent_assign_order", "Assignment Management", "π§ Gemini 2.0 Flash AI assignment"],
|
| 71 |
+
["get_assignment_details", "Assignment Management", "Get assignment details"],
|
| 72 |
+
["update_assignment", "Assignment Management", "Update assignment"],
|
| 73 |
+
["unassign_order", "Assignment Management", "Remove assignment"],
|
| 74 |
+
["complete_delivery", "Assignment Management", "Mark delivery complete"],
|
| 75 |
+
["fail_delivery", "Assignment Management", "Mark delivery failed"],
|
| 76 |
+
["delete_all_orders", "Bulk Operations", "Bulk delete orders"],
|
| 77 |
+
["delete_all_drivers", "Bulk Operations", "Bulk delete drivers"],
|
| 78 |
+
]
|
| 79 |
+
|
| 80 |
+
# Create Gradio interface
|
| 81 |
+
print("\n[2/2] Creating Gradio UI...")
|
| 82 |
+
|
| 83 |
+
with gr.Blocks(theme=gr.themes.Soft(), title="FleetMind MCP Server") as gradio_app:
|
| 84 |
+
gr.Markdown("# π FleetMind MCP Server")
|
| 85 |
+
gr.Markdown("**Enterprise Model Context Protocol Server for AI-Powered Delivery Dispatch**")
|
| 86 |
+
gr.Markdown("*Track 1: Building MCP Servers - Enterprise Category*")
|
| 87 |
+
|
| 88 |
+
gr.Markdown("---")
|
| 89 |
+
|
| 90 |
+
gr.Markdown("## π MCP Server Connection")
|
| 91 |
+
|
| 92 |
+
gr.Markdown("### π‘ SSE Endpoint URL")
|
| 93 |
+
gr.Textbox(value=MCP_SSE_ENDPOINT, label="Copy this endpoint", interactive=False, max_lines=1)
|
| 94 |
+
|
| 95 |
+
gr.Markdown("### βοΈ Claude Desktop Configuration")
|
| 96 |
+
gr.Markdown("Copy and paste this into your `claude_desktop_config.json` file:")
|
| 97 |
+
gr.Code(value=get_claude_config(), language="json", label="claude_desktop_config.json")
|
| 98 |
+
|
| 99 |
+
gr.Markdown("### π How to Connect")
|
| 100 |
+
gr.Markdown("""
|
| 101 |
+
**Step 1:** Install Claude Desktop from https://claude.ai/download
|
| 102 |
+
|
| 103 |
+
**Step 2:** Open your `claude_desktop_config.json` file and add the configuration shown above
|
| 104 |
+
|
| 105 |
+
**Step 3:** Restart Claude Desktop
|
| 106 |
+
|
| 107 |
+
**Step 4:** Look for "FleetMind" in the π icon menu in Claude Desktop
|
| 108 |
+
|
| 109 |
+
**Step 5:** Start using commands like:
|
| 110 |
+
- "Create a delivery order for John at 123 Main St"
|
| 111 |
+
- "Show me all pending orders"
|
| 112 |
+
- "Auto-assign order ORD-... to the nearest driver"
|
| 113 |
+
- "Use AI to intelligently assign order ORD-..." (Gemini 2.0 Flash!)
|
| 114 |
+
""")
|
| 115 |
+
|
| 116 |
+
gr.Markdown("---")
|
| 117 |
+
|
| 118 |
+
gr.Markdown("## π οΈ Available MCP Tools (29 Total)")
|
| 119 |
+
gr.Dataframe(
|
| 120 |
+
value=get_tools_list(),
|
| 121 |
+
headers=["Tool Name", "Category", "Description"],
|
| 122 |
+
label="All FleetMind MCP Tools",
|
| 123 |
+
wrap=True
|
| 124 |
+
)
|
| 125 |
+
|
| 126 |
+
gr.Markdown("---")
|
| 127 |
+
|
| 128 |
+
gr.Markdown("## β Key Features")
|
| 129 |
+
gr.Markdown("""
|
| 130 |
+
- **29 AI Tools** - Complete fleet management suite
|
| 131 |
+
- **π§ Gemini 2.0 Flash AI** - Intelligent assignment with detailed reasoning
|
| 132 |
+
- **π¦οΈ Weather-Aware Routing** - Safety-first delivery planning
|
| 133 |
+
- **π¦ Real-Time Traffic** - Google Routes API integration
|
| 134 |
+
- **π SLA Tracking** - Automatic on-time performance monitoring
|
| 135 |
+
- **ποΈ PostgreSQL Database** - Production-grade data storage (Neon)
|
| 136 |
+
- **π Multi-Client Support** - Works with Claude Desktop, Continue, Cline, any MCP client
|
| 137 |
+
""")
|
| 138 |
+
|
| 139 |
+
gr.Markdown("---")
|
| 140 |
+
|
| 141 |
+
gr.Markdown("## π Resources")
|
| 142 |
+
gr.Markdown("""
|
| 143 |
+
- **GitHub:** https://github.com/mashrur-rahman-fahim/fleetmind-mcp
|
| 144 |
+
- **HuggingFace Space:** https://huggingface.co/spaces/MCP-1st-Birthday/fleetmind-dispatch-ai
|
| 145 |
+
- **MCP Protocol:** https://modelcontextprotocol.io
|
| 146 |
+
""")
|
| 147 |
+
|
| 148 |
+
gr.Markdown("---")
|
| 149 |
+
gr.Markdown("*FleetMind v1.0 - Built for MCP 1st Birthday Hackathon*")
|
| 150 |
+
|
| 151 |
+
print("[OK] Gradio UI created")
|
| 152 |
+
|
| 153 |
+
# Create FastAPI app
|
| 154 |
+
print("\nCreating unified FastAPI server...")
|
| 155 |
+
app = FastAPI(title="FleetMind MCP Server + UI")
|
| 156 |
+
|
| 157 |
+
# Mount Gradio at root
|
| 158 |
+
app = gr.mount_gradio_app(app, gradio_app, path="/")
|
| 159 |
+
print("[OK] Gradio UI mounted at /")
|
| 160 |
+
|
| 161 |
+
# Add MCP SSE endpoint
|
| 162 |
+
if mcp_available:
|
| 163 |
+
# Mount the MCP server's SSE handler
|
| 164 |
+
print("[OK] MCP SSE endpoint will be available at /sse")
|
| 165 |
+
|
| 166 |
+
print("\n" + "=" * 70)
|
| 167 |
+
print("[STARTING] Unified server...")
|
| 168 |
+
print("=" * 70)
|
| 169 |
+
print("[UI] Gradio UI: http://0.0.0.0:7860")
|
| 170 |
+
if mcp_available:
|
| 171 |
+
print("[MCP] MCP SSE: http://0.0.0.0:7860/sse")
|
| 172 |
+
print("=" * 70)
|
| 173 |
+
|
| 174 |
+
# Add MCP SSE endpoint to FastAPI app
|
| 175 |
+
if mcp_available:
|
| 176 |
+
from starlette.requests import Request
|
| 177 |
+
from starlette.responses import StreamingResponse
|
| 178 |
+
|
| 179 |
+
# Get the MCP server's SSE handler
|
| 180 |
+
# We'll use the app.py server as a separate process
|
| 181 |
+
# For now, just create a simple info endpoint
|
| 182 |
+
@app.get("/sse")
|
| 183 |
+
async def mcp_sse_info():
|
| 184 |
+
"""
|
| 185 |
+
MCP SSE endpoint information.
|
| 186 |
+
|
| 187 |
+
Note: For actual MCP SSE connection, deploy app.py separately
|
| 188 |
+
or use the production endpoint shown in the UI.
|
| 189 |
+
"""
|
| 190 |
+
return {
|
| 191 |
+
"message": "MCP SSE endpoint",
|
| 192 |
+
"status": "Use the standalone app.py for full MCP SSE functionality",
|
| 193 |
+
"tools_count": 29,
|
| 194 |
+
"resources_count": 2,
|
| 195 |
+
"production_endpoint": MCP_SSE_ENDPOINT
|
| 196 |
+
}
|
| 197 |
+
|
| 198 |
+
print("[INFO] MCP SSE info endpoint added at /sse")
|
| 199 |
+
print("[NOTE] For full MCP functionality, the standalone MCP server (app.py) should be deployed")
|
| 200 |
+
|
| 201 |
+
# Run unified server
|
| 202 |
+
if __name__ == "__main__":
|
| 203 |
+
# Run FastAPI with Gradio
|
| 204 |
+
uvicorn.run(
|
| 205 |
+
app,
|
| 206 |
+
host="0.0.0.0",
|
| 207 |
+
port=7860,
|
| 208 |
+
log_level="info"
|
| 209 |
+
)
|
| 210 |
+
|