GitHub Copilot Integration Guide
This guide walks you through integrating the CWCloud MCP Server with GitHub Copilot in VS Code, giving you access to AI models (GPT-4, Claude, Gemini) directly through Copilot Chat.
Quick Setup
Automated Setup (Recommended)
Run the setup script from the project root:
./setup-github-copilot.sh
# Or
make setup-copilot
This script will:
- Build the MCP server
- Collect your CWCloud API credentials
- Test the server connection
- Create VS Code MCP configuration
- Optionally set up global configuration
Manual Setup
If you prefer manual setup, follow these steps:
1. Build the Server
go build -o bin/cwcloud-mcp ./cmd/server
# Or
make build
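If the build succeeds, the binary is written to bin/cwcloud-mcp (per the -o flag above); you can confirm it is there before continuing:
ls -l bin/cwcloud-mcp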
2. Configure Environment
Create a .env file with your CWCloud credentials:
COMWORK_API_URL=https://api.cwcloud.tech
COMWORK_ACCESS_KEY=your_access_key_here
COMWORK_SECRET_KEY=your_secret_key_here
LOG_LEVEL=info
Get your API keys from the CWCloud Console.
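Before wiring the server into VS Code, you can run a quick smoke test from the shell. The snippet below is a minimal sketch that assumes the binary reads its credentials from the environment and speaks standard MCP JSON-RPC over stdio; a healthy server should answer the initialize request with a JSON response on stdout (press Ctrl+C if the process keeps waiting for more input):
# Export the variables from .env into the current shell
set -a && source .env && set +a
# Send an MCP initialize request over stdin and check for a JSON-RPC reply
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' | ./bin/cwcloud-mcp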
3. Create MCP Configuration
Create .vscode/mcp.json in your project:
{
  "servers": {
    "cwcloud-mcp": {
      "type": "stdio",
      "command": "/absolute/path/to/your/bin/cwcloud-mcp",
      "args": [],
      "env": {
        "COMWORK_API_URL": "https://api.cwcloud.tech",
        "COMWORK_ACCESS_KEY": "your_access_key_here",
        "COMWORK_SECRET_KEY": "your_secret_key_here",
        "LOG_LEVEL": "info"
      }
    }
  }
}
Note: Use absolute paths in the command field.
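To print the absolute path from the project root (assuming the binary was built into bin/ as in step 1), run the following and paste the result into the command field:
echo "$(pwd)/bin/cwcloud-mcp"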
4. Global Configuration (Optional)
To make the server available across all workspaces, copy the configuration to your VS Code user directory:
Linux/WSL:
mkdir -p ~/.config/Code/User
cp .vscode/mcp.json ~/.config/Code/User/
macOS:
mkdir -p ~/Library/Application\ Support/Code/User
cp .vscode/mcp.json ~/Library/Application\ Support/Code/User/
Windows:
mkdir "%APPDATA%\Code\User"
copy .vscode\mcp.json "%APPDATA%\Code\User\"
Using CWCloud Tools in GitHub Copilot
After setup, restart VS Code and you'll have access to these tools in Copilot Chat:
Available Tools
AI Tools
#generate_ai_prompt
Generate AI responses using various models.
Parameters:
- adapter (required): AI model to use
- message (required): Your prompt
- conversation_id (optional): Continue existing conversation
- max_tokens (optional): Token limit
- temperature (optional): Creativity (0.0-1.0)
Supported Adapters:
- gpt4o - GPT-4 Optimized
- gpt4o-mini - GPT-4 Mini (faster, cheaper)
- claude3sonnet - Claude 3 Sonnet
- claude3haiku - Claude 3 Haiku
- gemini - Google Gemini
- gemini-pro - Google Gemini Pro
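For example, following the same param:value syntax shown for the other tools in this guide (the prompt text and quoting are illustrative; exact quoting of multi-word values may vary by Copilot Chat version):
#generate_ai_prompt adapter:gpt4o-mini message:"Explain what this repository does" temperature:0.2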
#list_conversations
List your AI conversation history.
Parameters: None
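Since it takes no parameters, the invocation is just the tool name:
#list_conversations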
#list_adapters
List all available AI adapters and their current status.
Parameters:
- format (optional): Output format - json (default), table, or list
- show_unavailable (optional): Include unavailable adapters (default: false)
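Example (combining both optional parameters):
#list_adapters format:table show_unavailable:true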
Environment Management Tools
#list_environments
List all your cloud environments (user access).
Parameters:
- format (optional): Output format - table (default), json, or list
Example:
#list_environments format:table