Complete Guide to Getting Started with CoPaw: Building Your First AI Assistant from Scratch

AI Expert

Difficulty: Beginner | Duration: 15 Minutes | Gain: Master CoPaw core concepts and practical skills

CoPaw is your personal AI assistant—a powerful open-source framework built on AgentScope that works across multiple chat platforms. Whether you need a bot for DingTalk, Feishu, QQ, Discord, or iMessage, CoPaw has you covered.

Target Audience

This guide is suitable for the following developers:

  • 1-5 years of Python development experience
  • Want to build AI assistants for personal or enterprise use
  • Looking for a solution with flexible model options

Core Dependencies and Environment

Before starting, let's ensure your environment meets the following requirements:

| Dependency | Requirement |
|------------|-------------|
| Python     | 3.10+       |
| pip        | Latest version |
| OS         | macOS / Linux / Windows |

[!TIP]
If you don't want to manage Python yourself, CoPaw also provides a one-click installation script that handles all dependencies automatically.

Project Structure Overview

After initialization, a typical CoPaw project structure looks like this:

~/.copaw/
├── .secret/
│   └── providers.json       # Model provider configuration
├── active_skills/          # Auto-loaded custom skills
├── memory/                 # Conversation memory storage
├── models/                 # Local model files
└── copaw.json              # Main configuration file

Step-by-Step Instructions

Step 1: Install CoPaw

Installation is simple. Open your terminal and run:

pip install copaw

This installs the latest version of CoPaw from PyPI. The package includes all core dependencies required to run the assistant.

Step 2: Initialize Workspace

Once installed, let’s initialize the CoPaw workspace with default settings:

copaw init --defaults

This command creates the necessary directory structure and configuration files under ~/.copaw/ in your home directory.

[!WARNING]
The initialization process creates a .secret directory to store sensitive information (like API Keys). Ensure this directory remains private and is not committed to version control.

Step 3: Configure Model Providers

Now comes the crucial part: connecting to an LLM. We will use Defapi as the primary example; it charges roughly half the price of the official APIs while remaining fully OpenAI-compatible.

Edit the configuration file located at ~/.copaw/.secret/providers.json:

{
  "custom_providers": {
    "defapi": {
      "id": "defapi",
      "name": "Defapi",
      "default_base_url": "https://api.defapi.cn/v1",
      "api_key_prefix": "",
      "base_url": "https://api.defapi.cn/v1",
      "api_key": "your-defapi-api-key",
      "models": [
        {"id": "gpt-4o-mini", "name": "GPT-4o Mini"},
        {"id": "gpt-4o", "name": "GPT-4o"},
        {"id": "claude-sonnet-4-20250514", "name": "Claude Sonnet 4"},
        {"id": "gemini-2.0-flash", "name": "Gemini 2.0 Flash"}
      ],
      "chat_model": "OpenAIChatModel"
    }
  },
  "active_llm": {
    "provider_id": "defapi",
    "model": "gpt-4o-mini"
  }
}

Defapi is a cost-effective API platform that provides access to the major LLM providers. It fully supports the OpenAI-compatible /v1/chat/completions protocol, making integration seamless. All major models on Defapi are reachable through the following interfaces:

  • v1/chat/completions (OpenAI-style)
  • v1/messages (Anthropic-style)
  • v1beta/models/ (Gemini-style)

You can obtain an API Key from the Defapi official website.
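Before launching, it can help to sanity-check the file you just edited. The following sketch validates the structure shown above; the field names are taken from the example, and CoPaw itself may perform stricter checks:

```python
import json
from pathlib import Path

def validate_providers(config: dict) -> list[str]:
    """Return a list of problems found in a providers.json-style dict."""
    problems = []
    providers = {}
    providers.update(config.get("providers", {}))
    providers.update(config.get("custom_providers", {}))

    active = config.get("active_llm", {})
    pid = active.get("provider_id")
    if pid not in providers:
        problems.append(f"active_llm.provider_id {pid!r} is not a configured provider")
        return problems

    entry = providers[pid]
    key = entry.get("api_key", "")
    if not key or "your-" in key:
        problems.append(f"provider {pid!r} still has a placeholder API key")

    model = active.get("model")
    known = {m["id"] for m in entry.get("models", [])}
    if known and model not in known:
        problems.append(f"model {model!r} not listed under provider {pid!r}")
    return problems

# Example usage against the real file:
# config = json.loads(Path("~/.copaw/.secret/providers.json").expanduser().read_text())
# print(validate_providers(config) or "providers.json looks OK")
```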

Step 4: Built-in Providers at a Glance

CoPaw also supports several built-in providers. Here are some quick configuration examples:

OpenAI Configuration:

{
  "providers": {
    "openai": {
      "base_url": "https://api.openai.com/v1",
      "api_key": "sk-your-openai-key"
    }
  },
  "active_llm": {
    "provider_id": "openai",
    "model": "gpt-4o-mini"
  }
}

ModelScope (Ideal for users in China):

{
  "providers": {
    "modelscope": {
      "base_url": "https://api-inference.modelscope.cn/v1",
      "api_key": "ms-your-key"
    }
  }
}

DashScope (Alibaba Cloud):

{
  "providers": {
    "dashscope": {
      "base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1",
      "api_key": "sk-your-dashscope-key"
    }
  }
}
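All of the providers above speak the same OpenAI-compatible wire format, so a request can be assembled identically regardless of which one is active. Here is a minimal sketch using only the standard library; the base URL, key, and model are placeholders:

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build (but don't send) an OpenAI-compatible /chat/completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# To actually send it (requires a valid key):
# with urllib.request.urlopen(build_chat_request(...)) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```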

Step 5: Launch CoPaw and Verify

Now let’s launch the application and verify everything is working:

copaw app

After running, open your browser and visit:

http://127.0.0.1:8088/

You should see the CoPaw Console interface, where you can:

  • Chat with the AI assistant
  • Configure channels (DingTalk, Feishu, QQ, Discord, etc.)
  • Manage skills and plugins
  • Test model connections

[!TIP]
Use the CLI command copaw models to quickly view current model configurations and test connections.

Step 6: Connect Chat Channels (Optional)

To make your assistant accessible via messaging apps, let’s add a channel. Taking Discord as an example:

  1. Create a Discord Bot in the Discord Developer Portal
  2. Get your Bot Token
  3. Add the configuration to copaw.json:
{
  "channels": {
    "discord": {
      "enabled": true,
      "bot_token": "your-discord-bot-token",
      "channel_ids": ["your-channel-id"]
    }
  }
}

Similar configurations apply to DingTalk, Feishu, QQ, and other supported platforms.

Troubleshooting

Here are the most common issues encountered when setting up CoPaw:

Issue 1: 401 Unauthorized Error

Symptom: API requests fail with a 401 status code.

Solution:

  • Double-check if the API Key is correctly copied into providers.json
  • Verify if the key has expired or been revoked
  • For Defapi, ensure you are using the correct API Key format

Issue 2: Connection Timeout

Symptom: Requests hang or time out after 30 seconds.

Solution:

  • Check your internet connection
  • Verify firewall settings allow outbound HTTPS traffic
  • For local models (Ollama), ensure the service is running on the correct port

Issue 3: Model Not Found

Symptom: Specific model IDs are not recognized.

Solution:

  • Confirm the exact model ID from the provider's documentation
  • Some models may be region-restricted—check availability in your region
  • Try using a more common model like gpt-4o-mini to verify basic connectivity

Issue 4: Ollama Connection Failure

Symptom: Local Ollama models cannot connect.

Solution:

  • Ensure ollama serve is running in another terminal
  • Verify the Base URL is set to http://localhost:11434/v1
  • Check if Ollama is installed and accessible in your PATH

Issue 5: Config File Not Found

Symptom: CoPaw cannot find providers.json.

Solution:

  • Ensure you have run copaw init --defaults first
  • Check if the file exists at ~/.copaw/.secret/providers.json
  • On Windows, use %USERPROFILE%\.copaw\.secret\providers.json
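Path.home() resolves to %USERPROFILE% on Windows and $HOME on macOS/Linux, so one snippet locates the file on any OS:

```python
from pathlib import Path

def providers_path() -> Path:
    """Resolve the providers.json location portably via the home directory."""
    return Path.home() / ".copaw" / ".secret" / "providers.json"

# Quick diagnostic:
# p = providers_path()
# print(p, "exists" if p.exists() else "MISSING - run: copaw init --defaults")
```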

Issue 6: Port Already in Use

Symptom: Unable to start CoPaw on port 8088.

Solution:

  • Check which application is occupying port 8088 (for example, with lsof -i :8088 on macOS/Linux)
  • Change CoPaw's port in the configuration, or stop the conflicting application
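To find out whether the port is occupied before launching, a quick check with the standard library:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 on a successful connection (i.e. port occupied)
        return s.connect_ex((host, port)) == 0

# print(port_in_use(8088))
```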

Advanced Directions

Once you've mastered the basics, here are some directions to explore:

1. Privacy-First Local Models

CoPaw supports running models locally without any external API calls. This is perfect for:

  • Privacy-sensitive applications
  • Offline operations
  • Reducing costs

Install local model support:

pip install 'copaw[llamacpp]'

Then download and use models like Qwen3:

copaw models download Qwen/Qwen3-4B-GGUF

2. Custom Skills

CoPaw supports extending its functionality through custom skills. Skills are automatically loaded from the active_skills/ directory. You can write Python scripts to define new tools the AI can use.
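CoPaw's exact skill API is not covered in this guide, so treat the following as a hypothetical illustration only: a skill module dropped into active_skills/ that exposes a plain, well-documented function for the assistant to call as a tool.

```python
# ~/.copaw/active_skills/word_count.py  (hypothetical skill file)

def word_count(text: str) -> dict:
    """Count words and characters in a piece of text.

    A docstring like this is what tool-calling frameworks typically
    surface to the model as the tool's description.
    """
    words = text.split()
    return {"words": len(words), "characters": len(text)}
```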

3. Heartbeat Scheduled Tasks

Use CoPaw’s heartbeat feature to trigger periodic actions:

  • Daily news summaries
  • Timed reminders
  • Automated content generation

4. Memory and Context Management

CoPaw implements intelligent memory management:

  • Long-term memory for persistent context
  • Token-based memory compression for efficiency
  • Context window configuration based on the chosen model
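CoPaw's actual compression logic is internal, but the general idea of trimming history to a token budget can be sketched as follows (assuming a rough four-characters-per-token estimate; real tokenizers differ):

```python
def estimate_tokens(text: str) -> int:
    """Rough heuristic: about four characters per token for English text."""
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], budget: int) -> list[dict]:
    """Keep the most recent messages whose combined estimate fits the budget."""
    kept, used = [], 0
    for msg in reversed(messages):
        cost = estimate_tokens(msg["content"])
        if used + cost > budget:
            break  # oldest messages beyond the budget are dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))
```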

5. Multi-Agent Workflows

Based on AgentScope, CoPaw can handle complex multi-agent scenarios, allowing different AI models to collaborate on tasks.

Summary

CoPaw provides a flexible foundation for building AI assistants that work across multiple platforms. Key takeaways from this guide:

  1. Simple Installation — Just pip install copaw to get started
  2. Defapi Offers Best Value — Approximately 50% cheaper than official APIs while remaining fully compatible
  3. Centralized Configuration — All settings are in providers.json
  4. Console Makes Management Easier — A web interface for testing and configuration
  5. Built-in Extensibility — Skills, channels, and local models provide endless customization options

Start with Defapi for the best balance of cost and performance. Once you're comfortable, explore other providers or local models based on your specific needs.

[!TIP]
Remember: The best AI assistant is the one that fits your use case. Don't be afraid to experiment with different models and configurations to find what works best for you.

Happy AI journey!