LLM Commands

Manage LLM projects, agents, and chat sessions.

Commands

bash
tdx llm projects [pattern]          # List projects
tdx llm project list [pattern]      # Same as projects

tdx llm agents [pattern]            # List agents
tdx llm agent list [pattern]        # Same as agents

Project Management

Set Project Context

bash
# Set current project context (session-only)
tdx llm use "MyProject"

# Now all agent commands use "MyProject" by default
tdx llm agents
tdx llm agent show "Data Analyst"

List and Create Projects

bash
# List available models (names usable with --model when creating agents)
tdx llm models

# List all projects
tdx llm projects

# List projects matching pattern
tdx llm projects "data*"
tdx llm projects "*_prod"

# Create a new project
tdx llm project create "MyProject" --description "Data analysis project"

# Delete a project
tdx llm project delete "OldProject"

Backup and Restore Projects

Backup an entire LLM project to a folder, including all agents, knowledge bases, prompts, and integrations:

bash
# Backup project to default folder ({project_name}.llm)
tdx llm project backup "MyProject"

# Backup to custom folder
tdx llm project backup "MyProject" -o ./backups/myproject

# Preview what would be backed up (dry-run)
tdx llm project backup "MyProject" --dry-run

# Overwrite existing backup without confirmation
tdx llm project backup "MyProject" -y

Restore a project from a backup:

bash
# Restore project with original name
tdx llm project restore ./MyProject.llm

# Restore with a new name
tdx llm project restore ./MyProject.llm --name "MyProject-restored"

# Preview what would be restored (dry-run)
tdx llm project restore ./MyProject.llm --dry-run

# Skip confirmation if project exists
tdx llm project restore ./MyProject.llm -y

Backup Contents

The backup folder contains:

  • project.json - Project metadata and backup info
  • agents.json - All agents with full configuration
  • knowledgebases.json - All knowledge bases
  • prompts.json - All prompts
  • integrations.json - All integrations
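Because each component is stored as plain JSON, a backup can be inspected with standard tools. This is an illustrative sketch: the folder path assumes the default output location, and the assumption that agents.json holds a top-level JSON array is not confirmed by this document.

```shell
# Inspect a backup created with `tdx llm project backup "MyProject"`.
# Path assumes the default output folder name.
BACKUP=./MyProject.llm

# List the backup components
ls "$BACKUP"

# Pretty-print the project metadata (python3 as a portable JSON viewer)
python3 -m json.tool "$BACKUP/project.json"

# Count backed-up agents, assuming agents.json holds a JSON array
python3 -c 'import json,sys; print(len(json.load(open(sys.argv[1]))))' \
  "$BACKUP/agents.json"
```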

Agent Management

List Agents

bash
# List agents in current project
tdx llm agents

# Filter agents by pattern
tdx llm agents "test*"

# List agents in specific project
tdx llm agents --llm-project "OtherProject"

Create Agent

bash
# Create basic agent
tdx llm agent create "My Agent" --model claude-4.5-haiku

# Create with system prompt
tdx llm agent create "SQL Expert" \
  --system-prompt "You are an expert in SQL and data analysis." \
  --model "claude-4.5-sonnet"

# Create with all options
tdx llm agent create "Data Analyst" \
  --system-prompt "Help users analyze data." \
  --model "claude-4.5-sonnet" \
  --starter-message "Hello! I can help you analyze your data." \
  --max-tool-iterations 8 \
  --temperature 0.5

Show and Update Agent

bash
# Show agent details
tdx llm agent show "Data Analyst"

# Update agent name
tdx llm agent update "Data Analyst" --name "Senior Data Analyst"

# Update agent prompt
tdx llm agent update "Data Analyst" \
  --prompt "You are a senior data analyst." \
  --description "Updated description"

# Delete agent
tdx llm agent delete "Old Agent"

Chat History

bash
# List recent chat sessions
tdx llm history

# Show specific chat messages
tdx llm history chat456

Project Resolution

Agent commands resolve projects in this order:

  1. --llm-project global option (highest priority)
  2. tdx llm use <project> context (session-only)
  3. Default project tdx_default_<username> (auto-created)
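The fallback in step 3 can be sketched in plain shell. The derivation of <username> from the current OS user is an assumption based on the placeholder above, not confirmed behavior:

```shell
# Sketch: how the step-3 fallback project name is formed,
# assuming <username> is the current OS user.
username="$(whoami)"
default_project="tdx_default_${username}"
echo "$default_project"   # e.g. tdx_default_alice
```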

Claude Code Integration

See tdx claude for launching Claude Code with TD LLM backend.

LLM Proxy Server

Experimental

This feature is experimental and may have limitations.

Start a local HTTP server providing Anthropic-compatible API endpoints:

bash
# Start proxy with defaults (port 4000)
tdx llm proxy

# Custom port
tdx llm proxy --port 8000

# Use specific project and agent
tdx llm proxy --project "MyProject" --agent "MyAgent"

# Enable debug mode
tdx llm proxy --debug
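With the proxy running, a request can be sketched against it. The /v1/messages path, anthropic-version header, and body shape follow Anthropic's public Messages API, which the proxy claims compatibility with; whether the proxy requires an API key header is not documented here.

```shell
# Send a test message to the running proxy (default port 4000).
# Body follows the Anthropic Messages API request shape.
curl -s http://127.0.0.1:4000/v1/messages \
  -H "content-type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "sonnet",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```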

Proxy Configuration for Claude Code

Create .claude/settings.local.json:

json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://127.0.0.1:4000",
    "ANTHROPIC_MODEL": "sonnet"
  }
}

Known Limitations

  • Tool calls rely on prompt engineering
  • Non-streaming responses not implemented
  • Image inputs not supported
  • TD's built-in tools not directly accessible

Agent Options Reference

Create Options

Option                     | Description                 | Default
--system-prompt <text>     | System prompt/instructions  | empty
--model <name>             | Model type or alias         | claude-4.5-haiku
--starter-message <text>   | Starter message             | -
--max-tool-iterations <n>  | Max tool iterations         | 4
--temperature <n>          | Temperature (0.0-2.0)       | 0.7

Update Options

Option                    | Description
--name <text>             | New agent name
--prompt <text>           | New prompt/instructions
--description <text>      | New description
--starter-message <text>  | New starter message

Backup Options

Option                 | Description                     | Default
-o, --output <folder>  | Output folder                   | {project_name}.llm
--dry-run              | Preview without creating files  | false
-y, --yes              | Skip confirmation prompt        | false

Restore Options

Option         | Description                     | Default
--name <text>  | New project name                | Original name from backup
--dry-run      | Preview without making changes  | false
-y, --yes      | Skip confirmation prompt        | false