Getting Started with PentAGI

System Requirements

  • Docker and Docker Compose
  • Minimum 4GB RAM
  • 10GB free disk space
  • Internet access for downloading images and updates
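
To confirm Docker and the Compose plugin are available before proceeding, you can check their versions (any reasonably recent release should satisfy the requirements above):

docker --version
docker compose version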

License Agreement

Before using PentAGI, please read and agree to our End User License Agreement (EULA), available in our GitHub repository as EULA.md.

For the most up-to-date information and current state of the project, please visit our GitHub repository.

Quick Installation

1. Create a working directory:

mkdir pentagi && cd pentagi

2. Copy the configuration file:

curl -o .env https://raw.githubusercontent.com/vxcontrol/pentagi/master/.env.example

3. Fill in the required API keys in the .env file:

# Required: At least one of these LLM providers
OPEN_AI_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key

# Optional: Additional search capabilities
GOOGLE_API_KEY=your_google_key
GOOGLE_CX_KEY=your_google_cx
TAVILY_API_KEY=your_tavily_key
TRAVERSAAL_API_KEY=your_traversaal_key
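
Since the .env file now contains API keys, you may also want to restrict its permissions so it is readable only by your user (optional, not required by PentAGI itself):

chmod 600 .env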

4. Launch PentAGI:

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose.yml
docker compose up -d

Visit https://localhost:8443 to access the PentAGI Web UI (default credentials: admin@pentagi.com / admin)
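
If the UI is not reachable, check that the containers started correctly and follow the application logs. The service name pentagi below is an assumption; consult the downloaded docker-compose.yml for the exact name:

docker compose ps
docker compose logs -f pentagi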

System Architecture

PentAGI is built on a modular, scalable, and secure architecture. Key components:

1. Core Services

  • Frontend UI: React-based web interface with TypeScript
  • Backend API: REST and GraphQL APIs in Go
  • Vector Store: PostgreSQL with pgvector for semantic search
  • Task Queue: Async task processing system
  • AI Agent: Multi-agent system with specialized roles

2. Monitoring

  • OpenTelemetry: Unified observability data collection
  • Grafana: Real-time visualization and alerting dashboards
  • VictoriaMetrics: High-performance time-series storage
  • Jaeger: End-to-end distributed tracing

3. Analytics

  • Langfuse: Advanced LLM observability and analytics
  • ClickHouse: Column-oriented analytics data warehouse
  • Redis: High-speed caching and rate limiting
  • MinIO: S3-compatible object storage

External System Integration

Langfuse

For monitoring and analyzing AI agent operations:

# Add to your .env file:
LANGFUSE_BASE_URL=http://langfuse-web:3000

# Launch Langfuse stack:
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-langfuse.yml
docker compose -f docker-compose.yml -f docker-compose-langfuse.yml up -d
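
To confirm the Langfuse services are up, list the containers from the combined Compose files and follow the web service logs (langfuse-web matches the hostname used in LANGFUSE_BASE_URL above):

docker compose -f docker-compose.yml -f docker-compose-langfuse.yml ps
docker compose -f docker-compose.yml -f docker-compose-langfuse.yml logs -f langfuse-web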

Monitoring and Observability

For detailed system operation tracking:

# Add to your .env file:
OTEL_HOST=otelcol:8148

# Launch monitoring stack:
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-observability.yml
docker compose -f docker-compose.yml -f docker-compose-observability.yml up -d
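
As with the Langfuse stack, you can verify the monitoring services and watch the collector logs (otelcol matches the hostname used in OTEL_HOST above):

docker compose -f docker-compose.yml -f docker-compose-observability.yml ps
docker compose -f docker-compose.yml -f docker-compose-observability.yml logs -f otelcol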

OAuth Integration

To enable GitHub and Google authentication, add the following to your .env file:

# GitHub OAuth
GITHUB_CLIENT_ID=your_github_client_id
GITHUB_CLIENT_SECRET=your_github_client_secret

# Google OAuth
GOOGLE_CLIENT_ID=your_google_client_id
GOOGLE_CLIENT_SECRET=your_google_client_secret
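
The OAuth applications must also be registered with the respective providers (a GitHub OAuth App or a Google Cloud OAuth client) with a redirect URL pointing at your PentAGI instance; the exact callback path depends on your deployment. After editing .env, re-run Compose so containers whose configuration changed are recreated with the new values:

docker compose up -d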