WorkspaceGPT v1.0.0

Your AI-powered local coding assistant

Chat with your codebase and your Confluence docs right inside your IDE. Designed for total privacy: run it 100% locally or via your own APIs.

🔐

100% Local & Private

Run models locally with Ollama; zero data leaves your machine. Total privacy.

☁️

Cloud Connectors

Hook into OpenAI, Gemini, Groq, or OpenRouter when you need more model power.

Capabilities designed for builders

Everything you need to navigate, understand, and build within your existing codebase seamlessly.

🤖

AI-Powered Q&A

Get context-aware answers from your local workspace using blazing fast Retrieval-Augmented Generation (RAG).

📄

Confluence One-Click

Seamlessly connect to your Confluence space and instantly start chatting with your documentation alongside your code.

🔷

Azure DevOps (ADO)

Deep integration with ADO to fetch work items, user stories, and pull requests directly into your AI context.

💬

Interactive Editor Chat

Ask questions directly in the IDE to receive intelligent, project-specific code solutions. Stop switching context.

Ready in minutes

1

Installation

WorkspaceGPT is available directly through the marketplace. Install it for VS Code or Cursor.

ext install Riteshkant.workspacegpt-extension
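If you prefer the terminal, the same extension ID can be installed through the editor's CLI. A minimal sketch, assuming the `code` (VS Code) or `cursor` (Cursor) command is on your PATH:

```shell
# Install WorkspaceGPT via the editor CLI instead of the in-app command above.
EXT_ID="Riteshkant.workspacegpt-extension"
for CLI in code cursor; do              # both editors expose the same flag
  if command -v "$CLI" >/dev/null 2>&1; then
    "$CLI" --install-extension "$EXT_ID"
  else
    echo "$CLI CLI not found; skipping"
  fi
done
```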
2

Choose Provider

Select your engine in Settings > Providers.

  • Ollama (100% Local) - Default: llama3.2:1b
  • OpenAI / Gemini - High performance models
  • OpenRouter - Multiple model access
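For the local option, you can pull the default model listed above ahead of time so the first chat doesn't stall on a download. A sketch, assuming the Ollama CLI is installed:

```shell
# Pre-pull the extension's default local model (llama3.2:1b, per the list above).
MODEL="llama3.2:1b"
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"
else
  echo "ollama not found; install it from https://ollama.com"
fi
```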
3

Connect Contexts

Confluence Start

Go to Settings > Confluence, click Sign in for one-click authentication, then hit Sync.

ADO Synchronization

Go to Settings > ADO Integration and enter a Personal Access Token (PAT) to sync pull requests, tickets, and work items.
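Before pasting a PAT into the settings UI, you can sanity-check it against the Azure DevOps REST API. A hedged sketch, not part of the extension: `ADO_ORG` and `ADO_PAT` are placeholder environment variables you supply yourself.

```shell
# List the projects this PAT can see; an auth error means a bad or expired token.
# ADO uses HTTP basic auth with an empty username and the PAT as the password.
ADO_API="https://dev.azure.com/${ADO_ORG:-your-org}/_apis/projects?api-version=7.1"
if [ -n "${ADO_PAT:-}" ]; then
  curl -fsS -u ":${ADO_PAT}" "$ADO_API"
else
  echo "Set ADO_ORG and ADO_PAT to run this check"
fi
```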