HyperSaaS
Backend AI Chat

Overview

AI chat system with pluggable agent frameworks and tool calling.

The chat module is the core of HyperSaaS's AI capabilities. It provides multi-model chat sessions with a pluggable agent architecture supporting LangGraph, PydanticAI, and custom frameworks.

Core Models

ChatSession

class ChatSession(BaseModel, WorkspaceAwareModel):
    id = models.UUIDField(primary_key=True)
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    workspace = models.ForeignKey(Workspace, on_delete=models.CASCADE)
    name = models.CharField(max_length=255, null=True, blank=True)
    status = models.CharField(choices=[ACTIVE, COMPLETED, ARCHIVED])

    # AI Configuration
    ai_provider = models.CharField(choices=[OPENAI, ANTHROPIC, GOOGLE, GROQ, DEEPSEEK])
    ai_model = models.CharField(max_length=100)
    system_prompt = models.TextField(blank=True)
    temperature = models.DecimalField(null=True)       # 0.0 - 2.0
    max_tokens = models.PositiveIntegerField(null=True)
    top_p = models.DecimalField(null=True)             # 0.0 - 1.0
    frequency_penalty = models.DecimalField(null=True) # -2.0 - 2.0
    presence_penalty = models.DecimalField(null=True)  # -2.0 - 2.0
    stream = models.BooleanField(default=False)
    model_parameters = models.JSONField(default=dict)  # Provider-specific params

    # Agent Framework
    agent_framework = models.CharField(
        choices=[("none", "None"), ("langgraph", "LangGraph"), ("pydantic_ai", "PydanticAI")],
        default="langgraph"
    )

    # Sharing
    is_shared = models.BooleanField(default=False)
    is_public = models.BooleanField(default=False)
    shareable_link_id = models.UUIDField(unique=True)
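The sampling fields above carry provider-enforced ranges (noted in the inline comments). A minimal, framework-free sketch of the range checks those fields imply — the `validate_sampling_params` helper and its error strings are illustrative, not part of HyperSaaS:

```python
def validate_sampling_params(temperature=None, top_p=None,
                             frequency_penalty=None, presence_penalty=None):
    """Check optional sampling parameters against the documented ranges.

    Returns a list of error strings; an empty list means all values are valid.
    """
    ranges = {
        "temperature": (temperature, 0.0, 2.0),
        "top_p": (top_p, 0.0, 1.0),
        "frequency_penalty": (frequency_penalty, -2.0, 2.0),
        "presence_penalty": (presence_penalty, -2.0, 2.0),
    }
    errors = []
    for name, (value, lo, hi) in ranges.items():
        # None means "use the provider default", so it is always accepted
        if value is not None and not (lo <= value <= hi):
            errors.append(f"{name} must be in [{lo}, {hi}], got {value}")
    return errors
```

For example, `validate_sampling_params(temperature=2.5)` reports one error, while `validate_sampling_params(top_p=0.9)` reports none.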

Message

class Message(BaseModel):
    id = models.UUIDField(primary_key=True)
    session = models.ForeignKey(ChatSession, on_delete=models.CASCADE)
    user = models.ForeignKey(User, null=True, blank=True, on_delete=models.SET_NULL)
    parent_message = models.ForeignKey("self", null=True, blank=True, on_delete=models.SET_NULL)
    sequence_number = models.PositiveIntegerField()  # Auto-assigned atomically
    role = models.CharField(choices=[USER, ASSISTANT, SYSTEM, TOOL])
    content = models.TextField()  # Max 100k chars
    attachments = models.JSONField(default=list)
    map_data = models.JSONField(null=True, blank=True)  # Location data

The sequence_number is atomically assigned using F() expressions to prevent race conditions in concurrent message creation.
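The race being prevented can be illustrated in plain Python with a lock guarding the read-increment-write of a per-session counter. This is a standalone simulation of the idea only — in the real code the database performs the increment via Django F() expressions, so no application-level lock is needed:

```python
import threading

class SequenceAllocator:
    """Hand out strictly increasing sequence numbers per session.

    Simulates what an F()-expression update achieves in the database:
    the read-increment-write happens as one indivisible step.
    """
    def __init__(self):
        self._lock = threading.Lock()
        self._counters = {}  # session_id -> last assigned sequence_number

    def next_sequence(self, session_id):
        with self._lock:
            seq = self._counters.get(session_id, 0) + 1
            self._counters[session_id] = seq
            return seq

allocator = SequenceAllocator()

# 50 threads creating messages concurrently in the same session
results = []
threads = [
    threading.Thread(target=lambda: results.append(allocator.next_sequence("s1")))
    for _ in range(50)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Each number 1..50 is assigned exactly once — no duplicates, no gaps
```

Without the lock, two threads could read the same counter value and both save a message with the same sequence_number; the F() expression pushes that atomicity down into the database.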

Message Processing Flow

1. User sends message
2. Save the user Message (sequence_number assigned atomically)
3. Select the AI handler based on agent_framework:
       ├─ "langgraph"   → LangGraphHandler
       ├─ "pydantic_ai" → PydanticAIHandler
       └─ "none"        → BaseChatModelHandler (no tools)
4. Dispatch:
       ├─ If AGENT_ASYNC_ENABLED → Celery task, return task_id
       └─ Otherwise → invoke synchronously
5. Handler returns AgentMessage[] + usage
6. process_and_save_ai_response():
       ├─ Save assistant Message(s)
       ├─ Extract map_data
       ├─ Log AIUsage record
       └─ Increment session message count
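The handler-selection step above can be sketched as a registry keyed on agent_framework. The handler class names follow the flow diagram; the registry and the `get_handler` helper itself are illustrative, not the actual HyperSaaS API:

```python
class BaseChatModelHandler:
    """Plain chat completion against the configured model; no tool calling."""
    supports_tools = False

class LangGraphHandler(BaseChatModelHandler):
    """Agentic handler backed by LangGraph, with tool calling."""
    supports_tools = True

class PydanticAIHandler(BaseChatModelHandler):
    """Agentic handler backed by PydanticAI, with tool calling."""
    supports_tools = True

# agent_framework choice value -> handler class (mirrors the flow diagram)
HANDLERS = {
    "langgraph": LangGraphHandler,
    "pydantic_ai": PydanticAIHandler,
    "none": BaseChatModelHandler,
}

def get_handler(agent_framework: str) -> BaseChatModelHandler:
    try:
        return HANDLERS[agent_framework]()
    except KeyError:
        raise ValueError(f"Unknown agent framework: {agent_framework!r}")
```

A registry like this keeps the dispatch step open for custom frameworks: registering a new entry in `HANDLERS` is all that is needed to route sessions to a new handler class.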

Supported Providers & Models

Provider    Models
OpenAI      o3, o3-mini, o4-mini, gpt-5, gpt-5-mini, gpt-4o, gpt-4-turbo, o1
Anthropic   claude-opus-4.6, claude-sonnet-4.6, claude-sonnet-4.5, claude-haiku-4.5
Google      gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite
Groq        Various open models
DeepSeek    DeepSeek models
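A pairing of ai_provider and ai_model can be checked against the table with a simple lookup. The `is_supported` helper is hypothetical, the model sets are copied from the table above, and Groq and DeepSeek are omitted because the table does not enumerate their models:

```python
# Provider -> known model identifiers, copied from the documented table.
SUPPORTED_MODELS = {
    "openai": {"o3", "o3-mini", "o4-mini", "gpt-5", "gpt-5-mini",
               "gpt-4o", "gpt-4-turbo", "o1"},
    "anthropic": {"claude-opus-4.6", "claude-sonnet-4.6",
                  "claude-sonnet-4.5", "claude-haiku-4.5"},
    "google": {"gemini-2.5-pro", "gemini-2.5-flash", "gemini-2.5-flash-lite"},
}

def is_supported(ai_provider: str, ai_model: str) -> bool:
    """True if the model appears in the documented list for the provider."""
    return ai_model in SUPPORTED_MODELS.get(ai_provider, set())
```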
