Introduction

OrbitAI provides a complete ecosystem for creating, managing, and orchestrating intelligent agents that can collaborate to solve complex tasks using Large Language Models (LLMs), tools, and knowledge bases.

What is OrbitAI?

OrbitAI introduces autonomous agents that can work individually or as teams to accomplish sophisticated workflows. Unlike traditional single-turn AI interactions, OrbitAI enables persistent, purpose-oriented agents that can:
  • Reason and plan complex multi-step solutions.
  • Collaborate with other agents in coordinated workflows.
  • Learn and adapt through persistent memory systems.
  • Use tools to interact with external systems and APIs.
  • Access knowledge through semantic search and retrieval systems.
  • Operate safely with built-in guardrails and validation.

How OrbitAI Works

The OrbitAI architecture is built around the concept of Orbits - coordinated teams of agents working together to execute tasks. Here’s how the system operates:
  1. Agent Creation: Define specialized agents with specific roles, purpose, and context.
  2. Task Definition: Create structured tasks with clear objectives and expected outputs.
  3. Orbit Assembly: Combine agents and tasks into an orchestrated workflow.
  4. Execution: The system coordinates agent collaboration, tool usage, and task completion.
  5. Results: Receive structured outputs with full execution context and metrics.

Core Components

OrbitAI’s architecture consists of nine primary components that work together to create a powerful AI application platform:
| Component | Purpose | Key Features |
| --- | --- | --- |
| Orbits | Orchestration engine that coordinates agent teams and task execution | Sequential/hierarchical processes, multi-agent coordination, execution management |
| Agents | Autonomous AI entities with specific roles and capabilities | Role-based behavior, tool integration, memory systems, delegation support |
| Tasks | Structured units of work with defined objectives and constraints | Flexible execution, dependency management, output formatting, validation |
| LLMs | Large Language Model providers and management system | Provider support (OpenAI), model selection, token management |
| Tools | External capabilities and integrations available to agents | System tools, file operations, web search, API integrations, and more |
| Process | Execution strategies that determine how agents collaborate | Sequential (one-by-one) or hierarchical (manager-coordinated) workflows |
| KnowledgeBase | Semantic information storage and retrieval system | Document ingestion, embedding-based search, RAG (Retrieval-Augmented Generation) |
| Memory | Persistent storage for agent learning and context retention | Short-term, long-term, entity, and contextual memory types |
| Guardrails | Safety and validation framework for agent behavior | Content filtering, behavior constraints, output validation, compliance checks |

System Integration

How Components Work Together

The OrbitAI system creates powerful AI applications through seamless integration of its components:

1. Initialization & Configuration

// Register an LLM provider so agents have a model available for reasoning.
let llmManager = LLMManager()
try llmManager.addProvider(OpenAIProvider(apiKey: "your-key"))

2. Agent Creation & Specialization

Agents are created with specific roles and equipped with relevant tools and LLM access:
let recipeExtractor = Agent(
    role: "Recipe Extractor",
    purpose: "Extract recipe content from URLs", 
    context: "Expert in web scraping and recipe data extraction",
    tools: ["webScraper"]
)
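
Multiple specialized agents can be defined the same way and later combined into a team. As an illustrative sketch (the formatter role and the "markdownWriter" tool name are hypothetical, but the initializer is the same one used above):
let recipeFormatter = Agent(
    role: "Recipe Formatter",
    purpose: "Format extracted recipes into clean, readable text",
    context: "Expert in structuring recipe content for readability",
    tools: ["markdownWriter"] // hypothetical tool name; register whichever tools your setup provides
)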

3. Task Definition & Assignment

Tasks are created with clear objectives and can leverage all system components:
let extractTask = Task(
    description: "Extract recipe from https://example.com/chocolate-chip-cookies",
    expectedOutput: "Plain text recipe with ingredients and instructions",
    agent: recipeExtractor
)
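
Several tasks can be defined and assigned to different agents; with a sequential process (described below) they run one after another, so later tasks can build on earlier output. A hypothetical follow-up task for the formatter agent sketched above:
let formatTask = Task(
    description: "Format the extracted recipe with a title, an ingredients list, and numbered steps",
    expectedOutput: "A cleanly formatted recipe document",
    agent: recipeFormatter
)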

4. Knowledge & Memory Integration

Agents can access shared knowledge bases and maintain persistent memory:
let knowledgeBase = KnowledgeBase()
try await knowledgeBase.addSource(DirectorySource(path: "/recipes")) // ingest documents from a local directory

recipeExtractor.knowledgeBase = knowledgeBase  // shared semantic search context
recipeExtractor.enableMemory(type: .longTerm)  // persist what the agent learns across runs

5. Orbit Orchestration

The Orbit ties the agents, tasks, and execution configuration together:
let orbit = try Orbit(
    agents: [recipeExtractor],
    tasks: [extractTask],
    process: .sequential,
    memory: true,
    usageMetrics: true
)
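
The same initializer can be pointed at a manager-coordinated workflow by choosing the hierarchical process instead. This sketch assumes a .hierarchical case mirroring .sequential, as suggested by the Process component description above, and reuses the hypothetical formatter agent and task sketched earlier:
let hierarchicalOrbit = try Orbit(
    agents: [recipeExtractor, recipeFormatter],
    tasks: [extractTask, formatTask],
    process: .hierarchical, // manager-coordinated execution (assumed case name)
    memory: true,
    usageMetrics: true
)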

6. Execution Flow

During execution, components interact dynamically:
  • LLM Provider: Agents use configured LLM providers for reasoning and text generation.
  • Tool Integration: Agents invoke tools as needed for task completion (sketched after this list).
  • Memory Access: Agents store and retrieve relevant information.
  • Knowledge Query: Agents search knowledge bases for additional context.
  • Guardrail Checking: All inputs and outputs are validated against configured guardrails.
  • Task Coordination: The process manages task dependencies and agent collaboration.
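
OrbitAI's tool API is not shown on this page, but as a purely illustrative sketch of the Tool Integration step, a tool can be pictured as a named capability with an async entry point that an agent calls during execution (the AgentTool protocol and WebScraperTool type below are hypothetical):
import Foundation

// Hypothetical shape of a tool; OrbitAI's real tool interface may differ.
protocol AgentTool {
    var name: String { get }
    func run(_ input: String) async throws -> String
}

struct WebScraperTool: AgentTool {
    let name = "webScraper"

    func run(_ input: String) async throws -> String {
        guard let url = URL(string: input) else { throw URLError(.badURL) }
        let (data, _) = try await URLSession.shared.data(from: url)
        return String(decoding: data, as: UTF8.self) // raw page content for the agent to parse
    }
}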

7. Result Integration

Final results combine outputs from all components:
let results = try await orbit.start()
// Results include:
// - Task outputs and agent contributions  
// - Memory updates and learning
// - Tool usage and external data
// - Knowledge base queries and context
// - Guardrail validation results
// - Performance metrics and usage data
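
Because start() is asynchronous and throwing, a typical call site wraps it in structured error handling; this sketch assumes nothing beyond the start() call shown above:
do {
    let results = try await orbit.start()
    print(results) // inspect the combined outputs and metrics
} catch {
    print("Orbit execution failed: \(error)")
}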

Key Integration Patterns

  1. Memory-Knowledge Synergy: Agents combine personal memory with shared knowledge bases for comprehensive context.
  2. Tool-LLM Integration: LLM providers help agents select and use appropriate tools for task completion.
  3. Guardrail-Validation Pipeline: All agent outputs pass through guardrail validation before task completion (a sketch follows at the end of this section).
  4. Process-Agent Coordination: Process types determine how agents communicate and coordinate work.
  5. Task-Memory Feedback: Completed tasks contribute to agent memory for future improvement.

This enables OrbitAI to handle complex, multi-step workflows while maintaining safety, efficiency, and adaptability. The modular design allows developers to configure and extend the system for specific use cases while leveraging the full power of autonomous agent collaboration.
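
OrbitAI's guardrail API is not documented on this page; purely to illustrate the validation pipeline in pattern 3, a guardrail can be thought of as a named check applied to an agent's output before a task is accepted (the OutputGuardrail type below is hypothetical):
import Foundation

// Hypothetical shape of an output guardrail; the real OrbitAI API may differ.
struct OutputGuardrail {
    let name: String
    let validate: (String) -> Bool
}

let nonEmptyOutput = OutputGuardrail(name: "non-empty output", validate: { output in
    !output.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty
})

// A task result would only be accepted if every configured guardrail passes.
let accepted = nonEmptyOutput.validate("Chocolate chip cookie recipe ...")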

Setting up

Get started with OrbitAI in your Swift project.

Installation Guide

Install OrbitAI and configure your development environment.