Supported Providers

Sorty supports multiple AI providers, from cloud-based services to on-device models:

OpenAI

GPT-4, GPT-4 Turbo, GPT-3.5. Supports vision mode for image analysis.

Anthropic

Claude 3.5 Sonnet, Claude 3 Opus/Haiku. Strong reasoning, detailed explanations.

Apple Intelligence

On-device foundation models. Private - no data leaves your Mac. Requires macOS 15.1+ and an M-series chip.

Ollama

Local LLMs (Llama, Mistral, etc.). Run models locally for complete privacy.

GitHub Copilot

GitHub Copilot LLM access. Use your existing Copilot subscription.

OpenRouter

Multi-provider gateway. Access to 100+ models through one API.
All providers implement the AIClientProtocol, making them interchangeable.

Provider Factory

Sorty uses a factory pattern to instantiate the correct client:
public struct AIClientFactory {
    public static func createClient(config: AIConfig) throws -> AIClientProtocol {
        switch config.provider {
        case .openAI, .groq, .openAICompatible, .openRouter, .ollama, .gemini:
            return OpenAIClient(config: config)
            
        case .githubCopilot:
            return GitHubCopilotClient(config: config)
            
        case .anthropic:
            return AnthropicClient(config: config)
            
        case .appleFoundationModel:
            if AppleFoundationModelClient.isAvailable() {
                return AppleFoundationModelClient(config: config)
            }
            throw AIClientError.apiError(
                statusCode: 501,
                message: "Apple Intelligence is not supported on this version of macOS."
            )
        }
    }
}
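A call site might look like the following sketch. The AIConfig initializer and the `apiKey`/`files` values are illustrative assumptions, not the app's actual API surface:

```swift
// Hypothetical call site - initializer shape and variables are illustrative.
let config = AIConfig(
    provider: .anthropic,
    apiKey: apiKey,
    model: "claude-3-5-sonnet-20241022"
)
let client = try AIClientFactory.createClient(config: config)
let plan = try await client.analyze(
    files: files,
    customInstructions: nil,
    personaPrompt: nil,
    temperature: nil
)
```

Because every client conforms to AIClientProtocol, the rest of the app never needs to know which provider produced the plan.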

Configuration

OpenAI

1. Get API Key

Sign up at platform.openai.com and create an API key.
2. Configure in Sorty

Navigate to Settings → AI Provider and select OpenAI.
  • API URL: https://api.openai.com/v1
  • API Key: Your OpenAI API key
  • Model: gpt-4-turbo-preview (recommended)
3. Enable Vision (Optional)

Toggle Enable Vision to use GPT-4 Vision for image analysis.

Anthropic

1. Get API Key

Sign up at console.anthropic.com and create an API key.
2. Configure in Sorty

Navigate to Settings → AI Provider and select Anthropic.
  • API URL: https://api.anthropic.com/v1
  • API Key: Your Anthropic API key (starts with sk-ant-)
  • Model: claude-3-5-sonnet-20241022 (recommended)

Apple Intelligence

Requires macOS 15.1+ (macOS Sequoia) and an M-series chip (M1/M2/M3/M4).
1. Enable Apple Intelligence

Go to System Settings → Apple Intelligence & Siri and enable Apple Intelligence.
2. Configure in Sorty

Navigate to Settings → AI Provider and select Apple Foundation Models. No API key required - models run on-device.
Benefits:
  • 100% On-Device: Nothing leaves your Mac
  • No API Costs: Free to use
  • No Network Required: Works offline
  • No Rate Limits: Process unlimited files
Limitations:
  • Smaller context window than cloud models
  • May be slower than server-side GPUs
  • No vision mode support (yet)
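Because availability is checked at runtime, a caller can fall back to a cloud provider when on-device models are missing. A minimal sketch, where the two config values are illustrative placeholders:

```swift
// Prefer on-device models; fall back to a cloud provider otherwise.
// `appleConfig` and `cloudConfig` are hypothetical placeholders.
let config = AppleFoundationModelClient.isAvailable() ? appleConfig : cloudConfig
let client = try AIClientFactory.createClient(config: config)
```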

Ollama (Local Models)

1. Install Ollama

Download from ollama.ai and install.
# macOS/Linux
curl -fsSL https://ollama.ai/install.sh | sh
2. Pull a Model

ollama pull llama3.2
# or
ollama pull mistral
# or
ollama pull qwen2.5
3. Start Ollama Server

ollama serve
The server runs on http://localhost:11434 by default.
4. Configure in Sorty

Navigate to Settings → AI Provider and select Ollama.
  • API URL: http://localhost:11434/v1
  • Model: llama3.2 (or your pulled model)
  • Requires API Key: OFF
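To confirm the server is reachable before pointing Sorty at it, you can query Ollama's OpenAI-compatible model list (GET /v1/models); this standalone sketch is not part of the app:

```swift
import Foundation

// Lists the models you have pulled; a 200 response means the server is up.
let url = URL(string: "http://localhost:11434/v1/models")!
let (data, response) = try await URLSession.shared.data(from: url)
if let http = response as? HTTPURLResponse, http.statusCode == 200 {
    print(String(data: data, encoding: .utf8) ?? "")
} else {
    print("Ollama not reachable - is `ollama serve` running?")
}
```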

GitHub Copilot

1. Authenticate

Navigate to Settings → AI Provider and select GitHub Copilot. Click Authenticate to sign in with GitHub.
2. Verify

Sorty will use your existing Copilot subscription.
Requires an active GitHub Copilot subscription ($10/month for individuals).

OpenRouter

1. Get API Key

Sign up at openrouter.ai and create an API key.
2. Configure in Sorty

Navigate to Settings → AI Provider and select OpenRouter.
  • API URL: https://openrouter.ai/api/v1
  • API Key: Your OpenRouter key
  • Model: See available models

Streaming Support

All providers support streaming responses for live progress updates:
public protocol AIClientProtocol {
    var streamingDelegate: StreamingDelegate? { get set }
    
    func analyze(
        files: [FileItem],
        customInstructions: String?,
        personaPrompt: String?,
        temperature: Double?
    ) async throws -> OrganizationPlan
}

public protocol StreamingDelegate {
    func didReceiveChunk(_ chunk: String)
    func didComplete(content: String)
    func didFail(error: Error)
}
The streaming flow:
  1. Client sends request with streaming enabled
  2. Server responds in chunks via Server-Sent Events (SSE)
  3. Delegate receives chunks and updates UI in real-time
  4. Progress insights are extracted from >> prefixed lines
  5. JSON output is parsed when complete
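Steps 3-5 above can be sketched as a delegate that buffers chunks and surfaces the >>-prefixed progress lines. This is an illustrative implementation, not the app's actual one:

```swift
import Foundation

// Buffers streamed chunks, reports completed ">>" progress lines exactly once,
// and leaves final JSON parsing to didComplete.
final class ProgressDelegate: StreamingDelegate {
    private var buffer = ""
    private var reported = 0

    func didReceiveChunk(_ chunk: String) {
        buffer += chunk
        // Only scan complete lines; the last element may be a partial line.
        let lines = buffer.components(separatedBy: "\n").dropLast()
        for line in lines.dropFirst(reported) where line.hasPrefix(">>") {
            print("Progress:", line.dropFirst(2).trimmingCharacters(in: .whitespaces))
        }
        reported = lines.count
    }

    func didComplete(content: String) {
        // Decode the final JSON payload into an OrganizationPlan here.
    }

    func didFail(error: Error) {
        print("Streaming failed:", error.localizedDescription)
    }
}
```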

Error Handling

Sorty provides detailed, user-friendly error messages:
  • Authentication failed: Your API key may be invalid or expired. Check your credentials in Settings.
API keys are automatically redacted from error messages for security.
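Redaction can be as simple as a regex pass over the message before it reaches the UI or logs; the key pattern below is an assumption about common key shapes, not Sorty's actual rule:

```swift
import Foundation

// Replace anything that looks like an API key before displaying an error.
func redactSecrets(in message: String) -> String {
    let pattern = #"\bsk-[A-Za-z0-9_-]{8,}\b"#   // assumed "sk-..." key shape
    return message.replacingOccurrences(
        of: pattern, with: "sk-••••", options: .regularExpression
    )
}
```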

Vision Support

Compatible Providers

Provider           | Vision Support | Models
OpenAI             | ✅ Yes         | gpt-4-vision-preview, gpt-4-turbo, gpt-4o
Anthropic          | ✅ Yes         | claude-3-opus, claude-3-sonnet, claude-3.5-sonnet
Ollama             | ⚠️ Limited     | llava, bakllava (experimental)
Apple Intelligence | ❌ Not yet     | -
GitHub Copilot     | ❌ No          | -
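The table could be mirrored in code with a capability flag, analogous to the supportsDeepScan extension later in this page. This variant is only a sketch, not necessarily the app's actual API:

```swift
extension AIProvider {
    /// Sketch only - mirrors the compatibility table above.
    public var supportsVision: Bool {
        switch self {
        case .openAI, .anthropic:
            return true
        case .ollama:
            return true    // experimental; llava-family models only
        default:
            return false   // Apple Intelligence, GitHub Copilot, others
        }
    }
}
```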

How Vision Analysis Works

1. Image Detection

Sorty identifies image files by extension:
private let visionImageExtensions: Set<String> = 
    ["jpg", "jpeg", "png", "heic", "webp"]
2. Image Preprocessing

Images are resized and encoded as base64 (max 1024x1024 for efficiency).
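On macOS this preprocessing step might look like the following AppKit sketch; the JPEG quality of 0.8 is an assumption, not a documented value:

```swift
import AppKit

// Downscale to fit 1024x1024, re-encode as JPEG, then base64-encode
// for inclusion in the multimodal request.
func base64Image(at url: URL, maxDimension: CGFloat = 1024) -> String? {
    guard let image = NSImage(contentsOf: url) else { return nil }
    let scale = min(1, maxDimension / max(image.size.width, image.size.height))
    let target = NSSize(width: image.size.width * scale,
                        height: image.size.height * scale)
    let resized = NSImage(size: target)
    resized.lockFocus()
    image.draw(in: NSRect(origin: .zero, size: target))
    resized.unlockFocus()
    guard let tiff = resized.tiffRepresentation,
          let rep = NSBitmapImageRep(data: tiff),
          let jpeg = rep.representation(using: .jpeg,
                                        properties: [.compressionFactor: 0.8])
    else { return nil }
    return jpeg.base64EncodedString()
}
```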
3. Multimodal Request

Images are sent alongside filenames to the AI:
func analyzeWithImages(
    files: [FileItem],
    imageData: [String: Data],
    customInstructions: String?,
    personaPrompt: String?,
    temperature: Double?
) async throws -> OrganizationPlan
4. Vision Insights

AI analyzes image content and suggests organization based on:
  • Objects and scenes
  • Text within images
  • Visual similarity
  • Context clues
Vision analysis significantly increases API costs and processing time. Use selectively.

Deep Scan Support

Provider Compatibility

extension AIProvider {
    public var supportsDeepScan: Bool {
        switch self {
        case .openAI, .anthropic, .openAICompatible, .openRouter:
            return true
        case .ollama, .githubCopilot, .appleFoundationModel:
            return false
        }
    }
}
Why some providers are excluded:
  • Ollama: Local models have smaller context windows
  • Apple Intelligence: API limitations
  • GitHub Copilot: Designed for code completion, not document analysis

Timeout & Retry

Configure request timeouts in Settings → Advanced:
// Default timeout
let timeout: TimeInterval = 120.0 // 2 minutes

// For large batches
let extendedTimeout: TimeInterval = 300.0 // 5 minutes
Increase timeout for slow providers or large file batches.
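These values map naturally onto the URLSession configuration used for requests; a sketch under that assumption:

```swift
import Foundation

// Per-request and whole-transfer timeouts for the AI request session.
let configuration = URLSessionConfiguration.default
configuration.timeoutIntervalForRequest = 120    // seconds per request
configuration.timeoutIntervalForResource = 300   // ceiling for large batches
let session = URLSession(configuration: configuration)
```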

API Key Security

API keys are stored securely in macOS Keychain:
// Keys are never stored in UserDefaults or plain text
Keychain.save(apiKey, for: "sorty.api.key")

// Privacy Mode blurs keys in UI
if privacyModeEnabled {
    displayedKey = "sk-••••••••••••••••"
}
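One way the Keychain.save helper could be implemented with the Security framework; the service name is illustrative, not the app's actual identifier:

```swift
import Foundation
import Security

// Sketch of a generic-password Keychain wrapper.
enum Keychain {
    static func save(_ value: String, for account: String) {
        let query: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrService as String: "com.sorty.api-keys",   // illustrative
            kSecAttrAccount as String: account,
            kSecValueData as String: Data(value.utf8)
        ]
        SecItemDelete(query as CFDictionary)           // drop any existing item
        let status = SecItemAdd(query as CFDictionary, nil)
        assert(status == errSecSuccess, "Keychain write failed: \(status)")
    }
}
```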

Related Pages

  • File Organization: Learn how the AI organizes your files
  • Personas: Customize AI behavior for different workflows
  • The Learnings: Train the AI on your preferences