Mastering the New Laravel AI SDK in Laravel 12

By Nino, Senior Tech Editor

Laravel has historically redefined developer productivity by providing expressive, elegant syntax for complex backend tasks. From Eloquent ORM to the robust Queue system, the framework has always prioritized developer experience (DX). With the official release of Laravel 12, the ecosystem takes its most significant step into the future of intelligent computing: the introduction of the Laravel AI SDK. This first-party package offers a unified, standardized interface to interact with the world's most powerful Large Language Models (LLMs), effectively ending the era of fragmented API integrations.

The Shift Toward Unified AI Integration

Before Laravel 12, developers were forced to choose between official vendor SDKs (like OpenAI's PHP client) or community-maintained wrappers. While functional, these approaches often led to vendor lock-in and inconsistent codebases. If a project needed to switch from OpenAI to Anthropic's Claude 3.5 Sonnet, it required a significant rewrite of the service layer.

The Laravel AI SDK solves this by providing an abstraction layer. Whether you are calling GPT-4o, Claude, Gemini, or models such as DeepSeek-V3 served through aggregators like n1n.ai, the syntax remains identical. This lets you swap LLM providers quickly without breaking your application logic.
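To make that concrete, here is a minimal sketch of what a provider-agnostic configuration could look like. The key names and the n1n.ai entry below are assumptions for illustration, not the package's published config schema; check the file the package actually publishes.

```php
<?php

// config/ai.php — hypothetical shape, shown only to illustrate the idea
// of switching providers via configuration rather than code changes.
return [
    // Swap providers with a single .env change: AI_PROVIDER=anthropic
    'default' => env('AI_PROVIDER', 'openai'),

    'providers' => [
        'openai' => [
            'key'   => env('OPENAI_API_KEY'),
            'model' => 'gpt-4o',
        ],
        'anthropic' => [
            'key'   => env('ANTHROPIC_API_KEY'),
            'model' => 'claude-3-5-sonnet-latest',
        ],
        // An OpenAI-compatible aggregator endpoint such as n1n.ai
        'n1n' => [
            'key'      => env('N1N_API_KEY'),
            'base_url' => 'https://api.n1n.ai/v1',
            'model'    => 'deepseek-v3',
        ],
    ],
];
```

With a layout like this, none of your agent or controller code references a vendor directly; only the configuration decides which backend serves the request.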

Core Architecture: The Agent Pattern

At the heart of the SDK is the "Agent." Unlike simple API calls, an Agent in Laravel 12 is a dedicated class that encapsulates personality, instructions, and tools. This follows the industry trend toward agentic workflows where AI is not just a chatbot but a functional component of the software.

To begin, install the package via Composer:

composer require laravel/ai
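The publish and migrate steps typically look like the following. The provider class name is an assumption here, so confirm the exact tag or provider in the output of php artisan vendor:publish before running it:

```shell
# Publish the package's config and migrations (provider name assumed)
php artisan vendor:publish --provider="Laravel\Ai\AiServiceProvider"

# Create the conversation tables the SDK relies on
php artisan migrate
```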

After publishing the configuration and running migrations, you can generate your first Agent:

php artisan make:agent CustomerSupportAgent

This command creates a file in app/Ai/Agents. Here, you define the instructions() method, which serves as the system prompt. If you access multiple models through an aggregator such as n1n.ai, an agent can be configured to use a specific provider based on the complexity of its task.

Implementation Guide: Building a Context-Aware Sales Coach

One of the most powerful features of the SDK is the RemembersConversations trait. Traditionally, maintaining chat history involved manually storing JSON blobs in a database and sending them back with every request. Laravel 12 automates this using the agent_conversations and agent_conversation_messages tables.

namespace App\Ai\Agents;

use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Promptable;
use Laravel\Ai\Concerns\RemembersConversations;

class SalesCoach implements Agent
{
    use Promptable, RemembersConversations;

    public function instructions(): string
    {
        return "You are an expert sales mentor. Provide feedback on user scripts.";
    }
}

In your controller, you can now link a conversation to a specific user model:

$coach = new SalesCoach();
$response = $coach->forUser($request->user())
                  ->prompt("How do I handle price objections?");

The SDK retrieves the recent message history (typically the last 10–20 messages), formats it for the LLM, and saves the new response automatically. This drastically reduces the boilerplate required for production-ready chatbots.

Structured Output and Data Extraction

For enterprise applications, raw text responses are rarely enough. We need structured data. The Laravel AI SDK includes a HasStructuredOutput interface that leverages JSON schemas to force the LLM to return valid objects.

Feature     | Traditional Approach                 | Laravel AI SDK
Validation  | Manual regex/JSON parsing            | Automatic schema validation
Reliability | High failure rate on complex objects | Native JSON mode support
Speed       | Requires multiple prompts            | Single-shot structured generation

Example schema definition:

public function schema(JsonSchema $schema): array
{
    return [
        'lead_score' => $schema->integer()->min(1)->max(100),
        'intent' => $schema->string()->enum(['purchase', 'inquiry', 'complaint']),
        'summary' => $schema->string(),
    ];
}
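Consuming the structured response might look like the following sketch. The agent name LeadAnalyzer and the array-style access are illustrative assumptions rather than the documented API:

```php
<?php

// Hypothetical usage: LeadAnalyzer is an agent defining the schema above.
$result = (new LeadAnalyzer)->prompt($emailBody);

// Because the schema constrains the model's output, the fields can be
// used directly, without regex parsing or manual validation.
if ($result['lead_score'] >= 75 && $result['intent'] === 'purchase') {
    // Route hot leads straight to the sales team.
}
```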

By routing these requests through a high-performance aggregator such as n1n.ai, developers can keep structured data generation stable and fast even under heavy load.

Multi-modal Capabilities: Beyond Text

Laravel 12 doesn't stop at text. The SDK treats images, audio, and documents as first-class citizens. Using the Files namespace, you can pass attachments directly into the prompt method. This is particularly useful for RAG (Retrieval-Augmented Generation) or automated content moderation.

use Laravel\Ai\Files;

$response = (new DocumentAnalyzer)->prompt(
    'Extract the total amount from this invoice.',
    attachments: [
        Files\Document::fromPath(storage_path('invoices/inv_001.pdf'))
    ]
);

Pro Tip: Optimizing for Latency with Streaming

For a modern user experience, waiting for a 500-word response to finish generating is unacceptable. The SDK provides a stream() method that integrates seamlessly with Laravel's response handling. When paired with Server-Sent Events (SSE) or Livewire, you can display text to the user as it is being generated by the model.

return (new SalesCoach)->stream("Write a 300-word sales pitch.");
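Paired with Laravel's streamed responses, those chunks can be forwarded to the browser as Server-Sent Events. The sketch below assumes stream() yields iterable text deltas; verify the chunk shape against the package documentation:

```php
<?php

use App\Ai\Agents\SalesCoach;
use Illuminate\Support\Facades\Route;

Route::post('/coach/stream', function () {
    // Laravel's eventStream() helper emits Server-Sent Events as the
    // closure yields values, so the browser renders text incrementally.
    return response()->eventStream(function () {
        // Assumption: stream() returns an iterable of partial text chunks.
        foreach ((new SalesCoach)->stream(request('prompt')) as $chunk) {
            yield $chunk;
        }
    });
});
```

On the front end, an EventSource (or Livewire's wire:stream) can append each event to the chat window as it arrives.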

Conclusion

The Laravel AI SDK represents a paradigm shift in how PHP developers interact with artificial intelligence. By abstracting the complexities of conversation state, structured output, and multi-modal inputs, it allows developers to focus on building features rather than managing API quirks. To get the most out of this SDK, using a stable backend that supports all major models is crucial.

Get a free API key at n1n.ai.