AI-Powered SEO: Building an Automated Content Strategy Pipeline with Laravel and OpenAI
United States • May 11, 2026

Originally published by Dev.to

Most developers I know treat SEO as an afterthought β€” something you bolt on after the site is live, usually because a client asked why they're not ranking. But in 2024, the gap between teams that automate their content intelligence and those that don't is becoming painfully obvious in search results.

This article isn't about using ChatGPT to write blog posts. It's about building a proper automated pipeline that handles keyword research, content gap analysis, meta generation, and internal linking β€” programmatically, with real data, using tools you already know.

Why Automate Content Strategy?

Content strategy at scale involves a lot of repetitive cognitive work: pulling keyword data, grouping by intent, checking what competitors rank for, generating outlines, writing meta descriptions for 200 pages. These tasks are parallelizable and pattern-driven β€” which makes them excellent candidates for automation.

The goal isn't to remove human judgment. It's to handle the mechanical 80% so your team can focus on the 20% that actually requires expertise.

Building the Pipeline

Here's the architecture we'll implement:

  1. Keyword ingestion β€” Pull search data from an API (Google Search Console or DataForSEO)
  2. Intent classification β€” Use OpenAI to categorize keywords by search intent
  3. Content gap analysis β€” Compare existing content against target keywords
  4. Automated meta generation β€” Generate SEO-optimized titles and descriptions
  5. Internal link suggestions β€” Surface relevant internal linking opportunities

Step 1: Keyword Ingestion with DataForSEO

First, let's pull keyword data. DataForSEO has a solid Laravel-friendly REST API:

// app/Services/KeywordResearchService.php

namespace App\Services;

use Illuminate\Support\Facades\Http;
use Illuminate\Support\Collection;

class KeywordResearchService
{
    private string $baseUrl = 'https://api.dataforseo.com/v3';

    public function getSeedKeywords(string $domain, int $locationCode = 2784): Collection
    {
        $response = Http::withBasicAuth(
            config('services.dataforseo.login'),
            config('services.dataforseo.password')
        )->post("{$this->baseUrl}/keywords_data/google_ads/keywords_for_site/live", [
            [
                'target' => $domain,
                'location_code' => $locationCode, // 2784 = UAE
                'language_code' => 'en',
                'include_serp_info' => true,
            ]
        ]);

        return collect($response->json('tasks.0.result'))
            ->map(fn($item) => [
                'keyword'    => $item['keyword'],
                'volume'     => $item['search_volume'] ?? 0,
                'difficulty' => $item['keyword_difficulty'] ?? null,
                'cpc'        => $item['cpc'] ?? 0,
            ])
            ->filter(fn($item) => $item['volume'] >= 50);
    }
}
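One thing the snippet above skips is failure handling: the Laravel HTTP client returns a response object even on a 4xx/5xx, so a bad credential would silently produce an empty collection. A hedged variant of the same call (payload elided, identical to the one above) that retries transient failures and fails loudly:

```php
// Retry up to 3 times with 500 ms between attempts, and throw an
// Illuminate\Http\Client\RequestException on any 4xx/5xx response
// instead of quietly mapping over an error payload.
$response = Http::withBasicAuth(
        config('services.dataforseo.login'),
        config('services.dataforseo.password')
    )
    ->retry(3, 500)
    ->post("{$this->baseUrl}/keywords_data/google_ads/keywords_for_site/live", [
        [/* task payload as above */]
    ])
    ->throw();
```

Whether you want retries here depends on your DataForSEO plan; retried calls still count against your quota.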

Step 2: Intent Classification with OpenAI

Raw keyword lists are noise. We need to understand why someone is searching β€” informational, navigational, commercial, or transactional. Let's use a structured prompt and GPT-4o-mini for cost efficiency:

// app/Jobs/ClassifyKeywordIntentJob.php

namespace App\Jobs;

use App\Models\Keyword;
use Illuminate\Bus\Batchable;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use OpenAI\Laravel\Facades\OpenAI;

class ClassifyKeywordIntentJob implements ShouldQueue
{
    use Batchable, Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(private array $keywords) {}

    public function handle(): void
    {
        // Double quotes matter here: implode('\n', ...) would join with a literal backslash-n
        $keywordList = implode("\n", array_column($this->keywords, 'keyword'));

        $response = OpenAI::chat()->create([
            'model' => 'gpt-4o-mini',
            'messages' => [
                [
                    'role' => 'system',
                    'content' => 'You are an SEO expert. Classify each keyword by search intent. Return JSON only.'
                ],
                [
                    'role' => 'user',
                    'content' => "Classify these keywords. Return a JSON object with a 'keywords' array of objects, each with 'keyword' and 'intent' (informational|navigational|commercial|transactional):\n\n{$keywordList}"
                ]
            ],
            'response_format' => ['type' => 'json_object'],
        ]);

        $classified = json_decode($response->choices[0]->message->content, true);

        foreach ($classified['keywords'] as $item) {
            Keyword::where('keyword', $item['keyword'])
                ->update(['intent' => $item['intent']]);
        }
    }
}

Dispatch these in batches to stay within rate limits:

use Illuminate\Support\Facades\Bus;

$chunks = $keywords->chunk(20);

$batch = Bus::batch(
    $chunks->map(fn($chunk) => new ClassifyKeywordIntentJob($chunk->toArray()))->toArray()
)->name('keyword-intent-classification')->dispatch();
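Batches also support completion and failure callbacks, which are worth wiring up before you trust this in production. A sketch using the standard `Bus::batch()` hooks (the log messages are illustrative):

```php
use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;
use Throwable;

$batch = Bus::batch($jobs) // the mapped chunk jobs from above
    ->name('keyword-intent-classification')
    ->then(fn (Batch $batch) => logger()->info("Classified {$batch->totalJobs} chunks"))
    ->catch(fn (Batch $batch, Throwable $e) => logger()->error("Intent job failed: {$e->getMessage()}"))
    ->allowFailures() // keep processing remaining chunks if one chunk fails
    ->dispatch();
```

Without `allowFailures()`, one malformed OpenAI response cancels the whole batch, which is rarely what you want for a nightly classification run.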

Step 3: Content Gap Analysis

Now compare what you have against what you should have. Grab your existing page titles and URLs from the database and use embeddings to find semantic distance:

// app/Services/ContentGapService.php

public function findGaps(Collection $keywords, Collection $existingPages): Collection
{
    // Generate embeddings for existing page titles
    $pageEmbeddings = $existingPages->map(function ($page) {
        $response = OpenAI::embeddings()->create([
            'model' => 'text-embedding-3-small',
            'input' => $page->title,
        ]);
        return [
            'url'       => $page->url,
            'embedding' => $response->embeddings[0]->embedding,
        ];
    });

    return $keywords->filter(function ($keyword) use ($pageEmbeddings) {
        $kwResponse = OpenAI::embeddings()->create([
            'model' => 'text-embedding-3-small',
            'input' => $keyword['keyword'],
        ]);
        $kwEmbedding = $kwResponse->embeddings[0]->embedding;

        // Calculate cosine similarity against all pages
        $maxSimilarity = $pageEmbeddings->max(fn($page) =>
            $this->cosineSimilarity($kwEmbedding, $page['embedding'])
        );

        // If no page covers this topic well, flag it as a gap
        return $maxSimilarity < 0.82;
    });
}
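The service calls `$this->cosineSimilarity()` without showing it. A minimal dependency-free implementation (written as a standalone function here; make it a private method on the service):

```php
/**
 * Cosine similarity between two equal-length embedding vectors.
 * Returns a value in [-1, 1]; higher means more semantically similar.
 */
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value * $value;
        $normB += $b[$i] * $b[$i];
    }

    if ($normA == 0.0 || $normB == 0.0) {
        return 0.0; // degenerate zero vector: treat as no similarity
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}
```

OpenAI's `text-embedding-3-small` vectors are 1,536 floats, so this loop is cheap relative to the API call that produced them.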

Step 4: Automated Meta Generation

For any page missing a meta description β€” or with one that's too short β€” generate optimized versions at scale:

// Artisan command: php artisan seo:generate-metas

public function handle(): void
{
    $pages = Page::whereNull('meta_description')
        ->orWhere('meta_description', '')
        ->orWhereRaw('CHAR_LENGTH(meta_description) < 70') // "too short", per the criteria above
        ->cursor();

    foreach ($pages as $page) {
        $response = OpenAI::chat()->create([
            'model' => 'gpt-4o-mini',
            'messages' => [
                ['role' => 'system', 'content' => 'Generate SEO meta descriptions. Max 155 characters. Include a clear value proposition. No clickbait.'],
                ['role' => 'user', 'content' => "Page title: {$page->title}\nPage content excerpt: {$page->excerpt}\n\nGenerate a meta description."]
            ]
        ]);

        $page->update([
            // Strip stray quotes/newlines the model sometimes adds, then hard-cap at 155 chars
            'meta_description' => mb_substr(trim($response->choices[0]->message->content, " \"\n"), 0, 155)
        ]);

        $this->info("Updated: {$page->title}");
    }
}

Handling the Human Layer

Automation handles throughput. Humans handle quality. Set up a simple review queue in Livewire where editors can approve or edit AI-generated metadata before it goes live. Don't ship AI output directly to production without a human checkpoint β€” especially for client sites.
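The shape of that checkpoint can be very small. A sketch of the review gate as an Eloquent model (the `MetaReview` model, its columns, and the `Page` relation are illustrative names, not from the article): AI output lands in a pending row, and only an editor's approval copies it onto the live page.

```php
namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class MetaReview extends Model
{
    protected $fillable = ['page_id', 'proposed_meta', 'status'];

    public function page()
    {
        return $this->belongsTo(Page::class);
    }

    // Called from the Livewire review component's "Approve" action
    public function approve(): void
    {
        $this->page->update(['meta_description' => $this->proposed_meta]);
        $this->update(['status' => 'approved']);
    }
}
```

With this in place, the meta-generation command writes to `meta_reviews` instead of updating `pages` directly.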

When our team builds projects for clients needing solid technical foundations β€” like the Dubai web design services work we do for local businesses β€” this review workflow is always baked in from the start, not patched in later.

Scheduling the Pipeline

Wire everything together in the scheduler:

// routes/console.php

use Illuminate\Support\Facades\Schedule;

// Stagger the weekly stages so each one finishes before the next begins
Schedule::command('seo:ingest-keywords')->weeklyOn(1, '01:00');
Schedule::command('seo:classify-intents')->weeklyOn(1, '03:00');
Schedule::command('seo:gap-analysis')->weeklyOn(1, '05:00');
Schedule::command('seo:generate-metas')->daily();

For production, use Laravel Horizon to monitor queue throughput and keep an eye on OpenAI API costs β€” embedding calls add up quickly at scale.
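One easy cost lever (a sketch of an approach, not something from the article): cache each embedding keyed by a hash of its input, so re-running the gap analysis only pays for text that actually changed.

```php
use Illuminate\Support\Facades\Cache;
use OpenAI\Laravel\Facades\OpenAI;

/**
 * Fetch an embedding once per distinct input string; subsequent
 * calls for the same text hit the cache instead of the API.
 */
function cachedEmbedding(string $text): array
{
    return Cache::rememberForever('embedding:' . sha1($text), function () use ($text) {
        $response = OpenAI::embeddings()->create([
            'model' => 'text-embedding-3-small',
            'input' => $text,
        ]);

        return $response->embeddings[0]->embedding;
    });
}
```

Page titles change rarely, so in practice this removes most of the embedding spend from weekly gap-analysis runs.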

Measuring Impact

Automation without measurement is just expensive scripting. Track:

  • Keyword coverage rate β€” What percentage of target keywords have a matching page?
  • Meta description coverage β€” Pages with optimized descriptions vs. total pages
  • Content gap closure rate β€” How many gaps were addressed each quarter?
  • Organic click-through rate β€” From Google Search Console via its API

Plug these into a simple dashboard (a Livewire component works perfectly) and review monthly.
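The first two metrics reduce to a pair of Eloquent counts. This sketch assumes the `Keyword` and `Page` models from earlier plus a nullable `keywords.page_id` column linking a keyword to the page that targets it, which is an assumption, not a schema shown in the article:

```php
use App\Models\Keyword;
use App\Models\Page;

// Keyword coverage: share of target keywords with a matching page
$keywordCoverage = round(
    Keyword::whereNotNull('page_id')->count() / max(Keyword::count(), 1) * 100
);

// Meta coverage: share of pages with a non-empty meta description
$metaCoverage = round(
    Page::whereNotNull('meta_description')
        ->where('meta_description', '!=', '')
        ->count() / max(Page::count(), 1) * 100
);
```

The `max(..., 1)` guards keep an empty table from dividing by zero on a fresh install.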

Conclusion

AI-powered SEO automation isn't about replacing strategists β€” it's about giving them leverage. A pipeline like this handles keyword classification, gap discovery, and meta generation at a scale no human team can match manually, while keeping your editorial team in control of the output that actually ships.

Start with meta generation β€” it's the lowest-risk, highest-ROI piece of this puzzle. Get that running reliably, then layer in the gap analysis and keyword classification. Build incrementally, measure everything, and treat the AI as a fast junior analyst, not an authority.

The sites that will dominate search over the next few years aren't the ones with the biggest content budgets β€” they're the ones with the most intelligent content operations.
