Compare commits


22 Commits

Author SHA1 Message Date
f2b8b136af chore: release v1.9.7
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m15s
Monorepo Pipeline / 🧹 Lint (push) Successful in 4m6s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m19s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 38s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 43s
Monorepo Pipeline / 🚀 Release (push) Successful in 1m54s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 2m33s
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 4s
2026-03-02 21:16:51 +01:00
2e07b213d1 chore: remove unused 3d dependencies in gatekeeper to fix lint 2026-03-02 21:16:49 +01:00
a2c1eaefba chore: release v1.9.6
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m18s
Monorepo Pipeline / 🧹 Lint (push) Failing after 3m32s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m3s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 21:08:34 +01:00
80ff266f9c fix: allow vitest to pass with no tests in seo-engine
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 54s
Monorepo Pipeline / 🧹 Lint (push) Failing after 2m7s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m13s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 21:04:14 +01:00
6b1c5b7e30 chore: release @mintel/payload-ai
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Failing after 47s
Monorepo Pipeline / 🧹 Lint (push) Failing after 2m3s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m10s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 21:00:51 +01:00
80eefad5ea feat: extract reusable @mintel/payload-ai package 2026-03-02 21:00:09 +01:00
72556af24c fix(mcp): handle gitea api envelope responses and add safety checks
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Failing after 54s
Monorepo Pipeline / 🧹 Lint (push) Failing after 2m36s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m27s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 12:53:11 +01:00
2a5466c6c0 feat(gitea-mcp): add custom Gitea MCP server for Antigravity compatibility
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 3s
Monorepo Pipeline / 🧪 Test (push) Failing after 50s
Monorepo Pipeline / 🧹 Lint (push) Failing after 3m6s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m13s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 12:36:57 +01:00
2d36a4ec71 ci(qa): refactor QA suite into granular jobs and fix NPM_TOKEN auth
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Failing after 47s
Monorepo Pipeline / 🧹 Lint (push) Failing after 3m3s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m41s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 11:51:12 +01:00
ded9da7d32 feat(seo-engine): implement competitor scraper, MDX draft editor, and strategy report generator
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Failing after 51s
Monorepo Pipeline / 🧹 Lint (push) Failing after 2m25s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m28s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 10:16:11 +01:00
36ed26ad79 feat(gatekeeper): major UI upgrade - high-fidelity light theme, iridescent mouse-reactive form, and enhanced background animation
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m10s
Monorepo Pipeline / 🧹 Lint (push) Failing after 3m15s
Monorepo Pipeline / 🏗️ Build (push) Successful in 1m53s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 4s
2026-02-28 21:48:03 +01:00
4e72a0baac fix(pipeline): remove image-processor build job and cms-infra from gatekeeper dockerfile
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 46s
Monorepo Pipeline / 🧹 Lint (push) Successful in 1m48s
Monorepo Pipeline / 🏗️ Build (push) Successful in 1m48s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 4s
2026-02-27 23:08:39 +01:00
8ca7eb3f49 chore: release v1.9.5
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 59s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m33s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m4s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 24s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 37s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Failing after 14s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 41s
Monorepo Pipeline / 🚀 Release (push) Successful in 1m41s
2026-02-27 22:27:30 +01:00
32d3ff010a feat(release): introduce dedicated release script to replace flawed git push hook
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 7s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m4s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m50s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m21s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 21:03:54 +01:00
cb68e1fb5c chore: sync versions to v1.9.0
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m12s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m51s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m57s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 32s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 33s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 40s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 31s
Monorepo Pipeline / 🚀 Release (push) Successful in 1m36s
2026-02-27 21:01:52 +01:00
1bd7c6aba5 fix(husky): avoid echo syntax error under sh
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 3s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m5s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m11s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m46s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 19:41:16 +01:00
8b0e130b08 chore: sync versions to v1.9.4
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 13s
Monorepo Pipeline / 🧪 Test (push) Successful in 58s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m18s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m5s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 25s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 33s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 45s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 34s
Monorepo Pipeline / 🚀 Release (push) Successful in 2m1s
2026-02-27 19:40:30 +01:00
bd1d33a157 fix(husky): aggressively intercept tag push and silence expected error 2026-02-27 19:40:28 +01:00
b70a89ec86 chore: sync versions to v1.9.3
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m15s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m55s
Monorepo Pipeline / 🧹 Lint (push) Successful in 4m20s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 35s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 45s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 52s
Monorepo Pipeline / 🚀 Release (push) Successful in 3m9s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 52s
2026-02-27 19:37:34 +01:00
da28305c2d fix(husky): simplify pre-push hook to let native git push modified tag
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m3s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m54s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m44s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 28s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 35s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 42s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 36s
Monorepo Pipeline / 🚀 Release (push) Successful in 3m2s
2026-02-27 19:37:30 +01:00
fecb5c50ea chore: sync versions to v1.9.2
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 6s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m6s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m30s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m12s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 31s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 33s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 39s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 43s
Monorepo Pipeline / 🚀 Release (push) Successful in 2m54s
2026-02-27 19:37:02 +01:00
b4b81a8315 fix(husky): auto-push current branch to keep synced after version bump 2026-02-27 19:37:00 +01:00
76 changed files with 6258 additions and 227 deletions

View File

@@ -1,7 +0,0 @@
----
-"@mintel/monorepo": patch
-"acquisition-manager": patch
-"feedback-commander": patch
----
-
-fix: make directus extension build scripts more resilient
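The file deleted in the hunk above is a Changesets entry: YAML frontmatter naming the packages to bump and their bump level, followed by the changelog summary. A minimal sketch (assuming the standard Changesets layout; the helper name is ours) that lists the bump pairs from such a file:

```shell
# Print "package level" pairs from the YAML frontmatter of the changeset
# file given as $1. Lines between the first and second "---" are the
# frontmatter; quotes and the colon are stripped before printing.
changeset_bumps() {
  awk '/^---$/ { n++; next } n == 1 { gsub(/[":]/, ""); print $1, $2 }' "$1"
}
```

Run against the deleted file above, this would print `@mintel/monorepo patch` and the two other package/level pairs, which is what `changeset version` consumes when computing the next release.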

View File

@@ -1,5 +1,5 @@
 # Project
-IMAGE_TAG=v1.9.1
+IMAGE_TAG=v1.9.7
 PROJECT_NAME=sample-website
 PROJECT_COLOR=#82ed20

View File

@@ -192,9 +192,6 @@ jobs:
         file: packages/infra/docker/Dockerfile.gatekeeper
         name: Gatekeeper (Product)
-      - image: image-processor
-        file: apps/image-service/Dockerfile
-        name: Image Processor
     steps:
       - name: Checkout
         uses: actions/checkout@v4

View File

@@ -18,10 +18,16 @@ on:
         required: true
       GATEKEEPER_PASSWORD:
         required: true
+      NPM_TOKEN:
+        required: false
+      MINTEL_PRIVATE_TOKEN:
+        required: false
+      GITEA_PAT:
+        required: false

 jobs:
-  qa_suite:
-    name: 🛡 Nightly QA Suite
+  prepare:
+    name: 🏗 Prepare & Install
     runs-on: docker
     container:
       image: catthehacker/ubuntu:act-latest
@@ -39,95 +45,157 @@ jobs:
       - name: 🔐 Registry Auth
         run: |
           echo "@mintel:registry=https://git.infra.mintel.me/api/packages/mmintel/npm" > .npmrc
-          echo "//git.infra.mintel.me/api/packages/mmintel/npm/:_authToken=${{ secrets.MINTEL_PRIVATE_TOKEN || secrets.GITEA_PAT }}" >> .npmrc
+          echo "//git.infra.mintel.me/api/packages/mmintel/npm/:_authToken=${{ secrets.NPM_TOKEN || secrets.MINTEL_PRIVATE_TOKEN || secrets.GITEA_PAT }}" >> .npmrc
       - name: Install dependencies
         id: deps
         run: |
           pnpm store prune
           pnpm install --no-frozen-lockfile
-      - name: 📦 Cache APT Packages
-        uses: actions/cache@v4
+      - name: 📦 Archive dependencies
+        uses: actions/upload-artifact@v4
         with:
-          path: /var/cache/apt/archives
-          key: apt-cache-${{ runner.os }}-${{ runner.arch }}-chromium
+          name: node_modules
+          path: |
+            node_modules
+            .npmrc
+          retention-days: 1

-      - name: 💾 Cache Chromium
-        id: cache-chromium
-        uses: actions/cache@v4
+  static:
+    name: 🔍 Static Analysis
+    needs: prepare
+    runs-on: docker
+    container:
+      image: catthehacker/ubuntu:act-latest
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+      - name: Setup pnpm
+        uses: pnpm/action-setup@v3
         with:
-          path: /usr/bin/chromium
-          key: ${{ runner.os }}-chromium-native-${{ hashFiles('package.json') }}
-      - name: 🔍 Install Chromium (Native & ARM64)
-        if: steps.cache-chromium.outputs.cache-hit != 'true' && steps.deps.outcome == 'success'
-        run: |
-          rm -f /etc/apt/apt.conf.d/docker-clean
-          apt-get update
-          apt-get install -y gnupg wget ca-certificates
-          OS_ID=$(. /etc/os-release && echo $ID)
-          CODENAME=$(. /etc/os-release && echo $VERSION_CODENAME)
-          if [ "$OS_ID" = "debian" ]; then
-            apt-get install -y chromium
-          else
-            mkdir -p /etc/apt/keyrings
-            KEY_ID="82BB6851C64F6880"
-            wget -qO- "https://keyserver.ubuntu.com/pks/lookup?op=get&search=0x$KEY_ID" | gpg --dearmor > /etc/apt/keyrings/xtradeb.gpg
-            echo "deb [signed-by=/etc/apt/keyrings/xtradeb.gpg] http://ppa.launchpad.net/xtradeb/apps/ubuntu $CODENAME main" > /etc/apt/sources.list.d/xtradeb-ppa.list
-            printf "Package: *\nPin: release o=LP-PPA-xtradeb-apps\nPin-Priority: 1001\n" > /etc/apt/preferences.d/xtradeb
-            apt-get update
-            apt-get install -y --allow-downgrades chromium
-          fi
-          [ -f /usr/bin/chromium ] && ln -sf /usr/bin/chromium /usr/bin/google-chrome
-          [ -f /usr/bin/chromium ] && ln -sf /usr/bin/chromium /usr/bin/chromium-browser
-      # ── Quality Gates ─────────────────────────────────────────────────────────
-      - name: 🌐 Full Sitemap HTML Validation
-        if: always() && steps.deps.outcome == 'success'
+          version: 10
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 20
+      - name: 📥 Restore dependencies
+        uses: actions/download-artifact@v4
+        with:
+          name: node_modules
+      - name: 🌐 HTML Validation
         env:
           NEXT_PUBLIC_BASE_URL: ${{ inputs.TARGET_URL }}
           GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
         run: pnpm run check:html
-      - name: 🌐 Dynamic Asset Presence & Error Scan
-        if: always() && steps.deps.outcome == 'success'
+      - name: 🖼️ Asset Scan
         env:
           NEXT_PUBLIC_BASE_URL: ${{ inputs.TARGET_URL }}
           GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
         run: pnpm run check:assets
-      - name: ♿ Accessibility Scan (WCAG)
-        if: always() && steps.deps.outcome == 'success'
+
+  accessibility:
+    name: ♿ Accessibility
+    needs: prepare
+    runs-on: docker
+    container:
+      image: catthehacker/ubuntu:act-latest
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+      - name: Setup pnpm
+        uses: pnpm/action-setup@v3
+        with:
+          version: 10
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 20
+      - name: 📥 Restore dependencies
+        uses: actions/download-artifact@v4
+        with:
+          name: node_modules
+      - name: 🔍 Install Chromium
+        run: |
+          apt-get update && apt-get install -y gnupg wget ca-certificates
+          CODENAME=$(. /etc/os-release && echo $VERSION_CODENAME)
+          mkdir -p /etc/apt/keyrings
+          wget -qO- "https://keyserver.ubuntu.com/pks/lookup?op=get&search=0x82BB6851C64F6880" | gpg --dearmor > /etc/apt/keyrings/xtradeb.gpg
+          echo "deb [signed-by=/etc/apt/keyrings/xtradeb.gpg] http://ppa.launchpad.net/xtradeb/apps/ubuntu $CODENAME main" > /etc/apt/sources.list.d/xtradeb-ppa.list
+          printf "Package: *\nPin: release o=LP-PPA-xtradeb-apps\nPin-Priority: 1001\n" > /etc/apt/preferences.d/xtradeb
+          apt-get update && apt-get install -y --allow-downgrades chromium
+          ln -sf /usr/bin/chromium /usr/bin/google-chrome
+      - name: ♿ WCAG Scan
+        continue-on-error: true
         env:
           NEXT_PUBLIC_BASE_URL: ${{ inputs.TARGET_URL }}
           GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
         run: pnpm run check:wcag
-      - name: 📦 Unused Dependencies Scan (depcheck)
-        if: always() && steps.deps.outcome == 'success'
+
+  analysis:
+    name: 🧪 Maintenance & Links
+    needs: prepare
+    runs-on: docker
+    container:
+      image: catthehacker/ubuntu:act-latest
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+      - name: Setup pnpm
+        uses: pnpm/action-setup@v3
+        with:
+          version: 10
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 20
+      - name: 📥 Restore dependencies
+        uses: actions/download-artifact@v4
+        with:
+          name: node_modules
+      - name: 📦 Depcheck
+        continue-on-error: true
         run: pnpm dlx depcheck --ignores="*eslint*,*typescript*,*tailwindcss*,*postcss*,*prettier*,*@types/*,*husky*,*lint-staged*,*@next/*,*@lhci/*,*commitlint*,*cspell*,*rimraf*,*@payloadcms/*,*start-server-and-test*,*html-validate*,*critters*,*dotenv*,*turbo*"
-      - name: 🔗 Markdown & HTML Link Check (Lychee)
-        if: always() && steps.deps.outcome == 'success'
+      - name: 🔗 Lychee Link Check
         uses: lycheeverse/lychee-action@v2
         with:
           args: --accept 200,204,429 --timeout 15 content/ app/ public/
           fail: true
-      - name: 🎭 LHCI Desktop Audit
-        id: lhci_desktop
-        if: always() && steps.deps.outcome == 'success'
+
+  performance:
+    name: 🎭 Lighthouse
+    needs: prepare
+    runs-on: docker
+    container:
+      image: catthehacker/ubuntu:act-latest
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 20
+      - name: Setup pnpm
+        uses: pnpm/action-setup@v3
+        with:
+          version: 10
+      - name: 📥 Restore dependencies
+        uses: actions/download-artifact@v4
+        with:
+          name: node_modules
+      - name: 🔍 Install Chromium
+        run: |
+          apt-get update && apt-get install -y gnupg wget ca-certificates
+          CODENAME=$(. /etc/os-release && echo $VERSION_CODENAME)
+          mkdir -p /etc/apt/keyrings
+          wget -qO- "https://keyserver.ubuntu.com/pks/lookup?op=get&search=0x82BB6851C64F6880" | gpg --dearmor > /etc/apt/keyrings/xtradeb.gpg
+          echo "deb [signed-by=/etc/apt/keyrings/xtradeb.gpg] http://ppa.launchpad.net/xtradeb/apps/ubuntu $CODENAME main" > /etc/apt/sources.list.d/xtradeb-ppa.list
+          printf "Package: *\nPin: release o=LP-PPA-xtradeb-apps\nPin-Priority: 1001\n" > /etc/apt/preferences.d/xtradeb
+          apt-get update && apt-get install -y --allow-downgrades chromium
+          ln -sf /usr/bin/chromium /usr/bin/google-chrome
+      - name: 🎭 LHCI Desktop
         env:
           LHCI_URL: ${{ inputs.TARGET_URL }}
           GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
         run: pnpm run pagespeed:test -- --collect.settings.preset=desktop
-      - name: 📱 LHCI Mobile Audit
-        id: lhci_mobile
-        if: always() && steps.deps.outcome == 'success'
+      - name: 📱 LHCI Mobile
         env:
           LHCI_URL: ${{ inputs.TARGET_URL }}
           GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
@@ -135,7 +203,7 @@ jobs:
   notifications:
     name: 🔔 Notify
-    needs: [qa_suite]
+    needs: [prepare, static, accessibility, analysis, performance]
     if: always()
     runs-on: docker
     container:
@@ -144,22 +212,30 @@
       - name: 🔔 Gotify
         shell: bash
         run: |
-          SUITE="${{ needs.qa_suite.result }}"
+          PREPARE="${{ needs.prepare.result }}"
+          STATIC="${{ needs.static.result }}"
+          A11Y="${{ needs.accessibility.result }}"
+          ANALYSIS="${{ needs.analysis.result }}"
+          PERF="${{ needs.performance.result }}"
           PROJECT="${{ inputs.PROJECT_NAME }}"
           URL="${{ inputs.TARGET_URL }}"
-          if [[ "$SUITE" != "success" ]]; then
+          if [[ "$PREPARE" != "success" || "$STATIC" != "success" || "$PERF" != "success" ]]; then
            PRIORITY=8
-            EMOJI="⚠️"
+            EMOJI="🚨"
            STATUS_LINE="Nightly QA Failed! Action required."
          else
            PRIORITY=2
            EMOJI="✅"
-            STATUS_LINE="Nightly QA Passed perfectly."
+            STATUS_LINE="Nightly QA Passed."
          fi
          TITLE="$EMOJI $PROJECT Nightly QA"
-          MESSAGE="$STATUS_LINE\n$URL\nPlease check Pipeline output for details."
+          MESSAGE="$STATUS_LINE
+          Prepare: $PREPARE | Static: $STATIC | A11y: $A11Y
+          Analysis: $ANALYSIS | Perf: $PERF
+          $URL"
          curl -s -k -X POST "${{ secrets.GOTIFY_URL }}/message?token=${{ secrets.GOTIFY_TOKEN }}" \
            -F "title=$TITLE" \

View File

@@ -5,42 +5,4 @@ if [ -f "$SCRIPT_DIR/scripts/validate-sdk-imports.sh" ]; then
"$SCRIPT_DIR/scripts/validate-sdk-imports.sh" || exit 1
fi
# Check if we are pushing a tag
while read local_ref local_sha remote_ref remote_sha
do
if [[ "$remote_ref" == refs/tags/* ]]; then
TAG=${remote_ref#refs/tags/}
echo "🏷️ Tag detected: $TAG, ensuring versions are synced..."
# Run sync script
pnpm sync-versions "$TAG"
# Check for changes in relevant files
SYNC_FILES="package.json packages/*/package.json apps/*/package.json .env.example"
CHANGES=$(git status --porcelain $SYNC_FILES)
if [[ -n "$CHANGES" ]]; then
echo "📝 Version sync made changes. Integrating into tag..."
# Stage and commit
git add $SYNC_FILES
git commit -m "chore: sync versions to $TAG" --no-verify
# Force update the local tag to point to the new commit
git tag -f "$TAG" > /dev/null
echo "✅ Tag $TAG has been updated locally with synced versions."
echo "🚀 Auto-pushing updated tag..."
# Push the updated tag directly (using --no-verify to avoid recursion)
git push origin "$TAG" --force --no-verify
echo "✨ Success! The hook synchronized the versions and pushed the updated tag for you."
echo " Note: The original push command was aborted in favor of the auto-push. This is normal."
exit 1 # We MUST exit 1 here to stop git from proceeding with the original push which would fail
else
echo "✨ Versions already in sync for $TAG."
exit 0 # Allow git to proceed with the original push since we didn't do it ourselves
fi
fi
done
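The hook deleted above was replaced (per commit 32d3ff010a) by a dedicated release script. The script itself is not shown in this compare, only its `"release:version": "bash scripts/release.sh"` entry; the following is a hedged outline of what such a script plausibly does, with only the tag-derivation helper actually implemented:

```shell
# Hypothetical outline of scripts/release.sh (an assumption; the real file is
# not part of the shown hunks). The idea: derive the tag, sync versions,
# then commit, tag, and push in one explicit step instead of intercepting
# `git push` in a pre-push hook.
set -eu

derive_tag() {
  # Pull `"version": "x.y.z"` out of the package.json passed as $1
  # and prefix it with "v" to form the tag name.
  sed -n 's/.*"version"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/v\1/p' "$1" | head -n 1
}

# The push sequence itself, commented out so the sketch is side-effect free:
# TAG=$(derive_tag package.json)
# pnpm sync-versions "$TAG"
# git commit -am "chore: sync versions to $TAG" --no-verify
# git tag -f "$TAG"
# git push origin HEAD "$TAG"
```

Running the sequence explicitly avoids the hook's awkward `exit 1` dance, where the original push had to be aborted in favor of an auto-push.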

View File

@@ -1,6 +1,6 @@
 {
   "name": "sample-website",
-  "version": "1.9.1",
+  "version": "1.9.7",
   "private": true,
   "type": "module",
   "scripts": {

View File

@@ -0,0 +1,39 @@
+services:
+  gatekeeper-proxy:
+    image: alpine:latest
+    command: sleep infinity
+    restart: unless-stopped
+    networks:
+      - infra
+    labels:
+      - "caddy=http://gatekeeper.localhost"
+      - "caddy.route=/*"
+      - "caddy.route.0_redir=/ /gatekeeper/login 302"
+      - "caddy.route.1_reverse_proxy=gatekeeper-app:3000"
+
+  gatekeeper-app:
+    image: node:20-alpine
+    working_dir: /app
+    volumes:
+      - .:/app
+      - gatekeeper_root_node_modules:/app/node_modules
+      - gatekeeper_pkg_node_modules:/app/packages/gatekeeper/node_modules
+      - gatekeeper_next_cache:/app/packages/gatekeeper/.next
+      - gatekeeper_pnpm_store:/pnpm
+    environment:
+      - NODE_ENV=development
+      - NPM_TOKEN=${NPM_TOKEN:-}
+    networks:
+      - infra
+    command: >
+      sh -c "corepack enable && pnpm config set store-dir /pnpm && pnpm install --no-frozen-lockfile && pnpm --filter @mintel/gatekeeper run dev --hostname 0.0.0.0 --port 3000"
+
+networks:
+  infra:
+    external: true
+
+volumes:
+  gatekeeper_root_node_modules:
+  gatekeeper_pkg_node_modules:
+  gatekeeper_next_cache:
+  gatekeeper_pnpm_store:

View File

@@ -5,11 +5,13 @@
"scripts": {
"build": "pnpm -r build",
"dev": "pnpm -r dev",
"dev:gatekeeper": "bash -c 'trap \"COMPOSE_PROJECT_NAME=gatekeeper docker-compose -f docker-compose.gatekeeper.yml down\" EXIT INT TERM; docker network create infra 2>/dev/null || true && COMPOSE_PROJECT_NAME=gatekeeper docker-compose -f docker-compose.gatekeeper.yml down && COMPOSE_PROJECT_NAME=gatekeeper docker-compose -f docker-compose.gatekeeper.yml up --build --remove-orphans'",
"lint": "pnpm -r --filter='./packages/**' --filter='./apps/**' lint",
"test": "pnpm -r test",
"changeset": "changeset",
"version-packages": "changeset version",
"sync-versions": "tsx scripts/sync-versions.ts --",
"release:version": "bash scripts/release.sh",
"release": "pnpm build && changeset publish",
"release:tag": "pnpm build && pnpm -r publish --no-git-checks --access public",
"prepare": "husky"
@@ -47,7 +49,7 @@
"pino-pretty": "^13.1.3",
"require-in-the-middle": "^8.0.1"
},
"version": "1.9.1",
"version": "1.9.7",
"pnpm": {
"onlyBuiltDependencies": [
"@parcel/watcher",

View File

@@ -1,6 +1,6 @@
 {
   "name": "@mintel/cli",
-  "version": "1.9.1",
+  "version": "1.9.7",
   "publishConfig": {
     "access": "public",
     "registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -1,6 +1,6 @@
 {
   "name": "@mintel/cloner",
-  "version": "1.9.1",
+  "version": "1.9.7",
   "type": "module",
   "main": "dist/index.js",
   "module": "dist/index.js",

View File

@@ -1,6 +1,6 @@
 {
   "name": "@mintel/concept-engine",
-  "version": "1.9.1",
+  "version": "1.9.7",
   "private": true,
   "description": "AI-powered web project concept generation and analysis",
   "type": "module",

View File

@@ -1,6 +1,6 @@
 {
   "name": "@mintel/content-engine",
-  "version": "1.9.1",
+  "version": "1.9.7",
   "private": false,
   "type": "module",
   "main": "./dist/index.js",

View File

@@ -1,6 +1,6 @@
 {
   "name": "@mintel/eslint-config",
-  "version": "1.9.1",
+  "version": "1.9.7",
   "publishConfig": {
     "access": "public",
     "registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -1,6 +1,6 @@
 {
   "name": "@mintel/estimation-engine",
-  "version": "1.9.1",
+  "version": "1.9.7",
   "private": true,
   "type": "module",
   "main": "./dist/index.js",

View File

@@ -1,6 +1,6 @@
 {
   "name": "@mintel/gatekeeper",
-  "version": "1.9.1",
+  "version": "1.9.7",
   "private": true,
   "type": "module",
   "scripts": {
@@ -12,6 +12,7 @@
   },
   "dependencies": {
     "@mintel/next-utils": "workspace:*",
+    "framer-motion": "^11.18.2",
     "lucide-react": "^0.474.0",
     "next": "16.1.6",
     "react": "^19.0.0",

View File

@@ -8,7 +8,7 @@
 }

 body {
-    @apply bg-white text-slate-800 font-serif antialiased selection:bg-slate-900 selection:text-white;
+    @apply bg-[#f5f5f7] text-black/80 font-serif antialiased selection:bg-black/10 selection:text-black;
   line-height: 1.6;
 }
@@ -18,15 +18,15 @@
   h4,
   h5,
   h6 {
-    @apply font-sans font-bold text-slate-900 tracking-tighter;
+    @apply font-sans font-bold text-black tracking-tighter;
   }

   p {
-    @apply mb-4 text-base leading-relaxed text-slate-700;
+    @apply mb-4 text-base leading-relaxed text-black/50;
   }

   a {
-    @apply text-slate-900 hover:text-slate-700 transition-colors no-underline;
+    @apply text-black/50 hover:text-black transition-colors no-underline;
   }
 }
@@ -36,34 +36,58 @@
   }

   .btn {
-    @apply inline-flex items-center justify-center px-6 py-3 border border-slate-200 bg-white text-slate-600 font-sans font-bold text-sm uppercase tracking-widest rounded-full transition-all duration-500 ease-industrial hover:border-slate-400 hover:text-slate-900 hover:bg-slate-50 hover:-translate-y-0.5 hover:shadow-xl hover:shadow-slate-100 active:translate-y-0 active:shadow-sm;
+    @apply inline-flex items-center justify-center px-6 py-3 border border-black/10 bg-white text-black/60 font-sans font-bold text-sm uppercase tracking-widest rounded-full transition-all duration-500 ease-industrial hover:border-black/20 hover:text-black hover:bg-white hover:-translate-y-0.5 hover:shadow-xl hover:shadow-black/5 active:translate-y-0 active:shadow-sm;
   }

   .btn-primary {
-    @apply border-slate-900 text-slate-900 hover:bg-slate-900 hover:text-white;
+    @apply border-black bg-black text-white hover:bg-black/85 hover:text-white;
   }
 }

-/* Custom scrollbar */
+/* Custom scrollbar - light theme */
 ::-webkit-scrollbar {
-  width: 8px;
-  height: 8px;
+  width: 6px;
+  height: 6px;
 }

 ::-webkit-scrollbar-track {
-  background: #f1f5f9;
+  background: #f5f5f7;
 }

 ::-webkit-scrollbar-thumb {
-  background: #cbd5e1;
+  background: #d1d1d6;
   border-radius: 4px;
 }

 ::-webkit-scrollbar-thumb:hover {
-  background: #94a3b8;
+  background: #b0b0b8;
 }

 /* Animations */
+@keyframes fade-in {
+  from {
+    opacity: 0;
+    transform: translateY(12px);
+  }
+  to {
+    opacity: 1;
+    transform: translateY(0);
+  }
+}
+
+@keyframes slide-up {
+  from {
+    opacity: 0;
+    transform: translateY(20px);
+  }
+  to {
+    opacity: 1;
+    transform: translateY(0);
+  }
+}
+
 @keyframes shake {
   0%,
   100% {
@@ -79,6 +103,15 @@
   }
 }

+.animate-fade-in {
+  animation: fade-in 0.8s ease-out forwards;
+}
+
+.animate-slide-up {
+  animation: slide-up 0.8s ease-out 0.2s forwards;
+  opacity: 0;
+}
+
 .animate-shake {
   animation: shake 0.2s ease-in-out 0s 2;
 }

View File

@@ -13,6 +13,17 @@ const newsreader = Newsreader({
 export const metadata: Metadata = {
   title: "Gatekeeper | Access Control",
   description: "Mintel Infrastructure Protection",
+  openGraph: {
+    title: "Gatekeeper | Access Control",
+    description: "Mintel Infrastructure Protection",
+    siteName: "Mintel Gatekeeper",
+    type: "website",
+  },
+  twitter: {
+    card: "summary_large_image",
+    title: "Gatekeeper | Access Control",
+    description: "Mintel Infrastructure Protection",
+  },
 };
export default function RootLayout({

View File

@@ -1,7 +1,9 @@
 import { cookies } from "next/headers";
 import { redirect } from "next/navigation";
-import { ArrowRight, ShieldCheck } from "lucide-react";
+import { ShieldCheck } from "lucide-react";
+import Image from "next/image";
+import { GateScene } from "../../components/gate-scene";
+import { AnimatedLoginForm } from "../../components/animated-login-form";

 interface LoginPageProps {
   searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
@@ -17,8 +19,8 @@ export default async function LoginPage({ searchParams }: LoginPageProps) {
   async function login(formData: FormData) {
     "use server";

-    const email = (formData.get("email") as string || "").trim();
-    const password = (formData.get("password") as string || "").trim();
+    const email = ((formData.get("email") as string) || "").trim();
+    const password = ((formData.get("password") as string) || "").trim();

     const expectedCode = process.env.GATEKEEPER_PASSWORD || "mintel";
     const adminEmail = process.env.DIRECTUS_ADMIN_EMAIL;
@@ -116,7 +118,9 @@ export default async function LoginPage({ searchParams }: LoginPageProps) {
     }

     if (userIdentity) {
-      console.log(`[Login] Success: ${userIdentity} | Redirect: ${targetRedirect}`);
+      console.log(
+        `[Login] Success: ${userIdentity} | Redirect: ${targetRedirect}`,
+      );
       const cookieStore = await cookies();

       // Store identity in the cookie (simplified for now, ideally signed)
       const sessionValue = JSON.stringify({
@@ -127,7 +131,9 @@ export default async function LoginPage({ searchParams }: LoginPageProps) {
       const isDev = process.env.NODE_ENV === "development";

-      console.log(`[Login] Setting Cookie: ${authCookieName} | Domain: ${cookieDomain || "Default"}`);
+      console.log(
+        `[Login] Setting Cookie: ${authCookieName} | Domain: ${cookieDomain || "Default"}`,
+      );

       cookieStore.set(authCookieName, sessionValue, {
         httpOnly: true,
@@ -145,101 +151,81 @@ export default async function LoginPage({ searchParams }: LoginPageProps) {
}
return (
<div className="min-h-screen flex items-center justify-center relative bg-white font-serif antialiased overflow-hidden">
{/* Background Decor - Signature mintel.me style */}
<div
className="absolute inset-0 pointer-events-none opacity-[0.03] scale-[1.01]"
style={{
backgroundImage: `linear-gradient(to right, #000 1px, transparent 1px), linear-gradient(to bottom, #000 1px, transparent 1px)`,
backgroundSize: "clamp(30px, 8vw, 40px) clamp(30px, 8vw, 40px)",
}}
/>
<div className="min-h-screen flex items-center justify-center relative bg-[#f5f5f7] font-serif antialiased overflow-hidden selection:bg-black/10 selection:text-black">
{/* 3D Digital Gate Background */}
<GateScene />
<main className="relative z-10 w-full max-w-[380px] px-6 sm:px-4 pb-24 sm:pb-32 pointer-events-auto">
<div className="space-y-10 animate-fade-in">
{/* Top Icon Box */}
<div className="flex justify-center">
<a
href="https://mintel.me"
target="_blank"
rel="noopener noreferrer"
className="w-14 h-14 bg-white rounded-2xl flex items-center justify-center shadow-lg shadow-black/[0.06] hover:scale-105 hover:shadow-black/10 transition-all duration-500 ease-[cubic-bezier(0.23,1,0.32,1)] border border-black/[0.06] hover:border-black/10"
>
<Image
src="/gatekeeper/icon-white.svg"
alt="Mintel"
width={28}
height={28}
className="w-7 h-7 opacity-80"
unoptimized
/>
</a>
</div>
<div className="space-y-8 animate-slide-up">
<div className="text-center space-y-3">
<h1 className="text-[11px] font-sans font-bold uppercase tracking-[0.5em] text-black/80 pb-3 inline-block mx-auto min-w-[220px]">
{projectName} <span className="text-black/30">Gatekeeper</span>
</h1>
<div className="flex items-center justify-center gap-4">
<div className="h-px w-8 bg-gradient-to-r from-transparent to-black/10" />
<p className="text-[8px] text-black/30 font-sans uppercase tracking-[0.35em] font-semibold">
Infrastructure Protection
</p>
<div className="h-px w-8 bg-gradient-to-l from-transparent to-black/10" />
</div>
</div>
{error && (
<div className="bg-red-50 backdrop-blur-md text-red-600 px-5 py-4 rounded-2xl text-[9px] font-sans font-bold uppercase tracking-widest flex items-center gap-3 border border-red-200 animate-shake">
<ShieldCheck className="w-4 h-4 text-red-500/70" />
<span>Access Denied. Try Again.</span>
</div>
)}
{/* The Animated Framer Motion Form */}
<AnimatedLoginForm
redirectUrl={redirectUrl}
loginAction={login}
projectName={projectName}
/>
{/* Bottom Section */}
<div className="pt-4 sm:pt-6 flex flex-col items-center gap-5">
<div className="h-px w-16 bg-gradient-to-r from-transparent via-black/10 to-transparent" />
<a
href="https://mintel.me"
target="_blank"
rel="noopener noreferrer"
className="opacity-30 transition-opacity hover:opacity-60"
>
<Image
src="/gatekeeper/logo-white.svg"
alt={projectName}
width={120}
height={36}
className="h-5 sm:h-6 w-auto"
style={{ filter: "invert(1)" }}
unoptimized
/>
</a>
<p className="text-[7px] font-sans font-semibold text-black/25 uppercase tracking-[0.5em] text-center">
&copy; {new Date().getFullYear()} MINTEL
</p>
</div>
</div>

View File

@@ -0,0 +1,126 @@
import { ImageResponse } from "next/og";
export const runtime = "edge";
// Image metadata
export const alt = "Gatekeeper Infrastructure Protection";
export const size = {
width: 1200,
height: 630,
};
export const contentType = "image/png";
export default async function Image() {
const projectName = process.env.PROJECT_NAME || "MINTEL";
return new ImageResponse(
<div
style={{
background: "linear-gradient(to bottom, #020617, #0f172a)",
width: "100%",
height: "100%",
display: "flex",
flexDirection: "column",
alignItems: "center",
justifyContent: "center",
position: "relative",
fontFamily: "Inter, sans-serif",
}}
>
{/* Subtle Background Pattern matching the industrial look */}
<div
style={{
position: "absolute",
top: 0,
left: 0,
right: 0,
bottom: 0,
backgroundImage:
"radial-gradient(circle at 50% 50%, #334155 1px, transparent 1px)",
backgroundSize: "40px 40px",
opacity: 0.1,
}}
/>
{/* Central Card Element */}
<div
style={{
display: "flex",
flexDirection: "column",
alignItems: "center",
justifyContent: "center",
background: "rgba(15, 23, 42, 0.6)",
border: "1px solid rgba(51, 65, 85, 0.4)",
borderRadius: "32px",
padding: "80px",
boxShadow: "0 25px 50px -12px rgba(0, 0, 0, 0.5)",
}}
>
{/* Top Icon Box */}
<div
style={{
display: "flex",
alignItems: "center",
justifyContent: "center",
width: "100px",
height: "100px",
background: "#000",
borderRadius: "24px",
border: "2px solid #334155",
boxShadow: "0 20px 40px rgba(0,0,0,0.8)",
marginBottom: "40px",
transform: "rotate(2deg)",
}}
>
<svg
width="48"
height="48"
viewBox="0 0 24 24"
fill="none"
stroke="white"
strokeWidth="1.5"
strokeLinecap="round"
strokeLinejoin="round"
>
<path d="M12 22s8-4 8-10V5l-8-3-8 3v7c0 6 8 10 8 10z" />
</svg>
</div>
{/* Project Name & Typography */}
<div
style={{
display: "flex",
fontSize: "64px",
fontWeight: 900,
letterSpacing: "0.2em",
color: "white",
textTransform: "uppercase",
marginBottom: "16px",
}}
>
{projectName}{" "}
<span style={{ color: "#64748b", marginLeft: "10px" }}>
GATEKEEPER
</span>
</div>
<div
style={{
display: "flex",
fontSize: "24px",
fontWeight: 600,
letterSpacing: "0.4em",
color: "#94a3b8",
textTransform: "uppercase",
}}
>
Infrastructure Protection
</div>
</div>
</div>,
{
...size,
},
);
}

View File

@@ -0,0 +1,283 @@
"use client";
import { useState, useEffect, useRef } from "react";
import { motion } from "framer-motion";
import { ArrowRight, Lock, Shield, Fingerprint } from "lucide-react";
interface AnimatedLoginFormProps {
redirectUrl: string;
loginAction: (formData: FormData) => Promise<void>;
projectName: string;
}
export function AnimatedLoginForm({
redirectUrl,
loginAction,
}: AnimatedLoginFormProps) {
const [isFocused, setIsFocused] = useState(false);
const [isSubmitting, setIsSubmitting] = useState(false);
const [mounted, setMounted] = useState(false);
const wrapperRef = useRef<HTMLDivElement>(null);
const beamRef = useRef<HTMLDivElement>(null);
// Mouse tracking refs (no re-render)
const mouse = useRef({ x: 0, y: 0 });
const angle = useRef(0);
const tilt = useRef({ x: 0, y: 0 });
useEffect(() => {
setMounted(true);
}, []);
// Single rAF loop: iridescent border + perspective tilt
useEffect(() => {
let animId: number;
const onMouseMove = (e: MouseEvent) => {
mouse.current = { x: e.clientX, y: e.clientY };
};
window.addEventListener("mousemove", onMouseMove);
const animate = () => {
if (!wrapperRef.current || !beamRef.current) {
animId = requestAnimationFrame(animate);
return;
}
const rect = wrapperRef.current.getBoundingClientRect();
const cx = rect.left + rect.width / 2;
const cy = rect.top + rect.height / 2;
const dx = mouse.current.x - cx;
const dy = mouse.current.y - cy;
// Angle from form center to mouse → positions the bright highlight
const targetAngle = (Math.atan2(dy, dx) * 180) / Math.PI;
// Lerp angle smoothly (shortest path)
let diff = targetAngle - angle.current;
while (diff > 180) diff -= 360;
while (diff < -180) diff += 360;
angle.current += diff * 0.06;
// Intensity: slightly stronger on focus
const intensity = isFocused ? 1 : 0.7;
// Mouse-aligned iridescent conic gradient
// The "hotspot" (brightest white) faces the mouse
beamRef.current.style.background = `conic-gradient(from ${angle.current}deg at 50% 50%,
rgba(255,255,255,${1.0 * intensity}) 0deg,
rgba(200,210,255,${0.8 * intensity}) 20deg,
rgba(255,200,230,${0.7 * intensity}) 45deg,
rgba(150,160,180,${0.6 * intensity}) 80deg,
rgba(40,40,50,${0.5 * intensity}) 160deg,
rgba(20,20,30,${0.4 * intensity}) 200deg,
rgba(140,150,170,${0.5 * intensity}) 280deg,
rgba(210,225,255,${0.7 * intensity}) 320deg,
rgba(255,255,255,${1.0 * intensity}) 360deg)`;
// Subtle perspective tilt — max ±4deg
const maxTilt = 4;
const normX = dx / (rect.width * 2);
const normY = dy / (rect.height * 2);
const targetTiltY = normX * maxTilt;
const targetTiltX = -normY * maxTilt;
tilt.current.x += (targetTiltX - tilt.current.x) * 0.08;
tilt.current.y += (targetTiltY - tilt.current.y) * 0.08;
wrapperRef.current.style.transform = `perspective(800px) rotateX(${tilt.current.x}deg) rotateY(${tilt.current.y}deg)`;
animId = requestAnimationFrame(animate);
};
animId = requestAnimationFrame(animate);
return () => {
window.removeEventListener("mousemove", onMouseMove);
cancelAnimationFrame(animId);
};
}, [isFocused]);
const handleSubmit = async (formData: FormData) => {
setIsSubmitting(true);
try {
await loginAction(formData);
} finally {
setIsSubmitting(false);
}
};
return (
<motion.div
initial={{ opacity: 0, y: 30, scale: 0.95 }}
animate={{ opacity: 1, y: 0, scale: 1 }}
transition={{ duration: 0.8, ease: [0.23, 1, 0.32, 1], delay: 0.3 }}
className="relative"
style={{ willChange: "transform" }}
>
{/* Outer wrapper for tilt (refs need non-motion div) */}
<div
ref={wrapperRef}
style={{ willChange: "transform", transformStyle: "preserve-3d" }}
className="relative"
>
{/* ── Always-on iridescent beam border ── */}
<div className="absolute -inset-[1.5px] rounded-[28px] z-0 overflow-hidden">
{/* Sharp edge layer */}
<div
ref={beamRef}
className="absolute inset-0 rounded-[28px]"
style={{ filter: "blur(0.4px)" }}
/>
{/* Soft glow bloom */}
<div
className="absolute inset-0 rounded-[28px]"
style={{
background: "inherit",
filter: "blur(15px)",
opacity: 0.5,
}}
/>
</div>
{/* ── Glassmorphism card — high-fidelity glossy light ── */}
<div
className="relative z-10 bg-white/70 backdrop-blur-3xl rounded-3xl p-7 sm:p-9"
style={{
boxShadow:
/* Razor-sharp inner border highlight */ "inset 0 0 0 1px rgba(255,255,255,0.7), " +
/* Top gloss edge */ "inset 0 1.5px 0.5px rgba(255,255,255,1), " +
/* Secondary soft top gloss */ "inset 0 4px 10px rgba(255,255,255,0.4), " +
/* Bottom inner shadow */ "inset 0 -1px 1px rgba(0,0,0,0.05), " +
/* Outer drop shadows for depth */ "0 25px 50px -12px rgba(0,0,0,0.08), 0 4px 8px rgba(0,0,0,0.02)",
}}
>
{/* Subtle surface "sheen" gradient */}
<div className="absolute inset-0 rounded-3xl bg-gradient-to-br from-white/40 via-transparent to-black/[0.02] pointer-events-none" />
{/* Shield icon header */}
<motion.div
initial={{ opacity: 0, scale: 0.8 }}
animate={{ opacity: 1, scale: 1 }}
transition={{ delay: 0.5, duration: 0.6 }}
className="flex justify-center mb-6"
>
<div className="relative">
<div className="w-12 h-12 rounded-2xl bg-gradient-to-br from-black/[0.04] to-black/[0.08] border border-black/[0.06] flex items-center justify-center backdrop-blur-sm">
<Shield className="w-5 h-5 text-black/40" />
</div>
{mounted && (
<div className="absolute -inset-2 rounded-2xl border border-black/[0.04] animate-ping opacity-30" />
)}
</div>
</motion.div>
<form
action={handleSubmit}
className="space-y-5 relative z-10"
onFocus={() => setIsFocused(true)}
onBlur={(e) => {
if (!e.currentTarget.contains(e.relatedTarget as Node)) {
setIsFocused(false);
}
}}
>
<input type="hidden" name="redirect" value={redirectUrl} />
<div className="space-y-3">
{/* Email Input */}
<motion.div
initial={{ opacity: 0, x: -20 }}
animate={{ opacity: 1, x: 0 }}
transition={{ delay: 0.6, duration: 0.5 }}
className="relative group"
>
<Fingerprint className="absolute left-4 top-1/2 -translate-y-1/2 w-3.5 h-3.5 text-black/20 group-focus-within:text-black/50 transition-colors duration-300 pointer-events-none" />
<input
type="email"
name="email"
placeholder="Identity (optional)"
onFocus={() => setIsFocused(true)}
className="w-full bg-black/[0.02] border border-black/[0.06] rounded-2xl pl-11 pr-4 py-3.5 focus:outline-none focus:border-black/15 focus:bg-white/80 transition-all duration-300 text-[11px] font-sans font-medium tracking-[0.08em] placeholder:text-black/25 placeholder:normal-case placeholder:tracking-normal text-black/70"
/>
<div className="absolute bottom-0 left-4 right-4 h-px bg-gradient-to-r from-transparent via-black/0 to-transparent group-focus-within:via-black/10 transition-all duration-500" />
</motion.div>
{/* Password Input */}
<motion.div
initial={{ opacity: 0, x: -20 }}
animate={{ opacity: 1, x: 0 }}
transition={{ delay: 0.7, duration: 0.5 }}
className="relative group"
>
<Lock className="absolute left-4 top-1/2 -translate-y-1/2 w-3.5 h-3.5 text-black/20 group-focus-within:text-black/50 transition-colors duration-300 pointer-events-none" />
<input
type="password"
name="password"
required
autoFocus
autoComplete="current-password"
placeholder="Access code"
onFocus={() => setIsFocused(true)}
className="w-full bg-black/[0.02] border border-black/[0.06] rounded-2xl pl-11 pr-4 py-3.5 focus:outline-none focus:border-black/15 focus:bg-white/80 transition-all duration-300 text-[13px] font-sans font-medium tracking-[0.15em] placeholder:text-black/25 placeholder:tracking-normal placeholder:text-[11px] placeholder:font-normal text-black/80"
/>
<div className="absolute bottom-0 left-4 right-4 h-px bg-gradient-to-r from-transparent via-black/0 to-transparent group-focus-within:via-black/10 transition-all duration-500" />
</motion.div>
</div>
{/* Submit Button */}
<motion.div
initial={{ opacity: 0, y: 10 }}
animate={{ opacity: 1, y: 0 }}
transition={{ delay: 0.8, duration: 0.5 }}
>
<motion.button
type="submit"
disabled={isSubmitting}
whileHover={{ scale: 1.01 }}
whileTap={{ scale: 0.98 }}
className={`relative w-full py-4 rounded-2xl text-[10px] font-bold tracking-[0.3em] uppercase flex items-center justify-center overflow-hidden transition-all duration-300 ${
isSubmitting
? "bg-black/5 text-black/25 cursor-not-allowed"
: "bg-black text-white hover:bg-black/85 shadow-lg shadow-black/10"
}`}
>
{!isSubmitting && (
<div className="absolute inset-0 bg-gradient-to-r from-transparent via-white/10 to-transparent -translate-x-full hover:translate-x-full transition-transform duration-700" />
)}
{isSubmitting ? (
<span className="flex items-center gap-3 relative z-10">
<motion.div
animate={{ rotate: 360 }}
transition={{
repeat: Infinity,
duration: 1,
ease: "linear",
}}
className="w-4 h-4 border-2 border-white/20 border-t-white/70 rounded-full"
/>
Authenticating...
</span>
) : (
<span className="flex items-center gap-3 relative z-10">
Unlock Access
<ArrowRight className="w-4 h-4" />
</span>
)}
</motion.button>
</motion.div>
</form>
{/* Security badge */}
<motion.div
initial={{ opacity: 0 }}
animate={{ opacity: 1 }}
transition={{ delay: 1.0, duration: 0.5 }}
className="mt-5 flex items-center justify-center gap-2 text-[8px] font-sans font-semibold text-black/25 uppercase tracking-[0.3em]"
>
<div className="w-1 h-1 rounded-full bg-black/20 animate-pulse" />
Encrypted Connection
</motion.div>
</div>
</div>
</motion.div>
);
}
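The shortest-path angle interpolation inside the rAF loop above can be factored out as a pure helper (a sketch for illustration; `lerpAngle` is a hypothetical name, not one used in the component):

```typescript
// Interpolate an angle in degrees toward a target along the shortest arc.
// Wrapping the difference into (-180, 180] prevents the highlight from
// spinning the long way around when the cursor crosses the ±180° boundary.
function lerpAngle(current: number, target: number, factor: number): number {
  let diff = target - current;
  while (diff > 180) diff -= 360;
  while (diff < -180) diff += 360;
  return current + diff * factor;
}
```

The component applies this logic every frame with a factor of 0.06, so the hotspot eases toward the cursor instead of snapping.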

View File

@@ -0,0 +1,127 @@
"use client";
import { useEffect, useRef } from "react";
export function GateScene() {
const canvasRef = useRef<HTMLCanvasElement>(null);
const mouse = useRef({ x: -1000, y: -1000 });
useEffect(() => {
const canvas = canvasRef.current;
if (!canvas) return;
const ctx = canvas.getContext("2d");
if (!ctx) return;
const FONT_SIZE = 13;
const COL_GAP = 18;
const ROW_GAP = 20;
interface Cell {
char: string;
alpha: number;
targetAlpha: number;
speed: number;
nextChange: number;
}
let cells: Cell[][] = [];
let cols = 0;
let rows = 0;
let animId: number;
const init = () => {
cols = Math.ceil(canvas.width / COL_GAP);
rows = Math.ceil(canvas.height / ROW_GAP);
cells = Array.from({ length: cols }, () =>
Array.from({ length: rows }, () => ({
char: Math.random() > 0.5 ? "1" : "0",
alpha: Math.random() * 0.08,
targetAlpha: Math.random() * 0.15,
speed: 0.008 + Math.random() * 0.02,
nextChange: Math.floor(Math.random() * 80),
})),
);
};
const onMouseMove = (e: MouseEvent) => {
mouse.current = { x: e.clientX, y: e.clientY };
};
const resize = () => {
canvas.width = window.innerWidth;
canvas.height = window.innerHeight;
init();
};
let frame = 0;
const draw = () => {
frame++;
ctx.clearRect(0, 0, canvas.width, canvas.height);
ctx.font = `${FONT_SIZE}px 'Courier New', monospace`;
for (let c = 0; c < cols; c++) {
for (let r = 0; r < rows; r++) {
const cell = cells[c][r];
const x = c * COL_GAP;
const y = r * ROW_GAP + FONT_SIZE;
// Mouse proximity influence
const dx = mouse.current.x - x;
const dy = mouse.current.y - y;
const distSq = dx * dx + dy * dy;
const proximity = Math.max(0, 1 - Math.sqrt(distSq) / 250);
// Nudge alpha toward target
cell.alpha += (cell.targetAlpha - cell.alpha) * cell.speed;
// More aggressive random behavior
if (frame >= cell.nextChange) {
cell.targetAlpha = Math.random() * 0.25; // Higher max alpha
cell.speed = 0.01 + Math.random() * 0.03; // Faster transitions
cell.nextChange = frame + 20 + Math.floor(Math.random() * 80); // More frequent changes
// Higher flip probability near mouse
const flipProb = 0.3 + proximity * 0.5;
if (Math.random() < flipProb) {
cell.char = cell.char === "0" ? "1" : "0";
}
}
const a = Math.min(0.4, cell.alpha + proximity * 0.35);
if (a < 0.01) continue;
// Dark chars on light background
ctx.fillStyle = `rgba(0, 0, 0, ${a})`;
ctx.fillText(cell.char, x, y);
}
}
animId = requestAnimationFrame(draw);
};
resize();
window.addEventListener("resize", resize);
window.addEventListener("mousemove", onMouseMove);
animId = requestAnimationFrame(draw);
return () => {
cancelAnimationFrame(animId);
window.removeEventListener("resize", resize);
window.removeEventListener("mousemove", onMouseMove);
};
}, []);
return (
<div className="absolute inset-0 pointer-events-none z-0 bg-[#f5f5f7]">
<canvas ref={canvasRef} className="absolute inset-0 w-full h-full" />
<div
className="absolute inset-0 pointer-events-none"
style={{
background:
"radial-gradient(ellipse 55% 65% at 50% 50%, rgba(245,245,247,0.7) 0%, transparent 100%)",
}}
/>
</div>
);
}
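The per-cell mouse-proximity falloff above can likewise be expressed as a small pure function (a sketch; `proximity` is a hypothetical name, not part of the component):

```typescript
// Linear falloff: 1 directly under the cursor, 0 at or beyond `radius` px.
// The canvas loop uses radius = 250 and computes squared distances first,
// deferring the sqrt until the falloff is actually needed.
function proximity(dx: number, dy: number, radius: number = 250): number {
  return Math.max(0, 1 - Math.hypot(dx, dy) / radius);
}
```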

View File

@@ -0,0 +1,19 @@
FROM node:20-alpine AS builder
WORKDIR /app
COPY package.json pnpm-lock.yaml* ./
RUN corepack enable pnpm && pnpm install
COPY tsconfig.json ./
COPY src ./src
RUN pnpm build
FROM node:20-alpine
WORKDIR /app
COPY --from=builder /app/package.json ./
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/dist ./dist
# Use node to run the compiled index.js
ENTRYPOINT ["node", "dist/index.js"]

View File

@@ -0,0 +1,20 @@
{
"name": "@mintel/gitea-mcp",
"version": "1.9.7",
"description": "Native Gitea MCP server for 100% Antigravity compatibility",
"main": "dist/index.js",
"type": "module",
"scripts": {
"build": "tsc",
"start": "node dist/index.js"
},
"dependencies": {
"@modelcontextprotocol/sdk": "^1.5.0",
"zod": "^3.23.8",
"axios": "^1.7.2"
},
"devDependencies": {
"typescript": "^5.5.3",
"@types/node": "^20.14.10"
}
}

View File

@@ -0,0 +1,266 @@
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
CallToolRequestSchema,
ListToolsRequestSchema,
ListResourcesRequestSchema,
ReadResourceRequestSchema,
SubscribeRequestSchema,
UnsubscribeRequestSchema,
Tool,
Resource,
} from "@modelcontextprotocol/sdk/types.js";
import { z } from "zod";
import axios from "axios";
const GITEA_HOST = process.env.GITEA_HOST || "https://git.infra.mintel.me";
const GITEA_ACCESS_TOKEN = process.env.GITEA_ACCESS_TOKEN;
if (!GITEA_ACCESS_TOKEN) {
console.error("Error: GITEA_ACCESS_TOKEN environment variable is required");
process.exit(1);
}
const giteaClient = axios.create({
baseURL: `${GITEA_HOST.replace(/\/$/, '')}/api/v1`,
headers: {
Authorization: `token ${GITEA_ACCESS_TOKEN}`,
},
});
const LIST_PIPELINES_TOOL: Tool = {
name: "gitea_list_pipelines",
description: "List recent action runs (pipelines) for a specific repository",
inputSchema: {
type: "object",
properties: {
owner: { type: "string", description: "Repository owner (e.g., 'mmintel')" },
repo: { type: "string", description: "Repository name (e.g., 'at-mintel')" },
limit: { type: "number", description: "Number of runs to fetch (default: 5)" },
},
required: ["owner", "repo"],
},
};
const GET_PIPELINE_LOGS_TOOL: Tool = {
name: "gitea_get_pipeline_logs",
description: "Get detailed logs for a specific pipeline run or job",
inputSchema: {
type: "object",
properties: {
owner: { type: "string", description: "Repository owner" },
repo: { type: "string", description: "Repository name" },
run_id: { type: "number", description: "ID of the action run" },
},
required: ["owner", "repo", "run_id"],
},
};
// Subscription State
const subscriptions = new Set<string>();
const runStatusCache = new Map<string, string>(); // uri -> status
const server = new Server(
{
name: "gitea-mcp-native",
version: "1.0.0",
},
{
capabilities: {
tools: {},
resources: { subscribe: true }, // Enable subscriptions
},
}
);
// --- Tools ---
server.setRequestHandler(ListToolsRequestSchema, async () => {
return {
tools: [LIST_PIPELINES_TOOL, GET_PIPELINE_LOGS_TOOL],
};
});
server.setRequestHandler(CallToolRequestSchema, async (request) => {
if (request.params.name === "gitea_list_pipelines") {
// ... (Keeping exact same implementation as before for brevity)
const { owner, repo, limit = 5 } = request.params.arguments as any;
try {
const runsResponse = await giteaClient.get(`/repos/${owner}/${repo}/actions/runs`, {
params: { limit },
});
const runs = (runsResponse.data.workflow_runs || []) as any[];
const enhancedRuns = await Promise.all(
runs.map(async (run: any) => {
try {
const jobsResponse = await giteaClient.get(`/repos/${owner}/${repo}/actions/runs/${run.id}/jobs`);
const jobs = (jobsResponse.data.jobs || []) as any[];
return {
id: run.id,
name: run.name,
status: run.status,
created_at: run.created_at,
jobs: jobs.map((job: any) => ({
id: job.id,
name: job.name,
status: job.status,
conclusion: job.conclusion
}))
};
} catch (e) {
return { id: run.id, name: run.name, status: run.status, created_at: run.created_at, jobs: [] };
}
})
);
return {
content: [{ type: "text", text: JSON.stringify(enhancedRuns, null, 2) }],
};
} catch (error: any) {
return { isError: true, content: [{ type: "text", text: `Error fetching pipelines: ${error.message}` }] };
}
}
if (request.params.name === "gitea_get_pipeline_logs") {
const { owner, repo, run_id } = request.params.arguments as any;
try {
const jobsResponse = await giteaClient.get(`/repos/${owner}/${repo}/actions/runs/${run_id}/jobs`);
const jobs = (jobsResponse.data.jobs || []) as any[];
const logs = jobs.map((job: any) => ({
job_id: job.id,
job_name: job.name,
status: job.status,
conclusion: job.conclusion,
steps: (job.steps || []).map((step: any) => ({
name: step.name,
status: step.status,
conclusion: step.conclusion
}))
}));
return { content: [{ type: "text", text: JSON.stringify(logs, null, 2) }] };
} catch (error: any) {
return { isError: true, content: [{ type: "text", text: `Error: ${error.message}` }] };
}
}
throw new Error(`Unknown tool: ${request.params.name}`);
});
// --- Resources & Subscriptions ---
// We will expose a dynamic resource URI pattern: gitea://runs/{owner}/{repo}/{run_id}
server.setRequestHandler(ListResourcesRequestSchema, async () => {
return {
resources: [
{
uri: "gitea://runs",
name: "Gitea Pipeline Runs",
description: "Dynamic resource for subscribing to pipeline runs. Format: gitea://runs/{owner}/{repo}/{run_id}",
mimeType: "application/json",
}
],
};
});
server.setRequestHandler(ReadResourceRequestSchema, async (request) => {
const uri = request.params.uri;
const match = uri.match(/^gitea:\/\/runs\/([^\/]+)\/([^\/]+)\/(\d+)$/);
if (!match) {
throw new Error(`Invalid resource URI. Must be gitea://runs/{owner}/{repo}/{run_id}`);
}
const [, owner, repo, run_id] = match;
try {
const runResponse = await giteaClient.get(`/repos/${owner}/${repo}/actions/runs/${run_id}`);
const jobsResponse = await giteaClient.get(`/repos/${owner}/${repo}/actions/runs/${run_id}/jobs`);
const resourceContent = {
run: runResponse.data,
jobs: jobsResponse.data
};
// Update internal cache when read
runStatusCache.set(uri, runResponse.data.status);
return {
contents: [
{
uri,
mimeType: "application/json",
text: JSON.stringify(resourceContent, null, 2),
},
],
};
} catch (error: any) {
throw new Error(`Failed to read Gitea resource: ${error.message}`);
}
});
server.setRequestHandler(SubscribeRequestSchema, async (request) => {
const uri = request.params.uri;
if (!uri.startsWith("gitea://runs/")) {
throw new Error("Only gitea://runs resources can be subscribed to");
}
subscriptions.add(uri);
console.error(`Client subscribed to ${uri}`);
return {};
});
server.setRequestHandler(UnsubscribeRequestSchema, async (request) => {
const uri = request.params.uri;
subscriptions.delete(uri);
console.error(`Client unsubscribed from ${uri}`);
return {};
});
// The server polling mechanism that pushes updates to subscribed clients
async function pollSubscriptions() {
for (const uri of subscriptions) {
const match = uri.match(/^gitea:\/\/runs\/([^\/]+)\/([^\/]+)\/(\d+)$/);
if (!match) continue;
const [, owner, repo, run_id] = match;
try {
const runResponse = await giteaClient.get(`/repos/${owner}/${repo}/actions/runs/${run_id}`);
const currentStatus = runResponse.data.status;
const prevStatus = runStatusCache.get(uri);
// If status changed (e.g. running -> completed), notify client
if (prevStatus !== currentStatus) {
runStatusCache.set(uri, currentStatus);
server.notification({
method: "notifications/resources/updated",
params: { uri }
});
console.error(`Pushed update for ${uri}: ${prevStatus} -> ${currentStatus}`);
// Auto-unsubscribe if completed/failed so we don't poll forever?
// Let the client decide, or we can handle it here if requested.
}
} catch (e) {
console.error(`Error polling subscription ${uri}:`, e);
}
}
// Poll every 5 seconds
setTimeout(pollSubscriptions, 5000);
}
async function run() {
const transport = new StdioServerTransport();
await server.connect(transport);
console.error("Gitea MCP Native Server running on stdio");
// Start the background poller
pollSubscriptions();
}
run().catch((error) => {
console.error("Fatal error:", error);
process.exit(1);
});
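The `gitea://runs/{owner}/{repo}/{run_id}` pattern above is matched in two places (the ReadResource handler and the poller); extracting the parse into one helper would keep the two regexes in sync. A sketch under that assumption (`RunRef` and `parseRunUri` are hypothetical names, not part of the server):

```typescript
interface RunRef {
  owner: string;
  repo: string;
  runId: number;
}

// Parse a subscription URI of the form gitea://runs/{owner}/{repo}/{run_id}.
// Returns null for anything that does not match, mirroring the server's regex.
function parseRunUri(uri: string): RunRef | null {
  const match = uri.match(/^gitea:\/\/runs\/([^/]+)\/([^/]+)\/(\d+)$/);
  if (!match) return null;
  const [, owner, repo, runId] = match;
  return { owner, repo, runId: Number(runId) };
}
```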

View File

@@ -0,0 +1,16 @@
{
"compilerOptions": {
"target": "ES2022",
"module": "NodeNext",
"moduleResolution": "NodeNext",
"outDir": "./dist",
"rootDir": "./src",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true
},
"include": [
"src/**/*"
]
}

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/husky-config",
"version": "1.9.1",
"version": "1.9.7",
"publishConfig": {
"access": "public",
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -13,7 +13,7 @@ COPY packages/eslint-config/package.json ./packages/eslint-config/package.json
COPY packages/next-config/package.json ./packages/next-config/package.json
COPY packages/tsconfig/package.json ./packages/tsconfig/package.json
COPY packages/infra/package.json ./packages/infra/package.json
COPY packages/cms-infra/package.json ./packages/cms-infra/package.json
COPY packages/mail/package.json ./packages/mail/package.json
COPY packages/cli/package.json ./packages/cli/package.json
COPY packages/observability/package.json ./packages/observability/package.json

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/infra",
"version": "1.9.1",
"version": "1.9.7",
"publishConfig": {
"access": "public",
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/journaling",
"version": "1.9.1",
"version": "1.9.7",
"private": true,
"type": "module",
"main": "./dist/index.js",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/mail",
"version": "1.9.1",
"version": "1.9.7",
"private": false,
"publishConfig": {
"access": "public",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/meme-generator",
"version": "1.9.1",
"version": "1.9.7",
"private": false,
"type": "module",
"main": "./dist/index.js",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/next-config",
"version": "1.9.1",
"version": "1.9.7",
"publishConfig": {
"access": "public",
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/next-feedback",
"version": "1.9.1",
"version": "1.9.7",
"publishConfig": {
"access": "public",
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/next-observability",
"version": "1.9.1",
"version": "1.9.7",
"publishConfig": {
"access": "public",
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/next-utils",
"version": "1.9.1",
"version": "1.9.7",
"publishConfig": {
"access": "public",
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/observability",
"version": "1.9.1",
"version": "1.9.7",
"publishConfig": {
"access": "public",
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/page-audit",
"version": "1.9.1",
"version": "1.9.7",
"private": true,
"description": "AI-powered website IST-analysis using DataForSEO and Gemini",
"type": "module",

View File

@@ -0,0 +1,7 @@
# @mintel/payload-ai
## 1.1.0
### Minor Changes
- Initial release of the @mintel/payload-ai package

View File

@@ -0,0 +1,45 @@
{
"name": "@mintel/payload-ai",
"version": "1.9.7",
"private": true,
"description": "Reusable Payload CMS AI Extensions",
"type": "module",
"scripts": {
"build": "tsc",
"typecheck": "tsc --noEmit"
},
"main": "./dist/index.js",
"types": "./dist/index.d.ts",
"exports": {
".": "./dist/index.js",
"./components/*": "./dist/components/*",
"./actions/*": "./dist/actions/*",
"./globals/*": "./dist/globals/*",
"./endpoints/*": "./dist/endpoints/*",
"./utils/*": "./dist/utils/*"
},
"peerDependencies": {
"@payloadcms/next": ">=3.0.0",
"@payloadcms/ui": ">=3.0.0",
"payload": ">=3.0.0",
"react": ">=18.0.0",
"react-dom": ">=18.0.0"
},
"dependencies": {
"@mintel/content-engine": "workspace:*",
"@mintel/thumbnail-generator": "workspace:*",
"replicate": "^1.4.0"
},
"devDependencies": {
"@payloadcms/next": "3.77.0",
"@payloadcms/ui": "3.77.0",
"payload": "3.77.0",
"react": "^19.2.3",
"react-dom": "^19.2.3",
"@types/node": "^20.17.17",
"@types/react": "^19.2.8",
"@types/react-dom": "^19.2.3",
"next": "^15.1.0",
"typescript": "^5.7.3"
}
}

View File

@@ -0,0 +1,190 @@
"use server";
import { getPayloadHMR } from "@payloadcms/next/utilities";
import configPromise from "@payload-config";
import * as fs from "node:fs/promises";
import * as path from "node:path";
import * as os from "node:os";
async function getOrchestrator() {
const OPENROUTER_KEY =
process.env.OPENROUTER_KEY || process.env.OPENROUTER_API_KEY;
const REPLICATE_KEY = process.env.REPLICATE_API_KEY;
if (!OPENROUTER_KEY) {
throw new Error(
"Missing OPENROUTER_KEY / OPENROUTER_API_KEY in .env (required for AI generation)",
);
}
const importDynamic = new Function("modulePath", "return import(modulePath)");
const { AiBlogPostOrchestrator } = await importDynamic(
"@mintel/content-engine",
);
return new AiBlogPostOrchestrator({
apiKey: OPENROUTER_KEY,
replicateApiKey: REPLICATE_KEY,
model: "google/gemini-3-flash-preview",
});
}
export async function generateSlugAction(
title: string,
draftContent: string,
oldSlug?: string,
instructions?: string,
) {
try {
const orchestrator = await getOrchestrator();
const newSlug = await orchestrator.generateSlug(
draftContent,
title,
instructions,
);
if (oldSlug && oldSlug !== newSlug) {
const payload = await getPayloadHMR({ config: configPromise as any });
await payload.create({
collection: "redirects",
data: {
from: oldSlug,
to: newSlug,
},
});
}
return { success: true, slug: newSlug };
} catch (e: any) {
return { success: false, error: e.message };
}
}
export async function generateThumbnailAction(
draftContent: string,
title?: string,
instructions?: string,
) {
try {
const payload = await getPayloadHMR({ config: configPromise as any });
const OPENROUTER_KEY =
process.env.OPENROUTER_KEY || process.env.OPENROUTER_API_KEY;
const REPLICATE_KEY = process.env.REPLICATE_API_KEY;
if (!OPENROUTER_KEY) {
throw new Error("Missing OPENROUTER_API_KEY in .env");
}
if (!REPLICATE_KEY) {
throw new Error(
"Missing REPLICATE_API_KEY in .env (Required for Thumbnails)",
);
}
const importDynamic = new Function(
"modulePath",
"return import(modulePath)",
);
const { AiBlogPostOrchestrator } = await importDynamic(
"@mintel/content-engine",
);
const { ThumbnailGenerator } = await importDynamic(
"@mintel/thumbnail-generator",
);
const orchestrator = new AiBlogPostOrchestrator({
apiKey: OPENROUTER_KEY,
replicateApiKey: REPLICATE_KEY,
model: "google/gemini-3-flash-preview",
});
const tg = new ThumbnailGenerator({ replicateApiKey: REPLICATE_KEY });
const prompt = await orchestrator.generateVisualPrompt(
draftContent || title || "Technology",
instructions,
);
const tmpPath = path.join(os.tmpdir(), `mintel-thumb-${Date.now()}.png`);
await tg.generateImage(prompt, tmpPath);
const fileData = await fs.readFile(tmpPath);
const stat = await fs.stat(tmpPath);
const fileName = path.basename(tmpPath);
const newMedia = await payload.create({
collection: "media",
data: {
alt: title ? `Thumbnail for ${title}` : "AI Generated Thumbnail",
},
file: {
data: fileData,
name: fileName,
mimetype: "image/png",
size: stat.size,
},
});
// Cleanup temp file
await fs.unlink(tmpPath).catch(() => { });
return { success: true, mediaId: newMedia.id };
} catch (e: any) {
return { success: false, error: e.message };
}
}
export async function generateSingleFieldAction(
documentTitle: string,
documentContent: string,
fieldName: string,
fieldDescription: string,
instructions?: string,
) {
try {
const OPENROUTER_KEY =
process.env.OPENROUTER_KEY || process.env.OPENROUTER_API_KEY;
if (!OPENROUTER_KEY) throw new Error("Missing OPENROUTER_API_KEY");
const payload = await getPayloadHMR({ config: configPromise as any });
// Fetch context documents from DB
const contextDocsData = await payload.find({
collection: "context-files",
limit: 100,
});
const projectContext = contextDocsData.docs
.map((doc: any) => `--- ${doc.filename} ---\n${doc.content}`)
.join("\n\n");
const prompt = `You are an expert AI assistant perfectly trained for generating exact data values for CMS components.
PROJECT STRATEGY & CONTEXT:
${projectContext}
DOCUMENT TITLE: ${documentTitle}
DOCUMENT DRAFT:\n${documentContent}\n
YOUR TASK: Generate the exact value for a specific field named "${fieldName}".
${fieldDescription ? `FIELD DESCRIPTION / CONSTRAINTS: ${fieldDescription}\n` : ""}
${instructions ? `EDITOR INSTRUCTIONS for this field: ${instructions}\n` : ""}
CRITICAL RULES:
1. Respond ONLY with the requested content value.
2. NO markdown wrapping blocks (like \`\`\`mermaid or \`\`\`html) around the output! Just the raw code or text.
3. If the field implies a diagram or flow, output RAW Mermaid.js code.
4. If it's standard text, write professional B2B German. No quotes, no conversational filler.`;
const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
method: "POST",
headers: {
Authorization: `Bearer ${OPENROUTER_KEY}`,
"Content-Type": "application/json",
},
body: JSON.stringify({
model: "google/gemini-3-flash-preview",
messages: [{ role: "user", content: prompt }],
}),
});
const data = await res.json();
const text = data.choices?.[0]?.message?.content?.trim() || "";
return { success: true, text };
} catch (e: any) {
return { success: false, error: e.message };
}
}
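The context assembly inside generateSingleFieldAction above can be isolated into a small pure helper; a minimal sketch, assuming the `filename`/`content` shape of the `context-files` docs (the helper name is hypothetical):

```typescript
// Minimal sketch of the project-context assembly used above.
// The ContextDoc shape is an assumption based on how the docs are read.
interface ContextDoc {
  filename: string;
  content: string;
}

function buildProjectContext(docs: ContextDoc[]): string {
  return docs
    .map((doc) => `--- ${doc.filename} ---\n${doc.content}`)
    .join("\n\n");
}

console.log(buildProjectContext([{ filename: "strategy.md", content: "B2B first." }]));
```

Keeping this pure makes it trivial to unit-test without a running Payload instance.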

View File

@@ -0,0 +1,83 @@
"use server";
import { parseMarkdownToLexical } from "../utils/lexicalParser";
import { getPayloadHMR } from "@payloadcms/next/utilities";
import configPromise from "@payload-config";
export async function optimizePostText(
draftContent: string,
instructions?: string,
) {
try {
const payload = await getPayloadHMR({ config: configPromise as any });
const globalAiSettings = (await payload.findGlobal({ slug: "ai-settings" })) as any;
const customSources =
globalAiSettings?.customSources?.map((s: any) => s.sourceName) || [];
const OPENROUTER_KEY =
process.env.OPENROUTER_KEY || process.env.OPENROUTER_API_KEY;
const REPLICATE_KEY = process.env.REPLICATE_API_KEY;
if (!OPENROUTER_KEY) {
throw new Error(
"OPENROUTER_KEY or OPENROUTER_API_KEY not found in environment.",
);
}
const importDynamic = new Function(
"modulePath",
"return import(modulePath)",
);
const { AiBlogPostOrchestrator } = await importDynamic(
"@mintel/content-engine",
);
const orchestrator = new AiBlogPostOrchestrator({
apiKey: OPENROUTER_KEY,
replicateApiKey: REPLICATE_KEY,
model: "google/gemini-3-flash-preview",
});
// Fetch context documents purely from DB
const contextDocsData = await payload.find({
collection: "context-files",
limit: 100,
});
const projectContext = contextDocsData.docs.map((doc: any) => doc.content);
const optimizedMarkdown = await orchestrator.optimizeDocument({
content: draftContent,
projectContext,
availableComponents: [], // Removed hardcoded config.components dependency
instructions,
internalLinks: [],
customSources,
});
if (!optimizedMarkdown || typeof optimizedMarkdown !== "string") {
throw new Error("AI returned invalid markup.");
}
const blocks = parseMarkdownToLexical(optimizedMarkdown);
return {
success: true,
lexicalAST: {
root: {
type: "root",
format: "",
indent: 0,
version: 1,
children: blocks,
direction: "ltr",
},
},
};
} catch (error: any) {
console.error("Failed to optimize post:", error);
return {
success: false,
error: error.message || "An unknown error occurred during optimization.",
};
}
}
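The Lexical AST returned above is always the same root envelope around the parsed blocks; a standalone sketch of that envelope (the helper name is hypothetical, the shape mirrors the return value above):

```typescript
// Hypothetical helper mirroring the root envelope that optimizePostText returns above.
type LexicalNode = Record<string, unknown>;

function wrapInLexicalRoot(children: LexicalNode[]) {
  return {
    root: {
      type: "root",
      format: "",
      indent: 0,
      version: 1,
      children,
      direction: "ltr",
    },
  };
}

const ast = wrapInLexicalRoot([{ type: "paragraph", version: 1, children: [] }]);
console.log(ast.root.children.length);
```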

View File

@@ -0,0 +1,163 @@
"use client";
import React, { useState } from "react";
import { useDocumentInfo, toast } from "@payloadcms/ui";
type Action = "upscale" | "recover";
interface ActionState {
loading: boolean;
resultId?: string | number;
}
export const AiMediaButtons: React.FC = () => {
const { id } = useDocumentInfo();
const [upscale, setUpscale] = useState<ActionState>({ loading: false });
const [recover, setRecover] = useState<ActionState>({ loading: false });
if (!id) return null; // Only show on existing documents
const runAction = async (action: Action) => {
const setter = action === "upscale" ? setUpscale : setRecover;
setter({ loading: true });
const label = action === "upscale" ? "AI Upscale" : "AI Recover";
toast.info(
`${label} started, this can take 30-90 seconds, please wait…`,
);
try {
// The API path is hardcoded here, assuming that's where the host app registers the endpoint.
const response = await fetch(`/api/media/${id}/ai-process`, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ action }),
});
const result = await response.json();
if (!response.ok) {
throw new Error(result.error || `${label} failed`);
}
setter({ loading: false, resultId: result.mediaId });
toast.success(
`${label} erfolgreich! Neues Bild (ID: ${result.mediaId}) wurde gespeichert.`,
);
} catch (err: any) {
console.error(`[AiMediaButtons] ${action} error:`, err);
toast.error(
err instanceof Error ? err.message : `${label} fehlgeschlagen`,
);
setter({ loading: false });
}
};
const buttonStyle: React.CSSProperties = {
background: "var(--theme-elevation-150)",
border: "1px solid var(--theme-elevation-200)",
color: "var(--theme-text)",
padding: "8px 14px",
borderRadius: "4px",
fontSize: "13px",
fontWeight: 500,
display: "inline-flex",
alignItems: "center",
gap: "6px",
transition: "opacity 0.15s ease",
};
const disabledStyle: React.CSSProperties = {
opacity: 0.55,
cursor: "not-allowed",
};
return (
<div
style={{
display: "flex",
flexWrap: "wrap",
gap: "10px",
marginBottom: "1.5rem",
marginTop: "0.5rem",
}}
>
{/* AI Upscale */}
<div style={{ display: "flex", flexDirection: "column", gap: "6px" }}>
<button
type="button"
disabled={upscale.loading || recover.loading}
onClick={() => runAction("upscale")}
style={{
...buttonStyle,
...(upscale.loading || recover.loading ? disabledStyle : { cursor: "pointer" }),
}}
>
{upscale.loading ? "⏳ AI Upscale läuft…" : "✨ AI Upscale"}
</button>
{upscale.resultId && (
<a
href={`/admin/collections/media/${upscale.resultId}`}
target="_blank"
rel="noopener noreferrer"
style={{
fontSize: "12px",
color: "var(--theme-elevation-500)",
textDecoration: "underline",
}}
>
Neues Bild öffnen (ID: {upscale.resultId})
</a>
)}
</div>
{/* AI Recover */}
<div style={{ display: "flex", flexDirection: "column", gap: "6px" }}>
<button
type="button"
disabled={upscale.loading || recover.loading}
onClick={() => runAction("recover")}
style={{
...buttonStyle,
...(upscale.loading || recover.loading ? disabledStyle : { cursor: "pointer" }),
}}
>
{recover.loading ? "⏳ AI Recover läuft…" : "🔄 AI Recover"}
</button>
{recover.resultId && (
<a
href={`/admin/collections/media/${recover.resultId}`}
target="_blank"
rel="noopener noreferrer"
style={{
fontSize: "12px",
color: "var(--theme-elevation-500)",
textDecoration: "underline",
}}
>
Neues Bild öffnen (ID: {recover.resultId})
</a>
)}
</div>
<p
style={{
width: "100%",
fontSize: "0.8rem",
color: "var(--theme-elevation-500)",
margin: 0,
lineHeight: 1.4,
}}
>
<strong>AI Upscale</strong> verbessert die Auflösung via{" "}
<code>google/upscaler</code>. <strong>AI Recover</strong> restauriert
alte/beschädigte Fotos via{" "}
<code>microsoft/bringing-old-photos-back-to-life</code>. Das
Ergebnis wird als neues Medium gespeichert.
</p>
</div>
);
};

View File

@@ -0,0 +1,136 @@
"use client";
import React, { useState } from "react";
import { useField, useDocumentInfo, useForm } from "@payloadcms/ui";
import { generateSingleFieldAction } from "../../actions/generateField.js";
export function AiFieldButton({ path, field }: { path: string; field: any }) {
const [isGenerating, setIsGenerating] = useState(false);
const [instructions, setInstructions] = useState("");
const [showInstructions, setShowInstructions] = useState(false);
// Payload hooks
const { value, setValue } = useField<string>({ path });
const { title } = useDocumentInfo();
const { fields } = useForm();
const extractText = (lexicalRoot: any): string => {
if (!lexicalRoot) return "";
let text = "";
const iterate = (node: any) => {
if (node.text) text += node.text + " ";
if (node.children) node.children.forEach(iterate);
};
iterate(lexicalRoot);
return text;
};
const handleGenerate = async (e: React.MouseEvent) => {
e.preventDefault();
const lexicalValue = fields?.content?.value as any;
const legacyValue = fields?.legacyMdx?.value as string;
let draftContent = legacyValue || "";
if (!draftContent && lexicalValue?.root) {
draftContent = extractText(lexicalValue.root);
}
setIsGenerating(true);
try {
// Field name is passed as a label usually, fallback to path
const fieldName = typeof field?.label === "string" ? field.label : path;
const fieldDescription =
typeof field?.admin?.description === "string"
? field.admin.description
: "";
const res = await generateSingleFieldAction(
(title as string) || "",
draftContent,
fieldName,
fieldDescription,
instructions,
);
if (res.success && res.text) {
setValue(res.text);
} else {
alert("Fehler: " + res.error);
}
} catch (e) {
alert("Fehler bei der Generierung.");
} finally {
setIsGenerating(false);
setShowInstructions(false);
}
};
return (
<div
style={{
marginTop: "8px",
marginBottom: "8px",
display: "flex",
flexDirection: "column",
gap: "8px",
}}
>
<div style={{ display: "flex", gap: "8px", alignItems: "center" }}>
<button
type="button"
onClick={handleGenerate}
disabled={isGenerating}
style={{
background: "var(--theme-elevation-150)",
border: "1px solid var(--theme-elevation-200)",
color: "var(--theme-text)",
padding: "4px 12px",
borderRadius: "4px",
fontSize: "12px",
cursor: isGenerating ? "not-allowed" : "pointer",
display: "flex",
alignItems: "center",
gap: "6px",
opacity: isGenerating ? 0.6 : 1,
}}
>
{isGenerating ? "✨ AI arbeitet..." : "✨ AI Ausfüllen"}
</button>
<button
type="button"
onClick={(e) => {
e.preventDefault();
setShowInstructions(!showInstructions);
}}
style={{
background: "transparent",
border: "none",
color: "var(--theme-elevation-500)",
fontSize: "12px",
cursor: "pointer",
textDecoration: "underline",
}}
>
{showInstructions ? "Prompt verbergen" : "Mit Prompt..."}
</button>
</div>
{showInstructions && (
<textarea
value={instructions}
onChange={(e) => setInstructions(e.target.value)}
placeholder="Eigene Anweisung an AI (z.B. 'als catchy slogan')"
disabled={isGenerating}
style={{
width: "100%",
padding: "6px 8px",
fontSize: "12px",
borderRadius: "4px",
border: "1px solid var(--theme-elevation-200)",
background: "var(--theme-elevation-50)",
color: "var(--theme-text)",
}}
rows={2}
/>
)}
</div>
);
}

View File

@@ -0,0 +1,107 @@
"use client";
import React, { useState, useEffect } from "react";
import { useForm, useField } from "@payloadcms/ui";
import { generateSlugAction } from "../../actions/generateField.js";
export function GenerateSlugButton({ path }: { path: string }) {
const [isGenerating, setIsGenerating] = useState(false);
const [instructions, setInstructions] = useState("");
useEffect(() => {
if (!isGenerating) return;
const handleBeforeUnload = (e: BeforeUnloadEvent) => {
e.preventDefault();
e.returnValue =
"Slug-Generierung läuft noch. Wenn Sie neu laden, bricht der Vorgang ab!";
};
window.addEventListener("beforeunload", handleBeforeUnload);
return () => window.removeEventListener("beforeunload", handleBeforeUnload);
}, [isGenerating]);
const { fields, replaceState } = useForm();
const { value, initialValue, setValue } = useField({ path });
const extractText = (lexicalRoot: any): string => {
if (!lexicalRoot) return "";
let text = "";
const iterate = (node: any) => {
if (node.text) text += node.text + " ";
if (node.children) node.children.forEach(iterate);
};
iterate(lexicalRoot);
return text;
};
const handleGenerate = async () => {
const title = (fields?.title?.value as string) || "";
const lexicalValue = fields?.content?.value as any;
const legacyValue = fields?.legacyMdx?.value as string;
let draftContent = legacyValue || "";
if (!draftContent && lexicalValue?.root) {
draftContent = extractText(lexicalValue.root);
}
setIsGenerating(true);
try {
const res = await generateSlugAction(
title,
draftContent,
initialValue as string,
instructions,
);
if (res.success && res.slug) {
setValue(res.slug);
} else {
alert("Fehler: " + res.error);
}
} catch (e) {
console.error(e);
alert("Unerwarteter Fehler.");
} finally {
setIsGenerating(false);
}
};
return (
<div className="flex gap-2 items-center mb-4">
<textarea
value={instructions}
onChange={(e) => setInstructions(e.target.value)}
placeholder="Optionale AI Anweisung für den Slug..."
disabled={isGenerating}
style={{
width: "100%",
minHeight: "40px",
padding: "8px 12px",
fontSize: "14px",
borderRadius: "4px",
border: "1px solid var(--theme-elevation-200)",
background: "var(--theme-elevation-50)",
color: "var(--theme-text)",
marginBottom: "8px",
}}
/>
<button
type="button"
onClick={handleGenerate}
disabled={isGenerating}
className="btn btn--icon-style-none btn--size-medium ml-auto"
style={{
background: "var(--theme-elevation-150)",
border: "1px solid var(--theme-elevation-200)",
color: "var(--theme-text)",
boxShadow: "0 2px 4px rgba(0,0,0,0.05)",
transition: "all 0.2s ease",
opacity: isGenerating ? 0.6 : 1,
cursor: isGenerating ? "not-allowed" : "pointer",
}}
>
<span className="btn__content">
{isGenerating ? "✨ Generiere (ca 10s)..." : "✨ AI Slug Generieren"}
</span>
</button>
</div>
);
}

View File

@@ -0,0 +1,108 @@
"use client";
import React, { useState, useEffect } from "react";
import { useForm, useField } from "@payloadcms/ui";
import { generateThumbnailAction } from "../../actions/generateField.js";
export function GenerateThumbnailButton({ path }: { path: string }) {
const [isGenerating, setIsGenerating] = useState(false);
const [instructions, setInstructions] = useState("");
useEffect(() => {
if (!isGenerating) return;
const handleBeforeUnload = (e: BeforeUnloadEvent) => {
e.preventDefault();
e.returnValue =
"Bild-Generierung läuft noch (dies dauert bis zu 2 Minuten). Wenn Sie neu laden, bricht der Vorgang ab!";
};
window.addEventListener("beforeunload", handleBeforeUnload);
return () => window.removeEventListener("beforeunload", handleBeforeUnload);
}, [isGenerating]);
const { fields } = useForm();
const { value, setValue } = useField({ path });
const extractText = (lexicalRoot: any): string => {
if (!lexicalRoot) return "";
let text = "";
const iterate = (node: any) => {
if (node.text) text += node.text + " ";
if (node.children) node.children.forEach(iterate);
};
iterate(lexicalRoot);
return text;
};
const handleGenerate = async () => {
const title = (fields?.title?.value as string) || "";
const lexicalValue = fields?.content?.value as any;
const legacyValue = fields?.legacyMdx?.value as string;
let draftContent = legacyValue || "";
if (!draftContent && lexicalValue?.root) {
draftContent = extractText(lexicalValue.root);
}
setIsGenerating(true);
try {
const res = await generateThumbnailAction(
draftContent,
title,
instructions,
);
if (res.success && res.mediaId) {
setValue(res.mediaId);
} else {
alert("Fehler: " + res.error);
}
} catch (e) {
console.error(e);
alert("Unerwarteter Fehler.");
} finally {
setIsGenerating(false);
}
};
return (
<div className="flex gap-2 items-center mt-2 mb-4">
<textarea
value={instructions}
onChange={(e) => setInstructions(e.target.value)}
placeholder="Optionale Thumbnail-Detailanweisung (Farben, Stimmung, etc.)..."
disabled={isGenerating}
style={{
width: "100%",
minHeight: "40px",
padding: "8px 12px",
fontSize: "14px",
borderRadius: "4px",
border: "1px solid var(--theme-elevation-200)",
background: "var(--theme-elevation-50)",
color: "var(--theme-text)",
marginBottom: "8px",
}}
/>
<button
type="button"
onClick={handleGenerate}
disabled={isGenerating}
className="btn btn--icon-style-none btn--size-medium"
style={{
background: "var(--theme-elevation-150)",
border: "1px solid var(--theme-elevation-200)",
color: "var(--theme-text)",
boxShadow: "0 2px 4px rgba(0,0,0,0.05)",
transition: "all 0.2s ease",
opacity: isGenerating ? 0.6 : 1,
cursor: isGenerating ? "not-allowed" : "pointer",
}}
>
<span className="btn__content">
{isGenerating
? "✨ AI arbeitet (dauert ca. 1-2 Min)..."
: "✨ AI Thumbnail Generieren"}
</span>
</button>
</div>
);
}

View File

@@ -0,0 +1,136 @@
"use client";
import React, { useState, useEffect } from "react";
import { useForm, useDocumentInfo } from "@payloadcms/ui";
import { optimizePostText } from "../actions/optimizePost.js";
import { Button } from "@payloadcms/ui";
export function OptimizeButton() {
const [isOptimizing, setIsOptimizing] = useState(false);
const [instructions, setInstructions] = useState("");
useEffect(() => {
if (!isOptimizing) return;
const handleBeforeUnload = (e: BeforeUnloadEvent) => {
e.preventDefault();
e.returnValue =
"Lexical Block-Optimierung läuft noch (dies dauert bis zu 45 Sekunden). Wenn Sie neu laden, bricht der Vorgang ab!";
};
window.addEventListener("beforeunload", handleBeforeUnload);
return () => window.removeEventListener("beforeunload", handleBeforeUnload);
}, [isOptimizing]);
const { fields, setModified, replaceState } = useForm();
const { title } = useDocumentInfo();
const handleOptimize = async () => {
// Gather the draft content from the legacy MDX field or the Lexical editor state
const lexicalValue = fields?.content?.value as any;
const legacyValue = fields?.legacyMdx?.value as string;
let draftContent = legacyValue || "";
const extractText = (lexicalRoot: any): string => {
if (!lexicalRoot) return "";
let text = "";
const iterate = (node: any) => {
if (node.text) text += node.text + " ";
if (node.children) node.children.forEach(iterate);
};
iterate(lexicalRoot);
return text;
};
if (!draftContent && lexicalValue?.root) {
draftContent = extractText(lexicalValue.root);
}
if (!draftContent || draftContent.trim().length < 50) {
alert(
"Der Entwurf ist zu kurz. Bitte tippe zuerst ein paar Stichpunkte oder einen Rohling ein.",
);
return;
}
setIsOptimizing(true);
try {
// 2. We inject the title so the AI knows what it's writing about
const payloadText = `---\ntitle: "${title}"\n---\n\n${draftContent}`;
const response = await optimizePostText(payloadText, instructions);
if (response.success && response.lexicalAST) {
// 3. Inject the new Lexical AST directly into the form state:
// we use Payload's useForm hook to replace the value of the 'content' field.
replaceState({
...fields,
content: {
...fields.content,
value: response.lexicalAST,
initialValue: response.lexicalAST,
},
});
setModified(true);
alert(
"🎉 Artikel wurde erfolgreich von der AI optimiert und mit Lexical Components angereichert.",
);
} else {
alert("❌ Fehler: " + response.error);
}
} catch (error) {
console.error("Optimization failed:", error);
alert("Ein unerwarteter Fehler ist aufgetreten.");
} finally {
setIsOptimizing(false);
}
};
return (
<div className="mb-8 p-4 bg-slate-50 border border-slate-200 rounded-md">
<h3 className="text-sm font-semibold mb-2">AI Post Optimizer</h3>
<p className="text-xs text-slate-500 mb-4">
Lass Mintel AI deinen Text-Rohentwurf analysieren und automatisch in
einen voll formatierten Lexical Artikel mit passenden B2B Komponenten
(MemeCards, Mermaids) umwandeln.
</p>
<textarea
value={instructions}
onChange={(e) => setInstructions(e.target.value)}
placeholder="Optionale Anweisungen an die AI (z.B. 'schreibe etwas lockerer' oder 'fokussiere dich auf SEO')..."
disabled={isOptimizing}
style={{
width: "100%",
minHeight: "60px",
padding: "8px 12px",
fontSize: "14px",
borderRadius: "4px",
border: "1px solid var(--theme-elevation-200)",
background: "var(--theme-elevation-50)",
color: "var(--theme-text)",
marginBottom: "16px",
}}
/>
<button
type="button"
onClick={handleOptimize}
disabled={isOptimizing}
className="btn btn--icon-style-none btn--size-medium mt-4"
style={{
background: "var(--theme-elevation-150)",
border: "1px solid var(--theme-elevation-200)",
color: "var(--theme-text)",
boxShadow: "0 2px 4px rgba(0,0,0,0.05)",
transition: "all 0.2s ease",
opacity: isOptimizing ? 0.7 : 1,
cursor: isOptimizing ? "not-allowed" : "pointer",
}}
>
<span className="btn__content" style={{ fontWeight: 600 }}>
{isOptimizing ? "✨ AI arbeitet (ca 30s)..." : "✨ Jetzt optimieren"}
</span>
</button>
</div>
);
}

View File

@@ -0,0 +1,177 @@
import type { PayloadRequest, PayloadHandler } from "payload";
import Replicate from "replicate";
type Action = "upscale" | "recover";
const replicate = new Replicate({
auth: process.env.REPLICATE_API_KEY,
});
/**
* Downloads a remote URL and returns a Buffer.
*/
async function downloadImage(url: string): Promise<Buffer> {
const res = await fetch(url);
if (!res.ok) throw new Error(`Failed to download image: ${res.status} ${res.statusText}`);
const arrayBuffer = await res.arrayBuffer();
return Buffer.from(arrayBuffer);
}
/**
* Resolves the public URL for a media document.
* Handles both S3 and local static files.
*/
function resolveMediaUrl(doc: any): string | null {
// S3 storage sets `url` directly
if (doc.url) return doc.url;
// Local static files: build from NEXT_PUBLIC_BASE_URL + /media/<filename>
const base = process.env.NEXT_PUBLIC_BASE_URL || "http://localhost:3000";
if (doc.filename) return `${base}/media/${doc.filename}`;
return null;
}
export const replicateMediaHandler: PayloadHandler = async (
req: PayloadRequest,
) => {
const { id } = req.routeParams as { id: string };
const payload = req.payload;
// Parse action from request body
let action: Action;
try {
const body = await req.json?.();
action = body?.action as Action;
} catch {
return Response.json({ error: "Invalid request body" }, { status: 400 });
}
if (action !== "upscale" && action !== "recover") {
return Response.json(
{ error: "Invalid action. Must be 'upscale' or 'recover'." },
{ status: 400 },
);
}
// Fetch the media document
let mediaDoc: any;
try {
mediaDoc = await payload.findByID({ collection: "media", id });
} catch {
return Response.json({ error: "Media not found" }, { status: 404 });
}
if (!mediaDoc) {
return Response.json({ error: "Media not found" }, { status: 404 });
}
// Check that it's an image
const mimeType: string = mediaDoc.mimeType || "";
if (!mimeType.startsWith("image/")) {
return Response.json(
{ error: "This media file is not an image and cannot be AI-processed." },
{ status: 422 },
);
}
const imageUrl = resolveMediaUrl(mediaDoc);
if (!imageUrl) {
return Response.json(
{ error: "Could not resolve a public URL for this media file." },
{ status: 422 },
);
}
// --- Run Replicate ---
let outputUrl: string;
try {
if (action === "upscale") {
console.log(`[AI Media] Starting upscale for media ${id}: ${imageUrl}`);
const output = await replicate.run("google/upscaler", {
input: {
image: imageUrl,
},
});
// google/upscaler returns a string URL
outputUrl = typeof output === "string" ? output : (output as any)?.url ?? String(output);
} else {
// recover
console.log(`[AI Media] Starting photo recovery for media ${id}: ${imageUrl}`);
const output = await replicate.run(
"microsoft/bringing-old-photos-back-to-life",
{
input: {
image: imageUrl,
HR: true,
},
},
);
// returns a FileOutput or URL string
outputUrl = typeof output === "string" ? output : (output as any)?.url ?? String(output);
}
} catch (err: any) {
console.error("[AI Media] Replicate error:", err);
return Response.json(
{ error: err?.message ?? "Replicate API call failed" },
{ status: 500 },
);
}
// --- Download and re-upload as new media document ---
let imageBuffer: Buffer;
try {
imageBuffer = await downloadImage(outputUrl);
} catch (err: any) {
console.error("[AI Media] Download error:", err);
return Response.json(
{ error: `Failed to download result: ${err?.message}` },
{ status: 500 },
);
}
const suffix = action === "upscale" ? "_upscaled" : "_recovered";
const originalName: string = mediaDoc.filename || "image.jpg";
const ext = originalName.includes(".") ? `.${originalName.split(".").pop()}` : ".jpg";
const baseName = originalName.includes(".")
? originalName.slice(0, originalName.lastIndexOf("."))
: originalName;
const newFilename = `${baseName}${suffix}${ext}`;
const originalAlt: string = mediaDoc.alt || originalName;
let newMedia: any;
try {
newMedia = await payload.create({
collection: "media",
data: {
alt: `${originalAlt}${suffix}`,
},
file: {
data: imageBuffer,
mimetype: mimeType,
name: newFilename,
size: imageBuffer.byteLength,
},
});
} catch (err: any) {
console.error("[AI Media] Upload error:", err);
return Response.json(
{ error: `Failed to save result: ${err?.message}` },
{ status: 500 },
);
}
console.log(`[AI Media] ${action} complete, new media ID: ${newMedia.id}`);
return Response.json(
{
message: `AI ${action} successful. New media document created.`,
mediaId: newMedia.id,
url: resolveMediaUrl(newMedia),
},
{ status: 200 },
);
};
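The suffix/extension split at the end of the handler is easy to get wrong for dotless filenames; a standalone sketch of that exact logic, extracted as a pure function:

```typescript
// Standalone sketch of the filename-suffix logic from replicateMediaHandler above.
// Files without an extension fall back to ".jpg", matching the handler.
function suffixFilename(originalName: string, suffix: string): string {
  const hasExt = originalName.includes(".");
  const ext = hasExt ? `.${originalName.split(".").pop()}` : ".jpg";
  const baseName = hasExt
    ? originalName.slice(0, originalName.lastIndexOf("."))
    : originalName;
  return `${baseName}${suffix}${ext}`;
}

console.log(suffixFilename("photo.png", "_upscaled")); // photo_upscaled.png
```

Note that multi-dot names keep only the last segment as the extension, so `a.tar.gz` becomes `a.tar_upscaled.gz`, the same behavior as the inline code above.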

View File

@@ -0,0 +1,30 @@
import type { GlobalConfig } from "payload";
export const AiSettings: GlobalConfig = {
slug: "ai-settings",
label: "AI Settings",
access: {
read: () => true, // Needed if the Next.js frontend or server actions need to fetch it
},
admin: {
group: "Configuration",
},
fields: [
{
name: "customSources",
type: "array",
label: "Custom Trusted Sources",
admin: {
description:
"List of trusted B2B/Tech sources (e.g. 'Vercel Blog', 'Fireship', 'Theo - t3.gg') the AI should prioritize when researching facts or videos. This overrides the hardcoded defaults.",
},
fields: [
{
name: "sourceName",
type: "text",
required: true,
label: "Channel or Publication Name",
},
],
},
],
};
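Elsewhere in this package (see optimizePostText) this global is consumed by mapping customSources down to plain names; a minimal sketch of that read path, with the TypeScript shape inferred from the field config above:

```typescript
// Sketch of how the ai-settings global is consumed elsewhere in this package.
// AiSettingsGlobal mirrors the customSources array field defined above.
interface AiSettingsGlobal {
  customSources?: { sourceName: string }[];
}

function getCustomSources(settings: AiSettingsGlobal | null | undefined): string[] {
  return settings?.customSources?.map((s) => s.sourceName) ?? [];
}

console.log(getCustomSources({ customSources: [{ sourceName: "Vercel Blog" }] }));
```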

View File

@@ -0,0 +1,15 @@
/**
* @mintel/payload-ai
* Primary entry point for reusing Mintel AI extensions in Payload CMS.
*/
export * from './globals/AiSettings';
export * from './actions/generateField';
export * from './actions/optimizePost';
export * from './components/FieldGenerators/AiFieldButton';
export * from './components/AiMediaButtons';
export * from './components/OptimizeButton';
export * from './components/FieldGenerators/GenerateThumbnailButton';
export * from './components/FieldGenerators/GenerateSlugButton';
export * from './utils/lexicalParser';
export * from './endpoints/replicateMediaEndpoint';

packages/payload-ai/src/types.d.ts (vendored)
View File

@@ -0,0 +1,5 @@
declare module "@payload-config" {
import { Config } from "payload";
const configPromise: Promise<Config>;
export default configPromise;
}

View File

@@ -0,0 +1,640 @@
/**
* Converts a Markdown+JSX string into a Lexical AST node array.
* Handles all registered Payload blocks and standard markdown formatting.
*/
function propValue(chunk: string, prop: string): string {
// Match prop="value" or prop='value' or prop={value}
const match =
chunk.match(new RegExp(`${prop}=["']([^"']+)["']`)) ||
chunk.match(new RegExp(`${prop}=\\{([^}]+)\\}`));
return match ? match[1] : "";
}
function innerContent(chunk: string, tag: string): string {
const match = chunk.match(
new RegExp(`<${tag}(?:\\s[^>]*)?>([\\s\\S]*?)<\\/${tag}>`),
);
return match ? match[1].trim() : "";
}
function blockNode(blockType: string, fields: Record<string, any>) {
return {
type: "block",
format: "",
version: 2,
fields: { blockType, ...fields },
};
}
export function parseMarkdownToLexical(markdown: string): any[] {
const textNode = (text: string) => ({
type: "paragraph",
format: "",
indent: 0,
version: 1,
children: [{ mode: "normal", type: "text", text, version: 1 }],
});
const nodes: any[] = [];
// Strip frontmatter
let content = markdown;
const fm = content.match(/^---\s*\n[\s\S]*?\n---/);
if (fm) content = content.replace(fm[0], "").trim();
// Pre-process: reassemble multi-line JSX tags that got split by double-newline chunking.
// This handles tags like <IconList>\n\n<IconListItem ... />\n\n</IconList>
content = reassembleMultiLineJSX(content);
const rawChunks = content.split(/\n\s*\n/);
for (let chunk of rawChunks) {
chunk = chunk.trim();
if (!chunk) continue;
// === Self-closing tags (no children) ===
// ArticleMeme / MemeCard
if (chunk.includes("<ArticleMeme") || chunk.includes("<MemeCard")) {
nodes.push(
blockNode("memeCard", {
template: propValue(chunk, "template"),
captions: propValue(chunk, "captions"),
}),
);
continue;
}
// BoldNumber
if (chunk.includes("<BoldNumber")) {
nodes.push(
blockNode("boldNumber", {
value: propValue(chunk, "value"),
label: propValue(chunk, "label"),
source: propValue(chunk, "source"),
sourceUrl: propValue(chunk, "sourceUrl"),
}),
);
continue;
}
// WebVitalsScore
if (chunk.includes("<WebVitalsScore")) {
nodes.push(
blockNode("webVitalsScore", {
lcp: parseFloat(propValue(chunk, "lcp")) || 0,
inp: parseFloat(propValue(chunk, "inp")) || 0,
cls: parseFloat(propValue(chunk, "cls")) || 0,
description: propValue(chunk, "description"),
}),
);
continue;
}
// LeadMagnet
if (chunk.includes("<LeadMagnet")) {
nodes.push(
blockNode("leadMagnet", {
title: propValue(chunk, "title"),
description: propValue(chunk, "description"),
buttonText: propValue(chunk, "buttonText") || "Jetzt anfragen",
href: propValue(chunk, "href") || "/contact",
variant: propValue(chunk, "variant") || "standard",
}),
);
continue;
}
// ComparisonRow
if (chunk.includes("<ComparisonRow")) {
nodes.push(
blockNode("comparisonRow", {
description: propValue(chunk, "description"),
negativeLabel: propValue(chunk, "negativeLabel"),
negativeText: propValue(chunk, "negativeText"),
positiveLabel: propValue(chunk, "positiveLabel"),
positiveText: propValue(chunk, "positiveText"),
reverse: chunk.includes("reverse={true}"),
}),
);
continue;
}
// StatsDisplay
if (chunk.includes("<StatsDisplay")) {
nodes.push(
blockNode("statsDisplay", {
label: propValue(chunk, "label"),
value: propValue(chunk, "value"),
subtext: propValue(chunk, "subtext"),
}),
);
continue;
}
// MetricBar
if (chunk.includes("<MetricBar")) {
nodes.push(
blockNode("metricBar", {
label: propValue(chunk, "label"),
value: parseFloat(propValue(chunk, "value")) || 0,
max: parseFloat(propValue(chunk, "max")) || 100,
unit: propValue(chunk, "unit") || "%",
}),
);
continue;
}
// ExternalLink
if (chunk.includes("<ExternalLink")) {
nodes.push(
blockNode("externalLink", {
href: propValue(chunk, "href"),
label:
propValue(chunk, "label") || innerContent(chunk, "ExternalLink"),
}),
);
continue;
}
// TrackedLink
if (chunk.includes("<TrackedLink")) {
nodes.push(
blockNode("trackedLink", {
href: propValue(chunk, "href"),
label:
propValue(chunk, "label") || innerContent(chunk, "TrackedLink"),
eventName: propValue(chunk, "eventName"),
}),
);
continue;
}
// YouTube
if (chunk.includes("<YouTubeEmbed")) {
nodes.push(
blockNode("youTubeEmbed", {
videoId: propValue(chunk, "videoId"),
}),
);
continue;
}
// LinkedIn
if (chunk.includes("<LinkedInEmbed")) {
nodes.push(
blockNode("linkedInEmbed", {
url: propValue(chunk, "url"),
}),
);
continue;
}
// Twitter
if (chunk.includes("<TwitterEmbed")) {
nodes.push(
blockNode("twitterEmbed", {
url: propValue(chunk, "url"),
}),
);
continue;
}
// Interactive (self-closing, defaults only)
if (chunk.includes("<RevenueLossCalculator")) {
nodes.push(
blockNode("revenueLossCalculator", {
title: propValue(chunk, "title") || "Performance Revenue Simulator",
}),
);
continue;
}
if (chunk.includes("<PerformanceChart")) {
nodes.push(
blockNode("performanceChart", {
title: propValue(chunk, "title") || "Website Performance",
}),
);
continue;
}
if (chunk.includes("<PerformanceROICalculator")) {
nodes.push(
blockNode("performanceROICalculator", {
baseConversionRate:
parseFloat(propValue(chunk, "baseConversionRate")) || 2.5,
monthlyVisitors:
parseInt(propValue(chunk, "monthlyVisitors"), 10) || 50000,
}),
);
continue;
}
if (chunk.includes("<LoadTimeSimulator")) {
nodes.push(
blockNode("loadTimeSimulator", {
initialLoadTime:
parseFloat(propValue(chunk, "initialLoadTime")) || 3.5,
}),
);
continue;
}
if (chunk.includes("<ArchitectureBuilder")) {
nodes.push(
blockNode("architectureBuilder", {
preset: propValue(chunk, "preset") || "standard",
}),
);
continue;
}
if (chunk.includes("<DigitalAssetVisualizer")) {
nodes.push(
blockNode("digitalAssetVisualizer", {
assetId: propValue(chunk, "assetId"),
}),
);
continue;
}
// === Tags with inner content ===
// TLDR
if (chunk.includes("<TLDR>")) {
const inner = innerContent(chunk, "TLDR");
if (inner) {
nodes.push(blockNode("mintelTldr", { content: inner }));
continue;
}
}
// Paragraph (handles <Paragraph>, <Paragraph ...attrs>)
if (/<Paragraph[\s>]/.test(chunk)) {
const inner = innerContent(chunk, "Paragraph");
if (inner) {
nodes.push(blockNode("mintelP", { text: inner }));
continue;
}
}
// H2 (handles <H2>, <H2 id="...">)
if (/<H2[\s>]/.test(chunk)) {
const inner = innerContent(chunk, "H2");
if (inner) {
nodes.push(
blockNode("mintelHeading", {
text: inner,
seoLevel: "h2",
displayLevel: "h2",
}),
);
continue;
}
}
// H3 (handles <H3>, <H3 id="...">)
if (/<H3[\s>]/.test(chunk)) {
const inner = innerContent(chunk, "H3");
if (inner) {
nodes.push(
blockNode("mintelHeading", {
text: inner,
seoLevel: "h3",
displayLevel: "h3",
}),
);
continue;
}
}
// Marker (inline highlight; when it appears outside a Paragraph, store it as a standalone block)
if (chunk.includes("<Marker>") && !chunk.includes("<Paragraph")) {
const inner = innerContent(chunk, "Marker");
if (inner) {
nodes.push(blockNode("marker", { text: inner }));
continue;
}
}
// LeadParagraph
if (chunk.includes("<LeadParagraph>")) {
const inner = innerContent(chunk, "LeadParagraph");
if (inner) {
nodes.push(blockNode("leadParagraph", { text: inner }));
continue;
}
}
// ArticleBlockquote
if (chunk.includes("<ArticleBlockquote")) {
nodes.push(
blockNode("articleBlockquote", {
quote: innerContent(chunk, "ArticleBlockquote"),
author: propValue(chunk, "author"),
role: propValue(chunk, "role"),
}),
);
continue;
}
// ArticleQuote
if (chunk.includes("<ArticleQuote")) {
nodes.push(
blockNode("articleQuote", {
quote:
innerContent(chunk, "ArticleQuote") || propValue(chunk, "quote"),
author: propValue(chunk, "author"),
role: propValue(chunk, "role"),
source: propValue(chunk, "source"),
}),
);
continue;
}
// Mermaid
if (chunk.includes("<Mermaid")) {
nodes.push(
blockNode("mermaid", {
id: propValue(chunk, "id") || `chart-${Date.now()}`,
title: propValue(chunk, "title"),
showShare: chunk.includes("showShare={true}"),
chartDefinition: innerContent(chunk, "Mermaid"),
}),
);
continue;
}
// Diagram variants (prefer inner definition, fall back to raw chunk text)
if (chunk.includes("<DiagramFlow")) {
nodes.push(
blockNode("diagramFlow", {
definition:
innerContent(chunk, "DiagramFlow") ||
propValue(chunk, "definition") ||
chunk,
}),
);
continue;
}
if (chunk.includes("<DiagramSequence")) {
nodes.push(
blockNode("diagramSequence", {
definition:
innerContent(chunk, "DiagramSequence") ||
propValue(chunk, "definition") ||
chunk,
}),
);
continue;
}
if (chunk.includes("<DiagramGantt")) {
nodes.push(
blockNode("diagramGantt", {
definition:
innerContent(chunk, "DiagramGantt") ||
propValue(chunk, "definition") ||
chunk,
}),
);
continue;
}
if (chunk.includes("<DiagramPie")) {
nodes.push(
blockNode("diagramPie", {
definition:
innerContent(chunk, "DiagramPie") ||
propValue(chunk, "definition") ||
chunk,
}),
);
continue;
}
if (chunk.includes("<DiagramState")) {
nodes.push(
blockNode("diagramState", {
definition:
innerContent(chunk, "DiagramState") ||
propValue(chunk, "definition") ||
chunk,
}),
);
continue;
}
if (chunk.includes("<DiagramTimeline")) {
nodes.push(
blockNode("diagramTimeline", {
definition:
innerContent(chunk, "DiagramTimeline") ||
propValue(chunk, "definition") ||
chunk,
}),
);
continue;
}
// Section (wrapping container; unwrap and parse inner content as top-level blocks)
if (chunk.includes("<Section")) {
const inner = innerContent(chunk, "Section");
if (inner) {
const innerNodes = parseMarkdownToLexical(inner);
nodes.push(...innerNodes);
}
continue;
}
// FAQSection (wrapping container)
if (chunk.includes("<FAQSection")) {
// FAQSection contains nested H3/Paragraph pairs.
// We extract them as individual blocks instead.
const faqContent = innerContent(chunk, "FAQSection");
if (faqContent) {
// Parse nested content recursively
const innerNodes = parseMarkdownToLexical(faqContent);
nodes.push(...innerNodes);
}
continue;
}
// IconList with IconListItems
if (chunk.includes("<IconList")) {
const items: any[] = [];
// Self-closing: <IconListItem icon="Check" title="..." description="..." />
const itemMatches = chunk.matchAll(/<IconListItem\s+([^>]*?)\/>/g);
for (const m of itemMatches) {
const attrs = m[1];
const title = (attrs.match(/title=["']([^"']+)["']/) || [])[1] || "";
const desc =
(attrs.match(/description=["']([^"']+)["']/) || [])[1] || "";
items.push({
icon: (attrs.match(/icon=["']([^"']+)["']/) || [])[1] || "Check",
title: title || "•",
description: desc,
});
}
// Content-wrapped: <IconListItem check>HTML content</IconListItem>
const itemMatches2 = chunk.matchAll(
/<IconListItem([^>]*)>([\s\S]*?)<\/IconListItem>/g,
);
for (const m of itemMatches2) {
const attrs = m[1] || "";
const innerHtml = m[2].trim();
// Use title attr if present, otherwise use inner HTML (stripped of tags) as title
const titleAttr = (attrs.match(/title=["']([^"']+)["']/) || [])[1];
const strippedInner = innerHtml.replace(/<[^>]+>/g, "").trim();
items.push({
icon: (attrs.match(/icon=["']([^"']+)["']/) || [])[1] || "Check",
title: titleAttr || strippedInner || "•",
description: "",
});
}
if (items.length > 0) {
nodes.push(blockNode("iconList", { items }));
continue;
}
}
// StatsGrid
if (chunk.includes("<StatsGrid")) {
const stats: any[] = [];
const statMatches = chunk.matchAll(/<StatItem\s+([^>]*?)\/>/g);
for (const m of statMatches) {
const attrs = m[1];
stats.push({
label: (attrs.match(/label=["']([^"']+)["']/) || [])[1] || "",
value: (attrs.match(/value=["']([^"']+)["']/) || [])[1] || "",
});
}
// Fallback when no self-closing <StatItem .../> tags matched
if (stats.length === 0) {
const innerStats = innerContent(chunk, "StatsGrid");
if (innerStats) {
// inner content exists but is not parseable; store an empty stats list
nodes.push(blockNode("statsGrid", { stats: [] }));
continue;
}
}
nodes.push(blockNode("statsGrid", { stats }));
continue;
}
// PremiumComparisonChart
if (chunk.includes("<PremiumComparisonChart")) {
nodes.push(
blockNode("premiumComparisonChart", {
title: propValue(chunk, "title"),
}),
);
continue;
}
// WaterfallChart
if (chunk.includes("<WaterfallChart")) {
nodes.push(
blockNode("waterfallChart", {
title: propValue(chunk, "title"),
}),
);
continue;
}
// Reveal (animation wrapper; unwrap and pass children through)
if (chunk.includes("<Reveal")) {
const inner = innerContent(chunk, "Reveal");
if (inner) {
// Parse inner content as regular nodes
const innerNodes = parseMarkdownToLexical(inner);
nodes.push(...innerNodes);
continue;
}
}
// Standalone IconListItem (outside IconList context)
if (chunk.includes("<IconListItem")) {
// Skip: these should be inside an IconList
continue;
}
// Skip wrapper divs (like <div className="my-8">)
if (/^<div\s/.test(chunk) || chunk === "</div>") {
continue;
}
// CarouselBlock
if (chunk.includes("<Carousel")) {
const slides: any[] = [];
const slideMatches = chunk.matchAll(/<Slide\s+([^>]*?)\/>/g);
for (const m of slideMatches) {
const attrs = m[1];
slides.push({
image: (attrs.match(/image=["']([^"']+)["']/) || [])[1] || "",
caption: (attrs.match(/caption=["']([^"']+)["']/) || [])[1] || "",
});
}
if (slides.length > 0) {
nodes.push(blockNode("carousel", { slides }));
continue;
}
}
// === Standard Markdown ===
// Headings
const headingMatch = chunk.match(/^(#{1,6})\s+(.*)/);
if (headingMatch) {
nodes.push({
type: "heading",
tag: `h${headingMatch[1].length}`,
format: "",
indent: 0,
version: 1,
direction: "ltr",
children: [
{ mode: "normal", type: "text", text: headingMatch[2], version: 1 },
],
});
continue;
}
// Default: plain text paragraph
nodes.push(textNode(chunk));
}
return nodes;
}
/**
* Reassembles multi-line JSX tags that span across double-newline boundaries.
* E.g. <IconList>\n\n<IconListItem.../>\n\n</IconList> becomes a single chunk.
*/
function reassembleMultiLineJSX(content: string): string {
// Tags that wrap other content across paragraph breaks
const wrapperTags = [
"IconList",
"StatsGrid",
"FAQSection",
"Section",
"Reveal",
"Carousel",
];
for (const tag of wrapperTags) {
const openRegex = new RegExp(`<${tag}[^>]*>`, "g");
let match;
while ((match = openRegex.exec(content)) !== null) {
const openPos = match.index;
const closeTag = `</${tag}>`;
const closePos = content.indexOf(closeTag, openPos);
if (closePos === -1) continue;
const fullEnd = closePos + closeTag.length;
const fullBlock = content.substring(openPos, fullEnd);
// Replace double newlines inside this block with single newlines
// so it stays as one chunk during splitting
const collapsed = fullBlock.replace(/\n\s*\n/g, "\n");
content =
content.substring(0, openPos) + collapsed + content.substring(fullEnd);
// Adjust regex position
openRegex.lastIndex = openPos + collapsed.length;
}
}
return content;
}

View File

@@ -0,0 +1,30 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"module": "ESNext",
"moduleResolution": "bundler",
"target": "ES2022",
"lib": [
"ES2022",
"DOM",
"DOM.Iterable"
],
"jsx": "react-jsx",
"outDir": "dist",
"rootDir": "src",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"declaration": true,
"sourceMap": true
},
"include": [
"src/**/*"
],
"exclude": [
"node_modules",
"dist",
"**/*.test.ts"
]
}

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/pdf",
"version": "1.9.1",
"version": "1.9.7",
"type": "module",
"main": "dist/index.js",
"module": "dist/index.js",

4
packages/seo-engine/.gitignore vendored Normal file
View File

@@ -0,0 +1,4 @@
node_modules
dist
.seo-output
.env

View File

@@ -0,0 +1,123 @@
# @mintel/seo-engine
AI-powered SEO keyword discovery, topic clustering, competitor analysis, and content gap identification — grounded in real search data, zero hallucinations.
## Architecture
```
ProjectContext + SeoConfig
┌──────────────────────────────────────────┐
│ SEO Engine Orchestrator │
│ │
│ 1. Seed Query Expansion │
│ (company + industry + seedKeywords) │
│ │
│ 2. Data Collection (parallel) │
│ ├── Serper Search Agent │
│ │ (related searches, PAA, │
│ │ organic snippets, volume proxy) │
│ ├── Serper Autocomplete Agent │
│ │ (long-tail suggestions) │
│ └── Serper Competitor Agent │
│ (top-10 SERP positions) │
│ │
│ 3. LLM Evaluation (Gemini/Claude) │
│ → Strict context filtering │
│ → Topic Clustering + Intent Mapping │
│ │
│ 4. Content Gap Analysis (LLM) │
│ → Compare clusters vs existing pages │
│ → Identify missing content │
└──────────────────────────────────────────┘
SeoEngineOutput
(clusters, gaps, competitors, discarded)
```
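The seed expansion in step 1 roughly follows this shape — a sketch based on the orchestrator; `deriveSeedQueries` is an illustrative name, not a public export:

```typescript
// Sketch of step 1: derive seed queries from the project context.
// Mirrors the orchestrator: company name, industry, up to two competitor
// domains, and any explicit seed keywords.
interface SeedContext {
  companyName?: string;
  industry?: string;
  competitors?: string[];
  seedKeywords?: string[];
}

function deriveSeedQueries(ctx: SeedContext): string[] {
  const seeds: string[] = [];
  if (ctx.companyName) seeds.push(ctx.companyName);
  if (ctx.industry) seeds.push(ctx.industry);
  if (ctx.competitors?.length) seeds.push(...ctx.competitors.slice(0, 2));
  if (ctx.seedKeywords?.length) seeds.push(...ctx.seedKeywords);
  if (seeds.length === 0) {
    throw new Error(
      "Provide at least an industry, company name, or seedKeywords.",
    );
  }
  return seeds;
}

const seeds = deriveSeedQueries({
  companyName: "KLZ Cables",
  competitors: ["nkt.de", "faberkabel.de", "third.example"],
  seedKeywords: ["NA2XS2Y"],
});
console.log(seeds); // ["KLZ Cables", "nkt.de", "faberkabel.de", "NA2XS2Y"]
```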
## Quick Start
```typescript
import { runSeoEngine } from "@mintel/seo-engine";
const result = await runSeoEngine(
{
companyName: "KLZ Cables",
industry: "Mittelspannungskabel, Kabeltiefbau",
briefing: "B2B provider of specialized medium-voltage cables.",
targetAudience: "Bauleiter, Netzbetreiber",
competitors: ["nkt.de", "faberkabel.de"],
seedKeywords: ["NA2XS2Y", "VPE-isoliert"],
existingPages: [
{ url: "/produkte", title: "Produkte" },
{ url: "/kontakt", title: "Kontakt" },
],
locale: { gl: "de", hl: "de" },
},
{
serperApiKey: process.env.SERPER_API_KEY!,
openRouterApiKey: process.env.OPENROUTER_API_KEY!,
},
);
```
## Configuration
### `ProjectContext`
| Field | Type | Description |
| ------------------ | ----------------------------- | ------------------------------------------- |
| `companyName` | `string?` | Client company name |
| `industry` | `string?` | Industry / main focus keywords |
| `briefing` | `string?` | Project briefing text |
| `targetAudience` | `string?` | Who the content targets |
| `competitors` | `string[]?` | Competitor domains to analyze |
| `seedKeywords` | `string[]?` | Explicit seed keywords beyond auto-derived |
| `existingPages` | `{ url, title }[]?` | Current site pages for content gap analysis |
| `customGuidelines` | `string?` | Extra strict filtering rules for the LLM |
| `locale` | `{ gl: string, hl: string }?` | Google locale (default: `de`) |
### `SeoConfig`
| Field | Type | Description |
| ------------------ | --------- | -------------------------------------------- |
| `serperApiKey` | `string` | **Required.** Serper API key |
| `openRouterApiKey` | `string` | **Required.** OpenRouter API key |
| `model` | `string?` | LLM model (default: `google/gemini-2.5-pro`) |
| `maxKeywords` | `number?` | Cap total keywords returned |
## Output
```typescript
interface SeoEngineOutput {
topicClusters: TopicCluster[]; // Grouped keywords with intent + scores
competitorRankings: CompetitorRanking[]; // Who ranks for your terms
contentGaps: ContentGap[]; // Missing pages you should create
discardedTerms: string[]; // Terms filtered out (with reasons)
}
```
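A typical way to consume the output is to act on the high-priority gaps first, mirroring the rule the draft generator applies. A sketch (the `ContentGap` fields follow those used elsewhere in this package; treat the exact shape as illustrative):

```typescript
// Sketch: pick the content gaps worth drafting first.
interface ContentGap {
  targetKeyword: string;
  recommendedTitle: string;
  relatedCluster: string;
  rationale: string;
  priority: "high" | "medium" | "low";
}

function draftQueue(gaps: ContentGap[]): ContentGap[] {
  // Same rule as the draft generator: skip "low", surface "high" first.
  return gaps
    .filter((g) => g.priority !== "low")
    .sort((a, b) =>
      a.priority === b.priority ? 0 : a.priority === "high" ? -1 : 1,
    );
}

const queue = draftQueue([
  { targetKeyword: "a", recommendedTitle: "A", relatedCluster: "c1", rationale: "", priority: "low" },
  { targetKeyword: "b", recommendedTitle: "B", relatedCluster: "c1", rationale: "", priority: "medium" },
  { targetKeyword: "c", recommendedTitle: "C", relatedCluster: "c2", rationale: "", priority: "high" },
]);
console.log(queue.map((g) => g.targetKeyword)); // ["c", "b"]
```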
## Agents
| Agent | Source | Data |
| --------------------- | ---------------------- | ----------------------------------- |
| `serper-agent` | Serper `/search` | Related searches, PAA, snippets |
| `serper-autocomplete` | Serper `/autocomplete` | Google Autocomplete long-tail terms |
| `serper-competitors` | Serper `/search` | Competitor SERP positions |
## API Keys
- **Serper** — [serper.dev](https://serper.dev) (pay-per-search, ~$0.001/query)
- **OpenRouter** — [openrouter.ai](https://openrouter.ai) (pay-per-token)
No monthly subscriptions. Pure pay-on-demand.
## Development
```bash
pnpm install # from monorepo root
pnpm --filter @mintel/seo-engine build
npx tsx src/test-run.ts # smoke test (needs API keys in .env)
```

View File

@@ -0,0 +1,37 @@
{
"name": "@mintel/seo-engine",
"version": "1.9.7",
"private": true,
"description": "AI-powered SEO keyword and topic cluster evaluation engine",
"type": "module",
"main": "./dist/index.js",
"module": "./dist/index.js",
"types": "./dist/index.d.ts",
"exports": {
".": {
"types": "./dist/index.d.ts",
"import": "./dist/index.js"
}
},
"scripts": {
"build": "tsup",
"dev": "tsup --watch",
"test": "vitest run --passWithNoTests",
"clean": "rm -rf dist",
"lint": "eslint src --ext .ts"
},
"dependencies": {
"axios": "^1.7.9",
"cheerio": "1.0.0-rc.12",
"dotenv": "^16.4.7"
},
"devDependencies": {
"@mintel/eslint-config": "workspace:*",
"@mintel/tsconfig": "workspace:*",
"@types/node": "^20.17.17",
"tsup": "^8.3.6",
"tsx": "^4.19.2",
"typescript": "^5.7.3",
"vitest": "^3.0.5"
}
}

View File

@@ -0,0 +1,132 @@
import axios from "axios";
import * as cheerio from "cheerio";
import { llmJsonRequest } from "../llm-client.js";
export interface ScrapedContext {
url: string;
wordCount: number;
text: string;
headings: { level: number; text: string }[];
}
export interface ReverseEngineeredBriefing {
recommendedWordCount: number;
coreTopicsToCover: string[];
suggestedHeadings: string[];
entitiesToInclude: string[];
contentFormat: string; // e.g. "Lange Liste mit Fakten", "Kaufberater", "Lexikon-Eintrag"
}
/**
* Fetches the HTML of a URL and extracts the main readable text and headings.
*/
export async function scrapeCompetitorUrl(
url: string,
): Promise<ScrapedContext | null> {
try {
console.log(`[Scraper] Fetching source: ${url}`);
const response = await axios.get(url, {
headers: {
"User-Agent":
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36",
},
timeout: 10000,
});
const $ = cheerio.load(response.data);
// Remove junk elements before extracting text
$(
"script, style, nav, footer, header, aside, .cookie, .banner, iframe",
).remove();
const headings: { level: number; text: string }[] = [];
$(":header").each((_, el) => {
const level = parseInt(el.tagName.replace(/h/i, ""), 10);
const text = $(el).text().trim().replace(/\s+/g, " ");
if (text) headings.push({ level, text });
});
// Extract body text, removing excessive whitespace
const text = $("body").text().replace(/\s+/g, " ").trim();
const wordCount = text.split(" ").length;
return {
url,
text: text.slice(0, 15000), // Cap length to prevent blowing up the LLM token limit
wordCount,
headings,
};
} catch (err) {
console.error(
`[Scraper] Failed to scrape ${url}: ${(err as Error).message}`,
);
return null;
}
}
const BRIEFING_SYSTEM_PROMPT = `
You are a Senior Technical SEO Strategist.
I will give you the scraped text and headings of a competitor's article that currently ranks #1 on Google for our target keyword.
### OBJECTIVE:
Reverse engineer the content. Tell me EXACTLY what topics, entities, and headings we must include
in our own article to beat this competitor.
Do not just copy their headings. Distill the *core intent* and *required knowledge depth*.
### RULES:
- If the text is very short (e.g., an e-commerce category page), mention that the format is "Category Page" and recommend a word count +50% higher than theirs.
- Extract hyper-specific entities (e.g. DIN norms, specific materials, specific processes) that prove topic authority.
- LANGUAGE: Match the language of the provided text.
### OUTPUT FORMAT:
{
"recommendedWordCount": number,
"coreTopicsToCover": ["string"],
"suggestedHeadings": ["string"],
"entitiesToInclude": ["string"],
"contentFormat": "string"
}
`;
/**
* Analyzes the scraped context using an LLM to generate a blueprint to beat the competitor.
*/
export async function analyzeCompetitorContent(
context: ScrapedContext,
targetKeyword: string,
config: { openRouterApiKey: string; model?: string },
): Promise<ReverseEngineeredBriefing | null> {
const userPrompt = `
TARGET KEYWORD TO BEAT: "${targetKeyword}"
COMPETITOR URL: ${context.url}
COMPETITOR WORD COUNT: ${context.wordCount}
COMPETITOR HEADINGS:
${context.headings.map((h) => `H${h.level}: ${h.text}`).join("\n")}
COMPETITOR TEXT (Truncated):
${context.text}
`;
try {
const { data } = await llmJsonRequest<ReverseEngineeredBriefing>({
model: config.model || "google/gemini-2.5-pro",
apiKey: config.openRouterApiKey,
systemPrompt: BRIEFING_SYSTEM_PROMPT,
userPrompt,
});
// Ensure numbers are numbers
data.recommendedWordCount =
Number(data.recommendedWordCount) || context.wordCount + 300;
return data;
} catch (err) {
console.error(
`[Scraper] NLP Analysis failed for ${context.url}:`,
(err as Error).message,
);
return null;
}
}

View File

@@ -0,0 +1,64 @@
import axios from "axios";
export interface SerperResult {
relatedSearches: string[];
peopleAlsoAsk: string[];
organicSnippets: string[];
estimatedTotalResults: number;
}
/**
* Fetch Google search data via Serper's /search endpoint.
* Extracts related searches, People Also Ask, organic snippets,
* and totalResults as a search volume proxy.
*/
export async function fetchSerperData(
query: string,
apiKey: string,
locale: { gl: string; hl: string } = { gl: "de", hl: "de" },
): Promise<SerperResult> {
try {
const response = await axios.post(
"https://google.serper.dev/search",
{
q: query,
gl: locale.gl,
hl: locale.hl,
},
{
headers: {
"X-API-KEY": apiKey,
"Content-Type": "application/json",
},
},
);
const data = response.data;
const relatedSearches =
data.relatedSearches?.map((r: any) => r.query) || [];
const peopleAlsoAsk = data.peopleAlsoAsk?.map((p: any) => p.question) || [];
const organicSnippets = data.organic?.map((o: any) => o.snippet) || [];
const estimatedTotalResults = data.searchInformation?.totalResults
? parseInt(data.searchInformation.totalResults, 10)
: 0;
return {
relatedSearches,
peopleAlsoAsk,
organicSnippets,
estimatedTotalResults,
};
} catch (error) {
console.error(
`Serper API error for query "${query}":`,
(error as Error).message,
);
return {
relatedSearches: [],
peopleAlsoAsk: [],
organicSnippets: [],
estimatedTotalResults: 0,
};
}
}

View File

@@ -0,0 +1,43 @@
import axios from "axios";
export interface AutocompleteResult {
suggestions: string[];
}
/**
* Fetch Google Autocomplete suggestions via Serper's /autocomplete endpoint.
* These represent real user typing behavior — extremely high-signal for long-tail keywords.
*/
export async function fetchAutocompleteSuggestions(
query: string,
apiKey: string,
locale: { gl: string; hl: string } = { gl: "de", hl: "de" },
): Promise<AutocompleteResult> {
try {
const response = await axios.post(
"https://google.serper.dev/autocomplete",
{
q: query,
gl: locale.gl,
hl: locale.hl,
},
{
headers: {
"X-API-KEY": apiKey,
"Content-Type": "application/json",
},
},
);
const suggestions =
response.data.suggestions?.map((s: any) => s.value || s) || [];
return { suggestions };
} catch (error) {
console.error(
`Serper Autocomplete error for query "${query}":`,
(error as Error).message,
);
return { suggestions: [] };
}
}

View File

@@ -0,0 +1,75 @@
import axios from "axios";
export interface CompetitorRanking {
keyword: string;
domain: string;
position: number;
title: string;
snippet: string;
link: string;
}
/**
* For a given keyword, check which competitor domains appear in the top organic results.
* Filters results to only include domains in the `competitorDomains` list.
*/
export async function fetchCompetitorRankings(
keyword: string,
competitorDomains: string[],
apiKey: string,
locale: { gl: string; hl: string } = { gl: "de", hl: "de" },
): Promise<CompetitorRanking[]> {
if (competitorDomains.length === 0) return [];
try {
const response = await axios.post(
"https://google.serper.dev/search",
{
q: keyword,
gl: locale.gl,
hl: locale.hl,
num: 20,
},
{
headers: {
"X-API-KEY": apiKey,
"Content-Type": "application/json",
},
},
);
const organic: any[] = response.data.organic || [];
// Normalize competitor domains for matching
const normalizedCompetitors = competitorDomains.map((d) =>
d
.replace(/^(https?:\/\/)?(www\.)?/, "")
.replace(/\/$/, "")
.toLowerCase(),
);
return organic
.filter((result: any) => {
const resultDomain = new URL(result.link).hostname
.replace(/^www\./, "")
.toLowerCase();
return normalizedCompetitors.some(
(cd) => resultDomain === cd || resultDomain.endsWith(`.${cd}`),
);
})
.map((result: any) => ({
keyword,
domain: new URL(result.link).hostname.replace(/^www\./, ""),
position: result.position,
title: result.title || "",
snippet: result.snippet || "",
link: result.link,
}));
} catch (error) {
console.error(
`Serper Competitor check error for keyword "${keyword}":`,
(error as Error).message,
);
return [];
}
}
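The domain-matching rule above can be checked in isolation: a result counts as a competitor hit when its hostname equals a normalized competitor domain or is a subdomain of it. A standalone sketch (`matchesCompetitor` is an illustrative helper, not an export):

```typescript
// Sketch of the competitor-domain match used when filtering organic results.
function matchesCompetitor(link: string, competitorDomains: string[]): boolean {
  // Normalize competitor domains: strip scheme, "www.", trailing slash.
  const normalized = competitorDomains.map((d) =>
    d
      .replace(/^(https?:\/\/)?(www\.)?/, "")
      .replace(/\/$/, "")
      .toLowerCase(),
  );
  const host = new URL(link).hostname.replace(/^www\./, "").toLowerCase();
  // Exact match or subdomain match (".nkt.de" suffix).
  return normalized.some((cd) => host === cd || host.endsWith(`.${cd}`));
}

console.log(matchesCompetitor("https://www.nkt.de/produkte", ["https://nkt.de/"])); // true
console.log(matchesCompetitor("https://shop.nkt.de/x", ["nkt.de"])); // true
console.log(matchesCompetitor("https://notnkt.de/x", ["nkt.de"])); // false
```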

View File

@@ -0,0 +1,148 @@
import * as fs from "node:fs/promises";
import * as path from "node:path";
import type { ContentGap } from "./types.js";
import type { ReverseEngineeredBriefing } from "./agents/scraper.js";
export interface FileEditorConfig {
outputDir: string;
authorName?: string;
}
/**
* Generates an SEO-friendly URL slug from a title.
*/
function createSlug(title: string): string {
return title
.toLowerCase()
.replace(/ä/g, "ae")
.replace(/ö/g, "oe")
.replace(/ü/g, "ue")
.replace(/ß/g, "ss")
.replace(/[^a-z0-9]+/g, "-")
.replace(/^-+|-+$/g, "");
}
/**
* Automatically creates local .mdx draft files for identified high-priority content gaps.
* Each file is self-explanatory: it tells the writer exactly WHY this page needs to exist,
* WHAT to write, and HOW to structure the content — all based on real competitor data.
*/
export async function createGapDrafts(
gaps: ContentGap[],
briefings: Map<string, ReverseEngineeredBriefing>,
config: FileEditorConfig,
): Promise<string[]> {
const createdFiles: string[] = [];
try {
await fs.mkdir(path.resolve(process.cwd(), config.outputDir), {
recursive: true,
});
} catch (e) {
console.error(
`[File Editor] Could not create directory ${config.outputDir}:`,
e,
);
return [];
}
const dateStr = new Date().toISOString().split("T")[0];
for (const gap of gaps) {
if (gap.priority === "low") continue;
const slug = createSlug(gap.recommendedTitle);
const filePath = path.join(
path.resolve(process.cwd(), config.outputDir),
`${slug}.mdx`,
);
const briefing = briefings.get(gap.targetKeyword);
const priorityEmoji = gap.priority === "high" ? "🔴" : "🟡";
let body = "";
// ── Intro: Explain WHY this file exists ──
body += `{/* ═══════════════════════════════════════════════════════════════════\n`;
body += ` 📋 SEO CONTENT BRIEFING — Auto-generated by @mintel/seo-engine\n`;
body += ` ═══════════════════════════════════════════════════════════════════\n\n`;
body += ` Dieses Dokument wurde automatisch erstellt.\n`;
body += ` Es basiert auf einer Analyse der aktuellen Google-Suchergebnisse\n`;
body += ` und der Webseiten deiner Konkurrenz.\n\n`;
body += ` ▸ Du kannst dieses File direkt als MDX-Seite verwenden.\n`;
body += ` ▸ Ersetze den Briefing-Block unten durch deinen eigenen Text.\n`;
body += ` ▸ Setze isDraft auf false, wenn der Text fertig ist.\n`;
body += ` ═══════════════════════════════════════════════════════════════════ */}\n\n`;
// ── Section 1: Warum diese Seite? ──
body += `## ${priorityEmoji} Warum diese Seite erstellt werden sollte\n\n`;
body += `**Priorität:** ${gap.priority === "high" ? "Hoch — Direkt umsatzrelevant" : "Mittel — Stärkt die thematische Autorität"}\n\n`;
body += `${gap.rationale}\n\n`;
body += `| Feld | Wert |\n`;
body += `|------|------|\n`;
body += `| **Focus Keyword** | \`${gap.targetKeyword}\` |\n`;
body += `| **Topic Cluster** | ${gap.relatedCluster} |\n`;
body += `| **Priorität** | ${gap.priority} |\n\n`;
// ── Section 2: Competitor Briefing ──
if (briefing) {
body += `## 🔍 Konkurrenz-Analyse (Reverse Engineered)\n\n`;
body += `> Die folgenden Daten stammen aus einer automatischen Analyse der Webseite,\n`;
body += `> die aktuell auf **Platz 1 bei Google** für das Keyword \`${gap.targetKeyword}\` rankt.\n`;
body += `> Nutze diese Informationen, um **besseren Content** zu schreiben.\n\n`;
body += `### Content-Format des Konkurrenten\n\n`;
body += `**${briefing.contentFormat}** — Empfohlene Mindestlänge: **~${briefing.recommendedWordCount} Wörter**\n\n`;
body += `### Diese Themen MUSS dein Artikel abdecken\n\n`;
body += `Die folgenden Punkte werden vom aktuellen Platz-1-Ranker behandelt. Wenn dein Artikel diese Themen nicht abdeckt, wird es schwer, ihn zu überholen:\n\n`;
briefing.coreTopicsToCover.forEach(
(t, i) => (body += `${i + 1}. ${t}\n`),
);
body += `\n### Fachbegriffe & Entitäten die im Text vorkommen müssen\n\n`;
body += `Diese Begriffe signalisieren Google, dass dein Text fachlich tiefgreifend ist. Versuche, möglichst viele davon natürlich in deinen Text einzubauen:\n\n`;
briefing.entitiesToInclude.forEach((e) => (body += `- \`${e}\`\n`));
body += `\n### Empfohlene Gliederung\n\n`;
body += `Orientiere dich an dieser Struktur (du kannst sie anpassen):\n\n`;
briefing.suggestedHeadings.forEach(
(h, i) => (body += `${i + 1}. **${h}**\n`),
);
} else {
body += `## 🔍 Konkurrenz-Analyse\n\n`;
body += `> Für dieses Keyword konnte kein Konkurrent gescraped werden.\n`;
body += `> Schreibe den Artikel trotzdem — du hast weniger Wettbewerb!\n`;
}
body += `\n---\n\n`;
body += `## ✍️ Dein Content (hier schreiben)\n\n`;
body += `{/* Lösche alles oberhalb dieser Zeile, wenn dein Text fertig ist. */}\n\n`;
body += `Hier beginnt dein eigentlicher Artikel...\n`;
const file = `---
title: "${gap.recommendedTitle}"
description: "TODO: Meta-Description mit dem Keyword '${gap.targetKeyword}' schreiben."
date: "${dateStr}"
author: "${config.authorName || "Mintel SEO Engine"}"
tags: ["${gap.relatedCluster}"]
isDraft: true
focus_keyword: "${gap.targetKeyword}"
---
${body}`;
try {
await fs.writeFile(filePath, file, "utf8");
console.log(`[File Editor] Created draft: ${filePath}`);
createdFiles.push(filePath);
} catch (err) {
console.error(
`[File Editor] Failed to write ${filePath}:`,
(err as Error).message,
);
}
}
return createdFiles;
}
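The slug rule at the top of this file (lowercase, transliterate German umlauts, collapse everything else to hyphens) can be verified standalone; this sketch copies the function verbatim:

```typescript
// Copy of createSlug for a standalone check of the transliteration order:
// lowercase first, then umlaut/ß replacement, then hyphen collapse and trim.
function createSlug(title: string): string {
  return title
    .toLowerCase()
    .replace(/ä/g, "ae")
    .replace(/ö/g, "oe")
    .replace(/ü/g, "ue")
    .replace(/ß/g, "ss")
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}

console.log(createSlug("Mittelspannungskabel für Bauämter: Größen & Maße"));
// → "mittelspannungskabel-fuer-bauaemter-groessen-masse"
```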

View File

@@ -0,0 +1,237 @@
import { llmJsonRequest } from "./llm-client.js";
import { fetchSerperData } from "./agents/serper-agent.js";
import { fetchAutocompleteSuggestions } from "./agents/serper-autocomplete.js";
import {
fetchCompetitorRankings,
type CompetitorRanking,
} from "./agents/serper-competitors.js";
import {
scrapeCompetitorUrl,
analyzeCompetitorContent,
type ReverseEngineeredBriefing,
} from "./agents/scraper.js";
import { analyzeContentGaps, type ContentGap } from "./steps/content-gap.js";
import { SEO_SYSTEM_PROMPT } from "./prompts.js";
import type {
ProjectContext,
SeoConfig,
SeoEngineOutput,
TopicCluster,
} from "./types.js";
const DEFAULT_MODEL = "google/gemini-2.5-pro";
export async function runSeoEngine(
context: ProjectContext,
config: SeoConfig,
): Promise<SeoEngineOutput> {
if (!config.serperApiKey)
throw new Error("Missing Serper API Key in SeoConfig.");
if (!config.openRouterApiKey)
throw new Error("Missing OpenRouter API Key in SeoConfig.");
const locale = context.locale || { gl: "de", hl: "de" };
const seedQueries: string[] = [];
// Derive seed queries from context
if (context.companyName) seedQueries.push(context.companyName);
if (context.industry) seedQueries.push(context.industry);
if (context.competitors && context.competitors.length > 0) {
seedQueries.push(...context.competitors.slice(0, 2));
}
if (context.seedKeywords && context.seedKeywords.length > 0) {
seedQueries.push(...context.seedKeywords);
}
if (seedQueries.length === 0) {
throw new Error(
"ProjectContext must provide at least an industry, company name, or seedKeywords.",
);
}
console.log(
`[SEO Engine] Sourcing raw data for ${seedQueries.length} seeds: ${seedQueries.join(", ")}`,
);
// ──────────────────────────────────────────────
// Step 1: Google Search Data + Autocomplete (parallel per seed)
// ──────────────────────────────────────────────
const rawSearchData = new Set<string>();
const allAutocompleteSuggestions = new Set<string>();
const volumeMap = new Map<string, number>(); // keyword → totalResults
const searchPromises = seedQueries.map(async (query) => {
const [searchResult, autocompleteResult] = await Promise.all([
fetchSerperData(query, config.serperApiKey!, locale),
fetchAutocompleteSuggestions(query, config.serperApiKey!, locale),
]);
searchResult.relatedSearches.forEach((r) => rawSearchData.add(r));
searchResult.peopleAlsoAsk.forEach((p) => rawSearchData.add(p));
searchResult.organicSnippets.forEach((o) => rawSearchData.add(o));
autocompleteResult.suggestions.forEach((s) => {
rawSearchData.add(s);
allAutocompleteSuggestions.add(s);
});
if (searchResult.estimatedTotalResults > 0) {
volumeMap.set(query, searchResult.estimatedTotalResults);
}
});
await Promise.all(searchPromises);
const rawTerms = Array.from(rawSearchData);
console.log(
`[SEO Engine] Sourced ${rawTerms.length} raw terms (incl. ${allAutocompleteSuggestions.size} autocomplete). Evaluating with LLM...`,
);
// ──────────────────────────────────────────────
// Step 2: LLM Evaluation + Topic Clustering
// ──────────────────────────────────────────────
const userPrompt = `
PROJECT CONTEXT:
CompanyName: ${context.companyName || "N/A"}
Industry / Main Focus: ${context.industry || "N/A"}
Briefing Summary: ${context.briefing || "N/A"}
Target Audience: ${context.targetAudience || "N/A"}
Known Competitors: ${context.competitors?.join(", ") || "N/A"}
EXTRA STRICT GUIDELINES:
${context.customGuidelines || "None. Apply standard Mintel strict adherence."}
RAW SEARCH TERMS SOURCED FROM GOOGLE (incl. autocomplete, PAA, related, snippets):
${rawTerms.map((t, i) => `${i + 1}. ${t}`).join("\n")}
EVALUATE AND CLUSTER STRICTLY ACCORDING TO SYSTEM INSTRUCTIONS.
`;
const { data: clusterData } = await llmJsonRequest<{
topicClusters: TopicCluster[];
discardedTerms: string[];
}>({
model: config.model || DEFAULT_MODEL,
apiKey: config.openRouterApiKey,
systemPrompt: SEO_SYSTEM_PROMPT,
userPrompt,
});
const topicClusters = clusterData.topicClusters || [];
const discardedTerms = clusterData.discardedTerms || [];
// Attach volume estimates based on totalResults proxy
for (const cluster of topicClusters) {
for (const kw of cluster.secondaryKeywords) {
const vol = volumeMap.get(kw.term);
if (vol !== undefined) {
kw.estimatedVolume =
vol > 1_000_000 ? "high" : vol > 100_000 ? "medium" : "low";
}
}
}
console.log(
`[SEO Engine] LLM clustered ${topicClusters.reduce((a, c) => a + c.secondaryKeywords.length + 1, 0)} keywords into ${topicClusters.length} clusters. Discarded ${discardedTerms.length}.`,
);
// ──────────────────────────────────────────────
// Step 3 & 4: Competitor SERP Analysis & Content Scraping
// ──────────────────────────────────────────────
let competitorRankings: CompetitorRanking[] = [];
const competitorBriefings: Record<string, ReverseEngineeredBriefing> = {};
if (context.competitors && context.competitors.length > 0) {
const primaryKeywords = topicClusters
.map((c) => c.primaryKeyword)
.slice(0, 5);
console.log(
`[SEO Engine] Checking competitor rankings for: ${primaryKeywords.join(", ")}`,
);
const competitorPromises = primaryKeywords.map((kw) =>
fetchCompetitorRankings(
kw,
context.competitors!,
config.serperApiKey!,
locale,
),
);
const results = await Promise.all(competitorPromises);
competitorRankings = results.flat();
console.log(
`[SEO Engine] Found ${competitorRankings.length} competitor rankings.`,
);
// Pick top ranking competitor for each primary keyword to reverse engineer
console.log(`[SEO Engine] Reverse engineering top competitor content...`);
const scrapePromises = primaryKeywords.map(async (kw) => {
const topRanking = competitorRankings.find((r) => r.keyword === kw);
if (!topRanking) return null;
const scraped = await scrapeCompetitorUrl(topRanking.link);
if (!scraped) return null;
const briefing = await analyzeCompetitorContent(scraped, kw, {
openRouterApiKey: config.openRouterApiKey!,
model: config.model,
});
if (briefing) {
competitorBriefings[kw] = briefing;
}
});
await Promise.all(scrapePromises);
console.log(
`[SEO Engine] Generated ${Object.keys(competitorBriefings).length} competitor briefings.`,
);
}
// ──────────────────────────────────────────────
// Step 5: Content Gap Analysis
// ──────────────────────────────────────────────
let contentGaps: ContentGap[] = [];
if (context.existingPages && context.existingPages.length > 0) {
console.log(
`[SEO Engine] Analyzing content gaps against ${context.existingPages.length} existing pages...`,
);
contentGaps = await analyzeContentGaps(
topicClusters,
context.existingPages,
{
openRouterApiKey: config.openRouterApiKey,
model: config.model,
},
);
console.log(`[SEO Engine] Found ${contentGaps.length} content gaps.`);
}
// ──────────────────────────────────────────────
// Optional Keyword Cap
// ──────────────────────────────────────────────
if (config.maxKeywords) {
let count = 0;
for (const cluster of topicClusters) {
cluster.secondaryKeywords = cluster.secondaryKeywords.filter(() => {
if (count < config.maxKeywords!) {
count++;
return true;
}
return false;
});
}
}
console.log(`[SEO Engine] ✅ Complete.`);
return {
topicClusters,
competitorRankings,
competitorBriefings,
contentGaps,
autocompleteSuggestions: Array.from(allAutocompleteSuggestions),
discardedTerms,
};
}
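The optional keyword cap at the end of `runSeoEngine` is a cumulative, cross-cluster filter: clusters earlier in the array keep their secondary keywords first, and later clusters are truncated once the budget is spent. A self-contained sketch of that behavior (logic re-stated from the code above):

```typescript
interface Keyword {
  term: string;
}
interface Cluster {
  secondaryKeywords: Keyword[];
}

// Mirrors the "Optional Keyword Cap" step: a single counter is shared
// across all clusters, so the cap favors earlier clusters.
function capKeywords(clusters: Cluster[], max: number): void {
  let count = 0;
  for (const cluster of clusters) {
    cluster.secondaryKeywords = cluster.secondaryKeywords.filter(() => {
      if (count < max) {
        count++;
        return true;
      }
      return false;
    });
  }
}

const clusters: Cluster[] = [
  { secondaryKeywords: [{ term: "a" }, { term: "b" }] },
  { secondaryKeywords: [{ term: "c" }, { term: "d" }] },
];
capKeywords(clusters, 3);
// First cluster keeps both keywords; second cluster keeps only one.
console.log(clusters.map((c) => c.secondaryKeywords.length));
```

Note the ordering sensitivity: with `maxKeywords` set, low-priority clusters at the end of the LLM's output can lose all of their secondary keywords.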


@@ -0,0 +1,12 @@
export * from "./types.js";
export * from "./engine.js";
export * from "./editor.js";
export { generateSeoReport } from "./report.js";
export { fetchSerperData } from "./agents/serper-agent.js";
export { fetchAutocompleteSuggestions } from "./agents/serper-autocomplete.js";
export { fetchCompetitorRankings } from "./agents/serper-competitors.js";
export {
scrapeCompetitorUrl,
analyzeCompetitorContent,
} from "./agents/scraper.js";
export { analyzeContentGaps } from "./steps/content-gap.js";


@@ -0,0 +1,153 @@
// ============================================================================
// LLM Client — Unified interface with model routing via OpenRouter
// ============================================================================
import axios from "axios";
export interface LLMRequestOptions {
model: string;
systemPrompt: string;
userPrompt: string;
jsonMode?: boolean;
apiKey: string;
}
export interface LLMResponse {
content: string;
usage: {
promptTokens: number;
completionTokens: number;
cost: number;
};
}
/**
* Clean raw LLM output to parseable JSON.
* Handles markdown fences, control chars, trailing commas.
*/
export function cleanJson(str: string): string {
let cleaned = str.replace(/```json\n?|```/g, "").trim();
// eslint-disable-next-line no-control-regex
cleaned = cleaned.replace(/[\x00-\x1f\x7f-\x9f]/gi, " ");
cleaned = cleaned.replace(/,\s*([\]}])/g, "$1");
return cleaned;
}
/**
* Send a request to an LLM via OpenRouter.
*/
export async function llmRequest(
options: LLMRequestOptions,
): Promise<LLMResponse> {
const { model, systemPrompt, userPrompt, jsonMode = true, apiKey } = options;
const resp = await axios
.post(
"https://openrouter.ai/api/v1/chat/completions",
{
model,
messages: [
{ role: "system", content: systemPrompt },
{ role: "user", content: userPrompt },
],
...(jsonMode ? { response_format: { type: "json_object" } } : {}),
},
{
headers: {
Authorization: `Bearer ${apiKey}`,
"Content-Type": "application/json",
},
timeout: 120000,
},
)
.catch((err) => {
if (err.response) {
console.error(
"OpenRouter API Error:",
JSON.stringify(err.response.data, null, 2),
);
}
throw err;
});
const content = resp.data.choices?.[0]?.message?.content;
if (!content) {
throw new Error(`LLM returned no content. Model: ${model}`);
}
let cost = 0;
const usage = resp.data.usage || {};
if (usage.cost !== undefined) {
cost = usage.cost;
} else {
// Fallback estimation
cost =
(usage.prompt_tokens || 0) * (0.1 / 1_000_000) +
(usage.completion_tokens || 0) * (0.4 / 1_000_000);
}
return {
content,
usage: {
promptTokens: usage.prompt_tokens || 0,
completionTokens: usage.completion_tokens || 0,
cost,
},
};
}
/**
* Send a request and parse the response as JSON.
*/
export async function llmJsonRequest<T = any>(
options: LLMRequestOptions,
): Promise<{ data: T; usage: LLMResponse["usage"] }> {
let response;
try {
response = await llmRequest({ ...options, jsonMode: true });
} catch (err) {
console.warn(
"Retrying LLM request without forced JSON mode...",
(err as Error).message,
);
response = await llmRequest({ ...options, jsonMode: false });
}
const cleaned = cleanJson(response.content);
let parsed: T;
try {
parsed = JSON.parse(cleaned);
} catch (e) {
throw new Error(
`Failed to parse LLM JSON response: ${(e as Error).message}\nRaw: ${cleaned.substring(0, 500)}`,
);
}
// Unwrap common LLM artifacts: {"0": {...}}, {"state": {...}}, etc.
const unwrapped = unwrapResponse(parsed);
return { data: unwrapped as T, usage: response.usage };
}
/**
* Recursively unwrap common LLM wrapping patterns.
*/
function unwrapResponse(obj: any): any {
if (!obj || typeof obj !== "object" || Array.isArray(obj)) return obj;
const keys = Object.keys(obj);
if (keys.length === 1) {
const key = keys[0];
if (
key === "0" ||
key === "state" ||
key === "facts" ||
key === "result" ||
key === "data"
) {
return unwrapResponse(obj[key]);
}
}
return obj;
}
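A minimal sketch of how `cleanJson` and `unwrapResponse` combine on a typical fenced LLM reply, with both helpers re-stated inline so the snippet is runnable on its own:

```typescript
// Re-stated from llm-client.ts for a self-contained demo.
function cleanJson(str: string): string {
  let cleaned = str.replace(/```json\n?|```/g, "").trim();
  cleaned = cleaned.replace(/[\x00-\x1f\x7f-\x9f]/gi, " ");
  cleaned = cleaned.replace(/,\s*([\]}])/g, "$1");
  return cleaned;
}

function unwrapResponse(obj: any): any {
  if (!obj || typeof obj !== "object" || Array.isArray(obj)) return obj;
  const keys = Object.keys(obj);
  if (keys.length === 1 && ["0", "state", "facts", "result", "data"].includes(keys[0])) {
    return unwrapResponse(obj[keys[0]]);
  }
  return obj;
}

// A typical raw reply: markdown-fenced, trailing comma, wrapped in "result".
const raw = '```json\n{"result": {"topicClusters": [],}}\n```';
const parsed = unwrapResponse(JSON.parse(cleanJson(raw)));
console.log(JSON.stringify(parsed)); // {"topicClusters":[]}
```

The fence stripping, trailing-comma removal, and wrapper unwrapping each cover a failure mode that shows up even when `response_format: { type: "json_object" }` is requested.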


@@ -0,0 +1,35 @@
export const SEO_SYSTEM_PROMPT = `
You are a high-end Digital Architect and Expert SEO Analyst for the Mintel ecosystem.
Your exact job is to process RAW SEARCH DATA from Google (via Serper API) and evaluate it against our STRICT PROJECT CONTEXT.
### OBJECTIVE:
Given a project briefing, industry, and raw search queries (related searches, user questions), you must evaluate each term.
Filter out ANY hallucinations, generic irrelevant fluff, or terms that do not strictly match the client's high-end context.
Then, group the surviving relevant terms into logical "Topic Clusters" with search intents.
### RULES:
- NO Hallucinations. Do not invent keywords that were not provided in the raw data or strongly implied by the context.
- ABSOLUTE STRICTNESS: If a raw search term is irrelevant to the provided industry/briefing, DISCARD IT. Add it to the "discardedTerms" list.
- HIGH-END QUALITY: The Mintel standard requires precision. Exclude generic garbage like "was ist ein unternehmen" if the client does B2B HDD-Bohrverfahren.
### OUTPUT FORMAT:
You MUST respond with valid JSON matching this schema:
{
"topicClusters": [
{
"clusterName": "string",
"primaryKeyword": "string",
"secondaryKeywords": [
{
"term": "string",
"intent": "informational" | "navigational" | "commercial" | "transactional",
"relevanceScore": number, // 1-10
"rationale": "string" // Short explanation why this fits the context
}
],
"userIntent": "string" // Broad intent for the cluster
}
],
  "discardedTerms": ["string"] // Terms you filtered out as irrelevant
}
`;
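For reference, a minimal hand-written reply that satisfies the schema demanded by `SEO_SYSTEM_PROMPT`. All values are hypothetical (loosely modeled on the test-run context elsewhere in this changeset), not output from a real run:

```typescript
// Hypothetical example of a well-formed LLM reply matching the schema.
const exampleReply = {
  topicClusters: [
    {
      clusterName: "Mittelspannungskabel",
      primaryKeyword: "mittelspannungskabel",
      secondaryKeywords: [
        {
          term: "na2xs2y kabel",
          intent: "commercial" as const,
          relevanceScore: 9,
          rationale: "Exact product family the client sells.",
        },
      ],
      userIntent: "B2B buyers researching medium-voltage cable types",
    },
  ],
  discardedTerms: ["was ist ein unternehmen"],
};

console.log(exampleReply.topicClusters[0].secondaryKeywords.length); // 1
```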


@@ -0,0 +1,237 @@
import * as fs from "node:fs/promises";
import * as path from "node:path";
import type {
SeoEngineOutput,
TopicCluster,
ContentGap,
CompetitorRanking,
} from "./types.js";
import type { ReverseEngineeredBriefing } from "./agents/scraper.js";
export interface ReportConfig {
projectName: string;
outputDir: string;
filename?: string;
}
/**
* Generates a comprehensive, human-readable SEO Strategy Report in Markdown.
* This is the "big picture" document that summarizes everything the SEO Engine found
* and gives the team a clear action plan.
*/
export async function generateSeoReport(
output: SeoEngineOutput,
config: ReportConfig,
): Promise<string> {
const dateStr = new Date().toLocaleDateString("de-DE", {
year: "numeric",
month: "long",
day: "numeric",
});
const allKeywords = output.topicClusters.flatMap((c) => [
c.primaryKeyword,
...c.secondaryKeywords.map((k) => k.term),
]);
let md = "";
// ══════════════════════════════════════════════
// Header
// ══════════════════════════════════════════════
md += `# 📊 SEO Strategie-Report: ${config.projectName}\n\n`;
md += `> Erstellt am **${dateStr}** von der **@mintel/seo-engine**\n\n`;
md += `## Zusammenfassung auf einen Blick\n\n`;
md += `| Metrik | Wert |\n`;
md += `|--------|------|\n`;
md += `| Keywords gefunden | **${allKeywords.length}** |\n`;
md += `| Topic Clusters | **${output.topicClusters.length}** |\n`;
md += `| Konkurrenz-Rankings analysiert | **${output.competitorRankings.length}** |\n`;
md += `| Konkurrenz-Briefings erstellt | **${Object.keys(output.competitorBriefings).length}** |\n`;
md += `| Content Gaps identifiziert | **${output.contentGaps.length}** |\n`;
md += `| Autocomplete-Vorschläge | **${output.autocompleteSuggestions.length}** |\n`;
md += `| Verworfene Begriffe | **${output.discardedTerms.length}** |\n\n`;
// ══════════════════════════════════════════════
// Section 1: Keywords zum Tracken
// ══════════════════════════════════════════════
md += `---\n\n`;
md += `## 🎯 Keywords zum Tracken\n\n`;
md += `Diese Keywords sind relevant für das Projekt und sollten in einem Ranking-Tracker (z.B. Serpbear) beobachtet werden:\n\n`;
md += `| # | Keyword | Intent | Relevanz | Cluster |\n`;
md += `|---|---------|--------|----------|--------|\n`;
let kwIndex = 1;
for (const cluster of output.topicClusters) {
md += `| ${kwIndex++} | **${cluster.primaryKeyword}** | — | 🏆 Primary | ${cluster.clusterName} |\n`;
for (const kw of cluster.secondaryKeywords) {
const intentEmoji =
kw.intent === "transactional"
? "💰"
: kw.intent === "commercial"
? "🛒"
: kw.intent === "navigational"
? "🧭"
: "📖";
md += `| ${kwIndex++} | ${kw.term} | ${intentEmoji} ${kw.intent} | ${kw.relevanceScore}/10 | ${cluster.clusterName} |\n`;
}
}
// ══════════════════════════════════════════════
// Section 2: Topic Clusters
// ══════════════════════════════════════════════
md += `\n---\n\n`;
md += `## 🗂️ Topic Clusters\n\n`;
md += `Die SEO Engine hat die Keywords automatisch in thematische Cluster gruppiert. Jeder Cluster sollte idealerweise durch eine **Pillar Page** und mehrere **Sub-Pages** abgedeckt werden.\n\n`;
for (const cluster of output.topicClusters) {
md += `### ${cluster.clusterName}\n\n`;
md += `- **Pillar Keyword:** \`${cluster.primaryKeyword}\`\n`;
md += `- **User Intent:** ${cluster.userIntent}\n`;
md += `- **Sub-Keywords:** ${cluster.secondaryKeywords.map((k) => `\`${k.term}\``).join(", ")}\n\n`;
}
// ══════════════════════════════════════════════
// Section 3: Konkurrenz-Landscape
// ══════════════════════════════════════════════
if (output.competitorRankings.length > 0) {
md += `---\n\n`;
md += `## 🏁 Konkurrenz-Landscape\n\n`;
md += `Für die wichtigsten Keywords wurde geprüft, welche Konkurrenten aktuell bei Google ranken:\n\n`;
md += `| Keyword | Konkurrent | Position | Titel |\n`;
md += `|---------|-----------|----------|-------|\n`;
for (const r of output.competitorRankings) {
md += `| ${r.keyword} | **${r.domain}** | #${r.position} | ${r.title.slice(0, 60)}${r.title.length > 60 ? "…" : ""} |\n`;
}
md += `\n`;
}
// ══════════════════════════════════════════════
// Section 4: Competitor Briefings
// ══════════════════════════════════════════════
if (Object.keys(output.competitorBriefings).length > 0) {
md += `---\n\n`;
md += `## 🔬 Konkurrenz-Briefings (Reverse Engineered)\n\n`;
md += `Für die folgenden Keywords wurde der aktuelle **Platz-1-Ranker** automatisch gescraped und analysiert. Diese Briefings zeigen exakt, was ein Artikel abdecken muss, um die Konkurrenz zu schlagen:\n\n`;
for (const [keyword, briefing] of Object.entries(
output.competitorBriefings,
)) {
const b = briefing as ReverseEngineeredBriefing;
md += `### Keyword: \`${keyword}\`\n\n`;
md += `- **Format:** ${b.contentFormat}\n`;
md += `- **Ziel-Wortanzahl:** ~${b.recommendedWordCount}\n`;
md += `- **Kernthemen:** ${b.coreTopicsToCover.join("; ")}\n`;
md += `- **Wichtige Entitäten:** ${b.entitiesToInclude.map((e) => `\`${e}\``).join(", ")}\n\n`;
}
}
// ══════════════════════════════════════════════
// Section 5: Content Gaps — Action Plan
// ══════════════════════════════════════════════
if (output.contentGaps.length > 0) {
md += `---\n\n`;
md += `## 🚧 Content Gaps — Fehlende Seiten\n\n`;
md += `Die folgenden Seiten existieren auf der Website noch **nicht**, werden aber von der Zielgruppe aktiv gesucht. Sie sind nach Priorität sortiert:\n\n`;
const highGaps = output.contentGaps.filter((g) => g.priority === "high");
const medGaps = output.contentGaps.filter((g) => g.priority === "medium");
const lowGaps = output.contentGaps.filter((g) => g.priority === "low");
if (highGaps.length > 0) {
md += `### 🔴 Hohe Priorität (direkt umsatzrelevant)\n\n`;
for (const g of highGaps) {
md += `- **${g.recommendedTitle}**\n`;
md += ` - Keyword: \`${g.targetKeyword}\` · Cluster: ${g.relatedCluster}\n`;
md += ` - ${g.rationale}\n\n`;
}
}
if (medGaps.length > 0) {
md += `### 🟡 Mittlere Priorität (stärkt Autorität)\n\n`;
for (const g of medGaps) {
md += `- **${g.recommendedTitle}**\n`;
md += ` - Keyword: \`${g.targetKeyword}\` · Cluster: ${g.relatedCluster}\n`;
md += ` - ${g.rationale}\n\n`;
}
}
if (lowGaps.length > 0) {
md += `### 🟢 Niedrige Priorität (Top-of-Funnel)\n\n`;
for (const g of lowGaps) {
md += `- **${g.recommendedTitle}**\n`;
md += ` - Keyword: \`${g.targetKeyword}\` · Cluster: ${g.relatedCluster}\n`;
md += ` - ${g.rationale}\n\n`;
}
}
}
// ══════════════════════════════════════════════
// Section 6: Autocomplete Insights
// ══════════════════════════════════════════════
if (output.autocompleteSuggestions.length > 0) {
md += `---\n\n`;
md += `## 💡 Google Autocomplete — Long-Tail Insights\n\n`;
md += `Diese Begriffe stammen direkt aus der Google-Suchleiste und spiegeln echtes Nutzerverhalten wider. Sie eignen sich besonders für **FAQ-Sektionen**, **H2/H3-Überschriften** und **Long-Tail Content**:\n\n`;
for (const s of output.autocompleteSuggestions) {
md += `- ${s}\n`;
}
md += `\n`;
}
// ══════════════════════════════════════════════
// Section 7: Verworfene Begriffe
// ══════════════════════════════════════════════
if (output.discardedTerms.length > 0) {
md += `---\n\n`;
md += `## 🗑️ Verworfene Begriffe\n\n`;
md += `Die folgenden Begriffe wurden von der KI als **nicht relevant** eingestuft:\n\n`;
md += `<details>\n<summary>Alle ${output.discardedTerms.length} verworfenen Begriffe anzeigen</summary>\n\n`;
for (const t of output.discardedTerms) {
md += `- ${t}\n`;
}
md += `\n</details>\n\n`;
}
// ══════════════════════════════════════════════
// Section 8: Copy-Paste Snippets
// ══════════════════════════════════════════════
md += `---\n\n`;
md += `## 📋 Copy-Paste Snippets\n\n`;
md += `Diese Listen sind optimiert für das schnelle Kopieren in SEO-Tools oder Tabellen.\n\n`;
md += `### Rank-Tracker (z.B. Serpbear) — Ein Keyword pro Zeile\n`;
md += `\`\`\`text\n`;
md += allKeywords.join("\n");
md += `\n\`\`\`\n\n`;
md += `### Excel / Google Sheets — Kommagetrennt\n`;
md += `\`\`\`text\n`;
md += allKeywords.join(", ");
md += `\n\`\`\`\n\n`;
md += `### Pillar-Keywords (Nur Primary Keywords)\n`;
md += `\`\`\`text\n`;
md += output.topicClusters.map((c) => c.primaryKeyword).join("\n");
md += `\n\`\`\`\n\n`;
// ══════════════════════════════════════════════
// Footer
// ══════════════════════════════════════════════
md += `---\n\n`;
md += `*Dieser Report wurde automatisch von der @mintel/seo-engine generiert. Alle Daten basieren auf echten Google-Suchergebnissen (via Serper API) und wurden durch ein LLM ausgewertet.*\n`;
// Write to disk
const outDir = path.resolve(process.cwd(), config.outputDir);
await fs.mkdir(outDir, { recursive: true });
const filename =
config.filename ||
`seo-report-${config.projectName.toLowerCase().replace(/\s+/g, "-")}.md`;
const filePath = path.join(outDir, filename);
await fs.writeFile(filePath, md, "utf8");
console.log(`[Report] Written SEO Strategy Report to: ${filePath}`);
return filePath;
}
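When no `filename` is configured, the report name is derived by slugifying the project name. A standalone sketch of that derivation, copied from the fallback in `generateSeoReport`:

```typescript
// Default filename derivation: lowercase the project name and collapse
// whitespace runs to single hyphens.
function defaultReportFilename(projectName: string): string {
  return `seo-report-${projectName.toLowerCase().replace(/\s+/g, "-")}.md`;
}

console.log(defaultReportFilename("KLZ Cables")); // seo-report-klz-cables.md
```

Note that only whitespace is normalized; slashes or other path characters in a project name would pass through unchanged.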


@@ -0,0 +1,84 @@
import { llmJsonRequest } from "../llm-client.js";
import type { TopicCluster } from "../types.js";
export interface ExistingPage {
url: string;
title: string;
}
export interface ContentGap {
recommendedTitle: string;
targetKeyword: string;
relatedCluster: string;
priority: "high" | "medium" | "low";
rationale: string;
}
const CONTENT_GAP_SYSTEM_PROMPT = `
You are a senior SEO Content Strategist. Your job is to compare a set of TOPIC CLUSTERS
(keywords the company should rank for) against the EXISTING PAGES on their website.
### OBJECTIVE:
Identify content gaps — topics/keywords that have NO corresponding page yet.
For each gap, recommend a page title, the primary target keyword, which cluster it belongs to,
and a priority (high/medium/low) based on commercial intent and relevance.
### RULES:
- Only recommend gaps for topics that are genuinely MISSING from the existing pages.
- Do NOT recommend pages that already exist (even if the title is slightly different — use semantic matching).
- Priority "high" = commercial/transactional intent, directly drives revenue.
- Priority "medium" = informational with strong industry relevance.
- Priority "low" = broad, top-of-funnel topics.
- LANGUAGE: Match the language of the project context (if German context, recommend German titles).
### OUTPUT FORMAT:
{
"contentGaps": [
{
"recommendedTitle": "string",
"targetKeyword": "string",
"relatedCluster": "string",
"priority": "high" | "medium" | "low",
"rationale": "string"
}
]
}
`;
export async function analyzeContentGaps(
topicClusters: TopicCluster[],
existingPages: ExistingPage[],
config: { openRouterApiKey: string; model?: string },
): Promise<ContentGap[]> {
if (topicClusters.length === 0) return [];
if (existingPages.length === 0) {
console.log(
"[Content Gap] No existing pages provided, skipping gap analysis.",
);
return [];
}
const userPrompt = `
TOPIC CLUSTERS (what the company SHOULD rank for):
${JSON.stringify(topicClusters, null, 2)}
EXISTING PAGES ON THE WEBSITE:
${existingPages.map((p, i) => `${i + 1}. "${p.title}" — ${p.url}`).join("\n")}
Identify ALL content gaps. Be thorough but precise.
`;
try {
const { data } = await llmJsonRequest<{ contentGaps: ContentGap[] }>({
model: config.model || "google/gemini-2.5-pro",
apiKey: config.openRouterApiKey,
systemPrompt: CONTENT_GAP_SYSTEM_PROMPT,
userPrompt,
});
return data.contentGaps || [];
} catch (err) {
console.error("[Content Gap] Analysis failed:", (err as Error).message);
return [];
}
}
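The existing pages are serialized into the user prompt as a numbered list, one page per line. A runnable sketch of that formatting step (extracted from `analyzeContentGaps` above):

```typescript
interface ExistingPage {
  url: string;
  title: string;
}

// One numbered line per page, as fed into the content-gap prompt.
function formatExistingPages(pages: ExistingPage[]): string {
  return pages.map((p, i) => `${i + 1}. "${p.title}" — ${p.url}`).join("\n");
}

console.log(
  formatExistingPages([
    { url: "/produkte", title: "Produkte" },
    { url: "/kontakt", title: "Kontakt" },
  ]),
);
```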


@@ -0,0 +1,53 @@
import * as dotenv from "dotenv";
import { runSeoEngine, createGapDrafts, generateSeoReport } from "./index.js";
dotenv.config({ path: "../../.env" });
dotenv.config({ path: "../../apps/web/.env" });
async function testSeoEngine() {
console.log("Starting SEO Engine test run...\n");
const result = await runSeoEngine(
{
companyName: "KLZ Cables",
industry: "Mittelspannungskabel, Kabeltiefbau, Spezialkabel",
briefing:
"KLZ Cables is a B2B provider of specialized medium-voltage cables. We do NOT do low voltage or generic home cables.",
targetAudience: "B2B Einkäufer, Bauleiter, Netzbetreiber",
competitors: ["nkt.de", "faberkabel.de"],
seedKeywords: ["NA2XS2Y", "VPE-isoliert"],
existingPages: [
{ url: "/produkte", title: "Produkte" },
{ url: "/kontakt", title: "Kontakt" },
{ url: "/ueber-uns", title: "Über uns" },
],
locale: { gl: "de", hl: "de" },
},
{
serperApiKey: process.env.SERPER_API_KEY || "",
openRouterApiKey: process.env.OPENROUTER_API_KEY || "",
model: "google/gemini-2.5-pro",
maxKeywords: 20,
},
);
// Generate the SEO Strategy Report
console.log("\n=== GENERATING SEO STRATEGY REPORT ===");
const reportPath = await generateSeoReport(result, {
projectName: "KLZ Cables",
outputDir: ".seo-output",
});
console.log(`Report saved to: ${reportPath}`);
// Generate MDX drafts
console.log("\n=== GENERATING MDX DRAFTS ===");
const generatedFiles = await createGapDrafts(
result.contentGaps,
new Map(Object.entries(result.competitorBriefings)),
{ outputDir: ".seo-output/drafts", authorName: "KLZ Content Team" },
);
console.log(
`Generated ${generatedFiles.length} MDX files in .seo-output/drafts/`,
);
}
testSeoEngine().catch(console.error);


@@ -0,0 +1,38 @@
import * as dotenv from "dotenv";
import axios from "axios";
dotenv.config({ path: "../../.env" });
dotenv.config({ path: "../../apps/web/.env" });
async function testSerper() {
const query = "Mittelspannungskabel";
const apiKey = process.env.SERPER_API_KEY || "";
if (!apiKey) {
console.error("Missing SERPER_API_KEY");
return;
}
try {
const response = await axios.post(
"https://google.serper.dev/search",
{
q: query,
gl: "de",
hl: "de",
},
{
headers: {
"X-API-KEY": apiKey,
"Content-Type": "application/json",
},
},
);
console.log(JSON.stringify(response.data, null, 2));
} catch (error) {
console.error("Error:", error);
}
}
testSerper();


@@ -0,0 +1,59 @@
export interface ProjectContext {
companyName?: string;
industry?: string;
briefing?: string;
targetAudience?: string;
competitors?: string[];
seedKeywords?: string[];
existingPages?: { url: string; title: string }[];
customGuidelines?: string;
locale?: { gl: string; hl: string };
}
export interface SeoConfig {
serperApiKey?: string;
openRouterApiKey?: string;
model?: string;
maxKeywords?: number;
}
export interface KeywordResult {
term: string;
intent: "informational" | "navigational" | "commercial" | "transactional";
relevanceScore: number; // 1-10
rationale: string;
estimatedVolume?: "high" | "medium" | "low";
}
export interface TopicCluster {
clusterName: string;
primaryKeyword: string;
secondaryKeywords: KeywordResult[];
userIntent: string;
}
export interface CompetitorRanking {
keyword: string;
domain: string;
position: number;
title: string;
snippet: string;
link: string;
}
export interface ContentGap {
recommendedTitle: string;
targetKeyword: string;
relatedCluster: string;
priority: "high" | "medium" | "low";
rationale: string;
}
export interface SeoEngineOutput {
topicClusters: TopicCluster[];
competitorRankings: CompetitorRanking[];
contentGaps: ContentGap[];
autocompleteSuggestions: string[];
discardedTerms: string[];
competitorBriefings: Record<string, any>; // Map targetKeyword to ReverseEngineeredBriefing
}


@@ -0,0 +1,19 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"module": "NodeNext",
"moduleResolution": "NodeNext",
"target": "ES2022",
"lib": ["ES2022", "DOM"],
"outDir": "dist",
"rootDir": "src",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"declaration": true,
"sourceMap": true
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist", "**/*.test.ts"]
}


@@ -0,0 +1,9 @@
import { defineConfig } from "tsup";
export default defineConfig({
entry: ["src/index.ts"],
format: ["esm"],
dts: true,
clean: true,
target: "es2022",
});


@@ -1,6 +1,6 @@
{
"name": "@mintel/thumbnail-generator",
"version": "1.9.1",
"version": "1.9.7",
"private": false,
"type": "module",
"main": "./dist/index.js",


@@ -1,6 +1,6 @@
{
"name": "@mintel/tsconfig",
"version": "1.9.1",
"version": "1.9.7",
"publishConfig": {
"access": "public",
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"

pnpm-lock.yaml (generated, 1590 changed lines)

File diff suppressed because it is too large.

scripts/release.sh (executable file, 50 changed lines)

@@ -0,0 +1,50 @@
#!/usr/bin/env bash
set -e
VERSION=$1
if [ -z "$VERSION" ]; then
echo "Error: Missing version argument."
echo "Usage: pnpm release <version>"
echo "Example: pnpm release 1.9.0"
exit 1
fi
# Ensure version does not have 'v' prefix for the variable
VERSION=${VERSION#v}
TAG="v$VERSION"
echo "🚀 Starting release process for $TAG..."
# 1. Sync versions across monorepo
pnpm exec tsx scripts/sync-versions.ts -- "$TAG"
# 2. Check if anything changed
SYNC_FILES="package.json packages/*/package.json apps/*/package.json .env.example"
CHANGES=$(git status --porcelain $SYNC_FILES)
if [[ -n "$CHANGES" ]]; then
echo "📝 Version sync made changes. Committing and tagging..."
# Stage and commit
git add $SYNC_FILES
git commit -m "chore: release $TAG" --no-verify
else
echo "✨ Versions are already in sync."
fi
# 3. Tag and push
echo "🏷️ Tagging $TAG..."
git tag -f "$TAG" > /dev/null
echo "🚀 Pushing branch and tag to origin..."
CURRENT_BRANCH=$(git branch --show-current)
if [ -n "$CURRENT_BRANCH" ]; then
git push origin "$CURRENT_BRANCH" --no-verify
fi
git push origin "$TAG" --force --no-verify
echo ""
echo "✨ VERSIONS SYNCED & PUSHED SUCCESSFULLY ✨"
echo "✅ Release $TAG is complete!"
echo ""
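The script accepts the version with or without a leading `v` thanks to the parameter expansion at the top. A minimal sketch of that normalization, using the same expansion as the script:

```shell
# Version normalization as in scripts/release.sh: accept "1.9.0" or
# "v1.9.0" alike; the tag carries the "v" prefix exactly once.
VERSION="v1.9.0"
VERSION=${VERSION#v}   # strip a leading "v" if present
TAG="v$VERSION"
echo "$TAG"
```

The `${VERSION#v}` expansion removes the shortest matching prefix `v`, so plain `1.9.0` passes through untouched.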