Compare commits


168 Commits

Author SHA1 Message Date
d96d6a4b13 chore: release v1.9.9
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧹 Lint (push) Failing after 10s
Monorepo Pipeline / 🧪 Test (push) Failing after 9s
Monorepo Pipeline / 🏗️ Build (push) Failing after 9s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-03 12:24:39 +01:00
8f6b12d827 fix(packages): remove private flag from all feature/engine packages to allow npm publish 2026-03-03 12:24:38 +01:00
a11714d07d chore(ci): migrate docker registry publishers to git.infra.mintel.me 2026-03-03 12:13:39 +01:00
52f7e68f25 chore: release v1.9.8
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m15s
Monorepo Pipeline / 🧹 Lint (push) Successful in 4m17s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m15s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 37s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 41s
Monorepo Pipeline / 🚀 Release (push) Successful in 1m44s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 2m31s
2026-03-03 11:52:29 +01:00
217ac33675 chore: release v1.9.8
Some checks are pending
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m11s
Monorepo Pipeline / 🧹 Lint (push) Successful in 4m7s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m19s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 37s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 41s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 2m32s
Monorepo Pipeline / 🚀 Release (push) Has started running
2026-03-03 11:44:54 +01:00
f2b8b136af chore: release v1.9.7
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m15s
Monorepo Pipeline / 🧹 Lint (push) Successful in 4m6s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m19s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 38s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 43s
Monorepo Pipeline / 🚀 Release (push) Successful in 1m54s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 2m33s
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 4s
2026-03-02 21:16:51 +01:00
2e07b213d1 chore: remove unused 3d dependencies in gatekeeper to fix lint 2026-03-02 21:16:49 +01:00
a2c1eaefba chore: release v1.9.6
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m18s
Monorepo Pipeline / 🧹 Lint (push) Failing after 3m32s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m3s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 21:08:34 +01:00
80ff266f9c fix: allow vitest to pass with no tests in seo-engine
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 54s
Monorepo Pipeline / 🧹 Lint (push) Failing after 2m7s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m13s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 21:04:14 +01:00
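The fix in 80eefad5ea's predecessor, "allow vitest to pass with no tests in seo-engine" (80ff266f9c), most likely corresponds to Vitest's `passWithNoTests` option. A minimal sketch of such a config — the file name and layout are assumptions, not taken from the repo:

```typescript
// vitest.config.ts — sketch; assumes a package that has no test files yet.
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    // Exit with code 0 instead of failing when no test files match,
    // so the Test job stays green for packages without tests.
    passWithNoTests: true,
  },
});
```

The same behavior is available on the CLI as `vitest run --passWithNoTests`, which may be what the CI job invokes.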
6b1c5b7e30 chore: release @mintel/payload-ai
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Failing after 47s
Monorepo Pipeline / 🧹 Lint (push) Failing after 2m3s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m10s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 21:00:51 +01:00
80eefad5ea feat: extract reusable @mintel/payload-ai package 2026-03-02 21:00:09 +01:00
72556af24c fix(mcp): handle gitea api envelope responses and add safety checks
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Failing after 54s
Monorepo Pipeline / 🧹 Lint (push) Failing after 2m36s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m27s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 12:53:11 +01:00
2a5466c6c0 feat(gitea-mcp): add custom Gitea MCP server for Antigravity compatibility
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 3s
Monorepo Pipeline / 🧪 Test (push) Failing after 50s
Monorepo Pipeline / 🧹 Lint (push) Failing after 3m6s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m13s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 12:36:57 +01:00
2d36a4ec71 ci(qa): refactor QA suite into granular jobs and fix NPM_TOKEN auth
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Failing after 47s
Monorepo Pipeline / 🧹 Lint (push) Failing after 3m3s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m41s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 11:51:12 +01:00
ded9da7d32 feat(seo-engine): implement competitor scraper, MDX draft editor, and strategy report generator
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Failing after 51s
Monorepo Pipeline / 🧹 Lint (push) Failing after 2m25s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m28s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-03-02 10:16:11 +01:00
36ed26ad79 feat(gatekeeper): major UI upgrade - high-fidelity light theme, iridescent mouse-reactive form, and enhanced background animation
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m10s
Monorepo Pipeline / 🧹 Lint (push) Failing after 3m15s
Monorepo Pipeline / 🏗️ Build (push) Successful in 1m53s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 4s
2026-02-28 21:48:03 +01:00
4e72a0baac fix(pipeline): remove image-processor build job and cms-infra from gatekeeper dockerfile
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 46s
Monorepo Pipeline / 🧹 Lint (push) Successful in 1m48s
Monorepo Pipeline / 🏗️ Build (push) Successful in 1m48s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 4s
2026-02-27 23:08:39 +01:00
8ca7eb3f49 chore: release v1.9.5
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 59s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m33s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m4s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 24s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 37s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Failing after 14s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 41s
Monorepo Pipeline / 🚀 Release (push) Successful in 1m41s
2026-02-27 22:27:30 +01:00
32d3ff010a feat(release): introduce dedicated release script to replace flawed git push hook
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 7s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m4s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m50s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m21s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 21:03:54 +01:00
cb68e1fb5c chore: sync versions to v1.9.0
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m12s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m51s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m57s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 32s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 33s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 40s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 31s
Monorepo Pipeline / 🚀 Release (push) Successful in 1m36s
2026-02-27 21:01:52 +01:00
1bd7c6aba5 fix(husky): avoid echo syntax error under sh
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 3s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m5s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m11s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m46s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 19:41:16 +01:00
8b0e130b08 chore: sync versions to v1.9.4
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 13s
Monorepo Pipeline / 🧪 Test (push) Successful in 58s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m18s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m5s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 25s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 33s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 45s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 34s
Monorepo Pipeline / 🚀 Release (push) Successful in 2m1s
2026-02-27 19:40:30 +01:00
bd1d33a157 fix(husky): aggressively intercept tag push and silence expected error 2026-02-27 19:40:28 +01:00
b70a89ec86 chore: sync versions to v1.9.3
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m15s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m55s
Monorepo Pipeline / 🧹 Lint (push) Successful in 4m20s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 35s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 45s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 52s
Monorepo Pipeline / 🚀 Release (push) Successful in 3m9s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 52s
2026-02-27 19:37:34 +01:00
da28305c2d fix(husky): simplify pre-push hook to let native git push modified tag
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m3s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m54s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m44s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 28s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 35s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 42s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 36s
Monorepo Pipeline / 🚀 Release (push) Successful in 3m2s
2026-02-27 19:37:30 +01:00
fecb5c50ea chore: sync versions to v1.9.2
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 6s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m6s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m30s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m12s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 31s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 33s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 39s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 43s
Monorepo Pipeline / 🚀 Release (push) Successful in 2m54s
2026-02-27 19:37:02 +01:00
b4b81a8315 fix(husky): auto-push current branch to keep synced after version bump 2026-02-27 19:37:00 +01:00
98fb6e363f chore: sync versions to v1.9.1
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m13s
Monorepo Pipeline / 🧹 Lint (push) Successful in 3m6s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m18s
Monorepo Pipeline / 🐳 Build Image Processor (push) Failing after 55s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Failing after 43s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 42s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 38s
Monorepo Pipeline / 🚀 Release (push) Successful in 2m50s
2026-02-27 19:36:15 +01:00
a3061b501a fix(husky): correct pre-push exit code to avoid duplicate pushes 2026-02-27 19:36:13 +01:00
ed271e260e fix(ci): add commitlint and globals to depcheck ignore list
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 47s
Monorepo Pipeline / 🧹 Lint (push) Has started running
Monorepo Pipeline / 🚀 Release (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been cancelled
Monorepo Pipeline / 🏗️ Build (push) Has been cancelled
2026-02-27 19:34:58 +01:00
f275b8c9f6 refactor: drop legacy image-processor and directus from pipeline
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 12s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m3s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m31s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m4s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 19:26:19 +01:00
526db11104 chore: sync versions to v1.9.0
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 3s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m10s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m53s
Monorepo Pipeline / 🚀 Release (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been cancelled
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been cancelled
Monorepo Pipeline / 🏗️ Build (push) Has been cancelled
2026-02-27 19:22:57 +01:00
a9d89aa25a chore: remove unused dependencies across workspace and add depcheck to CI
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m1s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m29s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m1s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 19:15:42 +01:00
7702310a9c chore: remove Directus CMS and related dependencies
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 3s
Monorepo Pipeline / 🧹 Lint (push) Successful in 1m19s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m5s
Monorepo Pipeline / 🏗️ Build (push) Successful in 1m26s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 19:06:06 +01:00
fbf2153430 ci: require pnpm install success before running QA checks
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧹 Lint (push) Failing after 53s
Monorepo Pipeline / 🧪 Test (push) Failing after 57s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m1s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 18:37:08 +01:00
a43d96dd0e fix(pdf): decouple 6 distinct PDFs, fix layout issues and DataForSEO event loop
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m1s
Monorepo Pipeline / 🧪 Test (push) Failing after 1m7s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m10s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 18:26:00 +01:00
60a2709999 ci: add depcheck step to nightly qa template
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m11s
Monorepo Pipeline / 🧪 Test (push) Failing after 1m16s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m20s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 18:08:54 +01:00
7ff15a34fc ci: add reusable core smoke tests composite action
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧹 Lint (push) Failing after 54s
Monorepo Pipeline / 🧪 Test (push) Failing after 58s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m1s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 15:32:37 +01:00
8ea2ba8dbf ci: add reusable nightly qa template
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Failing after 1m4s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m12s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m13s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 15:30:11 +01:00
6ba240db0f chore(workspace): add gitea repository url to all packages
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧹 Lint (push) Failing after 49s
Monorepo Pipeline / 🧪 Test (push) Failing after 53s
Monorepo Pipeline / 🏗️ Build (push) Failing after 56s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 4s
This ensures packages published to the registry link back to the at-mintel
repository in the Gitea UI packages tab.
2026-02-27 02:55:23 +01:00
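The commit body of 6ba240db0f explains the intent; in practice this means a `repository` entry in each workspace package's `package.json`, which Gitea Packages reads to render the backlink. A sketch — the package name and exact repository URL are assumptions pieced together from hosts mentioned elsewhere in this log, not taken from the repo:

```json
{
  "name": "@mintel/example-package",
  "repository": {
    "type": "git",
    "url": "https://git.infra.mintel.me/at-mintel/monorepo.git"
  }
}
```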
10aa12f359 fix(gatekeeper): fix missing mintel logos on login page
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 3s
Monorepo Pipeline / 🧹 Lint (push) Failing after 54s
Monorepo Pipeline / 🧪 Test (push) Failing after 57s
Monorepo Pipeline / 🏗️ Build (push) Failing after 49s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
- explicitly use /gatekeeper/ prefix for basePath routing
- make image unoptimized so it bypasses _next/image which can fail under traefik
2026-02-27 02:47:17 +01:00
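The two bullets in 10aa12f359 describe a Next.js `basePath` pitfall: static assets referenced without the prefix 404 when the app is mounted under a sub-path behind a proxy. A minimal sketch of the idea — the helper name and base path value are illustrative, not from the repo:

```typescript
// Sketch: prefix static asset URLs with the app's basePath so they resolve
// when the app is served under /gatekeeper/ behind a reverse proxy.
const BASE_PATH = "/gatekeeper";

export function assetSrc(path: string): string {
  // Normalize to exactly one slash between the base path and the asset path.
  return `${BASE_PATH}/${path.replace(/^\/+/, "")}`;
}

// Usage with next/image, bypassing the optimizer as the second bullet describes:
//   <Image src={assetSrc("logo.svg")} unoptimized alt="Mintel logo" />
```

Marking the image `unoptimized` skips the `_next/image` endpoint entirely, which matches the commit's note that the optimizer route can fail under Traefik.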
863fe469d6 fix(gatekeeper): use GATEKEEPER_ORIGIN for login redirect URL in ForwardAuth verify
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧹 Lint (push) Failing after 52s
Monorepo Pipeline / 🧪 Test (push) Failing after 57s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m1s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 02:02:48 +01:00
4fdf79b1bb fix: only scope @mintel to Gitea Packages, keep npmjs.org as default
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧹 Lint (push) Failing after 53s
Monorepo Pipeline / 🧪 Test (push) Failing after 57s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m0s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 00:23:25 +01:00
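The approach in 4fdf79b1bb maps to a two-line `.npmrc`: a scope-specific registry for `@mintel`, with the public registry left as the default for everything else. A sketch — the Gitea Packages URL is an assumption built from the `git.infra.mintel.me` host and `at-mintel` owner seen elsewhere in this log:

```
# .npmrc — sketch; the Gitea Packages URL below is an assumption, not from the repo.
@mintel:registry=https://git.infra.mintel.me/api/packages/at-mintel/npm/
# All unscoped packages keep resolving from the public default registry.
registry=https://registry.npmjs.org/
```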
5da88356a8 feat: migrate npm registry from Verdaccio to Gitea Packages
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧹 Lint (push) Failing after 35s
Monorepo Pipeline / 🧪 Test (push) Failing after 35s
Monorepo Pipeline / 🏗️ Build (push) Failing after 12s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-27 00:12:00 +01:00
efd1341762 fix: add canvas build deps for gatekeeper x86 build
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧹 Lint (push) Failing after 8s
Monorepo Pipeline / 🧪 Test (push) Failing after 7s
Monorepo Pipeline / 🏗️ Build (push) Failing after 7s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-26 23:10:47 +01:00
36a952db56 chore: sync versions to v1.8.21
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧹 Lint (push) Failing after 9s
Monorepo Pipeline / 🧪 Test (push) Failing after 7s
Monorepo Pipeline / 🏗️ Build (push) Failing after 7s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-26 19:39:04 +01:00
8c637f0220 chore: trigger x86 ci build
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m31s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m27s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m30s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-26 19:05:19 +01:00
6dd97e7a6b chore: trigger x86 build for mb-grid and mintel.me
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m17s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m22s
Monorepo Pipeline / 🏗️ Build (push) Failing after 1m25s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-26 18:45:47 +01:00
9f426470bb fix(ci): update build platform from arm64 to amd64 for the new x86 server
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 50s
Monorepo Pipeline / 🏗️ Build (push) Failing after 56s
Monorepo Pipeline / 🧹 Lint (push) Failing after 1m8s
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-26 17:42:57 +01:00
960914ebb8 feat: content engine etc. 2026-02-25 12:43:57 +01:00
a55a5bb834 fix: prevent .env changes during tagging and improve pre-push hook feedback
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 6s
Monorepo Pipeline / 🏗️ Build (push) Successful in 3m8s
Monorepo Pipeline / 🧪 Test (push) Successful in 4m16s
Monorepo Pipeline / 🧹 Lint (push) Successful in 5m34s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 4s
2026-02-23 14:10:04 +01:00
0aaf858f5b chore: sync versions to v1.8.20
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m3s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m57s
Monorepo Pipeline / 🏗️ Build (push) Successful in 4m38s
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Successful in 1m12s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 1m42s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 1m31s
Monorepo Pipeline / 🐳 Build Image Processor (push) Successful in 2m50s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 8m14s
Monorepo Pipeline / 🚀 Release (push) Successful in 9m13s
2026-02-23 14:03:27 +01:00
ec562c1b2c fix: imgproxy issues 2026-02-23 14:03:17 +01:00
02e15c3f4a chore: sync versions to v1.8.19
Some checks failed
2026-02-23 00:52:35 +01:00
cd4c2193ce feat: implement legacy imgproxy compatibility and URL mapping
All checks were successful
2026-02-23 00:14:13 +01:00
df7a464e03 fix(ci): sync lockfile and remove deleted model scripts
All checks were successful
2026-02-22 23:40:30 +01:00
e2e0653de6 chore(image-processor): use Gemini 3 Flash Preview
Some checks failed
2026-02-22 23:31:44 +01:00
590ae6f69b chore: sync versions to v1.8.16
Some checks failed
2026-02-22 23:24:30 +01:00
2a169f1dfc feat(image-processor): switch to OpenRouter Vision for smart crop and remove heavy models 2026-02-22 23:24:22 +01:00
1bbe89c879 chore: sync versions to v1.8.15
Some checks failed
2026-02-22 23:07:34 +01:00
554ca81c9b chore(image-processor): fix tfjs-node cross compile arch flags 2026-02-22 23:07:32 +01:00
aac0fe81b9 fix(image-service): enforce arm64 cpu architecture for tfjs-node in dockerfile
All checks were successful
2026-02-22 22:44:03 +01:00
ada1e9c717 fix(image-service): force rebuild tfjs-node for container architecture in Dockerfile
All checks were successful
2026-02-22 22:29:25 +01:00
4d295d10d1 chore: sync versions to v1.8.12
All checks were successful
2026-02-22 22:14:44 +01:00
c00f4e5ea5 fix(image-service): resolve next.js build crash and strict TS lint warnings for ci deploy 2026-02-22 22:14:35 +01:00
5f7a254fcb chore: sync versions to v1.8.11
Some checks failed
2026-02-22 21:59:19 +01:00
21c0c778f9 feat(image-service): standalone processor 2026-02-22 21:59:14 +01:00
4f6d62a85c fix(image-service): Remove tfjs-node from pnpm rebuild to preserve ARM64 binary
Some checks failed
2026-02-22 21:21:45 +01:00
7d9604a65a chore: sync versions to v1.8.6
Some checks failed
2026-02-22 18:53:51 +01:00
b3d089ac6d feat(content-engine): enhance content pruning rule in orchestrator
Some checks failed
2026-02-22 18:53:17 +01:00
baecc9c83c feat: content engine 2026-02-22 18:33:58 +01:00
d5632b009a feat(content-engine): add autonomous validation layer to actively detect and correct hallucinated meme templates without user intervention
All checks were successful
2026-02-22 18:23:44 +01:00
90a9e34c7e fix(journaling): enforce stricter LLM evaluation rules for YouTube video selection
All checks were successful
2026-02-22 18:07:50 +01:00
99f040cfb0 feat(ai): forcefully randomize meme templates and expand B2B YouTube channels
All checks were successful
2026-02-22 17:55:09 +01:00
02bffbc67f feat(journaling): implement secondary LLM validation for YouTube video selection
Some checks failed
2026-02-22 17:43:37 +01:00
f4507ef121 fix(journaling): optimize serper video search queries to prevent MDX hallucination
Some checks failed
2026-02-22 17:35:38 +01:00
3a1a88db89 feat: content engine
Some checks failed
2026-02-22 02:39:27 +01:00
a9adb2eff7 fix(ci): disable provenance to prevent manifest unknown on pull
All checks were successful
2026-02-21 20:51:12 +01:00
a50b8d6393 feat: content engine 2026-02-21 19:08:06 +01:00
3f1c37813a fix(ci): bypass buildx cache and ignore pnpm store to resolve EOF corruption
All checks were successful
2026-02-21 15:30:59 +01:00
8f32c80801 chore: optimize cms startup, refactor scripts and implement real-time dev mode 2026-02-16 18:23:38 +01:00
67750c886e chore: ignore and untrack cms-infra uploads 2026-02-15 18:45:57 +01:00
9fe9a74e71 chore: ignore and untrack generated directus extensions 2026-02-15 18:45:05 +01:00
92fe089619 chore: fix syntax error in pdf-library build script
All checks were successful
2026-02-15 17:52:06 +01:00
7dcef0bc28 chore: fix unrelated lint errors to unblock release CI
Some checks failed
2026-02-15 17:51:59 +01:00
2ba091f738 chore: sync versions to v1.8.10
Some checks failed
2026-02-15 17:49:11 +01:00
5757c1172b fix(next-feedback): strengthen embedded detection to prevent record-mode conflicts
Some checks failed
2026-02-15 17:49:07 +01:00
e7d5798857 feat(next-feedback): implement transparent embedded isolation check
Some checks failed
2026-02-15 17:05:32 +01:00
29a414f385 fix(qa): resolve lint errors and unused variables across packages
Some checks failed
2026-02-14 15:34:54 +01:00
69764e42c6 fix(pipeline): improve prioritization to prevent redundant branch and tag runs
Some checks failed
2026-02-14 14:00:08 +01:00
d69ade6268 chore: update lockfile and commit all pending release fixes
Some checks failed
2026-02-14 13:57:46 +01:00
ceaf3ae3ea chore: sync versions to 1.8.4
Some checks failed
2026-02-14 13:39:39 +01:00
169cb83f69 fix(pipeline): allow all tags and chore commits for releases
All checks were successful
2026-02-14 13:39:26 +01:00
f831a7e67e chore(next-feedback): bump to 1.8.4 and export FeedbackOverlay from root
All checks were successful
2026-02-14 12:48:52 +01:00
cb4ffcaeda feat(next-feedback): convert FeedbackOverlay to controlled component
Some checks failed
2026-02-14 02:05:02 +01:00
9b1f3fb7e8 feat(next-feedback): add onActiveChange prop for controlled activation
Some checks failed
2026-02-14 02:03:13 +01:00
f48d89c368 chore: comprehensive commit of all debugging, infrastructure, and extension fixes
All checks were successful
Summary of changes:
- Corrected Directus extensions to use 'vue-router' for 'useRouter' instead of '@directus/extensions-sdk' (Fixed runtime crash).
- Standardized extension folder structure and moved built extensions to the root 'directus/extensions' directory.
- Updated 'scripts/sync-extensions.sh' and 'scripts/validate-extensions.sh' for better extension management.
- Added 'scripts/validate-sdk-imports.sh' as a safeguard against future invalid SDK imports.
- Integrated import validation into the '.husky/pre-push' hook.
- Standardized Docker restart policies and network configurations in 'cms-infra/docker-compose.yml'.
- Updated tracked 'data.db' with the correct 'module_bar' settings to ensure extension visibility.
- Cleaned up legacy files and consolidated extension package source code.

This commit captures the full state of the repository after resolving the 'missing extensions' issue.
2026-02-14 01:44:18 +01:00
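The pre-push integration described in the commit above could look roughly like the following husky hook. This is a sketch only: the actual `.husky/pre-push` contents are not shown in this log, so the validator invocation and failure message here are assumptions.

```shell
#!/bin/sh
# .husky/pre-push (sketch) -- run the SDK import validator before every push.
# The script path and error message are illustrative, not the real hook body.
sh scripts/validate-sdk-imports.sh || {
  echo "Push blocked: invalid @directus/extensions-sdk imports detected." >&2
  exit 1
}
```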
ad40e71757 fix: replace invalid useRouter import from @directus/extensions-sdk with vue-router
Some checks failed
The Directus 11.x SDK does not export useRouter. Importing it caused a
SyntaxError that crashed the entire extensions bundle, preventing ALL
modules from appearing in the Data Studio sidebar.

Changes:
- Replace useRouter import from @directus/extensions-sdk → vue-router
- Add scripts/validate-sdk-imports.sh to catch invalid SDK imports
- Integrate SDK import validation into pre-push hook
- Add EXTENSIONS_AUTO_RELOAD to docker-compose.yml
- Remove debug NODE_ENV=development
2026-02-14 01:43:10 +01:00
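A minimal sketch of the kind of check `scripts/validate-sdk-imports.sh` could perform, assuming a simple grep-based approach (the real script's contents are not shown in this log). It builds a throwaway fixture containing the invalid import described above and flags it:

```shell
# Sketch of an SDK import validator; file names and messages are illustrative.
set -eu

# Create a demo extension source file containing the invalid import that
# crashed the Directus extensions bundle (useRouter is not exported by the
# Directus 11.x SDK -- it must be imported from vue-router instead).
demo_dir="$(mktemp -d)"
cat > "$demo_dir/module.ts" <<'EOF'
import { useRouter } from '@directus/extensions-sdk';
EOF

# The check: flag any import of useRouter from @directus/extensions-sdk.
if grep -rn "useRouter.*from '@directus/extensions-sdk'" "$demo_dir"; then
  echo "invalid SDK import found: useRouter must come from vue-router"
else
  echo "SDK imports OK"
fi
rm -rf "$demo_dir"
```

In a real hook the grep would scan the extensions source tree instead of a fixture, and a match would exit non-zero to block the push.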
911ceffdc5 fix(pipeline): serialize image builds to prevent act cache collisions
All checks were successful
2026-02-13 15:01:34 +01:00
23358fc708 fix: temporary trigger test
Some checks failed
2026-02-13 14:38:01 +01:00
73ea958655 chore: remove [skip ci] from version sync and update image tag 2026-02-13 14:31:30 +01:00
f2035d79dd chore: automate re-push in pre-push hook
All checks were successful
2026-02-13 14:28:55 +01:00
f514349ccf chore: sync versions to v1.8.2 [skip ci] 2026-02-13 14:27:22 +01:00
a71f86560b chore: fix @mintel/directus-extension-toolkit build and update eslint ignores
All checks were successful
2026-02-13 14:21:39 +01:00
de8314732d chore: fix remaining build script syntax errors in extensions
Some checks failed
2026-02-13 12:15:30 +01:00
bdf7773310 chore: finalize 'meaningful' sync hook and pipeline stabilization
Some checks failed
2026-02-13 12:15:01 +01:00
a25e4aa1d4 chore: stabilize pipeline, fix extension build scripts, and finalize version sync hook
Some checks failed
2026-02-13 12:14:27 +01:00
ecc2163b8e chore: remove redundant version sync from pre-push hook
Some checks failed
2026-02-13 12:08:58 +01:00
af02378d29 chore: sync versions
Some checks failed
2026-02-13 12:05:14 +01:00
f8847a7a10 feat(next-feedback): refine selector filters for tailwind and dynamic classes
All checks were successful
2026-02-13 12:03:33 +01:00
117b23db1e feat(next-feedback): improve selector precision with @medv/finder and fix client/server boundary 2026-02-13 12:03:11 +01:00
d6f9a24823 chore: sync versions to 1.8.0 2026-02-12 22:05:20 +01:00
422e4fccba feat(cloner): add cloner-library and finalize pdf-library rename 2026-02-12 22:04:40 +01:00
57ec4d7544 chore: bump versions 2026-02-12 21:47:55 +01:00
a4d021c658 feat(pdf): rename acquisition-library to pdf-library and update package name to @mintel/pdf
Some checks failed
2026-02-12 21:46:45 +01:00
269d19bbef fix(acquisition): finalize extension build and components
- Fixed IndustrialCard export in SharedUI.
- Successfully built all extensions including acquisition-library.
- Verified sitemap and briefing module updates.
2026-02-12 21:27:39 +01:00
30ff08c66d fix(acquisition): standardize bundling and externalize React/PDF dependencies
- Added JSX support and correctly externalized react/pdf dependencies in esbuild.
- Fixed acquisition-library exports by removing missing DINLayout reference.
- Standardized extension entry points across all modules.
2026-02-12 21:26:30 +01:00
81deaf447f fix(acquisition): standardize bundling and externalize React/PDF dependencies
- Added JSX support to esbuild configuration.
- Externalized react, react-dom, and @react-pdf/renderer to avoid redundant bundling.
- Updated acquisition-library exports for modular PDF generation.
2026-02-12 21:24:15 +01:00
a0ebc58d6d fix(directus): resolve extension visibility and registration failures
- Corrected module_bar settings to restore custom extension visibility in UI.
- Fixed 'fs' dynamic require in acquisition endpoint by externalizing Node.js built-ins.
- Standardized local environment branding to AT Mintel.
2026-02-12 21:20:28 +01:00
7498c24c9a fix(directus): resolve login failures and standardize project branding
- Fixed project isolation bypass (identity shadowing) by prefixing database service name.
- Standardized health check paths and protocols in docker-compose.yml.
- Resolved extension SyntaxError caused by duplicate banner injections in build scripts.
- Migrated extension build system to clean esbuild-based bundles (removing shims).
- Updated sync-directus.sh for project-prefixed service name.
- Synchronized latest production data and branding (AT Mintel).
2026-02-12 19:21:53 +01:00
efba82337c refactor(acquisition): change build output to dist/ directory
Some checks failed
2026-02-12 12:47:06 +01:00
c083b309fb fix(acquisition): add missing dependencies to acquisition-library and fix build failure
All checks were successful
2026-02-12 12:15:43 +01:00
eb8bf60408 fix(infra): use dynamic container detection for registry maintenance
Some checks failed
2026-02-12 11:16:28 +01:00
a3819490ac chore(package): standardize formatting and cleanup temporary logs
Some checks failed
2026-02-12 11:14:58 +01:00
1127954fea feat(cms): final restoration of extension logic and monorepo stabilization
Some checks failed
2026-02-12 02:11:53 +01:00
fa0b133012 feat(cms): restore extension logic and stabilize build pipeline
Some checks failed
- restored People Manager and Acquisition Manager logic from git history
- standardized build scripts for directus extensions
- added mmintel user and enabled extensions in cms settings
- updated sync script for robust artifact distribution
2026-02-12 01:14:13 +01:00
1b40baebd4 fix(infra): use SHA detection and better logging in wait-for-upstream.sh
All checks were successful
2026-02-11 23:25:47 +01:00
316c03869a fix(gatekeeper): enhance logging and stabilize upstream polling
All checks were successful
2026-02-11 22:49:16 +01:00
63d2acfab5 feat(infra): add wait-for-upstream script for smart dependencies
Some checks failed
2026-02-11 22:41:47 +01:00
bdeae0aca6 chore(gatekeeper): bump to 1.7.11 for fix
Some checks failed
2026-02-11 22:35:04 +01:00
47c70a16f1 fix(gatekeeper): trim auth inputs and prioritize access code to prevent autofill traps
Some checks failed
2026-02-11 22:32:17 +01:00
b96d44bf6d chore: finalize version updates for v1.7.10
All checks were successful
2026-02-11 16:55:17 +01:00
73b60f14a9 chore: release clean base image 1.7.10
All checks were successful
2026-02-11 16:32:16 +01:00
b3f43c421f chore: manual version bump to 1.7.9
All checks were successful
2026-02-11 16:02:51 +01:00
a2339f7106 fix: make directus extension build scripts more resilient
All checks were successful
2026-02-11 15:58:58 +01:00
e83a76f111 chore: trigger CI build after disk cleanup
Some checks failed
2026-02-11 15:56:16 +01:00
0096c18098 fix(infra): correct registry data path and enable untagged manifest deletion
Some checks failed
2026-02-11 12:39:50 +01:00
3284931f84 chore(next-utils): respect skip flags in refinements and publish v1.7.15
All checks were successful
2026-02-11 01:32:01 +01:00
28517a3558 chore(next-utils): introduce withMintelRefinements and publish v1.7.14
Some checks failed
2026-02-11 01:31:16 +01:00
3b9f10ec98 chore(next-utils): convert to ESM-only and publish v1.7.13
Some checks failed
2026-02-11 01:26:53 +01:00
65fd248993 chore(next-utils): fix generic propagation in createMintelDirectusClient and publish v1.7.12
Some checks failed
2026-02-11 01:22:12 +01:00
ebd9ab132c chore(next-utils): add explicit return type to validateMintelEnv and publish v1.7.11
Some checks failed
2026-02-11 01:20:57 +01:00
ddaeb2c3ca chore(next-utils): rebuild with generic types and publish v1.7.10
Some checks failed
2026-02-11 01:20:03 +01:00
ad1a8c4fbf feat(next-utils): support generic schema in directus client
All checks were successful
2026-02-11 01:00:34 +01:00
013b0259b2 fix(pipeline): use POSIX sh compatible logic for release prioritization
Some checks failed
2026-02-11 00:33:06 +01:00
d5a9a3bce4 chore: refine release prioritization logic and bump v1.7.8
All checks were successful
2026-02-11 00:31:03 +01:00
b9fd583ac4 chore: fix pipeline hang by disabling broken caching and using corepack
Some checks failed
2026-02-11 00:29:10 +01:00
bfdbaba0d0 chore: implement release prioritization and streamline setup for speed
Some checks failed
2026-02-11 00:22:31 +01:00
4ea9cbc551 fix(next-utils): use natural type inference for validateMintelEnv to fix unknown type errors
Some checks failed
2026-02-11 00:17:46 +01:00
d8c1a38c0d chore: optimize pipeline for speed and parallelize QA jobs
Some checks failed
2026-02-11 00:14:55 +01:00
b65b9a7fb2 fix(next-utils): finalize type safety for validateMintelEnv and fix pre-push hook
Some checks failed
2026-02-11 00:10:38 +01:00
858c7bbc39 fix(next-utils): use z.extend() for robust type inference in validateMintelEnv
Some checks failed
2026-02-11 00:04:14 +01:00
149123ef90 fix(next-utils): restore optional argument with robust types to satisfy linter
Some checks failed
2026-02-11 00:02:53 +01:00
6bc49d1c52 fix(next-utils): make validateMintelEnv generic for better type safety
Some checks failed
2026-02-10 23:57:18 +01:00
52ffe49019 feat(next-utils): make directus client environment-aware and standardize base env schema
All checks were successful
2026-02-10 23:44:27 +01:00
73fa292528 fix: remove klz from workspace 2026-02-10 21:39:48 +01:00
f2c0a4581c chore: sync versions
All checks were successful
2026-02-10 00:39:34 +01:00
367c4d8404 fix: cms schema 2026-02-10 00:35:26 +01:00
587c88980f chore: release next-config v1.7.0
All checks were successful
2026-02-10 00:29:02 +01:00
fcdfdb4588 chore: release next-config v1.6.1
All checks were successful
2026-02-10 00:27:59 +01:00
6bbaa8d105 chore: cms sync 2026-02-10 00:26:13 +01:00
eccc084441 chore: cms sync commands 2026-02-10 00:13:42 +01:00
da6b8aba64 fix: cms sync
All checks were successful
2026-02-10 00:03:27 +01:00
290097b4e6 chore: fix linter
Some checks failed
2026-02-10 00:02:26 +01:00
45894cce34 chore: fix linter
Some checks failed
2026-02-09 23:59:22 +01:00
7195906da0 chore: fix linter
Some checks failed
2026-02-09 23:54:58 +01:00
dcb466f53b chore: fix husky
Some checks failed
Monorepo Pipeline / 🧪 Quality Assurance (push) Failing after 1m3s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-09 23:44:34 +01:00
14089766ea feat: infra cms
Some checks failed
Monorepo Pipeline / 🧪 Quality Assurance (push) Failing after 1m5s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-09 23:33:45 +01:00
254 changed files with 22406 additions and 10476 deletions


@@ -0,0 +1,82 @@
---
description: How to manage and deploy Directus CMS infrastructure changes.
---
# Directus CMS Infrastructure Workflow
This workflow ensures "Industrial Grade" consistency and stability across local, testing, and production environments for the `at-mintel` Directus CMS.
## 1. Local Development Lifecycle
### Starting the CMS
To start the local Directus instance with extensions:
```bash
cd packages/cms-infra
npm run up
```
### Modifying Schema
1. **Directus UI**: Make your changes directly in the local Directus Admin UI (Collections, Fields, Relations).
2. **Take Snapshot**:
```bash
cd packages/cms-infra
npm run snapshot:local
```
This updates `packages/cms-infra/schema/snapshot.yaml`.
3. **Commit**: Commit the updated `snapshot.yaml`.
## 2. Deploying Schema Changes
### To Local Environment (Reconciliation)
If you pull changes from Git and need to apply them to your local database:
```bash
cd packages/cms-infra
npm run schema:apply:local
```
> [!IMPORTANT]
> This command automatically runs `scripts/cms-reconcile.sh` to prevent "Field already exists" errors by registering database columns in Directus metadata first.
### To Production (Infra)
To deploy the local snapshot to the production server:
```bash
cd packages/cms-infra
npm run schema:apply:infra
```
This script:
1. Syncs built extensions via rsync.
2. Injects the `snapshot.yaml` into the remote container.
3. Runs `directus schema apply`.
4. Restarts Directus to clear the schema cache.
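The four steps above can be sketched as a small shell script. This is a hedged illustration, not the actual `schema:apply:infra` implementation: the host, container name, and remote paths are placeholders, and only the `directus schema apply --yes` CLI call mirrors documented Directus behavior.

```shell
#!/usr/bin/env sh
# Sketch of the four deploy steps above. HOST, CONTAINER, and all
# remote paths are hypothetical placeholders.
set -eu

HOST="${DEPLOY_HOST:-cms.example.com}"
CONTAINER="${DIRECTUS_CONTAINER:-directus}"

run() {
  # With DRY_RUN=1 the commands are only printed, not executed.
  if [ "${DRY_RUN:-0}" = "1" ]; then echo "$*"; else "$@"; fi
}

deploy() {
  # 1. Sync built extensions
  run rsync -az --delete extensions/ "root@$HOST:/srv/directus/extensions/"
  # 2. Inject the snapshot into the remote container
  run scp schema/snapshot.yaml "root@$HOST:/tmp/snapshot.yaml"
  run ssh "root@$HOST" "docker cp /tmp/snapshot.yaml $CONTAINER:/directus/snapshot.yaml"
  # 3. Apply the schema non-interactively
  run ssh "root@$HOST" "docker exec $CONTAINER npx directus schema apply --yes /directus/snapshot.yaml"
  # 4. Restart Directus to clear the schema cache
  run ssh "root@$HOST" "docker restart $CONTAINER"
}

# Preview only; drop DRY_RUN=1 to actually execute.
DRY_RUN=1 deploy
```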
## 3. Data Synchronization
### Pulling from Production
To update your local environment with production data and assets:
```bash
cd packages/cms-infra
npm run sync:pull
```
### Pushing to Production
> [!CAUTION]
> This will overwrite production data. Use with extreme care.
```bash
cd packages/cms-infra
npm run sync:push
```
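Because this overwrites production data, destructive sync scripts often gate on an explicit confirmation. A sketch of such a guard follows; it is illustrative only and not the actual `sync:push` implementation:

```shell
# Confirmation guard sketch: the caller must type the target name
# before a destructive sync proceeds. Purely illustrative.
confirm_push() {
  target="$1"
  printf 'About to OVERWRITE data on %s. Type the target name to continue: ' "$target"
  read -r answer
  [ "$answer" = "$target" ]
}

# Example wiring (hypothetical):
#   confirm_push production && npm run sync:push
```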
## 4. Extension Management
When modifying extensions in `packages/*-manager`:
1. Extensions are automatically built and synced when running `npm run up`.
2. To sync manually without restarting the stack:
```bash
cd packages/cms-infra
npm run build:extensions
```
## 5. Troubleshooting "Field already exists"
If `schema:apply` fails with "Field already exists", run:
```bash
./scripts/cms-reconcile.sh
```
This script ensures the database state matches Directus's internal field registry (`directus_fields`).
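The registry side can also be inspected by hand. The sketch below assumes a few things: `directus_fields` and its `collection`/`field` columns are real Directus internals, but the database service name and the `pages` collection are made up for illustration.

```shell
# Illustrative query against Directus's internal field registry.
# 'pages' is a hypothetical collection; adjust to the one that fails.
SQL="SELECT collection, field FROM directus_fields WHERE collection = 'pages' ORDER BY field;"

# Run it inside the local database container (hypothetical service name):
#   docker compose exec directus-db psql -U directus -d directus -c "$SQL"
echo "$SQL"
```

A column that exists in the database but is missing from this result is the mismatch the reconcile script repairs.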


@@ -1,12 +1,26 @@
node_modules
**/node_modules
.next
**/.next
.git
# .npmrc is allowed as it contains the registry template
dist
**/dist
build
**/build
out
**/out
coverage
**/coverage
.vercel
**/.vercel
.turbo
**/.turbo
*.log
**/*.log
.DS_Store
**/.DS_Store
.pnpm-store
**/.pnpm-store
.gitea
**/.gitea

.env

@@ -0,0 +1,43 @@
# Project
IMAGE_TAG=v1.8.19
PROJECT_NAME=at-mintel
PROJECT_COLOR=#82ed20
GITEA_TOKEN=ccce002e30fe16a31a6c9d5a414740af2f72a582
OPENROUTER_API_KEY=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
ZYTE_API_KEY=1f0f74726f044f55aaafc7ead32cd489
REPLICATE_API_KEY=r8_W3grtpXMRfi0u3AM9VdkKbuWdZMmhwU2Tn0yt
SERPER_API_KEY=02f69a8db9578c41fb1c8ed9f7a999302da644ff
DATA_FOR_SEO_API_KEY=bWFyY0BtaW50ZWwubWU6MjQ0YjBjZmIzOGY3NTIzZA==
DATA_FOR_SEO_LOGIN=marc@mintel.me
DATA_FOR_SEO_PASSWORD=244b0cfb38f7523d
# Authentication
GATEKEEPER_PASSWORD=mintel
AUTH_COOKIE_NAME=mintel_gatekeeper_session
# Host Config (Local)
TRAEFIK_HOST=at-mintel.localhost
DIRECTUS_HOST=cms-legacy.localhost
# Next.js
NEXT_PUBLIC_BASE_URL=http://at-mintel.localhost
# Directus
DIRECTUS_URL=http://cms-legacy.localhost
DIRECTUS_KEY=F9IIfahEjPq6NZhKyRLw516D8GotuFj79EGK7pGfIWg=
DIRECTUS_SECRET=OZfxMu8lBxzaEnFGRKreNBoJpRiRu58U+HsVg2yWk4o=
CORS_ENABLED=true
CORS_ORIGIN=true
LOG_LEVEL=debug
DIRECTUS_ADMIN_EMAIL=mmintel@mintel.me
DIRECTUS_ADMIN_PASSWORD=Tim300493.
DIRECTUS_DB_NAME=directus
DIRECTUS_DB_USER=directus
DIRECTUS_DB_PASSWORD=mintel-db-pass
# Sentry / Glitchtip
SENTRY_DSN=
# Analytics (Umami)
NEXT_PUBLIC_UMAMI_WEBSITE_ID=
NEXT_PUBLIC_UMAMI_SCRIPT_URL=https://analytics.infra.mintel.me/script.js


@@ -1,4 +1,5 @@
# Project
IMAGE_TAG=v1.9.9
PROJECT_NAME=sample-website
PROJECT_COLOR=#82ed20

.eslintignore

@@ -0,0 +1,4 @@
**/index.js
**/dist/**
packages/cms-infra/extensions/**
packages/cms-infra/extensions/**


@@ -0,0 +1,41 @@
name: "Mintel Core Smoke Tests"
description: "Executes standard fast HTTP, API, and Locale validation checks."
inputs:
TARGET_URL:
description: 'The deployed URL to test against'
required: true
GATEKEEPER_PASSWORD:
description: 'Gatekeeper bypass password'
required: true
UMAMI_API_ENDPOINT:
description: 'Umami Analytics Endpoint'
required: false
default: 'https://analytics.infra.mintel.me'
SENTRY_DSN:
description: 'Sentry / Glitchtip DSN'
required: false
runs:
using: "composite"
steps:
- name: 🌐 Full Sitemap HTTP Validation
shell: bash
env:
NEXT_PUBLIC_BASE_URL: ${{ inputs.TARGET_URL }}
GATEKEEPER_PASSWORD: ${{ inputs.GATEKEEPER_PASSWORD }}
run: pnpm run check:http
- name: 🌐 Locale & Language Switcher Validation
shell: bash
env:
NEXT_PUBLIC_BASE_URL: ${{ inputs.TARGET_URL }}
GATEKEEPER_PASSWORD: ${{ inputs.GATEKEEPER_PASSWORD }}
run: pnpm run check:locale
- name: 🌐 External API Smoke Test (Umami & Sentry)
shell: bash
env:
UMAMI_API_ENDPOINT: ${{ inputs.UMAMI_API_ENDPOINT }}
SENTRY_DSN: ${{ inputs.SENTRY_DSN }}
run: pnpm run check:apis


@@ -0,0 +1,44 @@
name: 🏥 Server Maintenance
on:
schedule:
- cron: '0 3 * * *' # Every day at 3:00 AM
workflow_dispatch: # Allow manual trigger
jobs:
maintenance:
name: 🧹 Prune & Clean
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: 🚀 Execute Maintenance via SSH
run: |
mkdir -p ~/.ssh
echo "${{ secrets.SSH_KEY }}" > ~/.ssh/id_ed25519
chmod 600 ~/.ssh/id_ed25519
ssh-keyscan -H ${{ secrets.SSH_HOST }} >> ~/.ssh/known_hosts 2>/dev/null
# Run the prune script on the host
# We transfer the script and execute it to ensure it matches the repo version
scp packages/infra/scripts/mintel-optimizer.sh root@${{ secrets.SSH_HOST }}:/tmp/mintel-optimizer.sh
ssh root@${{ secrets.SSH_HOST }} "bash /tmp/mintel-optimizer.sh && rm /tmp/mintel-optimizer.sh"
- name: 🔔 Notification - Success
if: success()
run: |
curl -s -k -X POST "${{ secrets.GOTIFY_URL }}/message?token=${{ secrets.GOTIFY_TOKEN }}" \
-F "title=🏥 Maintenance Complete" \
-F "message=Server maintenance completed successfully.\nRegistry & Docker resources cleaned up." \
-F "priority=2" || true
- name: 🔔 Notification - Failure
if: failure()
run: |
curl -s -k -X POST "${{ secrets.GOTIFY_URL }}/message?token=${{ secrets.GOTIFY_TOKEN }}" \
-F "title=❌ Maintenance FAILED" \
-F "message=The automated server maintenance failed!\nPlease check the logs." \
-F "priority=8" || true


@@ -5,15 +5,74 @@ on:
branches:
- '**'
tags:
- 'v*'
- '*'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
qa:
name: 🧪 Quality Assurance
prioritize:
name: ⚡ Prioritize Release
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
steps:
- name: 🛑 Cancel Redundant Runs
env:
GITEA_TOKEN: ${{ secrets.GITHUB_TOKEN }}
REPO: ${{ github.repository }}
RUN_ID: ${{ github.run_id }}
REF: ${{ github.ref }}
REF_NAME: ${{ github.ref_name }}
EVENT: ${{ github.event_name }}
SHA: ${{ github.sha }}
run: |
echo "🔎 Debug: Event=$EVENT, Ref=$REF, RefName=$REF_NAME, RunId=$RUN_ID"
# Fetch recent runs for the repository
RUNS=$(curl -s -H "Authorization: token $GITEA_TOKEN" "https://git.infra.mintel.me/api/v1/repos/$REPO/actions/runs?limit=30")
case "$REF" in
refs/tags/*)
echo "🚀 Release detected ($REF_NAME). Cancelling non-tag runs..."
# Identify runs to cancel: in_progress/queued, NOT this run, and NOT a tag run
echo "$RUNS" | jq -c '.workflow_runs[] | select(.status == "in_progress" or .status == "queued") | select(.id | tostring != "'$RUN_ID'")' | while read -r run; do
ID=$(echo "$run" | jq -r '.id')
RUN_REF=$(echo "$run" | jq -r '.ref')
TITLE=$(echo "$run" | jq -r '.display_title')
case "$RUN_REF" in
refs/tags/*)
echo "⏭️ Skipping parallel release run $ID ($TITLE) on $RUN_REF"
;;
*)
echo "🛑 Cancelling redundant branch run $ID ($TITLE) on $RUN_REF..."
curl -X POST -s -H "Authorization: token $GITEA_TOKEN" "https://git.infra.mintel.me/api/v1/repos/$REPO/actions/runs/$ID/cancel"
;;
esac
done
;;
*)
echo " Regular push. Checking for parallel release tag for SHA $SHA..."
# Check if there's a tag run for the SAME commit
TAG_RUN_ID=$(echo "$RUNS" | jq -r '.workflow_runs[] | select(.ref | startswith("refs/tags/")) | select(.head_sha == "'$SHA'") | .id' | head -n 1)
if [[ -n "$TAG_RUN_ID" && "$TAG_RUN_ID" != "null" ]]; then
echo "🚀 Found parallel tag run $TAG_RUN_ID for commit $SHA. Cancelling this branch run ($RUN_ID)..."
curl -X POST -s -H "Authorization: token $GITEA_TOKEN" "https://git.infra.mintel.me/api/v1/repos/$REPO/actions/runs/$RUN_ID/cancel"
exit 0
fi
echo "✅ No parallel tag run found. Proceeding."
;;
esac
lint:
name: 🧹 Lint
needs: prioritize
if: always() && !cancelled() && (needs.prioritize.result == 'success' || needs.prioritize.result == 'skipped')
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
@@ -22,37 +81,69 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install pnpm
uses: pnpm/action-setup@v4
with:
version: 10
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: Enable pnpm
run: corepack enable && corepack prepare pnpm@10.2.0 --activate
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: 🏷️ Sync Versions (if Tagged)
if: startsWith(github.ref, 'refs/tags/v')
run: pnpm sync-versions
run: pnpm install --frozen-lockfile --prefer-offline --ignore-scripts --no-color
- name: Lint
run: pnpm lint
- name: Check Dependencies (Depcheck)
run: pnpm -r exec npx --yes depcheck --skip-missing --ignores="eslint*,@eslint/*,@types/*,typescript,tsup,tsx,vitest,tailwindcss,postcss,autoprefixer,@mintel/*,ts-node,*in-the-middle,pino*,@commitlint/*,@changesets/*,globals"
test:
name: 🧪 Test
needs: prioritize
if: always() && !cancelled() && (needs.prioritize.result == 'success' || needs.prioritize.result == 'skipped')
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: Enable pnpm
run: corepack enable && corepack prepare pnpm@10.2.0 --activate
- name: Install dependencies
run: pnpm install --frozen-lockfile --prefer-offline --ignore-scripts --no-color
- name: Test
run: pnpm test
build:
name: 🏗️ Build
needs: prioritize
if: always() && !cancelled() && (needs.prioritize.result == 'success' || needs.prioritize.result == 'skipped')
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: Enable pnpm
run: corepack enable && corepack prepare pnpm@10.2.0 --activate
- name: Install dependencies
run: pnpm install --frozen-lockfile --prefer-offline --ignore-scripts --no-color
- name: Build
run: pnpm build
release:
name: 🚀 Release
needs: qa
if: startsWith(github.ref, 'refs/tags/v')
needs: [lint, test, build]
if: startsWith(github.ref, 'refs/tags/')
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
@@ -64,20 +155,16 @@ jobs:
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Install pnpm
uses: pnpm/action-setup@v4
with:
version: 10
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: Enable pnpm
run: corepack enable && corepack prepare pnpm@10.2.0 --activate
- name: Install dependencies
run: pnpm install --frozen-lockfile
run: pnpm install --frozen-lockfile --prefer-offline --ignore-scripts --no-color
- name: 🏷️ Sync Versions (if Tagged)
run: pnpm sync-versions
- name: 🏷️ Release Packages (Tag-Driven)
run: |
echo "🏷️ Tag detected [${{ github.ref_name }}], performing sync release..."
@@ -85,13 +172,14 @@ jobs:
build-images:
name: 🐳 Build ${{ matrix.name }}
needs: qa
if: startsWith(github.ref, 'refs/tags/v')
needs: [lint, test, build]
if: startsWith(github.ref, 'refs/tags/')
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
strategy:
fail-fast: false
max-parallel: 1
matrix:
include:
- image: nextjs
@@ -103,9 +191,7 @@ jobs:
- image: gatekeeper
file: packages/infra/docker/Dockerfile.gatekeeper
name: Gatekeeper (Product)
- image: directus
file: packages/infra/docker/Dockerfile.directus
name: Directus (Base)
steps:
- name: Checkout
uses: actions/checkout@v4
@@ -116,23 +202,22 @@ jobs:
- name: 🔐 Registry Login
uses: docker/login-action@v3
with:
registry: registry.infra.mintel.me
username: ${{ secrets.REGISTRY_USER }}
password: ${{ secrets.REGISTRY_PASS }}
registry: git.infra.mintel.me
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: 🏗️ Build & Push ${{ matrix.name }}
uses: docker/build-push-action@v5
with:
context: .
file: ${{ matrix.file }}
platforms: linux/arm64
platforms: linux/amd64
pull: true
provenance: false
push: true
secrets: |
NPM_TOKEN=${{ secrets.NPM_TOKEN }}
tags: |
registry.infra.mintel.me/mintel/${{ matrix.image }}:${{ github.ref_name }}
registry.infra.mintel.me/mintel/${{ matrix.image }}:latest
cache-from: type=registry,ref=registry.infra.mintel.me/mintel/${{ matrix.image }}:buildcache
cache-to: type=registry,ref=registry.infra.mintel.me/mintel/${{ matrix.image }}:buildcache,mode=max
git.infra.mintel.me/mmintel/${{ matrix.image }}:${{ github.ref_name }}
git.infra.mintel.me/mmintel/${{ matrix.image }}:latest


@@ -0,0 +1,243 @@
name: Reusable Nightly QA
on:
workflow_call:
inputs:
TARGET_URL:
description: 'The URL to test (e.g., https://testing.klz-cables.com)'
required: true
type: string
PROJECT_NAME:
description: 'The internal project name for notifications'
required: true
type: string
secrets:
GOTIFY_URL:
required: true
GOTIFY_TOKEN:
required: true
GATEKEEPER_PASSWORD:
required: true
NPM_TOKEN:
required: false
MINTEL_PRIVATE_TOKEN:
required: false
GITEA_PAT:
required: false
jobs:
prepare:
name: 🏗️ Prepare & Install
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v3
with:
version: 10
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: 🔐 Registry Auth
run: |
echo "@mintel:registry=https://git.infra.mintel.me/api/packages/mmintel/npm" > .npmrc
echo "//git.infra.mintel.me/api/packages/mmintel/npm/:_authToken=${{ secrets.NPM_TOKEN || secrets.MINTEL_PRIVATE_TOKEN || secrets.GITEA_PAT }}" >> .npmrc
- name: Install dependencies
run: |
pnpm store prune
pnpm install --no-frozen-lockfile
- name: 📦 Archive dependencies
uses: actions/upload-artifact@v4
with:
name: node_modules
path: |
node_modules
.npmrc
retention-days: 1
static:
name: 🔍 Static Analysis
needs: prepare
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v3
with:
version: 10
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: 📥 Restore dependencies
uses: actions/download-artifact@v4
with:
name: node_modules
- name: 🌐 HTML Validation
env:
NEXT_PUBLIC_BASE_URL: ${{ inputs.TARGET_URL }}
GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
run: pnpm run check:html
- name: 🖼️ Asset Scan
env:
NEXT_PUBLIC_BASE_URL: ${{ inputs.TARGET_URL }}
GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
run: pnpm run check:assets
accessibility:
name: ♿ Accessibility
needs: prepare
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v3
with:
version: 10
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: 📥 Restore dependencies
uses: actions/download-artifact@v4
with:
name: node_modules
- name: 🔍 Install Chromium
run: |
apt-get update && apt-get install -y gnupg wget ca-certificates
CODENAME=$(. /etc/os-release && echo $VERSION_CODENAME)
mkdir -p /etc/apt/keyrings
wget -qO- "https://keyserver.ubuntu.com/pks/lookup?op=get&search=0x82BB6851C64F6880" | gpg --dearmor > /etc/apt/keyrings/xtradeb.gpg
echo "deb [signed-by=/etc/apt/keyrings/xtradeb.gpg] http://ppa.launchpad.net/xtradeb/apps/ubuntu $CODENAME main" > /etc/apt/sources.list.d/xtradeb-ppa.list
printf "Package: *\nPin: release o=LP-PPA-xtradeb-apps\nPin-Priority: 1001\n" > /etc/apt/preferences.d/xtradeb
apt-get update && apt-get install -y --allow-downgrades chromium
ln -sf /usr/bin/chromium /usr/bin/google-chrome
- name: ♿ WCAG Scan
continue-on-error: true
env:
NEXT_PUBLIC_BASE_URL: ${{ inputs.TARGET_URL }}
GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
run: pnpm run check:wcag
analysis:
name: 🧪 Maintenance & Links
needs: prepare
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup pnpm
uses: pnpm/action-setup@v3
with:
version: 10
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: 📥 Restore dependencies
uses: actions/download-artifact@v4
with:
name: node_modules
- name: 📦 Depcheck
continue-on-error: true
run: pnpm dlx depcheck --ignores="*eslint*,*typescript*,*tailwindcss*,*postcss*,*prettier*,*@types/*,*husky*,*lint-staged*,*@next/*,*@lhci/*,*commitlint*,*cspell*,*rimraf*,*@payloadcms/*,*start-server-and-test*,*html-validate*,*critters*,*dotenv*,*turbo*"
- name: 🔗 Lychee Link Check
uses: lycheeverse/lychee-action@v2
with:
args: --accept 200,204,429 --timeout 15 content/ app/ public/
fail: true
performance:
name: 🎭 Lighthouse
needs: prepare
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20
- name: Setup pnpm
uses: pnpm/action-setup@v3
with:
version: 10
- name: 📥 Restore dependencies
uses: actions/download-artifact@v4
with:
name: node_modules
- name: 🔍 Install Chromium
run: |
apt-get update && apt-get install -y gnupg wget ca-certificates
CODENAME=$(. /etc/os-release && echo $VERSION_CODENAME)
mkdir -p /etc/apt/keyrings
wget -qO- "https://keyserver.ubuntu.com/pks/lookup?op=get&search=0x82BB6851C64F6880" | gpg --dearmor > /etc/apt/keyrings/xtradeb.gpg
echo "deb [signed-by=/etc/apt/keyrings/xtradeb.gpg] http://ppa.launchpad.net/xtradeb/apps/ubuntu $CODENAME main" > /etc/apt/sources.list.d/xtradeb-ppa.list
printf "Package: *\nPin: release o=LP-PPA-xtradeb-apps\nPin-Priority: 1001\n" > /etc/apt/preferences.d/xtradeb
apt-get update && apt-get install -y --allow-downgrades chromium
ln -sf /usr/bin/chromium /usr/bin/google-chrome
- name: 🎭 LHCI Desktop
env:
LHCI_URL: ${{ inputs.TARGET_URL }}
GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
run: pnpm run pagespeed:test -- --collect.settings.preset=desktop
- name: 📱 LHCI Mobile
env:
LHCI_URL: ${{ inputs.TARGET_URL }}
GATEKEEPER_PASSWORD: ${{ secrets.GATEKEEPER_PASSWORD }}
run: pnpm run pagespeed:test -- --collect.settings.preset=mobile
notifications:
name: 🔔 Notify
needs: [prepare, static, accessibility, analysis, performance]
if: always()
runs-on: docker
container:
image: catthehacker/ubuntu:act-latest
steps:
- name: 🔔 Gotify
shell: bash
run: |
PREPARE="${{ needs.prepare.result }}"
STATIC="${{ needs.static.result }}"
A11Y="${{ needs.accessibility.result }}"
ANALYSIS="${{ needs.analysis.result }}"
PERF="${{ needs.performance.result }}"
PROJECT="${{ inputs.PROJECT_NAME }}"
URL="${{ inputs.TARGET_URL }}"
if [[ "$PREPARE" != "success" || "$STATIC" != "success" || "$PERF" != "success" ]]; then
PRIORITY=8
EMOJI="🚨"
STATUS_LINE="Nightly QA Failed! Action required."
else
PRIORITY=2
EMOJI="✅"
STATUS_LINE="Nightly QA Passed."
fi
TITLE="$EMOJI $PROJECT Nightly QA"
MESSAGE="$STATUS_LINE
Prepare: $PREPARE | Static: $STATIC | A11y: $A11Y
Analysis: $ANALYSIS | Perf: $PERF
$URL"
curl -s -k -X POST "${{ secrets.GOTIFY_URL }}/message?token=${{ secrets.GOTIFY_TOKEN }}" \
-F "title=$TITLE" \
-F "message=$MESSAGE" \
-F "priority=$PRIORITY" || true

.gitignore

@@ -37,3 +37,13 @@ Thumbs.db
# Changesets
.changeset/*.lock
directus/extensions/
packages/cms-infra/extensions/
packages/cms-infra/uploads/
directus/uploads/directus-health-file
# Estimation Engine Data
data/crawls/
packages/estimation-engine/out/
apps/web/out/estimations/


@@ -1,15 +1,8 @@
#!/usr/bin/env sh
. "$(dirname -- "$0")/_/husky.sh"
# Check if we are pushing a tag
if echo "$*" | grep -q "refs/tags/v"; then
echo "🏷️ Tag detected in push, syncing versions..."
pnpm sync-versions
# Stage the changed package.json files
git add "package.json" "packages/*/package.json" "apps/*/package.json"
# Amend the tag if it's on the current commit, but this is complex in pre-push.
# Better: Just warn the user that they might need to update the tag if package.json changed.
echo "⚠️ package.json files updated to match tag. Please ensure these changes are part of your tag/commit."
# Validate Directus SDK imports before push
# This prevents runtime crashes caused by importing non-existent exports
SCRIPT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
if [ -f "$SCRIPT_DIR/scripts/validate-sdk-imports.sh" ]; then
"$SCRIPT_DIR/scripts/validate-sdk-imports.sh" || exit 1
fi

.npmrc

@@ -1,6 +1,5 @@
@mintel:registry=https://npm.infra.mintel.me/
registry=https://npm.infra.mintel.me/
//npm.infra.mintel.me/:_authToken=${NPM_TOKEN}
@mintel:registry=https://git.infra.mintel.me/api/packages/mmintel/npm/
//git.infra.mintel.me/api/packages/mmintel/npm/:_authToken=${NPM_TOKEN}
always-auth=true
public-hoist-pattern[]=*

Dockerfile.template

@@ -0,0 +1,56 @@
# Stage 1: Builder
FROM git.infra.mintel.me/mmintel/nextjs:latest AS builder
WORKDIR /app
# Clean the workspace in case the base image is dirty
RUN rm -rf ./*
# Arguments for build-time configuration
ARG NEXT_PUBLIC_BASE_URL
ARG NEXT_PUBLIC_TARGET
ARG DIRECTUS_URL
ARG NPM_TOKEN
# Environment variables for Next.js build
ENV NEXT_PUBLIC_BASE_URL=$NEXT_PUBLIC_BASE_URL
ENV NEXT_PUBLIC_TARGET=$NEXT_PUBLIC_TARGET
ENV DIRECTUS_URL=$DIRECTUS_URL
ENV SKIP_RUNTIME_ENV_VALIDATION=true
ENV CI=true
# Enable pnpm
RUN corepack enable
# Copy lockfile and manifest for dependency installation caching
COPY pnpm-lock.yaml package.json .npmrc* ./
# Install dependencies with cache mount
RUN --mount=type=cache,id=pnpm,target=/pnpm/store \
--mount=type=secret,id=NPM_TOKEN \
export NPM_TOKEN=$(cat /run/secrets/NPM_TOKEN 2>/dev/null || echo $NPM_TOKEN) && \
pnpm install --frozen-lockfile
# Copy source code
COPY . .
# Build application
RUN pnpm build
# Stage 2: Runner
FROM git.infra.mintel.me/mmintel/runtime:latest AS runner
WORKDIR /app
ENV HOSTNAME="0.0.0.0"
ENV PORT=3000
ENV NODE_ENV=production
# Copy standalone output and static files
# Adjust paths if using a monorepo structure (e.g., /app/apps/web/public)
COPY --from=builder --chown=nextjs:nodejs /app/public ./public
COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static
COPY --from=builder --chown=nextjs:nodejs /app/.next/cache ./.next/cache
USER nextjs
CMD ["node", "server.js"]
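A possible invocation of this template is sketched below. The image tag, token file, and URLs are placeholders; the `--secret` flag is the BuildKit counterpart of the `--mount=type=secret,id=NPM_TOKEN` mount in the install step.

```shell
# Sketch: print the BuildKit invocation for this template.
# Every value below is a placeholder, not a required convention.
build_cmd() {
  printf '%s ' \
    docker build \
    -f Dockerfile.template \
    --secret id=NPM_TOKEN,src="$HOME/.npm_token" \
    --build-arg NEXT_PUBLIC_BASE_URL=https://example.com \
    --build-arg DIRECTUS_URL=https://cms.example.com \
    -t my-site:local .
  echo
}
build_cmd
```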


@@ -80,3 +80,5 @@ Client websites scaffolded via the CLI use a **tag-based deployment** strategy:
- **Git Tag `v*.*.*`**: Deploys to the `production` environment.
See the [`@mintel/infra`](packages/infra/README.md) package for detailed template documentation.
Trigger rebuilding for x86 architecture.
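The tag-based deployment above can be exercised with plain git. A minimal sketch in a throwaway repository follows; the version number is made up, and in the real repo it is `git push origin <tag>` that actually triggers the production deploy.

```shell
set -eu
# Demonstrate cutting a release tag in a temporary repo.
TMP="$(mktemp -d)"
cd "$TMP"
git init -q .
git -c user.email=dev@example.com -c user.name=dev \
  commit -q --allow-empty -m "chore: release"
TAG="v1.2.3"   # hypothetical version
git tag "$TAG"
# In the real repo: git push origin "$TAG" deploys to production
git tag --list
```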


@@ -1,71 +0,0 @@
services:
app:
build:
context: .
dockerfile: Dockerfile
args:
NEXT_PUBLIC_BASE_URL: ${NEXT_PUBLIC_BASE_URL:-http://localhost:3000}
NEXT_PUBLIC_UMAMI_WEBSITE_ID: ${NEXT_PUBLIC_UMAMI_WEBSITE_ID}
NEXT_PUBLIC_UMAMI_SCRIPT_URL: ${NEXT_PUBLIC_UMAMI_SCRIPT_URL}
NEXT_PUBLIC_TARGET: ${TARGET:-development}
DIRECTUS_URL: ${DIRECTUS_URL:-http://directus:8055}
restart: always
networks:
- infra
env_file:
- .env
ports:
- "3000:3000"
labels:
- "traefik.enable=true"
- "traefik.http.routers.sample-website.rule=Host(`${TRAEFIK_HOST:-sample-website.localhost}`)"
- "traefik.http.services.sample-website.loadbalancer.server.port=3000"
directus:
image: registry.infra.mintel.me/mintel/directus:latest
restart: always
networks:
- infra
env_file:
- .env
environment:
KEY: ${DIRECTUS_KEY:-mintel-key}
SECRET: ${DIRECTUS_SECRET:-mintel-secret}
ADMIN_EMAIL: ${DIRECTUS_ADMIN_EMAIL:-admin@mintel.me}
ADMIN_PASSWORD: ${DIRECTUS_ADMIN_PASSWORD:-mintel-admin}
DB_CLIENT: 'pg'
DB_HOST: 'directus-db'
DB_PORT: '5432'
DB_DATABASE: ${DIRECTUS_DB_NAME:-directus}
DB_USER: ${DIRECTUS_DB_USER:-directus}
DB_PASSWORD: ${DIRECTUS_DB_PASSWORD:-mintel-db-pass}
WEBSOCKETS_ENABLED: 'true'
PUBLIC_URL: ${DIRECTUS_URL:-http://localhost:8055}
ports:
- "8055:8055"
volumes:
- ./directus/uploads:/directus/uploads
- ./directus/extensions:/directus/extensions
labels:
- "traefik.enable=true"
- "traefik.http.routers.sample-website-directus.rule=Host(`${DIRECTUS_HOST:-cms.sample-website.localhost}`)"
- "traefik.http.services.sample-website-directus.loadbalancer.server.port=8055"
directus-db:
image: postgres:15-alpine
restart: always
networks:
- infra
environment:
POSTGRES_DB: ${DIRECTUS_DB_NAME:-directus}
POSTGRES_USER: ${DIRECTUS_DB_USER:-directus}
POSTGRES_PASSWORD: ${DIRECTUS_DB_PASSWORD:-mintel-db-pass}
volumes:
- directus-db-data:/var/lib/postgresql/data
networks:
infra:
external: true
volumes:
directus-db-data:


@@ -1,3 +0,0 @@
import { nextConfig } from "@mintel/eslint-config/next";
export default nextConfig;


@@ -1,6 +1,8 @@
import mintelNextConfig from "@mintel/next-config";
/** @type {import('next').NextConfig} */
const nextConfig = {};
const nextConfig = {
transpilePackages: ["@mintel/ui"],
};
export default mintelNextConfig(nextConfig);


@@ -1,6 +1,6 @@
{
"name": "sample-website",
"version": "1.6.0",
"version": "1.9.9",
"private": true,
"type": "module",
"scripts": {
@@ -8,25 +8,18 @@
"dev:local": "mintel dev --local",
"build": "next build",
"start": "next start",
"lint": "next lint",
"lint": "eslint src/",
"typecheck": "tsc --noEmit",
"test": "vitest run --passWithNoTests",
"cms:bootstrap": "mintel directus bootstrap",
"cms:push:testing": "mintel directus sync push testing",
"cms:pull:testing": "mintel directus sync pull testing",
"cms:push:staging": "mintel directus sync push staging",
"cms:pull:staging": "mintel directus sync pull staging",
"cms:push:prod": "mintel directus sync push production",
"cms:pull:prod": "mintel directus sync pull production",
"pagespeed:test": "mintel pagespeed"
},
"dependencies": {
"@mintel/next-observability": "workspace:*",
"@mintel/next-utils": "workspace:*",
"@mintel/observability": "workspace:*",
"@mintel/next-observability": "workspace:*",
"@sentry/nextjs": "^8.55.0",
"next": "15.1.6",
"next-intl": "^4.8.2",
"@sentry/nextjs": "10.38.0",
"next": "16.1.6",
"react": "^19.0.0",
"react-dom": "^19.0.0"
},

data/briefings/etib.txt

@@ -0,0 +1,246 @@
Hallo Marc,
eine harte Deadline gibt es nicht Was denkst du ist realistisch? Ich habe als Ziel so
April / Mai im Kopf -> dann aber schon zu 95 % fertig. Viele Grüße
Mit freundlichen Grüßen
Danny Joseph
Geschäftsführer
E-TIB GmbH
Gewerbestraße 22
D-03172 Guben
Mobil +49 15207230518
E-Mail d.joseph@e-tib.com
Web www.e-tib.com
--------------------------------------------------------------------------------------------------
Hey,
ich würde wie bei https://www.schleicher-gruppe.de/ ein Video auf der Startseite
haben wollen. Da ginge sicherlich was vom bisherigen Messevideo. Liebe Grüße.
Mit freundlichen Grüßen
Danny Joseph
Geschäftsführer
E-TIB GmbH
Gewerbestraße 22
D-03172 Guben
Mobil +49 15207230518
E-Mail d.joseph@e-tib.com
Web www.e-tib.com
--------------------------------------------------------------------------------------------------
Geschäftsführung: Danny Joseph
Handelsregister: Amtsgericht Cottbus
HRB: 12403 CB
USt. ID-Nr.: DE304799919
--------------------------------------------------------------------------------------------------
Von: Frieder Helmich <f.helmich@etib-ing.com>
Gesendet: Donnerstag, 29. Januar 2026 08:49
An: Marc Mintel <marc@cablecreations.de>; Danny Joseph <d.joseph@e-tib.com>
Betreff: AW: Homepage E-TIB
Hi Marc,
brauchst du nur Fotos oder bindest du auch videos ein? Wir haben sehr viel Videomaterial. Wir haben auch einen kleinen Film den wir auf der Messe laufen lassen haben.
Mit freundlichen Grüßen
i.A. Frieder Helmich
E-TIB Ingenieurgesellschaft mbH
Kampstraße 3
D-27412 Bülstedt
Tel +49 4283 6979923
Mobil +49 173 6560514
Fax +49 4283 6084091
E-Mail f.helmich@etib-ing.com
Web www.etib-ing.com
ETIB_Ing_logo_mk
Datenschutzhinweise: www.etib-ing.com/datenschutz
-----------------------------------------------------------------------------------------------
Geschäftsführung: Julian Helmich
Handelsregister: Amtsgericht Tostedt
HRB: 207158
-----------------------------------------------------------------------------------------------
Von: Marc Mintel <marc@cablecreations.de>
Gesendet: Mittwoch, 28. Januar 2026 18:10
An: Danny Joseph <d.joseph@e-tib.com>
Cc: Frieder Helmich <f.helmich@etib-ing.com>
Betreff: Re: Homepage E-TIB
Hallo Danny,
Vielen Dank für die schnelle Rückmeldung.
Wie gesprochen werde ich mir die Unterlagen und Webseiten im Detail anschauen und mich dann noch einmal bei dir melden.
Gibt es eigentlich eine Deadline oder einen zeitlichen Rahmen, wo ihr mit der neuen Webseite rechnen möchtet?
Je nach dem könnte man auch Features priorisieren, so dass der Kern der Seite schnellstmöglich modernisiert online geht und der Rest im Nachgang.
Das Foto-Material würde ich auch gerne sichten, dann kann man schon sehen, wie viel sich damit arbeiten lässt.
Viele Grüße
From: Danny Joseph <d.joseph@e-tib.com>
Organization: E-TIB GmbH
Date: Wednesday, 28. January 2026 at 16:16
To: Marc Mintel <marc@cablecreations.de>
Cc: 'Frieder Helmich' <f.helmich@etib-ing.com>
Subject: Homepage E-TIB
Hallo Marc,
wie telefonisch besprochen erste wirre Gedanken:
Wir möchten eine minimalistische, hochwertige Homepage die sowohl am PV, als auch
Auf Smartphone / Tablet etc. vernünftig ausschaut.
Bisher war unser Aufhänger:
DIE EXPERTEN FÜR KABELTIEFBAU …
Alles nur Ideen: …
# Schaltflächen ähnlich: https://www.schleicher-gruppe.de/
E-TIB GmbH
E-TIB Verwaltung GmbH
E-TIB Ingenieurgesellschaft mbH
E-TIB Bohrtechnik GmbH
# Schaltflächen ähnlich: https://www.schleicher-gruppe.de/
(ehemals Kompetenzen www.e-tib.com)
Cable construction
Cable plowing
Horizontal directional drilling
Electrical installations up to 110 kV
Fiber-optic cable installation
Maintenance & fault-clearing service
Permit and execution planning
Complex crossings (rail, motorway, waterways)
Electrical and grid-connection planning
Surveying & documentation
Input for "About us": grid … timeline?
Founding of E-TIB GmbH: 16.12.2015
Cable construction
Cable plowing
Horizontal directional drilling
Electrical installations up to 110 kV
Fiber-optic cable installation
Maintenance & fault-clearing service
Electrical and grid-connection planning
Surveying & documentation
Founding of E-TIB Verwaltung GmbH: 14.11.2019
Acquisition, rental, leasing, and management
of real estate, land, machinery, and equipment.
Founding of E-TIB Ingenieurgesellschaft mbH: 04.02.2019
Permit and execution planning
Complex crossings (rail, motorway, waterways)
Electrical and grid-connection planning
Founding of E-TIB Bohrtechnik GmbH: 21.10.2025
Horizontal directional drilling in all soil classes
Group tiles (sample texts) ...
ETIB GmbH: Execution of electrical infrastructure projects
ETIB Bohrtechnik GmbH: Precise horizontal drilling in all soil classes
ETIB Verwaltung GmbH: Central services, purchasing, finance
ETIB Ingenieurgesellschaft mbH: Planning, project engineering, documentation
Contact page: see www.e-tib.com
Careers: ...
Trade fairs where we will have a stand this year: Intersolar München, Windenergietage Linstow, Kabelwerkstatt Wiesbaden
References: … I would have to provide these to you
Mandatory pages
Imprint (complete: responsible persons, register number, VAT ID).
Privacy policy (processing activities, legal bases, data processing agreements, cookie groups, retention periods, data-subject rights).
Cookie settings (consent manager: ...)
www.e-tib.com
www.etib-ing.com
Here is my Instagram account:
me.and.eloise
Maybe you will understand me a little bit better…
Our Frieder Helmich can provide initial photo/video material:
f.helmich@etib-ing.com
Send me an idea of the hours involved / cost per hour for the build,
so that we have a basis for a contract. Then let's get started.
Many thanks.
Kind regards
Danny Joseph
Managing Director
E-TIB GmbH
Gewerbestraße 22
D-03172 Guben
Mobile +49 15207230518
E-mail d.joseph@e-tib.com
Web www.e-tib.com
--------------------------------------------------------------------------------------------------
Management: Danny Joseph
Commercial register: Amtsgericht Cottbus
HRB: 12403 CB
VAT ID: DE304799919
--------------------------------------------------------------------------------------------------
From: Marc Mintel <marc@cablecreations.de>
Sent: Thursday, 13 November 2025 16:30
To: d.joseph@e-tib.com
Subject: Homepage
Hi Danny,
my father suggested I get in touch with you, since you are looking for someone for your website.
Briefly about me: I have worked in web development for more than 10 years. My focus has since shifted to 3D work (including cablecreations.de), but I still look after websites for companies that want to hand the whole thing off without hassle. Among others, I maintain the KLZ site (klz-cables.com). My process is quite simple: if you need something, a short email is usually all it takes; I then handle adjustments, content, and technical topics in the background. That saves you the trainings, access management, and long meetings you often get with agencies.
One important point: even after it is built, a website needs regular maintenance so that the technology and security keep running cleanly. I take care of that as well, so it causes you no day-to-day effort.
To assess whether and how I can support you, it would help to know what you have planned for the website and what no longer works about the current one. If you like, we can also have a short call about it.
Best regards
Marc
Marc Mintel
Founder & 3D Artist
marc@cablecreations.de
Cable Creations
www.cablecreations.de
info@cablecreations.de
VAT: DE367588065
Georg-Meistermann-Straße 7
54586 Schüller
Germany


@@ -0,0 +1,39 @@
services:
gatekeeper-proxy:
image: alpine:latest
command: sleep infinity
restart: unless-stopped
networks:
- infra
labels:
- "caddy=http://gatekeeper.localhost"
- "caddy.route=/*"
- "caddy.route.0_redir=/ /gatekeeper/login 302"
- "caddy.route.1_reverse_proxy=gatekeeper-app:3000"
gatekeeper-app:
image: node:20-alpine
working_dir: /app
volumes:
- .:/app
- gatekeeper_root_node_modules:/app/node_modules
- gatekeeper_pkg_node_modules:/app/packages/gatekeeper/node_modules
- gatekeeper_next_cache:/app/packages/gatekeeper/.next
- gatekeeper_pnpm_store:/pnpm
environment:
- NODE_ENV=development
- NPM_TOKEN=${NPM_TOKEN:-}
networks:
- infra
command: >
sh -c "corepack enable && pnpm config set store-dir /pnpm && pnpm install --no-frozen-lockfile && pnpm --filter @mintel/gatekeeper run dev --hostname 0.0.0.0 --port 3000"
networks:
infra:
external: true
volumes:
gatekeeper_root_node_modules:
gatekeeper_pkg_node_modules:
gatekeeper_next_cache:
gatekeeper_pnpm_store:

docker-compose.yml (new file)

@@ -0,0 +1,27 @@
services:
app:
build:
context: ./apps/sample-website
dockerfile: Dockerfile
args:
NEXT_PUBLIC_BASE_URL: ${NEXT_PUBLIC_BASE_URL:-http://localhost:3000}
NEXT_PUBLIC_UMAMI_WEBSITE_ID: ${NEXT_PUBLIC_UMAMI_WEBSITE_ID}
NEXT_PUBLIC_UMAMI_SCRIPT_URL: ${NEXT_PUBLIC_UMAMI_SCRIPT_URL}
NEXT_PUBLIC_TARGET: ${TARGET:-development}
restart: always
networks:
- infra
env_file:
- .env
ports:
- "3000:3000"
labels:
- "traefik.enable=true"
- "traefik.http.routers.sample-website.rule=Host(`${TRAEFIK_HOST:-sample-website.localhost}`)"
- "traefik.http.services.sample-website.loadbalancer.server.port=3000"
- "caddy=http://${TRAEFIK_HOST:-acquisition.localhost}"
- "caddy.reverse_proxy={{upstreams 3000}}"
networks:
infra:
external: true


@@ -5,9 +5,13 @@ export default [
{
ignores: [
"packages/cms-infra/extensions/**",
"packages/customer-manager/index.js",
"**/index.js",
"**/*.db",
"**/build/**",
"**/data/**",
"**/reference/**",
"**/dist/**",
"**/.next/**",
],
},
...baseConfig,


@@ -0,0 +1 @@
404: Not Found


@@ -0,0 +1,30 @@
[
{
"weights":
[
{"name":"conv0/filters","shape":[3,3,3,16],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.009007044399485869,"min":-1.2069439495311063}},
{"name":"conv0/bias","shape":[16],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.005263455241334205,"min":-0.9211046672334858}},
{"name":"conv1/depthwise_filter","shape":[3,3,16,1],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.004001977630690033,"min":-0.5042491814669441}},
{"name":"conv1/pointwise_filter","shape":[1,1,16,32],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.013836609615999109,"min":-1.411334180831909}},
{"name":"conv1/bias","shape":[32],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.0015159862590771096,"min":-0.30926119685173037}},
{"name":"conv2/depthwise_filter","shape":[3,3,32,1],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.002666276225856706,"min":-0.317286870876948}},
{"name":"conv2/pointwise_filter","shape":[1,1,32,64],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.015265831292844286,"min":-1.6792414422128714}},
{"name":"conv2/bias","shape":[64],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.0020280554598453,"min":-0.37113414915168985}},
{"name":"conv3/depthwise_filter","shape":[3,3,64,1],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.006100742489683862,"min":-0.8907084034938438}},
{"name":"conv3/pointwise_filter","shape":[1,1,64,128],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.016276211832083907,"min":-2.0508026908425725}},
{"name":"conv3/bias","shape":[128],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.003394414279975143,"min":-0.7637432129944072}},
{"name":"conv4/depthwise_filter","shape":[3,3,128,1],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.006716050119961009,"min":-0.8059260143953211}},
{"name":"conv4/pointwise_filter","shape":[1,1,128,256],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.021875603993733724,"min":-2.8875797271728514}},
{"name":"conv4/bias","shape":[256],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.0041141652009066415,"min":-0.8187188749804216}},
{"name":"conv5/depthwise_filter","shape":[3,3,256,1],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.008423839597141042,"min":-0.9013508368940915}},
{"name":"conv5/pointwise_filter","shape":[1,1,256,512],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.030007277283014035,"min":-3.8709387695088107}},
{"name":"conv5/bias","shape":[512],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.008402082966823203,"min":-1.4871686851277068}},
{"name":"conv8/filters","shape":[1,1,512,25],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.028336129469030042,"min":-4.675461362389957}},
{"name":"conv8/bias","shape":[25],"dtype":"float32","quantization":{"dtype":"uint8","scale":0.002268134028303857,"min":-0.41053225912299807}}
],
"paths":
[
"tiny_face_detector_model.bin"
]
}
]

optimize-images.sh (new file)

@@ -0,0 +1,14 @@
#!/bin/bash
# Ghost Image Optimizer
# Target directory for Ghost content
TARGET_DIR="/home/deploy/sites/marisas.world/content/images"
echo "Starting image optimization for $TARGET_DIR..."
# Find all original images, excluding the 'size/' directory where Ghost stores thumbnails
# Resize images larger than 2500px down to 2500px width
# Compress JPEG/PNG to 80% quality
find "$TARGET_DIR" -type d -name "size" -prune -o \( -iname "*.jpg" -o -iname "*.jpeg" -o -iname "*.png" \) -type f -exec mogrify -resize '2500x>' -quality 80 {} +
echo "Optimization complete."


@@ -5,11 +5,17 @@
"scripts": {
"build": "pnpm -r build",
"dev": "pnpm -r dev",
"lint": "pnpm -r lint",
"dev:gatekeeper": "bash -c 'trap \"COMPOSE_PROJECT_NAME=gatekeeper docker-compose -f docker-compose.gatekeeper.yml down\" EXIT INT TERM; docker network create infra 2>/dev/null || true && COMPOSE_PROJECT_NAME=gatekeeper docker-compose -f docker-compose.gatekeeper.yml down && COMPOSE_PROJECT_NAME=gatekeeper docker-compose -f docker-compose.gatekeeper.yml up --build --remove-orphans'",
"dev:mcps:up": "docker-compose -f docker-compose.mcps.yml up -d",
"dev:mcps:down": "docker-compose -f docker-compose.mcps.yml down",
"dev:mcps:watch": "pnpm -r --filter=\"./packages/*-mcp\" run dev",
"dev:mcps": "npm run dev:mcps:up && npm run dev:mcps:watch",
"lint": "pnpm -r --filter='./packages/**' --filter='./apps/**' lint",
"test": "pnpm -r test",
"changeset": "changeset",
"version-packages": "changeset version",
"sync-versions": "tsx scripts/sync-versions.ts",
"sync-versions": "tsx scripts/sync-versions.ts --",
"release:version": "bash scripts/release.sh",
"release": "pnpm build && changeset publish",
"release:tag": "pnpm build && pnpm -r publish --no-git-checks --access public",
"prepare": "husky"
@@ -20,6 +26,7 @@
"@commitlint/config-conventional": "^20.4.0",
"@mintel/eslint-config": "workspace:*",
"@mintel/husky-config": "workspace:*",
"@next/eslint-plugin-next": "16.1.6",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@types/node": "^20.17.16",
@@ -27,7 +34,6 @@
"@types/react-dom": "^19.2.3",
"@vitejs/plugin-react": "^5.1.2",
"eslint": "^9.39.2",
"@next/eslint-plugin-next": "16.1.6",
"eslint-plugin-react": "^7.37.5",
"eslint-plugin-react-hooks": "^7.0.1",
"happy-dom": "^20.4.0",
@@ -41,10 +47,29 @@
"vitest": "^4.0.18"
},
"dependencies": {
"globals": "^17.3.0",
"import-in-the-middle": "^3.0.0",
"pino": "^10.3.1",
"pino-pretty": "^13.1.3",
"require-in-the-middle": "^8.0.1"
},
"version": "1.6.0"
"version": "1.9.9",
"pnpm": {
"onlyBuiltDependencies": [
"@parcel/watcher",
"@sentry/cli",
"@swc/core",
"@tensorflow/tfjs-node",
"canvas",
"core-js",
"esbuild",
"sharp",
"unrs-resolver",
"vue-demi"
],
"overrides": {
"next": "16.1.6",
"@sentry/nextjs": "10.38.0"
}
}
}


@@ -1,9 +1,9 @@
{
"name": "@mintel/cli",
"version": "1.6.0",
"version": "1.9.9",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"
"registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"
},
"type": "module",
"bin": {
@@ -16,16 +16,19 @@
"test": "vitest run"
},
"dependencies": {
"commander": "^11.0.0",
"fs-extra": "^11.1.0",
"chalk": "^5.3.0",
"prompts": "^2.4.2"
"commander": "^11.0.0",
"fs-extra": "^11.1.0"
},
"devDependencies": {
"tsup": "^8.0.0",
"typescript": "^5.0.0",
"@mintel/tsconfig": "workspace:*",
"@types/fs-extra": "^11.0.0",
"@types/prompts": "^2.4.4",
"@mintel/tsconfig": "workspace:*"
"tsup": "^8.0.0",
"typescript": "^5.0.0"
},
"repository": {
"type": "git",
"url": "https://git.infra.mintel.me/mmintel/at-mintel.git"
}
}


@@ -36,153 +36,15 @@ program
console.log(
chalk.yellow(`
📱 App: http://localhost:3000
🗄️ CMS: http://localhost:8055/admin
🚦 Traefik: http://localhost:8080
`),
);
execSync(
"docker-compose down --remove-orphans && docker-compose up app directus directus-db",
"docker compose down --remove-orphans && docker compose up -d app",
{ stdio: "inherit" },
);
});
const directus = program
.command("directus")
.description("Directus management commands");
directus
.command("bootstrap")
.description("Setup Directus branding and settings")
.action(async () => {
const { execSync } = await import("child_process");
console.log(chalk.blue("🎨 Bootstrapping Directus..."));
execSync("npx tsx --env-file=.env scripts/setup-directus.ts", {
stdio: "inherit",
});
});
directus
.command("bootstrap-feedback")
.description("Setup Directus collections and flows for Feedback")
.action(async () => {
const { execSync } = await import("child_process");
console.log(chalk.blue("📧 Bootstrapping Visual Feedback System..."));
// Use the logic from setup-feedback-hardened.ts
const bootstrapScript = `
import { createDirectus, rest, authentication, createCollection, createDashboard, createPanel, createItems, createPermission, readPolicies, readRoles, readUsers } from '@directus/sdk';
async function setup() {
const url = process.env.DIRECTUS_URL || 'http://localhost:8055';
const email = process.env.DIRECTUS_ADMIN_EMAIL;
const password = process.env.DIRECTUS_ADMIN_PASSWORD;
if (!email || !password) {
console.error('❌ DIRECTUS_ADMIN_EMAIL or DIRECTUS_ADMIN_PASSWORD not set');
process.exit(1);
}
const client = createDirectus(url).with(authentication('json')).with(rest());
try {
console.log('🔑 Authenticating...');
await client.login(email, password);
const roles = await client.request(readRoles());
const adminRole = roles.find(r => r.name === 'Administrator');
const policies = await client.request(readPolicies());
const adminPolicy = policies.find(p => p.name === 'Administrator');
console.log('🏗️ Creating Collection "visual_feedback"...');
try {
await client.request(createCollection({
collection: 'visual_feedback',
meta: { icon: 'feedback', display_template: '{{user_name}}: {{text}}' },
fields: [
{ field: 'id', type: 'uuid', schema: { is_primary_key: true } },
{ field: 'status', type: 'string', schema: { default_value: 'open' }, meta: { interface: 'select-dropdown' } },
{ field: 'url', type: 'string' },
{ field: 'selector', type: 'string' },
{ field: 'x', type: 'float' },
{ field: 'y', type: 'float' },
{ field: 'type', type: 'string' },
{ field: 'text', type: 'text' },
{ field: 'user_name', type: 'string' },
{ field: 'user_identity', type: 'string' },
{ field: 'screenshot', type: 'uuid', meta: { interface: 'file' } },
{ field: 'date_created', type: 'timestamp', schema: { default_value: 'NOW()' } }
]
} as any));
} catch (_e) { console.log(' (Collection might already exist)'); }
try {
await client.request(createCollection({
collection: 'visual_feedback_comments',
meta: { icon: 'comment' },
fields: [
{ field: 'id', type: 'integer', schema: { is_primary_key: true, has_auto_increment: true } },
{ field: 'feedback_id', type: 'uuid', meta: { interface: 'select-dropdown' } },
{ field: 'user_name', type: 'string' },
{ field: 'text', type: 'text' },
{ field: 'date_created', type: 'timestamp', schema: { default_value: 'NOW()' } }
]
} as any));
} catch (e) { }
if (adminPolicy) {
console.log('🔐 Granting ALL permissions to Administrator Policy...');
for (const coll of ['visual_feedback', 'visual_feedback_comments']) {
for (const action of ['create', 'read', 'update', 'delete']) {
try {
await client.request(createPermission({
collection: coll,
action,
fields: ['*'],
policy: adminPolicy.id
} as any));
} catch (_e) { }
}
}
}
console.log('📊 Creating Dashboard...');
try {
const dash = await client.request(createDashboard({ name: 'Visual Feedback', icon: 'feedback', color: '#6366f1' }));
await client.request(createPanel({
dashboard: dash.id,
name: 'Total Feedbacks',
type: 'metric',
width: 12, height: 6, position_x: 1, position_y: 1,
options: { collection: 'visual_feedback', function: 'count', field: 'id' }
} as any));
} catch (e) { }
console.log('✨ FEEDBACK BOOTSTRAP DONE.');
} catch (e) { console.error('❌ FAILURE:', e); }
}
setup();
`;
const tempFile = path.join(process.cwd(), "temp-bootstrap-feedback.ts");
await fs.writeFile(tempFile, bootstrapScript);
try {
execSync("npx tsx --env-file=.env " + tempFile, { stdio: "inherit" });
} finally {
await fs.remove(tempFile);
}
});
directus
.command("sync <action> <env>")
.description("Sync Directus data (push/pull) for a specific environment")
.action(async (action, env) => {
const { execSync } = await import("child_process");
console.log(
chalk.blue(`📥 Executing Directus sync: ${action} -> ${env}...`),
);
execSync(`./scripts/sync-directus.sh ${action} ${env}`, {
stdio: "inherit",
});
});
program
.command("pagespeed")
.description("Run PageSpeed (Lighthouse) tests")
@@ -221,13 +83,6 @@ program
lint: "next lint",
typecheck: "tsc --noEmit",
test: "vitest run --passWithNoTests",
"directus:bootstrap": "mintel directus bootstrap",
"directus:push:testing": "mintel directus sync push testing",
"directus:pull:testing": "mintel directus sync pull testing",
"directus:push:staging": "mintel directus sync push staging",
"directus:pull:staging": "mintel directus sync pull staging",
"directus:push:prod": "mintel directus sync push production",
"directus:pull:prod": "mintel directus sync pull production",
"pagespeed:test": "mintel pagespeed",
},
dependencies: {
@@ -236,7 +91,6 @@ program
"react-dom": "^19.0.0",
"@mintel/next-utils": "workspace:*",
"@mintel/next-observability": "workspace:*",
"@directus/sdk": "^21.0.0",
},
devDependencies: {
"@types/node": "^20.0.0",
@@ -473,15 +327,6 @@ export default function Home() {
}
}
// Create Directus structure
await fs.ensureDir(path.join(fullPath, "directus/uploads"));
await fs.ensureDir(path.join(fullPath, "directus/extensions"));
await fs.writeFile(path.join(fullPath, "directus/uploads/.gitkeep"), "");
await fs.writeFile(
path.join(fullPath, "directus/extensions/.gitkeep"),
"",
);
// Create .env.example
const envExample = `# Project
PROJECT_NAME=${projectName}
@@ -493,21 +338,10 @@ AUTH_COOKIE_NAME=mintel_gatekeeper_session
# Host Config (Local)
TRAEFIK_HOST=\`${projectName}.localhost\`
DIRECTUS_HOST=\`cms.${projectName}.localhost\`
# Next.js
NEXT_PUBLIC_BASE_URL=http://${projectName}.localhost
# Directus
DIRECTUS_URL=http://cms.${projectName}.localhost
DIRECTUS_KEY=$(openssl rand -hex 32 2>/dev/null || echo "mintel-key")
DIRECTUS_SECRET=$(openssl rand -hex 32 2>/dev/null || echo "mintel-secret")
DIRECTUS_ADMIN_EMAIL=admin@mintel.me
DIRECTUS_ADMIN_PASSWORD=mintel-admin-pass
DIRECTUS_DB_NAME=directus
DIRECTUS_DB_USER=directus
DIRECTUS_DB_PASSWORD=mintel-db-pass
# Sentry / Glitchtip
SENTRY_DSN=


@@ -0,0 +1,43 @@
import { build } from 'esbuild';
import { resolve, dirname } from 'path';
import { mkdirSync } from 'fs';
import { fileURLToPath } from 'url';
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
const entryPoints = [
resolve(__dirname, 'src/index.ts')
];
try {
mkdirSync(resolve(__dirname, 'dist'), { recursive: true });
} catch {
// ignore
}
console.log(`Building entry point...`);
build({
entryPoints: entryPoints,
bundle: true,
platform: 'node',
target: 'node18',
outdir: resolve(__dirname, 'dist'),
format: 'esm',
loader: {
'.ts': 'ts',
'.js': 'js',
},
external: ["playwright", "crawlee", "axios", "cheerio", "fs", "path", "os", "http", "https", "url", "stream", "util", "child_process"],
}).then(() => {
console.log("Build succeeded!");
}).catch((e) => {
if (e.errors) {
console.error("Build failed with errors:");
e.errors.forEach(err => console.error(` ${err.text} at ${err.location?.file}:${err.location?.line}`));
} else {
console.error("Build failed:", e);
}
process.exit(1);
});


@@ -0,0 +1,33 @@
{
"name": "@mintel/cloner",
"version": "1.9.9",
"type": "module",
"main": "dist/index.js",
"module": "dist/index.js",
"types": "dist/index.d.ts",
"exports": {
".": {
"types": "./dist/index.d.ts",
"import": "./dist/index.js",
"default": "./dist/index.js"
}
},
"scripts": {
"build": "node build.mjs",
"dev": "node build.mjs --watch"
},
"devDependencies": {
"@types/node": "^22.0.0",
"esbuild": "^0.25.0",
"typescript": "^5.6.3"
},
"dependencies": {
"axios": "^1.6.0",
"crawlee": "^3.7.0",
"playwright": "^1.40.0"
},
"repository": {
"type": "git",
"url": "https://git.infra.mintel.me/mmintel/at-mintel.git"
}
}


@@ -0,0 +1,98 @@
import axios from "axios";
import fs from "node:fs";
import path from "node:path";
export interface AssetMap {
[originalUrl: string]: string;
}
export class AssetManager {
private userAgent: string;
constructor(
userAgent: string = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36",
) {
this.userAgent = userAgent;
}
public sanitizePath(rawPath: string): string {
return rawPath
.split("/")
.map((p) => p.replace(/[^a-z0-9._-]/gi, "_"))
.join("/");
}
public async downloadFile(
url: string,
assetsDir: string,
): Promise<string | null> {
if (url.startsWith("//")) url = `https:${url}`;
if (!url.startsWith("http")) return null;
try {
const u = new URL(url);
const relPath = this.sanitizePath(u.hostname + u.pathname);
const dest = path.join(assetsDir, relPath);
if (fs.existsSync(dest)) return `./assets/${relPath}`;
const res = await axios.get(url, {
responseType: "arraybuffer",
headers: { "User-Agent": this.userAgent },
timeout: 15000,
validateStatus: () => true,
});
if (res.status !== 200) return null;
if (!fs.existsSync(path.dirname(dest)))
fs.mkdirSync(path.dirname(dest), { recursive: true });
fs.writeFileSync(dest, Buffer.from(res.data));
return `./assets/${relPath}`;
} catch {
return null;
}
}
public async processCssRecursively(
cssContent: string,
cssUrl: string,
assetsDir: string,
urlMap: AssetMap,
depth = 0,
): Promise<string> {
if (depth > 5) return cssContent;
const urlRegex = /(?:url\(["']?|@import\s+["'])([^"')]*)["']?\)?/gi;
let match;
let newContent = cssContent;
while ((match = urlRegex.exec(cssContent)) !== null) {
const originalUrl = match[1];
if (originalUrl.startsWith("data:") || originalUrl.startsWith("blob:"))
continue;
try {
const absUrl = new URL(originalUrl, cssUrl).href;
const local = await this.downloadFile(absUrl, assetsDir);
if (local) {
const u = new URL(cssUrl);
const cssPath = u.hostname + u.pathname;
const assetPath = new URL(absUrl).hostname + new URL(absUrl).pathname;
const rel = path.relative(
path.dirname(this.sanitizePath(cssPath)),
this.sanitizePath(assetPath),
);
newContent = newContent.split(originalUrl).join(rel);
urlMap[absUrl] = local;
}
} catch {
// Ignore
}
}
return newContent;
}
}


@@ -0,0 +1,256 @@
import { chromium } from "playwright";
import fs from "node:fs";
import path from "node:path";
import axios from "axios";
import { AssetManager, AssetMap } from "./AssetManager.js";
export interface PageClonerOptions {
outputDir: string;
userAgent?: string;
}
export class PageCloner {
private options: PageClonerOptions;
private assetManager: AssetManager;
private userAgent: string;
constructor(options: PageClonerOptions) {
this.options = options;
this.userAgent =
options.userAgent ||
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36";
this.assetManager = new AssetManager(this.userAgent);
}
public async clone(targetUrl: string): Promise<string> {
const urlObj = new URL(targetUrl);
const domainSlug = urlObj.hostname.replace("www.", "");
const domainDir = path.resolve(this.options.outputDir, domainSlug);
const assetsDir = path.join(domainDir, "assets");
if (!fs.existsSync(assetsDir)) fs.mkdirSync(assetsDir, { recursive: true });
let pageSlug = urlObj.pathname.split("/").filter(Boolean).join("-");
if (!pageSlug) pageSlug = "index";
const htmlFilename = `${pageSlug}.html`;
console.log(`🚀 INDUSTRIAL CLONE: ${targetUrl}`);
const browser = await chromium.launch({ headless: true });
const context = await browser.newContext({
userAgent: this.userAgent,
viewport: { width: 1920, height: 1080 },
});
const page = await context.newPage();
const urlMap: AssetMap = {};
const foundAssets = new Set<string>();
page.on("response", (response) => {
if (response.status() === 200) {
const url = response.url();
if (
url.match(
/\.(css|js|png|jpg|jpeg|gif|svg|woff2?|ttf|otf|mp4|webm|webp|ico)/i,
)
) {
foundAssets.add(url);
}
}
});
try {
await page.goto(targetUrl, { waitUntil: "networkidle", timeout: 90000 });
// Scroll Wave
await page.evaluate(async () => {
await new Promise((resolve) => {
let totalHeight = 0;
const distance = 400;
const timer = setInterval(() => {
const scrollHeight = document.body.scrollHeight;
window.scrollBy(0, distance);
totalHeight += distance;
if (totalHeight >= scrollHeight) {
clearInterval(timer);
window.scrollTo(0, 0);
resolve(true);
}
}, 100);
});
});
const fullHeight = await page.evaluate(() => document.body.scrollHeight);
await page.setViewportSize({ width: 1920, height: fullHeight + 1000 });
await page.waitForTimeout(3000);
// Sanitization
await page.evaluate(() => {
const assetPattern =
/\.(jpg|jpeg|png|gif|svg|webp|mp4|webm|woff2?|ttf|otf)/i;
document.querySelectorAll("*").forEach((el) => {
if (
["META", "LINK", "HEAD", "SCRIPT", "STYLE", "SVG", "PATH"].includes(
el.tagName,
)
)
return;
const htmlEl = el as HTMLElement;
const style = window.getComputedStyle(htmlEl);
if (style.opacity === "0" || style.visibility === "hidden") {
htmlEl.style.setProperty("opacity", "1", "important");
htmlEl.style.setProperty("visibility", "visible", "important");
}
for (const attr of Array.from(el.attributes)) {
const name = attr.name.toLowerCase();
const val = attr.value;
if (
assetPattern.test(val) ||
name.includes("src") ||
name.includes("image")
) {
if (el.tagName === "IMG") {
const img = el as HTMLImageElement;
if (name.includes("srcset")) img.srcset = val;
else if (!img.src || img.src.includes("data:")) img.src = val;
}
if (el.tagName === "SOURCE")
(el as HTMLSourceElement).srcset = val;
if (el.tagName === "VIDEO" || el.tagName === "AUDIO")
(el as HTMLMediaElement).src = val;
if (
val.match(/^(https?:\/\/|\/\/|\/)/) &&
!name.includes("href")
) {
const bg = htmlEl.style.backgroundImage;
if (!bg || bg === "none")
htmlEl.style.backgroundImage = `url('${val}')`;
}
}
}
});
if (document.body) {
document.body.style.setProperty("opacity", "1", "important");
document.body.style.setProperty("visibility", "visible", "important");
}
});
await page.waitForLoadState("networkidle");
await page.waitForTimeout(1000);
const content = await page.content();
const regexPatterns = [
/(?:src|href|url|data-[a-z-]+|srcset)=["']([^"'<>\s]+?\.(?:css|js|png|jpg|jpeg|gif|svg|woff2?|ttf|otf|mp4|webm|webp|ico)(?:\?[^"']*)?)["']/gi,
/url\(["']?([^"')]*)["']?\)/gi,
];
for (const pattern of regexPatterns) {
let match;
while ((match = pattern.exec(content)) !== null) {
try {
foundAssets.add(new URL(match[1], targetUrl).href);
} catch {
// Ignore invalid URLs
}
}
}
for (const url of foundAssets) {
const local = await this.assetManager.downloadFile(url, assetsDir);
if (local) {
urlMap[url] = local;
const clean = url.split("?")[0];
urlMap[clean] = local;
if (clean.endsWith(".css")) {
try {
const { data } = await axios.get(url, {
headers: { "User-Agent": this.userAgent },
});
const processedCss =
await this.assetManager.processCssRecursively(
data,
url,
assetsDir,
urlMap,
);
const relPath = this.assetManager.sanitizePath(
new URL(url).hostname + new URL(url).pathname,
);
fs.writeFileSync(path.join(assetsDir, relPath), processedCss);
} catch {
// Ignore stylesheet download/process failures
}
}
}
}
let finalContent = content;
const sortedUrls = Object.keys(urlMap).sort(
(a, b) => b.length - a.length,
);
if (sortedUrls.length > 0) {
const escaped = sortedUrls.map((u) =>
u.replace(/[.*+?^${}()|[\]\\]/g, "\\$&"),
);
const masterRegex = new RegExp(`(${escaped.join("|")})`, "g");
finalContent = finalContent.replace(
masterRegex,
(match) => urlMap[match] || match,
);
}
const commonDirs = [
"/wp-content/",
"/wp-includes/",
"/assets/",
"/static/",
"/images/",
];
for (const dir of commonDirs) {
const localDir = `./assets/${urlObj.hostname}${dir}`;
finalContent = finalContent
.split(`"${dir}`)
.join(`"${localDir}`)
.split(`'${dir}`)
.join(`'${localDir}`)
.split(`(${dir}`)
.join(`(${localDir}`);
}
const domainPattern = new RegExp(
`https?://(www\\.)?${urlObj.hostname.replace(/\./g, "\\.")}[^"']*`,
"gi",
);
finalContent = finalContent.replace(domainPattern, () => "./");
finalContent = finalContent.replace(
/<script\b[^>]*>([\s\S]*?)<\/script>/gi,
(match, scriptContent) => {
const lower = scriptContent.toLowerCase();
return lower.includes("google-analytics") ||
lower.includes("gtag") ||
lower.includes("fbq") ||
lower.includes("lazy") ||
lower.includes("tracker")
? ""
: match;
},
);
const headEnd = finalContent.indexOf("</head>");
if (headEnd > -1) {
const stabilityCss = `\n<style>* { transition: none !important; animation: none !important; scroll-behavior: auto !important; } [data-aos], .reveal, .lazypath, .lazy-load, [data-src] { opacity: 1 !important; visibility: visible !important; transform: none !important; clip-path: none !important; } img, video, iframe { max-width: 100%; display: block; } a { pointer-events: none; cursor: default; } </style>`;
finalContent =
finalContent.slice(0, headEnd) +
stabilityCss +
finalContent.slice(headEnd);
}
const finalPath = path.join(domainDir, htmlFilename);
fs.writeFileSync(finalPath, finalContent);
return finalPath;
} finally {
await browser.close();
}
}
}


@@ -0,0 +1,150 @@
import { PlaywrightCrawler, RequestQueue } from "crawlee";
import * as path from "node:path";
import * as fs from "node:fs";
import { execSync } from "node:child_process";
export interface WebsiteClonerOptions {
baseOutputDir: string;
maxRequestsPerCrawl?: number;
maxConcurrency?: number;
}
export class WebsiteCloner {
private options: WebsiteClonerOptions;
constructor(options: WebsiteClonerOptions) {
this.options = {
maxRequestsPerCrawl: 100,
maxConcurrency: 3,
...options,
};
}
public async clone(
targetUrl: string,
outputDirName?: string,
): Promise<string> {
const urlObj = new URL(targetUrl);
const domain = urlObj.hostname;
const finalOutputDirName = outputDirName || domain.replace(/\./g, "-");
const baseOutputDir = path.resolve(
this.options.baseOutputDir,
finalOutputDirName,
);
if (fs.existsSync(baseOutputDir)) {
fs.rmSync(baseOutputDir, { recursive: true, force: true });
}
fs.mkdirSync(baseOutputDir, { recursive: true });
console.log(`🚀 Starting perfect recursive clone of ${targetUrl}...`);
console.log(`📂 Output: ${baseOutputDir}`);
const requestQueue = await RequestQueue.open();
await requestQueue.addRequest({ url: targetUrl });
const crawler = new PlaywrightCrawler({
requestQueue,
maxRequestsPerCrawl: this.options.maxRequestsPerCrawl,
maxConcurrency: this.options.maxConcurrency,
async requestHandler({ request, enqueueLinks, log }) {
const url = request.url;
log.info(`Capturing ${url}...`);
const u = new URL(url);
let relPath = u.pathname;
if (relPath === "/" || relPath === "") relPath = "/index.html";
if (!relPath.endsWith(".html") && !path.extname(relPath))
relPath += "/index.html";
if (relPath.startsWith("/")) relPath = relPath.substring(1);
const fullPath = path.join(baseOutputDir, relPath);
fs.mkdirSync(path.dirname(fullPath), { recursive: true });
try {
// Note: This assumes single-file-cli is available in the environment
execSync(
`npx single-file-cli "${url}" "${fullPath}" --browser-headless=true --browser-wait-until=networkidle0`,
{
stdio: "inherit",
},
);
} catch (_e) {
log.error(`Failed to capture ${url} with SingleFile`);
}
await enqueueLinks({
strategy: "same-domain",
transformRequestFunction: (req) => {
if (
/\.(download|pdf|zip|gz|exe|png|jpg|jpeg|gif|svg|css|js)$/i.test(
req.url,
)
)
return false;
return req;
},
});
},
});
await crawler.run();
console.log("🔗 Rewriting internal links for offline navigation...");
const allFiles = this.getFiles(baseOutputDir).filter((f) =>
f.endsWith(".html"),
);
for (const file of allFiles) {
let content = fs.readFileSync(file, "utf8");
const fileRelToRoot = path.relative(baseOutputDir, file);
content = content.replace(/href="([^"]+)"/g, (match, href) => {
if (
href.startsWith(targetUrl) ||
href.startsWith("/") ||
(!href.includes("://") && !href.startsWith("data:"))
) {
try {
const linkUrl = new URL(href, targetUrl);
if (linkUrl.hostname === domain) {
let linkPath = linkUrl.pathname;
if (linkPath === "/" || linkPath === "") linkPath = "/index.html";
if (!linkPath.endsWith(".html") && !path.extname(linkPath))
linkPath += "/index.html";
if (linkPath.startsWith("/")) linkPath = linkPath.substring(1);
const relativeLink = path.relative(
path.dirname(fileRelToRoot),
linkPath,
);
return `href="${relativeLink}"`;
}
} catch (_e) {
// Ignore link rewriting failures
}
}
return match;
});
fs.writeFileSync(file, content);
}
console.log(`\n✅ Done! Perfect clone complete in: ${baseOutputDir}`);
return baseOutputDir;
}
private getFiles(dir: string, fileList: string[] = []): string[] {
const files = fs.readdirSync(dir);
for (const file of files) {
const name = path.join(dir, file);
if (fs.statSync(name).isDirectory()) {
this.getFiles(name, fileList);
} else {
fileList.push(name);
}
}
return fileList;
}
}
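Both the crawler's `requestHandler` and the link rewriter above map a URL pathname to an on-disk file with the same three rules. A sketch of that shared mapping as one helper (the `urlPathToFile` name is hypothetical; the class inlines this logic in two places):

```typescript
import * as path from "node:path";

// Map a URL pathname to the relative file path used for the offline clone:
// "/" becomes "index.html", extensionless paths get "/index.html" appended,
// and the leading slash is dropped so path.join stays relative.
function urlPathToFile(pathname: string): string {
  let relPath = pathname;
  if (relPath === "/" || relPath === "") relPath = "/index.html";
  if (!relPath.endsWith(".html") && !path.extname(relPath)) relPath += "/index.html";
  if (relPath.startsWith("/")) relPath = relPath.substring(1);
  return relPath;
}
```

Extracting this into a single helper would also keep the crawler and the rewriter from drifting apart.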


@@ -0,0 +1,3 @@
export * from "./AssetManager.js";
export * from "./PageCloner.js";
export * from "./WebsiteCloner.js";


@@ -0,0 +1,17 @@
{
"extends": "../tsconfig/base.json",
"compilerOptions": {
"outDir": "dist",
"rootDir": "src",
"declaration": true,
"emitDeclarationOnly": true,
"module": "ESNext",
"target": "ESNext",
"moduleResolution": "Bundler",
"allowImportingTsExtensions": true,
"noEmit": false
},
"include": [
"src/**/*"
]
}

Binary file not shown.


@@ -1,39 +0,0 @@
services:
infra-cms:
image: directus/directus:11
ports:
- "8059:8055"
networks:
- default
- infra
environment:
KEY: "infra-cms-key"
SECRET: "infra-cms-secret"
ADMIN_EMAIL: "marc@mintel.me"
ADMIN_PASSWORD: "Tim300493."
DB_CLIENT: "sqlite3"
DB_FILENAME: "/directus/database/data.db"
WEBSOCKETS_ENABLED: "true"
EMAIL_TRANSPORT: "smtp"
EMAIL_SMTP_HOST: "smtp.eu.mailgun.org"
EMAIL_SMTP_PORT: "587"
EMAIL_SMTP_USER: "postmaster@mg.mintel.me"
EMAIL_SMTP_PASSWORD: "4592fcb94599ee1a45b4ac2386fd0a64-102c75d8-ca2870e6"
EMAIL_SMTP_SECURE: "false"
EMAIL_FROM: "postmaster@mg.mintel.me"
volumes:
- ./database:/directus/database
- ./uploads:/directus/uploads
- ./schema:/directus/schema
- ./extensions:/directus/extensions
labels:
- "traefik.enable=true"
- "traefik.http.routers.infra-cms.rule=Host(`cms.localhost`)"
- "traefik.http.services.infra-cms.loadbalancer.server.port=8055"
- "traefik.docker.network=infra"
networks:
default:
name: mintel-infra-cms-internal
infra:
external: true


@@ -1,851 +0,0 @@
import { useApi as e, defineModule as a } from "@directus/extensions-sdk";
import {
defineComponent as t,
ref as l,
onMounted as n,
resolveComponent as i,
resolveDirective as s,
openBlock as d,
createBlock as r,
withCtx as u,
createVNode as o,
createElementBlock as m,
Fragment as c,
renderList as v,
createTextVNode as p,
toDisplayString as f,
createCommentVNode as g,
createElementVNode as y,
withDirectives as b,
nextTick as _,
} from "vue";
const h = { class: "content-wrapper" },
x = { key: 0, class: "empty-state" },
w = { class: "header" },
k = { class: "header-left" },
V = { class: "title" },
C = { class: "subtitle" },
M = { class: "header-right" },
F = { class: "user-cell" },
N = { class: "user-name" },
z = { key: 0, class: "status-date" },
E = { key: 0, class: "drawer-content" },
U = { class: "form-section" },
S = { class: "field" },
A = { class: "drawer-actions" },
T = { key: 0, class: "drawer-content" },
Z = { class: "form-section" },
j = { class: "field" },
$ = { class: "field" },
D = { class: "field" },
O = { key: 1, class: "field" },
W = { class: "drawer-actions" };
var q = t({
__name: "module",
setup(a) {
const t = e(),
q = l([]),
B = l(null),
K = l([]),
L = l(!1),
P = l(!1),
G = l(null),
I = l(null),
H = l(!1),
J = l(!1),
Q = l({ id: "", name: "" }),
R = l(!1),
X = l(!1),
Y = l({
id: "",
first_name: "",
last_name: "",
email: "",
temporary_password: "",
}),
ee = [
{ text: "Name", value: "name", sortable: !0 },
{ text: "E-Mail", value: "email", sortable: !0 },
{ text: "Zuletzt eingeladen", value: "last_invited", sortable: !0 },
];
async function ae() {
const e = await t.get("/items/companies", {
params: { fields: ["id", "name"], sort: "name" },
});
q.value = e.data.data;
}
async function te(e) {
((B.value = e), (L.value = !0));
try {
const a = await t.get("/items/client_users", {
params: {
filter: { company: { _eq: e.id } },
fields: ["*"],
sort: "first_name",
},
});
K.value = a.data.data;
} finally {
L.value = !1;
}
}
function le() {
((J.value = !1), (Q.value = { id: "", name: "" }), (H.value = !0));
}
async function ne() {
B.value &&
((Q.value = { id: B.value.id, name: B.value.name }),
(J.value = !0),
await _(),
(H.value = !0));
}
async function ie() {
var e;
if (Q.value.name) {
P.value = !0;
try {
(J.value
? (await t.patch(`/items/companies/${Q.value.id}`, {
name: Q.value.name,
}),
(I.value = { type: "success", message: "Firma aktualisiert!" }))
: (await t.post("/items/companies", { name: Q.value.name }),
(I.value = { type: "success", message: "Firma angelegt!" })),
(H.value = !1),
await ae(),
(null == (e = B.value) ? void 0 : e.id) === Q.value.id &&
(B.value.name = Q.value.name));
} catch (e) {
I.value = { type: "danger", message: e.message };
} finally {
P.value = !1;
}
}
}
function se() {
((X.value = !1),
(Y.value = {
id: "",
first_name: "",
last_name: "",
email: "",
temporary_password: "",
}),
(R.value = !0));
}
async function de() {
if (Y.value.email && B.value) {
P.value = !0;
try {
(X.value
? (await t.patch(`/items/client_users/${Y.value.id}`, {
first_name: Y.value.first_name,
last_name: Y.value.last_name,
email: Y.value.email,
}),
(I.value = {
type: "success",
message: "Mitarbeiter aktualisiert!",
}))
: (await t.post("/items/client_users", {
first_name: Y.value.first_name,
last_name: Y.value.last_name,
email: Y.value.email,
company: B.value.id,
}),
(I.value = {
type: "success",
message: "Mitarbeiter angelegt!",
})),
(R.value = !1),
await te(B.value));
} catch (e) {
I.value = { type: "danger", message: e.message };
} finally {
P.value = !1;
}
}
}
function re(e) {
const a = (null == e ? void 0 : e.item) || e;
a &&
a.id &&
(async function (e) {
((Y.value = {
id: e.id || "",
first_name: e.first_name || "",
last_name: e.last_name || "",
email: e.email || "",
temporary_password: e.temporary_password || "",
}),
(X.value = !0),
await _(),
(R.value = !0));
})(a);
}
return (
n(() => {
ae();
}),
(e, a) => {
const l = i("v-icon"),
n = i("v-list-item-icon"),
_ = i("v-text-overflow"),
ae = i("v-list-item-content"),
ue = i("v-list-item"),
oe = i("v-divider"),
me = i("v-list"),
ce = i("v-notice"),
ve = i("v-button"),
pe = i("v-info"),
fe = i("v-avatar"),
ge = i("v-chip"),
ye = i("v-table"),
be = i("v-input"),
_e = i("v-drawer"),
he = i("private-view"),
xe = s("tooltip");
return (
d(),
r(
he,
{ title: "Customer Manager" },
{
navigation: u(() => [
o(
me,
{ nav: "" },
{
default: u(() => [
o(
ue,
{ onClick: le, clickable: "" },
{
default: u(() => [
o(n, null, {
default: u(() => [
o(l, {
name: "add",
color: "var(--theme--primary)",
}),
]),
_: 1,
}),
o(ae, null, {
default: u(() => [
o(_, { text: "Neue Firma anlegen" }),
]),
_: 1,
}),
]),
_: 1,
},
),
o(oe),
(d(!0),
m(
c,
null,
v(q.value, (e) => {
var a;
return (
d(),
r(
ue,
{
key: e.id,
active:
(null == (a = B.value) ? void 0 : a.id) ===
e.id,
class: "company-item",
clickable: "",
onClick: (a) => te(e),
},
{
default: u(() => [
o(n, null, {
default: u(() => [
o(l, { name: "business" }),
]),
_: 1,
}),
o(
ae,
null,
{
default: u(() => [
o(_, { text: e.name }, null, 8, [
"text",
]),
]),
_: 2,
},
1024,
),
]),
_: 2,
},
1032,
["active", "onClick"],
)
);
}),
128,
)),
]),
_: 1,
},
),
]),
"title-outer:after": u(() => [
I.value
? (d(),
r(
ce,
{
key: 0,
type: I.value.type,
onClose: a[0] || (a[0] = (e) => (I.value = null)),
dismissible: "",
},
{ default: u(() => [p(f(I.value.message), 1)]), _: 1 },
8,
["type"],
))
: g("v-if", !0),
]),
default: u(() => [
y("div", h, [
B.value
? (d(),
m(
c,
{ key: 1 },
[
y("header", w, [
y("div", k, [
y("h1", V, f(B.value.name), 1),
y(
"p",
C,
f(K.value.length) + " Kunden-Mitarbeiter",
1,
),
]),
y("div", M, [
b(
(d(),
r(
ve,
{
secondary: "",
rounded: "",
icon: "",
onClick: ne,
},
{
default: u(() => [
o(l, { name: "edit" }),
]),
_: 1,
},
)),
[
[
xe,
"Firma bearbeiten",
void 0,
{ bottom: !0 },
],
],
),
o(
ve,
{ primary: "", onClick: se },
{
default: u(() => [
...(a[14] ||
(a[14] = [
p(" Mitarbeiter hinzufügen ", -1),
])),
]),
_: 1,
},
),
]),
]),
o(
ye,
{
headers: ee,
items: K.value,
loading: L.value,
class: "clickable-table",
"fixed-header": "",
"onClick:row": re,
},
{
"item.name": u(({ item: e }) => [
y("div", F, [
o(
fe,
{ name: e.first_name, "x-small": "" },
null,
8,
["name"],
),
y(
"span",
N,
f(e.first_name) + " " + f(e.last_name),
1,
),
]),
]),
"item.last_invited": u(({ item: e }) => {
return [
e.last_invited
? (d(),
m(
"span",
z,
f(
((t = e.last_invited),
new Date(t).toLocaleString(
"de-DE",
{
day: "2-digit",
month: "2-digit",
year: "numeric",
hour: "2-digit",
minute: "2-digit",
},
)),
),
1,
))
: (d(),
r(
ge,
{ key: 1, "x-small": "" },
{
default: u(() => [
...(a[15] ||
(a[15] = [p("Noch nie", -1)])),
]),
_: 1,
},
)),
];
var t;
}),
_: 2,
},
1032,
["items", "loading"],
),
],
64,
))
: (d(),
m("div", x, [
o(
pe,
{
title: "Firmen auswählen",
icon: "business",
center: "",
},
{
default: u(() => [
a[12] ||
(a[12] = p(
" Wähle eine Firma in der Navigation aus oder ",
-1,
)),
o(
ve,
{ "x-small": "", onClick: le },
{
default: u(() => [
...(a[11] ||
(a[11] = [
p("erstelle eine neue Firma", -1),
])),
]),
_: 1,
},
),
a[13] || (a[13] = p(". ", -1)),
]),
_: 1,
},
),
])),
]),
o(
_e,
{
modelValue: H.value,
"onUpdate:modelValue":
a[2] || (a[2] = (e) => (H.value = e)),
title: J.value
? "Firma bearbeiten"
: "Neue Firma anlegen",
icon: "business",
onCancel: a[3] || (a[3] = (e) => (H.value = !1)),
},
{
default: u(() => [
H.value
? (d(),
m("div", E, [
y("div", U, [
y("div", S, [
a[16] ||
(a[16] = y(
"span",
{ class: "label" },
"Firmenname",
-1,
)),
o(
be,
{
modelValue: Q.value.name,
"onUpdate:modelValue":
a[1] ||
(a[1] = (e) => (Q.value.name = e)),
placeholder: "z.B. KLZ Cables",
autofocus: "",
},
null,
8,
["modelValue"],
),
]),
]),
y("div", A, [
o(
ve,
{
primary: "",
block: "",
loading: P.value,
onClick: ie,
},
{
default: u(() => [
...(a[17] ||
(a[17] = [p("Speichern", -1)])),
]),
_: 1,
},
8,
["loading"],
),
]),
]))
: g("v-if", !0),
]),
_: 1,
},
8,
["modelValue", "title"],
),
o(
_e,
{
modelValue: R.value,
"onUpdate:modelValue":
a[9] || (a[9] = (e) => (R.value = e)),
title: X.value
? "Mitarbeiter bearbeiten"
: "Neuen Mitarbeiter anlegen",
icon: "person",
onCancel: a[10] || (a[10] = (e) => (R.value = !1)),
},
{
default: u(() => [
R.value
? (d(),
m("div", T, [
y("div", Z, [
y("div", j, [
a[18] ||
(a[18] = y(
"span",
{ class: "label" },
"Vorname",
-1,
)),
o(
be,
{
modelValue: Y.value.first_name,
"onUpdate:modelValue":
a[4] ||
(a[4] = (e) =>
(Y.value.first_name = e)),
placeholder: "Vorname",
autofocus: "",
},
null,
8,
["modelValue"],
),
]),
y("div", $, [
a[19] ||
(a[19] = y(
"span",
{ class: "label" },
"Nachname",
-1,
)),
o(
be,
{
modelValue: Y.value.last_name,
"onUpdate:modelValue":
a[5] ||
(a[5] = (e) => (Y.value.last_name = e)),
placeholder: "Nachname",
},
null,
8,
["modelValue"],
),
]),
y("div", D, [
a[20] ||
(a[20] = y(
"span",
{ class: "label" },
"E-Mail",
-1,
)),
o(
be,
{
modelValue: Y.value.email,
"onUpdate:modelValue":
a[6] ||
(a[6] = (e) => (Y.value.email = e)),
placeholder: "E-Mail Adresse",
type: "email",
},
null,
8,
["modelValue"],
),
]),
X.value
? (d(), r(oe, { key: 0 }))
: g("v-if", !0),
X.value
? (d(),
m("div", O, [
a[21] ||
(a[21] = y(
"span",
{ class: "label" },
"Temporäres Passwort",
-1,
)),
o(
be,
{
modelValue:
Y.value.temporary_password,
"onUpdate:modelValue":
a[7] ||
(a[7] = (e) =>
(Y.value.temporary_password = e)),
readonly: "",
class: "password-input",
},
null,
8,
["modelValue"],
),
a[22] ||
(a[22] = y(
"p",
{ class: "field-note" },
"Wird beim Senden der Zugangsdaten automatisch generiert.",
-1,
)),
]))
: g("v-if", !0),
]),
y("div", W, [
o(
ve,
{
primary: "",
block: "",
loading: P.value,
onClick: de,
},
{
default: u(() => [
...(a[23] ||
(a[23] = [p("Daten speichern", -1)])),
]),
_: 1,
},
8,
["loading"],
),
X.value
? (d(),
m(
c,
{ key: 0 },
[
o(oe),
b(
(d(),
r(
ve,
{
secondary: "",
block: "",
loading: G.value === Y.value.id,
onClick:
a[8] ||
(a[8] = (e) =>
(async function (e) {
G.value = e.id;
try {
if (
(await t.post(
"/flows/trigger/33443f6b-cec7-4668-9607-f33ea674d501",
[e.id],
),
(I.value = {
type: "success",
message: `Zugangsdaten für ${e.first_name} versendet. 📧`,
}),
await te(B.value),
R.value &&
Y.value.id === e.id)
) {
const a = K.value.find(
(a) => a.id === e.id,
);
a &&
(Y.value.temporary_password =
a.temporary_password);
}
} catch (e) {
I.value = {
type: "danger",
message: `Fehler: ${e.message}`,
};
} finally {
G.value = null;
}
})(Y.value)),
},
{
default: u(() => [
o(l, {
name: "send",
left: "",
}),
a[24] ||
(a[24] = p(
" Zugangsdaten senden ",
-1,
)),
]),
_: 1,
},
8,
["loading"],
)),
[
[
xe,
"Generiert PW, speichert es und sendet E-Mail",
void 0,
{ bottom: !0 },
],
],
),
],
64,
))
: g("v-if", !0),
]),
]))
: g("v-if", !0),
]),
_: 1,
},
8,
["modelValue", "title"],
),
]),
_: 1,
},
)
);
}
);
},
}),
B = [],
K = [];
!(function (e, a) {
if (e && "undefined" != typeof document) {
var t,
l = !0 === a.prepend ? "prepend" : "append",
n = !0 === a.singleTag,
i =
"string" == typeof a.container
? document.querySelector(a.container)
: document.getElementsByTagName("head")[0];
if (n) {
var s = B.indexOf(i);
(-1 === s && ((s = B.push(i) - 1), (K[s] = {})),
(t = K[s] && K[s][l] ? K[s][l] : (K[s][l] = d())));
} else t = d();
(65279 === e.charCodeAt(0) && (e = e.substring(1)),
t.styleSheet
? (t.styleSheet.cssText += e)
: t.appendChild(document.createTextNode(e)));
}
function d() {
var e = document.createElement("style");
if ((e.setAttribute("type", "text/css"), a.attributes))
for (var t = Object.keys(a.attributes), n = 0; n < t.length; n++)
e.setAttribute(t[n], a.attributes[t[n]]);
var s = "prepend" === l ? "afterbegin" : "beforeend";
return (i.insertAdjacentElement(s, e), e);
}
})(
"\n.content-wrapper[data-v-3fd11e72] { padding: 32px; height: 100%; display: flex; flex-direction: column;\n}\n.company-item[data-v-3fd11e72] { cursor: pointer;\n}\n.header[data-v-3fd11e72] { margin-bottom: 24px; display: flex; justify-content: space-between; align-items: flex-end;\n}\n.header-right[data-v-3fd11e72] { display: flex; gap: 12px;\n}\n.title[data-v-3fd11e72] { font-size: 24px; font-weight: 800; margin-bottom: 4px;\n}\n.subtitle[data-v-3fd11e72] { color: var(--theme--foreground-subdued); font-size: 14px;\n}\n.empty-state[data-v-3fd11e72] { height: 100%; display: flex; align-items: center; justify-content: center;\n}\n.user-cell[data-v-3fd11e72] { display: flex; align-items: center; gap: 12px;\n}\n.user-name[data-v-3fd11e72] { font-weight: 600;\n}\n.status-date[data-v-3fd11e72] { font-size: 12px; color: var(--theme--foreground-subdued);\n}\n.drawer-content[data-v-3fd11e72] { padding: 24px; display: flex; flex-direction: column; gap: 32px;\n}\n.form-section[data-v-3fd11e72] { display: flex; flex-direction: column; gap: 20px;\n}\n.field[data-v-3fd11e72] { display: flex; flex-direction: column; gap: 8px;\n}\n.label[data-v-3fd11e72] { font-size: 12px; font-weight: 700; text-transform: uppercase; color: var(--theme--foreground-subdued); letter-spacing: 0.5px;\n}\n.field-note[data-v-3fd11e72] { font-size: 11px; color: var(--theme--foreground-subdued); margin-top: 4px;\n}\n.drawer-actions[data-v-3fd11e72] { margin-top: 24px; display: flex; flex-direction: column; gap: 12px;\n}\n.password-input[data-v-3fd11e72] textarea {\n\tfont-family: var(--family-monospace);\n\tfont-weight: 800;\n\tcolor: var(--theme--primary) !important;\n\tbackground: var(--theme--background-subdued) !important;\n}\n.clickable-table[data-v-3fd11e72] tbody tr { cursor: pointer; transition: background-color 0.2s ease;\n}\n.clickable-table[data-v-3fd11e72] tbody tr:hover { background-color: var(--theme--background-subdued) !important;\n}\n[data-v-3fd11e72] .v-list-item { cursor: pointer 
!important;\n}\n",
{},
);
var L = a({
id: "customer-manager",
name: "Customer Manager",
icon: "supervisor_account",
routes: [
{
path: "",
component: ((e, a) => {
const t = e.__vccOpts || e;
for (const [e, l] of a) t[e] = l;
return t;
})(q, [
["__scopeId", "data-v-3fd11e72"],
["__file", "module.vue"],
]),
},
],
});
export { L as default };


@@ -1,29 +0,0 @@
{
"name": "customer-manager",
"description": "Custom High-Fidelity Customer & Company Management for Directus",
"icon": "supervisor_account",
"version": "1.0.0",
"keywords": [
"directus",
"directus-extension",
"directus-extension-module"
],
"files": [
"dist"
],
"directus:extension": {
"type": "module",
"path": "index.js",
"source": "src/index.ts",
"host": "*",
"name": "Customer Manager"
},
"scripts": {
"build": "directus-extension build",
"dev": "directus-extension build -w"
},
"devDependencies": {
"@directus/extensions-sdk": "11.0.2",
"vue": "^3.4.0"
}
}

File diff suppressed because one or more lines are too long


@@ -1,29 +0,0 @@
{
"name": "feedback-commander",
"description": "Custom High-Fidelity Feedback Management Extension for Directus",
"icon": "view_kanban",
"version": "1.0.0",
"keywords": [
"directus",
"directus-extension",
"directus-extension-module"
],
"files": [
"index.js"
],
"directus:extension": {
"type": "module",
"path": "index.js",
"source": "src/index.ts",
"host": "*",
"name": "Feedback Commander"
},
"scripts": {
"build": "directus-extension build",
"dev": "directus-extension build -w"
},
"devDependencies": {
"@directus/extensions-sdk": "11.0.2",
"vue": "^3.4.0"
}
}


@@ -1,11 +0,0 @@
{
"name": "@mintel/cms-infra",
"version": "1.6.0",
"private": true,
"type": "module",
"scripts": {
"up": "docker compose up -d",
"down": "docker compose down",
"logs": "docker compose logs -f"
}
}


@@ -1 +0,0 @@
xmKX5


@@ -0,0 +1,36 @@
{
"name": "@mintel/concept-engine",
"version": "1.9.9",
"private": true,
"description": "AI-powered web project concept generation and analysis",
"type": "module",
"main": "./dist/index.js",
"module": "./dist/index.js",
"types": "./dist/index.d.ts",
"bin": {
"concept": "./dist/cli.js"
},
"scripts": {
"build": "tsup",
"dev": "tsup --watch",
"test": "vitest",
"clean": "rm -rf dist",
"lint": "eslint src --ext .ts",
"concept": "tsx src/cli.ts run"
},
"dependencies": {
"@mintel/journaling": "workspace:*",
"@mintel/page-audit": "workspace:*",
"axios": "^1.7.9",
"cheerio": "1.0.0-rc.12",
"commander": "^13.1.0",
"dotenv": "^16.4.7"
},
"devDependencies": {
"@types/node": "^20.17.17",
"tsup": "^8.3.6",
"tsx": "^4.19.2",
"typescript": "^5.7.3",
"vitest": "^3.0.5"
}
}


@@ -0,0 +1,39 @@
import { config as dotenvConfig } from "dotenv";
import * as path from "node:path";
import * as fs from "node:fs/promises";
import { ConceptPipeline } from "./pipeline.js";
dotenvConfig({ path: path.resolve(process.cwd(), "../../.env") });
const briefing = await fs.readFile(
path.resolve(process.cwd(), "../../data/briefings/etib.txt"),
"utf8",
);
console.log(`Briefing loaded: ${briefing.length} chars`);
const pipeline = new ConceptPipeline(
{
openrouterKey: process.env.OPENROUTER_API_KEY || "",
zyteApiKey: process.env.ZYTE_API_KEY,
outputDir: path.resolve(process.cwd(), "../../out/estimations"),
crawlDir: path.resolve(process.cwd(), "../../data/crawls"),
},
{
onStepStart: (id, _name) => console.log(`[CB] Starting: ${id}`),
onStepComplete: (id) => console.log(`[CB] Done: ${id}`),
onStepError: (id, err) => console.error(`[CB] Error in ${id}: ${err}`),
},
);
try {
await pipeline.run({
briefing,
url: "https://www.e-tib.com",
});
console.log("\n✨ Pipeline complete!");
} catch (err) {
const e = err as Error;
console.error("\n❌ Pipeline failed:", e.message);
console.error(e.stack);
}


@@ -0,0 +1,334 @@
// ============================================================================
// Analyzer — Deterministic Site Analysis (NO LLM!)
// Builds a SiteProfile from crawled pages using pure code logic.
// This is the core fix against hallucinated page structures.
// ============================================================================
import type {
CrawledPage,
SiteProfile,
NavItem,
CompanyInfo,
PageInventoryItem,
} from "./types.js";
/**
* Build a complete SiteProfile from an array of crawled pages.
* This is 100% deterministic — no LLM calls involved.
*/
export function analyzeSite(pages: CrawledPage[], domain: string): SiteProfile {
const navigation = extractNavigation(pages);
const existingFeatures = extractExistingFeatures(pages);
const services = extractAllServices(pages);
const companyInfo = extractCompanyInfo(pages);
const colors = extractColors(pages);
const socialLinks = extractSocialLinks(pages);
const externalDomains = extractExternalDomains(pages, domain);
const images = extractAllImages(pages);
const employeeCount = extractEmployeeCount(pages);
const pageInventory = buildPageInventory(pages);
return {
domain,
crawledAt: new Date().toISOString(),
totalPages: pages.filter((p) => p.type !== "legal").length,
navigation,
existingFeatures,
services,
companyInfo,
pageInventory,
colors,
socialLinks,
externalDomains,
images,
employeeCount,
};
}
/**
* Extract the site's main navigation structure from <nav> elements.
* Uses the HOME page's nav as the canonical source.
*/
function extractNavigation(pages: CrawledPage[]): NavItem[] {
// Prefer the home page's nav
const homePage = pages.find((p) => p.type === "home");
const sourcePage = homePage || pages[0];
if (!sourcePage) return [];
// Deduplicate nav items
const seen = new Set<string>();
const navItems: NavItem[] = [];
for (const label of sourcePage.navItems) {
const normalized = label.toLowerCase().trim();
if (seen.has(normalized)) continue;
if (normalized.length < 2) continue;
seen.add(normalized);
navItems.push({ label, href: "" });
}
return navItems;
}
/**
* Aggregate all detected interactive features across all pages.
*/
function extractExistingFeatures(pages: CrawledPage[]): string[] {
const allFeatures = new Set<string>();
for (const page of pages) {
for (const feature of page.features) {
allFeatures.add(feature);
}
}
return [...allFeatures];
}
/**
* Aggregate all images found across all pages.
*/
function extractAllImages(pages: CrawledPage[]): string[] {
const allImages = new Set<string>();
for (const page of pages) {
if (!page.images) continue;
for (const img of page.images) {
allImages.add(img);
}
}
return [...allImages];
}
/**
* Extract employee count from page text.
* Looks for patterns like "über 50 Mitarbeitern", "200 Mitarbeiter", "50+ employees".
*/
function extractEmployeeCount(pages: CrawledPage[]): string | null {
const allText = pages.map((p) => p.text).join(" ");
// German patterns: 'über 50 Mitarbeitern', '120 Beschäftigte', '+200 MA'
const patterns = [
/(über|ca\.?|rund|mehr als|\+)?\s*(\d{1,4})\s*(Mitarbeiter(?:innen|n)?|Beschäftigte|MA|Fachkräfte)\b/gi,
/(\d{1,4})\+?\s*(employees|team members)/gi,
];
for (const pattern of patterns) {
const match = allText.match(pattern);
if (match && match[0]) {
const num = match[0].match(/(\d{1,4})/)?.[1];
const prefix = match[0].match(/über|ca\.?|rund|mehr als/i)?.[0];
if (num) return prefix ? `${prefix} ${num}` : num;
}
}
return null;
}
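As an illustration, the first German pattern above can be exercised in isolation. Note that the alternation here adds the dative-plural form (`Mitarbeitern`), which the trailing `\b` otherwise rejects; the `findEmployeePhrase` helper name is hypothetical:

```typescript
// Match phrases like "über 50 Mitarbeitern" or "120 Beschäftigte" in page text.
function findEmployeePhrase(text: string): string | null {
  const pattern =
    /(über|ca\.?|rund|mehr als|\+)?\s*(\d{1,4})\s*(Mitarbeiter(?:innen|n)?|Beschäftigte|MA|Fachkräfte)\b/i;
  const match = text.match(pattern);
  // trim() drops any leading whitespace swallowed by \s* before the number
  return match ? match[0].trim() : null;
}
```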
/**
* Extract services/competencies from service-type pages.
* Focuses on H2-H3 headings and list items on service pages.
*/
function extractAllServices(pages: CrawledPage[]): string[] {
const servicePages = pages.filter(
(p) => p.type === "service" || p.pathname.includes("kompetenz"),
);
const services = new Set<string>();
for (const page of servicePages) {
// Use headings as primary service indicators
for (const heading of page.headings) {
const clean = heading.trim();
if (clean.length > 3 && clean.length < 100) {
// Skip generic headings
if (/^(home|kontakt|impressum|datenschutz|menü|navigation|suche)/i.test(clean)) continue;
services.add(clean);
}
}
}
// If no service pages found, look at the home page headings too
if (services.size === 0) {
const homePage = pages.find((p) => p.type === "home");
if (homePage) {
for (const heading of homePage.headings) {
const clean = heading.trim();
if (clean.length > 3 && clean.length < 80) {
services.add(clean);
}
}
}
}
return [...services];
}
/**
* Extract company information from Impressum / footer content.
*/
function extractCompanyInfo(pages: CrawledPage[]): CompanyInfo {
const info: CompanyInfo = {};
// Find Impressum or legal page
const legalPage = pages.find(
(p) =>
p.type === "legal" &&
(p.pathname.includes("impressum") || p.title.toLowerCase().includes("impressum")),
);
const sourceText = legalPage?.text || pages.find((p) => p.type === "home")?.text || "";
// USt-ID
const taxMatch = sourceText.match(/USt[.\s-]*(?:ID[.\s-]*Nr\.?|IdNr\.?)[:\s]*([A-Z]{2}\d{9,11})/i);
if (taxMatch) info.taxId = taxMatch[1];
// HRB number
const hrbMatch = sourceText.match(/HRB[:\s]*(\d+\s*[A-Z]*)/i);
if (hrbMatch) info.registerNumber = `HRB ${hrbMatch[1].trim()}`;
// Phone
const phoneMatch = sourceText.match(/(?:Tel|Telefon|Fon)[.:\s]*([+\d\s()/-]{10,20})/i);
if (phoneMatch) info.phone = phoneMatch[1].trim();
// Email
const emailMatch = sourceText.match(/[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/);
if (emailMatch) info.email = emailMatch[0];
// Address (look for German postal code pattern)
const addressMatch = sourceText.match(
/(?:[\w\s.-]+(?:straße|str\.|weg|platz|ring|allee|gasse)\s*\d+[a-z]?\s*,?\s*)?(?:D-)?(\d{5})\s+\w+/i,
);
if (addressMatch) info.address = addressMatch[0].trim();
// GF / Geschäftsführer
const gfMatch = sourceText.match(
/Geschäftsführ(?:er|ung)[:\s]*([A-ZÄÖÜ][a-zäöüß]+(?:\s+[A-ZÄÖÜ][a-zäöüß]+){1,3})/,
);
if (gfMatch) info.managingDirector = gfMatch[1].trim();
return info;
}
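The two commercial-register patterns above can be checked against a sample Impressum line; a minimal sketch (the `extractLegalIds` helper name and the sample values are illustrative only):

```typescript
// Pull USt-IdNr. and HRB number out of Impressum text, mirroring extractCompanyInfo.
function extractLegalIds(text: string): { taxId?: string; registerNumber?: string } {
  const info: { taxId?: string; registerNumber?: string } = {};
  const taxMatch = text.match(/USt[.\s-]*(?:ID[.\s-]*Nr\.?|IdNr\.?)[:\s]*([A-Z]{2}\d{9,11})/i);
  if (taxMatch) info.taxId = taxMatch[1];
  const hrbMatch = text.match(/HRB[:\s]*(\d+\s*[A-Z]*)/i);
  if (hrbMatch) info.registerNumber = `HRB ${hrbMatch[1].trim()}`;
  return info;
}
```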
/**
* Extract brand colors from HTML (inline styles, CSS variables).
*/
function extractColors(pages: CrawledPage[]): string[] {
const colors = new Set<string>();
const homePage = pages.find((p) => p.type === "home");
if (!homePage) return [];
const hexMatches = homePage.html.match(/#(?:[0-9a-fA-F]{3}){1,2}\b/g) || [];
for (const hex of hexMatches) {
colors.add(hex.toLowerCase());
if (colors.size >= 8) break;
}
return [...colors];
}
/**
* Extract social media links from footers / headers.
*/
function extractSocialLinks(pages: CrawledPage[]): Record<string, string> {
const socials: Record<string, string> = {};
const platforms = [
{ key: "linkedin", patterns: ["linkedin.com"] },
{ key: "instagram", patterns: ["instagram.com"] },
{ key: "facebook", patterns: ["facebook.com", "fb.com"] },
{ key: "youtube", patterns: ["youtube.com", "youtu.be"] },
{ key: "twitter", patterns: ["twitter.com", "x.com"] },
{ key: "xing", patterns: ["xing.com"] },
];
const homePage = pages.find((p) => p.type === "home");
if (!homePage) return socials;
const urlMatches = homePage.html.match(/https?:\/\/[^\s"'<>]+/g) || [];
for (const url of urlMatches) {
for (const platform of platforms) {
if (platform.patterns.some((p) => url.includes(p)) && !socials[platform.key]) {
socials[platform.key] = url;
}
}
}
return socials;
}
/**
* Find domains that are linked but separate from the main domain.
* Critical for detecting sister companies with their own websites (e.g. etib-ing.com).
*/
function extractExternalDomains(pages: CrawledPage[], mainDomain: string): string[] {
const externalDomains = new Set<string>();
const cleanMain = mainDomain.replace(/^www\./, "");
// Meaningful base parts: split the registrable label on -/_ and drop
// single-character parts: "e-tib.com" → ["tib"]
const mainParts = cleanMain.split(".")[0].toLowerCase().split(/[-_]/).filter(p => p.length > 1);
const mainJoined = mainParts.join(""); // "tib"
for (const page of pages) {
const linkMatches = page.html.match(/https?:\/\/[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g) || [];
for (const url of linkMatches) {
try {
const urlObj = new URL(url);
const domain = urlObj.hostname.replace(/^www\./, "");
// Skip same domain
if (domain === cleanMain) continue;
// Skip common third-party services
if (
domain.includes("google") ||
domain.includes("facebook") ||
domain.includes("twitter") ||
domain.includes("linkedin") ||
domain.includes("instagram") ||
domain.includes("youtube") ||
domain.includes("cookie") ||
domain.includes("analytics") ||
domain.includes("cdn") ||
domain.includes("cloudflare") ||
domain.includes("fonts") ||
domain.includes("jquery") ||
domain.includes("bootstrap") ||
domain.includes("wordpress") ||
domain.includes("jimdo") ||
domain.includes("wix")
)
continue;
// Fuzzy match: check if the domain shares a base part with the main domain
// e.g. main="e-tib.com" → mainParts=["tib"], mainJoined="tib"
// target="etib-ing.com" → domainBase="etib-ing", domainJoined="etibing"
const domainBase = domain.split(".")[0].toLowerCase();
const domainJoined = domainBase.replace(/[-_]/g, "");
const isRelated =
domainJoined.includes(mainJoined) ||
mainJoined.includes(domainJoined) ||
mainParts.some(part => part.length > 2 && domainBase.includes(part));
if (isRelated) {
externalDomains.add(domain);
}
} catch {
// Invalid URL
}
}
}
return [...externalDomains];
}
/**
* Build a structured inventory of all pages.
*/
function buildPageInventory(pages: CrawledPage[]): PageInventoryItem[] {
return pages.map((page) => ({
url: page.url,
pathname: page.pathname,
title: page.title,
type: page.type,
headings: page.headings.slice(0, 10),
services: page.type === "service" ? page.headings.filter((h) => h.length > 3 && h.length < 80) : [],
hasSearch: page.features.includes("search"),
hasForms: page.features.includes("forms"),
hasMap: page.features.includes("maps"),
hasVideo: page.features.includes("video"),
contentSummary: page.text.substring(0, 500),
}));
}
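The fuzzy relatedness test at the heart of `extractExternalDomains` can be isolated as a pure predicate, which makes the sister-domain behavior easy to verify. A sketch under the same rules as the code above (the `isRelatedDomain` name is hypothetical):

```typescript
// True when `candidate` looks like a sister domain of `mainDomain`,
// e.g. "etib-ing.com" relative to "e-tib.com".
function isRelatedDomain(mainDomain: string, candidate: string): boolean {
  const cleanMain = mainDomain.replace(/^www\./, "");
  // Base parts of the main label, dropping single-character fragments.
  const mainParts = cleanMain.split(".")[0].toLowerCase().split(/[-_]/).filter((p) => p.length > 1);
  const mainJoined = mainParts.join("");
  const domainBase = candidate.replace(/^www\./, "").split(".")[0].toLowerCase();
  const domainJoined = domainBase.replace(/[-_]/g, "");
  return (
    domainJoined.includes(mainJoined) ||
    mainJoined.includes(domainJoined) ||
    mainParts.some((part) => part.length > 2 && domainBase.includes(part))
  );
}
```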


@@ -0,0 +1,163 @@
#!/usr/bin/env node
// ============================================================================
// @mintel/concept-engine — CLI Entry Point
// Simple commander-based CLI for concept generation.
// ============================================================================
import { Command } from "commander";
import * as path from "node:path";
import * as fs from "node:fs/promises";
import { existsSync } from "node:fs";
import { config as dotenvConfig } from "dotenv";
import { ConceptPipeline } from "./pipeline.js";
// Load .env from monorepo root
dotenvConfig({ path: path.resolve(process.cwd(), "../../.env") });
dotenvConfig({ path: path.resolve(process.cwd(), ".env") });
const program = new Command();
program
.name("concept")
.description("AI-powered project concept generator")
.version("1.0.0");
program
.command("run")
.description("Run the full concept pipeline")
.argument("[briefing]", "Briefing text or @path/to/file.txt")
.option("--url <url>", "Target website URL")
.option("--comments <comments>", "Additional notes")
.option("--clear-cache", "Clear crawl cache and re-crawl")
.option("--output <dir>", "Output directory", "../../out/concepts")
.option("--crawl-dir <dir>", "Crawl data directory", "../../data/crawls")
.action(async (briefingArg: string | undefined, options: any) => {
const openrouterKey =
process.env.OPENROUTER_API_KEY || process.env.OPENROUTER_KEY;
if (!openrouterKey) {
console.error("❌ OPENROUTER_API_KEY not found in environment.");
process.exit(1);
}
let briefing = briefingArg || "";
// Handle @file references
if (briefing.startsWith("@")) {
const rawPath = briefing.substring(1);
const filePath = rawPath.startsWith("/")
? rawPath
: path.resolve(process.cwd(), rawPath);
if (!existsSync(filePath)) {
console.error(`❌ Briefing file not found: ${filePath}`);
process.exit(1);
}
briefing = await fs.readFile(filePath, "utf8");
console.log(`📄 Loaded briefing from: ${filePath}`);
}
// Auto-discover URL from briefing
let url = options.url;
if (!url && briefing) {
const urlMatch = briefing.match(/https?:\/\/[^\s"'<>)\]]+/);
if (urlMatch) {
url = urlMatch[0];
console.log(`🔗 Discovered URL in briefing: ${url}`);
}
}
if (!briefing && !url) {
console.error("❌ Provide a briefing text or --url");
process.exit(1);
}
const pipeline = new ConceptPipeline(
{
openrouterKey,
zyteApiKey: process.env.ZYTE_API_KEY,
outputDir: path.resolve(process.cwd(), options.output),
crawlDir: path.resolve(process.cwd(), options.crawlDir),
},
{
onStepStart: (_id, _name) => {
// Will be enhanced with Ink spinner later
},
onStepComplete: (_id, _result) => {
// Will be enhanced with Ink UI later
},
},
);
try {
await pipeline.run({
briefing,
url,
comments: options.comments,
clearCache: options.clearCache,
});
console.log("\n✨ Concept generation complete!");
} catch (err) {
console.error(`\n❌ Pipeline failed: ${(err as Error).message}`);
process.exit(1);
}
});
program
.command("analyze")
.description("Only crawl and analyze a website (no LLM)")
.argument("<url>", "Website URL to analyze")
.option("--crawl-dir <dir>", "Crawl data directory", "../../data/crawls")
.option("--clear-cache", "Clear existing crawl cache")
.action(async (url: string, options: any) => {
const { crawlSite } = await import("./scraper.js");
const { analyzeSite } = await import("./analyzer.js");
if (options.clearCache) {
const { clearCrawlCache } = await import("./scraper.js");
const domain = new URL(url).hostname;
await clearCrawlCache(
path.resolve(process.cwd(), options.crawlDir),
domain,
);
}
const pages = await crawlSite(url, {
zyteApiKey: process.env.ZYTE_API_KEY,
crawlDir: path.resolve(process.cwd(), options.crawlDir),
});
const domain = new URL(url).hostname;
const profile = analyzeSite(pages, domain);
console.log("\n📊 Site Profile:");
console.log(` Domain: ${profile.domain}`);
console.log(` Total Pages: ${profile.totalPages}`);
console.log(
` Navigation: ${profile.navigation.map((n) => n.label).join(", ")}`,
);
console.log(` Features: ${profile.existingFeatures.join(", ") || "none"}`);
console.log(` Services: ${profile.services.join(", ") || "none"}`);
console.log(
` External Domains: ${profile.externalDomains.join(", ") || "none"}`,
);
console.log(`  Company: ${profile.companyInfo.name || "unknown"}`);
console.log(`  Tax ID: ${profile.companyInfo.taxId || "unknown"}`);
console.log(` Colors: ${profile.colors.join(", ")}`);
console.log(` Images Found: ${profile.images.length}`);
console.log(
` Social: ${
Object.keys(profile.socialLinks).join(", ") || "none"
}`,
);
const outputPath = path.join(
path.resolve(process.cwd(), options.crawlDir),
domain.replace(/\./g, "-"),
"_site_profile.json",
);
await fs.writeFile(outputPath, JSON.stringify(profile, null, 2));
console.log(`\n📦 Full profile saved to: ${outputPath}`);
});
program.parse();
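The `@file` briefing convention handled by the `run` command can be shown as a small standalone helper (the function name `resolveBriefingPath` is illustrative, not part of the CLI):

```typescript
// Standalone sketch of the "@file" briefing resolution used by the `run` command:
// "@relative/path" resolves against cwd, "@/absolute/path" is used as-is,
// and anything without a leading "@" is treated as inline briefing text.
import * as path from "node:path";

function resolveBriefingPath(briefing: string, cwd: string): string | null {
  if (!briefing.startsWith("@")) return null; // inline text, not a file reference
  const rawPath = briefing.substring(1);
  return rawPath.startsWith("/") ? rawPath : path.resolve(cwd, rawPath);
}

console.log(resolveBriefingPath("@notes/brief.txt", "/repo/apps/cli"));
// → "/repo/apps/cli/notes/brief.txt"
console.log(resolveBriefingPath("@/tmp/brief.txt", "/repo")); // → "/tmp/brief.txt"
console.log(resolveBriefingPath("Relaunch for ACME", "/repo")); // → null
```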

View File

@@ -0,0 +1,7 @@
import { describe, it, expect } from "vitest";
describe("concept-engine", () => {
it("should pass", () => {
expect(true).toBe(true);
});
});

View File

@@ -0,0 +1,10 @@
// ============================================================================
// @mintel/concept-engine — Public API
// ============================================================================
export { ConceptPipeline } from "./pipeline.js";
export type { PipelineCallbacks } from "./pipeline.js";
export { crawlSite, clearCrawlCache } from "./scraper.js";
export { analyzeSite } from "./analyzer.js";
export { llmRequest, llmJsonRequest, cleanJson } from "./llm-client.js";
export * from "./types.js";

View File

@@ -0,0 +1,142 @@
// ============================================================================
// LLM Client — Unified interface with model routing via OpenRouter
// ============================================================================
import axios from "axios";
interface LLMRequestOptions {
model: string;
systemPrompt: string;
userPrompt: string;
jsonMode?: boolean;
apiKey: string;
}
interface LLMResponse {
content: string;
usage: {
promptTokens: number;
completionTokens: number;
cost: number;
};
}
/**
* Clean raw LLM output to parseable JSON.
* Handles markdown fences, control chars, trailing commas.
*/
export function cleanJson(str: string): string {
let cleaned = str.replace(/```json\n?|```/g, "").trim();
// eslint-disable-next-line no-control-regex
cleaned = cleaned.replace(/[\x00-\x1f\x7f-\x9f]/gi, " ");
cleaned = cleaned.replace(/,\s*([\]}])/g, "$1");
return cleaned;
}
/**
* Send a request to an LLM via OpenRouter.
*/
export async function llmRequest(
options: LLMRequestOptions,
): Promise<LLMResponse> {
const { model, systemPrompt, userPrompt, jsonMode = true, apiKey } = options;
const resp = await axios
.post(
"https://openrouter.ai/api/v1/chat/completions",
{
model,
messages: [
{ role: "system", content: systemPrompt },
{ role: "user", content: userPrompt },
],
...(jsonMode ? { response_format: { type: "json_object" } } : {}),
},
{
headers: {
Authorization: `Bearer ${apiKey}`,
"Content-Type": "application/json",
},
timeout: 120000,
},
)
.catch((err) => {
if (err.response) {
console.error(
"OpenRouter API Error:",
JSON.stringify(err.response.data, null, 2),
);
}
throw err;
});
const content = resp.data.choices?.[0]?.message?.content;
if (!content) {
throw new Error(`LLM returned no content. Model: ${model}`);
}
let cost = 0;
const usage = resp.data.usage || {};
if (usage.cost !== undefined) {
cost = usage.cost;
} else {
// Fallback estimation
cost =
(usage.prompt_tokens || 0) * (0.1 / 1_000_000) +
(usage.completion_tokens || 0) * (0.4 / 1_000_000);
}
return {
content,
usage: {
promptTokens: usage.prompt_tokens || 0,
completionTokens: usage.completion_tokens || 0,
cost,
},
};
}
/**
* Send a request and parse the response as JSON.
*/
export async function llmJsonRequest<T = any>(
options: LLMRequestOptions,
): Promise<{ data: T; usage: LLMResponse["usage"] }> {
const response = await llmRequest({ ...options, jsonMode: true });
const cleaned = cleanJson(response.content);
let parsed: T;
try {
parsed = JSON.parse(cleaned);
} catch (e) {
throw new Error(
`Failed to parse LLM JSON response: ${(e as Error).message}\nRaw: ${cleaned.substring(0, 500)}`,
);
}
// Unwrap common LLM artifacts: {"0": {...}}, {"state": {...}}, etc.
const unwrapped = unwrapResponse(parsed);
return { data: unwrapped as T, usage: response.usage };
}
/**
* Recursively unwrap common LLM wrapping patterns.
*/
function unwrapResponse(obj: any): any {
if (!obj || typeof obj !== "object" || Array.isArray(obj)) return obj;
const keys = Object.keys(obj);
if (keys.length === 1) {
const key = keys[0];
if (
key === "0" ||
key === "state" ||
key === "facts" ||
key === "result" ||
key === "data"
) {
return unwrapResponse(obj[key]);
}
}
return obj;
}
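To make the sanitation pipeline concrete, here is a standalone copy of `cleanJson` and `unwrapResponse` (mirroring the code above) applied to a typical fenced LLM reply with trailing commas and a `result` wrapper:

```typescript
// Standalone copies of cleanJson and unwrapResponse (mirroring llm-client.ts)
// showing how a fenced, slightly malformed LLM reply becomes a plain object.
function cleanJson(str: string): string {
  let cleaned = str.replace(/```json\n?|```/g, "").trim(); // strip markdown fences
  cleaned = cleaned.replace(/[\x00-\x1f\x7f-\x9f]/gi, " "); // strip control chars
  cleaned = cleaned.replace(/,\s*([\]}])/g, "$1"); // strip trailing commas
  return cleaned;
}

function unwrapResponse(obj: any): any {
  if (!obj || typeof obj !== "object" || Array.isArray(obj)) return obj;
  const keys = Object.keys(obj);
  if (keys.length === 1 && ["0", "state", "facts", "result", "data"].includes(keys[0])) {
    return unwrapResponse(obj[keys[0]]);
  }
  return obj;
}

const raw = '```json\n{"result": {"companyName": "ACME", "tags": ["web", "seo",],}}\n```';
const parsed = unwrapResponse(JSON.parse(cleanJson(raw)));
console.log(parsed); // { companyName: "ACME", tags: ["web", "seo"] }
```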

View File

@@ -0,0 +1,296 @@
// ============================================================================
// Pipeline Orchestrator
// Runs all steps sequentially and tracks pipeline state.
// ============================================================================
import * as fs from "node:fs/promises";
import * as path from "node:path";
import { crawlSite, clearCrawlCache } from "./scraper.js";
import { analyzeSite } from "./analyzer.js";
import { executeResearch } from "./steps/00b-research.js";
import { executeExtract } from "./steps/01-extract.js";
import { executeSiteAudit } from "./steps/00a-site-audit.js";
import { executeAudit } from "./steps/02-audit.js";
import { executeStrategize } from "./steps/03-strategize.js";
import { executeArchitect } from "./steps/04-architect.js";
import type {
PipelineConfig,
PipelineInput,
ConceptState,
ProjectConcept,
StepResult,
} from "./types.js";
export interface PipelineCallbacks {
onStepStart?: (stepId: string, stepName: string) => void;
onStepComplete?: (stepId: string, result: StepResult) => void;
onStepError?: (stepId: string, error: string) => void;
}
/**
* The main concept pipeline orchestrator.
* Runs conceptual steps sequentially and builds the ProjectConcept.
*/
export class ConceptPipeline {
private config: PipelineConfig;
private state: ConceptState;
private callbacks: PipelineCallbacks;
constructor(config: PipelineConfig, callbacks: PipelineCallbacks = {}) {
this.config = config;
this.callbacks = callbacks;
this.state = this.createInitialState();
}
private createInitialState(): ConceptState {
return {
briefing: "",
usage: {
totalPromptTokens: 0,
totalCompletionTokens: 0,
totalCost: 0,
perStep: [],
},
};
}
/**
* Run the full concept pipeline from scratch.
*/
async run(input: PipelineInput): Promise<ProjectConcept> {
this.state.briefing = input.briefing;
this.state.url = input.url;
this.state.comments = input.comments;
// Ensure output directories
await fs.mkdir(this.config.outputDir, { recursive: true });
await fs.mkdir(this.config.crawlDir, { recursive: true });
// Step 0: Scrape & Analyze (deterministic)
if (input.url) {
if (input.clearCache) {
const domain = new URL(input.url).hostname;
await clearCrawlCache(this.config.crawlDir, domain);
}
await this.runStep(
"00-scrape",
"Scraping & Analyzing Website",
async () => {
const pages = await crawlSite(input.url!, {
zyteApiKey: this.config.zyteApiKey,
crawlDir: this.config.crawlDir,
});
const domain = new URL(input.url!).hostname;
const siteProfile = analyzeSite(pages, domain);
this.state.siteProfile = siteProfile;
this.state.crawlDir = path.join(
this.config.crawlDir,
domain.replace(/\./g, "-"),
);
// Save site profile
await fs.writeFile(
path.join(this.state.crawlDir!, "_site_profile.json"),
JSON.stringify(siteProfile, null, 2),
);
return {
success: true,
data: siteProfile,
usage: {
step: "00-scrape",
model: "none",
promptTokens: 0,
completionTokens: 0,
cost: 0,
durationMs: 0,
},
};
},
);
}
// Step 00a: Site Audit (DataForSEO)
await this.runStep(
"00a-site-audit",
"IST-Analysis (DataForSEO)",
async () => {
const result = await executeSiteAudit(this.state, this.config);
if (result.success && result.data) {
this.state.siteAudit = result.data;
}
return result;
},
);
// Step 00b: Research (real web data via journaling)
await this.runStep(
"00b-research",
"Industry & Company Research",
async () => {
const result = await executeResearch(this.state);
if (result.success && result.data) {
this.state.researchData = result.data;
}
return result;
},
);
// Step 1: Extract facts
await this.runStep(
"01-extract",
"Extracting Facts from Briefing",
async () => {
const result = await executeExtract(this.state, this.config);
if (result.success) this.state.facts = result.data;
return result;
},
);
// Step 2: Audit features
await this.runStep(
"02-audit",
"Auditing Features (Skeptical Review)",
async () => {
const result = await executeAudit(this.state, this.config);
if (result.success) this.state.auditedFacts = result.data;
return result;
},
);
// Step 3: Strategic analysis
await this.runStep("03-strategize", "Strategic Analysis", async () => {
const result = await executeStrategize(this.state, this.config);
if (result.success) {
this.state.briefingSummary = result.data.briefingSummary;
this.state.designVision = result.data.designVision;
}
return result;
});
// Step 4: Sitemap architecture
await this.runStep("04-architect", "Information Architecture", async () => {
const result = await executeArchitect(this.state, this.config);
if (result.success) {
this.state.sitemap = result.data.sitemap;
this.state.websiteTopic = result.data.websiteTopic;
}
return result;
});
const projectConcept = this.buildProjectConcept();
await this.saveState(projectConcept);
return projectConcept;
}
/**
* Run a single step with callbacks and error handling.
*/
private async runStep(
stepId: string,
stepName: string,
executor: () => Promise<StepResult>,
): Promise<void> {
this.callbacks.onStepStart?.(stepId, stepName);
console.log(`\n📍 ${stepName}...`);
try {
const result = await executor();
if (result.usage) {
this.state.usage.perStep.push(result.usage);
this.state.usage.totalPromptTokens += result.usage.promptTokens;
this.state.usage.totalCompletionTokens += result.usage.completionTokens;
this.state.usage.totalCost += result.usage.cost;
}
if (result.success) {
const cost = result.usage?.cost
? ` ($${result.usage.cost.toFixed(4)})`
: "";
const duration = result.usage?.durationMs
? ` [${(result.usage.durationMs / 1000).toFixed(1)}s]`
: "";
console.log(`✅ ${stepName} complete${cost}${duration}`);
this.callbacks.onStepComplete?.(stepId, result);
} else {
console.error(`❌ ${stepName} failed: ${result.error}`);
this.callbacks.onStepError?.(stepId, result.error || "Unknown error");
throw new Error(result.error);
}
} catch (err) {
const errorMsg = (err as Error).message;
this.callbacks.onStepError?.(stepId, errorMsg);
throw err;
}
}
/**
* Build the final ProjectConcept object.
*/
private buildProjectConcept(): ProjectConcept {
return {
domain: this.state.siteProfile?.domain || "unknown",
timestamp: new Date().toISOString(),
briefing: this.state.briefing,
auditedFacts: this.state.auditedFacts || {},
siteProfile: this.state.siteProfile,
siteAudit: this.state.siteAudit,
researchData: this.state.researchData,
strategy: {
briefingSummary: this.state.briefingSummary || "",
designVision: this.state.designVision || "",
},
architecture: {
websiteTopic: this.state.websiteTopic || "",
sitemap: this.state.sitemap || [],
},
usage: this.state.usage,
};
}
/**
* Save the generated concept and full debug state to disk.
*/
private async saveState(concept: ProjectConcept): Promise<void> {
const timestamp = new Date().toISOString().replace(/[:.]/g, "-");
const companyName = this.state.auditedFacts?.companyName || "unknown";
const stateDir = path.join(this.config.outputDir, "concepts");
await fs.mkdir(stateDir, { recursive: true });
const statePath = path.join(stateDir, `${companyName}_${timestamp}.json`);
await fs.writeFile(statePath, JSON.stringify(concept, null, 2));
console.log(`\n📦 Saved Project Concept to: ${statePath}`);
// Save debug trace
const debugPath = path.join(
stateDir,
`${companyName}_${timestamp}_debug.json`,
);
await fs.writeFile(debugPath, JSON.stringify(this.state, null, 2));
// Print usage summary
console.log("\n──────────────────────────────────────────────");
console.log("📊 PIPELINE USAGE SUMMARY");
console.log("──────────────────────────────────────────────");
for (const step of this.state.usage.perStep) {
if (step.cost > 0) {
console.log(
`  ${step.step}: ${step.model} $${step.cost.toFixed(6)} (${(step.durationMs / 1000).toFixed(1)}s)`,
);
}
}
console.log("──────────────────────────────────────────────");
console.log(` TOTAL: $${this.state.usage.totalCost.toFixed(6)}`);
console.log(
` Tokens: ${(this.state.usage.totalPromptTokens + this.state.usage.totalCompletionTokens).toLocaleString()}`,
);
console.log("──────────────────────────────────────────────\n");
}
/** Get the current internal state (for CLI inspection). */
getState(): ConceptState {
return this.state;
}
}
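The token/cost bookkeeping that `runStep` performs can be sketched in isolation. The field names below are taken from the code above; `recordStep` is an illustrative standalone helper, not a method of the class:

```typescript
// Minimal standalone sketch of the per-step usage accumulation in runStep.
interface StepUsage {
  step: string;
  model: string;
  promptTokens: number;
  completionTokens: number;
  cost: number;
  durationMs: number;
}

const usage = {
  totalPromptTokens: 0,
  totalCompletionTokens: 0,
  totalCost: 0,
  perStep: [] as StepUsage[],
};

function recordStep(u: StepUsage): void {
  usage.perStep.push(u);
  usage.totalPromptTokens += u.promptTokens;
  usage.totalCompletionTokens += u.completionTokens;
  usage.totalCost += u.cost;
}

recordStep({ step: "01-extract", model: "gemini-flash", promptTokens: 1200, completionTokens: 300, cost: 0.0005, durationMs: 2100 });
recordStep({ step: "02-audit", model: "gemini-flash", promptTokens: 900, completionTokens: 250, cost: 0.0004, durationMs: 1800 });
console.log(usage.totalCost.toFixed(4)); // "0.0009"
```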

View File

@@ -0,0 +1,478 @@
// ============================================================================
// Scraper — Zyte API + Local Persistence
// Crawls all pages of a website, stores them locally for reuse.
// ============================================================================
import * as cheerio from "cheerio";
import * as fs from "node:fs/promises";
import * as path from "node:path";
import { existsSync } from "node:fs";
import type { CrawledPage, PageType } from "./types.js";
interface ScraperConfig {
zyteApiKey?: string;
crawlDir: string;
maxPages?: number;
}
/**
* Classify a URL pathname into a page type.
*/
function classifyPage(pathname: string): PageType {
const p = pathname.toLowerCase();
if (p === "/" || p === "" || p === "/index.html") return "home";
if (
p.includes("service") ||
p.includes("leistung") ||
p.includes("kompetenz")
)
return "service";
if (
p.includes("about") ||
p.includes("ueber") ||
p.includes("über") ||
p.includes("unternehmen")
)
return "about";
if (p.includes("contact") || p.includes("kontakt")) return "contact";
if (
p.includes("job") ||
p.includes("karriere") ||
p.includes("career") ||
p.includes("human-resources")
)
return "career";
if (
p.includes("portfolio") ||
p.includes("referenz") ||
p.includes("projekt") ||
p.includes("case-study")
)
return "portfolio";
if (
p.includes("blog") ||
p.includes("news") ||
p.includes("aktuelles") ||
p.includes("magazin")
)
return "blog";
if (
p.includes("legal") ||
p.includes("impressum") ||
p.includes("datenschutz") ||
p.includes("privacy") ||
p.includes("agb")
)
return "legal";
return "other";
}
/**
* Detect interactive features present on a page.
*/
function detectFeatures($: cheerio.CheerioAPI): string[] {
const features: string[] = [];
// Search
if (
$('input[type="search"]').length > 0 ||
$('form[role="search"]').length > 0 ||
$(".search-form, .search-box, #search, .searchbar").length > 0 ||
$('input[name="q"], input[name="s"], input[name="search"]').length > 0
) {
features.push("search");
}
// Forms (beyond search)
const formCount = $("form").length;
const searchForms = $('form[role="search"], .search-form').length;
if (formCount > searchForms) {
features.push("forms");
}
// Maps
if (
$(
'iframe[src*="google.com/maps"], iframe[src*="openstreetmap"], .map-container, #map, [data-map]',
).length > 0
) {
features.push("maps");
}
// Video
if (
$("video, iframe[src*='youtube'], iframe[src*='vimeo'], .video-container")
.length > 0
) {
features.push("video");
}
// Calendar / Events
if ($(".calendar, .event, [data-calendar]").length > 0) {
features.push("calendar");
}
// Cookie consent
if (
$(".cookie-banner, .cookie-consent, #cookie-notice, [data-cookie]").length >
0
) {
features.push("cookie-consent");
}
return features;
}
/**
* Extract all internal links from a page.
*/
function extractInternalLinks($: cheerio.CheerioAPI, origin: string): string[] {
const links: string[] = [];
$("a[href]").each((_, el) => {
const href = $(el).attr("href");
if (!href) return;
try {
const url = new URL(href, origin);
if (url.origin === origin) {
// Skip assets
if (
/\.(pdf|zip|jpg|jpeg|png|svg|webp|gif|css|js|ico|woff|woff2|ttf|eot)$/i.test(
url.pathname,
)
)
return;
// Skip anchors-only
if (url.pathname === "/" && url.hash) return;
links.push(url.pathname);
}
} catch {
// Invalid URL, skip
}
});
return [...new Set(links)];
}
/**
* Extract all images from a page.
*/
function extractImages($: cheerio.CheerioAPI, origin: string): string[] {
const images: string[] = [];
// Regular img tags
$("img[src]").each((_, el) => {
const src = $(el).attr("src");
if (src) images.push(src);
});
// CSS background images (inline styles)
$("[style*='background-image']").each((_, el) => {
const style = $(el).attr("style");
const match = style?.match(/url\(['"]?(.*?)['"]?\)/);
if (match && match[1]) {
images.push(match[1]);
}
});
// Resolve URLs to absolute
const absoluteImages: string[] = [];
for (const img of images) {
if (img.startsWith("data:image")) continue; // Skip inline base64
try {
const url = new URL(img, origin);
// Ignore small tracking pixels or generic vectors
if (url.pathname.endsWith(".svg") && !url.pathname.includes("logo"))
continue;
absoluteImages.push(url.href);
} catch {
// Invalid URL
}
}
return [...new Set(absoluteImages)];
}
/**
* Fetch a page via Zyte API with browser rendering.
*/
async function fetchWithZyte(
url: string,
apiKey: string,
attempt = 0,
): Promise<string> {
const auth = Buffer.from(`${apiKey}:`).toString("base64");
const resp = await fetch("https://api.zyte.com/v1/extract", {
method: "POST",
headers: {
Authorization: `Basic ${auth}`,
"Content-Type": "application/json",
},
body: JSON.stringify({
url,
browserHtml: true,
}),
signal: AbortSignal.timeout(60000),
});
if (!resp.ok) {
const errorText = await resp.text();
console.error(
`  ❌ Zyte API error ${resp.status} for ${url}: ${errorText}`,
);
// Rate limited: wait and retry (bounded to avoid infinite recursion)
if (resp.status === 429 && attempt < 3) {
console.log("  ⏳ Rate limited, waiting 5s and retrying...");
await new Promise((r) => setTimeout(r, 5000));
return fetchWithZyte(url, apiKey, attempt + 1);
}
throw new Error(`HTTP ${resp.status}: ${errorText}`);
}
}
const data = await resp.json();
const html = data.browserHtml || "";
if (!html) {
console.warn(` ⚠️ Zyte returned empty browserHtml for ${url}`);
}
return html;
}
/**
* Fetch a page via simple HTTP GET (fallback).
*/
async function fetchDirect(url: string): Promise<string> {
const resp = await fetch(url, {
headers: {
"User-Agent":
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
},
signal: AbortSignal.timeout(30000),
}).catch(() => null);
if (!resp || !resp.ok) return "";
return await resp.text();
}
/**
* Parse an HTML string into a CrawledPage.
*/
function parsePage(html: string, url: string): CrawledPage {
const $ = cheerio.load(html);
const urlObj = new URL(url);
const title = $("title").text().trim();
const headings = $("h1, h2, h3")
.map((_, el) => $(el).text().trim())
.get()
.filter((h) => h.length > 0);
const navItems = $("nav a")
.map((_, el) => $(el).text().trim())
.get()
.filter((t) => t.length > 0 && t.length < 100);
const bodyText = $("body")
.text()
.replace(/\s+/g, " ")
.substring(0, 50000)
.trim();
const features = detectFeatures($);
const links = extractInternalLinks($, urlObj.origin);
const images = extractImages($, urlObj.origin);
const description =
$('meta[name="description"]').attr("content") || undefined;
const ogTitle = $('meta[property="og:title"]').attr("content") || undefined;
const ogImage = $('meta[property="og:image"]').attr("content") || undefined;
return {
url,
pathname: urlObj.pathname,
title,
html,
text: bodyText,
headings,
navItems,
features,
type: classifyPage(urlObj.pathname),
links,
images,
meta: { description, ogTitle, ogImage },
};
}
/**
* Crawl a website and persist all pages locally.
*
* Returns an array of CrawledPage objects.
*/
export async function crawlSite(
targetUrl: string,
config: ScraperConfig,
): Promise<CrawledPage[]> {
const urlObj = new URL(targetUrl);
const origin = urlObj.origin;
const domain = urlObj.hostname;
const domainDir = path.join(config.crawlDir, domain.replace(/\./g, "-"));
// Check for existing crawl
const metaFile = path.join(domainDir, "_crawl_meta.json");
if (existsSync(metaFile)) {
console.log(`📦 Found existing crawl for ${domain}. Loading from disk...`);
return loadCrawlFromDisk(domainDir);
}
console.log(
`🔍 Crawling ${targetUrl} via ${config.zyteApiKey ? "Zyte API" : "direct HTTP"}...`,
);
// Ensure output dir
await fs.mkdir(domainDir, { recursive: true });
const maxPages = config.maxPages || 30;
const visited = new Set<string>();
const queue: string[] = [targetUrl];
const pages: CrawledPage[] = [];
while (queue.length > 0 && visited.size < maxPages) {
const url = queue.shift()!;
const urlPath = new URL(url).pathname;
if (visited.has(urlPath)) continue;
visited.add(urlPath);
try {
console.log(` ↳ Fetching ${url} (${visited.size}/${maxPages})...`);
let html: string;
if (config.zyteApiKey) {
html = await fetchWithZyte(url, config.zyteApiKey);
} else {
html = await fetchDirect(url);
}
if (!html || html.length < 100) {
console.warn(` ⚠️ Empty/tiny response for ${url}, skipping.`);
continue;
}
const page = parsePage(html, url);
pages.push(page);
// Save HTML + metadata to disk
const safeName =
urlPath === "/"
? "index"
: urlPath.replace(/\//g, "_").replace(/^_/, "");
await fs.writeFile(path.join(domainDir, `${safeName}.html`), html);
await fs.writeFile(
path.join(domainDir, `${safeName}.meta.json`),
JSON.stringify(
{
url: page.url,
pathname: page.pathname,
title: page.title,
type: page.type,
headings: page.headings,
navItems: page.navItems,
features: page.features,
links: page.links,
images: page.images,
meta: page.meta,
},
null,
2,
),
);
// Discover new links
for (const link of page.links) {
if (!visited.has(link)) {
const fullUrl = `${origin}${link}`;
queue.push(fullUrl);
}
}
} catch (err) {
console.warn(` ⚠️ Failed to fetch ${url}: ${(err as Error).message}`);
}
}
// Save crawl metadata
await fs.writeFile(
metaFile,
JSON.stringify(
{
domain,
crawledAt: new Date().toISOString(),
totalPages: pages.length,
urls: pages.map((p) => p.url),
},
null,
2,
),
);
console.log(
`✅ Crawled ${pages.length} pages for ${domain}. Saved to ${domainDir}`,
);
return pages;
}
/**
* Load a previously crawled site from disk.
*/
async function loadCrawlFromDisk(domainDir: string): Promise<CrawledPage[]> {
const files = await fs.readdir(domainDir);
const metaFiles = files.filter(
(f) => f.endsWith(".meta.json") && f !== "_crawl_meta.json",
);
const pages: CrawledPage[] = [];
for (const metaFile of metaFiles) {
const baseName = metaFile.replace(".meta.json", "");
const htmlFile = `${baseName}.html`;
const meta = JSON.parse(
await fs.readFile(path.join(domainDir, metaFile), "utf8"),
);
let html = "";
if (files.includes(htmlFile)) {
html = await fs.readFile(path.join(domainDir, htmlFile), "utf8");
}
const text = html
? cheerio
.load(html)("body")
.text()
.replace(/\s+/g, " ")
.substring(0, 50000)
.trim()
: "";
pages.push({
url: meta.url,
pathname: meta.pathname,
title: meta.title,
html,
text,
headings: meta.headings || [],
navItems: meta.navItems || [],
features: meta.features || [],
type: meta.type || "other",
links: meta.links || [],
images: meta.images || [],
meta: meta.meta || {},
});
}
console.log(` 📂 Loaded ${pages.length} cached pages from disk.`);
return pages;
}
/**
* Delete a cached crawl to force re-crawl.
*/
export async function clearCrawlCache(
crawlDir: string,
domain: string,
): Promise<void> {
const domainDir = path.join(crawlDir, domain.replace(/\./g, "-"));
if (existsSync(domainDir)) {
await fs.rm(domainDir, { recursive: true, force: true });
console.log(`🧹 Cleared crawl cache for ${domain}`);
}
}
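The on-disk layout that `crawlSite` and `clearCrawlCache` share can be sketched with the same replace rules as above (the helper names `domainDirName` and `safeFileName` are illustrative; the originals are inlined expressions):

```typescript
// Sketch of the crawl cache layout: dots in the domain become dashes for the
// directory name, and slashes in a pathname become underscores for file names.
function domainDirName(domain: string): string {
  return domain.replace(/\./g, "-");
}

function safeFileName(urlPath: string): string {
  return urlPath === "/" ? "index" : urlPath.replace(/\//g, "_").replace(/^_/, "");
}

console.log(domainDirName("www.example.de"));       // "www-example-de"
console.log(safeFileName("/"));                     // "index"
console.log(safeFileName("/leistungen/webdesign")); // "leistungen_webdesign"
```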

View File

@@ -0,0 +1,65 @@
// ============================================================================
// Step 00a: Site Audit (DataForSEO + AI)
// ============================================================================
import { PageAuditor } from "@mintel/page-audit";
import type { ConceptState, StepResult, PipelineConfig } from "../types.js";
export async function executeSiteAudit(
state: ConceptState,
config: PipelineConfig,
): Promise<StepResult> {
const startTime = Date.now();
if (!state.url) {
return {
success: true,
data: null,
usage: { step: "00a-site-audit", model: "none", promptTokens: 0, completionTokens: 0, cost: 0, durationMs: Date.now() - startTime },
};
}
try {
const login = process.env.DATA_FOR_SEO_LOGIN || process.env.DATA_FOR_SEO_API_KEY?.split(":")?.[0];
const password = process.env.DATA_FOR_SEO_PASSWORD || process.env.DATA_FOR_SEO_API_KEY?.split(":")?.slice(1)?.join(":");
if (!login || !password) {
console.warn(" ⚠️ Site Audit skipped: DataForSEO credentials missing from environment.");
return {
success: true,
data: null,
usage: { step: "00a-site-audit", model: "none", promptTokens: 0, completionTokens: 0, cost: 0, durationMs: Date.now() - startTime },
};
}
const auditor = new PageAuditor({
dataForSeoLogin: login,
dataForSeoPassword: password,
openrouterKey: config.openrouterKey,
outputDir: config.outputDir ? `${config.outputDir}/audits` : undefined,
});
// Run audit (max 20 pages for the estimation phase to keep it fast)
const result = await auditor.audit(state.url, { maxPages: 20 });
return {
success: true,
data: result,
usage: {
step: "00a-site-audit",
model: "dataforseo",
cost: 0, // DataForSEO cost tracking could be added later
promptTokens: 0,
completionTokens: 0,
durationMs: Date.now() - startTime,
},
};
} catch (err: any) {
console.warn(` ⚠️ Site Audit failed, skipping: ${err.message}`);
return {
success: true,
data: null,
usage: { step: "00a-site-audit", model: "none", promptTokens: 0, completionTokens: 0, cost: 0, durationMs: Date.now() - startTime },
};
}
}
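The credential parsing above supports both split variables and a single combined key. A standalone sketch (the helper name `parseCredentials` is illustrative) shows why the split/slice/join dance is used: only the first colon separates login from password, so passwords containing colons survive:

```typescript
// Sketch of the DATA_FOR_SEO_API_KEY parsing in executeSiteAudit:
// "login:password" is split on the FIRST colon only.
function parseCredentials(apiKey: string | undefined): {
  login?: string;
  password?: string;
} {
  const login = apiKey?.split(":")?.[0];
  const password = apiKey?.split(":")?.slice(1)?.join(":");
  return { login, password: password || undefined };
}

const creds = parseCredentials("user@example.com:p:ss:word");
console.log(creds); // { login: "user@example.com", password: "p:ss:word" }
```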

View File

@@ -0,0 +1,121 @@
// ============================================================================
// Step 00b: Research — Industry Research via @mintel/journaling (no LLM hallucinations)
// Uses Serper API for real web search results about the industry/company.
// ============================================================================
import type { ConceptState, StepResult } from "../types.js";
interface ResearchResult {
companyContext: string[];
industryInsights: string[];
competitorInfo: string[];
}
/**
* Research the company and industry using real web search data.
* Uses @mintel/journaling's ResearchAgent — results are grounded in real sources.
*
* NOTE: The journaling package can cause unhandled rejections that crash the process.
* We wrap each call in an additional safety layer.
*/
export async function executeResearch(
state: ConceptState,
): Promise<StepResult<ResearchResult>> {
const startTime = Date.now();
const companyName = state.siteProfile?.companyInfo?.name || "";
const websiteTopic = state.siteProfile?.services?.slice(0, 3).join(", ") || "";
const domain = state.siteProfile?.domain || "";
if (!companyName && !websiteTopic && !domain) {
return {
success: true,
data: { companyContext: [], industryInsights: [], competitorInfo: [] },
usage: { step: "00b-research", model: "none", promptTokens: 0, completionTokens: 0, cost: 0, durationMs: 0 },
};
}
// Safety wrapper: catch ANY unhandled rejections during this step
const safeCall = <T>(fn: () => Promise<T>, fallback: T): Promise<T> => {
return new Promise<T>((resolve) => {
const handler = (err: any) => {
console.warn(` ⚠️ Unhandled rejection caught in research: ${err?.message || err}`);
process.removeListener("unhandledRejection", handler);
resolve(fallback);
};
process.on("unhandledRejection", handler);
fn()
.then((result) => {
process.removeListener("unhandledRejection", handler);
resolve(result);
})
.catch((err) => {
process.removeListener("unhandledRejection", handler);
console.warn(` ⚠️ Research call failed: ${err?.message || err}`);
resolve(fallback);
});
});
};
try {
const { ResearchAgent } = await import("@mintel/journaling");
const agent = new ResearchAgent(process.env.OPENROUTER_API_KEY || "");
const results: ResearchResult = {
companyContext: [],
industryInsights: [],
competitorInfo: [],
};
// 1. Research the company itself
if (companyName || domain) {
const searchQuery = companyName
? `${companyName} ${websiteTopic} Unternehmen`
: `site:${domain}`;
console.log(` 🔍 Researching: "${searchQuery}"...`);
const facts = await safeCall(
() => agent.researchTopic(searchQuery),
[] as any[],
);
results.companyContext = (facts || [])
.filter((f: any) => f?.fact || f?.value || f?.text || f?.statement)
.map((f: any) => f.fact || f.value || f.text || f.statement)
.slice(0, 5);
}
// 2. Industry research
if (websiteTopic) {
console.log(` 🔍 Researching industry: "${websiteTopic}"...`);
const insights = await safeCall(
() => agent.researchCompetitors(websiteTopic),
[] as any[],
);
results.industryInsights = (insights || []).slice(0, 5);
}
const totalFacts = results.companyContext.length + results.industryInsights.length + results.competitorInfo.length;
console.log(` 📊 Research found ${totalFacts} data points.`);
return {
success: true,
data: results,
usage: {
step: "00b-research",
model: "serper/datacommons",
promptTokens: 0,
completionTokens: 0,
cost: 0,
durationMs: Date.now() - startTime,
},
};
} catch (err) {
console.warn(` ⚠️ Research step skipped: ${(err as Error).message}`);
return {
success: true,
data: { companyContext: [], industryInsights: [], competitorInfo: [] },
usage: { step: "00b-research", model: "none", promptTokens: 0, completionTokens: 0, cost: 0, durationMs: Date.now() - startTime },
};
}
}
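The `safeCall` safety layer is worth seeing on its own. This standalone copy (mirroring the closure above, with the logging dropped) resolves with a fallback both when the wrapped promise rejects and when an unhandled rejection fires elsewhere in the process during the call:

```typescript
// Standalone version of the safeCall wrapper from executeResearch.
function safeCall<T>(fn: () => Promise<T>, fallback: T): Promise<T> {
  return new Promise<T>((resolve) => {
    const handler = (_err: any) => {
      process.removeListener("unhandledRejection", handler);
      resolve(fallback); // an unhandled rejection anywhere resolves to the fallback
    };
    process.on("unhandledRejection", handler);
    fn()
      .then((result) => {
        process.removeListener("unhandledRejection", handler);
        resolve(result);
      })
      .catch(() => {
        process.removeListener("unhandledRejection", handler);
        resolve(fallback); // direct rejection also resolves to the fallback
      });
  });
}

const ok = await safeCall(() => Promise.resolve(["fact"]), [] as string[]);
const failed = await safeCall(() => Promise.reject(new Error("boom")), [] as string[]);
console.log(ok, failed); // ["fact"] []
```

One caveat of this pattern: while the listener is registered, it swallows unhandled rejections from unrelated code too, which is why the original scopes it tightly to each research call.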

View File

@@ -0,0 +1,108 @@
// ============================================================================
// Step 01: Extract — Briefing Fact Extraction (Gemini Flash)
// ============================================================================
import { llmJsonRequest } from "../llm-client.js";
import type { ConceptState, StepResult, PipelineConfig } from "../types.js";
import { DEFAULT_MODELS } from "../types.js";
export async function executeExtract(
state: ConceptState,
config: PipelineConfig,
): Promise<StepResult> {
const models = { ...DEFAULT_MODELS, ...config.modelsOverride };
const startTime = Date.now();
// Build site context from the deterministic analyzer
const siteContext = state.siteProfile
? `
EXISTING WEBSITE ANALYSIS (FACTS — verifiably crawled, NOT guessed):
- Domain: ${state.siteProfile.domain}
- Total pages crawled: ${state.siteProfile.totalPages}
- Navigation items: ${state.siteProfile.navigation.map((n) => n.label).join(", ") || "nicht erkannt"}
- Existing features: ${state.siteProfile.existingFeatures.join(", ") || "keine"}
- Services / Kompetenzen: ${state.siteProfile.services.join(" | ") || "keine"}
- Employee count (from website text): ${state.siteProfile.employeeCount || "nicht genannt"}
- Company name: ${state.siteProfile.companyInfo.name || "unbekannt"}
- Address: ${state.siteProfile.companyInfo.address || "unbekannt"}
- Tax ID (USt-ID): ${state.siteProfile.companyInfo.taxId || "unbekannt"}
- HRB: ${state.siteProfile.companyInfo.registerNumber || "unbekannt"}
- Managing Director: ${state.siteProfile.companyInfo.managingDirector || "unbekannt"}
- External related domains (HAVE OWN WEBSITES — DO NOT include as sub-pages!): ${state.siteProfile.externalDomains.join(", ") || "keine"}
- Social links: ${Object.entries(state.siteProfile.socialLinks).map(([k, v]) => `${k}: ${v}`).join(", ") || "keine"}
`
: "No existing website data available.";
const systemPrompt = `
You are a precision fact extractor. Your only job: extract verifiable facts from the BRIEFING.
Output language: GERMAN (strict).
Output format: flat JSON at root level. No nesting except arrays.
### CRITICAL RULES:
1. "employeeCount": take from SITE ANALYSIS if available. Only override if briefing states something more specific.
2. External domains (e.g. "etib-ing.com") have their OWN website. NEVER include them as sub-pages.
3. Videos (Messefilm, Imagefilm) are CONTENT ASSETS, not pages.
4. If existing site already has search, include "search" in functions.
5. DO NOT invent pages not mentioned in briefing or existing navigation.
### CONSERVATIVE RULE:
- simple lists (Jobs, Referenzen, Messen) = pages, NOT features
- Assume "page" as default. Only add "feature" for complex interactive systems.
### OUTPUT FORMAT:
{
"companyName": string,
"companyAddress": string,
"personName": string,
"email": string,
"existingWebsite": string,
"websiteTopic": string, // MAX 3 words
"isRelaunch": boolean,
"employeeCount": string, // from site analysis, e.g. "über 50"
"pages": string[], // ALL pages: ["Startseite", "Über Uns", "Leistungen", ...]
"functions": string[], // search, forms, maps, video, cookie_consent, etc.
"assets": string[], // existing_website, logo, media, photos, videos
"deadline": string,
"targetAudience": string,
"cmsSetup": boolean,
"multilang": boolean
}
BANNED OUTPUT KEYS: "selectedPages", "otherPages", "features", "apiSystems" — use pages[] and functions[] ONLY.
`;
const userPrompt = `BRIEFING (TRUTH SOURCE):
${state.briefing}
COMMENTS:
${state.comments || "keine"}
${siteContext}`;
try {
const { data, usage } = await llmJsonRequest({
model: models.flash,
systemPrompt,
userPrompt,
apiKey: config.openrouterKey,
});
return {
success: true,
data,
usage: {
step: "01-extract",
model: models.flash,
promptTokens: usage.promptTokens,
completionTokens: usage.completionTokens,
cost: usage.cost,
durationMs: Date.now() - startTime,
},
};
} catch (err) {
return {
success: false,
error: `Extract step failed: ${(err as Error).message}`,
};
}
}
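The `BANNED OUTPUT KEYS` rule above is enforced only by the prompt; nothing checks the parsed JSON. A deterministic post-check could look like this (a sketch — the function name and placement after `llmJsonRequest` are assumptions):

```typescript
// Hypothetical guard for the extract step's output contract: the prompt bans
// "selectedPages", "otherPages", "features" and "apiSystems" in favour of the
// canonical pages[] / functions[] fields, but the LLM may ignore that.
const BANNED_KEYS = ["selectedPages", "otherPages", "features", "apiSystems"] as const;

function findBannedKeys(data: Record<string, unknown>): string[] {
  return BANNED_KEYS.filter((key) => key in data);
}
```

Calling such a guard right after the LLM response is parsed would let the step fail fast (or re-prompt) instead of passing malformed facts downstream.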


@@ -0,0 +1,110 @@
// ============================================================================
// Step 02: Audit — Feature Auditor + Skeptical Review (Gemini Flash)
// ============================================================================
import { llmJsonRequest } from "../llm-client.js";
import type { ConceptState, StepResult, PipelineConfig } from "../types.js";
import { DEFAULT_MODELS } from "../types.js";
export async function executeAudit(
state: ConceptState,
config: PipelineConfig,
): Promise<StepResult> {
const models = { ...DEFAULT_MODELS, ...config.modelsOverride };
const startTime = Date.now();
if (!state.facts) {
return { success: false, error: "No facts from Step 01 available." };
}
const systemPrompt = `
You are a "Strict Cost Controller". Your mission is to prevent over-billing.
Review the extracted FEATURES against the BRIEFING and the EXISTING SITE ANALYSIS.
### RULE OF THUMB:
- A "Feature" (1.500 €) is ONLY justified for complex, dynamic systems (logic, database, CMS-driven management, advanced filtering).
- Simple lists, information sections, or static descriptions (e.g., "Messen", "Team", "Historie", "Jobs" as mere text) are ALWAYS "Pages" (600 €).
- If the briefing doesn't explicitly mention "Management System", "Filterable Database", or "Client Login", it is a PAGE.
### ADDITIONAL CHECKS:
1. If any feature maps to an entity that has its own external website (listed in EXTERNAL_DOMAINS), remove it entirely — it's out of scope.
2. Videos are ASSETS not pages. Remove any video-related entries from pages.
3. If the existing site has features (search, forms, etc.), ensure they are in the functions list.
### MISSION:
Return the corrected 'features', 'otherPages', and 'functions' arrays.
### OUTPUT FORMAT:
{
"features": string[],
"otherPages": string[],
"functions": string[],
"removedItems": [{ "item": string, "reason": string }],
"addedItems": [{ "item": string, "reason": string }]
}
`;
const userPrompt = `
EXTRACTED FACTS:
${JSON.stringify(state.facts, null, 2)}
BRIEFING:
${state.briefing}
EXTERNAL DOMAINS (have own websites, OUT OF SCOPE):
${state.siteProfile?.externalDomains?.join(", ") || "none"}
EXISTING FEATURES ON CURRENT SITE:
${state.siteProfile?.existingFeatures?.join(", ") || "none"}
`;
try {
const { data, usage } = await llmJsonRequest({
model: models.flash,
systemPrompt,
userPrompt,
apiKey: config.openrouterKey,
});
// Apply audit results to facts
const auditedFacts = { ...state.facts };
auditedFacts.features = data.features || [];
auditedFacts.otherPages = [
...new Set([...(auditedFacts.otherPages || []), ...(data.otherPages || [])]),
];
if (data.functions) {
auditedFacts.functions = [
...new Set([...(auditedFacts.functions || []), ...data.functions]),
];
}
// Log changes
if (data.removedItems?.length) {
console.log(" 📉 Audit removed:");
for (const item of data.removedItems) {
console.log(` - ${item.item}: ${item.reason}`);
}
}
if (data.addedItems?.length) {
console.log(" 📈 Audit added:");
for (const item of data.addedItems) {
console.log(` + ${item.item}: ${item.reason}`);
}
}
return {
success: true,
data: auditedFacts,
usage: {
step: "02-audit",
model: models.flash,
promptTokens: usage.promptTokens,
completionTokens: usage.completionTokens,
cost: usage.cost,
durationMs: Date.now() - startTime,
},
};
} catch (err) {
return { success: false, error: `Audit step failed: ${(err as Error).message}` };
}
}
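The audit step merges LLM-corrected lists into the existing facts with the same spread-into-`Set` pattern twice. Factored into one helper it reads as follows (sketch; the name is illustrative):

```typescript
// Hypothetical helper: union two string lists, preserving first-seen order and
// dropping duplicates — the pattern executeAudit uses for otherPages and functions.
function mergeUnique(existing: string[] | undefined, incoming: string[] | undefined): string[] {
  return [...new Set([...(existing ?? []), ...(incoming ?? [])])];
}
```

`Set` iteration preserves insertion order, so entries already present in the audited facts keep their position and only genuinely new items are appended.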


@@ -0,0 +1,99 @@
// ============================================================================
// Step 03: Strategize — Briefing Summary + Design Vision (Gemini Pro)
// ============================================================================
import { llmJsonRequest } from "../llm-client.js";
import type { ConceptState, StepResult, PipelineConfig } from "../types.js";
import { DEFAULT_MODELS } from "../types.js";
export async function executeStrategize(
state: ConceptState,
config: PipelineConfig,
): Promise<StepResult> {
const models = { ...DEFAULT_MODELS, ...config.modelsOverride };
const startTime = Date.now();
if (!state.auditedFacts) {
return { success: false, error: "No audited facts from Step 02 available." };
}
const systemPrompt = `
You are a high-end Digital Architect. Your goal is to make the CUSTOMER feel 100% understood.
Analyze the BRIEFING and the EXISTING WEBSITE context.
### OBJECTIVE:
1. **briefingSummary**: Ein sachlicher, tiefgehender Überblick der Unternehmenslage.
- STIL: Keine Ich-Form. Keine Marketing-Floskeln. Nutze präzise Fachbegriffe. Sei prägnant.
- FORM: EXAKT ZWEI ABSÄTZE. Insgesamt ca. 6 Sätze.
- INHALT: Status Quo, was der Kunde will, welcher Sprung notwendig ist.
- ABSOLUTE REGEL: Keine Halluzinationen. Keine namentlichen Nennungen von Personen.
- RELAUNCH-REGEL: Wenn isRelaunch=true, NICHT sagen "keine digitale Präsenz". Es GIBT eine Seite.
- SORGLOS BETRIEB: MUSS erwähnt werden als Teil des Gesamtpakets.
2. **designVision**: Ein abstraktes, strategisches Konzept.
- STIL: Rein konzeptionell. Keine Umsetzungsschritte. Keine Ich-Form. Sei prägnant.
- FORM: EXAKT ZWEI ABSÄTZE. Insgesamt ca. 4 Sätze.
- DATENSCHUTZ: KEINERLEI namentliche Nennungen.
- FOKUS: Welche strategische Wirkung soll erzielt werden?
### RULES:
- NO "wir/unser". NO "Ich/Mein". Objective, fact-oriented narrative.
- NO marketing lingo. NO "innovativ", "revolutionär", "state-of-the-art".
- NO hallucinations about features not in the briefing.
- NO "SEO-Standards zur Fachkräftesicherung" or "B2B-Nutzerströme" — das ist Schwachsinn.
Use specific industry terms from the briefing (e.g. "Kabeltiefbau", "HDD-Bohrverfahren").
- LANGUAGE: Professional German. Simple but expert-level.
### OUTPUT FORMAT:
{
"briefingSummary": string,
"designVision": string
}
`;
const userPrompt = `
BRIEFING (TRUTH SOURCE):
${state.briefing}
EXISTING WEBSITE DATA:
- Services: ${state.siteProfile?.services?.join(", ") || "unbekannt"}
- Navigation: ${state.siteProfile?.navigation?.map((n) => n.label).join(", ") || "unbekannt"}
- Company: ${state.auditedFacts.companyName || "unbekannt"}
EXTRACTED & AUDITED FACTS:
${JSON.stringify(state.auditedFacts, null, 2)}
${state.siteAudit?.report ? `
TECHNICAL SITE AUDIT (IST-Analyse):
Health: ${state.siteAudit.report.overallHealth} (SEO: ${state.siteAudit.report.seoScore}, UX: ${state.siteAudit.report.uxScore}, Perf: ${state.siteAudit.report.performanceScore})
- Executive Summary: ${state.siteAudit.report.executiveSummary}
- Strengths: ${state.siteAudit.report.strengths.join(", ")}
- Critical Issues: ${state.siteAudit.report.criticalIssues.join(", ")}
- Quick Wins: ${state.siteAudit.report.quickWins.join(", ")}
` : ""}
`;
try {
const { data, usage } = await llmJsonRequest({
model: models.pro,
systemPrompt,
userPrompt,
apiKey: config.openrouterKey,
});
return {
success: true,
data,
usage: {
step: "03-strategize",
model: models.pro,
promptTokens: usage.promptTokens,
completionTokens: usage.completionTokens,
cost: usage.cost,
durationMs: Date.now() - startTime,
},
};
} catch (err) {
return { success: false, error: `Strategize step failed: ${(err as Error).message}` };
}
}
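The conditional audit section embedded in the user prompt above can be factored into a small formatter that returns an empty string when no audit report exists, so the prompt simply omits the section. A sketch (the helper name and the report interface shape are inferred from the template, not declared in this diff):

```typescript
interface SiteAuditReport {
  overallHealth: string;
  seoScore: number;
  uxScore: number;
  performanceScore: number;
  executiveSummary: string;
  strengths: string[];
  criticalIssues: string[];
  quickWins: string[];
}

// Hypothetical formatter mirroring the inline template in executeStrategize.
function formatSiteAudit(report?: SiteAuditReport): string {
  if (!report) return "";
  return [
    "TECHNICAL SITE AUDIT (IST-Analyse):",
    `Health: ${report.overallHealth} (SEO: ${report.seoScore}, UX: ${report.uxScore}, Perf: ${report.performanceScore})`,
    `- Executive Summary: ${report.executiveSummary}`,
    `- Strengths: ${report.strengths.join(", ")}`,
    `- Critical Issues: ${report.criticalIssues.join(", ")}`,
    `- Quick Wins: ${report.quickWins.join(", ")}`,
  ].join("\n");
}
```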


@@ -0,0 +1,133 @@
// ============================================================================
// Step 04: Architect — Sitemap & Information Architecture (Gemini Pro)
// ============================================================================
import { llmJsonRequest } from "../llm-client.js";
import type { ConceptState, StepResult, PipelineConfig } from "../types.js";
import { DEFAULT_MODELS } from "../types.js";
export async function executeArchitect(
state: ConceptState,
config: PipelineConfig,
): Promise<StepResult> {
const models = { ...DEFAULT_MODELS, ...config.modelsOverride };
const startTime = Date.now();
if (!state.auditedFacts) {
return { success: false, error: "No audited facts available." };
}
// Build navigation constraint from the real site
const existingNav = state.siteProfile?.navigation?.map((n) => n.label).join(", ") || "unbekannt";
const existingServices = state.siteProfile?.services?.join(", ") || "unbekannt";
const externalDomains = state.siteProfile?.externalDomains?.join(", ") || "keine";
const systemPrompt = `
Du bist ein Senior UX Architekt. Erstelle einen ECHTEN SEITENBAUM für die neue Website.
Regelwerk für den Output:
### SEITENBAUM-REGELN:
1. KEIN MARKETINGSPRECH als Kategoriename. Gültige Kategorien sind nur die echten Navigationspunkte der Website.
ERLAUBT: "Startseite", "Leistungen", "Über uns", "Karriere", "Referenzen", "Kontakt", "Rechtliches"
VERBOTEN: "Kern-Präsenz", "Vertrauen", "Business Areas", "Digitaler Auftritt"
2. LEISTUNGEN muss in ECHTE UNTERSEITEN aufgeteilt werden — nicht eine einzige "Leistungen"-Seite.
Jede Kompetenz aus dem existierenden Leistungsspektrum = eine eigene Seite.
Beispiel statt:
{ category: "Leistungen", pages: [{ title: "Leistungen", desc: "..." }] }
So:
{ category: "Leistungen", pages: [
{ title: "Kabeltiefbau", desc: "Mittelspannung, Niederspannung, Kabelpflugarbeiten..." },
{ title: "Horizontalspülbohrungen", desc: "HDD in allen Bodenklassen..." },
{ title: "Elektromontagen", desc: "Bis 110 kV, Glasfaserkabelmontagen..." },
{ title: "Planung & Dokumentation", desc: "Genehmigungs- und Ausführungsplanung, Vermessung..." }
]}
3. SEITENTITEL: Kurz, klar, faktisch. Kein Werbejargon.
ERLAUBT: "Kabeltiefbau", "Über uns", "Karriere"
VERBOTEN: "Unsere Expertise", "Kompetenzspektrum", "Community"
4. Gruppiere die Leistungen nach dem ECHTEN Kompetenzkatalog der bestehenden Site — nicht erfinden.
5. Keine doppelten Seiten. Keine Phantomseiten.
6. Videos = Content-Assets, keine eigene Seite.
7. Entitäten mit eigener Domain (${externalDomains}) = NICHT als Seite. Nur als Teaser/Link wenn nötig.
### KONTEXT:
Bestehende Navigation: ${existingNav}
Bestehende Services: ${existingServices}
Externe Domains (haben eigene Website): ${externalDomains}
Angeforderte zusätzliche Seiten aus Briefing: ${state.auditedFacts.pages?.join(", ") || "keine spezifischen"}
### OUTPUT FORMAT (JSON):
{
"websiteTopic": string, // MAX 3 Wörter, beschreibend
"sitemap": [
{
"category": string, // Echter Nav-Eintrag. KEIN Marketingsprech.
"pages": [
{ "title": string, "desc": string } // Echte Unterseite, 1-2 Sätze Zweck
]
}
]
}
`;
const userPrompt = `
BRIEFING:
${state.briefing}
FAKTEN (aus Extraktion):
${JSON.stringify({ facts: state.auditedFacts, strategy: { briefingSummary: state.briefingSummary } }, null, 2)}
Erstelle den Seitenbaum. Baue die Leistungen DETAILLIERT aus — echte Unterseiten pro Kompetenzbereich.
`;
try {
const { data, usage } = await llmJsonRequest({
model: models.pro,
systemPrompt,
userPrompt,
apiKey: config.openrouterKey,
});
// Normalize sitemap structure
let sitemap = data.sitemap;
if (sitemap && !Array.isArray(sitemap)) {
if (sitemap.categories) sitemap = sitemap.categories;
else {
const entries = Object.entries(sitemap);
if (entries.every(([, v]) => Array.isArray(v))) {
sitemap = entries.map(([category, pages]) => ({ category, pages }));
}
}
}
if (Array.isArray(sitemap)) {
sitemap = sitemap.map((cat: any) => ({
category: cat.category || cat.kategorie || cat.Kategorie || "Allgemein",
pages: (cat.pages || cat.seiten || []).map((page: any) => ({
title: page.title || page.titel || "Seite",
desc: page.desc || page.beschreibung || page.description || "",
})),
}));
}
return {
success: true,
data: { websiteTopic: data.websiteTopic, sitemap },
usage: {
step: "04-architect",
model: models.pro,
promptTokens: usage.promptTokens,
completionTokens: usage.completionTokens,
cost: usage.cost,
durationMs: Date.now() - startTime,
},
};
} catch (err) {
return { success: false, error: `Architect step failed: ${(err as Error).message}` };
}
}
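The sitemap normalization in the architect step tolerates three shapes the model may return (a proper array, a `{ categories: [...] }` wrapper, or a plain keyed object) plus German field names. Extracted as a standalone sketch (hypothetical helper; the interfaces are restated locally to keep it self-contained):

```typescript
interface SitemapPage { title: string; desc: string }
interface SitemapCategory { category: string; pages: SitemapPage[] }

// Hypothetical standalone version of the normalization in executeArchitect:
// coerce the tolerated shapes into SitemapCategory[] and map German field
// names (kategorie, seiten, titel, beschreibung) to the canonical English ones.
function normalizeSitemap(raw: any): SitemapCategory[] {
  let sitemap = raw;
  if (sitemap && !Array.isArray(sitemap)) {
    if (sitemap.categories) {
      sitemap = sitemap.categories;
    } else {
      const entries = Object.entries(sitemap);
      // A keyed object like { Leistungen: [...] } becomes category entries.
      if (entries.every(([, v]) => Array.isArray(v))) {
        sitemap = entries.map(([category, pages]) => ({ category, pages }));
      }
    }
  }
  if (!Array.isArray(sitemap)) return [];
  return sitemap.map((cat: any) => ({
    category: cat.category || cat.kategorie || cat.Kategorie || "Allgemein",
    pages: (cat.pages || cat.seiten || []).map((page: any) => ({
      title: page.title || page.titel || "Seite",
      desc: page.desc || page.beschreibung || page.description || "",
    })),
  }));
}
```

Isolating the coercion like this would also make it easy to unit-test the shape tolerance without running the LLM call.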


@@ -0,0 +1,233 @@
// ============================================================================
// @mintel/concept-engine — Core Type Definitions
// ============================================================================
/** Page types recognized during crawling */
export type PageType =
| "home"
| "service"
| "about"
| "contact"
| "career"
| "portfolio"
| "blog"
| "legal"
| "other";
/** A single crawled page with extracted metadata */
export interface CrawledPage {
url: string;
pathname: string;
title: string;
html: string;
text: string;
headings: string[];
navItems: string[];
features: string[];
type: PageType;
links: string[];
images: string[];
meta: {
description?: string;
ogTitle?: string;
ogImage?: string;
};
}
/** Navigation item extracted from <nav> elements */
export interface NavItem {
label: string;
href: string;
children?: NavItem[];
}
/** Company info extracted from Impressum / footer */
export interface CompanyInfo {
name?: string;
address?: string;
phone?: string;
email?: string;
taxId?: string;
registerNumber?: string;
managingDirector?: string;
}
/** A page in the site inventory */
export interface PageInventoryItem {
url: string;
pathname: string;
title: string;
type: PageType;
headings: string[];
services: string[];
hasSearch: boolean;
hasForms: boolean;
hasMap: boolean;
hasVideo: boolean;
contentSummary: string;
}
/** Full site profile — deterministic, no LLM involved */
export interface SiteProfile {
domain: string;
crawledAt: string;
totalPages: number;
navigation: NavItem[];
existingFeatures: string[];
services: string[];
companyInfo: CompanyInfo;
pageInventory: PageInventoryItem[];
colors: string[];
socialLinks: Record<string, string>;
externalDomains: string[];
images: string[];
employeeCount: string | null;
}
/** Configuration for the estimation pipeline */
export interface PipelineConfig {
openrouterKey: string;
zyteApiKey?: string;
outputDir: string;
crawlDir: string;
modelsOverride?: Partial<ModelConfig>;
}
/** Model routing configuration */
export interface ModelConfig {
flash: string;
pro: string;
opus: string;
}
export const DEFAULT_MODELS: ModelConfig = {
flash: "google/gemini-3-flash-preview",
pro: "google/gemini-3.1-pro-preview",
opus: "anthropic/claude-opus-4-6",
};
/** Input for a pipeline run */
export interface PipelineInput {
briefing: string;
url?: string;
budget?: string;
comments?: string;
clearCache?: boolean;
}
/** State that flows through all concept pipeline steps */
export interface ConceptState {
// Input
briefing: string;
url?: string;
comments?: string;
// Output: Scrape & Analyze
siteProfile?: SiteProfile;
crawlDir?: string;
// Output: Site Audit
siteAudit?: any;
// Output: Research
researchData?: any;
// Output: Extract
facts?: Record<string, any>;
// Output: Audit
auditedFacts?: Record<string, any>;
// Output: Strategy
briefingSummary?: string;
designVision?: string;
// Output: Architecture
sitemap?: SitemapCategory[];
websiteTopic?: string;
// Cost tracking
usage: UsageStats;
}
/** Final output of the Concept Engine */
export interface ProjectConcept {
domain: string;
timestamp: string;
briefing: string;
auditedFacts: Record<string, any>;
siteProfile?: SiteProfile;
siteAudit?: any;
researchData?: any;
strategy: {
briefingSummary: string;
designVision: string;
};
architecture: {
websiteTopic: string;
sitemap: SitemapCategory[];
};
usage: UsageStats;
}
export interface SitemapCategory {
category: string;
pages: { title: string; desc: string }[];
}
export interface UsageStats {
totalPromptTokens: number;
totalCompletionTokens: number;
totalCost: number;
perStep: StepUsage[];
}
export interface StepUsage {
step: string;
model: string;
promptTokens: number;
completionTokens: number;
cost: number;
durationMs: number;
}
/** Result of a single pipeline step */
export interface StepResult<T = any> {
success: boolean;
data?: T;
error?: string;
usage?: StepUsage;
}
/** Validation result from the deterministic validator */
export interface ValidationResult {
passed: boolean;
errors: ValidationError[];
warnings: ValidationWarning[];
}
export interface ValidationError {
code: string;
message: string;
field?: string;
expected?: any;
actual?: any;
}
export interface ValidationWarning {
code: string;
message: string;
suggestion?: string;
}
/** Step definition for the concept pipeline */
export interface PipelineStep {
id: string;
name: string;
description: string;
model: "flash" | "pro" | "opus" | "none";
execute: (
state: ConceptState,
config: PipelineConfig,
) => Promise<StepResult>;
}
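The types above imply that each `StepUsage` entry is folded into the running `UsageStats`. A minimal sketch of such an accumulator (the function itself is an assumption — it is not shown in this diff; the interfaces are restated to keep the example self-contained):

```typescript
interface StepUsage {
  step: string;
  model: string;
  promptTokens: number;
  completionTokens: number;
  cost: number;
  durationMs: number;
}

interface UsageStats {
  totalPromptTokens: number;
  totalCompletionTokens: number;
  totalCost: number;
  perStep: StepUsage[];
}

// Hypothetical accumulator: fold one step's usage into the running totals
// without mutating the previous stats object.
function recordUsage(stats: UsageStats, usage: StepUsage): UsageStats {
  return {
    totalPromptTokens: stats.totalPromptTokens + usage.promptTokens,
    totalCompletionTokens: stats.totalCompletionTokens + usage.completionTokens,
    totalCost: stats.totalCost + usage.cost,
    perStep: [...stats.perStep, usage],
  };
}
```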


@@ -0,0 +1,28 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"module": "NodeNext",
"moduleResolution": "NodeNext",
"target": "ES2022",
"lib": [
"ES2022",
"DOM"
],
"outDir": "dist",
"rootDir": "src",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"declaration": true,
"sourceMap": true
},
"include": [
"src/**/*"
],
"exclude": [
"node_modules",
"dist",
"**/*.test.ts"
]
}


@@ -0,0 +1,9 @@
import { defineConfig } from "tsup";
export default defineConfig({
entry: ["src/index.ts", "src/cli.ts"],
format: ["esm"],
dts: true,
clean: true,
target: "es2022",
});


@@ -0,0 +1,48 @@
import { ContentGenerator } from "../src/index";
import dotenv from "dotenv";
import path from "path";
import fs from "fs";
import { fileURLToPath } from "url";
// Fix __dirname for ESM (the package is "type": "module", so __dirname is undefined otherwise)
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// Load .env from mintel.me (since that's where the key is)
dotenv.config({
path: path.resolve(__dirname, "../../../../mintel.me/apps/web/.env"),
});
async function main() {
const apiKey = process.env.OPENROUTER_API_KEY || process.env.OPENROUTER_KEY;
if (!apiKey) {
console.error("❌ OPENROUTER_API_KEY not found");
process.exit(1);
}
const generator = new ContentGenerator(apiKey);
const topic = "Why traditional CMSs are dead for developers";
console.log(`🚀 Generating post for: "${topic}"`);
try {
const post = await generator.generatePost({
topic,
includeResearch: true,
includeDiagrams: true,
includeMemes: true,
});
console.log("\n\n✅ GENERATION COMPLETE");
console.log("--------------------------------------------------");
console.log(`Title: ${post.title}`);
console.log(`Research Points: ${post.research.length}`);
console.log(`Memes Generated: ${post.memes.length}`);
console.log(`Diagrams Generated: ${post.diagrams.length}`);
console.log("--------------------------------------------------");
// Save to file
const outputPath = path.join(__dirname, "output.md");
fs.writeFileSync(outputPath, post.content);
console.log(`📄 Saved output to: ${outputPath}`);
} catch (error) {
console.error("❌ Generation failed:", error);
}
}
main();


@@ -0,0 +1,58 @@
import { ContentGenerator } from "../src/index";
import dotenv from "dotenv";
import path from "path";
import fs from "fs";
import { fileURLToPath } from "url";
// Fix __dirname for ESM
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// Load .env from mintel.me (since that's where the key is)
dotenv.config({
path: path.resolve(__dirname, "../../../../mintel.me/apps/web/.env"),
});
async function main() {
const apiKey = process.env.OPENROUTER_API_KEY || process.env.OPENROUTER_KEY;
if (!apiKey) {
console.error("❌ OPENROUTER_API_KEY not found");
process.exit(1);
}
const generator = new ContentGenerator(apiKey);
const draftContent = `# The Case for Static Sites
Static sites are faster and more secure. They don't have a database to hack.
They are also cheaper to host. You can use a CDN to serve them globally.
Dynamic sites are complex and prone to errors.`;
console.log("📄 Original Content:");
console.log(draftContent);
console.log("\n🚀 Optimizing content...\n");
try {
const post = await generator.optimizePost(draftContent, {
enhanceFacts: true,
addDiagrams: true,
addMemes: true,
});
console.log("\n\n✅ OPTIMIZATION COMPLETE");
console.log("--------------------------------------------------");
console.log(`Research Points Added: ${post.research.length}`);
console.log(`Memes Generated: ${post.memes.length}`);
console.log(`Diagrams Generated: ${post.diagrams.length}`);
console.log("--------------------------------------------------");
// Save to file
const outputPath = path.join(__dirname, "optimized.md");
fs.writeFileSync(outputPath, post.content);
console.log(`📄 Saved output to: ${outputPath}`);
} catch (error) {
console.error("❌ Optimization failed:", error);
}
}
main();


@@ -0,0 +1,132 @@
import { ContentGenerator, ComponentDefinition } from "../src/index";
import dotenv from "dotenv";
import path from "path";
import fs from "fs";
import { fileURLToPath } from "url";
// Fix __dirname for ESM
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// Load .env from mintel.me
dotenv.config({
path: path.resolve(__dirname, "../../../../mintel.me/apps/web/.env"),
});
async function main() {
const apiKey = process.env.OPENROUTER_API_KEY || process.env.OPENROUTER_KEY;
if (!apiKey) {
console.error("❌ OPENROUTER_API_KEY not found");
process.exit(1);
}
const generator = new ContentGenerator(apiKey);
const contentToOptimize = `
"Wir können nicht wechseln, das wäre zu teuer."
In meiner Arbeit als Digital Architect ist das der Anfang vom Ende jeder technologischen Innovation.
Vendor Lock-In ist die digitale Version einer Geiselnahme.
Ich zeige Ihnen, wie wir Systeme bauen, die Ihnen jederzeit die volle Freiheit lassen, technologisch und wirtschaftlich.
Die unsichtbaren Ketten proprietärer Systeme
Viele Unternehmen lassen sich von der Bequemlichkeit großer SaaS-Plattformen oder Baukästen blenden.
Man bekommt ein schnelles Feature, gibt aber dafür die Kontrolle über seine Daten und seine Codebasis ab.
Nach zwei Jahren sind Sie so tief im Ökosystem eines Anbieters verstrickt, dass ein Auszug unmöglich scheint.
Der Anbieter weiß das und diktiert fortan die Preise und das Tempo Ihrer Entwicklung.
Ich nenne das technologische Erpressbarkeit.
Wahre Unabhängigkeit beginnt bei der strategischen Wahl der Architektur.
Technologische Souveränität als Asset
Software sollte für Sie arbeiten, nicht umgekehrt.
Indem wir auf offene Standards und portable Architekturen setzen, verwandeln wir Code in ein echtes Firmen-Asset.
Sie können den Cloud-Anbieter wechseln, die Agentur tauschen oder das Team skalieren, ohne jemals bei Null anfangen zu müssen.
Das ist das Privileg der technologischen Elite.
Portabilität ist kein technisches Gimmick, sondern eine unternehmerische Notwendigkeit.
Meine Architektur der Ungebundenheit
Ich baue keine "Käfige" aus fertigen Plugins.
Mein Framework basiert auf Modularität und Klarheit.
Standard-basiertes Engineering: Wir nutzen Technologien, die weltweit verstanden werden. Keine geheimen "Spezial-Module" eines einzelnen Anbieters.
Daten-Portabilität: Ihre Daten gehören Ihnen. Zu jeder Zeit. Wir bauen Schnittstellen, die den Export so einfach machen wie den Import.
Cloud-agnostisches Hosting: Wir nutzen Container-Technologie. Ob AWS, Azure oder lokale Anbieter: Ihr Code läuft überall gleich perfekt.
Der strategische Hebel für langfristige Rendite
Systeme ohne Lock-In altern besser.
Sie lassen sich schrittweise modernisieren, statt alle fünf Jahre komplett neu gebaut werden zu müssen.
Das spart Millionen an Opportunitätskosten und Fehl-Investitionen.
Seien Sie der Herr über Ihr digitales Schicksal.
Investieren Sie in intelligente Unabhängigkeit.
Für wen ich 'Freiheits-Systeme' erstelle
Ich arbeite für Gründer, die ihr Unternehmen langfristig wertvoll aufstellen wollen.
Ist digitale Exzellenz Teil Ihrer Exit-Strategie oder Ihres Erbes? Dann brauchen Sie meine Architektur.
Ich baue keine Provisorien, sondern nachhaltige Werte.
Fazit: Freiheit ist eine Wahl
Technologie sollte Ihnen Flügel verleihen, keine Fesseln anlegen.
Lassen Sie uns gemeinsam ein System schaffen, das so flexibel ist wie Ihr Business.
Werden Sie unersetzbar durch Qualität, nicht durch Abhängigkeit. Ihr Erfolg verdient absolute Freiheit.
`;
// Define components available in mintel.me
const availableComponents: ComponentDefinition[] = [
{
name: "LeadParagraph",
description: "Large, introductory text for the beginning of the article.",
usageExample: "<LeadParagraph>First meaningful sentence.</LeadParagraph>",
},
{
name: "H2",
description: "Section heading.",
usageExample: "<H2>Section Title</H2>",
},
{
name: "H3",
description: "Subsection heading.",
usageExample: "<H3>Subtitle</H3>",
},
{
name: "Paragraph",
description: "Standard body text paragraph.",
usageExample: "<Paragraph>Some text...</Paragraph>",
},
{
name: "ArticleBlockquote",
description: "A prominent quote block for key insights.",
usageExample: "<ArticleBlockquote>Important quote</ArticleBlockquote>",
},
{
name: "Marker",
description: "Yellow highlighter effect for very important phrases.",
usageExample: "<Marker>Highlighted Text</Marker>",
},
{
name: "ComparisonRow",
description: "A component comparing a negative vs positive scenario.",
usageExample:
'<ComparisonRow description="Cost Comparison" negativeLabel="Lock-In" negativeText="High costs" positiveLabel="Open" positiveText="Control" />',
},
];
console.log('🚀 Optimizing "Vendor Lock-In" post...');
try {
const post = await generator.optimizePost(contentToOptimize, {
enhanceFacts: true,
addDiagrams: true,
addMemes: true,
availableComponents,
});
console.log("\n\n✅ OPTIMIZATION COMPLETE");
// Save to a file in the package dir
const outputPath = path.join(__dirname, "VendorLockIn_OPTIMIZED.md");
fs.writeFileSync(outputPath, post.content);
console.log(`📄 Saved output to: ${outputPath}`);
} catch (error) {
console.error("❌ Optimization failed:", error);
}
}
main();


@@ -0,0 +1,71 @@
import { ContentGenerator, ComponentDefinition } from "../src/index";
import dotenv from "dotenv";
import path from "path";
import fs from "fs";
import { fileURLToPath } from "url";
// Fix __dirname for ESM
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// Load .env from mintel.me (since that's where the key is)
dotenv.config({
path: path.resolve(__dirname, "../../../../mintel.me/apps/web/.env"),
});
async function main() {
const apiKey = process.env.OPENROUTER_API_KEY || process.env.OPENROUTER_KEY;
if (!apiKey) {
console.error("❌ OPENROUTER_API_KEY not found");
process.exit(1);
}
const generator = new ContentGenerator(apiKey);
const draftContent = `# Improving User Retention
User retention is key. You need to keep users engaged.
Offer them value and they will stay.
If they have questions, they should contact support.`;
const availableComponents: ComponentDefinition[] = [
{
name: "InfoCard",
description: "A colored box to highlight important tips or warnings.",
usageExample:
'<InfoCard variant="warning" title="Pro Tip">Always measure retention.</InfoCard>',
},
{
name: "CallToAction",
description: "A prominent button for conversion.",
usageExample: '<CallToAction href="/contact">Get in Touch</CallToAction>',
},
];
console.log("📄 Original Content:");
console.log(draftContent);
console.log("\n🚀 Optimizing content with components...\n");
try {
const post = await generator.optimizePost(draftContent, {
enhanceFacts: true,
addDiagrams: false, // Skip diagrams for this test to focus on components
addMemes: false,
availableComponents,
});
console.log("\n\n✅ OPTIMIZATION COMPLETE");
console.log("--------------------------------------------------");
console.log(post.content);
console.log("--------------------------------------------------");
// Save to file
const outputPath = path.join(__dirname, "optimized-components.md");
fs.writeFileSync(outputPath, post.content);
console.log(`📄 Saved output to: ${outputPath}`);
} catch (error) {
console.error("❌ Optimization failed:", error);
}
}
main();


@@ -0,0 +1,38 @@
{
"name": "@mintel/content-engine",
"version": "1.9.9",
"private": false,
"type": "module",
"main": "./dist/index.js",
"module": "./dist/index.js",
"types": "./dist/index.d.ts",
"exports": {
".": {
"types": "./dist/index.d.ts",
"import": "./dist/index.js"
}
},
"scripts": {
"build": "tsup src/index.ts --format esm --dts --clean",
"dev": "tsup src/index.ts --format esm --watch --dts",
"lint": "eslint src"
},
"dependencies": {
"@mintel/journaling": "workspace:*",
"@mintel/meme-generator": "workspace:*",
"@mintel/thumbnail-generator": "workspace:*",
"dotenv": "^17.3.1",
"openai": "^4.82.0"
},
"devDependencies": {
"@mintel/eslint-config": "workspace:*",
"@mintel/tsconfig": "workspace:*",
"@types/node": "^20.0.0",
"tsup": "^8.3.5",
"typescript": "^5.0.0"
},
"repository": {
"type": "git",
"url": "https://git.infra.mintel.me/mmintel/at-mintel.git"
}
}
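The `exports` map above exposes only the package root, listing the `types` condition before `import` so TypeScript picks up declarations first. A self-contained sketch of how such a conditional-exports map resolves (a hypothetical helper for illustration, not part of this repo — Node's real algorithm lives in the module loader):

```typescript
// Hypothetical illustration of Node-style conditional-exports resolution.
// Not part of @mintel/content-engine; the map mirrors the manifest above.
type ExportConditions = Record<string, string>;
type ExportsMap = Record<string, ExportConditions>;

function resolveExport(
  map: ExportsMap,
  subpath: string,
  conditions: string[],
): string | undefined {
  const entry = map[subpath];
  if (!entry) return undefined;
  // First matching condition wins, as in Node's resolution order.
  for (const c of conditions) {
    if (entry[c]) return entry[c];
  }
  return undefined;
}

const exportsMap: ExportsMap = {
  ".": { types: "./dist/index.d.ts", import: "./dist/index.js" },
};

console.log(resolveExport(exportsMap, ".", ["import"])); // "./dist/index.js"
```

Because no `require` condition is declared, a CommonJS consumer falls through to `undefined` here, which matches the ESM-only intent of `"type": "module"`.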


@@ -0,0 +1,990 @@
import OpenAI from "openai";
import { ResearchAgent, type Fact, type SocialPost } from "@mintel/journaling";
import { MemeGenerator, MemeSuggestion } from "@mintel/meme-generator";
import * as fs from "node:fs/promises";
import * as path from "node:path";
export interface ComponentDefinition {
name: string;
description: string;
usageExample: string;
}
export interface BlogPostOptions {
topic: string;
tone?: string;
targetAudience?: string;
includeMemes?: boolean;
includeDiagrams?: boolean;
includeResearch?: boolean;
availableComponents?: ComponentDefinition[];
}
export interface OptimizationOptions {
enhanceFacts?: boolean;
addMemes?: boolean;
addDiagrams?: boolean;
availableComponents?: ComponentDefinition[];
projectContext?: string;
/** Target audience description for all AI prompts */
targetAudience?: string;
/** Tone/persona description for all AI prompts */
tone?: string;
/** Prompt for DALL-E 3 style generation */
memeStylePrompt?: string;
/** Path to the docs folder (e.g. apps/web/docs) for full persona/tone context */
docsPath?: string;
}
export interface GeneratedPost {
title: string;
content: string;
research: Fact[];
memes: MemeSuggestion[];
diagrams: string[];
}
interface Insertion {
afterSection: number;
content: string;
}
// Model configuration: specialized models for different tasks
const MODELS = {
// Structured JSON output, research planning, diagram placement
STRUCTURED: "google/gemini-3-flash-preview",
ROUTING: "google/gemini-3-flash-preview",
CONTENT: "google/gemini-3.1-pro-preview",
// Mermaid diagram generation - User requested Pro
DIAGRAM: "google/gemini-3.1-pro-preview",
} as const;
/** Strip markdown fences that some models wrap around JSON despite response_format */
function safeParseJSON(raw: string, fallback: any = {}): any {
let cleaned = raw.trim();
// Remove ```json ... ``` or ``` ... ``` wrapping
if (cleaned.startsWith("```")) {
cleaned = cleaned
.replace(/^```(?:json)?\s*\n?/, "")
.replace(/\n?```\s*$/, "");
}
try {
return JSON.parse(cleaned);
} catch (e) {
console.warn(
"⚠️ Failed to parse JSON response, using fallback:",
(e as Error).message,
);
return fallback;
}
}
export class ContentGenerator {
private openai: OpenAI;
private researchAgent: ResearchAgent;
private memeGenerator: MemeGenerator;
constructor(apiKey: string) {
this.openai = new OpenAI({
apiKey,
baseURL: "https://openrouter.ai/api/v1",
defaultHeaders: {
"HTTP-Referer": "https://mintel.me",
"X-Title": "Mintel Content Engine",
},
});
this.researchAgent = new ResearchAgent(apiKey);
this.memeGenerator = new MemeGenerator(apiKey);
}
// =========================================================================
// generatePost — for new posts (unchanged from original)
// =========================================================================
async generatePost(options: BlogPostOptions): Promise<GeneratedPost> {
const {
topic,
tone = "professional yet witty",
includeResearch = true,
availableComponents = [],
} = options;
console.log(`🚀 Starting content generation for: "${topic}"`);
let facts: Fact[] = [];
if (includeResearch) {
console.log("📚 Gathering research...");
facts = await this.researchAgent.researchTopic(topic);
}
console.log("📝 Creating outline...");
const outline = await this.createOutline(topic, facts, tone);
console.log("✍️ Drafting content...");
let content = await this.draftContent(
topic,
outline,
facts,
tone,
availableComponents,
);
const diagrams: string[] = [];
if (options.includeDiagrams) {
content = await this.processDiagramPlaceholders(content, diagrams);
}
const memes: MemeSuggestion[] = [];
if (options.includeMemes) {
const memeIdeas = await this.memeGenerator.generateMemeIdeas(
content.slice(0, 4000),
);
memes.push(...memeIdeas);
}
return { title: outline.title, content, research: facts, memes, diagrams };
}
// =========================================================================
// generateTldr — Creates a TL;DR block for the given content
// =========================================================================
async generateTldr(content: string): Promise<string> {
const context = content.slice(0, 3000);
const response = await this.openai.chat.completions.create({
model: MODELS.CONTENT,
messages: [
{
role: "system",
content: `Du bist ein kompromissloser Digital Architect.
Erstelle ein "TL;DR" für diesen Artikel.
REGELN:
- 3 knackige Bulletpoints
- TON: Sarkastisch, direkt, provokant ("Finger in die Wunde")
- Fokussiere auf den wirtschaftlichen Schaden von schlechter Tech
- Formatiere als MDX-Komponente:
<div className="my-8 p-6 bg-slate-50 border-l-4 border-blue-600 rounded-r-xl">
<H3>TL;DR: Warum Ihr Geld verbrennt</H3>
<ul className="list-disc pl-5 space-y-2 mb-0">
<li>Punkt 1</li>
<li>Punkt 2</li>
<li>Punkt 3</li>
</ul>
</div>`,
},
{
role: "user",
content: context,
},
],
});
return response.choices[0].message.content?.trim() ?? "";
}
// =========================================================================
// optimizePost — ADDITIVE architecture (never rewrites original content)
// =========================================================================
async optimizePost(
content: string,
options: OptimizationOptions,
): Promise<GeneratedPost> {
console.log("🚀 Optimizing existing content (additive mode)...");
// Load docs context if provided
let docsContext = "";
if (options.docsPath) {
docsContext = await this.loadDocsContext(options.docsPath);
console.log(`📖 Loaded ${docsContext.length} chars of docs context`);
}
const fullContext = [options.projectContext || "", docsContext]
.filter(Boolean)
.join("\n\n---\n\n");
// Split content into numbered sections for programmatic insertion
const sections = this.splitIntoSections(content);
console.log(`📋 Content has ${sections.length} sections`);
const insertions: Insertion[] = [];
const facts: Fact[] = [];
const diagrams: string[] = [];
const memes: MemeSuggestion[] = [];
// Build a numbered content map for LLM reference (read-only)
const sectionMap = this.buildSectionMap(sections);
// ----- STEP 1: Research -----
if (options.enhanceFacts) {
console.log("🔍 Identifying research topics...");
const researchTopics = await this.identifyResearchTopics(
content,
fullContext,
);
console.log(`📚 Researching: ${researchTopics.join(", ")}`);
for (const topic of researchTopics) {
const topicFacts = await this.researchAgent.researchTopic(topic);
facts.push(...topicFacts);
}
if (facts.length > 0) {
console.log(`📝 Planning fact insertions for ${facts.length} facts...`);
const factInsertions = await this.planFactInsertions(
sectionMap,
sections,
facts,
fullContext,
);
insertions.push(...factInsertions);
console.log(`${factInsertions.length} fact enrichments planned`);
}
// ----- STEP 1.5: Social Media Extraction (no LLM — regex only) -----
console.log("📱 Extracting existing social media embeds...");
const socialPosts = this.researchAgent.extractSocialPosts(content);
// If none exist, fetch real ones via Serper API
if (socialPosts.length === 0) {
console.log(
" → None found. Fetching real social posts via Serper API...",
);
const newPosts = await this.researchAgent.fetchRealSocialPosts(
content.slice(0, 500),
);
socialPosts.push(...newPosts);
}
if (socialPosts.length > 0) {
console.log(
`📝 Planning placement for ${socialPosts.length} social media posts...`,
);
const socialInsertions = await this.planSocialMediaInsertions(
sectionMap,
sections,
socialPosts,
fullContext,
);
insertions.push(...socialInsertions);
console.log(
`${socialInsertions.length} social embeds planned`,
);
}
}
// ----- STEP 2: Component suggestions -----
if (options.availableComponents && options.availableComponents.length > 0) {
console.log("🧩 Planning component additions...");
const componentInsertions = await this.planComponentInsertions(
sectionMap,
sections,
options.availableComponents,
fullContext,
);
insertions.push(...componentInsertions);
console.log(
`${componentInsertions.length} component additions planned`,
);
}
// ----- STEP 3: Diagram generation -----
if (options.addDiagrams) {
console.log("📊 Planning diagrams...");
const diagramPlans = await this.planDiagramInsertions(
sectionMap,
sections,
fullContext,
);
for (const plan of diagramPlans) {
const mermaidCode = await this.generateMermaid(plan.concept);
if (!mermaidCode) {
console.warn(` ⏭️ Skipping invalid diagram for: "${plan.concept}"`);
continue;
}
diagrams.push(mermaidCode);
const diagramId = plan.concept
.toLowerCase()
.replace(/\s+/g, "-")
.replace(/[^a-z0-9-]/g, "")
.slice(0, 40);
insertions.push({
afterSection: plan.afterSection,
content: `<div className="my-8">\n <Mermaid id="${diagramId}" title="${plan.concept}" showShare={true}>\n${mermaidCode}\n </Mermaid>\n</div>`,
});
}
console.log(
`${diagramPlans.length} diagrams planned, ${diagrams.length} valid`,
);
}
// ----- STEP 4: Meme placement (memegen.link via ArticleMeme) -----
if (options.addMemes) {
console.log("✨ Generating meme ideas...");
let memeIdeas = await this.memeGenerator.generateMemeIdeas(
content.slice(0, 4000),
);
// User requested to explicitly limit memes to max 1 per page to prevent duplication
if (memeIdeas.length > 1) {
memeIdeas = [memeIdeas[0]];
}
memes.push(...memeIdeas);
if (memeIdeas.length > 0) {
console.log(
`🎨 Planning meme placement for ${memeIdeas.length} memes...`,
);
const memePlacements = await this.planMemePlacements(
sectionMap,
sections,
memeIdeas,
);
for (let i = 0; i < memeIdeas.length; i++) {
const meme = memeIdeas[i];
if (
memePlacements[i] !== undefined &&
memePlacements[i] >= 0 &&
memePlacements[i] < sections.length
) {
const captionsStr = meme.captions.join("|");
insertions.push({
afterSection: memePlacements[i],
content: `<div className="my-8">\n <ArticleMeme template="${meme.template}" captions="${captionsStr}" />\n</div>`,
});
}
}
console.log(`${memeIdeas.length} memes placed`);
}
}
// ----- Enforce visual spacing (no consecutive visualizations) -----
this.enforceVisualSpacing(insertions, sections);
// ----- Apply all insertions to original content -----
console.log(
`\n🔧 Applying ${insertions.length} insertions to original content...`,
);
let optimizedContent = this.applyInsertions(sections, insertions);
// ----- FINAL AGENTIC REWRITE (Replaces dumb regex scripts) -----
console.log(
`\n🧠 Agentic Rewrite: Polishing MDX, fixing syntax, and deduplicating...`,
);
const finalRewrite = await this.openai.chat.completions.create({
model: MODELS.CONTENT,
messages: [
{
role: "system",
content: `You are an expert MDX Editor. Your task is to take a draft blog post and output the FINAL, error-free MDX code.
CRITICAL RULES:
1. DEDUPLICATION: Ensure there is MAX ONE <ArticleMeme> in the entire post. Remove any duplicates or outdated memes. Ensure there is MAX ONE TL;DR section. Ensure there are no duplicate components.
2. TEXT-TO-COMPONENT RATIO: Ensure there are at least 3-4 paragraphs of normal text between any two visual components (<Mermaid>, <ArticleMeme>, <StatsGrid>, <BoldNumber>, etc.). If they are clumped together, spread them out or delete the less important ones.
3. SYNTAX: Fix any broken Mermaid/MDX syntax (e.g. unclosed tags, bad quotes).
4. FIDELITY: Preserve the author's original German text, meaning, and tone. Smooth out transitions into the components.
5. NO HALLUCINATION: Do not invent new URLs or facts. Keep the data provided in the draft.
6. OUTPUT: Return ONLY the raw MDX content. No markdown code blocks (\`\`\`mdx), no preamble. Just the raw code file.`,
},
{
role: "user",
content: optimizedContent,
},
],
});
optimizedContent =
finalRewrite.choices[0].message.content?.trim() || optimizedContent;
// Strip any residual markdown formatting fences just in case
if (optimizedContent.startsWith("```")) {
optimizedContent = optimizedContent
.replace(/^```[a-zA-Z]*\n/, "")
.replace(/\n```$/, "");
}
return {
title: "Optimized Content",
content: optimizedContent,
research: facts,
memes,
diagrams,
};
}
// =========================================================================
// ADDITIVE HELPERS — these return JSON instructions, never rewrite content
// =========================================================================
private splitIntoSections(content: string): string[] {
// Split on double newlines (paragraph/block boundaries in MDX)
return content.split(/\n\n+/);
}
private applyInsertions(sections: string[], insertions: Insertion[]): string {
// Sort by section index DESCENDING to avoid index shifting
const sorted = [...insertions].sort(
(a, b) => b.afterSection - a.afterSection,
);
const result = [...sections];
for (const ins of sorted) {
const idx = Math.min(ins.afterSection + 1, result.length);
result.splice(idx, 0, ins.content);
}
return result.join("\n\n");
}
/**
* Enforce visual spacing: visual components must have at least 2 text sections between them.
* This prevents walls of visualizations and maintains reading flow.
*/
private enforceVisualSpacing(
insertions: Insertion[],
sections: string[],
): void {
const visualPatterns = [
"<Mermaid",
"<ArticleMeme",
"<StatsGrid",
"<StatsDisplay",
"<BoldNumber",
"<MetricBar",
"<ComparisonRow",
"<PremiumComparisonChart",
"<DiagramFlow",
"<DiagramPie",
"<DiagramGantt",
"<DiagramState",
"<DiagramSequence",
"<DiagramTimeline",
"<Carousel",
"<WebVitalsScore",
"<WaterfallChart",
];
const isVisual = (content: string) =>
visualPatterns.some((p) => content.includes(p));
// Sort by section ascending
insertions.sort((a, b) => a.afterSection - b.afterSection);
// Minimum gap of 10 sections between visual components (= ~6-8 text paragraphs)
// User requested a better text-to-component ratio (not 1:1)
const MIN_VISUAL_GAP = 10;
for (let i = 1; i < insertions.length; i++) {
if (
isVisual(insertions[i].content) &&
isVisual(insertions[i - 1].content)
) {
const gap = insertions[i].afterSection - insertions[i - 1].afterSection;
if (gap < MIN_VISUAL_GAP) {
const newPos = Math.min(
insertions[i - 1].afterSection + MIN_VISUAL_GAP,
sections.length - 1,
);
insertions[i].afterSection = newPos;
}
}
}
}
private buildSectionMap(sections: string[]): string {
return sections
.map((s, i) => {
const preview = s.trim().replace(/\n/g, " ").slice(0, 120);
return `[${i}] ${preview}${s.length > 120 ? "…" : ""}`;
})
.join("\n");
}
private async loadDocsContext(docsPath: string): Promise<string> {
try {
const files = await fs.readdir(docsPath);
const mdFiles = files.filter((f) => f.endsWith(".md")).sort();
const contents: string[] = [];
for (const file of mdFiles) {
const filePath = path.join(docsPath, file);
const text = await fs.readFile(filePath, "utf8");
contents.push(`=== ${file} ===\n${text.trim()}`);
}
return contents.join("\n\n");
} catch (e) {
console.warn(`⚠️ Could not load docs from ${docsPath}: ${e}`);
return "";
}
}
// --- Fact insertion planning (MODELS.CONTENT — precise content understanding) ---
private async planFactInsertions(
sectionMap: string,
sections: string[],
facts: Fact[],
context: string,
): Promise<Insertion[]> {
const factsText = facts
.map((f, i) => `${i + 1}. ${f.statement} [Source: ${f.source}]`)
.join("\n");
const response = await this.openai.chat.completions.create({
model: MODELS.CONTENT,
messages: [
{
role: "system",
content: `You enrich a German blog post by ADDING new paragraphs with researched facts.
RULES:
- Do NOT rewrite or modify any existing content
- Only produce NEW <Paragraph> blocks to INSERT after a specific section number
- Maximum 5 insertions (only the most impactful facts)
- Match the post's tone and style (see context below)
- Use the post's JSX components: <Paragraph>, <Marker> for emphasis
- Cite sources using ExternalLink: <ExternalLink href="URL">Source: Name</ExternalLink>
- Write in German, active voice, Ich-Form where appropriate
CONTEXT (tone, style, persona):
${context.slice(0, 3000)}
EXISTING SECTIONS (read-only — do NOT modify these):
${sectionMap}
FACTS TO INTEGRATE:
${factsText}
Return JSON:
{ "insertions": [{ "afterSection": 3, "content": "<Paragraph>\\n Fact-enriched paragraph text. [Source: Name]\\n</Paragraph>" }] }
Return ONLY the JSON.`,
},
],
response_format: { type: "json_object" },
});
const result = safeParseJSON(
response.choices[0].message.content || '{"insertions": []}',
{ insertions: [] },
);
return (result.insertions || []).filter(
(i: any) =>
typeof i.afterSection === "number" &&
i.afterSection >= 0 &&
i.afterSection < sections.length &&
typeof i.content === "string",
);
}
// --- Social Media insertion planning ---
private async planSocialMediaInsertions(
sectionMap: string,
sections: string[],
posts: SocialPost[],
context: string,
): Promise<Insertion[]> {
if (!posts || posts.length === 0) return [];
const postsText = posts
.map(
(p, i) =>
`[${i}] Platform: ${p.platform}, ID: ${p.embedId} (${p.description})`,
)
.join("\n");
const response = await this.openai.chat.completions.create({
model: MODELS.CONTENT,
messages: [
{
role: "system",
content: `You enhance a German blog post by embedding relevant social media posts (YouTube, Twitter, LinkedIn).
RULES:
- Do NOT rewrite any existing content
- Return exactly 1 or 2 high-impact insertions
- Choose the best fitting post(s) from the provided list
- Use the correct component based on the platform:
- youtube -> <YouTubeEmbed videoId="ID" />
- twitter -> <TwitterEmbed tweetId="ID" theme="light" />
- linkedin -> <LinkedInEmbed urn="ID" />
- Add a 1-sentence intro paragraph above the embed to contextualize it naturally in the flow of the text (e.g. "Wie Experte XY im folgenden Video detailliert erklärt:"). This context is MANDATORY. Do not just drop the Component without text reference.
CONTEXT:
${context.slice(0, 3000)}
SOCIAL POSTS AVAILABLE TO EMBED:
${postsText}
EXISTING SECTIONS:
${sectionMap}
Return JSON:
{ "insertions": [{ "afterSection": 4, "content": "<Paragraph>Wie Experten passend bemerken:</Paragraph>\\n\\n<TwitterEmbed tweetId=\\"123456\\" theme=\\"light\\" />" }] }
Return ONLY the JSON.`,
},
],
response_format: { type: "json_object" },
});
const result = safeParseJSON(
response.choices[0].message.content || '{"insertions": []}',
{ insertions: [] },
);
return (result.insertions || []).filter(
(i: any) =>
typeof i.afterSection === "number" &&
i.afterSection >= 0 &&
i.afterSection < sections.length &&
typeof i.content === "string",
);
}
// --- Component insertion planning (MODELS.CONTENT — understands JSX context) ---
private async planComponentInsertions(
sectionMap: string,
sections: string[],
components: ComponentDefinition[],
context: string,
): Promise<Insertion[]> {
const fullContent = sections.join("\n\n");
const componentsText = components
.map((c) => `<${c.name}>: ${c.description}\n Example: ${c.usageExample}`)
.join("\n\n");
const usedComponents = components
.filter((c) => fullContent.includes(`<${c.name}`))
.map((c) => c.name);
const response = await this.openai.chat.completions.create({
model: MODELS.CONTENT,
messages: [
{
role: "system",
content: `You enhance a German blog post by ADDING interactive UI components.
STRICT BALANCE RULES:
- Maximum 3-4 component additions total
- There MUST be at least 3-4 text paragraphs between any two visual components
- Visual components MUST NEVER appear directly after each other
- Each unique component type should only appear ONCE (e.g., only one WebVitalsScore, one WaterfallChart)
- Multiple MetricBar or ComparisonRow in sequence are OK (they form a group)
CONTENT RULES:
- Do NOT rewrite any existing content — only ADD new component blocks
- Do NOT add components already present: ${usedComponents.join(", ") || "none"}
- Statistics MUST have comparison context (before/after, competitor vs us) — never standalone numbers
- All BoldNumber components MUST include source and sourceUrl props
- All ArticleQuote components MUST include source and sourceUrl; add "(übersetzt)" if translated
- MetricBar value must be a real number > 0, not placeholder zeros
- Carousel items array must have at least 2 items with substantive content
- Use exact JSX syntax from the examples
CONTEXT:
${context.slice(0, 3000)}
EXISTING SECTIONS (read-only):
${sectionMap}
AVAILABLE COMPONENTS:
${componentsText}
Return JSON:
{ "insertions": [{ "afterSection": 5, "content": "<StatsDisplay value=\\"100\\" label=\\"PageSpeed Score\\" subtext=\\"Kein Kompromiss.\\" />" }] }
Return ONLY the JSON.`,
},
],
response_format: { type: "json_object" },
});
const result = safeParseJSON(
response.choices[0].message.content || '{"insertions": []}',
{ insertions: [] },
);
return (result.insertions || []).filter(
(i: any) =>
typeof i.afterSection === "number" &&
i.afterSection >= 0 &&
i.afterSection < sections.length &&
typeof i.content === "string",
);
}
// --- Diagram planning (Gemini Flash — structured output) ---
private async planDiagramInsertions(
sectionMap: string,
sections: string[],
context: string,
): Promise<{ afterSection: number; concept: string }[]> {
const fullContent = sections.join("\n\n");
const hasDiagrams =
fullContent.includes("<Mermaid") || fullContent.includes("<Diagram");
const response = await this.openai.chat.completions.create({
model: MODELS.STRUCTURED,
messages: [
{
role: "system",
content: `Analyze this German blog post and suggest 1-2 Mermaid diagrams.
${hasDiagrams ? "The post already has diagrams. Only suggest NEW concepts not already visualized." : ""}
${context.slice(0, 1500)}
SECTIONS:
${sectionMap}
Return JSON:
{ "diagrams": [{ "afterSection": 5, "concept": "Descriptive concept name" }] }
Maximum 2 diagrams. Return ONLY the JSON.`,
},
],
response_format: { type: "json_object" },
});
const result = safeParseJSON(
response.choices[0].message.content || '{"diagrams": []}',
{ diagrams: [] },
);
return (result.diagrams || []).filter(
(d: any) =>
typeof d.afterSection === "number" &&
d.afterSection >= 0 &&
d.afterSection < sections.length,
);
}
// --- Meme placement planning (Gemini Flash — structural positioning) ---
private async planMemePlacements(
sectionMap: string,
sections: string[],
memes: MemeSuggestion[],
): Promise<number[]> {
const memesText = memes
.map((m, i) => `${i}: "${m.template}" — ${m.captions.join(" / ")}`)
.join("\n");
const response = await this.openai.chat.completions.create({
model: MODELS.STRUCTURED,
messages: [
{
role: "system",
content: `Place ${memes.length} memes at appropriate positions in this blog post.
Rules: Space them out evenly, place between thematic sections, never at position 0 (the very start).
SECTIONS:
${sectionMap}
MEMES:
${memesText}
Return JSON: { "placements": [sectionNumber, sectionNumber, ...] }
One section number per meme, in the same order as the memes list. Return ONLY JSON.`,
},
],
response_format: { type: "json_object" },
});
const result = safeParseJSON(
response.choices[0].message.content || '{"placements": []}',
{ placements: [] },
);
return result.placements || [];
}
// =========================================================================
// SHARED HELPERS
// =========================================================================
private async createOutline(
topic: string,
facts: Fact[],
tone: string,
): Promise<{ title: string; sections: string[] }> {
const factsContext = facts
.map((f) => `- ${f.statement} (${f.source})`)
.join("\n");
const response = await this.openai.chat.completions.create({
model: MODELS.STRUCTURED,
messages: [
{
role: "system",
content: `Create a blog post outline on "${topic}".
Tone: ${tone}.
Incorporating these facts:
${factsContext}
Return JSON: { "title": "Catchy Title", "sections": ["Introduction", "Section 1", "Conclusion"] }
Return ONLY the JSON.`,
},
],
response_format: { type: "json_object" },
});
return safeParseJSON(
response.choices[0].message.content || '{"title": "", "sections": []}',
{ title: "", sections: [] },
);
}
private async draftContent(
topic: string,
outline: { title: string; sections: string[] },
facts: Fact[],
tone: string,
components: ComponentDefinition[],
): Promise<string> {
const factsContext = facts
.map((f) => `- ${f.statement} (Source: ${f.source})`)
.join("\n");
const componentsContext =
components.length > 0
? `\n\nAvailable Components:\n` +
components
.map(
(c) =>
`- <${c.name}>: ${c.description}\n Example: ${c.usageExample}`,
)
.join("\n")
: "";
const response = await this.openai.chat.completions.create({
model: MODELS.CONTENT,
messages: [
{
role: "system",
content: `Write a blog post based on this outline:
Title: ${outline.title}
Sections: ${outline.sections.join(", ")}
Tone: ${tone}.
Facts: ${factsContext}
${componentsContext}
BLOG POST BEST PRACTICES (MANDATORY):
- DEVIL'S ADVOCATE: Füge zwingend eine kurze kritische Sektion ein (z.B. mit \`<ComparisonRow>\` oder \`<IconList>\`), in der du offen die Nachteile/Kosten/Haken deiner eigenen Lösung ansprichst ("Der Haken an der Sache...").
- FAQ GENERATOR: Am absoluten Ende des Artikels erstellst du zwingend eine Markdown-Liste mit den 3 wichtigsten Fragen (FAQ) und Antworten (jeweils 2 Sätze) für Google Rich Snippets.
- Nutze wo passend die obigen React-Komponenten für ein hochwertiges Layout.
Format as Markdown. Start with # H1.
For places where a diagram would help, insert: <!-- DIAGRAM_PLACEHOLDER: Concept Name -->
Return ONLY raw content.`,
},
],
});
return response.choices[0].message.content || "";
}
private async processDiagramPlaceholders(
content: string,
diagrams: string[],
): Promise<string> {
const matches = content.matchAll(/<!-- DIAGRAM_PLACEHOLDER: (.+?) -->/g);
let processedContent = content;
for (const match of Array.from(matches)) {
const concept = match[1];
const diagram = await this.generateMermaid(concept);
diagrams.push(diagram);
const diagramId = concept
.toLowerCase()
.replace(/\s+/g, "-")
.replace(/[^a-z0-9-]/g, "")
.slice(0, 40);
const mermaidJsx = `\n<div className="my-8">\n <Mermaid id="${diagramId}" title="${concept}" showShare={true}>\n${diagram}\n </Mermaid>\n</div>\n`;
processedContent = processedContent.replace(
`<!-- DIAGRAM_PLACEHOLDER: ${concept} -->`,
mermaidJsx,
);
}
return processedContent;
}
private async generateMermaid(concept: string): Promise<string> {
const response = await this.openai.chat.completions.create({
model: MODELS.DIAGRAM,
messages: [
{
role: "system",
content: `Generate a Mermaid.js diagram for: "${concept}".
RULES:
- Use clear labels in German where appropriate
- Keep it EXTREMELY SIMPLE AND COMPACT: strictly max 3-4 nodes for a tiny visual footprint.
- Prefer vertical layouts (TD) over horizontal (LR) to prevent wide overflowing graphs.
- CRITICAL: Generate ONLY ONE single connected graph. Do NOT generate multiple independent graphs or isolated subgraphs in the same Mermaid block.
- No nested subgraphs. Keep instructions short.
- Use double-quoted labels for nodes: A["Label"]
- VERY CRITICAL: DO NOT use curly braces '{}' or brackets '[]' inside labels unless they are wrapped in double quotes (e.g. A["Text {with braces}"]).
- VERY CRITICAL: DO NOT use any HTML tags (no <br>, no <br/>, no <b>, etc).
- VERY CRITICAL: DO NOT use special characters like '&', '<', '>', or double-quotes inside the label strings. They break the mermaid parser in our environment.
- Return ONLY the raw mermaid code. No markdown blocks, no backticks.
- The first line MUST be a valid mermaid diagram type: graph, flowchart, sequenceDiagram, pie, gantt, stateDiagram, timeline`,
},
],
});
const code =
response.choices[0].message.content
?.replace(/```mermaid/g, "")
.replace(/```/g, "")
.trim() || "";
// Validate: must start with a valid mermaid keyword
const validStarts = [
"graph",
"flowchart",
"sequenceDiagram",
"pie",
"gantt",
"stateDiagram",
"timeline",
"classDiagram",
"erDiagram",
];
const firstLine = code.split("\n")[0]?.trim().toLowerCase() || "";
const isValid = validStarts.some((keyword) =>
firstLine.startsWith(keyword),
);
if (!isValid || code.length < 10) {
console.warn(
`⚠️ Mermaid: Invalid diagram generated for "${concept}", skipping`,
);
return "";
}
return code;
}
private async identifyResearchTopics(
content: string,
context: string,
): Promise<string[]> {
try {
console.log("Sending request to OpenRouter...");
const response = await this.openai.chat.completions.create({
model: MODELS.STRUCTURED,
messages: [
{
role: "system",
content: `Analyze the following blog post and identify 3 key topics or claims that would benefit from statistical data or external verification.
Return relevant, specific research queries (not too broad).
Context: ${context.slice(0, 1500)}
Return JSON: { "topics": ["topic 1", "topic 2", "topic 3"] }
Return ONLY the JSON.`,
},
{
role: "user",
content: content.slice(0, 4000),
},
],
response_format: { type: "json_object" },
});
console.log("Got response from OpenRouter");
const parsed = safeParseJSON(
response.choices[0].message.content || '{"topics": []}',
{ topics: [] },
);
return (parsed.topics || []).map((t: any) =>
typeof t === "string" ? t : JSON.stringify(t),
);
} catch (e: any) {
console.error("Error in identifyResearchTopics:", e);
throw e;
}
}
}
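The additive pipeline above rests on two small invariants: sections are split on runs of blank lines, and insertions are applied in descending index order so earlier splices never shift later targets. A stripped-down, standalone sketch of that core (simplified mirror of `splitIntoSections`/`applyInsertions`, for illustration only):

```typescript
// Minimal re-statement of the additive insertion core.
interface Ins {
  afterSection: number;
  content: string;
}

function split(content: string): string[] {
  return content.split(/\n\n+/); // blank lines delimit MDX blocks
}

function apply(sections: string[], insertions: Ins[]): string {
  // Descending sort: splicing at high indices first leaves lower indices stable.
  const sorted = [...insertions].sort((a, b) => b.afterSection - a.afterSection);
  const result = [...sections];
  for (const ins of sorted) {
    result.splice(Math.min(ins.afterSection + 1, result.length), 0, ins.content);
  }
  return result.join("\n\n");
}

const doc = "Intro\n\nBody\n\nOutro";
const out = apply(split(doc), [
  { afterSection: 0, content: "<InfoCard />" },
  { afterSection: 1, content: "<CallToAction />" },
]);
// out === "Intro\n\n<InfoCard />\n\nBody\n\n<CallToAction />\n\nOutro"
```

Note that both planned insertions keep their intended anchors even though the first splice grows the array — the reason the class sorts descending before applying.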


@@ -0,0 +1,2 @@
export * from "./generator";
export * from "./orchestrator";


@@ -0,0 +1,705 @@
import OpenAI from "openai";
import { ResearchAgent, type Fact, type SocialPost } from "@mintel/journaling";
import { ThumbnailGenerator } from "@mintel/thumbnail-generator";
import { ComponentDefinition } from "./generator";
import * as fs from "node:fs/promises";
import * as path from "node:path";
export interface OrchestratorConfig {
apiKey: string;
replicateApiKey?: string;
model?: string;
}
export interface OptimizationTask {
content: string;
projectContext: string;
availableComponents?: ComponentDefinition[];
instructions?: string;
internalLinks?: { title: string; slug: string }[];
customSources?: string[];
}
export interface OptimizeFileOptions {
contextDir: string;
availableComponents?: ComponentDefinition[];
shouldRename?: boolean;
}
export class AiBlogPostOrchestrator {
private openai: OpenAI;
private researchAgent: ResearchAgent;
private thumbnailGenerator?: ThumbnailGenerator;
private model: string;
constructor(config: OrchestratorConfig) {
this.model = config.model || "google/gemini-3-flash-preview";
this.openai = new OpenAI({
apiKey: config.apiKey,
baseURL: "https://openrouter.ai/api/v1",
defaultHeaders: {
"HTTP-Referer": "https://mintel.me",
"X-Title": "Mintel AI Blog Post Orchestrator",
},
});
this.researchAgent = new ResearchAgent(config.apiKey);
if (config.replicateApiKey) {
this.thumbnailGenerator = new ThumbnailGenerator({
replicateApiKey: config.replicateApiKey,
});
}
}
/**
* Reusable context loader. Loads all .md and .txt files from a directory into a single string.
*/
async loadContext(dirPath: string): Promise<string> {
try {
const resolvedDir = path.resolve(process.cwd(), dirPath);
const files = await fs.readdir(resolvedDir);
const textFiles = files.filter((f) => /\.(md|txt)$/i.test(f)).sort();
const contents: string[] = [];
for (const file of textFiles) {
const filePath = path.join(resolvedDir, file);
const text = await fs.readFile(filePath, "utf8");
contents.push(`=== ${file} ===\n${text.trim()}`);
}
return contents.join("\n\n");
} catch (e) {
console.warn(`⚠️ Could not load context from ${dirPath}: ${e}`);
return "";
}
}
/**
* Reads a file, extracts frontmatter, loads context, optimizes body, and writes it back.
*/
async optimizeFile(
targetFile: string,
options: OptimizeFileOptions,
): Promise<void> {
const absPath = path.isAbsolute(targetFile)
? targetFile
: path.resolve(process.cwd(), targetFile);
console.log(`📄 Processing File: ${path.basename(absPath)}`);
const content = await fs.readFile(absPath, "utf8");
// Idea 4: We no longer split frontmatter and body. We pass the whole file
// to the LLM so it can optimize the SEO title and description.
// Idea 1: Build Internal Link Graph
const blogDir = path.dirname(absPath);
const internalLinks = await this.buildInternalLinkGraph(
blogDir,
path.basename(absPath),
);
console.log(`📖 Loading context from: ${options.contextDir}`);
const projectContext = await this.loadContext(options.contextDir);
if (!projectContext) {
console.warn(
"⚠️ No project context loaded. AI might miss specific guidelines.",
);
}
const optimizedContent = await this.optimizeDocument({
content: content,
projectContext,
availableComponents: options.availableComponents,
internalLinks: internalLinks, // pass to orchestrator
});
// Idea 4b: Extract the potentially updated title to rename the file (SEO Slug)
const newFmMatch = optimizedContent.match(/^---\s*\n([\s\S]*?)\n---/);
let finalPath = absPath;
let finalSlug = path.basename(absPath, ".mdx");
if (options.shouldRename && newFmMatch && newFmMatch[1]) {
const titleMatch = newFmMatch[1].match(/title:\s*["']([^"']+)["']/);
if (titleMatch && titleMatch[1]) {
const newTitle = titleMatch[1];
// Generate SEO Slug
finalSlug = newTitle
.toLowerCase()
.replace(/ä/g, "ae")
.replace(/ö/g, "oe")
.replace(/ü/g, "ue")
.replace(/ß/g, "ss")
.replace(/[^a-z0-9]+/g, "-")
.replace(/^-+|-+$/g, "");
const newAbsPath = path.join(path.dirname(absPath), `${finalSlug}.mdx`);
if (newAbsPath !== absPath) {
console.log(
`🔄 SEO Title changed! Renaming file to: ${finalSlug}.mdx`,
);
// Remove the old file; the optimized content is written to the new path below
try {
await fs.unlink(absPath);
} catch (_err) {
// ignore
}
finalPath = newAbsPath;
}
}
} else if (newFmMatch && newFmMatch[1]) {
console.log(
` Rename skipped (permalink stability active). If you want to rename, use --rename.`,
);
}
// Idea 5: Automatic Thumbnails
let finalContent = optimizedContent;
// Skip if thumbnail already exists in frontmatter
const hasExistingThumbnail = /thumbnail:\s*["'][^"']+["']/.test(
finalContent,
);
if (this.thumbnailGenerator && !hasExistingThumbnail) {
console.log("🎨 Phase 5: Generating/Linking visual thumbnail...");
try {
const webPublicDir = path.resolve(process.cwd(), "apps/web/public");
const thumbnailRelPath = `/blog/${finalSlug}.png`;
const thumbnailAbsPath = path.join(
webPublicDir,
"blog",
`${finalSlug}.png`,
);
// Check if the physical file already exists
let physicalFileExists = false;
try {
await fs.access(thumbnailAbsPath);
physicalFileExists = true;
} catch (_err) {
// File does not exist
}
if (physicalFileExists) {
console.log(
`⏭️ Thumbnail already exists on disk, skipping generation: ${thumbnailAbsPath}`,
);
} else {
const visualPrompt = await this.generateVisualPrompt(finalContent);
await this.thumbnailGenerator.generateImage(
visualPrompt,
thumbnailAbsPath,
);
}
// Update frontmatter with thumbnail
if (finalContent.includes("thumbnail:")) {
finalContent = finalContent.replace(
/thumbnail:\s*["'].*?["']/,
`thumbnail: "${thumbnailRelPath}"`,
);
} else {
finalContent = finalContent.replace(
/(title:\s*["'].*?["'])/,
`$1\nthumbnail: "${thumbnailRelPath}"`,
);
}
} catch (e) {
console.warn("⚠️ Thumbnail processing failed, skipping:", e);
}
}
await fs.writeFile(finalPath, finalContent);
console.log(`✅ Saved optimized file to: ${finalPath}`);
}
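The umlaut-aware slug rule inside `optimizeFile` can be lifted into a pure helper for isolated testing. This is the same replace chain as above; the function name is ours:

```typescript
// German-aware SEO slug: transliterate umlauts, then collapse every
// non-alphanumeric run to a single hyphen and trim the edges.
function toSeoSlug(title: string): string {
  return title
    .toLowerCase()
    .replace(/ä/g, "ae")
    .replace(/ö/g, "oe")
    .replace(/ü/g, "ue")
    .replace(/ß/g, "ss")
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}
```

For example, a title like "Größere Websites, schneller!" becomes "groessere-websites-schneller".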
async generateSlug(
content: string,
title?: string,
instructions?: string,
): Promise<string> {
const response = await this.openai.chat.completions.create({
model: "google/gemini-3-flash-preview",
messages: [
{
role: "system",
content: `You generate SEO-optimized URL slugs for B2B blog posts based on the provided content.
Return ONLY a JSON object with a single string field "slug".
Example: {"slug": "how-to-optimize-react-performance"}
Rules: Use lowercase letters, numbers, and hyphens only. No special characters. Keep it concise (2-5 words).`,
},
{
role: "user",
content: `Title: ${title || "Unknown"}\n\nContent:\n${content.slice(0, 3000)}...${instructions ? `\n\nEDITOR INSTRUCTIONS:\nPlease strictly follow these instructions from the editor when generating the slug:\n${instructions}` : ""}`,
},
],
response_format: { type: "json_object" },
});
try {
const parsed = JSON.parse(
response.choices[0].message.content || '{"slug": ""}',
);
const slug = parsed.slug || "new-post";
return slug
.toLowerCase()
.replace(/[^a-z0-9]+/g, "-")
.replace(/^-+|-+$/g, "");
} catch {
return "new-post";
}
}
public async generateVisualPrompt(
content: string,
instructions?: string,
): Promise<string> {
const response = await this.openai.chat.completions.create({
model: this.model,
messages: [
{
role: "system",
content: `You are a Visual Discovery Agent for an architectural design system.
Review the provided blog post and create a 1-sentence abstract visual description for an image generator (like Flux).
THEME: Technical blueprint / structural illustration.
STYLE: Clean lines, geometric shapes, monochrome base with one highlighter accent color (green, pink, or yellow).
NO TEXT. NO PEOPLE. NO REALISTIC PHOTOS.
FOCUS: The core metaphor or technical concept of the article.
Example output: "A complex network of glowing fiber optic nodes forming a recursive pyramid structure, technical blue lineart style."`,
},
{
role: "user",
content: `${content.slice(0, 5000)}${instructions ? `\n\nEDITOR INSTRUCTIONS:\nPlease strictly follow these instructions from the editor when generating the visual prompt:\n${instructions}` : ""}`,
},
],
max_tokens: 100,
});
return (
response.choices[0].message.content ||
"Technical architectural blueprint of a digital system"
);
}
private async buildInternalLinkGraph(
blogDir: string,
currentFile: string,
): Promise<{ title: string; slug: string }[]> {
try {
const files = await fs.readdir(blogDir);
const mdxFiles = files.filter(
(f) => f.endsWith(".mdx") && f !== currentFile,
);
const graph: { title: string; slug: string }[] = [];
for (const file of mdxFiles) {
const fileContent = await fs.readFile(path.join(blogDir, file), "utf8");
const titleMatch = fileContent.match(/title:\s*["']([^"']+)["']/);
if (titleMatch && titleMatch[1]) {
graph.push({
title: titleMatch[1],
slug: `/blog/${file.replace(".mdx", "")}`,
});
}
}
return graph;
} catch (e) {
console.warn("Could not build internal link graph", e);
return [];
}
}
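`buildInternalLinkGraph` relies on the same single-line quoted-title regex used elsewhere in this class. A small demonstration of what it does and does not match (the sample frontmatter is made up):

```typescript
// The quoted-title regex only matches titles wrapped in straight quotes on one line.
const TITLE_RE = /title:\s*["']([^"']+)["']/;

const sample = '---\ntitle: "Core Web Vitals im B2B"\ndate: 2026-03-03\n---\n# Body';
const match = sample.match(TITLE_RE); // captures the title text
const unquoted = "title: No quotes here".match(TITLE_RE); // yields null
```

An unquoted YAML title yields no match, so such a post would silently be skipped by the link graph.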
/**
 * Executes the 3-step optimization pipeline:
 * 1. Research facts
 * 2. Extract existing social posts (no LLM, regex only)
 * 3. Instruct the AI to build the article from them
 */
async optimizeDocument(task: OptimizationTask): Promise<string> {
console.log(`🚀 Starting AI Orchestration Pipeline (${this.model})...`);
// 1. Research facts & competitors
console.log("1️⃣ Researching facts and analyzing competition...");
const researchTopics = await this.identifyTopics(task.content);
const facts: Fact[] = [];
const competitorInsights: string[] = [];
// Parallelize competitor research and fact research
await Promise.all(
researchTopics.map(async (topic) => {
const [topicFacts, insights] = await Promise.all([
this.researchAgent.researchTopic(topic),
this.researchAgent.researchCompetitors(topic),
]);
facts.push(...topicFacts);
competitorInsights.push(...insights);
}),
);
// 2. Extract existing social posts from the content (deterministic, no LLM)
console.log("2️⃣ Extracting existing social media embeds from content...");
const socialPosts = this.researchAgent.extractSocialPosts(task.content);
// If none exist, fetch real ones from the Serper API
if (socialPosts.length === 0) {
console.log(
" → No existing posts found. Searching for new ones via the Serper API...",
);
const realPosts = await this.researchAgent.fetchRealSocialPosts(
task.content.slice(0, 500),
task.customSources,
);
socialPosts.push(...realPosts);
}
// 3. Instruct the AI to build the article from them
console.log("3️⃣ Creating optimized article (Agentic Rewrite)...");
return await this.compileArticle(
task,
facts,
competitorInsights,
socialPosts,
task.internalLinks || [],
);
}
private async identifyTopics(content: string): Promise<string[]> {
const response = await this.openai.chat.completions.create({
model: "google/gemini-3-flash-preview", // fast structured model for topic extraction
messages: [
{
role: "system",
content: `Analyze the following blog post and identify 1 to 2 key topics or claims that would benefit from statistical data or external verification.
Return JSON: { "topics": ["topic 1", "topic 2"] }
Return ONLY the JSON.`,
},
{
role: "user",
content: content.slice(0, 4000),
},
],
response_format: { type: "json_object" },
});
try {
const raw = response.choices[0].message.content || '{"topics": []}';
const cleaned = raw
.trim()
.replace(/^```(?:json)?\s*\n?/, "")
.replace(/\n?```\s*$/, "");
const parsed = JSON.parse(cleaned);
return parsed.topics || [];
} catch (e) {
console.warn("⚠️ Failed to parse research topics", e);
return [];
}
}
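`identifyTopics` defensively strips markdown code fences before `JSON.parse`, since models sometimes wrap JSON replies in a fence despite `response_format`. A sketch of that cleanup as a pure helper (the backticks are escaped via `\u0060` only so this example nests cleanly in this page):

```typescript
// Strip a leading (optionally "json"-tagged) code fence and a trailing fence.
const FENCE = "\u0060\u0060\u0060"; // three backticks, escaped for nesting
function stripJsonFences(raw: string): string {
  return raw
    .trim()
    .replace(new RegExp("^" + FENCE + "(?:json)?\\s*\\n?"), "")
    .replace(new RegExp("\\n?" + FENCE + "\\s*$"), "");
}
```

Plain JSON without a fence passes through unchanged.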
private async compileArticle(
task: OptimizationTask,
facts: Fact[],
competitorInsights: string[],
socialPosts: SocialPost[],
internalLinks: { title: string; slug: string }[],
retryCount = 0,
): Promise<string> {
const factsText = facts
.map((f, i) => `${i + 1}. ${f.statement} [Source: ${f.source}]`)
.join("\n");
let socialText = `CRITICAL RULE: NO VERIFIED SOCIAL MEDIA POSTS FOUND. You MUST NOT use <YouTubeEmbed />, <TwitterEmbed />, or <LinkedInEmbed /> under ANY circumstances in this article. DO NOT hallucinate IDs.`;
if (socialPosts.length > 0) {
const allowedTags: string[] = [];
if (socialPosts.some((p) => p.platform === "youtube"))
allowedTags.push('<YouTubeEmbed videoId="..." />');
if (socialPosts.some((p) => p.platform === "twitter"))
allowedTags.push('<TwitterEmbed tweetId="..." />');
if (socialPosts.some((p) => p.platform === "linkedin"))
allowedTags.push('<LinkedInEmbed url="..." />');
socialText = `Social Media Posts to embed (use ONLY these tags, do not use others: ${allowedTags.join(", ")}):\n${socialPosts.map((p) => `Platform: ${p.platform}, ID: ${p.embedId} (${p.description})`).join("\n")}\nCRITICAL: Do not invent any IDs that are not explicitly listed in the list above.`;
}
const componentsText = (task.availableComponents || [])
.filter((c) => {
if (
c.name === "YouTubeEmbed" &&
!socialPosts.some((p) => p.platform === "youtube")
)
return false;
if (
c.name === "TwitterEmbed" &&
!socialPosts.some((p) => p.platform === "twitter")
)
return false;
if (
c.name === "LinkedInEmbed" &&
!socialPosts.some((p) => p.platform === "linkedin")
)
return false;
return true;
})
.map((c) => {
// Ensure LinkedInEmbed usage example consistently uses 'url'
if (c.name === "LinkedInEmbed") {
return `<${c.name}>: ${c.description}\n Example: <LinkedInEmbed url="https://www.linkedin.com/posts/..." />`;
}
return `<${c.name}>: ${c.description}\n Example: ${c.usageExample}`;
})
.join("\n\n");
const memeTemplates = [
"db", // Distracted Boyfriend
"gb", // Galaxy Brain
"fine", // This is Fine
"ds", // Daily Struggle
"gru", // Gru's Plan
"cmm", // Change My Mind
"astronaut", // Always Has Been (ahb)
"disastergirl",
"pigeon", // Is this a pigeon?
"rollsafe",
"slap", // Will Smith
"exit", // Left Exit 12
"mordor",
"panik-kalm-panik",
"woman-cat", // Woman yelling at cat
"grumpycat",
"sadfrog",
"stonks",
"same", // They're the same picture
"spongebob",
];
const forcedMeme =
memeTemplates[Math.floor(Math.random() * memeTemplates.length)];
const response = await this.openai.chat.completions.create({
model: this.model,
messages: [
{
role: "system",
content: `You are an expert MDX Editor and Digital Architect.
YOUR TASK:
Take the given draft blog post and rewrite/enhance it into a final, error-free MDX file. Maintain the author's original German text, meaning, and tone, but enrich it gracefully.
CONTEXT & RULES:
Project Context / Tone:
${task.projectContext}
FACTS TO INTEGRATE:
${factsText || "No new facts needed."}
COMPETITOR BENCHMARK (TOP RANKING ARTICLES):
Here are snippets from the top 5 ranking Google articles for this topic. Read them carefully and ensure our article covers these topics but is fundamentally BETTER, deeper, and more authoritative:
${competitorInsights.length > 0 ? competitorInsights.join("\n") : "No competitor insights found."}
AVAILABLE UI COMPONENTS:
${componentsText}
SOCIAL MEDIA POSTS:
${socialText}
INTERNAL LINKING GRAPH:
Hier sind unsere existierenden Blog-Posts (Titel und URL-Slug). Finde 2-3 passende Stellen im Text, um organisch mit regulärem Markdown (\`[passender Text]([slug])\`) auf diese Posts zu verlinken. Nutze KEIN <ExternalLink> für B2B-interne Links.
${internalLinks.length > 0 ? internalLinks.map((l) => `- "${l.title}" -> ${l.slug}`).join("\n") : "Keine internen Links verfügbar."}
Special Instructions from User:
${task.instructions || "None"}
BLOG POST BEST PRACTICES (MANDATORY):
- DEVIL'S ADVOCATE: Füge zwingend eine kurze kritische Sektion ein (z.B. mit \`<ComparisonRow>\` oder \`<IconList>\`), in der du offen die Nachteile/Kosten/Haken deiner eigenen Lösung ansprichst ("Der Haken an der Sache..."). Das baut Vertrauen bei B2B Entscheidenden auf.
- FAQ GENERATOR: Am absoluten Ende des Artikels erstellst du zwingend eine Markdown-Liste mit den 3 wichtigsten Fragen (FAQ) und Antworten (jeweils 2 Sätze) für Google Rich Snippets. Nutze dazu das FAQSection Component oder normales Markdown.
- SUBTLE CTAs: Webe 1-2 subtile CTAs für High-End Website Entwicklung ein. Nutze ZWINGEND die Komponente [LeadMagnet] für diese Zwecke anstelle von einfachen Buttons. [LeadMagnet] bietet mehr Kontext und Vertrauen. Beispiel: <LeadMagnet title="Performance-Check anfragen" description="Wir analysieren Ihre Core Web Vitals und decken Umsatzpotenziale auf." buttonText="Jetzt analysieren lassen" href="/contact" variant="performance" />. Die Texte im LeadMagnet müssen absolut überzeugend, hochprofessionell und B2B-fokussiert sein (KEIN Robotik-Marketing-Sprech).
- MEME DIVERSITY: Du MUSST ZWINGEND für jedes Meme (sofern passend) abwechslungsreiche Templates nutzen. Um dies zu garantieren, wurde für diesen Artikel das folgende Template ausgewählt: '${forcedMeme}'. Du MUSST EXAKT DIESES TEMPLATE NUTZEN. Versuche nicht, es durch ein Standard-Template wie 'drake' zu ersetzen!
- Zitat-Varianten: Wenn du Organisationen oder Studien zitierst, nutze ArticleQuote (mit isCompany=true für Firmen). Für Personen lass isCompany weg.
- Füge zwingend ein prägnantes 'TL;DR' ganz am Anfang ein.
- Verwende unsere Komponenten stilvoll für Visualisierungen.
- Agiere als hochprofessioneller Digital Architect und entferne alte MDX-Metadaten im Body.
- Fazit: Schließe JEDEN Artikel ZWINGEND mit einem starken, klaren 'Fazit' ab.
- ORIGINAL LANGUAGE QUOTES: Übersetze NIEMALS Zitate (z.B. in ArticleQuote). Behalte das Original (z.B. Englisch), wenn du Studien von Deloitte, McKinsey oder Aussagen von CEOs zitierst. Das erhöht die Authentizität im B2B-Mittelstand.
- CONTENT PRUNING: Wenn das dir übergebene MDX bereits interaktive Komponenten (z.B. \`<YouTubeEmbed>\`) enthält, die **nicht** oder **nicht mehr** zum inhaltlichen Fokus passen (z.B. irrelevante Videos oder platzhalter-ähnliche Snippets), MUSST du diese radikal **entfernen**. Behalte keine halluzinierten oder unpassenden Medien, nur weil sie schon da waren.
STRICT MDX OUTPUT RULES:
1. ONLY use the exact components defined above.
2. For Social Media Embeds, you MUST ONLY use the EXACT IDs provided in the list above. Do NOT invent IDs.
3. If ANY verified social media posts are provided, you MUST integrate at least one naturally with a contextual sentence.
4. Keep the original content blocks and headings as much as possible, just improve flow.
5. FRONTMATTER SEO (Idea 4): Ich übergebe dir die KOMPLETTE Datei inklusive Markdown-Frontmatter (--- ... ---). Du MUSST das Frontmatter ebenfalls zurückgeben! Optimiere darin den \`title\` und die \`description\` maximal für B2B SEO. Lasse die anderen Keys im Frontmatter (date, tags) unangetastet.
CRITICAL GUIDELINES (NEVER BREAK THESE):
1. THE OUTPUT MUST START WITH YAML FRONTMATTER AND END WITH THE MDX BODY.
2. DO NOT INCLUDE MARKDOWN WRAPPERS (do not wrap in \`\`\`mdx ... \`\`\`).
3. Be clean. Do NOT clump all components together. Provide 3-4 paragraphs of normal text between visual items.
4. If you insert components, ensure their syntax is 100% valid JSX/MDX.
5. CRITICAL MERMAID RULE: If you use <Mermaid>, the inner content MUST be 100% valid Mermaid.js syntax. NO HTML inside labels. NO quotes inside brackets without valid syntax.
6. Do NOT hallucinate links or facts. Use only what is provided.`,
},
{
role: "user",
content: task.content,
},
],
});
let rawContent = response.choices[0].message.content || task.content;
rawContent = this.cleanResponse(rawContent, socialPosts);
// --- Autonomous Validation Layer ---
let hasError = false;
let errorFeedback = "";
// 1. Validate Meme Templates
const memeRegex = /<ArticleMeme[^>]+template=["']([^"']+)["'][^>]*>/g;
let memeMatch;
const invalidMemes: string[] = [];
while ((memeMatch = memeRegex.exec(rawContent)) !== null) {
if (!memeTemplates.includes(memeMatch[1])) {
invalidMemes.push(memeMatch[1]);
}
}
if (invalidMemes.length > 0) {
hasError = true;
errorFeedback += `\n- You hallucinated invalid meme templates: ${invalidMemes.join(", ")}. You MUST ONLY use templates from this exact list: ${memeTemplates.join(", ")}. DO NOT INVENT TEMPLATES.\n`;
}
// 2. Validate Mermaid Syntax
if (rawContent.includes("<Mermaid>")) {
console.log("🔍 Validating Mermaid syntax in AI response...");
const mermaidBlocks = this.extractMermaidBlocks(rawContent);
for (const block of mermaidBlocks) {
const validationResult = await this.validateMermaidSyntax(block);
if (!validationResult.valid) {
hasError = true;
errorFeedback += `\n- Invalid Mermaid block:\n${block}\nError context: ${validationResult.error}\n`;
}
}
}
if (hasError && retryCount < 3) {
console.log(
`❌ Validation errors detected. Retrying compilation (Attempt ${retryCount + 1}/3)...`,
);
return this.compileArticle(
{
...task,
content: `CRITICAL ERROR IN PREVIOUS ATTEMPT:\nYour generated MDX contained the following errors that MUST be fixed:\n${errorFeedback}\n\nPlease rewrite the MDX and FIX these errors. Pay strict attention to the rules.\n\nOriginal Draft:\n${task.content}`,
},
facts,
competitorInsights,
socialPosts,
internalLinks,
retryCount + 1,
);
}
return rawContent;
}
private extractMermaidBlocks(content: string): string[] {
const blocks: string[] = [];
// Regex to match <Mermaid>...</Mermaid> blocks across multiple lines
const regex = /<Mermaid>([\s\S]*?)<\/Mermaid>/g;
let match;
while ((match = regex.exec(content)) !== null) {
if (match[1]) {
blocks.push(match[1].trim());
}
}
return blocks;
}
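The non-greedy `[\s\S]*?` in `extractMermaidBlocks` is what keeps adjacent blocks separate; a greedy `[\s\S]*` would swallow everything between the first `<Mermaid>` and the last `</Mermaid>`. A quick standalone check of that behavior (same regex as above):

```typescript
// Non-greedy matching yields one entry per <Mermaid>...</Mermaid> block.
function extractBlocks(content: string): string[] {
  const out: string[] = [];
  const re = /<Mermaid>([\s\S]*?)<\/Mermaid>/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(content)) !== null) {
    if (m[1]) out.push(m[1].trim());
  }
  return out;
}
```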
private async validateMermaidSyntax(
graph: string,
): Promise<{ valid: boolean; error?: string }> {
// Fast LLM validation to catch common syntax errors like unbalanced quotes or HTML entities
try {
const validationResponse = await this.openai.chat.completions.create({
model: "google/gemini-3-flash-preview", // Switch from gpt-4o-mini to user requested model
messages: [
{
role: "system",
content:
'You are a strict Mermaid.js compiler. Analyze the given Mermaid syntax. If it is 100% valid and will render without exceptions, reply ONLY with "VALID". If it has syntax errors (e.g., HTML inside labels, unescaped quotes, unclosed brackets), reply ONLY with "INVALID" followed by a short explanation of the exact error.',
},
{
role: "user",
content: graph,
},
],
});
const reply =
validationResponse.choices[0].message.content?.trim() || "VALID";
if (reply.startsWith("INVALID")) {
return { valid: false, error: reply };
}
return { valid: true };
} catch (e) {
console.error("Syntax validation LLM call failed, passing through:", e);
return { valid: true }; // Fallback to passing if validator fails
}
}
private cleanResponse(content: string, socialPosts: SocialPost[]): string {
let cleaned = content.trim();
// 1. Strip Markdown Wrappers (e.g. ```mdx ... ```)
if (cleaned.startsWith("```")) {
cleaned = cleaned
.replace(/^```[a-zA-Z]*\n?/, "")
.replace(/\n?```\s*$/, "");
}
// 2. We NO LONGER strip redundant frontmatter, because we requested the LLM to output it.
// If the output lacks frontmatter, something went wrong upstream; we pass it through regardless.
// 3. Strip any social embeds the AI hallucinated (IDs not in our extracted set)
const knownYtIds = new Set(
socialPosts.filter((p) => p.platform === "youtube").map((p) => p.embedId),
);
const knownTwIds = new Set(
socialPosts.filter((p) => p.platform === "twitter").map((p) => p.embedId),
);
const knownLiIds = new Set(
socialPosts
.filter((p) => p.platform === "linkedin")
.map((p) => p.embedId),
);
cleaned = cleaned.replace(
/<YouTubeEmbed[^>]*videoId="([^"]+)"[^>]*\/>/gi,
(tag, id) => {
if (knownYtIds.has(id)) return tag;
console.log(
`🛑 Stripped hallucinated YouTubeEmbed with videoId="${id}"`,
);
return "";
},
);
cleaned = cleaned.replace(
/<TwitterEmbed[^>]*tweetId="([^"]+)"[^>]*\/>/gi,
(tag, id) => {
if (knownTwIds.has(id)) return tag;
console.log(
`🛑 Stripped hallucinated TwitterEmbed with tweetId="${id}"`,
);
return "";
},
);
cleaned = cleaned.replace(
/<LinkedInEmbed[^>]*(?:url|urn)="([^"]+)"[^>]*\/>/gi,
(tag, id) => {
if (knownLiIds.has(id)) return tag;
console.log(`🛑 Stripped hallucinated LinkedInEmbed with id="${id}"`);
return "";
},
);
return cleaned;
}
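The stripping in `cleanResponse` is an allow-list: an embed survives only if its ID was actually extracted from the source content. A minimal sketch of the YouTube case (the helper name is ours; the regex matches the one above):

```typescript
// Allow-list stripping: keep an embed only when its videoId is known.
function stripUnknownYouTube(mdx: string, knownIds: Set<string>): string {
  return mdx.replace(
    /<YouTubeEmbed[^>]*videoId="([^"]+)"[^>]*\/>/gi,
    (tag, id) => (knownIds.has(id) ? tag : ""),
  );
}
```

The same pattern applies to `TwitterEmbed` (via `tweetId`) and `LinkedInEmbed` (via `url`/`urn`).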
}

View File

@@ -0,0 +1,11 @@
{
"extends": "@mintel/tsconfig/base.json",
"compilerOptions": {
"module": "ESNext",
"target": "ESNext",
"moduleResolution": "Bundler",
"allowImportingTsExtensions": true,
"noEmit": true
},
"include": ["src"]
}

View File

@@ -1,851 +0,0 @@
import { useApi as e, defineModule as a } from "@directus/extensions-sdk";
import {
defineComponent as t,
ref as l,
onMounted as n,
resolveComponent as i,
resolveDirective as s,
openBlock as d,
createBlock as r,
withCtx as u,
createVNode as o,
createElementBlock as m,
Fragment as c,
renderList as v,
createTextVNode as p,
toDisplayString as f,
createCommentVNode as g,
createElementVNode as y,
withDirectives as b,
nextTick as _,
} from "vue";
const h = { class: "content-wrapper" },
x = { key: 0, class: "empty-state" },
w = { class: "header" },
k = { class: "header-left" },
V = { class: "title" },
C = { class: "subtitle" },
M = { class: "header-right" },
F = { class: "user-cell" },
N = { class: "user-name" },
z = { key: 0, class: "status-date" },
E = { key: 0, class: "drawer-content" },
U = { class: "form-section" },
S = { class: "field" },
A = { class: "drawer-actions" },
T = { key: 0, class: "drawer-content" },
Z = { class: "form-section" },
j = { class: "field" },
$ = { class: "field" },
D = { class: "field" },
O = { key: 1, class: "field" },
W = { class: "drawer-actions" };
var q = t({
__name: "module",
setup(a) {
const t = e(),
q = l([]),
B = l(null),
K = l([]),
L = l(!1),
P = l(!1),
G = l(null),
I = l(null),
H = l(!1),
J = l(!1),
Q = l({ id: "", name: "" }),
R = l(!1),
X = l(!1),
Y = l({
id: "",
first_name: "",
last_name: "",
email: "",
temporary_password: "",
}),
ee = [
{ text: "Name", value: "name", sortable: !0 },
{ text: "E-Mail", value: "email", sortable: !0 },
{ text: "Zuletzt eingeladen", value: "last_invited", sortable: !0 },
];
async function ae() {
const e = await t.get("/items/companies", {
params: { fields: ["id", "name"], sort: "name" },
});
q.value = e.data.data;
}
async function te(e) {
((B.value = e), (L.value = !0));
try {
const a = await t.get("/items/client_users", {
params: {
filter: { company: { _eq: e.id } },
fields: ["*"],
sort: "first_name",
},
});
K.value = a.data.data;
} finally {
L.value = !1;
}
}
function le() {
((J.value = !1), (Q.value = { id: "", name: "" }), (H.value = !0));
}
async function ne() {
B.value &&
((Q.value = { id: B.value.id, name: B.value.name }),
(J.value = !0),
await _(),
(H.value = !0));
}
async function ie() {
var e;
if (Q.value.name) {
P.value = !0;
try {
(J.value
? (await t.patch(`/items/companies/${Q.value.id}`, {
name: Q.value.name,
}),
(I.value = { type: "success", message: "Firma aktualisiert!" }))
: (await t.post("/items/companies", { name: Q.value.name }),
(I.value = { type: "success", message: "Firma angelegt!" })),
(H.value = !1),
await ae(),
(null == (e = B.value) ? void 0 : e.id) === Q.value.id &&
(B.value.name = Q.value.name));
} catch (e) {
I.value = { type: "danger", message: e.message };
} finally {
P.value = !1;
}
}
}
function se() {
((X.value = !1),
(Y.value = {
id: "",
first_name: "",
last_name: "",
email: "",
temporary_password: "",
}),
(R.value = !0));
}
async function de() {
if (Y.value.email && B.value) {
P.value = !0;
try {
(X.value
? (await t.patch(`/items/client_users/${Y.value.id}`, {
first_name: Y.value.first_name,
last_name: Y.value.last_name,
email: Y.value.email,
}),
(I.value = {
type: "success",
message: "Mitarbeiter aktualisiert!",
}))
: (await t.post("/items/client_users", {
first_name: Y.value.first_name,
last_name: Y.value.last_name,
email: Y.value.email,
company: B.value.id,
}),
(I.value = {
type: "success",
message: "Mitarbeiter angelegt!",
})),
(R.value = !1),
await te(B.value));
} catch (e) {
I.value = { type: "danger", message: e.message };
} finally {
P.value = !1;
}
}
}
function re(e) {
const a = (null == e ? void 0 : e.item) || e;
a &&
a.id &&
(async function (e) {
((Y.value = {
id: e.id || "",
first_name: e.first_name || "",
last_name: e.last_name || "",
email: e.email || "",
temporary_password: e.temporary_password || "",
}),
(X.value = !0),
await _(),
(R.value = !0));
})(a);
}
return (
n(() => {
ae();
}),
(e, a) => {
const l = i("v-icon"),
n = i("v-list-item-icon"),
_ = i("v-text-overflow"),
ae = i("v-list-item-content"),
ue = i("v-list-item"),
oe = i("v-divider"),
me = i("v-list"),
ce = i("v-notice"),
ve = i("v-button"),
pe = i("v-info"),
fe = i("v-avatar"),
ge = i("v-chip"),
ye = i("v-table"),
be = i("v-input"),
_e = i("v-drawer"),
he = i("private-view"),
xe = s("tooltip");
return (
d(),
r(
he,
{ title: "Customer Manager" },
{
navigation: u(() => [
o(
me,
{ nav: "" },
{
default: u(() => [
o(
ue,
{ onClick: le, clickable: "" },
{
default: u(() => [
o(n, null, {
default: u(() => [
o(l, {
name: "add",
color: "var(--theme--primary)",
}),
]),
_: 1,
}),
o(ae, null, {
default: u(() => [
o(_, { text: "Neue Firma anlegen" }),
]),
_: 1,
}),
]),
_: 1,
},
),
o(oe),
(d(!0),
m(
c,
null,
v(q.value, (e) => {
var a;
return (
d(),
r(
ue,
{
key: e.id,
active:
(null == (a = B.value) ? void 0 : a.id) ===
e.id,
class: "company-item",
clickable: "",
onClick: (a) => te(e),
},
{
default: u(() => [
o(n, null, {
default: u(() => [
o(l, { name: "business" }),
]),
_: 1,
}),
o(
ae,
null,
{
default: u(() => [
o(_, { text: e.name }, null, 8, [
"text",
]),
]),
_: 2,
},
1024,
),
]),
_: 2,
},
1032,
["active", "onClick"],
)
);
}),
128,
)),
]),
_: 1,
},
),
]),
"title-outer:after": u(() => [
I.value
? (d(),
r(
ce,
{
key: 0,
type: I.value.type,
onClose: a[0] || (a[0] = (e) => (I.value = null)),
dismissible: "",
},
{ default: u(() => [p(f(I.value.message), 1)]), _: 1 },
8,
["type"],
))
: g("v-if", !0),
]),
default: u(() => [
y("div", h, [
B.value
? (d(),
m(
c,
{ key: 1 },
[
y("header", w, [
y("div", k, [
y("h1", V, f(B.value.name), 1),
y(
"p",
C,
f(K.value.length) + " Kunden-Mitarbeiter",
1,
),
]),
y("div", M, [
b(
(d(),
r(
ve,
{
secondary: "",
rounded: "",
icon: "",
onClick: ne,
},
{
default: u(() => [
o(l, { name: "edit" }),
]),
_: 1,
},
)),
[
[
xe,
"Firma bearbeiten",
void 0,
{ bottom: !0 },
],
],
),
o(
ve,
{ primary: "", onClick: se },
{
default: u(() => [
...(a[14] ||
(a[14] = [
p(" Mitarbeiter hinzufügen ", -1),
])),
]),
_: 1,
},
),
]),
]),
o(
ye,
{
headers: ee,
items: K.value,
loading: L.value,
class: "clickable-table",
"fixed-header": "",
"onClick:row": re,
},
{
"item.name": u(({ item: e }) => [
y("div", F, [
o(
fe,
{ name: e.first_name, "x-small": "" },
null,
8,
["name"],
),
y(
"span",
N,
f(e.first_name) + " " + f(e.last_name),
1,
),
]),
]),
"item.last_invited": u(({ item: e }) => {
return [
e.last_invited
? (d(),
m(
"span",
z,
f(
((t = e.last_invited),
new Date(t).toLocaleString(
"de-DE",
{
day: "2-digit",
month: "2-digit",
year: "numeric",
hour: "2-digit",
minute: "2-digit",
},
)),
),
1,
))
: (d(),
r(
ge,
{ key: 1, "x-small": "" },
{
default: u(() => [
...(a[15] ||
(a[15] = [p("Noch nie", -1)])),
]),
_: 1,
},
)),
];
var t;
}),
_: 2,
},
1032,
["items", "loading"],
),
],
64,
))
: (d(),
m("div", x, [
o(
pe,
{
title: "Firmen auswählen",
icon: "business",
center: "",
},
{
default: u(() => [
a[12] ||
(a[12] = p(
" Wähle eine Firma in der Navigation aus oder ",
-1,
)),
o(
ve,
{ "x-small": "", onClick: le },
{
default: u(() => [
...(a[11] ||
(a[11] = [
p("erstelle eine neue Firma", -1),
])),
]),
_: 1,
},
),
a[13] || (a[13] = p(". ", -1)),
]),
_: 1,
},
),
])),
]),
o(
_e,
{
modelValue: H.value,
"onUpdate:modelValue":
a[2] || (a[2] = (e) => (H.value = e)),
title: J.value
? "Firma bearbeiten"
: "Neue Firma anlegen",
icon: "business",
onCancel: a[3] || (a[3] = (e) => (H.value = !1)),
},
{
default: u(() => [
H.value
? (d(),
m("div", E, [
y("div", U, [
y("div", S, [
a[16] ||
(a[16] = y(
"span",
{ class: "label" },
"Firmenname",
-1,
)),
o(
be,
{
modelValue: Q.value.name,
"onUpdate:modelValue":
a[1] ||
(a[1] = (e) => (Q.value.name = e)),
placeholder: "z.B. KLZ Cables",
autofocus: "",
},
null,
8,
["modelValue"],
),
]),
]),
y("div", A, [
o(
ve,
{
primary: "",
block: "",
loading: P.value,
onClick: ie,
},
{
default: u(() => [
...(a[17] ||
(a[17] = [p("Speichern", -1)])),
]),
_: 1,
},
8,
["loading"],
),
]),
]))
: g("v-if", !0),
]),
_: 1,
},
8,
["modelValue", "title"],
),
o(
_e,
{
modelValue: R.value,
"onUpdate:modelValue":
a[9] || (a[9] = (e) => (R.value = e)),
title: X.value
? "Mitarbeiter bearbeiten"
: "Neuen Mitarbeiter anlegen",
icon: "person",
onCancel: a[10] || (a[10] = (e) => (R.value = !1)),
},
{
default: u(() => [
R.value
? (d(),
m("div", T, [
y("div", Z, [
y("div", j, [
a[18] ||
(a[18] = y(
"span",
{ class: "label" },
"Vorname",
-1,
)),
o(
be,
{
modelValue: Y.value.first_name,
"onUpdate:modelValue":
a[4] ||
(a[4] = (e) =>
(Y.value.first_name = e)),
placeholder: "Vorname",
autofocus: "",
},
null,
8,
["modelValue"],
),
]),
y("div", $, [
a[19] ||
(a[19] = y(
"span",
{ class: "label" },
"Nachname",
-1,
)),
o(
be,
{
modelValue: Y.value.last_name,
"onUpdate:modelValue":
a[5] ||
(a[5] = (e) => (Y.value.last_name = e)),
placeholder: "Nachname",
},
null,
8,
["modelValue"],
),
]),
y("div", D, [
a[20] ||
(a[20] = y(
"span",
{ class: "label" },
"E-Mail",
-1,
)),
o(
be,
{
modelValue: Y.value.email,
"onUpdate:modelValue":
a[6] ||
(a[6] = (e) => (Y.value.email = e)),
placeholder: "E-Mail Adresse",
type: "email",
},
null,
8,
["modelValue"],
),
]),
X.value
? (d(), r(oe, { key: 0 }))
: g("v-if", !0),
X.value
? (d(),
m("div", O, [
a[21] ||
(a[21] = y(
"span",
{ class: "label" },
"Temporäres Passwort",
-1,
)),
o(
be,
{
modelValue:
Y.value.temporary_password,
"onUpdate:modelValue":
a[7] ||
(a[7] = (e) =>
(Y.value.temporary_password = e)),
readonly: "",
class: "password-input",
},
null,
8,
["modelValue"],
),
a[22] ||
(a[22] = y(
"p",
{ class: "field-note" },
"Wird beim Senden der Zugangsdaten automatisch generiert.",
-1,
)),
]))
: g("v-if", !0),
]),
y("div", W, [
o(
ve,
{
primary: "",
block: "",
loading: P.value,
onClick: de,
},
{
default: u(() => [
...(a[23] ||
(a[23] = [p("Daten speichern", -1)])),
]),
_: 1,
},
8,
["loading"],
),
X.value
? (d(),
m(
c,
{ key: 0 },
[
o(oe),
b(
(d(),
r(
ve,
{
secondary: "",
block: "",
loading: G.value === Y.value.id,
onClick:
a[8] ||
(a[8] = (e) =>
(async function (e) {
G.value = e.id;
try {
if (
(await t.post(
"/flows/trigger/33443f6b-cec7-4668-9607-f33ea674d501",
[e.id],
),
(I.value = {
type: "success",
message: `Zugangsdaten für ${e.first_name} versendet. 📧`,
}),
await te(B.value),
R.value &&
Y.value.id === e.id)
) {
const a = K.value.find(
(a) => a.id === e.id,
);
a &&
(Y.value.temporary_password =
a.temporary_password);
}
} catch (e) {
I.value = {
type: "danger",
message: `Fehler: ${e.message}`,
};
} finally {
G.value = null;
}
})(Y.value)),
},
{
default: u(() => [
o(l, {
name: "send",
left: "",
}),
a[24] ||
(a[24] = p(
" Zugangsdaten senden ",
-1,
)),
]),
_: 1,
},
8,
["loading"],
)),
[
[
xe,
"Generiert PW, speichert es und sendet E-Mail",
void 0,
{ bottom: !0 },
],
],
),
],
64,
))
: g("v-if", !0),
]),
]))
: g("v-if", !0),
]),
_: 1,
},
8,
["modelValue", "title"],
),
]),
_: 1,
},
)
);
}
);
},
}),
B = [],
K = [];
!(function (e, a) {
if (e && "undefined" != typeof document) {
var t,
l = !0 === a.prepend ? "prepend" : "append",
n = !0 === a.singleTag,
i =
"string" == typeof a.container
? document.querySelector(a.container)
: document.getElementsByTagName("head")[0];
if (n) {
var s = B.indexOf(i);
(-1 === s && ((s = B.push(i) - 1), (K[s] = {})),
(t = K[s] && K[s][l] ? K[s][l] : (K[s][l] = d())));
} else t = d();
(65279 === e.charCodeAt(0) && (e = e.substring(1)),
t.styleSheet
? (t.styleSheet.cssText += e)
: t.appendChild(document.createTextNode(e)));
}
function d() {
var e = document.createElement("style");
if ((e.setAttribute("type", "text/css"), a.attributes))
for (var t = Object.keys(a.attributes), n = 0; n < t.length; n++)
e.setAttribute(t[n], a.attributes[t[n]]);
var s = "prepend" === l ? "afterbegin" : "beforeend";
return (i.insertAdjacentElement(s, e), e);
}
})(
"\n.content-wrapper[data-v-3fd11e72] { padding: 32px; height: 100%; display: flex; flex-direction: column;\n}\n.company-item[data-v-3fd11e72] { cursor: pointer;\n}\n.header[data-v-3fd11e72] { margin-bottom: 24px; display: flex; justify-content: space-between; align-items: flex-end;\n}\n.header-right[data-v-3fd11e72] { display: flex; gap: 12px;\n}\n.title[data-v-3fd11e72] { font-size: 24px; font-weight: 800; margin-bottom: 4px;\n}\n.subtitle[data-v-3fd11e72] { color: var(--theme--foreground-subdued); font-size: 14px;\n}\n.empty-state[data-v-3fd11e72] { height: 100%; display: flex; align-items: center; justify-content: center;\n}\n.user-cell[data-v-3fd11e72] { display: flex; align-items: center; gap: 12px;\n}\n.user-name[data-v-3fd11e72] { font-weight: 600;\n}\n.status-date[data-v-3fd11e72] { font-size: 12px; color: var(--theme--foreground-subdued);\n}\n.drawer-content[data-v-3fd11e72] { padding: 24px; display: flex; flex-direction: column; gap: 32px;\n}\n.form-section[data-v-3fd11e72] { display: flex; flex-direction: column; gap: 20px;\n}\n.field[data-v-3fd11e72] { display: flex; flex-direction: column; gap: 8px;\n}\n.label[data-v-3fd11e72] { font-size: 12px; font-weight: 700; text-transform: uppercase; color: var(--theme--foreground-subdued); letter-spacing: 0.5px;\n}\n.field-note[data-v-3fd11e72] { font-size: 11px; color: var(--theme--foreground-subdued); margin-top: 4px;\n}\n.drawer-actions[data-v-3fd11e72] { margin-top: 24px; display: flex; flex-direction: column; gap: 12px;\n}\n.password-input[data-v-3fd11e72] textarea {\n\tfont-family: var(--family-monospace);\n\tfont-weight: 800;\n\tcolor: var(--theme--primary) !important;\n\tbackground: var(--theme--background-subdued) !important;\n}\n.clickable-table[data-v-3fd11e72] tbody tr { cursor: pointer; transition: background-color 0.2s ease;\n}\n.clickable-table[data-v-3fd11e72] tbody tr:hover { background-color: var(--theme--background-subdued) !important;\n}\n[data-v-3fd11e72] .v-list-item { cursor: pointer 
!important;\n}\n",
{},
);
var L = a({
id: "customer-manager",
name: "Customer Manager",
icon: "supervisor_account",
routes: [
{
path: "",
component: ((e, a) => {
const t = e.__vccOpts || e;
for (const [e, l] of a) t[e] = l;
return t;
})(q, [
["__scopeId", "data-v-3fd11e72"],
["__file", "module.vue"],
]),
},
],
});
export { L as default };


@@ -1,29 +0,0 @@
{
"name": "customer-manager",
"description": "Custom High-Fidelity Customer & Company Management for Directus",
"icon": "supervisor_account",
"version": "1.6.0",
"keywords": [
"directus",
"directus-extension",
"directus-extension-module"
],
"files": [
"dist"
],
"directus:extension": {
"type": "module",
"path": "index.js",
"source": "src/index.ts",
"host": "*",
"name": "Customer Manager"
},
"scripts": {
"build": "directus-extension build",
"dev": "directus-extension build -w"
},
"devDependencies": {
"@directus/extensions-sdk": "11.0.2",
"vue": "^3.4.0"
}
}


@@ -1,14 +0,0 @@
import { defineModule } from '@directus/extensions-sdk';
import ModuleComponent from './module.vue';
export default defineModule({
id: 'customer-manager',
name: 'Customer Manager',
icon: 'supervisor_account',
routes: [
{
path: '',
component: ModuleComponent,
},
],
});


@@ -1,377 +0,0 @@
<template>
<private-view title="Customer Manager">
<template #navigation>
<v-list nav>
<v-list-item @click="openCreateCompany" clickable>
<v-list-item-icon><v-icon name="add" color="var(--theme--primary)" /></v-list-item-icon>
<v-list-item-content>
<v-text-overflow text="Neue Firma anlegen" />
</v-list-item-content>
</v-list-item>
<v-divider />
<v-list-item
v-for="company in companies"
:key="company.id"
:active="selectedCompany?.id === company.id"
class="company-item"
clickable
@click="selectCompany(company)"
>
<v-list-item-icon><v-icon name="business" /></v-list-item-icon>
<v-list-item-content>
<v-text-overflow :text="company.name" />
</v-list-item-content>
</v-list-item>
</v-list>
</template>
<template #title-outer:after>
<v-notice v-if="notice" :type="notice.type" @close="notice = null" dismissible>
{{ notice.message }}
</v-notice>
</template>
<div class="content-wrapper">
<div v-if="!selectedCompany" class="empty-state">
<v-info title="Firmen auswählen" icon="business" center>
Wähle eine Firma in der Navigation aus oder
<v-button x-small @click="openCreateCompany">erstelle eine neue Firma</v-button>.
</v-info>
</div>
<template v-else>
<header class="header">
<div class="header-left">
<h1 class="title">{{ selectedCompany.name }}</h1>
<p class="subtitle">{{ employees.length }} Kunden-Mitarbeiter</p>
</div>
<div class="header-right">
<v-button secondary rounded icon v-tooltip.bottom="'Firma bearbeiten'" @click="openEditCompany">
<v-icon name="edit" />
</v-button>
<v-button primary @click="openCreateEmployee">
Mitarbeiter hinzufügen
</v-button>
</div>
</header>
<v-table
:headers="tableHeaders"
:items="employees"
:loading="loading"
class="clickable-table"
fixed-header
@click:row="onRowClick"
>
<template #[`item.name`]="{ item }">
<div class="user-cell">
<v-avatar :name="item.first_name" x-small />
<span class="user-name">{{ item.first_name }} {{ item.last_name }}</span>
</div>
</template>
<template #[`item.last_invited`]="{ item }">
<span v-if="item.last_invited" class="status-date">
{{ formatDate(item.last_invited) }}
</span>
<v-chip v-else x-small>Noch nie</v-chip>
</template>
</v-table>
</template>
</div>
<!-- Drawer: Company Form -->
<v-drawer
v-model="drawerCompanyActive"
:title="isEditingCompany ? 'Firma bearbeiten' : 'Neue Firma anlegen'"
icon="business"
@cancel="drawerCompanyActive = false"
>
<div v-if="drawerCompanyActive" class="drawer-content">
<div class="form-section">
<div class="field">
<span class="label">Firmenname</span>
<v-input v-model="companyForm.name" placeholder="z.B. KLZ Cables" autofocus />
</div>
</div>
<div class="drawer-actions">
<v-button primary block :loading="saving" @click="saveCompany">Speichern</v-button>
</div>
</div>
</v-drawer>
<!-- Drawer: Employee Form -->
<v-drawer
v-model="drawerEmployeeActive"
:title="isEditingEmployee ? 'Mitarbeiter bearbeiten' : 'Neuen Mitarbeiter anlegen'"
icon="person"
@cancel="drawerEmployeeActive = false"
>
<div v-if="drawerEmployeeActive" class="drawer-content">
<div class="form-section">
<div class="field">
<span class="label">Vorname</span>
<v-input v-model="employeeForm.first_name" placeholder="Vorname" autofocus />
</div>
<div class="field">
<span class="label">Nachname</span>
<v-input v-model="employeeForm.last_name" placeholder="Nachname" />
</div>
<div class="field">
<span class="label">E-Mail</span>
<v-input v-model="employeeForm.email" placeholder="E-Mail Adresse" type="email" />
</div>
<v-divider v-if="isEditingEmployee" />
<div v-if="isEditingEmployee" class="field">
<span class="label">Temporäres Passwort</span>
<v-input v-model="employeeForm.temporary_password" readonly class="password-input" />
<p class="field-note">Wird beim Senden der Zugangsdaten automatisch generiert.</p>
</div>
</div>
<div class="drawer-actions">
<v-button primary block :loading="saving" @click="saveEmployee">Daten speichern</v-button>
<template v-if="isEditingEmployee">
<v-divider />
<v-button
v-tooltip.bottom="'Generiert PW, speichert es und sendet E-Mail'"
secondary
block
:loading="invitingId === employeeForm.id"
@click="inviteUser(employeeForm)"
>
<v-icon name="send" left /> Zugangsdaten senden
</v-button>
</template>
</div>
</div>
</v-drawer>
</private-view>
</template>
<script setup lang="ts">
import { ref, onMounted, nextTick } from 'vue';
import { useApi } from '@directus/extensions-sdk';
const api = useApi();
const companies = ref<any[]>([]);
const selectedCompany = ref<any>(null);
const employees = ref<any[]>([]);
const loading = ref(false);
const saving = ref(false);
const invitingId = ref<string | null>(null);
const notice = ref<{ type: string; message: string } | null>(null);
// Forms State
const drawerCompanyActive = ref(false);
const isEditingCompany = ref(false);
const companyForm = ref({ id: '', name: '' });
const drawerEmployeeActive = ref(false);
const isEditingEmployee = ref(false);
const employeeForm = ref({
id: '',
first_name: '',
last_name: '',
email: '',
temporary_password: ''
});
const tableHeaders = [
{ text: 'Name', value: 'name', sortable: true },
{ text: 'E-Mail', value: 'email', sortable: true },
{ text: 'Zuletzt eingeladen', value: 'last_invited', sortable: true }
];
async function fetchCompanies() {
const res = await api.get('/items/companies', {
params: {
fields: ['id', 'name'],
sort: 'name',
},
});
companies.value = res.data.data;
}
async function selectCompany(company: any) {
selectedCompany.value = company;
loading.value = true;
try {
const res = await api.get('/items/client_users', {
params: {
filter: { company: { _eq: company.id } },
fields: ['*'],
sort: 'first_name',
},
});
employees.value = res.data.data;
} finally {
loading.value = false;
}
}
// Company Actions
function openCreateCompany() {
isEditingCompany.value = false;
companyForm.value = { id: '', name: '' };
drawerCompanyActive.value = true;
}
async function openEditCompany() {
if (!selectedCompany.value) return;
companyForm.value = {
id: selectedCompany.value.id,
name: selectedCompany.value.name
};
isEditingCompany.value = true;
await nextTick();
drawerCompanyActive.value = true;
}
async function saveCompany() {
if (!companyForm.value.name) return;
saving.value = true;
try {
if (isEditingCompany.value) {
await api.patch(`/items/companies/${companyForm.value.id}`, { name: companyForm.value.name });
notice.value = { type: 'success', message: 'Firma aktualisiert!' };
} else {
await api.post('/items/companies', { name: companyForm.value.name });
notice.value = { type: 'success', message: 'Firma angelegt!' };
}
drawerCompanyActive.value = false;
await fetchCompanies();
if (selectedCompany.value?.id === companyForm.value.id) {
selectedCompany.value.name = companyForm.value.name;
}
} catch (e: any) {
notice.value = { type: 'danger', message: e.message };
} finally {
saving.value = false;
}
}
// Employee Actions
function openCreateEmployee() {
isEditingEmployee.value = false;
employeeForm.value = { id: '', first_name: '', last_name: '', email: '', temporary_password: '' };
drawerEmployeeActive.value = true;
}
async function openEditEmployee(item: any) {
employeeForm.value = {
id: item.id || '',
first_name: item.first_name || '',
last_name: item.last_name || '',
email: item.email || '',
temporary_password: item.temporary_password || ''
};
isEditingEmployee.value = true;
await nextTick();
drawerEmployeeActive.value = true;
}
async function saveEmployee() {
if (!employeeForm.value.email || !selectedCompany.value) return;
saving.value = true;
try {
if (isEditingEmployee.value) {
await api.patch(`/items/client_users/${employeeForm.value.id}`, {
first_name: employeeForm.value.first_name,
last_name: employeeForm.value.last_name,
email: employeeForm.value.email
});
notice.value = { type: 'success', message: 'Mitarbeiter aktualisiert!' };
} else {
await api.post('/items/client_users', {
first_name: employeeForm.value.first_name,
last_name: employeeForm.value.last_name,
email: employeeForm.value.email,
company: selectedCompany.value.id
});
notice.value = { type: 'success', message: 'Mitarbeiter angelegt!' };
}
drawerEmployeeActive.value = false;
await selectCompany(selectedCompany.value);
} catch (e: any) {
notice.value = { type: 'danger', message: e.message };
} finally {
saving.value = false;
}
}
async function inviteUser(user: any) {
invitingId.value = user.id;
try {
await api.post(`/flows/trigger/33443f6b-cec7-4668-9607-f33ea674d501`, [user.id]);
notice.value = { type: 'success', message: `Zugangsdaten für ${user.first_name} versendet. 📧` };
await selectCompany(selectedCompany.value);
if (drawerEmployeeActive.value && employeeForm.value.id === user.id) {
const updated = employees.value.find(e => e.id === user.id);
if (updated) {
employeeForm.value.temporary_password = updated.temporary_password;
}
}
} catch (e: any) {
notice.value = { type: 'danger', message: `Fehler: ${e.message}` };
} finally {
invitingId.value = null;
}
}
function onRowClick(event: any) {
const item = event?.item || event;
if (item && item.id) {
openEditEmployee(item);
}
}
function formatDate(dateStr: string) {
return new Date(dateStr).toLocaleString('de-DE', {
day: '2-digit', month: '2-digit', year: 'numeric',
hour: '2-digit', minute: '2-digit'
});
}
onMounted(() => {
fetchCompanies();
});
</script>
<style scoped>
.content-wrapper { padding: 32px; height: 100%; display: flex; flex-direction: column; }
.company-item { cursor: pointer; }
.header { margin-bottom: 24px; display: flex; justify-content: space-between; align-items: flex-end; }
.header-right { display: flex; gap: 12px; }
.title { font-size: 24px; font-weight: 800; margin-bottom: 4px; }
.subtitle { color: var(--theme--foreground-subdued); font-size: 14px; }
.empty-state { height: 100%; display: flex; align-items: center; justify-content: center; }
.user-cell { display: flex; align-items: center; gap: 12px; }
.user-name { font-weight: 600; }
.status-date { font-size: 12px; color: var(--theme--foreground-subdued); }
.drawer-content { padding: 24px; display: flex; flex-direction: column; gap: 32px; }
.form-section { display: flex; flex-direction: column; gap: 20px; }
.field { display: flex; flex-direction: column; gap: 8px; }
.label { font-size: 12px; font-weight: 700; text-transform: uppercase; color: var(--theme--foreground-subdued); letter-spacing: 0.5px; }
.field-note { font-size: 11px; color: var(--theme--foreground-subdued); margin-top: 4px; }
.drawer-actions { margin-top: 24px; display: flex; flex-direction: column; gap: 12px; }
.password-input :deep(textarea) {
font-family: var(--family-monospace);
font-weight: 800;
color: var(--theme--primary) !important;
background: var(--theme--background-subdued) !important;
}
.clickable-table :deep(tbody tr) { cursor: pointer; transition: background-color 0.2s ease; }
.clickable-table :deep(tbody tr:hover) { background-color: var(--theme--background-subdued) !important; }
:deep(.v-list-item) { cursor: pointer !important; }
</style>
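The `formatDate` helper in the removed `module.vue` above leans entirely on `toLocaleString` with the `de-DE` locale. As a standalone sketch (assuming a Node runtime with full ICU, so the German locale data is available — the sample timestamp is illustrative):

```typescript
// Standalone copy of the formatDate helper from module.vue above.
function formatDate(dateStr: string): string {
  return new Date(dateStr).toLocaleString("de-DE", {
    day: "2-digit",
    month: "2-digit",
    year: "numeric",
    hour: "2-digit",
    minute: "2-digit",
  });
}

// Shape is "TT.MM.JJJJ, hh:mm"; the exact hour shifts with the local timezone.
console.log(formatDate("2026-03-03T12:24:39+01:00"));
```

Because the hour depends on the machine's timezone, the result should be checked by shape rather than by exact value.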


@@ -1,7 +1,17 @@
import js from "@eslint/js";
import tseslint from "typescript-eslint";
import globals from "globals";
export default tseslint.config(
{
languageOptions: {
globals: {
...globals.browser,
...globals.node,
...globals.es2021,
},
},
},
{
ignores: ["**/dist/**", "**/node_modules/**", "**/.next/**", "**/build/**"],
},


@@ -2,40 +2,37 @@ import nextPlugin from "@next/eslint-plugin-next";
import reactPlugin from "eslint-plugin-react";
import hooksPlugin from "eslint-plugin-react-hooks";
import tseslint from "typescript-eslint";
import js from "@eslint/js";
/**
* Mintel Next.js ESLint Configuration (Flat Config)
*
*
* This configuration replaces the legacy 'eslint-config-next' which
* relies on @rushstack/eslint-patch and causes issues in ESLint 9.
*/
-export const nextConfig = tseslint.config(
-  {
-    plugins: {
-      "react": reactPlugin,
-      "react-hooks": hooksPlugin,
-      "@next/next": nextPlugin,
-    },
-    languageOptions: {
-      globals: {
-        // Add common browser/node globals if needed,
-        // though usually handled by base configs
-      },
-    },
-    rules: {
-      ...reactPlugin.configs.recommended.rules,
-      ...hooksPlugin.configs.recommended.rules,
-      ...nextPlugin.configs.recommended.rules,
-      ...nextPlugin.configs["core-web-vitals"].rules,
-      "react/react-in-jsx-scope": "off",
-      "react/no-unescaped-entities": "off",
-      "@next/next/no-img-element": "warn",
-    },
-    settings: {
-      react: {
-        version: "detect",
-      },
-    },
-  }
-);
+export const nextConfig = tseslint.config({
+  plugins: {
+    react: reactPlugin,
+    "react-hooks": hooksPlugin,
+    "@next/next": nextPlugin,
+  },
+  languageOptions: {
+    globals: {
+      // Add common browser/node globals if needed,
+      // though usually handled by base configs
+    },
+  },
+  rules: {
+    ...reactPlugin.configs.recommended.rules,
+    ...hooksPlugin.configs.recommended.rules,
+    ...nextPlugin.configs.recommended.rules,
+    ...nextPlugin.configs["core-web-vitals"].rules,
+    "react/react-in-jsx-scope": "off",
+    "react/no-unescaped-entities": "off",
+    "@next/next/no-img-element": "warn",
+  },
+  settings: {
+    react: {
+      version: "detect",
+    },
+  },
+});


@@ -1,9 +1,9 @@
 {
   "name": "@mintel/eslint-config",
-  "version": "1.6.0",
+  "version": "1.9.9",
   "publishConfig": {
     "access": "public",
-    "registry": "https://npm.infra.mintel.me"
+    "registry": "https://git.infra.mintel.me/api/packages/mmintel/npm"
   },
   "type": "module",
   "main": "index.js",
@@ -25,5 +25,9 @@
     "eslint-plugin-react": "^7.37.5",
     "eslint-plugin-react-hooks": "^7.0.1",
     "typescript-eslint": "^8.54.0"
   },
+  "repository": {
+    "type": "git",
+    "url": "https://git.infra.mintel.me/mmintel/at-mintel.git"
+  }
 }


@@ -0,0 +1,37 @@
{
"name": "@mintel/estimation-engine",
"version": "1.9.9",
"private": true,
"type": "module",
"main": "./dist/index.js",
"module": "./dist/index.js",
"types": "./dist/index.d.ts",
"bin": {
"estimate": "./dist/cli.js"
},
"exports": {
".": {
"types": "./dist/index.d.ts",
"import": "./dist/index.js"
}
},
"scripts": {
"build": "tsup src/index.ts src/cli.ts --format esm --dts --clean",
"dev": "tsup src/index.ts src/cli.ts --format esm --watch --dts",
"lint": "eslint src",
"estimate": "tsx src/cli.ts"
},
"dependencies": {
"@mintel/concept-engine": "workspace:*",
"axios": "^1.6.0",
"commander": "^12.0.0",
"dotenv": "^17.3.1"
},
"devDependencies": {
"@mintel/tsconfig": "workspace:*",
"@types/node": "^20.0.0",
"tsup": "^8.3.5",
"tsx": "^4.7.0",
"typescript": "^5.0.0"
}
}


@@ -0,0 +1,62 @@
import { config as dotenvConfig } from "dotenv";
import * as path from "node:path";
import * as fs from "node:fs/promises";
import { EstimationPipeline } from "./pipeline.js";
dotenvConfig({ path: path.resolve(process.cwd(), "../../.env") });
const briefing = await fs.readFile(
path.resolve(process.cwd(), "../../data/briefings/etib.txt"),
"utf8",
);
console.log(`Briefing loaded: ${briefing.length} chars`);
const pipeline = new EstimationPipeline(
{
openrouterKey: process.env.OPENROUTER_API_KEY || "",
zyteApiKey: process.env.ZYTE_API_KEY,
outputDir: path.resolve(process.cwd(), "../../out/estimations"),
crawlDir: path.resolve(process.cwd(), "../../data/crawls"),
},
{
onStepStart: (id, _name) => console.log(`[CB] Starting: ${id}`),
onStepComplete: (id) => console.log(`[CB] Done: ${id}`),
onStepError: (id, err) => console.error(`[CB] Error in ${id}: ${err}`),
},
);
try {
const result = await pipeline.run({
concept: {
strategy: {
briefingSummary: briefing,
projectGoals: [],
targetAudience: [],
coreMessage: "",
designVision: "",
uniqueValueProposition: "",
competitorAnalysis: "",
},
architecture: {
sitemap: [],
recommendedTechStack: [],
integrations: [],
websiteTopic: "",
dataModels: [],
},
auditedFacts: {
companyName: "E-TIB",
},
} as any,
});
console.log("\n✨ Pipeline complete!");
console.log(
"Validation:",
result.validationResult?.passed ? "PASSED" : "FAILED",
);
} catch (err: any) {
console.error("\n❌ Pipeline failed:", err.message);
console.error(err.stack);
}


@@ -0,0 +1,81 @@
#!/usr/bin/env node
// ============================================================================
// @mintel/estimation-engine — CLI Entry Point
// ============================================================================
import { Command } from "commander";
import * as path from "node:path";
import * as fs from "node:fs/promises";
import { existsSync } from "node:fs";
import { config as dotenvConfig } from "dotenv";
import { EstimationPipeline } from "./pipeline.js";
import type { ProjectConcept } from "@mintel/concept-engine";
// Load .env from monorepo root
dotenvConfig({ path: path.resolve(process.cwd(), "../../.env") });
dotenvConfig({ path: path.resolve(process.cwd(), ".env") });
const program = new Command();
program
.name("estimate")
.description("AI-powered project estimation engine")
.version("1.0.0");
program
.command("run")
.description("Run the financial estimation pipeline from a concept file")
.argument("<concept-file>", "Path to the ProjectConcept JSON file")
.option("--budget <budget>", "Budget constraint (e.g. '15.000 €')")
.option("--output <dir>", "Output directory", "../../out/estimations")
.action(async (conceptFile: string, options: any) => {
const openrouterKey =
process.env.OPENROUTER_API_KEY || process.env.OPENROUTER_KEY;
if (!openrouterKey) {
console.error("❌ OPENROUTER_API_KEY not found in environment.");
process.exit(1);
}
const filePath = path.resolve(process.cwd(), conceptFile);
if (!existsSync(filePath)) {
console.error(`❌ Concept file not found: ${filePath}`);
process.exit(1);
}
console.log(`📄 Loading concept from: ${filePath}`);
const rawConcept = await fs.readFile(filePath, "utf8");
const concept = JSON.parse(rawConcept) as ProjectConcept;
const pipeline = new EstimationPipeline(
{
openrouterKey,
outputDir: path.resolve(process.cwd(), options.output),
crawlDir: "", // No longer needed here
},
{
onStepStart: (_id, _name) => {},
onStepComplete: (_id, _result) => {},
},
);
try {
const result = await pipeline.run({
concept,
budget: options.budget,
});
console.log("\n✨ Estimation complete!");
if (result.validationResult && !result.validationResult.passed) {
console.log(
`\n⚠ ${result.validationResult.errors.length} validation issues found.`,
);
console.log(" Review the output JSON and re-run problematic steps.");
}
} catch (err) {
console.error(`\n❌ Pipeline failed: ${(err as Error).message}`);
process.exit(1);
}
});
program.parse();


@@ -0,0 +1,9 @@
// ============================================================================
// @mintel/estimation-engine — Public API
// ============================================================================
export { EstimationPipeline } from "./pipeline.js";
export type { PipelineCallbacks } from "./pipeline.js";
export { validateEstimation } from "./validators.js";
export { llmRequest, llmJsonRequest, cleanJson } from "./llm-client.js";
export * from "./types.js";


@@ -0,0 +1,132 @@
// ============================================================================
// LLM Client — Unified interface with model routing via OpenRouter
// ============================================================================
import axios from "axios";
interface LLMRequestOptions {
model: string;
systemPrompt: string;
userPrompt: string;
jsonMode?: boolean;
apiKey: string;
}
interface LLMResponse {
content: string;
usage: {
promptTokens: number;
completionTokens: number;
cost: number;
};
}
/**
* Clean raw LLM output to parseable JSON.
* Handles markdown fences, control chars, trailing commas.
*/
export function cleanJson(str: string): string {
let cleaned = str.replace(/```json\n?|```/g, "").trim();
// eslint-disable-next-line no-control-regex
cleaned = cleaned.replace(/[\x00-\x1f\x7f-\x9f]/gi, " ");
cleaned = cleaned.replace(/,\s*([\]}])/g, "$1");
return cleaned;
}
/**
* Send a request to an LLM via OpenRouter.
*/
export async function llmRequest(
options: LLMRequestOptions,
): Promise<LLMResponse> {
const { model, systemPrompt, userPrompt, jsonMode = true, apiKey } = options;
const resp = await axios.post(
"https://openrouter.ai/api/v1/chat/completions",
{
model,
messages: [
{ role: "system", content: systemPrompt },
{ role: "user", content: userPrompt },
],
...(jsonMode ? { response_format: { type: "json_object" } } : {}),
},
{
headers: {
Authorization: `Bearer ${apiKey}`,
"Content-Type": "application/json",
},
timeout: 120000,
},
);
const content = resp.data.choices?.[0]?.message?.content;
if (!content) {
throw new Error(`LLM returned no content. Model: ${model}`);
}
let cost = 0;
const usage = resp.data.usage || {};
if (usage.cost !== undefined) {
cost = usage.cost;
} else {
// Fallback estimation
cost =
(usage.prompt_tokens || 0) * (0.1 / 1_000_000) +
(usage.completion_tokens || 0) * (0.4 / 1_000_000);
}
return {
content,
usage: {
promptTokens: usage.prompt_tokens || 0,
completionTokens: usage.completion_tokens || 0,
cost,
},
};
}
/**
* Send a request and parse the response as JSON.
*/
export async function llmJsonRequest<T = any>(
options: LLMRequestOptions,
): Promise<{ data: T; usage: LLMResponse["usage"] }> {
const response = await llmRequest({ ...options, jsonMode: true });
const cleaned = cleanJson(response.content);
let parsed: T;
try {
parsed = JSON.parse(cleaned);
} catch (e) {
throw new Error(
`Failed to parse LLM JSON response: ${(e as Error).message}\nRaw: ${cleaned.substring(0, 500)}`,
);
}
// Unwrap common LLM artifacts: {"0": {...}}, {"state": {...}}, etc.
const unwrapped = unwrapResponse(parsed);
return { data: unwrapped as T, usage: response.usage };
}
/**
* Recursively unwrap common LLM wrapping patterns.
*/
function unwrapResponse(obj: any): any {
if (!obj || typeof obj !== "object" || Array.isArray(obj)) return obj;
const keys = Object.keys(obj);
if (keys.length === 1) {
const key = keys[0];
if (
key === "0" ||
key === "state" ||
key === "facts" ||
key === "result" ||
key === "data"
) {
return unwrapResponse(obj[key]);
}
}
return obj;
}
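As a quick sanity check of the cleaning steps documented above, here is `cleanJson` copied verbatim from the file, applied to a sample input (the sample string itself is illustrative, not from the codebase):

```typescript
// Verbatim copy of cleanJson from llm-client.ts above.
function cleanJson(str: string): string {
  let cleaned = str.replace(/```json\n?|```/g, "").trim();
  // eslint-disable-next-line no-control-regex
  cleaned = cleaned.replace(/[\x00-\x1f\x7f-\x9f]/gi, " ");
  cleaned = cleaned.replace(/,\s*([\]}])/g, "$1");
  return cleaned;
}

// A fenced response with trailing commas, as LLMs often emit:
const raw = '```json\n{"items": [1, 2, 3,],}\n```';
console.log(cleanJson(raw)); // → {"items": [1, 2, 3]}
console.log(JSON.parse(cleanJson(raw)).items.length); // → 3
```

Note the order matters: the fence strip runs first so that the trailing-comma regex sees only the JSON body, and the result then parses cleanly.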


@@ -0,0 +1,256 @@
// ============================================================================
// Pipeline Orchestrator
// Runs all steps sequentially, tracks state, supports re-running individual steps.
// ============================================================================
import * as fs from "node:fs/promises";
import * as path from "node:path";
import { validateEstimation } from "./validators.js";
import { executeSynthesize } from "./steps/05-synthesize.js";
import { executeCritique } from "./steps/06-critique.js";
import type {
PipelineConfig,
PipelineInput,
EstimationState,
StepResult,
} from "./types.js";
export interface PipelineCallbacks {
onStepStart?: (stepId: string, stepName: string) => void;
onStepComplete?: (stepId: string, result: StepResult) => void;
onStepError?: (stepId: string, error: string) => void;
}
/**
* The main estimation pipeline orchestrator.
* Runs steps sequentially, persists state between steps, supports re-entry.
*/
export class EstimationPipeline {
private config: PipelineConfig;
private state: EstimationState;
private callbacks: PipelineCallbacks;
constructor(config: PipelineConfig, callbacks: PipelineCallbacks = {}) {
this.config = config;
this.callbacks = callbacks;
this.state = this.createInitialState();
}
private createInitialState(): EstimationState {
return {
concept: null as any, // Will be set in run()
usage: {
totalPromptTokens: 0,
totalCompletionTokens: 0,
totalCost: 0,
perStep: [],
},
};
}
/**
* Run the full estimation pipeline from a completed project concept.
*/
async run(input: PipelineInput): Promise<EstimationState> {
this.state.concept = input.concept;
this.state.budget = input.budget;
// Ensure output directories
await fs.mkdir(this.config.outputDir, { recursive: true });
// Step 5: Position synthesis
await this.runStep("05-synthesize", "Position Descriptions", async () => {
const result = await executeSynthesize(this.state, this.config);
if (result.success) this.state.positionDescriptions = result.data;
return result;
});
// Step 6: Quality critique
await this.runStep(
"06-critique",
"Quality Gate (Industrial Critic)",
async () => {
const result = await executeCritique(this.state, this.config);
if (result.success) {
this.state.critiquePassed = result.data.passed;
this.state.critiqueErrors =
result.data.errors?.map((e: any) => `${e.field}: ${e.issue}`) || [];
// Apply corrections
if (result.data.corrections) {
const corrections = result.data.corrections;
// Note: We only correct the positionDescriptions since briefing/design/sitemap are locked in the concept phase.
// If the critique suggests changes to those, it should be a warning or failure.
if (corrections.positionDescriptions) {
this.state.positionDescriptions = {
...this.state.positionDescriptions,
...corrections.positionDescriptions,
};
}
}
}
return result;
},
);
// Step 7: Deterministic validation
await this.runStep("07-validate", "Deterministic Validation", async () => {
// Build the merged form state first
this.state.formState = this.buildFormState();
const validationResult = validateEstimation(this.state);
this.state.validationResult = validationResult;
if (!validationResult.passed) {
console.log("\n⚠ Validation Issues:");
for (const error of validationResult.errors) {
console.log(` ❌ [${error.code}] ${error.message}`);
}
}
if (validationResult.warnings.length > 0) {
console.log("\n⚡ Warnings:");
for (const warning of validationResult.warnings) {
console.log(` ⚡ [${warning.code}] ${warning.message}`);
if (warning.suggestion) console.log(`${warning.suggestion}`);
}
}
return {
success: true,
data: validationResult,
usage: {
step: "07-validate",
model: "none",
promptTokens: 0,
completionTokens: 0,
cost: 0,
durationMs: 0,
},
};
});
// Save final state
await this.saveState();
return this.state;
}
/**
* Run a single step with callbacks and error handling.
*/
private async runStep(
stepId: string,
stepName: string,
executor: () => Promise<StepResult>,
): Promise<void> {
this.callbacks.onStepStart?.(stepId, stepName);
console.log(`\n📍 ${stepName}...`);
try {
const result = await executor();
if (result.usage) {
this.state.usage.perStep.push(result.usage);
this.state.usage.totalPromptTokens += result.usage.promptTokens;
this.state.usage.totalCompletionTokens += result.usage.completionTokens;
this.state.usage.totalCost += result.usage.cost;
}
if (result.success) {
const cost = result.usage?.cost
? ` ($${result.usage.cost.toFixed(4)})`
: "";
const duration = result.usage?.durationMs
? ` [${(result.usage.durationMs / 1000).toFixed(1)}s]`
: "";
console.log(`${stepName} complete${cost}${duration}`);
this.callbacks.onStepComplete?.(stepId, result);
} else {
console.error(`${stepName} failed: ${result.error}`);
this.callbacks.onStepError?.(stepId, result.error || "Unknown error");
throw new Error(result.error);
}
} catch (err) {
const errorMsg = (err as Error).message;
this.callbacks.onStepError?.(stepId, errorMsg);
throw err;
}
}
/**
* Build the final FormState compatible with @mintel/pdf.
*/
private buildFormState(): Record<string, any> {
const facts = this.state.concept.auditedFacts || {};
return {
projectType: "website",
...facts,
briefingSummary: this.state.concept.strategy.briefingSummary || "",
designVision: this.state.concept.strategy.designVision || "",
sitemap: this.state.concept.architecture.sitemap || [],
positionDescriptions: this.state.positionDescriptions || {},
websiteTopic:
this.state.concept.architecture.websiteTopic ||
facts.websiteTopic ||
"",
statusQuo: facts.isRelaunch ? "Relaunch" : "Neuentwicklung",
name: facts.personName || "",
email: facts.email || "",
};
}
/**
* Save the full state to disk for later re-use.
*/
private async saveState(): Promise<void> {
const timestamp = new Date().toISOString().replace(/[:.]/g, "-");
const companyName =
this.state.concept.auditedFacts?.companyName || "unknown";
// Save full state
const stateDir = path.join(this.config.outputDir, "json");
await fs.mkdir(stateDir, { recursive: true });
const statePath = path.join(stateDir, `${companyName}_${timestamp}.json`);
await fs.writeFile(
statePath,
JSON.stringify(this.state.formState, null, 2),
);
console.log(`\n📦 Saved state to: ${statePath}`);
// Save full pipeline state (for debugging / re-entry)
const debugPath = path.join(
stateDir,
`${companyName}_${timestamp}_debug.json`,
);
await fs.writeFile(debugPath, JSON.stringify(this.state, null, 2));
// Print usage summary
console.log("\n──────────────────────────────────────────────");
console.log("📊 PIPELINE USAGE SUMMARY");
console.log("──────────────────────────────────────────────");
for (const step of this.state.usage.perStep) {
if (step.cost > 0) {
console.log(
` ${step.step}: ${step.model} $${step.cost.toFixed(6)} (${(step.durationMs / 1000).toFixed(1)}s)`,
);
}
}
console.log("──────────────────────────────────────────────");
console.log(` TOTAL: $${this.state.usage.totalCost.toFixed(6)}`);
console.log(
` Tokens: ${(this.state.usage.totalPromptTokens + this.state.usage.totalCompletionTokens).toLocaleString()}`,
);
console.log("──────────────────────────────────────────────\n");
}
/** Get the current state (for CLI inspection). */
getState(): EstimationState {
return this.state;
}
/** Load a saved state from JSON. */
async loadState(jsonPath: string): Promise<void> {
const raw = await fs.readFile(jsonPath, "utf8");
const formState = JSON.parse(raw);
this.state.formState = formState;
}
}
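The filename-safe timestamp built in `saveState` above can be sketched in isolation; the fixed input date below is illustrative only.

```typescript
// Sketch of the filename-safe timestamp used in saveState above.
// The hard-coded date is an illustrative assumption.
const iso = new Date("2026-03-03T12:24:39.123Z").toISOString();
const safeTimestamp = iso.replace(/[:.]/g, "-");
console.log(safeTimestamp); // colons and dots replaced, safe for filenames
```

Replacing `:` and `.` keeps the ISO ordering intact, so saved states still sort chronologically by filename.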


@@ -0,0 +1,95 @@
// ============================================================================
// Step 05: Synthesize — Position Descriptions (Gemini Pro)
// ============================================================================
import { llmJsonRequest } from "../llm-client.js";
import type { EstimationState, StepResult, PipelineConfig } from "../types.js";
import { DEFAULT_MODELS } from "../types.js";
export async function executeSynthesize(
state: EstimationState,
config: PipelineConfig,
): Promise<StepResult> {
const models = { ...DEFAULT_MODELS, ...config.modelsOverride };
const startTime = Date.now();
if (!state.concept?.auditedFacts || !state.concept?.architecture?.sitemap) {
return { success: false, error: "Missing audited facts or sitemap." };
}
const facts = state.concept.auditedFacts;
// Determine which positions are required
const requiredPositions = [
"Das technische Fundament",
(facts.selectedPages?.length || 0) + (facts.otherPages?.length || 0) > 0
? "Individuelle Seiten"
: null,
(facts.features?.length || 0) > 0 ? "System-Module (Features)" : null,
(facts.functions?.length || 0) > 0 ? "Logik-Funktionen" : null,
(facts.apiSystems?.length || 0) > 0 ? "Schnittstellen (API)" : null,
facts.cmsSetup ? "Inhalts-Verwaltung" : null,
facts.multilang ? "Mehrsprachigkeit" : null,
"Inhaltliche Initial-Pflege",
"Sorglos Betrieb",
].filter((p): p is string => p !== null);
const systemPrompt = `
You are a Senior Solution Architect. Write position descriptions for a professional B2B quote.
### REQUIRED POSITIONS (STRICT — ONLY DESCRIBE THESE):
${requiredPositions.map((p) => `"${p}"`).join(", ")}
### RULES (STRICT):
1. NO FIRST PERSON: NEVER "Ich", "Mein", "Wir", "Unser". Lead with nouns or passive verbs.
2. QUANTITY PARITY: Description MUST list EXACTLY the number of items matching 'qty'.
3. CMS GUARD: If cmsSetup=false, do NOT mention "CMS", "Inhaltsverwaltung". Use "Plattform-Struktur".
4. TONE: "Erstellung von...", "Anbindung der...", "Bereitstellung von...". Technical, high-density.
5. PAGES: List actual page names. NO implementation notes in parentheses.
6. HARD SPECIFICS: Use industry terms from the briefing (e.g. "Kabeltiefbau", "110 kV").
7. KEYS: Return EXACTLY the keys from REQUIRED POSITIONS.
8. NO AGB: NEVER mention "AGB" or "Geschäftsbedingungen".
9. Sorglos Betrieb: "Inklusive 1 Jahr technischer Betrieb, Hosting, SSL, Sicherheits-Updates, Monitoring und techn. Support."
10. Inhaltliche Initial-Pflege: Refers to DATENSÄTZE (datasets like products, references), NOT Seiten.
Use "Datensätze" in the description, not "Seiten".
11. Mehrsprachigkeit: This is a +20% markup on the subtotal. NOT an API. NOT a Schnittstelle.
### EXAMPLES:
- GOOD: "Erstellung der Seiten: Startseite, Über uns, Leistungen, Kontakt."
- GOOD: "Native API-Anbindung an Google Maps mit individueller Standort-Visualisierung."
- BAD: "Ich richte dir das CMS ein."
- BAD: "Verschiedene Funktionen" (too generic — name the things!)
### DATA CONTEXT:
${JSON.stringify({ facts, sitemap: state.concept.architecture.sitemap, strategy: { briefingSummary: state.concept.strategy.briefingSummary } }, null, 2)}
### OUTPUT FORMAT:
{
"positionDescriptions": { "Das technische Fundament": string, ... }
}
`;
try {
const { data, usage } = await llmJsonRequest({
model: models.pro,
systemPrompt,
userPrompt: state.concept.briefing,
apiKey: config.openrouterKey,
});
return {
success: true,
data: data.positionDescriptions || data,
usage: {
step: "05-synthesize",
model: models.pro,
promptTokens: usage.promptTokens,
completionTokens: usage.completionTokens,
cost: usage.cost,
durationMs: Date.now() - startTime,
},
};
} catch (err) {
return { success: false, error: `Synthesize step failed: ${(err as Error).message}` };
}
}
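The required-positions filter above can be exercised standalone; the `FactsLike` shape and the sample facts below are illustrative assumptions, not the real `auditedFacts` type.

```typescript
// Standalone sketch of the required-positions filter in executeSynthesize.
// FactsLike and the sample input are illustrative assumptions.
interface FactsLike {
  selectedPages?: string[];
  otherPages?: string[];
  features?: string[];
  functions?: string[];
  apiSystems?: string[];
  cmsSetup?: boolean;
  multilang?: boolean;
}

function requiredPositionsFor(facts: FactsLike): string[] {
  return [
    "Das technische Fundament",
    (facts.selectedPages?.length || 0) + (facts.otherPages?.length || 0) > 0
      ? "Individuelle Seiten"
      : null,
    (facts.features?.length || 0) > 0 ? "System-Module (Features)" : null,
    (facts.functions?.length || 0) > 0 ? "Logik-Funktionen" : null,
    (facts.apiSystems?.length || 0) > 0 ? "Schnittstellen (API)" : null,
    facts.cmsSetup ? "Inhalts-Verwaltung" : null,
    facts.multilang ? "Mehrsprachigkeit" : null,
    "Inhaltliche Initial-Pflege",
    "Sorglos Betrieb",
  ].filter((p): p is string => p !== null);
}

// A minimal project: one page, CMS enabled, nothing else selected.
const positions = requiredPositionsFor({
  selectedPages: ["Startseite"],
  cmsSetup: true,
});
```

With this input the LLM is constrained to exactly five positions, so a hallucinated "Schnittstellen (API)" section would violate rule 7 of the prompt.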


@@ -0,0 +1,99 @@
// ============================================================================
// Step 06: Critique — Industrial Critic Quality Gate (Claude Opus)
// ============================================================================
import { llmJsonRequest } from "../llm-client.js";
import type { EstimationState, StepResult, PipelineConfig } from "../types.js";
import { DEFAULT_MODELS } from "../types.js";
export async function executeCritique(
state: EstimationState,
config: PipelineConfig,
): Promise<StepResult> {
const models = { ...DEFAULT_MODELS, ...config.modelsOverride };
const startTime = Date.now();
const currentState = {
facts: state.concept?.auditedFacts,
briefingSummary: state.concept?.strategy?.briefingSummary,
designVision: state.concept?.strategy?.designVision,
sitemap: state.concept?.architecture?.sitemap,
positionDescriptions: state.positionDescriptions,
siteProfile: state.concept?.siteProfile
? {
existingFeatures: state.concept.siteProfile.existingFeatures,
services: state.concept.siteProfile.services,
externalDomains: state.concept.siteProfile.externalDomains,
navigation: state.concept.siteProfile.navigation,
totalPages: state.concept.siteProfile.totalPages,
}
: null,
};
const systemPrompt = `
You are the "Industrial Critic" — the final quality gate for a professional B2B estimation.
Your job is to find EVERY error, hallucination, and inconsistency before this goes to the client.
### CRITICAL ERROR CHECKLIST (FAIL IF ANY FOUND):
1. HALLUCINATION: FAIL if names, software versions, or details not in the BRIEFING are used.
- "Sie", "Ansprechpartner" for personName when an actual name exists = FAIL.
2. LOGIC CONFLICT: FAIL if isRelaunch=true but text claims "no website exists".
3. IMPLEMENTATION FLUFF: FAIL if "React", "Next.js", "TypeScript", "Tailwind" are mentioned.
4. GENERICISM: FAIL if text could apply to ANY company. Must use specific industry terms.
5. NAMEN-VERBOT: FAIL if personal names in briefingSummary or designVision.
6. CMS-LEAKAGE: FAIL if cmsSetup=false but descriptions mention "CMS", "Inhaltsverwaltung".
7. AGB BAN: FAIL if "AGB" or "Geschäftsbedingungen" appear anywhere.
8. LENGTH: briefingSummary ~6 sentences, designVision ~4 sentences. Shorten if too wordy.
9. LEGAL SAFETY: FAIL if "rechtssicher" is used. Use "Standard-konform" instead.
10. BULLSHIT DETECTOR: FAIL if jargon like "SEO-Standards zur Fachkräftesicherung",
"B2B-Nutzerströme", "Digitale Konvergenzstrategie" or similar meaningless buzzwords are used.
The text must make SENSE to a construction industry CEO.
11. PAGE STRUCTURE: FAIL if the sitemap contains:
- Videos as pages (Messefilm, Imagefilm)
- Internal functions as pages (Verwaltung)
- Entities with their own domains as sub-pages (check externalDomains!)
12. SORGLOS-BETRIEB: FAIL if not mentioned in the summary or position descriptions.
13. TONE: FAIL if "wir/unser" or "Ich/Mein" in position descriptions. FAIL if marketing fluff.
14. MULTILANG: FAIL if Mehrsprachigkeit is described as an API or Schnittstelle.
15. INITIAL-PFLEGE: FAIL if described in terms of "Seiten" instead of "Datensätze".
### MISSION:
Return corrected fields ONLY for fields with issues. If everything passes, return empty corrections.
### OUTPUT FORMAT:
{
"passed": boolean,
"errors": [{ "field": string, "issue": string, "severity": "critical" | "warning" }],
"corrections": {
"briefingSummary"?: string,
"designVision"?: string,
"positionDescriptions"?: Record<string, string>,
"sitemap"?: array
}
}
`;
try {
const { data, usage } = await llmJsonRequest({
model: models.opus,
systemPrompt,
userPrompt: `BRIEFING_TRUTH:\n${state.concept?.briefing}\n\nCURRENT_STATE:\n${JSON.stringify(currentState, null, 2)}`,
apiKey: config.openrouterKey,
});
return {
success: true,
data,
usage: {
step: "06-critique",
model: models.opus,
promptTokens: usage.promptTokens,
completionTokens: usage.completionTokens,
cost: usage.cost,
durationMs: Date.now() - startTime,
},
};
} catch (err) {
return { success: false, error: `Critique step failed: ${(err as Error).message}` };
}
}
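The critique step returns `corrections` containing only the fields that failed; a caller would merge them back onto the state. This merge is not shown in the diff, so the sketch below is an assumption: a plain spread where correction keys win, with the field names taken from the OUTPUT FORMAT above.

```typescript
// Hypothetical sketch: applying critique corrections onto prior output.
// The merge strategy (corrections override) is an assumption of this sketch.
interface Correctable {
  briefingSummary: string;
  designVision: string;
}

function applyCorrections(
  current: Correctable,
  corrections: Partial<Correctable>,
): Correctable {
  // Untouched fields pass through; corrected fields are replaced wholesale.
  return { ...current, ...corrections };
}

const merged = applyCorrections(
  { briefingSummary: "alt", designVision: "alt" },
  { designVision: "neu" },
);
```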


@@ -0,0 +1,113 @@
// ============================================================================
// @mintel/estimation-engine — Core Type Definitions
// ============================================================================
import type { ProjectConcept } from "@mintel/concept-engine";
/** Configuration for the estimation pipeline */
export interface PipelineConfig {
openrouterKey: string;
zyteApiKey?: string;
outputDir: string;
crawlDir: string;
modelsOverride?: Partial<ModelConfig>;
}
/** Model routing configuration */
export interface ModelConfig {
flash: string;
pro: string;
opus: string;
}
export const DEFAULT_MODELS: ModelConfig = {
flash: "google/gemini-3-flash-preview",
pro: "google/gemini-3.1-pro-preview",
opus: "anthropic/claude-opus-4-6",
};
/** Input for the estimation pipeline */
export interface PipelineInput {
concept: ProjectConcept;
budget?: string;
}
/** State that flows through all pipeline steps */
export interface EstimationState {
// Input
concept: ProjectConcept;
budget?: string;
// Step 5 output: Position Synthesis
positionDescriptions?: Record<string, string>;
// Step 6 output: Critique
critiquePassed?: boolean;
critiqueErrors?: string[];
// Step 7 output: Validation
validationResult?: ValidationResult;
// Final merged form state for PDF generation
formState?: Record<string, any>;
// Cost tracking
usage: UsageStats;
}
export interface UsageStats {
totalPromptTokens: number;
totalCompletionTokens: number;
totalCost: number;
perStep: StepUsage[];
}
export interface StepUsage {
step: string;
model: string;
promptTokens: number;
completionTokens: number;
cost: number;
durationMs: number;
}
/** Result of a single pipeline step */
export interface StepResult<T = any> {
success: boolean;
data?: T;
error?: string;
usage?: StepUsage;
}
/** Validation result from the deterministic validator */
export interface ValidationResult {
passed: boolean;
errors: ValidationError[];
warnings: ValidationWarning[];
}
export interface ValidationError {
code: string;
message: string;
field?: string;
expected?: any;
actual?: any;
}
export interface ValidationWarning {
code: string;
message: string;
suggestion?: string;
}
/** Step definition for the pipeline */
export interface PipelineStep {
id: string;
name: string;
description: string;
model: "flash" | "pro" | "opus" | "none";
execute: (
state: EstimationState,
config: PipelineConfig,
) => Promise<StepResult>;
}
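`UsageStats` totals are presumably accumulated from the per-step `StepUsage` entries each step returns. A minimal sketch of that aggregation, with the interface re-declared inline so the snippet is self-contained and the sample numbers invented for illustration:

```typescript
// Sketch: folding StepUsage entries into UsageStats-style totals.
// Type re-declared inline; sample values are illustrative.
interface StepUsage {
  step: string;
  model: string;
  promptTokens: number;
  completionTokens: number;
  cost: number;
  durationMs: number;
}

function aggregate(perStep: StepUsage[]) {
  return perStep.reduce(
    (acc, s) => ({
      totalPromptTokens: acc.totalPromptTokens + s.promptTokens,
      totalCompletionTokens: acc.totalCompletionTokens + s.completionTokens,
      totalCost: acc.totalCost + s.cost,
    }),
    { totalPromptTokens: 0, totalCompletionTokens: 0, totalCost: 0 },
  );
}

const totals = aggregate([
  { step: "05-synthesize", model: "pro", promptTokens: 100, completionTokens: 50, cost: 0.01, durationMs: 1200 },
  { step: "06-critique", model: "opus", promptTokens: 200, completionTokens: 80, cost: 0.05, durationMs: 900 },
]);
```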


@@ -0,0 +1,436 @@
// ============================================================================
// Validators — Deterministic Math & Logic Checks (NO LLM!)
// Catches all the issues reported by the user programmatically.
// ============================================================================
import type {
EstimationState,
ValidationResult,
ValidationError,
ValidationWarning,
} from "./types.js";
/**
* Run all deterministic validation checks on the final estimation state.
*/
export function validateEstimation(state: EstimationState): ValidationResult {
const errors: ValidationError[] = [];
const warnings: ValidationWarning[] = [];
if (!state.formState) {
return {
passed: false,
errors: [
{
code: "NO_FORM_STATE",
message: "No form state available for validation.",
},
],
warnings: [],
};
}
const fs = state.formState;
// 1. PAGE COUNT PARITY
validatePageCountParity(fs, errors);
// 2. SORGLOS-BETRIEB IN SUMMARY
validateSorglosBetrieb(fs, errors, warnings);
// 3. NO VIDEOS AS PAGES
validateNoVideosAsPages(fs, errors);
// 4. EXTERNAL DOMAINS NOT AS PAGES
validateExternalDomains(fs, state.concept?.siteProfile, errors);
// 5. SERVICE COVERAGE
validateServiceCoverage(fs, state.concept?.siteProfile, warnings);
// 6. EXISTING FEATURE DETECTION
validateExistingFeatures(fs, state.concept?.siteProfile, warnings);
// 7. MULTILANG LABEL CORRECTNESS
validateMultilangLabeling(fs, errors);
// 8. INITIAL-PFLEGE UNITS
validateInitialPflegeUnits(fs, warnings);
// 9. SITEMAP vs PAGE LIST CONSISTENCY
validateSitemapConsistency(fs, errors);
return {
passed: errors.length === 0,
errors,
warnings,
};
}
/**
* 1. Page count: the "Individuelle Seiten" position description must mention
* roughly the same number of pages as the sitemap contains.
* "er berechnet 15 Seiten nennt aber nur 11"
*
* NOTE: fs.pages (from auditedFacts) is a conceptual list of page groups
* (e.g. "Leistungen") while the sitemap expands those into sub-pages.
* Therefore we do NOT compare fs.pages.length to the sitemap count.
* Instead, we verify that the position description text lists the right count.
*/
function validatePageCountParity(
fs: Record<string, any>,
errors: ValidationError[],
): void {
// Count pages listed in the sitemap (the source of truth)
let sitemapPageCount = 0;
if (Array.isArray(fs.sitemap)) {
for (const cat of fs.sitemap) {
sitemapPageCount += (cat.pages || []).length;
}
}
if (sitemapPageCount === 0) return;
// Extract page names mentioned in the "Individuelle Seiten" position description
const positions = fs.positionDescriptions || {};
const pagesDesc =
positions["Individuelle Seiten"] ||
positions["2. Individuelle Seiten"] ||
"";
if (!pagesDesc) return;
const descStr = typeof pagesDesc === "string" ? pagesDesc : "";
// Count distinct page names mentioned (split by comma)
// We avoid splitting by "&" or "und" because actual page names like
// "Wartung & Störungsdienst" or "Genehmigungs- und Ausführungsplanung" contain them.
const afterColon = descStr.includes(":")
? descStr.split(":").slice(1).join(":")
: descStr;
const segments = afterColon
.split(/,/)
.map((s: string) => s.replace(/\.$/, "").trim())
.filter((s: string) => s.length > 2);
// Handle consolidated references like "Leistungen (6 Unterseiten)" or "(inkl. Messen)"
let mentionedCount = 0;
for (const seg of segments) {
const subPageMatch = seg.match(/\((\d+)\s+(?:Unter)?[Ss]eiten?\)/);
if (subPageMatch) {
mentionedCount += parseInt(subPageMatch[1], 10);
} else if (seg.match(/\(inkl\.\s+/)) {
// "Unternehmen (inkl. Messen)" = 2 pages
mentionedCount += 2;
} else {
mentionedCount += 1;
}
}
if (mentionedCount > 0 && Math.abs(mentionedCount - sitemapPageCount) > 2) {
errors.push({
code: "PAGE_COUNT_MISMATCH",
message: `Seiten-Beschreibung nennt ~${mentionedCount} Seiten, aber ${sitemapPageCount} Seiten in der Sitemap.`,
field: "positionDescriptions.Individuelle Seiten",
expected: sitemapPageCount,
actual: mentionedCount,
});
}
}
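The comma-splitting and consolidated-reference heuristic above can be extracted into a standalone helper; the sample description string is illustrative.

```typescript
// Standalone sketch of the page-counting heuristic in validatePageCountParity.
// The sample description is illustrative.
function countMentionedPages(desc: string): number {
  const afterColon = desc.includes(":")
    ? desc.split(":").slice(1).join(":")
    : desc;
  const segments = afterColon
    .split(/,/)
    .map((s) => s.replace(/\.$/, "").trim())
    .filter((s) => s.length > 2);
  let count = 0;
  for (const seg of segments) {
    const subPageMatch = seg.match(/\((\d+)\s+(?:Unter)?[Ss]eiten?\)/);
    if (subPageMatch) {
      count += parseInt(subPageMatch[1], 10); // "Leistungen (6 Unterseiten)"
    } else if (/\(inkl\.\s+/.test(seg)) {
      count += 2; // "Unternehmen (inkl. Messen)" = 2 pages
    } else {
      count += 1;
    }
  }
  return count;
}

const sample =
  "Erstellung der Seiten: Startseite, Leistungen (6 Unterseiten), Unternehmen (inkl. Messen), Kontakt.";
```

Splitting only on commas (never on "&" or "und") is what keeps page names like "Wartung & Störungsdienst" counted as one page.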
/**
* 2. Sorglos-Betrieb must be included in summary.
* "Zusammenfassung der Schätzung hat Sorglos-Betrieb nicht miteingenommen"
*/
function validateSorglosBetrieb(
fs: Record<string, any>,
errors: ValidationError[],
_warnings: ValidationWarning[],
): void {
const positions = fs.positionDescriptions || {};
const hasPosition = Object.keys(positions).some(
(k) =>
k.toLowerCase().includes("sorglos") ||
k.toLowerCase().includes("betrieb") ||
k.toLowerCase().includes("pflege"),
);
if (!hasPosition) {
errors.push({
code: "MISSING_SORGLOS_BETRIEB",
message: "Der Sorglos-Betrieb fehlt in den Position-Beschreibungen.",
field: "positionDescriptions",
});
}
}
/**
* 3. Videos must not be treated as separate pages.
* "Er hat Videos als eigene Seite aufgenommen"
*/
function validateNoVideosAsPages(
fs: Record<string, any>,
errors: ValidationError[],
): void {
const allPages = [...(fs.selectedPages || []), ...(fs.otherPages || [])];
const sitemapPages = Array.isArray(fs.sitemap)
? fs.sitemap.flatMap((cat: any) =>
(cat.pages || []).map((p: any) => p.title),
)
: [];
const allPageNames = [...allPages, ...sitemapPages];
const videoKeywords = ["video", "film", "messefilm", "imagefilm", "clip"];
for (const pageName of allPageNames) {
const lower = (typeof pageName === "string" ? pageName : "").toLowerCase();
if (
videoKeywords.some(
(kw) => lower.includes(kw) && !lower.includes("leistung"),
)
) {
errors.push({
code: "VIDEO_AS_PAGE",
message: `"${pageName}" ist ein Video-Asset, keine eigene Seite.`,
field: "sitemap",
});
}
}
}
/**
* 4. External sister-company domains must not be proposed as sub-pages.
* "er hat ingenieursgesellschaft als seite integriert, die haben aber eine eigene website"
*/
function validateExternalDomains(
fs: Record<string, any>,
siteProfile: any,
errors: ValidationError[],
): void {
if (!siteProfile?.externalDomains?.length) return;
const sitemapPages = Array.isArray(fs.sitemap)
? fs.sitemap.flatMap((cat: any) =>
(cat.pages || []).map((p: any) => p.title || ""),
)
: [];
for (const extDomain of siteProfile.externalDomains) {
// Extract base name (e.g. "etib-ing" from "etib-ing.com")
const baseName = extDomain
.replace(/^www\./, "")
.split(".")[0]
.toLowerCase();
for (const pageTitle of sitemapPages) {
const lower = pageTitle.toLowerCase();
// Check if the page title references the external company
if (
lower.includes(baseName) ||
(lower.includes("ingenieur") && extDomain.includes("ing"))
) {
errors.push({
code: "EXTERNAL_DOMAIN_AS_PAGE",
message: `"${pageTitle}" hat eine eigene Website (${extDomain}) und darf nicht als Unterseite vorgeschlagen werden.`,
field: "sitemap",
});
}
}
}
}
/**
* 5. Services from the existing site should be covered.
* "er hat leistungen ausgelassen die ganz klar auf der kompetenz seite genannt werden"
*/
function validateServiceCoverage(
fs: Record<string, any>,
siteProfile: any,
warnings: ValidationWarning[],
): void {
if (!siteProfile?.services?.length) return;
const allContent = JSON.stringify(fs).toLowerCase();
for (const service of siteProfile.services) {
const keywords = service
.toLowerCase()
.split(/[\s,&-]+/)
.filter((w: string) => w.length > 4);
const isCovered = keywords.some((kw: string) => allContent.includes(kw));
if (!isCovered && service.length > 5) {
warnings.push({
code: "MISSING_SERVICE",
message: `Bestehende Leistung "${service}" ist nicht in der Schätzung berücksichtigt.`,
suggestion: `Prüfen ob "${service}" im Briefing gewünscht ist und ggf. in die Seitenstruktur aufnehmen.`,
});
}
}
}
/**
* 6. Existing features (search, forms) must be acknowledged.
* "er hat die suchfunktion nicht bemerkt, die gibts schon auf der seite"
*/
function validateExistingFeatures(
fs: Record<string, any>,
siteProfile: any,
warnings: ValidationWarning[],
): void {
if (!siteProfile?.existingFeatures?.length) return;
const functions = fs.functions || [];
const features = fs.features || [];
const allSelected = [...functions, ...features];
for (const existingFeature of siteProfile.existingFeatures) {
if (existingFeature === "cookie-consent") continue; // Standard, don't flag
if (existingFeature === "video") continue; // Usually an asset, not a feature
const isMapped = allSelected.some(
(f: string) => f.toLowerCase() === existingFeature.toLowerCase(),
);
if (!isMapped) {
warnings.push({
code: "EXISTING_FEATURE_IGNORED",
message: `Das bestehende Feature "${existingFeature}" wurde auf der aktuellen Website erkannt, aber nicht in der Schätzung berücksichtigt.`,
suggestion: `"${existingFeature}" als Function oder Feature aufnehmen, da es bereits existiert und der Kunde es erwartet.`,
});
}
}
}
/**
* 7. Multilang +20% must not be labeled as API.
* "die +20% beziehen sich nicht auf API"
*/
function validateMultilangLabeling(
fs: Record<string, any>,
errors: ValidationError[],
): void {
const positions = fs.positionDescriptions || {};
for (const [key, desc] of Object.entries(positions)) {
if (
key.toLowerCase().includes("api") ||
key.toLowerCase().includes("schnittstelle")
) {
const descStr = typeof desc === "string" ? desc : "";
if (
descStr.toLowerCase().includes("mehrsprach") ||
descStr.toLowerCase().includes("multilang") ||
descStr.toLowerCase().includes("20%")
) {
errors.push({
code: "MULTILANG_WRONG_POSITION",
message: `Mehrsprachigkeit (+20%) ist unter "${key}" eingeordnet, gehört aber nicht zu API/Schnittstellen.`,
field: key,
});
}
}
}
}
/**
* 8. Initial-Pflege should refer to "Datensätze" not "Seiten".
* "Initialpflege => 100€/Stk => damit sind keine Seiten sondern Datensätze"
*/
function validateInitialPflegeUnits(
fs: Record<string, any>,
warnings: ValidationWarning[],
): void {
const positions = fs.positionDescriptions || {};
for (const [key, desc] of Object.entries(positions)) {
if (
key.toLowerCase().includes("pflege") ||
key.toLowerCase().includes("initial")
) {
const descStr = typeof desc === "string" ? desc : "";
if (
descStr.toLowerCase().includes("seiten") &&
!descStr.toLowerCase().includes("datensätz")
) {
warnings.push({
code: "INITIALPFLEGE_WRONG_UNIT",
message: `"${key}" spricht von "Seiten", aber gemeint sind Datensätze (z.B. Produkte, Referenzen).`,
suggestion: `Beschreibung auf "Datensätze" statt "Seiten" ändern.`,
});
}
}
}
}
/**
 * 9. Position descriptions must match calculated quantities.
 */
// eslint-disable-next-line @typescript-eslint/no-unused-vars
function validatePositionDescriptionsMath(
fs: Record<string, any>,
errors: ValidationError[],
): void {
const positions = fs.positionDescriptions || {};
// Check pages description mentions correct count
const pagesDesc =
positions["Individuelle Seiten"] ||
positions["2. Individuelle Seiten"] ||
"";
if (pagesDesc) {
// Use the sitemap as the authoritative source of truth for page count
let sitemapPageCount = 0;
if (Array.isArray(fs.sitemap)) {
for (const cat of fs.sitemap) {
sitemapPageCount += (cat.pages || []).length;
}
}
// Count how many page names are mentioned in the description
const descStr = typeof pagesDesc === "string" ? pagesDesc : "";
const mentionedPages = descStr
.split(/,|und|&/)
.filter((s: string) => s.trim().length > 2);
if (
sitemapPageCount > 0 &&
mentionedPages.length > 0 &&
Math.abs(mentionedPages.length - sitemapPageCount) > 2
) {
errors.push({
code: "PAGES_DESC_COUNT_MISMATCH",
message: `Seiten-Beschreibung nennt ~${mentionedPages.length} Seiten, aber ${sitemapPageCount} in der Sitemap.`,
field: "positionDescriptions.Individuelle Seiten",
expected: sitemapPageCount,
actual: mentionedPages.length,
});
}
}
}
/**
* 10. Sitemap categories should be consistent with selected pages/features.
*/
function validateSitemapConsistency(
fs: Record<string, any>,
errors: ValidationError[],
): void {
if (!Array.isArray(fs.sitemap)) return;
const sitemapTitles = fs.sitemap.flatMap((cat: any) =>
(cat.pages || []).map((p: any) => (p.title || "").toLowerCase()),
);
// Check for "Verwaltung" page (hallucinated management page)
for (const title of sitemapTitles) {
if (title.includes("verwaltung") && !title.includes("inhalt")) {
errors.push({
code: "HALLUCINATED_MANAGEMENT_PAGE",
message: `"Verwaltung" als Seite ist vermutlich halluziniert. Verwaltung ist typischerweise eine interne Funktion, keine öffentliche Webseite.`,
field: "sitemap",
});
}
}
}
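The base-name matching in `validateExternalDomains` can be shown in isolation; the domain "etib-ing.com" comes from the example in the comment there, and the page title is an illustrative assumption.

```typescript
// Standalone sketch of the matching logic in validateExternalDomains.
// "etib-ing.com" is taken from the inline example; the page title is assumed.
function baseNameOf(domain: string): string {
  return domain.replace(/^www\./, "").split(".")[0].toLowerCase();
}

const extDomain = "etib-ing.com";
const base = baseNameOf("www." + extDomain); // strips "www." and the TLD
const lower = "ETIB Ingenieursgesellschaft".toLowerCase();

// Mirrors both clauses: direct base-name hit, or the ingenieur/ing heuristic.
const flagged =
  lower.includes(base) ||
  (lower.includes("ingenieur") && extDomain.includes("ing"));
```

Note the first clause alone would miss this case (the title has a space where the domain has a hyphen); the second, looser clause is what catches it.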


@@ -0,0 +1,14 @@
{
"extends": "@mintel/tsconfig/base.json",
"compilerOptions": {
"module": "ESNext",
"target": "ESNext",
"moduleResolution": "Bundler",
"allowImportingTsExtensions": true,
"noEmit": true,
"jsx": "react-jsx"
},
"include": [
"src"
]
}


@@ -1,29 +0,0 @@
{
"name": "@mintel/extension-feedback-commander",
"description": "Custom High-Fidelity Feedback Management Extension for Directus",
"icon": "view_kanban",
"version": "1.6.0",
"keywords": [
"directus",
"directus-extension",
"directus-extension-module"
],
"files": [
"dist"
],
"directus:extension": {
"type": "module",
"path": "dist/index.js",
"source": "src/index.ts",
"host": "*",
"name": "Feedback Commander"
},
"scripts": {
"build": "directus-extension build",
"dev": "directus-extension build -w"
},
"devDependencies": {
"@directus/extensions-sdk": "11.0.2",
"vue": "^3.4.0"
}
}


@@ -1,14 +0,0 @@
import { defineModule } from '@directus/extensions-sdk';
import ModuleComponent from './module.vue';
export default defineModule({
id: 'feedback-commander',
name: 'Feedback Commander',
icon: 'view_kanban',
routes: [
{
path: '',
component: ModuleComponent,
},
],
});


@@ -1,723 +0,0 @@
<template>
<private-view title="Feedback Commander">
<template #headline>
<v-breadcrumb :items="[{ name: 'Feedback', to: '/feedback-commander' }]" />
</template>
<template #title-outer:after>
<v-chip v-if="loading" label color="blue" small>Loading...</v-chip>
<v-chip v-else-if="fetchError" label color="red" small>Fetch Error</v-chip>
<v-chip v-else label color="green" small>{{ items.length }} Items</v-chip>
</template>
<template #navigation>
<div class="sidebar-header">
<v-text-overflow text="Websites" class="header-text" />
</div>
<v-list nav>
<v-list-item
:active="currentProject === 'all'"
@click="currentProject = 'all'"
clickable
>
<v-list-item-icon><v-icon name="language" /></v-list-item-icon>
<v-list-item-content><v-text-overflow text="All Projects" /></v-list-item-content>
</v-list-item>
<v-list-item
v-for="project in projects"
:key="project"
:active="currentProject === project"
@click="currentProject = project"
clickable
>
<v-list-item-icon><v-icon name="public" color="var(--primary)" /></v-list-item-icon>
<v-list-item-content><v-text-overflow :text="project || 'Unknown'" /></v-list-item-content>
</v-list-item>
</v-list>
</template>
<div class="feedback-container">
<div v-if="!items.length && !loading && !fetchError" class="empty-state">
<v-info icon="inbox" title="Clean Inbox" center>
All feedback has been processed. Great job!
</v-info>
</div>
<div v-if="fetchError" class="empty-state">
<v-info icon="error" title="Fetch Failed" :description="fetchError" center />
<v-button @click="fetchData" secondary small>Retry</v-button>
</div>
<div class="operational-layout" v-else-if="items.length">
<!-- Detailed Triage Lane -->
<aside class="triage-lane">
<div class="lane-header">
<v-select
v-model="currentStatusFilter"
:items="statusOptions"
small
placeholder="Status Filter"
/>
</div>
<div class="lane-content scrollbar">
<TransitionGroup name="list">
<div
v-for="item in filteredItems"
:key="item.id"
class="feedback-card"
:class="{ active: selectedItem?.id === item.id }"
@click="selectItem(item)"
>
<div class="card-status-bar" :style="{ background: getStatusColor(item.status || 'open') }"></div>
<div class="card-body">
<header class="card-header">
<span class="card-user">{{ item.user_name }}</span>
<span class="card-date">{{ formatDate(item.date_created || item.id) }}</span>
</header>
<div class="card-text">{{ item.text }}</div>
<footer class="card-footer">
<div class="meta-tags">
<v-chip x-small outline>{{ item.project }}</v-chip>
<v-icon :name="item.type === 'bug' ? 'bug_report' : 'lightbulb'" :color="item.type === 'bug' ? '#E91E63' : '#FFC107'" small />
</div>
<v-icon v-if="selectedItem?.id === item.id" name="chevron_right" small />
</footer>
</div>
</div>
</TransitionGroup>
</div>
</aside>
<!-- Elaborated Master-Detail Desk -->
<main class="processing-desk scrollbar">
<Transition name="fade" mode="out-in">
<div v-if="selectedItem" :key="selectedItem.id" class="desk-content">
<header class="desk-header">
<div class="headline-group">
<div class="status-indicator">
<div class="status-dot" :style="{ background: getStatusColor(selectedItem.status || 'open') }"></div>
<span class="status-text">{{ capitalize(selectedItem.status || 'open') }}</span>
</div>
<h2>{{ selectedItem.user_name }}'s Submission</h2>
</div>
<div class="header-actions">
<v-button primary @click="openDeepLink(selectedItem)">
<v-icon name="open_in_new" left /> Open & Highlight
</v-button>
<v-select
v-model="selectedItem.status"
:items="statuses"
inline
@update:model-value="updateStatus"
/>
</div>
</header>
<div class="desk-grid">
<!-- Message Container -->
<div class="main-column">
<v-card class="content-card">
<v-card-title>
<v-icon name="format_quote" left />
Feedback Content
</v-card-title>
<v-card-text class="feedback-body">
<div v-if="selectedItem.screenshot" class="visual-proof">
<label class="proof-label"><v-icon name="photo" x-small /> Element Snapshot</label>
<img :src="getAssetUrl(selectedItem.screenshot)" class="screenshot-img" />
</div>
<div class="main-text">{{ selectedItem.text }}</div>
</v-card-text>
</v-card>
<section class="reply-section">
<div class="section-divider">
<v-divider />
<span class="divider-label">Internal Communication</span>
<v-divider />
</div>
<div class="thread">
<TransitionGroup name="thread-list">
<div v-for="reply in comments" :key="reply.id" class="reply-bubble">
<header class="reply-header">
<span class="reply-user">{{ reply.user_name }}</span>
<span class="reply-date">{{ formatDate(reply.date_created || reply.id) }}</span>
</header>
<div class="reply-text">{{ reply.text }}</div>
</div>
</TransitionGroup>
<div v-if="!comments.length" class="empty-state-mini">
<v-icon name="auto_awesome" small /> No replies yet. Start the thread.
</div>
</div>
<div class="composer">
<v-textarea v-model="replyText" placeholder="Compose internal response..." auto-grow />
<div class="composer-actions">
<v-button secondary :loading="sending" @click="sendReply">Post Reply</v-button>
</div>
</div>
</section>
</div>
<!-- Technical Sidebar -->
<aside class="meta-column">
<v-card class="meta-card">
<v-card-title>Context</v-card-title>
<v-card-text class="meta-list">
<div class="meta-item">
<label><v-icon name="public" x-small /> Website</label>
<strong>{{ selectedItem.project }}</strong>
</div>
<div class="meta-item">
<label><v-icon name="link" x-small /> Source Path</label>
<span class="truncate-path" :title="selectedItem.url">{{ formatUrl(selectedItem.url) }}</span>
<v-button icon small @click="openExternal(selectedItem.url)"><v-icon name="launch" /></v-button>
</div>
<v-divider />
<div class="meta-item">
<label><v-icon name="layers" x-small /> Element Trace</label>
<code class="trace-code">{{ selectedItem.selector || 'Body' }}</code>
</div>
<div class="meta-item">
<label><v-icon name="location_searching" x-small /> Precise Mark</label>
<span class="coords">X: {{ Math.round(selectedItem.x) }}px / Y: {{ Math.round(selectedItem.y) }}px</span>
</div>
<div class="meta-item">
<label><v-icon name="fingerprint" x-small /> Reference ID</label>
<code class="id-code">{{ selectedItem.id }}</code>
</div>
</v-card-text>
</v-card>
<div class="help-box">
<v-icon name="help_outline" x-small />
<span>Click "Open & Highlight" to jump directly to this element on the live site.</span>
</div>
</aside>
</div>
</div>
<div v-else class="no-selection-desk">
<v-info icon="touch_app" title="Select Feedback" center>
Choose an entry from the triage list to view details and process.
</v-info>
</div>
</Transition>
</main>
</div>
</div>
</private-view>
</template>
<script setup lang="ts">
import { ref, computed, onMounted } from 'vue';
import { useApi } from '@directus/extensions-sdk';
const api = useApi();
const items = ref([]);
const comments = ref([]);
const loading = ref(true);
const fetchError = ref(null);
const sending = ref(false);
const selectedItem = ref(null);
const currentProject = ref('all');
const currentStatusFilter = ref('open');
const replyText = ref('');
const statuses = [
{ text: 'Open', value: 'open', icon: 'warning', color: '#E91E63' },
{ text: 'In Progress', value: 'in_progress', icon: 'play_arrow', color: '#2196F3' },
{ text: 'Resolved', value: 'resolved', icon: 'check_circle', color: '#4CAF50' }
];
const statusOptions = [
{ text: 'All Statuses', value: 'all' },
...statuses
];
const projects = computed(() => {
const projSet = new Set(items.value.map(i => i.project).filter(Boolean));
return Array.from(projSet).sort();
});
const filteredItems = computed(() => {
return items.value.filter(item => {
const matchProject = currentProject.value === 'all' || item.project === currentProject.value;
const status = item.status || 'open';
const matchStatus = currentStatusFilter.value === 'all' || status === currentStatusFilter.value;
return matchProject && matchStatus;
});
});
async function fetchData() {
loading.value = true;
fetchError.value = null;
try {
const response = await api.get('/items/visual_feedback', {
params: {
sort: '-date_created,-id',
limit: 300
}
});
items.value = response.data.data;
} catch (e: any) {
fetchError.value = e?.message ?? 'Failed to load feedback';
} finally {
loading.value = false;
}
}
async function selectItem(item) {
// Clear the selection briefly so the fade transition re-triggers on re-select
selectedItem.value = null;
setTimeout(async () => {
selectedItem.value = item;
comments.value = [];
try {
const response = await api.get('/items/visual_feedback_comments', {
params: {
filter: { feedback_id: { _eq: item.id } },
sort: '-date_created,-id'
}
});
comments.value = response.data.data;
} catch (e) {
console.error(e);
}
}, 10);
}
async function updateStatus(val) {
if (!selectedItem.value) return;
try {
await api.patch(`/items/visual_feedback/${selectedItem.value.id}`, {
status: val
});
// Reflect the change locally so the detail view updates before the list refetch lands
selectedItem.value.status = val;
fetchData();
} catch (e) {
console.error(e);
}
}
async function sendReply() {
if (!replyText.value.trim() || !selectedItem.value) return;
sending.value = true;
try {
const response = await api.post('/items/visual_feedback_comments', {
feedback_id: selectedItem.value.id,
user_name: 'Operator',
text: replyText.value
});
comments.value.unshift(response.data.data);
replyText.value = '';
} catch (e) {
console.error(e);
} finally {
sending.value = false;
}
}
function formatDate(dateStr) {
if (!dateStr || typeof dateStr === 'number') return 'Legacy';
const d = new Date(dateStr);
return d.toLocaleDateString() + ' ' + d.toLocaleTimeString([], { hour: '2-digit', minute: '2-digit' });
}
function formatUrl(url) {
if (!url) return '';
return url.replace(/^https?:\/\//, '');
}
function capitalize(s) {
if (!s) return '';
return s.charAt(0).toUpperCase() + s.slice(1).replace(/_/g, ' ');
}
function getDeepLinkUrl(item) {
if (!item || !item.url) return '';
try {
const url = new URL(item.url);
url.searchParams.set('fb_id', item.id);
return url.toString();
} catch (e) {
// Fallback for relative/invalid URLs: append with the correct query separator
const sep = item.url.includes('?') ? '&' : '?';
return item.url + sep + 'fb_id=' + item.id;
}
}
function openDeepLink(item) {
const url = getDeepLinkUrl(item);
if (url) window.open(url, '_blank', 'noopener,noreferrer');
}
function openExternal(url) {
if (url) window.open(url, '_blank', 'noopener,noreferrer');
}
function getAssetUrl(id) {
if (!id) return '';
return `/assets/${id}`;
}
function getStatusColor(status) {
const s = statuses.find(st => st.value === status);
return s ? s.color : 'var(--foreground-subdued)';
}
onMounted(() => {
fetchData();
});
</script>
<style scoped>
.feedback-container {
height: calc(100vh - 64px);
display: flex;
flex-direction: column;
background: var(--background-subdued);
}
.operational-layout {
display: flex;
height: 100%;
}
/* Triage Lane Polish */
.triage-lane {
width: 360px;
height: 100%;
display: flex;
flex-direction: column;
background: var(--background-normal);
border-right: 1px solid var(--border-normal);
box-shadow: 2px 0 8px rgba(0,0,0,0.02);
}
.lane-header {
padding: 16px;
background: var(--background-normal);
border-bottom: 1px solid var(--border-normal);
}
.lane-content {
flex: 1;
overflow-y: auto;
padding: 12px;
display: flex;
flex-direction: column;
gap: 12px;
}
.feedback-card {
background: var(--background-normal);
border: 1px solid var(--border-subdued);
border-radius: 8px;
display: flex;
overflow: hidden;
cursor: pointer;
transition: all var(--transition);
}
.feedback-card:hover {
border-color: var(--border-normal);
background: var(--background-subdued);
transform: translateY(-1px);
box-shadow: 0 4px 8px rgba(0,0,0,0.04);
}
.feedback-card.active {
border-color: var(--primary);
background: var(--background-accent);
box-shadow: 0 4px 12px rgba(var(--primary-rgb), 0.1);
}
.card-status-bar {
width: 4px;
}
.card-body {
flex: 1;
padding: 12px;
display: flex;
flex-direction: column;
gap: 8px;
}
.card-header {
display: flex;
justify-content: space-between;
font-size: 11px;
}
.card-user { font-weight: bold; color: var(--foreground-normal); }
.card-date { color: var(--foreground-subdued); }
.card-text {
font-size: 13px;
line-height: 1.5;
color: var(--foreground-normal);
display: -webkit-box;
-webkit-line-clamp: 2;
-webkit-box-orient: vertical;
overflow: hidden;
}
.card-footer {
display: flex;
justify-content: space-between;
align-items: center;
}
.meta-tags {
display: flex;
gap: 8px;
align-items: center;
}
/* Processing Desk Refinement */
.processing-desk {
flex: 1;
height: 100%;
overflow-y: auto;
padding: 32px;
}
.desk-content {
max-width: 1100px;
margin: 0 auto;
}
.desk-header {
display: flex;
justify-content: space-between;
align-items: flex-end;
margin-bottom: 32px;
border-bottom: 2px solid var(--border-normal);
padding-bottom: 20px;
}
.headline-group {
display: flex;
flex-direction: column;
gap: 8px;
}
.status-indicator {
display: flex;
align-items: center;
gap: 8px;
font-size: 12px;
font-weight: bold;
text-transform: uppercase;
color: var(--foreground-subdued);
}
.status-dot {
width: 8px;
height: 8px;
border-radius: 50%;
}
.status-text { letter-spacing: 0.5px; }
.header-actions {
display: flex;
gap: 16px;
align-items: center;
}
.desk-grid {
display: grid;
grid-template-columns: 1fr 300px;
gap: 24px;
align-items: start;
}
.content-card {
border-radius: 12px;
overflow: hidden;
}
.feedback-body {
font-size: 18px;
line-height: 1.6;
padding: 24px;
color: var(--foreground-normal);
display: flex;
flex-direction: column;
gap: 20px;
}
.visual-proof {
display: flex;
flex-direction: column;
gap: 8px;
}
.proof-label {
font-size: 10px;
text-transform: uppercase;
font-weight: 800;
color: var(--foreground-subdued);
letter-spacing: 0.5px;
}
.screenshot-img {
width: 100%;
border-radius: 8px;
border: 1px solid var(--border-normal);
box-shadow: 0 4px 12px rgba(0,0,0,0.1);
background: var(--background-subdued);
}
.main-text {
white-space: pre-wrap;
}
.reply-section {
margin-top: 40px;
}
.section-divider {
display: flex;
align-items: center;
gap: 16px;
margin-bottom: 24px;
}
.divider-label {
font-size: 11px;
text-transform: uppercase;
font-weight: 800;
color: var(--foreground-subdued);
white-space: nowrap;
letter-spacing: 1px;
}
.thread {
display: flex;
flex-direction: column;
gap: 16px;
margin-bottom: 32px;
}
.reply-bubble {
padding: 16px;
border-radius: 12px;
background: var(--background-normal);
border: 1px solid var(--border-subdued);
}
.reply-header {
display: flex;
justify-content: space-between;
font-size: 11px;
margin-bottom: 8px;
}
.reply-user { font-weight: 800; color: var(--primary); }
.reply-date { color: var(--foreground-subdued); }
.reply-text { font-size: 14px; line-height: 1.5; }
.composer {
background: var(--background-normal);
border: 1px solid var(--border-normal);
border-radius: 12px;
padding: 16px;
}
.composer-actions {
display: flex;
justify-content: flex-end;
margin-top: 12px;
}
.meta-card {
border-radius: 12px;
}
.meta-list {
display: flex;
flex-direction: column;
gap: 16px;
padding: 16px;
}
.meta-item {
display: flex;
flex-direction: column;
gap: 4px;
font-size: 13px;
}
.meta-item label {
font-size: 10px;
text-transform: uppercase;
font-weight: bold;
color: var(--foreground-subdued);
display: flex;
align-items: center;
gap: 4px;
}
.truncate-path {
color: var(--primary);
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.trace-code, .id-code {
background: var(--background-subdued);
padding: 4px 8px;
border-radius: 4px;
font-size: 11px;
word-break: break-all;
}
.coords { font-weight: bold; font-family: var(--family-monospace); }
.help-box {
margin-top: 20px;
padding: 16px;
background: rgba(var(--primary-rgb), 0.05);
border-radius: 12px;
font-size: 12px;
color: var(--primary);
display: flex;
gap: 8px;
line-height: 1.4;
}
.no-selection-desk {
height: 100%;
display: flex;
align-items: center;
justify-content: center;
}
.empty-state-mini {
text-align: center;
padding: 24px;
font-size: 12px;
color: var(--foreground-subdued);
background: var(--background-subdued);
border-radius: 12px;
border: 1px dashed var(--border-normal);
}
/* Animations */
.list-enter-active, .list-leave-active { transition: all 0.3s ease; }
.list-enter-from, .list-leave-to { opacity: 0; transform: translateX(-20px); }
.fade-enter-active, .fade-leave-active { transition: opacity 0.2s ease, transform 0.2s ease; }
.fade-enter-from { opacity: 0; transform: translateY(10px); }
.fade-leave-to { opacity: 0; transform: translateY(-10px); }
.thread-list-enter-active { transition: all 0.4s ease; transform-origin: top; }
.thread-list-enter-from { opacity: 0; transform: scaleY(0.9); }
.scrollbar::-webkit-scrollbar { width: 6px; }
.scrollbar::-webkit-scrollbar-track { background: transparent; }
.scrollbar::-webkit-scrollbar-thumb { background: var(--border-subdued); border-radius: 3px; }
.scrollbar::-webkit-scrollbar-thumb:hover { background: var(--border-normal); }
</style>


@@ -3,6 +3,7 @@ import { NextConfig } from "next";
const nextConfig: NextConfig = {
basePath: '/gatekeeper',
output: 'standalone',
};
export default mintelNextConfig(nextConfig);


@@ -1,6 +1,6 @@
{
"name": "@mintel/gatekeeper",
"version": "1.6.0",
"version": "1.9.9",
"private": true,
"type": "module",
"scripts": {
@@ -12,13 +12,11 @@
},
"dependencies": {
"@mintel/next-utils": "workspace:*",
"clsx": "^2.1.1",
"framer-motion": "^11.18.2",
"lucide-react": "^0.474.0",
"next": "16.1.6",
"next-intl": "^4.8.2",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"tailwind-merge": "^2.6.0"
"react-dom": "^19.0.0"
},
"devDependencies": {
"@mintel/eslint-config": "workspace:*",


@@ -1,4 +1,3 @@
/* global module */
module.exports = {
plugins: {
tailwindcss: {},


@@ -11,6 +11,8 @@ export async function GET(req: NextRequest) {
// 1. URL Parameter Bypass (for automated tests/staging)
const originalUrl = req.headers.get("x-forwarded-uri") || "/";
console.log(`[Verify] Check: ${originalUrl} | Cookie: ${session ? "Found" : "Missing"}`);
const host =
req.headers.get("x-forwarded-host") || req.headers.get("host") || "";
const proto = req.headers.get("x-forwarded-proto") || "https";
@@ -44,7 +46,7 @@ export async function GET(req: NextRequest) {
return response;
}
} catch (e) {
} catch (_e) {
// URL parsing failed, proceed with normal logic
}
@@ -54,15 +56,17 @@ export async function GET(req: NextRequest) {
if (session?.value) {
if (session.value === password) {
isAuthenticated = true;
console.log(`[Verify] Legacy password match`);
} else {
try {
const payload = JSON.parse(session.value);
if (payload.identity) {
isAuthenticated = true;
identity = payload.identity;
console.log(`[Verify] Identity verified: ${identity}`);
}
} catch (e) {
// Fallback or old format
} catch (_e) {
console.log(`[Verify] JSON Parse failed for cookie: ${session.value.substring(0, 10)}...`);
}
}
}
@@ -78,7 +82,7 @@ export async function GET(req: NextRequest) {
// Traefik ForwardAuth headers
const gatekeeperUrl =
process.env.NEXT_PUBLIC_BASE_URL || `${proto}://gatekeeper.${host}`;
process.env.GATEKEEPER_ORIGIN || process.env.NEXT_PUBLIC_BASE_URL || `${proto}://gatekeeper.${host}`;
const absoluteOriginalUrl = `${proto}://${host}${originalUrl}`;
const loginUrl = `${gatekeeperUrl}/login?redirect=${encodeURIComponent(absoluteOriginalUrl)}`;


@@ -1,7 +1,7 @@
import { NextRequest, NextResponse } from "next/server";
import { cookies } from "next/headers";
export async function GET(req: NextRequest) {
export async function GET(_req: NextRequest) {
const cookieStore = await cookies();
const authCookieName =
process.env.AUTH_COOKIE_NAME || "mintel_gatekeeper_session";
@@ -17,7 +17,7 @@ export async function GET(req: NextRequest) {
const payload = JSON.parse(session.value);
identity = payload.identity || "Guest";
company = payload.company || null;
} catch (e) {
} catch (_e) {
// Old format probably just the password
}


@@ -8,7 +8,7 @@
}
body {
@apply bg-white text-slate-800 font-serif antialiased selection:bg-slate-900 selection:text-white;
@apply bg-[#f5f5f7] text-black/80 font-serif antialiased selection:bg-black/10 selection:text-black;
line-height: 1.6;
}
@@ -18,15 +18,15 @@
h4,
h5,
h6 {
@apply font-sans font-bold text-slate-900 tracking-tighter;
@apply font-sans font-bold text-black tracking-tighter;
}
p {
@apply mb-4 text-base leading-relaxed text-slate-700;
@apply mb-4 text-base leading-relaxed text-black/50;
}
a {
@apply text-slate-900 hover:text-slate-700 transition-colors no-underline;
@apply text-black/50 hover:text-black transition-colors no-underline;
}
}
@@ -36,34 +36,58 @@
}
.btn {
@apply inline-flex items-center justify-center px-6 py-3 border border-slate-200 bg-white text-slate-600 font-sans font-bold text-sm uppercase tracking-widest rounded-full transition-all duration-500 ease-industrial hover:border-slate-400 hover:text-slate-900 hover:bg-slate-50 hover:-translate-y-0.5 hover:shadow-xl hover:shadow-slate-100 active:translate-y-0 active:shadow-sm;
@apply inline-flex items-center justify-center px-6 py-3 border border-black/10 bg-white text-black/60 font-sans font-bold text-sm uppercase tracking-widest rounded-full transition-all duration-500 ease-industrial hover:border-black/20 hover:text-black hover:bg-white hover:-translate-y-0.5 hover:shadow-xl hover:shadow-black/5 active:translate-y-0 active:shadow-sm;
}
.btn-primary {
@apply border-slate-900 text-slate-900 hover:bg-slate-900 hover:text-white;
@apply border-black bg-black text-white hover:bg-black/85 hover:text-white;
}
}
/* Custom scrollbar */
/* Custom scrollbar - light theme */
::-webkit-scrollbar {
width: 8px;
height: 8px;
width: 6px;
height: 6px;
}
::-webkit-scrollbar-track {
background: #f1f5f9;
background: #f5f5f7;
}
::-webkit-scrollbar-thumb {
background: #cbd5e1;
background: #d1d1d6;
border-radius: 4px;
}
::-webkit-scrollbar-thumb:hover {
background: #94a3b8;
background: #b0b0b8;
}
/* Animations */
@keyframes fade-in {
from {
opacity: 0;
transform: translateY(12px);
}
to {
opacity: 1;
transform: translateY(0);
}
}
@keyframes slide-up {
from {
opacity: 0;
transform: translateY(20px);
}
to {
opacity: 1;
transform: translateY(0);
}
}
@keyframes shake {
0%,
100% {
@@ -79,6 +103,15 @@
}
}
.animate-fade-in {
animation: fade-in 0.8s ease-out forwards;
}
.animate-slide-up {
animation: slide-up 0.8s ease-out 0.2s forwards;
opacity: 0;
}
.animate-shake {
animation: shake 0.2s ease-in-out 0s 2;
}


@@ -13,6 +13,17 @@ const newsreader = Newsreader({
export const metadata: Metadata = {
title: "Gatekeeper | Access Control",
description: "Mintel Infrastructure Protection",
openGraph: {
title: "Gatekeeper | Access Control",
description: "Mintel Infrastructure Protection",
siteName: "Mintel Gatekeeper",
type: "website",
},
twitter: {
card: "summary_large_image",
title: "Gatekeeper | Access Control",
description: "Mintel Infrastructure Protection",
},
};
export default function RootLayout({
