Compare commits

..

14 Commits

Author SHA1 Message Date
0aaf858f5b chore: sync versions to v1.8.20
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m3s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m57s
Monorepo Pipeline / 🏗️ Build (push) Successful in 4m38s
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Successful in 1m12s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 1m42s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 1m31s
Monorepo Pipeline / 🐳 Build Image Processor (push) Successful in 2m50s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 8m14s
Monorepo Pipeline / 🚀 Release (push) Successful in 9m13s
2026-02-23 14:03:27 +01:00
ec562c1b2c fix: imgproxy issues 2026-02-23 14:03:17 +01:00
02e15c3f4a chore: sync versions to v1.8.19
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 3m54s
Monorepo Pipeline / 🧹 Lint (push) Successful in 4m12s
Monorepo Pipeline / 🏗️ Build (push) Successful in 2m42s
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Successful in 1m7s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 1m43s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 1m37s
Monorepo Pipeline / 🐳 Build Image Processor (push) Successful in 2m47s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 6m39s
Monorepo Pipeline / 🚀 Release (push) Successful in 7m18s
🏥 Server Maintenance / 🧹 Prune & Clean (push) Failing after 8s
2026-02-23 00:52:35 +01:00
cd4c2193ce feat: implement legacy imgproxy compatibility and URL mapping
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 1m1s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m56s
Monorepo Pipeline / 🏗️ Build (push) Successful in 4m32s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-23 00:14:13 +01:00
df7a464e03 fix(ci): sync lockfile and remove deleted model scripts
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 5m26s
Monorepo Pipeline / 🏗️ Build (push) Successful in 7m18s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m5s
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Successful in 1m8s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 1m43s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 1m27s
Monorepo Pipeline / 🐳 Build Image Processor (push) Successful in 2m38s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 5m49s
Monorepo Pipeline / 🚀 Release (push) Successful in 6m24s
2026-02-22 23:40:30 +01:00
e2e0653de6 chore(image-processor): use Gemini 3 Flash Preview
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🏗️ Build (push) Failing after 23s
Monorepo Pipeline / 🧹 Lint (push) Failing after 8s
Monorepo Pipeline / 🧪 Test (push) Failing after 21s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-22 23:31:44 +01:00
590ae6f69b chore: sync versions to v1.8.16
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Failing after 29s
Monorepo Pipeline / 🧹 Lint (push) Failing after 21s
Monorepo Pipeline / 🏗️ Build (push) Failing after 8s
Monorepo Pipeline / 🚀 Release (push) Has been skipped
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been skipped
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Has been skipped
Monorepo Pipeline / 🐳 Build Build-Base (push) Has been skipped
Monorepo Pipeline / 🐳 Build Production Runtime (push) Has been skipped
2026-02-22 23:24:30 +01:00
2a169f1dfc feat(image-processor): switch to OpenRouter Vision for smart crop and remove heavy models 2026-02-22 23:24:22 +01:00
1bbe89c879 chore: sync versions to v1.8.15
Some checks failed
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 4s
Monorepo Pipeline / 🧪 Test (push) Successful in 5m30s
Monorepo Pipeline / 🏗️ Build (push) Successful in 7m42s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m5s
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Successful in 1m4s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 1m31s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 59s
Monorepo Pipeline / 🚀 Release (push) Successful in 2m52s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 4m32s
Monorepo Pipeline / 🐳 Build Image Processor (push) Has been cancelled
2026-02-22 23:07:34 +01:00
554ca81c9b chore(image-processor): fix tfjs-node cross compile arch flags 2026-02-22 23:07:32 +01:00
aac0fe81b9 fix(image-service): enforce arm64 cpu architecture for tfjs-node in dockerfile
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 4m59s
Monorepo Pipeline / 🧹 Lint (push) Successful in 6m11s
Monorepo Pipeline / 🏗️ Build (push) Successful in 9m49s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 2m13s
Monorepo Pipeline / 🚀 Release (push) Successful in 3m6s
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Successful in 1m26s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 23s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 6m17s
Monorepo Pipeline / 🐳 Build Image Processor (push) Successful in 16m2s
2026-02-22 22:44:03 +01:00
ada1e9c717 fix(image-service): force rebuild tfjs-node for container architecture in Dockerfile
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 2s
Monorepo Pipeline / 🧪 Test (push) Successful in 5m5s
Monorepo Pipeline / 🧹 Lint (push) Successful in 6m36s
Monorepo Pipeline / 🏗️ Build (push) Successful in 10m21s
Monorepo Pipeline / 🐳 Build Image Processor (push) Successful in 5m10s
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Successful in 1m56s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 2m38s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 1m25s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 7m34s
Monorepo Pipeline / 🚀 Release (push) Successful in 9m13s
2026-02-22 22:29:25 +01:00
4d295d10d1 chore: sync versions to v1.8.12
All checks were successful
Monorepo Pipeline / ⚡ Prioritize Release (push) Successful in 1s
Monorepo Pipeline / 🧪 Test (push) Successful in 3m55s
Monorepo Pipeline / 🧹 Lint (push) Successful in 2m11s
Monorepo Pipeline / 🏗️ Build (push) Successful in 5m59s
Monorepo Pipeline / 🐳 Build Directus (Base) (push) Successful in 1m12s
Monorepo Pipeline / 🐳 Build Build-Base (push) Successful in 1m39s
Monorepo Pipeline / 🐳 Build Production Runtime (push) Successful in 1m29s
Monorepo Pipeline / 🐳 Build Image Processor (push) Successful in 5m35s
Monorepo Pipeline / 🐳 Build Gatekeeper (Product) (push) Successful in 6m14s
Monorepo Pipeline / 🚀 Release (push) Successful in 7m4s
2026-02-22 22:14:44 +01:00
c00f4e5ea5 fix(image-service): resolve next.js build crash and strict TS lint warnings for ci deploy 2026-02-22 22:14:35 +01:00
42 changed files with 462 additions and 790 deletions

2
.env
View File

@@ -1,5 +1,5 @@
# Project
IMAGE_TAG=v1.8.6
IMAGE_TAG=v1.8.19
PROJECT_NAME=at-mintel
PROJECT_COLOR=#82ed20
GITEA_TOKEN=ccce002e30fe16a31a6c9d5a414740af2f72a582

View File

@@ -1,5 +1,5 @@
# Project
IMAGE_TAG=v1.8.11
IMAGE_TAG=v1.8.20
PROJECT_NAME=sample-website
PROJECT_COLOR=#82ed20

View File

@@ -6,34 +6,21 @@ RUN npm install -g pnpm@10.30.1
FROM base AS build
WORKDIR /app
COPY . .
# Note: Canvas needs build tools on Debian
RUN apt-get update && apt-get install -y python3 make g++ libcairo2-dev libpango1.0-dev libjpeg-dev libgif-dev librsvg2-dev
# We only need standard pnpm install now, no C++ tools needed for basic Sharp
RUN pnpm install --frozen-lockfile
# Force tfjs-node to build the native addon from source so it compiles for arm64 (bypassing pnpm quirks)
RUN for f in $(find /app/node_modules/.pnpm -path "*/@tensorflow/tfjs-node/scripts/install.js"); do cd $(dirname $(dirname $f)) && npm run install -- build-addon-from-source; done
RUN pnpm install --frozen-lockfile
# Generate models explicitly for Docker
RUN ls -la packages/image-processor/scripts || true
RUN pnpm dlx tsx packages/image-processor/scripts/download-models.ts
RUN pnpm --filter @mintel/image-processor build
RUN pnpm --filter image-service build
# Generated locally for caching
FROM base
WORKDIR /app
COPY --from=build /app/node_modules ./node_modules
COPY --from=build /app/apps/image-service/node_modules ./apps/image-service/node_modules
COPY --from=build /app/packages/image-processor/node_modules ./packages/image-processor/node_modules
# Make sure directories exist to prevent COPY errors
RUN mkdir -p /app/packages/image-processor/models /app/apps/image-service/dist
RUN mkdir -p /app/apps/image-service/dist
COPY --from=build /app/apps/image-service/dist ./apps/image-service/dist
COPY --from=build /app/apps/image-service/package.json ./apps/image-service/package.json
COPY --from=build /app/packages/image-processor/dist ./packages/image-processor/dist
COPY --from=build /app/packages/image-processor/package.json ./packages/image-processor/package.json
COPY --from=build /app/packages/image-processor/models ./packages/image-processor/models
# Need runtime dependencies for canvas/sharp on Debian
RUN apt-get update && apt-get install -y libcairo2 libpango-1.0-0 libjpeg62-turbo libgif7 librsvg2-2 && rm -rf /var/lib/apt/lists/*
EXPOSE 8080
WORKDIR /app/apps/image-service

View File

@@ -1,6 +1,6 @@
{
"name": "image-service",
"version": "1.8.11",
"version": "1.8.20",
"private": true,
"type": "module",
"scripts": {

View File

@@ -1,18 +1,75 @@
import Fastify from "fastify";
import { processImageWithSmartCrop } from "@mintel/image-processor";
import {
processImageWithSmartCrop,
parseImgproxyOptions,
mapUrl,
} from "@mintel/image-processor";
const fastify = Fastify({
logger: true,
});
fastify.get("/unsafe/:options/:urlSafeB64", async (request, reply) => {
// Compatibility endpoint for old imgproxy calls (optional, but requested by some systems sometimes)
// For now, replacing logic in clients is preferred. So we just redirect or error.
return reply
.status(400)
.send({ error: "Legacy imgproxy API not supported. Use /process" });
const { options, urlSafeB64 } = request.params as {
options: string;
urlSafeB64: string;
};
// urlSafeB64 might be "plain/http://..." or a Base64 string
let url = "";
if (urlSafeB64.startsWith("plain/")) {
url = urlSafeB64.substring(6);
} else {
try {
url = Buffer.from(urlSafeB64, "base64").toString("utf-8");
} catch (e) {
return reply.status(400).send({ error: "Invalid Base64 URL" });
}
}
const parsedOptions = parseImgproxyOptions(options);
const mappedUrl = mapUrl(url, process.env.IMGPROXY_URL_MAPPING);
return handleProcessing(mappedUrl, parsedOptions, reply);
});
// Helper to avoid duplication
async function handleProcessing(url: string, options: any, reply: any) {
const width = options.width || 800;
const height = options.height || 600;
const quality = options.quality || 80;
const format = options.format || "webp";
try {
const response = await fetch(url);
if (!response.ok) {
return reply.status(response.status).send({
error: `Failed to fetch source image: ${response.statusText}`,
});
}
const arrayBuffer = await response.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
const processedBuffer = await processImageWithSmartCrop(buffer, {
width,
height,
format,
quality,
openRouterApiKey: process.env.OPENROUTER_API_KEY,
});
reply.header("Content-Type", `image/${format}`);
reply.header("Cache-Control", "public, max-age=31536000, immutable");
return reply.send(processedBuffer);
} catch (err) {
fastify.log.error(err);
return reply
.status(500)
.send({ error: "Internal Server Error processing image" });
}
}
fastify.get("/process", async (request, reply) => {
const query = request.query as {
url?: string;
@@ -26,41 +83,14 @@ fastify.get("/process", async (request, reply) => {
const width = parseInt(query.w || "800", 10);
const height = parseInt(query.h || "600", 10);
const quality = parseInt(query.q || "80", 10);
const format = (query.format || "webp") as "webp" | "jpeg" | "png" | "avif";
const format = (query.format || "webp") as any;
if (!url) {
return reply.status(400).send({ error: 'Parameter "url" is required' });
}
try {
const response = await fetch(url);
if (!response.ok) {
return reply
.status(response.status)
.send({
error: `Failed to fetch source image: ${response.statusText}`,
});
}
const arrayBuffer = await response.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
const processedBuffer = await processImageWithSmartCrop(buffer, {
width,
height,
format,
quality,
});
reply.header("Content-Type", `image/${format}`);
reply.header("Cache-Control", "public, max-age=31536000, immutable");
return reply.send(processedBuffer);
} catch (err) {
fastify.log.error(err);
return reply
.status(500)
.send({ error: "Internal Server Error processing image" });
}
const mappedUrl = mapUrl(url, process.env.IMGPROXY_URL_MAPPING);
return handleProcessing(mappedUrl, { width, height, quality, format }, reply);
});
fastify.get("/health", async () => {

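The `/unsafe/:options/:urlSafeB64` handler above accepts the source URL either with a `plain/` prefix or Base64-encoded. A minimal sketch of that decoding step, isolated from Fastify (function name is illustrative, not from the source):

```typescript
// Legacy imgproxy source-URL decoding: "plain/<url>" passes through verbatim,
// anything else is treated as a Base64-encoded URL (Node's "base64" decoder
// also accepts URL-safe Base64, which imgproxy clients commonly emit).
function decodeSourceUrl(segment: string): string {
  if (segment.startsWith("plain/")) {
    return segment.substring(6);
  }
  return Buffer.from(segment, "base64").toString("utf-8");
}
```

In the real handler a decode failure returns a 400 response instead of throwing.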
View File

@@ -1,6 +1,13 @@
import mintelNextConfig from "@mintel/next-config";
/** @type {import('next').NextConfig} */
const nextConfig = {};
const nextConfig = {
serverExternalPackages: [
"@mintel/image-processor",
"@tensorflow/tfjs-node",
"sharp",
"canvas",
],
};
export default mintelNextConfig(nextConfig);

View File

@@ -1,6 +1,6 @@
{
"name": "sample-website",
"version": "1.8.11",
"version": "1.8.20",
"private": true,
"type": "module",
"scripts": {

View File

@@ -1,45 +1,60 @@
import { NextRequest, NextResponse } from 'next/server';
import { processImageWithSmartCrop } from '@mintel/image-processor';
import { NextRequest, NextResponse } from "next/server";
export const dynamic = "force-dynamic";
export const runtime = "nodejs";
export async function GET(request: NextRequest) {
const { searchParams } = new URL(request.url);
const url = searchParams.get('url');
let width = parseInt(searchParams.get('w') || '800');
let height = parseInt(searchParams.get('h') || '600');
let q = parseInt(searchParams.get('q') || '80');
const { searchParams } = new URL(request.url);
const url = searchParams.get("url");
const width = parseInt(searchParams.get("w") || "800");
const height = parseInt(searchParams.get("h") || "600");
const q = parseInt(searchParams.get("q") || "80");
if (!url) {
return NextResponse.json({ error: 'Missing url parameter' }, { status: 400 });
if (!url) {
return NextResponse.json(
{ error: "Missing url parameter" },
{ status: 400 },
);
}
try {
// 1. Fetch image from original URL
const response = await fetch(url);
if (!response.ok) {
return NextResponse.json(
{ error: "Failed to fetch original image" },
{ status: response.status },
);
}
try {
// 1. Fetch image from original URL
const response = await fetch(url);
if (!response.ok) {
return NextResponse.json({ error: 'Failed to fetch original image' }, { status: response.status });
}
const arrayBuffer = await response.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
const arrayBuffer = await response.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
// Dynamically import to prevent Next.js from trying to bundle tfjs-node/sharp locally at build time
const { processImageWithSmartCrop } =
await import("@mintel/image-processor");
// 2. Process image with Face-API and Sharp
const processedBuffer = await processImageWithSmartCrop(buffer, {
width,
height,
format: 'webp',
quality: q,
});
// 2. Process image with Face-API and Sharp
const processedBuffer = await processImageWithSmartCrop(buffer, {
width,
height,
format: "webp",
quality: q,
});
// 3. Return the processed image
return new NextResponse(new Uint8Array(processedBuffer), {
status: 200,
headers: {
'Content-Type': 'image/webp',
'Cache-Control': 'public, max-age=31536000, immutable',
},
});
} catch (error) {
console.error('Image Processing Error:', error);
return NextResponse.json({ error: 'Failed to process image' }, { status: 500 });
}
// 3. Return the processed image
return new NextResponse(new Uint8Array(processedBuffer), {
status: 200,
headers: {
"Content-Type": "image/webp",
"Cache-Control": "public, max-age=31536000, immutable",
},
});
} catch (error) {
console.error("Image Processing Error:", error);
return NextResponse.json(
{ error: "Failed to process image" },
{ status: 500 },
);
}
}

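The route above defers `import("@mintel/image-processor")` to request time (paired with `serverExternalPackages` in next.config) so the native tfjs-node/sharp/canvas bindings never enter the Next.js build graph. A minimal sketch of that lazy, load-once pattern, with a stub standing in for the real package:

```typescript
// Lazy cached dynamic import: the identity stub below stands in for
// `await import("@mintel/image-processor")`, which must only run at
// request time so the bundler never sees the native dependency.
type SmartCrop = (buf: Buffer) => Promise<Buffer>;

let cachedProcessor: SmartCrop | undefined;
let loadCount = 0; // only to demonstrate the import runs once per process

async function getProcessor(): Promise<SmartCrop> {
  if (!cachedProcessor) {
    loadCount += 1;
    // Real route: const { processImageWithSmartCrop } = await import("@mintel/image-processor");
    cachedProcessor = async (buf) => buf; // stub: identity "processing"
  }
  return cachedProcessor;
}
```

Node caches dynamic imports per module anyway; the explicit cache here just makes the one-load-per-process behavior visible.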
View File

@@ -57,7 +57,7 @@
"pino-pretty": "^13.1.3",
"require-in-the-middle": "^8.0.1"
},
"version": "1.8.11",
"version": "1.8.20",
"pnpm": {
"onlyBuiltDependencies": [
"@parcel/watcher",

View File

@@ -2,7 +2,7 @@
"name": "acquisition-manager",
"description": "Custom High-Fidelity Management for Directus",
"icon": "extension",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"keywords": [
"directus",

View File

@@ -1,6 +1,6 @@
{
"name": "acquisition",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"directus:extension": {
"type": "endpoint",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/cli",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/cloner",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"main": "dist/index.js",
"module": "dist/index.js",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/cms-infra",
"version": "1.8.11",
"version": "1.8.20",
"private": true,
"type": "module",
"scripts": {

View File

@@ -2,7 +2,7 @@
"name": "company-manager",
"description": "Custom High-Fidelity Management for Directus",
"icon": "extension",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"keywords": [
"directus",

View File

@@ -1,7 +1,7 @@
{
"name": "@mintel/content-engine",
"version": "1.8.11",
"private": true,
"version": "1.8.20",
"private": false,
"type": "module",
"main": "./dist/index.js",
"module": "./dist/index.js",

View File

@@ -2,7 +2,7 @@
"name": "customer-manager",
"description": "Custom High-Fidelity Management for Directus",
"icon": "extension",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"keywords": [
"directus",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/directus-extension-toolkit",
"version": "1.8.11",
"version": "1.8.20",
"description": "Shared toolkit for Directus extensions in the Mintel ecosystem",
"type": "module",
"main": "./dist/index.js",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/eslint-config",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -2,7 +2,7 @@
"name": "feedback-commander",
"description": "Custom High-Fidelity Management for Directus",
"icon": "extension",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"keywords": [
"directus",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/gatekeeper",
"version": "1.8.11",
"version": "1.8.20",
"private": true,
"type": "module",
"scripts": {

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/husky-config",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/image-processor",
"version": "1.8.11",
"version": "1.8.20",
"private": true,
"type": "module",
"main": "./dist/index.js",
@@ -18,9 +18,6 @@
"lint": "eslint src"
},
"dependencies": {
"@tensorflow/tfjs-node": "^4.22.0",
"@vladmandic/face-api": "^1.7.13",
"canvas": "^2.11.2",
"sharp": "^0.33.2"
},
"devDependencies": {

View File

@@ -1,55 +0,0 @@
import * as fs from "node:fs";
import * as path from "node:path";
import * as https from "node:https";
import { fileURLToPath } from "node:url";
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const MODELS_DIR = path.join(__dirname, "..", "models");
const BASE_URL =
"https://raw.githubusercontent.com/vladmandic/face-api/master/model/";
const models = [
"tiny_face_detector_model-weights_manifest.json",
"tiny_face_detector_model-shard1",
];
async function downloadModel(filename: string) {
const destPath = path.join(MODELS_DIR, filename);
if (fs.existsSync(destPath)) {
console.log(`Model ${filename} already exists.`);
return;
}
return new Promise((resolve, reject) => {
console.log(`Downloading ${filename}...`);
const file = fs.createWriteStream(destPath);
https
.get(BASE_URL + filename, (response) => {
response.pipe(file);
file.on("finish", () => {
file.close();
resolve(true);
});
})
.on("error", (err) => {
fs.unlinkSync(destPath);
reject(err);
});
});
}
async function main() {
if (!fs.existsSync(MODELS_DIR)) {
fs.mkdirSync(MODELS_DIR, { recursive: true });
}
for (const model of models) {
await downloadModel(model);
}
console.log("All models downloaded successfully!");
}
main().catch(console.error);

View File

@@ -1,51 +1,174 @@
import * as faceapi from "@vladmandic/face-api";
// Provide Canvas fallback for face-api in Node.js
import { Canvas, Image, ImageData } from "canvas";
import sharp from "sharp";
import * as path from "node:path";
import { fileURLToPath } from "node:url";
// @ts-expect-error FaceAPI does not have type definitions for monkeyPatch
faceapi.env.monkeyPatch({ Canvas, Image, ImageData });
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// Path to the downloaded models
const MODELS_PATH = path.join(__dirname, "..", "models");
let isModelsLoaded = false;
async function loadModels() {
if (isModelsLoaded) return;
await faceapi.nets.tinyFaceDetector.loadFromDisk(MODELS_PATH);
isModelsLoaded = true;
}
export interface ProcessImageOptions {
width: number;
height: number;
format?: "webp" | "jpeg" | "png" | "avif";
quality?: number;
openRouterApiKey?: string;
}
/**
 * Maps a URL based on the IMGPROXY_URL_MAPPING environment variable.
 * Format: "match1|replace1,match2|replace2" (legacy "match1:replace1" is also supported).
 */
export function mapUrl(url: string, mappingString?: string): string {
if (!mappingString) return url;
const mappings = mappingString.split(",").map((m) => {
if (m.includes("|")) {
return m.split("|");
}
// Legacy support for simple "host:target" or cases where one side might have a protocol
// We try to find the split point that isn't part of a protocol "://"
const colonIndices = [];
for (let i = 0; i < m.length; i++) {
if (m[i] === ":") {
// Check if this colon is part of "://"
if (!(m[i + 1] === "/" && m[i + 2] === "/")) {
colonIndices.push(i);
}
}
}
if (colonIndices.length === 0) return [m];
// In legacy mode with colons, we take the LAST non-protocol colon as the separator
// This handles "http://host:port" or "host:http://target" better
const lastColon = colonIndices[colonIndices.length - 1];
return [m.substring(0, lastColon), m.substring(lastColon + 1)];
});
let mappedUrl = url;
for (const [match, replace] of mappings) {
if (match && replace && url.includes(match)) {
mappedUrl = url.replace(match, replace);
}
}
return mappedUrl;
}
/**
* Parses legacy imgproxy options string.
* Example: rs:fill:300:400/q:80
*/
export function parseImgproxyOptions(
optionsStr: string,
): Partial<ProcessImageOptions> {
const parts = optionsStr.split("/");
const options: Partial<ProcessImageOptions> = {};
for (const part of parts) {
if (part.startsWith("rs:")) {
const [, , w, h] = part.split(":");
if (w) options.width = parseInt(w, 10);
if (h) options.height = parseInt(h, 10);
} else if (part.startsWith("q:")) {
const q = part.split(":")[1];
if (q) options.quality = parseInt(q, 10);
} else if (part.startsWith("ext:")) {
const ext = part.split(":")[1] as any;
if (["webp", "jpeg", "png", "avif"].includes(ext)) {
options.format = ext;
}
}
}
return options;
}
interface FaceDetection {
x: number;
y: number;
width: number;
height: number;
}
/**
* Detects faces using OpenRouter Vision API.
* Uses a small preview to save bandwidth and tokens.
*/
async function detectFacesWithCloud(
inputBuffer: Buffer,
apiKey: string,
): Promise<FaceDetection[]> {
try {
// Generate a small preview for vision API (max 512px)
const preview = await sharp(inputBuffer)
.resize(512, 512, { fit: "inside" })
.jpeg({ quality: 60 })
.toBuffer();
const base64Image = preview.toString("base64");
const response = await fetch(
"https://openrouter.ai/api/v1/chat/completions",
{
method: "POST",
headers: {
Authorization: `Bearer ${apiKey}`,
"Content-Type": "application/json",
"HTTP-Referer": "https://mintel.me",
"X-Title": "Mintel Image Service",
},
body: JSON.stringify({
model: "google/gemini-3-flash-preview", // Fast, cheap, and supports vision
messages: [
{
role: "user",
content: [
{
type: "text",
text: 'Detect all human faces in this image. Return ONLY a JSON array of bounding boxes like: [{"x": 0.1, "y": 0.2, "width": 0.05, "height": 0.05}]. Coordinates must be normalized (0 to 1). If no faces, return [].',
},
{
type: "image_url",
image_url: {
url: `data:image/jpeg;base64,${base64Image}`,
},
},
],
},
],
response_format: { type: "json_object" },
}),
},
);
if (!response.ok) {
throw new Error(`OpenRouter API error: ${response.statusText}`);
}
const data = (await response.json()) as any;
const content = data.choices[0]?.message?.content;
if (!content) return [];
// The model might return directly or wrapped in a json field
const parsed = typeof content === "string" ? JSON.parse(content) : content;
const detections = (parsed.faces || parsed.detections || parsed) as any[];
if (!Array.isArray(detections)) return [];
return detections.map((d) => ({
x: d.x,
y: d.y,
width: d.width,
height: d.height,
}));
} catch (error) {
console.error("Cloud face detection failed:", error);
return [];
}
}
export async function processImageWithSmartCrop(
inputBuffer: Buffer,
options: ProcessImageOptions,
): Promise<Buffer> {
await loadModels();
// Load image via Canvas for face-api
const img = new Image();
img.src = inputBuffer;
// Detect faces
const detections = await faceapi.detectAllFaces(
// @ts-expect-error FaceAPI does not have type definitions for monkeyPatch
img,
new faceapi.TinyFaceDetectorOptions(),
);
const sharpImage = sharp(inputBuffer);
const metadata = await sharpImage.metadata();
@@ -53,35 +176,36 @@ export async function processImageWithSmartCrop(
throw new Error("Could not read image metadata");
}
const detections = options.openRouterApiKey
? await detectFacesWithCloud(inputBuffer, options.openRouterApiKey)
: [];
// If faces are found, calculate the bounding box containing all faces
if (detections.length > 0) {
// Map normalized coordinates back to pixels
const pixelDetections = detections.map((d) => ({
x: d.x * (metadata.width || 0),
y: d.y * (metadata.height || 0),
width: d.width * (metadata.width || 0),
height: d.height * (metadata.height || 0),
}));
let minX = metadata.width;
let minY = metadata.height;
let maxX = 0;
let maxY = 0;
for (const det of detections) {
const { x, y, width, height } = det.box;
if (x < minX) minX = Math.max(0, x);
if (y < minY) minY = Math.max(0, y);
if (x + width > maxX) maxX = Math.min(metadata.width, x + width);
if (y + height > maxY) maxY = Math.min(metadata.height, y + height);
for (const det of pixelDetections) {
if (det.x < minX) minX = Math.max(0, det.x);
if (det.y < minY) minY = Math.max(0, det.y);
if (det.x + det.width > maxX)
maxX = Math.min(metadata.width, det.x + det.width);
if (det.y + det.height > maxY)
maxY = Math.min(metadata.height, det.y + det.height);
}
const faceBoxWidth = maxX - minX;
const faceBoxHeight = maxY - minY;
// Calculate center of the faces
const centerX = Math.floor(minX + faceBoxWidth / 2);
const centerY = Math.floor(minY + faceBoxHeight / 2);
// Provide this as a focus point for sharp's extract or resize
// We can use sharp's resize with `position` focusing on crop options,
// or calculate an exact bounding box. However, extracting an exact bounding box
// and then resizing usually yields the best results when focusing on a specific coordinate.
// A simpler approach is to crop a rectangle with the target aspect ratio
// centered on the faces, then resize. Let's calculate the crop box.
const centerX = Math.floor(minX + (maxX - minX) / 2);
const centerY = Math.floor(minY + (maxY - minY) / 2);
const targetRatio = options.width / options.height;
const currentRatio = metadata.width / metadata.height;
@@ -90,18 +214,14 @@ export async function processImageWithSmartCrop(
let cropHeight = metadata.height;
if (currentRatio > targetRatio) {
// Image is wider than target, calculate new width
cropWidth = Math.floor(metadata.height * targetRatio);
} else {
// Image is taller than target, calculate new height
cropHeight = Math.floor(metadata.width / targetRatio);
}
// Try to center the crop box around the faces
let cropX = Math.floor(centerX - cropWidth / 2);
let cropY = Math.floor(centerY - cropHeight / 2);
// Keep crop box within image bounds
if (cropX < 0) cropX = 0;
if (cropY < 0) cropY = 0;
if (cropX + cropWidth > metadata.width) cropX = metadata.width - cropWidth;
@@ -116,9 +236,7 @@ export async function processImageWithSmartCrop(
});
}
// Finally, resize to the requested dimensions and format
let finalImage = sharpImage.resize(options.width, options.height, {
// If faces weren't found, default to entropy/attention based cropping as fallback
fit: "cover",
position: detections.length > 0 ? "center" : "attention",
});

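The two helpers introduced in this file can be sketched compactly. The versions below are reduced re-statements of the diffed code, assuming the preferred `|` mapping separator (the legacy colon heuristic is omitted) and only the `rs:`/`q:`/`ext:` options actually parsed; the hostnames in the usage are hypothetical:

```typescript
// mapUrl: rewrite a source URL via "match|replace" pairs, comma-separated,
// as read from IMGPROXY_URL_MAPPING. First matching pair wins.
function mapUrlSketch(url: string, mapping?: string): string {
  if (!mapping) return url;
  for (const pair of mapping.split(",")) {
    const [match, replace] = pair.split("|");
    if (match && replace && url.includes(match)) {
      return url.replace(match, replace);
    }
  }
  return url;
}

// parseImgproxyOptions: pull width/height/quality/format out of a legacy
// imgproxy path segment like "rs:fill:300:400/q:80/ext:webp".
function parseOptionsSketch(optionsStr: string) {
  const options: { width?: number; height?: number; quality?: number; format?: string } = {};
  for (const part of optionsStr.split("/")) {
    if (part.startsWith("rs:")) {
      const [, , w, h] = part.split(":"); // rs:<type>:<w>:<h>
      if (w) options.width = parseInt(w, 10);
      if (h) options.height = parseInt(h, 10);
    } else if (part.startsWith("q:")) {
      options.quality = parseInt(part.split(":")[1], 10);
    } else if (part.startsWith("ext:")) {
      options.format = part.split(":")[1];
    }
  }
  return options;
}
```

For example, `IMGPROXY_URL_MAPPING="directus:8055|cms.example.com"` would rewrite `http://directus:8055/assets/x.jpg` to `http://cms.example.com/assets/x.jpg`.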
View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/infra",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/journaling",
"version": "1.8.11",
"version": "1.8.20",
"private": true,
"type": "module",
"main": "./dist/index.js",

View File

@@ -1,7 +1,7 @@
import OpenAI from "openai";
import { DataCommonsClient } from "./clients/data-commons";
import { TrendsClient } from "./clients/trends";
import { SerperClient, type SerperVideoResult } from "./clients/serper";
import { SerperClient } from "./clients/serper";
export interface Fact {
statement: string;
@@ -54,7 +54,6 @@ export class ResearchAgent {
if (data.length > 0) {
// Analyze trend
const latest = data[data.length - 1];
const max = Math.max(...data.map((d) => d.value));
facts.push({
statement: `Interest in "${kw}" is currently at ${latest.value}% of peak popularity.`,
source: "Google Trends",
@@ -246,7 +245,7 @@ Return a JSON object with a single string field "query". Example: {"query": "cor
const evalPrompt = `You are a strict technical evaluator. You must select the MOST RELEVANT educational tech video from the list below based on this core article context: "${topic.slice(0, 800)}..."
Videos:
${ytVideos.map((v, i) => `[ID: ${i}] Title: "${v.title}" | Channel: "${v.channel}" | Snippet: "${v.snippet || 'none'}"`).join("\n")}
${ytVideos.map((v, i) => `[ID: ${i}] Title: "${v.title}" | Channel: "${v.channel}" | Snippet: "${v.snippet || "none"}"`).join("\n")}
RULES:
1. The video MUST be highly relevant to the EXACT technical topic of the context.
@@ -268,7 +267,7 @@ Return ONLY a JSON object: {"bestVideoId": number}`;
evalResponse.choices[0].message.content || '{"bestVideoId": -1}',
);
bestIdx = evalParsed.bestVideoId;
} catch (e) {
} catch {
console.warn("Failed to parse video evaluation response");
}
@@ -343,7 +342,7 @@ CRITICAL: Do NOT provide more than 2 trendsKeywords. Keep it extremely focused.`
try {
let parsed = JSON.parse(
response.choices[0].message.content ||
'{"trendsKeywords": [], "dcVariables": []}',
'{"trendsKeywords": [], "dcVariables": []}',
);
if (Array.isArray(parsed)) {
parsed = parsed[0] || { trendsKeywords: [], dcVariables: [] };

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/mail",
"version": "1.8.11",
"version": "1.8.20",
"private": false,
"publishConfig": {
"access": "public",

View File

@@ -1,7 +1,7 @@
{
"name": "@mintel/meme-generator",
"version": "1.8.11",
"private": true,
"version": "1.8.20",
"private": false,
"type": "module",
"main": "./dist/index.js",
"module": "./dist/index.js",

View File

@@ -123,7 +123,7 @@ IMPORTANT: Return ONLY the JSON object. No markdown wrappers.`,
let result;
try {
result = JSON.parse(body);
} catch (e) {
} catch {
console.error("Failed to parse AI response", body);
return [];
}

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/next-config",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/next-feedback",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/next-observability",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/next-utils",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/observability",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/pdf",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"main": "dist/index.js",
"module": "dist/index.js",

View File

@@ -2,7 +2,7 @@
"name": "people-manager",
"description": "Custom High-Fidelity Management for Directus",
"icon": "extension",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"keywords": [
"directus",

View File

@@ -1,7 +1,7 @@
{
"name": "@mintel/thumbnail-generator",
"version": "1.8.11",
"private": true,
"version": "1.8.20",
"private": false,
"type": "module",
"main": "./dist/index.js",
"module": "./dist/index.js",

View File

@@ -1,6 +1,6 @@
{
"name": "@mintel/tsconfig",
"version": "1.8.11",
"version": "1.8.20",
"publishConfig": {
"access": "public",
"registry": "https://npm.infra.mintel.me"

View File

@@ -2,7 +2,7 @@
"name": "unified-dashboard",
"description": "Custom High-Fidelity Management for Directus",
"icon": "extension",
"version": "1.8.11",
"version": "1.8.20",
"type": "module",
"keywords": [
"directus",

656
pnpm-lock.yaml generated

File diff suppressed because it is too large