Product Information Management Platform
Production-grade PIM with integrated DAM, brand-specific rules, channel validation, and AI-ready enrichment that proposes, never silently overwrites.
// AI capabilities
- Pluggable image intelligence (Vision API ready)
- LLM-based enrichment proposals routed to a review queue
- Confidence and source tracking per proposed value
- Completeness and channel-readiness scoring engines
// Architecture flow
Overview
A Next.js + Prisma PIM (Product Information Management) system with a Digital Asset Manager (DAM) baked in. Multi-brand catalog support with brand-specific rules in JSON, per-channel readiness validation, AI enrichment that proposes but never overwrites, and a clean operator UI for the catalog team.
Problem
Multi-brand retailers like Mike Sport need a single product hub that respects brand-specific data rules (Adidas requires X attributes, Crocs requires Y), validates against per-channel requirements (Shopify, marketplace, retail), and uses AI for enrichment without giving up control of master data. Off-the-shelf PIMs were either too rigid or too expensive for an operation of this size.
Approach
Build a governance-first PIM where brand rules and channel profiles are configuration, not code. Treat AI as a contributor with proposal rights, not commit rights. Never silently overwrite anything: every machine-generated value goes to an EnrichmentTask queue with confidence and source attribution.
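The queue entry itself can stay small. A minimal sketch of what a proposal record might carry, assuming hypothetical field names (the actual Prisma model may differ):

```ts
// Hypothetical shape of a queued enrichment proposal; field names are
// illustrative, not the project's actual schema.
type EnrichmentTask = {
  productId: string;
  attribute: string;          // e.g. "description.en"
  proposedValue: string;
  confidence: number;         // 0..1, reported by the provider
  source: string;             // e.g. "llm:gpt-4o" or "vision:heuristic"
  status: "pending" | "accepted" | "rejected";
  createdAt: Date;
};

function propose(
  task: Omit<EnrichmentTask, "status" | "createdAt">,
): EnrichmentTask {
  // New proposals always enter the queue as pending review;
  // nothing here touches the product record itself.
  return { ...task, status: "pending", createdAt: new Date() };
}
```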
Architecture
- App: Next.js 15 App Router, TypeScript, Tailwind, shadcn/ui primitives.
- ORM: Prisma over SQLite by default (swap to Postgres for production deploys).
- Validation: Zod schemas for brand rules, channel profiles, and runtime input.
- Asset storage: SHA-256 deduplication; any backend (S3, R2, local) works behind the storage adapter.
- Brand rules: JSON-defined per-brand attribute requirements parsed by the rules engine on every product change (see the schema sketch after this list).
- Channel profiles: Per-channel JSON specs that drive the validator's structured issue reports.
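What those rule and profile documents might look like as Zod schemas. A minimal sketch with assumed field names, not the project's actual JSON layout:

```ts
import { z } from "zod";

// Hypothetical rule/profile shapes -- illustrative fields only.
const BrandRuleSchema = z.object({
  brand: z.string(),                        // e.g. "adidas"
  requiredAttributes: z.array(z.string()),  // e.g. ["material", "gender"]
  locales: z.array(z.string()).min(1),      // locales that must be populated
  minImages: z.number().int().nonnegative(),
});

const ChannelProfileSchema = z.object({
  channel: z.string(),                      // e.g. "shopify"
  requiredFields: z.array(z.string()),      // fields the channel rejects without
});

type BrandRule = z.infer<typeof BrandRuleSchema>;
type ChannelProfile = z.infer<typeof ChannelProfileSchema>;

// Parsing at the boundary makes a malformed config file fail loudly
// instead of silently validating nothing.
function loadBrandRule(json: string): BrandRule {
  return BrandRuleSchema.parse(JSON.parse(json));
}
```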
Tech stack
- Frontend / backend: Next.js 15, TypeScript, Tailwind, shadcn/ui
- ORM and DB: Prisma, SQLite (default) / PostgreSQL (prod)
- Validation: Zod
- Storage: Adapter pattern (S3 / R2 / local)
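The dedup mechanics are simple once storage sits behind a narrow interface. A sketch under assumed method names (`exists`/`put` are illustrative, not the actual adapter contract):

```ts
import { createHash } from "node:crypto";

// Hypothetical narrow interface; the real adapter may expose more methods.
interface StorageAdapter {
  exists(key: string): Promise<boolean>;
  put(key: string, data: Buffer): Promise<void>;
}

// Content-addressed keys: identical bytes always hash to the same key,
// so re-uploading an existing asset is a no-op.
async function storeAsset(storage: StorageAdapter, data: Buffer): Promise<string> {
  const key = createHash("sha256").update(data).digest("hex");
  if (!(await storage.exists(key))) {
    await storage.put(key, data);
  }
  return key; // callers reference the asset by its hash
}
```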
AI work
- Image intelligence stub: heuristic-based by default, swappable to OpenAI Vision, Google Vision, or AWS Rekognition behind a single interface.
- Enrichment LLM stub: proposes values to an EnrichmentTask queue with confidence and source tracking. Never overwrites.
- Completeness engine: weighted scoring across required attributes, locales, imagery, and SEO metadata, with weights tuned per brand (see the scoring sketch after this list).
- Channel validation: structured issue reports point operators directly at what's missing for which channel.
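A sketch of how weighted completeness scoring can work. The dimensions and weights here are assumptions for illustration; in the real engine they are per-brand configuration, not constants:

```ts
// Hypothetical scoring inputs -- each dimension normalized to 0..1.
type CompletenessInput = {
  requiredAttributesFilled: number; // fraction 0..1
  localesCovered: number;           // fraction 0..1
  imageCount: number;
  seoFieldsFilled: number;          // fraction 0..1
};

// Illustrative weights; sum to 1 so the score stays in range.
const weights = { attributes: 0.4, locales: 0.25, imagery: 0.2, seo: 0.15 };

function completenessScore(input: CompletenessInput, minImages = 3): number {
  // Cap imagery at 1 so extra photos don't inflate the score.
  const imagery = Math.min(input.imageCount / minImages, 1);
  const score =
    weights.attributes * input.requiredAttributesFilled +
    weights.locales * input.localesCovered +
    weights.imagery * imagery +
    weights.seo * input.seoFieldsFilled;
  return Math.round(score * 100); // 0-100 for operator display
}
```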
Engineering highlights
- Non-negotiable integrity rules: AI never overwrites; every proposal traces back to a source; assets dedupe by SHA-256; locales never collapse silently.
- 12+ admin pages: products, assets, brands, channels, validation, imports, exports, activity, settings, and more, all with the same operator-grade UX.
- Pluggable everywhere: vision provider, LLM provider, and storage backend all live behind narrow interfaces, making vendor swaps a config change (see the interface sketch after this list).
- Audit trail: every enrichment, every human overwrite, and every channel publication is timestamped and attributable.
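What "narrow interface" means in practice, sketched for the vision provider. Names and the env variable are assumptions, not the project's actual API:

```ts
// Hypothetical provider contract; the real interface may differ.
interface VisionProvider {
  describeImage(image: Buffer): Promise<{ labels: string[]; confidence: number }>;
}

// Default: a cheap heuristic stub with no external calls.
const heuristicVision: VisionProvider = {
  async describeImage() {
    return { labels: [], confidence: 0 };
  },
};

// Swapping vendors means registering one more object that satisfies the
// interface and flipping a config value -- call sites never change.
const providers: Record<string, VisionProvider> = {
  heuristic: heuristicVision,
  // "openai-vision": openAiVision, // added when a vendor adapter lands
};

function getVisionProvider(
  name = process.env.VISION_PROVIDER ?? "heuristic",
): VisionProvider {
  return providers[name] ?? heuristicVision;
}
```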
Outcome
Production-ready and in daily use as the catalog backbone for downstream commerce and analytics. AI proposals reduce manual data work without ever compromising the master data layer.
Lessons
- Brand rules are configuration, not code. The moment you hard-code one brand's quirks you've created a maintenance trap.
- "Propose, never overwrite" is a five-word architecture decision that prevents an entire category of AI failure modes.
- Operator UIs deserve the same care as customer-facing ones. The catalog team is using this every day.
Want to dig deeper?
Ask my AI agent anything about how this was built, what tradeoffs I made, or how it could fit your team.
Ask my AI →
// related projects
Product Data Enrichment Dashboard
AI-assisted product enrichment pipeline with confidence scoring, source-tracked LLM proposals, and a queue-based architecture that never silently overwrites master data.
AI SEO Collection Optimizer
Autonomous SEO content engine that captures Lebanese organic search demand by generating high-confidence collection landing pages on a parallel VPS layer. Pages are grounded in Search Console signals, Shopify orders, and live catalog data, with a self-improving GSC measurement loop.
Marketing Intelligence Dashboard
Enterprise marketing analytics platform with real-time dashboards, OpenAI-powered insights via the Vercel AI SDK, and one-click PPTX stakeholder reporting.