Using AI and vector database search to automate rote editorial tasks, improve the site's internal link quality, and enhance user experience and session length.
An internal AI-powered tool that recommends optimal inline document links for newly published content. By automating the link discovery and selection process, the tool improves SEO, enhances user experience, and reduces editorial workload.
When new documents were published, editors were required to manually create inline links to them from ~5 existing documents in the corpus. The process was slow and inconsistent; it was often skipped or done with low-quality shortcuts, leading to:
Overuse of the same few documents for linking.
Many documents with few or no inbound links despite being relevant.
Missed SEO opportunities and degraded user navigation.
With thousands of new documents published each month across 30+ brands, the manual process was inefficient, unscalable, and detrimental to site performance.
As Product Manager, I partnered with editorial stakeholders to validate the need, ran a proof of concept to test AI-driven link recommendations, and led development of a full-scale solution. I defined product requirements, guided the integration of AI and vector search, and oversaw rollout across brands.
We built an AI-driven recommendation system that analyzes both new and existing content to surface high-quality inline link opportunities:
Document Input – The editor enters the new document’s URL and its primary SEO keyword phrase.
Keyword Expansion – AI suggests related keyword phrases to broaden discovery.
Vector Database Search – Keyword phrases and document context are matched against a vector database containing chunked sections of all existing brand documents.
Candidate Selection – The system retrieves document chunks that meet a minimum similarity threshold (sketched in the first code example after this list).
Phrase Extraction – AI identifies candidate phrases within chunks that could serve as inline links.
Relevance Scoring – AI scores and ranks potential link phrases for contextual fit and quality (sketched in the second code example after this list).
Editorial Output – Editors receive a sorted list of recommended link options and can directly insert links into the CMS.
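To make the retrieval side concrete, here is a minimal sketch of the candidate-selection logic (steps 3 and 4 above). The `Chunk` structure, the `embed()` placeholder, and the 0.75 similarity threshold are illustrative assumptions, not the production implementation, which runs against a managed vector database and a real embedding model.

```python
from dataclasses import dataclass
import numpy as np

SIMILARITY_THRESHOLD = 0.75  # assumed cutoff; the real value would be tuned per brand

@dataclass
class Chunk:
    doc_url: str        # existing document this chunk came from
    text: str           # chunk text as stored in the vector database
    vector: np.ndarray  # precomputed embedding of the chunk

def embed(text: str) -> np.ndarray:
    """Placeholder for the embedding-model call; returns a unit vector.
    A real embedding captures semantics, this stand-in does not."""
    rng = np.random.default_rng(abs(hash(text)) % (2 ** 32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

def find_candidate_chunks(keyword_phrases: list[str],
                          index: list[Chunk]) -> list[tuple[Chunk, float]]:
    """Match expanded keyword phrases against the chunk index and keep
    chunks whose best cosine similarity clears the threshold (steps 3-4)."""
    best: dict[int, tuple[Chunk, float]] = {}
    for phrase in keyword_phrases:
        query = embed(phrase)
        for i, chunk in enumerate(index):
            score = float(np.dot(query, chunk.vector))  # both vectors are unit length
            if score >= SIMILARITY_THRESHOLD and (i not in best or score > best[i][1]):
                best[i] = (chunk, score)  # keep the best score per chunk across phrases
    return sorted(best.values(), key=lambda pair: pair[1], reverse=True)
```

In production the nested loop would be a single nearest-neighbor query against the vector database; the sketch only shows the thresholding and per-chunk deduplication that sits on top of it.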
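The ranking side (steps 5 through 7) can be sketched the same way. `extract_phrases()` and `score_phrase()` below are simple stand-ins for the AI calls that identify and score candidate anchor phrases; only the combine-and-sort logic that produces the editor-facing list is meant to be representative.

```python
from dataclasses import dataclass

@dataclass
class LinkSuggestion:
    source_url: str     # existing document that would carry the new inline link
    anchor_phrase: str  # phrase in that document to turn into the link
    score: float        # combined relevance score, higher is better

def extract_phrases(chunk_text: str) -> list[str]:
    """Placeholder for AI phrase extraction: naive three-word windows."""
    words = chunk_text.split()
    return [" ".join(words[i:i + 3]) for i in range(0, max(len(words) - 2, 0), 3)]

def score_phrase(phrase: str, target_keywords: list[str], chunk_similarity: float) -> float:
    """Placeholder for AI relevance scoring: blends chunk similarity with
    simple keyword overlap."""
    overlap = any(kw.lower() in phrase.lower() for kw in target_keywords)
    return 0.7 * chunk_similarity + 0.3 * (1.0 if overlap else 0.0)

def rank_link_suggestions(candidates: list[tuple[str, str, float]],
                          target_keywords: list[str],
                          top_n: int = 5) -> list[LinkSuggestion]:
    """candidates are (source_url, chunk_text, chunk_similarity) tuples from
    the vector search step; returns the top-N options for the editor."""
    suggestions = [
        LinkSuggestion(url, phrase, score_phrase(phrase, target_keywords, sim))
        for url, chunk_text, sim in candidates
        for phrase in extract_phrases(chunk_text)
    ]
    suggestions.sort(key=lambda s: s.score, reverse=True)
    return suggestions[:top_n]
```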
Saved 5–10 minutes per document across thousands of monthly publications, significantly reducing editorial workload.
Improved linking coverage and diversity, reducing over-reliance on a small set of documents.
Strengthened SEO performance through optimized internal linking structures.
Enhanced user experience, ensuring readers discover the most relevant related content.
Vector database storing chunked document sections for semantic similarity search (see the indexing sketch below).
AI/NLP models for keyword expansion, phrase extraction, and relevance scoring.
Integrated directly with editorial workflows and CMS for ease of use.
Scalable across 30+ brands publishing thousands of documents monthly.
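As an illustration of the first point above, the indexing side could look roughly like this. The chunk size, overlap, and `embed()` placeholder are assumptions made for the sketch; the real pipeline runs over each brand's full corpus and writes the resulting vectors to the vector database.

```python
import numpy as np

CHUNK_SIZE = 200  # assumed words per chunk
OVERLAP = 40      # assumed overlap so link-worthy phrases are less likely to be split

def chunk_document(text: str) -> list[str]:
    """Split a document into overlapping word-window chunks."""
    words = text.split()
    chunks, start = [], 0
    while start < len(words):
        chunks.append(" ".join(words[start:start + CHUNK_SIZE]))
        start += CHUNK_SIZE - OVERLAP
    return chunks

def embed(text: str) -> np.ndarray:
    """Placeholder for the embedding model used by the real system."""
    rng = np.random.default_rng(abs(hash(text)) % (2 ** 32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

def index_document(doc_url: str, text: str) -> list[dict]:
    """Produce records ready to upsert into a vector database: one vector per
    chunk, tagged with the source URL so results map back to a document."""
    return [
        {"doc_url": doc_url, "chunk_id": i, "text": chunk, "vector": embed(chunk)}
        for i, chunk in enumerate(chunk_document(text))
    ]
```

In this sketch, overlapping chunks are used so that a good anchor phrase is less likely to be cut in half at a chunk boundary, at the cost of some duplicated storage.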