Content Ops for GEO: case studies and scripts to appear in AI responses

Sep 23, 2025

By Daniel Espejo, Founder & CEO at Omnia

In this article

  • What Content Ops for GEO Really Means

  • Case 1: Skincare (B2C)

  • Case 2: B2B SaaS (CRM for 5–50 Users)

  • Scripts and Templates

  • Priority Distribution

  • Maintenance and Review

  • Simple Measurement

  • Common Mistakes

  • Closing

What Content Ops for GEO Really Means

In AI search experiences such as ChatGPT or Gemini, users get direct answers: they learn, compare, and decide in a single step, and in some cases the purchase is even initiated from those responses. If you do not appear in them, you miss key opportunities to be discovered and chosen. If your content is not easy for AI to cite, or is not present in the sources the models consult, your brand simply will not show up. Content Ops for GEO (optimisation for generative AI engines) is about exactly that: not publishing more, but creating useful, reliable information, publishing it on the sites that matter, and keeping it updated in an organised way. This is how you manage your visibility in a more controllable and consistent manner.

To see how you can do this in a clear and actionable way, we have prepared two case studies. We have chosen two completely different niches, and, as you will see, the strategy and steps to follow are practically the same.

Case 1. Skincare (B2C): From Generic Blogs to Reference Mentions in Just a Few Weeks

As we saw recently in the ChatGPT chart, Health and Skincare is one of the most searched sectors on ChatGPT, and more and more users turn to this engine for recommended products and guides.

Imagine a brand of anti-ageing creams for sensitive skin that publishes many posts about lifestyle, but hardly ever appears in AI responses. To change this, the brand focused on key prompts (questions) that users ask when they are ready to buy. We chose three simple questions based on their difficulty and volume and created practical content around them. This helped the brand get cited as a recommended option in AI engines, showing how clear data and real reviews make a difference.

  • Key Prompts (3 Selected): ‘Fragrance-free anti-ageing cream for sensitive skin with clinical testing.’; ‘Simple skincare routine for travel (in carry-on) with SPF.’; ‘Alternatives to retinol for sensitive skin (safe during pregnancy).’ Knowing which prompts drive decision-making is essential. With tools like Omnia, you can analyse the most relevant prompts by volume and difficulty in different markets (a minimal scoring sketch follows this list).

  • Created Content (First Phase): A clear comparison on the website, with criteria such as ingredients and tests, a table of six products, and an honest conclusion about who it is suitable for; A frequently asked questions page with numerical answers, such as ‘How often should it be applied?’ (two to three nights per week); A short video (3–4 min) with transcribed text, explaining common problems and mistakes; A programme for useful reviews, asking customers for details such as skin type and results in weeks.

  • How we distribute it (Middle Phase): Send it to beauty blogs with the table available for download; Upload the video and short clips to YouTube with transcripts; Answer questions in skincare forums with data from the table (without selling directly); Encourage detailed reviews on marketplaces.

  • How We Measure (Final Phase): Whether we appear in the response and in what role (recommended/listed) across 2–3 AI engines; how many days it takes to appear after publishing; which content drove the mentions (video, reviews, etc.).

  • What we learned: AI responses copy your actual criteria and reviews, so tables with data and details are essential.
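
To make the prompt-selection step concrete, here is a minimal sketch of ranking candidate prompts by volume and difficulty before creating content. The prompts are taken from the case above, but the volume and difficulty figures are hypothetical placeholders, not Omnia data, and the scoring rule is just one simple heuristic among many.

```python
# Minimal prompt-prioritisation sketch.
# Volumes and difficulties are hypothetical placeholders, not real data.

candidate_prompts = [
    {"prompt": "Fragrance-free anti-ageing cream for sensitive skin with clinical testing",
     "monthly_volume": 900, "difficulty": 0.4},   # difficulty: 0 (easy) to 1 (hard)
    {"prompt": "Simple skincare routine for travel (in carry-on) with SPF",
     "monthly_volume": 600, "difficulty": 0.3},
    {"prompt": "Alternatives to retinol for sensitive skin (safe during pregnancy)",
     "monthly_volume": 750, "difficulty": 0.5},
]

def priority_score(p: dict) -> float:
    """Simple heuristic: reward volume, penalise difficulty."""
    return p["monthly_volume"] * (1.0 - p["difficulty"])

# Highest-priority prompts first: these are the ones to build content around.
for p in sorted(candidate_prompts, key=priority_score, reverse=True):
    print(f'{priority_score(p):7.1f}  {p["prompt"]}')
```

However you score them, the point is the same as in the case study: pick a small set of purchase-ready prompts and build practical content around exactly those.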

Case 2. B2B SaaS (CRM for 5–50 Users): From Feature Lists to Practical Guides

In this B2B example, a CRM for small teams competes against big brands: lots of technical information, but little clarity about who actually needs it. The brand focused on the questions that drive purchase decisions. We chose three and created simple comparison guides. This accelerated mentions in AI engines, where the tool was positioned as a recommended option thanks to plain explanations and concrete steps.

  • Key Questions: ‘CRM for 5–10 users with SSO and integration with Notion.’; ‘CRM with annual contract and configuration in less than 30 days.’; ‘GDPR-ready CRM with audit logs and hosting in Europe.’

  • Created Content (First Phase): Guide to ‘How to Choose a CRM for Small Teams’ (1,200–1,500 words), with 6 criteria such as security and cost, a downloadable checklist, and common mistakes; Feature table on the web and in CSV format so other people can share it, comparing against competitors and linking to tests (see the CSV export sketch after this list); Quick-start guide (7 steps) for configuration, with real times; Demo video (5–6 min) with a transcript, showing problems and solutions.

  • How we distribute it: Guest article on sales blogs with the downloadable checklist; Upload the demo and common-mistakes video to YouTube with chapters; Share the table in RevOps forums, with update notes.

  • How we measure: Whether we appear in the top results on 3 AI engines; Whether we are recommended or just listed; Days until we appear for each piece of content; How competitor positions change every 2 weeks.

  • What we learned: AI uses your tables to explain options; the quick guide speeds up recommendations; strong mentions come when other sites copy your data.
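
Because the feature table is published both on the web and as a shareable CSV, here is a minimal sketch of that export using Python's standard csv module. The criteria echo the key questions above (SSO, setup time, GDPR and EU hosting), but every product value, competitor column, URL, and the output filename are hypothetical placeholders to be replaced with your verified data.

```python
import csv

# Hypothetical rows for a CRM feature comparison; replace with your verified data.
# Columns mirror the comparison structure: Criteria / Why it matters / Your product / Competitor / Source.
rows = [
    {"Criteria": "SSO", "Why it matters": "Required by IT in teams of 5-50 users",
     "Your product": "Yes (SAML)", "Competitor A": "Enterprise plan only",
     "Source": "https://example.com/security"},
    {"Criteria": "Setup time", "Why it matters": "Teams want to go live in under 30 days",
     "Your product": "7 guided steps, ~2 weeks", "Competitor A": "4-6 weeks",
     "Source": "https://example.com/quick-start"},
    {"Criteria": "GDPR / EU hosting", "Why it matters": "Compliance and audit requirements",
     "Your product": "EU region + audit logs", "Competitor A": "US hosting by default",
     "Source": "https://example.com/gdpr"},
]

# Write the shareable CSV version of the table.
with open("crm-comparison.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)

print("Wrote crm-comparison.csv with", len(rows), "criteria")
```

Keeping the CSV and the on-page table generated from the same rows also makes the monthly review easier: you update the data once and re-export.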

Scripts and Templates (Ready to Use)

These are simple tools to get you started quickly in any business. Use the comparison structure to create clear, easy-to-cite tables, and use the review questions to prompt useful, detailed comments.

  • 1. Comparison Structure (For Any Area): Title: How to choose (product) in (situation); Why these criteria matter (4–6 simple points); Table: Columns such as Criteria / Why / Your brand / Competitors / Source; Honest conclusion: For whom yes / no; Updated on... + Changes.

  • 2. Questions for Useful Reviews: What problem does it solve? How did you use it (how much, where)? What changed and how long did it take? What didn't work, or for whom? Your Brand's Response: Thank you for the details. We confirm: version X, batch Z, used for N weeks. Note: If [condition], adjust to [advice]. (Link to the guide that explains this.) A structured-data sketch of this template follows the list.

  • 3. Video Script (3–4 min): Opening: ‘If you have (problem), this saves you time.’; Criteria: 3 rules + 2 common mistakes; Short demo; Summary with 3 points (what to watch for / what to avoid); Description with the full transcript and links.
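
As referenced in template 2, here is a minimal sketch of capturing the review questions as structured data, so each review carries the context (problem, usage, results, limitations) that makes it citable. The field names and the sample answers are hypothetical, not a prescribed schema.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class StructuredReview:
    """One review captured with the questions from the template above."""
    problem_solved: str        # What problem does it solve?
    how_used: str              # How did you use it (how much, where)?
    result_and_timeframe: str  # What changed and how long did it take?
    limitations: str           # What didn't work, or for whom?

# Hypothetical sample answers; real reviews come from customers.
review = StructuredReview(
    problem_solved="Dry, irritated skin in winter",
    how_used="Two to three nights per week for six weeks",
    result_and_timeframe="Less redness after about four weeks",
    limitations="Too rich for very oily skin",
)

# A structured record is easy to publish, audit, and reuse in comparisons.
print(json.dumps(asdict(review), indent=2))
```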

Priority Distribution (Without Wasting Effort)

Distributing well means choosing a few channels and doing them well, so that AI sees your information in several places. Prioritise your own website, where you have full control. Add transcripts to videos so they are easy for AI to read. Choose a relevant blog or forum to gain credibility. And use reviews for real user testimonials.

  • Your website: Include comparisons, frequently asked questions, and quick guides.

  • YouTube with transcripts: Q&A or demo videos divided into chapters.

  • A niche blog: Article with a downloadable table.

  • An expert forum: Useful answers with data.

  • Review sites: Ask for details and respond with information.

Better to have three strong channels than many weak ones.

The goal: that different sites repeat your criteria and figures, with a link to the source.

Maintenance and Review

Keeping content fresh is key because AI answers change quickly. Review regularly to update prices and features, and check before publishing that each piece is easy to use and cite. This prevents your information from becoming obsolete.

  • Monthly Reviews: Check frequently asked questions (prices, limits); Review comparisons (data, tables); Update tests and delete old ones.

  • Pre-Publishing Check: Is it easy to cite (a table or a stable link)? Does it explain the method and its limitations? Does everything match across your website and your documents? Does it have a date and a record of changes? (A minimal checklist sketch follows.)
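
A minimal sketch of the pre-publishing check as code, assuming each piece of content is represented as a simple dictionary of flags and metadata; the field names, the sample record, and the checks themselves are illustrative, mirroring the questions in the list above.

```python
# Hypothetical content record; field names are illustrative only.
content_item = {
    "title": "How to Choose a CRM for Small Teams",
    "has_citable_table_or_link": True,      # Is it easy to cite?
    "explains_method_and_limits": True,     # Does it explain the method and limitations?
    "consistent_across_site": False,        # Does everything match on your site and documents?
    "updated_on": "2025-09-23",             # Does it have a date...
    "changelog": ["Added GDPR criterion"],  # ...and a record of changes?
}

CHECKS = [
    ("has_citable_table_or_link", "Add a table or a stable link to cite"),
    ("explains_method_and_limits", "Explain the method and its limitations"),
    ("consistent_across_site", "Align figures across the site and documents"),
]

def pre_publish_issues(item: dict) -> list[str]:
    """Return the fixes still needed before this item is published."""
    issues = [fix for key, fix in CHECKS if not item.get(key)]
    if not item.get("updated_on") or not item.get("changelog"):
        issues.append("Add an 'updated on' date and a note about changes")
    return issues

print(pre_publish_issues(content_item))  # -> ['Align figures across the site and documents']
```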

Simple Measurement (Guiding Your Steps)

Measure the basics to see what works and decide on changes. See if you appear, how you are mentioned, and what drove that. This helps you improve without complications.

  • Do you appear? (Yes/no by AI engine); Recommended or just listed; What drove it: Video, blog, forum, reviews; Days until appearance: From publication to mention.

Here, a tool such as Omnia naturally helps: finding and tracking key questions, seeing which engines and sources matter, and measuring days to appear, which makes data-driven decisions easier.
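
Here is a minimal sketch of that kind of tracking log, assuming you record one row per prompt and engine, whether by hand or from a tool's export. The engine names, dates, roles, and drivers are hypothetical placeholders; the only computation is days from publication to first mention.

```python
from datetime import date

# One row per (prompt, engine) check; all values below are hypothetical.
observations = [
    {"prompt": "CRM for 5-10 users with SSO", "engine": "ChatGPT",
     "appears": True, "role": "recommended", "driver": "comparison table",
     "published": date(2025, 9, 1), "first_seen": date(2025, 9, 12)},
    {"prompt": "CRM for 5-10 users with SSO", "engine": "Gemini",
     "appears": False, "role": None, "driver": None,
     "published": date(2025, 9, 1), "first_seen": None},
]

# For each row: did we appear, in what role, what drove it, and how many days it took.
for obs in observations:
    days = (obs["first_seen"] - obs["published"]).days if obs["first_seen"] else None
    status = obs["role"] if obs["appears"] else "not cited yet"
    print(f'{obs["engine"]:8} | {status:15} | days to appear: {days} | driver: {obs["driver"]}')
```

Even kept in a spreadsheet, these same four fields (appears, role, driver, days) are enough to decide what to create or update next.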

Common Mistakes That Lower Your Visibility (and How to Avoid Them)

A few simple habits can ruin your efforts, such as vague comparisons or empty reviews. Identify them early to save time: for example, always add transcripts to your videos and keep dates up to date. That way, you keep everything consistent and fresh.

  • Comparisons without details: Replace them with tables with clear criteria and limits.

  • Vague reviews (“good product”): Ask for context, usage, and results.

  • Videos without transcript and chapters: Add transcripts and sections.

  • Different information on different sites: Unify everything on your website and synchronise it.

  • Publish and forget: Add a date and a note about changes (freshness counts).

Closing

GEO is about creating genuinely useful information: clear criteria, easy-to-scan data, practical guides, and a presence on the sites AI consults, including honest reviews. With a list of key questions, simple templates, focused distribution, and basic measurement, you can manage your visibility. In a nutshell: key questions → citable information → placement on relevant sites → updates → simple measurement. If you keep this system running, you are much more likely to appear in useful responses and to convert better when the click comes.