Significant advances in generative AI in recent years have made artificial intelligence a top priority for businesses globally. As a result, large language models (LLMs) have become foundational in powering everything from virtual service agents to online search engines to fraud detection.
– PwC Annual CEO Survey, 2025
In connected TV (CTV), LLMs will be foundational in powering rich search and discovery experiences, but they can’t do it alone. Because LLMs are prediction engines, they require complementary technologies to fact-check the results they provide. These technologies improve accuracy, provide contextual relevance, enrich results, and align LLM outputs with real-world knowledge.
The Model Context Protocol (MCP) is ideal for this role: it facilitates a dynamic connection between an LLM and Gracenote’s knowledge base so that the LLM’s output can serve as a reliable single source of truth. This white paper details how MCP facilitates that connection to ensure that search and discovery experiences are rich and personalized, as well as accurate, recent and complete.
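To make the idea concrete, the sketch below shows how such a connection might look in practice: a minimal MCP server, built with the MCP Python SDK, exposes a metadata lookup tool that an MCP-capable LLM client can call to ground its answers. The endpoint URL, the lookup_program tool, and the response fields are illustrative assumptions for this sketch, not an actual Gracenote API.

```python
# Minimal sketch of an MCP server exposing a content-metadata lookup tool.
# Assumptions: the endpoint, API-key handling, and response fields below are
# hypothetical placeholders, not a real Gracenote integration.
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("content-metadata")


@mcp.tool()
def lookup_program(title: str) -> dict:
    """Return authoritative metadata for a program title from a knowledge base."""
    # Hypothetical metadata endpoint; a real integration would use the
    # provider's documented APIs and credentials.
    response = httpx.get(
        "https://metadata.example.com/programs",
        params={"title": title, "api_key": os.environ["METADATA_API_KEY"]},
        timeout=10.0,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Runs over stdio by default; the connected LLM client can invoke
    # lookup_program to verify and enrich its generated results.
    mcp.run()
```

In this pattern, the LLM remains the prediction engine while the MCP tool supplies verified metadata at query time, which is the grounding step the white paper describes.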