Context Engineering 2.0: MCP, Agentic RAG & Memory // Simba Khadder

MLOps Community
Feb 28, 2026

Why It Matters

Redis's Context Engine gives enterprises a scalable, unified data layer for AI agents, accelerating the delivery of reliable, production-grade intelligent applications.

Key Takeaways

  • Redis Context Engine unifies retrieval, tools, and memory.
  • Its MCP-native interface acts like GraphQL for agents.
  • Structured data becomes searchable alongside unstructured text.
  • Agent workflows become real-time and reliable.
  • Agent access to data scales across the organization.

Pulse Analysis

The rise of autonomous AI agents has exposed a critical gap: fragmented access to both unstructured text and structured data sources. Traditional Retrieval‑Augmented Generation (RAG) excels with static documents but falters when agents need to query relational databases, invoke APIs, or maintain contextual state. Redis’s Context Engine addresses this by introducing a schema‑driven, MCP‑native layer that abstracts diverse data formats behind a single query surface. This approach mirrors GraphQL’s flexibility, allowing developers to define semantic schemas that seamlessly blend vector‑based semantic search with precise filter criteria, effectively turning any data store into an agent‑friendly endpoint.
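The hybrid pattern described above (vector-based semantic search combined with precise structured filters in a single query) can be sketched in plain Python. This is a toy illustration of the pattern, not the Context Engine API; the documents, fields, and function names are invented for the example.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy corpus: each record carries both an embedding (for semantic search)
# and structured fields (for exact-match filtering).
DOCS = [
    {"id": 1, "region": "EU", "embedding": [0.9, 0.1], "text": "GDPR data-retention policy"},
    {"id": 2, "region": "US", "embedding": [0.8, 0.2], "text": "CCPA opt-out workflow"},
    {"id": 3, "region": "EU", "embedding": [0.1, 0.9], "text": "Quarterly sales figures"},
]

def hybrid_search(query_vec, filters, k=2):
    """Apply structured filters first, then rank the survivors by similarity."""
    candidates = [d for d in DOCS if all(d.get(f) == v for f, v in filters.items())]
    candidates.sort(key=lambda d: cosine(query_vec, d["embedding"]), reverse=True)
    return candidates[:k]

# One call blends both retrieval modes: filter to EU records, then rank semantically.
results = hybrid_search(query_vec=[1.0, 0.0], filters={"region": "EU"}, k=1)
print(results[0]["text"])
```

In a production system the filter and the vector scan would be pushed down into the index rather than done in application code, but the agent-facing contract is the same: one query surface, both modalities.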

From an operational perspective, the unified interface reduces engineering overhead dramatically. Teams no longer paste massive JSON payloads into prompts, rely on brittle Text‑to‑SQL hacks, or build custom OpenAPI wrappers for each service. Instead, agents can traverse relationships, invoke live APIs, and persist context via built‑in memory mechanisms—all within one call. This consolidation improves latency, reliability, and observability, key factors for scaling AI workloads across large organizations. Moreover, leveraging Redis’s high‑performance vector database ensures that semantic similarity searches remain fast even as data volumes grow.
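The "persist context via built-in memory mechanisms" idea can be sketched as a per-session memory store: each agent turn writes facts and later turns read them back, instead of the agent re-pasting large payloads into every prompt. The class and method names below are hypothetical, chosen for illustration only.

```python
from collections import defaultdict

class MemoryStore:
    """Minimal per-session memory: append facts, recall the most recent ones."""

    def __init__(self):
        self._sessions = defaultdict(list)

    def remember(self, session_id, fact):
        """Persist a fact under a session so later calls can retrieve it."""
        self._sessions[session_id].append(fact)

    def recall(self, session_id, limit=5):
        """Return up to `limit` most recent facts for a session, oldest first."""
        return self._sessions[session_id][-limit:]

store = MemoryStore()
store.remember("sess-42", "user prefers EU-region data")
store.remember("sess-42", "last query: GDPR retention policy")
print(store.recall("sess-42"))
```

A real agent memory layer would add expiry, summarization, and semantic lookup over stored facts, but the operational win is the same one the paragraph describes: state lives server-side, so each agent call stays small and observable.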

Strategically, Context Engineering 2.0 positions Redis as a foundational infrastructure layer for the next wave of production AI. By enabling real‑time, reliable agent workflows that span the full data spectrum, enterprises can accelerate use cases such as automated support bots, dynamic knowledge assistants, and autonomous decision‑making systems. The partnership with Prosus underscores market confidence, suggesting broader adoption among firms seeking to embed intelligent agents into core business processes without the complexity of disparate data pipelines.

Original Description

March 3rd at the Computer History Museum: the CODING AGENTS CONFERENCE. Come join us while there are still tickets left.
Thanks to @ProsusGroup for collaborating on the Agents in Production Virtual Conference 2025.
Abstract //
Context Engineering 2.0 treats retrieval, tools, and memory as one surface that agents can navigate. The aim is to make documents, databases, events, and live APIs addressable and navigable through a single MCP-native interface. Think GraphQL for agents. RAG works well for one-shot queries over textual corpora like help centers and docs. With Redis's vector database, users can index, embed, and retrieve relevant chunks. But sources like relational databases and APIs are out of reach through RAG alone. Teams paste large ad hoc JSON objects into prompts, rely on Text2SQL, or struggle with OpenAPI-to-MCP wrappers. That approach is not reliable and does not scale across the organization. With Redis Context Engine we are engineering a better way to expose data to agents. A unified, schema-driven, MCP-native layer connects all your data and powers real-time, reliable agent workflows. Define a semantic schema and structured data enters the same path as unstructured text. Agents blend semantic search with structured filters in one call, traverse relationships, call APIs, and keep state via memory. All powered by Redis.
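"Define a semantic schema and structured data enters the same path as unstructured text" might look roughly like the sketch below: one declaration marks which fields are embedded for semantic search, which are exact-match structured filters, and which are relationships an agent can traverse. The schema format here is invented for illustration and is not the actual Context Engine syntax.

```python
# Hypothetical semantic schema for a support-ticket entity. In the spirit of
# the abstract, a single declaration routes both modalities through one path:
#   semantic  -> embedded, vector-searchable text fields
#   filters   -> exact-match structured fields
#   relations -> traversable links to other entities
TICKET_SCHEMA = {
    "entity": "support_ticket",
    "semantic": ["subject", "body"],
    "filters": ["status", "priority", "region"],
    "relations": {"customer": "customer_record"},
}

def describe(schema):
    """Summarize how an agent would address this entity in one call."""
    return (f"{schema['entity']}: semantic search over {schema['semantic']}, "
            f"filter on {schema['filters']}, join via {list(schema['relations'])}")

print(describe(TICKET_SCHEMA))
```

Once such a schema exists, a single agent query can mix "tickets similar to this complaint" (semantic) with "status=open AND region=EU" (structured) and follow the customer relation, which is the GraphQL-for-agents analogy in concrete form.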
Bio //
Simba Khadder is the Founder & CEO of Featureform, now part of Redis. Following the acquisition, he joined Redis to lead Context Engine, which helps developers deliver the right data at the right time to power next-generation AI and agents.
Before Featureform, Simba founded TritonML after leaving Google, building ML infrastructure that supported over 100M monthly active users. He channeled those learnings into Featureform’s virtual feature store, designed to turn existing infrastructure into a fully managed feature store.
Outside of tech, Simba is an avid surfer, mixed martial artist, published astrophysicist for his work on Planet 9, and once ran the SF marathon in basketball shoes.
A Prosus | MLOps Community Production
