How to invalidate LLM cache

I want the user to be able to invalidate the cache, if any, for a particular LLM call.

I’m using Langchain + Langgraph with RedisCache.

    self._models[model_name] = ChatBedrockConverse(
        model=AVAILABLE_MODELS[model_name]["model"],
        provider="anthropic",
    )

    redis_client = redis.Redis.from_url("redis://localhost:6379/0")
    set_llm_cache(RedisCache(redis_client))

    response = self._models[model_name].invoke(messages)
    return {"response": response}

I’m omitting a lot of code for brevity. The caching implementation works like a charm, no issue there. But I want to be able to force a fresh LLM call even when a cached response already exists.

I know I can just disable the cache before the invoke, but I also want the new LLM call’s response to replace the old cache entry.

I tried playing around with:

    BaseCache.update(prompt, llm_string, return_val)

But I’m having a hard time manually recreating `prompt` and `llm_string` so I can update the correct Redis key.

Is there an easy way of doing what I’m trying to do?

Any help is much appreciated.

Hi @claudio.donate

If this is for a particular LLM call, then I think the only options are `RedisCache.update(...)` or deleting the key directly with `redis_client.delete(cache_key)` (the client is exposed as a `redis` property on `RedisCache(BaseCache)`).

For `redis_client.delete(cache_key)`, the key is computed like this in `RedisCache(BaseCache)`:

    def _key(self, prompt: str, llm_string: str) -> str:
        """Create a key for the cache."""
        prompt_hash = hashlib.md5(prompt.encode()).hexdigest()
        llm_string_hash = hashlib.md5(llm_string.encode()).hexdigest()
        return f"{self.prefix}:{prompt_hash}:{llm_string_hash}"
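So, as a sketch, you can rebuild that key outside the cache class with nothing but `hashlib`. Note the `"llm"` default prefix here is an assumption; pass whatever prefix your `RedisCache` instance is actually configured with:

```python
import hashlib

def redis_cache_key(prompt: str, llm_string: str, prefix: str = "llm") -> str:
    # Mirrors RedisCache._key above: md5 each input, join with the prefix.
    # NOTE: "llm" as the default prefix is an assumption; use the prefix
    # your RedisCache instance was created with.
    prompt_hash = hashlib.md5(prompt.encode()).hexdigest()
    llm_string_hash = hashlib.md5(llm_string.encode()).hexdigest()
    return f"{prefix}:{prompt_hash}:{llm_string_hash}"

# Then invalidation is a plain Redis delete:
# redis_client.delete(redis_cache_key(prompt, llm_string))
```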


Thank you for the reply @pawel-twardziak. I did debug how everything was being cached by LangChain.

My issue now is how to recreate `prompt` and `llm_string` after `model.invoke` has run. I need them to work out the hash key used to store the response in Redis, so I can either update or delete it.
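For anyone landing here later: from what I can tell by reading the `langchain_core` source, the chat model builds those two strings from private helpers, so the following is only a sketch and the names may differ between versions:

```python
# Sketch only -- these names come from reading langchain_core's
# BaseChatModel._generate_with_cache and are private/version-dependent:
#
#   from langchain_core.load import dumps
#
#   prompt = dumps(messages)              # serialized message list
#   llm_string = model._get_llm_string()  # fingerprint of model + params
#
# With both strings, the cached entry can then be overwritten or dropped:
#
#   llm_cache.update(prompt, llm_string, new_return_val)
#   redis_client.delete(llm_cache._key(prompt, llm_string))
```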


hi @claudio.donate

That sounds like a tedious task… I wish you luck! 🧡