Human-Like Memory in LangGraph

I’m working on a project using LangChain and an LLM to build a memory system that mimics human memory: preserving important context while avoiding overload. I’ve explored chunking, embeddings, and memory stores, but I’m looking for a more efficient way to manage context over long-running conversations without losing meaningful information over time.
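To make the question concrete, here’s a rough sketch of the direction I’ve been exploring: a store that ranks memories by semantic relevance, a recency decay, and an importance score (loosely inspired by the generative-agents retrieval heuristic), plus a consolidation step that compresses stale details into a summary. `embed_fn`, `summarize_fn`, the score weights, and the thresholds are all placeholders, not a finished design.

```python
import math
import time
from dataclasses import dataclass, field


@dataclass
class MemoryItem:
    text: str
    embedding: list[float]
    importance: float                      # 0..1, e.g. scored by the LLM at write time
    last_accessed: float = field(default_factory=time.time)


def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


class HumanLikeMemory:
    """Rank memories by relevance + recency decay + importance,
    and periodically fold stale details into a summary ("gist")."""

    def __init__(self, embed_fn, summarize_fn, half_life_hours: float = 24.0):
        self.embed_fn = embed_fn           # assumed: str -> list[float]
        self.summarize_fn = summarize_fn   # assumed: list[str] -> str (an LLM call)
        self.half_life = half_life_hours * 3600
        self.items: list[MemoryItem] = []

    def add(self, text: str, importance: float = 0.5) -> None:
        self.items.append(MemoryItem(text, self.embed_fn(text), importance))

    def retrieve(self, query: str, k: int = 5) -> list[str]:
        q = self.embed_fn(query)
        now = time.time()

        def score(m: MemoryItem) -> float:
            relevance = cosine(q, m.embedding)                            # semantic match
            recency = 0.5 ** ((now - m.last_accessed) / self.half_life)   # exponential decay
            return relevance + recency + m.importance

        ranked = sorted(self.items, key=score, reverse=True)[:k]
        for m in ranked:
            m.last_accessed = now          # "rehearsal": retrieved memories decay more slowly
        return [m.text for m in ranked]

    def consolidate(self, max_items: int = 200) -> None:
        """When the store grows too large, compress the least important,
        least recently used memories into a single summary memory."""
        if len(self.items) <= max_items:
            return
        self.items.sort(key=lambda m: (m.importance, m.last_accessed))
        cutoff = len(self.items) - max_items + 1
        stale, fresh = self.items[:cutoff], self.items[cutoff:]
        gist = self.summarize_fn([m.text for m in stale])
        self.items = fresh
        self.add(gist, importance=0.8)     # keep the gist as one high-importance memory
```

The open question for me is how to tune the decay, the importance scoring, and the consolidation so the result actually feels like human memory rather than a cache eviction policy.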

Solutions like LangMem and Mem0 don’t fully address the challenge of replicating human-like memory. Any tips or resources would be greatly appreciated!