
Source: https://www.arxiv.org/pdf/2601.07372

Date: 12 January 2026

Organisation(s): DeepSeek

Author(s): X Cheng, W Zeng, D Dai, Q Chen, B Wang, Z Xie, K Huang, X Yu, Z Hao, Y Li, Ha. Zhang, Hu. Zhang, D Zhao, W Liang

Note: A memory module added on top of the standard Transformer; improves needle-in-a-haystack (NIAH) retrieval.

[Paper Short] Engram

Conditional memory is a key idea here: it looks like a way to better route memory to a dedicated expert in the MoE.

Engram is a conditional memory module grounded in the classic N-gram structure but equipped with modern adaptations such as tokenizer compression, multi-head hashing, contextualized gating, and multi-branch integration.
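To make the idea concrete, here is a minimal sketch of an N-gram conditional memory with multi-head hashing and a contextualized gate. This is not the paper's implementation: the class name `EngramSketch`, the shared hash table, the random odd multipliers, and the sigmoid gate are all assumptions, and tokenizer compression and multi-branch integration are omitted.

```python
import torch
import torch.nn as nn

class EngramSketch(nn.Module):
    """Hypothetical sketch of a conditional N-gram memory lookup.

    Omits the paper's tokenizer compression and multi-branch
    integration; only hashing + gating are illustrated.
    """

    def __init__(self, d_model, num_heads=4, table_size=2**20, n=2):
        super().__init__()
        self.n = n
        self.table_size = table_size
        # One embedding table shared across hash heads (assumption).
        self.memory = nn.Embedding(table_size, d_model // num_heads)
        # Random odd multipliers give each head an independent hash.
        self.register_buffer(
            "hash_mults",
            torch.randint(1, 2**31 - 1, (num_heads, n)) * 2 + 1,
        )
        # Contextualized gate: hidden state decides how much memory to mix in.
        self.gate = nn.Linear(d_model, 1)

    def forward(self, token_ids, hidden):
        # token_ids: (batch, seq); hidden: (batch, seq, d_model)
        b, s = token_ids.shape
        # Build n-gram windows by stacking shifted copies of the sequence.
        grams = torch.stack(
            [torch.roll(token_ids, shifts=i, dims=1) for i in range(self.n)],
            dim=-1,
        )  # (batch, seq, n)
        # Multi-head hashing: each head maps the n-gram to a table slot.
        idx = (grams.unsqueeze(2) * self.hash_mults).sum(-1) % self.table_size
        mem = self.memory(idx)               # (b, s, heads, d_model/heads)
        mem = mem.reshape(b, s, -1)          # (b, s, d_model)
        g = torch.sigmoid(self.gate(hidden)) # (b, s, 1), context-dependent
        return hidden + g * mem              # gated residual add
```

The key property this sketch captures is that the lookup is conditional: the hash index depends only on the local N-gram, while the gate decides per token, from the contextual hidden state, how much of the retrieved memory to inject.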

TLDR

DeepSeek propose Engram, a conditional memory module that augments a Transformer with N-gram-style lookups (multi-head hashing plus contextualized gating), improving long-context retrieval such as NIAH.

Goal

How

Conclusion

Source overview

/machine learning/ /LLM/ /Context/