How I Study AI - Learn AI Papers & Lectures the Easy Way

Papers (3)


Spectral Attention Steering for Prompt Highlighting

Beginner
Weixian Waylon Li, Yuchen Niu et al. Β· Mar 1 Β· arXiv

This paper introduces a new way to make a language model pay extra attention to the exact words you highlight in a prompt.

#attention steering Β· #prompt highlighting Β· #key embeddings

Anatomy of Agentic Memory: Taxonomy and Empirical Analysis of Evaluation and System Limitations

Intermediate
Dongming Jiang, Yi Li et al. Β· Feb 22 Β· arXiv

This paper explains how AI agents remember things across long conversations and why many current benchmarks don't truly measure that memory.

#agentic memory Β· #memory-augmented generation Β· #long-context LLMs

Elastic Attention: Test-time Adaptive Sparsity Ratios for Efficient Transformers

Beginner
Zecheng Tang, Quantong Qiu et al. Β· Jan 24 Β· arXiv

Transformers slow down on very long inputs because standard attention compares every token pair, which is expensive; this paper's elastic attention adapts how sparse the attention is at test time to keep long inputs efficient.

#elastic attention Β· #sparse attention Β· #full attention
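The blurb above mentions that full attention looks at every token pair. A minimal NumPy sketch (illustrative only, not code from the paper) shows why that is quadratic: the score matrix has one entry per query-key pair, so doubling the context length quadruples the work.

```python
import numpy as np

def attention_scores(n_tokens: int, d: int = 16) -> np.ndarray:
    """Full attention: every query token is compared against every key
    token, so the score matrix has n_tokens x n_tokens entries."""
    rng = np.random.default_rng(0)
    q = rng.standard_normal((n_tokens, d))  # toy query vectors
    k = rng.standard_normal((n_tokens, d))  # toy key vectors
    return q @ k.T / np.sqrt(d)             # shape: (n_tokens, n_tokens)

# Doubling the input length quadruples the number of scores to compute:
assert attention_scores(128).size == 4 * attention_scores(64).size
```

Sparse-attention methods like the one in this paper cut that cost by scoring only a subset of the pairs instead of all of them.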