Unlock: Attention Sinks and Retrieval Decay

Why transformers disproportionately attend to initial tokens (attention sinks), how StreamingLLM exploits this for infinite-length inference, and how retrieval accuracy degrades with distance and position within the context window.
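The KV-cache eviction policy that StreamingLLM describes can be sketched minimally: always retain the first few tokens (the attention sinks) plus a sliding window of the most recent tokens, evicting everything in between. The function name and parameters below are illustrative, not from any library.

```python
def streaming_kv_keep(total_len: int, n_sink: int = 4, window: int = 8) -> list[int]:
    """Indices of KV-cache entries retained under a StreamingLLM-style policy:
    the first n_sink tokens (attention sinks) plus the last `window` tokens.
    Tokens between the sinks and the recent window are evicted, so the cache
    size stays bounded at n_sink + window regardless of sequence length."""
    sinks = list(range(min(n_sink, total_len)))
    # Recent window starts after the sinks, or at total_len - window, whichever is later.
    recent = list(range(max(n_sink, total_len - window), total_len))
    return sinks + recent
```

For a 20-token sequence with the defaults, this keeps tokens 0..3 and 12..19, so attention is always computed over at most 12 cached positions no matter how long generation runs.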
